ReviewID: 27465589
Target: Demand generation interventions contribute to increases in the use of modern contraceptive methods.
Background: OBJECTIVES To synthesise evidence on the implementation, costs and cost-effectiveness of demand generation interventions and their effectiveness in improving uptake of modern contraceptive methods.
Abstracts: OBJECTIVES We examined the effect of a peer-delivered educational intervention, the Malawi Male Motivator intervention, on couples' contraceptive uptake. We based the intervention design on the information-motivation-behavioral skills (IMB) model. METHODS In 2008 we recruited 400 men from Malawi's Mangochi province who reported not using any method of contraception. We randomized them into an intervention arm and a control arm, and administered surveys on contraceptive use at baseline and after the intervention. We also conducted in-depth interviews with a subset of intervention participants. RESULTS After the intervention, contraceptive use increased significantly within both arms (P < .01), and this increase was significantly greater in the intervention arm than in the control arm (P < .01). Quantitative and qualitative data indicated that increased ease and frequency of communication within couples were the only significant predictors of uptake (P < .01). CONCLUSIONS Our findings indicate that men facilitated contraceptive use for their partners. Although the IMB model does not fully explain our findings, our results show that the intervention's content and its training in communication skills are essential mechanisms for successfully enabling men to help couples use a contraceptive.

The aim of this study was to determine the effect of health education on knowledge and attitudes regarding family planning and contraceptive methods among women who attended the obligatory Premarital Counseling Center in Yasouj city, Iran. An experimental study was carried out, and a total of 200 women were selected using a convenience sampling method from among women attending the health centre for the required premarital procedures. Respondents were randomly divided into experimental and control groups.
A pre-evaluation of knowledge and attitudes on family planning was done using a structured questionnaire. The experimental group then received health education in four sessions over 4 consecutive weeks, while the control group underwent the traditional education method. A post-evaluation immediately after the intervention assessed any changes in respondents' knowledge and attitudes. Independent and paired t-tests were used to evaluate differences in mean knowledge and attitude scores between the groups. Results showed a significant improvement in respondents' knowledge and attitudes after the educational program in the experimental group (p < 0.001), while no significant difference was observed in the knowledge and attitudes of the control group. The findings also indicated that age was significantly associated with the level of respondents' knowledge. These results demonstrate the effectiveness of the educational method. In conclusion, the educational method is effective in increasing the knowledge and improving the attitudes of women regarding family planning in Yasouj compared with the currently used educational method. Future educational programs need to incorporate the features that have been associated with successful interventions in the past, as well as including their own evaluation procedures.

This study tested the hypothesis that individual counselling in the third trimester would increase postpartum contraceptive use to a greater extent than only providing an educational leaflet. A total of 180 third-trimester pregnant women of mean age 28.3 years who were attending Marmara University Hospital for prenatal care were enrolled. One-third were randomly allocated to receive prenatal contraceptive counselling and the remaining two-thirds (control group) received an educational leaflet. Participants were followed up at 6–9 months postpartum.
The majority of subjects (91.5%) wanted to use contraception after delivery, but 26.7% did not know which method to use. At follow-up, 79.6% of all women had begun a postpartum contraceptive regimen and 68.7% were using a modern contraceptive method. Overall, there was no statistically significant difference in postpartum contraception use between the control and intervention groups in this study population. It is therefore concluded that prenatal counselling was not superior to educational leaflets for increasing the use of effective and modern postpartum contraception.

Background A number of single case reports have suggested that the context within which intervention studies take place may challenge the assumptions that underpin randomised controlled trials (RCTs). However, the diverse ways in which context may challenge the central tenets of the RCT, and the degree to which this information is known to researchers or subsequently reported, have received much less attention. In this paper, we explore these issues by focusing on seven RCTs of interventions varying in type and degree of complexity, and across diverse contexts. Methods This in-depth multiple case study using interviews, focus groups and documentary analysis was conducted in two phases. In phase one, an RCT of a nurse-led intervention provided a single exploratory case and informed the design, sampling and data collection within the main study. Phase two consisted of a multiple explanatory case study covering a spectrum of trials of different types of complex intervention. A total of eighty-four data sources across the seven trials were accessed.
Results We present consistent empirical evidence across all trials to indicate that four key elements of context (personal, organisational, trial and problem context) are crucial to understanding how a complex intervention works and to enabling both assessments of internal validity and likely generalisability to other settings. The ways in which context challenged trial operation were often complex, idiosyncratic and subtle, often falling outside of current trial reporting formats. However, information on such issues appeared to be available via first-hand ‘insider accounts’ of each trial, suggesting that improved reporting on the role of context is possible. Conclusions Sufficient detail about context needs to be understood and reported in RCTs of complex interventions in order for the transferability of complex interventions to be assessed. Improved reporting formats that require and encourage the clarification of both general and project-specific threats to the likely internal and external validity need to be developed. In addition, a cultural change is required in which the open and honest reporting of such issues is seen as an indicator of study strength and researcher integrity, rather than a symbol of poor study quality or investigator ability.

OBJECTIVES This study was undertaken to determine the relative efficacy of home visitation with and without husband participation on the use of modern contraception in Ethiopia. METHODS A randomized field trial of a family planning education intervention using home visitation with and without husband participation was conducted in Addis Ababa, Ethiopia, from August 1990 to December 1991 and included a 12-month postintervention follow-up. A total of 266 experimental and 261 control subjects were entered, of whom 91.7% and 88.9%, respectively, were followed through 12 months.
RESULTS A greater proportion of couples in the experimental group were practicing modern contraception at 2 months (25% vs 15%) and 12 months (33% vs 17%) following the home visit intervention. By 12 months following the home visits, experimental subjects were less likely to have defaulted and more likely to have started using modern contraception following an initial delay. CONCLUSIONS The inclusion of husbands in family planning programs will result in relevant increases in the use of modern contraception. However, there exists an important "sleeper" effect of the education intervention, reflected by a delay of greater than 2 months in the initiation of modern contraception for most couples.

A randomized community trial of a family planning outreach program was conducted in Rakai District, Uganda. Five communities received standard services; six intervention communities received additional family planning information, counseling, and contraceptive methods from government service providers and community-based volunteer agents using social marketing and other strategies. Condom use was promoted in all of the communities. The community-based family planning outreach program was implemented in two phases, 1999-2000 (early) and 2001 (late), and its impact was evaluated by means of population surveys in 2002-03. At follow-up, hormonal contraceptive prevalence was 23 percent in the intervention communities, compared with 20 percent in the control communities. The differential was greater in the early-intervention communities than in the late-intervention communities. Pregnancy rates at follow-up were 15 percent in the control and 13 percent in the intervention communities. No differentials in condom use were found between study arms.
Family planning outreach via social marketing can significantly increase hormonal contraceptive use and decrease pregnancy rates, but the impact of this outreach program was modest.

BACKGROUND The study was conducted to determine the impact of counseling and educational leaflets on the contraceptive practices of couples. STUDY DESIGN Randomization of 600 women was done in two groups matched for age, parity and socioeconomic status at the Department of Obstetrics and Gynaecology, Shifa Foundation Community Health Centre, Shifa International Hospital, Islamabad, Pakistan. In Group A, the intervention group was exposed to contraceptive counseling and educational leaflets in the postnatal ward after delivery, whereas in Group B, the nonintervention group was not given any formal contraceptive advice. Later on, both groups were assessed regarding their contraceptive practices. RESULTS At their follow-up visit (8-12 weeks postpartum), 19 (6.3%) women in the nonintervention group had started contraceptive use, whereas 153 (50.8%) had decided to start contraception in the next 6 months, and 129 (42.8%) women were still undecided. The main contraceptive user was the male partner (n=117, 38.8%), and the most common method used was coitus interruptus (n=62, 36.3%). In the intervention group, 170 women (56.9%) had started using contraceptives, whereas 129 (43.1%) had decided to start contraceptive use in the next 6 months. The predominant contraceptive user was female (n=212, 70.9%), and the most popular method chosen was oral contraceptive pills (n=111, 37.1%). CONCLUSION There is a definite increase in contraceptive uptake among women provided with educational leaflets and a counseling session, with a shift toward use of more reliable contraceptive methods.

The impact of antenatal counselling on couples' knowledge and practice of contraception was investigated.
An interview questionnaire was used before and after conducting counselling sessions with 200 pregnant women and 100 spouses. The participants were followed up immediately after delivery and 3 months later. Both the control and study groups displayed a lack of knowledge of contraception. Counselling sessions improved the couples' knowledge and practice in the study group. Involving husbands in family planning counselling sessions led to joint decisions being made and encouraged women's use of contraception. The majority of couples retained most of the information given. Integrating family planning counselling into antenatal care in all facilities and involving the husband are recommended.

BACKGROUND Group, rather than individual, family planning counseling has the potential to increase family planning knowledge and use through more efficient use of limited human resources. STUDY DESIGN A randomized, noninferiority study design was utilized to identify whether group family planning counseling is as effective as individual family planning counseling in Ghana. Female gynecology patients were enrolled from two teaching hospitals in Ghana in June and July 2008. Patients were randomized to receive either group or individual family planning counseling. The primary outcome in this study was the change in modern contraceptive method knowledge. Changes in family planning use intention before and after the intervention and in intended method type were also explored. RESULTS Comparisons between the two study arms suggest that randomization was successful. The difference in the change in modern contraceptive methods known from baseline to follow-up between the two study arms (group minus individual), adjusted for study site, was -0.21 (95% confidence interval: -0.53 to 0.12), suggesting no difference between the two arms.
CONCLUSIONS Group family planning counseling was as effective as individual family planning counseling in increasing modern contraceptive knowledge among female gynecology patients in Ghana.
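The noninferiority conclusion above rests on where the confidence interval for the between-arm difference falls relative to a pre-specified margin. A minimal sketch in Python, assuming a hypothetical margin of one contraceptive method known (the abstract does not report the actual margin the investigators used):

```python
# Sketch of a noninferiority read-out. The margin of -1.0 methods known is a
# hypothetical value for illustration only; the abstract does not state the
# pre-specified margin.

def noninferior(ci_lower: float, margin: float) -> bool:
    """Group counseling is declared noninferior when the lower bound of the
    CI for the (group - individual) difference lies above the margin."""
    return ci_lower > margin

# Reported adjusted difference in change: -0.21, 95% CI (-0.53, 0.12)
diff, ci = -0.21, (-0.53, 0.12)
margin = -1.0  # assumed margin

print(noninferior(ci[0], margin))  # prints True: -0.53 > -1.0
```

Because the interval also includes 0, the data are additionally consistent with no difference at all between arms, which matches the authors' wording.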
ReviewID: 26992885
Target: There were improvements in self-rated psychological symptoms, such as stress, depression and anxiety, and in mindfulness. To conclude, mindfulness-based stress reduction, as a safe and transportable approach, has the potential to improve psychological symptoms in the caregivers of patients with various conditions.
Background: Caring for patients with various conditions is demanding and stressful and can have a negative impact on both physical and psychological health. This paper reports a systematic review and critical appraisal of the evidence on the effectiveness of mindfulness-based stress reduction for the family caregivers of patients with various conditions.
Abstracts: Background: Caregivers of people with chronic conditions are more likely than non-caregivers to have depression and emotional problems. Few studies have examined the effectiveness of mindfulness-based stress reduction (MBSR) in improving their mental well-being. Methods: Caregivers of persons with chronic conditions who scored 7 or above on the Caregiver Strain Index were randomly assigned to the 8-week MBSR group (n = 70) or the self-help control group (n = 71). Validated instruments were used to assess the changes in depressive and anxiety symptoms, quality of life, self-efficacy, self-compassion and mindfulness. Assessments were conducted at baseline, post-intervention and at the 3-month follow-up. Results: Compared to the participants in the control group, participants in the MBSR group had a significantly greater decrease in depressive symptoms at post-intervention and at 3 months post-intervention (p < 0.01). The improvement in state anxiety symptoms was significantly greater among participants in the MBSR group than among those in the control group at post-intervention (p = 0.007), although this difference was not statistically significant at 3 months post-intervention (p = 0.084). There was also a statistically significantly larger increase in self-efficacy (controlling negative thoughts; p = 0.041) and mindfulness (p = 0.001) among participants in the MBSR group at the 3-month follow-up compared to the participants in the control group. No statistically significant group effects (MBSR vs. control) were found in perceived stress, quality of life or self-compassion.
Conclusions: MBSR appears to be a feasible and acceptable intervention for improving mental health among family caregivers with a significant care burden, although further studies that include an active control group are needed to make the findings more conclusive.

BACKGROUND: Compared with other parents, mothers of children with autism spectrum disorder or other neurodevelopmental disabilities experience more stress, illness, and psychiatric problems. Although the cumulative stress and disease burden of these mothers is exceptionally high, and associated with poorer outcomes in children, policies and practices primarily serve the identified child with disabilities. METHODS: A total of 243 mothers of children with disabilities were consented and randomized into either Mindfulness-Based Stress Reduction (mindfulness practice) or Positive Adult Development (positive psychology practice). Well-trained, supervised peer mentors led 6 weeks of group treatments in 1.5-hour weekly sessions, assessing mothers 6 times before, during, and up to 6 months after treatment. Mothers had children with autism (65%) or other disabilities (35%). At baseline, 85% of this community sample had significantly elevated stress, 48% were clinically depressed, and 41% had anxiety disorders. RESULTS: Using slopes-as-outcomes mixed random-effects models, both treatments led to significant reductions in stress, depression, and anxiety, and improved sleep and life satisfaction, with large effects on depression and anxiety. Mothers in Mindfulness-Based Stress Reduction versus Positive Adult Development had greater improvements in anxiety, depression, sleep, and well-being. Mothers of children with autism spectrum disorder improved less in anxiety, but did not otherwise differ from their counterparts.
CONCLUSIONS: Future studies are warranted on how trained mentors and professionals can address the unmet mental health needs of mothers of children with developmental disabilities. Doing so improves maternal well-being and furthers their long-term caregiving of children with complex developmental, physical, and behavioral needs.

Objectives Research suggests that an 8-week Mindfulness-Based Stress Reduction (MBSR) program (a structured form of meditation) might be effective in the treatment of various health problems including chronic pain. Our objective was to compare the clinical effectiveness of the MBSR program with a multidisciplinary pain intervention (MPI) program in terms of pain intensity, pain-related distress, quality of life, and mood in patients with chronic pain. Methods A randomized, comparative clinical trial was conducted, including a 6-month posttreatment follow-up. Ninety-nine participants, aged 24 to 64 years, with pain for a minimum of 3 months, were recruited from community-based clinics, hospitals, and community service centers. Participants were randomly allocated to either the MBSR program (51 participants) or an MPI program (48 participants). The study used validated Chinese versions of self-reported questionnaires measuring pain, mood symptoms, and health-related quality of life. Results Thirty-nine participants (77%) completed the MBSR program and 44 (90%) completed the MPI program. Patients in both groups were comparable with regard to demographic characteristics, pain intensity, mood symptoms, and health-related quality-of-life measures before the intervention. In both groups, patients who completed the trial demonstrated statistically significant improvements in pain intensity and pain-related distress. However, no statistically significant differences were observed in overall results between the MBSR and MPI groups.
Conclusions This randomized clinical trial showed that both the MBSR and MPI programs reduced pain intensity and pain-related distress, although no statistically significant differences were observed between the 2 groups and the improvements were small.

CONTEXT The decision to institutionalize a patient with dementia is complex and is based on patient and caregiver characteristics and the sociocultural context of patients and caregivers. Most studies have determined predictors of nursing home placement primarily according to patient or caregiver characteristics alone. OBJECTIVE To develop and validate a prognostic model to determine the comprehensive predictors of placement among an ethnically diverse population of patients with dementia. DESIGN, SETTING, AND PARTICIPANTS The Medicare Alzheimer's Disease Demonstration and Evaluation study, a prospective study at 8 sites in the United States, with enrollment between December 1989 and December 1994 of 5788 community-living persons with advanced dementia. MAIN OUTCOME MEASURES Time to nursing home placement throughout a 36-month follow-up period, assessed by interview and review of Medicare records, and its association with patient and caregiver characteristics, obtained by interview at enrollment. RESULTS Patients were divided into a development (n = 3859) and a validation (n = 1929) cohort. In the development cohort, the Kaplan-Meier estimates of nursing home placement throughout 1, 2, and 3 years were 22%, 40%, and 52%, respectively.
After multivariate adjustment, patient characteristics that were associated with nursing home placement were as follows: black ethnicity (hazard ratio, 0.60; 95% confidence interval [CI], 0.48-0.74), Hispanic ethnicity (HR, 0.40; 95% CI, 0.28-0.56) (both ethnicities were inversely associated with placement), living alone (HR, 1.74; 95% CI, 1.49-2.02), 1 or more dependencies in activities of daily living (HR, 1.38; 95% CI, 1.20-1.60), high cognitive impairment (for Mini-Mental Status Examination score ≤ 20: HR, 1.52; 95% CI, 1.33-1.73), and 1 or more difficult behaviors (HR, 1.30; 95% CI, 1.11-1.52). Caregiver characteristics associated with patient placement were age 65 to 74 years (HR, 1.17; 95% CI, 1.01-1.37), age 75 years or older (HR, 1.55; 95% CI, 1.31-1.84), and a high Zarit Burden Scale score (for the highest quartile: HR, 1.73; 95% CI, 1.49-2.00). Patients were assigned to quartiles of risk based on this model. In the development cohort, patients in the first, second, third, and fourth quartiles had 25%, 42%, 64%, and 91% rates of nursing home placement at 3 years, respectively. In the validation cohort, the respective rates were 21%, 50%, 64%, and 89%. The C statistic for 3-year nursing home placement was 0.66 in the development cohort and 0.63 in the validation cohort. CONCLUSIONS Patient and caregiver characteristics are both important determinants of long-term care placement for patients with dementia. Interventions directed at delaying placement, such as reduction of caregiver burden or difficult patient behaviors, need to take into account the patient and caregiver as a unit.

OBJECTIVES The purpose of this study was to investigate the predictors of caregiver burden and depression, including objective stressors and mediation forces influencing caregiving outcomes.
METHODS This investigation is based on the 1994 Canadian Study of Health and Aging (CSHA) database. Participants were 613 individuals with dementia, living in either the community or an institution, and their informal caregivers. Participants for the CSHA were identified by screening a large random sample of elderly persons across Canada. Structural equation models representing four alternative pathways from caregiving stressors (e.g., functional limitations, disturbing behaviors, patient residence, assistance given to the caregiver) to caregiver burden and depression were compared. RESULTS The data provided the best fit to a model whereby the effects on the caregiver's well-being are mediated by appraisals of burden. A higher frequency of disturbing behavior, caring for a community-dwelling patient, and low informal support were related to higher burden, which in turn led to more depressive symptomatology. Caregivers of patients exhibiting more disturbing behaviors and functional limitations received less help from family and friends, whereas those whose care recipients resided in an institution received more informal support. DISCUSSION Our findings add to the preexisting literature because we tested alternative models of caregiver burden using an unusually large sample of participants and after overcoming methodological limitations of past research. Results highlight the importance of the effective management of disturbing behaviors, the provision of formal services for caregivers of highly impaired patients with no informal support, and the improvement of coping skills in burdened caregivers.

OBJECTIVES The purpose of this study was to examine the effects of a structured, 8-week, Mindfulness-Based Stress Reduction (MBSR) program on perceived stress, mood, endocrine function, immunity, and functional health outcomes in individuals infected with the human immunodeficiency virus (HIV).
DESIGN This study used a quasiexperimental, nonrandomized design. METHODS Subjects were specifically recruited (nonrandomly) for the intervention (MBSR) or comparison group. Data were collected at pretest and post-test in the MBSR group and at matched times in the comparison group. t tests were performed to determine within-group changes and between-group differences. RESULTS Natural killer cell activity and number increased significantly in the MBSR group compared to the comparison group. No significant changes or differences were found for psychological, endocrine, or functional health variables. CONCLUSIONS These results provide tentative evidence that MBSR may assist in improving immunity in individuals infected with HIV.

Behavioral interventions that support caregivers' restful sleep may delay the onset or decrease the severity of debilitating depressive symptoms. This, in turn, may increase caregivers' physical and psychological health and wellbeing. A repeated-measures experimental design was used to test the feasibility and effectiveness of a brief behavioral sleep intervention for family caregivers of persons with advanced-stage cancer. The CAregiver Sleep Intervention (CASI) includes stimulus control, relaxation, cognitive therapy, and sleep hygiene elements. CASI is individualized and delivered to accommodate caregiver burden. Thirty adult caregivers participated. The Pittsburgh Sleep Quality Index (PSQI), Center for Epidemiological Studies-Depression scale (CES-D), and Caregiver Quality of Life-Cancer scale (CQOLC) were used to measure self-reported sleep quality, depressive symptoms, and quality of life. Actigraphs measured latency, duration, efficiency, and wake after sleep onset (WASO) scores. Data were collected at baseline, 3 and 5 weeks, and 2, 3, and 4 months post baseline.
Improvement was seen across groups; however, intervention caregivers showed more improvement in PSQI and CES-D scores than control caregivers. The CASI appears to be effective in improving sleep quality and depressive symptoms in caregivers of persons with cancer. Improvements in quality-of-life scores were similar across groups. Sample size and homogeneity limit generalizability.

This study aims to investigate the impact of caregiving on the health status and quality of life (QOL) of primary informal caregivers (PCGs) of elderly care recipients in Hong Kong. A total of 246 PCGs and 492 matched noncaregiver (NCG) controls were identified in a population-based cross-sectional study through random telephone dialing. Their health status and QOL were assessed based on structured questionnaires and the Short Form 36 (SF-36) Health Survey. Multiple conditional logistic regression analysis showed that, compared with NCGs, PCGs had significantly increased risks of reporting worse health, more doctor visits, anxiety and depression, and weight loss. Female PCGs were more likely to report chronic diseases, symptoms, and insomnia. PCGs, particularly women, had significantly lower scores in all eight domains of the SF-36 Health Survey. A high caregiver burden score (Zarit Burden Scale) was positively associated with adverse physical and psychological health and poorer QOL. The results indicate that PCGs, particularly women, had an adverse physical and psychological health profile and poorer QOL compared with NCGs.

Alzheimer's disease is the most frequent form of dementia, in which behavioral and cognitive disruption symptoms coexist. Depression, apathy, anxiety, and other conduct disorders are the complaints most often reported by caregivers. Fifty subjects were referred to our Institute with a diagnosis of probable Alzheimer's disease. Cognitive impairment was equally distributed among the subjects.
Patients, aged 68 to 76 years old, were randomized to receive a cholinesterase inhibitor (Donepezil, 5 mg/day) alone, or a cholinesterase inhibitor plus a selective serotonin reuptake inhibitor (SSRI) (citalopram HBr, 20 mg/day). We followed up all the patients for one year, with particular concern for neuropsychological aspects associated with eventual behavioral changes. Results indicate that SSRI intake seems to be effective for depression, decreasing it and improving quality of life for both patients and caregivers. Side effects in both groups were few, and there were no study withdrawals. This paper discusses the relationship between dementia and depression, and presents our finding that depressive symptoms, if specifically treated, tend to reduce caregiver stress and improve well-being in patients with Alzheimer's disease.

Caring for veterans with dementia is burdensome for family caregivers. This exploratory study tested the efficacy of an innovative, spiritually based mantram caregiver intervention delivered using teleconference calls. A prospective, within-subjects, mixed-methods, 3-time repeated-measures design with 36-week follow-up telephone interviews was conducted. Sixteen caregivers (94% women, 94% White, mean age 69.2 years, SD = 10.35 years) completed the intervention. Significant effects for time and linear terms were found for decreasing caregiver burden, perceived stress, depression, and rumination and for increasing quality-of-life enjoyment and satisfaction, all with large effect sizes. Findings suggest that teleconference delivery of a spiritually based caregiver intervention is feasible.

OBJECTIVES The objectives of this study were to evaluate whether a mindfulness meditation intervention may be effective in caregivers of close relatives with dementia and to help refine the protocol for future larger trials.
DESIGN The design was a pilot randomized trial to evaluate the effectiveness of a mindfulness meditation intervention adapted from the Mindfulness-Based Cognitive Therapy program in relation to two comparison groups: an education class based on Powerful Tools for Caregivers serving as an active control group, and a respite-only group serving as a pragmatic control. SETTINGS/LOCATION This study was conducted at Oregon Health & Science University, Portland, OR. SUBJECTS The subjects were community-dwelling caregivers aged 45-85 years of close relatives with dementia. INTERVENTIONS The two active interventions lasted 7 weeks and consisted of one 90-minute session per week along with at-home implementation of knowledge learned. The respite-only condition provided the same duration of respite care as was needed for the active interventions. OUTCOME MEASURES Subjects were assessed prior to randomization and again after completing classes at 8 weeks. The primary outcome measure was a self-rated measure of caregiver stress, the Revised Memory and Behavior Problems Checklist (RMBPC). Secondary outcome measures included mood, fatigue, self-efficacy, mindfulness, salivary cortisol, cytokines, and cognitive function. We also evaluated self-rated stress in the subjects' own environment, expectancy of improvement, and credibility of the interventions. RESULTS There were 31 caregivers randomized and 28 completers. There was a significant effect on the RMBPC by group, covarying for baseline RMBPC, with both active interventions showing improvement compared with the respite-only group. Most of the secondary outcome measures were not significantly affected by the interventions. There was an intervention effect on the caregiver self-efficacy measure and on cognitive measures. Although mindfulness was not impacted by the intervention, there were significant correlations between mindfulness and self-rated mood and stress scores.
CONCLUSIONS Both the mindfulness and education interventions decreased self-rated caregiver stress compared with the respite-only control.
2,002
19,408,018
The only intervention that receives strong evidence is discectomy for faster relief in carefully selected patients with sciatica due to lumbar disc prolapse.
The randomized controlled trial (RCT) is generally accepted as the most reliable method of conducting clinical research. To obtain an unbiased evaluation of the effectiveness of spine surgery, patients should be randomly assigned to either new or standard treatment. The aim of the present article is to provide a short overview of the advantages and challenges of RCTs and to present a summary of the conclusions of the Cochrane Reviews in spine surgery and later published trials, in order to evaluate their contribution to quality management and feasibility in practice. The first RCT in spine surgery was published in 1974 and compared debridement and ambulatory treatment in tuberculosis of the spine. The contribution of RCTs in spinal surgery has markedly increased over the last 10 years, which indicates that RCTs are feasible in this field.
Study Design. Single-blind randomized study. Objectives. To compare the effectiveness of lumbar instrumented fusion with cognitive intervention and exercises in patients with chronic low back pain and disc degeneration. Summary of Background Data. To the authors' best knowledge, only one randomized study has evaluated the effectiveness of lumbar fusion. The Swedish Lumbar Spine Study reported that lumbar fusion was better than continuing physiotherapy and care by the family physician. Patients and Methods. Sixty-four patients aged 25-60 years with low back pain lasting longer than 1 year and evidence of disc degeneration at L4-L5 and/or L5-S1 at radiographic examination were randomized to either lumbar fusion with posterior transpedicular screws and postoperative physiotherapy, or cognitive intervention and exercises. The cognitive intervention consisted of a lecture to give the patient an understanding that ordinary physical activity would not harm the disc, and a recommendation to use the back and bend it. This was reinforced by three daily physical exercise sessions for 3 weeks. The main outcome measure was the Oswestry Disability Index. Results. At the 1-year follow-up visit, 97% of the patients, including 6 patients who had either not attended treatment or changed groups, were examined. The Oswestry Disability Index was significantly reduced from 41 to 26 after surgery, compared with 42 to 30 after cognitive intervention and exercises. The mean difference between groups was 2.3 (-6.7 to 11.4) (P = 0.33). Improvements in back pain, use of analgesics, emotional distress, life satisfaction, and return to work were not different. Fear-avoidance beliefs and fingertip-floor distance were reduced more after nonoperative treatment, and lower limb pain was reduced more after surgery. The success rate according to an independent observer was 70% after surgery and 76% after cognitive intervention and exercises.
The early complication rate in the surgical group was 18%. Conclusion. The main outcome measure showed equal improvement in patients with chronic low back pain and disc degeneration randomized to cognitive intervention and exercises, or lumbar fusion. This study examines prospectively the randomised, long-term, clinical and radiological results of the treatment of spondylitis patients by ventro-dorsal or ventral spine fusion. Group 1 consisted of 12 patients who (after ventral removal of the focus of infection and autologous bone grafting) were treated by dorsal instrumentation. Group 2 consisted of ten patients who, after similar ventral removal and bone interposition, were stabilised by ventral instrumentation. The patients prospectively underwent clinical and radiological studies. In addition, they were asked to fill in self-assessment questionnaires such as the Short-Form (SF)-36 health survey, the Oswestry questionnaire, and the visual analog scales (VAS). The postoperative follow-ups were at 6 months, 2 years, and 5.4 years. It proved possible to demonstrate clinically that patients with an isolated ventral spondylodesis feel significantly better and experience significantly less pain in the area of spinal fusion than patients with ventro-dorsal fusion 2 and 5.4 years after the operation. Over a number of years a stable fusion can be achieved through either operation. Ventral stabilisation yields more advantages than dorsal instrumentation in the long term. These advantages result in a clinically smoother course after the operation. If, in the individual case, ventral instrumentation is feasible, this method should be used. Résumé: The present study examines, prospectively and in randomised fashion, the long-term clinical and radiological outcome of treating spondylitis with an anterior or a posterior graft.
Group I included 12 patients in whom, after an anterior approach, abscess evacuation, and anterior grafting, posterior instrumentation was placed. In Group II, comprising 10 patients, after a similar anterior approach the bone graft was stabilised with anterior instrumentation. The patients were examined retrospectively on clinical and radiological grounds. They completed the SF-36 questionnaire, the Oswestry questionnaire, and the VAS. Postoperative follow-up was performed at six months, two years, and 5.4 years. This study clearly demonstrated that patients who received an anterior graft had significantly better results in terms of pain and fusion than patients treated with an anterior approach and a posterior graft. Compared with posterior instrumentation, this anterior stabilisation presents a number of long-term advantages. Anterior instrumentation is quite sufficient to treat these pathologies. BACKGROUND Management of degenerative spondylolisthesis with spinal stenosis is controversial. Surgery is widely used, but its effectiveness in comparison with that of nonsurgical treatment has not been demonstrated in controlled trials. METHODS Surgical candidates from 13 centers in 11 U.S. states who had at least 12 weeks of symptoms and image-confirmed degenerative spondylolisthesis were offered enrollment in a randomized cohort or an observational cohort. Treatment was standard decompressive laminectomy (with or without fusion) or usual nonsurgical care.
The primary outcome measures were the Medical Outcomes Study 36-Item Short-Form General Health Survey (SF-36) bodily pain and physical function scores (100-point scales, with higher scores indicating less severe symptoms) and the modified Oswestry Disability Index (100-point scale, with lower scores indicating less severe symptoms) at 6 weeks, 3 months, 6 months, 1 year, and 2 years. RESULTS We enrolled 304 patients in the randomized cohort and 303 in the observational cohort. The baseline characteristics of the two cohorts were similar. The one-year crossover rates were high in the randomized cohort (approximately 40% in each direction) but moderate in the observational cohort (17% crossover to surgery and 3% crossover to nonsurgical care). The intention-to-treat analysis for the randomized cohort showed no statistically significant effects for the primary outcomes. The as-treated analysis for both cohorts combined showed a significant advantage for surgery at 3 months that increased at 1 year and diminished only slightly at 2 years. The treatment effects at 2 years were 18.1 for bodily pain (95% confidence interval [CI], 14.5 to 21.7), 18.3 for physical function (95% CI, 14.6 to 21.9), and -16.7 for the Oswestry Disability Index (95% CI, -19.5 to -13.9). There was little evidence of harm from either treatment. CONCLUSIONS In nonrandomized as-treated comparisons with careful control for potentially confounding baseline factors, patients with degenerative spondylolisthesis and spinal stenosis treated surgically showed substantially greater improvement in pain and function during a period of 2 years than patients treated nonsurgically. (ClinicalTrials.gov number, NCT00000409.) To comprehend the results of a randomised controlled trial (RCT), readers must understand its design, conduct, analysis, and interpretation.
That goal can be achieved only through total transparency from authors. Despite several decades of educational efforts, the reporting of RCTs needs improvement. Investigators and editors developed the original CONSORT (Consolidated Standards of Reporting Trials) statement to help authors improve reporting by use of a checklist and flow diagram. The revised CONSORT statement presented here incorporates new evidence and addresses some criticisms of the original statement. The checklist items pertain to the content of the Title, Abstract, Introduction, Methods, Results, and Discussion. The revised checklist includes 22 items selected because empirical evidence indicates that not reporting this information is associated with biased estimates of treatment effect, or because the information is essential to judge the reliability or relevance of the findings. We intended the flow diagram to depict the passage of participants through an RCT. The revised flow diagram depicts information from four stages of a trial (enrolment, intervention allocation, follow-up, and analysis). The diagram explicitly shows the number of participants, for each intervention group, included in the primary data analysis. Inclusion of these numbers allows the reader to judge whether the authors have done an intention-to-treat analysis. In sum, the CONSORT statement is intended to improve the reporting of an RCT, enabling readers to understand a trial's conduct and to assess the validity of its results. CONTEXT Lumbar diskectomy is the most common surgical procedure performed for back and leg symptoms in US patients, but the efficacy of the procedure relative to nonoperative care remains controversial. OBJECTIVE To assess the efficacy of surgery for lumbar intervertebral disk herniation.
DESIGN, SETTING, AND PATIENTS The Spine Patient Outcomes Research Trial, a randomized clinical trial enrolling patients between March 2000 and November 2004 from 13 multidisciplinary spine clinics in 11 US states. Patients were 501 surgical candidates (mean age, 42 years; 42% women) with imaging-confirmed lumbar intervertebral disk herniation and persistent signs and symptoms of radiculopathy for at least 6 weeks. INTERVENTIONS Standard open diskectomy vs nonoperative treatment individualized to the patient. MAIN OUTCOME MEASURES Primary outcomes were changes from baseline for the Medical Outcomes Study 36-item Short-Form Health Survey bodily pain and physical function scales and the modified Oswestry Disability Index (American Academy of Orthopaedic Surgeons MODEMS version) at 6 weeks, 3 months, 6 months, and 1 and 2 years from enrollment. Secondary outcomes included sciatica severity as measured by the Sciatica Bothersomeness Index, satisfaction with symptoms, self-reported improvement, and employment status. RESULTS Adherence to assigned treatment was limited: 50% of patients assigned to surgery received surgery within 3 months of enrollment, while 30% of those assigned to nonoperative treatment received surgery in the same period. Intent-to-treat analyses demonstrated substantial improvements for all primary and secondary outcomes in both treatment groups. Between-group differences in improvements were consistently in favor of surgery for all periods but were small and not statistically significant for the primary outcomes. CONCLUSIONS Patients in both the surgery and the nonoperative treatment groups improved substantially over a 2-year period. Because of the large numbers of patients who crossed over in both directions, conclusions about the superiority or equivalence of the treatments are not warranted based on the intent-to-treat analysis.
TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT00000410. From August 1981 to February 1982, postoperative infections due to different strains of penicillin-resistant Staphylococcus aureus occurred in 20 of 467 patients (4.3%) undergoing elective cranial and spinal operations. These infections were not attributable to defects in procedures or the theatre environment; therefore chemoprophylaxis was instituted. In the following 8 months, when patients were given penicillin G and sulphadiazine for 5 days commencing immediately postoperatively, S. aureus infections occurred in five of 579 patients (0.9%). In a subsequent randomized uncontrolled study, infections occurred in six of 265 patients receiving penicillin (2.3%), three of 270 receiving penicillin and sulphadiazine (1.1%), and one of 45 receiving erythromycin (2.2%) immediately postoperatively for 5 days. In a further study in which 587 patients received penicillin for 5 days commencing immediately preoperatively, infections due to S. aureus occurred in six (1.1%). Infections due to gram-negative organisms were seen in five (0.4%) of 1167 patients in the two uncontrolled studies. BACKGROUND Lumbar-disk surgery often is performed in patients who have sciatica that does not resolve within 6 weeks, but the optimal timing of surgery is not known. METHODS We randomly assigned 283 patients who had had severe sciatica for 6 to 12 weeks to early surgery or to prolonged conservative treatment with surgery if needed. The primary outcomes were the score on the Roland Disability Questionnaire, the score on the visual-analogue scale for leg pain, and the patient's report of perceived recovery during the first year after randomization. Repeated-measures analysis according to the intention-to-treat principle was used to estimate the outcome curves for both groups.
RESULTS Of 141 patients assigned to undergo early surgery, 125 (89%) underwent microdiskectomy after a mean of 2.2 weeks. Of 142 patients designated for conservative treatment, 55 (39%) were treated surgically after a mean of 18.7 weeks. There was no significant overall difference in disability scores during the first year (P=0.13). Relief of leg pain was faster for patients assigned to early surgery (P<0.001). Patients assigned to early surgery also reported a faster rate of perceived recovery (hazard ratio, 1.97; 95% confidence interval, 1.72 to 2.22; P<0.001). In both groups, however, the probability of perceived recovery after 1 year of follow-up was 95%. CONCLUSIONS The 1-year outcomes were similar for patients assigned to early surgery and those assigned to conservative treatment with eventual surgery if needed, but the rates of pain relief and of perceived recovery were faster for those assigned to early surgery. (Current Controlled Trials number, ISRCTN26872154 [controlled-trials.com].) Purpose. To compare 2 methods of fusion in the treatment of lumbar spondylolisthesis: posterior lumbar interbody fusion (PLIF) and intertransverse fusion (ITF). Methods. 20 patients with lumbar spondylolisthesis were randomly allocated to one of 2 groups: decompression, posterior instrumentation, and PLIF (n=10), or decompression, posterior instrumentation, and ITF (n=10). The Oswestry low back pain disability questionnaire was used for clinical assessment. Radiography was performed preoperatively and postoperatively to assess the reduction of spondylolisthesis or slip. Results. In the PLIF and ITF groups, 87.5% and 100% had a satisfactory clinical result, and 48% and 39% had reduced spondylolisthesis, respectively. Both had a fusion rate of 100%. PLIF showed better reduction of spondylolisthesis, although ITF achieved a better subjective and clinical outcome. Conclusion.
Morbidity and complications are much higher following PLIF than ITF. ITF is recommended because of the simplicity of the procedure, lower complication rate, and good clinical and radiological results. Study Design. A prospective randomized study evaluating the efficacy of autologous fibrin tissue adhesive for decreasing postoperative cerebrospinal fluid (CSF) leak in spinal cord surgery. Objective. To compare postoperative CSF leak in 3 groups (i.e., autologous fibrin tissue adhesive used, commercial fibrin glue used, and no fibrin tissue adhesive used) of patients undergoing spinal surgery who needed dural incision. Summary of Background Data. Spinal cord operations, particularly when dural incision is inevitable, sometimes involve postoperative CSF leak. Because CSF leak is a serious complication, a countermeasure is necessary to prevent it after dural suture. Commercial fibrin tissue adhesive was formerly used. Because the possibility of prion infection was widely noticed, commercial fibrin tissue adhesive containing animal components has been used less often. Methods. In 13 of 39 cases in which dural incision would be made, 400 mL whole blood was drawn, and autologous fibrin tissue adhesive was made from plasma. Cases were divided into 3 groups: (1) dural closure alone, (2) use of autologous fibrin tissue adhesive after dural closure, and (3) use of commercial fibrin tissue adhesive after dural closure. The primary outcome measure was determined as the postoperative (3 days) volume of drainage fluid, and results were analyzed using the analysis of variance. The secondary outcome measures were general blood test, coagulation assay, and plasma fibrinogen, and these were also analyzed using the analysis of variance. Results. There was a significant difference in the primary outcome between the autologous and control groups. No complications such as infection or continuous CSF leak were observed in any case.
The mean volume of drainage fluid was 586.2 mL in the group with autologous fibrin tissue adhesive and 1026.1 mL in the group without fibrin tissue adhesive. The volume of drainage fluid was significantly lower in the former group than in the latter group. There was no statistical difference between the volumes of the group with autologous adhesive and with commercial adhesive (639.2 mL). Conclusions. We used autologous fibrin tissue adhesive as a new sealant after dural closure instead of commercial fibrin tissue adhesive. No definitive CSF leak was observed, and the volume of drainage fluid was significantly lower in the group with autologous fibrin tissue adhesive than in the group without fibrin tissue adhesive. The use of autologous fibrin tissue adhesive was superior to that of commercial fibrin tissue adhesive in cost. Study Design. A prospective randomized clinical study. Objective. To determine whether shaving the incision site before spinal surgery causes postsurgical infection. Summary of Background Data. Spine surgeons usually shave the skin of the incision site immediately before surgery is performed. However, evidence from some surgical series suggests that presurgical shaving may increase the postsurgical infection rate. To our knowledge, no previously published studies have addressed this issue. Methods. A total of 789 patients scheduled to undergo spinal surgery were randomly allocated into 2 groups: those in whom the site of operation was shaved immediately before surgery (shaved group; 371 patients) and the patients in whom presurgical shaving was not performed (unshaved group; 418 patients). The mean duration of anesthesia and the infection rates in both groups were recorded and compared. Results. The duration of anesthesia did not differ in the 2 groups (P > 0.05). A postoperative infection developed in 4 patients in the shaved group and in 1 patient in the nonshaved group (P < 0.01).
Conclusions. The shaving of the incision site immediately before spinal surgery may increase the rate of postoperative infection. The literature reports on the safety and efficacy of titanium cages (TCs) with additional posterior fixation for anterior lumbar interbody fusion. However, these papers are limited to prospective cohort studies. The introduction of TCs for spinal fusion has resulted in increased costs, without evidence of superiority over the established practice. There are currently no prospective controlled trials comparing TCs to femoral ring allografts (FRAs) for circumferential fusion in the literature. In this prospective, randomised controlled trial, our objective was to compare the clinical outcome following the use of FRA (current practice) to the use of TC in circumferential lumbar spinal fusion. Full ethical committee approval and institutional research and development departmental approval were obtained. Power calculations estimated a total of 80 patients (40 in each arm) would be required to detect clinically relevant differences in functional outcome. Eighty-three patients were recruited for the study fulfilling strict entry requirements (>6 months chronic discogenic low back pain, failure of conservative treatment, one- or two-level discographically proven discogenic low back pain). The patients completed the Oswestry Disability Index (ODI), Visual Analogue Score (VAS) for back and leg pain, and the Short-Form 36 (SF-36) preoperatively and also postoperatively at 6, 12 and 24 months, respectively. The results were available for all 83 patients with a mean follow-up of 28 months (range 24-75 months). Five patients were excluded on the basis of technical infringements (unable to insert TC in four patients and FRA in one patient due to narrowing of the disc space). From the remaining 78 patients randomised, 37 received the FRA and 41 received the TC.
Posterior stabilisation was achieved with translaminar or pedicle screws. Baseline demographic data (age, sex, smoking history, number of operated levels, and preoperative outcome measures) showed no statistical difference between groups (p<0.05) other than for the vitality domain of the SF-36. For patients who received the FRA, mean VAS (back pain) improved by 2.0 points (p<0.01), mean ODI improved by 15 points (p<0.01), and mean SF-36 scores improved by >11 points in all domains (p<0.03) except those of general health and emotional role. For patients who received the TC, mean VAS improved by 1.1 points (p=0.004), mean ODI improved by 6 points (p=0.01), and SF-36 improved significantly in only two of the eight domains (bodily pain and physical function). Revision procedures and complications were similar in both groups. In conclusion, this prospective, randomised controlled clinical trial shows the use of FRA in circumferential lumbar fusion to be associated with superior clinical outcomes when compared to those observed following the use of TCs. The use of TCs for circumferential lumbar spinal fusion is not justified on the basis of inferior clinical outcome and the tenfold increase in cost. Abstract Objectives To assess the clinical effectiveness of surgical stabilisation (spinal fusion) compared with intensive rehabilitation for patients with chronic low back pain. Design Multicentre randomised controlled trial. Setting 15 secondary care orthopaedic and rehabilitation centres across the United Kingdom. Participants 349 participants aged 18-55 with chronic low back pain of at least one year's duration who were considered candidates for spinal fusion. Intervention Lumbar spine fusion or an intensive rehabilitation programme based on principles of cognitive behaviour therapy.
Main outcome measures The primary outcomes were the Oswestry disability index and the shuttle walking test measured at baseline and two years after randomisation. The SF-36 instrument was used as a secondary outcome measure. Results 176 participants were assigned to surgery and 173 to rehabilitation. 284 (81%) provided follow-up data at 24 months. The mean Oswestry disability index changed favourably from 46.5 (SD 14.6) to 34.0 (SD 21.1) in the surgery group and from 44.8 (SD 14.8) to 36.1 (SD 20.6) in the rehabilitation group. The estimated mean difference between the groups was -4.1 (95% confidence interval -8.1 to -0.1, P = 0.045) in favour of surgery. No significant differences between the treatment groups were observed in the shuttle walking test or any of the other outcome measures. Conclusions Both groups reported reductions in disability during two years of follow-up, possibly unrelated to the interventions. The statistical difference between treatment groups in one of the two primary outcome measures was marginal and only just reached the predefined minimal clinical difference, and the potential risk and additional cost of surgery also need to be considered. No clear evidence emerged that primary spinal fusion surgery was any more beneficial than intensive rehabilitation. OBJECTIVE We have conducted a prospective study to evaluate the results and effectiveness of bilateral decompression via a unilateral laminectomy in 50 patients with 98 levels of degenerative lumbar spinal stenosis without instability. METHODS Clinical outcomes were assessed using the Visual Analog Scale, Oswestry Disability Index, Short Form-36, and a subjective Satisfaction Measurement. RESULTS Adequate decompression was achieved in all patients. The mean follow-up time was 22.8 months (range 19-47 months). Surgical decompression resulted in a dramatic reduction of overall pain in all patients (late postoperative VAS score was 2.16 ± 0.81).
The ODI scores decreased significantly in early and late follow-up evaluations, and the SF-36 scores demonstrated significant improvement in late follow-up results in our series. The patient satisfaction rate was 94%, and the improvement rate was 96%. CONCLUSION For degenerative lumbar spinal stenosis with or without mild degenerative spondylolisthesis, the unilateral approach allowed sufficient and safe decompression of the neural structures and adequate preservation of vertebral stability, resulted in a highly significant reduction of symptoms and disability, and improved health-related quality of life. PURPOSE To prospectively assess the short-term clinical outcome of patients with subacute or chronic painful osteoporotic vertebral compression fractures (VCF) treated with percutaneous vertebroplasty (PV) compared with optimal pain medication (OPM). METHODS Randomization of patients into 2 groups: treatment by PV or OPM. After 2 weeks, patients from the OPM arm could change therapy to PV. Patients were evaluated 1 day and 2 weeks after treatment. Visual analog score (VAS) for pain and analgesic use were assessed before, and 1 day and 2 weeks after, the start of treatment. Quality of Life Questionnaire of the European Foundation for Osteoporosis (QUALEFFO) and Roland-Morris Disability (RMD) questionnaire scores were assessed before and 2 weeks after the start of treatment. Follow-up scores in patients requesting PV treatment after 2 weeks of OPM treatment were compared with scores during their OPM period. RESULTS Eighteen patients treated with PV, compared with 16 patients treated with OPM, had significantly better VAS scores and used less analgesics 1 day after treatment. Two weeks after treatment, the mean VAS was lower but not significantly different in patients treated with OPM, whereas these patients used significantly less analgesics and had better QUALEFFO and RMD scores.
Scores in the PV arm were influenced by the occurrence of new VCF in 2 patients. After 2 weeks of OPM, 14 patients requested PV treatment. All scores, 1 day and 2 weeks after PV, were significantly better compared with scores during conservative treatment. CONCLUSION Pain relief and improvement of mobility, function, and stature after PV is immediate and significantly better in the short term compared with OPM treatment. Abstract We performed a randomised controlled study to assess the accuracy of computer-assisted pedicle screw insertion versus conventional screw placement under clinical conditions. One hundred patients scheduled for posterior thoracolumbar or lumbosacral pedicle screw instrumentation were randomised into two groups, either for conventional pedicle screw placement or computer-assisted screw application using an optoelectronic navigation system. From the computer-assisted group, nine patients were excluded: one because of an inadequate preoperative computed tomography study, seven because of problems with the specific instruments or the computer system, and one because of an intraoperative anesthesiological complication. Thus, there were 50 patients in the conventional group and 41 in the computer-assisted group, and the number of screws inserted was 277 and 219, respectively. There was no statistical difference between the groups concerning age, gender, diagnosis, type of operation performed, mean operating time, blood loss, or number of screws inserted. The time taken for screw insertion was significantly longer in the computer-assisted group. Postoperatively, screw positions were assessed by an independent radiologist using a sophisticated CT imaging protocol. The pedicle perforation rate was 13.4% in the conventional group and 4.6% in the computer-assisted group (P = 0.006).
Pedicle perforations of more than 4 mm were found in 1.4% (4/277) of the screw insertions in the conventional group, and none in the computer-assisted group. Complications not related to pedicle screws were two L5 nerve root lesions, one end-plate fracture, one major intraoperative bleeding, and one postoperative death in the conventional group, and one deep infection in the computer-assisted group. In conclusion, pedicular screws were inserted more accurately with image-guided computer navigation than with conventional methods. Patients suffering from neurogenic intermittent claudication secondary to lumbar spinal stenosis have historically been limited to a choice between a decompressive laminectomy with or without fusion or a regimen of non-operative therapies. The X STOP Interspinous Process Distraction System (St. Francis Medical Technologies, Concord, Calif.), a new interspinous implant for patients whose symptoms are exacerbated in extension and relieved in flexion, has been available in Europe since June 2002. This study reports the results from a prospective, randomized trial of the X STOP conducted at nine centers in the U.S. Two hundred patients were enrolled in the study and 191 were treated; 100 received the X STOP and 91 received non-operative therapy (NON OP) as a control. The Zurich Claudication Questionnaire (ZCQ) was the primary outcomes measurement. Validated for lumbar spinal stenosis patients, the ZCQ measures physical function, symptom severity, and patient satisfaction. Patients completed the ZCQ upon enrollment and at follow-up periods of 6 weeks, 6 months, and 1 year. Using the ZCQ criteria, at 6 weeks the success rate was 52% for X STOP patients and 10% for NON OP patients. At 6 months, the success rates were 52% and 9%, respectively, and at 1 year, 59% and 12%.
The results of this prospective study indicate that the X STOP offers a significant improvement over non-operative therapies at 1 year, with a success rate comparable to published reports for decompressive laminectomy but with considerably lower morbidity. Study Design. A prospective clinical trial was conducted. Objectives. To compare the results of fusion versus nonfusion for surgically treated burst fractures of the thoracolumbar and lumbar spine. Summary of Background Data. The operative results of surgically treated burst fractures with short segmental fixation have been well documented. There is no report comparing the results of fusion and nonfusion. Methods. Fifty-eight patients were included in this study, with the inclusion criteria as follows: neurologically intact spine with a kyphotic angle ≥20°, decreased vertebral body height ≥50% or a canal compromise ≥50%, incomplete neurologic deficit with a canal compromise <50%, complete neurologic deficit, and multilevel spinal injury or multiple traumas. All patients were randomly assigned to fusion or nonfusion groups, and operative treatment with posterior reduction and instrumentation was carried out. Posterior fusion with autogenous bone graft was performed for the fusion group (n = 30), and no fusion procedure was done for the nonfusion group (n = 28). The average follow-up period was 41 months (range, 24-71 months). Results. The average loss of kyphotic angle was not statistically significant between these 2 groups. The radiographic parameters were statistically significantly better in the nonfusion group, including angular change in the flexion-extension lateral view (4.8° vs. 1.0°), lost correction of decreased vertebral body height (3.6% vs. 8.3%), intraoperative estimated blood loss (303 mL vs. 572 mL), and operative time (162 minutes vs. 224 minutes).
The scores on the low back outcome scale were not statistically significant for these 2 groups. Conclusions. The short-term results of short segmental fixation without fusion for surgically treated burst fractures of the thoracolumbar spine were satisfactory. The advantages of instrumentation without fusion are the elimination of donor site complications, saving more motion segments, and reducing blood loss and operative time. STUDY DESIGN A cohort of 100 patients with symptomatic lumbar spinal stenosis, characterized in a previous article, were given surgical or conservative treatment and followed for 10 years. OBJECTIVES To identify the short- and long-term results after surgical and conservative treatment, and to determine whether clinical or radiologic predictors for the treatment result can be defined. SUMMARY OF BACKGROUND DATA Surgical decompression has been considered the rational treatment. However, clinical experience indicates that many patients do well with conservative treatment. METHODS In this study, 19 patients with severe symptoms were selected for surgical treatment and 50 patients with moderate symptoms for conservative treatment, whereas 31 patients were randomized between the conservative (n = 18) and surgical (n = 13) treatment groups. Pain was decisive for the choice of treatment group. All patients were observed for 10 years by clinical evaluation and questionnaires. The results, evaluated by patient and physician, were rated as excellent, fair, unchanged, or worse. RESULTS After a period of 3 months, relief of pain had occurred in most patients. Some had relief earlier, whereas for others it took 1 year. After a period of 4 years, excellent or fair results were found in half of the patients selected for conservative treatment, and in four fifths of the patients selected for surgery.
Patients with an unsatisfactory result from conservative treatment were offered delayed surgery after 3 to 27 months (median, 3.5 months). The treatment result of delayed surgery was essentially similar to that of the initial group. The treatment result for the patients randomized for surgical treatment was considerably better than for the patients randomized for conservative treatment. Clinically significant deterioration of symptoms during the final 6 years of the follow-up period was not observed. Patients with multilevel afflictions, surgically treated or not, did not have a poorer outcome than those with single-level afflictions. Clinical or radiologic predictors for the final outcome were not found. There were no dropouts, except for 14 deaths. CONCLUSIONS The outcome was most favorable for surgical treatment. However, an initial conservative approach seems advisable for many patients because those with an unsatisfactory result can be treated surgically later with a good outcome. Study Design. This prospective randomized study compared 3 fusion methods: posterolateral fusion (PLF), posterior lumbar interbody fusion (PLIF), and PLIF combined with PLF (PLF+PLIF). Objectives. To compare the outcomes of the 3 fusion methods and find a useful fusion method. Summary of Background Data. Many studies have shown clinical results, advantages, and postoperative complications of each fusion method, but few have compared the 3 fusion methods prospectively. Methods. A total of 167 patients who underwent 1 or 2-level fusion surgery because of degenerative lumbar disease from January 1996 to September 2000 were studied. Minimum follow-up was 3 years. The patients were randomized into 1 of 3 treatment groups: group 1 (PLF; n = 62); group 2 (PLIF; n = 57); and group 3 (PLF+PLIF; n = 48).
A visual analog scale, the Oswestry Disability Questionnaire, and Kirkaldy-Willis criteria were used to measure low back pain, leg pain, and disability. For radiologic evaluation, disc height, lumbar lordosis, segmental angle, and bone union were examined. Postoperative complications were also analyzed. Results. At the last follow-up, good or excellent results were obtained in 50 cases of PLF (80.7%), 50 cases of PLIF (87.8%), and 41 cases of PLF+PLIF (85.5%). No statistical differences were found among the 3 groups (P = 0.704). All methods indicated significant improvement in the disc height (P < 0.05), with PLF having the highest loss in disc height. Lumbar lordosis and segmental angle increased significantly, and improvement of the segmental angle in the 3 fusion methods had statistically significant differences. The nonunion rates at the last follow-up in the 3 fusion groups were not statistically significant, with 8% in group 1, 5% in group 2, and 4% in group 3 (P > 0.05). Complications included deep infection in 3 cases, transient nerve palsy in 4, permanent nerve palsy in 1, and donor site pain in 6. Conclusions. No significant differences in clinical results and union rates were found among the 3 fusion methods. PLIF had better sagittal balance than PLF. PLIF without PLF had the advantages of the elimination of donor site pain, shorter operating time, and less blood loss. Study Design. A randomized controlled trial. Objectives. To assess the effectiveness of decompressive surgery as compared with nonoperative measures in the treatment of patients with lumbar spinal stenosis. Summary of Background Data. No previous randomized trial has assessed the effectiveness of surgery in comparison with conservative treatment for spinal stenosis. Methods.
Four university hospitals agreed on the classification of the disease, inclusion and exclusion criteria, radiographic routines, surgical principles, nonoperative treatment options, and follow-up protocols. A total of 94 patients were randomized into a surgical or nonoperative treatment group: 50 and 44 patients, respectively. Surgery comprised undercutting laminectomy of the stenotic segments, in 10 patients augmented with transpedicular fusion. The primary outcome was based on assessment of functional disability using the Oswestry Disability Index (scale, 0–100). Data on the intensity of leg and back pain (scales, 0–10), as well as self-reported and measured walking ability, were compiled at randomization and at follow-up examinations at 6, 12, and 24 months. Results. Both treatment groups showed improvement during follow-up. At 1 year, the mean difference in favor of surgery was 11.3 in disability (95% confidence interval [CI], 4.3–18.4), 1.7 in leg pain (95% CI, 0.4–3.0), and 2.3 (95% CI, 1.1–3.6) in back pain. At the 2-year follow-up, the mean differences were slightly less: 7.8 in disability (95% CI, 0.8–14.9), 1.5 in leg pain (95% CI, 0.3–2.8), and 2.1 in back pain (95% CI, 1.0–3.3). Walking ability, either reported or measured, did not differ between the two treatment groups. Conclusions. Although patients improved over the 2-year follow-up regardless of initial treatment, those undergoing decompressive surgery reported greater improvement regarding leg pain, back pain, and overall disability. The relative benefit of initial surgical treatment diminished over time, but outcomes of surgery remained favorable at 2 years.
Longer follow-up is needed to determine if these differences persist. OBJECTIVE We compared the intra- and postoperative differences, as well as the final outcome of patients with herniated lumbar discs who underwent either open discectomy (OD) or microendoscopic discectomy (MED). METHODS We performed a prospective controlled randomized study of 40 patients with sciatica caused by lumbar disc herniations nonresponsive to conservative treatment who underwent OD or MED with a 24-month follow-up period. Pre- and postoperative neurological status, pain, and functional outcome were evaluated. Other studied variables were the duration of the procedure, blood loss, time of hospital stay, and time to return to work. Statistical analysis with a P value less than 0.005 was carried out. RESULTS The only statistically significant differences found were for size of the incision, length of hospital stay, and operative time. The former two were greater in the OD group (P < 0.01 and P = 0.05, respectively), and the latter was greater in the MED group (P < 0.01). CONCLUSION The few parameters that were found to be statistically significant between the groups did not affect the overall outcome. In the current series, the final clinical and neurological results were similarly satisfactory in both the OD and the MED groups. Study Design. Small area analysis. Objectives. To determine the association between the rates of advanced spinal imaging and spine surgery across geographic areas. Summary of Background Data. The rates of spine surgery in the United States have increased along with a concurrent rise in the use of advanced spinal imaging: CT and MRI. Spine surgery rates vary six-fold across geographic areas of the United States. Differences in patient populations and health care supply have explained only about 10% of this variation. Methods.
We used a random 5% sample of Medicare's National Claims History Part B files for 1996 and 1997 to determine procedure rates across 306 Hospital Referral Regions. We analyzed the association between spinal imaging and spine surgery using linear regression. Main outcome measures were rates of procedures and coefficients of determination (R2). Results. The rates of advanced spinal imaging (CT and MRI combined) varied 5.5-fold across geographic areas. Areas with higher rates of MRI had higher rates of spine surgery overall (r = 0.46) and spinal stenosis surgery specifically (r = 0.37). The rates of advanced spinal imaging accounted for 22% of the variability in overall spine surgery rates (R2 = 0.22, P < 0.001) and 14% of the variability in lumbar stenosis surgery rates (R2 = 0.14, P < 0.001). A simulation model showed that MRIs obtained in the patients undergoing surgery accounted for only a small part of the correlation between MRI and total spine surgery rates. Conclusions. A significant proportion of the variation in rates of spine surgery can be explained by differences in the rates of advanced spinal imaging. The indications for advanced spinal imaging are not firmly agreed on, and the appropriateness of many of these imaging studies has been questioned. Improved consensus on the use and interpretation of advanced spinal imaging studies could have an important effect on variation in spine surgery rates. Study Design. A randomized, controlled, prospective multicenter trial comparing the outcomes of neurogenic intermittent claudication (NIC) patients treated with the interspinous process decompression system (X STOP) with patients treated nonoperatively. Objective. To determine the safety and efficacy of the X STOP interspinous implant. Summary of Background Data.
Patients suffering from NIC secondary to lumbar spinal stenosis have been limited to a choice between nonoperative therapies and decompressive surgical procedures, with or without fusion. The X STOP was developed to provide an alternative therapeutic treatment. Methods. 191 patients were treated, 100 in the X STOP group and 91 in the control group. The primary outcomes measure was the Zurich Claudication Questionnaire, a patient-completed, validated instrument for NIC. Results. At every follow-up visit, X STOP patients had significantly better outcomes in each domain of the Zurich Claudication Questionnaire. At 2 years, the X STOP patients improved by 45.4% over the mean baseline Symptom Severity score compared with 7.4% in the control group; the mean improvement in the Physical Function domain was 44.3% in the X STOP group and −0.4% in the control group. In the X STOP group, 73.1% of patients were satisfied with their treatment compared with 35.9% of control patients. Conclusions. The X STOP provides a conservative yet effective treatment for patients suffering from lumbar spinal stenosis. In the continuum of treatment options, the X STOP offers an attractive alternative to both conservative care and decompressive surgery. Objective To determine whether the faster recovery after early surgery for sciatica compared with prolonged conservative care is attained at reasonable costs. Design Cost utility analysis alongside a randomised controlled trial. Setting Nine Dutch hospitals. Participants 283 patients with sciatica for 6-12 weeks, caused by lumbar disc herniation. Interventions Six months of prolonged conservative care compared with early surgery. Main outcome measures Quality adjusted life years (QALYs) at one year and societal costs, estimated from patient reported utilities (UK and US EuroQol, SF-6D, and visual analogue scale) and diaries on costs (healthcare, patient's costs, and productivity).
Results Compared with prolonged conservative care, early surgery provided faster recovery, with a gain in QALYs according to the UK EuroQol of 0.044 (95% confidence interval 0.005 to 0.083), the US EuroQol of 0.032 (0.005 to 0.059), the SF-6D of 0.024 (0.003 to 0.046), and the visual analogue scale of 0.032 (−0.003 to 0.066). From the healthcare perspective, early surgery resulted in higher costs (difference €1819 (£1449; $2832), 95% confidence interval €842 to €2790), with a cost utility ratio per QALY of €41 000 (€14 000 to €430 000). From the societal perspective, savings on productivity costs led to a negligible total difference in cost (€−12, €−4029 to €4006). Conclusions Faster recovery from sciatica makes early surgery likely to be cost effective compared with prolonged conservative care. The estimated difference in healthcare costs was acceptable and was compensated for by the difference in absenteeism from work. For a willingness to pay of €40 000 or more per QALY, early surgery need not be withheld for economic reasons. Trial registration Current Controlled Trials ISRCTN 26872154. We studied 117 adult patients undergoing posterior lumbar spinal fusion and instrumentation using bone grafts from the iliac crest between February 1999 and January 2001. All patients had degenerative disease of the lumbar spine, and all were operated upon by the same surgeon. Patients were randomized to have the iliac bone graft harvested either through a separate incision (traditional approach) or utilizing the same midline incision as used for the spinal surgery (intrafascial approach). Total volume of harvested graft, blood loss, pain, complications, and patient satisfaction were evaluated with a minimum of 2-year follow-up. There were no infections.
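The cost utility ratio per QALY reported in the sciatica trial above is simply the incremental healthcare cost divided by the incremental QALY gain. A minimal sketch of that arithmetic, hard-coding only the point estimates quoted in the abstract (the confidence bounds are not reproduced here):

```python
# Incremental cost-effectiveness ratio (ICER): cost difference divided by
# QALY gain, here using the healthcare-perspective point estimates from
# the sciatica cost-utility trial summarized above.
def icer(cost_difference: float, qaly_gain: float) -> float:
    """Cost per quality-adjusted life year gained."""
    return cost_difference / qaly_gain

cost_diff_eur = 1819       # early surgery minus prolonged conservative care
qaly_gain_uk_eq5d = 0.044  # QALY gain on the UK EuroQol at one year

ratio = icer(cost_diff_eur, qaly_gain_uk_eq5d)
print(round(ratio))  # prints 41341, matching the reported ~EUR 41 000/QALY
```

The same function applied from the societal perspective (cost difference €−12) illustrates why the abstract calls that total difference negligible.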
The average volume of harvested bone was 17.2 cc versus 14.7 cc; total blood loss was 168 cc versus 96 cc; total complication rate was 20% versus 8%; and overall satisfaction rate was 81% versus 96%, respectively. The intrafascial graft harvesting technique minimizes morbidity and increases patient satisfaction compared with the traditional bone harvesting technique. The effectiveness of lumbar fusion for chronic low back pain after surgery for disc herniation has not been evaluated in a randomized controlled trial. The aim of the present study was to compare the effectiveness of lumbar fusion with posterior transpedicular screws and cognitive intervention and exercises.
Sixty patients aged 25–60 years with low back pain lasting longer than 1 year after previous surgery for disc herniation were randomly allocated to the two treatment groups. Experienced back surgeons performed transpedicular fusion. Cognitive intervention consisted of a lecture intended to give the patient an understanding that ordinary physical activity would not harm the disc and a recommendation to use the back and bend it. This was reinforced by three daily physical exercise sessions for 3 weeks. The primary outcome measure was the Oswestry Disability Index (ODI). Outcome data were analyzed on an intention-to-treat basis. Ninety-seven percent of the patients, including seven of eight patients who had either not attended treatment (n = 5) or changed groups (n = 2), completed 1-year follow-up. ODI was significantly improved from 47 to 38 after fusion and from 45 to 32 after cognitive intervention and exercises. The mean difference between treatments after adjustment for gender was −7.3 (95% CI −17.3 to 2.7, p = 0.15). The success rate was 50% in the fusion group and 48% in the cognitive intervention/exercise group. For patients with chronic low back pain after previous surgery for disc herniation, lumbar fusion failed to show any benefit over cognitive intervention and exercises. STUDY DESIGN Prospective randomized comparison of anterior lumbar interbody fusion (ALIF) plus transpedicular instrumentation plus posterolateral fusion (PLF) (360 degrees fusion) to ALIF plus transpedicular instrumentation without PLF (270 degrees fusion). OBJECTIVES To compare the clinical outcomes, costs, and utilization of health resources of 360 degrees versus 270 degrees fusions. BACKGROUND The 360 degrees fusion is effective, but its costs and utilization of health resources are high. The PLF often resorbs and may not be necessary.
METHODS Before and after surgery, pain was measured by the Numerical Rating Scale (NRS), and function was measured by the Oswestry Low Back Disability Index (OSI). Costs were calculated from billing records. Operating times, blood loss, and hospital stays were measured at the time of hospital discharge. RESULTS There were 48 patients: 21 women and 27 men. Mean age was 42 years. Follow-up averaged 35 months (range 24-45 months). In both 360 degrees and 270 degrees fusions, there were significant improvements in NRS and OSI, and the percentage of solid ALIF was high. Only 14% of PLF appeared solid bilaterally and 18% appeared solid on one side only. There were no significant differences in changes in NRS, changes in OSI, or percentage solid ALIF between the 360 degrees and 270 degrees fusions. However, the 270 degrees fusion group had significantly less blood loss, shorter operative times, shorter hospital stays, and lower professional fees, and although hospital charges were lower, this difference was not significant. CONCLUSION Both the 360 degrees and 270 degrees fusions significantly reduce pain and improve function, and there are no significant clinical differences between them. However, there were shorter operating times, less blood loss, lower costs, and less utilization of health care resources associated with the 270 degrees fusions. OBJECT The authors evaluated a new minimally invasive spinal surgery technique to correct degenerative lumbar spinal stenosis involving a split-spinous process laminotomy and discectomy (also known as the "Marmot operation"). METHODS This prospective study randomized 70 patients with lumbar stenosis to undergo either a Marmot operation (40 patients) or a conventional laminectomy (30 patients), with or without discectomy.
Spinal anteroposterior diameter, cross-sectional area, lateral recess distance, spinal stability, postoperative back pain, functional outcomes, and muscular trauma were evaluated. The follow-up ranged from 10 to 18 months, with a mean of 15.1 months for the Marmot operation group and 14.8 months for the conventional laminectomy group. Compared with patients in the conventional laminectomy group, patients who received a Marmot operation had a shorter mean postoperative duration until ambulation without assistance, a reduced mean duration of hospital stay, a lower mean creatine phosphokinase-muscular-type isoenzyme level, a lower visual analog scale score for back pain at 1-year follow-up, and a better recovery rate. These patients also had a longer mean duration of operative time and a greater mean blood loss compared with the conventional group. Satisfactory neurological decompression and symptom relief were achieved in 93% of these patients. Most of the patients (66%) in this group needed discectomy for decompression. The postoperative mean lateral recess width, spinal anteroposterior diameter, and cross-sectional area were all significantly increased. There was no evidence of spinal instability in any patient. One patient with insufficient lateral recess decompression and recurrent disc herniation needed additional conventional laminectomy and discectomy, and one patient with mild superficial wound infection was successfully treated with antibiotics and frequent dressing changes. CONCLUSIONS A Marmot operation may provide effective spinal decompression. Although this method requires more operative time than a conventional method, it may involve only minimal muscular trauma, spinal stability maintenance, and early mobilization; shorten the duration of hospital stay; reduce postoperative back pain; and provide satisfactory neurological and functional outcomes. Study Design.
A prospective, randomized clinical study comparing β-tricalcium phosphate (β-TCP) with autograft bone graft, with follow-up of 3 years. Objective. To determine the efficacy of β-TCP as a bone graft substitute combined with local autograft obtained from decompression compared with the use of autologous iliac crest bone graft in single-level instrumented posterolateral lumbar fusion. Summary of Background Data. A variety of bone graft substitutes have been used in posterolateral lumbar fusion with different efficacy reported, but no controlled study has been conducted on the clinical performance of β-TCP in instrumented posterolateral lumbar fusion. Methods. Sixty-two patients with symptomatic degenerative lumbar spinal stenosis were treated with single-level instrumented posterolateral lumbar fusion. They were randomly assigned to fusion with β-TCP combined with local bone obtained from the decompression (group A, n = 32) or autogenous iliac crest bone graft plus decompression bone (group B, n = 30). The patients were followed up for 3 years after surgery. The results were assessed clinically and radiographically. Results. There were no significant differences in recovery rate of Japanese Orthopedic Association score and SF-36 score at all time intervals. Successful radiographic fusion was documented in all patients in both treatment groups. All patients in group B, however, complained of bone graft donor site pain, although significant improvement of pain was observed during the follow-up. Conclusion. Instrumented posterolateral fusion with β-TCP combined with local autograft results in the same radiographic fusion rates and similar improvement of clinical outcomes and life quality compared with autograft alone.
The authors therefore recommend the use of β-TCP as a bone graft substitute for instrumented posterolateral fusion of the lumbar spine to eliminate the need for bone graft harvesting from the ilium. Study Design. Prospective, single-blinded, randomized study. Objectives. To evaluate the efficacy of dilute betadine irrigation of spinal surgical wounds in prevention of postoperative wound infection. Summary and Background. Deep wound infection is a serious complication of spinal surgery that can jeopardize patient outcomes and increase costs. Povidone-iodine is a widely used antiseptic with bactericidal activity against a wide spectrum of pathogens, including methicillin-resistant Staphylococcus aureus. The aim of this study was to evaluate the efficacy of dilute betadine solution in the prevention of wound infection after spinal surgery. Methods. Four hundred and fourteen patients undergoing spinal surgery were randomly assigned to two groups. In group 1 (208 patients), surgical wounds were irrigated with dilute betadine solution (3.5% betadine) before wound closure. Betadine irrigation was not used in group 2 (206 patients). Otherwise, perioperative management was the same for both groups. Results. Mean length of follow-up was 15.5 months in both groups (range, 6–24 months). No wound infection occurred in group 1. One superficial infection (0.5%) and six deep infections (2.9%) occurred in group 2. The differences between the deep infection rate (P = 0.0146) and total infection rate (P = 0.0072) were significant between the two groups. Conclusions. Our report is the first prospective, single-blinded, randomized study to evaluate the clinical effectiveness of dilute betadine solution irrigation for prevention of wound infection following spinal surgery.
We recommended this simple and inexpensive measure following spinal surgery, particularly in patients with accidental wound contamination, risk factors for wound infection, or undergoing surgery in the absence of routine ultraviolet light, laminar flow, and isolation suits. The use of autologous blood is a well established and extremely popular technique to decrease the necessity for homologous transfusions and the attendant risks of hepatitis, HIV, and HTLV-I/II infections. The most beneficial timing for autologous reinfusion of predonated blood remains unknown. The present study was undertaken to determine the optimal timing of autologous blood reinfusion in elective spinal surgery. Fifty-seven patients were prospectively and individually randomly allocated into early versus delayed reinfusion groups prior to undergoing elective spinal surgery by a single surgeon. Three surgical subgroups were entered into the study: anterior/posterior (A/P) spinal fusion patients, posterior thoracolumbar scoliosis fusion patients (PSF), and degenerative posterior lumbar fusion patients (LF). Randomization was successful in that there was no significant difference in male to female ratio, age, preoperative hemoglobin, or number of units predonated between the early and delayed reinfusion groups. Likewise, there was no significant difference in the details of the operative procedure when compared as a group for the early versus delayed reinfusion groups. A significant increase in the postoperative day #1, 2, and 3 hemoglobin was seen in the early reinfusion group, while there was no significant difference seen in the postoperative day #7 hemoglobin between the early versus delayed reinfusion groups. There was no effect of surgical grouping on these significant comparisons. Earlier patient mobilization was also seen in the early reinfusion groups for the A/P and PSF groups.
There was no difference in patients' subjective evaluation of satisfaction and discomfort between the early or delayed reinfusion groups as determined by blinded interview on days 1, 3, 5, and 7 postoperatively. (ABSTRACT TRUNCATED AT 250 WORDS) Study Design. A prospective randomized study involving 280 consecutive cases of lumbar disc herniation managed either by an endoscopic discectomy alone or an endoscopic discectomy combined with an intradiscal injection of a low dose (1000 U) of chymopapain. Objective. To compare outcome, complications, and reherniations of both techniques. Summary of Background Data. Despite a low complication rate, posterolateral endoscopic nucleotomy has made a lengthy evolution because of an assumed limited indication. Chemonucleolysis, however, proven to be safe and effective, has not continued to be accepted by the majority in the spinal community, as microdiscectomy is considered to be more reliable. Method. A total of 280 consecutive patients with a primary herniated, including sequestrated, lumbar disc with predominant leg pain was randomized. A clinical follow-up was performed at 3 months, and at 1 and 2 years after the index operation, with an extensive questionnaire including the visual analog scale for pain and the MacNab criteria. The cohort integrity at 3 months was 100%, at 1 year 96%, and at 2 years 92%. Results. At the 3-month evaluation, only minor complications were registered. At 1 year postoperatively, group 1 (endoscopy alone) had a recurrence rate of 6.9% compared to group 2 (the combination therapy), with a recurrence rate of 1.6%, which was a statistically significant difference in favor of the combination therapy (P = 0045). At the 2-year follow-up, group 1 reported that 85.4% had an excellent or good result, 6.9% a fair result, and 7.7% were not satisfied.
At the 2-year follow-up, group 2 reported that 93.3% had an excellent or good result, 2.5% a fair result, and 4.2% were not satisfied. This outcome was statistically significant in favor of the group including chymopapain. There were no infections or patients with any form of permanent iatrogenic nerve damage, and no patients had a major complication. Conclusions. A high percentage of patient satisfaction could be obtained with a posterior lateral endoscopic discectomy for lumbar disc herniation, and a statistically significant improvement of the results was obtained when an intradiscal injection of 1000 U of chymopapain was added. There was a low recurrence rate with no major complications. The method can be applied in any type of lumbar disc herniation, including the L5−S1 level. We investigated the efficacy of a single dose of 1 g of cephazolin in reducing postoperative infections in patients undergoing 'clean' operations on the lumbar spine. In a double-blind, randomised trial there were 21 wound or urinary infections in the 71 patients who received placebo and nine in the 70 who received cephazolin (p < 0.05). Nine of the placebo patients (12.7%) developed wound infections (complicated by bacteraemia in two) compared with three (4.3%) in the cephazolin group (p = 0.07). Hospital stay was longer for infected patients than for non-infected patients (p < 0.05). Cephazolin-resistant pathogens were isolated more frequently from patients who received cephazolin than from those who received placebo. Study Design. A prospective, randomized study on patients who underwent posterior lumbar decompression with bilateral posterolateral arthrodesis. Objective. To determine the long-term influence of pseudarthrosis on the clinical outcome of patients with degenerative spondylolisthesis and spinal stenosis. Summary of Background Data.
Spinal decompression and posterolateral arthrodesis have been shown to be beneficial in the surgical treatment of symptomatic spinal stenosis with concurrent spondylolisthesis. Methods. Forty-seven patients with single-level symptomatic spinal stenosis and spondylolisthesis were prospectively studied. Patients were treated with posterior decompression and bilateral posterolateral arthrodesis with autogenous bone graft. Radiographic evaluation was used to determine whether fusion or pseudarthrosis was present. The solid fusion and pseudarthrosis groups were analyzed clinically, roentgenographically, and with a validated self-administered spinal stenosis questionnaire. Results. Forty-seven patients were available for review at a range of follow-up from 5 to 14 years. Average follow-up was 7 years 8 months. Clinical outcome was excellent to good in 86% of patients with a solid arthrodesis and in 56% of patients with a pseudarthrosis (P = 0.01). Significant differences in residual back and lower limb pain were discovered between the two groups using a scale ranging from 0 (no pain) to 5 (severe pain). Preoperative back and lower limb pain scores were statistically similar between the two groups. The solid fusion group performed significantly better in the symptom severity and physical function categories on the self-administered questionnaire. The two groups had similar results in the patient satisfaction category of this questionnaire. Conclusions. In patients undergoing single-level decompression and posterolateral arthrodesis for spinal stenosis and concurrent spondylolisthesis, a solid fusion improves long-term clinical results.
Benefits of a successful arthrodesis over pseudarthrosis were demonstrated with respect to back and lower limb symptomatology, in contrast to prior shorter-term studies, which indicated no significant difference in clinical outcome between the two groups. Objective: A prospective randomized study was conducted to determine whether there exist any differences in radiographic, clinical, or functional outcomes when individuals with stable burst fractures of the thoracolumbar junction without neurologic deficit are treated with either a posterior fusion with instrumentation or anterior reconstruction, fusion, and instrumentation. There is relatively little literature evaluating the outcomes of individuals treated with anterior surgery, and no prospective randomized studies exist comparing the two treatment approaches. Methods: From May 1995 to March 2001, a consecutive series of subjects with acute isolated burst fractures of the thoracolumbar junction (T10–L2) without neurologic deficit were randomized to receive either an anterior fusion with instrumentation or a posterior fusion with instrumentation. Radiographs, including computed tomography (CT), were obtained. Radiographs were repeated at 2, 4, 6, 12, and 24 months. The CT scan was also repeated at 24 months. Hospital stay, cost, operating time, blood loss, complications, and patient-related functional outcomes were measured. Results: Of 43 enrolled, 38 completed a minimum of 2-year follow-up (average: 43 months; range: 24–108 months). Eighteen received a posterior spine fusion and 20 an anterior approach. Hospital stay and operating time were similar. Blood loss was higher in the group treated anteriorly; however, the incidence of transfusion was the same. There were 17 "complications," including instrumentation removal for pain, in 18 patients treated posteriorly, but only 3 minor complications in 3 patients treated anteriorly.
Patient-related functional outcomes were similar for the two groups. Conclusions: Although patient outcomes are similar, anterior fusion and instrumentation for thoracolumbar burst fractures may present fewer complications or additional surgeries. X-STOP is the first interspinous process decompression device shown to be superior to nonoperative therapy in patients with neurogenic intermittent claudication secondary to spinal stenosis in a multicenter randomized study at 1 and 2 years. We present 4-year follow-up data on the X-STOP patients. Patient records were screened to identify potentially eligible subjects who underwent X-STOP implantation as part of the FDA clinical trial. The inclusion criteria for the trial were age of at least 50 years; leg, buttock, or groin pain with or without back pain relieved during flexion; and being able to walk at least 50 feet and sit for at least 50 minutes. The exclusion criteria were fixed motor deficit, cauda equina syndrome, previous lumbar surgery, or spondylolisthesis greater than grade I at the affected level. Eighteen X-STOP subjects participated in the study. The average follow-up was 51 months and the average age was 67 years. Twelve patients had the X-STOP implanted at either the L3–4 or L4–5 level. Six patients had the X-STOP implanted at both the L3–4 and L4–5 levels. Six patients had a grade I spondylolisthesis. The mean preoperative Oswestry score was 45. The mean postoperative Oswestry score was 15. The mean improvement was 29 points. Using a 15-point improvement from the baseline Oswestry Disability Index score as the success criterion, 14 out of 18 patients (78%) had successful outcomes. Our results demonstrate that the success rate in the X-STOP interspinous process decompression group was 78% at an average of 4.2 years postoperatively and are consistent with the 2-year results reported previously by Zucherman et al and those reported by Lee et al.
Our results suggest that intermediate-term outcomes of X-STOP surgery are stable over time as measured by the Oswestry Disability Index. OBJECT Recently, limited decompression procedures have been proposed in the treatment of lumbar stenosis. The authors undertook a prospective study to compare the safety and outcome of unilateral and bilateral laminotomy with laminectomy. METHODS One hundred twenty consecutive patients with 207 levels of lumbar stenosis without herniated discs or instability were randomized to three treatment groups (bilateral laminotomy [Group 1], unilateral laminotomy [Group 2], and laminectomy [Group 3]). Perioperative parameters and complications were documented. Symptoms and scores, such as the visual analog scale (VAS), Roland-Morris Scale, Short Form-36 (SF-36), and patient satisfaction, were assessed preoperatively and at 3, 6, and 12 months after surgery. Adequate decompression was achieved in all patients. The overall complication rate was lowest in patients who had undergone bilateral laminotomy (Group 1). The minimum follow-up of 12 months was obtained in 94% of patients. Residual pain was lowest in Group 1 (VAS score 2.3 ± 2.4, vs 4 ± 1 in Group 3 [p < 0.05] and 3.6 ± 2.7 in Group 2 [p < 0.05]). The Roland-Morris Scale score improved from 17 ± 4.3 before surgery to 8.1 ± 7, 8.5 ± 7.3, and 10.9 ± 7.5 (Groups 1–3, respectively; p < 0.001 compared with preoperative), corresponding to a dramatic increase in walking distance. Examination of SF-36 scores demonstrated marked improvement, most pronounced in Group 1. The number of repeated operations did not differ among groups. Patient satisfaction was significantly superior in Group 1, with 3, 27, and 26% of patients unsatisfied (in Groups 1, 2, and 3, respectively; p < 0.01).
CONCLUSIONS Bilateral and unilateral laminotomy allowed adequate and safe decompression of lumbar stenosis, resulted in a highly significant reduction of symptoms and disability, and improved health-related quality of life. Outcome after unilateral laminotomy was comparable with that after laminectomy. In most outcome parameters, bilateral laminotomy was associated with a significant benefit and thus constitutes a promising treatment alternative. BACKGROUND Surgery for spinal stenosis is widely performed, but its effectiveness as compared with nonsurgical treatment has not been shown in controlled trials. METHODS Surgical candidates with a history of at least 12 weeks of symptoms and spinal stenosis without spondylolisthesis (as confirmed on imaging) were enrolled in either a randomized cohort or an observational cohort at 13 U.S. spine clinics. Treatment was decompressive surgery or usual nonsurgical care. The primary outcomes were measures of bodily pain and physical function on the Medical Outcomes Study 36-item Short-Form General Health Survey (SF-36) and the modified Oswestry Disability Index at 6 weeks, 3 months, 6 months, and 1 and 2 years. RESULTS A total of 289 patients were enrolled in the randomized cohort, and 365 patients were enrolled in the observational cohort. At 2 years, 67% of patients who were randomly assigned to surgery had undergone surgery, whereas 43% of those who were randomly assigned to receive nonsurgical care had also undergone surgery. Despite the high level of nonadherence, the intention-to-treat analysis of the randomized cohort showed a significant treatment effect favoring surgery on the SF-36 scale for bodily pain, with a mean difference in change from baseline of 7.8 (95% confidence interval, 1.5 to 14.1); however, there was no significant difference in scores on physical function or on the Oswestry Disability Index.
The as-treated analysis, which combined both cohorts and was adjusted for potential confounders, showed a significant advantage for surgery by 3 months for all primary outcomes; these changes remained significant at 2 years. CONCLUSIONS In the combined as-treated analysis, patients who underwent surgery showed significantly more improvement in all primary outcomes than did patients who were treated nonsurgically. (ClinicalTrials.gov number, NCT00000411.) Study Design. Prospective comparative randomized clinical and radiologic study. Objective. This study was conducted to compare the short-term effects of rigid versus semirigid and dynamic instrumentation on the global and segmental lumbar spine profile, subjective evaluation of the result, and the associated complications. Background Data. Lumbar spine fusion with rigid instrumentation for degenerative spinal disorders seems to increase the fusion rate. However, rigid instrumentation may be associated with some undesirable effects, such as increased low back pain following decrease of lumbar lordosis, fracture of the vertebral body and pedicle, pedicle screw loosening, and adjacent segment degeneration. The use of semirigid and dynamic devices has been advocated to reduce such adverse effects of rigid instrumentation and thus to achieve a more physiologic bony fusion. Materials and Methods. This study compared 3 equal groups of 45 adult patients who underwent primary decompression and stabilization for symptomatic degenerative lumbar spinal stenosis. The patients of each group were randomly selected and received either rigid (Group A), semirigid (Group B), or dynamic (Group C) spinal instrumentation with formal decompression and fusion. The mean ages of the patients who received rigid, semirigid, and dynamic instrumentation were 65 ± 9, 59 ± 16, and 62 ± 10 years, respectively.
All patients had a detailed roentgenographic study, including computed tomography scan and magnetic resonance imaging, from before surgery to the latest follow-up observation. The following roentgenographic parameters were measured and compared in all spines: lumbar lordosis (L1–S1), total lumbar lordosis (T12–S1), sacral tilt, distal lordosis (L4–S1), segmental lordosis, vertebral inclination, and disc index. The SF-36 health survey and Visual Analogue Scale were used from before surgery to the latest evaluation. Results. All patients were evaluated after a mean follow-up of 47 ± 14 months. Neither lumbar nor total lordosis correction correlated with the number of levels instrumented in any group. Total lordosis was slightly decreased after surgery (3%, P < 0.05) in Group C. The segmental lordosis L2–L3 was increased after surgery by 8.5% (P < 0.05) in Group C, whereas the segmental lordosis L4–L5 was significantly decreased in Groups A and C by 9.8% (P = 0.01) and 16.2% (P < 0.01), respectively. The disc index L2–L3 was decreased after surgery in Groups A and C by 17% (P < 0.05) and 23.5% (P < 0.05), respectively. The disc index L3–L4 was increased in Group C by 18.74% (P < 0.01). The disc index L4–L5 was decreased after surgery in all 3 groups: Group A by 21% (P = 0.01), Group B by 13% (P < 0.05), and Group C by 13.23% (P < 0.05). The disc index L5–S1 was significantly decreased in Group B by 13% (P < 0.05). The mean preoperative SF-36 scores were 11, 14, and 13 for Groups C, B, and A, respectively. In the first year after surgery, there was a significant increase of the preoperative SF-36 scores to 65, 61, and 61 for Groups C, B, and A, respectively, representing improvements of 83%, 77%, and 79%, respectively.
In the second year after surgery and thereafter, there was a further increase of SF-36 scores of 19%, 23%, and 21% for Groups C, B, and A, respectively. The mean preoperative Visual Analogue Scale scores for low back pain for Groups C, B, and A were 5, 4.5, and 4.3, respectively, and decreased after surgery to 1.9, 1.5, and 1.6, respectively. The mean preoperative Visual Analogue Scale scores for leg pain for Groups C, B, and A were 7.6, 7.1, and 6.9, respectively, and decreased after surgery to 2.5, 2.5, and 2.7, respectively. All fusions healed radiologically within the expected time in all three groups without pseudarthrosis or malunion. Delayed hardware failure (1 screw and 2 rod breakages) 1 year and 18 months after surgery, without radiologic pseudarthrosis, was observed in 2 patients in Group C. Asymptomatic radiolucent areas were seen around pedicle screws in the L5 and S1 pedicles in 2, 3, and 4 cases in Groups C, A, and B, respectively. There was no adjacent segment degeneration in any spine up to the last evaluation. Discussion and Conclusion. This comparative study showed that all three instrumentations, applied over a short area for symptomatic degenerative spinal stenosis, almost equally maintained after surgery the preoperative global and segmental sagittal profile of the lumbosacral spine and were followed by similarly significant improvement of both self-assessment and pain scores. Hardware failure occurred at a low rate following dynamic instrumentation only, without radiologically visible pseudarthrosis or loss of correction. Because of the similar clinical and radiologic data in all three groups and the relatively small number of patients included in each group, it is difficult for the authors to make any recommendation in favor of any instrumentation. Study Design. A prospective study was conducted on the surgical procedures for lumbar disc herniation.
Objective. The objective of this study was to investigate, in a prospective study, the surgical outcomes of different methods when performed by the same surgeon. Background. Macrodiscectomy is widely known as a common surgical procedure for lumbar disc herniation, while microdiscectomy using the Caspar technique (the Caspar method) and microendoscopic discectomy by a posterior approach are reported as less invasive surgical methods for this condition. However, few prospective studies have been conducted to compare different surgical procedures for lumbar disc herniation. Materials and Methods. The target of our study was a group of 62 patients (male: 43, female: 19) who underwent surgery by macrodiscectomy (group A) and 57 patients (male: 33, female: 24) who underwent surgery by microdiscectomy using the Caspar technique (group B). The mean ages at surgery were 34 (14 to 62) years and 41 (18 to 65) years, respectively, and the mean duration of follow-up was 2 years and 8 months (12 months to 4 years). For all patients, the surgery was performed by 1 of the authors. The items investigated were the operation time, amount of bleeding, duration of hospitalization, amount of analgesic agent used after surgery, pre- and postoperative scores based on the judgment criteria for treatment of lumbar spine disorders established by the Japanese Orthopaedic Association (JOA score), visual analog scales (VAS, 0 to 10) for lumbago before surgery and at discharge, VAS for sciatica before surgery and at discharge, perioperative complications, and cases requiring further surgery. Results. There were no significant differences between the 2 surgical procedures in the frequency of use of an analgesic agent after surgery, the pre- and postoperative Japanese Orthopaedic Association scores, or postoperative VAS for sciatica.
Statistically significant differences were observed in the operation time, amount of bleeding, duration of hospitalization, and postoperative VAS for lumbar pain, but the differences were not large and may not have been clinically significant. Conclusions. For herniotomy for lumbar disc herniation, both macrodiscectomy and microdiscectomy are appropriate, as long as surgeons have mastery of the procedures. Study Design. Prospective randomized study of 82 patients with degenerative lumbar spondylolisthesis who underwent posterolateral fusion with bilateral or unilateral instrumentation. Objective. To determine the effectiveness of unilateral pedicle instrumentation in clinical outcome and rate of union in comparison with the classic bilateral system. Summary of Background Data. Instrumentation has proved to have advantages and disadvantages related to its rigidity. The use of less rigid systems applied to posterior lumbar fusions has proved promising according to the results achieved in both the experimental and clinical fields. Methods. Eighty-two patients were randomized into 2 groups: Group 1 (n = 42) had bilateral instrumentation, and Group 2 (n = 40) had only unilateral instrumentation. One case from Group 1 (L3–S1) dropped out; only fusions of 1 or 2 levels remained in the study. Operating time, blood loss, blood transfusion, hospital stay, complications, clinical results measured by the SF-36v2, and radiologic assessment of union and of loss of height of adjacent discs were analyzed and compared by means of the χ2 test, t test, and Fisher exact test. Results. Statistically, there was no significant difference between the 2 groups in relation to demographics, blood loss, need for transfusion, hospital stay, complications, clinical results, rate of union, or effect on adjacent discs.
The operating time for Group 2 was significantly shorter than that for Group 1 (P < 0.001). In Group 1, 3 of 186 screws violated the pedicle cortex, requiring reoperation because of root irritation, versus no complications among a total of 90 screws in Group 2. Conclusion. Unilateral instrumentation used for the treatment of degenerative lumbar spondylolisthesis is as effective as bilateral instrumentation when performed in addition to 1- or 2-level posterolateral fusion. This method costs less, saves time, and reduces the possible risk by inserting screws on only one side. This is a prospective, randomized study to compare the efficacy of two similar "long-segment" Texas Scottish Rite Hospital instrumentations, using hooks in the thoracic spine and pedicle screws versus laminar hook claws in the lumbar spine, for thoracolumbar A3, B, and C injuries. Forty consecutive patients with such thoracolumbar fractures (T11–L1) associated with spinal canal encroachment underwent early operative postural reduction and stabilization. The patients were randomly assigned to two groups: 20 patients received hooks in "claw configuration" in both the thoracic and the lumbar spine (group A), and 20 patients received hooks in the thoracic vertebrae and pedicle screws in the lumbar vertebrae (group B). Pre- and postoperative plain roentgenograms and computed tomography scans were used to evaluate any changes in Gardner post-traumatic kyphotic deformity, anterior and posterior vertebral body height at the fracture level, and spinal canal clearance (SCC). All patients were followed for an average period of 52 months (range 42–71 months). The correction of anterior vertebral body height was significantly greater (P < 0.01) in the spines of group B (33%) than in group A (16%), with a subsequent 11% loss of correction at the latest evaluation in group A and no loss of correction in group B.
There were no significant differences in the changes of posterior vertebral body height and Gardner angle between the two groups. The SCC was significantly greater (P < 0.05) immediately postoperatively in the spines of group B (32%) than in group A (19%). At the latest evaluation, there was a 9% loss of the immediately postoperatively achieved SCC in group A, while SCC had increased further by 10.5% in group B. All patients with incomplete neurologic lesions in groups A and B improved postoperatively by 1.1 and 1.7 levels, respectively. There were two hook dislodgements in the thoracic spine, one in each group, while there was no screw failure in group B. There was neither pseudarthrosis nor neurologic deterioration following surgery. Visual Analog Pain Scale and Short Form-36 scores were equally improved and did not differ between the two groups. The use of pedicle screws in the lumbar spine to stabilize the lowermost end of a long rigid construct applied for A3, B, and C thoracolumbar injuries was advantageous when compared with hook claws in the lumbar spine, because the constructs with screws restored and maintained the fractured anterior vertebral body height better than the hooks, without subsequent loss of correction, and safeguarded a continuous SCC at the injury level postoperatively. Study Design. Prospective randomized controlled trial. Objective. To assess the effectiveness of microdiscectomy in lumbar disc herniation patients with 6 to 12 weeks of symptoms but no absolute indication for surgery. Summary of Background Data. There is limited evidence in favor of discectomy for prolonged symptoms of lumbar disc herniation. However, only one randomized trial has directly compared discectomy with conservative treatment. Methods.
Fifty-six patients (age range, 20–50 years) with a lumbar disc herniation, clinical findings of nerve root compression, and radicular pain lasting 6 to 12 weeks were randomized to microdiscectomy or conservative management. Fifty patients (89%) were available at the 2-year follow-up. Leg pain intensity was the primary outcome measure. Results. There were no clinically significant differences between the groups in leg or back pain intensity, subjective disability, or health-related quality of life over the 2-year follow-up, although discectomy seemed to be associated with a more rapid initial recovery. In a subgroup analysis, discectomy was superior to conservative treatment when the herniation was at L4–L5. Conclusions. Lumbar microdiscectomy offered only modest short-term benefits in patients with sciatica due to disc extrusion or sequestration. The spinal level of the herniation may be an important factor modifying the effectiveness of surgery, but this hypothesis needs verification. A series of 316 patients with transient cerebral ischemic attacks and no neurological deficit were randomly allocated to surgical or nonsurgical treatment categories in a controlled manner. The total group was divided by anatomical patterns of lesions (carotid stenosis, unilateral; carotid stenosis, bilateral; and unilateral occlusion with opposite stenosis). During an average 42-month follow-up, distinct differences in outcome were noted in specific subgroups in the frequency and pattern of transient attacks and in the occurrence and location of cerebral infarction. 1. One hundred and fifty Hong Kong patients with a diagnosis of tuberculosis of the thoracic or lumbar spine were allocated at random to operation by radical resection of the spinal lesion and insertion of autologous bone grafts (Rad. series) or by simple débridement of the spinal focus (Deb. series).
All the patients were treated with isoniazid plus PAS for 18 months and daily streptomycin for the first 3 months. 2. The main analysis of this report concerns 64 Rad. and 66 Deb. patients with fewer than 3 vertebral bodies destroyed. 3. The clinical and radiographic condition of the two series on admission was similar. 4. In 60 of the Rad. and 63 of the Deb. patients the allocated operation was successfully completed; in the remaining 4 Rad. and 3 Deb. it had to be repeated or a different operation performed. 5. All the Rad. patients and only 4 Deb. patients were treated in plaster beds, the mean period of recumbency in bed being 73.2 and 11.2 days respectively. 6. Twenty per cent of the Rad. patients and 12 per cent of the Deb. patients had a clinically evident abscess and/or sinus initially; all had resolved by 18 months. 7. Of 22 Rad. and 20 Deb. patients with a radiographically evident mediastinal abscess initially and who at no time had a sinus or clinically evident abscess, the shadow had disappeared in 19 and 9 respectively by 12 months, in 19 and 17 respectively by 24 months, and was still present in 3 and 1 respectively at 36 months. 8. The mean total vertebral loss on admission was 0.7 in each of the series; at 36 months there was a mean gain of 0.2 of a vertebra in the Rad. series and a mean further loss of 0.2 of a vertebra in the Deb. series (P < 0.001). 9. The mean angulation of the spine at the start of treatment was 23.0° for the Rad. and 16.4° for the Deb. patients. The mean increase over the 3 years was 0.9° for the Rad. and 4.5° for the Deb. patients (P = 0.1). 10. Radiographic evidence of bony fusion of the affected vertebral bodies had occurred in 31 per cent of 55 Rad. and 3 per cent of 58 Deb. patients at 6 months (P = 0.0001), in 89 and 53 per cent respectively at 18 months (P < 0.0001), and in 93 and 69 per cent respectively at 36 months (P = 0.003). 11. At 18 months 89 per cent of the Rad.
and 79 per cent of the Deb. patients had a favourable response to the originally allocated treatment. The corresponding percentages at 3 years were 87 and 86 per cent. However, if the response was classified irrespective of any additional chemotherapy or additional operation received, the proportions with a favourable response increased to 97 and 95 per cent respectively. 12. Biopsy specimens from spinal lesions were obtained at operation in 149 patients. In 127 (85 per cent) the specimens were histologically tuberculous and/or yielded positive cultures for tubercle bacilli. The organisms were resistant to one or more of the three standard drugs in 24 of 108 strains, but the influence on the results was minor. Study Design. A randomized controlled trial with 5-year outcome data. Objective. To compare clinical outcomes following spinal decompression (Group 1) with those following decompression and instrumented posterolateral fusion (Group 2) and decompression and instrumented posterolateral fusion plus transforaminal lumbar interbody fusion (TLIF) (Group 3). Summary of Background Data. Decompression is frequently advocated for the relief of nerve root stenosis in the presence of degenerative disc disease. It is uncertain whether spinal fusion is also necessary. Materials and Methods. Following completion of a standardized physiotherapy program, 44 patients with single-level disc disease were randomly assigned to 1 of 3 surgical groups. In those patients undergoing instrumentation, segmental pedicle screw fixation was used to stabilize the spine. Titanium interbody cages filled with autologous bone were inserted into patients in Group 3. Spinal disability, quality of life, and pain were assessed before surgery, and then at 1, 2, and 5 years by an independent researcher. Results. At 2 years, 82% of the patients were pain free or moderately improved.
Disability scores (Low Back Outcome Score and Roland-Morris index) were both better in Group 1, but only the Low Back Outcome Score was better in Group 2 (P < 0.05). By 5 years, although patients in all 3 groups showed some improvement in all the ratings used (Low Back Outcome Score, SF-36 Physical Functioning, and Roland-Morris score), only Group 1 patients showed significant changes in all 3 outcomes (P < 0.05). There was no difference in any score between groups (P > 0.05). Two patients had secondary surgery for adjacent-level stenosis (Groups 2 and 3). One patient (Group 1) underwent subsequent lateral mass fusion for chronic pain. No patient required revision surgery for instrumentation failure, cage displacement, or pseudarthrosis. Evidence of at least unilateral lateral mass bone graft incorporation was present in 95% of Groups 2 and 3. Conclusions. The results are encouraging in that almost all patients had improved by 5 years. However, it is a concern that no significant additional benefit was noted from the more complex surgery. This suggests that patients are optimally treated by decompression alone, with the proviso that further operations may be required. Two hundred eighty patients with herniated lumbar discs, verified by radiculography, were divided into three groups. One group, which is mainly dealt with in this paper, consisted of 126 patients with uncertain indication for surgical treatment, whose therapy was decided by randomization, permitting comparison between the results of surgical and conservative treatment. Another group, comprising 67 patients, had symptoms and signs that beyond doubt required surgical therapy. The third group of 87 patients was treated conservatively because there was no indication for operative intervention. Follow-up examinations in the first group were performed after one, four, and ten years.
The controlled trial showed a statistically significantly better result in the surgically treated group at the one-year follow-up examination. After four years the operated patients still showed better results, but the difference was no longer statistically significant. Only minor changes took place during the last six years of observation. OBJECT Interspinous process decompression (IPD) theoretically relieves narrowing of the spinal canal and neural foramen in extension and thus reduces the symptoms of neurogenic intermittent claudication (NIC). The purpose of this study was to compare the efficacy of IPD with nonoperative treatment in patients with NIC secondary to degenerative spondylolisthesis. METHODS The authors conducted a randomized controlled study in patients with NIC; they compared the results obtained in patients treated with the X STOP IPD device with those acquired in patients treated nonoperatively. The X STOP implant is a titanium alloy device that is placed between the spinous processes to reduce the canal and foraminal narrowing that occurs in extension. In a cohort of 75 patients with degenerative spondylolisthesis, 42 underwent surgical treatment in which the X STOP IPD device was placed and 33 control individuals were treated nonoperatively. Patients underwent serial follow-up evaluations. The Zurich Claudication Questionnaire (ZCQ), 36-Item Short Form Health Survey (SF-36), and radiographic assessment were used to determine outcomes. Two-year follow-up data were obtained in 70 of 75 patients. Statistically significant improvement in ZCQ and SF-36 scores was seen in X STOP device-treated patients but not in the nonoperatively treated control patients at all postoperative intervals. Overall clinical success occurred in 63.4% of X STOP device-treated patients and only 12.9% of controls. Spondylolisthesis and kyphosis were unaltered.
CONCLUSIONS The X STOP device was more effective than nonoperative treatment in the management of NIC secondary to degenerative lumbar spondylolisthesis. Although randomised trials are widely accepted as the ideal way of obtaining unbiased estimates of treatment effects, some treatments have dramatic effects that are highly unlikely to reflect inadequately controlled biases. We compiled a list of historical examples of such effects and identified the features of convincing inferences about treatment effects from sources other than randomised trials. A unifying principle is the size of the treatment effect (signal) relative to the expected prognosis (noise) of the condition. A treatment effect is inferred most confidently when the signal-to-noise ratio is large and its timing is rapid compared with the natural course of the condition. For the examples we considered in detail, the rate ratio often exceeds 10 and thus is highly unlikely to reflect bias or factors other than a treatment effect. This model may help to reduce controversy about evidence for treatments whose effects are so dramatic that randomised trials are unnecessary. The relation between a treatment and its effect is sometimes so dramatic that bias can be ruled out as an explanation. Paul Glasziou and colleagues suggest how to determine when observations speak for themselves. OBJECTIVE To compare and evaluate instrumented posterior fusion with instrumented circumferential lumbar fusion in the treatment of lumbar stenosis with low-grade lumbar spondylolisthesis. METHODS From April 1998 to April 2003, 45 patients who suffered from lumbar stenosis with low-grade lumbar spondylolisthesis were divided into 2 groups (A and B) at random. The patients in group A (n = 24, average age 54 years) underwent decompressive laminectomy, intertransverse process arthrodesis with bone grafting, and transpedicular instrumentation with the Solid Connection (SOCON) system.
The patients in group B (n = 21, average age 53 years) underwent the same procedure as group A with the addition of posterior lumbar interbody fusion (PROSPACE). The main levels of lumbar spondylolisthesis in the 2 groups were L4-5 or L5-S1. All cases were classified as degree 1 to degree 2. All patients in the two groups received preoperative myelography or CTM, and were diagnosed with lateral recess stenosis and/or central lumbar canal stenosis. RESULTS All the patients were followed up from 12 to 72 months. In group A, the preoperative clinical symptoms disappeared completely in 12 of 24 patients, pain relief was seen in 91.7% (22/24), and the anatomical reduction rate was 91.7%. No infection or neurologic complication occurred in this series. In group B, the preoperative clinical symptoms disappeared completely in 13 of 21 patients, pain relief was seen in 90.5% (19/21), and the anatomical reduction rate was 95.2%. Four cases of infection or neurologic complication occurred in this series. The two groups had no significant difference in follow-up clinical outcome and anatomical reduction rate. But group A had better intraoperative circumstances and postoperative outcome than group B, while group B had better postoperative parameters in X-ray angle of slipping and disc index than group A. CONCLUSIONS The best surgical treatment method for lumbar stenosis with low-degree lumbar spondylolisthesis is complete intraoperative decompressive laminectomy, reduction with excellent transpedicle system instrumentation, and solid fusion after bone grafting. The use of a cage should conform to strict indications Study Design. A total of 115 patients were randomized in a 1:1 ratio to a Bryan artificial disc replacement (56) or an anterior cervical fusion with allograft and a plate (59). Objective.
The purpose of this study is to examine the functional outcome and radiographic results of this prospective, randomized trial to determine the role of the Bryan artificial cervical disc replacement for patients with 1-level cervical disc disease. Summary of Background Data. Artificial cervical disc replacement has become an option for cervical radiculopathy. Previous studies have evaluated the efficacy of this alternative without the scientific rigor of a concurrent control population. This study is a pooled data set from 3 centers involved in the U.S. FDA Investigational Device Exemption trial evaluating the Bryan artificial cervical disc. Methods. Twelve-month follow-up is available for 110 patients and 24-month follow-up is complete for 99 patients. There are 30 males and 26 females in the Bryan group and 32 males and 27 females in the fusion group. The average age was 43 years (Bryan) and 46 years (fusion). Disability and pain were assessed using the Neck Disability Index (NDI) and the Visual Analog Scale (VAS) of neck and arm pain. SF-36 outcome measures were obtained, including the physical component as well as the mental component scores. Range of motion was determined by independent radiologic assessment of flexion-extension radiographs. We report a prospective, randomized study comparing the functional outcome of cervical disc replacement to anterior cervical fusion with results of 99 patients at 2 years. Prospective data were collected before surgery and at 6 weeks and 3, 6, 12, and 24 months after surgery. Results. The average operative time for the control group was 1.1 hours and for the Bryan group 1.7 hours.
Average blood loss was 49 mL (control) and 64 mL (Bryan). Average hospital stay was 0.6 days (control) and 0.9 days (Bryan). The mean NDI before surgery was not statistically different between groups: 47 (Bryan) and 49 (control). Twelve-month follow-up NDI was 10 (Bryan) and 18 (control) (P = 0.013). At 2-year follow-up, NDI for the Bryan group was 11 and the control group 20 (P = 0.005). The mean arm pain VAS before surgery was 70 (Bryan) and 71 (control). At 1-year follow-up, Bryan arm pain VAS was 12 and control 23 (P = 0.031). At 2-year follow-up, the average arm pain VAS for the Bryan group was 14 and control 28 (P = 0.014). The mean neck pain VAS before surgery was 72 (Bryan) and 73 (control). One-year follow-up scores were 17 (Bryan) and 28 (control) (P = 0.05). At 2 years: 16 (Bryan) and 32 (control) (P = 0.005). SF-36 physical component scores: before surgery, Bryan 34 and control 32; at 24 months, Bryan 51 and control 46 (P = 0.009). More motion was retained after surgery in the disc replacement group than in the plated group at the index level (P < 0.006 at 3, 6, 12, and 24 months). The disc replacement group retained an average of 7.9° of flexion-extension at 24 months. In contrast, the average range of motion in the fusion group was 0.6° at 24 months. There were 6 additional operations in this series: 4 in the control group and 2 in the investigational group. There were no intraoperative complications, no vascular or neurologic complications, no spontaneous fusions, and no device failures or explantations in the Bryan cohort. Conclusion. The Bryan artificial disc replacement compares favorably to anterior cervical discectomy and fusion for the treatment of patients with 1-level cervical disc disease.
At the 2-year follow-up, there were statistically significant differences between the groups, with improvements in the NDI, the neck pain and arm pain VAS scores, and the SF-36 physical component score in the Bryan disc population
2,003
18,451,395
Trials in orthopaedic trauma typically measure many outcomes requiring judgment, but the individuals assessing those outcomes are seldom blinded.
BACKGROUND Blinding personnel in randomized controlled trials is an important strategy to minimize bias and increase the validity of the results. Trials of surgical interventions present blinding challenges not seen in drug trials. How often orthopaedic trauma investigators undertake blinding, and the frequency with which they could potentially utilize blinding, remains uncertain.
BACKGROUND A prospective review was performed on 60 consecutive patients with hip hemiarthroplasty after femoral neck fractures. METHODS Twenty-two patients underwent Austin Moore hemiarthroplasty with an intramedullary corticocancellous bone plug at the tip of the prosthesis (group A) and 38 patients underwent Austin Moore hemiarthroplasty alone (group B). The patients were evaluated clinically and radiographically at 3 and 6 months postoperatively and annually thereafter. RESULTS There was no statistically significant difference in thigh pain score between the two groups. At 3- and 6-month follow-up, 88% and 83% of group A patients experienced no pain or mild thigh pain, compared with 72% and 76% in group B, respectively. The radiographs revealed more stem subsidence and calcar osteolysis in group B than in group A (p < 0.01 and p < 0.05, respectively). Furthermore, in both groups there was a correlation between calcar atrophy, stem subsidence, and early clinical thigh pain score (p < 0.05). CONCLUSION Our data suggest that, whereas the radiologic findings in both groups may be related to thigh pain, they had little effect on the rate of femoral stem revision. We believe that the application of a corticocancellous bone plug in uncemented hip hemiarthroplasty for treatment of femoral neck fractures can decrease the incidence of early thigh pain in the first 6 months Publisher Summary This chapter focuses on bias in analytic research. Case-control studies are attractive. They can be executed quickly and at low cost, even when the disorders of interest are rare. The execution of pilot case-control studies is becoming automated; strategies have been devised for the “computer scanning” of large files of hospital admission diagnoses and prior drug exposures, with detailed analyses carried out in the same data set on an ad hoc basis.
As evidence of their growing popularity, when one original article was randomly selected from each issue of The New England Journal of Medicine, The Lancet, and the Journal of the American Medical Association for the years 1956, 1966, and 1976, the proportion that reported case-control analytic studies increased fourfold over these two decades; however, the proportion reporting cohort analytic studies fell by half; a general trend toward fewer study subjects but more study authors was also noted Blinding embodies a rich history spanning over two centuries. Most researchers worldwide understand blinding terminology, but confusion lurks beyond a general comprehension. Terms such as single blind, double blind, and triple blind mean different things to different people. Moreover, many medical researchers confuse blinding with allocation concealment. Such confusion indicates misunderstandings of both. The term blinding refers to keeping trial participants, investigators (usually health-care providers), or assessors (those collecting outcome data) unaware of the assigned intervention, so that they will not be influenced by that knowledge. Blinding usually reduces differential assessment of outcomes (information bias), but can also improve compliance and retention of trial participants while reducing biased supplemental care or treatment (sometimes called co-intervention). Many investigators and readers naïvely consider a randomised trial high quality simply because it is double blind, as if double-blinding is the sine qua non of a randomised controlled trial. Although double blinding (blinding investigators, participants, and outcome assessors) indicates a strong design, trials that are not double blinded should not automatically be deemed inferior. Rather than solely relying on terminology like double blinding, researchers should explicitly state who was blinded, and how.
We recommend placing greater credence in results when investigators at least blind outcome assessments, except with objective outcomes, such as death, which leave little room for bias. If investigators properly report their blinding efforts, readers can judge them. Unfortunately, many articles do not contain proper reporting. If an article claims blinding without any accompanying clarification, readers should remain sceptical about its effect on bias reduction We report a randomised, prospective study comparing a standard sliding hip screw and the intramedullary hip screw for the treatment of unstable intertrochanteric fractures in the elderly. One hundred and two patients were randomised on admission to two treatment groups. Fifty-two patients were treated with a compression hip screw (CHS), and fifty had intramedullary fixation with an intramedullary hip screw (IMHS). Patients were followed for 1 year and had a clinical and radiological review at 3, 6 and 12 months. The mean duration of operation and fluoroscopy screening time was significantly greater for insertion of the intramedullary hip screw. There was no difference between the groups with regard to transfusion requirements or time to mobilise after surgery. There were two technical complications in the CHS group and three in the IMHS group. There was no significant difference between the two groups in radiological or functional outcome at 12 months. It remains to be shown whether the theoretical advantages of intramedullary fixation of extracapsular hip fractures bring a significant improvement in eventual outcome Randomized controlled trials are the gold standard for the evaluation of new therapies and surgical procedures and as such require strict attention to study design and statistical analysis. There are, however, multiple challenges in conducting a well-designed clinical trial.
This article describes the difficulties encountered at a single institution participating in a multicenter drug study and reviews the challenges involved in developing a high-quality randomized controlled study BACKGROUND Intramedullary nailing of the femur without reaming of the medullary canal has been advocated as a method to reduce marrow embolization to the lungs and the rate of infection after open fractures. The use of nailing without reaming, however, has been associated with lower rates of fracture-healing. The purpose of this prospective study was to compare the rate of union of femoral shaft fractures following intramedullary nailing with and without reaming. METHODS Two hundred and twenty-four patients were enrolled in a multicenter, prospective, randomized clinical trial to compare nailing without reaming and nailing with reaming. One hundred and six patients with 107 femoral shaft fractures were treated with a smaller-diameter nail without reaming of the canal, and 118 patients with 121 fractures had reaming of the canal and insertion of a relatively larger-diameter nail. Patients were followed at six-week intervals until union occurred or a nonunion was diagnosed. RESULTS The two groups were comparable with regard to the measured patient and injury characteristics. Eight (7.5%) of the 107 fractures in the group without reaming had a nonunion compared with two (1.7%) of 121 fractures in the group with reaming (p = 0.049). The relative risk of nonunion was 4.5 times greater (95% confidence interval = 1 to 20) without reaming and with use of a relatively small-diameter nail.
CONCLUSION Intramedullary nailing of femoral shaft fractures without reaming results in a significantly higher rate of nonunion compared with intramedullary nailing with reaming The ability of a small-scale random-control clinical trial comprising less than 500 patients to disclose clinically important differences between treatment groups depends on the event rate in the control group. The high rate of wound infection after abdominal operations has attracted many trials of methods of antibiotic prophylaxis. We reviewed all of the pertinent English literature recorded in Index Medicus in 1980 and 1981. We examined 45 articles for defects in design, analysis, and presentation. Of the 45 articles, 25 reported statistically significant differences between treatment groups and 20 no significant differences. Unsatisfactory methods of randomization were used in four trials, ethics were questionable in 22, statistical methods were incorrect in 31, and presentation was inadequate in 40. We concluded that there is room for improvement in the conduct of clinical trials Considerable effort is often expended to adjudicate outcomes in clinical trials, but little has been written on the administration of the adjudication process and its possible impact on study results. As a case study, we describe the function and performance of an adjudication committee in a large randomized trial of two diagnostic approaches to potentially operable lung cancer. Up to five adjudicators independently determined two primary outcomes: tumor status at death or at final follow-up, and the cause of death. Patients for whom there was any disagreement were discussed in committee until a consensus was achieved. We describe the pattern of agreement among the adjudicators and with the final consensus result. Additionally, we model the adjudication process and predict the results if a smaller committee had been used.
We found that reducing the number of adjudicators from five to two or three would probably have changed the consensus outcome in less than 10% of cases. Correspondingly, the final study results (comparing primary outcomes in both randomized arms) would have been altered very little. Even using a single adjudicator would not have affected the results substantially. About 90 minutes of person-time per patient was required for activities directly related to the adjudication process, or approximately 6 months of full-time work for the entire study. This level of effort could be substantially reduced by using fewer adjudicators with little impact on the results. Thus, we suggest that when high observer agreement is demonstrated or anticipated, adjudication committees should consist of no more than three members. Further work is needed to evaluate whether smaller committees are adequate to detect small but important treatment effects or whether they compromise validity when the level of adjudicator agreement is lower This paper presents the short-term results of an ongoing prospective randomized trial comparing a cemented unipolar with a cemented bipolar hemiarthroplasty for the treatment of displaced femoral neck fractures in the elderly. Forty-seven patients with an average age of 77 years completed 6-month follow-up. Outcomes at 6 weeks, 3 months and 6 months were assessed by completion of a patient-oriented hip outcome instrument and by functional tests of walking speed and endurance. No differences in the postoperative complication rates or lengths of hospitalization were seen between the two groups. Patients treated with a bipolar hemiarthroplasty had greater range of hip motion in rotation and abduction and faster walking speeds. However, no differences in hip rating outcomes were found.
These early results suggest that use of the less expensive unipolar prosthesis for hemiarthroplasty after femoral neck fracture may be justified in the elderly
2,004
29,873,885
There was more evidence of publication bias in review 2, and somewhat greater risk of confounding in studies included in review 1. There was some evidence that adjustment for key confounders strengthened the associations for review 2. Misclassification bias by HIV/HPV exposure status could also have biased estimates toward the null. These results provide evidence for synergistic HIV and HPV interactions of clinical and public health relevance. Although observational studies can never perfectly control for residual confounding, the evidence presented here lends further support for the presence of biological interactions between HIV and HPV that have strong plausibility
INTRODUCTION Observational studies suggest HIV and human papillomavirus (HPV) infections may have multiple interactions. We reviewed the strength of the evidence for the influence of HIV on HPV acquisition and clearance, and the influence of HPV on HIV acquisition.
Background Sexually transmitted infections (STIs) such as herpes simplex virus (HSV)-2 are associated with an increased risk of HIV infection. Human papillomavirus (HPV) is a common STI, but little is known about its role in HIV transmission. The objective of this study was to determine whether cervico-vaginal HPV infection increases the risk of HIV acquisition in women independent of other common STIs. Methods and Findings This prospective cohort study followed 2040 HIV-negative Zimbabwean women (average age 27 years, range 18–49 years) for a median of 21 months. Participants were tested quarterly for 29 HPV types (with L1 PCR primers) and HIV (antibody testing on blood samples with DNA or RNA PCR confirmation). HIV incidence was 2.7 per 100 woman-years. Baseline HPV prevalence was 24.5%, and the most prevalent HPV types were 58 (5.0%), 16 (4.7%), 70 (2.4%), and 18 (2.3%). In separate regression models adjusting for baseline variables (including age, high-risk partner, positive test for STIs, positive HSV-2 serology and condom use), HIV acquisition was associated with having baseline prevalent infection with HPV 58 (aHR 2.13; 95% CI 1.09–4.15) or HPV 70 (aHR 2.68; 95% CI 1.08–6.66). In separate regression models adjusting for both baseline and time-dependent variables (including HSV-2 status, incident STIs, new sexual partner and condom use), HIV acquisition was associated with concurrent infection with any non-oncogenic HPV type (aHR 1.70; 95% CI 1.02–2.85), any oncogenic HPV type (aHR 1.96; 95% CI 1.16–3.30), HPV 31 (aHR 4.25; 95% CI 1.81–9.97) or HPV 70 (aHR 3.30; 95% CI 1.50–7.20). Detection of any oncogenic HPV type within the previous 6 months was an independent predictor of HIV acquisition, regardless of whether HPV status at the HIV acquisition visit was included (aHR 1.95; 95% CI 1.19–3.21) or excluded (aHR 1.96; 95% CI 1.02–2.85) from the analysis.
Conclusions/Significance Cervico-vaginal HPV infection was associated with an increased risk of HIV acquisition in women, and specific HPV types were implicated in this association. The observational nature of our study precludes establishment of causation between HPV infection and HIV acquisition. However, given the high prevalence of HPV infection in women, further investigation of the role of HPV in HIV transmission is warranted Background Human papillomaviruses are the most common sexually transmitted infections, and genital warts, caused by HPV-6 and 11, entail considerable morbidity and cost. The natural history of genital warts in relation to HIV-1 infection has not been described in African women. We examined risk factors for genital warts in a cohort of high-risk women in Burkina Faso, in order to further describe their epidemiology. Methods A prospective study of 765 high-risk women who were followed at 4-monthly intervals for 27 months in Burkina Faso. Logistic and Cox regression were used to identify factors associated with prevalent, incident and persistent genital warts, including HIV-1 serostatus, CD4+ count, and concurrent sexually transmitted infections. In a subset of 306 women, cervical HPV DNA was tested at enrolment. Results Genital wart prevalence at baseline was 1.6% (8/492) among HIV-uninfected and 7.0% (19/273) among HIV-1 seropositive women. Forty women (5.2%) experienced at least one incident genital wart episode. Incidence was 1.1 per 100 person-years among HIV-uninfected women, 7.4 per 100 person-years among HIV-1 seropositive women with a nadir CD4+ count > 200 cells/μL, and 14.6 per 100 person-years among HIV-1 seropositive women with a nadir CD4+ count ≤ 200 cells/μL. Incident genital warts were also associated with concurrent bacterial vaginosis and genital ulceration. Antiretroviral therapy was not protective against incident or persistent genital warts.
Detection of HPV-6 DNA and abnormal cervical cytology were strongly associated with incident genital warts. Conclusions Genital warts occur much more frequently among HIV-1 infected women in Africa, particularly among those with low CD4+ counts. Antiretroviral therapy did not reduce the incidence or persistence of genital warts in this population Background While infections with human papillomavirus (HPV) are highly prevalent among sexually active young women in Uganda, information on incidence, clearance and their associated risk factors is sparse. To estimate the incidence, prevalence and determinants of HPV infections, we conducted a prospective follow-up study among 1,275 women aged 12-24 years at the time of recruitment. Women answered a questionnaire and underwent a pelvic examination at each visit to collect exfoliated cervical cells. The presence of 42 HPV types was evaluated in exfoliated cervical cells by a polymerase chain reaction (PCR) based assay (SPF10-DEIA LiPA). Results Three hundred and eighty (380) of 1,275 (29.8%) women were followed up for a median time of 18.5 months (inter-quartile range 9.7-26.6). Sixty-nine (69) women had incident HPV infections during 226 person-years of follow-up, reflecting an incidence rate of 30.5 per 100 person-years. Incident HPV infections were marginally associated with HIV positivity (RR = 2.8, 95% CI: 0.9-8.3). Clearance of HPV type-specific infections was frequent, ranging between 42.3% and 100.0% for high-risk and 50% and 100% for low-risk types. Only 31.2% of women cleared all their infections. Clearance was associated with HIV negativity (adjusted clearance = 0.2, 95% CI: 0.1-0.7) but not with age at study entry, lifetime number of sexual partners or multiplicity of infections. The prevalence of low-grade squamous intraepithelial lesions (LSILs) was 53/365 (14.5%). None of the women had a high-grade cervical lesion (HSIL) or cancer.
Twenty-two (22) of 150 (14.7%) HPV-negative women at baseline developed incident LSIL during follow-up. The risk for LSIL appeared to be elevated among women with HPV 18-related types compared to women not infected with those types (RR = 3.5, 95% CI: 1.0-11.8). Conclusions Incident HPV infections and type-specific HPV clearance were frequent among our study population of young women. These results underscore the need to vaccinate pre-adolescent girls before initiation of sexual activity BACKGROUND Male circumcision could provide substantial protection against acquisition of HIV-1 infection. Our aim was to determine whether male circumcision had a protective effect against HIV infection, and to assess safety and changes in sexual behaviour related to this intervention. METHODS We did a randomised controlled trial of 2784 men aged 18-24 years in Kisumu, Kenya. Men were randomly assigned to an intervention group (circumcision; n = 1391) or a control group (delayed circumcision, 1393), and assessed by HIV testing, medical examinations, and behavioural interviews during follow-ups at 1, 3, 6, 12, 18, and 24 months. HIV seroincidence was estimated in an intention-to-treat analysis. This trial is registered with ClinicalTrials.gov, number NCT00059371. FINDINGS The trial was stopped early on December 12, 2006, after a third interim analysis reviewed by the data and safety monitoring board. The median length of follow-up was 24 months. Follow-up for HIV status was incomplete for 240 (8.6%) participants. 22 men in the intervention group and 47 in the control group had tested positive for HIV when the study was stopped.
The 2-year HIV incidence was 2.1% (95% CI 1.2-3.0) in the circumcision group and 4.2% (3.0-5.4) in the control group (p = 0.0065); the relative risk of HIV infection in circumcised men was 0.47 (0.28-0.78), which corresponds to a reduction in the risk of acquiring an HIV infection of 53% (22-72). Adjusting for non-adherence to treatment and excluding four men found to be seropositive at enrollment, the protective effect of circumcision was 60% (32-77). Adverse events related to the intervention (21 events in 1.5% of those circumcised) resolved quickly. No behavioural risk compensation after circumcision was observed. INTERPRETATION Male circumcision significantly reduces the risk of HIV acquisition in young men in Africa. Where appropriate, voluntary, safe, and affordable circumcision services should be integrated with other HIV preventive interventions and provided as expeditiously as possible BACKGROUND Studies analyzing the impact of combination antiretroviral therapy (cART) on cervical infection with high-risk human papillomavirus (HR-HPV) have generated conflicting results. We assessed the long-term impact of cART on persistent cervical HR-HPV infection in a very large cohort of 652 women who underwent follow-up of HIV infection for a median duration of 104 months. METHODS Prospective cohort of HIV-infected women undergoing HIV infection follow-up who had HR-HPV screening and cytology by Papanicolaou smear performed yearly between 2002 and 2011. RESULTS At baseline, the median age was 38 years, the race/ethnic origin was sub-Saharan African for 84%, the median CD4+ T-cell count was 426 cells/µL, 79% were receiving cART, and the HR-HPV prevalence was 43%. The median interval of having had an HIV load of < 50 copies/mL was 40.6 months at the time of a HR-HPV-negative test result, compared with 17 months at the time of a HR-HPV-positive test result (P < .0001, by univariate analysis).
The median interval of having had a CD4+ T-cell count of > 500 cells/µL was 18.4 months at the time of a HR-HPV-negative test result, compared with 4.45 months at the time of a HR-HPV-positive test result (P < .0001). In multivariate analysis, having had an HIV load of < 50 copies/mL for > 40 months (odds ratio [OR], 0.81; 95% confidence interval [CI], .76-.86; P < .0001) and having had a CD4+ T-cell count of > 500 cells/µL for > 18 months (OR, 0.88; 95% CI, .82-.94; P = .0002) were associated with a significantly decreased risk of HR-HPV infection. CONCLUSION Sustained HIV suppression for > 40 months and a sustained CD4+ T-cell count of > 500 cells/µL for > 18 months are independently and significantly associated with a decreased risk of persistent cervical HR-HPV infection BACKGROUND High rates of persistence of human papillomavirus (HPV) infection have been reported for adult women with human immunodeficiency virus (HIV) infection. Although most women are first infected with HPV during adolescence, persistence of specific HPV types has not been carefully examined among HIV-infected adolescents. The objective of this study was to examine the rates of and risk factors for persistence of HPV types among HIV-infected and -uninfected adolescent girls. METHODS This is a prospective cohort study of female adolescents, aged 13-18 years, participating in the Reaching for Excellence in Adolescent Care and Health project, a national study of HIV-infected and -uninfected adolescents. The main outcome measured was type-specific loss of initial HPV DNA detected. Loss of HPV DNA was defined for the following categories of HPV DNA types: low risk, which included types 6, 11, 42, 44, 54, 40, 13, 32, 62, 72, 2, 57, and 55; and high risk, which included types 16-like (16, 31, 33, 35, 52, 58, and 67), 18-like (18, 39, 45, 59, 68, 70, 26, 69, and 51), and 56-like (56, 53, and 66).
RESULTS Prevalent or incident HPV infection was detected in 334 girls. When type-specific loss of HPV was examined, HIV-uninfected girls had a shorter mean time to loss of initial infection than did HIV-infected girls (403 days vs. 689 days, respectively; P < .0001). By means of multivariate analysis, CD4 immunosuppression and the presence of multiple HPV-type subgroups were found to be associated with persistence of HPV. CONCLUSION Since persistence of high-risk HPV types has been strongly linked with the development of invasive cancer, the prolonged persistence of HPV observed among HIV-infected adolescents who are relatively healthy underscores the importance of prevention of HPV infection in this group BACKGROUND The association between human papillomavirus (HPV) infection and the risk of human immunodeficiency virus (HIV) seroconversion is unclear, and the genital cellular immunology has not been evaluated. METHODS A case-control analysis nested within a male circumcision trial was conducted. Cases consisted of 44 male HIV seroconverters, and controls were 787 males who were persistently negative for HIV. The Roche HPV Linear Array Genotype Test detected high-risk HPV (HR-HPV) and low-risk HPV (LR-HPV) genotypes. Generalized estimating equations logistic regression was used to estimate adjusted odds ratios (aORs) of HIV seroconversion. In addition, densities of CD1a+ dendritic cells, CD4+ T cells, and CD8+ T cells were measured using immunohistochemistry in foreskins of 79 males randomly selected from participants in the circumcision trial. RESULTS HR-HPV or LR-HPV acquisition was not significantly associated with HIV seroconversion after adjustment for sexual behaviors. However, HR-HPV and LR-HPV clearance was significantly associated with HIV seroconversion (aOR, 3.25 [95% CI, 1.11-9.55] and 3.18 [95% CI, 1.14-8.90], respectively).
The odds of HIV seroconversion increased with increasing number of HPV genotypes cleared ( P < .001 , by the test for trend ) . The median CD1a(+ ) dendritic cell density in the foreskin epidermis was significantly higher among males who cleared HPV ( 72.0 cells/mm(2 ) [ interquartile range { IQR } , 29.4 - 138.3 cells/mm(2 ) ] ) , compared with males who were persistently negative for HPV ( 32.1 cells/mm(2 ) [ IQR , 3.1 - 96.2 cells/mm(2 ) ] ; P = .047 ) , and increased progressively with the number of HPV genotypes cleared ( P = .05 ) . CONCLUSIONS HPV clearance was associated with subsequent HIV seroconversion and also with increased epidermal dendritic cell density , which potentially mediates HIV seroconversion BACKGROUND Whether the natural history of human papillomavirus ( HPV ) infection is affected by bacterial vaginosis ( BV ) or Trichomonas vaginalis ( TV ) infection has not been adequately investigated in prospective studies . METHODS Human immunodeficiency virus 1 (HIV-1)-infected ( n=1763 ) and high-risk HIV-1-uninfected ( n=493 ) women were assessed semiannually for BV ( by Nugent 's criteria ) , TV infection ( by wet mount ) , type-specific HPV ( by polymerase chain reaction with MY09/MY11/HMB01 HPV primers ) , and squamous intraepithelial lesions ( SIL ) ( by cytological examination ) . Sexual history was obtained from patient report at each visit . Risk factors for prevalent and incident HPV infection and SIL were evaluated by use of multivariate models . RESULTS BV was associated with both prevalent and incident HPV infection but not with duration of HPV infection or incidence of SIL . TV infection was associated with incident HPV infection and with decreased duration and lower prevalence of HPV infection . TV infection had no association with development of SIL . Effects of BV and TV infection were similar in HIV-1-infected and high-risk HIV-1-uninfected women .
HIV-1 infection and low CD4(+ ) lymphocyte count were strongly associated with HPV infection and development of SIL . CONCLUSIONS BV and TV infection may increase the risk of acquisition ( or reactivation ) of HPV infection , as is consistent with hypotheses that the local cervicovaginal milieu plays a role in susceptibility to HPV infection . The finding that BV did not affect persistence of HPV infection and that TV infection may shorten the duration of HPV infection helps explain the lack of effect that BV and TV infection have on development of SIL BACKGROUND Few data on the effect of human papillomavirus ( HPV ) infection on human immunodeficiency virus ( HIV ) acquisition are available . METHODS HIV-seronegative , sexually active , 18 - 24-year-old Kenyan men participating in a randomized trial of male circumcision provided exfoliated penile cells from 2 anatomical sites ( glans/coronal sulcus and shaft ) at baseline . The GP5+/6 + polymerase chain reaction assay ascertained a wide range of HPV DNA types at the baseline visit . The risk of HIV infection was estimated using Kaplan-Meier methods and hazard ratios from proportional hazards models . RESULTS Of 2168 uncircumcised men with baseline HPV data , 1089 ( 50 % ) were positive for HPV DNA . The cumulative incidence of HIV infection by 42 months was 5.8 % ( 95 % confidence interval [ CI ] , 3.6%-7.9 % ) among men with HPV-positive glans/coronal sulcus specimens , versus 3.7 % [ 95 % CI , 1.8%-5.6 % ] among men with HPV-negative glans/coronal sulcus specimens ( P = .01 ) . Controlling for subsequent circumcision status , baseline herpes simplex virus type 2 serostatus , and sexual and sociodemographic risk factors , the hazard ratio for HIV infection among men with HPV-positive glans/coronal sulcus specimens was 1.8 ( 95 % CI , 1.1 - 2.9 ) , compared with men with HPV-negative glans/coronal sulcus specimens .
CONCLUSION The results suggest an independent increased risk of HIV seroconversion among HPV-positive men . If this finding is confirmed in other studies , HPV prevention could be another tool for HIV prevention The clustering of human papillomavirus ( HPV ) infections in some individuals is often interpreted as the result of common risk factors rather than biological interactions between different types of HPV . The intraindividual correlation between times-at-risk for all HPV infections is not generally considered in the analysis of epidemiologic studies . We used a deterministic transmission model to simulate cross-sectional and prospective epidemiologic studies measuring associations between 2 HPV types . When we assumed no interactions , the model predicted that studies would estimate odds ratios and incidence rate ratios greater than 1 between HPV types even after complete adjustment for sexual behavior . We demonstrated that this residual association is due to correlation between the times-at-risk for different HPV types , where individuals become concurrently at risk for all of their partners ' HPV types when they enter a partnership and are not at risk when they are single . This correlation can be controlled in prospective studies by restricting analyses to susceptible individuals with an infected sexual partner . The bias in the measured associations was largest in low-sexual-activity populations , cross-sectional studies , and studies which evaluated infection with a first HPV type as the exposure . These results suggest that current epidemiologic evidence does not preclude the existence of competitive biological interactions between HPV types Objective : To quantify incidence of , and risk factors for , progression to and spontaneous regression of high-grade anal squamous intraepithelial lesions ( ASILs ) .
Design : Retrospective review of patients at St Vincent 's Hospital Anal Cancer Screening Clinic during a period when high-grade ASILs were not routinely treated ( 2004–2011 ) . Methods : All patients who had an anal Papanicolaou smear or high-resolution anoscopy were included , except for patients with previous anal cancer . High-grade anal intraepithelial neoplasia ( HGAIN ) was defined as a composite of histologically confirmed grade 2 or 3 anal intraepithelial neoplasia ( AIN2/3 ) and/or high-grade squamous intraepithelial lesion on anal cytology . Analyses were repeated restricting to histologically confirmed AIN3 . Results : There were 574 patients : median age 45 years ( interquartile range , IQR 36–51 ) , 99.3 % male and 73.0 % HIV-infected [ median HIV duration was 13.8 years ( IQR 6.4–19.8 ) , median CD4 + T-lymphocyte count was 500 cells/µl ( IQR 357–662 ) , 83.5 % had undetectable plasma HIV viral load ] . Median follow-up was 1.1 years ( IQR 0.26–2.76 ) . Progression rate to HGAIN was 7.4/100 person-years ( 95 % confidence interval , CI 4.73–11.63 ) . No risk factor for progression to HGAIN was identified ; progression to AIN3 was more likely with increasing age ( Ptrend = 0.004 ) and in those who were HIV-infected [ hazard ratio 2.8 ( 95 % CI 1.18–6.68 ) versus HIV-uninfected ; P = 0.019 ] , particularly in those whose nadir CD4 + T-lymphocyte count was less than 200 cells/µl ( Ptrend = 0.003 ) . In 101 patients with HGAIN , 24 ( 23.8 % ) patients had spontaneous regression [ rate 23.5/100 person-years ( 95 % CI 15.73–35.02 ) ] , mostly to AIN1 . Regression was less likely in older patients ( Ptrend = 0.048 ) . Two patients with HGAIN developed anal cancer . Conclusion : High-grade ASILs frequently spontaneously regress .
Longer-term , prospective studies are required to determine whether these regressions are sustained BACKGROUND Women infected with human immunodeficiency virus ( HIV ) are disproportionately affected by human papillomavirus (HPV)-related anogenital disease , particularly with increased immunosuppression . AIDS Clinical Trials Group protocol A5240 was a trial of 319 HIV-infected women in the United States , Brazil , and South Africa to determine immunogenicity and safety of the quadrivalent HPV vaccine in 3 strata based on screening CD4 count : > 350 ( stratum A ) , 201 - 350 ( stratum B ) , and ≤200 cells/µL ( stratum C ) . METHODS Safety and serostatus of HPV types 6 , 11 , 16 , and 18 were examined . HPV serological testing was performed using competitive Luminex immunoassay ( HPV-4 cLIA ) . HPV type-specific seroconversion analysis was done for participants who were seronegative for the given type at baseline . RESULTS Median age of patients was 36 years ; 11 % were white , 56 % black , and 31 % Hispanic . Median CD4 count was 310 cells/µL , and 40 % had undetectable HIV-1 load . No safety issues were identified . Seroconversion proportions among women at week 28 for HPV types 6 , 11 , 16 , and 18 were 96 % , 98 % , 99 % , and 91 % , respectively , for stratum A ; 100 % , 98 % , 98 % , and 85 % , respectively , for stratum B , and 84 % , 92 % , 93 % , and 75 % , respectively , for stratum C. CONCLUSIONS The quadrivalent HPV vaccine targeted at types 6 , 11 , 16 , and 18 was safe and immunogenic in HIV-infected women aged 13 - 45 years . Women with HIV RNA load > 10 000 copies/mL and/or CD4 count < 200 cells/µL had lower rates of seroconversion . Clinical Trials Registration . NCT00604175 Objectives : There are very few data from men on the risk of HIV acquisition associated with penile human papillomavirus ( HPV ) infection and no data on the potential modifying effect of male circumcision .
Therefore , this study evaluated whether HPV is independently associated with risk of HIV . Design : A cohort study of HPV natural history nested within a randomized control trial of male circumcision to reduce HIV incidence in Kisumu , Kenya . Methods : Prospective data from 2519 men were analyzed using 6-month discrete-time Cox models to determine if HIV acquisition was higher among circumcised or uncircumcised men with HPV compared to HPV-uninfected men . Results : Risk of HIV acquisition was nonsignificantly increased among men with any HPV [ adjusted hazard ratio ( aHR ) 1.72 ; 95 % confidence interval ( CI ) 0.94–3.15 ] and high-risk HPV ( aHR 1.92 ; 95 % CI 0.96–3.87 ) compared to HPV-uninfected men , and estimates did not differ by circumcision status . Risk of HIV increased 27 % with each additional HPV genotype infection ( aHR 1.27 ; 95 % CI 1.09–1.48 ) . Men with persistent ( aHR 3.27 ; 95 % CI 1.59–6.72 ) or recently cleared ( aHR 3.05 ; 95 % CI 1.34–6.97 ) HPV had a higher risk of HIV acquisition than HPV-uninfected men . Conclusions : Consistent with the findings in women , HPV infection , clearance , and persistence were associated with an increased risk of HIV acquisition in men . Given the high prevalence of HPV in populations at risk of HIV , consideration of HPV in future HIV-prevention studies and investigation into mechanisms through which HPV might facilitate HIV acquisition are needed Objective : Human papillomavirus ( HPV ) is a common sexually transmitted agent that causes anogenital cancer and precancer lesions that have an inflammatory infiltrate , may be friable and bleed . Our aim was to determine the association between anal HPV infection and HIV acquisition . Design : A prospective cohort study . Methods : We recruited 1409 HIV-negative men who have sex with men from a community-based setting in Boston , Denver , New York and San Francisco .
We used Cox proportional hazards regression modeling and assessed the independent association of HPV infection with the rate of acquisition of HIV infection . Results : Of 1409 participants contributing 4375 person-years of follow-up , 51 HIV-seroconverted . The median number of HPV types in HPV-infected HIV-seroconverters was 2 ( interquartile range 1–3 ) at the time of HIV seroconversion . After adjustment for sexual activity , substance use , occurrence of other sexually transmitted infections and demographic variables , there was evidence ( P = 0.002 ) for the effect of infection with at least two HPV types ( hazard ratio 3.5 , 95 % confidence interval 1.2–10.6 ) in HIV seroconversion . Conclusion : Anal HPV infection is independently associated with HIV acquisition . Studies that incorporate high-resolution anoscopy to more accurately identify HPV-associated disease are needed to determine the relationship between HPV-associated disease and HIV seroconversion Background : We sought to identify factors associated with newly detected human papillomavirus ( HPV ) infection in a high-risk cohort of injection drug-using women in Baltimore , MD . Methods : We studied 146 HIV-infected and 73 HIV-uninfected female participants in a 5-year prospective HIV natural history study . We examined the association of sexual and nonsexual risk factors and newly detected type-specific HPV infection as determined by consensus PCR between consecutive visits . Results : Newly detected HPV was more common among HIV-infected versus HIV-uninfected women ( 30 % and 6 % , respectively ; P < 0.01 ) . Among the entire cohort , recent crack use ( OR , 1.7 ; 95 % CI , 1.1–2.6 ) and HIV infection/CD4 cell count were independent predictors for new HPV detection ( HIV-uninfected as reference , OR , 4.6 ; 95 % CI , 2.3–8.9 , OR , 5.4 ; 95 % CI , 2.8–10.3 , and OR , 10.9 ; 95 % CI , 5.5–21.7 for HIV-infected CD4 > 500 , 200–500 , and < 200 , respectively ) .
Among HIV-uninfected women , recent marijuana use was an independent predictor of newly detected HPV infection ( OR , 3.5 ; 95 % CI , 1.3–9.5 ) . Conclusions : Newly detected HPV clearly increased with greater immunosuppression in HIV-infected injection drug users . Larger studies of HIV-uninfected and infected high-risk individuals are needed to clarify the independent associations of crack and marijuana use with new ( or reactivated ) HPV infection As part of a prospective cohort study to assess HIV incidence among high-risk women in Kigali , Rwanda , we evaluated the association between high-risk human papillomavirus ( HPV ) infection and subsequent HIV acquisition . Women who seroconverted for HIV between the first and second HPV measurement visit were 4.9 times [ 95 % confidence interval = 1.2–19.7 ] more likely to have HR-HPV detected at the first visit compared with women who remained HIV-negative Objectives : A large portion of anogenital cancers is caused by high-risk human papillomavirus ( hrHPV ) infections , which are especially common in HIV-infected men . We aimed to compare the incidence and clearance of anal and penile hrHPV infection between HIV-infected and HIV-negative MSM . Design : Analyses of longitudinal data from a prospective cohort study . Methods : MSM aged 18 years or older were recruited in Amsterdam , the Netherlands , and followed-up semi-annually for 24 months . At each visit , participants completed risk-factor questionnaires . Anal and penile self-samples were tested for HPV DNA using the SPF10-PCR DEIA/LiPA25 system . Effects on incidence and clearance rates were quantified via Poisson regression , using generalized estimating equations to correct for multiple hrHPV types . Results : Seven hundred and fifty MSM with a median age of 40 years ( interquartile 35–48 ) were included in the analyses , of whom 302 ( 40 % ) were HIV-infected .
The incidence rates of hrHPV were significantly higher in HIV-infected compared with HIV-negative MSM [ adjusted incidence rate ratio ( aIRR ) 1.6 ; 95 % confidence interval ( CI ) 1.3–2.1 for anal and aIRR 1.4 ; 95%CI 1.0–2.1 for penile infection ] . The clearance rate of hrHPV was significantly lower for anal [ adjusted clearance rate ratio ( aCRR ) 0.7 ; 95%CI 0.6–0.9 ] , but not for penile infection ( aCRR 1.3 ; 95%CI 1.0–1.7 ) . HrHPV incidence or clearance did not differ significantly by nadir CD4 + cell count . Conclusion : Increased anal and penile hrHPV incidence rates and decreased anal hrHPV clearance rates were found in HIV-infected compared with HIV-negative MSM , after adjusting for sexual behavior . Our findings suggest an independent effect of HIV infection on anal hrHPV infections Summary : A total of 230 women drug users were prospectively studied . At 6-month intervals , interviews , HIV testing , and cervicovaginal lavage sampling for human papillomavirus ( HPV ) were performed . HPV was detected and typed using a MY09/MY11 polymerase chain reaction system . 230 women without high-risk HPV ( types 16 , 18 , 26 , 31 , 33 , 35 , 39 , 45 , 51 , 52 , 56 , 58 , 59 , 68 , 73 and 82 ) , with or without non-high risk HPV types at baseline , were included in analyses . Incidence rates of and factors associated with HPV infections of all types and high-risk types ( types 16 , 18 , 26 , 31 , 33 , 35 , 39 , 45 , 51 , 52 , 56 , 58 , 59 , 68 , 73 , and 82 ) were analyzed . Baseline median age was 40 years ( range 24 - 65 ) ; 62 % of women were Hispanic , 20 % black , and 16 % white ; 54 ( 24 % ) were HIV seropositive ; 172 ( 75 % ) were without detectable HPV ; 58 ( 25 % ) had only low-risk or untypeable HPV . The incidence rates for any and for high-risk type HPV infection were 9.5/100 and 4.8/100 person-years , respectively .
HIV-positive women had a significantly increased hazard rate for any HPV ( HRadj : 3.4 ; 95 % CI : 1.4 to 8.0 ) and for high-risk HPV ( HRadj 3.0 ; 95 % CI : 1.4 to 6.6 ) , adjusted for race , sexual behaviors , condom use , and history of other sexually transmitted infections . HIV infection was independently associated with a substantial and significantly increased risk for any and for high-risk genital HPV infection and was the most important risk factor found Background HIV infection and associated immunodeficiency are known to alter the course of human papillomavirus ( HPV ) infections and of associated diseases . Goal This study investigated the association between HIV and HPV and genital warts . Study Design HPV testing and physical examinations were performed in two large prospective studies : the Women 's Interagency HIV Study ( WIHS ) and the HIV Epidemiology Research Study ( HERS ) . Statistical methods incorporating dependencies of longitudinal data were used to examine the relationship between HIV and HPV and genital warts . Results A total of 1008 HIV-seronegative and 2930 HIV-seropositive women were enrolled in the two studies . The prevalence of HPV 6 or 11 was 5.6 times higher in HIV-seropositive women in the WIHS and 3.6 times higher in the HERS . Genital wart prevalence increased by a factor of 3.2 in the WIHS and 2.7 in the HERS in HIV-seropositive women . In the WIHS , infection with HPV type 6 or 11 , in comparison with no HPV infection , was associated with odds of genital wart prevalence of 5.1 ( 95 % CI : 2.9–8.8 ) , 8.8 ( 95 % CI : 6.1–12.8 ) , and 12.8 ( 95 % CI : 8.8–18.8 ) in HIV-seronegative women , HIV-seropositive women with ≥201 CD4 cells/µl , and HIV-seropositive women with ≤200 CD4 cells/µl , respectively . In the HERS , infection with HPV type 6 or 11 was associated with odds of 2.7 ( 95 % CI : 1.6–4.6 ) , 4.9 ( 95 % CI : 3.2–7.7 ) , and 5.3 ( 95 % CI : 3.3–8.5 ) in these same groups .
Other HPV types showed a similar dose – response relation , but of substantially lower magnitude and statistical significance . Conclusions HIV infection and immunodeficiency synergistically modified the relation between HPV 6 or 11 infection and genital wart prevalence OBJECTIVE The objective of the study was to estimate the impact of human immunodeficiency virus ( HIV ) infection on the incidence of high-grade cervical intraepithelial neoplasia ( CIN ) . STUDY DESIGN HIV-seropositive and comparison seronegative women enrolled in a prospective US cohort study were followed up with semiannual Papanicolaou testing , with colposcopy for any abnormality . Histology results were retrieved to identify CIN3 + ( CIN3 , adenocarcinoma in situ , and cancer ) and CIN2 + ( CIN2 and CIN3 + ) . Annual detection rates were calculated and risks compared using a Cox analysis . Median follow-up ( interquartile range ) was 11.0 ( 5.4 - 17.2 ) years for HIV-seronegative and 9.9 ( 2.5 - 16.0 ) for HIV-seropositive women . RESULTS CIN3 + was diagnosed in 139 HIV-seropositive ( 5 % ) and 19 HIV-seronegative women ( 2 % ) ( P<.0001 ) , with CIN2 + in 316 ( 12 % ) and 34 ( 4 % ) ( P<.0001 ) . The annual CIN3 + detection rate was 0.6 per 100 person-years in HIV-seropositive women and 0.2 per 100 person-years in seronegative women ( P<.0001 ) . The CIN3 + detection rate fell after the first 2 years of study , from 0.9 per 100 person-years among HIV-seropositive women to 0.4 per 100 person-years during subsequent follow-up ( P<.0001 ) . CIN2 + incidence among these women fell similarly with time , from 2.5 per 100 person-years during the first 2 years after enrollment to 0.9 per 100 person-years subsequently ( P<.0001 ) .
In Cox analyses controlling for age , the hazard ratio for HIV-seropositive women with CD4 counts less than 200/cmm compared with HIV-seronegative women was 8.1 ( 95 % confidence interval , 4.8 - 13.8 ) for CIN3 + and 9.3 ( 95 % confidence interval , 6.3 - 13.7 ) for CIN2 + ( P<.0001 ) . CONCLUSION Although HIV-seropositive women have more CIN3 + than HIV-seronegative women , CIN3 + is uncommon and becomes even less frequent after the initiation of regular cervical screening
2,005
29,878,834
Self-efficacy – HRQOL associations were similar in strength across age groups , regardless of presence of cardiovascular surgery , and among patients diagnosed with different forms of CVD . Conclusions / Implications : General and exercise-specific self-efficacy are moderately related with HRQOL among people with CVD after surgery or during rehabilitation .
Purpose / Objective : Self-efficacy forms key modifiable personal resources influencing illness management , rehabilitation participation , and their outcomes such as perceived health-related quality of life ( HRQOL ) among people with a cardiovascular disease ( CVD ) . Yet , an overarching research synthesis of the self-efficacy – HRQOL association in the CVD context is missing . This systematic review and meta-analysis of research on the self-efficacy – HRQOL relationship among people with CVD investigates whether the strength of associations depends on conceptualizations of self-efficacy and HRQOL ( general vs. specific ) , presence of cardiovascular surgery , the type of CVD diagnosis , and patients ’ age ( up to 60 vs. older than 60 ) .
To determine the reliability and validity of a patient outcome questionnaire for chronic heart failure , a randomized , double-blind , placebo-controlled , 3-month trial of pimobendan , an investigational medication with inotropic and vasodilator activities , was performed . Evaluated were 198 ambulatory patients with primarily New York Heart Association ( NYHA ) class III heart failure from 20 referral centers . Baseline therapy included digoxin , diuretics and , in 80 % , a converting enzyme inhibitor . Oral pimobendan at 2.5 ( n = 49 ) , 5.0 ( n = 51 ) , or 10 ( n = 49 ) mg daily or matching placebo ( n = 49 ) was administered . The Minnesota Living with Heart Failure ( LIhFE ) questionnaire was a primary outcome measure , along with an exercise test . Interitem correlations identified subgroups of questions representing physical and emotional dimensions . Repeated baseline scores were highly correlated ( r = 0.93 ) , as were the physical ( r = 0.89 ) and emotional ( r = 0.88 ) dimension scores . Placebo did not have a significant effect with median ( 25th , 75th percentile ) changes from baseline scores of 1 ( -3 , 5 ) , 1 ( -2 , 3 ) , and 0 ( -1 , 2 ) , respectively ( all p values greater than 0.10 ) . The 5 mg dose significantly improved the total score , 7.5 ( 0 , 18 ; p = 0.01 ) and the physical dimension , 4 ( 0 , 8 ; p = 0.01 ) , compared with placebo . Changes in the total ( r = 0.33 ; p less than 0.01 ) and physical ( r = 0.35 ; p less than 0.01 ) scores were weakly related to changes in exercise times , but corresponded well with changes in patients ' ratings of dyspnea and fatigue . ( ABSTRACT TRUNCATED AT 250 WORDS ) Objective We examine prospectively the role of specific forms of self-efficacy in the physical and role function for patients with coronary heart disease after controlling for the effects of anxiety and depression .
Methods A 6-month prospective cohort study was conducted after cardiac catheterization of 198 HMO members , demonstrating clinically significant coronary disease . Coronary disease severity was assessed through cardiac catheterization ; physical function , role function , anxiety , depression , and self-efficacy were assessed through questionnaires . Results The Cardiac Self-Efficacy Scale had two factors ( maintain function and control symptoms ) with high internal consistency and good convergent and discriminant validity . In multiple regression models , the self-efficacy scales significantly predicted physical function , social function , and family function after controlling for baseline function , baseline anxiety , and other significant correlates . Conclusions Self-efficacy to maintain function and to control symptoms helps predict the physical function and role function , after accounting for coronary disease severity , anxiety , and depression in patients with clinically significant coronary disease . Interventions to improve self-efficacy may have a broader applicability in the heart disease population than previously appreciated Purpose We examined whether changes in health-related quality of life ( HRQL ) predict subsequent mortality among the Spanish elderly . Methods Prospective cohort study of 2,373 persons , representative of the Spanish population aged 60 and older . HRQL was measured in 2001 and 2003 using the SF-36 health questionnaire . Cox regression models were used to examine the association of changes in the physical and mental component summary ( PCS and MCS ) scores of HRQL from 2001 to 2003 with all-cause mortality through 2007 . Results Two hundred twelve deaths were ascertained from 2003 to 2007 .
The hazard ratios for mortality across categories of PCS change were as follows : 2.12 ( 95 % confidence interval [ CI ] 1.39–3.24 ) for a > 10-point decline ; 1.51 ( 1.01–2.28 ) for a 6- to 10-point decline ; 1 for the reference category , a change of −5 to + 5 points ; 0.83 ( 0.51–1.34 ) for a 6- to 9-point improvement and 0.68 ( 0.42–1.09 ) for a > 10-point improvement ; P for linear trend < 0.001 . The associations between changes in the MCS and mortality showed the same direction , but were of a lower magnitude and attained statistical significance ( P < 0.05 ) only for a > 10-point decline in MCS . Conclusions Changes in HRQL predict mortality in the older adults . A decline in HRQL should alert to a worse vital prognosis and stimulate the search for the possible determinants of such decline BACKGROUND The application of health-related quality of life ( HRQOL ) as a pediatric population health measure may facilitate risk assessment and resource allocation , the tracking of community health , the identification of health disparities , and the determination of health outcomes from interventions and policy decisions . OBJECTIVE To determine the feasibility , reliability , and validity of the 23-item PedsQL 4.0 ( Pediatric Quality of Life Inventory ) Generic Core Scales as a measure of pediatric population health for children and adolescents . DESIGN Mail survey in February and March 2001 to 20 031 families with children ages 2 - 16 years throughout the State of California encompassing all new enrollees in the State 's Children 's Health Insurance Program ( SCHIP ) for those months and targeted language groups . METHODS The PedsQL 4.0 Generic Core Scales ( Physical , Emotional , Social , School Functioning ) were completed by 10 241 families through a statewide mail survey to evaluate the HRQOL of new enrollees in SCHIP .
RESULTS The PedsQL 4.0 evidenced minimal missing responses , achieved excellent reliability for the Total Scale Score ( alpha = .89 child ; .92 parent report ) , and distinguished between healthy children and children with chronic health conditions . The PedsQL 4.0 was also related to indicators of health care access , days missed from school , days sick in bed or too ill to play , and days needing care . CONCLUSION The results demonstrate the feasibility , reliability , and validity of the PedsQL 4.0 as a pediatric population health outcome . Measuring pediatric HRQOL may be a way to evaluate the health outcomes of SCHIP
2,006
30,770,049
Conclusions No definite signal of harm with 80 % FiO2 in adult surgical patients undergoing general anaesthesia was demonstrated , and there is little evidence on safety‐related issues to discourage its use in this population
Background Evidence‐based guidelines from the World Health Organization ( WHO ) have recommended a high ( 80 % ) fraction of inspired oxygen ( FiO2 ) to reduce surgical site infection in adult surgical patients undergoing general anaesthesia with tracheal intubation . However , there is ongoing debate over the safety of high FiO2 . We performed a systematic review to define the relative risk of clinically relevant adverse events ( AE ) associated with high FiO2 .
Pathological formation of reactive oxygen species within the coronary circulation has been hypothesized to mediate some clinical manifestations of ischemic heart disease ( IHD ) by interfering with physiological regulation of coronary tone . To determine the degree to which coronary tone responds to acute changes in ambient levels of oxidants and antioxidants in vivo in a clinical setting , we measured the effect of an acute oxidative stress ( breathing 100 % oxygen ) on coronary capacitance artery diameter ( quantitative angiography ) and blood flow velocity through the coronary microcirculation ( intracoronary Doppler ultrasonography ) before and after treatment with the antioxidant vitamin C ( 3-g intravenous infusion ) in 12 IHD patients undergoing a clinical coronary interventional procedure . Relative to room air breathing , 100 % oxygen breathing promptly reduced coronary blood flow velocity by 20 % and increased coronary resistance by 23 % , without significantly changing the diameter of capacitance arteries . Vitamin C administration promptly restored coronary flow velocity and resistance to a slightly suprabasal level , and it prevented the reinduction of coronary constriction with rechallenge with 100 % oxygen . This suggests that acute oxidative stress produces prompt and substantial changes in coronary resistance and blood flow in a clinical setting in patients with IHD , and it suggests that these changes are mediated by vitamin C-quenchable substances acting on the coronary microcirculation . This observation may have relevance for clinical practice BACKGROUND Administration of supplemental oxygen in the perioperative period is controversial , as it may increase long-term mortality . Our aim was to assess the association between 80 % oxygen and occurrence of subsequent cancer in patients undergoing abdominal surgery in a post hoc analysis of the PROXI trial . 
METHODS The 1386 patients in the PROXI trial underwent elective or emergency laparotomy between 2006 and 2008 with randomization to either 80 % or 30 % oxygen during and for 2 h after surgery . We retrieved follow-up status regarding vital status , new cancer diagnoses , and new histological cancer specimens . Data were analysed using the Cox proportional hazards model . RESULTS Follow-up was complete in 1377 patients ( 99 % ) after a median of 3.9 yr . The primary outcome of new cancer diagnosis or new malignant histological specimen occurred in 140 of 678 patients ( 21 % ) in the 80 % oxygen group vs 150 of 699 patients ( 21 % ) assigned to 30 % oxygen ; hazards ratio 1.06 [ 95 % confidence interval ( CI ) 0.84 , 1.34 ] , P=0.62 . Cancer-free survival was significantly shorter in the 80 % oxygen group ; hazards ratio 1.19 ( 95 % CI 1.01 , 1.42 ) , P=0.04 , as was the time between surgery and new cancer , median 335 vs. 434 days in the 30 % oxygen group . In patients with localized disease , non-significant differences in cancer and cancer-free survival were found with hazard ratios of 1.31 and 1.29 , respectively . CONCLUSIONS Although new cancers occurred at similar rate , the cancer-free survival was significantly shorter in the 80 % oxygen group , but this did not appear to explain the excess mortality in the 80 % oxygen group . CLINICAL TRIAL REGISTRATION ClinicalTrials.gov ( NCT01723280 ) BACKGROUND Perioperative supplemental oxygen has been proposed to decrease the incidence of surgical site infection ( SSI ) in colorectal surgery with controversial results . We have assessed the influence of hyperoxygenation on SSI by using the most homogeneous study population . METHODS We studied , in a prospective randomized study , 81 patients , who underwent elective open infraperitoneal anastomosis for rectal cancer . Patients were assigned randomly to an oxygen/air mixture with a fraction of inspired oxygen ( FiO2 ) of 30 % ( n = 41 ) or 80 % ( n = 40 ) .
Administration was commenced after induction of anesthesia and maintained for 6 hours after surgery . RESULTS The overall wound infection rate was 21 % : 11 patients ( 26.8 % ) had wound infections in the 30 % FiO2 group and 6 ( 15 % ) in the 80 % FiO2 group ( P < .05 ) . The risk of SSI was 41 % lower in the 80 % FiO2 group . CONCLUSION Supplemental 80 % FiO2 reduced postoperative SSI with few risks to the patient and little associated cost OBJECTIVE The purpose of this study was to investigate whether supplemental oxygen during and for 2 hours after cesarean delivery reduces the incidence of postcesarean infectious morbidity . STUDY DESIGN We conducted a randomized , controlled trial from 2008 - 2010 . Women who underwent cesarean delivery were randomly assigned to receive either 2 L of oxygen by nasal cannula during cesarean delivery only ( standard care ) or 10 L of oxygen by nonrebreather mask ( intervention group ) during and for 2 hours after cesarean delivery . Women who underwent scheduled or intrapartum cesarean delivery were eligible and were observed for 1 month after the procedure . The primary composite outcome was maternal infectious morbidity , which included endometritis and wound infection . RESULTS Five hundred eighty-five women were included in the final analysis . Infectious morbidity occurred in 8.8 % of patients in the standard care group and in 12.2 % of patients in the supplemental oxygen group . There was no significant difference in the rate of infectious morbidity between the standard care and intervention groups ( relative risk , 1.4 ; 95 % confidence interval , 0.9 - 2.3 ) . CONCLUSION Supplemental oxygen does not reduce the rate of postcesarean delivery infectious morbidity , including endometritis and wound infection BACKGROUND : A high perioperative inspiratory oxygen fraction ( 80 % ) has been recommended to prevent postoperative wound infections .
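Statements like "the risk of SSI was 41 % lower" are relative-risk arithmetic on the reported event counts . A minimal sketch ( the helper name is mine , not from the trial report ; note that a crude ratio computed from the raw counts need not exactly match a published , possibly adjusted , estimate ):

```python
def relative_risk(events_a, n_a, events_b, n_b):
    """Risk in group A relative to group B, from raw event counts."""
    return (events_a / n_a) / (events_b / n_b)

# Wound infection: 6 of 40 with 80% FiO2 vs 11 of 41 with 30% FiO2
rr = relative_risk(6, 40, 11, 41)
print(round(rr, 2), f"{(1 - rr):.0%} lower risk")  # → 0.56 44% lower risk
```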
However , the most recent and one of the largest trials , the PROXI trial , found no reduction in surgical site infection , and 30-day mortality was higher in patients given 80 % oxygen . In this follow-up study of the PROXI trial we assessed the association between long-term mortality and perioperative oxygen fraction in patients undergoing abdominal surgery . METHODS : From October 8 , 2006 , to October 6 , 2008 , 1386 patients underwent elective or emergency laparotomy and were randomized to receive either 80 % or 30 % oxygen during and for 2 hours after surgery . The follow-up date was February 24 , 2010 . Survival was analyzed using Kaplan-Meier statistics and the Cox proportional hazards model . RESULTS : Vital status was obtained in 1382 of 1386 patients after a median follow-up of 2.3 years ( range 1.3 to 3.4 years ) . One hundred fifty-nine of 685 patients ( 23.2 % ) died in the 80 % oxygen group compared to 128 of 701 patients ( 18.3 % ) assigned to 30 % oxygen ( HR , 1.30 [ 95 % confidence interval , 1.03 to 1.64 ] , P = 0.03 ) . In patients undergoing cancer surgery , the HR was 1.45 ; 95 % confidence interval , 1.10 to 1.90 ; P = 0.009 ; and after noncancer surgery , the HR was 1.06 ; 95 % confidence interval , 0.69 to 1.65 ; P = 0.79 . CONCLUSIONS : Administration of 80 % oxygen in the perioperative period was associated with significantly increased long-term mortality and this appeared to be statistically significant in patients undergoing cancer surgery but not in noncancer patients This explicit and strong recommendation has caused considerable concern in the anaesthesia community because it directly contradicts the results of many trials ( and a meta-analysis of those trials ) which showed no benefit of supplemental oxygen . The use of an increased inspired oxygen fraction ( FiO2 ) has a long history in anaesthesia , intensive care , and emergency medicine .
Planned induction and extubation of anaesthesia are conducted with 100 % oxygen to enhance safety in the case of airway difficulty . Extra oxygen is also given during general anaesthesia as 21 % is rarely sufficient , with concentrations ranging from 30 % to nearly 100 % depending on case factors and anaesthesiologist preference . There is considerable reason to expect that supplemental oxygen might reduce the risk of surgical site infection [ 21 ] . All surgical wounds become contaminated , and the primary defence against bacterial contamination is oxidative killing by neutrophils , a process that requires molecular oxygen and depends on the partial pressure of oxygen in tissue over the entire physiological range . Consistent with this theory , Hopf et al. [ 11 ] showed that surgical wound infection risk was inversely related to tissue oxygenation . Based on these observational data , the Outcomes Research Consortium randomized 500 patients having colorectal resection to either 80 % or 30 % inspired oxygen [ 9 ] . The incidence of SSI was halved from 11.2 AIM An association has been proposed between perioperative administration of 80 % oxygen and a lower incidence of wound infection after colorectal surgery . The present study was conducted to assess this hypothesis . METHODS Thirty-eight patients ( ASA classification 1 and 2 ) undergoing elective colorectal cancer surgery were allocated at random to 2 groups . Group 1 consisted of 19 patients who received an admixture of 80 % oxygen and 20 % nitrogen during anesthesia through an orotracheal tube and during the first 2 hours in the recovery room through a tight facemask with reservoir . Group 2 consisted of 19 patients who received an admixture of 70 % nitrous oxide and 30 % oxygen during anesthesia , followed by administration of 30 % oxygen delivered by a blender through a tight facemask with reservoir in the same manner as group 1 , during the first 2 hours in the recovery room .
Wound infection was evaluated daily during hospital stay and after 7 days , 2 weeks , and 1 month . RESULTS The incidence of wound infection was 12.5 % in group 1 and 17.6 % in group 2 ( p=0.53 ) . CONCLUSIONS The results of this study showed no reduction in the incidence of wound infection following elective colorectal surgery in patients receiving 80 % oxygen during the perioperative period Background : Benefits and limitations of supplementation with 80 % fraction of inspired oxygen for preventing surgical site infections have not yet been clearly defined . Some studies have reported benefits in colorectal surgery , whereas trials in abdominal and gynecologic surgery have reported either no effect or a deleterious effect . Methods : Controlled , randomized , assessor-blind multicenter trial , the ISO2 study , comparing the effects of hyperoxygenation ( fraction of inspired oxygen , 80 % ) with those of 30 % oxygen on the frequency of surgical site infections in routine abdominal , gynecologic , and breast surgery on 434 patients . Patients not seen in consultation after discharge were contacted . Results : In total , 208 patients received 30 % perioperative oxygen and 226 received 80 % . There was no difference between the two groups for baseline , intraoperative , and postoperative characteristics , except for oxygen saturation at closure , higher in the 80 % group ( P = 0.01 ) . The frequency of 30-day surgical site infections was 7.2 % ( 15/208 ) in the 30 % group and 6.6 % ( 15/226 ) in the 80 % group ( relative risk , 0.92 ; 95 % CI [ 0.46–1.84 ] , P = 0.81 ) . Frequency of adverse events ( nausea and vomiting , sternal pain , cough , hypotension ) was similar in the two groups . Desaturation and bradycardia were more frequent in the 30 % group .
In an updated meta-analysis including the result of this trial and those of eight published randomized trials , the overall relative risk was 0.97 ; 95 % CI ( 0.68–1.40 ) , I2 ( inconsistency degree ) = 73 % , ( P = 0.88 ) . Conclusions : The routine use of hyperoxygenation throughout abdominal , gynecologic , and breast surgery had no effect on the frequency of 30-day surgical site infections and was not accompanied by more frequent adverse effects Background A high perioperative inspiratory oxygen fraction ( FiO2 ) may reduce the frequency of surgical site infection . Perioperative atelectasis is caused by absorption , compression and reduced function of surfactant . It is well accepted that ventilation with 100 % oxygen for only a few minutes is associated with significant formation of atelectasis . However , it is still not clear if a longer period of 80 % oxygen results in more atelectasis compared to a low FiO2 . Our aim was to assess if a high FiO2 is associated with impaired oxygenation and decreased pulmonary functional residual capacity ( FRC ) . Methods Thirty-five patients scheduled for laparotomy for ovarian cancer were randomized to receive either 30 % oxygen ( n = 15 ) or 80 % oxygen ( n = 20 ) during and for 2 h after surgery . The oxygenation index ( PaO2/FiO2 ) was measured every 30 min during anesthesia and 90 min after extubation . FRC was measured the day before surgery and 2 h after extubation by a rebreathing method using the inert gas SF6 . Results Five min after intubation , the median PaO2/FiO2 was 69 kPa [ 53 - 71 ] in the 30%-group vs. 60 kPa [ 47 - 69 ] in the 80%-group ( P = 0.25 ) . At the end of anesthesia , the PaO2/FiO2 was 58 kPa [ 40 - 70 ] vs. 57 kPa [ 46 - 67 ] in the 30%- and 80%-group , respectively ( P = 0.10 ) . The median FRC was 1993 mL [ 1610 - 2240 ] vs. 1875 mL [ 1545 - 2048 ] at baseline and 1615 mL [ 1375 - 2318 ] vs.
1633 mL [ 1343 - 1948 ] postoperatively in the 30%- and 80%-group , respectively ( P = 0.70 ) . Conclusion We found no significant difference in oxygenation index or functional residual capacity between patients given 80 % and 30 % oxygen for a period of approximately 5 hours . Trial registration ClinicalTrials.gov Identifier : NCT00637936 CONTEXT Use of 80 % oxygen during surgery has been suggested to reduce the risk of surgical wound infections , but this effect has not been consistently identified . The effect of 80 % oxygen on pulmonary complications has not been well defined . OBJECTIVE To assess whether use of 80 % oxygen reduces the frequency of surgical site infection without increasing the frequency of pulmonary complications in patients undergoing abdominal surgery . DESIGN , SETTING , AND PATIENTS The PROXI trial , a patient- and observer-blinded randomized clinical trial conducted in 14 Danish hospitals between October 2006 and October 2008 among 1400 patients undergoing acute or elective laparotomy . INTERVENTIONS Patients were randomly assigned to receive either 80 % or 30 % oxygen during and for 2 hours after surgery . MAIN OUTCOME MEASURES Surgical site infection within 14 days , defined according to the Centers for Disease Control and Prevention . Secondary outcomes included atelectasis , pneumonia , respiratory failure , and mortality . RESULTS Surgical site infection occurred in 131 of 685 patients ( 19.1 % ) assigned to receive 80 % oxygen vs 141 of 701 ( 20.1 % ) assigned to receive 30 % oxygen ( odds ratio [ OR ] , 0.94 ; 95 % confidence interval [ CI ] , 0.72 - 1.22 ; P = .64 ) .
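Odds ratios with Wald confidence intervals , like the OR 0.94 ( 95 % CI 0.72 - 1.22 ) just reported , can be reproduced from the raw event counts by working on the log-odds scale . A minimal sketch ( the helper name is mine ) using the surgical site infection counts above ( 131 of 685 vs 141 of 701 ):

```python
import math

def odds_ratio_ci(a, n1, c, n2, z=1.96):
    """Odds ratio of group 1 vs group 2 with a Wald 95% CI on the log scale."""
    b, d = n1 - a, n2 - c  # non-events in each group
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# SSI in 131 of 685 (80% oxygen) vs 141 of 701 (30% oxygen)
or_, lo, hi = odds_ratio_ci(131, 685, 141, 701)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # → 0.94 0.72 1.22
```

The recovered interval matches the published one , which is expected here because the trial reports an unadjusted comparison .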
Atelectasis occurred in 54 of 685 patients ( 7.9 % ) assigned to receive 80 % oxygen vs 50 of 701 ( 7.1 % ) assigned to receive 30 % oxygen ( OR , 1.11 ; 95 % CI , 0.75 - 1.66 ; P = .60 ) , pneumonia in 41 ( 6.0 % ) vs 44 ( 6.3 % ) ( OR , 0.95 ; 95 % CI , 0.61 - 1.48 ; P = .82 ) , respiratory failure in 38 ( 5.5 % ) vs 31 ( 4.4 % ) ( OR , 1.27 ; 95 % CI , 0.78 - 2.07 ; P = .34 ) , and mortality within 30 days in 30 ( 4.4 % ) vs 20 ( 2.9 % ) ( OR , 1.56 ; 95 % CI , 0.88 - 2.77 ; P = .13 ) . CONCLUSION Administration of 80 % oxygen compared with 30 % oxygen did not result in a difference in risk of surgical site infection after abdominal surgery . TRIAL REGISTRATION ClinicalTrials.gov Identifier : NCT00364741 Non-randomised studies of the effects of interventions are critical to many areas of healthcare evaluation , but their results may be biased . It is therefore important to understand and appraise their strengths and weaknesses . We developed ROBINS-I ( “ Risk Of Bias In Non-randomised Studies - of Interventions ” ) , a new tool for evaluating risk of bias in estimates of the comparative effectiveness ( harm or benefit ) of interventions from studies that did not use randomisation to allocate units ( individuals or clusters of individuals ) to comparison groups . The tool will be particularly useful to those undertaking systematic reviews that include non-randomised studies OBJECTIVE : Most postcesarean infections are caused by anaerobic bacteria . Oxidative killing , an important defense against surgical infections , depends on the oxygen level in contaminated tissue . Among patients undergoing colorectal surgery , perioperative supplemental oxygen decreased infection rates by 50 % . We tested the hypothesis that high-concentration inspired oxygen decreases the incidence of surgical site infection in women undergoing cesarean delivery .
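Several abstracts above pool trials by inverse-variance meta-analysis , e.g. the updated meta-analysis reporting an overall relative risk of 0.97 with I2 = 73 % . A minimal sketch of fixed-effect pooling with Cochran's Q and the I2 statistic ; the per-study relative risks and standard errors below are illustrative inputs I made up , not the trial data :

```python
import math

def pool_fixed_effect(log_rrs, ses):
    """Fixed-effect inverse-variance pooling of log relative risks.
    Returns (pooled RR, I^2 heterogeneity in %)."""
    weights = [1 / se ** 2 for se in ses]
    pooled = sum(w * y for w, y in zip(weights, log_rrs)) / sum(weights)
    # Cochran's Q and the I^2 inconsistency statistic
    q = sum(w * (y - pooled) ** 2 for w, y in zip(weights, log_rrs))
    df = len(log_rrs) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return math.exp(pooled), i2

# Hypothetical per-study relative risks and standard errors of log(RR)
rrs = [0.5, 1.0, 1.2]
ses = [0.3, 0.2, 0.25]
pooled_rr, i2 = pool_fixed_effect([math.log(r) for r in rrs], ses)
print(round(pooled_rr, 2), round(i2, 1))  # → 0.91 63.2
```

A high I2 , as in the 73 % reported above , indicates that much of the between-trial variation exceeds what chance alone would produce , which is why such pooled estimates are usually accompanied by a random-effects sensitivity analysis .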
METHODS : Using a double-blind technique , 143 women undergoing cesarean delivery under regional anesthesia after the onset of labor were randomly assigned to receive low- or high-concentration inspired oxygen via nonrebreathing mask during the operation and for 2 hours after . Surgical site infection was defined clinically as administration of antibiotics for postpartum endometritis or wound infection during the initial hospital stay or within 14 days of surgery . Interim statistical analysis was performed after 25 % of the planned sample size ( 143 of 550 ) accrued , using the intention-to-treat principle . The stopping rule P value for futility was P>.11 with two planned interim analyses . RESULTS : Postcesarean infection occurred in 17 ( 25 % , 95 % confidence interval [ CI ] 15–35 % ) of 69 women assigned to high-concentration oxygen compared with 10 ( 14 % , 95 % CI 6–22 % ) of 74 women assigned to low-concentration inspired oxygen ( relative risk 1.8 , 95 % CI 0.9–3.7 , P=.13 ) . The P value exceeded the P value for futility , suggesting these differences were unlikely to reach statistical significance with continued recruitment . CONCLUSION : High-concentration perioperative oxygen delivered through a nonrebreathing mask did not decrease the risk of postcesarean surgical site infection . CLINICAL TRIAL REGISTRATION : ClinicalTrials.gov , www.clinicaltrials.gov , NCT00670020 LEVEL OF EVIDENCE : BACKGROUND : A follow-up analysis from a large trial of oxygen and surgical-site infections reported increased long-term mortality among patients receiving supplemental oxygen , especially those having cancer surgery . Although concerning , there is no obvious mechanism linking oxygen to long-term mortality . We thus tested the hypothesis that supplemental oxygen does not increase long-term mortality in patients undergoing colorectal surgery . Secondarily , we evaluated whether the effect of supplemental oxygen on mortality depended on cancer status .
METHODS : Mortality data were obtained for 927 patients who participated in 2 randomized trials evaluating the effect of supplemental oxygen on wound infection . We assessed the effect of 80 % vs 30 % oxygen on long-term mortality across 4 clinical sites in the 2 trials using a Cox proportional hazards regression model stratified by study and site . Kaplan-Meier survival estimates were calculated for each trial . Finally , we report site-stratified hazard ratios for patients with and without cancer at baseline . RESULTS : There was no effect of 80 % vs 30 % oxygen on mortality , with an overall site-stratified hazard ratio of 0.93 ( 95 % confidence interval [ CI ] , 0.72–1.20 ; P = 0.57 ) . The treatment effect was consistent across the 2 original studies ( interaction P = 0.88 ) and across the 4 sites ( P = 0.84 ) . There was no difference between patients with ( n = 451 ) and without ( n = 450 ) cancer ( interaction P = 0.51 ) , with hazard ratio of 0.85 ( 95 % CI , 0.64–1.1 ) for cancer patients and 0.97 ( 0.53–1.8 ) for noncancer patients . CONCLUSIONS : In contrast to the only previous publication , we found that supplemental oxygen had no influence on long-term mortality in the overall surgical population or in patients having cancer surgery OBJECTIVE To assess the influence of hyperoxygenation on surgical site infection by using the most homogeneous study population . DESIGN A randomized , prospective , controlled trial . SETTING Department of surgery in a government hospital . PATIENTS A total of 210 patients who underwent open surgery for acute appendicitis . In the study group , patients received 80 % oxygen during anesthesia , followed by high-flow oxygen for 2 hours in the recovery room . The control group received 30 % oxygen , as usual . INTERVENTION Open appendectomy via incision in the right lower quadrant of the abdomen .
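The survival analyses above rest on the Kaplan-Meier product-limit estimator . A minimal pure-Python sketch with hypothetical follow-up times ( in years ) and event flags ( 1 = death , 0 = censored ) , not data from these trials ; ties between a death and a censoring at the same time are not handled specially here :

```python
def kaplan_meier(times, events):
    """Product-limit survival estimates; returns [(event time, S(t))]."""
    n_risk = len(times)
    surv, curve = 1.0, []
    # Walk through subjects ordered by follow-up time
    for t, e in sorted(zip(times, events)):
        if e:  # death: step the survival curve down
            surv *= (n_risk - 1) / n_risk
            curve.append((t, round(surv, 4)))  # round away float noise
        n_risk -= 1  # deaths and censorings both leave the risk set
    return curve

print(kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1]))  # → [(1, 0.75), (2, 0.5), (4, 0.0)]
```

Note how the censored subject at time 3 does not step the curve down but still shrinks the risk set , so the death at time 4 drops the estimate all the way to zero .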
MAIN OUTCOME MEASURES Surgical site infection , mainly assessed by the ASEPSIS ( additional treatment , serous discharge , erythema , purulent discharge , separation of deep tissues , isolation of bacteria , and stay in hospital prolonged > 14 days ) system score . RESULTS Surgical site infections were recorded in 6 of 107 patients ( 5.6 % ) in the study group vs 14 of 103 patients ( 13.6 % ) in the control group ( P = .04 ) . Significant differences in the ASEPSIS score were also found . The mean hospital stay was longer in the control group ( 2.92 days ) compared with the study group ( 2.51 days ) ( P = .01 ) . CONCLUSION The use of supplemental oxygen is advantageous in operations for acute appendicitis by reducing surgical site infection rate and hospital stay . TRIAL REGISTRATION ClinicalTrials.gov Identifier : NCT01002365 Background : Nitrous oxide is widely used in anesthesia , often administered at an inspired concentration around 70 % . Although nitrous oxide interferes with vitamin B12 , folate metabolism , and deoxyribonucleic acid synthesis and prevents the use of high inspired oxygen concentrations , the consequences of these effects are unclear . Methods : Patients having major surgery expected to last at least 2 h were randomly assigned to nitrous oxide – free ( 80 % oxygen , 20 % nitrogen ) or nitrous oxide – based ( 70 % N2O , 30 % oxygen ) anesthesia . Patients and observers were blind to group identity . The primary endpoint was duration of hospital stay . Secondary endpoints included duration of intensive care stay and postoperative complications ; the latter included severe nausea and vomiting , and the following major complications : pneumonia , pneumothorax , pulmonary embolism , wound infection , myocardial infarction , venous thromboembolism , stroke , awareness , and death within 30 days of surgery . Results : Of 3,187 eligible patients , 2,050 consenting patients were recruited .
Patients in the nitrous oxide – free group had significantly lower rates of major complications ( odds ratio , 0.71 ; 95 % confidence interval , 0.56–0.89 ; P = 0.003 ) and severe nausea and vomiting ( odds ratio , 0.40 ; 95 % confidence interval , 0.31–0.51 ; P < 0.001 ) , but median duration of hospital stay did not differ substantially between groups ( 7.0 vs. 7.1 days ; P = 0.06 ) . Among patients admitted to the intensive care unit postoperatively , those in the nitrous oxide – free group were more likely to be discharged from the unit on any given day than those in the nitrous oxide group ( hazard ratio , 1.35 ; 95 % confidence interval , 1.05–1.73 ; P = 0.02 ) . Conclusions : Avoidance of nitrous oxide and the concomitant increase in inspired oxygen concentration decreases the incidence of complications after major surgery , but does not significantly affect the duration of hospital stay . The routine use of nitrous oxide in patients undergoing major surgery should be questioned UNLABELLED Purpose : The clinical role of hyperoxia to prevent postoperative surgical site infection ( SSI ) remains uncertain since randomized controlled trials on this topic have reported different results . One of the principal reasons for such mixed results can be that previous trials have entered a heterogeneous population of patients and set of procedures . The aim of our study was to assess the influence of hyperoxygenation on SSI using a homogeneous study population . METHODS We studied , in a prospective randomized study extended over the time interval January 2009 to May 2015 , 85 patients who underwent open intraperitoneal anastomosis for acute sigmoid diverticulitis . Patients were assigned randomly to an oxygen/air mixture with a fraction of inspired oxygen ( FiO2 ) of 30 % ( n=43 ) or 80 % ( n=42 ) . Administration was started after induction of anesthesia and maintained for 6 hours after surgery .
RESULTS The overall wound site infection rate was 24.7 % ( 21 out of 85 ) : 14 patients ( 32.5 % ) had a wound infection in the 30 % FiO2 group and 7 ( 16.6 % ) in the 80 % FiO2 group ( p 0.05 ) . The risk of SSI was 43 % lower in the 80 % FiO2 group ( RR , 0.68 ; 95 % confidence interval , 0.35 - 0.88 ) versus 30 % FiO2 . CONCLUSIONS Supplemental 80 % FiO2 during and for 6 hours after open surgery for acute sigmoid diverticulitis reduced postoperative SSI and should be considered part of ongoing quality improvement activities related to surgical care , with few risks to the patient and little associated cost Background : Nitrous oxide inactivates methionine synthase and may lead to DNA damage and wound infection . By using single-cell gel electrophoresis ( comet assay ) , the authors determined the effect of nitrous oxide on DNA damage in circulating leukocytes . Methods : In this double-blind , randomized controlled trial , 91 patients undergoing major colorectal surgery were randomized to receive 70 % nitrous oxide ( n = 31 ) or nitrous oxide-free anesthesia using 30 ( n = 30 ) or 80 % ( n = 30 ) oxygen . Venous blood was collected before and 24 h after surgery . The primary outcome was extent of DNA damage , quantified as the percentage of DNA staining intensity in the comet tail using digital fluorescence microscopy . Incidence of postoperative wound infection was also recorded . Results : Nitrous oxide exposure was associated with a two-fold increase in the percentage of DNA intensity in tail ( P = 0.0003 ) , but not in the 30 ( P = 0.181 ) or 80 % oxygen groups ( P = 0.419 ) . There was a positive correlation between the duration of nitrous oxide exposure and extent of DNA damage , r = 0.33 , P = 0.029 . However , no correlation was observed in nitrous oxide-free patients .
The proportions of postoperative wound infection , using the Centers for Disease Control and Prevention criteria , were 19.4 % ( 6 of 31 ) in the 70 % nitrous oxide group and 6.7 % ( 2 of 30 ) in both the 30 and 80 % oxygen groups , P = 0.21 . An increase in DNA damage was associated with a higher risk of wound infection , adjusted odds ratio ( 95 % CI ) : 1.19 ( 1.07–1.34 ) , P = 0.003 . Conclusions : Nitrous oxide increased DNA damage compared with nitrous oxide-free anesthesia and was associated with postoperative wound infection OBJECTIVE : To evaluate whether supplemental perioperative oxygen decreases surgical site wound infections or endometritis . STUDY DESIGN : This was a prospective , randomized trial . Patients who were to undergo cesarean delivery were recruited and randomly allocated to either 30 % or 80 % oxygen during the cesarean delivery and for 1 hour after surgery . The obstetricians and patients were blinded to the concentration of oxygen used . Patients were evaluated for wound infection or endometritis during their hospital stay and by 6 weeks postpartum . The primary end point was a composite of either surgical site infection or endometritis . RESULTS : Eight hundred thirty-one patients were recruited . Of these , 415 participants received 30 % oxygen perioperatively and 416 received 80 % oxygen . The groups were well matched for age , race , parity , diabetes , number of previous cesarean deliveries , and scheduled compared with unscheduled cesarean deliveries . An intention-to-treat analysis was used . There was no difference in the primary composite outcome ( 8.2 % in women who received 30 % oxygen compared with 8.2 % in women who received 80 % oxygen , P=.89 ) , no difference in surgical site infection in the two groups ( 5.5 % compared with 5.8 % , P=.98 ) , and no significant difference in endometritis in the two groups ( 2.7 % compared with 2.4 % , P=.66 ) , respectively .
CONCLUSION : Women who received 80 % supplemental oxygen perioperatively did not have a lower rate of surgical site infection or endometritis compared with women who received a 30 % supplemental oxygen concentration . CLINICAL TRIAL REGISTRATION : ClinicalTrials.gov , www.clinicaltrials.gov , NCT00876005 . LEVEL OF EVIDENCE : BACKGROUND Higher concentrations of fraction of inspired oxygen ( FIO2 ) have been shown to be associated with lower risk for surgical site infection in multiple studies outside the domain of orthopedic surgery . We evaluated the efficacy of high FIO2 administered during the perioperative period to reduce the rate of surgical site infection after open fixation of lower-extremity fractures at high risk of infection . METHODS We conducted a randomized controlled , parallel design , double-blind study . Patients sustaining high-energy tibial plateau , tibial pilon , and calcaneus fractures treated in a staged fashion were selected for enrollment because these injuries are associated with high risk of infection . The study population included 222 patients with 235 fractures . Consenting patients were randomized by random number sequence to either the treatment or the control group . Treatment group patients received 80 % FIO2 intraoperatively and for 2 hours afterward . Control group patients received 30 % FIO2 during the same period . Surgeons , patients , and personnel who performed wound assessments were blinded to group assignment . The primary outcome measure was surgical site infection as defined by the Centers for Disease Control criteria for postoperative wound infection . RESULTS The overall rates of postoperative surgical site infection were 12 % ( 14 of 119 fractures ) in the treatment group and 16 % ( 19 of 116 fractures ) in the control group ( p = 0.31 ) .
Multivariate analysis , accounting for risk factors for infection , yielded a reduction in the odds of infection with treatment that did not reach statistical significance ( odds ratio , 0.54 ; p = 0.17 ) . No treatment-associated events were observed . CONCLUSION Use of a high concentration of FIO2 during the perioperative period is safe and shows a trend toward reduction of surgical site infection in patients undergoing open operative fixation of high-energy traumatic lower-extremity fractures . Further study in a larger patient population is indicated . LEVEL OF EVIDENCE Therapeutic study , level III Background The role of supplemental oxygen therapy in the healing of esophagojejunal anastomosis is still very much in an experimental stage . The aim of the present prospective , randomized study was to assess the effect of administration of perioperative supplemental oxygen therapy on esophagojejunal anastomosis , where the risk of leakage is high . Methods We enrolled 171 patients between January 2009 and April 2012 who underwent elective open esophagojejunal anastomosis for gastric cancer . Patients were assigned randomly to an oxygen/air mixture with a fraction of inspired oxygen ( FiO2 ) of 30 % ( n = 85 ) or 80 % ( n = 86 ) . Administration commenced after induction of anesthesia and was maintained for 6 h after surgery . Results The overall anastomotic leak rate was 14.6 % ( 25 of 171 ) : 17 patients ( 20 % ) had an anastomotic dehiscence in the 30 % FiO2 group and 8 ( 9.3 % ) in the 80 % FiO2 group ( P < 0.05 ) . The risk of anastomotic leak was 49 % lower in the 80 % FiO2 group ( relative risk 0.61 ; 95 % confidence interval 0.40–0.95 ) versus 30 % FiO2 .
Conclusions Supplemental 80 % FiO2 provided during and for 6 h after major gastric cancer surgery to reduce postoperative anastomotic dehiscence should be considered part of ongoing quality improvement activities related to surgical care , with few risks to the patient and little associated cost OBJECTIVE To determine if supplemental perioperative oxygen will reduce surgical site infection ( SSI ) following cesarean delivery . METHODS This is a randomized , controlled trial evaluating SSI following either 30 % or 80 % fraction of inspired oxygen ( FIO2 ) during and 2 hours after cesarean delivery . Anesthesia providers administered FIO2 via a high-flow oxygen blender . Subjects , surgeons , and wound evaluation teams were blinded . Serial wound evaluations were performed . Data were analyzed using logistic regression models , Fisher exact test , and t test . RESULTS In all , 179 women were randomized , and 160 subjects were included in the analysis . There were 12/83 ( 14.5 % ) SSIs in the control group versus 10/77 ( 13.0 % ) in the investigational group ( p = 0.82 ) . Caucasian race , increased body mass index , and longer operative time were identified as significant risk factors for infection ( p = 0.026 , odds ratio 0.283 ; p = 0.05 , odds ratio = 1.058 ; p = 0.037 , odds ratio = 1.038 , respectively ) . CONCLUSION Perioperative oxygenation with 80 % FIO2 is not effective in reducing SSI following cesarean delivery CONTEXT Surgical site infection ( SSI ) in the general surgical population is a significant public health issue . The use of a high fractional inspired concentration of oxygen ( FIO2 ) during the perioperative period has been reported to be of benefit in selected patients , but its role as a routine intervention has not been investigated . OBJECTIVE To determine whether the routine use of high FIO2 during the perioperative period alters the incidence of SSI in a general surgical population .
DESIGN , SETTING , AND PATIENTS Double-blind , randomized controlled trial conducted between September 2001 and May 2003 at a large university hospital in metropolitan New York City of 165 patients undergoing major intra-abdominal surgical procedures under general anesthesia . INTERVENTIONS Patients were randomly assigned to receive either 80 % oxygen ( FIO2 of 0.80 ) or 35 % oxygen ( FIO2 of 0.35 ) during surgery and for the first 2 hours after surgery . MAIN OUTCOME MEASURES Presence of clinically significant SSI in the first 14 days after surgery , as determined by clinical assessment , a management change , and at least 3 prospectively defined objective criteria . RESULTS The study groups were closely matched in a large number of clinical variables . The overall incidence of SSI was 18.1 % . In an intention-to-treat analysis , the incidence of infection was significantly higher in the group receiving FIO2 of 0.80 than in the group with FIO2 of 0.35 ( 25.0 % vs 11.3 % ; P = .02 ) . FIO2 remained a significant predictor of SSI ( P = .03 ) in multivariate regression analysis . Patients who developed SSI had a significantly longer length of hospitalization after surgery ( mean [ SD ] , 13.3 [ 9.9 ] vs 6.0 [ 4.2 ] days ; P<.001 ) . CONCLUSIONS The routine use of high perioperative FIO2 in a general surgical population does not reduce the overall incidence of SSI and may have predominantly deleterious effects . General surgical patients should continue to receive oxygen with cardiorespiratory physiology as the principal determinant BACKGROUND High concentrations of inspired oxygen are associated with pulmonary atelectasis but also provide recognized advantages . Consequently , the appropriate inspired oxygen concentration for general surgical use remains controversial . The authors tested the hypothesis that atelectasis and pulmonary dysfunction on the first postoperative day are comparable in patients given 30 % or 80 % perioperative oxygen .
METHODS Thirty patients aged 18 - 65 yr were anesthetized with isoflurane and randomly assigned to 30 % or 80 % oxygen during and for 2 h after colon resection . Chest radiographs and pulmonary function tests ( forced vital capacity and forced expiratory volume ) were obtained preoperatively and on the first postoperative day . Arterial blood gas measurements were obtained intraoperatively , after 2 h of recovery , and on the first postoperative day . Computed tomography scans of the chest were also obtained on the first postoperative day . RESULTS Postoperative pulmonary mechanical function was significantly reduced compared with preoperative values , but there was no difference between the groups at either time . Arterial gas partial pressures and the alveolar-arterial oxygen difference were also comparable in the two groups . All preoperative chest radiographs were normal . Postoperative radiographs showed atelectasis in 36 % of the patients in the 30%-oxygen group and in 44 % of those in the 80%-oxygen group . Relatively small amounts of pulmonary atelectasis ( expressed as a percentage of total lung volume ) were observed on the computed tomography scans , and the percentages ( mean ± SD ) did not differ significantly in the patients given 30 % oxygen ( 2.5 % ± 3.2 % ) or 80 % oxygen ( 3.0 % ± 1.8 % ) . These data provided a 99 % chance of detecting a 2 % difference in atelectasis volume at an alpha level of 0.05 . CONCLUSIONS Lung volumes , the incidence and severity of atelectasis , and alveolar gas exchange were comparable in patients given 30 % and 80 % perioperative oxygen . The authors conclude that administration of 80 % oxygen in the perioperative period does not worsen lung function .
Therefore, patients who may benefit from generous oxygen partial pressures should not be denied supplemental perioperative oxygen for fear of causing atelectasis High concentrations of inspired oxygen have been reported to have significant hemodynamic effects that may be related to increased free radical production. If oxygen therapy increases free radical production, it may also modify hemodynamic responses to a nitric oxide donor. Twenty-nine healthy male volunteers were studied using randomized, double-blind, placebo-controlled, crossover designs to determine whether oxygen therapy is associated with hemodynamic and forearm vascular effects. We measured hemodynamic parameters and forearm vascular responses before and 1 h after exposure to 100% oxygen versus medical air. Plasma 8-iso-PGF2alpha and plasma vitamin C were measured to assess the biochemical effects of oxygen administration. Hemodynamic measurements were also made following the acute administration of sublingual nitroglycerin. Oxygen therapy caused no significant change in blood pressure, plasma 8-iso-PGF2alpha, or vitamin C. Oxygen did cause a significant reduction in heart rate and forearm blood flow, and an increase in peripheral vascular resistance. Oxygen caused no change in the hemodynamic response to nitroglycerin. Therefore, in healthy young adults, therapy with 100% oxygen does not affect blood pressure, despite causing an increase in vascular resistance, is not associated with evidence of increased free radical injury, and does not affect the hemodynamic responses to nitroglycerin BACKGROUND Increased long-term mortality was found in patients exposed to perioperative hyperoxia in the PROXI trial, where patients undergoing laparotomy were randomised to 80% versus 30% oxygen during and after surgery. This post hoc follow-up study assessed the impact of perioperative hyperoxia on long-term risk of cardiovascular events.
METHODS A total of 1386 patients undergoing either elective or emergency laparotomy were randomised to 80% versus 30% oxygen during and two hours after surgery. At follow-up, the primary outcome of acute coronary syndrome was assessed. Secondary outcomes included myocardial infarction, other heart disease, and acute coronary syndrome or death. Data were analysed in the Cox proportional hazards model. RESULTS The primary outcome, acute coronary syndrome, occurred in 2.5% versus 1.3% in the 80% versus 30% oxygen group; HR 2.15 (95% CI 0.96-4.84). Patients in the 80% oxygen group had significantly increased risk of myocardial infarction; HR 2.86 (95% CI 1.10-7.44), other heart disease; HR 1.40 (95% CI 1.06-1.83), and acute coronary syndrome or death; HR 1.22 (95% CI 1.01-1.49). CONCLUSIONS Perioperative hyperoxia may be associated with an increased long-term risk of myocardial infarction and other heart disease
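The hazard ratios above are reported with 95% confidence intervals rather than p-values. As a minimal sketch (not part of the trial report), the approximate two-sided Wald p-value can be recovered from an HR and its CI, assuming the interval was computed on the log scale as exp(log HR +/- 1.96*SE):

```python
import math

def wald_p_from_hr_ci(hr, lo, hi):
    """Approximate two-sided Wald p-value for a hazard ratio, assuming the
    95% CI was formed as exp(log(HR) +/- 1.96 * SE) on the log scale."""
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)        # SE of log(HR)
    z = math.log(hr) / se                                  # Wald z statistic
    # two-sided p from the standard normal CDF (via the error function)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Myocardial infarction: HR 2.86 (95% CI 1.10-7.44) -> p just under 0.05
print(round(wald_p_from_hr_ci(2.86, 1.10, 7.44), 3))
# Acute coronary syndrome: HR 2.15 (95% CI 0.96-4.84) -> CI crosses 1, p > 0.05
print(round(wald_p_from_hr_ci(2.15, 0.96, 4.84), 3))
```

This matches the abstract's pattern: the myocardial infarction CI excludes 1 (significant), while the acute coronary syndrome CI crosses 1 (not significant).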
2,007
30,695,068
A majority (28/36) of RCTs showed improvement in some reported outcomes. CONCLUSIONS Interventions to improve treatment adherence result in modest short-term benefits in surrogate outcome measures in dialysis patients, but significant improvements in trial design and outcome reporting are warranted to identify strategies that would achieve meaningful and sustainable clinical benefits.
BACKGROUND In patients with end stage kidney disease (ESKD) on dialysis, treatment non-adherence is common and results in poor health outcomes. However, the clinical benefits of interventions to improve adherence in dialysis patients are difficult to evaluate since trialled interventions and reported outcomes are highly heterogeneous. This review summarizes existing literature on randomized controlled trials (RCTs) evaluating adherence interventions in ESKD patients, focusing on the intervention category, outcome efficacy and persistence of benefit beyond the intervention.
BACKGROUND Peritoneal dialysis (PD) requires patients to take an active role in their adherence to fluid restrictions. Although fluid non-adherence had been identified among this patient group, no specific interventions have been researched or published within the PD population. The current study sought to investigate whether an applied cognitive behavioural therapy (CBT-based) intervention used among haemodialysis patients would improve fluid adherence among PD patients, utilizing clinical indicators used in practice. METHODS Fifteen PD patients identified as fluid non-adherent were randomly assigned to an intervention group (IG) or a deferred-entry control group (CG). The study ran for a total of 21 weeks, with five data collection points: at baseline, post-intervention and at three follow-up points; providing an RCT phase and a combined longitudinal analysis phase. The content of the group intervention encompassed educational, cognitive and behavioural components, aimed to assist patients' self-management of fluid. RESULTS No significant differences in weight (kg) reduction were found in either phase and undesirable changes in blood pressure (BP) were observed. However, in the longitudinal phase, a statistically significant difference in oedematous status was observed at 6-week follow-up, which may be indicative of fluid adherence. Positive and significant differences were observed in the desired direction for measures of psychological well-being, quality of life and health beliefs; areas correlated with enhanced fluid adherence in other research. CONCLUSIONS This study reveals encouraging and significant changes in predictors of fluid adherence.
Although there were no significant changes in weight as a crude clinical measure of fluid intake, significant reductions in oedematous status were observed as a consequence of this CBT-based group intervention Objective The purpose of this study is to evaluate the efficacy of a behavioral self-regulation intervention vs. active control condition using a parallel-group randomized clinical trial with a sample of center hemodialysis patients with chronic kidney disease. Method Participants were recruited from 8 hemodialysis treatment centers in the Midwest. Eligible patients were (a) fluid nonadherent as defined by an interdialytic weight gain > 2.5 kg over a 4-week period, (b) > 18 years of age, (c) English-speaking without severe cognitive impairment, (d) treated with center-based hemodialysis for > 3 months, and (e) not living in a care facility in which meals were managed. Medical records were used to identify eligible patients. Patients were randomly assigned to either a behavioral self-regulation intervention or active control condition in which groups of 3–8 patients met for hour-long, weekly sessions for 7 weeks at their usual hemodialysis clinic. Primary analyses were intention-to-treat. Results Sixty-one patients were randomized to the intervention while 58 were assigned to the attention-placebo support and discussion control. Covariate-adjusted between-subjects analyses demonstrated no unique intervention effect for the primary outcome, interdialytic weight gain (β = 0.13, p = 0.48). Significant within-subjects improvement over time was observed for the intervention group (β = −0.32, p = 0.014). Conclusions The present study found that participation in a behavioral self-regulation intervention resulted in no unique intervention effect on a key indicator of adherence for those with severe chronic kidney disease.
There was, however, modest within-subjects improvement in interdialytic weight gain for the intervention group, which meshes with other evidence showing the utility of behavioral interventions in this patient population. ClinicalTrials.gov Identifier: This research examined the relative efficacies of three intervention strategies designed to increase compliance to medical regimens in a group of ambulatory hemodialysis patients. The interventions examined included behavioral contracting (with or without the involvement of a family member or friend) and weekly telephone contacts with patients. Compliance was assessed with regard to following dietary restrictions and limiting fluid intake. Data were collected from 116 patients drawn from two outpatient clinics. Within clinics, patients were randomly assigned either to an intervention program or to a control group. The study employed a pretest-posttest control group design. Patients were interviewed before the intervention programs began (T1), after a 6-week intervention period (T2), and 3 months after completion of the intervention period (T3). Results showed that the interventions achieved substantial reductions in patients' serum potassium levels and in weight gains between dialysis treatments between T1 and T2. In general, however, these program effects tapered off to preintervention levels between T2 and T3. The findings thus indicate a need for long-term intervention programs OBJECTIVE This study aimed to assess the effectiveness of Benson's relaxation technique in improving the hemodialysis patients' dietary and fluid adherence and biomedical markers. DESIGN This randomized controlled trial with a pre-post test design was conducted on 86 hemodialysis patients randomly divided into an intervention (receiving Benson's relaxation technique) and a control group (usual care).
SETTING The setting of the study was two hemodialysis units affiliated to Shiraz University of Medical Sciences, Shiraz, Iran. INTERVENTION The patients listened to the audiotape of Benson's relaxation technique twice a day, each time for 20 min, for 8 weeks. MAIN OUTCOME MEASURES Dietary and fluid adherence and some biomedical markers were measured in both the intervention and the control group at baseline and at the 8th week after the intervention. RESULTS The results showed significant differences between the two groups regarding blood urea nitrogen and phosphate as dietary adherence and interdialytic weight gain as fluid adherence in the 8th week of the intervention (P<0.05). Also, a significant difference was found between the two groups concerning blood glucose level after the intervention (P<0.05). CONCLUSIONS This study highlighted the importance of Benson's relaxation technique in improvement of adherence and some biomedical markers in hemodialysis patients. Thus, Benson's relaxation therapy could be used as a part of the nursing care practice for hemodialysis patients and those suffering from chronic diseases OBJECTIVE To examine the effect of self-management dietary counselling (SMDC) on adherence to dietary management of hyperphosphatemia among haemodialysis patients. DESIGN An eight-week cluster-based randomised control trial. PARTICIPANTS 122 stable adult patients were recruited from an HD unit in Sidon, Lebanon. Study groups were: full intervention (A) (n = 41), partial intervention (B) (n = 41) and control (C) (n = 40). INTERVENTION Group (A) received SMDC, Group (B) received educational games only and Group (C) did not receive any research intervention. MAIN OUTCOME MEASURES Serum phosphorus (P), calcium phosphate product (Ca × P) and two questionnaires: patient knowledge (PK) and dietary non-adherence (PDnA) to a P-reduced diet.
RESULTS Group A experienced a significant improvement in mean (± SD) P (6.54 ± 2.05 to 5.4 ± 1.97 mg/dl), Ca × P (58 ± 17 to 49 ± 12), PK scores (50 ± 17 to 69 ± 25%) and PDnA scores (21.4 ± 4.0 to 18.3 ± 2.0). Group B experienced a significant improvement in Ca × P (52 ± 14 to 45 ± 16). Group C did not experience any significant change post intervention. CONCLUSION Our findings demonstrate the importance of patient-tailored counselling on serum P management Background/Aims. One of the causes of uncontrolled secondary hyperparathyroidism (sHPT) is patient's poor drug adherence. We evaluated the clinical benefits of an integrated care approach on the control of sHPT by cinacalcet. Methods. Prospective, randomized, controlled, multicenter, open-label study. Fifty hemodialysis patients on a stable dose of cinacalcet were randomized to an integrated care approach (IC) or usual care approach (UC). In the IC group, cinacalcet adherence was monitored using an electronic system. Results were discussed with the patients in motivational interviews, and drug prescription adapted accordingly. In the UC group, drug adherence was monitored, but results were not available. Results. At six months, 84% of patients in the IC group achieved recommended iPTH targets versus 55% in the UC group (P = 0.04). The mean cinacalcet taking adherence improved by 10.8% in the IC group and declined by 5.3% in the UC group (P = 0.02). Concomitantly, the mean dose of cinacalcet was reduced by 7.2 mg/day in the IC group and increased by 6.4 mg/day in the UC group (P = 0.03). Conclusions.
The use of a drug adherence monitoring program in the management of sHPT in hemodialysis patients receiving cinacalcet improves drug adherence and iPTH control and allows a reduction in the dose of cinacalcet Flaws in the design, conduct, analysis, and reporting of randomised trials can cause the effect of an intervention to be underestimated or overestimated. The Cochrane Collaboration's tool for assessing risk of bias aims to make the process clearer and more OBJECTIVE Poor compliance with the dietary prescriptions is quite common in dialysis patients. We believe that most of the noncompliance is caused by the patient's poor understanding of the dietary prescription. Therefore, in the present study, we tried to investigate the role of menu suggestion in improving the patient's compliance with the dietary prescription. DESIGN AND SETTING A longitudinal cohort study conducted at an outpatient dialysis clinic. PATIENTS Seventy clinically stable patients on peritoneal dialysis were included in this prospective study during April 1, 2004, to November 31, 2004, in a single center. Patients who had significant cognitive impairment and thus did not understand the food contents during the training course were not eligible for enrollment. INTERVENTION All the patients were randomly assigned to 1 of 2 groups. Group 1 patients received the traditional patient education method. Group 2 patients additionally received individualized menu suggestions based on their food preferences and education on how to exchange the foods at equivalent amounts according to the exchange list. MAIN OUTCOME MEASURES At present, there are no clear optimal dietary protein intake levels for peritoneal dialysis patients. Our experience is that a dietary protein intake level of 0.8 to 1.2 g/kg/d can maintain our patients in a good nutritional status.
Thereafter, in this study we prescribed the dietary protein intake level at 0.8 to 1.2 g/kg/d and defined compliance as meeting this target protein intake level. RESULTS There were 35 patients in each group. The compliance was 22.9% in group 1 and 57.1% in group 2 (P < .05). CONCLUSIONS Our study suggests that menu suggestion may be an effective way of improving the compliance with the diet in peritoneal dialysis patients. It improves the patient's understanding of the dietary prescription CONTEXT High dietary phosphorus intake has deleterious consequences for renal patients and is possibly harmful for the general public as well. To prevent hyperphosphatemia, patients with end-stage renal disease limit their intake of foods that are naturally high in phosphorus. However, phosphorus-containing additives are increasingly being added to processed and fast foods. The effect of such additives on serum phosphorus levels is unclear. OBJECTIVE To determine the effect of limiting the intake of phosphorus-containing food additives on serum phosphorus levels among patients with end-stage renal disease. DESIGN, SETTING, AND PARTICIPANTS Cluster randomized controlled trial at 14 long-term hemodialysis facilities in northeast Ohio. Two hundred seventy-nine patients with elevated baseline serum phosphorus levels (> 5.5 mg/dL) were recruited between May and October 2007. Two shifts at each of 12 large facilities and 1 shift at each of 2 small facilities were randomly assigned to an intervention or control group. INTERVENTION Intervention participants (n=145) received education on avoiding foods with phosphorus additives when purchasing groceries or visiting fast food restaurants. Control participants (n=134) continued to receive usual care. MAIN OUTCOME MEASURE Change in serum phosphorus level after 3 months. RESULTS At baseline, there was no significant difference in serum phosphorus levels between the 2 groups.
After 3 months, the decline in serum phosphorus levels was 0.6 mg/dL larger among intervention vs control participants (95% confidence interval, -1.0 to -0.1 mg/dL). Intervention participants also had statistically significant increases in reading ingredient lists (P<.001) and nutrition facts labels (P = .04) but no significant increase in food knowledge scores (P = .13). CONCLUSION Educating end-stage renal disease patients to avoid phosphorus-containing food additives resulted in modest improvements in hyperphosphatemia. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT00583570 CONTEXT Mortality rates among US hemodialysis patients are the highest in the industrialized world at 23% per year. Measures of dialysis dose (Kt/V) correspond strongly with survival and are inadequate in one sixth of patients. Inadequate dialysis is also associated with increased hospitalizations and high inpatient costs. Our previous work identified 3 barriers to adequate hemodialysis: dialysis underprescription, catheter use, and shortened treatment time. OBJECTIVE To determine the effect of a tailored intervention on adequacy of hemodialysis. DESIGN AND SETTING Community-based randomized controlled trial with recruitment from April 1999 to June 2000 at 29 hemodialysis facilities in northeast Ohio. PARTICIPANTS Forty-four nephrologists and their 169 randomly selected adult patients receiving inadequate hemodialysis. INTERVENTION Nephrologists were randomly assigned to an intervention (n = 21) or control (n = 23) group. For patients in the intervention group (n = 85), depending on the barrier(s) present, a study coordinator gave nephrologists recommendations about optimizing dialysis prescriptions, expedited conversion of catheters to surgically created grafts or fistulas, and educated patients about the importance of compliance with treatment time. Patients in the control group (n = 84) continued to receive usual care.
MAIN OUTCOME MEASURES Changes in Kt/V and specific barriers after 6 months. RESULTS At baseline, intervention and control patients had similar Kt/V measurements, specific barriers, and demographic and medical characteristics. After 6 months, intervention patients had 2-fold larger increases in Kt/V compared with control patients (+0.20 vs +0.10; P<.001) and were more likely to achieve their facility Kt/V goal (62% vs 42%; P = .01). Intervention patients also had nearly 3-fold larger increases in dialysis prescription (+0.16 vs +0.06; P<.001) and were 4 times more likely to change from use of catheters to use of fistulas/grafts (28% vs 7%; P = .04). CONCLUSIONS An intervention tailored to patient-specific barriers resulted in increased hemodialysis dose. Extending this approach to the 33 000 persons in the United States receiving inadequate hemodialysis may substantially enhance patient survival, diminish hospitalizations, and decrease inpatient expenditures Background Treatment of secondary hyperparathyroidism (SHPT) is important in management of patients with end-stage renal disease on hemodialysis (HD). Calcimimetic agent cinacalcet provides an option for control of SHPT in patients who fail traditional therapy. It may not have optimal results in non-compliant patients. To enhance compliance, we evaluated effectiveness of post-dialysis dosing of cinacalcet (group AD) as compared to daily home administration (group D) in a prospective randomized trial of HD patients with refractory SHPT. Methods After a 2-week run-in phase, patients were randomly assigned to two treatment groups. In group AD (N = 12), patients were administered cinacalcet on the day of dialysis (3 times/week) by dialysis staff, while in control group D (N = 11), cinacalcet was prescribed daily to be taken by patients at home.
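The Kt/V dose reported in the trial above is usually estimated from pre- and post-dialysis urea rather than measured directly; one widely used estimate is the second-generation Daugirdas single-pool formula. The sketch below uses hypothetical patient values for illustration only (they are not from any study here):

```python
import math

def daugirdas_ktv(bun_pre, bun_post, hours, uf_liters, weight_kg):
    """Second-generation Daugirdas estimate of single-pool Kt/V:
    Kt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF/W,
    where R is the post/pre BUN ratio, t the session length in hours,
    UF the ultrafiltration volume (L), and W the post-dialysis weight (kg)."""
    r = bun_post / bun_pre
    return -math.log(r - 0.008 * hours) + (4 - 3.5 * r) * uf_liters / weight_kg

# Hypothetical session: BUN falls 70 -> 25 mg/dL over 4 h, 3 L removed, 70 kg patient.
ktv = daugirdas_ktv(70, 25, 4, 3, 70)
print(round(ktv, 2))  # around 1.24, above a commonly cited adequacy target of 1.2
```

A larger urea reduction or more ultrafiltration raises the estimate, which is why the barrier intervention above targets both prescription and full treatment time.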
Intact parathyroid hormone (i-PTH), serum calcium, phosphorus, and alkaline phosphatase were followed for 16 weeks and compared to baseline in both groups. Data were analyzed using between-groups linear regression for repeated measures. Results No significant decline in i-PTH occurred in group AD at 16 weeks as compared to a significant drop in group D (p = 0.006). However, subgroup analysis showed effectiveness of post-dialysis dosing in patients with less severe SHPT (p = 0.04). Conclusion Although daily dosing overall was more effective for treatment of SHPT, dialysis dosing was effective in patients with less severe SHPT. This warrants a larger study considering the limitations of this pilot trial. In the meantime, dialysis dosing can be considered in non-compliant patients with less severe SHPT OBJECTIVE The purpose of this study was to evaluate the effectiveness of 20 to 30 minutes per month of additional diet education on monthly laboratory values (phosphorus, calcium, parathyroid hormone, and calcium/phosphorus product) and knowledge of dietary phosphorus management in hemodialysis patients with hyperphosphatemia. DESIGN A quasi-experimental design. SETTING Three outpatient dialysis centers owned by the same corporation in 1 southern state. PATIENTS Based on a 3-month average serum phosphorus > 6.0 mg/dL, 70 patients were selected for participation; 63 dialysis patients completed the study, 32 in the experimental group and 31 in the control group. INTERVENTION All patients completed a before-and-after knowledge test and had monthly blood samples drawn. Each month, the same registered dietitian provided the routine laboratory results review with the control group. The experimental group received the routine laboratory review plus 20 to 30 minutes of additional diet education specifically targeting phosphorus.
MAIN OUTCOME MEASURES Before-and-after knowledge test results and baseline and final serum calcium, phosphorus, parathyroid hormone, and calcium/phosphorus product levels. RESULTS At baseline, there were no significant differences in any of the laboratory values, but the knowledge level of the experimental group was greater (P < .05). After 6 months, gains in knowledge were significantly higher in the intervention group, and the serum phosphorus and calcium/phosphorus product levels were significantly lower (P < .01) than in the control group. CONCLUSION Based on this research, those patients who received extra education monthly showed positive changes, which may be beneficial in reducing hyperphosphatemia Patients with ESRD have high rates of depression, which is associated with diminished quality of life and survival. We determined whether individual cognitive behavioral therapy (CBT) reduces depression in hemodialysis patients with elevated depressive affect in a randomized crossover trial. Of 65 participants enrolled from two dialysis centers in New York, 59 completed the study and were assigned to the treatment-first group (n=33) or the wait-list control group (n=26). In the intervention phase, CBT was administered chairside during dialysis treatments for 3 months; participants were assessed 3 and 6 months after randomization. Compared with the wait-list group, the treatment-first group achieved significantly larger reductions in Beck Depression Inventory II (self-reported, P=0.03) and Hamilton Depression Rating Scale (clinician-reported, P<0.001) scores after intervention. Mean scores for the treatment-first group did not change significantly at the 3-month follow-up. Among participants with depression diagnosed at baseline, 89% in the treatment-first group were not depressed at the end of treatment compared with 38% in the wait-list group (Fisher's exact test, P=0.01).
Furthermore, the treatment-first group experienced greater improvements in quality of life, assessed with the Kidney Disease Quality of Life Short Form (P=0.04), and interdialytic weight gain (P=0.002) than the wait-list group, although no effect on compliance was evident at follow-up. In summary, CBT led to significant improvements in depression, quality of life, and prescription compliance in this trial, and studies should be undertaken to assess the long-term effects of CBT on morbidity and mortality in patients with ESRD Background: Worldwide, approximately 11% of patients on dialysis receive peritoneal dialysis (PD). Whilst PD may offer more autonomy to patients compared with hemodialysis, patient and caregiver burnout, technique failure, and peritonitis remain major challenges to the success of PD. Improvements in care and outcomes are likely to be mediated by randomized trials of innovative therapies, but will be limited if the outcomes measured and reported are not important for patients and clinicians. The aim of the Standardised Outcomes in Nephrology-Peritoneal Dialysis (SONG-PD) study is to establish a set of core outcomes for trials in patients on PD based on the shared priorities of all stakeholders, so that outcomes of most relevance for decision-making can be evaluated, and that interventions can be compared reliably. Methods: The 5 phases in the SONG-PD project are: a systematic review to identify outcomes and outcome measures that have been reported in randomized trials involving patients on PD; focus groups using nominal group technique with patients and caregivers to identify, rank, and describe reasons for their choice of outcomes; semi-structured key informant interviews with health professionals; a 3-round international Delphi survey involving a multi-stakeholder panel; and a consensus workshop to review and endorse the proposed set of core outcome domains for PD trials.
Discussion: The establishment of 3 to 5 high-priority core outcomes, to be measured and reported consistently in all trials in PD, will enable patients and clinicians to make informed decisions about the relative effectiveness of interventions, based upon outcomes of common importance OBJECTIVE Among chronic hemodialysis patients, hyperphosphatemia is common and associated with mortality. Behavioral economics and complementary behavior-change theories may offer valuable approaches to achieving phosphorus (PO4) control. The aim was to determine feasibility of implementing financial incentives and structured coaching to improve PO4 in the hemodialysis setting. DESIGN AND METHODS This pilot randomized controlled trial was conducted in 3 urban dialysis units for 10 weeks among 36 adults with elevated serum PO4 (median > 5.5 mg/dL over 3 months). INTERVENTIONS Twelve participants each were randomized to: (1) financial incentives for lowering PO4, (2) coaching about dietary and medication adherence, or (3) usual care. PO4 was measured during routine clinic operations. Each incentives-arm participant received the equivalent of $1.50/day if the PO4 was ≤5.5 mg/dL or > 5.5 mg/dL but decreased ≥0.5 mg/dL since the prior measurement. The coach was instructed to contact coaching-arm participants at least 3 times per week. MAIN OUTCOME MEASURES The outcome measures included: (1) enrollment rate, (2) dropout rate, and (3) change in PO4 from beginning to end of the 10-week intervention period. RESULTS Of 66 eligible patients, 36 (55%) enrolled. Median age was 53 years, 83% were black race, and 78% were male. Median baseline PO4 was 6.0 (interquartile range 5.6, 7.5). Using stratified generalized estimating equation analyses, the monthly decline in PO4 was -0.32 mg/dL (95% CI -0.60, -0.04) in the incentives arm, -0.40 mg/dL (-0.60, -0.20) in the coaching arm, and -0.24 mg/dL (-0.60, 0.08) in the usual care arm.
No patients dropped out. All intervention arm participants expressed interest in receiving similar support in the future. CONCLUSIONS This pilot trial demonstrated good feasibility in enrollment and implementation of novel behavioral health strategies to reduce PO4 in dialysis patients Objective: Haemodialysis patients are at risk of serious health complications; yet, treatment non-adherence remains high. Warnings about health risks associated with non-adherence may trigger defensive reactions. We studied whether an intervention based on self-affirmation theory reduced resistance to health-risk information and improved fluid treatment adherence. Design: In a cluster randomised controlled trial, 91 patients either self-affirmed or completed a matched control task before reading about the health risks associated with inadequate fluid control. Outcome measures: Patients' perceptions of the health-risk information, intention and self-efficacy to control fluid were assessed immediately after presentation of health-risk information. Interdialytic weight gain (IDWG), excess fluid removed during haemodialysis, is a clinical measure of fluid treatment adherence. IDWG data were collected up to 12 months post-intervention. Results: Self-affirmed patients had significantly reduced IDWG levels over 12 months. However, contrary to predictions derived from self-affirmation theory, self-affirmed participants and controls did not differ in their evaluation of the health-risk information, intention to control fluid or self-efficacy. Conclusion: A low-cost, high-reach health intervention based on self-affirmation theory was shown to reduce IDWG over a 12-month period, but the mechanism by which this apparent behaviour change occurred is uncertain. Further work is still required to identify mediators of the observed effects BACKGROUND Adhering to fluid restrictions represents one of the most difficult aspects of the hemodialysis treatment regimen.
This report describes a randomized controlled trial of a group-based cognitive behavioral intervention aimed at improving fluid-restriction adherence in patients receiving hemodialysis. It was hypothesized that the intervention would improve adherence, measured by means of interdialytic weight gain (IWG), without impacting negatively on psychosocial functioning. METHODS Fifty-six participants receiving hemodialysis from 4 renal outpatient settings were randomly assigned to an immediate-treatment group (ITG; n = 29) or deferred-treatment group (DTG; n = 27). Participants were assessed at baseline, posttreatment, and follow-up stages. Treatment consisted of a 4-week intervention using educational, cognitive, and behavioral strategies to enhance effective self-management of fluid consumption. RESULTS No significant difference in mean IWGs was found between the ITG and DTG during the acute-phase analysis (F(1,54) = 0.03; P > 0.05). However, in longitudinal analysis, there was a significant main effect for mean IWG (F(1.76,96.80) = 9.10; P < 0.001) and a significant difference between baseline and follow-up IWG values (t55 = 3.85; P < 0.001), reflecting improved adherence over time. No adverse effects of treatment were indicated through measures of psychosocial functioning. Some significant changes were evidenced in cognitions thought to be important in mediating behavioral change. CONCLUSION The current study provides evidence for the feasibility and effectiveness of applying group-based cognitive behavior therapy to enhance adherence to hemodialysis fluid restrictions. Results are discussed in the context of the study's methodological limitations OBJECTIVE To evaluate the effectiveness of a protocol designed to optimize serum phosphate levels in patients undergoing regular hemodialysis (HD). DESIGN Randomized, controlled trial. SETTING Hemodialysis units at Barts and the London NHS Trust and satellite units.
PATIENTS Thirty-four clinically stable adults undergoing regular HD with a serum phosphate level > 1.8 mmol/L on at least one occasion within 4 months of starting the study. INTERVENTION Management of serum phosphate using a specially designed phosphate management protocol during a 4-month study period implemented by a renal dietitian and renal pharmacist, compared with standard practice. MAIN OUTCOME MEASURE Change in serum phosphate levels in both groups after 4 months. RESULTS Patients managed using the phosphate management protocol had a significantly greater reduction in serum phosphate levels compared with patients receiving standard practice (-0.22 ± 0.67 mmol/L vs. +0.19 ± 0.32 mmol/L, P = 0.03). CONCLUSION The phosphate management protocol was effective, and its implementation was associated with significantly better serum phosphate control in patients undergoing regular HD. Hypertension in patients on hemodialysis (HD) contributes significantly to their morbidity and mortality. This study examined whether a supportive nursing intervention incorporating monitoring, goal setting, and reinforcement can improve blood pressure (BP) control in a chronic HD population. A randomized controlled design was used and 118 participants were recruited from six HD units in the Detroit metro area. The intervention consisted of (1) BP education sessions; (2) a 12-week intervention, including monitoring, goal setting, and reinforcement; and (3) a 30-day post-intervention follow-up period. Participants in the treatment were asked to monitor their BP, sodium, and fluid intake weekly for 12 weeks in weekly logs. BP, fluid and sodium logs were reviewed weekly with the researcher to determine if goals were met or not met. Reinforcement was given for goals met and problem solving offered when goals were not met. The control group received standard care.
Both systolic and diastolic BPs were significantly decreased in the treatment group. This study is a randomized, controlled trial to examine the effect of the health contract intervention, based on the goal attainment theory, on the self-care behavior and physiological indices of renal dialysis patients in Korea. The experimental group (n = 21) underwent the health contract intervention for 4 weeks, while the control group (n = 22) received routine care. The data were collected using questionnaires and measurement of physiological indices and analyzed using the SPSS WIN 12.0 program. A P value < 0.05 was considered statistically significant. The total score of self-care behavior (P = 0.011) and individual scores for behaviors, such as diet (P = 0.017), exercise and rest (P = 0.001), and blood pressure and body weight (P = 0.006), were higher in the experimental group. Serum potassium concentration and mean weight gain between dialysis sessions were significantly lower in the experimental group (P = 0.002, P = 0.017). Therefore, the health contract intervention based on the goal attainment theory proved effective in improving self-care behavior and physiological indices (K, P, mean weight gain) in renal dialysis patients in Korea. BACKGROUND Patients with end-stage renal failure require dialysis and strict adherence to treatment plans to sustain life. However, non-adherence is a common and serious problem among patients with chronic kidney disease. There is a scarcity of studies examining the effects of disease management programmes on patients with chronic kidney disease. OBJECTIVES This paper examines whether the study group receiving the disease management programme showed better improvement than the control group, comparing outcomes at baseline (O1), at 7 weeks at the completion of the programme (O2) and at 13 weeks (O3). METHODS This is a randomized controlled trial.
The outcome measures were non-adherence in diet, fluid, dialysis and medication, quality of life, satisfaction, symptom control, complication control and health service utilisation. RESULTS There was no significant difference between the control and study group for the baseline measures, except for sleep. Significant differences (p<0.05) were found between the control and study group at O2 in the outcome measures of diet degree non-adherence, sleep, symptom, staff encouragement, overall health and satisfaction. Sustained effects at O3 were noted in the outcome measures of continuous ambulatory peritoneal dialysis (CAPD) non-adherence degree, sleep, symptom, and effect of kidney disease. CONCLUSIONS Many studies exploring chronic disease management have neglected the group with end-stage renal failure and this study fills this gap. This study has employed an innovative model of skill mix using specialist and general nurses and demonstrated patient improvement in diet non-adherence, CAPD non-adherence, aspects of quality of life and satisfaction with care. Redesigning chronic disease management programmes helps to optimize the use of different levels of skills and resources to bring about positive outcomes. OBJECTIVE To determine the effect of a dietetic educational intervention on phosphate and calcium levels of hemodialysis patients. DESIGN Parallel-group randomized controlled trial. SETTING Teaching hospital hemodialysis unit in London, England. PATIENTS Fifty-six stable adult hemodialysis patients with hyperphosphatemia. INTERVENTION An educational intervention and one-to-one teaching session given by a renal dietitian, attempting to improve patients' knowledge of phosphate management and their compliance with diet and medication. OUTCOME MEASUREMENT Patients' serum phosphate, calcium, and calcium × phosphate products in the 3 months after the intervention, compared with those before the intervention.
Results were also compared with a control group that had not undergone the intervention. RESULTS In the intervention group, serum phosphate was significantly reduced after the education session, as compared with the results previously. In the control group, there was no significant change in serum phosphate level. The improved results were sustained over a period of 3 months. Serum calcium increased in the intervention group, but this result was not significant. There was an improvement in the calcium-phosphate product in the intervention group, but again this was not significant. CONCLUSION Dietetic educational intervention can favorably alter patients' serum phosphate levels, with potential impact on morbidity and mortality. BACKGROUND Patients with end-stage renal disease often fail to follow a prescribed diet and fluid regimen, which undermines the effectiveness of care and leads to unpredictable progression of the disease and greater likelihood of complications. AIM The purpose of this randomized controlled study was to examine the effectiveness of self-efficacy training on fluid intake compliance. METHODS This study took place in northern Taiwan. Eligible patients were receiving routine haemodialysis; 20-65 years of age; living in a community setting; able to read and write; and willing to participate. Sixty-two end-stage renal disease patients participated in the study. Those in the experimental group (n = 31) received 12 sessions of structured self-efficacy training; the control group patients (n = 31) received only routine care. The intervention was based on Bandura's theory and included an educational component, performance mastery, experience sharing, and stress management. The outcome measure was the mean body weight gain between dialysis sessions. Data were collected at baseline, 1, 3 and 6 months following the intervention and analysed by a descriptive and repeated-measures ANOVA.
RESULTS Experimental group mean weight gains decreased gradually following self-efficacy training. However, control group mean weight gains decreased only slightly over time. These results were statistically significant when baseline differences were controlled for (P < 0.05). CONCLUSIONS The study supports the effectiveness of the self-efficacy training in controlling mean body weight gains of end-stage renal disease patients receiving haemodialysis. OBJECTIVE Assess the effectiveness of a self-monitoring tool on perceptions of self-efficacy, health beliefs, and adherence in patients receiving hemodialysis. DESIGN A monthly intervention using a pretest, posttest design over a 6-month period. Both the treatment and control groups were randomly selected and received surveys to assess health beliefs, perceptions of self-efficacy for performing specific healthful behaviors, and renal diet knowledge at baseline, before intervention, and 6 months later. The treatment group also received monthly feedback of monthly phosphorus levels and interdialytic weight gains. SETTING A university hospital-based 43-chair ambulatory dialysis center. SUBJECTS Forty patients with end-stage renal disease (25 men and 15 women, age 26 to 78 years), on chronic hemodialysis for at least 2 months and with a history of noncompliance with phosphorus and/or fluid restrictions for 1 or more months. MAIN OUTCOME MEASURES Self-efficacy, health beliefs, knowledge, biochemical, and demographic variables were analyzed. Analysis of variance tests of repeated measures were used to examine relationships between adherence with phosphorus and fluid restrictions to health beliefs and perceptions of self-efficacy after training in self-monitoring.
RESULTS Overall, there were no significant improvements in adherence with phosphorus and fluid restrictions between the two groups, although a comparison within the groups revealed the treatment group had a statistically significant decrease in mean phosphorus levels of 7.14 to 6.22 mg/dL (P = .005) from baseline to month 3. However, because this value was not maintained, it was not statistically significant. No significant differences existed between the two groups for health beliefs and perceptions of self-efficacy. Knowledge scores in the treatment group, however, improved significantly as compared to the control group (P = .008) and showed a significant increase from baseline (P = .002). In the control group, all scores fell slightly but this difference was not significant. CONCLUSIONS The benefits of patient self-monitoring and behavioral contracting upon adherence in patients on hemodialysis are inconclusive, as serum phosphorus and interdialytic weight gains did not differ between the two groups. The interventional tools also appeared to have little effect on perceptions of self-efficacy and health beliefs. Trends of improvement, however, did exist for phosphorus within the treatment group, and subjects in this group had a statistically significant increase in knowledge scores over time. Additional research using repeated-measures designs is needed to explore the effects of increased frequency and duration of an intervention on the attainment of patient clinical outcome measures. BACKGROUND Nonadherence among hemodialysis patients compromises dialysis delivery, which could influence patient morbidity and mortality. The Dialysis Outcomes and Practice Patterns Study (DOPPS) provides a unique opportunity to review this problem and its determinants on a global level. METHODS Nonadherence was studied using data from the DOPPS, an international, observational, prospective hemodialysis study.
Patients were considered nonadherent if they skipped one or more sessions per month, shortened one or more sessions by more than 10 minutes per month, had a serum potassium level >6.0 mEq/L, a serum phosphate level >7.5 mg/dL (>2.4 mmol/L), or interdialytic weight gain (IDWG) >5.7% of body weight. Predictors of nonadherence were identified using logistic regression. Survival analysis used the Cox proportional hazards model adjusting for case-mix. RESULTS Skipping treatment was associated with increased mortality [relative risk (RR) = 1.30, P = 0.01], as were excessive IDWG (RR = 1.12, P = 0.047) and high phosphate levels (RR = 1.17, P = 0.001). Skipping also was associated with increased hospitalization (RR = 1.13, P = 0.04), as were high phosphate levels (RR = 1.07, P = 0.05). Larger facility size (per 10 patients) was associated with higher odds ratios (OR) of skipping (OR = 1.03, P = 0.06), shortening (OR = 1.03, P = 0.05), and IDWG (OR = 1.02, P = 0.07). An increased percentage of highly trained staff hours was associated with a lower OR of skipping (OR = 0.84 per 10%, P = 0.02); presence of a dietitian was associated with a lower OR of excessive IDWG (OR = 0.75, P = 0.08). CONCLUSION Nonadherence was associated with increased mortality risk (skipping treatment, excessive IDWG, and high phosphate) and with hospitalization risk (skipping, high phosphate). Certain patient/facility characteristics also were associated with nonadherence. AIM This paper is a report of a study conducted to determine the effect of an educational intervention on dietary and fluid compliance in patients having haemodialysis. BACKGROUND Many of the clinical problems experienced by patients having haemodialysis are related to their failure to eat appropriate foods and restrict their fluid intake.
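The DOPPS nonadherence definition above is a simple disjunction of thresholds, so it can be expressed as a short predicate. The following is a minimal illustrative sketch, not code from the study: only the numeric cut-offs come from the abstract, and the function and argument names are hypothetical.

```python
# Hypothetical encoding of the DOPPS nonadherence criteria described in the
# abstract above. Thresholds come from the text; all names are illustrative.

def is_nonadherent(skipped_sessions_per_month: int,
                   shortened_sessions_per_month: int,  # sessions cut by >10 min
                   serum_potassium_meq_l: float,
                   serum_phosphate_mg_dl: float,
                   idwg_pct_body_weight: float) -> bool:
    """Return True if any single DOPPS nonadherence criterion is met."""
    return (
        skipped_sessions_per_month >= 1
        or shortened_sessions_per_month >= 1
        or serum_potassium_meq_l > 6.0
        or serum_phosphate_mg_dl > 7.5   # equivalent to > 2.4 mmol/L
        or idwg_pct_body_weight > 5.7
    )

print(is_nonadherent(0, 0, 4.5, 5.0, 3.2))  # False: within all thresholds
print(is_nonadherent(0, 0, 4.5, 8.0, 3.2))  # True: phosphate > 7.5 mg/dL
```

Each criterion alone suffices to classify a patient as nonadherent, which matches the abstract's "one or more" wording.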
Educational intervention in patients having haemodialysis to improve their compliance with dietary and fluid restrictions appears to be effective. METHODS Sixty-three patients having haemodialysis in three general hospitals in Tehran, Iran, were allocated into two groups at random for oral and/or video education. They were asked to give demographic and medical data. Bimonthly average values of serum potassium, sodium, calcium, phosphate, albumin, creatinine, uric acid, and blood urea nitrogen and interdialytic weight gain were measured before and after the teaching programmes. The data were collected in 2007. FINDINGS Compliance in terms of biochemical parameters and interdialytic weight gain was observed in 63.5% and 76.2% of patients in the oral and video teaching groups respectively. Statistically significant correlations were observed between demographic variables (age, educational level and occupation) and dietary and fluid compliance (P < 0.001). There was no difference between the effectiveness of the two educational interventions. CONCLUSION Nurses should emphasize sodium compliance in patients having haemodialysis and explain its adverse effects, such as excessive weight gain, hypertension, and peripheral oedema. AIMS AND OBJECTIVES To prospectively evaluate the effects of a nurse-led educational intervention on the management of hyperphosphataemia as well as knowledge of phosphate among patients with end-stage renal disease. BACKGROUND Haemodialysis and phosphate binder therapy are the major methods used to reduce the phosphate level in dialysis patients. However, patient education related to hyperphosphataemia, diet and phosphate binders may be another important factor associated with the success of the control of the hyperphosphataemia. DESIGN This prospective randomised controlled trial was conducted during the period from June 2009-March 2011 at the HD units of two hospitals in Tianjin, China.
METHODS A total of 80 participants were randomly assigned to the experimental group (n=40) and the control group (n=40). Participants in the experimental group received the nurse-led intensive educational programme, including individualised education and educational sessions about diet and medicine regimes, etc., while participants in the control group received the routine guidance. RESULTS There were statistically significant differences between the study groups in decline in serum phosphorus and calcium-phosphorus product levels and improvement in patients' general knowledge three months postintervention, and these differences were sustained until the end of the study. An increased serum calcium level was observed both in the experimental group and in the control group, but there was no significant difference between groups. No statistical significance was found regarding serum albumin level between the groups. No significant difference in the serum parathyroid hormone level was found between the groups by month 6. CONCLUSIONS The nurse-led intensive educational programme plays an important role in the control of hyperphosphataemia among patients with end-stage renal disease. RELEVANCE TO CLINICAL PRACTICE Chronic kidney failure patients with hyperphosphataemia are more likely to benefit from nurse-led intensive educational programmes. BACKGROUND Poor adherence to treatment is common in hemodialysis patients. However, effective interventions for adherence in this population are lacking. Small studies of behavioral interventions have yielded improvements, but clinical effectiveness and long-term effects are unclear. STUDY DESIGN Multicenter parallel (1:1) design, blinded cluster-randomized controlled trial. SETTING & PARTICIPANTS Patients undergoing maintenance hemodialysis enrolled in 14 dialysis centers.
INTERVENTION Dialysis shifts of eligible patients were randomly assigned to either an interactive and targeted self-management training program (HED-SMART; intervention; n=134) or usual care (control; n=101). HED-SMART, developed using the principles of problem solving and social learning theory, was delivered in a group format by health care professionals over 4 sessions. OUTCOMES & MEASUREMENTS Serum potassium and phosphate concentrations, interdialytic weight gains (IDWGs), self-reported adherence, and self-management skills at 1 week, 3 months, and 9 months postintervention. RESULTS 235 participants were enrolled in the study (response rate, 44.2%), and 82.1% completed the protocol. IDWG was significantly lowered across all 3 assessments relative to baseline (P<0.001) among patients randomly assigned to HED-SMART. In contrast, IDWG in controls showed no change except at 3 months, when it worsened significantly. Improvements in mineral markers were noted in the HED-SMART arm at 3 months (P<0.001) and in potassium concentrations (P<0.001) at 9 months. Phosphate concentrations improved in HED-SMART at 3 months (P=0.03), but these effects were not maintained at 9 months postintervention. Significant differences between the arms were found for the secondary outcomes of self-reported adherence, self-management skills, and self-efficacy at all time points. LIMITATIONS Low proportion of patients with diabetes. CONCLUSIONS HED-SMART provides an effective and practical model for improving health in hemodialysis patients. The observed improvements in clinical markers and self-reported adherence, if maintained at the longer follow-up, could significantly reduce end-stage renal disease-related complications. Given the feasibility of this kind of program, it has strong potential for supplementing usual care. TRIAL REGISTRATION Registered at ISRCTN with study number ISRCTN31434033
2,008
23,966,427
Sensitivity analyses of RCTs in children showed more pronounced benefits in preventing weight gain in SSB substitution trials (compared with school-based educational programs) and among overweight children (compared with normal-weight children). Our systematic review and meta-analysis of prospective cohort studies and RCTs provides evidence that SSB consumption promotes weight gain in children and adults
BACKGROUND The relation between sugar-sweetened beverages (SSBs) and body weight remains controversial. OBJECTIVE We conducted a systematic review and meta-analysis to summarize the evidence in children and adults.
The long-term physiological effects of refined carbohydrates on appetite and mood remain unclear. Reported effects when subjects are not blind may be due to expectations and have rarely been studied for more than 24 h. The present study compared the effects of supplementary soft drinks added to the diet over 4 weeks on dietary intake, mood and BMI in normal-weight women (n 133). Subjects were categorised as 'watchers' or 'non-watchers' of what they ate, then received sucrose or artificially sweetened drinks (4 × 250 ml per d). Expectancies were varied by labelling drinks 'sugar' or 'diet' in a counter-balanced design. Sucrose supplements provided 1800 kJ per d and sweetener supplements provided 67 kJ per d. Food intake was measured with a 7 d diary and mood with ten single Likert scales. By 4 weeks, sucrose supplements significantly reduced total carbohydrate intake (F(1,129) = 53.81; P<0.001), fat (F(2,250) = 33.33; P<0.001) and protein intake (F(2,250) = 28.04; P<0.001) compared with sweetener supplements. Mean daily energy intake increased by just under 1000 kJ compared with baseline (t(67 df) = 3.82; P < 0.001) and was associated with a non-significant trend for those receiving sucrose to gain weight. There were no effects on appetite or mood. Neither dietary restraint status as measured by the Dutch Eating Behaviour Questionnaire nor the expectancy procedure had effects. Expectancies influenced mood only during the baseline week. It is concluded that sucrose satiates, rather than stimulates, appetite or negative mood in normal-weight subjects. BACKGROUND The consumption of beverages that contain sugar is associated with overweight, possibly because liquid sugars do not lead to a sense of satiety, so the consumption of other foods is not reduced. However, data are lacking to show that the replacement of sugar-containing beverages with noncaloric beverages diminishes weight gain.
METHODS We conducted an 18-month trial involving 641 primarily normal-weight children from 4 years 10 months to 11 years 11 months of age. Participants were randomly assigned to receive 250 ml (8 oz) per day of a sugar-free, artificially sweetened beverage (sugar-free group) or a similar sugar-containing beverage that provided 104 kcal (sugar group). Beverages were distributed through schools. At 18 months, 26% of the children had stopped consuming the beverages; the data from children who did not complete the study were imputed. RESULTS The z score for the body-mass index (BMI, the weight in kilograms divided by the square of the height in meters) increased on average by 0.02 SD units in the sugar-free group and by 0.15 SD units in the sugar group; the 95% confidence interval (CI) of the difference was -0.21 to -0.05. Weight increased by 6.35 kg in the sugar-free group as compared with 7.37 kg in the sugar group (95% CI for the difference, -1.54 to -0.48). The skinfold-thickness measurements, waist-to-height ratio, and fat mass also increased significantly less in the sugar-free group. Adverse events were minor. When we combined measurements at 18 months in 136 children who had discontinued the study with those in 477 children who completed the study, the BMI z score increased by 0.06 SD units in the sugar-free group and by 0.12 SD units in the sugar group (P=0.06). CONCLUSIONS Masked replacement of sugar-containing beverages with noncaloric beverages reduced weight gain and fat accumulation in normal-weight children. (Funded by the Netherlands Organization for Health Research and Development and others; DRINK ClinicalTrials.gov number, NCT00893529.) BACKGROUND Increased intake of sugar-sweetened beverages and fruit juice has been associated with overweight in children.
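The DRINK abstract above defines BMI as weight in kilograms divided by the square of height in meters and reports outcomes as BMI z scores (SD units relative to a reference population). The arithmetic can be sketched as follows; note that the reference mean and SD used in the example are illustrative placeholders, not the age- and sex-specific reference data the study used.

```python
# Minimal sketch of the BMI definition quoted in the abstract above.
# The z-score assumes an external reference mean/SD for the child's age
# and sex; the values below are illustrative, not study data.

def bmi(weight_kg: float, height_m: float) -> float:
    """BMI = weight in kg divided by the square of height in m."""
    return weight_kg / height_m ** 2

def bmi_z_score(bmi_value: float, ref_mean: float, ref_sd: float) -> float:
    """SD units above/below the reference-population mean BMI."""
    return (bmi_value - ref_mean) / ref_sd

b = bmi(30.0, 1.25)            # a 30 kg child who is 1.25 m tall
z = bmi_z_score(b, 16.0, 2.0)  # hypothetical reference: mean 16, SD 2
print(round(b, 2), round(z, 2))  # prints: 19.2 1.6
```

A between-group difference of 0.13 SD units, as reported in the trial, is therefore a difference on this z scale, not in raw BMI units.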
OBJECTIVE This study prospectively assessed beverage consumption patterns and their relationship with weight status in a cohort of children born at different risk for obesity. METHODS AND PROCEDURES Participants were children born at low risk (n = 27) or high risk (n = 22) for obesity based on maternal prepregnancy BMI (kg/m²). Daily beverage consumption was generated from 3-day food records from children aged 3-6 years and coded into seven beverage categories (milk, fruit juice, fruit drinks, caloric and non-caloric soda, soft drinks including and excluding fruit juice). Child anthropometric measures were assessed yearly. RESULTS High-risk children consumed a greater percentage of daily calories from beverages at age 3, more fruit juice at ages 3 and 4, more soft drinks (including fruit juice) at ages 3-5, and more soda at age 6 compared to low-risk children. Longitudinal analyses showed that a greater 3-year increase in soda intake was associated with an increased change in waist circumference, whereas a greater increase in milk intake was associated with a reduced change in waist circumference. There was no significant association between change in intake from any of the beverage categories and change in BMI z-score across analyses. DISCUSSION Children's familial predisposition to obesity may differentially affect their beverage consumption patterns. Future research should examine the extent to which dietary factors may play a role in pediatric body fat deposition over time. BACKGROUND During the nutrition transition in Chile, dietary changes were marked by increased consumption of high-energy, nutrient-poor products, including sugar-sweetened beverages (SSBs). Obesity is now the primary nutritional problem in posttransitional Chile. OBJECTIVE We conducted a randomized controlled trial to examine the effects on body composition of delivering milk beverages to the homes of overweight and obese children to displace SSBs.
DESIGN We randomly assigned 98 children aged 8-10 y who regularly consumed SSBs to intervention and control groups. During a 16-wk intervention, children were instructed to drink 3 servings/d (approximately 200 g per serving) of the milk delivered to their homes and to not consume SSBs. Body composition was measured by dual-energy X-ray absorptiometry. Data were analyzed by multiple regression analysis according to the intention-to-treat principle. RESULTS For the intervention group, milk consumption increased by a mean (± SEM) of 452.5 ± 37.7 g/d (P < 0.0001), and consumption of SSBs decreased by -711.0 ± 33.7 g/d (P < 0.0001). For the control group, milk consumption did not change, and consumption of SSBs increased by 71.9 ± 33.6 g/d (P = 0.04). Changes in percentage body fat, the primary endpoint, did not differ between groups. Nevertheless, the mean (± SE) accretion of lean body mass was greater (P = 0.04) in the intervention (0.92 ± 0.10 kg) than in the control (0.62 ± 0.11 kg) group. The increase in height was also greater (P = 0.01) in the intervention group (2.50 ± 0.21 cm) than in the control group (1.77 ± 0.20 cm) for boys but not for girls. CONCLUSION Replacing habitual consumption of SSBs with milk may have beneficial effects on lean body mass and growth in children, despite no changes in percentage body fat. This trial was registered at clinicaltrials.gov as NCT00149695. To examine whether artificial sweeteners aid in the control of long-term food intake and body weight, we gave free-living, normal-weight subjects 1150 g soda sweetened with aspartame (APM) or high-fructose corn syrup (HFCS) per day. Relative to when no soda was given, drinking APM-sweetened soda for 3 wk significantly reduced calorie intake of both females (n = 9) and males (n = 21) and decreased the body weight of males but not of females.
However, drinking HFCS-sweetened soda for 3 wk significantly increased the calorie intake and body weight of both sexes. Ingesting either type of soda reduced intake of sugar from the diet without affecting intake of other nutrients. Drinking large volumes of APM-sweetened soda, in contrast to drinking HFCS-sweetened soda, reduces sugar intake and thus may facilitate the control of calorie intake and body weight. BACKGROUND High consumption of sugar-sweetened drinks has been associated with weight gain and obesity in the United States. This trend may also be affecting populations with different eating patterns who increasingly are adopting typical US dietary patterns. OBJECTIVE We assessed whether the consumption of sweetened drinks and other food items increased the likelihood of weight gain in a Mediterranean population. DESIGN This was a prospective cohort analysis of 7194 men and women with a mean age of 41 y who were followed up for a median of 28.5 mo with mailed questionnaires. Dietary exposure was assessed with a previously validated semiquantitative food-frequency questionnaire. RESULTS During follow-up, we observed that 49.5% of the participants increased their weight (mean weight gain: 0.64 kg; 95% CI: 0.55, 0.73 kg). In the participants who had gained ≥3 kg in the 5 y before baseline, the adjusted odds ratio of subsequent weight gain for the fifth quintile compared with the first quintile of sugar-sweetened soft drink consumption was 1.6 (95% CI: 1.2, 2.1; P for trend = 0.02). This association was absent in the participants who had not gained weight in the 5-y period before baseline. The consumption of hamburgers, pizza, and sausages (as a proxy for fast-food consumption) was also independently associated with weight gain (adjusted odds ratio for the fifth compared with the first quintile = 1.2; 95% CI: 1.0, 1.4; P for trend = 0.05).
We also found a significant, but weaker, association between weight gain and both red meat and sweetened fruit juice consumption. CONCLUSION In a Mediterranean cohort, particularly in the participants who had already gained weight, an increased consumption of sugar-sweetened soft drinks and of hamburgers, pizza, and sausages was associated with a higher risk of additional subsequent weight gain. BACKGROUND Consumption of liquid calories from beverages has increased in parallel with the obesity epidemic in the US population, but their causal relation remains unclear. OBJECTIVE The objective of this study was to examine how changes in beverage consumption affect weight change among adults. DESIGN This was a prospective study of 810 adults participating in the PREMIER trial, an 18-mo randomized, controlled, behavioral intervention trial. Measurements (weight, height, and 24-h dietary recall) were made at baseline, 6 mo, and 18 mo. RESULTS Baseline mean intake of liquid calories was 356 kcal/d (19% of total energy intake). After potential confounders and intervention assignment were controlled for, a reduction in liquid calorie intake of 100 kcal/d was associated with a weight loss of 0.25 kg (95% CI: 0.11, 0.39; P < 0.001) at 6 mo and of 0.24 kg (95% CI: 0.06, 0.41; P = 0.008) at 18 mo. A reduction in liquid calorie intake had a stronger effect than did a reduction in solid calorie intake on weight loss. Of the individual beverages, only intake of sugar-sweetened beverages (SSBs) was significantly associated with weight change. A reduction in SSB intake of 1 serving/d was associated with a weight loss of 0.49 kg (95% CI: 0.11, 0.82; P = 0.006) at 6 mo and of 0.65 kg (95% CI: 0.22, 1.09; P = 0.003) at 18 mo. CONCLUSIONS These data support recommendations to limit liquid calorie intake among adults and to reduce SSB consumption as a means to accomplish weight loss or avoid excess weight gain.
This trial was registered at clinicaltrials.gov as NCT00000616. BACKGROUND Type 2 diabetes mellitus is an increasingly serious health problem among African American women. Consumption of sugar-sweetened drinks was associated with an increased risk of diabetes in 2 studies but not in a third; however, to our knowledge, no data are available on African Americans regarding this issue. Our objective was to examine the association between consumption of sugar-sweetened beverages, weight gain, and incidence of type 2 diabetes mellitus in African American women. METHODS A prospective follow-up study of 59,000 African American women has been in progress since 1995. Participants reported on food and beverage consumption in 1995 and 2001. Biennial follow-up questionnaires ascertained new diagnoses of type 2 diabetes. The present analyses included 43,960 women who gave complete dietary and weight information and were free from diabetes at baseline. We identified 2713 incident cases of type 2 diabetes mellitus during 338,884 person-years of follow-up. The main outcome measure was the incidence of type 2 diabetes mellitus. RESULTS The incidence of type 2 diabetes mellitus was higher with higher intake of both sugar-sweetened soft drinks and fruit drinks. After adjustment for confounding variables including other dietary factors, the incidence rate ratio for 2 or more soft drinks per day was 1.24 (95% confidence interval, 1.06-1.45). For fruit drinks, the comparable incidence rate ratio was 1.31 (95% confidence interval, 1.13-1.52). The association of diabetes with soft drink consumption was almost entirely mediated by body mass index, whereas the association with fruit drink consumption was independent of body mass index. CONCLUSIONS Regular consumption of sugar-sweetened soft drinks and fruit drinks is associated with an increased risk of type 2 diabetes mellitus in African American women.
While there has been increasing public awareness of the adverse health effects of soft drinks, little attention has been given to fruit drinks, which are often marketed as a healthier alternative to soft drinks.

OBJECTIVE: To examine the long-term relationship between changes in water and beverage intake and weight change. SUBJECTS: Prospective cohort studies of 50,013 women aged 40-64 years in the Nurses' Health Study (NHS, 1986-2006), 52,987 women aged 27-44 years in the NHS II (1991-2007) and 21,988 men aged 40-64 years in the Health Professionals Follow-up Study (1986-2006) without obesity and chronic diseases at baseline. MEASURES: We assessed the association of weight change within each 4-year interval with changes in beverage intakes and other lifestyle behaviors during the same period. Multivariate linear regression with robust variance, accounting for within-person repeated measures, was used to evaluate the association. Results across the three cohorts were pooled by an inverse-variance-weighted meta-analysis. RESULTS: Participants gained an average of 1.45 kg (5th to 95th percentile: -1.87 to 5.46) within each 4-year period. After controlling for age, baseline body mass index and changes in other lifestyle behaviors (diet, smoking habits, exercise, alcohol, sleep duration, TV watching), each 1 cup per day increment of water intake was inversely associated with weight gain within each 4-year period (-0.13 kg; 95% confidence interval (CI): -0.17 to -0.08). The associations for other beverages were: sugar-sweetened beverages (SSBs) (0.36 kg; 95% CI: 0.24-0.48), fruit juice (0.22 kg; 95% CI: 0.15-0.28), coffee (-0.14 kg; 95% CI: -0.19 to -0.09), tea (-0.03 kg; 95% CI: -0.05 to -0.01), diet beverages (-0.10 kg; 95% CI: -0.14 to -0.06), low-fat milk (0.02 kg; 95% CI: -0.04 to 0.09) and whole milk (0.02 kg; 95% CI: -0.06 to 0.10).
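The pooling step named above, combining results across the three cohorts by inverse-variance-weighted meta-analysis, reduces to a weighted average in which each cohort's coefficient is weighted by the reciprocal of its squared standard error. A minimal sketch, using hypothetical per-cohort coefficients rather than the study's actual cohort-level values:

```python
import math

# Fixed-effect inverse-variance pooling: each cohort contributes a
# regression coefficient b_i (e.g. kg of 4-year weight change per
# serving/day) with standard error se_i; its weight is w_i = 1 / se_i^2.
def pool_fixed_effect(estimates):
    """estimates: list of (coefficient, standard_error) pairs."""
    weights = [1.0 / se ** 2 for _, se in estimates]
    pooled = sum(w * b for w, (b, _) in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci95 = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    return pooled, pooled_se, ci95

# Hypothetical coefficients for three cohorts (illustrative values only):
pooled, se, ci = pool_fixed_effect([(0.40, 0.08), (0.35, 0.10), (0.30, 0.12)])
print(round(pooled, 3), round(se, 3))  # prints: 0.363 0.055
```

Note how the pooled estimate sits closest to the most precise cohort (smallest standard error), which is the point of inverse-variance weighting.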
We estimated that replacement of 1 serving per day of SSBs by 1 cup per day of water was associated with 0.49 kg (95% CI: 0.32-0.65) less weight gain over each 4-year period, and the replacement estimate of fruit juices by water was 0.35 kg (95% CI: 0.23-0.46). Substitution of SSBs or fruit juices by other beverages (coffee, tea, diet beverages, low-fat and whole milk) was also significantly and inversely associated with weight gain. CONCLUSION: Our results suggest that increasing water intake in place of SSBs or fruit juices is associated with lower long-term weight gain.

BACKGROUND Standard behavioral obesity treatment produces poor long-term results. Focusing on healthy eating behaviors rather than energy intake may be an alternative strategy. In addition, important behaviors might differ for short- vs long-term weight control. OBJECTIVE Our aim was to describe and compare associations between changes in eating behaviors and weight after 6 and 48 months. DESIGN We performed secondary analysis of data collected during a randomized weight-loss intervention trial with 48-month follow-up. PARTICIPANTS We studied 481 overweight and obese postmenopausal women enrolled in the Women on the Move through Activity and Nutrition (WOMAN) Study. MAIN OUTCOME MEASURES We measured changes in weight from baseline to 6 and 48 months. STATISTICAL ANALYSES PERFORMED Linear regression models were used to examine the associations between 6- and 48-month changes in eating habits assessed by the Conner Diet Habit Survey and changes in weight. Analyses were conducted in the combined study population and stratified by randomization group. RESULTS At 6 months in the combined population, weight loss was independently associated with decreased desserts (P<0.001), restaurant eating (P=0.042), sugar-sweetened beverages (P=0.009), and fried foods (P<0.001), and increased fish consumption (P=0.003).
Results were similar in intervention participants; only reduced desserts and fried foods were associated with weight loss in controls. At 48 months in the combined population, weight loss was again associated with decreased desserts (P=0.003) and sugar-sweetened beverages (P=0.011), but also decreased meats/cheeses (P=0.024) and increased fruits/vegetables (P<0.001). Decreased meats/cheeses predicted weight loss in intervention participants; desserts, sugar-sweetened beverages, and fruits/vegetables were independently associated in controls. CONCLUSIONS Changes in eating behaviors were associated with weight change, although important behaviors differed for short- and long-term weight change and by randomization group. Future studies should determine whether interventions targeting these behaviors could improve long-term obesity treatment outcomes.

BACKGROUND The consumption of sucrose-sweetened soft drinks (SSSDs) has been associated with obesity, the metabolic syndrome, and cardiovascular disorders in observational and short-term intervention studies. Too few long-term intervention studies in humans have examined the effects of soft drinks. OBJECTIVE We compared the effects of SSSDs with those of isocaloric milk and a noncaloric soft drink on changes in total fat mass and ectopic fat deposition (in liver and muscle tissue). DESIGN Overweight subjects (n = 47) were randomly assigned to 4 different test drinks (1 L/d for 6 mo): SSSD (regular cola), isocaloric semiskim milk, aspartame-sweetened diet cola, and water. The amount of intrahepatic fat and intramyocellular fat was measured with (1)H-magnetic resonance spectroscopy. Other endpoints were fat mass, fat distribution (dual-energy X-ray absorptiometry and magnetic resonance imaging), and metabolic risk factors.
RESULTS The relative changes between baseline and the end of the 6-mo intervention were significantly higher in the regular cola group than in the 3 other groups for liver fat (132-143%, sex-adjusted mean; P < 0.01), skeletal muscle fat (117-221%; P < 0.05), visceral fat (24-31%; P < 0.05), blood triglycerides (32%; P < 0.01), and total cholesterol (11%; P < 0.01). Total fat mass was not significantly different between the 4 beverage groups. Milk and diet cola reduced systolic blood pressure by 10-15% compared with regular cola (P < 0.05). Otherwise, diet cola had effects similar to those of water. CONCLUSION Daily intake of SSSDs for 6 mo increases ectopic fat accumulation and lipids compared with milk, diet cola, and water. Thus, daily intake of SSSDs is likely to enhance the risk of cardiovascular and metabolic diseases. This trial is registered at clinicaltrials.gov as NCT00777647.

Soft drinks and other sweetened beverages may contribute to risk of type 2 diabetes and obesity. However, research has not addressed higher-risk and Asian populations. The authors examined the association between soft drinks and juice and the risk of type 2 diabetes among Chinese Singaporeans enrolled in a prospective cohort study of 43,580 participants aged 45-74 years and free of diabetes and other chronic diseases at baseline. The incidence of physician-diagnosed type 2 diabetes was assessed by interview and validated; 2,273 participants developed diabetes during follow-up. After adjustment for potential lifestyle and dietary confounders, participants consuming ≥2 soft drinks per week had a relative risk of type 2 diabetes of 1.42 (95% confidence interval (CI): 1.25, 1.62) compared with those who rarely consumed soft drinks. Similarly, consumption of ≥2 juice beverages per week was associated with an increased risk (relative risk (RR) = 1.29, 95% CI: 1.05, 1.58).
The association was modified by 5-year weight gain: for ≥2 soft drinks per week, the relative risk among those who gained ≥3 kg (RR = 1.70, 95% CI: 1.34, 2.16) was higher than among those who gained less weight (RR = 1.20, 95% CI: 1.03, 1.41). Relatively frequent intake of soft drinks and juice is associated with an increased risk for development of type 2 diabetes in Chinese men and women.

Previous studies have yielded inconsistent results when documenting the association between key dietary factors and adolescent weight change over time. The purpose of this study was to examine the extent to which changes in adolescent sugar-sweetened beverage (SSB), diet soda, breakfast, and fast-food consumption were associated with changes in BMI and percent body fat (PBF). This study analyzed data from a sample of 693 Minnesota adolescents followed over 2 years. Random coefficient models were used to examine the relationship between dietary intake and BMI and PBF and to separate cross-sectional and longitudinal associations. Adjusting for total physical activity, total energy intake, puberty, race, socioeconomic status, and age, cross-sectional findings indicated that for both males and females, breakfast consumption was significantly and inversely associated with BMI and PBF, and diet soda intake was significantly and positively associated with BMI and PBF among females. In longitudinal analyses, however, there were fewer significant associations. Among males there was evidence of a significant longitudinal association between SSB consumption and PBF; after adjustment for energy intake, an increase of one serving of SSB per day was associated with an increase of 0.7 units of PBF among males. This study adds to previous research through its methodological strengths, including adjustment for physical activity and energy intake assessed using state-of-the-art methods (i.e.
, accelerometers and 24-h dietary recalls), as well as its evaluation of both BMI and PBF. Additional research is needed to better understand the complex constellation of factors that contribute to adolescent weight gain over time.

OBJECTIVE To study changes in lifestyle in relation to changes in body weight and waist circumference associated with occupational retirement in men. DESIGN A prospective cohort study with 5 years of follow-up. At baseline and at follow-up, questionnaires were completed and body weight and waist circumference were measured. SETTING The Doetinchem Cohort Study, consisting of inhabitants of Doetinchem, a town in a rural area of The Netherlands. SUBJECTS In total, 288 healthy men aged 50-65 years at baseline, who either remained employed or retired over follow-up. RESULTS The effect of retirement on changes in weight and waist circumference was dependent on type of former occupation. Increase in body weight and waist circumference was higher among men who retired from active jobs (0.42 kg/year and 0.77 cm/year, respectively) than among men who retired from sedentary jobs (0.08 kg/year and 0.23 cm/year, respectively). Weight gain and increase in waist circumference were associated with a decrease in fruit consumption and fibre density of the diet, with an increase in frequency of eating breakfast, and with a decrease in several physical activities, such as household activities, bicycling, walking and doing odd jobs. CONCLUSION Retirement was associated with an increase in weight and waist circumference among those with former active jobs, but not among those with former sedentary jobs.
Retirement may bring opportunities for healthy changes in diet and physical activity, which could be used in health promotion programmes.

In the present study the relationship between the consumption of different beverage groups and body-weight status in 5 years of study participation in German adolescents was investigated. We used anthropometric and dietary data from 3-d weighed records of 244 subjects between 9 and 18 years of age participating in the Dortmund Nutritional and Anthropometric Longitudinally Designed (DONALD) study. Only subjects with at least four out of six possible weighed dietary records were considered. A repeated-measures regression model (PROC MIXED) was used to analyse the effect of beverage consumption on body-weight status. BMI standard deviation scores (BMI-SDS) and body fat percentage (%BF) were chosen as the dependent variables. In boys, energetic beverage consumption was not associated with BMI-SDS or %BF, neither cross-sectionally nor prospectively. In girls, baseline consumption of energetic beverages did not predict baseline BMI-SDS, baseline %BF, or change in either variable over the study period. However, an increase in energetic beverage consumption over the study period was associated with an increase in BMI-SDS (+0.070 SDS/MJ increase in energetic beverage consumption; P = 0.01). Separate consideration of regular soft drinks and fruit juices revealed that, in girls, BMI-SDS increased with increased fruit juice consumption (+0.096 SDS/MJ increase in fruit juice consumption; P = 0.01), and to a lesser extent with regular soft drink consumption (+0.055 SDS/MJ increase in regular soft drink consumption; P = 0.08). In conclusion, these results suggest that an increase in energetic beverage consumption may result in weight gain, at least in adolescent girls.

Objective.
To prospectively identify behavioral risk factors for childhood overweight and to assess their relevance in high-risk subgroups (children of mothers with overweight or low education). Methods. In the PIAMA birth cohort (n = 3963), questionnaire data were obtained at ages 5 and 7 on "screen time", walking or cycling to school, playing outside, sports club membership, fast food consumption, snack consumption and soft drink consumption. Weight and height were measured at age 8 years. Results. Screen time, but none of the other hypothesized behavioral factors, was associated with overweight (aOR 1.4 (CI: 1.2-1.6)). The adjusted population attributable risk fraction for screen time > 1 hr/day was 10% in the high-risk and 17% in the low-risk subgroups. Conclusion. Reduction of screen time to < 1 hr/day could result in a reduction of overweight prevalence in the order of 2 percentage points in both high- and low-risk subgroups.

BACKGROUND The rising prevalence of obesity in children has been linked in part to the consumption of sugar-sweetened drinks. Our aim was to examine this relation. METHODS We enrolled 548 ethnically diverse schoolchildren (age 11.7 years, SD 0.8) from public schools in four Massachusetts communities, and studied them prospectively for 19 months from October, 1995, to May, 1997. We examined the association between baseline and change in consumption of sugar-sweetened drinks (the independent variables), and difference in measures of obesity, with linear and logistic regression analyses adjusted for potentially confounding variables and clustering of results within schools. FINDINGS For each additional serving of sugar-sweetened drink consumed, both body mass index (BMI) (mean 0.24 kg/m2; 95% CI 0.10-0.39; p=0.03) and frequency of obesity (odds ratio 1.60; 95% CI 1.14-2.24; p=0.02) increased after adjustment for anthropometric, demographic, dietary, and lifestyle variables.
Baseline consumption of sugar-sweetened drinks was also independently associated with change in BMI (mean 0.18 kg/m2 for each daily serving; 95% CI 0.09-0.27; p=0.02). INTERPRETATION Consumption of sugar-sweetened drinks is associated with obesity in children.

OBJECTIVE To determine whether an educational programme aimed at discouraging students from drinking sugar-sweetened beverages could prevent excessive weight gain. DESIGN Forty-seven classes in twenty-two schools were randomised as intervention or control. SUBJECTS Participants were 1140 9- to 12-year-old fourth graders (435 in the intervention group and 608 in the control group). Sugar-sweetened beverages and juice intake were measured through one 24 h recall at baseline and another at the end of the trial. The main outcome was the change in BMI (BMI = weight (kg)/height (m2)), measured at the beginning and at the end of the school year. Intention-to-treat analysis was performed taking into account the cluster (classes) effect. RESULTS A statistically significant decrease in the daily consumption of carbonated drinks in the intervention group compared with control (mean difference = -56 ml; 95% CI -119, -7 ml) was followed by a non-significant overall reduction in BMI, P = 0.33. However, among those students overweight at baseline, the intervention group showed greater BMI reduction (-0.4 kg/m2 compared with -0.2 kg/m2 in the control group (P = 0.11)), and this difference was statistically significant among girls (P = 0.009). Fruit juice consumption was slightly increased in the intervention group (P = 0.08), but not among girls. CONCLUSION Decreasing sugar-sweetened beverage intake significantly reduced BMI among overweight children, and mainly among girls.
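Several of the outcomes above are expressed in BMI units, defined in the trial as weight (kg)/height (m2). A minimal worked illustration of the computation, using a hypothetical child's measurements (the 0.24 kg/m2 per-serving difference is the figure reported earlier; the height and weight are made up):

```python
# BMI as defined in the trials: weight in kilograms divided by the
# square of height in meters. Input values below are illustrative only.
def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2

# A hypothetical 1.40 m, 45 kg schoolchild:
print(round(bmi(45, 1.40), 1))  # prints: 23.0

# At this height, the reported 0.24 kg/m2 per-serving BMI difference
# corresponds to roughly 0.24 * 1.40**2 ≈ 0.47 kg of body weight.
print(round(0.24 * 1.40 ** 2, 2))  # prints: 0.47
```

This kind of back-of-the-envelope conversion is why small BMI differences in children can still represent meaningful weight differences.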
Efforts to reduce energy intake through liquids need to emphasise overall sweetened beverages and the addition of sugar to juices.

OBJECTIVE We examined whether drinking water per se is associated with drinking less of other beverages and whether changes in BMI are associated with the intake of water and other beverages. DESIGN Secondary analysis of a randomized trial of fourth graders followed over 1 year. SETTING Public schools in the metropolitan area of Rio de Janeiro, Brazil. SUBJECTS Participants were 1134 students aged 10-11 years. RESULTS At baseline, a higher frequency of water consumption was associated with a greater daily intake of fruit juice (P = 0.02) and a higher daily frequency of milk (P = 0.005). In the intervention group, the baseline frequency of water consumption was negatively associated with weight change over 1 year but without statistical significance (coefficient = -0.08 kg/m2; 95% CI -0.37, 0.24 kg/m2), whereas fruit juice intake frequency was positively associated with weight change: each increase in fruit juice intake of 1 glass/d was associated with a BMI increase of 0.16 (95% CI 0.02, 0.30) kg/m2. CONCLUSIONS Our findings do not support a protective effect of water consumption on BMI, but confirm consumption of juice drinks as a risk factor for BMI gain. Students who reported high water consumption also reported high intake of other beverages; therefore, the promotion of water consumption per se would not prevent excessive weight gain.

Objective To assess the long-term effects of an obesity prevention programme in schools. Design Longitudinal results after a cluster randomised controlled trial. Setting Schools in southwest England. Participants Of the original sample of 644 children aged 7-11, 511 children were tracked and measurements were obtained from 434 children three years after baseline.
Intervention The intervention was conducted over one school year, with four sessions of focused education promoting a healthy diet and discouraging the consumption of carbonated drinks. Main outcome measures Anthropometric measures of height, weight, and waist circumference. Body mass index (BMI) converted to z scores (SD scores) and to centile values with growth reference curves. Waist circumference was also converted to z scores (SD scores). Results At three years after baseline the age- and sex-specific BMI z scores (SD scores) had increased in the control group by 0.10 (SD 0.53) but decreased in the intervention group by 0.01 (SD 0.58), with a mean difference of 0.10 (95% confidence interval -0.00 to 0.21, P=0.06). The prevalence of overweight increased in both the intervention and control groups at three years and the significant difference between the groups seen at 12 months was no longer evident. The BMI increased in the control group by 2.14 (SD 1.64) and in the intervention group by 1.88 (SD 1.71), with a mean difference of 0.26 (-0.07 to 0.58, P=0.12). The waist circumference increased in both groups after three years with a mean difference of 0.09 (-0.06 to 0.26, P=0.25). Conclusions These longitudinal results show that after a simple year-long intervention the difference in prevalence of overweight in children seen at 12 months was not sustained at three years.

BACKGROUND Consumption of sugar-sweetened beverages may cause excessive weight gain. We aimed to assess the effect on weight gain of an intervention that included the provision of noncaloric beverages at home for overweight and obese adolescents. METHODS We randomly assigned 224 overweight and obese adolescents who regularly consumed sugar-sweetened beverages to experimental and control groups.
The experimental group received a 1-year intervention designed to decrease consumption of sugar-sweetened beverages, with follow-up for an additional year without intervention. We hypothesized that the experimental group would gain weight at a slower rate than the control group. RESULTS Retention rates were 97% at 1 year and 93% at 2 years. Reported consumption of sugar-sweetened beverages was similar at baseline in the experimental and control groups (1.7 servings per day), declined to nearly 0 in the experimental group at 1 year, and remained lower in the experimental group than in the control group at 2 years. The primary outcome, the change in mean body-mass index (BMI, the weight in kilograms divided by the square of the height in meters) at 2 years, did not differ significantly between the two groups (change in experimental group minus change in control group, -0.3; P=0.46). At 1 year, however, there were significant between-group differences for changes in BMI (-0.57, P=0.045) and weight (-1.9 kg, P=0.04). We found evidence of effect modification according to ethnic group at 1 year (P=0.04) and 2 years (P=0.01). In a prespecified analysis according to ethnic group, among Hispanic participants (27 in the experimental group and 19 in the control group), there was a significant between-group difference in the change in BMI at 1 year (-1.79, P=0.007) and 2 years (-2.35, P=0.01), but not among non-Hispanic participants (P>0.35 at years 1 and 2). The change in body fat as a percentage of total weight did not differ significantly between groups at 2 years (-0.5%, P=0.40). There were no adverse events related to study participation.
CONCLUSIONS Among overweight and obese adolescents, the increase in BMI was smaller in the experimental group than in the control group after a 1-year intervention designed to reduce consumption of sugar-sweetened beverages, but not at the 2-year follow-up (the prespecified primary outcome). (Funded by the National Institute of Diabetes and Digestive and Kidney Diseases and others; ClinicalTrials.gov number, NCT00381160.)

OBJECTIVE. The role of sugar-sweetened beverages (SSBs) in promoting obesity is controversial. Observational data link SSB consumption with excessive weight gain; however, randomized, controlled trials are lacking and necessary to resolve the debate. We conducted a pilot study to examine the effect of decreasing SSB consumption on body weight. METHODS. We randomly assigned 103 adolescents aged 13 to 18 years who regularly consumed SSBs to intervention and control groups. The intervention, 25 weeks in duration, relied largely on home deliveries of noncaloric beverages to displace SSBs and thereby decrease consumption. Change in SSB consumption was the main process measure, and change in body mass index (BMI) was the primary end point. RESULTS. All of the randomly assigned subjects completed the study. Consumption of SSBs decreased by 82% in the intervention group and did not change in the control group. Change in BMI, adjusted for gender and age, was 0.07 ± 0.14 kg/m2 (mean ± SE) for the intervention group and 0.21 ± 0.15 kg/m2 for the control group. The net difference, -0.14 ± 0.21 kg/m2, was not significant overall. However, baseline BMI was a significant effect modifier. Among the subjects in the upper baseline-BMI tertile, BMI change differed significantly between the intervention (-0.63 ± 0.23 kg/m2) and control (+0.12 ± 0.26 kg/m2) groups, a net effect of -0.75 ± 0.34 kg/m2. The interaction between weight change and baseline BMI was not attributable to baseline consumption of SSBs.
CONCLUSIONS. A simple environmental intervention almost completely eliminated SSB consumption in a diverse group of adolescents. The beneficial effect on body weight of reducing SSB consumption increased with increasing baseline body weight, offering additional support for American Academy of Pediatrics guidelines to limit SSB consumption.

BACKGROUND Obesity in adolescence has been increasing in the past several decades. Beverage habits among adolescents include increased consumption of sugar-sweetened beverages and decreased consumption of milk. OBJECTIVE This study aimed to examine the association between beverage consumption and 5-y body weight change in 2294 adolescents. DESIGN Project EAT (Eating Among Teens) is a 5-y longitudinal study of eating patterns among adolescents. Surveys were completed in 1998-1999 (time 1) and in 2003-2004 (time 2). Multivariable linear regression was used to examine the association between beverage consumption at time 2 and change in body mass index from time 1 to time 2, with adjustments for age, socioeconomic status, race, cohort, physical activity, sedentary behavior, coffee, tea, time 1 body mass index, and beverage variables. RESULTS In prospective analyses, consumption of beverages was not associated with weight gain, except for consumption of low-calorie soft drinks (positive association, P = 0.002) and white milk (inverse association, P = 0.03), but these associations did not appear to be a monotonic linear dose-response relation. The positive association with low-calorie soft drinks was no longer present after adjustment for dieting and parental weight-related concerns, which suggests that the use of low-calorie soft drinks is a marker for more general dietary behaviors and weight concerns. CONCLUSIONS We showed no association between sugar-sweetened beverage consumption, juice consumption, and adolescent weight gain over a 5-y period.
A direct association between diet beverages and weight gain appeared to be explained by dieting practices. Adolescents who consumed little or no white milk gained significantly more weight than their peers who consumed white milk. Future research that examines beverage habits and weight among adolescents should address portion sizes, adolescent maturation, and dieting behaviors.

OBJECTIVE To examine prospectively the association between beverage consumption (fruit juice, fruit drinks, milk, soda, and diet soda) and changes in weight and body mass index among preschool children. DESIGN A prospective cohort study that collected dietary, anthropometric, and sociodemographic data. Subjects/Setting The study population included 1,345 children age 2 to 5 years participating in the North Dakota Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) on two visits between 6 to 12 months apart. Statistical analyses We performed linear regression analyses to examine whether beverage consumption was associated with annual change in weight and body mass index. Intakes were measured as continuous (oz/day) and we also dichotomized fruit juice, fruit drinks, and milk at high intakes. RESULTS In multivariate regression analyses adjusted for age, sex, energy intake, change in height, and additional sociodemographic variables, weight change was not significantly related to intakes (per ounce) of fruit juice (beta=0.01 lb/year, 95% CI: -0.01 to 0.20, P=.28), fruit drinks (beta=-0.03 lb/year, 95% CI: -0.07 to 0.01, P=.28), milk (beta=0.00 lb/year, 95% CI: -0.02 to 0.02, P=.86), soda (beta=-0.00 lb/year, 95% CI: -0.08 to 0.08, P=.95), or diet soda (beta=0.01 lb/year, 95% CI: -0.11 to 0.13, P=.82). Findings remained null when we examined associations with body mass index and when fruit juice, fruit drinks, and milk were dichotomized at high intake levels in both analyses.
CONCLUSIONS Our study does not show an association between beverage consumption and changes in weight or body mass index in this population of low-income preschool children in North Dakota.

OBJECTIVE The increase in consumption of sugar-added beverages over recent decades may be partly responsible for the obesity epidemic among U.S. adolescents. Our aim was to evaluate the relationship between BMI changes and intakes of sugar-added beverages, milk, fruit juices, and diet soda. RESEARCH METHODS AND PROCEDURES Our prospective cohort study included >10,000 boys and girls participating in the U.S. Growing Up Today Study. The participants were 9 to 14 years old in 1996 and completed questionnaires in 1996, 1997, and 1998. We analyzed change in BMI (kilograms per meter squared) over two 1-year periods among children who completed annual food frequency questionnaires assessing typical past-year intakes. We studied beverage intakes during the year corresponding to each BMI change, and in separate models, we studied 1-year changes in beverage intakes, adjusting for prior-year intakes. Models included all beverages simultaneously; further models adjusted for total energy intake. RESULTS Consumption of sugar-added beverages was associated with small BMI gains during the corresponding year (boys: +0.03 kg/m2 per daily serving, p = 0.04; girls: +0.02 kg/m2, p = 0.096). In models not assuming a linear dose-response trend, girls who drank 1 serving/d of sugar-added beverages gained more weight (+0.068, p = 0.02) than girls drinking none, as did girls drinking 2 servings/d (+0.09, p = 0.06) or 3+ servings/d (+0.08, p = 0.06). Analyses of year-to-year change in beverage intakes provided generally similar findings; boys who increased consumption of sugar-added beverages from the prior year experienced weight gain (+0.04 kg/m2 per additional daily serving, p = 0.01).
Children who increased intakes by 2 or more servings/d from the prior year gained weight (boys: +0.14, p = 0.01; girls: +0.10, p = 0.046). Further adjusting our models for total energy intake substantially reduced the estimated effects, which were no longer significant. DISCUSSION Consumption of sugar-added beverages may contribute to weight gain among adolescents, probably due to their contribution to total energy intake, because adjustment for calories greatly attenuated the estimated associations.

PURPOSE To determine whether a significant relationship exists between fat mass (FM) development and physical activity (PA) and/or sugar-sweetened drink (SD) consumption in healthy boys and girls aged 8-19 yr. METHODS A total of 105 males and 103 females were assessed during childhood and adolescence for a maximum of 7 yr and a median of 5 yr. Height was measured biannually. Fat-free mass (FFM) and FM were assessed annually by dual X-ray absorptiometry (DXA). PA was evaluated two to three times annually using the PAQ-C/A. Energy intake and SD were assessed using a 24-h dietary intake questionnaire also completed two to three times per year. Years from peak height velocity were used as a biological maturity age indicator. Multilevel random effects models were used to test the relationship. RESULTS When controlling for maturation, FFM, and energy intake adjusted for SD, PA level was negatively related to FM development in males (P<0.05) but not in females (P>0.05). In contrast, there was no relationship between SD and FM development in males or females (P>0.05). There was also no interaction effect between SD and PA (P>0.05) with FM development. CONCLUSION This finding lends support to the idea that increasing PA in male youths aids in the control of FM development.
Models employed showed no relationship between SD and FM in either gender The present study examined behavioural predictors of body weight cross-sectionally and longitudinally in a cohort of 1639 male and 1913 female employees in 32 companies participating in a worksite intervention study for smoking cessation and weight control. Dietary intake, current and previous dieting behaviours, and physical activity were examined for their association with body weight over the two-year period. Cross-sectionally in both men and women, history of previous dieting, previous participation in a formal weight loss programme, current dieting and meat consumption were positively related to body weight while high intensity activity was negatively related to body weight. Prospectively, history of participation in a formal weight loss programme and dieting to lose weight at baseline, and increased consumption over time of french fries, dairy products, sweets and meat, independently predicted increases in body weight in women. Women who were dieting to lose weight or who had previously participated in a formal weight loss programme at baseline gained 1.99 lb and 1.74 lb more, respectively, than those who were not dieting to lose weight or who had not previously participated in a formal weight loss programme. Increased exercise, either walking or high intensity activity, predicted decreases in body weight in women (1.76 lb and 1.39 lb, respectively, for each session increase per week). In men, previous participation in a formal weight loss programme predicted increases in body weight over the two-year period. Men who had previously participated in a formal weight loss programme at baseline gained 4.83 lb more than those who had never previously participated in a formal weight loss programme.
Increases in consumption of sweets and egg were prospectively related to increases in body weight, while increased walking and high intensity activity were related to decreases in body weight (0.86 lb and 3.54 lb, respectively, for each session increase per week). These results suggest the role that specific diet and exercise behaviours may play in body weight changes over time The long-term effects of sucrose on appetite and mood remain unclear. Normal weight subjects compensate for sucrose added blind to the diet (Reid et al., 2007). Overweight subjects, however, may differ. In a single-blind, between-subjects design, soft drinks (4x25cl per day; 1800kJ sucrose sweetened versus 67kJ aspartame sweetened) were added to the diet of overweight women (n=53, BMI 25-30, age 20-55) for 4 weeks. A 7-day food diary gave measures of total energy, carbohydrate, protein, fat, and micronutrients. Mood and hunger were measured by ten single Likert scales rated daily at 11.00, 14.00, 16.00, and 20.00. Activity levels were measured by diary and pedometer. Baseline energy intake did not differ between groups. During the first week of the intervention energy intake increased slightly in the sucrose group, but not in the aspartame group, then decreased again, so by the final week intake again did not differ from the aspartame group. Compensation was not large enough to produce significant changes in the composition of the voluntary diet. There were no effects on hunger or mood. It is concluded that overweight women do not respond adversely to sucrose added blind to the diet, but compensate for it by reducing voluntary energy intake. Alternative explanations for the correlation between sugary soft drink intake and weight gain are discussed
2,009
28,403,861
All selected studies demonstrated significant improvement in subjective measures of angina symptoms and/or quality of life; in the majority of studies left ventricular function and myocardial perfusion improved. Conclusions Systematic review of CSWT studies in stable coronary artery disease (CAD) demonstrated consistent improvement of clinical variables. Meta-analysis showed a moderate improvement of exercise capacity. Overall, CSWT is a promising non-invasive option for patients with end-stage CAD, but evidence is limited to small-sample single-center studies.
Aim To systematically review currently available cardiac shock-wave therapy (CSWT) studies in humans and perform meta-analysis regarding the anti-anginal efficacy of CSWT.
Background Cardiac shock wave therapy (CSWT) improves cardiac function in patients with severe coronary artery disease (CAD). We aimed to evaluate the clinical outcomes of a new CSWT treatment regimen. Methods The 55 patients with severe CAD were randomly divided into 3 treatment groups. The control group (n = 14) received only medical therapy. In group A (n = 20), CSWT was performed 3 times within 3 months. In group B (n = 21), patients underwent 3 CSWT sessions/week, and 9 treatment sessions were completed within 1 month. The primary outcome measurement was the 6-minute walk test (6MWT). Other measurements were also evaluated. Results The 6MWT, CCS grading of angina, dosage of nitroglycerin, NYHA classification, and SAQ scores were improved in groups A and B compared to the control group. Conclusions A CSWT protocol with 1 month treatment duration showed similar therapeutic efficacy compared to a protocol of 3 months duration. Clinical trial registry: We have registered on ClinicalTrials.gov; the protocol ID is CSWT IN CHINA OBJECTIVE The aim of this study was to evaluate the effect of cardiac shock wave therapy (CSWT) on microvolt T wave alternans (MTWA) in patients with coronary artery disease (CAD). METHODS 87 patients with old myocardial infarction (OMI) were enrolled in this study. Sixty-two patients were randomized into the CSWT group, 32 patients into the regular treatment group (Group A) and 30 into the expanding scope treatment group (Group B) according to different shock wave procedures, and 25 patients were randomized into the control group (Group C). But the shock wave (SW) energy was only applied to the patients in the CSWT group and not to the patients in the control group. Three months was a treatment course, thus patients received a total of 9 CSWT treatment sessions.
RESULTS Technetium-99m sestamibi myocardial perfusion and fluorine-18 fluorodeoxyglucose myocardial metabolism single-photon emission computed tomography (SPECT) were performed to identify segments of myocardial ischemia, myocardial viability, and microvolt T wave alternans (MTWA) before and after CSWT. After CSWT, the rehospitalization rates of the CSWT group were lower than the control group (P<0.05). The myocardial ischemic segments, metabolism abnormal segments, total radioactive score of perfusion imaging and metabolism imaging, MTWA, and MTWA/HR in the CSWT group were reduced significantly (P<0.05), and the heart rate of maximum MTWA and exercise time were increased significantly (P<0.05). All of the parameters in the control group did not change significantly or even worsened after the treatment (P>0.05). CONCLUSIONS CSWT can reduce the MTWA value, improve the heart chronotropic function and increase the threshold of frequency which causes MTWA Background Ultrasound-guided cardiac shock wave therapy (CSWT) is a noninvasive therapeutic option in the treatment of chronic refractory angina. Clinical trials have shown that CSWT reduces angina symptoms and improves regional systolic function, LV ejection fraction, myocardial perfusion and quality of life parameters. Absolute measurements of myocardial perfusion before and after CSWT have not been performed so far. Methods and results We studied a total of 21 CCS III patients with a history of CAD and multiple interventions who suffered from disabling angina despite individually optimized medical therapy. An N-13 NH3 PET perfusion scan under adenosine was performed before and after CSWT treatment. CSWT was well tolerated in all patients. Absolute perfusion under adenosine of the global left-ventricular myocardium did not change under therapy, nor did minimal coronary resistance.
The treated segments, however, showed in terms of both perfusion and resistance a mild but significant improvement, by 11 and 15%, respectively, whereas no change could be observed in the remote segments. Considering a threshold of increased perfusion of 5%, 10 (77%) out of 13 patients with a better target perfusion improved in their CCS class, whereas 3 (43%) out of 7 patients without improved target perfusion improved in their CCS class too. Conclusion Standard CSWT has the potential to improve myocardial perfusion of the therapy zone and clinical CAD symptomatology without affecting global myocardial perfusion. As a noninvasive and well tolerated therapeutic option, these data suggest the use of CSWT in patients with end-stage CAD Our aim was to evaluate the safety and effectiveness of extracorporeal cardiac shock wave therapy (CSWT) for patients with coronary heart disease (CHD) using a randomized, double-blind, controlled clinical trial design. Twenty-five patients with CHD were enrolled in this study. Fourteen of the patients were randomized into the CSWT group and 11 into the control group. We applied the CSWT procedure to each patient by using nine shock treatments during 3 months, but the shock wave (SW) energy was only applied to the patients in the CSWT group and not to the patients in the control group. Technetium-99m sestamibi myocardial perfusion, fluorine-18 fluorodeoxyglucose myocardial metabolism single-photon emission computed tomography (SPECT), and two-dimensional echocardiography were performed to identify segments of myocardial ischemia, myocardial viability, and ejection fraction before and after CSWT. We also followed the patients to evaluate adverse effects.
After CSWT, the New York Heart Association class, the Canadian Cardiovascular Society angina scale, nitroglycerin dosage, and myocardial perfusion and myocardial metabolic imaging scores of dual-isotope SPECT in the CSWT group were reduced significantly (P = 0.019, 0.027, 0.039, 0.000, 0.001, respectively), and the Seattle Angina Questionnaire scale, 6-min walking test, and left ventricular ejection fraction were increased significantly (P = 0.021, 0.024, 0.016, respectively) compared with those before the SW treatment. All of the parameters in the control group did not change significantly after the treatment (all P > 0.05). No serious adverse effects of CSWT were observed. Cardiac shock wave therapy is a safe and effective treatment for CHD patients Flaws in the design, conduct, analysis, and reporting of randomised trials can cause the effect of an intervention to be underestimated or overestimated. The Cochrane Collaboration's tool for assessing risk of bias aims to make the process clearer and more AIMS An increasing number of patients with severe coronary artery disease (CAD) are not candidates for traditional revascularization and experience angina in spite of excellent medical therapy. Despite limited data regarding the natural history and predictors of adverse outcome, these patients have been considered at high risk for early mortality. METHODS AND RESULTS The OPtions In Myocardial Ischemic Syndrome Therapy (OPTIMIST) program at the Minneapolis Heart Institute offers traditional and investigational therapies for patients with refractory angina. A prospective clinical database includes detailed baseline and yearly follow-up information. Death status and cause were determined using the Social Security Death Index, clinical data, and death certificates. Time to death was analysed using survival analysis methods.
For 1200 patients, the mean age was 63.5 years (77.5% male) with 72.4% having prior coronary artery bypass grafting, 74.4% prior percutaneous coronary intervention, 72.6% prior myocardial infarction, 78.3% 3-vessel CAD, 23.0% moderate-to-severe left-ventricular (LV) dysfunction, and 32.6% congestive heart failure (CHF). Overall, 241 patients died (20.1%: 71.8% cardiovascular) during a median follow-up of 5.1 years (range 0-16, 14.7% over 9). By Kaplan-Meier analysis, mortality was 3.9% (95% CI 2.8-5.0) at 1 year and 28.4% (95% CI 24.9-32.0) at 9 years. Multivariate predictors of all-cause mortality were baseline age, diabetes, angina class, chronic kidney disease, LV dysfunction, and CHF. CONCLUSION Long-term mortality in patients with refractory angina is lower than previously reported. Therapeutic options for this distinct and growing group of patients should focus on angina relief and improved quality of life Background Cardiac shockwave therapy (CSWT) might improve symptoms and decrease ischaemia burden by stimulating collateral growth in chronic ischaemic myocardium. This prospective study was performed to evaluate the feasibility and safety of CSWT. Methods We included 33 patients (mean age 70 ± 7 years, mean left ventricular ejection fraction 55 ± 12%) with end-stage coronary artery disease, chronic angina pectoris and reversible ischaemia on myocardial scintigraphy. CSWT was applied to the ischaemic zones (3-7 spots/session, 100 impulses/spot, 0.09 mJ/mm2) in an echocardiography-guided and ECG-triggered fashion. The protocol included a total of 9 treatment sessions (3 treatment sessions within 1 week at baseline, and after 1 and 2 months). Clinical assessment was performed using exercise testing, angina score (CCS class), nitrate use, myocardial scintigraphy, and cardiac magnetic resonance (CMR) 1 and 4 months after the last treatment session.
Results One and 4 months after CSWT, sublingual nitrate use decreased from 10/week to 2/week (p < 0.01) and the angina symptoms diminished from CCS class III to CCS class II (p < 0.01). This clinical improvement was accompanied by an improved myocardial uptake on stress myocardial scintigraphy (54.2 ± 7.7% to 56.4 ± 9.4%, p = 0.016) and by increased exercise tolerance at 4-month follow-up (from 7.4 ± 2.8 to 8.8 ± 3.6 min, p = 0.015). No clinically relevant side effects were observed. Conclusion CSWT improved symptoms and reduced ischaemia burden in patients with end-stage coronary artery disease without relevant side effects. The study provides a solid basis for a randomised multicentre trial to establish CSWT as a new treatment option in end-stage coronary artery disease Objectives The usefulness and safety of percutaneous myocardial laser therapy in selected patients have been identified in previous 1-year randomized trial reports, including that from a double-blind, sham-controlled trial we independently conducted. We aimed to determine whether the 1-year effects are maintained through a long-term, longitudinal follow-up. Methods Patients (n=77) with chronic, stable, medically refractory angina (class III or IV) not amenable to conventional revascularization and with evidence of reversible ischemia, ejection fraction ≥25%, and myocardial wall thickness ≥8 mm were treated with percutaneous myocardial laser. After the 1-year follow-up and disclosure of all randomized assignments as prespecified in the respective study protocol, patients were followed up longitudinally for a mean of 3 years for angina class, left ventricular ejection fraction, medication usage, and adverse events. Results No procedural mortality, myocardial infarction, or cerebral embolism occurred. Pericardiocentesis was required in two patients (2.6%). Cardiac event-free survival was 88% at 1 year and 66% at late follow-up.
Mean Canadian Cardiovascular Society angina class was significantly improved from baseline (3.2±0.4) at 1 year (2.2±1.1, P<0.001) and at a mean of 3 years (1.9±1.2, P<0.001). Nitrate usage was significantly reduced at late follow-up; however, ejection fraction did not change over time. In a multivariate analysis, angina improvement at 1 year was found to be a significant independent predictor of both survival and angina improvement at late follow-up. Conclusion We conclude that percutaneous myocardial laser therapy in selected patients with severe, medically refractory angina not treatable with conventional revascularization induces significant and sustained symptomatic benefit Objectives. To investigate the effectiveness of extracorporeal shock wave therapy (ESWT) for symptom alleviation in chronic pelvic pain syndrome (CPPS). Materials and Methods. 40 patients with CPPS were randomly allocated into either the treatment or sham group. In the first group, patients were treated by ESWT once a week for 4 weeks by a defined protocol. In the sham group, the same protocol was applied but with the probe being turned off. The follow-up assessments were done at 1, 2, 3, and 12 weeks by Visual Analogue Scale (VAS) for pain and the NIH-developed Chronic Prostatitis Symptom Index (NIH-CPSI). Results. Pain domain scores at follow-up points in both treatment and sham groups were reduced, more so in the treatment group, which was significant at weeks 2, 3, and 12. Urinary scores became significantly different at weeks 3 and 12. Also, quality of life (QOL) and total NIH-CPSI scores at all four follow-up time points reduced more significantly in the treatment group as compared to the sham group. Noticeably, at week 12 a slight deterioration in all variables was observed compared to the first 3 weeks of the treatment period. Conclusions.
Our findings confirmed ESWT therapy as a safe and effective method in CPPS in the short term An optimal treatment for patients with diffuse obstructive arterial disease unsuitable for catheter-based or surgical intervention is still pending. This study tested the hypothesis that extracorporeal shock wave (ECSW) therapy may be a therapeutic alternative in such a clinical situation. Myocardial ischemia was induced in male mini-pigs by applying an ameroid constrictor over the mid-left anterior descending artery (LAD). Twelve mini-pigs were equally randomized into group 1 (constrictor over LAD only) and group 2 (constrictor over LAD plus ECSW [800 impulses at 0.09 mJ/mm2] once 3 months after the procedure). Results showed that the parameters measured by echocardiography did not differ between the two groups on days 0 and 90. However, echocardiography and left ventricular (LV) angiography showed higher LV ejection fraction and lower LV end-systolic dimension and volume in group 2 on day 180 (p<0.035). Besides, mRNA and protein expressions of CXCR4 and SDF-1α were increased in group 2 (p<0.04). Immunofluorescence staining also showed higher numbers of vWF-, CD31-, SDF-1α-, and CXCR4-positive cells in group 2 (all p<0.04). Moreover, immunohistochemical staining showed notably higher vessel density but lower mean fibrosis area, number of CD40-positive cells and apoptotic nuclei in group 2 (all p<0.045). Mitochondrial protein expression of oxidative stress was lower, whereas cytochrome-C was higher in group 2 (all p<0.03). Furthermore, mRNA expressions of MMP-9, Bax and caspase-3 were lower, whereas Bcl-2, eNOS, VEGF and PGC-1α were higher in group 2 (all p<0.01).
In conclusion, ECSW therapy effectively reversed ischemia-elicited LV dysfunction and remodeling through enhancing angiogenesis and attenuating inflammation and oxidative stress OBJECTIVE To assess the safety and efficacy of extracorporeal shockwave myocardial revascularization (ESMR) therapy in treating patients with refractory angina pectoris. PATIENTS AND METHODS A single-arm multicenter prospective trial to assess safety and efficacy of the ESMR therapy in patients with refractory angina (class III/IV angina) was performed. Screening exercise treadmill tests and pharmacological single-photon emission computed tomography (SPECT) were performed for all patients to assess exercise capacity and ischemic burden. Patients were treated with 9 sessions of ESMR to ischemic areas over 9 weeks. Efficacy end points were exercise capacity by using the treadmill test as well as ischemic burden on pharmacological SPECT at 4 months after the last ESMR treatment. Safety measures included electrocardiography, echocardiography, troponin, creatine kinase, and brain natriuretic peptide testing, and pain questionnaires. RESULTS Fifteen patients with medically refractory angina and no revascularization options were enrolled. There was a statistically significant mean increase of 122.3±156.9 seconds (38% increase compared with baseline; P=.01) in exercise treadmill time from baseline (319.8±157.2 seconds) to last follow-up after the ESMR treatment (422.1±183.3 seconds). There was no improvement in the summed stress perfusion scores after pharmacologically induced stress SPECT at 4 months after the last ESMR treatment in comparison to that at screening; however, SPECT summed stress score revealed that untreated areas had greater progression in ischemic burden vs treated areas (3.69±6.2 vs 0.31±4.5; P=.03). There was no significant change in the mean summed echo score from baseline to posttreatment (0.4±5.1; P=.70).
The ESMR therapy was performed safely without any adverse events in electrocardiography, echocardiography, troponins, creatine kinase, or brain natriuretic peptide. Pain during the ESMR treatment was minimal (a score of 0.5±1.2 to 1.1±1.2 out of 10). CONCLUSION In this multicenter feasibility study, ESMR seems to be a safe and efficacious treatment for patients with refractory angina pectoris. However, larger sham-controlled trials will be required to confirm these findings BACKGROUND Cardiac shock wave therapy (CSWT) delivered to the myocardium increases capillary density and regional myocardial blood flow in animal experiments. In addition, nonenzymatic nitric oxide production and the upregulation of vascular growth factor's mRNA by CSWT have been described. The aim of the study was therefore to test its potential to relieve symptoms in patients with chronic stable angina pectoris. METHODS Twenty-one patients (mean age 68.2 ± 8.3 years, 19 males) with chronic refractory angina pectoris and evidence of inducible myocardial ischemia during MIBI-SPECT imaging were randomized into a treatment (n = 11) and a placebo arm (n = 10). The region of exercise-induced ischemia was treated with echocardiographic guidance during nine sessions over a period of 3 months. One session of CSWT consisted of 200 shots/spot (9-12 spots/session) with an energy intensity of 0.09 mJ/mm(2). In the control group acoustic simulation was performed without energy application. Medication was kept unchanged during the whole treatment period. RESULTS In the treatment group, symptoms improved in 9/11 patients, and the ischemic threshold, determined by cardiopulmonary exercise stress testing, increased from 80 ± 28 to 95 ± 28 W (P= 0.036). In the placebo arm, only 2/10 patients reported an improvement and the ischemic threshold remained unchanged (98 ± 23 to 107 ± 23 W; P= 0.141).
The items "physical functioning" (P= 0.043), "general health perception" (P= 0.046), and "vitality" (P= 0.035) of the SF-36 questionnaire significantly improved in the treatment arm, whereas in the placebo arm, no significant change was noted. Neither arrhythmias, troponin rise nor complications were observed during treatment. CONCLUSIONS This placebo-controlled trial shows a significant improvement in symptoms, quality of life parameters and ischemic threshold during exercise in patients with chronic refractory angina pectoris treated with CSWT. Thus, CSWT represents a new option for the treatment of patients with refractory AP BACKGROUND Emerging evidence suggests that stem cells and progenitor cells derived from bone marrow can be used to improve cardiac function in patients after acute myocardial infarction. In this randomised trial, we aimed to assess whether intracoronary transfer of autologous bone-marrow cells could improve global left-ventricular ejection fraction (LVEF) at 6 months' follow-up. METHODS After successful percutaneous coronary intervention (PCI) for acute ST-segment elevation myocardial infarction, 60 patients were randomly assigned to either a control group (n=30) that received optimum postinfarction medical treatment, or a bone-marrow-cell group (n=30) that received optimum medical treatment and intracoronary transfer of autologous bone-marrow cells 4.8 days (SD 1.3) after PCI. Primary endpoint was global left-ventricular ejection fraction (LVEF) change from baseline to 6 months' follow-up, as determined by cardiac MRI. Image analyses were done by two investigators blinded for treatment assignment. Analysis was per protocol. FINDINGS Global LVEF at baseline (determined 3.5 days [SD 1.5] after PCI) was 51.3 (9.3%) in controls and 50.0 (10.0%) in the bone-marrow-cell group (p=0.59).
After 6 months, mean global LVEF had increased by 0.7 percentage points in the control group and 6.7 percentage points in the bone-marrow-cell group (p=0.0026). Transfer of bone-marrow cells enhanced left-ventricular systolic function primarily in myocardial segments adjacent to the infarcted area. Cell transfer did not increase the risk of adverse clinical events, in-stent restenosis, or proarrhythmic effects. INTERPRETATION Intracoronary transfer of autologous bone-marrow cells promotes improvement of left-ventricular systolic function in patients after acute myocardial infarction BACKGROUND Low-energy shock wave (SW) therapy has improved myocardial ischemia in both a porcine model and in patients with severe angina pectoris. METHODS AND RESULTS To further confirm the effectiveness and safety of SW therapy, 8 patients with severe angina pectoris were treated with SW therapy in a double-blind, placebo-controlled and cross-over manner. SW therapy, but not placebo, significantly improved chest pain symptoms and cardiac function without any complications or adverse effects. CONCLUSIONS Extracorporeal cardiac SW therapy is an effective, safe and non-invasive therapeutic option for severe angina pectoris Objective Medically refractory angina remains a significant health concern despite major advances in revascularization techniques and emerging medical therapies. We aimed to determine the safety and efficacy of extracorporeal shockwave myocardial therapy (ESMT) in managing angina pectoris. Methods A single-arm multicenter prospective study was designed aiming to determine the safety and efficacy of ESMT. Patients of functional Canadian Cardiovascular Society class II-IV, despite stable and optimal medical management, with documented myocardial segments with reversible ischemia and/or hibernation on the basis of echocardiography/single-photon emission computerized tomography (SPECT) were enrolled from 2010 to 2012.
A total of 111 patients were enrolled, 33 from Indonesia, 21 from Malaysia, and 57 from the Philippines. Patients underwent nine cycles of ESMT over 9 weeks. Patients were followed up for 3-6 months after ESMT treatment. During follow-up, patients were subjected to clinical evaluation, the Seattle Angina Questionnaire, assessment of nitrate intake, the 6-min walk test, echocardiography, and SPECT. Results The mean age of the population was 62.9±10.9 years. The summed difference score on pharmacologically induced stress SPECT improved from 9.53±17.87 at baseline to 7.77±11.83 at follow-up (P=0.0086). Improvement in the total Seattle Angina Questionnaire score was seen in 83% of patients (P<0.0001). Sublingual nitroglycerin use significantly decreased (1.14±1.01 tablets per week at baseline to 0.52±0.68 tablets per week at follow-up; P=0.0215). There were no changes in left ventricular function on echocardiography (0.33±9.97, P=0.93). The Canadian Cardiovascular Society score improved in 74.1% of patients. Conclusion This multicenter prospective trial demonstrated that ESMT is both a safe and an efficacious means of managing medically refractory angina Objectives Left ventricular (LV) remodeling after acute myocardial infarction (AMI) is associated with a poor prognosis and an impaired quality of life. We have shown earlier that low-energy extracorporeal cardiac shock wave (SW) therapy improves chronic myocardial ischemia in pigs and humans and also ameliorates LV remodeling in a pig model of AMI induced by permanent coronary ligation. However, in the current clinical setting, most of the patients with AMI receive reperfusion therapy. Thus, in this study we examined whether our SW therapy also ameliorates LV remodeling after myocardial ischemia-reperfusion (I/R) injury in pigs in vivo.
Methods Pigs were subjected to 90-min ischemia and reperfusion using a balloon catheter and were randomly assigned to two groups with or without SW therapy to the ischemic border zone (0.09 mJ/mm2, 200 pulses/spot, 9 spots/animal, three times in the first week) (n=15 each). Results Four weeks after I/R, compared with the control group, the SW group showed significantly ameliorated LV remodeling in terms of LV enlargement (131±9 vs. 100±7 ml), reduced LV ejection fraction (28±2 vs. 36±3%), and elevated left ventricular end-diastolic pressure (11±2 vs. 4±1 mmHg) (all P<0.05, n=8 each). The SW group also showed significantly increased regional myocardial blood flow (−0.06±0.11 vs. 0.36±0.13 ml/min/g, P<0.05), capillary density (1233±31 vs. 1560±60/mm2, P<0.001), and endothelial nitric oxide synthase activity (0.24±0.03 vs. 0.41±0.05, P<0.05) in the ischemic border zone compared with the control group (n=7 each). Conclusion These results indicate that our SW therapy is also effective in ameliorating LV remodeling after myocardial I/R injury in pigs in vivo Objectives To determine the efficacy of cardiac shock wave therapy (CSWT) in the management of patients with end-stage coronary artery disease (CAD). Introduction Patients with end-stage CAD have symptoms such as recurrent angina, breathlessness, and other debilitating conditions. End-stage CAD patients are usually those who have angina pectoris following a coronary artery bypass surgery or a percutaneous coronary intervention. These patients are refractory to optimal medical therapy and not fit for a redo procedure, and are often termed 'no option' patients. Methods We carried out a prospective cohort study to examine the effects of CSWT application in patients who had end-stage CAD and were no option patients.
Characteristics such as angina class scores and functional status scores among cases (patients with end-stage CAD who received CSWT) and controls (patients with end-stage CAD who did not receive CSWT) were compared at baseline and at 6 months after CSWT therapy. Results There were 43 patients in the case group and 43 patients in the control group. The mean age of the patients was 58.7±9.5 years in the case group and 56.6±11.6 years in the control group. Other characteristics such as the prevalence of diabetes, hypertension, coronary artery bypass graft and percutaneous coronary intervention were similar in both groups. Clinical results showed a significant improvement in exercise time between the cases and the controls 6 months after treatment with CSWT (20.1±15.7 min in cases vs. 10.1±4.2 min in controls; P<0.0001), and symptomatic improvement in the CCS class scores (1.95±0.80 in cases and 2.63±0.69 in controls; P<0.0001) and NYHA class scores (1.95±0.80 in cases vs. 2.48±0.59 in controls; P<0.001). In the control group, there was no improvement in angina class, functional class and exercise time. Conclusion The present study shows that CSWT application to the ischemic myocardium in patients with refractory angina pectoris improved symptoms and reduced the severity of ischemic areas at 6 months after CSWT treatment compared with the baseline. No side effects were observed with this therapy OBJECTIVE To evaluate the feasibility, safety and efficiency of extracorporeal cardiac shock wave therapy (CSWT) in patients with ischemic heart failure. METHODS Fifty patients with ischemic heart failure and left ventricular ejection fraction (LVEF) < 50% were randomized to CSWT (shots/spot at 0.09 mJ/mm(2) for 9 spots, 9 times within 3 months) or control group.
Dual isotope simultaneous acquisition single-photon emission computed tomography with (99)Tc(m)-sestamibi/(18)F-fluorodeoxyglucose ( (99)Tc(m)-MIBI/(18)F-FDG ) was performed before randomization and at 1 month after CSWT/control to locate and evaluate viable myocardium region . Canadian cardiovascular society ( CCS ) class scores , NYHA , Seattle Angina Questionnaire ( SAQ ) , 6-min walk test ( 6 MWT ) , left ventricular ejection fraction ( LVEF ) , left ventricular end-diastolic dimension ( LVEDD ) and the dosage of nitroglycerin use were compared between two groups at each time point . RESULTS All patients completed the study protocol without procedural complications . At 1 month , patients in CSWT group experienced improvement in NYHA ( P < 0.01 ) , CCS ( P < 0.01 ) , SAQ ( P = 0.021 ) , 6 MWT ( P = 0.012 ) and dosage of nitroglycerin use ( P < 0.01 ) compared to baseline . LVEF [ 45.0 ( 39.0 , 48.0 ) vs. 47.0 ( 42.0 , 50.0 ) P = 0.001 ] , LVEDD [ 58.0 ( 56.0 , 59.0 ) vs. 56.0 ( 55.0 , 58.0 ) P = 0.002 ] , summed perfused score [ 23.0 ( 20.5 , 24.5 ) vs. 20.0 ( 18.0 , 22.0 ) P < 0.01 ] and metabolic score [ 25.0 ( 23.0 , 26.0 ) vs. 24.0 ( 21.5 , 25.0 ) P = 0.028 ] were also improved in CSWT group . All these parameters remained unchanged in control group between baseline and at 1 month . CSWT was an independent factor for improved cardiac function , quality of life and echocardiography parameters after adjusting for known factors which might affect outcome . CONCLUSION CSWT could improve symptom , cardiac function , quality of life and exercise tolerance in patients with ischemic heart failure , CSWT might serve as a new , non-invasive , safe and efficient therapy for these patients Objectives : Obstructive pancreatic duct stones can cause recurrent attacks of acute pancreatitis and chronic abdominal pain .
Use of extracorporeal shock wave lithotripsy ( ESWL ) for treatment of abdominal pain associated with obstructive pancreatic duct stones has been well documented . However , its effect on prevention of recurrent pancreatitis in this group of patients has not been studied . Methods : Patients with recurrent episodes of acute pancreatitis due to obstructive pancreatic duct stones not amenable to endoscopic removal were prospectively examined . All patients underwent ESWL by a pancreatologist using an electromagnetic shock wave lithotripter . After ESWL , the patients were followed up for recurrence of acute pancreatitis . Results : Ten patients underwent 13 sessions of ESWL . Complete and partial ductal clearances were achieved in 7 and 3 patients , respectively . The mean length of follow-up was 20 months ( range , 12 - 36 months ) . Three patients had recurrent attacks of acute pancreatitis during the follow-up period , caused by recurrence of obstructive stones in 2 and presence of main duct stricture in 1 patient . Conclusions : Extracorporeal shock wave lithotripsy of obstructive pancreatic duct stones in patients with recurrent attacks of acute pancreatitis can prevent further attacks . New episodes of acute pancreatitis in this group of patients may indicate stone recurrence or presence of strictures . Abbreviations : CT - computed tomography , ERCP - endoscopic retrograde cholangiopancreatography , ESWL - extracorporeal shock wave OBJECTIVES The incidence of patients with refractory angina ( RA ) is increasing . Medical therapy for RA is limited and prognosis is poor . Experimental data suggest that the use of Extracorporeal shockwave myocardial revascularization ( ESMR ) may contribute to angiogenesis and improve symptoms of angina in patients with RA . Purpose of our study is to determine the efficacy of cardiac shock wave therapy ( ESMR ) in the management of patients with nonrevascularized coronary artery disease ( CAD ) .
METHODS We performed a prospective cohort study to examine the efficacy of ESMR application in patients with RA despite optimal medical therapy , not suitable for further PCI or CABG . Characteristics such as angina class scores ( CCS class score ) , nitroglycerin consumption and hospitalization rate among cases ( patients with RA who received ESMR ) and controls ( patients with RA who did not receive ESMR ) were compared at baseline and 6 months after ESMR therapy . In patients receiving ESMR the effect on cardiac perfusion was assessed . RESULTS There were 43 patients in the case group and 29 patients in the control group . The mean age of the patients was 70 ± 9.5 years in the case group and 71 ± 5.3 years in the control group . Other characteristics ( diabetes , coronary artery bypass graft , percutaneous coronary intervention , baseline CCS class score ) were similar in both groups . There was a significant improvement in CCS class score ( 1.33 ± 0.57 in cases and 1.92 ± 0.69 in controls ; p = 0.0002 ) and nitroglycerin consumption ( 20 % in cases and 44.8 % in controls ; P < 0.03 ) , and the hospitalization rate was significantly reduced ( 13.9 % in cases and 37.9 % in controls ; P < 0.03 ) . In patients who received ESMR , there was a significant improvement in myocardial perfusion after 6 months with a 33 % relative reduction of summed stress score ( SSS ) ( p = 0.002 ) . CONCLUSION This case control study demonstrates the beneficial effect of ESMR therapy on cardiac symptoms , myocardial perfusion and reduced hospitalization in patients with refractory angina . The current study supports a role for ESMR as a non-invasive therapeutic option for patients with RA
2,010
21,235,770
Few differences were detected by type of chemotherapy , with the single exception of worse results among younger women treated with radiotherapy . In the short term , better results were reported for all HRQL components by women undergoing conservative rather than radical surgery . Presence of lymphedema was associated with worse HRQL . Psychosocial disorder and level of depression and anxiety , regardless of treatment or disease stage , worsened HRQL . One study found that breast cancer patients scored worse than did healthy women on almost all SF-12 scales .
Background Breast cancer is one of the oncological diseases in which health-related quality of life ( HRQL ) has been most studied . This is mainly due to its high incidence and survival . This paper seeks to : review published research into HRQL among women with breast cancer in Spain ; analyse the characteristics of these studies ; and describe the instruments used and main results reported .
BACKGROUND The aim of the study was to analyse the toxicity and health related quality of life ( HRQoL ) of breast cancer patients treated with FAC ( 5-fluorouracil , doxorubicin , cyclophosphamide ) and TAC ( docetaxel , doxorubicin , cyclophosphamide ) with and without primary prophylactic G-CSF ( PPG ) . PATIENTS AND METHODS This was a phase III study to compare FAC and TAC as adjuvant treatment of high-risk node-negative breast cancer patients . After the entry of the first 237 patients , the protocol was amended to include PPG in the TAC arm due to the high incidence of febrile neutropenia . A total of 1047 evaluable patients from 49 centres in Spain , two in Poland and four in Germany were included in the trial . Side-effects and the scores of the EORTC QLQ-C30 and QLQ BR-23 questionnaires were compared in the three groups ( FAC , TAC pre-amendment and TAC post-amendment ) . RESULTS The addition of PPG to TAC significantly reduced the incidence of neutropenic fever , grade 2 - 4 anaemia , asthenia , anorexia , nail disorders , stomatitis , myalgia and dysgeusia . Patient QoL decreased during chemotherapy , more with TAC than FAC , but returned to baseline values afterwards . The addition of PPG to TAC significantly reduced the percentage of patients with clinically relevant Global Health Status deterioration ( 10 or more points over baseline value ) at the end of chemotherapy ( 64 % versus 46 % , P<0.03 ) . CONCLUSIONS The addition of PPG significantly reduces the incidence of neutropenic fever associated with TAC chemotherapy as well as that of some TAC-induced haematological and extrahaematological side-effects . The HRQoL of patients treated with TAC is worse than that of those treated with FAC but improves with the addition of PPG , particularly in the final part of chemotherapy treatment Introduction There are few studies on the effect on quality of life ( QL ) of cancer-related illness and treatment in elderly patients .
The aim of this work was to evaluate prospectively QL in a sample of elderly patients with stages I – III breast cancer who started radiotherapy treatment and compare their QL with that of a sample of younger patients . Materials and methods Forty-eight patients , ≥ 65 years of age completed the European Organization for Research and Treatment of Cancer ( EORTC ) QL questionnaires QLQ-C30 and QLQ-BR23 , and the Interview for Deterioration in Daily Living Activities in Dementia ( IDDD ) daily activities scale three times throughout treatment and follow-up periods . Clinical and demographic data were also recorded . Fifty patients ages 40–64 years with the same disease stage and treatment modality had previously completed the QL questionnaires . QL scores , changes in them among the three assessments , differences between groups based on clinical factors , and differences between the two samples were calculated . Results QL scoring was good and stable ( > 70/100 points ) in most areas , in line with clinical data . Light and moderate limitations occurred in global QL and some emotional , sexual , and treatment-related areas . Moderate decreases ( 10–20 ) appeared in some toxicity-related areas , which recovered during the follow-up period . Breast-conservation and sentinel-node patients presented higher scores in emotional areas . There were few QL differences among age-based samples . Conclusions QL and clinical data indicate radiotherapy was well tolerated . Age should not be the only factor evaluated when deciding upon treatment for breast cancer patients Quality of life ( QOL ) assessments of women entering a UK randomised trial of adjuvant radiotherapy ( START ) were investigated to estimate the independent effects on QOL of age , time since surgery , type of breast surgery , chemotherapy and endocrine therapy .
QOL was evaluated using the EORTC general cancer QOL scale ( EORTC QLQ-C30 ) , breast cancer module ( BR23 ) , the Body Image Scale ( BIS ) and the Hospital Anxiety and Depression Scale ( HADS ) . Independent effects of age and clinical factors were tested using multiple regression analysis . A total of 2208 ( mean age 56.9 years , range 26 - 87 ) consented to the QOL study prior to radiotherapy ; 17.1 % had undergone mastectomy ( Mx ) and the remainder had undergone a wide local excision ( WLE ) . 33.3 % had received adjuvant chemotherapy ( CT ) and 56.7 % were taking endocrine therapy ( ET ) . Age had significant effects on QOL with older and younger subgroups predicting poorer QOL for different domains . CT affected most QOL domains and resulted in worse body image , sexual functioning , breast and arm symptoms ( < 0.001 ) . Mx was associated with greater body image concerns ( p<0.001 ) , and WLE with more arm symptoms ( p=0.01 ) . There were no effects of ET on QOL . Women < 50 years ( proxy pre-menopausal ) had worse QOL in respect of anxiety , body image and breast symptoms but age and clinical factors had no effect on depression . Overall , QOL and mental health were favourable for most women about to start RT , but younger age and receiving CT were significant risk factors for poorer QOL , and so patients in these subgroups warrant further monitoring . Surgery had a limited impact and ET had no effect on QOL OBJECTIVE To evaluate a single cycle of adjuvant chemotherapy compared with longer duration chemotherapy for premenopausal women or chemoendocrine therapy for postmenopausal women with operable breast cancer using a quality-of-life-oriented end point , Q-TWiST ( quality-adjusted analysis of TWiST : Time Without Symptoms and Toxicity ) . DESIGN Multicenter randomized clinical trial -- International Breast Cancer Study Group ( IBCSG : formerly Ludwig Group ) Trial V.
SETTING IBCSG participating centers in Sweden , Switzerland , Australia , Yugoslavia , Spain , New Zealand , Italy , Germany , and South Africa . PATIENTS Data were available for 1229 eligible patients with node-positive breast cancer who were randomized to receive one of three adjuvant treatments after at least a total mastectomy and axillary clearance . INTERVENTIONS Patients received either a single cycle of perioperative chemotherapy consisting of cyclophosphamide , methotrexate , fluorouracil , and leucovorin ; or six cycles ( 6 months ) of a conventionally timed chemotherapy consisting of cyclophosphamide , methotrexate , fluorouracil , and prednisone for premenopausal women or this combination plus tamoxifen for postmenopausal women ; or both perioperative and conventionally timed chemotherapy for a 7-month course of adjuvant therapy . RESULTS At 5 years of median follow-up , patients who received the longer duration therapies had an improved 5-year disease-free survival percentage ( 53 % compared with 36 % ; P less than 0.001 ) and 5-year overall survival percentage ( 73 % compared with 63 % ; P = 0.001 ) compared with those who received the single perioperative cycle alone . By 3.5 years , the greater burden of toxic effects associated with the longer duration treatments was balanced by their superior control of disease . Within 5 years of follow-up , even after subtracting time with adjuvant treatment toxicity , patients gained an average of 2.2 months of Q-TWiST if treated with the longer duration therapies compared with the single cycle ( P = 0.03 ) . The gain for premenopausal patients was 2.8 months ( P = 0.05 ) , whereas the gain for postmenopausal women was 1.5 months ( P greater than 0.2 ) .
CONCLUSIONS Six or seven months of adjuvant chemotherapy or chemoendocrine therapy improve both the quantity and quality of life for patients with node-positive breast cancer compared with a single short course of perioperative combination chemotherapy BACKGROUND Quality of life has increasingly become an important issue in breast cancer treatment . One of the impetuses for sentinel lymph node biopsy or selective axillary lymph node dissection ( ALND ) is the assumed decreased incidence of lymphedema compared with standard ALND . This is based on the assumption that ALND is associated with a clinically significant incidence of lymphedema and that this lymphedema decreases the quality of life of these patients . However , few data exist on this issue . This study attempts to define the incidence and effect on quality of life of postoperative lymphedema in breast cancer patients . METHODS To determine the incidence of postoperative lymphedema , the Breast Cancer Registry at Henry Ford Hospital was accessed to obtain information on all patients who underwent ALND in the management of breast cancer over a 7-year period . The registry is a prospectively gathered database to include the development of various complications , such as lymphedema . To determine the effects of lymphedema on quality of life , 101 consecutive , unselected patients who underwent breast surgery were asked to complete the SF-36 , a generic quality of life instrument . The SF-36 measures eight domains of quality of life . Patients were then divided into three groups : ( 1 ) breast surgery without ALND ( -ALND ) , ( 2 ) breast surgery with ALND but no lymphedema ( -LE ) , and ( 3 ) breast surgery with ALND and lymphedema ( + LE ) . RESULTS In all , 827 patients with ALND were identified in the registry . Of these , 8.3 % developed clinically apparent lymphedema . Patients in -ALND and -LE groups had similar scores in all domains of the SF-36 .
However , patients in the + LE group had significantly lower scores in the domains of role-emotional and bodily pain . A significantly higher percentage of patients in the + LE group had scores below one standard deviation compared with national norms in the domains of bodily pain ( P = 0.005 ) , mental health ( P = 0.01 ) , and general health ( P = 0.04 ) . CONCLUSIONS Although postoperative lymphedema occurs in a minority of patients , when it does occur it can produce demonstrable diminutions in quality of life . Therefore , efforts to reduce the incidence of lymphedema , such as sentinel lymph node biopsy or selective ALND , would benefit breast cancer patients PURPOSE The randomized study aimed to determine the efficacy of psychological intervention consisting of relaxation and guided imagery to reduce anxiety and depression in gynecologic and breast cancer patients undergoing brachytherapy during hospitalization . METHODS AND MATERIALS Sixty-six patients programmed to receive brachytherapy in two hospitals in Barcelona ( Spain ) were included in this study . The patients were randomly allocated to either the study group ( n=32 ) or the control group ( n=34 ) . Patients in both groups received training regarding brachytherapy , but only study group patients received training in relaxation and guided imagery . After collection of sociodemographic data , all patients were given a set of questionnaires on anxiety and depression : the Hospital Anxiety and Depression Scale ( HADS ) , and on quality of life : Cuestionario de Calidad de Vida QL-CA-AFex ( CCV ) , prior to , during and after brachytherapy . RESULTS The study group demonstrated a statistically significant reduction in anxiety ( p=0.008 ) , depression ( p=0.03 ) and body discomfort ( p=0.04 ) compared with the control group .
CONCLUSIONS The use of relaxation techniques and guided imagery is effective in reducing the levels of anxiety , depression and body discomfort in patients who must remain isolated while undergoing brachytherapy . This simple and inexpensive intervention enhances the psychological wellness in patients undergoing brachytherapy . State : This study has passed Ethical Committee review PURPOSE / OBJECTIVES To describe the side-effects burden experienced over time by 53 women who were receiving treatment for breast cancer and to describe the association of side-effects burden with psychological adjustment and life quality . DESIGN Data were drawn from the Self-Help Intervention Project ( SHIP ) , an intervention study designed to test the effectiveness of nursing interventions for women receiving treatment for breast cancer . SETTING Subjects were interviewed in their homes or treatment locations three times over a period of four to five months . SAMPLE 53 women randomly assigned to the control group of the SHIP . METHODS The researchers collected data after treatment was initiated , six to eight weeks later , and three months after that . MAIN RESEARCH VARIABLES Side-effects burden , psychological adjustment , and life quality . FINDINGS Fatigue was the most problematic side effect over time . Other problematic side effects included sore arm(s ) , difficulty sleeping , hair loss , and skin irritation . Significant associations were evident for psychological adjustment with symptom extension and number of side effects at Time 2 and Time 3 . Depression burden and anxiety burden were associated significantly with psychological adjustment at all three times . Overall life quality and present life quality was associated negatively with symptom extension and number of side effects at all three times .
Fatigue burden was associated negatively with life quality at Time 2 and Time 3 with depression burden and anxiety burden negatively associated with life quality at all three times . CONCLUSIONS Over time , evidence showed that negative feelings , in particular depression burden and anxiety burden , persist . Depression burden and anxiety burden each were negatively associated with overall and present life quality at all three times . IMPLICATIONS FOR NURSING PRACTICE A need exists for clinically individualized nursing interventions that will reduce the side effects burden of women receiving treatment for breast cancer . Interventions can do much to reduce the perception of illness severity so that psychological adjustment and life quality can be maintained The classical criteria for the evaluation of clinical trials in cancer reflect alterations in physical well-being , but are insensitive to other important factors , such as psychosocial state , sociability , and somatic sensation that may play a critical role in determining the patients ' functional response to their illness and its treatment . The Functional Living Index-Cancer is designed for easy , repeated patient self-administration . It is a 22-item questionnaire that has been validated on 837 patients in two cities over a three-year period . Criteria for validity include stability of factor analysis , concurrent validation studies against the Karnofsky , Beck Depression , Spielberger State and Trait Anxiety , and Katz Activities of Daily Living scales , as well as the scaled version of The General Health Questionnaire and The McGill/Melzack Pain Index . The index is uncontaminated by social desirability issues . The validation studies demonstrate the lack of correlation between traditional measures of patient response and other significant functional factors such as depression and anxiety ( r = 0.33 ) , sociability and family interaction , and nausea .
These findings elucidate the frequently observed discrepancies between traditional assessments of clinical response and overall functional patient outcome . The index is proposed as an adjunct to clinical trials assessment and may provide additional patient functional information on which to analyse the outcome of clinical trials or offer specific advice to individual patients PURPOSE / OBJECTIVES To examine the effectiveness of a psychoeducational intervention on quality of life ( QOL ) in breast cancer survivors in post-treatment survivorship . DESIGN A randomized controlled trial . SETTING An academic center collaborating with a regional cancer center in the southeastern United States . SAMPLE 256 breast cancer survivors . METHODS Women were randomly assigned to the experimental or wait control group . The Breast Cancer Education Intervention ( BCEI ) study was delivered in three face-to-face sessions and five monthly follow-up sessions ( three by telephone and two in person ) . The control group received four monthly attention control telephone calls and the BCEI at month 6 . Data were collected at baseline , three and six months after the BCEI for the experimental group , and one month after the BCEI ( at month 7 ) for the wait control group . MAIN RESEARCH VARIABLES Primary endpoints were overall QOL and physical , psychological , social , and spiritual well-being . FINDINGS No differences in QOL were reported at baseline between groups . The experimental group reported improved QOL at three months , whereas the wait control group reported a significant decline in QOL . The experimental group reported continued maintenance of QOL at six months . Although the wait control group reported improved QOL at six months , significant differences continued to exist between the groups . CONCLUSIONS The BCEI was an effective intervention in improving QOL during the first year of breast cancer survivorship . Treatment effects were durable over time .
IMPLICATIONS FOR NURSING Post-treatment survivorship has not been empirically studied to a large degree . The BCEI is one of the few interventions demonstrating effectiveness among survivors after primary treatment , suggesting that oncology nurses may be uniquely positioned to provide safe passage using education and support As the number of women surviving breast cancer increases , with implications for the health system , research into the physical and psychosocial sequelae of the cancer and its treatment is a priority . This research estimated self-reported health-related quality of life ( HRQoL ) associated with two rehabilitation interventions for breast cancer survivors , compared to a non-intervention group . Women were selected if they received an early home-based physiotherapy intervention ( DAART , n = 36 ) or a group-based exercise and psychosocial intervention ( STRETCH , n = 31 ) . Questionnaires on HRQoL , using the Functional Assessment of Cancer Therapy – Breast Cancer plus Arm Morbidity module , were administered at pre- , post-intervention , 6- and 12-months post-diagnosis . Data on a non-intervention group ( n = 208 ) were available 6- and 12-months post-diagnosis . Comparing pre/post-intervention measures , benefits were evident for functional well-being , including reductions in arm morbidity and upper-body disability for participants completing the DAART service at one-to-two months following diagnosis . In contrast , minimal changes were observed between pre/post-intervention measures for the STRETCH group at approximately 4-months post-diagnosis . Overall , mean HRQoL scores ( adjusted for age , chemotherapy , hormone therapy , high blood pressure and occupation type ) improved gradually across all groups from 6- to 12-months post-diagnosis , and no prominent differences were found . However , this obscured declining HRQoL scores for 20–40 % of women at 12 months post-diagnosis , despite receiving supportive care services .
Greater awareness and screening for adjustment problems among breast cancer survivors is required throughout the disease trajectory . Early physiotherapy after surgery has the potential for short-term functional , physical and overall HRQoL benefits The purpose of this pilot study was to examine the effects of a combined cardiorespiratory and resistance exercise training program of short duration on the cardiorespiratory fitness , strength endurance , task specific functional muscle capacity , body composition and quality of life ( QOL ) in women breast cancer survivors . Sixteen subjects were randomly assigned to either a training ( n = 8 ; age : 50 ± 5 yrs ) or control non-exercising group ( n = 8 ; age : 51 ± 10 yrs ) . The training group followed an 8-week exercise program consisting of 3 weekly sessions of 90-min duration , supervised by an experienced investigator and divided into resistance exercises and aerobic training . Before and after the intervention period , all of the subjects performed a cardiorespiratory test to measure peak oxygen uptake ( VO2peak ) , a dynamic strength endurance test ( maximum number of repetitions for chest and leg press exercise at 30 - 35 % and 100 - 110 % of body mass , respectively ) and a sit-stand test . Quality of life was assessed using the European Organization for Research and Treatment of Cancer QLQ-C30 ( EORTC-C30 ) questionnaire . In response to training , QOL , VO2peak ( mean 3.9 ml/kg/min ; 95 % CI , 0.93 , 6.90 ) performance in leg press ( 17.9 kg ; 95 % CI , 12.8 , 22.4 ) and sit-stand test ( - 0.67 s ; 95 % CI , - 0.52 , - 1.2 ) improved ( p < or = 0.05 ) . We observed no significant changes in the control group . Combined cardiorespiratory and resistance training , even of very brief duration , improves the QOL and the overall physical fitness of women breast cancer survivors Background .
Although mortality rates from breast cancer are declining , many breast cancer survivors will experience physical and psychological sequelae that affect their everyday lives . Few prospective studies have examined the rehabilitation needs of newly diagnosed breast cancer patients , and little is known about the predictors of health‐related quality of life ( QOL ) in this population . Methods . Between 1987 and 1990 , 227 women with early stage breast cancer participated in a prospective longitudinal study in which detailed information was collected through interviews , standardized measures of QOL and psychological distress , and clinical evaluation . Comparisons of physical and treatment‐related problems were made according to type of surgical treatment . Multivariate regression analysis was performed to examine the predictors of QOL at one year after surgery . Results . Physical and treatment‐related problems were reported frequently one month after breast cancer surgery , and occurred with equal frequency in women receiving modified radical mastectomy or breast conservation treatment . There were no significant differences in problems reported at one year by type of surgery ; however , frequently reported problems include ‘ numbness in the chest wall or axilla , ’ ‘ tightness , pulling or stretching in the arm or axilla , ’ ‘ less energy or fatigue , ’ ‘ difficulty in sleeping , ’ and ‘ hot flashes ’ . There was no relationship between the type of surgery and mood or QOL . Poorer QOL one year after surgery was significantly associated with greater mood disturbance and body image discomfort one month after surgery , as well as positive lymph node involvement . Although the majority of patients experienced substantial disruptions in the physical and psychosocial dimensions of QOL post‐operatively , most women recovered during the year after surgery , with only a minority ( < 10 % ) significantly worsening during that time . Conclusions .
At one year after surgery , most women report high levels of functioning and QOL , with no relationship between the type of surgery and QOL . Women who reported lower levels of QOL at one year after diagnosis had greater mood disturbance and poorer body image one month after surgery , as well as lower income and positive axillary nodes The purpose of this study was to examine how level of depression burden influences women 's psychological adjustment and quality of life over time and how depression burden interacted with a community-based oncology support program to influence psychological adjustment and life quality . Participants were 169 women who completed a side effects checklist at three data collection points . Women were divided into two groups based on their depression burden scores : 123 women reporting no burden , and 46 women reporting high depression burden . For psychological adjustment , there were significant interaction effects for intervention by time and for intervention by depression burden by time and significant main effects for depression burden . For life quality , there was a significant interaction effect for intervention by time and a significant main effect for depression burden . The findings document the negative impact of depression burden on psychological adjustment and life quality . Oncology support interventions can be effective in reducing this negative impact
2,011
27,729,027
Several studies found that access to a specialist heart failure team/service reduced hospital readmissions and mortality . In primary care , a collaborative model of care where the primary physician shared the care with a cardiologist , improved patient outcomes compared to a primary physician only . During hospitalisation , quality improvement programs improved the quality of inpatient care resulting in reduced hospital readmissions and mortality . In the transitional care phase , heart failure programs , nurse-led clinics , and early outpatient follow-up reduced hospital readmissions . Conclusion Redesigning systems of care aimed at improving the translation of evidence into clinical practice and transitional care can potentially improve patient outcomes in a cohort of patients known for high readmission rates and mortality
Background Hospital admissions for heart failure are predicted to rise substantially over the next decade placing increasing pressure on the health care system . There is an urgent need to redesign systems of care for heart failure to improve evidence-based practice and create seamless transitions through the continuum of care . The aim of the review was to examine systems of care for heart failure that reduce hospital readmissions and/or mortality .
Heart failure (HF) is the leading cause of rehospitalization in older adults. The purpose of this pilot study was to examine whether telemonitoring by an advanced practice nurse reduced subsequent hospital readmissions, emergency department visits, costs, and risk of hospital readmission for patients with HF. One hundred two patient/caregiver dyads were randomized into 2 groups postdischarge; 84 dyads completed the study. Hospital readmissions, emergency department visits, costs, and days to readmission were abstracted from medical records. Participants were interviewed soon after discharge and 3 months later about effects of telemonitoring on depressive symptoms, quality of life, and caregiver mastery. There were no significant differences due to telemonitoring for any outcomes. Caregiver mastery, informal social support, and electronic home monitoring were not significant predictors for risk of hospital readmission. Further studies should address the interaction between the advanced practice nurse and follow-up intervention with telemonitoring of patients with HF to better target those who are most likely to benefit. OBJECTIVES We sought to identify whether home telemonitoring (HTM) improves outcomes compared with nurse telephone support (NTS) and usual care (UC) for patients with heart failure who are at high risk of hospitalization or death. BACKGROUND Heart failure is associated with a high rate of hospitalization and poor prognosis. Telemonitoring could help implement and maintain effective therapy and detect worsening heart failure and its cause promptly to prevent medical crises. METHODS Patients with a recent admission for heart failure and left ventricular ejection fraction (LVEF) < 40% were assigned randomly to HTM, NTS, or UC in a 2:2:1 ratio. HTM consisted of twice-daily patient self-measurement of weight, blood pressure, heart rate, and rhythm with automated devices linked to a cardiology center.
The NTS consisted of specialist nurses who were available to patients by telephone. Primary care physicians delivered UC. The primary end point was days dead or hospitalized with NTS versus HTM at 240 days. RESULTS Of 426 patients randomly assigned, 48% were aged > 70 years, mean LVEF was 25% (SD, 8), and median plasma N-terminal pro-brain natriuretic peptide was 3,070 pg/ml (interquartile range 1,285 to 6,749 pg/ml). During 240 days of follow-up, 19.5%, 15.9%, and 12.7% of days were lost as the result of death or hospitalization for UC, NTS, and HTM, respectively (no significant difference). The number of admissions and mortality were similar among patients randomly assigned to NTS or HTM, but the mean duration of admissions was reduced by 6 days (95% confidence interval 1 to 11) with HTM. Patients randomly assigned to receive UC had higher one-year mortality (45%) than patients assigned to receive NTS (27%) or HTM (29%) (p = 0.032). CONCLUSIONS Further investigation and refinement of the application of HTM are warranted because it may have a valuable role in the management of selected patients with heart failure. Background — Assessment of the quality of care for outpatients with heart failure (HF) has focused on the development and use of process-based performance measures, with the supposition that these care process measures are associated with clinical outcomes. However, this association has not been evaluated for current and emerging outpatient HF measures. Methods and Results — Performance on 7 HF process measures (4 current and 3 emerging) and 2 summary measures was assessed at baseline in patients from 167 US outpatient cardiology practices, with patients prospectively followed up for 24 months. Participants included 15,177 patients with reduced left ventricular ejection fraction (≤35%) and chronic HF or post-myocardial infarction.
Multivariable analyses were performed to assess the process-outcome relationship for each measure in eligible patients. Vital status was available for 11,621 patients. The mortality rate at 24 months was 22.1%. Angiotensin-converting enzyme inhibitor or angiotensin receptor blocker use, β-blocker use, anticoagulant therapy for atrial fibrillation, cardiac resynchronization therapy, implantable cardioverter-defibrillators, and HF education for eligible patients were each independently associated with improved 24-month survival, whereas aldosterone antagonist use was not. The all-or-none and composite care summary measures were also independently associated with improved survival. Each 10% improvement in composite care was associated with a 13% lower odds of 24-month mortality (adjusted odds ratio, 0.87; 95% confidence interval, 0.84 to 0.90; P<0.0001). Conclusions — Current and emerging outpatient HF process measures are positively associated with patient survival. These HF measures may be useful for assessing and improving HF care. Clinical Trial Registration — URL: http://www.clinicaltrials.gov. Unique identifier: NCT00303979. OBJECTIVE To study the impact of remote patient monitoring (RPM) upon the most frequent diagnosis in hospitalized patients over 65 years of age: heart failure (HF). We examined the effect of RPM on hospital utilization and Medicare costs of HF patients receiving home care. MATERIALS AND METHODS Two studies were simultaneously conducted: a randomized and a matched-cohort study. In the randomized study, 168 subjects were randomly assigned (after hospitalization) to home care utilizing RPM (live nursing visits and video-based nursing visits) or to home care receiving live nursing visits only. In the matched-cohort study, 160 subjects receiving home care with RPM (live nursing visits and video-based nursing visits) were matched with home care subjects receiving live nursing visits only.
RESULTS Regardless of whether outcomes were being analyzed for all subjects (intention to treat) or for hospitalized subjects only, hospitalization rates, time to first admission, length of stay, and costs to Medicare did not differ significantly between groups in either study at 30 or 90 days after enrollment. A notable trend, however, emerged across studies: although time to hospitalization was shorter in the RPM groups than the control groups, RPM groups had lower hospitalization costs. CONCLUSIONS RPM, when utilized in conjunction with a robust management protocol, was not found to significantly differ from live nursing visits in the management of HF in home care. Shorter hospitalization times and lower associated costs may be due to earlier identification of exacerbation. These trends indicate the need for further study. Background — Trials investigating efficacy of disease management programs (DMP) in heart failure reported contradictory results. Features rendering specific interventions successful are often ill defined. We evaluated the mode of action and effects of a nurse-coordinated DMP (HeartNetCare-HF, HNC). Methods and Results — Patients hospitalized for systolic heart failure were randomly assigned to HNC or usual care (UC). Besides telephone-based monitoring and education, HNC addressed individual problems raised by patients, pursued networking of health care providers, and provided training for caregivers. End points were time to death or rehospitalization (combined primary), heart failure symptoms, and quality of life (SF-36). Of 1007 consecutive patients, 715 were randomly assigned (HNC: n=352; UC: n=363; age, 69±12 years; 29% female; 40% New York Heart Association class III-IV). Within 180 days, 130 HNC and 137 UC patients reached the primary end point (hazard ratio, 1.02; 95% confidence interval, 0.81–1.30; P=0.89), since more HNC patients were readmitted.
Overall, 32 HNC and 52 UC patients died (1 UC patient and 4 HNC patients after dropout); thus, the uncensored hazard ratio was 0.62 (0.40–0.96; P=0.03). HNC patients improved more regarding New York Heart Association class (P=0.05), physical functioning (P=0.03), and the physical health component (P=0.03). Except for HNC, health care utilization was comparable between groups. However, HNC patients requested counseling for noncardiac problems even more frequently than for cardiovascular or heart-failure-related issues. Conclusions — The primary end point of this study was neutral. However, mortality risk and surrogates of well-being improved significantly. Quantitative assessment of patient requirements suggested that, besides (tele)monitoring, individualized care considering also noncardiac problems should be integrated in efforts to achieve more sustainable improvement in heart failure outcomes. Clinical Trial Registration — URL: http://www.controlled-trials.com. Unique identifier: ISRCTN23325295. The evidence base of what works in chronic care management programs is underdeveloped. To fill the gap, we pooled and reanalyzed data from ten randomized clinical trials of heart failure care management programs to discern how program delivery methods contribute to patient outcomes. We found that patients enrolled in programs using multidisciplinary teams and in programs using in-person communication had significantly fewer hospital readmissions and readmission days than routine care patients had. Our study offers policymakers and health plan administrators important guideposts for developing an evidence base on which to build effective policy and programmatic initiatives for chronic care management. BACKGROUND Small studies suggest that telemonitoring may improve heart-failure outcomes, but its effect in a large trial has not been established.
METHODS We randomly assigned 1653 patients who had recently been hospitalized for heart failure to undergo either telemonitoring (826 patients) or usual care (827 patients). Telemonitoring was accomplished by means of a telephone-based interactive voice-response system that collected daily information about symptoms and weight that was reviewed by the patients' clinicians. The primary end point was readmission for any reason or death from any cause within 180 days after enrollment. Secondary end points included hospitalization for heart failure, number of days in the hospital, and number of hospitalizations. RESULTS The median age of the patients was 61 years; 42.0% were female, and 39.0% were black. The telemonitoring group and the usual-care group did not differ significantly with respect to the primary end point, which occurred in 52.3% and 51.5% of patients, respectively (difference, 0.8 percentage points; 95% confidence interval [CI], -4.0 to 5.6; P=0.75 by the chi-square test). Readmission for any reason occurred in 49.3% of patients in the telemonitoring group and 47.4% of patients in the usual-care group (difference, 1.9 percentage points; 95% CI, -3.0 to 6.7; P=0.45 by the chi-square test). Death occurred in 11.1% of the telemonitoring group and 11.4% of the usual-care group (difference, -0.2 percentage points; 95% CI, -3.3 to 2.8; P=0.88 by the chi-square test). There were no significant differences between the two groups with respect to the secondary end points or the time to the primary end point or its components. No adverse events were reported. CONCLUSIONS Among patients recently hospitalized for heart failure, telemonitoring did not improve outcomes. The results indicate the importance of a thorough, independent evaluation of disease-management strategies before their adoption. (Funded by the National Heart, Lung, and Blood Institute; ClinicalTrials.gov number, NCT00303212.
) BACKGROUND Clinical registries are used increasingly to analyze quality and outcomes, but the generalizability of findings from registries is unclear. METHODS We linked data from the Acute Decompensated Heart Failure National Registry (ADHERE) to 100% fee-for-service Medicare claims data. We compared patient characteristics and inpatient mortality of linked and unlinked ADHERE hospitalizations; patient characteristics, readmission, and postdischarge mortality of linked ADHERE patients to a random 20% sample of Medicare beneficiaries hospitalized for heart failure; and characteristics of Medicare sites participating and not participating in ADHERE. RESULTS Among 135,667 ADHERE records for eligible patients ≥ 65 years, we matched 104,808 (77.3%) records to fee-for-service Medicare claims, representing 82,074 patients. Linked hospitalizations were more likely than unlinked hospitalizations to involve women and white patients; there were no meaningful differences in other patient characteristics. In-hospital mortality was identical for linked and unlinked hospitalizations. In Medicare, ADHERE patients had slightly lower unadjusted mortality (4.4% vs 4.9% in-hospital, 11.2% vs 12.2% at 30 days, 36.0% vs 38.3% at 1 year [P < .001]) and all-cause readmission (22.1% vs 23.7% at 30 days, 65.8% vs 67.9% at 1 year [P < .001]). After risk adjustment, modest but statistically significant differences remained. ADHERE hospitals were more likely than non-ADHERE hospitals to be teaching hospitals, have higher volumes of heart failure discharges, and offer advanced cardiac services. CONCLUSION Elderly patients in ADHERE are similar to Medicare beneficiaries hospitalized with heart failure. Differences related to selective enrollment in ADHERE hospitals and self-selection of participating hospitals are modest. Patients with chronic conditions are heavy users of the health care system.
There are opportunities for significant savings and improvements to patient care if patients can be maintained in their homes. A randomized control trial tested the impact of 3 months of telehome monitoring on hospital readmission, quality of life, and functional status in patients with heart failure or angina. The intervention consisted of video conferencing and phone line transmission of weight, blood pressure, and electrocardiograms. Telehome monitoring significantly reduced the number of hospital readmissions and days spent in the hospital for patients with angina, and improved quality of life and functional status in patients with heart failure or angina. Patients found the technology easy to use and expressed high levels of satisfaction. Telehealth technologies are a viable means of providing home monitoring to patients with heart disease at high risk of hospital readmission to improve their self-care abilities. BACKGROUND Heart failure (HF) remains a condition with high morbidity and mortality. We tested a telephone support strategy to reduce major events in rural and remote Australians with HF, who have limited healthcare access. Telephone support comprised an interactive telecommunication software tool (TeleWatch) with follow-up by trained cardiac nurses. METHODS Patients with a general practice (GP) diagnosis of HF were randomized to usual care (UC) or UC and telephone support intervention (UC+I) using a cluster design involving 143 GPs throughout Australia. Patients were followed up for 12 months. The primary endpoint was the Packer clinical composite score. Secondary endpoints included hospitalization for any cause, death or hospitalization, as well as HF hospitalization. RESULTS Four hundred and five patients were randomized to CHAT. Patients were well matched at baseline for key demographic variables.
The primary endpoint of the Packer score was not different between the two groups (P = 0.98), although more patients improved with UC+I. There were fewer patients hospitalized for any cause (74 vs. 114, adjusted HR 0.67 [95% CI 0.50-0.89], P = 0.006) and who died or were hospitalized (89 vs. 124, adjusted HR 0.70 [95% CI 0.53-0.92], P = 0.011) in the UC+I vs. UC group. HF hospitalizations were reduced with UC+I (23 vs. 35, adjusted HR 0.81 [95% CI 0.44-1.38]), although this was not significant (P = 0.43). There were 16 deaths in the UC group and 17 in the UC+I group (P = 0.43). CONCLUSIONS Although no difference was observed in the primary endpoint of CHAT (Packer composite score), UC+I significantly reduced the number of HF patients hospitalized among a rural and remote cohort. These data suggest that telephone support may be an efficacious approach to improve clinical outcomes in rural and remote HF patients. AIMS Chronic heart failure (CHF) patients are frequently rehospitalized within 6 months after an episode of fluid retention. Rehospitalizations are preventable, but this requires an extensive organization of the healthcare system. In this study, we tested whether intensive follow-up of patients through a telemonitoring-facilitated collaboration between general practitioners (GPs) and a heart failure clinic could reduce mortality and rehospitalization rate. METHODS AND RESULTS One hundred and sixty CHF patients [mean age 76 ± 10 years, 104 males, mean left ventricular ejection fraction (LVEF) 35 ± 15%] were block-randomized by sealed envelopes and assigned to 6 months of intense follow-up facilitated by telemonitoring (TM) or usual care (UC). The TM group measured body weight, blood pressure, and heart rate on a daily basis with electronic devices that transferred the data automatically to an online database.
Email alerts were sent to the GP and heart failure clinic to intervene when pre-defined limits were exceeded. All-cause mortality was significantly lower in the TM group as compared with the UC group (5% vs. 17.5%, P = 0.01). The total number of follow-up days lost to hospitalization, dialysis, or death was significantly lower in the TM group as compared with the UC group (13 vs. 30 days, P = 0.02). The number of hospitalizations for heart failure per patient showed a trend (0.24 vs. 0.42 hospitalizations/patient, P = 0.06) in favour of TM. CONCLUSION Telemonitoring-facilitated collaboration between GPs and a heart failure clinic reduces mortality and the number of days lost to hospitalization, death, or dialysis in CHF patients. These findings need confirmation in a large trial.
2,012
26,177,653
EBUS-TBNA appears to be an efficacious and safe procedure and should be used as an initial diagnostic tool for mediastinal TBLA
BACKGROUND Endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) has been widely used in the diagnosis of mediastinal lymphadenopathies. Here, we performed a systematic review and meta-analysis to explore the diagnostic value of EBUS-TBNA in mediastinal tuberculous lymphadenopathy (TBLA).
Introduction The Centers for Disease Control and Prevention has recommended using a nucleic acid amplification test (NAAT) for diagnosing pulmonary tuberculosis (TB), but there is a lack of data on NAAT cost-effectiveness. Methods We conducted a prospective cohort study that included all patients with an AFB smear-positive respiratory specimen at Grady Memorial Hospital in Atlanta, GA, USA between January 2002 and June 2008. We determined the sensitivity, specificity, and positive and negative predictive value of a commercially available and FDA-approved NAAT (amplified MTD, Gen-Probe) compared to the gold standard of culture. A cost analysis was performed and included costs related to laboratory tests, hospital charges, anti-TB medications, and contact investigations. Average cost per patient was calculated under two conditions: (1) using a NAAT on all AFB smear-positive respiratory specimens and (2) not using a NAAT. One-way sensitivity analyses were conducted to determine the sensitivity of the cost difference to reasonable ranges of model inputs. Results During a 6 1/2 year study period, there were 1,009 patients with an AFB smear-positive respiratory specimen at our public urban hospital. We found the NAAT to be highly sensitive (99.6%) and specific (99.1%) on AFB smear-positive specimens compared to culture. Overall, the positive predictive value (PPV) of an AFB smear-positive respiratory specimen for culture-confirmed TB was 27%. The PPV of an AFB smear-positive respiratory specimen for culture-confirmed TB was significantly higher for HIV-uninfected persons compared to those who were HIV-seropositive (152/271 [56%] vs. 85/445 [19%]; RR = 2.94, 95% CI 2.36–3.65, p<0.001). The cost savings of using the NAAT was $2,003 per AFB smear-positive case. Conclusions Routine use of the NAAT on AFB smear-positive respiratory specimens was highly cost-saving in our setting at a U.S.
urban public hospital with a high prevalence of TB and HIV because of the low PPV of an AFB smear for culture-confirmed TB. Background To investigate the ability of rESAT6 to identify different mycobacteria-sensitized guinea pigs and its safety in a preclinical and phase I clinical study. Material/Methods Guinea pigs were sensitized with different mycobacteria. After sensitization, all animals were intradermally injected with rESAT6 and either PPD or PPD-B. At 24 h after the injection, the erythema of the injection sites was measured using a double-blind method. For the preclinical safety study, different doses of rESAT6 and BSA were given 3 times intramuscularly to guinea pigs. On day 14 after the final immunization, the guinea pigs were intravenously injected with the same reagents in the hind legs and the allergic reactions were observed. A single-center, randomized, open phase I clinical trial was employed. The skin test was conducted in 32 healthy volunteers aged 19–65 years with 0.1 μg, 0.5 μg, and 1 μg rESAT6. Physical examination and laboratory tests were performed before and after the skin test and adverse reactions were monitored. The volunteers' local and systemic adverse reactions and adverse events were recorded for 7 days. Results Positive PPD or PPD-B skin tests were observed in all mycobacteria-sensitized guinea pigs; the diameters of erythema were all > 10 mm. The rESAT6 protein induced a positive skin test result in the guinea pigs sensitized with MTB, M. bovis, M. africanum, and M. kansasii; the diameters of erythema were 14.7±2.0, 9.3±3.8, 18.7±2.4, and 14.8±4.2 mm, respectively. A negative skin test result was detected in BCG-vaccinated and other NTM-sensitized guinea pigs. The rESAT6 caused no allergic symptoms, but many allergic reactions, such as cough, dyspnea, and even death, were observed in the guinea pigs who were administered BSA.
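The risk-ratio contrast reported in the NAAT cost-effectiveness cohort above (152/271 HIV-uninfected vs. 85/445 HIV-seropositive; RR = 2.94, 95% CI 2.36–3.65) can be reproduced directly from the published counts. A minimal check using the standard Katz log-method interval (the abstract does not state which CI method was used, so that choice is an assumption):

```python
import math

# Culture-confirmed TB among AFB smear-positive specimens, by HIV status
a, n1 = 152, 271   # HIV-uninfected: confirmed / total
b, n2 = 85, 445    # HIV-seropositive: confirmed / total

ppv1, ppv2 = a / n1, b / n2   # group-wise positive predictive values
rr = ppv1 / ppv2              # risk ratio

# Katz log-method 95% confidence interval for a risk ratio
se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
lo = math.exp(math.log(rr) - 1.96 * se)
hi = math.exp(math.log(rr) + 1.96 * se)

print(round(rr, 2), round(lo, 2), round(hi, 2))  # 2.94 2.36 3.65
```

The recomputed interval matches the published 2.36–3.65 to two decimals, which suggests the authors used this (or a very similar) log-scale approximation.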
During the phase I clinical trial, no adverse reactions were found in the 0.1 μg rESAT6 group, but in the 0.5 μg rESAT6 group 2 volunteers reported pain and 1 reported itching, and in the 1 μg rESAT6 group there was 1 case of pain, 1 case of itching, and 1 case of blister. No other local or systemic adverse reactions or events were reported. Conclusions The rESAT6 can differentiate effectively among MTB infection, BCG vaccination, and NTM infection and is safe in healthy volunteers. Background This study aimed to determine the efficacy and safety of recombinant Mycobacterium tuberculosis ESAT-6 protein for diagnosis of pulmonary tuberculosis (TB). Material/Methods A phase II trial was performed in 158 patients with pulmonary TB (145 initially treated and 13 re-treated) and 133 healthy subjects. Skin testing was carried out by injecting purified protein derivative (PPD) (on the left forearm) or recombinant ESAT-6 protein at a dosage of 2, 5, or 10 μg/mL (on the right forearm) in each subject. Reaction activity and adverse events were monitored at 24, 48, and 72 h following the injection. Receiver operating characteristic curves were plotted to determine the areas under the curves (AUCs) and the cut-off induration diameters for the optimal diagnostic performance. Results The reaction activity was significantly increased upon recombinant ESAT-6 injection in pulmonary TB patients compared with healthy subjects. In pulmonary TB patients, the reaction was dose-dependent, and at 48 h, 10 μg/mL recombinant ESAT-6 produced a reaction similar to that produced by PPD. The AUCs for a 10 μg/mL dosage were 0.9823, 0.9552, and 0.9266 for 24 h, 48 h, and 72 h, respectively, and induration diameters of 4.5–5.5 mm were the optimal trade-off values between true positive rates and false positive rates. No serious adverse events occurred in any subjects.
Conclusions Recombinant ESAT-6 protein is efficacious and safe for diagnosing pulmonary TB. Based on the reaction, performance, safety, and practicability, we recommend that 10 μg/mL at 48 h with an induration cut-off value of 5.0 mm be used. OBJECTIVE The goal of this study was to determine the optimal number of aspirations per lymph node (LN) station during endobronchial ultrasound (EBUS)-guided transbronchial needle aspiration (TBNA) for maximum diagnostic yield in mediastinal staging of non-small cell lung cancer (NSCLC) in the absence of rapid on-site cytopathologic examination. METHODS EBUS-TBNA was performed in potentially operable NSCLC patients with mediastinal LNs accessible by EBUS-TBNA (5 to 20 mm). Every target LN station was punctured four times. RESULTS We performed EBUS-TBNA in 163 mediastinal LN stations in 102 NSCLC patients. EBUS-TBNA confirmed malignancy in 41 LN stations in 30 patients. Two malignant LN stations were missed in two patients. The sensitivity, specificity, positive predictive value, negative predictive value (NPV), and accuracy of EBUS-TBNA in predicting mediastinal metastasis were 93.8%, 100%, 100%, 96.9%, and 97.9%, respectively. Sample adequacy was 90.1% for one aspiration, and it reached 100% for three aspirations. The sensitivity for differentiating malignant from benign LN stations was 69.8%, 83.7%, 95.3%, and 95.3% for one, two, three, and four aspirations, respectively. The NPV was 86.5%, 92.2%, 97.6%, and 97.6% for one, two, three, and four aspirations, respectively. Maximum diagnostic values were achieved in three aspirations. When at least one tissue core was obtained by the first or second aspiration, the sensitivity and NPV of the first two aspirations were 91.9% and 96.0%, respectively. CONCLUSIONS Optimal results can be obtained in three aspirations per LN station in EBUS-TBNA for mediastinal staging of potentially operable NSCLC.
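The per-patient operating characteristics in the aspiration-number study above (sensitivity 93.8%, specificity 100%, PPV 100%, NPV 96.9%, accuracy 97.9%) are consistent with a simple 2×2 table. A sketch that reproduces them, where TP = 30 (patients with malignancy confirmed) and FN = 2 (patients missed) come from the abstract, but TN = 63 is an inferred value not stated explicitly:

```python
# Per-patient 2x2 table implied by the EBUS-TBNA staging abstract.
# TP and FN are taken from the abstract; FP = 0 follows from the reported
# 100% specificity/PPV; TN = 63 is an inference that reproduces the
# published accuracy, not a number stated in the abstract.
tp, fn, fp, tn = 30, 2, 0, 63

sens = tp / (tp + fn)                   # sensitivity
spec = tn / (tn + fp)                   # specificity
ppv = tp / (tp + fp)                    # positive predictive value
npv = tn / (tn + fn)                    # negative predictive value
acc = (tp + tn) / (tp + tn + fp + fn)   # overall accuracy

print([round(100 * x, 1) for x in (sens, spec, ppv, npv, acc)])
# [93.8, 100.0, 100.0, 96.9, 97.9]
```

All five recomputed values match the published figures to one decimal, supporting the reading that the headline metrics are per patient (the per-station sensitivity, 41/43 = 95.3%, appears separately in the aspiration-count breakdown).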
When at least one tissue core specimen is obtained by the first or second aspiration, two aspirations per LN station can be acceptable. RATIONALE Patients with isolated mediastinal lymphadenopathy (IML) are a common presentation to physicians, and mediastinoscopy is traditionally considered the "gold standard" investigation when a pathological diagnosis is required. Endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) is established as an alternative to mediastinoscopy in patients with lung cancer. OBJECTIVE To determine the efficacy and health care costs of EBUS-TBNA as an alternative initial investigation to mediastinoscopy in patients with isolated IML. METHODS Prospective multicenter single-arm clinical trial of 77 consecutive patients with IML from 5 centers between April 2009 and March 2011. All patients underwent EBUS-TBNA. If EBUS-TBNA did not provide a diagnosis, then participants underwent mediastinoscopy. MEASUREMENTS AND MAIN RESULTS EBUS-TBNA prevented 87% of mediastinoscopies (95% confidence interval [CI], 77-94%; P < 0.001) but failed to provide a diagnosis in 10 patients (13%), all of whom underwent mediastinoscopy. The sensitivity and negative predictive value of EBUS-TBNA in patients with IML were 92% (95% CI, 83-95%) and 40% (95% CI, 12-74%), respectively. One patient developed a lower respiratory tract infection after EBUS-TBNA, requiring inpatient admission. The cost of the EBUS-TBNA procedure per patient was £1,382 ($2,190). The mean cost of the EBUS-TBNA strategy was £1,892 ($2,998) per patient, whereas a strategy of mediastinoscopy alone was significantly more costly at £3,228 ($5,115) per patient (P < 0.001). The EBUS-TBNA strategy is less costly than mediastinoscopy if the cost per EBUS-TBNA procedure is less than £2,718 ($4,307) per patient.
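In the IML trial above, the headline 87% is the fraction of the 77 patients whose EBUS-TBNA result made mediastinoscopy unnecessary. A back-of-envelope reconstruction; note the expected-cost line is a deliberate simplification using only the two published per-procedure costs, so it undershoots the trial's full mean of £1,892, which also counted items such as inpatient admissions:

```python
# 77 patients enrolled; EBUS-TBNA was non-diagnostic in 10, who then
# required mediastinoscopy.
n, needed_med = 77, 10
prevented = (n - needed_med) / n
print(round(100 * prevented))  # 87  (reported exact 95% CI: 77-94%)

# Simplified expected cost per patient of the EBUS-first strategy:
# everyone gets EBUS-TBNA, and the non-diagnostic fraction also gets
# mediastinoscopy. Per-procedure costs are the published GBP figures.
ebus_cost, med_cost = 1382, 3228
ebus_strategy = ebus_cost + (needed_med / n) * med_cost
print(round(ebus_strategy))    # 1801, well below mediastinoscopy-only 3228
```

Even this stripped-down costing shows the EBUS-first strategy costing roughly half the mediastinoscopy-only strategy, consistent with the trial's conclusion that EBUS-TBNA is cost-saving as the initial investigation.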
CONCLUSIONS EBUS-TBNA is a safe, highly sensitive, and cost-saving initial investigation in patients with IML. Clinical trial registered with ClinicalTrials.gov (NCT00932854). Background The tuberculin skin test (TST) has been used for years as an aid in diagnosing latent tuberculosis infection (LTBI), but it suffers from a number of well-documented performance and logistic problems. The Quantiferon-TB Gold In-Tube test (QFT-GIT) has been reported to have better sensitivity and specificity than TST. In this study, we aimed to compare the performance of a commercial IFN-γ release assay (QFT-GIT) with TST in the diagnosis of HCWs at risk for latent TB infection in a BCG-vaccinated population. Material/Methods One hundred healthy volunteer health care workers were enrolled. All were subjected to TST and QFT-GIT. Results were compared among health care worker (HCW) groups in terms of profession, workplace, and working duration. Results TST is affected by previous BCG vaccinations, and the number of cases with QFT-GIT positivity increased in accordance with the TST induration diameter range. The QFT-GIT result was negative in 17 of 32 TST-positive (≥15 mm) cases and positive in 4 of 61 cases whose TST diameters were between 6–14 mm, which is attributable to previous BCG vaccination(s). It was negative in all cases with TST diameters between 0–5 mm. HCWs with positive QFT-GIT results were significantly older than the ones with negative results. Furthermore, duration of work was significantly longer in QFT-GIT-positive than in negative HCWs. Conclusions There was a moderate concordance between QFT-GIT and TST, when the TST result was defined as positive with a ≥15 mm diameter of induration.
We suggest that QFT-GIT can be used as an alternative to TST for detection of LTBI, especially in groups with high risk of LTBI and in populations with a routine BCG vaccination program. STUDY OBJECTIVES Although various techniques are available for obtaining pathology specimens from the mediastinal lymph nodes, including conventional bronchoscopic transbronchial needle aspiration (TBNA), transesophageal ultrasonography-guided needle aspiration, and mediastinoscopy, there are limitations to these techniques, which include low yield, poor access, need for general anesthesia, or complications. To overcome these problems, we undertook the current study to evaluate the clinical utility of the newly developed ultrasound puncture bronchoscope to visualize and perform real-time TBNA of the mediastinal and hilar lymph nodes under direct endobronchial ultrasonography (EBUS) guidance. DESIGN Prospective patient enrollment. SETTING University teaching hospital. PATIENTS From March 2002 to September 2003, 70 patients were included in the study. INTERVENTIONS The new convex probe (CP) EBUS is integrated with a convex scanning probe on its tip with a separate working channel, thus permitting real-time EBUS-guided TBNA. The indications for CP-EBUS were the diagnosis of mediastinal and/or hilar lymphadenopathy for known or suspected malignancy. Lymph nodes and the surrounding vessels were first visualized with CP-EBUS using the Doppler mode. The dimensions of the lymph nodes were recorded, followed by real-time TBNA under direct EBUS guidance. Final diagnosis was based on cytology, surgical results, and/or clinical follow-up. RESULTS All lymph nodes that were detected on the chest CT scan could be visualized using CP-EBUS. In 70 patients, CP-EBUS-guided TBNA was performed to obtain samples from mediastinal lymph nodes (58 nodes) and hilar lymph nodes (12 nodes).
The sensitivity , specificity , and accuracy of CP-EBUS-guided TBNA in distinguishing benign from malignant lymph nodes were 95.7 % , 100 % , and 97.1 % , respectively . The procedure was uneventful , and there were no complications . CONCLUSIONS Real-time CP-EBUS-guided TBNA of mediastinal and hilar lymph nodes is a novel approach that is safe and has a good diagnostic yield . This new ultrasound puncture bronchoscope has an excellent potential for assisting in safe and accurate diagnostic interventional bronchoscopy BACKGROUND / PURPOSE Isolated intrathoracic lymphadenopathy ( IT-LAP ) is clinically challenging because of the difficult anatomic location and wide range of associated diseases , including tuberculosis ( TB ) . Although sampling via endobronchial ultrasound-guided transbronchial needle aspiration ( EBUS-TBNA ) for histopathology is a major development , there is still room for improvement . This study aimed to investigate an algorithmic approach driven by EBUS-TBNA and conventional bronchoscopy to streamline the management of IT-LAP . METHODS Eighty-three prospectively enrolled patients with IT-LAP were subjected to an EBUS-TBNA diagnostic panel test ( histopathology , cytology , and microbiology ) and underwent conventional bronchoscopy for bronchoalveolar lavage . The results were structured into an algorithmic approach to direct patient treatment , workup , or follow-up . RESULTS The diagnostic yields of EBUS-TBNA based on histopathology were similar for each disease entity : 77.8 % for malignancy , 70.0 % for TB , 75.0 % for sarcoidosis , 80.0 % for anthracosis , and 70.0 % for lymphoid hyperplasia ( p = 0.96 ) . The incidence of malignancy was 10.8 % for total IT-LAP patients , and 12.0 % and 33.7 % for patients with TB and sarcoidosis , respectively . Thirty-five ( 42.2 % ) patients were symptomatic . 
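The sensitivity , specificity , and accuracy figures quoted throughout these abstracts derive from the standard 2×2 confusion matrix . A minimal sketch of that arithmetic , with hypothetical per-node counts chosen only to illustrate the calculation ( the abstracts do not report the underlying tables ) :

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic-test metrics from a 2x2 confusion matrix."""
    total = tp + fp + fn + tn
    return {
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "accuracy": (tp + tn) / total,
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts for 70 cases: 45 true positives, 2 false
# negatives, 23 true negatives, no false positives.
m = diagnostic_metrics(tp=45, fp=0, fn=2, tn=23)
```

With these illustrative counts , sensitivity is 45/47 ≈ 95.7 % , specificity 100 % , and accuracy 68/70 ≈ 97.1 % , matching the reported pattern arithmetically ; the study's actual per-node table is not given in the abstract .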
The leading diagnosis was sarcoidosis ( 60 % ) , followed by TB ( 20 % ) , malignancy ( 11.4 % ) , lymphoid hyperplasia ( 5.7 % ) , and anthracosis ( 2.9 % ) . By logistic regression analysis , granulomatous disease ( odds ratio : 13.45 ; 95 % confidence interval : 4.45 - 40.67 , p < 0.001 ) was an independent predictor of symptoms . Seven ( 8.4 % ) IT-LAP patients diagnosed with active TB and three ( 3.6 % ) with findings suggestive of TB and a household contact history were all placed on anti-TB treatment . CONCLUSION The algorithmic approach streamlines patient management . It enables early detection of malignancy , correctly places nonmalignant patients on an appropriate treatment regimen , and particularly identifies candidates at high risk of TB reactivation for anti-TB chemoprophylaxis BACKGROUND AND STUDY AIMS Patients with suspected tuberculosis without pulmonary lesions and with mediastinal lymphadenopathy often pose a diagnostic challenge . Endoscopic ultrasound (EUS)-guided fine-needle aspiration ( FNA ) cytology is an established modality to evaluate mediastinal and abdominal lesions . The aim of the present study was to evaluate the role of EUS-FNA in isolated mediastinal lymphadenopathy in patients suspected of having tuberculosis . METHODS Consecutive patients suspected of having tuberculosis with isolated mediastinal lymphadenopathy were included in a prospective study . Mediastinal lymphadenopathy was diagnosed on a contrast-enhanced computed tomography scan of the chest . Patients with concomitant lung parenchymal lesions were excluded . Previous attempts to diagnose the etiology of lymphadenopathy had failed in 69 % of patients . EUS-FNA was performed on an outpatient basis under conscious sedation . The sensitivity , specificity , and diagnostic accuracy of EUS-FNA were calculated . RESULTS A total of 60 consecutive patients ( mean age 39.8 years , 58 % males ) with mediastinal lymphadenopathy were included . 
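The odds ratio and confidence interval in the logistic-regression result above ( 13.45 , 95 % CI 4.45–40.67 ) come from exponentiating the fitted log-odds coefficient and its Wald interval . A sketch of that relationship ; the coefficient and standard error below are back-calculated illustrations , not values reported in the paper :

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio with Wald 95% CI from a logistic-regression
    coefficient (log-odds scale) and its standard error."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative values that approximately reproduce OR 13.45 (4.45-40.67)
or_, lo, hi = odds_ratio_ci(beta=2.599, se=0.5645)
```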
EUS confirmed the presence of mediastinal lymph nodes ranging in size from 8 mm to 40 mm ( mean 26 mm ) in all patients . EUS-FNA provided an adequate tissue sample in 54 patients during the first examination and repeat EUS-FNA was necessary in six patients . A final diagnosis was obtained by EUS-FNA in 42 patients ( tuberculosis in 32 , sarcoidosis in six , and Hodgkin's disease in four patients ) . An additional 14 patients were treated for tuberculosis based on EUS-FNA and clinical features . Mediastinoscopy was required for diagnosis in the remaining four patients . EUS-FNA had an overall diagnostic yield of 93 % , sensitivity of 71 % , specificity of 100 % , and positive predictive value of 100 % . CONCLUSION EUS-FNA is an accurate , safe , and minimally invasive modality for evaluating isolated mediastinal lymphadenopathy in patients suspected of having tuberculosis in an endemic area with a high prevalence of tuberculosis BACKGROUND Endobronchial ultrasound-guided transbronchial needle aspiration ( EBUS-TBNA ) is a minimally invasive procedure that has enabled mediastinal and hilar lymph node assessment with a high sensitivity , but its role in the diagnosis of intrathoracic tuberculosis ( TB ) has not been established . METHODS We prospectively studied 59 patients suspected of having TB with thoracic lymph node lesions or intrapulmonary lesions accessible by EBUS-TBNA at a clinical center for thoracic medicine from January 2010 to December 2011 . Bronchoscopic findings , EBUS-TBNA procedures , pathologic findings , and microbiologic results were recorded . RESULTS Of 59 eligible patients , 41 patients had TB , 5 had lung cancer , 7 had inflammation , and 6 had sarcoidosis . Sensitivity was 85 % , specificity was 100 % , positive and negative predictive values were 100 % and 75 % , respectively , and accuracy was 90 % by EBUS-TBNA for TB . 
Pathologic findings were consistent with TB in 80 % of patients ( 33 of 41 ) , and in 27 % ( 11 of 41 ) the smear was positive . A total of 37 patients with TB had cultures , of whom 17 ( 46 % ) were positive . There were 80 mediastinal and hilar lymph nodes and 5 intrapulmonary lesions that were biopsied in the 41 patients with TB . Multivariate logistic regression revealed that short-axis diameter was an independent risk factor associated with positive pathology , smear , and culture ( p < 0.05 ) . Additionally , pathology showing necrosis was an independent risk factor associated with a positive culture . CONCLUSIONS Endobronchial ultrasound-guided transbronchial needle aspiration has a high diagnostic yield in the investigation of suspected intrathoracic TB by means of aspiration of intrathoracic lymph nodes and tracheobronchial wall-adjacent lung lesions OBJECTIVE To evaluate the diagnostic yield of bronchoscopy and mediastinoscopy in adults with isolated mediastinal tuberculous lymphadenitis and to assess the effect of antituberculous treatment . DESIGN Prospective longitudinal cohort study of 34 patients with mediastinal tuberculous lymphadenitis followed for 6 to 19 months after completion of treatment . SETTING Tertiary care hospital , Kuwait . PATIENTS 34 consecutive patients who presented with isolated mediastinal lymphadenopathy from 1996 to 1998 . INTERVENTIONS Bronchoscopy and cervical mediastinoscopy for all patients . MAIN OUTCOME MEASURES Diagnostic yield of bronchoscopy and mediastinoscopy , and the outcome of treatment in patients with tuberculous lymphadenopathy . RESULTS The mean age was 35 years ( range 15 - 58 ) . The most common symptoms were cough , fever , and weight loss . The chest radiographs and computed tomograms showed abnormal mediastinal shadows with no evidence of parenchymal disease . All patients had right sided paratracheal lymphadenopathy . Tuberculin skin test gave a weal of > 15 mm in 17 patients ( 50 % ) . 
Sputum smears and cultures failed to grow acid-fast bacilli in any patient . Seven patients had an endobronchial abnormality and samples taken at bronchoscopy gave a definite diagnosis in 3 ( 9 % ) . Paratracheal lymph node biopsy and culture by mediastinoscopy diagnosed tuberculosis in all cases . All patients were treated by a six-month course of rifampicin and isoniazid supplemented initially by pyrazinamide for two months . Twenty-eight patients had a good response and the remaining patients were treated for a further 3 months . CONCLUSIONS Bronchoscopy has a low diagnostic yield in mediastinal tuberculous lymphadenopathy in the absence of a parenchymal lesion . Mediastinoscopy is a safe but invasive procedure and provides a tissue diagnosis in most cases . Six months of treatment with rifampicin and isoniazid supplemented initially by pyrazinamide is adequate treatment for most adults with tuberculous mediastinal lymphadenopathy STUDY OBJECTIVE Our group performed a randomized trial to assess whether the addition of endobronchial ultrasound ( EBUS ) guidance will lead to better results than standard transbronchial needle aspiration ( TBNA ) . EBUS guidance seems to be beneficial in increasing the yield of TBNA but has not been proven to be superior to conventional procedures in a randomized trial . METHODS Consecutive patients who were referred for TBNA were randomized to an EBUS-guided and a conventional TBNA arm . Patients with subcarinal lymph nodes were randomized and analyzed separately ( group A ) from all other stations ( group B ) . A positive result was defined as either lymphocytes or a specific abnormality on cytology . RESULTS Two hundred patients were examined ( 100 patients each in groups A and B ) . Half of the patients underwent EBUS-guided TBNA rather than conventional TBNA . In group A , the yield of conventional TBNA was 74 % compared to 86 % in the EBUS group ( difference not significant ) . 
In group B , the overall yields were 58 % and 84 % , respectively . This difference was statistically highly significant ( p < 0.001 ) . The average number of passes was four . CONCLUSION EBUS guidance significantly increases the yield of TBNA in all stations except in the subcarinal region . It should be considered to be a routine adjunct to TBNA . On-site cytology may be unnecessary , and the number of needle passes required is low OBJECTIVE The study objective was to compare endobronchial ultrasound-guided transbronchial needle aspiration ( EBUS-TBNA ) with mediastinoscopy for mediastinal lymph node staging of potentially resectable non-small cell lung cancer . METHODS Patients with confirmed or suspected non-small cell lung cancer who required mediastinoscopy to determine suitability for lung cancer resection were entered into the trial . All patients underwent EBUS-TBNA followed by mediastinoscopy under general anesthesia . If both were negative for N2 or N3 disease , the patient underwent pulmonary resection and mediastinal lymphadenectomy . RESULTS Between July 2006 and August 2010 , 190 patients were registered in the study , 159 enrolled , and 153 were eligible for analysis . EBUS-TBNA and mediastinoscopy sampled an average of 3 and 4 lymph node stations per patient , respectively . The mean short axis of the lymph node biopsied by EBUS-TBNA was 6.9 ± 2.9 mm . The prevalence of N2/N3 disease was 35 % ( 53/153 ) . There was excellent agreement between EBUS-TBNA and mediastinoscopy for mediastinal staging in 136 patients ( 91 % ; Kappa , 0.8 ; 95 % confidence interval , 0.7 - 0.9 ) . Specificity and positive predictive value for both techniques were 100 % . The sensitivity , negative predictive value , and diagnostic accuracy for mediastinal lymph node staging for EBUS-TBNA and mediastinoscopy were 81 % , 91 % , 93 % , and 79 % , 90 % , 93 % , respectively . 
No significant differences were found between EBUS-TBNA and mediastinoscopy in determining the true pathologic N stage ( McNemar's test , P = .78 ) . There were no complications from EBUS-TBNA . Minor complications from mediastinoscopy were observed in 4 patients ( 2.6 % ) . CONCLUSIONS EBUS-TBNA and mediastinoscopy achieve similar results for the mediastinal staging of lung cancer . As performed in this study , EBUS-TBNA can replace mediastinoscopy in patients with potentially resectable non-small cell lung cancer
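Two statistics in this comparison of EBUS-TBNA with mediastinoscopy — the Kappa agreement and the McNemar test on their paired N-stage calls — are computed from a 2×2 table of the two procedures' results per patient . A hedged sketch of both ; the table entries below are hypothetical , since the abstract reports only the summary statistics :

```python
from math import comb

def cohens_kappa(a, b, c, d):
    """Cohen's kappa for two raters: a = both positive, d = both
    negative, b and c = the two discordant cells."""
    n = a + b + c + d
    po = (a + d) / n                                       # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2  # chance agreement
    return (po - pe) / (1 - pe)

def mcnemar_exact_p(b, c):
    """Exact two-sided McNemar p-value: a binomial sign test on the
    discordant pairs b and c."""
    n, k = b + c, min(b, c)
    p = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * p)
```

For example , a hypothetical table with 136 of 153 concordant calls ( a=34 , b=9 , c=8 , d=102 ) yields a kappa of about 0.72 , in the same range as the reported 0.8 ( 95 % CI 0.7-0.9 ) .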
2,013
24,967,206
As major adverse events , the differences in the numbers of eye pain , endophthalmitis , hypertension , and arterial thromboembolic events were not significant between the two groups , and the incidence of serious adverse events in both groups was very low . For the maintenance of vision , the comparison of the combination of ranibizumab with PDT vs ranibizumab monotherapy shows no apparent difference . Compared with the combination of ranibizumab and PDT , patients treated with ranibizumab monotherapy may gain more visual acuity ( VA ) improvement . The combination treatment group had a tendency to reduce the number of ranibizumab retreatments . Both treatment strategies were well tolerated
AIM To compare the efficacy and safety of the combination of ranibizumab with photodynamic therapy ( PDT ) vs ranibizumab monotherapy in the treatment of age-related macular degeneration ( AMD ) .
PURPOSE To determine if photodynamic therapy with verteporfin ( Visudyne ; Novartis AG , Bülach , Switzerland ) , termed verteporfin therapy , can safely reduce the risk of vision loss compared with a placebo ( with sham treatment ) in patients with subfoveal choroidal neovascularization caused by age-related macular degeneration who were identified with a lesion composed of occult with no classic choroidal neovascularization , or with presumed early onset classic choroidal neovascularization with good visual acuity letter score . METHODS This was a double-masked , placebo-controlled ( sham treatment ) , randomized , multicenter clinical trial involving 28 ophthalmology practices in Europe and North America . The study population was patients with age-related macular degeneration , with subfoveal choroidal neovascularization lesions measuring no greater than 5400 microm in greatest linear dimension with either 1 ) occult with no classic choroidal neovascularization , best-corrected visual acuity score of at least 50 ( Snellen equivalent approximately 20/100 ) , and evidence of hemorrhage or recent disease progression ; or 2 ) evidence of classic choroidal neovascularization with a best-corrected visual acuity score of at least 70 ( better than a Snellen equivalent of approximately 20/40 ) ; assigned randomly ( 2:1 ) to verteporfin therapy or placebo therapy . Verteporfin ( 6 mg per square meter of body surface area ) or placebo ( 5 % dextrose in water ) was administered by means of intravenous infusion of 30 ml over 10 minutes . Fifteen minutes after the start of the infusion , a laser light at 689 nm delivered 50 J/cm(2 ) by application of an intensity of 600 mW/cm(2 ) over 83 seconds using a spot size with a diameter 1000 microm larger than the greatest linear dimension of the choroidal neovascularization lesion on the retina . 
At follow-up examinations every 3 months , retreatment with the same regimen was applied if angiography showed fluorescein leakage . The main outcome measure was at least moderate vision loss , that is , a loss of at least 15 letters ( approximately 3 lines ) , adhering to an intent-to-treat analysis with the last observation carried forward to impute for missing data . RESULTS Two hundred ten ( 93 % ) and 193 ( 86 % ) of the 225 patients in the verteporfin group compared with 104 ( 91 % ) and 99 ( 87 % ) of the 114 patients in the placebo group completed the month 12 and 24 examinations , respectively . On average , verteporfin-treated patients received five treatments over the 24 months of follow-up . The primary outcome was similar for the verteporfin-treated and the placebo-treated eyes through the month 12 examination , although a number of secondary visual and angiographic outcomes significantly favored the verteporfin-treated group . Between the month 12 and 24 examinations , the treatment benefit grew so that by the month 24 examination , the verteporfin-treated eyes were less likely to have moderate or severe vision loss . Of the 225 verteporfin-treated patients , 121 ( 54 % ) compared with 76 ( 67 % ) of 114 placebo-treated patients lost at least 15 letters ( P = .023 ) . Likewise , 67 of the verteporfin-treated patients ( 30 % ) compared with 54 of the placebo-treated patients ( 47 % ) lost at least 30 letters ( P = .001 ) . Statistically significant results favoring verteporfin therapy at the month 24 examination were consistent between the total population and the subgroup of patients with a baseline lesion composition identified as occult choroidal neovascularization with no classic choroidal neovascularization . This subgroup included 166 of the 225 verteporfin-treated patients ( 74 % ) and 92 of the 114 placebo-treated patients ( 81 % ) . 
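The intent-to-treat analysis above imputes missing follow-up data by carrying the last observation forward ( LOCF ) . A minimal sketch of that imputation ; the visit values are hypothetical :

```python
def locf(values):
    """Last observation carried forward: each missing visit (None)
    takes the most recent observed value; leading gaps stay None."""
    out, last = [], None
    for v in values:
        if v is not None:
            last = v
        out.append(last)
    return out

# Visual-acuity letter scores at successive visits; the third visit
# was missed and is imputed from the second (numbers illustrative).
filled = locf([55, 52, None, 48])
```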
In these patients , 91 of the verteporfin-treated group ( 55 % ) compared with 63 of the placebo-treated group ( 68 % ) lost at least 15 letters ( P = .032 ) , whereas 48 of the verteporfin-treated group ( 29 % ) and 43 of the placebo-treated group ( 47 % ) lost at least 30 letters ( P = .004 ) . Other secondary outcomes , including visual acuity letter score worse than 34 ( approximate Snellen equivalent of 20/200 or worse ) , mean change in visual acuity letter score , development of classic choroidal neovascularization , progression of classic choroidal neovascularization and size of lesion , favored the verteporfin-treated group at both the month 12 and month 24 examination for both the entire study group and the subgroup of cases with occult with no classic choroidal neovascularization at baseline . Subgroup analyses of lesions composed of occult with no classic choroidal neovascularization at baseline suggested that the treatment benefit was greater for patients with either smaller lesions ( 4 disc areas or less ) or lower levels of visual acuity ( letter score less than 65 , an approximate Snellen equivalent of 20/50(-1 ) or worse ) at baseline . Prospectively planned multivariable analyses confirmed that these two baseline variables affected the magnitude of treatment benefit . ( ABSTRACT TRUNCATED ) OBJECTIVE The 2-year , phase III trial designated Anti-vascular endothelial growth factor ( VEGF ) Antibody for the Treatment of Predominantly Classic Choroidal Neovascularization ( CNV ) in Age-related Macular Degeneration ( ANCHOR ) compared ranibizumab with verteporfin photodynamic therapy ( PDT ) in treating predominantly classic CNV . DESIGN Multicenter , international , randomized , double-masked , active-treatment-controlled clinical trial . PARTICIPANTS Patients with predominantly classic , subfoveal CNV not previously treated with PDT or antiangiogenic drugs . 
INTERVENTION Patients were randomized 1:1:1 to verteporfin PDT plus monthly sham intraocular injection or to sham verteporfin PDT plus monthly intravitreal ranibizumab ( 0.3 mg or 0.5 mg ) injection . The need for PDT ( active or sham ) retreatment was evaluated every 3 months using fluorescein angiography ( FA ) . MAIN OUTCOME MEASURES The primary , intent-to-treat efficacy analysis was at 12 months , with continued measurements to month 24 . Key measures included the percentage losing < 15 letters from baseline visual acuity ( VA ) score ( month 12 primary efficacy outcome measure ) , percentage gaining > or=15 letters from baseline , and mean change over time in VA score and FA-assessed lesion characteristics . Adverse events were monitored . RESULTS Of 423 patients ( 143 PDT , 140 each in the 2 ranibizumab groups ) , the majority ( > or=77 % in each group ) completed the 2-year study . Consistent with results at month 12 , at month 24 the VA benefit from ranibizumab was statistically significant ( P<0.0001 vs. PDT ) and clinically meaningful : 89.9 % to 90.0 % of ranibizumab-treated patients had lost < 15 letters from baseline ( vs. 65.7 % of PDT patients ) ; 34 % to 41.0 % had gained > or=15 letters ( vs. 6.3 % of PDT group ) ; and , on average , VA was improved from baseline by 8.1 to 10.7 letters ( vs. a mean decline of 9.8 letters in PDT group ) . Changes in lesion anatomic characteristics on FA also favored ranibizumab ( all comparisons P<0.0001 vs. PDT ) . Overall , there was no imbalance among groups in rates of serious ocular and nonocular adverse events . In the pooled ranibizumab groups , 3 of 277 ( 1.1 % ) patients developed presumed endophthalmitis in the study eye ( rate per injection = 3/5921 [ 0.05 % ] ) . CONCLUSIONS In this 2-year study , ranibizumab provided greater clinical benefit than verteporfin PDT in patients with age-related macular degeneration with new-onset , predominantly classic CNV . Rates of serious adverse events were low . 
FINANCIAL DISCLOSURE(S ) Proprietary or commercial disclosure may be found after the references Purpose This prospective multi-center pilot study compares the use of half-fluence photodynamic therapy combined with ranibizumab with ranibizumab monotherapy for the treatment of neovascular age-related macular degeneration . Methods All patients presenting with untreated subfoveal neovascular age-related macular degeneration were considered for inclusion . Patients were randomized to receive either ranibizumab with half-fluence photodynamic therapy or ranibizumab alone . Patients in the ranibizumab alone group were given three consecutive monthly ranibizumab injections and were followed monthly . They were treated with ranibizumab as needed , based on clinical discretion , using vision and optical coherence tomography . Patients in the combined group were given one same-day combined ranibizumab and half-fluence ( 25 j/cm2 ) photodynamic therapy treatment and were treated monthly as needed . Outcomes included changes in standardized visual acuity , optical coherence tomography foveal thickness , and percentage of as-needed injections to maintenance examinations . Results Fifty-six out of 60 enrolled patients completed the twelve month primary outcome visit ; this consisted of 27 patients receiving ranibizumab alone and 29 receiving combined treatment . The average age was 79.1 for the ranibizumab alone group and 79.3 for the combined group . The mean visual acuity in the ranibizumab alone group improved from 52.9 Early Treatment of Diabetic Retinopathy letters initially to 62.8 letters at twelve months . The mean visual acuity in the combined group improved from 49.2 letters to 51.8 letters at twelve months . The differences in visual acuity improvements were not statistically significant based on a two-tailed t-test ( P = 0.2 ) . Due to the presence of outliers in each group , a Mann – Whitney U test was performed to confirm the results ( U = 325 ; P = 0.28 ) . 
The mean optical coherence tomography foveal thickness improved 92.5 microns and 106.7 microns in the ranibizumab alone and the combined group , respectively . The difference was not significant based on a two-tailed t-test ( P = 0.6 ) . The ranibizumab alone group received an average of 6.8 injections , while the combined group received an average of three injections . This difference was not significant based on a chi-square test ( P = 0.11 ) . Conclusion The groups appeared similar based on statistical analysis , but larger studies are needed to determine possible small differences between combination therapy and monotherapy Purpose Combination verteporfin photodynamic therapy ( vPDT ) and antivascular endothelial growth factor ( anti-VEGF ) therapy may decrease the need for injections while maintaining visual acuity in exudative age-related macular degeneration . This pilot study was designed to determine the threshold fluence dose of vPDT ( the dose required to demonstrate an effect on choroidal perfusion ) combined with ranibizumab . Methods Seven patients were randomized to sham vPDT ( two patients ) , 20 % fluence vPDT ( two patients ) , or 40 % fluence vPDT ( three patients ) in combination with three-monthly intravitreal 0.5 mg ranibizumab injections . Intravitreal ranibizumab was reinjected if disease activity was seen on fluorescein angiography , optical coherence tomography , or clinical examination . Indocyanine green-determined choroidal hypoperfusion was graded in a masked fashion . Results Patients with 20 % vPDT had mild hypoperfusion defects at seven days that resolved by week 4 ( threshold dose ) ; patients with 40 % fluence vPDT had marked hypoperfusion at seven days that persisted as long as 12 months . Recruitment was stopped after limited efficacy was observed . One patient with 20 % fluence vPDT lost 19 letters at one year ; no other patient lost or gained > 10 letters . 
Central retinal thickness decreased in six of seven patients , but ranibizumab injections did not decrease . Conclusion This pilot study shows that the threshold fluence dose of vPDT ( when combined with ranibizumab ) is approximately 20 % standard fluence , and that mild and transient choroidal hypoperfusion can occur . Forty percent fluence vPDT causes a more prolonged and striking hypoperfusion . Despite hypoperfusion , no decrease in visual acuity or injections required was noted , suggesting that even higher fluence levels of vPDT may be necessary to decrease the number of anti-VEGF injections OBJECTIVE To report 24-month vision and fluorescein angiographic outcomes from trials evaluating photodynamic therapy with verteporfin ( Visudyne ; CIBA Vision Corp , Duluth , Ga ) in patients with subfoveal choroidal neovascularization ( CNV ) caused by age-related macular degeneration ( AMD ) . DESIGN Two multicenter , double-masked , placebo-controlled , randomized clinical trials . SETTING Twenty-two ophthalmology practices in Europe and North America . PARTICIPANTS Patients with subfoveal CNV lesions caused by AMD with greatest linear dimension on the retina measuring 5400 micrometer or less , with evidence of classic CNV and best-corrected visual acuity ( approximate Snellen equivalent ) between 20/40 and 20/200 . METHODS The methods were similar to those described in our 1-year results , with follow-up examinations beyond 1 year continuing every 3 months ( except for Photograph Reading Center evaluations , which occurred only at month 18 and month 24 examinations ) . During the second year , the same regimen ( with verteporfin or placebo as applied at baseline ) was used if angiography showed fluorescein leakage from CNV . The primary outcome was the proportion of eyes with fewer than 15 letters ( approximately 3 lines ) of visual acuity loss at the month 24 examination , adhering to an intent-to-treat analysis . 
The last observation was carried forward to impute for any missing data . RESULTS Three hundred fifty-one ( 87 % ) of 402 patients in the verteporfin group compared with 178 ( 86 % ) of 207 patients in the placebo group completed the month 24 examination . Beneficial outcomes with respect to visual acuity and contrast sensitivity noted at the month 12 examination in verteporfin-treated patients were sustained through the month 24 examination . At the month 24 examination for the primary outcome , 213 ( 53 % ) of 402 verteporfin-treated patients compared with 78 ( 38 % ) of 207 placebo-treated patients lost fewer than 15 letters ( P<.001 ) . In subgroup analyses for predominantly classic lesions ( in which the area of classic CNV makes up at least 50 % of the area of the entire lesion ) at baseline , 94 ( 59 % ) of 159 verteporfin-treated patients compared with 26 ( 31 % ) of 83 placebo-treated patients lost fewer than 15 letters at the month 24 examination ( P<.001 ) . For minimally classic lesions ( in which the area of classic CNV makes up < 50 % but > 0 % of the area of the entire lesion ) at baseline , no statistically significant differences in visual acuity were noted . Few additional photosensitivity adverse reactions and injection site adverse events were associated with verteporfin therapy in the second year of follow-up . CONCLUSIONS The visual acuity benefits of verteporfin therapy for AMD patients with predominantly classic CNV subfoveal lesions are safely sustained for 2 years , providing more compelling evidence to use verteporfin therapy for these cases . For AMD patients with subfoveal lesions that are minimally classic , there is insufficient evidence to warrant routine use of verteporfin therapy OBJECTIVE To evaluate the long-term safety and efficacy of multiple intravitreal ranibizumab injections ( Lucentis , Genentech , Inc. 
, South San Francisco , CA ) administered at the investigator's discretion in patients with choroidal neovascularization secondary to age-related macular degeneration . DESIGN An open-label , multicenter , extension study . PARTICIPANTS Patients who completed the controlled treatment phase of 1 of 3 prospective , randomized , 2-year clinical trials of ranibizumab were eligible for enrollment . Analyses were performed for 3 groups : ( 1 ) patients treated with ranibizumab in the initial study ( ranibizumab treated-initial ; n = 600 ) ; ( 2 ) patients randomized to control who crossed over to receive ranibizumab ( ranibizumab treated-XO ; n = 190 ) ; and ( 3 ) ranibizumab-naïve patients ( ranibizumab untreated ; n = 63 ) . METHODS Ranibizumab 0.5 mg was administered at the investigator's discretion . Adverse events ( AEs ) and Early Treatment Diabetic Retinopathy Study ( ETDRS ) best-corrected visual acuity ( BCVA ) assessments were conducted at study visits every 3 to 6 months . MAIN OUTCOME MEASURES Incidence and severity of AEs . RESULTS There was 1 occurrence of mild endophthalmitis per 3552 HORIZON injections in the ranibizumab treated-initial/ranibizumab treated-XO groups . There were no serious AE reports of lens damage , retinal tears , or rhegmatogenous retinal detachments in the study eyes . The proportion of patients with any single postdose intraocular pressure ≥30 mmHg was 9.2 % , 6.6 % , and 0 % , and the proportion of patients with glaucoma was 3.2 % , 4.2 % , and 3.2 % in the ranibizumab treated-initial , ranibizumab treated-XO , and ranibizumab untreated groups , respectively . Cataract AEs were less frequent in the ranibizumab untreated group : 6.3 % versus 12.5 % and 12.1 % in the ranibizumab treated-initial and ranibizumab treated-XO groups , respectively . 
The proportion of patients with arterial thromboembolic events as defined by the Antiplatelet Trialists' Collaboration was 5.3 % in the ranibizumab treated-initial and ranibizumab treated-XO groups , and 3.2 % in the ranibizumab untreated group . At month 48 ( 2 years of HORIZON ) , the mean change in BCVA ( ETDRS letters ) relative to the initial study baseline was 2.0 in the ranibizumab treated-initial group versus -11.8 in the pooled ranibizumab treated-XO and ranibizumab untreated groups . CONCLUSIONS Multiple ranibizumab injections were well tolerated for ≥4 years . With less frequent follow-up leading to less treatment , there was an incremental decline of the visual acuity ( VA ) gains achieved with monthly treatment . FINANCIAL DISCLOSURE(S ) Proprietary or commercial disclosure may be found after the references Aims The aim of this study is to evaluate the effect of standard-fluence verteporfin photodynamic therapy ( PDT ) delivered on the first day of a ranibizumab regimen for choroidal neovascularisation secondary to age-related macular degeneration compared with ranibizumab monotherapy . Methods Patients were randomised to sham or standard-fluence verteporfin PDT at baseline . The first of three monthly loading doses of ranibizumab was given on the same day , and thereafter patients received monthly treatment with ranibizumab as required . All patients underwent monthly visual acuity and OCT assessment and 3-monthly fluorescein angiography with follow-up to 1 year . Results In all , 18 patients were recruited . The PDT group gained a mean of 2.2 ETDRS letters at 1 year and the sham group gained a mean of 4.4 letters ( P=0.47 ) . Both groups required a mean of 1.3 injections of ranibizumab following the 3-month loading phase . Fluorescein angiography at 1 month demonstrated marked choroidal hypoperfusion in all patients treated with PDT with reduced choroidal perfusion persisting to month 12 . This did not occur in the sham group . 
Conclusion The addition of standard-fluence verteporfin PDT at baseline to a ranibizumab regimen conferred no benefit in terms of visual acuity or number of ranibizumab injections required at 1 year. The combination of these treatments resulted in persistent reduced choroidal perfusion, which raises potential safety concerns PURPOSE To assess the long-term efficacy of a variable-dosing regimen with ranibizumab in the Prospective Optical Coherence Tomography (OCT) Imaging of Patients with Neovascular Age-Related Macular Degeneration (AMD) Treated with intraOcular Ranibizumab (PrONTO) Study, patients were followed for 2 years. DESIGN A 2-year prospective, uncontrolled, variable-dosing regimen with intravitreal ranibizumab based on OCT. METHODS In this open-label, prospective, single-center, uncontrolled clinical study, AMD patients with neovascularization involving the central fovea and a central retinal thickness (CRT) of at least 300 microm as measured by OCT were enrolled to receive 3 consecutive monthly intravitreal injections of ranibizumab (0.5 mg) [Lucentis; Genentech Inc, South San Francisco, California, USA]. During the first year, retreatment with ranibizumab was performed at each monthly visit if any criterion was fulfilled, such as an increase in OCT-CRT of at least 100 microm or a loss of 5 letters or more. During the second year, the retreatment criteria were amended to include retreatment if any qualitative increase in the amount of fluid was detected using OCT. RESULTS Forty patients were enrolled and 37 completed the 2-year study. At month 24, the mean visual acuity (VA) improved by 11.1 letters (P < .001) and the OCT-CRT decreased by 212 microm (P < .001). VA improved by 15 letters or more in 43% of patients. These VA and OCT outcomes were achieved with an average of 9.9 injections over 24 months.
CONCLUSIONS The PrONTO Study using an OCT-guided variable-dosing regimen with intravitreal ranibizumab resulted in VA outcomes comparable with the outcomes from the phase III clinical studies, but fewer intravitreal injections were required PURPOSE To compare the efficacy and safety of same-day verteporfin photodynamic therapy (PDT) and intravitreal ranibizumab combination treatment versus ranibizumab monotherapy in neovascular age-related macular degeneration. DESIGN Prospective, multicenter, double-masked, randomized, active-controlled trial. PARTICIPANTS We included 255 patients with all types of active subfoveal choroidal neovascularization. METHODS Patients were randomized 1:1 to as-needed (pro re nata; PRN) combination (standard-fluence verteporfin 6 mg/m(2) PDT and ranibizumab 0.5 mg) or PRN ranibizumab monotherapy (sham infusion [5% dextrose] PDT and ranibizumab 0.5 mg). Patients received 3 consecutive monthly injections followed by PRN retreatments based on protocol-specific retreatment criteria. MAIN OUTCOME MEASURES Mean change in best-corrected visual acuity (BCVA) from baseline to month 12, and the proportion of patients with a treatment-free interval ≥3 months at any timepoint after month 2. RESULTS The mean change in BCVA at month 12 was +2.5 and +4.4 letters in the combination and monotherapy groups, respectively (P = 0.0048; difference: -1.9 letters [95% confidence interval, -5.76 to 1.86], for having achieved noninferiority with a margin of 7 letters). The proportion of patients with a treatment-free interval of ≥3 months at any timepoint after month 2 was high, but did not show a clinically relevant difference between the treatment groups. Secondary efficacy endpoints included the mean number of ranibizumab retreatments after month 2 (1.9 and 2.2 with combination and monotherapy, respectively [P = 0.1373]).
The time to first ranibizumab retreatment after month 2 was delayed by 34 days (about 1 monthly visit) with combination (month 6) versus monotherapy (month 5). At month 12, mean ± standard error central retinal thickness decreased by 115.3±9.04 μm in the combination group and 107.7±11.02 μm in the monotherapy group. The mean number of verteporfin/sham PDT treatments was comparable in the 2 groups (combination, 1.7; monotherapy, 1.9). The safety profiles of the 2 groups were comparable, with a low incidence of ocular serious adverse events. CONCLUSIONS The combination PRN treatment regimen with verteporfin PDT and ranibizumab was effective in achieving BCVA gain comparable with ranibizumab monotherapy; however, the study did not show benefits with respect to reducing the number of ranibizumab retreatments over 12 months. The combination therapy was well tolerated Purpose: To evaluate photodynamic therapy (PDT) with verteporfin along with intravitreal ranibizumab in the treatment of neovascular age-related macular degeneration. Methods: This prospective interventional case series included 16 patients (17 eyes) with choroidal neovascularization secondary to neovascular AMD, who were treated with PDT with verteporfin followed by an injection of 0.5 mg ranibizumab on the same day. The main outcome measures were best corrected visual acuity (VA) as recorded by both Snellen's and ETDRS charts (logMAR), contrast sensitivity (Pelli-Robson Chart), retreatment frequency and frequency of side effects. Results: Seventeen eyes underwent PDT with verteporfin and intravitreal 0.5 mg ranibizumab, following PDT. Patients were followed up every month for a total period of 6 months. Initial VA ranged from CF to 20/32 and final acuity ranged from CF to 20/20. VA stabilized (gain/loss < 2 lines) in 14 out of 17 eyes (82.24%) and improved in 3 out of 17 (17.65%).
Contrast sensitivity improved in 15 out of 17 eyes (88.24%). Lesion type and patient age had no influence on the outcome. There were no cases of ocular/systemic adverse events. Retreatment was required in only 2 out of 17 cases (11.76%), with only a single injection of ranibizumab. Conclusion: The combination of PDT with intravitreal ranibizumab improves contrast sensitivity, stabilizes vision and reduces the number of retreatments, without significant ocular and/or systemic risks BACKGROUND We compared ranibizumab -- a recombinant, humanized, monoclonal antibody Fab that neutralizes all active forms of vascular endothelial growth factor A -- with photodynamic therapy with verteporfin in the treatment of predominantly classic neovascular age-related macular degeneration. METHODS During the first year of this 2-year, multicenter, double-blind study, we randomly assigned patients in a 1:1:1 ratio to receive monthly intravitreal injections of ranibizumab (0.3 mg or 0.5 mg) plus sham verteporfin therapy or monthly sham injections plus active verteporfin therapy. The primary end point was the proportion of patients losing fewer than 15 letters from baseline visual acuity at 12 months. RESULTS Of the 423 patients enrolled, 94.3% of those given 0.3 mg of ranibizumab and 96.4% of those given 0.5 mg lost fewer than 15 letters, as compared with 64.3% of those in the verteporfin group (P<0.001 for each comparison). Visual acuity improved by 15 letters or more in 35.7% of the 0.3-mg group and 40.3% of the 0.5-mg group, as compared with 5.6% of the verteporfin group (P<0.001 for each comparison). Mean visual acuity increased by 8.5 letters in the 0.3-mg group and 11.3 letters in the 0.5-mg group, as compared with a decrease of 9.5 letters in the verteporfin group (P<0.001 for each comparison).
Among 140 patients treated with 0.5 mg of ranibizumab, presumed endophthalmitis occurred in 2 patients (1.4%) and serious uveitis in 1 (0.7%). CONCLUSIONS Ranibizumab was superior to verteporfin as intravitreal treatment of predominantly classic neovascular age-related macular degeneration, with low rates of serious ocular adverse events. Treatment improved visual acuity on average at 1 year. (ClinicalTrials.gov number, NCT00061594 [ClinicalTrials.gov].)
2,014
15,984,501
From the limited evidence available, survival appeared to increase when patients were selected by performance status (survival increasing from approximately three to seven months in high performance status groups, as defined by Karnofsky performance status or Recursive Partitioning Analysis classification). The evidence suggests no survival benefit when patients with poor performance status were treated with whole brain radiotherapy. WBRT appears to be of benefit in higher performance status patients but not in low performance status patients.
Background: Brain metastases are the most common intracranial tumour in adults, estimated to occur in up to 40% of patients with cancer. Despite being used in clinical practice for 50 years, the effectiveness of whole brain radiotherapy for the treatment of brain metastases remains uncertain. Objectives: To assess the effectiveness of whole brain radiotherapy (WBRT) on survival and quality of life. To identify whether patient performance status, number of brain metastases, extent of extracranial disease and primary site of cancer are important effect modifiers.
The incidence of brain metastases secondary to small cell lung cancer (SCLC) is about 35%, and the treatment strategy of brain irradiation with respect to dose and fractionation is controversial. In order to evaluate treatment outcome of brain irradiation in SCLC patients with brain relapse, we retrospectively evaluated all patients treated with brain irradiation in the eastern part of Denmark from 1988 to 1992 (PCI patients excluded). During this 5-year period, 101 evaluable patients were included (44 females, 57 males) (median age 61 years; range, 39-75 years). Forty-four patients, of whom 43 were in extracerebral complete remission (CR), received extended course (EC) brain irradiation (>45 Gy, treatment schedule >4 weeks). Fifty-seven patients received short course (SC) brain irradiation (<30 Gy, treatment schedule <1 week). Among the SC-treated patients, 14 were in CR, 20 had partial remission or stable disease and 23 had progressive extracerebral disease. The median survival (from diagnosis of brain metastases) in the group receiving irradiation with EC (44 patients) was 160 days (range, 74-2021 days), while the 57 patients treated with SC had a median survival of 88 days (range, 20-948 days) (P = 0.00001, Log-Rank analysis). In a subgroup of 14 patients in extracerebral CR receiving SC irradiation, the median survival was 83 days (range, 15-948 days). When the latter patients were compared to the 43 patients in CR in the group treated with EC, a statistically significant difference was shown (P = 0.034, Log-Rank analysis). Using Cox-hazard regression analysis with backward elimination, liver metastases and poor performance status were adverse prognostic signs, although the only significant parameters of survival were gender (female vs. male, relative risk of dying 1 and 1.52, P = 0.05) and schedule of brain irradiation (extended course vs.
short course, relative risk of dying, 0.36 and 1, P < 0.001). Extended course irradiation of brain relapse secondary to SCLC seems in general to be of limited value, although a significantly prolonged survival, of approximately 7 weeks, was obtained. The prolongation of survival does not seem worthwhile considering the length of treatment time (5-6 weeks) compared to SC treatment (1 week). However, the data do not permit evaluation of the quality of life of the patients. This retrospective evaluation suggests the need for randomized trials with carefully planned quality-of-life assessment
2,015
30,006,762
Main results showed that resistance training had positive effects on the executive function and global cognitive function of the elderly, and that short-term interventions had little positive effect on memory and attention. Secondary results demonstrated a significant benefit of triweekly resistance training on global cognitive function and of biweekly training on executive function in the elderly. Conclusions Resistance training had positive effects on executive cognitive ability and global cognitive function among the elderly; however, it had a weak positive impact on memory. No significant improvement was found in attention. Triweekly resistance training has a better effect on general cognitive ability than biweekly training.
Background Aging is often accompanied by decline in aspects of cognitive function. Cognitive decline has harmful effects on living independence and general health. Resistance training is seen as a promising intervention to prevent or delay cognitive deterioration, yet the evidence from reviews is less consistent. Aim To assess the effect of resistance training on cognition in the elderly with and without mild cognitive impairment and to provide an up-to-date overview.
AIM The effectiveness of resistance training in improving cognitive function in older adults is well demonstrated. In particular, unconventional high-speed resistance training can improve muscle power development. In the present study, the effectiveness of 12 weeks of elastic band-based high-speed power training (HSPT) was examined. METHODS Participants were randomly assigned into a HSPT group (n = 14, age 75.0 ± 0.9 years), a low-speed strength training (LSST) group (n = 9, age 76.0 ± 1.3 years) and a control group (CON; n = 7, age 78.0 ± 1.0 years). A 1-h exercise program was provided twice a week for 12 weeks for the HSPT and LSST groups, and balance and tone exercises were carried out by the CON group. RESULTS Significant increases in levels of cognitive function, physical function, and muscle strength were observed in both the HSPT and LSST groups. In cognitive function, significant improvements in the Mini-Mental State Examination and Montreal Cognitive Assessment were seen in both the HSPT and LSST groups compared with the CON group. In physical functions, Short Physical Performance Battery scores were increased significantly in the HSPT and LSST groups compared with the CON group. In the 12 weeks of elastic band-based training, the HSPT group showed greater improvements in older women with mild cognitive impairment than the LSST group, although both regimens were effective in improving cognitive function, physical function and muscle strength. CONCLUSIONS We conclude that elastic band-based HSPT, as compared with LSST, is more efficient in helping older women with mild cognitive impairment to improve cognitive function, physical performance and muscle strength.
Geriatr Gerontol Int 2017; 17: 765-772 Objective The purpose of this study was to test the effects of a 6-month Wheelchair-bound Senior Elastic Band (WSEB) exercise program on the activities of daily living (ADL) and functional fitness of wheelchair-bound older adults with cognitive impairment. Design A cluster randomized controlled trial was used. A convenience sample of 138 wheelchair-bound older adults with cognitive impairment was recruited from 8 nursing homes in southern Taiwan and randomly assigned, based on the nursing homes in which they lived, to the experimental (4 nursing homes; n = 73) or the control group (4 nursing homes; n = 65). The experimental group performed WSEB exercises 3 times per week, 40 minutes per session, for 6 months. ADL and functional fitness (cardiopulmonary function, body flexibility, range of joint motion, and muscle strength and endurance) were examined at baseline, 3 months, and the end of the 6-month study. Results The ADL and functional fitness indicators of participants in the experimental group showed significant improvements compared to the control group (all P < 0.05). Conclusions The WSEB exercises have positive benefits for the ADL and functional fitness of wheelchair-bound older adults with cognitive impairment. It is suggested that WSEB exercises be included as a routine activity in nursing homes. To Claim CME Credits: Complete the self-assessment activity and evaluation online at http://www.physiatry.org/JournalCME CME Objectives: Upon completion of this article the reader should be able to: (1) Understand the risk factors for functional decline in older adults with dementia; (2) Articulate the benefits of structured activities and exercises in the older adult with dementia; and (3) Incorporate elastic band exercises into the treatment plan of wheelchair-bound older adults with dementia.
Level: Advanced Accreditation: The Association of Academic Physiatrists is accredited by the Accreditation Council for Continuing Medical Education to provide continuing medical education for physicians. The Association of Academic Physiatrists designates this activity for a maximum of 1.5 AMA PRA Category 1 Credit(s)™. Physicians should only claim credit commensurate with the extent of their participation in the activity This study aimed to investigate the effects of a long-term resistance exercise intervention on executive functions in healthy elderly males, and to further understand the potential neurophysiological mechanisms mediating the changes. The study assessed forty-eight healthy elderly males randomly assigned to exercise (n = 24) or control (n = 24) groups. The assessment included neuropsychological and neuroelectric measures during a variant of the oddball task paradigm, as well as growth hormone (GH), insulin-like growth factor-1 (IGF-1), and homocysteine levels at baseline and after either a 12-month intervention of resistance exercise training or a control period. The results showed that the control group had a significantly lower accuracy rate and smaller P3a and P3b amplitudes in the oddball condition after 12 months. The exercise group exhibited improved reaction times (RTs), sustained P3a and P3b amplitudes, increased levels of serum IGF-1, and decreased levels of serum homocysteine. The changes in IGF-1 levels were significantly correlated with the changes in RT and P3b amplitude of the oddball condition in the exercise group. In conclusion, significantly enhanced serum IGF-1 levels after 12 months of resistance exercise were inversely correlated with neurocognitive decline in the elderly.
These findings suggest that regular resistance exercise might be a promising strategy to attenuate the trajectory of cognitive aging in healthy elderly individuals, possibly mediated by IGF-1 Background There is limited research about beneficial effects of physical activity in older adults suffering from mild cognitive impairment (MCI). Aim The aim of the study was to provide preliminary evidence on the effects of two types of non-aerobic training on cognitive functions in older women suffering from MCI. Methods Twenty-eight participants aged 66–78 years with MCI were randomly assigned to a combined balance and core resistance training group (n = 14) or to a Pilates group (n = 14). Results Following completion of the 8-week exercise programme, both groups showed significant improvements in global and specific cognitive domains. Conclusion Findings suggest that non-aerobic training should be further explored as a beneficial intervention for older adults suffering from MCI BACKGROUND Cognitive decline among seniors is a pressing health care issue. Specific exercise training may combat cognitive decline. We compared the effect of once-weekly and twice-weekly resistance training with that of twice-weekly balance and tone exercise training on the performance of executive cognitive functions in senior women. METHODS In this single-blinded randomized trial, 155 community-dwelling women aged 65 to 75 years living in Vancouver were randomly allocated to once-weekly (n = 54) or twice-weekly (n = 52) resistance training or twice-weekly balance and tone training (control group) (n = 49). The primary outcome measure was performance on the Stroop test, an executive cognitive test of selective attention and conflict resolution. Secondary outcomes of executive cognitive functions included set shifting as measured by the Trail Making Tests (parts A and B) and working memory as assessed by verbal digit span forward and backward tests.
Gait speed, muscular function, and whole-brain volume were also secondary outcome measures. RESULTS Both resistance training groups significantly improved their performance on the Stroop test compared with those in the balance and tone group (P ≤ .03). Task performance improved by 12.6% and 10.9% in the once-weekly and twice-weekly resistance training groups, respectively; it deteriorated by 0.5% in the balance and tone group. Enhanced selective attention and conflict resolution was significantly associated with increased gait speed. Both resistance training groups demonstrated reductions in whole-brain volume compared with the balance and tone group at the end of the study (P ≤ .03). CONCLUSION Twelve months of once-weekly or twice-weekly resistance training benefited the executive cognitive function of selective attention and conflict resolution among senior women. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT00426881 Background Mild cognitive impairment (MCI) represents a critical window to intervene against dementia. Exercise training is a promising intervention strategy, but the efficiency (i.e., relationship of costs and consequences) of such types of training remains unknown. Thus, we estimated the incremental cost-effectiveness of resistance training or aerobic training compared with balance and tone exercises in terms of changes in executive cognitive function among senior women with probable MCI. Methods Economic evaluation conducted concurrently with a six-month three-arm randomized controlled trial including eighty-six community-dwelling women aged 70 to 80 years living in Vancouver, Canada. Participants received twice-weekly resistance training (n = 28), twice-weekly aerobic training (n = 30) or twice-weekly balance and tone (control group) classes (n = 28) for 6 months.
The primary outcome measure of the Exercise for Cognition and Everyday Living (EXCEL) study assessed executive cognitive function using a test of selective attention and conflict resolution (i.e., the Stroop Test). We collected healthcare resource utilization costs over six months. Results Based on the bootstrapped estimates from our base case analysis, we found that both the aerobic training and resistance training interventions were less costly than twice-weekly balance and tone classes. Compared with the balance and tone group, the resistance-training group had significantly improved performance on the Stroop Test (p = 0.04). Conclusions Resistance training and aerobic training result in health care cost savings and are more effective than balance and tone classes after only 6 months of intervention. Resistance training is a promising strategy to alter the trajectory of cognitive decline in seniors with MCI. Trial Registration ClinicalTrials.gov NCT00958867 Background Older adults are encouraged to participate in regular physical activity to counter the age-related declines in physical and cognitive health. Literature on the effect of different exercise training modalities (aerobic vs resistance) on these health-related outcomes is not only sparse, but results are inconsistent. In general, it is believed that exercise has a positive effect on executive cognitive function, possibly because of the physiological adaptations through increases in fitness. Indications are that high-intensity interval training is a potent stimulus to improve cardiovascular fitness, even in older adults; however, its effect on cognitive function has not been studied before. Therefore, the purpose of this study was to compare the effects of resistance training, high-intensity aerobic interval training and moderate continuous aerobic training on the cognitive and physical functioning of healthy older adults.
Methods Sixty-seven inactive individuals (55 to 75 years) were randomly assigned to a resistance training (RT) group (n = 22), high-intensity aerobic interval training (HIIT) group (n = 13), moderate continuous aerobic training (MCT) group (n = 13) and a control (CON) group (n = 19) for a period of 16 weeks. Cognitive function was assessed with a Stroop task and physical function with the Timed-Up-and-Go (TUG) and submaximal Bruce treadmill tests. Results No significant GROUP x TIME interaction was found for Stroop reaction time (P > .05). The HIIT group showed the greatest practically significant improvement in reaction time on the information processing task, i.e. Stroop Neutral (ES = 1.11). MCT group participants had very large practically significant improvements in reaction time on the executive cognitive tasks, i.e. Stroop Incongruent and Interference (ES = 1.28 and 1.31, respectively). The HIIT group showed the largest practically significant increase in measures of physical function, i.e. walking endurance (ES = 0.91) and functional mobility (ES = 0.36). Conclusions MCT and RT proved to be superior to HIIT for the enhancement of older individuals' executive cognitive function, whereas HIIT was most beneficial for improvement in information processing speed. HIIT also induced the largest gains in physical function Objective: To investigate the effects of a 12-week resistance exercise program with an elastic band on electroencephalogram (EEG) patterns and cognitive function in elderly patients with mild cognitive impairment (MCI). Design: Randomized controlled trial. Setting: Community center.
Participants: Twenty-two subjects with MCI and 25 healthy volunteer subjects were randomly assigned to 1 of 4 groups: subjects with MCI who undertook the exercise program (MCI-EX; n = 10), an MCI control group (MCI-Con; n = 12), a healthy volunteer exercise group (NG-EX; n = 12), and a healthy volunteer control group (NG-Con; n = 13). Intervention: The exercise group engaged in a 15-repetition maximum (15RM; 65% of 1RM) resistance exercise program for 12 weeks. Main Outcome Measures: Electroencephalograms, neuropsychological tests, and the Senior Fitness Test. Results: The 12-week 15RM (65% of 1RM) resistance exercise program significantly improved variables related to the physical fitness of the elderly subjects. Furthermore, for the EEG test, the MCI and NG groups showed significant differences at baseline in relative beta waves on electrodes Fp1 (P < 0.05) and F3 (P < 0.05), as well as in relative beta2 waves on F3 (P < 0.05). In addition, after the 12-week exercise intervention, differences in a region that benefits from exercise were observed (1) in the MCI-EX group in the relative theta power on F3 (P < 0.05) and the relative alpha power on T3 (P < 0.05) and (2) in the NG-EX group in the relative theta power on P3 (P < 0.05) and P4 (P < 0.01). In addition, only the score of the digit span backward test in the MCI-EX group changed significantly (P < 0.05). Conclusions: The 12-week resistance exercise with an elastic band had a positive effect on EEG patterns in elderly subjects with MCI, along with providing physical benefits and slight changes in cognitive function in the MCI-EX group. Significance: A 15RM resistance exercise program can be an effective treatment for delaying cognitive decline and improving physical fitness Flaws in the design, conduct, analysis, and reporting of randomised trials can cause the effect of an intervention to be underestimated or overestimated.
The Cochrane Collaboration's tool for assessing risk of bias aims to make the process clearer and more accurate Background The reliability of quantitative gait assessment while dual-tasking (walking while doing a secondary task such as talking) in people with cognitive impairment is unknown. Dual-tasking gait assessment is becoming highly important for mobility research with older adults since it better reflects their performance in the basic activities of daily living. Our purpose was to establish the test-retest reliability of assessing quantitative gait variables using an electronic walkway in older adults with mild cognitive impairment (MCI) under single- and dual-task conditions. Methods The gait performance of 11 elderly individuals with MCI was evaluated using an electronic walkway (GAITRite® System) in two sessions, one week apart. Six gait parameters (gait velocity, step length, stride length, step time, stride time, and double support time) were assessed under two conditions: single-task (sG: usual walking) and dual-task (dG: counting backwards from 100 while walking). Test-retest reliability was determined using the intra-class correlation coefficient (ICC). Gait variability was measured using the coefficient of variation (CoV). Results Eleven participants (average age = 76.6 years, SD = 7.3) were assessed. They were high functioning (Clinical Dementia Rating score = 0.5) with a mean Mini-Mental Status Exam (MMSE) score of 28 (SD = 1.56), and a mean Montreal Cognitive Assessment (MoCA) score of 22.8 (SD = 1.23). Under dual-task conditions, mean gait velocity (GV) decreased significantly (sGV = 119.11 ± 20.20 cm/s; dGV = 110.88 ± 19.76 cm/s; p = 0.005). Additionally, under dual-task conditions, higher gait variability was found in stride time, step time, and double support time. Test-retest reliability was high (ICC > 0.85) for the six parameters evaluated under both conditions.
Conclusion In older people with MCI, variability of time-related gait parameters increased with dual-tasking, suggesting cognitive control of gait performance. Assessment of quantitative gait variables using an electronic walkway is highly reliable under single- and dual-task conditions. The presence of cognitive impairment did not preclude performance of dual-tasking in our sample, supporting that this methodology can be reliably used in cognitively impaired older individuals PURPOSE The purpose of this study was to assess the impact of 24 wk of resistance training at two different intensities on cognitive functions in the elderly. METHODS Sixty-two elderly individuals were randomly assigned to three groups: CONTROL (N = 23), experimental moderate (EMODERATE; N = 19), and experimental high (EHIGH; N = 20). The volunteers were assessed on physical, hemodynamic, cognitive, and mood parameters before and after the program. RESULTS On the 1RM test (P < 0.001), the two experimental groups performed better than the CONTROL group, but they did not show differences between themselves. The EHIGH group gained more lean mass (P = 0.05) than the CONTROL group and performed better on the following tests: digit span forward (P < 0.001), Corsi's block-tapping task backward (P = 0.001), similarities (P = 0.03), Rey-Osterrieth complex figure immediate recall (P = 0.02), Toulouse-Pieron concentration test errors (P = 0.01), SF-36 (general health) (P = 0.04), POMS (tension-anxiety, P = 0.04; depression-dejection, P = 0.03; and total mood disorder, P = 0.03).
The EMODERATE group scored higher means than the CONTROL group on digit span forward (P < 0.001), Corsi's block-tapping task backward (P = 0.01), similarities (P = 0.02), Rey-Osterrieth complex figure immediate recall (P = 0.02), SF-36 (general health, P = 0.005; vitality, P = 0.006), POMS (tension-anxiety, P = 0.001; depression-dejection, P = 0.006; anger-hostility, P = 0.006; fatigue-inertia, P = 0.02; confusion-bewilderment, P = 0.02; and total mood disorder, P = 0.001). We also found that IGF-1 serum levels were higher in the experimental groups (EMODERATE, P = 0.02; EHIGH, P < 0.001). CONCLUSIONS Moderate- and high-intensity resistance exercise programs had equally beneficial effects on cognitive functioning The purpose of this study was to explore the dose-response relationship between resistance exercise intensity and cognitive performance. Sixty-eight participants were randomly assigned into control, 40%, 70%, or 100% of 10-repetition maximum resistance exercise groups. Participants were tested on Day 1 (baseline) and on Day 2 (measures were taken relative to performance of the treatment). Heart rate, ratings of perceived exertion, self-reported arousal, and affect were assessed on both days. Cognitive performance was assessed on Day 1 and before and following treatment on Day 2. Results from regression analyses indicated that there is a significant linear effect of exercise intensity on information processing speed, and a significant quadratic trend for exercise intensity on executive function. Thus, there is a dose-response relationship between the intensity of resistance exercise and cognitive performance such that high-intensity exercise benefits speed of processing, but moderate-intensity exercise is most beneficial for executive function The hippocampus shrinks in late adulthood, leading to impaired memory and increased risk for dementia.
Hippocampal and medial temporal lobe volumes are larger in higher-fit adults, and physical activity training increases hippocampal perfusion, but the extent to which aerobic exercise training can modify hippocampal volume in late adulthood remains unknown. Here we show, in a randomized controlled trial with 120 older adults, that aerobic exercise training increases the size of the anterior hippocampus, leading to improvements in spatial memory. Exercise training increased hippocampal volume by 2%, effectively reversing age-related loss in volume by 1 to 2 y. We also demonstrate that increased hippocampal volume is associated with greater serum levels of BDNF, a mediator of neurogenesis in the dentate gyrus. Hippocampal volume declined in the control group, but higher preintervention fitness partially attenuated the decline, suggesting that fitness protects against volume loss. Caudate nucleus and thalamus volumes were unaffected by the intervention. These theoretically important findings indicate that aerobic exercise training is effective at reversing hippocampal volume loss in late adulthood, which is accompanied by improved memory function OBJECTIVE To determine the short- and long-term effects of resistance training on muscle strength, psychological well-being, control beliefs, cognitive speed, and memory in normally active elderly people. METHODS 46 elderly people (mean age 73.2 years; 18 women and 28 men) were randomly assigned to training and control groups (n = 23 each). Pre- and post-tests were administered 1 week before and 1 week after the 8-week training intervention. The training sessions, performed once a week, consisted of a 10-min warm-up phase and eight resistance exercises on machines. RESULTS There was a significant increase in maximum dynamic strength in the training group.
This training effect was associated with a significant decrease in self-attentiveness, which is known to enhance psychological well-being. No significant changes could be observed in control beliefs. Modest effects on cognitive functioning occurred with the training procedure: although there were no changes in cognitive speed, significant pre/post changes could be shown in free recall and recognition in the experimental group. A post-test comparison between the experimental group and control group showed a weak effect for recognition but no significant differences in free recall. Significant long-term effects were found in the training group for muscular strength and memory performance (free recall) 1 year later. CONCLUSION An 8-week programme of resistance training lessens anxiety and self-attentiveness and improves muscle strength BACKGROUND Preclinical transitions to impairment in cognitive abilities are associated with risks for functional difficulty and dementia. This study characterized, in the Women's Health and Aging Study (WHAS) II, 9-year declines and transitions to impairment across domains of cognition. METHODS The WHAS II is an observational study of initially high-functioning, community-dwelling women aged 70-80 years at baseline. Random-effects models jointly compared rates of decline, and discrete-time Cox models estimated hierarchies of incident clinical impairment on measures of psychomotor speed and executive function (EF) using the Trail Making Test and in immediate and delayed verbal recall using the Hopkins Verbal Learning Test. Patterns of transition were related to incidence of global cognitive impairment on the Mini-Mental State Exam (MMSE). RESULTS Mean decline and impairment occurred first in EF and preceded declines in memory by about 3 years. Thereafter, memory decline was equivalent to that for EF. Over 9 years, 49% developed domain-specific impairments.
Risk of incident EF impairment occurred in 37% of the sample and was often the first impairment observed (23.7%), at triple the rate for psychomotor speed (p < .01). Risk of immediate and delayed recall impairments was nearly double that for psychomotor speed (p values < .01). Incident impairment in EF and delayed recall was associated with greater risk for MMSE impairment. CONCLUSIONS Executive dysfunction developed first among nearly one quarter of older women and was associated with elevated risk for global cognitive impairment. Because EF declines preceded memory declines, and EF is important to efficient storage and retrieval, EF represents an important target for interventions to prevent declines in memory and MMSE, both of which are associated with progression to dementia Objective: To estimate the prevalence and examine the course of mild cognitive impairment (MCI), amnestic type, using current criteria, within a representative community sample. Methods: Retroactive application of MCI criteria to data collected during a prospective epidemiologic study was performed. The subjects were drawn from voter registration lists, composing a cohort of 1,248 individuals with mean age of 74.6 (5.3) years, who were nondemented at entry and who were assessed biennially over 10 years of follow-up. The Petersen amnestic MCI criteria were operationalized as: 1) impaired memory: Word List Delayed Recall score of <1 SD below mean; 2) normal mental status: Mini-Mental State Examination score of 25+; 3) normal daily functioning: no instrumental impairments; 4) memory complaint: subjective response to standardized question; 5) not demented: Clinical Dementia Rating Scale score of <1. Results: At the five assessments, amnestic MCI criteria were met by 2.9 to 4.0% of the cohort. Of 40 persons with MCI at the first assessment, 11 (27%) developed dementia over the next 10 years.
Over each 2-year interval, MCI persons showed increased risk of dementing (odds ratio = 3.9, 95% CI = 2.1 to 7.2); 11.1 to 16.7% progressed to Alzheimer disease and 0 to 5.0% progressed to other dementias. Over the same intervals, 11.1 to 21.2% of those with MCI remained MCI; of the 33.3 to 55.6% who no longer had MCI, half had reverted to normal. Conclusions: In this community-based sample, 3 to 4% of nondemented persons met MCI operational criteria; despite increased risk of progressing to dementia, a substantial proportion also remained stable or reverted to normal during follow-up. Amnestic MCI as currently defined is a high-risk but unstable and heterogeneous group OBJECTIVES To determine whether improvements in aerobic capacity (VO2peak) and strength after progressive resistance training (PRT) mediate improvements in cognitive function. DESIGN Randomized, double-blind, double-sham, controlled trial. SETTING University research facility. PARTICIPANTS Community-dwelling older adults (aged ≥55) with mild cognitive impairment (MCI) (N = 100). INTERVENTION PRT and cognitive training (CT), 2 to 3 days per week for 6 months. MEASUREMENTS Alzheimer's Disease Assessment Scale-cognitive subscale (ADAS-Cog); global, executive, and memory domains; peak strength (1 repetition maximum); and VO2peak. RESULTS PRT increased upper (standardized mean difference (SMD) = 0.69, 95% confidence interval = 0.47, 0.91), lower (SMD = 0.94, 95% CI = 0.69-1.20), and whole-body (SMD = 0.84, 95% CI = 0.62-1.05) strength and percentage change in VO2peak (8.0%, 95% CI = 2.2-13.8) significantly more than sham exercise. Higher strength scores, but not greater VO2peak, were significantly associated with improvements in cognition (P < .05).
Greater lower body strength significantly mediated the effect of PRT on ADAS-Cog improvements (indirect effect: β = -0.64, 95% CI = -1.38 to -0.004; direct effect: β = -0.37, 95% CI = -1.51 to 0.78) and global domain (indirect effect: β = 0.12, 95% CI = 0.02-0.22; direct effect: β = -0.003, 95% CI = -0.17 to 0.16) but not for executive domain (indirect effect: β = 0.11, 95% CI = -0.04 to 0.26; direct effect: β = 0.03, 95% CI = -0.17 to 0.23). CONCLUSION High-intensity PRT results in significant improvements in cognitive function, muscle strength, and aerobic capacity in older adults with MCI. Strength gains, but not aerobic capacity changes, mediate the cognitive benefits of PRT. Future investigations are warranted to determine the physiological mechanisms linking strength gains and cognitive benefits
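Odds ratios with Wald confidence intervals, like the OR of 3.9 (95% CI 2.1 to 7.2) in the MCI abstract above, are computed from a 2×2 table of exposure by outcome. A minimal sketch of that computation follows; the counts used are illustrative only, since the source table is not reported in the abstract.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table and its Wald 95% CI.
    a: exposed cases, b: exposed non-cases,
    c: unexposed cases, d: unexposed non-cases."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) from the four cell counts
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Illustrative (hypothetical) counts, not the study's actual table:
or_, lo, hi = odds_ratio_ci(11, 29, 96, 1112)
print(round(or_, 1), round(lo, 1), round(hi, 1))
```

The Wald interval is symmetric on the log scale, which is why the bounds are exponentiated back after adding and subtracting z standard errors.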
2,016
30,871,490
Overall, the models performed well in the development cohorts but less accurately in some independent populations, particularly in high-risk patients and in young and elderly patients. An exception is the Nottingham Prognostic Index, which retains its predictive ability in most independent populations. Conclusions Many prognostic models have been developed for breast cancer, but only a few have been validated widely in different settings. Importantly, their performance was suboptimal in independent populations, particularly in high-risk patients and in young and elderly patients
Background Breast cancer is the most common cancer in women worldwide, with great diversity in outcomes among individual patients. The ability to accurately predict a breast cancer outcome is important to patients, physicians, researchers, and policy makers. Many models have been developed and tested in different settings. We systematically reviewed the prognostic models developed and/or validated for patients with breast cancer.
In 1990, 215 patients with operable breast cancer were entered into a prospective study of the prognostic significance of five biochemical markers and 15 other factors (pathological/chronological/patient). After a median follow-up of 6.6 years, there were 77 recurrences and 77 deaths (59 breast cancer-related). By univariate analysis, patient outcome related significantly to 13 factors. By multivariate analysis, the most important of nine independent factors were: number of nodes involved, steroid receptors (for oestrogen or progestogen), age, clinical or pathological tumour size, and grade. Receptors and grade exerted their influence only in the first 3 years. Progestogen receptors (immunohistochemical) and oestrogen receptors (biochemical) were of similar prognostic significance. The two receptors were correlated (r = +0.50, P = 0.001) and displaced each other from the analytical model, but some evidence for the additivity of their prognostic values was seen when their levels were discordant Prognosis studies are investigations of future events or the evaluation of associations between risk factors and health outcomes in populations of patients (1). The results of such studies improve our understanding of the clinical course of a disease and assist clinicians in making informed decisions about how best to manage patients. Prognostic research also informs the design of intervention studies by helping define subgroups of patients who may benefit from a new treatment and by providing necessary information about the natural history of a disorder (2). There has recently been a rapid increase in the use of systematic review methods to synthesize the evidence on research questions related to prognosis.
It is essential that investigators conducting systematic reviews thoroughly appraise the methodologic quality of included studies to be confident that a study's design, conduct, analysis, and interpretation have adequately reduced the opportunity for bias (3, 4). Caution is warranted, however, because inclusion of methodologically weak studies can threaten the internal validity of a systematic review (4). This follows abundant empirical evidence that inadequate attention to biases can cause invalid results and inferences (5-9). However, there is limited consensus on how to appraise the quality of prognosis studies (1). A useful framework to assess bias in such studies follows the basic principles of epidemiologic research (10, 11). We focus on 6 areas of potential bias: study participation, study attrition, prognostic factor measurement, confounding measurement and account, outcome measurement, and analysis. The main objectives of our review of reviews are to describe methods used to assess the quality of prognosis studies and to describe how well current practices assess potential biases. Our secondary objective is to develop recommendations to guide future quality appraisal, both within single studies of prognostic factors and within systematic reviews of the evidence. We hope this work facilitates future discussion and research on biases in prognosis studies and systematic reviews. Methods Literature Search and Study Selection We identified systematic reviews of prognosis studies by searching MEDLINE (1966 to October 2005) using the search strategy recommended by McKibbon and colleagues (12). This strategy combines broad search terms for systematic reviews (systematic review.mp; meta-analysis.mp) and a sensitive search strategy for prognosis studies (cohort, incidence, mortality, follow-up studies, prognos*, predict*, or course).
We also searched the reference lists of included reviews and methodologic papers to identify other relevant publications. We restricted our search to English-language publications. One reviewer conducted the search and selected the studies. Systematic reviews, defined as reviews of published studies with a comprehensive search and systematic selection, were included if they assessed the methodologic quality of the included studies by using 1 or more explicit criteria. We excluded studies if they were meta-analyses of independent patient data only, if their primary goal was to investigate the effectiveness of an intervention or specific diagnostic or screening tests, or if they included studies that were not done on humans. Data Extraction and Synthesis Individual items included in the quality assessment of the systematic reviews were recorded as they were reported in the publication (that is, the information that would be available to readers and future reviewers). We reviewed journal Web sites and contacted the authors of the systematic reviews for additional information when authors made such an offer in their original papers. When reviews assessed different study designs by using different sets of quality items, we extracted only those items used to assess cohort studies. We constructed a comprehensive list of distinct items that the reviews used to assess the quality of their included studies. The full text of each review was screened. All items used by the review authors to assess the quality of studies were extracted into a computerized spreadsheet by 1 reviewer. Two experienced reviewers, a clinical epidemiologist and an epidemiologist, independently synthesized the quality items extracted from the prognosis reviews to determine how well the systematic reviews assessed potential biases.
We did this in 3 steps: 1) identified distinct concepts or domains addressed by the quality items; 2) grouped each extracted quality item into the appropriate domain or domains; and 3) identified the domains necessary to assess potential biases in prognosis studies. We then used this information to assess how well the reviews' quality assessment included items from the domains necessary to assess potential biases. After completing each of the first 3 steps, the reviewers met to attempt to reach a consensus. The consensus process involved each reviewer presenting his or her observations and results, followed by discussion and debate. A third reviewer was available in cases of persistent disagreement or uncertainty. In the first step, all domains addressed by the quality items were identified. The first reviewer iteratively and progressively defined the domains as items were extracted from the included reviews. The second reviewer defined domains from a random list of all extracted quality items. Limited guidance was provided to the reviewers so that their assessments and definitions of domains would be independent. The reviewers agreed on a final set of domains that adequately and completely defined all of the extracted items. In the second step, reviewers independently grouped each extracted item into the appropriate domains. Reviewers considered each extracted item by asking, "What is each particular quality item addressing?" or "What are the review's authors getting at with the particular quality assessment item?". Items were grouped into the domain or domains that best represented the concepts being addressed.
For example, the extracted items "at least 80% of the group originally identified was located for follow-up" and "follow-up was sufficiently complete or doesn't jeopardize validity" were each independently classified by both reviewers as assessing the domain "completeness of follow-up adequate", whereas the extracted item "quantification and description of all subjects lost to follow-up" was classified as assessing the domain "completeness of follow-up described". In the third step, we identified the domains necessary to assess potential biases. Each reviewer considered the ability of the identified domains to adequately address, at least in part, 1 of the following 6 potential biases: 1) study participation, 2) study attrition, 3) prognostic factor measurement, 4) confounding measurement and account, 5) outcome measurement, and 6) analysis. Domains were considered to adequately address part of the framework if information garnered from that domain would inform the assessment of potential bias. For example, both reviewers judged that the identified domain "study population represents source population or population of interest" assessed potential bias in a prognosis study, whereas the domain "research question definition" did not, although the latter is an important consideration in assessing the inclusion of studies in a systematic review. Finally, on the basis of our previous ratings, we looked at whether each review included items from the domains necessary to assess the 6 potential biases. We calculated the frequency of systematic reviews assessing each potential bias and the number of reviews that adequately assessed bias overall. From this systematic synthesis, we developed recommendations for improving quality appraisal in future systematic reviews of prognosis studies. We used Microsoft Access and Excel 2002 (Microsoft Corp., Redmond, Washington) for data management and SAS for Windows, version 9.1 (SAS Institute, Inc.
, Cary, North Carolina) for descriptive statistics. Role of the Funding Sources The funding sources, the Canadian Institutes of Health Research, the Canadian Chiropractic Research Foundation, the Ontario Chiropractic Association, and the Ontario Ministry of Health and Long Term Care, did not have a role in the collection, analysis, or interpretation of the data or in the decision to submit the manuscript for publication. Results We identified 1384 potentially relevant articles. Figure 1 shows a flow chart of studies that were included and excluded. Figure 2 shows the number of reviews identified by year of publication. We excluded 131 systematic reviews of prognosis studies that did not seem to include any quality assessment of the included studies; this represented 44% of prognosis reviews. We included 163 reviews of prognosis studies in our analysis (13-175). The most common topics were cancer (15%), musculoskeletal disorders and rheumatology (13%), cardiovascular (10%), neurology (10%), and obstetrics (10%). Other reviews included a wide range of health and health care topics. Sixty-three percent of the reviews investigated the association between a specific prognostic factor and a particular outcome; the remainder investigated multiple prognostic factors or models. The number of primary studies included in each systematic review ranged from 3 to 167 (median, 18 [interquartile range, 12 to 31]). A complete description of the included reviews is available from the authors on request. Figure 1. Flow diagram of inclusion and exclusion criteria of systematic reviews. Figure 2. Number of systematic reviews of prognosis studies identified over time. Quality Items One hundred fifty-three reviews provided adequate detail to allow extraction of quality items. Eight hundred eighty-two distinct quality items were extracted from the reviews.
Most reviews developed their own set of quality items, with only a few applying criteria from previous reviews. Most quality items Objective Accurately predicting the prognosis of young patients with breast cancer (<40 years) is uncertain, since the literature suggests they have a higher mortality and that age is an independent risk factor. In this cohort study we considered two prognostic tools, the Nottingham Prognostic Index and Adjuvant Online (Adjuvant!), in a group of young patients, comparing their predicted prognosis with their actual survival. Setting North East England. Participants Data were prospectively collected from the breast unit at a hospital in Grimsby between January 1998 and December 2007. A cohort of 102 young patients with primary breast cancer was identified and actual survival data were recorded. The Nottingham Prognostic Index and Adjuvant! scores were calculated and used to estimate 10-year survival probabilities. Pearson's correlation coefficient was used to demonstrate the association between the Nottingham Prognostic Index and Adjuvant! scores. A constant yearly hazard rate was assumed to generate 10-year cumulative survival curves using the Nottingham Prognostic Index and Adjuvant! predictions. Results Actual 10-year survival for the 92 patients who underwent potentially curative surgery for invasive cancer was 77.2% (CI 68.6% to 85.8%). There was no significant difference between the actual survival and the Nottingham Prognostic Index and Adjuvant! 10-year estimated survival, which was 77.3% (CI 74.4% to 80.2%) and 82.1% (CI 79.1% to 85.1%), respectively. The Nottingham Prognostic Index and Adjuvant! results demonstrated strong correlation, and both predicted cumulative survival curves accurately reflected the actual survival in young patients. Conclusions The Nottingham Prognostic Index and Adjuvant! are widely used to predict survival in patients with breast cancer.
In this study no statistically significant difference was shown between the predicted prognosis and actual survival of a group of young patients with breast cancer A prognostic index, previously derived in a group of 387 patients with primary breast cancer, has been recalculated for the same patients with over 5 years' further follow-up and shown to be unchanged. The prognostic index has also been applied prospectively to a further group of 320 patients and shown to be similarly effective in identifying patients with either a very good or a very poor prognosis. It has been verified that the index applies to patients with primary breast cancer. Patients have now been divided into 5 prognostic groups, predicting 11% of patients with an almost normal survival and a further 10% with a very poor prognosis. The index is used to stratify patients to study the effects of treatment regimes within groups of similar patients PURPOSE Our objective was to develop a nomogram to predict individual overall survival (OS) for primary breast cancer, based on pathological and biological tumor parameters. METHODS A retrospective study in a cohort of 180 patients with primary breast cancer was used to build the nomogram. Pathological factors and tumor proteases measured prospectively in primary tumors were used. A multivariate Cox proportional hazards model was used to explore the relationship with OS, and regression coefficients were used to build the nomogram. The nomogram was internally validated with 200 bootstrap resamples. RESULTS The final variables included in the nomogram comprised tumor size (P = 0.04), nodal pathological status (P = 0.01), estrogen receptor status (P = 0.04), urokinase plasminogen activator inhibitor-1 (PAI-1; P = 0.02), thymidine kinase (P = 0.03), and cathepsin D (P = 0.004).
The predictive accuracy of the nomogram at estimating the probability of OS, at both 2 and 5 years, was 0.874 and 0.832 before and after calibration, respectively. CONCLUSION A nomogram to predict 2- and 5-year OS in breast cancer, using histological and biological parameters, was successfully developed. This prognostic tool should prove useful in decision-making and therapeutic research Abstract Web-based prognostication tools may provide a simple and economically feasible option to aid prognostication and selection of chemotherapy in early breast cancers. We validated PREDICT, a free online breast cancer prognostication and treatment benefit tool, in a resource-limited setting. All 1480 patients who underwent complete surgical treatment for stages I to III breast cancer from 1998 to 2006 were identified from the prospective breast cancer registry of University Malaya Medical Centre, Kuala Lumpur, Malaysia. Calibration was evaluated by comparing the model-predicted overall survival (OS) with patients' actual OS. Model discrimination was tested using receiver-operating characteristic (ROC) analysis. Median age at diagnosis was 50 years. The median tumor size at presentation was 3 cm, and 54% of patients had lymph node-negative disease. About 55% of women had estrogen receptor-positive breast cancer. Overall, the model-predicted 5- and 10-year OS was 86.3% and 77.5%, respectively, whereas the observed 5- and 10-year OS was 87.6% (difference: −1.3%) and 74.2% (difference: 3.3%), respectively; P values for the goodness-of-fit test were 0.18 and 0.12, respectively. The program was accurate in most subgroups of patients, but significantly overestimated survival in patients aged <40 years and in those receiving neoadjuvant chemotherapy. PREDICT performed well in terms of discrimination; areas under the ROC curve were 0.78 (95% confidence interval [CI]: 0.74–0.81) and 0.73 (95% CI: 0.68–0.78) for 5- and 10-year OS, respectively.
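Discrimination and calibration checks like those reported for PREDICT can be computed from predicted risk scores and observed outcomes. The sketch below is an illustrative computation on synthetic data (not the study's dataset): the AUC uses the rank-based Mann-Whitney identity, and the calibration check is the crude gap between mean predicted probability and the observed event proportion.

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney identity:
    the probability that a random positive outscores a random negative."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def calibration_gap(pred_probs, outcomes):
    """Mean predicted probability minus observed event proportion;
    positive values indicate over-prediction."""
    return sum(pred_probs) / len(pred_probs) - sum(outcomes) / len(outcomes)

# Synthetic example: higher score should indicate the event occurred
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
labels = [1, 1, 1, 0, 1, 0, 0, 0]
print(auc(scores, labels))  # → 0.9375
```

An AUC of 0.5 means no discrimination and 1.0 means perfect separation, which is why values of 0.73 to 0.78, as reported for PREDICT, are read as reasonable discrimination.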
Based on its accurate performance in this study, PREDICT may be clinically useful in prognosticating women with breast cancer and personalizing breast cancer treatment in resource-limited settings The Nottingham Prognostic Index Plus (NPI+) is a clinical decision-making tool in breast cancer (BC) that aims to provide patient outcome stratification superior to the traditional NPI. This study aimed to validate the NPI+ in an independent series of BC. Eight hundred and eighty-five primary early-stage BC cases from Edinburgh were semi-quantitatively assessed for 10 biomarkers [estrogen receptor (ER), progesterone receptor (PgR), cytokeratin (CK) 5/6, CK7/8, epidermal growth factor receptor (EGFR), HER2, HER3, HER4, p53, and Mucin 1] using immunohistochemistry and classified into biological classes by fuzzy logic-derived algorithms previously developed in the Nottingham series. Subsequently, NPI+ Prognostic Groups (PGs) were assigned for each class using bespoke NPI-like formulae, previously developed in each NPI+ biological class of the Nottingham series, utilising clinicopathological parameters: number of positive nodes, pathological tumour size, stage, tubule formation, nuclear pleomorphism, and mitotic counts. Biological classes and PGs were compared between the Edinburgh and Nottingham series using Cramer's V, and their role in patient outcome prediction was assessed using Kaplan-Meier curves and tested using the log-rank test. The NPI+ biomarker panel classified the Edinburgh series into seven biological classes similar to the Nottingham series (p > 0.01). The biological classes were significantly associated with patient outcome (p < 0.001). PGs were comparable in predicting patient outcome between series in Luminal A, Basal p53-altered, and HER2+/ER+ tumours (p > 0.01). The good PGs were similarly validated in Luminal B, Basal p53-normal, and HER2+/ER− tumours, and the poor PG in the Luminal N class (p > 0.01).
Due to small patient numbers assigned to the remaining PGs, the Luminal N, Luminal B, Basal p53-normal, and HER2+/ER− classes could not be validated. This study demonstrates the reproducibility of NPI+ and confirms its prognostic value in an independent cohort of primary BC. Further validation in large randomised controlled trial material is warranted OBJECTIVE To investigate the behavior of predictive performance measures that are commonly used in external validation of prognostic models for outcome at intensive care units (ICUs). STUDY DESIGN AND SETTING Four prognostic models (Simplified Acute Physiology Score II, the Acute Physiology and Chronic Health Evaluation II, and the Mortality Probability Models II) were evaluated in the Dutch National Intensive Care Evaluation registry database. For each model, discrimination (AUC), accuracy (Brier score), and two calibration measures were assessed on data from 41,239 ICU admissions. This validation procedure was repeated with smaller subsamples randomly drawn from the database, and the results were compared with those obtained on the entire dataset. RESULTS Differences in performance between the models were small. The AUC and Brier score showed large variation with small samples. Standard errors of AUC values were accurate, but the power to detect differences in performance was low. Calibration tests were extremely sensitive to sample size. Direct comparison of performance, without statistical analysis, was unreliable with either measure. CONCLUSION Substantial sample sizes are required for performance assessment and model comparison in external validation. Calibration statistics and significance tests should not be used in these settings. Instead, a simple customization method to repair lack-of-fit problems is recommended OBJECTIVE To develop a new web-based tool, designated IBTR!
, which integrates prognostic factors for local recurrence (LR) into a model to predict the 10-year risk of LR after breast-conserving surgery (BCS) with or without radiation therapy (RT), with the goal of assisting with patient counseling and medical decision-making. METHODS All available randomized trials of BCS alone versus BCS plus RT, meta-analyses, and institutional reports were reviewed to identify the principal prognostic factors for LR after breast-conserving therapy. Patient age, margin status, lymphovascular invasion (LVI), tumor size, tumor grade, use of chemotherapy, and use of hormonal therapy were found to consistently and significantly impact LR across multiple studies. Based upon a composite analysis of the relevant published randomized and nonrandomized studies, relative risk (RR) ratios were estimated and assigned to each prognostic category. These RR ratios were entered into a mathematical model with the 10-year baseline rates of recurrence with and without RT, 7% and 24%, respectively, to predict patient-specific LR risk. RESULTS Individual data entered into this computer model with regard to patient age, margin status, LVI, tumor size, tumor grade, use of chemotherapy, and use of hormonal therapy will generate a patient-specific predicted 10-year LR risk with and without RT. A graphic representation of the relative risk reduction with RT will also be displayed alongside the numerical display. CONCLUSION IBTR! is a first attempt at a computer model incorporating LR prognostic factors in an evidence-based fashion to predict individual LR risk and the potential additional benefit from RT Summary In 1982 we constructed a prognostic index for patients with primary, operable breast cancer. This index was based on a retrospective analysis of 9 factors in 387 patients. Only 3 of the factors (tumour size, stage of disease, and tumour grade) remained significant on multivariate analysis.
The index was subsequently validated in a prospective study of 320 patients. We now present the results of applying this prognostic index to all of the first 1,629 patients in our series of operable breast cancer up to the age of 70. We have used the index to define three subsets of patients with different chances of dying from breast cancer: 1) good prognosis, comprising 29% of patients with 80% 15-year survival; 2) moderate prognosis, 54% of patients with 42% 15-year survival; 3) poor prognosis, 17% of patients with 13% 15-year survival. The 15-year survival of an age-matched female population was 83% The Nottingham prognostic index (NPI) is a widely adopted prognostic tool in breast cancer. The HER-2 oncogene overexpression is also associated with higher risk. The aim of our study is to investigate the correlation between the NPI and HER-2 in breast cancer. Two hundred and five patients with breast cancer managed in the Department of Surgery, University of Hong Kong Medical Centre, at Queen Mary Hospital between January 1995 and December 1997 were studied. This is a retrospective analysis of a prospectively collected database. Variables evaluated for the analysis of prognostic significance included tumor size, axillary lymph node status, histological grade, estrogen and progesterone receptor status and HER-2 oncogene expression. NPI was calculated from tumor size, lymph node status and histological grade. In univariate analysis, tumor size, axillary lymph node status, distant metastasis status, HER-2 oncogene overexpression and NPI score were found to be significant variables. Significant poorer overall survival was also observed with higher NPI category and positive HER-2. Only tumor size and NPI category with subdivision by HER-2 status were independent factors in multivariate analysis.
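The NPI calculation mentioned in the abstract above is conventionally written as 0.2 × tumour size (cm) + lymph-node stage (1–3) + histological grade (1–3). The sketch below uses commonly cited good/moderate/poor cut-points, which vary slightly between reports and may differ from those used in these particular series.

```python
# Sketch of the conventional Nottingham Prognostic Index formula.
# Cut-points below are the commonly cited ones, used here for illustration.

def npi(size_cm, node_stage, grade):
    """NPI = 0.2 * tumour size (cm) + node stage (1-3) + grade (1-3)."""
    return 0.2 * size_cm + node_stage + grade

def npi_group(score):
    """Commonly cited prognostic groups (thresholds vary between reports)."""
    if score <= 3.4:
        return "good"
    if score <= 5.4:
        return "moderate"
    return "poor"

# Hypothetical patient: 1 cm tumour, node stage 1, grade 2.
score = npi(size_cm=1.0, node_stage=1, grade=2)
print(score, npi_group(score))  # roughly 3.2, "good"
```

Because only three variables enter the index, studies such as the one above naturally ask whether adding markers like HER-2 status improves its discrimination.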
The combination of the HER-2 oncogene expression status with the widely adopted NPI provides a reliable way to predict prognosis in breast cancer patients Background To evaluate the prognostic effect of log odds of positive lymph nodes (LODDS) and develop a nomogram for survival prediction in breast cancer patients at the time of surgery. Results LODDS was an independent risk factor for cancer-related death in breast cancer (hazard ratio: 1.582, 95% CI: 1.190-2.104). Menopausal status, tumor size, pathological lymph node staging, estrogen receptor status and human epidermal growth factor receptor-2 status were also included in the nomogram. The calibration plots indicated optimal agreement between the nomogram prediction and actual observation. Discrimination of the nomogram was superior to the seventh edition TNM staging system [C-index: 0.745 vs. 0.721 (p = 0.03) in training cohort; 0.796 vs. 0.726 (p < 0.01) in validation cohort]. Methods We retrospectively evaluated 2023 breast cancer patients from Jan 2002 to Dec 2008 at our center. The cohort was randomly divided into training cohort and validation cohort. Univariate and multivariate analyses were performed to identify prognostic factors, and the nomogram was established using a Cox regression model in the training cohort. External validation of the nomogram was performed in the validation cohort. Conclusions The LODDS is an independent prognostic indicator in breast cancer and the novel nomogram can provide individual prediction of cancer-specific survival and help prognostic assessment for breast cancer patients CONTEXT Previous studies indicate that the interpretation of trial results can be distorted by authors of published reports.
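The abstract does not spell out its LODDS formula, but the measure is usually defined as the log of the ratio of positive to negative examined nodes with a 0.5 continuity correction, which keeps it finite when all or none of the nodes are positive. The sketch below assumes that common definition.

```python
# Sketch of the usual LODDS (log odds of positive lymph nodes) definition,
# with a 0.5 continuity correction; assumed here, not quoted from the study.
import math

def lodds(positive_nodes, examined_nodes):
    negative = examined_nodes - positive_nodes
    return math.log((positive_nodes + 0.5) / (negative + 0.5))

print(lodds(3, 12))   # negative: fewer positive than negative nodes
print(lodds(0, 10))   # still finite even with zero positive nodes
```

Unlike raw node counts, LODDS is a continuous covariate, which is one reason it fits naturally into the Cox regression nomogram the study builds.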
OBJECTIVE To identify the nature and frequency of distorted presentation or "spin" (ie, specific reporting strategies, whatever their motive, to highlight that the experimental treatment is beneficial, despite a statistically nonsignificant difference for the primary outcome, or to distract the reader from statistically nonsignificant results) in published reports of randomized controlled trials (RCTs) with statistically nonsignificant results for primary outcomes. DATA SOURCES March 2007 search of MEDLINE via PubMed using the Cochrane Highly Sensitive Search Strategy to identify reports of RCTs published in December 2006. STUDY SELECTION Articles were included if they were parallel-group RCTs with a clearly identified primary outcome showing statistically nonsignificant results (ie, P > or = .05). DATA EXTRACTION Two readers appraised each selected article using a pretested, standardized data abstraction form developed in a pilot test. RESULTS From the 616 published reports of RCTs examined, 72 were eligible and appraised. The title was reported with spin in 13 articles (18.0%; 95% confidence interval [CI], 10.0%-28.9%). Spin was identified in the Results and Conclusions sections of the abstracts of 27 (37.5%; 95% CI, 26.4%-49.7%) and 42 (58.3%; 95% CI, 46.1%-69.8%) reports, respectively, with the conclusions of 17 (23.6%; 95% CI, 14.4%-35.1%) focusing only on treatment effectiveness. Spin was identified in the main-text Results, Discussion, and Conclusions sections of 21 (29.2%; 95% CI, 19.0%-41.1%), 31 (43.1%; 95% CI, 31.4%-55.3%), and 36 (50.0%; 95% CI, 38.0%-62.0%) reports, respectively. More than 40% of the reports had spin in at least 2 of these sections in the main text.
CONCLUSION In this representative sample of RCTs published in 2006 with statistically nonsignificant primary outcomes, the reporting and interpretation of findings was frequently inconsistent with the results PURPOSE To develop clinical prediction models for local regional recurrence (LRR) of breast carcinoma after mastectomy that will be superior to the conventional measures of tumor size and nodal status. METHODS AND MATERIALS Clinical information from 1,010 invasive breast cancer patients who had primary modified radical mastectomy formed the database for the training and testing of clinical prognostic and prediction models of LRR. Cox proportional hazards analysis and Bayesian tree analysis were the core methodologies from which these models were built. To generate a prognostic index model, 15 clinical variables were examined for their impact on LRR. Patients were stratified by lymph node involvement (< 4 vs. > or = 4) and local regional status (recurrent vs. control) and then, within strata, randomly split into training and test data sets of equal size. To establish prediction tree models, 255 patients were selected by the criteria of having had LRR (53 patients) or no evidence of LRR without postmastectomy radiotherapy (PMRT) (202 patients). RESULTS With these models, patients can be divided into low-, intermediate-, and high-risk groups on the basis of axillary nodal status, estrogen receptor status, lymphovascular invasion, and age at diagnosis. In the low-risk group, there is no influence of PMRT on either LRR or survival. For intermediate-risk patients, PMRT improves LR control but not metastases-free or overall survival. For the high-risk patients, however, PMRT improves both LR control and metastasis-free and overall survival.
CONCLUSION The prognostic score and predictive index are useful methods to estimate the risk of LRR in breast cancer patients after mastectomy and for estimating the potential benefits of PMRT. These models provide additional information criteria for selection of patients for PMRT, compared with the traditional selection criteria of nodal status and tumor size AIMS After treatment, early breast cancer patients undergo follow-up according to standard regimens. After the first year, the main goal is particularly to detect locoregional recurrences (LRR). Our aim was to develop a simple prognostic index to predict LRR to tailor the follow-up programme. METHODS We used data from four large international clinical randomised trials and constructed the prognostic index using Cox proportional hazards regression. The bootstrap (a resampling method) was used for internal validation. RESULTS A total of 6516 patients treated according to current guidelines with complete covariable information were used for analysis. Covariables important for LRR in patients treated with breast conserving therapy were age, pathological tumour status, boost and surgical margins. The same variables were important for patients treated with a mastectomy; however, instead of the boost, the pathological nodal status was important. The index is composed to consist of three groups based on LRR risk after 10 years. CONCLUSIONS We constructed a simple prognostic index that can be used to estimate risks of LRR in patients with early breast cancer. The prognostic index enables patients to be stratified into three subgroups with different outcomes with regard to LRR PURPOSE The goal of the computer program Adjuvant! is to allow health professionals and their patients with early breast cancer to make more informed decisions about adjuvant therapy.
METHODS Actuarial analysis was used to project outcomes of patients with and without adjuvant therapy based on estimates of prognosis largely derived from Surveillance, Epidemiology, and End-Results data and estimates of the efficacy of adjuvant therapy based on the 1998 overviews of randomized trials of adjuvant therapy. These estimates can be refined using the Prognostic Factor Impact Calculator, which uses a Bayesian method to make adjustments based on relative risks conferred and prevalence of positive test results. RESULTS From the entries of patient information (age, menopausal status, comorbidity estimate) and tumor staging and characteristics (tumor size, number of positive axillary nodes, estrogen receptor status), baseline prognostic estimates are made. Estimates for the efficacy of endocrine therapy (5 years of tamoxifen) and of polychemotherapy (cyclophosphamide/methotrexate/fluorouracil-like regimens, or anthracycline-based therapy, or therapy based on both an anthracycline and a taxane) can then be used to project outcomes presented in both numerical and graphical formats. Outcomes for overall survival and disease-free survival, and the improvement seen in clinical trials, are reasonably modeled by Adjuvant!, although an ideal validation for all patient subsets with all treatment options is not possible. Additional speculative estimates of years of remaining life expectancy and long-term survival curves can also be produced. Help files supply general information about breast cancer. The program's Internet links supply national treatment guidelines, cooperative group trial options, and other related information. CONCLUSION The computer program Adjuvant!
can play practical and educational roles in clinical settings OBJECTIVE To assess the ability of the morphometric prognostic index (MPI) in predicting clinical outcome in a group of breast cancer patients with short-term follow-up and to assess the relationship between MPI and other prognosticators. STUDY DESIGN The study group consisted of 63 cases of breast cancer. Follow-up data were available for 48 patients. MPI values were calculated, and the degree of nuclear and tubular differentiation was investigated in each tumor. S-phase fraction (SPF), estrogen and progesterone receptors were also studied. RESULTS The group of patients with MPI values < 0.60 had percent values of disease-free survival significantly higher than did those with MPI values > or = 0.60. Furthermore, significant direct correlations were found between MPI and degree of nuclear atypia and between MPI and SPF. Significant inverse relationships were found between MPI and tumor progesterone receptor levels and between MPI and degree of histologic tubular differentiation. CONCLUSION The validity of MPI as a prognosticator in breast cancer was confirmed, even in a limited number of patients observed in short-term follow-up. MPI seems to be a reliable and economical prognosticator in selecting breast cancer patients for adjuvant chemotherapy PURPOSE IBTR! version 1.0 is a web-based tool that uses literature-derived relative risk ratios for seven clinicopathologic variables to predict ipsilateral breast tumor recurrence (IBTR) after breast-conserving therapy (BCT). Preliminary testing demonstrated over-estimation in high-risk subgroups. This study uses two independent population-based data sets to create and validate a modified nomogram, IBTR! version 2.0. METHODS Cox regression modeling was performed on 7,811 patients treated with BCT at the British Columbia Cancer Agency (median follow-up, 9.4 years).
Population-based hazard ratios were generated for the seven variables in the original nomogram. A modified nomogram was then tested against 664 patients from Massachusetts General Hospital (median follow-up, 9.3 years). The mean predicted and observed 10-year estimates were compared for the entire cohort and for four groups predefined by nomogram-predicted risks: group 1: less than 3%; group 2: 3% to 5%; group 3: 5% to 10%; and group 4: more than 10%. Results IBTR! version 2.0 predicted an overall 10-year IBTR estimate of 4.0% (95% CI, 3.8 to 4.2), while the observed estimate was 2.8% (95% CI, 1.6 to 4.7; P = .10). The predicted and observed IBTR estimates were: group 1 (n = 283): 2.2% versus 1.3%, P = .40; group 2 (n = 237): 3.8% versus 3.5%, P = .80; group 3 (n = 111): 6.7% versus 3.2%, P = .05; and group 4 (n = 33): 12.5% versus 8.7%, P = .50. CONCLUSION IBTR! version 2.0 is accurate in the majority of patients with a low to moderate risk of in-breast recurrence. The nomogram still overestimates risk in a minority of patients with higher risk features. Validation in a larger prospective data set is warranted
2,017
31,295,933
Mixed results were found regarding the effectiveness of many types of interventions. Interventions exclusively delivered in the acute hospital pre-discharge and those involving education were most common but their effectiveness was limited in avoiding (re)admission. Successful pre- and post-discharge interventions focused on multidisciplinary approaches. Post-discharge interventions exclusively delivered at home reduced hospital stay and contributed to patient satisfaction. Existing systematic reviews on tele-health and long-term care interventions suggest insufficient evidence for admission avoidance. The most effective interventions to avoid inappropriate re-admission to hospital and promote early discharge included integrated systems between hospital and community care, multidisciplinary service provision, individualization of services, discharge planning initiated in hospital and specialist follow-up
Increasing pressure on limited healthcare resources has necessitated the development of measures promoting early discharge and avoiding inappropriate hospital (re)admission.
Background Geriatric evaluation and management has become standard care for community-dwelling older adults following an acute admission to hospital. It is unclear whether this approach is beneficial for the frailest older adults living in permanent residential care. This study was undertaken to evaluate (1) the feasibility and consumer satisfaction with a geriatrician-led supported discharge service for older adults living in residential care facilities (RCF) and (2) its impact on the uptake of Advanced Care Planning (ACP) and acute health care service utilisation. Methods In 2002–4 a randomised controlled trial was conducted in Melbourne, Australia comparing the geriatrician-led outreach service to usual care for RCF residents. Patients were recruited during their acute hospital stay and followed up at the RCF for six months. The intervention group received a post-discharge home visit within 96 hours, at which a comprehensive geriatric assessment was performed and a care plan developed. Participants and their families were also offered further meetings to discuss ACPs and document Advanced Directives (AD). Additional reviews were made available for assessment and management of intercurrent illness within the RCF. Consumer satisfaction was surveyed using a postal questionnaire. Results The study included 116 participants (57 intervention and 59 controls) with comparable baseline characteristics. The service was well received by consumers, demonstrated by higher satisfaction with care in the intervention group compared to controls (95% versus 58%, p = 0.006). AD were completed by 67% of participants/proxy decision makers in the intervention group compared to 13% of RCF residents prior to service commencement.
At six months there was a significant reduction in outpatient visits (intervention 21 (37%) versus controls 45 (76%), p < 0.001), but no difference in readmission rates (39% intervention versus 34% control, p = 0.6). There was a trend towards reduced hospital bed-day utilisation (intervention 271 versus controls 372 days). Conclusion It is feasible to provide a supported discharge service that includes geriatrician assessment and care planning within a RCF. By expanding the service there is the potential for acute health care cost savings by decreasing the demand for outpatient consultation and further reducing acute care bed-days Objective: To investigate the impact of an in-reach rehabilitation team for patients admitted after road trauma. Design: Randomised controlled trial of usual care versus early involvement of an in-reach rehabilitation team. Telephone follow-up was conducted by a blinded assessor at three months for those with minor/moderate injuries and six months for serious/severe injuries. Setting: Four participating trauma services in New South Wales, Australia. Subjects: A total of 214 patients admitted during 2012-2015 with a length of stay of at least five days. Intervention: Provision of rehabilitation services in parallel with ward-based therapy using an in-reach team for the intervention group. The control group could still access the ward-based therapy (usual care). Main measures: The primary outcome was acute length of stay. Secondary outcomes included percentage requiring inpatient rehabilitation, function (Functional Independence Measure and Timed Up and Go Test), psychological status (Depression Anxiety and Stress Score 21), pain (Orebro Musculoskeletal Pain Questionnaire) and quality of life (Short Form-12 v2). Results: Median length of stay in acute care was 13 days (IQR 8-21).
The intervention group, compared to the control group, received more physiotherapy and occupational therapy sessions (median number of sessions 16.0 versus 11.5, P=0.003). However, acute length of stay did not differ between the intervention and control groups (median 15 vs 12 days, P=0.37). There were no significant differences observed in the secondary outcomes at hospital discharge and follow-up. Conclusion: No additional benefit was found from the routine use of acute rehabilitation teams for trauma patients over and above usual care Background Despite growing emphasis on transitional care to reduce costs and improve quality, few studies have examined transitional care improvements in socioeconomically disadvantaged adults. It is important to consider these patients separately as many are high-utilizers, have different needs, and may have different responses to interventions. Objective To evaluate the impact of a multicomponent transitional care improvement program on 30-day readmissions, emergency department (ED) use, transitional care quality, and mortality. Design Clustered randomized controlled trial conducted at a single urban academic medical center in Portland, Oregon. Participants Three hundred eighty-two hospitalized low-income adults admitted to general medicine or cardiology who were uninsured or had public insurance. Intervention Multicomponent intervention including (1) transitional nurse coaching and education, including home visits for highest-risk patients; (2) pharmacy care, including provision of 30 days of medications after discharge for those without prescription drug coverage; (3) post-hospital primary care linkages; (4) systems integration and continuous quality improvement. Measurements Primary outcomes included 30-day inpatient readmission and ED use. Readmission data were obtained using state-wide administrative data for all participants (insured and uninsured).
Secondary outcomes included quality (3-item Care Transitions Measure) and mortality. Research staff administering questionnaires and assessing outcomes were blinded. Results There was no significant difference in 30-day readmission between C-TraIn (30/209, 14.4%) and control patients (27/173, 16.1%), p = 0.644, or in ED visits between C-TraIn (51/209, 24.4%) and control (33/173, 19.6%), p = 0.271. C-TraIn was associated with improved transitional care quality; 47.3% (71/150) of C-TraIn patients reported a high-quality transition compared to 30.3% (36/119) of control patients, odds ratio 2.17 (95% CI 1.30–3.64). Zero C-TraIn patients died in the 30-day post-discharge period compared with five in the control group (unadjusted p = 0.02). Conclusions C-TraIn did not reduce 30-day inpatient readmissions or ED use; however, it improved transitional care quality Background Intermediate care is a health care model developed to optimize the coordination of health care services and functional independence. In Central Norway, an intermediate care hospital (ICH) was established in a municipality to improve hospital discharge and follow-up among elderly patients with chronic conditions and comprehensive care needs. The aim of this study was to investigate the effectiveness of hospital discharges to a municipality with an ICH compared to discharges to a municipality without an ICH. Methods This was a non-randomized controlled observational study of hospitalized patients aged 60 years and older from two municipalities. Patients (n = 328) admitted to a general hospital from February 2010 through September 2011 were included in the study and followed for 12 months. The data were analyzed using descriptive statistics, analysis of covariance (ANCOVA) and Cox proportional hazard regression.
Results Each patient discharged from the general hospital to the municipality with an ICH had a shorter length of stay and used on average 4.2 (p = 0.046) fewer hospital days during 1 year compared to patients from the municipality without an ICH. Otherwise, no statistically significant differences were found between the municipalities in terms of hospital readmissions, admissions, mortality, activities of daily living, primary health care utilization or total care days. A post hoc analysis of patients discharged to the ICH compared to the municipality without an ICH showed that the ICH patients were older and frailer, but the outcome was similar to the main analysis. Conclusions Having an ICH in the municipality facilitated a shorter length of hospital stay and kept the risk of readmissions, mortality and post-hospitalization care needs at the same level as without an ICH Background One of the reasons physiotherapy services are provided to emergency departments (EDs) and emergency extended care units (EECUs) is to review patients' mobility to ensure they are safe to be discharged home. Aim To investigate whether a physiotherapy service to an EECU altered the rate of hospital admission, rate of re-presentation to the ED, visits to community healthcare practitioners, return to usual work/home/leisure activities and patient satisfaction. Methods A randomised trial with concealed allocation, assessor blinding and intention-to-treat analysis was undertaken in an EECU. The sample comprised 186 patients (mean age 70 years, 123 (66%) female patients, 130 (70%) trauma) who were referred for physiotherapy assessment/intervention. Referral occurred at any stage of the patients' EECU admission. All participants received medical/nursing care as required. The physiotherapy group also received physiotherapy assessment/intervention.
Results The physiotherapy group had a 4% (95% CI −18% to 9%) lower rate of admission to hospital than the control group and a 4% (95% CI −6% to 13%) higher rate of re-presentation to the ED, which were statistically non-significant (p≥0.45). Differences between groups for use of community healthcare resources, return to usual work/home/leisure activities and satisfaction with their EECU care were small and not significant. Conclusion A physiotherapy service for EECU patients, as provided in this study, did not reduce the rate of hospital admission, rate of re-presentation to the ED, use of community healthcare resources, or improve the rate of return to usual work/home/leisure activities or patient satisfaction. Trial registration number ANZCTRN12609000106235 IMPORTANCE Hospital readmissions are common and costly, and no single intervention or bundle of interventions has reliably reduced readmissions. Virtual wards, which use elements of hospital care in the community, have the potential to reduce readmissions, but have not yet been rigorously evaluated. OBJECTIVE To determine whether a virtual ward (a model of care that uses some of the systems of a hospital ward to provide interprofessional care for community-dwelling patients) can reduce the risk of readmission in patients at high risk of readmission or death when being discharged from hospital. DESIGN, SETTING, AND PATIENTS High-risk adult hospital discharge patients in Toronto were randomly assigned to either the virtual ward or usual care. A total of 1923 patients were randomized during the course of the study: 960 to the usual care group and 963 to the virtual ward group. The first patient was enrolled on June 29, 2010, and follow-up was completed on June 2, 2014.
INTERVENTIONS Patients assigned to the virtual ward received care coordination plus direct care provision (via a combination of telephone, home visits, or clinic visits) from an interprofessional team for several weeks after hospital discharge. The interprofessional team met daily at a central site to design and implement individualized management plans. Patients assigned to usual care typically received a typed, structured discharge summary, a prescription for new medications if indicated, counseling from the resident physician, arrangements for home care as needed, and recommendations, appointments, or both for follow-up care with physicians as indicated. MAIN OUTCOMES AND MEASURES The primary outcome was a composite of hospital readmission or death within 30 days of discharge. Secondary outcomes included nursing home admission and emergency department visits, each of the components of the primary outcome at 30 days, as well as each of the outcomes (including the composite primary outcome) at 90 days, 6 months, and 1 year. RESULTS There were no statistically significant between-group differences in the primary or secondary outcomes at 30 or 90 days, 6 months, or 1 year. The primary outcome occurred in 203 of 959 (21.2%) of the virtual ward patients and 235 of 956 (24.6%) of the usual care patients (absolute difference, 3.4%; 95% CI, -0.3% to 7.2%; P = .09). There were no statistically significant interactions to indicate that the virtual ward model of care was more or less effective in any of the prespecified subgroups. CONCLUSIONS AND RELEVANCE In a diverse group of high-risk patients being discharged from the hospital, we found no statistically significant effect of a virtual ward model of care on readmissions or death at either 30 days or 90 days, 6 months, or 1 year after hospital discharge.
TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT01108172 Residents of long-term care facilities have highly complex care needs and quality of care is of international concern. Maintaining resident wellness through proactive assessment and early intervention is key to decreasing the need for acute hospitalization. The Residential Aged Care Integration Program (RACIP) is a quality improvement intervention to support residential aged care staff and includes on-site support, education, clinical coaching, and care coordination provided by gerontology nurse specialists (GNSs) employed by a large district health board. The effect of the outreach program was evaluated through a randomized comparison of hospitalization 1 year before and after program implementation. The sample included 29 intervention facilities (1,425 residents) and 25 comparison facilities (1,128 residents) receiving usual care. The acute hospitalization rate unexpectedly increased for both groups after program implementation, although the rate of increase was significantly less for the intervention facilities. The hospitalization rate after the intervention increased 59% for the comparison group and 16% for the intervention group (rate ratio (RR) = 0.73, 95% confidence interval (CI) = 0.61-0.86, P < .001). Subgroup analysis showed a significantly lower rate change for those admitted for medical reasons in the intervention group (13% increase) than the comparison group (69% increase) (RR = 0.67, 95% CI = 0.56-0.82, P < .001). Conversely, there was no significant difference in the RR for surgical admissions between the intervention and comparison groups (RR = 1.0, 95% CI = 0.68-1.46, P = .99).
The integration of GNS expertise through the RACIP intervention may be one approach to support staff to provide optimal care and potentially improve resident health Abstract Objective to compare the clinical and cost-effectiveness of a Community In-reach Rehabilitation and Care Transition (CIRACT) service with the traditional hospital-based rehabilitation (THB-Rehab) service. Design pragmatic randomised controlled trial with an integral health economic study. Settings large UK teaching hospital, with community follow-up. Subjects frail older people aged 70 years and older admitted to hospital as an acute medical emergency. Measurements Primary outcome: hospital length of stay; secondary outcomes: readmission, day-91 super-spell bed days, functional ability, co-morbidity and health-related quality of life; cost-effectiveness analysis. Results a total of 250 participants were randomised. There was no significant difference in length of stay between the CIRACT and THB-Rehab service (median 8 versus 9 days; geometric mean 7.8 versus 8.7 days, mean ratio 0.90, 95% confidence interval (CI) 0.74–1.10). Of the participants who were discharged from hospital, 17% and 13% were readmitted within 28 days from the CIRACT and THB-Rehab services, respectively (risk difference 3.8%, 95% CI −5.8% to 13.4%). There were no other significant differences in any of the other secondary outcomes between the two groups. The mean costs (including NHS and personal social services) of the CIRACT and THB-Rehab service were £3,744 and £3,603, respectively (mean cost difference £144; 95% CI −1,645 to 1,934). Conclusion the CIRACT service does not reduce major hospital length of stay nor reduce short-term readmission rates, compared to the standard THB-Rehab service; however, a modest (< 2.3 days) effect cannot be excluded. Further studies are necessary, powered with larger sample sizes and cluster randomisation.
Trial registration ISRCTN94393315, 25th April Purpose: The purpose of the study was to identify predictors and outcomes of adult medical-surgical patients' perceptions of their readiness for hospital discharge. Design: A correlational, prospective, longitudinal design with path analyses was used to explore relationships among transition theory-related variables. Setting: Midwestern tertiary medical center. Sample: 147 adult medical-surgical patients. Methods: Predictor variables included patient characteristics, hospitalization factors, and nursing practices that were measured prior to hospital discharge using a study enrollment form, the Quality of Discharge Teaching Scale, and the Care Coordination Scale. Discharge readiness was measured using the Readiness for Hospital Discharge Scale administered within 4 hours prior to discharge. Outcomes were measured 3 weeks postdischarge with the Post-Discharge Coping Difficulty Scale and self-reported utilization of health services. Findings: Living alone, discharge teaching (amount of content received and nurses' skill in teaching delivery), and care coordination explained 51% of readiness for discharge score variance. Patient age and discharge readiness explained 16% of variance in postdischarge coping difficulty. Greater readiness for discharge was predictive of fewer readmissions. Conclusions: Quality of the delivery of discharge teaching was the strongest predictor of discharge readiness. Study results provided support for Meleis' transitions theory as a useful model for conceptualizing and investigating the discharge transition.
Implications for Practice: The study results have implications for the CNS role in patient and staff education, system building for the postdischarge transition, and measurement of clinical care outcomes. Objectives: To determine the effectiveness of early assisted discharge for chronic obstructive pulmonary disease (COPD) exacerbations, with home care provided by generic community nurses, compared with usual hospital care. Design: Prospective, randomised controlled multicentre trial with 3-month follow-up. Setting: Five hospitals and three home care organisations in the Netherlands. Participants: Patients admitted to the hospital with an exacerbation of COPD. Patients with no or limited improvement of respiratory symptoms and patients with severe unstable comorbidities, social problems or those unable to visit the toilet independently were excluded. Intervention: Early discharge from hospital after 3 days of inpatient treatment. Home visits by generic community nurses. The primary outcome measure was change in health status measured by the Clinical COPD Questionnaire (CCQ). Treatment failures, readmissions, mortality and change in generic health-related quality of life (HRQL) were secondary outcome measures. Results: 139 patients were randomised. No difference between groups was found in change in CCQ score at day 7 (difference in mean change 0.29 (95% CI −0.03 to 0.61)) or at 3 months (difference in mean change 0.04 (95% CI −0.40 to 0.49)). No difference was found in secondary outcomes. At day 7 there was a significant difference in change in generic HRQL, favouring usual hospital care. Conclusions: While patients' disease-specific health status after 7-day treatment tended to be somewhat better in the usual hospital care group, the difference was small and not clinically relevant or statistically significant. After 3 months, the difference had disappeared.
A significant difference in generic HRQL at the end of the treatment had disappeared after 3 months, and there was no difference in treatment failures, readmissions or mortality. Early assisted discharge with community nursing is feasible and an alternative to usual hospital care for selected patients with an acute COPD exacerbation. Trial registration: Netherlands Trial Register NTR1129. Background: Although the cost of Crohn's disease (CD) treatment differs considerably, hospitalization and surgery costs account for most of the total treatment cost. Decreasing hospitalization and surgery rates are pivotal issues in reducing health-care costs. Material/Methods: We evaluated the effect of azathioprine (AZA) compared with mesalazine on the incidence of re-hospitalizations due to all causes and for CD-related surgeries. In this controlled, randomized study, 72 subjects with sub-occlusive ileocecal CD were randomized to AZA (2–3 mg/kg per day) or mesalazine (3.2 g per day) therapy during a 3-year period. The primary end point was the re-hospitalization proportion due to all causes, as well as for surgical procedures during this period, evaluated between the groups. Results: On an intention-to-treat basis, the proportion of patients re-hospitalized within 36 months due to all causes was lower in patients treated with AZA compared to those on mesalazine (0.39 vs. 0.83, respectively; p=0.035). The AZA group also had significantly lower proportions of re-hospitalization for surgical intervention (0.25 vs. 0.56, respectively; p=0.011). The number of admissions (0.70 vs. 1.41, p=0.001) and the length of re-hospitalization (3.8 vs. 7.7 days; p=0.002) were both lower in AZA patients. Conclusions: Patients with sub-occlusive ileocecal CD treated with AZA had lower re-hospitalization rates due to all causes and for surgical management of CD compared to those treated with mesalazine during a 3-year period.
The long-term use of AZA in ileocecal CD patients recovering from a sub-occlusion episode can save healthcare costs. AIM: To determine the effect on quality of life and cost-effectiveness of specialist nurse early supported discharge for women undergoing major abdominal and/or pelvic surgery for benign gynaecological disease compared with routine care. STUDY DESIGN: Randomised controlled trial comparing specialist nurse supported discharge with routine hospital care in gynaecology. The SF-36, a generic health status questionnaire, was used to measure women's evaluation of their health state before surgery and at 6 weeks after surgery. A further questionnaire scoring patient symptoms, milestones of recovery, information given and satisfaction was administered prior to discharge from hospital and at 6 weeks thereafter. SETTING: Gynaecology service at the Western Infirmary Glasgow, part of North Glasgow University NHS Trust. PARTICIPANTS: One hundred and eleven women scheduled for major abdominal or pelvic surgery for benign gynaecological disease. MAIN OUTCOME MEASURES: SF-36 health survey questionnaire baseline scores were reported before surgery and at 6 weeks follow-up. Complications, length of hospital stay, readmission, information on discharge support and satisfaction of women were recorded at discharge from hospital and at 6 weeks follow-up. A cost consequence analysis was conducted from the perspective of the NHS. RESULTS: The addition of a specialist nurse to routine hospital care in gynaecology significantly reduced the post-operative length of hospital stay (p = 0.001) and improved information delivery and satisfaction of women. The specialist nurse supported discharge group was associated with significantly lower total costs to the NHS than routine care, resulting principally from the difference in the cost of the post-operative length of stay.
CONCLUSIONS: Women undergoing major abdominal and pelvic surgery were discharged home earlier with provision of support from a specialist gynaecology nurse. The results of this study suggest that duration of hospital stay can be shortened by the introduction of a specialist nurse without introducing any adverse physical and psychological effects. This process of care is associated with receipt of information on health and lifestyle issues and maintenance of high levels of patient satisfaction, and demonstrates the effectiveness of the specialist nurse role in the provision of health information for women. Earlier hospital discharge at 48 h after major abdominal and pelvic surgery is an acceptable, cost-effective alternative to current routine practice in the absence of further randomised evidence. Background: Pharmacists may improve medication-related outcomes during transitions of care. The aim of the Iowa Continuity of Care Study was to determine if a pharmacist case manager (PCM) providing a faxed discharge medication care plan from a tertiary care institution to primary care could improve medication appropriateness and reduce adverse events, rehospitalization and emergency department visits. Methods: Design. Randomized, controlled trial of 945 participants assigned to enhanced, minimal and usual care groups, conducted 2007 to 2012. Subjects. Participants with cardiovascular-related conditions and/or asthma or chronic obstructive pulmonary disease were recruited from the University of Iowa Hospitals and Clinics following admission to general medicine, family medicine, cardiology or orthopedics. Intervention. The minimal group received admission history, medication reconciliation, patient education, discharge medication list and medication recommendations to the inpatient team. The enhanced group also received a faxed medication care plan to their community physician and pharmacy and a telephone call 3–5 days post-discharge.
Participants were followed for 90 days post-discharge. Main Outcomes and Measures. Medication appropriateness index (MAI), adverse events, adverse drug events and post-discharge healthcare utilization were compared by study group using linear and logistic regression, as models accommodating random effects due to pharmacists indicated little clustering. Results: Study groups were similar at baseline and the intervention fidelity was high. There were no statistically significant differences by study group in medication appropriateness, adverse events or adverse drug events at discharge, 30 days and 90 days post-discharge. The average MAI per medication was 0.53 at discharge and increased to 0.75 at 90 days, and this was true across all study groups. Post-discharge, about 16% of all participants experienced an adverse event, and this did not differ by study group (p > 0.05). Almost one-third of all participants had some type of healthcare utilization within 30 days post-discharge, and 15% of all participants had a 30-day readmission. Healthcare utilization post-discharge was not statistically significantly different at 30 or 90 days by study group. Conclusion: The pharmacist case manager did not affect medication use outcomes post-discharge, perhaps because quality of care measures were high in all study groups. Trial registration: Clinicaltrials.gov registration NCT00513903, August 7, 2007. Background: Home visits and telephone calls are two often-used approaches in transitional care, but their differential effects are unknown. Objective: To examine the overall effects of a transitional care programme for discharged medical patients and the differential effects of telephone calls only. Design: Randomised controlled trial. Setting: A regional hospital in Hong Kong.
Participants: Patients discharged from medical units fitting the inclusion criteria (n = 610) were randomly assigned to: control ('control', n = 210), home visits with calls ('home', n = 196) and calls only ('call', n = 204). Intervention: The home groups received alternating home visits and calls, and the call groups calls only, for 4 weeks. The control group received two placebo calls. The nurse case manager was supported by nursing students in delivering the interventions. Results: The home visit group (after 4 weeks 10.7%, after 12 weeks 21.4%) and the call group (11.8%, 20.6%) had lower readmission rates than the control group (17.6%, 25.7%). Significant differences were detected in intention-to-treat (ITT) analysis for the home group and the intervention group (home and call combined) at 4 weeks. In the per-protocol analysis (PPA) results, significant differences were found in all groups at 4 weeks. There was significant improvement in quality of life, self-efficacy and satisfaction in both ITT and PPA for the study groups. Conclusions: This study has found that bundled interventions involving both home visits and calls are more effective in reducing readmissions. Many transitional care programmes use all-qualified nurses, and this study reveals that a mixed-skills model seems to bring about positive effects as well. OBJECTIVE: To assess the efficacy of a home care program designed to improve access to medical care for older adults with multiple chronic conditions who are at risk for hospitalization. STUDY DESIGN: Randomized controlled trial in which participants were assigned to the home care intervention (Choices for Healthy Aging [CHA]) program or usual care. METHODS: The intervention group consisted of 298 older adults at risk of hospitalization as determined by a risk stratification tool. Measures included satisfaction with medical care, medical service use, and costs of medical care.
RESULTS: The intervention group reported significantly greater satisfaction with care than usual care recipients (t test = 2.476; P = .014). CHA patients were less likely than usual care patients to be admitted to the hospital (25.6% and 37.1%, respectively; P = .02). There were no differences in costs of care between the home care and usual care groups. CONCLUSIONS: Provision of home care to older adults at high risk of hospitalization may improve satisfaction with care while reducing hospitalizations. The lack of difference in medical costs suggests that managed care organizations need to consider targeting rather than using risk stratification measures when designing programs for high-risk groups. Background: Intermediate care is intended to reduce hospital admissions and facilitate early discharge. In Norway, a model was developed with transfer to intermediate care shortly after hospital admission. The efficacy and safety of this model have not been studied previously. In a parallel-group randomized controlled trial, patients over 70 years living at home before admission were eligible if clinically stable, without need for surgical treatment and deemed suited for intermediate care by the attending physician. Intervention group patients were transferred to a nursing home unit with increased staffing and multidisciplinary assessment, for a maximum stay of three weeks. Patients in the control group received usual care in hospital. Blinding to group assignment was not possible. The primary outcome was the number of days living at home in a follow-up period of 365 days. Secondary outcomes were mortality, hospital admissions, need for residential care and home care services. Data were obtained from patient records and registers. Results: 376 patients were included, 74% female, mean age 84 years.
There were no significant differences between the intervention (n = 190) and control group (n = 186) in number of days living at home (253.7 vs 256.5, p = 0.80) or days in hospital (10.4 vs 10.5, p = 0.748). Intervention group patients spent less time in nursing homes (40.6 days vs. 55.0, p = 0.046), and more patients lived independently without home health care services (31.6% vs 19.9%, p = 0.007). For orthopaedic patients (n = 128), mortality was higher in the intervention group; 15 intervention patients and 7 controls died (25.1% vs 10.3%, p = 0.049). There was no significant difference in one-year mortality for medical patients (n = 150) or the total study population. Conclusions: This model of rapid transfer to intermediate care did not significantly influence the number of days living at home during one year of follow-up, but reduced demand for nursing home care and the need for home health care services. In post-hoc analysis, mortality was increased for orthopaedic patients. Trial registration: The trial was registered 26 July 2013 at Current Controlled Trials and assigned registration number ISRCTN21608185. Background: The RATPAC trial showed that using a point-of-care panel of CK-MB(mass), myoglobin and troponin at baseline and 90 min increased the proportion of patients successfully discharged home, leading to reduced median length of initial hospital stay. However, it did not change mean hospital stay and may have increased mean costs per patient. The aim of this study was to explore variation in outcome and costs between participating hospitals. Methods: RATPAC was a pragmatic multicentre randomised controlled trial (N=2243) and economic analysis comparing diagnostic assessment using the panel with standard care for patients with acute chest pain due to suspected myocardial infarction at six hospitals.
The difference in the proportion of patients successfully discharged (primary outcome) and mean costs per patient between the participating hospitals was compared. Results: Point-of-care assessment led to a higher proportion of successful discharges in four hospitals, a lower proportion in one, and was equivocal in another. The OR (95% CI) for the primary outcome varied from 0.12 (0.01 to 1.03) to 11.07 (6.23 to 19.66), with significant heterogeneity between the centres (p<0.001). The mean cost per patient for the intervention group ranged from being £214.49 less than the control group (−132.56 to 657.10) to £646.57 more expensive (73.12 to 1612.71), with weak evidence of heterogeneity between the centres (p=0.0803). Conclusion: The effect of point-of-care panel assessment on successful discharge and costs per patient varied markedly between hospitals and may depend on local protocols, staff practices and available facilities. Background: This study evaluated the effectiveness of using trained volunteer staff in reducing 30-day readmissions of congestive heart failure (CHF) patients. Methods: From June 2010 to December 2010, 137 patients (mean age 73 years) hospitalized for CHF were randomly assigned to either an interventional arm (arm A) receiving dietary and pharmacologic education by a trained volunteer, follow-up telephone calls within 48 hours, and a month of weekly calls; or a control arm (arm B) receiving standard care. Primary outcomes were 30-day readmission rates for CHF and worsening New York Heart Association (NYHA) functional classification; composite and all-cause mortality were secondary outcomes. Results: Arm A patients had decreased 30-day readmissions (7% vs 19%; P < .05), with a relative risk reduction (RRR) of 63% and an absolute risk reduction (ARR) of 12%.
The composite outcome of 30-day readmission, worsening NYHA functional class, and death was decreased in arm A (24% vs 49%; P < .05; RRR 51%, ARR 25%). Standard-care treatment and hypertension, age ≥ 65 years and hypertension, and cigarette smoking were predictors of increased risk for readmissions, worsening NYHA functional class, and all-cause mortality, respectively, in the multivariable analysis. Conclusions: Utilizing trained volunteer staff to improve patient education and engagement might be an efficient and low-cost intervention to reduce CHF readmissions. OBJECTIVE: The objective of the study was to evaluate the effectiveness of a single home-based educational intervention for patients admitted with heart failure. METHODS: There were 106 patients: 42 in the intervention group and 64 in the control group. Patients were randomly assigned to receive an intervention by nursing staff 1 week after discharge. Primary end points were readmissions, emergency department visits, deaths, costs, and quality of life. RESULTS: During the 24-month follow-up, there were fewer mean emergency department visits in the intervention group than in the control group (.68 vs 2.00; P = .000), fewer unplanned readmissions (.68 vs 1.71; P = .000), and lower costs (€671.56 = $974.63 = GBP598.42 per person vs €2,154.24 = $3,126.01 = GBP1,919.64; P = .001). There was a trend toward fewer out-of-hospital deaths (14 [46.6%] vs 31 [55.3%]; P = .45) and improvement in quality of life. CONCLUSION: Patients with heart failure who receive a home-based educational intervention experience fewer emergency department visits and unplanned readmissions, with lower healthcare costs. Objective: To assess the impact of telecare on the use of social and health care. Part of the evaluation of the Whole Systems Demonstrator trial.
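As a quick sanity check on the effect sizes reported in the volunteer-staff CHF trial above (this check is an editorial addition, not part of the original abstract), the absolute and relative risk reductions can be recomputed from the raw event rates; the helper function name is illustrative only:

```python
def risk_reductions(control_rate, intervention_rate):
    """Return (absolute, relative) risk reduction from two event rates.

    ARR = control rate minus intervention rate;
    RRR = ARR expressed as a fraction of the control rate.
    """
    arr = control_rate - intervention_rate
    rrr = arr / control_rate
    return arr, rrr

# 30-day CHF readmissions: 7% in arm A vs 19% in arm B
arr, rrr = risk_reductions(0.19, 0.07)
print(round(arr, 2), round(rrr, 2))  # 0.12 0.63 -> ARR 12%, RRR 63%

# Composite outcome: 24% in arm A vs 49% in arm B
arr, rrr = risk_reductions(0.49, 0.24)
print(round(arr, 2), round(rrr, 2))  # 0.25 0.51 -> ARR 25%, RRR 51%
```

Both recomputed pairs match the ARR/RRR figures quoted in the abstract.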
Participants and setting: A total of 2,600 people with social care needs were recruited from 217 general practices in three areas in England. Design: A cluster randomised trial comparing telecare with usual care, with general practice as the unit of randomisation. Participants were followed up for 12 months and analyses were conducted as intention-to-treat. Data sources: Trial data were linked at the person level to administrative datasets on care funded at least in part by local authorities or the National Health Service. Main outcome measures: The proportion of people admitted to hospital within 12 months. Secondary endpoints included mortality, rates of secondary care use (seven different metrics), contacts with general practitioners and practice nurses, proportion of people admitted to permanent residential or nursing care, weeks in domiciliary social care and notional costs. Results: 46.8% of intervention participants were admitted to hospital, compared with 49.2% of controls. Unadjusted differences were not statistically significant (odds ratio: 0.90, 95% CI: 0.75–1.07, P = 0.211). They reached statistical significance after adjusting for baseline covariates, but this was not replicated when adjusting for the predictive risk score. Secondary metrics, including impacts on social care use, were not statistically significant. Conclusions: Telecare as implemented in the Whole Systems Demonstrator trial did not lead to significant reductions in service use, at least in terms of results assessed over 12 months. International Standard Randomised Controlled Trial Number Register ISRCTN43002091. Objective: To assess the effect of home versus day rehabilitation on patient outcomes. Design: Randomised controlled trial. Setting: Post-hospital rehabilitation. Participants: Two hundred and twenty-nine hospitalised patients referred for ambulatory rehabilitation.
Interventions: Hospital-based day rehabilitation programme versus home-based rehabilitation programme. Main Outcome Measures: At 3 months, information was collected on hospital readmission, transfer to residential care, functional level, quality of life, carer stress and carer quality of life. At 6 months, place of residence, hospital re-admissions and mortality status were collected. Results: There were significant improvements in the functional outcomes from baseline to 3 months for all participants. At discharge, carers of patients in the day hospital reported higher Caregiver Strain Index (CSI) scores in comparison to home rehabilitation carers (4.95 versus 3.56, P = 0.047). Patients in the day hospital had double the risk of readmission compared to those in home rehabilitation (RR = 2.1; 95% CI 1.2–3.9). This effect persisted at 6 months. Conclusions: Day hospital patients are more likely to be readmitted to hospital, possibly due to increased access to admitting medical staff. This small trial favours the home as a better site for post-hospital rehabilitation. Background: Transition from hospital to home is a critical period for older persons with acute myocardial infarction (AMI). Home-based secondary prevention programs led by nurses have been proposed to facilitate patients' adjustment to AMI after discharge. The objective of this study was to evaluate the effects of nurse-based case management for elderly patients discharged after an AMI from a tertiary care hospital. Methods: In a single-centre randomized two-armed parallel group trial of patients aged 65 years and older hospitalized with an AMI between September 2008 and May 2010 in the Hospital of Augsburg, Germany, patients were randomly assigned to a case management group or a control group receiving usual care. The case-management intervention consisted of a nurse-based follow-up for one year, including home visits and telephone calls.
Key elements of the intervention were to detect problems or risks and to give advice regarding a wide range of aspects of disease management (e.g. nutrition, medication). The primary study endpoint was time to first unplanned readmission or death. Block randomization was performed per telephone call to a biostatistical center, where the randomization list was kept. Persons who assessed one-year outcomes and validated readmission data were blinded. Statistical analysis was based on the intention-to-treat approach and included Cox proportional hazards models. Results: Three hundred forty patients were allocated to receive case management (n=168) or usual care (n=172). The analysis is based on 329 patients (intervention group: n=161; control group: n=168). Of these, 62% were men, mean age was 75.4 years, and 47.1% had at least either diabetes or chronic heart failure as a major comorbidity. The mean follow-up time was 273.6 days for the intervention group and 320.6 days for the control group. During one year, there were 57 first unplanned readmissions and 5 deaths in the intervention group, while the control group had 75 first unplanned readmissions and 3 deaths. With respect to the endpoint, there was no significant effect of the case management program after one year (hazard ratio 1.01, 95% confidence interval 0.72–1.41). This was also the case among subgroups according to sex, diabetes, living alone, and comorbidities. Conclusions: Nurse-based management among elderly patients with AMI had no significant influence on the rate of first unplanned readmissions or death during a one-year follow-up. A possible long-term influence should be investigated by further studies. Clinical trial registration: ISRCTN. Background: Community services are playing an increasing role in supporting older adults who are discharged from hospital with ongoing non-acute care needs.
However, there is a paucity of information regarding how community services are involved in the discharge process of older individuals from hospital into the community. Methods: Twenty-nine databases were searched from 1980 to 2012 (inclusive) for relevant primary published research, of any study design, as well as relevant unpublished work (e.g. clinical guidelines), which investigated community services' involvement in the discharge of older individuals from hospital into the community. Data analysis and quality appraisal (using McMaster critical appraisal tools) were undertaken predominantly by the lead author. Data were synthesised qualitatively. Results: Twelve papers were eligible for inclusion (five randomised controlled trials, four before-and-after studies and three controlled trials), involving a total of 8440 older adults (>65 years). These papers reported on a range of interventions. During data synthesis, descriptors were assigned to four emergent discharge methods: Virtual Interface Model, In-reach Interface Model, Out-reach Interface Model and Independent Interface Model. In each model, the findings were mixed in terms of health care and patient and carer outcomes. Conclusions: It is plausible that each model identified in this systematic review has a role to play in successfully discharging different cohorts of older adults from hospital. Further research is required to identify appropriate population groups for various discharge models and to select suitable outcome measures to determine the effectiveness of these models, considering all stakeholders involved. BACKGROUND/AIM: Poor patient understanding of their diagnosis and treatment plan can adversely impact clinical outcome following hospital discharge. Discharge summaries are primarily written for the doctor rather than the patient.
We determined patient understanding of the reasons for hospitalisation, in-hospital tests, treatments and post-discharge recommendations, and whether a brief patient-directed discharge letter (PADDLE) delivered during a brief discussion prior to discharge would improve understanding. METHODS: A prospective randomised controlled trial was conducted, including 67 hospitalised patients. After a baseline questionnaire, patients were randomised to receive the PADDLE letter or usual care. Those receiving the letter had an immediate follow-up questionnaire. Patient understanding was compared with a summary letter written by the treating clinician, using a 5-point Likert scale ranging from none to full understanding. A questionnaire was administered at 3 and 6 months. RESULTS: At baseline, patients had almost full understanding (median score 4) of reasons for hospitalisation and treatments. However, despite high self-appraisal, patients objectively had very little understanding of tests performed and post-discharge recommendations (median 2). Those receiving the letter had an immediate increase to almost full understanding (median 4) of tests performed (P < 0.001) and to full understanding (median 5) of post-discharge recommendations. This increase did not persist at 3 or 6 months. CONCLUSIONS: A simple patient-directed letter delivered during a brief discussion improves patient understanding of their hospitalisation and post-discharge recommendations, which is otherwise limited.
Further evaluation of this brief and well-received intervention is indicated, with the goal of improving patient understanding, satisfaction and clinical outcomes. OBJECTIVES: To evaluate an integrated telehealth intervention (Integrated Telehealth Education and Activation of Mood (I-TEAM)) to improve chronic illness (congestive heart failure, chronic obstructive pulmonary disease) and comorbid depression in the home healthcare setting. DESIGN: Randomized controlled trial. SETTING: Hospital-affiliated home healthcare setting. PARTICIPANTS: Medically frail older homebound individuals (N = 102). INTERVENTION: The 3-month intervention consisted of integrated telehealth chronic illness and depression care, with a telehealth nurse conducting daily telemonitoring of symptoms, body weight, and medication use; providing eight weekly sessions of problem-solving treatment for depression; and providing for communication with participants' primary care physicians, who also prescribed antidepressants. Control participants were allocated to usual care with in-home nursing plus psychoeducation (UC+P). MEASUREMENTS: The two groups were compared at baseline and 3 and 6 months after baseline on clinical measures (depression, health, problem-solving) and 12 months after baseline on health utilization (readmission, episodes of care, and emergency department (ED) visits). RESULTS: Depression scores were 50% lower in the I-TEAM group than in the UC+P group at 3 and 6 months. Those who received the I-TEAM intervention significantly improved their problem-solving skills and self-efficacy in managing their medical condition. The I-TEAM group had significantly fewer ED visits (P = .01) but did not have significantly fewer days in the hospital at 12 months after baseline.
CONCLUSION: Integrated telehealth care for older adults with chronic illness and comorbid depression can reduce symptoms and postdischarge ED use in the home health setting. BACKGROUND: A 23-h unit was established in June 2005 to relieve pressure on surgical beds. Patients were to be discharged by 0900 h without review by a doctor. However, discharge without review remained the exception rather than the rule. OBJECTIVE: The aim of the current trial was to assess the effect of a protocol-driven, nurse-initiated discharge process on discharge time, patient satisfaction and adverse events. DESIGN: Randomised controlled trial. SETTING: A large, major metropolitan hospital in Queensland, Australia. PARTICIPANTS: Patients undergoing a surgical procedure and requiring an overnight stay in the 23-h unit were eligible for inclusion; 182 were randomised and 131 patients completed the study. METHODS: Participants were randomly assigned to one of two groups: protocol-driven, nurse-initiated discharge or usual care. The primary end-point was the proportion of patients discharged by 0900 h. Patients completed a self-report questionnaire two weeks after hospital discharge to evaluate their satisfaction. RESULTS: Of the 131 patients completing the trial, only 82 (62.6%) were discharged by 0900 h. In the protocol group, 45 (78.9%) patients were discharged on time compared with 37 (50.0%) in the usual care group. This difference was statistically significant (OR 3.75; 95% CI 1.74–8.21; p=0.001). The average length of stay in the 23-h unit was 16.5 (SD 6.8) h. This did not differ by group (MD 0.29; 95% CI −2.13 to 2.71; p=0.81). The overall mean satisfaction score was 95.4 (SD 8.8) and results were similar between groups (protocol group 96.2 versus usual care group 94.6; p=0.40).
CONCLUSIONS A protocol-driven, nurse-initiated discharge process in an overnight post-surgery unit results in a higher proportion of patients being discharged by 0900h without compromising patient satisfaction.

RATIONALE The effect of disease management for chronic obstructive pulmonary disease (COPD) is not well established. OBJECTIVES To determine whether a simplified disease management program reduces hospital admissions and emergency department (ED) visits due to COPD. METHODS We performed a randomized, adjudicator-blinded, controlled, 1-year trial at five Veterans Affairs medical centers of 743 patients with severe COPD and one or more of the following during the previous year: hospital admission or ED visit for COPD, chronic home oxygen use, or course of systemic corticosteroids for COPD. Control group patients received usual care. Intervention group patients received a single 1- to 1.5-hour education session, an action plan for self-treatment of exacerbations, and monthly follow-up calls from a case manager. MEASUREMENTS AND MAIN RESULTS We determined the combined number of COPD-related hospitalizations and ED visits per patient. Secondary outcomes included hospitalizations and ED visits for all causes, respiratory medication use, mortality, and change in Saint George's Respiratory Questionnaire. After 1 year, the mean cumulative frequency of COPD-related hospitalizations and ED visits was 0.82 per patient in usual care and 0.48 per patient in disease management (difference, 0.34; 95% confidence interval, 0.15-0.52; P < 0.001). Disease management reduced hospitalizations for cardiac or pulmonary conditions other than COPD by 49%, hospitalizations for all causes by 28%, and ED visits for all causes by 27% (P < 0.05 for all). CONCLUSIONS A relatively simple disease management program reduced hospitalizations and ED visits for COPD. Clinical trial registered with www.
clinicaltrials.gov (NCT00126776).

Accurate handover of clinical information is imperative to ensure continuity of patient care, patient safety and reduction in clinical errors. Verbal and paper-based handovers are common practice in many institutions, but the potential for clinical errors and inefficiency is significant. We have recently introduced an electronic templated signout to improve clarity of transfer of patient details post-surgical take. The aim of this study was to prospectively audit the introduction of this new electronic handover in our hospital, with particular emphasis on efficacy and efficiency. The primary surrogate chosen to assess efficacy and efficiency was length of stay for those patients admitted through the emergency department. To do this we compared two separate, two-week periods before and after the introduction of this new electronic signout format. Users were not informed of the study. Information recorded on the signout included details of the emergency admissions, consults received on call and any issues with regard to inpatients. ASA grade, time to first intervention and admission diagnosis were also recorded. Our results show that introduction of this electronic signout significantly reduced median length of stay from five to four days (P=0.047). No significant difference in ASA grades, time to first intervention or overall admission diagnosis was obtained between the two time periods. In conclusion, this is the first study to show that the introduction of electronic signout post-call was associated with a significant reduction in patient length of stay and provided better continuity of care than the previously used paper-based handover.

Objective To evaluate the effect of specialist geriatric medical management on the outcomes of at-risk older people discharged from acute medical assessment units. Design Individual patient randomised controlled trial comparing intervention with usual care.
Setting Two hospitals in Nottingham and Leicester, UK. Participants 433 patients aged 70 or over who were discharged within 72 hours of attending an acute medical assessment unit and at risk of decline as indicated by a score of at least 2 on the Identification of Seniors At Risk tool. Intervention Assessment made on the acute medical assessment unit and further outpatient management by specialist physicians in geriatric medicine, including advice and support to primary care services. Main outcome measures The primary outcome was the number of days spent at home (for those admitted from home) or days spent in the same care home (if admitted from a care home) in the 90 days after randomisation. Secondary outcomes were determined at 90 days and included mortality, institutionalisation, dependency, mental wellbeing, quality of life, and health and social care resource use. Results The two groups were well matched for baseline characteristics, and withdrawal rates were similar in both groups (5%). Mean days at home over 90 days' follow-up were 80.2 days in the control group and 79.7 in the intervention group. The 95% confidence interval for the difference in means was -4.6 to 3.6 days (P=0.31). No significant differences were found for any of the secondary outcomes. Conclusions This specialist geriatric medical intervention applied to an at-risk population of older people attending and being discharged from acute medical units had no effect on patients' outcomes or subsequent use of secondary care or long term care.

BACKGROUND Telemonitoring has been advocated as a way of decreasing costs and improving outcomes, but no study has looked at true Medicare payments and 30-day readmission rates in a randomized group of well treated patients.
OBJECTIVE The aim of this work was to analyze Medicare claims data to identify effects of home telemonitoring on medical costs, 30-day rehospitalization, mortality, and health-related quality of life. METHODS A total of 204 subjects were randomized to usual-care and monitored groups and evaluated with the SF-36 and Minnesota Living With Heart Failure Questionnaire (MLHF). Hospitalizations, Medicare payments, and mortality were also assessed. Monitored subjects transmitted weight, blood pressure, and heart rate, which were monitored by an experienced heart failure nurse practitioner. RESULTS Subjects were followed for 802 ± 430 days; 75 subjects in the usual-care group (316 hospitalizations) and 81 in the monitored group (327 hospitalizations) were hospitalized at least once (P = .51). There were no differences in Medicare payments for inpatient or emergency department visits, and length of stay was not different between groups. There was no difference in 30-day readmissions (P = .627) or mortality (P = .575). Scores for SF-36 and MLHF improved (P < .001) over time, but there were no differences between groups. The percentage of patients readmitted within 30 days was lower with telemonitoring for the 1st year, but this did not persist. CONCLUSIONS Telemonitoring did not result in lower total costs, decreased hospitalizations, improved symptoms, or improved mortality. A decrease in 30-day readmission rates for the 1st year did not result in decreased total cost or better outcomes.

AIM This paper reports a study comparing the outcomes of diabetic patients undergoing either early discharge or routine care. BACKGROUND The hospital is not the best place to monitor the glycaemic control of patients with diabetes with no other morbidity or complications. It is an unnatural environment in which diet is planned and the activity level is low. The hospital is also an expensive place in which to treat patients.
METHODS This randomized controlled trial was conducted in the medical department of a regional hospital in Hong Kong. A total of 101 patients who needed glycaemic monitoring, but who were otherwise fit for discharge, were recruited. The control group continued to receive routine hospital care. The study group was discharged early and received a follow-up programme which included a weekly or biweekly telephone call from a nurse. FINDINGS When compared with the control group, the study group had a greater decrease in HbA1c at 24 weeks, although the statistical difference was marginal (7.6 vs. 8.1, P = 0.06), a higher blood monitoring adherence score at both 12 weeks (5.4 vs. 3.6, P < 0.001) and 24 weeks (5.3 vs. 3.5, P < 0.001), and a higher exercise adherence score at 12 weeks (5.3 vs. 3.4, P = 0.001) and 24 weeks (5.5 vs. 3.2, P < 0.001). The study group had a shorter hospital stay (2.2 vs. 5.9, P < 0.001), and the net savings were HK$11,888 per patient. CONCLUSION It is feasible to integrate treatment into the real-life environments of patients with diabetes, and nurse-led transitional care is a practical and cost-effective model. Nurse follow-up is effective in maintaining optimal glycaemic control and enhancing adherence to health behaviours. Management of glycaemic control is better done in the community than in the hospital.

BACKGROUND Hospital readmissions for acute exacerbations of COPD (AECOPDs) pose burdens to the health-care system and patients. A current gap in knowledge is whether a predischarge screening and educational tool administered to patients with COPD reduces readmissions and ED visits. METHODS A single-center, randomized trial of admitted patients with AECOPDs was conducted at Henry Ford Hospital between February 2010 and April 2013.
One hundred seventy-two patients were randomized to either the control (standard care) or the bundle group, in which patients received smoking cessation counseling, screening for gastroesophageal reflux disease and depression or anxiety, standardized inhaler education, and a 48-h postdischarge telephone call. The primary end point was the difference in the composite risk of hospitalizations or ED visits for AECOPD between the two groups in the 30 days following discharge. A secondary end point was 90-day readmission rate. RESULTS Of the 172 patients, 18 of 79 in the control group (22.78%) and 18 of 93 in the bundle group (19.35%) were readmitted within 30 days. The risk of ED visits or hospitalizations within 30 days was not different between the groups (risk difference, -3.43%; 95% CI, -15.68% to 8.82%; P = .58). Overall, the time to readmission in 30 and 90 days was similar between groups (log-rank test P = .71 and .88, respectively). CONCLUSIONS A predischarge bundle intervention in AECOPD is not sufficient to reduce the 30-day risk of hospitalizations or ED visits. More resources may be needed to generate a measurable effect on readmission rates. TRIAL REGISTRY ClinicalTrials.gov; No.: NCT02135744; URL: www.clinicaltrials.gov

Objective To investigate whether an early rehabilitation intervention initiated during acute admission for exacerbations of chronic respiratory disease reduces the risk of readmission over 12 months and ameliorates the negative effects of the episode on physical performance and health status. Design Prospective, randomised controlled trial. Setting An acute cardiorespiratory unit in a teaching hospital and an acute medical unit in an affiliated teaching district general hospital, United Kingdom.
Participants 389 patients aged between 45 and 93 who within 48 hours of admission to hospital with an exacerbation of chronic respiratory disease were randomised to an early rehabilitation intervention (n=196) or to usual care (n=193). Main outcome measures The primary outcome was readmission rate at 12 months. Secondary outcomes included number of hospital days, mortality, physical performance, and health status. The primary analysis was by intention to treat, with prespecified per protocol analysis as a secondary outcome. Interventions Participants in the early rehabilitation group received a six week intervention, started within 48 hours of admission. The intervention comprised prescribed, progressive aerobic, resistance, and neuromuscular electrical stimulation training. Patients also received a self management and education package. Results Of the 389 participants, 320 (82%) had a primary diagnosis of chronic obstructive pulmonary disease. 233 (60%) were readmitted at least once in the following year (62% in the intervention group and 58% in the control group). No significant difference between groups was found (hazard ratio 1.1, 95% confidence interval 0.86 to 1.43, P=0.4). An increase in mortality was seen in the intervention group at one year (odds ratio 1.74, 95% confidence interval 1.05 to 2.88, P=0.03). Significant recovery in physical performance and health status was seen after discharge in both groups, with no significant difference between groups at one year. Conclusion Early rehabilitation during hospital admission for chronic respiratory disease did not reduce the risk of subsequent readmission or enhance recovery of physical function following the event over 12 months. Mortality at 12 months was higher in the intervention group.
The results suggest that beyond current standard physiotherapy practice, progressive exercise rehabilitation should not be started during the early stages of the acute illness. Trial registration Current Controlled Trials ISRCTN05557928.

BACKGROUND Home visiting programs have been developed to improve the functional abilities of older people and subsequently to reduce the use of institutional care services. The results of trials have been inconsistent and their cost-effectiveness uncertain. Home visits for a high-risk population rather than the general population seem a promising approach. We therefore studied the effects of a home visiting program for older people with poor health. This article describes the effects on health care use and associated cost. METHODS We conducted a randomized clinical trial among 330 community-dwelling citizens, aged 70-84 years, in the Netherlands. Participants in the intervention group (n = 160) received eight home visits by a trained home nurse over an 18-month period; a multidimensional geriatric assessment of problems was included. The main outcomes are: admissions to hospital, nursing home, and home for older persons; contacts with medical specialists, general practitioners, and paramedics; and hours of home care help. The data on health care use were mostly obtained from computerized databases of various medical administration offices; the follow-up period was 24 months. RESULTS Inpatient and outpatient health care use was similar for both groups, with the exception of a higher distribution of aids and in-home modifications in favor of the intervention group. No differences were found between the intervention and control group in health care cost. CONCLUSION The home visiting program did not appear to have any effect on the health care use of older people with poor health and had a low chance of being cost-effective.
We conclude that these visits are probably not beneficial for such persons within the health care setting in the Netherlands or comparable settings in other Western countries.

OBJECTIVES To assess the effect of an intervention on drug-related problem (DRP; adverse drug reactions, adherence problems, underuse)-related readmission rates in older adults. DESIGN Ancillary study from a 6-month, prospective, randomized, parallel-group, open-label trial. SETTING Six acute geriatric units in Paris and suburbs. PARTICIPANTS Six hundred sixty-five consecutively admitted individuals were included: 317 in the intervention group (IG) and 348 in the control group (CG) (aged 86.1 ± 6.2, 66% female). INTERVENTION Discharge-planning intervention combining chronic drug review, education, and enhanced transition-of-care communication. MEASUREMENTS Chronic drugs at discharge of the two groups were compared. An expert committee blinded to group assignment adjudicated whether 6-month readmission to the study hospitals was related to drugs. RESULTS Six hundred thirty-nine individuals were discharged and followed up (300 IG, 339 CG). The intervention had no significant effect on drug regimen at discharge, characterized by prescriptions that are mostly appropriate but increase iatrogenic risk. Three hundred eleven readmissions occurred during follow-up (180 CG, 131 IG), of which 185 (59.5%) were adjudicated (102 CG, 83 IG). For 16, DRP imputability was doubtful. Of the remaining 169, DRPs were the most frequent cause for readmission, with 38 (40.4%) readmissions in the CG and 26 (34.7%) in the IG (relative risk reduction = 14.3%, 95% confidence interval = 14.0-14.5%, P = .54). The intervention was associated with 39.7% fewer readmissions related to adverse drug reactions (P = .12) despite the study's lack of power.
CONCLUSION Drug-related problem prevention in older people discharged from the hospital should be a priority, with a focus on improving the monitoring of drugs with high iatrogenic risk.

OBJECTIVES The aim of this study was to determine the effect of stress cardiac magnetic resonance (CMR) imaging in an observation unit (OU) on revascularization, hospital readmission, and recurrent cardiac testing in intermediate-risk patients with possible acute coronary syndromes (ACS). BACKGROUND Intermediate-risk patients commonly undergo hospital admission with high rates of coronary revascularization. It is unknown whether OU-based care with CMR is a more efficient alternative. METHODS A total of 105 intermediate-risk participants with symptoms of ACS but without definite ACS on the basis of the first electrocardiogram and troponin were randomized to usual care provided by cardiologists and internists (n = 53) or to OU care with stress CMR (n = 52). The primary composite endpoint of coronary artery revascularization, hospital readmission, and recurrent cardiac testing at 90 days was determined. The secondary endpoint was length of stay from randomization to index visit discharge; safety was measured as ACS after discharge. RESULTS The median age of participants was 56 years (range 35 to 91 years), 54% were men, and 20% had pre-existing coronary disease. Index hospital admission was avoided in 85% of the OU CMR participants. The primary outcome occurred in 20 usual care participants (38%) versus 7 OU CMR participants (13%) (hazard ratio: 3.4; 95% confidence interval: 1.4 to 8.0, p = 0.006). The OU CMR group experienced significant reductions in all components: revascularizations (15% vs. 2%, p = 0.03), hospital readmissions (23% vs. 8%, p = 0.03), and recurrent cardiac testing (17% vs. 4%, p = 0.03).
Median length of stay was 26 h (interquartile range: 23 to 45 h) in the usual care group and 21 h (interquartile range: 15 to 25 h) in the OU CMR group (p < 0.001). ACS after discharge occurred in 3 usual care participants (6%) and no OU CMR participants. CONCLUSIONS In this single-center trial, management of intermediate-risk patients with possible ACS in an OU with stress CMR reduced coronary artery revascularization, hospital readmissions, and recurrent cardiac testing, without an increase in post-discharge ACS at 90 days. (Randomized Investigation of Chest Pain Diagnostic Strategies; NCT01035047)

BACKGROUND Intraoperative fluid therapy regimens using oesophageal Doppler monitoring (ODM) to optimize stroke volume (SV) (goal-directed fluid therapy, GDT) have been associated with a reduction in length of stay (LOS) and complication rates after major surgery. We hypothesized that intraoperative GDT would reduce the time to surgical readiness for discharge (RfD) of patients having major elective colorectal surgery but that this effect might be less marked in aerobically fit patients. METHODS In this double-blinded controlled trial, 179 patients undergoing major open or laparoscopic colorectal surgery were characterized as aerobically 'fit' (n=123) or 'unfit' (n=56) on the basis of their performance during a cardiopulmonary exercise test. Within these fitness strata, patients were randomized to receive a standard fluid regimen with or without ODM-guided intraoperative GDT. RESULTS GDT patients received an average of 1360 ml of additional intraoperative colloid. The mean cardiac index and SV at skin closure were significantly higher in the GDT group than in controls. Times to RfD and LOS were longer in GDT than control patients but did not reach statistical significance (median 6.8 vs 4.9 days, P=0.09, and median 8.8 vs 6.7 days, P=0.09, respectively).
Fit GDT patients had an increased RfD (median 7.0 vs 4.7 days; P=0.01) and LOS (median 8.8 vs 6.0 days; P=0.01) compared with controls. CONCLUSIONS Intraoperative SV optimization conferred no additional benefit over standard fluid therapy. In an aerobically fit subgroup of patients, GDT was associated with detrimental effects on the primary outcome. TRIAL REGISTRY UK NIHR CRN 7285, ISRCTN 14680495. http://public.ukcrn.org.uk/Search/StudyDetail.aspx?StudyID=7285

OBJECTIVE To examine community nursing services for patients with cardiovascular diseases, chronic respiratory diseases and other general medical conditions, making the transition from hospital to home. DESIGN The original study design was a randomised controlled trial. This study is a secondary analysis of the hospital records documented by community nurses for the study-group patients. SAMPLE The sample consisted of 46 subjects, randomly drawn from the main study group of the study. MEASUREMENTS The community nursing records were analysed using the Omaha System. Self-reported health status and readmission data were retrieved from the database of the original study. RESULTS The three groups of patients experienced problems across the four domains in the Omaha System. Community nursing interventions did not differ greatly by disease groups. The primary purpose of home visits was observation, followed by treatment and procedures and health teaching. The community nurses in the study spent more effort providing health teaching to the respiratory group than to their counterparts. The outcome measures are self-reported health status and hospital readmission rates. For self-reported health status, significant differences were observed in the respiratory and cardiovascular group before and after community nursing services. For hospital readmission rate, no significant difference was found.
CONCLUSIONS To improve the well-being of chronically ill patients, a comprehensive home intervention programme, emphasising continuous needs of monitoring and case management, is fundamental to producing desired, measurable effects. RELEVANCE TO CLINICAL PRACTICE This paper adds to the understanding of home-care services provided by community nurses to chronically ill patients. The scope of nursing services emphasises the significance of a positive, patient-centred, caring and appropriate client-practitioner relationship to improve the self-reported health of patients.

OBJECTIVE To assess the effect of a complex, multidisciplinary intervention aimed at reducing avoidable acute hospitalization of residents of residential aged care (RAC) facilities. DESIGN Cluster randomized controlled trial. SETTING RAC facilities with higher than expected hospitalizations in Auckland, New Zealand, were recruited and randomized to intervention or control. PARTICIPANTS A total of 1998 residents of 18 intervention facilities and 18 control facilities. INTERVENTION A facility-based complex intervention of 9 months' duration. The intervention comprised gerontology nurse specialist (GNS)-led staff education, facility benchmarking, GNS resident review, and multidisciplinary (geriatrician, primary-care physician, pharmacist, GNS, and facility nurse) discussion of residents selected using standard criteria. MAIN OUTCOME MEASURES The primary end point was avoidable hospitalizations. Secondary end points were all acute admissions, mortality, and acute bed-days. Follow-up was for a total of 14 months. RESULTS The intervention did not affect the main study end points: number of acute avoidable hospital admissions (RR 1.07; 95% CI 0.85-1.36; P = .59) or mortality (RR 1.11; 95% CI 0.76-1.61; P = .62).
CONCLUSIONS This multidisciplinary intervention, packaging selected case review and staff education, had no overall impact on acute hospital admissions or mortality. This may have considerable implications for resourcing in the acute and RAC sectors in the face of population aging. Australian and New Zealand Clinical Trials Registry (ACTRN12611000187943)

OBJECTIVES To test whether coordination of discharge from hospital reduces hospitalizations in patients with chronic obstructive pulmonary disease (COPD). DESIGN Randomized controlled clinical trial. SETTING Specialized pulmonary hospital. PARTICIPANTS Patients hospitalized for an acute exacerbation of COPD. INTERVENTION Care as usual included routine patient education, supervised inhaler use, respiratory physiotherapy, and disease-related communication. The discharge coordinator intervention added assessment of the patient's situation and homecare needs. Patients and caregivers were actively involved and empowered in the discharge planning process, which was communicated with community medical professionals to provide continuity of care at home. MEASUREMENTS The primary end-point of the study was the number of patients hospitalized because of worsening COPD. Key secondary end-points were time-to-COPD-hospitalization, all-cause mortality, all-cause hospitalization, days alive and out of hospital, and health-related quality of life. RESULTS Of 253 eligible patients (71 ± 9 years, 72% men, 87% GOLD III/IV), 118 were assigned to intervention and 135 to usual care. During a follow-up of 180 days, fewer patients receiving the intervention were hospitalized for COPD (14% versus 31%, P = .002) or for any cause (31% versus 44%, P = .033). In time-to-event analysis, intervention was associated with lower rates of COPD hospitalizations (P = .001).
A Cox model of proportional hazards, adjusted for sex, age, GOLD stage, heart failure, malignant disease, and long-term oxygen treatment, demonstrated that the intervention reduced the risk of COPD hospitalization (hazard ratio 0.43, 95% confidence interval 0.24-0.77, P = .002). CONCLUSION Among patients hospitalized for acute COPD exacerbation, the discharge coordinator intervention reduced both COPD hospitalizations and all-cause hospitalizations.

OBJECTIVE To investigate the business case of postdischarge care transition (PDCT) among Medicare beneficiaries by conducting a cost-benefit analysis. DESIGN Randomized controlled trial. SETTING A general hospital in upstate New York State. PARTICIPANTS Elderly Medicare beneficiaries being treated from October 2008 through December 2009 were randomly selected to receive services as part of a comprehensive PDCT program (intervention, 173 patients) or regular discharge process (control, 160 patients) and followed for 12 months. INTERVENTION The intervention comprised five activities: development of a patient-centered health record, a structured discharge preparation checklist of critical activities, delivery of patient self-activation and management sessions, follow-up appointments, and coordination of data flow. MEASUREMENTS Cost-benefit ratio of the PDCT program; self-management skills and abilities. RESULTS The 1-year readmission analysis revealed that control participants were more likely to be readmitted than intervention participants (58.2% vs 48.2%; P = .08), with most of that difference observed in the 91 to 365 days after discharge. Findings from the cost-benefit analysis revealed a cost-benefit ratio of 1.09, which indicates that, for every $1 spent on the program, a saving of $1.09 was realized. In addition, participating in a care transition program significantly enhanced self-management skills and abilities.
CONCLUSION Postdischarge care transition programs have a dual benefit of enhancing elderly adults' self-management skills and abilities and producing cost savings. This study builds a case for the inclusion of PDCT programs as a reimbursable service in benefit packages.

This study was initiated to determine the impact of a post-discharge, nurse-led, home-based case management intervention on the number of emergency readmissions, level of care utilization, quality of life, and psychological functioning. Patients discharged home from a general hospital (N=147) were randomly assigned to usual care or a nurse-led, home-based case management intervention. During the 24 weeks of follow-up, no difference between the two groups was found for readmission, care utilization, quality of life, or psychological functioning. Patients in the control group tended to move sooner to non-independent living accommodation than patients in the nurse-led, home-based case management intervention group.

OBJECTIVES To assess the effect of an electronic health record-based transitional care intervention involving automated alerts to primary care providers and staff when older adults were discharged from the hospital. DESIGN Randomized controlled trial. SETTING Large multispecialty group practice. PARTICIPANTS Individuals aged 65 and older discharged from hospital to home. INTERVENTION In addition to notifying primary care providers about the individual's recent discharge, the system provided information about new drugs added during the inpatient stay, warnings about drug-drug interactions, recommendations for dose changes and laboratory monitoring of high-risk medications, and alerts to the primary care provider's support staff to schedule a posthospitalization office visit. MEASUREMENTS An outpatient office visit with a primary care provider after discharge and rehospitalization within 30 days after discharge.
RESULTS Of the 1,870 discharges in the intervention group, 27.7% had an office visit with a primary care provider within 7 days of discharge. Of the 1,791 discharges in the control group, 28.3% had an office visit with a primary care provider within 7 days of discharge. In the intervention group, 18.8% experienced a rehospitalization within the 30-day period after discharge, compared with 19.9% in the control group. The hazard ratio for an office visit with a primary care physician did not significantly differ between the intervention and control groups. The hazard ratio for rehospitalization in the 30-day period after hospital discharge in the intervention versus the control group was 0.94 (95% confidence interval = 0.81-1.1). CONCLUSION This electronic health record-based intervention did not have a significant effect on the timeliness of office visits to primary care providers after hospitalization or the risk of rehospitalization.

This randomized controlled trial examined a discharge nursing intervention aimed at promoting self-regulation of care for early discharge interventional cardiology patients. The purpose of this study was to compare medication adherence, patient satisfaction, use of urgent care, and illness perception in patients with cardiovascular disease undergoing interventional revascularization procedures who receive usual care and those who receive a discharge nursing intervention.

Background Heart failure is a common and costly condition, particularly in the elderly. A range of models of interventions have shown the capacity to decrease hospitalizations and improve health-related outcomes. Potentially, cardiac rehabilitation models can also improve outcomes. Aim To assess the impact of a nurse-coordinated multidisciplinary cardiac rehabilitation program to decrease hospitalizations, increase functional capacity, and meet the needs of patients with heart failure.
Method In a randomized controlled trial , a total of 105 patients were recruited to the study . Patients in the intervention group received an individualized , multidisciplinary 12-week cardiac rehabilitation program , including an individualized exercise component tailored to functional ability and social circumstances . The control group received an information session provided by the cardiac rehabilitation coordinator and then follow-up care by either their cardiologist or general practitioner . This trial was stopped prematurely after the release of state-based guidelines and funding for heart failure programs . Results During the study period , patients in the intervention group were less likely to have been admitted to hospital for any cause ( 44 vs. 69 % , P = 0.01 ) or after a major acute coronary event ( 24 vs. 55 % , P = 0.001 ) . Participants in the intervention group were more likely to be alive at 12 months ( 93 vs. 79 % ; P = 0.03 ) ( odds ratio = 3.85 ; 95 % confidence interval = 1.03 - 14.42 ; P = 0.0042 ) . Quality of life scores improved at 3 months compared with baseline ( intervention t = 4.37 , P < 0.0001 ; control t = 3.52 , P < 0.01 ) . Improvement was also seen in 6-min walk times at 3 months compared with baseline in the intervention group ( t = 3.40 ; P = 0.01 ) . Conclusion This study shows that a multidisciplinary heart failure cardiac rehabilitation program , including an individualized exercise component , coordinated by a specialist heart failure nurse can substantially reduce both all-cause and cardiovascular readmission rates , improve functional status at 3 months and exercise tolerance BACKGROUND Hospitals are implementing discharge support programs to reduce readmissions , and these programs have had mixed success .
OBJECTIVE To examine whether a peridischarge , nurse-led intervention decreased emergency department ( ED ) visits or readmissions among ethnically and linguistically diverse older patients admitted to a safety-net hospital . DESIGN Randomized , controlled trial using computer-generated randomization with 1:1 allocation , stratified by language . ( ClinicalTrials.gov : NCT01221532 ) . SETTING Publicly funded urban hospital in Northern California . PATIENTS Hospitalized adults aged 55 years or older with anticipated discharge to the community who spoke English , Spanish , or Chinese ( Mandarin or Cantonese ) . INTERVENTION Usual care versus in-hospital , one-on-one , self-management education given by a dedicated language-concordant registered nurse combined with a telephone follow-up after discharge from a nurse practitioner . MEASUREMENTS Staff blinded to the study groups determined ED visits or readmissions to any facility at 30 , 90 , and 180 days after initial hospital discharge using administrative data from several hospitals . RESULTS There were 700 low-income , ethnically and linguistically diverse patients with a mean age of 66.2 years ( SD , 9.0 ) . The primary outcome of ED visits or readmissions did not differ between the intervention and usual care groups ( hazard ratio , 1.26 [ 95 % CI , 0.89 to 1.78 ] at 30 days , 1.21 [ CI , 0.91 to 1.62 ] at 90 days , and 1.11 [ CI , 0.86 to 1.43 ] at 180 days ) . LIMITATIONS This study was done at a single acute-care hospital . There were fewer outcomes than expected , which may have caused the study to be underpowered . CONCLUSION A nurse-led , in-hospital discharge support intervention did not show a reduction in readmissions or ED visits among diverse , low-income older adults at a safety-net hospital . Although wide CIs preclude firm conclusions , the intervention may have increased ED visits . Alternative readmission prevention strategies should be tested in this population .
PRIMARY FUNDING SOURCE Gordon and Betty Moore Foundation ♦ Background : Patients with end-stage renal failure ( ESRF ) need integrated health care to maintain a desirable quality of life . Studies suggest that post-discharge nurse-led telephone support has a positive effect for patients suffering from chronic diseases . But the post-discharge care is under-developed in mainland China and the effects of post-discharge care on patients with peritoneal dialysis have not been conclusive . ♦ Aim : The purpose of this study is to test the effectiveness of postdischarge nurse-led telephone support on patients with peritoneal dialysis in mainland China . ♦ Methods : A randomized controlled trial was conducted in the medical department of a regional hospital in Guangzhou . 135 patients were recruited , 69 in the study group and 66 in the control group . The control group received routine hospital discharge care . The study group received post-discharge nurse-led telephone support . The quality of life ( Kidney Disease Quality of Life Short Form , KDQOL-SF ) , blood chemistry , complication control , readmission and clinic visit rates were observed at three time intervals : baseline before discharge ( T1 ) , 6 ( T2 ) and 12 ( T3 ) weeks after discharge . ♦ Results : Statistically significant effects were found for symptom/problem , work status , staff encouragement , patient satisfaction and energy/fatigue in KDQOL-SF and 84-day ( 12-week ) clinic visit rates between the two groups . The study group had more significant improvement than the control group for sleep , staff encouragement at both T2 and T3 , and pain at T2 and patient satisfaction at T3 . No significant differences were observed between the two groups for the baseline measures , other dimensions in KDQOL-SF , blood chemistry , complication control , readmission rates at all time intervals and clinic visit rates at the first two time intervals .
♦ Conclusions : Post-discharge nurse-led telephone support for patients undergoing peritoneal dialysis is effective to enhance patients ’ well-being in the transition from hospital to home in mainland China RATIONALE , AIMS , AND OBJECTIVES Medication-related problems are frequent and can lead to serious adverse events resulting in increased morbidity , mortality , and costs . Medication use in frail older patients is even more complex . The aim of this study was to investigate the effect of a pharmacist-led medicines management model among older patients at admission , during inpatient stay and at discharge on medication-related readmissions . METHOD A randomized controlled trial conducted at the acute admission unit in a Danish hospital with acutely admitted medical patients , randomized to either a control group or one of two intervention groups . The intervention consisted of pharmacist-led medication review and patient interview upon admission ( intervention ED ) or pharmacist-led medication review and patient interview upon admission , medication review during inpatient stay , and medication report and patient counselling at discharge ( intervention STAY ) . RESULTS In total , 600 patients were included . The pharmacist identified 920 medication-related problems with 57 % of the recommendations accepted by the physician . After 30 days , 25 patients had a medication-related readmission , with no statistically significant difference between the groups on either primary or secondary outcomes . CONCLUSIONS This study showed that a clinical pharmacist can be used to identify and solve medication-related problems , but this study did not find any effect on the selected outcomes . The frequency of medication-related readmissions was low , leaving little room for improvement . Future research should consider other study designs or outcome measures
2,018
31,604,470
Conclusions Compared with the internal fixation group , patients who underwent hemiarthroplasty had a lower reoperation rate and an equivalent overall mortality rate . Our meta-analysis suggests that hemiarthroplasty might be a better treatment choice than internal fixation in treating elderly patients with an undisplaced femoral neck fracture
Background Although internal fixation has been the main treatment option for elderly patients with an undisplaced femoral neck fracture , it is associated with a high reoperation rate . Some surgeons have discussed the use of hemiarthroplasty , but there is limited literature comparing these two treatment modalities . In this study , we compared the perioperative results of hemiarthroplasty with internal fixation for undisplaced femoral neck fractures .
Introduction There were higher rates of revision , complication , non-union , delayed union , and poorer functional outcomes reported in super-aged patients with undisplaced femoral neck fractures treated with internal fixation . Therefore , we designed this randomized comparative study aiming to compare the effectiveness and long-term follow-up results of hemiarthroplasty ( HA ) with that of multiple cannulated screws ( MCS ) . Materials and methods Eligible participants were randomly assigned into two groups for different methods of operation ( hemiarthroplasty group and internal fixation group ) . The related indexes and data of the two groups were collected for comparative analysis during the average follow-up period of 38.68 ± 28.24 months . Results Only two patients underwent reoperation in the HA group , and the reoperation rate of the HA group ( 5.41 % , 2/37 ) was significantly lower than that of the IF group ( 21.4 % , 9/41 ) ( P value = 0.000 ) . The comparison of survival curves for reoperation showed significant differences between the two groups ( P value = 0.031 ) . The results of the Cox proportional hazards model suggested that only operation method significantly affected the occurrence of reoperation ( P value = 0.049 ) . The results of survival analysis showed that there was no significant difference in survival time between the two groups ( P value = 0.682 ) . And in the Cox proportional hazards model , only age significantly affected the occurrence of death ( P value = 0.000 ) . The average Harris scores of the two groups were all above 75 points , and there was no significant difference in Harris scores between the two groups ( P value greater than 0.05 ) . But in the early-term follow-up , the excellent and good rate of hip joint function in the HA group was significantly higher than that in the IF group ( P value less than 0.05 ) .
Conclusions Hemiarthroplasty , with fewer postoperative complications , a low reoperation rate and better early functional recovery , provides a good choice for the treatment of super-aged patients with nondisplaced femoral neck fracture BACKGROUND Arthroplasty is now widely used to treat intra-capsular proximal femoral fractures ( PFFs ) in older patients , even when there is little or no displacement . However , whether arthroplasty is associated with lower mortality and complication rates in non-displaced or mildly displaced PFFs is unknown . The objectives of this prospective study were : ( 1 ) to evaluate early mortality rates with the two treatment methods , ( 2 ) to identify risk factors for complications , ( 3 ) and to identify predictors of functional decline . HYPOTHESIS Arthroplasty and internal fixation produce similar outcomes in non-displaced fractures of patients older than 80 years with PFFs . MATERIAL AND METHODS This multicentre prospective study included consecutive patients older than 80 years who were managed for intra-capsular PFFs at eight centres in 2014 . Biometric data and geriatric assessment scores ( Parker Mobility Score , Katz Index of Independence , and Mini-Nutritional Assessment [ MNA ] score ) were collected before and 6 months after surgery . Independent risk factors were sought by multivariate analysis . We included 418 females and 124 males with a mean age of 87±4 years . The distribution of Garden stages was stage I , n=56 ; stage II , n=33 ; stage III , n=130 ; and stage IV , n=323 . Arthroplasty was performed in 494 patients and internal fixation in 48 patients with non-displaced intra-capsular PFFs . RESULTS Mortality after 6 months was 16.4 % overall , with no significant difference between the two groups .
By multivariate analysis , two factors were significantly associated with higher mortality , namely , male gender ( odds ratio [ OR ] , 3.24 ; 95 % confidence interval [ 95 % CI ] , 2.0 - 5.84 ; P<0.0001 ) and high ASA score ( OR , 1.56 ; 95 % CI , 1.07 - 2.26 ; P=0.019 ) . Two factors were independently associated with lower mortality , with 75 % predictive value , namely , high haematocrit ( OR , 0.8 ; 95 % CI , 0.7 - 0.9 ; P=0.001 ) and better Parker score ( OR , 0.5 ; 95 % CI , 0.3 - 0.8 ; P=0.01 ) . The cut-off values associated with a significant risk increase were 2 for the Parker score ( OR , 1.8 ; 95 % CI , 1.1 - 2.3 ; P=0.001 ) and 37 % for the haematocrit ( OR , 3.3 ; 95 % CI , 1.9 - 5.5 ; P=0.02 ) . Complications occurred in 5.5 % of patients . Surgical site infections were seen in 1.4 % of patients , all of whom had had arthroplasty . Blood loss was significantly greater with arthroplasty ( 311±197mL versus 201±165mL , P<0.0002 ) . Dependency worsened in 39 % of patients , and 31 % of patients lost self-sufficiency . A higher preoperative Parker score was associated with a lower risk of high postoperative dependency ( OR , 0.86 ; 95 % CI , 0.76 - 0.97 ; P=0.014 ) . DISCUSSION Neither treatment method was associated with decreased mortality or better function after intra-capsular PFFs in patients older than 80 years . Early mortality rates were consistent with previous reports . Among the risk factors identified in this study , age , preoperative self-sufficiency , and gender are not amenable to modification , in contrast to haematocrit and blood loss . CONCLUSION Internal fixation remains warranted in patients older than 80 years with non-displaced intra-capsular PFFs . 
LEVEL OF EVIDENCE III , prospective case-control study Introduction : Although the preferred treatment for displaced femoral neck fractures in the elderly is hip arthroplasty , the treatment for impacted or undisplaced femoral neck fractures ( UFNF ) is still a subject of controversy . Our purpose was to systematically review studies of elderly patients with UFNF treated with internal fixation using screws : ( i ) what is the reported mortality ; ( ii ) what is the reoperation rate ; ( iii ) what are the clinical and radiological outcomes ; and ( iv ) what is the methodological quality of the included studies ? Methods : This systematic review was performed through a search of PubMed and the Cochrane database using a structured search algorithm including studies enrolling patients older than 60 years old , with UFNF treated with internal fixation using screws . Our literature search returned 950 studies and 11 were selected for final abstraction . Results : 6 studies reported mortality rate . At 1-year follow-up mortality was reported by 3 studies : 18.8 % , 22 % , and 19 % . At 5 years , 1 study reported mortality rate of 42 % . Overall reoperation rate was reported by 9 studies and ranged from 8%-19 % , while conversion to hip arthroplasty was performed in the range between 8 % and 16 % according to 6 studies . Conclusions : Internal fixation with cannulated screws for UFNF in the elderly is a valuable option , although it has substantial reoperation and mortality rates . Further prospective high-quality , randomised controlled trials are required to establish the optimal approach for the treatment of UFNF INTRODUCTION The objective of this study was to identify indications and predictors for subsequent surgeries in the same hip and to evaluate life expectancy following screw fixation of undisplaced femoral neck fractures ( FNF ) .
The study further aimed to determine the necessary follow-up time for future studies aiming to evaluate the treatment of such fractures . MATERIALS AND METHODS This is a single-center retrospective cohort study with prospectively collected data including skeletally mature patients with undisplaced FNFs operated between 2005 and 2013 . Gender , age at fracture , American Society of Anesthesiologists score , smoking status and excess use of alcohol were retrieved from electronic medical records . Further , complications leading to all consecutive reoperations were registered along with time from primary operation to all reoperations , type of procedure during subsequent surgeries and time of death . RESULTS 383 patients with a median ( range ) follow-up of 77 ( 23 - 125 ) months were identified . Within 1 , 2 and 5 years from primary surgery , 8 % , 17 % and 21 % respectively , had at least one subsequent surgery in the same hip . 10 % of the patients underwent salvage arthroplasty ; however , in long-time survivors , conversion to arthroplasty was estimated in one out of four . Posterior tilt of the femoral head was a predictor for new surgeries due to instability of the bone-implant construct , but not for later avascular necrosis . For patients 70 years or older , the one-year mortality in men was 32 % with an expected survival of approx. 2.5 years , compared to 17 % and 5.5 years in women . CONCLUSIONS Screw fixation of undisplaced femoral neck fractures appears to be a safe procedure , in particular in the absence of a posterior tilt of the femoral head . Conversion to arthroplasty was estimated to occur in one out of four of long-time survivors . Men have a particularly poor medical prognosis and should receive careful medical attention .
In order to capture 80 % of reoperations , clinical studies and register studies must have a follow-up time of at least two years Background Reported revision of internal fixation for undisplaced intracapsular hip fractures is between 12 and 17 % at 1 year . This risk is greater for elderly patients , for whom mortality after such a fracture is also higher . Our purpose was to identify predictors of fixation failure and mortality for elderly patients sustaining undisplaced intracapsular hip fractures , and to assess whether their socioeconomic status affected their outcome . Methods During a 3-year period we prospectively compiled a consecutive series of 162 elderly ( ≥65 years old ) patients who underwent internal fixation for an undisplaced ( Garden stage I or II ) intracapsular hip fracture . Patient demographics , American Society of Anesthesiologists ( ASA ) grade , and posterior tilt ( measured on the lateral radiograph ) were recorded pre-operatively . All patients were followed up for a minimum of 1 year . Each patient ’s socioeconomic status was assigned by use of the Scottish Index of Multiple Deprivation . Patient mortality was established by use of the General Register Office for Scotland . Results There were 28 failures of fixation during the study period . In Cox regression analysis , ASA grade and the presence of posterior tilt ( p < 0.0001 ) were significant independent predictors of fixation failure . Overall unadjusted mortality at 1 year was 19 % ( n = 30/162 ) . Cox regression analysis also affirmed ASA grade to be the only significant independent predictor of 1-year mortality ( p = 0.003 ) . The standardised mortality rate for the cohort was 2.3 ( p < 0.001 ) , and was significantly greater for patients less than 80 years of age ( p = 0.004 ) . Socioeconomic status did not affect outcome , but the most deprived patients sustain their fracture at a significantly younger age ( p = 0.001 ) .
Conclusion We have demonstrated that ASA grade and posterior tilt of the femoral neck are independent predictors of fixation failure of undisplaced intracapsular hip fractures in elderly patients , and ASA grade was also an independent predictor of mortality Abstract Hemiarthroplasty is the most commonly used treatment for displaced femoral neck fractures in the elderly . There is limited evidence in the literature of improved functional outcome with cemented implants , although serious cement-related complications have been reported . We performed a randomized , controlled trial in patients 70 years and older comparing a cemented implant ( 112 hips ) with an uncemented , hydroxyapatite-coated implant ( 108 hips ) , both with a bipolar head . The mean Harris hip score showed equivalence between the groups , with 70.9 in the cemented group and 72.1 in the uncemented group after 3 months ( mean difference , 1.2 ) and 78.9 and 79.8 after 12 months ( mean difference , 0.9 ) . In the uncemented group , the mean duration of surgery was 12.4 minutes shorter and the mean intraoperative blood loss was 89 mL less . The Barthel Index and EQ-5D scores did not show any differences between the groups . The rates of complications and mortality were similar between groups . Both arthroplasties may be used with good results after displaced femoral neck fractures . Level of Evidence : Level I , therapeutic study . See the Guidelines for Authors for a complete description of levels of evidence Pain intensity is frequently measured on an 11‐point pain intensity numerical rating scale ( PI‐NRS ) , where 0=no pain and 10=worst possible pain . However , it is difficult to interpret the clinical importance of changes from baseline on this scale ( such as a 1‐ or 2‐point change ) . To date , there are no data-driven estimates for clinically important differences in pain intensity scales used for chronic pain studies .
We have estimated a clinically important difference on this scale by relating it to global assessments of change in multiple studies of chronic pain . Data on 2724 subjects from 10 recently completed placebo‐controlled clinical trials of pregabalin in diabetic neuropathy , postherpetic neuralgia , chronic low back pain , fibromyalgia , and osteoarthritis were used . The studies had similar designs and measurement instruments , including the PI‐NRS , collected in a daily diary , and the standard seven‐point patient global impression of change ( PGIC ) , collected at the endpoint . The changes in the PI‐NRS from baseline to the endpoint were compared to the PGIC for each subject . Categories of ‘ much improved ’ and ‘ very much improved ’ were used as determinants of a clinically important difference and the relationship to the PI‐NRS was explored using graphs , box plots , and sensitivity/specificity analyses . A consistent relationship between the change in PI‐NRS and the PGIC was demonstrated regardless of study , disease type , age , sex , study result , or treatment group . On average , a reduction of approximately two points or a reduction of approximately 30 % in the PI‐NRS represented a clinically important difference . The relationship between percent change and the PGIC was also consistent regardless of baseline pain , while higher baseline scores required larger raw changes to represent a clinically important difference . The application of these results to future studies may provide a standard definition of clinically important improvement in clinical trials of chronic pain therapies . Use of a standard outcome across chronic pain studies would greatly enhance the comparability , validity , and clinical applicability of these studies Background and purpose There is very little information on the cost of different treatments for femoral neck fractures .
We assessed whether total hospital and societal costs of treatment of elderly patients with displaced femoral neck fractures differ between patients operated with internal fixation or hemiarthroplasty . Methods 222 patients ( mean age 83 years , 165 women ( 74 % ) ) who had been randomized to internal fixation or hemiarthroplasty were followed for 2 years . Resource use in hospital , rehabilitation , community-based care , and nursing home use were identified , quantified , evaluated , and analyzed . Results The average cost per patient for the initial hospital stay was lower for patients in the internal fixation group than in the hemiarthroplasty group ( € 9,044 vs. € 11,887 , p < 0.01 ) . When all hospital costs , i.e. rehabilitation , reoperations , and formal and informal contact with the hospital were included , the costs were similar ( € 21,709 for internal fixation vs. € 19,976 for hemiarthroplasty ) . When all costs were included ( hospital admissions , cost of nursing home , and community-based care ) , internal fixation was the most expensive treatment ( € 47,186 vs. € 38,615 ( p = 0.09 ) ) . Interpretation The initial lower average cost per patient for internal fixation as treatment for a femoral neck fracture cannot be used as an argument in favor of this treatment , since the average cost per patient is more than outweighed by subsequent costs , mainly due to a higher reoperation rate after internal fixation Objective To compare the functional results after displaced fractures of the femoral neck treated with internal fixation or hemiarthroplasty . Design Randomised trial with blinding of assessments of functional results . Setting University hospital . Participants 222 patients ; 165 ( 74 % ) women , mean age 83 years . Inclusion criteria were age above 60 , ability to walk before the fracture , and no major hip pathology , regardless of cognitive function .
Interventions Closed reduction and two parallel screws ( 112 patients ) and bipolar cemented hemiarthroplasty ( 110 patients ) . Follow-up at 4 , 12 , and 24 months . Main outcome measures Hip function ( Harris hip score ) , health related quality of life ( Eq-5d ) , activities of daily living ( Barthel index ) . In all cases high scores indicate better function . Results Mean Harris hip score in the hemiarthroplasty group was 8.2 points higher ( 95 % confidence interval 2.8 to 13.5 points , P=0.003 ) at four months and 6.7 points ( 1.5 to 11.9 points , P=0.01 ) higher at 12 months . Mean Eq-5d index score at 24 months was 0.13 higher in the hemiarthroplasty group ( 0.01 to 0.25 , P=0.03 ) . The Eq-5d visual analogue scale was 8.7 points higher in the hemiarthroplasty group after 4 months ( 1.9 to 15.6 , P=0.01 ) . After 12 and 24 months the percentage scoring 95 or 100 on the Barthel index was higher in the hemiarthroplasty group ( relative risk 0.67 , 0.47 to 0.95 , P=0.02 , and 0.63 , 0.42 to 0.94 , P=0.02 , respectively ) . Complications occurred in 56 ( 50 % ) patients in the internal fixation group and 16 ( 15 % ) in the hemiarthroplasty group ( 3.44 , 2.11 to 5.60 , P<0.001 ) . In each group 39 patients ( 35 % ) died within 24 months ( 0.98 , 0.69 to 1.40 , P=0.92 ) Conclusions Hemiarthroplasty is associated with better functional outcome than internal fixation in treatment of displaced fractures of the femoral neck in elderly patients . Trial registration NCT00464230 There is a lack of consensus about how to treat intracapsular hip fractures in the ' young elderly ' ( 50 - 75 years ) . Evidence for older more mobile patients seems to point towards Internal Fixation ( IF ) for undisplaced fractures and Total Hip Replacement ( THR ) for displaced fractures . Radiographs of 263 patients from the Norfolk and Norwich University Hospital , who have suffered an intracapsular hip fracture between 2000 - 2009 were reviewed .
The complication and mortality rates were noted . A Hip function questionnaire ( Oxford hip score ( OHS ) ) and Numeric pain score ( NPS ) were sent out to patients , then methods of treatment ( IF and THR ) were compared . In displaced fractures THR compared favourably to IF , OHS ( 16.0 vs. 20.0 p 0.029 ) , NPS ( 2.0 vs. 4.0 p 0.007 ) , complications ( Odds Ratio ( OR ) 2.90 ; p 0.006 ) and death rate ( OR 3.61 ; p 0.007 ) . Although not statistically significant when stratified for age , the youngest age group ( 50 - 60 ) still achieved better function with a THR ( 13.0 vs. 18.0 ; p 0.129 ) . There was little difference in the results for undisplaced fractures . This retrospective cross-sectional study showed IF is associated with a much higher complication rate than THR for patients who sustained a displaced hip fracture . THR also showed a better functional outcome and reduced pain . IF should be used in undisplaced fractures as there was no difference in functional outcome or complication rate . A large randomised controlled trial is needed to confirm these results PURPOSE The aim of the study was to evaluate the mortality following the operative treatment of undisplaced subcapital fracture of the hip by internal fixation ( with three lag screws ) or hemiarthroplasty . METHODS A prospective audit of all patients admitted with hip fracture was undertaken at the university hospital in Nottingham . An independent research assistant collected data on a standardised questionnaire . Mortality was calculated from data received from the National Office of Statistics allowing 100 % 1-year follow up for mortality statistics . RESULTS One hundred and sixty patients were admitted with undisplaced intracapsular fracture of the hip . Twenty-one patients had non-operative management and were excluded from the results . One hundred and thirty-nine patients had surgical treatment . Mean age of patients was 78 years .
Twenty-nine patients had hemiarthroplasty and 110 patients underwent internal fixation of their fractures . There was no significant difference between the two groups for age , sex , mobility , residential status , co-morbidity and cognitive state . There was a significant difference in mortality between the two operated groups at 1 month and 1 year after the operation . Six patients ( 21 % ) died after hemiarthroplasty in the first month while there were only two ( 2 % ) deaths in the internal fixation group ( P < 0.001 ) . At 1 year from operation , 11 patients ( 38 % ) from the hemiarthroplasty group and 17 patients ( 16 % ) from the internal fixation group died ( P = 0.0072 ) . The re-operation rate within 1 year was higher for the internal fixation group ( n = 8 ; 7.2 % ) than the hemiarthroplasty group ( n = 1 ; 3 % ) . CONCLUSIONS There is significant increase in mortality when undisplaced intracapsular hip fractures are treated by hemiarthroplasty as compared to internal fixation and we would not recommend it for these fractures We randomised 143 patients – age 75 years or older – with displaced femoral neck fracture to either internal fixation or total hip replacement ( THR ) and compared the socio-economic consequences . In the internal fixation group , 34 of 78 hips underwent secondary surgery . In the THR group , 12 of 68 hips dislocated , the majority in mentally impaired patients . We calculated the total hospital costs for two years after operation . When secondary surgery was included , there was no difference in costs between the internal fixation and THR groups , or between the mentally impaired and lucid subgroups . The costs to the community were calculated comparing the baseline cost before surgery with the average cost per month during the first postoperative year . No difference was found between the treatment groups . The Harris hip scores were higher in the THR group , and pain was more common in the internal fixation group .
In lucid patients , THR gives a better clinical result at the same cost . Three hundred and five undisplaced subcapital femoral neck fractures managed by pinning in situ with Knowles ' pins were evaluated to elucidate the role and effect of such treatment . The protocol of management and follow up , and evaluation , both radiographically and functionally , were set up prospectively . The duration from injury to management was 3.5 ( 1 - 14 ) days , the operation time was 22 ( 9 - 48 ) min and most of the patients were discharged without hospitalization . The follow-up period was 75 ( 28 - 136 ) months . The final results showed 282 ( 92.5 per cent ) fractures united without complications ( mean union time : 20 weeks ) , 14 ( 4.6 per cent ) limbs with non-union , and 9 ( 2.9 per cent ) had implant problems .
Twenty-two ( 7.2 per cent ) developed avascular necrosis after union . Percutaneous pinning of undisplaced subcapital femoral neck fractures as day cases is a simple , safe , effective and economic method . The results and the related conditions of 250 undisplaced femoral neck fractures managed by percutaneous Knowles pinning were evaluated . All of the patients were over 59 years old , and the protocol of management and follow-up was determined prospectively . The duration from injury to management was 3.0 ( range 1–12 ) days , the operation time was 20 ( range 10–44 ) min , and most of the patients were discharged without hospitalization . The follow-up period was 74 ( range 24–138 ) months . The final results showed 226 ( 90.4 % ) fractures with smooth course of union ( mean union time : 24 weeks ) , 15 ( 6.0 % ) fractures with nonunion , and 9 ( 3.6 % ) fractures with implant problems . Eighteen ( 7.2 % ) hips developed avascular necrosis of the femoral head after union . The analysis showed that the rate of complications was higher in elderly persons with undisplaced femoral neck fractures . BACKGROUND Although less likely to be reported in clinical trials than expressions of the statistical significance of differences in outcomes , whether or not a treatment has delivered a specified minimum clinically important difference ( MCID ) is also relevant to patients and their caregivers and doctors . Many dementia treatment randomised controlled trials ( RCTs ) have not reported MCIDs and , where they have been done , observed differences have not reached these .
METHODS As part of the development of the Statistical Analysis Plan for the DOMINO trial , investigators met to consider expert opinion- and distribution-based values for the MCID and triangulated these to provide appropriate values for three outcome measures , the Standardised Mini-mental State Examination ( sMMSE ) , Bristol Activities of Daily Living Scale ( BADLS ) and Neuropsychiatric Inventory ( NPI ) . Only standard deviations ( SD ) were presented to investigators , who remained blind to treatment allocation . RESULTS Adoption of values for MCIDs based upon 0.4 of the SD of the change in score from baseline on the sMMSE , BADLS and NPI in the first 127 participants to complete DOMINO yielded MCIDs of 1.4 points for sMMSE , 3.5 for BADLS and 8.0 for NPI . CONCLUSIONS Reference to MCIDs is important for the full interpretation of the results of dementia trials , and those conducting such trials should be open about the way in which they have determined and chosen their values for the MCIDs . Background : Elderly patients with a displaced femoral neck fracture treated with hip arthroplasty may have better function than those treated with internal fixation . We hypothesized that hemiarthroplasty would be superior to screw fixation with regard to hip function , mobility , pain , quality of life , and the risk of a reoperation in elderly patients with a nondisplaced femoral neck fracture . Methods : In a multicenter randomized controlled trial ( RCT ) , Norwegian patients ≥70 years of age with a nondisplaced ( valgus impacted or truly nondisplaced ) femoral neck fracture were allocated to screw fixation or hemiarthroplasty .
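The distribution-based MCID rule used in the DOMINO statistical plan above (0.4 × SD of the change-from-baseline score) is simple arithmetic; a minimal sketch, with a helper name of our own choosing (the multiplier and the reported MCIDs come from the abstract):

```python
# Minimal sketch of the distribution-based MCID rule quoted above:
# MCID = 0.4 * SD of the change-from-baseline score. The function
# name is ours, not from the DOMINO trial.

def distribution_based_mcid(sd_change, multiplier=0.4):
    """Minimum clinically important difference as a fraction of the
    standard deviation of the change score."""
    return multiplier * sd_change

# Reported MCIDs were 1.4 (sMMSE), 3.5 (BADLS) and 8.0 (NPI) points,
# implying change-score SDs of roughly 3.5, 8.75 and 20 points.
print(round(distribution_based_mcid(3.5), 1))   # 1.4 (sMMSE)
print(round(distribution_based_mcid(20), 1))    # 8.0 (NPI)
```

Any anchor (expert opinion) or distribution-based multiplier can be swapped in via the `multiplier` argument; 0.4 SD is simply the value the DOMINO investigators adopted.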
Assessors blinded to the type of treatment evaluated hip function with the Harris hip score ( HHS ) as the primary outcome as well as on the basis of mobility assessed with the timed “ Up & Go ” ( TUG ) test , pain as assessed on a numerical rating scale , and quality of life as assessed with the EuroQol-5 Dimension-3 Level ( EQ-5D ) at 3 , 12 , and 24 months postsurgery . Results , including reoperations , were assessed with intention-to-treat analysis . Results : Between February 6 , 2012 , and February 6 , 2015 , 111 patients were allocated to screw fixation and 108 , to hemiarthroplasty . At the time of follow-up , there was no significant difference in hip function between the screw fixation and hemiarthroplasty groups , with a 24-month HHS ( and standard deviation ) of 74 ± 19 and 76 ± 17 , respectively , and an adjusted mean difference of −2 ( 95 % confidence interval [ CI ] = −6 to 3 ; p = 0.499 ) . Patients allocated to hemiarthroplasty were more mobile than those allocated to screw fixation ( 24-month TUG = 16.6 ± 9.5 versus 20.4 ± 12.8 seconds ; adjusted mean difference = 6.2 seconds [ 95 % CI = 1.9 to 10.5 seconds ] ; p = 0.004 ) . Furthermore , screw fixation was a risk factor for a major reoperation , which was performed in 20 % ( 22 ) of 110 patients who underwent screw fixation versus 5 % ( 5 ) of 108 who underwent hemiarthroplasty ( relative risk reduction [ RRR ] = 3.3 [ 95 % CI = 0.7 to 10.0 ] ; number needed to harm [ NNH ] = 6.5 ; p = 0.002 ) . The 24-month mortality rate was 36 % ( 40 of 111 ) for patients allocated to internal fixation and 26 % ( 28 of 108 ) for those allocated to hemiarthroplasty ( RRR = 0.4 [ 95 % CI = −0.1 to 1.1 ] ; p = 0.11 ) . Two patients were lost to follow-up . Conclusions : In this multicenter RCT , hemiarthroplasty was not found to be superior to screw fixation in reestablishing hip function as measured by the HHS ( the primary outcome ) .
However , hemiarthroplasty led to improved mobility and fewer major reoperations . The findings suggest that certain elderly patients with a nondisplaced femoral neck fracture may benefit from being treated with a latest-generation hemiarthroplasty rather than screw fixation . Level of Evidence : Therapeutic Level I. See Instructions for Authors for a complete description of levels of evidence
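The reoperation comparison in the Norwegian trial above comes down to simple event-rate arithmetic. As an illustrative sketch (function names are our own, not from the trial report), the absolute risk difference and the number needed to harm (NNH) can be reproduced from the raw counts of 22/110 major reoperations after screw fixation versus 5/108 after hemiarthroplasty:

```python
# Illustrative sketch: absolute risk difference and NNH from the
# reoperation counts reported above. Helper names are our own.

def risk_difference(events_a, n_a, events_b, n_b):
    """Absolute risk difference between two groups."""
    return events_a / n_a - events_b / n_b

def number_needed_to_harm(events_a, n_a, events_b, n_b):
    """NNH = 1 / |absolute risk difference|."""
    return 1 / abs(risk_difference(events_a, n_a, events_b, n_b))

ard = risk_difference(22, 110, 5, 108)
nnh = number_needed_to_harm(22, 110, 5, 108)
print(round(ard, 3), round(nnh, 1))  # 0.154 6.5
```

The result of about 6.5 matches the NNH reported in the abstract: roughly one extra major reoperation for every 6 to 7 patients treated with screw fixation instead of hemiarthroplasty.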
2,019
27,919,103
Medication-assisted treatment more effectively reduces opioid use than behavioral treatment alone ( 5 , 6 ) . Office-based opioid treatment may be particularly advantageous for reaching persons with OUD who are already engaged in primary care and offers an alternative for patients who cannot access methadone treatment programs . Challenges include a variable scope of psychosocial services and the structure required for management of complex patients .
Opioid use disorder ( OUD ) is a national crisis in the United States ( 1 ) . In 2014 , approximately 1.9 million Americans aged 12 years or older were estimated to have an OUD related to prescription opioids , and nearly 600,000 used heroin ( 2 ) . In 2013 , an estimated 16,000 persons died as a result of prescription opioid overdose , and approximately 8,000 died of heroin overdose ( 3 ) . Medication-assisted treatment ( MAT ) for OUD , also referred to as pharmacotherapy , decreases illicit opioid use , prevents relapse , improves health , and reduces the risk for death from OUD ( 4 ) . Medications approved by the U.S. Food and Drug Administration include a full agonist ( methadone ) , partial opioid agonists ( buprenorphine , buprenorphine-naloxone , and implantable buprenorphine ) , and opioid antagonists ( oral and extended-release naltrexone ) . These medications block the euphoric and sedating effects of opioids , reduce craving for opioids , and mitigate opioid withdrawal symptoms . Behavioral therapy addresses the psychosocial contributors to OUD and may augment retention in treatment . Understanding the most effective and promising models of care is critical for optimizing initiatives to expand access to MAT ( 1 ) . Although many providers offer OBOT without staff assistance , some practices designate a clinic staff member ( often a nurse or social worker ) to coordinate buprenorphine prescribing ( 14 - 16 ) . Also , nurse practitioners and physician assistants ( important providers of primary care in rural areas ) are currently not eligible to prescribe buprenorphine .
Background : Cocaine use is common in opioid-dependent HIV-infected patients , but its impact on treatment outcomes in these patients receiving buprenorphine/naloxone is not known . Methods : We conducted a prospective study in 299 patients receiving buprenorphine/naloxone who provided baseline cocaine data and a subset of 266 patients who remained in treatment for greater than or equal to one quarter . Assessments were conducted at baseline and quarterly for 1 year . We evaluated the association between baseline and in-treatment cocaine use on buprenorphine/naloxone retention , illicit opioid use , antiretroviral adherence , CD4 counts , HIV RNA , and risk behaviors . Results : Sixty-six percent ( 197 of 299 ) of patients reported baseline cocaine use and 65 % ( 173 of 266 ) of patients with follow-up data reported in-treatment cocaine use . Baseline and in-treatment cocaine use did not impact buprenorphine/naloxone retention , antiretroviral adherence , CD4 lymphocytes , or HIV risk behaviors . However , baseline cocaine use was associated with a 14.8 ( 95 % confidence interval [ CI ] , 9.0 - 24.2 ) times greater likelihood of subsequent cocaine use , a 1.4 ( 95 % CI , 1.02 - 2.00 ) times greater likelihood of subsequent opioid use , and higher log10 HIV RNA ( P < 0.016 ) over time . In-treatment cocaine use was associated with a 1.4 ( 95 % CI , 1.01 - 2.00 ) times greater likelihood of concurrent opioid use . Conclusions : Given that cocaine use negatively impacts opioid and HIV treatment outcomes , interventions to address cocaine use in HIV-infected patients receiving buprenorphine/naloxone treatment are warranted . This study was designed to compare the neonatal abstinence syndrome ( NAS ) in neonates of methadone- and buprenorphine-maintained pregnant opioid-dependent women and to provide preliminary safety and efficacy data for a larger multi-center trial .
This randomized , double-blind , double-dummy , flexible dosing , parallel-group controlled trial was conducted in a comprehensive drug-treatment facility that included residential and ambulatory care . Participants were opioid-dependent pregnant women and their neonates . Treatment involved daily administration of either sublingual buprenorphine or oral methadone using flexible dosing of 4 - 24 mg or 20 - 100 mg , respectively . Primary a priori outcome measures were : ( 1 ) number of neonates treated for NAS ; ( 2 ) amount of opioid agonist medication used to treat NAS ; ( 3 ) length of neonatal hospitalization ; and ( 4 ) peak NAS score . Two of 10 ( 20 % ) buprenorphine-exposed and 5 of 11 ( 45.5 % ) methadone-exposed neonates were treated for NAS ( p=.23 ) . Total amount of opioid-agonist medication administered to treat NAS in methadone-exposed neonates was three times greater than for buprenorphine-exposed neonates ( 93.1 versus 23.6 ; p=.13 ) . Length of hospitalization was shorter for buprenorphine-exposed than for methadone-exposed neonates ( p=.021 ) . Peak NAS total scores did not significantly differ between groups ( p=.25 ) . Results suggest that buprenorphine is not inferior to methadone on outcome measures assessing NAS and maternal and neonatal safety when administered starting in the second trimester of pregnancy . Background : Having opioid dependence and HIV infection is associated with poor HIV-related treatment outcomes . Methods : HIV-infected , opioid-dependent subjects ( N = 295 ) recruited from 10 clinical sites initiated buprenorphine/naloxone ( BUP/NX ) and were assessed at baseline and quarterly for 12 months . Primary outcomes included receiving antiretroviral therapy ( ART ) , HIV-1 RNA suppression , and mean changes in CD4 lymphocyte count . Analyses were stratified for the 119 subjects not on ART at baseline . Generalized estimating equations were deployed to examine time-dependent correlates for each outcome .
Results : At baseline , subjects on ART ( N = 176 ) were more likely than those not on ART ( N = 119 ) to be older , heterosexual , have lower alcohol addiction severity scores , and lower HIV-1 RNA levels ; they were less likely to be homeless and report sexual risk behaviors . Subjects initiating BUP/NX ( N = 295 ) were significantly more likely to initiate or remain on ART and improve CD4 counts over time compared with baseline ; however , these improvements were not significantly enhanced by longer retention on BUP/NX . Retention on BUP/NX for three or more quarters was , however , significantly associated with an increased likelihood of initiating ART ( β = 1.34 [ 1.18 , 1.53 ] ) and achieving viral suppression ( β = 1.25 [ 1.10 , 1.42 ] ) for the 64 of 119 ( 54 % ) subjects not on ART at baseline compared with the 55 subjects not retained on BUP/NX . In longitudinal analyses , being on ART was positively associated with increasing time of observation from baseline and higher mental health quality of life scores ( β = 1.25 [ 1.06 , 1.46 ] ) and negatively associated with being homo- or bisexual ( β = 0.55 [ 0.35 , 0.97 ] ) , homeless ( β = 0.58 [ 0.34 , 0.98 ] ) , and increasing levels of alcohol addiction severity ( β = 0.17 [ 0.03 , 0.88 ] ) . The strongest correlate of achieving viral suppression was being on ART ( β = 10.27 [ 5.79 , 18.23 ] ) . Female gender ( β = 1.91 [ 1.07 , 3.41 ] ) , Hispanic ethnicity ( β = 2.82 [ 1.44 , 5.49 ] ) , and increased general health quality of life ( β = 1.02 [ 1.00 , 1.04 ] ) were also independently correlated with viral suppression . Improvements in CD4 lymphocyte count were significantly associated with being on ART and increased over time . Conclusions : Initiating BUP/NX in HIV clinical care settings is feasible and correlated with initiation of ART and improved CD4 lymphocyte counts .
Longer retention on BUP/NX was not associated with improved prescription of ART , viral suppression , or CD4 lymphocyte counts for the overall sample , in which the majority was already prescribed ART at baseline . Among those retained on BUP/NX , HIV treatment outcomes did not worsen and were sustained . Increasing time on BUP/NX , however , was especially important for improving HIV treatment outcomes for those not on ART at baseline , the group at highest risk for clinical deterioration . Retaining subjects on BUP/NX is an important goal for sustaining HIV treatment outcomes for those on ART and improving them for those who are not . Comorbid substance use disorders ( especially alcohol ) , mental health problems , and quality-of-life indicators independently contributed to HIV treatment outcomes among HIV-infected persons with opioid dependence , suggesting the need for multidisciplinary treatment strategies for this population . Substance abuse is associated with poor medical and quality-of-life outcomes among HIV-infected individuals . Although drug treatment may reduce these negative consequences , for many patients , options are limited . Buprenorphine/naloxone , an opioid agonist treatment that can be prescribed in the United States in office-based settings , can be used to expand treatment capacity and integrate substance abuse services into HIV care . Recognizing this potential , the US Health Resources and Services Administration funded the development and implementation of demonstration projects that integrated HIV care and buprenorphine/naloxone treatment at 10 sites across the country . An Evaluation and Technical Assistance Center provided programmatic and clinical support as well as oversight for an evaluation that examined the processes for and outcomes of integrated care . The evaluation included patient-level self-report and chart abstractions as well as provider- and site-level data collected through surveys and in-depth interviews .
Although multisite demonstrations pose implementation and evaluation challenges , our experience demonstrates that these can , in part , be addressed through ongoing communication and technical assistance as well as a comprehensive evaluation design that incorporates multiple research methods and data sources . Although limitations to evaluation findings persist , they may be balanced by the scope and “ real-world ” context of the initiative . Background : Opioid dependence and HIV infection are associated with poor health-related quality of life ( HRQOL ) . Buprenorphine/naloxone ( bup/nx ) provided in HIV care settings may improve HRQOL . Methods : We surveyed 289 HIV-infected opioid-dependent persons treated with clinic-based bup/nx about HRQOL using the Short Form Health Survey ( SF-12 ) administered at baseline , 3 , 6 , 9 , and 12 months . We used normalized SF-12 scores , which correspond to a mean HRQOL of 50 for the general US population ( SD 10 , possible range 0 - 100 ) . We compared mean normalized mental and physical composite and component scores in quarters 1 , 2 , 3 , and 4 with baseline scores using generalized estimating equation models . We assessed the effect of clinic-based bup/nx prescription on HRQOL composite scores using mixed effects regression with site as a random effect and time as a repeated effect . Results : Baseline normalized SF-12 scores were lower than those of the general US population for all HRQOL domains . Average composite mental HRQOL improved from 38.3 ( SE 12.5 ) to 43.4 ( SE 13.2 ) [ β 1.13 ( 95 % CI : 0.72 to 1.54 ) ] and composite physical HRQOL remained unchanged [ β 0.21 ( 95 % CI : −0.16 to 0.57 ) ] over 12 months of follow-up . Continued bup/nx treatment across all 4 quarters was associated with improvements in both physical [ β 2.38 ( 95 % CI : 0.63 to 4.12 ) ] and mental [ β 2.51 ( 95 % CI : 0.42 to 4.60 ) ] HRQOL after adjusting for other contributors to HRQOL .
Conclusions : Clinic-based bup/nx maintenance therapy is potentially effective in ameliorating some of the adverse effects of opioid dependence on HRQOL for the HIV-infected population . IMPORTANCE Buprenorphine opioid agonist treatment ( OAT ) has established efficacy for treating opioid dependency among persons seeking addiction treatment . However , effectiveness for out-of-treatment , hospitalized patients is not known . OBJECTIVE To determine whether buprenorphine administration during medical hospitalization and linkage to office-based buprenorphine OAT after discharge increase entry into office-based OAT , increase sustained engagement in OAT , and decrease illicit opioid use at 6 months after hospitalization . DESIGN , SETTING , AND PARTICIPANTS From August 1 , 2009 , through October 31 , 2012 , a total of 663 hospitalized , opioid-dependent patients in a general medical hospital were identified . Of these , 369 did not meet eligibility criteria . A total of 145 eligible patients consented to participation in the randomized clinical trial . Of these , 139 completed the baseline interview and were assigned to the detoxification ( n = 67 ) or linkage ( n = 72 ) group . INTERVENTIONS Five-day buprenorphine detoxification protocol or buprenorphine induction , intrahospital dose stabilization , and postdischarge transition to maintenance buprenorphine OAT affiliated with the hospital 's primary care clinic ( linkage ) . MAIN OUTCOMES AND MEASURES Entry into and sustained engagement with buprenorphine OAT at 1 , 3 , and 6 months ( medical record verified ) and prior 30-day use of illicit opioids ( self-report ) . RESULTS During follow-up , linkage participants were more likely to enter buprenorphine OAT than those in the detoxification group ( 52 [ 72.2 % ] vs 8 [ 11.9 % ] , P < .001 ) . At 6 months , 12 linkage participants ( 16.7 % ) and 2 detoxification participants ( 3.0 % ) were receiving buprenorphine OAT ( P = .007 ) .
Compared with those in the detoxification group , participants randomized to the linkage group reported less illicit opioid use in the 30 days before the 6-month interview ( incidence rate ratio , 0.60 ; 95 % CI , 0.46 - 0.73 ; P < .01 ) in an intent-to-treat analysis . CONCLUSIONS AND RELEVANCE Compared with an inpatient detoxification protocol , initiation of and linkage to buprenorphine treatment is an effective means of engaging medically hospitalized patients who are not seeking addiction treatment and reduces illicit opioid use 6 months after hospitalization . However , maintaining engagement in treatment remains a challenge . TRIAL REGISTRATION ClinicalTrials.gov Identifier : NCT00987961 . BACKGROUND Untreated opioid dependence adversely affects the care of human immunodeficiency virus (HIV)-positive patients . Buprenorphine , a partial opioid agonist , is available for maintenance treatment of opioid dependence in HIV specialty settings . We investigated the feasibility and efficacy of integrating buprenorphine , along with 2 levels of counseling , into HIV clinical care . METHODS HIV-positive , opioid-dependent patients were enrolled in a 12-week pilot study and randomized to receive daily buprenorphine/naloxone treatment along with either brief physician management or physician management combined with nurse-administered drug counseling and adherence management . Primary outcomes included treatment retention ; illicit drug use , assessed by urine toxicology test and self-report ; CD4 lymphocyte counts ; and log(10 ) HIV type 1 ( HIV-1 ) RNA levels . RESULTS Of the 16 patients who received at least 1 dose of buprenorphine , 13 ( 81 % ) completed 12 weeks of treatment . The proportion of opioid-positive weekly urine test results decreased from 100 % at baseline to 32 % ( month 1 ) , 20 % ( month 2 ) , and 16 % ( month 3 ) . Only 4 patients reported any opioid use ( in the prior 7 days ) during the 12-week study .
CD4 lymphocyte counts remained stable over the course of the study . The mean log(10 ) HIV-1 RNA level ( +/- standard deviation ) declined significantly , from 3.66+/-1.06 log(10 ) HIV-1 RNA copies/mL at baseline to 3.0+/-0.57 log(10 ) HIV-1 RNA copies/mL at month 3 ( P<.05 ) . No significant differences based on counseling intervention were detected . All 13 patients who completed the study continued to receive treatment in an extension phase of at least 0 - 15 months ' duration . CONCLUSIONS We conclude that it is feasible to integrate buprenorphine into HIV clinical care for the treatment of opioid dependence . Patients experienced good treatment retention and reductions in their opioid use . HIV biological markers remained stable or improved during buprenorphine/naloxone treatment . BACKGROUND Opioid dependence is associated with low rates of treatment-seeking , poor adherence to treatment , frequent relapse , and major societal consequences . We aimed to assess the efficacy , safety , and patient-reported outcomes of an injectable , once-monthly extended-release formulation of the opioid antagonist naltrexone ( XR-NTX ) for treatment of patients with opioid dependence after detoxification . METHODS We did a double-blind , placebo-controlled , randomised , 24-week trial of patients with opioid dependence disorder . Patients aged 18 years or over who had 30 days or less of inpatient detoxification and 7 days or more off all opioids were enrolled at 13 clinical sites in Russia . We randomly assigned patients ( 1:1 ) to either 380 mg XR-NTX or placebo by an interactive voice response system , stratified by site and gender in a centralised , permuted-block method . Participants also received 12 biweekly counselling sessions . Participants , investigators , staff , and the sponsor were masked to treatment allocation .
The primary endpoint was the response profile for confirmed abstinence during weeks 5–24 , assessed by urine drug tests and self-report of non-use . Secondary endpoints were self-reported opioid-free days , opioid craving scores , number of days of retention , and relapse to physiological opioid dependence . Analyses were by intention to treat . This trial is registered at ClinicalTrials.gov , NCT00678418 . FINDINGS Between July 3 , 2008 , and Oct 5 , 2009 , 250 patients were randomly assigned to XR-NTX ( n=126 ) or placebo ( n=124 ) . The median proportion of weeks of confirmed abstinence was 90·0 % ( 95 % CI 69·9–92·4 ) in the XR-NTX group compared with 35·0 % ( 11·4–63·8 ) in the placebo group ( p=0·0002 ) . Patients in the XR-NTX group self-reported a median of 99·2 % ( range 89·1–99·4 ) opioid-free days compared with 60·4 % ( 46·2–94·0 ) for the placebo group ( p=0·0004 ) . The mean change in craving was –10·1 ( 95 % CI –12·3 to –7·8 ) in the XR-NTX group compared with 0·7 ( –3·1 to 4·4 ) in the placebo group ( p<0·0001 ) . Median retention was over 168 days in the XR-NTX group compared with 96 days ( 95 % CI 63–165 ) in the placebo group ( p=0·0042 ) . Naloxone challenge confirmed relapse to physiological opioid dependence in 17 patients in the placebo group compared with one in the XR-NTX group ( p<0·0001 ) . XR-NTX was well tolerated . Two patients in each group discontinued owing to adverse events . No XR-NTX-treated patients died , overdosed , or discontinued owing to severe adverse events . INTERPRETATION XR-NTX represents a new treatment option that is distinct from opioid agonist maintenance treatment . XR-NTX in conjunction with psychosocial treatment might improve acceptance of opioid dependence pharmacotherapy and provide a useful treatment option for many patients . FUNDING Alkermes . CONTEXT Oral naltrexone can completely antagonize the effects produced by opioid agonists .
However , poor compliance with naltrexone has been a major obstacle to the effective treatment of opioid dependence . OBJECTIVE To evaluate the safety and efficacy of a sustained-release depot formulation of naltrexone in treating opioid dependence . DESIGN AND SETTING Randomized , double-blind , placebo-controlled , 8-week trial conducted at 2 medical centers . PARTICIPANTS Sixty heroin-dependent adults . INTERVENTIONS Participants were stratified by sex and years of heroin use ( > or = 5 vs < 5 ) and then were randomized to receive placebo or 192 or 384 mg of depot naltrexone . Doses were administered at the beginning of weeks 1 and 5 . All participants received twice-weekly relapse prevention therapy , provided observed urine samples , and completed other assessments at each visit . MAIN OUTCOME MEASURES Retention in treatment and percentage of opioid-negative urine samples . RESULTS Retention in treatment was dose related , with 39 % , 60 % , and 68 % of patients in the placebo , 192 mg of naltrexone , and 384 mg of naltrexone groups , respectively , remaining in treatment at the end of 2 months . Time to dropout had a significant main effect of dose , with mean time to dropout of 27 , 36 , and 48 days for the placebo , 192 mg of naltrexone , and 384 mg of naltrexone groups , respectively . The percentage of urine samples negative for opioids , methadone , cocaine , benzodiazepines , and amphetamine varied significantly as a function of dose . When the data were recalculated without the assumption that missing urine samples were positive , a main effect of group was not found for any drugs tested except cocaine , where the percentage of cocaine-negative urine samples was lower in the placebo group . Adverse events were minimal and generally mild . This formulation of naltrexone was well tolerated and produced a robust , dose-related increase in treatment retention .
CONCLUSION These data provide new evidence of the feasibility , efficacy , and tolerability of long-lasting antagonist treatments for opioid dependence . DESCRIPTION After HIV diagnosis , timely entry into HIV medical care and retention in that care are essential to the provision of effective antiretroviral therapy ( ART ) . Adherence to ART is among the key determinants of successful HIV treatment outcome and is essential to minimize the emergence of drug resistance . The International Association of Physicians in AIDS Care convened a panel to develop evidence-based recommendations to optimize entry into and retention in care and ART adherence for people with HIV . METHODS A systematic literature search was conducted to produce an evidence base restricted to randomized , controlled trials and observational studies with comparators that had at least 1 measured biological or behavioral end point . A total of 325 studies met the criteria . Two reviewers independently extracted and coded data from each study using a standardized data extraction form . Panel members drafted recommendations based on the body of evidence for each method or intervention and then graded the overall quality of the body of evidence and the strength of each recommendation . RECOMMENDATIONS Recommendations are provided for monitoring entry into and retention in care , interventions to improve entry and retention , and monitoring of and interventions to improve ART adherence . Recommendations cover ART strategies , adherence tools , education and counseling , and health system and service delivery interventions . In addition , they cover specific issues pertaining to pregnant women , incarcerated individuals , homeless and marginally housed individuals , and children and adolescents , as well as substance use and mental health disorders . Recommendations for future research in all areas are also provided . BACKGROUND Opioid dependence is common in HIV clinics .
Buprenorphine-naloxone ( BUP ) is an effective treatment of opioid dependence that may be used in routine medical settings . OBJECTIVE To compare clinic-based treatment with BUP ( clinic-based BUP ) with case management and referral to an opioid treatment program ( referred treatment ) . DESIGN Single-center , 12-month randomized trial . Participants and investigators were aware of treatment assignments . ( ClinicalTrials.gov registration number : NCT00130819 ) SETTING HIV clinic in Baltimore , Maryland . PATIENTS 93 HIV-infected , opioid-dependent participants who were not receiving opioid agonist therapy and were not dependent on alcohol or benzodiazepines . INTERVENTION Clinic-based BUP included BUP induction and dose titration , urine drug testing , and individual counseling . Referred treatment included case management and referral to an opioid-treatment program . MEASUREMENTS Initiation and long-term receipt of opioid agonist therapy , urine drug test results , visit attendance with primary HIV care providers , use of antiretroviral therapy , and changes in HIV RNA levels and CD4 cell counts . RESULTS The average estimated participation in opioid agonist therapy was 74 % ( 95 % CI , 61 % to 84 % ) for clinic-based BUP and 41 % ( CI , 29 % to 53 % ) for referred treatment ( P < 0.001 ) . Positive test results for opioids and cocaine were significantly less frequent in clinic-based BUP than in referred treatment , and study participants receiving clinic-based BUP attended significantly more HIV primary care visits than those receiving referred treatment . Use of antiretroviral therapy and changes in HIV RNA levels and CD4 cell counts did not differ between the 2 groups . LIMITATION This was a small single-center study , follow-up was only moderate , and the study groups were unbalanced in terms of recent drug injections at baseline .
CONCLUSION Management of HIV-infected , opioid-dependent patients with a clinic-based BUP strategy facilitates access to opioid agonist therapy and improves outcomes of substance abuse treatment . PRIMARY FUNDING SOURCE Health Resources and Services Administration Special Projects of National Significance program . BACKGROUND The optimal level of counseling and frequency of attendance for medication distribution has not been established for the primary care , office-based buprenorphine-naloxone treatment of opioid dependence . METHODS We conducted a 24-week randomized , controlled clinical trial with 166 patients assigned to one of three treatments : standard medical management and either once-weekly or thrice-weekly medication dispensing or enhanced medical management and thrice-weekly medication dispensing . Standard medical management was brief , manual-guided , medically focused counseling ; enhanced management was similar , but each session was extended . The primary outcomes were the self-reported frequency of illicit opioid use , the percentage of opioid-negative urine specimens , and the maximum number of consecutive weeks of abstinence from illicit opioids . RESULTS The three treatments had similar efficacies with respect to the mean percentage of opioid-negative urine specimens ( standard medical management and once-weekly medication dispensing , 44 percent ; standard medical management and thrice-weekly medication dispensing , 40 percent ; and enhanced medical management and thrice-weekly medication dispensing , 40 percent ; P=0.82 ) and the maximum number of consecutive weeks during which patients were abstinent from illicit opioids . All three treatments were associated with significant reductions from baseline in the frequency of illicit opioid use , but there were no significant differences among the treatments .
The proportion of patients remaining in the study at 24 weeks did not differ significantly among the patients receiving standard medical management and once-weekly medication dispensing (48 percent) or thrice-weekly medication dispensing (43 percent) or enhanced medical management and thrice-weekly medication dispensing (39 percent) (P=0.64). Adherence to buprenorphine-naloxone treatment varied; increased adherence was associated with improved treatment outcomes. CONCLUSIONS Among patients receiving buprenorphine-naloxone in primary care for opioid dependence, the efficacy of brief weekly counseling and once-weekly medication dispensing did not differ significantly from that of extended weekly counseling and thrice-weekly dispensing. Strategies to improve buprenorphine-naloxone adherence are needed. (ClinicalTrials.gov number, NCT00023283 [ClinicalTrials.gov].) Background: Buprenorphine/naloxone allows the integration of opioid dependence and HIV treatment. Methods: We conducted a prospective study in HIV-infected opioid-dependent patients to investigate the impact of buprenorphine/naloxone treatment on drug use. Self-report and chart review assessments were conducted every 3 months (quarters 1-4) for 1 year. Outcomes were buprenorphine/naloxone treatment retention, drug use, and addiction treatment processes. Results: Among 303 patients enrolled between July 2005 and December 2007, retention in buprenorphine/naloxone treatment was 74%, 67%, 59%, and 49% during Quarters 1, 2, 3, and 4, respectively. Past 30-day illicit opioid use decreased from 84% of patients at baseline to 42% in retained patients over the year. Patients were 52% less likely to use illicit opioids for each quarter in treatment (odds ratio = 0.66; 95% CI: 0.61 to 0.72). Buprenorphine/naloxone doses and office visits approximated guidelines published by the United States Department of Health and Human Services.
Urine toxicology monitoring was less frequent than recommended. Conclusions: Buprenorphine/naloxone provided in HIV treatment settings can decrease opioid use. Strategies are needed to improve retention and address ongoing drug use in this treatment population. IMPORTANCE Opioid-dependent patients often use the emergency department (ED) for medical care. OBJECTIVE To test the efficacy of 3 interventions for opioid dependence: (1) screening and referral to treatment (referral); (2) screening, brief intervention, and facilitated referral to community-based treatment services (brief intervention); and (3) screening, brief intervention, ED-initiated treatment with buprenorphine/naloxone, and referral to primary care for 10-week follow-up (buprenorphine). DESIGN, SETTING, AND PARTICIPANTS A randomized clinical trial involving 329 opioid-dependent patients who were treated at an urban teaching hospital ED from April 7, 2009, through June 25, 2013. INTERVENTIONS After screening, 104 patients were randomized to the referral group, 111 to the brief intervention group, and 114 to the buprenorphine treatment group. MAIN OUTCOMES AND MEASURES Enrollment in and receiving addiction treatment 30 days after randomization was the primary outcome. Self-reported days of illicit opioid use, urine testing for illicit opioids, human immunodeficiency virus (HIV) risk, and use of addiction treatment services were the secondary outcomes. RESULTS Seventy-eight percent of patients in the buprenorphine group (89 of 114 [95% CI, 70%-85%]) vs 37% in the referral group (38 of 102 [95% CI, 28%-47%]) and 45% in the brief intervention group (50 of 111 [95% CI, 36%-54%]) were engaged in addiction treatment on the 30th day after randomization (P < .001).
The buprenorphine group reduced the number of days of illicit opioid use per week from 5.4 days (95% CI, 5.1-5.7) to 0.9 days (95% CI, 0.5-1.3) vs a reduction from 5.4 days (95% CI, 5.1-5.7) to 2.3 days (95% CI, 1.7-3.0) in the referral group and from 5.6 days (95% CI, 5.3-5.9) to 2.4 days (95% CI, 1.8-3.0) in the brief intervention group (P < .001 for both time and intervention effects; P = .02 for the interaction effect). The rates of urine samples that tested negative for opioids did not differ statistically across groups, with 53.8% (95% CI, 42%-65%) in the referral group, 42.9% (95% CI, 31%-55%) in the brief intervention group, and 57.6% (95% CI, 47%-68%) in the buprenorphine group (P = .17). There were no statistically significant differences in HIV risk across groups (P = .66). Eleven percent of patients in the buprenorphine group (95% CI, 6%-19%) used inpatient addiction treatment services, whereas 37% in the referral group (95% CI, 27%-48%) and 35% in the brief intervention group (95% CI, 25%-37%) used inpatient addiction treatment services (P < .001). CONCLUSIONS AND RELEVANCE Among opioid-dependent patients, ED-initiated buprenorphine treatment vs brief intervention and referral significantly increased engagement in addiction treatment, reduced self-reported illicit opioid use, and decreased use of inpatient addiction treatment services but did not significantly decrease the rates of urine samples that tested positive for opioids or of HIV risk. These findings require replication in other centers before widespread adoption. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT00913770 Background: The safety of buprenorphine/naloxone (bup/nx) in HIV-infected patients has not been established. Prior reports raise concern about hepatotoxicity and interactions with atazanavir.
Methods: We conducted a prospective cohort study of 303 opioid-dependent HIV-infected patients initiating bup/nx treatment. We assessed changes in aspartate aminotransferase (AST) and alanine aminotransferase (ALT) over time. We compared bup/nx doses in patients receiving the antiretroviral atazanavir to those not receiving atazanavir. We conducted surveillance for pharmacodynamic interactions. Results: Median AST [37.0 vs. 37.0 units/liter (U/L), respective interquartile ranges (IQRs) 26-53 and 26-59] and ALT (33.0 vs. 33.0 U/L, respective IQRs 19-50 and 18-50) values did not change over time among 141 patients comparing pre-bup/nx exposure with post-bup/nx exposure measures. During bup/nx exposure, 207 subjects demonstrated no significant change in median AST (36.0 vs. 35.0 U/L, respective IQRs 25-57 and 25-61) and ALT (29.0 vs. 31.0 U/L, respective IQRs 19-50 and 18-50) values collected a median of 6 months apart. Analyses restricted to patients with hepatitis C and HIV co-infection yielded similar results, except a small but significant decrease in first-to-last AST during treatment with bup/nx (P = 0.048). Mean bup/nx dose, ranging 16.0-17.8 mg, did not differ over time or with co-administration of atazanavir. No pharmacodynamic interactions were noted. Conclusions: Buprenorphine/naloxone did not produce measurable hepatic toxicity or pharmacodynamic interaction with atazanavir in HIV-infected opioid-dependent patients.
2,020
29,547,226
We did not find evidence that rTMS improved disability. We found no evidence relating to the effectiveness of CES on disability. We found evidence of small study bias in the tDCS analyses. We did not find evidence that tDCS improved disability. We did not find evidence that low-frequency rTMS, rTMS applied to the dorsolateral prefrontal cortex, and CES are effective for reducing pain intensity in chronic pain.
BACKGROUND This is an updated version of the original Cochrane Review published in 2010, Issue 9, and last updated in 2014, Issue 4. Non-invasive brain stimulation techniques aim to induce an electrical stimulation of the brain in an attempt to reduce chronic pain by directly altering brain activity. They include repetitive transcranial magnetic stimulation (rTMS), cranial electrotherapy stimulation (CES), transcranial direct current stimulation (tDCS), transcranial random noise stimulation (tRNS) and reduced impedance non-invasive cortical electrostimulation (RINCE). OBJECTIVES To evaluate the efficacy of non-invasive cortical stimulation techniques in the treatment of chronic pain.
BACKGROUND Carefully designed controlled studies are essential in further evaluating the therapeutic efficacy of transcranial magnetic stimulation (TMS) in psychiatric disorders. A major methodological concern is the design of the "sham" control for TMS. An ideal sham would produce negligible cortical stimulation in conjunction with a scalp sensation akin to real treatment. Strategies employed so far include alterations in the position of the stimulating coil, but there has been little systematic study of their validity. In this study, we investigated the effects of different coil positions on cortical activation and scalp sensation. METHODS In nine normal subjects, single TMS pulses were administered at a range of intensities with a "figure eight" coil held in various positions over the left primary motor cortex. Responses were measured as motor-evoked potentials in the right first dorsal interosseus muscle. Scalp sensation to TMS with the coil in various positions over the prefrontal area was also assessed. RESULTS None of the coil positions studied met the criteria for an ideal sham. Arrangements associated with a higher likelihood of scalp sensation were also more likely to stimulate the cortex. CONCLUSIONS The choice of a sham for TMS involves a trade-off between effective blinding and truly inactive "stimulation." Further research is needed to develop the best sham condition for a range of applications. Although electrical stimulation of the precentral gyrus (MCS) is emerging as a promising technique for pain control, its mechanisms of action remain obscure, and its application largely empirical. Using positron emission tomography (PET) we studied regional changes in cerebral blood flow (rCBF) in 10 patients undergoing motor cortex stimulation for pain control, seven of whom also underwent somatosensory evoked potential and nociceptive spinal reflex recordings.
The most significant MCS-related increase in rCBF concerned the ventral-lateral thalamus, probably reflecting cortico-thalamic connections from motor areas. CBF increases were also observed in medial thalamus, anterior cingulate/orbitofrontal cortex, anterior insula and upper brainstem; conversely, no significant CBF changes appeared in motor areas beneath the stimulating electrode. Somatosensory evoked potentials from SI remained stable during MCS, and no rCBF changes were observed in somatosensory cortex during the procedure. Our results suggest that descending axons, rather than apical dendrites, are primarily activated by MCS, and highlight the thalamus as the key structure mediating functional MCS effects. A model of MCS action is proposed, whereby activation of thalamic nuclei directly connected with motor and premotor cortices would entail a cascade of synaptic events in pain-related structures receiving afferents from these nuclei, including the medial thalamus, anterior cingulate and upper brainstem. MCS could influence the affective-emotional component of chronic pain by way of cingulate/orbitofrontal activation, and lead to descending inhibition of pain impulses by activation of the brainstem, also suggested by attenuation of spinal flexion reflexes. In contrast, the hypothesis of somatosensory cortex activation by MCS could not be confirmed by our results. OBJECTIVE There is growing interest in neuropsychiatry in repetitive transcranial magnetic stimulation (rTMS) as a neuromodulatory treatment. However, there are limitations in interpreting rTMS effects as a real consequence of physiological brain changes or as placebo-mediated unspecific effects, which may be particularly strong in psychiatric patients. This is due to the fact that existing sham rTMS procedures are less than optimal.
A new placebo tool is introduced here, called the real electro-magnetic placebo (REMP) device, which can simulate the scalp sensation induced by real TMS while leaving both the visual impact and acoustic sensation of real TMS unaltered. METHODS Physical, neurophysiological and behavioural variables of monophasic and biphasic single-pulse TMS and biphasic 1 Hz and 20 Hz rTMS procedures (at different intensities) were tested in subjects who were expert or naïve of TMS. Results of the real TMS were compared with those induced by the REMP device and with two other currently used sham procedures, namely the commercially available Magstim sham coil and tilting the real coil by 90 degrees. RESULTS The REMP device, besides producing scalp sensations similar to real TMS, attenuated the TMS-induced electric field (as measured by a dipole probe) to a biologically inactive level. Behaviourally, neither expert nor naïve TMS subjects identified the "coil at 90 degrees" or the "Magstim sham coil" as a real TMS intervention, whilst naïve subjects were significantly more likely to identify the REMP-attenuated TMS as real. CONCLUSIONS The "goodness of sham" of the REMP device is demonstrated by physical, neurophysiological, and behavioural results. SIGNIFICANCE Such a placebo TMS is superior to the available sham procedures when applied to subjects naïve to TMS, as in the case of patients undergoing a clinical rTMS trial. Background Many double-blind clinical trials of transcranial direct current stimulation (tDCS) use stimulus intensities of 2 mA despite the fact that blinding has not been formally validated under these conditions. The aim of this study was to test the assumption that sham 2 mA tDCS achieves effective blinding. Methods A randomised double-blind crossover trial. 100 tDCS-naïve healthy volunteers were incorrectly advised that they were taking part in a trial of tDCS on word memory.
Participants attended for two separate sessions. In each session, they completed a word memory task, then received active or sham tDCS (order randomised) at 2 mA stimulation intensity for 20 minutes, and then repeated the word memory task. They then judged whether they believed they had received active stimulation and rated their confidence in that judgement. The blinded assessor noted when red marks were observed at the electrode sites post-stimulation. Results tDCS at 2 mA was not effectively blinded. That is, participants correctly judged the stimulation condition more often than would be expected by chance at both the first session (kappa level of agreement (κ) 0.28, 95% confidence interval (CI) 0.09 to 0.47, p = 0.005) and the second session (κ = 0.77, 95% CI 0.64 to 0.90, p < 0.001), indicating inadequate participant blinding. Redness at the reference electrode site was noticeable following active stimulation more than sham stimulation (session one, κ = 0.512, 95% CI 0.363 to 0.66, p < 0.001; session two, κ = 0.677, 95% CI 0.534 to 0.82), indicating inadequate assessor blinding. Conclusions Our results suggest that blinding in studies using tDCS at intensities of 2 mA is inadequate. Positive results from such studies should be interpreted with caution. OBJECTIVE Brain polarization in the form of transcranial direct current stimulation (tDCS), which influences motor function and learning processes, has been proposed as an adjuvant strategy to enhance training effects in neurorehabilitation. Proper testing in neurorehabilitation requires double-blind sham-controlled study designs. Here, we evaluated the effects of tDCS and sham stimulation (SHAM) on healthy subjects' and stroke patients' self-report measures of attention, fatigue, duration of elicited sensations and discomfort. METHODS tDCS or SHAM was in all cases applied over the motor cortex.
Attention, fatigue, and discomfort were self-rated by study participants using visual analog scales. Duration of perceived sensations and the ability to distinguish tDCS from SHAM sessions were determined. Investigators questioning the patients were blind to the intervention type. RESULTS tDCS and SHAM elicited comparably minimal discomfort and duration of sensations in the absence of differences in attention or fatigue, and tDCS could not be distinguished from SHAM by study participants nor investigators. CONCLUSIONS Successful blinding of subjects and investigators and ease of application simultaneously with training protocols supports the feasibility of using tDCS in double-blind, sham-controlled randomized trials in clinical neurorehabilitation. SIGNIFICANCE tDCS could evolve into a useful tool, in addition to TMS, to modulate cortical activity in neurorehabilitation.
2,021
28,986,715
Conclusion Induction therapy with basiliximab has similar short-term effects on recipients in renal transplantation compared with that of ATG. However, regarding the long-term effect, as represented by the rate of neoplasm, basiliximab has a significant advantage.
Objective The aim of this meta-analysis was to evaluate the efficacy of basiliximab versus antithymocyte globulin for induction therapy in renal allograft.
Objective: The aim of this study was to determine the safety and efficacy of induction with rabbit antithymocyte globulin (RATG) compared with interleukin-2 receptor antagonists in a racially diverse kidney transplant patient population under modern immunosuppression. Background: The optimal induction therapy in patients at risk for rejection, particularly black recipients, in the modern era of immunosuppression with flow cytometry-based cross-matching is unclear. Methods: This was a prospective, risk-stratified, randomized, single-center, open-label study of 200 consecutively enrolled patients in a large academic teaching center. Patients were randomized to receive either daclizumab or basiliximab versus RATG for induction in combination with tacrolimus, mycophenolate mofetil, and corticosteroids. Patients were stratified between groups to ensure equal numbers of black recipients, retransplants, high panel reactive antibodies (PRAs) (> 20%), and prolonged cold ischemic times (> 24 hours) in each group. The primary outcome measure was treatment efficacy, defined as the incidence of biopsy-proven acute rejection and estimated creatinine clearance. Patients were followed up for 12 months. Renal transplant recipients were included if they were adult (≥18 years old) and received an allograft from a deceased, living unrelated, or non-human-leukocyte-antigen-identical living-related donor. Results: A total of 200 patients (n = 98 in the interleukin-2 receptor antagonist group, and n = 102 in the RATG group) were enrolled from February 2009 through July 2011. One-year acute rejection rates were low and similar between groups (10% in the interleukin-2 receptor antagonist group vs 6% in the RATG group; P = 0.30). Creatinine clearance was also similar between groups (interleukin-2 receptor antagonist group 56 ± 20 mL/min per 1.73 m2 vs RATG group 55 ± 22 mL/min per 1.73 m2; P = 0.73).
Subanalysis of recipient race revealed that in blacks only RATG was protective against 6- and 12-month acute rejection, without an increased risk of infection. Induction did not affect rejection rates according to recipient calculated PRAs; however, RATG was associated with an increased risk of BK virus in low-PRA patients. Conclusions and Relevance: RATG induction provides improved protection against early acute rejection in black renal transplant recipients, whereas sensitized patients do not seem to demonstrate a similar benefit from this therapy. This study is registered at ClinicalTrials.gov (NCT00859131). We conducted a single-center prospective double-arm open-labeled study on kidney transplant patients from 2010 to 2011 to evaluate the efficacy of induction therapy using low, single-dose rabbit antithymocyte globulin (r-ATG), 1.5 mg/kg on Day 0 (n = 80, 60 males, mean age 35.9 years), versus basiliximab (interleukin-2 blocker), 20 mg on Days 0 and 4 (n = 20, 12 males, mean age 45.1 years), on renal allograft function in terms of serum creatinine (SCr), rejection and infection episodes, and patient/graft survival and cost. Demographic and post-transplant follow-up including immunosuppression was similar in both groups. In the r-ATG group, donors were unrelated (spouse, n = 25), deceased (n = 31) and parents/siblings (others), with a mean HLA match of 1.58. Donors in the basiliximab group were living unrelated (spouse, n = 15) and deceased (n = 5), with a mean HLA match of 1.56. No patient/graft was lost in the r-ATG group over a mean of one year of follow-up, and the mean SCr was 1.28 mg/dL with 7.5% acute rejection (AR) episodes; infections were also not observed. In the basiliximab group, over the same period of follow-up, there was 95% death-censored graft survival, and the mean SCr was 1.23 mg/dL with 10% AR episodes.
One patient died due to bacterial pneumonia and one succumbed to coronary artery disease; one graft was lost due to uncontrolled acute humoral and cellular rejection. The costs of r-ATG and basiliximab were $600 and $2500, respectively. We conclude that induction immunosuppressive therapy with a low-dose r-ATG may be a better option as compared with basiliximab in terms of graft function, survival and cost benefit in kidney transplant patients. BACKGROUND Induction therapy reduces the frequency of acute rejection and delayed graft function after transplantation. A rabbit antithymocyte polyclonal antibody or basiliximab, an interleukin-2 receptor monoclonal antibody, is most commonly used for induction. METHODS In this prospective, randomized, international study, we compared short courses of antithymocyte globulin and basiliximab in patients at high risk for acute rejection or delayed graft function who received a renal transplant from a deceased donor. Patients taking cyclosporine, mycophenolate mofetil, and prednisone were randomly assigned to receive either rabbit antithymocyte globulin (1.5 mg per kilogram of body weight daily, 141 patients) during transplantation (day 0) and on days 1 through 4, or basiliximab (20 mg, 137 patients) on days 0 and 4. The primary end point was a composite of acute rejection, delayed graft function, graft loss, and death. RESULTS At 12 months, the incidence of the composite end point was similar in the two groups (P=0.34). The antithymocyte globulin group, as compared with the basiliximab group, had lower incidences of acute rejection (15.6% vs. 25.5%, P=0.02) and of acute rejection that required treatment with antibody (1.4% vs. 8.0%, P=0.005). The antithymocyte globulin group and the basiliximab group had similar incidences of graft loss (9.2% and 10.2%, respectively), delayed graft function (40.4% and 44.5%), and death (4.3% and 4.4%).
Though the incidences of all adverse events, serious adverse events, and cancers were also similar between the two groups, patients receiving antithymocyte globulin had a greater incidence of infection (85.8% vs. 75.2%, P=0.03) but a lower incidence of cytomegalovirus disease (7.8% vs. 17.5%, P=0.02). CONCLUSIONS Among patients at high risk for acute rejection or delayed graft function who received a renal transplant from a deceased donor, induction therapy consisting of a 5-day course of antithymocyte globulin, as compared with basiliximab, reduced the incidence and severity of acute rejection but not the incidence of delayed graft function. Patient and graft survival were similar in the two groups. (ClinicalTrials.gov number, NCT00235300 [ClinicalTrials.gov].) Background/Aims: The long-term evaluation of basiliximab induction therapy has not been addressed yet. We aim to evaluate its long-term effects in living related donor kidney transplantation. Methods: 100 adult recipients with their first kidney allograft were randomized into two treatment groups – one group received basiliximab and the second served as a control. All patients received a maintenance triple immunosuppressive therapy (steroids, cyclosporine microemulsion and azathioprine) and were closely followed for 5 years. Results: Basiliximab significantly reduced the proportion of patients who experienced an acute rejection in the first year (18/50) when compared to the control group (31/50) and in 5 years (27/50) when compared to the controls (36/50). The cumulative steroid dose used throughout the study was significantly lower in the basiliximab group. The overall incidence of post-transplant complications was comparable between the two treatment groups. There was no significant difference in patient and graft survival; 5-year patient and graft survival were 100 and 86% for basiliximab, and 96 and 88% for the control group, respectively.
Conclusion: Although routine basiliximab induction significantly reduces the incidence of acute rejection, its beneficial long-term effects on graft function and patient and graft survival are not yet evident. Background. Sequential anti-thymocyte globulin (ATG)/cyclosporine immunosuppression has two main advantages: delayed introduction of the nephrotoxic drug cyclosporine and prevention of acute rejection. Basiliximab, a recently developed chimeric monoclonal antibody that selectively depletes the minor subpopulation of activated T lymphocytes, has been shown to reduce the incidence of acute rejection when used with cyclosporine introduced on day 1. Methods. This open, randomized, multicenter study was undertaken to compare the safety and efficacy of ATG versus basiliximab induction therapy (IT) with delayed introduction of cyclosporine microemulsion (Neoral) in 105 low immunologic risk renal-transplant patients receiving mycophenolate mofetil and steroids. Results. One-year patient and graft survival rates were 98.1% and 94.2%, respectively, in the basiliximab group (n=52), and 98.1% and 96.2% in the ATG group (n=53). The incidence of biopsy-confirmed acute rejection was comparable (basiliximab 9.6%, ATG 9.4%), as were key parameters of renal function, notably serum creatinine levels, time-to-nadir serum creatinine, and the number of patients requiring posttransplantation dialysis (basiliximab 28.8%, ATG 30.2%). However, significantly fewer patients in the basiliximab group experienced cytomegalovirus (CMV) infection, leukopenia, and thrombocytopenia, and this without any significant difference in any other key safety parameters (including the incidences of serum sickness, fever, lymphoma, and infections in general). Conclusions.
Both ATG and basiliximab, when used for IT in a sequential protocol, are equally effective in terms of graft and patient survival as well as at preventing acute rejection. However, basiliximab is associated with a lower incidence of certain key adverse events, namely CMV infection, leukopenia, and thrombocytopenia. Background. The aim of this prospective randomized study was to examine the effect of induction immunosuppression and low initial cyclosporine (CsA) on the onset of graft function and its long-term consequences. Methods. During 1999–2001, 155 patients were randomized to a single 9 mg/kg dose of antithymocyte globulin (ATG)-Fresenius (group A) or two 20-mg doses of basiliximab (group B) with reduced-dose CsA, or conventional CsA triple therapy without induction (group C). Results. Delayed graft function (DGF) was lower in group A than in groups B or C (5.7% vs. 24.1% and 15.9%, P<0.025), and the need for dialysis was less in groups A and B compared to C (10.3 and 10.4 vs. 20.0 days, P<0.05). Acute rejections occurred in 11.3%, 12.1% and 20.5%, and the mean (median) time to rejection was 16 (13), 97 (46) and 101 (35) days in groups A, B, and C, respectively (P<0.005). One- and 5-year graft survivals (GS) were 98.1% and 90.6% (group A), 96.6% and 96.6% (group B), and 93.2% and 84.1% (group C). Five-year GS was significantly better in group B than in group C (P<0.05). The death-censored 5-year GS in groups A, B, and C was 94.3%, 96.6%, and 90.0% (P = NS). Single high-dose ATG induction was associated with hemodynamic and pulmonary disturbances without, however, serious or long-term consequences. Conclusions. ATG induction significantly reduced DGF. Both induction regimens together with low initial CsA led to significantly less posttransplant dialysis and excellent survival.
The high-dose ATG was associated with significant hemodynamic and pulmonary side effects during drug infusion. Background. The purpose of this study was to determine the impact of antilymphocyte globulin (ALG) induction on long-term outcomes after renal transplantation. Methods. Between January 1985 and January 1986, 123 consecutive renal transplants from deceased donors were performed at a single institution. Patients were randomized into two groups: group 1 (n=63, 40±10 years) received cyclosporine (CsA), prednisone, and azathioprine; and group 2 (n=60, 36±9 years) received ALG induction, CsA, and prednisone, with delayed initiation (45–90 days posttransplantation) of azathioprine if the CsA dose was less than 4 mg/kg per day. Target CsA trough levels were 150 to 250 ng/mL. Cytomegalovirus prophylaxis was not used. Human leukocyte antigen matching (2.4±1.1 vs. 2.6±1.2) and cold ischemia time (38±8 hr vs. 39±9 hr) did not differ. Results. The incidence of acute rejection was lower in group 2 (28% vs. 75%, P<0.0001). The incidence of cytomegalovirus infection was 10% in group 1 and 18% in group 2 (P=0.41). The incidence of cancer was 22.2% in group 1 and 11.7% in group 2 (P=0.53), and the incidence of lymphoma did not differ (3% vs. 5%, P=0.77). Patient and graft survival in groups 1 and 2 at 1, 10, and 20 years were 100%/79% vs. 100%/93%, 83%/56% vs. 88%/51%, and 64%/43% vs. 54%/47%, respectively (log-rank test, P=0.18 and P=0.078). Conclusion. The use of ALG induction resulted in a lower incidence of acute rejection and improved graft survival during the first year after renal transplantation. Patient and graft survival at 20-year follow-up was not affected by ALG induction. Background.
Basiliximab (Simulect), a high-affinity chimeric monoclonal antibody directed against the alpha chain of the human interleukin-2 receptor (CD25), reduces the incidence of acute renal allograft rejection when used in combination with cyclosporine (Neoral) and steroids. This study was designed to compare the safety and efficacy of basiliximab to polyclonal anti-T-cell (ATGAM) therapy for the prevention of acute rejection in de novo renal transplant recipients. Methods. This 1-year, open-label, randomized trial was conducted in recipients of cadaveric or living-related donor renal transplants. All patients received cyclosporine (Neoral), mycophenolate mofetil (CellCept, MMF), and corticosteroids. Patients who were randomized to basiliximab therapy received a 20 mg i.v. bolus dose on days 0 and 4, and the majority of patients were initiated on cyclosporine within 48 hr of transplantation. Patients who were randomized to antithymocyte globulin therapy (ATGAM, ATG) received 15 mg/day i.v. within 48 hr of transplant and continued treatment for up to 14 days; ATG was stopped once therapeutic cyclosporine blood levels were achieved. The initiation of cyclosporine use was delayed in the ATG group until renal function was established (serum creatinine < 3.0 mg/dl or a 50% fall from baseline). Results. Of the 138 randomized patients, 135 received at least 1 dose of study medication (70 patients, basiliximab; 65 patients, ATG). Demographic characteristics were similar between the basiliximab and ATG treatment groups. At 12 months, the rate of biopsy-proven acute rejection was 19% and 20%, respectively, in the basiliximab and ATG groups. Although the overall profile of adverse events was similar between basiliximab- and ATG-treated patients, adverse events considered by the investigators to be associated with the study drug occurred more often among patients receiving ATG (42% vs. 11% with basiliximab). Conclusions.
Basiliximab combined with early initiation of cyclosporine therapy resulted in low acute rejection rates similar to those achieved with ATG combined with delayed cyclosporine. Basiliximab therapy showed an excellent safety profile, with no increases in malignancies, infections, or deaths. Based on its convenient two-dose, body-weight-independent regimen and comparable effectiveness to ATG, basiliximab is an attractive choice for the prevention of acute rejection episodes in renal transplant patients. We compared the influence of induction therapy on 5-year patient and graft survival as well as on renal function in 100 kidney graft recipients at low immunological risk treated with antilymphocyte globulin (n = 50) versus anti-IL-2R monoclonal antibody (n = 50) in a prospective multicenter study. Long-term immunosuppressive treatment included cyclosporine, mycophenolate mofetil, and a short course of steroids in all patients. Five-year graft (86% vs 86%) and patient (94% vs 94%) survivals were identical in both study arms. Moreover, neither serum creatinine nor proteinuria was significantly different between the two groups. Our results showed that the choice of induction therapy did not seem to have a major impact on long-term outcomes among renal recipients at low immunological risk. Although electronic searching and publishing have made it easier to identify and obtain a report of a single randomised trial, this cannot be deemed a short-cut to obtaining an overall, balanced view of the effectiveness of the interventions investigated in the trial. Instead, similar trials need to be brought together. This bringing together needs to be done in as unbiased a way as possible. Traditional narrative reviews, usually written by a recognised expert, are generally not systematic.
Authors might simply not have the time to identify and bring together all relevant studies, or they might actually seek to discuss and selectively combine trials that confirm their opinions and prejudices. A systematic review aims to circumvent this weakness by the use of a predefined, explicit methodology. The methods used include steps to minimise bias in the identification of relevant studies, in the selection criteria for inclusion, and in the collection of data. Guidance is available on the conduct. Acute graft-versus-host disease (aGVHD) occurs in up to 80% of patients who undergo allogeneic stem cell transplantation (SCT) and contributes significantly to transplant-related mortality (TRM). We conducted a prospective phase II trial to assess the efficacy and feasibility of treating steroid-refractory aGVHD with basiliximab, a chimaeric monoclonal antibody directed against the alpha chain of the interleukin-2 (IL-2) receptor. Basiliximab was administered intravenously at a dose of 20 mg on days 1 and 4. Twenty-three patients were enrolled between October 1999 and July 2004. We found a primary overall response rate of 82.5%, with four patients (17.5%) showing a complete response and 15 patients (65%) a partial response. Six patients were again treated successfully with an IL-2 receptor antagonist because of recurrence of aGVHD. The rates of infections, chronic GVHD, malignancy recurrence and 1-year TRM following immunosuppression with basiliximab were comparable with those found with other treatment modalities for aGVHD. We conclude that basiliximab is efficient and feasible for steroid-refractory aGVHD and merits further evaluation.
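The passage above describes how a systematic review brings similar trials together with a predefined, explicit methodology. The core quantitative step is usually fixed-effect inverse-variance pooling of per-trial effect sizes; the sketch below illustrates it for log risk ratios. All trial counts are hypothetical, and the helper names (`log_rr_and_var`, `pooled_rr`) are mine, not taken from any of the studies cited.

```python
import math

# Minimal sketch of fixed-effect inverse-variance pooling of log risk ratios,
# the standard way "similar trials are brought together" in a meta-analysis.
# Each trial contributes (events, total) for treatment and control arms.
def log_rr_and_var(e1, n1, e0, n0):
    rr = (e1 / n1) / (e0 / n0)
    var = 1/e1 - 1/n1 + 1/e0 - 1/n0   # approximate variance of log RR
    return math.log(rr), var

def pooled_rr(trials):
    num = den = 0.0
    for e1, n1, e0, n0 in trials:
        lrr, var = log_rr_and_var(e1, n1, e0, n0)
        w = 1.0 / var                  # inverse-variance weight
        num += w * lrr
        den += w
    return math.exp(num / den)

# Hypothetical counts loosely shaped like the rejection-rate data above
trials = [(13, 70, 13, 65), (7, 50, 7, 50)]
print(round(pooled_rr(trials), 3))
```

A random-effects model (e.g. DerSimonian-Laird) adds a between-trial variance component to each weight but follows the same weighted-average structure.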
2,022
27,548,729
Discussion: Wireless esophageal capsule endoscopy is well tolerated and safe in patients with liver cirrhosis and suspicion of portal hypertension. The sensitivity of capsule endoscopy is not currently sufficient to replace EGD as a first exploration in these patients, but given its high accuracy, it may have a role in cases of refusal or contraindication to EGD.
Introduction: Esophageal variceal bleeding is a severe complication of portal hypertension with significant morbidity and mortality. Although traditional screening and grading of esophageal varices has been performed by esophagogastroduodenoscopy (EGD), wireless video capsule endoscopy provides a minimally invasive alternative that may improve screening and surveillance compliance. Aim of the Study: The aim of the study was to perform a systematic review and structured meta-analysis of all eligible studies to evaluate the efficacy of wireless capsule endoscopy for screening and diagnosis of esophageal varices among patients with portal hypertension.
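The diagnostic-accuracy figures quoted throughout the capsule-endoscopy studies below (sensitivity, specificity, PPV, NPV) all derive from a 2×2 table of capsule findings against the EGD gold standard. A minimal sketch, with hypothetical counts (the function name and numbers are illustrative, not from any study here):

```python
# Derive the four standard diagnostic metrics from a 2x2 confusion table:
# tp/fp/fn/tn = true positive, false positive, false negative, true negative.
def diagnostic_metrics(tp, fp, fn, tn):
    sens = tp / (tp + fn)   # true-positive rate among diseased
    spec = tn / (tn + fp)   # true-negative rate among non-diseased
    ppv = tp / (tp + fp)    # probability a positive call is correct
    npv = tn / (tn + fn)    # probability a negative call is correct
    return sens, spec, ppv, npv

# Hypothetical: 13 TP, 4 FP, 4 FN, 30 TN
sens, spec, ppv, npv = diagnostic_metrics(13, 4, 4, 30)
print(round(sens, 2), round(spec, 2), round(ppv, 2), round(npv, 2))
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on the prevalence of varices in the screened population, which is why they vary so much between the cohorts reported below.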
Background: Endoscopy (esophagogastroduodenoscopy, EGD) to screen for esophageal varices (EV) is recommended in patients with portal hypertension. Reports indicate that capsule endoscopy (CE) is capable of identifying large/medium varices (L/MV) when the varix comprises more than 25% of the circumference of the field of view. Aims: We evaluated the ability of CE to discriminate the size of EV using this grading scale. Methods: Patients underwent CE and EGD on the same day. A blinded investigator interpreted capsule findings. CE labeled EV as L/MV if ≥25% of the lumen circumference was occupied, and small/none for <25%. Results: A total of 37 patients were enrolled in this prospective, observational study at a single tertiary-care academic center. Three CE were excluded due to rapid esophageal transit time or technical malfunction. Using a 25% threshold, the sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) for CE to discriminate L/MV were 23.5%, 88.2%, 66.7%, and 53.6%, respectively (κ = 0.12). Reducing the threshold to 12.5% resulted in sensitivity, specificity, PPV, and NPV of 88.2%, 64.7%, 71.4%, and 84.6%, respectively (κ = 0.53). A receiver-operator characteristic (ROC) curve showed a 15% threshold to be optimal in discriminating EV size using CE, resulting in sensitivity, specificity, PPV, and NPV of 76.5%, 82.4%, 81.3%, and 77.8%, respectively (κ = 0.59). Conclusions: This study indicates that discriminating EV size by the current capsule scale is unreliable. Lowering the grading threshold improved the ability to discriminate EV size by CE. In the proper context, CE is an alternative to EGD to screen for EV. AIM: To investigate the utility of esophageal capsule endoscopy in the diagnosis and grading of esophageal varices.
METHODS: Cirrhotic patients who were undergoing esophagogastroduodenoscopy (EGD) for variceal screening or surveillance underwent capsule endoscopy. Two separate blinded investigators read each capsule endoscopy for the following results: variceal grade, need for treatment with variceal banding or prophylaxis with beta-blocker therapy, degree of portal hypertensive gastropathy, and gastric varices. RESULTS: Fifty patients underwent both capsule and EGD. Forty-eight patients had both procedures on the same day, and 2 patients had capsule endoscopy within 72 h of EGD. The accuracy of capsule endoscopy to decide on the need for prophylaxis was 74%, with sensitivity of 63% and specificity of 82%. Inter-rater agreement was moderate (kappa = 0.56). Agreement between EGD and capsule endoscopy on grade of varices was 0.53 (moderate). Inter-rater reliability was good (kappa = 0.77). In diagnosis of portal hypertensive gastropathy, accuracy was 57%, with sensitivity of 96% and specificity of 17%. Two patients had gastric varices seen on EGD, one of which was seen on capsule endoscopy. There were no complications from capsule endoscopy. CONCLUSION: We conclude that capsule endoscopy has a limited role in deciding which patients would benefit from EGD with banding or beta-blocker therapy. More data are needed to assess accuracy for staging esophageal varices, PHG, and the detection of gastric varices. BACKGROUND: Nonselective beta-adrenergic blockers decrease portal pressure and prevent variceal hemorrhage. Their effectiveness in preventing varices is unknown. METHODS: We randomly assigned 213 patients with cirrhosis and portal hypertension (minimal hepatic venous pressure gradient [HVPG] of 6 mm Hg) to receive timolol, a nonselective beta-blocker (108 patients), or placebo (105 patients). The primary end point was the development of gastroesophageal varices or variceal hemorrhage.
Endoscopy and HVPG measurements were repeated yearly. RESULTS: During a median follow-up of 54.9 months, the rate of the primary end point did not differ significantly between the timolol group and the placebo group (39 percent and 40 percent, respectively; P=0.89), nor were there significant differences in the rates of ascites, encephalopathy, liver transplantation, or death. Serious adverse events were more common among patients in the timolol group than among those in the placebo group (18 percent vs. 6 percent, P=0.006). Varices developed less frequently among patients with a baseline HVPG of less than 10 mm Hg and among those in whom the HVPG decreased by more than 10 percent at one year, and more frequently among those in whom the HVPG increased by more than 10 percent at one year. CONCLUSIONS: Nonselective beta-blockers are ineffective in preventing varices in unselected patients with cirrhosis and portal hypertension and are associated with an increased number of adverse events. (ClinicalTrials.gov number, NCT00006398.) Several treatments have been proven to be effective for variceal bleeding in patients with cirrhosis. The aim of this multicenter, prospective, cohort study was to assess how these treatments are used in clinical practice and what are the posttherapeutic prognosis and prognostic indicators of upper digestive bleeding in patients with cirrhosis. A training set of 291 and a test set of 174 bleeding cirrhotic patients were included. Treatment was according to the preferences of each center and the follow-up period was 6 weeks. Predictive rules for 5-day failure (uncontrolled bleeding, rebleeding, or death) and 6-week mortality were developed by the logistic model in the training set and validated in the test set.
Initial treatment controlled bleeding in 90% of patients, including vasoactive drugs in 27%, endoscopic therapy in 10%, combined (endoscopic and vasoactive) in 45%, balloon tamponade alone in 1%, and none in 17%. The 5-day failure rate was 13%, 6-week rebleeding was 17%, and mortality was 20%. Corresponding findings for variceal versus nonvariceal bleeding were 15% versus 7% (P = .034), 19% versus 10% (P = .019), and 20% versus 15% (P = .22). Active bleeding on endoscopy, hematocrit levels, aminotransferase levels, Child-Pugh class, and portal vein thrombosis were significant predictors of 5-day failure; alcohol-induced etiology, bilirubin, albumin, encephalopathy, and hepatocarcinoma were predictors of 6-week mortality. Prognostic reassessment including blood transfusions improved the predictive accuracy. All the developed prognostic models were superior to the Child-Pugh score. In conclusion, the prognosis of digestive bleeding in cirrhosis has much improved over the past 2 decades. Initial treatment stops bleeding in 90% of patients. Accurate predictive rules are provided for early recognition of high-risk patients. BACKGROUND/AIMS: The incidence and natural history of small esophageal varices (EV) in cirrhotics may influence the frequency of endoscopies and the decision to start a pharmacological treatment in these patients. METHODS: We prospectively evaluated 206 cirrhotics, 113 without varices and 93 with small EV, during a mean follow-up of 37±22 months. Patients with previous gastrointestinal bleeding or receiving any treatment for portal hypertension were excluded. Endoscopy was performed every 12 months. RESULTS: The rate of incidence of EV was 5% (95% CI: 0.8-8.2%) at 1 year and 28% (21.0-35.0%) at 3 years. The rate of EV progression was 12% (5.6-18.4%) at 1 year and 31% (21.2-40.8%) at 3 years.
Post-alcoholic origin of cirrhosis, Child-Pugh class (B or C) and the finding of red wale marks at first examination were predictors of variceal progression. The two-year risk of bleeding from EV was higher in patients with small varices upon enrollment than in those without varices: 12% (95% CI: 5.2-18.8%) vs. 2% (0.1-4.1%) (P<0.01). The predictor of bleeding was the presence of red wale marks at first endoscopy. CONCLUSIONS: In patients with no or small EV, endoscopy surveillance should be planned taking into account the cause and degree of liver dysfunction. This is one of a series of statements discussing the utilization of GI endoscopy in common clinical situations. The Standards of Practice Committee of the American Society for Gastrointestinal Endoscopy prepared this text. In preparing this guideline, a MEDLINE literature search was performed and additional references were obtained from the bibliographies of the identified articles and from recommendations of expert consultants. When little or no data exist from well-designed prospective trials, emphasis is given to results from large series and reports from recognized experts. Guidelines for appropriate utilization of endoscopy are based on a critical review of the available data and expert consensus. Further controlled clinical studies are needed to clarify aspects of this statement, and revision may be necessary as new data appear. Clinical consideration may justify a course of action at variance with these recommendations. This document is intended to provide the principles by which credentialing organizations may create policy and practical guidelines for granting privileges to perform capsule endoscopy. For information on credentialing for other endoscopic procedures, please refer to "Guidelines for Credentialing and Granting Privileges for Gastrointestinal Endoscopy."
BACKGROUND AND STUDY AIMS: Esophagogastroduodenoscopy (EGD) is the most effective method for examining the upper gastrointestinal tract, and particularly for evaluating portal hypertension in cirrhotic patients, especially for screening purposes. The aim of this study was to assess the feasibility, safety, accuracy, and tolerance of PillCam ESO capsule endoscopy for this indication. PATIENTS AND METHODS: In this prospective study, unsedated EGD and capsule endoscopy examinations were conducted on the same day in cirrhotic patients at the time of diagnosis. The patients quantified the tolerability (relative to pain, nausea, choking sensations, etc.) of the two procedures using a 100-mm visual analogue scale. The time required for the recording and for diagnosis with the capsule examination was documented, as were the patients' preferences in comparison with EGD. Two independent endoscopists blinded to the EGD diagnoses assessed the diagnostic accuracy of the images obtained. RESULTS: Twenty-one patients were included in the study (mean age 62, mean Model for End-Stage Liver Disease score 10.5, mean Child-Pugh score 7.3). The procedure was safe. One patient was unable to swallow the capsule. The mean recording time was 213 s (range 6-1200 s); the procedure accurately assessed the presence or absence of esophageal varices in 16 of 19 patients (84.2%); and it correctly indicated a need for primary prophylaxis (esophageal varices of grade 2 or more and/or red signs) in 100% of cases. The tolerability of the capsule endoscopy examination was significantly better, and all of the patients preferred capsule endoscopy to EGD (which was transnasal in 11 patients). CONCLUSIONS: Capsule endoscopy was feasible, safe, accurate, highly acceptable, and preferred by cirrhotic patients undergoing screening for portal hypertension.
This new technique requires further and more extensive evaluation, as well as assessment of its cost-effectiveness. OBJECTIVES: Esophagogastroduodenoscopy (EGD) is the standard method for the diagnosis of esophago-gastric varices. The aim of this prospective multicenter study was to evaluate the PillCam esophageal capsule endoscopy (ECE) for this indication. METHODS: Patients presenting with cirrhotic or noncirrhotic portal hypertension underwent ECE followed by EGD at the time of diagnosis. Capsule recordings were blindly read by two endoscopists. RESULTS: A total of 120 patients (72 males, mean age: 58 years; mean Child-Pugh score: 7.2) were included. Esophageal varices were detected in 74 patients. No adverse event was observed after either EGD or ECE. Seven (6%) patients were unable to swallow the capsule. The mean recording time was 204 s (range 1-876). Sensitivity, specificity, negative predictive value, and positive predictive value of ECE for the detection of esophageal varices were 77%, 86%, 69%, and 90%, respectively. Sensitivity, specificity, negative and positive predictive values of ECE for the indication of primary prophylaxis (esophageal varices ≥ grade 2 and/or red signs) were 77%, 88%, 90%, and 75%, respectively, and 85% of the patients were adequately classified for the indication (or not) of prophylaxis. Interobserver concordance for ECE readings was 79.4% for the diagnosis of varices, 66.4% for the grading of varices, and 89.7% for the indication of prophylaxis. CONCLUSIONS: This large multicenter study confirms the safety and acceptable accuracy of ECE for the evaluation of esophageal varices.
ECE might be proposed as an alternative to EGD for the screening of portal hypertension, especially in patients unable or unwilling to undergo EGD. We conducted a prospective study of 321 patients with cirrhosis of the liver and esophageal varices with no history of bleeding to see whether a comprehensive analysis of their clinical features and of the endoscopic appearances of their varices could help to identify those at highest risk for bleeding. Varices were classified endoscopically as suggested by the Japanese Research Society for Portal Hypertension. Patients were followed for 1 to 38 months (median, 23), during which 85 patients (26.5 percent) bled. Multiple regression analysis (Cox's model) revealed that the risk of bleeding was significantly related to the patient's modified Child class (an index of liver dysfunction based on serum albumin concentration, bilirubin level, prothrombin time, and the presence of ascites and encephalopathy), the size of the varices, and the presence of red wale markings (longitudinal dilated venules resembling whip marks) on the varices. A prognostic index based on these variables was devised that enabled us to identify a subset of patients with a one-year incidence of bleeding exceeding 65 percent. The index was prospectively validated on an independent sample of 75 patients with varices and no history of bleeding. We conclude that our prognostic index, which identifies groups of patients with one-year probabilities of bleeding ranging from 6 to 76 percent, can be used to identify candidates for prophylactic treatment. Treatment of variceal hemorrhage is one of the most controversial subjects in medicine. The resurgence of old therapies (endoscopic sclerotherapy) and the introduction of new modalities (obliterative angiotherapy) has exacerbated the controversy. No widely accepted controlled therapeutic trial is available.
The problem related to survival analysis has been studied in the light of the available information concerning the natural history of variceal bleeding. It is believed that controlled trials can be designed which will prove the efficacy, or lack of it, for any proposed treatment; however, valid conclusions based on previous studies are limited, largely because of the many confounding variables. Time, as a variable factor both for randomization and therapeutic intervention, has been largely ignored; yet we believe it is the major variable in this setting. For the population of variceal bleeders, the risk of rebleeding or death rapidly diminishes over the first few days after a bleed, and early survival may be the best marker for later survival. Neither presentation nor treatment seems to alter this fundamental behavior. Variceal hemorrhage may serve as a prototype for problems of survival analysis of diseases with early high mortality. BACKGROUND AND STUDY AIM: Esophageal video capsule endoscopy (ECE) is a new technique that allows examination of the esophagus using a noninvasive approach. The aim of this study was to compare ECE with esophagogastroduodenoscopy (EGD) for the diagnosis of esophageal varices in patients with cirrhosis. PATIENTS AND METHODS: A total of 330 patients with cirrhosis and with no known esophageal varices were prospectively enrolled. Patients underwent ECE first, followed by EGD (gold standard). The endoscopists who performed EGD were blind to the ECE result. Patient satisfaction was assessed using a visual analog scale (maximum score 100). RESULTS: A total of 30 patients were excluded from the analysis because they did not undergo any endoscopic examinations. Patients (mean age 56 years; 216 male) had mainly alcoholic (45%) or viral (27%) cirrhosis.
The diagnostic indices of ECE to diagnose and correctly stage esophageal varices were: sensitivity 76% and 64%, specificity 91% and 93%, positive predictive value 88% and 88%, and negative predictive value 81% and 78%, respectively. ECE patient satisfaction scored significantly higher than EGD (87 ± 22 vs. 58 ± 35; P < 0.0001). CONCLUSIONS: ECE was well tolerated and safe in patients with liver cirrhosis and suspicion of portal hypertension. The sensitivity of ECE is not currently sufficient to replace EGD as a first exploration in these patients. However, due to its excellent specificity and positive predictive value, ECE may have a role in cases of refusal or contraindication to EGD. ECE might also improve compliance with endoscopic follow-up and aid important therapeutic decision making in the prophylaxis of bleeding. TRIAL REGISTRATION: EudraCT (ID RCB 2009-A00532-55) and ClinicalTrials.gov (NCT00941421). The natural history of gastrointestinal bleeding in cirrhosis has been studied using prospectively collected data of 532 patients included in a randomized clinical trial with a regular follow-up of up to 12 yr. Of the total 199 patients who experienced gastrointestinal bleeding, 95 (48%) bled from esophageal or gastric varices, 67 (34%) bled from peptic ulcer or gastritis, and 37 (18%) had either insufficient evidence of the source (33) or mixed sources (4). In the total group of patients, the cumulative percentage of patients in whom varices had been demonstrated by radiography increased from 12 to 90 in 10 yr, while that of bleeding from varices increased from 7 to 40. In 104 patients who bled for the first time during the trial period (trial bleeding patients), the median number of bleeding episodes was one (range 1-8). In these patients the fatality from bleeding from varices was 82%.
The risk of rebleeding from varices was 81%, and 4 yr after the first bleeding the cumulative survival had decreased to less than 10%. Rebleeding was significantly less frequent and survival significantly higher in patients bleeding from sources other than varices. Prednisone reduced the occurrence rate of varices, bleeding from varices, and death from bleeding varices in nonalcoholic females without ascites, 40% of whom fulfilled the histologic criteria of chronic active hepatitis. Prednisone significantly increased the occurrence rate of varices in patients with ascites and of bleeding from varices in alcoholic patients. Prednisone significantly increased the occurrence rate of peptic ulcer in males and in patients without chronic active hepatitis.
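Several of the capsule-endoscopy studies above summarize interobserver agreement with kappa statistics (e.g. κ = 0.53, 0.77). Cohen's kappa corrects raw agreement for the agreement expected by chance alone; a minimal sketch with hypothetical two-reader ratings (the ratings and function name are illustrative only):

```python
# Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement).
# rater_a and rater_b are parallel lists of categorical calls by two readers.
def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal frequency per label
    expected = sum(
        (rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels
    )
    return (observed - expected) / (1 - expected)

# Hypothetical "large/medium" vs "small/none" variceal calls by two readers
a = ["large", "large", "small", "small", "small", "large", "small", "small"]
b = ["large", "small", "small", "small", "small", "large", "small", "large"]
print(round(cohens_kappa(a, b), 3))
```

A kappa of 0 means agreement no better than chance and 1 means perfect agreement, which is why the studies label values around 0.5 "moderate" and around 0.8 "good".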
2,023
29,753,042
Meta-analysis of delayed-type hypersensitivity (DTH) reactions and CD8+ T-cell levels, as immune responses, showed significant differences in the vaccinated groups compared with their non-vaccinated counterparts. In addition, recurrence and the overall and disease-free survival were significantly different in the vaccinated subjects versus the controls. Evaluation of the local and systemic toxicity of the E75 peptide vaccine demonstrated minimal side effects.
Abstract: The E75 peptide vaccine, derived from the tumor-associated antigen HER2, is the most frequently studied anti-HER2 vaccination strategy for the treatment of breast cancer patients. It has been investigated in several phase I/II clinical trials and is currently being evaluated in a randomized multicenter phase III clinical trial. We conducted a systematic review and meta-analysis to clarify the outcomes of the E75 peptide vaccine, including its therapeutic efficacy, disease recurrence, survival rate, and side effects.
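Comparisons of recurrence between vaccinated and control arms, as in the trials pooled by this review, typically reduce to a two-proportion test. The sketch below uses the normal approximation (an assumption adequate only for reasonably large arms); the counts are hypothetical, chosen merely to mirror the roughly 8% vs. 15% recurrence rates reported in the trial abstracts that follow.

```python
import math

# Two-proportion z statistic under the pooled null hypothesis p1 = p2.
def two_proportion_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                 # pooled proportion under H0
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical arms: 8/96 recurrences (~8.3%) vaccinated vs 12/81 (~14.8%) control
z = two_proportion_z(8, 96, 12, 81)
# Two-sided p-value from the standard normal CDF via math.erf
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
print(round(z, 3), round(p_value, 3))
```

With arms of this size the difference does not reach significance, which is consistent with the non-significant recurrence P-values the individual trials report; pooling across trials is what gives the meta-analysis its added power.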
Background: Trastuzumab, an anti-HER2/neu monoclonal antibody, is thought to promote HER2/neu receptor internalization and/or turnover. This study was designed to investigate the kinetics of trastuzumab treatment on tumor cells with varying levels of HER2/neu expression and to determine the effect of trastuzumab on HER2/neu-specific cytotoxic T lymphocyte-mediated lysis. Methods: Three cell lines with varying levels of HER2/neu expression were incubated with varying doses of trastuzumab at multiple time points. Trastuzumab binding and HER2/neu expression were determined. Peripheral blood mononuclear cells from three HLA-A2+ healthy donors and four E75 peptide-vaccinated patients were stimulated with HER2/neu-derived peptides and tested in standard chromium release cytotoxicity assays with HER2/neu+ tumor cells pretreated with trastuzumab. Results: Treatment of tumor cells with 10 μg/mL of trastuzumab in an overnight incubation resulted in saturation of cell-surface HER2/neu receptors. At higher doses, trastuzumab staining and HER2/neu expression decreased in a time-dependent manner. Pretreatment of tumor cells with trastuzumab resulted in increases in specific cytotoxicity by peptide-stimulated cytotoxic T lymphocytes from HLA-A2+ donors over untreated cells by an average of 5.6% and 15.3% (P = .0002) for doses of 10 and 50 μg/mL, respectively. In similar experiments involving peripheral blood mononuclear cells obtained from immunized patients, the average specific cytotoxicity for untreated cells was 34.2% ± 1.3% vs. 40.6% ± 2.5% (P = .035) and 40.7% ± 1.6% (P = .0005) for those treated with 10 and 50 μg/mL, respectively. Conclusions: Our data suggest that pretreatment of breast cancer cells with trastuzumab induces turnover of the HER2/neu protein and enhanced killing by HER2/neu peptide-stimulated CTLs.
This increased lysis occurs regardless of the degree of HER2/neu expression and seems more pronounced in vaccinated patients. These findings support further investigation into the use of combination immunotherapy with trastuzumab and HER2/neu peptide-based vaccines. Purpose: HER2/neu, a source of immunogenic peptides, is expressed in >75% of breast cancer patients. We have conducted clinical trials with the HER2/neu E75 peptide vaccine in breast cancer patients with varying levels of HER2/neu expression. Vaccine response based on HER2/neu expression level was analyzed. Experimental Design: Patients were stratified by HER2/neu expression. Low expressors (n = 100) were defined as HER2/neu immunohistochemistry (IHC) 1+ to 2+ or fluorescence in situ hybridization <2.0. Overexpressors (n = 51) were defined as IHC 3+ or fluorescence in situ hybridization ≥2.0. Additional analyses were done stratifying by IHC status (0-3+). Standard clinicopathologic factors, immunologic response (in vivo delayed-type hypersensitivity reactions; ex vivo human leukocyte antigen A2:immunoglobulin G dimer assay), and clinical responses (recurrence; mortality) were assessed. Results: Low-expressor (control, 44; vaccinated, 56) versus overexpressor patients (control, 22; vaccinated, 29) were assessed. Low expressors, overexpressors, and most IHC-status vaccinated groups responded immunologically. Vaccinated low-expressor patients had larger maximum immunologic responses compared with overexpressor patients (P = 0.04), and vaccinated IHC 1+ patients had increased long-term immune response (P = 0.08). More importantly, compared with controls, low-expressor patients had a mortality reduction (P = 0.08). The largest decrease in mortality was seen in IHC 1+ patients (P = 0.05).
In addition, a subset of overexpressor patients (n = 7) received trastuzumab before vaccination, and this combination seems safe and immunologically beneficial. Conclusions: Most patients with various levels of HER2/neu expression responded immunologically and seemed to benefit from vaccination. The low expressors, specifically IHC 1+ patients, had more robust immunologic responses and may derive the greatest clinical benefit from the E75 vaccine. Ideally, vaccines should be designed to elicit long-lived immunity. The goal of this study was to determine whether HER-2/neu peptide-specific CD8+ T-cell immunity could be elicited using an immunodominant HER-2/neu-derived HLA-A2 peptide alone in the absence of exogenous help. Granulocyte-macrophage colony-stimulating factor (GM-CSF) was used as adjuvant. Six HLA-A2 patients with HER-2/neu-overexpressing cancers received 6 monthly vaccinations with a vaccine preparation consisting of 500 μg of HER-2/neu peptide, p369-377, admixed with 100 μg of GM-CSF. The patients had either stage III or IV breast or ovarian cancer. Immune responses to p369-377 were examined using an IFN-gamma enzyme-linked immunosorbent spot assay. Before vaccination, the median precursor frequency (range), defined as precursors per 10(6) peripheral blood mononuclear cells, to p369-377 was 0 (no range). After vaccination, the median precursor frequency to p369-377 in four evaluable patients was 0 (0-116). Overall, HER-2/neu peptide-specific precursors developed to p369-377 in two of four evaluable subjects. The responses were short-lived and not detectable at 5 months after the final vaccination. Immunocompetence was evident, because patients had detectable enzyme-linked immunosorbent spot responses to tetanus toxoid and influenza. These results demonstrate that HER-2/neu MHC class I epitopes can induce HER-2/neu peptide-specific IFN-gamma-producing CD8+ T cells.
However, the magnitude of the responses was low, as well as short-lived, suggesting that CD4+ T-cell help is required for lasting immunity to this epitope. E75, a HER-2/neu-derived peptide, was administered as a preventive vaccine with granulocyte-macrophage colony-stimulating factor (GM-CSF) in disease-free lymph node-positive (NP) and lymph node-negative (NN) breast cancer (BCa) patients. The optimal biologic dose (OBD) was determined based on toxicity and immunologic response. PURPOSE: E75 is an immunogenic peptide from the HER2/neu protein that is highly expressed in breast cancer. We are conducting a clinical trial of an E75 + granulocyte-macrophage colony-stimulating factor vaccine to assess safety, immunologic response, and the prevention of clinical recurrences in patients with disease-free, node-positive breast cancer (NPBC). PATIENTS AND METHODS: Fifty-three patients with NPBC were enrolled and HLA typed. HLA-A2+ patients (n = 24) were vaccinated, and HLA-A2- patients (n = 29) are observed prospectively as clinical controls. Local/systemic toxicities, immunologic responses, and time to recurrence are being measured. RESULTS: Only minor toxicities have occurred (one grade 3 [4%]). All patients have demonstrated clonal expansion of E75-specific CD8+ T cells that lysed HER2/neu-expressing tumor cells. An optimal dosage and schedule have been established. Patients have developed delayed-type hypersensitivity reactions to E75 postvaccination compared with controls (33 v 7 mm; P < .01). HLA-A2+ patients have been found to have larger, more poorly differentiated, and more hormonally insensitive tumors compared with HLA-A2- patients. Despite this, the only two deaths have occurred in the control group. The disease-free survival in the vaccinated group is 85.7% compared with 59.8% in the controls at 22 months' median follow-up, with a recurrence rate of 8% compared with 21%, respectively (P < .19).
Median time to recurrence in the vaccinated patients was prolonged (11 v 8 months), and recurrence correlated with a weak delayed-type hypersensitivity response. CONCLUSION This HER2/neu (E75) vaccine is safe and effective in eliciting a peptide-specific immune response in vivo. Induced HER2/neu immunity seems to reduce the recurrence rate in patients with NPBC. The authors conducted exploratory phase 1-2 clinical trials vaccinating breast cancer patients with E75, a human leukocyte antigen (HLA) A2/A3-restricted HER-2/neu (HER2) peptide, and granulocyte-macrophage colony-stimulating factor. The vaccine is given as adjuvant therapy to prevent disease recurrence. They previously reported that the vaccine is safe and effective in stimulating expansion of E75-specific cytotoxic T cells. Here, they report 24-month landmark analyses of disease-free survival (DFS). Background E75, a HER2/neu immunogenic peptide, is expressed in breast cancer (BCa). We have performed clinical trials of the E75 + GM-CSF vaccine in disease-free, node-positive and node-negative BCa patients at high recurrence risk, and recurrences were noted in both control and vaccine groups. Methods Among the 186 BCa patients enrolled, 177 completed the study. Patients were HLA typed; the HLA-A2+/A3+ patients were vaccinated; HLA-A2-/A3- patients were followed as controls. Standard clinicopathological factors, immunologic response to the vaccine, and recurrences were collected and assessed. Results The recurrence rate was 14.8% in the control group and 8.3% in the vaccinated group (P = 0.17). Comparing the 8 vaccinated recurrences (V-R) to the 88 vaccinated nonrecurrent patients (V-NR), the V-R group had higher nodal stage (≥N2: 75 vs. 5%, P = 0.0001) and higher-grade tumors (% grade 3: 88 vs. 31%, P = 0.003). The V-R group did not fail to respond immunologically, as noted by equivalent dimer responses and post-DTH responses.
Compared to control recurrent patients (C-R), V-R patients trended toward higher-grade tumors and hormone-receptor negativity. C-R patients had 50% bone-only recurrences, compared to V-R patients with no bone-only recurrences (P = 0.05). Lastly, V-R mortality rate was 12.5% compared with 41.7% for the C-R group (P = 0.3). Conclusions The vaccinated patients who recurred had more aggressive disease compared to V-NR patients. V-R patients had no difference in immune response to the vaccine either in vitro or in vivo. V-R patients, when compared to C-R patients, trended towards more aggressive disease, decreased recurrence rates, decreased mortality, and no bone-only recurrences. We used the Luminex assay to compare serum cytokine profiles of breast cancer patients (BCa) to healthy controls, node-positive (NP) patients to node-negative (NN), and pre- and post-vaccination serum of BCa vaccinated with a HER2/neu E75 peptide vaccine. Sera from 36 pre- and post-vaccination BCa (12 NP and 24 NN) and 13 healthy female donors were evaluated using Luminex technology. Levels of 22 cytokines consisting of interleukin (IL)-1alpha, -1beta, -2, -4, -5, -6, -7, -8, -10, -12, -13, -15, -17, IFN-gamma, G-CSF, GM-CSF, TNF-alpha, IP-10, MIP-1alpha, RANTES, eotaxin, and monocyte chemotactic protein-1 (MCP-1) were assessed. Six of 22 cytokines showed significant differences between BCa and healthy controls. MCP-1, eotaxin, RANTES, and GM-CSF levels were significantly elevated in BCa (P<0.009) and IL-1alpha and IL-4 levels were significantly decreased in BCa (P<0.015). Cytokine levels were generally elevated in NN patients compared to NP patients, with the exception of eotaxin and IL-13, which were increased in NP patients. Three cytokines, IL-6, MIP-1alpha, and G-CSF, reached statistical significance (P<0.05).
In 34 vaccinated BCa, MCP-1, eotaxin, and IL-13 were significantly elevated post-vaccination, with MCP-1 demonstrating the most significant response (median, 145.8-217.0 pg/ml, P=0.003). Using a multiplex assay we found significant differences in cytokine levels in sera of BCa compared to healthy controls, in NN compared to NP patients, and in vaccinated patients. Our results support an extended analysis of serum cytokine profiles for the potential development of predictive panels in diagnosis, staging, and monitoring cancer vaccine trials. The authors are conducting clinical trials of the HER-2/neu E75-peptide vaccine in clinically disease-free breast cancer (BC) patients. Their phase 1-2 trials revealed that the E75 + granulocyte-macrophage colony-stimulating factor (GM-CSF) vaccine is safe and effective in stimulating clonal expansion of E75-specific CD8+ T cells. They assessed the need for and response to a booster after completion of the primary vaccination series. BACKGROUND We are conducting clinical vaccine trials with the HER2/neu peptide, E75, in patients with breast cancer. The purpose of this study was to demonstrate clonal expansion of E75-specific CD8(+) T cells and to identify intra- and interantigenic epitope spreading. METHOD Pre- and postvaccination peripheral blood leukocyte samples (24 node positive [NP] and 20 node negative [NN]) from 44 vaccinated patients were analyzed. HLA-A2:Ig dimer molecules were loaded with the HER2 peptides, E75 or GP2, and were used with anti-TcR and CD8 antibodies to stain peripheral blood leukocytes immediately ex vivo and were analyzed with flow cytometry. In 8 randomly selected patients, dimers were loaded with the folate binding protein peptide E41 to evaluate for interantigenic epitope spreading. RESULTS All patients with NP and 95% of the patients with NN showed E75-specific clonal expansion. Patients with NN showed more robust expansion.
All patients with NP and 85% of the patients with NN showed evidence of intra-antigenic epitope spreading to GP2. However, patients with NN showed only moderate expansion to this subdominant epitope, which was not included in the immunizing mix. The degree of HER2/neu expression and disease stage impacted the ability to clonally expand E75- and GP2-specific CD8(+) T cells. Evidence of interantigenic epitope spreading to E41 was shown in 63% of the patients who were tested. CONCLUSION Our data provide evidence for the induction of intra- and interantigenic epitope spreading that results from a single HER2/neu peptide vaccine, even in early staged patients. The ability to raise immunity to multiple tumor antigens depends on both the degree of HER2/neu expression and the extent of disease. Epitope spreading is an essential element for the success of a peptide vaccine strategy. We are conducting clinical trials of the E75 peptide as a vaccine in breast cancer (BrCa) patients. We assessed T cell subpopulations in BrCa patients before and after E75 vaccination and compared them to healthy controls. We obtained 17 samples of blood from ten healthy individuals and samples from 22 BrCa patients prior to vaccination. We also obtained pre- and post-vaccination samples of blood from seven BrCa patients who received the E75/GM-CSF vaccine. CD4, CD8, CD45RA, CD45RO, and CCR7 antibodies were used to analyze the CD4+ and CD8+ T cells by four-color flow cytometry. Compared to healthy individuals, BrCa patients have significantly more memory and less naïve T cells and more effector-memory CD8+ and less effector CD4+ T cells. Phenotypic differences in defined circulating CD4+ and CD8+ T cell subpopulations suggest remnants of an active immune response to tumor, distinguished by a predominant memory T cell response and by untapped recruitment of naïve helper and cytotoxic T cells.
E75 vaccination induced recruitment of both CD4+ and CD8+ naïve T cells while the memory response remained stable. Additionally, vaccination induced global activation of all T cells, with specific enhancement of effector CD4+ T cells. E75 vaccination causes activation of both memory and naïve CD4+ and CD8+ T cells, while recruiting additional naïve CD4+ and CD8+ T cells to the overall immune response.
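As a side note on the statistics reported in these trials: the recurrence comparison above (14.8% of controls vs. 8.3% of vaccinated patients, P = 0.17) can be checked with a two-sided Fisher exact test. A minimal Python sketch, assuming counts reconstructed from the abstract (8 recurrences among 96 vaccinated patients; 14.8% of the 81 controls, i.e. about 12 recurrences); the function name is illustrative and not taken from any of the cited papers:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables that are no more
    probable than the observed one (row and column totals held fixed)."""
    n = a + b + c + d
    row1 = a + b          # e.g. vaccinated patients
    col1 = a + c          # e.g. total recurrences
    def prob(x):          # P(cell a = x) under the null hypothesis
        return comb(row1, x) * comb(n - row1, col1 - x) / comb(n, col1)
    p_obs = prob(a)
    lo = max(0, col1 - (n - row1))
    hi = min(col1, row1)
    return sum(prob(x) for x in range(lo, hi + 1)
               if prob(x) <= p_obs * (1 + 1e-9))

# 8/96 vaccinated vs. ~12/81 control recurrences (reconstructed counts)
p = fisher_exact_2x2(8, 96 - 8, 12, 81 - 12)
```

With these reconstructed counts, the Fisher p-value should land in the same non-significant region as the P = 0.17 quoted in the abstract (which may have come from a chi-square test, so the two values need not match exactly).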
2,024
24,215,104
EXPERT OPINION The results of this review have led to the following main observations: i) there is scarce documentation on the association between CVH and sexual dysfunction; ii) hormonal impairment seems to be a major component in the development of erectile dysfunction in CVH; however, published evidence concerning the contribution of other pathogenetic factors is rare and inconclusive; and iii) available treatment options for CVH potentially contribute to the development of sexual dysfunction in these patients.
INTRODUCTION This article reviews the literature on the epidemiology and pathogenetic factors of erectile dysfunction in men with chronic viral hepatic (CVH) diseases and the potential implications for diagnosis and treatment.
OBJECTIVES: Although sexual dysfunction has been reported in patients with hepatitis C virus (HCV) infection, little is known about this association. The aims of this study were to determine the prevalence of sexual dysfunction among men with chronic HCV infection and to evaluate the impact of sexual dysfunction on health-related quality of life (HRQOL). METHODS: We prospectively enrolled 112 HCV-positive men and 239 HCV-negative controls, and all patients completed validated questionnaires to assess sexual function (Brief Male Sexual Function Inventory [BMSFI]), depression (Beck Depression Inventory), and HRQOL (Medical Outcomes Study Short Form-36). The BMSFI assessed sexual drive, erection, ejaculation, sexual problem assessment, and overall sexual satisfaction. RESULTS: HCV-positive men had significantly more sexual dysfunction than control subjects across all five domains of the BMSFI. In addition, HCV-infected men were significantly more likely than controls to not be sexually satisfied (53.6% vs 28.9%, p < 0.001), and this remained statistically significant after adjusting for age, race, and other potential confounding variables (OR = 3.36; 95% CI, 1.59-7.13). In the 241 individuals without depression, HCV-positive men were significantly more likely to not be sexually satisfied as compared with control subjects (47.5% vs 11.0%, p < 0.001). HCV-infected men who were not sexually satisfied scored significantly worse in six of eight domains of HRQOL as compared with HCV-infected men who were sexually satisfied. CONCLUSIONS: Sexual dysfunction is highly prevalent in men with chronic HCV infection, is independent of depression, and is associated with a marked reduction in HRQOL. The effects of interferon-alpha (IFN-alpha), given at a dosage of 6 MU thrice weekly for 12 months, on gonadal function were investigated in 18 males affected by chronic hepatitis C.
Periodically, all patients were clinically monitored and questioned about sexual function. Gonadotropin and serum androgen concentrations (follicle-stimulating hormone, luteinizing hormone, total testosterone, free testosterone, androstenedione, dehydroepiandrosterone, dehydroepiandrosterone sulfate, and sex hormone-binding globulin) were tested every 3 months. Ten of 18 patients (55%) responded to IFN-alpha therapy. Serum total testosterone and sex hormone-binding globulin values decreased slightly at the third month of treatment, then returned to baseline values. Serum free testosterone and other sex hormones remained essentially unchanged during IFN-alpha therapy. Four patients (22.2%) complained of sexual dysfunction (impaired libido, erectile failure, and impaired ejaculation), which was unrelated to any significant hormonal change and resolved after IFN therapy was stopped. Serum sex hormone values did not differ between responders and nonresponders to IFN-alpha. This study indicates that 12 months of treatment with 6 MU of IFN-alpha thrice weekly does not significantly affect gonadal function in men with chronic hepatitis C. The sexual dysfunction observed could be ascribed to such other side effects of IFN as asthenia, fatigue, or anxiety, or it could have a psychologic basis. OBJECTIVE Recombinant human interferon alpha (rhIFN-alpha) is used therapeutically in malignant disorders and chronic hepatitis. The present study assessed the effects of rhIFN-alpha on the hypothalamic-pituitary-testicular (HPT) axis. DESIGN AND METHODS We performed a saline-controlled cross-over study in six healthy men, sequentially measuring the serum concentrations of gonadotropins, testosterone, the free androgen index (FAI), and sex hormone-binding globulin (SHBG) after a bolus subcutaneous injection of rhIFN-alpha.
RESULTS rhIFN-alpha induced a sustained decrease of both testosterone (from 19.5+/-1.88 to a nadir of 5.49+/-0.51 nmol/l at the end of the study) and FAI (from 98.7+/-14.7 to a nadir of 32.1+/-5.3 at the end of the study), whereas concentrations of LH, FSH, and SHBG were not different between the two studies. CONCLUSIONS Our results suggest that rhIFN-alpha affects the HPT axis at the testicular level, either directly or indirectly, and changes feedback relationships between the pituitary and the testis. INTRODUCTION In men, erectile dysfunction (ED) is an important issue. Data concerning ED in men with end-stage liver disease (ESLD) are limited, and the risk factors for ED in this population are still unknown. AIMS To determine the prevalence, timescale, and risk factors for ED in ESLD patients who are candidates for liver transplantation. METHODS Patients who were candidates for a liver transplantation were asked to participate in a mailed survey about sexual function. Among the 123 eligible men, 98 (84%) agreed to complete the questionnaire. MAIN OUTCOME MEASURES The quality of erection was evaluated using the five-item International Index of Erectile Function (IIEF-5) score, and satisfaction with sexuality, using the patient-baseline Treatment-Satisfaction Scale (TSS) score. Other questions also focused on patient perception of changes over time. RESULTS In the overall population, 28 patients (29%) were not sexually active. Among the 70 patients who were sexually active, 52 patients (74%) had ED. Regarding the development of ED, 50% of the patients perceived that a deterioration of erectile function occurred within the six previous months. The absence of sexual activity was more frequent in hepatitis B or C patients (P = 0.02). The risk factors for ED were alcohol intake (P = 0.03), tobacco use (P = 0.03), and cardiovascular disease (P = 0.004).
The significant risk factors for having a low TSS score were having viral hepatitis (P = 0.01) and cardiovascular disease (P = 0.01). CONCLUSION The population of men with ESLD who are candidates for liver transplantation is characterized by a high frequency of lack of sexual activity and by a high prevalence of ED, and should be targeted by interventions to improve sexual functioning. These preliminary data need further validation in prospective trials using more comprehensive questionnaires. BACKGROUND & AIMS Compared with Caucasian Americans (CA), African Americans (AA) with chronic hepatitis C are less likely to respond to interferon-based antiviral therapy. METHODS In a multicenter treatment trial, 196 AA and 205 CA treatment-naive patients with hepatitis C virus (HCV) genotype 1 infection were treated with peginterferon alfa-2a (180 microg/wk) and ribavirin (1000-1200 mg/day) for up to 48 weeks. The primary end point was sustained virologic response (SVR). RESULTS Baseline features were similar among AA and CA, including HCV-RNA levels and histologic severity, but AA had higher body weights, a higher prevalence of diabetes and hypertension, and lower alanine transaminase levels (P < .001 for all). The SVR rate was 28% in AA and 52% in CA (P < .0001). Racial differences in viral responses were evident as early as treatment week 4. Breakthrough viremia was more frequent among AA than CA (13% vs 6%, P = .05); relapse rates were comparable (32% vs 25%, P = .30). Proportions of patients with serious adverse events and dose modifications and discontinuations were similar among AA and CA. In multiple regression analyses, CA had a higher SVR rate than AA (relative risk, 1.96; 95% confidence interval, 1.48-2.60; P < .0001). Other factors independently associated with higher SVR included female sex, lower baseline HCV-RNA level, less hepatic fibrosis, and more peginterferon taken.
CONCLUSIONS AA with chronic hepatitis C genotype 1 have lower rates of virologic response to peginterferon and ribavirin than CA. These differences are not explained by disease characteristics, baseline viral levels, or amount of medication taken. OBJECTIVE OF THE STUDY To report on the results of two projects on chronic hepatitis B in the Western Balkans led by Ioannina, Northwest Greece, and Tirana, Albania. METHODS Two prospective projects, HEPAGA I and HEPAGA II, lasted 4 years. In HEPAGA I, serum samples from 410 Albanians were tested for HBV. In HEPAGA II, health care consumption was recorded in hospitalized patients with chronic hepatitis B. RESULTS HEPAGA I showed that 11.89% of the Albanians were HBsAg(+) and only 21.19% had HBV immunoprotection. The HEPAGA II study included 101 patients. There was a significant difference in hospitalization costs per patient between centers. The Greek patients were significantly older (p=0.027) and there was a significant correlation between age > 50 years and hospitalization costs (p=0.035). In Greece, hospitalization costs, number of patients admitted, and number of hospitalization days per patient were substantial compared to other causes of hospitalization. CONCLUSIONS The HEPAGA I study showed a decrease in the prevalence of chronic HBV infection in Albania compared to that of the previous decade. The HEPAGA II study demonstrated that health care consumption due to HBV infection is still an important determinant of the overall health consumption in the Western Balkans. The prevalence and course of sexual dysfunction were evaluated in 221 alcoholic cirrhotic men participating in a double-blind, placebo-controlled study on the effect of oral testosterone treatment on liver disease. At entry, 67% (95% confidence limits, 61%-74%) complained of sexual dysfunction.
Sexual dysfunction was significantly (p less than 0.05) associated with lower serum concentrations of testosterone, non-protein-bound testosterone, and non-sex hormone-binding globulin-bound testosterone. The significant associations between sexual dysfunction and non-protein-bound and non-sex hormone-binding globulin-bound testosterone concentrations disappeared, however, when age, ethanol consumption, and severity of liver disease were included as covariates in the analysis. During follow-up (median 30 mo, range 1-48 mo) sexual dysfunction improved significantly (p less than 0.05) at 6, 12, and 24 mo. Furthermore, the reported libido and erectile and ejaculatory function improved significantly at the end of the follow-up period (p less than 0.01). However, the testosterone-treated patients did not differ significantly from the placebo-treated patients regarding any of the changes in sexual function. In conclusion, oral testosterone treatment does not significantly influence the type or course of sexual dysfunction in alcoholic cirrhotic men. However, sexual function improved after reduction of ethanol consumption in these patients. Decrease of libido and erectile dysfunction are reported by male patients during antiviral therapy of chronic hepatitis C, but therapy-associated underlying factors for sexual dysfunction are not well defined. To assess putative contributions of interferon-induced sex hormone changes to sexual dysfunction, we prospectively investigated changes in free testosterone, total testosterone, dehydroepiandrosterone sulfate, prolactin, sex hormone-binding globulin, FSH, and LH levels and psychometric self-assessment scores in 34 male patients treated with interferon alfa-2b (5 MIU three times weekly) (n=19) + ribavirin (n=15) for 6-12 months. Depression was measured by the Hospital Anxiety and Depression Scale.
Sexual dysfunction was evaluated by the Symptom Checklist 90 Item Revised and a five-point rating scale assessing sexual arousal disorder. Free and total testosterone decreased significantly during antiviral therapy in close correlation with libido/sexual function. Depression scores increased during therapy and were also significantly associated with sexual dysfunction. However, androgen levels displayed no significant correlation with depression. These results suggest that the interferon-induced decrease in sexual function is associated, but not causally related, with both androgen reduction and increased depressive symptoms. These findings may affect care for male hepatitis C patients during interferon therapy. PURPOSE We assessed the change in confidence, relationships, and self-esteem, and its correlation with erectile function in men with ED treated with sildenafil citrate in the first United States-based, double-blind, placebo-controlled, randomized trial assessed by the validated SEAR. MATERIALS AND METHODS This 12-week flexible-dose (25, 50 or 100 mg) trial determined change scores from baseline to end of treatment for the 5 SEAR components (Sexual Relationship domain, Confidence domain, Self-Esteem subscale [prespecified as the primary end point], Overall Relationship subscale, and Overall score), and their correlations with the IIEF and event log data, as well as correlations between SEAR components and a general efficacy question at the end of treatment. RESULTS Compared with the placebo group (125 patients, mean age +/- SD 55 +/- 13 years, mean years ED 3.8 +/- 4.2), the sildenafil group (128 patients, mean age +/- SD 56 +/- 12, mean years ED 4.6 +/- 4.3) had significantly greater improvements in all 5 SEAR components (p < 0.0001) and all sexual function measures.
SEAR component scores showed significant correlations with IIEF Erectile Function domain scores (r range 0.34 to 0.69, p < 0.0001), other IIEF domain scores (p < 0.0001), percentage of successful intercourse attempts (p < 0.0001), and frequency of erection that allowed satisfactory intercourse (p < 0.0001). CONCLUSIONS In this study of men with ED, sildenafil produced substantial improvements in self-esteem, confidence, and relationship satisfaction as measured by SEAR scores, which showed moderate to high positive correlations with IIEF scores. The prevalence of and risk factors for HCV and HBV infections in the general population and the predictive value of ALT screening in identifying anti-HCV positive subjects have been evaluated in a small Sicilian town. A random 1:4 sampling from the census of the general population was performed. Anti-HCV, HCV-RNA, HCV genotype, HBsAg, and anti-HBc were tested. The linkage between HCV infection and potential risk factors was evaluated by multiple logistic regression analysis. Among 721 subjects studied, 75 (10.4%) were anti-HCV positive. The HCV infection rate increased from 0.4% in subjects 10-29 years of age to 34% in those > 60 years of age. Among the 75 anti-HCV positive subjects, 66.7% were HCV-RNA positive and 36% had abnormal ALT; in contrast, abnormal ALT levels were found in 4.3% of the 646 anti-HCV negative subjects (P < 0.01). HCV genotype 1b infected the majority (88.0%) of viremic subjects. Exposure to HBV infection (anti-HBc positivity) was found in 11.2% of subjects; HBsAg positivity was 0.7%. At multivariate analysis, two variables were associated with HCV infection: age ≥ 45 years (OR 27.8; 95% CI = 11.0-70.2) and previous hospitalization (OR 2.5; 95% CI = 1.3-4.7). ALT testing had a low positive predictive value (PPV = 49.1%) for HCV infection.
The positive predictive value was good (88%) in people ≥ 60 years of age, but minimal (16.7%) in those below 60. These findings indicate that HCV infection is common in the elderly, perhaps as a result of past iatrogenic transmission. The present low rate of HCV infection among the younger generations, coupled with the low progression of the viral-related liver damage, does not support projections of an increasing burden of HCV-related chronic disease in the next decades. HBV infection, formerly common in this area, is already in sharp decline. In an area of high HCV endemicity, screening of the general population by ALT cannot be used as a surrogate marker to detect HCV infection in those susceptible to treatment. Few studies have assessed the role of pituitary and gonadal hormones in age-related changes in sexual behavior in healthy men. We conducted a retrospective and prospective evaluation of sexual function and behavior in 77 healthy married men aged 45 to 74 years. The subjects were studied in the sleep laboratory for four nights, with the last night devoted to sequential blood sampling every 20 minutes. Significant age-related decreases in sexual desire, sexual arousal, and activity, and increases in erectile problems were noted. Aging was negatively correlated with bioavailable testosterone (bT), was positively correlated with luteinizing hormone (LH), and was not related to total testosterone, estradiol, and prolactin. Bioavailable testosterone and the ratio of bT over LH showed a close association with several sexual behavior dimensions, while total testosterone, estradiol, and prolactin demonstrated few or no behavioral relationships. The age-related effect of bT was, however, a more important determinant of the reported behavioral differences than was the effect of bT independent of age.
There was no evidence that changes in circulating hormones contribute to erectile disorders in healthy aging men. INTRODUCTION Chronic liver diseases are often accompanied by hypogonadism, testicular atrophy, and a reduction in libido, all of which are factors that may contribute to the development of erectile dysfunction (ED). However, large-scale studies investigating the association between ED and viral hepatitis are still sparse. AIM This study aimed to estimate the association between ED and a prior diagnosis of viral hepatitis using a population-based dataset with a case-control design in Taiwan. METHODS We identified 6,429 patients with ED as cases and randomly selected 32,145 subjects as controls. We used conditional logistic regression to compute the odds ratio (OR) for having previously received a diagnosis of viral hepatitis between cases and controls. MAIN OUTCOME MEASURE The prevalence and odds of having been previously diagnosed with hepatitis B, hepatitis C, a coinfection with hepatitis B and C, and viral hepatitis of other etiology were calculated between cases and controls. RESULTS Of the 38,574 sampled subjects, 3,930 (10.2%) had viral hepatitis before the index date; viral hepatitis was found in 900 (14.0%) cases and in 3,030 (9.4%) controls. After adjusting for monthly income, geographic location, hypertension, diabetes, hyperlipidemia, hepatic steatosis, coronary heart disease, obesity, and alcohol abuse/alcohol dependence syndrome, cases were found to be more likely to have prior viral hepatitis than controls (OR = 1.51, 95% confidence interval [CI] = 1.39-1.64, P < 0.001). A much higher proportion of coinfection with viral hepatitis B and C was additionally found among cases (OR = 1.84, 95% CI = 1.72-1.97) than controls. CONCLUSIONS.
We conclude that ED was associated with prior viral hepatitis, especially with a coinfection of hepatitis B and C, after adjusting for potential confounders.
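The crude (unadjusted) odds ratio behind a case-control result like the one above can be recomputed directly from the exposure counts. A minimal Python sketch, assuming the counts stated in the abstract (900 of 6,429 exposed cases, 3,030 of 32,145 exposed controls); note that the study's OR = 1.51 is adjusted for covariates via conditional logistic regression, so the crude value differs slightly:

```python
import math

def odds_ratio_wald_ci(a, b, c, d, z=1.96):
    """Crude odds ratio for the 2x2 table [[a, b], [c, d]]
    (exposed/unexposed cases, exposed/unexposed controls) with a
    Wald 95% confidence interval computed on the log-odds scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# 900 of 6,429 ED cases and 3,030 of 32,145 controls had prior viral hepatitis
or_, lo, hi = odds_ratio_wald_ci(900, 6429 - 900, 3030, 32145 - 3030)
# crude OR ≈ 1.56 (the adjusted OR reported in the study is 1.51)
```

The adjusted interval reported by the study (1.39-1.64) is narrower and shifted slightly lower than this crude Wald interval, which is the expected effect of conditioning on the matched design and covariates.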
2,025
20,140,479
There is convincing evidence to recommend the use of heparins or fondaparinux for the prevention of VTE in selected cancer patients, especially in particular types of malignancies and cancer treatments. Management of VTE in patients with cancer is more challenging, and bleeding complications associated with the use of anticoagulants are significantly higher in cancer patients than in those without malignancy.
Venous thromboembolism (VTE) is a serious and potentially fatal disorder, which is often associated with a significant impact on the quality of life and on the clinical outcome of cancer patients. The pathophysiology of the association between thrombosis and cancer is complex: malignancy is associated with a baseline hypercoagulable state due to many factors, including release of inflammatory cytokines, activation of the clotting system, expression of hemostatic proteins on tumor cells, inhibition of natural anticoagulants, and impaired fibrinolysis. Several risk factors, related to the patient, the disease, and the therapeutic interventions, have been identified as contributing to the occurrence of VTE.
PURPOSE Initial heparinization followed by vitamin K antagonists is the treatment of choice for patients with venous thromboembolism. There is controversy over whether known malignancy is a risk factor for recurrences and bleeding complications during this treatment. Furthermore, the incidence of such events in these patients is dependent on the achieved International Normalized Ratio (INR). The aim of this study was to assess the incidence of venous thromboembolic recurrence and major bleeding among patients with venous thromboembolism in relation to both malignancy and the achieved INR. PATIENTS AND METHODS In a retrospective analysis, the INR-specific incidence of venous thromboembolic and major bleeding events during oral anticoagulant therapy was calculated separately for patients with and without malignancy. Eligible patients participated in two multicenter, randomized clinical trials on the initial treatment of venous thromboembolism. Patients were initially treated with heparin (standard or low-molecular-weight). Treatment with vitamin K antagonists was started within 1 day and continued for 3 months, with a target INR of 2.0 to 3.0. RESULTS In 1,303 eligible patients (264 with malignancy), 35 recurrences and 12 bleeds occurred. Patients with malignancy, compared with nonmalignant patients, had a clinically and statistically significantly increased overall incidence of recurrence (27.1 v 9.0, respectively, per 100 patient-years) as well as bleeding (13.3 v 2.1, respectively, per 100 patient-years). In both groups of patients, the incidence of recurrence was lower when the INR was above 2.0 compared with below 2.0. CONCLUSION Although adequately dosed vitamin K antagonists are effective in patients with malignant disease, the incidence of thrombotic and bleeding complications remains higher than in patients without malignancy. Cancer patients are at high risk for venous thromboembolism (VTE).
Laboratory parameters with a predictive value for VTE could help stratify patients into high- or low-risk groups. The cell adhesion molecule P-selectin was recently identified as a risk factor for VTE. To investigate soluble P-selectin (sP-selectin) in cancer patients as a risk predictor for VTE, we performed a prospective cohort study of 687 cancer patients and followed them for a median (IQR) of 415 (221-722) days. Main tumor entities were malignancies of the breast (n = 125), lung (n = 86), gastrointestinal tract (n = 130), pancreas (n = 42), kidney (n = 19), prostate (n = 72), and brain (n = 80); 91 had hematologic malignancies; 42 had other tumors. VTE occurred in 44 (6.4%) patients. In multivariable analysis, elevated sP-selectin (cutoff level, 53.1 ng/mL, 75th percentile of the study population) was a statistically significant risk factor for VTE after adjustment for age, sex, surgery, chemotherapy, and radiotherapy (hazard ratio = 2.6, 95% confidence interval, 1.4-4.9, P = .003). The cumulative probability of VTE after 6 months was 11.9% in patients with sP-selectin above and 3.7% in those below the 75th percentile (P = .002). High sP-selectin plasma levels independently predict VTE in cancer patients. Measurement of sP-selectin at diagnosis of cancer could help identify patients at increased risk for VTE. Summary Background Data: The epidemiology of venous thromboembolism (VTE) after cancer surgery is based on clinical trials on VTE prophylaxis that used venography to screen for deep vein thrombosis (DVT). However, the clinical relevance of asymptomatic venography-detected DVT is unclear, and the population of these clinical trials is not necessarily representative of the overall cancer surgery population.
Objective: The aim of this study was to evaluate the incidence of clinically overt VTE in a wide spectrum of consecutive patients undergoing surgery for cancer and to identify risk factors for VTE. Methods: @RISTOS was a prospective observational study in patients undergoing general, urologic, or gynecologic surgery. Patients were assessed for clinically overt VTE occurring up to 30 ± 5 days after surgery, or more if the hospital stay was longer than 35 days. All outcome events were evaluated by an independent Adjudication Committee. Results: A total of 2373 patients were included in the study: 1238 (52%) undergoing general, 685 (29%) urologic, and 450 (19%) gynecologic surgery. In-hospital prophylaxis was given in 81.6% and postdischarge prophylaxis in 30.7% of the patients. Fifty patients (2.1%) were adjudicated as affected by clinically overt VTE (DVT, 0.42%; nonfatal pulmonary embolism, 0.88%; death, 0.80%). The incidence of VTE was 2.83% in general surgery, 2.0% in gynecologic surgery, and 0.87% in urologic surgery. Forty percent of the events occurred later than 21 days from surgery. The overall death rate was 1.72%; in 46.3% of the cases, death was caused by VTE. In a multivariable analysis, 5 risk factors were identified: age above 60 years (2.63; 95% confidence interval, 1.21-5.71), previous VTE (5.98; 2.13-16.80), advanced cancer (2.68; 1.37-5.24), anesthesia lasting more than 2 hours (4.50; 1.06-19.04), and bed rest longer than 3 days (4.37; 2.45-7.78). Conclusions: VTE remains a common complication of cancer surgery, with a remarkable proportion of events occurring late after surgery. In patients undergoing cancer surgery, VTE is the most common cause of death at 30 days after surgery. Deep venous thrombosis (DVT) and pulmonary embolism (PE) are well-recognized complications of cancer.
However, our current knowledge of this association is derived from studies conducted more than a decade ago. In light of the changes in medical practice and the improvement in cancer care in recent years, a re-evaluation of the relationship between malignancy and venous thrombosis is in order. Of a total of 1041 patients with solid tumors admitted to 3 major medical centers, 81 (7.8%) were diagnosed with DVT/PE. Patients were more likely to develop DVT/PE during chemotherapy (p = .0001). Advanced malignancies (p = .001), renal carcinoma (p = .005), and pancreatic (p = .001), gastric (p = .014), and brain tumors (p = .001) were independent variables strongly associated with the occurrence of venous thrombosis. The occurrence of thrombotic events in the population tested in this study did not adversely affect survival (p = .082). The study identifies a subset of patients with cancer at high risk for venous thrombosis. Early prophylaxis with anticoagulants in these patients may be warranted. Most importantly, further clinical trials are awaited to detect possible new trends in the frequency and types of thrombotic events and to better define prevention strategies in cancer patients at risk for thrombosis. The association between venous thromboembolic disease (VTD) and malignant neoplastic disorders is well known and has been the subject of several reports for more than a century. There is general agreement among investigators that thrombotic complications in patients with cancer occur at a rather high frequency, and that other circumstantial factors such as surgery or chemotherapy potentiate this risk. However, several important considerations pertaining to cancer and thrombosis continue to be shrouded in controversy. For example, there are inexplicable differences in the proportion of patients with cancer diagnosed with deep venous thrombosis (DVT) or pulmonary embolism (PE) reported in the literature.
In the absence of large well-controlled studies, one may only postulate on the reasons that contributed to these differences. The inclusion of different types of VTD, such as superficial, venous, arterial, or vascular-access-device-induced thrombosis, may have led to overestimation of the incidence of these events in patients with underlying malignancy. Another possible explanation for this discrepancy relates to the use of a variety of diagnostic and methodological criteria, ranging from observation and clinical suspicion to more invasive procedures, resulting in considerable differences in the rate of reported clotting events. Along the same line of discussion, one may argue whether VTD is a random event or constitutes a complication that occurs more commonly in patients with distinct characteristics and certain tumor types. To further investigate the association between malignancy and thrombosis, we evaluated 1041 patients with solid tumors for the risk of DVT/PE. The main objectives of the study were to determine the frequency of DVT/PE based on validated diagnostic criteria and to identify patients with cancer at high risk for developing these thrombotic episodes. We also evaluated the impact of VTD on the survival of these patients. BACKGROUND The B-20 study of the National Surgical Adjuvant Breast and Bowel Project (NSABP) was conducted to determine whether chemotherapy plus tamoxifen would be of greater benefit than tamoxifen alone in the treatment of patients with axillary lymph node-negative, estrogen receptor-positive breast cancer. METHODS Eligible patients (n = 2306) were randomly assigned to one of three treatment groups following surgery. A total of 771 patients with follow-up data received tamoxifen alone; 767 received methotrexate, fluorouracil, and tamoxifen (MFT); and 768 received cyclophosphamide, methotrexate, fluorouracil, and tamoxifen (CMFT).
The Kaplan-Meier method was used to estimate disease-free survival, distant disease-free survival, and survival. Reported P values are two-sided. RESULTS Through 5 years of follow-up, chemotherapy plus tamoxifen resulted in significantly better disease-free survival than tamoxifen alone (90% for MFT versus 85% for tamoxifen [P = .01]; 89% for CMFT versus 85% for tamoxifen [P = .001]). A similar benefit was observed in both distant disease-free survival (92% for MFT versus 87% for tamoxifen [P = .008]; 91% for CMFT versus 87% for tamoxifen [P = .006]) and survival (97% for MFT versus 94% for tamoxifen [P = .05]; 96% for CMFT versus 94% for tamoxifen [P = .03]). Compared with tamoxifen alone, MFT and CMFT reduced the risk of ipsilateral breast tumor recurrence after lumpectomy and the risk of recurrence at other local, regional, and distant sites. Risk of treatment failure was reduced after both types of chemotherapy, regardless of tumor size, tumor estrogen or progesterone receptor level, or patient age; however, the reduction was greatest in patients aged 49 years or less. No subgroup of patients evaluated in this study failed to benefit from chemotherapy. CONCLUSIONS Findings from this and other NSABP studies indicate that patients with breast cancer who meet NSABP protocol criteria, regardless of age, lymph node status, tumor size, or estrogen receptor status, are candidates for chemotherapy. PURPOSE A substantial clinical need exists for an alternative to vitamin K antagonists for treating deep-vein thrombosis in cancer patients, who are at high risk of both recurrent venous thromboembolism and bleeding. Low-molecular-weight heparin, body-weight adjusted, avoids anticoagulant monitoring and has been shown to be more effective than vitamin-K-antagonist therapy. SUBJECTS AND METHODS Subjects were patients with cancer and acute symptomatic proximal-vein thrombosis.
We performed a multicentre randomized, open-label clinical trial using objective outcome measures, comparing long-term therapeutic tinzaparin subcutaneously once daily with usual-care long-term vitamin-K-antagonist therapy for 3 months. Outcomes were assessed at 3 and 12 months. RESULTS Of 200 patients, 100 received tinzaparin and 100 received usual care. At 12 months, the usual-care group had an excess of recurrent venous thromboembolism: 16 of 100 (16%) versus 7 of 100 (7%) receiving low-molecular-weight heparin (P = .044; risk ratio = .44; absolute difference, -9.0; 95% confidence interval [CI], -21.7 to -0.7). Bleeding, largely minor, occurred in 27 patients (27%) receiving tinzaparin and 24 patients (24%) receiving usual care (absolute difference, -3.0; 95% CI, -9.1 to 15.1). In patients without additional risk factors for bleeding at the time of randomization, major bleeding occurred in 0 of 51 patients (0%) receiving tinzaparin and 1 of 48 patients (2.1%) receiving usual care. Mortality at 1 year was high, reflecting the severity of the cancers; 47% in each group died. CONCLUSION Our findings confirm the limited but benchmark data in the literature that long-term low-molecular-weight heparin is more effective than vitamin-K-antagonist therapy for preventing recurrent venous thromboembolism in patients with cancer and proximal venous thrombosis. The activity and safety of the histone deacetylase inhibitor vorinostat (suberoylanilide hydroxamic acid, SAHA) were evaluated in patients with refractory cutaneous T-cell lymphoma (CTCL). Group 1 received vorinostat 400 mg daily, group 2 received vorinostat 300 mg twice daily for 3 days with 4 days rest, and group 3 received vorinostat 300 mg twice daily for 14 days with 7 days rest, followed by 200 mg twice daily. Treatment continued until disease progression or intolerable toxicity.
The primary objective was to determine the complete and partial response (PR) rate. Time to response (TTR), time to progressive disease (TTP), response duration (DOR), pruritus relief, and safety were determined. Thirty-three patients who had received a median of 5 prior therapies were enrolled. Eight patients achieved a PR, including 7 with advanced disease and 4 with Sézary syndrome. The median TTR, DOR, and TTP for responders were 11.9, 15.1, and 30.2 weeks, respectively. Fourteen of 31 evaluable patients had pruritus relief. The most common drug-related AEs were fatigue, thrombocytopenia, diarrhea, and nausea. The most common grade 3 or 4 drug-related AEs were thrombocytopenia and dehydration. Vorinostat demonstrated activity in heavily pretreated patients with CTCL. The 400 mg daily regimen had the most favorable safety profile and is being further evaluated. OBJECTIVE To study the efficacy of daily low-dose aspirin (81 mg orally) in decreasing the incidence of venous thromboembolic events (VTEs) in patients with multiple myeloma receiving pegylated doxorubicin, vincristine, and decreased-frequency dexamethasone, plus thalidomide (DVd-T). PATIENTS AND METHODS In this phase 2 clinical trial of DVd-T, conducted by the Cleveland Clinic Foundation from August 2001 to October 2003, 105 patients were enrolled. The first 35 patients experienced increased numbers of VTEs. von Willebrand levels and platelet aggregation to ristocetin before and after treatment with DVd-T increased significantly, suggesting a pathophysiology involving platelet-endothelial interaction. Aspirin was added to the regimen, thus generating 3 patient groups: group 1 received aspirin from the start of DVd-T treatment, before the study began (58 patients); group 2 received aspirin after the start of DVd-T treatment and after the study began (26 patients); and group 3 did not receive daily low-dose aspirin during the study (19 patients).
Two patients being treated with warfarin for other indications were excluded from the study. The primary end point for this study was the incidence of VTE in the form of either deep venous thrombosis or pulmonary embolism. Secondary end points were the time to the first VTE, time to the composite end point of death or first VTE, and incidence of bleeding complications. RESULTS After a median follow-up of 24 months, on an intent-to-treat basis, 26 posttreatment VTEs occurred after a median of 90 days, with 19% occurring in group 1, 15% in group 2, and 58% in group 3. Following multivariate time-to-event analysis, aspirin use continued to be associated with a lower relative risk of VTE (hazard ratio, 0.22; confidence interval, 0.10-0.47; P < .001) and of the composite end point (hazard ratio, 0.28; confidence interval, 0.15-0.51; P < .001). CONCLUSION Daily low-dose aspirin (81 mg orally) given to patients with newly diagnosed and relapsed/refractory multiple myeloma who were receiving DVd-T reduced the incidence of VTEs without an increase in bleeding complications. PURPOSE AND METHODS Associations between thromboembolism and malignancy, usually widespread, and between thromboembolism and hormonal and/or chemotherapy have been previously reported. We performed a randomized trial of tamoxifen 30 mg/d for 2 years (T) versus T plus 6 months of intravenous chemotherapy with cyclophosphamide, methotrexate, and fluorouracil (CMF) for postmenopausal women with involved axillary nodes and positive estrogen receptor (ER) or progesterone receptor (PgR) status following primary therapy for breast cancer. RESULTS We observed one or more thromboembolic events in 48 of 353 women (13.6%) allocated to receive T plus CMF, in comparison with 5 of 352 women (2.6%) randomized to receive T alone (P < .0001).
Six women in the T plus CMF arm, but none randomized to receive T alone, suffered two thromboembolic events while on study therapy. There were also significantly more women who developed severe (grade 3 to 5) thromboembolic events in the T plus CMF arm than in the T arm (34 v 5; P < .0001). Most thromboembolic events (39 of 54) occurred while women were actually receiving chemotherapy (P < .0001). Thromboembolic complications resulted in more days in hospital and more deaths than any other complication of therapy, including infection, in this trial. CONCLUSION Thromboembolism related to the addition of CMF chemotherapy to tamoxifen as adjuvant therapy in this group of women represents a relatively common and serious complication that may outweigh any benefits offered by this additional therapy. BACKGROUND The use of warfarin sodium for treating venous thromboembolism in patients with cancer is associated with a significant risk of recurrence and bleeding. The use of low-molecular-weight heparin sodium for secondary prevention of venous thromboembolism in cancer patients may reduce the complication rate. OBJECTIVE To determine whether a fixed dose of subcutaneous low-molecular-weight heparin is superior to oral warfarin for the secondary prophylaxis of venous thromboembolism in patients with cancer and venous thromboembolism. METHODS In a randomized, open-label multicenter trial performed between April 1995 and March 1999, we compared subcutaneous enoxaparin sodium (1.5 mg/kg once a day) with warfarin given for 3 months in 146 patients with venous thromboembolism and cancer. MAIN OUTCOME MEASURE A combined outcome event defined as major bleeding or recurrent venous thromboembolism within 3 months.
RESULTS Of the 71 evaluable patients assigned to receive warfarin, 15 (21.1%; 95% confidence interval [CI], 12.3%-32.4%) experienced one major outcome event, compared with 7 (10.5%) of the 67 evaluable patients assigned to receive enoxaparin (95% CI, 4.3%-20.3%; P = .09). There were 6 deaths owing to hemorrhage in the warfarin group compared with none in the enoxaparin group. In the warfarin group, 17 patients (22.7%) died (95% CI, 13.8%-33.8%) compared with 8 (11.3%) in the enoxaparin group (95% CI, 5.0%-21.0%; P = .07). No difference was observed regarding the progression of the underlying cancer or cancer-related death. CONCLUSIONS These results confirm that warfarin is associated with a high bleeding rate in patients with venous thromboembolism and cancer. Prolonged treatment with low-molecular-weight heparin may be as effective as oral anticoagulants and may be safer in these cancer patients. A small proportion of patients with deep vein thrombosis develop recurrent venous thromboembolic complications or bleeding during anticoagulant treatment. These complications may occur more frequently if these patients have concomitant cancer. This prospective follow-up study sought to determine whether, among thrombosis patients, those with cancer have a higher risk of recurrent venous thromboembolism or bleeding during anticoagulant treatment than those without cancer. Of the 842 included patients, 181 had known cancer at entry. The 12-month cumulative incidence of recurrent thromboembolism in cancer patients was 20.7% (95% CI, 15.6%-25.8%) versus 6.8% (95% CI, 3.9%-9.7%) in patients without cancer, for a hazard ratio of 3.2 (95% CI, 1.9-5.4). The 12-month cumulative incidence of major bleeding was 12.4% (95% CI, 6.5%-18.2%) in patients with cancer and 4.9% (95% CI, 2.5%-7.4%) in patients without cancer, for a hazard ratio of 2.2 (95% CI, 1.2-4.1).
Recurrence and bleeding were both related to cancer severity and occurred predominantly during the first month of anticoagulant therapy, but could not be explained by sub- or overanticoagulation. Cancer patients with venous thrombosis are more likely to develop recurrent thromboembolic complications and major bleeding during anticoagulant treatment than those without malignancy. These risks correlate with the extent of cancer. Possibilities for improvement using the current paradigms of anticoagulation seem limited, and new treatment strategies should be developed. Purpose: Hemostatic activation is common in pancreatic cancer and may be linked to angiogenesis and venous thromboembolism. We investigated expression of tissue factor (TF), the prime initiator of coagulation, in noninvasive and invasive pancreatic neoplasia. We correlated TF expression with vascular endothelial growth factor (VEGF) expression, microvessel density, and venous thromboembolism in resected pancreatic cancer. Experimental Design: Tissue cores from a tri-institutional retrospective series of patients were used to build tissue microarrays. TF expression was graded semiquantitatively using immunohistochemistry in normal pancreas (n = 10), intraductal papillary mucinous neoplasms (n = 70), pancreatic intraepithelial neoplasia (n = 40), and resected or metastatic pancreatic adenocarcinomas (n = 130). Results: TF expression was observed in a majority of noninvasive and invasive pancreatic neoplasia, including 77% of pancreatic intraepithelial neoplasias, 91% of intraductal papillary mucinous neoplasms, and 89% of pancreatic cancers, but not in normal pancreas. Sixty-six of 122 resected pancreatic cancers (54%) were found to have high TF expression (defined as grade ≥2, the median score).
Carcinomas with high TF expression were more likely to also express VEGF (80% versus 27% with low TF expression, P < 0.0001) and had a higher median MVD (8 versus 5 per tissue core with low TF expression, P = 0.01). Pancreatic cancer patients with high TF expression had a venous thromboembolism rate of 26.3%, compared with 4.5% in patients with low TF expression (P = 0.04). Conclusions: TF expression occurs early in pancreatic neoplastic transformation and is associated with VEGF expression, increased microvessel density, and possibly clinical venous thromboembolism in pancreatic cancer. Prospective studies evaluating the role of TF in pancreatic cancer outcomes are warranted. Risk of venous thromboembolism (VTE) is elevated in cancer, but individual risk factors cannot identify a sufficiently high-risk group of outpatients for thromboprophylaxis. We developed a simple model for predicting chemotherapy-associated VTE using baseline clinical and laboratory variables. The association of VTE with multiple variables was characterized in a derivation cohort of 2701 cancer outpatients from a prospective observational study. A risk model was derived and validated in an independent cohort of 1365 patients from the same study. Five predictive variables were identified in a multivariate model: site of cancer (2 points for a very high-risk site, 1 point for a high-risk site), platelet count of 350 × 10⁹/L or more, hemoglobin less than 100 g/L (10 g/dL) and/or use of erythropoiesis-stimulating agents, leukocyte count more than 11 × 10⁹/L, and body mass index of 35 kg/m² or more (1 point each). Rates of VTE in the derivation and validation cohorts, respectively, were 0.8% and 0.3% in the low-risk (score = 0), 1.8% and 2% in the intermediate-risk (score = 1-2), and 7.1% and 6.7% in the high-risk (score ≥ 3) category over a median of 2.5 months (C-statistic = 0.7 for both cohorts).
This model can identify patients with a nearly 7% short-term risk of symptomatic VTE and may be used to select cancer outpatients for studies of thromboprophylaxis. Abstract Objective To determine the efficacy and safety of the anticoagulant fondaparinux in older acute medical inpatients at moderate to high risk of venous thromboembolism. Design Double-blind randomised placebo-controlled trial. Setting 35 centres in eight countries. Participants 849 medical patients aged 60 or more admitted to hospital for congestive heart failure, acute respiratory illness in the presence of chronic lung disease, or acute infectious or inflammatory disease, and expected to remain in bed for at least four days. Interventions 2.5 mg fondaparinux or placebo subcutaneously once daily for six to 14 days. Outcome measures The primary efficacy outcome was venous thromboembolism detected by routine bilateral venography along with symptomatic venous thromboembolism up to day 15. Secondary outcomes were bleeding and death. Patients were followed up at one month. Results 425 patients in the fondaparinux group and 414 patients in the placebo group were evaluable for the safety analysis (10 were not treated). 644 patients (75.9%) were available for the primary efficacy analysis. Venous thromboembolism was detected in 5.6% (18/321) of patients treated with fondaparinux and 10.5% (34/323) of patients given placebo, a relative risk reduction of 46.7% (95% confidence interval 7.7% to 69.3%). Symptomatic venous thromboembolism occurred in five patients in the placebo group and none in the fondaparinux group (P = 0.029). Major bleeding occurred in one patient (0.2%) in each group. At the end of follow-up, 14 patients in the fondaparinux group (3.3%) and 25 in the placebo group (6.0%) had died. Conclusion Fondaparinux is effective in the prevention of asymptomatic and symptomatic venous thromboembolic events in older acute medical patients.
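The five-variable point score described above can be sketched as a small function. The abstract does not enumerate which tumor sites belong to the very-high-risk and high-risk tiers, so the tier is taken as a caller-supplied assumption here:

```python
def vte_risk_score(site_risk, platelets, hemoglobin, esa_use, leukocytes, bmi):
    """Chemotherapy-associated VTE risk score from the five predictors:
    cancer-site tier (2 points very high risk, 1 point high risk),
    platelet count >= 350 (10^9/L), hemoglobin < 100 g/L and/or
    erythropoiesis-stimulating agent (ESA) use, leukocyte count > 11
    (10^9/L), and BMI >= 35 kg/m^2 (1 point each)."""
    score = {'very_high': 2, 'high': 1}.get(site_risk, 0)
    score += int(platelets >= 350)
    score += int(hemoglobin < 100 or esa_use)
    score += int(leukocytes > 11)
    score += int(bmi >= 35)
    return score

def risk_category(score):
    """Low (score 0), intermediate (1-2), or high (>= 3)."""
    if score == 0:
        return 'low'
    return 'intermediate' if score <= 2 else 'high'

# A patient meeting every criterion scores 2 + 1 + 1 + 1 + 1 = 5 points
# beyond the site tier's extra point, i.e., 6 total -> high risk.
print(risk_category(vte_risk_score('very_high', 400, 95, True, 12, 36)))
```

The high-risk category (score ≥ 3) corresponds to the roughly 7% short-term VTE rate reported in both cohorts.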
The frequency of major bleeding was similar for fondaparinux- and placebo-treated patients. Patients with early-stage myeloma are typically observed without therapy until symptomatic disease occurs. However, they are at high risk of progression to symptomatic myeloma, with a median time to progression of approximately 1-2 years. We report the final results of a phase II trial of thalidomide as initial therapy for early-stage multiple myeloma, in an attempt to delay progression to symptomatic disease. In total, 31 patients with smoldering or indolent multiple myeloma were studied at the Mayo Clinic. Two patients were deemed ineligible because they were found to have received prior therapy for myeloma, and were excluded from all analyses except for toxicity. Thalidomide was initiated at a starting dose of 200 mg/day. Patients were followed up monthly for the first 6 months and every 3 months thereafter. Of the 29 eligible patients, 10 (34%) had a partial response to therapy, with a reduction of at least 50% in serum and urine monoclonal (M) protein. When minor responses (25-49% decrease in M protein) were included, the response rate was 66%. Three patients had progressive disease while on therapy. Kaplan-Meier estimates of progression-free survival are 80% at 1 year and 63% at 2 years. Major grade 3-4 toxicities included two patients with somnolence and one patient each with neuropathy, deep-vein thrombosis, hearing loss, weakness, sinus bradycardia, and edema. Thalidomide has significant activity in early-stage myeloma and has the potential to delay progression to symptomatic disease. This approach must be further tested in randomized trials. BACKGROUND The incidence of venous thromboembolism after diagnosis of specific cancers and the effect of thromboembolism on survival are not well defined.
METHODS The California Cancer Registry was linked to the California Patient Discharge Data Set to determine the incidence of venous thromboembolism among cancer cases diagnosed between 1993 and 1995. The incidence and timing of thromboembolism within 1 and 2 years of cancer diagnosis and the risk factors associated with thromboembolism and death were determined. RESULTS Among 235,149 cancer cases, 3775 (1.6%) were diagnosed with venous thromboembolism within 2 years: 463 (12%) at the time cancer was diagnosed and 3312 (88%) subsequently. In risk-adjusted models, metastatic disease at the time of diagnosis was the strongest predictor of thromboembolism. Expressed as events per 100 patient-years, the highest incidence of thromboembolism occurred during the first year of follow-up among cases with metastatic-stage pancreatic (20.0), stomach (10.7), bladder (7.9), uterine (6.4), renal (6.0), and lung (5.0) cancer. Adjusting for age, race, and stage, diagnosis of thromboembolism was a significant predictor of decreased survival during the first year for all cancer types (hazard ratios, 1.6-4.2; P < .01). CONCLUSIONS The incidence of venous thromboembolism varied with cancer type and was highest among patients initially diagnosed with metastatic-stage disease. The incidence rate of thromboembolism decreased over time. Diagnosis of thromboembolism during the first year of follow-up was a significant predictor of death for most cancer types and stages analyzed. For some types of cancer, the incidence of thromboembolism was sufficiently high to warrant prospective clinical trials of primary thromboprophylaxis. BACKGROUND The length of time after an episode of venous thromboembolism during which the risk of newly diagnosed cancer is increased is not known, and whether vitamin K antagonists have an antineoplastic effect is controversial.
METHODS In a prospective, randomized study of the duration of oral anticoagulation (six weeks or six months) after a first episode of venous thromboembolism, patients were questioned annually about any newly diagnosed cancer. After a mean follow-up of 8.1 years, we used the Swedish Cancer Registry to identify all diagnoses of cancer and causes of death in the study population. The observed numbers of cases of cancer were compared with expected numbers based on national incidence rates, and the standardized incidence ratios were calculated. RESULTS A first cancer was diagnosed in 111 of 854 patients (13.0 percent) during follow-up. The standardized incidence ratio for newly diagnosed cancer was 3.4 (95 percent confidence interval, 2.2 to 4.6) during the first year after the thromboembolic event and remained between 1.3 and 2.2 for the following five years. Cancer was diagnosed in 66 of 419 patients (15.8 percent) who were treated for six weeks with oral anticoagulants, as compared with 45 of 435 patients (10.3 percent) who were treated for six months (odds ratio, 1.6; 95 percent confidence interval, 1.1 to 2.4). The difference was mainly due to the occurrence of new urogenital cancers, of which there were 28 cases in the six-week group (6.7 percent) and 12 cases in the six-month group (2.8 percent) (odds ratio, 2.5; 95 percent confidence interval, 1.3 to 5.0). The difference in the incidence of cancer between the treatment groups became evident only after two years of follow-up, and it remained significant after adjustment for sex, age, and whether the thromboembolism was idiopathic or nonidiopathic. Older age at the time of the venous thrombosis and an idiopathic thromboembolism were also independent risk factors for a diagnosis of cancer. No difference in the incidence of cancer-related deaths was detected.
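A standardized incidence ratio of the kind reported above is the observed case count divided by the count expected when national incidence rates are applied to the cohort's person-time. A minimal sketch, with hypothetical strata rather than the study's actual registry data:

```python
def expected_cases(strata):
    """Expected case count: person-years in each age/sex stratum
    multiplied by the national incidence rate for that stratum, summed."""
    return sum(person_years * rate for person_years, rate in strata)

def standardized_incidence_ratio(observed, strata):
    """SIR = observed cases / cases expected from reference rates."""
    return observed / expected_cases(strata)

# Hypothetical strata (person-years, rate per person-year): here 10
# cases are expected, so 34 observed cases give an SIR of 3.4.
sir = standardized_incidence_ratio(34, [(2000, 0.002), (3000, 0.002)])
print(sir)  # 3.4
```

An SIR above 1 indicates more cancers than expected for a comparable slice of the general population; the confidence interval excluding 1 is what makes the first-year excess statistically significant.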
CONCLUSIONS The risk of newly diagnosed cancer after a first episode of venous thromboembolism is elevated during at least the following two years. Subsequently, the risk seems to be lower among patients treated with oral anticoagulants for six months than among those treated for six weeks. Venous thromboembolism (VTE) is a common complication in patients with malignant disease. In addition to well-established acquired risk factors for VTE, several genetic risk factors, mainly related to the haemostatic system, are known to influence thrombotic risk. However, the contribution of gene abnormalities to the thrombotic tendency in cancer patients remains poorly explored. We performed a prospective study to evaluate the prevalence and clinical significance of four gene variations (factor V Leiden [FVL], factor II G20210A, factor XIII Val34Leu, and MTHFR C677T) in cancer patients with and without VTE. Enrolled were 211 unrelated and unselected patients (M/F ratio 0.5; mean age 57 years, range 12-91 years) with a diagnosis of cancer, admitted to two university oncology clinics in the city of São Paulo, southeastern Brazil. After admission, all patients were evaluated for the presence of symptoms and signs of VTE. Sixty-four patients (30.3%) had an episode of deep venous thrombosis (DVT) or pulmonary embolism (PE), which was objectively verified; 147 patients (69.7%) had no evidence of VTE. FVL was found at a frequency of 1.5% and 2.7% in the VTE and non-VTE groups, respectively (odds ratio [OR] for VTE 0.6; 95% CI, 0.06-5.3). FII G20210A was found in 1.5% and 1.3% of thrombotic and nonthrombotic patients, respectively, yielding an OR of 1.2 (95% CI, 0.1-13.1). FXIII Val34Leu was detected in 29.6% of the thrombotic patients and in 28.5% of the non-thrombotic patients (OR 1.1; 95% CI, 0.5-2).
MTHFR 677T was present in 53.1% and 60.5% of patients with and without thrombosis, respectively (OR 0.8; 95% CI, 0.4-1.4). The present data do not point to an association between the four polymorphisms investigated here and the risk of VTE in cancer patients. BACKGROUND Indirect evidence suggests that prolonged treatment with warfarin might be associated with a decreased incidence of urogenital cancer. We aimed to assess this association in a large population-based study. METHODS Beneficiaries of Saskatchewan Health who were eligible for prescription drug benefits and aged 50 years or over, with no history of cancer since 1967, were enrolled into a nested, matched case-control study. 19,412 new cases of urogenital cancer diagnosed between Jan 1, 1981, and Dec 31, 2002, were identified by use of information from the Saskatchewan Cancer Agency registry. For each case, six controls, totalling 116,470, matched for age, sex, and time of diagnosis, were selected randomly. Conditional logistic regression analysis was used to calculate adjusted incidence rates of urogenital cancer in relation to warfarin use. FINDINGS Compared with men who never used warfarin, men with 4 years of warfarin use had an adjusted rate ratio of 0.80 (95% CI, 0.65-0.99). For warfarin use 76-100% of the time, the adjusted rate ratios were 0.80 (0.66-0.96) during year 2 preceding the diagnosis of prostate cancer, 0.76 (0.62-0.94) during year 3, and 0.67 (0.53-0.86) during year 4. No significant association was found between warfarin and the risk of other urogenital cancers. INTERPRETATION Our results suggest that warfarin has an antitumour effect that is specific to prostate cancer.
Further investigation, with more complete assessment of confounders and addressing the effect of warfarin on prostate cancer mortality, is warranted. PURPOSE To determine if thalidomide plus dexamethasone yields superior response rates compared with dexamethasone alone as induction therapy for newly diagnosed multiple myeloma. PATIENTS AND METHODS Patients were randomly assigned to receive thalidomide plus dexamethasone or dexamethasone alone. Patients in arm A received thalidomide 200 mg orally for 4 weeks; dexamethasone was administered at a dose of 40 mg orally on days 1 to 4, 9 to 12, and 17 to 20. Cycles were repeated every 4 weeks. Patients in arm B received dexamethasone alone at the same schedule as in arm A. RESULTS Two hundred seven patients were enrolled: 103 were randomly assigned to thalidomide plus dexamethasone and 104 to dexamethasone alone; eight patients were ineligible. The response rate with thalidomide plus dexamethasone was significantly higher than with dexamethasone alone (63% v 41%, respectively; P = .0017). The response rate allowing for use of serum monoclonal protein levels when a measurable urine monoclonal protein was unavailable at follow-up was 72% v 50%, respectively. The incidence rates of grade 3 or higher deep vein thrombosis (DVT), rash, bradycardia, neuropathy, and any grade 4 to 5 toxicity in the first 4 months were significantly higher with thalidomide plus dexamethasone compared with dexamethasone alone (45% v 21%, respectively; P < .001). DVT was more frequent in arm A than in arm B (17% v 3%); grade 3 or higher peripheral neuropathy was also more frequent (7% v 4%, respectively). CONCLUSION Thalidomide plus dexamethasone demonstrates significantly superior response rates in newly diagnosed myeloma compared with dexamethasone alone.
However, this must be balanced against the greater toxicity seen with the combination. BACKGROUND Tamoxifen, taken for five years, is the standard adjuvant treatment for postmenopausal women with primary, estrogen-receptor-positive breast cancer. Despite this treatment, however, some patients have a relapse. METHODS We conducted a double-blind, randomized trial to test whether, after two to three years of tamoxifen therapy, switching to exemestane was more effective than continuing tamoxifen therapy for the remainder of the five years of treatment. The primary end point was disease-free survival. RESULTS Of the 4742 patients enrolled, 2362 were randomly assigned to switch to exemestane, and 2380 to continue to receive tamoxifen. After a median follow-up of 30.6 months, 449 first events (local or metastatic recurrence, contralateral breast cancer, or death) were reported: 183 in the exemestane group and 266 in the tamoxifen group. The unadjusted hazard ratio in the exemestane group as compared with the tamoxifen group was 0.68 (95 percent confidence interval, 0.56 to 0.82; P < 0.001 by the log-rank test), representing a 32 percent reduction in risk and corresponding to an absolute benefit in terms of disease-free survival of 4.7 percent (95 percent confidence interval, 2.6 to 6.8) at three years after randomization. Overall survival was not significantly different in the two groups, with 93 deaths occurring in the exemestane group and 106 in the tamoxifen group. Severe toxic effects of exemestane were rare. Contralateral breast cancer occurred in 20 patients in the tamoxifen group and 9 in the exemestane group (P = 0.04).
CONCLUSIONS Exemestane therapy after two to three years of tamoxifen therapy significantly improved disease-free survival as compared with the standard five years of tamoxifen treatment. BACKGROUND The efficacy and safety of thromboprophylaxis in patients with acute medical illnesses who may be at risk for venous thromboembolism have not been determined in adequately designed trials. METHODS In a double-blind study, we randomly assigned 1102 hospitalized patients older than 40 years to receive 40 mg of enoxaparin, 20 mg of enoxaparin, or placebo subcutaneously once daily for 6 to 14 days. Most patients were not in an intensive care unit. The primary outcome was venous thromboembolism between days 1 and 14, defined as deep-vein thrombosis detected by bilateral venography (or duplex ultrasonography) between days 6 and 14 (or earlier if clinically indicated) or documented pulmonary embolism. The duration of follow-up was three months. RESULTS The primary outcome could be assessed in 866 patients. The incidence of venous thromboembolism was significantly lower in the group that received 40 mg of enoxaparin (5.5 percent [16 of 291 patients]) than in the group that received placebo (14.9 percent [43 of 288 patients]) (relative risk, 0.37; 97.6 percent confidence interval, 0.22 to 0.63; P < 0.001). The benefit observed with 40 mg of enoxaparin was maintained at three months. There was no significant difference in the incidence of venous thromboembolism between the group that received 20 mg of enoxaparin (43 of 287 patients [15.0 percent]) and the placebo group. The incidence of adverse effects did not differ significantly between the placebo group and either enoxaparin group. By day 110, 50 patients had died in the placebo group (13.9 percent), 51 had died in the 20-mg group (14.7 percent), and 41 had died in the 40-mg group (11.4 percent); the differences were not significant.
CONCLUSIONS Prophylactic treatment with 40 mg of enoxaparin subcutaneously per day safely and effectively reduces the risk of venous thromboembolism in patients with acute medical illnesses. BACKGROUND There is limited information about risk factors for venous thromboembolism (VTE) in acutely ill hospitalized general medical patients. METHODS An international, randomized, double-masked, placebo-controlled trial (MEDENOX) has previously been conducted in 1102 acutely ill, immobilized general medical patients and has shown the efficacy of a low-molecular-weight heparin, enoxaparin sodium, in preventing thrombosis. We performed logistic regression analysis to evaluate the independent nature of different types of acute medical illness (heart failure, respiratory failure, infection, rheumatic disorder, and inflammatory bowel disease) and predefined factors (chronic heart and respiratory failure, age, previous VTE, and cancer) as risk factors for VTE. RESULTS The primary univariate analysis showed that the presence of an acute infectious disease, age older than 75 years, cancer, and a history of VTE were statistically significantly associated with an increased VTE risk. Multiple logistic regression analysis indicated that these factors were independently associated with VTE. CONCLUSIONS Several independent risk factors for VTE were identified. These findings allow recognition of individuals at increased risk of VTE and will contribute to the formulation of an evidence-based risk assessment model for thromboprophylaxis in hospitalized general medical patients. OBJECTIVES Recombinant epoetin alfa and darbepoetin alfa (r-HuEPO) have been shown to be safe and effective treatments for anemia, but recent reports have suggested an increased risk of thromboembolic events when these agents are used to treat chemotherapy-induced anemia among patients with breast or ovarian cancer.
We examined the possible risk of such events among patients with ovarian or primary peritoneal carcinomas and chemotherapy-induced anemia. METHODS We retrospectively analyzed data over 10 years from women at one hospital with ovarian or primary peritoneal carcinoma and chemotherapy-induced anemia. The incidence of and odds ratio for development of deep venous thrombosis, unadjusted and adjusted for baseline differences and risk factors, were assessed between patients who had received r-HuEPO versus no treatment for anemia. RESULTS Of the 364 women, 90 had received r-HuEPO and 253 had not. The incidence of deep venous thrombosis was 6.7% in the group that had received r-HuEPO and 5.1% in the group that had not (unadjusted odds ratio, 1.31; 95% confidence interval [CI], 0.48-3.55). After adjustment for differences in age, body-mass index, prior thromboembolic disease or cancer, and tobacco use, the odds ratio for developing deep venous thrombosis with the use of r-HuEPO was 1.35 (95% CI, 0.49-3.75). CONCLUSIONS The use of r-HuEPO was not associated with an increased risk of deep venous thrombosis in this population. A randomized trial is needed to further explore this issue and to detail the safety and efficacy of these agents in patients with various other cancers. BACKGROUND Fatal pulmonary embolism and other thromboembolic complications are common in hospital inpatients. However, there is little evidence on the routine use of pharmacological thromboprophylaxis in non-surgical patients. We assessed the efficacy and safety of low-dose heparin in the prevention of hospital-acquired, clinically relevant, fatal pulmonary embolism in patients with infectious diseases. METHODS Our study used the postrandomisation consent design.
19,751 consecutive patients, aged 55 years or older, admitted to departments of infectious diseases in six Swedish hospitals, were screened for inclusion in the randomised, controlled, unblinded, multicentre trial. Of the eligible patients, 5776 were assigned subcutaneous standard heparin (5000 IU every 12 h) until hospital discharge or for a maximum of 3 weeks; 5917 were assigned no prophylactic treatment (control group). We sought consent only from the heparin group. Follow-up was for 3 weeks after discharge from hospital or for a maximum of 60 days from randomisation. The primary endpoint was necropsy-verified pulmonary embolism of predefined clinical relevance. FINDINGS By intention-to-treat analysis, mortality was similar in the heparin and control groups (5.3% vs 5.6%, p = 0.39) and the median time from admission to death was 16 days in both groups (IQR 8-31 vs 6-28 days). Necropsy-verified pulmonary embolism occurred in 15 heparin-treated and 16 control-group patients. There was a significant difference between the heparin and control groups in median time from randomisation to fatal pulmonary embolism (28 [24-36] vs 12.5 [10-20] days, p = 0.007). This difference corresponds to the duration of heparin prophylaxis. Non-fatal thromboembolic complications occurred in more of the control group than of the heparin group (116 vs 70, p = 0.0012). INTERPRETATION Our findings do not support the routine use of heparin prophylaxis for 3 weeks or less in large groups of non-surgical patients. Further studies are needed to investigate whether heparin prophylaxis of longer duration may prevent fatal pulmonary embolism. PURPOSE The extent of venous thromboembolism (VTE) associated with central vein catheters (CVC) in cancer patients remains unclear. The aim of this study was to evaluate the efficacy and safety of the low molecular weight heparin enoxaparin in the prevention of VTE.
PATIENTS AND METHODS In a multicenter, double-blind study, consecutive cancer patients scheduled for CVC insertion were randomly assigned to receive either subcutaneous enoxaparin 40 mg once a day or placebo. Treatment was started 2 hours before CVC insertion and continued for 6 weeks. The primary end points of the study were deep vein thrombosis (DVT), confirmed by venography of the CVC limb performed 6 weeks after randomization, or clinically overt pulmonary embolism, confirmed by objective testing during study drug administration. Patients were assessed for bleeding complications. RESULTS Three hundred eighty-five patients were randomized, of whom 321 (83.4%) underwent venography. The venography was adequate for adjudication in 155 patients in each treatment group. A DVT was observed in 22 patients (14.1%) treated with enoxaparin and in 28 patients (18.0%) treated with placebo, corresponding to a relative risk of 0.78 (95% CI, 0.47 to 1.31). No major bleeding occurred. Five patients (2.6%) in the enoxaparin group and two patients (1.0%) in the placebo group died during the treatment period. CONCLUSION In this study, no difference in the rate of CVC-related VTE was detected between patients receiving enoxaparin and patients receiving placebo. The dose of enoxaparin used in this study proved to be safe. Clinical trials evaluating higher enoxaparin doses could optimize the efficacy of this agent for this indication. Deep venous thrombosis of the lower extremity is a serious disorder; the estimated incidence is 1 per 1000 persons per year [1-3]. The disease can occur after surgical procedures and trauma and in the presence of cancer or inherited coagulation disorders; it can also develop without any of these factors [3].
The clinical course of deep venous thrombosis might be complicated by pulmonary embolism, recurrent episodes of deep venous thrombosis, and the development of serious post-thrombotic sequelae, such as venous ulceration, debilitating pain, and intractable edema [3]. Patients with deep venous thrombosis are usually treated with an initial course of heparin (5 to 10 days) followed by 3 to 6 months of oral anticoagulant therapy. This treatment regimen reduces the risk for short-term thromboembolic complications to approximately 5% [4, 5]. The long-term risk for recurrent venous thromboembolism and the incidence and severity of post-thrombotic sequelae in patients with symptomatic deep venous thrombosis have not been well documented. In a recent large randomized clinical trial comparing 6 weeks of oral anticoagulant therapy with 6 months of therapy [6], patients with symptomatic deep venous thrombosis were followed for 2 years for recurrences and death. This trial showed a substantial reduction in the risk for recurrent venous thromboembolism among patients in the 6-month oral anticoagulant group, but the investigators did not report on the occurrence of the post-thrombotic syndrome. Another recent study [7] reported the 8-year incidence of recurrences and post-thrombotic manifestations in patients with confirmed symptomatic deep venous thrombosis. However, only a few patients were included in this study, and data were collected retrospectively. We assessed the clinical course of a first episode of symptomatic deep venous thrombosis in a large consecutive series of patients who had long-term follow-up. We assessed mortality and the long-term incidences of recurrent venous thromboembolism and the post-thrombotic syndrome. We also evaluated the potential risk factors for these three outcomes.
Methods. Identification of Inception Cohort. The Department of Internal Medicine of the University of Padua, Padua, Italy, is a diagnostic facility for outpatients with clinically suspected venous thromboembolism in a community of approximately 350,000 persons. All consecutive outpatients with a first episode of clinically suspected deep venous thrombosis who were referred by their general practitioners between January 1986 and December 1991 had noninvasive testing [8]. Patients were potentially eligible for the study if confirmatory venography showed deep venous thrombosis. Patients were excluded from the study if they had been referred because of recurrent venous thrombosis, were geographically inaccessible for follow-up, or refused to give informed consent. The Institutional Review Board of the hospital of the University of Padua approved the study. Baseline Assessment. At the time of referral, demographic characteristics were recorded and a medical history was taken; information was elicited on the period between the onset of symptoms and presentation to the thrombosis service (patient-physician delay), the presence of risk factors for thrombosis (that is, cancer, surgery, trauma or fracture, immobilization for more than 7 days, pregnancy or childbirth, or estrogen use), and symptoms of pulmonary embolism. Information was also obtained on the history of venous thromboembolism in first-degree relatives. Antithrombin, protein C and S, and lupus-like anticoagulant levels were subsequently measured. Assays were done, and previously described criteria for abnormality and deficiency were used [9]. The venograms obtained at baseline were divided into those representing proximal venous thrombosis (with or without concurrent venous thrombosis of the calf) and those indicating isolated venous thrombosis of the calf.
Proximal venous thrombosis was defined as thrombosis located above the trifurcation of the calf veins that involved at least the popliteal vein, superficial femoral vein, common femoral vein, or iliac vein. The location and occlusiveness of proximal thrombi were also determined. A patient was considered to have nonocclusive deep venous thrombosis if contrast material was seen between the thrombus and the vessel wall along the entire thrombus. Treatment. Patients were admitted to the hospital and treated with an initial course of high-dose intravenous standard heparin (a bolus of 5000 U followed by continuous infusion of 30,000 U/d, subsequently adjusted to maintain an activated partial thromboplastin time between 1.5 and 2.5 times the normal value) or subcutaneous low-molecular-weight heparin (90 U of anti-factor Xa/kg of body weight twice daily). Therapy with oral anticoagulant agents (warfarin) was started on day 5 to 7 of treatment and was continued for 3 months. The oral anticoagulant dose was adjusted daily to maintain an international normalized ratio between 2.0 and 3.0. Treatment with low-molecular-weight heparin was discontinued on day 10 or later if the international normalized ratio was less than 2.0. This treatment strategy deviated in the following groups of patients: those with cancer, protein deficiencies, or lupus anticoagulant, in whom oral anticoagulation therapy was prolonged; those with small isolated venous thrombosis of the calf, who received oral anticoagulation alone; those with contraindications to anticoagulant treatment, who received no treatment or an inferior caval-vein filter; those who refused to be hospitalized, who received low-dose heparin and oral anticoagulant agents; and those with threatened viability of the leg, who received thrombolytic therapy. The actual type and duration of treatments were recorded.
All patients were instructed to wear elastic graduated compression stockings (providing 40 mm Hg of pressure at the ankle) for at least 2 years. Follow-up. All patients were seen 3 and 6 months after the initial referral and thereafter returned to the study center every 6 months for follow-up assessments. Patients were asked to return to the thrombosis center immediately if symptoms suggestive of recurrent venous thromboembolism developed. Follow-up was continued for as long as 8 years or until July 1995. To avoid diagnostic suspicion bias, the medical history on general health, symptoms of recurrent venous thromboembolism, and the post-thrombotic syndrome was obtained by using a standardized form. Patients who could not attend the follow-up sessions were visited at home. For all patients who died during follow-up, the date and cause of death were documented. Diagnosis of Recurrent Venous Thromboembolism and Hemorrhage. Contrast venography of the symptomatic leg or legs was done as described previously [10]. The criteria for deep venous thrombosis were a constant intraluminal filling defect confirmed in at least two different projections, or nonvisualization of a vein or a segment thereof despite adequate technique and repeated injections of contrast material. The presence or absence of venous thrombosis was assessed by a panel of independent observers who were unaware of the patient's other clinical features or previous test results. If a patient presented with clinically suspected recurrent venous thrombosis of the leg, venography was done. The criterion for recurrent venous thrombosis of the leg was a new intraluminal filling defect on the venogram. If the venogram was not diagnostic, recurrent venous thrombosis was diagnosed on the basis of an abnormal 125I-fibrinogen leg scan or results of noninvasive tests that had changed from normal to abnormal [11, 12].
Patients with suspected pulmonary embolism had venography if they had concurrent leg symptoms or perfusion lung scanning in the absence of leg symptoms . Pulmonary embolism was excluded if the perfusion scan was normal . Because ventilation lung scanning was not available during the first years of the study and because pulmonary angiography could not routinely be done , we could not definitively diagnose pulmonary embolism in some patients . If a definitive diagnosis could not be made , patients were classified as not having recurrent venous thromboembolism . Perfusion lung scanning and pulmonary angiography were done and their results were interpreted according to st and ard procedures [ 13 ] . Hemorrhagic episodes were classified as major or minor , as reported previously [ 14 ] . The documentation of all patients suspected of having a recurrent venous thromboembolic or bleeding event was review ed by a three-member adjudication committee that was unaware of further clinical details of the patient . Criteria for the Post-Thrombotic Syndrome Presence of the post-thrombotic syndrome was assessed by investigators who were unaware of previous post-thrombotic manifestations and further clinical details of the patient . The presence of leg symptoms ( pain , cramps , heaviness , pruritus , and paresthesia ) and signs ( pretibial edema , in duration of the skin , hyperpigmentation , new venous ectasia , redness , and pain during calf compression ) was scored . For each item , the investigators assigned a score of 0 ( not present or minimal ) to 3 ( severe ) . The presence of a venous ulcer of the lower limb was recorded . In patients with bilateral thrombosis , the higher score was used . A total score of 15 or more on two consecutive visits or the presence of a venous ulcer indicated severe post-thrombotic syndrome , and a total score of 5 to 14 on two consecutive visits indicated mild post-thrombotic syndrome . 
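The severity rule just described (item scores summed per visit; a total of 15 or more on two consecutive visits or a venous ulcer means severe, 5 to 14 on two consecutive visits means mild) can be sketched as a small classifier. How visits falling into different bands should be combined is not specified in the text, so that handling is an assumption here:

```python
def pts_severity(visit1_scores, visit2_scores, venous_ulcer=False):
    """Classify post-thrombotic syndrome from two consecutive visits.
    Each scores argument is a list of 0-3 item ratings (symptoms and signs)."""
    t1, t2 = sum(visit1_scores), sum(visit2_scores)
    if venous_ulcer or (t1 >= 15 and t2 >= 15):
        return "severe"
    if 5 <= t1 <= 14 and 5 <= t2 <= 14:
        return "mild"
    return "none"  # assumption: mixed or sub-threshold visits -> no PTS

print(pts_severity([3, 3, 3, 3, 3], [3, 3, 3, 3, 3]))  # severe
print(pts_severity([1, 2, 2, 1, 1], [2, 2, 1, 1, 2]))  # mild
print(pts_severity([1, 0, 1, 0, 0], [0, 1, 0, 0, 0]))  # none
```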
This score has been shown to have good reproducibility, and it correlates well with the patient's perception of the interference of leg symptoms with daily life [15]. Statistical Analysis. We calculated Kaplan-Meier estimates and 95% CIs for a visual assessment of survival and calculated the risk for recurrent venous thromboembolism and mild and severe post-thrombotic syndrome. PURPOSE Thrombosis of long-term central venous catheters (CVC) is a serious complication that causes morbidity and interrupts the infusion of chemotherapy, intravenous medication, and blood products. We performed a prospective study to examine the incidence, risk factors, and long-term complications of symptomatic catheter-related thrombosis (CRT) in adults with cancer. PATIENTS AND METHODS Consecutive patients with cancer undergoing insertion of a CVC were enrolled and prospectively followed while their catheter remained in place plus 4 subsequent weeks, or for a maximum of 52 weeks, whichever came first. Patients with symptomatic CRT were followed for an additional 52 weeks from the date of CRT diagnosis. The end points were symptomatic CRT, symptomatic pulmonary embolism (PE), postphlebitic syndrome, and catheter life span. RESULTS Over 76,713 patient-days of follow-up, 19 of 444 patients (4.3%) had symptomatic CRT in 19 of 500 catheters (0.3 per 1,000 catheter-days). The median time to CRT was 30 days and the median catheter life span was 88 days. Significant baseline risk factors for CRT were: more than one insertion attempt (odds ratio [OR] = 5.5; 95% CI, 1.2 to 24.6; P = .03); ovarian cancer (OR = 4.8; 95% CI, 1.5 to 15.1; P = .01); and previous CVC insertion (OR = 3.8; 95% CI, 1.4 to 10.4; P = .01). Nine of the 19 CRT patients were treated with anticoagulants alone, eight were treated with anticoagulants and catheter removal, and two did not receive anticoagulation. None had recurrent CRT or symptomatic PE.
Postphlebitic symptoms were infrequent. CONCLUSION In adults with cancer, the incidence of symptomatic CRT is low and long-term complications are uncommon. OBJECTIVES Although effective strategies for the prevention of venous thromboembolism (VTE) are widely available, a significant number of patients still develop VTE because appropriate thromboprophylaxis is not correctly prescribed. We conducted this study to estimate the risk profile for VTE and the employment of adequate thromboprophylaxis procedures in patients admitted to hospitals in the state of São Paulo, Brazil. METHODS Four hospitals were included in this study. Data on risk factors for VTE and prescription of pharmacological and non-pharmacological thromboprophylaxis were collected from 1454 randomly chosen patients (589 surgical and 865 clinical). Case report forms were filled in according to medical and nursing records. Physicians were unaware of the survey. Three risk assessment models were used: the American College of Chest Physicians (ACCP) guidelines, the Caprini score, and the International Union of Angiology Consensus Statement (IUAS). The ACCP score classifies VTE risk in surgical patients and the others classify VTE risk in surgical and clinical patients. Contingency tables were built presenting the joint distribution of the risk score and the prescription of any pharmacological and non-pharmacological thromboprophylaxis (yes or no). RESULTS According to the Caprini score, 29% of the patients at the highest risk for VTE were not prescribed any thromboprophylaxis. Considering the patients at moderate, high, or highest risk who should have been receiving prophylaxis, 37% and 29% were not prescribed thromboprophylaxis according to the ACCP (surgical patients) and IUAS risk scores, respectively. In contrast, 27% and 42% of the patients at low risk of VTE, according to the Caprini and IUAS scores, respectively, had thromboprophylaxis prescribed.
CONCLUSION Despite the existence of several guidelines, this study demonstrates that adequate thromboprophylaxis is not correctly prescribed: high-risk patients are under-treated and low-risk patients are over-treated. This situation must be changed to ensure that patients receive adequate treatment for the prevention of thromboembolism. Cancer patients undergoing surgery are at a high risk of venous thromboembolism, but few studies have described the rate of autopsy-confirmed fatal pulmonary embolism after heparin thromboprophylaxis. In a post hoc analysis of a randomized study (MC-4), which compared the efficacy and safety of certoparin (3000 anti-Xa IU, subcutaneously, once daily) with unfractionated heparin (5000 IU, subcutaneously, three times daily) in 23,078 patients undergoing surgery lasting more than 30 min, the incidence of autopsy-confirmed fatal pulmonary embolism, death, and bleeding in the cancer patients (n = 6124) was compared with non-cancer patients (n = 16,954). Fatal pulmonary embolism was significantly more frequent in cancer patients (0.33% [20/6124]) than in non-cancer patients (0.09% [15/16,954]; relative risk [RR] 3.7 [95% confidence interval (CI) 1.80, 7.77], p = 0.0001) at 14 days post-prophylaxis. Perioperative mortality was also significantly higher in cancer patients than in non-cancer patients (3.14% [192/6124] vs. 0.71% [120/16,954]; RR 4.54 [95% CI 3.59, 5.76], p = 0.0001), as were blood loss (p < 0.0001) and transfusion requirements (p < 0.0001). Prevention of venous thromboembolism in cancer surgical patients remains a clinical challenge. PURPOSE Patients with advanced pancreatic cancer have a poor prognosis and there have been no improvements in survival since the introduction of gemcitabine in 1996. Pancreatic tumors often overexpress human epidermal growth factor receptor type 1 (HER1/EGFR) and this is associated with a worse prognosis.
We studied the effects of adding the HER1/EGFR-targeted agent erlotinib to gemcitabine in patients with unresectable, locally advanced, or metastatic pancreatic cancer. PATIENTS AND METHODS Patients were randomly assigned 1:1 to receive standard gemcitabine plus erlotinib (100 or 150 mg/d orally) or gemcitabine plus placebo in a double-blind, international phase III trial. The primary end point was overall survival. RESULTS A total of 569 patients were randomly assigned. Overall survival based on an intent-to-treat analysis was significantly prolonged on the erlotinib/gemcitabine arm, with a hazard ratio (HR) of 0.82 (95% CI, 0.69 to 0.99; P = .038, adjusted for stratification factors; median 6.24 months v 5.91 months). One-year survival was also greater with erlotinib plus gemcitabine (23% v 17%; P = .023). Progression-free survival was significantly longer with erlotinib plus gemcitabine, with an estimated HR of 0.77 (95% CI, 0.64 to 0.92; P = .004). Objective response rates were not significantly different between the arms, although more patients on erlotinib had disease stabilization. There was a higher incidence of some adverse events with erlotinib plus gemcitabine, but most were grade 1 or 2. CONCLUSION To our knowledge, this randomized phase III trial is the first to demonstrate statistically significantly improved survival in advanced pancreatic cancer by adding any agent to gemcitabine. The recommended dose of erlotinib with gemcitabine for this indication is 100 mg/d. PURPOSE Studies in cancer patients with venous thromboembolism suggested that low molecular weight heparin may prolong survival. In a double-blind study, we evaluated the effect of low molecular weight heparin on survival in patients with advanced malignancy without venous thromboembolism. METHODS Patients with metastasized or locally advanced solid tumors were randomly assigned to receive a 6-week course of subcutaneous nadroparin or placebo.
The primary efficacy analysis was based on time from random assignment to death. The primary safety outcome was major bleeding. RESULTS In total, 148 patients were allocated to nadroparin and 154 to placebo. Mean follow-up was 1 year. In the intention-to-treat analysis the overall hazard ratio of mortality was 0.75 (95% CI, 0.59 to 0.96), with a median survival of 8.0 months in the nadroparin recipients versus 6.6 months in the placebo group. After adjustment for potential confounders, the treatment effect remained statistically significant. Major bleeding occurred in five (3%) of the nadroparin-treated patients and in one (1%) of the placebo recipients (P = .12). In the a priori specified subgroup of patients with a life expectancy of 6 months or more at enrollment, the hazard ratio was 0.64 (95% CI, 0.45 to 0.90), with a median survival of 15.4 and 9.4 months, respectively. For patients with a shorter life expectancy, the hazard ratio was 0.88 (95% CI, 0.62 to 1.25). CONCLUSION A brief course of subcutaneous low molecular weight heparin favorably influences survival in patients with advanced malignancy and deserves additional clinical evaluation. PURPOSE In experimental systems, interference with coagulation can affect tumor biology. Furthermore, it has been suggested that low molecular weight heparin therapy may prolong survival in patients with cancer. The primary aim of this study was to assess survival at 1 year of patients with advanced cancer. PATIENTS AND METHODS Patients with advanced malignancy (N = 385) were randomly assigned to receive either a once-daily subcutaneous injection of dalteparin (5,000 IU), a low molecular weight heparin, or placebo for 1 year.
RESULTS The Kaplan-Meier survival estimates at 1 , 2 , and 3 years after randomization for patients receiving dalteparin were 46 % , 27 % , and 21 % , respectively , compared with 41 % , 18 % , and 12 % , respectively , for patients receiving placebo ( P = .19 ) . In an analysis not specified a priori , survival was examined in a subgroup of patients ( dalteparin , n = 55 ; and placebo , n = 47 ) who had a better prognosis and who were alive 17 months after randomization . In these patients , Kaplan-Meier survival estimates at 2 and 3 years from randomization were significantly improved for patients receiving dalteparin versus placebo ( 78 % v 55 % and 60 % v 36 % , respectively , P = .03 ) . The rates of symptomatic venous thromboembolism were 2.4 % and 3.3 % for dalteparin and placebo , respectively , with bleeding rates of 4.7 % and 2.7 % , respectively . CONCLUSION Dalteparin administration did not significantly improve 1-year survival rates in patients with advanced malignancy . However , the observed improved survival in a subgroup of patients with a better prognosis suggests a potential modifying effect of dalteparin on tumor biology Patients receiving chemotherapy for metastatic breast cancer are at high risk of thromboembolic disease . Long-term oral anticoagulant therapy is needed but increases the risk of haemorrhagic complications . We have assessed the safety and efficacy of warfarin in very low doses as prophylaxis . Women receiving chemotherapy for metastatic breast cancer were randomly assigned either very-low-dose warfarin ( 152 patients ) or placebo ( 159 ) . The warfarin dose was 1 mg daily for 6 weeks and was then adjusted to maintain the prothrombin time at an international normalised ratio ( INR ) of 1.3 to 1.9 . Study treatment continued until 1 week after the end of chemotherapy . The average daily dose from initiation of titration was 2.6 ( SD 1.2 ) mg for the warfarin group and the mean INR was 1.52 .
The mean time at risk of thrombosis was 199 ( 126 ) days for warfarin-treated patients and 188 ( 137 ) days for placebo recipients ( p = 0.45 ) . There were 7 thromboembolic events ( 6 deep-vein thrombosis , 1 pulmonary embolism ) in the placebo group and 1 ( pulmonary embolism ) in the warfarin group , a relative risk reduction of about 85 % ( p = 0.031 ) . Major bleeding occurred in 2 placebo recipients and 1 warfarin-treated patient . There was no detectable difference in survival between the treatment groups . Very-low-dose warfarin is a safe and effective method for prevention of thromboembolism in patients with metastatic breast cancer who are receiving chemotherapy Background —Considerable variability exists in the use of pharmacological thromboprophylaxis among acutely ill medical patients , partly because clinically relevant end points have not been fully assessed in this population . We undertook an international , multicenter , randomized , double-blind , placebo-controlled trial using clinically important outcomes to assess the efficacy and safety of dalteparin in the prevention of venous thromboembolism in such patients . Methods and Results — Patients ( n=3706 ) were randomly assigned to receive either subcutaneous dalteparin 5000 IU daily or placebo for 14 days and were followed up for 90 days . The primary end point was venous thromboembolism , defined as the combination of symptomatic deep vein thrombosis , symptomatic pulmonary embolism , and asymptomatic proximal deep vein thrombosis detected by compression ultrasound at day 21 and sudden death by day 21 . The incidence of venous thromboembolism was reduced from 4.96 % ( 73 of 1473 patients ) in the placebo group to 2.77 % ( 42 of 1518 patients ) in the dalteparin group , an absolute risk reduction of 2.19 % or a relative risk reduction of 45 % ( relative risk , 0.55 ; 95 % CI , 0.38 to 0.80 ; P=0.0015 ) . The observed benefit was maintained at 90 days .
The overall incidence of major bleeding was low but higher in the dalteparin group ( 9 patients ; 0.49 % ) compared with the placebo group ( 3 patients ; 0.16 % ) . Conclusions —Dalteparin 5000 IU once daily halved the rate of venous thromboembolism with a low risk of bleeding The optimal long-term treatment of acute venous thromboembolism ( VTE ) in patients with malignancy remains undefined . In particular , based on current evidence , it is uncertain whether secondary prophylaxis using standard intensity oral anticoagulant therapy is associated with higher risks of bleeding and recurrent thrombosis in patients with cancer than in those without cancer . This study compared the outcome of anticoagulation courses in 95 patients with malignancy with those of 733 patients without malignancy . All patients were participants in a large , nation-wide population study and were prospectively followed from the initiation of their oral anticoagulant therapy . Based on 744 patient-years of treatment and follow-up , the rates of major ( 5.4 % vs 0.9 % ) , minor ( 16.2 % vs 3.6 % ) and total ( 21.6 % vs 4.5 % ) bleeding were statistically significantly higher in cancer patients compared with patients without cancer . Bleeding was also a more frequent cause of early anticoagulation withdrawal in patients with malignancy ( 4.2 % vs. 0.7 % ; p < 0.01 ; RR 6.2 ( 95 % CI 1.95 - 19.4 ) . There was a trend towards a higher rate of thrombotic complications in cancer patients ( 6.8 % vs. 2.5 % ; p = 0.058 ; RR 2.5 [ CI 0.96 - 6.5 ] ) but this did not achieve statistical significance . In the group of patients with cancer , the bleeding rate was high across the different INR categories and was independent of the temporally associated International Normalized Ratio ( INR ) . In contrast , the bleeding rate was increased only with INR values greater than 4.5 in the group of patients without cancer .
The rate of thrombotic events was significantly higher in both cohorts when the INR was less than 2.0 . In conclusion , patients with malignancy treated with oral anticoagulants have a higher rate of bleeding and possibly an increased risk of recurrent thrombosis compared with patients without malignancy . Safer and more effective anticoagulant therapy is needed for this challenging group of patients PURPOSE The long-term impact of thalidomide plus dexamethasone ( thal/dex ) as primary therapy for newly diagnosed multiple myeloma ( MM ) is unknown . The goal of this study was to compare thalidomide plus dexamethasone versus placebo plus dexamethasone ( placebo/dex ) as primary therapy for newly diagnosed MM . PATIENTS AND METHODS In this double-blind , placebo-controlled trial , patients with untreated symptomatic MM were randomized to thal/dex ( arm A ) or to placebo plus dexamethasone ( dex ) ( arm B ) . Patients in arm A received oral thalidomide 50 mg daily , escalated to 100 mg on day 15 , and to 200 mg from day 1 of cycle 2 ( 28-day cycles ) . Oral dex 40 mg was administered on days 1 through 4 , 9 through 12 , and 17 through 20 during cycles 1 through 4 and on days 1 through 4 only from cycle 5 onwards . Patients in arm B received placebo and dex , administered as in arm A. The primary end point of the study was time to progression . This study is registered at http://ClinicalTrials.gov ( NCT00057564 ) . RESULTS A total of 470 patients were enrolled ( 235 randomly assigned to thal/dex and 235 to placebo/dex ) . The overall response rate was significantly higher with thal/dex compared with placebo/dex ( 63 % v 46 % ) , P < .001 . Time to progression ( TTP ) was significantly longer with thal/dex compared with placebo/dex ( median , 22.6 v 6.5 months , P < .001 ) . Grade 4 adverse events were more frequent with thal/dex than with placebo/dex ( 30.3 % v 22.8 % ) .
CONCLUSION Thal/dex results in significantly higher response rates and significantly prolongs TTP compared with dexamethasone alone in patients with newly diagnosed MM PURPOSE Venous thromboembolism ( VTE ) has been associated with negative prognosis in cancer patients . Most series reporting on VTE have included different tumor types not differentiating between recurrent or primary disease . Data regarding the actual impact of VTE on primary advanced ovarian cancer ( AOC ) are limited . PATIENTS AND METHODS Between 1995 and 2002 , the Arbeitsgemeinschaft Gynaekologische Onkologie Ovarian Cancer Study group ( AGO-OVAR ) recruited 2,743 patients with AOC in three prospectively randomized trials on platinum paclitaxel-based chemotherapy after primary surgery . Pooled data analysis was performed to evaluate incidence , predictors , and prognostic impact of VTE in AOC . Survival curves were calculated for the VTE incidence . Univariate analysis and Cox regression analysis were performed to identify independent predictors of VTE and mortality . RESULTS Seventy-six VTE episodes were identified , which occurred during six to 11 cycles of adjuvant chemotherapy ; 50 % of them occurred within 2 months postoperatively . Multivariate analysis identified body mass index higher than 30 kg/m² and increasing age as independent predictors of VTE . International Federation of Gynecology and Obstetrics stage and surgical radicality did not affect incidence . Overall survival was significantly reduced in patients with VTE ( median , 29.8 v 36.2 months ; P = .03 ) . Multivariate analysis identified pulmonary embolism ( PE ) , but not deep vein thrombosis alone , to be of prognostic significance . In addition , VTE was not identified to significantly affect progression-free survival . CONCLUSION Patients with AOC have their highest VTE risk within the first 2 months after radical surgery .
Only VTE complicated by symptomatic PE has been identified to have a negative impact on survival . Studies evaluating the role of prophylactic anticoagulation during this high-risk postoperative period are warranted
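The anticoagulation trials above summarize treatment effects as absolute and relative risk reductions derived from raw event counts (for example, 73 of 1473 placebo patients versus 42 of 1518 dalteparin patients with venous thromboembolism). A minimal sketch of how those summary statistics follow from the counts; the function and variable names are illustrative, not taken from any of the cited studies:

```python
# Illustrative only: recompute the risk statistics quoted in the dalteparin
# thromboprophylaxis abstract above from its raw event counts.

def risk_stats(events_control, n_control, events_treated, n_treated):
    """Return absolute/relative risk measures for a two-arm trial."""
    risk_c = events_control / n_control
    risk_t = events_treated / n_treated
    return {
        "risk_control": risk_c,                          # 73/1473 ≈ 4.96 %
        "risk_treated": risk_t,                          # 42/1518 ≈ 2.77 %
        "absolute_risk_reduction": risk_c - risk_t,      # ARR
        "relative_risk": risk_t / risk_c,                # RR
        "relative_risk_reduction": 1 - risk_t / risk_c,  # RRR = 1 - RR
    }

stats = risk_stats(73, 1473, 42, 1518)
print(f"ARR = {stats['absolute_risk_reduction']:.2%}")  # 2.19%
print(f"RR  = {stats['relative_risk']:.2f}")            # 0.56 (unadjusted)
print(f"RRR = {stats['relative_risk_reduction']:.1%}")  # 44.2% (unadjusted)
```

Note the unadjusted relative risk works out to 0.56 here; the abstract's 0.55 and 45 % come from its covariate-adjusted model, so a small difference from the quoted figures is expected.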
2,026
18,254,012
In spite of these methodological weaknesses and the clinical heterogeneity , the consistency and magnitude of the effects reported provide some evidence that cognitive behavioural therapy may be a useful intervention for children with recurrent abdominal pain , although most children , particularly in primary care , will improve with reassurance and time
BACKGROUND Between 4 % and 25 % of school-age children complain of recurrent abdominal pain ( RAP ) of sufficient severity to interfere with daily activities . For the majority of such children , no organic cause for their pain can be found on physical examination or investigation . Although most children are managed by reassurance and simple measures , a large range of psychosocial interventions including cognitive and behavioural treatments and family therapy have been recommended . OBJECTIVES To determine the effectiveness of psychosocial interventions for recurrent abdominal pain or IBS in school-age children .
OBJECTIVE To evaluate the efficacy of a distance treatment delivered through Internet and telephone for pediatric recurrent pain . METHODS Forty-seven participants ( 9 - 16 years of age ) were randomly assigned to either an Internet-based treatment or a standard medical care waitlist . Treatment employed a Web-based manual for children and parents with weekly therapist contact by telephone or e-mail . At 1- and 3-month follow-ups , participants were assessed on the outcome variables of pain and quality of life . A 50 % reduction in diary pain scores was considered clinically significant . RESULTS Significant between-group differences were found : 71 and 72 % of the treatment group achieved clinically significant improvement at the 1- and 3-month follow-ups , respectively , whereas only 19 and 14 % of the control group achieved the criterion . No significant differences were found on the quality of life variable . CONCLUSIONS Distance methods have considerable potential for making effective treatments more accessible with lower associated costs Recurrent abdominal pain ( RAP ) affects 10 % to 18 % of school-age children and is caused by obvious organic pathology in fewer than 10 % of cases . Two recent studies do not support previous beliefs that most RAP is psychogenic . Studies have shown disorders of bowel motility in children with RAP similar to those of adult irritable bowel syndrome ( IBS ) ; controlled trials of additional dietary fiber in adult IBS have shown beneficial results . We did a randomized , double-blind , placebo-controlled study in 52 children with RAP and demonstrated a clinically and statistically significant decrease in pain attacks ( at least 50 % fewer ) in almost twice as many children who were given additional fiber as placebo . Compliance was excellent in both groups and side effects were few .
Although the cause of RAP is poorly understood , it is hypothesized that the beneficial effect of added fiber is due to its effect on shortening transit time , as in IBS Objectives Recurrent abdominal pain ( RAP ) is a common childhood complaint rarely associated with organic disease . Recently , the Pediatric Rome Criteria were developed to standardize the classification of pediatric functional gastrointestinal disorders ( FGIDs ) using a symptom-based approach . The authors tested the hypothesis that most patients with childhood RAP could be classified into one or more of the symptom subtypes defined by the Pediatric Rome Criteria . Methods Using a prospective longitudinal design , new patients with RAP ( n = 114 ) were studied at a tertiary care children 's medical center . Before the medical evaluation , parents completed a questionnaire about their child , assessing symptoms defined by the Pediatric Rome Criteria . Results Of the 107 children for whom medical evaluation revealed no organic etiology for pain , 73 % had symptom profiles consistent with the Pediatric Rome Criteria for one of the FGIDs associated with abdominal pain ( irritable bowel syndrome , 44.9 % ; functional dyspepsia , 15.9 % ; functional abdominal pain , 7.5 % ; abdominal migraine , 4.7 % ) Conclusions This study provides the first systematic empirical evidence that RAP , originally defined by Apley , includes children whose symptoms are consistent with the symptom criteria for several FGIDs defined by the Rome criteria .
The pediatric Rome criteria may be useful in clinical research to ( 1 ) describe the symptom characteristics of research participants who meet Apley 's broad criteria for RAP , and ( 2 ) select patients with particular symptom profiles for investigation of potential biologic and psychosocial mechanisms associated with pediatric FGIDs OBJECTIVE To assess whether parental psychological and physical factors and child factors measured in the first year of life were associated with recurrent abdominal pain ( RAP ) in children at age 6¾ years . METHOD A longitudinal cohort study ( the Avon Longitudinal Study of Parents and Children ) followed 8,272 children from pregnancy to age 6¾ years . Parental reports of child and parent functioning were gathered . Associations between parental and child functioning assessed at 6 to 8 months postpartum , and RAP measured at age 6¾ years were investigated . RESULTS The prevalence of RAP in this sample was 11.8 % . Both maternal anxiety ( adjusted odds ratio = 1.53 ; 95 % confidence interval 1.24 - 1.89 ) and paternal anxiety ( adjusted odds ratio = 1.38 ; 95 % confidence interval 1.12 - 1.71 ) in the first year of a child 's life were associated with later childhood RAP . Parent reports of child temperament features such as irregular feeding and sleeping were also associated with later RAP . CONCLUSIONS This is the first evidence from a prospective study that anxiety in both mothers and fathers and child temperament features predate the occurrence of RAP in children . These findings highlight the potential importance of addressing parental anxiety in families in which children present with RAP , although some caution should be exercised in their interpretation because of possible reporting bias To determine the benefit of using an H2-receptor antagonist in children with abdominal pain and dyspepsia , 25 such children were enrolled in a double-blind , placebo-controlled trial of famotidine .
Global and quantitative pain assessments were done before and after each treatment period . The quantitative assessment was calculated based on the abdominal pain score that was the sum of three components . Based on the global evaluation , there was a clear benefit of famotidine over placebo ( 68 % vs 12 % ) . Using the quantitative assessment , however , the mean improvement of the score using famotidine versus placebo was not statistically significant ( 3.37+/-3.53 vs 1.66+/-2.7 ) . There was a significant improvement in this score during the first treatment period regardless of medication used ( period effect : P = 0.05 ) . A subset of patients with peptic symptoms demonstrated a significant drug effect that outweighed the period effect ( drug effect : P = 0.01 ; period effect : P = 0.02 ) . We conclude that famotidine subjectively improves the symptoms of children with recurrent abdominal pain but not objectively using the derived score . However , famotidine is significantly more effective than placebo among children with peptic symptoms . The use of this simple scoring scale may facilitate selecting those children who will benefit from H2-receptor antagonist therapy We sought to prospectively characterize and compare the symptoms of children ≥ 5 years of age with recurrent abdominal pain to previously established criteria for irritable bowel syndrome ( IBS ) in adults . For all eligible subjects , a detailed questionnaire concerning characteristics of abdominal pain and defecatory pattern was completed at presentation . In addition , a battery of screening tests was performed and additional evaluation was done at the discretion of their physician . In all , 227 subjects fulfilled the entrance criteria , but 56 were subsequently excluded because of diagnoses of inflammatory bowel disease ( nine cases ) , lactose malabsorption ( 46 cases ) , or celiac disease ( one case ) . Of the remaining 171 patients , 117 had IBS symptoms .
In the IBS subjects , lower abdominal discomfort ( p < 0.001 ) , cramping pain ( p < 0.0009 ) , and increased flatus ( p < 0.0003 ) were more common , whereas dyspeptic symptoms such as epigastric discomfort ( p < 0.003 ) , pain radiating to the chest ( p < 0.009 ) , and regurgitation ( p < 0.02 ) were more common in the non-IBS subjects . Our study not only confirms the clinical heterogeneity of children with recurrent abdominal pain but also concomitantly demonstrates that most children with this disorder have symptoms that fulfill the standardized criteria for IBS in adults . The identification of subgroups of children with recurrent abdominal pain can provide a framework for the diagnosis of functional bowel disease as well as establish the need for invasive and expensive tests 93 % of 88 children with severe frequent migraine recovered on oligoantigenic diets ; the causative foods were identified by sequential reintroduction , and the role of the foods provoking migraine was established by a double-blind controlled trial in 40 of the children . Most patients responded to several foods . Many foods were involved , suggesting an allergic rather than an idiosyncratic ( metabolic ) pathogenesis . Associated symptoms which improved in addition to headache included abdominal pain , behaviour disorder , fits , asthma , and eczema . In most of the patients in whom migraine was provoked by non-specific factors , such as blows to the head , exercise , and flashing lights , this provocation no longer occurred while they were on the diet The association of lactase deficiency with recurrent abdominal pain was investigated . One hundred three white children between the ages of 6 to 14 years with recurrent abdominal pain were evaluated . Sixty-nine underwent lactose tolerance tests and 26 had intestinal biopsies with lactase determinations ; 21 of 69 ( 30.4 % ) had abnormal lactose tolerance tests and eight of 26 ( 31 % ) were lactase deficient .
However , 16 of 61 ( 26.4 % ) control subjects matched for age and ethnic background exhibited lactase deficiency . Thus , a similar prevalence of lactase deficiency was found in the control and the recurrent abdominal pain groups . Thirty-eight patients with recurrent abdominal pain completed three successive six-week diet trials conducted in a double-blind fashion . An increase above baseline value in pain frequency was seen in ten of 21 ( 48 % ) lactose malabsorbers and four of 17 ( 24 % ) lactose absorbers . After a 12-month milk elimination diet , six of 15 ( 40 % ) malabsorbers and five of 13 ( 38 % ) absorbers had elimination of their pain . This result compared with improvement occurring in five of 12 ( 42 % ) absorbers with recurrent abdominal pain who received a regular diet for one year and suggests that the elimination of lactose will not affect the overall frequency of improvement in recurrent abdominal pain . In addition , the recovery rate from recurrent abdominal pain is similar in both lactose absorbers and nonabsorbers independent of dietary restrictions spread metastases and died two months later , having remained normocalcaemic and swallowed normally until death . Case 2-A 50 year old man developed lymphadenopathy in the left supraclavicular fossa due to squamous carcinoma secondary to a symptomless primary tumour of the left main bronchus . The nodes were treated with palliative radiotherapy , but two months later he developed total dysphagia for solids , anorexia , and nausea . Examination showed some residual lymphadenopathy but was otherwise unremarkable ; a chest x ray film , however , showed a primary carcinoma in the left hilar region and multiple bone metastases . Barium swallow showed free flow of barium with no mechanical hold up but some slight muscular incoordination in the pharynx .
A biochemical profile showed calcium concentration 3.8 mmol/l ( 15 mg/100 ml ) , alkaline phosphatase activity 200 IU/l , urea concentration 17.9 mmol/l ( 108 mg/100 ml ) ( normal 2.0 - 8.1 mmol/l ( 12 - 49 mg/100 ml ) ) , and glutamic oxaloacetic transaminase activity 315 IU/l ( normal 5 - 45 IU/l ) . Treatment with intravenous fluids , high dose steroids , and mithramycin corrected the hypercalcaemia , and his dysphagia had gone completely after five days and he was discharged home . He had no further problems with dysphagia , but died a month later from generalised disease . Postmortem examination confirmed a squamous cell carcinoma of the left main bronchus with widespread metastases but with a normal oesophagus that was not obstructed
2,027
28,671,892
Conclusion : Although additional research is warranted to further understand the mechanisms by which PA affects behavioral and cognitive outcome measures in children with SEBDs , PA offers a safe alternative form of treatment for this population
Objective : Perform a systematic review of the available literature regarding the effectiveness of exercise interventions on children with any type of social , emotional , or behavioral disability ( SEBD ) , with attention to a range of physiological , behavioral , and mood outcomes .
BACKGROUND Regular physical exercise may improve a variety of physiological and psychological factors in depressive persons . However , there is little experimental evidence to support this assumption for adolescent populations . We conducted a randomized controlled trial to investigate the effect of physical exercise on depressive state , the excretions of stress hormones and physiological fitness variables in adolescent females with depressive symptoms . METHODS Forty-nine female volunteers ( aged 18 - 20 years ; mean 18.8 +/- 0.7 years ) with mild-to-moderate depressive symptoms , as measured by the Centre for Epidemiologic Studies Depression ( CES-D ) scale , were randomly assigned to either an exercise regimen or usual daily activities for 8 weeks . The subjects were then crossed over to the alternate regimen for an additional 8-week period . The exercise program consisted of five 50-min sessions per week of a group jogging training at a mild intensity . The variables measured were CES-D rating scale , urinary cortisol and epinephrine levels , and cardiorespiratory factors at rest and during exercise endurance test . RESULTS After the sessions of exercise the CES-D total depressive score showed a significant decrease , whereas no effect was observed after the period of usual daily activities ( ANOVA ) . Twenty-four hour excretions of cortisol and epinephrine in urine were reduced due to the exercise regimen . The training group had a significantly reduced resting heart rate and increased peak oxygen uptake and lung capacity .
CONCLUSIONS The findings of this study suggest that a group jogging exercise may be effective in improving depressive state , hormonal response to stress and physiological fitness of adolescent females with depressive symptoms PURPOSE The objective of this study is to test the feasibility and impact of a 10-wk after-school exercise program for children with attention deficit hyperactivity disorder and /or disruptive behavior disorders living in an urban poor community . METHODS Children were randomized to an exercise program ( n = 19 ) or a comparable but sedentary attention control program ( n = 16 ) . Cognitive and behavioral outcomes were collected pre-/posttest . Intent-to-treat mixed models tested group-time and group-time-attendance interactions . Effect sizes were calculated within and between groups . RESULTS Feasibility was evidenced by 86 % retention , 60 % attendance , and average 75 % maximum HR . Group-time results were null on the primary outcome , parent-reported executive function . Among secondary outcomes , between-group effect sizes favored exercise on hyperactive symptoms ( d = 0.47 ) and verbal working memory ( d = 0.26 ) , and controls on visuospatial working memory ( d = -0.21 ) and oppositional defiant symptoms ( d = -0.37 ) . In each group , within-group effect sizes were moderate to large on most outcomes ( d = 0.67 to 1.60 ) . A group-time-attendance interaction emerged on visuospatial working memory ( F[1,33 ] = 7.42 , P < 0.05 ) , such that attendance to the control program was related to greater improvements ( r = 0.72 , P < 0.01 ) , whereas attendance to the exercise program was not ( r = 0.25 , P = 0.34 ) . CONCLUSIONS Although between-group findings on the primary outcome , parent-reported executive function , were null , between-group effect sizes on hyperactivity and visuospatial working memory may reflect adaptations to the specific challenges presented by distinct formats .
Both groups demonstrated substantial within-group improvements on clinically relevant outcomes . Findings underscore the importance of programmatic features , such as routines , engaging activities , behavior management strategies , and adult attention , and highlight the potential for after-school programs to benefit children with attention deficit hyperactivity disorder and disruptive behavior disorder living in urban poverty where health needs are high and service resources few This study was conducted to determine the effect of acute aerobic exercise on executive function in children with attention deficit hyperactivity disorder ( ADHD ) . Forty children with ADHD were randomly assigned into exercise or control groups . Participants in the exercise group performed a moderate intensity aerobic exercise for 30 min , whereas the control group watched a running/exercise-related video . Neuropsychological tasks , the Stroop Test and the Wisconsin Card Sorting Test ( WCST ) , were assessed before and after each treatment . The results indicated that acute exercise facilitated performance in the Stroop Test , particularly in the Stroop Color-Word condition . Additionally , children in the exercise group demonstrated improvement in specific WCST performances in Non-perseverative Errors and Categories Completed , whereas no influences were found in those performances in the control group . Tentative explanations for the exercise effect postulate that exercise allocates attention resources , influences the dorsolateral prefrontal cortex , and is implicated in exercise-induced dopamine release .
These findings are promising and additional investigations to explore the efficacy of exercise on executive function in children with ADHD are encouraged Self-control problems commonly manifest as temper outbursts and repetitive/rigid/impulsive behaviors , in children with autism spectrum disorders ( ASD ) , which often contributes to learning difficulties and caregiver burden . The present study aims to compare the effect of a traditional Chinese Chan-based mind-body exercise , Nei Yang Gong , with that of the conventional Progressive Muscle Relaxation ( PMR ) technique in enhancing the self-control of children with ASD . Forty-six age- and IQ-matched ASD children were randomly assigned to receive group training in Nei Yang Gong ( experimental group ) or PMR ( control group ) twice per week for four weeks . The participants ’ self-control was measured by three neuropsychological tests and parental rating on standardized questionnaires , and the underlying neural mechanism was assessed by the participants ’ brain EEG activity during an inhibitory-control task before and after intervention . The results show that the experimental group demonstrated significantly greater improvement in self-control than the control group , which concurs with the parental reports of reduced autistic symptoms and increased control of temper and behaviors . In addition , the experimental group showed enhanced EEG activity in the anterior cingulate cortex , a region that mediates self-control , whereas the PMR group did not . The present findings support the potential application of Chinese Chan-based mind-body exercises as a form of neuropsychological rehabilitation for patients with self-control problems . Chinese Clinical Trial Registry ; Registration No. : ChiCTR-TRC-12002561 ; URL : www.chictr.org Attention Deficit Hyperactivity Disorder ( ADHD ) mainly affects the academic performance of children and adolescents .
In addition to bringing physical and mental health benefits , physical activity has been used to prevent and improve ADHD comorbidities ; however , its effectiveness has not been quantified . In this study , the effect of physical activity on children 's attention was measured using a computer game . Intense physical activity was promoted by a relay race , which requires a 5-min run without a rest interval . The proposed physical stimulus was performed with 28 volunteers : 14 with ADHD ( GE-EF ) and 14 without ADHD symptoms ( GC-EF ) . After 5 min of rest , these volunteers accessed the computer game to accomplish the tasks in the shortest time possible . The computer game was also accessed by another 28 volunteers : 14 with ADHD ( GE ) and 14 without these symptoms ( GC ) . The response time to solve the tasks that require attention was recorded . The results of the four groups were analyzed using D'Agostino statistical tests of normality , Kruskal-Wallis analyses of variance and post-hoc Dunn tests . The groups of volunteers with ADHD who performed exercise ( GE-EF ) showed improved performance for the tasks that require attention with a difference of 30.52 % compared with the volunteers with ADHD who did not perform the exercise ( GE ) . The ( GE-EF ) group showed similar performance ( 2.5 % difference ) with the volunteers in the ( GC ) group who have no ADHD symptoms and did not exercise . This study shows that intense exercise can improve the attention of children with ADHD and may help their school performance Background Exercise has been shown to be effective in treating depression , but trials testing the effect of exercise for depressed adolescents utilising mental health services are rare . The aim of this study was to determine the effectiveness of a preferred intensity exercise intervention on the depressive symptoms of adolescents with depression . 
Methods: We randomly assigned 87 adolescents who were receiving treatment for depression to either 12 sessions of aerobic exercise at preferred intensity alongside treatment as usual, or treatment as usual only. The primary outcome was depressive symptom change using the Children's Depression Inventory 2nd Version (CDI-2) at post-intervention. Secondary outcomes were health-related quality of life and physical activity rates. Outcomes were taken at baseline, post-intervention and at six-month follow-up. Results: CDI-2 score reduction did not differ significantly between groups at post-intervention (est. 95% CI −6.82, 1.68, p = 0.23). However, there was a difference in CDI-2 score reduction at six-month follow-up in favour of the intervention of −4.81 (est. 95% CI −9.49, −0.12, p = 0.03). Health-related quality of life and physical activity rates did not differ significantly between groups at post-intervention and follow-up. Conclusions: There was no additional effect of preferred-intensity exercise alongside treatment as usual on depressive symptom reduction immediately post-intervention. However, effects were observed at six months post-intervention, suggesting a delayed response. Further trials with larger samples are required to determine the validity of this finding. Trial registration: ClinicalTrials.gov NCT01474837, March 16.

The Depressed Adolescents Treated with Exercise (DATE) study evaluated a standardized aerobic exercise protocol to treat nonmedicated adolescents who met DSM-IV-TR criteria for major depressive disorder. From an initial screen of 90 individuals, 30 adolescents aged 12-18 years were randomized to either vigorous exercise (EXER) (>12 kg/kcal/week [KKW]) or a control stretching (STRETCH) activity (<4 KKW) for 12 weeks.
The primary outcome measure was the blinded clinician rating on the Children's Depression Rating Scale - Revised (CDRS-R) to assess depression severity, and Actical (KKW) accelerometry 24 hr/7 days a week to assess energy expenditure and adherence. Follow-up evaluations occurred at weeks 26 and 52. The EXER group averaged 77% adherence and the STRETCH group 81% for meeting weekly target goals for the 12-week intervention, based on weekly sessions completed and meeting KKW requirements. There was a significant increase in overall weekly KKW expenditure (p < .001) for both groups, with the EXER group doubling the STRETCH group in weekly energy expenditure. Depressive symptoms were significantly reduced from baseline for both groups, with the EXER group improving more rapidly than STRETCH after six weeks (p < .016) and nine weeks (p < .001). Both groups continued to improve such that there were no group differences after 12 weeks (p = .07). By week 12, the exercise group had a 100% response rate (86% remission), whereas the stretch group response rate was 67% (50% remission) (p = .02). Both groups had improvements in multiple areas of psychosocial functioning related to school and relationships with parents and peers. Anthropometry reflected decreased waist, hip and thigh measurements (p = .02), more so for females than males (p = .05), but there were no weight changes for either gender. The EXER group sustained 100% remission at weeks 26 and 52. The STRETCH group had 80% response and 70% remission rates at week 26, and by week 52 only one participant had not fully responded. The study provides support for the use of exercise as a non-medication intervention for adolescents with major depressive disorder when good adherence and energy expenditure (KKW) are achieved.

OBJECTIVE: To test the dose-response effects of an exercise program on depressive symptoms and self-worth in children.
METHOD: Overweight, sedentary children (N = 207, 7-11 years, 58% male, 59% Black) were randomly assigned to low- or high-dose (20 or 40 min/day) aerobic exercise programs (13 ± 1.6 weeks), or a control group. Children completed the Reynolds Child Depression Scale and Self-Perception Profile for Children at baseline and posttest. RESULTS: A dose-response benefit of exercise was detected for depressive symptoms. A race × group interaction showed that only White children's global self-worth (GSW) improved. There was some evidence that increased self-worth mediated the effect on depressive symptoms. CONCLUSIONS: This study shows dose-response benefits of exercise on depressive symptoms and self-worth in children. However, Blacks did not show increased GSW in response to the intervention. Results provide some support for mediation of the effect of exercise on depressive symptoms via self-worth.

Aerobic moderate-intensity continuous exercise (MCE) can improve executive function (EF) acutely, potentially through the activation of both physiological and psychological factors. Recently, high-intensity interval exercise (HIIE) has been reported to be more beneficial for physical adaptation than MCE. The factors driving EF improvement can potentially be more enhanced by HIIE than by MCE, but the effects of HIIE on EF remain unknown. Therefore, we aimed to examine to what extent HIIE impacts post-exercise EF immediately after exercise and during post-exercise recovery, compared with traditional MCE. Twelve healthy male subjects performed cycle ergometer exercise based on either the HIIE or MCE protocol in a randomized and counterbalanced order. The HIIE protocol consisted of four 4-min bouts at 90% of peak VO2 with 3-min active recovery at 60% of peak VO2. A volume-matched MCE protocol was applied at 60% of peak VO2. To evaluate EF, a color-word Stroop task was performed pre- and post-exercise.
Improvement in EF immediately after exercise was the same for the HIIE and MCE protocols. However, the improvement in EF with HIIE was sustained during 30 min of post-exercise recovery, whereas with MCE it returned to the pre-exercise level. The EF response in the post-exercise recovery was associated with changes in physiological and psychological responses. The present findings showed that both HIIE and MCE were capable of improving EF. Moreover, HIIE could prolong the improvement in EF during post-exercise recovery. For the first time, we suggest that HIIE may be a more effective strategy than MCE for improving EF.

OBJECTIVE: This study examined time spent in the target heart zone (THZ) and its relationship to tasks requiring variable amounts of executive control function in prepubescent children participating in a 9-month randomized controlled physical activity program. METHODS: A sample of 59 participants performed the Stroop Color-Word Test and the Comprehensive Trail Making Test cognitive assessments. Heart rate data were collected during participation in the physical activity program using E600 heart rate monitors (Polar, Finland). RESULTS: There was a significant difference, F(1, 58) = 7.44, p < .009, between males and females for relative VO2max, but not absolute (p = .69) or percent VO2max (p = .73). Regression analysis identified KBIT, age, and mean time above the THZ as significant predictors of performance in the Stroop Color-Word condition, F(1, 56) = 5.21, p = .02. KBIT and mean time above the THZ were significant predictors for Trails B, F(1, 56) = 7.60.
CONCLUSIONS: These results suggest that heart rate, as a measure of physical activity intensity, should be closely monitored during research that is intended to make inferences about its effects on cognitive performance, as participation in vigorous activities may have specific benefits over lower intensities among prepubescent children.

[Purpose] The purpose of the present study was to determine the effect of a combined jump rope and ball exercise program on the physical fitness and neurotransmitter (epinephrine, serotonin) levels of children with attention-deficit hyperactivity disorder. [Subjects and Methods] The subjects were 12 boys attending elementary school, whose grade levels ranged from 1-4. The block randomization method was used to distribute the participants between the combined exercise group (n = 6) and control group (n = 6). The program consisted of a 60-min exercise session (10-min warm-up, 40-min main exercise, and 10-min cool-down) performed three times a week, for a total of 12 weeks. [Results] The exercise group showed a significant improvement in cardiorespiratory endurance, muscle strength, muscle endurance and flexibility after 12 weeks. A significant increase in the epinephrine level was observed in the exercise group. [Conclusion] The 12-week combined exercise program in the current study (jump rope and ball exercises) had a positive effect on overall fitness level and neurotransmission in children with attention-deficit hyperactivity disorder.

Objective: To examine the role of physical activity in determining the affect and executive functioning of children with symptoms of ADHD. Method: In Study 1, the association between physical activity and affect in the daily lives of children with varying degrees of hyperactivity was examined. In Study 2, children with ADHD were randomly assigned a physical activity or a sedentary task before working on a task requiring executive control.
Results: Lack of physical activity was shown to relate to depressed affect, more strongly in participants with severe hyperactivity symptoms (Study 1). The physically active participants showed improved executive functioning after only 5 min of vigorous activity; the sedentary control participants showed no improvement (Study 2). Conclusion: These results indicate that interventions to increase the level of physical activity in children with and without ADHD might improve affect and executive functioning.

Attention deficit hyperactivity disorder (ADHD) is related to a deficiency of central catecholamines (CA) in cognitive, biochemical, and physical tests, and pharmaceutical intervention may have no effect if it is not accompanied by changes in the environment. The objective of our study was to test the hypothesis that central CA are responsible for the increase in reaction speed seen after physical activity (PA), and to measure the impact of high-intensity PA on the sustained attention of 25 children diagnosed with ADHD according to the Diagnostic and Statistical Manual of Mental Disorders-IV (DSM-IV) criteria. It is possible that practicing sports assists in the management of the disorder. The children were divided between users (US) and non-users (NUS) of methylphenidate (MTP), and the groups were compared to evaluate the effect of the drug on cognition after PA. Post-exercise performance on Conners' Continuous Performance Test-II (CPT) was not affected by MTP; we observed significant improvements in response time, and we saw normalization in the impulsivity and vigilance measures. These results suggest that the improvements in cognition after physical effort are not CA-dependent. Additionally, our results suggest that children's attention deficits can be minimized through PA irrespective of treatment with MTP.
Additional studies are necessary to confirm that exercise mitigates the harmful symptoms of ADHD.

The aim of the present study was to understand whether sport improves attention symptoms, social competency, and cognitive functions in children with attention deficit and hyperactivity disorder (ADHD). The present study was designed as a 6-week, prospective trial, including 12 sessions of education/sports therapy. Thirteen ADHD children participated in a 90-min athletic activity (sports-cADHD) twice a week, while 15 ADHD children received education on behavior control (edu-cADHD). During the 6-week treatment period, the sports-cADHD group showed greater improvements in DuPaul's ADHD Rating Scale scores, parent and teacher version (K-ARS-PT), compared to those of the edu-cADHD group. The cognitive functions assessed with the digit symbol and Trail-Making Test part B (TMT B) were improved in the sports-cADHD group, while the cognitive functions observed in the edu-cADHD group were not significantly changed. The cooperativeness scores in the sports-cADHD group were greatly increased compared to those of the edu-cADHD group. The results demonstrated a positive correlation between sports and improvement in attention symptoms, cognitive symptoms and social skills. The results of the present study suggest that therapy in the form of athletic activity may increase social competency in children with ADHD, as demonstrated by improved cognitive functions.

PURPOSE: The effects of exercise on children with attention-deficit hyperactivity disorder (ADHD) were evaluated by studying the rate of spontaneous eye blinks, the acoustic startle eye blink response (ASER), and motor impersistence among 8- to 12-yr-old children (10 boys and 8 girls) meeting DSM-III-R criteria for ADHD. METHODS: Children ceased methylphenidate medication 24 h before and during each of three daily conditions separated by 24-48 h.
After a maximal treadmill walking test to determine cardiorespiratory fitness (VO2peak), each child was randomly assigned to counterbalanced conditions of treadmill walking at an intensity of 65-75% VO2peak or quiet rest. Responses were compared with a group of control participants (11 boys and 14 girls) equated with the ADHD group on several key variables. RESULTS: Boys with ADHD had increased spontaneous blink rate, decreased ASER latency, and decreased motor impersistence after maximal exercise. Girls with ADHD had increased ASER amplitude and decreased ASER latency after submaximal exercise. CONCLUSIONS: The findings suggest an interaction between sex and exercise intensity that is not explained by physical fitness, activity history, or selected personality attributes. The clinical meaning of the eye blink results is not clear, as improvements in motor impersistence occurred only for boys after maximal exercise. Nonetheless, these preliminary findings are sufficiently positive to encourage additional study to determine whether a session of vigorous exercise has efficacy as a dopaminergic adjuvant in the management of behavioral features of ADHD.

Boys diagnosed with ADHD by specialist pediatricians and stabilized on medication were randomly assigned to a 20-session yoga group (n = 11) or a control group (cooperative activities; n = 8). Boys were assessed pre- and post-intervention on the Conners' Parent and Teacher Rating Scales-Revised: Long (CPRS-R:L & CTRS-R:L; Conners, 1997), the Test of Variables of Attention (TOVA; Greenberg, Corman, & Kindschi, 1997), and the Motion Logger Actigraph. Data were analyzed using one-way repeated measures analysis of variance (ANOVA).
Significant improvements from pre-test to post-test were found for the yoga group, but not for the control group, on five subscales of the Conners' Parent Rating Scales (CPRS): Oppositional, Global Index Emotional Lability, Global Index Total, Global Index Restless/Impulsive and ADHD Index. Significant improvements from pre-test to post-test were found for the control group, but not the yoga group, on three CPRS subscales: Hyperactivity, Anxious/Shy, and Social Problems. Both groups improved significantly on CPRS Perfectionism, DSM-IV Hyperactive/Impulsive, and DSM-IV Total. For the yoga group, positive change from pre- to post-test on the Conners' Teacher Rating Scales (CTRS) was associated with the number of sessions attended on the DSM-IV Hyperactive-Impulsive subscale, with a trend on the DSM-IV Inattentive subscale. Those in the yoga group who engaged in more home practice showed a significant improvement on TOVA Response Time Variability, with a trend on the ADHD score, and greater improvements on the CTRS Global Emotional Lability subscale. Results from the Motion Logger Actigraph were inconclusive. Although these data do not provide strong support for the use of yoga for ADHD, partly because the study was under-powered, they do suggest that yoga may have merit as a complementary treatment for boys with ADHD already stabilized on medication, particularly for its evening effect when medication effects are absent. Yoga remains an investigational treatment, but this study supports further research into its possible uses for this population. These findings need to be replicated in larger groups with a more intensive supervised practice program.

Abstract: This investigation examined the long-term effect of Karate techniques training on communication of children with autism spectrum disorders (ASD). Thirty school-aged children with ASD were randomly assigned to an exercise group (n = 15) or a control group (n = 15).
Participants in the exercise group engaged in 14 weeks of Karate techniques training. Communication deficit was evaluated at baseline, post-intervention (week 14), and at 1-month follow-up. The exercise group showed a significant reduction in communication deficit compared to the control group. Moreover, the reduction in communication deficit in the exercise group at one-month follow-up remained unchanged compared to the post-intervention time. We concluded that teaching Karate techniques to children with ASD leads to a significant reduction in their communication deficit.
2,028
27,937,089
Conclusion: Women without sentinel lymph node (SLN) metastases should not receive axillary lymph node dissection (ALND). Women with one to two metastatic SLNs who are planning to undergo breast-conserving surgery with whole-breast radiotherapy should not undergo ALND (in most cases). Women who have large or locally advanced invasive breast cancer (tumor size T3/T4), inflammatory breast cancer, or ductal carcinoma in situ (when breast-conserving surgery is planned), or who are pregnant, should not undergo SNB.
Purpose: To provide current recommendations on the use of sentinel node biopsy (SNB) for patients with early-stage breast cancer.
Background: The aim was to prove the low identification rate of sentinel lymph node biopsy (SNB) and to determine the feasibility of replacing axillary lymph node dissection (AND) in axillary lymph node-positive patients after chemotherapy. Methods: From October 2001 to July 2005, 875 consecutive patients with primary operable breast cancer underwent SNB and AND. Among them, 238 received pre-operative chemotherapy. We compared the identification rate, false negative rate (FNR), negative predictive value (NPV), and accuracy of SNB in clinically node-positive patients with or without chemotherapy. Results: The identification rate was significantly lower in patients who received chemotherapy (77.6%) than in those who did not (97.0%) (P < 0.001). In those who received the therapy, the FNR was 5.6%, the NPV was 86.8%, and the accuracy was 95.9%. In those who did not receive therapy, the FNR was 7.4% and the accuracy was 92.6% (differences not statistically significant). Conclusion: The identification rate in confirmed axillary lymph node-positive patients was significantly lower in patients who received pre-operative chemotherapy, but accuracy did not differ significantly between the two groups. Thus, for patients who achieve complete axillary clearance by chemotherapy, SNB could replace AND.

PURPOSE: Multicentric breast cancer has been considered to be a contraindication for sentinel node (SN) biopsy (SNB). In this prospective multi-institutional trial, SNB feasibility and accuracy were evaluated in 142 patients with multicentric cancer from the Austrian Sentinel Node Study Group (ASNSG) and compared with data from 3,216 patients with unicentric cancer. PATIENTS AND METHODS: Between 1996 and 2004, 3,730 patients underwent SNB at 15 ASNSG-affiliated hospitals. Patient data were entered in a multicenter database. One hundred forty-two patients presented with multicentric invasive breast cancer and underwent SNB.
RESULTS: Intraoperatively, a mean number of 1.67 SNs were excised (identification rate, 91.5%). The incidence of SN metastases was 60.8% (79 of 130). This was confirmed by axillary lymph node dissection (ALND) in 125 patients. Of patients with positive SNs, 60.8% (48 of 79) showed involvement of nonsentinel nodes (NSNs), as did three patients with negative SNs (false-negative rate, 4.0%). Sensitivity, negative predictive value, and overall accuracy were 96.0%, 93.3%, and 97.3%, respectively. Ninety-one percent of the patients underwent mastectomy, and 9% were treated with breast-conserving surgery. None of the patients has shown axillary recurrence so far (mean follow-up, 28.8 months). Compared with 3,216 patients with unicentric cancer, there was a significantly higher rate of metastases in SNs as well as in NSNs, whereas there was no difference in detection and false-negative rates. CONCLUSION: Multicentric breast cancer is a new indication for SNB without routine ALND in controlled trials. Given adequate quality control and the interdisciplinary teamwork of surgical, nuclear medicine, and pathology units, SNB is both feasible and accurate in this disease entity.

PURPOSE: Sentinel lymph node (SLN) biopsy has proved to be an accurate method for detecting nodal micrometastases in previously untreated patients with early-stage breast cancer. We investigated the accuracy of this technique for patients with more advanced breast cancer after neoadjuvant chemotherapy. PATIENTS AND METHODS: Patients with stage II or III breast cancer who had undergone doxorubicin-based neoadjuvant chemotherapy before breast surgery were eligible. Intraoperative lymphatic mapping was performed with peritumoral injections of blue dye alone or in combination with technetium-labeled sulfur colloid. All patients were offered axillary lymph node dissection.
Negative sentinel and axillary nodes were subjected to additional processing with serial step sectioning and immunohistochemical staining with an anticytokeratin antibody to detect micrometastases. RESULTS: Fifty-one patients underwent SLN biopsy after neoadjuvant chemotherapy from 1994 to 1999. The SLN identification rate improved from 64.7% to 94.1%. Twenty-two (51.2%) of the 43 successfully mapped patients had positive SLNs, and in 10 of those 22 patients (45.5%), the SLN was the only positive node. Three patients had a false-negative SLN biopsy; that is, the sentinel node was negative, but at least one nonsentinel node contained metastases. Additional processing revealed occult micrometastases in four patients (three in sentinel nodes and one in a nonsentinel node). CONCLUSION: SLN biopsy is accurate after neoadjuvant chemotherapy. SLN identification improved with experience. False-negative findings occurred at a low rate throughout the series. This technique is a potential way to guide the axillary treatment of patients who are clinically node negative after neoadjuvant chemotherapy.

PURPOSE: To develop a guideline for the use of sentinel node biopsy (SNB) in early-stage breast cancer. METHODS: An American Society of Clinical Oncology (ASCO) Expert Panel conducted a systematic review of the literature available through February 2004 on the use of SNB in early-stage breast cancer. The panel developed a guideline for clinicians and patients regarding the appropriate use of a sentinel lymph node identification and sampling procedure, from here on referred to as SNB. The guideline was reviewed by selected experts in the field and the ASCO Health Services Committee, and was approved by the ASCO Board of Directors.
RESULTS: The literature review identified one published prospective randomized controlled trial in which SNB was compared with axillary lymph node dissection (ALND), four limited meta-analyses, and 69 published single-institution and multicenter trials in which the test performance of SNB was evaluated with respect to the results of ALND (completion axillary dissection). There are currently no data on the effect of SLN biopsy on long-term survival of patients with breast cancer. However, a review of the available evidence demonstrates that, when performed by experienced clinicians, SNB appears to be a safe and acceptably accurate method for identifying early-stage breast cancer without involvement of the axillary lymph nodes. CONCLUSION: SNB is an appropriate initial alternative to routine staging ALND for patients with early-stage breast cancer with clinically negative axillary nodes. Completion ALND remains standard treatment for patients with axillary metastases identified on SNB. Appropriately identified patients with negative results of SNB, when done under the direction of an experienced surgeon, need not have completion ALND. Isolated cancer cells detected by pathologic examination of the SLN with the use of specialized techniques are currently of unknown clinical significance. Although such specialized techniques are often used, they are not a required part of SLN evaluation for breast cancer at this time. Data suggest that SNB is associated with less morbidity than ALND, but the comparative effects of these two approaches on tumor recurrence or patient survival are unknown.

BACKGROUND: Sentinel lymph node biopsy in women with operable breast cancer is routinely used in some countries for staging the axilla, despite limited data from randomized trials on morbidity and mortality outcomes.
We conducted a multicenter randomized trial to compare quality-of-life outcomes between patients with clinically node-negative invasive breast cancer who received sentinel lymph node biopsy and patients who received standard axillary treatment. METHODS: The primary outcome measures were arm and shoulder morbidity and quality of life. From November 1999 to October 2003, 1031 patients were randomly assigned to undergo sentinel lymph node biopsy (n = 515) or standard axillary surgery (n = 516). Patients with sentinel lymph node metastases proceeded to delayed axillary clearance or received axillary radiotherapy (depending on the protocol at the treating institution). Intention-to-treat analyses of data at 1, 3, 6, and 12 months after surgery are presented. All statistical tests were two-sided. RESULTS: The relative risks of any lymphedema and sensory loss for the sentinel lymph node biopsy group compared with the standard axillary treatment group at 12 months were 0.37 (95% confidence interval [CI] = 0.23 to 0.60; absolute rates: 5% versus 13%) and 0.37 (95% CI = 0.27 to 0.50; absolute rates: 11% versus 31%), respectively. Drain usage, length of hospital stay, and time to resumption of normal day-to-day activities after surgery were statistically significantly lower in the sentinel lymph node biopsy group (all P < .001), and axillary operative time was reduced (P = .055). Overall patient-recorded quality-of-life and arm functioning scores were statistically significantly better in the sentinel lymph node biopsy group throughout (all P ≤ .003). These benefits were seen with no increase in anxiety levels in the sentinel lymph node biopsy group (P > .05).
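The relative risks reported here are simple ratios of the per-arm event rates. A minimal sketch of that calculation, not the trial's actual analysis code: the event counts below are back-computed from the rounded percentages and group sizes, so the result only approximates the published 0.37.

```python
def relative_risk(events_a: int, n_a: int, events_b: int, n_b: int) -> float:
    """Relative risk of an event in group A versus group B."""
    return (events_a / n_a) / (events_b / n_b)

# Lymphedema at 12 months: reported absolute rates of 5% (SLNB, n = 515)
# versus 13% (standard axillary treatment, n = 516). The event counts are
# illustrative reconstructions from these rounded rates, not raw trial data.
rr_lymphedema = relative_risk(round(0.05 * 515), 515, round(0.13 * 516), 516)
print(f"RR ≈ {rr_lymphedema:.2f}")
```

The reconstructed value lands near, but not exactly on, the published 0.37 because the abstract's rates are rounded to whole percentages.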
CONCLUSION: Sentinel lymph node biopsy is associated with reduced arm morbidity and better quality of life than standard axillary treatment and should be the treatment of choice for patients who have early-stage breast cancer with clinically negative nodes.

BACKGROUND: The optimum timing of sentinel-lymph-node biopsy for breast cancer patients treated with neoadjuvant chemotherapy is uncertain. The SENTINA (SENTinel NeoAdjuvant) study was designed to evaluate a specific algorithm for the timing of a standardised sentinel-lymph-node biopsy procedure in patients who undergo neoadjuvant chemotherapy. METHODS: SENTINA is a four-arm, prospective, multicentre cohort study undertaken at 103 institutions in Germany and Austria. Women with breast cancer who were scheduled for neoadjuvant chemotherapy were enrolled into the study. Patients with clinically node-negative disease (cN0) underwent sentinel-lymph-node biopsy before neoadjuvant chemotherapy (arm A). If the sentinel node was positive (pN1), a second sentinel-lymph-node biopsy procedure was done after neoadjuvant chemotherapy (arm B). Women with clinically node-positive disease (cN+) received neoadjuvant chemotherapy. Those who converted to clinically node-negative disease after chemotherapy (ycN0; arm C) were treated with sentinel-lymph-node biopsy and axillary dissection. Only patients whose clinical nodal status remained positive (ycN1) underwent axillary dissection without sentinel-lymph-node biopsy (arm D). The primary endpoint was the accuracy (false-negative rate) of sentinel-lymph-node biopsy after neoadjuvant chemotherapy for patients who converted from cN1 to ycN0 disease during neoadjuvant chemotherapy (arm C). Secondary endpoints included comparison of the detection rate of sentinel-lymph-node biopsy before and after neoadjuvant chemotherapy, and also the false-negative rate and detection rate of sentinel-lymph-node biopsy after removal of the sentinel lymph node.
Analyses were done according to treatment received (per protocol). FINDINGS: Of 1737 patients who received treatment, 1022 women underwent sentinel-lymph-node biopsy before neoadjuvant chemotherapy (arms A and B), with a detection rate of 99.1% (95% CI 98.3-99.6; 1013 of 1022). In patients who converted after neoadjuvant chemotherapy from cN+ to ycN0 (arm C), the detection rate was 80.1% (95% CI 76.6-83.2; 474 of 592) and the false-negative rate was 14.2% (95% CI 9.9-19.4; 32 of 226). The false-negative rate was 24.3% (17 of 70) for women who had one node removed and 18.5% (10 of 54) for those who had two sentinel nodes removed (arm C). In patients who had a second sentinel-lymph-node biopsy procedure after neoadjuvant chemotherapy (arm B), the detection rate was 60.8% (95% CI 55.6-65.9; 219 of 360) and the false-negative rate was 51.6% (95% CI 38.7-64.2; 33 of 64). INTERPRETATION: Sentinel-lymph-node biopsy is a reliable diagnostic method before neoadjuvant chemotherapy. After systemic treatment or early sentinel-lymph-node biopsy, the procedure has a lower detection rate and a higher false-negative rate compared with sentinel-lymph-node biopsy done before neoadjuvant chemotherapy. These limitations should be considered if biopsy is planned after neoadjuvant chemotherapy. FUNDING: Brustkrebs Deutschland, German Society for Senology, German Breast Group.

BACKGROUND: In women with breast cancer, sentinel-lymph-node biopsy (SLNB) provides information that allows surgeons to avoid axillary-lymph-node dissection (ALND) if the SLN does not have metastasis, and has a favourable effect on quality of life. Results of our previous trial showed that SLNB accurately screens the ALN for metastasis in breast cancers of diameter 2 cm or less. We aimed to update this trial with results from longer follow-up.
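The SENTINA detection and false-negative rates above are binomial proportions with confidence intervals. As a minimal sketch of how such an interval can be computed, here is the Wilson score method; the paper does not state which interval method it used (an exact method is likely), so the endpoints differ slightly from the published 9.9-19.4%.

```python
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a binomial proportion (95% by default)."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# Arm C false-negative rate: 32 false negatives among 226 node-positive patients.
fnr = 32 / 226                       # 0.1416 -> the reported 14.2%
lo, hi = wilson_ci(32, 226)
print(f"FNR = {fnr:.1%}, 95% CI {lo:.1%}-{hi:.1%}")
```

The Wilson interval here comes out near 10.2-19.3%, close to the exact-method-style 9.9-19.4% quoted in the abstract; the point estimate matches exactly.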
METHODS: Women with breast tumours of diameter 2 cm or less were randomly assigned, after breast-conserving surgery, either to SLNB and total ALND (ALND group), or to SLNB followed by ALND only if the SLN was involved (SLN group). Analysis was restricted to patients whose tumour characteristics met eligibility criteria after treatment. The main endpoints were the number of axillary metastases in women in the SLN group with negative SLNs, the staging power of SLNB, and disease-free and overall survival. FINDINGS: Of the 257 patients in the ALND group, 83 (32%) had a positive SLN and 174 (68%) had a negative SLN; eight of those with negative SLNs were found to have false-negative SLNs. Of the 259 patients in the SLN group, 92 (36%) had a positive SLN, and 167 (65%) had a negative SLN. One case of overt clinical axillary metastasis was seen during the follow-up of the 167 women in the SLN group who did not receive ALND (i.e., one false negative). After a median follow-up of 79 months (range 15-97), 34 events associated with breast cancer occurred: 18 in the ALND group, and 16 in the SLN group (log-rank p = 0.6). The overall 5-year survival of all patients was 96.4% (95% CI 94.1-98.7) in the ALND group and 98.4% (96.9-100) in the SLN group (log-rank p = 0.1). INTERPRETATION: SLNB can allow total ALND to be avoided in patients with negative SLNs, while reducing postoperative morbidity and the costs of hospital stay. The finding that only one overt axillary metastasis occurred during follow-up of patients who did not receive ALND (whereas eight cases were expected) could be explained by various hypotheses, including those from cancer-stem-cell research.

Objective: The aim of this multicenter randomized trial was to assess the efficacy and safety of sentinel lymph node (SLN) biopsy compared with axillary lymph node dissection (ALND).
Background: All studies on SLN biopsy in breast cancer report a variable false-negative rate, whose prognostic consequences are still unclear. Methods: From May 1999 to December 2004, patients with breast cancer ≤3 cm were randomly assigned to receive SLN biopsy associated with ALND (ALND arm) or SLN biopsy followed by ALND only if the SLN was metastatic (SLN arm). The main aim was the comparison of disease-free survival in the 2 arms. Results: A total of 749 patients were randomized and 697 were available for analysis. SLNs were identified in 662 of 697 patients (95%) and positive SLNs were found in 189 of 662 patients (28.5%). In the ALND group, positive non-SLNs were found in 18 patients with negative SLNs, giving a false-negative rate of 16.7% (18 of 108). Postoperative side effects were significantly less frequent in the SLN group and there was no negative impact of the SLN procedure on psychologic well-being. At a median follow-up of 56 months, there were more locoregional recurrences in the SLN arm, and the 5-year disease-free survival was 89.9% in the ALND arm and 87.6% in the SLN arm, a difference of 2.3% (95% confidence interval: -3.1% to 7.6%). However, the number of enrolled patients was not sufficient to draw definitive conclusions. Conclusion: SLN biopsy is an effective and well-tolerated procedure. However, its safety should be confirmed by the results of larger randomized trials and meta-analyses Summary Background Despite the widespread application of sentinel lymph node biopsy (SLNB) for early-stage breast cancer, there is wide variation in reported test performance characteristics. A major aim of this prospective multicentre validation study was to quantify the detection and false-negative rates of SLNB and evaluate factors influencing them.
Methods Eight hundred and forty-two patients with clinically node-negative breast cancer underwent SLNB according to a standardised protocol that used a combination of radiopharmaceutical 99mTc-albumin colloid and Patent Blue V dye. SLNB was followed by standard axillary treatment at the same operation in all patients. Results Sentinel lymph nodes (SLNs) were identified in 803 (96.1%) of 836 evaluable cases. The median number of SLNs removed per patient was 2 (range 1-9). There were 19 false negatives, resulting in a sensitivity of 263/282 (93.3%) and an accuracy of 782/803 (97.6%). SLNs were successfully identified by blue dye in 698 (85.6%), by isotope in 698 (85.6%), and by the combination of blue dye and isotope in 782 (96.0%) of 815 patients. Among 276 node-positive patients, one or more positive SLNs were identified by blue dye in 251 (90.9%), by isotope in 246 (89.1%), and by the combination of blue dye and gamma probe in 258 (93.5%). Obesity, tumor location other than the upper outer quadrant, and non-visualisation of SLNs on the pre-operative lymphoscintiscan were significantly associated with failed localisation (p<0.001, p=0.008, p<0.001, respectively). The false-negative rate in patients with grade 3 tumors was 9.6%, compared with 4.7% in those with grade 2 tumors (p=0.022). The false-negative rate in patients who had one SLN harvested was 10.1%, compared with 1.1% in those who had multiple SLNs (three or more) removed (p=0.010). Conclusions SLNB can accurately determine whether axillary metastases are present in patients with early-stage breast cancer with clinically negative axillary nodes. Both the success and the accuracy of SLNB are optimised by the combined use of blue dye and isotope. SLNB success decreases with increasing body mass, tumor location other than the upper outer quadrant, and non-visualisation of hot nodes on the pre-operative lymphoscintiscan.
This study demonstrates a reduction in the predictive value of a negative SLNB in grade 3 tumors The RACS sentinel node biopsy versus axillary clearance (SNAC) trial compared sentinel-node-based management (SNBM) and axillary lymph-node dissection (ALND) for breast cancer. In this substudy, we sought to determine whether patient ratings of arm swelling, symptoms, function and disability, or clinicians' measurements, were most efficient at detecting differences between randomized groups, and therefore which of these outcome measures would minimise the required sample sizes in future clinical trials. 324 women randomised to SNBM and 319 randomised to ALND were included. The primary endpoint of the trial was the percentage increase in arm volume calculated from clinicians' measurements of arm circumference at 10 cm intervals. Secondary endpoints included reductions in range of motion and sensation (both measured by clinicians), and patients' ratings of arm swelling, symptoms and quality of life, using the European Organisation for Research and Treatment of Cancer Breast Cancer Module (EORTC QLM-BR23), the body image after breast cancer questionnaire (BIBC) and the SNAC study-specific scales (SSSS). The relative efficiency (RE, the squared ratio of the test statistics, with 95% confidence intervals calculated by bootstrapping) was used to compare these measures in detecting differences between the treatment groups. Patients' self-ratings of arm swelling were generally more efficient than clinicians' measurements of arm volume in detecting differences between treatment groups. The SSSS arm symptoms scale was the most efficient (RE = 7.1); the entire SSSS was slightly less so (RE = 4.6). Patients' ratings on single items were 3-5 times more efficient than clinicians' measurements.
Primary endpoints based on patient-rated outcome measures could reduce the required sample size in future surgical trials BACKGROUND Sentinel lymph node (SLN) staging is currently used to avoid complete axillary dissection in breast cancer patients with negative SLNs. Evidence of a similar efficacy, in terms of survival and regional control, of this strategy as compared with axillary resection is based on few clinical trials. In 1998, we started a randomized study comparing the two strategies, and we present here its results. MATERIALS AND METHODS Patients were randomly assigned to sentinel lymph node biopsy (SLNB) and axillary dissection [axillary lymph node dissection (ALND) arm] or to SLNB plus axillary resection if SLNs contained metastases (SLNB arm). The main endpoints were overall survival (OS) and axillary recurrence. RESULTS One hundred and fifteen patients were assigned to the ALND arm and 110 to the SLNB arm. A positive SLN was found in 27 patients in the ALND arm and in 31 in the SLNB arm. The overall accuracy of SLNB was 93.0%. Sensitivity and negative predictive values were 77.1% and 91.1%, respectively. At a median follow-up of 5.5 years, no axillary recurrence was observed in the SLNB arm. OS and event-free survival were not statistically different between the two arms. CONCLUSIONS The SLNB procedure does not appear inferior to conventional ALND for the subset of patients considered here PURPOSE Sentinel lymph node biopsy (SLNB) has been rapidly adopted by surgical oncologists in the management of invasive breast cancer. This study reviews the Royal Australasian College of Surgeons (RACS) Sentinel Node versus Axillary Clearance (SNAC) trial and reports an interim analysis of the first 150 subjects. Other currently open multi-institutional randomized trials of SLNB are reviewed. METHODS The SNAC trial is a multicentre, centrally randomized, phase III clinical trial.
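The relative-efficiency comparison described in the SNAC substudy can be sketched numerically. RE is the squared ratio of two test statistics computed on the same randomized groups, and the sample size needed to detect a given effect scales roughly inversely with efficiency; the z-statistics below are illustrative inputs, not values taken from the trial:

```python
def relative_efficiency(stat_a: float, stat_b: float) -> float:
    """RE = squared ratio of two test statistics comparing the same groups."""
    return (stat_a / stat_b) ** 2

def sample_size_fraction(re: float) -> float:
    """Approximate fraction of patients needed by the more efficient endpoint."""
    return 1.0 / re

# Hypothetical statistics: a patient-rated endpoint separating the arms
# more sharply (z = 2.7) than a clinician-measured one (z = 1.0).
re = relative_efficiency(2.7, 1.0)
print(round(re, 1))                      # 7.3, close to the SSSS RE of 7.1
print(round(sample_size_fraction(7.1), 2))  # 0.14 -> roughly 1/7 the patients
```

This is why the substudy concludes that patient-rated primary endpoints could substantially shrink future surgical trials.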
Subjects are randomized to SLNB alone (with completion axillary clearance, AC, for sentinel-node-positive patients) or AC plus SLNB, with stratification according to age (<50 years, ≥50 years), primary tumour palpability (palpable vs impalpable), lymphatic mapping technique (blue dye plus scintigraphy vs blue dye alone) and centre. RESULTS The trial was launched in May 2001 in two centres. Randomization currently continues at a rate of approximately 30 subjects per month (total, 1,012 at the time of writing) from 32 participating centres in Australia and New Zealand. Data from the first 150 subjects have been analysed to assess: compliance with randomized treatment allocation; measures of test performance for SLNB (detection, removal, sensitivity, specificity and false-negative rates); measures of arm volume, function, symptoms and quality of life; and sample size estimates. CONCLUSIONS The SNAC trial is one of the fastest-accruing clinical trials in Australasia. It is on track to determine whether differences in morbidity, with equivalent cancer-related outcomes, exist between SLNB and AC for women with early breast cancer Background American College of Surgeons Oncology Group Z0010 is a prospective multicenter trial designed to evaluate the prognostic significance of micrometastases in the sentinel lymph nodes and bone marrow aspirates of women with early-stage breast cancer. Surgical complications associated with the sentinel lymph node biopsy surgical procedure are reported. Methods Eligible patients included women with clinical T1/2N0M0 breast cancer. Surgical outcomes were available at 30 days and 6 months after surgery for 5327 patients. Patients who had failed sentinel node mapping (n = 71, 1.4%) or a completion lymph node dissection (n = 814, 15%) were excluded.
Univariate and multivariate analyses were performed to identify predictors of the measured surgical complications. Results In patients who received isosulfan blue dye alone (n = 783) or a combination of blue dye and radiocolloid (n = 4192), anaphylaxis was reported in 0.1% of subjects (5 of 4975). Other complications included axillary wound infection in 1.0%, axillary seroma in 7.1%, and axillary hematoma in 1.4% of subjects. Only increasing age and an increasing number of sentinel lymph nodes removed were significantly associated with an increasing incidence of axillary seroma. At 6 months, 8.6% of patients reported axillary paresthesias, 3.8% had a decreased upper-extremity range of motion, and 6.9% demonstrated proximal upper-extremity lymphedema (change from baseline arm circumference of >2 cm). Significant predictors of surgical complications at 6 months were decreasing age for axillary paresthesias, and increasing body mass index and increasing age for upper-extremity lymphedema. Conclusions This study provides a prospective assessment of the sentinel lymph node biopsy procedure, as performed by a wide range of surgeons, demonstrating a low complication rate AIMS To compare physical morbidity and health-related quality of life (HRQOL) in breast cancer patients who received standard axillary dissection (ALND) or sentinel lymph node biopsy (SLNB), followed by axillary dissection only in the case of sentinel-node positivity, within a randomised clinical trial. PATIENTS AND METHODS Patients with early breast cancer ≤3 cm and a clinically negative axilla were randomly allocated to ALND or SLNB. All patients underwent physical examination every 6 months in order to assess any arm-related symptoms. A subset of patients completed the SF-36 quality-of-life questionnaire and the Psychological General Well-Being Index (PGWBI) before randomisation, at 6 and 12 months after surgery, and yearly thereafter.
Results of the first 24 months are reported. RESULTS Six hundred and seventy-seven patients were available for analysis: 341 patients randomised to the ALND group and 336 to the SLNB group. Six months after surgery, the SLNB group had significantly less lymphoedema, movement restriction, pain and numbness than the ALND group. Lymphoedema was also significantly reduced at 12 months, and numbness remained significantly less frequent in the SLNB arm at all time points. Three hundred and ten patients participated in the HRQOL assessment. The mean scores of the PGWB questionnaire general index and anxiety domain were significantly better in the SLNB group than in the ALND group, but the difference ceased to be significant at 24 months. CONCLUSIONS SLNB is associated with reduced arm morbidity without evidence of a negative impact on psychological well-being. While waiting for the long-term results of ongoing randomised clinical trials, SLNB may be proposed for early-stage breast cancer patients after adequate information on the expected advantages and the possible risks BACKGROUND The feasibility and accuracy of sentinel node biopsy (SLNB) after the delivery of neo-adjuvant chemotherapy (NAC) are controversial. We here report our experience in NAC-treated patients with locally advanced breast cancer and clinically positive axillary nodes, and compare it with the results from our previous randomized trial assessing SLNB in early-stage breast cancer patients. PATIENTS AND METHODS Sixty-four consecutive patients with large infiltrating tumors and clinically positive axillary nodes received NAC and subsequent lymphatic mapping, SLNB and complete axillary lymph node dissection (ALND). The status of the sentinel lymph node (SLN) was compared to that of the axilla. RESULTS At least one SLN was identified in 60 of the 64 patients (93.8%).
Among those 60 patients, 37 (61.7%) had one or more positive SLN(s) and 23 (38.3%) did not. Two of the patients with negative SLN(s) presented metastases in other non-sentinel nodes. SLNB thus had a false-negative rate, a negative predictive value and an overall accuracy of 5.1%, 91.3% and 96.7%, respectively. All these values were similar to those we reported for SLNB in the setting of early-stage breast cancer. CONCLUSION SLNB after NAC is safe and feasible in patients with locally advanced breast cancer and clinically positive nodes, and accurately predicts the status of the axilla Lower socioeconomic status and lack of access to care are often implicated as plausible causes for African American women to present with later-stage breast cancer than Caucasian women. Our objective is to determine if racial differences are present in newly diagnosed breast cancer in women of equivalent socioeconomic status. A retrospective review of prospectively gathered data from women with newly diagnosed breast cancer was performed. All women presented to the indigent (uninsured and below the poverty line) breast clinic for evaluation and treatment of their breast pathology. Data pertaining to epidemiologic factors, diagnosis, pathology, and treatment were collected. The data were analyzed by chi-squared and tailed t-tests. Between March 2002 and May 2004, 52 women (African American = 36, Caucasian = 16) were diagnosed with breast cancer at our clinic. The median age for both groups at presentation was 56.6 years. The staging assessment based on the pathologic size of the tumor was also equivalent between African American and Caucasian women, at 2.29 cm and 2.21 cm, respectively. Metastatic lymph node involvement occurred in 14 women (African American = 7, Caucasian = 7), with 19.4% of African American and 43.8% of Caucasian women being node-positive (p = 0.068).
In fact, there were no statistically significant differences between the races in menarche, menopause, body mass index (BMI), duration of symptoms before presentation, type of diagnostic biopsy or surgery chosen, histology, receptor status, utilization of chemotherapy and radiation, and length of follow-up. The only statistical differences found were in the age at first live birth (African American = 19, Caucasian = 22; p = 0.028), the use of ultrasound in the initial evaluation of a breast mass (less use in African Americans; p = 0.012), and utilization of sentinel lymph node biopsy (Caucasian = 75%, African American = 42%; p = 0.026). Breast cancer in African American women traditionally presents at a more advanced stage and with poor prognostic features. However, when matched for lower socioeconomic status, racial disparities essentially disappear Summary This study is the first large prospective RCT of sentinel node biopsy (SNB) compared with standard axillary treatment (level I-III axillary lymph node dissection or four-node sampling) that includes comprehensive and repeated quality of life (QOL) assessments over 18 months. Patients (n = 829) completed the Functional Assessment of Cancer Therapy - Breast (FACT-B+4) and the Spielberger State/Trait Anxiety Inventory (STAI) at baseline (pre-surgery) and at 1, 3, 6, 12, and 18 months post-surgery. There were significant differences between treatment groups favouring the SNB group throughout the 18-month assessment. Patients in the standard treatment group showed a greater decline in Trial Outcome Index (TOI) scores (the physical well-being, functional well-being and breast cancer concerns subscales of FACT-B+4) and recovered more slowly than patients in the SNB group (p < 0.01). The change in total FACT-B+4 scores (measuring global QOL) closely resembled the TOI results.
At 18 months post-surgery, approximately twice as many patients in the standard group as in the SNB group reported substantial arm swelling (14% versus 7%) (p = 0.002) or numbness (19% versus 8.7%) (p < 0.001). Despite the uncertainty of undergoing a relatively new procedure and the possible need for further surgery, there was no evidence of increased anxiety amongst patients randomised to SNB (p > 0.05). For 6 months post-surgery, younger patients reported less favourable QOL scores (p < 0.001) and greater levels of anxiety (p < 0.01). In view of the benefits regarding arm functioning and quality of life, the data from this randomised study support the use of SNB in patients with clinically node-negative breast cancer Background: Women randomized into the sentinel node biopsy-only arm of the Sentinel Node versus Axillary Clearance Trial require axillary clearance if the sentinel node is unable to be identified, or if the sentinel node contains metastases. The aim of the present study was to determine the likelihood of immediate and delayed axillary clearance in patients in the trial when nodes were subjected to intraoperative imprint cytology AIM As a means of staging the axilla with minimal surgical trauma, sentinel lymph node biopsy (SNB) has dramatically altered the management of early-stage breast cancer. The aim of this prospective multicentre study was to assess the safety of the method in cases of non-palpable tumours and in cases with an open biopsy prior to SNB. METHOD In the period 1999-2001, 57 non-palpable breast cancers and 75 patients with a diagnostic biopsy were collected prospectively for the first part of the study. In the second part, 745 patients with non-palpable breast cancers and 86 cases with prior open surgery diagnosed between 2000 and 2005 were followed up till the end of 2005.
All patients in the first part of the study had axillary clearance irrespective of sentinel node status, whereas in the second part axillary clearance was done only if the sentinel node was metastatic. RESULTS The detection rate was 95% in the group of non-palpable breast cancers, with a false-negative rate of 5.6% (1/18), and the corresponding figures for the group with prior intervention were 96% and 10% (2/20). Two axillary recurrences, after a negative SNB at primary surgery, were found in the non-palpable group after 16 and 17 months, respectively. No axillary recurrence has been observed in the group of cancers with a prior open biopsy. Four women in the non-palpable group and two women with a diagnostic operation experienced distant metastases. CONCLUSION We conclude that SNB is a safe procedure for women with non-palpable breast cancer, as well as after previous open diagnostic excision AIMS Sentinel lymph node (SN) biopsy has been validated in the treatment of breast carcinoma. Patients with a previous excisional biopsy are regarded as ineligible for SN biopsy. We evaluated the results of SN biopsy for this group of patients based on confirmatory axillary lymph node dissection. PATIENTS AND METHODS From April 1997, all 88 patients with stage T1-3 breast cancer who had previously undergone diagnostic excisional biopsy followed by complete axillary lymph node dissection were enrolled into a prospective study to determine the validity of the sentinel node procedure. RESULTS Lymphoscintigraphy visualized one or more axillary hot spots in 84/88 patients. A successful SN biopsy was performed in 87 patients. Complete axillary lymph node dissection showed no false-negative SN biopsies among the 87 SN procedures. CONCLUSION SN biopsy is a reliable and safe method following excisional biopsy, as confirmed by completion axillary lymph node dissection.
Therefore, patients with a previous excisional biopsy are eligible for the sentinel node procedure and can be spared unnecessary axillary lymph node dissection PURPOSE Experience with sentinel node biopsy (SNB) after neoadjuvant chemotherapy is limited. We examined the feasibility and accuracy of this procedure within a randomized trial in patients treated with neoadjuvant chemotherapy. PATIENTS AND METHODS During the conduct of National Surgical Adjuvant Breast and Bowel Project trial B-27, several participating surgeons attempted SNB before the required axillary dissection in 428 patients. All underwent lymphatic mapping and an attempt to identify and remove a sentinel node. Lymphatic mapping was performed with radioactive colloid (14.7%), with lymphazurin blue dye alone (29.9%), or with both (54.7%). RESULTS The success rate for the identification and removal of a sentinel node was 84.8%. The success rate increased significantly with the use of radioisotope (87.6% to 88.9%) versus the use of lymphazurin alone (78.1%, P = .03). There were no significant differences in success rate according to clinical tumor size, clinical nodal status, age, or calendar year of random assignment. Of 343 patients who had SNB and axillary dissection, the sentinel nodes were positive in 125 patients and were the only positive nodes in 70 patients (56.0%). Of the 218 patients with negative sentinel nodes, nonsentinel nodes were positive in 15 (false-negative rate, 10.7%; 15 of 140 patients). There were no significant differences in false-negative rate according to clinical patient and tumor characteristics, method of lymphatic mapping, or breast tumor response to chemotherapy.
CONCLUSION These results are comparable to those obtained from multicenter studies evaluating SNB before systemic therapy and suggest that the sentinel node concept is applicable following neoadjuvant chemotherapy BACKGROUND Although numerous studies have shown that the status of the sentinel node is an accurate predictor of the status of the axillary nodes in breast cancer, the efficacy and safety of sentinel-node biopsy require validation. METHODS From March 1998 to December 1999, we randomly assigned 516 patients with primary breast cancer in whom the tumor was less than or equal to 2 cm in diameter either to sentinel-node biopsy and total axillary dissection (the axillary-dissection group) or to sentinel-node biopsy followed by axillary dissection only if the sentinel node contained metastases (the sentinel-node group). RESULTS The number of sentinel nodes found was the same in the two groups. A sentinel node was positive in 83 of the 257 patients in the axillary-dissection group (32.3 percent), and in 92 of the 259 patients in the sentinel-node group (35.5 percent). In the axillary-dissection group, the overall accuracy of the sentinel-node status was 96.9 percent, the sensitivity 91.2 percent, and the specificity 100 percent. There was less pain and better arm mobility in the patients who underwent sentinel-node biopsy only than in those who also underwent axillary dissection. There were 15 events associated with breast cancer in the axillary-dissection group and 10 such events in the sentinel-node group. Among the 167 patients who did not undergo axillary dissection, there were no cases of overt axillary metastasis during follow-up. CONCLUSIONS Sentinel-node biopsy is a safe and accurate method of screening the axillary nodes for metastasis in women with a small breast cancer PURPOSE Axillary lymph node dissection (ALND) as part of surgical treatment for patients with breast cancer is associated with significant morbidity.
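The accuracy figures reported for the validation arm of the randomized trial above follow from a standard diagnostic confusion matrix: 83 true-positive sentinel nodes, 8 false negatives, and 166 true negatives (false positives cannot occur, because a positive sentinel node is histologically confirmed). A sketch of that arithmetic:

```python
def diagnostic_metrics(tp: int, fn: int, tn: int, fp: int = 0) -> dict:
    """Standard test-performance measures, as percentages (1 d.p.)."""
    return {
        "sensitivity": round(100 * tp / (tp + fn), 1),
        "specificity": round(100 * tn / (tn + fp), 1),
        "accuracy": round(100 * (tp + tn) / (tp + fn + tn + fp), 1),
        "npv": round(100 * tn / (tn + fn), 1),  # negative predictive value
    }

# Counts from the axillary-dissection arm quoted above:
# 174 negative SLNs, of which 8 were false negatives -> 166 true negatives.
m = diagnostic_metrics(tp=83, fn=8, tn=166)
print(m)  # sensitivity 91.2, specificity 100.0, accuracy 96.9
```

The same helper reproduces the figures of the other validation studies in this section when fed their respective counts.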
Sentinel lymph node biopsy (SLNB) is a newly developed method of staging the axilla and has the potential to avoid an ALND in lymph node-negative patients, thereby minimizing morbidity. The aim of this study was to investigate physical and psychological morbidity after SLNB in the treatment of early breast cancer in a randomized controlled trial. PATIENTS AND METHODS Between November 1999 and February 2003, 298 patients with early breast cancer (tumors 3 cm or less on ultrasound examination) who were clinically node-negative were randomly allocated to undergo ALND (control group) or SLNB followed by ALND if subsequently found to be lymph node-positive (study group). A detailed assessment of physical and psychological morbidity was performed over a 1-year period postoperatively. RESULTS A significant reduction in postoperative arm swelling, rate of seroma formation, numbness, and loss of sensitivity to light touch and pinprick was observed in the study group. Although shoulder mobility was less impaired on average in the study group, this was significant only for abduction at 1 month and flexion at 3 months. Scores reflecting quality of life and psychological morbidity were significantly better in the study group in the immediate postoperative period, with fewer long-term differences. CONCLUSION SLNB in patients undergoing surgery for breast cancer results in a significant reduction in physical and psychological morbidity BACKGROUND The goals of axillary-lymph-node dissection (ALND) are to maximise survival, provide regional control, and stage the patient. However, this technique has substantial side-effects. The purpose of the B-32 trial is to establish whether sentinel-lymph-node (SLN) resection can achieve the same therapeutic goals as conventional ALND but with decreased side-effects. The aim of this paper is to report the technical success and accuracy of SLN resection plus ALND versus SLN resection alone.
METHODS 5611 women with invasive breast cancer were randomly assigned to receive either SLN resection followed by immediate conventional ALND (n=2807; group 1) or SLN resection without ALND if SLNs were negative on intraoperative cytology and histological examination (n=2804; group 2) in the B-32 trial. Patients in group 2 underwent ALND if no SLNs were identified or if one or more SLNs were positive on intraoperative cytology or subsequent histological examination. Primary endpoints, including survival, regional control, and morbidity, will be reported later. Secondary endpoints are accuracy and technical success and are reported here. This trial is registered with the Clinical Trial registry, number NCT00003830. FINDINGS Data on technical success were available for 5536 of 5611 patients; 75 declined protocol treatment, had no SLNs removed, or had no SLN resection done. SLNs were successfully removed in 97.2% of patients (5379 of 5536) in both groups combined. Identification of a preincision hot spot was associated with greater SLN removal (98.9% [5072 of 5128]). Only 1.4% (189 of 13,171) of SLN specimens were outside of axillary levels I and II. 65.1% (8571 of 13,171) of SLN specimens were both radioactive and blue; a small percentage were identified by palpation only (3.9% [515 of 13,171]). The overall accuracy of SLN resection in patients in group 1 was 97.1% (2544 of 2619; 95% CI 96.4-97.7), with a false-negative rate of 9.8% (75 of 766; 95% CI 7.8-12.2). Differences in tumour location, type of biopsy, and number of SLNs removed significantly affected the false-negative rate. Allergic reactions related to blue dye occurred in 0.7% (37 of 5588) of patients with data on toxic effects. INTERPRETATION The findings reported here indicate excellent balance in clinical patient characteristics between the two randomised groups and that the success of SLN resection was high.
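The B-32 false-negative rate of 9.8% (75 of 766) is quoted with a 95% CI of 7.8-12.2. A Wilson-score interval is one standard way to approximate such a CI for a proportion; the trial itself may have used a different (e.g. exact Clopper-Pearson) method, so the endpoints need not match exactly:

```python
from math import sqrt

def wilson_ci(events: int, total: int, z: float = 1.96):
    """Wilson-score confidence interval for a binomial proportion.
    An approximation only; exact methods give slightly different bounds."""
    p = events / total
    denom = 1 + z**2 / total
    centre = p + z**2 / (2 * total)
    margin = z * sqrt(p * (1 - p) / total + z**2 / (4 * total**2))
    return (centre - margin) / denom, (centre + margin) / denom

# B-32 group 1: 75 false negatives among 766 node-positive patients
lo, hi = wilson_ci(75, 766)
print(round(100 * 75 / 766, 1))        # 9.8, the reported point estimate
print(round(100 * lo, 1), round(100 * hi, 1))  # close to the reported 7.8-12.2
```

The Wilson interval here comes out at roughly 7.9-12.1%, within a tenth of a percentage point of the published bounds.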
These findings are important because the B-32 trial is the only trial of sufficient size to provide definitive information related to the primary outcome measures of survival and regional control. Removal of more than one SLN and avoidance of excisional biopsy are important variables in reducing the false-negative rate The timing of sentinel node biopsy in the setting of neo-adjuvant chemotherapy for breast cancer is controversial. Sentinel node biopsy performed after neo-adjuvant chemotherapy may save patients with a nodal response the morbidity of an axillary lymph node dissection. A retrospective review of prospectively collected data compared sentinel node biopsies performed after patients had received neo-adjuvant chemotherapy with those in patients who had not received neo-adjuvant chemotherapy. Demographic factors, tumor characteristics, and the results of the sentinel node biopsies and completion lymph node dissections (when applicable) were compared. A total of 231 axillary procedures (224 patients) were evaluated. The patients who received neo-adjuvant chemotherapy (NEO; N=52) were younger, had higher-grade tumors, were more likely to have a mastectomy, and were more likely to have ER-negative and HER-2/neu-positive tumors than the patients who did not receive neo-adjuvant chemotherapy (NON; N=179). The mean clinical tumor size in the neo-adjuvant group was 4.5 cm (±1.8) prior to chemotherapy; the post-chemotherapy pathologic size was 1.4 cm (±1.3). A sentinel node was identified in all cases. There were no significant differences between the groups in the mean number of sentinel nodes removed (NEO=3.3; NON=3.1; p=0.545), the percentage of positive axillae (NEO=24%; NON=21%; p=0.776) or the mean number of positive sentinel nodes (NEO=1.3; NON=1.5; p=0.627).
There was no difference in the percentage of completion lymph node dissections with additional positive nodes (NEO=20%; NON=35%; p=0.462); there was a difference in the number of nodes removed in the completion lymph node dissections (mean NEO=12.0; NON=16.4; p=0.047). Sentinel node biopsy performed after neo-adjuvant chemotherapy appears to be an oncologically sound procedure and may save some patients the morbidity of a complete lymph node dissection Objective: Sentinel node biopsy (SNB) is widely used to stage the axilla in breast cancer. We present 10-year follow-up of our single-institute trial designed to compare outcomes in patients who received no axillary dissection if the sentinel node was negative with patients who received complete axillary dissection. Methods: From March 1998 to December 1999, 516 patients with primary breast cancer up to 2 cm in pathologic diameter were randomized either to SNB plus complete axillary dissection (AD arm) or to SNB with axillary dissection only if the sentinel node contained metastases (SN arm). Results: The two arms were well balanced for the number of sentinel nodes found, the proportion of positive sentinel nodes, and all other tumor and patient characteristics. Eight patients in the AD arm had false-negative SNs on histologic analysis; a similar number (8, 95% CI: 3-15) of patients with axillary involvement was expected among SN arm patients who did not receive axillary dissection, but only 2 cases of overt axillary metastasis occurred. There were 23 breast cancer-related events in the SN arm and 26 in the AD arm (log-rank, P = 0.52), while overall survival was greater in the SN arm (log-rank, P = 0.15). Conclusions: Preservation of healthy lymph nodes may have beneficial consequences.
Axillary dissection should not be performed in breast cancer patients without first examining the sentinel node . Background The axillary recurrence ( AR ) rate after negative sentinel node biopsy ( SNB ) in patients with high risk of axillary metastases is largely unknown . The aim of this study was to analyze the risk factors for isolated AR after negative SNB with special interest in large or multifocal tumors . Methods A prospective SNB registry was analyzed for 2,408 invasive breast cancer patients operated between 2001 and 2007 . No axillary clearance was performed in 1,309 cases with a negative SNB , including 1,138 small unifocal tumors , 121 small multifocal tumors , 48 large unifocal tumors , and 2 large multifocal tumors . Results Six ( 0.5 % ) isolated AR were observed during a median follow-up of 43 months . Four ( 0.4 % ) patients with small unifocal tumors and two ( 1.6 % ) with small multifocal tumors had isolated AR ( p = 0.179 ) . None of the patients with large unifocal or multifocal tumors had isolated AR . Instead of tumor size and multifocality , estrogen receptor negativity ( p < 0.001 ) , nuclear grade III ( p < 0.001 ) , Her-2 status ( p = 0.002 ) , no radiotherapy ( p = 0.005 ) , and mastectomy ( p = 0.005 ) were found to be associated with AR . Conclusions A remarkable proportion of patients with large unifocal tumors and small multifocal tumors may avoid unnecessary AC due to tumor negative SNB , without an excessive risk of AR . IMPORTANCE Sentinel lymph node ( SLN ) surgery provides reliable nodal staging information with less morbidity than axillary lymph node dissection ( ALND ) for patients with clinically node-negative ( cN0 ) breast cancer . The application of SLN surgery for staging the axilla following chemotherapy for women who initially had node-positive cN1 breast cancer is unclear because of high false-negative results reported in previous studies . 
OBJECTIVE To determine the false-negative rate ( FNR ) for SLN surgery following chemotherapy in women initially presenting with biopsy-proven cN1 breast cancer . DESIGN , SETTING , AND PATIENTS The American College of Surgeons Oncology Group ( ACOSOG ) Z1071 trial enrolled women from 136 institutions from July 2009 to June 2011 who had clinical T0 through T4 , N1 through N2 , M0 breast cancer and received neoadjuvant chemotherapy . Following chemotherapy , patients underwent both SLN surgery and ALND . Sentinel lymph node surgery using both blue dye ( isosulfan blue or methylene blue ) and a radiolabeled colloid mapping agent was encouraged . MAIN OUTCOMES AND MEASURES The primary end point was the FNR of SLN surgery after chemotherapy in women who presented with cN1 disease . We evaluated the likelihood that the FNR in patients with 2 or more SLNs examined was greater than 10 % , the rate expected for women undergoing SLN surgery who present with cN0 disease . RESULTS Seven hundred fifty-six women were enrolled in the study . Of 663 evaluable patients with cN1 disease , 649 underwent chemotherapy followed by both SLN surgery and ALND . An SLN could not be identified in 46 patients ( 7.1 % ) . Only 1 SLN was excised in 78 patients ( 12.0 % ) . Of the remaining 525 patients with 2 or more SLNs removed , no cancer was identified in the axillary lymph nodes of 215 patients , yielding a pathological complete nodal response of 41.0 % ( 95 % CI , 36.7%-45.3 % ) . In 39 patients , cancer was not identified in the SLNs but was found in lymph nodes obtained with ALND , resulting in an FNR of 12.6 % ( 90 % Bayesian credible interval , 9.85%-16.05 % ) . CONCLUSIONS AND RELEVANCE Among women with cN1 breast cancer receiving neoadjuvant chemotherapy who had 2 or more SLNs examined , the FNR was not found to be 10 % or less . 
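The headline rates in the ACOSOG Z1071 abstract above can be reproduced directly from the reported counts. A minimal sketch, using only figures stated in the abstract; it reproduces the quoted point estimates, not the trial's Bayesian credible interval:

```python
# Reproducing the ACOSOG Z1071 point estimates from the abstract's counts.
evaluable = 525          # patients with >= 2 SLNs removed
node_negative = 215      # no cancer found in any axillary lymph node
false_negative = 39      # SLNs negative, but ALND found residual disease

node_positive = evaluable - node_negative     # 310 with residual nodal disease
pcr_rate = node_negative / evaluable          # pathological complete nodal response
fnr = false_negative / node_positive          # false-negative rate of SLN surgery

print(f"pCR rate: {pcr_rate:.1%}")   # 41.0%, as reported
print(f"FNR:      {fnr:.1%}")        # 12.6%, as reported
```

Note that the denominator of the FNR is the 310 patients with residual nodal disease, not all 525 evaluable patients, which is why 39 false negatives yield 12.6% rather than 7.4%.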
Given this FNR threshold , changes in approach and patient selection that result in greater sensitivity would be necessary to support the use of SLN surgery as an alternative to ALND . TRIAL REGISTRATION ClinicalTrials.gov Identifier : NCT00881361 . BACKGROUND Sentinel-lymph-node ( SLN ) surgery was designed to minimise the side-effects of lymph-node surgery but still offer outcomes equivalent to axillary-lymph-node dissection ( ALND ) . The aims of National Surgical Adjuvant Breast and Bowel Project ( NSABP ) trial B-32 were to establish whether SLN resection in patients with breast cancer achieves the same survival and regional control as ALND , but with fewer side-effects . METHODS NSABP B-32 was a randomised controlled phase 3 trial done at 80 centres in Canada and the USA between May 1 , 1999 , and Feb 29 , 2004 . Women with invasive breast cancer were randomly assigned to either SLN resection plus ALND ( group 1 ) or to SLN resection alone with ALND only if the SLNs were positive ( group 2 ) . Random assignment was done at the NSABP Biostatistical Center ( Pittsburgh , PA , USA ) with a biased coin minimisation approach in an allocation ratio of 1:1 . Stratification variables were age at entry ( ≤ 49 years , ≥ 50 years ) , clinical tumour size ( ≤ 2·0 cm , 2·1 - 4·0 cm , ≥ 4·1 cm ) , and surgical plan ( lumpectomy , mastectomy ) . SLN resection was done with a blue dye and radioactive tracer . Outcome analyses were done in patients who were assessed as having pathologically negative sentinel nodes and for whom follow-up data were available . The primary endpoint was overall survival . Analyses were done on an intention-to-treat basis . All deaths , irrespective of cause , were included . The mean time on study for the SLN-negative patients with follow-up information was 95·6 months ( range 70·1 - 126·7 ) . This study is registered with ClinicalTrials.gov , number NCT00003830 . 
FINDINGS 5611 women were randomly assigned to the treatment groups , 3989 had pathologically negative SLN . 309 deaths were reported in the 3986 SLN-negative patients with follow-up information : 140 of 1975 patients in group 1 and 169 of 2011 in group 2 . Log-rank comparison of overall survival in groups 1 and 2 yielded an unadjusted hazard ratio ( HR ) of 1·20 ( 95 % CI 0·96 - 1·50 ; p=0·12 ) . 8-year Kaplan-Meier estimates for overall survival were 91·8 % ( 95 % CI 90·4 - 93·3 ) in group 1 and 90·3 % ( 88·8 - 91·8 ) in group 2 . Treatment comparisons for disease-free survival yielded an unadjusted HR of 1·05 ( 95 % CI 0·90 - 1·22 ; p=0·54 ) . 8-year Kaplan-Meier estimates for disease-free survival were 82·4 % ( 80·5 - 84·4 ) in group 1 and 81·5 % ( 79·6 - 83·4 ) in group 2 . There were eight regional-node recurrences as first events in group 1 and 14 in group 2 ( p=0·22 ) . Patients are continuing follow-up for longer-term assessment of survival and regional control . The most common adverse events were allergic reactions , mostly related to the administration of the blue dye . INTERPRETATION Overall survival , disease-free survival , and regional control were statistically equivalent between groups . When the SLN is negative , SLN surgery alone with no further ALND is an appropriate , safe , and effective therapy for breast cancer patients with clinically negative lymph nodes . FUNDING US Public Health Service , National Cancer Institute , and Department of Health and Human Services
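As a rough sanity check on the B-32 findings: the unadjusted HR of 1·20 comes from a time-to-event (log-rank) analysis, but the crude death proportions given in the abstract land in the same range. This is a sketch only; a crude risk ratio ignores censoring and follow-up time and is not a substitute for the reported hazard ratio:

```python
# Crude death proportions from the NSABP B-32 abstract
# (group 1 = SLN + ALND; group 2 = SLN alone).
deaths_g1, n_g1 = 140, 1975
deaths_g2, n_g2 = 169, 2011

risk_g1 = deaths_g1 / n_g1
risk_g2 = deaths_g2 / n_g2
crude_rr = risk_g2 / risk_g1   # not the HR, but a magnitude check

print(f"Group 1 deaths: {risk_g1:.1%}")     # 7.1%
print(f"Group 2 deaths: {risk_g2:.1%}")     # 8.4%
print(f"Crude risk ratio: {crude_rr:.2f}")  # ~1.19, same ballpark as HR 1.20
```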
2,029
29,276,545
The current systematic review will focus mainly on behavioral studies that look into the dual effects of different types of physical training ( e.g. , balance training , aerobic training , strength training , group sports , etc . ) . Overall , the 19 included studies indicate that improvements in both motor and cognitive functions were found , mainly in interventions that adopt physical-cognitive training or combined exercise training .
The decline in cognitive and motor functions with age affects the performance of the aging healthy population in many daily life activities . Physical activity appears to mitigate this decline or even improve motor and cognitive abilities in older adults .
OBJECTIVE To compare training and follow-up effects of combined aerobic and strength training versus aerobic-only training on cognitive and motor function in institutionalized patients with dementia and to explore whether improved motor function mediates improved cognitive function . METHODS Using a 9-week , parallel , three-group , single-blind , randomized , controlled trial with a follow-up assessment at week 18 , we assessed 109 patients with dementia ( age 85.5 ± 5.1 years ) in a psycho-geriatric nursing home . Each 9-week intervention consisted of 36 , 30-minute sessions . A combined group ( N = 37 ) received and completed two strength and two walking sessions per week , an aerobic group ( N = 36 ) completed four walking sessions , and a social group ( N = 36 ) completed four social visits per week . Cognitive and motor functions were assessed at baseline , after the 9-week intervention , and after a consecutive 9 weeks of usual care . RESULTS Baseline corrected post-test scores in the combined versus the social group were higher for global cognition , visual memory , verbal memory , executive function , walking endurance , leg muscle strength , and balance . Aerobic versus social group scores were higher for executive function . Follow-up effects reversed toward baseline values . Motor improvement did not significantly mediate cognitive improvement . CONCLUSION Compared with a nonexercise control group , a combination of aerobic and strength training is more effective than aerobic-only training in slowing cognitive and motor decline in patients with dementia . No mediating effects between improvements in cognitive function via improved motor function were found . Future research into the underlying mechanistic associations is needed . Background About one-third of people older than 65 years fall at least once a year . Physical exercise has been previously demonstrated to improve gait , enhance physical fitness , and prevent falls . 
Nonetheless , the addition of cognitive training components may potentially increase these effects , since cognitive impairment is related to gait irregularities and fall risk . We hypothesized that simultaneous cognitive – physical training would lead to greater improvements in dual-task ( DT ) gait compared to exclusive physical training . Methods Elderly persons older than 70 years and without cognitive impairment were randomly assigned to the following groups : 1 ) virtual reality video game dancing ( DANCE ) , 2 ) treadmill walking with simultaneous verbal memory training ( MEMORY ) , or 3 ) treadmill walking ( PHYS ) . Each program was complemented with strength and balance exercises . Two 1-hour training sessions per week over 6 months were applied . Gait variables , functional fitness ( Short Physical Performance Battery , 6-minute walk ) , and fall frequencies were assessed at baseline , after 3 months and 6 months , and at 1-year follow-up . Multiple regression analyses with planned comparisons were carried out . Results Eighty-nine participants were randomized to three groups initially ; 71 completed the training and 47 were available at 1-year follow-up . DANCE/MEMORY showed a significant advantage compared to PHYS in DT costs of step time variability at fast walking ( P=0.044 ) . Training-specific gait adaptations were found on comparing DANCE and MEMORY : DANCE reduced step time at fast walking ( P=0.007 ) and MEMORY reduced gait variability in DT and DT costs at preferred walking speed ( both trend P=0.062 ) . Global linear time effects showed improved gait ( P<0.05 ) , functional fitness ( P<0.05 ) , and reduced fall frequency ( −77 % , P<0.001 ) . Only single-task fast walking , gait variability at preferred walking speed , and Short Physical Performance Battery were reduced at follow-up ( all P<0.05 or trend ) . 
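The dual-task (DT) costs reported above are, by the usual convention in the gait literature, the relative change in a gait measure from single-task to dual-task walking. A sketch of that convention with hypothetical values, since the abstract does not give the study's raw gait data:

```python
# Dual-task (DT) cost as commonly defined in the gait literature:
# relative change in a gait measure from single-task (ST) to DT walking.
def dual_task_cost(single_task, dual_task):
    """Positive values indicate worse performance under the dual task."""
    return (dual_task - single_task) / single_task * 100.0

# Hypothetical step time variability values (coefficient of variation, %),
# not taken from the study above.
st_step_time_var = 2.0   # single-task walking
dt_step_time_var = 3.1   # walking with a concurrent cognitive task

print(f"DT cost: {dual_task_cost(st_step_time_var, dt_step_time_var):.0f}%")  # 55%
```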
Conclusion Long-term multicomponent cognitive – physical and exclusive physical training programs demonstrated similar potential to counteract age-related decline in physical functioning . This study aimed to evaluate the effects of different types of exercise on cognition . Eighty participants , 32 males and 48 females , aged 66.96 ± 11.73 , volunteered for this study . The participants were randomly divided into the four following groups : Resistance Group ( RG ; n=20 ) , involved in high intensity strength training ; Cardiovascular Group ( CVG ; n=20 ) , involved in high intensity cardiovascular training ; Postural Group ( PG ; n=20 ) involved in low intensity training , based on postural and balance exercises ; and Control Group ( CG ; n=20 ) . Exercises were performed over the course of 12 weeks . All participants were tested for their cognitive functions pre- and post-intervention using the following neurocognitive tests : the Attentive Matrices Test , Raven 's Progressive Matrices , Stroop Color and Word Interference Test , Trail Making Test and Drawing Copy Test . Statistical analysis showed that the CVG group improved significantly in the Attentive Matrices Test and Raven 's Progressive Matrices ( both p<0.05 ) , whereas the RG group improved in Drawing Copy Test time ( p<0.05 ) . These results confirm that different types of exercise interventions have unique effects on cognition . Cardiovascular training is effective in improving performance in attentive and analytic tasks , whereas resistance training is effective in improving praxis . Further investigation is necessary to evaluate the combination of the two exercise types in order to ascertain if their respective effects can be summated when performed together . This study examined transfer effects of fall training on fear of falling ( Falls Efficacy Scale-International [ FES-I ] ) , balance performance , and spatiotemporal gait characteristics in older adults . 
Eighteen community-dwelling older adults ( ages 65 - 85 ) were randomly assigned to an intervention or control group . The intervention group completed 12 training sessions ( 60 min , 6 weeks ) . During pre- and posttesting , we measured FES-I , balance performance ( double limb , closed eyes ; single limb , open eyes ; double limb , open eyes with motor-interfered task ) , and gait parameters ( e.g. , velocity ; cadence ; stride time , stride width , and stride length ; variability of stride time and stride length ) under single- and motor-interfered tasks . Dual tasks were applied to appraise improvements of cognitive processing during balance and gait . FES-I ( p = .33 ) and postural sway did not significantly change ( 0.36 < p < .79 ) . Trends toward significant interaction effects were found for step width during normal walking and stride length variability during the motor dual task ( p = .05 , ηp² = .22 ) . Fall training did not sufficiently improve fear of falling , balance , or gait performance under single- or dual-task conditions in healthy older adults . Background Stepping impairments are associated with physical and cognitive decline in older adults and increased fall risk . Exercise interventions can reduce fall risk , but adherence is often low . A new exergame involving step training may provide an enjoyable exercise alternative for preventing falls in older people . Purpose To assess the feasibility and safety of unsupervised , home-based step pad training and determine the effectiveness of this intervention on stepping performance and associated fall risk in older people . Design Single-blinded two-arm randomized controlled trial comparing step pad training with control ( no-intervention ) . Setting / Participants Thirty-seven older adults residing in independent-living units of a retirement village in Sydney , Australia . 
Intervention Intervention group ( IG ) participants were provided with a computerized step pad system connected to their TVs and played a step game as often as they liked ( with a recommended dose of 2–3 sessions per week for 15–20 minutes each ) for eight weeks . In addition , IG participants were asked to complete a choice stepping reaction time ( CSRT ) task once each week . Main Outcome Measures CSRT , the Physiological Profile Assessment ( PPA ) , neuropsychological and functional mobility measures were assessed at baseline and eight week follow-up . Results Thirty-two participants completed the study ( 86.5 % ) . IG participants played a median 2.75 sessions/week and no adverse events were reported . Compared to the control group , the IG significantly improved their CSRT ( F31,1 = 18.203 , p<.001 ) , PPA composite scores ( F31,1 = 12.706 , p = 0.001 ) , as well as the postural sway ( F31,1 = 4.226 , p = 0.049 ) and contrast sensitivity ( F31,1 = 4.415 , p = 0.044 ) PPA sub-component scores . In addition , the IG improved significantly in their dual-task ability as assessed by a timed up and go test/verbal fluency task ( F31,1 = 4.226 , p = 0.049 ) . Conclusions Step pad training can be safely undertaken at home to improve physical and cognitive parameters of fall risk in older people without major cognitive and physical impairments . Trial Registration Australian New Zealand Clinical Trials Registry ACTRN12611001081909 . Different types of exercise training have the potential to induce structural and functional brain plasticity in the elderly . Thereby , functional brain adaptations were observed during cognitive tasks in functional magnetic resonance imaging studies that correlated with improved cognitive performance . This study aimed to investigate if exercise training induces functional brain plasticity during challenging treadmill walking and elicits associated changes in cognitive executive functions . 
Forty-two elderly participants were recruited and randomly assigned to either interactive cognitive-motor video game dancing ( DANCE ) or balance and stretching training ( BALANCE ) . The 8-week intervention included three sessions of 30 min per week and was completed by 33 participants ( mean age 74.9 ± 6.9 years ) . Prefrontal cortex ( PFC ) activity during preferred and fast walking speed on a treadmill was assessed applying functional near infrared spectroscopy pre- and post-intervention . Additionally , executive functions comprising shifting , inhibition , and working memory were assessed . The results showed that both interventions significantly reduced left and right hemispheric PFC oxygenation during the acceleration of walking ( p < 0.05 or trend , r = 0.25–0.36 ) , while DANCE showed a larger reduction at the end of the 30-s walking task compared to BALANCE in the left PFC [ F(1 , 31 ) = 3.54 , p = 0.035 , r = 0.32 ] . These exercise training induced modulations in PFC oxygenation correlated with improved executive functions ( p < 0.05 or trend , r = 0.31–0.50 ) . The observed reductions in PFC activity may release cognitive resources to focus attention on other processes while walking , which could be relevant to improve mobility and falls prevention in the elderly . This study provides a deeper understanding of the associations between exercise training , brain function during walking , and cognition in older adults . Background Although basic research has uncovered biological mechanisms by which exercise could maintain and enhance adult brain health , experimental human studies with older adults have produced equivocal results . 
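The effect sizes r reported alongside the F statistics above are consistent with the standard conversion for tests with one numerator degree of freedom, r = sqrt(F / (F + df_error)). Checking it against the reported left-PFC contrast:

```python
import math

def f_to_r(f_stat, df_error):
    # Effect size r for an F test with 1 numerator df:
    # r = sqrt(F / (F + df_error))
    return math.sqrt(f_stat / (f_stat + df_error))

# F(1, 31) = 3.54 from the abstract; the reported effect size is r = 0.32.
print(f"r = {f_to_r(3.54, 31):.2f}")  # r = 0.32
```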
Purpose This randomized clinical trial aimed to investigate the hypotheses that ( a ) the effects of exercise training on the performance of neurocognitive tasks in older adults is selective , influencing mainly tasks with a substantial executive control component and ( b ) performance in neurocognitive tasks is related to cardiorespiratory fitness . Methods Fifty-seven older adults ( 65–79 years ) participated in aerobic or strength-and-flexibility exercise training for 10 months . Neurocognitive tasks were selected to reflect a range from little ( e.g. , simple reaction time ) to substantial ( i.e. , Stroop Word – Color conflict ) executive control . Results Performance in tasks requiring little executive control was unaffected by participating in aerobic exercise . Improvements in Stroop Word – Color task performance were found only for the aerobic exercise group . Changes in aerobic fitness were unrelated to changes in neurocognitive function . Conclusions Aerobic exercise in older adults can have a beneficial effect on the performance of speeded tasks that rely heavily on executive control . Improvements in aerobic fitness do not appear to be a prerequisite for this beneficial effect . Strength training has been reported as a potentially useful exercise to improve psychological aspects in the elderly , but its effects remain controversial . This study investigated the effectiveness of strength training conducted twice a week for 12 weeks for improving health-related quality of life ( HRQOL ) and executive cognitive function . The study was a single-blind randomized controlled trial with assessments before and after intervention . HRQOL and executive function were assessed using the SF-36 Health Status Survey and a computerized neuro-cognitive assessment using task-switch reaction time trials , respectively . 
Subjects comprised 119 participants ≥ 65 years old , randomized to either strength training ( n=65 ) or health education classes ( controls , n=54 ) . The strength training program was designed to strengthen the large muscle groups most important for functional activities and to improve balance . The effects of the intervention on the eight dimensions of the SF-36 in the control and training groups were analyzed . Only the mental health scale of the SF-36 was significantly improved for the training group compared with controls after 12 weeks . Task-switch reaction time and correct response rate remained unchanged . Short-term strength training might have modest positive effects on HRQOL , although this training period may not be sufficient to affect executive function in relatively healthy older people . Background Physical activity ( PA ) for older adults has well-documented physical and cognitive benefits , but most seniors do not meet recommended guidelines for PA , and interventions are lacking . Objectives This study evaluated the efficacy of a 12-week Internet intervention to help sedentary older adults over 55 years of age adopt and maintain an exercise regimen . Methods A total of 368 sedentary men and women ( M=60.3 ; SD 4.9 ) were recruited , screened , and assessed online . They were randomized into treatment and control groups and assessed at pretest , at 12 weeks , and at 6 months . After treatment group participants rated their fitness level , activity goals , and barriers to exercise , the Internet intervention program helped them select exercise activities in the areas of endurance , flexibility , strengthening , and balance enhancement . They returned to the program weekly for automated video and text support and education , with the option to change or increase their exercise plan . The program also included ongoing problem solving to overcome user-identified barriers to exercise . 
Results The multivariate model indicated significant treatment effects at posttest ( P=.001 ; large effect size ) and at 6 months ( P=.001 ; medium effect size ) . At posttest , intervention participation showed significant improvement on 13 of 14 outcome measures compared to the control participants . At 6 months , treatment participants maintained large gains compared to the control participants on all 14 outcome measures . Conclusions These results suggest that an online PA program has the potential to positively impact the physical activity of sedentary older adult participants . More research is needed to replicate the study results , which were based on self-report measures . Research is also needed on intervention effects with older populations . Background While many studies confirm the positive effect of cognitive and physical training on cognitive performance of older adults , only little is known about the effects of simultaneously performed cognitive and physical training . In the current study , older adults simultaneously performed a verbal working memory and a cardiovascular training to improve cognitive and motor-cognitive dual task performance . Twenty training sessions of 30 minutes each were conducted over a period of ten weeks , with a test session before , in the middle , and after the training . Training gains were tested in measures of selective attention , paired-associates learning , executive control , reasoning , memory span , information processing speed , and motor-cognitive dual task performance in the form of walking and simultaneously performing a working memory task . Results Sixty-three participants with a mean age of 71.8 ± 4.9 years ( range 65 to 84 ) either performed the simultaneous training ( N = 21 ) , performed a single working memory training ( N = 16 ) , or attended no training at all ( N = 26 ) . 
The results indicate similar training progress and larger improvements in the executive control task for both training groups when compared to the passive control group . In addition , the simultaneous training resulted in larger improvements compared to the single cognitive training in the paired-associates task and was able to reduce the step-to-step variability during the motor-cognitive dual task when compared to the single cognitive training and the passive control group . Conclusions The simultaneous training of cognitive and physical abilities presents a promising training concept to improve cognitive and motor-cognitive dual task performance , offering greater potential on daily life functioning , which usually involves the recruitment of multiple abilities and resources rather than a single one . BACKGROUND Japan is one of the most rapidly ageing societies in the world . A number of municipalities have started services for the prevention of cognitive decline for community-dwelling elderly individuals , but the effectiveness of these services is currently insufficient . Our study explored the efficacy of a comprehensive intervention programme consisting of physical and leisure activities to prevent cognitive decline in community-dwelling elderly subjects . METHOD We administered a 12-week intervention programme consisting of physical and leisure activities aimed at enhancing participants ' motivation to participate and support one another by providing a pleasant atmosphere , empathetic communication , praise , and errorless support . This programme for the prevention of cognitive decline was conducted as a service by the city of Maebashi . All participants underwent the Five-Cog test , which evaluated the cognitive domains of attention , memory , visuospatial function , language , and reasoning . Executive function was evaluated by the Wechsler Digit Symbol Substitution Test and Yamaguchi Kanji-Symbol Substitution Test . 
Subjective health status , level of social support , functional capacity , subjective quality of life , and depressive symptoms were assessed with a questionnaire . Grip strength test , timed up-and-go test , 5-m maximum walking times test , and functional reach test were performed to evaluate physical function . Fifty-two participants were randomly allocated to intervention ( n = 26 ) and control ( n = 26 ) groups . Twenty-six participants , aged between 65 - 87 years , received intervention once a week at a community centre . The programme was conducted by health-care professionals , with the help of senior citizen volunteers . RESULTS The intervention group ( n = 19 ) showed significant improvement on the analogy task of the Five-Cog test ( F(1,38 ) = 4.242 , P = 0.046 ) and improved quality of life ( F(1,38 ) = 4.773 , P = 0.035 ) as compared to the control group ( n = 24 ) . CONCLUSION A community-based 12-week intervention programme that aimed to enhance motivation to participate in activities resulted in improvements in some aspects of cognitive function and quality of life . Senior citizens who volunteered in the present intervention enabled the smooth implementation of the programme and alleviated the burden on professional staff . OBJECTIVES To evaluate the efficacy of a municipality-led walking program under the Japanese public Long-Term Care Insurance Act to prevent mental decline . DESIGN Randomized controlled trial . SETTING The city of Takasaki . PARTICIPANTS One hundred fifty community members aged 72.0 ± 4.0 were randomly divided into intervention ( n = 75 ) and control ( n = 75 ) groups . INTERVENTION A walking program was conducted once a week for 90 minutes for 3 months . The program encouraged participants to walk on a regular basis and to increase their steps per day gradually . The intervention was conducted in small groups of approximately six , so combined benefits of exercise and social interaction were expected . 
MEASUREMENTS Cognitive function was evaluated focusing on nine tests in five domains : memory , executive function , word fluency , visuospatial abilities , and sustained attention . Quality of life ( QOL ) , depressive state , functional capacity , range of activities , and social network were assessed using questionnaires , and motor function was evaluated . RESULTS Significant differences between the intervention and control groups were shown in word fluency related to frontal lobe function ( F(1 , 128 ) = 6.833 , P = .01 ) , QOL ( F(1,128 ) = 9.751 , P = .002 ) , functional capacity including social interaction ( F(1,128 ) = 13.055 , P < .001 ) , and motor function ( Timed Up and Go Test : F(1,127 ) = 10.117 , P = .002 ) . No significant differences were observed in other cognitive tests . CONCLUSION Walking programs may provide benefits in some aspects of cognition , QOL , and functional capacity including social interaction in elderly community members . This study could serve as the basis for implementation of a community-based intervention to prevent mental decline . Background Computer-based interventions have demonstrated consistent positive effects on various physical abilities in older adults . This study aims to compare two training groups that achieve similar amounts of strength and balance exercise where one group receives an intervention that includes additional dance video gaming . The aim is to investigate the different effects of the training programs on physical and psychological parameters in older adults . Methods Thirty-one participants ( mean age ± SD : 86.2 ± 4.6 years ) , residents of two Swiss hostels for the aged , were randomly assigned to either the dance group ( n = 15 ) or the control group ( n = 16 ) . The dance group completed a twelve-week cognitive-motor exercise program twice weekly that comprised progressive strength and balance training supplemented with additional dance video gaming . 
The control group performed only the strength and balance exercises during this period . Outcome measures were foot placement accuracy , gait performance under single and dual task conditions , and falls efficacy . Results After the intervention between-group comparison revealed significant differences for gait velocity ( U = 26 , P = .041 , r = .45 ) and for single support time ( U = 24 , P = .029 , r = .48 ) during the fast walking dual task condition in favor of the dance group . No significant between-group differences were observed either in the foot placement accuracy test or in falls efficacy . Conclusions There was a significant interaction in favor of the dance video game group for improvements in step time . Significantly improved fast walking performance under dual task conditions ( velocity , double support time , step length ) was observed for the dance video game group only . These findings suggest that in older adults a cognitive-motor intervention may result in more improved gait under dual task conditions in comparison to a traditional strength and balance exercise program . Trial registration This trial has been registered under ISRCTN05350123 ( http://www.controlled-trials.com ) . Physical exercise , particularly aerobic exercise , is documented as providing a low cost regimen to counter well-documented cognitive declines including memory , executive function , visuospatial skills , and processing speed in normally aging adults . Prior aging studies focused largely on the effects of medium to long term ( > 6 months ) exercise training ; however , the shorter term effects have not been studied . In the present study , we examined changes in brain blood flow , cognition , and fitness in 37 cognitively healthy sedentary adults ( 57–75 years of age ) who were randomized into physical training or a wait-list control group . The physical training group received supervised aerobic exercise for 3 sessions per week 1 h each for 12 weeks . 
Participants ' cognitive , cardiovascular fitness and resting cerebral blood flow ( CBF ) were assessed at baseline ( T1 ) , mid ( T2 ) , and post-training ( T3 ) . We found higher resting CBF in the anterior cingulate region in the physical training group as compared to the control group from T1 to T3 . Cognitive gains were manifested in the exercise group 's improved immediate and delayed memory performance from T1 to T3 , which also showed a significant positive association with increases in both left and right hippocampal CBF identified earlier in the time course at T2 . Additionally , the two cardiovascular parameters , VO2 max and rating of perceived exertion ( RPE ) showed gains , compared to the control group . These data suggest that even shorter term aerobic exercise can facilitate neuroplasticity to reduce both the biological and cognitive consequences of aging to benefit brain health in sedentary adults Purpose Physical exercise and cognitive training have been shown to enhance cognition among older adults . However , few studies have looked at the potential synergistic effects of combining physical and cognitive training in a single study . Prior trials on combined training have led to interesting yet equivocal results . The aim of this study was to examine the effects of combined physical and cognitive interventions on physical fitness and neuropsychological performance in healthy older adults . Methods Seventy-six participants were randomly assigned to one of four training combinations using a 2 × 2 factorial design . The physical intervention was a mixed aerobic and resistance training program , and the cognitive intervention was a dual-task ( DT ) training program . Stretching and toning exercises and computer lessons were used as active control conditions . Physical and cognitive measures were collected pre- and postintervention . Results All groups showed equivalent improvements in measures of functional mobility .
The aerobic – strength condition led to a larger effect size in lower body strength , independently of cognitive training . All groups showed improved speed of processing and inhibition abilities , but only participants who took part in the DT training , independently of physical training , showed increased task-switching abilities . The level of functional mobility after intervention was significantly associated with task-switching abilities . Conclusion Combined training did not yield synergistic effects . However , DT training did lead to transfer effects on executive performance in neuropsychological tests . Both aerobic-resistance training and stretching-toning exercises can improve functional mobility in older adults Growing evidence supports the use of physical training interventions to improve both physical and cognitive performances in healthy older adults . Few studies have examined the impact of aerobic exercise on Stroop task performance , a measure of executive functions . In the current 3-month aerobic training study , 50 older adults ( mean age = 67.96 ± 6.25 years ) were randomly assigned to either a three-month physical training group or to a control group ( waiting list ) . Training sessions were 3 times per week for 60 minutes . All participants completed pre- and post-test measures of cognitive performance using the modified Stroop task and physical performance ( Rockport one-mile test ) . Compared to controls , the training group showed significant improvements in physical capacity ( P < 0.001 ) and enhanced Stroop performance , but only in the inhibition/switching condition ( P < 0.03 ) . Furthermore , the increase in aerobic capacity induced by the training regimen correlated negatively with reaction time in the inhibition/switching condition of the Stroop task at posttest ( r = −0.538 ; P = 0.007 ) . Importantly , the reported gains in cognitive performance were observed after only three months of physical training .
Taken together , the results suggest that even short-term physical interventions can enhance older adults ' executive functions BACKGROUND AND AIMS A close relationship exists between dual-task (DT)-related gait changes and the risk of falling in the elderly . However , the impact of DT training on the incidence of falls in the elderly remains unclear . We aimed to evaluate the effects of a seated stepping exercise in DT conditions to improve walking ability in community-dwelling elderly . METHODS This was a randomized controlled trial ( RCT ) in community-dwelling elderly in Japan . Fifty-three participants were randomly assigned to a DT group ( stepping exercise in DT conditions , n=26 ) and a single-task ( ST ) group ( stepping exercise in ST conditions , n=27 ) . All participants received 50 min group training sessions , once a week for 24 weeks . Outcome measures were based on differences in walking ability in single-task ( ST ) , cognitive-task ( CT ) , and manual-task ( MT ) conditions between DT and ST groups . RESULTS Participants in the DT group showed significantly greater improvement in outcome measures , including 10-m gait speed , walking cadence , and cost during cognitive and manual tasks . The number of enumerated figures during CT , as well as the numbers of steps taken and of enumerated figures during stepping with MT demonstrated significant Group × Time interactions ( p<0.05 ) . CONCLUSIONS This RCT suggests that the seated stepping exercise is more effective at improving ambulatory function in DT conditions than in ST conditions Background Exercise interventions often do not combine physical and cognitive training . However , this combination is assumed to be more beneficial in improving walking and cognitive functioning compared to isolated cognitive or physical training . Methods A multicenter parallel randomized controlled trial was conducted to compare a motor to a cognitive-motor exercise program .
A total of 182 eligible residents of homes-for-the-aged ( n = 159 ) or elderly living in the vicinity of the homes ( n = 23 ) were randomly assigned to either strength-balance ( SB ) or strength-balance-cognitive ( SBC ) training . Both groups conducted similar strength-balance training during 12 weeks . SBC additionally completed computerized cognitive training . Outcomes were dual task costs of walking , physical performance , simple reaction time , executive functions , divided attention , fear of falling and fall rate . Participants were analysed with an intention-to-treat approach . Results The 182 participants ( mean age ± SD : 81.5 ± 7.3 years ) were allocated to either SB ( n = 98 ) or SBC ( n = 84 ) . The attrition rate was 14.3 % . Interaction effects were observed for dual task costs of step length ( preferred walking speed : F(1,174 ) = 4.94 , p = 0.028 , η2 = 0.027 , fast walking speed : F(1,166 ) = 6.14 , p = 0.009 , η2 = 0.040 ) and dual task costs of the standard deviation of step length ( F(1,166 ) = 6.14 , p = 0.014 , η2 = 0.036 ) , in favor of SBC . Significant interactions in favor of SBC were revealed for gait initiation ( F(1,166 ) = 9.16 , p = 0.003 , η2 = 0.052 ) , ‘ reaction time ’ ( F(1,180 ) = 5.243 , p = 0.023 , η2 = 0.028 ) & ‘ missed answers ’ ( F(1,180 ) = 11.839 , p = 0.001 , η2 = 0.062 ) as part of the test for divided attention . Within-group comparison revealed significant improvements in dual task costs of walking ( preferred speed ; velocity ( p = 0.002 ) , step time ( p = 0.018 ) , step length ( p = 0.028 ) , fast speed ; velocity ( p < 0.001 ) , step time ( p = 0.035 ) , step length ( p = 0.001 ) ) , simple reaction time ( p < 0.001 ) , executive functioning ( Trail making test B ; p < 0.001 ) , divided attention ( p < 0.001 ) , fear of falling ( p < 0.001 ) , and fall rate ( p < 0.001 ) .
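Dual task costs of walking , the primary outcome in the trial above , are typically expressed as the percentage change in a gait parameter between single- and dual-task walking . A minimal sketch of one common convention follows ; the exact formula and sign convention vary between labs , so this is an illustrative variant , not the authors ' implementation , and the velocities used are invented for the example .

```python
def dual_task_cost(single_task, dual_task):
    """Dual-task cost (%) of a gait parameter, one common convention:
    DTC = (single - dual) / single * 100, so a positive cost means
    performance worsened (e.g. slower velocity) under the dual task."""
    return (single_task - dual_task) / single_task * 100.0

# Hypothetical gait velocities (m/s): 1.20 walking alone,
# 1.02 while performing a concurrent cognitive task.
cost = dual_task_cost(1.20, 1.02)  # ~15 % cost
```

A training effect such as those reported above would then show up as a reduction in this cost from pre- to post-intervention .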
Conclusions Combining strength-balance training with specific cognitive training has a positive additional effect on dual task costs of walking , gait initiation , and divided attention . The findings further confirm previous research showing that strength-balance training improves executive functions and reduces falls . Trial registration This trial has been registered under ISRCTN We have compared the effects of different 12-week exercise programs on simple and choice reaction and movement times in persons 61 to 84 years old . One hundred thirty-eight volunteers were randomized to either a control group , a two-day exercise group ( two 60-min sessions a week of aerobic exercises ) , or a two-day physical plus cognitive exercise group ( two 60-min sessions a week of aerobic and cognitive exercises ) . At follow-up , the aerobic and cognitive exercise program was found to have resulted in significant positive effects . Improvements were found in the two-day physical plus cognitive exercise group in all of the reaction parameters , particularly in choice reaction time , which is used in most daily activities . Our results suggest that to improve reaction time values , it is advisable to include cognitive features into a physical exercise routine BACKGROUND Extreme levels of gait variability and local dynamic stability of walking are associated with risk of falling and reduced executive functions . However , it is not sufficiently investigated how gait variability and local dynamic stability of human walking develop in the course of a motor-cognitive intervention . As dancing implies high demands on ( and therewith trains ) executive functioning and motor control , it might increase local dynamic stability or reduce gait variability .
METHODS 32 older healthy participants were randomly assigned to either a health-related exercise group ( age : mean=68.33 years , standard deviation=3.17 years ; BMI : mean=27.46 , standard deviation=2.94 ; female/male : 10/6 ) or a dancing group ( age : mean=66.73 years , standard deviation=3.33 years ; BMI : mean=26.02 , standard deviation=3.55 ; female/male : 11/5 ) . Based on angular velocity data of trunk kinematics , local dynamic stability and stride-to-stride variability in level overground walking were assessed prior to and after the specific intervention . The data were analysed by a blinded observer using two-way repeated measures ANOVAs . Based on one-way ANOVAs , time and group effects were determined . FINDINGS Regarding the variability of trunk movements , no interaction effect was observed ( F(1,30)=0.506 , P=.482 ; η2=0.017 ) . For local dynamic stability of trunk movements , an interaction effect in favour of the dancing group was observed ( F(1,30)=5.436 ; P=.026 ; η2=0.146 ) . INTERPRETATION Our data indicate that a dancing programme ( which combines cognitive and motor efforts ) might increase local dynamic stability in older people From animal research , it is known that combining physical activity with sensory enrichment has stronger and longer-lasting effects on the brain than either treatment alone . For humans , dancing has been suggested to be analogous to such combined training . Here we assessed whether a newly designed dance training program that stresses the constant learning of new movement patterns is superior in terms of neuroplasticity to conventional fitness activities with repetitive exercises and whether extending the training duration has additional benefits . Twenty-two healthy seniors ( 63–80 years ) who had been randomly assigned to either a dance or a sport group completed the entire 18-month study . MRI , BDNF and neuropsychological tests were performed at baseline and after 6 and 18 months of intervention .
After 6 months , we found a significant increase in gray matter volume in the left precentral gyrus in the dancers compared to controls . This neuroplasticity effect may have been mediated by the increased BDNF plasma levels observed in the dancers . Regarding cognitive measures , both groups showed significant improvements in attention after 6 months and in verbal memory after 18 months . In addition , volume increases in the parahippocampal region were observed in the dancers after 18 months . The results of our study suggest that participating in a long-term dance program that requires constant cognitive and motor learning is superior to engaging in repetitive physical exercises in inducing neuroplasticity in the brains of seniors . Therefore , dance is highly promising in its potential to counteract age-related gray matter decline Background This randomized controlled pilot study aimed to explore whether a cognitive-motor exercise program that combines traditional physical exercise with dance video gaming can improve the voluntary stepping responses of older adults under attention-demanding dual task conditions . Methods Elderly subjects received twice weekly cognitive-motor exercise that included progressive strength and balance training supplemented by dance video gaming for 12 weeks ( intervention group ) . The control group received no specific intervention . Voluntary step execution under single and dual task conditions was recorded at baseline and post intervention ( Week 12 ) . Results After intervention , between-group comparison revealed significant differences for initiation time of forward steps under dual task conditions ( U = 9 , P = 0.034 , r = 0.55 ) and backward steps under dual task conditions ( U = 10 , P = 0.045 , r = 0.52 ) in favor of the intervention group , showing altered stepping levels in the intervention group compared to the control group .
Conclusion A cognitive-motor intervention based on strength and balance exercises with additional dance video gaming is able to improve voluntary step execution under both single and dual task conditions in older adults Insidious declines in normal aging are well-established . Emerging evidence suggests that non-pharmacological interventions , specifically cognitive and physical training , may counter diminishing age-related cognitive and brain functions . This randomized trial compared effects of two training protocols : cognitive training ( CT ) vs. physical training ( PT ) on cognition and brain function in adults 56–75 years . Sedentary participants ( N = 36 ) were randomized to either CT or PT group for 3 h/week over 12 weeks . They were assessed at baseline- , mid- , and post-training using neurocognitive , MRI , and physiological measures . The CT group improved on executive function whereas the PT group 's memory was enhanced . Uniquely deploying cerebral blood flow ( CBF ) and cerebral vascular reactivity ( CVR ) MRI , the CT cohort showed increased CBF within the prefrontal and middle/posterior cingulate cortex ( PCC ) without change to CVR compared to PT group . Improvements in complex abstraction were positively associated with increased resting CBF in dorsal anterior cingulate cortex ( dACC ) . Exercisers with higher CBF in hippocampi bilaterally showed better immediate memory . The preliminary evidence indicates that increased cognitive and physical activity improves brain health in distinct ways . Reasoning training enhanced frontal networks shown to be integral to top-down cognitive control and brain resilience . Evidence of increased resting CBF without changes to CVR implicates increased neural health rather than improved vascular response . Exercise did not improve cerebrovascular response , although CBF increased in hippocampi of those with memory gains .
Distinct benefits incentivize testing effectiveness of combined protocols to strengthen brain health Forty-eight healthy adults aged 65 - 85 were recruited for structural magnetic resonance scans after an extensive neuropsychological battery that ensured a high degree of variability across the sample in performance on long-term memory tests , and on tests traditionally thought to rely on prefrontal cortex . Gray matter volumes were measured for three gyri in the frontal lobe ( superior , middle , inferior ) , six gyri in the temporal lobe ( superior , middle , inferior , fusiform , parahippocampal , and hippocampus ) , and the occipital lobe . Gray matter volumes declined across the age range evaluated , but with substantial regional variation -- greatest in the inferior frontal , superior temporal , and middle temporal gyri but negligible in the occipital lobe . Both memory performance and executive function declined as the number of hyperintense regions in the subcortical white matter increased . Memory performance was also significantly correlated with gray matter volumes of the middle frontal gyrus ( MFG ) , and several regions of temporal neocortex . However , the correlations were all in the negative direction ; better memory performance was associated with smaller volumes . Several previous reports of significant negative correlations between gray matter volumes and memory performance are described , so that the possible reasons for this surprising finding are discussed Objectives : The purpose of this study was to evaluate the effects of creative dance on physical fitness and life satisfaction in older women . Methods : A total of 57 women ( 65–80 years old ) were randomized to either an experimental group or a control group . The experimental group participated in a supervised creative dance program for 24 weeks .
Physical fitness ( strength , aerobic endurance , flexibility , motor ability/dynamic balance , and body composition ) and life satisfaction were assessed pre- and posttreatment ( at 12 and 24 weeks ) by the Senior Fitness Test and the Life Satisfaction scale , respectively . Results : After the intervention , the experimental group had better physical fitness and life satisfaction when compared with the control group . Conclusion : Creative dance has a positive effect in different dimensions of functioning and has the potential to contribute to healthy aging . This could be related to the integrated mobilization of physical , cognitive , and social skills promoted by creative dance BACKGROUND Our objective was to assess the effects of targeted exercise programs on health-related quality of life compared with usual care based on the ability to perform activities of daily living ( ADL ) and the Neuropsychiatric Inventory scores in geriatric institutionalized persons . METHODS A randomized controlled trial of 2 exercise programs vs usual care was conducted in 160 institutionalized persons 65 years or older who were able to understand basic motor commands and to move from one position to another . Interventions were performed over 6 months and were either an adapted tai chi program ( 4 times 30 min/wk ) or a cognition-action program ( 2 times 30 - 45 min/wk ) that focused primarily on an adapted guidance of patient-centered communication skills . The control group received usual care . The study was conducted at 4 settings . The main outcomes were changes in health-related quality of life based on ADL and Neuropsychiatric Inventory scores after 12 months . RESULTS The control group experienced a decline in ADL over the 12-month period compared with the adapted tai chi and cognition-action groups , but the differences were not significant ( P = .24 and P = .15 , respectively ) .
Also , the components of ADL , eg , ability to walk , continence , and nutrition , were maintained better in the intervention groups than in the control group . The total Neuropsychiatric Inventory score also worsened significantly in the control group , while it was unchanged or improved in the intervention groups . The differences between the cognition-action group and the control group were significant ( P < .001 ) . Neuropsychiatric diagnosis subgroups ( such as dementia and psychosis ) did not show a specific response from any intervention . CONCLUSION Adapted exercise programs can slow down the decline in health-related quality of life among heterogeneous , institutionalized elderly persons . TRIAL REGISTRATION ClinicalTrials.gov Identifier : NCT00623532 AIM Physical independence and positive mood states contribute to successful aging . The aim of this study was to analyze the effects of aerobic and strength-based training programs on functional fitness and mood in older adults , and to assess the relationship between adiposity and mood states . METHODS Seventy-eight participants ( age 65 to 95 years old ) were randomly assigned to a control group , aerobic training ( AT ) , or strength training group ( ST ) . Functional fitness was assessed using dimensions of the Senior Fitness Test battery relating to lower and upper body strength and flexibility , velocity , agility and dynamic balance , and aerobic endurance . Mood states ( depression , tension , fatigue , vigour , anger , and confusion ) were determined using the POMS-SF questionnaire . Participants were evaluated at the baseline and at the end of a 16-week exercise programme . RESULTS Both the ST and AT groups improved their functional fitness following the 16-week training . Body Mass Index ( BMI ) was positively associated with tension ( r=0.30 ; P<0.01 ) , fatigue ( r=0.31 ; P<0.01 ) and confusion ( r=0.24 ; P<0.05 ) .
At 16-week evaluation , the control group reported increased levels of confusion , and the ST group reported increases in vigour ( P<0.05 ) . CONCLUSION Results support the idea that strength-based training can be as effective as aerobic-based training in improving physical skills that contribute to functional mobility in later years . Positive associations between increased BMI and mood disturbance were also found . Physical training also contributed to some improvements in mood Previous studies have suggested beneficial effects of physical activity on cognition . Here , we asked in an interventional approach if physical activity performed at different intensity levels would differentially affect episodic memory function . Additionally , we tried to identify mechanisms mediating these changes . Sixty-two healthy elderly individuals were assessed for level of physical activity , aerobic fitness , episodic memory score , neurotrophin and catecholamine levels , and received a magnetic resonance image of the brain at baseline and after a six-month intervention of medium or low-intensity physical activity or control . Increase in total physical activity was positively associated with increase in memory score over the entire cohort , without significant differences between intensity groups . It was also positively associated with increases in local gray matter volume in prefrontal and cingulate cortex , and BDNF levels ( trend ) . In conclusion , we showed that physical activity conveys the beneficial effects on memory function independently of its intensity , possibly mediated by local gray matter volume and neurotrophic factors . Our findings may carry significant implications for prevention of cognitive decline in the elderly The purpose of this study was to investigate the effects of participation in an exercise program on several abilities associated with driving performance in older adults .
Thirty-two subjects were randomly assigned to either an exercise group ( 60 - 81 years , n=16 ) or a control group ( 60 - 82 years , n=16 ) . The exercise program was planned to stress perceptive , cognitive , and physical abilities . It lasted 12 weeks with a periodicity of three sessions of 60 min per week . Assessments were conducted before and after the intervention on behavioral speed ( in single- and dual-task conditions ) , visual attention , psychomotor performance , speed perception ( time-to-contact ) , and executive functioning . Significant positive effects were found at 12-week follow-up resulting from participation in the exercise program . Behavioral speed improvements were found in reaction time , movement time , and response time ( both in single- and dual-task conditions ) ; visual attention improvements took place in speed processing and divided attention ; psychomotor performance improvements occurred in lower limb mobility . These results showed that exercise is capable of enhancing several abilities relevant for driving performance and safety in older adults and , therefore , should be promoted BACKGROUND AND PURPOSE Although much is known about the benefits of aerobic exercise on cardiovascular health , little research has been done on the effect of aerobic exercise on motor performance . This study examined whether aerobic exercise has an effect on visuospatial information processing during finger-movement tracking in elderly subjects . SUBJECTS Fifteen elderly subjects ( mean age=83.2 years , SD=5.7 , range=72 - 91 ) from a senior housing complex were randomly assigned to a control group or an experimental ( exercise ) group . Twelve subjects completed the study , and data obtained for 10 subjects were used for data analysis ( 2 control subjects were eliminated to allow for matched-pairs analysis between the experimental and control groups ) . The control group ( n=5 ) had a mean age of 80.2 years ( SD=7.8 ) .
Subjects in the experimental group ( n=5 ) had a mean age of 84.8 years ( SD=2.5 ) . METHODS The intervention consisted of group exercise 3 times a week for 8 consecutive weeks , and included calisthenics ( eg , marching in place , side stepping , mock boxing ) , stationary bicycling , and walking . A finger-movement tracking test and submaximal graded exercise tolerance step tests were performed before and after training to determine changes in finger-movement tracking and any aerobic training effects . RESULTS Matched-pairs t tests showed a difference in tracking from pretest to posttest in the experimental group compared with the control group . Step test performance did not differ between the 2 groups . DISCUSSION AND CONCLUSION The results of this small-scale study with a limited number of subjects indicate that , for elderly people , finger-movement tracking performance can improve with aerobic exercise , despite the absence of an aerobic training effect . Possible mechanisms for the treatment effect on information processing are discussed BACKGROUND Gait abnormalities and vascular disease risk factors are associated with cognitive impairment in aging . OBJECTIVE To determine the impact of group-based exercise and dual-task training on gait and vascular health , in active community-dwelling older adults without dementia . METHODS Participants [ n=44 , mean ( SD ) age : 73.5 ( 7.2 ) years , 68 % female ] were randomized to either intervention ( exercise+dual-task ; EDT ) or control ( exercise only ; EO ) . Each week , for 26 weeks , both groups accumulated 50 or 75 min of aerobic exercise from group-based classes and 45 min of beginner-level square stepping exercise ( SSE ) . Participants accumulating only 50 min of aerobic exercise were instructed to participate in an additional 25 min each week outside of class . The EDT group also answered cognitively challenging questions while performing SSE ( i.e. , dual-task training ) .
The effect of the interventions on gait and vascular health was compared between groups using linear mixed effects models . RESULTS At 26 weeks , the EDT group demonstrated increased dual-task ( DT ) gait velocity [ difference between groups in mean change from baseline ( 95 % CI ) : 0.29 m/s ( 0.16 - 0.43 ) , p<0.001 ] , DT step length [ 5.72 cm ( 2.19 - 9.24 ) , p = 0.002 ] , and carotid intima-media thickness [ 0.10 mm ( 0.003 - 0.20 ) , p=0.04 ] , as well as reduced DT stride time variability [ 8.31 coefficient of variation percentage points ( -12.92 to -3.70 ) , p<0.001 ] , when compared to the EO group . CONCLUSIONS Group-based exercise combined with dual-task training can improve DT gait characteristics in active older adults without dementia OBJECTIVES Few studies have examined the effects of physical training programs on gait variability while single and dual tasking , and they reported mixed results . The aim of this study was to compare the stride time variability while single and dual tasking before and after a physical training program developed to improve gait stability in French community-dwelling older adults . DESIGN A prospective pre-post interventional cohort study . SETTING The community-dwelling area of " Pays de la Loire " , France . POPULATION Forty-eight older adults ( mean age ± standard deviation 72.2±8 years ; 75 % female ) . METHODS The physical training program consisted of 12 sessions of physical exercises scheduled once a week over a total duration of 3 months . Coefficient of variation ( CoV ) of stride time under three walking conditions ( i.e. , walking alone , walking while backward counting , and while performing a verbal fluency task ) was determined while steady-state walking using the SMTEC ® footswitches system before and after the physical training program . Participants were separated into two groups based on being or not in the highest tertile ( i.e.
, worst performance with cutpoint > 4.4 % ) of the CoV of stride time while walking alone . RESULTS After physical training , compared to the pre-training period , a significant decrease in CoV of stride time ( i.e. , better gait performance ) while walking alone ( 2.8±2.8 % versus 7±7.1 % , P=0.001 ) but not while dual tasking ( P=0.600 for counting backward and P=0.105 for verbal fluency task ) was shown in participants who had highest ( i.e. , worst ) gait variability at baseline . In addition , physical training modified the strategy of dual tasking in participants with highest gait variability at baseline compared to the other participants . Before training , a significant decrease in CoV of stride time ( 7±7.1 % versus 4.9±4.6 % , P=0.017 ) while counting backward was shown , but there was a significant increase after training ( 2.8±2.8 % versus 5.4±5.8 % , P=0.007 ) . CONCLUSIONS Physical training reduced gait variability while walking alone in participants with gait instability , and influenced their strategy for dual tasking . CLINICAL REHABILITATION IMPACT Physical training programs developed in the community to improve gait stability should include participants with high gait variability Aerobic exercise training has been shown to attenuate cognitive decline and reduce brain atrophy with advancing age . The extent to which resistance exercise training improves cognition and prevents brain atrophy is less known , and few studies include long-term follow-up cognitive and neuroimaging assessments . We report data from a randomized controlled trial of 155 older women , who engaged in 52 weeks of resistance training ( either once- or twice-weekly ) or balance-and-toning ( twice-weekly ) . Executive functioning and memory were assessed at baseline , 1-year follow-up ( i.e. , immediately post-intervention ) , and 2-year follow-up . A subset underwent structural magnetic resonance imaging scans at those time points .
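The stride-time variability measure used in the French cohort above is the coefficient of variation , CoV = ( SD / mean ) × 100 , with CoV > 4.4 % marking the worst tertile at baseline . A minimal sketch of that computation follows ; the stride times are hypothetical values invented for illustration , not data from the study .

```python
import statistics

def stride_time_cov(stride_times):
    """Coefficient of variation (%) of stride time:
    CoV = (sample SD / mean) * 100. Higher values indicate a more
    variable, less stable gait."""
    return statistics.stdev(stride_times) / statistics.mean(stride_times) * 100.0

# Hypothetical stride times in seconds for one walking trial;
# the study above used CoV > 4.4 % to flag the worst tertile.
cov = stride_time_cov([1.05, 1.10, 1.00, 1.12, 0.98])
flagged = cov > 4.4
```

Note that the choice of sample versus population standard deviation changes the value slightly ; the sample form is used here as the more common convention for short gait trials .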
At 2-year follow-up , both frequencies of resistance training promoted executive function compared to balance-and-toning ( standardized difference [d]=.31-.48 ) . Additionally , twice-weekly resistance training promoted memory ( d=.45 ) , reduced cortical white matter atrophy ( d=.45 ) , and increased peak muscle power ( d=.27 ) at 2-year follow-up relative to balance-and-toning . These effects were independent of one another . These findings suggest resistance training may have a long-term impact on cognition and white matter volume in older women To examine putative brain substrates of cognitive functions differentially affected by age , the authors measured the volume of cortical regions and performance on tests of executive functions , working memory , explicit memory , and priming in healthy adults ( 18 - 77 years old ) . The results indicate that shrinkage of the prefrontal cortex mediates age-related increases in perseveration . The volume of visual processing areas predicted performance on nonverbal working memory tasks . Contrary to the hypotheses , in the examined age range , the volume of limbic structures was unrelated to any of the cognitive functions ; verbal working memory , verbal explicit memory , and verbal priming were independent of cortical volumes . Nevertheless , among the participants aged above 60 , reduction in the volume of limbic structures predicted declines in explicit memory . Chronological age adversely influenced all cognitive indices , although its effects on priming were only indirect , mediated by declines in verbal working memory BACKGROUND Evidence is still insufficient regarding the effects of Power Rehabilitation ( PR ) on physical performance and higher-level functional capacity of community-dwelling frail elderly people . METHODS This nonrandomized controlled interventional trial consisted of 46 community-dwelling elderly individuals with light levels of long-term care needs .
They were allocated to the intervention (I-group, n = 24) and control (C-group, n = 22) groups. Of them, 32 persons (17 in the I-group; 15 in the C-group) (median age, 77 years; sex, 28% male) completed the study. The I-group subjects underwent PR twice a week for 12 weeks. The outcomes were physical performance (muscle strength, balance, flexibility, and mobility) and higher-level functional capacity as evaluated by the Tokyo Metropolitan Institute of Gerontology Index of Competence (TMIG-IC) and the level of long-term care need as certified by the public long-term care insurance. RESULTS The I-group demonstrated a significant improvement in the measured value of the timed up-and-go test (median change, a decrease of 4.4 seconds versus a decrease of 0.2 seconds, p = 0.033) and the timed 10-meter walk (a decrease of 3.0 seconds versus an increase of 0.2 seconds, p = 0.007) in comparison with the C-group. No significant change was observed in the TMIG-IC scores or in the level of long-term care need in the I-group. CONCLUSION PR improved mobility of community-dwelling frail elderly people; however, such improvement did not translate into higher-level functional capacity. Our findings demonstrate the difficulty in transferring the positive effects associated with PR into an improvement in higher-level functional capacity. BACKGROUND Aerobic exercise training (AET) has been shown to provide health benefits in individuals with Parkinson's disease (PD). However, it is yet unknown to what extent AET also improves cognitive and procedural learning capacities, which ensure optimal daily functioning. OBJECTIVE In the current study, we assessed the effects of a 3-month AET program on executive functions (EF), implicit motor sequence learning (MSL) capacity, as well as on different health-related outcome indicators.
METHODS Twenty healthy controls (HC) and 19 early PD individuals participated in a supervised, high-intensity, stationary recumbent bike-training program (3 times/week for 12 weeks). Exercise prescription started at 20 min (+5 min/week up to 40 min) based on the participant's maximal aerobic power. Before and after AET, EF tests assessed participants' inhibition and flexibility functions, whereas implicit MSL capacity was evaluated using a version of the Serial Reaction Time Task. RESULTS The AET program was effective as indicated by significant improvement in aerobic capacity in all participants. Most importantly, AET improved inhibition but not flexibility, and motor learning skill, in both groups. CONCLUSION Our results suggest that AET can be a valuable non-pharmacological intervention to promote physical fitness in early PD, but also better cognitive and procedural functioning. Objective: (1) To develop a motor-cognition training programme; (2) to evaluate the ability to recruit and retain elderly people; (3) to assess the effects of the interventions. Design: Pilot randomized controlled trial. Setting: Assisted living facility. Participants: Sixteen subjects (11 female) living in an assisted living facility were randomized to a motor or motor-cognition group. Interventions: Both groups received machine-driven strength training and balance exercises for 45 minutes, twice weekly, for 12 weeks. In addition, the motor-cognition group received computerized training for attention 3-5 times per week for 10 weeks. Main outcome measures: Baseline and post-intervention (12 weeks) assessments focused on recruitment, attrition and adherence. Secondary outcome measures assessed dual-task costs of gait (velocity, cadence, step time, step length), expanded timed get-up-and-go, falls efficacy and reaction time.
Results: Of 35 subjects initially approached, 16 started and 14 completed the study, resulting in 46% recruitment, 19% attrition and >80% adherence rates. There is more evidence of altered levels in the motor-cognitive treatment group, with significant differences in average change for fear of falling (P = 0.017) and foot reaction time (P = 0.046). No statistical significance was reached for gait parameters. Conclusions: Motor-cognition training is feasible and shows trends to stronger improvement in walking and reaction time. The application in a main study is deemed feasible. A minimum of ±55 subjects per group are required to achieve a power of 80% at the 5% level of significance based on step length, considering the expectable attrition rate in a required larger-scale study. Background. Age-related loss of afferent feedback of the feet plays an important role in gait performance. Although strength, balance and gait training can significantly improve the muscle power and functional abilities of older individuals, it remains unclear whether training effects can be enhanced by augmenting afferent feedback from the feet by adding shoe insoles complementary to conventional training. Objective. The current study investigated the effect of physical exercise combined with wearing MedReflex® shoe insoles on the gait performance and muscle power in older adults. Methods. Twenty-eight independent-living older adults aged 65-91 years were randomly assigned to either an insole group (IG; n = 14) or a training group (TG; n = 14). A further 14 subjects matched to the IG and TG were recruited as a control group (CG; n = 14) (no exercise). The IG and TG completed the same training program consisting of aerobic exercises, progressive resistance strength training and stretching exercises twice per week for 12 weeks, whereas the IG wore the insoles during everyday life and during training sessions.
Assessments included the Falls Efficacy Scale-International (FES-I), gait analysis and muscle power measurements of the knee and ankle joint at pre- and post-training. Results. There were significant time × group interactions in walking speed, step length and in several muscle power measurements. The positive effects on gait parameters ranged between 1% and 12% and between 1% and 8%, and the trend to improvements of muscle power ranged between 15-79% and 20-79% for the IG and TG, respectively. The IG and TG did not differ significantly in their improvements. The CG showed a trend to deteriorations between 0% and -5% for gait parameters and between -4% and -14% for muscle power. No significant change in FES-I score occurred in either group. Conclusions. The results of this study provide evidence of significant improvements in gait performance and muscle power after a conventional training program in independent-living older adults. However, there is no additional effect of long-term adaptation of gait caused by wearing insoles concurrent with physical training. Dancing is a complex sensorimotor activity involving physical and mental elements which have positive effects on cognitive functions and motor control. The present randomized controlled trial aims to analyze the effects of a dancing program on the performance of a motor-cognitive dual task. Data of 35 older adults, who were assigned to a dancing group or a health-related exercise group, are presented in the study. In pretest and posttest, we assessed cognitive performance and variability of minimum foot clearance, stride time, and stride length while walking. Regarding the cognitive performance and the stride-to-stride variability of minimum foot clearance, interaction effects have been found, indicating that dancing lowers gait variability to a higher extent than conventional health-related exercise.
The data show that dancing improves minimum foot clearance variability and cognitive performance in a dual-task situation. Multi-task exercises (like dancing) might be a powerful tool to improve motor-cognitive dual-task performance. BACKGROUND Cognitive impairment is an important contributor to disability. Limited clinical trial evidence exists regarding the impact of physical exercise on cognitive function (CF). We report results of a pilot study to provide estimates of the relative impact of physical activity (PA) on 1-year changes in cognitive outcomes and to characterize relationships between changes in mobility disability and changes in cognition in older adults at increased risk for disability. METHODS Sedentary persons (102) at increased risk for disability (aged 70-89 years) were randomized to moderate-intensity PA or health education. Participants were administered the Digit Symbol Substitution Test (DSST), Rey Auditory Verbal Learning Test (RAVLT), modified Stroop test, and Modified Mini-Mental State Examination at baseline and 1 year. RESULTS Group differences were not significant, but improvements in cognitive scores were associated with improvements in physical function. Specifically, the DSST significantly correlated with change in the Short Physical Performance Battery score (r = .38, p = .0002), in chair stand score (r = .26, p = .012), in balance score (r = .21, p = .046), and in 400-m gait speed (r = .15, p = .147). Change in recall on the RAVLT and in the Stroop test was also positively correlated with changes in chair stand and balance, respectively. CONCLUSIONS These results provide further support for the benefits of exercise on CF in older adults.
An adequately powered clinical trial of PA involving older adults at increased risk for cognitive disability is needed to expand the indications for prescribing exercise for prevention of decline in brain function. Decline in dual-task walking performance is associated with increased risk of falls among older adults. The objective of this study is to determine whether 18 hr of participation in EnhanceFitness (EF), an evidence-based group exercise program, improves dual-task walking performance among community-dwelling older adults. Twenty-eight healthy, community-dwelling older adults were evaluated before participating in EF and after 18 hr of participation. Gait speed was evaluated under single task and dual tasks using the TUG (Timed Up and Go) and 1-min walk tests. Dual-task costs (DTC), the relative cost of dual-task performance compared to single-task performance, were calculated for both cognitive and motor tasks. Postural control and executive functions were evaluated as well. After 18 hr of EF, dual-task walking performance improved. Single-task performance improved, as did postural control and executive function. There was no significant change in DTC across all measurements, except for the cognitive task of the TUG
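Dual-task cost (DTC) as defined in the abstract above is the relative performance loss under dual-task conditions. A minimal sketch, assuming the common formulation DTC = (single − dual) / single × 100%; the gait speeds are illustrative values, not the study's data:

```python
def dual_task_cost(single_task, dual_task):
    """Relative performance loss under dual-task conditions, in percent."""
    return 100.0 * (single_task - dual_task) / single_task

# Hypothetical gait speeds (m/s): walking alone vs. walking while
# performing a concurrent cognitive task.
print(round(dual_task_cost(1.25, 1.05), 1))  # → 16.0
```

A DTC of 0% would mean no slowing under dual-task conditions; larger values indicate a greater cost of the added task.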
2,030
29,736,728
However, our subgroup analyses did not reveal any statistically significant effects of the moderator variables age, sex, training status, setting and testing method on overall balance (i.e. aggregation of static and dynamic balance). BT-related effects in adolescents were moderate to large for measures of static (SMDwm = 0.61) and dynamic (SMDwm = 0.86) balance. Conclusions BT is a highly effective means to improve balance performance, with moderate to large effects on static and dynamic balance in healthy youth irrespective of age, sex, training status, setting and testing method. The examined training modalities did not have a moderating effect on balance performance in healthy adolescents. Thus, we conclude that an additional but so far unidentified training modality may have a major effect on balance performance that was not assessed in our analysis.
Background Effects and dose–response relationships of balance training on measures of balance are well-documented for healthy young and old adults. However, this has not been systematically studied in youth. Objectives The objectives of this systematic review and meta-analysis were to quantify effects of balance training (BT) on measures of static and dynamic balance in healthy children and adolescents. Additionally, dose–response relations for BT modalities (e.g. training period, frequency, volume) were quantified through the analysis of controlled trials.
Abstract The influence of a 12-week proprioceptive training program on functional ankle stability was investigated in young speed skaters. Twenty-eight speed skaters were randomly divided into an intervention (n = 14) and a control group (n = 14). A 15-min circuit-training session was performed 5 times per week over a 12-week period. Measurements were taken prior to the training and after 6 and 12 weeks of training. Kinaesthesia was evaluated with the Isomed2000 in all movements of the ankle joint. Dynamic balance was tested with the Biodex Stability System at the stable level 8 and at the unstable level 2, measuring the overall stability index, the anterior/posterior and the medial/lateral scores. Static single-leg stance was evaluated using the Kistler force platform. Kinaesthesia of the intervention group improved significantly for plantarflexion of the right foot (P = 0.001) after 12 weeks. Dynamic balance showed significant differences in the intervention group after 12 weeks in comparison with the first measurement for each foot in the overall stability index, the anterior/posterior and the medial/lateral scores (P ≤ 0.017, respectively) at the unstable level 2. Functional ankle stability improved in terms of dynamic balance after 12 weeks of proprioceptive training. Therefore, inclusion of proprioceptive exercises in the daily training programme is recommended for young speed skaters. Background: Sport is the leading cause of injury requiring medical attention among adolescents. We studied the effectiveness of a home-based balance-training program using a wobble board in improving static and dynamic balance and reducing sports-related injuries among healthy adolescents. Methods: In this cluster randomized controlled trial, we randomly selected 10 of 15 high schools in Calgary to participate in the fall of 2001.
We then recruited students from physical education classes and randomly assigned them, by school, to either the intervention (n = 66) or the control (n = 61) group. Students in the intervention group participated in a daily 6-week and then a weekly 6-month home-based balance-training program using a wobble board. Students at the control schools received testing only. The primary outcome measures were timed static and dynamic balance, 20-m shuttle run and vertical jump, which were measured at baseline and biweekly for 6 weeks. Self-reported injury data were collected over the 6-month follow-up period. Results: At 6 weeks, improvements in static and dynamic balance were observed in the intervention group but not in the control group (difference in static balance 20.7 seconds, 95% confidence interval [CI] 10.8 to 30.6 seconds; difference in dynamic balance 2.3 seconds, 95% CI 0.7 to 4.0 seconds). There was evidence of a protective effect of balance training over 6 months (relative risk of injury 0.2, 95% CI 0.05 to 0.88). The number needed to treat to avoid 1 injury over 6 months was 8 (95% CI 4 to 35). Interpretation: Balance training using a wobble board is effective in improving static and dynamic balance and reducing sports-related injuries among healthy adolescents. The purpose of this study was to determine the effect of a 4-week balance training program on specified functional tasks. Thirty-six subjects (age = 22.7 ± 2.10 years; height = 168.30 ± 9.55 cm; weight = 71.15 ± 16.40 kg) were randomly placed into control (C; n = 19) and experimental (Tx; n = 17) groups. The Tx group trained using a commercially available balance training device (BOSU). Postural limits (displacement and sway) and functional task performance (time on ball, shuttle run, and vertical jump) were assessed during a pretest (T1), a posttest (T2), and 2 weeks posttraining (T3).
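The number needed to treat reported above follows from the absolute risk reduction: NNT = 1 / (control risk − intervention risk), conventionally rounded up. A minimal sketch; the injury risks below are hypothetical values chosen only to be consistent with the reported relative risk of 0.2, not the trial's actual event counts:

```python
import math

def nnt(control_risk, treated_risk):
    """Number needed to treat: reciprocal of the absolute risk reduction,
    rounded up to a whole number of patients."""
    arr = control_risk - treated_risk
    return math.ceil(1.0 / arr)

# Hypothetical 6-month injury risks: 16% in controls, 3.2% with balance
# training (relative risk 0.2), giving an absolute risk reduction of 0.128.
print(nnt(0.16, 0.032))  # → 8
```

The wide CI on the NNT (4 to 35) mirrors the wide CI on the relative risk: small changes in the assumed risks move the reciprocal substantially.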
Multivariate repeated measures analysis (α = 0.05) revealed significant differences in time on ball, shuttle run, total sway, and fore/aft displacement after the exercise intervention (T2). T3 assessment revealed that total sway and time on ball remained controlled; however, no other measures were retained. Balance training improved performance of selected sport-related activities and postural control measures, although it is unclear whether the effect of training would transfer to general functional enhancement. Typically, balance training has been used as an intervention paradigm either as static or as reactive balance training. Possible differences in functional outcomes between the two modalities have not been profoundly studied. The objective of the study was to investigate the specificity of neuromuscular adaptations in response to two balance intervention modalities within test and intervention paradigms containing characteristics of both profiles: classical sensorimotor training (SMT), referring to a static ledger pivoting around the ankle joint, vs. reactive balance training (RBT), using externally applied perturbations to deteriorate body equilibrium. Thirty-eight subjects were assigned to either SMT or RBT. Before and after four weeks of intervention training, postural sway and electromyographic activities of shank and thigh muscles were recorded and co-contraction indices (CCI) were calculated. We argue that specificity of training interventions could be transferred into corresponding test settings containing properties of SMT and RBT, respectively. The results revealed that i) postural sway was reduced in both intervention groups in all test paradigms; the magnitude of changes and effect sizes differed depending on the paradigm: when training and paradigm coincided most, effects were augmented (P<0.05).
ii) These specificities were accompanied by segmental modulations in the amount of CCI, with a greater reduction within the CCI of thigh muscles after RBT compared to the shank muscles after SMT (P<0.05). The results clearly indicate the relationship between test and intervention specificity in balance performance. Hence, specific training modalities of postural control cause multi-segmental and context-specific adaptations, depending upon the characteristics of the trained postural strategy. In relation to fall prevention, perturbation training could serve as an extension to SMT to include the proximal segment, and thus the control of structures near to the body's centre of mass, into training. The objective of this study was to determine the relationship between specific performance measures and hockey skating speed. Thirty competitive secondary school and junior hockey players were timed for skating speed. Off-ice measures included a 40-yd (36.9-m) sprint, concentric squat jump, drop jump, 1 repetition maximum leg press, flexibility, and balance ratio (wobble board test). Pearson product moment correlations were used to quantify the relationships between the variables. Electromyographic (EMG) activity of the dominant vastus lateralis and biceps femoris was monitored in 12 of the players while skating, stopping, turning, and performing a change-of-direction drill. Significant correlations (p < 0.005) were found between skating performance and the sprint and balance tests. Further analysis demonstrated significant correlations between balance and players under the age of 19 years (r = −0.65) but not those over 19 years old (r = −0.28). The significant correlations with balance suggested that stability may be associated with skating speed in younger players. The low correlations with drop jumps suggested that short contact time stretch-shortening activities (i.e.
, low-amplitude plyometrics) may not be an important factor. Electromyographic activities illustrated the very high activation levels associated with maximum skating speed. The purpose of the present study was to investigate the effects of a soccer training session on the balance ability of the players and assess whether the effectiveness of a balance program is affected by its performance before or after the regular soccer training. Thirty-nine soccer players were randomly divided into three subject groups (n=13 each): one control group (C group), one training group that followed a balance program (12 weeks, 3 times per week, 20 min per session) before the regular soccer training (TxB group), and one training group that performed the same balance program after the soccer training (TxA group). Standard testing balance boards and the Biodex Stability System were used to assess balance ability in the C, TxB, and TxA groups at baseline (T0) and after completing the balance program (T12). The same tests and additional isokinetic knee joint moment measurements were carried out in the TxB and TxA groups pre- and post-soccer training. Two main results were obtained: (1) No differences (p>0.05) were found in balance ability and knee joint moment production between pre- and post-soccer training. (2) The balance program increased (p<0.01) the balance ability in the TxB and TxA groups, and the improvement in the TxA group was greater (p<0.05) than that in the TxB group post-soccer training. Result (1) is in contrast to the notion of a link between fatigue induced by a soccer training session or game and injury caused by impaired balance, and result (2) has implications for athletic training and rehabilitation. BACKGROUND AND PURPOSE Assessment of the quality of randomized controlled trials (RCTs) is common practice in systematic reviews.
However, the reliability of data obtained with most quality assessment scales has not been established. This report describes 2 studies designed to investigate the reliability of data obtained with the Physiotherapy Evidence Database (PEDro) scale, developed to rate the quality of RCTs evaluating physical therapist interventions. METHOD In the first study, 11 raters independently rated 25 RCTs randomly selected from the PEDro database. In the second study, 2 raters rated 120 RCTs randomly selected from the PEDro database, and disagreements were resolved by a third rater; this generated a set of individual rater and consensus ratings. The process was repeated by independent raters to create a second set of individual and consensus ratings. Reliability of ratings of PEDro scale items was calculated using multirater kappas, and reliability of the total (summed) score was calculated using intraclass correlation coefficients (ICC [1,1]). RESULTS The kappa value for each of the 11 items ranged from .36 to .80 for individual assessors and from .50 to .79 for consensus ratings generated by groups of 2 or 3 raters. The ICC for the total score was .56 (95% confidence interval = .47-.65) for ratings by individuals, and the ICC for consensus ratings was .68 (95% confidence interval = .57-.76). DISCUSSION AND CONCLUSION The reliability of ratings of PEDro scale items varied from "fair" to "substantial," and the reliability of the total PEDro score was "fair" to "good." Objectives: To compare the effect of a neuromuscular training program and a basic exercise program on postural control in figure skaters. Design: Two groups; parallel design; prospective, randomized controlled trial. Setting: Postural control laboratory, arenas, September 2001 to December 2002. Participants: Forty-four young, healthy figure skaters (18 ± 3 years).
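The PEDro reliability study above reports chance-corrected agreement (kappa) for individual scale items. For the two-rater, binary-item case this is Cohen's kappa: observed agreement minus agreement expected from the raters' marginals, scaled by the maximum possible improvement over chance. A minimal sketch; the yes/no ratings are invented for illustration:

```python
def cohens_kappa(a_labels, b_labels):
    """Chance-corrected agreement between two raters over binary ratings."""
    n = len(a_labels)
    observed = sum(a == b for a, b in zip(a_labels, b_labels)) / n
    # Expected agreement under independence, from each rater's marginals.
    p_a = sum(a_labels) / n
    p_b = sum(b_labels) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (observed - expected) / (1 - expected)

# Hypothetical yes/no ratings of one PEDro item by two raters on 10 trials.
rater1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
rater2 = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1]
print(round(cohens_kappa(rater1, rater2), 2))  # → 0.58
```

Note the study itself used multirater kappas (more than two raters) and ICC [1,1] for the summed score; the two-rater version here only illustrates the underlying chance-correction idea.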
Interventions: Participants were randomly assigned to receive a neuromuscular training program (n = 22) or a basic exercise training program (n = 22). Both programs were completed 3 times per week for 4 weeks, and each session was supervised. Main Outcome Measurements: Participants completed baseline and postintervention measures of postural control on a force plate. Postural control was quantified as the center of pressure (CoP) path length during tests of single-limb standing balance that mimicked figure skating skills and challenged the postural control system to varying degrees. The primary outcome measure was the CoP path length observed during a landing jump test completed with eyes closed. Results: The postintervention CoP path lengths during the more challenging tests were significantly (P < 0.05) lower (indicating better postural control) for the neuromuscular-trained group than for the basic exercise-trained group. For the landing jump test completed with eyes closed, the percent improvement in the neuromuscular-trained group was significantly greater (mean = 21.0 ± 22.0%) than in the basic exercise-trained group (mean = −4.9 ± 24.9%; P < 0.05). The magnitude of improvement in the neuromuscular-trained group ranged from approximately 1% to 21%, depending on the specific postural control test used. Conclusions: The results suggest that off-ice neuromuscular training can significantly improve postural control in figure skaters, whereas basic exercise training does not. Sensory motor training programs are used in the rehabilitation and prevention of injuries among soccer players. Inconsistencies are found in the literature regarding the duration of the protocols and the exercises and equipment used. OBJECTIVE To evaluate the benefits of a five-week sensory motor training program on the functional performance and postural control of young soccer players.
METHODS The study sample comprised 22 young male soccer players who were evaluated using the Figure-of-Eight Test (F8), Side Hop Test (SHT), Star Excursion Balance Test (SEBT), and a force platform. The players were randomly divided into a control group (N = 10), who continued their soccer practice sessions, and an intervention group (N = 12), who continued their soccer practice sessions and were also enrolled in a supervised five-week sensory motor training program. RESULTS After the five-week training program, the intervention group obtained significant results in the F8, SHT and SEBT, as well as in the following parameters: area of pressure of sway center (COP), mean velocity and mean frequency of COP. CONCLUSION The five-week sensory motor training program, carried out with easily available and low-cost equipment, was effective at improving functional performance and postural control in young soccer players. In young elite athletes, the influence of sensorimotor training (SMT = balance training) on strength, jump height and spinal reflex excitability was compared with adaptations induced by strength training (ST). Seventeen athletes were randomly assigned to either an SMT or an ST group. Before and after 6 weeks of training, maximal isometric strength (MVC) and rate of force development (RFD(max)) were determined. Changes in jump height and EMG activity were assessed during squat- (SJ), countermovement- (CMJ) and drop-jump (DJ). To evaluate neural adaptations, H-reflex recruitment was recorded at rest and during dynamic activation of the plantarflexors following stance perturbation. MVC was enhanced after ST but not influenced by SMT. RFD(max) was not affected by either training. Both SMT and ST significantly improved jump performance in SJ, CMJ, and DJ. Maximum H-reflex to maximum M-wave ratios (H(max)/M(max) ratios) at rest remained unchanged.
During stance perturbation, H(max)/M(max) ratios were significantly reduced following SMT, whereas ST augmented H(max)/M(max) ratios (p < 0.05). In contrast to other studies, no changes in RFD were found. This may be explained by methodological and/or training-specific differences. However, both SMT and ST improved jump performance in well-trained young athletes but induced opposing adaptations of the H(max)/M(max) ratio when measured during dynamic contractions. These adaptations were task-specific, as indicated by the unchanged reflexes at rest. Decreased spinal excitability following SMT was interpreted as the attempt to improve movement control, whereas augmented excitability following ST accounts for the effort to enhance motoneuron output. Functionally, our results emphasise that SMT is not only beneficial for prevention and rehabilitation but also improves athletic performance
2,031
27,772,529
Conclusions VAE surveillance missed many cases of VAP, and the population characteristics identified by the two surveillance paradigms differed. VAE surveillance does not accurately detect cases of traditional VAP in ICUs
Background Ventilator-associated event (VAE) is a new surveillance paradigm for monitoring complications in mechanically ventilated patients in intensive care units (ICUs). The National Healthcare Safety Network replaced traditional ventilator-associated pneumonia (VAP) surveillance with VAE surveillance in 2013. The objective of this study was to assess the consistency between VAE surveillance and traditional VAP surveillance.
Ventilator-associated pneumonia (VAP), an infection of the lower respiratory tract which occurs in association with mechanical ventilation, is one of the most common causes of nosocomial infection in the intensive care unit (ICU). VAP causes significant morbidity and mortality in critically ill patients, including increased duration of mechanical ventilation, ICU stay and hospitalization. Current knowledge for its prevention, diagnosis and management is therefore important clinically and is the basis for this review. We discuss recent changes in VAP surveillance nomenclature incorporating ventilator-associated conditions and ventilator-associated events, terms recently proposed by the Centers for Disease Control. To the extent possible, we rely predominantly on data from randomized controlled trials (RCTs) and meta-analyses. BACKGROUND A study was undertaken to assess the diagnostic value of different clinical criteria and the impact of microbiological testing on the accuracy of clinical diagnosis of suspected ventilator-associated pneumonia (VAP). METHODS Twenty-five deceased, mechanically ventilated patients were studied prospectively. Immediately after death, multiple bilateral lung biopsy specimens (16 specimens/patient) were obtained for histological examination and quantitative lung cultures. The presence of both histological pneumonia and positive lung cultures was used as a reference test. RESULTS The presence of infiltrates on the chest radiograph and two of three clinical criteria (leucocytosis, purulent secretions, fever) had a sensitivity of 69% and a specificity of 75%; the corresponding numbers for the clinical pulmonary infection score (CPIS) were 77% and 42%. Non-invasive as well as invasive sampling techniques had comparable values.
The combination of all techniques achieved a sensitivity of 85% and a specificity of 50%, and these values remained virtually unchanged despite the presence of previous treatment with antibiotics. When microbiological results were added to clinical criteria, adequate diagnoses originating from microbiological results which might have corrected false positive and false negative clinical judgements (n = 5) were countered by a similar proportion of inadequate diagnoses (n = 6). CONCLUSIONS Clinical criteria had reasonable diagnostic values. CPIS was not superior to conventional clinical criteria. Non-invasive and invasive sampling techniques had diagnostic values comparable to clinical criteria. An algorithm guiding antibiotic treatment exclusively by microbiological results does not increase the overall diagnostic accuracy and carries the risk of undertreatment. BACKGROUND The Centers for Disease Control and Prevention has shifted policy away from using ventilator-associated pneumonia (VAP) and toward using ventilator-associated conditions (VACs) as a marker of ICU quality. To date, limited prospective data are available regarding the incidence of VAC among medical and surgical ICU patients, the ability of VAC criteria to capture patients with VAP, and the potential clinical preventability of VACs. METHODS This study was a prospective 12-month cohort study (January 2013 to December 2013). RESULTS We prospectively surveyed 1,209 patients ventilated for ≥2 calendar days. Sixty-seven VACs were identified (5.5%), of which 34 (50.7%) were classified as an infection-related VAC (IVAC), with corresponding rates of 7.0 and 3.6 per 1,000 ventilator days, respectively. The mortality rate of patients having a VAC was significantly greater than that of patients without a VAC (65.7% vs 14.4%, P < .001).
The most common causes of VACs included IVACs (50.7%), ARDS (16.4%), pulmonary edema (14.9%), and atelectasis (9.0%). Among IVACs, 44.1% were probable VAP and 17.6% were possible VAP. Twenty-five VACs (37.3%) were adjudicated to represent potentially preventable events. Eighty-six episodes of VAP occurred in 84 patients (10.0 per 1,000 ventilator days) during the study period. The sensitivity of the VAC criteria for the detection of VAP was 25.9% (95% CI, 16.7%-34.5%). CONCLUSIONS Although relatively uncommon, VACs are associated with greater mortality and morbidity when they occur. Most VACs represent nonpreventable events, and the VAC criteria capture a minority of VAP episodes Background Ventilator-associated pneumonia (VAP) surveillance is time consuming, subjective, inaccurate, and inconsistently predicts outcomes. Shifting surveillance from pneumonia in particular to complications in general might circumvent the VAP definition's subjectivity and inaccuracy, facilitate electronic assessment, make interfacility comparisons more meaningful, and encourage broader prevention strategies. We therefore evaluated a novel surveillance paradigm for ventilator-associated complications (VAC) defined by sustained increases in patients' ventilator settings after a period of stable or decreasing support. Methods We assessed 600 mechanically ventilated medical and surgical patients from three hospitals. Each hospital contributed 100 randomly selected patients ventilated 2–7 days and 100 patients ventilated > 7 days. All patients were independently assessed for VAP and for VAC. We compared incidence density, duration of mechanical ventilation, intensive care and hospital lengths of stay, hospital mortality, and time required for surveillance for VAP and for VAC. A subset of patients with VAP and VAC were independently reviewed by a physician to determine possible etiology.
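Rates like "7.0 per 1,000 ventilator days" above are incidence densities: event counts divided by total ventilator days, scaled to 1,000. A minimal sketch; the ventilator-day denominator below is an assumed figure consistent with the reported rate, not one given in the study:

```python
# Incidence density per 1,000 ventilator days, the denominator used
# for the VAC/IVAC rates above.

def rate_per_1000_days(events: int, ventilator_days: int) -> float:
    """Events per 1,000 ventilator days."""
    return 1000.0 * events / ventilator_days

# e.g. 67 VAC events over an assumed 9,570 ventilator days
print(round(rate_per_1000_days(67, 9570), 1))  # 7.0
```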
Results Of 597 evaluable patients, 9.3% had VAP (8.8 per 1,000 ventilator days) and 23% had VAC (21.2 per 1,000 ventilator days). Compared to matched controls, both VAP and VAC prolonged days to extubation (5.8, 95% CI 4.2–8.0 and 6.0, 95% CI 5.1–7.1, respectively), days to intensive care discharge (5.7, 95% CI 4.2–7.7 and 5.0, 95% CI 4.1–5.9), and days to hospital discharge (4.7, 95% CI 2.6–7.5 and 3.0, 95% CI 2.1–4.0). VAC was associated with increased mortality (OR 2.0, 95% CI 1.3–3.2) but VAP was not (OR 1.1, 95% CI 0.5–2.4). VAC assessment was faster (mean 1.8 versus 39 minutes per patient). Both VAP and VAC events were predominantly attributable to pneumonia, pulmonary edema, ARDS, and atelectasis. Conclusions Screening ventilator settings for VAC captures a similar set of complications to traditional VAP surveillance but is faster, more objective, and a superior predictor of outcomes Objectives: Correct classification of the source of infection is important in observational and interventional studies of sepsis. Centers for Disease Control and Prevention criteria are most commonly used for this purpose, but the robustness of these definitions in critically ill patients is not known. We hypothesized that in a mixed ICU population, the performance of these criteria would be generally reduced and would vary among diagnostic subgroups. Design: Prospective cohort. Setting: Data were collected as part of a cohort of 1,214 critically ill patients admitted to two hospitals in The Netherlands between January 2011 and June 2011. Patients: Eight observers assessed a random sample of 168 of 554 patients who had experienced at least one infectious episode in the ICU.
Each patient was assessed by two randomly selected observers who independently scored the source of infection (by affected organ system or site), the plausibility of infection (rated as none, possible, probable, or definite), and the most likely causative pathogen. Assessments were based on a post hoc review of all available clinical, radiological, and microbiological evidence. The observed diagnostic agreement for source of infection was classified as partial (i.e., matching on organ system or site) or complete (i.e., matching on specific diagnostic terms), for plausibility as partial (2-point scale) or complete (4-point scale), and for causative pathogens as an approximate or exact pathogen match. Interobserver agreement was expressed as a concordant percentage and as a kappa statistic. Interventions: None. Measurements and Main Results: A total of 206 infectious episodes were observed. Agreement regarding the source of infection was 89% (183/206) and 69% (142/206) for a partial and complete diagnostic match, respectively. This resulted in a kappa of 0.85 (95% CI, 0.79–0.90). Agreement varied from 63% to 91% within major diagnostic categories and from 35% to 97% within specific diagnostic subgroups, with the lowest concordance observed in cases of ventilator-associated pneumonia. In the 142 episodes for which a complete match on source of infection was obtained, the interobserver agreement for plausibility of infection was 83% and 65% on a 2- and 4-point scale, respectively. For causative pathogen, agreement was 78% and 70% for an approximate and exact pathogen match, respectively. Conclusions: Interobserver agreement for classifying sources of infection using Centers for Disease Control and Prevention criteria was excellent overall.
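The kappa statistic used above corrects raw two-observer agreement for agreement expected by chance: κ = (p_o − p_e) / (1 − p_e). A minimal sketch; the six-item rating lists are a toy example, not study data:

```python
# Cohen's kappa for agreement between two raters.
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """kappa = (p_o - p_e) / (1 - p_e), with p_e from rater marginals."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    categories = set(ratings_a) | set(ratings_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Toy example: two raters classify six infection sources.
a = ["lung", "lung", "abdomen", "urine", "lung", "abdomen"]
b = ["lung", "lung", "abdomen", "lung", "lung", "urine"]
print(round(cohens_kappa(a, b), 2))  # 0.43
```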
However, full concordance on all aspects of the diagnosis between independent observers was rare for some types of infection, in particular for ventilator-associated pneumonia Purpose Early hyperoxia may be an independent risk factor for mortality in mechanically ventilated intensive care unit (ICU) patients. We examined the relationship between early arterial oxygen tension (PaO2) and in-hospital mortality. Method We retrospectively assessed arterial blood gases (ABG) with 'worst' alveolar-arterial (A-a) gradient during the first 24 h of ICU admission for all ventilated adult patients from 150 participating ICUs between 2000 and 2009. We used multivariate analysis in all patients and defined subgroups to determine the relationship between PaO2 and mortality. We also studied the relationship between worst PaO2, admission PaO2 and peak PaO2 in a random cohort of patients. Results We studied 152,680 patients. Their mean PaO2 was 20.3 kPa (SD 14.6) and mean inspired fraction of oxygen (FiO2) was 62% (SD 26). Worst A-a gradient ABG identified that 49.8% (76,110) had hyperoxia (PaO2 > 16 kPa). Nineteen per cent of patients died in ICU and 26% in hospital. After adjusting for site, Simplified Acute Physiology Score II (SAPS II), age, FiO2, surgical type, Glasgow Coma Scale (GCS) below 15 and year of ICU admission, there was an association between progressively lower PaO2 and increasing in-hospital mortality, but not with increasing levels of hyperoxia. Similar findings were observed with a sensitivity analysis of PaO2 derived from high FiO2 (≥50%) versus low FiO2 (< 50%) and in defined subgroups. Worst PaO2 showed a strong correlation with admission PaO2 (r = 0.98) and peak PaO2 within 24 h of admission (r = 0.86). Conclusion We found an association between hypoxia in the first 24 h in the ICU and increased in-hospital mortality in ventilated patients, but no such association for hyperoxia.
Our findings differ from previous studies and suggest that the impact of early hyperoxia on mortality remains uncertain Objectives: The Centers for Disease Control and Prevention has established new surveillance paradigms for patients on mechanical ventilation: the ventilator-associated events, comprising ventilator-associated conditions and infection-related ventilator-associated complications. We assess 1) the current epidemiology of ventilator-associated events, 2) the relationship between ventilator-associated events and ventilator-associated pneumonia, and 3) the impact of ventilator-associated events on antimicrobial consumption and mechanical ventilation duration. Design: Inception cohort study from the longitudinal prospective French multicenter OUTCOMEREA database (1996-2012). Patients: Patients on mechanical ventilation for greater than or equal to 5 consecutive days were classified as to the presence of a ventilator-associated event episode, using slightly modified Centers for Disease Control and Prevention definitions. Intervention: None. Measurements and Main Results: Among the 3,028 patients, 2,331 patients (77%) had at least one ventilator-associated condition, and 869 patients (29%) had one infection-related ventilator-associated complication episode. Multiple causes, or the lack of an identified cause, were frequent. The leading causes associated with ventilator-associated condition and infection-related ventilator-associated complication were nosocomial infections (27.3% and 43.8%), including ventilator-associated pneumonia (14.5% and 27.6%). Sensitivity and specificity for diagnosing ventilator-associated pneumonia were 0.92 and 0.28 for ventilator-associated condition and 0.67 and 0.75 for infection-related ventilator-associated complication, respectively.
A good correlation was observed between ventilator-associated condition and infection-related ventilator-associated complication episodes and ventilator-associated pneumonia occurrence: R2 = 0.69 and 0.82 (p < 0.0001). The median number of days alive without antibiotics and mechanical ventilation at day 28 was significantly higher in patients without any ventilator-associated event (p < 0.05). Ventilator-associated condition and infection-related ventilator-associated complication rates were closely correlated with antibiotic use within each ICU: R2 = 0.987 and 0.99, respectively (p < 0.0001). Conclusions: Ventilator-associated events are very common in a population at risk and, more importantly, are highly related to antimicrobial consumption and may serve as a surrogate quality indicator for improvement programs Objectives: The Centers for Disease Control and Prevention recently released new surveillance definitions for ventilator-associated events, including the new entities of ventilator-associated conditions and infection-related ventilator-associated complications. Both ventilator-associated conditions and infection-related ventilator-associated complications are associated with prolonged mechanical ventilation and hospital death, but little is known about their risk factors and how best to prevent them. We sought to identify risk factors for ventilator-associated conditions and infection-related ventilator-associated complications. Design: Retrospective case-control study. Setting: Medical, surgical, cardiac, and neuroscience units of a tertiary care teaching hospital. Patients: One hundred ten patients with ventilator-associated conditions matched to 110 controls without ventilator-associated conditions on the basis of age, sex, ICU type, comorbidities, and duration of mechanical ventilation prior to ventilator-associated conditions. Interventions: None.
Measurements: We compared cases with controls with regard to demographics, comorbidities, ventilator bundle adherence rates, sedative exposures, routes of nutrition, blood products, fluid balance, and modes of ventilatory support. We repeated the analysis for the subset of patients with infection-related ventilator-associated complications and their controls. Main Results: Case and control patients were well matched on baseline characteristics. On multivariable logistic regression, significant risk factors for ventilator-associated conditions were mandatory modes of ventilation (odds ratio, 3.4; 95% CI, 1.6–8.0) and positive fluid balances (odds ratio, 1.2 per L positive; 95% CI, 1.0–1.4). Possible risk factors for infection-related ventilator-associated complications were starting benzodiazepines prior to intubation (odds ratio, 5.0; 95% CI, 1.3–29), total opioid exposures (odds ratio, 3.3 per 100 μg fentanyl equivalent/kg; 95% CI, 0.90–16), and paralytic medications (odds ratio, 2.3; 95% CI, 0.79–80). Traditional ventilator bundle elements, including semirecumbent positioning, oral care with chlorhexidine, venous thromboembolism prophylaxis, stress ulcer prophylaxis, daily spontaneous breathing trials, and sedative interruptions, were not associated with ventilator-associated conditions or infection-related ventilator-associated complications. Conclusions: Mandatory modes of ventilation and positive fluid balance are risk factors for ventilator-associated conditions. Benzodiazepines, opioids, and paralytic medications are possible risk factors for infection-related ventilator-associated complications.
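The odds ratios above come from multivariable logistic regression, but the underlying measure is the same as the unadjusted odds ratio from a 2×2 exposure/outcome table: OR = (a·d) / (b·c). A minimal sketch with illustrative counts, not figures from the study:

```python
# Unadjusted odds ratio from a 2x2 case-control table.
# a = exposed cases, b = unexposed cases,
# c = exposed controls, d = unexposed controls.

def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """Cross-product ratio: (a * d) / (b * c)."""
    return (a * d) / (b * c)

# Illustrative counts: 30 exposed cases, 80 unexposed cases,
# 10 exposed controls, 100 unexposed controls.
print(odds_ratio(30, 80, 10, 100))  # 3.75
```

The regression-based ORs in the study additionally adjust for covariates; this sketch shows only the raw cross-product form.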
Prospective studies are needed to determine if targeting these risk factors can lower ventilator-associated condition and infection-related ventilator-associated complication rates BACKGROUND Current sedation guidelines recommend avoiding benzodiazepines but express no preference for propofol vs dexmedetomidine. In addition, few data exist on whether randomized controlled trials of sedatives can be successfully generalized to routine practice, in which conditions tend to be more varied and complex. METHODS Data regarding daily sedative exposure were gathered from all patients undergoing mechanical ventilation for ≥ 3 days over a 7-year period in a large academic medical center. Hazard ratios (HRs) were compared for ventilator-associated events (VAEs), extubation, hospital discharge, and hospital death among patients receiving benzodiazepines, propofol, and dexmedetomidine. Proportional subdistribution hazard models with competing risks were used for analysis. All analyses were adjusted for ICU type, demographic characteristics, comorbidities, procedures, severity of illness, hypotension, oxygenation, renal function, opioids, neuroleptic agents, neuromuscular blockers, awakening and breathing trials, and calendar year. RESULTS A total of 9,603 consecutive episodes of mechanical ventilation were evaluated. Benzodiazepines and propofol were associated with increased VAE risk, whereas dexmedetomidine was not. Propofol was associated with less time to extubation compared with benzodiazepines (HR, 1.4; 95% CI, 1.3-1.5). Dexmedetomidine was associated with less time to extubation compared with benzodiazepines (HR, 2.3; 95% CI, 2.0-2.7) and propofol (HR, 1.7; 95% CI, 1.4-2.0), but relatively few dexmedetomidine exposures were available for analysis. There were no differences between any two agents in HRs for hospital discharge or mortality.
CONCLUSIONS In this large, real-world cohort, propofol and dexmedetomidine were associated with less time to extubation compared with benzodiazepines, and dexmedetomidine was also associated with less time to extubation vs propofol. These possible differences merit further study Objectives: The primary aim of the study was to measure the test characteristics of the National Health Safety Network ventilator-associated event/ventilator-associated condition constructs for detecting ventilator-associated pneumonia. Its secondary aims were to report the clinical features of patients with National Health Safety Network ventilator-associated event/ventilator-associated condition, to measure the costs of surveillance, and to assess its susceptibility to manipulation. Design: Prospective cohort study. Setting: Two inpatient campuses of an academic medical center. Patients: Eight thousand four hundred eight mechanically ventilated adults discharged from an ICU. Interventions: None. Measurements and Main Results: The National Health Safety Network ventilator-associated event/ventilator-associated condition constructs detected less than a third of ventilator-associated pneumonia cases, with a sensitivity of 0.325 and a positive predictive value of 0.07. Most National Health Safety Network ventilator-associated event/ventilator-associated condition cases (93%) did not have ventilator-associated pneumonia or other hospital-acquired complications; 71% met the definition for acute respiratory distress syndrome. Similarly, most patients with National Health Safety Network probable ventilator-associated pneumonia did not have ventilator-associated pneumonia because radiographic criteria were not met. National Health Safety Network ventilator-associated event/ventilator-associated condition rates were reduced 93% by an unsophisticated manipulation of ventilator management protocols.
Conclusions: The National Health Safety Network ventilator-associated event/ventilator-associated condition constructs failed to detect many patients who had ventilator-associated pneumonia, detected many cases that did not have a hospital complication, and were susceptible to manipulation. National Health Safety Network ventilator-associated event/ventilator-associated condition surveillance did not perform as well as ventilator-associated pneumonia surveillance and had several undesirable characteristics Objective: The Centers for Disease Control has recently proposed a major change in how ventilator-associated pneumonia is defined. This has profound implications for public reporting, reimbursement, and accountability measures for ICUs. We sought to provide evidence for or against this change by quantifying limitations of the national definition of ventilator-associated pneumonia that was in place until January 2013, particularly with regard to comparisons between, and ranking of, hospitals and ICUs. Design: A prospective survey of a nationally representative group of 43 hospitals, randomly selected from the American Hospital Association Guide (2009). Subjects classified six standardized vignettes of possible cases of ventilator-associated pneumonia as pneumonia or no pneumonia. Subjects: Individuals responsible for ventilator-associated pneumonia surveillance at 43 U.S. hospitals. Interventions: None. Measurements and Main Results: We measured the proportion of standardized cases classified as ventilator-associated pneumonia. Of 138 hospitals consented, 61 partially completed the survey and 43 fully completed the survey (response rates 44% and 31%, respectively). Agreement among hospitals about classification of cases as ventilator-associated pneumonia/not ventilator-associated pneumonia was nearly random (Fleiss κ 0.13).
Some hospitals rated 0% of cases as having pneumonia; others classified 100% as having pneumonia (median, 50%; interquartile range, 33–66%). Although region of the country did not predict case assignment, respondents who described their region as "rural" were more likely to judge a case to be pneumonia than respondents elsewhere (relative risk, 1.25; Kruskal-Wallis chi-square, p = 0.03). Conclusions: In this nationally representative study of hospitals, assignment of ventilator-associated pneumonia is extremely variable, enough to render comparisons between hospitals worthless, even when standardized cases eliminate variability in clinical data abstraction. The magnitude of this variability highlights the limitations of using poorly performing surveillance definitions as methods of hospital evaluation and comparison, and our study provides very strong support for moving to a more objective definition of ventilator-associated complications BACKGROUND Ventilator-associated conditions (VACs) and infection-related ventilator-associated complications (iVACs) are the Centers for Disease Control and Prevention's new surveillance paradigms for patients who are mechanically ventilated. Little is known regarding the clinical impact and preventability of VACs and iVACs and their relationship to ventilator-associated pneumonia (VAP). We evaluated these using data from a large, multicenter, quality-improvement initiative. METHODS We retrospectively applied definitions for VAC and iVAC to data from a prospective time series study in which VAP clinical practice guidelines were implemented in 11 North American ICUs. Each ICU enrolled 30 consecutive patients mechanically ventilated > 48 h during each of four study periods. Data on clinical outcomes and concordance with prevention recommendations were collected.
VAC, iVAC, and VAP rates over time, the agreement (κ statistic) between definitions, associated morbidity/mortality, and independent risk factors for each were determined. RESULTS Of 1,320 patients, 139 (10.5%) developed a VAC, 65 (4.9%) developed an iVAC, and 148 (11.2%) developed VAP. The agreement between VAP and VAC was 0.18, and between VAP and iVAC it was 0.19. Patients who developed a VAC or iVAC had significantly more ventilator days, hospital days, and antibiotic days and higher hospital mortality than patients who had neither of these conditions. Increased concordance with VAP prevention guidelines during the study was associated with decreased VAP and VAC rates but no change in iVAC rates. CONCLUSIONS VACs and iVACs are associated with significant morbidity and mortality. Although the agreement between VAC, iVAC, and VAP is poor, higher adoption of measures to prevent VAP was associated with lower VAP and VAC rates BACKGROUND In 2012, the National Healthcare Safety Network presented a new surveillance definition for ventilator-associated events (VAEs) to objectively define worsening pulmonary status in ventilated patients. VAE subcategories, ventilator-associated condition (VAC), infection-related VAC, and probable ventilator-associated pneumonia (PrVAP), were vetted predominantly in medical intensive care units. Our goal was to evaluate how well VAE criteria characterize pulmonary complications in surgical intensive care unit (SICU) patients. METHODS Since September 2012, all intubated SICU patients were screened prospectively for VAE and monitored for sustained respiratory dysfunction that did not meet VAE criteria. We diagnosed ventilator-associated pneumonia (VAP) using a clinical definition: Clinical Pulmonary Infection Score (CPIS) greater than 6 and catheter-directed bronchoalveolar lavage cultures with 10⁴ or more colony-forming units per milliliter of pathogenic organisms.
RESULTS We admitted 704 intubated patients. A total of 437 were intubated for two or more days (mean [SD] age, 46 [18] years; 65% male; median ventilator days, 4 [range, 2–9]; median Sequential Organ Failure Assessment [SOFA] score, 8 [range, 5–10]). Using VAE criteria, we identified 37 patients with VAC, 31 with infection-related VAC, and 22 with PrVAP. While the remaining 400 patients did not meet VAE criteria, we identified 111 patients (28%) with respiratory deterioration and diagnosed 99 additional pneumonias. Of the 111 patients, 85 (77%) never had a period of stable/decreasing oxygenation, requiring elevated vent settings upon initiation of ventilation, preventing them from meeting VAE criteria. Of the 99 pneumonia patients, 10% had sustained respiratory deterioration treated with elevations in mean airway pressure; they did not meet VAE criteria as the positive end-expiratory pressure or FIO2 was not elevated. Twenty-seven percent never had a period of stable/decreasing oxygenation. Fifty-eight percent had less than 2 days of respiratory deterioration. Agreement between PrVAP and clinical VAP was 77.3% (κ = 0.243, p < 0.001). CONCLUSION The applicability of the new National Healthcare Safety Network categories of VAE to critically ill surgery patients is limited. Agreement between PrVAP and clinical VAP in SICU patients is poor. Most surgical patients are not well categorized by this new definition; a better method of surveillance should be created for this patient population. LEVEL OF EVIDENCE Diagnostic study, level III Objectives: Ventilator-associated pneumonia diagnosis remains a debatable topic. New definitions of ventilator-associated conditions involving worsening oxygenation have been recently proposed to make surveillance of events possibly linked to ventilator-associated pneumonia as objective as possible.
The objective of the study was to confirm the effect of subglottic secretion suctioning on ventilator-associated pneumonia prevalence and to assess its concomitant impact on ventilator-associated conditions and antibiotic use. Design: Randomized controlled clinical trial conducted in five ICUs of the same hospital. Patients: Three hundred fifty-two adult patients intubated with a tracheal tube allowing subglottic secretion suctioning were randomly assigned to undergo suctioning (n = 170, group 1) or not (n = 182, group 2). Main Results: During ventilation, microbiologically confirmed ventilator-associated pneumonia occurred in 15 patients (8.8%) of group 1 and 32 patients (17.6%) of group 2 (p = 0.018). In terms of ventilatory days, ventilator-associated pneumonia rates were 9.6 per 1,000 ventilatory days and 19.8 per 1,000 ventilatory days, respectively (p = 0.0076). Ventilator-associated condition prevalence was 21.8% in group 1 and 22.5% in group 2 (p = 0.84). Among the 47 patients with ventilator-associated pneumonia, 25 (58.2%) experienced a ventilator-associated condition. Neither length of ICU stay nor mortality differed between groups; only ventilator-associated condition was associated with increased mortality. The total number of antibiotic days was 1,696 in group 1, representing 61.6% of the 2,754 ICU days, and 1,965 in group 2, representing 68.5% of the 2,868 ICU days (p < 0.0001). Conclusions: Subglottic secretion suctioning resulted in a significant reduction of ventilator-associated pneumonia prevalence, associated with a significant decrease in antibiotic use. By contrast, ventilator-associated condition occurrence did not differ between groups and appeared more related to other medical features than ventilator-associated pneumonia CONTEXT Infection is a major cause of morbidity and mortality in intensive care units (ICUs) worldwide.
However, relatively little information is available about the global epidemiology of such infections. OBJECTIVE To provide an up-to-date, international picture of the extent and patterns of infection in ICUs. DESIGN, SETTING, AND PATIENTS The Extended Prevalence of Infection in Intensive Care (EPIC II) study, a 1-day, prospective, point prevalence study with follow-up conducted on May 8, 2007. Demographic, physiological, bacteriological, therapeutic, and outcome data were collected for 14,414 patients in 1265 participating ICUs from 75 countries on the study day. Analyses focused on the data from the 13,796 adult (> 18 years) patients. RESULTS On the day of the study, 7087 of 13,796 patients (51%) were considered infected; 9084 (71%) were receiving antibiotics. The infection was of respiratory origin in 4503 (64%), and microbiological culture results were positive in 4947 (70%) of the infected patients; 62% of the positive isolates were gram-negative organisms, 47% were gram-positive, and 19% were fungi. Patients who had longer ICU stays prior to the study day had higher rates of infection, especially infections due to resistant staphylococci, Acinetobacter, Pseudomonas species, and Candida species. The ICU mortality rate of infected patients was more than twice that of noninfected patients (25% [1688/6659] vs 11% [682/6352], respectively; P < .001), as was the hospital mortality rate (33% [2201/6659] vs 15% [942/6352], respectively; P < .001) (adjusted odds ratio for risk of hospital mortality, 1.51; 95% confidence interval, 1.36-1.68; P < .001). CONCLUSIONS Infections are common in patients in contemporary ICUs, and risk of infection increases with duration of ICU stay.
In this large cohort, infection was independently associated with an increased risk of hospital death RATIONALE Accurate surveillance of ventilator-associated pneumonia (VAP) is hampered by subjective diagnostic criteria. A novel surveillance paradigm for ventilator-associated events (VAEs) was introduced. OBJECTIVES To determine the validity of surveillance using the new VAE algorithm. METHODS Prospective cohort study in two Dutch academic medical centers (2011-2012). VAE surveillance was electronically implemented and included assessment of (infection-related) ventilator-associated conditions (VAC, IVAC) and VAP. Concordance with ongoing prospective VAP surveillance was assessed, along with clinical diagnoses underlying VAEs and the associated mortality of all conditions. Consequences of minor differences in electronic VAE implementation were evaluated. MEASUREMENTS AND MAIN RESULTS The study included 2,080 patients with 2,296 admissions. Incidences of VAC, IVAC, VAE-VAP, and VAP according to prospective surveillance were 10.0, 4.2, 3.2, and 8.0 per 1000 ventilation days, respectively. The VAE algorithm detected at most 32% of the patients with VAP identified by prospective surveillance. VAC signals were most often caused by volume overload and infections, but not necessarily VAP. Subdistribution hazards for mortality were 3.9 (95% confidence interval, 2.9-5.3) for VAC, 2.5 (1.5-4.1) for IVAC, 2.0 (1.1-3.6) for VAE-VAP, and 7.2 (5.1-10.3) for VAP identified by prospective surveillance. In sensitivity analyses, mortality estimates varied considerably after minor differences in electronic algorithm implementation. CONCLUSIONS Concordance between the novel VAE algorithm and VAP was poor. Incidence and associated mortality of VAE were susceptible to small differences in electronic implementation.
More studies are needed to characterize the clinical entities underlying VAE and to ensure comparability of rates from different institutions.
2,032
28,124,278
WEPD seems to be an effective intervention to reduce the risk of post-operative SSI in patients undergoing abdominal surgery
This article highlights the clinical effectiveness of wound edge protector devices (WEPD) in preventing post-operative surgical site infections (SSI) in patients undergoing abdominal surgery.
BACKGROUND Kaiser Sunnyside Medical Center has participated in the American College of Surgeons National Surgical Quality Improvement Program (NSQIP) since January 2006. Data on general and colorectal surgical site infections (SSIs) demonstrated a need for improvement in SSI rates. OBJECTIVE To evaluate application of a "care bundle" for patients undergoing colorectal operations, with the goal of reducing overall SSI rates. METHODS We prospectively implemented multiple interventions, with retrospective analysis of data using the NSQIP database. The overall, superficial, deep, and organ/space SSI rates were compared before and after implementation of this colorectal care bundle. RESULTS Between January 2006 and December 2009, there were 430 colorectal cases in our NSQIP report with 91 infections, an overall rate of 21.16%. Between January 2010, when the colorectal care bundle was implemented, and June 2011, there were 195 cases and 13 infections, a 6.67% overall rate. The absolute decrease of 14.49% is significant (p < 0.0001). The rate of superficial SSI decreased from 15.12% to 3.59% (p < 0.0001). The rates for deep and organ/space SSI also showed a decrease; however, this was not statistically significant. The NSQIP observed-to-expected ratio for colorectal SSI decreased from a range of 1.27 to 1.83 before implementation to 0.54 after implementation (fiscal year 2010). CONCLUSIONS Our institution was an NSQIP high outlier in general surgery SSIs and had a high proportion of these cases represented in colorectal cases. By instituting a care bundle composed of core and adjunct strategies, we significantly decreased our rate of colorectal SSIs Background Wound-edge protection devices (WEPDs) have been used in surgery for more than 40 years to reduce surgical site infection (SSI). No economic evaluation of WEPDs against any comparator has ever been conducted.
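The before/after SSI rates in the care-bundle study above are simple proportions of infections to cases; a minimal sketch recomputing them from the reported counts (note the reported 14.49% absolute decrease is the difference of the two rounded rates):

```python
# SSI rates before and after the colorectal care bundle, recomputed
# from the case and infection counts reported in the abstract.

def rate(infections: int, cases: int) -> float:
    """Infection rate as a percentage of cases."""
    return 100.0 * infections / cases

before = round(rate(91, 430), 2)  # 21.16
after = round(rate(13, 195), 2)   # 6.67
print(before, after, round(before - after, 2))  # 21.16 6.67 14.49
```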
The aim of the paper was to assess whether WEPDs are cost-effective in reducing SSI compared to standard care alone in the United Kingdom. Methods and Findings An economic evaluation was conducted alongside the ROSSINI trial. The study perspective was that of the UK National Health Service and the time horizon was 30 days post-operatively. The study was conducted in 21 UK hospitals. 760 patients undergoing laparotomy were randomised to either WEPD or standard care and 735 were included in the primary analysis. The main economic outcome was cost-effectiveness based on incremental cost (£) per quality-adjusted life year (QALY) gained. Patients in the WEPD arm accessed health care worth £5,420 on average and gained 0.02131 QALYs, compared to £5,130 and 0.02133 QALYs gained in the standard care arm. The WEPD strategy was more costly and equally effective compared to standard care, but there was significant uncertainty around incremental costs and QALYs. The findings were robust to a range of sensitivity analyses. Conclusions There is no evidence to suggest that WEPDs can be considered a cost-effective device to reduce SSI. Their continued use is a waste of limited health care resources. Objective To determine the clinical effectiveness of wound edge protection devices in reducing surgical site infection after abdominal surgery. Design Multicentre observer-blinded randomised controlled trial. Participants Patients undergoing laparotomy at 21 UK hospitals. Interventions Standard care or the use of a wound edge protection device during surgery. Main outcome measures Surgical site infection within 30 days of surgery, assessed by blinded clinicians at seven and 30 days and by patient's self report for the intervening period. Secondary outcomes included quality of life, duration of stay in hospital, and the effect of characteristics of the patient and operation on the efficacy of the device.
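The economic conclusion above (WEPD more costly, no QALY gain) follows directly from the mean values quoted: a strategy that costs more and delivers no extra benefit is "dominated", so no willingness-to-pay threshold makes it cost-effective. A minimal sketch using those means; the variable names and the simple textbook dominance check are mine, and the trial's own analysis also modelled the surrounding uncertainty:

```python
# Incremental cost and QALY comparison for WEPD vs standard care,
# using the per-patient means quoted in the abstract (illustrative only).
cost_wepd, qaly_wepd = 5420.0, 0.02131
cost_std, qaly_std = 5130.0, 0.02133

d_cost = cost_wepd - cost_std  # +290 GBP per patient
d_qaly = qaly_wepd - qaly_std  # marginally negative

# More costly and not more effective: the strategy is dominated,
# so an ICER (d_cost / d_qaly) is not meaningfully reportable.
dominated = d_cost > 0 and d_qaly <= 0
print(dominated)
```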
Results 760 patients were enrolled with 382 patients assigned to the device group and 378 to the control group. Six patients in the device group and five in the control group did not undergo laparotomy. Fourteen patients, seven in each group, were lost to follow-up. A total of 184 patients experienced surgical site infection within 30 days of surgery, 91/369 (24.7%) in the device group and 93/366 (25.4%) in the control group (odds ratio 0.97, 95% confidence interval 0.69 to 1.36; P=0.85). This lack of benefit was consistent across wound assessments performed by clinicians and those reported by patients and across all secondary outcomes. In the secondary analyses no subgroup could be identified in which there was evidence of clinical benefit associated with use of the device. Conclusions Wound edge protection devices do not reduce the rate of surgical site infection in patients undergoing laparotomy, and therefore their routine use for this role cannot be recommended. Trial registration Current Controlled Trials ISRCTN. A system has been constructed to evaluate the design, implementation, and analysis of randomized control trials (RCT). The degree of quadruple blinding (the randomization process, the physicians and patients as to therapy, and the physicians as to ongoing results) is considered to be the most important aspect of any trial. The analytic techniques are scored with the same emphasis as is placed on the control of bias in the planning and implementation of the studies. Description of the patient and treatment materials and the measurement of various controls of quality have less weight. An index of quality of a RCT is proposed with its pros and cons. If published papers were to approximate these principles, there would be a marked improvement in the quality of randomized control trials.
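As a sanity check, the unadjusted odds ratio can be recomputed from the ROSSINI 2x2 counts quoted above (91/369 in the device group vs 93/366 in the control group). This is an illustrative sketch with my own variable names; the published 0.97 presumably comes from the trial's adjusted model, so the raw value differs slightly:

```python
# Unadjusted odds ratio for SSI from the ROSSINI counts.
a, b = 91, 369 - 91  # device group: events, non-events
c, d = 93, 366 - 93  # control group: events, non-events

odds_ratio = (a * d) / (b * c)  # cross-product ratio of the 2x2 table
print(round(odds_ratio, 2))
```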
Finally, a reasonable standard design and conduct of trials will facilitate the interpretation of those with conflicting results and help in making valid combinations of undersized trials. Aim Surgical site infection (SSI) remains a common postoperative morbidity, particularly in colorectal resections, and poses a significant financial burden to the healthcare system. The omission of mechanical bowel preparation, as is performed in enhanced recovery after surgery programmes, appears to further increase the incidence. Various wound protection methods have been devised to reduce the incidence of SSIs. However, there are few randomized controlled trials assessing their efficacy. The aim of this study is to investigate whether ALEXIS wound retractors with reinforced O-rings are superior to conventional wound protection methods in preventing SSIs in colorectal resections. Introduction Surgical site infections (SSIs) are frequent complications in colorectal surgery and may lead to burst abdomen, incisional hernia, and increased perioperative costs. Plastic wound ring drapes (RD) were introduced some decades ago to protect the abdominal wound from bacteria and reduce SSIs. There have been no controlled trials examining the benefit of RD in laparoscopic colorectal surgery. The Reduction of wound infections in laparoscopic assisted colorectal resections by plastic wound ring drapes (REDWIL) trial was thus designed to assess their effectiveness in preventing SSIs after elective laparoscopic colorectal resections. Materials/methods REDWIL is a randomized controlled monocenter trial with two parallel groups (experimental group with RD and control group without RD). Patients undergoing elective laparoscopic colorectal resection were included. The primary endpoint was SSIs. Secondary outcomes were colonization of the abdominal wall with bacteria, reoperations/readmissions, early/late postoperative complications, and cost of hospital stay.
The duration of follow-up was 6 months. Results Between January 2008 and October 2010, 109 patients were randomly assigned to the experimental or control group (with or without RD). Forty-six patients in the RD group and 47 patients in the control group completed follow-up. SSIs developed in ten patients with RD (21.7%) and six patients without RD (12.8%) (p = 0.28). An intraoperative swab taken from the abdominal wall was positive in 66.7% of patients with RD and 57.5% without RD (p = 0.46). The number of species cultured within one swab was significantly higher in those without RD (p = 0.03). The median total inpatient costs including emergency readmissions were 3,402 ± 4,038 € in the RD group and 3,563 ± 1,735 € in the control group (p = 0.869). Conclusions RD do not reduce the rate of SSIs in laparoscopic colorectal surgery. The inpatient costs are similar with and without RD. A controlled, randomized study of the efficacy of a plastic wound ring drape (Opdrape, Triplus) to prevent contamination and infection in elective colorectal operations is reported. Seventy patients were operated upon with the wound ring drape and 70 patients without. All patients received preoperative systemic antibiotic prophylaxis. Abdominal wound infection was observed in seven of 70 (10 per cent) patients with the wound ring drape and six of 70 (9 per cent) without (N.S.). An operative swab for bacteriologic evaluation was obtained from 85 per cent of the wounds. There was no evidence that the drape protected the wound from contamination with intestinal bacterial flora. It was concluded that the wound ring drape prevents neither contamination nor infection. PURPOSE: Surgical site infection following colorectal surgery is a frequent and costly problem. Barrier protection at the time of this form of surgery has been used with varying results.
The aim of this randomized study was to examine the efficacy of barrier retractional wound protection in the prevention of surgical site infections in open, elective colorectal surgery. METHODS: One hundred thirty consecutive patients undergoing open elective colorectal resectional surgery were randomly assigned to have either barrier retractional wound protection or standard wound retraction. Patients were then followed up for a minimum of 30 days postoperatively. The primary end point was surgical site infection as defined by the Centers for Disease Control and Prevention. The secondary end point was performance of the wound protector as assessed by operating surgeons. RESULTS: There was a significant reduction in the incidence of incisional surgical site infections when the wound protector was used: 3 of 64 (4.7%) vs 15 of 66 (22.7%); P = .004. Most surgical site infections were diagnosed after discharge from the hospital (78%), and there was no difference in the rates of reoperation, readmission, or formal wound drainage between the 2 groups. Surgeons found the wound protector to be helpful with retraction during surgery, with 88% (7/8) adopting it as part of their standard setup. CONCLUSIONS: In this study the use of barrier wound protection in elective open colorectal resectional surgery resulted in a clinically significant reduction in incisional surgical site infections. Barrier wound protection of this nature should be considered routine in this type of surgery. Introduction Surgical site infections (SSIs) remain a major problem in colorectal surgery. Method In this prospective, randomised study, we compared two kinds of wound protection, namely, "plastic ring drape" versus "standard cloth towels". One hundred one patients were randomised to the control group (wet cloth towels) and 98 to the study cohort (ring drape).
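The barrier-retraction effect quoted above (3 of 64 vs 15 of 66) can also be expressed as an absolute risk reduction and a number needed to treat. These derived quantities are my own illustration, not figures reported by the trial:

```python
import math

# Absolute risk reduction (ARR) and number needed to treat (NNT)
# from the incisional SSI counts quoted in the abstract.
risk_protector = 3 / 64   # ~4.7% with barrier retractional protection
risk_standard = 15 / 66   # ~22.7% with standard retraction

arr = risk_standard - risk_protector  # risk difference, as a proportion
nnt = math.ceil(1 / arr)              # conventionally rounded up

print(round(arr, 3), nnt)
```

On these counts, roughly one incisional SSI would be avoided for every six patients treated with the barrier protector, under the usual caveat that NNTs from a single small trial carry wide confidence intervals.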
SSIs were classified according to Centers for Disease Control and Prevention recommendations. Discussion In the control group, 30 patients had an SSI, whereas 20 did so in the study group. This difference was not significant (p = 0.131). Conclusion Plastic ring drape for wound protection does not guard against SSIs in colorectal surgery. Surgical site infections (SSIs) after cesarean section appear to be more common than generally believed. We prospectively evaluated 231 consecutive pregnant women who underwent elective or emergency cesarean section, and were assigned to have either the Alexis wound retractor (study group) or a conventional Doyen retractor (control group) during the operation. There was no evidence of SSI, defined as wound dehiscence, pain or tenderness in the lower abdomen, localized swelling, redness, heat or purulent discharge from the wound, in any woman in the study group. Moreover, no endometritis occurred in this patient collective. There were three SSI in the control group, but no endometritis. Our preliminary data show excellent protection of wound infections with an additive protective effect to that given by antibiotic cover. After a short learning curve, the handling of the Alexis device became easier and the median insertion time was 18. A study is reported of the influence of plastic wound protectors on the incidence of wound infection in patients having abdominal operations involving the opening of the alimentary tract or biliary system. BACKGROUND We prospectively investigated whether the wound-protective Alexis (Applied Medical, Rancho Santa Margarita, CA) wound retractor was effective in preventing surgical site infection (SSI). METHODS We examined the actual condition of SSI in a 12-month randomized, controlled trial consisting of 221 patients who had undergone nontraumatic gastrointestinal surgery.
The patients were divided into a With Alexis retractor group (n = 111) and a Without Alexis retractor group (n = 110). We also analyzed SSI separately on the basis of surgical sites such as gastric surgery or colorectal surgery. RESULTS Overall estimation showed a significant decrease in wound infection (superficial incisional SSI) in the With Alexis retractor group. In the analysis based on surgical sites, a significant decrease in wound infection was noted in the With Alexis retractor group, the members of which had undergone colorectal surgery. There was no significant difference between the two groups in the occurrence of organ/space SSI, including anastomotic leak or intraperitoneal abscess. CONCLUSION It was suggested that the use of the Alexis wound retractor would protect surgical wounds from contamination by bacteria and thus prevent infection. The preventive effects of a plastic wound drape (Vi-drape) on wound contamination and subsequent infection was investigated in 289 appendicectomies randomized into treatment and control groups. Samples for quantitative culture of enterobacteria using a simple technique were obtained from the drape, the wound under the drape and from the wounds of patients being operated without the drape. The drape reduced the number of wound bacteria by an average of 94% or 1.2 logarithms. The overall rate of postoperative wound infection was 7.6% in the treatment group and 9.1% in the control group (N.S). Evidence is presented that the drape might prevent infection in patients with contaminated operations. A randomized controlled trial has been performed to assess the value of plastic wound drapes in the prevention of surgical wound infection.
One hundred and forty-four patients undergoing abdominal surgery were allocated to one of three groups; a control group (A) in which standard cloth towels were applied to the abdominal wound, group B in which an adhesive plastic drape was added and group C in which a plastic ring protector was inserted into the wound
2,033
22,634,197
It was observed that consumption of capsaicinoids increases energy expenditure by approximately 50 kcal/day, and that this would produce clinically significant levels of weight loss in 1-2 years. It was also observed that regular consumption significantly reduced abdominal adipose tissue levels and reduced appetite and energy intake. The mechanism of action is not presently fully understood, although it is well accepted that much of the effect is caused by stimulation of the TRPV1 receptor. While capsaicinoids are not a magic bullet for weight loss, the evidence is that they could play a beneficial role as part of a weight management program.
Capsaicinoids are a group of chemicals found in chilli peppers, with bioactive properties. The purpose of this study is to systematically review research investigating the potential benefits capsaicinoid compounds may have in relation to weight management.
Two studies were conducted to investigate the effects of red pepper (capsaicin) on feeding behaviour and energy intake. In the first study, the effects of dietary red pepper added to high-fat (HF) and high-carbohydrate (HC) meals on subsequent energy and macronutrient intakes were examined in thirteen Japanese female subjects. After the ingestion of a standardized dinner on the previous evening, the subjects ate an experimental breakfast (1883 kJ) of one of the following four types: (1) HF; (2) HF and red pepper (10 g); (3) HC; (4) HC and red pepper. Ad libitum energy and macronutrient intakes were measured at lunch-time. The HC breakfast significantly reduced the desire to eat and hunger after breakfast. The addition of red pepper to the HC breakfast also significantly decreased the desire to eat and hunger before lunch. Differences in diet composition at breakfast time did not affect energy and macronutrient intakes at lunch-time. However, the addition of red pepper to the breakfast significantly decreased protein and fat intakes at lunch-time. In Study 2, the effects of a red-pepper appetizer on subsequent energy and macronutrient intakes were examined in ten Caucasian male subjects. After ingesting a standardized breakfast, the subjects took an experimental appetizer (644 kJ) at lunch-time of one of the following two types: (1) mixed diet and appetizer; (2) mixed diet and red-pepper (6 g) appetizer. The addition of red pepper to the appetizer significantly reduced the cumulative ad libitum energy and carbohydrate intakes during the rest of the lunch and in the snack served several hours later. Moreover, the power spectral analysis of heart rate revealed that this effect of red pepper was associated with an increase in the ratio of sympathetic to parasympathetic nervous system activity.
These results indicate that the ingestion of red pepper decreases appetite and subsequent protein and fat intakes in Japanese females and energy intake in Caucasian males. Moreover, this effect might be related to an increase in sympathetic nervous system activity in Caucasian males. BACKGROUND Capsinoids (nonpungent capsaicin analogs) are known to activate brown adipose tissue (BAT) thermogenesis and whole-body energy expenditure (EE) in small rodents. BAT activity can be assessed by [¹⁸F]fluorodeoxyglucose-positron emission tomography (FDG-PET) in humans. OBJECTIVES The aims of the current study were to examine the acute effects of capsinoid ingestion on EE and to analyze its relation to BAT activity in humans. DESIGN Eighteen healthy men aged 20-32 y underwent FDG-PET after 2 h of cold exposure (19°C) while wearing light clothing. Whole-body EE and skin temperature, after oral ingestion of capsinoids (9 mg), were measured for 2 h under warm conditions (27°C) in a single-blind, randomized, placebo-controlled, crossover design. RESULTS When exposed to cold, 10 subjects showed marked FDG uptake into adipose tissue of the supraclavicular and paraspinal regions (BAT-positive group), whereas the remaining 8 subjects (BAT-negative group) showed no detectable uptake. Under warm conditions (27°C), the mean (±SEM) resting EE was 6114 ± 226 kJ/d in the BAT-positive group and 6307 ± 156 kJ/d in the BAT-negative group (NS). EE increased by 15.2 ± 2.6 kJ/h in 1 h in the BAT-positive group and by 1.7 ± 3.8 kJ/h in the BAT-negative group after oral ingestion of capsinoids (P < 0.01). Placebo ingestion produced no significant change in either group. Neither capsinoids nor placebo changed the skin temperature in various regions, including regions close to BAT deposits. CONCLUSION Capsinoid ingestion increases EE through the activation of BAT in humans.
This trial was registered at http://www.umin.ac.jp/ctr/ as UMIN 000006073. The aim of the present study was to investigate whether capsaicin assists weight maintenance by limiting weight regain after weight loss of 5 to 10%. In this randomized double-blind placebo-controlled study, ninety-one moderately overweight subjects were randomly assigned to an intensive group that underwent all the measurements, and an extensive group that underwent the same measurements except the metabolism measurements. After a 4-week very-low-energy diet (VLED) intervention, a 3-month weight-maintenance period followed. During weight maintenance, subjects were divided into a capsaicin (135 mg capsaicin/d) and a placebo group. Body mass was measured before and after the VLED and after 1, 2 and 3 months of weight maintenance. The mean body-mass loss during the VLED was 6.6 (SD 2.0) kg (7.8 (SD 1.8)% initial body mass), and was not different between the subsequent treatment and placebo group. During weight maintenance, mean % regain during treatment was not significantly different compared with placebo (33.3 (SD 35.7) v. 19.2 (SD 41.8)%, P=0.09). RQ was significantly less increased during weight maintenance in the treatment group compared with placebo (0.04 (SD 0.06) v. 0.07 (SD 0.05), P<0.05), indicating a relatively more sustained fat oxidation. Fat oxidation (g/h) after weight maintenance was higher in the capsaicin group compared with placebo (4.2 (SD 1.1) v. 3.5 (SD 0.9), P<0.05). These results indicate that capsaicin treatment caused sustained fat oxidation during weight maintenance compared with placebo. However, capsaicin treatment has no limiting effect on 3-month weight regain after modest weight loss. Background Dihydrocapsiate (DCT) is a natural safe food ingredient which is structurally related to capsaicin from chili pepper and is found in the non-pungent pepper strain, CH-19 Sweet.
It has been shown to elicit the thermogenic effects of capsaicin but without its gastrointestinal side effects. Methods The present study was designed to examine the effects of DCT on adaptive thermogenesis as the result of caloric restriction with a high-protein very low calorie diet (VLCD), and to determine whether DCT would increase post-prandial energy expenditure (PPEE) in response to a 400 kcal/60 g protein liquid test meal. Thirty-three subjects completed an outpatient very low calorie diet (800 kcal/day providing 120 g/day protein) over 4 weeks and were randomly assigned to receive either DCT capsules three times per day (3 mg or 9 mg) or placebo. At baseline and 4 weeks, fasting basal metabolic rate and PPEE were measured in a metabolic hood and fat free mass (FFM) determined using displacement plethysmography (BOD POD). Results PPEE normalized to FFM was increased significantly in subjects receiving 9 mg/day DCT by comparison to placebo (p < 0.05), but decreases in resting metabolic rate were not affected. Respiratory quotient (RQ) increased by 0.04 in the placebo group (p < 0.05) at the end of the 4 weeks, but did not change in groups receiving DCT. Conclusions These data provide evidence for postprandial increases in thermogenesis and fat oxidation secondary to administration of dihydrocapsiate. Trial registration clinical. Dietary red pepper suppresses energy intake and modifies macronutrient intake. We have investigated whether a stimulus in the mouth and the sensation of spiciness are necessary for red pepper-induced changes in energy and macronutrient intake in human volunteers. In a preliminary test, sixteen Japanese male volunteers tasted samples of a soup with graded doses of red pepper in order to define a moderate and a maximum tolerable (strong) dose of red pepper. On the day of the experiment, a standardised breakfast was given to the volunteers.
At lunchtime, the subjects ingested one of four experimental soups containing either a placebo, a moderate or a strong dose of red pepper plus placebo capsules, or a placebo soup plus capsules delivering a strong dose of red pepper. The rest of the meal was given ad libitum to all subjects. The amount of food, protein and carbohydrate ingested was similar for all conditions. Energy and fat intake were similar after the ingestion of the moderate soup compared with placebo. However, the strong soup significantly lowered fat intake compared with placebo (P=0.043), and ingestion of strong capsules also tended to suppress it (P=0.080). Moreover, energy intake after strong soup and capsules tended to be lower than placebo (P=0.089 and 0.076, respectively). The present results indicate that the maximum tolerable dose is necessary to have a suppressive effect of red pepper on fat intake. The main site of the action of red pepper is not in the mouth. BACKGROUND: Decreased appetite and increased energy expenditure after oral consumption of red pepper has been shown. OBJECTIVE: The aim of the present study was to assess the relative oral and gastrointestinal contribution to capsaicin-induced satiety and its effects on food intake or macronutrient selection. METHODS: For 24 subjects (12 men and 12 women; age: 35±10 y; BMI: 25.0±2.4 kg/m2; range 20-30), 16 h food intake was assessed four times during 2 consecutive days by offering macronutrient-specific buffets and boxes with snacks, in our laboratory restaurant. At 30 min before each meal, 0.9 g red pepper (0.25% capsaicin; 80 000 Scoville Thermal Units) or a placebo was offered in either tomato juice or in two capsules that were swallowed with tomato juice. Hunger and satiety were recorded using Visual Analogue Scales. RESULTS: Average daily energy intake in the placebo condition was 11.5±1.0 MJ/d for the men and 9.4±0.8 MJ/d for the women.
After capsaicin capsules, energy intake was 10.4±0.6 and 8.3±0.5 MJ/d (P<0.01); after capsaicin in tomato juice, it was 9.9±0.7 and 7.9±0.5 MJ/d, respectively (compared to placebo: P<0.001; compared to capsaicin in capsules: P<0.05). En% from carbohydrate/protein/fat (C/P/F) changed from 46±3/15±1/39±2 to 52±4/15±1/33±2 en% (P<0.01) in the men, and from 48±4/14±2/38±3 to 42±4/14±2/32±3 en% (P<0.01) in the women, in both capsaicin conditions. Satiety (area under the curve) increased from 689 to 757 mmh in the men and from 712 to 806 mmh in the women, both (P<0.01). Only in the oral exposure condition was the reduction in energy intake and the increase in satiety related to perceived spiciness. CONCLUSION: In the short term, both oral and gastrointestinal exposure to capsaicin increased satiety and reduced energy and fat intake; the stronger reduction with oral exposure suggests a sensory effect of capsaicin. The biochemical and physiological indices were monitored in 44 subjects after 4-week capsinoids (capsaicin analogues with low pungency) intake. The subjects were randomly assigned to 3 groups: CSNs3 (3 mg/kg of capsinoids), CSNs10 (10 mg/kg of capsinoids) and the control (placebo). Measurements were performed in the morning on overnight-fasted subjects. The oxygen consumption (VO2), resting energy expenditure (REE) and fat oxidation increased slightly compared to pre-administration values without any adverse effects, although the increase was not significant. The increase in fat oxidation was positively and significantly correlated with the body mass index (BMI). A meta-analysis was therefore conducted on a subgroup consisting of subjects with BMI ≥ 25 (n=28). As a result, not only did VO2 increase significantly (p<0.05) in the CSNs10 group, but REE in the CSNs10 group and fat oxidation in the CSNs3 and CSNs10 groups also tended to increase (p<0.1).
Consequently, a capsinoids intake would be able to enhance the energy expenditure and fat burning in humans, particularly those with high BMI. BACKGROUND & AIMS Bioactive ingredients have been shown to reduce appetite and energy intake. The magnitude of these effects might depend on energy balance, which is why it was investigated how capsaicin, green tea, CH-19 sweet pepper, as well as green tea and capsaicin combined, affect appetite and energy intake during respectively negative and positive energy balance. METHODS 27 subjects were randomized to three weeks of negative and three weeks of positive energy balance, during which capsaicin, green tea, CH-19 sweet pepper, capsaicin+green tea or placebo was ingested on ten separate test days while the effects on appetite, energy intake, body weight and heart rate were assessed. RESULTS CH-19 sweet pepper and a combination of capsaicin and green tea reduced energy intake during positive energy balance. Capsaicin and green tea suppressed hunger and increased satiety more during negative than during positive energy balance. CONCLUSIONS Bioactive ingredients had energy-intake-reducing effects when used in combinations and in positive energy balance. Energy balance did not affect possible treatment-induced energy intake, but did affect appetite by supporting negative energy balance. Bioactive ingredients may therefore be helpful in reducing energy intake and might support weight loss periods by relatively sustaining satiety and suppressing hunger. The effects of dietary hot red pepper on energy metabolism at rest and during exercise were examined in long distance male runners 18-23 yr of age. A standardized meal was given on the evening prior to the experiment. The subjects had a meal (2720 kJ) with or without 10 g of hot red pepper for breakfast. During rest (2.5 h after meal) and exercise (pedaling for 1 h at 150 W, about 60% VO2max, using cycling ergometry), expired gases and venous blood were collected.
The meal with hot red pepper significantly elevated respiratory quotient and blood lactate levels at rest and during exercise. Oxygen consumption at rest was slightly but nonsignificantly higher after the hot red pepper meal at 30 min after the meal. Plasma epinephrine and norepinephrine levels were significantly higher in those who had the hot red pepper at 30 min after the meal. These results suggest that hot red pepper ingestion stimulates carbohydrate oxidation at rest and during exercise. OBJECTIVE: To evaluate the efficacy and side effects of an herbal formulation to promote weight loss, as compared to placebo. DESIGN: 12-week multicenter double-blind, placebo-controlled, randomized parallel-groups design. Study conducted at three clinical sites in New York State. Subjects were randomized to receive either the 'active' product or a 'placebo' supplement for 12 weeks. Minimal steps were taken to influence lifestyle changes with regard to diet or exercise. SUBJECTS: 102 overweight/obese (30 < BMI ≤ 39.9 kg/m2) volunteers between the ages of 18 and 65 y. MAIN OUTCOME MEASURES: Weight, percent body fat, fat mass, waist circumference, BMI, blood pressure, and pulse measured at 2 days, 1 week, 2 weeks, 4 weeks, 8 weeks, and 12 weeks postrandomization. RESULTS: Subjects receiving the 'active' treatment experienced, on average, an additional 1.5 kg of weight loss compared with subjects receiving the placebo. In addition, subjects receiving the 'active' treatment experienced greater reductions in BMI and waist circumference over the 12-week period. No differences were observed with respect to percent body fat, fat mass, diastolic or systolic blood pressure, pulse, the occurrence of any adverse event, or the occurrence of any presumed treatment-related adverse event.
Testing of the study product by two independent laboratories indicated that it had only approximately half of the intended amount of ephedrine alkaloids and caffeine. CONCLUSIONS: Over the 12-week trial, subjects on the active treatment experienced significantly greater weight loss than subjects on placebo, without an increase in blood pressure, pulse, or the rate of adverse events. These benefits were achieved in the absence of any lifestyle treatment to change dietary or exercise behavior and with lower doses of ephedrine alkaloids and caffeine than those commonly utilized. Previous studies suggest consumption of red pepper (RP) promotes negative energy balance. However, the RP dose provided in these studies (up to 10 g/meal) usually exceeded the amount preferred by the general population in the United States (mean = ~1 g/meal). The objective of this study was to evaluate the effects of hedonically acceptable RP doses served at a single meal in healthy, lean individuals on thermogenesis and appetite. Twenty-five men and women (aged 23.0 ± 0.5 years, BMI 22.6 ± 0.3 kg/m(2), 13 spicy food users and 12 non-users) participated in a randomized crossover trial during which they consumed a standardized quantity (1 g); their preferred quantity (regular spicy food users 1.8 ± 0.3 g/meal, non-users 0.3 ± 0.1 g/meal); or no RP. Energy expenditure, core body and skin temperature, and appetite were measured. Postprandial energy expenditure and core body temperature were greater, and skin temperature was lower, after test loads with 1 g RP than no RP. Respiratory quotient was lower after the preferred RP dose was ingested orally, compared to in capsule form. These findings suggest that RP's effects on energy balance stem from a combination of metabolic and sensory inputs, and that oral exposure is necessary to achieve RP's maximum benefits.
Energy intake was lower after test loads with 1 g RP than no RP in non-users, but not in users. Preoccupation with food, and the desire to consume fatty, salty, and sweet foods, were decreased more (or tended to be decreased more) in non-users than users after a 1 g RP test load, but did not vary after a test load with no RP. This suggests that individuals may become desensitized to the effects of RP with long-term spicy food intake. Background: A combination of tyrosine, capsaicin, catechines and caffeine may stimulate the sympathetic nervous system and promote satiety, lipolysis and thermogenesis. In addition, dietary calcium may increase fecal fat excretion. Objective: To investigate the acute and subchronic effect of a supplement containing the above-mentioned agents or placebo taken t.i.d. on thermogenesis, body fat loss and fecal fat excretion. Design: In total, 80 overweight-obese subjects (body mass index 31.2±2.5 kg/m2, mean±s.d.) underwent an initial 4-week hypocaloric diet (3.4 MJ/day). Those who lost >4% body weight were instructed to consume a hypocaloric diet (−1.3 MJ/day) and were randomized to receive either placebo (n=23) or bioactive supplement (n=57) in a double-blind, 8-week intervention. The thermogenic effect of the compound was tested on the first and last day of intervention, and blood pressure, heart rate, body weight and composition were assessed. Results: Weight loss during the induction phase was 6.8±1.9 kg. At the first exposure the thermogenic effect of the bioactive supplement exceeded that of placebo by 87.3 kJ/4 h (95% CI: 50.9; 123.7, P=0.005) and after 8 weeks this effect was sustained (85.5 kJ/4 h (47.6; 123.4), P=0.03). Body fat mass decreased more in the supplement group by 0.9 kg (0.5; 1.3) compared with placebo (P<0.05). The bioactive supplement had no effect on fecal fat excretion, blood pressure or heart rate.
Conclusion: The bioactive supplement increased 4-h thermogenesis by 90 kJ more than placebo; the effect was maintained after 8 weeks and was accompanied by a slight reduction in fat mass. These bioactive components may support weight maintenance after a hypocaloric diet.
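The between-group contrasts reported above (e.g. the 87.3 kJ/4 h thermogenic difference with its 95% CI of 50.9; 123.7) follow the standard two-sample pattern: a difference in means with a normal-approximation confidence interval. A minimal sketch of that computation; the group means and SDs used here are hypothetical, since the abstract reports only the summary contrast:

```python
import math
from statistics import NormalDist

def diff_ci(mean_a, sd_a, n_a, mean_b, sd_b, n_b, level=0.95):
    """Difference in means (a - b) with a normal-approximation CI
    built from the two groups' summary statistics."""
    diff = mean_a - mean_b
    se = math.sqrt(sd_a**2 / n_a + sd_b**2 / n_b)  # standard error of the difference
    z = NormalDist().inv_cdf(0.5 + level / 2)      # e.g. 1.96 for 95%
    return diff, (diff - z * se, diff + z * se)

# Hypothetical summary statistics (supplement n=57 vs placebo n=23),
# NOT the trial's raw data; chosen only to reproduce the 87.3 kJ/4 h contrast.
diff, (lo, hi) = diff_ci(210.0, 95.0, 57, 122.7, 88.0, 23)
print(f"difference = {diff:.1f} kJ/4 h, 95% CI ({lo:.1f}; {hi:.1f})")
```

With raw data, a Welch t-interval (t quantile instead of z) would be the usual choice; the normal approximation is adequate at these sample sizes.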
2,034
26,626,268
Change in VJ height was not different for OW versus plyometric training. CONCLUSIONS: OW is an effective training method to improve VJ height. The similar effects observed for OW and plyometric training on VJ height suggest that either of these methods would be beneficial when devising training programmes to improve VJ height.
PURPOSE: This systematic review was conducted to evaluate the effect of Olympic weightlifting (OW) on vertical jump (VJ) height compared to a control condition, traditional resistance training, and plyometric training.
PURPOSE: The purpose of this study was to determine whether ballistic resistance training would increase the vertical jump (VJ) performance of already highly trained jump athletes. METHODS: Sixteen male volleyball players from an NCAA Division I team participated in the study. A Vertec was used to measure standing vertical jump and reach (SJR) and jump and reach from a three-step approach (AJR). Several types of vertical jump tests were also performed on a Plyometric Power System and a forceplate to measure force, velocity, and power production during vertical jumping. The subjects completed the tests and were then randomly divided into two groups, control and treatment. All subjects completed the usual preseason volleyball on-court training combined with a resistance training program. In addition, the treatment group completed 8 wk of squat jump training while the control group completed squat and leg press exercises at a 6RM load. Both groups were retested at the completion of the training period. RESULTS: The treatment group produced a significant increase in both SJR and AJR, of 5.9 ± 3.1% and 6.3 ± 5.1%, respectively. These increases were significantly greater than the pre- to post-training changes produced by the control group, which were not significant for either jump. Analysis of the data from the various other jump tests suggested that increased overall force output during jumping, and in particular increased rate of force development, were the main contributors to the increased jump height. CONCLUSIONS: These results lend support to the effectiveness of ballistic resistance training for improving vertical jump performance in elite jump athletes. The purpose of this study was to compare the effects of combined strength and plyometric training with strength training alone on power-related measurements in professional soccer players. Subjects in the intervention team were randomly divided into 2 groups.
Group ST (n = 6) performed heavy strength training twice a week for 7 weeks in addition to 6 to 8 soccer sessions a week. Group ST+P (n = 8) performed a plyometric training program in addition to the same training as the ST group. The control group (n = 7) performed 6 to 8 soccer sessions a week. Pretests and posttests were 1 repetition maximum (1RM) half squat, countermovement jump (CMJ), squat jump (SJ), 4-bounce test (4BT), peak power in half squat with 20 kg, 35 kg, and 50 kg (PP20, PP35, and PP50, respectively), sprint acceleration, peak sprint velocity, and total time on 40-m sprint. There were no significant differences between the ST+P group and the ST group. Thus, the groups were pooled into 1 intervention group. The intervention group significantly improved in all measurements except CMJ, while the control group showed significant improvements only in PP20. There was a significant difference in relative improvement between the intervention group and control group in 1RM half squat, 4BT, and SJ. However, a significant difference between groups was not observed in PP20, PP35, sprint acceleration, peak sprinting velocity, and total time on 40-m sprint. The results suggest that there are no significant performance-enhancing effects of combining strength and plyometric training in professional soccer players concurrently performing 6 to 8 soccer sessions a week compared to strength training alone. However, heavy strength training leads to significant gains in strength- and power-related measurements in professional soccer players. Functional performance of lower limb muscles and contractile properties of chemically skinned single muscle fibers were evaluated before and after 8 wk of maximal-effort stretch-shortening cycle (SSC) exercise training. Muscle biopsies were obtained from the vastus lateralis of eight men before and after the training period.
Fibers were evaluated regarding their mechanical properties and subsequently classified according to their myosin heavy chain content (SDS-PAGE). After training, maximal leg extensor muscle force and vertical jump performance improved 12% (P < 0.01) and 13% (P < 0.001), respectively. Single-fiber cross-sectional area increased 23% in type I (P < 0.01), 22% in type IIa (P < 0.001), and 30% in type IIa/IIx fibers (P < 0.001). Peak force increased 19% in type I (P < 0.01), 15% in type IIa (P < 0.001), and 16% in type IIa/IIx fibers (P < 0.001). When peak force was normalized to cross-sectional area, no changes were found for any fiber type. Maximal shortening velocity increased 18, 29, and 22% in type I, IIa, and hybrid IIa/IIx fibers, respectively (P < 0.001). Peak power was enhanced in all fiber types, and normalized peak power improved 9% in type IIa fibers (P < 0.05). Fiber tension on passive stretch increased in IIa/IIx fibers only (P < 0.05). In conclusion, short-term SSC exercise training enhanced single-fiber contraction performance via force and contraction velocity in type I, IIa, and IIa/IIx fibers. These results suggest that SSC exercises are an effective training approach to improve fiber force, contraction velocity, and therefore power. The present study investigates the effects of power training on mechanical efficiency (ME) in jumping. Twenty-three subjects, including ten controls, volunteered for the study. The experimental group trained twice a week for 15 weeks, performing various jumping exercises such as drop jumps, hurdle jumps, hopping and bouncing. In the maximal jumping test, the take-off velocity increased from 2.56 (0.24) m·s−1 to 2.77 (0.18) m·s−1 (P < 0.05). In submaximal jumping at 50% of maximum, energy expenditure decreased from 660 (110) to 502 (68) J·kg−1·min−1 (P < 0.001) while, simultaneously, ME increased from 37.2 (8.4)% to 47.4 (8.2)% (P < 0.001).
Some muscle enzyme activities of the gastrocnemius muscle increased during the training period: citrate synthase from 35 (8) to 39 (7) μmol·g−1 dry mass·min−1 (P < 0.05) and β-hydroxyacyl CoA dehydrogenase from 21 (4) to 23 (5) μmol·g−1 dry mass·min−1 (P < 0.05), whereas no significant changes were observed in phosphofructokinase and lactate dehydrogenase. In the control group, no changes in ME or in enzyme activities were observed. In conclusion, the 8% enhancement of performance capability in maximal jumping as a result of power training was characterized by a 24% decrease in energy expenditure. Thus, the increased neuromuscular performance, joint control strategy, and intermuscular coordination (primary factors), together with improved aerobic capacity (secondary factor), may result in reduced oxygen demands and increased ME. Thomas, K, French, D, and Hayes, PR. The effect of two plyometric training techniques on muscular power and agility in youth soccer players. J Strength Cond Res 23(1): 332-335, 2009. The aim of this study was to compare the effects of two plyometric training techniques on power and agility in youth soccer players. Twelve males from a semiprofessional football club's academy (age = 17.3 ± 0.4 years, stature = 177.9 ± 5.1 cm, mass = 68.7 ± 5.6 kg) were randomly assigned to 6 weeks of depth jump (DJ) or countermovement jump (CMJ) training twice weekly. Participants in the DJ group performed drop jumps with instructions to minimize ground-contact time while maximizing height. Participants in the CMJ group performed jumps from a standing start position with instructions to gain maximum jump height. Post-training, both groups experienced improvements in vertical jump height (p < 0.05) and agility time (p < 0.05) and no change in sprint performance (p > 0.05). There were no differences between the treatment groups (p > 0.05).
The study concludes that both DJ and CMJ plyometrics are worthwhile training activities for improving power and agility in youth soccer players. This study was performed to determine which of three theoretically optimal resistance training modalities resulted in the greatest enhancement in the performance of a series of dynamic athletic activities. The three training modalities included 1) traditional weight training, 2) plyometric training, and 3) explosive weight training at the load that maximized mechanical power output. Sixty-four previously trained subjects were randomly allocated to four groups that included the above three training modalities and a control group. The experimental groups trained for 10 wk performing either heavy squat lifts, depth jumps, or weighted squat jumps. All subjects were tested prior to training, after 5 wk of training, and at the completion of the training period. The test items included 1) 30-m sprint, 2) vertical jumps performed with and without a countermovement, 3) maximal cycle test, 4) isokinetic leg extension test, and 5) a maximal isometric test. The experimental group which trained with the load that maximized mechanical power achieved the best overall results in enhancing dynamic athletic performance, recording statistically significant (P < 0.05) improvements on most test items and producing statistically superior results to the two other training modalities on the jumping and isokinetic tests. OBJECTIVE: To test the feasibility of creating a valid and reliable checklist with the following features: appropriate for assessing both randomised and non-randomised studies; provision of both an overall score for study quality and a profile of scores not only for the quality of reporting, internal validity (bias and confounding) and power, but also for external validity.
DESIGN: A pilot version was first developed, based on epidemiological principles, reviews, and existing checklists for randomised studies. Face and content validity were assessed by three experienced reviewers, and reliability was determined using two raters assessing 10 randomised and 10 non-randomised studies. Using different raters, the checklist was revised and tested for internal consistency (Kuder-Richardson 20), test-retest and inter-rater reliability (Spearman correlation coefficient and sign rank test; kappa statistics), criterion validity, and respondent burden. MAIN RESULTS: The performance of the checklist improved considerably after revision of the pilot version. The Quality Index had high internal consistency (KR-20: 0.89), as did the subscales apart from external validity (KR-20: 0.54). Test-retest (r 0.88) and inter-rater (r 0.75) reliability of the Quality Index were good. Reliability of the subscales varied from good (bias) to poor (external validity). The Quality Index correlated highly with an existing, established instrument for assessing randomised studies (r 0.90). There was little difference between its performance with non-randomised and with randomised studies. Raters took about 20 minutes to assess each paper (range 10 to 45 minutes). CONCLUSIONS: This study has shown that it is feasible to develop a checklist that can be used to assess the methodological quality not only of randomised controlled trials but also of non-randomised studies. It has also shown that it is possible to produce a checklist that provides a profile of the paper, alerting reviewers to its particular methodological strengths and weaknesses.
Further work is required to improve the checklist and the training of raters in the assessment of external validity. The purpose of this study was to evaluate the effects of sprint training on muscle function and dynamic athletic performance and to compare them with the training effects induced by standard plyometric training. Male physical education students were assigned randomly to 1 of 3 groups: sprint group (SG; n = 30), plyometric group (PG; n = 30), or control group (CG; n = 33). Maximal isometric squat strength, squat- and countermovement-jump (SJ and CMJ) height and power, drop jump performance from 30-cm height, and 3 athletic performance tests (standing long jump, 20-m sprint, and 20-yard shuttle run) were measured prior to and after 10 weeks of training. Both experimental groups trained 3 days a week; SG performed maximal sprints over distances of 10–50 m, whereas PG performed bounce-type hurdle jumps and drop jumps. Participants in the CG group maintained their daily physical activities for the duration of the study. Both SG and PG significantly improved drop jump performance (15.6 and 14.2%), SJ and CMJ height (∼10 and 6%), and standing long jump distance (3.2 and 2.8%), whereas the respective effect sizes (ES) were moderate to high and ranged between 0.4 and 1.1. In addition, SG also improved isometric squat strength (10%; ES = 0.4) and SJ and CMJ power (4%; ES = 0.4, and 7%; ES = 0.4), as well as sprint (3.1%; ES = 0.9) and agility (4.3%; ES = 1.1) performance. We conclude that short-term sprint training produces similar or even greater training effects in muscle function and athletic performance than does conventional plyometric training.
This study provides support for the use of sprint training as an applicable training method for improving the explosive performance of athletes in general. Among sport conditioning coaches, there is considerable discussion regarding the efficiency of training methods that improve lower-body power. Heavy resistance training combined with vertical jump (VJ) training is a well-established training method; however, there is a lack of information about its combination with Olympic weightlifting (WL) exercises. Therefore, the purpose of this study was to compare the short-term effects of heavy resistance training combined with either the VJ or WL program. Thirty-two young men were assigned to 3 groups: WL = 12, VJ = 12, and control = 8. These 32 men participated in an 8-week training study. The WL training program consisted of 3 × 6RM high pull, 4 × 4RM power clean, and 4 × 4RM clean and jerk. The VJ training program consisted of 6 × 4 double-leg hurdle hops, 4 × 4 alternated single-leg hurdle hops, 4 × 4 single-leg hurdle hops, and 4 × 4 40-cm drop jumps. Additionally, both groups performed 4 × 6RM half-squat exercises. Training volume was increased after 4 weeks. Pretesting and posttesting consisted of squat jump (SJ) and countermovement jump (CMJ) tests, 10- and 30-m sprint speeds, an agility test, a half-squat 1RM, and a clean-and-jerk 1RM (only for WL). The WL program significantly increased the 10-m sprint speed (p < 0.05). Both groups, WL and VJ, increased CMJ (p < 0.05), but groups using the WL program increased more than those using the VJ program. On the other hand, the group using the VJ program increased its 1RM half-squat strength more than the WL group (47.8 and 43.7%, respectively). Only the WL group improved in the SJ (9.5%). There were no significant changes in the control group.
In conclusion, Olympic WL exercises seemed to produce broader performance improvements than VJ exercises in physically active subjects. Markovic, G., D. Dizdar, I. Jukic, and M. Cardinale. Reliability and factorial validity of squat and countermovement jump tests. J Strength Cond Res 18(3): 551–555, 2004. The primary aim of this study was to determine the reliability and factorial validity of squat (SJ) and countermovement jump (CMJ) tests. The secondary aim was to compare 3 popular methods for the estimation of vertical jumping height. Physical education students (n = 93) performed 7 explosive power tests: 5 different vertical jumps (Sargent jump, Abalakow's jump with arm swing and without arm swing, SJ, and CMJ) and 2 horizontal jumps (standing long jump and standing triple jump). SJ and CMJ had the greatest reliability among all jumping tests (Cronbach's α = 0.97 and 0.98). The reliability α coefficients for the other jumps were also high and varied between 0.93 and 0.96. Within-subject variation (CV) in jumping tests ranged between 2.4 and 4.6%, the values being lowest in both horizontal jumps and the CMJ. Factor analysis resulted in the extraction of only 1 significant principal component, which explained 66.43% of the variance of all 7 jumping tests. Since all jumping tests had high correlation coefficients with the principal component (r = 0.76–0.87), it was interpreted as the explosive power factor. The CMJ test showed the highest relationship with the explosive power factor (r = 0.87), that is, the greatest factorial validity. The other jumping tests had lower but relatively homogeneous correlations with the explosive power factor extracted.
Based on the results of this study, it can be concluded that CMJ and SJ, measured by means of a contact mat and digital timer, are the most reliable and valid field tests for the estimation of explosive power of the lower limbs in physically active men. Twenty members of a National Collegiate Athletic Association Division III collegiate football team were assigned to either an Olympic lifting (OL) group or a power lifting (PL) group. Each group was matched by position and trained 4 days·wk−1 for 15 weeks. Testing consisted of field tests to evaluate strength (1RM squat and bench press), 40-yard sprint, agility, vertical jump height (VJ), and vertical jump power (VJP). No significant pre- to post-training differences were observed in 1RM bench press, 40-yard sprint, agility, VJ, or VJP in either group. Significant improvements were seen in 1RM squat in both the OL and PL groups. After log10 transformation, OL were observed to have a significantly greater improvement in ΔVJ than PL. Despite an 18% greater improvement in 1RM squat (p > 0.05), and a twofold greater improvement (p > 0.05) in 40-yard sprint time by OL, no further significant group differences were seen. Results suggest that OL can provide a significant advantage over PL in vertical jump performance changes. Arabatzi, F, Kellis, E, and Sáez-Sáez de Villarreal, E. Vertical jump biomechanics after plyometric, weight lifting, and combined (weight lifting + plyometric) training. J Strength Cond Res 24(9): 2440-2448, 2010. The purpose of this study was to compare the effects of an Olympic weight lifting (OL), a plyometric (PL), and a combined weight lifting + plyometric (WP) training program on vertical jump (VJ) biomechanics. Thirty-six men were assigned randomly to 4 groups: PL group (n = 9), OL group (n = 9), WP group (n = 10), and control (C) group (n = 8). The experimental groups trained 3 d·wk−1 for 8 weeks.
Sagittal kinematics, VJ height, power, and electromyographic (EMG) activity from the rectus femoris (RF) and medial gastrocnemius (GAS) were collected during squat jumping and countermovement jumping (CMJ) before and after training. The results showed that all experimental groups improved VJ height (p < 0.05). The OL training improved power and muscle activation during the concentric phase of the CMJ, while the subjects used a technique with wider hip and knee angles after training (p < 0.05). The PL group subjects did not change their CMJ technique, although there was an increase in RF activation and a decrease in GAS activity after training (p < 0.05). The WP group displayed a decline in maximal hip angle and lower activation during the CMJ after training (p < 0.05). These results indicate that all training programs are adequate for improving VJ performance. However, the mechanisms for these improvements differ between the 3 training protocols. Olympic weight lifting training might be more appropriate to achieve changes in VJ performance and power in the precompetition period of the training season. Emphasis on the PL exercises should be given when the competition period approaches, whereas the combination of OL and PL exercises may be used in the transition phases from precompetition to the competition period. In the literature, it is well established that subjects are able to jump higher in a countermovement jump (CMJ) than in a squat jump (SJ). The purpose of this study was to estimate the relative contribution of the time available for force development and the storage and reutilization of elastic energy to the enhancement of performance in CMJ compared with SJ. Six male volleyball players performed CMJ and SJ. Kinematics, kinetics, and muscle electrical activity (EMG) from six muscles of the lower extremity were monitored.
It was found that even when the body position at the start of push-off was the same in SJ as in CMJ, jump height was on average 3.4 cm greater in CMJ. The possibility that nonoptimal coordination in SJ explained the difference in jump height was ruled out: there were no signs of movement disintegration in SJ, and toe-off position was the same in SJ as in CMJ. The greater jump height in CMJ was attributed to the fact that the countermovement allowed the subjects to attain greater joint moments at the start of push-off. As a consequence, joint moments were greater over the first part of the range of joint extension in CMJ, so that more work could be produced than in SJ. To explain this finding, measured and manipulated kinematics and electromyographic activity were used as input for a model of the musculoskeletal system. According to the simulation results, storage and reutilization of elastic energy could be ruled out as an explanation for the enhancement of performance in CMJ over that in SJ. The crucial contribution of the countermovement seemed to be that it allowed the muscles to build up a high level of active state (fraction of attached cross-bridges) and force before the start of shortening, so that they were able to produce more work over the first part of their shortening distance. Carlock, J.M., S.L. Smith, M.J. Hartman, R.T. Morris, D.A. Ciroslan, K.C. Pierce, R.U. Newton, E.A. Harman, W.A. Sands, and M.H. Stone. The relationship between vertical jump power estimates and weightlifting ability: A field-test approach. J Strength Cond Res 18(3): 534–539, 2004. The purpose of this study was to assess the usefulness of the vertical jump and estimated vertical-jump power as a field test for weightlifting. Estimated peak power (PP) output from the vertical jump was correlated with lifting ability among 64 USA national-level weightlifters (junior and senior men and women).
Vertical jump was measured using the Kinematic Measurement System, consisting of a switch mat interfaced with a laptop computer. Vertical jumps were measured using a hands-on-hips method. A countermovement vertical jump (CMJ) and a static vertical jump (SJ, 90° knee angle) were measured. Two trials were given for each condition. Test-retest reliability for jump height was intra-class correlation (ICC) = 0.98 (CMJ) and ICC = 0.96 (SJ). Athletes warmed up on their own for 2–3 minutes, followed by 2 practice jumps at each condition. Peak power (PP) was estimated using the equations developed by Sayers et al. (24). The athletes' current lifting capabilities were assessed by a questionnaire, and USA national coaches checked the listed values. Differences between groups (i.e., men versus women, juniors versus resident lifters) were determined using t-tests (p ≤ 0.05). Correlations were determined using Pearson's r. Results indicate that vertical jumping PP is strongly associated with weightlifting ability. Thus, these results indicate that PP derived from the vertical jump (CMJ or SJ) can be a valuable tool in assessing weightlifting performance. Hawkins, SB, Doyle, TLA, and McGuigan, MR. The effect of different training programs on eccentric energy utilization in college-aged males. J Strength Cond Res 23(7): 1996-2002, 2009. The Eccentric Utilization Ratio (EUR), which is the ratio of countermovement jump (CMJ) to squat jump (SJ) performance measures, is a useful indicator of training status in elite athletes and of their utilization of the stretch-shortening cycle. This investigation sought to determine if EUR was sensitive to different types of resistance training in untrained college-aged males. Twenty-nine college-aged males completed 8 weeks of training and were randomly allocated to 1 of 3 training programs: weight training (n = 10), plyometrics (n = 10), or weightlifting (n = 9).
Testing occurred 3 times (pre, mid, post), with a CMJ and SJ conducted on a force plate integrated with a position transducer. Height, weight, and a 1RM (repetition maximum) squat were also measured. Weightlifting significantly (p < 0.05) helped subjects jump higher and produce more power than plyometrics, for height and power in both CMJ and SJ results. This investigation indicated EUR did not significantly change, suggesting that this type of performance indicator may not be useful in a recreationally active population. Alternatively, an 8-week training program might not be a long enough period to see changes in this group of participants. Results did indicate that high-velocity and high-force training programs, consisting of weightlifting and plyometrics, improved lower-body performance, especially in the areas of jump height and power. Arabatzi, F and Kellis, E. Olympic weightlifting training causes different knee muscle-coactivation adaptations compared with traditional weight training. J Strength Cond Res 26(8): 2192–2201, 2012. The purpose of this study was to compare the effects of an Olympic weightlifting (OL) and a traditional weight (TW) training program on muscle coactivation around the knee joint during vertical jump tests. Twenty-six men were assigned randomly to 3 groups: the OL (n = 9), the TW (n = 9), and Control (C) groups (n = 8). The experimental groups trained 3 d·wk−1 for 8 weeks. Electromyographic (EMG) activity from the rectus femoris and biceps femoris, sagittal kinematics, vertical stiffness, maximum height, and power were collected during the squat jump, countermovement jump (CMJ), and drop jump (DJ), before and after training. The knee muscle coactivation index (CI) was calculated for different phases of each jump by dividing the antagonist EMG activity by the agonist.
Analysis of variance showed that the CI recorded during the preactivation and eccentric phases of all the jumps increased in both training groups. The OL group showed greater stiffness and jump height adaptation than the TW group (p < 0.05). Further, the OL group showed a decrease or maintenance of the CI recorded during the propulsion phase of the CMJ and DJs, in contrast to the increase in the CI observed after TW training (p < 0.05). The results indicated that the altered muscle activation patterns about the knee, coupled with changes in leg stiffness, differ between the 2 programs. The OL program improves jump performance via a constant CI, whereas the TW training caused an increased CI, probably to enhance joint stability.
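Two of the derived measures used in the jump studies above are simple formulas: the Sayers et al. regression for estimating peak power from vertical jump (cited by the Carlock field-test study), and the Eccentric Utilization Ratio EUR = CMJ/SJ (the Hawkins study). A minimal sketch, assuming the commonly cited Sayers coefficients (the abstract references but does not restate the equation) and a hypothetical athlete:

```python
def sayers_peak_power(jump_height_cm: float, body_mass_kg: float) -> float:
    """Estimated peak anaerobic power (W) from vertical jump height and body
    mass; coefficients as commonly quoted from Sayers et al. (1999)."""
    return 60.7 * jump_height_cm + 45.3 * body_mass_kg - 2055.0

def eccentric_utilization_ratio(cmj: float, sj: float) -> float:
    """EUR = countermovement jump performance / squat jump performance;
    values > 1 indicate benefit from the stretch-shortening cycle."""
    return cmj / sj

# Hypothetical athlete: 45 cm CMJ, 40 cm SJ, 85 kg body mass
pp = sayers_peak_power(45.0, 85.0)
eur = eccentric_utilization_ratio(45.0, 40.0)
print(f"estimated peak power: {pp:.0f} W, EUR: {eur:.3f}")
```

Either jump (CMJ or SJ) can be fed to the power estimate, matching the field-test approach described above; the EUR simply compares the two.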
2,035
28,264,201
Based on the extracted data, it remains unclear which of the injection parameters is the most important determinant of adequate attenuation.
Background: Various injection parameters influence enhancement of the coronary arteries. There is no consensus in the literature regarding the optimal contrast media (CM) injection protocol. The aim of this study is to provide an update on the effect of different CM injection parameters on coronary attenuation in coronary computed tomographic angiography (CCTA).
OBJECTIVES: Contrast media (CM) injection protocols should be customized to the individual patient. The aim of this study was to determine whether software-tailored CM injections result in diagnostic enhancement of the coronary arteries in computed tomography angiography (CTA) and whether attenuation values were comparable between different weight categories. MATERIALS AND METHODS: 265 consecutive patients referred for routine coronary CTA were scanned on a 2nd-generation dual-source CT. Group 1 (n = 141) received an individual CM bolus based on weight categories (39-59 kg; 60-74 kg; 75-94 kg; 95-109 kg) and scan duration ("high-pitch": 1 s; "dual-step prospective triggering": 7 s), as determined by contrast injection software (Certegra™ P3T, Bayer, Berlin, Germany). Group 2 (n = 124) received a standard fixed CM bolus: Iopromide 300 mg I/ml; volume: 75 ml; flow rate: 7.2 ml/s. Contrast enhancement was measured in all proximal and distal coronary segments. Subjective and objective image quality were evaluated. Statistical analysis was performed using SPSS (IBM, version 20.0). RESULTS: For group 1, mean attenuation values of all segments were diagnostic (>325 HU), without statistically significant differences between weight categories (p > 0.17); proximal vs. distal: 449 ± 65 vs. 373 ± 58 HU (39-59 kg); 443 ± 69 vs. 367 ± 81 HU (60-74 kg); 427 ± 59 vs. 370 ± 61 HU (75-94 kg); 427 ± 73 vs. 347 ± 61 HU (95-109 kg). Mean CM volumes were: 55 ± 6 ml (39-59 kg); 61 ± 7 ml (60-74 kg); 71 ± 8 ml (75-94 kg); 84 ± 9 ml (95-109 kg). For group 2, mean attenuation values were not all diagnostic, with differences between weight categories (p < 0.01); proximal vs. distal: 611 ± 142 vs. 408 ± 69 HU (39-59 kg); 562 ± 135 vs. 389 ± 98 HU (60-74 kg); 481 ± 83 vs. 329 ± 81 HU (75-94 kg); 420 ± 73 vs. 305 ± 35 HU (95-109 kg).
Comparable image noise and image quality were found between groups (p ≥ 0.330). CONCLUSIONS: Individually tailored CM injection protocols yield diagnostic attenuation and a more homogeneous enhancement pattern between different weight groups. CM volumes could be reduced for the majority of patients utilizing individualized CM bolus application. Objective: To compare patient-weight-adjusted and fixed iodine-dose protocols at coronary computed tomography angiography (CTA) using a 64-detector scanner and computer-assisted bolus tracking. Materials and Methods: Approval from our institutional review board and prior informed consent from patients were obtained before entering 60 patients with suspected coronary disease in this study. The patients were randomly assigned to one of 2 protocols. In the fixed iodine-dose protocol, they received a fixed dose of 80 mL Iopamidol-370; the injection duration was 20 seconds. In the weight-adjusted iodine-dose protocol, the dose was tailored to the patient's body weight; this group received 1.0 mL/kg, and the injection duration was shorter, i.e., 15 seconds. Imaging was on a 64-detector CT scanner using a computer-assisted bolus-tracking technique. A radiologist blinded to the protocol used measured the Hounsfield density number of the large vessels and coronary arteries. CT attenuation in the aortic root was compared in patients whose weight was less than 58 kg (group 1) or 58 kg or more (group 2). The standard deviation (SD) of CT attenuation in the aortic root and the myocardium was compared to evaluate image noise. Using a 3-point scale, 2 radiologists independently evaluated beam-hardening artifacts and coronary enhancement. Statistical analysis was with the two-tailed Student t test and the Mann-Whitney U test. Results: There was no significant difference between the protocols with respect to CT attenuation of the ascending aorta and coronary arteries.
Under the fixed iodine-dose protocol, mean CT attenuation in the aortic root was 421.3 ± 51.5 Hounsfield units (HU) in the lighter and 397.2 ± 42.3 HU in the heavier weight group, respectively; the difference was statistically significant (P = 0.03). Under the weight-adjusted iodine-dose protocol, these values were 407.6 ± 85.1 and 409.2 ± 47.9 HU, respectively, and the difference was not statistically significant (P = 0.17). The SD of the ascending aorta and myocardium was significantly higher for the fixed than for the weight-adjusted iodine-dose protocol. The mean visual score for beam-hardening artifacts was significantly lower in the weight-adjusted than in the fixed iodine-dose protocol (P < 0.01); however, there was no significant difference in the enhancement of the coronary arteries (P = 0.82). Conclusion: At 64-detector CTA of the heart, the patient weight-tailored dose protocol with the 15-second injection duration yielded significantly better image quality than the fixed-dose, 20-second injection duration protocol. PURPOSE An extensive number of protocols have been suggested to allow for functional diagnostics; however, no data are available on the minimal amount of contrast medium needed to achieve reliable imaging properties. None of the plethora of existing studies reports a rationale for why its specific concentration was chosen. MATERIALS AND METHODS A total of 40 patients were included in this prospective, controlled study. They were divided into four equal groups receiving a different concentration (10%, 20%, 30% or 40%) of a second contrast medium bolus. Corresponding septal and right ventricular ROIs were compared. A visual score was established. Coronary attenuation was measured in the right and left coronary arteries. Streak artifacts in the right atrium/ventricle were assessed. RESULTS In the 10% contrast medium (CM) group, full septal delineation was reached in only 5/10 (50%) patients.
In all other groups, full septal visualization was obtained. No group showed a relevant difference in mean density, measured in HU, of the left ventricle or the coronary arteries. All study groups except group 1 (10% CM) showed streak artifacts in the right atrium. CONCLUSION The dual-flow protocol with a minimum concentration of 20% improves septal visualization as a basis for left ventricular functional assessment; however, it does not allow for reliable right ventricular or atrial visualization. There is no significant difference between the concentration protocols in terms of coronary attenuation. Purpose Pain sensation and extravasation are potential drawbacks of contrast media (CM) injection during computed tomographic angiography. The purpose was to evaluate safety and patient comfort of higher flow rates in different CM protocols during coronary computed tomographic angiography. Methods Two hundred consecutive patients of a double-blind randomized controlled trial (NCT02462044) were analyzed. Patients were randomized to receive 94 mL of prewarmed iopromide 240 mg I/mL at 8.3 mL/s (group I), 75 mL of 300 mg I/mL at 6.7 mL/s (group II), or 61 mL of 370 mg I/mL at 5.4 mL/s (group III), respectively. Iodine delivery rate (2.0 g I/s) and total iodine load (22.5 g I) were kept identical. Outcome was defined as intravascular enhancement, patient comfort during injection, and injection safety, expressed as the occurrence of extravasation. Patients completed a questionnaire on comfort, pain, and stress during CM injection. Comfort was graded using a 5-point scale, 1 representing "very bad" and 5 "very well." Pain was graded using a 10-point scale, 0 representing "no pain" and 10 "severe pain." Stress was graded using a 5-point scale, 1 representing "no stress" and 5 "unsustainable stress."
Results Mean enhancement levels within the coronary arteries were as follows: 437 ± 104 Hounsfield units (HU) (group I), 448 ± 111 HU (group II), and 447 ± 106 HU (group III), with P ≥ 0.18. Extravasation occurred in none of the patients. Median (interquartile range) for comfort, pain, and stress was, respectively, 4 (4-5), 0 (0-0), and 1 (1-2), with P ≥ 0.68. Conclusions High flow rates of prewarmed CM were safely injected without discomfort, pain, or stress. Therefore, the use of high flow rates should not be considered a drawback for CM administration in clinical practice. OBJECTIVE Because an increase in body mass index (weight in kilograms divided by the square of height in meters) confers higher image noise at coronary CT angiography, we evaluated a body mass index-adapted scanning protocol for low-dose 64-MDCT coronary angiography with prospective ECG triggering. SUBJECTS AND METHODS One hundred one consecutively registered patients underwent coronary CTA with prospective ECG triggering with a fixed contrast protocol (80 mL of iodixanol, 50-mL saline chaser, flow rate of 5 mL/s). Tube voltage (range, 100-120 kV) and current (range, 450-700 mA) were adapted to body mass index. Attenuation was measured, and contrast-to-noise ratio was calculated for the proximal right coronary artery and left main coronary artery. Image noise was determined for each patient as the SD of attenuation in the ascending aorta. RESULTS Body mass index ranged from 18.2 to 38.8, and mean effective radiation dose from 1.0 to 3.2 mSv. There was no correlation between body mass index and image noise (r = 0.11, p = 0.284), supporting the validity of the body mass index-adapted scanning protocol.
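The three injection arms of the trial above (NCT02462044) were matched on iodine delivery rate and total iodine load; that arithmetic can be checked directly. This is a sketch with variable names of my own choosing, not code from the study.

```python
# Verify that all three arms above deliver roughly 2.0 g I/s and 22.5 g I total:
#   iodine delivery rate (g I/s) = concentration (mg I/mL) * flow (mL/s) / 1000
#   total iodine load (g I)      = concentration (mg I/mL) * volume (mL) / 1000
arms = {
    "I":   {"volume_ml": 94, "conc_mgI_per_ml": 240, "flow_ml_per_s": 8.3},
    "II":  {"volume_ml": 75, "conc_mgI_per_ml": 300, "flow_ml_per_s": 6.7},
    "III": {"volume_ml": 61, "conc_mgI_per_ml": 370, "flow_ml_per_s": 5.4},
}
for name, a in arms.items():
    idr = a["conc_mgI_per_ml"] * a["flow_ml_per_s"] / 1000.0   # g I/s
    load = a["conc_mgI_per_ml"] * a["volume_ml"] / 1000.0      # g I
    print(f"group {name}: {idr:.2f} g I/s, {load:.1f} g I")
```

Each arm works out to within rounding of the stated 2.0 g I/s and 22.5 g I, which is why lower concentrations require proportionally higher flow rates and larger volumes.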
However, body mass index was inversely correlated with vessel attenuation (right coronary artery, r = -0.45, p < 0.001; left main coronary artery, r = -0.47, p < 0.001) and contrast-to-noise ratio (right coronary artery, r = -0.39, p < 0.001; left main coronary artery, r = -0.37, p < 0.001). CONCLUSION Use of the proposed body mass index-adapted scanning parameters results in similar image noise regardless of body mass index. Increased bolus dilution due to larger blood volume may account for the decrease in contrast-to-noise ratio and vessel attenuation in patients with higher body mass index, but the contrast bolus was not adapted to body mass index in this study. Objective To prospectively investigate the influence of contrast material concentration on enhancement in cardiac CT using a biphasic single-injection protocol. Methods Sixty-four-row multidetector cardiac CT angiography was performed in 159 patients randomised to a moderate or high contrast medium concentration. Contrast material injection included a first phase for enhancement of the coronary arteries and a second phase, at half the iodine flux, targeted at enhancement of the right ventricle. Contrast medium injection was followed by a saline flush. For both concentrations, injection duration (and thus total iodine dose) was adapted to the duration of the CT data acquisition, and iodine flux was adjusted to patient weight. Attenuation was measured at various levels in the heart and vessels, and the two concentrations were compared, overall and per weight group. Results Enhancement of the aorta and left ventricle was significantly greater with the moderate than with the high concentration contrast medium. This remained true for the two higher weight groups. No difference was found in the lowest weight group or in the right ventricle and pulmonary outflow tract.
Conclusion With a biphasic injection protocol, enhancement of the aorta and left ventricle was weaker with the higher concentration of contrast material. The next study aimed to individually optimize the contrast medium protocol for high-pitch prospectively ECG-triggered coronary CT angiography using body weight. Ninety patients undergoing high-pitch coronary CT angiography were randomly assigned to 3 contrast medium injection protocols with the bolus-tracking technique: group A, 0.7 mL CM per kg patient weight (mL/kg); group B, 0.6 mL/kg; group C, 0.5 mL/kg. Each group had 30 patients. The CT values of the superior vena cava (SVC), pulmonary artery (PA), ascending aorta (AA), left atrium (LA), left ventricle (LV), left main artery (LM) and proximal segment of the right coronary artery (RCA) were measured. The image quality of the coronary arteries was evaluated on a per-segment basis using a 4-point scale (1 = excellent, 4 = non-diagnostic). The CT value was not significantly different in the AA (p = 0.735), LM (p = 0.764), or proximal segment of the RCA (p = 0.991). The CT value was significantly different in the SVC, PA, LA and LV (all p < 0.05). The mean image quality score was 1.6 ± 0.1, 1.6 ± 0.1 and 1.6 ± 0.1 (p = 0.217). The volume of CM was 47 ± 8, 44 ± 8 and 36 ± 6 mL for the 3 groups (p < 0.001). The effective radiation dose was 0.88 ± 0.04, 0.87 ± 0.06, and 0.85 ± 0.07 mSv for the 3 groups. Contrast medium could be reduced to 0.5 mL/kg for high-pitch coronary CT angiography without compromising diagnostic image quality, which was associated with an approximately 50% reduction of total contrast volume compared with the standard contrast protocol using the test-bolus technique. Objectives: To compare a contrast agent with high iodine concentration with an iso-osmolar contrast agent for coronary dual-source computed tomography angiography (DS-CTA), and to assess whether the contrast agent characteristics may affect the diagnostic quality of coronary DS-CTA.
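The per-kilogram dosing rule compared above (0.7 vs. 0.6 vs. 0.5 mL/kg) amounts to a one-line calculation; here is a minimal sketch, where the function name and the bounds check are mine.

```python
# Hypothetical helper for the per-kilogram dosing arms compared above.
# The study found 0.5 mL/kg sufficient for high-pitch coronary CTA.
def cm_volume_ml(weight_kg: float, dose_ml_per_kg: float = 0.5) -> float:
    if not 0.5 <= dose_ml_per_kg <= 0.7:
        raise ValueError("study arms ranged from 0.5 to 0.7 mL/kg")
    return weight_kg * dose_ml_per_kg
```

For a 70 kg patient this gives 35 mL at 0.5 mL/kg versus 49 mL at 0.7 mL/kg, consistent with the group means of 36 ± 6 mL and 47 ± 8 mL reported above.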
Materials and Methods: Patients were randomized to receive either 80 mL of iodixanol-320 (Visipaque, GE Healthcare, Chalfont St. Giles, United Kingdom) or iomeprol-400 (Iomeron, Bracco Imaging SpA, Milan, Italy) at 5 mL/s. Mean, minimum, and maximum heart rate and its variation (max - min) were assessed during the calcium scoring scan and coronary DS-CTA. Three off-site readers independently evaluated the image sets in terms of technical adequacy, reasons for inadequacy, vessel visualization, diagnostic confidence (based on a 5-point scale), and arterial contrast opacification in Hounsfield units (HU). Results: Ninety-six patients were included in the final evaluation. No significant differences were observed for pre- and postdose heart rate values for iomeprol-400 compared with iodixanol-320, and changes in heart rate variation were also not significantly different (-2.3 ± 11.7 vs. -2.5 ± 7.3 bpm, P > 0.1). Contrast measurements in all analyzed vessels were significantly higher for iomeprol-400 (mean, 391.5-441.4 HU) than for iodixanol-320 (mean, 332.3-365.5 HU; all P ≤ 0.0038). There was no significant difference in qualitative visualization of the coronary arteries (mean scores, 4.3-4.5 for iomeprol, 4.1-4.3 for iodixanol, P = 0.15-0.28), or in diagnostic confidence scores. HU values were inversely correlated with the number of insufficiently opacified segments (all readers P ≤ 0.0006). Conclusions: The high iodine concentration contrast medium iomeprol-400 demonstrated a significant benefit for coronary arterial enhancement compared with the iso-osmolar contrast medium iodixanol-320 when administered at identical flow rates and volumes for coronary DS-CTA. In addition, higher enhancement levels were found to be associated with lower numbers of inadequately visualized segments.
Finally, observed mean heart rate changes after intravenous contrast injection were generally small during the examination and comparable for both agents. PURPOSE To investigate the effect of duration and rate of contrast material injection on aortic peak time and peak enhancement in an injection protocol in which the contrast material dose is adjusted according to patient weight. MATERIALS AND METHODS One hundred ninety-nine patients were randomly divided into three groups in which the fixed duration of contrast material injection was 25 seconds (group A) or 35 seconds (group B), or the fixed injection rate was 4.0 mL/sec (group C). Computed tomography (CT) at the L3 vertebral level was performed before and after contrast material injection. Aortic peak time, aortic peak enhancement, and the period during which aortic enhancement is 200 HU or greater (T200) were calculated. The Pearson product-moment correlation coefficient (r) was used to investigate relationships between patient weight and aortic peak time, aortic peak enhancement, and T200 in each group. RESULTS A significant correlation between aortic peak time and patient weight (r = 0.91, P < .001) was observed in group C. No significant correlations between patient weight and aortic peak time were observed in group A (r = 0.16, P = .21) or B (r = -0.05, P = .69). A significant inverse correlation between aortic peak enhancement and patient weight (r = -0.70, P < .001) was observed in group C. No significant correlations between patient weight and aortic peak enhancement were observed in group A (r = 0.09, P = .48) or B (r = 0.10, P = .41). A significant correlation between T200 and patient weight (r = 0.72, P < .001) was observed in group C. No significant correlations between patient weight and T200 were observed in group A (r = 0.12, P = .34) or B (r = 0.001, P = .99).
CONCLUSION Aortic peak time, aortic peak enhancement, and T200 were closely related to injection duration in the protocol with contrast material dose determined according to patient weight. Background Body mass index (BMI) has a positive linear influence on arterial attenuation at coronary CT angiography with an injection protocol whose dose is linearly tailored to body weight (BW). Excessive contrast material may inadvertently be given to heavier patients when the dose is determined by BW only. Purpose To investigate the effect of an injection protocol with the dose of contrast material (CM) tailored to BW and BMI on coronary arterial attenuation, contrast-to-noise ratio, and image noise at dual-source CT coronary angiography (DSCT-CA). Materials and Methods A total of 233 consecutive patients (mean age, 60.2 years) undergoing DSCT-CA were included. The image acquisition protocol was standardized (120 kV, 380 mAs, retrospective electrocardiograph-triggered DSCT-CA). CM dosage calculation was randomly categorized into groups: a BW group and a BW-BMI group. The CM flow rate in both groups was calculated as dosage divided by scan time plus 8 s. Correlations between BW, BMI, and attenuations of the ascending aorta (AA) above the coronary ostia, left main coronary artery (LM), proximal right coronary artery (RCA), left anterior descending artery (LAD), and left circumflex artery (LCX), the contrast-to-noise ratios of the LM (LMCNR) and RCA (RCACNR), and image noise were evaluated with simple linear regression for the two groups individually. Results In the BW group, attenuations of the AA and coronary arteries showed positive linear correlations with BW and BMI. In contrast, no relationships were found in the BW-BMI group. LMCNR and RCACNR were inversely determined by BW and BMI in both groups. Image noise increased with increasing BW and BMI in both groups. Conclusion BMI has a positive linear influence on arterial attenuation with fixed iodine per BW.
The injection protocol with CM dose tailored to BW and BMI is reasonable during DSCT-CA. Rationale and Objectives: This study was designed to determine the optimal contrast protocol for 4-detector-row computed tomography angiography of the heart. Methods: Sixty patients were randomly assigned to 1 of 4 groups with 300 and 400 mg/mL iodine concentrations and 2.5 and 3.5 mL/s flow rates. Contrast density was measured in the left ventricular cavity and coronary arteries. Results: Low iodine concentration injected at a slow flow rate (0.75 g iodine/s) resulted in acceptable contrast enhancement in only 53.8% of the patients. There was no significant difference between low contrast concentration injected at a high flow rate and high contrast concentration injected at a slow flow rate (≈1 g iodine/s). High contrast concentration administered at high flow rates (1.4 g iodine/s) may result in enhancement above 350 Hounsfield units (HU) and interfere with coronary calcifications. Conclusions: The injection of ≈1 g iodine/s resulted in optimal (250-300 HU) contrast enhancement for cardiac 4-detector-row computed tomography. The aim of this study was to determine whether individually tailored protocols for the injection of contrast medium (CM) result in higher and more homogeneous vascular attenuation throughout the coronary arteries at coronary CT angiography compared with conventional injection protocols using fixed injection parameters. Of 120 patients included in the study, 80 patients were randomized into two groups. Group 1 received 80 mL of CM at 6 mL/s. For group 2, injection parameters were individually adjusted to patient weight, the duration of CT data acquisition, and attenuation parameters following a test bolus. In the control group (group 3), the volume of CM was adjusted to the duration of CT data acquisition and injected at 5 mL/s.
Attenuation was measured in the proximal, middle, and distal right coronary artery (RCA), in the proximal and middle left anterior descending artery (LAD), and in cranial and caudal sections of both ventricles. Patient parameters, scan delay, and scan duration did not differ significantly between the groups. Mean CM volume was 82.5 mL (flow rate 5.1 mL/s) in group 2 and 73.5 mL in group 3. Attenuation in both the RCA and LAD was significantly higher for group 2 vs. group 3 (RCA: 414.9 (±49.9) to 396.1 (±52.1) HU vs. 366.0 (±64.3) to 341.6 (±72.5) HU; LAD: 398.9 (±48.6) to 364.6 (±44.6) HU vs. 356.3 (±69.5) to 323.0 (±67.2) HU). For group 1 vs. group 2, only attenuation in the distal RCA differed significantly: 396.1 (±52.1) vs. 370.7 (±70.5) HU. Individually tailored CM injection protocols yield higher attenuation, especially in the distal segments of the coronary vessels, compared with injection protocols using fixed injection parameters. A cubital intravenous iodine contrast agent enhancement is used to visualize the coronary arteries using EBT. The quality of coronary artery visualization, however, is limited by the nearly simultaneous approximation of CT values in the coronary arteries and myocardial tissue. The objective of the study was to evaluate whether, under real clinical circumstances, the lower iodine concentration and the dimeric characteristics of iodixanol may affect the kinetics of the applied contrast agent and the visualization of coronary arteries studied noninvasively by EBT. A double-blind, randomized, parallel study was performed in 111 cardiac patients, using iodixanol 270 mg I/mL or iohexol 300 mg I/mL.
The kinetics of contrast enhancement were studied in flow mode by measuring the following parameters: mean arrival time and mean time to reach peak CT values in the pulmonary trunk, transit time from the pulmonary trunk to the aorta, as well as mean and maximum CT values in the left ventricular chamber and in the myocardium with respect to body mass index. The mean difference of CT values between the left ventricular chamber and the myocardium was calculated. The length of the visualized coronary arteries was assessed, and the diagnostic quality of coronary artery visualization was scored on a visual analogue scale. Although iodixanol was used at a lower iodine concentration than iohexol, there was no statistically significant difference between the groups with respect to the diagnostic visualization and length assessment of the coronary arteries, or in the mean difference of CT values between the left ventricular chamber and the myocardium. This means that the advantageous dimeric characteristics of iodixanol may be used to reduce the amount of applied iodine in contrast agents without loss of diagnostic image quality and information. Objectives To assess the effect of lower volumes of contrast medium (CM) on image quality in high-pitch dual-source computed tomography coronary angiography (CTCA). Methods One hundred consecutive patients (body weight 65-85 kg, stable heart rate ≤65 bpm, cardiac index ≥2.5 L/min/m2) referred for CTCA were prospectively enrolled. Patients were randomly assigned to one of five groups with different CM volumes (G30, 30 mL; G40, 40 mL; G50, 50 mL; G60, 60 mL; G70, 70 mL; flow rate 5 mL/s each, iodine content 370 mg/mL). Attenuation within the proximal and distal coronary artery segments was analysed. Results Mean attenuation for men and women ranged from 345.0 and 399.1 HU in G30 to 478.2 and 571.8 HU in G70.
Mean attenuation values were higher in groups with higher CM volumes (P < 0.0001) and higher in women than in men (P < 0.0001). The proportions of segments with attenuation of at least 300 HU in G30, G40, G50, G60 and G70 were 89%, 95%, 98%, 98% and 99%. A CM volume of 30 mL in women and 40 mL in men proved sufficient to guarantee attenuation of at least 300 HU. Conclusions In selected patients, high-pitch dual-source CTCA can be performed with CM volumes of 40 mL in men or 30 mL in women. Key Points: High-pitch dual-source coronary angiography is feasible with low contrast media volumes. Traditional injection rules still apply: higher volumes result in higher enhancement. The patient's gender is a co-factor determining the level of contrast enhancement. Volumes can be reduced down to 30-40 mL in selected patients. AIMS To determine the feasibility of prospective electrocardiogram (ECG) gating to achieve low-dose computed tomography coronary angiography (CTCA). METHODS AND RESULTS Forty-one consecutive patients with suspected (n = 35) or known coronary artery disease (n = 6) underwent 64-slice CTCA using prospective ECG gating. Individual radiation dose exposure was estimated from the dose-length product. Two independent readers semi-quantitatively assessed the overall image quality on a five-point scale and measured vessel attenuation in each coronary segment. One patient was excluded for atrial fibrillation. Mean effective radiation dose was 2.1 ± 0.6 mSv (range, 1.1-3.0 mSv). Image quality was inversely related to heart rate (HR) (57.3 ± 6.2, range 39-66 bpm; r = 0.58, P < 0.001), vessel attenuation (346 ± 104, range 110-780 HU; r = 0.56, P < 0.001), and body mass index (26.1 ± 4.0, range 19.1-36.3 kg/m2; r = 0.45, P < 0.001), but not to HR variability (1.5 ± 1.0, range 0.2-5.1 bpm; r = 0.28, P = 0.069).
Non-diagnostic CTCA image quality was found in 5.0% of coronary segments. However, below a HR of 63 bpm (n = 28), as determined by receiver operating characteristic curve analysis, only 1.1% of coronary segments were non-diagnostic compared with 14.8% at HR > 63 bpm (P < 0.001). CONCLUSION This first experience documents the feasibility of prospective ECG gating for CTCA with diagnostic image quality at a low radiation dose (1.1-3.0 mSv), favouring HR < 63 bpm. OBJECTIVE The purpose of this study was to determine the optimal contrast injection protocol for clear delineation of the endocardial and epicardial contours and coronary vessels in anatomic and functional imaging with cardiac 16-MDCT. SUBJECTS AND METHODS Thirty-eight patients were allocated to three groups according to contrast injection protocol: a long-duration biphasic protocol in which diluted contrast material was used in the latter phase (protocol A, 13 patients); a uniphasic protocol with saline flush (protocol B, 12 patients); and a uniphasic protocol without a flush (protocol C, 13 patients). Six regions of interest were drawn within the left ventricle (LV), right ventricle (RV), and interventricular septum along the z-axis. Mean ventricular attenuation, mean difference between maximum and minimum ventricular attenuation, and ventricular-myocardial contrast-to-noise ratio (CNR) were calculated. Attenuation and visualization of the coronary vessels were also compared. RESULTS The difference between maximum and minimum RV attenuation was significantly smaller in group A (58.1 H) than in groups B (179.5 H) and C (157.0 H). RV-myocardial CNR was significantly higher in group A (9.0) than in group B (5.5). The mean LV attenuation, difference between maximum and minimum LV attenuation, and LV-myocardial CNR were not significantly different among the three groups.
In protocol A, both endocardial and epicardial contours were clearly delineated, and cardiac functional analysis was feasible in all cases. Average attenuation and visualization of the coronary vessels were not significantly different among groups. The diagnostic accuracies in detection of coronary stenosis were 92%, 93%, and 91%, respectively, for protocols A, B, and C. CONCLUSION The long-duration contrast injection protocol with diluted contrast material is optimal for assessing the coronary vessels and cardiac function. A further study investigated the effect on coronary arterial attenuation of contrast material flow rate adjusted to the patient's heart rate during dual-source CT coronary angiography (DSCT-CCTA). A total of 296 consecutive patients (mean age: 58.7 years) undergoing DSCT-CCTA without previous coronary stent placement, bypass surgery, or congenital or valvular heart disease were included. The image acquisition protocol was standardized (120 kV, 380 mAs) and retrospective electrocardiograph (ECG) gating was used. Patients were randomly assigned to one of three groups (flow rate: G1, dosage/16; G2, dosage/(scan time + 8); G3, fixed flow rate). The groups were compared with respect to the attenuations of the ascending aorta (AA) above the coronary ostia, the left main coronary artery (LM), the proximal right coronary artery (RCA), the left anterior descending artery (LAD), and the left circumflex artery (LCX), and the contrast-to-noise ratios of the LM (LMCNR) and the proximal RCA (RCACNR). Correlations between heart rate and attenuation of the coronary arteries were evaluated in the three groups with linear regression. There was no significant difference among the three groups in the mean attenuations of the AA (P = 0.141), LM (P = 0.068), RCA (P = 0.284), LMCNR (P = 0.598) and RCACNR (P = 0.546).
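The three flow-rate rules compared above (G1: dosage/16; G2: dosage/(scan time + 8); G3: fixed) can be sketched as simple functions. The function names and the 5 mL/s illustrative fixed rate are mine, not from the study.

```python
# Sketch of the three flow-rate rules compared above (names are hypothetical).
def flow_rate_g1(dosage_ml: float) -> float:
    # G1: fixed 16 s injection window regardless of scan time
    return dosage_ml / 16.0

def flow_rate_g2(dosage_ml: float, scan_time_s: float) -> float:
    # G2: injection duration adapted to scan time plus an 8 s margin
    return dosage_ml / (scan_time_s + 8.0)

def flow_rate_g3(fixed_ml_per_s: float = 5.0) -> float:
    # G3: fixed flow rate (5 mL/s chosen here purely for illustration)
    return fixed_ml_per_s
```

With an 80 mL dose and an 8 s scan, G1 and G2 coincide at 5.0 mL/s; for shorter scans, G2 injects faster than G1, which is how it compensates for the narrower acquisition window.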
The attenuations of the LAD and LCX in group 1 were slightly higher than those in groups 2 and 3 (P < 0.05). In group 1, the attenuations of the AA (P < 0.01), LM (P < 0.01), RCA (P < 0.01), LAD (P = 0.02) and LCX (P < 0.01) decreased with increasing heart rate. A similar finding was detected in group 3 (AA: P < 0.01, LM: P < 0.01, RCA: P < 0.01, LAD: P < 0.01 and LCX: P < 0.01). In contrast, the attenuations of the AA (P = 0.55), LM (P = 0.27), RCA (P = 0.77), LAD (P = 0.22) and LCX (P = 0.74) had no significant correlation with heart rate in group 2. In all three groups, LMCNR (P = 0.77, 0.69 and 0.73, respectively) and RCACNR (P = 0.75, 0.39 and 0.61, respectively) had no significant correlation with heart rate. A contrast material flow rate adjusted to heart rate can diminish the influence of heart rate on attenuation of the coronary arteries in DSCT-CCTA. The institutional review board approved this study, and all patients gave written informed consent. One hundred twenty-five patients scheduled to undergo retrospectively electrocardiographically gated 16-detector-row computed tomographic coronary angiography were prospectively randomized into the following five groups with respect to the intravenous administration of a 140-mL bolus of contrast material at 4 mL/sec: group 1 (iohexol, 300 mg of iodine per milliliter); group 2 (iodixanol, 320 mg I/mL); group 3 (iohexol, 350 mg I/mL); group 4 (iomeprol, 350 mg I/mL); and group 5 (iomeprol, 400 mg I/mL). Attenuation was measured in the descending aorta and coronary arteries. One-way analysis of variance was used to compare groups. Mean attenuation values in the descending aorta were significantly (P < .05) lower in group 1 and higher in group 5 compared with the mean values in the other three groups. The same pattern was observed in the coronary arteries.
Contrast materials with higher iodine concentrations yield significantly higher attenuation in the descending aorta and coronary arteries. The next study analyzed the invasiveness and image quality of coronary CT angiography (CCTA) at 80 kV. We enrolled 181 patients with low body weight and low calcium levels. Of these, 154 patients were randomly assigned to 1 of 3 groups: 280 HU/80 kV (n = 51); 350 HU/80 kV (n = 51); or 350 HU/120 kV (n = 52). The amount of contrast media (CM) was decided with a CT number-controlling system. Twenty-seven patients were excluded because of an invalid time-density curve from the timing bolus. The predicted amount of CM, volume CT dose index, dose-length product, effective dose, image noise, and 5-point image quality were measured. The amounts of CM for the 80 kV/280 HU, 80 kV/350 HU, and 120 kV/350 HU groups were 10 ± 4 mL, 15 ± 7 mL, and 30 ± 6 mL, respectively. Although image noise was greater at 80 than at 120 kV, there was no significant difference in image quality between 80 kV/350 HU and 120 kV/350 HU (p = 0.390). There was no significant difference in image quality between 80 kV/280 HU and 80 kV/350 HU (4.4 ± 0.7 vs. 4.7 ± 0.4, p = 0.056). The amount of CM and the effective dose were lower for 80 kV CCTA than for 120 kV CCTA. CCTA at 80 kV/280 HU may decrease the amount of CM and the radiation dose necessary while maintaining image quality. OBJECTIVES This study sought to determine the diagnostic accuracy of 64-slice computed tomographic coronary angiography (CTCA) to detect or rule out significant coronary artery disease (CAD). BACKGROUND CTCA is emerging as a noninvasive technique to detect coronary atherosclerosis.
METHODS We conducted a prospective, multicenter, multivendor study involving 360 symptomatic patients with acute and stable anginal syndromes who were between 50 and 70 years of age and were referred for diagnostic conventional coronary angiography (CCA) from September 2004 through June 2006. All patients underwent a nonenhanced calcium scan and CTCA, which was compared with CCA. No patients or segments were excluded because of impaired image quality attributable to either coronary motion or calcifications. Patient-, vessel-, and segment-based sensitivities and specificities were calculated to detect or rule out significant CAD, defined as ≥50% lumen diameter reduction. RESULTS The prevalence among patients of having at least 1 significant stenosis was 68%. In a patient-based analysis, the sensitivity for detecting patients with significant CAD was 99% (95% confidence interval [CI]: 98% to 100%), specificity was 64% (95% CI: 55% to 73%), positive predictive value was 86% (95% CI: 82% to 90%), and negative predictive value was 97% (95% CI: 94% to 100%). In a segment-based analysis, the sensitivity was 88% (95% CI: 85% to 91%), specificity was 90% (95% CI: 89% to 92%), positive predictive value was 47% (95% CI: 44% to 51%), and negative predictive value was 99% (95% CI: 98% to 99%). CONCLUSIONS Among patients in whom a decision had already been made to obtain CCA, 64-slice CTCA was reliable for ruling out significant CAD in patients with stable and unstable anginal syndromes. A positive 64-slice CTCA scan often overestimates the severity of atherosclerotic obstructions and requires further testing to guide patient management. A method for constructing a personalized contrast medium protocol for contrast-enhanced coronary CT angiography (CCTA) is presented.
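The patient-based predictive values reported above follow from sensitivity, specificity, and prevalence via Bayes' rule; a quick cross-check (a sketch of my own, not code from the study):

```python
# Reproduce the patient-based PPV/NPV from the reported sensitivity (99%),
# specificity (64%), and disease prevalence (68%) using Bayes' rule.
def ppv_npv(sens: float, spec: float, prev: float) -> tuple[float, float]:
    tp = sens * prev            # true-positive fraction of the cohort
    fp = (1 - spec) * (1 - prev)  # false-positive fraction
    tn = spec * (1 - prev)      # true-negative fraction
    fn = (1 - sens) * prev      # false-negative fraction
    return tp / (tp + fp), tn / (tn + fn)

ppv, npv = ppv_npv(0.99, 0.64, 0.68)
```

This gives a PPV of roughly 0.85 and an NPV of roughly 0.97, matching the reported 86% and 97% within rounding of the published summary statistics.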
A one-compartment pharmacokinetic model is parameterized and identified with a minimal data set from a test-bolus injection. A direct-search optimization is performed to construct a protocol that achieves target enhancement in the cardiac structures. Clinical results demonstrating the method's ability to achieve prospectively chosen image enhancement levels while reducing contrast medium dose are presented. Objective To prospectively compare iopamidol 370, which is a low-osmolar contrast medium, and iodixanol 320, which is an iso-osmolar contrast medium, in terms of image quality and nonserious adverse effects that have the potential to influence the image quality in 16-slice multi-detector row computed tomography coronary angiography. Methods Sixty patients were divided into two groups to receive iodixanol 320 or iopamidol 370. Image quality was assessed using a five-point grading scale. Differences in the mean attenuation (Hounsfield units) at the origin of the coronary arteries and on the ascending aorta in the two groups were compared. The number and intensity of adverse effects were compared between the two groups. Results The mean attenuation values of the ascending aorta and the origins of the coronary arteries for the two groups showed no significant difference (P ≥ 0.41). There was no significant difference in terms of image quality between the two groups on all evaluated segments. There was a statistically significant difference in the number of adverse effects (P = 0.001) between the two groups. However, in both the iodixanol group and the iopamidol group, there was no significant difference in terms of image quality between the patients with and without adverse effects. Conclusion The frequency of adverse effects is lower in the iodixanol group than the iopamidol group.
Iodixanol 320 can provide both vascular enhancement and image quality similar to iopamidol 370 in 16-slice multi-detector row computed tomography coronary angiography. There was no significant difference in terms of overall image quality between the patients with and without adverse effects in either of the groups. Coronary CT angiography allows high-quality imaging of the coronary arteries when state-of-the-art CT systems are used. However, radiation exposure has been a concern. We describe a new scan mode that uses a very high-pitch spiral acquisition, "Flash Spiral," which has been developed specifically for low-dose imaging with dual-source CT. The scan mode uses a pitch of 3.2 to acquire a spiral CT data set while covering the entire volume of the heart in one cardiac cycle. Data acquisition is prospectively triggered by the electrocardiogram and starts in late systole to be completed within one cardiac cycle. Images are reconstructed with a temporal resolution that corresponds to one-quarter of the gantry rotation time. Throughout the data set, subsequent images are reconstructed at later time instants in the cardiac cycle. In a patient with a heart rate of 49 beats/min, the Flash Spiral scan mode was used with a first-generation dual-source CT system and allowed artifact-free visualization of the coronary arteries with a radiation exposure of 1.7 mSv for a 12-cm scan range at 120 kVp tube voltage. PURPOSE To investigate the impact of a saline bolus chaser in coronary angiography with a 16-row multislice CT (16-MSCT) scanner. MATERIALS AND METHODS 32 patients were divided into two groups for contrast material (CM) administration: group 1 (140 mL at 4 mL/s) and group 2 (100 mL at 4 mL/s followed by 40 mL of saline chaser at 4 mL/s). All patients underwent ECG-gated coronary angiography with a 16-MSCT scanner. The attenuation at the origin of the coronary vessels was assessed.
Three regions of interest were drawn throughout the data set: ascending aorta (ROI1); descending aorta (ROI2); pulmonary artery (ROI3). The attenuation in the superior vena cava was recorded (ROI4). The average attenuation and the slope were calculated in each ROI. RESULTS The average attenuation in the coronary vessels was not significantly different in the two groups. The attenuation value in the superior vena cava was significantly lower in group 2. CONCLUSIONS 16-row MSCT coronary angiography with a bolus chaser allows optimal vessel enhancement with a lower dose of CM and lower enhancement of the superior vena cava and the right heart. PURPOSE To prospectively investigate the effect of the injection rate of a saline solution as a bolus chaser for the enhancement of the aorta and coronary arteries at multidetector computed tomographic (CT) coronary angiography. MATERIALS AND METHODS The institutional review board approved this study, and all patients gave informed consent. One hundred consecutive patients (59 men, 41 women; mean age, 58 years ± 11 [standard deviation]) underwent 64-section CT coronary angiography for coronary artery disease. They were divided into five groups (each group, n = 20) according to the injection rate of saline solution (3, 4, 5, 6, and 7 mL/sec). Iodinated contrast medium (60 mL) was injected intravenously at a rate of 4 mL/sec, followed by 60 mL of saline solution administered intravenously at a rate of 3-7 mL/sec, depending on the group. Attenuation values of the aortic root, right coronary artery, left anterior descending artery, and left circumflex artery were measured. Analysis of variance with the Scheffé method was used to evaluate the statistical significance of the differences in attenuation according to the injection rate of saline solution.
RESULTS The degree of contrast enhancement was affected by the injection rate of saline solution, and the attenuation values were higher as the injection rate increased up to 4-5 mL/sec (P < .05). The values plateaued at rates over 5 mL/sec in the aorta and over 4 mL/sec in the coronary arteries. CONCLUSION An injection rate of 4-5 mL/sec for the saline solution chaser is optimal for achieving maximum attenuation values of the aorta or coronary arteries by using 64-section CT with 60 mL of contrast material. OBJECTIVE The objective of our study was to evaluate whether iodixanol 320 mg I/mL (iodixanol 320), with the highest iodine concentration of dimeric nonionic contrast agents on the market, results in decreased vascular or myocardial enhancement compared with iohexol 350 mg I/mL (iohexol 350). SUBJECTS AND METHODS During a 4-month period, 72 patients referred for cardiac MDCT were consecutively enrolled and randomized into two groups: iohexol 350 and iodixanol 320. The injection and scanning protocols were the same for both groups. Enhancement of the right heart, left heart, coronary arteries, and left ventricular (LV) myocardium in both the arterial and delayed phases was compared using the two-tailed independent Student's t test. RESULTS Enhancement in the right heart, left heart, coronary arteries, and LV myocardium in the arterial phase showed no statistical difference (p > 0.05) between the two groups, although the iohexol group showed slightly higher enhancement (average, 11.2 HU) in all of the areas. Surprisingly, in the delayed phase, the iodixanol group displayed significantly higher (7.7 HU) persistent enhancement (p < 0.05) in the LV myocardium. CONCLUSION Iodixanol 320 can provide vascular enhancement in cardiac MDCT that is similar to iohexol 350.
In the delayed phase, iodixanol 320 shows significantly higher delayed enhancement (7.7 HU) in the LV myocardium than iohexol 350. Background Most current coronary CT angiography protocols are not adapted to body weight (BW) or cardiac output, and no literature on the influence of gender on coronary attenuation has been reported with administration of a fixed iodine load per BW. Purpose To determine the influence of body mass index (BMI) and gender on coronary arterial attenuation if contrast material dose is linearly adjusted to a patient's BW at dual-source CT coronary angiography (DSCT-CA). Material and Methods A total of 207 consecutive patients (mean age 60.6 years) undergoing DSCT-CA were included. Contrast material (370 mg I/mL) dose calculation was randomly categorized into two groups (group 1: 1.10 mL/kg for men and women; group 2: men 1.10 mL/kg, women 0.99 mL/kg), and flow rate was calculated as dose divided by scan time plus 8 s. Mean arterial attenuations between men and women were compared with respect to attenuations of the ascending aorta (AA) above the coronary ostia, left main coronary artery (LM), and proximal segments of the right coronary artery (RCA), left anterior descending (LAD), and left circumflex artery (LCX) in the two groups, respectively. Attenuations of coronary arteries were correlated with BW and BMI by simple linear regression. Results The mean attenuations of AA, LM, RCA, LAD, and LCX were 407.8 ± 53.6 HU, 412.6 ± 55.4 HU, 411.4 ± 64.3 HU, 399.1 ± 56.7 HU, and 399.1 ± 60.2 HU, respectively, and there were no significant differences between men and women in group 1 (AA, P = 0.571; LM, P = 0.670; RCA, P = 0.737; LAD, P = 0.439; and LCX, P = 0.888). In group 2, the mean attenuations of AA, LM, RCA, LAD, and LCX in men were significantly higher than those in women (AA, P = 0.008; LM, P = 0.025; RCA, P = 0.017; LAD, P = 0.015; and LCX, P = 0.002).
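The weight-based dosing rule described in the DSCT-CA study above (contrast volume linearly scaled to body weight, flow rate equal to dose divided by scan time plus 8 s) reduces to two lines of arithmetic. A minimal sketch, assuming only the stated rules; the function name and example values are illustrative, not from the study:

```python
def contrast_protocol(weight_kg: float, dose_per_kg: float,
                      scan_time_s: float) -> tuple[float, float]:
    """Return (total contrast volume in mL, injection flow rate in mL/s).

    Follows the dosing rule described above: total dose is linearly
    adjusted to body weight, and flow rate is the dose divided by
    (scan time + 8 s).
    """
    dose_ml = weight_kg * dose_per_kg
    flow_ml_per_s = dose_ml / (scan_time_s + 8.0)
    return dose_ml, flow_ml_per_s

# Example: a 70 kg man in group 1 (1.10 mL/kg) with a 10 s scan
dose, flow = contrast_protocol(70, 1.10, 10)
# dose = 77.0 mL, flow ≈ 4.3 mL/s
```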
Positive linear regression between BW and attenuations of AA (R² = 0.047, P = 0.02), LM (R² = 0.036, P = 0.04), RCA (R² = 0.080, P < 0.01), LAD (R² = 0.078, P < 0.01), and LCX (R² = 0.033, P = 0.05) was found in group 1, suggesting that attenuations of coronary arteries increased in heavier patients. Similarly, there was positive linear regression between BMI and attenuations of AA (R² = 0.117, P < 0.01), LM (R² = 0.090, P < 0.01), RCA (R² = 0.138, P < 0.01), LAD (R² = 0.111, P < 0.01), and LCX (R² = 0.078, P < 0.01). Conclusion Men and women have similar coronary attenuations with a fixed iodine load per BW. BMI has a positive linear influence on arterial attenuation at DSCT-CA with an injection protocol whose dose is linearly tailored to BW. Excessive contrast material may inadvertently be given to heavier patients when the dose is determined by BW only. Contrast material dose may need to be tailored individually by BW and BMI. The aim of this study was to investigate the usefulness of a saline chaser in 16-row multislice CT (16-MSCT) coronary angiography. Forty-two patients were divided into two groups for contrast material (CM) administration: group 1 (140 mL at 4 mL/s) and group 2 (100 mL at 4 mL/s followed by 40 mL of saline chaser at 4 mL/s). All patients underwent retrospectively ECG-gated 16-MSCT coronary angiography. The attenuation at the origin of the coronary vessels was assessed. Three regions of interest (ROIs) were drawn throughout the data set: (a) ascending aorta (ROI 1); (b) descending aorta (ROI 2); and (c) pulmonary artery (ROI 3). The attenuation in the superior vena cava was recorded (ROI 4). The average attenuation and the slope were calculated in each ROI, and differences were assessed with a Student's t test. The average attenuation in the coronary vessels was not significantly different in the two groups.
The average attenuations in ROI 1 were 325 and 327 HU, in ROI 2 were 328 and 329 HU, and in ROI 3 were 357 and 320 HU, for groups 1 and 2, respectively (p > 0.05). The slopes in ROI 1 were −0.2 and 1.1, in ROI 2 were 2.8 and 2.1 (p > 0.05), and in ROI 3 were 3.9 and −9.0 (p < 0.05), for groups 1 and 2, respectively. The average attenuations in ROI 4 were 927 and 643 HU (p < 0.05), for groups 1 and 2, respectively. One hundred milliliters of CM with 40 mL of saline chaser provides the same attenuation as 140 mL of CM (35% less) with decreased hyperattenuation in the superior vena cava. OBJECTIVE The purpose of this article is to assess whether iopamidol-370 provides superior vascular contrast of the coronaries and depiction of anatomic detail without affecting heart rate and beat-to-beat variability during coronary dual-source MDCT compared with iodixanol-320. SUBJECTS AND METHODS In this prospective trial, coronary CT angiography was performed on 60 adult patients using either iopamidol-370 or iodixanol-320. Cohorts were matched by age, habitus, sex, and baseline heart rate, with cohort sizes determined by power analysis. All studies were performed on a dual-source MDCT scanner with retrospective ECG-gating utilizing automatic pitch adjustment. Data assessment focused on heart rate variability during contrast administration, statistically evaluated with Student's t test comparisons within and between cohorts; coronary contrast-to-noise ratio analysis of the main coronary arteries, using Student's t test comparisons between cohorts; and coronary branch depiction and distribution analysis in dual-reader consensus decisions between cohorts. RESULTS Thirty patients matched for age, habitus, sex, and heart rate were evaluated in each cohort. ECG analyses found a statistically significant (p = 0.013) decrease in heart rate during administration of iodixanol-320.
Beat-to-beat variations, expressed as coefficient of variation, within and among cohorts were low (coefficient of variation, < 0.05). Contrast-to-noise ratio was significantly increased for iopamidol-370 versus iodixanol-320 (aortic root, p = 0.021; left main, p = 0.032; left anterior descending, p = 0.033; left circumflex, p = 0.039; and right, p = 0.009). Analysis of coronary branch visualization revealed improved depiction for iopamidol-370 compared with iodixanol-320. CONCLUSION Iopamidol-370, with its higher iodine concentration, provided greater vascular contrast of the arterial coronary tree and improved depiction of anatomic detail without significantly impacting cardiac heart rate during coronary MDCT imaging, as compared with iodixanol-320. Purpose We compared the fixed injection rate protocol (P2) with the fixed injection duration protocol (P1) for coronary CT angiography using the test bolus technique. Materials and methods We randomly assigned 100 patients to one of two protocols. In P1, they received 0.7 mL/kg Iohexol-350 over an injection duration of 9 s, and we selected a delay of 3 s after peak enhancement of the test bolus scan. In P2, they received 0.7 mL/kg Iohexol-350 at an injection rate of 5 mL/s, and we selected a delay after peak enhancement of the test bolus scan using the following formula: TID/2 − 2 s, where TID is the injection duration of the full bolus. We compared attenuation values in the ascending aorta and coronary arteries and patient-to-patient enhancement variability at each segment. Results At all segments, CT attenuations of P2 were significantly greater than those of P1 (ascending aorta 400 ± 64 vs. 368 ± 60, P = 0.01; left main trunk 399 ± 67 vs. 369 ± 55, P = 0.02; proximal RCA 393 ± 66 vs. 363 ± 56, P = 0.01). There was no significant difference in patient-to-patient enhancement variability at all segments between the two groups (P > 0.05).
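The two timing rules compared in the P1/P2 study above differ only in how the scan delay is derived from the test-bolus peak. A hedged sketch of that arithmetic; the abstract specifies only the formulas, so the function names and example values are illustrative:

```python
def scan_delay_p1(test_peak_s: float) -> float:
    """P1 (fixed injection duration): scan delay is the test-bolus
    peak time plus 3 s."""
    return test_peak_s + 3.0

def scan_delay_p2(test_peak_s: float, weight_kg: float,
                  dose_per_kg: float = 0.7, rate_ml_s: float = 5.0) -> float:
    """P2 (fixed injection rate): delay after the test-bolus peak is
    TID/2 - 2 s, where TID is the full-bolus injection duration,
    i.e. total volume (0.7 mL/kg) divided by the 5 mL/s rate."""
    tid = weight_kg * dose_per_kg / rate_ml_s
    return test_peak_s + tid / 2.0 - 2.0

# Example (illustrative numbers): 60 kg patient, test-bolus peak at 18 s
# P2: TID = 60 * 0.7 / 5 = 8.4 s, so delay = 18 + 4.2 - 2 = 20.2 s
```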
Conclusion P2 yielded superior vessel enhancement and comparable patient-to-patient enhancement variability compared with P1 in thin patients. We developed a new individually customized contrast-injection protocol for coronary computed tomography (CT) angiography based on the time-attenuation response in a test bolus, and investigated its clinical applicability. We scanned 60 patients with suspected coronary diseases using a 64-detector CT scanner, who were randomly assigned to one of two protocols. In protocol 1 (P1), we estimated the contrast dose to yield a peak aortic attenuation of 400 HU based on the time-attenuation response to a small test-bolus injection (0.3 mL/kg body weight) delivered over 9 s. Then we administered a customized contrast dose over 9 s. In protocol 2 (P2), the dose was tailored to the patient's body weight; this group received 0.7 mL/kg body weight with an injection duration of 9 s. We compared the two protocols for dose of contrast medium, peak attenuation, variations in attenuation values of the ascending aorta, and the success rate of adequate attenuation (250–350 HU) of the coronary arteries. The contrast dose was significantly smaller in P1 than in P2 (36.9 ± 9.2 vs 43.1 ± 7.0 mL, P < 0.01). Peak aortic attenuation was significantly less under P1 than under P2 (384.1 ± 25.0 vs 413.5 ± 45.7, P < 0.01). The mean variation (standard deviation) of the attenuation values was smaller in P1 than in P2 (25.0 vs 45.7, P < 0.01). The success rate of adequate attenuation of the coronary arteries was significantly higher with P1 than with P2 (85.0 vs 65.8%, P < 0.01).
P1 facilitated a reduction in the contrast dose, reduced the individual variations in peak aortic attenuation, and achieved optimal coronary CT attenuation (250–350 HU) more frequently than P2. Purpose We evaluated low-contrast injection protocols for coronary computed tomography angiography (CTA) using a 64-detector scanner and the test bolus technique. Materials and methods We randomly assigned 60 patients undergoing coronary CTA to one of two contrast material (CM) injection protocols. For the low-contrast-dose protocol (Plow), the patients received injections of iohexol-350 [0.7 mL/kg body weight (BW)] during 9 s, and the test-bolus technique was used. Under the conventional protocol (Pconv), they received iohexol-350 (1.0 mL/kg BW) during 15 s, and bolus tracking was used. We compared the protocols for attenuation values in the ascending aorta and coronary arteries and for the amount of CM required. Results There was no significant difference in the mean CT attenuation of the ascending aorta and coronary arteries between the Plow and Pconv groups. The amount of CM was significantly less with Plow than with Pconv [49.7 ± 6.4 mL (main bolus: 39.7 ± 6.4 mL) vs. 57.0 ± 10.1 mL, P < 0.01]. Conclusion With 64-detector CTA of the heart, the low-dose and short-injection-duration protocol with the test-injection technique provides vessel attenuation comparable to that obtained with the standard-dose protocol with the bolus-tracking technique. OBJECTIVE The objective of our study was to evaluate the difference in coronary enhancement provided by 60 versus 80 mL of contrast medium (370 mg I/mL) for prospectively ECG-gated single-heartbeat axial 320-MDCT. MATERIALS AND METHODS We retrospectively evaluated 108 consecutive 320-MDCT angiography studies. Group 1 (n = 36) received 60 mL of an iodinated contrast medium and group 2 (n = 72), 80 mL.
All patients were imaged with a standardized protocol: iopamidol 370 followed by 40 mL of saline, both administered at a rate of 6 mL/s. Two imagers subjectively assessed image quality throughout the coronary arteries. Region-of-interest attenuation (HU) measurements were performed in the aorta plus the proximal and distal coronary arteries. RESULTS Subjective analysis of all coronary segments showed slightly better image quality for group 2. Patients in group 1 had significantly (p < 0.05) lower mean attenuation values for the individual coronary vessels. Nevertheless, 96.7% of all coronary segments in the group 1 patients had an attenuation of greater than 300 HU; when analysis was limited to group 1 patients with a body mass index of greater than 30, 92.8% of the segments were more than 300 HU, and all segments measured more than 250 HU. CONCLUSION An injection protocol based on 60 mL of iopamidol (370 mg I/mL) for prospectively ECG-gated wide-area detector single-heartbeat coronary CT angiography (CTA) has less coronary enhancement than a protocol based on 80 mL. However, using 60 mL, more than 96% of coronary segments had sufficient enhancement (i.e., > 300 HU), supporting the general use of 60-mL protocols for clinical wide-area detector coronary CTA. Objectives: To prospectively compare subjective and objective measures of image quality using 4 different contrast material injection protocols in dual-energy computed tomography pulmonary angiography (CTPA) studies of patients with suspected pulmonary embolism.
Materials and Methods: A total of 100 consecutive patients referred for CTPA for the exclusion of pulmonary embolism were randomized into 1 of 4 contrast material injection protocols manipulating iodine concentration and iodine delivery rate (IDR, expressed as grams of iodine per second): iomeprol 400 at 3 mL/s (IDR = 1.2 gI/s), iomeprol 400 at 4 mL/s (IDR = 1.6 gI/s), iomeprol 300 at 5.4 mL/s (IDR = 1.6 gI/s), or iomeprol 300 at 4 mL/s (IDR = 1.2 gI/s). Total iodine delivery was held constant. Dual-energy CTPA of the lungs was acquired and used to calculate virtual 120 kV CTPA images as well as iodine perfusion maps. Attenuation values in the thoracic vasculature and image quality of virtual 120 kV CTPAs were compared between groups. Iodine perfusion maps were also compared by identifying differences in the extent of beam-hardening artifacts and subjective image quality. Results: Protocols with an IDR of 1.6 gI/s provided the best attenuation profiles. CTPA image quality was greatest in the high-concentration, high-IDR (1.6 gI/s) protocol (P < 0.05 for all group comparisons), with no differences between the other groups (all P ≥ 0.05). The extent of beam-hardening artifacts and perfusion map image quality were significantly better using the high-concentration, high-IDR protocol as compared with all groups (P < 0.05 for all comparisons) and significantly worse using the low-concentration, low-IDR protocol as compared with all groups (all P ≥ 0.05); no difference was found between the high-concentration, low-IDR protocol and the low-concentration, high-IDR protocol (P = 0.73 for comparison of beam-hardening artifacts; P = 0.50 for comparison of perfusion map image quality).
Conclusion: High iodine concentration and high IDR contrast material delivery protocols provide the best image quality of both CTPA and perfusion map images of the lung through high attenuation in the pulmonary arteries and minimization of beam-hardening artifacts. PURPOSE Evaluation of a new protocol for dual-source CT contrast-enhanced cardiac imaging for better visualization of right ventricle structures. METHODS A total of 106 patients were included in this prospective, controlled study. The control group (n = 53) underwent our clinic's standard procedure for contrast-enhanced imaging of coronary arteries. The study group (n = 53) was imaged using the dual flow injection protocol, in which the saline chaser bolus contained 20% contrast media. The images were analyzed for mean density values using defined ROIs in the septum and both ventricles. In addition, the data sets were semi-quantitatively evaluated for visual delineation between the right ventricle and septum. To investigate whether this new protocol influenced the visualization of coronary arteries, mean density was also measured in the right and left coronary arteries. RESULTS The dual flow concept allows for statistically significantly better delineation of the septum in dual-source cardiac computed tomography for both the quantitative and semi-quantitative analyses. Also, the dual flow concept allows for statistically relevant higher coronary attenuation. CONCLUSION Using a saline chaser containing 20% contrast medium improves septal delineation for functional ventricular analysis as well as unimpaired coronary visualization. Objectives: The aims of our study were to compare contrast injection protocols with contrast media containing 300 and 400 mg iodine per milliliter for optimal contrast enhancement in cardiac multidetector row computed tomography (CT) and to evaluate the correlation of test bolus curve parameters with the final contrast density of the main bolus.
Materials and Methods: Sixty patients with known or suspected coronary artery disease were included in a prospective double-blind study. Patients were randomized to 2 groups. Group 1 received 83 mL of a contrast medium (CM) containing 300 mg of iodine per milliliter (Iomeron 300, Bracco Imaging SpA, Milan, Italy) at a flow rate of 3.3 mL/s, whereas group 2 received 63 mL of the same agent containing 400 mg of iodine per milliliter (Iomeron 400) at a flow rate of 2.5 mL/s. The test bolus volumes were 20 mL and 15 mL, respectively. Imaging was performed using a 16-slice CT system (16DCT; Somatom Sensation 16, Siemens Medical Solutions, Forchheim, Germany). Contrast densities (Hounsfield units [HU]) were determined in the cardiac chambers and in the main coronary arteries. The peak density and area under the curve of the test bolus were calculated for each patient. Results: The mean contrast densities of the coronary arteries were 259.1 ± 46.7 HU for group 1 and 251.6 ± 51.0 HU for group 2. No noteworthy differences between groups were noted for density measurements in the cardiac chambers or for the ratio of right-to-left ventricle density. Whereas a positive correlation was noted for both groups between the area under the curve of the test bolus and the mean density of the main bolus, a positive correlation between peak density of the test bolus and mean density of the main bolus was noted only for group 1. Conclusion: Equivalent homogeneous enhancement of the ventricular cavities and coronary arteries to that obtained using a CM with standard iodine concentration (Iomeron 300) can be achieved with lower overall volumes of administered CM and reduced injection flow rates when a CM with high iodine concentration (Iomeron 400) is used. In computed tomography (CT), several contrast media with different iodine concentrations are available.
The aim of this study is to prospectively compare contrast media with iodine concentrations of 300, 370 and 400 mg iodine/mL for chest CT. 300 consecutive patients were prospectively enrolled under a waiver of the local ethics committee. The first (second, third) 100 patients received contrast medium with 300 (370, 400) mg iodine/mL. Injection protocols were adapted for an identical iodine delivery rate (1.3 g/s) and total iodine load (33 g) for all three groups. Standardized MDCT of the chest (16 × 0.75 mm, 120 kVp, 100 mAs eff.) was performed. Intravascular attenuation values were measured in the pulmonary trunk and the ascending aorta; subjective image quality was rated on a 3-point scale. Discomfort during and after injection was evaluated. There were no statistically significant differences in contrast enhancement comparing the three contrast media at the pulmonary trunk (p = 0.3198) and at the ascending aorta (p = 0.0840). Image quality (p = 0.0176) and discomfort during injection (p = 0.7034) were comparable for all groups. General discomfort after injection of contrast media with 300 mg iodine/mL was statistically significantly higher compared to 370 mg iodine/mL (p = 0.00019). Given identical iodine delivery rates of 1.3 g/s and iodine loads of 33 g, contrast media with concentrations of 300, 370 and 400 mg iodine/mL do not result in different intravascular enhancement in chest CT. Objective: The objective of this study was to compare intracoronary attenuation on 16-row multislice computed tomography (16-MSCT) coronary angiography using 2 contrast materials (CM) with high iodine concentration.
Material and Methods: Forty consecutive patients (29 male, 11 female; mean age, 61 ± 11 years) with suspected coronary artery disease were randomized to 2 groups to receive 100 mL of either iopromide 370 (group 1: Ultravist 370, 370 mg iodine/mL; Schering AG, Berlin, Germany) or iomeprol 400 (group 2: Iomeron 400, 400 mg iodine/mL; Bracco Imaging SpA, Milan, Italy). Both CM were administered at a rate of 4 mL/s. All patients underwent 16-MSCT coronary angiography (Sensation 16; Siemens, Germany) with collimation 16 × 0.75 mm and rotation time 375 ms. The attenuation in Hounsfield units (HU) achieved after each CM was determined at regions of interest (ROIs) placed at the origin of the coronary arteries and on the ascending aorta, descending aorta, and pulmonary artery. Differences in mean attenuation in the coronary arteries and in the ascending aorta, descending aorta, and pulmonary artery were evaluated using the Student's t test. Results: The mean attenuation achieved at each anatomic site was consistently greater after iomeprol 400 than after iopromide 370. At the origin of the coronary arteries, the mean attenuation after iomeprol 400 (340 ± 53 HU) was greater (P < 0.05) than that after iopromide 370 (313 ± 42 HU). Similar findings were noted for the mean attenuation in the ascending aorta, descending aorta, and pulmonary artery. Conclusion: The intravenous administration of iomeprol 400 provides higher attenuation of the coronary arteries and of the great arteries of the thorax as compared with iopromide 370 using the same injection parameters. PURPOSE To evaluate the influence of the body mass index (BMI) on coronary artery opacification in 64-slice CT. MATERIAL AND METHODS Sixty-two patients retrospectively underwent ECG-gated 64-slice CT coronary angiography (tube potential 120 kV, tube current-time product 650 mAs) after intravenous injection of 80 mL of iodinated contrast agent (320 mg/mL, 5 mL/s).
Attenuation values (HU) were measured and contrast-to-noise ratios (CNR) were calculated in the right coronary artery (RCA) and left main artery (LMA). The CNR was defined as the difference between the mean attenuation in the vessel and the mean attenuation in the perivascular fat tissue, divided by the image noise in the ascending aorta. The height and weight of the patients at the time of the CT scan were recorded and the BMI was calculated. RESULTS The mean BMI was 26.2 ± 3.2 kg/m² (range 19.7-32.2 kg/m²), the mean attenuation in the LMA was 330 ± 64 HU, and the mean attenuation in the RCA was 309 ± 68 HU. The CNR in the LMA was 16.7 ± 3.8, and the CNR in the RCA was 15.9 ± 3.6. The image noise in the ascending aorta significantly correlated with the BMI (r = 0.36, p < 0.01). A weak negative correlation was found between the BMI and LMA attenuation (r = -0.28, p < 0.05), whereas no significant correlation was found for the RCA (r = -0.21, p = 0.12). A significant negative correlation was found between the BMI and the CNR in the RCA (r = -0.41, p < 0.05) and the LMA (r = -0.47, p < 0.001). CONCLUSION With constant scan parameters and a constant contrast medium amount, the CNR in both coronary arteries decreases as the BMI increases. This implies a modification of previously standardized, fixed examinations toward individually adapted protocols with variable parameters for CT coronary angiography. OBJECTIVES To assess the impact of low-concentration contrast medium on vascular enhancement, image quality and radiation dose of coronary CT angiography (cCTA) by using a combination of iterative reconstruction (IR) and the low-tube-voltage technique.
MATERIALS AND METHODS One hundred patients were prospectively randomized to two types of contrast medium and underwent prospectively electrocardiogram-triggered cCTA (Definition Flash, Siemens Healthcare; collimation: 128 × 0.6 mm; tube current: 300 mAs). Fifty patients who received iopromide 370 were scanned using the conventional tube setting (100 kVp, or 120 kVp if BMI ≥ 25 kg/m²) and reconstructed with filtered back projection (FBP). Fifty patients who received iodixanol 270 were scanned using the low-tube-voltage technique (80 kVp, or 100 kVp if BMI ≥ 25 kg/m²) and reconstructed with IR. CT attenuation was measured in the coronary arteries and other anatomical regions. Noise, image quality and radiation dose were compared. RESULTS Compared with the two iopromide 370 subgroups, the iodixanol 270 subgroups showed no significant difference in CT attenuation (576.63 ± 95.50 vs. 569.51 ± 118.93 for BMI < 25 kg/m², p = 0.647, and 394.19 ± 68.09 vs. 383.72 ± 63.11 for BMI ≥ 25 kg/m², p = 0.212), noise (in various anatomical regions of interest) and image quality (3.5 vs. 4.0, p = 0.13), but a significantly (0.41 ± 0.17 vs. 0.94 ± 0.45 for BMI < 25 kg/m², p < 0.001, and 1.14 ± 0.24 vs. 2.37 ± 0.69 for BMI ≥ 25 kg/m², p < 0.001) lower radiation dose, reflecting dose savings of 56.4% and 51.9%, respectively. CONCLUSIONS Combining IR with the low-tube-voltage technique, a low-concentration contrast medium of 270 mg I/mL can still maintain the contrast enhancement without impairing image quality, as well as significantly lower the radiation dose.
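The contrast-to-noise ratio definition used in the BMI study above (vessel attenuation minus perivascular fat attenuation, divided by image noise in the ascending aorta) can be written out directly. A minimal sketch; the example values are plausible Hounsfield numbers for illustration, not measurements from the study:

```python
def cnr(vessel_hu: float, fat_hu: float, noise_hu: float) -> float:
    """Contrast-to-noise ratio as defined in the BMI study above:
    (mean vessel attenuation - mean perivascular fat attenuation)
    divided by the image noise measured in the ascending aorta."""
    return (vessel_hu - fat_hu) / noise_hu

# Example: vessel 330 HU, perivascular fat -80 HU, aortic noise 25 HU
# cnr(330, -80, 25) → 16.4, in the range of the reported mean CNRs
```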
2,036
26,690,497
There was little evidence that monitoring by PR interval analysis conveyed any benefit of any sort . The modest benefits of fewer fetal scalp samplings during labour ( in settings in which this procedure is performed ) and fewer instrumental vaginal births have to be considered against the disadvantages of needing to use an internal scalp electrode , after membrane rupture , for ECG waveform recordings . We found little strong evidence that ST waveform analysis had an effect on the primary outcome measures in this systematic review . There was a lack of evidence showing that PR interval analysis improved any outcomes , and a larger future trial may possibly demonstrate beneficial effects . There is little information about the value of fetal ECG waveform monitoring in preterm fetuses in labour .
BACKGROUND Hypoxaemia during labour can alter the shape of the fetal electrocardiogram ( ECG ) waveform , notably the relation of the PR to RR intervals , and elevation or depression of the ST segment . Technical systems have therefore been developed to monitor the fetal ECG during labour as an adjunct to continuous electronic fetal heart rate monitoring with the aim of improving fetal outcome and minimising unnecessary obstetric interference . OBJECTIVES To compare the effects of analysis of fetal ECG waveforms during labour with alternative methods of fetal monitoring .
Background . In a large Swedish multicenter randomized controlled trial ( RCT ) on intrapartum fetal monitoring with automatic analysis of the fetal ECG waveform ( STAN ) in combination with cardiotocography ( CTG ) ( 4966 parturients , 300 obstetricians and midwives managing the patients ) , interim analysis revealed protocol violations . By a post hoc analysis of the results over time , factors affecting the acceptance of the new technique were analyzed . Methods . The rates of primary and secondary outcome measures ( fetal outcome , operative deliveries ) were compared in the two study groups ( CTG + ST and CTG only ) . Changes over time were statistically evaluated using a test for homogeneity between the two periods . Results . After retraining , the CTG + ST group showed the lowest rates of operative delivery for fetal distress , fetal blood sampling and admissions to the neonatal intensive care unit . Operative deliveries ( p = 0.02 ) and the number of fetal blood samplings decreased significantly over time ( p = 0.001 ) . Conclusions . Training and education probably predisposed the clinicians to a change and reinforced it when it occurred as a result of increased personal experience . The audit and feedback , together with the influence of opinion leaders and inter-collegial interactions , seem to have been of importance for the successively increasing acceptance of the new method during the RCT . Abstract Objective : The aim of this study was to investigate the possible relationship between cord blood alpha-fetoprotein ( AFP ) level and the development of subsequent neonatal hyperbilirubinemia . Study design : Term newborns born between March 2005 and October 2005 were included in the study . Infants with Coombs-positive ABO and/or Rh incompatibility and/or hemolytic jaundice , asphyxia , congenital anomaly and signs of bleeding were excluded from the study . Cord blood AFP levels were measured in 504 full-term newborns in this period .
Infants were followed up for possible neonatal hyperbilirubinemia . The capillary bilirubin level ( CBL ) was examined expeditiously in newborns developing jaundice , and in other infants at the time of discharge while the screening test was being performed . Results : The mean umbilical cord AFP level was 49.1 ± 44.9 mg/L ( range 1.1–396.2 mg/L ) , mean CBL was 5.8 ± 3.1 mg/dL ( range 1–19.4 mg/dL ) , and the mean bilirubin detection time was 37 ± 23.2 hours ( range 12–144 h ) of age . Although a significant positive correlation was found between umbilical cord AFP and CBL levels , it was weak ( r = 0.187 , p < 0.001 ) . Comparison of AFP levels in terms of bilirubin percentile values appropriate for postnatal age also showed a significant weak positive correlation ( r = 0.113 , p < 0.001 ) . Conclusion : Umbilical cord AFP levels may not be used as a strong predictor for the determination of newborns at risk for hyperbilirubinemia . Objective . To undertake a renewed analysis of data from the previously published Swedish randomized controlled trial on intrapartum fetal monitoring with cardiotocography ( CTG‐only ) vs. CTG plus ST analysis of the fetal electrocardiogram ( CTG+ST ) , using current standards of intention‐to‐treat ( ITT ) analysis , and to compare the results with those of the modified ITT ( mITT ) and per protocol analyses . Methods . Renewed extraction of data from the original database including all cases randomized according to primary case allocation ( n=5 049 ) . Main outcome measure . Metabolic acidosis in the umbilical artery at birth ( pH < 7.05 , base deficit in extracellular fluid > 12.0 mmol/l ) , including samples of umbilical vein blood or neonatal blood if umbilical artery blood was missing . Results . The metabolic acidosis rates were 0.66 % ( 17 of 2 565 ) and 1.33 % ( 33 of 2 484 ) in the CTG+ST and CTG‐only groups , respectively [ relative risk ( RR ) 0.50 ; 95 % confidence interval ( CI ) 0.28–0.88 ; p=0.019 ] .
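The relative risk quoted just above can be reproduced from the raw counts (17 of 2 565 vs. 33 of 2 484). A minimal sketch using the standard log-RR (Katz) interval follows; the upper bound comes out at 0.89 rather than the published 0.88, presumably due to rounding or a slightly different interval method in the original analysis:

```python
import math

def relative_risk_ci(a, n1, c, n2, z=1.96):
    """Relative risk with a Katz log-interval 95% CI from event counts a/n1 vs. c/n2."""
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)  # SE of ln(RR)
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# Metabolic acidosis: 17/2565 (CTG+ST) vs. 33/2484 (CTG only)
rr, lo, hi = relative_risk_ci(17, 2565, 33, 2484)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # 0.5 0.28 0.89
```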
The original mITT gave RR 0.47 , 95%CI 0.25–0.86 ( p=0.015 ) , mITT with correction for 10 previously misclassified cases RR 0.48 , 95%CI 0.24–0.96 ( p=0.038 ) , and per protocol analysis RR 0.40 , 95%CI 0.20–0.80 ( p=0.009 ) . The level of significance of the difference in metabolic acidosis rates between the two groups remained unchanged in all analyses . Conclusion . Re‐analysis of data according to the ITT principle showed that , regardless of the method of analysis , the Swedish randomized controlled trial maintained its ability to demonstrate a significant reduction in the metabolic acidosis rate when using CTG+ST analysis for fetal surveillance in labor . OBJECTIVE Cardiotocography plus automatic ST analysis of the fetal electrocardiogram has recently been shown to reduce both the operative delivery rate for fetal distress and the cord artery metabolic acidosis rate . The purpose of this study was to analyze findings that were related to cases with a complicated/adverse neonatal outcome in the Swedish randomized controlled trial . STUDY DESIGN Of the 4966 term fetuses that were included in the trial , all 351 newborn infants who required special neonatal care were identified . Cases of perinatal death , neonatal encephalopathy , or metabolic acidosis at birth were reviewed . RESULTS Of the 29 fetuses with adverse/complicated neonatal outcome , 22 fetuses had cardiotocography and ST patterns that indicated a need for intervention , according to the cardiotocography plus ST clinical guidelines . The number of live-born infants with moderate or severe neonatal encephalopathy showed a significant decrease from 0.33 % ( 8/2447 fetuses ) in the cardiotocography-only group to 0.04 % ( 1/2519 fetuses ) in the cardiotocography plus ST group .
CONCLUSION Cardiotocography plus ST analysis provides accurate information about intrapartum hypoxia and may prevent intrapartum asphyxia and neonatal encephalopathy by giving a clear alert to the staff members who are in charge . OBJECTIVE : To estimate the effectiveness of intrapartum fetal monitoring by cardiotocography plus ST analysis using a strict protocol for performance of fetal blood sampling . METHODS : We performed a multicenter randomized trial among laboring women with a high-risk singleton pregnancy in cephalic presentation beyond 36 weeks of gestation . Participants were assigned to monitoring by cardiotocography with ST analysis ( index ) or cardiotocography only ( control ) . Primary outcome was metabolic acidosis , defined as an umbilical cord artery pH below 7.05 combined with a base deficit calculated in the extracellular fluid compartment above 12 mmol/L. Secondary outcomes were metabolic acidosis in blood , operative deliveries , Apgar scores , neonatal admissions , and hypoxic–ischemic encephalopathy . RESULTS : We randomly assigned 5,681 women to the two groups ( 2,832 index , 2,849 control ) . The fetal blood sampling rate was 10.6 % in the index group compared with 20.4 % in the control group ( relative risk 0.52 ; 95 % CI 0.46–0.59 ) . The primary outcome occurred in 0.7 % of the index group compared with 1.1 % of the control group ( relative risk 0.70 ; 95 % CI 0.38–1.28 ; number needed to treat 252 ) . Using metabolic acidosis calculated in blood , these rates were 1.6 % and 2.6 % , respectively ( relative risk 0.63 ; 95 % CI 0.42–0.94 ; number needed to treat 100 ) . The numbers of operative deliveries , low Apgar scores , neonatal admissions , and newborns with hypoxic–ischemic encephalopathy were comparable in both groups . CONCLUSION : Intrapartum monitoring by cardiotocography combined with ST analysis does not significantly reduce the incidence of metabolic acidosis calculated in the extracellular fluid compartment .
It does reduce the incidence of metabolic acidosis calculated in blood and the need for fetal blood sampling , without affecting the Apgar score , neonatal admissions , hypoxic–ischemic encephalopathy , or operative deliveries . CLINICAL TRIAL REGISTRATION : ISRCTN Register , www.isrctn.org , ISRCTN95732366 . LEVEL OF EVIDENCE : BACKGROUND It is unclear whether using fetal electrocardiographic ( ECG ) ST-segment analysis as an adjunct to conventional intrapartum electronic fetal heart-rate monitoring modifies intrapartum and neonatal outcomes . METHODS We performed a multicenter trial in which women with a singleton fetus who were attempting vaginal delivery at more than 36 weeks of gestation and who had cervical dilation of 2 to 7 cm were randomly assigned to " open " or " masked " monitoring with fetal ST-segment analysis . The masked system functioned as a normal fetal heart-rate monitor . The open system displayed additional information for use when uncertain fetal heart-rate patterns were detected . The primary outcome was a composite of intrapartum fetal death , neonatal death , an Apgar score of 3 or less at 5 minutes , neonatal seizure , an umbilical-artery blood pH of 7.05 or less with a base deficit of 12 mmol per liter or more , intubation for ventilation at delivery , or neonatal encephalopathy . RESULTS A total of 11,108 patients underwent randomization ; 5532 were assigned to the open group , and 5576 to the masked group . The primary outcome occurred in 52 fetuses or neonates of women in the open group ( 0.9 % ) and 40 fetuses or neonates of women in the masked group ( 0.7 % ) ( relative risk , 1.31 ; 95 % confidence interval , 0.87 to 1.98 ; P=0.20 ) . Among the individual components of the primary outcome , only the frequency of a 5-minute Apgar score of 3 or less differed significantly between neonates of women in the open group and those in the masked group ( 0.3 % vs. 0.1 % , P=0.02 ) .
There were no significant between-group differences in the rate of cesarean delivery ( 16.9 % and 16.2 % , respectively ; P=0.30 ) or any operative delivery ( 22.8 % and 22.0 % , respectively ; P=0.31 ) . Adverse events were rare and occurred with similar frequency in the two groups . CONCLUSIONS Fetal ECG ST-segment analysis used as an adjunct to conventional intrapartum electronic fetal heart-rate monitoring did not improve perinatal outcomes or decrease operative-delivery rates . ( Funded by the Eunice Kennedy Shriver National Institute of Child Health and Human Development and Neoventa Medical ; ClinicalTrials.gov number , NCT01131260 . ) OBJECTIVE To determine the possibilities of ST analysis of the fetal ECG ( STAN ) in premature deliveries between the 30th and 36th week of pregnancy . To compare the results of a group of premature deliveries monitored by ST analysis with a control group of premature deliveries monitored by means of cardiotocography ( CTG ) and intrapartum fetal pulse oximetry ( IFPO ) . TYPE OF STUDY A prospective study . SETTING Department of Gynecology-Obstetrics , Masaryk University and Faculty Hospital Brno . METHODS The authors evaluated 39 women with premature delivery between the 30th and 36th week of pregnancy from a total cohort of 239 high-risk pregnant women who had been monitored by means of ST analysis of the fetal ECG . The control group included 229 pregnant women who gave birth between the 30th and 36th week of pregnancy under monitoring with CTG and IFPO . Allocation into the individual groups was in random order . The authors evaluated the duration and mode of termination of delivery , pH in arterial umbilical blood , Apgar score in the first , fifth and tenth minute , total duration of hospitalization , necessity and duration of stay at the Neonatologic Intensive Care Unit and Intermediary Intensive Care Unit , the presence of sepsis , hyperbilirubinemia and the neurological state of the newborn .
The statistical analysis was performed by means of Fisher 's exact test , the Kruskal-Wallis test , the chi2 test and the parametric Anova test . RESULTS Almost none of the observed parameters in the two categories of premature deliveries ( STAN vs. CTG+IFPO ) exhibited a statistically significant difference , except for mild neurological impairment of the newborn . In the group of premature deliveries monitored by ST analysis there were only 33.3 % of newborns with signs of mild neurological damage , as compared with the control group , where 56.3 % of subjects were so affected ( p<0.01 ) . CONCLUSION It has become obvious that ST analysis of the fetal ECG in premature deliveries between the 30th and 36th week of pregnancy provides the same results as the monitoring by CTG and IFPO used so far . In the group of premature deliveries monitored by ST analysis , neurological disturbances were significantly less frequent . Background Cardiotocography ( CTG ) is worldwide the method for fetal surveillance during labour . However , CTG alone shows many false positive test results and , without fetal blood sampling ( FBS ) , it results in an increase in operative deliveries without improvement of fetal outcome . FBS requires additional expertise , is invasive and often has to be repeated during labour . Two clinical trials have shown that a combination of CTG and ST-analysis of the fetal electrocardiogram ( ECG ) reduces the rates of metabolic acidosis and instrumental delivery . However , in both trials FBS was still performed in the ST-analysis arm , and it is therefore still unknown whether the observed results were indeed due to the ST-analysis or to the use of FBS in combination with ST-analysis . Methods / Design We aim to evaluate the effectiveness of non-invasive monitoring ( CTG + ST-analysis ) as compared to normal care ( CTG + FBS ) in a multicentre randomised clinical trial setting .
Secondary aims are : 1 ) to judge whether ST-analysis of the fetal electrocardiogram can significantly decrease the frequency of performance of FBS or even replace it ; 2 ) to perform a cost analysis to establish the economic impact of the two treatment options . Women in labour with a gestational age ≥ 36 weeks and an indication for CTG monitoring can be included in the trial . Eligible women will be randomised for fetal surveillance with CTG and , if necessary , FBS , or CTG combined with ST-analysis of the fetal ECG . The primary outcome of the study is the incidence of serious metabolic acidosis ( defined as pH < 7.05 and Bdecf > 12 mmol/L in the umbilical cord artery ) . Secondary outcome measures are : instrumental delivery , neonatal outcome ( Apgar score , admission to a neonatal ward ) , incidence of performance of FBS in both arms and cost-effectiveness of both monitoring strategies across hospitals . The analysis will follow the intention-to-treat principle . The incidence of metabolic acidosis will be compared across both groups . Assuming a reduction of metabolic acidosis from 3.5 % to 2.1 % , using a two-sided test with an alpha of 0.05 and a power of 0.80 , in favour of CTG plus ST-analysis , about 5100 women have to be randomised . Furthermore , the cost-effectiveness of CTG and ST-analysis as compared to CTG and FBS will be studied . Discussion This study will provide data about the use of intrapartum ST-analysis with a strict protocol for performance of FBS to limit its incidence . We aim to clarify to what extent intrapartum ST-analysis can be used without the performance of FBS and in which cases FBS is still needed . Trial Registration Number ISRCTN OBJECTIVE The physiology of changes in the ST waveform of the fetal electrocardiogram has been elucidated in extensive animal and human observational studies . A combination of heart rate and ST waveform analysis might improve the predictive value of intrapartum monitoring .
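The sample-size statement above ( 3.5 % vs. 2.1 % , two-sided alpha 0.05 , power 0.80 ) can be checked with the usual normal-approximation formula for comparing two proportions. The crude calculation below gives roughly 2 200 per group ( about 4 400 in total ) ; the trial's figure of about 5 100 is presumably inflated for attrition or other design considerations — that reading is an assumption , not something stated in the abstract :

```python
import math

def n_per_group(p1: float, p2: float) -> int:
    """Normal-approximation sample size per group for a two-proportion comparison,
    two-sided alpha = 0.05, power = 0.80 (z-values hard-coded for those levels)."""
    z_a, z_b = 1.959964, 0.841621
    pbar = (p1 + p2) / 2
    num = (z_a * math.sqrt(2 * pbar * (1 - pbar))
           + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)

print(n_per_group(0.035, 0.021))  # 2179 per group, ~4360 in total
```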
Our purpose was to compare operative intervention and neonatal outcome in labors monitored by the conventional cardiotocogram with those monitored by ST waveform analysis plus the cardiotocogram . STUDY DESIGN A prospective , randomized clinical trial was performed on 2434 high-risk labors in a district general hospital in Plymouth , England . Statistical analysis was performed by Student 's t test and chi2 analysis . RESULTS There was a 46 % reduction ( p < 0.001 , odds ratio 1.85 [ 1.35 - 2.66 ] ) in operative deliveries for " fetal distress " and a trend toward less metabolic acidosis ( p = 0.09 , odds ratio 0.38 [ 0.13 - 1.07 ] ) and fewer low 5-minute Apgar scores ( p = 0.12 , odds ratio 0.62 [ 0.35 - 1.08 ] ) in the ST waveform plus cardiotocogram arm . CONCLUSIONS ST waveform analysis discriminates cardiotocogram changes in labor , and the protocol for interpretation is safe . Further randomized studies are warranted . Objective . To assess the cost-effectiveness of the addition of ST analysis of the fetal electrocardiogram ( ECG ; STAN ® ) to cardiotocography ( CTG ) for fetal surveillance during labor , compared with CTG only . Design . Cost-effectiveness analysis based on a randomized clinical trial of ST analysis of the fetal ECG . Setting . Obstetric departments of three academic and six general hospitals in The Netherlands . Population . Laboring women with a singleton high-risk pregnancy , a fetus in cephalic presentation , a gestational age > 36 weeks and an indication for internal electronic fetal monitoring . Methods . A trial-based cost-effectiveness analysis was performed from a health-care provider perspective . Main Outcome Measures . Primary health outcome was the incidence of metabolic acidosis measured in the umbilical artery . Direct medical costs were estimated from start of labor to childbirth . Cost-effectiveness was expressed as costs to prevent one case of metabolic acidosis . Results .
The incidence of metabolic acidosis was 0.7 % in the ST-analysis group and 1.0 % in the CTG-only group ( relative risk 0.70 ; 95 % confidence interval 0.38–1.28 ) . Per delivery , the mean costs per patient of CTG plus ST analysis ( n= 2 827 ) were € 1 345 vs. € 1 316 for CTG only ( n= 2 840 ) , with a mean difference of € 29 ( 95 % confidence interval −€9 to € 77 ) until childbirth . The incremental costs of ST analysis to prevent one case of metabolic acidosis were € 9 667 . Conclusions . The additional costs of monitoring by ST analysis of the fetal ECG are very limited when compared with monitoring by CTG only , and very low compared with the total costs of delivery . It is possible to record the fetal electrocardiographic waveform ( ECG ) from the scalp electrode used in labour for detection of the fetal heart rate . Animal and observational studies of changes in the ST waveform of the ECG during hypoxia suggest that a combination of heart rate and ST waveform analysis might improve the predictive value of intrapartum monitoring . In a randomised trial , we have studied intervention rates and neonatal outcome for high-risk labours monitored either by conventional cardiotocography ( CTG ) or by ST waveform analysis plus CTG . 1200 women with pregnancies of at least 34 weeks ' gestation were assigned to the groups when the decision to apply a fetal scalp electrode was made . Neonatal outcome was assessed by umbilical-cord blood gas analysis , Apgar scores , resuscitation needed , and postnatal course . All recordings were retrospectively reviewed by an observer unaware of clinical details to check adherence to the trial protocol . The addition of ST waveform monitoring to CTG substantially reduced the proportion of deliveries for fetal distress ( ST + CTG 27/615 vs CTG 58/606 ; p less than 0.001 ) . The groups did not differ in rate of operative delivery for other reasons , incidence of asphyxia at birth , or neonatal outcome .
Metabolic acidosis and low 5-min Apgar scores were less common in the ST + CTG group than in the CTG group , but not significantly so . The only case of birth asphyxia in the ST + CTG group was identified by both heart rate and ST changes . The review of recordings showed that the reduction in intervention rate was among cases with CTG patterns classified as normal or intermediate , whereas there was no difference in intervention rates among cases with abnormal recordings . Our findings confirm that ST waveform analysis discriminates CTG changes in labour and that our protocol for interpretation is safe . Further randomised studies are warranted . BACKGROUND There is a need to improve the sensitivity and specificity of fetal monitoring during labour . We compared the gold standard , cardiotocography , with cardiotocography plus time-interval analysis of the fetal electrocardiogram in fetal surveillance . The aim was to find out whether time-interval analysis decreased the need for operative intervention due to fetal distress . METHODS We did a randomised , prospective trial in five hospitals in the UK , Hong Kong , the Netherlands , and Singapore . 1038 women undergoing high-risk labours were randomly assigned fetal monitoring by cardiotocography alone , or cardiotocography plus fetal electrocardiography ( ECG ) . Outcomes measured were rates of operative intervention , and neonatal outcome . Analysis was by intention to treat . FINDINGS 515 women were assigned management by cardiotocography , and 523 cardiotocography plus fetal ECG . There was a trend towards fewer operative interventions for presumed fetal distress in the time-interval analysis plus cardiotocography group ( 63 [ 13 % ] vs 78 [ 16 % ] ) , but this was not significant ( relative risk 0.80 [ 95 % CI 0.59 - 1.08 ] , p=0.17 ) .
There was no significant difference between groups in the proportion of babies who had an umbilical arterial pH of 7.15 or less ( 51 [ 11 % ] vs 49 [ 11 % ] ; 1.01 [ 0.7 - 1.47 ] ) , or in the frequency of unsuspected acidaemia ( 42 [ 9 % ] vs 35 [ 8 % ] ; 1.17 [ 0.76 - 1.79 ] ) . INTERPRETATION The addition of time-interval analysis of the fetal electrocardiogram during labour did not show a significant benefit in decreasing operative intervention . There was no significant difference in neonatal outcome . OBJECTIVE The purpose of this study was to assess whether knowledge of ST-segment analysis was associated with a reduction in operative deliveries for nonreassuring fetal status ( NRFS ) or with a need for at least 1 scalp pH during labor . STUDY DESIGN Seven hundred ninety-nine women at term with abnormal cardiotocography or meconium-stained amniotic fluid ( 7 % ) were randomly assigned to the intervention group ( cardiotocography + STAN ) or the control group ( cardiotocography ) in 2 university hospitals in Strasbourg , France . Scalp pH testing was optional in both groups . Abnormal neonatal outcome was pH < 7.05 or umbilical cord blood artery base deficit of > 12 or a 5-min Apgar score of < 7 or neonatal intensive care unit admission or convulsions or neonatal death . Study power was 80 % for the detection of a prespecified reduction from 50 % to 40 % in operative delivery for NRFS . RESULTS The operative delivery ( cesarean or instrumental ) rate for NRFS did not differ between the 2 groups : 33.6 % ( 134/399 ) in the cardiotocography + STAN analysis group vs 37 % ( 148/400 ) in the cardiotocography group ( relative risk , 0.91 ; 95 % CI , 0.75 - 1.10 ) . The rate of operative delivery for dystocia was also similar in both groups . The percentage of women whose fetus had at least 1 scalp pH measurement during labor was substantially lower in the group with ST-segment analysis : 27 % compared with 62 % ( relative risk , 0.44 ; 95 % CI , 0.36 - 0.52 ) .
Neonatal outcomes did not differ significantly between groups . CONCLUSION In a population with abnormal cardiotocography in labor , cardiotocography combined with ST-segment analysis was not associated with a reduction in operative deliveries for NRFS . The proportion of infants without scalp pH sampling during labor increased substantially , however . BACKGROUND Previous studies indicate that analysis of the ST waveform of the fetal electrocardiogram provides information on the fetal response to hypoxia . We did a multicentre randomised controlled trial to test the hypothesis that intrapartum monitoring with cardiotocography combined with automatic ST-waveform analysis results in an improved perinatal outcome compared with cardiotocography alone . METHODS At three Swedish labour wards , 4966 women with term fetuses in the cephalic presentation entered the trial during labour after a clinical decision had been made to apply a fetal scalp electrode for internal cardiotocography . They were randomly assigned monitoring with cardiotocography plus ST analysis ( CTG+ST group ) or cardiotocography only ( CTG group ) . The main outcome measure was the rate of umbilical-artery metabolic acidosis ( pH < 7.05 and base deficit > 12 mmol/L ) . Secondary outcomes included operative delivery for fetal distress . Results were first analysed according to intention to treat , and secondly after exclusion of cases with severe malformations or with inadequate monitoring . FINDINGS The CTG+ST group showed significantly lower rates of umbilical-artery metabolic acidosis than the cardiotocography group ( 15 of 2159 [ 0.7 % ] vs 31 of 2079 [ 2 % ] , relative risk 0.47 [ 95 % CI 0.25 - 0.86 ] , p=0.02 ) and of operative delivery for fetal distress ( 193 of 2519 [ 8 % ] vs 227 of 2447 [ 9 % ] , 0.83 [ 0.69 - 0.99 ] , p=0.047 ) when all cases were included according to intention to treat .
The differences were more pronounced after exclusion of 291 cases in the CTG+ST group and 283 in the CTG group with malformations or inadequate recording . INTERPRETATION Intrapartum monitoring with cardiotocography combined with automatic ST-waveform analysis increases the ability of obstetricians to identify fetal hypoxia and to intervene more appropriately , resulting in an improved perinatal outcome . Objective . To evaluate whether correct adherence to clinical guidelines might have led to prevention of cases with adverse neonatal outcome . Design . Secondary analysis of cases with adverse outcome in a multicenter randomized clinical trial . Setting . Nine Dutch hospitals . Population . Pregnant women with a term singleton fetus in cephalic position . Methods . Data were obtained from a randomized trial that compared monitoring by STAN ® ( index group ) with cardiotocography ( control group ) . In both trial arms , three observers independently assessed the fetal surveillance results in all cases with adverse neonatal outcome , to determine whether an indication for intervention was present , based on current clinical guidelines . Main outcome measures . Adverse neonatal outcome cases fulfilled one or more of the following criteria : ( i ) metabolic acidosis in the umbilical cord artery ( pH < 7.05 and base deficit in extracellular fluid > 12 mmol/L ) ; ( ii ) umbilical cord artery pH < 7.00 ; ( iii ) perinatal death ; and/or ( iv ) signs of moderate or severe hypoxic ischemic encephalopathy . Results . We studied 5681 women , of whom 61 ( 1.1 % ) had an adverse outcome ( 26 index ; 35 control ) . In these women , the number of operative deliveries performed for fetal distress was 18 ( 69.2 % ) and 16 ( 45.7 % ) , respectively . Reassessment of all 61 cases showed that there was a fetal indication to intervene in 23 ( 88.5 % ) and 19 ( 57.6 % ) cases , respectively . In 13 ( 50.0 % ) vs .
11 ( 33.3 % ) cases , respectively , this indication occurred more than 20 min before the time of delivery , meaning that these adverse outcomes could possibly have been prevented . Conclusions . In our trial , more strict adherence to clinical guidelines could have led to additional identification and prevention of adverse outcomes . OBJECTIVE Our goal was to test the hypothesis that the addition of fetal electrocardiogram time-interval analysis to conventional electronic fetal monitoring would significantly reduce the number of cases requiring fetal scalp blood sampling without an increase in adverse outcome . STUDY DESIGN A randomized prospective trial was performed in 214 women with high-risk labor . RESULTS There was a significant reduction in the number of cases that had fetal blood sampling performed in the fetal electrocardiogram plus electronic fetal monitoring group ( risk ratio for electronic fetal monitoring alone 3.53 ; p < 0.01 , 95 % confidence interval 1.39 to 8.95 ) . The fetal blood samplings performed in the electronic fetal monitoring alone group were less likely to be abnormal ( pH < 7.25 , base excess < -8.0 ) than those performed in the fetal electrocardiogram plus electronic fetal monitoring group ( risk ratio for electronic fetal monitoring alone 0.62 , p = 0.05 , 95 % confidence interval 0.35 to 1.10 ) . There was a trend for more infants with an arterial umbilical pH < 7.15 and a base excess less than -8.0 mmol/L at birth being unsuspected , and more instrumental deliveries for presumed fetal distress being performed , in the electronic fetal monitoring alone group than in the fetal electrocardiogram plus electronic fetal monitoring group .
CONCLUSION The addition of fetal electrocardiogram analysis to conventional electronic fetal monitoring during labor can significantly reduce the number of parturients undergoing fetal scalp blood sampling and can simultaneously increase its efficiency without an increase in adverse outcome . OBJECTIVE Evaluation of the role of ST analysis of the fetal ECG for early detection of developing acute hypoxia in the course of delivery of fetuses with presumed growth retardation . A comparison with the present way of intrapartal fetus monitoring . Impact on the number of surgical births for indications of threatening fetus hypoxia . Influence of the method on perinatal results and postnatal adaptation of the newborns . TYPE OF STUDY A prospective study . SETTING Gynecology-Obstetrics Clinic , Masaryk University and Teaching Hospital Brno . METHOD Forty-seven women with growth retardation of the fetus diagnosed before delivery , who gave birth in the Teaching Hospital in Brno during 2003 - 2005 with intrapartal ST analysis of the fetal ECG subsequently used , were enrolled into this prospective study ( group A ) . The control group consisted of 87 deliveries taking place in the same period of time , concerning women with fetuses suffering from growth retardation and monitored by standard methods ( group B ) . The standard methods included cardiotocography ( CTG ) , supplemented with pulse oximetry ( IFPO ) if needed . The diagnosis of intrauterine fetus growth retardation was established on the basis of the results of repeated prepartal ultrasound fetus biometry with estimation of the mass , which corresponded to a group below the 10th percentile for the given gestational age . The numbers of vaginal deliveries and surgically treated deliveries due to threatening fetus hypoxia ( Cesarean section , forceps delivery ) were recorded .
The authors evaluated postpartal pH from the umbilical artery , independently for the group of pH values < 7.00 , the group of pH 7.00 - 7.10 and the group of pH 7.10 or more . The values of the Apgar score were evaluated for the first , fifth and tenth minute , respectively . The neonatologist followed the duration of stay of the newborn at the Newborn Intensive Care Unit and the Intermediate Care Unit , the total duration of hospitalization , the occurrence of sepsis in the early newborn period , the occurrence of hyperbilirubinemia , and the conclusion of the neurological examination . All the results were evaluated statistically by the chi2 test , the Kruskal-Wallis test or the Anova method . RESULTS There was no statistically significant difference in the number of deliveries ended by surgery for threatening fetus hypoxia ( p = 0.856 ) or in the detection rate of intrapartal hypoxia according to pH values of umbilical blood divided into the three groups ( p = 0.657 , p = 0.958 and p = 0.730 , respectively ) . The values of the Apgar score differed significantly in favor of group A only in the first minute , at the 5 % level of significance ( p = 0.018 ) . The values of the Apgar score in the fifth and tenth minute did not show any significant difference ( p = 0.301 and p = 0.313 , respectively ) . There was no statistically significant difference in neonatological results between groups A and B. CONCLUSION The use of ST analysis of the fetal ECG in the course of delivery of fetuses with presumed intrauterine growth retardation did not show any significant difference from the presently used methods ( CTG supplemented with IFPO if needed ) .
Use of the method had no effect on the number of surgically treated deliveries for indications of threatening acute fetal hypoxia, on perinatal results, or on postnatal adaptation of the newborns INTRODUCTION In previous papers we proposed ways to improve the diagnostic potential of the "quantitative cardiotocography" computer method, which allowed us to introduce clinical practice guidelines. Using these guidelines we aim to evaluate the effectiveness of quantitative cardiotocography (qCTG) as compared to standard cardiotocography (CTG) and, if necessary, fetal blood sampling (FBS). MATERIAL AND METHODS The prospective study involves 220 pregnant women divided randomly into two groups. All 110 women in the control group were monitored by standard indirect cardiotocography. Interpretation of CTG findings was performed according to the latest revision of the FIGO classification. We performed FBS in all cases of abnormal or suspect CTG tracings, and pH values below 7.20 were indicative for urgent delivery. If pH values were in the range between 7.20 and 7.25, another FBS was carried out after 30 minutes. If pH was above 7.25, FBS was repeated according to CTG evaluation by the attending doctor. All patients in the index group (110 women) were monitored by indirect quantitative cardiotocography (qCTG). Our clinical practice guidelines were used for the interpretation of CTG tracings. Obstetric behavior was strictly based on the recommendations in these guidelines. Outcome measures are: incidence of metabolic acidosis (defined by pH < 7.05 and BE > -12 mmol/l in the umbilical cord artery of the newborn), number of instrumental deliveries, and sensitivity/specificity of each method (qCTG and CTG + FBS) in relation to fetal hypoxia (defined by pH < 7.20). RESULTS In the CTG group, 4.4% of the newborns were affected by metabolic acidosis as opposed to 2.2% in the qCTG group (P > 0.05).
The sensitivity/specificity rates for fetal hypoxia were 89%/67% (control group) and 97%/85% (index group). The number of operative deliveries was 27.25% and 16.35%, respectively (P < 0.05). CONCLUSION The incidence of metabolic acidosis was comparable in both groups. Indirect quantitative cardiotocography shows much better specificity for fetal hypoxia, which results in significantly lower rates of operative deliveries compared to standard indirect cardiotocography combined with fetal blood sampling
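Sensitivity and specificity figures like those above come straight from a 2x2 confusion matrix; a minimal sketch (the counts are hypothetical, since the abstract reports only the rates):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for hypoxic (pH < 7.20) vs non-hypoxic newborns:
sens, spec = sensitivity_specificity(tp=33, fn=1, tn=65, fp=11)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")
```

The trade-off described in the conclusion is visible here: fewer false positives (higher specificity) means fewer unnecessary operative deliveries.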
2,037
30,011,314
Intervention studies and guidelines did not reference evidence to support the effectiveness of the components included in the interventions for South Asian populations in particular.
Intervention trials and guidelines for the prevention of type 2 diabetes (T2D) in populations of South Asian origin often include strategies to improve diet and physical activity that are based on those developed for other populations. These may be suboptimal for the South Asian target populations. We aimed to provide an overview of the recommended dietary and physical activity components included, and to identify whether these were supported by evidence of their effectiveness.
Low serum 25-hydroxyvitamin D (25(OH)D) has been shown to correlate with increased risk of type 2 diabetes. Small, observational studies suggest an action for vitamin D in improving insulin sensitivity and/or insulin secretion. The objective of the present study was to investigate the effect of improved vitamin D status on insulin resistance (IR), utilising a randomised, controlled, double-blind intervention administering 100 microg (4000 IU) vitamin D(3) (n 42) or placebo (n 39) daily for 6 months to South Asian women, aged 23-68 years, living in Auckland, New Zealand. Subjects were insulin resistant - homeostasis model assessment 1 (HOMA1) > 1.93 - and had serum 25(OH)D concentration < 50 nmol/l. Exclusion criteria included diabetes medication and vitamin D supplementation > 25 microg (1000 IU)/d. The HOMA2 computer model was used to calculate outcomes. Median (25th, 75th percentiles) serum 25(OH)D(3) increased significantly from 21 (11, 40) to 75 (55, 84) nmol/l with supplementation. Significant improvements were seen in insulin sensitivity and IR (P = 0.003 and 0.02, respectively), and fasting insulin decreased (P = 0.02) with supplementation compared with placebo. There was no change in C-peptide with supplementation. IR was most improved when endpoint serum 25(OH)D reached > or = 80 nmol/l. Secondary outcome variables (lipid profile and high-sensitivity C-reactive protein) were not affected by supplementation. In conclusion, improving vitamin D status in insulin-resistant women resulted in improved IR and sensitivity, but no change in insulin secretion. Optimal vitamin D concentrations for reducing IR were shown to be 80-119 nmol/l, providing further evidence for an increase in the recommended adequate levels. Registered Trial No.
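The HOMA1 inclusion cutoff used above (> 1.93) is computed from fasting glucose and insulin; a sketch of the standard HOMA1-IR formula (the trial's outcomes used the separate HOMA2 computer model, which this simple formula does not reproduce):

```python
def homa1_ir(fasting_glucose_mmol_l, fasting_insulin_uU_ml):
    """HOMA1 insulin-resistance index:
    (fasting glucose [mmol/l] x fasting insulin [uU/ml]) / 22.5.
    Note: the HOMA2 values reported in the trial come from the HOMA2
    computer model, which uses a nonlinear solution, not this formula."""
    return fasting_glucose_mmol_l * fasting_insulin_uU_ml / 22.5

# Example (illustrative values): glucose 5.0 mmol/l, insulin 10 uU/ml
ir = homa1_ir(5.0, 10.0)
print(round(ir, 2))  # → 2.22, above the study's 1.93 inclusion cutoff
```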
ACTRN12607000642482 Abstract Aims/hypothesis The objective of this prevention programme was to study whether combining pioglitazone with lifestyle modification would enhance the efficacy of lifestyle modification in preventing type 2 diabetes in Asian Indians with impaired glucose tolerance. Methods In a community-based, placebo-controlled, 3-year prospective study, 407 participants with impaired glucose tolerance (mean age 45.3 ± 6.2 years, mean BMI 25.9 ± 3.3 kg/m2) were sequentially grouped to receive either lifestyle modification plus pioglitazone, 30 mg (n = 204), or lifestyle modification plus placebo (n = 203). The participants and investigators were blinded to the assignment. The primary outcome was development of diabetes. Results At baseline, both groups had similar demographic, anthropometric and biochemical characteristics. At year 3, the response rate was 90.2%. The cumulative incidence of diabetes was 29.8% with pioglitazone and 31.6% with placebo (unadjusted HR 1.084 [95% CI 0.753-1.560], p = 0.665). Normoglycaemia was achieved in 40.9% and 32.3% of participants receiving pioglitazone and placebo, respectively (p = 0.109). In the pioglitazone group, two deaths and two non-fatal hospitalisations occurred due to cardiac problems; in the placebo group there were two occurrences of cardiac disease. Conclusions/interpretation Despite good adherence to lifestyle modification and drug therapy, no additional effect of pioglitazone was seen above that achieved with placebo. The effectiveness of the intervention in both groups was comparable with that of lifestyle modification alone, as reported from the Indian Diabetes Prevention Programme-1. The results are at variance with studies that showed significant relative risk reduction in conversion to diabetes with pioglitazone in Americans with IGT. An ethnicity-related difference in the action of pioglitazone in non-diabetic participants may be one explanation.
Trial registration: ClinicalTrials.gov NCT00276497 Funding: This study was funded by the India Diabetes Research Background Expert bodies and health organisations recommend that adults undertake at least 150 min.week−1 of moderate-intensity physical activity (MPA). However, the underpinning data largely emanate from studies of populations of European descent. It is unclear whether this level of activity is appropriate for other ethnic groups, particularly South Asians, who have increased cardio-metabolic disease risk compared to Europeans. The aim of this study was to explore the level of MPA required in South Asians to confer a similar cardio-metabolic risk profile to that observed in Europeans undertaking the currently recommended MPA level of 150 min.week−1. Methods Seventy-five South Asian and 83 European men, aged 40–70, without cardiovascular disease or diabetes had fasted blood taken, blood pressure measured, physical activity assessed objectively (using accelerometry), and anthropometric measures made. Factor analysis was used to summarise measured risk biomarkers into underlying latent 'factors' for glycaemia, insulin resistance, lipid metabolism, blood pressure, and overall cardio-metabolic risk. Age-adjusted regression models were used to determine the equivalent level of MPA (in bouts of ≥10 minutes) in South Asians needed to elicit the same value in each factor as Europeans undertaking 150 min.week−1 MPA. Findings For all factors, except blood pressure, equivalent MPA values in South Asians were significantly higher than 150 min.week−1; the equivalent MPA value for the overall cardio-metabolic risk factor was 266 (95% CI 185-347) min.week−1. Conclusions South Asian men may need to undertake greater levels of MPA than Europeans to exhibit a similar cardio-metabolic risk profile, suggesting that a conceptual case can be made for ethnicity-specific physical activity guidance.
Further study is needed to extend these findings to women and to replicate them prospectively in a larger cohort Background — Body mass index (BMI) is widely used to assess risk for cardiovascular disease and type 2 diabetes. Cut points for the classification of obesity (BMI > 30 kg/m2) have been developed and validated among people of European descent. It is unknown whether these cut points are appropriate for non-European populations. We assessed the metabolic risk associated with BMI among South Asians, Chinese, Aboriginals, and Europeans. Methods and Results — We randomly sampled 1078 subjects from 4 ethnic groups (289 South Asians, 281 Chinese, 207 Aboriginals, and 301 Europeans) from 4 regions in Canada. Principal components factor analysis was used to derive underlying latent or "hidden" factors associated with 14 clinical and biochemical cardiometabolic markers. Ethnic-specific BMI cut points were derived for 3 cardiometabolic factors. Three primary latent factors emerged that accounted for 56% of the variation in markers of glucose metabolism, lipid metabolism, and blood pressure. For a given BMI, elevated levels of glucose- and lipid-related factors were more likely to be present in South Asians, Chinese, and Aboriginals compared with Europeans, and elevated levels of the blood pressure-related factor were more likely to be present among Chinese compared with Europeans. The cut point to define obesity, as defined by the distribution of glucose and lipid factors, is lower by ≈6 kg/m2 among non-European groups compared with Europeans. Conclusions — Revisions may be warranted for BMI cut points to define obesity among South Asians, Chinese, and Aboriginals.
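BMI itself is just weight divided by height squared; a sketch of how the abstract's suggestion of ethnicity-specific obesity cut points (≈6 kg/m2 lower for the non-European groups studied) could be encoded. The exact thresholds below are illustrative assumptions, not values reported by the paper:

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) / height (m) squared."""
    return weight_kg / height_m ** 2

# Assumed cut points for illustration: European 30 kg/m2 (the validated value),
# non-European groups ~6 kg/m2 lower per the abstract's glucose/lipid analysis.
CUT_POINTS = {"European": 30.0, "South Asian": 24.0, "Chinese": 24.0, "Aboriginal": 24.0}

def obese(weight_kg, height_m, ethnicity):
    return bmi(weight_kg, height_m) >= CUT_POINTS[ethnicity]

print(obese(75, 1.7, "European"))     # BMI ~26.0 → False under the 30 cut point
print(obese(75, 1.7, "South Asian"))  # same BMI → True under the lower cut point
```

The same body size can therefore fall on opposite sides of the obesity threshold depending on which cut point applies, which is the abstract's point about underestimated burden.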
Using these revised cut points would greatly increase the estimated burden of obesity-related metabolic disorders among non-European populations High prevalence of type 2 diabetes (T2D) is seen in some immigrant groups in Western countries, particularly in those from the Indian subcontinent. Our aims were to increase the physical activity (PA) level in a group of Pakistani immigrant men, and to see whether any increase was associated with reduced serum glucose and insulin concentrations. The intervention was developed in collaboration with the Pakistani community. It used a social cognitive theory framework and consisted of structured supervised group exercises, group lectures, individual counselling and telephone follow-up. One hundred and fifty physically inactive Pakistani immigrant men living in Oslo, Norway, were randomised to either a control group or an intervention group. The 5-month intervention focused on increasing levels of PA, which were assessed by use of accelerometer (Actigraph MTI 7164) recordings. Risk of diabetes was assessed by serum glucose and insulin concentrations determined in a fasted state, and after an oral glucose tolerance test (OGTT). ANCOVA was used to assess differences between groups. There was a mean difference in PA between the two groups of 49 counts per minute per day, representing a 15% (95% CI = 8.7-21.2; P = 0.01) higher increase in total PA level in the intervention group than in the control group. Insulin values taken 2 h after an OGTT were reduced in the intervention group by 27% (95% CI = 18.9-35.0; P = 0.02) more than those in the control group. There were no differences in fasting or postprandial glucose values between the groups at the follow-up test.
This type of intervention can increase PA and reduce serum insulin in Pakistani immigrant men, thereby presumably reducing their risk of T2D Introduction Type 2 diabetes (T2D) is a major health concern among populations of South Asian ethnicity. Although dietary and physical activity interventions may reduce the risk of T2D, the effectiveness has been moderate among South Asians. This might (in part) be because this subgroup follows strategies that were originally developed for interventions among other populations. Therefore, this review aims to assess the evidence for the current dietary and physical activity strategies recommended in T2D prevention intervention studies and guidelines for South Asians. Methods and analysis Included will be all studies and guidelines on dietary and/or physical activity strategies to prevent T2D in adult South Asians. Two reviewers will search online databases from their start until the present date for published and unpublished experimental/quasi-experimental studies with at least an abstract in English. References of identified articles and key reviews will be screened for additional studies. Guidelines will be identified by searches in online databases and websites of public organisations. Finally, expert consultations will be held to supplement any missing information. Trial quality will be assessed with the Quality Assessment Tool for Quantitative Studies, and guidelines with the Appraisal of Guidelines for Research & Evaluation II. Data on the strategies recommended, targeting and evidence on effectiveness will be extracted by two reviewers and presented in tabular and narrative forms. Recommendations will be compared with the National Institute for Health and Care Excellence guidelines [PH35]. Overall findings on dietary and physical activity recommendations, as well as findings for specific subgroups (e.g., by sex), will be discussed.
Ethics and dissemination Ethics assessment is not required. Start date: 1 January 2016; finishing and reporting date: 31 July 2016. Results will be published in a peer-reviewed scientific journal, the project report of EuroDHYAN (www.eurodhyan.eu) and in a PhD dissertation. Trial registration number The protocol is registered with the International Prospective Register of Systematic Reviews (PROSPERO), registration number CRD42015027067 AIMS To assess the beneficial effects of the components of lifestyle intervention in reducing incidence of diabetes in Asian Indian men with impaired glucose tolerance (IGT) in India. METHODS This analysis was based on a 2-year prospective, randomized controlled primary prevention trial in a cohort of Asian Indian men with IGT (n=537) (Clinical Trial No: NCT00819455). Intervention and control groups were given standard care advice at baseline. Additionally, the intervention group received frequent, mobile phone based text message reminders on healthy lifestyle principles. Dietary intake and physical activity habits were recorded by validated questionnaires. The lifestyle goals were: reductions in consumption of carbohydrates, oil and portion size, reduction in body mass index of at least 1 unit (1 kg/m(2)) from baseline, and maintenance of good physical activity. The association between diabetes and lifestyle goals achieved was assessed using multiple logistic regression analyses. Changes in insulin sensitivity (Matsuda's insulin sensitivity index) and oral disposition index during the follow-up were assessed. RESULTS At the end of the study, 123 (23.8%) participants developed diabetes. The mean lifestyle score was higher in the intervention group compared with control (2.59 ± 1.13 vs. 2.28 ± 1.17; P=0.002). Among the 5 lifestyle variables, significant improvements in the 3 dietary goals were seen with intervention.
Concomitant improvement in insulin sensitivity and oral disposition index was noted. A higher lifestyle score was associated with lower risk of developing diabetes (odds ratio: 0.54 [95% CI: 0.44-0.70]; P<0.0001). CONCLUSIONS Beneficial effects of intervention were associated with increased compliance with lifestyle goals. The plausible mechanism is through improvement in insulin sensitivity and beta cell preservation Purpose The purpose of this study was to describe the feasibility of using a community-based participatory research (CBPR) approach to implement the Power to Prevent (P2P) diabetes prevention education curriculum in rural African American (AA) settings. Methods Trained community health workers facilitated the 12-session P2P curriculum across 3 community settings. Quantitative (based on the pre- and post-curriculum questionnaires and changes in blood glucose, blood pressure [BP], and weight at baseline and 6 months) and qualitative data (based on semi-structured interviews with facilitators) were collected. Indicators of feasibility included demand, acceptability, implementation fidelity, and limited efficacy testing. Results Across 3 counties, 104 AA participants were recruited; 43% completed ≥ 75% of the sessions. There was great demand for the program. Fifteen community health ambassadors (CHAs) were trained, and 4 served as curriculum facilitators. Content and structure of the intervention were acceptable to facilitators but there were challenges to implementing the program as designed. Improvements were seen in diabetes knowledge and in understanding the impact of healthy eating and physical activity on diabetes prevention, but there were no significant changes in blood glucose, BP, or weight.
Conclusion While it is feasible to use a CBPR approach to recruit participants and implement the P2P curriculum in AA community settings, there are significant challenges that must be overcome Background India currently has more than 60 million people with Type 2 Diabetes Mellitus (T2DM) and this is predicted to increase by nearly two-thirds by 2030. While management of those with T2DM is important, preventing or delaying the onset of the disease, especially in those individuals at 'high risk' of developing T2DM, is urgently needed, particularly in resource-constrained settings. This paper describes the protocol for a cluster randomised controlled trial of a peer-led lifestyle intervention program to prevent diabetes in Kerala, India. Methods/design A total of 60 polling booths are randomised to the intervention arm or control arm in rural Kerala, India. Data collection is conducted in two steps. Step 1 (Home screening): Participants aged 30-60 years are administered a screening questionnaire. Those having no history of T2DM or other chronic illnesses, with an Indian Diabetes Risk Score value of ≥60, are invited to attend a mobile clinic (Step 2). At the mobile clinic, participants complete questionnaires, undergo physical measurements, and provide blood samples for biochemical analysis. Participants identified with T2DM at Step 2 are excluded from further study participation. Participants in the control arm are provided with a health education booklet containing information on symptoms, complications, and risk factors of T2DM with the recommended levels for primary prevention.
Participants in the intervention arm receive: (1) eleven peer-led small group sessions to motivate, guide and support in planning, initiation and maintenance of lifestyle changes; (2) two diabetes prevention education sessions led by experts to raise awareness of T2DM risk factors, prevention and management; (3) a participant handbook containing information primarily on peer support and its role in assisting with lifestyle modification; (4) a participant workbook to guide self-monitoring of lifestyle behaviours, goal setting and goal review; (5) the health education booklet that is given to the control arm. Follow-up assessments are conducted at 12 and 24 months. The primary outcome is incidence of T2DM. Secondary outcomes include behavioural, psychosocial, clinical, and biochemical measures. An economic evaluation is planned. Discussion Results from this trial will contribute to improved policy and practice regarding lifestyle intervention programs to prevent diabetes in India and other resource-constrained settings. Trial registration Australia and New Zealand Clinical Trials Registry: ACTRN12611000262909 Background To investigate the prevalence of obesity and diabetes among adult men and women in India consuming different types of vegetarian diets compared with those consuming non-vegetarian diets. Methods We used cross-sectional data of 156,317 adults aged 20-49 years who participated in India's third National Family Health Survey (2005-06). Associations between types of vegetarian diet (vegan, lacto-vegetarian, lacto-ovo vegetarian, pesco-vegetarian, semi-vegetarian and non-vegetarian) and self-reported diabetes status and measured body mass index (BMI) were estimated using multivariable logistic regression adjusting for age, gender, education, household wealth, rural/urban residence, religion, caste, smoking, alcohol use, and television watching.
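The adjusted odds ratios in the survey described above come from multivariable logistic regression; as background, a crude (unadjusted) odds ratio with a Wald 95% CI can be sketched from a plain 2x2 table. The counts here are made up for illustration, not taken from the survey:

```python
import math

def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Crude odds ratio (ad/bc) from a 2x2 table, with a Wald 95% CI
    computed on the log-odds scale. The survey's published ORs are
    covariate-adjusted regression estimates, not this crude version."""
    or_ = (exposed_cases * unexposed_noncases) / (exposed_noncases * unexposed_cases)
    se = math.sqrt(1 / exposed_cases + 1 / exposed_noncases
                   + 1 / unexposed_cases + 1 / unexposed_noncases)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Illustrative counts: diabetes among lacto-vegetarians vs non-vegetarians
or_, ci = odds_ratio(90, 9910, 130, 9870)
print(round(or_, 2), tuple(round(x, 2) for x in ci))
```

An OR below 1 with a CI excluding 1, as for the lacto-vegetarian diet in the abstract, indicates lower odds of diabetes relative to the non-vegetarian reference group.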
Results Mean BMI was lowest in pesco-vegetarians (20.3 kg/m2) and vegans (20.5 kg/m2) and highest in those on lacto-ovo vegetarian (21.0 kg/m2) and lacto-vegetarian (21.2 kg/m2) diets. Prevalence of diabetes was 0.9% in persons consuming lacto-vegetarian (95% CI: 0.8-1.1), lacto-ovo vegetarian (95% CI: 0.6-1.3) and semi-vegetarian (95% CI: 0.7-1.1) diets, and was highest in persons consuming a pesco-vegetarian diet (1.4%; 95% CI: 1.0-2.0). Consumption of a lacto- (OR: 0.67; 95% CI: 0.58-0.76; p < 0.01), lacto-ovo (OR: 0.70; 95% CI: 0.51-0.96; p = 0.03) and semi-vegetarian (OR: 0.77; 95% CI: 0.60-0.98; p = 0.03) diet was associated with a lower likelihood of diabetes than a non-vegetarian diet in the adjusted analyses. Conclusions In this large, nationally representative sample of Indian adults, lacto-, lacto-ovo and semi-vegetarian diets were associated with a lower likelihood of diabetes. These findings may assist in the development of interventions to address the growing burden of overweight/obesity and diabetes in the Indian population. However, prospective studies with better measures of dietary intake and clinical measures of diabetes are needed to clarify this relationship OBJECTIVES To study the effectiveness of yoga intervention on oxidative stress, glycemic status, blood pressure and anthropometry in prediabetes. DESIGN Randomized controlled trial. PARTICIPANTS Twenty-nine prediabetes subjects aged 30-75 years. SETTING Yoga was conducted at 4 different community diabetes clinics in Mangalore, India. INTERVENTIONS Participants were randomized to either 3-month yoga or wait-list control groups. MAIN OUTCOME MEASURES Malondialdehyde, glutathione, vitamin C, vitamin E, superoxide dismutase, plasma glucose, glycated haemoglobin, BMI, waist circumference, waist-to-hip ratio and blood pressure.
RESULTS Yoga intervention resulted in a significant decline in malondialdehyde (p<0.001) relative to the control group. In comparison with the control, there was a significant improvement in BMI, waist circumference, systolic blood pressure and fasting glucose levels at follow-up. No significant improvement in glycated haemoglobin, waist-to-hip ratio or any of the antioxidants was observed. CONCLUSIONS Yoga intervention may be helpful in control of oxidative stress in prediabetes subjects. Yoga can also be beneficial in reduction of BMI, waist circumference, systolic blood pressure and fasting glucose. An effect of yoga on antioxidant parameters was not evident in this study. The findings of this study need to be confirmed in larger trials involving active control groups This study used an experimental, pretest-posttest control group repeated measures design to evaluate the effectiveness of a community-based culturally appropriate lifestyle intervention program to reduce the risk for type 2 diabetes (T2DM) among Gujarati Asian Indians (AIs) in an urban community in the US. Participants included 70 adult AIs in the greater Houston metropolitan area. The primary outcomes were reduction in weight and hemoglobin A1c (HbA1c) and improvement in physical activity. Participants were screened for risk factors and randomly assigned to a 12-week group-based lifestyle intervention program (n = 34) or a control group (n = 36) that received standard print material on diabetes prevention. Participants also completed clinical measures and self-reported questionnaires about physical activity, social, and lifestyle habits at 0, 3, and 6 months. No significant baseline differences were noted between groups.
While a significant decline in weight and increase in physical activity were observed in all participants, the intervention group lowered their HbA1c (p < 0.0005) and waist circumference (p = 0.04) significantly as compared to the control group. Findings demonstrated that participation in a culturally tailored lifestyle intervention program in a community setting can effectively reduce weight, waist circumference, and HbA1c among Gujarati AIs living in the US OBJECTIVE To describe 1) the lifestyle intervention used in the Finnish Diabetes Prevention Study, 2) short- and long-term changes in diet and exercise behavior, and 3) the effect of the intervention on glucose and lipid metabolism. RESEARCH DESIGN AND METHODS There were 522 middle-aged, overweight subjects with impaired glucose tolerance who were randomized to either a usual care control group or an intensive lifestyle intervention group. The control group received general dietary and exercise advice at baseline and had an annual physician's examination. The subjects in the intervention group received additional individualized dietary counseling from a nutritionist. They were also offered circuit-type resistance training sessions and advised to increase overall physical activity. The intervention was most intensive during the first year, followed by a maintenance period. The intervention goals were to reduce body weight, reduce total and saturated dietary fat, and increase physical activity and dietary fiber. RESULTS The intervention group showed significantly greater improvement in each intervention goal. After 1 and 3 years, weight reductions were 4.5 and 3.5 kg in the intervention group and 1.0 and 0.9 kg in the control group, respectively. Measures of glycemia and lipemia improved more in the intervention group.
CONCLUSIONS The intensive lifestyle intervention produced long-term beneficial changes in diet, physical activity, and clinical and biochemical parameters and reduced diabetes risk. This type of intervention is a feasible option to prevent type 2 diabetes and should be implemented in the primary health care system Type 2 diabetes is extremely common in South Asians; e.g., in men from Pakistani and Indian populations it is about three times as likely as in the general population in England, despite similarities in body mass index. Lifestyle interventions reduce the incidence of diabetes. Trials in Europe and North America have not, however, reported on the impact on South Asian populations separately or provided the details of their cross-cultural adaptation processes. Prevention of diabetes and obesity in South Asians (PODOSA) is a randomized, controlled trial in Scotland of an adapted lifestyle intervention aimed at reducing weight and increasing physical activity to reduce type 2 diabetes in Indians and Pakistanis. The trial was adapted from the Finnish Diabetes Prevention Study. We describe, reflect on and discuss the following key issues: the core adaptations to the trial design, particularly the delivery of the intervention in homes by dietitians rather than in clinics; the use of both a multilingual panel and professional translators to help translate and/or develop materials; the processes and challenges of phonetic translation; and how intervention resources were adapted, modified, newly developed and translated into Urdu and Gurmukhi (written Punjabi).
The insights gained in PODOSA (including time pressures on investigators, imperfections in the adaptation process, the power of verbal rather than written information, the utilization of English and the mother-tongue languages simultaneously by participants, and the costs) might help the research community, given the challenge of health promotion in multi-ethnic, urban societies BACKGROUND Type 2 diabetes can often be prevented by lifestyle modification; however, successful lifestyle intervention programmes are labour intensive. Mobile phone messaging is an inexpensive alternative way to deliver educational and motivational advice about lifestyle modification. We aimed to assess whether mobile phone messaging that encouraged lifestyle change could reduce incident type 2 diabetes in Indian Asian men with impaired glucose tolerance. METHODS We did a prospective, parallel-group, randomised controlled trial between Aug 10, 2009, and Nov 30, 2012, at ten sites in southeast India. Working Indian men (aged 35-55 years) with impaired glucose tolerance were randomly assigned (1:1) with a computer-generated randomisation sequence to a mobile phone messaging intervention or standard care (control group). Participants in the intervention group received frequent mobile phone messages compared with controls who received standard lifestyle modification advice at baseline only. Field staff and participants were, by necessity, not masked to study group assignment, but allocation was concealed from laboratory personnel as well as principal and co-investigators. The primary outcome was incidence of type 2 diabetes, analysed by intention to treat. This trial is registered with ClinicalTrials.gov, number NCT00819455. RESULTS We assessed 8741 participants for eligibility. 537 patients were randomly assigned to either the mobile phone messaging intervention (n=271) or standard care (n=266).
The cumulative incidence of type 2 diabetes was lower in those who received mobile phone messages than in controls: 50 (18%) participants in the intervention group developed type 2 diabetes compared with 73 (27%) in the control group (hazard ratio 0·64, 95% CI 0·45-0·92; p=0·015). The number needed to treat to prevent one case of type 2 diabetes was 11 (95% CI 6-55). One patient in the control group died suddenly at the end of the first year. We recorded no other serious adverse events. INTERPRETATION Mobile phone messaging is an effective and acceptable method to deliver advice and support towards lifestyle modification to prevent type 2 diabetes in men at high risk. FUNDING The UK India Education and Research Initiative, the World Diabetes Foundation Aims/hypothesis Lifestyle modification helps in the primary prevention of diabetes in multiethnic American, Finnish and Chinese populations. In a prospective community-based study, we tested whether the progression to diabetes could be influenced by interventions in native Asian Indians with IGT who were younger, leaner and more insulin resistant than the above populations. Methods We randomised 531 (421 men, 110 women) subjects with IGT (mean age 45.9±5.7 years, BMI 25.8±3.5 kg/m2) into four groups. Group 1 was the control, Group 2 was given advice on lifestyle modification (LSM), Group 3 was treated with metformin (MET) and Group 4 was given LSM plus MET. The primary outcome measure was type 2 diabetes as diagnosed using World Health Organization criteria. Results The median follow-up period was 30 months, and the 3-year cumulative incidences of diabetes were 55.0%, 39.3%, 40.5% and 39.5% in Groups 1-4, respectively. The relative risk reduction was 28.5% with LSM (95% CI 20.5-37.3, p=0.018), 26.4% with MET (95% CI 19.1-35.1, p=0.029) and 28.2% with LSM + MET (95% CI 20.3-37.0, p=0.022), as compared with the control group.
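The effect sizes reported in these two trials can be recomputed directly from the raw incidences; a quick check of the number needed to treat (NNT = 1 / absolute risk reduction) and relative risk reduction (RRR):

```python
def arr_nnt(control_risk, treated_risk):
    """Absolute risk reduction and the number needed to treat (1/ARR)."""
    arr = control_risk - treated_risk
    return arr, 1 / arr

def rrr(control_risk, treated_risk):
    """Relative risk reduction: ARR as a fraction of the control risk."""
    return (control_risk - treated_risk) / control_risk

# Mobile-phone trial: 73/266 controls vs 50/271 intervention developed diabetes.
arr, nnt = arr_nnt(73 / 266, 50 / 271)
print(round(nnt))  # → 11, matching the reported NNT

# IDPP: 3-year cumulative incidence 55.0% (control) vs 39.3% (LSM).
print(round(100 * rrr(0.550, 0.393), 1))   # → 28.5, the reported RRR (%)
print(round(arr_nnt(0.550, 0.393)[1], 1))  # → 6.4, the reported NNT
```

Both sets of published figures reproduce from the incidences given in the abstracts, which is a useful sanity check when reading pooled results.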
The number needed to treat to prevent one incident case of diabetes was 6.4 for LSM, 6.9 for MET and 6.5 for LSM + MET. Conclusions/interpretation Progression of IGT to diabetes is high in native Asian Indians. Both LSM and MET significantly reduced the incidence of diabetes in Asian Indians with IGT; there was no added benefit from combining them The objective of the present study was to explore whether a culturally adapted lifestyle education programme would improve the risk factor profile for type 2 diabetes (T2D) and the metabolic syndrome (MetS) among Pakistani immigrant women in Oslo, Norway. The randomised controlled trial (the InnvaDiab study), lasting 7 ± 1 months, comprised six educational sessions about blood glucose, physical activity and diet. Participants (age 25-62 years) were randomised into either a control (n 97) or an intervention (n 101) group. Primary outcome variables were fasting and 2 h blood glucose, and secondary outcome variables were fasting levels of insulin, C-peptide, lipids, glycated Hb, BMI, waist circumference and blood pressure, measured 1-3 weeks before and after the intervention. During the intervention period, the mean fasting blood glucose decreased by 0·16 (95% CI -0·27, -0·05) mmol/l in the intervention group, and remained unchanged in the control group (difference between the groups, P=0·022). Glucose concentration 2 h after the oral glucose tolerance test decreased by 0·53 (95% CI -0·84, -0·21) mmol/l in the intervention group, but not significantly more than in the control group. A larger reduction in fasting insulin was observed in the intervention group than in the control group (between-group difference, P=0·036).
Among the individuals who attended four or more of the educational sessions (n 59), we found a more pronounced decrease in serum TAG (-0·1 (95% CI -0·24, 0·07) mmol/l) and BMI (-0·48 (95% CI -0·78, -0·18) kg/m²) compared with the control group. During the intervention period, there was a significant increase in participants having the MetS in the control group (from 41 to 57%), which was not seen in the intervention group (from 44 to 42%). Participation in a culturally adapted education programme may improve risk factors for T2D and prevent the development of the MetS in Pakistani immigrant women AIMS There are a number of studies showing that zinc supplementation may improve glucose handling in people with established diabetes. We sought to investigate whether this zinc-dependent improvement in glucose handling could potentially be harnessed to prevent the progression of pre-diabetes to diabetes. In this double-blind randomized placebo-controlled trial, we determined participants' fasting blood glucose levels (FBG) and Homeostasis Model Assessment (HOMA) parameters (beta cell function, insulin sensitivity and insulin resistance) at baseline and after 6 months of zinc supplementation. METHODS The Bangladesh Institute of Health Sciences Hospital (BIHS) (Mirpur, Dhaka, Bangladesh) database was used to identify 224 patients with prediabetes, of whom 55 met the inclusion criteria and agreed to participate. The participants were randomized either to the intervention or control group using block randomization. The groups received either a 30 mg zinc sulphate dispersible tablet or placebo, once daily for six months. RESULTS After six months, the intervention group significantly improved their FBG concentration compared to the placebo group (5.37±0.20 mmol/L vs 5.69±0.26, p<0.001) as well as compared to their own baseline (5.37±0.20 mmol/L vs 5.8±0.09, p<0.001).
Beta cell function, insulin sensitivity and insulin resistance all showed a statistically significant improvement as well. CONCLUSION To our knowledge this is the first trial to show an improvement in glucose handling using HOMA parameters in participants with prediabetes. Larger randomized controlled trials are warranted to confirm these findings and to explore clinical endpoints The study on lifestyle-intervention and impaired glucose tolerance Maastricht (SLIM) is a 3-year randomised clinical trial designed to evaluate the effect of a combined diet and physical activity intervention program on glucose tolerance in a Dutch population at increased risk for developing type 2 diabetes. Here the design of the lifestyle-intervention study is described and results are presented from the preliminary population screening, conducted between March 1999 and June 2000. In total, 2,820 subjects with an increased risk of having disturbances in glucose homeostasis (i.e. age > 40 years and BMI > 25 kg/m(2) or a family history of diabetes) underwent a first oral glucose tolerance test (OGTT). Abnormal glucose homeostasis was detected in 826 subjects (30.4%): 226 type 2 diabetes (type 2DM, 8.3%), 215 impaired fasting glucose (IFG, 7.9%) and 385 impaired glucose tolerance (IGT, 14.2%). Both increasing age and BMI were strongly related to the prevalence of IGT and diabetes. After a second OGTT, 114 subjects with glucose intolerance and in otherwise good health were eligible for participation in the intervention study (SLIM). The high prevalence of disturbances in glucose homeostasis observed in the preliminary screening underscores the importance of early (lifestyle) interventions in those at risk for developing diabetes. SLIM will address this topic in the Dutch population Physical inactivity and low birth weight (LBW) may lead to an increased risk for developing type 2 diabetes.
The extent to which LBW individuals may benefit from physical exercise training when compared with normal birth weight (NBW) controls is uncertain. We assessed the impact of an outdoor exercise intervention on body composition, insulin secretion and action in young men born with LBW and NBW in rural India. A total of 61 LBW and 56 NBW healthy young men were recruited into the study. The individuals were instructed to perform outdoor bicycle exercise training for 45 min every day. Fasting blood samples, intravenous glucose tolerance tests and bioimpedance body composition assessment were carried out. Physical activity was measured using combined accelerometry and heart rate monitoring during the first and the last week of the intervention. Following the exercise intervention, the LBW group displayed an increase in physical fitness level [55.0 ml O2/kg min (52.0-58.0) to 57.5 ml O2/kg min (54.4-60.5)] and total fat-free mass [10.9% (8.0-13.4) to 11.4% (8.0-14.6)], as well as a corresponding decline in the ratio of total fat mass/fat-free mass. In contrast, an increase in total fat percentage as well as total fat mass was observed in the NBW group. After intervention, fasting plasma insulin levels and homoeostasis model assessments (HOMA) of insulin resistance (HOMA-IR) and insulin secretion (HOMA-IS) improved to the same extent in both groups. In summary, young men born with LBW in rural India benefit metabolically from exercise training to an extent comparable with NBW controls BACKGROUND The Cochrane Collaboration is strongly encouraging the use of a newly developed tool, the Cochrane Collaboration Risk of Bias Tool (CCRBT), for all review groups. However, the psychometric properties of this tool to date have yet to be described.
Thus, the objective of this study was to add information about psychometric properties of the CCRBT, including inter-rater reliability and concurrent validity, in comparison with the Effective Public Health Practice Project Quality Assessment Tool (EPHPP). METHODS Both tools were used to assess the methodological quality of 20 randomized controlled trials included in our systematic review of the effectiveness of knowledge translation interventions to improve the management of cancer pain. Each study assessment was completed independently by two reviewers using each tool. We analysed the inter-rater reliability of each tool's individual domains, as well as the final grade assigned to each study. RESULTS The EPHPP had fair inter-rater agreement for individual domains and excellent agreement for the final grade. In contrast, the CCRBT had slight inter-rater agreement for individual domains and fair inter-rater agreement for final grade. Of interest, no agreement between the two tools was evident in their final grade assigned to each study. Although both tools were developed to assess 'quality of the evidence', they appear to measure different constructs. CONCLUSIONS Both tools performed quite differently when evaluating the risk of bias or methodological quality of studies in knowledge translation interventions for cancer pain. The newly introduced CCRBT assigned these studies a higher risk of bias. Its psychometric properties need to be more thoroughly validated, in a range of research fields, to understand fully how to interpret results from its application BACKGROUND The susceptibility to type 2 diabetes of people of south Asian descent is established, but there is little trial-based evidence for interventions to tackle this problem. We assessed a weight control and physical activity intervention in south Asian individuals in the UK.
METHODS We did this non-blinded trial in two National Health Service (NHS) regions in Scotland (UK). Between July 1, 2007, and Oct 31, 2009, we recruited men and women of Indian and Pakistani origin, aged 35 years or older, with waist circumference 90 cm or greater in men or 80 cm or greater in women, and with impaired glucose tolerance or impaired fasting glucose determined by oral glucose tolerance test. Families were randomised (using a random number generator program, with permuted blocks of random size, stratified by location [Edinburgh or Glasgow], ethnic group [Indian or Pakistani], and number of participants in the family [one vs more than one]) to intervention or control. Participants in the same family were not randomised separately. The intervention group received 15 visits from a dietitian over 3 years and the control group received four visits in the same period. The primary outcome was weight change at 3 years. Analysis was by modified intention to treat, excluding participants who died or were lost to follow-up. We used linear regression models to provide mean differences in baseline-adjusted weight at 3 years. This trial is registered, number ISRCTN25729565. FINDINGS Of 1319 people who were screened with an oral glucose tolerance test, 196 (15%) had impaired glucose tolerance or impaired fasting glucose and 171 entered the trial. Participants were in 156 family clusters that were randomised (78 families with 85 participants were allocated to intervention; 78 families with 86 participants were allocated to control). 167 (98%) participants in 152 families completed the trial. Mean weight loss in the intervention group was 1.13 kg (SD 4.12), compared with a mean weight gain of 0.51 kg (3.65) in the control group, an adjusted mean difference of -1.64 kg (95% CI -2.83 to -0.44).
INTERPRETATION Modest, medium-term changes in weight are achievable as a component of lifestyle-change strategies, which might control or prevent adiposity-related diseases. FUNDING National Prevention Research Initiative, NHS Research and Development; NHS National Services Scotland; NHS Health Scotland BACKGROUND Randomized trials can show whether a treatment effect is statistically significant and can describe the size of the effect. There are, however, no validated methods available for establishing the clinical relevance of these outcomes. Recently, it was proposed that a standardized mean difference (SMD) of 0.50 be used as the cutoff for clinical relevance in the treatment of depression. METHODS We explore what the effect size means and why the size of an effect has little bearing on its clinical relevance. We will also examine how the "minimally important difference," as seen from the patient perspective, may be helpful in deciding where the cutoff for clinical relevance should be placed for a given condition. RESULTS Effect sizes in themselves cannot give an indication of the clinical relevance of an intervention, because the outcome itself determines the clinical relevance and not only the size of the effects. The "minimal important difference" (MID) could be used as a starting point for pinpointing the cutoff for clinical relevance. A first, rough attempt to implement this approach for depression resulted in a tentative clinical relevance cutoff of SMD = 0.24. Using this cutoff, psychotherapy, pharmacotherapy, and combined treatment have effect sizes above this cutoff. DISCUSSION Statistical outcomes cannot be equated with clinical relevance. The "MID" may be used for pinpointing the cutoff for clinical relevance, but more work in this area is needed
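Several of the trial abstracts above quote number-needed-to-treat (NNT) and relative-risk-reduction (RRR) figures; both follow directly from the cumulative incidences in the two arms (NNT = 1 / absolute risk reduction, RRR = absolute risk reduction / control-arm risk). A minimal sketch reproducing the quoted figures (the function names are ours, not from any of the trials):

```python
def absolute_risk_reduction(p_control, p_treated):
    """Difference in cumulative incidence between the two arms."""
    return p_control - p_treated

def number_needed_to_treat(p_control, p_treated):
    """Patients to treat to prevent one event: NNT = 1 / ARR."""
    return 1.0 / absolute_risk_reduction(p_control, p_treated)

def relative_risk_reduction(p_control, p_treated):
    """RRR = ARR / control-arm risk."""
    return absolute_risk_reduction(p_control, p_treated) / p_control

# Indian IGT trial: 3-year cumulative incidence 55.0% (control) vs 39.3% (LSM)
print(round(number_needed_to_treat(0.550, 0.393), 1))          # → 6.4
print(round(100 * relative_risk_reduction(0.550, 0.393), 1))   # → 28.5

# Mobile phone messaging trial: 27% (control) vs 18% (intervention)
print(round(number_needed_to_treat(0.27, 0.18)))               # → 11
```

The same arithmetic reproduces the other quoted NNTs (6.9 for MET, 6.5 for LSM + MET); confidence intervals around the NNT, as reported in the trials, require the standard errors of the incidences and are not shown here.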
2,038
30,137,579
Increases in its outcome measure R2* (transverse relaxation rate, expressed per second) correspond to higher deoxyhaemoglobin concentrations and suggest lower oxygenation, whereas decreases in R2* indicate higher oxygenation. BOLD-MRI has shown that patients suffering from chronic kidney disease (CKD) or kidneys with severe renal artery stenosis have lower tissue oxygenation than controls. Additionally, CKD patients with the lowest cortical oxygenation have the worst renal outcome.
Abstract Tissue hypoxia plays a key role in the development and progression of many kidney diseases. Blood oxygenation level-dependent magnetic resonance imaging (BOLD-MRI) is the most promising imaging technique to monitor renal tissue oxygenation in humans. BOLD-MRI measures renal tissue deoxyhaemoglobin levels voxel by voxel. BOLD-MRI has been validated against micropuncture techniques in animals. Its reproducibility has been demonstrated in humans, provided that physiological and technical conditions are standardized. Finally, BOLD-MRI has been used to assess the influence of drugs on renal tissue oxygenation, and may offer the possibility to identify drugs with nephroprotective or nephrotoxic effects at an early stage. Unfortunately, different methods are used to prepare patients, acquire MRI data and analyse the BOLD images.
Renal tissue hypoxia is a final pathway in the development and progression of chronic kidney disease (CKD), but whether renal oxygenation predicts renal function decline in humans has not been proven. Therefore, we performed a prospective study and measured renal tissue oxygenation by blood oxygenation level-dependent magnetic resonance imaging (BOLD-MRI) in 112 patients with CKD, 47 with hypertension without CKD, and 24 healthy control individuals. Images were analyzed with the twelve-layer concentric objects method, which divides the renal parenchyma into 12 layers of equal thickness and reports the mean R2* value of each layer (a high R2* corresponds to low oxygenation), along with the change in R2* between layers, called the R2* slope. Serum creatinine values were collected to calculate the yearly change in estimated glomerular filtration rate (MDRD eGFR). Follow-up was three years. The change in eGFR in CKD, hypertensive and control individuals was -2.0, 0.5 and -0.2 ml/min/1.73m2/year, respectively. In multivariable regression analysis adjusted for age, sex, diabetes, RAS-blockers, eGFR, and proteinuria, the yearly eGFR change correlated negatively with baseline 24-hour proteinuria and the mean R2* value of the cortical layers, and positively with the R2* slope, but not with the other covariates. Patients with CKD and a high outer R2* or a flat R2* slope were three times more likely to develop an adverse renal outcome (renal replacement therapy or a more than 30% increase in serum creatinine). Thus, low cortical oxygenation is an independent predictor of renal function decline. This finding should stimulate studies exploring the therapeutic impact of improving renal oxygenation on renal disease progression BACKGROUND AND OBJECTIVES Atherosclerotic renal artery stenosis (ARAS) can reduce renal blood flow, tissue oxygenation, and GFR.
In this study, we sought to examine associations between renal hemodynamics and tissue oxygenation with single-kidney function, pressor hormones, and inflammatory biomarkers in patients with unilateral ARAS undergoing medical therapy alone or stent revascularization. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS Nonrandomized inpatient studies were performed in patients with unilateral ARAS (>60% occlusion) before and 3 months after revascularization (n=10) or medical therapy (n=20), or patients with essential hypertension (n=32) under identical conditions. The primary study outcome was change in single-kidney GFR. Individual kidney hemodynamics and volume were measured using multidetector computed tomography. Tissue oxygenation (using R2* as a measure of deoxyhemoglobin) was determined by blood oxygen level-dependent magnetic resonance imaging at 3 T. Renal vein neutrophil gelatinase-associated lipocalin (NGAL), monocyte chemoattractant protein-1 (MCP-1), and plasma renin activity were measured. RESULTS Total GFR did not change over 3 months in either group, but the stenotic kidney (STK) GFR rose over time in the stent compared with the medical group (+2.2 [-1.8 to 10.5] versus -5.3 [-7.3 to -0.3] ml/min; P=0.03). Contralateral kidney (CLK) GFR declined in the stent group (43.6±19.7 to 36.6±19.5 ml/min; P=0.03). Fractional tissue hypoxia fell in the STK (fraction R2* > 30/s: 22.1%±20% versus 14.9%±18.3%; P<0.01) after stenting. Renal vein biomarkers correlated with the degree of hypoxia in the STK: NGAL (r=0.3; P=0.01) and MCP-1 (r=0.3; P=0.02; more so after stenting). Renal vein NGAL was inversely related to renal blood flow in the STK (r=-0.65; P<0.001). Biomarkers were highly correlated between STK and CLK: NGAL (r=0.94; P<0.001) and MCP-1 (r=0.96; P<0.001). CONCLUSIONS These results showed changes over time in single-kidney GFR that were not evident in parameters of total GFR.
Furthermore, they delineate the relationship of measurable tissue hypoxia within the STK and markers of inflammation in human ARAS. Renal vein NGAL and MCP-1 indicated persistent interactions between the ischemic kidney and both CLK and systemic levels of inflammatory cytokines PURPOSE To prospectively determine the 3-year stability and potential changes of functional parameters in renal allograft recipients obtained from diffusion-weighted imaging (DWI) and blood oxygenation level-dependent (BOLD) MRI. MATERIALS AND METHODS Nine renal allograft recipients underwent DWI and BOLD-MRI twice, once 7 ± 3 months after transplantation, and again 32 ± 2 months after the first MRI. DWI yielded an apparent diffusion coefficient (ADC) and the perfusion contribution (F(P)). BOLD imaging yielded R2*, providing an estimation of renal oxygenation. Coefficients of variation between (CV(b)) and within subjects (CV(w)) were calculated. RESULTS The parameters were stable after 32 months in eight of the nine patients, who had well-functioning allografts. Mean diffusion values were very similar in the first and second scan. CV(w) and CV(b) for ADC values were less than 3.5% and 5.9%, respectively, in cortex and medulla, but were higher for F(P) (15%-18%). CV(w) and CV(b) of R2* were also low (medulla: CV(w) = 10.8%, CV(b) = 11.4%; cortex: CV(w) and CV(b) = 7.2%). R2* increased significantly (P = 0.035) in cortex but not in medulla, suggesting reduced cortical oxygen content. One subject with decreased glomerular filtration rate demonstrated strongly altered parameters. CONCLUSION In the absence of graft dysfunction, DWI and BOLD imaging yield consistent results over 3 years in stable human renal allograft recipients Background — Atherosclerotic renal artery stenosis reduces renal blood flow (RBF) and amplifies stenotic kidney hypoxia.
Revascularization with percutaneous transluminal renal angioplasty (PTRA) and stenting often fails to recover renal function, possibly because of ischemia/reperfusion injury developing after PTRA. Elamipretide is a mitochondrial-targeted peptide that binds to cardiolipin and stabilizes mitochondrial function. We tested the hypothesis that elamipretide plus PTRA would improve renal function, oxygenation, and RBF in patients with atherosclerotic renal artery stenosis undergoing PTRA. Methods and Results — Inpatient studies were performed in patients with severe atherosclerotic renal artery stenosis scheduled for PTRA. Patients were treated before and during PTRA with elamipretide (0.05 mg/kg per hour intravenous infusion, n=6) or placebo (n=8). Stenotic kidney cortical/medullary perfusion and RBF were measured using contrast-enhanced multidetector CT, and renal oxygenation by 3-T blood oxygen level-dependent magnetic resonance imaging, before and 3 months after PTRA. Age and basal glomerular filtration rate did not differ between groups. Blood oxygen level-dependent imaging demonstrated increased fractional hypoxia 24 hours after angiography and stenting with placebo (+47%) versus elamipretide (−6%). These changes reverted to baseline 3 months later. Stenotic kidney RBF rose (202±29 to 262±115 mL/min; P=0.04) 3 months after PTRA in the elamipretide-treated group only. Over 3 months, systolic blood pressure decreased, and estimated glomerular filtration rate increased (P=0.003), more in the elamipretide group than in the placebo group (P=0.11). Conclusions — Adjunctive elamipretide during PTRA was associated with attenuated postprocedural hypoxia, increased RBF, and improved kidney function in this pilot trial. These data support a role for targeted mitochondrial protection to minimize procedure-associated ischemic injury and to improve outcomes of revascularization for human atherosclerotic renal artery stenosis.
Clinical Trial Registration — URL: https://www.clinicaltrials.gov. Unique identifier: NCT01755858 PURPOSE To prospectively assess the oxygenation state of renal transplants and determine the feasibility of using blood oxygen level-dependent (BOLD) magnetic resonance (MR) imaging to differentiate between acute tubular necrosis (ATN), acute rejection, and normal function. MATERIALS AND METHODS This HIPAA-compliant study had institutional human subjects review committee approval, and written informed consent was obtained from all patients. BOLD MR imaging was performed in 20 patients (age range, 21-70 years) who had recently received renal transplants. Six patients had clinically normal functioning transplants, eight had biopsy-proved rejection, and six had biopsy-proved ATN. R2* (1/sec) measurements were obtained in the medulla and cortex of transplanted kidneys. R2* is a measure of the rate of signal loss in a specific region and is related to the amount of deoxyhemoglobin present. Statistical analysis was performed by using a two-sample t test. Threshold R2* values were identified to discriminate between transplanted kidneys with ATN, those with acute rejection, and those with normal function. RESULTS R2* values for the medulla were significantly lower in the acute rejection group (R2* = 15.8/sec ± 1.5) than in normally functioning transplants (R2* = 23.9/sec ± 3.2) and transplants with ATN (R2* = 21.3/sec ± 1.9). The differences between the acute rejection and normal function groups (P = .001), as well as between the acute rejection and ATN groups (P < .001), were significant. Acute rejection could be differentiated from normal function and ATN in all cases by using a threshold R2* value of 18/sec.
R2* values for the cortex were higher in ATN (R2* = 14.2/sec ± 1.4) than for normally functioning transplants (R2* = 12.7/sec ± 1.6) and transplants with rejection (R2* = 12.4/sec ± 1.2). The difference in R2* values in the cortex between ATN and rejection was statistically significant (P = .034), although there was no threshold value that enabled differentiation of all cases of ATN from cases of normal function or acute rejection. CONCLUSION R2* measurements in the medullary regions of transplanted kidneys with acute rejection were significantly lower than those in normally functioning transplants or transplants with ATN. These results suggest that marked changes in intrarenal oxygenation occur during acute transplant rejection Acute experimental reduction of renal blood flow decreases the renal blood oxygenation level-dependent (BOLD) MRI signal in animals. Angiotensin II also reduces renal blood flow, but the ability of BOLD MRI to dynamically detect this response has not yet been investigated in humans. Six healthy male volunteers underwent an individual dose-finding study to identify the intravenous doses of angiotensin II, norepinephrine, and sodium nitroprusside necessary to induce a 15-mm Hg peak mean arterial blood pressure change. MRI studies followed within 3 weeks, when angiotensin II (8.8±1.4 ng/kg), norepinephrine (52±12 ng/kg), and sodium nitroprusside (2.0±0.3 μg/kg) were given twice in an unblocked, randomized sequence while imaging experiments were performed on a 1.5-T Siemens Sonata. A multiecho echo-planar imaging sequence was used to acquire T2* maps with a temporal resolution of 1 respiratory cycle. Averaged over a renal cortex dominated region of interest, angiotensin II caused a shortening of T2* between 6% and 10%. Sodium nitroprusside and norepinephrine, although of equal potency concerning blood pressure responses, did not alter the renal BOLD signal.
The renal BOLD response to angiotensin II appeared with short onset latency (as early as 10 seconds after peripheral intravenous angiotensin II bolus administration), suggesting that this response is a consequence of altered perfusion rather than increased renal oxygen consumption. The methods described here are suitable to assess renal responsiveness to angiotensin II and may thus be of great value in human hypertension research Blood oxygenation level-dependent (BOLD) MRI was shown to allow non-invasive observation of renal oxygenation in humans. However, clinical applications of this type of functional MRI of the kidney are still limited, most likely because of difficulties in obtaining reproducible and reliable information. The aim of this study was to evaluate the reproducibility and robustness of a BOLD method applied to the kidneys and to identify systematic physiological changes potentially influencing the renal oxygenation of healthy volunteers. To measure the BOLD effect, a modified multi-echo data image combination (MEDIC) sequence was used to acquire 12 T2*-weighted images within a single breath-hold. Three identical measurements were performed on three axial and three coronal slices of right and left kidneys in 18 volunteers. The mean R2* (1/T2*) values determined in medulla and cortex showed no significant differences over three repetitions and low intra-subject coefficients of variation (CV) (3 and 4% in medulla and cortex, respectively). The average R2* values were higher in the medulla (16.15 ± 0.11) than in the cortex (11.69 ± 0.18) (P < 0.001). Only a minor influence of slice orientation was observed. Mean R2* values were slightly higher (3%) in the left than in the right kidney (P < 0.001). Differences between volunteers were identified (P < 0.001).
Part of these differences was attributable to age-dependent R2* values, since these values increased with age when medulla (P < 0.001, r = 0.67) or cortex (P < 0.020, r = 0.42) were considered. Thus, BOLD measurements in the kidney are highly reproducible and robust. The results allow one to identify the known cortico-medullary gradient of oxygenation, evidenced by the gradient of R2* values, and suggest that the medulla is more hypoxic in older than in younger individuals. BOLD-MRI is therefore a useful tool to study sequentially and non-invasively the regional oxygenation of human kidneys Atherosclerotic renovascular disease (RVD) reduces renal blood flow (RBF) and GFR and accelerates poststenotic kidney (STK) tissue injury. Preclinical studies indicate that mesenchymal stem cells (MSCs) can stimulate angiogenesis and modify immune function in experimental RVD. We assessed the safety and efficacy of adding intra-arterial autologous adipose-derived MSCs into the STK to standardized medical treatment in human subjects without revascularization. The intervention group (n=14) received a single infusion of MSC (1.0 × 10⁵ or 2.5 × 10⁵ cells/kg; n=7 each) plus standardized medical treatment; the medical treatment only group (n=14) included subjects matched for age, kidney function, and stenosis severity. We measured cortical and medullary volumes, perfusion, and RBF using multidetector computed tomography. We assessed tissue oxygenation by blood oxygen level-dependent MRI and GFR by iothalamate clearance. MSC infusions were well tolerated. Three months after infusion, cortical perfusion and RBF rose in the STK (151.8 to 185.5 ml/min, P=0.01); contralateral kidney RBF increased (212.7 to 271.8 ml/min, P=0.01); and STK renal hypoxia (percentage of the whole kidney with R2* > 30/s) decreased (12.1% [interquartile range, 3.3%-17.8%] to 6.8% [interquartile range, 1.8%-12.9%], P=0.04).
No changes in RBF occurred in medical treatment only subjects. Single-kidney GFR remained stable after MSC but fell in the medical treatment only group (-3% versus -24%, P=0.04). This first-in-man dose-escalation study provides evidence of the safety of intra-arterial infusion of autologous MSCs in patients with RVD. MSC infusion without main renal artery revascularization was associated with increased renal tissue oxygenation and cortical blood flow.
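The BOLD-MRI abstracts above repeatedly use two simple quantities: R2* as the reciprocal of T2* (with T2* in milliseconds, R2* in s⁻¹ is 1000/T2*), and "fractional hypoxia", the fraction of kidney voxels with R2* above 30 s⁻¹. A minimal sketch on a synthetic R2* map (the map, its dimensions, and the random parameters are purely illustrative, not from any of the studies):

```python
import numpy as np

# R2* is the reciprocal of T2*; with T2* in milliseconds,
# R2* = 1000 / T2*. A T2* of 50 ms therefore gives R2* = 20 s^-1.
t2star_ms = 50.0
r2star_per_s = 1000.0 / t2star_ms  # → 20.0

# "Fractional hypoxia" as used in several abstracts above: the fraction
# of kidney voxels with R2* above 30 s^-1 (higher R2* = lower oxygenation).
# The 64x64 map below is synthetic, for illustration only.
rng = np.random.default_rng(42)
r2star_map = rng.normal(loc=20.0, scale=6.0, size=(64, 64))
fractional_hypoxia = float(np.mean(r2star_map > 30.0))

print(f"R2* = {r2star_per_s:.1f} /s")
print(f"fractional hypoxia = {100 * fractional_hypoxia:.1f}%")
```

With a mean R2* of 20 s⁻¹ and spread of 6 s⁻¹, only a small tail of voxels exceeds the 30 s⁻¹ threshold, which is why the studies report fractional hypoxia as a percentage of the whole kidney.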
2,039
31,414,328
This study indicates that OSAHS patients have higher serum YKL-40 levels, which may serve as a potential biomarker for OSAHS diagnosis and monitoring
Several studies have reported that serum YKL-40 levels were elevated in patients with obstructive sleep apnea hypopnea syndrome (OSAHS). However, most of these studies had relatively small sample sizes and the results were inconsistent. Therefore, a meta-analysis was conducted to determine the potential role of serum YKL-40 level in OSAHS.
Objective YKL-40, a chitinase-like glycoprotein associated with inflammation and tissue remodeling, is produced by joint tissues and recognized as a candidate auto-antigen in rheumatoid arthritis (RA). In the present study, we investigated YKL-40 as a potential biomarker of disease activity in patients with early RA at baseline and during intensive treatment aiming for early remission. Methods Ninety-nine patients with early DMARD-naïve RA participated in the NEO-RACo study. For the first four weeks, the patients were treated with the combination of sulphasalazine, methotrexate, hydroxychloroquine and low dose prednisolone (FIN-RACo DMARD combination), and subsequently randomized to receive placebo or infliximab added on the treatment for a further 22 weeks. Disease activity was evaluated using the 28-joint disease activity score, and plasma YKL-40 concentrations were measured by immunoassay. Results At the baseline, plasma YKL-40 concentration was 57 ± 37 (mean ± SD) ng/ml. YKL-40 was significantly associated with the disease activity score, interleukin-6 and erythrocyte sedimentation rate both at the baseline and during the 26 weeks' treatment. The csDMARD combination decreased YKL-40 levels already during the first four weeks of treatment, and there was no further reduction when the tumour necrosis factor-α antagonist infliximab was added on the combination treatment. Conclusions High YKL-40 levels were found to be associated with disease activity in early DMARD-naïve RA and during intensive treat-to-target therapy. The present results suggest YKL-40 as a useful biomarker of disease activity in RA to be used to steer treatment towards remission
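The meta-analysis described in this record pools per-study effect sizes (here, standardized mean differences of serum YKL-40 between OSAHS patients and controls); a common approach is inverse-variance weighting. A minimal fixed-effect sketch (the SMDs and variances below are hypothetical placeholders, not the actual YKL-40 data):

```python
import math

def pool_fixed_effect(effects, variances):
    """Inverse-variance fixed-effect pooled estimate and its standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Hypothetical per-study standardized mean differences (OSAHS vs controls)
# and their variances -- illustrative numbers only.
smds = [0.8, 1.1, 0.6]
variances = [0.04, 0.09, 0.02]
pooled, se = pool_fixed_effect(smds, variances)

print(round(pooled, 2))       # pooled SMD → 0.72
print(round(1.96 * se, 2))    # half-width of the 95% CI → 0.21
```

A random-effects model (e.g. DerSimonian-Laird) adds a between-study variance component to each weight, which matters when the studies are heterogeneous, as the Background here suggests.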
2,040
26,011,829
Qualitative analysis suggests that water infusion colonoscopy was not associated with a markedly increased rate of adverse events compared with the standard procedure. Completeness of colonoscopy, that is the cecal intubation rate, was not improved by water infusion compared with standard air insufflation colonoscopy. However, adenoma detection, assessed with two different measures (that is, adenoma detection rate and number of detected adenomas per procedure), was slightly augmented by water infusion colonoscopy. Improved adenoma detection might be due to the cleansing effects of water infusions on the mucosa. Detection of premalignant lesions during standard colonoscopy is suboptimal, and so improvements in adenoma detection by water infusion colonoscopy, although small, may help to reduce the risk of interval colorectal carcinoma. The most obvious benefit of water infusion colonoscopy was reduction of procedure-related abdominal pain, which may enhance the acceptance of screening/surveillance colonoscopy.
BACKGROUND Colonoscopy is a widely used diagnostic and therapeutic modality. A large proportion of the population is likely to undergo colonoscopy for diagnosis and treatment of colorectal diseases, or when participating in colorectal cancer screening programs. To reduce pain, water infusion instead of traditional air insufflation during the insertion phase of the colonoscopy has been proposed, thereby improving patients' acceptance of the procedure. Moreover, the water infusion method may improve early detection of precancerous neoplasms. OBJECTIVES To compare water infusion techniques with standard air insufflation, specifically evaluating technical quality and screening efficacy, as well as patients' acceptance of the water infusion procedure.
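Reviews like this one typically synthesize per-study results (e.g., adenoma detection rates under water infusion vs. air insufflation) into a pooled effect via inverse-variance weighting. The following is a minimal sketch of fixed-effect pooling of log risk ratios; the event counts are hypothetical, for illustration only, and are not data from any study cited here.

```python
import math

def log_rr(a, n1, b, n2):
    """Log risk ratio and its approximate variance for events a/n1 vs b/n2."""
    lr = math.log((a / n1) / (b / n2))
    var = 1 / a - 1 / n1 + 1 / b - 1 / n2
    return lr, var

# Hypothetical two-study example: (events, total) per arm.
studies = [(30, 100, 20, 100), (25, 120, 18, 115)]

w_sum = wl_sum = 0.0
for a, n1, b, n2 in studies:
    lr, var = log_rr(a, n1, b, n2)
    w = 1 / var                      # inverse-variance weight
    w_sum += w
    wl_sum += w * lr

pooled = wl_sum / w_sum              # pooled log risk ratio
se = math.sqrt(1 / w_sum)
lo, hi = math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)
print(f"pooled RR = {math.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A confidence interval crossing 1.0, as in this toy example, would indicate that the pooled effect is not statistically significant at the 5% level.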
BACKGROUND Colonoscopy may be associated with discomfort when performed without sedation. A study was conducted to determine whether instillation of water into the colon at the beginning of the procedure reduces intubation time as well as patient discomfort and pain. METHODS Colonoscopy was performed in 259 patients by 3 endoscopists-in-training with limited experience. Patients were randomly allocated to 2 groups. In one, a technique was used in which 500 to 1000 mL of water is instilled into the colon by enema at the beginning of the procedure (instillation group, n = 130). In the other, patients underwent a conventional colonoscopy (control group, n = 129). Intubation time was measured and compared between the groups, and subjective discomfort experienced by the patients was measured upon completion of the examination. RESULTS Success rates for insertion to the cecum were similar (95.4%, instillation group; 96.1%, control group). Detection rates for any colorectal diseases were not different between the groups (30.0% vs. 32.6%). Mean time to cecal intubation was 10.5 minutes in the instillation group and 16.2 minutes in the control group (p < 0.0001). The proportion of patients who complained of abdominal pain during the procedure was 17.1% in the instillation group and 33.3% in the control group (p < 0.001). CONCLUSIONS When used by endoscopists-in-training, the water-instillation colonoscopy technique was associated with less discomfort and faster cecal intubation with no decrease in the rate of detection of colorectal diseases. Objective. The volume of colonoscopies performed is increasing and differences in colonoscopy practice over time and between centres have been reported. Examination of current practice is important for benchmarking quality. The objective of this study was to examine variations in colonoscopy practice in endoscopy centres internationally. Material and methods.
This observational study prospectively included consecutive patients referred for colonoscopy from 21 centres in 11 countries. Patient, procedure and centre characteristics were collected through questionnaires. Descriptive statistics were performed and the variation between centres while controlling for case-mix was examined. Results. A total of 6004 patients were included in the study. Most colonoscopies (93%; range between centres 70–100%) were performed for diagnostic purposes. The proportion of main indications for colonoscopy showed wide variations between centres, the two most common indications, surveillance and haematochezia, ranging between 7–24% and 5–38%, respectively. High-quality cleansing occurred in 74% (range 51–94%) of patients, and 30% (range 0–100%) of patients received deep sedation. Three-quarters (range 0–100%) of the patients were monitored during colonoscopy, and one-quarter (range 14–35%) underwent polypectomy. Colonoscopy was complete in 89% (range 69–98%) of patients and the median total duration was 20 min (range of centre medians 15–30 min). The variation between centres was not reduced when case-mix was controlled for. Conclusions. This study documented wide variations in colonoscopy practice between centres. Controlling for case-mix did not remove these variations, indicating that centre and procedure characteristics play a role. Centres generally were within the existing guidelines, although there is still some work to be done to ensure that all centres attain the goal of providing high-quality colonoscopy. BACKGROUND Pilot studies using a novel water method to perform screening colonoscopy allowed patients to complete colonoscopy without sedation medications and also significantly increased the cecal intubation success rate.
OBJECTIVE To perform a randomized, controlled trial comparing air insufflation (conventional method) and water infusion in lieu of air insufflation (study method) colonoscopy in minimally sedated patients. HYPOTHESIS Compared with the conventional method, patients examined by the study method had lower pain scores and required less medication but had a similar cecal intubation rate and willingness to undergo colonoscopy in the future. SETTING Outpatient colonoscopy in a single Veterans Affairs hospital. METHODS After informed consent and standard bowel preparation, patients received premedications administered as 0.5-increments of fentanyl (25 microg) and 0.5-increments of Versed (midazolam) (1 mg) plus 50 mg of diphenhydramine. The conventional and the study methods for colonoscopy were implemented as previously described. Additional pain medications were administered at the patients' request. MAIN OUTCOME MEASUREMENTS Increments of medications, pain scores, cecal intubation, and willingness to repeat colonoscopy. RESULTS Increments of medications used before reaching the cecum (1.6 ± 0.2 vs 2.4 ± 0.2, P < .0027), total increments used (1.8 ± 0.2 vs 2.5 ± 0.2, P < .014), and the maximum pain scores (1.3 ± 0.3 vs 4.1 ± 0.6, P < .0002) were significantly lower with the water method. Cecal intubation rate (100%) and willingness to undergo a repeat colonoscopy (96%) were similar. LIMITATIONS Single Veterans Affairs hospital, older male population. CONCLUSION Water infusion in lieu of air insufflation is superior to air insufflation during colonoscopy in minimally sedated patients (ClinicalTrials.gov identifier NCT00785889). BACKGROUND & AIMS Several studies have indicated that water infusion, instead of air insufflation, enhances cecal intubation in selected patients undergoing unsedated colonoscopy.
We performed a prospective, randomized, controlled trial to investigate whether the water technique increases the proportion of patients able to complete unsedated colonoscopy. METHODS We analyzed data from 116 consecutive outpatients who were willing to start colonoscopy without sedation; 58 were each randomly assigned to groups given water infusion or air insufflation during the insertion phase. Sedation and analgesia were administered on demand. RESULTS Fewer patients requested sedation in the water group (8.6%) than in the air group (34.5%; P = .003) and their maximum pain scores were lower (2.8 ± 1.9 vs 4.2 ± 2.3 in the air group; P = .011). However, differences in percentages of patients who received complete, unsedated colonoscopy between the water group (74.1%) and air group (62.1%) did not reach statistical significance (P = .23); the percentage of successful cecal intubations was lower in the water group (82.8%) than in the air group (96.5%; P = .03) because of poor visibility. Failed procedures in the water group were completed successfully after air insufflation. The cecal intubation time was shorter in the air group (6.2 ± 3.4 min) than in the water group (8.1 ± 3.0 min; P = .01). CONCLUSIONS In patients willing to undergo unsedated colonoscopy, water infusion improves patient tolerance for cecal intubation, compared with air insufflation. However, it does not increase the overall percentage of successful cecal intubations because suboptimal bowel preparation interferes with visibility. BACKGROUND: The increasing demand for colonoscopy has renewed the interest for unsedated procedures. Alternative techniques, such as carbon dioxide insufflation and warm-water infusion, have been advocated to improve patient tolerance for colonoscopy in comparison with air insufflation.
OBJECTIVE: The aim of this study was to evaluate the benefits of carbon dioxide insufflation and warm-water irrigation over air insufflation in unsedated patients. DESIGN: This study was a randomized, controlled trial. SETTING: This study was conducted at a nonacademic single center. PATIENTS: Consecutive outpatients agreeing to start colonoscopy without premedication were included. INTERVENTIONS: Patients were assigned to either carbon dioxide insufflation, warm-water irrigation, or air insufflation for the colonoscopy insertion phase. Sedation/analgesia were administered on patient request if significant pain or discomfort occurred. MAIN OUTCOME MEASURES: The primary outcome measured was the percentage of patients requiring sedation/analgesia. Pain and tolerance scores were assessed at discharge by using a 100-mm visual analog scale. RESULTS: Three hundred forty-one subjects (115 in the carbon dioxide, 113 in the warm-water, and 113 in the air group) were enrolled. Intention-to-treat analysis showed that the proportion of patients requesting sedation/analgesia during colonoscopy was 15.5% in the carbon dioxide group, 13.2% in the warm-water group, and 25.6% in the air group (p = 0.04 carbon dioxide vs air; p = 0.03 warm water vs air). Median (interquartile range) scores for pain were 30 (10–50), 28 (15–50), and 46 (22–62) in the carbon dioxide, warm-water, and air groups (carbon dioxide vs air, p < 0.01; warm water vs air, p < 0.01); corresponding figures for tolerance were 20 (5–30), 19 (5–36), and 28 (10–50) (carbon dioxide vs air, p < 0.01; warm water vs air, p < 0.01). LIMITATIONS: This investigation was limited because it was a single-center study and the endoscopists were not blinded to randomization.
CONCLUSIONS: Carbon dioxide insufflation was associated with a decrease in the proportion of patients requesting on-demand sedation, improved patient tolerance, and decreased colonoscopy-related pain in comparison with air insufflation. The findings regarding warm-water irrigation confirmed the previously reported advantages, so that warm-water irrigation and carbon dioxide insufflation could represent competitive strategies for colonoscopy in unsedated patients. BACKGROUND AND STUDY AIMS Water immersion is an alternative colonoscopy technique that may reduce discomfort and facilitate insertion of the instrument. This was a prospective study to compare the success of colonoscopy with minimal sedation using water immersion and conventional air insufflation. PATIENTS AND METHODS A total of 229 patients were randomized to either water immersion or the standard air insertion technique. The primary outcome was success of minimal sedation colonoscopy, which was defined as reaching the cecum without additional sedation, exchange of the adult colonoscope or hands-on assistance for trainees. Patient comfort and satisfaction were also assessed. RESULTS Successful minimal-sedation colonoscopy was achieved in 51% of the water immersion group compared with 28% in the standard air group (OR 2.66; 95% CI 1.48–4.79; P = 0.0004). Attending physicians had 79% success with water immersion compared with 47% with air insufflation (OR 4.19; 95% CI 1.5–12.17; P = 0.002), whereas trainees had 34% success with water compared with 16% using air (OR 2.75; 95% CI 1.15–6.86; P = 0.01). Using the water method, endoscopists intubated the cecum faster and this was particularly notable for trainees (13.0 ± 7.5 minutes with water vs. 20.5 ± 13.9 minutes with air; P = 0.0001). Total procedure time was significantly shorter with water for both experienced and trainee endoscopists (P < 0.05).
Patients reported less intraprocedural pain with water compared with air (4.1 ± 2.7 vs. 5.3 ± 2.7; P = 0.001), with a similar level of satisfaction. There was no difference in the neoplasm detection rates between the groups. CONCLUSION Colonoscopy insertion using water immersion increases the success rate of minimal sedation colonoscopy. Use of the technique leads to a decrease in discomfort, time to reach the cecum, and the amount of sedative and analgesic used, without compromising patient satisfaction. OBJECTIVES: Completion rates, pain, and difficulties during the exam are still problems in colonoscopy. New methods of lubrication, rarely considered a matter of study, may help in this respect. Our aim was to compare an oil-assisted technique with a modified warm water method applied during colonoscopy. METHODS: A prospective, randomized, and controlled study was planned in which three groups of patients were submitted to colonoscopy: a standard lubricating method (water-soluble jelly: group A, 170 patients) was adopted in a control group, whereas the standard method plus injection into the colon of corn seed oil (group B, 170 patients) or warm water (group C, 170 patients) were employed in the other groups. The main variables evaluated were: the success rate for total intubation, the time required to reach the cecum and the time needed to examine the colon at withdrawal, and the level of pain and degree of difficulty associated with the examination. RESULTS: Successful intubation to the cecum was significantly more frequent (P < 0.01 and P < 0.001, respectively) in the oil group (group B, 155/166) and in the warm water group (group C, 156/163) than in the control group (group A, 138/164), and less time was needed (P < 0.001); no significant difference was found between group B and C.
Furthermore, no significant differences were found with regard to time for examination at withdrawal among the three groups. Level of pain and degree of difficulty during colonoscopy were significantly lower in the oil (P < 0.001) and warm water (P < 0.001) groups than in the control group, but no significant difference was found between group B and C. No side effects in patients and no damage to the instrument were observed. CONCLUSIONS: Warm water and oil-assisted colonoscopy could be simple, safe, and inexpensive methods for easier and less painful examinations. BACKGROUND An observational study in veterans showed that a novel water method (water infusion in lieu of air insufflation) enhanced cecal intubation and willingness to undergo a repeat scheduled unsedated colonoscopy. OBJECTIVE To confirm these beneficial effects and significant attenuation of discomfort in a randomized, controlled trial (RCT). DESIGN Prospective RCT, intent-to-treat analysis. SETTING Veterans Affairs ambulatory care facility. PATIENTS Veterans undergoing scheduled unsedated colonoscopy. INTERVENTIONS During insertion, the water and traditional air methods were compared. MAIN OUTCOME MEASUREMENTS Discomfort and procedure-related outcomes. RESULTS Eighty-two veterans were randomized to the air (n = 40) or water (n = 42) method. Cecal intubation (78% vs 98%) and willingness to repeat (78% vs 93%) were significantly better with the water method (P < .05; Fisher exact test). The mean (standard deviation) of maximum discomfort (0 = none, 10 = most severe) during colonoscopy was 5.5 (3.0) versus 3.6 (2.1), P = .002 (Student t test), and the median overall discomfort after colonoscopy was 3 versus 2, P = .052 (Mann-Whitney U test), respectively. The method, but not patient characteristics, was a predictor of discomfort (t = -1.998, P = .049, R² = 0.074).
The odds ratio for failed cecal intubation was 2.09 (95% CI, 1.49–2.93) for the air group. Fair/poor previous experience increased the risk of failed cecal intubation in the air group only. The water method numerically increased adenoma yield. LIMITATIONS Single site, small number of elderly men, unblinded examiner, possibility of unblinded subjects, restricted generalizability. CONCLUSIONS The RCT data confirmed that the water method significantly enhanced cecal intubation and willingness to undergo a repeat colonoscopy. The decrease in maximum discomfort was significant; the decrease in overall discomfort approached significance. The method, but not patient characteristics, was a predictor of discomfort. (Clinical trial registration number NCT00747084.) Background and Study Aims There is general consensus that water instillation helps insert a colonoscope. However, the most effective method for water instillation has not yet been established, especially for endoscopists-in-training. The aim of this study was to determine volume and temperature for effective water instillation colonoscopy. Patients and Methods This is a prospective, randomized, controlled trial that was carried out at a single center, and a total of 207 consecutive subjects who underwent colonoscopic examination for health checkup were included in the study. Water instillation of supplied water was conducted under four different conditions: 100 and 300 ml at room temperature, 300 ml at 30 °C, and no use of water instillation. The following parameters were recorded and analyzed: intubation success rate, independent predictors of successful intubation, and intubation time to reach the cecum. Results The intubation success rate was not significantly different between individual groups. Independent predictors of successful intubation were younger age (P = 0.004) and later examined subjects (P = 0.016).
The 300-ml warm water instillation during colonoscopy significantly reduced intubation time compared with the conventional method without water instillation (P = 0.034). Conclusions Instillation of 300 ml of warm (30 °C) water during colonoscopy can reduce cecal intubation time for in-training endoscopists without improving the intubation success rate. OBJECTIVES To determine the minimum clinically significant difference in visual analog scale (VAS) pain scores for acute pain in the ED setting and to determine whether this difference varies with gender, age, or cause of pain. METHODS A prospective, descriptive study of 152 adult patients presenting to the ED with acute pain. At presentation and at 20-minute intervals to a maximum of three measurements, patients marked the level of their pain on a 100-mm, nonhatched VAS. At each follow-up they also gave a verbal rating of their pain as "a lot better," "much the same," "a little worse," or "much worse." The minimum clinically significant difference in VAS pain scores was defined as the mean difference between current and preceding scores when pain was reported as a little worse or a little better. Data were compared based on gender, age more than or less than 50 years, and traumatic vs nontraumatic causes of pain. RESULTS The minimum clinically significant difference in VAS pain scores is 9 mm (95% CI, 6 to 13 mm). There is no statistically significant difference between the minimum clinically significant differences in VAS pain scores based on gender (p = 0.172), age (p = 0.782), or cause of pain (p = 0.84). CONCLUSIONS The minimum clinically significant difference in VAS pain scores was found to be 9 mm. Differences of less than this amount, even if statistically significant, are unlikely to be of clinical significance.
No significant difference in minimum significant VAS scores was found between gender, age, and cause-of-pain groups. BACKGROUND Colonoscopy is widely used for management of colorectal diseases. A history of abdominal or pelvic surgery is a well-recognized factor associated with difficult colonoscopy. Although water exchange colonoscopy (WEC) was effective in small groups of male U.S. veterans with such a history, its application in other cultural settings is uncertain. OBJECTIVE To investigate the application of WEC in such patients. DESIGN Prospective, randomized, controlled, patient-blinded study. SETTING Tertiary-care referral center in China. PATIENTS Outpatients with prior abdominal or pelvic surgery undergoing unsedated diagnostic, screening, or surveillance colonoscopy. INTERVENTION Patients were randomized to examination by either WEC or conventional air colonoscopy (AC). MAIN OUTCOME MEASUREMENTS Cecal intubation rate. RESULTS A total of 110 patients were randomized to the WEC (n = 55) or AC (n = 55) group. WEC significantly increased the cecal intubation rate (92.7% vs 76.4%; P = .033). The maximum pain scores (± standard deviation) were 2.1 ± 1.8 (WEC) and 4.6 ± 1.7 (AC), respectively (P < .001). Multivariate analysis showed that the colonoscopy method was the only independent predictor of failed colonoscopy (odds ratio 11.44; 95% confidence interval, 1.35–97.09). A higher proportion of patients examined by WEC would be willing to have a repeat unsedated colonoscopy (90.9% vs 72.7%, P = .013). LIMITATIONS Single center; unblinded but experienced endoscopists. CONCLUSION This randomized, controlled trial confirms that the water exchange method significantly enhanced cecal intubation in potentially difficult colonoscopy in unsedated patients with prior abdominal or pelvic surgery.
The lower pain scores and higher proportion accepting repeat of the unsedated option suggest that WEC is promising. It may enhance compliance with colonoscopy in specific populations. (CLINICAL TRIAL REGISTRATION NUMBER NCT01485133.) BACKGROUND: Investigators in the US described large volume water infusion with marked benefits but acknowledged the limitation of male veteran predominance in the study subjects. The aim of this study was to assess the feasibility of large volume water infusion in Asian patients undergoing minimal sedation diagnostic colonoscopy in a community setting. METHODS: Consecutive patients who underwent colonoscopy were randomized to receive large volume (entire colon; Group A, n = 51) or limited volume (rectum and sigmoid colon; Group B, n = 51) water infusion, or air insufflation (Group C, n = 51). Pain during insertion, completion rate, cecal intubation and total procedure times, and patient satisfaction were evaluated. Pain and satisfaction were assessed with a 0–10 visual analog scale. RESULTS: The mean pain scores during insertion were lower in Group A and Group B than in Group C (3.3 ± 2.4, 3.0 ± 2.2 and 4.4 ± 2.6, respectively; p = 0.028 and p = 0.004). The completion rates and cecal intubation times were similar among the three groups. The procedure time was significantly longer in Group A than in Group C (15.3 ± 5.9 min vs. 13.1 ± 5.4 min, p = 0.049). Overall satisfaction with the procedure was greater in Group B than in Group C only (9.7 ± 0.5 vs. 9.4 ± 0.8, p = 0.044). CONCLUSIONS: Diagnostic colonoscopy with large volume water infusion without air insufflation appears to be feasible in minimally sedated Asian patients in a community setting. Measures to improve the outcome further are discussed. Objective Water immersion insertion and carbon dioxide (CO2) insufflation, as alternative colonoscopic techniques, are able to reduce patient discomfort during and after the procedure.
We assessed whether the combination of water immersion and CO2 insufflation is superior in efficacy and patient comfort to other colonoscopic techniques. Methods In a prospective, randomized study, a total of 420 patients were randomized to either water immersion insertion and CO2 insufflation during withdrawal (water/CO2), water insertion and air insufflation during withdrawal (water/air), CO2 insufflation during both insertion and withdrawal (CO2/CO2), or air insufflation during both insertion and withdrawal (air/air). The main outcome was the success of minimal sedation colonoscopy, which was defined as reaching the cecum without switching to another insertion method and without additional sedation beyond the initial 2 mg of midazolam. Patient comfort during and after the procedure was assessed. Results A total of 404 patients were analyzed. The success rate of minimal sedation colonoscopy in the water insertion arm (water/CO2 and water/air) was 97% compared with 83.3% in the gas insertion arm (CO2/CO2 and air/air; P < 0.0001). Intraprocedural pain and bloating were significantly lower in the water/CO2 group than in all other groups. Patient discomfort in the water/CO2 group during the 24 h after the procedure was comparable with that in the CO2/CO2 group and significantly lower than that in the air groups (water/air and air/air). No complications were recorded during the study. Conclusion The combination of water immersion and CO2 insufflation appears to be an effective and safe method for minimal sedation colonoscopy. Overall patient discomfort was significantly reduced compared with that in other techniques. BACKGROUND Uncontrolled data suggest that warm water infusion (WWI) instead of air insufflation (AI) during the insertion phase of unsedated colonoscopy improves patient tolerance and satisfaction.
OBJECTIVE We tested the hypothesis that water could increase the proportion of patients able to complete unsedated colonoscopy and improve patient tolerance compared with the conventional procedure. DESIGN Randomized, controlled trial. SETTING Single center, community hospital. PATIENTS Consecutive outpatients agreeing to start colonoscopy without premedication. METHODS Patients were randomly assigned to either the WWI or AI insertion phase of colonoscopy. Sedation and/or analgesia were administered on patient request if significant pain or discomfort occurred. MAIN OUTCOME MEASUREMENTS Percentage of patients requiring sedation/analgesia. Pain and tolerance scores were assessed at discharge by using a 100-mm visual analog scale. RESULTS A total of 230 subjects (116 in the WWI group and 114 in the AI group) were enrolled. Intention-to-treat analysis showed that the proportion of patients requesting sedation/analgesia during the procedure (main outcome measurement) was 12.9% in the WWI group and 21.9% in the AI group (P = .07). Cecal intubation rates were 94% in the WWI group and 95.6% in the AI group (P = .57). Median (interquartile range) scores for pain were 28 (12–44) and 39 (14–54) in the WWI and AI groups, respectively (P = .05); corresponding figures for tolerance were 10 (3–18) and 14 (5–42), respectively (P = .01). The adenoma detection rates were 25% and 40.1% for the WWI and AI groups, respectively (P = .013). LIMITATIONS Single-center study, endoscopists not blinded to randomization. CONCLUSIONS WWI instead of AI is not associated with a statistically significant decrease in the number of patients requiring on-demand sedation, although it significantly improves the overall patient tolerance of colonoscopy. The finding of a lower adenoma detection rate in the WWI group calls for further evaluations.
(CLINICAL TRIAL REGISTRATION NUMBER NCT00905554.) BACKGROUND AND STUDY AIM Water-aided colonoscopy includes water immersion and water exchange. Several small single-center studies have suggested that the use of water rather than air insufflation during colonoscopy reduces pain on insertion. The aim of this study was to investigate whether water-aided colonoscopy is less painful than air insufflation in a large cohort of patients. PATIENTS AND METHODS This was a two-center, randomized controlled trial. Consecutive patients who agreed to start colonoscopy without premedication were included. Sedation was administered on demand. Water-aided colonoscopy was performed using water immersion in the early phase of the study, and subsequently water exchange was used. The primary endpoint was cecal intubation with pain scores of ≤ 2 and sedation with no or ≤ 2 mg midazolam. Secondary outcomes were pain score at discharge, cecal intubation rate and time, and adenoma detection rate (ADR). RESULTS A total of 672 patients were randomized to water exchange (n = 338) or air insufflation (n = 334). The primary endpoint was achieved in more patients in the water exchange group (83.8% vs. 62%; P < 0.0005). On-demand sedation was also required less (11.5% vs. 26.0%; P < 0.0005) and the mean pain score was lower (1.3 vs. 2.3; P < 0.0005) in the water exchange group. The cecal intubation rates were comparable. Water exchange had a significantly higher overall ADR (25.8% vs. 19.1%; P = 0.041), proximal ADR (10.1% vs. 4.8%; P = 0.014), and proximal < 10 mm ADR (7.7% vs. 3.9%; P = 0.046); proximal ADR was also higher in screening-only patients in the water exchange group (18.9% vs. 7.4%; P = 0.015). No detailed analysis was possible for the air insufflation vs. water immersion comparison.
CONCLUSION The current results confirmed that water exchange minimized the requirement for sedation and increased the ADR. AIM To investigate a limited water infusion method in colonoscopy. METHODS Consecutive patients undergoing minimally sedated colonoscopy were randomized to receive air insufflation (n = 89) or water infusion limited to the rectum, sigmoid colon and descending colon (n = 90). Completion rates, cecal intubation times, procedure times, need for abdominal compression, turning of patients and levels of discomfort were evaluated. RESULTS Completion rates, total procedure times, need for abdominal compression, and turning of patients were similar between groups. Less pain was experienced in the water group than in the air group (2.5 ± 2.5 vs 3.4 ± 2.8, mean ± SD, P = 0.021). The cecal intubation time was significantly longer in the water group than in the air group (6.4 ± 3.1 min vs 4.5 ± 2.4 min, P < 0.001). More water was infused in the water group (322 ± 80.9 mL vs 26.2 ± 39.4 mL, P < 0.001). CONCLUSION Limited airless water infusion in the distal colon reduces patients' pain during colonoscopy. BACKGROUND: The water method decreases patient discomfort and sedation requirement. Applicability in non-veteran community settings in the United States (U.S.) has not been reported. AIMS: Our aim was to perform a pilot study to establish the feasibility of using the water method at 2 community sites. We tested the hypothesis that, compared with air insufflation, patients examined with the water method would require less sedation without adverse impact on outcomes. METHODS: Two performance improvement projects were carried out. Consecutive patients who consented to respond to a questionnaire after colonoscopy were enrolled. Project 1: The design was single-blinded (patient only); quasi-randomized: odd days (water), even days (air). Colonoscopy was performed by a staff attending.
Project 2: A supervised trainee performed the reported procedures. In both, patient demographics (age, gender and body mass index), amount of sedation required during colonoscopy and procedure-related variables were recorded. The patients completed a questionnaire that enquired about discomfort during colonoscopy and willingness to repeat the procedure within 24 hours after the procedure. RESULTS: Project 1: Significantly lower doses of fentanyl and midazolam were used and a higher adenoma detection rate (ADR) was demonstrated in the water group. Project 2: A 100% cecal intubation rate was achieved by the supervised trainee. CONCLUSION: This is the first pilot report in the U.S. documenting the feasibility of the water method as the principal modality to aid colonoscope insertion in both male and female community patients. In a head-to-head comparison, a significant reduction of sedation requirement was confirmed, as hypothesized. No adverse impact on outcomes was noted. BACKGROUND AND STUDY AIMS Rapid passage through the sigmoid and descending colon is important during flexible colonoscopy, and colonoscopists have developed several techniques and tricks for achieving this. The present study was designed to explore the effect of instilling 200 ml of water into the first bend of the sigmoid on the speed of passage of the endoscope from the rectum to the left colonic (splenic) flexure. PATIENTS AND METHODS A prospective study of 100 successive single-handed colonoscopies was carried out, using randomly either the water intubation technique (50 patients) or the traditional method (50 controls) to compare the time needed to pass the endoscope from the rectum to the left colonic flexure. RESULTS The results indicate that water intubation allowed the endoscope to be advanced through the sigmoid and descending colon in a median time (fiftieth percentile) of 154.5 seconds, compared with 223.5 seconds using the traditional technique.
Water intubation speeds up the insertion time by 31%. This difference was highly statistically significant (P < 0.0001). The difference remained significant when the data for men and women were analyzed separately. There was no statistically significant difference in the formation of N loops, or in incidentally formed alpha loops, between the two study groups. CONCLUSION The water intubation technique is more efficient than the traditional method, particularly in difficult left-sided colonoscopies, and it is equally safe. BACKGROUND Colonic spasm can interfere with colonoscopy by hindering insertion of the colonoscope and by making polypectomy difficult, painful, and dangerous. Methods for dealing with colonic spasm include waiting for it to subside and administration of antispasmodic agents such as glucagon or hyoscyamine. Glucagon is expensive and hyoscyamine has side effects. This study evaluated an inexpensive technique, warm water irrigation, for overcoming colonic spasm during colonoscopy. METHODS A prospective, randomized, controlled trial in a consecutive series of patients was conducted to compare warm water irrigation for relaxation of spasm with standard examination techniques. Patients in whom the sigmoid colon had been resected were excluded. In the test group, water from the hot water tap at approximately body temperature was instilled into the colon by means of the accessory channel of the colonoscope with a 30 mL syringe. Any irrigation, either for removal of stool or control of spasm, was performed with warm water in the test group and water at room temperature in the control group. After each colonoscopy, the level of pain experienced by the patient was recorded with a linear analog scale. RESULTS Sixty-nine patients were randomized. The groups were similar with respect to gender distribution, age, and degree of spasm.
There was no difference between groups in insertion time, total duration of colonoscopy, dose of midazolam administered, or frequency of severe spasm. Patients who had warm water irrigation had significantly less discomfort than control patients (median 2.0, interquartile range 1-4 on a 10-point linear analog scale, vs. 4.0, interquartile range 2-5). CONCLUSIONS Although glucagon and hyoscyamine remain options for treatment of colonic spasm, the results of this study suggest that warm water is also effective. It has no side effects and costs practically nothing. BACKGROUND Successful colonoscopy depends on insertion of the instrument to the cecum, precise observation, and minimal patient discomfort during the procedure. The aim of this prospective study was to determine whether certain variables are associated with insertion time and patient discomfort during colonoscopy. METHODS Nine hundred nine consecutive colonoscopic examinations performed by a single endoscopist in patients without obstructive disease of the colorectum were analyzed. Four liters of Colonlyte (Taejun, Seoul, Korea) were used for bowel cleansing, and meperidine (25 mg) was administered intramuscularly 10 minutes before the procedure. The degree of patient discomfort was assessed using a 5-level Likert scale. RESULTS Among 909 study patients, colonoscopy was completed to the cecum in 876 patients (96.4%). The adjusted completion rate was 98% and the mean insertion time for complete colonoscopy was 6.9 ± 4.2 minutes. Colonoscopy caused less patient discomfort than barium enema or esophagogastroduodenoscopy. Multivariate logistic regression analysis demonstrated that inadequate bowel cleansing, advanced age, and constipation as an indication are independent factors associated with prolonged insertion time (>10 minutes). Female gender was the only independent factor associated with significant discomfort (≥ level 4) during colonoscopy.
CONCLUSIONS Among the factors affecting insertion time and patient discomfort during colonoscopy, unsatisfactory bowel preparation was the only correctable factor. BACKGROUND Sedation for colonoscopy discomfort imposes a recovery-time burden on patients. The water method permitted 52% of patients accepting on-demand sedation to complete colonoscopy without sedation. On-site and at-home recovery times were not reported. OBJECTIVE To confirm the beneficial effect of the water method and document the patient recovery-time burden. DESIGN Randomized, controlled trial, with single-blinded, intent-to-treat analysis. SETTING Veterans Affairs outpatient endoscopy unit. PATIENTS This study involved veterans accepting on-demand sedation for screening and surveillance colonoscopy. INTERVENTION Air versus water method for colonoscope insertion. MAIN OUTCOME MEASUREMENTS Proportion of patients completing colonoscopy without sedation, cecal intubation rate, medication requirement, maximum discomfort (0 = none, 10 = severe), procedure-related and patient-related outcomes. RESULTS One hundred veterans were randomized to the air (n = 50) or water (n = 50) method. The proportions of patients who could complete colonoscopy without sedation in the water group (78%) and the air group (54%) were significantly different (P = .011, Fisher exact test), but the cecal intubation rate was similar (100% in both groups). Secondary analysis (data as mean [SD]) showed that the water method produced a reduction in medication requirement: fentanyl, 12.5 (26.8) μg versus 24.0 (30.7) μg; midazolam, 0.5 (1.1) mg versus 0.94 (1.20) mg; maximum discomfort, 2.3 (1.7) versus 4.9 (2.0); recovery time on site, 8.4 (6.8) versus 12.3 (9.4) minutes; and recovery time at home, 4.5 (9.2) versus 10.9 (14.0) hours (P = .049; P = .06; P = .0012; P = .0199; and P = .0048, respectively, t test).
LIMITATIONS Single Veterans Affairs site, predominantly male population, unblinded examiners. CONCLUSION This randomized, controlled trial confirms the reported beneficial effects of the water method. The combination of the water method with on-demand sedation minimizes the patient recovery-time burden. (CLINICAL TRIAL REGISTRATION NUMBER NCT00920751.)
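The trial above compares unsedated completion rates of 78% versus 54% with the Fisher exact test (P = .011). As a sketch of that calculation using only the Python standard library: the cell counts below (39/50 water vs. 27/50 air) are reconstructed from the reported percentages, not taken from the paper's raw data, so the resulting p-value may differ slightly from the published one depending on rounding and sidedness conventions.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed table.
    """
    n = a + b + c + d
    row1 = a + b           # size of the first group (e.g. water method)
    col1 = a + c           # total number of "successes" (unsedated completions)

    def hyper(x):          # P(top-left cell == x) under fixed margins
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = hyper(a)
    lo = max(0, row1 + col1 - n)   # smallest feasible top-left cell
    hi = min(row1, col1)           # largest feasible top-left cell
    # Small tolerance guards against floating-point ties with p_obs.
    return sum(hyper(x) for x in range(lo, hi + 1)
               if hyper(x) <= p_obs * (1 + 1e-9))

# Counts reconstructed from the reported 78% (39/50) vs. 54% (27/50).
p = fisher_exact_two_sided(39, 11, 27, 23)
print(round(p, 4))
```

With groups of 50 each, the exact enumeration is cheap; for larger tables a normal approximation to the two-proportion comparison gives a similar answer.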
Medial meniscal injury/meniscectomy showed moderate evidence for influencing OA development (tibiofemoral OA and compartment unspecified). Lateral meniscal injury/meniscectomy showed moderate evidence for no relationship (compartment unspecified), as did time between injury and reconstruction (tibiofemoral and patellofemoral OA). CONCLUSIONS Medial meniscal injury/meniscectomy after ACL rupture increased the risk of OA development. In contrast, it seems that lateral meniscal injury/meniscectomy has no relationship with OA development. Our results suggest that time between injury and reconstruction does not influence patellofemoral and tibiofemoral OA development. Many determinants showed conflicting and limited evidence, and no determinant showed strong evidence.
BACKGROUND Anterior cruciate ligament (ACL) injury is an important risk factor for the development of knee osteoarthritis (OA). To identify those ACL-injured patients at increased risk for knee OA, it is necessary to understand risk factors for OA. AIM To summarise the evidence for determinants of (1) tibiofemoral OA and (2) patellofemoral OA in ACL-injured patients.
Background Concomitant injuries to secondary structures have been proposed as a major cause of failure of anterior cruciate ligament reconstruction. Purpose Our purpose was to determine the relationship between meniscal status at the time of anterior cruciate ligament reconstruction and ultimate long-term function and stability. Study Design Prospective cohort study. Methods We prospectively studied 63 patients for an average of 10.4 years after arthroscopically assisted bone-patellar tendon-bone anterior cruciate ligament reconstruction. All surgeries were performed between 1988 and 1991; concomitant meniscal surgery was performed if necessary. Subjects were divided into subgroups relative to the integrity of their menisci at the end of the reconstruction procedure (intact meniscus, partial meniscectomy, complete meniscectomy). Results Patients who had undergone any degree of meniscal resection reported significantly more subjective complaints and activity limitations than those with intact menisci. Subjective International Knee Documentation Committee and Lysholm scores were lower in the meniscectomy subgroups than in the meniscus-intact group. Objective testing revealed a significantly lower ability to perform the single-legged hop in the meniscectomy subgroups. Ligament stability based on instrumented laxity measurements was not significantly different between the subgroups. Radiographic abnormalities were also more common in the subgroups that had undergone meniscectomy. Conclusions The menisci should be repaired if at all possible, especially in the setting of anterior cruciate ligament reconstruction, for optimal functional outcome and patient satisfaction. BACKGROUND There are no controlled, prospective studies comparing the 10-year outcomes of anterior cruciate ligament (ACL) reconstruction using patellar tendon (PT) and 4-strand hamstring tendon (HT) autografts. HYPOTHESIS Comparable results are possible with HT and PT autografts.
STUDY DESIGN Cohort study; Level of evidence, 2. METHODS One hundred eighty ACL-deficient knees that met inclusion criteria underwent ACL reconstruction (90 HT autograft, 90 PT autograft) by one surgeon and were treated with an accelerated rehabilitation program. All knees were observed in a prospective fashion with subjective, objective, and radiographic evaluation at 2-, 5-, 7-, and 10-year intervals. RESULTS At 10 years, there were no differences in graft rupture rates (7/90 PT vs. 12/90 HT, P = .24). There were 20 contralateral ACL ruptures in the PT group, compared with 9 in the HT group (P = .02). In all patients, graft rupture was associated with instrumented laxity >2 mm at 2 years (P = .001). Normal or near-normal function of the knee was reported in 97% of patients in both groups. In the PT group, harvest-site symptoms (P = .001) and kneeling pain (P = .01) were more common than in the HT group. More patients reported pain with strenuous activities in PT knees than in HT knees (P = .05). Radiographic osteoarthritis was more common in PT knees than in HT-reconstructed knees (P = .04). The difference, however, was composed of patients with mild osteoarthritis. Other predictors of radiographic osteoarthritis were <90% single-legged hop test at 1 year and the need for further knee surgery. An "ideal" outcome, defined as an overall International Knee Documentation Committee grade of A or B and a radiographic grade of A at 10 years after ACL reconstruction, was associated with <3 mm of instrumented laxity at 2 years, the absence of additional surgery in the knee, and HT grafts. CONCLUSIONS It is possible to obtain excellent results with both HT and PT autografts.
We recommend HT reconstructions to our patients because of decreased harvest-site symptoms and radiographic osteoarthritis. We used single-photon emission computed tomography (SPECT) to determine the long-term risk of degenerative change after reconstruction of the anterior cruciate ligament (ACL). Our study population was a prospective series of 31 patients with a mean age at injury of 27.8 years (18 to 47) and a mean follow-up of ten years (9 to 13) after bone-patellar tendon-bone reconstruction of the ACL. The contralateral normal knee was used as a control. All knees were clinically stable with high clinical scores (mean Lysholm score, 93; mean Tegner activity score, 6). Fifteen patients had undergone a partial meniscectomy at or before reconstruction of their ACL. In the group with an intact meniscus, clinical symptoms of osteoarthritis (OA) were found in only one patient (7%), who was also the only patient with marked isotope uptake on the SPECT scan compatible with OA. In the group which underwent a partial meniscectomy, clinical symptoms of OA were found in two patients (13%), who were among five (31%) with isotope uptake compatible with OA. Only one patient (7%) in this group had evidence of advanced OA on plain radiographs. The risk of developing OA after ACL reconstruction in this series is very low, and lower than published figures for untreated ACL-deficient knees. There is a significant increase (p < 0.05) in degenerative change in patients who had a reconstruction of their ACL and a partial meniscectomy compared with those who had a reconstruction of their ACL alone. Objective To compare, in young active adults with an acute anterior cruciate ligament (ACL) tear, the mid-term (five-year) patient-reported and radiographic outcomes between those treated with rehabilitation plus early ACL reconstruction and those treated with rehabilitation and optional delayed ACL reconstruction.
Design Extended follow-up of a prospective randomised controlled trial. Setting Orthopaedic departments at two hospitals in Sweden. Participants 121 young, active adults (mean age 26 years) with acute ACL injury to a previously uninjured knee. One patient was lost to five-year follow-up. Intervention All patients received similar structured rehabilitation. In addition to rehabilitation, 62 patients were assigned to early ACL reconstruction and 59 were assigned to the option of having a delayed ACL reconstruction if needed. Main outcome measure The main outcome was the change from baseline to five years in the mean value of four of the five subscales of the knee injury and osteoarthritis outcome score (KOOS4). Other outcomes included the absolute KOOS4 score, all five KOOS subscale scores, SF-36, Tegner activity scale, meniscal surgery, and radiographic osteoarthritis at five years. Results Thirty (51%) patients assigned to optional delayed ACL reconstruction had delayed ACL reconstruction (seven between two and five years). The mean change in KOOS4 score from baseline to five years was 42.9 points for those assigned to rehabilitation plus early ACL reconstruction and 44.9 for those assigned to rehabilitation plus optional delayed reconstruction (between-group difference 2.0 points, 95% confidence interval −8.5 to 4.5; P=0.54 after adjustment for baseline score). At five years, no significant between-group differences were seen in KOOS4 (P=0.45), any of the KOOS subscales (P≥0.12), SF-36 (P≥0.34), Tegner activity scale (P=0.74), or incident radiographic osteoarthritis of the index knee (P=0.17). No between-group differences were seen in the number of knees having meniscus surgery (P=0.48) or in a time-to-event analysis of the proportion of menisci operated on (P=0.77). The results were similar when analysed by treatment actually received.
Conclusion In this first high-quality randomised controlled trial with minimal loss to follow-up, a strategy of rehabilitation plus early ACL reconstruction did not provide better results at five years than a strategy of initial rehabilitation with the option of having a later ACL reconstruction. Results did not differ between knees surgically reconstructed early or late and those treated with rehabilitation alone. These results should encourage clinicians and young active adult patients to consider rehabilitation as a primary treatment option after an acute ACL tear. Trial registration Current Controlled Trials ISRCTN84752559. Background The choice of different graft types and surgical techniques used when reconstructing a torn anterior cruciate ligament may influence the long-term prevalence of osteoarthritis and functional outcomes. Hypothesis There are no differences in the prevalence of knee osteoarthritis or knee function in patients undergoing reconstruction of a torn anterior cruciate ligament with 4-strand hamstring autograft versus patellar tendon-bone autograft. Study Design Randomized controlled trial; Level of evidence, 1. Methods Seventy-two patients with subacute or chronic rupture of the anterior cruciate ligament were randomly assigned to autograft reconstruction with 4-strand gracilis and semitendinosus tendon (HAM) (N = 37) or with patellar tendon-bone (PTB) (N = 35) from the ipsilateral side. Outcome measurements were the Cincinnati knee score, single-legged hop tests, isokinetic muscle strength tests, pain, knee joint laxity test (KT-1000 arthrometer), and a radiologic evaluation (Kellgren and Lawrence) at 10-year follow-up. Results At 10 years, 57 patients (79%) were eligible for evaluation: 29 in the HAM group and 28 in the PTB group.
No differences were found between the 2 graft groups with respect to the Cincinnati knee score, the single-legged hop tests, pain, muscle strength measurements, or knee joint laxity. Fifty-five percent and 64% of the patients had osteoarthritis corresponding to Kellgren and Lawrence grade 2 or more in the HAM and the PTB groups, respectively (P = .27). For the uninvolved knee, the corresponding numbers were 28% and 22% (P = .62). Conclusion At 10 years postoperatively, no statistically significant differences in clinical outcome between the 2 graft types were found. The prevalence of osteoarthritis was significantly higher in the operated leg than in the contralateral leg, but there were no significant differences between the 2 groups. The results indicate that the choice of graft type after an anterior cruciate ligament injury has minimal influence on the prevalence of osteoarthritis 10 years after surgery. There is little evidence examining the relationship between anatomical landmarks, radiological placement of the tunnels and long-term clinical outcomes following anterior cruciate ligament (ACL) reconstruction. The aim of this study was to investigate the reproducibility of intra-operative landmarks for placement of the tunnels in single-bundle reconstruction of the ACL using four-strand hamstring tendon autografts. Isolated reconstruction of the ACL was performed in 200 patients, who were followed prospectively for seven years with use of the International Knee Documentation Committee forms and radiographs. Taking 0% as the anterior and 100% as the posterior extent, the femoral tunnel was a mean of 86% (SD 5) along Blumensaat's line and the tibial tunnel was 48% (SD 5) along the tibial plateau. Taking 0% as the medial and 100% as the lateral extent, the tibial tunnel was 46% (SD 3) across the tibial plateau and the mean inclination of the graft in the coronal plane was 19 degrees (SD 5.5).
The use of intra-operative landmarks resulted in reproducible placement of the tunnels and an excellent clinical outcome seven years after operation. Vertical inclination was associated with increased rotational instability and degenerative radiological changes, while rupture of the graft was associated with posterior placement of the tibial tunnel. If the osseous tunnels are correctly placed, single-bundle reconstruction of the ACL adequately controls both anteroposterior and rotational instability. The aims of the study were to analyse the change in knee laxity over time after anterior cruciate ligament (ACL) reconstruction, using either bone-patellar tendon-bone (BPTB) or hamstring (HS) tendon autografts, and to compare the knee laxity measurements between the study groups both pre-operatively and on multiple follow-up occasions. Another aim was to compare the radiographic findings in terms of degenerative changes between the study groups. A randomised series of 71 patients, who underwent ACL reconstruction using BPTB or HS tendon autografts and interference screw fixation, were included in the study. Of these patients, 47/71 (66%) attended a clinical examination, including laxity measurements using the KT-1000 arthrometer, pre-operatively and on four post-operative occasions: 6 months, 1 year, 2 years and 7 years after the reconstruction. The BPTB group consisted of 22 patients, while there were 25 patients in the HS group. There were no significant differences in the mean side-to-side knee laxity between the BPTB and the HS group pre-operatively or at the follow-up examinations. There was a tendency towards a reduction in side-to-side knee laxity over time in both groups, measured with the KT-1000 arthrometer. The decrease was significant when analysing the injured and uninjured knee separately (injured side p < 0.001 (BPTB) and p = 0.005 (HS), uninjured side p = 0.008 and p = 0.042, respectively).
Forty-four patients (BPTB 21, ST 23) underwent a radiographic assessment at the 7-year follow-up, which revealed no significant differences between the study groups in terms of osteoarthritic findings classified according to the Fairbank and Ahlbäck rating systems. In overall terms, osteoarthritis was identified in 16% (BPTB 19%; ST 13%; n.s.) according to the Ahlbäck rating system and 68% (BPTB 67%; ST 70%; n.s.) according to the Fairbank rating system. There were no significant differences in knee laxity measurements between the two study groups pre-operatively or at 7 years. A decrease in knee laxity over time was seen in both groups. There were no significant differences between the BPTB and ST groups in terms of osteoarthritic findings at 7 years. Purpose The purpose of this study was to compare the subjective, objective and radiographic outcomes of a lateralized single-bundle bone-patellar tendon-bone autograft with a non-anatomical double-bundle hamstring tendons autograft anterior cruciate ligament (ACL) reconstruction technique at long-term follow-up. Methods Seventy-nine non-consecutive randomized patients (42 men; 37 women) with unilateral ACL insufficiency were prospectively evaluated, before and after ACL reconstruction by means of the above-mentioned techniques, with a minimum follow-up of 8 years (range 8-10 years; mean 8.6 years). In the double-bundle hamstrings technique, we used one tibial and one femoral tunnel combined with one "over-the-top" passage and cortical staple fixation, and we left the hamstrings' tibial insertion intact. Patients were evaluated subjectively and objectively, using the IKDC score, Tegner level, and the manual maximum displacement test with the KT-2000™ arthrometer. Radiographic evaluation was performed according to the IKDC grading system, and the re-intervention rate for meniscal lesions was also recorded.
Results The subjective and objective IKDC scores were similar in both groups, while the double-bundle hamstrings group showed a significantly higher Tegner level (P = 0.0007), higher passive range of motion recovery (P = 0.0014), faster sport resumption (P = 0.0052), lower glide pivot-shift phenomenon (P = 0.0302) and a lower re-intervention rate (P = 0.0116) compared with the patellar tendon group. Radiographic evaluation showed significantly lower objective degenerative changes in the double-bundle hamstrings group at final follow-up (P = 0.0056). Conclusion Although both techniques provide satisfactory results, double-bundle ACL reconstruction shows better functional results, with a faster return to sport activity, a lower re-operation rate and lower degenerative knee changes. Background Long-term follow-up is required for accurate assessment of results after anterior cruciate ligament (ACL) reconstruction, and recent years have witnessed the publication of numerous papers detailing long-term outcomes. The primary aim of this systematic review was to determine which patient factors affect long-term clinical and radiographic outcomes based on the current literature. Methods A comprehensive literature review yielded 18 prospective manuscripts with minimum follow-up ranging from 5 to 12 years after ACL reconstruction. Results Longer follow-up was associated with increased radiographic evidence of osteoarthritis. Increased meniscal or articular cartilage pathology at ACL reconstruction was found to be associated with increased prevalence of radiographic evidence of osteoarthritis at long-term follow-up in most studies. There is currently insufficient evidence to correlate these intra-articular findings with decreases in clinical outcome measures. Further research is needed to determine the effect of body mass index (BMI) on long-term outcome after ACL reconstruction.
Conclusions Intra-articular injuries noted at the time of ACL reconstruction affect long-term results. The effect of BMI and other patient factors is unclear. Long-term follow-up of large multicenter cohorts will provide definitive data on the relative importance of different factors in determining results of ACL reconstruction. OBJECTIVE Prevalence and clinical relevance of patellofemoral (PF) osteoarthritis (OA) after anterior cruciate ligament (ACL) injury. METHOD We prospectively studied 94 of 100 consecutive patients 15 years after acute ACL injury. ACL reconstructions were performed late only if recurrent "give way" persisted or a secondary meniscal injury suitable for repair occurred. The subjects, mean age 42 years, had knee radiographs taken, including a skyline PF view, which were graded according to the atlas of the Osteoarthritis Research Society International. Knee-related symptoms and function were assessed by questionnaires. RESULTS PF OA was present in 12/75 knees (16%). Of the 94 patients, 22 (23%) had had their ACL reconstructed during follow-up. Meniscal injury and ACL reconstruction had occurred more often in knees with PF OA than in knees without PF OA (P=0.004 and P=0.002, respectively). Seven of 15 ACL-reconstructed knees showed radiographic PF OA at follow-up. Knees with PF OA had more extension and flexion deficit than knees without PF OA. Subjects with PF OA maintained a higher activity level from injury to follow-up, but did not differ significantly from those without PF OA regarding patient-relevant symptoms and knee function. However, there was a trend toward worse outcome in subjects with PF OA. CONCLUSION We found a relatively low prevalence of mild PF OA after ACL injury treated non-operatively, and it had limited impact on knee symptoms and patient-relevant knee function.
At follow-up, PF OA was associated with higher activity level, meniscal injury, extension and flexion deficit, and ACL reconstruction. Forty-three patients who had undergone an anterior cruciate ligament (ACL) reconstruction using a doubled semitendinosus and gracilis graft were prospectively reviewed at 5-year follow-up. All had suffered subacute or chronic tears of the ligament. At surgery, the femoral tunnel was drilled first through the antero-medial portal. The correct position of the femoral and tibial guide wire was checked fluoroscopically. Cortical fixation to the bone was achieved in the femur with a Mitek anchor, directly passing the two tendons through the slot of the anchor, and in the tibia with an RCI screw, supplemented with a spiked washer and bicortical screw. Rehabilitation was aggressive, controlled and without braces. The International Knee Documentation Committee (IKDC) form, KT-1000 arthrometer, and Cybex dynamometer were employed for clinical evaluation. A radiographic study was also performed. At the 5-year follow-up, all the patients had recovered full range of motion and 2% of them complained of pain during light sports activities. Four patients (9.5%) reported giving-way symptoms. The KT-1000 side-to-side difference was on average 2.1 mm at 30 lb, and 68% of the knees were within 2 mm. The final IKDC score showed 90% satisfactory results. There was no difference between the 2-year and 5-year evaluations in terms of stability. Extensor and flexor muscle strength recovery was almost complete (maximum deficit 5%). The radiographic study showed tunnel widening in 32% of the femurs and 40% of the tibias. A correlation was found between the incidence of tibial tunnel widening and the distance of the RCI screw from the joint (the closer the screw to the joint, the lower the incidence of widening).
In conclusion, we can state that, using a four-strand hamstring graft and cortical fixation at both ends, we were able to achieve satisfactory 5-year results in 90% of the patients. OBJECTIVE: To test the feasibility of creating a valid and reliable checklist with the following features: appropriate for assessing both randomised and non-randomised studies; provision of both an overall score for study quality and a profile of scores not only for the quality of reporting, internal validity (bias and confounding) and power, but also for external validity. DESIGN: A pilot version was first developed, based on epidemiological principles, reviews, and existing checklists for randomised studies. Face and content validity were assessed by three experienced reviewers, and reliability was determined using two raters assessing 10 randomised and 10 non-randomised studies. Using different raters, the checklist was revised and tested for internal consistency (Kuder-Richardson 20), test-retest and inter-rater reliability (Spearman correlation coefficient and sign rank test; kappa statistics), criterion validity, and respondent burden. MAIN RESULTS: The performance of the checklist improved considerably after revision of the pilot version. The Quality Index had high internal consistency (KR-20: 0.89), as did the subscales apart from external validity (KR-20: 0.54). Test-retest (r 0.88) and inter-rater (r 0.75) reliability of the Quality Index were good. Reliability of the subscales varied from good (bias) to poor (external validity). The Quality Index correlated highly with an existing, established instrument for assessing randomised studies (r 0.90). There was little difference between its performance with non-randomised and with randomised studies. Raters took about 20 minutes to assess each paper (range 10 to 45 minutes).
CONCLUSIONS: This study has shown that it is feasible to develop a checklist that can be used to assess the methodological quality not only of randomised controlled trials but also non-randomised studies. It has also shown that it is possible to produce a checklist that provides a profile of the paper, alerting reviewers to its particular methodological strengths and weaknesses. Further work is required to improve the checklist and the training of raters in the assessment of external validity. Background: A long-term follow-up comparing double-bundle and single-bundle techniques for anterior cruciate ligament (ACL) reconstruction has not been reported before. Hypothesis: Double-bundle ACL reconstruction may have fewer graft ruptures, lower rates of osteoarthritis (OA), and better stability than single-bundle reconstruction. Study Design: Randomized controlled trial; Level of evidence, 2. Methods: Ninety patients were randomized for double-bundle ACL reconstruction with bioabsorbable screw fixation (DB group; n = 30), single-bundle ACL reconstruction with bioabsorbable screw fixation (SBB group; n = 30), and single-bundle ACL reconstruction with metallic screw fixation (SBM group; n = 30). Evaluation methods consisted of a clinical examination, KT-1000 arthrometer measurements, International Knee Documentation Committee (IKDC) and Lysholm knee scores, and a radiographic examination of both the operated and contralateral knees. Results: Eighty-one patients (90%) were available at the 10-year follow-up. Eleven patients (1 in the DB group, 7 in the SBB group, and 3 in the SBM group) had a graft failure during the follow-up and went on to undergo revision ACL surgery (P = .043). In the remaining 70 patients at 10 years, no significant group differences were found in the pivot-shift test findings, KT-1000 arthrometer measurements, or knee scores.
OA findings were most common in the medial compartment of the knee, seen in 38% of the patients in the operated knee and 28% of the patients in the contralateral nonoperated knee. However, no significant group difference was found. The most severe OA changes were in the patients who had the longest delay from the primary injury to ACL reconstruction (P = .047) and in the patients who underwent partial meniscal resection at the time of ACL reconstruction (P = .024). Conclusion: Double-bundle ACL reconstruction resulted in significantly fewer graft failures than single-bundle ACL reconstruction during the follow-up. Knee stability and OA rates were similar at 10 years. The most severe OA changes were found in the patients who had the longest delay from the primary injury to ACL reconstruction and in the patients who underwent partial meniscal resection at the time of ACL reconstruction. A clinical, radiographic, and scintigraphic comparative study was performed on 57 consecutive successful patellar tendon anterior cruciate ligament reconstructions for chronic laxity. Patients were divided into 3 matched groups according to the medial meniscal treatment. Group A included 18 patients with medial meniscal repairs; Group B, 19 patients with partial medial meniscectomies; and Group C, 20 patients with normal menisci (controls). The average followup was 55 months. At clinical examination, patients in Group B had more activity-related pain than those in Group C (p = 0.04). The anteroposterior weight-bearing views in extension showed more degenerative changes in the medial compartment in Group B than in the other 2 groups (Group A versus B, p = 0.01; Group C versus B, p < 0.001). Scintigraphy showed an increased uptake in the operated knee as compared with the normal side (11%), but no differences among the 3 study groups.
The patients with partial meniscectomies had more pain and degenerative radiographically evident changes than the control group. Medial meniscal repair offers a better chance than partial meniscectomy to preserve the articular cartilage of the medial compartment. Bone homeostasis, as detected by bone scanning, remains slightly altered in successful reconstructions as compared with the opposite normal side. PURPOSE: The purpose of this study was to analyze the clinical outcome of arthroscopic anterior cruciate ligament (ACL) reconstruction with bone-patellar tendon-bone autograft versus allograft. METHODS: Between May 2000 and June 2004, 172 patients undergoing arthroscopic bone-patellar tendon-bone ACL reconstruction were prospectively randomized into autograft (n = 86) or allograft (n = 86) groups. The senior surgeon performed all operations using the same surgical technique. Each fixation was performed by means of an interference screw. Patients were evaluated preoperatively and postoperatively at follow-up. Of the patients, 156 (76 in the autograft group and 80 in the allograft group) were available for full evaluation. Evaluations included a detailed history, physical examination, functional knee ligament testing, KT-2000 arthrometer testing (MEDmetric, San Diego, CA), Harner's vertical jump and Daniel's 1-leg hop tests, Lysholm score, Tegner score, International Knee Documentation Committee standard evaluation form, Cincinnati knee score, and radiograph. RESULTS: Demographic data were comparable between groups. The mean follow-up was 5.6 years for both groups. There were no statistically significant differences according to evaluations of outcome between the 2 groups, except that patients in the allograft group had a shorter operation time and longer fever time postoperatively compared with the autograft group.
The postoperative infection rates were 0% and 1.25% for the autograft group and allograft group, respectively. There was a significant difference (P < .05) in the development of osteoarthritis between the operated knee in comparison to the contralateral knee according to radiographs. However, no significant difference was found between the 2 groups at the final follow-up examination (P > .05). CONCLUSIONS: Both groups of patients achieved almost the same satisfactory outcomes after a mean of 5.6 years of follow-up. Allograft is a reasonable alternative to autograft for ACL reconstruction. LEVEL OF EVIDENCE: Level II, prospective comparative study. We investigated the long-term outcome of 100 patients 15 years after having been randomly allocated to primary repair (augmented or non-augmented) or non-surgical treatment of an anterior cruciate ligament (ACL) rupture. The subjective outcome was similar between the groups, with no difference regarding activity level and knee-injury and osteoarthritis outcome score, but with a slightly lower Lysholm score for the non-surgically treated group. This difference was attributed to more instability symptoms. The radiological osteoarthritis (OA) frequency did not differ between surgically or non-surgically treated patients, but if a meniscectomy was performed, two-thirds of the patients showed OA changes regardless of initial treatment of the ACL. There were significantly more meniscus injuries in patients initially treated non-surgically. One-third of the patients in the non-surgically treated group underwent secondary ACL reconstruction due to instability problems. In this study, ACL repair itself could not reduce the risk of OA nor increase the subjective outcome scores. However, one-third of the non-surgically treated patients were later ACL reconstructed due to instability. The status of the menisci was found to be the most important predictor of developing OA.
Early ACL repair and also ACL reconstruction can reduce the risk of secondary meniscus tears. Indirectly, this supports the hypothesis that early stabilization of the knee after ACL injury is advantageous for the long-term outcome. Purpose: Analysis of long-term clinical and radiological outcomes after anterior cruciate ligament (ACL) reconstruction, with special attention to knee osteoarthritis and its predictors. Methods: A prospective, consecutive case series of 100 patients. Arthroscopic transtibial ACL reconstruction was performed using 4-strand hamstring tendon autografts with a standardized accelerated rehabilitation protocol. Analysis was performed preoperatively and 10 years postoperatively. Clinical examination included Lysholm and Tegner scores, IKDC, KT-1000 testing (MEDmetric Co., San Diego, CA, USA) and leg circumference measurements. Radiological evaluation included AP weight bearing, lateral knee, Rosenberg and sky view X-rays. Radiological classifications were according to Ahlbäck and Kellgren & Lawrence. Statistical analysis included univariate and multivariate logistic regressions. Results: Clinical outcome: A significant improvement (p < 0.001) between preoperative and postoperative measurements could be demonstrated for the Lysholm and Tegner scores, IKDC patient subjective assessment, KT-1000 measurements, pivot shift test, IKDC score and one-leg hop test. A pivot shift phenomenon (glide) was still present in 43 (50%) patients and correlated with lower levels of activity (p < 0.022). Radiological outcome: At follow-up, 46 (53.5%) patients had signs of osteoarthritis (OA). In this group, 33 patients (72%) had chondral lesions (≥ grade 2) at the time of ACL reconstruction. A history of medial meniscectomy before or at the time of ACL reconstruction increased the risk of knee OA 4 times (95% CI 1.41–11.5).
An ICRS grade 3 at the time of ACL reconstruction increased the risk of knee OA by 5.2 times (95% CI 1.09–24.8). There was no correlation between OA and activity level (Tegner score ≥ 6), nor between OA and a positive pivot shift test. Conclusion: Transtibial ACL reconstruction with 4-strand hamstring autograft and accelerated rehabilitation restored anteroposterior knee stability. Clinical parameters and patient satisfaction improved significantly. At 10-year follow-up, radiological signs of OA were present in 53.5% of the subjects. Risk factors for OA were meniscectomy prior to or at the time of ACL reconstruction and chondral lesions at the time of ACL reconstruction. Level of evidence: II. Background: Because of specific methodological difficulties in conducting randomized trials, surgical research remains dependent predominantly on observational or non-randomized studies. Few validated instruments are available to determine the methodological quality of such studies, either from the reader's perspective or for the purpose of meta-analysis. The aim of the present study was to develop and validate such an instrument. In a prospective seven-year study, we treated 32 patients with partial ruptures of the anterior cruciate ligament (ACL) verified by arthroscopy. Twelve knees (38%) progressed to complete ACL deficiency, with positive pivot shift tests and increased anteroposterior translation on tests with the KT-1000 arthrometer. Patients with partial ACL tears frequently had limitation for strenuous sports, while those developing ACL deficiency had additional functional limitations involving recreational activities.
Three factors were statistically significant in predicting which partial tears would develop complete ACL deficiency: the amount of ligament tearing (one-fourth tears infrequently progressed, one-half tears progressed in 50% and three-fourth tears in 86%); a subtle increase in initial anterior translation; and the occurrence of a subsequent re-injury with giving-way. Background: Meniscectomy and articular cartilage damage have been found to increase the prevalence of osteoarthritis after anterior cruciate ligament reconstruction, but the effect of knee range of motion has not been extensively studied. Hypothesis: The prevalence of osteoarthritis as observed on radiographs would be higher in patients who had abnormal knee range of motion compared with patients with normal knee motion, even when grouped for like meniscal or articular cartilage lesions. Study Design: Cohort study; Level of evidence, 3. Methods: We prospectively followed patients at a minimum of 5 years after surgery. The constant goal of rehabilitation was to obtain full knee range of motion as quickly as possible after surgery and maintain it in the long term. Range of motion and radiographs were evaluated at the time of initial return to full activities (early follow-up) and final follow-up according to International Knee Documentation Committee (IKDC) objective criteria. A patient was considered to have normal range of motion if extension was within 2° of the opposite knee including hyperextension, and knee flexion was within 5°. Radiograph findings were rated as abnormal if any signs of joint space narrowing, sclerosis, or osteophytes were present. Results: Follow-up was obtained for 780 patients at a mean of 10.5 ± 4.2 years after surgery. Of these, 539 had either normal or abnormal motion at both early and final follow-up.
In 479 patients who had normal extension and flexion at both early and final follow-up, 188 (39%) had radiographic evidence of osteoarthritis, versus 32 of 60 (53%) patients who had less than normal extension or flexion at early and final follow-up (P = .036). In subgroups of patients with like meniscal status, the prevalence of normal radiograph findings was significantly higher in patients with normal motion at final follow-up versus patients with motion deficits. Multivariate logistic regression analysis of categorical variables showed that abnormal knee flexion at early follow-up, abnormal knee extension at final follow-up, abnormal knee flexion at final follow-up, partial medial meniscectomy, and articular cartilage damage were significant factors related to the presence of osteoarthritis on radiographs. Abnormal knee extension at early follow-up showed a trend toward statistical significance (P = .0544). Logistic regression showed the odds of having osteoarthritis were 2 times greater for patients with abnormal range of motion at final follow-up; these odds were similar for those with partial medial meniscectomy and articular cartilage damage. Conclusion: The prevalence of osteoarthritis on radiographs in the long term after anterior cruciate ligament reconstruction is lower in patients who achieve and maintain normal knee motion, regardless of the status of the meniscus. Background: Specific guidelines for operative versus nonoperative management of anterior cruciate ligament injuries do not yet exist. Hypothesis: Surgical risk factors can be used to indicate whether reconstruction or conservative management is best for an individual patient. Study Design: Prospective nonrandomized controlled clinical trial; Level of evidence, 2. Methods: Patients were classified as high, moderate, or low risk using preinjury sports participation and knee laxity measurements.
Early anterior cruciate ligament reconstruction (within 3 months of injury) was recommended to high-risk patients and conservative care to low-risk patients. It was recommended that moderate-risk patients have either early reconstruction or conservative care, according to the day of presentation. Assessment of subjective outcomes, activity, physical measurements, and radiographs was performed at a mean follow-up of 6.6 years. Results: Early phase conservative management resulted in more late phase meniscus surgery than did early phase reconstruction at all risk levels (high risk, 25% vs 6.5%; moderate risk, 37% vs 7.7%, P = .01; low risk, 16% vs 0%). Early- and late-reconstruction patients' Tegner scores increased from presurgery to follow-up (P < .001) but did not return to preinjury levels. Early-reconstruction patients had higher rates of degenerative change on radiographs than did nonreconstruction patients (P < .05). Conclusions: Early phase reconstruction reduced late phase knee laxity, risk of symptomatic instability, and the risk of late meniscus tear and surgery. Moderate- and high-risk patients had similar rates of late phase injury and surgery. Reconstruction did not prevent the appearance of late degenerative changes on radiographs. A relationship between bone contusion on initial magnetic resonance images and the finding of degenerative changes on follow-up radiographs was not detected. The treatment algorithm used in this study was effective in predicting risk of late phase knee surgery. Abstract: Ninety-one patients were assessed 5–9 years after an anterior cruciate ligament reconstruction (bone patella-tendon bone autograft). Forty-eight patients had been treated within 6 weeks of the injury (Group I) and 43 patients more than 3 months after the injury (Group II). Seventy-three patients had either a normal or nearly normal final outcome. The mean Lysholm score was 82 and the mean Marshall score was 42.
Eighty-nine patients had normal or nearly normal stability in the operated knee when compared to the contralateral joint. In none of these results was there any significant difference between the groups. Results of functional and of isokinetic strength tests, as well as the presence of anterior knee pain, were also similar in both groups. However, patients with early reconstruction had fewer degenerative changes in the tibio-femoral joint and were more satisfied with the result. They also returned to their pre-injury level of sports activity more often than those patients in the late reconstruction group. Résumé: We reviewed ninety-two patients who had undergone reconstruction of the anterior cruciate ligament of the knee with a bone-patellar tendon-bone autograft, at 5 to 9 years of follow-up. Forty-eight patients were operated on within 6 weeks of the injury (Group I) and forty-three patients more than three months after the injury (Group II). Sixty-three patients had a normal or nearly normal result. The Lysholm score was 82 and the Marshall score was 42. Stability of the operated knee was normal or nearly normal in eighty-nine cases. There was no significant difference between the two groups. The results of functional tests, isokinetic muscle strength and residual knee pain were similar in the two groups. However, with early reconstruction there were fewer degenerative lesions of the tibio-femoral joint, less discomfort or pain, and return to sports activity was more frequent. Our results show that early repair is preferable. Arthroscopic ACL reconstruction has a satisfactory functional outcome of up to 90%, but there are few long-term prospective studies. This prospective study presents the outcomes of ACL reconstruction in terms of laxity, function and degenerative change, after a mean follow-up of 7 years.
Function was assessed using the Lysholm and Tegner Activity Scores, laxity using the Stryker Knee Laxity Tester, employing maximum manual effort, and degenerative change was assessed as joint line narrowing on standardised radiographs. At latest follow-up, the mean Lysholm score improved from 70 to 87 and the Tegner from 4 to 7 (P < 0.001). AP translation also improved (P < 0.001). The incidence of early degenerative change was 50%, and although this appeared to be associated with a previous meniscectomy, the correlation was not significant (P = 0.06). In conclusion, the improved functional scores and laxity are sustained beyond 7 years, but the 50% incidence of early degenerative change may be a cause for concern. Purpose: The aim of our study was to review the clinical and radiological outcome of patients who had undergone anterior cruciate ligament (ACL) reconstruction in comparison to a group of non-operatively treated patients. Methods: In a retrospective study we compared ACL reconstruction using a bone-patellar tendon-bone graft with a non-operatively treated group of patients 17–20 years later. Fifty-four patients that met the inclusion criteria, with arthroscopically proven ACL rupture, were treated between 1989 and 1991. Thirty-three patients underwent ACL reconstruction, forming group one. Eighteen non-reconstructed patients continued with rehabilitation and modification of activities (group two). The International Knee Documentation Committee (IKDC) subjective and objective evaluation forms and the Lysholm and Tegner scale were used to assess the knees at follow-up. Radiographic assessment was performed using the IKDC grading scale. Results: Follow-up results showed that 83% of reconstructed patients had stable knees and normal or nearly normal IKDC grade. Patients in the non-reconstructed group had unstable knees, with 84% having abnormal or severe laxity.
The subjective IKDC score was significantly in favour of group one: 83.15 compared to 64.6 in group two. The Lysholm and Tegner score was also significantly better in group one. Conservatively treated patients all had unstable knees and worse scores. The rate of osteoarthritis showed more severe changes in non-reconstructed patients with additional meniscus injury. Conclusions: We can conclude that 94% of patients who underwent ACL reconstruction had stable knees after 15–20 years, and there was a significantly lower percentage of osteoarthritis in comparison to conservatively treated patients. Background: No consensus has been reached on the advantages of double-bundle (DB) anterior cruciate ligament reconstruction (ACLR) over the single-bundle (SB) technique, particularly with respect to the prevention of osteoarthritis (OA) after ACLR. Purpose: To evaluate whether DB ACLR has any advantages in the prevention of OA or provides better stability and function after ACLR compared with the SB technique. Study Design: Randomized controlled trial; Level of evidence, 2. Methods: A total of 130 patients with an ACL injury in one knee were prospectively randomized into a DB group (n = 65) or an SB group (n = 65). For the radiologic evaluation, we determined the degree of OA based on the Kellgren-Lawrence grade before the operation and at the time of the final follow-up and determined the number of patients with progression of OA more than one grade from pre- to postoperation. We evaluated the stability results using the Lachman and pivot-shift tests and stress radiography. We also compared the functional outcomes based on the Lysholm knee score, Tegner activity score, and International Knee Documentation Committee (IKDC) subjective scale. Results: Six patients (4 in the DB group and 2 in the SB group) suffered graft failure during the follow-up and had ACL revision surgery (P = .06).
A total of 112 patients were observed for a minimum of 4 years (DB group, n = 52; SB group, n = 60). Five patients (9.6%) in the DB group and 6 patients (10%) in the SB group had more advanced OA at the final follow-up (P = .75). All patients recovered full range of motion within 6 months from surgery. Stability results of the Lachman test, pivot-shift test, and the radiographic stability test failed to reveal any significant intergroup differences (P = .37, .27, and .67, respectively). In the pivot-shift result, the DB group had 4 patients with grade 2 and the SB group had 3 patients with grade 2 (P = .27). Clinical outcomes, including Lysholm knee and Tegner activity scores, were similar in the 2 groups. Statistical significance was achieved only for the IKDC subjective scale (78.2 in DB group vs 73.1 in SB group; P = .03). Conclusion: The DB technique, compared with SB, was not more effective in preventing OA and did not have a more favorable failure rate. Although the DB ACLR technique produced a better IKDC subjective scale result than did the SB ACLR technique, the 2 modalities were similar in terms of clinical outcomes and stability after a minimum 4 years of follow-up. PURPOSE: To analyze the long-term evaluation of clinical, functional, and magnetic resonance imaging (MRI) results after implant-free press-fit anterior cruciate ligament (ACL) reconstruction with bone-patella tendon (BPT) versus quadrupled hamstring tendon (HT) grafts. METHODS: Sixty-two ACL-insufficient patients were included in a prospective, randomized study (31 BPT and 31 HT). Both surgical procedures were performed without any implants by a press-fit technique by the senior author. The femoral tunnel was drilled through the anteromedial portal for anatomic placement. At 8.8 years after reconstruction, 53 patients (28 BPT and 25 HT) were examined by different clinical and functional tests.
Bilateral MRI scans were performed and interpreted by an independent radiologist. RESULTS: On follow-up, the score on the International Knee Documentation Committee evaluation form was significantly better in the HT group. The clinical examination, including range of motion, KT-1000 test (MEDmetric, San Diego, CA), and pivot-shift test, showed no significant differences. On isokinetic testing, the mean quadriceps strength was close to normal (96%) in both groups, but the hamstring strength was lower in the HT group (100.3%/95.1%). Kneeling (1.5/1.1, P = .002), knee walking (1.72/1.14, P = .002), and single-leg hop test (95.8%/99.1%, P = .057) were better in the HT group. The MRI findings for the mean degree of cartilage lesion (International Cartilage Repair Society protocol) of the operated (2.1/2.1) and nonoperated (1.4/1.8) knee showed no significant differences. No significant difference was found in the grade of medial or lateral meniscal lesion or the number of patients having meniscal lesions when the operated and nonoperated knees were compared. Tunnel measurements, Caton-Deschamps Index, and the sagittal ACL angle were similar. CONCLUSIONS: The implant-free press-fit technique for anterior cruciate ligament reconstruction using bone-patellar tendon and hamstring grafts with anatomic graft placement is an innovative technique to preserve the cartilage and meniscal status, without significant differences between the operated and nonoperated knees in the long term. Significantly less anterior knee pain was noted in the hamstring group when testing for kneeling and knee walking. LEVEL OF EVIDENCE: Level II, prospective comparative study. Background: The purpose of this study was to further delineate the outcome of arthroscopically assisted anterior cruciate ligament reconstruction in 125 patients who had previously been followed for two to five years.
One of the original 125 patients was excluded from the present study because of insufficient follow-up, and an additional group of 101 patients was added. All 225 patients in the present study were followed for a minimum of six years. Methods: Patients were randomly assigned to reconstruction with a double-stranded semitendinosus-gracilis graft with use of a two-incision technique (group I), reconstruction with a patellar ligament graft with use of a two-incision technique (group II), or reconstruction with a patellar ligament graft with use of a single-incision endoscopic technique (group III). The groups were compared with regard to the rate of graft failure, the amount of instability, knee strength, radiographic signs of degenerative changes, and functional outcome. Results: There was no significant difference among the three groups with regard to the rate of graft failure, the amount of knee instability, or the functional outcome. A normal or nearly normal functional outcome was recorded for 208 (92%) of the 225 patients. There were significant differences among the groups with regard to quadriceps muscle-strength deficits: group I had fewer patients with deficits than group III, and groups I and III both had fewer patients with deficits than group II (p = 0.04). There also were significant differences among the groups with regard to hamstring muscle-strength deficits: group III had fewer patients with deficits than group II, and group II had fewer patients with deficits than group I (p < 0.01). Twelve knees (16%) in group I, six knees (8%) in group II, and eight knees (11%) in group III showed radiographic evidence of progressive degenerative changes, but the differences among the three groups were not significant.
Conclusion: Although 11.6% of the 225 knees had radiographic evidence of degenerative arthritis at a minimum of six years after arthroscopically assisted reconstruction of the anterior cruciate ligament, the choice of graft and the technique of reconstruction did not seem to affect the rate of development of these changes. Background: Few prospective long-term studies of more than 10 years have reported changes in knee function and radiologic outcomes after anterior cruciate ligament (ACL) reconstruction. Purpose: To examine changes in knee function from 6 months to 10 to 15 years after ACL reconstruction and to compare knee function outcomes over time for subjects with isolated ACL injury with those with combined ACL and meniscal injury and/or chondral lesion. Furthermore, the aim was to compare the prevalence of radiographic and symptomatic radiographic knee osteoarthritis between subjects with isolated ACL injuries and those with combined ACL and meniscal and/or chondral lesions 10 to 15 years after ACL reconstruction. Study Design: Cohort study; Level of evidence, 2. Methods: Follow-up evaluations were performed on 221 subjects at 6 months, 1 year, 2 years, and 10 to 15 years after ACL reconstruction with bone-patellar tendon-bone autograft. Outcome measurements were KT-1000 arthrometer, Lachman and pivot shift tests, Cincinnati knee score, isokinetic muscle strength tests, hop tests, visual analog scale for pain, Tegner activity scale, and the Kellgren and Lawrence classification. Results: One hundred eighty-one subjects (82%) were evaluated at the 10- to 15-year follow-up. A significant improvement over time was revealed for all prospective outcomes of knee function. No significant differences in knee function over time were detected between the isolated and combined injury groups.
Subjects with combined injury had a significantly higher prevalence of radiographic knee osteoarthritis compared with those with isolated injury (80% and 62%, P = .008), but no significant group differences were shown for symptomatic radiographic knee osteoarthritis (46% and 32%, P = .053). Conclusion: An overall improvement in knee function outcomes was detected from 6 months to 10 to 15 years after ACL reconstruction for both those with isolated and combined ACL injury, but a significantly higher prevalence of radiographic knee osteoarthritis was found for those with combined injuries. Patellofemoral pain (PFP) remains one of the most common conditions encountered in sports medicine. Characterised by anterior knee pain that is aggravated by activities such as running, squatting and stair ambulation, PFP generally reduces or restricts physical activity. While PFP may subside with activity reduction, the natural history of this common condition is not one of spontaneous recovery. Indeed, PFP is often recalcitrant and can persist for many years. In a prospective study of people with PFP, symptoms persisted in 25% of people up to 20 years.1 Despite considerable evidence for the efficacy of conservative interventions for PFP, such as multimodal physiotherapy,2 these interventions do not appear to have long-lasting effects.2 Compounding the management of PFP is that surgery for PFP is widely considered to have poor outcomes.
As mentioned in the 2014 consensus statement from the International Patellofemoral Pain Research Retreat,3 there is speculation that PFP may be a prelude to degenerative joint changes and ultimately the development of patellofemoral osteoarthritis (PFOA).4,5 While no current studies have prospectively followed people with PFP through to the development of PFOA (and thus verified this relationship), a recent systematic review4 observed that individuals undergoing arthroplasty for PFOA were more than twice as likely (OR = 2.31, 95% CI 1.37 to 3.88) to report having had PFP as an adolescent than patients undergoing an arthroplasty for isolated tibiofemoral OA. In the absence of rigorous
No serious side-effects were reported. Conclusion LLLT administered with optimal doses of 904 nm and possibly 632 nm wavelengths directly to the lateral elbow tendon insertions seems to offer short-term pain relief and less disability in LET, both alone and in conjunction with an exercise regimen.
Background Recent reviews have indicated that low level laser therapy (LLLT) is ineffective in lateral elbow tendinopathy (LET) without assessing the validity of treatment procedures and doses or the influence of prior steroid injections.
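Several of the 904 nm laser protocols reported below quote both the device settings and a per-point energy dose; the dose follows directly from mean power output multiplied by irradiation time. A minimal check in Python, using the 12 mW / 30 s Ga-As parameters reported in the trials that follow:

```python
# Per-point energy dose (J) = mean power output (W) x irradiation time (s).
# Parameters taken from the 904 nm Ga-As protocols described in this review.
mean_power_w = 12e-3    # 12 mW mean output
time_per_point_s = 30   # 30 s per treated point

dose_j = mean_power_w * time_per_point_s
print(f"{dose_j:.2f} J/point")  # 0.36 J/point, matching the reported dose
```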
Forty-nine patients suffering from lateral humeral epicondylalgia were enrolled in a double-blind study to observe the effects of Ga-As laser applied to acupuncture points. The Mid 1500 IRRADIA laser machine was used: wavelength 904 nm, mean power output 12 mW, peak value 8.3 W, frequency 70 Hz (pulse train). Localization of points: LI 10, 11, 12, Lu 5 and SJ 5. Each point was treated for 30 sec, resulting in a treatment dose of 0.36 J/point. The patients were treated 2-3 times weekly, with 10 treatments in all. Follow-ups were done after 3 months and 1 year. No significant differences were observed between the laser and the placebo group in relation to the subjective or objective outcome after 10 treatments or at the follow-ups Abstract Objective: To assess the association between competing interests and authors' conclusions in randomised clinical trials. Design: Epidemiological study of randomised clinical trials published in the BMJ from January 1997 to June 2001. Financial competing interests were defined as funding by for-profit organisations and other competing interests as personal, academic, or political. Studies: 159 trials from 12 medical specialties. Main outcome measures: Authors' conclusions defined as interpretation of the extent to which overall results favoured the experimental intervention. Conclusions were appraised on a 6-point scale; higher scores favour the experimental intervention. Results: Authors' conclusions were significantly more positive towards the experimental intervention in trials funded by for-profit organisations alone compared with trials without competing interests (mean difference 0.48 (SE 0.13), P=0.014), trials funded by both for-profit and non-profit organisations (0.30 (SE 0.10), P=0.003), and trials with other competing interests (0.45 (SE 0.13), P=0.006).
Other competing interests and funding from both for-profit and non-profit organisations were not significantly associated with authors' conclusions. The association between financial competing interests and authors' conclusions was not explained by methodological quality, statistical power, type of experimental intervention (pharmacological or non-pharmacological), type of control intervention (for example, placebo or active drug), or medical specialty. Conclusions: Authors' conclusions in randomised clinical trials significantly favoured experimental interventions if financial competing interests were declared. Other competing interests were not significantly associated with authors' conclusions The treatment of lateral epicondylalgia, a widely-used model of musculoskeletal pain in the evaluation of many physical therapy treatments, remains somewhat of an enigma. The protagonists of a new treatment technique for lateral epicondylalgia report that it produces substantial and rapid pain relief, despite a lack of experimental evidence. A randomized, double-blind, placebo-controlled repeated-measures study evaluated the initial effect of this new treatment in 24 patients with unilateral, chronic lateral epicondylalgia. Pain-free grip strength was assessed as an outcome measure before, during and after the application of the treatment, placebo and control conditions. Pressure-pain thresholds were also measured before and after the application of treatment, placebo and control conditions. The results demonstrated a significant and substantial increase in pain-free grip strength of 58% (of the order of 60 N) during treatment but not during placebo and control. In contrast, the 10% change in pressure-pain threshold after treatment, although significantly greater than placebo and control, was substantially smaller than the change demonstrated for pain-free grip strength. This effect was only present in the affected limb.
The selective and specific effect of this treatment technique provides a valuable insight into the physical modulation of musculoskeletal pain and requires further investigation BACKGROUND AND PURPOSE Lateral epicondylitis ("tennis elbow") is a common entity. Several nonoperative interventions, with varying success rates, have been described. The aim of this study was to compare the effectiveness of 2 protocols for the management of lateral epicondylitis: (1) manipulation of the wrist and (2) ultrasound, friction massage, and muscle stretching and strengthening exercises. SUBJECTS AND METHODS Thirty-one subjects with a history and examination results consistent with lateral epicondylitis participated in the study. The subjects were randomly assigned to either a group that received manipulation of the wrist (group 1) or a group that received ultrasound, friction massage, and muscle stretching and strengthening exercises (group 2). Three subjects were lost to follow-up, leaving 28 subjects for analysis. Follow-up was at 3 and 6 weeks. The primary outcome measure was a global measure of improvement, as assessed on a 6-point scale. Analysis was performed using independent t tests, Mann-Whitney U tests, and Fisher exact tests. RESULTS Differences were found for 2 outcome measures: success rate at 3 weeks and decrease in pain at 6 weeks. Both findings indicated manipulation was more effective than the other protocol. After 3 weeks of intervention, the success rate in group 1 was 62%, as compared with 20% in group 2. After 6 weeks of intervention, improvement in pain as measured on an 11-point numeric scale was 5.2 (SD=2.4) in group 1, as compared with 3.2 (SD=2.1) in group 2.
DISCUSSION AND CONCLUSION Manipulation of the wrist appeared to be more effective than ultrasound, friction massage, and muscle stretching and strengthening exercises for the management of lateral epicondylitis when there was a short-term follow-up. However, replication of our results is needed in a large-scale randomized clinical trial with a control group and a longer-term follow-up Abstract Objective To investigate the efficacy of physiotherapy compared with a wait-and-see approach or corticosteroid injections over 52 weeks in tennis elbow. Design Single-blind randomised controlled trial. Setting Community setting, Brisbane, Australia. Participants 198 participants aged 18 to 65 years with a clinical diagnosis of tennis elbow of a minimum six weeks' duration, who had not received any other active treatment by a health practitioner in the previous six months. Interventions Eight sessions of physiotherapy; corticosteroid injections; or wait and see. Main outcome measures Global improvement, grip force, and assessor's rating of severity measured at baseline, six weeks, and 52 weeks. Results Corticosteroid injection showed significantly better effects at six weeks but with high recurrence rates thereafter (47/65 of successes subsequently regressed) and significantly poorer outcomes in the long term compared with physiotherapy. Physiotherapy was superior to wait and see in the short term; no difference was seen at 52 weeks, when most participants in both groups reported a successful outcome. Participants who had physiotherapy sought less additional treatment, such as non-steroidal anti-inflammatory drugs, than did participants who had wait and see or injections. Conclusion Physiotherapy combining elbow manipulation and exercise has a superior benefit to wait and see in the first six weeks and to corticosteroid injections after six weeks, providing a reasonable alternative to injections in the mid to long term.
The significant short-term benefits of corticosteroid injection are paradoxically reversed after six weeks, with high recurrence rates, implying that this treatment should be used with caution in the management of tennis elbow OBJECTIVE The aim of this study was to evaluate the effectiveness of 904-nm low-level laser therapy (LLLT) in the management of lateral epicondylitis. BACKGROUND DATA Lateral epicondylitis is characterized by pain and tenderness over the lateral elbow, which may also result in reduction in grip strength and impairment in physical function. LLLT has been shown to have therapeutic effects in tissue healing and pain control. METHODS Thirty-nine patients with lateral epicondylitis were randomly assigned to receive either active laser with an energy dose of 0.275 J per tender point (laser group) or sham irradiation (placebo group) for a total of nine sessions. The outcome measures were mechanical pain threshold, maximum grip strength, level of pain at maximum grip strength as measured by the Visual Analogue Scale (VAS), and the subjective rating of physical function with the Disabilities of the Arm, Shoulder and Hand (DASH) questionnaire. RESULTS Significantly greater improvements were shown in all outcome measures with the laser group than with the placebo group (p < 0.0125), except in the two subsections of DASH. CONCLUSION This study revealed that LLLT in addition to exercise is effective in relieving pain, and in improving the grip strength and subjective rating of physical function of patients with lateral epicondylitis OBJECTIVE This study was undertaken to compare the effectiveness of a protocol combining laser with plyometric exercises and a protocol of placebo laser with the same program, in the treatment of tennis elbow. BACKGROUND DATA The use of low-level laser has been recommended for the management of tennis elbow with contradictory results.
Also, plyometric exercises were recommended for the treatment of the tendinopathy. METHODS Fifty patients who had tennis elbow participated in the study and were randomised into two groups. Group A (n = 25) was treated with a 904 Ga-As laser CW, frequency 50 Hz, intensity 40 mW and energy density 2.4 J/cm(2), plus plyometric exercises, and group B (n = 25) received placebo laser plus the same plyometric exercises. During eight weeks of treatment, the patients of the two groups received 12 sessions of laser or placebo, two sessions per week (weeks 1-4) and one session per week (weeks 5-8). Pain at rest, at palpation on the lateral epicondyle, during resisted wrist extension, during the middle finger test, and during strength testing was evaluated using Visual Analogue Scales. Grip strength, range of motion and the weight test were also evaluated. Parameters were determined before the treatment, at the end of the eight-week course of treatment (week 8), and eight weeks after the end of treatment.
RESULTS Relative to group B, group A had (1) a significant decrease of pain at rest at the end of 8 weeks of the treatment (p < 0.005) and at the end of the follow-up period (p < 0.05), (2) a significant decrease in pain at palpation and pain on isometric testing at 8 weeks of treatment (p < 0.05) and at 8 weeks follow-up (p < 0.001), (3) a significant decrease in pain during the middle finger test at the end of 8 weeks of treatment (p < 0.01) and at the end of the follow-up period (p < 0.05), (4) a significant decrease of pain during grip strength testing at 8 weeks of treatment (p < 0.05) and at 8 weeks follow-up (p < 0.001), (5) a significant increase in the wrist range of motion at 8 weeks follow-up (p < 0.01), (6) an increase in grip strength at 8 weeks of treatment (p < 0.05) and at 8 weeks follow-up (p < 0.01), and (7) a significant increase in the weight test at 8 weeks of treatment (p < 0.05) and at 8 weeks follow-up (p < 0.005). CONCLUSION The results suggested that the combination of laser with plyometric exercises was a more effective treatment than placebo laser with the same plyometric exercises, at the end of the treatment as well as at the follow-up. Future studies are needed to establish the relative and absolute effectiveness of the above protocol OBJECTIVE To investigate the course of lateral epicondylitis and identify prognostic indicators associated with short- and long-term outcome of pain intensity. METHODS We prospectively followed patients (n = 349) from 2 randomized controlled trials investigating conservative interventions for lateral epicondylitis in primary care. Uni- and multivariate linear regression analyses were used to investigate the association between potential prognostic indicators and pain intensity (0-100 point scale) measured at 1, 6, and 12 months after randomization.
Potential prognostic factors were duration of elbow complaints, concomitant neck pain, concomitant shoulder pain, previous elbow complaints, baseline pain scores, age, gender, involvement of the dominant side, social class, and work status. The variables "study" and "treatment" were included as covariates in all models. RESULTS Pain scores at 1-month follow-up were higher in patients with severe pain, a long duration of elbow complaints, and concomitant shoulder pain. At 12-month follow-up, the only different prognostic indicator for poor outcome was concomitant neck pain, in place of shoulder pain. Patients from higher social classes reported lower pain scores at 12-month follow-up than patients from lower social classes. CONCLUSIONS Lateral epicondylitis seems to be a self-limiting condition in most patients. Long duration of elbow complaints, concomitant neck pain, and severe pain at presentation are associated with poor outcome at 12 months. Our results will help care providers give patients accurate information regarding their prognosis and assist in medical decision-making OBJECTIVES Pulsed low-intensity ultrasound therapy (LIUS) has been found to be beneficial in accelerating fracture healing and has produced positive results in animal tendon repair. In the light of this we undertook a randomized, double-blind, placebo-controlled trial to assess the effectiveness of LIUS vs placebo therapy daily for 12 weeks in patients with chronic lateral epicondylitis (LE). METHODS Patients with LE of at least 6 weeks' duration were recruited from general practice, physiotherapy and rheumatology clinics, and had to have failed at least one first-line treatment including nonsteroidal anti-inflammatory drugs (NSAIDs) and corticosteroid injection. Participants were assigned either active LIUS or placebo. Treatment was self-administered daily for 20 min over a 12-week period.
The primary end-point was a 50% improvement from baseline in elbow pain measured at 12 weeks using a patient-completed visual analogue scale. RESULTS Fifty-five subjects aged 18-80 were recruited over a 9-month period. In the active group 64% (16/25) achieved at least 50% improvement from baseline in elbow pain at 12 weeks compared with 57% (13/23) in the placebo group (difference of 7%; 95% confidence interval -20 to 35%). However, this was not statistically significant (chi(2) = 0.28, P = 0.60). CONCLUSION In this study LIUS was no more effective than placebo for recalcitrant LE, even assuming a large treatment effect. This is in keeping with other interventional studies for the condition Summary The efficacy of "athermic" lasers (HeNe λ = 632.8 nm and IR diode λ = 904 nm) in the treatment of tendinopathies was investigated in a randomized double-blind study. On 10 consecutive days, 64 patients (32 therapy, 32 placebo) were treated for 15 minutes each with a switched-on or switched-off laser under otherwise identical conditions. The extent of movement in involved joints (neutral 0 method) and rating on a pain scale for resting pain, movement pain, and pressure pain before treatment, after treatment, and 2 weeks after conclusion of therapy, as well as infrared thermography, served to check therapy. After the end of therapy, a significant reduction (P < 0.001) of 50% was shown for resting pain as well as reductions of 30% for movement and 30% for pressure pain. This result was identical in the therapy group and in the placebo group. There was also no indication of a different result of therapy between the therapy and placebo groups with regard to the thermographic control and the extent of movement. The breakdown of the data in terms of age, sex, and duration of disease did not provide any indications of different results for placebo or therapy.
It was striking that the patients who reported sensations during or after the treatment (irrespective of whether pleasant or unpleasant) had a greater reduction of pain than the patients without sensations. This laser therapy thus did not show any effect above and beyond that in the untreated group in our double-blind clinical study The aim of this double-blind study was to explore the pain-alleviating effect of low energy laser in lateral epicondylalgia. Forty-nine patients were consecutively assigned at random to two groups, laser or placebo. The Mid 1500 Irradia laser was used with the following parameters: wavelength 904 nm; average power output 12 mW; peak value 8.3 W; frequency 70 Hz (pulse train 8000 Hz). The laser (Ga-As) was locally applied to 6 sites on and around the epicondyle. Each point was treated for 30 sec, resulting in a dose of 0.36 J/point and an area of treatment of 0.2 mm2. Patients were treated 2-3 times weekly, for a total of 10 treatments. Follow-ups were done after three and 12 mo. The statistical analysis showed that the laser-treated group had a significant improvement in some objective outcomes after the treatment period and at the 3-mo follow-up, but there were no significant differences in the subjective outcomes between the groups. Irradia laser treatment may be a valuable therapy in lateral epicondylalgia, if carried out as described in this study. However, further studies are necessary before low energy laser can be employed as a pain-relieving method CONTEXT Studies with positive results are more likely to be published than studies with negative results (publication bias). One reason this occurs is that authors are less likely to submit manuscripts reporting negative results to journals. There is no evidence that publication bias occurs once manuscripts have been submitted to a medical journal.
We assessed whether submitted manuscripts that report results of controlled trials are more likely to be published if they report positive results. METHODS Prospective cohort study of manuscripts submitted to JAMA from February 1996 through August 1999. We classified results as positive if there was a statistically significant difference (P<.05) reported for the primary outcome. Study characteristics and indicators for quality were also appraised. We included manuscripts that reported prospective studies in which participants were assigned to an intervention or comparison group and statistical tests compared differences between groups. RESULTS Among 745 manuscripts, 133 (17.9%) were published: 78 (20.4%) of 383 with positive results, 51 (15.0%) of 341 with negative results, and 4 (19.0%) of 21 with unclear results. The crude relative risk for publication of studies with positive results compared with negative results was 1.36 (95% confidence interval [CI], 0.99-1.88). After being adjusted simultaneously for study characteristics and quality indicators, the odds ratio for publishing studies with positive results was 1.30 (95% CI, 0.87-1.96). CONCLUSIONS Among submitted manuscripts, we did not find a statistically significant difference in publication rates between those with positive vs negative results BACKGROUND AND PURPOSE Assessment of the quality of randomized controlled trials (RCTs) is common practice in systematic reviews. However, the reliability of data obtained with most quality assessment scales has not been established. This report describes 2 studies designed to investigate the reliability of data obtained with the Physiotherapy Evidence Database (PEDro) scale developed to rate the quality of RCTs evaluating physical therapist interventions. METHOD In the first study, 11 raters independently rated 25 RCTs randomly selected from the PEDro database.
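The crude relative risk reported in the JAMA submission-cohort study above can be reproduced from the published counts (78/383 positive-result vs 51/341 negative-result manuscripts published). A quick sketch, using the standard log-RR (Katz) method for the 95% confidence interval:

```python
import math

# Publication counts from the JAMA cohort described above.
published_pos, total_pos = 78, 383   # positive-result manuscripts
published_neg, total_neg = 51, 341   # negative-result manuscripts

rr = (published_pos / total_pos) / (published_neg / total_neg)

# Standard error of log(RR) and a 95% CI on the ratio scale.
se_log_rr = math.sqrt(1 / published_pos - 1 / total_pos
                      + 1 / published_neg - 1 / total_neg)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
# RR = 1.36 (95% CI 0.99 to 1.88), matching the reported crude estimate
```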
In the second study, 2 raters rated 120 RCTs randomly selected from the PEDro database, and disagreements were resolved by a third rater; this generated a set of individual rater and consensus ratings. The process was repeated by independent raters to create a second set of individual and consensus ratings. Reliability of ratings of PEDro scale items was calculated using multirater kappas, and reliability of the total (summed) score was calculated using intraclass correlation coefficients (ICC [1,1]). RESULTS The kappa value for each of the 11 items ranged from .36 to .80 for individual assessors and from .50 to .79 for consensus ratings generated by groups of 2 or 3 raters. The ICC for the total score was .56 (95% confidence interval=.47-.65) for ratings by individuals, and the ICC for consensus ratings was .68 (95% confidence interval=.57-.76). DISCUSSION AND CONCLUSION The reliability of ratings of PEDro scale items varied from "fair" to "substantial," and the reliability of the total PEDro score was "fair" to "good." Objective. To investigate treatment practice among general practitioners (GPs) and physiotherapists (PTs) in chronic epicondylitis of 3 months' duration or more. Design. Postal survey. Setting and subjects. All 129 GPs and all 77 PTs at 35 primary health care centres in Uppsala County, Sweden, received the questionnaire. Main outcome measures. Proportion of responders using various treatments (five specified alternatives + open question). Results. The questionnaire was answered by 70% of the GPs and 61% of the PTs. Ergonomic counselling, stretching, and orthotic devices were common, and used to a similar extent by GPs and PTs. Acupuncture was also common, but less so among GPs than PTs. Transcutaneous electric nerve stimulation was used by relatively few GPs and PTs. The open question revealed that dynamic exercise, particularly eccentric, was used by most PTs but only one GP.
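The PEDro reliability study above reports item-level agreement as kappa statistics. For intuition, here is a two-rater (Cohen's) kappa computed on invented yes/no item ratings; the ratings below are hypothetical, purely to illustrate how chance-corrected agreement is obtained (the study itself used multirater kappas across 2 or 3 raters):

```python
# Hypothetical yes/no (1/0) scores from two raters on one PEDro item
# across ten trials -- invented data for illustration only.
rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater_b = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]

n = len(rater_a)
observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # raw agreement
# Chance agreement from each rater's marginal "yes" rate.
p_yes_a = sum(rater_a) / n
p_yes_b = sum(rater_b) / n
expected = p_yes_a * p_yes_b + (1 - p_yes_a) * (1 - p_yes_b)
kappa = (observed - expected) / (1 - expected)
print(f"kappa = {kappa:.2f}")  # kappa = 0.52 for these invented ratings
```

Note how 80% raw agreement shrinks to a kappa of about 0.5 once chance agreement is removed, which is why kappa, not raw agreement, is reported for the PEDro items.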
A majority of GPs prescribed sick leave and anti-inflammatory treatment with an NSAID or cortisone injections. Conclusion. A large number of treatment methods in chronic epicondylitis were reported, none of which is properly evidence-based and some of which are even known to be ineffective. There is a need for randomized controlled studies of potentially effective treatments in this condition OBJECTIVE We designed an animal pleurisy study to assess if the anti-inflammatory effect of photoradiation could be affected by concomitant use of the cortisol antagonist mifepristone. BACKGROUND DATA Although interactions between photoradiation and pharmacological agents are largely unknown, parallel use of steroids and photoradiation is common in the treatment of inflammatory disorders such as arthritis and tendinitis. METHODS Forty BALB/c male mice were randomly divided in five groups. Inflammation was induced by carrageenan administered by intrathoracic injections. Four groups received carrageenan, and one control group received injections of sterile saline solution. At 1, 2, and 3 h after injections, photoradiation was performed with a dose of 7.5 J/cm(2). Two of the carrageenan-injected groups were pre-treated with orally administered mifepristone. RESULTS Total leukocyte cell counts revealed that in carrageenan-induced pleurisy, photoradiation significantly reduced the number of leukocyte cells (p < 0.0001, mean 34.5 [95% CI: 32.8-36.2] versus 87.7 [95% CI: 81.0-94.4]), and that the effect of photoradiation could be totally blocked by adding the cortisol antagonist mifepristone (p < 0.0001, mean 34.5 [95% CI: 32.1-36.9] versus 82.9 [95% CI: 70.5-95.3]). CONCLUSION The steroid receptor antagonist mifepristone significantly inhibited the anti-inflammatory effect of photoradiation.
Commonly used glucocorticoids are also known to down-regulate steroid receptors, and further clinical studies are necessary to elucidate how this interaction may decrease the effect size of photoradiation over time. For this reason, we also suggest that, until further clinical data can be provided, clinical photoradiation trials should exclude patients who have received steroid therapy within 6 months before recruitment Thirty-six patients with lateral epicondylitis of the elbow (19 women, 17 men, median age 48 yrs) were treated either with active laser or placebo, 18 patients in each group. The active laser was a GA-AL-AS 30 mW/830 nm low power laser (LPL). The study design was double-blind and randomized. The treatment session consisted of eight treatments, two per week. Patients were irradiated on tender points on the lateral epicondyle and in the forearm extensors. Output was 3.6 J/point. A follow-up was performed by telephone, 10 weeks after the last treatment. No difference between laser and placebo was found on lateral elbow pain (Mann-Whitney test, 95% confidence limits). We conclude that low power laser offers no advantage over placebo in the treatment of musculoskeletal pain such as lateral epicondylitis. Further studies with low power laser treatment of musculoskeletal pain seem useless BACKGROUND AND OBJECTIVE Among the other treatment modalities of medial and lateral epicondylitis, low level laser therapy (LLLT) has been promoted as a highly successful method. The aim of this clinical study was to assess the efficacy of LLLT using trigger points (TPs) and scanner application techniques under placebo-controlled conditions. STUDY DESIGN/MATERIAL AND METHODS The current clinical study was completed at two Laser Centers (Locarno, Switzerland and Opatija, Croatia) as a double-blind, placebo-controlled, crossover clinical study.
The patient population (n = 324), with either medial epicondylitis (Golfer's elbow; n = 50) or lateral epicondylitis (Tennis elbow; n = 274), was recruited. Unilateral cases of either type of epicondylitis (n = 283) were randomly allocated to one of three treatment groups according to the LLLT technique applied: (1) trigger points; (2) scanner; (3) combination treatment (i.e., TPs and scanner technique). Bilateral cases of either type of epicondylitis (n = 41) were subject to crossover, placebo-controlled conditions. Laser devices used to perform these treatments were an infrared (IR) diode laser (GaAlAs) 830 nm continuous wave for treatment of TPs and HeNe 632.8 nm combined with IR diode laser 904 nm, pulsed wave, for the scanner technique. Energy doses were equally controlled and measured in Joules/cm2 during both TPs and scanner technique sessions in all groups of patients. The treatment outcome (pain relief and functional ability) was observed and measured according to the following methods: (1) short form of McGill's Pain Questionnaire (SF-MPQ); (2) visual analogue scales (VAS); (3) verbal rating scales (VRS); (4) patient's pain diary; and (5) hand dynamometer. RESULTS Total relief of the pain with consequently improved functional ability was achieved in 82% of acute and 66% of chronic cases, all of which were treated by a combination of TPs and scanner technique. CONCLUSIONS This clinical study has demonstrated that the best results are obtained using combination treatment (i.e., TPs and scanner technique). Good results are obtained from adequate treatment technique correctly applied, individual energy doses, adequate medical education, clinical experience, and the correct approach of laser therapists. We observed that under- and overirradiation dosage can result in the absence of positive therapy effects or even opposite, negative (e.g., inhibitory) effects.
The current clinical study provides further evidence of the efficacy of LLLT in the management of lateral and medial epicondylitis The aims of this study were to evaluate the effects of low-level laser therapy (LLLT) and to compare these with the effects of brace or ultrasound (US) treatment in tennis elbow. The study design used was a prospective and randomized, controlled, single-blind trial. Fifty-eight outpatients with lateral epicondylitis (9 men, 49 women) were included in the trial. The patients were divided into three groups: 1) brace group-brace plus exercise, 2) ultrasound group-US plus exercise, and 3) laser group-LLLT plus exercise. Patients in the brace group used a lateral counterforce brace for three weeks; the ultrasound group received US plus hot pack, and the LLLT group received laser plus hot pack. In addition, all patients were given progressive stretching and strengthening exercise programs. Grip strength and pain severity were evaluated with a visual analog scale (VAS) at baseline, at the second week of treatment, and at the sixth week of treatment. VAS improved significantly in all groups after the treatment and in the ultrasound and laser groups at the sixth week (p<0.05). Grip strength of the affected hand increased only in the laser group after treatment, but was not changed at the sixth week. There were no significant differences between the groups on VAS and grip strength at baseline and at follow-up assessments. The results show that, in patients with lateral epicondylitis, a brace has a shorter beneficial effect than US and laser therapy in reducing pain, and that laser therapy is more effective than the brace and US treatment in improving grip strength The effect of low level laser (GaAs) on lateral epicondylitis was investigated in a double-blind, randomized, controlled study. Thirty patients were assigned equally to a laser (n = 15) or a placebo laser (n = 15) group.
All patients received eight treatments and were evaluated subjectively and objectively before, at the end of, and four weeks after treatment. Patients also completed a follow-up questionnaire on average five to six months after treatment. A significant improvement in the laser group compared to the placebo group was found on the visual analog scale (p = 0.02) and grip strength (p = 0.03) tests four weeks after treatment. In this study low level laser therapy was shown to have an effect over placebo; however, as a sole treatment for lateral epicondylitis it is of limited value. Further studies are needed to evaluate the reliability of our findings and to compare laser to other established treatment methods We have conducted a prospective, double-blind trial of low level laser therapy (LLLT) in musculoskeletal injuries to assess its efficacy. We assigned patients with a variety of painful skeletal soft tissue conditions to one of two treatment groups: treatment from a functional machine, or placebo treatment from an inactive machine. Both machines were identical and both appeared functional. The operative status of each machine was unknown to both the therapist and the patient. The results suggest that LLLT has no significant therapeutic effect and acts primarily as a placebo The purposes of this study were to compare the pain-alleviating effects of laser treatment and placebo in tennis elbow. Also, the effects of laser radiation on radial sensory nerve conduction, and the temperature changes in the tissue surrounding the treated radial nerve, were studied. The results show that laser treatment is not significantly better than placebo in treating tennis elbow.
Furthermore, no significant change was noted in the evoked sensory potential or in subcutaneous temperatures in either the experimental or control group as a result of the laser radiation treatment. OBJECTIVE To evaluate the efficacy of an oral nonsteroidal anti-inflammatory drug in the treatment of lateral epicondylitis. DESIGN Multicenter double-blind randomized controlled trial in which the following hypothesis was tested: whether diclofenac sodium provided a 20% or greater improvement over rest and cast immobilization in the response rate to treatment of lateral epicondylitis in an experimental group compared with a control group after 4 weeks of treatment. SETTING Recruitment from urban general practices and referrals to 4 university hospitals. SUBJECTS AND METHODS During a 1-year period, 206 subjects aged 18 to 60 years with lateral epicondylitis were recruited from the clientele treated by family physicians. Thirty subjects refused to participate and 47 presented with exclusion criteria, leaving 129 subjects who entered the study. One subject withdrew after 21 days. INTERVENTIONS The experimental group was treated with a daily dose of diclofenac sodium (150 mg) for 28 days, while the control group received a placebo during the same period. In addition, both groups were immobilized in a cast for 14 days and were told not to perform repetitive movements of the involved limb for 21 days. MAIN OUTCOME MEASURES Measuring instruments consisted of grip strength measurements with a squeeze dynamometer, a visual analog pain scale, a visual analog function scale, and an 8-item pain-free function index. RESULTS A statistically and clinically significant reduction of pain was associated with treatment with diclofenac, but no clinically significant difference in grip strength or functional improvement could be detected between the 2 groups.
Secondary effects (diarrhea and abdominal pain) were significantly more frequent in the diclofenac-treated group. CONCLUSION Taking into account the limited improvement noted over rest and cast immobilization and the number of associated adverse events, it is difficult to recommend the use of diclofenac in the treatment of lateral epicondylitis at the dosage used in this study. OBJECTIVE To assess the effectiveness of low-intensity laser therapy in the treatment of lateral epicondylitis. DESIGN A double-masked, placebo-controlled, randomized clinical trial. SETTING A physical medicine and rehabilitation clinic. PARTICIPANTS Fifty-two ambulatory men and women (age range, 18-70 yr) with symptomatic lateral epicondylitis of more than 30 days in duration and a normal neurologic examination. INTERVENTION Subjects were block randomized into 2 groups with a computer-generated schedule. All underwent irradiation for 60 seconds at 7 points along the symptomatic forearm 3 times weekly for 4 weeks by a masked therapist. The sole difference between the groups was that the probe of a 1.06-microm continuous-wave laser emitted 204 mW/cm2 (12.24 J/cm2) for the treated subjects and was inactive for the control subjects. Subjects were assessed at the beginning, midpoint (session 6), and end (session 12) of treatment, as well as at follow-up 28 to 35 days after their last treatment. MAIN OUTCOME MEASURES Pain in the last 24 hours, tenderness to palpation, and patient's perception of change (benefit). RESULTS The treated and untreated groups were well matched demographically. Masking was maintained for subjects and therapists; however, the groups did not differ to a statistically significant extent on the main outcome measures either during treatment or at follow-up.
Secondary outcome variables, such as grasp and pinch strength, medication use, and pain with grasp and pinch, also did not differ significantly between the groups. No significant treatment side effects were noted. CONCLUSION Treatment with low-intensity 1.06-microm laser irradiation within the parameters of this study was a safe but ineffective treatment for lateral epicondylitis. Further research seems warranted in this controversial area.
2,043
23,181,986
There is also a Cochrane systematic review which suggests that the GI may play a role in promoting weight loss. Therefore the literature, as it currently stands, presents a convincing case for the clinical application of the GI in the management of body weight and glucose homeostasis. In this issue of the British Journal of Nutrition, Lagerpusch et al. observed that in the dynamic phase of weight gain, a high-fibre, low-GI diet reduced daytime measurements of interstitial glucose when compared with an energy-matched low-fibre, high-GI diet. Furthermore, the deterioration in insulin sensitivity induced by refeeding was attenuated, though the effects of the two dietary interventions were resolved at the end of the refeeding phase. The authors do not offer any suggestions as to which specific dietary manipulation they believe may be driving their observed improvements in glucose metabolism and insulin sensitivity. In two large-scale epidemiological studies (the Nurses' Health Study and Health Professionals Follow-up Study), it has been demonstrated that the risk of developing type 2 diabetes increases with a concomitant increase in dietary glycaemic load (the GI multiplied by the amount of carbohydrate consumed) and a reduction in fibre consumption. The uncoupling of these two aspects of the diet meant that the relationship disappeared or was significantly weaker. Furthermore, the Reading, Imperial, Surrey, Cambridge, and Kings (RISCK) study, a large multicentre dietary intervention in over 500 adults at risk of CVD, aimed to elucidate how dietary changes may influence insulin sensitivity and other CVD risk factors. A surprising finding of the study was the absence of an improvement in insulin sensitivity following a low- v. high-GI diet.
This raises the intriguing question of whether the dietary deconstruction seen in most nutritional interventions, in line with the current reductionist scientific approach, is actually detrimental. By controlling for every aspect of the diet, could we actually be missing important physiological effects that occur from dietary manipulations which often go hand-in-hand, like GI and dietary fibre? In the gastrointestinal tract, low-GI diets with a high fibre content slow gastric emptying, reduce digesta transit rate and alter the luminal environment, all of which will ultimately delay glucose absorption and result in an ameliorated insulin response. For example, high-fibre, low-GI diets may promote insulin sensitivity by improving metabolic flexibility, i.e. the ability of an organism to modify fuel oxidation in response to changes in nutrient availability. Metabolic flexibility enables an efficient transition from lipid oxidation and high rates of fatty acid uptake during the fasted state, to suppression of lipid oxidation and increased glucose uptake and utilisation in response to insulin stimulation.
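The commentary above defines dietary glycaemic load (GL) as the GI multiplied by the amount of carbohydrate consumed. A minimal sketch of that arithmetic, assuming the common convention of dividing by 100 so GL is on the reference-glucose scale; the GI values and carbohydrate amounts below are illustrative, not taken from the studies cited:

```python
def glycaemic_load(gi, carbs_g):
    """Glycaemic load of one food: GI (0-100 scale) times grams of
    available carbohydrate, scaled by 1/100 by convention (assumed here)."""
    return gi * carbs_g / 100.0

def meal_glycaemic_load(items):
    """Total GL of a meal, given an iterable of (gi, carbs_g) pairs."""
    return sum(glycaemic_load(gi, g) for gi, g in items)

# Illustrative comparison: a high-GI food vs. a low-GI food, equal carbohydrate.
high_gi_meal = meal_glycaemic_load([(75, 50)])  # e.g. a white-bread portion
low_gi_meal = meal_glycaemic_load([(30, 50)])   # e.g. a lentil portion
```

This makes the epidemiological point above concrete: two meals with identical carbohydrate content can carry very different glycaemic loads.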
Since its first description by Jenkins et al. in the 1980s, the glycaemic index (GI) has been used as a dietary tool to enhance the glycaemic control of people living with diabetes. The management of glycaemia is relevant today more than ever, with an estimated 2·9 million people currently affected by type 2 diabetes in the UK. However, the GI as a concept has dipped in and out of scientific fashion. By its nature, the GI is highly complex in that it hinges on a number of physico-chemical properties of carbohydrates, such as the chemical structure of the carbohydrate, the surrounding food matrix and the processing it undergoes. The fact that GI measurements are confounded by other food components within the diet serves as an additional layer of complexity. Consequently, the concept of the GI has been met, at times, with scepticism and some have found it challenging to embrace. However, one could argue that the proof is in the eating. The authors claim that there is public health relevance to this, as many adults demonstrate short-term weight cycling, i.e. repeated cycles of weight loss and weight regain. Indeed, estimations of the prevalence of weight cycling are within the range of 18–34 and 20–55% for men and women, respectively. However, the health effects associated with weight cycling are largely unknown. This study by Lagerpusch et al. is the first to examine the impact of the GI on insulin sensitivity during and after weight regain in young healthy individuals using multiple indices of glucose and insulin homeostasis. However, previous studies have attempted to answer this question. The mechanisms underlying the beneficial effects of high-fibre, low-GI diets on insulin sensitivity are complex and multifactorial.
BACKGROUND Increasing evidence suggests an important role of carbohydrate quality in the development of type 2 diabetes. OBJECTIVE Our objective was to prospectively examine the association between glycemic index, glycemic load, and dietary fiber and the risk of type 2 diabetes in a large cohort of young women. DESIGN In 1991, 91249 women completed a semiquantitative food-frequency questionnaire that assessed dietary intake. The women were followed for 8 y for the development of incident type 2 diabetes, and dietary information was updated in 1995. RESULTS We identified 741 incident cases of confirmed type 2 diabetes during 8 y (716 300 person-years) of follow-up. After adjustment for age, body mass index, family history of diabetes, and other potential confounders, glycemic index was significantly associated with an increased risk of diabetes (multivariate relative risks for quintiles 1-5, respectively: 1, 1.15, 1.07, 1.27, and 1.59; 95% CI: 1.21, 2.10; P for trend = 0.001). Conversely, cereal fiber intake was associated with a decreased risk of diabetes (multivariate relative risks for quintiles 1-5, respectively: 1, 0.85, 0.87, 0.82, and 0.64; 95% CI: 0.48, 0.86; P for trend = 0.004). Glycemic load was not significantly associated with risk in the overall cohort (multivariate relative risks for quintiles 1-5, respectively: 1, 1.31, 1.20, 1.14, and 1.33; 95% CI: 0.92, 1.91; P for trend = 0.21). CONCLUSIONS A diet high in rapidly absorbed carbohydrates and low in cereal fiber is associated with an increased risk of type 2 diabetes. Previous studies suggest that a low-glycaemic index (LGI) diet may improve insulin sensitivity (IS). As IS has been shown to decrease during refeeding, we hypothesised that an LGI v. high-GI (HGI) diet might have favourable effects during this phase.
In a controlled nutritional intervention study, sixteen healthy men (aged 26·8 (SD 4·1) years, BMI 23·0 (SD 1·7) kg/m2) followed 1 week of overfeeding, 3 weeks of energy restriction and 2 weeks of refeeding at ±50% energy requirement (50% carbohydrates, 35% fat and 15% protein). During refeeding, subjects were divided into two matched groups receiving either high-fibre LGI or lower-fibre HGI foods (GI 40 v. 74, fibre intake 65 (SD 6) v. 27 (SD 4) g/d). Body weight was equally regained in both groups with refeeding (mean regain 70·5 (SD 28·0)% of loss). IS was improved by energy restriction and decreased with refeeding. The decreases in IS were greater in the HGI than in the LGI group (group × time interactions for insulin, homeostasis model assessment of insulin resistance (HOMA-IR), Matsuda IS index (Matsuda ISI); all P<0·05). Mean interstitial glucose profiles during the day were also higher in the HGI group (ΔAUC(HGI-LGI) of continuous interstitial glucose monitoring: 6·6 mmol/l per 14 h, P=0·04). At the end of refeeding, parameters of IS did not differ from baseline values in either diet group (adiponectin, insulin, HOMA-IR, Matsuda ISI, M-value; all P>0·05). In conclusion, nutritional stress imposed by dietary restriction and refeeding reveals a GI/fibre effect in healthy non-obese subjects. LGI foods rich in fibre may improve glucose metabolism during the vulnerable refeeding phase of a weight cycle. BACKGROUND Insulin sensitivity (Si) is improved by weight loss and exercise, but the effects of the replacement of saturated fatty acids (SFAs) with monounsaturated fatty acids (MUFAs) or carbohydrates of high glycemic index (HGI) or low glycemic index (LGI) are uncertain. OBJECTIVE We conducted a dietary intervention trial to study these effects in participants at risk of developing metabolic syndrome.
DESIGN We conducted a 5-center, parallel-design, randomized controlled trial [RISCK (Reading, Imperial, Surrey, Cambridge, and Kings)]. The primary and secondary outcomes were changes in Si (measured by using an intravenous glucose tolerance test) and cardiovascular risk factors. Measurements were made after 4 wk of a high-SFA and HGI (HS/HGI) diet and after a 24-wk intervention with HS/HGI (reference), high-MUFA and HGI (HM/HGI), HM and LGI (HM/LGI), low-fat and HGI (LF/HGI), and LF and LGI (LF/LGI) diets. RESULTS We analyzed data for 548 of 720 participants who were randomly assigned to treatment. The median Si was 2.7 × 10(-4) mL · μU(-1) · min(-1) (interquartile range: 2.0, 4.2 × 10(-4) mL · μU(-1) · min(-1)), and unadjusted mean percentage changes (95% CIs) after 24 wk of treatment (P = 0.13) were as follows: for the HS/HGI group, -4% (-12.7%, 5.3%); for the HM/HGI group, 2.1% (-5.8%, 10.7%); for the HM/LGI group, -3.5% (-10.6%, 4.3%); for the LF/HGI group, -8.6% (-15.4%, -1.1%); and for the LF/LGI group, 9.9% (2.4%, 18.0%). Total cholesterol (TC), LDL cholesterol, and apolipoprotein B concentrations decreased with SFA reduction. Decreases in TC and LDL-cholesterol concentrations were greater with LGI. Fat reduction lowered HDL cholesterol and apolipoprotein A1 and B concentrations. CONCLUSIONS This study did not support the hypothesis that isoenergetic replacement of SFAs with MUFAs or carbohydrates has a favorable effect on Si. Lowering GI enhanced reductions in TC and LDL-cholesterol concentrations in subjects, with tentative evidence of improvements in Si in the LF-treatment group. This trial was registered at clinicaltrials.gov as ISRCTN29111298. OBJECTIVE To examine the relation between weight change and weight fluctuation (cycling) and mortality in middle-aged men.
METHODS A prospective study of 5608 men aged 40 to 59 years at screening, drawn from one general practice in each of 24 British towns. Changes in weight observed during a 12- to 14-year period were related to mortality during the subsequent 8 years. RESULTS There were 943 deaths from all causes: 458 cardiovascular disease (CVD) and 485 non-CVD deaths. Those with stable weight or weight gain had the lowest total, CVD, and non-CVD mortality. Sustained weight loss or weight fluctuation (loss-gain or gain-loss) showed a significantly higher mortality risk than stable weight even after adjustment for lifestyle variables (relative risk [95% confidence interval], 1.60 [1.32-1.95], 1.50 [1.17-1.91], and 1.63 [1.24-2.14], respectively). Adjustment for or exclusion of men with preexisting disease markedly attenuated the increased risk of CVD and total mortality associated with sustained weight loss and weight gain-weight loss. In long-term nonsmokers, any weight loss since screening was associated with an increased risk of mortality, but this was markedly attenuated by adjustment for preexisting disease. Recent ex-smokers showed the most marked increase in mortality associated with sustained weight loss. CONCLUSIONS The increased mortality in middle-aged men with sustained weight loss and weight fluctuation (cycling) is determined to a major extent by disadvantageous lifestyle factors and preexisting disease. The evidence suggests that weight loss and weight fluctuation (cycling) in these men do not directly increase the risk of death. OBJECTIVE Intake of carbohydrates that provide a large glycemic response has been hypothesized to increase the risk of NIDDM, whereas dietary fiber is suspected to reduce incidence. These hypotheses have not been evaluated prospectively.
RESEARCH DESIGN AND METHODS We examined the relationship between diet and risk of NIDDM in a cohort of 42,759 men without NIDDM or cardiovascular disease, who were 40-75 years of age in 1986. Diet was assessed at baseline by a validated semiquantitative food frequency questionnaire. During 6 years of follow-up, 523 incident cases of NIDDM were documented. RESULTS The dietary glycemic index (an indicator of carbohydrate's ability to raise blood glucose levels) was positively associated with risk of NIDDM after adjustment for age, BMI, smoking, physical activity, family history of diabetes, alcohol consumption, cereal fiber, and total energy intake. Comparing the highest and lowest quintiles, the relative risk (RR) of NIDDM was 1.37 (95% CI, 1.02-1.83, P trend = 0.03). Cereal fiber was inversely associated with risk of NIDDM (RR = 0.70; 95% CI, 0.51-0.96, P trend = 0.007; for >8.1 g/day vs. <3.2 g/day). The combination of a high glycemic load and a low cereal fiber intake further increased the risk of NIDDM (RR = 2.17, 95% CI, 1.04-4.54) when compared with a low glycemic load and high cereal fiber intake. CONCLUSIONS These findings support the hypothesis that diets with a high glycemic load and a low cereal fiber content increase the risk of NIDDM in men. Further, they suggest that grains should be consumed in a minimally refined form to reduce the incidence of NIDDM. The risk of coronary heart disease (CHD) is influenced by family history, insulin sensitivity (IS), and diet. Adiposity affects CHD and IS. The cellular mechanism of IS is thought to involve the adipocyte cytokine tumor necrosis factor-alpha (TNF-alpha). Insulin-stimulated glucose uptake in isolated subcutaneous and omental adipocytes obtained during elective surgery was measured in 61 premenopausal women, 24 with a parental history (PH) of CHD.
In vivo IS was measured using the short insulin tolerance test (SITT) in 28 women, 16 with PH-CHD, before and 3 weeks after randomization to a low glycemic index (LGI) or high glycemic index (HGI) diet. In vitro adipocyte IS and TNF-alpha production were measured following dietary modification. On the habitual diet, in vitro insulin-stimulated glucose uptake in adipocytes, as a percentage increase over basal, was less in women with PH-CHD than in those without it (presented as the median with 95% confidence limits: subcutaneous, 28% (17% to 39%) v 96% (70% to 120%), P < .01; omental, 40% (28% to 52%) v 113% (83% to 143%), P < .01). In vivo IS in 16 PH-CHD subjects and 12 controls before dietary randomization was similar, and increased in both groups consuming an LGI versus HGI diet (PH-CHD, 0.31 (0.26 to 0.37) v 0.14 (0.10 to 0.24) mmol/L/min, P < .01; controls, 0.31 (0.1 to 0.53) v 0.15 (0.06 to 0.23) mmol/L/min, P < .05). Adipocyte IS was greater in PH-CHD women on an LGI versus HGI diet (subcutaneous, 50% (20% to 98%) v 13% (1% to 29%); omental, 97% (47% to 184%) v 29% (4% to 84%), P < .05). Adipocyte TNF-alpha production was higher in women with versus without PH-CHD (subcutaneous, 0.3 (0.18 to 0.42) v 0.93 (0.39 to 1.30) ng/mL/min; visceral, 0.22 (0.15 to 1.30) v 0.64 (0.24 to 1.1) ng/mL/min, P < .04, respectively), but was uninfluenced by the dietary glycemic index. We conclude that in vitro adipocyte IS is reduced and adipocyte TNF-alpha production is increased in premenopausal women with PH-CHD. An LGI diet improves both adipocyte IS in women with PH-CHD and in vivo IS in women with and without PH-CHD. The insulin resistance syndrome has recently been implicated in the etiology of coronary heart disease, with a possible metabolic defect at the level of the adipocyte.
We report the effects of a low- versus high-glycemic-index (LGI and HGI, respectively) diet on insulin and glucose response as assessed by oral glucose tolerance test (OGTT) and insulin-stimulated glucose uptake in isolated adipocytes in a group of 32 patients with advanced coronary heart disease. The area under the insulin curve following OGTT was significantly reduced after 4 weeks in the LGI group (P < .03), but not in the HGI group. Insulin-stimulated glucose uptake in isolated adipocytes harvested from a presternal fat biopsy was significantly greater following the LGI diet (P < .05). This study demonstrates that simple short-term dietary measures can improve insulin sensitivity in patients with coronary heart disease. CONTEXT Clinical trials using antihyperglycemic medications to improve glycemic control have not demonstrated the anticipated cardiovascular benefits. Low-glycemic index diets may improve both glycemic control and cardiovascular risk factors for patients with type 2 diabetes, but debate over their effectiveness continues due to trial limitations. OBJECTIVE To test the effects of low-glycemic index diets on glycemic control and cardiovascular risk factors in patients with type 2 diabetes. DESIGN, SETTING, AND PARTICIPANTS A randomized, parallel study design at a Canadian university hospital research center with 210 participants with type 2 diabetes treated with antihyperglycemic medications, who were recruited by newspaper advertisement and randomly assigned to receive 1 of 2 diet treatments, each for 6 months, between September 16, 2004, and May 22, 2007. INTERVENTION High-cereal fiber or low-glycemic index dietary advice. MAIN OUTCOME MEASURES Absolute change in glycated hemoglobin A(1c) (HbA(1c)), with fasting blood glucose and cardiovascular disease risk factors as secondary measures.
RESULTS In the intention-to-treat analysis, HbA(1c) decreased by -0.18% absolute HbA(1c) units (95% confidence interval [CI], -0.29% to -0.07%) in the high-cereal fiber diet compared with -0.50% absolute HbA(1c) units (95% CI, -0.61% to -0.39%) in the low-glycemic index diet (P < .001). There was also an increase in high-density lipoprotein cholesterol in the low-glycemic index diet of 1.7 mg/dL (95% CI, 0.8-2.6 mg/dL) compared with a decrease of -0.2 mg/dL (95% CI, -0.9 to 0.5 mg/dL) in the high-cereal fiber diet (P = .005). The reduction in dietary glycemic index related positively to the reduction in HbA(1c) concentration (r = 0.35, P < .001) and negatively to the increase in high-density lipoprotein cholesterol (r = -0.19, P = .009). CONCLUSION In patients with type 2 diabetes, 6-month treatment with a low-glycemic index diet resulted in moderately lower HbA(1c) levels compared with a high-cereal fiber diet. Trial Registration clinicaltrials.gov identifier: NCT00438698
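Several of the trial records above quantify insulin sensitivity with HOMA-IR and the Matsuda index. A sketch of the standard published formulas; note that units matter (HOMA-IR conventionally takes fasting glucose in mmol/l and insulin in µU/ml, while the Matsuda index is usually computed with glucose in mg/dl), and the numbers in the test values are illustrative only, not drawn from these studies:

```python
import math

def homa_ir(fasting_glucose_mmol_l, fasting_insulin_uU_ml):
    """Homeostasis model assessment of insulin resistance:
    (fasting glucose [mmol/l] * fasting insulin [uU/ml]) / 22.5."""
    return fasting_glucose_mmol_l * fasting_insulin_uU_ml / 22.5

def matsuda_isi(fasting_glucose, fasting_insulin, mean_glucose, mean_insulin):
    """Matsuda whole-body insulin sensitivity index from an OGTT:
    10000 / sqrt(Gf * If * Gmean * Imean), with glucose in mg/dl and
    insulin in uU/ml; the means are taken over the OGTT sampling points."""
    return 10000.0 / math.sqrt(
        fasting_glucose * fasting_insulin * mean_glucose * mean_insulin
    )
```

Higher HOMA-IR indicates more insulin resistance, whereas a higher Matsuda ISI indicates better insulin sensitivity, which is why the two can move in opposite directions across the diet groups described above.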
2,044
25,480,313
Relatively good clinical and INR outcomes were reported with the use of any treatment protocol, while poorer INR outcomes were reported when a predefined protocol was missing (doctor strategy). The lowest PCC doses were infused in the fixed-dose strategy. In emergency VKA reversal, a predefined PCC dosing protocol seems essential. We found no evidence that one dosing strategy is superior.
Management of patients with a major bleed while on a vitamin K antagonist (VKA) is a common clinical challenge. Prothrombin complex concentrates (PCC) provide rapid reversal of VKA-induced coagulopathy. However, a well-defined PCC dosing strategy, especially in the emergency setting, is still lacking. We performed a systematic review to describe the currently used PCC dosing strategies and to present their efficacy in terms of target INR achievement and clinical outcome.
Objective Intracranial hemorrhage in patients receiving oral anticoagulant (OAC) therapy is associated with poor neurological outcome. Prothrombin complex concentrate (PCC) is the gold-standard therapy to normalize hemostasis but remains underused. Ultra-rapid reversal of anticoagulation could reduce the time to biological and surgical hemostasis, and might improve outcome. We report the use of bolus infusions of PCC to immediately reverse anticoagulation and allow for urgent neurosurgical care. Design Prospective, observational study. Setting Neurosurgical intensive care unit, university hospital. Patients and participants Eighteen patients with OAC-associated intracranial hemorrhage requiring urgent neurosurgical intervention. Interventions All patients received 20 IU/kg of PCC as an intravenous bolus infusion (3 min) and 5 mg of enteral vitamin K. Surgery was started immediately, without waiting for blood sample results. Measurements and results Serial blood samples were taken to assess prothrombin time. Coagulation was considered normal when the international normalized ratio was ≤1.5. All patients, including nine who were over-anticoagulated, had complete reversal of anticoagulation immediately after the bolus of PCC. No hemorrhagic or thrombotic adverse effect was observed intra- or postoperatively. Conclusions A bolus infusion of PCC completely reverses anticoagulation within 3 min. Neurosurgery can be performed immediately in OAC-related intracranial hemorrhage. This study shows that OAC-treated patients can be managed as rapidly as non-anticoagulated patients. Background Cofact®, a prothrombin complex concentrate, is used for restoring the international normalized ratio (INR) in patients on vitamin K antagonists (VKA) presenting with acute bleeding.
In this prospective cohort study, we evaluated whether adequate INR values were reached in ED patients using the Sanquin (Federation of Dutch Thrombosis Services) treatment protocol. Methods We evaluated this protocol for two target INR groups: group 1, target INR ≤1.5 (for life-threatening bleeding/immediate intervention); group 2, target INR 1.6-2.1 (in cases of minor urgent surgery or serious overdosing of anticoagulant). We specifically wanted to identify both under- and over-treated patients. Reversing VKA anticoagulation therapy to unnecessarily low INR values may involve thrombotic risks. Apart from this risk, the patient is also administered an excess amount of the drug. This means unnecessary costs and may present problems with restoring an anticoagulated state at a later time. Results In our cohort, the Sanquin dosing protocol was followed for 45/60 patients. It appeared that of the 41 patients in group 1 (target INR ≤1.5), 35 (85%) achieved the goal INR. This occurred more often than for the 19 patients in group 2 (target INR 1.6-2.1), where only 6 (32%) achieved the goal INR. Using the protocol resulted in a positive trend toward better INR reversal in group 1. In group 2, no relation between using the protocol and achieving the desired INR value was detected. Physicians ignoring the proposed dose of Cofact® prescribed significantly less Cofact® (even when correcting for patient weight). It appeared that patients in group 1 had a significantly lower baseline INR than patients in group 2. Group 2 patients, on the other hand, had a baseline INR >7.5 in 53% of the cases. Conclusion In our cohort, for most patients in INR group 2 treated with Cofact®, the achieved INR value was outside the desired range of 1.6-2.1. The supra-therapeutic range of baseline INR in group 2 may have contributed to the different kind of bleeding witnessed in this patient group.
Our results support the idea that treatment of patients on vitamin K antagonists with Cofact® could benefit from a slightly different approach, taking into account the INR value to which the patient needs to be reversed. BACKGROUND Prothrombin complex concentrate (PCC) can substantially shorten the time needed to reverse antivitamin K oral anticoagulant therapy (OAT). OBJECTIVES To determine the effectiveness and safety of emergency OAT reversal by a balanced pasteurized nanofiltered PCC (Beriplex P/N) containing coagulation factors II, VII, IX, and X, and anticoagulant proteins C and S. PATIENTS AND METHODS Patients receiving OAT were eligible for this prospective multinational study if their International Normalized Ratio (INR) exceeded 2 and they required either an emergency surgical or urgent invasive diagnostic intervention or INR normalization due to acute bleeding. Stratified 25, 35, or 50 IU kg(-1) PCC doses were infused based on initial INR. Study endpoints included INR normalization (≤1.3) by 30 min after PCC infusion and hemostatic efficacy. RESULTS Forty-three patients, 26 requiring interventional procedures and 17 experiencing acute bleeding, received PCC infusions at a median rate of 7.5 mL min(-1) (188 IU min(-1)). At 30 min thereafter, INR declined to ≤1.3 in 93% of patients. At all postinfusion time points through 48 h, median INR remained between 1.2 and 1.3. Clinical hemostatic efficacy was classified as very good or satisfactory in 42 patients (98%). Prompt and sustained increases in circulating coagulation factors and anticoagulant proteins were observed. One fatal suspected pulmonary embolism in a patient with metastatic cancer was judged to be possibly PCC-related. CONCLUSIONS PCC treatment serves as an effective rapid hemorrhage control resource in the emergency anticoagulant reversal setting.
More widespread availability of PCC is warranted to ensure its benefits in appropriate patients. BACKGROUND Fresh frozen plasma (FFP) and prothrombin complex concentrates (PCC) reverse oral anticoagulants. We compared intraoperative administration of PCC and FFP in patients undergoing heart surgery with cardiopulmonary bypass (CPB). METHODS Forty patients [with international normalized ratio (INR) ≥2·1] assigned to semi-urgent cardiac surgery were randomized to receive either FFP (n = 20) or PCC (n = 20). Prior to CPB, they received either 2 units of FFP or half of the PCC dose calculated according to body weight, initial INR and target INR (≤1·5). After CPB and protamine administration, patients received either another 2 units of FFP or the other half of the PCC dose. Additional doses were administered if the INR was still too high (≥1·5). RESULTS Fifteen minutes after CPB, more patients reached the INR target with PCC (P = 0·007): 7/16 patients vs. 0/15 patients with FFP; there was no difference 1 h after CPB (6/15 patients with PCC vs. 4/15 patients with FFP reached the target). Fifteen minutes after CPB, median INR (range) decreased to 1·6 (1·2-2·2) with PCC vs. 2·3 (1·5-3·5) with FFP; 1 h after CPB both groups reached similar values [1·6 (1·3-2·2) with PCC and 1·7 (1·3-2·7) with FFP]. With PCC, fewer patients needed an additional dose (6/20) than with FFP (20/20) (P < 0·001). The two groups differed significantly in the course of factor II (P = 0·0023) and factor X (P = 0·008) over time. Dilution of coagulation factors was maximal at CPB onset. Safety was good for both groups, with only two related oozing cases with FFP. CONCLUSION PCC reverses anticoagulation safely, faster and with less bleeding than FFP. Introduction Prothrombin complex concentrates (PCC) are haemostatic blood preparations indicated for urgent anticoagulation reversal, though the optimal dose for effective reversal is still under debate.
The latest generation of PCCs includes four coagulation factors, the so-called 4-factor PCC. The aim of this study was to compare the efficacy and safety of two doses, 25 and 40 IU/kg, of 4-factor PCC in vitamin K antagonist (VKA)-associated intracranial haemorrhage. Methods We performed a phase III, prospective, randomised, open-label study including patients with objectively diagnosed VKA-associated intracranial haemorrhage between November 2008 and April 2011 in 22 centres in France. Patients were randomised to receive 25 or 40 IU/kg of 4-factor PCC. The primary endpoint was the international normalised ratio (INR) 10 minutes after the end of the 4-factor PCC infusion. Secondary endpoints were changes in coagulation factors, global clinical outcomes and incidence of adverse events (AEs). Results A total of 59 patients were randomised: 29 in the 25 IU/kg group and 30 in the 40 IU/kg group. Baseline demographics and clinical characteristics were comparable between the groups. The mean INR was significantly reduced to 1.2 (and to ≤1.5 in all patients of both groups) 10 minutes after 4-factor PCC infusion. The INR in the 40 IU/kg group was significantly lower than in the 25 IU/kg group 10 minutes (P = 0.001), 1 hour (P = 0.001) and 3 hours (P = 0.02) after infusion. The 40 IU/kg dose was also more effective in replacing coagulation factors, as reflected by PT (P = 0.038), FII (P = 0.001), FX (P < 0.001), protein C (P = 0.002) and protein S (P = 0.043), 10 minutes after infusion. However, no differences were found in haematoma volume or global clinical outcomes between the groups. The incidence of death and thrombotic events was similar between the groups. Conclusions Rapid infusion of both doses of 4-factor PCC achieved an INR of 1.5 or less in all patients, with a lower INR observed in the 40 IU/kg group. No safety concerns were raised by the 40 IU/kg dose.
Further trials are needed to evaluate the impact of the high dose of 4-factor PCC on functional outcomes and mortality. Trial registration EudraCT number 2007-000602-73 INTRODUCTION Prothrombin complex concentrates (PCCs) are used for the urgent reversal of oral vitamin K antagonists in patients with life-threatening bleeding or prior to urgent procedures/surgery. PCCs offer rapid and complete reversal without the disadvantages of volume overload and adverse reactions seen with fresh frozen plasma (FFP). There is concern about the risk of thrombosis associated with the use of PCCs; data on this are limited at present. OBJECTIVES To determine the incidence of objectively confirmed arterial or venous thromboembolism within 30 days following the administration of PROTHROMBINEX®-VF (PTX-VF) to acutely reverse a prolonged INR. MATERIALS/METHODS A prospective observational study was conducted at two teaching hospitals in Auckland, NZ. All patients who received PTX-VF for the acute reversal of a prolonged INR were eligible. Baseline patient demographics and reasons for PTX-VF administration were recorded. Patients were reviewed at days 7 and 30 to confirm or exclude thromboembolism or adverse events. RESULTS 173 patients were enrolled from August 2008 to March 2009. The most frequent indication for reversal was acute bleeding. At 30 days, 4.6% (8/173) of patients had a definite/probable thrombotic event, and 16.7% had died either due to the presenting bleed (intracranial haemorrhage) or a complication of their presenting complaint (e.g. sepsis, renal failure). CONCLUSIONS Acute reversal of anticoagulant therapy with PTX-VF is associated with a significant rate of thromboembolism (4.6%) within 30 days.
These events can be explained by cessation of anticoagulant therapy in patients with ongoing risk factors for arterial or venous thrombosis, rather than being directly attributable to PTX-VF therapy Introduction Prothrombin Complex Concentrate (PCC) is a key treatment in the management of bleeding related to vitamin K antagonists (VKA). This study aimed to evaluate prospectively PCC use in patients with VKA-related bleeding in view of the French guidelines published in 2008. Methods All consecutive patients with VKA-related bleeding treated with a 4-factor PCC (Octaplex®) were selected in 33 French hospitals. Collected data included demographics, site and severity of bleeding, modalities of PCC administration, International Normalized Ratio (INR) values before and after PCC administration, outcomes and survival rate 15 days after infusion. Results Of 825 patients who received PCC between August 2008 and December 2010, 646 had severe bleeding. The main haemorrhage sites were intracranial (43.7%) and abdominal (24.3%). Mean INR before PCC was 4.4 ± 1.9; INR was unavailable in 12.5% of patients. The proportions of patients who received a PCC dose according to guidelines were 15.8% in patients with initial INR 2-2.5, 41.5% in patients with INR 2.5-3, 40.8% in patients with INR 3-3.5, 26.9% in patients with INR > 3.5, and 63.5% of patients with unknown INR. Vitamin K was administered in 84.7% of patients. The infused dose of PCC did not vary with initial INR; the mean dose was 25.3 ± 9.8 IU/kg. Rates of controlled bleeding and target INR achievement were similar, regardless of whether or not patients were receiving PCC doses as per the guidelines. No differences in INR after PCC treatment were observed, regardless of whether or not vitamin K was administered. INR was first monitored after a mean time frame of 4.5 ± 5.6 hours post PCC.
The overall survival rate at 15 days after PCC infusion was 75.4% (65.1% in patients with intracranial haemorrhage). A better prognosis was observed in patients reaching the target INR. Conclusions Severe bleeding related to VKA needs to be better managed, particularly regarding the infused PCC dose, INR monitoring and administration of vitamin K. A dose of 25 IU/kg PCC appears to be efficacious in achieving a target INR of 1.5. Further studies are required to assess whether adjusting the PCC dose and/or better management of INR would improve outcomes Prothrombin Complex Concentrate (PCC) is indicated for the acute reversal of oral anticoagulation therapy. To compare the efficacy of a "standard" dosage of 20 ml PCC, equivalent to about 500 IU factor IX (group A), and an "individualized" dosage based on a target INR of 2.1 or 1.5, the initial INR and the patient's body weight (group B), we performed an open, prospective, randomized, controlled trial. The in vivo response and in vivo recovery of factors II, VII, IX and X in these patients on oral anticoagulation were determined. Ninety-three patients (group A: 47; group B: 46) with major bleeding or admitted for urgent (surgical) interventions were enrolled. PCC and vitamin K (10 mg) were administered intravenously. We evaluated the effect of treatment by the decrease in INR and the clinical outcome. The number of patients reaching the target INR 15 min after the dose of PCC was significantly higher in the group treated with an "individualized" dosage compared to the group treated with a standard dose (89% versus 43%; p < 0.001). We conclude that for the acute reversal of oral anticoagulant therapy, an "individualized" dosage regimen of PCC based on the target INR, the initial INR and the body weight of the patient is significantly more effective in reaching the target INR than a "standard" dosage.
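The "individualized" regimen above computes the PCC dose from target INR, initial INR and body weight, but the abstract does not state the exact formula. A common published approach estimates the functional factor level from the INR and doses the deficit per kilogram; the sketch below is a hypothetical illustration of that idea only (the INR-to-factor-level lookup values and the assumption that 1 IU/kg raises the factor level by about one percentage point are illustrative assumptions, not taken from the trial):

```python
# Hypothetical sketch of an individualized PCC dosing rule: estimate the
# functional factor level (% of normal) from the INR, then dose the deficit.
# Lookup values and the ~1 %-per-IU/kg rise are assumptions for illustration.

def estimated_factor_pct(inr: float) -> float:
    """Map an INR to an assumed functional factor level (% of normal)."""
    table = [(1.0, 100), (1.5, 40), (2.0, 25), (3.0, 15), (4.0, 10), (6.0, 5)]
    for threshold, pct in table:
        if inr <= threshold:
            return pct
    return 5  # very high INR: assume ~5% residual factor activity

def pcc_dose_iu(initial_inr: float, target_inr: float, weight_kg: float) -> float:
    """Dose (IU) = (target % - current %) x weight (kg), assuming ~1%/IU/kg rise."""
    deficit = estimated_factor_pct(target_inr) - estimated_factor_pct(initial_inr)
    return max(deficit, 0) * weight_kg

# Example: 70 kg patient, initial INR 4.0, target INR 1.5
# -> estimated deficit (40% - 10%) x 70 kg = 2100 IU
```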
The in vivo response and in vivo recovery found in this study were higher than in patients with isolated factor deficiencies. This suggests that the pharmacokinetics in patients on oral anticoagulants may be different We evaluated the effectiveness and safety of prothrombin complex concentrate (PCC) in urgent warfarin reversal in Iran. This is a nonrandomized, pre-post intervention study. Thirty-seven high-risk patients with prolonged international normalized ratio ([INR] > 8) or with bleeding symptoms were enrolled in this study. Prothrombin complex concentrate was infused at a dose of 22.4 ± 11.2 IU/kg. The INR was measured 30 minutes postinfusion. Mean age of the participants was 63.9 ± 14.5 (22-89 years). Thirty minutes after PCC injection, INR reversed to < 3.5 in 31 patients (84%), of whom 12 patients achieved INR 1.5 to 2.5 and 19 patients achieved INR 2.5 to 3.5. Bleeding symptoms subsided within 0.5 hour after PCC infusion in patients with hemorrhagic symptoms. No adverse reaction was observed in patients during 3-month follow-up. Our experience with a single injection of PCC in high-risk patients with prolonged INR or with bleeding symptoms was successful, without any complication OBJECTIVE: To test the feasibility of creating a valid and reliable checklist with the following features: appropriate for assessing both randomised and non-randomised studies; provision of both an overall score for study quality and a profile of scores not only for the quality of reporting, internal validity (bias and confounding) and power, but also for external validity. DESIGN: A pilot version was first developed, based on epidemiological principles, reviews, and existing checklists for randomised studies. Face and content validity were assessed by three experienced reviewers and reliability was determined using two raters assessing 10 randomised and 10 non-randomised studies.
Using different raters, the checklist was revised and tested for internal consistency (Kuder-Richardson 20), test-retest and inter-rater reliability (Spearman correlation coefficient and sign rank test; kappa statistics), criterion validity, and respondent burden. MAIN RESULTS: The performance of the checklist improved considerably after revision of a pilot version. The Quality Index had high internal consistency (KR-20: 0.89), as did the subscales apart from external validity (KR-20: 0.54). Test-retest (r 0.88) and inter-rater (r 0.75) reliability of the Quality Index were good. Reliability of the subscales varied from good (bias) to poor (external validity). The Quality Index correlated highly with an existing, established instrument for assessing randomised studies (r 0.90). There was little difference between its performance with non-randomised and with randomised studies. Raters took about 20 minutes to assess each paper (range 10 to 45 minutes). CONCLUSIONS: This study has shown that it is feasible to develop a checklist that can be used to assess the methodological quality not only of randomised controlled trials but also of non-randomised studies. It has also shown that it is possible to produce a checklist that provides a profile of the paper, alerting reviewers to its particular methodological strengths and weaknesses. Further work is required to improve the checklist and the training of raters in the assessment of external validity Background and Purpose Anticoagulant-associated intracranial hemorrhage (aaICH) presents with larger hematoma volumes, higher risk of hematoma expansion, and worse outcome than spontaneous intracranial hemorrhage. Prothrombin complex concentrates (PCCs) are indicated for urgent reversal of anticoagulation after aaICH.
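The checklist validation above reports internal consistency as Kuder-Richardson 20 (KR-20), the binary-item special case of Cronbach's alpha: KR-20 = (k/(k-1)) * (1 - Σ p_j q_j / σ²), where p_j is the proportion scoring 1 on item j, q_j = 1 - p_j, and σ² is the variance of total scores. A small self-contained sketch of that statistic (the example item matrix is invented for illustration):

```python
def kr20(items: list[list[int]]) -> float:
    """Kuder-Richardson 20 for binary items; items[respondent][item] in {0, 1}."""
    n = len(items)      # number of respondents
    k = len(items[0])   # number of items
    totals = [sum(row) for row in items]
    mean_total = sum(totals) / n
    var_total = sum((t - mean_total) ** 2 for t in totals) / n  # population variance
    pq = 0.0
    for j in range(k):
        p = sum(row[j] for row in items) / n  # proportion scoring 1 on item j
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var_total)

# Four hypothetical raters scored on three binary checklist items:
score = kr20([[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]])  # -> 0.75
```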
Given the lack of randomized controlled trial evidence of efficacy, and the potential for thrombotic complications, we aimed to determine outcomes in patients with aaICH treated with PCC. Methods We conducted a prospective multicenter registry of patients treated with PCC for aaICH in Canada. Patients were identified by local blood banks after the release of PCC. A chart review abstracted clinical, imaging, and laboratory data, including thrombotic events after therapy. Hematoma volumes were measured on brain CT scans and primary outcomes were modified Rankin Scale at discharge and in-hospital mortality. Results Between 2008 and 2010, 141 patients received PCC for aaICH (71 intraparenchymal hemorrhages). The median age was 78 years (interquartile range, 14), 59.6% were male, and median Glasgow Coma Scale was 14. Median international normalized ratio was 2.6 (interquartile range, 2.0) and median parenchymal hematoma volume was 15.8 mL (interquartile range, 31.8). Median post-PCC therapy international normalized ratio was 1.4: 79.5% of patients had international normalized ratio correction (< 1.5) within 1 hour of PCC therapy. Patients with intraparenchymal hemorrhage had an in-hospital mortality rate of 42.3% with median modified Rankin Scale of 5. Significant hematoma expansion occurred in 45.5%. There were 3 confirmed thrombotic complications within 7 days of PCC therapy. Conclusions PCC therapy rapidly corrected international normalized ratio in the majority of patients, yet mortality and morbidity rates remained high. Rapid international normalized ratio correction alone may not be sufficient to alter prognosis after aaICH The present prospective study was designed to evaluate the effectiveness and safety of prothrombin complex concentrate (PCC) for emergency reversal of oral anticoagulation with phenprocoumon, a long-acting coumarin.
Patients were eligible for study entry if they required emergency reversal of phenprocoumon anticoagulation because they needed invasive surgical or diagnostic procedures or were actively bleeding. Patients received one or more infusions of pasteurized nanofiltered PCC (Beriplex P/N). Primary study endpoints were changes in International Normalized Ratio, Quick value, factors II, VII, IX and X, and protein C at 10, 30 and 60 min following PCC infusion. Eight adult patients were enrolled, seven requiring urgent invasive procedures and one experiencing intracranial bleeding. In the first infusion, patients received a median 3600 IU PCC at a median infusion rate of 17.0 ml/min. Mean (SD) baseline International Normalized Ratio was 3.4 (1.2). The International Normalized Ratio 10 min after PCC infusion declined to 1.3 or less in seven of eight patients and to 1.4 in one patient. After PCC infusion, the Quick value increased by a mean of 57% [confidence interval (CI), 45-69%], circulating factor II concentration by 85% (CI, 68-103%), factor VII by 51% (CI, 40-62%), factor IX by 61% (CI, 47-76%), factor X by 115% (CI, 95-135%) and protein C by 100% (CI, 82-117%). Clinical effectiveness of PCC treatment was rated 'very good' in seven patients and 'satisfactory' in one. No thromboembolic or other adverse events occurred. PCC treatment rapidly, effectively and safely reversed phenprocoumon anticoagulation in patients undergoing urgent invasive procedures or actively bleeding INTRODUCTION Prothrombin complex concentrate (PCC) for reversal of vitamin K antagonists (VKA) is the main therapeutic option in cases of life-threatening bleeding. Clinical use of PCC is poorly documented. METHODS We prospectively assessed PCC use in four French emergency departments during a two-year period (2006-2008), before publication of the French guidelines.
An appropriate treatment was defined as one where PCC was recommended, with a dose of PCC above or equal to 20 UI/kg, with vitamin K, and with an assessment of international normalized ratio (INR) after PCC. Time of diagnosis and PCC administration were collected, as were INR values, thromboembolic events within seven days, and hospital mortality. RESULTS 256 patients received PCC for reversal of OAT. PCC was mainly prescribed for major intracerebral (ICH) or gastrointestinal hemorrhage. An appropriate treatment was observed in 26% of patients. Intra-hospital mortality for major bleeding was 33% for ICH and 26% for non-ICH major bleeding. A PCC dose > 20 UI/kg was able to reach an INR < 1.5 in 65% of patients. For major hemorrhages (70%), the time between patient arrival and treatment delivery exceeded three hours in half of cases. Control of INR was omitted in 20% of patients. No patients presented a thromboembolic event. CONCLUSION A suitable treatment was administered in 26% of patients. A PCC dose of 20-30 IU/kg seems adequate in most cases to reverse VKA activity, but both higher and lower doses achieve similar effects. Considerable progress is required to improve PCC administration and control of treatment efficacy, and to shorten time to diagnosis The incidence of spontaneous intracranial haemorrhage has increased markedly in line with the increased use of oral anticoagulant agents. Recent guidelines for reversal of this acquired coagulation defect in an emergency have been established, but they are not adhered to in all centres. Our unit is referred between 20 and 60 patients per year (1994-1999) who are anticoagulated and require urgent neurosurgical intervention. In order to investigate this, we performed a prospective study using prothrombin complex concentrate (PCC).
PCC was given to the first six patients with intracranial haemorrhage admitted to the neurosurgical unit requiring urgent correction of anticoagulation (Group 1) and compared with patients receiving standard treatment with fresh frozen plasma and vitamin K (Group 2). Mean International Normalised Ratios of Group 1 were 4.86 pretreatment and 1.32 posttreatment, and of Group 2 were 5.32 and 2.30, respectively. Results for complete reversal and reversal time were significant for PCC with p < 0.001. We recommend PCC for rapid and effective reversal of warfarin in life-threatening neurosurgical emergencies Bleeding is the most serious adverse event of oral anticoagulants and is a major cause of morbidity and mortality in such patients. Rapid reversal of anticoagulation in bleeding patients or prior to urgent surgery is mandatory. The therapeutic options in these situations include administration of fresh frozen plasma (FFP) and, more recently, of prothrombin complex concentrates (PCCs). However, viral safety and thrombogenicity of PCCs remain issues of concern. In the present study, we administered Octaplex, a new solvent/detergent (S/D) treated and nanofiltered PCC, to excessively anticoagulated bleeding patients or to anticoagulated patients facing urgent surgery. Ten excessively anticoagulated patients with major bleeding and 10 anticoagulated patients awaiting surgery (median age 72.5 (43-83) years, 9 females) received a median dose of 26.1 IU/kg body weight (BW) of Octaplex for reversal of anticoagulation. Response to Octaplex was rapid, with decline of INR within 10 min after Octaplex administration (from 6.1 ± 2. to 1.5 ± 0.3). Clinical response was graded as good in most patients (85%) and as moderate in the rest. Octaplex administration was uneventful in all patients.
Following Octaplex administration, a small increase in F1+2 levels was observed in bleeding patients, whereas D-dimer levels did not change significantly. We conclude that Octaplex is effective and safe in situations where rapid reversal of anticoagulation is needed OBJECTIVE To corroborate results obtained in The Netherlands with PPSB-SD, showing safe acute reversal of anticoagulation within 15 minutes of administration. MATERIAL AND METHODS PPSB-SD is a prothrombin complex concentrate containing a relatively constant high level of the vitamin K-dependent coagulation factors II, VII, IX and X. PPSB-SD was administered to 14 patients treated with oral anticoagulants, dosed according to the patient's weight, the initial INR and the target INR (< 2.0 for moderate haemorrhage and abdominal surgery, or < 1.5 for severe haemorrhage and cardio-vascular interventions). INR values were measured with the CoaguChek Pro (Roche Diagnostics) upon admission and at 15 minutes, 1, 3 and 5 hours after treatment, and confirmed by the hospitals' laboratory. RESULTS Within 15 minutes, 11 patients out of 12 reached their INR target (data were missing for 2 patients). INR decreased rapidly, then remained stable for the next 5 hours. All patients had a favourable outcome: bleeding was stopped and no haemorrhage occurred during surgery. Only one adverse event was reported, but it was not related to the PPSB-SD treatment. No sign of disseminated intravascular coagulation was observed during this study. The administration of PPSB-SD along with vitamin K, dosed according to body weight and initial and target INR, allowed for optimal reversal of anticoagulation, as no second infusion was necessary. The recommended dosing also worked very well for patients with high initial INR (9.2 to 22.8), who were brought down to normal values (0.9 to 1.1) within 15 minutes.
CONCLUSION PPSB-SD can safely be used for the rapid reversal of anticoagulation as needed in emergency situations Major bleeding is a serious and potentially fatal complication of treatment with vitamin K antagonists (VKAs). Prothrombin complex concentrates (PCCs) can substantially shorten the time needed to reverse VKA effects. To determine the efficacy and safety of 3-factor PCCs for the rapid reversal of VKAs in patients with major bleeding. Patients receiving VKAs and suffering from acute major bleeding were eligible for this prospective cohort study if their international normalized ratio (INR) was higher than or equal to 2.0. Stratified 35-50 IU kg−1 PCC doses were infused based on initial INR. A total of 126 patients (62 males; mean age: 74 years, range 37-96 years) were enrolled. The mean INR at presentation was 3.3 (range 2-11). At 30 min after PCC administration the mean INR was 1.4 (range: 0.9-3.1), declining to less than or equal to 1.5 in 75% of patients. The benefit of PCC was maintained for a long time, since at 97% of all post-infusion time points through 96 h the mean INR remained lower than or equal to 1.5 (mean: 1.19; range: 0.9-2.3). During hospitalization neither thrombotic complications nor significant adverse events were observed, and 12 patients died (10%); none of the deaths was judged to be related to PCC administration. 3-factor PCC administration is an effective, rapid and safe treatment for the urgent reversal of VKAs in patients with acute major bleeding. Broader use of PCC in this clinical setting appears to be appropriate BACKGROUND Although prothrombin complex concentrate (PCC) is often used to counteract vitamin K antagonist (VKA) therapy, evidence regarding the optimal dose for this indication is lacking. In Dutch hospitals, either a variable dose, based on body weight, target INR (international normalised ratio) and initial INR, or a fixed dose is used.
AIM/OBJECTIVES In this observational pilot study, the efficacy and feasibility of the fixed dose strategy compared to the variable dosing regimen is investigated. MATERIALS AND METHODS Consecutive patients receiving PCC (Cofact®, Sanquin, Amsterdam) for VKA reversal because of a major non-cranial bleed or an invasive procedure were enrolled in two cohorts. Data were collected prospectively in the fixed dose group, cohort 1, and retrospectively in the variable dose regimen, cohort 2. Study endpoints were the proportion of patients reaching target INR and successful clinical outcome. RESULTS Cohort 1 consisted of 35 and cohort 2 of 32 patients. Target INR was reached in 70% of patients in cohort 1 versus 81% in cohort 2 (P = 0·37). Successful clinical outcome was seen in 91% of patients in cohort 1 versus 94% in cohort 2 (P = 1·00). Median INR decreased from 4·7 to 1·8 with a median dosage of 1040 IU factor IX (FIX) in cohort 1 and from 4·7 to 1·6 with a median dosage of 1580 IU FIX in cohort 2. CONCLUSION This study suggests that a fixed dose of 1040 IU of FIX may be an effective way to rapidly counteract VKA therapy in our patient population and provides a basis for future research INTRODUCTION Rapid reversal of the anticoagulant effect of vitamin K antagonists (VKA) is essential when acute bleeding or emergency surgery occurs. Prothrombin complex concentrates (PCCs) produce a more rapid effect with a better clinical outcome, and do not cause volume overload, compared with fresh-frozen plasma (FFP). Octaplex is a modern, double virus-safeguarded PCC with a balanced content of vitamin K-dependent coagulation factors, which ensures fast onset of action and efficacious treatment, i.e. rapid correction of the international normalized ratio (INR).
MATERIALS AND METHODS The main purpose of this study was to demonstrate that Octaplex, when individually dosed, efficiently corrects INR to pre-determined levels in patients under oral anticoagulation who have bleeding complications or are undergoing invasive procedures. To measure the efficacy response, the INR achieved after PCC application per patient was calculated as the geometric mean of three measurements within 1 h post-infusion. RESULTS Sixty patients received a median total Octaplex dose of 41.1 (15.3-83.3) IU/kg body weight (bw). Of 56 patients evaluable in terms of efficacy, 51 (91%) showed a response as pre-defined in the protocol and in 52 (93%) the INR decreased to a value below 1.4 within one hour after dosing. The median INR declined from 2.8 (1.5-9.5) to 1.1 (1.0-1.9) within 10 min. All prothrombin complex coagulation factors recovered in parallel. Three patients had minor adverse drug reactions. One patient showed a non-symptomatic parvovirus B19 seroconversion. No thrombotic side effects were observed. CONCLUSIONS Octaplex is efficacious and safe for immediate, dosage-dependent correction of INR in patients with VKA-related deficiency of prothrombin complex coagulation factors BACKGROUND There is uncertainty regarding the efficacy and the incidence of thromboembolic events in patients treated with prothrombin complex concentrates (PCC) for the emergency reversal of warfarin effect. METHODS During 2002 to 2010 we prospectively included 160 patients treated with PCC for emergency reversal of warfarin, either for bleeding or because of the need for emergency surgery. A possible relationship to PCC was considered if objectively verified thromboembolism occurred within 7 days of PCC administration. Efficacy was adjudicated as good if the bleeding was controlled promptly or if the surgeon did not report excessive perioperative bleeding. RESULTS We included 160 patients; 72% received PCC for bleeding.
The median international normalized ratio (INR) before and after treatment with PCC was 3.5 (interquartile range [IQR] 2.6-5.4) and 1.4 (IQR 1.2-1.6). The mean dose of PCC was 1800 IU (IQR 1200-2000). In addition to PCC, 74% of the patients received vitamin K and 34% received plasma. Six patients (3.8%; 95% confidence interval [CI], 1.4-8.0%) developed thromboembolic events (3 strokes, 1 myocardial infarction, 1 deep vein thrombosis, 1 splenic infarction), possibly related to PCC. The clinical efficacy was good in 146 (91%), moderate in 6 (4%), poor in 4 (2.5%) and non-evaluable in 4 patients. CONCLUSION The administration of PCC for the emergency reversal of warfarin may be associated with a low risk of thromboembolism. The contribution of an unmasked thrombotic process due to cessation of anticoagulation, or of activation of coagulation by the hemorrhagic event, should also be considered Summary. Beriplex, a prothrombin complex concentrate (PCC), was administered to 42 patients requiring immediate reversal of their oral anticoagulant therapy. The dose administered was determined using the pretreatment International Normalized Ratio (INR). Blood samples were obtained before treatment and at 20, 60 and 120 min after treatment. The following investigations were performed on all samples: INR, clotting factors II, VII, IX and X, coagulation inhibitors protein C (PC) and antithrombin (AT), and other markers of disseminated intravascular coagulation, i.e. plasma fibrinogen, D-dimer and platelet count. Immediate reversal of the INR, the vitamin K-dependent clotting factors and PC was achieved in virtually all patients. Reduced AT levels were present in 18 patients before treatment. Further slight AT reductions occurred in four patients, but other associated abnormalities of haemostasis were observed in only one of the four patients.
One patient with severe peripheral vascular disease, sepsis and renal and cardiac failure died of a thrombotic stroke following leg amputation, 48 h after receiving Beriplex. No other arterial and no venous thromboembolic events occurred within 7 d of treatment. Beriplex is effective in rapidly reversing the anticoagulant effects of warfarin, including PC deficiency, without inducing coagulation activation. Caution should continue to be exercised in the use of these products in patients with disseminated intravascular coagulation, sepsis or liver disease In an open non-randomized study, 10 patients with major bleeding and an International Normalized Ratio (INR) greater than 14 were treated with 5 mg of intravenous vitamin K and 30 iu/kg of a single concentrate containing factors II, VII, IX and X (Beriplex P/N; Aventis Behring). The levels of these factors before and immediately after treatment were 4·7, 1·6, 8·5 and 1·1 iu/dl and 94, 30, 66 and 91 iu/dl respectively. The median INR before treatment was greater than 20 and, after treatment, 1·1. All patients had a satisfactory clinical response with immediate cessation of bleeding, and no thromboembolic complications occurred Background Despite years of experience with vitamin K antagonist-associated bleeding events, there is still no evidence to help identify the optimal treatment with prothrombin complex concentrates. Variable dosing and fixed dose strategies are being used. In this observational, prospective two-cohort study, we aimed to assess the non-inferiority of a low fixed PCC dose (1,040 IU factor IX) compared to the registered variable dosing regimen based on baseline International Normalized Ratio, bodyweight, and target International Normalized Ratio, to counteract vitamin K antagonists in a bleeding emergency in a daily clinical practice setting.
Design and Methods Non-inferiority of the fixed prothrombin complex concentrate dose was hypothesized with a margin of 4%. Main endpoints were the proportion of patients reaching the target International Normalized Ratio (< 2.0) after prothrombin complex concentrate treatment, and successful clinical outcome. Results Target International Normalized Ratio was reached in 92% of the fixed dose patients (n = 101) versus 95% of variable dose patients (n = 139), resulting in a risk difference of -2.99% (90% CI: -8.6 to 2.7) (non-inferiority not confirmed). Clinical outcome was successful in 96% and 88% of fixed versus variable dose, respectively, with a risk difference of 8.3% (90% CI: 2.7-13.9; non-inferiority confirmed). Conclusions Although a lower fixed prothrombin complex concentrate dose was associated with successful clinical outcome, fewer patients reached the target International Normalized Ratio Background: Patients experiencing major bleeding while taking vitamin K antagonists require rapid vitamin K antagonist reversal. We performed a prospective clinical trial to compare nonactivated 4-factor prothrombin complex concentrate (4F-PCC) with plasma for urgent vitamin K antagonist reversal. Methods and Results: In this phase IIIb, multicenter, open-label, noninferiority trial, nonsurgical patients were randomized to 4F-PCC (containing coagulation factors II, VII, IX, and X and proteins C and S) or plasma. Primary analyses examined whether 4F-PCC was noninferior to plasma for the coprimary endpoints of 24-hour hemostatic efficacy from start of infusion and international normalized ratio correction (≤ 1.3) at 0.5 hour after end of infusion. The intention-to-treat efficacy population comprised 202 patients (4F-PCC, n = 98; plasma, n = 104). Median (range) baseline international normalized ratio was 3.90 (1.8-20.0) for the 4F-PCC group and 3.60 (1.9-38.9) for the plasma group.
Effective hemostasis was achieved in 72.4% of patients receiving 4F-PCC versus 65.4% receiving plasma, demonstrating noninferiority (difference, 7.1% [95% confidence interval, -5.8 to 19.9]). Rapid international normalized ratio reduction was achieved in 62.2% of patients receiving 4F-PCC versus 9.6% receiving plasma, demonstrating 4F-PCC superiority (difference, 52.6% [95% confidence interval, 39.4 to 65.9]). Assessed coagulation factors were higher in the 4F-PCC group than in the plasma group from 0.5 to 3 hours after infusion start (P < 0.02). The safety profile (adverse events, serious adverse events, thromboembolic events, and deaths) was similar between groups; 66 of 103 (4F-PCC group) and 71 of 109 (plasma group) patients experienced ≥ 1 adverse event. Conclusions: 4F-PCC is an effective alternative to plasma for urgent reversal of vitamin K antagonist therapy in major bleeding events, as demonstrated by clinical assessments of bleeding and laboratory measurements of international normalized ratio and factor levels. Clinical Trial Registration: URL: http://www.clinicaltrials.gov. Unique identifier: NCT00708435
2,045
17,347,963
Antiviral prophylaxis is associated with a significant reduction in the risk of CMV infection and all-cause mortality in CMV-negative and CMV-positive renal transplant recipients from CMV-positive donors, regardless of the immunosuppressive treatments used (evidence from SR). Pre-emptive therapy has been found to be effective in preventing CMV disease but not all-cause mortality in these patients, even if the evidence is less satisfactory compared to the data on antiviral prophylaxis (evidence from SR). There is insufficient evidence for conclusive recommendations on treatment of CMV-negative recipients of renal transplants from CMV-negative donors.
BACKGROUND The current 3rd edition of the Italian Society of Nephrology guidelines has been drawn up to summarize evidence of key intervention issues on the basis of systematic review s ( SR ) of r and omized trials ( RCT ) or RCT data only . In the present guideline , evidence of antiviral prophylaxis and pre-emptive treatment for preventing cytomegalovirus ( CMV ) infection in kidney transplant recipients is presented .
BACKGROUND The aim of this double-blind, placebo-controlled study was to determine whether a prolonged course of low-dose ganciclovir prevented the development of clinical cytomegalovirus disease after heart transplantation. METHODS Fifty-six consecutive patients were stratified into two groups: cytomegalovirus-positive recipients (n=40) and cytomegalovirus-negative recipients of organs from cytomegalovirus-positive donors (n=16). All patients received equine antithymocyte globulin induction for 7 days and maintenance doses of cyclosporine, azathioprine, and prednisolone. Ganciclovir (5 mg/kg intravenously) or matching placebo was given with the premedication, three times weekly for the first 6 weeks after transplantation and for another 2 weeks for each treated rejection episode between 6 and 12 weeks. RESULTS Ganciclovir prophylaxis reduced the actuarial incidence of cytomegalovirus disease from 71% to 11% in cytomegalovirus-mismatched patients (p<0.01). Ganciclovir prophylaxis did not reduce the incidence of cytomegalovirus disease in cytomegalovirus-positive recipients (25% in both placebo and ganciclovir groups) but did delay its onset and reduce its morbidity. There were no adverse reactions during ganciclovir administration. Gastritis was the most common clinical manifestation of cytomegalovirus disease. Pneumonitis and myocarditis were seen only in placebo-treated cytomegalovirus-mismatched patients. All patients with clinical cytomegalovirus disease responded to ganciclovir, 10 mg/kg/day for 2 weeks.
CONCLUSIONS Prolonged low-dose ganciclovir prophylaxis after heart transplantation reduces the incidence of cytomegalovirus disease in cytomegalovirus-mismatched patients and reduces the morbidity of cytomegalovirus disease in cytomegalovirus-positive recipients The use of postdetection antiviral treatment of cytomegalovirus (CMV) as a strategy to prevent infection and disease in solid-organ transplant patients has not been evaluated by placebo-controlled trials. We carried out such a study in 69 patients who had received liver transplants and had positive results of CMV polymerase chain reaction within 8 weeks after transplantation but did not have concomitant CMV infection or disease. These patients were randomly assigned to receive placebo or oral ganciclovir for 8 weeks. CMV infection developed in 21% and disease developed in 12% of placebo recipients (P=.022), compared with 3% and 0%, respectively, among ganciclovir recipients (P=.003). Similarly, in the placebo arm, 55% and 36% of CMV-negative patients who received organs from CMV-positive donors developed CMV infection or disease, respectively (P=.02), compared with 11% and 0% of such patients in the ganciclovir arm (P<.01). Oral ganciclovir administered on CMV detection by PCR prevents CMV infection or disease after liver transplantation Cytomegalovirus is the most important infectious cause of complications and death in organ transplant recipients. The three major consequences of infection with this virus are cytomegalovirus disease; superinfection with opportunistic pathogens resulting from host defects caused by the virus; and allograft injury [1, 2].
The interaction of the following three factors determines whether cytomegalovirus disease develops in a transplant recipient infected with cytomegalovirus: 1) whether the donor and donor organ or the recipient, or both, harbor latent virus that can be reactivated after transplant [1]; 2) whether the transplant recipient can mount an immune response to the virus (both cellular and humoral); and 3) the type of immunosuppressive therapy administered after transplant [4]. Transplant recipients in whom primary infection develops at the time of transplant (donor: cytomegalovirus antibody positive; recipient: cytomegalovirus antibody negative) or who test positive for cytomegalovirus antibody before transplantation and require antilymphocyte antibody therapy after transplantation have a greater than 50% attack rate of cytomegalovirus disease [4]. Each risk group accounts for 10% to 20% of all transplant recipients. Preventing cytomegalovirus disease in these two patient populations is a priority for transplant physicians. Cytomegalovirus disease has been prevented by prolonged administration (3 to 4 months) of antiviral therapies such as acyclovir [5], anticytomegalovirus hyperimmune globulin [6], or the combination of these two therapies [7]. These prophylactic strategies reduce the attack rate of cytomegalovirus disease, but this benefit is attenuated in patients who receive antilymphocyte antibody therapy [1]. Because the risk for developing cytomegalovirus disease depends on the type of immunosuppression administered after transplantation, we have proposed an alternative approach to preventing cytomegalovirus disease. This approach targets patients at greatest risk for cytomegalovirus disease for treatment with the most potent anticytomegalovirus therapy available. The antiviral therapy is administered when the risk is greatest (for example, during treatment with antilymphocyte antibodies).
To distinguish this approach from preventive strategies used in all patients (nontargeted prophylaxis), we introduced the term preemptive therapy [8]. Preliminary studies suggested that ganciclovir administered as preemptive therapy to patients receiving antilymphocyte antibodies might reduce the attack rate of cytomegalovirus disease in transplant recipients who are positive for cytomegalovirus antibody. This multiinstitutional, randomized clinical trial was designed to assess the efficacy and safety of preemptive ganciclovir therapy in preventing cytomegalovirus disease in transplant recipients who are positive for cytomegalovirus antibody and who are receiving antilymphocyte antibodies. Methods Patients Consecutive renal transplant recipients who tested positive for cytomegalovirus antibody and who received a kidney from cytomegalovirus antibody-positive or antibody-negative donors at Albany Medical Center, Emory University Hospital, Massachusetts General Hospital, New England Medical Center, University of Chicago Medical Center, and Yale-New Haven Hospital were eligible for enrollment in the study. Patients were randomly assigned to receive either preemptive ganciclovir or no ganciclovir every day that antilymphocyte antibody therapy (muromonab-CD3 [OKT3], antithymocyte globulin, or antilymphocyte globulin) was administered. Separate randomization lists were used in each transplantation center; within each center, separate randomization lists were used for cadaveric and living, related donors. The investigators at each site knew which patients received the study drug and which patients received no anticytomegalovirus therapy. The cytomegalovirus antibody status of donors and recipients was determined using enzyme immunoassay on recipient serum obtained before transplantation and on donor serum obtained before transfusion.
Patients were excluded from the study if they were younger than 20 years of age, were pregnant, had received another organ in addition to a kidney, had received any anticytomegalovirus therapy (defined as more than 1.2 g of acyclovir per day, unselected immune globulin, or cytomegalovirus hyperimmune globulin), or refused to give consent. The committees on human experimentation at each institution approved the study. All study participants were observed for the following outcomes during the 6 months after they received antilymphocyte antibodies: 1) cytomegalovirus disease; 2) other infectious diseases; and 3) noninfectious diagnoses. Allograft function 6 months after administration of antilymphocyte antibody was recorded as either present (for those not requiring dialysis) or absent (for those receiving dialysis). Serum creatinine levels were recorded for all patients with functioning allografts 6 months after antilymphocyte antibody therapy. Patients were evaluated for cytomegalovirus viremia and disease in three ways. First, the virology laboratory at each program tried to isolate cytomegalovirus from buffy-coat specimens at least every month using either centrifugation culture [9], conventional culture techniques [10], or both. Second, buffy-coat specimens were cultured for cytomegalovirus when any of the following signs, symptoms, or laboratory abnormalities were present: temperature greater than 38 C, leukocyte count less than 3.0 × 10⁹ cells/L, dyspnea, abdominal pain, or gastrointestinal bleeding. Third, a biopsy specimen was obtained from any abnormal site, and all biopsy specimens were examined histologically for the presence of characteristic cytomegalovirus inclusion bodies and uptake of immunofluorescent-labeled anticytomegalovirus antibodies.
Patients with other signs or symptoms suggesting infection (such as cough, headache, diarrhea, allograft discomfort, or dysuria) were evaluated using standard diagnostic protocols. At all study sites, the diagnostic protocol included at least a chest radiograph, leukocyte count, renal and liver function tests, two sets of blood cultures, urinalysis, urine culture, and cultures from other potential sites of infection. For patients who did not survive the 6-month observation period, data on the cause and date of death were collected. Definition of Cytomegalovirus Disease Cytomegalovirus disease was defined as either the cytomegalovirus syndrome or tissue-invasive disease developing within 6 months of antilymphocyte antibody therapy or within 1 month of discontinuing immunosuppression after allograft loss. The cytomegalovirus syndrome was diagnosed when both virologic and clinical criteria were met within a 7-day period. Virologic criteria were fulfilled when cytomegalovirus was isolated from a buffy-coat specimen or bronchoalveolar fluid. Clinical criteria were fulfilled when patients had a temperature higher than 38 C (without antipyretic agents) for 3 or more consecutive days and within 7 days of two or more of the following: 1) leukopenia (leukocyte count < 3.0 × 10⁹ cells/L on two consecutive measurements after stopping azathioprine therapy); 2) hepatitis (serum alanine aminotransferase > 1.5 times the upper limit of normal, without serologic evidence of active hepatitis B or hepatitis C virus); 3) atypical lymphocytosis (more than 20% of leukocytes); and 4) pneumonitis (an abnormal result on chest radiograph and no alternative explanation, including absence of Pneumocystis carinii in respiratory secretions).
These criteria were modified slightly from those used by other researchers [6] because isolation of cytomegalovirus from buffy-coat specimens or bronchoalveolar fluid predicts the presence of cytomegalovirus disease, whereas isolation of cytomegalovirus from urine and saliva may not [11]. Tissue-invasive cytomegalovirus disease was diagnosed histopathologically by showing the presence of inclusion bodies characteristic of cytomegalovirus or by an immunochemical stain positive for cytomegalovirus antigens in a biopsy specimen from a lesion or an abnormal site (gastrointestinal tract, lung, or liver). The investigator at each study site made decisions about treating cytomegalovirus disease. Study Drug Administration The study drug was given daily (or according to the schedule in Table 1) only when the patient already had intravenous access for administration of antilymphocyte antibodies. The study medication was started within 24 hours of the first dose of each course of antilymphocyte antibodies. Patients receiving the study drug were given ganciclovir infusions based on daily serum creatinine concentrations (Table 1). Table 1. Ganciclovir Dosage Schedule according to Daily Serum Creatinine Concentration Leukocyte count, platelet count, and serum creatinine concentration were monitored daily during antilymphocyte antibody therapy in patients who received preemptive ganciclovir therapy and in controls. Sample Size and Statistical Analysis In the primary analysis, the proportions of patients with cytomegalovirus disease in the two study groups were compared. On the basis of our previous study [4], we assumed that cytomegalovirus disease would develop in 60% of controls.
We wanted to enroll 48 patients who could be evaluated in each group (for a total of 96) to detect a decrease in the proportion of patients with cytomegalovirus disease from 60% (in controls) to 30% or less (in patients receiving preemptive ganciclovir therapy); that is, we predicted that preemptive ganciclovir treatment would yield a 50% lower attack rate (using a two-sided test, α = 0.05 and power = 0.8). Initial analyses were done on an intention-to-treat basis and included all eligible patients. We defined the primary outcome and potential predictors of cytomegalovirus disease before beginning the study. At the end of therapy and follow-up, the study nurse purged all study Despite current approaches to prophylaxis, cytomegalovirus (CMV) continues to be a common cause of infection and disease in solid-organ transplant patients. Thus, we conducted a controlled trial comparing long-term administration of ganciclovir with high-dose acyclovir for prevention of CMV infection and disease in liver transplant recipients. At the time of transplant, patients were randomised to receive either ganciclovir (6 mg/kg body weight per day intravenously from postoperative day 1 to day 30, then 6 mg/kg per day Monday through Friday until day 100) or acyclovir (10 mg/kg intravenously every 8 h from postoperative day 1 to day of discharge, then 800 mg orally four times a day until day 100). Patients were followed for development of CMV infection, CMV disease, and drug-related toxicity by frequent cultures, serological tests, laboratory measurements, and tissue biopsies. During the first 120 days after transplant, CMV infection occurred in 48 of 126 (38%) acyclovir patients but in only 6 of 124 (5%) ganciclovir patients (p<0.0001). Similarly, symptomatic CMV disease developed in 12 of 126 (10%) acyclovir patients but in only 1 of 124 (0.8%) ganciclovir patients (p=0.002).
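The sample-size statement above (48 evaluable patients per group to detect a drop from 60% to 30%, two-sided test, power 0.8) can be approximately reproduced with the textbook normal-approximation formula for comparing two independent proportions. The sketch below is an assumption about how such a figure is derived, not the investigators' actual calculation: the plain formula gives about 42 per group, and a standard continuity correction brings it close to the 48 per group they enrolled.

```python
import math
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.8):
    """Normal-approximation sample size per group for comparing two proportions."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided critical value
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    num = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return num / (p1 - p2) ** 2

n = n_per_group(0.60, 0.30)                     # ~42 per group, uncorrected
# Continuity-corrected version, which lands near the 48/group reported:
n_cc = n / 4 * (1 + math.sqrt(1 + 4 / (n * 0.30))) ** 2
print(math.ceil(n), math.ceil(n_cc))
```

The gap between the corrected and uncorrected figures illustrates why published trials often enroll slightly more patients than the simplest formula suggests.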
Ganciclovir reduced the incidence of CMV infection in both CMV antibody-positive (37 vs 4%, p=0.001) and negative patients (42 vs 11%, p=0.06). In a multivariate analysis of donor-recipient CMV antibody status and other risk factors, prophylactic ganciclovir was the most significant factor protecting against CMV infection (p<0.0001) and disease (p=0.001). Ganciclovir and acyclovir were generally well tolerated. Incidences of leukopenia, thrombocytopenia, renal failure, and other adverse events were similar in the two groups. CMV can be eliminated almost completely as a significant pathogen in liver transplant recipients by the long-term administration of prophylactic ganciclovir. In addition, the treatment is safe PURPOSE To assess the impact of cytomegalovirus (CMV) donor-recipient serostatus, infection, or disease on the development of invasive fungal infection in orthotopic liver transplant recipients. PATIENTS AND METHODS An analysis of prospectively collected data in 146 liver transplant recipients (intention-to-treat cohort) from 4 tertiary care, university-affiliated transplant centers in Boston (Boston Center for Liver Transplantation). Patients were observed for 1 year after transplantation for the development of CMV infection, CMV disease, and CMV pneumonia, as well as for the development of opportunistic fungal infections, graft survival, and mortality. Weekly cultures were taken of urine and throat, and every other week of buffy coat for CMV, for 2 months, then monthly for 6 months, at 1 year, and at the time of any clinical illness. Pre- and posttransplant variables including CMV serostatus of donor and recipient, fungal isolation from sterile body sites, fungemia, bacteremia, antibiotic use, immunosuppression, treatment for rejection, and volumes of blood products were measured.
RESULTS Survival analysis demonstrated that 36% of patients with CMV disease developed invasive fungal disease within the first year post-transplant compared with 8% of those without CMV disease (P<0.0001). One-year mortality in patients with invasive fungal disease was 15 of 22 (68%) compared with 23 of 124 (19%) in those without invasive fungal disease (P<0.001). A multivariable, time-dependent analysis demonstrated that being a CMV-seronegative recipient of a CMV-seropositive donor organ (P<0.001), having bacteremia (P=0.001), UNOS (United Network for Organ Sharing) status 4 (need for life support measures) at transplant (P=0.002), and volume of platelets (P=0.002) were independently associated with invasive fungal disease. Restriction of cases of invasive fungal disease to those that occurred more than 2 weeks after transplant demonstrated an association with CMV disease (P=0.003), bacteremia (P=0.003), need for life support (P=0.03), and volume of blood products transfused (P=0.02). CONCLUSION CMV disease or being a CMV-seronegative recipient of a CMV-seropositive donor organ is an important predictor of invasive fungal disease following orthotopic liver transplantation BACKGROUND Cytomegalovirus is an important cause of morbidity and mortality risk after lung transplantation. Ganciclovir, when given for a period of up to 3 months after lung transplantation, has been shown to reduce the incidence and severity of cytomegalovirus. However, daily prophylaxis is associated with considerable expense, inconvenience, and morbidity risk. The goal of this study was to determine whether 3-times-weekly dosing is as effective as daily prophylaxis with ganciclovir in preventing cytomegalovirus disease. METHODS Seventy-two consecutive subjects who had either donor or recipient cytomegalovirus seropositivity were randomized to the daily group (n=35) or the 3-times-weekly group (n=37).
All subjects received twice-daily ganciclovir treatment for 2 weeks. Thereafter, subjects received either daily or 3-times-weekly ganciclovir dosing until 90 days after transplantation. Subjects were then monitored for 28 ± 13 months to identify outcomes and complications. RESULTS There were no significant differences between the daily and 3-times-weekly groups with respect to survival free from cytomegalovirus infection or survival free from cytomegalovirus disease. In both groups, cytomegalovirus infection and disease frequently emerged after the termination of prophylaxis. However, in most cases the cytomegalovirus syndromes observed were mild and in many cases could be treated on an outpatient basis. There was no significant difference between the groups in the incidence of obliterative bronchiolitis or time to onset of grade 2 bronchiolitis obliterans syndrome. Overall patient survival was better in the daily group, but the survival advantage did not appear to be related to a reduction in cytomegalovirus-related disease. Complications of ganciclovir prophylaxis included leukopenia in 2 subjects in the 3-times-weekly group and catheter-related sepsis in 6 subjects from each group. CONCLUSIONS We conclude that intravenous ganciclovir given 3 times weekly for 3 months after transplantation is as effective as daily ganciclovir given for a similar time period. The 3-times-weekly dosing regimen did not result in increased infection, disease, or sequelae of cytomegalovirus infection when compared with the daily regimen BACKGROUND About one-quarter of renal transplant patients will suffer from symptomatic cytomegalovirus (CMV) disease if no preventive therapeutic measures are taken. In this prospective, randomized single-centre study, pre-emptive therapy with oral ganciclovir is compared with conventional deferred treatment.
METHODS Renal transplant recipients (n=455) over 18 years of age were screened weekly for CMV pp65 antigenaemia during the first 12 weeks post-transplantation. If CMV pp65 antigen in leukocytes appeared within 8 weeks post-transplantation, patients were randomized and included in the study. Five patients developed CMV disease before positive CMV pp65, and 14 patients with a positive antigen test developed CMV disease before randomization could take place, all these representing a limitation of the applicability of the results in the overall renal transplant population. Altogether 179 patients were not randomized for various reasons. Eighty patients completed the study; 42 were randomized to receive pre-emptive oral ganciclovir therapy and 38 to conventional deferred treatment (control group). RESULTS Time from transplantation to start of ganciclovir capsules was 36 (12-60) days and duration of oral ganciclovir therapy was 49 (27-70) days, median (range). No patient in the pre-emptive treatment group, but nine of 38 patients (23.7%) in the control group, developed CMV disease during the first 12 weeks post-transplantation (P=0.0009). In the period from 3 months to 1 year post-transplantation, two patients in each group developed CMV disease. There were no significant differences in acute rejection or renal function between treatment groups during the first post-transplant year. CONCLUSIONS Pre-emptive oral ganciclovir therapy in renal transplant recipients during the first 12 weeks post-transplantation effectively prevents CMV disease during this time period. The incidence of late CMV disease (3 months to 1 year after transplantation) was similar in the two groups, indicating that pre-emptive therapy does not result in late onset of CMV disease Background.
With the development of sensitive tests to detect cytomegalovirus (CMV) viremia, preemptive approaches become a reasonable alternative to general CMV prophylaxis. We performed a randomized trial comparing pp65-antigenemia-guided preemptive therapy using oral ganciclovir with symptom-triggered intravenous ganciclovir treatment. Methods. Eighty-eight of 372 liver transplant recipients developed antigenemia early after orthotopic liver transplantation. Twenty-eight symptomatic patients with antigenemia were excluded from randomization and treated with intravenous ganciclovir. Sixty pp65-antigen-positive asymptomatic patients were randomized to receive either oral ganciclovir 3 × 1 g/day for 14 days (group 1) or no preemptive treatment (group 2). Patients who developed CMV disease were treated with intravenous ganciclovir 2 × 5 mg/kg body weight for 14 days. The high-risk (Donor+/Recipient−) patients were equally distributed in the two study groups. Results. Three of 30 (10%) patients on oral ganciclovir developed mild to moderate CMV disease compared with 6/30 (20%) patients in the control group. In the Donor+/Recipient− patients, the incidence of CMV disease was 1/6 and 3/7. All disease episodes resolved after intravenous treatment. The 1- and 3-year patient and organ survival was the same in the study groups and in the patients with or without CMV infection. No deaths related to CMV occurred. Conclusions. The positive predictive value of pp65-antigenemia for the development of CMV disease was very low, and, in 28/88 patients (32%), antigenemia did not precede symptoms. Therefore, pp65-antigenemia is of limited value in deciding on the timing and need for ganciclovir therapy after liver transplantation OBJECTIVE To evaluate the efficacy, safety, and cost of preemptive ganciclovir therapy in cytomegalovirus (CMV)-seropositive renal transplant recipients treated with antilymphocyte antibody (ALA) preparations.
DESIGN AND SETTING A prospective, randomized trial at a 650-bed tertiary medical center hospital. PATIENTS Forty consecutive CMV-seropositive renal allograft recipients who underwent transplantation between January 1992 and January 1994 and were treated with ALA for induction immunosuppression or acute rejection therapy. MAIN OUTCOME MEASURES The incidence and severity of CMV disease, length of hospitalization, and patient and allograft survival. INTERVENTION Cytomegalovirus infection prophylaxis by use of intravenous ganciclovir during ALA therapy was administered to 22 patients (group 1) and the results were compared with those obtained in 18 control patients who did not receive prophylaxis for CMV disease (group 2). RESULTS Preemptive ganciclovir therapy significantly reduced the incidence of CMV disease (P<.05) in CMV-seropositive renal transplant patients who were treated with ALA and was well tolerated. In addition, the cost of prophylactic therapy was offset by the decreased length of hospitalization observed in patients in group 1. CONCLUSION Preemptive ganciclovir therapy provides a cost-effective approach toward significantly improving the outcome of renal transplantation in CMV-seropositive patients treated with ALA BACKGROUND Cytomegalovirus (CMV) disease is a major complication of organ transplantation. We hypothesized that prophylactic treatment with valacyclovir would reduce the risk of CMV disease. METHODS A total of 208 CMV-negative recipients of a kidney from a seropositive donor and 408 CMV-positive recipients were randomly assigned to receive either 2 g of valacyclovir or placebo orally four times daily for 90 days after transplantation, with the dose adjusted according to renal function. The primary end point was laboratory-confirmed CMV disease in the first six months after transplantation.
RESULTS Treatment with valacyclovir reduced the incidence or delayed the onset of CMV disease in both the seronegative patients (P<0.001) and the seropositive patients (P=0.03). Among the seronegative patients, the incidence of CMV disease 90 days after transplantation was 45 percent among placebo recipients and 3 percent among valacyclovir recipients. Among the seropositive patients, the respective values were 6 percent and 0 percent. At six months, the incidence of CMV disease was 45 percent among seronegative recipients of placebo and 16 percent among seronegative recipients of valacyclovir; it was 6 percent among seropositive placebo recipients and 1 percent among seropositive valacyclovir recipients. At six months, the rate of biopsy-confirmed acute graft rejection in the seronegative group was 52 percent among placebo recipients and 26 percent among valacyclovir recipients (P=0.001). Treatment with valacyclovir also decreased the rates of CMV viremia and viruria, herpes simplex virus disease, and the use of inpatient medical resources. Hallucinations and confusion were more common with valacyclovir treatment, but these events were not severe or treatment-limiting. The rates of other adverse events were similar among the groups. CONCLUSIONS Prophylactic treatment with valacyclovir is a safe and effective way to prevent CMV disease after renal transplantation BACKGROUND Posttransplantation cytomegalovirus (CMV) infection remains a significant cause of morbidity in kidney transplant recipients. We performed a randomized prospective controlled trial of oral acyclovir versus oral ganciclovir for CMV prophylaxis in a group of renal allograft recipients considered at high risk for CMV disease due to the use of OKT3 induction therapy. METHODS A total of 101 recipients of cadaveric (83) and zero-haplotype-matched live donor (18) kidney transplants were entered into the trial. A total of 22 D-R- patients received no prophylaxis.
Twenty-seven D+R-, 29 D+R+, and 23 D-R+ patients were randomized to receive 3 months of either oral acyclovir (800 mg q.i.d.) or oral ganciclovir (1000 mg t.i.d.). Doses were adjusted according to the level of renal function. The D+R- patients were also given CMV immune globulin biweekly for 16 weeks. Surveillance blood cultures were obtained at transplantation, at months 1, 2, 3, and 6, and when clinically indicated. The primary study end points were time to CMV infection and disease in the first 6 months after transplantation. RESULTS The mean follow-up was 14.4 months. Both agents were well tolerated, and no drug interruptions for toxicity occurred. CMV was isolated in 14 of 39 (35.9%) acyclovir-treated and 1 of 40 (2.5%) ganciclovir-treated recipients by 6 months (P=0.0001). Symptomatic CMV disease occurred in 9 of 14 (64%) of the acyclovir patients, two with tissue-invasive disease. Infection rates for acyclovir vs. ganciclovir, respectively, stratified by CMV serology were: D+R-, 54 vs. 0%, P=0.0008; D+R+, 43 vs. 6.6%, P=0.01; D-R+, 8.3 vs. 0%, P=NS. No patient developed CMV infection while taking oral ganciclovir; however, three delayed infections occurred 2-7 months after finishing therapy. Each patient had been previously treated for acute rejection. CONCLUSIONS Oral acyclovir provides effective CMV prophylaxis only for recipients of seronegative donor kidneys. Oral ganciclovir is a superior agent, providing effective CMV prophylaxis for recipients of seropositive donor kidneys. Recipients who are treated for acute rejection are at risk for delayed CMV infection during the first posttransplantation year Unlike parenteral ganciclovir, the efficacy of oral ganciclovir in the prevention and treatment of cytomegalovirus (CMV) infection in kidney transplantation has not been well documented.
This study prospectively evaluated the episodes of CMV infection within the first nine months after transplantation in renal transplant recipients treated prophylactically with oral ganciclovir (750 mg twice a day) over a period of 3 months (oral ganciclovir, N=22), compared with patients randomly assigned as controls (controls, N=22) who did not receive any antiviral prophylaxis. Diagnosis of CMV infection at presentation included serological determination of CMV-specific immunoglobulin M antibodies, CMV immunofluorescence assay (standard culture tube and shell vial) (blood), and virus isolation (urine and tissue). CMV infection occurred in one patient (5%) in the oral ganciclovir group and 6 patients (27%) in the control group (p<0.05). The episodes of biopsy-proven allograft rejection were 5% (1/21) and 18% (4/22) in the oral ganciclovir and control groups, respectively. Except for one, none of these patients developed CMV infection either before or after rejection(s). Controlling for the reason (induction or treatment of rejection) for using cytolytic antibody therapies, we found that prophylactic oral ganciclovir was protective against CMV infection (adjusted risk reduction 0.83; 95% confidence interval, 0.33-0.98; p<0.05). Neither the CMV status of donors and recipients nor the treatment for acute rejection had any significant impact on the occurrence of CMV infections. Our results show that oral ganciclovir is an effective and well-tolerated therapy in the prevention of CMV infection in renal transplant patients BACKGROUND Cytomegalovirus (CMV) continues to be a major problem post-transplantation; early markers for predicting patients at risk of CMV disease are needed. Peak CMV load in the blood correlates with CMV disease but frequently occurs too late to provide prognostic information.
METHODS 359 transplant recipients ( 162 liver , 87 renal , and 110 bone marrow ) were prospectively monitored for CMV DNA in the blood with qualitative and quantitative PCR . 3873 samples were analysed . The CMV load in the first PCR-positive sample and the rate of increase in CMV load in blood during the initial phase of replication were assessed as risk factors for CMV disease using logistic regression . FINDINGS 127 of the 359 patients had CMV DNA in the blood and 49 developed CMV disease . Initial viral load correlated significantly with peak CMV load ( R2=0.47 , p<0.001 ) and with CMV disease ( odds ratio 1.82 [ 95 % CI 1.11 - 2.98 ] , p=0.02 ; 1.34 [ 1.07 - 1.68 ] , p=0.01 , and 1.52 [ 1.13 - 2.05 ] , p=0.006 , per 0.25 log10 increase in viral load for liver , renal , and bone-marrow patients , respectively ) . The rate of increase in CMV load between the last PCR-negative and first PCR-positive sample was significantly faster in patients with CMV disease ( 0.33 log10 versus 0.19 log10 genomes/mL daily , p<0.001 ) . In multivariate-regression analyses , both initial CMV load and rate of viral load increase were independent risk factors for CMV disease ( 1.28 [ 1.06 - 1.52 ] , p=0.01 , per 0.25 log10 increase in CMV load and 1.52 [ 1.06 - 2.17 ] , p=0.02 , per 0.1 log10 increase in CMV load/mL daily , respectively ) . INTERPRETATION CMV load in the initial phase of active infection and the rate of increase in viral load both correlate with CMV disease in transplant recipients ; in combination , they have the potential to identify patients at imminent risk of CMV disease We compared the efficacy and safety of valganciclovir with those of oral ganciclovir in preventing cytomegalovirus ( CMV ) disease in high‐risk seronegative solid organ transplant ( SOT ) recipients of organs from seropositive donors ( D+/R‐ ) .
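Because the odds ratios above are reported per 0.25 log10 increase in viral load , the implied odds ratio for a larger rise follows by exponentiation ( logistic-regression effects multiply on the log-odds scale ) . A small illustrative helper , not taken from the paper :

```python
def scale_odds_ratio(or_per_unit: float, total_increase: float,
                     unit: float = 0.25) -> float:
    """Extrapolate an odds ratio reported per `unit` log10 copies
    to a `total_increase` log10 rise in viral load."""
    return or_per_unit ** (total_increase / unit)

# OR 1.34 per 0.25 log10 (renal recipients) implies ~3.2 per 1.0 log10 rise
print(round(scale_odds_ratio(1.34, 1.0), 2))  # -> 3.22
```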
In this randomised , prospective , double‐blind , double‐dummy study , 364 CMV D+/R‐ patients received valganciclovir 900 mg once daily or oral ganciclovir 1000 mg three times a day ( tid ) within 10 days of transplant and continued through 100 days . CMV disease , plasma viremia , acute graft rejection , graft loss and safety were analyzed up to 6 and 12 months post‐transplant . Endpoint committee‐defined CMV disease developed in 12.1 % and 15.2 % of valganciclovir and ganciclovir patients , respectively , by 6 months , though with a difference in the relative efficacy of valganciclovir and ganciclovir between organs ( i.e. an organ type‐treatment interaction ) . By 12 months , respective incidences were 17.2 % and 18.4 % , and the incidence of investigator‐treated CMV disease events was comparable in the valganciclovir ( 30.5 % ) and ganciclovir ( 28.0 % ) arms . CMV viremia during prophylaxis was significantly lower with valganciclovir ( 2.9 % vs. 10.4 % ; p = 0.001 ) , but was comparable by 12 months ( 48.5 % valganciclovir vs 48.8 % ganciclovir ) . Time‐to‐onset of CMV disease and to viremia were delayed with valganciclovir ; rates of acute allograft rejection were generally lower with valganciclovir . Except for a higher incidence of neutropenia with valganciclovir ( 8.2 % , vs 3.2 % ganciclovir ) the safety profile was similar for both drugs . Overall , once‐daily oral valganciclovir was as clinically effective and well‐tolerated as oral ganciclovir tid for CMV prevention in high‐risk SOT recipients This study was designed to evaluate the longitudinal history of cytomegalovirus ( CMV ) infection and to test the capacity of ganciclovir as effective therapy in CMV-seropositive renal transplant recipients . The CMV viremia was detected with CMV pp65 antigenemia assay in 153 renal transplants . The recipients were classified as having low-grade and high-grade CMV infections according to the severity of CMV infection .
The recipients with low-grade CMV infections were observed without ganciclovir treatment , and the recipients with high-grade CMV infection were randomly assigned to ganciclovir-treated and untreated groups . The clinical course between low-grade and high-grade CMV infections was evaluated . All recipients with low-grade CMV infection ( n = 62 ) showed spontaneous remission regardless of immunosuppressants . In high-grade CMV infection ( n = 31 ) , the ciclosporin A treated group ( n = 11 ) showed no evidence of CMV disease , and the methylprednisolone-treated group ( n = 8 ) showed CMV disease in 1 ( 25 % ) of 4 ganciclovir-untreated recipients . In the OKT3 group ( n = 12 ) , symptomatic CMV infection was observed in 6 ( 100 % ) ganciclovir-untreated recipients contrary to no CMV disease in the ganciclovir-treated group ( p < 0.05 ) . In conclusion , the CMV antigenemia assay is effective in monitoring CMV viremia , and ganciclovir treatment should be done during early CMV viremia in OKT3-treated recipients BACKGROUND Because of the immunosuppression required , heart-transplant recipients frequently have complications caused by cytomegalovirus ( CMV ) , including pneumonia , esophagitis , gastritis , and a syndrome of fever , hepatitis , and leukopenia . We undertook a controlled trial to evaluate the prophylactic administration of ganciclovir to prevent CMV-induced disease after heart transplantation . METHODS This randomized , double-blind , placebo-controlled trial was conducted at four centers . Before randomization , the patients were stratified into two groups : those who were seropositive for CMV before transplantation and those who were seronegative but who received hearts from seropositive donors . Ganciclovir was given intravenously at a dose of 5 mg per kilogram of body weight every 12 hours from postoperative day 1 through day 14 , then at a dose of 6 mg per kilogram each day for 5 days per week until day 28 .
RESULTS Among the seropositive patients , CMV illness occurred during the first 120 days after heart transplantation in 26 of 56 patients given placebo ( 46 percent ) , as compared with 5 of 56 patients treated with ganciclovir ( 9 percent ) ( P less than 0.001 ) . Among 37 seronegative patients , CMV illness was frequent in both groups ( placebo , 29 percent ; ganciclovir , 35 percent ; P not significant ) . From day 15 through day 60 , the patients who took ganciclovir had significantly fewer urine cultures positive for CMV , but by day 90 there was no difference . More of the ganciclovir-treated patients had serum creatinine concentrations greater than or equal to 221 μmol per liter ( 2.5 mg per deciliter ) ( 18 percent vs. 4 percent in the placebo group ) , but those elevations were transient . CONCLUSIONS The prophylactic administration of ganciclovir after heart transplantation is safe , and in CMV-seropositive patients it reduces the incidence of CMV-induced illness BACKGROUND The early detection of cytomegalovirus ( CMV ) after liver transplantation may form the basis of a preemptive strategy for prevention of active CMV disease . METHODS We prospectively analyzed the clinical use of weekly quantitative polymerase chain reaction (PCR)-based plasma viral load determinations and the antigenemia assay for predicting the development of active CMV disease in 97 consecutive liver transplant recipients . RESULTS CMV disease occurred in 21/97 patients . Using a positive cut-off of > 400 copies/ml plasma , PCR had a sensitivity of 100 % , specificity 47.4 % , positive predictive value 34.4 % and negative predictive value 100 % for prediction of CMV disease . Respective values for a positive antigenemia ( > 0 positive cells/slide ) were 95.2 , 55.3 , 37.0 , and 97.7 % . Different cut-off points for a positive test were analyzed using receiver-operating characteristic ( ROC ) curves .
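The sensitivity , specificity , PPV and NPV figures above follow from a standard 2x2 table . A sketch of the calculation ; the counts used here ( 21 true positives , 40 false positives , 0 false negatives , 36 true negatives ) are back-calculated from the reported percentages , not taken directly from the paper :

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard test-performance measures from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),  # diseased patients detected
        "specificity": tn / (tn + fp),  # disease-free patients test-negative
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# PCR at the > 400 copies/ml cut-off ( 21 of 97 patients developed disease )
m = diagnostic_metrics(tp=21, fp=40, fn=0, tn=36)
# reproduces sensitivity 100 % , specificity 47.4 % , PPV 34.4 % , NPV 100 %
```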
The optimal cut-off for viral load was in the range of 2000 - 5000 copies/ml ( sensitivity 85.7 % , specificity 86.8 % , PPV 64.3 % , NPV 95.7 % for > 5000 copies/ml ) . The optimal cut-off for antigenemia was in the range of four to six positive cells/slide . Mean peak viral load in symptomatic patients was 73,715 copies/ml versus 3615 copies/ml in patients with asymptomatic CMV reactivation ( P<0.001 ) . In a multivariate logistic regression analysis of risk factors for CMV disease ( CMV serostatus , acute rejection , and induction immunosuppression ) , peak viral load and peak antigenemia emerged as the only significant independent predictors of CMV disease ( for PCR , odds ratio=1.40/1000 copy/ml increase in viral load , P=0.0001 ; for antigenemia odds ratio=1.17/1 positive cell/slide ) . CONCLUSIONS Plasma viral load by quantitative PCR is useful for predicting CMV disease and could be used in a preemptive strategy Background . Both oral ganciclovir and valacyclovir decrease the incidence of cytomegalovirus ( CMV ) disease after renal transplantation . Moreover , valacyclovir has been shown to reduce the risk of acute rejection . Our study was designed to compare the efficacy and safety of oral ganciclovir and valacyclovir in the prophylaxis of CMV disease after renal transplantation . Methods . A total of 83 patients were prospectively randomized to 3-month treatment with oral ganciclovir ( 3 g/day , n=36 , GAN ) or oral valacyclovir ( 8 g/day , n=35 , VAL ) . A control group ( DEF , n=12 ) was managed by deferred therapy . Results . No differences were found in demography , immunosuppression , or donor/recipient CMV serology . The 12-month incidence of CMV disease was 67 % in the DEF group compared with 6 % in the GAN group and 3 % in the VAL group ( P<0.001 GAN or VAL vs. DEF ; P=0.575 GAN vs. VAL ) .
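Incidence differences of this size translate into very small numbers needed to treat . A quick sketch from the reported 12-month incidences ( 67 % with deferred therapy vs. 3 % with valacyclovir prophylaxis ) ; the proportions are used directly rather than the underlying patient counts :

```python
def arr_and_nnt(risk_control: float, risk_treated: float) -> tuple:
    """Absolute risk reduction and number needed to treat."""
    arr = risk_control - risk_treated
    return arr, 1.0 / arr

# Deferred therapy ( 67 % ) vs. valacyclovir prophylaxis ( 3 % )
arr, nnt = arr_and_nnt(0.67, 0.03)
# ARR 0.64 , NNT ~1.6 : fewer than 2 patients treated per case prevented
```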
The biopsy-confirmed acute rejection rate at 12 months was 12 % in the VAL group compared with 34 % in the GAN group ( P=0.030 ) and 58 % in the DEF group ( P<0.001 ) . The difference between the GAN and DEF groups was not significant ( P=0.087 ) . The average CMV-associated costs per patient were $ 3,072 , $ 2,906 , and $ 4,906 in the GAN , VAL , and DEF groups , respectively . Conclusions . Valacyclovir and oral ganciclovir are equally effective in the prevention of CMV disease after renal transplantation . Both regimens are cost-effective . Valacyclovir is associated with a significantly reduced risk of acute rejection compared with both ganciclovir prophylaxis and deferred therapy The objective of this randomized , prospective study was to compare preemptive to deferred treatment of cytomegalovirus ( CMV ) infection in high-risk renal transplant recipients . Conducted at a university-affiliated transplant center , the study included 36 renal allograft recipients with donor or recipient CMV-seropositivity who received anti-thymocyte induction therapy . Ganciclovir was administered intravenously for 21 days upon detection of CMV viremia ( preemptive , N = 15 ) or detection of CMV viremia associated with a CMV syndrome ( deferred , N = 21 ) . Shell vial culture , conventional culture , and polymerase chain reaction ( PCR ) were performed upon buffy-coat specimens weekly for 12 to 16 wk . CMV and non-CMV-associated charges were calculated . The comparative sensitivities of PCR , shell vial culture , and conventional culture were 91 % , 44 % , and 47 % , respectively . A delay in specimen processing of > 24 h severely compromised the sensitivity of culture techniques but not that of PCR . Preemptive therapy tended to decrease symptomatic CMV episodes ( 0.4 versus 0.6 episodes per patient randomized ; P = 0.22 ) . One patient in each group had organ involvement , and no patient died . Allograft function and survival were similar .
Ganciclovir use was increased in the preemptive group ( 1.2 versus 0.6 courses per patient randomized ; P = 0.02 ) . CMV-associated charges were $ 10,368 ( preemptive ) versus $ 5,752 ( deferred ) ; P = 0.13 . PCR is superior to conventional monitoring to detect CMV viremia . Culture cannot be considered the " gold standard " for detection of CMV viremia , especially when transport of specimens over distances results in processing delays . Preemptive therapy may reduce symptomatic CMV infections in renal transplant recipients . It was associated with higher CMV-related charges but equivalent overall charges versus deferred treatment with intensive monitoring . Either strategy can achieve control of CMV infection after renal transplantation Background . Without effective antiviral prophylaxis , cytomegalovirus ( CMV ) disease is a common cause of morbidity and mortality after liver transplantation . In this randomized , controlled trial , we compared the efficacy and safety of oral ganciclovir with oral acyclovir after induction with intravenous ( IV ) ganciclovir for long-term prophylaxis of CMV disease in CMV-seropositive liver transplant recipients . Methods . Patients were initially administered IV ganciclovir at a dose of 6 mg/kg per day from day 1 to day 14 after transplantation followed by either oral ganciclovir ( 1 g every 8 hr ) or oral acyclovir ( 800 mg every 6 hr ) from day 15 to day 100 after transplantation . Results . CMV disease occurred in only 1 of 110 patients ( 0.9 % ) receiving ganciclovir compared with 8 of 109 patients ( 7.3 % ) receiving acyclovir within the first year after transplantation ( P = 0.019 ) . There was one case of CMV colitis in the ganciclovir group , whereas four cases of CMV syndrome , three cases of CMV pneumonia , and one case of CMV hepatitis developed in the acyclovir group . The only death from CMV disease occurred in an acyclovir-treated patient with CMV pneumonia .
Both oral ganciclovir and oral acyclovir were generally well tolerated . Reversible leukopenia ( decline in white blood cell count to < 3.0 × 10(9)/L ) was more common with oral ganciclovir ( 38/110 patients , 35 % ) than with oral acyclovir ( 20/109 patients , 18 % ) ( P = 0.009 ) . The emergence of ganciclovir-resistant strains of CMV was not found during the study . Conclusions . A prophylactic regimen of 2 weeks of IV ganciclovir followed by an additional 12 weeks of oral ganciclovir is superior to a similar regimen of IV ganciclovir followed by oral acyclovir and almost completely eliminates CMV disease after liver transplantation . This superior protection against CMV disease extends up to 1 year after transplantation and is not associated with ganciclovir resistance BACKGROUND Optimal prophylaxis against cytomegalovirus ( CMV ) disease for organ transplant patients at risk for primary infection ( donor seropositive , recipient seronegative , D+R- ) remains to be determined . We hypothesized that prolonged oral ganciclovir therapy following intravenous therapy would provide increased protection . METHODS A total of 155 evaluable D+R- organ transplant recipients from 13 transplant centers were entered into the study : all received intravenous ganciclovir ( 5 mg/kg/day ) for 5 - 10 days and then either oral acyclovir ( 400 mg tid ) or oral ganciclovir ( 1 g tid ) for an additional 12 weeks . Patients were assigned to their treatment groups at a central randomization site , with a separate randomization scheme for each of the organs transplanted ( kidney , heart , or liver ) . In the case of kidney transplants , the patients were stratified according to source of the kidney ( living related vs. cadaveric donor ) . The primary endpoint was the incidence of CMV disease in the first six months post-transplant .
RESULTS Treatment with oral ganciclovir was associated with a significant decrease in the incidence of symptomatic disease or viremia when compared with the oral acyclovir group ( 32 % vs. 50 % , P<0.05 ) . This difference was most marked in terms of tissue invasive disease : only 3 of 15 symptomatic patients in the ganciclovir group vs. 10 of 21 in the acyclovir group developed tissue-invasive infection ( P<0.05 ) . There was a significant difference in the time to CMV disease or viremia in the two groups : mean time 212+/-17 days post-transplant for the acyclovir group vs. 291+/-13 days for the ganciclovir group ( P<0.001 ) . The incidence of allograft rejection was 34 % in the ganciclovir group and 46 % in the acyclovir group ( P = NS ) . Leukopenia was more common in the ganciclovir group ( P<0.05 ) , but in no case did it require drug discontinuation . Ganciclovir resistance did not develop in this study . CONCLUSION Prophylaxis with oral ganciclovir following a brief course of intravenous ganciclovir provides useful protection against primary CMV disease BACKGROUND Cytomegalovirus ( CMV ) is a major cause of serious morbidity following solid organ transplantation via both direct and indirect mechanisms . The aim of this study was to investigate the efficacy and safety of valacyclovir prophylaxis in heart transplant recipients . METHODS Twenty-seven CMV seropositive adults due to receive a heart transplant were included in a single-center , randomized , double-blind study . Patients were randomized to receive either oral valacyclovir 2000 mg or oral acyclovir 200 mg four times daily starting within 3 days of heart transplant and continuing for 90 days . The primary outcome measure was time to development of CMV antigenemia assessed for 6 months after surgery . Other measures were time to asymptomatic CMV infection , symptomatic CMV infections , and end-organ CMV disease .
Patients were monitored for other herpes infections , other opportunistic infections , and acute graft rejection . Safety was assessed by evaluating changes in hematology and clinical chemistry parameters and by the occurrence of adverse events . RESULTS The median time to CMV antigenemia was 19 days for the acyclovir group compared with 119 days for the valacyclovir group ( hazard ratio 0.42 ; 95 % CI , 0.18 - 0.99 ; p = 0.049 ) . Similar delays of approximately 100 days were found for CMV infection , symptomatic CMV infection , and CMV disease . There was also a trend for delayed acute rejection , and fewer opportunistic or other herpesvirus infections occurred in the valacyclovir group . Valacyclovir was well tolerated in the study population . CONCLUSION Oral valacyclovir is a safe and effective mode of prophylaxis of CMV after heart transplantation Cytomegalovirus ( CMV ) is still a major pathogen in liver transplantation ( LTX ) . The clinical efficacy of prophylactic high-dose acyclovir therapy ( 800 mg qid ) was assessed for the prevention of CMV infection and disease in liver recipients . Fifty-five patients were enrolled in a prospective , randomised , double-blind and placebo-controlled trial ; 28 on acyclovir vs. 27 on placebo . The therapy was given for 12 weeks . The patients were followed for 24 weeks . CMV infection was diagnosed in 60 % ( 16 on acyclovir , 17 on placebo ) and CMV disease developed in 38 % ( 7 on acyclovir , 14 on placebo ) of the patients . The total mortality was 27 % ( 6 on acyclovir , 10 on placebo ) . Acyclovir delayed 32 % of the CMV infections and prevented 59 % of the CMV disease cases which occurred in the placebo cohort . The time to CMV disease was significantly prolonged in patients on acyclovir as compared to patients on placebo ( P=0.013 ) .
Adverse events included neurotoxicity which occurred in 5 cases in the acyclovir , but none in the placebo arm , and nephrotoxicity which was detected in 6 patients in the acyclovir and 5 in the placebo arm , respectively . We conclude that acyclovir prophylaxis significantly reduced the incidence of CMV disease , and delayed the onset of CMV infection in liver transplant patients Cytomegalovirus (CMV)-seronegative liver transplant recipients with CMV-seropositive donors have the greatest risk for CMV disease . We performed a randomized , controlled trial comparing sequential intravenous ( IV ) and oral ganciclovir with prolonged IV ganciclovir for long-term prophylaxis of CMV disease in these high-risk patients . Patients were initially given IV ganciclovir at a dose of 6 mg/kg per day from days 1 to 14 after transplantation . Patients then either received oral ganciclovir ( 1 g every 8 hr ) or continued IV ganciclovir ( 6 mg/kg once per day on Monday – Friday of each week ) from days 15 to 100 after transplantation . CMV disease occurred in 3 of 32 patients ( 9.3 % ) receiving oral ganciclovir and in 4 of 32 patients ( 12.5 % ) receiving IV ganciclovir within the first year after transplantation ( P > 0.2 ) . All cases of CMV disease occurred more than 90 days after transplantation ( median time of onset day + 137 for oral ganciclovir and day + 135 for IV ganciclovir ) . There were no deaths from CMV in either study group . Both oral and IV ganciclovir were generally well tolerated .
These results indicate that , after induction with 14 days of IV ganciclovir , oral ganciclovir can be as effective as IV ganciclovir for long-term prophylaxis of CMV disease in high-risk CMV-seronegative liver transplant recipients with CMV-seropositive donors and eliminates the need for prolonged IV access A randomized trial was performed to compare the sequential use of 2 weeks of intravenous ganciclovir ( 10 mg/[kg.d ] ) followed by 50 weeks of high-dose oral acyclovir ( 800 mg/m2 four times daily ) with 2 weeks of intravenous ganciclovir alone as prophylaxis for cytomegalovirus ( CMV ) and Epstein-Barr virus ( EBV ) disease after pediatric liver transplantation . CMV disease was diagnosed for seven of 24 patients treated with ganciclovir followed by high-dose oral acyclovir compared with two of 24 children treated with ganciclovir alone ( P = .048 ) . Similarly , the rate of CMV disease among high-risk patients ( CMV-positive donor/CMV-negative recipient ) treated with the combination regimen was higher than that among high-risk patients treated with ganciclovir alone ( four [ 57 % ] of seven vs. zero of five , respectively ; P < .05 ) . The rate of EBV disease among patients treated with the combination regimen ( eight [ 33 % ] of 24 ) was similar to that among patients treated with ganciclovir alone ( five [ 21 % ] of 24 ; P = not significant ) . We conclude that sequential prophylaxis with 2 weeks of intravenous ganciclovir followed by 50 weeks of high-dose oral acyclovir did not decrease the frequency of CMV or EBV disease after pediatric liver transplantation when compared with 2 weeks of intravenous ganciclovir alone BACKGROUND The aim of this study was to evaluate pp65 antigen-guided antiviral therapy in preventing human cytomegalovirus ( HCMV ) infection in solid organ transplant recipients . METHODS Ten kidney and two liver transplant recipients with asymptomatic HCMV infection were randomized either for i.v .
ganciclovir or placebo treatment in a prospective , double-blind study . All patients were positive by HCMV pp65 antigen test at levels > 5 positive cells/2 x 10(5 ) investigated cells . RESULTS No cases of HCMV end-organ disease occurred . In contrast to patients on placebo ( 5/7 ) , none of the patients on ganciclovir ( 0/5 ) developed HCMV-associated symptoms ( P=0.01 ) . However , because of the small number of patients , all three high-risk patients ( donor seropositive , recipient seronegative ) were randomized to placebo and all three developed symptoms . CONCLUSIONS Preemptive antiviral therapy guided by the pp65 antigen test seems to have a beneficial effect on preventing HCMV-associated symptoms in kidney and liver transplant recipients In a cohort of 43 liver transplant recipients who did not receive antiviral prophylaxis , qualitative and quantitative polymerase chain reactions ( PCRs ) from peripheral blood were prospectively compared to determine their value in the diagnosis of established cytomegalovirus ( CMV ) disease and for the early detection of CMV replication as a marker for preemptive antiviral therapy . Using a cutoff of 7000 copies of CMV DNA per sample , the specificity and positive predictive values of qualitative PCR for the diagnosis of established CMV disease increased from 33 % to 89 % and from 54 % to 82 % , respectively , without reducing the 100 % sensitivity and negative predictive value . By contrast , quantification of viral load provided no additional advantage to qualitative PCR for the early diagnosis of CMV infection before development of disease Our objective in this study was to determine the efficacy of 2 grams a day of oral acyclovir administered for 16 weeks after transplantation for the prevention of cytomegalovirus ( CMV ) infection and disease in CMV-seropositive liver transplant recipients .
Seventy-three adult liver transplant recipients , seropositive for CMV , were randomized to receive either 2 grams a day of oral acyclovir for 16 weeks after transplantation or no prophylaxis . The incidence of CMV disease was significantly lower in the acyclovir group ( 5 % ) than in the control group ( 27 % ; P < 0.05 ) . By log-rank analysis , the differences in the probability of presenting CMV disease over the first 16 weeks and over the 1st year were also significant ( P < 0.05 ) . We conclude that 2 grams a day of oral acyclovir provides effective prophylaxis against CMV disease in CMV-seropositive liver transplant recipients Cytomegalovirus ( CMV ) infection is the most frequent infectious complication observed in renal-transplant recipients and induces a significant morbidity in these patients due to CMV disease itself and to associated renal dysfunction or opportunistic superinfection . In order to evaluate the effect of ganciclovir prophylaxis we conducted an open-label prospective randomized study of ganciclovir administration in CMV seronegative recipients of a renal allograft from CMV seropositive donors . Ganciclovir ( 5 mg/kg b.i.d./day for 14 days ) was started on day 14 after transplantation . Thirty-two patients were included in this study ( 15 in the control group , 17 in the ganciclovir group ) . There was no significant difference between the two groups for age , immunosuppressive regimen , number of rejections , steroid pulses , and OKT3 treatments . Renal and patient outcomes were similar in both groups . The rate of CMV infection and CMV disease were similar in both groups ( 80 % and 73.3 % in the control group versus 70.6 % and 47.1 % in the ganciclovir group ; P = NS ) . Less severe CMV disease was observed in the ganciclovir group compared to controls .
The delay between transplantation and CMV infection was significantly longer in the ganciclovir group compared to the control group ( 68.1 +/- 5.1 versus 44.0 +/- 5.2 days , P < 0.005 ) . Twelve control group patients ( 80 % ) versus nine ( 53 % ) of the ganciclovir group required curative treatment with ganciclovir after the diagnosis of CMV infection ( NS ) . All the patients recovered from CMV disease and no significant side effect was observed during ganciclovir administration . ( ABSTRACT TRUNCATED AT 250 WORDS ) Cytomegalovirus ( CMV ) infections , either primoinfection or reactivation , remain an important problem in organ transplantation . We therefore designed a prospective study in which pre-transplant CMV-positive renal transplant ( RT ) patients were randomized to receive for 3 months starting immediately after transplantation either acyclovir or nothing . Between April 1992 and January 1993 , 53 cadaveric renal transplantations were performed in our institution . The immunosuppressive regimen included anti-thymoglobulins ( ATG ) , azathioprine , steroids and cyclosporine A. Patients randomized in the acyclovir arm received the drug from day 1 to day 90 ( D90 ) intravenously as long as the creatinine clearance was not above 10 ml/min and per os afterwards ( 3200 mg/day if the creatinine clearance was above 50 ml/min ) . CMV viraemia tests were systematically performed every 2 weeks until day 90 or when febrile episodes occurred . The patients were 53 adults who received a RT during the study period ; 37 were included in the study of which 19 received acyclovir prophylaxis ( group A ) and 18 , no prophylaxis ( group B ) . The two groups did not significantly differ according to sex ratio , recipient 's age , number of CMV-negative donors and number of days on ATG ( 10.76+/-6.16 vs. 8.28+/-4.21 days ) .
There were significantly fewer viraemia episodes in group A ( n = 6 ) than in group B ( n = 13 , P < 0.05 ) ; nevertheless , the percentage of symptomatic CMV viraemia was the same in both groups ( 35 % vs. 38.5 % ) . The onset of CMV viraemia occurred in the same period in both groups ( 39+/-13.8 days vs. 34.3+/-15 days ; P = NS ) . The number of rejection episodes in the study period was the same in both groups ( 8 in each ) . We conclude from this prospective study that post-RT acyclovir prophylaxis reduces significantly the number of CMV viraemia episodes but does not delay their onset . Furthermore , it has no effect upon the percentage of symptomatic viraemias In an attempt to modify the sequelae of cytomegalovirus ( CMV ) infections after lung transplantation , 25 allograft recipients were randomized to either ganciclovir 5 mg/kg once a day 5 d/wk ( Group G ) or acyclovir 800 mg four times a day ( Group A ) . All subjects received ganciclovir during postoperative Weeks 1 through 3 , and they were then given either A or G regimens until Day 90 . At termination of study enrollment , the cumulative incidence of all CMV infections ( including seroconversions ) was increased in Group A compared with that in Group G ( 75 % versus 15 % , p < 0.01 ) , as was the incidence of overt CMV shedding and/or pneumonitis ( 50 % versus 15 % , p < 0.043 ) . In comparison with those in Group G , subjects in Group A were also afflicted with an increased prevalence of obliterative bronchiolitis ( OB ) during the first year after transplantation ( 54 % versus 17 % , p < 0.033 ) . Intravenous catheters for ganciclovir administration resulted in four complications among three of the subjects in Group G ( 23 % ) . The short-term benefits of ganciclovir were ultimately limited , moreover , in that cumulative rates of CMV and prevalence of OB are now similar in both treatment groups after approximately 2 yr of observation .
We conclude that prolonged ganciclovir prophylaxis decreases the early incidence of CMV and OB among lung transplant recipients , but these effects are of finite duration . Although CMV prevention appears to have considerable potential value in this population , definitive viral prophylaxis will require development of protracted or repeated treatment regimens , or longer-acting agents Recent studies showed contradictory results concerning the efficacy of oral acyclovir in the prevention or amelioration of cytomegalovirus ( CMV ) disease after renal transplantation ( TX ) . This study evaluated the incidence and severity of CMV disease within the first year after TX in high-risk renal transplant recipients ( CMV-seropositive donor , seronegative recipient ) treated prophylactically with oral acyclovir ( 800 to 3200 mg/day ) over a period of 12 wk ( ACY , N = 22 ) , compared with high-risk patients randomly assigned as controls ( CO , N = 10 ) . Follow-up for CMV infection included serological determination of CMV-specific immunoglobulin G and immunoglobulin M antibodies , antigen detection in peripheral blood leukocytes ( PP 65 ) , shell vial culture ( blood ) , and virus isolation/early antigen detection ( urine ) . Severity of CMV disease was quantified by a scoring system for CMV-related symptoms . Nine patients ( 40.1 % ) in the acyclovir group and four patients ( 40 % ) in the control group developed CMV disease . Neither severity ( ACY , 11.4 versus CO , 12.5 points score ) , nor duration of disease ( ACY , 21 days ; CO , 22 days ) , nor transplant function at the end of the observation period differed significantly . The onset of CMV disease was not delayed significantly in acyclovir-treated patients compared with controls ( ACY , 47 +/- 34 days versus CO , 27 +/- 14 days after TX , not significant ) .
Our results show no beneficial effect of oral acyclovir prophylaxis in CMV high-risk renal transplant recipients Cytomegalovirus ( CMV ) is a major pathogen in liver transplant recipients . Although recent progress in the treatment of CMV disease has led to decreased mortality from this virus , a substantial number of patients still die of CMV-related complications , such as superinfection with bacterial and fungal agents in association with CMV infection [ 1 - 3 ] , CMV-associated atherosclerosis in heart transplant recipients [ 4 ] , bronchiolitis obliterans in lung transplant recipients [ 5 ] , and chronic rejection ( the vanishing bile duct syndrome ) in liver transplant recipients [ 6 ] . Cytomegalovirus disease also substantially increases the cost of transplantation because it prolongs hospitalization [ 7 ] . In 1989 , Balfour and colleagues [ 8 ] reported that high-dose oral acyclovir decreased the rate of CMV disease in kidney transplant recipients . Despite a small number of patients and an unusually high attack rate ( 100 % ) in the control patients , acyclovir-treated seronegative recipients of grafts from seropositive donors had the greatest protection from CMV disease . Based on this study , high-dose oral acyclovir is now also routinely used as prophylaxis for CMV in other organ transplant recipients , including liver recipients . Acyclovir , however , is inactive against CMV in vitro , its long-term administration is expensive , and CMV disease continued to occur at our institution despite such prophylaxis . Another potential problem with continuous , long-term administration of acyclovir is the possible emergence of CMV strains resistant to its closely related nucleoside analog , ganciclovir . Ganciclovir is several times more active against CMV in vitro than is acyclovir . 
Ganciclovir prophylaxis has been administered in various ways in organ transplant recipients [9], either as prophylaxis in all post-transplant patients, thereby unnecessarily exposing a large number of patients to the drug (most of whom do not develop CMV infection [10-12]), or as prolonged therapy (100 to 120 days), which causes hematologic toxicity, expense, and possibly a prolonged hospital stay [13, 14]. In this study, our approach was to identify or target the patients at highest risk for CMV disease. Viral excretion or shedding precedes CMV disease in transplant recipients [15]. Assuming that viral excretion is a predictor of CMV disease, we hypothesized that a short course (7 days) of ganciclovir instituted as preemptive antiviral therapy in patients shedding the virus would prevent progression of early asymptomatic CMV infection to more severe, invasive CMV disease. Thus, we did a randomized, controlled trial of standard high-dose acyclovir compared with short-course, pulse ganciclovir to be given only if CMV shedding was documented in asymptomatic patients using a CMV surveillance protocol. Methods. Study Design: All patients having liver transplantation at our institution were randomly assigned to one of the two prophylactic groups. Randomization was stratified by the CMV antibody status of the recipient and the donor. Patients in the control group received 800 mg of acyclovir orally, four times daily, beginning immediately after transplantation as described by Balfour and colleagues [8]; patients continued receiving acyclovir (Zovirax; Burroughs Wellcome, Research Triangle Park, North Carolina) for 24 weeks postoperatively.
The dosage of acyclovir was adjusted for impaired renal function as follows: if the creatinine clearance was greater than 50 mL/min, patients received 800 mg of acyclovir four times a day; if the creatinine clearance was 25 to 50 mL/min, they received 800 mg of acyclovir three times a day; if the clearance was 10 to 25 mL/min, they received 800 mg of acyclovir twice a day; and if the clearance was less than 10 mL/min, they received 800 mg of acyclovir once daily. Surveillance cultures for CMV (buffy coat and urine) were obtained at 2, 4, 6, 8, 12, 16, and 24 weeks postoperatively for all study patients using the shell vial culture method. The experimental group did not receive acyclovir, but intravenous ganciclovir (Cytovene; Syntex, Palo Alto, California), 5 mg/kg twice daily, was administered for 7 days only if surveillance cultures yielded CMV (Figure 1). Ganciclovir dosage was modified for abnormal creatinine clearance as follows: if the creatinine clearance was 80 mL/min or more, patients received 5 mg/kg of ganciclovir twice daily; if the creatinine clearance was 50 to 79 mL/min, they received 2.5 mg/kg of ganciclovir twice daily; if the clearance was 25 to 49 mL/min, they received 2.5 mg/kg of ganciclovir daily; if the clearance was less than 25 mL/min, they received 1.25 mg/kg of ganciclovir daily. The study was continued for 24 weeks postoperatively. Figure 1. Flow chart representing the study design. Immunosuppression: All patients received 0.10 mg/kg of tacrolimus (Fujisawa, Deerfield, Illinois) as a continuous drip for 24 hours, until they were able to take oral medications. The oral dosage of tacrolimus was 0.10 mg/kg every 12 hours. Subsequent dosage adjustments were made as indicated by clinical course and plasma levels of tacrolimus. Methylprednisolone, 1 g, was given immediately after revascularization of the graft.
Methylprednisolone, 20 mg, was given intravenously immediately after transplantation and daily thereafter until the oral route was established, at which time 20 mg of prednisone was administered daily. During the subsequent months, prednisone was slowly tapered. Rejection episodes were treated with boluses of 1 g of methylprednisolone with or without steroid recycles (prednisone decreasing daily by 40 mg from an initial starting dose of 200 mg). Muromonab-CD3 (Orthoclone OKT3, Ortho Pharmaceuticals, Raritan, New Jersey) was used for steroid-resistant rejection. Definition of Viral Infections. Cytomegalovirus Infection: Serologic test results for cytomegalovirus were determined using an enzyme immunosorbent assay, and titers of 0.79 or more were considered positive. All patients received blood products that were neither tested nor screened for CMV antibody. Primary infection was defined as isolation of virus or seroconversion in a patient who was seronegative before transplantation. Reactivation infection was diagnosed by isolation of virus in a seropositive recipient. Cytomegalovirus Disease: Clinical diseases caused by CMV included the viral syndrome, localized CMV disease, and disseminated CMV disease. Identification of the viral syndrome caused by CMV required the following: 1) positive culture for CMV; 2) temperature of 38 °C or more with no other source to account for it; and 3) one of the following findings: leukocyte count 4000/mm3 or less, atypical lymphocytes 3% or more, and platelets 100,000/mm3 or less. Localized CMV disease was defined as tissue invasion of a single organ determined histopathologically, with or without culture of the virus from tissue. Disseminated disease was defined as tissue involvement of two or more noncontiguous organ sites.
Identification of Other Viruses: Antibodies against viral capsid antigen, early antigen, and Epstein-Barr virus (EBV) nuclear antigen were determined preoperatively in all patients. Patients were defined as having a symptomatic EBV infection if they had EBV-associated lymphoproliferative disease identified by the presence of EBV DNA in tissue using nucleic acid hybridization. Antibody titers to detect asymptomatic increases in EBV were not routinely determined. Herpes simplex virus infection was defined as the presence of typical symptomatic oral or genital ulcers. Varicella zoster virus infection was determined clinically by the presence of typical dermatomal lesions, with or without viral isolation. Statistical Analysis: Analysis was done using the Prophet System (BBN Systems and Technologies, Division of Research Resources, National Institutes of Health, Bethesda, Maryland). Baseline characteristics (age, Child-Pugh score) were compared using the Fisher exact or t-test. We estimated that at least 20 patients in each group would be needed to detect a decrease in CMV disease from 35% with standard acyclovir prophylaxis to 5% with ganciclovir (α = 0.05, power = 0.8). The incidence of infection or disease was compared using a two-tailed Fisher exact test. A Kaplan-Meier estimate was used to examine the number of days from transplant until first CMV infection for each group. The two curves were compared using the Mantel-Cox log-rank test. Similar curves were constructed for CMV disease. Results: The study sample consisted of 47 consecutive adult male patients who received liver transplants at the Pittsburgh Veterans Affairs Medical Center during a 2-year period and who survived at least 72 hours postoperatively. These included 44 patients who had primary transplants and 3 who had re-transplants (2 transplanted once previously and another twice previously).
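The sample-size statement above (detecting a drop in CMV disease from 35% to 5% at α = 0.05 and power 0.8) can be sanity-checked with a standard normal-approximation calculation using Cohen's arcsine-transformed effect size for two proportions. This is an illustrative sketch only, not the authors' method, which is unstated and evidently more conservative (they required at least 20 per group; the approximation below gives fewer):

```python
import math

def n_per_group(p1, p2):
    """Approximate per-group sample size for comparing two proportions,
    using Cohen's arcsine effect size h and a two-sided normal
    approximation at alpha = 0.05 and power = 0.8 (sketch only)."""
    h = abs(2 * math.asin(math.sqrt(p1)) - 2 * math.asin(math.sqrt(p2)))
    z_alpha = 1.959964  # two-sided 5% critical value
    z_beta = 0.841621   # 80% power
    return (z_alpha + z_beta) ** 2 / h ** 2

n = n_per_group(0.35, 0.05)
print(math.ceil(n))  # about 12 per group by this approximation
```

The gap between this approximation and the stated 20 per group is unsurprising: exact tests and continuity corrections for small samples demand larger groups than the plain normal approximation suggests.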
Of 47 patients enrolled in the study, 24 were randomly assigned to the acyclovir group and 23 to the experimental group. The two patient groups were similar at entry in terms of all baseline characteristics measured (Table 1). The Child-Pugh scoring system was used to assess the severity of liver disease in the two groups before transplantation [16]. Table 1. Characteristics of the Study Group at the Time of Enrollment. Cytomegalovirus Infection and Disease: Shedding of CMV before the onset of CMV disease occurred in 25% (6 of 24) of patients receiving acyclovir prophylaxis and in 22% (5 of 23) of patients receiving no prophylaxis (experimental group) (Figure 2). Seventeen percent (4 of 24) of the patients in the acyclovir group and 4% (1 of 23) in the experimental group did not have previous shedding and developed CMV disease as the first manifestation of CMV infection. Thus, 42% (10 of 24) in the acyclovir group and 26% (6 of 23) in the experimental group had CMV infection (16% difference; 95% CI, -10% to 42%; P > 0.2). All CMV infections were diagnosed by viral isolation. One of 6 infections in the experimental group

A prospective study to investigate risk factors for CMV disease was conducted in 94 renal transplant recipients. CMV disease was defined as either unexplained fever for > 3 days with viremia, or unexplained fever for > 3 days with isolation of CMV from the urine or throat wash and at least one of the following: leukopenia, elevated serum alanine aminotransferase, or biopsy-proved invasive tissue infection of the lung or gastrointestinal tract. Fifty-three patients received immunosuppressive regimens consisting of prednisone and cyclosporine, with or without azathioprine. The remaining 41 patients were treated with these agents plus OKT3 (21 received OKT3 to treat rejection, 20 received OKT3 prophylactically).
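The "P > 0.2" comparison of CMV infection rates above (10 of 24 acyclovir vs 6 of 23 experimental) is a two-tailed Fisher exact test on a 2x2 table, which can be reproduced from the published counts with a short hypergeometric computation. A minimal standard-library sketch:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]:
    sum the probabilities of all tables with the same margins whose
    probability does not exceed that of the observed table."""
    r1, r2, c1 = a + b, c + d, a + c
    n = r1 + r2

    def p_table(x):  # hypergeometric probability of x events in row 1
        return comb(r1, x) * comb(r2, c1 - x) / comb(n, c1)

    p_obs = p_table(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)
    # small tolerance guards against floating-point ties
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# CMV infection: acyclovir 10 of 24 vs experimental 6 of 23
p = fisher_exact_two_sided(10, 14, 6, 17)
print(round(p, 3))  # consistent with the reported P > 0.2
```

With counts this small and a between-group difference of only 16 percentage points, the test is far from significance, matching the abstract's conclusion.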
Thirty-seven patients were at minimal risk of CMV disease (donor and recipient seronegative for CMV); 12 patients were at risk of primary disease (donor seropositive, recipient seronegative), and 45 were at risk of reactivation disease (recipient seropositive at the time of transplantation). The incidences of CMV disease in the 3 groups were 0%, 58%, and 36%, respectively. Although the incidence of CMV disease in patients at risk of primary disease was not influenced by the immunosuppressive regimen, immunosuppression had a profound effect on the occurrence of CMV disease in CMV-seropositive transplant recipients. The incidence of CMV disease in those receiving OKT3 was 59%, but only 21% in those who did not receive OKT3. OKT3 increased the risk of CMV disease five-fold (odds ratio 5.2; 95% confidence limits 1.4-17.5). In the CMV-seropositive patient, OKT3 was also the most important predictor of CMV disease by multivariate analysis (P < 0.002). A pilot study of preemptive therapy with ganciclovir (2.5 mg/kg daily during OKT3 therapy) in 17 patients decreased the incidence of CMV disease without appreciable toxicity.

The potential of prophylactic ganciclovir for the control of cytomegalovirus (CMV) infection was evaluated in a prospective controlled trial of 65 patients after liver transplantation. A group of 33 patients received ganciclovir (10 mg/kg/day) during the third and fourth weeks after transplantation, while 32 patients were randomised to receive ganciclovir (10 mg/kg/day) only when clinical CMV disease was diagnosed. Eight patients (25%) in the latter group received ganciclovir, and this group included the only death attributed to CMV infection in the study. Prophylactic ganciclovir was associated with a lower incidence of serologically diagnosed secondary infection and of development of the IgM anti-CMV antibody. However, the frequency of clinical infections was similar in the two groups (9/33 vs. 11/32).
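The five-fold OKT3 odds ratio quoted above follows directly from the two incidence figures; a minimal check is below (the small discrepancy with the published 5.2 reflects rounding of the reported percentages):

```python
def odds_ratio(p_exposed, p_unexposed):
    """Odds ratio from two event proportions:
    (p1 / (1 - p1)) / (p0 / (1 - p0))."""
    return (p_exposed / (1 - p_exposed)) / (p_unexposed / (1 - p_unexposed))

# CMV disease incidence: 59% with OKT3 vs 21% without
or_okt3 = odds_ratio(0.59, 0.21)
print(round(or_okt3, 1))  # ≈ 5.4, in line with the reported OR of 5.2
```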
Liver function tests during the second month after transplantation were not significantly better in the patients receiving prophylactic ganciclovir. Leucopaenia was not seen in patients receiving prophylactic ganciclovir.

In earlier work we demonstrated that CMV immediate early antigens can be detected in peripheral blood leukocytes of patients with active CMV infection. We now report a comparison of the antigenemia assay and an anti-CMV ELISA in a prospective longitudinal study of 130 renal transplant recipients who were monitored for active CMV infection during the first 3 months after transplantation. Active CMV infection developed in 56 patients. The antigenemia assay had a sensitivity of 89% and a specificity of 93% in the diagnosis of active CMV infection; for the ELISA these figures were 95% and 100%, respectively. In 22 of the 56 patients a CMV syndrome occurred. Antigenemia was demonstrated in all 22 patients, while an antibody response occurred in 21 of them. The antigenemia assay became positive 8 +/- 7 days before the onset of symptoms, while the antibody response was observed 4 +/- 9 days after the onset of symptoms. The pattern of antigenemia was helpful for monitoring the course of the infection. The maximum level of antigenemia was significantly higher, and its duration significantly longer, in symptomatic than in asymptomatic infection. We conclude that CMV antigenemia is a sensitive, specific, and early marker of CMV infection. The antigenemia assay is of great value in monitoring patients with a high risk of CMV infection.

BACKGROUND: Cytomegalovirus (CMV) disease is a frequent cause of serious morbidity after solid-organ transplantation. The prophylactic regimens used to prevent CMV infection and disease have shown limited benefit in seronegative recipients. We studied the safety and efficacy of oral ganciclovir in the prevention of CMV disease following orthotopic liver transplantation.
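The antigenemia assay's reported 89% sensitivity and 93% specificity correspond to a 2x2 table of assay result against true infection status. The counts below are reconstructed to be consistent with the reported figures (56 infected of 130 patients, hence 74 uninfected); they are illustrative, not taken verbatim from the paper:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Reconstructed counts: ~50/56 infected patients detected,
# ~69/74 uninfected patients correctly negative
sens, spec = sensitivity_specificity(tp=50, fn=6, tn=69, fp=5)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")  # 89%, 93%
```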
METHODS: Between December 1993 and April 1995, 304 liver-transplant recipients were randomised to receive oral ganciclovir 1000 mg or matching placebo three times a day. Seronegative recipients of seronegative livers were excluded. Study drug was administered as soon as the patient was able to take medication by mouth (no later than day 10) until the 98th day after transplantation. Patients were assessed at specified times throughout the first 6 months after surgery for evidence of CMV infection, CMV disease, rejection, opportunistic infections, and possible drug toxicity. FINDINGS: The Kaplan-Meier estimate of the 6-month incidence of CMV disease was 29 (18.9%) of 154 in the placebo group, compared with seven (4.8%) of 150 in the ganciclovir group (p < 0.001). In the high-risk group of seronegative recipients (R-) of seropositive livers (D+), the incidence of CMV disease was 11 (44.0%) of 25 in the placebo group and three (14.8%) of 21 in the ganciclovir group (p = 0.02). Significant benefit was also observed in those receiving antibodies to lymphocytes, where the incidence of CMV disease was 12 (32.9%) of 37 in the placebo group and two (4.6%) of 44 in the ganciclovir group (p = 0.002). Oral ganciclovir reduced the incidence of CMV infection (placebo 79 [51.5%] of 154; ganciclovir 37 [24.5%] of 150; p < 0.001) and also reduced symptomatic herpes-simplex infections (Kaplan-Meier estimates: placebo 36 [23.5%] of 154; ganciclovir five [3.5%] of 150; p < 0.001). INTERPRETATION: Oral ganciclovir is a safe and effective method for the prevention of CMV disease after orthotopic liver transplantation.
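The headline result above (CMV disease in 29/154 placebo vs 7/150 ganciclovir patients, p < 0.001) can be sanity-checked with a pooled two-proportion z-test on the crude counts. The paper itself reported Kaplan-Meier estimates, so this is only an approximation:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns (z, two-sided p),
    with the normal CDF computed from math.erf."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_two_sided = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_two_sided

z, p = two_proportion_z(29, 154, 7, 150)
print(round(z, 2), p < 0.001)  # z ≈ 3.8, confirming p < 0.001
```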
2,046
24,336,429
This association was robust against both publication bias and the generally low quality of the literature. The magnitude of the association with PTSD was significantly larger than that with sexual or physical abuse. The association of reported trauma with chronic fatigue syndrome was larger than the association with either irritable bowel syndrome or fibromyalgia. Studies using nonvalidated questionnaires or self-report of trauma reported larger associations than did those using validated questionnaires. Findings are consistent with the hypothesis that traumatic events are associated with an increased prevalence of functional somatic syndromes.
OBJECTIVE: This meta-analysis systematically examined the association of reported psychological trauma and posttraumatic stress disorder (PTSD) with functional somatic syndromes including fibromyalgia, chronic widespread pain, chronic fatigue syndrome, temporomandibular disorder, and irritable bowel syndrome. Our goals were to determine the overall effect size of the association and to examine moderators of the relationship.
OBJECTIVES: This study examined the extent to which being abused and/or neglected in childhood increases a person's risk for promiscuity, prostitution, and teenage pregnancy. METHODS: A prospective cohort design was used to match, on the basis of age, race, sex, and social class, cases of abused and/or neglected children from 1967 to 1971 with nonabused and nonneglected children; subjects were followed into young adulthood. From 1989 to 1995, 1196 subjects (676 abused and/or neglected and 520 control subjects) were located and interviewed. RESULTS: Early childhood abuse and/or neglect was a significant predictor of prostitution for females (odds ratio [OR] = 2.96). For females, sexual abuse (OR = 2.54) and neglect (OR = 2.58) were associated with prostitution, whereas physical abuse was only marginally associated. Childhood abuse and neglect were not associated with increased risk for promiscuity or teenage pregnancy. CONCLUSIONS: These findings strongly support a relationship between childhood victimization and subsequent prostitution. The presumed causal sequence between childhood victimization and teenage pregnancy may need to be reevaluated.

The relationship between sexual and physical abuse history and negative health effects has been well documented in medical facility samples. Few studies have examined the role of abuse history and its relationship with chronic fatigue and psychiatric disorders in a diverse, randomly selected community-based sample. The present study compared rates of different types of abuse events in individuals with chronic fatigue and non-symptomatic controls. Relationships between specific types of abuse and psychiatric disorders commonly associated with chronic fatigue were also explored. A stratified random sample of 18,675 adults residing in ethnically and socioeconomically diverse neighborhoods in Chicago first completed a telephone screening questionnaire.
A control group and a group of individuals with chronic fatigue symptomatology were identified and administered a semi-structured psychiatric interview assessing DSM-IV Axis I psychiatric disorders and a sexual and physical abuse history questionnaire. Controlling for sociodemographic differences, fatigue outcome was significantly predicted by childhood sexual abuse and the total number of different childhood abuse events. Within the chronic fatigue group, diagnosis of posttraumatic stress disorder (PTSD) was significantly predicted by childhood sexual abuse, childhood death threat, the total number of childhood abuse events, and lifetime abuse events. Sexual abuse during adolescence or adulthood significantly predicted other anxiety disorders among individuals with chronic fatigue. These findings suggest that a history of abuse, particularly during childhood, may play a role in the development and perpetuation of a wide range of disorders involving chronic fatigue. Among individuals with chronic fatigue, PTSD and other anxiety disorders appear to demonstrate the strongest association with abuse history. The implications of these findings are discussed.

Using a randomly selected community-based sample, this investigation examined whether histories of childhood sexual, physical, and death threat abuse predicted adulthood outcomes of specific medical and psychiatric conditions involving chronic fatigue. This study also tested prior suggestions that most individuals with chronic fatigue syndrome report a past history of interpersonal abuse. Multinomial logistic regression was used to examine the relationship between abuse history and chronic fatigue group outcomes while controlling for the effects of sociodemographics.
Compared with healthy controls, childhood sexual abuse was significantly more likely to be associated with outcomes of idiopathic chronic fatigue, chronic fatigue explained by a psychiatric condition, and chronic fatigue explained by a medical condition. None of the abuse history types were significant predictors of chronic fatigue syndrome. A closer examination of individuals in the chronic fatigue syndrome group revealed that significantly fewer individuals with CFS reported abuse as compared with those who did not. The implications of these findings are discussed.

Patterns of physical comorbidity among women with posttraumatic stress disorder (PTSD) were explored using Michigan Medicaid claims data. PTSD-diagnosed women (n = 2,133) were compared with 14,948 randomly selected women in three health outcome areas: ICD-9 categories of disease, chronic conditions associated with sexual assault history in previous research, and reproductive health conditions. PTSD was associated with increased risk of all categories of diseases (OR range = 1.3-4.8), endometriosis (OR = 2.7), and dyspareunia (OR = 3.4). When PTSD was not complicated by other mental health conditions, odds ratios for chronic conditions ranged from 1.9 for fibromyalgia to 4.3 for irritable bowel. Comorbidity with depression or a dissociative or borderline personality disorder raised risk in a dose-response pattern.

OBJECTIVE: Clinic-based studies suggest that adverse events in childhood may predispose to chronic pain in adult life. These have been conducted on highly selected groups, and it is unknown whether these relationships hold in the general population and to what extent the increased rate of adverse childhood events in persons with pain is an artefact of differential reporting.
We examined the hypothesis that chronic widespread pain was associated with reports of adverse experiences in childhood, and whether any observed relationships could be explained by differential recall. METHODS: A cross-sectional population-based screening survey was conducted. Subjects completed a questionnaire that included assessments of pain and psychological state. In total, 296 subjects who had demonstrated psychological distress were randomly selected and had a detailed interview, which included an assessment of 14 adverse childhood experiences. Medical records relating to childhood were also examined for those subjects. RESULTS: The prevalence of self-reported adverse childhood experiences was greatest in adult subjects with current chronic widespread pain. Exposure to illness in family members, parental loss, operations, and abuse were all associated with increased, but nonsignificant, odds of having chronic widespread pain versus those without such exposures. However, the only statistically significant association was with childhood hospitalizations. From medical record information, the associations of hospitalizations (OR 5.1, 95% CI 2.0-13.0) and operations (OR 3.0, 95% CI 1.2-7.2) with pain previously noted were partly explained by differential recall between subjects with and without pain: hospitalizations, OR 2.2, 95% CI 0.9-5.5; operations, OR 1.2, 95% CI 0.5-3.4. CONCLUSION: Although several reported adverse events in childhood were observed to be associated with chronic widespread pain in adulthood, only reports of hospitalizations were significantly associated. Validation of self-reported exposures suggests that there was differential recall of past events among those with and without pain, and this differential recall explained the association between hospitalizations and current chronic pain.
Such differential recall may explain other observations of an association between reports of adverse childhood events and chronic pain in adulthood.

To profile differences in current physical symptoms and medical conditions among women users of Veterans Administration (VA) health services with and without a self-reported history of sexual assault sustained during military service, we conducted a cross-sectional analysis of a nationally representative, random sample of women veterans using VA outpatient services (n = 3632). A self-administered, mailed survey asked whether women had sustained sexual assault while in the military and requested information about a spectrum of physical symptoms and medical conditions. A history of sexual assault while in the military was reported by 23% of women VA users and was associated with current physical symptoms and medical conditions in every domain assessed. For example, women who reported sexual assault were more likely to indicate that they had a "heart attack" within the past year, even after adjusting for age, hypertension, diabetes, and smoking history (OR 2.3, 95% CI 1.3-4.0). Among women reporting a history of sexual assault while in the military, 26% endorsed 12 or more of 24 symptoms/conditions, compared with 11% of women with no reported sexual assault while in the military (p < 0.001). Clinicians need to be attuned to the high frequency of sexual assault occurring while in the military reported by women VA users and its associated array of current physical symptoms and medical conditions. Clinicians should consider screening both younger and older patients for a sexual violence history, especially patients with multiple physical symptoms.

BACKGROUND: Little is known about the aetiology of chronic fatigue syndrome/myalgic encephalomyelitis (CFS/ME); prospective studies suggest a role for premorbid mood disorder.
AIMS: To examine childhood and early adult adversity, ill health, and physical activity as premorbid risk markers for CFS/ME by 42 years, taking psychopathology into account. METHOD: Data were from the 1958 British birth cohort, a prospective study from birth to 42 years (n = 11,419). The outcomes were self-reported CFS/ME (n = 127) and operationally defined CFS-like illness (n = 241) at 42 years. RESULTS: Adjusting for psychopathology, parental physical abuse (odds ratio (OR) = 2.10, 95% CI 1.16-3.81), childhood gastrointestinal symptoms (OR = 1.58, 95% CI 1.00-2.50), and parental reports of many colds (OR = 1.65, 95% CI 1.09-2.50) were independently associated with self-reported CFS/ME. Female gender and premorbid psychopathology were the only risk markers for CFS-like illness, independent of comorbid psychopathology. CONCLUSIONS: This confirms the importance of premorbid psychopathology in the aetiological pathways of CFS/ME, and replicates retrospective findings that childhood adversity may play a role in a minority.

Prior studies of careseeking fibromyalgia (FM) patients often report that they have an elevated risk of psychiatric disorders, but biased sampling may distort true risk. The current investigation utilizes state-of-the-art diagnostic procedures for both FM and psychiatric disorders to estimate prevalence rates of FM and the comorbidity of FM and specific psychiatric disorders in a diverse community sample of women. Participants were screened by telephone for FM and MDD, by randomly selecting telephone numbers from a list of households with women in the NY/NJ metropolitan area. Eligible women were invited to complete physical examinations for FM and clinician-administered psychiatric interviews. Data were weighted to adjust for sampling procedures and population demographics.
The estimated overall prevalence of FM among women in the NY/NJ metropolitan area was 3.7% (95% CI = 3.2, 4.4), with higher rates among racial minorities. Although risk of current MDD was nearly 3-fold higher in community women with than without FM, the groups had similar risk of lifetime MDD. Risk of lifetime anxiety disorders, particularly obsessive compulsive disorder and post-traumatic stress disorder, was approximately 5-fold higher among women with FM. Overall, this study found a community prevalence for FM among women that replicates prior North American studies, and revealed that FM may be even more prevalent among racial minority women. These community-based data also indicate that the relationship between MDD and FM may be more complicated than previously thought, and call for an increased focus on anxiety disorders in FM.

Previous studies of the association between posttraumatic stress disorder (PTSD) and chronic widespread pain (CWP) or fibromyalgia have not examined the role of familial or genetic factors. The goals of this study were to determine if symptoms of PTSD are related to CWP in a genetically informative community-based sample of twin pairs, and if so, to ascertain if the association is due to familial or genetic factors. Data were obtained from the University of Washington Twin Registry, which contains 1042 monozygotic and 828 dizygotic twin pairs. To assess the symptoms of PTSD, we used questions from the Impact of Events Scale (IES). IES scores were partitioned into terciles. CWP was defined as pain located in 3 body regions lasting at least 1 week during the past 3 months. Random-effects regression models, adjusted for demographic features and depression, examined the relationship between IES and CWP. IES scores were strongly associated with CWP (P < 0.0001). Compared to those in the lowest IES tercile, twins in the highest tercile were 3.5 times more likely to report CWP.
Although IES scores were associated with CWP more strongly among dizygotic than among monozygotic twins, this difference was not significant. Our findings suggest that PTSD symptoms, as measured by IES, are strongly linked to CWP, but this association is not explained by a common familial or genetic vulnerability to both conditions. Future research is needed to understand the temporal association of PTSD and CWP, as well as the physiological underpinnings of this relationship.

Background: In outpatients and the community, an association between abuse (particularly sexual abuse) and irritable bowel syndrome (IBS) has been observed, but whether there is a causal link continues to be disputed. Aims: To test the hypothesis that psychological factors explain the apparent association between abuse and IBS. Methods: A sample of residents of Penrith (a Sydney suburb sociodemographically similar to the Australian population), selected randomly from the electoral rolls (that by law include the entire population of age 18 years and above), was mailed a validated self-report questionnaire. Measured were gastrointestinal (GI) symptoms including the Rome criteria for IBS, abuse (including the standardised Drossman questions), neuroticism (Eysenck Personality Questionnaire), and psychological morbidity (General Health Questionnaire). Results: The response rate was 64% (n = 730); 12% fulfilled the Rome criteria for IBS. Overall, abuse in childhood (odds ratio (OR) = 2.02, 95% confidence interval (CI) 1.29 to 3.15) but not adulthood (OR = 1.39, 95% CI 0.88 to 2.19) was associated with IBS univariately. Neuroticism and psychological morbidity were also univariately associated with abuse in childhood, abuse in adulthood, and IBS, respectively. However, by logistic regression, abuse in childhood was not associated with IBS after controlling for age, gender, and psychological factors (OR = 1.34, 95% CI 0.83 to 2.17).
The results were not altered by restricting the analyses to more severe forms of abuse, and were not explained by interactions between abuse and psychological variables. Conclusion: There is an association between abuse and IBS in the community, but this may be explained in part by other psychological factors. Based on a path analysis, we postulate that abuse may induce the expression of neuroticism that in turn leads to IBS.

Objectives: According to the trauma hypothesis, women with fibromyalgia syndrome (FMS) are more likely to report a history of sexual and/or physical abuse than women without FMS. In this study, we rely on a community sample to test this hypothesis and the related prediction that women with FMS are more likely to have posttraumatic stress disorder than women without FMS. Methods: Eligibility for the present study was limited to an existing community sample in which FMS and major depressive disorder were prevalent. The unique composition of the original sample allowed us to recruit women with and without FMS from the community. A total of 52 female participants were enrolled in the present FMS group and 53 in the control (no FMS) group. Sexual and physical abuse were assessed retrospectively using a standardized telephone interview. Results: Except for rape, sexual and physical abuse were reported equally often by women in the FMS and control groups. Women who reported rape were 3.1 times more likely to have FMS than women who did not report rape (P < 0.05). There was no evidence of increased childhood abuse in the FMS group. Women with FMS were more likely to have posttraumatic stress disorder symptoms (intrusive thoughts and arousal) as well as a posttraumatic stress disorder diagnosis (P < 0.01). Discussion: With the exception of rape, no self-reported sexual or physical abuse event was associated with FMS in this community sample.
In accord with the trauma hypothesis , however , posttraumatic stress disorder was more prevalent in the FMS group . Chronic stress in the form of posttraumatic stress disorder but not major depressive disorder may mediate the relationship between rape and FMS Objectives : Recent academic debate has centered on whether functional somatic syndromes should be defined as separate entities or as one syndrome . The aim of this study was to investigate whether there may be significant differences in the etiology or precipitating factors associated with two common functional syndromes , irritable bowel syndrome ( IBS ) and chronic fatigue syndrome ( CFS ) . Methods : We prospectively studied 592 patients with an acute episode of Campylobacter gastroenteritis and 243 with an acute episode of infectious mononucleosis who had no previous history of CFS or IBS . At the time of infection , patients completed a baseline questionnaire that measured their levels of distress using the Hospital Anxiety and Depression scale . At 3- and 6-month follow-up , they completed questionnaires to determine whether they met published diagnostic criteria for chronic fatigue ( CF ) , CFS , and/or IBS . Results : The odds of developing IBS were significantly greater post-Campylobacter than post-infectious mononucleosis at both 3- ( odds ratio , 3.45 [ 95 % confidence interval ( CI ) , 1.75–6.67 ] ) and 6- ( 2.22 [ 95 % CI , 1.11–6.67 ] ) month follow-up . In contrast , the odds for developing CF/CFS were significantly greater after infectious mononucleosis than after Campylobacter at 3 ( 2.77 [ 95 % CI , 1.08–7.11 ] ) but not 6 ( 1.48 [ 95 % CI , 0.62–3.55 ] ) months postinfection . Anxiety and depression were the strongest predictors of CF/CFS , whereas the nature of the infection was the strongest predictor of IBS . Conclusions : These results support the argument to distinguish between postinfectious IBS and CFS . 
The nature of the precipitating infection appears to be important , and premorbid levels of distress appear to be more strongly associated with CFS than IBS , particularly levels of depression . CDC = Centers for Disease Control and Prevention ; CFS = chronic fatigue syndrome ; CF = chronic fatigue ; CI = confidence interval ; IBS = irritable bowel syndrome ; IM = infectious mononucleosis ; HADS = Hospital Anxiety and Depression Scale OBJECTIVE To assess the prevalence of self-reported symptoms and illnesses among military personnel deployed during the Persian Gulf War ( PGW ) and to compare the prevalence of these conditions with the prevalence among military personnel on active duty at the same time , but not deployed to the Persian Gulf ( non-PGW ) . DESIGN Cross-sectional telephone interview survey of PGW and non-PGW military personnel . The study instrument consisted of validated questions , validated questionnaires , and investigator-derived questions designed to assess relevant medical and psychiatric conditions . SETTING Population-based sample of military personnel from Iowa . STUDY PARTICIPANTS A total of 4886 study subjects were randomly selected from 1 of 4 study domains ( PGW regular military , PGW National Guard/Reserve , non-PGW regular military , and non-PGW National Guard/Reserve ) , stratifying for age , sex , race , rank , and branch of military service . MAIN OUTCOME MEASURES Self-reported symptoms and symptoms of medical illnesses and psychiatric conditions . RESULTS Overall , 3695 eligible study subjects ( 76 % ) and 91 % of the located subjects completed the telephone interview . 
Compared with non-PGW military personnel , PGW military personnel reported a significantly higher prevalence of symptoms of depression ( 17.0 % vs 10.9 % ; Cochran-Mantel-Haenszel test statistic , P<.001 ) , posttraumatic stress disorder ( PTSD ) ( 1.9 % vs 0.8 % , P=.007 ) , chronic fatigue ( 1.3 % vs 0.3 % , P<.001 ) , cognitive dysfunction ( 18.7 % vs 7.6 % , P<.001 ) , bronchitis ( 3.7 % vs 2.7 % , P<.001 ) , asthma ( 7.2 % vs 4.1 % , P=.004 ) , fibromyalgia ( 19.2 % vs 9.6 % , P<.001 ) , alcohol abuse ( 17.4 % vs 12.6 % , P=.02 ) , anxiety ( 4.0 % vs 1.8 % , P<.001 ) , and sexual discomfort ( respondent , 1.5 % vs 1.1 % , P=.009 ; respondent 's female partner , 5.1 % vs 2.4 % , P<.001 ) . Assessment of health-related quality of life demonstrated diminished mental and physical functioning scores for PGW military personnel . In almost all cases , larger differences between PGW and non-PGW military personnel were observed in the National Guard/Reserve comparison . Within the PGW military study population , compared with veterans in the regular military , veterans in the National Guard/Reserve only reported more symptoms of chronic fatigue ( 2.9 % vs 1.0 % , P=.03 ) and alcohol abuse ( 19.4 % vs 17.0 % , P=.004 ) . CONCLUSIONS Military personnel who participated in the PGW have a higher self-reported prevalence of medical and psychiatric conditions than contemporary military personnel who were not deployed to the Persian Gulf . These findings establish the need to further investigate the potential etiologic , clinical , pathogenic , and public health implications of the increased prevalence of multiple medical and psychiatric conditions in populations of military personnel deployed to the Persian Gulf OBJECTIVE Acute stress disorder permits an early identification of trauma survivors who are at risk of developing chronic posttraumatic stress disorder ( PTSD ) . This study aimed to prevent PTSD by an early provision of cognitive behavior therapy . 
Specifically , this study indexed the relative efficacy of prolonged exposure and anxiety management in the treatment of acute stress disorder . METHOD Forty-five civilian trauma survivors with acute stress disorder were given five sessions of 1 ) prolonged exposure ( N = 14 ) , 2 ) a combination of prolonged exposure and anxiety management ( N = 15 ) , or 3 ) supportive counseling ( N = 16 ) within 2 weeks of their trauma . Forty-one trauma survivors were assessed at the 6-month follow-up . RESULTS Fewer patients with prolonged exposure ( 14 % , N = 2 of 14 ) and prolonged exposure plus anxiety management ( 20 % , N = 3 of 15 ) than supportive counseling ( 56 % , N = 9 of 16 ) met the criteria for PTSD after treatment . There were also fewer cases of PTSD in the prolonged exposure group ( 15 % , N = 2 of 13 ) and the prolonged exposure plus anxiety management group ( 23 % , N = 3 of 13 ) than in the supportive counseling group ( 67 % , N = 10 of 15 ) 6 months after the trauma . Chronic PTSD in the supportive counseling condition was characterized by greater avoidance behaviors than in the prolonged exposure condition or the prolonged exposure plus anxiety management condition . CONCLUSIONS These findings suggest that PTSD can be effectively prevented with an early provision of cognitive behavior therapy and that prolonged exposure may be the most critical component in the treatment of acute stress disorder The hypothesis that childhood abuse is critical in the development and course of some types of pain syndromes in adulthood has generated a substantial body of research , most commonly based on retrospective interviews with patients in specialty pain practices . The heuristic value of this body of research should not be discounted , as these studies have the potential to make a case for the importance of psychosocial factors in chronic pain . 
On the other hand , proponents of the relationship seem to disregard the fact that even these potentially biased studies do not consistently find a statistically significant relationship , unless the sample size is very large . Less biased data from prospective studies has not yet confirmed a relationship . Moreover , the clinical utility of self-reported abuse is unclear , as there is contradictory evidence about whether such reports predict greater pain severity or pain-related impairment within a chronic pain patient sample . Given this type of evidence , it is worthwhile to reconsider whether pain should be considered the " next frontier " in child maltreatment research . The 4 papers in this Special Topics Series add to the literature on childhood abuse and pain , by further challenging the prevailing dogma , in finding only partial or limited support for the relationship . The papers critically consider these questions : Does a relationship exist ? How strong is it ? What is its clinical significance ? What factors might alter detection of this relationship ? The earliest literature on childhood abuse and pain appeared nearly a half-century ago , with the publication of George Engel 's paper on " Psychogenic Pain and the Pain-Prone Patient . " Engel introduced the view that pain could develop in the absence of a peripheral stimulus , as a result of negative experiences in childhood . He particularly suggested the pathogenic role of a physically or verbally abusive parent . In expansions of the theory by others , childhood sexual abuse was proposed as another potent factor leading to " psychogenic pain " in adulthood . Even if one rejects the mind-body dualism inherent in the concept of psychogenic pain , the more broadly accepted biopsychosocial model of chronic pain disorders allows for the possibility that early childhood adversity is an important psychosocial influence on pain pathogenesis . 
Despite some strong theoretical justification for expecting a robust relationship , taken as a whole , the articles in this Special Topics Series indicate that the relationship between childhood abuse and adult pain complaints is unlikely to be dramatic . Central to the debate about the role of childhood abuse in adult pain is the question of whether and how strongly self-reports of childhood abuse may be related to pain in adulthood . Davis et al 's meta-analytic study in this series provides a clear positive answer to the first part of this question : overall , studies relying on self-reported childhood abuse find that patients with pain are more likely to report abuse in childhood and that adults reporting a history of abuse in childhood also tend to report more current pain . On the other hand , these same authors find that the relationship between childhood abuse and pain is not large . Given small effect sizes , it is reasonable to question the clinical significance of the relationship . Another issue concerns the fact that nearly all data surrounding the relationship derives from retrospective studies , studies that are subject to biased recall about abuse histories . Because neither of the 2 prospective studies of this question have supported an association between childhood abuse and pain in adulthood , Brown et al 's new report in this Special Topics Series is particularly welcome . Similar to the earlier study using court-record documentation of abuse status , Brown et al did not find a significant association between documented childhood maltreatment and adult pain complaints , but did find a significant association between self-reported sexual abuse and adult pain . Brown et al address another directly related issue in this debate ; that is , the extent to which self-reports of childhood abuse may be influenced by psychologic symptoms present at the time of retrospective report . 
Brown et al 's analyses that control for depressive symptoms at time of reporting are noteworthy , in finding that statistical control for depressive symptoms leads to a distinct reduction in the strength of association between childhood sexual abuse reports and adult pain . However , a statistically significant association between sexual abuse and pain still remained , after controlling for depression . Consistent with Hardt and Rutter 's recent review on the validity of adult retrospective reports on childhood trauma . Received for publication October 2 , 2004 ; accepted for publication October 2 , 2004 . From the Department of Psychiatry , New Jersey Medical School , University of Medicine and Dentistry of New Jersey , Newark , NJ . Reprints : Karen G. Raphael , PhD , Department of Psychiatry , New Jersey Medical School , University of Medicine and Dentistry of New Jersey , 183 South Orange Avenue , BHSB F1512 , Newark , NJ 07103 ( e-mail : [email protected] ) . Copyright 2005 by Lippincott Williams & Wilkins . Numerous studies report that fibromyalgia ( FM ) , a syndrome characterized by widespread pain and generalized tender points , is comorbid with major depressive disorder ( MDD ) . The current study tests two alternate explanations for their comorbidity using a family study methodology . The first is that FM is a depression spectrum disorder . The second is that depression is a consequence of living with FM . We recruited potential probands by initially screening by telephone for FM and MDD among women in the NY/NJ metropolitan area , randomly selecting telephone numbers from a list of households with women . Eligible women were invited for second stage physical examinations for FM diagnosis and psychiatric interviews for MDD diagnosis . All available adult , first‐degree relatives received psychiatric interviews . 
Relatives of probands were divided into four groups on the basis of the probands ' FM and MDD diagnoses ( FM+/MDD+ ( n=156 ) , FM+/MDD− ( n=51 ) , FM−/MDD+ ( n=351 ) and FM−/MDD− ( n=101 ) ) . Results indicated that rates of MDD in the relatives of probands with FM but without personal histories of MDD were virtually identical to rates of MDD in relatives of probands with MDD themselves . This outcome is consistent with the hypothesis that FM is a depression spectrum disorder , in which FM and MDD are characterized by shared , familially mediated risk factors . The implications of these findings for a stress‐vulnerability model of FM are discussed Objectives : The aim of this multicentre inquiry was to evaluate the prevalence of sexual abuse among IBS patients consulting a gastroenterologist , in comparison to healthy controls and patients with organic digestive diseases . Patients and methods : Patients with irritable bowel syndrome ( IBS ; Rome Criteria ) were included by eight university hospitals ( n=196 ; 41.2±20.6 years ; sex ratio ( M/F ) = 0.23 ) . Control groups were : ( i ) patients consulting for the follow‐up of nonneoplastic organic digestive diseases ( n=135 ; 41.5±17.0 years ; 1.21 ) ; ( ii ) patients attending ophthalmology units ( n=200 ; 43.8±20.7 years ; 0.81 ) ; ( iii ) healthy subjects seen in centres of the National Health System ( n=172 ; 40.3±16.3 years ; 0.83 ) . Each patient filled in an anonymous questionnaire , without help . Prevalence of sexual abuse in the various groups was compared by the χ2 test . Results : Sixty‐two instances of sexual abuse ( 55 females , 7 males ) were recorded among the 196 IBS patients ( 31.6 % ) : 8 cases of verbal aggression , 4 of exhibitionism , 11 of sexual harassment , 22 sexual touches , 17 rapes . The prevalence of sexual abuse was 14.0 % for the patients with organic digestive diseases ( P= 0.0005 vs. 
IBS ) , 12.5 % among ophthalmology patients ( P<0.0001 ) and 7.6 % in healthy controls ( P<0.0001 ) . Sexual abuse was accompanied by physical abuse in 23 IBS patients and 19 patients from control groups ( not significant ) . Twenty‐six IBS patients reported isolated physical abuse ( 14.7 % ) versus 40 from control groups ( 8.8 % ; P=0.041 ) . There was a significant trend towards more severe attacks of abuse among IBS patients than in others . Conclusion : This study confirms the high prevalence of sexual abuse among IBS patients consulting in Gastroenterology . Some of these patients would benefit from appropriate therapy . ( In the majority of cases , this will be psychotherapy . ) Primary fibromyalgia is regarded as a disorder with a complex symptomatology , and no morphological alterations . Findings increasingly point to a dysfunction of the central nervous pain processing . The study aims to discuss vulnerability for fibromyalgia from a developmental psychopathological perspective . We investigated the presence of psychosocial adversities affecting the childhood of adult fibromyalgia patients ( FM ) and compared them to those of patients with somatoform pain disorders ( SOM ) and a control group ( CG ) with medically explained chronic pain . Using the structured biographical interview for pain patients ( SBI‐P ) , 38 FM patients , 71 SOM patients , and 44 CG patients were compared on the basis of 14 childhood adversities verified as relevant regarding long-term effects for adult health by prospective studies . The FM patients show the highest score of childhood adversities . In addition to sexual and physical maltreatment , the FM patients more frequently reported a poor emotional relationship with both parents , a lack of physical affection , experiences of the parents ' physical quarrels , as well as alcohol or other problems of addiction in the mother , separation , and a poor financial situation before the age of 7 . 
These experiences were found to a similar extent in the SOM patients , but distinctly less frequently in the CG . The results point to early psychosocial adversities as holding a similar etiological meaning in fibromyalgia as well as in somatoform pain disorders . The potential role of these factors as increasing the vulnerability for fibromyalgia is discussed
2,047
25,803,757
DISCUSSION TENS might relieve pain due to knee osteoarthritis .
OBJECTIVES Transcutaneous electrical nerve stimulation ( TENS ) has been reported to relieve pain and improve function in patients with knee osteoarthritis . The purpose of this systematic review and meta-analysis was to evaluate the efficacy of TENS for the management of knee osteoarthritis .
OBJECTIVE To compare the effectiveness of transcutaneous electrical nerve stimulation ( TENS ) , interferential currents ( IFCs ) , and shortwave diathermy ( SWD ) against each other and sham intervention with exercise training and education as a multimodal package . DESIGN A double-blind , randomized , controlled , multicenter trial . SETTING Departments of physical medicine and rehabilitation in 4 centers . PARTICIPANTS Patients ( N=203 ) with knee osteoarthritis ( OA ) . INTERVENTIONS The patients were randomized by the principal center into the following 6 treatment groups : TENS sham , TENS , IFCs sham , IFCs , SWD sham , and SWD . All interventions were applied 5 times a week for 3 weeks . In addition , exercises and an education program were given . The exercises were carried out as part of a home-based training program after 3 weeks ' supervised group exercise . MAIN OUTCOME MEASURES Primary outcome was a visual analog scale ( 0 - 100 mm ) to assess knee pain . Other outcome measures were time to walk a distance of 15 m , range of motion , Western Ontario and McMaster Universities Osteoarthritis Index ( WOMAC ) , Nottingham Health Profile , and paracetamol intake ( in grams ) . RESULTS We found a significant decrease in all assessment parameters ( P<.05 ) , without a significant difference among the groups except WOMAC stiffness score and range of motion . However , the intake of paracetamol was significantly lower in each treatment group when compared with the sham groups at 3 months ( P<.05 ) . Also , the patients in the IFCs group used a lower amount of paracetamol at 6 months ( P<.05 ) in comparison with the IFCs sham group . 
CONCLUSIONS Although all groups showed significant improvements , we can suggest that the use of physical therapy agents in knee OA provided additional benefits in improving pain because paracetamol intake was significantly higher in the patients who were treated with 3 sham interventions in addition to exercise and education The purpose of this study was to compare the effectiveness of transcutaneous nerve stimulation ( TENS ) , electroacupuncture ( EA ) , and ice massage with placebo treatment for the treatment of pain . Subjects ( n = 100 ) diagnosed with osteoarthritis ( OA ) of the knee were treated with these modalities . The parameters for evaluating the effectiveness of treatment include pain at rest , stiffness , 50 foot walking time , quadriceps muscle strength , and knee flexion degree . The results showed ( a ) that all three methods could be effective in decreasing not only pain but also the objective parameters in a short period of time ; and ( b ) that the treatment results in TENS , EA and ice massage were superior to placebo Background The present study tests whether a combined treatment of acupuncture and transcutaneous electrical nerve stimulation ( TENS ) is more effective than acupuncture or TENS alone for treating knee osteoarthritis ( OA ) . Methods Thirty-two patients with knee OA were randomly allocated to four groups . The acupuncture group ( ACP ) received only acupuncture treatment at selected acupoints for knee pain ; the TENS group ( TENS ) received only TENS treatment at pain areas ; the acupuncture and TENS group ( A&T ) received both acupuncture and TENS treatments ; the control group ( CT ) received topical poultice ( only when necessary ) . Each group received specific weekly treatment five times during the study . Outcome measures were pain intensity in a visual analogue scale ( VAS ) and knee function in terms of the Western Ontario and McMaster Universities Osteoarthritis Index ( WOMAC ) . 
Results The ACP , TENS and A&T groups reported lower VAS and WOMAC scores than the control group . Significant reduction in pain intensity ( P = 0.039 ) and significant improvement in knee function ( P = 0.008 ) were shown in the A&T group . Conclusion Combined acupuncture and TENS treatment was effective in pain relief and knee function improvement for the sampled patients suffering from knee OA In the GRADE approach , randomized trials start as high-quality evidence and observational studies as low-quality evidence , but both can be rated down if most of the relevant evidence comes from studies that suffer from a high risk of bias . Well-established limitations of randomized trials include failure to conceal allocation , failure to blind , loss to follow-up , and failure to appropriately consider the intention-to-treat principle . More recently recognized limitations include stopping early for apparent benefit and selective reporting of outcomes according to the results . Key limitations of observational studies include use of inappropriate controls and failure to adequately adjust for prognostic imbalance . Risk of bias may vary across outcomes ( e.g. , loss to follow-up may be far less for all-cause mortality than for quality of life ) , a consideration that many systematic reviews ignore . In deciding whether to rate down for risk of bias -- whether for randomized trials or observational studies -- authors should not take an approach that averages across studies . Rather , for any individual outcome , when there are some studies with a high risk , and some with a low risk of bias , they should consider including only the studies with a lower risk of bias OBJECTIVE This study examined the optimal stimulation duration of transcutaneous electrical nerve stimulation ( TENS ) for relieving osteoarthritic knee pain and the duration ( as measured by half-life ) of post-stimulation analgesia . 
SUBJECTS Thirty-eight patients received either : ( i ) 20 minutes ( TENS20 ) ; ( ii ) 40 minutes ( TENS40 ) ; ( iii ) 60 minutes ( TENS60 ) of TENS ; or ( iv ) 60 minutes of placebo TENS ( TENS(PL ) ) 5 days a week for 2 weeks . METHODS A visual analogue scale recorded the magnitude and pain relief period for up to 10 hours after stimulation . RESULTS By Day10 , a significantly greater cumulative reduction in the visual analogue scale scores was found in the TENS40 ( 83.40 % ) and TENS60 ( 68.37 % ) groups than in the TENS20 ( 54.59 % ) and TENS(PL ) ( 6.14 % ) groups ( p < 0.000 ) , such a group difference was maintained in the 2-week follow-up session ( p < 0.000 ) . In terms of the duration of post-stimulation analgesia period , the duration for the TENS40 ( 256 minutes ) and TENS60 ( 258 minutes ) groups was more prolonged than in the other 2 groups ( TENS20 = 168 minutes , TENS(PL ) = 35 minutes ) by Day10 ( p < 0.000 ) . However , the TENS40 group produced the longest pain relief period by the follow-up session . CONCLUSION 40 minutes is the optimal treatment duration of TENS , in terms of both the magnitude ( VAS scores ) of pain reduction and the duration of post-stimulation analgesia for knee osteoarthritis Abstract Objective : To study the effects of transcutaneous electrical nerve stimulation ( TENS ) on joint position sense ( JPS ) in knee osteoarthritis ( OA ) subjects . Methods : Thirty subjects with knee OA ( 40–60 years old ) using non-random sampling participated in this study . In order to evaluate the absolute error of repositioning of the knee joint , Qualysis Track Manager system was used and sensory electrical stimulation was applied through the TENS device . Results : The mean errors in repositioning of the joint , in two positions of the knee joint with 20 and 60 degree angle , after applying the TENS was significantly decreased ( p < 0.05 ) . 
Conclusion : Application of TENS in subjects with knee OA could improve JPS in these subjects Cetin N , Aytar A , Atalay A , Akman MN : Comparing hot pack , short-wave diathermy , ultrasound , and TENS on isokinetic strength , pain , and functional status of women with osteoarthritic knees : a single-blind , randomized , controlled trial . Am J Phys Med Rehabil 2008;87:443–451 . Objective : To investigate the therapeutic effects of physical agents administered before isokinetic exercise in women with knee osteoarthritis . Design : One hundred patients with bilateral knee osteoarthritis were randomized into five groups of 20 patients each : group 1 received short-wave diathermy + hot packs and isokinetic exercise ; group 2 received transcutaneous electrical nerve stimulation + hot packs and isokinetic exercise ; group 3 received ultrasound + hot packs and isokinetic exercise ; group 4 received hot packs and isokinetic exercise ; and group 5 served as controls and received only isokinetic exercise . Results : Pain and disability index scores were significantly reduced in each group . Patients in the study groups had significantly greater reductions in their visual analog scale scores and scores on the Lequesne index than did patients in the control group ( group 5 ) . They also showed greater increases than did controls in muscular strength at all angular velocities . In most parameters , improvements were greatest in groups 1 and 2 compared with groups 3 and 4 . Conclusions : Using physical agents before isokinetic exercises in women with knee osteoarthritis leads to augmented exercise performance , reduced pain , and improved function . 
Hot pack with a transcutaneous electrical nerve stimulator or short-wave diathermy has the best outcome It is not clear whether segmental innocuous stimulation has a stronger analgesic effect than segmental noxious stimulation for chronic pain and whether the fading of current sensation during treatment interferes with the analgesic effect , as suggested by the gate control theory . Electrical stimulation ( by way of Interferential Current ) applied at the pain area ( segmental ) was administered to 4 groups of patients with osteoarthritis ( OA ) knee pain . Two groups were administered with noxious stimulation ( 30 % above pain threshold ) and two with innocuous stimulation ( 30 % below pain threshold ) . In each group half of the patients received a fixed current intensity while the other half raised the intensity continuously during treatment whenever fading of sensation was perceived . Group 5 and 6 received sham stimulation and no treatment , respectively . The outcome measures were : chronic pain intensity , morning stiffness , range of motion ( ROM ) , pain threshold and % pain reduction . Both noxious and innocuous stimulation significantly decreased chronic pain ( P<0.001 ) and morning stiffness ( P<0.01 ) and significantly increased pain threshold ( P<0.001 ) and ROM ( P<0.001 ) compared with the control groups . Nevertheless , noxious stimulation decreased pain intensity ( P<0.05 ) and increased pain threshold ( P<0.001 ) significantly more than innocuous stimulation . No differences in treatment outcomes were found between adjusted and unadjusted stimulation . ( a ) Interferential current is very effective for chronic OA knee pain , ( b ) segmental noxious stimulation produces a stronger analgesic effect than segmental innocuous stimulation , ( c ) the fading of sensation during treatment , does not decrease the analgesic effect . 
Possible mechanisms explaining the findings are discussed Background : According to a recent meta-analysis study , there is strong evidence to support the view that transcutaneous electrical nerve stimulation ( TENS ) is an effective treatment for managing osteoarthritis ( OA ) knee pain . However , there is limited evidence showing its effectiveness in improving physical function . This study examined whether TENS alone can improve physical function in terms of range of knee motion and the Timed-Up-and-Go Test . Methods : Subjects were randomly allocated into 2 groups receiving TENS at 100 Hz or a placebo TENS . Outcome measures included : 1 ) visual analog scale for measuring the intensity of the present pain , 2 ) Timed-Up-and-Go Test , and 3 ) range of knee motion ( ROM ) . Repeated-measures analysis of variance and Pearson correlation were used for data analyses . Results : By day 10 , TENS produced a significantly greater increase in maximum knee ROM than the placebo group ( P = 0.033 ) . TENS also significantly increased the pain-limited knee ROM across sessions , but the between-group difference was short of significance ( P = 0.067 ) . The decrease in time in performing the Timed-Up-and-Go Test was also not significantly different between the 2 groups . A moderate correlation was observed between the reduction in pain scores and the improvement in the Timed-Up-and-Go Test . Conclusions : Our findings suggested that TENS did improve some of the physical parameters but over 10 days was unable to produce significant improvement in functional performance among people with knee OA . A larger-scale study with the assessment of other functional outcomes may be required to clarify if TENS could improve function in people with knee OA . 
Also , exercise can be considered to be an important adjunct treatment to TENS to improve function significantly OBJECTIVE This is a double blind study that examined the optimal stimulation frequency of transcutaneous electrical nerve stimulation in reducing pain due to knee osteoarthritis . SUBJECTS Thirty-four subjects were randomly allocated into 4 groups receiving transcutaneous electrical nerve stimulation at either : ( i ) 2 Hz ; ( ii ) 100 Hz ; ( iii ) an alternating frequency of 2 Hz and 100 Hz ( 2/100 Hz ) ; or ( iv ) a placebo transcutaneous electrical nerve stimulation . METHODS Treatment was administered 5 days a week for 2 weeks . The outcome measures included : ( i ) a visual analogue scale ; ( ii ) a timed up-and-go test ; and ( iii ) a range of knee motion . RESULTS The 3 active transcutaneous electrical nerve stimulation groups ( 2 Hz , 100 Hz , 2/100 Hz ) , but not the placebo group , significantly reduced osteoarthritic knee pain across treatment sessions . However , no significant between-group difference was found . Similarly , the 3 active transcutaneous electrical nerve stimulation groups , but not the placebo group , produced significant reductions in the amount of time required to perform the timed up-and-go test , and an increase in the maximum passive knee range of motion . CONCLUSION Our findings suggested that 2 weeks of repeated applications of transcutaneous electrical nerve stimulation at 2 Hz , 100 Hz or 2/100 Hz produced similar treatment effects for people suffering from osteoarthritic knee This article introduces the approach of GRADE to rating quality of evidence . GRADE specifies four categories ( high , moderate , low , and very low ) that are applied to a body of evidence , not to individual studies . In the context of a systematic review , quality reflects our confidence that the estimates of the effect are correct . 
In the context of recommendations, quality reflects our confidence that the effect estimates are adequate to support a particular recommendation. Randomized trials begin as high-quality evidence, observational studies as low quality. "Quality" as used in GRADE means more than risk of bias and so may also be compromised by imprecision, inconsistency, indirectness of study results, and publication bias. In addition, several factors can increase our confidence in an estimate of effect. GRADE provides a systematic approach for considering and reporting each of these factors. GRADE separates the process of assessing quality of evidence from the process of making recommendations. Judgments about the strength of a recommendation depend on more than just the quality of evidence

Objective: To evaluate the cumulative effect of repeated transcutaneous electrical nerve stimulation (TENS) on chronic osteoarthritic (OA) knee pain over a four-week treatment period, comparing it to that of placebo stimulation and exercise training given alone or in combination with TENS. Design: Sixty-two patients, aged 50–75, were stratified according to age, gender and body mass ratio before being randomly assigned to four groups. Interventions: Patients received either (1) 60 minutes of TENS, (2) 60 minutes of placebo stimulation, (3) isometric exercise training, or (4) TENS and exercise (TENS & Ex) five days a week for four weeks. Main outcome measures: Visual analogue scale (VAS) was used to measure knee pain intensity before and after each treatment session over a four-week period, and at the four-week follow-up session. Results: Repeated measures ANOVA showed a significant cumulative reduction in the VAS scores across the four treatment sessions (session 1, 10, 20 and the follow-up) in the TENS group (45.9% by session 20, p < 0.001) and the placebo group (43.3% by session 20, p = 0.034).
However, linear regression of the daily recordings of the VAS indicated that the slope in the TENS group (slope = -2.415, r = 0.943) was similar to that of the exercise group (slope = -2.625, r = 0.935), both of which were steeper than those of the other two groups. Note that the reduction of OA knee pain was maintained in the TENS group and the TENS & Ex group at the four-week follow-up session, but not in the other two groups. Conclusions: The four treatment protocols did not show a significant between-group difference over the study period. It was interesting to note that isometric exercise training of the quadriceps alone also reduced knee pain towards the end of the treatment period

OBJECTIVE To determine the effectiveness of subsensory, pulsed electrical stimulation (PES) in the symptomatic management of osteoarthritis (OA) of the knee. METHODS This was a double-blind, randomized, placebo-controlled, repeated-measures trial in 70 participants with clinically and radiographically diagnosed OA of the knee who were randomized to either PES or placebo. The primary outcome was change in pain score over 26 weeks measured on a 100-mm visual analog scale (VAS). Other measures included pain on the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC), function on the WOMAC, patient's global assessment of disease activity (on a 100-mm VAS), joint stiffness on the WOMAC, quality of life on the Medical Outcomes Study Short-Form 36 (SF-36) health survey, physical activity (using the Human Activity Profile and an accelerometer), and global perceived effect (on an 11-point scale). RESULTS Thirty-four participants were randomized to PES and 36 to placebo. Intent-to-treat analysis showed a statistically significant improvement in VAS pain score over 26 weeks in both groups, but no difference between groups (mean change difference 0.9 mm [95% confidence interval -11.7, 13.4]).
Similarly, there were no differences between groups for changes in WOMAC pain, function, and stiffness scores (-5.6 [95% confidence interval -14.9, 3.6], -1.9 [95% confidence interval -9.7, 5.9], and 3.7 [95% confidence interval -6.0, 13.5], respectively), SF-36 physical and mental component summary scores (1.7 [95% confidence interval -1.5, 4.8] and 1.2 [95% confidence interval -2.9, 5.4], respectively), patient's global assessment of disease activity (-2.8 [95% confidence interval -13.9, 8.4]), or activity measures. Fifty-six percent of the PES-treated group achieved a clinically relevant 20-mm improvement in VAS pain score at 26 weeks compared with 44% of controls (12% [95% confidence interval -11%, 33%]). CONCLUSION In this sample of subjects with mild-to-moderate symptoms and moderate-to-severe radiographic OA of the knee, 26 weeks of PES was no more effective than placebo

OBJECTIVE The safety and effectiveness of pulsed electrical stimulation was evaluated for the treatment of osteoarthritis (OA) of the knee. METHODS A multicenter, double-blind, randomized, placebo-controlled trial that enrolled 78 patients with OA of the knee incorporated 3 primary efficacy variables of patients' pain, patients' function, and physician global evaluation of patients' condition, and 6 secondary variables that included duration of morning stiffness, range of motion, knee tenderness, joint swelling, joint circumference, and walking time. Measurements were recorded at baseline and during the 4-week treatment period. RESULTS Patients treated with the active devices showed significantly greater improvement than the placebo group for all primary efficacy variables in comparisons of mean change from baseline to the end of treatment (p < 0.05).
Improvement of ≥50% from baseline was demonstrated in at least one primary efficacy variable in 50% of the active device group, in 2 variables in 32%, and in all 3 variables in 24%. In the placebo group, improvement of ≥50% occurred in 36% for one, 6% for 2, and 6% for 3 variables. Mean morning stiffness decreased 20 min in the active device group and increased 2 min in the placebo group (p < 0.05). No statistically significant differences were observed for tenderness, swelling, or walking time. CONCLUSION The improvements in clinical measures for pain and function found in this study suggest that pulsed electrical stimulation is effective for treating OA of the knee. Studies of long-term effects are warranted

Patients suffering from pain due to osteoarthritis of the hip and knee participated in a double-blind placebo-controlled trial using daily Codetron home care units for 6 weeks over the tibial, saphenous, popliteal and sciatic nerves, and tender points. Seventy-four percent of patients in the real Codetron group (Group A) and 28% of the patients in the sham Codetron group (Group B) improved their pain level more than 25% as measured by visual analogue scale. The difference in pain improvement between the two groups was statistically significant (p < 0.02 using Fisher's exact probability ratio). Other functional parameters proved to be insensitive to change in this study. This is highly suggestive of a beneficial effect in chronic pain conditions such as osteoarthritis

Background Transcutaneous electrical nerve stimulation (TENS) is commonly used for the management of pain; however, its effects on several pain and function measures are unclear. Objective The purpose of this study was to determine the effects of high-frequency TENS (HF-TENS) and low-frequency TENS (LF-TENS) on several outcome measures (pain at rest, movement-evoked pain, and pain sensitivity) in people with knee osteoarthritis.
Design The study was a double-blind, randomized clinical trial. Setting The setting was a tertiary care center. Participants Seventy-five participants with knee osteoarthritis (29 men and 46 women; 31–94 years of age) were assessed. Intervention Participants were randomly assigned to receive HF-TENS (100 Hz) (n=25), LF-TENS (4 Hz) (n=25), or placebo TENS (n=25) (pulse duration = 100 microseconds; intensity = 10% below motor threshold). Measurements The following measures were assessed before and after a single TENS treatment: cutaneous mechanical pain threshold, pressure pain threshold (PPT), heat pain threshold, heat temporal summation, Timed “Up & Go” Test (TUG), and pain intensity at rest and during the TUG. A linear mixed-model analysis of variance was used to compare differences before and after TENS and among groups (HF-TENS, LF-TENS, and placebo TENS). Results Compared with placebo TENS, HF-TENS and LF-TENS increased PPT at the knee; HF-TENS also increased PPT over the tibialis anterior muscle. There was no effect on the cutaneous mechanical pain threshold, heat pain threshold, or heat temporal summation. Pain at rest and during the TUG was significantly reduced by HF-TENS, LF-TENS, and placebo TENS. Limitations This study tested only a single TENS treatment. Conclusions Both HF-TENS and LF-TENS increased PPT in people with knee osteoarthritis; placebo TENS had no significant effect on PPT. Cutaneous pain measures were unaffected by TENS. Subjective pain ratings at rest and during movement were similarly reduced by active TENS and placebo TENS, suggesting a strong placebo component of the effect of TENS

OBJECTIVE To assess the effects of a burst application of transcutaneous electrical nerve stimulation (TENS) on cervical range of motion and pressure point sensitivity of latent myofascial trigger points (MTrPs). DESIGN A single-session, single-blind randomized trial.
SETTING General community rehabilitation clinic. PARTICIPANTS Individuals (N = 76; 45 men, 31 women) aged 18 to 41 years (mean ± SD, 23 ± 4 y) with latent MTrPs in 1 upper trapezius muscle. INTERVENTIONS Subjects were randomly divided into 2 groups: a TENS group that received a burst-type TENS (pulse width, 200 μs; frequency, 100 Hz; burst frequency, 2 Hz) stimulation over the upper trapezius for 10 minutes, and a placebo group that received a sham-TENS application over the upper trapezius, also for 10 minutes. MAIN OUTCOME MEASURES Referred pressure pain threshold (RPPT) over the MTrP and cervical range of motion in rotation were assessed before, and 1 and 5 minutes after, the intervention by an assessor blinded to subjects' treatment. RESULTS The analysis of covariance revealed a significant group × time interaction (P < .001) for RPPT: the TENS group exhibited a greater increase compared with the control group; however, between-group differences were small at 1 minute (0.3 kg/cm²; 95% confidence interval [CI], 0.1-0.4) and at 5 minutes (0.6 kg/cm²; 95% CI, 0.3-0.8) after treatment. A significant group × time interaction (P = .01) was also found for cervical rotation in favor of the TENS group. Between-group differences were also small at 1 minute (2.0°; 95% CI, 1.0-2.8) and at 5 minutes (2.7°; 95% CI, 1.7-3.8) after treatment. CONCLUSIONS A 10-minute application of burst-type TENS increases, in a small but statistically significant manner, the RPPT over upper trapezius latent MTrPs and the ipsilateral cervical range of motion

TENS can be administered in conventional (high frequency, low intensity) or acupuncture-like (AL-TENS: low frequency, high intensity) formats. It is claimed that AL-TENS produces stronger and longer-lasting hypoalgesia than conventional TENS, although evidence is lacking.
This randomised controlled parallel-group study compared the effects of 30 minutes of AL-TENS, conventional TENS, and placebo (no current) TENS on cold-pressor pain threshold (CPT) in 43 healthy participants. Results showed a greater increase in mean logₑ cold-pressor pain threshold relative to baseline for both AL-TENS and conventional TENS vs. placebo TENS, and for AL-TENS vs. placebo 5 and 15 minutes after TENS was switched off. There were no statistically significant differences between conventional TENS vs. placebo or between AL-TENS vs. conventional TENS at 5 or 15 minutes after TENS was switched off. In conclusion, AL-TENS but not conventional TENS prolonged post-stimulation hypoalgesia compared to placebo TENS. However, no differences between AL-TENS and conventional TENS were detected in head-to-head comparisons

Objective: To determine whether sensory transcutaneous electrical nerve stimulation (TENS) augmented with therapeutic exercise and worn for daily activities for four weeks would alter peak gait kinetics and kinematics, compared with placebo electrical stimulation and exercise, and exercise only. Design: Randomized controlled trial. Setting: Motion analysis laboratory. Subjects: Thirty-six participants with radiographically assessed knee osteoarthritis and volitional quadriceps activation below 90% were randomly assigned to electrical stimulation, placebo and comparison (exercise-only) groups. Interventions: Participants in all three groups completed a four-week quadriceps strengthening programme directed by an experienced rehabilitation clinician. Active electrical stimulation units and placebo units were worn in the electrical stimulation and placebo groups throughout the rehabilitation sessions as well as during all activities of daily living. Main measures: Peak external knee flexion moment and angle during stance phase were analysed at a comfortable walking speed before and after the intervention.
Findings: Comfortable walking speed increased for all groups over time (TENS 1.16 ± 0.15 versus 1.32 ± 0.16 m/s; placebo 1.21 ± 0.34 versus 1.3 ± 0.24 m/s; comparison 1.27 ± 0.18 versus 1.5 ± 0.14 m/s), yet no group differences in speed were found. No differences were found for peak flexion moment or angle between groups over time. Conclusions: TENS in conjunction with therapeutic exercise does not seem to affect peak flexion moment and angle during stance over a four-week period in participants with tibiofemoral osteoarthritis

Summary An electrode array enabled transcutaneous electrical nerve stimulation (TENS) treatment at sites of lowest skin resistance, which reduced movement-related osteoarthritic knee pain more than adjacent TENS treatment sites. Abstract A novel device was developed that measured local electrical skin resistance and generated pulsed local electrical currents that were delivered across the skin around the knee for patients with osteoarthritis (termed eBrace TENS). Currents were delivered using an electrode array of 16 small circular electrode elements so that stimulation could be spatially targeted. The aim of this study was to investigate the effects of spatially targeted transcutaneous electrical nerve stimulation (TENS) at points of low skin resistance on pain relief and mobility in osteoarthritis of the knee (OAK). A randomised, controlled, 3-arm, parallel-group trial was designed that compared pain and function following a 30- to 45-minute intervention of TENS at specific locations depending on the local electrical skin resistance. Pain intensity on the visual analogue scale (VAS), the 6-minute walk test, maximum voluntary contraction (MVC), and range of motion (ROM) were the primary outcomes. Lowest-resistance TENS reduced pain intensity during walking relative to resting baseline compared with random TENS (95% confidence interval of the difference: -20.8 mm, -1.26 mm).
There were no statistically significant differences between groups in distance during the walk test, maximum voluntary contraction (MVC) or range-of-motion (ROM) measures, or WOMAC scores. In conclusion, we provide evidence that use of a matrix electrode that spatially targets strong nonpainful TENS for 30 to 45 minutes at sites of low resistance can reduce pain intensity at rest and during walking.
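Several of the TENS trials above report a between-group difference as a point estimate with a 95% confidence interval (e.g. "95% confidence interval of the difference: -20.8 mm, -1.26 mm"). A minimal sketch of how such an interval can be computed from summary statistics, assuming two independent groups and a normal approximation; the means and SDs below echo the walking-speed figures quoted above, but the group sizes are assumed for illustration, and the trials themselves used repeated-measures ANOVA/ANCOVA rather than this simplification:

```python
from statistics import NormalDist

def mean_diff_ci(m1, sd1, n1, m2, sd2, n2, level=0.95):
    """Normal-approximation CI for the difference of two independent group
    means, computed from summary statistics (means, SDs, group sizes)."""
    diff = m1 - m2
    se = (sd1**2 / n1 + sd2**2 / n2) ** 0.5          # standard error of the difference
    z = NormalDist().inv_cdf(0.5 + level / 2)        # 1.96 for a 95% interval
    return diff, (diff - z * se, diff + z * se)

# Illustrative numbers only (group size n=12 is an assumption, not reported):
diff, (lo, hi) = mean_diff_ci(1.32, 0.16, 12, 1.16, 0.15, 12)
print(f"difference = {diff:.2f} m/s, 95% CI ({lo:.2f}, {hi:.2f})")
# → difference = 0.16 m/s, 95% CI (0.04, 0.28)
```

An interval that excludes zero corresponds to a "statistically significant" between-group difference at the chosen level, which is how the abstracts above move between CIs and significance statements.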
2,048
23,256,167
At present, disaggregated resource utilization accompanied by some cost information seems to be the most promising approach. The method for assigning values to costs, including external or indirect costs (such as time off work), can have a significant impact on the outcome of any economic evaluation.
INTRODUCTION Professional societies, like many other organizations around the world, have recognized the need to use rigorous processes to ensure that health care recommendations are based on the best available research evidence. This is the sixth of a series of 14 articles prepared to advise guideline developers for respiratory and other diseases on how to achieve this goal. In this article, we focused on integrating cost and resource information in guideline development and formulating recommendations, focusing on four key questions: (1) When is it important to incorporate costs, and/or resource implications, and/or cost-effectiveness, and/or affordability considerations in guidelines? (2) Which costs and which resource use should be considered in guidelines? (3) What sources of evidence should be used to estimate costs, resource use, and cost-effectiveness? (4) How can cost-effectiveness, resource implications, and affordability be taken into account explicitly? Where drug use is at issue, "explicit consideration" may need to involve only noting whether the price (easily determined and usually the main component of "acquisition cost") of a drug is high or low. Standards for evidence for clinical data are usually good-quality trials reporting a relevant endpoint that should be summarized in a systematic review.
Emerging health problems require rapid advice. We describe the development and pilot testing of a systematic, transparent approach used by the World Health Organization (WHO) to develop rapid advice guidelines in response to requests from member states confronted with uncertainty about the pharmacological management of avian influenza A (H5N1) virus infection. We first searched for systematic reviews of randomized trials of treatment and prevention of seasonal influenza and for non-trial evidence on H5N1 infection, including case reports and animal and in vitro studies. A panel of clinical experts, clinicians with experience in treating patients with H5N1, influenza researchers, and methodologists was convened for a two-day meeting. Panel members reviewed the evidence prior to the meeting and agreed on the process. It took one month to put together a team to prepare the evidence profiles (i.e., summaries of the evidence on important clinical and policy questions), and it took the team only five weeks to prepare and revise the evidence profiles and to prepare draft guidelines prior to the panel meeting. A draft manuscript for publication was prepared within 10 days following the panel meeting. Strengths of the process include its transparency and the short amount of time used to prepare these WHO guidelines. The process could be improved by shortening the time required to commission evidence profiles. Further development is needed to facilitate stakeholder involvement, and to evaluate and ensure the guideline's usefulness

STUDY OBJECTIVE We report on the incremental costs associated with improvements in health-related quality of life (HRQL) following 6 months of respiratory rehabilitation compared with conventional community care. DESIGN Prospective randomized controlled trial of rehabilitation. SETTING A respiratory rehabilitation unit. PARTICIPANTS Eighty-four subjects who completed the rehabilitation trial.
INTERVENTION Two months of inpatient rehabilitation followed by 4 months of outpatient supervision. MEASUREMENTS AND RESULTS All costs (hospitalization, medical care, medications, home care, assistive devices, transportation) were included. Simultaneous allocation was used to determine capital and direct and indirect hospitalization costs. The incremental cost of achieving improvements beyond the minimal clinically important difference in dyspnea, emotional function, and mastery was $11,597 (Canadian). More than 90% of this cost was attributable to the inpatient phase of the program. Of the nonphysician health-care professionals, nursing was identified as the largest cost center, followed by physical therapy and occupational therapy. The number of subjects needed to be treated (NNT) to improve one subject was 4.1 for dyspnea, 4.4 for fatigue, 3.3 for emotion, and 2.5 for mastery. CONCLUSION Cost estimates of various approaches to rehabilitation should be combined with valid, reliable, and responsive measures of outcome to enable cost-effectiveness measures to be reported. Comparison studies with the same method are necessary to determine whether the improvements in HRQL that follow inpatient rehabilitation are cheap or expensive. Such information will be important in identifying the extent to which alternative approaches to rehabilitation can influence resource allocation. A consideration of cost-effectiveness from the perspective of NNT may be useful in the evaluation of health-care programs

The aim of this paper is to assess the health economic consequences of substituting ipratropium with the new, once-daily bronchodilator tiotropium in patients with a diagnosis of chronic obstructive pulmonary disease (COPD). This prospective cost-effectiveness analysis was performed alongside two 1-yr randomised, double-blind clinical trials in the Netherlands and Belgium.
Patients had a diagnosis of COPD and a forced expiratory volume in one second (FEV1) ≤65% predicted normal. Patients were randomised to tiotropium (18 µg once daily) or ipratropium (2 puffs of 20 µg administered four times daily) in a ratio of 2:1. The mean number of exacerbations was reduced from 1.01 in the ipratropium group (n=175) to 0.74 in the tiotropium group (n=344). The percentages of patients with a relevant improvement on the St. George's Respiratory Questionnaire (SGRQ) were 34.6% and 51.2%, respectively. Compared to ipratropium, the number of hospital admissions, hospital days and unscheduled visits to healthcare providers was reduced by 46%, 42% and 36%, respectively. Mean annual healthcare costs including the acquisition cost of the study drugs were 1,721 (SEM 160) in the tiotropium group and 1,541 (SEM 163) in the ipratropium group (difference 180). Incremental cost-effectiveness ratios were 667 per exacerbation avoided and 1,084 per patient with a relevant improvement on the SGRQ. Substituting tiotropium for ipratropium in chronic obstructive pulmonary disease patients offers improved health outcomes and is associated with increased costs of 180 per patient per year

BACKGROUND Valproate is widely accepted as a drug of first choice for patients with generalised onset seizures, and its broad spectrum of efficacy means it is recommended for patients with seizures that are difficult to classify. Lamotrigine and topiramate are also thought to possess broad-spectrum activity. The SANAD study aimed to compare the longer-term effects of these drugs in patients with generalised onset seizures or seizures that are difficult to classify. METHODS SANAD was an unblinded randomised controlled trial in hospital-based outpatient clinics in the UK. Arm B of the study recruited 716 patients for whom valproate was considered to be standard treatment.
Patients were randomly assigned to valproate, lamotrigine, or topiramate between Jan 12, 1999, and Aug 31, 2004, and follow-up data were obtained up to Jan 13, 2006. Primary outcomes were time to treatment failure and time to 1-year remission, and analysis was by both intention to treat and per protocol. This study is registered as an International Standard Randomised Controlled Trial, number ISRCTN38354748. FINDINGS For time to treatment failure, valproate was significantly better than topiramate (hazard ratio 1.57 [95% CI 1.19-2.08]), but there was no significant difference between valproate and lamotrigine (1.25 [0.94-1.68]). For patients with an idiopathic generalised epilepsy, valproate was significantly better than both lamotrigine (1.55 [1.07-2.24]) and topiramate (1.89 [1.32-2.70]). For time to 12-month remission, valproate was significantly better than lamotrigine overall (0.76 [0.62-0.94]), and for the subgroup with an idiopathic generalised epilepsy (0.68 [0.53-0.89]). But there was no significant difference between valproate and topiramate in either the analysis overall or for the subgroup with an idiopathic generalised epilepsy. INTERPRETATION Valproate is better tolerated than topiramate and more efficacious than lamotrigine, and should remain the drug of first choice for many patients with generalised and unclassified epilepsies. However, because of known potential adverse effects of valproate during pregnancy, the benefits for seizure control in women of childbearing years should be considered
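The cost-effectiveness figures quoted in the rehabilitation and tiotropium abstracts above follow from simple arithmetic: an incremental cost-effectiveness ratio (ICER) divides the extra cost by the extra effect, and a number needed to treat (NNT) is the reciprocal of the absolute difference in response rates. A sketch using the tiotropium figures reported above (incremental cost 180 per patient per year, exacerbations reduced from 1.01 to 0.74, SGRQ responders up from 34.6% to 51.2%), which reproduces the published ratios of 667 and 1,084:

```python
def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return delta_cost / delta_effect

def nnt(risk_diff):
    """Number needed to treat: reciprocal of the absolute difference in response rates."""
    return 1 / risk_diff

print(round(icer(180, 1.01 - 0.74)))    # cost per exacerbation avoided → 667
print(round(icer(180, 0.512 - 0.346)))  # cost per additional SGRQ responder → 1084
print(round(nnt(0.512 - 0.346), 1))     # patients treated per extra SGRQ responder → 6.0
```

The same reciprocal relationship underlies the rehabilitation abstract's NNTs: an NNT of 2.5 for mastery, for instance, corresponds to an absolute response-rate difference of 0.4.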
2,049
20,855,806
They report no evidence for a specific drug interaction and find that PPIs were associated with an increased risk for adverse cardiovascular outcomes independent of clopidogrel use. In other words, the adverse effects attributed to this suspected drug interaction are not only 4- to 9-fold larger than clopidogrel's benefits, as established by RCT data, but also occur in an etiologically implausible time window during which clopidogrel has not been shown to be therapeutically active. The previous efficacy evidence suggests that the adverse events attributed to concomitant administration of clopidogrel and PPIs are more likely related to residual confounding from unmeasured adverse prognostic variables among patients who receive PPIs. An abstract publication from a post hoc analysis of the CREDO (Clopidogrel for the Reduction of Events During Observation) study (2) has also reported an increased risk for adverse events with PPIs, independent of clopidogrel exposure. Although PPIs have been inappropriately prescribed, with a lack of objective benefit, in young patients with dyspepsia, previous work from the same Danish database (13) and other databases (14) has clearly demonstrated that elderly patients with ACS who receive combination antiplatelet regimens have a 4- to 5-fold increase in their risk for serious gastrointestinal bleeding, which can be reduced with PPIs (15, 16). Many studies (17, 18) have also suggested that periprocedural bleeding during percutaneous coronary intervention, including gastrointestinal bleeding, is associated with increased long-term mortality.
As summarized by Charlot and colleagues, several other studies (26) have been published on this subject, and despite conflicting results, regulatory bodies have issued edicts (7, 8) that warn about a clinically significant drug interaction. These biases are a more likely explanation for the between-study variation in outcomes than differences in patient populations. The idea that a drug interaction could impede clopidogrel's clinical action was first suggested almost 10 years ago after in vitro studies of CYP3A4-metabolized statins (20). From a scientific viewpoint, the current appreciation of the essential role of CYP2C19 over CYP3A4 metabolism in clopidogrel's pharmacokinetics may favor today's increased scrutiny of PPIs over statins.
Background—Statins primarily metabolized by cytochrome P450 3A4 (CYP3A4) reportedly reduce clopidogrel's metabolism to active metabolite, thus attenuating its inhibition of platelet aggregation ex vivo. However, the clinical impact of this interaction has not been evaluated. Methods and Results—Clopidogrel for the Reduction of Events During Observation (CREDO) was a double-blind, placebo-controlled, randomized trial comparing pretreatment (300 mg) and 1-year (75 mg/d) clopidogrel therapy (clopidogrel) with no pretreatment and 1-month clopidogrel therapy (75 mg/d) (control) after a planned percutaneous coronary intervention. All patients received aspirin. The 1-year primary end point was a composite of death, myocardial infarction, and stroke. We performed a post hoc analysis to evaluate the clinical efficacy of concomitant clopidogrel and statin administration, categorizing baseline statin use as those predominantly CYP3A4-metabolized (atorvastatin, lovastatin, simvastatin, and cerivastatin) (CYP3A4-MET) or others (pravastatin and fluvastatin) (non-CYP3A4-MET). Of the 2116 patients enrolled, 1001 received a CYP3A4-MET and 158 a non-CYP3A4-MET statin. For the overall study population, the primary end point was significantly reduced in the clopidogrel group (8.5% versus 11.5%, RRR 26.9%; P = 0.025). This clopidogrel benefit was similar with statin use, irrespective of treatment with a CYP3A4-MET (7.6% clopidogrel, 11.8% control, RRR 36.4%, 95% CI 3.9 to 57.9; P = 0.03) or non-CYP3A4-MET statin (5.4% clopidogrel, 13.6% control, RRR 60.6%, 95% CI −23.9 to 87.4; P = 0.11). Patients given atorvastatin or pravastatin had similar 1-year event rates. Additionally, concomitant therapy with statins had no impact on major or minor bleeding rates.
Conclusions—Although ex vivo testing has suggested a potential negative interaction when coadministering a CYP3A4-metabolized statin with clopidogrel, this was not clinically observed in a post hoc analysis of a placebo-controlled study

BACKGROUND Despite current treatments, patients who have acute coronary syndromes without ST-segment elevation have high rates of major vascular events. We evaluated the efficacy and safety of the antiplatelet agent clopidogrel when given with aspirin in such patients. METHODS We randomly assigned 12,562 patients who had presented within 24 hours after the onset of symptoms to receive clopidogrel (300 mg immediately, followed by 75 mg once daily) (6259 patients) or placebo (6303 patients) in addition to aspirin for 3 to 12 months. RESULTS The first primary outcome (a composite of death from cardiovascular causes, nonfatal myocardial infarction, or stroke) occurred in 9.3 percent of the patients in the clopidogrel group and 11.4 percent of the patients in the placebo group (relative risk with clopidogrel as compared with placebo, 0.80; 95 percent confidence interval, 0.72 to 0.90; P < 0.001). The second primary outcome (the first primary outcome or refractory ischemia) occurred in 16.5 percent of the patients in the clopidogrel group and 18.8 percent of the patients in the placebo group (relative risk, 0.86; 95 percent confidence interval, 0.79 to 0.94; P < 0.001). The percentages of patients with in-hospital refractory or severe ischemia, heart failure, and revascularization procedures were also significantly lower with clopidogrel. There were significantly more patients with major bleeding in the clopidogrel group than in the placebo group (3.7 percent vs. 2.7 percent; relative risk, 1.38; P = 0.001), but there were not significantly more patients with episodes of life-threatening bleeding (2.2 percent [corrected] vs.
1.8 percent; P = 0.13) or hemorrhagic strokes (0.1 percent vs. 0.1 percent). CONCLUSIONS The antiplatelet agent clopidogrel has beneficial effects in patients with acute coronary syndromes without ST-segment elevation. However, the risk of major bleeding is increased among patients treated with clopidogrel

Background—We observed that the prodrug clopidogrel was less effective in inhibiting platelet aggregation with coadministration of atorvastatin during point-of-care platelet function testing. Because atorvastatin is metabolized by cytochrome P450 (CYP) 3A4, we hypothesized that clopidogrel might be activated by CYP3A4. Methods and Results—Platelet aggregation was measured in 44 patients undergoing coronary artery stent implantation treated with clopidogrel or clopidogrel plus pravastatin or atorvastatin, and in 27 volunteers treated with clopidogrel and either erythromycin or troleandomycin, CYP3A4 inhibitors, or rifampin, a CYP3A4 inducer. Atorvastatin, but not pravastatin, attenuated the antiplatelet activity of clopidogrel in a dose-dependent manner. Percent platelet aggregation was 34 ± 23, 58 ± 15 (P = 0.027), 74 ± 10 (P = 0.002), and 89 ± 7 (P = 0.001) in the presence of clopidogrel and 0, 10, 20, and 40 mg of atorvastatin, respectively. Erythromycin attenuated platelet aggregation inhibition (55 ± 12 versus 42 ± 12% platelet aggregation; P = 0.002), as did troleandomycin (78 ± 18 versus 45 ± 18% platelet aggregation; P < 0.0003), whereas rifampin enhanced platelet aggregation inhibition (33 ± 18 versus 56 ± 20% platelet aggregation; P = 0.001). Conclusions—CYP3A4 activates clopidogrel. Atorvastatin, another CYP3A4 substrate, competitively inhibits this activation. Use of a statin not metabolized by CYP3A4 and point-of-care platelet function testing may be warranted in patients treated with clopidogrel
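The clopidogrel abstracts above report effects as relative risks (RR) and relative risk reductions (RRR). A sketch of the underlying arithmetic, using approximate event counts reconstructed from the CURE percentages quoted above (9.3% of 6259 vs 11.4% of 6303); note the crude ratio this yields differs slightly from the published RR of 0.80, which came from the trial's adjusted time-to-event analysis:

```python
def relative_risk(events_t, n_t, events_c, n_c):
    """Crude relative risk from raw 2x2 counts (treatment vs control)."""
    return (events_t / n_t) / (events_c / n_c)

# Approximate counts back-calculated from the reported percentages:
events_t = round(0.093 * 6259)   # ~582 primary-outcome events on clopidogrel
events_c = round(0.114 * 6303)   # ~719 on placebo
rr = relative_risk(events_t, 6259, events_c, 6303)
rrr = 1 - rr
print(f"crude RR = {rr:.2f}, RRR = {rrr:.0%}")
# → crude RR = 0.82, RRR = 18%
```

The same formula applied to the CREDO percentages (8.5% vs 11.5%) gives a crude RRR near the reported 26.9%; residual differences again reflect the trials' model-based estimates rather than an error in the abstracts.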
2,050
27,354,917
Conclusion: Overall, this review demonstrates that pharmacist-led interventions may improve prescribing appropriateness in community-dwelling older adults.
Objective: To evaluate studies of pharmacist-led interventions on potentially inappropriate prescribing among community-dwelling older adults receiving primary care, to identify the components of a successful intervention.
The effect of pharmaceutical care on the prevention, detection, and resolution of medication-related problems in high-risk patients in a rural community was studied. Adult patients who received care at clinics in a medically underserved area of Alabama and who were identified as being at high risk of medication-related adverse events were randomly assigned to a control group or an intervention group. The control group received standard medical care, and the intervention group received pharmaceutical care, including a medical record review, a medication history review, pharmacotherapeutic evaluation, and patient medication education and monitoring over a one-year period. A total of 69 patients completed the study (33 in the intervention group and 36 in the control group). The percentage of patients responding to hypertension, diabetes, dyslipidemia, and anticoagulation therapy increased significantly in the intervention group and declined in the control group. Ratings for inappropriate prescribing improved in all 10 domains evaluated in the intervention group but worsened in 5 domains in the control group. There were no significant differences between the groups at 12 months in health-related quality of life or medication misadventures. Medication compliance scores improved in the intervention group but not in the control group. Medication knowledge increased in the intervention group and decreased in the control group. Pharmaceutical care in a rural, community-based setting appeared to reduce inappropriate prescribing, enhance disease management, and improve medication compliance and knowledge without adversely affecting health-related quality of life. Background Older people are at increased risk of drug-related problems (DRPs) caused by inappropriate use or underuse of medications, which may be increased during care transitions.
Objective To examine the effects of applying a validated prescribing appropriateness criteria-set during medication review in a cohort of older (≥65 years) Australians at the time of discharge from hospital. Setting Private hospital and homes of older patients in Sydney, Australia. Methods Cognitively well English-speaking patients aged 65 years or over taking five or more medications were recruited. A prescribing appropriateness criteria-set and the SF-36 health-related quality of life (HRQoL) survey were applied to all patients at discharge. Patients were then randomly assigned to receive either usual care (control, n=91) or discharge medication counselling and a medication review by a clinical pharmacist (intervention, n=92). Medication review recommendations were sent to the general practitioners of intervention group patients. All patients were followed up at 3 months post discharge, when the prescribing appropriateness criteria-set was reapplied and the HRQoL survey repeated. Main outcome measures: change in the number of prescribing appropriateness criteria met; change in HRQoL; number and causes of DRPs identified by medication review; intervention patient medication recommendation implementation rates. Results There was no significant difference in the number of criteria applicable and met in intervention patients, compared to control patients, between follow-up and discharge (0.09 ≤ p ≤ 0.97). While the difference between groups was positive at follow-up for SF-36 scores, the only domain that reached statistical significance was that for vitality (p=0.04). Eighty-eight intervention patient medication reviews identified 750 causes of DRPs (8.5±2.7 per patient). No causes of DRPs were identified in four patients. Of these causes, 76.4% (573/750) were identified by application of the prescribing appropriateness criteria-set.
GPs implemented a relatively low number (42.4%, 318/750) of recommendations. Conclusion Application of a prescribing appropriateness criteria-set during medication review in intervention patients did not increase the number of criteria met, nor result in a significant improvement in HRQoL. Higher recommendation implementation rates may require additional facilitators, including a higher quality of collaboration. Objective To undertake a process evaluation of pharmacists' recommendations arising in the context of a complex IT-enabled pharmacist-delivered randomised controlled trial (PINCER trial) to reduce the risk of hazardous medicines management in general practices. Methods PINCER pharmacists manually recorded patients' demographics, details of interventions recommended, actions undertaken by practice staff, and time taken to manage individual cases of hazardous medicines management. Data were coded, double-entered into SPSS version 15, and then summarised using percentages for categorical data (with 95% confidence interval (CI)) and, as appropriate, means (± standard deviation) or medians (interquartile range) for continuous data. Key findings Pharmacists spent a median of 20 min (interquartile range 10, 30) reviewing medical records, recommending interventions, and completing actions in each case of hazardous medicines management. Pharmacists judged 72% (95% CI 70, 74; 1463/2026) of cases of hazardous medicines management to be clinically relevant.
Pharmacists recommended 2105 interventions in 74% (95% CI 73, 76; 1516/2038) of cases, and 1685 actions were taken in 61% (95% CI 59, 63; 1246/2038) of cases; 66% (95% CI 64, 68; 1383/2105) of interventions recommended by pharmacists were completed, and 5% (95% CI 4, 6; 104/2105) of recommendations were accepted by general practitioners (GPs) but not completed at the end of the pharmacists' placement; the remaining recommendations were rejected or considered not relevant by GPs. Conclusions The outcome measures were used to target pharmacist activity in general practice towards patients at risk from hazardous medicines management. Recommendations from trained PINCER pharmacists were found to be broadly acceptable to GPs and led to ameliorative action in the majority of cases. It seems likely that the approach used by the PINCER pharmacists could be employed by other practice pharmacists following appropriate training. OBJECTIVE To measure the effect of nurse practitioner and pharmacist consultations on the appropriate use of medications by patients. DESIGN We studied patients in the intervention arm of a randomized controlled trial. The main trial intervention was provision of multidisciplinary team care, and the main outcome was quality and processes of care for chronic disease management. SETTING Patients were recruited from a single publicly funded family health network practice of 8 family physicians and associated staff serving 10 000 patients in a rural area near Ottawa, Ont. PARTICIPANTS A total of 120 patients 50 years of age or older who were on the practice roster and who were considered by their family physicians to be at risk of experiencing adverse health outcomes. INTERVENTION A pharmacist and 1 of 3 nurse practitioners visited each patient at his or her home, conducted a comprehensive medication review, and developed a tailored plan to optimize medication use.
The plan was developed in consultation with the patient and the patient's doctor. We assessed medication appropriateness at the study baseline and again 12 to 18 months later. MAIN OUTCOME MEASURES We used the medication appropriateness index to assess medication use. We examined associations between personal characteristics and inappropriate use at baseline and with improvements in medication use at the follow-up assessment. We recorded all drug problems encountered during the trial. RESULTS At baseline, 27.2% of medications were inappropriate in some way, and 77.7% of patients were receiving at least 1 medication that was inappropriate in some way. At the follow-up assessments these percentages had dropped to 8.9% and 38.6%, respectively (P<.001). Patient characteristics that were associated with receiving inappropriate medication at baseline were being older than 80 years of age (odds ratio [OR]=5.00, 95% CI 1.19 to 20.50), receiving more than 4 medications (OR=6.64, 95% CI 2.54 to 17.4), and not having a university-level education (OR=4.55, 95% CI 1.69 to 12.50). CONCLUSION We observed large improvements in the appropriate use of medications during this trial. This might provide a mechanism to explain some of the reductions in mortality and morbidity observed in other trials of counseling and advice provided by pharmacists and nurses. TRIAL REGISTRATION NUMBER NCT00238836 (ClinicalTrials.gov). OBJECTIVES To determine whether a computerized tool that alerted pharmacists when patients aged 65 and older were newly prescribed potentially inappropriate medications was effective in decreasing the proportion of patients dispensed these medications. DESIGN Prospective, randomized trial. SETTING U.S. health maintenance organization. PARTICIPANTS All 59,680 health plan members aged 65 and older were randomized to intervention (n=29,840) or usual care (n=29,840).
Pharmacists received alerts on all patients randomized to intervention who were newly prescribed a targeted medication. INTERVENTION Prescription and age information were linked to alert pharmacists when a patient aged 65 and older was newly prescribed one of 11 medications that are potentially inappropriate in older people. MEASUREMENTS Physicians and pharmacists collaborated to develop the targeted medication list, indications for medication use for which an intervention should occur, and intervention guidelines and scripts, and to implement the intervention. RESULTS Over the 1-year study, 543 (1.8%) intervention group patients aged 65 and older were newly dispensed prescriptions for targeted medications, compared with 644 (2.2%) usual care group patients (P=.002). For medication use indications in which an intervention should occur, dispensings of amitriptyline (P<.001) and diazepam (P=.02) were reduced. CONCLUSIONS This study demonstrated the effectiveness of a computerized pharmacy alert system plus collaboration between healthcare professionals in decreasing potentially inappropriate medication dispensings in elderly patients. Coupling data available from information systems with the knowledge and skills of physicians and pharmacists can improve prescribing safety in patients aged 65 and older. BACKGROUND The pharmaceutical care approach serves as a model for medication review, involving collaboration between GPs, pharmacists, patients, and carers. Its use is advocated with older patients, who are typically prescribed several drugs. However, it has yet to be thoroughly evaluated. AIM To estimate the effectiveness of pharmaceutical care for older people, shared between GPs and community pharmacists in the UK, relative to usual care. DESIGN OF STUDY Multiple interrupted time-series design in five primary care trusts which implemented pharmaceutical care at 2-month intervals in random order.
Patients acted as their own controls and were followed over 3 years, including their 12 months' participation in pharmaceutical care. SETTING In 2002, 760 patients aged ≥75 years were recruited from 24 general practices in East and North Yorkshire. Sixty-two community pharmacies also took part. A total of 551 participants completed the study. METHOD Pharmaceutical care was undertaken by community pharmacists who interviewed patients, developed and implemented pharmaceutical care plans together with patients' GPs, and thereafter undertook monthly medication reviews. Pharmacists and GPs attended training before the intervention. Outcome measures were the UK Medication Appropriateness Index, the Short Form-36 Health Survey (SF-36), and serious adverse events. RESULTS The intervention did not lead to any statistically significant change in the appropriateness of prescribing or health outcomes. Although the mental component of the SF-36 decreased as study participants became older, this trend was not affected by pharmaceutical care. CONCLUSION The RESPECT model of pharmaceutical care (Randomised Evaluation of Shared Prescribing for Elderly people in the Community over Time), shared between community pharmacists and GPs, did not significantly change the appropriateness of prescribing or quality of life in older patients. Background Small trials with short-term follow-up suggest pharmacists' interventions targeted at healthcare professionals can improve prescribing. In comparison with clinical guidance, contemporary statin prescribing is sub-optimal, and achievement of cholesterol targets falls short of accepted standards for patients with atherosclerotic vascular disease, who are at highest absolute risk and who stand to obtain greatest benefit.
We hypothesised that a pharmacist-led complex intervention delivered to doctors and nurses in primary care would improve statin prescribing and achievement of cholesterol targets for incident and prevalent patients with vascular disease, beyond one year. Methods We allocated general practices to a 12-month Statin Outreach Support (SOS) intervention or usual care. SOS was delivered by one of 11 pharmacists who had received additional training. SOS comprised academic detailing and practical support to identify patients with vascular disease who were not prescribed a statin at optimal dose or did not have cholesterol at target, followed by individualised recommendations for changes to management. The primary outcome was the proportion of patients achieving cholesterol targets. Secondary outcomes were: the proportion of patients prescribed simvastatin 40 mg with target cholesterol achieved; cholesterol levels; prescribing of simvastatin 40 mg; prescribing of any statin; and the proportion of patients with cholesterol tested. Outcomes were assessed after an average of 1.7 years (range 1.4–2.2 years), and practice-level simvastatin 40 mg prescribing was assessed after 10 years. Findings We randomised 31 practices (72 general practitioners (GPs), 40 nurses). Prior to randomisation a subset of eligible patients was identified to characterise practices; 40% had cholesterol levels below the target threshold. Improvements in data collection procedures allowed identification of all eligible patients (n=7586) at follow-up. Patients in practices allocated to SOS were significantly more likely to have cholesterol at target (69.5% vs 63.5%; OR 1.11, CI 1.00–1.23; p=0.043) as a result of improved simvastatin prescribing. Subgroup analysis showed the primary outcome was achieved by prevalent but not incident patients.
Statistically significant improvements occurred in all secondary outcomes for prevalent patients, and in all but one secondary outcome (the proportion of patients with cholesterol tested) for incident patients. SOS practices prescribed more simvastatin 40 mg than usual care practices, up to 10 years later. Interpretation Through a combination of educational and organisational support, a general practice-based, pharmacist-led collaborative intervention can improve statin prescribing and achievement of cholesterol targets in a high-risk primary care based population. Trial Registration International Standard Randomised Controlled Trials Register ISRCTN. Summary Background Medication errors are common in primary care and are associated with considerable risk of patient harm. We tested whether a pharmacist-led, information technology-based intervention was more effective than simple feedback in reducing the number of patients at risk of measures related to hazardous prescribing and inadequate blood-test monitoring of medicines 6 months after the intervention. Methods In this pragmatic, cluster randomised trial, general practices in the UK were stratified by research site and list size, and randomly assigned by a web-based randomisation service in block sizes of two or four to one of two groups. The practices were allocated to either computer-generated simple feedback for at-risk patients (control) or a pharmacist-led information technology intervention (PINCER), composed of feedback, educational outreach, and dedicated support. The allocation was masked to general practices, patients, pharmacists, researchers, and statisticians.
Primary outcomes were the proportions of patients at 6 months after the intervention who had had any of three clinically important errors: non-selective non-steroidal anti-inflammatory drugs (NSAIDs) prescribed to those with a history of peptic ulcer without co-prescription of a proton-pump inhibitor; β blockers prescribed to those with a history of asthma; long-term prescription of angiotensin converting enzyme (ACE) inhibitor or loop diuretics to those 75 years or older without assessment of urea and electrolytes in the preceding 15 months. The cost per error avoided was estimated by incremental cost-effectiveness analysis. This study is registered with Controlled-Trials.com, number ISRCTN21785299. Findings 72 general practices with a combined list size of 480 942 patients were randomised. At 6 months' follow-up, patients in the PINCER group were significantly less likely to have been prescribed a non-selective NSAID if they had a history of peptic ulcer without gastroprotection (OR 0·58, 95% CI 0·38–0·89); a β blocker if they had asthma (0·73, 0·58–0·91); or an ACE inhibitor or loop diuretic without appropriate monitoring (0·51, 0·34–0·78). PINCER has a 95% probability of being cost-effective if the decision-maker's ceiling willingness to pay reaches £75 per error avoided at 6 months. Interpretation The PINCER intervention is an effective method for reducing a range of medication errors in general practices with computerised clinical records. Funding Patient Safety Research Portfolio, Department of Health, England. Flaws in the design, conduct, analysis, and reporting of randomised trials can cause the effect of an intervention to be underestimated or overestimated.
The Cochrane Collaboration's tool for assessing risk of bias aims to make the process clearer and more accurate. Background Polypharmacy in the Swedish elderly population is currently a prioritised area of research, with a focus on reducing the use of potentially inappropriate medications (PIMs). Multi-professional interventions have previously been tested for their ability to improve drug therapy in frail elderly patients. Objective This study aimed to assess a structured model for pharmacist-led medication reviews in primary health care in southern Sweden and to measure its effects on the numbers of patients with PIMs (using the definition of the Swedish National Board of Health and Welfare), patients using ≥10 drugs, and patients using ≥3 psychotropics. Methods This study was a randomised controlled clinical trial performed in a group of patients aged ≥75 years living in nursing homes or the community and receiving municipal health care. Medication reviews were performed by trained clinical pharmacists based on nurse-initiated symptom assessments, with team-based or distance feedback to the physician. Data were collected from the patients' electronic medication lists and medical records at baseline and 2 months after the medication review. Results A total of 369 patients were included: 182 in the intervention group and 187 in the control group. One-third of the patients in both groups had at least one PIM at baseline. Two months after the medication reviews, the number of intervention group patients with at least one PIM and the number of intervention group patients using ten or more drugs had decreased (p=0.007 and p=0.001, respectively), while there were no statistically significant changes in the control patients. No changes were seen in the number of patients using three or more psychotropic drugs, although the dosages of these drugs tended to decrease. Drug-related problems (DRPs) were identified in 93% of the 182 patients in the intervention group.
In total, there were 431 DRPs in the intervention group (a mean of 2.5 DRPs per patient, range 0–9, SD 1.5 at 95% CI), and 16% of the DRPs were related to PIMs. Conclusions Medication reviews involving pharmacists in primary health care appear to be a feasible method to reduce the number of patients with PIMs, thus improving the quality of pharmacotherapy in elderly patients. BACKGROUND Regular medication review has been recommended for those over 75 and those on multiple drug therapy. Pharmacists are a potential source of assistance in reviewing medication. Evidence of the benefits of this process is needed. OBJECTIVE To study the effect of medication review led by a pharmacist on resolution of pharmaceutical care issues, medicine costs, use of health and social services, and health-related quality of life. DESIGN Randomized, controlled trial. SETTING General medical practices in the Grampian region of Scotland. SUBJECTS Patients aged at least 65 years, with at least two chronic disease states, who were taking at least four prescribed medicines regularly. METHODS Pharmacists reviewed the drug therapy of 332 patients, using information obtained from the practice computer, medical records, and patient interviews. In 168 patients, a pharmaceutical care plan was then drawn up and implemented. The 164 control patients continued to receive normal care. All outcome measures were assessed at baseline and after 3 months. RESULTS All patients had at least two pharmaceutical care issues at baseline. Half of these were identified from the prescription record, the rest from notes and patient interviews. Of all the issues, 21% were resolved by information found in notes and 8.5% by patient interview. General practitioners agreed with 96% of all care issues documented on the care plans in the intervention group.
At the time of follow-up, 70% of the remaining care issues had been resolved in the intervention group, while only 14% had been resolved in the control group. There were no changes in medicine costs or health-related quality of life in either group. There were small increases in contacts with health-care professionals and slightly fewer hospital admissions among the intervention group than the control group. CONCLUSIONS Pharmacist-led medication review has the capacity to identify and resolve pharmaceutical care issues and may have some impact on the use of other health services. Purpose To evaluate the effect of a combined or a single educational intervention on the prescribing behaviour of general practitioners (GPs). The primary endpoint was the effect on inappropriate prescribing according to the Medication Appropriateness Index (MAI). Methods General practitioners were randomised to either (1) a combined intervention consisting of an interactive educational meeting plus feedback on participating patients' medication, (2) a single intervention with an interactive educational meeting, or (3) a control group (no intervention). Elderly (>65 years) patients exposed to polypharmacy (≥5 medications) were identified and approached for inclusion. Data on medications prescribed over a 3-month period were collected, and the GPs provided detailed information on their patients before and after the intervention. A pre- and post-MAI were scored for all medications. Results Of the 277 GPs invited to participate, 41 (14.8%) volunteered. Data were obtained from 166 patients before and after the intervention. Medication appropriateness improved in the combined intervention group but not in the single intervention group.
The mean change in MAI and number of medications was −5 [95% confidence interval (CI) −7.3 to −2.6] and −1.03 (95% CI −1.7 to −0.30) in the combined intervention group compared with the group with the educational meeting only and the no-intervention group. Conclusions A combined intervention consisting of an interactive educational meeting plus recommendations given by clinical pharmacologists/pharmacists concerning specific patients can improve the appropriateness of prescribing among elderly patients exposed to polypharmacy. This study adds to the limited number of well-controlled, randomised studies on overall medication appropriateness among elderly patients in primary care. Important limitations of the study include variability in data provided by participating GPs and the low number of GPs volunteering for the study. Background This trial aims to investigate the effectiveness and cost implications of 'pharmaceutical care' provided by community pharmacists to elderly patients in the community. As the UK government has proposed that by 2004 pharmaceutical care services should extend nationwide, this provides an opportunity to evaluate the effect of pharmaceutical care for the elderly. Design The trial design is a randomised multiple interrupted time series. We aim to recruit 700 patients from about 20 general practices, each associated with about three community pharmacies, from each of the five Primary Care Trusts in North and East Yorkshire. We shall randomise the five resulting groups of practices, pharmacies, and patients to begin pharmaceutical care in five successive phases. All five will act as controls until they receive the intervention in a random sequence. Until they receive training, community pharmacists will provide their usual dispensing services and so act as controls. The community pharmacists and general practitioners will receive training in pharmaceutical care for the elderly.
Once trained, community pharmacists will meet recruited patients, either in their pharmacies (in a consultation room or dispensary to preserve confidentiality) or at home. They will identify drug-related issues/problems and design a pharmaceutical care plan in conjunction with both the GP and the patient. They will implement, monitor, and update this plan monthly. The primary outcome measure is the 'Medication Appropriateness Index'. Secondary measures include adverse events, quality of life, and patient knowledge and compliance. We shall also investigate the cost of pharmaceutical care to the NHS, to patients, and to society as a whole. Background Currently, far too many older adults consume inappropriate prescriptions, which increase the risk of adverse drug reactions and unnecessary hospitalizations. A health education program directly informing patients of prescription risks may promote inappropriate prescription discontinuation in chronic benzodiazepine users. Methods/Design This is a cluster randomized controlled trial using a two-arm parallel design. A total of 250 older chronic benzodiazepine users recruited from community pharmacies in the greater Montreal area will be studied with informed consent. A participating pharmacy with recruited participants represents a cluster, the unit of randomization. For every four pharmacies recruited, a simple 2:2 randomization is used to allocate clusters into intervention and control arms. Participants will be followed for 1 year. Within the intervention clusters, participants will receive a novel educational intervention detailing risks and safe alternatives to their current potentially inappropriate medication, while the control group will be wait-listed for the intervention for 6 months and receive usual care during that time period. The primary outcome is the rate of change in benzodiazepine use at 6 months.
Secondary outcomes are changes in risk perception, self-efficacy for discontinuing benzodiazepines, and activation of patients initiating discussions with their physician or pharmacist about safer prescribing practices. An intention-to-treat analysis will be followed. The rate of change of benzodiazepine use will be compared between intervention and control groups at the individual level at the 6-month follow-up. Risk differences between the control and experimental groups will be calculated, and the robust variance estimator will be used to estimate the associated 95% confidence interval (CI). As a sensitivity analysis (and/or if any confounders are unbalanced between the groups), we will estimate the risk difference for the intervention via a marginal model estimated via generalized estimating equations with an exchangeable correlation structure. Discussion Targeting consumers directly as catalysts for engaging physicians and pharmacists in collaborative discontinuation of benzodiazepine drugs is a novel approach to reducing inappropriate prescriptions. By directly empowering chronic users with knowledge about risks, we hope to imitate the success of individually targeted anti-smoking campaigns. Trial registration ClinicalTrials.gov identifier: BACKGROUND Previous studies have not demonstrated a consistent association between potentially inappropriate medicines (PIMs) in older patients, as defined by Beers criteria, and avoidable adverse drug events (ADEs). This study aimed to assess whether PIMs defined by the new STOPP (Screening Tool of Older Persons' potentially inappropriate Prescriptions) criteria are significantly associated with ADEs in older people with acute illness. METHODS We prospectively studied 600 consecutive patients 65 years or older who were admitted with acute illness to a university teaching hospital over a 4-month interval. Potentially inappropriate medicines were defined by both Beers and STOPP criteria.
Adverse drug events were defined by World Health Organization-Uppsala Monitoring Centre criteria and verified by a local expert consensus panel, which also assessed whether ADEs were causal or contributory to current hospitalization. Hallas criteria defined ADE avoidability. We compared the proportions of patients taking Beers criteria PIMs and STOPP criteria PIMs with avoidable ADEs that were causal or contributory to admission. RESULTS A total of 329 ADEs were detected in 158 of 600 patients (26.3%); 219 of the 329 ADEs (66.6%) were considered causal or contributory to admission. Of the 219 ADEs, 151 (68.9%) considered causal or contributory to admission were avoidable or potentially avoidable. After adjusting for age, sex, comorbidity, dementia, baseline activities of daily living function, and number of medications, the likelihood of a serious avoidable ADE increased significantly when STOPP PIMs were prescribed (odds ratio, 1.847; 95% confidence interval [CI], 1.506–2.264; P<.001); prescription of Beers criteria PIMs did not significantly increase ADE risk (odds ratio, 1.276; 95% CI, 0.945–1.722; P=.11). CONCLUSION STOPP criteria PIMs, unlike Beers criteria PIMs, are significantly associated with avoidable ADEs in older people that cause or contribute to urgent hospitalization. BACKGROUND It has been cited that the management of congestive heart failure (CHF) requires a multidisciplinary approach; however, the role of the pharmacist has not been extensively studied. The roles for pharmacists are changing to meet the long-term needs of patients in the community setting, including patients with CHF.
OBJECTIVES To evaluate the effect of a pharmacist on the appropriateness of medications taken by patients in the heart function clinic using the Medication Appropriateness Index (MAI), and to measure the effect of a pharmacist on the patients' response to the pharmacist's interventions using the Purdue Directive Guidance (DG) scale. METHODS Eighty patients attending the heart function clinic at the University Health Network, Toronto General Hospital, Toronto, Ontario, were randomly assigned to an intervention group that received pharmacist services or a nonintervention group that received usual care from the clinic staff. Patients were assessed at baseline and at one-month follow-up. RESULTS The change in MAI score from baseline was 0.74 and 0.49 for the intervention and nonintervention groups, respectively (P=0.605). The change in DG survey results was 9.97 and 1.00 for the intervention and nonintervention groups, respectively (P<0.001). The intervention group improved significantly in all components of the DG survey, especially those pertaining to feedback and goal setting. CONCLUSIONS A benefit was demonstrated for 'directive guidance' of patients, in the form of education and goal setting, as shown by positive survey results. BACKGROUND Older people are prone to problems related to the use of medicines. As they tend to use many different medicines, monitoring pharmacotherapy for older people in primary care is important. AIM To determine which procedure for treatment reviews (case conferences versus written feedback) results in more medication changes, measured at different moments in time, and to determine the costs and savings related to such an intervention. DESIGN OF STUDY Randomised, controlled trial, with randomisation at the level of the community pharmacy. SETTING Primary care; treatment reviews were performed by 28 pharmacists and 77 GPs concerning 738 older people (≥75 years) on polypharmacy (> five medicines).
METHOD In one group , pharmacists and GPs performed case conferences on prescription-related problems ; in the other group , pharmacists provided results of a treatment review to GPs as written feedback . Number of medication changes was counted following clinically-relevant recommendations . Costs and savings associated with the intervention at various times were calculated . RESULTS In the case-conference group significantly more medication changes were initiated ( 42 versus 22 , P = 0.02 ) . This difference was also present 6 months after treatment reviews ( 36 versus 19 , P = 0.02 ) . Nine months after treatment reviews , the difference was no longer significant ( 33 versus 19 , P = 0.07 ) . Additional costs in the case-conference group seem to be covered by the slightly greater savings in this group . CONCLUSION Performing treatment reviews with case conferences leads to greater uptake of clinically-relevant recommendations . Extra costs seem to be covered by related savings . The effect of the intervention declines over time , so performing treatment reviews for older people should be integrated in the routine collaboration between GPs and pharmacists PURPOSE To determine if inpatient or outpatient geriatric evaluation and management , as compared with usual care , reduces adverse drug reactions and suboptimal prescribing in frail elderly patients . METHODS The study employed a randomized 2 x 2 factorial controlled design . Subjects were patients in 11 Veterans Affairs ( VA ) hospitals who were > or = 65 years old and met criteria for frailty ( n = 834 ) . Inpatient geriatric unit and outpatient geriatric clinic teams evaluated and managed patients according to published guidelines and VA standards . Patients were followed for 12 months . Blinded physician-pharmacist pairs rated adverse drug reactions for causality ( using Naranjo 's algorithm ) and seriousness . 
Suboptimal prescribing measures included unnecessary and inappropriate drug use ( Medication Appropriateness Index ) , inappropriate drug use ( Beers criteria ) , and underuse . RESULTS For serious adverse drug reactions , there were no inpatient geriatric unit effects during the inpatient or outpatient follow-up periods . Outpatient geriatric clinic care resulted in a 35 % reduction in the risk of a serious adverse drug reaction compared with usual care ( adjusted relative risk = 0.65 ; 95 % confidence interval : 0.45 to 0.93 ) . Inpatient geriatric unit care reduced unnecessary and inappropriate drug use and underuse significantly during the inpatient period ( P < 0.05 ) . Outpatient geriatric clinic care reduced the number of conditions with omitted drugs significantly during the outpatient period ( P < 0.05 ) . CONCLUSION Compared with usual care , outpatient geriatric evaluation and management reduces serious adverse drug reactions , and inpatient and outpatient geriatric evaluation and management reduces suboptimal prescribing , in frail elderly patients BACKGROUND Explicit measures of potentially inappropriate prescribing , such as the Beers criteria , have been associated with risk for adverse drug events ( ADEs ) . However , no such link has been established for actual inappropriate prescribing using implicit measures . OBJECTIVE To determine whether an implicit measure of inappropriate prescribing can predict ADE risk . METHODS Patients were veterans aged 65 years and older who were seen in primary care clinics and participated in a randomized controlled trial of a pharmacist-physician collaborative intervention . Inappropriate prescribing was determined at baseline , using the 2003 Beers criteria as an explicit measure and the Medication Appropriateness Index ( MAI ) as an implicit measure . A modified MAI scoring approach was designed to target ADE risk and was used in addition to standard scoring . 
ADEs that occurred during the 3 months following baseline were assessed by patient interview and plausibility verification by blinded pharmacist review . Logistic regression analysis was used to determine whether inappropriate prescribing predicted risk for an ADE , controlling for potential confounding factors . RESULTS Of 236 patients , 34 ( 14.4 % ) experienced an ADE . Inappropriate prescribing was common at baseline , with 48.7 % of patients receiving a Beers criteria drug and 98.7 % of patients having an inappropriate rating on at least 1 MAI criterion . Modified MAI scoring , but not other measures of inappropriate prescribing , significantly predicted ADE risk . For every unit increase in modified MAI score ( 3.1 +/- 3.5 ; mean +/- SD ) , the adjusted 3-month odds of an ADE increased 13 % ( OR 1.13 ; 95 % CI 1.02 to 1.26 ) . For example , patients with a modified MAI score of 3 , near the precise mean score of 3.1 , were at a nearly 40 % greater risk for an ADE compared with patients with a score of zero . CONCLUSIONS Implicit measurement of actual inappropriate prescribing predicted ADE risk , an important clinical outcome . This finding helps confirm the validity of prior studies that have relied on explicit measures to link potentially inappropriate prescribing to adverse health outcomes PURPOSE To evaluate the effect of sustained clinical pharmacist interventions involving elderly outpatients with polypharmacy and their primary physicians . PATIENTS AND METHODS Randomized , controlled trial of 208 patients aged 65 years or older with polypharmacy ( > or = 5 chronic medications ) from a general medicine clinic of a Veterans Affairs Medical Center . A clinical pharmacist met with intervention group patients during all scheduled visits to evaluate their drug regimens and make recommendations to them and their physicians . 
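The "nearly 40 % greater risk" figure follows from the multiplicative scaling of odds ratios in a logistic model: a per-unit OR of 1.13 implies an OR of 1.13 raised to the k-th power for a k-unit score difference. A quick check with the rounded published coefficient (the abstract's ~40 % presumably reflects the unrounded estimate):

```python
# Odds ratios scale multiplicatively with the predictor in a
# logistic model: OR for a k-unit difference = (per-unit OR) ** k.
per_unit_or = 1.13  # rounded published OR per unit of modified MAI score
k = 3               # a score of 3 versus a score of 0
print(round(per_unit_or ** k, 2))  # → 1.44
```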
Outcome measures were prescribing appropriateness , health-related quality of life , adverse drug events , medication compliance and knowledge , number of medications , patient satisfaction , and physician receptivity . RESULTS Inappropriate prescribing scores declined significantly more in the intervention group than in the control group by 3 months ( decrease 24 % versus 6 % , respectively ; P = 0.0006 ) and was sustained at 12 months ( decrease 28 % versus 5 % , respectively ; P = 0.0002 ) . There was no difference between groups at closeout in health-related quality of life ( P = 0.99 ) . Fewer intervention than control patients ( 30.2 % versus 40.0 % ; P = 0.19 ) experienced adverse drug events . Measures for most other outcomes remained unchanged in both groups . Physicians were receptive to the intervention and enacted changes recommended by the clinical pharmacist more frequently than they enacted changes independently for control patients ( 55.1 % versus 19.8 % ; P < 0.001 ) . CONCLUSIONS This study demonstrates that a clinical pharmacist providing pharmaceutical care for elderly primary care patients can reduce inappropriate prescribing and possibly adverse drug effects without adversely affecting health-related quality of life OBJECTIVES To test the efficacy of a medication use improvement program developed specifically for home health agencies . The program addressed four medication problems identified by an expert panel : unnecessary therapeutic duplication , cardiovascular medication problems , use of psychotropic drugs in patients with possible adverse psychomotor or adrenergic effects , and use of nonsteroidal anti-inflammatory drugs ( NSAIDs ) in patients at high risk of peptic ulcer complications . It used a structured collaboration between a specially trained clinical pharmacist and the patients ' home-care nurses to improve medication use . DESIGN Parallel-group , randomized controlled trial . 
SETTING Two of the largest home health agencies in the United States . PARTICIPANTS Study subjects were consenting Medicare patients aged 65 and older admitted to participating agency offices from October 1996 through September 1998 , with a projected home healthcare duration of at least 4 weeks and at least one study medication problem . INTERVENTION Qualifying patients were randomized to usual care or usual care with the medication improvement program . MEASUREMENTS Medication use was measured during an in-home interview , with container inspection at baseline and at follow-up ( between 6 and 12 weeks ) by interviewers unaware of treatment assignment . The trial endpoint was the proportion of patients with medication use improvement according to predefined criteria at follow-up . RESULTS There were 259 randomized patients with completed follow-up interviews : 130 in the intervention group and 129 with usual care . Medication use improved for 50 % of intervention patients and 38 % of control patients , an attributable improvement of 12 patients per 100 ( 95 % confidence interval ( CI ) = 0.0 - 24.0 , P = .051 ) . The intervention effect was greatest for therapeutic duplication , with improvement for 71 % of intervention and 24 % of control patients , an attributable improvement of 47 patients per 100 ( 95 % CI = 20 - 74 , P = .003 ) . Use of cardiovascular medications also improved more frequently in intervention patients : 55 % vs 18 % , attributable improvement 37 patients per 100 ( 95 % CI = 9 - 66 , P = .017 ) . There were no significant improvements for the psychotropic medication or NSAID problems . There was no evidence of adverse intervention effects : new medication problems , more agency nurse visits , or increased duration of home health care . 
CONCLUSIONS A program congruent with existing personnel and practices of home health agencies improved medication use in a vulnerable population and was particularly effective in reducing therapeutic duplication Improving precision and economy in the prescribing of drugs is a goal whose importance has increased with the proliferation of new and potent agents and with growing economic pressures to contain health-care costs . We implemented an office-based physician education program to reduce the excessive use of three drug groups : cerebral and peripheral vasodilators , an oral cephalosporin , and propoxyphene . A four-state sample of 435 prescribers of these drugs was identified through Medicaid records and randomly assigned to one of three groups . Physicians who were offered personal educational visits by clinical pharmacists along with a series of mailed " unadvertisements " reduced their prescribing of the target drugs by 14 per cent as compared with controls ( P = 0.0001 ) . A comparable reduction in the number of dollars reimbursed for these drugs was also seen between the two groups , resulting in substantial cost savings . No such change was seen in physicians who received mailed print materials only . The effect persisted for at least nine months after the start of the intervention , and no significant increase in the use of expensive substitute drugs was found . Academically based " detailing " may represent a useful and cost-effective way to improve the quality of drug-therapy decisions and reduce unnecessary expenditures We estimated the cost and cost-effectiveness of a clinical pharmacist intervention known to improve the appropriateness of drug prescribing . Elderly veteran outpatients prescribed at least five drugs were randomized to an intervention ( 105 patients ) or control ( 103 ) group and followed for 1 year . The intervention pharmacist provided advice to patients and their physicians during all general medicine visits . 
Mean fixed and variable costs/intervention patient were $ 36 and $ 84 , respectively . Health services use and costs were comparable between groups . Intervention costs ranged from $ 7.50 - 30/patient/unit change in drug appropriateness . The cost to improve the appropriateness of drug prescribing is thus relatively low BACKGROUND Despite progress in describing the problem of potentially inappropriate medication ( PIM ) use , there have been few prospective studies demonstrating that interventions with specific medication criteria can make a difference in decreasing the use of problematic drugs in older adults . OBJECTIVE To design an intervention study to change physician behavior regarding PIM prescribing to older patients . STUDY DESIGN AND METHODS A prospective randomized block design was used during an 18-month period from January 2001 to June 2002 . The study population was primary care physicians ( n = 355 ) in the Medicare + Choice product line of a southeastern managed care organization and their patients 65 years and older . There were 170 physicians in the treatment group and 185 in the control group . Physicians were assigned to the treatment or usual-care groups using a randomization table , and each group included physicians who had and had not prescribed a PIM . RESULTS Approximately 71 % ( 84/118 ) of the physicians in the intervention group who prescribed a PIM completed and faxed back at least 1 potentially inappropriate medication form to the managed care organization . On 15.4 % ( 260/1692 ) of the medication forms , physicians made some change regarding PIM use . CONCLUSIONS Although many studies have addressed medication use among older adults , intervention studies aimed at influencing physician prescribing in this population are limited . 
This study describes a low-cost , replicable method to contact and educate physicians on drug therapy issues in older adults OBJECTIVES There are conflicting results in studies of pharmacists undertaking medication reviews for older people . With increasing promotion and funding for ' medication reviews ' there is a need for them to be standardised , and to determine their effectiveness and the feasibility of providing them from a community pharmacy . The objective was to determine whether involvement of community pharmacists undertaking clinical medication reviews , working with general practitioners , improved medicine-related therapeutic outcomes for patients . METHODS A randomised controlled trial was carried out in people 65 years and older on five or more prescribed medicines . Community pharmacists undertook a clinical medication review ( Comprehensive Pharmaceutical Care ) and met with the patient 's general practitioner to discuss recommendations about possible medicine changes . The patients were followed-up 3-monthly . The control group received usual care . The main outcome measures were Quality of Life ( SF-36 ) and Medication Appropriateness Index . KEY FINDINGS A total of 498 patients were enrolled in the study . The quality-of-life domains of emotional role and social functioning were significantly reduced in the intervention group compared to the control group . The Medication Appropriateness Index was significantly improved in the intervention group . Only 39 % of the 44 pharmacists who agreed to participate in the study provided adequate data , which was a limitation of the study and indicated potential barriers to the generalisability of the study . CONCLUSION Clinical medication reviews in collaboration with general practitioners can have a positive effect on the Medication Appropriateness Index . 
However , pharmacist withdrawal from the study suggests that community pharmacy may not be an appropriate environment from which to expand clinical medication reviews in primary care Background : There are concerns that automated drug dispensing may increase inappropriate drug use . Automated dispensing could lead to perpetual repeating of drug therapies without the necessary re-evaluation . Objectives : The aim of this study was to examine the effect of a pharmacist-led medication review on drug-related problems ( DRPs ) in older patients receiving their drugs via automated dispensing . Methods : This was a pragmatic randomized controlled study conducted in primary care . Patients were recruited from six Dutch community pharmacies . They were eligible if they lived at home , were aged ≥65 years , and used five or more different drugs , of which at least one had to be dispensed via an automated system . Patients were randomly allocated to receive a medication review at the start of the study ( intervention group ) or after 6 months ( waiting-list group ) . Each patient was independently reviewed by two pharmacist reviewers . The results of these medication reviews were sent to the community pharmacist to be discussed with the patient 's general practitioner ( GP ) . The primary outcome measure was the number of DRPs leading to a recommendation for drug change . Secondary outcomes were the total number of drug changes and the number of drug changes related to a recommendation . In order to analyse drug changes , medication records were collected 6 months after the medication review or index date in the waiting-list group . Potential DRPs were classified using the DOCUMENT classification . Results : There were no baseline differences between the 63 patients in the intervention group and the 55 patients in the waiting-list group with respect to age , sex , number of drugs per patient and type of drug prescribed . 
The mean number of DRPs per patient at baseline in the intervention group and waiting list combined was 8.5 , with no difference between the groups . At baseline , the mean number of DRPs leading to a recommendation for drug change was 4.5 per patient and did not differ between the two groups . After 6 months , the number of DRPs leading to a recommendation for drug change decreased by 29 % in the intervention group versus 5 % in the waiting-list group ( p < 0.01 ) . Recommendations for cessation of a drug were more frequently accepted than recommendations to add a new drug ( 82 % vs 44 % , p = 0.01 ) . Conclusions : This study shows that patients using automated drug dispensing have a high number of DRPs . Medication review decreases the number of DRPs among these patients . We recommend that all patients with automatic drug dispensing should have a thorough medication review by pharmacists and prescribers
2,051
27,535,291
Treatment ranking reflected improved PFS and OS with G-Clb over other treatment strategies ( median rank of one for both endpoints ) . Conclusion G-Clb is likely to show superior efficacy to other treatment options selected in our NMA for unfit treatment-naïve patients with CLL .
Introduction Rituximab plus fludarabine and cyclophosphamide ( RFC ) is the standard of care for fit patients with untreated chronic lymphocytic leukemia ( CLL ) ; however , its use is limited in ‘ unfit ’ ( co-morbid and/or full-dose F-ineligible ) patients due to its toxicity profile . We conducted a systematic review and Bayesian network meta-analysis ( NMA ) to determine the relative efficacy of commercially available interventions for the first-line treatment of unfit CLL patients .
The efficacy of bendamustine versus chlorambucil in a phase III trial of previously untreated patients with Binet stage B/C chronic lymphocytic leukaemia ( CLL ) was re‐evaluated after a median observation time of 54 months in May 2010 . Overall survival ( OS ) was analysed for the first time . At follow‐up , investigator‐assessed complete response ( CR ) rate ( 21·0 % vs 10·8 % ) , median progression‐free survival ( 21·2 vs 8·8 months ; P < 0·0001 ; hazard ratio 2·83 ) and time to next treatment ( 31·7 vs 10·1 months ; P < 0·0001 ) were improved for bendamustine over chlorambucil . OS was not different between groups for all patients or those ≤65 years , > 65 years , responders and non‐responders . However , patients with objective response or a CR experienced a significantly longer OS than non‐responders or those without a CR . Significantly more patients on chlorambucil progressed to second/further lines of treatment compared with those on bendamustine ( 78·3 % vs 63·6 % ; P = 0·004 ) . The benefits of bendamustine over chlorambucil were achieved without reducing quality of life . In conclusion , bendamustine is significantly more effective than chlorambucil in previously untreated CLL patients , with the achievement of a CR or objective response appearing to prolong OS . Bendamustine should be considered as a preferred first‐line option over chlorambucil for CLL patients ineligible for fludarabine , cyclophosphamide and rituximab Although chronic lymphocytic leukemia ( CLL ) is a disease of elderly patients , subjects older than 65 years are heavily underrepresented in clinical trials . The German CLL study group ( GCLLSG ) initiated a multicenter phase III trial for CLL patients older than 65 years comparing first-line therapy with fludarabine with chlorambucil . 
A total of 193 patients with a median age of 70 years were randomized to receive fludarabine ( 25 mg/m2 for 5 days intravenously , every 28 days , for 6 courses ) or chlorambucil ( 0.4 mg/kg body weight [ BW ] with an increase to 0.8 mg/kg , every 15 days , for 12 months ) . Fludarabine resulted in a significantly higher overall and complete remission rate ( 72 % vs 51 % , P = .003 ; 7 % vs 0 % , P = .011 ) . Time to treatment failure was significantly shorter in the chlorambucil arm ( 11 vs 18 months ; P = .004 ) , but no difference in progression-free survival time was observed ( 19 months with fludarabine , 18 months with chlorambucil ; P = .7 ) . Moreover , fludarabine did not increase the overall survival time ( 46 months in the fludarabine vs 64 months in the chlorambucil arm ; P = .15 ) . Taken together , the results suggest that in elderly CLL patients the first-line therapy with fludarabine alone does not result in a major clinical benefit compared with chlorambucil . This trial is registered with www.isrctn.org under identifier ISRCTN36294212 Mixed treatment comparison ( MTC ) meta-analysis is a generalization of standard pairwise meta-analysis for A vs B trials , to data structures that include , for example , A vs B , B vs C , and A vs C trials . There are two roles for MTC : one is to strengthen inference concerning the relative efficacy of two treatments , by including both ' direct ' and ' indirect ' comparisons . The other is to facilitate simultaneous inference regarding all treatments , in order for example to select the best treatment . In this paper , we present a range of Bayesian hierarchical models using the Markov chain Monte Carlo software WinBUGS . These are multivariate random effects models that allow for variation in true treatment effects across trials . We consider models where the between-trials variance is homogeneous across treatment comparisons as well as heterogeneous variance models . 
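The 'indirect' comparisons that MTC generalizes can be illustrated with the simplest non-Bayesian case, a Bucher-style adjusted indirect comparison: given log-scale effects of A vs B and C vs B from separate trials, the A vs C effect is their difference, with variances adding. This is a minimal sketch of the idea only, with hypothetical inputs rather than trial data:

```python
import math

def indirect_comparison(d_ab, se_ab, d_cb, se_cb):
    """Bucher-style adjusted indirect comparison on the log scale:
    combine A-vs-B and C-vs-B log-effects (with standard errors)
    into an A-vs-C estimate; variances add in quadrature."""
    d_ac = d_ab - d_cb
    se_ac = math.sqrt(se_ab ** 2 + se_cb ** 2)
    lo = math.exp(d_ac - 1.96 * se_ac)
    hi = math.exp(d_ac + 1.96 * se_ac)
    return math.exp(d_ac), lo, hi

# Hypothetical log hazard ratios against a shared comparator B:
hr, lo, hi = indirect_comparison(math.log(0.6), 0.15, math.log(0.8), 0.20)
print(round(hr, 2), round(lo, 2), round(hi, 2))  # → 0.75 0.46 1.22
```

A full Bayesian MTC, as described in the abstract, does this jointly over the whole evidence network with random effects, but the point estimate of each indirect contrast reduces to this kind of difference on the log scale.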
We also compare models with fixed ( unconstrained ) baseline study effects with models with random baselines drawn from a common distribution . These models are applied to an illustrative data set and posterior parameter distributions are compared . We discuss model critique and model selection , illustrating the role of Bayesian deviance analysis , and node-based model criticism . The assumptions underlying the MTC models and their parameterization are also discussed BACKGROUND Chronic lymphocytic leukemia ( CLL ) primarily affects older persons who often have coexisting conditions in addition to disease-related immunosuppression and myelosuppression . We conducted an international , open-label , randomized phase 3 trial to compare two oral agents , ibrutinib and chlorambucil , in previously untreated older patients with CLL or small lymphocytic lymphoma . METHODS We randomly assigned 269 previously untreated patients who were 65 years of age or older and had CLL or small lymphocytic lymphoma to receive ibrutinib or chlorambucil . The primary end point was progression-free survival as assessed by an independent review committee . RESULTS The median age of the patients was 73 years . During a median follow-up period of 18.4 months , ibrutinib resulted in significantly longer progression-free survival than did chlorambucil ( median , not reached vs. 18.9 months ) , with a risk of progression or death that was 84 % lower with ibrutinib than that with chlorambucil ( hazard ratio , 0.16 ; P<0.001 ) . Ibrutinib significantly prolonged overall survival ; the estimated survival rate at 24 months was 98 % with ibrutinib versus 85 % with chlorambucil , with a relative risk of death that was 84 % lower in the ibrutinib group than in the chlorambucil group ( hazard ratio , 0.16 ; P=0.001 ) . The overall response rate was higher with ibrutinib than with chlorambucil ( 86 % vs. 35 % , P<0.001 ) . 
The rates of sustained increases from baseline values in the hemoglobin and platelet levels were higher with ibrutinib . Adverse events of any grade that occurred in at least 20 % of the patients receiving ibrutinib included diarrhea , fatigue , cough , and nausea ; adverse events occurring in at least 20 % of those receiving chlorambucil included nausea , fatigue , neutropenia , anemia , and vomiting . In the ibrutinib group , four patients had a grade 3 hemorrhage and one had a grade 4 hemorrhage . A total of 87 % of the patients in the ibrutinib group are continuing to take ibrutinib . CONCLUSIONS Ibrutinib was superior to chlorambucil in previously untreated patients with CLL or small lymphocytic lymphoma , as assessed by progression-free survival , overall survival , response rate , and improvement in hematologic variables . ( Funded by Pharmacyclics and others ; RESONATE-2 ClinicalTrials.gov number , NCT01722487 . ) BACKGROUND The monoclonal anti-CD20 antibody rituximab , combined with chemotherapeutic agents , has been shown to prolong overall survival in physically fit patients with previously untreated chronic lymphocytic leukemia ( CLL ) but not in those with coexisting conditions . We investigated the benefit of the type 2 , glycoengineered antibody obinutuzumab ( also known as GA101 ) as compared with that of rituximab , each combined with chlorambucil , in patients with previously untreated CLL and coexisting conditions . METHODS We randomly assigned 781 patients with previously untreated CLL and a score higher than 6 on the Cumulative Illness Rating Scale ( CIRS ) ( range , 0 to 56 , with higher scores indicating worse health status ) or an estimated creatinine clearance of 30 to 69 ml per minute to receive chlorambucil , obinutuzumab plus chlorambucil , or rituximab plus chlorambucil . The primary end point was investigator-assessed progression-free survival . 
RESULTS The patients had a median age of 73 years , creatinine clearance of 62 ml per minute , and CIRS score of 8 at baseline . Treatment with obinutuzumab-chlorambucil or rituximab-chlorambucil , as compared with chlorambucil monotherapy , increased response rates and prolonged progression-free survival ( median progression-free survival , 26.7 months with obinutuzumab-chlorambucil vs. 11.1 months with chlorambucil alone ; hazard ratio for progression or death , 0.18 ; 95 % confidence interval [ CI ] , 0.13 to 0.24 ; P<0.001 ; and 16.3 months with rituximab-chlorambucil vs. 11.1 months with chlorambucil alone ; hazard ratio , 0.44 ; 95 % CI , 0.34 to 0.57 ; P<0.001 ) . Treatment with obinutuzumab-chlorambucil , as compared with chlorambucil alone , prolonged overall survival ( hazard ratio for death , 0.41 ; 95 % CI , 0.23 to 0.74 ; P=0.002 ) . Treatment with obinutuzumab-chlorambucil , as compared with rituximab-chlorambucil , resulted in prolongation of progression-free survival ( hazard ratio , 0.39 ; 95 % CI , 0.31 to 0.49 ; P<0.001 ) and higher rates of complete response ( 20.7 % vs. 7.0 % ) and molecular response . Infusion-related reactions and neutropenia were more common with obinutuzumab-chlorambucil than with rituximab-chlorambucil , but the risk of infection was not increased . CONCLUSIONS Combining an anti-CD20 antibody with chemotherapy improved outcomes in patients with CLL and coexisting conditions . In this patient population , obinutuzumab was superior to rituximab when each was combined with chlorambucil . ( Funded by F. Hoffmann-La Roche ; ClinicalTrials.gov number , NCT01010061 . ) Early results of the fludarabine , cyclophosphamide , and rituximab ( FCR ) regimen in 224 patients showed that it was highly active as initial therapy of chronic lymphocytic leukemia . In this report , we present the final results of all 300 study patients at a median follow up of 6 years . 
The overall response rate was 95 % , with complete remission in 72 % , nodular partial remission in 10 % , partial remission due to cytopenia in 7 % , and partial remission due to residual disease in 6 % . Two patients ( < 1 % ) died within 3 months of starting therapy . Six-year overall and failure-free survival were 77 % and 51 % , respectively . Median time to progression was 80 months . Pretreatment characteristics independently associated with inferior response were age 70 years or older , beta2-microglobulin twice the upper limit of normal ( 2N ) or more , white cell count 150 x 10^9/L or more , abnormal chromosome 17 , and lactate dehydrogenase 2N or more . No pretreatment characteristic was independently associated with decreased complete remission duration . The risk of late infection was 10 % and 4 % for the first and second years of remission , respectively , and less than 1.5 % per year for the third year onward . In a multivariate analysis of patients receiving fludarabine-based therapy at our center , FCR therapy emerged as the strongest independent determinant of survival BACKGROUND On the basis of promising results that were reported in several phase 2 trials , we investigated whether the addition of the monoclonal antibody rituximab to first-line chemotherapy with fludarabine and cyclophosphamide would improve the outcome of patients with chronic lymphocytic leukaemia . METHODS Treatment-naive , physically fit patients ( aged 30 - 81 years ) with CD20-positive chronic lymphocytic leukaemia were randomly assigned in a one-to-one ratio to receive six courses of intravenous fludarabine ( 25 mg/m2 per day ) and cyclophosphamide ( 250 mg/m2 per day ) for the first 3 days of each 28-day treatment course with or without rituximab ( 375 mg/m2 on day 0 of first course , and 500 mg/m2 on day 1 of second to sixth courses ) in 190 centres in 11 countries . 
Investigators and patients were not masked to the computer-generated treatment assignment . The primary endpoint was progression-free survival ( PFS ) . Analysis was by intention to treat . This study is registered with ClinicalTrials.gov , number NCT00281918 . FINDINGS 408 patients were assigned to fludarabine , cyclophosphamide , and rituximab ( chemoimmunotherapy group ) and 409 to fludarabine and cyclophosphamide ( chemotherapy group ) ; all patients were analysed . At 3 years after randomisation , 65 % of patients in the chemoimmunotherapy group were free of progression compared with 45 % in the chemotherapy group ( hazard ratio 0·56 [ 95 % CI 0·46 - 0·69 ] , p<0·0001 ) ; 87 % were alive versus 83 % , respectively ( 0·67 [ 0·48 - 0·92 ] ; p=0·01 ) . Chemoimmunotherapy was more frequently associated with grade 3 and 4 neutropenia ( 136 [ 34 % ] of 404 vs 83 [ 21 % ] of 396 ; p<0·0001 ) and leucocytopenia ( 97 [ 24 % ] vs 48 [ 12 % ] ; p<0·0001 ) . Other side-effects , including severe infections , were not increased . There were eight ( 2 % ) treatment-related deaths in the chemoimmunotherapy group compared with ten ( 3 % ) in the chemotherapy group . INTERPRETATION Chemoimmunotherapy with fludarabine , cyclophosphamide , and rituximab improves progression-free survival and overall survival in patients with chronic lymphocytic leukaemia . Moreover , the results suggest that the choice of a specific first-line treatment changes the natural course of chronic lymphocytic leukaemia . FUNDING F Hoffmann-La Roche BACKGROUND Treatment for patients with chronic lymphocytic leukaemia who are elderly or who have comorbidities is challenging because fludarabine-based chemoimmunotherapies are mostly not suitable . Chlorambucil remains the standard of care in many countries . 
We aimed to investigate whether the addition of ofatumumab to chlorambucil could lead to better clinical outcomes than does treatment with chlorambucil alone , while also being tolerable for patients who have few treatment options . METHODS We carried out a randomised , open-label , phase 3 trial for treatment-naive patients with chronic lymphocytic leukaemia in 109 centres in 16 countries . We included patients who had active disease needing treatment , but in whom fludarabine-based treatment was not possible . We randomly assigned patients ( 1:1 ) to receive oral chlorambucil ( 10 mg/m(2 ) ) on days 1 - 7 of a 28 day treatment course or to receive chlorambucil by this schedule plus intravenous ofatumumab ( cycle 1 : 300 mg on day 1 and 1000 mg on day 8 ; subsequent cycles : 1000 mg on day 1 ) for three to 12 cycles . Assignment was done with a randomisation list that was computer generated at GlaxoSmithKline , and was stratified , in a block size of two , by age , disease stage , and performance status . The primary endpoint was progression-free survival in the intention-to-treat population and assessment was done by an independent review committee that was masked to group assignment . The study is registered with ClinicalTrials.gov , number NCT00748189 . FINDINGS We enrolled 447 patients , median age 69 years ( range 35 - 92 ) . Between Dec 22 , 2008 , and May 26 , 2011 , we randomly assigned 221 patients to chlorambucil plus ofatumumab and 226 patients to chlorambucil alone . Median progression-free survival was 22·4 months ( 95 % CI 19·0 - 25·2 ) in the group assigned to chlorambucil plus ofatumumab compared with 13·1 months ( 10·6 - 13·8 ) in the group assigned to chlorambucil only ( hazard ratio 0·57 , 95 % CI 0·45 - 0·72 ; p<0·0001 ) . 
Grade 3 or greater adverse events were more common in the chlorambucil plus ofatumumab group ( 109 [ 50 % ] patients ; vs 98 [ 43 % ] given chlorambucil alone ) , with neutropenia being the most common event ( 56 [ 26 % ] vs 32 [ 14 % ] ) . Grade 3 or greater infections had similar frequency in both groups . Grade 3 or greater infusion-related adverse events were reported in 22 ( 10 % ) patients given chlorambucil plus ofatumumab . Five ( 2 % ) patients died during treatment in each group . INTERPRETATION Addition of ofatumumab to chlorambucil led to clinically important improvements with a manageable side-effect profile in treatment-naive patients with chronic lymphocytic leukaemia who were elderly or had comorbidities . Chlorambucil plus ofatumumab is therefore an important treatment option for these patients who cannot tolerate more intensive therapy . FUNDING GlaxoSmithKline , Genmab PURPOSE Fludarabine and cyclophosphamide ( FC ) , which are active in treatment of chronic lymphocytic leukemia ( CLL ) , are synergistic with the monoclonal antibody rituximab in vitro in lymphoma cell lines . A chemoimmunotherapy program consisting of fludarabine , cyclophosphamide , and rituximab ( FCR ) was developed with the goal of increasing the complete remission ( CR ) rate in previously untreated CLL patients to ≥ 50 % . PATIENTS AND METHODS We conducted a single-arm study of FCR as initial therapy in 224 patients with progressive or advanced CLL . Flow cytometry was used to measure residual disease . Results and safety were compared with a previous regimen using FC . RESULTS The median age was 58 years ; 75 patients ( 33 % ) had Rai stage III to IV disease . The CR rate was 70 % ( 95 % CI , 63 % to 76 % ) , the nodular partial remission rate was 10 % , and the partial remission rate was 15 % , for an overall response rate of 95 % ( 95 % CI , 92 % to 98 % ) . 
Two thirds of patients evaluated with flow cytometry had less than 1 % CD5- and CD19-coexpressing cells in bone marrow after therapy . Grade 3 to 4 neutropenia occurred during 52 % of courses ; major and minor infections were seen in 2.6 % and 10 % of courses , respectively . One third of the 224 patients had ≥ one episode of infection , and 10 % had a fever of unknown origin . CONCLUSION FCR produced a high CR rate in previously untreated CLL . Most patients had no detectable disease on flow cytometry at the end of therapy . Time to treatment failure analysis showed that 69 % of patients were projected to be failure free at 4 years ( 95 % CI , 57 % to 81 % ) BACKGROUND Fludarabine is an effective treatment for chronic lymphocytic leukemia that does not respond to initial treatment with chlorambucil . We compared the efficacy of fludarabine with that of chlorambucil in the primary treatment of chronic lymphocytic leukemia . METHODS Between 1990 and 1994 , we randomly assigned 509 previously untreated patients with chronic lymphocytic leukemia to one of the following treatments : fludarabine ( 25 mg per square meter of body-surface area , administered intravenously daily for 5 days every 28 days ) , chlorambucil ( 40 mg per square meter , given orally every 28 days ) , or fludarabine ( 20 mg per square meter per day for 5 days every 28 days ) plus chlorambucil ( 20 mg per square meter every 28 days ) . Patients with an additional response at each monthly evaluation continued to receive the assigned treatment for a maximum of 12 cycles . RESULTS Assignment of patients to the fludarabine-plus-chlorambucil group was stopped when a planned interim analysis revealed excessive toxicity and a response rate that was not better than the rate with fludarabine alone . Among the other two groups , the response rate was significantly higher for fludarabine alone than for chlorambucil alone . 
Among 170 patients treated with fludarabine , 20 percent had a complete remission , and 43 percent had a partial remission . The corresponding values for 181 patients treated with chlorambucil were 4 percent and 33 percent ( P < 0.001 for both comparisons ) . The median duration of remission and the median progression-free survival in the fludarabine group were 25 months and 20 months , respectively , whereas both values were 14 months in the chlorambucil group ( P<0.001 for both comparisons ) . The median overall survival among patients treated with fludarabine was 66 months , which was not significantly different from the overall survival in the other two groups ( 56 months with chlorambucil and 55 months with combined treatment ) . Severe infections and neutropenia were more frequent with fludarabine than with chlorambucil ( P=0.08 ) , although , overall , toxic effects were tolerable with the two single-drug regimens . CONCLUSIONS When used as the initial treatment for chronic lymphocytic leukemia , fludarabine yields higher response rates and a longer duration of remission and progression-free survival than chlorambucil PURPOSE We conducted a randomized trial to evaluate the efficacy and safety of intravenous alemtuzumab compared with chlorambucil in first-line treatment of chronic lymphocytic leukemia ( CLL ) . PATIENTS AND METHODS Patients received alemtuzumab ( 30 mg three times per week , for up to 12 weeks ) or chlorambucil ( 40 mg/m(2 ) every 28 days , for up to 12 months ) . The primary end point was progression-free survival ( PFS ) . Secondary end points included overall response rate ( ORR ) , complete response ( CR ) , time to alternative therapy , safety , and overall survival . RESULTS We randomly assigned 297 patients , 149 to alemtuzumab and 148 to chlorambucil . 
Alemtuzumab had superior PFS , with a 42 % reduction in risk of progression or death ( hazard ratio [ HR ] = 0.58 ; P = .0001 ) , and a median time to alternative treatment of 23.3 versus 14.7 months for chlorambucil ( HR = 0.54 ; P = .0001 ) . The ORR was 83 % with alemtuzumab ( 24 % CR ) versus 55 % with chlorambucil ( 2 % CR ) ; differences in ORR and CR were highly statistically significant ( P < .0001 ) . Elimination of minimal residual disease occurred in 11 of 36 complete responders to alemtuzumab versus none to chlorambucil . Adverse events profiles were similar , except for more infusion-related and cytomegalovirus ( CMV ) events with alemtuzumab and more nausea and vomiting with chlorambucil . CMV events had no apparent impact on efficacy . CONCLUSION As first-line treatment for patients with CLL , alemtuzumab demonstrated significantly improved PFS , time to alternative treatment , ORR and CR , and minimal residual disease-negative remissions compared with chlorambucil , with predictable and manageable toxicity Idelalisib is a first-in-class oral inhibitor of PI3Kδ that has shown substantial activity in patients with relapsed/refractory chronic lymphocytic leukemia ( CLL ) . To evaluate idelalisib as initial therapy , 64 treatment-naïve older patients with CLL or small lymphocytic leukemia ( median age , 71 years ; range , 65 - 90 ) were treated with rituximab 375 mg/m(2 ) weekly ×8 and idelalisib 150 mg twice daily continuously for 48 weeks . Patients completing 48 weeks without progression could continue to receive idelalisib on an extension study . The median time on treatment was 22.4 months ( range , 0.8 - 45.8 + ) . The overall response rate ( ORR ) was 97 % , including 19 % complete responses . The ORR was 100 % in patients with del(17p)/TP53 mutations and 97 % in those with unmutated IGHV . Progression-free survival was 83 % at 36 months . 
The most frequent ( > 30 % ) adverse events ( any grade ) were diarrhea ( including colitis ) ( 64 % ) , rash ( 58 % ) , pyrexia ( 42 % ) , nausea ( 38 % ) , chills ( 36 % ) , cough ( 33 % ) , and fatigue ( 31 % ) . Elevated alanine transaminase/aspartate transaminase was seen in 67 % of patients ( 23 % grade ≥3 ) . The combination of idelalisib and rituximab was highly active , resulting in durable disease control in treatment-naïve older patients with CLL . These results support the further development of idelalisib as initial treatment of CLL . This study is registered at ClinicalTrials.gov as # NCT01203930 Idelalisib is a small-molecule inhibitor of PI3Kδ with demonstrated efficacy for the treatment of relapsed/refractory chronic lymphocytic leukemia ( CLL ) . To evaluate idelalisib as front-line therapy , we enrolled 24 subjects in a phase 2 study consisting of 2 months of idelalisib monotherapy followed by 6 months of combination therapy with idelalisib and the anti-CD20 antibody ofatumumab . After a median follow-up period of 14.7 months , hepatotoxicity was found to be a frequent and often severe adverse event . A total of 19 subjects ( 79 % ) experienced either grade ≥1 ALT or AST elevation during the study , and 13 subjects ( 54 % ) experienced grade ≥3 transaminitis . The median time to development of transaminitis was 28 days , occurring before ofatumumab introduction . Younger age and mutated immunoglobulin heavy chain status were significant risk factors for the development of hepatotoxicity . Multiple lines of evidence suggest that this hepatotoxicity was immune mediated . A lymphocytic infiltrate was seen on liver biopsy specimens taken from 2 subjects with transaminitis , and levels of the proinflammatory cytokines CCL-3 and CCL-4 were higher in subjects experiencing hepatotoxicity . 
All cases of transaminitis resolved either by holding the drug , initiating immunosuppressants , or both , and rates of recurrent toxicity were lower in patients taking steroids when idelalisib was reinitiated . A decrease in peripheral blood regulatory T cells was seen in patients experiencing toxicity on therapy , which is consistent with an immune-mediated mechanism . These results suggest that caution should be taken as drugs within this class are developed for CLL , particularly in younger patients who have not received prior disease-specific therapy . This study was registered at www.clinicaltrials.gov as # NCT02135133
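Several of the trial abstracts above report both fixed-timepoint survival proportions and a hazard ratio (e.g. 65 % vs 45 % progression-free at 3 years, reported HR 0·56). These two figures can be cross-checked: under a proportional-hazards assumption, S_treat = S_control ** HR, so HR ≈ ln(S_treat) / ln(S_control). The sketch below is a reader-side sanity check, not part of any source paper; the function name is illustrative.

```python
import math

def implied_hazard_ratio(surv_treat, surv_control):
    """Hazard ratio implied by two same-timepoint survival fractions,
    assuming proportional hazards (S_treat = S_control ** HR)."""
    return math.log(surv_treat) / math.log(surv_control)

# 3-year progression-free proportions reported for FCR vs FC above
hr = implied_hazard_ratio(0.65, 0.45)
print(round(hr, 2))  # prints 0.54, close to the reported HR of 0.56
```

The small gap between 0.54 and the published 0.56 is expected: the trial's HR comes from a Cox model over the full follow-up, not from a single timepoint.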
2,052
25,828,851
The most accurate results were achieved with two configurations : ( 1 ) the optical intraoral system with powder and ( 2 ) the open technique with splinted squared transfer copings , using polyether as impression material
BACKGROUND Several studies link the seamless fit of implant-supported prosthesis with the accuracy of the dental impression technique obtained during acquisition . In addition , factors such as implant angulation and coping shape contribute to implant misfit . PURPOSE The aim of this study was to identify the most accurate impression technique and factors affecting the impression accuracy .
The aim of this research was to compare the accuracy outcomes of open- and closed-tray implant impressions for partially edentulous patients . Eleven partially edentulous spaces in seven patients with two existing implants for fixed partial dentures were included . Group I ( closed-tray ) and group II ( open-tray ) were compared using microcomputed tomography scanning . No statistically significant differences were found between the closed- and open-tray techniques ( P = .317 ) . The subjective evaluation of patient comfort showed no differences with either impression technique . There were no differences seen between open- and closed-tray impression techniques in partially edentulous patients when implants had less than 10 degrees of angulation
2,053
26,106,751
AUTHORS ' CONCLUSIONS CBT plus taper is effective in the short term ( three month time period ) in reducing BZD use . However , this is not sustained at six months and subsequently . Currently there is insufficient evidence to support the use of MI to reduce BZD use . There is emerging evidence to suggest that a tailored GP letter versus a generic GP letter , a standardised interview versus TAU , and relaxation versus TAU could be effective for BZD reduction . There is currently insufficient evidence for other approaches to reduce BZD use
BACKGROUND Benzodiazepines ( BZDs ) have a sedative and hypnotic effect upon people . Short term use can be beneficial but long term BZD use is common , with several risks in addition to the potential for dependence in both opiate and non-opiate dependent patients . OBJECTIVES To evaluate the effectiveness of psychosocial interventions for treating BZD harmful use , abuse or dependence compared to pharmacological interventions , no intervention , placebo or a different psychosocial intervention on reducing the use of BZDs in opiate dependent and non-opiate dependent groups .
Disturbed sleep is a common problem , particularly among elderly people , and is usually treated with hypnotics . The side effects of longterm administration of hypnotic drugs are well known , but despite this there remains a substantial population of chronic users . These people can be helped to reduce their dependence on hypnotics through psychological techniques . A group of longterm users treated in this manner were shown to reduce their intake of hypnotics significantly more than a group of users who did not receive any psychological treatment . Furthermore , the treated patients did not experience any deterioration in their sleep patterns , and their subjective refreshment from sleep improved significantly . For the patient with sleep problems , psychological techniques are preferable to the longterm use of hypnotics both as a weaning-off agent and as an alternative to drugs We described characteristics of subjects with benzodiazepine dependence that was typically complicated by harmful and hazardous alcohol use or high benzodiazepine doses , and assessed predictors of successful discontinuation of benzodiazepines for this group . Seventy-six patients who participated in a randomized clinical trial of two different gradual withdrawal treatment approaches were assessed . The trial was conducted between February 1995 and July 1999 . The mean age ±SD of subjects was 40.0±9.6 years , 55 % were male , 38 % were married or cohabiting , and 70 % had received more than nine years of education . The median benzodiazepine dose was 35 mg/day ( range 2.5–180 ) in diazepam equivalents . The median duration of benzodiazepine use was 84 ( range 8–360 ) months . Subjects with lower benzodiazepine doses and no previous withdrawal attempts were more successful at benzodiazepine discontinuation . Cluster B personality/borderline personality disorder was associated with an inability to stop benzodiazepine use and with “ dropping out ” of treatment . 
Alcohol use – related disorders or other psychiatric diagnoses were not associated with outcome . Further studies on predictors of successful benzodiazepine discontinuation in different populations are required . Patients manifesting cluster B personality/borderline personality disorder and benzodiazepine dependence may need concomitant treatment for their personality disorders to benefit from benzodiazepine discontinuation treatment Background The prevalence of substance use in people acutely admitted to in-patient psychiatric wards is high and the patients ' duration of stay is limited . Motivational interviewing is a method with evidence based effect in short interventions . The aims of the present study were to compare the effects of 2 sessions of motivational interviewing and treatment as usual ( intervention group ) with treatment as usual only ( control group ) on adult patients with comorbid substance use admitted to a psychiatric in-patient emergency unit . Methods This was an open randomised controlled trial including 135 patients where substance use influenced the admittance . After admission and assessments , the patients were allocated to the intervention group ( n = 67 ) or the control group ( n = 68 ) . The primary outcome was self-reported days per month of substance use during the last 3 months at 3 , 6 , 12 and 24 months after inclusion . Data was analysed with a multilevel linear repeated measures regression model . Results Both groups reduced substance use during the first 12 months with no substantial difference between the 2 groups . At 2 year follow-up , the control group had increased their substance use with 2.4 days ( 95 % confidence interval ( CI ) –1.5 to 6.3 ) , whereas the intervention group had reduced their monthly substance use with 4.9 days ( 95 % CI 1.2 to 8.6 ) compared to baseline . The 2 year net difference was 7.3 days of substance use per month ( 95 % CI 1.9 to 12.6 , p < 0.01 ) in favour of the intervention group . 
Conclusions The present study suggests that 2 sessions of motivational interviewing to patients with comorbid substance use admitted to a psychiatric emergency unit reduce substance use frequency substantially at 2 year follow-up . Trial registration ClinicalTrials.gov Identifier : Eighty-six participants wishing to stop benzodiazepine and who met DSM-IV ( Diagnostic and Statistical Manual of Mental Disorders , 4th ed . American Psychological Association , 1994 ) criteria for anxiety disorder or insomnia were assessed pre- and post-taper on clinical , pharmacological and psychosocial measures . An initial cohort of 41 participants received treatment as usual ( taper only ) plus physician counselling in the same clinic setting . A second cohort of 45 participants were randomly allocated to group cognitive-behavioural therapy ( CBT ) plus taper , or group support ( GS ) plus taper . At 3 months follow-up , the outcomes in both the CBT and the GS subgroups were equivalent . Intention to treat analysis revealed a slight advantage to the CBT over the GS group and the CBT group showed higher self-efficacy post-taper . Over all 86 participants , a high-baseline level of psychological distress , anxiety and dosage predicted a poor outcome , but increase in self-efficacy contributed to a successful outcome particularly in those with initially poor baseline predictors . Although there was a decrease in positive affect during preliminary stages of tapered discontinuation compared to baseline , there was no significant overall increase in negative affect Objective To evaluate the association between use of benzodiazepines and incident dementia . Design Prospective , population based study . Setting PAQUID study , France . Participants 1063 men and women ( mean age 78.2 years ) who were free of dementia and did not start taking benzodiazepines until at least the third year of follow-up . Main outcome measures Incident dementia , confirmed by a neurologist . 
Results During a 15 year follow-up , 253 incident cases of dementia were confirmed . New use of benzodiazepines was associated with an increased risk of dementia ( multivariable adjusted hazard ratio 1.60 , 95 % confidence interval 1.08 to 2.38 ) . Sensitivity analysis considering the existence of depressive symptoms showed a similar association ( hazard ratio 1.62 , 1.08 to 2.43 ) . A secondary analysis pooled cohorts of participants who started benzodiazepines during follow-up and evaluated the association with incident dementia . The pooled hazard ratio across the five cohorts of new benzodiazepine users was 1.46 ( 1.10 to 1.94 ) . Results of a complementary nested case-control study showed that ever use of benzodiazepines was associated with an approximately 50 % increase in the risk of dementia ( adjusted odds ratio 1.55 , 1.24 to 1.95 ) compared with never users . The results were similar in past users ( odds ratio 1.56 , 1.23 to 1.98 ) and recent users ( 1.48 , 0.83 to 2.63 ) but reached significance only for past users . Conclusions In this prospective population based study , new use of benzodiazepines was associated with increased risk of dementia . The result was robust in pooled analyses across cohorts of new users of benzodiazepines throughout the study and in a complementary case-control study . Considering the extent to which benzodiazepines are prescribed and the number of potential adverse effects of this drug class in the general population , indiscriminate widespread use should be cautioned against OBJECTIVE This study tested cognitive behavior therapy ( CBT ) in hypnotic-dependent , late middle-age and older adults with insomnia . METHOD Seventy volunteers age 50 and older were randomized to CBT plus drug withdrawal , placebo biofeedback ( PL ) plus drug withdrawal , or drug withdrawal ( MED ) only . The CBT and PL groups received eight , 45 min weekly treatment sessions . 
The drug withdrawal protocol comprised slow tapering monitored with about six biweekly , 30 min sessions . Assessment including polysomnography ( PSG ) , sleep diaries , hypnotic consumption , daytime functioning questionnaires , and drug screens collected at baseline , posttreatment , and 1-year follow-up . RESULTS Only the CBT group showed significant sleep diary improvement , sleep onset latency significantly decreased at posttreatment . For all sleep diary measures for all groups , including MED , sleep trended to improvement from baseline to follow-up . Most PSG sleep variables did not significantly change . There were no significant between group differences in medication reduction . Compared to baseline , the three groups decreased hypnotic use at posttreatment , down 84 % , and follow-up , down 66 % . There was no evidence of withdrawal side-effects . Daytime functioning , including anxiety and depression , improved by posttreatment . Rigorous methodological features , including documentation of strong treatment implementation and the presence of a credible placebo , elevated the confidence due these findings . CONCLUSIONS Gradual drug withdrawal was associated with substantial hypnotic reduction at posttreatment and follow-up , and withdrawal side-effects were absent . When supplemented with CBT , participants accrued incremental self-reported , but not PSG , sleep benefits AIMS Chronic benzodiazepine use is highly prevalent and is associated with a variety of negative health consequences . The present study examined the long-term effectiveness of a tailored patient education intervention on benzodiazepine use . DESIGN A randomized controlled trial was conducted comprising three arms , comparing ( i ) a single tailored intervention ; ( ii ) a multiple tailored intervention and ( iii ) a general practitioner letter . The post-test took place after 12 months . 
PARTICIPANTS Five hundred and eight patients using benzodiazepines were recruited by their general practitioners and assigned randomly to one of the three groups . INTERVENTION Two tailored interventions , the single tailored intervention ( patients received one tailored letter ) and the multiple tailored intervention ( patients received three sequential tailored letters at intervals of 1 month ) , were compared to a short general practitioner letter that modelled usual care . The tailored interventions not only provided different and more information than the general practitioner letter ; they were also personalized and adapted to individual baseline characteristics . The information in both tailored interventions was the same , but in the multiple tailored intervention the information was provided to the participants spread over three occasions . In the multiple tailored intervention , the second and the third tailored letters were based on short and standardized telephone interviews . MEASUREMENTS Benzodiazepine cessation at post-test was the outcome measure . FINDINGS The results showed that participants receiving the tailored interventions were twice as likely to have quit benzodiazepine use compared to the general practitioner letter . Particularly among participants with the intention to discontinue usage at baseline , both tailored interventions led to high percentages of those who actually discontinued usage ( single tailored intervention 51.7 % ; multiple tailored intervention 35.6 % ; general practitioner letter 14.5 % ) . CONCLUSIONS It was concluded that tailored patient education can be an effective tool for reducing benzodiazepine use , and can be implemented easily This study examined contingent methadone take-home privileges for effectiveness in reducing on-going supplemental drug use of methadone maintenance patients . 
Fifty-three new intakes were randomly assigned to begin receiving take-home privileges after 2 consecutive weeks of drug-free urines or to a noncontingent procedure in which take-homes were delivered independently of urine test results . The contingent procedure produced more individuals with at least 4 consecutive weeks of abstinence ( 32 % vs. 8 % ) ; 28 % of noncontingent subjects also achieved abstinence after shifting to the contingent procedure . Lower baseline rate of drug-free urines was strongly associated with successful outcome , whereas the type of drug abused ( cocaine vs. benzodiazepines ) did not influence outcomes . Findings support a recommendation for using contingent take-home incentives to motivate abstinence during methadone maintenance treatment This study evaluated the specific effectiveness of cognitive-behavior therapy ( CBT ) combined with medication tapering for benzodiazepine discontinuation among generalized anxiety disorder ( GAD ) patients by using a nonspecific therapy control group . Sixty-one patients who had used benzodiazepines for more than 12 months were randomly assigned to the experimental conditions . Nearly 75 % of patients in the CBT condition completely ceased benzodiazepine intake , as compared with 37 % in the control condition . Results of the 3- , 6- , and 12-month follow-ups confirmed the maintenance of complete cessation . Discontinuation rates remained twice as high in the CBT condition . The number of patients who no longer met GAD criteria was also greater in the CBT condition . The addition of specific CBT components thus seemed to facilitate benzodiazepine tapering among patients with GAD BACKGROUND Benzodiazepine withdrawal programmes have never been experimentally compared with a nonintervention control condition . 
AIMS To evaluate the efficacy and feasibility of tapering off long-term benzodiazepine use in general practice , and to evaluate the value of additional group cognitive-behavioural therapy ( CBT ) . METHOD A 3-month randomised , 3-month controlled trial was conducted in which 180 people attempting to discontinue long-term benzodiazepine use were assigned to tapering off plus group CBT , tapering off alone or usual care . RESULTS Tapering off led to a significantly higher proportion of successful discontinuations than usual care ( 62 % v. 21 % ) . Adding group CBT did not increase the success rate ( 58 % v. 62 % ) . Neither successful discontinuation nor intervention type affected psychological functioning . Both tapering strategies showed good feasibility in general practice . CONCLUSIONS Tapering off is a feasible and effective way of discontinuing long-term benzodiazepine use in general practice . The addition of group CBT is of limited value Despite recent emphasis on integrating empirically validated treatment into clinical practice , there are little data on whether manual-guided behavioral therapies can be implemented in standard clinical practice and whether incorporation of such techniques is associated with improved outcomes . The effectiveness of integrating motivational interviewing ( MI ) techniques into the initial contact and evaluation session was evaluated in a multisite randomized clinical trial . Participants were 423 substance users entering outpatient treatment in five community-based treatment settings , who were randomized to receive either the standard intake/evaluation session at each site or the same session in which MI techniques and strategies were integrated . Clinicians were drawn from the staff of the participating programs and were randomized either to learn and implement MI or to deliver the standard intake/evaluation session . 
Independent analyses of 315 session audiotapes suggested the two forms of treatment were highly discriminable and that clinicians trained to implement MI tended to have higher skill ratings . Regarding outcomes , for the sample as a whole , participants assigned to MI had significantly better retention through the 28-day follow-up than those assigned to the standard intervention . There were no significant effects of MI on substance use outcomes at either the 28-day or 84-day follow-up . Results suggest that community-based clinicians can effectively implement MI when provided training and supervision , and that integrating MI techniques in the earliest phases of treatment may have positive effects on retention early in the course of treatment AIMS Dependence on or problematic use of prescription drugs ( PD ) is estimated to be between 1 and 2 % in the general population . In contrast , the proportion of substance-specific treatment in PD use disorders at 0.5 % is comparatively low . With an estimated prevalence of 4.7 % , PD-specific disorders are widespread in general hospitals compared to the general population . Brief intervention delivered in general hospitals might be useful to promote discontinuation or reduction of problematic prescription drug use . DESIGN A randomized , controlled clinical trial . SETTING Internal , surgical and gynaecological wards of a general and a university hospital . PARTICIPANTS One hundred and twenty-six patients fulfilling criteria for either regular use of PD ( more than 60 days within the last 3 months ) or dependence on or abuse of PD , respectively , were allocated randomly to two conditions . INTERVENTION Subjects received two counselling sessions based on Motivational Interviewing plus an individualized written feedback ( intervention group , IG ) or a booklet on health behaviour ( control group , CG ) . 
MEASUREMENTS The outcome was measured as reduction ( > 25 % ) and discontinuation of PD intake in terms of defined daily dosages ( DDD ) . FINDINGS After 3 months , more participants in the IG reduced their DDD compared to the participants in the CG ( 51.8 % versus 30 % ; chi(2 ) = 6.17 ; P = 0.017 ) . In the IG 17.9 % , in the CG 8.6 % discontinued use of PD ( chi(2 ) = 2.42 ; P = 0.17 ) . Conclusions Brief intervention based on Motivational Interviewing is effective in reducing PD intake in non-treatment-seeking patients About two-thirds of long-term users of benzodiazepines in the population are able to discontinue this drug with the aid of supervised programmes for tapering off . Little is known about the long-term outcome of such programmes , and they have never been compared with usual care . After a 15-month follow-up of a randomised controlled trial comparing such a programme with and without psychotherapy with usual care , we found significantly higher longitudinal abstinence rates in long-term benzodiazepine users who received a benzodiazepine tapering-off programme without psychotherapy ( 25 out of 69 , 36 % ) compared with those who received usual care ( 5 out of 33 , 15 % ; P=0.03 ) BACKGROUND evidence about possibilities to help older persons to withdraw the long-term use of benzodiazepines ( BZD ) is scarce . Effective and practicable methods are needed . OBJECTIVE the study aimed to assess the persistence of one-time counselling by a geriatrician to reduce psychotropic drugs , especially BZD and related drugs ( RD ) . DESIGN a prospective randomised controlled trial with a 12-month follow-up was conducted . SUBJECTS five hundred ninety-one community-dwelling people aged 65 or older participated in the study . METHODS instructions to withdraw , reduce or change psychotropic drugs were given to the intervention group . A 1-h lecture about these drugs and their adverse effects was given later on . 
No changes in the drug therapy were suggested for the controls. RESULTS The number of regular users of BZD and RD decreased by 35% (12/34) (odds ratio (OR) = 0.61, 95% confidence interval (95% CI) 0.44-0.86) in the intervention group, while it increased by 4% (2/46) (OR = 1.05, 95% CI 0.81-1.36) in the controls (P = 0.012). No significant changes in the users of other types of psychotropics were found. CONCLUSION One-time counselling on psychotropics and other fall-risk-increasing drugs by a geriatrician, followed by a 1-h lecture about the adverse effects of these drugs, had positive effects in decreasing the number of regular users of BZD and RD, and these effects persisted for the total 12-month intervention period. This study aimed to assess the efficacy of a minimal intervention focusing on hypnotic discontinuation and cognitive-behavioral treatment (CBT) for insomnia. Fifty-three adult chronic users of hypnotics were randomly assigned to an 8-week hypnotic taper program, used alone or combined with a self-help CBT. Weekly hypnotic use decreased in both conditions, from nearly nightly use at baseline to less than once a week at posttreatment. Nightly dosage (in lorazepam equivalent) decreased from 1.67 mg to 0.12 mg. Participants who received CBT improved their sleep efficiency by 8%, whereas those who did not remained stable. Total wake time decreased by 52 min among CBT participants and increased by 13 min among those receiving the taper schedule alone. Total sleep time remained stable throughout withdrawal in both the CBT and taper conditions. The present findings suggest that a systematic withdrawal schedule might be sufficient in helping chronic users stop their hypnotic medication.
The addition of a self-help treatment focusing on insomnia, a readily available and cost-effective alternative to individual psychotherapy, produced greater sleep improvement. Illicit drug users undergoing mandatory reductions in prescribed diazepam were randomly allocated to one of two methods of delivering psychological support to help reduce their prescription: (a) an enhanced intervention consisting of skills training and reinforcement, and (b) a limited intervention where patients initially received skills training and thereafter only advice. Outcome measures at baseline and six months consisted of daily diazepam dose; reported illicit drug use; Severity of Dependence Scale; Hospital Anxiety and Depression Scale (HADS); and Pittsburgh Sleep Quality Index. Fifty-three of 119 eligible patients agreed to be randomly allocated to the interventions. Those in the enhanced intervention reduced their daily dose of prescribed diazepam from a mean of 27.8 mg to 19.9 mg at six months (5.3% per month), compared with 29.8 mg to 17.6 mg at six months (7.5%) among those in the limited intervention group. However, there was no statistically significant difference in the reduction rate between the intervention groups. Approximately 75% of patients in each group suspended their reduction programme. The enhanced intervention group reported a statistically and clinically greater reduction in the mean HADS depression score (10.6 at baseline and 7.7 at follow-up), compared with a rise from 8.9 to 11.2 in the limited intervention group. In conclusion, it is possible to reduce prescribed diazepam among illicit drug users, but not at the rate of 10% per month set by the study.
The difficulties of working with this population necessitate a flexible and possibly long-term approach to reducing prescribed benzodiazepines. AIMS To evaluate whether gradual benzodiazepine taper combined with cognitive-behavioural treatment is more effective than standard treatment for patients with dependence in out-patient clinics. DESIGN A randomized, controlled clinical trial, using standard questionnaires and serum and urine tests. SETTINGS Four public-sector out-patient clinics for alcohol and drug abusers in Helsinki. PARTICIPANTS Seventy-six patients with benzodiazepine dependence (DSM-III-R). Patients taking high doses of the drug or with alcohol use disorders were included to obtain a subject group representative of usual clinical practice. INTERVENTION Subjects received gradual benzodiazepine taper combined with cognitive-behavioural therapy (experimental group) or standard withdrawal treatment not scheduled by the researchers (control group). MEASUREMENTS The outcome was measured in terms of attaining a state of abstinence or by a decrease in the dosage during the study period of up to 12 months' duration. FINDINGS No statistically significant differences in the outcomes were observed between the groups. A total of 13% of the experimental group and 27% of the control group were able to discontinue drug use. In addition, 67% of the experimental group and 57% of the control group were able to decrease the dose. CONCLUSIONS The search continues for improved methods of helping patients with complicated benzodiazepine dependence. OBJECTIVE Craving for benzodiazepines has never been examined as a factor in relapse after successful benzodiazepine discontinuation. In this study, we examined the predictive value of craving on benzodiazepine relapse. METHOD A stepped-care intervention trial aimed to discontinue long-term benzodiazepine use in general practice.
The first step was the sending of a letter to users advising them to gradually quit their use by themselves (i.e., minimal intervention). The second step, a supervised tapering-off program, was offered to those unable to discontinue by themselves. Craving was assessed by means of the Benzodiazepine Craving Questionnaire (BCQ). Multiple Cox proportional hazards regression analyses were performed to examine the effect of craving on subsequent relapse during a 15-month follow-up period in patients who had successfully quit their benzodiazepine use by themselves after the minimal intervention (N = 79) and in those patients who had successfully quit after the supervised tapering-off program (N = 45). Data were collected from August 1998 to December 2001. RESULTS Thirty-five (44%) and 24 (53%) patients had relapsed after the minimal intervention and tapering-off program, respectively. Patients able to quit by themselves experienced very little craving. In this sample, craving was not related to relapse (p = .82). In patients who needed an additional supervised tapering-off program, higher craving scores were significantly related to relapse (hazard ratio = 1.26, 95% CI = 1.02 to 1.54, p = .029), when corrected for benzodiazepine characteristics, psychopathology, and personality characteristics. CONCLUSION Craving is an independent factor in subsequent relapse after successful benzodiazepine discontinuation in long-term benzodiazepine users who are not able to quit their usage of their own accord. Despite their acute efficacy for the treatment of panic disorder, benzodiazepines (BZs) are associated with a withdrawal syndrome that closely mimics anxiety sensations, leading to difficulty with treatment discontinuation and often to disorder relapse.
An exposure-based cognitive-behavioral treatment for BZ discontinuation, Panic Control Treatment for BZ Discontinuation (CBT), targets the fear of these sensations and has demonstrated efficacy in preventing disorder relapse and facilitating successful BZ discontinuation among patients with panic disorder. In this randomized controlled trial, CBT was compared to taper alone and to a taper-plus-relaxation condition, to control for the effect of therapist contact and support, among 47 patients with panic disorder seeking taper from BZs. Based on the primary outcome of successful discontinuation of BZ use, results indicate that adjunctive CBT provided additive benefits above both taper alone and taper plus relaxation, with consistently medium and large effect sizes over time that reached significance at the six-month follow-up evaluation. The efficacy of CBT relative to either of the other taper conditions reflected very large and significant effect sizes at that time. These findings suggest that CBT provides specific efficacy for successful discontinuation from BZs, even when controlling for therapist contact and relaxation training. STUDY OBJECTIVES Insomnia is a common problem that affects 9% to 15% of the population chronically. The primary objective of this study was to demonstrate that 8 weekly sessions of sleep restriction therapy for insomnia, combined with hypnotic reduction instructions following a single session of sleep hygiene education, would result in greater improvements in sleep and hypnotic use than sleep hygiene education alone. METHODS Forty-six men and women were recruited from a sleep medicine practice and randomly assigned to sleep hygiene education plus 8 weeks of sleep restriction and hypnotic withdrawal (SR+HW; n = 24), or to a sleep hygiene education alone (SHE; n = 22) condition. Pre-randomization, all patients received a single session of instruction in good sleep habits (sleep hygiene education).
RESULTS The SR+HW condition had greater improvements in hypnotic medication usage, sleep onset latency, morning wake time, sleep efficiency, and wake time after sleep onset (trend) than the SHE condition. Continued improvement was seen in TST in the SR+HW group at 6-month follow-up, and gains on all other variables were maintained at 6- and 12-month follow-up. CONCLUSIONS These results provide evidence that more intensive treatment of insomnia (i.e., 8 sessions of SR+HW plus hypnotic withdrawal instructions) results in better outcomes than SHE alone. BACKGROUND The existing literature does not address the question of whether cognitive-behavioral therapy would have an impact on insomnia in older adults who are chronic users of sleep medication and have current insomnia, but are also stable in their quantity of medication usage during treatment. The present report seeks to answer this question. METHODS Hypnotic-dependent older adults, who were stable in their amount of medication usage and still met the criteria for chronic insomnia put forth by the American Academy of Sleep Medicine, were treated using a cognitive-behavioral intervention for insomnia. The three-component treatment included relaxation training, stimulus control, and sleep hygiene instructions. Participants were randomly assigned to either the active treatment group or a comparably credible placebo control group, and were instructed not to alter their pattern of hypnotic consumption during treatment. RESULTS The active treatment group had significantly better self-report measures of sleep at post-treatment. Statistically significant improvement was paralleled by clinically meaningful improvement for key sleep variables. As planned, there was no significant change in sleep medication usage from pre- to post-treatment.
CONCLUSIONS The findings support the use of cognitive-behavioral therapy for insomnia in hypnotic-dependent older adults. The present research evaluated patients from 2 previous studies (1 conducted in Peoria, the other at Dartmouth) during a 2- to 5-year posttreatment period. Results showed that 75% of the Peoria sample and 76% of the Dartmouth sample were able to discontinue alprazolam therapy, remain abstinent of any type of treatment for panic disorder, and maintain their acute-treatment clinical gains over this follow-up period. The degree to which patients' anxiety sensitivity declined during treatment predicted relapse versus survival during the first 6 months of follow-up, when most relapses occurred. Implications of these findings for benzodiazepine discontinuation, combined pharmacotherapy and psychotherapy, and relapse prevention in panic disorder are discussed. The relationship between pre-treatment illicit benzodiazepine use (days of use in the last 30) assessed on the Addiction Severity Index (ASI) and treatment outcome was investigated by retrospective analysis of data from two controlled clinical trials in 361 methadone-maintained cocaine/opiate users randomly assigned to 12-week voucher- or prize-based contingency management (CM) or control interventions. Based on the screening ASI, participants were identified as non-users (BZD-N; 0 days of use) or users (BZD-U; > 0 days of use). Outcome measures were: urine drug screens (thrice weekly); quality of life and self-reported HIV-risk behaviors (every 2 weeks); and current DSM-IV diagnosis of cocaine and heroin dependence (study exit). In the CM group, BZD-U had significantly worse outcomes on in-treatment cocaine use, quality-of-life scores, needle-sharing behaviors, and current heroin dependence diagnoses at study exit compared to BZD-N.
In the control group, BZD-U had significantly higher in-treatment cocaine use but did not differ from BZD-N on psychosocial measures. Thus, in a sample of non-dependent BZD users, self-reported illicit BZD use on the ASI, even at low levels, predicted worse outcome on cocaine use and blunted response to CM. OBJECTIVE The authors investigated whether cognitive behavioral treatment could facilitate discontinuation of alprazolam therapy and maintenance of drug abstinence among panic disorder patients treated with alprazolam doses sufficient to suppress spontaneous panic attacks. METHOD Twenty-one outpatients who met DSM-III-R criteria for panic disorder with mild to severe agoraphobia were made panic-free with alprazolam (mean dose = 2.2 mg/day) and were then randomly assigned to receive either supportive drug maintenance and slow, flexible drug taper, or an identical medication treatment plus 12 weeks of concurrent, individual cognitive behavioral treatment. Taper in the combined treatment group was sequenced to conclude before cognitive behavioral treatment ended. RESULTS Twenty subjects completed the study. There was no significant difference between groups in the rate of alprazolam discontinuation (80% and 90%, respectively, in the alprazolam-only group and the combined treatment group). However, during the 6-month follow-up period, half of the subjects who discontinued alprazolam without cognitive behavior therapy, but none of those who were given cognitive behavior therapy, relapsed and resumed alprazolam treatment. CONCLUSIONS Cognitive behavioral treatment administered in parallel with alprazolam maintenance and taper was effective in preventing relapse after drug discontinuation.
The results warrant further research on the thoughtful integration of these two therapeutic modalities. A controlled trial was conducted evaluating cognitive-behavioural group psychotherapy as a measure to reduce concomitant drug use in methadone maintenance treatment (MMT). Seventy-three opiate addicts were randomly assigned to local routine MMT or to routine MMT plus group psychotherapy (20 sessions over 20 weeks). Psychotherapy was delivered by therapists according to a manual. Drug use (urine screen) was compared at onset of psychotherapy, at the end of the intervention period (6 months after study onset), and 6 months later. Data analysis was done according to intention-to-treat principles. Results indicated that patients in the psychotherapy group (n = 41) showed less drug use than control subjects (n = 32). This group difference was statistically significant at 6-month follow-up (p = 0.02). These findings underscore the usefulness of group psychotherapy in MMT. The delayed effect is comparable to other studies evaluating cognitive-behavioural psychotherapy. AIMS The present study sought to replicate and extend a small pilot study conducted by Baker, Boggs & Lewin (2001), which demonstrated that brief interventions consisting of motivational interviewing and cognitive-behaviour therapy (CBT) were feasible and associated with better outcomes compared with a control condition. DESIGN Randomized controlled trial (RCT). SETTING Greater Brisbane Region of Queensland and Newcastle, NSW, Australia. PARTICIPANTS The study was conducted among 214 regular amphetamine users. MEASUREMENTS Demographic characteristics, past and present alcohol and other drug use and mental health, treatment, amphetamine-related harms and severity of dependence. FINDINGS The main finding of this study was a significant increase in the likelihood of abstinence from amphetamines among those receiving two or more treatment sessions.
In addition, the number of treatment sessions attended had a significant short-term beneficial effect on level of depression. There were no intervention effects on any other variables (HIV risk-taking, crime, social functioning and health). Overall, there was a marked reduction in amphetamine use among this sample over time and, apart from abstinence rates and short-term effects on depression level, this was not differential by treatment group. Reduction in amphetamine use was accompanied by significant improvements in stage of change, benzodiazepine use, tobacco smoking, polydrug use, injecting risk-taking behaviour, criminal activity level, and psychiatric distress and depression level. CONCLUSIONS A stepped-care approach is recommended. The first step in providing an effective intervention among many regular amphetamine users, particularly those attending non-treatment settings, may include provision of: a structured assessment of amphetamine use and related problems; self-help material; and regular monitoring of amphetamine use and related harms. Regular amphetamine users who present to treatment settings could be offered two sessions of CBT, while people with moderate to severe levels of depression may best be offered four sessions of CBT for amphetamine use from the outset, with further treatment for amphetamine use and/or depression depending on response. Pharmacotherapy and/or longer-term psychotherapy may be suitable for non-responders. An RCT of a stepped-care approach among regular amphetamine users is suggested. OBJECTIVE The primary disadvantage of high-potency benzodiazepine treatment for panic disorder is the difficulty of discontinuing the treatment. During treatment discontinuation, new symptoms may emerge and anxiety may return, preventing many patients from successfully discontinuing their treatment.
In this controlled, randomized trial the authors investigated the efficacy of a cognitive-behavioral program for patients with panic disorder who were attempting to discontinue treatment with high-potency benzodiazepines. METHOD Outpatients treated for panic disorder with alprazolam or clonazepam for a minimum of 6 months and expressing a desire to stop taking the medication (N = 33) were randomly assigned to one of two taper conditions: a slow taper condition alone, or a slow taper condition in conjunction with 10 weeks of group cognitive-behavioral therapy. RESULTS The rate of successful discontinuation of benzodiazepine treatment was significantly higher for the patients receiving the cognitive-behavioral program (13 of 17; 76%) than for the patients receiving the slow taper program alone (four of 16; 25%). There was no difference in the likelihood of discontinuation success between the patients treated with alprazolam and those who received clonazepam. At the 3-month follow-up evaluation, 77% of the patients in the cognitive-behavioral program who successfully discontinued benzodiazepine treatment remained benzodiazepine-free. CONCLUSIONS These findings support the efficacy of cognitive-behavioral interventions in aiding benzodiazepine discontinuation for patients with panic disorder. Treatment outcomes were compared for an intervention emphasizing reinforcement for abstinence from illicit drug use and an alternative intervention which combined the same reinforcement contingencies with aversive consequences for unauthorized drug use. Sixteen polydrug-abusing methadone maintenance patients were randomly assigned to one of two treatment groups. Both groups received the opportunity to take methadone doses away from the clinic (take-home dose) as reinforcement for submitting urines testing negative for illicit substances.
A regular weekly take-home dose of methadone could be earned for every 2 weeks of verified abstinence from unauthorized drugs, up to a maximum of three take-home doses per week. The combined intervention group had an additional contingency which involved a reduction in methadone dose as an aversive consequence for submitting urine samples testing positive for illicit substances. Specifically, 10% of the patient's daily methadone dose was lost for each week in which two of three urines tested positive for illicit drugs. An examination of the urinalysis results indicated no between-group differences. Overall, 8% of the 12-week baseline urinalysis results tested negative for illicit substances, while 42% tested negative for unauthorized substances during the 20 weeks of treatment intervention. At the end of the intervention period, nine subjects remained in treatment, with three patients in each group receiving at least once-weekly take-home privileges. Of the seven subjects no longer in treatment, five were in the combined intervention group. (ABSTRACT TRUNCATED AT 250 WORDS) The United Kingdom Alcohol Treatment Trial (UKATT) is intended to be the largest trial of treatment for alcohol problems ever conducted in the UK. UKATT is a multicentre, randomized, controlled trial with blind assessment, representing a collaboration between psychiatry, clinical psychology, biostatistics, and health economics. This article sets out, in advance of data analysis, the theoretical background of the trial and its hypotheses, design, and methods. A projected total of 720 clients attending specialist services for treatment of alcohol problems will be randomized to Motivational Enhancement Therapy (MET) or to Social Behaviour and Network Therapy (SBNT), a novel treatment developed for the trial with strong support from theory and research.
The trial will test two main hypotheses, expressed in null form as: (1) less intensive, motivationally based treatment (MET) is as effective as more intensive, socially based treatment (SBNT); (2) more intensive, socially based treatment (SBNT) is as cost-effective as less intensive, motivationally based treatment (MET). A number of subsidiary hypotheses regarding client-treatment interactions and therapist effects will also be tested. The article describes general features of the trial that investigators considered desirable, namely that it should: (1) be a pragmatic, rather than an explanatory, trial; (2) be an effectiveness trial based on "real-world" conditions of treatment delivery; (3) incorporate high standards of training, supervision and quality control of treatment delivery; (4) pay close attention to treatment process as well as treatment outcome; (5) build economic evaluation into the design at the outset. First results from UKATT are expected in 2002 and the main results in 2003. This study examined the efficacy of a urinalysis-based contingency management program for preventing relapse to abused drugs following a brief residential detoxification. Fourteen methadone maintenance patients who were chronic benzodiazepine users were enrolled in a 7-day inpatient benzodiazepine detoxification and randomly assigned to receive Contingency Management (N = 7) or Standard Care (N = 7) therapy upon return to outpatient methadone treatment. In the Contingency Management condition, a methadone take-home dose or a US$25 voucher (patient's choice) could be earned for each urine sample submitted on a Monday, Wednesday or Friday that was free of opiates, cocaine and benzodiazepines. Data analysis and interpretation focused on within-group post-hoc differences due to group differences on employment and legal status, potentially confounding baseline variables.
Repeated measures analysis of variance showed that Contingency Management patients submitted significantly more drug-free urine samples during the intervention compared to pre-detoxification (p < 0.01), whereas no significant changes were observed from pre- to post-detoxification in the Standard Care patients. Employment and legal status of patients may have facilitated response to contingency management procedures, but did not prevent relapse when contingency management procedures were withdrawn. Overall, these preliminary results suggest that abstinence-based contingency management is a promising strategy for preventing relapse to multiple drugs of abuse in a subset of methadone maintenance patients when abstinence has been initiated through brief inpatient treatment. AIM This study set out to assess the effect of a letter from the general practitioner suggesting a reduction in the use of benzodiazepines, and whether the impact of the letter could be increased by the addition of information on how to tackle drug reduction. METHOD Two hundred and nine long-term users of benzodiazepines in general practice were divided into three groups: two intervention groups and a control group. The first intervention group received a letter from their general practitioner asking that benzodiazepine use be gradually reduced and perhaps, in time, stopped. The second intervention group received the same letter plus four information sheets at monthly intervals, designed to assist drug reduction. The mean age of the 209 people was 69 years (age range 34-102 years). RESULTS After six months, both intervention groups had reduced their consumption to approximately two-thirds of the original intake of benzodiazepines, and there was a statistically significant difference between the intervention groups and the control group. Eighteen per cent of those receiving the interventions received no prescriptions at all during the six-month monitoring period.
CONCLUSION The results indicate that a simple intervention can have a considerable effect on the use of hypnotic and anxiolytic drugs, even with a sample of elderly users. Twenty-two patients were withdrawn from normal-dose, long-term benzodiazepine treatment and their data compared with those of two control groups. Patients and controls were assessed repeatedly on the Digit Symbol Substitution Test (DSST), Symbol Copying Test (SCT), Cancellation Task (CT), Auditory Reaction Time (RT) and Key Tapping Rate (KTR). A substantial and prolonged practice effect was found on all the tests except RT and KTR. Prior to withdrawal, the patients did not show the performance decrement on the CT, RT and KTR customarily associated with the initial phases of benzodiazepine therapy. A rebound performance increment was observed on KTR during the withdrawal. Patients demonstrated impaired performance on tasks requiring the combined use of sensory and fine motor skills. The effectiveness of motivational enhancement therapy (MET) in comparison with counseling as usual (CAU) for increasing retention and reducing substance use was evaluated in a multisite randomized clinical trial. Participants were 461 outpatients treated by 31 therapists within 1 of 5 outpatient substance abuse programs. There were no retention differences between the 2 brief intervention conditions. Although both 3-session interventions resulted in reductions in substance use during the 4-week therapy phase, MET resulted in sustained reductions during the subsequent 12 weeks, whereas CAU was associated with significant increases in substance use over this follow-up period. This finding was complicated by program site main effects and higher-level interactions. MET resulted in more sustained substance use reductions than CAU among primary alcohol users, but no difference was found for primary drug users.
An independent evaluation of session audiotapes indicated that MET and CAU were highly and comparably discriminable across sites. Hispanic individuals are underrepresented in clinical and research populations and are often excluded from clinical trials in the United States. Hence, there are few data on the effectiveness of most empirically validated therapies for Hispanic substance users. The authors conducted a multisite randomized trial comparing the effectiveness of 3 individual sessions of motivational enhancement therapy with that of 3 individual sessions of counseling as usual on treatment retention and frequency of substance use; all assessment and treatment sessions were conducted in Spanish among 405 individuals seeking treatment for any type of current substance use. Treatment exposure was good, with 66% of participants completing all 3 protocol sessions. Although both interventions resulted in reductions in substance use during the 4-week therapy phase, there were no significant Treatment Condition x Time interactions nor Site x Treatment Condition interactions. Results suggest that the individual treatments delivered in Spanish were both attractive to and effective with this heterogeneous group of Hispanic adults, but the differential effectiveness of motivational enhancement therapy may be limited to those whose primary substance use problem is alcohol and may be fairly modest in magnitude. BACKGROUND Several interventions aiming at discontinuation of long-term benzodiazepine use have been proven effective in the short term. However, data on the persistence of discontinuation are lacking. OBJECTIVES To assess 10-year follow-up status in patients who succeeded in stopping benzodiazepine use after a discontinuation letter from the patient's own GP. To identify determinants of successful discontinuation in the long term.
METHODS Follow-up data of patients who participated in a large prospective, controlled, stepped-care intervention programme among long-term benzodiazepine users in primary care. RESULTS At 10-year follow-up, the percentage of benzodiazepine abstinence was 58.8%. Non-abstinent patients used lower doses of benzodiazepine. Being abstinent at 21 months after the intervention predicted abstinence at 10-year follow-up. CONCLUSIONS Ten years after a minimal intervention to decrease long-term benzodiazepine use, the majority of patients who were able to discontinue benzodiazepine use initially do not use benzodiazepines at 10-year follow-up. Patients who did not succeed in maintaining abstinence from benzodiazepines appear to use lower or average dosages. BACKGROUND The problematic use of prescription drugs (PDs) and related disorders are considerably prevalent, but evidence concerning brief intervention for problematic PD users is sparse. A previous analysis of the present study on the effectiveness of brief intervention for problematic PD use in a general hospital revealed a significant reduction in PD use after 3 months. The analyses presented herein provide data from the 12-month follow-up. METHOD In a randomized controlled trial, 126 proactively recruited general hospital patients were analyzed. The intervention group received two brief Motivational Interviewing (MI) sessions. Two follow-ups (after 3 and 12 months) were conducted. Intervention effects at 12-month follow-up on PD cessation and reduction were analyzed using regression methods and controlling for significant group differences. Subgroups of sedative/hypnotic and opioid users were examined. RESULTS No significant intervention effects were found in the overall sample. Respecting significant differences between the intervention and control groups, we detected no effects of the intervention for the subgroups of sedative/hypnotic or opioid users.
CONCLUSIONS In contrast to the short-term effects after 3 months , no long-term effects of brief MI sessions on PD use were found . More intensive interventions , booster-sessions or regular aftercare might help in stabilizing intervention effects on PD use among hospital patients . However , studies using larger samples are needed to allow more powerful and specific analyses . Different samples should be examined . Problems concerning the recruitment of study participants in PD research were discussed and should be considered in further studies BACKGROUND The long-term use of benzodiazepines is highly prevalent in developed societies and is not devoid of risks . Withdrawing patients from these drugs is often difficult . Tapering off benzodiazepines has been shown to be a good strategy for discontinuing their long-term use . AIM To establish the efficacy of an intervention programme for reducing the chronic use of benzodiazepines . DESIGN OF STUDY Randomised , two-arm , parallel , non-blinded controlled trial . SETTING Three urban healthcare centres covering a population of 50,000 inhabitants ( Mallorca , Spain ) . METHOD Adult patients ( n = 139 ) taking benzodiazepines daily for more than a year and visited by their family physician were randomised into an intervention group ( n = 73 ) that received standardised advice and a tapering off schedule with biweekly follow-up visits , or into a control group ( n = 66 ) , that was managed following routine clinical practice . Both were followed for a year . RESULTS Patients achieved withdrawal or reduced their dose by at least 50 % after 6 and 12 months . Abstinence and withdrawal symptoms were also measured . Both groups were homogeneous for personal , clinical and psychological characteristics and for benzodiazepine use . Only two patients from each group were lost to follow-up .
After 12 months , 33 ( 45.2 % ) patients in the intervention group and six ( 9.1 % ) in the control group had discontinued benzodiazepine use ; relative risk = 4.97 ( 95 % confidence interval [ CI ] = 2.2 to 11.1 ) , absolute risk reduction = 0.36 ( 95 % CI = 0.22 to 0.50 ) . For every three interventions , one patient achieved withdrawal . Sixteen ( 21.9 % ) subjects from the intervention group and 11 ( 16.7 % ) controls reduced their initial dose by more than 50 % . CONCLUSION Standardised advice given by the family physician , together with a tapering off schedule , is effective for withdrawing patients from long-term benzodiazepine use and is feasible in primary care A clinical trial was used to evaluate short-term interpersonal psychotherapy ( IPT ) as treatment for psychiatric disorders in opiate addicts who were also participating in a full-service methadone hydrochloride maintenance program . Seventy-two opiate addicts were randomly assigned to one of two treatment conditions for six months : ( 1 ) IPT , consisting of weekly individual psychotherapy , and ( 2 ) low-contact treatment , consisting of one brief meeting per month . Recruitment was a problem , as only 5 % of eligible clients agreed to participate and only around half of the subjects completed the study treatment . The outcome was similar for the two study groups . However , in many of the outcome areas , subjects in both treatment conditions attained significant clinical improvement . Several factors limited the generalizability of findings and may have biased against showing a psychotherapy effect Despite the application of treatments that combine methadone administration , weekly counseling , and contingency reinforcement strategies , many opiate-dependent patients continue illicit drug use . In this controlled study we piloted a novel cognitive-behavioral treatment ( CBT ) designed to reduce illicit drug use among patients receiving methadone treatment .
The treatment targeted the reduction of sensitivity to interoceptive cues associated with drug craving , and trained alternative responses to these cues . Patients ( N = 23 ) were randomly assigned to either this novel CBT program or a program of increased counseling , such that the two programs of treatment were equated for therapist contact , assessment time , and contingency-reinforcement strategies . We found that , compared to a doubling of contact with their outpatient counselor , the new program was associated with significantly greater reductions in illicit drug use for women , but not for men . Reasons for differential performance by women and men and future directions for this new treatment are discussed The Benzodiazepine Dependence Self-Report Questionnaire ( Bendep-SRQ ) measures the severity of benzodiazepine ( BZ ) dependence on four domains : awareness of problematic use , preoccupation with the availability of BZ , lack of compliance with the therapeutic regimen , and withdrawal . Although promising results of the Bendep-SRQ have been obtained in cross-sectional studies , no attention has been paid to its clinical relevance during BZ withdrawal , i.e. , predictive validity and time course . We performed cross-validation and evaluated the predictive validity and time course on 180 long-term BZ users who were taking part in a general practice BZ discontinuation trial . Three of the four domains had good scalability . Some concerns arose about the preoccupation scale , which emphasizes the need for cross-validation in clinically relevant populations . All scales showed excellent reliability ( subject discriminability , item discriminability ) , while construct and discriminant validity were adequate . All four scales contributed significantly to the prediction of whether complete abstinence would be achieved directly after taking part in the discontinuation program .
This prediction was independent of the other prognostic variables , except for those in the domain problematic use . The scales problematic use and preoccupation showed good sensitivity to changes during follow-up . The insensitivity of the scale , lack of compliance can be explained by low baseline scores in our population , while the insensitivity of the withdrawal scale was probably the result of the study design . In conclusion , our study indicated the clinical relevance of the Bendep-SRQ before and during a BZ discontinuation trial . We recommend the use of the Bendep-SRQ in discontinuation therapy and research into the field of BZ addiction AIMS This study addressed the following questions for patients after 1 year of methadone maintenance treatment ( MMT ) ; ( 1 ) What are the demographic features and past history of drug use of benzodiazepine ( BZD ) abusers ? ( 2 ) Do BZD abusers abuse more heroin , cocaine and /or cannabis and do they receive a higher methadone dosage level ? ( 3 ) Do BZD abusers suffer more from hepatitis C ( HCV ) and do they have more HIV/HCV risk-taking behaviors than non-abusers ? ( 4 ) Do BZD abusers have more psychopathology and more emotional distress than non-abusers ? DESIGN All 148 patients who completed 1 year of MMT underwent random and twice-weekly observed urine analysis for various drugs of abuse , responded to self-report questionnaires ( SCL-90-R ; POMS ; HIV/HCV risk-taking behaviors ) , interviews ( ASI ) and underwent testing for hepatitis C. Abuse in this study is defined as any use during the 12th month of treatment . FINDINGS After 1 year of MMT , more BZD abusers ( n = 63 ) were single , had spent time in prison , were unemployed and had at least one parent with an addiction problem or mental illness in comparison to non-abusers ( n = 85 ) . They had started using heroin and cocaine earlier and currently abused more cocaine , heroin and cannabis . They had significantly more psychopathology and negative mood .
They had significantly more HCV and reported more HIV/HCV risk-taking behavior . IMPLICATIONS We suggest that this group of patients is in need of more intensive pharmacological and psychological treatment BACKGROUND Long-term use of hypnotics is not recommended because of risks of dependency and adverse effects on health . The usual clinical management of benzodiazepine dependency is gradual tapering , but when used alone this method is not highly effective in achieving long-term discontinuation . We compared the efficacy of tapering plus cognitive-behavioural therapy for insomnia with tapering alone in reducing the use of hypnotics by older adults with insomnia . METHODS People with chronic insomnia who had been taking a benzodiazepine every night for more than 3 months were recruited through media advertisements or were referred by their family doctors . They were randomly assigned to undergo either cognitive-behavioural therapy plus gradual tapering of the drug ( combined treatment ) or gradual tapering only . The cognitive-behavioural therapy was provided by a psychologist in 8 weekly small-group sessions . The tapering was supervised by a physician , who met weekly with each participant over an 8-week period . The main outcome measure was benzodiazepine discontinuation , confirmed by blood screening performed at each of 3 measurement points ( immediately after completion of treatment and at 3- and 12-month follow-ups ) . RESULTS Of the 344 potential participants , 65 ( mean age 67.4 years ) met the inclusion criteria and entered the study . The 2 study groups ( 35 subjects in the combined treatment group and 30 in the tapering group ) were similar in terms of demographic characteristics , duration of insomnia and hypnotic dosage . Immediately after completion of treatment , a greater proportion of patients in the combined treatment group had withdrawn from benzodiazepine use completely ( 77 % [ 26/34 ] v.
38 % [ 11/29 ] ; odds ratio [ OR ] 5.3 , 95 % confidence interval [ CI ] 1.8 - 16.2 ; OR after adjustment for initial benzodiazepine daily dose 7.9 , 95 % CI 2.4 - 30.9 ) . At the 12-month follow-up , the favourable outcome persisted ( 70 % [ 23/33 ] v. 24 % [ 7/29 ] ; OR 7.2 , 95 % CI 2.4 - 23.7 ; adjusted OR 7.6 , 95 % CI 2.5 - 26.6 ) ; similar results were obtained at 3 months . INTERPRETATION A combination of cognitive-behavioural therapy and benzodiazepine tapering was superior to tapering alone in the management of patients with insomnia and chronic benzodiazepine use . The beneficial effects were sustained for up to 1 year . Applying this multidisciplinary approach in the community could help reduce benzodiazepine use by older people BACKGROUND The study aimed to monitor subjects with benzodiazepine ( BZ ) dependence after withdrawal treatment in order to evaluate long-term outcome and predictors of remaining BZ-free . Subjects with high-dose dependence or co-occurring alcohol problems were not excluded . METHOD Seventy-six participants in an earlier , randomized , controlled trial of outpatient BZ discontinuation were interviewed , and documents from their treatment settings obtained , along with urine and serum samples for BZ use . Long-term outcomes for a cognitive-behavioral treatment group and a treatment-as-usual group were measured . RESULTS BZ discontinuation treatment outcomes were maintained in both treatment groups . No between-group differences were found . At the end of the study 25 % of the subjects were BZ-free , and the median dose decrease from pre-treatment levels was 16.1 mg in diazepam equivalents . Subjects with pre-treatment doses exceeding 40 mg were able to maintain their doses at therapeutic levels through the follow-up . Pre-treatment low BZ dose , no previous withdrawal attempts , and high life satisfaction predicted success in staying BZ-free .
CONCLUSIONS In subjects with complicated BZ dependence , the benefits of BZ discontinuation treatment may persist , but more studies are needed Objective : To identify predictors of resumed benzodiazepine use after participation in a benzodiazepine discontinuation trial . Method : We performed multiple Cox regression analyses to predict the long-term outcome of a 3-condition , randomized , controlled benzodiazepine discontinuation trial in general practice . Results : Of 180 patients , we completed follow-up for 170 ( 94 % ) . Of these , 50 ( 29 % ) achieved long-term success , defined as no use of benzodiazepines during follow-up . Independent predictors of success were as follows : offering a taper-off program with group therapy ( hazard ratio [ HR ] 2.4 ; 95 % confidence interval [ CI ] , 1.5 to 3.9 ) or without group therapy ( HR 2.9 ; 95%CI , 1.8 to 4.8 ) ; a lower daily benzodiazepine dosage at the start of tapering off ( HR 1.5 ; 95%CI , 1.2 to 1.9 ) ; a substantial dosage reduction by patients themselves just before the start of tapering off ( HR 2.1 ; 95%CI , 1.4 to 3.3 ) ; less severe benzodiazepine dependence , as measured by the Benzodiazepine Dependence Self-Report Questionnaire Lack of Compliance subscale ( HR 2.4 ; 95%CI , 1.1 to 5.2 ) ; and no use of alcohol ( HR 1.7 ; 95%CI , 1.2 to 2.5 ) . Patients who used over 10 mg of diazepam equivalent , who had a score of 3 or more on the Lack of Compliance subscale , or who drank more than 2 units of alcohol daily failed to achieve long-term abstinence . Conclusions : Benzodiazepine dependence severity affects long-term taper outcome independent of treatment modality , benzodiazepine dosage , psychopathology , and personality characteristics . An identifiable subgroup needs referral to specialized care Anxiolytic therapy with benzodiazepines and their potential for dependence are reviewed . Relaxation training and biofeedback have been used for chemically dependent anxious patients .
These techniques have been recommended for benzodiazepine-dependent patients , but not investigated . Previous withdrawal studies offer only limited follow-up data . Stress management treatment was based on a successful case study . Recruitment difficulties were encountered . However , seven patients were randomly assigned to stress management or brief psychotherapy . All showed improvement , but three of four patients available for 1 year follow-up had returned to pretreatment dependence . These withdrawal difficulties suggest the need for more effective treatments and more adequate follow-up studies Better understanding of compliance with BZD taper is warranted . Compliance with a taper program and perceived self-efficacy ( SE ) in being able to comply with hypnotic reduction goals was monitored weekly in 52 older adults ( mean age : 63.0 years ) with chronic insomnia ( average duration : 21.9 years ) who underwent a 10-week physician-supervised medication tapering . One group received cognitive-behavior therapy for insomnia during discontinuation , whereas the other did not . Compliant patients showed higher SE ratings at Weeks 6 , 8 , 9 , and 10 . Medication-free patients at the end of the treatment also reported higher mean SE ratings at those 4 weeks . Differences remained significant when withdrawal symptoms and sleep efficiency were controlled for . These results have important clinical implications because SE may indicate key time points when patients are experiencing more difficulty during discontinuation OBJECTIVE This study evaluated the effectiveness of a supervised benzodiazepine taper , singly and combined with cognitive behavior therapy , for benzodiazepine discontinuation in older adults with chronic insomnia .
METHOD Seventy-six older adult outpatients ( 38 women , 38 men ; mean age of 62.5 years ) with chronic insomnia and prolonged use ( mean duration of 19.3 years ) of benzodiazepine medication for sleep were randomly assigned for a 10-week intervention consisting of a supervised benzodiazepine withdrawal program ( N=25 ) , cognitive behavior therapy for insomnia ( N=24 ) , or supervised withdrawal plus cognitive behavior therapy ( N=27 ) . Follow-up assessments were conducted at 3 and 12 months . The main outcome measures were benzodiazepine use , sleep parameters , and anxiety and depressive symptoms . RESULTS All three interventions produced significant reductions in both the quantity ( 90 % reduction ) and frequency ( 80 % reduction ) of benzodiazepine use , and 63 % of the patients were drug-free within an average of 7 weeks . More patients who received medication taper plus cognitive behavior therapy ( 85 % ) were benzodiazepine-free after the initial intervention , compared to those who received medication taper alone ( 48 % ) and cognitive behavior therapy alone ( 54 % ) . The patients in the two groups that received cognitive behavior therapy perceived greater subjective sleep improvements than those who received medication taper alone . Polysomnographic data showed an increase in the amount of time spent in stages 3 and 4 sleep and REM sleep and a decrease in total sleep time across all three conditions from baseline to posttreatment . Initial benzodiazepine reductions were well maintained up to the 12-month follow-up , and sleep improvements became more noticeable over this period . No significant withdrawal symptoms or adverse events were associated with benzodiazepine tapering . CONCLUSIONS A structured , time-limited intervention is effective in assisting chronic users of benzodiazepine medication to discontinue or reduce their use of medication .
The addition of cognitive behavior therapy alleviates insomnia , but sleep improvements may become noticeable only after several months of benzodiazepine abstinence The study aimed at evaluating the efficacy of complaints management training ( CMT ) compared to that of anxiety management training ( AMT ) in patients undergoing benzodiazepine withdrawal . CMT focused on techniques to alleviate reported withdrawal symptoms . Nineteen patients were randomly allocated either to CMT or to AMT . Both groups received 9 weekly treatment sessions and were assessed every other week . Withdrawal was designed to be gradual over the first 4 weeks . CMT proved more successful than AMT in terms of abstinence rate , reported number of severe withdrawal symptoms , depression and anxiety . At follow-up after 6 months , there was no difference between groups in terms of abstinence rate Enhanced schedules of counseling can improve response to routine opioid-agonist treatment , although it is associated with increased time demands that enhance patient resistance and nonadherence . Internet-based counseling can reduce these concerns by allowing patients to participate from home . This study assesses treatment satisfaction and response to Internet-based ( CRC Health Group 's e-Getgoing ) group counseling for partial responders to methadone maintenance treatment . Patients testing positive for an illicit substance ( n = 37 ) were randomly assigned to e-Getgoing or onsite group counseling and followed for 6 weeks . Patients in both conditions responded favorably to intensified treatment by achieving at least 2 consecutive weeks of abstinence and 100 % attendance to return to less-intensive care ( e-Getgoing : 70 % vs. routine : 71 % , ns ) . Treatment satisfaction was good and comparable across conditions . E-Getgoing patients expressed a preference for the Internet-based service , reporting convenience and increased confidentiality as major reasons .
Integrating Internet-based group counseling with on-site treatment services could help expand the continuum of care in methadone maintenance clinics
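Several of the headline effect sizes quoted in the benzodiazepine trials above can be re-derived from the raw counts they report, which is a useful sanity check when abstracting data from these records. A minimal Python sketch, using the discontinuation counts from the tapering-plus-advice trial (33/73 intervention vs 6/66 control) and the withdrawal counts from the CBT-plus-tapering trial (26/34 vs 11/29); these are unadjusted point estimates only, and the confidence intervals quoted in the abstracts come from the trials' own analyses:

```python
# Relative risk, absolute risk reduction, and NNT for the taper-plus-advice
# trial: 33 of 73 intervention patients vs 6 of 66 controls discontinued.
p_int, p_ctl = 33 / 73, 6 / 66
rr = p_int / p_ctl        # relative risk: ~4.97, as reported
arr = p_int - p_ctl       # absolute risk reduction: ~0.36, as reported
nnt = 1 / arr             # ~2.8, i.e. "for every three interventions, one success"

# Unadjusted odds ratio for the CBT-plus-tapering trial:
# 26/34 withdrew completely vs 11/29 with tapering alone.
odds = lambda events, n: events / (n - events)
or_unadj = odds(26, 34) / odds(11, 29)   # ~5.3, matching the reported OR

print(round(rr, 2), round(arr, 2), round(nnt, 1), round(or_unadj, 1))
```

Recovering the reported statistics to two decimal places from the event counts suggests the counts and summary measures in these records were extracted consistently.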
2,054
27,243,770
FINDINGS Critical evaluation of the six studies that met inclusion criteria suggests that antioxidant therapies such as amino acids , vitamins C and E , progesterone , N-acetylcysteine , and enzogenol may be safe and effective adjunctive therapies in adult patients with TBI . Although certain limitations were found , the overall trend of using antioxidant therapies to improve the clinical outcomes of TBI was positive . LINKING EVIDENCE TO ACTION By incorporating antioxidant therapies into practice , clinicians can help attenuate the oxidative posttraumatic brain damage and optimize patients ' recovery
BACKGROUND Traumatic brain injury ( TBI ) is an acquired brain injury that occurs when there is sudden trauma that leads to brain damage . This acute complex event can happen when the head is violently or suddenly struck or an object pierces the skull or brain . The current principal treatment of TBI includes various pharmaceutical agents , hyperbaric oxygen , and hypothermia . There is evidence that secondary injury from a TBI is specifically related to oxidative stress . However , the clinical management of TBI often does not include antioxidants to reduce oxidative stress and prevent secondary injury . AIMS The purpose of this article is to examine current literature regarding the use of antioxidant therapies in treating TBI . This review evaluates the evidence of antioxidant therapy as an adjunctive treatment used to reduce the underlying mechanisms involved in secondary TBI injury .
Background Mild traumatic brain injury ( mTBI ) secondary to blast exposure is the most common battlefield injury in Southwest Asia . There has been little prospective work in the combat setting to test the efficacy of new countermeasures . The goal of this study was to compare the efficacy of N-acetyl cysteine ( NAC ) versus placebo on the symptoms associated with blast exposure mTBI in a combat setting . Methods This study was a randomized double blind , placebo-controlled study that was conducted on active duty service members at a forward deployed field hospital in Iraq . All symptomatic U.S. service members who were exposed to significant ordnance blast and who met the criteria for mTBI were offered participation in the study and 81 individuals agreed to participate . Individuals underwent a baseline evaluation and then were randomly assigned to receive either N-acetyl cysteine ( NAC ) or placebo for seven days . Each subject was re-evaluated at 3 and 7 days . Outcome measures were the presence of the following sequelae of mTBI : dizziness , hearing loss , headache , memory loss , sleep disturbances , and neurocognitive dysfunction . The resolution of these symptoms seven days after the blast exposure was the main outcome measure in this study . Logistic regression on the outcome of ' no day 7 symptoms ' indicated that NAC treatment was significantly better than placebo ( OR = 3.6 , p = 0.006 ) . Secondary analysis revealed subjects receiving NAC within 24 hours of blast had an 86 % chance of symptom resolution with no reported side effects versus 42 % for those seen early who received placebo . Conclusion This study , conducted in an active theatre of war , demonstrates that NAC , a safe pharmaceutical countermeasure , has beneficial effects on the severity and resolution of sequelae of blast induced mTBI . This is the first demonstration of an effective short term countermeasure for mTBI .
Further work on long term outcomes and the potential use of NAC in civilian mTBI is warranted . Trial Registration ClinicalTrials.gov BACKGROUND Traumatic brain injury is a major cause of death and disability . We sought to assess the safety and efficacy of dexanabinol , a synthetic cannabinoid analogue devoid of psychotropic activity , in severe traumatic brain injury . METHODS 861 patients with severe traumatic brain injury admitted to 86 specialist centres from 15 countries were included in a multi-centre , placebo-controlled , phase III trial . Patients were randomised to receive a single intravenous 150 mg dose of dexanabinol or placebo within 6 h of injury . The primary outcome was the extended Glasgow outcome scale assessed at 6 months , with the point of dichotomisation into unfavourable versus favourable outcome differentiated by baseline prognostic risk . Prespecified subgroup analyses were defined by injury severity , recruitment rate , and time to dosing . Secondary analysis included control of intracranial pressure and quality of life . Analyses were prespecified in the protocol and the statistical analysis plan . This study is registered with ClinicalTrials.gov , number NCT00129857 . FINDINGS 846 patients were included in the efficacy analysis . The extended Glasgow outcome scale at 6 months did not differ between groups ; 215 ( 50 % ) patients in the dexanabinol group and 214 ( 51 % ) patients in the placebo group had an unfavourable outcome ( odds ratio for a favourable response 1.04 ; 95 % CI 0.79 - 1.36 ) . Improvements in the control of intracranial pressure or quality of life were not recorded and subgroup analysis showed no indication of differential treatment effects . Dexanabinol was not associated with hepatic , renal , or cardiac toxic effects .
INTERPRETATION Dexanabinol is safe , but is not efficacious in the treatment of traumatic brain injury OBJECTIVE To investigate whether supplementation with branched-chain amino acids ( BCAAs ) may improve recovery of patients with a posttraumatic vegetative or minimally conscious state . DESIGN Patients were randomly assigned to 15 days of intravenous BCAA supplementation ( n=22 ; 19.6g/d ) or an isonitrogenous placebo ( n=19 ) . SETTING Tertiary care rehabilitation setting . PARTICIPANTS Patients ( N=41 ; 29 men , 12 women ; mean age , 49.5+/-21 y ) with a posttraumatic vegetative or minimally conscious state , 47+/-24 days after the index traumatic event . INTERVENTION Supplementation with BCAAs . MAIN OUTCOME MEASURE Disability Rating Scale ( DRS ) as log(10)DRS . RESULTS Fifteen days after admission to the rehabilitation department , the log(10)DRS score improved significantly only in patients who had received BCAAs ( log(10)DRS score , 1.365+/-0.08 to 1.294+/-0.05 ; P<.001 ) , while the log(10)DRS score in the placebo recipients remained virtually unchanged ( log(10)DRS score , 1.373+/-0.03 to 1.37+/-0.03 ; P not significant ) . The difference in improvement of log(10)DRS score between the 2 groups was highly significant ( P<.000 ) . Moreover , 68.2 % ( n=15 ) of treated patients achieved a log(10)DRS point score of .477 or higher ( 3 as geometric mean ) that allowed them to exit the vegetative or minimally conscious state . CONCLUSIONS Supplemented BCAAs may improve the recovery from a vegetative or minimally conscious state in patients with posttraumatic vegetative or minimally conscious state An accurate prognostic model is extremely important in severe traumatic brain injury ( TBI ) for both patient management and research . Clinical prediction models must be validated both internally and externally before they are considered widely applicable .
Our aim is to independently externally validate two prediction models , one developed by the Corticosteroid Randomization After Significant Head injury ( CRASH ) trial investigators , and the other from the International Mission for Prognosis and Analysis of Clinical Trials in Traumatic Brain Injury ( IMPACT ) group . We used a cohort of 300 patients with severe TBI ( Glasgow Coma Score [ GCS ] ≤8 ) consecutively admitted to the National Neuroscience Institute ( NNI ) , Singapore , between February 2006 and December 2009 . The CRASH models ( base and CT ) predict 14 day mortality and 6 month unfavorable outcome . The IMPACT models ( core , extended , and laboratory ) estimate 6 month mortality and unfavorable outcome . Validation was based on measures of discrimination and calibration . Discrimination was assessed using the area under the receiver operating characteristic curve ( AUC ) , and calibration was assessed using the Hosmer-Lemeshow ( H-L ) goodness-of-fit test and Cox calibration regression analysis . In the NNI database , the overall observed 14 day mortality was 47.7 % , and the observed 6 month unfavorable outcome was 71.0 % . The CRASH base model and all three IMPACT models gave an underestimate of the observed values in our cohort when used to predict outcome . Using the CRASH CT model , the predicted 14 day mortality of 46.6 % approximated the observed outcome , whereas the predicted 6 month unfavorable outcome was an overestimate at 74.8 % . Overall , both the CRASH and IMPACT models showed good discrimination , with AUCs ranging from 0.80 to 0.89 , and good overall calibration . We conclude that both the CRASH and IMPACT models satisfactorily predicted outcome in our patients with severe TBI BACKGROUND AND PURPOSE Enzogenol , a flavonoid-rich extract from Pinus radiata bark with antioxidant and anti-inflammatory properties has been shown to improve working memory in healthy adults .
In traumatic brain injury ( TBI ) , oxidation and inflammation have been linked to poorer cognitive outcomes . Hence , this phase II , randomized controlled trial investigated safety , compliance and efficacy of Enzogenol for improving cognitive functioning in people following mild TBI . METHODS Sixty adults , who sustained a mild TBI , 3 - 12 months prior to recruitment , and who were experiencing persistent cognitive difficulties [ Cognitive Failures Questionnaire ( CFQ ) score > 38 ] , were randomized to receive Enzogenol ( 1000 mg/day ) or matching placebo for 6 weeks . Subsequently , all participants received Enzogenol for a further 6 weeks , followed by placebo for 4 weeks . Compliance , side-effects , cognitive failures , working and episodic memory , post-concussive symptoms and mood were assessed at baseline , 6 , 12 and 16 weeks . Simultaneous estimation of treatment effect and breakpoint was effected , with confidence intervals ( CIs ) obtained through a treatment-placebo balance-preserving bootstrap procedure . RESULTS Enzogenol was found to be safe and well tolerated . Trend and breakpoint analyses showed a significant reduction in cognitive failures after 6 weeks [ mean CFQ score , 95 % CI , Enzogenol versus placebo -6.9 ( -10.8 to -4.1 ) ] . Improvements in the frequency of self-reported cognitive failures were estimated to continue until week 11 before stabilizing . Other outcome measures showed some positive trends but no significant treatment effects . CONCLUSIONS Enzogenol supplementation is safe and well tolerated in people after mild TBI , and may improve cognitive functioning in this patient population . This study provides Class IIB evidence that Enzogenol is well tolerated and may reduce self-perceived cognitive failures in patients 3 - 12 months post-mild TBI
2,055
21,134,261
There was also no clinical evidence for a positive interaction according to specific PPIs . Conclusion The observed association between clopidogrel and PPIs is found uniquely in studies judged to be of low quality and with an increased risk of bias .
Background Recently , several publications have investigated a possible drug interaction between clopidogrel and proton pump inhibitors ( PPIs ) , and regulatory agencies have issued warnings despite discordant study results . In an attempt to clarify the situation , we performed a systematic review with a critical analysis of study methodologies to determine whether varying study quality ( that is , bias ) could explain the discordant results .
Background —The antiplatelet effect of clopidogrel may be attenuated by short-term coadministration of lipophilic statins . We investigated whether the coadministration of atorvastatin for 5 weeks in patients with acute coronary syndromes ( ACS ) could affect the antiplatelet potency of clopidogrel . Methods and Results —Forty-five hypercholesterolemic patients with the first episode of an ACS were included in the study . Patients were randomized to receive daily either 10 mg of atorvastatin ( n=21 ) or 40 mg of pravastatin ( n=24 ) . Thirty patients who underwent percutaneous coronary intervention ( PCI ) received a loading dose of 375 mg of clopidogrel , followed by 75 mg/d for at least 3 months . In the remaining 15 patients who refused to undergo PCI , clopidogrel therapy was not administered . Eight normolipidemic patients with the first episode of an ACS were also included and received only clopidogrel . The serum levels of soluble CD40L and the adenosine 5′-diphosphate– or thrombin receptor activating peptide-14–induced platelet aggregation , as well as P-selectin and CD40L surface expression , were studied at baseline ( within 30 minutes after admission ) and 5 weeks later . Neither atorvastatin nor pravastatin significantly influenced the clopidogrel-induced inhibition of platelet activation , nor did clopidogrel influence the therapeutic efficacy of atorvastatin . Conclusions —Atorvastatin does not affect the antiplatelet potency of clopidogrel when coadministered for 5 weeks in ACS patients BACKGROUND Gastrointestinal complications are an important problem of antithrombotic therapy . Proton-pump inhibitors ( PPIs ) are believed to decrease the risk of such complications , though no randomized trial has proved this in patients receiving dual antiplatelet therapy . Recently , concerns have been raised about the potential for PPIs to blunt the efficacy of clopidogrel .
METHODS We randomly assigned patients with an indication for dual antiplatelet therapy to receive clopidogrel in combination with either omeprazole or placebo, in addition to aspirin. The primary gastrointestinal end point was a composite of overt or occult bleeding, symptomatic gastroduodenal ulcers or erosions, obstruction, or perforation. The primary cardiovascular end point was a composite of death from cardiovascular causes, nonfatal myocardial infarction, revascularization, or stroke. The trial was terminated prematurely when the sponsor lost financing. RESULTS We planned to enroll about 5000 patients; a total of 3873 were randomly assigned and 3761 were included in analyses. In all, 51 patients had a gastrointestinal event; the event rate was 1.1% with omeprazole and 2.9% with placebo at 180 days (hazard ratio with omeprazole, 0.34, 95% confidence interval [CI], 0.18 to 0.63; P<0.001). The rate of overt upper gastrointestinal bleeding was also reduced with omeprazole as compared with placebo (hazard ratio, 0.13; 95% CI, 0.03 to 0.56; P = 0.001). A total of 109 patients had a cardiovascular event, with event rates of 4.9% with omeprazole and 5.7% with placebo (hazard ratio with omeprazole, 0.99; 95% CI, 0.68 to 1.44; P = 0.96); high-risk subgroups did not show significant heterogeneity. The two groups did not differ significantly in the rate of serious adverse events, though the risk of diarrhea was increased with omeprazole. CONCLUSIONS Among patients receiving aspirin and clopidogrel, prophylactic use of a PPI reduced the rate of upper gastrointestinal bleeding. There was no apparent cardiovascular interaction between clopidogrel and omeprazole, but our results do not rule out a clinically meaningful difference in cardiovascular events due to use of a PPI. (Funded by Cogentus Pharmaceuticals; ClinicalTrials.gov number, NCT00557921.
) BACKGROUND Recent pharmacodynamic and retrospective clinical analyses have suggested that proton pump inhibitors (PPIs) may modify the antiplatelet effects of clopidogrel bisulfate. METHODS We conducted a retrospective cohort study of persons enrolled in a multistate health insurance plan with commercial and Medicare clients to evaluate adverse clinical outcomes in patients using clopidogrel plus a PPI compared with clopidogrel alone. Patients who were discharged from the hospital after myocardial infarction (MI) or coronary stent placement and treated with clopidogrel plus a PPI (n = 1033) were matched 1:1 (using propensity scoring) with patients with similar cardiovascular risk factors treated with clopidogrel alone. Rehospitalizations for MI or coronary stent placement were evaluated for up to 360 days. A subanalysis was conducted to study the impact of pantoprazole sodium, the most used PPI. RESULTS Patients who received clopidogrel plus a PPI had a 93% higher risk of rehospitalization for MI (adjusted hazard ratio, 1.93; 95% confidence interval, 1.05-3.54; P = .03) and a 64% higher risk of rehospitalization for MI or coronary stent placement (1.64; 1.16-2.32; P = .005) than did patients receiving clopidogrel alone. Increased risk of rehospitalization for MI or coronary stent placement was also observed for the subgroup of patients receiving clopidogrel plus pantoprazole (adjusted hazard ratio, 1.91; 95% confidence interval, 1.19-3.06; P = .008). CONCLUSIONS Patients who received clopidogrel plus a PPI had a significantly higher risk of rehospitalization for MI or coronary stent placement than did patients receiving clopidogrel alone.
Prospective clinical trials and laboratory analyses of biochemical interactions are warranted to further evaluate the potential impact of PPIs on the efficacy of clopidogrel CONTEXT Following percutaneous coronary intervention (PCI), short-term clopidogrel therapy in addition to aspirin leads to greater protection from thrombotic complications than aspirin alone. However, the optimal duration of combination oral antiplatelet therapy is unknown. Also, although current clinical data suggest a benefit for beginning therapy with a clopidogrel loading dose prior to PCI, the practical application of this therapy has not been prospectively studied. OBJECTIVES To evaluate the benefit of long-term (12-month) treatment with clopidogrel after PCI and to determine the benefit of initiating clopidogrel with a preprocedure loading dose, both in addition to aspirin therapy. DESIGN, SETTING, AND PARTICIPANTS The Clopidogrel for the Reduction of Events During Observation (CREDO) trial, a randomized, double-blind, placebo-controlled trial conducted among 2116 patients who were to undergo elective PCI or were deemed at high likelihood of undergoing PCI, enrolled at 99 centers in North America from June 1999 through April 2001. INTERVENTIONS Patients were randomly assigned to receive a 300-mg clopidogrel loading dose (n = 1053) or placebo (n = 1063) 3 to 24 hours before PCI. Thereafter, all patients received clopidogrel, 75 mg/d, through day 28. From day 29 through 12 months, patients in the loading-dose group received clopidogrel, 75 mg/d, and those in the control group received placebo. Both groups received aspirin throughout the study. MAIN OUTCOME MEASURES One-year incidence of the composite of death, myocardial infarction (MI), or stroke in the intent-to-treat population; 28-day incidence of the composite of death, MI, or urgent target vessel revascularization in the per-protocol population.
RESULTS At 1 year, long-term clopidogrel therapy was associated with a 26.9% relative reduction in the combined risk of death, MI, or stroke (95% confidence interval [CI], 3.9%-44.4%; P = .02; absolute reduction, 3%). Clopidogrel pretreatment did not significantly reduce the combined risk of death, MI, or urgent target vessel revascularization at 28 days (reduction, 18.5%; 95% CI, -14.2% to 41.8%; P = .23). However, in a prespecified subgroup analysis, patients who received clopidogrel at least 6 hours before PCI experienced a relative risk reduction of 38.6% (95% CI, -1.6% to 62.9%; P = .051) for this end point compared with no reduction with treatment less than 6 hours before PCI. Risk of major bleeding at 1 year increased, but not significantly (8.8% with clopidogrel vs 6.7% with placebo; P = .07). CONCLUSIONS Following PCI, long-term (1-year) clopidogrel therapy significantly reduced the risk of adverse ischemic events. A loading dose of clopidogrel given at least 3 hours before the procedure did not reduce events at 28 days, but subgroup analyses suggest that longer intervals between the loading dose and PCI may reduce events OBJECTIVES The goal of comparative effectiveness analysis is to examine the relationship between two variables, treatment, or exposure and effectiveness or outcome. Unlike data obtained through randomized controlled trials, researchers face greater challenges with causal inference with observational studies. Recognizing these challenges, a task force was formed to develop a guidance document on methodological approaches to address these biases. METHODS The task force was commissioned and a Chair was selected by the International Society for Pharmacoeconomics and Outcomes Research Board of Directors in October 2007.
This report, the second of three reported in this issue of the Journal, discusses the inherent biases when using secondary data sources for comparative effectiveness analysis and provides methodological recommendations to help mitigate these biases. RESULTS The task force report provides recommendations and tools for researchers to mitigate threats to validity from bias and confounding in measurement of exposure and outcome. Recommendations on design of study included: the need for data analysis plan with causal diagrams; detailed attention to classification bias in definition of exposure and clinical outcome; careful and appropriate use of restriction; extreme care to identify and control for confounding factors, including time-dependent confounding. CONCLUSIONS Design of nonrandomized studies of comparative effectiveness face several daunting issues, including measurement of exposure and outcome challenged by misclassification and confounding. Use of causal diagrams and restriction are two techniques that can improve the theoretical basis for analyzing treatment effects in study populations of more homogeneity, with reduced loss of generalizability OBJECTIVES This study sought to compare the effect of 2 proton pump inhibitors (PPIs) on platelet response to clopidogrel after coronary stenting for non-ST-segment elevation acute coronary syndrome (NSTE ACS). BACKGROUND Use of omeprazole has been reported to significantly decrease the clopidogrel antiplatelet effect because of cytochrome P450 interaction. Because all PPIs are metabolized by CYP2C19, but to a varying degree, we hypothesized that the reported negative omeprazole-clopidogrel drug interaction may not be caused by a class effect. METHODS A total of 104 patients undergoing coronary stenting for NSTE ACS were prospectively included and randomized to omeprazole or pantoprazole 20 mg. They received at discharge 75-mg aspirin and 150-mg clopidogrel.
Platelet reactivity index (PRI) vasoactive stimulated phosphoprotein (VASP) was used to assess clopidogrel response and adenosine diphosphate (ADP)-induced aggregation for platelet reactivity (ADP-Ag). RESULTS After 1 month, patients receiving pantoprazole had a significantly better platelet response to clopidogrel as assessed with the PRI VASP: 36 ± 20% versus 48 ± 17% (p = 0.007). We identified more clopidogrel nonresponders in the omeprazole group than in the pantoprazole group: 44% versus 23% (p = 0.04), odds ratio: 2.6 (95% confidence interval: 1.2 to 6.2). Conversely, we did not observe any significant difference in platelet reactivity with ADP-Ag between the omeprazole and pantoprazole groups: 52 ± 15% and 50 ± 18%, respectively (p = 0.29). CONCLUSIONS The present findings suggest the preferential use of pantoprazole compared with omeprazole in patients receiving clopidogrel to avoid any potential negative interaction with CYP2C19 Background Recent studies have suggested that proton pump inhibitors (PPIs) attenuate the benefits of clopidogrel. The clinical relevance of this interaction in patients who have undergone percutaneous coronary intervention (PCI) is unknown. We hypothesized that post-PCI patients discharged on clopidogrel will have higher cardiovascular events if concomitant PPI therapy is used. Aims To determine whether post-PCI patients discharged on clopidogrel will have higher cardiovascular events if concomitant PPI therapy is used. Methods We reviewed the medical records of all the patients discharged on clopidogrel who underwent PCI from January 2003 to August 2004. The primary outcome studied was a major adverse cardiovascular event (MACE), which was defined as a composite of death, myocardial infarction, and target vessel failure. Results Of the 315 post-PCI patients who were discharged on clopidogrel, 72 were discharged on PPI.
During a mean follow-up period of 50 months, patients discharged on concomitant clopidogrel-PPI therapy had a MACE rate of 56% (vs. 38% in the clopidogrel alone group) (P = 0.025) and had 95% excess risk of MACE. Conclusion Concomitant use of clopidogrel and PPI in post-PCI patients is associated with a higher risk of MACE. This suggests that PPIs may attenuate clopidogrel's beneficial antiplatelet effect, which is crucial after PCI. Prospective randomized studies are warranted to provide definitive evidence for this interaction Background —We observed that the prodrug clopidogrel was less effective in inhibiting platelet aggregation with coadministration of atorvastatin during point-of-care platelet function testing. Because atorvastatin is metabolized by cytochrome P450 (CYP) 3A4, we hypothesized that clopidogrel might be activated by CYP3A4. Methods and Results —Platelet aggregation was measured in 44 patients undergoing coronary artery stent implantation treated with clopidogrel or clopidogrel plus pravastatin or atorvastatin, and in 27 volunteers treated with clopidogrel and either erythromycin or troleandomycin, CYP3A4 inhibitors, or rifampin, a CYP3A4 inducer. Atorvastatin, but not pravastatin, attenuated the antiplatelet activity of clopidogrel in a dose-dependent manner. Percent platelet aggregation was 34±23, 58±15 (P = 0.027), 74±10 (P = 0.002), and 89±7 (P = 0.001) in the presence of clopidogrel and 0, 10, 20, and 40 mg of atorvastatin, respectively. Erythromycin attenuated platelet aggregation inhibition (55±12 versus 42±12% platelet aggregation; P = 0.002), as did troleandomycin (78±18 versus 45±18% platelet aggregation; P < 0.0003), whereas rifampin enhanced platelet aggregation inhibition (33±18 versus 56±20% platelet aggregation, P = 0.001). Conclusions —CYP3A4 activates clopidogrel. Atorvastatin, another CYP3A4 substrate, competitively inhibits this activation.
Use of a statin not metabolized by CYP3A4 and point-of-care platelet function testing may be warranted in patients treated with clopidogrel BACKGROUND Dual antiplatelet therapy with clopidogrel plus low-dose aspirin has not been studied in a broad population of patients at high risk for atherothrombotic events. METHODS We randomly assigned 15,603 patients with either clinically evident cardiovascular disease or multiple risk factors to receive clopidogrel (75 mg per day) plus low-dose aspirin (75 to 162 mg per day) or placebo plus low-dose aspirin and followed them for a median of 28 months. The primary efficacy end point was a composite of myocardial infarction, stroke, or death from cardiovascular causes. RESULTS The rate of the primary efficacy end point was 6.8 percent with clopidogrel plus aspirin and 7.3 percent with placebo plus aspirin (relative risk, 0.93; 95 percent confidence interval, 0.83 to 1.05; P=0.22). The respective rate of the principal secondary efficacy end point, which included hospitalizations for ischemic events, was 16.7 percent and 17.9 percent (relative risk, 0.92; 95 percent confidence interval, 0.86 to 0.995; P=0.04), and the rate of severe bleeding was 1.7 percent and 1.3 percent (relative risk, 1.25; 95 percent confidence interval, 0.97 to 1.61 percent; P=0.09). The rate of the primary end point among patients with multiple risk factors was 6.6 percent with clopidogrel and 5.5 percent with placebo (relative risk, 1.2; 95 percent confidence interval, 0.91 to 1.59; P=0.20) and the rate of death from cardiovascular causes also was higher with clopidogrel (3.9 percent vs. 2.2 percent, P=0.01). In the subgroup with clinically evident atherothrombosis, the rate was 6.9 percent with clopidogrel and 7.9 percent with placebo (relative risk, 0.88; 95 percent confidence interval, 0.77 to 0.998; P=0.046).
CONCLUSIONS In this trial, there was a suggestion of benefit with clopidogrel treatment in patients with symptomatic atherothrombosis and a suggestion of harm in patients with multiple risk factors. Overall, clopidogrel plus aspirin was not significantly more effective than aspirin alone in reducing the rate of myocardial infarction, stroke, or death from cardiovascular causes. (ClinicalTrials.gov number, NCT00050817.) OBJECTIVES This trial sought to assess the influence of omeprazole on clopidogrel efficacy. BACKGROUND Clopidogrel has proved its benefit in the treatment of atherothrombotic diseases. In a previous observational study, we found clopidogrel activity on platelets, tested by vasodilator-stimulated phosphoprotein (VASP) phosphorylation, to be diminished in patients receiving proton pump inhibitor (PPI) treatment. METHODS In this double-blind placebo-controlled trial, all consecutive patients undergoing coronary artery stent implantation received aspirin (75 mg/day) and clopidogrel (loading dose, followed by 75 mg/day) and were randomized to receive either associated omeprazole (20 mg/day) or placebo for 7 days. Clopidogrel effect was tested on days 1 and 7 in both groups by measuring platelet phosphorylated-VASP expressed as a platelet reactivity index (PRI). Our main end point compared PRI value at the 7-day treatment period in the 2 groups. RESULTS Data for 124 patients were analyzed. On day 1, mean PRI was 83.2% (standard deviation [SD] 5.6) and 83.9% (SD 4.6), respectively, in the placebo and omeprazole groups (p = NS), and on day 7, 39.8% (SD 15.4) and 51.4% (SD 16.4), respectively (p < 0.0001). CONCLUSIONS Omeprazole significantly decreased clopidogrel inhibitory effect on platelet P2Y12 as assessed by VASP phosphorylation test. Aspirin-clopidogrel antiplatelet dual therapy is widely prescribed worldwide, with PPIs frequently associated to prevent gastrointestinal bleeding.
The clinical impact of these results remains uncertain but merits further investigation
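Several of the trial abstracts above summarize treatment effects as per-arm event rates together with a ratio and a 95% confidence interval. As a rough illustration of where such numbers come from, the sketch below computes a simple relative risk with a log-scale Wald interval from a 2×2 table. The event counts are hypothetical (loosely in the range of the reported gastrointestinal end point), and the trials themselves report time-to-event hazard ratios from survival models, so this is only an approximation of the same idea, not the published analysis.

```python
import math

def relative_risk_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Relative risk of arm A vs arm B with an approximate 95% CI
    (Wald interval on the log scale, independent binomial arms)."""
    risk_a = events_a / n_a
    risk_b = events_b / n_b
    rr = risk_a / risk_b
    # Standard error of log(RR) for two independent binomial counts
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts for illustration only (not the trial's actual 2x2 table)
rr, lo, hi = relative_risk_ci(events_a=20, n_a=1876, events_b=51, n_b=1885)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

A ratio below 1 with an interval excluding 1 corresponds to the kind of statistically significant risk reduction the abstracts describe; the published hazard ratios additionally account for follow-up time and censoring.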
2,056
26,772,611
Conclusion: This systematic review and meta-analysis demonstrates that younger age and a return to high level of activity are salient factors associated with secondary ACL injury. These data indicate that activity modification, improved rehabilitation and return-to-play guidelines, and the use of integrative neuromuscular training may help athletes more safely reintegrate into sport and reduce second injury in this at-risk population
Background: Injury to the ipsilateral graft used for reconstruction of the anterior cruciate ligament (ACL) or a new injury to the contralateral ACL are disastrous outcomes after successful ACL reconstruction (ACLR), rehabilitation, and return to activity. Studies reporting ACL reinjury rates in younger active populations are emerging in the literature, but these data have not yet been comprehensively synthesized. Purpose: To provide a current review of the literature to evaluate age and activity level as the primary risk factors in reinjury after ACLR.
Background Neuromuscular training may reduce risk factors that contribute to ACL injury incidence in female athletes. Multi-component, ACL injury prevention training programs can be time and labor intensive, which may ultimately limit training program utilization or compliance. The purpose of this study was to determine the effect of neuromuscular training on those classified as "high-risk" compared to those classified as "low-risk." The hypothesis was that high-risk athletes would decrease knee abduction moments while low-risk and control athletes would not show measurable changes. Methods Eighteen high school female athletes participated in neuromuscular training 3×/week over a 7-week period. Knee kinematics and kinetics were measured during a drop vertical jump (DVJ) test at pre/post training. External knee abduction moments were calculated using inverse dynamics. Logistic regression indicated maximal sensitivity and specificity for prediction of ACL injury risk using external knee abduction (25.25 Nm cutoff) during a DVJ. Based on these data, 12 study subjects (and 4 controls) were grouped into the high-risk (knee abduction moment > 25.25 Nm) and 6 subjects (and 7 controls) were grouped into the low-risk (knee abduction < 25.25 Nm) categories using mean right and left leg knee abduction moments. A mixed design repeated measures ANOVA was used to determine differences between athletes categorized as high or low-risk. Results Athletes classified as high-risk decreased their knee abduction moments by 13% following training (Dominant pre: 39.9 ± 15.8 Nm to 34.6 ± 9.6 Nm; Non-dominant pre: 37.1 ± 9.2 to 32.4 ± 10.7 Nm; p = 0.033 training X risk factor interaction). Athletes grouped into the low-risk category did not change their abduction moments following training (p > 0.05). Control subjects classified as either high or low-risk also did not significantly change from pre to post-testing.
Conclusion These results indicate that "high-risk" female athletes decreased the magnitude of the previously identified risk factor to ACL injury following neuromuscular training. However, the mean values for the high-risk subjects were not reduced to levels similar to low-risk group following training. Targeting female athletes who demonstrate high-risk knee abduction loads during dynamic tasks may improve efficacy of neuromuscular training. Yet, increased training volume or more specific techniques may be necessary for high-risk athletes to substantially decrease ACL injury risk Background: Previous studies have found significant predictors for functional outcome after anterior cruciate ligament (ACL) reconstruction; however, studies examining predictors for functional outcome in nonoperatively treated individuals are lacking. Hypothesis: Single-legged hop tests predict self-reported knee function (International Knee Documentation Committee [IKDC] 2000) in nonoperatively treated ACL-injured individuals 1 year after baseline testing. Study Design: Cohort study (prognosis); Level of evidence, 2. Methods: Ninety-one nonoperatively treated patients with an ACL injury were tested using 4 single-legged hop tests on average 74 ± 30 days after injury in a prospective cohort study. Eighty-one patients (89%) completed the IKDC 2000 1 year later. Patients with an IKDC 2000 score equal to or higher than the age- and gender-specific 15th percentile score from previously published data on an uninjured population were classified as having self-reported function within normal ranges. Logistic regression analyses were performed to identify predictors of self-reported knee function. The area under the curve (AUC) from receiver operating characteristic curves was used as a measure of discriminative accuracy.
Optimal limb symmetry index (LSI) cutoff for the best single-legged hop test was defined as the LSI with the highest product of sensitivity and specificity. Results: Single hop for distance symmetry indexes predicted self-reported knee function at the 1-year follow-up (P = .036). Combinations of any 2 hop tests (AUC = 0.64-0.71) did not give a higher discriminative accuracy than the single hop alone (AUC = 0.71). A cutoff of 88% (LSI) for the single hop revealed a sensitivity of 71.4% and a specificity of 71.7%. Conclusion: The single hop for distance (LSI) significantly predicted self-reported knee function after 1 year in nonoperatively treated ACL-injured patients. Combinations of 2 single-legged hop tests did not lead to higher discriminative accuracy than the single hop alone BACKGROUND Knee valgus load during sports movement is viewed as an important predictor of non-contact anterior cruciate ligament injury risk, particularly in females. Formulating movement strategies that can reduce valgus loading during these movements therefore appears pertinent to reducing anterior cruciate ligament injury rates. With this in mind, the current study examined the relationship between peak valgus moment and lower extremity postures at impact during a sidestep cutting task. METHODS Ten male and ten female NCAA athletes had initial contact three-dimensional hip, knee and ankle angles and subsequent knee valgus moment quantified during the execution of (n=10 trials) sidesteps. Peak valgus data were normalized to mass and height and tested for the main effect of gender (ANOVA, P<0.05). Intra-subject correlations between the eight initial joint angles and the normalized valgus moment were then conducted across the ten sidestepping trials. The ensuing slopes of regression were submitted to a two-sample t-test to determine whether mean slope values were significantly different from zero and for the main effect of gender (P<0.05).
FINDINGS Females had significantly larger normalized knee valgus moments than males. A greater peak valgus moment was associated with larger initial hip flexion and internal rotation, and with larger initial knee valgus angle. Peak knee valgus moment was more sensitive to initial hip internal rotation and knee valgus position in females. INTERPRETATION Training of neuromuscular control at the hip joint may reduce the likelihood of anterior cruciate ligament injury via a valgus loading mechanism during sidestepping, especially in females Background: The incidence of second anterior cruciate ligament (ACL) injuries in the first 12 months after ACL reconstruction (ACLR) and return to sport (RTS) in a young, active population has been reported to be 15 times greater than that in a previously uninjured cohort. There are no reported estimates of whether this high relative rate of injury continues beyond the first year after RTS and ACLR. Hypothesis: The incidence rate of a subsequent ACL injury in the 2 years after ACLR and RTS would be less than the incidence rate reported within the first 12 months after RTS but greater than the ACL injury incidence rate in an uninjured cohort of young athletes. Study Design: Cohort study; Level of evidence, 2. Methods: Seventy-eight patients (mean age, 17.1 ± 3.1 years) who underwent ACLR and were ready to return to a pivoting/cutting sport and 47 controls (mean age, 17.2 ± 2.6 years) who also participated in pivoting/cutting sports were prospectively enrolled. Each participant was followed for injury and athlete exposure (AE) data for a 24-month period after RTS. Twenty-three ACLR and 4 control participants suffered an ACL injury during this time. Incidence rate ratios (IRRs) were calculated to compare the rates (per 1000 AEs) of ACL injury in athletes in the ACLR and control groups. For the ACLR group, similar comparisons were conducted for side of injury by sex.
Results: The overall incidence rate of a second ACL injury within 24 months after ACLR and RTS (1.39/1000 AEs) was nearly 6 times greater (IRR, 5.71; 95% CI, 2.0-22.7; P = .0003) than that in healthy control participants (0.24/1000 AEs). The rate of injury within 24 months of RTS for female athletes in the ACLR group was almost 5 times greater (IRR, 4.51; 95% CI, 1.5-18.2; P = .0004) than that for female controls. Although only a trend was observed, female patients within the ACLR group were twice as likely (IRR, 2.43; 95% CI, 0.8-8.6) to suffer a contralateral injury (1.13/1000 AEs) than an ipsilateral injury (0.47/1000 AEs). Overall, 29.5% of athletes suffered a second ACL injury within 24 months of RTS, with 20.5% sustaining a contralateral injury and 9.0% incurring a retear injury of the ipsilateral graft. There was a trend toward a higher proportion of female participants (23.7%) who suffered a contralateral injury compared with male participants (10.5%) (P = .18). Conversely, for ipsilateral injuries, the incidence proportion between female (8.5%) and male (10.5%) participants was similar. Conclusion: These data support the hypothesis that in the 24 months after ACLR and RTS, patients are at a greater risk to suffer a subsequent ACL injury compared with young athletes without a history of ACL injuries. In addition, the contralateral limb of female patients appears at greatest risk PURPOSE The aim of this study was to determine the rates of contralateral anterior cruciate ligament (ACL) rupture and of ACL graft rupture after ACL reconstruction using either patellar tendon or hamstring tendon autograft, and to identify any patient characteristics that may increase this risk. TYPE OF STUDY Case series. METHODS Over a 2-year period, 760 endoscopic ACL reconstructions were performed in 743 patients.
Bone-patellar tendon-bone autograft was used in 316 patients and 4-strand hamstring tendon in 427 patients. Those patients with a previous contralateral ACL rupture or those who underwent a simultaneous bilateral ACL reconstruction were excluded, leaving 675 knees (675 patients) for review. Persons not involved in the index operation or the care of the patient conducted follow-up assessment by telephone interview conducted 5 years after surgery. Patients were questioned about the incidence of ACL graft rupture, contralateral ACL rupture, symptoms of instability or significant injury, family history of ACL injury, and activity level according to the International Knee Documentation Committee scale. From our prospective database we obtained further information on graft source, meniscal or articular surface injury, and gender. Binary logistic regression was used to measure the relative association between the measured variables and the risk of graft rupture and contralateral ACL rupture. RESULTS Five years after primary ACL reconstruction, 612 of the 675 patients (90.7%) were assessed. ACL graft rupture occurred in 39 patients (6%) and contralateral ACL rupture occurred in 35 patients (6%). Three patients suffered both a graft rupture and a contralateral ACL injury. The odds of ACL graft rupture were increased 3-fold by a contact mechanism of initial injury. Return to level 1 or 2 sports increased the risk of contralateral ACL injury by a factor of 10. The risk of sustaining an ACL graft rupture was greatest in the first 12 months after reconstruction. No other studied variable increased the risk of repeat ACL injury. CONCLUSIONS After reconstruction, repeat ACL injury occurred in 12% of patients over 5 years. Twelve months after reconstruction, the ACL graft is at no greater risk than the contralateral ACL, suggesting that adequate graft and muscular function for most activities is achieved by this time.
Risk factors for repeat ACL injury identified included a return to competitive side-stepping, pivoting, or jumping sports, and the contact mechanism of the index injury. Female patients were at no greater risk of repeat ACL injury than male patients and graft choice did not affect the rate of repeat ACL injury. LEVEL OF EVIDENCE Level IV, case series Background: Tearing an anterior cruciate ligament (ACL) graft is a devastating occurrence after ACL reconstruction (ACLR). Identifying and understanding the independent predictors of ACLR graft failure is important for surgical planning, patient counseling, and efforts to decrease the risk of graft failure. Hypothesis: Patient and surgical variables will predict graft failure after ACLR. Study Design: Prospective cohort study. Methods: A multicenter group initiated a cohort study in 2002 to identify predictors of ACLR outcomes, including graft failure. First, to control for confounders, a single surgeon's data (n = 281 ACLRs) were used to develop a multivariable regression model for ACLR graft failure. Evaluated variables were graft type (autograft vs allograft), sex, age, body mass index, activity at index injury, presence of a meniscus tear, and primary versus revision reconstruction. Second, the model was validated with the rest of the multicenter study's data (n = 645 ACLRs) to evaluate the generalizability of the model. Results: Patient age and ACL graft type were significant predictors of graft failure for all study surgeons. Patients in the age group of 10 to 19 years had the highest percentage of graft failures. The odds of graft rupture with an allograft reconstruction are 4 times higher than those of autograft reconstructions. For each 10-year decrease in age, the odds of graft rupture increase 2.3 times. Conclusion: There is an increased risk of ACL graft rupture in patients who have undergone allograft reconstruction.
Younger patients also have an increased risk of ACL graft failure. Clinical Relevance: Given these risks for ACL graft rupture, allograft ACLRs should be performed with caution in the younger patient population. Background The risk of subsequent anterior cruciate ligament injury to either knee after surgery based on sex, age, and activity has not been extensively studied. Hypotheses Women have a higher incidence of anterior cruciate ligament injury to the contralateral knee after surgery than men but do not have a difference in injuries to the reconstructed knee. Young, competitive athletes have a higher incidence of injury than older patients. The time to return to full activities does not affect injury rate. Study Design Cohort study (prognosis); Level of evidence, 2. Methods The authors prospectively followed 1820 patients after primary anterior cruciate ligament reconstruction to determine if patients suffered an injury to either knee within 5 years after surgery. Subsequent injury was evaluated based on sex, age, and activity level. Results Minimum 5-year follow-up was obtained on 1415 patients (78%). Seventy-five patients (5.3%) had an injury to the contralateral knee, and 61 patients (4.3%) suffered an injury to the reconstructed knee (P = .2185). Women suffered more injuries (7.8%) to the contralateral normal knee than men (3.7%; P < .001) but not more injuries to the reconstructed knee (4.3% vs 4.1%; P = .5543). The risk of subsequent injury to either knee was 17% for patients < 18 years old, 7% for patients aged 18 to 25 years, and 4% for patients older than 25 years. There was no difference in injury rate between patients who returned before and after 6 months postoperatively. Conclusion Women have a higher incidence of anterior cruciate ligament injury to the contralateral knee than men after reconstruction.
The incidence of injury to either knee after reconstruction is associated with younger age and higher activity level, but returning to full activities before 6 months postoperatively does not increase the risk of subsequent injury. OBJECTIVE: To test the feasibility of creating a valid and reliable checklist with the following features: appropriate for assessing both randomised and non-randomised studies; provision of both an overall score for study quality and a profile of scores not only for the quality of reporting, internal validity (bias and confounding) and power, but also for external validity. DESIGN: A pilot version was first developed, based on epidemiological principles, reviews, and existing checklists for randomised studies. Face and content validity were assessed by three experienced reviewers, and reliability was determined using two raters assessing 10 randomised and 10 non-randomised studies. Using different raters, the checklist was revised and tested for internal consistency (Kuder-Richardson 20), test-retest and inter-rater reliability (Spearman correlation coefficient and sign rank test; kappa statistics), criterion validity, and respondent burden. MAIN RESULTS: The performance of the checklist improved considerably after revision of a pilot version. The Quality Index had high internal consistency (KR-20: 0.89), as did the subscales apart from external validity (KR-20: 0.54). Test-retest (r 0.88) and inter-rater (r 0.75) reliability of the Quality Index were good. Reliability of the subscales varied from good (bias) to poor (external validity). The Quality Index correlated highly with an existing, established instrument for assessing randomised studies (r 0.90). There was little difference between its performance with non-randomised and with randomised studies. Raters took about 20 minutes to assess each paper (range 10 to 45 minutes).
CONCLUSIONS: This study has shown that it is feasible to develop a checklist that can be used to assess the methodological quality not only of randomised controlled trials but also non-randomised studies. It has also shown that it is possible to produce a checklist that provides a profile of the paper, alerting reviewers to its particular methodological strengths and weaknesses. Further work is required to improve the checklist and the training of raters in the assessment of external validity. Recent publications have reported differences in the incidence, rate, risk, and type of sports injury among men and women. We undertook a prospective study to determine the incidence of injury among high school basketball players and to examine the differences in injury type, incidence, rate, and risk between male and female athletes. During a single basketball season, an injury survey of girls' varsity teams at 100 class 4A and 5A high schools in Texas was conducted. These data were previously reported. We surveyed the same 100 high schools during a subsequent season to gather injury data from the boys' varsity teams. The athletic trainer collected data on each reportable injury and reported the data weekly to the University Interscholastic League. A reportable injury was defined as one that occurred during a practice or a game, resulted in missed practice or game time, required physician consultation, or involved the head or the face. The boys' and girls' data were compared and statistically analyzed. The rate of injury was 0.56 among the boys and 0.49 among the girls. The risk of injury per hour of exposure was not significantly different between the two groups. In both groups, the most common injuries were sprains, and the most commonly injured area was the ankle, followed by the knee. Female athletes had a significantly higher rate of knee injuries, including a 3.79 times greater risk of anterior cruciate ligament injuries.
For both sexes, the risk of injury during a game was significantly higher than during practice. Background The risk of tear of the intact anterior cruciate ligament in the contralateral knee after anterior cruciate ligament reconstruction of the opposite knee, and the incidence of rupturing the anterior cruciate ligament graft during the first 2 years after surgery, have not been extensively studied in a prospective manner. Clinicians have hypothesized that the opposite normal knee is at equal or increased risk compared with the risk of anterior cruciate ligament graft rupture in the operated knee. Hypothesis The risk of anterior cruciate ligament graft rupture and contralateral normal knee anterior cruciate ligament rupture at 2-year follow-up is equal. Study Design Cohort study; Level of evidence, 2. Methods The Multicenter Orthopaedic Outcome Network (MOON) database of a prospective longitudinal cohort of anterior cruciate ligament reconstructions was used to determine the number of anterior cruciate ligament graft ruptures and tears of the intact anterior cruciate ligament in the contralateral knee at 2-year follow-up. Two-year follow-up consisted of a phone interview and review of operative reports. Results Two-year data were obtained for 235 of 273 patients (86%). There were 14 ligament disruptions. Of these, 7 were tears of the intact anterior cruciate ligament in the contralateral knee (3.0%) and 7 were anterior cruciate ligament graft failures (3.0%). Conclusion The contralateral normal knee anterior cruciate ligament is at a similar risk of anterior cruciate ligament tear (3.0%) as the anterior cruciate ligament graft after primary anterior cruciate ligament reconstruction (3.0%). Background: Patients generally choose to undergo anterior cruciate ligament reconstruction (ACLR) to return to their active lifestyles.
However, returning to their previous activity level may result in a retear of their reconstructed knee or an injury to their contralateral anterior cruciate ligament (CACL). Purpose: To determine the risk factors associated with revision ACLR and contralateral ACLR (CACLR), compare the survival of the reconstructed ACL with the CACL, and determine how the risk factors associated with revision ACLR compare with those for CACLR. Study Design: Cohort study; Level of evidence, 3. Methods: A retrospective cohort study of prospectively collected data from the Kaiser Permanente ACLR registry between February 1, 2005, and September 30, 2012, was conducted. Primary ACLR cases without history of contralateral knee ACL injury were included. The study endpoints included revision ACLR and CACLR. Graft type (bone-patellar tendon-bone [BPTB] autograft, hamstring autograft, and allograft) was the main exposure of interest, and patient characteristics were evaluated as risk factors for revision ACLR and CACLR. Survival analyses were conducted. Results: A total of 17,436 ACLRs were evaluated. The median age was 27.2 years (interquartile range, 18.7-37.7 years), and 64% were males. The 5-year survival for index ACLR was 95.1% (95% CI, 94.5%-95.6%), and for CACL it was 95.8% (95% CI, 95.2%-96.3%). Overall, the cohort had a mean of 2.4 ± 1.7 years of follow-up; 18.2% were lost to follow-up. There were fewer CACLRs per 100 years of observation (0.83) than there were revision ACLRs (1.05) during the study period (P < .001). There was a statistically significant difference in the density of revision ACLR and CACL in BPTB autografts (0.74 vs 1.06, respectively; P = .010), hamstring autografts (1.07 vs 0.81; P = .042), and allografts (1.26 vs 0.67; P < .001). The risk factors for revision ACLR and contralateral surgery were different (P < .05).
After adjusting for covariates, factors associated with higher risk of revision ACLR were as follows: allografts, hamstring autografts, male sex, younger age, lower body mass index (BMI), and being white as opposed to black. Factors associated with higher risk of CACLR were as follows: younger age, female sex, and lower BMI. Conclusion: The 5-year revision-free and CACLR-free survival rates in this study were 95.1% and 95.8%, respectively. Allografts and hamstring autografts had a higher risk of revision ACLR surgery, and BPTB autografts had a higher risk of CACLR. Males were found to have a higher risk of revision ACLR, and females had a higher risk of CACLR. Increasing age and increasing BMI decreased the risk of both revision and CACLR. Background Female athletes participating in high-risk sports suffer anterior cruciate ligament injury at a 4- to 6-fold greater rate than do male athletes. Hypothesis Prescreened female athletes with subsequent anterior cruciate ligament injury will demonstrate decreased neuromuscular control and increased valgus joint loading, predicting anterior cruciate ligament injury risk. Study Design Cohort study; Level of evidence, 2. Methods There were 205 female athletes in the high-risk sports of soccer, basketball, and volleyball prospectively measured for neuromuscular control using 3-dimensional kinematics (joint angles) and joint loads using kinetics (joint moments) during a jump-landing task. Analysis of variance as well as linear and logistic regression were used to isolate predictors of risk in athletes who subsequently ruptured the anterior cruciate ligament. Results Nine athletes had a confirmed anterior cruciate ligament rupture; these 9 had significantly different knee posture and loading compared to the 196 who did not have anterior cruciate ligament rupture. Knee abduction angle (P < .05) at landing was 8° greater in anterior cruciate ligament-injured than in uninjured athletes. Anterior cruciate ligament-injured athletes had a 2.5 times greater knee abduction moment (P < .001) and 20% higher ground reaction force (P < .05), whereas stance time was 16% shorter; hence, increased motion, force, and moments occurred more quickly. Knee abduction moment predicted anterior cruciate ligament injury status with 73% specificity and 78% sensitivity; dynamic valgus measures showed a predictive r2 of 0.88. Conclusion Knee motion and knee loading during a landing task are predictors of anterior cruciate ligament injury risk in female athletes. Clinical Relevance Female athletes with increased dynamic valgus and high abduction loads are at increased risk of anterior cruciate ligament injury. The methods developed may be used to monitor neuromuscular control of the knee joint and may help develop simpler measures of neuromuscular control that can be used to direct female athletes to more effective, targeted interventions. The consequences of athletic injuries extend beyond the musculoskeletal system. Depression, anger, and tension have been observed in athletes with athletic injuries. It was hypothesized that among student athletes, the psychologic impact of injury may be seen as a drop in academic performance. Thirty-eight students who had an anterior cruciate ligament injury and subsequent reconstruction were evaluated retrospectively by academic transcript and questionnaire to measure their academic performance before their injury, in the semester of their injury, and in the semester after their surgery. The patients were compared with randomly selected undergraduate control subjects.
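The 78% sensitivity and 73% specificity reported above for the knee abduction moment predictor can be reproduced from a confusion matrix. The counts below are back-calculated from the rounded percentages and the cohort sizes (9 injured, 196 uninjured), so treat them as illustrative:

```python
# Back-calculated confusion matrix for the knee-abduction-moment predictor
# (illustrative; derived from the rounded 78% sensitivity / 73% specificity).
tp, fn = 7, 2      # of 9 ACL-injured athletes, 7 flagged as high risk
tn, fp = 143, 53   # of 196 uninjured athletes, 143 correctly cleared

sensitivity = tp / (tp + fn)   # fraction of injured athletes detected
specificity = tn / (tn + fp)   # fraction of uninjured athletes cleared
print(f"sensitivity={sensitivity:.0%}, specificity={specificity:.0%}")
```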
To evaluate any effect of the timing of the surgery on academic performance, the patients were separated into two groups according to the timing of their reconstruction: those who had surgery during the academic semester, and those who elected to wait for a school break. There was a significant drop in grade point average of 0.3 grade points during the semester of injury among all injured students. Compared with those who had surgery during a break, the students who had surgery during the semester more frequently received the grade of failure (6% versus 0%) or incomplete (33% versus 9%). These students also missed more school days (10.5 days versus 1.5 days) and examinations (2.2 examinations versus 0.1 examinations). Only 47% of students who had surgery during the semester were satisfied with their decision for surgical timing, compared with 96% satisfied with the timing during an academic break. Acute anterior cruciate ligament rupture, and surgical reconstruction during an academic semester, have a significant academic effect in university students. Objective The incidence rate (IR) of an ipsilateral or contralateral injury after anterior cruciate ligament reconstruction (ACLR) is unknown. The hypotheses were that the IR of anterior cruciate ligament (ACL) injury after ACLR would be greater than the IR in an uninjured cohort of athletes and would be greater in female athletes after ACLR than male athletes. Design Prospective case-control study. Setting Regional sports community. Participants Sixty-three subjects who had ACLR and were ready to return to sport (RTS) and 39 control subjects. Independent Variables Second ACL injury and sex. Main Outcome Measures Second ACL injury and athletic exposure (AE) were tracked for 12 months after RTS. Sixteen subjects after ACLR and 1 control subject suffered a second ACL injury. Between- and within-group comparisons of second ACL injury rates (per 1000 AEs) were conducted.
Results The IR of ACL injury after ACLR (1.82/1000 AE) was 15 times greater [risk ratio (RR) = 15.24; P = 0.0002] than that of control subjects (0.12/1000 AE). Female ACLR athletes demonstrated a 16 times greater rate of injury (RR = 16.02; P = 0.0002) than female control subjects. Female athletes were 4 times (RR = 3.65; P = 0.05) more likely to suffer a second ACL injury and 6 times (RR = 6.21; P = 0.04) more likely to suffer a contralateral injury than male athletes. Conclusions An increased rate of second ACL injury after ACLR exists in athletes when compared with a healthy population. Female athletes suffer contralateral ACL injuries at a higher rate than male athletes and seem to suffer contralateral ACL injuries more frequently than graft re-tears. The identification of a high-risk group within a population of ACLR athletes is a critical step to improve outcome after ACLR and RTS. Background: Revision surgery is one of the most important endpoints during follow-up after anterior cruciate ligament (ACL) reconstruction. Purpose: To investigate if commonly known patient factors can predict revision surgery after ACL reconstruction. Study Design: Cohort study; Level of evidence, 2. Methods: This prospective cohort study was based on data from the Swedish National Knee Ligament Register during the period of January 1, 2005, through December 31, 2013. Patients who underwent primary ACL reconstruction with hamstring tendon or bone-patellar tendon-bone autografts were included. Follow-up started on the date of primary ACL reconstruction, and follow-up ended with ACL revision surgery, after 24 months of follow-up, or on December 31, 2013, whichever occurred first. The analyzed patient variables were activity at the time of injury, sex, age, height, weight, body mass index, smoking, and the use of smokeless tobacco.
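The incidence rates reported above are injuries per 1000 athlete-exposures (AEs), and the risk ratio is simply their quotient. A quick sketch, using the rounded rates from the abstract (so the result differs slightly from the published RR of 15.24, which was computed from unrounded counts):

```python
# Incidence rate = injuries per 1000 athletic exposures (AE).
# Rates are the rounded values reported above; raw exposure counts are not given.
ir_aclr = 1.82      # second ACL injuries per 1000 AE after reconstruction
ir_control = 0.12   # ACL injuries per 1000 AE in uninjured controls

risk_ratio = ir_aclr / ir_control
print(round(risk_ratio, 1))  # close to the published RR of 15.24
```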
The primary study endpoint was revision surgery, defined as replacement of a primary ACL reconstruction. Relative risk (RR) and 95% CIs were calculated and adjusted for confounding factors using multivariate statistics. Results: A total of 16,930 patients were included (males, n = 9767 [57.7%]; females, n = 7163 [42.3%]). The 2-year revision rate was 1.82% (95% CI, 1.62%-2.02%). There was no significant difference between male and female revision rates (1.74% [95% CI, 1.48%-2.00%] vs 1.93% [95% CI, 1.61%-2.25%], P = .383). In both males and females there was a significantly increased risk of revision surgery associated with soccer playing and adolescence (age 13-19 years) (males: RR = 1.58 [95% CI, 1.12-2.23], P = .009 and RR = 2.67 [95% CI, 1.91-3.73], P < .001, respectively; females: RR = 1.43 [1.01-2.04], P = .045 and RR = 2.25 [95% CI, 1.57-3.24], P < .001, respectively). A combination of these predictors was associated with a further increased risk of revision surgery (males: RR = 2.87 [95% CI, 1.79-4.60], P < .001; females: RR = 2.59 [95% CI, 1.69-3.96], P < .001). Conclusion: Soccer players and adolescents had an increased risk of revision surgery after ACL reconstruction, with a respective factor of 1.5 and 2.5. Individuals with a combination of these 2 predictors carried an almost 3-fold higher risk of revision surgery. There were no significant associations for sex, height, weight, body mass index, or tobacco use.
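The 2-year revision rate and its confidence interval above follow from a standard normal-approximation (Wald) interval for a proportion. A sketch using the cohort size and the rounded rate from the abstract reproduces the published 1.62%-2.02% bounds:

```python
import math

# Normal-approximation (Wald) 95% CI for a proportion.
# n is the cohort size from the abstract; p is the rounded 2-year revision rate.
n = 16930
p = 0.0182  # 1.82%

se = math.sqrt(p * (1 - p) / n)          # standard error of the proportion
lo, hi = p - 1.96 * se, p + 1.96 * se    # 95% confidence bounds
print(f"{lo:.2%} - {hi:.2%}")            # matches the reported 1.62%-2.02%
```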
2,057
20,724,193
There is a lack of evidence of a survival advantage from using this treatment at the time of diagnosis or relapse. There is also insufficient evidence regarding the benefits and harms of stereotactic fractionated radiation therapy for patients with glioma. Stereotactic irradiation as a boost in primarily diagnosed glioma or relapsed tumour is not associated with survival improvement.
The purpose of this systematic literature review was to evaluate the use of stereotactic radiotherapy in glioma.
PURPOSE A phase I dose escalation of hypofractionated stereotactic radiotherapy (H-SRT) in recurrent or persistent malignant gliomas as a means of increasing the biologically effective dose and decreasing the high rate of reoperation due to toxicity associated with single-fraction stereotactic radiosurgery (SRS) and brachytherapy. MATERIALS AND METHODS From November 1994 to September 1996, 25 lesions in 20 patients with clinical and/or imaging evidence of malignant glioma persistence or recurrence received salvage H-SRT. Nineteen patients at the time of initial diagnosis had glioblastoma multiforme (GBM) and one patient had an anaplastic astrocytoma. All of these patients with tumor persistence or recurrence had received initial fractionated radiation therapy (RT) with a mean and median dose of 60 Gy (44.0-72.0 Gy). The median time from completion of initial RT to H-SRT was 3.1 months (0.7-45.5 months). Salvage H-SRT was delivered using daily 3.0-3.5 Gy fractions (fxs). Three different total dose levels were sequentially evaluated: 24.0 Gy/3.0 Gy fxs (five lesions), 30.0 Gy/3.0 Gy fxs (10 lesions), and 35.0 Gy/3.5 Gy fxs (nine lesions). Median treated tumor volume measured 12.66 cc (0.89-47.5 cc). The median ratio of prescription volume to tumor volume was 2.8 (1.4-5.0). Toxicity was judged by RTOG criteria. Response was determined by clinical neurologic improvement, a decrease in steroid dose without clinical deterioration, and/or radiologic imaging. RESULTS No grade 3 toxicities were observed and no reoperation due to toxicity was required. At the time of analysis, 13 of 20 patients had died. The median survival time from the completion of H-SRT is 10.5 months, with a 1-year survival rate of 20%. Neurological improvement was found in 45% of patients. Decreased steroid requirements occurred in 60% of patients. Minor imaging response was noted in 22% of patients.
Using Fisher's exact test, response of any kind correlated strongly with total dose (p = 0.0056). None of six lesions treated with 21 Gy or 24 Gy responded, whereas there was a 79% response rate among the 19 lesions treated with 30 or 35 Gy. Tumor volumes ≤ 20 cc were associated with a higher likelihood of response (p = 0.053). CONCLUSIONS H-SRT used in this cohort of previously irradiated patients with malignant glioma was not associated with the need for reoperation due to toxicity or grade 3 toxicity. This low toxicity profile and encouraging H-SRT dose-related response outcome justify further evaluation and dose escalation. The purpose of this study was to retrospectively evaluate the survival of patients with high-grade gliomas treated with external beam radiotherapy with or without radiosurgical boost. From July 1993 to April 1998, 32 patients were selected, 15 of whom received radiosurgery. Inclusion criteria were age > 18 years, histological confirmation of high-grade glioma, primary tumor treatment with curative intent, unifocal tumor, and supratentorial location. All patients were found to be in classes III-VI according to the recursive partitioning analysis proposed by the Radiation Therapy Oncology Group. The median interval between radiotherapy and radiosurgery was 5 weeks (range 1-13). Treatment volumes ranged from 2.9 to 70.3 cc (median 15.0 cc). Prescribed radiosurgery doses varied from 8.0 to 12.5 Gy (median 10.0 Gy). Radiosurgery and control groups were well balanced with respect to prognostic factor distributions. Median actuarial survival time in the radiosurgery and control groups was 21.4 months and 11.6 months, respectively (p = 0.0254). Among patients with KPS ≥ 80, median survival time was 11.0 months and 53.9 months in the control and radiosurgery groups, respectively (p = 0.0103).
Radiosurgery was the single factor correlated with survival on Cox model analysis (p = 0.0362) and was associated with a 2.76-fold relative reduction in the risk of cancer death (95% confidence interval (CI) 1.07-7.13). Our results suggest that radiosurgery may confer a survival advantage for patients in RPA classes III-VI, especially for those with Karnofsky performance status ≥ 80. The definitive role of radiosurgical boost for patients with high-grade gliomas awaits the results of randomized trials. Background. The treatment of recurrent gliomas is palliative; however, the local pattern of tumor recurrence permits retreatment with single-fraction, high-dose stereotactic radiotherapy or radiosurgery (RS). Twenty-two patients with recurrent glioma have been treated on a dose escalation protocol with fractionated stereotactic external beam radiotherapy (SRT). All had previously received radical radiotherapy (median dose 55 Gy) as part of the initial treatment. The dose of SRT was increased from 30 Gy in six fractions to 50 Gy in ten fractions. Median survival from the date of SRT was 9.8 months. There was no significant acute morbidity, but five patients who received ≥ 40 Gy developed steroid-responsive neurological deterioration assumed to represent late radiation damage. The survival and toxicity in patients with recurrent glioma are comparable with interstitial therapy. Fractionated SRT is a noninvasive form of localised radiation which may be a suitable alternative to interstitial therapy in this group of patients. In a non-randomised study in six centres in the UK, 24 patients with previously untreated small-cell lung cancer of limited extent were treated with a regimen of alternating chemotherapy and radiotherapy to assess response, toxicity, and the feasibility of applying such a regimen on a multicentre basis in the UK.
The intention was to give six courses of chemotherapy on five consecutive days at 4-week intervals: etoposide 75 mg/m2 on days 1, 2, and 3; doxorubicin 40 mg/m2 on day 1; cisplatin 100 mg/m2 on day 2; and cyclophosphamide 300 mg/m2 on days 2, 3, 4 and 5. A dose of 20 Gy thoracic radiotherapy was to be given following the 2nd and the 3rd courses, and one of 15 Gy following the 4th course. After 12 patients had been admitted, the cisplatin dosage was reduced to 80 mg/m2 because of unacceptable toxicity. Two patients were withdrawn during treatment on review of their histology because their diagnosis was found to be incorrect. Only one patient of the 12 treated with cisplatin 100 mg/m2 was able to complete treatment, compared with five of the eligible ten given the lower dosage. Among the 22 patients with confirmed small-cell disease, a complete response was reported in 14 (64%) and a partial response in a further three (total response rate 77%). Myelosuppression was the commonest serious adverse effect. It occurred in 19 of the 24 patients and gave rise to septicaemia in five, four of whom were receiving the higher cisplatin dose. Sixteen patients required blood transfusion and ten platelet transfusion. Vomiting, oesophagitis, and peripheral neuropathy occurred in 12, four and four patients, respectively, and radiation pneumonitis developed in two. Treatment was considered a contributory cause of death in four. The working party concluded that the alternating regimen was feasible in only a small proportion of centres in the UK, and decided not to embark on a multicentre randomised trial comparing alternating with conventional scheduling. Despite the progress in neurosurgery and radiotherapy, almost all patients treated with malignant gliomas develop recurrent tumors and die of their disease.
Eighty-eight patients (median age 56 years) with recurrent glioblastoma (median tumor volume 32.7 cm3) were treated with noninvasive fractionated stereotactic radiosurgery and concurrent paclitaxel used as a sensitizer. The median interval between diagnosis of primary glioblastoma and salvage radiosurgery was 7.8 months. Four weekly treatments (median dose: 6.0 Gy) were delivered after the 3-hour paclitaxel infusion (median dose: 120 mg/m2). Survival was calculated by the Kaplan-Meier method from radiosurgery treatment. Overall median survival was 7.0 months, and the 1-year and 2-year actuarial survival rates were 17% and 3.4%, respectively. When grouped by performance status, there was no difference in survival between the patients with low and high Karnofsky scores. Patients with tumor volume less than 30 cm3 survived significantly longer than those with tumor greater than 30 cm3 (9.4 vs. 5.7 months, p = 0.0001). Their 1-year survival rates were 40% and 8%, respectively. Eleven patients (11%) had reoperation because of expanding mass. Stable disease was seen in 40% of patients (n = 34), and an increase in radiographically detected mass was observed in 41 patients (48.8%). Although the treatment of recurrent GBM is mostly palliative, fractionated radiosurgery offers a chance for prolonged survival, especially in patients with a smaller tumor volume. PURPOSE The purpose is twofold: (1) to identify the malignant glioma patients treated in a trial of hyperfractionated radiotherapy (RT) and carmustine (BCNU) who may have been eligible for a stereotactic radiosurgery (SRS) boost; and (2) to compare survival of such patients with that of those considered SRS-ineligible. PATIENTS AND METHODS From January 1983 to July 1989, 778 malignant glioma patients were enrolled on Radiation Therapy Oncology Group (RTOG) 83-02, a randomized phase I/II hyperfractionated RT dose-escalation trial with BCNU chemotherapy.
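The Kaplan-Meier method referenced above estimates survival as a running product of (1 - deaths/at-risk) taken at each observed event time, with censored patients removed from the risk set without contributing an event. A minimal sketch with made-up follow-up times in months (False marks a censored observation; this is not data from any study above):

```python
def kaplan_meier(durations, events):
    """Kaplan-Meier survival estimate, returned as [(event_time, S(t))].
    events[i] is True for an observed death, False for a censored observation."""
    data = sorted(zip(durations, events))
    at_risk, s, curve, i = len(data), 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e)       # deaths at time t
        c = sum(1 for tt, e in data if tt == t and not e)   # censored at time t
        if d:
            s *= 1 - d / at_risk
            curve.append((t, s))
        at_risk -= d + c
        i += d + c
    return curve

# Hypothetical follow-up times in months (False = censored), not study data:
print(kaplan_meier([6, 7, 10, 15, 19, 25], [True, False, True, True, False, True]))
```

The 1-year and 2-year "actuarial" rates quoted in these abstracts are simply this curve read off at 12 and 24 months.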
The SRS criteria used in a single-institution trial were applied to these patients; they are: Karnofsky performance status (KPS) of greater than 60; well-circumscribed tumor less than 4.0 cm; no subependymal spread; and a location not adjacent to brainstem or optic chiasm. RESULTS Eighty-nine patients (11.9%) were identified as potentially SRS-eligible. The median survival times (MST) and 18-month survival rates of the 89 eligible and 643 ineligible patients were 14.4 versus 11.7 months and 40% versus 27%, respectively (P = .047). The MST and 18-month survival rate of the 544 SRS-ineligible patients with KPS greater than 60 were 12.1 months and 29%, respectively, and were not statistically inferior to the survival of the SRS-eligible group (P = .21). Multivariate analysis revealed age, KPS, and histopathology to be strongly predictive of survival, and SRS eligibility was also significantly predictive (P = .047). CONCLUSION SRS-eligible patients enrolled on RTOG 83-02 had survival superior to that of the SRS-ineligible group, and this advantage is mainly due to the selection of a subgroup with a high minimum KPS. OBJECT After conventional doses of 55 to 65 Gy of fractionated irradiation, glioblastoma multiforme (GBM) usually recurs at its original location. This institutional phase II study was designed to assess whether dose escalation to 90 cobalt gray equivalent (CGE) with conformal protons and photons in accelerated fractionation would improve local tumor control and patient survival. METHODS Twenty-three patients were enrolled in this study. Eligibility criteria included age between 18 and 70 years, Karnofsky Performance Scale score of greater than or equal to 70, residual tumor volume of less than 60 ml, and a supratentorial, unilateral tumor. Actuarial survival rates at 2 and 3 years were 34% and 18%, respectively.
The median survival time was 20 months, with four patients alive 22 to 60 months postdiagnosis. Analysis by Radiation Therapy Oncology Group prognostic criteria or Medical Research Council indices showed a 5- to 11-month increase in median survival time over those of comparable conventionally treated patients. All patients developed new areas of gadolinium enhancement during the follow-up period. Histological examination of tissues obtained at biopsy, resection, or autopsy was conducted in 15 of 23 patients. Radiation necrosis only was demonstrated in seven patients, and their survival was significantly longer than that of patients with recurrent tumor (p = 0.01). Tumor regrowth occurred most commonly in areas that received doses of 60 to 70 CGE or less; recurrent tumor was found in only one case in the 90-CGE volume. CONCLUSIONS A dose of 90 CGE in accelerated fractionation prevented central recurrence in almost all cases. The median survival time was extended to 20 months, likely as a result of central control. Tumors will usually recur in areas immediately peripheral to this 90-CGE volume, but attempts to extend local control by enlarging the central volume are likely to be limited by difficulties with radiation necrosis. PURPOSE To compare survivorship, and acute and delayed toxicities, following radiation therapy (RT) of radiosurgery-ineligible glioblastoma multiforme (GBM) patients treated with tumor volume-influenced, high-dose accelerated, hyperfractionated RT plus bischloroethyl-nitrosourea (BCNU), using prior RTOG malignant glioblastoma patients as historical controls. METHODS AND MATERIALS One hundred four of 108 patients accrued from June 1994 through May 1995 from 26 institutions were analyzable. Patients were histologically confirmed with GBM, and previously untreated.
Treatment assignment (52 patients/arm) was based on tumor mass (TM), defined as the product of the maximum diameter and greatest perpendicular dimension on the T1 gadolinium-enhanced postoperative MRI: Arm A, 64 Gy, TM > 20 cm(2); or Arm B, 70.4 Gy, TM < or = 20 cm(2). Both Arms A and B received BCNU (80 mg/m(2), under hyperhydration) days 1-3, 56-58, then 4 cycles, each 8 weeks, for a total of 6 treatment series. RESULTS During the 24 months immediately post-treatment, the overall median survival was 9.1 months in Arm A (64 Gy) and 11.0 months in Arm B (70.4 Gy). Median survival in recursive partitioning analysis (RPA) Class III/IV was 10.4 months in Arm A and 12.2 months in Arm B, while RPA Class V/VI was 7.6 months in Arm A and 6.1 months in Arm B. There were no grade 4 neurological toxicities in Arm A; 2 grade 4 neurological toxicities were observed in Arm B (1 motor deficit, 1 necrosis at 157 days post-treatment). CONCLUSION This strategy of high-dose, accelerated hyperfractionated radiotherapy shortens overall RT treatment times while allowing dose escalation, and it provides the potential for combination with currently available, as well as newer, chemotherapy agents. Survival is comparable with previously published RTOG data, and toxicities are within acceptable limits. Background To reduce this complication and to enhance the radiation effect on hypoxic cells of high-grade gliomas, the authors performed noninvasive fractionated stereotactic radiotherapy (FSRT) using a Gamma unit combined with hyperbaric oxygen (HBO) therapy for the treatment of recurrent disease.
Patients and methods Twenty-five consecutive patients who had previously received radiotherapy with chemotherapy for recurrent high-grade gliomas, including 14 patients with anaplastic astrocytoma (AA) and 11 with glioblastoma multiforme (GBM), underwent Gamma FSRT immediately after HBO therapy (2.5 atmospheres absolute for 60 min). The Gamma FSRT was repeatedly performed using a relocatable head cast. Median tumor volume was 8.7 cc (range, 1.7–159.3 cc), and the median total radiation dose was 22 Gy (range, 18–27 Gy) to the tumor margin in 8 fractions. Results Actuarial median survival time after FSRT was 19 months for patients with AA and 11 months for patients with GBM, which was significantly different (P = 0.012, log-rank test). Two patients underwent subsequent second FSRT for regional or remote recurrence. Seven patients (28%) underwent subsequent craniotomies and resections at a mean of 8.4 months after FSRT treatment, and 4 of them had radiation effects without viable cells and remained alive for 50–78 months. Conclusion Gamma FSRT after HBO therapy appears to confer a survival benefit for patients with recurrent high-grade gliomas and warrants further investigation. BACKGROUND There is no community standard for the treatment of glioblastoma in patients 70 years of age or older. We conducted a randomized trial that compared radiotherapy and supportive care with supportive care alone in such patients. METHODS Patients 70 years of age or older with a newly diagnosed anaplastic astrocytoma or glioblastoma and a Karnofsky performance score of 70 or higher were randomly assigned to receive supportive care only or supportive care plus radiotherapy (focal radiation in daily fractions of 1.8 Gy given 5 days per week, for a total dose of 50 Gy). The primary end point was overall survival; secondary end points were progression-free survival, tolerance of radiotherapy, health-related quality of life, and cognition.
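Several of the studies above compare survival between groups with the log-rank test (e.g., the P = 0.012 result for AA versus GBM). As an illustrative sketch only, not any trial's actual analysis code, a minimal two-group log-rank chi-square statistic can be computed in pure Python; the follow-up data used in the usage note are invented:

```python
def logrank_statistic(times_a, events_a, times_b, events_b):
    """Two-group log-rank chi-square statistic.

    Each group supplies follow-up times and event flags
    (1 = death observed, 0 = censored at that time).
    Returns the chi-square value on 1 degree of freedom.
    """
    # distinct times at which at least one death occurred, pooled over groups
    pooled = sorted({t for t, e in zip(times_a + times_b, events_a + events_b) if e})
    obs_a = exp_a = var = 0.0
    for t in pooled:
        n_a = sum(1 for x in times_a if x >= t)  # at risk in group A
        n_b = sum(1 for x in times_b if x >= t)  # at risk in group B
        d_a = sum(1 for x, e in zip(times_a, events_a) if x == t and e)
        d_b = sum(1 for x, e in zip(times_b, events_b) if x == t and e)
        n, d = n_a + n_b, d_a + d_b
        obs_a += d_a
        exp_a += d * n_a / n  # expected deaths in A under the null hypothesis
        if n > 1:
            var += d * (n_a / n) * (n_b / n) * (n - d) / (n - 1)
    return (obs_a - exp_a) ** 2 / var
```

The returned value is compared against a chi-square distribution with one degree of freedom; values above roughly 3.84 correspond to P < 0.05.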
RESULTS We randomly assigned 85 patients from 10 centers to receive either radiotherapy and supportive care or supportive care alone. The trial was discontinued at the first interim analysis, which showed that with a preset boundary of efficacy, radiotherapy and supportive care were superior to supportive care alone. A final analysis was carried out for the 81 patients with glioblastoma (median age, 73 years; range, 70 to 85). At a median follow-up of 21 weeks, the median survival for the 39 patients who received radiotherapy plus supportive care was 29.1 weeks, as compared with 16.9 weeks for the 42 patients who received supportive care alone. The hazard ratio for death in the radiotherapy group was 0.47 (95% confidence interval, 0.29 to 0.76; P = 0.002). There were no severe adverse events related to radiotherapy. The results of quality-of-life and cognitive evaluations over time did not differ significantly between the treatment groups. CONCLUSIONS Radiotherapy results in a modest improvement in survival, without reducing the quality of life or cognition, in elderly patients with glioblastoma. (ClinicalTrials.gov number, NCT00430911.) OBJECTIVE During an 8-year interval, we evaluated the survival benefit of stereotactic radiosurgery performed in 64 patients with glioblastomas multiforme (GBM) and 43 patients with anaplastic astrocytomas (AA). METHODS Adjuvant radiosurgery was performed either before disease progression or for recurrent tumor at the time of disease progression. Clinical and imaging follow-up data were obtained for all patients. The diagnosis of GBM was obtained by performing craniotomies in 41 patients and by performing stereotactic biopsies in 23. The diagnosis of AA was obtained by performing craniotomies in 19 patients (44%) and by performing biopsies in 24.
RESULTS Of the entire series, the median survival time after initial diagnosis for patients with GBM was 26 months (standard deviation [SD], 19 mo; range, 5-79 mo) and the median survival time after radiosurgery was 16 months (SD, 16 mo; range, 1-74 mo). The 2-year survival rate was 51%. No survival benefit was identified for patients who underwent intravenously administered chemotherapy in addition to radiosurgery (P = 0.97). After undergoing radiosurgery, 12 patients (19%) underwent craniotomies and resections and 4 (6%) underwent subsequent radiosurgery for regional or remote recurrence. For 45 patients who underwent radiosurgery as part of the initial management plan, the median survival time after diagnosis was 20 months. Of the entire series, the median survival time after diagnosis for patients with anaplastic astrocytomas was 32 months (SD, 23 mo; range, 5-96 mo) and the median survival time after radiosurgery was 21 months (SD, 18 mo; range, 3-93 mo). The 2-year survival rate was 67%. Ten patients (23%) underwent subsequent craniotomies at a mean of 8 months after initial surgery, and two underwent subsequent radiosurgery. There was no acute neurological morbidity after radiosurgery. Histologically proven radiation necrosis occurred in one patient with GBM (1.6%) and two patients with AA (4.7%). For 21 patients for whom radiosurgery was part of the initial management plan, the median survival time after diagnosis was 56 months. CONCLUSION In comparison to historical controls, improved survival benefit after radiosurgery was identified for patients with GBM and patients with AA. Although this survival benefit may be related to our selection of patients for radiosurgery based on their having smaller tumor volumes, no selection was made based on location. We observed that radiosurgery was safe and well tolerated.
Its effectiveness as an adjuvant therapy deserves a properly stratified randomized trial. PURPOSE To assess the outcome of high central dose Gamma Knife radiosurgery plus marimastat in patients with recurrent malignant glioma. METHODS AND MATERIALS Twenty-six patients with recurrent malignant glioma were enrolled in a prospective Phase II study between November 1996 and January 1999. The radiosurgery dose was prescribed at the 25-30% isodose surface to increase the dose substantially within the tumor's presumably hypoxic core. Marimastat was administered after radiosurgery to restrict regional tumor progression. Survival was compared with that of historical patients treated at our institution with standard radiosurgery. RESULTS The median times to progression after radiosurgery for Grade 3 and 4 patients were 31 and 15 weeks, respectively. The corresponding median survival times after radiosurgery were 68 and 38 weeks. The median survival times after radiosurgery in the historical patients were 59 and 44 weeks. CONCLUSION The dual strategies of using high central dose radiosurgery to overcome tumor hypoxia together with marimastat to inhibit local tumor invasion may offer a small survival advantage for recurrent Grade 3 tumors; they do not offer an advantage for recurrent Grade 4 tumors. We attempted to show a dose-effect relationship for radiation therapy by treating patients harbouring malignant glioma with increasing doses of radiation in a step-wise fashion. We postulated that no increase in delayed toxicity would be seen because we used a hyperfractionation technique. Between January 1981 and December 1988 we treated 280 patients three times daily at 4-hour intervals. 100 patients received a total dose of 6141 cGy, 73 patients received 7120 cGy, and 107 patients received 8000 cGy. CCNU was given at the time of tumor progression following radiotherapy.
Median time to tumor progression was 28 weeks for patients who received 6141 cGy, 27 weeks for patients who received 7120 cGy, and 36 weeks for patients who received 8000 cGy. Median survival was 46 weeks for patients who received 6141 cGy, 38 weeks for patients who received 7120 cGy, and 45 weeks for patients who received 8000 cGy. There was no statistically significant difference in either time to tumor progression or survival among the three treatment arms, and no dose-response effect was seen. There was no increase in delayed radiation toxicity when the total radiation dose was increased up to 8000 cGy. PURPOSE To evaluate the efficacy and toxicity of a stereotactic radiosurgery boost as part of the primary management of a minimally selected population of patients with malignant gliomas. METHODS AND MATERIALS Between June 1991 and January 1994 a stereotactic radiosurgery boost was given to 30 patients after completion of fractionated external beam radiotherapy. The study population consisted of 22 males and 8 females, with a range in age at treatment from 5 to 74 years (median: 54 years). Tumor volume ranged from 2.1 to 115.5 cubic centimeters (cc) (median: 24 cc). Histology included 17 with glioblastoma multiforme, 10 with anaplastic astrocytoma, 1 with a mixed anaplastic astrocytoma-oligodendroglioma, and 2 with a gliosarcoma. A complete resection was performed in 9 (30%) patients, while 18 (60%) underwent a subtotal resection, and 3 (10%) received a biopsy only. Fractionated radiation dose ranged from 44 to 62 Gy, with a median of 59.4 Gy. Prescribed stereotactic radiosurgery dose ranged from 0.5 to 18 Gy (median: 10 Gy), and the volume receiving the prescription dose ranged from 2.1 to 158.7 cc (median: 46 cc). The volume of tumor receiving the prescription dose ranged from 70-100% (median: 100%).
One to four (median: 2) isocenters were used, and collimator size ranged from 12.5 to 50 mm (median size: 32.5 mm). The median minimum stereotactic radiosurgery dose was 70% of the prescription dose and the median maximum dose was 200% of the prescription dose. RESULTS With a minimum follow-up of 1 year from radiosurgery, 7 (23%) of the patients are still living and 22 (73%) have died of progressive disease. One patient died of a myocardial infarction 5 months after stereotactic radiosurgery. Follow-up for living patients ranged from 12 to 45 months, with a median of 30 months. The 1- and 2-year disease-specific survival from the date of diagnosis is 57% [95% confidence interval (CI) 39 to 74%] and 25% (95% CI 9 to 41%), respectively (median survival: 13.9 months). No significant acute or late toxicity has been observed. CONCLUSION Stereotactic radiosurgery provides a safe and feasible technique for dose escalation in the primary management of unselected malignant gliomas. Longer follow-up and a randomized prospective trial are required to more thoroughly evaluate the role of radiosurgery in the primary management of malignant gliomas. Abstract: The purpose of this study was to evaluate the potential selection bias of using stereotactic eligibility as a criterion for participation in studies of glioblastoma multiforme. Radiation Therapy Oncology Group (RTOG) 90-06, comparing 60 Gy in 30 fractions with BCNU and 72 Gy in 60 fractions with BCNU, was analyzed based on the eligibility criteria used to enter patients on RTOG 93-05, which used a stereotactic boost for patients with glioblastoma. Five hundred nine patients with histopathologically confirmed glioblastoma multiforme were analyzed; of these, 137 met criteria for 93-05 and 372 did not. Recursive partitioning analysis (RPA) was used to evaluate for differences. The RPA distribution in stereotactic radiosurgery (SRS)-eligible and -ineligible patients was similar.
The median survival for RPA class 3 SRS-eligible patients was 1.4 years, and for ineligible patients 1.4 years. For RPA class 4, the median survival was 1.0 years for eligible patients and 0.9 years for ineligible patients (P = 0.0421). For class 5 patients, the median survival was 8.3 months versus 7.2 months (P = 0.09). RPA class 6 patients had a median survival of 1.7 months versus 2.7 months for ineligible patients (P = 0.199). By analyzing previously randomized patients in a study not using a stereotactic boost, there does not appear to be a survival benefit for those patients who fit the criteria for consideration of a stereotactic boost in patients with glioblastoma multiforme. Purpose To prospectively evaluate efficacy, side effects and quality of life in patients with recurrent malignant glioma after hypofractionated stereotactic radiotherapy. Methods and materials From 1/2003 to 8/2005, 15 patients with recurrent malignant glioma were prospectively scheduled for hfSRT with 5 × 7 Gy (90%-isodose). Median gross tumor volume and planning target volume were 5.75 (range, 0.77–21.94) and 22.4 (range, 4.22–86.79) cc, respectively. Irradiation was performed with the dedicated stereotactic radiosurgery system Novalis™ (BrainLAB, Heimstetten, Germany). Results Rates of remission, no change and progressive disease were 27%, 33%, and 40%, respectively, after a median follow-up of 9 months. Progression-free survival rates at 6 and 12 months were 75% and 53%, respectively. Quality of life, measured by European Organization for Research and Treatment of Cancer Quality of Life Questionnaire scores, could be kept stable in two thirds of the patients for a median time of 9 months.
Conclusion: Hypofractionated stereotactic radiotherapy with 5 × 7 Gy for recurrent high-grade glioma is an effective treatment that helps to maintain quality of life for an acceptable period, comparable to the results obtained with current chemotherapy schedules. Combined approaches of radiotherapy, chemotherapy and other targeted therapies deserve further investigation. OBJECT This investigation was performed to determine the tolerance and toxicities of split-course fractionated gamma knife radiosurgery (FSRS) given in combination with conventional external-beam radiation therapy (CEBRT). METHODS Eighteen patients with previously unirradiated gliomas treated between March 1995 and January 2000 form the substrate of this report. These included 11 patients with malignant gliomas, six with low-grade gliomas, and one with a recurrent glioma. They were stratified into three groups according to tumor volume (TV). Fifteen were treated using the initial FSRS dose schedule and form the subject of this report. Group A (four patients) had TV of 5 cm3 or less (7 Gy twice pre- and twice post-CEBRT); Group B (six patients), TV greater than 5 cm3 but less than or equal to 15 cm3 (7 Gy twice pre-CEBRT and once post-CEBRT); and Group C (five patients), TV greater than 15 cm3 but less than or equal to 30 cm3 (7 Gy once pre- and once post-CEBRT). All patients received CEBRT to 59.4 Gy in 1.8-Gy fractions. Dose escalation was planned, provided the level of toxicity was acceptable. All patients were able to complete CEBRT without interruption or experiencing disease progression. Unacceptable toxicity was observed in two Grade 4/Group B patients and two Grade 4/Group C patients. Eight patients required reoperation. In three (38%) there was necrosis without evidence of tumor. Neuroimaging studies were available for evaluation in 14 patients.
Two had a partial (> or = 50%) reduction in volume and nine had a minor (> 20%) reduction in size. The median follow-up period was 15 months (range 9-60 months). Six patients remained alive for 3 to 60 months. CONCLUSIONS The imaging responses and the ability of these patients with intracranial gliomas to complete therapy without interruption or experiencing disease progression are encouraging. Excessive toxicity derived from combined FSRS and CEBRT treatment, as evaluated thus far in this study, was seen in patients with Group B and C lesions at the 7-Gy dose level. Evaluation of this novel treatment strategy with dose modification is ongoing. BACKGROUND AND PURPOSE The EORTC trial No. 22972 investigated the role of an additional fractionated stereotactic boost (fSRT) to conventional radiotherapy for patients with high-grade gliomas. A quality-assurance (QA) programme was run in conjunction with the study and was the first within the EORTC addressing the quality of a supposedly highly accurate treatment technique such as stereotactic radiotherapy. A second aim was to investigate a possible relation between the clinical results of the stereotactic boost arm and the results of the QA. MATERIALS AND METHODS The trial was closed in 2001 due to low accrual. In total, 25 patients were randomized: 14 into the experimental arm and 11 into the control arm. Six centres randomized patients; 8 centres had completed the dummy run (DR) for the stereotactic boost part. All participating centres (9) were asked to complete a quality-assurance questionnaire. The DR consisted of treatment planning according to the guidelines of the protocol on 3 different tumour volumes drawn on CT images of a humanized phantom. The SRT technique to be used was evaluated by the questionnaire. Clinical data from patients recruited to the boost arm from 6 participating centres were analysed.
RESULTS There was full compliance with the protocol requirements for 5 centres. Major and minor deviations in conformality were observed for 2 and 3 centres, respectively. Of the 8 centres which completed the DR, one centre did not comply with the requirements of stereotactic radiotherapy concerning accuracy, dosimetry and planning. Median follow-up and median overall survival were 39.2 and 21.4 months, respectively. Acute and late toxicities of the stereotactic boost were low. One radiation necrosis was seen in a patient who had not received the SRT boost. Three reported serious adverse events were all seizures and probably therapy-related. CONCLUSIONS Overall compliance was good but not ideal from the point of view of this highly precise radiation technique. Survival in the subgroup of patients with small volume disease was encouraging, but the study does not provide sufficient information about the potential value of an fSRT boost in patients with malignant glioma. Toxicity due to an additional stereotactic boost of 20 Gy in 4 fractions was low, and it may be considered a safe treatment option for patients with small tumours. OBJECTIVE To assess the feasibility, toxicity, and local control of stereotactic radiosurgery followed by accelerated external beam radiotherapy (AEBR) for patients with glioblastoma multiforme. MATERIALS AND METHODS Six males and eight females, with a median age of 67.5 years (range 45-78 years), entered the study. Karnofsky performance status was 90 for five, 80 for six, and 60 for three patients. Following surgery, the patients were left with a residual mass 4 cm. Radiosurgery was delivered with a single dose of 20 Gy to the 90% isodose surface corresponding to the contrast-enhancing edge of the tumour. A total AEBR dose of 60 Gy in 30 fractions was delivered using a concomitant boost technique over four weeks. RESULTS Median survival time was 40 weeks (range 17-80 weeks).
Actuarial survivals at 12 and 18 months were 43% and 14%, respectively. The median time to progression was 25 weeks (range 2-77 weeks). One patient developed a seizure on the day of stereotactic radiosurgery. Two patients experienced somnolence at 47 and 67 days post-radiotherapy. Eight patients remained steroid-dependent. Radiological evidence of leukoencephalopathy was observed in one patient, and brain necrosis in two additional patients at 30 and 63 weeks. One of these two patients with brain necrosis developed complete loss of vision in one eye, and decreased vision in the contralateral eye, at 63 weeks. CONCLUSION Stereotactic radiosurgery followed by AEBR was feasible but was associated with late complications. The use of such a radiosurgical boost for patients with glioblastoma multiforme should be reserved for those patients entering controlled clinical trials. External beam irradiation of malignant astrocytoma often provides temporary local tumor control, but dose is limited by potential toxicity to normal brain. Fractionated stereotactic radiotherapy (SRT) provides additional radiation to the tumor with less dose deposition in adjacent normal brain. We administered a potential radiosensitizer, cis-platinum (CDDP), to optimize the therapeutic index. CDDP (40 mg/m2) was given weekly, with SRT once or twice weekly, to 20 patients. One had a partial response, 11 stable disease, and eight progressed despite therapy. Acute toxicities were manageable. Five patients required surgery for tumor progression or radiation necrosis. Median response duration was 18.5 weeks and median survival was 55 weeks. SRT combined with CDDP is safe, with durable responses in some patients.
Further investigations to determine optimal SRT and CDDP doses and scheduling are justified. PURPOSE To evaluate the efficacy of fractionated stereotactic radiotherapy (FSRT) performed as reirradiation in 172 patients with recurrent low- and high-grade gliomas. PATIENTS AND METHODS Between 1990 and 2004, 172 patients with recurrent gliomas were treated with FSRT as reirradiation in a single institution. Seventy-one patients suffered from WHO grade 2 gliomas. WHO grade 3 gliomas were diagnosed in 42 patients, and 59 patients were diagnosed with glioblastoma multiforme (GBM). The median time between primary radiotherapy and reirradiation was 10 months for GBM, 32 months for WHO grade 3 tumors, and 48 months for grade 2 astrocytomas. FSRT was performed with a median dose of 36 Gy in a median fractionation of 5 x 2 Gy/wk. RESULTS Median overall survival after primary diagnosis was 21 months for patients with GBM, 50 months for patients with WHO grade 3 gliomas, and 111 months for patients with WHO grade 2 gliomas. Histologic grading was the strongest predictor for overall survival, together with the extent of neurosurgical resection and age at primary diagnosis. Median survival after reirradiation was 8 months for patients with GBM, 16 months for patients with grade 3 tumors, and 22 months for patients with low-grade gliomas. Only time to progression and histology were significant in influencing survival after reirradiation. Progression-free survival after FSRT was 5 months for GBM, 8 months for WHO grade 3 tumors, and 12 months for low-grade gliomas. CONCLUSION FSRT is well tolerated and may be effective in patients with recurrent gliomas.
Prospective studies are warranted for further evaluation. BACKGROUND Despite notable technical advances in therapy for malignant gliomas during the past decade, improved patient survival has not been clearly documented, suggesting that pretreatment prognostic factors influence outcome more than minor modifications in therapy. Age, performance status, and tumor histopathology have been identified as the pretreatment variables most predictive of survival outcome. However, an analysis of the association of survival with both pretreatment characteristics and treatment-related variables is necessary to assure reliable evaluation of new approaches for treatment of malignant glioma. PURPOSE This study of malignant glioma patients used a non-parametric statistical technique to examine the associations of both pretreatment patient and tumor characteristics and treatment-related variables with survival duration. This technique was used to identify subgroups with survival rates sufficiently different to create improvements in the design and stratification of clinical trials. METHODS We used a recursive partitioning technique to analyze survival in 1578 patients entered in three Radiation Therapy Oncology Group malignant glioma trials from 1974 to 1989 that used several radiation therapy (RT) regimens with and without chemotherapy or a radiation sensitizer. This approach creates a regression tree according to prognostic variables that classifies patients into homogeneous subsets by survival. Twenty-six pretreatment characteristics and six treatment-related variables were analyzed. RESULTS The first partition was based on age (younger than 50 years versus 50 years or older). Patients younger than 50 years old were categorized by histology (astrocytomas with anaplastic or atypical foci [AAF] versus glioblastoma multiforme [GBM]) and subsequently by normal or abnormal mental status for AAF patients and by performance status for those with GBM.
For patients aged 50 years or older, performance status was the most important variable, with normal or abnormal mental status creating the only significant split in the poorer performance status group. Treatment-related variables produced a subgroup showing significant differences only for better performance status GBM patients over age 50 (by extent of surgery and RT dose). Median survival times were 4.7-58.6 months for the 12 subgroups resulting from this analysis, which ranged in size from 32 to 256 patients. CONCLUSIONS This approach permits examination of the interaction between prognostic variables not possible with other forms of multivariate analysis. IMPLICATIONS The recursive partitioning technique can be employed to refine the stratification and design of malignant glioma trials. OBJECTIVE Stereotactic radiosurgery (SRS) has become an effective therapeutic modality for the treatment of patients with glioblastoma multiforme (GBM). This retrospective review evaluates the impact of SRS delivered on a gamma knife (GK) unit as an adjuvant therapy in the management of patients with GBM. METHODS Between August 1993 and December 1998, 82 patients with pathologically confirmed GBM received external beam radiotherapy (EBRT) at the University of Maryland Medical Center. Of these 82 patients, 64 with a minimum follow-up duration of at least 1 month are the focus of this analysis. Of the 64 assessable patients, 33 patients were treated with EBRT alone (Group 1), and 31 patients received both EBRT plus a GK-SRS boost (Group 2). GK-SRS was administered to most patients within 6 weeks of the completion of EBRT. The median EBRT dose was 59.7 Gy (range, 28-70.2 Gy), and the median GK-SRS dose to the prescription volume was 17.1 Gy (range, 10-28 Gy). The median age of the study population was 50.4 years, and the median pretreatment Karnofsky performance status was 80.
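The recursive partitioning technique described above grows a regression tree over prognostic variables (age, histology, performance status), repeatedly splitting patients into subgroups with the most dissimilar survival. A greatly simplified sketch of that idea follows; it uses a naive mean-survival gap as the splitting criterion rather than the formal survival-based criterion of the RTOG analysis, and any data fed to it are invented:

```python
def best_split(patients, variables):
    """Find the binary split maximizing the gap in mean survival.

    patients  -- list of dicts with numeric prognostic variables plus 'survival'
    variables -- names of the prognostic variables to consider
    Returns (variable, threshold, gap) for the best split, or None.
    """
    best = None
    for var in variables:
        values = sorted({p[var] for p in patients})
        for cut in values[1:]:  # candidate thresholds between observed values
            lo = [p["survival"] for p in patients if p[var] < cut]
            hi = [p["survival"] for p in patients if p[var] >= cut]
            if not lo or not hi:
                continue
            gap = abs(sum(hi) / len(hi) - sum(lo) / len(lo))
            if best is None or gap > best[2]:
                best = (var, cut, gap)
    return best

def partition(patients, variables, min_size=4):
    """Recursively partition patients into survival-homogeneous leaves."""
    split = best_split(patients, variables)
    if split is None or len(patients) < 2 * min_size:
        return patients  # leaf: one prognostic subgroup
    var, cut, _ = split
    lo = [p for p in patients if p[var] < cut]
    hi = [p for p in patients if p[var] >= cut]
    if len(lo) < min_size or len(hi) < min_size:
        return patients
    return {f"{var}<{cut}": partition(lo, variables, min_size),
            f"{var}>={cut}": partition(hi, variables, min_size)}
```

On data where younger patients survive markedly longer, the first split falls on age, mirroring the age-at-50 split reported above; the leaves play the role of the RPA prognostic classes.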
Patient-, tumor-, and treatment-related variables were analyzed by Cox regression analysis, and survival curves were generated by the Kaplan-Meier product limit. RESULTS Median overall survival for the entire cohort was 16 months, and the actuarial survival rates at 1, 2, and 3 years were 67, 40, and 26%, respectively. When comparing age, Karnofsky performance status, extent of resection, and tumor volume, no statistical differences were discovered between Group 1 and Group 2. When comparing the overall survival of Group 1 versus Group 2, the median survival was 13 months versus 25 months, respectively (P = 0.034). Age, Karnofsky performance status, and the addition of GK-SRS were all found to be significant predictors of overall survival via Cox regression analysis. No acute Grade 3 or Grade 4 toxicity was encountered. CONCLUSION The addition of a GK-SRS boost in conjunction with surgery and EBRT significantly improved the overall survival time in this retrospective series of patients with GBM. A prospective, randomized validation of the benefit of SRS awaits the results of the recently completed Radiation Therapy Oncology Group's trial RTOG 93-05. From February 1989 to December 1992, 31 patients who presented with an initial pathological diagnosis of glioblastoma multiforme underwent tumor debulking or biopsy, stereotactic radiosurgery, and standard radiation therapy as part of their primary treatment. Presenting characteristics in the 22 men and nine women included a median age of 57 years, Karnofsky Performance Scale score median of 80, and median tumor volume of 16.4 cm3. Stereotactic radiosurgery delivered a central dose of 15 to 35 Gy with the isocenter location, collimator size, and beam paths individualized by means of three-dimensional software developed at the University of Wisconsin. The peripheral isodose line varied from 40% to 90% with a median of 72.5% and a mode of 80%.
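The median survival times and actuarial 1-, 2-, and 3-year rates reported throughout these studies come from Kaplan-Meier product-limit curves of the kind generated in the analysis above. As an illustrative sketch only (not any study's actual code), a minimal product-limit estimator in pure Python:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.

    times  -- follow-up time for each patient (e.g., months)
    events -- 1 if death observed at that time, 0 if censored
    Returns a list of (time, survival probability) curve steps.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        # group all patients sharing this follow-up time
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:  # censored-only times drop no curve step
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed
    return curve

def median_survival(curve):
    """First time at which the survival curve falls to 0.5 or below."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None  # median not reached during follow-up
```

Reading the curve at 12 or 24 months gives the actuarial 1- and 2-year survival rates; `median_survival` returning None corresponds to the "median not reached" situation in trials with long survivors.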
The mean follow-up period was 12.84 months, with a median of 9.5 months. Statistical analysis was performed using Kaplan-Meier analysis and log-rank comparison of risk factor groups. The parameters of age, initial Karnofsky Performance Scale score, and biopsy versus debulking were significantly associated with patient survival, but no difference was noted between single and multiple isocenters or among patterns of steroid requirement. Radiographic recurrences were divided by location into the following categories: central (within the central stereotactic radiosurgery dose), 0; peripheral (within 2 cm of the central dose), 19; and distant (> 2 cm), 4. There is no evidence of recurrence in five surviving patients. Actuarial 12-month survival was 37%, with a median survival of 9.5 months. These values are similar to previous results for surgery and standard radiotherapy alone. The results suggest that the curative value of radiosurgery is significantly limited by peripheral recurrences. PURPOSE To determine the value and the toxicity of an additional fractionated stereotactic boost as used in the joint randomized EORTC-22972/MRC-BR10 study in patients with malignant gliomas. MATERIALS AND METHODS Seventeen patients (11 male, six female) with a high-grade glioma (two WHO III, 15 WHO IV) < or = 4 cm in maximum diameter, and with a good performance status (WHO > or = 2), were treated with a fractionated stereotactic radiotherapy (SRT) boost to 20 Gy in four fractions following partial brain irradiation to a dose of 60 Gy in 30 fractions. This patient group was compared with historical data in a matched-pair analysis. RESULTS All patients were treated by conventional radiotherapy and an SRT boost (15 patients received 20 Gy and two patients 10 Gy). Acute side effects included fatigue (two), impairment of short-term memory (one) and worsening of pre-existing symptoms (one). No patient developed steroid dependence after SRT.
One patient was re-operated for radiation necrosis. At a median follow-up of 25 months (9-50 months), 14 patients recurred locally. For all patients, survival was 77% at 1 year and 42% at 2 years, and progression-free survival was 70% at 1 year and 35% at 2 years. Median survival for the whole patient group is 20 months. Comparison with a matched historical group showed significantly better survival for the group treated with a stereotactic boost (P < 0.0001). CONCLUSION A fractionated stereotactic boost after standard external-beam radiotherapy in selected patients with high-grade glioma is feasible and well tolerated, with low toxicity. Compared with historical data, survival is significantly better with an additional SRT boost. However, its effectiveness has to be proven in a randomized trial. The purpose of this pilot study was to determine the feasibility and toxicities of an accelerated treatment program using a concomitant stereotactic radiotherapy boost given weekly during a course of standard external-beam irradiation (EBXRT) in patients with malignant gliomas. Twelve patients underwent biopsy or subtotal resection of a malignant glioma and were enrolled on the protocol, which delivered 44 Gy EBXRT and a 12-Gy stereotactic radiotherapy boost given on 3 consecutive weeks of treatment, for a total dose of 80 Gy over 33 days. Three patients with anaplastic astrocytoma and nine patients with glioblastoma multiforme had median survival times of 33 months and 16 months, respectively. All of the tumor recurrences were within or closely adjacent to the region of high-dose irradiation. None of the patients required a treatment break, and there were no acute complications. Two patients developed seizures in the follow-up period, and four patients were diagnosed with radionecrosis at the time of a second operation.
The treatment program was found to be feasible and well tolerated, and it resulted in a rate of late complications similar to those of radiosurgery or interstitial brachytherapy. PURPOSE Fractionated external-beam radiotherapy (EBRT) +/- carmustine (BCNU) is the standard of care for patients with glioblastoma multiforme (GBM), but survival results remain poor. Preclinical studies indicate synergy between RT and paclitaxel (TAX) in astrocytoma cell lines. Phase I studies in GBM have demonstrated a maximum tolerated dose for TAX of 225 mg/m(2)/3 h/week x 6 during EBRT, with no exacerbation of typical RT-induced toxicities. The Radiation Therapy Oncology Group (RTOG) therefore mounted a Phase II study to determine the feasibility and efficacy of conventional EBRT and concurrent weekly TAX at its MTD. PATIENTS AND METHODS Sixty-two patients with a histologic diagnosis of GBM were enrolled from 8/16/96 through 3/21/97 in a multi-institutional Phase II trial of EBRT and TAX 225 mg/m(2)/3 h (1-3 h before EBRT), administered on the first treatment day of each RT week. The total EBRT dose was 60 Gy (200 cGy/fraction), 5 days per week. A smaller treatment field, including gross disease plus a margin only, was used after 46 Gy. RESULTS Sixty-one patients (98%) were evaluable. Median age was 55 years (range, 28-78). Seventy-four percent were ≥50 years. Recursive partitioning analysis (RPA) Classes III, IV, V, and VI included 10 (17%), 21 (34%), 25 (41%), and 5 (8%) patients, respectively. Gross total resection was performed in only 16%. There was no Grade 3 or 4 neutropenia or thrombocytopenia. Hypersensitivity reactions precluding further use of TAX occurred in 4 patients. There were 2 instances of late neurotoxicity (4% Grade 3 or 4). Ninety-one percent of patients received treatment per protocol. Seventy-seven percent completed the prescribed treatment (6 weeks).
Of 35 patients with measurable disease, CR/PR was observed in 23%, MR in 17%, and SD in 43%. Seventeen percent demonstrated progression at first follow-up. Median potential follow-up time is 20 months. Median survival is 9.7 months, with median survivals for RPA classes III, IV, V, and VI of 16.3, 10.2, 9.5, and 2.5 months, respectively. Ten patients remain alive. CONCLUSION Concurrent full-dose EBRT and weekly high-dose TAX is feasible in the majority of GBM patients. Acute toxicity is acceptable; myelosuppression and peripheral sensory neuropathy are surprisingly modest, despite a considerably higher overall dose intensity compared with that achievable in other disease sites. Median survival by RPA class without prolonged adjuvant therapy is comparable to that of RTOG controls treated with standard EBRT and BCNU (1 year of BCNU). PURPOSE To help investigators decide if new therapies for glioma warrant definitive evaluation in randomized studies, we have been developing a method for assessing the degree to which patient selection may have enhanced the results of uncontrolled treatment trials. In this study, we analyzed the impact of case selection on the survival of patients with malignant glioma receiving adjuvant stereotactic radiosurgery, a promising therapy reserved for those with small tumors and good performance status. METHODS Following published eligibility criteria, we simulated the patient selection process for stereotactic radiosurgery given as a boost at the conclusion of conventional radiotherapy. Eligible patients were culled from a pre-existing clinical/imaging database of 101 consecutive conventionally treated patients with biopsy-proven malignant glioma and known survival times. Median durations of survival and 2- and 3-year survival rates were determined for those judged eligible or ineligible for stereotactic radiosurgery.
RESULTS Twenty-seven percent of patients were deemed eligible for stereotactic radiosurgery; eligible patients had more favorable prognostic factors and significantly longer median survival than ineligible patients (23.4 vs. 8.6 months; 2-year rate, 48% vs. 15%; 3-year rate, 30% vs. 7%); eligible patients also had a longer median survival than the entire group of unselected patients (23.4 vs. 11.4 months). Radiosurgery-eligible, conventionally treated patients with glioblastoma multiforme and a group of radiosurgery-treated patients at a special referral center had similar median survival times (16.4 vs. 19.7 months). CONCLUSION We provide additional evidence for selection bias in uncontrolled trials of stereotactic radiosurgery and, by simulating the selection process accurately, have detected a larger bias effect than noted previously. Judging from experience with interstitial radiation and intraarterial chemotherapy, where substantial selection bias also occurred and randomized controlled trials proved disappointing, we conclude that a phase III study of stereotactic radiosurgery for malignant glioma is unlikely to yield a positive result and may not be necessary. PURPOSE Most primary oligodendrogliomas and mixed gliomas (oligoastrocytoma) respond to treatment with procarbazine, lomustine, and vincristine (PCV), with response rates of approximately 80%. However, limited data on second-line treatments are available in patients with recurrent tumors. A novel second-generation alkylating agent, temozolomide, has recently demonstrated efficacy and safety in patients with recurrent glioblastoma multiforme and anaplastic astrocytoma. This study describes the effects of temozolomide in patients with recurrent anaplastic oligodendroglioma (AO) and anaplastic mixed oligoastrocytoma (AOA).
PATIENTS AND METHODS Forty-eight patients with histologically confirmed AO or AOA who had received previous PCV chemotherapy were treated with temozolomide (150 to 200 mg/m2/d for 5 days per 28-day cycle). The primary end point was objective response. Secondary end points included progression-free survival (PFS), time to progression, overall survival (OS), safety, and tolerability. RESULTS Eight patients (16.7%) experienced a complete response, 13 patients (27.1%) experienced a partial response (objective response rate, 43.8%), and 19 patients (39.6%) experienced stable disease. For the entire treatment group, median PFS was 6.7 months and median OS was 10 months. For objective responders, median PFS was 13.1 months and median OS was 16 months. For complete responders, PFS was more than 11.8 months and OS was more than 26 months. Response correlated with improved survival. Temozolomide was safe and well tolerated. Twelve patients developed grade 1/2 thrombocytopenia and three patients developed grade 3/4 thrombocytopenia. CONCLUSION Temozolomide is safe and effective in the treatment of recurrent AO and AOA. Radiotherapy remains an important component of the treatment of high-grade gliomas. Despite high conventional doses of irradiation to the level of radiation tolerance, tumours recur largely at the primary site or in its vicinity. An attempt to improve the local control of glioma with an increased dose of radiation is likely to be effective only if the increased tumour dose is not accompanied by an increased dose to normal tissue, which may result in excessive CNS toxicity. Currently there is no evidence that would support the use of increased doses of radiation to large target volumes. Therefore, the use of techniques such as interstitial brachytherapy or stereotactic radiotherapy (SRT) has been advocated to treat this patient population in combination with conventional external irradiation.
SRT has been employed in the form of single-fraction radiosurgery using either a multisource cobalt unit (gamma knife) or a linear accelerator with multiple noncoplanar arcs of rotation, although as a single-dose irradiation it carries a risk of radiation necrosis. With the advent of relocatable frames [1–3] it has become possible to fractionate SRT, exploiting the potential benefit of fractionation in terms of further sparing of normal tissues. On current evidence the risk of necrosis is less than with single-fraction treatment [9].
2,058
28,954,042
CONCLUSION Proton pump inhibitors or histamine H2 receptor antagonists may be used to treat children with gastroesophageal reflux disease, but not to treat asthma or nonspecific symptoms.
BACKGROUND Proton pump inhibitors and histamine H2 receptor antagonists are two of the most commonly prescribed drug classes for pediatric gastroesophageal reflux disease, but their efficacy is controversial. Many patients are treated with these drugs for atypical manifestations attributed to gastroesophageal reflux, even though a causal relation has not been proven. OBJECTIVE To evaluate the use of proton pump inhibitors and histamine H2 receptor antagonists in pediatric gastroesophageal reflux disease through a systematic review.
The effectiveness of cimetidine (30–40 mg/kg/day) was evaluated in 32 children with gastroesophageal reflux disease complicated by esophagitis who entered a randomized double-blind trial for 12 weeks. Esophagitis was diagnosed in all patients by endoscopy with biopsy. Seventeen patients (age, mean ± SD: 21.7 ± 97.65 months) received cimetidine (c-pts), and 15 (age, mean ± SD: 29.03 ± 39.73 months) received a placebo (p-pts). All patients received intensive postural therapy. Based on clinical and endoscopic (and histologic) data, 12 c-pts and three p-pts were healed (p < 0.01), the condition of four c-pts and three p-pts had improved (not statistically significant), and the condition of one c-pt and nine p-pts had worsened (p < 0.01). Both clinical and esophagitis scores significantly decreased only in the c-pt group, as compared with p-pts. Improvement of esophagitis was seen in all (100%) of the c-pts with mild or moderate esophagitis versus 57.14% of p-pts (p < 0.01), and in 87.5% of c-pts with severe esophagitis as compared with 25% of the p-pt group (p < 0.01). We conclude that cimetidine is an effective agent for the treatment of reflux esophagitis in children. Although gastroesophageal reflux disease in infancy has a naturally self-limited course with conservative care (thickened feedings and posture adjustment), extensive pharmacologic therapy is needed in the presence of esophagitis. In order to evaluate the use of sucralfate in the treatment of children with reflux esophagitis, we studied 66 children aged from 4 months to 12 years (mean 5.9 years, SD 3.5) diagnosed as having gastroesophageal reflux by means of esophageal isotopic examination and radiology. An endoscopic examination was carried out in all cases. None of the patients suffered from kidney disease or had taken antacids, cimetidine, sucralfate, or antirheumatic drugs in the two weeks prior to the study.
Patients were divided into three groups matched according to age, grade of esophagitis, sex, nutritional state, and semiology, and were treated with sucralfate in tablets, cimetidine, or sucralfate in suspension; no dietetic or postural measures were used. On days 14, 28, 42, and 56, clinical control was carried out, and endoscopy was done on day 28, this being repeated on day 56 if the course was not satisfactory. From the statistical analysis of the results we deduce that there are no differences between the three groups. Therefore sucralfate appears to be a useful drug for the treatment of children with esophagitis due to GER. Background: Epidemiological studies have shown an association between gastro-oesophageal reflux disease (GORD) and asthma, and oesophageal acid perfusion may cause bronchial constriction. However, no causative relation has been proven. Aim: To assess whether acid suppression would lead to reduced asthma symptoms in children with concomitant asthma and GORD. Methods: Thirty-eight children (mean age 10.8 years, range 7.2–16.8; 29 males) with asthma and a reflux index ⩾5.0 assessed by 24 hour oesophageal pH monitoring were randomised to 12 weeks of treatment with omeprazole 20 mg daily or placebo. The groups were similar in age, gender, mean reflux index, and asthma severity. Primary endpoints were asthma symptoms (daytime wheeze, symptoms at night, in the morning, and during exercise) and quality of life (PAQLQ). Secondary endpoints were changes in lung function and the use of short acting bronchodilators. At the end of the study a repeated pH study was performed to confirm the efficacy of acid suppression. Results: The change in total symptom score did not differ significantly between the omeprazole and the placebo group, and decreased by 1.28 (95% CI −0.1 to 2.65) and 1.28 (95% CI −0.72 to 3.27), respectively.
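The reflux index used as the entry criterion above is the percentage of the 24-hour monitoring period during which oesophageal pH is below 4.0. With a uniformly sampled pH trace it reduces to a fraction of samples; a sketch with invented values (real pH monitors sample every few seconds over 24 hours, so the ten-sample trace here is purely illustrative):

```python
# Reflux index from a uniformly sampled oesophageal pH trace (sketch).

def reflux_index(ph_samples):
    """Percentage of samples with pH < 4.0 (the conventional acid threshold)."""
    acid = sum(1 for p in ph_samples if p < 4.0)
    return 100.0 * acid / len(ph_samples)

# Hypothetical trace: 4 of 10 samples are acidic, so the index is 40.0,
# which would satisfy the trial's entry criterion of >= 5.0.
trace = [6.1, 5.8, 3.2, 2.9, 6.4, 3.7, 6.0, 5.5, 2.8, 6.2]
ri = reflux_index(trace)
```

The same quantity appears later in these abstracts as "% time pH < 4" in the omeprazole crossover trial in preterm infants.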
The PAQLQ score increased by 0.62 (95% CI 0.29 to 0.95) in the omeprazole group compared with 0.50 (95% CI 0.29 to 0.70) in the placebo group. Changes in lung function and use of short acting bronchodilators were similar in the groups. The acid suppression was adequate (reflux index < 5.0) under omeprazole treatment. Conclusion: Omeprazole treatment did not improve asthma symptoms or lung function in children with asthma and GORD. The effect of antisecretory treatment on extraesophageal symptoms of gastroesophageal reflux disease was evaluated. Seventy-eight children presenting with typical and extraesophageal symptoms of gastroesophageal reflux disease underwent multichannel intraluminal impedance and pH monitoring (MII/pH). Children with a positive MII/pH were randomly treated with proton pump inhibitors (PPIs) or histamine H2-receptor antagonists (H2RAs) for 3 months. At the end of the treatment period, all patients were recalled. A second treatment period of 3 months was given to those patients who were not symptom-free after 3 months. Thirty-five of the forty-one (85.4%) children with a pathologic MII/pH presented with extraesophageal symptoms and were treated with PPIs (omeprazole; n = 19) or H2RAs (ranitidine; n = 16) for 12 weeks. After 3 months, 11/19 (57.9%) PPI-treated patients had a complete resolution of symptoms; 6/8 nonresponders were treated with a PPI for another 3 months and all became symptom-free. The other two underwent a Nissen fundoplication. Only 5/16 (31.2%) patients treated with H2RAs had a complete resolution of symptoms after 3 months; 1/11 was treated again with H2RAs for 3 months, and 10/11 were changed to PPIs. In 3/10, a partial resolution of symptoms was achieved, while in 7/10, a complete remission was obtained (P < 0.05). Antisecretory reflux treatment improves extraesophageal reflux symptoms.
The efficacy of PPIs is superior to that of H2RAs in these children. Thirty-two consecutive patients (age range 6 months-13.4 years) with severe reflux oesophagitis were randomised to a therapeutic trial for eight weeks during which they received either standard doses of omeprazole (40 mg/day/1.73 m2 surface area) or high doses of ranitidine (20 mg/kg/day). Twenty-five patients completed the trial (12 on omeprazole, 13 on ranitidine). At entry and at the end of the trial patients underwent symptomatic score assessment, endoscopic and histological evaluation of the oesophagus, and simultaneous oesophageal and gastric pH measurement; results are given as median (range). Both therapeutic regimens were effective in decreasing the clinical score (omeprazole: before 24.0 (15-33), after 9.0 (0-18); ranitidine: before 19.5 (12-33), after 9.0 (6-12)), in improving the histological degree of oesophagitis (omeprazole: before 8.0 (6-10), after 2.0 (0-60); ranitidine: before 8.0 (8-10), after 2.0 (2-6)), and in reducing oesophageal acid exposure, measured as minutes of reflux at 24 hour pH monitoring (omeprazole: before 129.4 (84-217), after 44.6 (0.16-128); ranitidine: before 207.3 (66-306), after 58.4 (32-128)), as well as intragastric acidity, measured as median intragastric pH (omeprazole: before 2.1 (1.0-3.0), after 5.1 (2.2-7.4); ranitidine: before 1.9 (1.6-4), after 3.4 (2.3-5.3)). Serum gastrin concentration was > 150 ng/l in four patients on omeprazole and in three patients on ranitidine.
It is concluded that in children with refractory reflux oesophagitis high doses of ranitidine are comparable with omeprazole for the healing of oesophagitis and relief of symptoms; both drugs resulted in efficacious reduction of intragastric acidity and intra-oesophageal acid exposure. In order to study the importance of gastro-oesophageal reflux (GOR) as a trigger of asthma, the effect of inhibition of gastric acid secretion on asthma was assessed in a double-blind, cross-over, placebo-controlled trial over four weeks in 37 children and adolescents (mean age 14 yrs) with bronchial asthma. Ranitidine 300 mg (150 mg if body weight was less than 40 kg) was given as a single evening dose for four weeks. In previous investigations 18 of the 37 patients had been shown to have pathological GOR by 24 h pH monitoring in the oesophagus. The remaining 19 patients with normal GOR served as controls for possible effects of ranitidine on asthma not related to reduction of GOR. A modest (30%) but statistically significant reduction of nocturnal asthma symptoms was produced by ranitidine in the patients with pathological GOR when compared with those with normal GOR. There was a significant correlation between the improvement in asthma symptoms and the degree of acid reflux. Side effects of ranitidine were negligible. Acid reflux appears to be only a weak stimulus for bronchoconstriction in children and adolescents with bronchial asthma and pathological GOR. Further confirmative trials with more potent inhibitors of gastric acid secretion are, however, warranted. Purpose: Gastroesophageal reflux disease (GERD) occurs in pediatric patients when reflux of gastric contents presents with troublesome symptoms. The present study compared the effects of omeprazole and ranitidine for the treatment of symptomatic GERD in infants aged 2-12 months.
Methods: This study was a randomized, double-blind, parallel-group clinical comparison of omeprazole and ranitidine performed at the Children Training Hospital in Tabriz, Iran. Patients received a standard treatment for 2 weeks. After 2 weeks, the patients with persistent symptoms were enrolled in this randomized study. Results: We enrolled 76 patients in the present study and excluded 16 patients. Thirty patients each were included in group A (ranitidine) and in group B (omeprazole). The GERD symptom score for groups A and B was 47.17±5.62 and 51.93±5.42, respectively (P = 0.54), before the treatment, and 2.47±0.58 and 2.43±1.15, respectively, after the treatment (P = 0.98). No statistically significant differences were found between ranitidine and omeprazole in their efficacy for the treatment of GERD. Conclusion: The safety and efficacy of ranitidine and omeprazole have been demonstrated in infants. Both groups of infants showed a statistically significant decrease in the score of clinical variables after the treatment. OBJECTIVES: To evaluate the efficacy of acid-suppressive maintenance therapy for gastroesophageal reflux disease (GERD) in children after the healing of reflux esophagitis. METHODS: Forty-eight children (median age 105 months, range 32–170) with erosive reflux esophagitis were initially treated with omeprazole 1.4 mg/kg/day for 3 months. Patients in endoscopic remission were assigned in a randomized, blinded manner, by means of a computer-generated list, to three groups of 6-month maintenance treatment: group A (omeprazole at half the starting dose, once daily before breakfast), group B (ranitidine 10 mg/kg/day, divided in two doses), and group C (no treatment).
Endoscopic, histological, and symptomatic scores were evaluated at: T0, enrollment; T1, assessment for remission at 3 months after enrollment (healing phase); and T2, assessment of effective maintenance at 12 months after T0 (3 months after the completion of the maintenance phase). Relapse was defined as the recurrence of macroscopic esophageal lesions. After the completion of the maintenance phase, patients without macroscopic esophagitis relapse were followed up for GERD symptoms for a further period of 30 months. RESULTS: Of 48 initially treated patients, 46 (94%) healed and entered the maintenance study. For all patients, in comparison with T0, the histological, endoscopic, and symptomatic scores were significantly reduced both at T1 and T2 (P < 0.0001 for each). No significant difference was found in these three scores when comparing groups A, B, and C at T1 and T2. A relapse occurred in one patient only, who presented with macroscopic esophageal lesions at T2. Three months after the completion of the maintenance phase, 12 (26%) patients complained of symptoms sufficiently mild to discontinue GERD therapy, excluding the patient who showed macroscopic esophagitis relapse. Three of 44 (6.8%) patients reported very mild GERD symptoms within a period of 30 months after maintenance discontinuation. CONCLUSIONS: Our pediatric population showed a low rate of erosive esophagitis relapse and GERD symptom recurrence long term after healing with omeprazole, irrespective of the maintenance therapy. Objectives: Gastroesophageal reflux disease (GERD) is present in pediatric patients when reflux of gastric contents causes troublesome symptoms and/or complications. The present study evaluates the efficacy and safety of esomeprazole in infants ages 1 to 11 months with GERD.
Methods: In this multicenter, randomized, double-blind, placebo-controlled, parallel-group, treatment-withdrawal study, infants received open-label, weight-adjusted doses of esomeprazole (2.5–10 mg) once daily for 2 weeks. Infants with symptom improvement were randomized to esomeprazole (weight-adjusted doses [2.5–10 mg]) or placebo for 4 weeks. The primary endpoint was time to discontinuation owing to symptom worsening, based on global assessments by the parent/guardian and physician. Adverse events were recorded. Results: Of the 98 patients enrolled, 81 (82.7%) experienced symptom improvement as determined by physician global assessment (PGA) during open-label esomeprazole treatment; 80 entered the double-blind phase. During this phase, discontinuation rates owing to symptom worsening were 48.8% (20/41) for placebo-treated versus 38.5% (15/39) for esomeprazole-treated patients (hazard ratio 0.69; P = 0.28). Post hoc analysis of infants with symptomatic GERD (i.e., no diagnostic procedure performed) revealed that time to discontinuation was significantly longer with esomeprazole than placebo (hazard ratio 0.24; P = 0.01); the complementary subgroup difference was not significant (hazard ratio 1.39; P = 0.48). Esomeprazole was well tolerated. Conclusions: The discontinuation rate owing to symptom worsening did not differ significantly between infants receiving esomeprazole and those receiving placebo. Improved diagnostic criteria in this age group are needed to identify infants with GERD who may benefit from acid suppression therapy. Introduction: Proton pump inhibitor (PPI) therapy is increasingly being used to treat premature infants with gastroesophageal reflux disease (GERD); however, the efficacy of PPIs on acid production in this population has yet to be assessed.
The aim of this study was to determine the effect of 0.7 mg/kg/d omeprazole on gastric acidity and acid gastroesophageal reflux in preterm infants with reflux symptoms and pathological acid reflux on 24-h pH probe. Methods: A randomized, double-blind, placebo-controlled, crossover-design trial of omeprazole therapy was performed in 10 preterm infants (34–40 weeks postmenstrual age). Infants were given omeprazole for 7 d and then placebo for 7 d, in randomized order. Twenty-four-hour esophageal and gastric pH monitoring was performed on days 7 and 14 of the trial. Results: Compared with placebo, omeprazole therapy significantly reduced gastric acidity (% time pH < 4, 54% vs 14%, P < 0.0005), esophageal acid exposure (% time pH < 4, 19% vs 5%, P < 0.01), and the number of acid GER episodes (119 vs 60 episodes, P < 0.05). Conclusions: Omeprazole is effective in reducing esophageal acid exposure in premature infants with pathological acid reflux on 24-h pH probe; however, the far more complex issues of safety and efficacy have yet to be addressed. Objective: The objective of this study was to assess the efficacy of pantoprazole in infants with gastroesophageal reflux disease (GERD). Materials and Methods: Infants ages 1 through 11 months with GERD symptoms after 2 weeks of conservative treatment received open-label (OL) pantoprazole 1.2 mg · kg−1 · day−1 for 4 weeks, followed by a 4-week randomized, double-blind (DB), placebo-controlled withdrawal phase. The primary endpoint was withdrawal due to lack of efficacy in the DB phase. Mean weekly GERD symptom scores (WGSSs) were calculated from daily assessments of 5 GERD symptoms. Safety was assessed. Results: One hundred twenty-eight patients entered OL treatment, and 106 made up the DB modified intent-to-treat population. Mean age was 5.1 months (82% full-term, 64% male). One third of patients had a GERD diagnostic test before OL study entry.
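The weekly GERD symptom score described above is a composite endpoint: daily ratings of 5 symptoms are combined and then averaged over the week. The abstract does not give the rating scale, so the 0-3 per-symptom scale and all values below are assumptions; this sketch only illustrates one plausible way such a composite is computed:

```python
# Hypothetical mean weekly GERD symptom score (WGSS) computation.

def weekly_gerd_score(daily_ratings):
    """daily_ratings: one list per day, each holding 5 per-symptom ratings
    (assumed 0-3). Each day's total is the sum of its 5 ratings; the WGSS
    is the mean of the daily totals over the week."""
    daily_totals = [sum(day) for day in daily_ratings]
    return sum(daily_totals) / len(daily_totals)

# One invented week of diary entries (7 days x 5 symptoms).
week = [
    [2, 1, 0, 1, 1],
    [1, 1, 1, 0, 1],
    [2, 2, 1, 1, 0],
    [1, 0, 0, 1, 1],
    [0, 1, 1, 1, 0],
    [1, 1, 0, 0, 1],
    [2, 1, 1, 1, 1],
]
wgss = weekly_gerd_score(week)
```

A falling WGSS from one week to the next is what the trial reports as symptom improvement.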
WGSSs at week 4 were similar between groups. WGSSs decreased significantly from baseline during OL therapy (P < 0.001), when all patients received pantoprazole. The decrease in WGSSs was maintained during the DB phase in both treatment groups. There was no difference in withdrawal rates due to lack of efficacy (pantoprazole 6/52; placebo 6/54) or in time to withdrawal during the DB phase. The greatest between-group difference in WGSS was slightly worse with placebo at week 5 (P = 0.09), mainly due to episodes of arching back (P = 0.028). No between-group differences in adverse event frequency were noted. Serious adverse events in 8 patients were considered unrelated to treatment. Conclusions: Pantoprazole significantly improved GERD symptom scores and was well tolerated. However, during the DB treatment phase, there were no significant differences between pantoprazole and placebo in withdrawal rates due to lack of efficacy. OBJECTIVE To evaluate the efficacy and safety of proton pump inhibitors in infants aged < 1 year with gastroesophageal reflux disease (GERD). STUDY DESIGN In this randomized, double-blind, placebo-controlled multicenter study, neonates (premature to 1 month corrected age; n = 52) with signs and symptoms of GERD received esomeprazole 0.5 mg/kg or placebo once daily for up to 14 days. Change from baseline in the total number of GERD symptoms (from video monitoring) and GERD-related signs (from cardiorespiratory monitoring) was assessed with simultaneous esophageal pH, impedance, cardiorespiratory, and 8-hour video monitoring. RESULTS There were no significant differences between the esomeprazole and placebo groups in the percentage change from baseline in the total number of GERD-related signs and symptoms (-14.7% vs -14.1%, respectively).
Mean change from baseline in the total number of reflux episodes was not significantly different between esomeprazole and placebo (-7.43 vs -0.2, respectively); however, the percentage of time pH was < 4.0 and the number of acidic reflux episodes > 5 minutes in duration were significantly decreased with esomeprazole vs placebo (-10.7 vs 2.2 and -5.5 vs 1.0, respectively; P ≤ .0017). The number of patients with adverse events was similar between treatment groups. CONCLUSIONS Signs and symptoms of GERD traditionally attributed to acidic reflux in neonates were not significantly altered by esomeprazole treatment. Esomeprazole was well tolerated and reduced esophageal acid exposure and the number of acidic reflux events in neonates. OBJECTIVE. Gastric acidity (GA) inhibitors, including histamine-2 receptor antagonists (H2 blockers) and proton pump inhibitors (PPIs), are the mainstay of gastroesophageal reflux disease (GERD) treatment. Prolonged GA inhibitor-induced hypochlorhydria has been suggested as a risk factor for severe gastrointestinal infections. In addition, a number of papers and a meta-analysis have shown an increased risk of pneumonia in H2-blocker-treated intensive care patients. More recently, an increased risk of community-acquired pneumonia associated with GA inhibitor treatment has been reported in a large cohort of adult patients. These findings are particularly relevant to pediatricians today because so many children receive some sort of GA-blocking agent to treat GERD. To test the hypothesis that GA suppression could be associated with an increased risk of acute gastroenteritis and pneumonia in children treated with GA inhibitors, we conducted a multicenter, prospective study. METHODS. The study was performed by expert pediatric gastroenterologists from 4 pediatric gastroenterology centers.
Children (aged 4–36 months) consecutively referred for common GERD-related symptoms (for example, regurgitation and vomiting, feeding problems, effortless vomiting, choking) from December 2003 to March 2004 were considered eligible for the study. Exclusion criteria were a history of GA inhibitor therapy in the previous 4 months, Helicobacter pylori infection, diabetes, chronic lung or heart diseases, cystic fibrosis, immunodeficiency, food allergy, congenital gastrointestinal motility disorders, neuromuscular diseases, or malnutrition. Control subjects were recruited from healthy children visiting the centers for routine examinations. The diagnosis of GERD was confirmed in all patients by standard criteria. GA inhibitors (10 mg/kg ranitidine per day in 50 children or 1 mg/kg omeprazole per day in 50 children) were prescribed by the physicians for 2 months. All enrolled children were evaluated during a 4-month follow-up. The end point was the number of patients presenting with acute gastroenteritis or community-acquired pneumonia during the 4-month follow-up period. RESULTS. We obtained data on 186 subjects: 95 healthy controls and 91 GA-inhibitor users (47 on ranitidine and 44 on omeprazole). The 2 groups were comparable for age, gender, weight, length, and incidence of acute gastroenteritis and pneumonia in the 4 months before enrollment. The rate of subjects presenting with acute gastroenteritis and community-acquired pneumonia was significantly increased in patients treated with GA inhibitors compared with healthy controls during the 4-month follow-up period. In the GA inhibitor-treated group, the rate of subjects presenting with acute gastroenteritis and community-acquired pneumonia was increased when comparing the 4 months before and after enrollment.
No differences were observed between H2 blocker and PPI users in acute gastroenteritis and pneumonia incidence in the previous 4 months or during the follow-up period. By contrast, in healthy controls the incidence of acute gastroenteritis and pneumonia remained stable. CONCLUSIONS. This is the first prospective study performed in pediatric patients showing that the use of GA inhibitors was associated with an increased risk of acute gastroenteritis and community-acquired pneumonia in GERD-affected children. It is worth underlining that we observed an increased incidence of intestinal and respiratory infection in otherwise healthy children taking GA inhibitors for GERD treatment, whereas the majority of previous data showed that the patients most at risk for pneumonia were those with significant comorbid illnesses such as diabetes or immunodeficiency; this points to the importance of GA suppression as a major risk factor for infections. In addition, this effect seems to be sustained even after the end of therapy. The results of our study are attributable to many factors, including a direct inhibitory effect of GA inhibitors on leukocyte functions and qualitative and quantitative modification of the gastrointestinal microflora. Additional studies are necessary to investigate the mechanisms of the increased risk of infections in children treated with GA inhibitors, and prophylactic measures could be considered to prevent them. The use of over-the-counter antacids has increased in children under the age of 12 years, and has been followed by an apparent increase in the use of over-the-counter histamine-2 receptor antagonists.
However, the pharmacokinetic and pharmacodynamic effects of over-the-counter histamine-2 receptor antagonists in the paediatric population are largely unknown. OBJECTIVE To assess the effect of medical antireflux treatment, and of an infant mental health consultation (IMHC), on persistent crying in infants and maternal distress. METHODS Infants under 9 months of age with persistent crying, and their mothers, were enrolled in a randomized placebo-controlled trial. At enrollment, a questionnaire on demographic and clinical details was completed by mothers, and maternal distress was measured (Experience of Motherhood Questionnaire; EMQ). Oesophageal 24-h pH monitoring was performed in all infants on day 2. At week 4, the cry chart and EMQ were repeated in conjunction with a final interview. RESULTS One hundred and three infants (56 under 3 months of age; 55 male) who were randomized to active medication (ranitidine plus cisapride; n = 34), placebo (n = 29), or IMHC (n = 40) completed the trial. There was a significant reduction in crying duration from baseline to week 4 (253 ± 96.5 min vs 159 ± 92.3 min per 24 h; P < 0.001), without differences between treatment groups (ANOVA: F = 0.75; P = 0.48). There was a modest improvement in EMQ scores, from 44.9 ± 8.6 at day 1 to 42.8 ± 9.4 at week 4 (P = 0.006). The improvement in maternal stress was similar in all treatment groups (Kruskal-Wallis chi2 = 0.354; P = 0.84), but subsequent admission to a mother-infant unit was significantly less frequent in the IMHC group (P < 0.05). CONCLUSION Antireflux medications and IMHC were not superior to placebo in treating infants with persistent crying.
Although the reduction in maternal distress was similar in all treatment groups, the individualized IMHC reduced the need for subsequent admission to a mother-infant unit. Objectives: We aimed to determine whether nocturnal acid breakthrough occurs in children receiving proton pump inhibitors for reflux esophagitis, and to compare the healing of esophagitis in children with nocturnal acid breakthrough receiving proton pump inhibitors ± ranitidine. Methods: This was a prospective, double-blind study. Endoscopic and histologic esophagitis were scored 0-4 and 0-3, respectively. Patients were treated with a proton pump inhibitor twice daily, and esophagogastric pH monitoring was performed at week 3. Patients with nocturnal acid breakthrough were randomized: one group received ranitidine and the other received placebo at bedtime, in addition to proton pump inhibitor therapy. Endoscopy was performed on all patients (with pH monitoring on patients with nocturnal acid breakthrough) during the 17th week of therapy. Results: We enrolled 18 patients, ages 1 to 13 years (mean = 10.3 years). Mean baseline endoscopic and histologic scores were 3.1 ± 1.4 and 1.8 ± 0.7, respectively. Mean dose of proton pump inhibitor was 1.3 ± 0.6 mg/kg. Nocturnal acid breakthrough was documented in 16/18 (89%) patients. Seven patients received ranitidine and 9 received placebo. The reflux index improved: mean of 14.3 at baseline, 2.0 at week 3 (P = 0.0001), and 5.1 at week 17 (P = 0.09). Nocturnal acid breakthrough persisted in 9/12 (75%) patients, 3 of whom received ranitidine at bedtime. Esophagitis improved in all patients following therapy: mean endoscopy and histology scores were 1.6 ± 1.8 (P = 0.0020) and 0.8 ± 0.9 (P = 0.0013), respectively. Symptoms significantly improved from a mean score of 2.0 at baseline to 0.4 at week 17 (P = 0.0001).
Conclusions: Nocturnal acid breakthrough is common in pediatric patients treated with proton pump inhibitors. The reflux index remains normal in spite of nocturnal acid breakthrough. Symptoms and esophagitis continued to improve during therapy in spite of nocturnal acid breakthrough. There appears to be no additional benefit to supplementation with ranitidine at bedtime. Aim: The efficacy and safety of rabeprazole, a proton pump inhibitor, were studied in infants with gastroesophageal reflux disease (GERD). Methods: Infants ages 1 to 11 months, with symptomatic GERD resistant to conservative therapy and/or previous exposure to acid-suppressive medications, were screened. After scoring >16 on a GERD symptom score (Infant Gastroesophageal Reflux Questionnaire-Revised [I-GERQ]), 344 infants were enrolled in a 1- to 3-week open-label (OL) phase and received rabeprazole 10 mg/day. Following caregiver-rated clinical improvement during the OL phase, patients were randomized to placebo, rabeprazole 5 mg, or rabeprazole 10 mg in the ensuing 5-week double-blind (DB) withdrawal phase. Primary endpoints evaluated from DB baseline to the end of the DB withdrawal phase included frequency of regurgitation, weight-for-age z score, and daily and weekly GERD symptom scores. Results: Overall, 231 (86%) of the 268 randomized infants (placebo: 90; rabeprazole 5 mg: 90; rabeprazole 10 mg: 88) completed the study. Efficacy endpoints were similarly improved during the OL phase in all of the groups, and continued improving during the DB withdrawal phase with no difference between the placebo and combined rabeprazole groups.
Mean decrease in frequency of regurgitation (−0.79 vs −1.20 times per day; P = 0.168), in I-GERQ-Revised scores (−3.6 [−25%] vs −3.9 points [−27%]; P = 0.960), in I-GERQ-Daily Diary scores (−1.87 [−19%] vs −1.85 [−19%]; P = 0.968), and increase in weight-for-age z scores (mean [standard deviation]: 0.11 [0.329] vs 0.14 [0.295]; P = 0.440) indicated equal improvement. Equal percentages (47%) reported adverse events in the placebo and combined rabeprazole groups, with no new safety signals emerging. Conclusions: In those infants with GERD who improved with rabeprazole during the OL phase, improvements in symptoms and weight were similar in those who continued rabeprazole and those withdrawn to placebo during a 5-week DB phase. Peptic esophagitis is a common complication of gastroesophageal reflux. Therapeutic measures aimed at reinforcing the anti-gastroesophageal reflux barrier, reducing acid secretion, or increasing the defense mechanisms of the esophageal mucosa are used to treat this form of esophagitis. The purpose of this study was to determine the efficacy of sucralfate in the treatment of peptic esophagitis in children. We studied 75 patients diagnosed endoscopically as suffering from esophagitis. The age of the patients ranged from three months to 13 years. Gastroesophageal reflux was diagnosed by isotopic investigation and/or radiologically. None of the patients had kidney disease or had received anti-inflammatory drugs, sucralfate, or cimetidine during the preceding two weeks. The patients were divided into three groups of 25. Patients were homogeneous in age, sex, nutritional status, symptoms, and grade of esophagitis. All patients in each group were treated with cimetidine, sucralfate tablets, or sucralfate suspension. No other dietary or postural measures were prescribed. Clinical examinations were carried out on Days 14, 28, 42, and 56, with an endoscopic examination on Day 28.
Endoscopy was repeated on Day 56 if the course was unsatisfactory. Statistical examination of the data showed that there were no differences between the three groups. Sucralfate is a useful drug for the treatment of peptic esophagitis in children. OBJECTIVE To assess the efficacy and safety of lansoprazole in treating infants with symptoms attributed to gastroesophageal reflux disease (GERD) that have persisted despite a ≥1-week course of nonpharmacologic management. STUDY DESIGN This multicenter, double-blind, parallel-group study randomized infants with persisting symptoms attributed to GERD to treatment with lansoprazole or placebo for 4 weeks. Symptoms were tracked through daily diaries and weekly visits. Efficacy was defined primarily by a ≥50% reduction in measures of feeding-related crying and secondarily by changes in other symptoms and global assessments. Safety was assessed based on the occurrence of adverse events (AEs) and clinical/laboratory data. RESULTS Of the 216 infants screened, 162 met the inclusion/exclusion criteria and were randomized. Of those, 44/81 infants (54%) in each group were responders -- identical for lansoprazole and placebo. No significant lansoprazole-placebo differences were detected in any secondary measures or analyses of efficacy. During double-blind treatment, 62% of lansoprazole-treated subjects experienced 1 or more treatment-emergent AEs, versus 46% of placebo recipients (P = .058). Serious AEs (SAEs), particularly lower respiratory tract infections, occurred in 12 infants, significantly more frequently in the lansoprazole group compared with the placebo group (10 vs 2; P = .032). CONCLUSIONS This study detected no difference in efficacy between lansoprazole and placebo for symptoms attributed to GERD in infants age 1 to 12 months.
SAEs, particularly lower respiratory tract infections, occurred more frequently with lansoprazole than with placebo. OBJECTIVE To assess the efficacy of omeprazole in treating irritable infants with gastroesophageal reflux and/or esophagitis. STUDY DESIGN Irritable infants (n = 30) 3 to 12 months of age met the entry criteria of esophageal acid exposure >5% (n = 22) and/or abnormal esophageal histology (n = 15). They completed a 4-week, randomized, double-blind, placebo-controlled crossover trial of omeprazole. A cry/fuss diary (minutes/24 hours) and a visual analogue scale of infant irritability as judged by parental impression were obtained at baseline and at the end of each 2-week treatment period. RESULTS The reflux index fell significantly during omeprazole treatment compared with placebo (-8.9% ± 5.6% vs -1.9% ± 2.0%, P < .001). Cry/fuss time decreased from baseline (267 ± 119), regardless of treatment sequence (period 1, 203 ± 99, P < .04; period 2, 188 ± 121, P < .008). The visual analogue score decreased from baseline to period 2 (6.8 ± 1.6 vs 4.8 ± 2.9, P = .008). There was no significant difference in either outcome measure while taking omeprazole versus placebo. CONCLUSIONS Compared with placebo, omeprazole significantly reduced esophageal acid exposure but not irritability. Irritability improved with time, regardless of treatment. Background: Gastro-oesophageal reflux afflicts up to 7% of all infants. Histamine-2 receptor antagonists are the most commonly prescribed medications for this disorder, but few controlled studies support this practice. OBJECTIVE Diagnosing asthma in infancy is largely made on the basis of the symptoms of cough and wheezing. A similar presentation can be seen in neurologically normal infants with excessive gastroesophageal reflux (GER). There are no randomized placebo-controlled studies in infants using proton pump inhibitors (PPIs) alone or in addition to prokinetic agents.
The primary objective was to confirm the presence of excessive GER in a population of infants that also had respiratory symptoms suggestive of asthma. Second, in a randomized placebo-controlled fashion, we determined whether treatment of GER with bethanechol and omeprazole could improve these respiratory symptoms. METHODS Infants (n = 22) with a history of chronic cough and wheeze were enrolled if they had evidence of GER by history and an abnormal pH probe or gastric emptying scan. Infants were randomly allocated to four treatment groups: placebo/placebo (PP), omeprazole plus bethanechol (OB), omeprazole/placebo (OP), and bethanechol/placebo (BP). Evaluations by clinic questionnaire and exam, home diary, and pH probe data were done before and after study medication and after open-label OB. RESULTS Nineteen children were studied. PP did not affect GER or respiratory symptoms and did not decrease GER measured by pH probe. In contrast, OB decreased GER as measured by pH probe indices and parental assessment. In association, OB significantly decreased daytime coughing and improved respiratory scores. No adverse effects were reported. CONCLUSIONS In infants with a clinical presentation suggestive of chronic GER-related cough, the use of omeprazole and bethanechol appears to be a viable therapeutic option. BACKGROUND Nizatidine is an H2 histamine receptor blocker, which acts on the oxyntic cells in the stomach. The efficacy of nizatidine on acid gastric secretion has been widely studied in adults with erosive and ulcerative esophagitis, but not in children. The aim of the present study was to evaluate the therapeutic efficacy of nizatidine in children with reflux esophagitis. METHODS Twenty-six patients were studied; all of them underwent endoscopy with multiple esophageal biopsies and 24-h intraesophageal pH monitoring. The diagnosis of esophagitis was based on histologic features.
Patients were randomly assigned to double-blind treatment with either nizatidine or a placebo (10 mg/kg/day in two doses) for 8 weeks. A symptomatic score assessment was evaluated during the study. RESULTS Twenty-four patients completed the 8-week protocol. After therapy, 9/13 (69%) patients on nizatidine and 2/13 (15%) patients on the placebo were healed (p < 0.007 by Fisher's exact test). Histological findings were improved in two other (16.7%) patients and unchanged in the last (8.3%) patient on nizatidine. In the placebo group there was histological improvement in three (25%) patients, no variation in six (50%), and worsening in one (8.3%). After therapy, determination of esophageal pH showed a statistically significant decrease of the total acid exposure time (p < 0.01) only in the nizatidine group. The clinical score analysis showed an improvement of symptoms only in the nizatidine group (p < 0.01), except for vomiting, which was reduced in both groups. CONCLUSIONS Our results show that nizatidine is effective in treating children with reflux esophagitis. The children included in this study did not have severe esophagitis, and the conclusion must be limited to those with mild to moderate degrees of disease. BACKGROUND This study was performed to study the demography, effect of treatment with ranitidine, and relapse pattern in patients with reflux symptoms. METHODS Patients with reflux symptoms were examined by endoscopy and included in a double-blind, comparative trial of placebo and ranitidine 150 mg b.i.d. for two weeks. At two weeks, satisfied patients continued the same treatment. Non-satisfied patients were randomised to ranitidine 150 mg b.i.d. or q.i.d. for another two weeks. After four weeks medication was stopped and satisfied patients were followed for 24 weeks. No further endoscopy was performed. RESULTS Four hundred and twenty-seven patients were randomised.
At two weeks there was no significant difference between placebo and ranitidine regarding the proportion of patients with complete relief from symptoms or satisfied with treatment. Ranitidine was superior to placebo in improving symptoms at two weeks. Ranitidine 150 mg q.i.d. offered no additional advantage in weeks three to four over prolonging treatment with 150 mg b.i.d. after the first two weeks. Patients with oesophagitis at inclusion relapsed more than those with symptoms only: 67% compared with 52% (p = 0.013). CONCLUSIONS The effect of ranitidine was marginal compared to placebo. The relapse rate was high after treatment stopped. OBJECTIVES Gastroesophageal reflux disease (GERD) is present in pediatric patients when reflux of gastric contents causes troublesome symptoms and/or complications. The present study evaluates the efficacy and safety of esomeprazole in infants ages 1 to 11 months with GERD. METHODS In this multicenter randomized, double-blind, placebo-controlled, parallel-group, treatment-withdrawal study, infants received open-label, weight-adjusted doses of esomeprazole (2.5-10 mg) once daily for 2 weeks. Infants with symptom improvement were randomized to esomeprazole (weight-adjusted doses [2.5-10 mg]) or placebo for 4 weeks. The primary endpoint was time to discontinuation owing to symptom worsening, based on global assessments by the parent/guardian and physician. Adverse events were recorded. RESULTS Of the 98 patients enrolled, 81 (82.7%) experienced symptom improvement, determined by physician global assessment (PGA), during open-label esomeprazole treatment; 80 entered the double-blind phase. During this phase, discontinuation rates owing to symptom worsening were 48.8% (20/41) for placebo-treated versus 38.5% (15/39) for esomeprazole-treated patients (hazard ratio 0.69; P = 0.28).
Post hoc analysis of infants with symptomatic GERD (ie, no diagnostic procedure performed) revealed that time to discontinuation was significantly longer with esomeprazole than placebo (hazard ratio 0.24; P = 0.01); the complementary subgroup difference was not significant (hazard ratio 1.39; P = 0.48). Esomeprazole was well tolerated. CONCLUSIONS The discontinuation rate owing to symptom worsening did not differ significantly between infants receiving esomeprazole and those receiving placebo. Improved diagnostic criteria in this age group are needed to identify infants with GERD who may benefit from acid suppression therapy. Objective: Proton-pump inhibitors (PPIs) reduce acid gastroesophageal reflux (GER) and esophageal acid exposure in infants; however, they do not reduce total GER or symptoms attributed to GER. Reflux is reduced in the left lateral position (LLP). We hypothesized that the effect of LLP in combination with acid suppression would be most effective in reducing GER symptoms in infants. Methods: In this prospective sham-controlled trial, infants (0-6 months) with symptoms suggestive of gastroesophageal reflux disease were studied using 8-hour pH-impedance, cardiorespiratory and video monitoring, direct nurse observation, and a validated questionnaire. Infants demonstrating a positive GER symptom association were randomized to 1 of 4 groups: PPI + LLP, PPI + head-of-cot elevation (HE), antacid (AA) + LLP, or AA + HE. HE and AA were considered "sham" therapies. After 2 weeks the 8-hour studies were repeated on-therapy. Results: Fifty-one patients were included (aged 13.6 [2-26] weeks). PPI + LLP was most effective in reducing GER episodes (69 [13] to 46 [10], P < 0.001) and esophageal acid exposure (median [interquartile range] 8.9% [3.1%-18.1%] to 1.1% [0%-4.4%], P = 0.02).
No treatment group showed improvement in crying/irritability, although vomiting was reduced in the AA + LLP group (from 7 [2] to 2 [0] episodes, P = 0.042). LLP compared with HE produced a greater reduction in total GER (−21 [4] vs −10 [4], P = 0.056), regardless of acid-suppressive therapy. Acid exposure was reduced on PPI compared with AA (−6.8 [2.1] vs −0.9 [1.4]%, pH < 4, P = 0.043), regardless of positional intervention. A post hoc analysis using automated analysis software revealed a significant reduction in crying symptoms in the PPI + LLP group (99 [65-103] to 62 [32-96] episodes, P = 0.018). Conclusions: "Symptomatic gastroesophageal reflux disease" implies disease causation for distressing infant symptoms. In infants with symptoms attributed to GER, LLP produced a significant reduction in total GER but did not result in a significant improvement in symptoms other than vomiting; however, automated analysis appeared to identify infants with GER-associated crying symptoms who responded to positioning therapy. This is an important new insight for future research.
2,059
21,718,430
• Gemcitabine combined chemotherapy is active in the management of metastatic bladder cancer . • GCis may be considered an alternative regimen to MVAC . •
OBJECTIVE • To systematically review the literature on gemcitabine chemotherapy for advanced or metastatic bladder cancer .
Objective: The study aimed at evaluating the activity and toxicity of gemcitabine monochemotherapy in an unselected series of elderly patients with advanced bladder cancer. The secondary objectives were to establish whether there is a correlation between treatment and Comprehensive Geriatric Assessment (CGA) and, in addition, to determine overall patient survival. Methods: Treatment consisted of six courses of chemotherapy with gemcitabine at a dosage of 1,200 mg/m2 on days 1 and 8, every 21 days. CGA, as described by Gruppo Italiano di Oncologia Geriatrica, was assessed to evaluate the functional status of patients before, during, and after treatment. Results: Twenty-five patients were enrolled (M/F 22/3); 22 of these were evaluable for response and 23 for toxicity. Characteristics of patients: median age 76 years (range 71–87); ECOG performance status (PS) 1 in 12 patients and 2 in 13 patients; clinical stage III in 6 patients and IV in 19 patients. At the end of the therapy the parameters of CGA improved in 4 cases (17%), remained unchanged in 17 cases (74%), and worsened in only 2 cases (9%). Two patients were not evaluable. Response evaluation showed 3 (13.5%) complete responses (CRs) and 7 (32%) partial responses (PRs), for an overall response rate of 45.5% (95% confidence interval [CI], 24.3–65.7%). Three (13.5%) patients had stable disease (SD) and 9 (41%) disease progression (DP). Median overall survival was 8 months and median time to progression was 5 months. Treatment was generally well tolerated, with 1 patient having grade 3 gastrointestinal toxicity and 3 having grade 4 neutropenia.
Conclusions: We conclude that gemcitabine can be safely administered as monochemotherapy, is effective, and does not worsen the functional status of elderly patients with advanced bladder cancer. PURPOSE To evaluate the efficacy and toxicity of weekly paclitaxel and gemcitabine in patients with advanced transitional-cell carcinoma (TCC) of the urothelial tract. PATIENTS AND METHODS Patients with advanced unresectable TCC were enrolled onto this multicenter, community-based, phase II trial. Initially, patients were treated with paclitaxel 110 mg/m(2) and gemcitabine 1,000 mg/m(2) by intravenous infusion on days 1, 8, and 15 every 28 days. Patients who had an objective response or stable disease continued treatment for a maximum of six courses. Paclitaxel was decreased to 90 mg/m(2) and gemcitabine was decreased to 800 mg/m(2) for the last 12 patients because of a concerning incidence of pulmonary toxicity in the first 24 patients. RESULTS Thirty-six patients were enrolled between September 1998 and March 2003. Twenty-four patients received the higher doses of paclitaxel and gemcitabine, and 12 patients received the lower doses. Twenty-five (69.4%) of 36 patients had major responses to treatment, including 15 patients (41.7%) with complete responses. With a median follow-up time of 38.7 months, the median survival time was 15.8 months. Grade 3 and 4 toxicities included granulocytopenia (36.1%), thrombocytopenia (8.3%), and neuropathy (16.7%). Five patients (13.9%) had grades 3 to 5 pulmonary toxicity, and one patient had grade 2 pulmonary toxicity. CONCLUSION Weekly paclitaxel and gemcitabine is an active regimen in the treatment of patients with advanced TCC.
However, because of the high incidence of pulmonary toxicity associated with this schedule of paclitaxel and gemcitabine, we recommend against the use of this regimen in this patient population. BACKGROUND To investigate a novel schedule of gemcitabine (G), paclitaxel (P), and carboplatin (C), based on preclinical studies demonstrating greater activity and decreased toxicity when P is administered prior to C. MATERIAL/METHODS The effect of the P and C drug sequence on tumor cell viability was assessed with a tetrazolium assay on T24 bladder and DU145 prostate cancer cells. Patients with transitional cell cancer (TCC) and other advanced malignancies were treated with G and P on days 1, 8, and 15 of each 28-day cycle. C was administered on day 2 at an AUC of 5. Doses of G and P were varied among cohorts of three patients. RESULTS Preclinical studies demonstrated that the sequence of P followed by C induced greater cytotoxicity than the reverse sequence. The recommended phase II dose (RPTD) was defined as 70 mg/m2 P, 300 mg/m2 G, and C at an AUC of 5. Therapy was well tolerated; fever and neutropenia occurred in only one patient at the RPTD. Grade 3 thrombocytopenia occurred in 5 of 21 patients treated at the RPTD. Of all 37 patients treated on study, 9 achieved a partial tumor response (PR) and two achieved a complete response (CR). Of the 15 patients with TCC, six had a PR and two had a CR. CONCLUSIONS Preclinical studies demonstrated that the sequence of paclitaxel followed by carboplatin was more effective than the opposite sequence. Administration of gemcitabine and paclitaxel followed by carboplatin was well tolerated and clinically active.
GCP should be compared with other combination regimens under investigation for the treatment of TCC. Our objective was to determine the response to gemcitabine plus docetaxel in advanced urothelial transitional cell carcinoma in a phase II trial, and gemcitabine distribution between plasma and erythrocytes following docetaxel administration. Patients with locally advanced or metastatic transitional cell carcinoma, following a maximum of one prior chemotherapy regimen, were given gemcitabine 800 mg/m2 on days 1 and 8 plus docetaxel 85 mg/m2 on day 8, every 21 days. Gemcitabine was measured in the plasma and erythrocytes of nine patients before and after docetaxel administration. Thirty-four patients (median 63 years; range 49–79 years), of whom seven had prior chemotherapy and 27 were chemotherapy-naive, received a median of six cycles (range 1–6). Complete and partial remissions were observed in two and 16 (including three pretreated) patients, respectively, for an overall response rate of 53%. Median response duration was 5 months (range 1–39+). Haematotoxicity was manageable, despite grade 3 infections in 24% of patients, and other toxicities were mostly mild. An apparent shift of gemcitabine from plasma to erythrocytes occurred after docetaxel in five of six patients evaluable for this analysis. We conclude that gemcitabine plus docetaxel is tolerable and highly active in treated and untreated patients with advanced transitional cell carcinoma. BACKGROUND Despite the fact that new drugs have emerged from clinical research in urothelial cancer during the last decade, the prognosis of patients with advanced disease remains poor, with a median survival of 12 to 14 months. We designed a feasibility study of gemcitabine and oxaliplatin (GO) in patients with advanced urothelial cancer. PATIENTS AND METHODS Twenty patients received bimonthly cycles of gemcitabine 1500 mg/m2 and oxaliplatin 85 mg/m2.
The cycles were given at 2-week intervals without G-CSF support. Thirteen patients were treated with the GO combination as first-line chemotherapy because of a poor performance status or a creatinine clearance <1 ml/s. RESULTS The median number of cycles of GO was 5 (1-7). The median number of days between cycles was 14 throughout the treatment. Seven (8%) of 87 cycles had to be delayed because of neutropenia or asthenia. A 25% reduction in the doses of cytotoxic drugs was necessary in 2 patients. Chemotherapy was stopped before the sixth cycle because of an early death related to a myocardial infarction in 1 patient, a grade 3 neuropathy in 1 patient, and progressive disease in 9 patients. CONCLUSION Using these doses and schedules, the GO regimen appears a safe therapy for patients with advanced urothelial cancer. Phase II studies are required to assess the possible role of this combination in advanced urothelial cancer. AIM Chemotherapeutic agents are active in transitional cell cancer of the urothelium, and combinations have shown promising results. The objective of this study was to evaluate palliative chemotherapy with gemcitabine, paclitaxel, and cisplatin for transitional cell carcinoma. METHODS Thirty-four patients with advanced transitional cell carcinoma of the urothelium were treated between 2000 and 2007. All patients received chemotherapy with intravenous gemcitabine at a dose of 1000 mg/m2 on days I and VIII, intravenous paclitaxel at a dose of 80 mg/m2 on days I and VIII, and intravenous cisplatin at a dose of 50 mg/m2 on day II. Treatment courses were repeated every 21 days. After completion of four to six courses of this regimen, intravenous gemcitabine at a dose of 1000 mg/m2 was administered every 28 days. RESULTS Twelve patients (35.3%) had ≥1 visceral sites of metastases.
Twenty-two patients (64.7%) achieved objective responses to treatment (29.4% complete responses). The median actuarial survival was 18.5 months, and the actuarial one-year and two-year survival rates were 56% and 26%, respectively. After a median follow-up of 16.3 months, 18 patients remained alive. The median progression-free survival was 7 months. Median survival time for patients with ECOG status 0, 1, and 2 was 45, 12, and 10.5 months, respectively. Grade 3-4 neutropenia occurred in 41.2% of patients. CONCLUSIONS The combination of gemcitabine, paclitaxel, and cisplatin is a highly effective and tolerable regimen for patients with advanced transitional cell carcinoma of the urothelium. This treatment should be considered a suitable option that deserves further prospective evaluation. ECOG performance status and visceral metastases are important predictive factors for survival. Clinical trials in urothelial cancer exclude a large population of patients. An observational study evaluated the behavior of frail patients not eligible for cisplatin- or carboplatin-based regimens. Urothelial cancer patients requiring chemotherapy who had chronic renal failure (creatinine clearance <60 ml/min), and/or performance status (PS) ≥2, and/or cardiac dysfunction were prospectively observed. The treatment combined gemcitabine 1200 mg/m2 and oxaliplatin 85 mg/m2, bimonthly (GO). Over 2 years, 31 of 45 (69%) patients with urothelial cancer requiring chemotherapy were not eligible for cisplatin- or carboplatin-based chemotherapy. Sixteen (52%) had a PS ≥2, 23 (74%) had creatinine clearance <60 ml/min, and 20 (65%) had an underlying cardiopathy. A total of 178 cycles of GO were administered (median 6 per patient, range 2-12). No aggravation of renal or cardiac status was noted.
Acute grade 3 and 4 neutropenia and thrombocytopenia were observed in 16% and 13% of patients, respectively, with one febrile neutropenia. The median progression-free and overall survival values were 4.2 and 9.5 months, respectively. The majority of urothelial cancer patients have severe renal or cardiac comorbidities, and we conclude that in this subset of patients the combination of gemcitabine and oxaliplatin is well tolerated, and its clinical activity warrants further evaluation.

PURPOSE To determine the efficacy of gemcitabine and cisplatin combination therapy in patients with advanced and/or metastatic transitional cell urothelial carcinoma. PATIENTS AND METHODS Forty-two chemonaïve patients with Karnofsky performance status (KPS) ≥ 70 were treated with cisplatin 35 mg/m2 followed by gemcitabine 1000 mg/m2 (30-min i.v. infusion) on days 1, 8, and 15 every twenty-eight days. RESULTS Thirty-eight patients were evaluable for efficacy. Half had visceral disease. There were seven complete (18%) and nine partial responses (24%), for a response rate of 42% (95% confidence interval (95% CI): 26%-59%). Responses were independently reviewed. Median response duration was 13.5 months (95% CI: 8.5-18.1 months), median time to progressive disease 7.2 months (95% CI: 4.0-9.1 months) and median survival 12.5 months (95% CI: 8.1-18.7 months); one-year survival was 52%. Laboratory toxicities included leucopenia (44% grade 3; 17% grade 4), neutropenia (25% grade 3; 33% grade 4) and thrombocytopenia (29% grade 3; 49% grade 4). Four patients had grade 4 symptomatic toxicity (three nausea and vomiting, one diarrhoea). There were no grade 4 infections and no toxic deaths. CONCLUSIONS The combination of gemcitabine and cisplatin is active in patients with locally advanced and/or metastatic urothelial carcinoma.
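Response rates in these abstracts are routinely reported with a 95% confidence interval, e.g. 16 of 38 evaluable patients above, 42% (95% CI 26%-59%). That interval can be reproduced almost exactly with the simple normal-approximation (Wald) interval for a binomial proportion; the abstracts do not state which method the authors actually used, so the sketch below is only an illustrative check.

```python
import math

def wald_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation (Wald) 95% CI for a binomial proportion."""
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p - half_width, p + half_width

# 16 responders among 38 evaluable patients
lo, hi = wald_ci(16, 38)  # roughly (0.26, 0.58), matching the reported 26%-59%
```

For small samples or proportions near 0 or 1, exact (Clopper-Pearson) or Wilson intervals are usually preferred over the Wald interval.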
The weekly schedule of cisplatin is considered inappropriate.

There is a need to identify active new regimens in patients with advanced urothelial cancer. Pemetrexed and gemcitabine are active agents in advanced urothelial cancer. A phase 2 trial of the combination of these 2 agents was performed in patients with advanced urothelial cancer who were previously untreated for metastatic disease.

Objectives: i) To evaluate objective response, toxicity, and quality of life (QoL) of gemcitabine monotherapy as second-line treatment in patients with cisplatin-refractory, metastatic transitional cell carcinoma (TCC). ii) To assess prognostic parameters for response to treatment and for improvement of QoL parameters. Patients and Methods: 30 patients were prospectively enrolled in this open-label, nonrandomized multicenter phase II trial. Patients received up to 6 courses of gemcitabine monotherapy (1,250 mg/m2 on days 1 and 8 of a 21-day course). 28 of 30 patients were available for response evaluation. Results: Objective response (OR) was seen in 3/28 (11%) of patients (2 complete remissions, 1 partial remission). The mean time to progression (TTP) was 4.9 ± 3.5 months and mean disease-specific survival time was 8.7 ± 4.7 months. 13 of 28 patients did not progress (OR + 10 stable diseases), and TTP (8.0 ± 2.7 months, p < 0.001) as well as survival time (10.2 ± 3.8 months, p < 0.05) differed significantly from those who showed progressive disease within 18 weeks of treatment. Pain values significantly improved in the group of responders from 4.3 ± 1.9 to 5.8 ± 1.3 points (p < 0.05). Response to cisplatin pretreatment was the best prognosticator for the response to gemcitabine. Conclusions: Gemcitabine monotherapy as second-line treatment is justified in patients with metastatic TCC who are refractory to cisplatin treatment. Patients with an initial OR to cisplatin benefit most from second-line treatment.
QoL remains stable during treatment, and pain improves especially in patients with bone metastases.

OBJECTIVE The aim of this study was to evaluate the efficacy and toxicities of the gemcitabine and paclitaxel combination regimen as second-line chemotherapy for patients with advanced or metastatic urothelial carcinoma (UC) who have previously been treated with platinum-based chemotherapy for the metastatic disease. METHODS Thirty-three patients with advanced or metastatic UC who had received platinum-based chemotherapy were treated with an outpatient gemcitabine and paclitaxel combination regimen. A dose of 180 mg/m(2) paclitaxel was administered by intravenous (IV) infusion on day 1, and 1000 mg/m(2) gemcitabine was administered by IV on days 1, 8 and 15. The course was repeated every 28 days. Patients were evaluated after every 2 cycles of therapy using computed tomography. RESULTS Of the 33 patients enrolled in this study, 30 could be evaluated to determine treatment efficacy; 10 had an objective response [overall response rate: 33.3%, 95% confidence interval (CI), 19.2-51.2%]. The median overall survival was 11.3 months (95% CI, 7.2-13.6 months). Chemotherapy sensitivity differed with disease site. The response rates of lung and bone metastases were 27% and 14%, and the progressive disease (PD) rates of lung and bone metastases were 13% and 14%, respectively. On the other hand, the response rate of liver metastasis was 14%, and its PD rate was 57%. None of the patients (n = 3) with adrenal metastasis responded to this regimen. Toxicities were mild, and no life-threatening complications occurred. CONCLUSIONS Gemcitabine and paclitaxel combination therapy is a tolerable and active regimen for patients with advanced UC after failure of platinum-based chemotherapy.

Chemotherapeutic agents are active in advanced bladder cancer, and various combinations have shown promising results.
The objective of this study was to evaluate the efficacy of combination chemotherapy with gemcitabine, paclitaxel, and cisplatin in patients with advanced urothelial carcinoma. Fifty-nine patients with metastatic or locally advanced transitional cell carcinoma of the urothelium were treated between 2000 and 2005. No patient had received any previous systemic chemotherapy. All patients received chemotherapy intravenously with gemcitabine at a dose of 1000 mg/m(2) on days 1 and 8, paclitaxel at a dose of 80 mg/m(2) on days 1 and 8, and cisplatin at a dose of 50 mg/m(2) on day 2. Treatment courses were repeated every 21 days. After completion of four to six courses of this regimen, an intravenous application of gemcitabine was repeated every 28 days at a dose of 1000 mg/m(2). Fifty-nine patients were treated between 2000 and 2005. Nine patients (15%) had ≥ 1 visceral site of metastases, and no patient had received any previous systemic chemotherapy. Forty-eight patients (81%) achieved objective responses to treatment (56% complete responses). The median actuarial survival was 22 months, and the actuarial 1-year and 2-year survival rates were 68% and 39%, respectively. After a median follow-up of 17.5 months, 29 patients remained alive and 25 were free of disease progression. The median progression-free survival for the entire group was 10 months. The median survival time for patients with an Eastern Cooperative Oncology Group (ECOG) status of 0, 1, and 2 was 37.5, 17, and 12 months, respectively. Grade 3-4 neutropenia occurred in 39% of the patients. The combination of gemcitabine, paclitaxel, and cisplatin is a highly effective and tolerable regimen for patients with advanced urothelial carcinoma. This treatment should be considered as a suitable option that deserves further prospective evaluation.
The ECOG performance status is an important predictive factor for survival.

PURPOSE There is no standard treatment for patients with advanced urothelial cancer who are ineligible ("unfit") for cisplatin-based chemotherapy (CHT). To compare the activity and safety of two CHT combinations in this patient group, a randomized phase II/III trial was conducted by the EORTC (European Organisation for Research and Treatment of Cancer). We report here the phase II results of the study. PATIENTS AND METHODS CHT-naïve patients with measurable disease and impaired renal function (30 mL/min < glomerular filtration rate [GFR] < 60 mL/min) and/or performance status (PS) 2 were randomly assigned to receive either GC (gemcitabine 1,000 mg/m(2) on days 1 and 8 and carboplatin area under the serum concentration-time curve [AUC] 4.5) for 21 days or M-CAVI (methotrexate 30 mg/m(2) on days 1, 15, and 22; carboplatin AUC 4.5 on day 1; and vinblastine 3 mg/m(2) on days 1, 15, and 22) for 28 days. End points of response and severe acute toxicity (SAT) were evaluated with respect to treatment group, renal function, PS, and Bajorin risk groups. RESULTS Three of 178 patients who were ineligible or did not start treatment were excluded. SAT was reported in 13.6% of patients on GC and in 23% on M-CAVI. Overall response rates were 42% (37 of 88) for GC and 30% (26 of 87) for M-CAVI. Patients with PS 2 and GFR less than 60 mL/min and patients in Bajorin risk group 2 showed a response rate of only 26% and 20% and an SAT rate of 26% and 25%, respectively. CONCLUSION Both combinations are active in this group of unfit patients. However, patients with PS 2 and GFR less than 60 mL/min do not benefit from combination CHT.
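Eligibility in trials like the EORTC study above hinges on estimated renal function (e.g. 30 mL/min < GFR < 60 mL/min defining "unfit" patients). In practice, creatinine clearance is often estimated from serum creatinine with the Cockcroft-Gault formula; the sketch below uses hypothetical patient values and also converts to the mL/s units quoted in some of these abstracts (1 mL/s = 60 mL/min).

```python
def cockcroft_gault(age_years: int, weight_kg: float,
                    serum_creatinine_mg_dl: float,
                    female: bool = False) -> float:
    """Estimated creatinine clearance in mL/min (Cockcroft-Gault)."""
    crcl = (140 - age_years) * weight_kg / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl  # 15% reduction for women

# Hypothetical 70-year-old, 72 kg man with serum creatinine 1.4 mg/dL
crcl_ml_min = cockcroft_gault(70, 72, 1.4)  # 50 mL/min: "unfit" for cisplatin
crcl_ml_s = crcl_ml_min / 60                # ~0.83 mL/s in the other unit
```

Some trials instead measure GFR directly or use other estimating equations; the abstracts here do not say which estimate each protocol relied on.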
Alternative treatment modalities should be sought in this subgroup of poor-risk patients.

PURPOSE To evaluate the efficacy and toxicity of gemcitabine (2',2'-difluorodeoxycytidine) plus cisplatin in previously untreated patients with advanced transitional-cell carcinoma. PATIENTS AND METHODS Thirty-one patients with measurable advanced transitional-cell carcinoma who had received no prior chemotherapy for metastatic disease were scheduled to receive gemcitabine 1,000 mg/m(2) intravenously over 30 minutes on days 1, 8, and 15 and cisplatin 70 mg/m(2) over 1 hour on day 2 of a 28-day cycle. Prior adjuvant or neoadjuvant therapy for locally advanced disease was allowed if this was completed more than 1 year before study entry. RESULTS There were six complete responses and 10 partial responses in 28 assessable patients, for an overall response rate of 16 of 28 (57%). The response rate on an intent-to-treat basis was 16 of 31 patients (52%). The median survival is 13.2 months, with 18 patients still alive at this time. Toxicity was primarily hematologic, with 12 of 31 patients (39%) having ≥ grade 3 granulocytopenia and 17 of 31 (55%) having ≥ grade 3 thrombocytopenia. Two patients had febrile neutropenia. All patients required a dose modification of gemcitabine at some point in their therapy; the primary reason was thrombocytopenia and/or neutropenia. CONCLUSION Gemcitabine plus cisplatin is an active regimen for the treatment of urothelial cancer.

Gemcitabine and docetaxel are active agents in advanced urothelial carcinoma. A Phase II trial of this combination was performed to determine the activity and toxicity of these agents in a multiinstitutional setting in patients previously treated with one prior chemotherapy regimen.

The toxicity of platinum-based combinations represents a common problem for patients with advanced urothelial carcinoma.
The authors previously reported encouraging efficacy for the combination of carboplatin and gemcitabine in patients considered to be unfit for cisplatin-based treatment. The objective of the current multicenter Phase II study was to evaluate the safety and efficacy of the combination of gemcitabine and carboplatin as first-line treatment in unselected patients with advanced urothelial carcinoma.

The aim of this study was to evaluate the efficacy and safety of gemcitabine, a pyrimidine antimetabolite, in the treatment of advanced transitional cell carcinoma of the urinary tract. 35 patients with unresectable or metastatic transitional cell carcinoma of the urinary tract previously treated with a platinum-based regimen were studied. Gemcitabine was administered at a dosage of 1200 mg/m2 as a 30-min intravenous infusion on days 1, 8 and 15, repeated every 28 days. 31 patients were evaluable for efficacy. 4 patients achieved a complete response (12.9%), 3 a partial response (9.6%) and 13 (42%) were stable for at least 4 weeks (overall response 22.5%; 95% confidence interval 8-37%). The median response duration was 11.8 months (range 3.6-17.7+ months) and median survival for all patients entered was 5 months (range 2-21+ months). 2 patients with complete response are still alive with no evidence of disease after 14 and 21 months. Gemcitabine also provided subjective symptomatic relief from pain, cystitis, dysuria, haematuria and peripheral oedema. Patients experienced little WHO grade 3-4 toxicity, with anaemia in 8 patients (23%), thrombocytopenia in 5 (14.2%), leucopenia in 4 (11.4%) and neutropenia in 7 (20%). WHO grade 3-4 hepatic toxicity occurred in 4 patients (11.4%) and transient elevation of transaminases was noted in 3 (8.6%). No patient had a WHO grade 3-4 elevation of serum creatinine level. There was no WHO grade 4 symptomatic toxicity and no alopecia was noted.
Transient influenza-like symptoms with gemcitabine occurred in 18 patients (51.4%), with 13 patients (37.1%) experiencing fever (2.9% WHO grade 3). In conclusion, gemcitabine is a new active agent for the treatment of transitional cell carcinoma of the urinary bladder with a mild toxicity profile; it warrants further investigation in combination with cisplatin in chemotherapy-naive patients.

PURPOSE We investigated the feasibility, safety, and antitumor activity of weekly gemcitabine given in combination with low doses of cisplatin and ifosfamide in previously treated patients with advanced transitional-cell carcinoma (TCC) of the urothelium. PATIENTS AND METHODS Patients with measurable, metastatic or unresectable TCC who had received one or two prior chemotherapy regimens were eligible. On a 28-day course, doses of cisplatin 30 mg/m(2), gemcitabine 800 mg/m(2), and ifosfamide 1 g/m(2) were given on day 1 and then repeated on day 8 and day 15 unless there was dose-limiting hematologic toxicity. RESULTS Fifty-one patients were registered; 10 patients participated in a pilot study, after which 41 patients were registered onto the phase II protocol. Forty-eight patients (94.1%) had dose-limiting hematologic toxicity on day 8 or day 15. Nonhematologic toxicity of grade 3 or greater consisted mainly of nausea and vomiting (seven patients, 13.7%) and infection (seven patients, 13.7%). Responses could be assessed in 49 of 51 eligible patients; two complete responses (4.1%) and 18 partial responses (36.7%) were observed, for an overall response rate of 40.8% (exact 95% confidence interval, 27% to 56%). CONCLUSION This regimen of cisplatin, gemcitabine, and ifosfamide is not feasible for weekly administration because of hematologic toxicity. Nevertheless, there was promising activity with only two doses per 28-day cycle.
On the basis of these results, we have initiated a phase II trial of this combination given as a single dose every 14 days in patients with untreated, metastatic urothelial carcinoma.

For the purpose of a subsequent phase II/III European Organization for Research and Treatment of Cancer (EORTC) trial, a gemcitabine/carboplatin feasibility study in "unfit" patients with advanced urothelial cell cancer was conducted. Gemcitabine was given at 1000 mg/m(2) on days 1 and 8 with carboplatin (area under the curve (AUC) 4.5 or 5) on day 1 every 21 days. 16 patients were treated, median age 68 years (range 47-75), performance status (PS) 0/1/2 in 3/10/3 patients. Creatinine clearance was > 1 mL/s in 3 patients, 0.5-1 mL/s in 9 and < 0.5 mL/s in 4 patients. Half of the patients had visceral disease. The median number of cycles given was 4 (range 2-6), for a total of 69 cycles. The first 8 patients received 33 cycles using a carboplatin AUC of 5. World Health Organization (WHO) grade 3-4 toxicity was: haemoglobin 5 patients, platelets 6 patients, neutrophils 5 patients and febrile neutropenia 2 patients. In view of this haematological toxicity, in subsequent patients the carboplatin AUC was decreased to 4.5. At this dose level, 8 patients received 36 cycles. WHO grade 3-4 toxicity was: anaemia 1 patient, platelets 4 patients, neutrophils 4 patients, with no febrile neutropenia. Thus, this dose level was regarded as feasible. For the 16 evaluable patients, the overall response rate was 44% (1 complete response (CR), 6 partial responses (PR)). In conclusion, the combination of gemcitabine with carboplatin at an AUC of 4.5 appears to be an active and well tolerated regimen with acceptable toxicity in this unfit patient population.
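Unlike the mg/m2 doses used for the other drugs, carboplatin in these regimens is dosed by target AUC. The standard Calvert formula converts a target AUC and the patient's GFR into an absolute dose in mg, which is why AUC-based dosing self-adjusts to renal function. A minimal sketch, with a hypothetical GFR value:

```python
def calvert_carboplatin_dose(target_auc: float, gfr_ml_min: float) -> float:
    """Absolute carboplatin dose (mg) by the Calvert formula:
    dose = target AUC (mg/mL*min) x (GFR in mL/min + 25)."""
    return target_auc * (gfr_ml_min + 25)

# Hypothetical patient with GFR 55 mL/min
dose_auc_4_5 = calvert_carboplatin_dose(4.5, 55)  # 360 mg at AUC 4.5
dose_auc_5 = calvert_carboplatin_dose(5.0, 55)    # 400 mg at AUC 5
```

The 25 in the formula is a fixed constant (mL/min) representing non-renal clearance; in practice the GFR fed into the formula is often itself an estimate, e.g. from the Cockcroft-Gault equation.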
Based on these data, a randomised trial in the framework of the EORTC-Genitourinary (GU) group of gemcitabine/carboplatin versus carboplatin/methotrexate/vinblastine (MCAVI) is ongoing.

Cisplatin-based combinations are considered to be the standard treatment for advanced transitional cell carcinoma (TCC) of the urothelium. Many of the patients are elderly with concomitant diseases or impaired renal function. We studied the tolerance and activity of the gemcitabine/carboplatin combination as a therapeutic alternative.

PURPOSE To determine the maximum-tolerated dose and the antitumor activity of a combination of paclitaxel, cisplatin, and gemcitabine in advanced transitional-cell carcinoma (TCC) of the urothelium. PATIENTS AND METHODS Patients with measurable, previously untreated, locally advanced or metastatic TCC and with Eastern Cooperative Oncology Group performance status ≤ 2 and creatinine clearance ≥ 55 mL/min were eligible. Cisplatin was given on day 1 at a fixed dose of 70 mg/m(2). Paclitaxel and gemcitabine were given on days 1 and 8 at increasing dose levels. Cycles were repeated every 21 days to a maximum of six cycles. RESULTS Sixty-one patients were registered. In phase I, 15 patients were entered at four different dose levels. Dose-limiting toxicity consisted of early-onset (after the first cycle) grade 2 asthenia (two of six patients) and grade 3 asthenia (one of six patients) at dose level 4. A paclitaxel dose of 80 mg/m(2) and gemcitabine 1,000 mg/m(2) was recommended for phase II, and 46 additional patients were entered at this level, for a total of 49 patients. The main nonhematologic toxicity was grade 2 asthenia in 18 patients, with early onset in five patients, and grade 3 in four patients. Grade 3/4 neutropenia and thrombocytopenia occurred in 27 (55%) and 11 (22%) patients, respectively.
Overall, febrile neutropenia was seen in 11 patients, and one toxic death occurred because of neutropenic sepsis. The combination was active at all dose levels. In total, 58 of 61 eligible patients were assessable for response; 16 complete responses (27.6%) and 29 partial responses (50%) were observed, for an overall response rate of 77.6% (95% confidence interval, 60% to 98%). The median survival time (MST) available for the phase I part of the study is 24.0 months. MST has not been reached for the whole group with the current follow-up. CONCLUSION This combination of paclitaxel, cisplatin, and gemcitabine is feasible and highly active in patients with advanced TCC of the urothelium. Further evaluation of this regimen in patients with TCC is warranted.

Introduction: Bladder cancer is a frequently occurring tumour in Spain and usually affects elderly patients with renal impairment. The development of new combination therapies for such patients is thus of vital importance. Patients and Methods: Between 1997 and 1998, 17 patients with locally advanced non-surgical or metastatic bladder tumours were treated at our centres. Treatment consisted of 1,000 mg/m2 of gemcitabine administered on days 1 and 8, and carboplatin (area under the concentration curve = 5) on day 1, every 21 days. Results: The mean age of the patients (4 females (26%) and 13 males) was 69 years (range: 54-78 years). The average Karnofsky performance status was 80% (range: 50-100%). Mean creatinine clearance was 45.4 mL/min (range: 21-55 mL/min). There were 2 complete responses and 7 partial responses (objective response: 56%; range 31-81%); 6 patients had stable disease and 1 disease progression. Haematological toxicities were as follows: grade I anaemia in 2 patients, grade III in 3; grade I granulocytopenia in 2 patients, grade III-IV in 4 patients; grade III thrombocytopenia in 3 patients.
A toxic death occurred in the course of one grade IV neutropenic event. Non-haematological toxicities were as follows: grade I-II vomiting in 3 patients and grade III in 1. One patient had grade III hepatic toxicity, one patient had grade III renal toxicity, and 3 patients had grade II alopecia. Conclusions: The above-mentioned treatment has low toxicity, is easy to administer and offers promising results in this group of patients.

OBJECTIVE To evaluate the efficacy and toxicity of gemcitabine plus vinorelbine chemotherapy in patients with advanced bladder carcinoma who are unsuitable for, or who have failed, cisplatin-containing chemotherapy. PATIENTS AND METHODS Thirty-one patients with advanced transitional cell carcinoma (TCC) of the bladder were scheduled to receive gemcitabine and vinorelbine chemotherapy. Twenty-one patients had received no prior chemotherapy and had a creatinine clearance below 50 mL/min (group 1); the remaining 10 patients had not responded to previous cisplatin-containing chemotherapy (group 2). RESULTS In group 1, the objective response rate was 47.6%, including 2 (9.5%) complete and 8 (38.9%) partial responses. In group 2, a partial response was observed in 2 (20%) patients. The median survival times for patients in groups 1 and 2 were 15 months (range 3-23) and 7 months (range 3-21), respectively. Grade 3 or 4 leukopenia developed in 16.1% of patients. Overall, 12.9% of the patients suffered from grade 3 nonhematologic toxicity. CONCLUSION Our preliminary data indicate that the combination of gemcitabine and vinorelbine is active and well tolerated, especially in patients with advanced TCC who are unsuitable for cisplatin-based chemotherapy.

PURPOSE To evaluate the toxicity and efficacy of combination chemotherapy with paclitaxel and gemcitabine in patients with advanced transitional-cell carcinoma of the urothelial tract.
PATIENTS AND METHODS Fifty-four patients with advanced unresectable urothelial carcinoma entered this multi-centered, community-based, phase II trial between May 1997 and December 1999. All patients were treated with paclitaxel 200 mg/m(2) by 1-hour intravenous (IV) infusion on day 1 and gemcitabine 1,000 mg/m(2) IV on days 1, 8, and 15; courses were repeated every 21 days. Patients who had an objective response or stable disease continued treatment for six courses. RESULTS Twenty-nine of 54 patients (54%; 95% confidence interval, 40% to 67%) had major responses to treatment, including 7% complete responses. With a median follow-up of 24 months, 16 patients (30%) remain alive and nine (17%) are progression-free. The median survival for the entire group was 14.4 months; 1- and 2-year actuarial survival rates were 57% and 25%, respectively. Seven (47%) of 15 patients previously treated with platinum-based chemotherapy responded to paclitaxel/gemcitabine. Grade 3/4 toxicity was primarily hematologic, including leukopenia (46%), thrombocytopenia (13%), and anemia (28%). Ten patients (19%) required hospitalization for neutropenia and fever, and one patient had a treatment-related septic death. CONCLUSION The combination of paclitaxel and gemcitabine is active and well tolerated in the first- or second-line treatment of patients with advanced transitional-cell carcinoma of the urothelial tract. Response rate and duration compare favorably with those produced by other active, first-line regimens. This regimen should be further evaluated in phase II and III studies, as well as in patients with compromised renal function.

Cisplatin-based therapy is standard in patients with advanced urothelial carcinoma, but a large proportion are ineligible due to renal impairment.
The safety and activity of a dose-dense carboplatin-based regimen in this patient population were explored.

OBJECTIVES To evaluate, in a multicenter Phase II study, the safety and efficacy of the combination of gemcitabine and carboplatin as first-line treatment in elderly and unfit patients with advanced bladder carcinoma. The toxicity of platinum-based chemotherapy combinations represents a common problem for elderly or unfit patients with advanced bladder carcinoma. METHODS Patients with previously untreated inoperable or metastatic bladder carcinoma and an Eastern Cooperative Oncology Group performance status greater than 2, age older than 75 years, or creatinine clearance of less than 50 mL/min were treated with carboplatin area under the curve 4 on day 1 and gemcitabine 1000 mg/m(2) on days 1 and 8, every 21 days, for a total of six cycles. RESULTS A total of 56 patients (48 men and 8 women, median age 75 years) were enrolled. Of these patients, 46% had a performance status of 2 to 3, 68% had a creatinine clearance of less than 50 mL/min, and 59% had distant metastases. The overall response rate was 36% (95% confidence interval 23.4% to 49.6%), and an additional 14 patients had disease stabilization (25%, 95% confidence interval 14.4% to 38.4%). The median time to progression was 4.8 months, the median overall survival was 7.2 months, and the 1-year survival rate was 26%. Grade 3 or 4 toxicity included anemia (18%); thrombocytopenia (16%); neutropenia (27%), with two episodes of febrile neutropenia requiring hospitalization; diarrhea (2%); and fatigue (5.5%). Two toxic deaths occurred during the study.
CONCLUSIONS The combination of gemcitabine and carboplatin has some activity as first-line treatment of advanced bladder carcinoma in the elderly and those unfit for cisplatin-based chemotherapy, with manageable toxicity, and represents a reasonable choice for the treatment of such patients.

BACKGROUND The purpose of this study was to investigate the toxicity and efficacy of the sequential administration of gemcitabine (GMB) in combination with cisplatin (CDDP), followed by docetaxel (Taxotere), as first-line treatment of advanced urothelial carcinoma. PATIENTS AND METHODS Patients (aged ≤ 70 years and performance status (Eastern Cooperative Oncology Group) 0-2) with previously untreated locally advanced/recurrent or metastatic urothelial carcinoma were eligible. Study treatment consisted of GMB (1000 mg/m(2), days 1 and 8) and CDDP (70 mg/m(2), day 1) (GP regimen) every 21 days for a total of four cycles, followed by docetaxel (D; 100 mg/m(2), day 1) every 21 days for four cycles. RESULTS Thirty-eight patients with a median age of 67 years were enrolled; 67% of them had PS 0 and 87% stage IV disease. Patients received a median of four GP and four D cycles per patient. Grade 3-4 neutropenia occurred in 27% and 63% of patients with GP and D, respectively. Grade 3-4 thrombocytopenia occurred in 11% of patients, only with the GP regimen. Other toxic effects were mild. There was no toxic death. The objective response rate was 55.2% (95% CI: 39.45%-71.07%). Five patients had a complete response (13.15%) and 16 patients had a partial response (42.1%), while nine patients had disease stabilization (23.7%) (intention-to-treat analysis). After a median follow-up period of 13 months (range 1.5-40.5 months), the median time to progression was 6.8 months (range 1-40.5 months), the median overall survival 13 months (range 1.5-40.5 months), and the 1-year survival rate 55.3%.
CONCLUSION The sequential administration of GP followed by D is active and well tolerated as first-line treatment of advanced urothelial carcinoma and merits further evaluation.

The purpose of the study was to evaluate the antitumor activity and safety of paclitaxel combined with gemcitabine and cisplatin in patients affected by advanced transitional cell carcinoma of the urothelium (TCC). Eighty-five patients with advanced TCC and measurable disease were randomized to receive either paclitaxel at a dosage of 70 mg/m2, gemcitabine 1000 mg/m2 and cisplatin 35 mg/m2 on days 1 and 8 every 3 weeks (GCP), or gemcitabine 1000 mg/m2 on days 1, 8 and 15 and cisplatin 70 mg/m2 on day 2 every 4 weeks (GC). All enrolled patients were considered evaluable for response and toxicity (intention to treat). The observed response rate was 43% for the GCP and 44% for the GC combination, respectively. Median time to treatment failure was 32 weeks for GCP and 26 weeks for GC, and overall survival 61 vs 49 weeks, respectively (p-value not significant). Grade 3-4 neutropenia was observed in 49% of patients treated with GCP vs 35% of those treated with GC (P = 0.05), and grade 3-4 thrombocytopenia was observed in 36% of GCP-treated patients as compared to 21% of those treated with GC (P = 0.01). Seven patients over 70 years old or with poor PS were removed from the study: 6 patients from the GCP group (2 toxic deaths, 2 grade 4 myelotoxicity and 2 grade 3 asthenia), and 1 from the GC group was lost to follow-up after the first cycle. The combination of paclitaxel, gemcitabine and cisplatin is effective in the treatment of TCC. However, the addition of paclitaxel to the combination of gemcitabine plus cisplatin seems to increase toxicity; therefore it seems unsuitable for poor-PS patients and those over 70 years old.
Larger and better powered studies are needed to define exactly the role of paclitaxel in this combination.

PURPOSE The aim of this analysis is the evaluation of the activity and toxicity of gemcitabine and carboplatin in patients with advanced urothelial transitional cell carcinoma (TCC), with special regard to patients with impaired renal function. PATIENTS AND METHODS 30 consecutive patients with metastatic TCC (mean age: 68 (range: 47-82) years, median ECOG PS: 1) were treated with gemcitabine (1000 mg/m(2) on days 1 and 8 of a 21-day schedule) and carboplatin (AUC 4.5, day 1). In 15 patients (considered renally unfit) a creatinine clearance of less than 60 mL/min (range: 31-59 mL/min) was seen. RESULTS Concerning the survival rate, no significant difference was detected between the two subgroups of renally impaired patients and patients with normal renal function (median 13 vs. 14 months, p = 0.901). An overall response rate of 50% was obtained. In 16.7% and 33.3% of all cases a complete or a partial response was noted, respectively. Median time to progression was 5.34 months. The 1-year survival rate was calculated as 51.8%. There was no restriction of renal function under chemotherapy in any single patient. CONCLUSIONS The chemotherapy combination of gemcitabine and carboplatin is definitely powerful as a first-line therapy in patients with advanced TCC. Toxicity is well manageable. Because carboplatin is dosed by AUC, an adaptation to the glomerular filtration rate is possible. Decreases in effectiveness in cases of impaired renal function were not detected. Patients with metastatic TCC should be entered onto well designed, randomised clinical trials with the gemcitabine/carboplatin combination to afford a tailored chemotherapy.

PURPOSE To evaluate the efficacy and toxicity of a combination of weekly docetaxel, gemcitabine and cisplatin in advanced transitional cell carcinoma (TCC) of the bladder.
PATIENTS AND METHODS Thirty-five chemotherapy-naïve ( adjuvant and neoadjuvant chemotherapy was allowed ) patients with advanced TCC received intravenous docetaxel 35 mg/m2 , gemcitabine 800 mg/m2 and cisplatin 35 mg/m2 , on days 1 and 8 every 3 weeks . Prophylactic granulocyte-colony stimulating factor was given from days 3 to 6 and days 10 to 15 , anti-emetics were used routinely . RESULTS Most ( 27 ) patients ( 77.1 % ) had a performance status of 0 to 1 and eight ( 22.9 % ) had received prior adjuvant or neoadjuvant cisplatin-based chemotherapy . In the intention-to-treat analysis , the objective response rate was 65.6 % [ 23/35 patients , 95 % confidence interval ( CI ) 47.8 % to 80.9 % ] . Ten patients ( 28.5 % ) achieved a complete response ( 95 % CI 14.6 % to 46.3 % ) and 13 ( 37.1 % ) a partial response ( 95 % CI 21.5 % to 55.0 % ) . Median survival time was 15.5 months , median duration of response was 10.2 months and median time to progression was 8.9 months . Ten patients ( 28.5 % ) developed grade 3/4 neutropenia , including five ( 14.3 % ) who experienced febrile neutropenia , which was successfully treated . Grade 3/4 anaemia and thrombocytopenia occurred in 20 % and 25.7 % of patients , respectively ; four patients required platelet transfusions . There were no treatment-related deaths . CONCLUSIONS Weekly docetaxel , gemcitabine plus cisplatin is a highly effective treatment for chemotherapy-naïve advanced TCC , and causes only moderate toxicity . This regimen should be considered as a suitable option that deserves further prospective evaluation through randomised phase III trials BACKGROUND Gemcitabine is an active agent in the treatment of bladder cancer . The enzyme deoxycytidine kinase catalyzes the phosphorylation of gemcitabine into the active gemcitabine triphosphate . 
After a 30-minute infusion this enzyme will be saturated ; therefore , accumulation of higher intracellular concentrations of the active gemcitabine triphosphate could be achieved by prolonging the infusion time of gemcitabine . PATIENTS AND METHODS Based on previously published Phase I trials , the efficacy and safety of a combination of cisplatin and gemcitabine given as prolonged infusion were tried in a Phase II study of 57 untreated patients with stage III/IV bladder cancer , which is the most common malignant tumor among Egyptian males . Patients received gemcitabine ( 250 mg/m(2 ) during 6-hour infusion ) on days 1 and 8 , and cisplatin ( 70 mg/m(2 ) ) on day 2 every 21-day cycle . RESULTS The 41 males and 16 females had a median age of 55 years ( range 37 - 77 ) . A total of 37 patients had transitional cell , 15 had squamous cell , 2 had adenocarcinoma , and 3 had undifferentiated cell carcinoma . The median number of cycles given to these 57 patients was 4 ( range 1 - 6 ) . Of 54 evaluable patients , 5 ( 9.4 % ) had complete remission , and 27 ( 50 % ) partial remission , for an overall response rate of 59.4 % . These results are comparable to those of a previous Phase II study of the same combination but with gemcitabine given in the standard dose and schedule . Responses were observed at all disease sites . Both hematologic and nonhematologic toxicity were treatable and not severe . CONCLUSIONS Prolonged infusion of gemcitabine and cisplatin is an effective treatment for advanced bilharzial-related bladder cancer . Toxicity , especially myelosuppression , is surprisingly mild . 
This combination deserves to be tried in other disease categories The objective of this study was to evaluate the efficacy and toxicity of gemcitabine plus epirubicin in previously untreated patients with advanced urothelial carcinoma who were not eligible for cisplatin‐based regimens PURPOSE We investigated the safety and efficacy ( response rates , time to disease progression , survival ) of trastuzumab , carboplatin , gemcitabine , and paclitaxel in advanced urothelial carcinoma patients and prospectively evaluated human epidermal growth factor receptor-2 ( Her-2/neu ) overexpression rates . PATIENTS AND METHODS Advanced urothelial carcinoma patients were screened for Her-2/neu overexpression . Eligibility for therapy required human epidermal growth factor receptor-2 ( Her-2/neu ) overexpression by immunohistochemistry ( IHC ) , gene amplification and/or elevated serum Her-2/neu , no prior chemotherapy for metastasis , and adequate organ function including a normal cardiac function . Treatment consisted of trastuzumab ( T ) 4 mg/kg loading dose followed by 2 mg/kg on days 1 , 8 , and 15 ; paclitaxel ( P ) 200 mg/m2 on day 1 ; carboplatin ( C ; area under the curve , 5 ) on day 1 ; and gemcitabine ( G ) 800 mg/m2 on days 1 and 8 . The primary end point was cardiac toxicity . RESULTS Fifty-seven ( 52.3 % ) of 109 registered patients were Her-2/neu positive , and 48.6 % were positive by IHC . Her-2/neu-positive patients had more metastatic sites and visceral metastasis than did Her-2/neu negative patients . Forty-four of 57 Her-2/neu-positive patients were treated with TPCG . The median number of cycles was six ( range , 1 to 12 cycles ) . The most common grade 3/4 toxicity was myelosuppression . Grade 3 sensory neuropathy occurred in 14 % of patients , and 22.7 % experienced grade 1 to 3 cardiac toxicity ( grade 3 , n = 2 : one left ventricular dysfunction , one tachycardia ) . There were three therapy-related deaths . 
Thirty-one ( 70 % ) of 44 patients responded ( five complete and 26 partial ) , and 25 ( 57 % ) of 44 were confirmed responses . Median time to progression and survival were 9.3 and 14.1 months , respectively . CONCLUSION We prospectively characterized Her-2/neu status in advanced urothelial carcinoma patients . TPCG is feasible ; cardiac toxicity rates were higher than projected , but the majority were grade two or lower . Determining the true contribution of trastuzumab requires a randomized trial The objectives of the current study were to evaluate the safety and efficacy of gemcitabine plus docetaxel in patients with unresectable ( Stage T4 or ≥ N1 ) metastatic or locally advanced transitional cell carcinoma ( TCC ) of the urothelial tract 2′,2′-difluorodeoxycytidine ( Gemcitabine , dFdC ) is an antineoplastic agent with clinical activity against several cancer types.1 cis-Diamminedichloroplatinum ( cisplatin , CDDP ) is a drug with long established anticancer activity , which acts by Platinum (Pt)-DNA adduct formation.2,3 Because of the low toxicity profile of dFdC and the differences in mechanism of cytotoxicity , preclinical studies were performed that demonstrated synergism between dFdC and CDDP in several cancer cell lines and in vivo,4–8 which is likely to be related to increased formation of Pt-DNA adducts.8 Pre-treatment with dFdC gave the best results both in vitro and in vivo.6,7,8 Several potential mechanisms underlying the synergism were studied in vitro , based on these results several schedules were studied in patients The objectives are to evaluate and compare the response and toxicity of a 3-weekly and a 2-weekly regimen of gemcitabine ( Gem ) and paclitaxel ( Pac ) second-line treatment in patients with transitional cell carcinoma ( TCC ) . 
Between June 2000 and July 2001 , 30 patients with progressive disease ( PD ) during first-line chemotherapy ( n = 11 ) or relapse after adjuvant cisplatin-based chemotherapy of a metastatic or locally advanced TCC ( n = 18 ) have been randomised to receive either six cycles ( schedule A ) of 3-weekly Gem ( 1000 mg/qm , days 1 and 8) and Pac ( 175 mg/qm , day 1 ) or 2-weekly treatment until disease progression ( schedule B ) with Gem ( 1250 mg/qm , day 1 ) and Pac ( 120 mg/qm , day 2 ) . Restaging was performed after every 6 weeks by clinical imaging . Of 30 patients , one patient in schedule A and two patients in schedule B were not evaluable for response due to serious adverse events ( SAEs ) during the first cycle . The overall objective response ( OR ) was 44 % ( 12 of 27 ) with eight complete remissions ( CRs ) and four partial remissions . Median time to progression ( TTP ) was 11 ( 3 - 41 ) months in schedule A and 6 ( 1 - 15 + ) months in schedule B. Median survival was 13 ( 5 - 46 ) months in schedule A and 9 ( 0 - 16 ) months in schedule B. Schedule A showed a significantly higher rate of CRs ( 7 vs. 1 , p < 0.05 ) . With a median number of six ( 1 - 6 ) cycles ( A ) and nine ( 1 - 23 ) cycles ( B ) , TTP and survival were not significantly different . In schedule B , one patient had WHO grade IV anaemia and leucopenia . WHO grade III toxicities were seen in schedule A/B as follows : anaemia 3 (23%)/2 ( 16 % ) patients , leucopenia 5 (38%)/2 ( 16 % ) , thrombocytopenia 0/2 ( 16 % ) and alopecia 10 (76%)/4 ( 32 % ) . The combination of Gem and Pac is an effective second-line regimen in patients with mainly poor prognosis due to PD after cisplatin-based chemotherapy . Except for three SAEs ( uncertainly therapy related ) , both regimens were tolerated well . 
The 3-weekly schedule with a nonsplit Pac dose showed a significantly higher complete response rate in our small study population and , thus , might be superior to the 2-weekly schedule Patients with urothelial carcinoma are not always amenable to cisplatin‐based chemotherapy . The authors previously reported that they achieved a 60 % response rate in patients who failed on cisplatin‐based combination chemotherapy ( methotrexate , vinblastine , doxorubicin , and cisplatin ) by using a convenient outpatient regimen of gemcitabine ( G ) and paclitaxel ( P ) every 2 weeks . A multicenter trial was initiated in 5 Italian centers to evaluate this regimen as first‐line chemotherapy
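The weekly docetaxel/gemcitabine/cisplatin abstract above reports an objective response rate of 65.6 % ( 23/35 patients , 95 % CI 47.8 % to 80.9 % ) . As a rough illustration of where such figures come from , the sketch below ( plain Python , not taken from any of the trials ) computes the proportion and a normal-approximation ( Wald ) 95 % interval ; the published bounds differ slightly because exact ( Clopper-Pearson ) intervals are typically preferred at this sample size .

```python
import math

# Objective response rate from the weekly docetaxel/gemcitabine/cisplatin
# abstract above: 23 responders out of 35 evaluable patients.
responders, n = 23, 35
p = responders / n

# Normal-approximation (Wald) 95% confidence interval. The abstract's
# 47.8%-80.9% interval is likely an exact (Clopper-Pearson) interval,
# so the bounds differ slightly from these.
se = math.sqrt(p * (1 - p) / n)
lower, upper = p - 1.96 * se, p + 1.96 * se

print(f"ORR {p:.1%}, Wald 95% CI {lower:.1%} to {upper:.1%}")
# ORR 65.7%, Wald 95% CI 50.0% to 81.4%
```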
2,060
21,328,258
Electrocoagulation was associated with less morbidity including post-operative pain when compared with the modified Pomeroy and tubal ring methods , despite the risk of burns to the small bowel . The small sample size and the relative short period of follow-up in these studies limited the power to show clinical or statistical differences for rare outcomes such as failure rates .
BACKGROUND Female sterilisation is the most popular contraceptive method worldwide . Several techniques are described in the literature , however , only a few of them are commonly used and properly evaluated . OBJECTIVES To compare the different tubal occlusion techniques in terms of major and minor morbidity , failure rates ( pregnancies ) , technical failures and difficulties , and women 's and surgeons ' views .
The aim of this study was to evaluate the efficacy , safety and acceptability of two monthly transcervical applications of quinacrine 252 mg and ibuprofen 55.5 mg as pellets for non-surgical female sterilization . From August 1992 through October 1996 , a prospective clinical study was conducted on 200 normal women seeking surgical sterilization voluntarily in the Family Planning Clinic of the Department of Obstetrics and Gynecology , Regency Hospital , Wonosobo , Central Java , Indonesia . Quinacrine 252 mg and ibuprofen 55.5 mg were inserted transcervically , as pellets , using a Copper T IUD insertor in the proliferative phase of two consecutive menstrual cycles . The women were followed up 6 , 12 , 24 and 48 months after insertion . There were no major complications during the insertion procedures , and side-effects which occurred during the use of the methods were transient . Cumulative life-table continuation rate per 100 women at four years was 0.91±0.02 ( SE ) . The pregnancy failure rate was 0.04 or 4.3 % . The results of this study indicate that intrauterine insertion of quinacrine pellets is a safe , acceptable and effective method of non-surgical female sterilization Objective To compare patient satisfaction , discomfort , procedure time , success rate and adverse events of hysteroscopic ( ESSURE , Conceptus Inc , San Carlos , USA ) versus laparoscopic sterilisation Approximately 200 women of American Society of Anesthesiologists class I and II physical status electing outpatient laparoscopic tubal sterilization with Yoon rings were involved in a double-blind study to evaluate postoperative pain relief after intraoperative suprapubic infiltration of the fallopian mesosalpinx . The postoperative pain levels were lower after bilateral infiltration of 0.5 % bupivacaine beneath the site of ring application . 
Postoperative , suprapubic infiltration provided safe , prolonged and effective pain relief , allowed prompt ambulation and early discharge , and reduced the need for narcotic analgesics and postoperative analgesia OBJECTIVE To assess the safety , effectiveness , and reliability of a tubal occlusion microinsert for permanent contraception , as well as to document patient recovery from the placement procedure and overall patient satisfaction . METHODS A cohort of 518 previously fertile women seeking sterilization participated in this prospective , phase III , international , multicenter trial . Microinsert placement was attempted in 507 women . Microinserts were placed bilaterally into the proximal fallopian tube lumens under hysteroscopic visualization in outpatient procedures . RESULTS Bilateral placement of the microinsert was achieved in 464 ( 92 % ) of 507 women . The most common reasons for failure to achieve satisfactory placement were tubal obstruction and stenosis or difficult access to the proximal tubal lumen . More than half of the women rated the average pain during the procedure as either mild or none , and 88 % rated tolerance of device placement procedure as good to excellent . Average time to discharge was 80 minutes . Sixty percent of women returned to normal function within 1 day or less , and 92 % missed 1 day or less of work . Three months after placement , correct microinsert placement and tubal occlusion were confirmed in 96 % and 92 % of cases , respectively . Comfort was rated as good to excellent by 99 % of women at all follow-up visits . Ultimately , 449 of 518 women ( 87 % ) could rely on the microinsert for permanent contraception . After 9620 woman-months of exposure to intercourse , no pregnancies have been recorded . 
CONCLUSION This study demonstrates that hysteroscopic interval tubal sterilization with microinserts is well tolerated and results in rapid recovery , high patient satisfaction , and effective permanent contraception OBJECTIVE To evaluate the reliability of pelvic X-ray and transvaginal ultrasound to localize Essure microinserts ( Conceptus , San Carlos , California ) after successful placement in both fallopian tubes 3 months after placement . DESIGN Prospective , observational study . SETTING Gynecology departments at two teaching hospitals . PATIENT(S ) One hundred eighty-two patients who underwent hysteroscopic sterilization by placement of Essure microinserts between August 2002 and August 2004 . INTERVENTION(S ) Transvaginal ultrasound , pelvic X-ray , and hysterosalpingography ( HSG ) 3 months after sterilization with Essure . MAIN OUTCOME MEASURE(S ) Transvaginal ultrasound confirmation of correct localization of microinserts after a 3-month follow-up . RESULT ( S ) In 150 of 182 patients , confirmation of successful bilateral placement of two microinserts ( 300 devices ) was possible . In 9 patients it was not possible to identify both devices with ultrasound , or there was doubt about the extension of the device through the uterotubal junction . The other 291 devices were identified as being in a good position . CONCLUSION ( S ) Hysterosalpingography at the 3-month follow-up after successful placement of Essure microinserts can be replaced by transvaginal ultrasonography . A 3-month follow-up with HSG after the Essure procedure is only required after unsatisfactory placements . In those patients in whom transvaginal ultrasonography can not confirm satisfactory localization , a complementary pelvic X-ray should be performed OBJECTIVE To evaluate women 's satisfaction and tolerance of hysteroscopic sterilization . DESIGN Prospective analysis of case series . SETTING Gynecology department in a teaching hospital . 
PATIENT(S ) A total of 1,630 women who underwent hysteroscopic sterilization by placement of Essure microinserts ( Conceptus , Inc. , Mountain View , CA ) from January 2003 to June 2006 . INTERVENTION(S ) Transvaginal ultrasound examination , pelvic x-ray examination , and hysterosalpingography 3 months after sterilization with Essure microinserts . Satisfaction was assessed by a visual analog scale . Adverse effects and tolerance also were recorded . MAIN OUTCOME MEASURE(S ) Transvaginal ultrasound and pelvic x-ray confirmation of correct localization of microinserts and patient 's satisfaction and tolerance after a 3-month follow-up . RESULT ( S ) The rate of successful insertion was 99 % . Most of the women returned to their daily activities on the same day of insertion , and 86.5 % considered the procedure painless or scarcely painful . All the patients were highly satisfied after hysteroscopic sterilization : 91 % of subjects by visual analog scale ( on a 0 to 10 scale ) rated the method at 10 ( high satisfaction degree ) , and none of the subjects rated it under 8 . For patients , the most valuable aspects of the procedure were absence of surgery room ( 52.7 % ) , method 's quickness and comfort ( 19.9 % ) , and permanent sterilization ( 18.2 % ) . More than 97 % of the patients said that they would recommend the procedure to others . CONCLUSION ( S ) This study provides evidence that Essure microinserts can be placed in a usual gynecologic consultation room in standard conditions without any type of anesthesia or sedation and are associated with great overall patient satisfaction . Women also have high tolerance for the procedure and describe minor postoperative pain OBJECTIVE The present study examines the safety , effectiveness , and local tissue response for a new transcervical fallopian tube permanent contraceptive device , the STOP device ( Conceptus , Inc. , San Carlos , CA ) . 
DESIGN Nonrandomized prospective evaluation of tubal occlusion and histologic response . SETTING Inpatient , university and university-affiliated medical centers in the United States and Mexico . PATIENT(S ) Premenopausal and perimenopausal women with benign indications for hysterectomy who were able to defer their hysterectomy for 1 to 13 weeks . INTERVENTION(S ) A transcervically placed microcoil ( STOP device ) was inserted into the fallopian tubes of women who were scheduled for hysterectomy , and the device was worn for 1 to 12 weeks . At hysterectomy , hysterosalpingography was done to determine tubal occlusion ; subsequently , the tubes containing the STOP devices were processed , sectioned , and evaluated to determine the histologic response . MAIN OUTCOME MEASURE(S ) Ability to place a device and evaluate tubal occlusion and tissue response . RESULT ( S ) Devices were placed in 33 women , representing 57 tubes ; the women wore the devices from 1 day to 30 weeks . Histology on 27 women ( 47 tubes ) showed an acute inflammatory and fibrotic response in the short term that , over time , became a chronic inflammatory response with extensive fibrosis . CONCLUSION ( S ) The localized tissue response and notable absence of any normal tubal architecture in the segment of the fallopian tube containing the STOP device supports the postulated mechanisms of action of the device . Prehysterectomy study findings suggest the usefulness of the STOP device for pregnancy prevention ; this is being evaluated in long-term safety and effectiveness studies OBJECTIVE To determine the feasibility and patient satisfaction of female sterilisation using the Essure system in an outpatient hysteroscopy clinic without conscious sedation or general anaesthesia . DESIGN Prospective cohort study . SETTING Outpatient hysteroscopy clinic in a large teaching hospital . 
POPULATION Women undergoing outpatient hysteroscopic sterilisation using the Essure system for permanent fertility control . METHODS Demographic and procedural data were prospectively collected from 112 consecutive women undergoing outpatient hysteroscopic sterilisation without sedation or general anaesthesia . A hysterosalpingogram ( HSG ) was performed routinely in all women 3 months after the procedure to confirm bilateral tubal occlusion . Postal questionnaires were sent at this time enquiring about patient satisfaction and experience with the outpatient procedure . Multivariable logistic regression was used to identify factors independently predictive of successful completion of the procedure . MAIN OUTCOME MEASURES Technical feasibility , predictive factors for technical success ( operator , body mass index , uterine size , axis , menstrual phase and cervical stenosis ) , complications , tubal occlusion on HSG , patient satisfaction and procedure-related experience . RESULTS Successful bilateral tubal placement of the Essure microinserts was achieved in 103/112 ( 92 % , 95 % CI 85 - 96 % ) women . Nonsecretory phase of the menstrual cycle ( P = 0.04 ) and a clinically normal-sized uterus ( P = 0.003 ) were independently predictive for successful completion of the outpatient procedure on multivariable modelling . There were no major procedure-related complications recorded , but transient vasovagal reactions occurred in 5/112 ( 5 % ) women . Of the original cohort of 112 women with successful procedures , 84 women were 3 months postprocedure and had undergone a HSG . Bilateral tubal occlusion was confirmed in 83/84 ( 99 % , 95 % CI 94 - 100 % ) women at 3 months and in 100 % at 6 months . 
Seventy-six of 84 ( 91 % ) had returned the questionnaires , and 70/73 ( 96 % , 95 % CI 88 - 99 % ) were satisfied with their overall experience of the procedure including radiological follow up , with most reporting being ' very satisfied ' ( 64/73 , 88 % , 95 % CI 78 - 94 % ) . CONCLUSIONS Outpatient hysteroscopic sterilisation using the Essure system without sedation or general anaesthesia is a successful and safe procedure associated with high rates of patient satisfaction . If practical , women should be scheduled to have their procedures in the proliferative phase of the menstrual cycle to optimise successful placement of Essure devices , especially if the uterus is clinically enlarged Self-reported postoperative pain was reduced significantly ( P less than .05 ) for up to six hours in a group of ambulatory surgical patients with the application of 5 mL of 1 % etidocaine to the banded portion of each fallopian tube after laparoscopic tubal ligation with Falope Rings in comparison to a control group receiving normal saline . The etidocaine group had less nausea and vomiting and smaller antiemetic and analgesic requirements than did the control group , though those results were not statistically significant Objective To compare objectively the pain associated with tubal occlusion by Silastic rings versus electrocoagulation during laparoscopic tubal ligation under local anesthesia . Methods Consecutive patients scheduled for laparoscopic tubal ligation under local anesthesia were randomized to Silastic rings ( N = 50 ) or electrocoagulation ( N = 52 ) as the method of tubal occlusion . Sterilization was performed under local anesthesia in a standard fashion . Bupivacaine 0.5 % was used as the local anesthetic agent . Operative pain was measured based on intraoperative anesthesia requirements and a modified McGill pain questionnaire . This questionnaire was used to assess pain at 15 minutes , 1 hour , and 24 hours postoperatively . 
Results Demographics were similar for the two groups . Operative time was shorter in the Silastic-ring group ( 16.7 versus 21.8 minutes ; P = .001 ) , and this group also required less intraoperative anesthesia ( P= .004 ) . There were no statistical differences between the groups in self-reported pain intraoperatively or postoperatively . No patient in either group required antiemetics or pain medication in the recovery room . Conclusion Silastic rings appear preferable to bipolar electrocoagulation for laparoscopic tubal sterilization under local anesthesia when long-acting local agents are used for tubal anesthesia The use of multiple clips for the occlusion of the Fallopian tubes has been reported in interval laparoscopic sterilization , but the circumstances leading to the performance of the multiple-clip procedure and its effects on safety and efficacy have not been carefully studied . A data set from international multi-center clinical trials of Filshie clips and Wolf ( Hulka ) clips was used to examine the possible reasons for performing this procedure for 102 women . Their complications , complaints , and surgical and post-surgical events before discharge and during one month of follow-up were compared with those of the 408 women whose tubes were occluded by single clip . Results indicate that multiple clips were most often used when surgical difficulties ( and to a much lesser degree , tubal and/or mesosalpingeal injury ) were encountered during the sterilization procedure . No increased risk of short-term complications or complaints ( including pelvic pain ) was found at one-month follow-up for those patients who received multiple clips Background : Over 100,000 women worldwide have been sterilized by insertion of quinacrine into the uterus to induce tubal scarring . Concern has been expressed about possible carcinogenicity , and specifically the risk of uterine cancer . 
Methods : From 2001 through 2006 , we conducted a population-based , case – control study of gynecologic cancers in 12 provinces in northern Vietnam , where relatively large numbers of women had received quinacrine . Cases of incident cervical , ovarian , and uterine cancer were identified at provincial hospitals or at referral hospitals in Hanoi . For each case , 3 age- and residence-matched controls were randomly selected from the population registries of the case 's home community . Results : The prevalence of quinacrine exposure was 1.2 % among cases and 1.1 % among controls . For cervical cancer , analysis of 606 cases ( 9 exposed ) and their 1774 matched controls ( 18 exposed ) produced an odds ratio of 1.44 ( 95 % confidence interval = 0.59–3.48 ) ( adjusted for several covariates including human papillomavirus risk score ) . For ovarian cancer , based on 262 cases ( 3 exposed ) and 755 controls ( 8 exposed ) and adjusted for age and number of years of ovulation , the odds ratio was 1.26 ( 0.21–5.45 ) . For uterine cancer , none of the cases — including 23 cases of leiomyosarcoma — was exposed to quinacrine . The 95 % confidence interval , based on 161 cases ( none exposed ) and 470 controls ( 7 exposed ) and adjusted only for age , was 0–1.85 . Conclusion : We found no evidence of a relationship between quinacrine sterilization and gynecologic cancer A previous report described the development of a blind method to deliver methylcyanoacrylate ( MCA ) transcervically . Using 0.6 ml of a stable MCA whose polymerization time was closely controlled , we reported a 78 % bilateral tubal closure rate in 23 cases with hysterosalpingographic control . Subsequent to the previous report , we initiated a study in which patients were randomly assigned to one of three treatment groups : a single MCA injection , a single MCA injection after uterine lavage , or two MCA injections 1 month apart . 
In addition , a radiopaque MCA has been developed with which it is possible to determine tubal entry after its application by means of the FEMCEPT device . Patients treated with radiopaque MCA have been studied to determine whether it is possible to predict tubal closure on the basis of tubal entry and distribution patterns . The results of these studies and their implications for contraceptive effectiveness of the FEMCEPT/MCA system will be reported OBJECTIVE To evaluate the cumulative probability of regret after tubal sterilization , and to identify risk factors for regret that are identifiable before sterilization . METHODS We used a prospective , multicenter cohort study to evaluate the cumulative probability of regret within 14 years after tubal sterilization . Participants included 11,232 women aged 18 - 44 years who had tubal sterilizations between 1978 and 1987 . Actuarial life tables and Cox proportional hazards models were used to identify those groups at greatest risk of experiencing regret . RESULTS The cumulative probability of expressing regret during a follow-up interview within 14 years after tubal sterilization was 20.3 % for women aged 30 or younger at the time of sterilization and 5.9 % for women over age 30 at sterilization ( adjusted relative risk [ RR ] 1.9 ; 95 % confidence interval [ CI ] 1.6 , 2.3 ) . For the former group , the cumulative probability of regret was similar for women sterilized during the postpartum period ( after cesarean , 20.3 % , 95 % CI 14.5 , 26.0 ; after vaginal delivery , 23.7 % , 95 % CI 17.6 , 29.8 ) and for women sterilized within 1 year after the birth of their youngest child ( 22.3 % , 95 % CI 16.4 , 28.2 ) . 
For women aged 30 or younger at sterilization , the cumulative probability of regret decreased as time since the birth of the youngest child increased ( 2 - 3 years , 16.2 % , 95 % CI 11.4 , 21.0 ; 4 - 7 years , 11.3 % , 95 % CI 7.8 , 14.8 ; 8 or more years , 8.3 % , 95 % CI 5.1 , 11.4 ) and was lowest among women who had no previous births ( 6.3 % , 95 % CI 3.1 , 9.4 ) . CONCLUSION Although most women expressed no regret after tubal sterilization , women 30 years of age and younger at the time of sterilization had an increased probability of expressing regret during follow-up interviews within 14 years after the procedure
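The quinacrine case – control abstract above gives enough counts to reproduce a crude ( unadjusted ) odds ratio for the cervical cancer comparison . A minimal sketch , assuming a standard 2x2 exposure table ; the 1.44 reported in the abstract is covariate-adjusted , so the crude value differs slightly .

```python
# Crude odds ratio for quinacrine exposure and cervical cancer, using the
# counts from the case-control abstract above: 9 exposed of 606 cases,
# 18 exposed of 1774 matched controls.
exposed_cases, total_cases = 9, 606
exposed_controls, total_controls = 18, 1774

unexposed_cases = total_cases - exposed_cases            # 597
unexposed_controls = total_controls - exposed_controls   # 1756

# For a 2x2 exposure table, OR = (a * d) / (b * c).
crude_or = (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)
print(round(crude_or, 2))  # 1.47; the abstract's covariate-adjusted OR is 1.44
```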
2,061
25,398,366
Regular use of dentifrices containing 5,000 ppm F- and quarterly professionally applied CHX or SDF varnishes seem to be efficacious in decreasing progression and initiation of root caries , respectively .
The present systematic review critically summarizes results of clinical studies investigating chemical agents to reduce initiation or inactivation of root caries lesions ( RCLs ) .
The objective of this research was to evaluate the anticaries effectiveness of a low-dose ( 500 ppm F , low-NaF ) sodium fluoride dentifrice , a high-dose ( 2,800 ppm F , high-NaF ) sodium fluoride dentifrice and an experimental 0.454 % stabilized stannous fluoride ( 1,100 ppm F ) with sodium hexametaphosphate ( SnF2-HMP ) dentifrice , each relative to a standard 1,100 ppm F sodium fluoride positive control dentifrice . Subjects ( n = 955 , with ∼239 per group ) with a mean age of 10.6 ( ∼9–12 years ) were randomly assigned to one of four dentifrice treatments . Two calibrated examiners independently measured visual-tactile caries as DMFS that was supplemented with a radiographic examination at baseline , 12 months and 24 months for each subject . Generally similar results were independently observed by both examiners at the conclusion of the 2-year study period . Considering all subjects that attended at least 60 % of the supervised brushing sessions , statistically significantly less caries was observed in the high-NaF group compared to the control group . Similarly , statistically significantly less caries was observed in the SnF2-HMP group as compared to the control group . Differences in caries increments between the low-NaF and control groups were not statistically significant . One of the examiners observed these same statistically significant differences after 1 year . In conclusion , the results of this clinical trial indicated that while no difference in caries increments was observed between the low-NaF and control groups , both the high-NaF and the SnF2-HMP groups experienced significantly fewer lesions than the control group OBJECTIVE To compare the effectiveness of annual topical application of silver diamine fluoride ( SDF ) solution , semi-annual topical application of SDF solution , and annual application of a flowable high fluoride-releasing glass ionomer in arresting active dentine caries in primary teeth . 
METHODS A total of 212 children, aged 3-4 years, were randomly allocated to one of three groups for treatment of carious dentine cavities in their primary teeth: Gp1, annual application of SDF; Gp2, semi-annual application of SDF; and Gp3, annual application of glass ionomer. Follow-up examinations were carried out every six months to assess whether the treated caries lesions had become arrested. RESULTS After 24 months, 181 (85%) children remained in the study. The caries arrest rates were 79%, 91% and 82% for Gp1, Gp2 and Gp3, respectively (p = 0.007). In the logistic regression model using GEE to adjust for the clustering effect, higher caries arrest rates were found in lesions treated in Gp2 (OR = 2.98, p = 0.007), those in anterior teeth (OR = 5.55, p < 0.001), and those in buccal/lingual smooth surfaces (OR = 15.6, p = 0.004). CONCLUSION Annual application of either SDF solution or high fluoride-releasing glass ionomer can arrest active dentine caries. Increasing the frequency of application to every 6 months can increase the caries arrest rate of SDF application. CLINICAL SIGNIFICANCE Arrest of active dentine caries in primary teeth by topical application of SDF solution can be enhanced by increasing the frequency of application from annually to every 6 months, whereas annual paint-on of a flowable glass ionomer can also arrest active dentine caries and may provide a more aesthetic outcome. Root caries is common in institutionalized elders, and effective prevention methods are needed. This clinical trial compared the effectiveness of four methods in preventing new root caries.
From 21 residential homes, 306 generally healthy elders having at least 5 teeth with exposed sound root surfaces were randomly allocated into one of four groups: (1) individualized oral hygiene instruction (OHI); (2) OHI and applications of 1% chlorhexidine varnish every 3 months; (3) OHI and applications of 5% sodium fluoride varnish every 3 months; and (4) OHI and annual applications of 38% silver diamine fluoride (SDF) solution. Two-thirds (203/306) of the elders were followed for 3 years. Mean numbers of new root caries surfaces in the four groups were 2.5, 1.1, 0.9, and 0.7, respectively (ANOVA, p < 0.001). SDF solution, sodium fluoride varnish, and chlorhexidine varnish were more effective in preventing new root caries than giving OHI alone. OBJECTIVES Little is known about the effect of Cervitec, a chlorhexidine-thymol varnish, on root caries. Our objective was to determine whether a 3-monthly application of Cervitec over 1 year would limit the progress of existing root caries lesions and reduce the incidence of dental root caries in a group of dentate institutionalized elderly, as a complement to their usual oral hygiene practices. METHODS A double-blind randomized clinical trial was conducted in 68 subjects (34 per group) in two residences in Almería (Spain). Twenty-one subjects with 60 root caries lesions and 25 with 65 lesions, in the Cervitec and placebo groups, respectively, completed the study. Varnishes were applied twice in the first week, 1 month later, and every 3 months until the end of the study. Clinical parameters associated with established lesions were determined at baseline and after 6 and 12 months, as was the incidence of root caries lesions. RESULTS The clinical evolution of lesions was significantly better in the Cervitec group as opposed to the placebo group in terms of width, height, color, and texture.
The increase in root caries was significantly lower (p = 0.039) in the Cervitec group. CONCLUSION According to these results, Cervitec may help to control established root lesions and reduce the incidence of root caries lesions among institutionalized elderly people. A clinical trial was conducted to compare the effect of different caries-preventive strategies on caries progression in lower-income, ethnically diverse persons 60 years of age and older. Two hundred and ninety-seven subjects were randomized into one of five experimental groups. Group 1 received usual care from a public health department or a private practitioner. Group 2 received an educational program of 2 h duration implemented twice a year. Group 3 received the educational program plus a 0.12% chlorhexidine rinse weekly. Group 4 received the education and chlorhexidine interventions and a fluoride varnish application twice a year. Group 5 received all the above interventions as well as scaling and root planing every 6 months throughout the 3-year study. A carious event was defined as the onset of a carious lesion, a filling, or an extraction on a surface which was sound at baseline. Two hundred and one subjects remained in the study for the 3-year period. Groups that received usual intraoral procedures (groups 3, 4, and 5) had a 27% reduction for coronal caries events (p = 0.09) and 23% for root caries events (p = 0.15), when compared to the groups that received no intraoral procedures (groups 1 and 2). Routine preventive treatments may have had only a small-to-moderate effect upon caries development. Background/Aims: Root caries among elderly communities is of growing public health concern globally. This controlled clinical trial investigated the effectiveness of silver diamine fluoride and oral health education in preventing and arresting root caries.
Methods: Two hundred sixty-six elderly subjects who had at least 5 teeth with exposed root surfaces and did not have serious life-threatening medical diseases were allocated to 3 groups according to a computer-generated random list: group 1 (the control group) received oral hygiene instructions (OHI) annually; group 2 received OHI and silver diamine fluoride (SDF) application annually; and group 3 was given OHI and SDF application annually, plus an oral health education (OHE) programme every 6 months. Results: Two hundred twenty-seven elderly subjects were followed for 24 months. The mean numbers of new root caries surfaces in groups 1, 2 and 3 were 1.33, 1.00 and 0.70, respectively (ANOVA, p < 0.05). Group 3 had fewer root surfaces with new caries than group 1 (Scheffé multiple-comparison test, p < 0.05). The mean numbers of arrested root caries surfaces in groups 1, 2 and 3 were 0.04, 0.28 and 0.33, respectively (ANOVA, p < 0.01). Group 3 and group 2 had a greater number of active root caries surfaces which became arrested than group 1 (Scheffé multiple-comparison test, p < 0.05). Conclusion: Annual application of SDF together with biannual OHE was effective in preventing new root caries and arresting root caries among community-dwelling elderly subjects. The purpose of this study was to determine the effect of a 48-month preventive dental program on the incidence of root caries in an urban, geriatric, noninstitutionalized population residing in an optimally fluoridated area. The 466 participants were randomly assigned to one of three groups at baseline. Group A (control): 171 subjects using a placebo mouthrinse daily; group B: 147 subjects receiving semiannual applications of a topical APF gel (1.2% F-) with the daily use of a placebo mouthrinse; group C: 148 subjects using a fluoridated mouthrinse daily, ACT (0.05% F-).
At baseline, the numbers of surfaces at risk, and decayed and filled surfaces, were recorded. After 48 months, in addition, the numbers of reversed and new lesions were determined, and the incremental DMFS calculated. The data were analyzed by ANCOVA. The incremental DMFS were: A = 0.91, B = 0.27, C = 0.26. The incremental DMFS in groups B and C were significantly lower than in group A (P < .05). The number of reversed lesions in group C (1.53 ± 2.03) was significantly greater than in group A (1.11 ± 1.74) and group B (1.01 ± 1.86) (P < .05). The number of new lesions in group B (1.36 ± 2.00) was significantly less than in group A (1.99 ± 2.65) (P < .05). The daily use of the fluoride mouthrinse significantly increased the number of reversed lesions. Arresting Caries Treatment (ACT) has been proposed to manage untreated dental caries in children. This prospective randomized clinical trial investigated the caries-arresting effectiveness of a single spot application of: (1) 38% silver diamine fluoride (SDF) with tannic acid as a reducing agent; (2) 38% SDF alone; (3) 12% SDF alone; and (4) no SDF application in primary teeth of 976 Nepalese schoolchildren. The a priori null hypothesis was that the different treatments have no effect in arresting active cavitated caries. Only the single application of 38% SDF, with or without tannic acid, was effective in arresting caries after 6 months (4.5 and 4.2 mean number of arrested surfaces; p < 0.001), after 1 year (4.1 and 3.4; p < 0.001), and after 2 years (2.2 and 2.1; p < 0.01). Tannic acid conferred no additional benefit. ACT with 38% SDF provides an alternative when restorative treatment for primary teeth is not an option. Objective The aim of this single-blind, multicenter, parallel, randomized controlled trial was to evaluate the effectiveness of the application of a high-fluoride toothpaste on root caries in adults.
Methods Adult patients (n = 130, ♂ = 74, ♀ = 56; mean age ± SD: 56.9 ± 12.9) from three participating centers, diagnosed with root caries, were randomly allocated into two groups: Test (n = 64, ♂ = 37, ♀ = 27; lesions = 144; mean age: 59.0 ± 12.1; intervention: high-fluoride toothpaste with 5000 ppm F) and Control (n = 66, ♂ = 37, ♀ = 29; lesions = 160; mean age: 54.8 ± 13.5; intervention: regular-fluoride toothpaste with 1350 ppm F). Clinical examinations and surface hardness scoring of the carious lesions were performed for each subject at specified time intervals (T0, at baseline before intervention; T1, at 3 months; and T2, at 6 months after intervention). Mean surface hardness scores (HS) were calculated for each patient. Statistical analyses comprised two-way analysis of variance and post hoc comparisons using the Bonferroni-Dunn correction. Results At T0, there was no statistical difference between the two groups with regard to gender (P = 0.0682, unpaired t-test) or age (P = 0.9786, chi-squared test), or for the overall HS (Test group: HS = 3.4 ± 0.61; Control group: HS = 3.4 ± 0.66; P = 0.8757, unpaired t-test). The ANOVA revealed significantly better HS for the test group than for the control group (T1: Test group: HS = 2.9 ± 0.67; Control group: HS = 3.1 ± 0.75; T2: Test group: HS = 2.4 ± 0.81; Control group: HS = 2.8 ± 0.79; P < 0.0001). However, the interaction term time-point*group was not significant.
Conclusions The application of a high-fluoride dentifrice (5000 ppm F) in adults, twice daily, significantly improves the surface hardness of otherwise untreated root caries lesions when compared with the use of regular-fluoride (1350 ppm F) toothpastes. AIM The purpose of the present study was to evaluate, in a group of periodontal maintenance patients, the effect of using a dentifrice and mouthrinse containing amine fluoride (AmF) and stannous fluoride (SnF2) as compared with a dentifrice and mouthrinse both containing sodium fluoride (NaF), with regard to their root caries experience. MATERIAL In total, 80 patients who had been treated for moderate-to-severe periodontitis agreed to participate in this study. Subjects received supportive periodontal therapy at regular intervals of 3-4 months for at least a period of 1 year. The patients were randomly divided into two groups: (1) the test group used an AmF/SnF2 dentifrice and mouthrinse and (2) the control group used an NaF-containing dentifrice and mouthrinse. Root caries was recorded at four sites per tooth at baseline and 24 months. RESULTS An increase in the number of exposed root surfaces was noted for both groups during the experimental period (p < 0.05). The mean number of active caries lesions at baseline was 2.1 and 1.8 for the test group and control group, respectively. At 24 months, the corresponding values were 1.8 for the test and 2.2 for the control group. An increase in the mean number of restored surfaces was noted for the AmF/SnF2 group (from 7.3 to 13.4) and the control group (from 7.9 to 14.7) during the course of the study. This increase was found to be statistically significant for both groups in comparison with the baseline values (p ≤ 0.01). No statistically significant differences were noted between groups.
Further analysis of the restored surfaces revealed that the major increase in the number of restorations was associated with restorations involving three to four root surfaces in the same tooth. Molars and premolars were the teeth receiving most new restorations. CONCLUSION The present study did not detect a difference in terms of root caries development between the two groups. Root caries development is a common finding associated with surfaces developing recession in patients once treated for periodontal problems. This study compared the ability of two sodium fluoride dentifrices, one containing 5,000 ppm fluoride (Prevident 5000 Plus) and the other 1,100 ppm fluoride (Winterfresh Gel), to reverse primary root caries lesions (PRCLs). A total of 201 subjects with at least one PRCL each entered the study and were randomly allocated to use one of the dentifrices. After 6 months, 186 subjects were included in statistical analyses. At baseline and after 3 and 6 months, the lesions were clinically assessed and their electrical resistance measured using an electrical caries monitor. After 3 months, 39 (38.2%) of the 102 subjects in the 5,000 ppm F– group and 9 (10.7%) of 84 subjects using the 1,100 ppm F– dentifrice had one or more PRCLs which had hardened (p = 0.005). Between baseline and 3 months, the log10 mean ± SD resistance values of lesions for subjects in the 1,100 ppm F– group had decreased by 0.06 ± 0.55, whereas those in the 5,000 ppm F– group had increased by 0.40 ± 0.64 (p < 0.001). After 6 months, 58 (56.9%) of the subjects in the 5,000 ppm F– group and 24 (28.6%) in the 1,100 ppm F– group had one or more PRCLs that had become hard (p = 0.002). Between baseline and 6 months, the log10 mean ± SD resistance values of lesions for subjects in the 1,100 ppm F– group decreased by 0.004 ± 0.70, whereas in the 5,000 ppm F– group they increased by 0.56 ± 0.76 (p < 0.001).
After 3 and 6 months, the distance from the apical border of the root caries lesions to the gingival margin increased significantly in the 5,000 ppm F– group when compared with the 1,100 ppm F– group. The plaque index in the 5,000 ppm F– group was also significantly reduced when compared with the 1,100 ppm F– group. The colour of the lesions remained unchanged. It was concluded that the dentifrice containing 5,000 ppm F– was significantly better at remineralising PRCLs than the one containing 1,100 ppm. OBJECTIVES This study compared a 10% chlorhexidine varnish treatment with placebo and sham treatments for preventing dental caries in adult patients with xerostomia (dry mouth). DESIGN The study was a multicentred, randomized, parallel-group, double-blind, placebo-controlled clinical trial. SETTING All examinations and procedures were performed at Tufts University, Boston, MA, the University of British Columbia, Vancouver, BC, or the University of Western Ontario, London, ON. SUBJECTS Subjects were adults with recent or current dental caries experience, high salivary levels of cariogenic microorganisms and low salivary flow rates. RESULTS 236 subjects completed at least one post-treatment examination. There were 697 new carious lesions diagnosed, 446 (64%) located on coronal surfaces and 251 (36%) located on root surfaces. The mean attack rate was 0.23 surfaces/100 surfaces at risk. The treatment difference observed between the Active and Placebo groups was statistically significant for root caries increment (p = .02) and total caries increment (p = .03). The treatment difference observed between the Active and Sham groups was not statistically significant for coronal, root or total caries increment. Analysis of variance of treatment group differences was performed using mutans streptococci counts, salivary flow rates, age, sex, caries prevalence, medications, time to first event and early withdrawal as covariables.
These factors did not meaningfully alter the findings. CONCLUSIONS The difference between the 10% chlorhexidine varnish and placebo treatments is considered to be highly clinically significant for root caries increment (41% reduction) and for total caries increment (25% reduction), but only modestly so for coronal caries increment (14%). The effects of fluoride and chlorhexidine varnishes on the microflora of dental root surfaces and on the progression of root-surface caries were studied. Forty-four patients, surgically treated for advanced periodontal disease, were distributed at random among three groups. All patients received a standardized preventive treatment. Furthermore, the dentition of the patients in the two experimental groups was treated, at three-month intervals, with chlorhexidine and fluoride varnish, respectively. Patients in the control group received no additional treatment. In the experimental groups, plaque samples were collected from selected sound and carious root surfaces at baseline and at three, six, and nine months after the onset of the study. The presence of root-surface caries was scored at baseline and after one year. In addition, the texture, depth, and color of the root-surface lesions were monitored. Mutans streptococci on root surfaces were suppressed significantly (p < 0.05) during the whole experimental period in the chlorhexidine varnish group, but not in the fluoride varnish group. A non-significant increase in the number of Actinomyces viscosus/naeslundii was noted after treatment with chlorhexidine and fluoride varnish. The increase in the number of decayed and filled root surfaces after one year was significantly lower in the experimental groups than in the control group.
After treatment with chlorhexidine varnish, significantly more initial root-surface lesions had hardened than in the other groups. OBJECTIVE To estimate the caries-preventive effect of 4 fluoride programs over 2 years in the elderly. SETTING The Public Dental Clinics of Bålsta and Knivsta and the Faculty of Odontology in Göteborg, Sweden. SUBJECTS One hundred and sixty-four individuals, aged 60 years and older (mean age 71.5 years), who were considered to be at risk from caries. DESIGN The participants were randomly assigned either to: (1) rinse twice a day with a 0.05% NaF solution (n = 49; rinsing group); (2) suck twice a day on a 1.66 mg NaF tablet (n = 51; tablet group); (3) brush their teeth three times a day using a toothpaste slurry rinsing technique (n = 32; slurry group); or (4) brush their teeth in their usual manner (n = 32; control group). The participants in all 4 groups used a fluoride toothpaste (containing 0.32% NaF) at least twice daily. RESULTS No new carious lesions were found in 67% of the participants in the rinsing, 43% in the tablet, 25% in the slurry and 16% in the control group over the 2 years. The mean (± SD) 2-year caries increment was 0.8 ± 1.4, 1.4 ± 1.7, 1.9 ± 1.9 and 2.3 ± 2.1 DFS in the rinsing, tablet, slurry and control groups, respectively; it was significantly lower in the rinsing than in the control group (p < 0.01). A lower incidence of DFS was also found in the tablet group than in the slurry group, but only for the lingual surfaces (p < 0.05). CONCLUSION The type of fluoride program may be of importance in the reduction of new caries lesions in an older population. There is limited evidence from clinical trials on the dose response of sodium fluoride dentifrices at concentrations above 1100 ppm fluoride ion with respect to caries efficacy.
This randomized, double-blind study examined the anti-caries effectiveness of sodium fluoride dentifrices containing 1700 ppm, 2200 ppm and 2800 ppm fluoride ion relative to an 1100 ppm fluoride ion control. A population of 5439 elementary schoolchildren, aged 6-15 years, was recruited from an urban central Ohio area with a low-fluoride water supply (< 0.3 ppm). Subjects were examined by visual-tactile and radiographic examination at baseline and after 1, 2, and 3 years of using the sodium fluoride dentifrices. Subjects were stratified according to gender, age and baseline DMFS scores derived from the visual-tactile baseline examination and randomly assigned to one of four treatment groups: 0.243% sodium fluoride (1100 ppm fluoride ion), 0.376% sodium fluoride (1700 ppm fluoride ion), 0.486% sodium fluoride (2200 ppm fluoride ion), and 0.619% sodium fluoride (2800 ppm fluoride ion). All products were formulated with the same fluoride-compatible silica abrasive. Results after 1 year provided evidence of a positive sodium fluoride dose response. Compared to the 1100 ppm fluoride treatment group, the 1700 ppm fluoride treatment group had an 11.0% reduction in DMFS that was not statistically significant, while the 2200 ppm and 2800 ppm fluoride treatment groups showed statistically significant (P < 0.05) reductions of 18.6% and 20.4%, respectively. The reductions in caries delivered by the higher-fluoride dentifrices were present across all tooth surface types, but were most pronounced for occlusal surfaces. Results at years 2 and 3 were confounded by a concurrent fluoride rinse program, which involved portions of the study population. While the trends for the higher-fluoride dentifrices observed at year 1 remained at years 2 and 3, the differences observed between treatments were substantially smaller and failed to reach statistical significance (P < 0.05).
Collectively, the data demonstrate that the 2200 ppm and the 2800 ppm fluoride treatments delivered statistically significantly greater caries efficacy than the 1100 ppm fluoride treatment. This large-scale clinical trial provides evidence of a positive, statistically significant dose relationship between dental caries and sodium fluoride in a dentifrice at levels above 1100 ppm fluoride at year 1. PURPOSE To assess the safety and efficacy of ozone, either with or without a root sealant, for the management of leathery root caries. METHODS 79 subjects with 220 root caries lesions were recruited into four study groups in this randomized, controlled trial. At baseline and after 1, 3, and 6 months, the ECM III and DIAGNOdent were employed. Subsequently, the root caries lesions were clinically assessed for color, hardness, cavitation, dimensions, distance from the gingival margin, and severity index. Modified USPHS criteria were also applied after 1, 3, and 6 months. The groups were as follows: Group 1: ozone application was performed for a period of 10 seconds on caries lesions; Group 2: neither ozone nor root sealant was applied to root caries; Group 3: ozone treatment and a root sealant were applied to root caries lesions; and Group 4: only root sealant was applied to root caries. RESULTS At the 6-month recall, 78 subjects were examined. There were no observed adverse events. 38.1% of lesions became hard in the ozone-only group, while none of the lesions became hard in the control group (P < 0.001). Noncavitated lesions were more likely to reverse than cavitated lesions: 38.4% of noncavitated lesions became hard, while only 5.7% of cavitated lesions became hard in the ozone-only group. Modified USPHS criteria revealed that there were 66.6% intact sealants in the ozone-and-sealant group and 45.5% intact sealants in the sealant-only group (P < 0.05).
After 1, 3, and 6 months, the ECM and DIAGNOdent readings showed improvements in the ozone-only group when compared to the control group (P < 0.001). The ozone-and-sealant group also had greater improvements in the ECM and DIAGNOdent values when compared to the sealant-only group (P < 0.05). OBJECTIVES The purpose of this study was to determine the efficacy and safety of a specially formulated remineralising toothpaste in controlling caries in a high-risk population: head and neck radiation patients. DESIGN The study compared the performance of the remineralising toothpaste with a conventional fluoride dentifrice using double-blind randomisation. MATERIALS AND METHODS Test products: the products compared contained equivalent quantities of fluoride (1100 p.p.m.). The dual-phase remineralising toothpaste, Enamelon, also delivered soluble calcium and phosphate ions, essential components of teeth, from separate phases. Both groups had all caries restored at baseline and used a fluoride rinse daily. SUBJECTS Fifty-seven subjects who had received radiation to the head and neck causing salivary hypofunction entered the study, while 44 completed the 10-12 month visit. MEASUREMENTS Examinations included coronal and root caries using the Pitts Diagnostic Criteria, salivary flow rate, plaque and gingival indices and microbiological counts over a 1-year period. RESULTS The average net increment per year for root caries per subject was 0.04 (±.052) in subjects completing the study using the remineralising toothpaste and 1.65 (±0.51) for root caries in subjects completing the study using the conventional fluoride dentifrice. The difference was statistically significant (p = 0.03), suggesting a lower net root surface increment per year for the remineralising toothpaste relative to the conventional toothpaste. No significant differences were noted on coronal surfaces.
CONCLUSION The results indicate that the remineralising toothpaste provides a significant benefit in preventing and remineralising root caries in high-risk patients. OBJECTIVE To assess the effect of an ozone delivery system, combined with the daily use of a remineralising patient kit, on the clinical severity of non-cavitated leathery primary root carious lesions (PRCLs) in an older population group. DESIGN A total of 89 subjects (age range 60-82; mean ± SD, 70.8 ± 6 years), each with two leathery PRCLs, were recruited. The two lesions in each subject were randomly assigned for treatment with ozone or air, in a double-blind design, in a general dental practice. Subjects were recalled at three, six, 12 and 18 months. Lesions were clinically recorded at each visit as soft, leathery or hard, scored with a validated root caries severity index. RESULTS There were no observed adverse events. After three months, in the ozone-treated group, 61 PRCLs (69%) had become hard and none had deteriorated, whilst in the control group, four PRCLs (4%) had become worse (p < 0.01). At the six-month recall, in the ozone group, seven PRCLs (8%) remained leathery and the remaining 82 (92%) PRCLs had become hard, whilst in the control group, 10 PRCLs had become worse (11%) and one had become hard (p < 0.01). At 12 and 18 months, 87 subjects attended. In the ozone group at 12 months, two PRCLs remained leathery, compared to 85 (98%) that had hardened, whilst in the control group 21 (24%) of the PRCLs had progressed from leathery to soft, i.e. became worse, 65 PRCLs (75%) were still leathery, and one remained hard (p < 0.01).
At 18 months, 87 (100%) of ozone-treated PRCLs had arrested, whilst in the control group, 32 (37%) of the PRCLs had worsened from leathery to soft (p < 0.01), 54 (62%) PRCLs remained leathery and only one of the control PRCLs had reversed (p < 0.01). CONCLUSIONS Leathery non-cavitated primary root caries can be arrested non-operatively with ozone and remineralising products. This treatment regime is an effective alternative to conventional "drilling and filling". OBJECTIVE Colgate Total toothpaste has been demonstrated to be highly effective in plaque and gingivitis control. The effect of triclosan on root caries and on the survival of dental crowns (fixed dental prosthetic treatment) has not been evaluated. In order to examine these important variables, a randomized controlled clinical trial was conducted comparing Colgate Total toothpaste with triclosan and an identical fluoride toothpaste without triclosan. METHODS Adult subjects were randomly assigned to a test group using Colgate Total plus Whitening toothpaste with triclosan, and a control group using Colgate sodium fluoride toothpaste without triclosan. At the end of the study, following three years of product use, an evaluation was performed to compare baseline data to the three-year data for root caries and dental crown survival. Clinical root caries was evaluated by the Katz RCI (Root Caries Index). Within-treatment analysis for each dentifrice was conducted using a paired t-test. Between-treatment analysis was performed using analysis of covariance (ANCOVA). For fixed dental prosthetic treatment evaluation, dental crowns were dichotomized for success and failure at the end of the study. Within-treatment analysis for each dentifrice was conducted using a paired t-test. Between-treatment analysis was performed using the Bonferroni test. RESULTS One thousand three hundred and fifty-seven subjects (1,357) completed the study.
Regarding root caries, at termination of the study the Colgate Total group presented a mean score of 1.14 ± 1.75 and a +5.6% change from baseline, while the sodium fluoride toothpaste group presented a mean of 1.25 ± 1.88 and a +43.2% change from baseline (p < 0.001). The adjusted mean root caries increment was 0.07 ± 0.03 for the Colgate Total group and 0.38 ± 0.03 for the sodium fluoride toothpaste group (p < 0.001). Regarding crowns, at termination of the study the Colgate Total group presented a mean score of 5.38 ± 3.70 and a +1.1% change from baseline, while the sodium fluoride toothpaste group without triclosan presented a mean of 5.75 ± 3.86 and a +3.8% change from baseline (p < 0.001). The mean (adjusted for multiple comparisons) dental crown failure increment was 0.09 ± 0.03 for the Colgate Total group and 0.31 ± 0.02 for the sodium fluoride toothpaste group (p < 0.001). CONCLUSION A comparison between the two study groups revealed a statistically significant difference for root caries and dental crown failure scores, both favoring the triclosan toothpaste (Colgate Total). The significantly lower root caries and dental crown failure scores observed among the Colgate Total toothpaste users indicate an effect of the triclosan and copolymer system. These results are important and could provide a strong and valuable public health measure. We hypothesized that the six-monthly application of silver diamine fluoride (SDF) can arrest the development of caries in the deciduous dentition of six-year-old schoolchildren and prevent caries in their first permanent molars. A prospective controlled clinical trial was conducted on the efficacy of a 38% SDF solution for caries reduction. Four hundred and twenty-five six-year-old children were divided into two groups: one group received SDF solution in primary canines and molars and first permanent molars every 6 months for 36 months.
The second group served as controls. The 36-month follow-up was completed by 373 children. The mean number of new decayed surfaces appearing in primary teeth during the study was 0.29 in the SDF group vs. 1.43 in controls. The mean number of new decayed surfaces in first permanent molars was 0.37 in the SDF group vs. 1.06 in controls. The SDF solution was found to be effective for caries reduction in primary teeth and first permanent molars in schoolchildren. OBJECTIVES The effectiveness of either a 0.2% neutral sodium fluoride (NaF) solution or a 0.12% chlorhexidine (CHX) solution as a daily mouthrinse for controlling caries was tested against a placebo rinse in this 2-year randomized clinical trial among elders in long-term care (LTC) facilities. METHODS At baseline, 369 recruits were examined clinically for caries and allocated randomly to one of the mouthrinse groups. RESULTS After 2 years, 116 participants remained in the trial. The prevalence of caries and the dental status of the groups were similar at baseline and after 2 years. On average, each group lost less than one tooth per person, but the fluoride group, compared with the others, had significantly less caries and significantly more reversals from carious to sound dental surfaces at the end of the trial. CONCLUSIONS We conclude that a 0.2% neutral NaF mouthrinse every day does reduce the incidence of caries among elders in LTC facilities. The aim of the study was to assess the effect of a toothpaste and mouthwashes containing amine fluoride (AmF) and stannous fluoride (SnF2) on dental plaque, gingivitis and root-surface caries. Forty-four adults participated in a five-month double-blind study. The following combinations of toothpastes and mouthwashes were used: (1) AmF/SnF2 toothpaste and AmF/SnF2 (Meridol) mouthwash (20 patients, mean age 45.7 years); (2) sodium fluoride (NaF) toothpaste and NaF mouthwash (24 patients, mean age 48.8 years).
The mean values for dental plaque (Silness-Löe index) were clinically and statistically significantly reduced in both groups, the difference being greater in the AmF/SnF2 group. Sulcus-bleeding index values were clinically and statistically significantly decreased in both groups. There was no marked difference between the groups. The root caries index (RCI, Katz) had decreased by the end of the experiment by 10% in the NaF group and by 47.4% in the AmF/SnF2 group Little information is available on the effect of fluorides on root surface caries in adults. This double-blind clinical study of 810 healthy adults, aged 54 and older, demonstrated decided cariostatic effects of a fluoridated dentifrice containing 1,100 ppm F as sodium fluoride. Statistically significant reductions in both coronal (41%) and root surface caries (67%) incidence were produced in the test group compared with a control dentifrice group during 1 year of study PURPOSE To assess the clinical effect of daily use of a toothpaste and mouthrinse, both containing amine fluoride, on primary root caries lesions (PRCL) in an adult caries-risk population. METHODS A clinical trial based on male and female subjects, 55-81 years of age, randomly assigned into two equal groups (Groups A and B). Fifty subjects allocated to Group A used a fluoride toothpaste twice a day (Elmex Sensitive toothpaste, 1400 ppm F) plus a mouthrinse twice a day with 10 ml of a fluoride solution (Elmex Sensitive rinse containing 250 ppm F). The fluoride used was amine fluoride and potassium fluoride (AmF/KF, 1:1). Subjects in Group B used the same fluoride toothpaste plus a placebo mouthrinse solution without fluoride. At baseline, a total of 420 PRCL were clinically recorded as either soft (score 3) or leathery (score 2).
Parallel and blind recordings measuring the electrical resistance of the selected PRCL were performed at baseline and after 3, 6, 9 and 12 months using an electrical caries monitor (ECM). Prevalence of tooth sensitivity and subjects' satisfaction were also measured in the two groups. RESULTS The clinical results showed statistically significantly higher numbers of reversals of soft (score 3) and leathery (score 2) PRCL in Group A compared to Group B. After 12 months, the number of soft PRCL (score 3) decreased in Group A from 74% at baseline to 11%, compared with 73% to 46% in Group B. After 12 months, 67% of the PRCL became hard (score 1) in Group A compared to only 7% in Group B (P < 0.001). Statistically significantly higher (P < 0.001) ECM mean (SD) log10 resistance values were recorded for the subjects in Group A, 2.67 (2.56) kOhm, compared to 2.12 (1.88) kOhm in Group B. Tooth sensitivity was substantially reduced after 12 months, by 56% in Group A compared to 20% in Group B The aim of this study was to examine the effect of combined use of a toothpaste/mouthrinse containing amine fluoride/stannous fluoride (AmF/SnF2; Meridol) on the development of white spot lesions, plaque, and gingivitis on maxillary anterior teeth in orthodontic patients. A prospective, randomized, double-blind study with 115 orthodontic patients (42 males and 73 females, average age 14.4 years, 18 dropouts) was designed. Group A (50) brushed twice daily with an AmF/SnF2 toothpaste (1400 ppm F) and rinsed every evening with an AmF/SnF2 solution (250 ppm F). Group B (47) brushed twice daily with a sodium fluoride (NaF) toothpaste (1400 ppm F) and rinsed every evening with a NaF solution (250 ppm F). Visible plaque index (VPI), gingival bleeding index (GBI) and white spot lesion index (WSL) were recorded on the six maxillary anterior teeth at bonding and after debonding, and evaluated with t-tests.
In group A no significant differences between bonding and debonding were recorded for WSL (1.02 ± 0.08 versus 1.05 ± 0.13, P = 0.14), VPI (0.10 ± 0.21 versus 0.12 ± 0.21, P = 0.66) or GBI (0.13 ± 0.21 versus 0.16 ± 0.22, P = 0.47), whereas statistically significant differences were found in group B between bonding and debonding for WSL (1.00 ± 0.02 versus 1.08 ± 0.17, P = 0.01), VPI (0.06 ± 0.13 versus 0.17 ± 0.25, P = 0.01) and GBI (0.06 ± 0.12 versus 0.16 ± 0.21, P = 0.01). The increase in lesions on the upper anterior teeth was 4.3 per cent in group A and 7.2 per cent in group B. It was concluded that the combined use of an AmF/SnF2 toothpaste/mouthrinse had a slightly greater inhibitory effect on white spot lesion development, plaque and gingivitis on maxillary anterior teeth during fixed orthodontic treatment compared with the NaF combination Root caries is prevalent in elderly disabled nursing home residents in Denmark. This study aimed to compare the effectiveness of tooth brushing with 5,000 versus 1,450 ppm fluoridated toothpaste (F-toothpaste) for controlling root caries in nursing home residents. The duration of the study was 8 months. Elderly disabled residents (n = 176) in 6 nursing homes in the Copenhagen area consented to take part in the study. They were randomly assigned to use one of the two toothpastes. Both groups had their teeth brushed twice a day by the nursing staff. A total of 125 residents completed the study. Baseline and follow-up clinical examinations were performed by one calibrated examiner. Texture, contour, location and colour of root caries lesions were used to evaluate lesion activity. No differences (p values > 0.16) were noted in the baseline examination with regard to age, mouth dryness, wearing of partial or full dentures in one of the jaws, occurrence of plaque, and active (2.61 vs. 2.67; SD, 1.7 vs. 1.8) or arrested lesions (0.62 vs. 0.63; SD, 1.7 vs.
1.7) between the 5,000 and the 1,450 ppm fluoride groups, respectively. Mean numbers of active root caries lesions at the follow-up examination were 1.05 (2.76) versus 2.55 (1.91), and mean numbers of arrested caries lesions were 2.13 (1.68) versus 0.61 (1.76), in the 5,000 and the 1,450 ppm fluoride groups, respectively (p < 0.001). To conclude, 5,000 ppm F-toothpaste is significantly more effective for controlling root caries lesion progression and promoting remineralization than 1,450 ppm F-toothpaste PURPOSE The aim of this study was to evaluate the efficacy of three topical fluoride treatments to arrest initial root carious lesions. MATERIALS AND METHODS Forty patients participated in a randomised study. Of the 60 root carious lesions that were included, 20 were randomised for treatment with the Carisolv chemo-mechanical technique and the Duraphat (2.23% F) fluoride varnish, 20 with Duraphat alone and 20 with stannous fluoride solution (8%). The lesions were treated at baseline and after three and six months; a clinical evaluation was carried out on these occasions and after 1 year. RESULTS All but four lesions were categorised as arrested caries during the 1-year follow-up period: 18 in the Carisolv/Duraphat group and 19 each in the Duraphat and stannous fluoride groups, respectively. There was a minor reduction in the mean size of the lesions, of around 0.1 to 0.2 mm in height and width, and a moderate change in colour from a lighter to a darker appearance. No obvious differences were found between the groups. The mean percentage of mutans streptococci in plaque from all lesions was 3.5% at baseline, and it decreased to 1.8% during the year. The decrease was, however, not statistically significant, and no significant differences were found between the groups.
CONCLUSIONS It can be concluded that the frequent topical application of fluoride could be a successful treatment for incipient root carious lesions, irrespective of the type of fluoride treatment used A three-year, double-blind, randomised clinical trial was conducted in Polk County, Florida from 1983-1987. The objective was to compare the effect of four dose levels of sodium monofluorophosphate (SMFP) and a single dose level of sodium fluoride (NaF) on DMFS, DMFT, and DFS interproximal indices. A total of 8,027 children were examined clinically and radiographically at baseline, and 5,474 children completed the three-year study, which included daily supervised brushing at school. No differences existed at baseline between the five study cells in age or gender distribution, or on any of the dental indices. Results indicated that the 2000 ppm F NaF group had a significantly smaller DMFS increment than the 2000 ppm F SMFP group (p < 0.005). The 2000 ppm F NaF group demonstrated an 18 per cent (26 per cent for children > 10 years at baseline) reduction in DMFS over the 1500 ppm F SMFP group, the 2500 ppm F group a 15 per cent (19 per cent) reduction, and the 2000 ppm F SMFP group a 5 per cent (9 per cent) reduction. Results are strongest in children at greatest risk -- older children with previous caries. This study concludes that the anticaries efficacy of SMFP dentifrices rises with increasing fluoride, and that the anticaries efficacy of a 2000 ppm F NaF dentifrice is superior to that of a 2000 ppm F SMFP dentifrice (p < 0.005).
2,062
30,294,903
Despite the heterogeneity of studies, there is limited evidence that interventions delivered by geriatrics-trained staff reduce hospitalisations in nursing home residents.
OBJECTIVE To determine the efficacy of interventions delivered by geriatrics-trained staff for nursing home residents in reducing hospitalisation.
Background Geriatric evaluation and management has become standard care for community-dwelling older adults following an acute admission to hospital. It is unclear whether this approach is beneficial for the frailest older adults living in permanent residential care. This study was undertaken to evaluate (1) the feasibility of, and consumer satisfaction with, a geriatrician-led supported discharge service for older adults living in residential care facilities (RCF) and (2) its impact on the uptake of Advanced Care Planning (ACP) and acute health care service utilisation. Methods In 2002-4 a randomised controlled trial was conducted in Melbourne, Australia comparing the geriatrician-led outreach service to usual care for RCF residents. Patients were recruited during their acute hospital stay and followed up at the RCF for six months. The intervention group received a post-discharge home visit within 96 hours, at which a comprehensive geriatric assessment was performed and a care plan developed. Participants and their families were also offered further meetings to discuss ACPs and document Advanced Directives (AD). Additional reviews were made available for assessment and management of intercurrent illness within the RCF. Consumer satisfaction was surveyed using a postal questionnaire. Results The study included 116 participants (57 intervention and 59 controls) with comparable baseline characteristics. The service was well received by consumers, demonstrated by higher satisfaction with care in the intervention group compared to controls (95% versus 58%, p = 0.006). AD were completed by 67% of participants/proxy decision makers in the intervention group, compared to 13% of RCF residents prior to service commencement.
At six months there was a significant reduction in outpatient visits (intervention 21 (37%) versus controls 45 (76%), p < 0.001), but no difference in readmission rates (39% intervention versus 34% control, p = 0.6). There was a trend towards reduced hospital bed-day utilisation (intervention 271 versus controls 372 days). Conclusion It is feasible to provide a supported discharge service that includes geriatrician assessment and care planning within a RCF. By expanding the service there is the potential for acute health care cost savings by decreasing the demand for outpatient consultation and further reducing acute care bed-days OBJECTIVES This study describes the outcomes of an intervention program in nursing homes and its effects on emergency room attendance, hospital admissions, and pharmaceutical expenditure. MATERIAL AND METHODS This involved a non-randomised community intervention in nursing homes with a control group. The program was implemented gradually from 2007 to 2009 in 10 nursing homes (857 beds) which participated voluntarily. The control group consisted of 14 nursing homes (1,200 beds), which refused to participate or were not assigned to our primary care centres. The intervention consisted of comprehensive geriatric assessment and follow-up visits by trained personnel, review and adjustment of drug treatment, case management and staff training. RESULTS In the nursing homes where the program was carried out, emergency room attendance decreased from 1165‰ (95% CI 1100-1240) in 2006 to 674‰ (95% CI 620-730) in 2009, while in the control group it increased from 1071‰ (95% CI 1020-1130) to 1246‰ (95% CI 1190-1310). Hospital admissions also decreased, from 48.4% (95% CI 45-52) in 2006 to 32.1% (95% CI 29-35) in 2009, while in the control group they increased from 43.5% (95% CI 41-46) to 55.8% (95% CI 53-59).
There was also a 9% reduction in pharmacy cost, compared with an increase of 11.9% in the control group. CONCLUSIONS The intervention has proved effective at reducing hospital admissions and emergency room attendance in institutionalised patients, thereby streamlining pharmacy costs OBJECTIVES To examine the frequency of and reasons for potentially avoidable hospitalizations of nursing home (NH) residents. DESIGN Medical records were reviewed as a component of a project designed to develop and pilot-test clinical practice tools for reducing potentially avoidable hospitalization. SETTING NHs in Georgia. PARTICIPANTS In 10 NHs with high and 10 with low hospitalization rates, 10 hospitalizations were randomly selected, including long- and short-stay residents. MEASUREMENTS Ratings using a structured review by expert NH clinicians. RESULTS Of the 200 hospitalizations, 134 (67.0%) were rated as potentially avoidable. Panel members cited lack of on-site availability of primary care clinicians, inability to obtain timely laboratory tests and intravenous fluids, problems with quality of care in assessing acute changes, and uncertain benefits of hospitalization as causes of these potentially avoidable hospitalizations. CONCLUSION In this sample of NH residents, experienced long-term care clinicians commonly rated hospitalizations as potentially avoidable. Support for NH infrastructure, clinical practice and communication tools for health professionals, increased attention to reducing the frequency of medically futile care, and financial and other incentives for NHs and their affiliated hospitals are needed to improve care, reduce avoidable hospitalizations, and avoid unnecessary healthcare expenditures in this population Residents of long-term care facilities have highly complex care needs, and quality of care is of international concern.
Maintaining resident wellness through proactive assessment and early intervention is key to decreasing the need for acute hospitalization. The Residential Aged Care Integration Program (RACIP) is a quality improvement intervention to support residential aged care staff and includes on-site support, education, clinical coaching, and care coordination provided by gerontology nurse specialists (GNSs) employed by a large district health board. The effect of the outreach program was evaluated through a randomized comparison of hospitalization 1 year before and after program implementation. The sample included 29 intervention facilities (1,425 residents) and 25 comparison facilities (1,128 residents) receiving usual care. The acute hospitalization rate unexpectedly increased for both groups after program implementation, although the rate of increase was significantly less for the intervention facilities. The hospitalization rate after the intervention increased 59% for the comparison group and 16% for the intervention group (rate ratio (RR) = 0.73, 95% confidence interval (CI) = 0.61-0.86, P < .001). Subgroup analysis showed a significantly lower rate change for those admitted for medical reasons in the intervention group (13% increase) than in the comparison group (69% increase) (RR = 0.67, 95% CI = 0.56-0.82, P < .001). Conversely, there was no significant difference in the RR for surgical admissions between the intervention and comparison groups (RR = 1.0, 95% CI = 0.68-1.46, P = .99). The integration of GNS expertise through the RACIP intervention may be one approach to support staff to provide optimal care and potentially improve resident health Flaws in the design, conduct, analysis, and reporting of randomised trials can cause the effect of an intervention to be underestimated or overestimated.
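The rate ratios with confidence intervals quoted in abstracts like the one above can be illustrated with a minimal sketch. The counts below are hypothetical (not taken from any trial in this review), and the Wald log-scale interval is one standard way such CIs are computed; the function name is ours:

```python
import math

def rate_ratio_ci(events_a, persontime_a, events_b, persontime_b, z=1.96):
    """Incidence rate ratio of group A vs. group B with a Wald 95% CI.

    The CI is built on the log scale: se(ln RR) = sqrt(1/a + 1/b),
    then exponentiated back to the ratio scale.
    """
    rr = (events_a / persontime_a) / (events_b / persontime_b)
    se = math.sqrt(1 / events_a + 1 / events_b)  # SE of ln(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical example: 300 admissions over 1,425 resident-years versus
# 290 admissions over 1,128 resident-years.
rr, lo, hi = rate_ratio_ci(300, 1425, 290, 1128)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A reported interval that excludes 1.0, as in RR = 0.73 (0.61-0.86), is what licenses the "significantly less" wording in the abstract.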
The Cochrane Collaboration's tool for assessing risk of bias aims to make the process clearer and more accurate OBJECTIVES To assess the effect of a Screening Tool of Older Persons' potentially inappropriate Prescriptions/Screening Tool to Alert doctors to Right Treatment (STOPP/START) medication intervention on clinical and economic outcomes. DESIGN Parallel-group randomized trial. SETTING Chronic care geriatric facility. PARTICIPANTS Residents aged 65 and older prescribed at least one medication (N = 359) were randomized to receive usual pharmaceutical care or undergo medication intervention. INTERVENTION Screening medications with STOPP/START criteria, followed up with recommendations to the chief physician. MEASUREMENTS The outcome measures, assessed at the initiation of the intervention and 1 year later, were number of hospitalizations and falls, Functional Independence Measure (FIM), quality of life (measured using the Medical Outcomes Study 12-item Short-Form Health Survey), and costs of medications. RESULTS The average number of drugs prescribed was significantly lower in the intervention than in the control group after 1 year (P < .001). The average drug costs in the intervention group decreased by 103 shekels (US$29) per participant per month (P < .001). The average number of falls in the intervention group dropped significantly (P = .006). Rates of hospitalization, FIM scores, and quality of life measurements were similar for both groups. CONCLUSION Implementation of STOPP/START criteria reduced the number of medications, falls, and costs in a geriatric facility. Their incorporation in those and similar settings is recommended OBJECTIVE To describe the process and outcomes of nursing home (NH) residents transferred to hospital EDs. METHODS This was a prospective, observational study conducted at 2 Midwestern community teaching hospitals during a 12-month period.
All elder patients (> 64 years of age) transferred to hospital EDs from regional NHs were eligible for the study. Hospital records were used to abstract relevant descriptive and clinical data. Need for ambulance use was graded prospectively using 3 categories of urgency developed in other studies. Transfers were considered "appropriate" based on outcome measures or if the problem necessitated diagnostic and/or therapeutic procedures not available in the NH. Transfer documentation was evaluated using a standardized 18-item checklist. RESULTS A total of 709 consecutive NH patients made 1,012 ED visits. Their mean age was 83.4 years (range 65-100); 76% were female. The majority of patients (94%) were transferred by ambulance. Ambulance transfer was classified as emergent (16% of patients), urgent (45%), or routine (39%). There were 319 (45%) patients subsequently admitted to the hospital. Approximately 77% (546/709) of the NH transfers were considered appropriate by the emergency physician (EP). Sixty-seven patients (10%) were transferred without any documentation. For those patients with transfer documentation, 6 common discrepancies were identified. CONCLUSION Although the majority of NH transfers in this population were appropriate, many patients were transferred without adequate documentation for the EP The demands of long-term care facility (LTCF) residents are complex, usually requiring a range of professionals and caregivers to provide treatment and care. To reduce this fragmentation of care, integrated care models have been developed in modern health care systems, and a gradual change from traditional LTCF care to integrated care has occurred in many countries. Although integrated care is assumed to improve the quality of care, evidence supporting these effects is insufficient.
We recruited 7 private LTCFs (74 residents) in northern Taipei and randomized them into an integrated care model (N = 42, mean age = 82.8 ± 8.0 years, 54.8% males) and a traditional model (N = 32, 81.7 ± 8.8 years, 43.8% males). The integrated care model group was provided with an actively working interdisciplinary team in addition to the traditional nursing and personal care received by the traditional model group. Physical function, nutritional status and several quality indicators (unplanned feeding tube replacement, unplanned urinary catheter replacement, pneumonia, urinary tract infection and so on) were compared between the groups. Overall, LTCF residents in the integrated care model group showed significant improvement in serum levels of albumin (3.78 ± 0.32 vs. 3.60 ± 0.45, p = 0.004) and hemoglobin (12.62 ± 1.58 vs. 12.03 ± 1.24, p = 0.004) during the study period. Among selected quality indicators, subjects in the integrated care model group were similar to the traditional model group, except that the integrated care model group had a significantly reduced unplanned feeding tube replacement rate. In conclusion, the clinical effectiveness of the integrated care model among severely disabled LTCF residents is minimal, and a further cost-effectiveness study is needed to promote optimal quality of care in this setting OBJECTIVE To assess the effect of a complex, multidisciplinary intervention aimed at reducing avoidable acute hospitalization of residents of residential aged care (RAC) facilities. DESIGN Cluster randomized controlled trial. SETTING RAC facilities with higher than expected hospitalizations in Auckland, New Zealand, were recruited and randomized to intervention or control. PARTICIPANTS A total of 1998 residents of 18 intervention facilities and 18 control facilities. INTERVENTION A facility-based complex intervention of 9 months' duration.
The intervention comprised gerontology nurse specialist (GNS)-led staff education, facility benchmarking, GNS resident review, and multidisciplinary (geriatrician, primary-care physician, pharmacist, GNS, and facility nurse) discussion of residents selected using standard criteria. MAIN OUTCOME MEASURES The primary end point was avoidable hospitalizations. Secondary end points were all acute admissions, mortality, and acute bed-days. Follow-up was for a total of 14 months. RESULTS The intervention did not affect the main study end points: number of acute avoidable hospital admissions (RR 1.07; 95% CI 0.85-1.36; P = .59) or mortality (RR 1.11; 95% CI 0.76-1.61; P = .62). CONCLUSIONS This multidisciplinary intervention, packaging selected case review and staff education, had no overall impact on acute hospital admissions or mortality. This may have considerable implications for resourcing in the acute and RAC sectors in the face of population aging. Australian and New Zealand Clinical Trials Registry (ACTRN12611000187943) Background There are 1.5 million people living in nursing homes in the United States1. The number of people admitted to nursing homes has increased since 1994, and it is expected that the number of people aged 65 and older living in nursing homes will double by the year 20202. Nursing home patients are sicker than they have been in the past 10 years, and the frail, sick patients are more likely to be hospitalized3. Unnecessary hospitalization of nursing home patients is a costly and critical problem in our healthcare system3. Hospitalization can cause irreversible decline in function for the elderly patient and can "expose residents to iatrogenic disease and delirium"4. It has been claimed that nurse practitioners (NPs) can play a valuable role in caring for the long-term care patient, reducing unnecessary hospital admissions, and supporting the physician's practice.
An NP on site in the nursing home can provide quick assessment and treatment when a patient has a change of condition. The NP can intervene and treat the patient as needed, instead of transferring the patient to the hospital for assessment. Objectives The objective of this systematic review was to evaluate the effectiveness of having an NP in the nursing home and whether this leads to a decrease in the rate of patient hospitalizations. Inclusion Criteria Types of Participants This systematic review considered studies that include long-term care nursing home residents. Types of Interventions The review considered studies that evaluate utilization of an NP (in collaboration with a physician) as a primary care provider for long-term care nursing home patients. Types of Outcomes This review considered studies that include the following outcome measures: incidence of hospitalization, types of hospitalization and duration of hospitalization of nursing home patients. Types of Studies Randomized controlled trials (RCTs) were not identified in the search. Therefore, other research designs, such as non-randomized controlled trials and before-and-after studies, were included. Search Strategy Major databases were searched for English articles written from 1983 to December 2008. Methodological Quality Seven papers were selected for retrieval and were assessed by two independent reviewers for methodological validity prior to inclusion in the review, using standardized critical appraisal instruments from the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument (JBI-MAStARI) (Appendix I). Data Collection/Extraction Quantitative data was extracted from papers included in the review using the standardized data extraction tool from JBI-MAStARI (Appendix II). Data Synthesis Statistical pooling was not possible, and the findings are presented in narrative form.
Results The review consisted of 12,681 patients in 238 nursing homes. All seven included articles found a decrease in hospitalization rates when NPs were utilized as part of the medical team. Five of the 7 studies found a decrease in ER transfers with the NP group. Garrard, Kane, et al.17 did not measure ER transfers, and Kane, Garrard et al.24 found no difference in the rate of ER use. Three studies also measured length of hospitalization, and all 3 found that the patients with NPs had shorter lengths of stay. Conclusions This review has demonstrated that nurse practitioners can reduce hospitalization and ER transfers of nursing home patients. Implications for Research It is recommended that more studies be initiated using only Master's-prepared advanced practice nurses. Implications for Practice It is recommended that NPs be utilized as primary care providers in nursing homes. Physicians should be encouraged to employ NPs to improve patient outcomes and to assist with patient loads BACKGROUND Hospital admissions are frequent among long-term residents of nursing homes and can result in detrimental complications affecting the patients' somatic, psychological, and cognitive status. In this prospective controlled study, we investigated the effects of a mobile geriatric consultant service (GECO), offered by specialists in internal medicine, on the frequency of hospitalizations in nursing home residents. METHODS During a 10-month observation period, residents in a control nursing home received medical attendance by general practitioners, as is common in Austrian nursing homes. Residents in the intervention nursing home also received the medical service of GECO.
RESULTS Within the group of rest home residents receiving GECO support, a statistically significantly lower frequency of acute transports to hospitals was observed in comparison to residents of the control nursing home (mean number of acute transports to hospitals/100 residents/month: 6.1 versus 11.7; p < 0.01). The number of planned non-acute hospital and specialist office presentations was also lower in the intervention nursing home (mean number of hospital and specialist office presentations/100 residents/month: 14.4 versus 18.0); however, this difference did not reach statistical significance. CONCLUSION This study shows that a mobile medical geriatric consultant service based on specialists in internal medicine can improve medical care in nursing homes, resulting in a statistically significant reduction in acute transports to hospitals BACKGROUND Over the last decade, high demand for acute healthcare services by long-term residents of residential care facilities (RCFs) has stimulated interest in exploring alternative models of care. The Residential Care Intervention Program in the Elderly (RECIPE) service provides expert outreach services to RCF residents; interventions include comprehensive care planning, management of intercurrent illness and rapid access to acute care substitution services. OBJECTIVE To evaluate whether the RECIPE service decreased acute healthcare utilisation. DESIGN A retrospective cohort study using interrupted time series analysis to analyse change in acute healthcare utilisation before and after enrolment. SETTING A 300-bed metropolitan teaching hospital in Australia and 73 RCFs within its catchment. SUBJECTS There were 1,327 patients enrolled in the service, with a median age of 84 years; 61% were female. METHODS Data were collected prospectively on all enrolled patients from 2004 to 2011 and linked to the acute health service administrative data set.
Primary outcomes were change in admission rates, length of stay and bed-days per quarter. RESULTS In the 2 years prior to enrolment, the mean number of acute care admissions per patient per year was 3.03 (SD 2.9), versus 2.4 (SD 3.3) afterwards, the service reducing admissions by 0.13 admissions per patient per quarter (P = 0.046). Prior to enrolment the mean length of stay was 8.6 days (SD 11.0), versus 3.5 (SD 5.0) afterwards, a reduction of 1.5 days per patient per quarter (P = 0.003). CONCLUSIONS This study suggests that an outreach service comprising a geriatrician-led multidisciplinary team can reduce acute hospital utilisation rates.
2,063
30,863,464
In these studies, metformin was shown to control seizure attacks by attenuating seizure generation, delaying the onset of epilepsy, reducing hippocampal neuronal loss, and averting cognitive impairments in both acute and chronic models of epileptic seizure. The possible mechanisms for its antiseizure or antiepileptic action might be activation of AMPK together with antiapoptotic, antineuroinflammatory, and antioxidant properties, which possibly modify disease progression by affecting epileptogenesis. Conclusion This review revealed the benefits of metformin in alleviating symptoms of epileptic seizure and modifying different cellular and molecular changes that affect the natural history of the disease, in addition to its good safety profile
Background Epilepsy is one of the common neurological illnesses, affecting millions of individuals globally. Although the majority of epileptic patients have a good response to the currently available antiepileptic drugs (AEDs), about 30-40% of epileptic patients develop resistance. In addition to the low safety profiles of most existing AEDs, no AED is so far available with curative or disease-modifying action for epilepsy. Objectives This systematic review is intended to evaluate the effect of metformin in acute and chronic animal models of epileptic seizure.
BACKGROUND & AIMS Given the long term benefits observed with metformin use in diabetes patients , a role in modulating oxidative stress is imputable . Effects of metformin on markers of oxidative stress , antioxidant reserve , and HDL-c associated antioxidant enzymes were investigated . METHODS In a clinical trial setting ( registered under ClinicalTrials.gov Identifier no. NCT01521624 ) 99 medication-naïve , newly diagnosed type 2 diabetes patients were randomly assigned to either metformin or lifestyle modification . AOPP , AGE , FRAP , activities of LCAT , and PON were measured at baseline and after 12-weeks . RESULTS Baseline values of the oxidative stress markers did not differ significantly between the two groups . In cases , after three months treatment , there was a significant reduction in AOPP ( 137.52 ± 25.59 , 118.45 ± 38.42 , p < 0.001 ) , and AGE ( 69.28 ± 4.58 , 64.31 ± 8.64 , p = 0.002 ) . FRAP and PON increased significantly ( 1060.67 ± 226.69 , 1347.80 ± 251.40 , p < 0.001 and 29.85 ± 23.18 , 37.86 ± 27.60 , p = 0.012 respectively ) . LCAT levels remained unchanged ( 45.23 ± 4.95 , 46.15 ± 6.28 , p = 0.439 ) . Comparing the two groups in a final multivariate model , AOPP , FRAP , and AGE levels changed more significantly in metformin compared with lifestyle modification alone ( p = 0.007 , p < 0.001 and p < 0.001 respectively ) . Escalation in LCAT or PON activities did not differ between the two groups ( p = 0.199 and 0.843 respectively ) . CONCLUSIONS Use of metformin is more effective in reducing oxidative stress compared with lifestyle modification alone Epidemiological studies have identified a robust association between type II diabetes mellitus and Alzheimer disease ( AD ) , and neurobiological studies have suggested the presence of central nervous system insulin resistance in individuals with AD .
Given this association , we hypothesized that the central nervous system – penetrant insulin-sensitizing medication metformin would be beneficial as a disease-modifying and /or symptomatic therapy for AD , and conducted a placebo-controlled crossover study of its effects on cerebrospinal fluid ( CSF ) , neuroimaging , and cognitive biomarkers . Twenty nondiabetic subjects with mild cognitive impairment or mild dementia due to AD were randomized to receive metformin then placebo for 8 weeks each or vice versa . CSF and neuroimaging ( Arterial Spin Label MRI ) data were collected for biomarker analyses , and cognitive testing was performed . Metformin was found to be safe , well-tolerated , and measurable in CSF at an average steady-state concentration of 95.6 ng/mL. Metformin was associated with improved executive functioning , and trends suggested improvement in learning/memory and attention . No significant changes in cerebral blood flow were observed , though post hoc completer analyses suggested an increase in orbitofrontal cerebral blood flow with metformin exposure . Further study of these findings is warranted Rationale Epidemiological evidence suggests that individuals with diabetes mellitus are at greater risk of developing Alzheimer ’s disease , and controversy overwhelms the usefulness of the widely prescribed insulin-sensitizing drug , metformin , on cognition . Objectives Through the scopolamine-induced memory deficit model , we investigated metformin 's influence on cognitive dysfunction and explored underlying mechanisms . Methods Sixty adult male Wistar rats were randomly assigned into 5 groups ( 12 rats each ) to receive either normal saline , scopolamine 1 mg/kg intraperitoneally once daily , scopolamine + oral metformin ( 100 mg/kg/day ) , scopolamine + oral metformin ( 300 mg/kg/day ) or scopolamine + oral rivastigmine ( 0.75 mg/kg/day ) for 14 days . Cognitive behaviours were tested using Morris water maze and passive avoidance tasks .
Biochemically , brain oxidative ( malondialdehyde ) and inflammatory ( TNF-α ) markers , nitric oxide , Akt , phospho-Akt , phospho-tau and acetyl cholinesterase activity in hippocampal and cortical tissues were assessed . Results The lower dose of metformin ( 100 mg/kg ) ameliorated scopolamine-induced impaired performance in both Morris water maze and passive avoidance tasks , and was associated with significant reduction of inflammation and , to a lesser extent , oxidative stress versus rivastigmine . Given the role of total Akt in regulation of abnormal tau accumulation and degradation , our finding that metformin 100 decreased the elevated total Akt while increasing its phosphorylated form explains its beneficial modulatory effect on phosphorylated tau in both tissues , and could further clarify its protection against memory impairment . Conclusion Metformin , only in the average human antidiabetic dose , offers a protective effect against scopolamine-induced cognitive impairment , while no deleterious effect was observed with the higher dose , which may support a bonus effect of metformin in type 2 diabetic patients AIM Advanced research has radically changed both diagnosis and treatment of diabetes during the last three decades ; a number of classes of oral antidiabetic agents are currently available for better glycemic control . The present study aims to evaluate the effect of metformin on different stress and inflammatory parameters in diabetic subjects . METHODS 208 type 2 diabetes patients were randomly assigned to metformin or placebo . RESULTS Reactive oxygen species generation , advanced oxidation protein products ( 179.65±13.6 , 120.65±10.5 μmol/l ) and pentosidine ( 107±10.4 , 78±7.6 pmol/ml ) were found to be reduced by metformin treatment compared to placebo . On the other hand , metformin administration enhanced total thiol and nitric oxide levels ( p<0.05 ) . But nutrient levels ( Mg(+2 ) , Ca(+2 ) ) in plasma were not altered by the treatment .
Significant restoration of C reactive protein ( p<0.05 ) was noticed after metformin therapy . Metformin administration also improved Na(+)K(+)ATPase activity ( 0.28±0.08 , 0.41±0.07 μmol Pi/mg/h ) in erythrocyte membrane . CONCLUSIONS This study shows that metformin treatment restores the antioxidant status , enzymatic activity and inflammatory parameters in type 2 diabetic patients . Metformin therapy improves the status of oxidative and nitrosative stress altered in type 2 diabetes . This study unfolds the cardioprotective role of metformin as an oral hypoglycemic agent Metformin ( Met ) is used to treat neurodegenerative disorders such as Alzheimer 's disease ( AD ) . Conversely , high-fat diets ( HFD ) have been shown to increase AD risk . In this study , we investigated the neuroprotective effects of Met on β-amyloid (Aβ)-induced impairments in hippocampal synaptic plasticity in AD model rats that were fed a HFD . In this study , 32 adult male Wistar rats were randomly assigned to four groups : group I ( control group , regular diet ) ; group II ( HFD+vehicle ) ; group III ( HFD+Aβ ) ; or group IV ( Met+HFD+Aβ ) . Rats fed a HFD were injected with Aβ to induce AD , allowed to recover , and treated with Met for 8 weeks . The rats were then anesthetized with intraperitoneal injections of urethane and placed in a stereotaxic apparatus for surgery , electrode implantation , and field potential recording . In vivo electrophysiological recordings were then performed to measure population spike ( PS ) amplitude and excitatory postsynaptic potential ( EPSP ) slope in the hippocampal dentate gyrus . Long-term potentiation ( LTP ) was induced by high-frequency stimulation of the perforant pathway . Blood samples were then collected to measure plasma levels of triglycerides , low-density lipoproteins , very low-density lipoprotein , and cholesterol .
After induction of LTP , PS amplitude and EPSP slope were significantly decreased in Aβ-injected rats fed a HFD compared to vehicle-injected animals or untreated animals that were fed a normal diet . Met treatment of Aβ-injected rats significantly attenuated these decreases , suggesting that Met decreased the effects of Aβ on LTP . These findings suggest that Met treatment is neuroprotective against the detrimental effects of Aβ and HFDs on hippocampal synaptic plasticity Epilepsy is known as one of the most frequent neurological diseases , characterized by an enduring predisposition to generate epileptic seizures . Oxidative stress is believed to directly participate in pathways leading to neurodegeneration , which serves as the most important propagating factor leading to the epileptic condition and cognitive decline . Moreover , there is also a growing body of evidence showing disturbance of the antioxidant system balance and consequently increased production of reactive species in patients with epilepsy . A meta-analysis conducted in the present review confirms an association between epilepsy and increased lipid peroxidation . Furthermore , it was also shown that some of the antiepileptic drugs could potentially be responsible for additionally increased lipid peroxidation . Therefore , it is reasonable to propose that during the epileptic process neuroprotective treatment with antioxidants could lead to less severe structural damage , reduced epileptogenesis and milder cognitive deterioration . To evaluate this hypothesis , studies investigating the neuroprotective therapeutic potential of various antioxidants in cells , animal seizure models and patients with epilepsy have been reviewed . Numerous beneficial effects of antioxidants on oxidative stress markers and in some cases also neuroprotective effects were observed in animal seizure models .
However , despite these encouraging results , until now only a few antioxidants have been further applied to patients with epilepsy as an add-on therapy . Based on the several positive findings in animal models , a strong need for more carefully planned , randomized , double-blind , cross-over , placebo-controlled clinical trials for the evaluation of antioxidant efficacy in patients with epilepsy is warranted Background . In a recent randomized controlled trial ( RCT ) in obese adolescents , 18 month-treatment with metformin versus placebo was reported to lead to stabilisation of the BMI . This study aimed to compare the effect of metformin on BMI in obese adolescents in daily practice versus results obtained in an RCT . Methods . Obese adolescents treated off label with metformin in daily practice in an outpatient clinic with a follow-up of ≥18 months were identified . Anthropometric and biochemical data were collected at baseline and at 18 months . Patients treated with metformin for 18 months in an RCT were used for comparison . BMI was compared between the two groups . Results . Nineteen patients ( median age 14.3 ( interquartile range 11.7–15.7 ) years , BMI 31.3 ( 28.8–33.8 ) kg/m2 ) treated in daily practice were compared to 23 patients receiving metformin in the RCT ( age 13.6 ( 12.6–15.3 ) years , BMI 29.8 ( 28.1–34.5 ) kg/m2 ) . BMI change after 18 months was −0.36 ( −2.10–1.58 ) versus + 0.22 ( −2.87–1.27 ) kg/m2 for the two groups , respectively . In the multivariable model , BMI change was not statistically significantly different between the two groups ( p = 0.61 ) . Conclusion . Treatment with metformin in obese adolescents in daily practice resulted in a comparable change in BMI as observed in an RCT . This trial is registered with ClinicalTrials.gov number : NCT01487993
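The adolescent BMI abstract above summarises skewed variables as median (interquartile range). A minimal sketch of that summary statistic, using hypothetical BMI-change values (not the trial's individual-patient data, which are not reported):

```python
import numpy as np

def median_iqr(values):
    """Summarise a skewed clinical variable as median (IQR),
    the format used for age and BMI change in the abstract above."""
    q1, med, q3 = np.percentile(values, [25, 50, 75])
    return med, (q1, q3)

# Hypothetical BMI-change values (kg/m2), for illustration only.
daily_practice = [-2.10, -0.80, -0.36, 0.40, 1.58]
med, (q1, q3) = median_iqr(daily_practice)
print(f"median {med:.2f} (IQR {q1:.2f} to {q3:.2f})")
```

Because medians and IQRs resist outliers, they are the conventional choice for small samples like the n = 19 daily-practice group.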
2,064
30,871,517
Conclusions Tai Chi seems to be effective in treating type 2 diabetes . Different training durations and styles result in variable effectiveness . The evidence was insufficient to determine whether long-term Tai Chi training was more effective
Background Physical activity is an important part of the diabetes management plan . However , the effects of different training durations and styles of Tai Chi have not been evaluated . We conducted an updated systematic review of the effects of Tai Chi on patients with type 2 diabetes based on different training durations and styles .
Purpose The aim was to assess the effects of a Tai Chi – based program on health-related quality of life ( HR-QOL ) in people with elevated blood glucose or diabetes who were not on medication for glucose control . Method 41 participants were randomly allocated to either a Tai Chi intervention group ( N = 20 ) or a usual medical-care control group ( N = 21 ) . The Tai Chi group involved 3 × 1.5 h supervised and group-based training sessions per week for 12 weeks . Indicators of HR-QOL were assessed by self-report survey immediately prior to and after the intervention . Results There were significant improvements in favor of the Tai Chi group for the SF36 subscales of physical functioning ( mean difference = 5.46 , 95 % CI = 1.35–9.57 , P < 0.05 ) , role physical ( mean difference = 18.60 , 95 % CI = 2.16–35.05 , P < 0.05 ) , bodily pain ( mean difference = 9.88 , 95 % CI = 2.06–17.69 , P < 0.05 ) and vitality ( mean difference = 9.96 , 95 % CI = 0.77–19.15 , P < 0.05 ) . Conclusions The findings show that this Tai Chi program improved indicators of HR-QOL including physical functioning , role physical , bodily pain and vitality in people with elevated blood glucose or diabetes who were not on diabetes medication OBJECTIVE This study aimed to validate the effects of a simplified , gentle form of t'ai chi chuan in patients with type 2 diabetes who are also obese . DESIGN The study was designed as a randomized controlled trial . SETTING This study was conducted in the department of metabolism and endocrinology at Cheng Ching Hospital , in Taichung , Taiwan . SUBJECTS The study subjects were hospital-based patients with type 2 diabetes who were also obese ( ages 40 - 70 , with a body-mass index [ BMI ] range of 30 - 35 ) . The patients were randomly selected and grouped into t'ai chi exercise ( TCE ) and conventional exercise ( CE ) groups .
INTERVENTIONS After receiving instruction in t'ai chi , the TCE group and the CE group practiced three times per week , including one practice session lasting up to 1 hour , for 12 weeks . OUTCOME MEASURES Hemoglobin A1C , serum lipid profile , serum malondialdehyde , and C-reactive protein were measured . Physical parameters of body weight and BMI were also measured . Diet and medications of participants were monitored carefully while biochemical and physical conditions were analyzed . RESULTS After 12 weeks , hemoglobin A1C values of the TCE group did not decrease ( 8.9 ± 2.7 % : 8.3 ± 2.2 % ; p = 0.064 ) . BMI ( 33.5 ± 4.8 : 31.3 ± 4.2 ; p = 0.038 ) and serum lipids , including triglyceride ( 214 ± 47 mg/dL : 171 ± 34 mg/dL ; p = 0.012 ) and high density lipoprotein cholesterol ( 38 ± 16 mg/dL : 45 ± 18 mg/dL ; p = 0.023 ) showed significant improvements . Serum malondialdehyde tended to decrease from baseline ( 2.66 ± 0.78 μmol/L : 2.31 ± 0.55 μmol/L ; p = 0.035 ) , and C-reactive protein also decreased ( 0.39 ± 0.19 mg/dL : 0.22 ± 0.15 mg/dL ; p = 0.014 ) . No improvements occurred in BMI , lipids , and oxidative stress profiles in the CE group . CONCLUSIONS T'ai chi exercise practiced by patients who are obese and have type 2 diabetes is efficient and safe when supervised by professionals and helps improve parameters such as BMI , lipid profile , C-reactive protein , and malondialdehyde . Periodic monitoring of blood glucose , blood pressure , heart rate , breathing , physical fitness , and symptoms of discomfort of patients who exercise helps prevent injury . Simple , gentle TCE can be applied as regular daily exercise for patients with type 2 diabetes even when such patients are obese Older adults with type 2 diabetes have mobility impairment and reduced fitness . This study aimed to test the efficacy of the “ Tai Chi for Diabetes ” form , developed to address health-related problems in diabetes , including mobility and physical function .
Thirty-eight older adults with stable type 2 diabetes were randomized to Tai Chi or sham exercise , twice a week for 16 weeks . Outcomes included gait , balance , musculoskeletal and cardiovascular fitness , self-reported activity and quality of life . Static and dynamic balance index ( −5.8 ± 14.2 ; p = 0.03 ) and maximal gait speed ( 6.2 ± 11.6 % ; p = 0.005 ) improved over time , with no significant group effects . There were no changes in other measures . Non-specific effects of exercise testing and /or study participation such as outcome expectation , socialization , the Hawthorne effect , or unmeasured changes in health status or compliance with medical treatment may underlie the modest improvements in gait and balance observed in this sham-exercise-controlled trial . This Tai Chi form , although developed specifically for diabetes , may not have been of sufficient intensity , frequency , or duration to effect positive changes in many aspects of physiology or health status relevant to older people with diabetes BACKGROUND Previous research has suggested that tai chi offers a therapeutic benefit in patients with fibromyalgia . METHODS We conducted a single-blind , randomized trial of classic Yang-style tai chi as compared with a control intervention consisting of wellness education and stretching for the treatment of fibromyalgia ( defined by American College of Rheumatology 1990 criteria ) . Sessions lasted 60 minutes each and took place twice a week for 12 weeks for each of the study groups . The primary end point was a change in the Fibromyalgia Impact Questionnaire ( FIQ ) score ( ranging from 0 to 100 , with higher scores indicating more severe symptoms ) at the end of 12 weeks . Secondary end points included summary scores on the physical and mental components of the Medical Outcomes Study 36-Item Short-Form Health Survey ( SF-36 ) . All assessments were repeated at 24 weeks to test the durability of the response .
RESULTS Of the 66 randomly assigned patients , the 33 in the tai chi group had clinically important improvements in the FIQ total score and quality of life . Mean ( + /-SD ) baseline and 12-week FIQ scores for the tai chi group were 62.9+/-15.5 and 35.1+/-18.8 , respectively , versus 68.0+/-11 and 58.6+/-17.6 , respectively , for the control group ( change from baseline in the tai chi group vs. change from baseline in the control group , -18.4 points ; P<0.001 ) . The corresponding SF-36 physical-component scores were 28.5+/-8.4 and 37.0+/-10.5 for the tai chi group versus 28.0+/-7.8 and 29.4+/-7.4 for the control group ( between-group difference , 7.1 points ; P=0.001 ) , and the mental-component scores were 42.6+/-12.2 and 50.3+/-10.2 for the tai chi group versus 37.8+/-10.5 and 39.4+/-11.9 for the control group ( between-group difference , 6.1 points ; P=0.03 ) . Improvements were maintained at 24 weeks ( between-group difference in the FIQ score , -18.3 points ; P<0.001 ) . No adverse events were observed . CONCLUSIONS Tai chi may be a useful treatment for fibromyalgia and merits long-term study in larger study populations . ( Funded by the National Center for Complementary and Alternative Medicine and others ; ClinicalTrials.gov number , NCT00515008 . ) This study examined the effects of a 24-week Tai Chi intervention on physical function in individuals with peripheral neuropathy . Twenty-five women and men with peripheral neuropathy were recruited . Plantar pressure detection threshold was assessed with a 5.07 gauge monofilament . Functional gait was assessed by the 6-min walk and timed up-and-go tests . Isokinetic leg strength and standing balance were also assessed . Twenty-four consecutive weeks of modified , group-based Tai Chi practice were completed , with testing repeated every six weeks throughout . No adverse events were observed and attendance was 17 + /- 4 sessions per 6 weeks .
After 6 weeks of Tai Chi , participants increased 6-min walk ( P < 0.0001 ) , timed up-and-go ( P < 0.0001 ) , and leg strength ( P < 0.01 ) performance . Continued improvement was observed in the timed up-and-go . Plantar sensation improved ( P = 0.003 ) following the Tai Chi intervention . Group-based Tai Chi is a safe , plausible , and effective intervention for those with PN BACKGROUND A large proportion of adults with type 2 diabetes remain sedentary despite evidence of benefits from exercise for type 2 diabetes . Simplified Yang Tai Chi has been shown in one study to have no effect on insulin sensitivity in older adults . However , a modified Tai Chi form , Tai Chi for Diabetes ( TCD ) , has recently been composed , claiming to improve diabetes control . METHODS Subjects were randomised to Tai Chi or sham exercise , twice a week for 16 weeks . Primary outcomes were insulin resistance 72 h post-exercise ( HOMA2-IR ) , and long-term glucose control ( HbA(1c ) ) . RESULTS Thirty-eight subjects ( 65 + /- 7.8 years , 79 % women ) were enrolled . Baseline BMI was 32.2 + /- 6.3 kg/m(2 ) , 84 % had osteoarthritis , 76 % hypertension , and 34 % cardiac disease . There was one dropout , no adverse events , and median compliance was 100 (0 - 100)% . There were no effects of time or group assignment on insulin resistance or HbA(1c ) ( -0.07 + /- 0.4 % Tai Chi versus 0.12 + /- 0.3 % Sham ; P = 0.13 ) at 16 weeks . Improvement in HbA(1c ) was related to decreased body fat ( r = 0.484 , P = 0.004 ) and improvement in insulin resistance was related to decreased body fat ( r = 0.37 , P = 0.03 ) and central adiposity ( r = 0.38 , P = 0.02 ) , as well as increased fat-free mass ( r = -0.46 , P = 0.005 ) . CONCLUSIONS TCD did not improve glucose homeostasis or insulin sensitivity measured 72 h after the last bout of exercise .
More intense forms of Tai Chi may be required to produce the body composition changes associated with metabolic benefits in type 2 diabetes OBJECTIVE To examine the effect of high-intensity progressive resistance training combined with moderate weight loss on glycemic control and body composition in older patients with type 2 diabetes . RESEARCH DESIGN AND METHODS Sedentary , overweight men and women with type 2 diabetes , aged 60 - 80 years ( n = 36 ) , were randomized to high-intensity progressive resistance training plus moderate weight loss ( RT & WL group ) or moderate weight loss plus a control program ( WL group ) . Clinical and laboratory measurements were assessed at 0 , 3 , and 6 months . RESULTS HbA(1c ) fell significantly more in RT & WL than WL at 3 months ( 0.6 + /- 0.7 vs. 0.07 + /- 0.8 % , P < 0.05 ) and 6 months ( 1.2 + /- 1.0 vs. 0.4 + /- 0.8 % , P < 0.05 ) . Similar reductions in body weight ( RT & WL 2.5 + /- 2.9 vs. WL 3.1 + /- 2.1 kg ) and fat mass ( RT & WL 2.4 + /- 2.7 vs. WL 2.7 + /- 2.5 kg ) were observed after 6 months . In contrast , lean body mass ( LBM ) increased in the RT & WL group ( 0.5 + /- 1.1 kg ) and decreased in the WL group ( 0.4 + /- 1.0 ) after 6 months ( P < 0.05 ) . There were no between-group differences for fasting glucose , insulin , serum lipids and lipoproteins , or resting blood pressure . CONCLUSIONS High-intensity progressive resistance training , in combination with moderate weight loss , was effective in improving glycemic control in older patients with type 2 diabetes . Additional benefits of improved muscular strength and LBM identify high-intensity resistance training as a feasible and effective component in the management program for older patients with type 2 diabetes OBJECTIVE To examine the effect of tai chi chuan exercise on peripheral nerve modulation in patients with type 2 diabetes mellitus . DESIGN Parallel group comparative study with a pre- and post-design .
SUBJECTS Twenty-eight participants with diabetes mellitus and 32 healthy adult controls from communities in Kaohsiung , Taiwan . METHODS Cheng 's tai chi chuan , 3 times a week for 12 weeks . Fasting blood glucose levels , insulin resistance index and nerve conduction studies were measured . RESULTS A 12-week tai chi chuan programme significantly improved fasting blood glucose ( p = 0.035 ) and increased nerve conduction velocities in all nerves tested ( p = 0.046 , right ; p = 0.041 , left ) in diabetic patients . Tai chi chuan exercise did not advance the nerve conduction velocities of normal adults ; however , it significantly improved the motor nerve conduction velocities of bilateral median and tibial nerves , and distal sensory latencies of bilateral ulnar nerves in diabetic patients . Tai chi chuan exercise had no significant effect on amplitudes of all nerves tested in diabetic patients . CONCLUSIONS Results from this study suggest that fasting blood glucose and peripheral nerve conduction velocities in diabetic patients can be improved by 12 weeks of tai chi chuan exercise . A further larger randomized controlled clinical trial with longer follow-up time is needed Peripheral neuropathy and loss of somatosensation in older adults with type 2 diabetes can increase risk of falls and disability . In nondiabetic older adult populations Tai Chi has been shown to enhance balance and fitness through improvements in somatosensation and neuromuscular control , and it is unclear if Tai Chi would elicit similar benefits in older adults with diabetes . Therefore , the purpose of this study was to investigate the effects of an 8-week , three-hour-per-week Tai Chi intervention on peripheral somatosensation in older adults with type 2 diabetes . Participants were eight Hispanic older adults with type 2 diabetes who participated in the Tai Chi intervention and a convenience sample of Hispanic older adults as a referent group .
Baseline and postintervention assessments included ankle proprioception , foot tactile sense , plantar pressure distribution , balance , and fitness . After intervention , older adults with type 2 diabetes showed significant improvements in ankle proprioception and fitness and decreased plantar pressure in the forefoot , with no statistical effect noted in balance or tactile sensation . Study results suggest that Tai Chi may be beneficial for older adults with diabetes as it improves ankle proprioception ; however , study findings need to be confirmed in a larger sample size randomized controlled trial BACKGROUND This study assessed the effect of tai chi on glycosylated haemoglobin ( HbA1c ) , blood pressure and health status ( SF-36 ) in adults with type 2 diabetes . METHODS A randomised controlled trial of tai chi classes for 6 months versus wait list control for adults with type 2 diabetes and a baseline HbA1c of 7 % or more . RESULTS A total of 53 patients were recruited to the study and randomised to tai chi ( 28 ) or control group ( 25 ) . There were improvements in HbA1c , 6 m walk test , and total cholesterol between baseline and follow up but the difference between the two treatment groups was not statistically significant . Health status results showed improvements in three domains for the tai chi group . DISCUSSION There was no significant improvement in metabolic control or cardiovascular risk at follow up compared to the control group . Patients in the tai chi group showed improvements in physical and social functioning
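Several of the tai chi abstracts above report results as a between-group mean difference with a 95 % CI (e.g. the SF-36 subscales). A minimal sketch of that computation, using a normal-approximation CI and hypothetical scores (not trial data):

```python
import numpy as np

def mean_diff_ci(treat, control, z=1.96):
    """Between-group mean difference (treat - control) with a
    normal-approximation 95% CI, the kind of summary reported
    for the SF-36 subscales above."""
    t = np.asarray(treat, dtype=float)
    c = np.asarray(control, dtype=float)
    diff = t.mean() - c.mean()
    # Standard error of the difference, unequal variances allowed.
    se = np.sqrt(t.var(ddof=1) / t.size + c.var(ddof=1) / c.size)
    return diff, (diff - z * se, diff + z * se)

# Hypothetical change-from-baseline scores, for illustration only.
treat = [8, 12, 5, 10, 9, 7, 11, 6]
control = [4, 6, 2, 5, 3, 7, 4, 1]
d, (lo, hi) = mean_diff_ci(treat, control)
print(f"mean difference {d:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

With samples this small, published trials would normally use a t-distribution critical value rather than z = 1.96; the normal approximation is used here only to keep the sketch dependency-free.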
2,065
31,015,873
Sodium or calcium polystyrene sulfonate ( moderate confidence ) , sodium zirconium cyclosilicate ( moderate confidence ) , and insulin plus dextrose ( moderate confidence ) showed superior efficacy to , respectively , placebo , no treatment , placebo , and dextrose . Other therapies ( low confidence ) showed similar efficacy compared to active or inactive alternatives .
Background Although the management of hyperkalemia follows expert guidelines , treatment approaches are based on traditionally accepted practice standards . New drugs such as sodium zirconium cyclosilicate and patiromer have been assessed ; however , their safety and efficacy or effectiveness have not yet been compared to traditional pharmacotherapy . Objective The purpose of the present systematic review was to evaluate the efficacy , effectiveness , and safety of hyperkalemia pharmacotherapies .
Ten patients with acute and 60 with chronic renal failure ( both groups having hyperkalaemia ) were managed at Kenyatta National Hospital in the medical wards and Renal Unit between August 1995 and January 1996 . They were divided into seven different treatment groups , each consisting of ten patients . Treatment A was glucose 25 g i.v . with insulin 10 units i.v . , treatment B was 50 mmol of 8.4 % sodium bicarbonate infusion , treatment C was 0.5 mg of salbutamol i.v . in 50 ml of 5 % dextrose , treatment D was a combination of treatments A and B , treatment E was a combination of treatments B and C , treatment F was a combination of treatments A and C , while treatment G was a combination of treatments A , B and C. Serum potassium was measured 30 minutes , 1 hour , 2 hours , 4 hours and 8 hours after treatment . Plasma glucose concentration was measured before treatment was given and 1 hour after in all patients . Electrocardiography was done before treatment on all patients and repeated 30 minutes and 1 hour after treatment for the patients with hyperkalaemic changes on the initial recording . All treatment modalities had satisfactory potassium lowering effects . Of the single therapeutic approaches , treatments A and C were equieffective , but better than treatment B ( P < 0.001 ) . Amongst the two-regimen combinations , treatments D and F were more efficacious than treatment E and all the single therapeutic approaches ( P < 0.001 ) . Treatment G was the most efficacious in lowering serum potassium in this study . All treatment modalities had their maximum serum potassium lowering effect at 1 - 2 hours . A fall in plasma glucose concentration was a notable feature of treatments A and D , but significant hypoglycaemia occurred in 20 % of patients receiving treatment A and in none on treatment D. The ECG changes of hyperkalaemia did not correlate with serum potassium levels .
The normalisation of hyperkalaemic ECG alterations occurred within the first 30 minutes after treatment . In conclusion , combination therapies for hyperkalaemia appear to be more efficacious than single therapeutic approaches . Inclusion of salbutamol seems to protect against insulin-induced hypoglycaemia . The maximum potassium lowering effect is observed 1 - 2 hours after administration of either agent . The potassium reducing effect remains significant compared to baseline values even after 8 hours . If dialysis cannot be instituted early enough , it seems reasonable to repeat treatment every 4 - 6 hours to sustain the effect . Repeated administration of glucose with insulin may not be safe because of the hypoglycaemic effect . Other single and combination therapies can theoretically be repeated regularly until dialysis is initiated , although this requires further clinical evaluation Three groups of patients with acute or chronic renal failure ( GFR less than 5 ml/min ) and hyperkalaemia ( K+ greater than or equal to 6 mEq/l ) , similar in age , serum creatinine and pretreatment K+ , were studied . Group A ( n = 24 ) received salbutamol 0.5 mg i.v . in 15 min , group B ( n = 10 ) received glucose 40 g i.v . plus 10 units insulin i.v . in 15 min , and group C ( n = 10 ) received salbutamol 0.5 mg i.v . , glucose 40 g i.v . and insulin 10 units i.v . over a 15-min period . Serum potassium was measured at 30 , 60 , 180 and 360 min after administration of treatment . All treatments reduced serum potassium , maximal at 30 or 60 min , and ranging from -0.5 + /- 0.1 to -1.5 + /- 0.2 mEq/l ; patients in group C exhibited a significantly greater decrement in serum potassium when compared to group B at 60 ( -1.5 + /- 0.2 vs -1 + /- 0.1 mEq/l , respectively ; P less than 0.01 ) and 180 min ( -1.2 + /- 0.2 vs -0.7 + /- 0.1 mEq/l , respectively ; P less than 0.05 ) . There were no significant differences between groups A and C.
Patients from group C had moderate tachycardia and more prolonged hyperglycaemia than those from group B , but all treatments were well tolerated Non-randomised studies of the effects of interventions are critical to many areas of healthcare evaluation , but their results may be biased . It is therefore important to understand and appraise their strengths and weaknesses . We developed ROBINS-I ( “ Risk Of Bias In Non-randomised Studies - of Interventions ” ) , a new tool for evaluating risk of bias in estimates of the comparative effectiveness ( harm or benefit ) of interventions from studies that did not use randomisation to allocate units ( individuals or clusters of individuals ) to comparison groups . The tool will be particularly useful to those undertaking systematic reviews that include non-randomised studies Hyperkalemia contributes to significant mortality and limits the use of cardioprotective and renoprotective renin – angiotensin – aldosterone blockers . Current therapies are poorly tolerated and not always effective . Here we conducted a phase 2 randomized , double-blind , placebo-controlled dose-escalation study to assess the safety and efficacy of ZS-9 . This oral selective cation exchanger , which preferentially entraps potassium in the gastrointestinal tract , was given to patients with stable Stage 3 chronic kidney disease and hyperkalemia ( 5.0 to 6.0 mEq/l ) during a 2-day period . Of 90 eligible patients with mean baseline serum potassium of 5.1 mEq/l , 30 were randomized to placebo , 12 to 0.3 g , 24 to 3 g , and 24 to 10 g of ZS-9 three times daily for 2 days with regular meals . None withdrew , and ZS-9 dose-dependently reduced serum potassium . The primary efficacy end point ( rate of serum potassium decline in the first 48 h ) was met with significance in the 3- and 10-g cohorts . From baseline , mean serum potassium was significantly decreased by 0.92±0.52 mEq/l at 38 h.
Urinary potassium excretion significantly decreased with 10-g ZS-9 as compared with placebo at day 2 (+15.8 ± 21.8 vs. +8.9 ± 22.9 mEq per 24 h). In this short-term study, no serious adverse events were reported; only mild constipation in the 3-g dose group was possibly related to treatment. Thus, ZS-9 was well tolerated in patients with stable chronic kidney disease and hyperkalemia, leading to a rapid, sustained reduction in serum potassium. Chronic kidney disease (CKD) in heart failure (HF) increases the risk of hyperkalaemia (HK), limiting angiotensin-converting enzyme inhibitor (ACE-I) or angiotensin receptor blocker (ARB) use. Patiromer is a sodium-free, non-absorbed potassium binder approved for HK treatment. We retrospectively evaluated patiromer's long-term safety and efficacy in HF patients from AMETHYST-DN. BACKGROUND Hyperkalaemia is a commonly encountered problem in dialysis patients with end-stage renal disease (ESRD). The aim of the present study was to assess the effect of fludrocortisone acetate (FCA) on reducing serum potassium levels in haemodialysis (HD) patients with hyperkalaemia. METHODS Prospectively, 21 HD patients with hyperkalaemia were enrolled in this study. Patients were divided into two groups, receiving either FCA (0.1 mg/d, n = 13) or no treatment (control, n = 8) for 10 months. No changes in dialysis or drug regimens were made during this period. RESULTS There were no significant differences in the baseline characteristics and biochemical parameters between the two groups (FCA therapy and control). At 10 months after FCA therapy, serum potassium levels were not significantly different between the treatment and control groups [median value (range): 5.2 (4.4-6.0) vs 5.8 (4.8-6.3) mEq/l, P = 0.121].
However, using the Wilcoxon signed-rank test, serum potassium levels at the end of the 10-month period after FCA therapy were significantly lower than serum potassium levels in the pre-treatment period [5.2 (4.4-6.0) vs 6.1 (5.3-6.8), P = 0.01]. The biochemical values, including sodium, chloride, protein, albumin, blood urea nitrogen, creatinine, interdialytic weight change and blood pressure, did not show significant differences in comparisons between the two groups or between the pre- and post-FCA therapy periods. CONCLUSIONS FCA therapy appears to slightly decrease serum potassium values in hyperkalaemic HD patients. However, these results are insufficient to establish the effectiveness of FCA. Therefore, potentially large-scale studies with increased dose concentrations are needed to minimize the positive potassium balance in hyperkalaemic HD patients. BACKGROUND AND OBJECTIVES Hyperkalemia affects up to 10% of patients with CKD. Sodium polystyrene sulfonate has long been prescribed for this condition, although evidence is lacking on its efficacy for the treatment of mild hyperkalemia over several days. This study aimed to evaluate the efficacy of sodium polystyrene sulfonate in the treatment of mild hyperkalemia. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS In total, 33 outpatients with CKD and mild hyperkalemia (5.0-5.9 mEq/L) in a single teaching hospital were included in this double-blind randomized clinical trial. We randomly assigned these patients to receive either placebo or sodium polystyrene sulfonate, 30 g orally once per day for 7 days. The primary outcome was the between-group comparison of the mean difference in serum potassium levels between the day after the last dose of treatment and baseline. RESULTS The mean duration of treatment was 6.9 days.
Sodium polystyrene sulfonate was superior to placebo in the reduction of serum potassium levels (mean difference between groups, -1.04 mEq/L; 95% confidence interval, -1.37 to -0.71). A higher proportion of patients in the sodium polystyrene sulfonate group attained normokalemia at the end of their treatment compared with those in the placebo group, but the difference did not reach statistical significance (73% versus 38%; P = 0.07). There was a trend toward higher rates of electrolyte disturbances and an increase in gastrointestinal side effects in the group receiving sodium polystyrene sulfonate. CONCLUSIONS Sodium polystyrene sulfonate was superior to placebo in reducing serum potassium over 7 days in patients with mild hyperkalemia and CKD. BACKGROUND Previous small uncontrolled studies suggested that fludrocortisone may significantly decrease serum potassium concentrations in hemodialysis patients, possibly through enhancement of colonic potassium secretion. The aim of this study was to evaluate the effect of oral fludrocortisone on serum potassium concentrations in hyperkalemic hemodialysis patients in an open-label randomized controlled trial. METHODS Thirty-seven hemodialysis patients with predialysis hyperkalemia were randomly allocated to administration of either oral fludrocortisone (0.1 mg/d; n = 18) or no treatment (control; n = 19) for 3 months. The primary outcome measure was the midweek predialysis serum potassium concentration, which was measured monthly during the trial. Prospective power calculations indicated that the study had an 80% probability of detecting a decrease in serum potassium levels of 0.7 mEq/L (0.7 mmol/L). RESULTS Baseline patient characteristics were similar, except for slightly longer total weekly dialysis hours in the fludrocortisone group (13.0 ± 1.3 versus 12.1 ± 1.0; P = 0.02).
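The between-group effect reported above (mean difference -1.04 mEq/L; 95% CI -1.37 to -0.71) is the standard comparison of per-patient potassium changes. A minimal sketch of how such a difference and its approximate confidence interval are computed is below; the per-patient values are hypothetical, not the trial's data, and a normal approximation with unequal variances is assumed.

```python
import math

def mean_diff_ci(treated, control, z=1.96):
    """Between-group mean difference in serum-K change with an
    approximate 95% CI (normal approximation, unequal variances)."""
    m1 = sum(treated) / len(treated)
    m0 = sum(control) / len(control)
    v1 = sum((x - m1) ** 2 for x in treated) / (len(treated) - 1)
    v0 = sum((x - m0) ** 2 for x in control) / (len(control) - 1)
    diff = m1 - m0
    se = math.sqrt(v1 / len(treated) + v0 / len(control))
    return diff, diff - z * se, diff + z * se

# Hypothetical per-patient changes in serum K (mEq/L), end of treatment
# minus baseline; illustrative numbers only.
sps = [-1.2, -0.9, -1.1, -0.8, -1.3, -1.0]
placebo = [0.1, -0.1, 0.0, 0.2, -0.2, 0.0]
diff, ci_lo, ci_hi = mean_diff_ci(sps, placebo)
```

With a larger sample, a t-quantile would replace the fixed z = 1.96; the structure of the calculation is the same.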
At the end of the study period, no significant difference in serum potassium concentrations was observed between the fludrocortisone and control groups (4.8 ± 0.5 versus 5.2 ± 0.7 mEq/L [mmol/L], respectively; P = 0.10). Similar results were obtained when changes in serum potassium levels over time were examined between the 2 arms using repeated-measures analysis of variance, with or without adjustment for total weekly dialysis hours. Secondary outcomes, including predialysis mean arterial pressure, interdialytic weight gain, serum sodium level, and hospitalization for hyperkalemia, were not significantly different between groups. There were no observed adverse events. CONCLUSION Administering fludrocortisone to hyperkalemic hemodialysis patients is safe and well tolerated but does not achieve clinically important decreases in serum potassium levels. Background: Hyperkalemia is a common medical emergency that may result in serious cardiac arrhythmias. Standard therapy with insulin plus glucose reliably lowers the serum potassium concentration ([K+]) but carries the risk of hypoglycemia. This study examined whether an intravenous glucose-only bolus lowers serum [K+] in stable, nondiabetic, hyperkalemic patients and compared this intervention with insulin-plus-glucose therapy. Methods: A randomized, crossover study was conducted in 10 chronic hemodialysis patients who were prone to hyperkalemia. Administration of 10 units of insulin with 100 ml of 50% glucose (50 g) was compared with administration of 100 ml of 50% glucose only. Serum [K+] was measured up to 60 min. Patients were monitored for hypoglycemia and EKG changes. Results: Baseline serum [K+] was 6.01 ± 0.87 and 6.23 ± 1.20 mmol/l in the insulin and glucose-only groups, respectively (p = 0.45). At 60 min, the glucose-only group had a fall in [K+] of 0.50 ± 0.31 mmol/l (p < 0.001).
In the insulin group, there was a fall of 0.83 ± 0.53 mmol/l at 60 min (p < 0.001) and a lower serum [K+] at that time compared with the glucose-only group (5.18 ± 0.76 vs. 5.73 ± 1.12 mmol/l, respectively; p = 0.01). In the glucose-only group, the glucose area under the curve (AUC) was greater and the insulin AUC was smaller. Two patients in the insulin group developed hypoglycemia. Conclusion: Infusion of a glucose-only bolus caused a clinically significant decrease in serum [K+] without any episodes of hypoglycemia. OBJECTIVE To evaluate the efficacy of inhaled albuterol for the treatment of hyperkalemia in premature neonates by conducting a prospective, randomized, placebo-controlled and double-blinded clinical trial. STUDY DESIGN Neonates < 2000 g receiving mechanical ventilation with central serum potassium ≥ 6.0 mmol/L (6.0 mEq/L) were randomly assigned to treatment or placebo groups. Albuterol (400 microg) or saline was given by nebulization. The dose was repeated every 2 hours until the potassium level fell below 5 mmol/L (maximum 12 doses) or there were signs of toxicity. RESULTS Nineteen patients completed the study (8 in the albuterol and 11 in the saline group). Serum potassium levels declined rapidly in the first 4 hours in the albuterol group, from 7.06 ± 0.23 mmol/L to 6.34 ± 0.24 mmol/L (P = .003), versus no significant change in the saline group (6.88 ± 0.18 mmol/L to 6.85 ± 0.24 mmol/L; P = .87). At 8 hours, the fall continued to be greater in the albuterol group than in the saline group (5.93 ± 0.3 mmol/L and 6.35 ± 0.22 mmol/L, respectively; P = .04). CONCLUSION Albuterol inhalation may be useful in rapidly lowering serum potassium levels in premature neonates. BACKGROUND Hyperkalemia is one of the most dreadful complications of chronic kidney disease (CKD). Medical management includes the use of cation exchange resins to remove excess potassium from the body.
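The crossover study above compares glucose and insulin AUCs between arms. An AUC from timed concentration samples is conventionally computed with the trapezoidal rule; a minimal sketch follows, using hypothetical serum glucose samples rather than the study's data.

```python
def auc_trapezoid(times, values):
    """Area under a concentration-time curve by the trapezoidal rule."""
    return sum(
        (t1 - t0) * (v0 + v1) / 2.0
        for (t0, v0), (t1, v1) in zip(
            zip(times, values), zip(times[1:], values[1:])
        )
    )

# Hypothetical serum glucose samples (min, mmol/L) after a 50 g bolus;
# illustrative numbers only.
times = [0, 15, 30, 45, 60]
glucose = [5.0, 12.0, 10.0, 8.0, 6.5]
auc = auc_trapezoid(times, glucose)  # mmol/L * min
```

The same function applied to the insulin samples of each arm would reproduce the kind of AUC contrast the abstract reports.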
Sodium polystyrene sulphonate (SPS) and calcium polystyrene sulphonate (CPS) are currently used for the hyperkalemia of CKD all over the world. The objective was to compare the efficacy and safety of two different cation exchange resins (CPS and SPS) in CKD patients with hyperkalemia. METHODS This randomized controlled trial was done at the Kidney Centre, Post Graduate Training Institute (PGTI), Karachi, Pakistan, between 15 January 2010 and 31 December 2010 to compare the efficacy and safety of CPS and SPS in 97 CKD patients with hyperkalemia. The subjects were divided into two groups. Group A received CPS while group B received SPS. The data included symptoms, food recall, physical signs of volume overload and electrolytes. After receiving a potassium-binding resin for 3 days, patients were evaluated for symptoms, weight gain, worsening of blood pressure and effects on electrolytes. Adverse events were recorded in an event reporting form. RESULTS The average potassium level pre-resin was 5.8 ± 0.26 in group A and 5.8 ± 0.6 in group B, which was reduced to 4.8 ± 0.5 in group A and 4.3 ± 0.53 in group B, suggesting the efficacy of both drugs for the treatment of hyperkalemia in CKD patients. Systolic blood pressure remained stable in both groups, while an increase in diastolic blood pressure was noticed in group B patients (p-value 0.004). No major adverse effects occurred in either group. CONCLUSION Both CPS and SPS can be used effectively for reducing the hyperkalemia of CKD. CPS showed fewer side effects as compared with SPS. Introduction: Hyperkalemia is a common, sometimes fatal electrolyte abnormality seen in patients with heart failure (HF) or kidney disease. Acute treatments that cause the intracellular translocation of potassium can be effective in the short term, but they simply buy time until definitive removal by dialysis or binding agents (e.g., sodium polystyrene sulfonate) can occur.
In contrast, treatment for chronic hyperkalemia, which often occurs in the setting of HF treated with renin-angiotensin-aldosterone system inhibitors (RAASi) or mineralocorticoid receptor antagonists (MRA), is limited and has questionable efficacy. Areas covered: Sodium zirconium cyclosilicate (ZS-9), a novel, non-absorbed, potassium-selective cation exchanger, has demonstrated activity in acutely lowering and maintaining normal potassium levels. When used chronically, maintenance of normal serum potassium has been demonstrated for up to 1 month. Although higher doses of ZS-9 have been associated with modest increases in the rates of edema and hypokalemia, the overall adverse event rate is similar to placebo. Expert opinion: The efficacy of ZS-9 has been shown in patients with chronic hyperkalemia, offering promise for conditions such as HF, where optimized therapy with RAASi and MRA is often limited by a concomitant, drug-induced increase in potassium. Further, in acute hyperkalemia it has the potential to become an important option by rapidly lowering potassium levels, thus delaying or potentially averting the need for emergent dialysis. While further randomized trials demonstrating improved clinical outcomes are required for both these indications, initial data suggest a promising role for this agent in the management of both acute and chronic hyperkalemia. Background: Hyperkalemia is a risky and potentially life-threatening condition in preterm infants. Glucose-insulin infusion has been considered a major therapeutic approach for non-oliguric hyperkalemia but affects the stability of the blood sugar level. We aimed to evaluate the effectiveness of salbutamol nebulization compared with glucose-insulin infusion for the treatment of non-oliguric hyperkalemia in premature infants.
Methods: Forty premature infants (gestational age ⩽36 weeks) with non-oliguric hyperkalemia (central serum potassium level greater than 6.0 mmol/L) within 72 h of birth were enrolled in this study. These infants were randomly assigned to two groups. One group received a regular insulin bolus with glucose infusion (Group A; n = 20), and the other received salbutamol (Ventolin®) by nebulization (Group B; n = 20). Potassium level, blood sugar, heart rate, and blood pressure were recorded for each group before treatment and at 3, 12, 24, 48, and 72 h post-treatment. Results: The serum potassium levels were reduced after treatment in both groups. No significant changes in heart rate or blood pressure were observed in either group. The fluctuation in glucose levels was gentler in the salbutamol-treated group than in the glucose-insulin infusion group. Conclusion: Salbutamol nebulization is not only as effective as glucose-insulin infusion for treating non-oliguric hyperkalemia in premature infants but can also avoid potential side effects such as vigorous blood glucose fluctuations. Hyperkalemia is a life-threatening emergency in maintenance hemodialysis (MHD) patients. This clinical trial investigated the efficacy and safety of calcium polystyrene sulfonate (Ca-PS) in MHD patients with interdialytic hyperkalemia. A total of 58 hemodialysis patients with hyperkalemia (≥5.5 mmol/L) were selected and administered either 3 weeks of Ca-PS (3 × 5 g/day) or a blank control, following a prospective, randomized, crossover clinical trial design with a 1-week washout period. All patients were followed up for another 3 weeks for safety evaluations. The primary outcome was the magnitude of the change in serum potassium levels. The secondary outcomes were electrocardiography (ECG) changes and treatment safety (volume overload, electrolyte imbalance).
Compared with the control group, Ca-PS treatment significantly reduced serum potassium levels (P < 0.01). More patients in the Ca-PS group had serum potassium levels below the safety threshold of 5.5 mmol/L (32% for control vs. 61% for Ca-PS, P < 0.01). Peaked T-waves occurred less frequently in patients in the Ca-PS group (13.8% for Ca-PS vs. 31.03% for control, P < 0.01). In addition, Ca-PS reduced serum phosphorus levels with no effects on serum levels of calcium and sodium, fluid volume, blood pressure, or interdialytic weight gain. Ca-PS treatment decreases serum levels of potassium and phosphorus in MHD patients with interdialytic hyperkalemia. Ca-PS does not induce volume overload or disrupt electrolyte balance. The study aimed to verify the application and performance of triggers for adverse drug events in hospitalized newborns. This prospective cohort study was conducted in the neonatal care units of a university hospital from March to September 2015. A list of triggers was developed for the identification of adverse drug events in this population. The list included antidote, clinical, and laboratory triggers. A total of 125 newborns who had received drugs during hospitalization were included. Neonatal patient charts were screened to detect triggers. When a trigger was found, the patient chart was reviewed to identify possible adverse drug events. Each trigger's yield in the identification of adverse drug events was calculated and then classified according to its performance. Nine hundred and twenty-five triggers identified 208 suspected adverse drug events. The triggers' overall yield was 22.5%. The most frequently identified triggers were: drop in oxygen saturation, increased frequency of bowel movements, medication stopped, and vomiting.
The triggers with the best performance in the identification of adverse drug events were: increased creatinine, increased urea, necrotizing enterocolitis, prescription of flumazenil, hypercalcemia, hyperkalemia, hypernatremia, and oversedation. The triggers identified in this study can be used to track adverse drug events in similar neonatal care services, focusing on the triggers with the best performance and the lowest identification workload. BACKGROUND Hyperkalemia (serum potassium level > 5.0 mmol per liter) is associated with increased mortality among patients with heart failure, chronic kidney disease, or diabetes. We investigated whether sodium zirconium cyclosilicate (ZS-9), a novel selective cation exchanger, could lower serum potassium levels in patients with hyperkalemia. METHODS In this multicenter, two-stage, double-blind, phase 3 trial, we randomly assigned 753 patients with hyperkalemia to receive either ZS-9 (at a dose of 1.25 g, 2.5 g, 5 g, or 10 g) or placebo three times daily for 48 hours. Patients with normokalemia (serum potassium level, 3.5 to 4.9 mmol per liter) at 48 hours were randomly assigned to receive either ZS-9 or placebo once daily on days 3 to 14 (maintenance phase). The primary end point was the exponential rate of change in the mean serum potassium level at 48 hours. RESULTS At 48 hours, the mean serum potassium level had decreased from 5.3 mmol per liter at baseline to 4.9 mmol per liter in the group of patients who received 2.5 g of ZS-9, 4.8 mmol per liter in the 5-g group, and 4.6 mmol per liter in the 10-g group, for mean reductions of 0.5, 0.5, and 0.7 mmol per liter, respectively (P < 0.001 for all comparisons), and to 5.1 mmol per liter in the 1.25-g group and the placebo group (mean reduction, 0.3 mmol per liter).
In patients who received 5 g of ZS-9 and those who received 10 g of ZS-9, serum potassium levels were maintained at 4.7 mmol per liter and 4.5 mmol per liter, respectively, during the maintenance phase, as compared with a level of more than 5.0 mmol per liter in the placebo group (P < 0.01 for all comparisons). Rates of adverse events were similar in the ZS-9 group and the placebo group (12.9% and 10.8%, respectively, in the initial phase; 25.1% and 24.5%, respectively, in the maintenance phase). Diarrhea was the most common complication in the two study groups. CONCLUSIONS Patients with hyperkalemia who received ZS-9, as compared with those who received placebo, had a significant reduction in potassium levels at 48 hours, with normokalemia maintained during 12 days of maintenance therapy. (Funded by ZS Pharma; ClinicalTrials.gov number, NCT01737697.)
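The phase 3 trial above uses "the exponential rate of change in the mean serum potassium level" as its primary end point. One common way to estimate such a rate is the least-squares slope of log(K) against time; the sketch below assumes that interpretation (the trial's exact statistical model is not given in the abstract) and uses hypothetical mean values, not the trial's data.

```python
import math

def exponential_rate(times_h, potassium):
    """Least-squares slope of log(K) versus time, i.e. the exponential
    rate of change of serum potassium (per hour). Negative values mean
    potassium is falling. A sketch, not the trial's exact model."""
    logs = [math.log(k) for k in potassium]
    n = len(times_h)
    tbar = sum(times_h) / n
    ybar = sum(logs) / n
    num = sum((t - tbar) * (y - ybar) for t, y in zip(times_h, logs))
    den = sum((t - tbar) ** 2 for t in times_h)
    return num / den

# Hypothetical mean serum K (mmol/L) at 0, 24 and 48 h on a falling
# trajectory; illustrative numbers only.
rate = exponential_rate([0, 24, 48], [5.3, 4.9, 4.6])
```

A dose-dependent effect would show up as an increasingly negative slope across the dose groups.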
2,066
23,235,634
The analysis of the five included trials comparing intravenous versus oral steroid therapy for MS relapses does not demonstrate any significant differences in clinical (benefits and adverse events), radiological or pharmacological outcomes. Based on the evidence, oral steroid therapy may be a practical and effective alternative to intravenous steroid therapy in the treatment of MS relapses.
DOI: 10.1002/14651858.CD006921.pub2. Multiple sclerosis (MS), a chronic inflammatory and neurodegenerative disease of the central nervous system (CNS), is characterized by recurrent relapses of CNS inflammation ranging from mild to severely disabling. Relapses have long been treated with steroids to reduce inflammation and hasten recovery. However, the commonly used intravenous methylprednisolone (IVMP) requires repeated infusions with the added costs of home care or hospitalization, and may interfere with daily responsibilities. Oral steroids have been used in place of intravenous steroids, with lower direct and indirect costs. OBJECTIVES The primary objective was to compare the efficacy of oral versus intravenous steroids in promoting disability recovery in MS relapses of ≤ six weeks' duration. Secondary objectives included subsequent relapse rate, disability, ambulation, hospitalization, immunological markers, radiological markers, and quality of life.
THE PURPOSE of this report is to present the results of a double-blind drug trial in which an oral corticosteroid, methylprednisolone (Medrol), was compared with oral cyanocobalamin (vitamin B12) in patients with multiple sclerosis. This pair of drugs was chosen for comparison for several reasons. If multiple sclerosis is a peculiar autoimmune disease related to the myelin of the brain, then it might be reasonable to expect that corticosteroids may favorably alter the course of the disease.1 A number of reports in the recent literature have dealt with clinical trials of corticosteroids and corticotropin (ACTH) in the treatment of patients with multiple sclerosis. Studies by Alexander et al on corticotropin,2,3 which did not include modern statistical designs such as randomization of administration of placebo and corticotropin and a double-blind technique, have indicated a favorable response. Miller et al,4 utilizing a modern statistical design, also concluded that ... The immunological effects of high-dose methylprednisolone in attacks of multiple sclerosis and acute optic neuritis have only been examined in a few randomized, controlled trials. We studied immunological changes in 50 patients with optic neuritis or multiple sclerosis who underwent lumbar puncture before and 1 week after completing a 15-day course of oral high-dose methylprednisolone treatment. Treatment resulted in a decrease in the concentration of myelin basic protein, a decrease in the serum concentration of immunoglobulin G (IgG) and intrathecal IgG synthesis, an increase in the cerebrospinal fluid concentration of transforming growth factor-β1, and changes in the expression of CD25, CD26, and human leukocyte antigen-DR (HLA-DR) on CD4 T-cells. No effect was seen on the cerebrospinal fluid leucocyte count or the cerebrospinal fluid activity of matrix metalloproteinase-9 (MMP-9).
The lack of a persistent effect on cerebrospinal fluid leucocyte recruitment and MMP-9 activity, despite changes in IgG synthesis, T-cell activation, and cytokine production, suggests that modulation of the function of inflammatory cells may contribute to the clinical efficacy of oral high-dose methylprednisolone treatment in optic neuritis and multiple sclerosis. 30 patients with acute exacerbations of multiple sclerosis were treated with ACTH, dexamethasone or methylprednisolone in a double-blind randomized study. Clinical parameters were assessed; cerebrospinal fluid and neurophysiological parameters (visual- and brainstem-evoked potentials) were evaluated at the beginning and at the end of treatment. Dexamethasone was more effective than ACTH and 6-methylprednisolone in shortening bout duration. Neither CSF nor neurophysiological parameters were significantly affected by therapy. The efficacy of dexamethasone (DX) and methylprednisolone (MP) at high (HD) and low (LD) doses in acute multiple sclerosis (MS) relapses was evaluated in a double-blind trial in 31 patients followed for 1 year. DX and HDMP were similarly efficacious in promoting recovery, while LDMP was ineffective in the short-term outcome and was followed by an early clinical reactivation. The different outcomes seem to be related to different immunomodulating effects, mainly on cerebrospinal fluid (CSF) IgG synthesis and on peripheral blood and CSF CD4+ lymphocyte subsets. We investigated the effect of oral and intravenous methylprednisolone treatment on the subsequent relapse rate in patients with multiple sclerosis. Following a double-blind trial designed to compare the effect of oral and intravenous methylprednisolone treatment on promoting recovery from acute relapses of multiple sclerosis, 80 patients were followed for two years with six-monthly assessments during which all subsequent relapses were recorded.
The annual relapse rate was slightly higher in the oral than in the intravenous methylprednisolone-treated patients (1.06 vs. 0.78), but the adjusted difference between the two groups was not statistically significant (0.18; 95% CI -0.19 to 0.55, P = 0.3). The time to onset and the severity of the first relapse after treatment, the number of relapse-free patients at the end of the follow-up period, and the severity of the relapses during the follow-up period were similar in the two groups. This trial did not show a statistically significant difference in relapse rate during the first two years following oral compared with intravenous methylprednisolone treatment. Objective: To assess the efficacy of oral high-dose methylprednisolone in acute optic neuritis (ON). Background: It has been determined that oral high-dose methylprednisolone is efficacious in attacks of MS. Methods: A total of 60 patients with symptoms and signs of ON with a duration of less than 4 weeks and a visual acuity of 0.7 or less were randomized to treatment with placebo (n = 30) or oral methylprednisolone (n = 30; 500 mg daily for 5 days, with a 10-day tapering period). Visual function was measured and symptoms were scored on a visual analog scale (VAS) before treatment and after 1, 3, and 8 weeks. Primary efficacy measures were spatial vision and VAS scores during the first 3 weeks (analysis of variance with baseline values as the covariate) and changes in spatial vision and VAS scores after 8 weeks. A significance level of p < 0.0125 was employed. Results: The VAS score (p = 0.008) but not spatial visual function (p = 0.03) differed between methylprednisolone- and placebo-treated patients during the first 3 weeks. After 8 weeks the improvement in VAS scores (p = 0.8) and spatial visual function (p = 0.5) was comparable between methylprednisolone- and placebo-treated patients.
A post hoc subgroup analysis suggested that patients with a more severe baseline visual deficit and patients treated early after the onset of symptoms had a more pronounced response to treatment. The risk of a new demyelinating attack within 1 year was unaffected by treatment. No serious adverse events were seen. Conclusion: Oral high-dose methylprednisolone treatment improves recovery from ON at 1 and 3 weeks, but no effect could be demonstrated at 8 weeks or on subsequent attack frequency. Objective: There is only limited evidence from adequately controlled clinical trials to support high-dose methylprednisolone therapy for attacks of multiple sclerosis (MS), and none supporting oral administration. We assessed the effect of oral high-dose methylprednisolone therapy in attacks of MS. Methods: Twenty-five patients with an attack of MS lasting less than 4 weeks were randomized to placebo treatment. Twenty-six patients received oral methylprednisolone (500 mg once a day for 5 days with a 10-day tapering period). The patients were scored on the Scripps Neurological Rating Scale (NRS) and the Kurtzke Expanded Disability Status Scale. Symptoms were scored on a visual analog scale (VAS) before treatment and after 1, 3, and 8 weeks of treatment. Primary efficacy measures were NRS and VAS scores in the first 3 weeks and changes in NRS score and answers to an efficacy questionnaire administered after 8 weeks of treatment. Results: Changes in NRS scores among methylprednisolone- and placebo-treated patients differed significantly in the first 3 weeks and after 8 weeks (p = 0.005 and p = 0.0007). VAS scores in the first 3 weeks and treatment efficacy after 8 weeks also favored a beneficial effect of methylprednisolone treatment (p = 0.02 and p = 0.05).
After 1, 3, and 8 weeks, 4%, 24%, and 32% in the placebo group and 31%, 54%, and 65% in the methylprednisolone group had improved by one point on the Expanded Disability Status Scale score (all p < 0.05). No serious adverse events were seen. Conclusion: Oral high-dose methylprednisolone is recommended for managing attacks of MS. To compare the efficacy of high-dose intravenous methylprednisolone with intramuscular ACTH in the treatment of acute relapse in multiple sclerosis, we undertook a double-blind, randomized, controlled study involving 61 patients. There was a marked improvement in both groups in the course of the study, but no difference between them in either the rate of recovery or the final outcome. High-dose IV methylprednisolone is a safe alternative to ACTH in the management of acute relapse in MS. Oral prednisone1 might be a convenient, inexpensive alternative to IV methylprednisolone (IVMP) if the bioequivalent dose were known. We compared the total amount of steroid absorbed after 1250 mg oral prednisone vs 1 gram IVMP in 16 patients with multiple sclerosis (MS). At 24 hours, the mean area under the concentration-time curve (AUC), the main component of bioavailability, did not differ between groups (p = 0.122). This suggests that the amount of absorbed corticosteroid is similar after either steroid at these doses. BACKGROUND An intravenous rather than oral course of methylprednisolone is often prescribed for treating acute relapses in multiple sclerosis (MS) despite the lack of evidence to support this route of administration. Our double-blind, placebo-controlled, randomised trial was designed to compare the efficacy of commonly used intravenous and oral steroid regimens in promoting recovery from acute relapses in MS. METHODS 42 patients with a clinically definite relapse in MS received oral, and 38 intravenous, methylprednisolone.
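The prednisone-versus-IVMP comparison above rests on dose-normalized AUCs: oral bioavailability relative to the IV route is the oral AUC per mg of dose divided by the IV AUC per mg of dose. A minimal sketch of that calculation follows; the AUC values are hypothetical illustrations, not the study's measurements.

```python
def relative_bioavailability(auc_oral, dose_oral_mg, auc_iv, dose_iv_mg):
    """Dose-normalized oral bioavailability relative to the IV route:
    F = (AUC_oral / dose_oral) / (AUC_iv / dose_iv)."""
    return (auc_oral / dose_oral_mg) / (auc_iv / dose_iv_mg)

# Hypothetical 24-h AUC values (ng*h/ml) for 1250 mg oral prednisone
# vs 1 g IV methylprednisolone; illustrative numbers only.
f = relative_bioavailability(
    auc_oral=10500.0, dose_oral_mg=1250.0,
    auc_iv=9000.0, dose_iv_mg=1000.0,
)
```

An F near 1 at these doses would support the study's conclusion that the absorbed amounts are similar.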
Clinical measurements at entry and at 1 week, 4 weeks, 12 weeks, and 24 weeks included Kurtzke's Expanded Disability Status Scale (EDSS), Hauser's Ambulatory Index, and an arm-function index. The primary outcome criterion was a difference between the two treatment groups of one or more EDSS grades at 4 weeks. FINDINGS There were no significant differences between the two groups at any stage of the study in any measurement taken: the mean difference in EDSS at 4 weeks (adjusted for baseline level) was 0.07 grades more in those taking oral steroids (95% CI -0.46 to 0.60). The most optimistic outcome for intravenous therapy is an average benefit of less than half a grade improvement on the EDSS over oral treatment. INTERPRETATION Since our study did not show any clear advantage of the intravenous regimen, we conclude that it is preferable to prescribe oral rather than intravenous steroids for acute relapses in MS for reasons of patient convenience, safety, and cost. Many patients with multiple sclerosis (MS) are currently receiving treatment with interferon (IFN)-β. Early identification of nonresponder patients is crucial for trying different therapeutic approaches. We investigated various criteria of treatment response to assess which criterion better identifies patients with a poor response. We investigated in multiple sclerosis the difference between two commonly used oral and intravenous steroid regimens in the level of adhesion molecule expression on blood T lymphocytes, the distribution of circulating T cell subsets, and the concentration of serum tumour necrosis factor alpha. Venous blood samples were collected from a cohort of 22 patients with acute relapses who were participating in a randomised trial comparing intravenous methylprednisolone 1000 mg daily for 3 days with oral methylprednisolone 48 mg daily for one week, 24 mg daily for one week, and finally 12 mg daily for one week.
There was a similar significant reduction of T cell LFA-1 surface expression and serum TNF alpha concentrations after 4 days of treatment with each regimen. There was no change in other lymphocyte adhesion molecule expression (ICAM-1, LFA-3 or CD2) at day 4, although LFA-3 and CD2 expression was moderately decreased at day 28 and day 90 respectively; nor was there any change in the distribution of lymphocyte subsets (CD4, CD8, and CD45RA, CD45RO), although a small decrease in CD45RO circulatory T cells was noted at day 28. This study suggests that some of the beneficial effects of glucocorticosteroids may be related to the inhibition of lymphocyte adhesion as well as the modulation of proinflammatory cytokines. The objective of this study was to investigate the feasibility of treating relapses of multiple sclerosis (MS) at home with oral dexamethasone. Twenty-five out of 28 consecutive patients with MS who presented with a relapse of less than 2 weeks' duration were treated on an open basis with oral dexamethasone 16 mg per day (four divided doses) for 5 consecutive days. After one week, the expanded disability status scale (EDSS) had improved by one or more grades in 88% (22 patients) and after 4 weeks in 92% (23 patients). Treatment was well tolerated. We conclude that a course of oral dexamethasone 16 mg per day shortens the duration of an exacerbation in MS in a similar way as seen after high dose i.v. methylprednisolone. Although a randomized study is needed to test this treatment regimen against i.v. high dose corticosteroids, oral dexamethasone can be used in situations when i.v. therapy is difficult to apply. Copyright 1999 Harcourt Publishers. Objective: To compare the efficacy, tolerability, and safety of IV methylprednisolone (IVMP) vs oral methylprednisolone (oMP) at equivalent high doses in patients with multiple sclerosis (MS) experiencing a recent relapse.
Methods: Patients with a clinical relapse within the previous 2 weeks and at least 1 gadolinium (Gd)-enhancing lesion on a screening brain MRI scan were included. Forty patients with MS were randomized to receive either 1 g/day for 5 days of oMP (20 patients) or 1 g/day for 5 days of IVMP (20 patients). Expanded Disability Status Scale (EDSS) and brain MRI (dual-echo and postcontrast T1-weighted scans) were assessed at baseline and at weeks 1 and 4. The study primary research question (endpoint) was to compare the efficacy of the 2 treatment routes in reducing the number of Gd-enhancing lesions after 1 week from treatment initiation. Secondary outcomes were safety, tolerability, and clinical efficacy profiles of the 2 routes of administration. Results: The 2 groups showed a reduction of Gd-enhancing lesions over time (p = 0.002 for oMP and p = 0.001 for IVMP) with a "non-inferiority effect" between the 2 routes of administration at week 1. Both groups showed an improvement of EDSS over time (p < 0.001) without between-group difference at week 4. Both treatments were well-tolerated and adverse events were minimal and occurred similarly in the 2 treatment arms. Conclusions: Oral methylprednisolone (oMP) is as effective as IV methylprednisolone in reducing gadolinium-enhancing lesions in patients with MS soon after an acute relapse, with similar clinical, safety, and tolerability profiles. This study provides class III evidence that 1 g oMP × 5 days is not inferior to 1 g IVMP × 5 days in reducing the number of gadolinium-enhancing lesions over a period of 1 week (mean difference in lesion reduction comparing IVMP to oMP is −20%, 95% confidence interval −48% to +5%). A randomised double-blind placebo-controlled trial of intravenous methylprednisolone versus oral methylprednisolone at equivalent high dose was carried out on 35 patients with an acute relapse of multiple sclerosis (MS).
After baseline evaluation each was randomly allocated to oral treatment and intravenous placebo or intravenous treatment and oral placebo, receiving 500 mg of methylprednisolone for five consecutive days and with reassessment at days five and twenty-eight. There was no significant difference in response when disability or functional scores were compared in the two groups. Adverse effects were minor and equally distributed. In this study oral treatment with methylprednisolone was as effective as intravenous treatment in acute relapse of MS.
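The 1250 mg oral prednisone vs 1 g IV methylprednisolone comparison in the bioequivalence study above follows directly from standard glucocorticoid potency tables (roughly 5 mg prednisone ≈ 4 mg methylprednisolone). A minimal sketch of that conversion; the potency values are textbook assumptions on my part, not data from these trials, and the function name is illustrative:

```python
# Approximate anti-inflammatory equivalency doses in mg (assumed textbook
# values, relative to one another; not taken from the trials above).
EQUIV_MG = {"prednisone": 5.0, "methylprednisolone": 4.0, "dexamethasone": 0.75}

def equivalent_dose(dose_mg, from_drug, to_drug):
    """Convert a dose of one glucocorticoid to the equipotent dose of another."""
    return dose_mg * EQUIV_MG[to_drug] / EQUIV_MG[from_drug]

# 1250 mg oral prednisone -> equipotent methylprednisolone dose
print(equivalent_dose(1250, "prednisone", "methylprednisolone"))  # 1000.0
```

This reproduces why 1250 mg prednisone was chosen as the comparator for 1 g methylprednisolone, consistent with the similar AUC the trial reported.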
2,067
27,178,983
OR_RR, followed by HR_PFS, had the strongest association with HR_OS at the trial level. However, these measures were not strong enough to replace OS.
INTRODUCTION Recent improvements in chemotherapy agents have prolonged postprogression survival of non-small cell lung cancer. Thus, primary outcomes other than overall survival (OS) have been frequently used for recent phase III trials to obtain quick results. However, no systematic review had assessed whether progression-free survival (PFS), response rate (RR), and disease control rate (DCR) can serve as surrogates for OS at the trial level in the phase III first-line chemotherapy setting.
Introduction: As a result of recent publications, we hypothesized that the period of 8 weeks after initiation of treatment is a useful landmark point for cytotoxic agents for advanced non-small cell lung cancer (NSCLC). To test this hypothesis, we conducted landmark analyses with clinical trials employing cytotoxic agents. Our goal was to assess the proper design of clinical trials with cytotoxic agents for NSCLC for maximizing patients' benefit. Methods: We conducted landmark analyses of a phase II study of pemetrexed in locally advanced or metastatic NSCLC and a phase III study of the Four-Arm Cooperative Study for advanced NSCLC. A total of 806 patients who received chemotherapy (pemetrexed, cisplatin and irinotecan, paclitaxel and carboplatin, cisplatin and gemcitabine, cisplatin and vinorelbine) were included in this assessment. Results: Tumor-shrinkage rate at 8 weeks was significantly associated with longer survival in the study with pemetrexed (p = 0.043), whereas tumor-shrinkage rate at 4 weeks did not correlate with survival (p = 0.139). Similarly, using the Four-Arm Cooperative Study data, the optimal landmark point was 8 weeks (p = 0.002), not 4 weeks (p = 0.190). Conclusion: The landmark point for NSCLC was 8 weeks with all cytotoxic agents in our analysis when the therapy was given as a frontline or subsequent therapy. Our result suggests the concept of a disease-specific landmark point, which may lead to a change of phase II/III clinical study design to evaluate cytotoxic agents, and clinical investigators and their sponsors may consider an early look to assess the efficacy of cytotoxic agents for NSCLC. Background: Bevacizumab, the anti-vascular endothelial growth factor agent, provides clinical benefit when combined with platinum-based chemotherapy in first-line advanced non-small-cell lung cancer. We report the final overall survival (OS) analysis from the phase III AVAiL trial.
Patients and methods: Patients (n = 1043) received cisplatin 80 mg/m2 and gemcitabine 1250 mg/m2 for up to six cycles plus bevacizumab 7.5 mg/kg (n = 345), bevacizumab 15 mg/kg (n = 351) or placebo (n = 347) every 3 weeks until progression. Primary end point was progression-free survival (PFS); OS was a secondary end point. Results: Significant PFS prolongation with bevacizumab compared with placebo was maintained with longer follow-up {hazard ratio (HR) [95% confidence interval (CI)] 0.75 (0.64–0.87), P = 0.0003 and 0.85 (0.73–1.00), P = 0.0456} for the 7.5 and 15 mg/kg groups, respectively. Median OS was >13 months in all treatment groups; nevertheless, OS was not significantly increased with bevacizumab [HR (95% CI) 0.93 (0.78–1.11), P = 0.420 and 1.03 (0.86–1.23), P = 0.761] for the 7.5 and 15 mg/kg groups, respectively, versus placebo. Most patients (∼62%) received multiple lines of poststudy treatment. Updated safety results are consistent with those previously reported. Conclusions: Final analysis of AVAiL confirms the efficacy of bevacizumab when combined with cisplatin–gemcitabine. The PFS benefit did not translate into a significant OS benefit, possibly due to high use of efficacious second-line therapies. Categorizations of best response observed at week 8 (between week 3 and 14) of first-line treatment in two studies of bevacizumab plus chemotherapy in Western (878 patients) and Chinese (198 patients) patients with non-small cell lung cancer were assessed together with baseline prognostic factors in multivariate parametric models to predict overall survival (OS) and progression-free survival (PFS). Predictive performances of the models were assessed by simulating multiple replicates of the studies. Disease control rate (DCR) was the best response categorization to predict OS and PFS. In the OS model, DCR fully captured bevacizumab effect.
For PFS, DCR did not fully capture bevacizumab treatment effect. The models adequately predicted OS and PFS distributions in each arm as well as bevacizumab hazard ratio (HR) for OS and PFS, for example, in Western patients (model prediction [95% prediction interval]: 0.84 [0.71-0.98] vs. observed: 0.77 for OS and 0.59 [0.49-0.72] vs. observed: 0.58 for PFS). Covariates in the models captured endpoint differences seen in Chinese patients. There was no impact of Chinese ethnicity on the DCR relationship to OS or PFS. DCR predicted OS benefit with bevacizumab in first-line NSCLC patients. Western data can be used to inform design of studies in Chinese patients. PURPOSE A traditional end point for colon adjuvant clinical trials is overall survival (OS), with 5 years demonstrating adequate follow-up. A shorter-term end point providing convincing evidence to allow treatment comparisons could significantly speed the translation of advances into practice. METHODS Individual patient data were pooled from 18 randomized phase III colon cancer adjuvant clinical trials. Trials included 43 arms, with a pooled sample size of 20,898 patients. The primary hypothesis was that disease-free survival (DFS), with 3 years of follow-up, is an appropriate primary end point to replace OS with 5 years of follow-up. RESULTS The recurrence rates for years 1 through 5 were 12%, 14%, 8%, 5%, and 3%, respectively. Median time from recurrence to death was 12 months. Eighty percent of recurrences were in the first 3 years; 91% of patients with recurrence by 3 years died before 5 years. Correlation between 3-year DFS and 5-year OS was 0.89. Comparing control versus experimental arms within each trial, the correlation between hazard ratios for DFS and OS was 0.92. Within-trial log-rank testing using both DFS and OS provided the same conclusion in 23 (92%) of 25 cases. Formal measures of surrogacy were satisfied.
CONCLUSION In patients treated on phase III adjuvant colon clinical trials, DFS and OS are highly correlated, both within patients and across trials. These results suggest that DFS after 3 years of median follow-up is an appropriate end point for adjuvant colon cancer clinical trials of fluorouracil-based regimens, although marginally significant DFS improvements may not translate into significant OS benefits. Background Following publication of the PRISMA statement, the UK Centre for Reviews and Dissemination (CRD) at the University of York in England began to develop an international prospective register of systematic reviews with health-related outcomes. The objectives were to reduce unplanned duplication of reviews and provide transparency in the review process, with the aim of minimizing reporting bias. Methods An international advisory group was formed and a consultation undertaken to establish the key items necessary for inclusion in the register and to gather views on various aspects of functionality. This article describes the development of the register, now called PROSPERO, and the process of registration. Results PROSPERO offers free registration and free public access to a unique prospective register of systematic reviews across all areas of health from all around the world. The dedicated web-based interface is electronically searchable and available to all prospective registrants.
At the moment, inclusion in PROSPERO is restricted to systematic reviews of the effects of interventions and strategies to prevent, diagnose, treat, and monitor health conditions, for which there is a health-related outcome. Ideally, registration should take place before the researchers have started formal screening against inclusion criteria, but reviews are eligible as long as they have not progressed beyond the point of completing data extraction. The required data set captures the key attributes of review design as well as the administrative details necessary for registration. Submitted registration forms are checked against the scope for inclusion in PROSPERO and for clarity of content before being made publicly available on the register, rejected, or returned to the applicant for clarification. The public records include an audit trail of major changes to planned methods, details of when the review has been completed, and links to resulting publications when provided by the authors. Conclusions There has been international support and an enthusiastic response to the principle of prospective registration of protocols for systematic reviews and to the development of PROSPERO. In October 2011, PROSPERO contained 200 records of systematic reviews being undertaken in 26 countries around the world on a diverse range of interventions. BACKGROUND Whether progression-free survival (PFS) or overall survival (OS) is the more appropriate endpoint in clinical trials of metastatic cancer is controversial. In some disease and treatment settings, an improvement in PFS does not result in an improved OS. METHODS We partitioned OS into two parts and expressed it as the sum of PFS and survival postprogression (SPP). We simulated randomized clinical trials with two arms that had respective medians for PFS of 6 and 9 months. We assumed no treatment difference in median SPP.
We found the probability of a statistically significant benefit in OS for various median SPP and observed P values for PFS. We compared the sample sizes required for PFS vs OS for various median SPP. We compared our results with the literature regarding surrogacy of PFS for OS by use of the correlation between hazard ratios for PFS and OS. All statistical tests were two-sided. RESULTS For a trial with observed P value for improvement in PFS of .001, there was a greater than 90% probability of statistical significance in OS if median SPP was 2 months but less than 20% if median SPP was 24 months. For a trial requiring 280 patients to detect a 3-month difference in PFS, 350 and 2440 patients, respectively, were required to have the same power for detecting a real difference in OS that is carried over from the 3-month benefit in PFS when the median SPP was 2 and 24 months. CONCLUSIONS Addressing SPP is important in understanding treatment effects. For clinical trials with a PFS benefit, lack of statistical significance in OS does not imply lack of improvement in OS, especially for diseases with long median SPP. Although there may be no treatment effect on SPP, its variability so dilutes the OS comparison that statistical significance is likely lost. OS is a reasonable primary endpoint when median SPP is short but is too high a bar when median SPP is long, such as longer than 12 months. BACKGROUND Many more phase II studies have favorable outcomes than the subsequent phase III trials. We used historical data from phase II and phase III studies for patients with extensive-stage small-cell lung cancer (SCLC) to generate a statistical model to provide assistance in selecting chemotherapy regimens from phase II studies for subsequent use in phase III randomized studies.
METHODS Information from 21 phase III trials for patients with extensive-stage SCLC initiated during the period from 1972 through 1990 was reviewed to identify those that were preceded by phase II studies of the same regimen. We used data from all the trial pairs to develop a statistical model in which the number of patients, the median survival of patients, and the number of deaths observed in the phase II trial are used to estimate the statistical power of the subsequent phase III trial. All statistical tests were two-sided. RESULTS Nine phase II studies were identified that preceded phase III trials of the same regimen. The regimens from two phase II studies with the greatest expected power in the phase III trial (0.62 and 0.58) both demonstrated significantly prolonged survival when compared with standard treatment in subsequent phase III trials (P < .001 and P = .002, respectively). The regimens from six of the other phase II studies, for which the median power expected in the phase III trial was 0.28 (range, 0.19-0.52), showed no difference when compared with standard treatment in a phase III trial. CONCLUSIONS Phase II studies for particular regimens that have an expected power of greater than 0.55 provide a reasonable basis for proceeding with a phase III trial. This paper is an overview of the new response evaluation criteria in solid tumours: revised RECIST guideline (version 1.1), with a focus on updated contents. Several challenging and often controversial issues arise in oncology trials with the use of the end point progression-free survival (PFS), defined to be the time to detection of progressive disease or death. While this end point does not directly measure how a patient feels, functions, or survives, it does provide insights about whether an intervention affects the tumor burden process, the intended mechanism through which it is hoped that most anticancer agents will provide benefit.
However, simply achieving statistically significant effects on PFS is insufficient to obtain reliable evidence of important clinical benefit, and is even insufficient to justify the conclusion that the experimental intervention is "reasonably likely to provide clinical benefit." The magnitude of the effect on PFS in addition to the statistical strength of evidence is of great importance in interpreting the reliability of the evidence regarding clinical efficacy. PFS has several important properties, including being a direct measure of the effect of treatment on the tumor burden process, being sensitive to cytostatic as well as cytotoxic mechanisms of interventions, and incorporating the clinically relevant event of death, increasing its sensitivity to influential harmful mechanisms and avoiding substantial bias that arises when deaths are censored. To obtain reliable evidence about the effect of an intervention on PFS and patient survival, randomized trials should be conducted where all patients are followed to progression and death, and where patients in a control arm do not cross in at progression unless the experimental regimen has already been established to be effective rescue treatment.
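The dilution argument above (OS = PFS + SPP, with long SPP washing out a real PFS benefit) can be illustrated with a back-of-the-envelope sketch. This is my own simplification, not the paper's simulation: it assumes exponential event times (mean = median / ln 2) and treats the ratio of mean OS between arms as a crude stand-in for the OS hazard ratio.

```python
import math

LN2 = math.log(2)

def mean_from_median(median):
    # For an exponential distribution, mean = median / ln 2.
    return median / LN2

def approx_os_hazard_ratio(med_pfs_ctrl, med_pfs_exp, med_spp):
    """Crude OS effect size: ratio of mean OS (control / experimental),
    with the same SPP distribution added to both arms. Values below 1
    favor the experimental arm; values near 1 mean the effect is diluted."""
    mean_spp = mean_from_median(med_spp)
    mean_os_ctrl = mean_from_median(med_pfs_ctrl) + mean_spp
    mean_os_exp = mean_from_median(med_pfs_exp) + mean_spp
    return mean_os_ctrl / mean_os_exp

# PFS medians of 6 vs 9 months, as in the paper's simulated trials.
for spp in (2, 12, 24):
    hr = approx_os_hazard_ratio(6, 9, spp)
    print(f"median SPP = {spp:2d} mo -> approx OS effect = {hr:.2f}")
```

With median SPP of 2 months the OS effect stays close to the PFS effect; at 24 months it drifts toward 1, matching the paper's point that OS becomes "too high a bar" when postprogression survival is long.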
2,068
25,678,991
Data from published trials revealed that encouraging results were achieved by autologous BMMNCs for the treatment of spinal cord injury.
BACKGROUND Cell-based therapies can be used to treat neurological diseases and spinal cord injuries. The aim of this study was to assess the clinical outcome of bone marrow-derived mononuclear cell (BM-MNC) transplantation in patients with spinal cord injuries.
BACKGROUND: Severe traumatic brain injury (TBI) in children is associated with substantial long-term morbidity and mortality. Currently, there are no successful neuroprotective/neuroreparative treatments for TBI. Numerous preclinical studies suggest that bone marrow-derived mononuclear cells (BMMNCs), their derivative cells (marrow stromal cells), or similar cells (umbilical cord blood cells) offer neuroprotection. OBJECTIVE: To determine whether autologous BMMNCs are a safe treatment for severe TBI in children. METHODS: Ten children aged 5 to 14 years with a postresuscitation Glasgow Coma Scale of 5 to 8 were treated with 6 × 10^6 autologous BMMNCs/kg body weight delivered intravenously within 48 hours after TBI. To determine the safety of the procedure, systemic and cerebral hemodynamics were monitored during bone marrow harvest; infusion-related toxicity was determined by pediatric logistic organ dysfunction (PELOD) scores, hepatic enzymes, Murray lung injury scores, and renal function. Conventional magnetic resonance imaging (cMRI) data were obtained at 1 and 6 months postinjury, as were neuropsychological and functional outcome measures. RESULTS: All patients survived. There were no episodes of harvest-related depression of systemic or cerebral hemodynamics. There was no detectable infusion-related toxicity as determined by PELOD score, hepatic enzymes, Murray lung injury scores, or renal function. cMRI imaging comparing gray matter, white matter, and CSF volumes showed no reduction from 1 to 6 months postinjury. Dichotomized Glasgow Outcome Score at 6 months showed 70% with good outcomes and 30% with moderate to severe disability.
CONCLUSION: Bone marrow harvest and intravenous mononuclear cell infusion as treatment for severe TBI in children is logistically feasible and safe. OBJECTIVE We sought to assess the safety and therapeutic efficacy of autologous human bone marrow-derived mononuclear cell transplantation on spinal cord injury in a phase I/II, nonrandomized, open-label study, conducted on 297 patients. MATERIALS AND METHODS We transplanted unmanipulated bone marrow mononuclear cells through a lumbar puncture, and assessed the outcome using standard neurologic investigations and the American Spinal Injury Association (ASIA) protocol, and with respect to safety, therapeutic time window, CD34-/+ cell count, and influence of sex and age. RESULTS No serious complications or adverse events were reported, except for minor reversible complaints. Sensory and motor improvements occurred in 32.6% of patients, and the time elapsed between the injury and the treatment considerably influenced the outcome of the therapy. The CD34-/+ cell count determined the state of improvement, or no improvement, but not the degree of improvement. No correlation was found between level of injury and improvement, and age and sex had no role in the outcome of the cellular therapy. CONCLUSION Transplant of autologous human bone marrow-derived mononuclear cells through a lumbar puncture is safe, and one-third of spinal cord injury patients show perceptible improvements in neurologic status. The time elapsed between injury and therapy and the number of CD34-/+ cells injected influenced the outcome of the therapy. To assess the safety and therapeutic efficacy of autologous human bone marrow cell (BMC) transplantation and the administration of granulocyte macrophage-colony stimulating factor (GM-CSF), a phase I/II open-label and nonrandomized study was conducted on 35 complete spinal cord injury patients.
The BMCs were transplanted by injection into the surrounding area of the spinal cord injury site within 14 days of injury (n = 17), between 14 days and 8 weeks (n = 6), and at more than 8 weeks (n = 12) after injury. In the control group, all patients (n = 13) were treated only with conventional decompression and fusion surgery without BMC transplantation. The patients underwent preoperative and follow-up neurological assessment using the American Spinal Injury Association Impairment Scale (AIS), electrophysiological monitoring, and magnetic resonance imaging (MRI). The mean follow-up period was 10.4 months after injury. At 4 months, the MRI analysis showed the enlargement of spinal cords and small enhancement of the cell implantation sites, without any adverse lesions such as malignant transformation, hemorrhage, new cysts, or infections. Furthermore, the BMC transplantation and GM-CSF administration were not associated with any serious adverse clinical events increasing morbidities. The AIS grade increased in 30.4% of the acute and subacute treated patients (AIS A to B or C), whereas no significant improvement was observed in the chronic treatment group. Increasing neuropathic pain during the treatment and tumor formation at the site of transplantation still remain to be investigated. A long-term and large-scale multicenter clinical study is required to determine its precise therapeutic effect. Disclosure of potential conflicts of interest is found at the end of this article. BACKGROUND AIMS Spinal cord injury is common among young subjects involved in motor vehicle accidents. Mechanisms and attempts to reverse post-traumatic pathophysiologic consequences are still being investigated. Unfortunately no effective and well-established treatment modality has been developed so far. The regeneration capability of the human nervous system following an injury is highly limited.
METHODS The study involved four patients (two male, two female) who had suffered spinal cord injury as a result of various types of trauma. On neurologic examination, all the patients were determined to be in American Spinal Injury Association (ASIA) grade A. All patients were treated with decompression, stabilization and fusion for vertebral trauma anteriorly, as well as intralesional implantation of cellular bone marrow concentrates using a posterior approach 1 month after the first operation. The patients were then treated and followed up in the physical rehabilitation clinic. RESULTS At the end of the post-operative 1-year follow-up, two of the patients were classified as ASIA C while one was classified as ASIA B. One patient showed no neurologic change; none of the patients suffered from any complications or adverse effects as a result of intralesional application of bone marrow cells. CONCLUSIONS The results of this experimental study show the potential contribution of intralesional implantation of bone marrow to neuronal regeneration in the injured spinal cord, with neuronal changes. In light of the results of this experimental study, the potential for regenerative treatment in injuries of the human spinal cord is no longer a speculation but an observation.
2,069
27,806,998
Conclusions: Carriers of CYP2C19 loss-of-function alleles are at greater risk of stroke and composite vascular events than noncarriers among patients with ischemic stroke or TIA treated with clopidogrel.
Background: The association of genetic polymorphisms and clopidogrel efficacy in patients with ischemic stroke or transient ischemic attack (TIA) remains controversial. We performed a systematic review and meta-analysis to assess the association between genetic polymorphisms, especially CYP2C19 genotype, and clopidogrel efficacy for ischemic stroke or TIA.
BACKGROUND The relative efficacy of streptokinase and tissue plasminogen activator and the roles of intravenous as compared with subcutaneous heparin as adjunctive therapy in acute myocardial infarction are unresolved questions. The current trial was designed to compare new, aggressive thrombolytic strategies with standard thrombolytic regimens in the treatment of acute myocardial infarction. Our hypothesis was that newer thrombolytic strategies that produce earlier and sustained reperfusion would improve survival. METHODS In 15 countries and 1081 hospitals, 41,021 patients with evolving myocardial infarction were randomly assigned to four different thrombolytic strategies, consisting of the use of streptokinase and subcutaneous heparin, streptokinase and intravenous heparin, accelerated tissue plasminogen activator (t-PA) and intravenous heparin, or a combination of streptokinase plus t-PA with intravenous heparin. ("Accelerated" refers to the administration of t-PA over a period of 1 1/2 hours, with two thirds of the dose given in the first 30 minutes, rather than the conventional period of 3 hours.) The primary end point was 30-day mortality. RESULTS The mortality rates in the four treatment groups were as follows: streptokinase and subcutaneous heparin, 7.2 percent; streptokinase and intravenous heparin, 7.4 percent; accelerated t-PA and intravenous heparin, 6.3 percent; and the combination of both thrombolytic agents with intravenous heparin, 7.0 percent. This represented a 14 percent reduction (95 percent confidence interval, 5.9 to 21.3 percent) in mortality for accelerated t-PA as compared with the two streptokinase-only strategies (P = 0.001).
The rates of hemorrhagic stroke were 0.49 percent, 0.54 percent, 0.72 percent, and 0.94 percent in the four groups, respectively, which represented a significant excess of hemorrhagic strokes for accelerated t-PA (P = 0.03) and for the combination strategy (P < 0.001), as compared with streptokinase only. A combined end point of death or disabling stroke was significantly lower in the accelerated-tPA group than in the streptokinase-only groups (6.9 percent vs. 7.8 percent, P = 0.006). CONCLUSIONS The findings of this large-scale trial indicate that accelerated t-PA given with intravenous heparin provides a survival benefit over previous standard thrombolytic regimens. IMPORTANCE Data are limited regarding the association between CYP2C19 genetic variants and clinical outcomes of patients with minor stroke or transient ischemic attack treated with clopidogrel. OBJECTIVE To estimate the association between CYP2C19 genetic variants and clinical outcomes of clopidogrel-treated patients with minor stroke or transient ischemic attack. DESIGN, SETTING, AND PARTICIPANTS Three CYP2C19 major alleles (*2, *3, *17) were genotyped among 2933 Chinese patients from 73 sites who were enrolled in the Clopidogrel in High-Risk Patients with Acute Nondisabling Cerebrovascular Events (CHANCE) randomized trial conducted from January 2, 2010, to March 20, 2012. INTERVENTIONS Patients with acute minor ischemic stroke or transient ischemic attack in the trial were randomized to treatment with clopidogrel combined with aspirin or to aspirin alone. MAIN OUTCOMES AND MEASURES The primary efficacy outcome was new stroke. The secondary efficacy outcome was a composite of new composite vascular events (ischemic stroke, hemorrhagic stroke, myocardial infarction, or vascular death). Bleeding was the safety outcome. RESULTS Among 2933 patients, 1948 (66.4%) were men, with a mean age of 62.4 years.
Overall, 1207 patients (41.2%) were noncarriers and 1726 patients (58.8%) were carriers of loss-of-function alleles (*2, *3). After day 90 follow-up, clopidogrel-aspirin reduced the rate of new stroke in the noncarriers but not in the carriers of the loss-of-function alleles (P = .02 for interaction; events among noncarriers, 41 [6.7%] with clopidogrel-aspirin vs 74 [12.4%] with aspirin; hazard ratio [HR], 0.51 [95% CI, 0.35-0.75]; events among carriers, 80 [9.4%] with clopidogrel-aspirin vs 94 [10.8%] with aspirin; HR, 0.93 [95% CI, 0.69 to 1.26]). Similar results were observed for the secondary composite efficacy outcome (noncarriers: 41 [6.7%] with clopidogrel-aspirin vs 75 [12.5%] with aspirin; HR, 0.50 [95% CI, 0.34-0.74]; carriers: 80 [9.4%] with clopidogrel-aspirin vs 95 [10.9%] with aspirin; HR, 0.92 [95% CI, 0.68-1.24]; P = .02 for interaction). The effect of treatment assignment on bleeding did not vary significantly between the carriers and the noncarriers of the loss-of-function alleles (2.3% for carriers and 2.5% for noncarriers in the clopidogrel-aspirin group vs 1.4% for carriers and 1.7% for noncarriers in the aspirin-only group; P = .78 for interaction). CONCLUSIONS AND RELEVANCE Among patients with minor ischemic stroke or transient ischemic attack, the use of clopidogrel plus aspirin compared with aspirin alone reduced the risk of a new stroke only in the subgroup of patients who were not carriers of the CYP2C19 loss-of-function alleles. These findings support a role of CYP2C19 genotype in the efficacy of this treatment. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT00979589. OBJECT Symptomatic intracranial atherosclerotic disease (ICAD) has a high risk of recurrent stroke. Genetic polymorphisms in CYP2C19 and CES1 are associated with adverse outcomes in cardiovascular patients, but have not been studied in ICAD.
The authors studied CYP2C19 and CES1 single-nucleotide polymorphisms ( SNPs ) in symptomatic ICAD patients . METHODS Genotype testing for CYP2C19 * 2 , (*)3 , (*)8 , (*)17 and CES1 G143E was performed on 188 adult symptomatic ICAD patients from 3 medical centers who were medically managed with clopidogrel and aspirin . Testing was performed prospectively at 1 center , and retrospectively from a DNA sample biorepository at 2 centers . Multiple logistic regression and Cox regression analysis were performed to assess the association of these SNPs with the primary endpoint , which was a composite of transient ischemic attack ( TIA ) , stroke , myocardial infarction , or death within 12 months . RESULTS The primary endpoint occurred in 14.9 % of the 188 cases . In multiple logistic regression analysis , the presence of the CYP2C19 loss of function ( LOF ) alleles * 2 , * 3 , and * 8 in the medically managed patients was associated with lower odds of primary endpoint compared with wild-type homozygotes ( odds ratio [ OR ] 0.13 , 95 % CI 0.03 - 0.62 , p = 0.0101 ) . Cox regression analysis demonstrated the CYP2C19 LOF carriers had a lower risk for the primary endpoint , with hazard ratio ( HR ) of 0.27 ( 95 % CI 0.08 - 0.95 ) , p = 0.041 . A sensitivity analysis of a secondary composite endpoint of TIA , stroke , or death demonstrated a significant trend in multiple logistic regression analysis of CYP2C19 variants , with lower odds of secondary endpoint in patients carrying at least 1 LOF allele ( * 2 , * 3 , * 8) than in wild-type homozygotes ( OR 0.27 , 95 % CI 0.06 - 1.16 , p = 0.078 ) . Cox regression analysis demonstrated that the carriers of CYP2C19 LOF alleles had a lower risk for the secondary composite endpoint ( HR 0.22 , 95 % CI 0.05 - 1.04 , p = 0.056 ) . CONCLUSIONS This is the first study examining genetic variants and their effects in symptomatic ICAD . 
Variant alleles of CYP2C19 ( * 2 , * 3 , * 8) were associated with lower odds of the primary and secondary composite endpoints . However , the direction of the association was opposite of what is expected based on this SNP . This may reflect an incomplete understanding of this genetic variation and its effect in symptomatic ICAD and warrants further investigations BACKGROUND Many clinical trials have evaluated the benefit of long-term use of antiplatelet drugs in reducing the risk of clinical thrombotic events . Aspirin and ticlopidine have been shown to be effective , but both have potentially serious adverse effects . Clopidogrel , a new thienopyridine derivative similar to ticlopidine , is an inhibitor of platelet aggregation induced by adenosine diphosphate . METHODS CAPRIE was a randomised , blinded , international trial designed to assess the relative efficacy of clopidogrel ( 75 mg once daily ) and aspirin ( 325 mg once daily ) in reducing the risk of a composite outcome cluster of ischaemic stroke , myocardial infarction , or vascular death ; their relative safety was also assessed . The population studied comprised subgroups of patients with atherosclerotic vascular disease manifested as either recent ischaemic stroke , recent myocardial infarction , or symptomatic peripheral arterial disease . Patients were followed for 1 to 3 years . FINDINGS 19,185 patients , with more than 6300 in each of the clinical subgroups , were recruited over 3 years , with a mean follow-up of 1.91 years . There were 1960 first events included in the outcome cluster on which an intention-to-treat analysis showed that patients treated with clopidogrel had an annual 5.32 % risk of ischaemic stroke , myocardial infarction , or vascular death compared with 5.83 % with aspirin . These rates reflect a statistically significant ( p = 0.043 ) relative-risk reduction of 8.7 % in favour of clopidogrel ( 95 % CI 0.3 - 16.5 ) . 
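As a sanity check on the CAPRIE figures above , the quoted 8.7 % relative-risk reduction follows directly from the two annual event risks ( 5.32 % with clopidogrel vs 5.83 % with aspirin ) . A minimal Python sketch of that arithmetic ( the function name is illustrative ; the trial 's confidence interval came from survival analysis , not from this simple ratio ) :

```python
def relative_risk_reduction(risk_treated: float, risk_control: float) -> float:
    """RRR as a percentage: (control risk - treated risk) / control risk * 100."""
    return (risk_control - risk_treated) / risk_control * 100.0

# Annual event risks reported in CAPRIE: 5.32% (clopidogrel) vs 5.83% (aspirin)
print(round(relative_risk_reduction(5.32, 5.83), 1))  # -> 8.7
```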
Corresponding on-treatment analysis yielded a relative-risk reduction of 9.4 % . There were no major differences in terms of safety . Reported adverse experiences in the clopidogrel and aspirin groups judged to be severe included rash ( 0.26 % vs 0.10 % ) , diarrhoea ( 0.23 % vs 0.11 % ) , upper gastrointestinal discomfort ( 0.97 % vs 1.22 % ) , intracranial haemorrhage ( 0.33 % vs 0.47 % ) , and gastrointestinal haemorrhage ( 0.52 % vs 0.72 % ) , respectively . There were ten ( 0.10 % ) patients in the clopidogrel group with significant reductions in neutrophils ( < 1.2 x 10(9)/L ) and 16 ( 0.17 % ) in the aspirin group . INTERPRETATION Long-term administration of clopidogrel to patients with atherosclerotic vascular disease is more effective than aspirin in reducing the combined risk of ischaemic stroke , myocardial infarction , or vascular death . The overall safety profile of clopidogrel is at least as good as that of medium-dose aspirin BACKGROUND Stroke is common during the first few weeks after a transient ischemic attack ( TIA ) or minor ischemic stroke . Combination therapy with clopidogrel and aspirin may provide greater protection against subsequent stroke than aspirin alone . METHODS In a randomized , double-blind , placebo-controlled trial conducted at 114 centers in China , we randomly assigned 5170 patients within 24 hours after the onset of minor ischemic stroke or high-risk TIA to combination therapy with clopidogrel and aspirin ( clopidogrel at an initial dose of 300 mg , followed by 75 mg per day for 90 days , plus aspirin at a dose of 75 mg per day for the first 21 days ) or to placebo plus aspirin ( 75 mg per day for 90 days ) . All participants received open-label aspirin at a clinician-determined dose of 75 to 300 mg on day 1 . The primary outcome was stroke ( ischemic or hemorrhagic ) during 90 days of follow-up in an intention-to-treat analysis . 
Treatment differences were assessed with the use of a Cox proportional-hazards model , with study center as a random effect . RESULTS Stroke occurred in 8.2 % of patients in the clopidogrel-aspirin group , as compared with 11.7 % of those in the aspirin group ( hazard ratio , 0.68 ; 95 % confidence interval , 0.57 to 0.81 ; P<0.001 ) . Moderate or severe hemorrhage occurred in seven patients ( 0.3 % ) in the clopidogrel-aspirin group and in eight ( 0.3 % ) in the aspirin group ( P=0.73 ) ; the rate of hemorrhagic stroke was 0.3 % in each group . CONCLUSIONS Among patients with TIA or minor stroke who can be treated within 24 hours after the onset of symptoms , the combination of clopidogrel and aspirin is superior to aspirin alone for reducing the risk of stroke in the first 90 days and does not increase the risk of hemorrhage . ( Funded by the Ministry of Science and Technology of the People 's Republic of China ; CHANCE ClinicalTrials.gov number , NCT00979589 . ) Background Ischemic stroke and other vascular outcomes occur in 10–20 % of patients in the three-months following a transient ischemic attack or minor ischemic stroke , and many are disabling . The highest risk period for these outcomes is the early hours and days immediately following the ischemic event . Aspirin is the most common antithrombotic treatment used for these patients . Aim The aim of POINT is to determine whether clopidogrel plus aspirin taken < 12 h after transient ischemic attack or minor ischemic stroke symptom onset is more effective in preventing major ischemic vascular events at 90 days in the high-risk , and acceptably safe , compared with aspirin alone . Design POINT is a prospective , randomized , double-blind , multicenter trial in patients with transient ischemic attack or minor ischemic stroke . 
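The crude 90-day event proportions quoted above for CHANCE ( stroke in 8.2 % vs 11.7 % of patients ) can be turned into an unadjusted risk ratio as a rough cross-check on the published hazard ratio of 0.68 ; the two differ slightly because the trial used a Cox model with study center as a random effect . A hedged sketch , not the trial 's actual analysis :

```python
def risk_ratio(p_treated: float, p_control: float) -> float:
    """Unadjusted risk ratio from two event proportions (in the same units)."""
    return p_treated / p_control

# CHANCE: stroke in 8.2% (clopidogrel-aspirin) vs 11.7% (aspirin alone)
print(round(risk_ratio(8.2, 11.7), 2))  # -> 0.7, near the reported HR of 0.68
```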
Subjects are randomized to clopidogrel ( 600 mg loading dose followed by 75 mg/day ) or matching placebo , and all will receive open-label aspirin 50–325 mg/day , with a dose of 162 mg daily for five-days followed by 81 mg daily strongly recommended . Study Outcomes The primary efficacy outcome is the composite of new ischemic vascular events — ischemic stroke , myocardial infarction , or ischemic vascular death — by 90 days . The primary safety outcome is major hemorrhage , which includes symptomatic intracranial hemorrhage . Discussion Aspirin is the most common antithrombotic given to patients with a stroke or transient ischemic attack , as it reduces the risk of subsequent stroke . This trial expects to determine whether more aggressive antithrombotic therapy with clopidogrel plus aspirin , initiated acutely , is more effective than aspirin alone
2,070
28,539,181
In conclusion , our data suggest that hypovitaminosis D is associated with an elevated risk of future diabetes in older people .
Low serum levels of 25-hydroxyvitamin D ( 25OHD ) ( hypovitaminosis D ) are common in older adults and associated with several negative outcomes . The association between hypovitaminosis D and diabetes in older adults is equivocal , however . We conducted a meta-analysis investigating if hypovitaminosis D is associated with diabetes in prospective studies among older participants .
Summary Vitamin D status was assessed in 142 elderly Dutchmen participating in a prospective population-based study of environmental factors in the aetiology of non-insulin-dependent diabetes mellitus . Of the men aged 70–88 years examined between March and May 1990 , 39 % were vitamin D depleted . After adjustment for confounding by age , BMI , physical activity , month of sampling , cigarette smoking and alcohol intake the 1-h glucose and area under the glucose curve during a standard 75-g oral glucose tolerance test ( OGTT ) were inversely associated with the serum concentration of 25-OH vitamin D ( r = −0.23 , p < 0.01 ; r = −0.26 , p < 0.01 , respectively ) . After excluding newly diagnosed diabetic patients total insulin concentrations during OGTT were also inversely associated with the concentration of 25-OH vitamin D ( r = −0.18 to −0.23 , p < 0.05 ) . Hypovitaminosis D may be a significant risk factor for glucose intolerance . [ Diabetologia ( 1997 ) 40 : 344–347 More than 25 % of the U.S. population aged ≥65 years has diabetes ( 1 ) , and the aging of the overall population is a significant driver of the diabetes epidemic . Although the burden of diabetes is often described in terms of its impact on working-age adults , diabetes in older adults is linked to higher mortality , reduced functional status , and increased risk of institutionalization ( 2 ) . Older adults with diabetes are at substantial risk for both acute and chronic microvascular and cardiovascular complications of the disease . Despite having the highest prevalence of diabetes of any age-group , older persons and/or those with multiple comorbidities have often been excluded from randomized controlled trials of treatments— and treatment targets — for diabetes and its associated conditions . 
Heterogeneity of health status of older adults ( even within an age range ) and the dearth of evidence from clinical trials present challenges to determining standard intervention strategies that fit all older adults . To address these issues , the American Diabetes Association ( ADA ) convened a Consensus Development Conference on Diabetes and Older Adults ( defined as those aged ≥65 years ) in February 2012 . Following a series of scientific presentations by experts in the field , the writing group independently developed this consensus report to address the following questions : 1 . What is the epidemiology and pathogenesis of diabetes in older adults ? 2 . What is the evidence for preventing and treating diabetes and its common comorbidities in older adults ? 3 . What current guidelines exist for treating diabetes in older adults ? 4 . What issues need to be considered in individualizing treatment recommendations for older adults ? 5 . What are consensus recommendations for treating older adults with or at risk for diabetes ? 6 . How can gaps in the evidence best be filled ? According to the most recent surveillance data , the prevalence of diabetes among U.S. adults aged ≥65 years varies from 22 to 33 % , depending on the diagnostic criteria OBJECTIVE To examine whether lower serum levels of serum 25-hydroxyvitamin ( OH ) D [ 25(OH)D ] are associated with increased risk of developing type 2 diabetes . RESEARCH DESIGN AND METHODS A post hoc analysis of three nested case-control studies of fractures , colon cancer , and breast cancer that measured serum 25(OH)D levels in women participating in the Women ’s Health Initiative ( WHI ) Clinical Trials and Observational Study who were free of prevalent diabetes at baseline . Diabetes was defined as self-report of physician diagnosis or receiving insulin or oral hypoglycemic medication . We used inverse probability weighting to make the study population representative of the WHI population as a whole . 
Weighted logistic regression models compared 25(OH)D levels ( divided into quartiles , clinical cut points [ < 50 , 50–<75 , ≥75 nmol/L ] , or as a continuous variable ) using the distribution of control subjects and adjusted for multiple confounding factors . RESULTS Of 5,140 women ( mean age 66 years ) followed for an average of 7.3 years , 317 ( 6.2 % ) developed diabetes . Regardless of the cut points used or as a continuous variable , 25(OH)D levels were not associated with diabetes incidence in either age or fully adjusted models . Nor was any relationship found between 25(OH)D and incident diabetes when evaluated by strata of BMI , race/ethnicity , or randomization status in the Calcium Vitamin D trial . CONCLUSIONS Lower serum 25(OH)D levels were not associated with increased risk of developing type 2 diabetes in this racially and ethnically diverse population of postmenopausal women OBJECTIVE Institutionalised elderly people at northern latitudes may be at elevated risk for vitamin D deficiency . In addition to osteoporosis-related disorders , vitamin D deficiency may influence several medical conditions conferring an increased mortality risk . The aim of this study was to explore the prevalence of vitamin D deficiency and its association with mortality . DESIGN The Study of Health and Drugs in the Elderly ( SHADES ) is a prospective cohort study among elderly people ( > 65 years ) in 11 nursing homes in Sweden . METHODS We analysed the levels of 25-hydroxyvitamin D₃ ( 25(OH)D₃ ) at baseline . Vital status of the subjects was ascertained and hazard ratios ( HRs ) for mortality according to 25(OH)D₃ quartiles were calculated . RESULTS We examined 333 study participants with a mean follow-up of 3 years . A total of 147 ( 44 % ) patients died within this period . 
Compared with the subjects in Q4 ( 25(OH)D₃ > 48 nmol/l ) , HR ( with 95 % CI ) for mortality was 2.02 ( 1.31 - 3.12 ) in Q1 ( 25(OH)D₃ < 29 nmol/l ) ( P<0.05 ) ; 2.03 ( 1.32 - 3.14 ) in Q2 ( 25(OH)D₃ 30 - 37 nmol/l ) ( P<0.05 ) and 1.6 ( 1.03 - 2.48 ) in Q3 ( 25(OH)D₃ 38 - 47 nmol/l ) ( P<0.05 ) . The mean 25(OH)D₃ concentration was 40.2 nmol/l ( S.D. 16.0 ) and 80 % had 25(OH)D₃ below 50 nmol/l . The vitamin D levels decreased from baseline to the second and third measurements . CONCLUSIONS Vitamin D deficiency was highly prevalent and associated with increased mortality among the elderly in Swedish nursing homes . Strategies are needed to prevent , and maybe treat , vitamin D deficiency in the elderly in nursing homes and the benefit of vitamin D supplementation should be evaluated in randomised clinical trials BACKGROUND The combined impact of lifestyle factors on incidence of diabetes mellitus later in life is not well established . The objective of this study was to determine how lifestyle factors , assessed in combination , relate to new-onset diabetes in a broad and relatively unselected population of older adults . METHODS We prospectively examined associations of lifestyle factors , measured using repeated assessments later in life , with incident diabetes mellitus during a 10-year period ( 1989 - 1998 ) among 4883 men and women 65 years or older ( mean [ SD ] age at baseline , 73 [ 6 ] years ) enrolled in the Cardiovascular Health Study . 
Low-risk lifestyle groups were defined by physical activity level ( leisure-time activity and walking pace ) above the median ; dietary score ( higher fiber intake and polyunsaturated to saturated fat ratio , lower trans-fat intake and lower mean glycemic index ) in the top 2 quintiles ; never smoked or former smoker more than 20 years ago or for fewer than 5 pack-years ; alcohol use ( predominantly light or moderate ) ; body mass index less than 25 ( calculated as weight in kilograms divided by height in meters squared ) ; and waist circumference of 88 cm for women or 92 cm for men . The main outcome measure was incident diabetes defined annually by new use of insulin or oral hypoglycemic medications . We also evaluated fasting and 2-hour postchallenge glucose levels . RESULTS During 34,539 person-years , 337 new cases of drug-treated diabetes mellitus occurred ( 9.8 per 1000 person-years ) . After adjustment for age , sex , race , educational level , and annual income , each lifestyle factor was independently associated with incident diabetes . Overall , the rate of incident diabetes was 35 % lower ( relative risk , 0.65 ; 95 % confidence interval , 0.59 - 0.71 ) for each 1 additional lifestyle factor in the low-risk group . Participants whose physical activity level and dietary , smoking , and alcohol habits were all in the low-risk group had an 82 % lower incidence of diabetes ( relative risk , 0.18 ; 95 % confidence interval , 0.06 - 0.56 ) compared with all other participants . When absence of adiposity ( either body mass index < 25 or waist circumference < or = 88/92 cm for women/men ) was added to the other 4 low-risk lifestyle factors , incidence of diabetes was 89 % lower ( relative risk , 0.11 ; 95 % confidence interval , 0.01 - 0.76 ) . Overall , 9 of 10 new cases of diabetes appeared to be attributable to these 5 lifestyle factors . 
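The incidence figure quoted above for the Cardiovascular Health Study ( 337 new drug-treated diabetes cases over 34,539 person-years , reported as 9.8 per 1000 person-years ) is a straightforward person-time rate . A minimal sketch of the calculation ( function name is illustrative only ) :

```python
def incidence_per_1000_py(cases: int, person_years: float) -> float:
    """Incidence rate per 1000 person-years of follow-up."""
    return cases / person_years * 1000.0

# Cardiovascular Health Study: 337 incident cases over 34,539 person-years
print(round(incidence_per_1000_py(337, 34_539), 1))  # -> 9.8
```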
Associations were slightly attenuated , but still highly significant , for incident diabetes defined by medication use or glucose level . CONCLUSION Even later in life , combined lifestyle factors are associated with a markedly lower incidence of new-onset diabetes mellitus BACKGROUND Vitamin D insufficiency was shown to be associated with adverse musculoskeletal and nonskeletal outcomes in numerous observational studies . However , some studies did not control for confounding factors such as age or seasonal variation of 25-hydroxyvitamin D [ 25(OH)D ] . OBJECTIVE We sought to determine the effect of vitamin D status on health outcomes . DESIGN Healthy community-dwelling women ( n = 1471 ) with a mean age of 74 y were followed in a 5-y trial of calcium supplementation . 25(OH)D was measured at baseline in all women . Skeletal and nonskeletal outcomes were evaluated according to seasonally adjusted vitamin D status at baseline . RESULTS Fifty percent of women had a seasonally adjusted 25(OH)D concentration < 50 nmol/L. These women were significantly older , heavier , and less physically active and had more comorbidities than women with a seasonally adjusted 25(OH)D concentration > or = 50 nmol/L. Women with a seasonally adjusted 25(OH)D concentration < 50 nmol/L had an increased incidence of stroke and cardiovascular events that did not persist after adjustment for between-group differences in age or comorbidities . Women with a seasonally adjusted 25(OH)D concentration < 50 nmol/L were not at increased risk of adverse consequences for any musculoskeletal outcome , including fracture , falls , bone density , or grip strength or any nonskeletal outcomes , including death , myocardial infarction , cancer , heart failure , diabetes , or adverse changes in blood pressure , weight , body composition , cholesterol , or glucose . CONCLUSIONS Vitamin D insufficiency is more common in older , frailer women . 
Community-dwelling older women with a seasonally adjusted 25(OH)D concentration < 50 nmol/L were not at risk of adverse outcomes over 5 y after control for comorbidities . Randomized placebo-controlled trials are needed to determine whether vitamin D supplementation in individuals with vitamin D insufficiency influences health outcomes . This trial was registered at www.anzctr.org.au as ACTRN 012605000242628
2,071
24,054,931
Available evidence seems to suggest that testosterone replacement therapy is able to improve central obesity ( subjects with MetS ) and glycometabolic control ( patients with MetS and T2DM ) , as well as to increase lean body mass ( HIV , chronic obstructive pulmonary disease ) and to improve insulin resistance ( MetS ) and peripheral oxygenation ( chronic kidney diseases ) .
Late-onset hypogonadism ( LOH ) is a relatively common condition affecting the aging male . The aim of this review is to summarize the available evidence regarding LOH and its interaction with general health . LOH is often comorbid with obesity and several chronic diseases .
Secondary hypogonadism is not an infrequent abnormality in older patients presenting with the primary complaint of erectile dysfunction . Because of the role of testosterone in mediating sexual desire and erectile function in men , these patients are usually treated with exogenous testosterone , which , while elevating the circulating androgens , suppresses gonadotropins from the hypothalamic-pituitary axis . The response of this form of therapy , although extolled in the lay literature , has usually not been effective in restoring or even improving sexual function . This failure of response could be the result of suppression of gonadotropins or the lack of a cause and effect relationship between sexual function and circulating androgens in this group of patients . Further , because exogenous testosterone can potentially increase the risk of prostate disease , it is important to be sure of the benefit sought , i.e. an increase in sexual function . In an attempt to answer this question , we measured the hormone levels and studied the sexual function in 17 patients with erectile dysfunction who were found to have secondary hypogonadism . This double blind , placebo-controlled , cross-over study consisted of treatment with clomiphene citrate and a placebo for 2 months each . Similar to our previous observations , LH , FSH , and total and free testosterone levels showed a significant elevation in response to clomiphene citrate over the response to placebo . However , sexual function , as monitored by questionnaires and nocturnal penile tumescence and rigidity testing , did not improve except for some limited parameters in younger and healthier men . 
The results confirmed that there can be a functional secondary hypogonadism in men on an out-patient basis , but correlation of the hormonal status does not universally reverse the associated erectile dysfunction to normal , thus requiring closer scrutiny of claims of cause and effect relationships between hypogonadism and erectile dysfunction OBJECTIVE To evaluate the effect of clomiphene in men with hypogonadism and conventionally treated nonfunctioning pituitary adenomas ( NFPA ) . PATIENTS AND METHODS Open label , single-arm , prospective trial . Nine hypogonadal men ( testosterone < 300 ng/dL and low/normal LH ) with previously treated NFPA . Clomiphene ( 50 mg/day orally ) for 12 weeks . Testosterone , estradiol , LH , FSH , prolactin and erectile function were evaluated before and after 10 days , 4 , 8 and 12 weeks of clomiphene treatment . RESULTS After clomiphene treatment , testosterone and erectile function improved in only one patient . In the remaining eight patients , testosterone levels decreased whereas LH , FSH , and estradiol remained unchanged . Insulin sensitivity increased in unresponsive patients . CONCLUSIONS Compared with hypogonadal men with prolactinomas under dopaminergic therapy , clomiphene treatment failed to restore normal testosterone levels in most patients with conventionally treated NFPA The objective of this study was to assess the effects of oral testosterone supplementation therapy on glucose homeostasis , obesity and sexual function in middle-aged men with type 2 diabetes and mild androgen deficiency . Forty-eight middle-aged men , with type 2 diabetes , ( visceral ) obesity and symptoms of androgen deficiency , were included in this open-label study . Twenty-four subjects received testosterone undecanoate ( TU ; 120 mg daily , for 3 months ) ; 24 subjects received no treatment . Body composition was analyzed by bio-impedance . Parameters of metabolic control were determined . 
Symptoms of androgen deficiency and erectile dysfunction were scored by self-administered questionnaires . TU had a positive effect on ( visceral ) obesity : statistically significant reduction in body weight ( 2.66 % ) , waist-hip ratio ( -3.96 % ) and body fat ( -5.65 % ) ; negligible changes were found in the control group . TU significantly improved metabolic control : decrease in blood glucose values and mean glycated hemoglobin ( HbA1c ) ( from 10.4 to 8.6 % ) . TU treatment significantly improved symptoms of androgen deficiency ( including erectile dysfunction ) , with virtually no change in the control group . There were no adverse effects on blood pressure or hematological , biochemical and lipid parameters , and no adverse events . Oral TU treatment of type 2 diabetic men with androgen deficiency improves glucose homeostasis and body composition ( decrease in visceral obesity ) , and improves symptoms of androgen deficiency ( including erectile dysfunction ) . In these men , the benefit of testosterone supplementation therapy exceeds the correction of symptoms of androgen deficiency and also includes glucose homeostasis and metabolic control Although weight loss associated with human immunodeficiency virus ( HIV ) infection is multifactorial in its pathogenesis , it has been speculated that hypogonadism , a common occurrence in HIV disease , contributes to depletion of lean tissue and muscle dysfunction . We , therefore , examined the effects of testosterone replacement by means of Androderm , a permeation-enhanced , nongenital transdermal system , on lean body mass , body weight , muscle strength , health-related quality of life , and HIV-disease markers . We randomly assigned 41 HIV-infected , ambulatory men , 18 - 60 yr of age , with serum testosterone levels below 400 ng/dL , to 1 of 2 treatment groups : group I , two placebo patches ( n = 21 ) ; or group II , two testosterone patches designed to release 5 mg testosterone over 24 h. 
Eighteen men in the placebo group and 14 men in the testosterone group completed the 12-week treatment . Serum total and free testosterone and dihydrotestosterone levels increased , and LH and FSH levels decreased in the testosterone-treated , but not in the placebo-treated , men . Lean body mass and fat-free mass , measured by dual energy x-ray absorptiometry , increased significantly in men receiving testosterone patches [ change in lean body mass , + 1.345 +/- 0.533 kg ( P = 0.02 compared to no change ) ; change in fat-free mass , + 1.364 +/- 0.525 kg ( P = 0.02 compared to no change ) ] , but did not change in the placebo group [ change in lean body mass , 0.189 +/- 0.470 kg ( P = NS compared to no change ) ; change in fat-free mass , 0.186 +/- 0.470 kg ( P = NS compared to no change ) ] . However , there was no significant difference between the 2 treatment groups in the change in lean body mass . The change in lean body mass during treatment was moderately correlated with the increment in serum testosterone levels ( r = 0.41 ; P = 0.02 ) . The testosterone-treated men experienced a greater decrease in fat mass than those receiving placebo patches ( P = 0.04 ) . There was no significant change in body weight in either treatment group . Changes in overall quality of life scores did not correlate with testosterone treatment ; however , in the subcategory of role limitation due to emotional problems , the men in the testosterone group improved an average of 43 points of a 0 - 100 possible score , whereas those in the placebo group did not change . Red cell count increased in the testosterone group ( change in red cell count , + 0.1 +/- 0.1 10(12)/L ) but decreased in the placebo group ( change in red cell count , -0.2 +/- 0.1 10(12)/L ) . CD4 + and CD8 + T cell counts and plasma HIV copy number did not significantly change during treatment . Serum prostate-specific antigen and plasma lipid levels did not change in either treatment group . 
Testosterone replacement in HIV-infected men with low testosterone levels is safe and is associated with a 1.35-kg gain in lean body mass , a significantly greater reduction in fat mass than that achieved with placebo treatment , an increased red cell count , and an improvement in role limitation due to emotional problems . Further studies are needed to assess whether testosterone supplementation can produce clinically meaningful changes in muscle function and disease outcome in HIV-infected men Dysfunction of the muscles of ambulation contributes to exercise intolerance in chronic obstructive pulmonary disease ( COPD ) . Men with COPD have high prevalence of low testosterone levels , which may contribute to muscle weakness . We determined effects of testosterone supplementation ( 100 mg of testosterone enanthate injected weekly ) with or without resistance training ( 45 minutes three times weekly ) on body composition and muscle function in 47 men with COPD ( mean FEV(1) = 40 % predicted ) and low testosterone levels ( mean = 320 ng/dl ) . Subjects were randomized to 10 weeks of placebo injections + no training , testosterone injections + no training , placebo injections + resistance training , or testosterone injections + resistance training . Testosterone injections yielded a mean increase of 271 ng/dl in the nadir serum testosterone concentration ( to the middle of the normal range for young men ) . The lean body mass ( by dual-energy X-ray absorptiometry ) increase averaged 2.3 kg with testosterone alone and 3.3 kg with combined testosterone and resistance training ( p < 0.001 ) . Increase in one-repetition maximum leg press strength averaged 17.2 % with testosterone alone , 17.4 % with resistance training alone , and 26.8 % with testosterone + resistance training ( p < 0.001 ) . Interventions were well tolerated with no abnormalities in safety measures . 
Further studies are required to determine long-term benefits of adding testosterone supplementation and resistance training to rehabilitative programs for carefully screened men with COPD and low testosterone levels BACKGROUND Whole body and abdominal obesity are associated with increased risk of diabetes mellitus and heart disease . The effects of testosterone therapy on whole body and visceral fat mass in HIV-infected men with abdominal obesity are unknown . OBJECTIVE The objective of this study was to determine the effects of testosterone therapy on intraabdominal fat mass and whole body fat distribution in HIV-infected men with abdominal obesity . METHODS In this multicenter , randomized , placebo-controlled , double-blind trial , 88 HIV-positive men with abdominal obesity ( waist-to-hip ratio > 0.95 or mid-waist circumference > 100 cm ) and total testosterone 125 - 400 ng/dl , or bioavailable testosterone less than 115 ng/dl , or free testosterone less than 50 pg/ml on stable antiretroviral regimen , and HIV RNA less than 10,000 copies per milliliter were randomized to receive 10 g testosterone gel or placebo daily for 24 wk . Fat mass and distribution were determined by abdominal computerized tomography and dual energy x-ray absorptiometry during wk 0 , 12 , and 24 . We used an intention-to-treat approach and nonparametric statistical methods . RESULTS Baseline characteristics were balanced between groups . In 75 subjects evaluated , median percent change from baseline to wk 24 in visceral fat did not differ significantly between groups ( testosterone 0.3 % , placebo 3.1 % , P = 0.75 ) . Total ( testosterone -1.5 % , placebo 4.3 % , P = 0.04 ) and sc ( testosterone -7.2 % , placebo 8.1 % , P < 0.001 ) abdominal fat mass decreased in testosterone-treated men , but increased in placebo group . 
Testosterone therapy was associated with a significant decrease in whole-body, trunk, and appendicular fat mass by dual-energy x-ray absorptiometry (all P < 0.001), whereas whole-body and trunk fat increased significantly in the placebo group. The percent of individuals reporting a decrease in abdomen (P = 0.01), neck (P = 0.08), and breast size (P = 0.01) at wk 24 was significantly greater in testosterone-treated than placebo-treated men. Testosterone-treated men had a greater increase in lean body mass than placebo (testosterone 1.3%, placebo -0.3%, P = 0.02). Plasma insulin, fasting glucose, and total, high-density lipoprotein, and low-density lipoprotein cholesterol levels did not change significantly. Testosterone therapy was well tolerated. CONCLUSIONS Testosterone therapy in HIV-positive men with abdominal obesity and low testosterone was associated with a greater decrease in whole-body, total, and sc abdominal fat mass and a greater increase in lean mass compared with placebo. However, changes in visceral fat mass were not significantly different between groups. Further studies are needed to determine testosterone effects on insulin sensitivity and cardiovascular risk.

BACKGROUND The association between aging-related testosterone deficiency and late-onset hypogonadism in men remains a controversial concept. We sought evidence-based criteria for identifying late-onset hypogonadism in the general population on the basis of an association between symptoms and a low testosterone level. METHODS We surveyed a random population sample of 3369 men between the ages of 40 and 79 years at eight European centers. Using questionnaires, we collected data with regard to the subjects' general, sexual, physical, and psychological health. Levels of total testosterone were measured in morning blood samples by mass spectrometry, and free testosterone levels were calculated with the use of Vermeulen's formula.
Data were randomly split into separate training and validation sets for confirmatory analyses. RESULTS In the training set, symptoms of poor morning erection, low sexual desire, erectile dysfunction, inability to perform vigorous activity, depression, and fatigue were significantly related to the testosterone level. Increased probabilities of the three sexual symptoms and limited physical vigor were discernible with decreased testosterone levels (ranges, 8.0 to 13.0 nmol per liter [2.3 to 3.7 ng per milliliter] for total testosterone and 160 to 280 pmol per liter [46 to 81 pg per milliliter] for free testosterone). However, only the three sexual symptoms had a syndromic association with decreased testosterone levels. An inverse relationship between an increasing number of sexual symptoms and a decreasing testosterone level was observed. These relationships were independently confirmed in the validation set, in which the strengths of the association between symptoms and low testosterone levels determined the minimum criteria necessary to identify late-onset hypogonadism. CONCLUSIONS Late-onset hypogonadism can be defined by the presence of at least three sexual symptoms associated with a total testosterone level of less than 11 nmol per liter (3.2 ng per milliliter) and a free testosterone level of less than 220 pmol per liter (64 pg per milliliter).

OBJECTIVE Low levels of testosterone in men have been shown to be associated with type 2 diabetes, visceral adiposity, dyslipidaemia and metabolic syndrome. We investigated the effect of testosterone treatment on insulin resistance and glycaemic control in hypogonadal men with type 2 diabetes. DESIGN This was a double-blind placebo-controlled crossover study in 24 hypogonadal men (10 treated with insulin) over the age of 30 years with type 2 diabetes. METHODS Patients were treated with i.m.
testosterone 200 mg every 2 weeks or placebo for 3 months in random order, followed by a washout period of 1 month before the alternate treatment phase. The primary outcomes were changes in fasting insulin sensitivity (as measured by the homeostatic model index (HOMA) in those not on insulin), fasting blood glucose and glycated haemoglobin. The secondary outcomes were changes in body composition, fasting lipids and blood pressure. Statistical analysis was performed on the delta values, with the treatment effect of placebo compared against the treatment effect of testosterone. RESULTS Testosterone therapy reduced the HOMA index (-1.73 ± 0.67, P = 0.02, n = 14), indicating improved fasting insulin sensitivity. Glycated haemoglobin was also reduced (-0.37 ± 0.17%, P = 0.03), as was fasting blood glucose (-1.58 ± 0.68 mmol/l, P = 0.03). Testosterone treatment resulted in a reduction in visceral adiposity as assessed by waist circumference (-1.63 ± 0.71 cm, P = 0.03) and waist/hip ratio (-0.03 ± 0.01, P = 0.01). Total cholesterol decreased with testosterone therapy (-0.4 ± 0.17 mmol/l, P = 0.03) but no effect on blood pressure was observed. CONCLUSIONS Testosterone replacement therapy reduces insulin resistance and improves glycaemic control in hypogonadal men with type 2 diabetes. Improvements in glycaemic control, insulin resistance, cholesterol and visceral adiposity together represent an overall reduction in cardiovascular risk.

INTRODUCTION Although testosterone (T) has been suggested to play a protective role against the development of atherosclerosis, studies demonstrating an association between low T and incident major adverse cardiovascular events (MACE) are scanty in the general population and absent in subjects with erectile dysfunction (ED). AIM To investigate whether low T in subjects with ED predicts incident fatal or nonfatal MACE.
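The fasting insulin-sensitivity index reported in the crossover diabetes trial above is the homeostasis model assessment. The conventional HOMA-IR formula (fasting glucose in mmol/L times fasting insulin in µU/mL, divided by 22.5) is shown as a minimal sketch; the input values below are hypothetical illustrations, not patient data from the study.

```python
def homa_ir(fasting_glucose_mmol_l: float, fasting_insulin_uu_ml: float) -> float:
    """HOMA-IR = glucose (mmol/L) x insulin (microU/mL) / 22.5."""
    return fasting_glucose_mmol_l * fasting_insulin_uu_ml / 22.5

# Hypothetical values for illustration only:
print(homa_ir(9.0, 15.0))  # -> 6.0 (values well above ~2.5 suggest insulin resistance)
```

Higher values indicate greater fasting insulin resistance, so the reported decrease of 1.73 in the HOMA index corresponds to improved insulin sensitivity.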
METHODS This is an observational prospective cohort study evaluating a consecutive series of 1687 patients attending our andrological unit for ED. Patients were interviewed using the structured interview on erectile dysfunction (SIEDY) and ANDROTEST structured interviews, measuring components relative to ED and hypogonadism-related symptoms, respectively. MAIN OUTCOME MEASURES Total T was evaluated at baseline. Information on MACE was obtained through the City of Florence Registry Office. RESULTS Among the patients studied, 5.2%, 13.8%, and 22.4% were hypogonadal according to different thresholds (T < 8, 10.4 and 12 nmol/L, or 230, 300 and 350 ng/dL, respectively). During a mean follow-up of 4.3 ± 2.6 years, 139 MACE, 15 of which were fatal, were observed. The unadjusted incidence of MACE was not associated with T levels. Conversely, the proportion of lethal events among MACE was significantly higher in hypogonadal patients, using either the 10.4 nmol/L (300 ng/dL) or the 8 nmol/L (230 ng/dL) threshold. However, after adjustment for age and Chronic Diseases Score in a Cox regression model, only the association between incident fatal MACE and T < 8 nmol/L (230 ng/dL) was confirmed (HR = 7.1 [1.8-28.6]; P < 0.001). Interestingly, measuring hypogonadism-related symptoms and signs through ANDROTEST, only fatal MACE were also associated with a higher score (HR = 1.2 [1.0-1.5] for each ANDROTEST score increment; P = 0.05 after adjustment for age and Chronic Diseases Score). CONCLUSIONS T levels are associated with a higher mortality of MACE. The identification of low T levels should alert the clinician, thus identifying subjects with an increased cardiovascular risk.

AIM Symptomatic late-onset hypogonadism is associated not only with a decline in serum testosterone, but also with a rise in serum estradiol.
These endocrine changes negatively affect libido, sexual function, mood, behavior, lean body mass, and bone density. Currently, the most common treatment is exogenous testosterone therapy. This treatment can be associated with skin irritation, gynecomastia, nipple tenderness, testicular atrophy, and a decline in sperm counts. In this study we investigated the efficacy of clomiphene citrate in the treatment of hypogonadism, with the objectives of raising endogenous serum testosterone (T) and improving the testosterone/estrogen (T/E) ratio. METHODS Our cohort consisted of 36 Caucasian men with hypogonadism, defined as a serum testosterone level less than 300 ng/dL. Each patient was treated with a daily dose of 25 mg clomiphene citrate and followed prospectively. Analysis of baseline and follow-up serum levels of testosterone and estradiol was performed. RESULTS The mean age was 39 years, and the mean pretreatment testosterone and estrogen levels were 247.6 ± 39.8 ng/dL and 32.3 ± 10.9, respectively. By the first follow-up visit (4-6 weeks), the mean testosterone level rose to 610.0 ± 178.6 ng/dL (P < 0.00001). Moreover, the T/E ratio improved from 8.7 to 14.2 (P < 0.001). There were no side effects reported by the patients. CONCLUSIONS Low-dose clomiphene citrate is effective in elevating serum testosterone levels and improving the testosterone/estradiol ratio in men with hypogonadism. This therapy represents an alternative to testosterone therapy by stimulating the endogenous androgen production pathway.

STUDY OBJECTIVE To evaluate the influence of oral anabolic steroids on body mass index (BMI), lean body mass, anthropometric measures, respiratory muscle strength, and functional exercise capacity among subjects with COPD. DESIGN Prospective, randomized, controlled, double-blind study. SETTING Pulmonary rehabilitation program.
PARTICIPANTS Twenty-three undernourished male COPD patients in whom BMI was below 20 kg/m2 and maximal inspiratory pressure (PImax) was below 60% of the predicted value. INTERVENTION The study group received 250 mg of testosterone i.m. at baseline and 12 mg of oral stanozolol a day for 27 weeks, during which time the control group received placebo. Both groups participated in inspiratory muscle exercises during weeks 9 to 27 and cycle ergometer exercises during weeks 18 to 27. MEASUREMENTS AND RESULTS Seventeen of 23 subjects completed the study. Weight increased in nine of 10 subjects who received anabolic steroids (mean, +1.8 ± 0.5 kg; p < 0.05), whereas the control group lost weight (-0.4 ± 0.2 kg). The study group's increase in BMI differed significantly from that of the control group from weeks 3 to 27 (p < 0.05). Lean body mass increased in the study group at weeks 9 and 18 (p < 0.05). Arm muscle circumference and thigh circumference also differed between groups (p < 0.05). Changes in PImax (study group, 41%; control group, 20%) were not statistically significant. No changes in the 6-min walk distance or in maximal exercise capacity were identified in either group. CONCLUSION The administration of oral anabolic steroids for 27 weeks to malnourished male subjects with COPD was free of clinical or biochemical side effects. It was associated with increases in BMI, lean body mass, and anthropometric measures of arm and thigh circumference, with no significant changes in endurance exercise capacity.

OBJECTIVE To investigate the effect of testosterone treatment on insulin resistance, glycemic control, and dyslipidemia in Asian Indian men with type 2 diabetes mellitus (T2DM) and hypogonadism. METHODS We conducted a double-blind, placebo-controlled, crossover study in 22 men, 25 to 50 years old, with T2DM and hypogonadism.
Patients were treated with intramuscularly administered testosterone (200 mg every 15 days) or placebo for 3 months in random order, followed by a washout period of 1 month before the alternative treatment phase. The primary outcomes were changes in fasting insulin sensitivity (as measured by homeostasis model assessment [HOMA] in those patients not receiving insulin), fasting blood glucose, and hemoglobin A1c. The secondary outcomes were changes in fasting lipids, blood pressure, body mass index, waist circumference, waist-to-hip ratio, and androgen deficiency symptoms. Statistical analysis was performed on the delta values, with the treatment effect of placebo compared with the effect of testosterone. RESULTS Treatment with testosterone did not significantly influence insulin resistance measured by the HOMA index (mean treatment effect, 1.67 ± 4.29; confidence interval, -6.91 to 10.25; P > .05). Mean changes in hemoglobin A1c (%) (-1.75 ± 5.35; -12.46 to 8.95) and fasting blood glucose (mg/dL) (20.20 ± 67.87; -115.54 to 155.94) also did not reach statistical significance. Testosterone treatment did not significantly affect fasting lipids, blood pressure, or anthropometric determinations. CONCLUSION In this study, testosterone treatment showed a neutral effect on insulin resistance and glycemic control and failed to improve dyslipidemia, control blood pressure, or reduce visceral fat significantly in Asian Indian men with T2DM and hypogonadism.

Abstract Objective: Axiron (testosterone topical solution 2%) is an approved topical testosterone replacement therapy applied to the axilla. The axilla is a novel application site for testosterone replacement therapy, with differences in skin structure and exposure that could impact the type and/or severity of skin reactions observed with testosterone topical solution 2%.
We therefore present a detailed description of data from a pivotal clinical trial regarding the incidence, time of onset, duration, and severity of patient-reported skin reactions, as well as visual assessments made by investigators and rated using Draize scoring. (Axiron is a trademark of Eli Lilly and Company, Indianapolis, IN, USA.) Methods: Data were analyzed from a multinational, open-label, clinical study in which a 2% testosterone topical solution was applied to the axilla in hypogonadal men. The primary study was for 120 days (N = 155), with a 60-day extension that evaluated skin safety (N = 71). At each visit investigators asked patients about adverse skin reactions (including those occurring between study visits); visually assessed the application site; and graded observed instances of erythema or edema using Draize scoring (rated from 0 to 4). Results: Application-site irritation following study drug application was the most commonly reported event (n = 12 patients) and was generally mild (n = 11; moderate, n = 1) in severity. Application-site irritation did not increase in severity over time and led to only one discontinuation. Erythema was the second most common patient-reported skin reaction (n = 10 patients) and was also generally mild (n = 9; moderate, n = 1). Draize scoring rated all directly observed cases of erythema as grade 1 (very slight, 6 patients) or grade 2 (well-defined, two patients), and identified two instances of erythema not reported by patients. Erythema was typically transient, and in most cases resolved without interruption of therapy. Three cases of edema were reported by patients, and two of these were also identified by visual inspection; all cases of edema occurred in conjunction with erythema. Two cases of acne (facial, shoulders) and one of folliculitis (scalp) were also reported.
Conclusions: Skin reactions were observed in a minority of patients, were mild or at most moderate in severity, and seldom led to discontinuation.

CONTEXT The causes of declining testosterone (T) in aging men and their relationships with risk factors are unclear. OBJECTIVE The objective of the study was to investigate the relationships of lifestyle and health with reproductive hormones in aging men. DESIGN This was a baseline cross-sectional survey of 3200 community-dwelling men aged 40-79 yr from a prospective cohort study in eight European countries. RESULTS Four predictors were associated with distinct modes of altered function: 1) age: lower free T (FT; -3.12 pmol/liter per yr, P < 0.001) with raised LH, suggesting impaired testicular function; 2) obesity: lower total T (TT; -2.32 nmol/liter) and FT (-17.60 pmol/liter) for body mass index (BMI) ≥ 25 to < 30 kg/m(2), and lower TT (-5.09 nmol/liter) and FT (-53.72 pmol/liter) for BMI 30 kg/m(2) or greater (P < 0.001-0.01, referent: BMI < 25 kg/m(2)), with unchanged/decreased LH, indicating hypothalamus/pituitary dysfunction; 3) comorbidity: lower TT (-0.80 nmol/liter, P < 0.01) with unchanged LH in younger men but higher LH in older men; and 4) smoking: higher SHBG (5.96 nmol/liter, P < 0.001) and LH (0.77 U/liter, P < 0.01) with increased TT (1.31 nmol/liter, P < 0.001) but not FT, compatible with a resetting of T-LH negative feedback due to elevated SHBG. CONCLUSIONS Complex multiple alterations in hypothalamic-pituitary-testicular axis function exist in aging men against a background of progressive age-related testicular impairment. These changes are differentially linked to specific risk factors. Some risk factors operate independently of age, but others interact with it, in contributing to the T decline.
These potentially modifiable risk factors suggest possible preventive measures to maintain T during aging in men.

OBJECTIVE To explore the correlation models between body mass index (BMI) and sex hormones constructed from a male cross-sectional survey, and to evaluate the effects of surgery-induced weight loss on sex hormones in morbidly obese subjects that are not predicted by the constructed BMI correlation models. DESIGN Cross-sectional population and longitudinal studies. SETTING Bariatric surgery center in a university hospital. PATIENT(S) A cross-sectional survey of a male general population of 161 patients (BMI median [interquartile range] = 29.2 [24.8-41.9] kg/m(2)); in addition, 24 morbidly obese subjects (BMI = 43.9 [40.8-53.8] kg/m(2)) who were undergoing bariatric surgery were prospectively studied for 6 and 12 months. INTERVENTION(S) Bariatric surgery on 24 morbidly obese men. MAIN OUTCOME MEASURE(S) Cross-sectional population: construction of the best-fitting models describing the relationship of baseline BMI with total (TT) and calculated free (cFT) testosterone, E2, sex hormone-binding globulin (SHBG), FSH, and LH levels. Longitudinal study: deviation between the observed sex hormone levels at 6- and 12-month follow-up and those expected on the basis of BMI. RESULT(S) The correlation of BMI with sex hormones was not univocally linear (E2), but the best-fitting model was exponential for TT, cFT, FSH, LH, and TT/E2 and power for SHBG. In addition to the significant improvement of all parameters observed after surgery in the longitudinal cohort, the increase in TT and SHBG, but not in cFT, was significantly higher than expected from the corresponding weight loss at 6 months from surgery (14.80 [12.30-19.00] nM vs. 12.77 [10.92-13.64] nM and 40.0 [28.9-54.5] nM vs. 24.7 [22.5-25.8] nM for TT and SHBG, respectively), remaining rather stable at 12 months.
CONCLUSION(S) The increase in TT and SHBG, but not the increase in cFT, after bariatric surgery is greater than expected based on weight loss.

CONTEXT Low testosterone levels in men have been associated with increased mortality. However, the influence of testosterone treatment on mortality in men with low testosterone levels is not known. OBJECTIVE The objective of the study was to examine the association between testosterone treatment and mortality in men with low testosterone levels. DESIGN This was an observational study of mortality in testosterone-treated compared with untreated men, assessed with time-varying, adjusted Cox proportional hazards regression models. Effect modification by age, diabetes, and coronary heart disease was tested a priori. SETTING The study was conducted with a clinical database that included seven Northwest Veterans Affairs medical centers. PATIENTS Patients included a cohort of 1031 male veterans, aged older than 40 yr, with low total testosterone [≤250 ng/dl (8.7 nmol/liter)] and no history of prostate cancer, assessed between January 2001 and December 2002 and followed up through the end of 2005. MAIN OUTCOME MEASURE Total mortality in testosterone-treated compared with untreated men was measured. RESULTS Testosterone treatment was initiated in 398 men (39%) during routine clinical care. Mortality in testosterone-treated men was 10.3%, compared with 20.7% in untreated men (P < 0.0001), with a mortality rate of 3.4 deaths per 100 person-years for testosterone-treated men and 5.7 deaths per 100 person-years in men not treated with testosterone. After multivariable adjustment including age, body mass index, testosterone level, medical morbidity, diabetes, and coronary heart disease, testosterone treatment was associated with a decreased risk of death (hazard ratio 0.61; 95% confidence interval 0.42-0.88; P = 0.008).
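In a Cox proportional-hazards model like the one used in the mortality study above, a covariate's hazard ratio is the exponential of its fitted coefficient, so the reported HR of 0.61 for testosterone treatment corresponds to a coefficient of ln(0.61). A minimal sketch of that relationship (the model fitting itself is not reproduced here):

```python
import math

def hazard_ratio(beta: float) -> float:
    """Hazard ratio for a covariate in a Cox model: HR = exp(beta)."""
    return math.exp(beta)

# Coefficient implied by the reported HR of 0.61 for testosterone treatment:
beta_treatment = math.log(0.61)
print(round(hazard_ratio(beta_treatment), 2))  # -> 0.61
```

A hazard ratio below 1 indicates a lower instantaneous risk of death in treated men at any given time, conditional on the adjustment covariates.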
No significant effect modification was found by age, diabetes, or coronary heart disease. CONCLUSIONS In an observational cohort of men with low testosterone levels, testosterone treatment was associated with decreased mortality compared with no testosterone treatment. These results should be interpreted cautiously because residual confounding may still be a source of bias. Large, randomized clinical trials are needed to better characterize the health effects of testosterone treatment in older men with low testosterone levels.

CONTEXT Persistence of hypogonadism is common in male patients with prolactinomas under dopamine agonist (DA) treatment. Conventional therapy with testosterone causes undesirable fluctuations in serum testosterone levels and inhibition of spermatogenesis. OBJECTIVE To evaluate the use of clomiphene as a treatment for persistent hypogonadism in males with prolactinomas. DESIGN Open-label, single-arm, prospective trial. PATIENTS Fourteen adult hypogonadal males (testosterone < 300 ng/dl and low/normal LH) with prolactinomas on DA, including seven with high prolactin (range: 29-1255 microg/l; median: 101 microg/l) despite maximal doses of DA. INTERVENTION Clomiphene (50 mg/day orally) for 12 weeks. MEASURES Testosterone, estradiol, LH, FSH, and prolactin were measured before and 10 days, 4, 8, and 12 weeks after clomiphene. Erectile function, sperm analysis, body composition, and metabolic profiles were evaluated before and after clomiphene. RESULTS Ten patients (71%), five hyperprolactinemic and two normoprolactinemic, responded to clomiphene (testosterone > 300 ng/dl). Testosterone levels increased from 201 ± 22 to 457 ± 37, 436 ± 52, and 440 ± 47 ng/dl at 4, 8, and 12 weeks, respectively (0.001 < P < 0.01). Estradiol increased significantly and peaked at 12 weeks.
LH increased from 1.7 ± 0.4 to 6.2 ± 2.0, 4.5 ± 0.7, and 4.6 ± 0.7 IU/l at 4, 8, and 12 weeks, respectively (0.001 < P < 0.05). FSH levels increased in a similar fashion. Prolactin levels remained unchanged. Erectile function improved (P < 0.05) and sperm motility increased (P < 0.05) in all six patients with asthenospermia before clomiphene. CONCLUSIONS Clomiphene restores normal testosterone levels and improves sperm motility in most male patients with prolactinomas and persistent hypogonadism under DA therapy. Recovery of gonadal function by clomiphene is independent of prolactin levels.

It is unknown whether hypogonadism contributes to decreased insulin-like growth factor I (IGF-I) production and/or how testosterone administration may affect the GH-IGF-I axis in human immunodeficiency virus (HIV)-infected men with the acquired immunodeficiency syndrome (AIDS) wasting syndrome (AWS). In this study, we investigate the GH-IGF-I axis in men with the AWS and determine the effects of testosterone on GH secretory dynamics, pulse characteristics determined from overnight frequent sampling, arginine stimulation, and total and free IGF-I levels. Baseline GH-IGF-I parameters in hypogonadal men with AWS (n = 51) were compared before testosterone administration (300 mg im every 3 weeks vs. placebo for 6 months) with cross-sectional data obtained in two age-matched control groups: eugonadal men with AIDS wasting (n = 10) and healthy age-matched normal men (n = 15). The changes in GH-IGF-I parameters were then compared prospectively in testosterone- and placebo-treated patients. Mean overnight GH levels [1.8 ± 0.3 and 2.4 ± 0.3 vs. 0.90 ± 0.1 microg/L (P = 0.04 and P = 0.003 vs. healthy controls)] and pulse frequency [0.35 ± 0.06 and 0.37 ± 0.02 vs. 0.22 ± 0.03 pulses/h (P = 0.06 and P = 0.002 vs.
healthy controls)] were comparably elevated in the eugonadal and hypogonadal HIV-positive groups, respectively, compared with those in the healthy control group. No significant differences in pulse amplitude, interpulse interval, or maximal GH stimulation by arginine administration (0.5 g/kg iv) were seen between either the eugonadal or hypogonadal HIV-positive patients and the healthy controls. In contrast, IGF-I levels were comparably decreased in both HIV-positive groups compared with the healthy control group [143 ± 16 and 165 ± 14 vs. 216 ± 14 microg/L (P = 0.004 and P = 0.02 vs. healthy controls)]. At baseline, before treatment with testosterone, overnight GH levels were inversely correlated with IGF-I (r = -0.42; P = 0.003), percent ideal body weight (r = -0.36; P = 0.012), albumin (r = -0.37; P = 0.012), and fat mass (r = -0.52; P = 0.0002), whereas IGF-I levels correlated with free testosterone (r = 0.35; P = 0.011) and caloric intake (r = 0.32; P = 0.023) in the hypogonadal HIV-positive men. In a stepwise regression model, albumin (P = 0.003) and testosterone (P = 0.011) were the only significant predictors of GH [mean GH (microg/L) = -1.82 x albumin (g/dL) + 0.003 x total testosterone (microg/L) + 6.5], accounting for 49% of the variation in GH. Mean overnight GH levels decreased significantly in the testosterone-treated patients compared with those in the placebo-treated hypogonadal patients (0.9 ± 0.3 vs. 0.2 ± 0.4 microg/L; P = 0.020). In contrast, no differences in IGF-I or free IGF-I were observed in response to testosterone administration. The decrement in mean overnight GH in response to testosterone treatment was inversely associated with increased fat-free mass (r = -0.49; P = 0.024), which was the only significant variable in a stepwise regression model for change in GH [change in mean GH (microg/L) = -0.197 x kg fat-free mass - 0.53] and accounted for 27% of the variation in the change in GH.
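The stepwise regression equation quoted above can be evaluated directly. The sketch below encodes the published model for mean GH as a function of albumin and total testosterone; the input values are hypothetical illustrations, not data from the study.

```python
def predicted_mean_gh(albumin_g_dl: float, total_testosterone_ug_l: float) -> float:
    """Published stepwise model:
    mean GH (ug/L) = -1.82 * albumin (g/dL) + 0.003 * total testosterone (ug/L) + 6.5
    """
    return -1.82 * albumin_g_dl + 0.003 * total_testosterone_ug_l + 6.5

# Hypothetical inputs for illustration: albumin 3.5 g/dL, total testosterone 300 ug/L
print(round(predicted_mean_gh(3.5, 300.0), 2))  # -> 1.03
```

The negative albumin coefficient reflects the inverse correlation between GH and albumin reported above: lower albumin (worse nutritional status) predicts higher mean overnight GH.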
In this study, we demonstrate increased basal GH secretion and pulse frequency in association with reduced IGF-I concentrations, consistent with GH resistance, among both hypogonadal and eugonadal men with AIDS wasting. Testosterone administration decreases GH in hypogonadal men with AIDS wasting. The change in GH is best predicted by, and is inversely related to, the magnitude of the change in lean body mass in response to testosterone administration. These data demonstrate that among hypogonadal men with the AWS, testosterone administration has a significant effect on the GH axis.

Two previous short-term studies (12 weeks and up to 16 weeks) that used androgens to supplement recombinant human erythropoietin (rHuEPO) for the treatment of the anemia associated with end-stage renal disease showed divergent results. Both studies were limited by their brief duration, since the hematopoietic effect of androgens does not peak until 5 months. Therefore, we conducted a 6-month, prospective, randomized trial comparing low-dose rHuEPO alone and in combination with androgens for the treatment of the anemia of end-stage renal failure. Nineteen anemic chronic hemodialysis patients were randomized into two groups. Group A (n = 10) received 1,500 U rHuEPO intravenously three times a week for 26 weeks. Group B (n = 9) received the same dose of rHuEPO plus nandrolone decanoate 100 mg intramuscularly weekly. Baseline transferrin saturation, serum ferritin, intact serum parathyroid hormone, plasma aluminum, and hematocrit levels were not significantly different between the groups. At study completion, both groups showed a significant increase in mean hematocrit compared with baseline (group A: 24.8% ± 1.4% to 28.3% ± 2.8%, P = 0.003; group B: 25.1% ± 1.5% to 33.2% ± 4.5%, P = 0.001).
The increase in hematocrit in the rHuEPO plus androgen-treated group was statistically greater than in the rHuEPO-alone group (8.2% ± 4.4% vs. 3.5% ± 2.8%; P = 0.012). With the exception of mild discomfort at the injection site, there were no significant side effects from nandrolone. We conclude that the combination of low-dose rHuEPO and nandrolone decanoate is an effective treatment for the anemia of end-stage renal failure.

PURPOSE Weight loss is a strong predictor of morbidity and mortality in human immunodeficiency virus (HIV)-infected patients. Men with acquired immunodeficiency syndrome (AIDS) lose body cell mass. Hypogonadism is also common. This study tested the efficacy of a testosterone transscrotal patch (6 mg/day) in improving body cell mass and treating hypogonadism in these patients. SUBJECTS AND METHODS This multicenter, randomized, double-blinded, placebo-controlled trial was conducted from August 1995 to October 1996 in 133 men, 18 years of age and older, who had AIDS, 5% to 20% weight loss, and either a low morning serum total testosterone level (< 400 ng/dL) or a low free testosterone level (< 16 pg/mL). Outcomes included weight, body cell mass as measured using bioelectrical impedance analysis, quality of life, and morning measurements of serum testosterone and dihydrotestosterone levels, lymphocyte subsets, and HIV quantification. RESULTS There were no significant differences in baseline weight, CD4 cell counts, or HIV serum viral quantification between treatment arms. Morning total and free testosterone levels increased in those treated with testosterone, but not with placebo. Following 12 weeks of treatment there were no differences (testosterone minus placebo) in mean weight change (-0.3 kg [95% confidence interval (CI): -1.4 to 0.8]) or body cell mass (-0.2 kg [95% CI: -1.0 to 0.6]) between the two groups. There were also no changes in quality of life in either group.
CONCLUSION Hypogonadal men with AIDS and weight loss can achieve adequate morning serum sex hormone levels using a transscrotal testosterone patch. However, this system of replacement does not improve weight, body cell mass, or quality of life.

CONTEXT Previous studies of testosterone supplementation in HIV-infected men failed to demonstrate improvement in muscle strength. The effects of resistance exercise combined with testosterone supplementation in HIV-infected men are unknown. OBJECTIVE To determine the effects of testosterone replacement with and without resistance exercise on muscle strength and body composition in HIV-infected men with low testosterone levels and weight loss. DESIGN AND SETTING Placebo-controlled, double-blind, randomized clinical trial conducted from September 1995 to July 1998 at a general clinical research center. PARTICIPANTS Sixty-one HIV-infected men aged 18 to 50 years with serum testosterone levels of less than 12.1 nmol/L (349 ng/dL) and weight loss of 5% or more in the previous 6 months, 49 of whom completed the study. INTERVENTIONS Participants were randomly assigned to 1 of 4 groups: placebo, no exercise (n = 14); testosterone enanthate (100 mg/wk intramuscularly), no exercise (n = 17); placebo and exercise (n = 15); or testosterone and exercise (n = 15). Treatment duration was 16 weeks. MAIN OUTCOME MEASURES Changes in muscle strength, body weight, thigh muscle volume, and lean body mass compared among the 4 treatment groups. RESULTS Body weight increased significantly by 2.6 kg (P < .001) in men receiving testosterone alone and by 2.2 kg (P = .02) in men who exercised alone but did not change in men receiving placebo alone (-0.5 kg; P = .55) or testosterone and exercise (0.7 kg; P = .08).
Men treated with testosterone alone , exercise alone , or both experienced significant increases in maximum voluntary muscle strength in leg press ( range , 22%-30 % ) , leg curls ( range , 18%-36 % ) , bench press ( range , 19%-33 % ) , and latissimus pulls ( range , 17%-33 % ) . Gains in strength in all exercise categories were greater in men assigned to the testosterone-exercise group or to the exercise-alone group than in those assigned to the placebo-alone group . There was a greater increase in thigh muscle volume in men receiving testosterone alone ( mean change , 40 cm3 ; P<.001 vs zero change ) or exercise alone ( 62 cm3 ; P = .003 ) than in men receiving placebo alone ( 5 cm3 ; P = .70 ) . Average lean body mass increased by 2.3 kg ( P = .004 ) and 2.6 kg ( P<.001 ) , respectively , in men who received testosterone alone or testosterone and exercise but did not change in men receiving placebo alone ( 0.9 kg ; P = .21 ) . Hemoglobin levels increased in men receiving testosterone but not in those receiving placebo . CONCLUSION Our data suggest that testosterone and resistance exercise promote gains in body weight , muscle mass , muscle strength , and lean body mass in HIV-infected men with weight loss and low testosterone levels . Testosterone and exercise together did not produce greater gains than either intervention alone Comorbidity is a known risk factor for antibiotic-resistant bacterial infections . Although aggregate comorbidity measures are useful in epidemiologic research , none of the existing measures was developed for use with this outcome . This study compared the utility of two comorbidity measures , the Charlson Comorbidity Index and the Chronic Disease Score , in assessing the comorbidity-attributable risk of nosocomial infections with methicillin-resistant Staphylococcus aureus ( MRSA ) or vancomycin-resistant enterococci ( VRE ) . 
Two case-control studies were conducted at the University of Maryl and Medical System in Baltimore , Maryl and . Cases were in patients with a first positive clinical culture of MRSA or VRE at least 48 hours postadmission ( July 1 , 1998-July 1 , 2001 ) . Three inpatient controls were r and omly selected per case . The MRSA study included 2,164 patients , and the VRE study included 1,948 . The scores ' discrimination and calibration were measured by using the c statistic and Hosmer-Lemeshow chi-square test . The Charlson Comorbidity Index ( c = 0.653 ) and Chronic Disease Score ( c = 0.608 ) were similar discriminators of MRSA and VRE ( c = 0.670 and c = 0.647 , respectively ) . Calibration of the scores was poor for both outcomes ( p < 0.05 ) . A revised comorbidity measure specific to resistant infections would likely provide a better assessment of the comorbidity-attributable risk of antibiotic-resistant infections
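The c statistic used above to compare the two comorbidity scores is the probability that a randomly chosen case receives a higher predicted score than a randomly chosen control. A minimal sketch of that computation (the scores below are made up for illustration, not the study's data):

```python
from itertools import product

def c_statistic(case_scores, control_scores):
    """Concordance (c) statistic: fraction of case/control pairs in which
    the case is scored higher, counting ties as half-concordant."""
    concordant = ties = 0
    for case, control in product(case_scores, control_scores):
        if case > control:
            concordant += 1
        elif case == control:
            ties += 1
    n_pairs = len(case_scores) * len(control_scores)
    return (concordant + 0.5 * ties) / n_pairs

# Hypothetical predicted risk scores for 4 cases and 4 controls:
print(c_statistic([0.9, 0.8, 0.6, 0.55], [0.7, 0.5, 0.4, 0.3]))  # → 0.875
```

A c statistic near 0.65, as reported for the Charlson index, means the score ranks a random case above a random control only about two times in three, i.e. modest discrimination.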
2,072
28,590,326
Conclusion: Assisted partner notification improved partner testing and diagnosis of HIV-positive partners, with few reports of harm.
Objective: Despite the enormous expansion of HIV testing services (HTS), an estimated 40% of people with HIV infection remain undiagnosed. To enhance the efficiency of HTS, new approaches are needed. The WHO conducted a systematic review on the effectiveness of assisted partner notification in improving HIV test uptake and diagnosis, and the occurrence of adverse events, to inform the development of normative guidelines.
Partner notification (PN) is an important method for controlling the AIDS epidemic worldwide. Here, we looked into the differences between two PN counseling modes for HIV(+) men who have sex with men in Taiwan. Using random assignment, we placed 42 of the 84 subjects into the experimental group, where they received two sessions of PN counseling, while the control group (42) received only one session. All 84 subjects were single males with an average age of 28.06. The mean number of successfully notified partners was 5.38 (SD = 3.44) in the experimental group, which was statistically significantly higher than 2.81 (SD = 1.62) in the control group (β = 0.650, p = 0.000). The notification success rate was 77.13% in the experimental and 74.21% in the control group (IRR 1.039, 95% CI 0.83–1.30). In the experimental and control groups, the average number of partners who accepted an HIV test was 1.86 (SD = 1.58) and 0.79 (SD = 0.66), respectively (β = 0.601, p = 0.000), and 39.74% and 27.27% of the tested partners were HIV positive (IRR 1.457, 95% CI 0.69–3.06). The study results may be used to improve the policies and practices for PN and contact follow-up Background Partner services (PSs) are a long-standing component of HIV control programs in the United States and some parts of Europe. Small randomized trials suggest that HIV PS can be effective in identifying persons with undiagnosed HIV infection. However, the scalability and effectiveness of HIV PS in low-income countries are unknown. Methods We used data collected from 2009 to 2010 through a large HIV PS program in Cameroon to evaluate HIV PS in a developing country. HIV-positive index cases diagnosed in antenatal care, voluntary counseling and testing, and inpatient facilities were interviewed to collect information on their sexual partners. Partners were contacted via telephone or home visit to notify, test, and enroll those found to be HIV positive in medical care.
Results Health advisors interviewed 1462 persons with HIV infection during the evaluation period; these persons provided information about 1607 sexual partners. Health advisors notified 1347 (83.8%) of these partners, of whom 900 (66.8%) were HIV tested. Of partners tested, 451 (50.1%) were HIV positive, of whom 386 (85.6%) enrolled into HIV medical care. An average of 3.2 index cases needed to be interviewed to identify 1 HIV case. Conclusions HIV PS can be successfully implemented in a developing country and is highly effective in identifying persons with HIV infection and linking them to care BACKGROUND We sought to compare two methods of notifying sex partners of subjects infected with the human immunodeficiency virus (HIV) or persons who had shared needles with them (needle-sharing partners): "patient referral," in which the responsibility for notifying partners was left to the patient, and "provider referral," in which providers attempted to notify partners. METHODS Names of sex partners and needle-sharing partners and information on how to locate them were obtained from consenting HIV-infected subjects identified in the HIV-testing programs at three public health departments in North Carolina. The subjects were randomly assigned to a patient-referral group (in which patients had the initial responsibility for notifying their partners) or a provider-referral group (in which the study counselor notified the partners). The success of attempts to notify partners was monitored by means of interviews with counselors conducted both in the field and at the health department. RESULTS Of 534 HIV-positive persons identified at the health departments, 247 (46 percent) did not return for counseling after the test, 8 were counseled outside the study, and 117 (22 percent) were ineligible. Of the 162 invited to participate, 88 (54 percent) declined and 74 (46 percent) agreed.
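The Cameroon partner-services cascade above is just successive ratios of the reported counts; a quick arithmetic check (all numbers taken from the abstract):

```python
# Counts from the Cameroon HIV partner services evaluation.
index_interviewed = 1462
partners_named = 1607
notified = 1347
tested = 900
hiv_positive = 451
enrolled = 386

pct = lambda a, b: round(100 * a / b, 1)
print(pct(notified, partners_named))   # 83.8  % of named partners notified
print(pct(tested, notified))           # 66.8  % of notified partners tested
print(pct(hiv_positive, tested))       # 50.1  % of tested partners HIV positive
print(pct(enrolled, hiv_positive))     # 85.6  % of positives enrolled in care
print(round(index_interviewed / hiv_positive, 1))  # 3.2 index interviews per case found
```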
The subjects were mostly male (69 percent), black (87 percent), homosexual or bisexual (76 percent of the men), and had a median age of 30 years. Thirty-nine were assigned to the provider-referral group and 35 to the patient-referral group. In the provider-referral group 78 of 157 partners (50 percent) were successfully notified, whereas in the patient-referral group only 10 of 153 (7 percent) were notified. Of the partners notified by the counselors, 94 percent were not aware that they had been exposed to HIV. Overall, 23 percent of the partners notified and tested were HIV-positive. CONCLUSIONS In this trial, leaving the notification of partners up to the subjects (patient referral) was quite ineffective, despite the North Carolina law requiring that partners be notified. Partner notification by public health counselors (provider referral) was significantly more effective. Although the effectiveness of notification procedures is constrained by the accuracy of the information provided by HIV-infected patients, counselors who notify the partners of an infected patient can refer them to educational, medical, and support services targeted to persons at high risk for HIV infection and may encourage the adoption of less risky behavior BACKGROUND Assisted partner services for index patients with HIV infections involves elicitation of information about sex partners and contacting them to ensure that they test for HIV and link to care. Assisted partner services are not widely available in Africa. We aimed to establish whether or not assisted partner services increase HIV testing, diagnoses, and linkage to care among sex partners of people with HIV infections in Kenya.
METHODS In this cluster randomised controlled trial, we recruited non-pregnant adults aged at least 18 years with newly or recently diagnosed HIV, without a recent history of intimate partner violence, who had not yet or had only recently linked to HIV care, from 18 HIV testing services clinics in Kenya. Consenting sites in Kenya were randomly assigned (1:1) by the study statistician (restricted randomisation; balanced distribution in terms of county and proximity to a city) to immediate versus delayed assisted partner services. Primary outcomes were the number of partners tested for HIV, the number who tested HIV positive, and the number enrolled in HIV care, in those who were interviewed at 6 week follow-up. Participants within each cluster were masked to treatment allocation because participants within each cluster received the same intervention. This trial is registered with ClinicalTrials.gov, number NCT01616420. FINDINGS Between Aug 12, 2013, and Aug 31, 2015, we randomly allocated 18 clusters to immediate and delayed HIV assisted partner services (nine in each group), enrolling 1305 participants: 625 (48%) in the immediate group and 680 (52%) in the delayed group. 6 weeks after enrolment of index patients, 392 (67%) of 586 partners had tested for HIV in the immediate group and 85 (13%) of 680 had tested in the delayed group (incidence rate ratio 4.8, 95% CI 3.7-6.4). 136 (23%) partners had new HIV diagnoses in the immediate group compared with 28 (4%) in the delayed group (5.0, 3.2-7.9) and 88 (15%) versus 19 (3%) were newly enrolled in care (4.4, 2.6-7.4). Assisted partner services did not increase intimate partner violence (one intimate partner violence event related to partner notification or study procedures occurred in each group).
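The incidence rate ratios quoted above (4.8, 5.0, 4.4) come from a model that accounts for clustering by site; crude ratios computed directly from the raw counts are slightly larger, which serves as a sanity check rather than a reproduction of the published analysis:

```python
def crude_risk_ratio(events_1, n_1, events_0, n_0):
    """Ratio of simple proportions, ignoring the cluster design."""
    return (events_1 / n_1) / (events_0 / n_0)

print(round(crude_risk_ratio(392, 586, 85, 680), 2))  # partners tested: ~5.35 (published IRR 4.8)
print(round(crude_risk_ratio(136, 586, 28, 680), 2))  # new diagnoses: ~5.64 (published 5.0)
print(round(crude_risk_ratio(88, 586, 19, 680), 2))   # enrolled in care: ~5.37 (published 4.4)
```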
INTERPRETATION Assisted partner services are safe and increase HIV testing and case-finding; implementation at the population level could enhance linkage to care and antiretroviral therapy initiation and substantially decrease HIV transmission. FUNDING National Institutes of Health OBJECTIVE This analysis describes the Outreach-Assisted Model of Partner Notification, an innovative strategy for encouraging seropositive injecting drug users (IDUs) to inform their partners of shared human immunodeficiency virus (HIV) exposure. The analysis focuses on two core components of the notification process: the identification of at-risk partners and preferences for self-tell vs. outreach assistance in informing partners of possible exposure to the virus. METHODS Using community outreach techniques, 386 IDUs were recruited for HIV pretest counseling, testing, and partner notification over a 12-month period. Of these, 63 tested HIV seropositive, and all but three returned for their test results. The 60 who were informed of their serostatus were randomly assigned to either a minimal or an enhanced intervention condition. Participants assigned to the minimal (self-tell) group were strongly encouraged to inform their partners of possible exposure. Those assigned to the enhanced (outreach-assisted) group had the option of either informing one or more of their partner(s) themselves or choosing to have the project's outreach team do so. RESULTS Together, the 60 index persons who received their results provided names or at least one piece of locating information for a total of 142 partners with whom they perceived having shared possible exposure to the virus within the past five years. By itself, drug use accounted for half of all partners named. Sexual behavior alone accounted for 25% of named partners.
Eighty-two percent of the enhanced group preferred to have the outreach team tell at least one partner; the team was requested to notify 71% of the total number of partners whom this group named. CONCLUSIONS Findings suggest that IDUs want to notify their partners of shared HIV exposure. Outreach assistance was the preferred mode in the majority of cases. Expanding traditional community-based HIV outreach activities to include delivering street-based counseling, testing, and partner notification appears to be a positive and workable prevention strategy Background: Sexual partners of persons with newly diagnosed HIV infection require HIV counseling, testing and, if necessary, evaluation for therapy. However, many African countries do not have a standardized protocol for partner notification, and the effectiveness of partner notification has not been evaluated in developing countries. Methods: Individuals with newly diagnosed HIV infection presenting to sexually transmitted infection clinics in Lilongwe, Malawi, were randomized to 1 of 3 methods of partner notification: passive referral, contract referral, or provider referral. The passive referral group was responsible for notifying their partners themselves. The contract referral group was given seven days to notify their partners, after which a health care provider contacted partners who had not reported for counseling and testing. In the provider referral group, a health care provider notified partners directly. Results: Two hundred forty-five index patients named 302 sexual partners and provided locator information for 252. Among locatable partners, 107 returned for HIV counseling and testing; 20 of 82 [24%; 95% confidence interval (CI): 15% to 34%] partners returned in the passive referral arm, 45 of 88 (51%; 95% CI: 41% to 62%) in the contract referral arm, and 42 of 82 (51%; 95% CI: 40% to 62%) in the provider referral arm (P < 0.001).
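The confidence intervals reported for the Malawi referral arms are consistent with a simple normal-approximation (Wald) interval for a binomial proportion; for example, the passive referral arm (20 of 82):

```python
import math

def wald_ci(successes, n, z=1.96):
    """Approximate 95% CI for a binomial proportion (normal approximation)."""
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p - half_width, p + half_width

lo, hi = wald_ci(20, 82)
print(f"{100 * 20 / 82:.0f}% (95% CI: {100 * lo:.0f}% to {100 * hi:.0f}%)")
# → 24% (95% CI: 15% to 34%)
```

The same calculation reproduces the intervals for the contract (45/88) and provider (42/82) arms to within rounding.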
Among returning partners (n = 107), 67 (64%) were HIV infected, with 54 (81%) newly diagnosed. Discussion: This study provides the first evidence of the effectiveness of partner notification in sub-Saharan Africa. Active partner notification was feasible, acceptable, and effective among sexually transmitted infection clinic patients. Partner notification will increase early referral to care and facilitate risk reduction among high-risk uninfected partners Objective: Determine the cost and effectiveness of partner notification for human immunodeficiency virus (HIV) infection. Methods: Persons testing HIV positive in three areas were randomly assigned one of four approaches to partner notification. Analysis plans changed because disease intervention specialists notified many partners from the patient referral group. We dropped the patient referral group and combined the others to assess the cost and effectiveness of provider referral. Results: The 1,070 patients reported 8,633 partners. Of those, 1,035 were located via record search or in person. A previous positive test was reported by 248 partners. Of the 787 others, 560 were tested: 438 were HIV negative and 122 were newly identified as HIV positive. The intervention specialist's time totaled 197 minutes per index patient. The cost of the intervention specialist's time, travel, and overhead was $268,425: $251 per index patient, $427 per partner notified, or $2,200 per new HIV infection identified. No demographic characteristic of the index patient strongly predicted the likelihood of finding an infected partner. Conclusion: We could not compare the effectiveness of different partner notification approaches because of frequent crossover between randomized groups. The cost of partner notification can be compared with other approaches to acquired immunodeficiency syndrome prevention, but the benefits are not easily measured.
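The unit costs in the cost-effectiveness abstract above follow from dividing the $268,425 total by the relevant denominators. The per-partner-notified figure uses the count of partners actually notified, which the abstract does not state directly, so only the other two are checked here:

```python
total_cost = 268_425      # intervention specialist time, travel, and overhead ($)
index_patients = 1_070
new_infections = 122

print(round(total_cost / index_patients))  # → 251  ($ per index patient)
print(round(total_cost / new_infections))  # → 2200 ($ per new HIV infection identified)
```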
We do not know the number of HIV cases prevented or the value of fulfilling the ethical obligation to warn partners of a potential threat to their health The incidence of human immunodeficiency virus (HIV) infection has significantly increased among black men who have sex with men (MSM) in the United States, and young black MSM have been disproportionately affected. HIV-infected black MSM are also less likely to engage in HIV care and achieve viral suppression than MSM of other races/ethnicities. Engaging in care and achieving viral suppression is a multistep process that starts with diagnosis. Diagnosing persons unaware of their HIV status traditionally has been a critical component of HIV partner services, but partner services also provide an important opportunity to reengage HIV-infected partners in medical care. One approach for partner services involves contacting partners of persons with newly diagnosed HIV infection and using sexual and social network and molecular phylogenetic data to improve the continuum of HIV care among black MSM. To evaluate the effectiveness of that approach, results from a prospective partner services study conducted in North Carolina were examined, and one of the partner networks identified through this study was evaluated in depth. Overall, partner services were provided to 30 black, HIV-infected MSM who named 95 sex partners and social contacts, of whom 39 (41%) previously had been diagnosed with HIV infection. The partner network evaluation demonstrated that HIV-infected and HIV-negative partners were frequently in the same network, and that the majority of HIV-infected partners were already aware of their diagnosis but had not achieved viral suppression.
Using partner services to ensure that HIV-infected partners are linked to care and treatment might reduce HIV transmission and might improve outcomes along the continuum of care BACKGROUND Couples HIV testing and counselling (CHTC) is encouraged but is not widely done in sub-Saharan Africa. We aimed to compare two strategies for recruiting male partners for CHTC in Malawi's option B+ prevention of mother-to-child transmission programme: invitation only versus invitation plus tracing, and postulated that invitation plus tracing would be more effective. METHODS We did an unblinded, randomised, controlled trial assessing uptake of CHTC in the antenatal unit at Bwaila District Hospital, a maternity hospital in Lilongwe, Malawi. Women were eligible if they were pregnant, had just tested HIV-positive and therefore could initiate antiretroviral therapy, had not yet had CHTC, were older than 18 years or 16-17 years and married, reported a male sex partner in Lilongwe, and intended to remain in Lilongwe for at least 1 month. Women were randomly assigned (1:1) to either the invitation only group or the invitation plus tracing group with block randomisation (block size=4). In the invitation only group, women were provided with an invitation for male partners to present to the antenatal clinic. In the invitation plus tracing group, women were provided with the same invitation, and partners were traced if they did not present. When couples presented they were offered pregnancy information and CHTC. Women were asked to attend a follow-up visit 1 month after enrolment to assess social harms and sexual behaviour. The primary outcome was the proportion of couples who presented to the clinic together and received CHTC during the study period and was assessed in all randomly assigned participants. This study is registered with ClinicalTrials.gov, number NCT02139176.
FINDINGS Between March 4, 2014, and Oct 3, 2014, 200 HIV-positive pregnant women were enrolled and randomly assigned to either the invitation only group (n=100) or the invitation plus tracing group (n=100). 74 couples in the invitation plus tracing group and 52 in the invitation only group presented to the clinic and had CHTC (risk difference 22%, 95% CI 9-35; p=0.001) during the 10 month study period. Of 181 women with follow-up data, two reported union dissolution, one reported emotional distress, and none reported intimate partner violence. One male partner, when traced, was confused about which of his sex partners was enrolled in the study. No other adverse events were reported. INTERPRETATION An invitation plus tracing strategy was highly effective at increasing CHTC uptake. Invitation plus tracing with CHTC could have many substantial benefits if brought to scale. FUNDING National Institutes of Health Objective: To evaluate the feasibility and effectiveness of a standardized HIV partner notification programme within genitourinary medicine clinics in England. Design: A prospective survey of HIV partner notification activity over a 12-month period. Setting: Nineteen genitourinary medicine clinics in England. Patients and participants: A total of 501 eligible HIV-positive patients (either newly diagnosed or with whom partner notification had not been undertaken previously) seen during the study period. Main outcome measures: The numbers of partners named by patients, and the number of contacts notified, counselled and HIV-tested. Results: Information on overall partner notification activity was obtained by reviewing available medical records of 471 patients; 353 (75%) had discussed partner notification with a health-care worker during the study period and 197 (42%) had undertaken partner notification.
Detailed information on outcomes was obtained for only 70 patients, who named 158 contacts as being at risk of acquiring HIV. Although 71 (45%) contacts were eventually notified, only 28 were subsequently seen in participating clinics. Almost all contacts (n = 27) requested HIV counselling and testing, and five were diagnosed HIV-positive. Patient referral was the most popular notification method chosen. Conclusions: This study illustrates some of the practical difficulties that limit HIV partner notification within genitourinary medicine clinics. These include health-care workers' misgivings about undertaking partner notification, insufficient locating information to identify contacts, and migration of newly diagnosed patients, which prevents continuity and completion of notification. Nevertheless, HIV partner notification uncovered previously undiagnosed HIV infections. Further work needs to be undertaken in staff training and policy implementation if higher rates of partner notification and outcome measurements are to be achieved
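For the Malawi CHTC trial reported above (74/100 vs 52/100 couples receiving CHTC), the published risk difference and interval match a standard unpooled normal-approximation (Wald) calculation:

```python
import math

def risk_difference_ci(a, n1, b, n0, z=1.96):
    """Risk difference between two proportions with an approximate
    95% CI (unpooled Wald interval)."""
    p1, p0 = a / n1, b / n0
    se = math.sqrt(p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0)
    diff = p1 - p0
    return diff, diff - z * se, diff + z * se

diff, lo, hi = risk_difference_ci(74, 100, 52, 100)
print(f"{100 * diff:.0f}% (95% CI {100 * lo:.0f}% to {100 * hi:.0f}%)")
# → 22% (95% CI 9% to 35%)
```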
2,073
22,336,822
On a patient, rather than per-implant, basis, implants placed with a flapless technique and implant exposures performed with laser induced statistically significantly less postoperative pain than flap elevation. Sites augmented with connective soft tissue grafts showed better aesthetics and thicker tissues. Both palatal autografts and the use of a porcine-derived collagen matrix are effective in increasing the height of keratinised mucosa, at the price of a 0.5 mm recession of peri-implant soft tissues. There is limited weak evidence suggesting that flapless implant placement is feasible and has been shown to reduce patient postoperative discomfort in adequately selected patients, that augmentation at implant sites with soft tissue grafts is effective in increasing soft tissue thickness and improving aesthetics, and that one technique to increase the height of keratinised mucosa using autografts or an animal-derived collagen matrix was able to achieve its goal, but at the price of a worsened aesthetic outcome (0.5 mm of recession).
BACKGROUND Dental implants are usually placed by elevating a soft tissue flap, but in some instances they can also be placed flapless, reducing patient discomfort. Several flap designs and suturing techniques have been proposed. Soft tissues are often manipulated and augmented for aesthetic reasons. It is often recommended that implants be surrounded by a sufficient width of attached/keratinised mucosa to improve their long-term prognosis.
Recent empirical evidence supports the importance of adequate randomization in controlled trials. Trials with inadequate allocation concealment have been associated with larger treatment effects compared with trials in which authors reported adequate allocation concealment. While that provides empirical evidence of bias being interjected into trials, trial investigators rarely document the sensitive details of subverting the intended purpose of randomization. This article relates anonymous accounts that run the gamut from simple to intricate operations, from transillumination of envelopes to searching for code in the office files of the principal investigator. They indicate that deciphering is something more frequent than a rare occurrence. These accounts prompt some methodological recommendations to help prevent deciphering. Randomized controlled trials appear to annoy human nature -- if properly conducted, indeed they should PURPOSE To compare the efficacy of immediate functionally loaded implants placed with a flapless procedure (test group) versus implants placed after flap elevation and conventional load-free healing (control group) in partially edentulous patients. MATERIALS AND METHODS Forty patients were randomized: 20 to the flapless immediately loaded group and 20 to the conventional group. To be immediately loaded, implants had to be inserted with a minimum torque > 45 Ncm. Implants in the immediately loaded group were provided with full acrylic resin temporary restorations the same day. Implants in the conventional group were submerged (anterior region) or left unsubmerged (posterior region) and were left load-free for 3 months (mandibles) or 4 months (maxillae). Provisional restorations were replaced with definitive single metal-ceramic crowns 1 month postloading. Outcome measures were prosthesis and implant failures, biological and prosthetic complications, postoperative edema, pain, and use of analgesics.
Independent sample chi-square tests, Mann-Whitney tests, t tests, and paired t tests were used with a significance level of .05. RESULTS Fifty-two implants were placed in the flapless group and 56 in the conventionally loaded group. In the flapless group, 1 flap had to be raised to control the direction of the bur and 1 implant did not reach the planned primary stability and was treated as belonging to the conventional group. After 3 years no dropouts or failures occurred. There was no statistically significant difference for complications; however, patients in the conventional group had significantly more postoperative edema and pain and consumed more analgesics than those in the flapless group. Osstell values were significantly higher at baseline in the flapless group (P = .033). When comparing baseline data with years 1, 2, and 3 within each group, mean Osstell values of the flapless group did not increase, whereas there were statistically significant increases in the Periotest values. CONCLUSIONS Implants can be successfully placed flapless and loaded immediately without compromising success rates; the procedure decreases treatment time and patient discomfort The objective of this clinical study was to compare the survival rates of early loaded implants placed using flapless and flapped surgical techniques and to determine the bone density in the implant recipient sites using computerized tomography (CT). The study population consisted of 12 patients who were referred for implant placement. One group consisted of five patients referred for the placement of 14 implants and treated with a flapless procedure. The other group consisted of seven patients referred for the placement of 45 implants with a conventional flapped procedure. Patients were selected randomly. A CT machine was used for pre-operative evaluation of the jaw bone and the mean bone density value of each implant recipient site was recorded in Hounsfield units (HU).
All implants were placed using CT-guided surgical stents. The early loading protocols included 2 months of healing in the mandible and 3 months of healing in the maxilla. Single-implant crowns, implant-supported fixed partial dentures, and implant-retained overdentures were delivered to the patients. Of 59 implants placed, one was lost in the conventional flapped group within the first month of healing, giving an overall implant survival rate of 98.3% an average of 9 months later. The highest average bone density value (801 ± 239 HU) was found in the anterior mandible, followed by 673 ± 449 HU for the posterior maxilla, 669 ± 346 HU for the anterior maxilla and 538 ± 271 HU for the posterior mandible. The results of this study show that the early loading of implants placed utilizing a flapless surgical technique with CT-guided surgical stents may be possible BACKGROUND Regarding the Brånemark implant system, nonresorbable sutures have been advocated for reapproximation of the flaps. Fast-absorbable sutures are frequently used in oral surgery, which is convenient for both the patient and the surgeon. It would be advantageous if fast-absorbable sutures were suitable in implant surgery as well. PURPOSE The purpose of this study was to compare irradiated polyglactin 910 (Vicryl Rapide, Ethicon GmbH, Norderstedt, Germany) suture with a nonresorbable polyfilament suture (Supramid, Schwarz, Resorba GmbH, Nürnberg, Germany) used in oral implant surgery. MATERIALS AND METHODS The study comprised 101 edentulous patients (52 females, 49 males) who were provided with 350 Brånemark implants. They were randomized to receive either Vicryl Rapide suture (n = 61) or 3-0 Supramid suture (n = 40). The patients were evaluated after 10 days and at the time of abutment surgery. Any wound complications and implant losses were recorded.
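The 98.3% survival figure in the CT-guided early-loading study above is simply the surviving implants over those placed, pooled across both surgical groups:

```python
flapless_implants = 14
flapped_implants = 45
implants_lost = 1  # one flapped implant lost in the first month

implants_placed = flapless_implants + flapped_implants  # 59
survival_rate = 100 * (implants_placed - implants_lost) / implants_placed
print(round(survival_rate, 1))  # → 98.3
```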
RESULTS The implant failure rate at abutment surgery was low ( 1.2 % ) , and no difference was seen between the two suture groups . A higher incidence of complications ( mainly wound dehiscence ) was found in the absorbable suture group , especially when a continuous suture was used . CONCLUSIONS The results of this study indicate that it is possible to use irradiated polyglactin 910 sutures in oral implant surgery without affecting the rate of early implant failure . However , it is recommended to add interrupted " security sutures " if a continuous suture technique is used in combination with fast-absorption suture material AIM The aim of this controlled r and omized clinical trial was to evaluate the efficacy of a xenogeneic collagen matrix ( CM ) to augment the keratinized tissue around implants supporting prosthetic restorations at 6 months when compared with the st and ard treatment , the connective tissue autograft , CTG ) . MATERIAL S AND METHODS This r and omized longitudinal parallel controlled clinical trial studied 24 patients with at least one location with minimal keratinized tissue ( ≤1 mm ) . MAIN OUTCOME MEASURE The 6-month width of keratinized tissue . As secondary outcomes the esthetic outlook , the maintenance of peri-implant mucosal health and the patient morbidity were assessed pre-operatively and 1 , 3 , and 6 months post-operatively . RESULTS At 6 months , Group CTG attained a mean width of keratinized tissue of 2.75 ( 1.5 ) mm , while the corresponding figure in Group CM was 2.8 ( 0.4 ) mm , the inter-group differences not being statistically significant . The surgical procedure in both groups did not alter significantly the mucosal health in the affected abutments . There was a similar esthetic result and significant increase in the vestibular depth in both groups as a result of the surgery . In the CM group it changed from 2.2 ( 3.3 ) to 5.1 ( 2.5 ) mm at 6 months . 
The patients treated with the CM reported less pain, needed less pain medication, and the surgical time was shorter, although these differences were not statistically significant when compared with the CTG group. CONCLUSIONS These results show that this new CM was as effective and predictable as the CTG for attaining a band of keratinized tissue PURPOSE The aim of this study was to compare the pain experienced after implant placement with 2 different surgical procedures: a flapless surgical procedure using an image-guided system based on a template and an open-flap procedure. MATERIALS AND METHODS The study population consisted of 60 patients who were referred for implant placement. One group consisted of 30 patients who were referred for the placement of 80 implants and treated with a flapless procedure. The other group consisted of 30 patients who were referred for the placement of 72 implants with a conventional procedure. Patients were selected randomly. They were requested to fill out a questionnaire using a visual analog scale (VAS) to assess the pain experienced and to indicate the number of analgesic tablets taken every postoperative day from the day of surgery (D0) to 6 days after surgery (D6). RESULTS The results showed a significant difference in pain measurements, with higher scores on the VAS with open-flap surgery (P < .01). Pain decreased faster with the flapless procedure (P = .05). The number of patients who felt no pain (VAS = 0) was higher with the flapless procedure (43% at D0 versus 20%). With the flapless procedure, patients took fewer pain tablets (P = .03) and the number of tablets taken decreased faster (P = .04). DISCUSSION Minimally invasive procedures may be requested by patients to reduce their anxiety and the pain experienced and thus increase the treatment acceptance rate.
CONCLUSION With the flapless procedure, patients experienced pain less intensely and for shorter periods of time PURPOSE To evaluate the efficacy of flapless versus open-flap implant placement in partially edentulous patients. MATERIALS AND METHODS Forty patients with two separate edentulous areas characterised by residual bone at least 5 mm thick and 10 mm in height had these sites randomised following a split-mouth design to receive at least one implant on each side after flap elevation or not. Implants were first placed in one site, and after 2 weeks in the other site freehand. Implants inserted with a torque > 48 Ncm were immediately loaded with full occluding acrylic temporary restorations. Definitive single cemented crowns or screw-retained metal-ceramic fixed dental prostheses were delivered after 2 months. Outcome measures were prosthesis and implant failures, complications, postoperative swelling and pain, consumption of analgesics, patient preference, surgical time, marginal bone level changes, and implant stability quotient (ISQ) values. RESULTS Seventy-six implants were placed flapless and 67 after flap elevation. In the flapless group, four flaps had to be raised to control the direction of the bur, whereas one haemorrhage and one fracture of the buccal bone occurred in two patients of the flap elevation group. Four implants did not reach the planned stability (three belonging to the flapless group) and they were all immediately replaced by larger-diameter ones. After 1 year, no drop-outs occurred. Two definitive bridges could not be placed when planned (one in each group) and two crowns had to be remade (one in each group). Two implants failed in each group, all in different patients. There were no statistically significant differences for prosthetic and implant failures, complications, ISQ values and marginal bone levels between groups.
However, flapless implant placement required significantly less operation time (17 minutes less, saving almost two-thirds of the time for implant placement), induced less postoperative pain, swelling and analgesic consumption, and was preferred by patients. Mean ISQ values of both groups significantly decreased over time. CONCLUSIONS Implants can be successfully placed flapless and loaded immediately, reducing treatment time and patient discomfort AIM The aim of this study was to test a new collagen matrix (CM) aimed at increasing keratinized gingiva/mucosa when compared with the free connective tissue graft (CTG). MATERIAL AND METHODS This randomized longitudinal parallel controlled clinical trial studied 20 patients with at least one location with minimal keratinized tissue (≤1 mm). MAIN OUTCOME MEASURE The 6-month width of keratinized tissue. As secondary outcomes, the aesthetic outlook, the maintenance of periodontal health and patient morbidity were assessed pre-operatively and at 1, 3 and 6 months. RESULTS At 6 months, the CTG attained a mean width of keratinized tissue of 2.6 (0.9) mm, while the CM attained 2.5 (0.9) mm, these differences being non-significant. In both groups there was marked contraction (60% and 67%, respectively), although the periodontal parameters were not affected. The CM group had significantly lower patient morbidity (pain and medication intake) as well as reduced surgery time. CONCLUSIONS These results show that this new CM was as effective and predictable as the CTG for attaining a band of keratinized tissue, but its use was associated with significantly lower patient morbidity OBJECTIVES To evaluate and compare the outcome of dental implants placed using a flapless protocol and immediate loading with a conventional protocol and loading after 6 weeks. MATERIALS AND METHODS Fourteen patients with bilateral maxillary edentulous areas were treated using Straumann SLA implants.
Using a randomized split-mouth design, implants were placed in one side of the maxilla using a stereolithographic surgical guide for flapless surgery and immediately loaded on temporary abutments with a bridge (test). Implants in the other side were placed using the conventional protocol and loaded after 6 weeks of healing (control). Clinical and radiographic evaluation of the peri-implant tissues was performed at the time of implant surgery and after 1 week, 6 weeks, 3, 6, 12 and 18 months. RESULTS A total of 70 implants were placed (36 test and 34 control). One implant (test) was lost after 3 months, resulting in a survival rate of 97.3% for the test implants and 100% for the control implants. Marginal bone levels were not statistically significantly different between the test and control implants, but at baseline the marginal bone level was significantly lower compared to the other evaluation periods (P < 0.05). The mean bone level for test and control implants was 1.95 mm ± 0.70 and 1.93 mm ± 0.42 after 18 months, respectively. There was a significant change in the height of the attached mucosa at implants placed with a conventional flap between post-operative and 1 week, and between 1 week and 6 weeks. Statistically significant differences were found between the test side and the control side for opinion about speech, function, aesthetics, self-confidence and overall appreciation during the first 6 weeks. CONCLUSION Implants can successfully integrate in the posterior maxilla using a flapless approach with immediate loading similar to a conventional protocol. The mucosal tissues around implants placed with a conventional flap changed significantly compared with flapless-placed implants BACKGROUND The need for keratinized mucosa (KM) or immobile keratinized mucosa (i.e., attached mucosa [AM]) for the maintenance of osseointegrated endosseous dental implants has been controversial.
The purpose of this study was to investigate the significance of KM in the maintenance of root-form dental implants with different surfaces. METHODS A total of 339 endosseous dental implants in place for at least 3 years in 69 patients were evaluated. The width of KM and AM, modified plaque index (mPI), gingival index (GI), modified bleeding index (mBI), probing depth (PD), and average annual bone loss (ABL) were measured clinically and radiographically by a masked examiner. Based on the amount of KM or AM, implants were categorized as follows: 1) KM < 2 mm (KL); 2) KM ≥ 2 mm (KU); 3) AM < 1 mm (AL); and 4) AM ≥ 1 mm (AU). Implants were further subdivided into the following four subgroups based on their surface configurations: 1) smooth-surface implants (SI) with KM < 2 mm (SKL); 2) SI with KM ≥ 2 mm (SKM); 3) rough-surface implants (RI) with KM < 2 mm (RKL); or 4) RI with KM ≥ 2 mm (RKM); or 1) SI with AM < 1 mm (SAL); 2) SI with AM ≥ 1 mm (SAM); 3) RI with AM < 1 mm (RAL); or 4) RI with AM ≥ 1 mm (RAM). The effect of KM or AM on clinical parameters was evaluated by comparing the different KM/AM groups. In addition, the significance of the presence of KM was evaluated by implant prosthesis type (i.e., fixed versus removable) and by implant location (i.e., anterior versus posterior). RESULTS Comparison of ABL among the four subgroups in KM or AM failed to reveal statistically significant differences (P > 0.05); however, statistically significantly higher GI and mPI were present in SKL or SAL compared to the other three subgroups (P < 0.05). GI and mPI were significantly higher in KL (0.94 and 1.51) than KU (0.76 and 1.26) and higher in AL (0.95 and 1.50) than AU (0.70 and 1.19) (P < 0.05), respectively. The difference in GI between posterior implants with or without an adequate amount of KM was also significant (P < 0.05).
CONCLUSIONS The absence of adequate KM or AM around endosseous dental implants, especially posterior implants, was associated with higher plaque accumulation and gingival inflammation but not with more ABL, regardless of their surface configurations. Randomized controlled clinical trials are needed to confirm the results obtained in this retrospective clinical study PURPOSE To compare the efficacy of a new uncovering technique with that of the conventional uncovering technique for papilla generation. MATERIALS AND METHODS Thirty-three patients with 67 implants were enrolled in the study. Patients were randomly assigned to 1 of 2 treatment groups (test and control). Implants of the test group were uncovered by the new technique and implants of the control group by the conventional technique (simple midcrestal incision). The height of each papilla after uncovering at baseline, 3 months, and 6 months, and the thickness of the tissue covering the implant prior to uncovering, were measured. PPD, PI, GI, and BOP measurements were made at 0 and 6 months, and standardized radiographs were obtained at 0, 3, and 6 months. Subject means were used for all statistical analyses. RESULTS The mean difference between the 2 surgical methods revealed that the new technique provided 1.5 mm greater papilla height (P < .001) at all 3 visits (baseline, 3, and 6 months) for implants adjacent to teeth. An overall significant difference in papilla height between implants was detected between the 2 groups (P = .02). There was no significant difference between the 2 groups with regard to PPD, PI, GI, BOP, thickness of soft tissue, or overall bone level measurements during the course of the study. CONCLUSION Based on this study, it appears that over the course of 6 months the new surgical approach to uncovering leads to a more favorable soft tissue response CONFLICT OF INTEREST Nothing to declare.
OBJECTIVES To evaluate whether connective tissue grafts performed at implant placement could be effective in augmenting peri-implant soft tissues. MATERIALS AND METHODS Ten partially edentulous patients requiring at least one single implant in the premolar or molar areas of both sides of the mandible were randomised to have one side augmented at implant placement with a connective soft tissue graft harvested from the palate, or no augmentation. After 3 months of submerged healing, abutments were placed and within 1 month definitive crowns were permanently cemented. Outcome measures were implant success, any complications, peri-implant marginal bone level changes, patient satisfaction and preference, thickness of the soft tissues, and aesthetics (pink aesthetic score) evaluated by an independent and blinded assessor 1 year after loading. RESULTS One year after loading, no patients dropped out, no implants failed and no complications occurred. Both groups lost statistically significant amounts of peri-implant bone 1 year after loading (0.8 mm in the grafted group and 0.6 mm in the non-grafted group), but there was no statistically significant difference between groups. Soft tissues at augmented sites were 1.3 mm thicker (P < 0.001) and had a significantly better pink aesthetic score (P < 0.001). Patients were highly satisfied (no statistically significant differences between treatments), though they preferred the aesthetics of the augmented sites (P = 0.031). However, five patients would not undergo the grafting procedure again and two were uncertain. CONCLUSIONS Connective tissue grafts are effective in increasing soft tissue thickness, thus improving aesthetics.
Longer follow-ups are needed to evaluate the stability of peri-implant tissues over time BACKGROUND Based on three-dimensional implant planning software for computed tomographic (CT) scan data, customized surgical templates and final dental prostheses could be designed to ensure high-precision transfer of the implant treatment planning to the operative field and an immediate rigid splinting of the installed implants, respectively. PURPOSE The aim of the present study was to (1) evaluate a concept comprising a treatment planning procedure based on CT scan images and a prefabricated fixed prosthetic reconstruction for immediate function in upper jaws using a flapless surgical technique, and (2) validate the universality of this concept in a prospective multicenter clinical study. MATERIALS AND METHODS Twenty-seven consecutive patients with edentulous maxillae were included. Treatments were performed according to the Teeth-in-an-Hour concept (Nobel Biocare AB, Göteborg, Sweden), which includes a CT scan-derived customized surgical template for flapless surgery and a prefabricated prosthetic suprastructure. RESULTS All patients received their final prosthetic restoration immediately after implant placement; that is, both the surgery and the prosthesis insertion were completed within approximately 1 hour. In the 24 patients followed for 1 year, all prostheses and individual implants were recorded as stable. CONCLUSION The present prospective multicenter study indicates that the prefabrication, on the basis of models derived from three-dimensional oral implant planning software, of both surgical templates for flapless surgery and dental prostheses for immediate loading is a very reliable treatment option. It is evident that the same approach could be used for staged surgery and in partial edentulism PURPOSE Conventional implant dentistry implies 2 surgical stages.
In this context, pain is often present in the second stage, despite the fact that it is comparatively less aggressive for the patient. The present pilot study proposes application of the Erbium:YAG (Er:YAG) laser for second-stage implant surgery. MATERIALS AND METHODS Twenty patients were studied, with a total of 50 implants in which osseointegration was complete. The subjects were divided into 2 groups: a control group (10 patients with 25 implants) subjected to conventional second-stage surgery, and a group of 10 subjects (also with 25 implants) treated with the Er:YAG laser at second-stage implant surgery. RESULTS The use of the Er:YAG laser obviated the need for local anesthesia and minimized postoperative pain and the time needed before starting the second stage. With regard to surgical duration, quality of hemostasis, and success of implant treatment, no differences were reported. DISCUSSION In the second stage of implant surgery, different types of laser have been used, taking advantage of their bactericidal effect; disadvantages arise from damage induced to the implant surface and adverse thermal effects. CONCLUSION The advantages afforded by laser treatment include technical simplicity, the possibility of obviating local anesthesia, absence of postoperative pain and edema, and complete tissue healing by day 5, thus facilitating rapid prosthetic rehabilitation. The technique described can be used in all cases except situations where esthetic considerations prevail in anterior areas, or in the event of a lack of keratinized gingiva surrounding the implant OBJECTIVE To evaluate the use of acellular dermal matrix (ADM) to improve the esthetic outcome of the alveolar ridge in dental implantology.
METHODS Fifty patients with a similar single missing tooth in the anterior maxilla were randomly divided into two groups: the ADM group was treated with dental implant therapy plus ADM transplantation; the control group was treated with dental implant therapy alone. The periodontal parameters and the changes in horizontal width of the alveolar crest at implant zones were evaluated before surgery and 12 weeks after surgery. RESULTS All operated sites healed uneventfully. Mean horizontal width of the alveolar crest in the ADM group increased by (3.10 ± 0.64) mm at 12 weeks, and in the control group by (0.30 ± 0.50) mm; the volume increase showed a significant difference between groups (P < 0.05). Mean horizontal width of the alveolar crest in the ADM group was (11.50 ± 1.48) mm and that of the contralateral alveolar crest was (11.60 ± 1.60) mm (P > 0.05). CONCLUSIONS ADM is a suitable material for the treatment of soft tissue deformities due to its biocompatibility and horizontal gain of soft tissue BACKGROUND Flapless implant surgery is considered to offer advantages over the traditional flap-access approach. There may be minimized bleeding, decreased surgical times and minimal patient discomfort. Controlled studies comparing patient outcome variables to support these assumptions, however, are lacking. AIM The objective of this clinical study was to compare patient outcome variables using flapless and flapped implant surgical techniques. PATIENTS AND METHODS From January 2008 to October 2008, 16 consecutive patients with edentulous maxillas were included in the study. Patients were randomly allocated to either implant placement with a flapless procedure (eight patients, mean age 54.6 ± 2.9 years) or surgery with a conventional flap procedure (eight patients, mean age 58.7 ± 7.2 years). All implants were placed using a NobelGuide CT-guided surgical template.
Outcome measures were the Dutch version of the Impact of Event Scale-Revised (IES-R), dental anxiety using the s-DAI, and oral health-related quality of life (OHIP-14). RESULTS Ninety-six implants were successfully placed. All implants were placed as two-phase implants, and the dentures were adapted after implant placement. No differences could be shown between conditions for dental anxiety (s-DAI), emotional impact (IES-R), anxiety, procedure duration or technical difficulty, although the flapless group did score consistently higher. The flap procedure group reported less impact on quality of life and included more patients who reported feeling no pain at all during placement. CONCLUSIONS Differences found in the patient outcome variables do suggest that patients in the flapless implant group had to endure more than patients in the flap group
2,074
27,372,437
Male gender and low depression levels were the most consistent predictors of successful treatment outcomes across multiple time-points. Likely predictors of successful treatment outcomes also included older age, lower gambling symptom severity, lower levels of gambling behaviours and alcohol use, and higher treatment session attendance. Mixed results were identified for treatment goal, while education, income, preferred gambling activity, problem gambling duration, anxiety, any psychiatric comorbidity, psychological distress, substance use, prior gambling treatment and medication use were not significantly associated with treatment outcomes at any time-point.
This systematic review aimed to synthesise the evidence relating to pre-treatment predictors of gambling outcomes following psychological treatment for disordered gambling across multiple time-points (i.e., post-treatment, short-term, medium-term, and long-term).
Few studies have evaluated the efficacy of psychotherapies for pathological gambling. Pathological gamblers (N = 231) were randomly assigned to (a) referral to Gamblers Anonymous (GA), (b) GA referral plus a cognitive-behavioral (CB) workbook, or (c) GA referral plus 8 sessions of individual CB therapy. Gambling and related problems were assessed at baseline, 1 month later, posttreatment, and at 6- and 12-month follow-ups. CB treatment reduced gambling relative to GA referral alone during the treatment period and resulted in clinically significant improvements, with some effects maintained throughout follow-up (ps = .05). Individual CB therapy improved some outcomes compared with the CB workbook. Attendance at GA and the number of CB therapy sessions or workbook exercises completed were associated with gambling abstinence. These data suggest the efficacy of this CB therapy approach OBJECTIVE To examine the influence of co-occurring conditions on gambling treatment outcomes. DESIGN, SETTING AND PARTICIPANTS Prospective cohort study of problem gamblers. Participants were recruited from consecutive referrals to a gambling therapy service in 2008. Inclusion criteria were: (i) assessed as a problem gambler based on a screening interview including DSM-IV criteria for pathological gambling, and (ii) suitable for admission to a treatment program. Cognitive-behavioural therapy was based on graded exposure to gambling urges. One-to-one treatment was conducted with 1-hour sessions weekly for up to 12 weeks. MAIN OUTCOME MEASURES Problem gambling screening and co-occurring conditions including depression, anxiety and alcohol use. RESULTS Of 127 problem gamblers, 69 were males (54%), the mean age was 43.09 years, and 65 (51%) reported a duration of problem gambling greater than 5 years. The median time for participants' enrolment in the study was 8.9 months.
Results from a mixed effects logistic regression analysis indicated that individuals with higher depression levels had a greater likelihood (13% increase in odds [95% CI, 1%-25%]) of problem gambling during treatment and at follow-up. CONCLUSION Addressing depression may be associated with improved treatment outcomes in problem gambling; conversely, treatment of problem gambling improves affective instability. We therefore recommend a dual approach that treats both depression and problem gambling Sixty-eight individuals were randomised to either six sessions of imaginal desensitisation plus motivational interviewing (IDMI) or Gamblers Anonymous. Individuals assigned to IDMI had significantly greater reductions in Yale-Brown Obsessive Compulsive Scale Modified for Pathological Gambling total scores, gambling urges and gambling behaviour. People who failed to respond to Gamblers Anonymous reported significantly greater reductions in pathological gambling symptoms following later assignment to IDMI. Abstinence was achieved by 63.6% during the acute IDMI treatment period Two brief treatments for problem gambling were compared with a waiting-list control in a randomized trial. Eighty-four percent of participants (N = 102) reported a significant reduction in gambling over a 12-month follow-up period. Participants who received a motivational enhancement telephone intervention and a self-help workbook in the mail, but not those who received the workbook only, had better outcomes than participants in a 1-month waiting-list control. Participants who received the motivational interview and workbook showed better outcomes than those receiving the workbook only at 3- and 6-month follow-ups. At the 12-month follow-up, the advantage of the motivational interview and workbook condition was found only for participants with less severe gambling problems.
Overall, these results support the effectiveness of a brief telephone- and mail-based treatment for problem gambling BACKGROUND Pathological gambling (PG), a disabling disorder experienced by approximately 1% of adults, has few empirically validated treatments. A recent study demonstrated that 6 sessions of imaginal desensitization plus motivational interviewing (IDMI) were effective in achieving abstinence for a majority of individuals with PG. This study sought to examine whether those benefits were maintained 6 months post-treatment. METHODS Sixty-eight individuals who met DSM-IV criteria for PG were randomly assigned to 6 sessions of IDMI or Gamblers Anonymous (GA) referral over an 8-week period. Participants who failed to respond to GA were offered IDMI after the 8-week acute treatment period. All individuals who responded to IDMI were contacted after 6 months and assessed with measures of gambling severity and psychosocial functioning. RESULTS Forty-four participants completed 6 sessions of IDMI (25 initially assigned to IDMI and 19 to GA). Thirty-five of the 44 (79.5%) responded during acute treatment, and all 35 were available for a 6-month evaluation. All gambling severity scales maintained statistically significant gains from baseline, although some measures showed significant worsening compared with post-IDMI treatment. CONCLUSIONS Six sessions of IDMI resulted in statistically significant reductions in PG urges and behavior, which were largely maintained for 6 months AIMS College students experience high rates of problem and pathological gambling, yet little research has investigated methods for reducing gambling in this population. This study sought to examine the efficacy of brief intervention strategies. DESIGN Randomized trial. SETTING College campuses. PARTICIPANTS A total of 117 college student problem and pathological gamblers.
INTERVENTIONS Students were randomly assigned to: an assessment-only control, 10 minutes of brief advice, one session of motivational enhancement therapy (MET), or one session of MET plus three sessions of cognitive-behavioral therapy (CBT). The three interventions were designed to reduce gambling. MEASUREMENTS Gambling was assessed at baseline, week 6 and month 9 using the Addiction Severity Index-gambling (ASI-G) module, which also assesses days and dollars wagered. FINDINGS Compared to the assessment-only condition, those receiving any intervention had significant decreases in ASI-G scores and days and dollars wagered over time. The MET condition significantly decreased ASI-G scores and dollars wagered over time, and increased the odds of a clinically significant reduction in gambling at the 9-month follow-up relative to the assessment-only condition, even after controlling for baseline indices that could impact outcomes. The Brief Advice and MET+CBT conditions had benefits on some, but not all, indices of gambling. None of the interventions differed significantly from one another. CONCLUSIONS These results suggest the efficacy of brief interventions for reducing gambling problems in college students A substantial proportion of pathological gamblers engage in gambling-related illegal behavior. We examined differences in baseline characteristics and treatment outcomes in two groups: pathological gamblers who did and did not commit gambling-related illegal acts in the year before treatment. Participants were 231 pathological gamblers enrolled in a randomized study of treatment that included cognitive behavior therapy and referral to Gamblers Anonymous (GA). Participants reporting recent illegal behavior (n = 63) endorsed more severe lifetime and recent (past-year) gambling disorder symptoms and higher gambling-related debt than did gamblers who denied illegal behavior (n = 168).
Those who reported illegal behavior also maintained a significantly higher severity of gambling disorder throughout treatment, although both groups experienced similar improvements in gambling symptoms over time. While pathological gamblers with or without gambling-related illegal behavior appeared to improve at a similar rate regardless of the treatment provided, more intensive treatment may be warranted for individuals with gambling-related illegal behavior, as they demonstrated greater gambling severity throughout treatment and follow-up Effective therapies for pathological gambling exist, but their use is limited to about 10% of the target population. In an attempt to lower the barriers to help, Internet-based cognitive behavioural therapy (ICBT) has been shown to be effective when delivered to a non-depressed sample with pathological gambling. This study sought to extend this finding to a larger, more representative population, and also to test a model to predict responder status. Following advertisement, a total of 284 participants started an 8-week ICBT programme with minimal therapist contact via e-mail and weekly telephone calls of less than 15 minutes. The average time spent on each participant, including telephone conversations, e-mail, and administration, was 4 hours. In addition to a mixed effects model to evaluate the effectiveness of the treatment, two logistic regression analyses were performed with the following eight pre-defined response predictor variables: work-life satisfaction, primary gambling activity, debts due to gambling, social support, personal yearly salary, alcohol consumption, stage of change, and dissociative gambling. ICBT resulted in statistically significant reductions in the scores for pathological gambling, anxiety, and depression, as well as an increase in quality of life compared to pre-treatment levels.
Follow-ups carried out in the treatment group at 6, 18, and 36 months indicated that treatment effects were sustained. The eight-predictor-variable model yielded acceptable predictive ability to identify responders both at post-test (AUC = .72, p < .01) and at 36-month follow-up (AUC = .70, p < .01). We conclude that ICBT for pathological gamblers, even if depressed, can be effective and that outcome can partly be predicted by pre-treatment characteristics This exploratory study investigated the effect of interventions designed to improve compliance and reduce dropout rates during the outpatient treatment of pathological gambling at a university-based gambling treatment clinic. Forty subjects (29 males, 11 females, mean age = 37.6) meeting DSM-IV criteria (APA, 1994) for pathological gambling were randomly assigned to either a cognitive-behavioural treatment or a cognitive-behavioural treatment combined with interventions designed to improve treatment compliance. Compliance was indicated by the completion of all treatment sessions. Outcome measures were DSM-IV criteria assessed by structured clinical interview, South Oaks Gambling Screen scores, and percentage of income gambled. Logistic regression analyses identified pretreatment characteristics predicting compliance and outcome. Compliance-improving interventions significantly reduced dropout rates, resulting in superior outcomes at posttreatment compared to the cognitive-behavioural treatment alone. At 9-month follow-up there was no difference in outcome between treatments, although both produced clinically significant change. Comorbid problem drinking, drug use, and problem gambling duration predicted poor compliance. Poor outcome was predicted by comorbid problem drinking.
The clinical implications of these results are discussed in light of the exploratory nature of the study and the need for future research to address compliance, outcome, and comorbidity issues.

This study examined the efficacy of two group treatments for pathological gambling, a node-link mapping-enhanced cognitive-behavioral group therapy (CBGT-mapping) and twelve-step facilitated (TSF) group treatment. Forty-nine participants meeting criteria for pathological gambling were recruited from local newspaper advertisements. These participants were randomly assigned to one of three conditions: TSF (n = 11), CBGT-mapping (n = 18), and Wait-List control (n = 9); 11 refused treatment prior to randomization. Outcome measures included number of DSM-IV criteria met, perception of control/self-efficacy, desire to gamble, and frequency of gambling episodes. Analyses revealed a significant treatment group × time interaction (η²partial = .39). Specifically, the group treatments resulted in significant improvements in the dependent measures, while the Wait-List group remained relatively stable. Overall, CBGT-mapping and TSF had no significant differences on any outcome measure at follow-up assessments. Analysis of post-treatment and 6-month follow-up revealed a significant improvement in gambling outcomes (i.e., fewer DSM-IV criteria met, greater self-efficacy, and fewer gambling episodes; η²partial = .35), with treatment gains maintained at 6 months. These results are consistent with previous research on group treatment for pathological gambling and provide support for the utility of TSF and a mapping-based CBT therapy as viable interventions for pathological gambling.

The transtheoretical model has been applied to many addictive disorders. In this study, psychometric properties of the University of Rhode Island Change Assessment (URICA) scale were evaluated in 234 pathological gamblers initiating treatment.
Four components were identified, reflective of precontemplation, contemplation, action, and maintenance stages, with internal consistency from .74 to .88. Cluster analyses identified 4 patterns of responding, ranging from ambivalent to active change. The 4 clusters differed with respect to baseline gambling variables and treatment engagement and outcomes assessed 2 months later. A continuous measure of readiness to change was also correlated with gambling severity and predictive of reductions in gambling. This study provides initial support for the reliability and validity of the URICA in treatment-seeking gamblers, and it suggests that stage of change may have an impact on outcomes.

INTRODUCTION Cognitive-behavioural therapy (CBT) seems to offer effective treatment for pathological gambling (PG). However, it has not yet been established which techniques yield the best results, or whether exposure and response prevention (ERP) techniques are of additional use. OBJECTIVES To evaluate clinical and socio-demographic characteristics of a PG sample at baseline, comparing cognitive-behavioural group intervention with and without exposure with response prevention (CBT + ERP vs. CBT), to compare the results of therapy and to assess pre-post changes in psychopathology between both groups. DESIGN We applied a quasi-experimental design comprising intervention on the independent variable, but without random assignment. METHODS The sample comprised 502 males with PG, consecutively admitted to a specialist unit, who received standardized outpatient CBT group therapy in 16 weekly sessions. Scores on the Symptom Checklist-Revised (SCL-90-R), the Temperament and Character Inventory-Revised (TCI-R), the South Oaks Gambling Screen (SOGS), and other clinical and psychopathological scales were recorded. RESULTS Pre-post changes did not differ between groups, except for SCL paranoid ideation, which showed greater change in the CBT therapy group.
The risk of relapse during treatment was similar in the CBT + ERP and CBT patients. However, compliance with treatment was poorer in the CBT + ERP group, who presented higher drop-out rates during treatment. Drop-out during therapy was associated with shorter disorder duration and higher scores on the TCI-R novelty seeking scale. CONCLUSIONS Although the two CBT programs elicited similar therapy responses, patients receiving CBT alone showed higher adherence to therapy and lower drop-out rates.

Limited research exists regarding methods for reducing problem gambling. Problem gamblers (N = 180) were randomly assigned to assessment-only control, 10 min of brief advice, 1 session of motivational enhancement therapy (MET), or 1 session of MET plus 3 sessions of cognitive-behavioral therapy. Gambling was assessed at baseline, at 6 weeks, and at a 9-month follow-up. Relative to assessment only, brief advice was the only condition that significantly decreased gambling between baseline and Week 6, and it was associated with clinically significant reductions in gambling at Month 9. Between Week 6 and Month 9, MET plus cognitive-behavioral therapy evidenced significantly reduced gambling on 1 index compared with the control condition. These results suggest the efficacy of a very brief intervention for reduction of gambling among problem and pathological gamblers who are not actively seeking gambling treatment.

According to a report of the National Gambling Impact Study Commission (National Gambling Impact Study Commission (1999). Final report. Washington, DC: Government Printing Office.), 97% of problem gamblers in the United States do not seek treatment. Within the small proportion of problem gamblers who enter into treatment, a high percentage drops out.
Despite the fact that some researchers argue against abstinence as the only acceptable treatment goal, and that regaining control over gambling behaviour appears to be possible for some pathological gamblers (PG), abstinence has been the only gambling intervention treatment goal. The primary goal of this study was to verify whether controlled gambling is a viable goal for pathological gamblers. The second goal was to identify the characteristics that predicted a successful outcome for treatment with a controlled gambling goal. Eighty-nine PGs were enrolled in cognitive-behavioural treatment aimed at controlled gambling. Six- and twelve-month follow-ups were conducted in order to evaluate the maintenance of therapeutic gains and to identify significant predictors of successful controlled gambling. Results showed that, using the intent-to-treat procedure, 63% had a score of 4 or less on the DSM-IV at the end of treatment. That proportion was 56% and 51% at the 6- and 12-month follow-ups. If we retain only those who completed the treatment, these proportions increased to 92%, 80%, and 71% at post-treatment, 6- and 12-month follow-ups, respectively. On the majority of the measures, significant improvements were found at post-treatment, and the therapeutic gains were maintained at the 6- and 12-month follow-ups. However, few variables were identified to predict who would benefit from control rather than abstinence. The clinical and philosophical implications of these results are discussed in this paper.

The current study aimed to determine the differential efficacy of a cognitive-behavioural treatment program for female pathological gamblers delivered in individual and group format. Fifty-six female pathological gamblers with electronic gaming machine gambling problems were randomly assigned to the control (waiting list) group or one of the treatment groups (individual or group treatment).
Treatment comprised a 12-session program including financial limit setting, alternative activity planning, cognitive correction, problem solving, communication training, relapse prevention, and imaginal desensitisation. Treatment outcome was evaluated with conceptually related measures within the areas of gambling behaviour and psychological functioning. While individual and group treatment formats generally produced comparable outcomes in terms of gambling behaviour and psychological functioning, group treatment failed to produce superior outcomes to the control group in relation to several measures of psychological functioning. Moreover, by the completion of the six-month follow-up, 92% of the gamblers allocated to individual treatment, compared with 60% allocated to group treatment, no longer satisfied the diagnostic criteria for pathological gambling. These findings suggest that some caution should be employed when delivering cognitive-behavioural treatment in a group format until further research is conducted to establish its efficacy.

AIMS Cognitive-behavioral therapy (CBT) is useful for treating substance abusers, and recent data suggest it is also efficacious for pathological gamblers. CBT is purported to exert its beneficial effects by altering coping skills, but data supporting coping changes as the mechanism of action are mixed. This study examined whether coping skills acquisition mediated the effects of CBT on decreasing gambling in pathological gamblers. DESIGN Participants were assigned randomly to CBT plus referral to Gamblers Anonymous (GA) or to GA referral alone. SETTING Out-patient clinic. PARTICIPANTS A total of 127 pathological gamblers. MEASUREMENTS Participants completed the Coping Strategies Scale (CSS) before treatment and 2 months later; indices of gambling behavior and problems were administered pretreatment and at months 2 and 12.
FINDINGS Overall, CSS scores increased for participants in both conditions, but those receiving CBT evidenced larger increases than those in the GA condition (P < 0.05), and they also reduced gambling more substantially between pretreatment and month 2. Changes in CSS scores mediated the relationship between treatment assignment and gambling outcomes from pretreatment to month 2, but little evidence of mediation occurred for the long-term follow-ups. CONCLUSIONS CBT's beneficial effects in decreasing gambling may be related partly to changes in coping responses, and improvements in coping are associated with long-term changes in gambling. However, relationships between coping skills and gambling behavior are fairly strong, regardless of treatment received.

The relationship between the therapeutic alliance and treatment participation and drinking outcomes during and after treatment was evaluated among alcoholic outpatient and aftercare clients. In the outpatient sample, ratings of the working alliance, whether provided by the client or therapist, were significant predictors of treatment participation and drinking behavior during the treatment and 12-month posttreatment periods, after a variety of other sources of variance were controlled. Ratings of the alliance by the aftercare clients did not predict treatment participation or drinking outcomes. Therapists' ratings of the alliance in the aftercare sample predicted only percentage of days abstinent during treatment and follow-up.
The results document the independent contribution of the therapeutic alliance to treatment participation and outcomes among alcoholic outpatients.

A structured gambling interview schedule, the Eysenck Personality Questionnaire, Spielberger's State-Trait Anxiety Inventory, Zuckerman's Sensation Seeking Scale, and the Beck Depression Inventory were administered to 63 out of 120 pathological gamblers who had 5 years previously completed a behavioural treatment for uncontrollable gambling behaviour. Results indicated that both abstinence and controlled gambling outcomes were associated with continued improvement in self-report and psychological indices of social functioning and psychopathology. Response to treatment was associated with a reduction in arousal levels, anxiety, and depression. Uncontrolled gamblers failed to show post-treatment changes on parameters of improvement. It was concluded that abstinence is not the only possible therapeutic outcome in behavioural treatment and, further, that controlled gambling is not a temporary response which is followed by a return to continued uncontrollable gambling.

This study evaluated the efficacy of a cognitive treatment package for pathological gambling. Sixty-six gamblers, meeting DSM-IV criteria for pathological gambling, were randomly assigned to treatment or wait-list control conditions. Cognitive correction techniques were used first to target gamblers' erroneous perceptions about randomness and then to address issues of relapse prevention. The dependent measures used were the South Oaks Gambling Screen, the number of DSM-IV criteria for pathological gambling met by participants, as well as gamblers' perception of control, frequency of gambling, perceived self-efficacy, and desire to gamble. Posttest results indicated highly significant changes in the treatment group on all outcome measures, and analysis of data from 6- and 12-month follow-ups revealed maintenance of therapeutic gains.
Recommendations for clinical interventions are discussed, focusing on the cognitive correction of erroneous perceptions toward the notion of randomness.

Aims: The aim of this study was to evaluate posttreatment changes of individuals with a diagnosis of gambling disorder (GD) treated with group cognitive behavioral therapy (CBT), to assess the potential moderator effect of sex on CBT outcome, and to explore the best predictors of posttreatment changes, relapse, and dropout rates. Methods: A cohort design was applied with a prospective follow-up. The sample comprised 440 patients, and the CBT intervention consisted of 16 weekly outpatient group sessions and a 3-month follow-up period. Results: Patients showed significant improvements in both the level of psychopathology and the severity of the gambling behavior. High self-transcendence and the involvement of the spouse or partner in the therapy predicted a higher risk of relapse. Younger age and low education predicted a higher risk of dropout. Conclusion: Many patients with GD can be treated with strategies to improve self-control and emotional regulation, but other techniques should be incorporated to address the individual characteristics of each patient. This is particularly important in group therapy, in which the same treatment is applied to several patients simultaneously. The involvement of a family member needs to be carefully considered since it may have a negative effect on the response to treatment if not adequately managed.

With the increasing availability of gambling throughout North America, there is interest in developing more effective treatments. This study compares the effectiveness of two brief outpatient treatments for problem gambling: eight sessions of Cognitive-Behavioral Therapy (n = 65) and eight sessions of a twelve-step treatment-oriented approach based on the first five steps of Gamblers Anonymous (n = 61).
There were no baseline group differences on gambling-relevant variables. Twelve months post-treatment showed no group differences on key gambling variables (e.g., frequency, abstinence rates, money wagered) in an analysis of completers. Participants who attended more sessions and chose an initial abstinent treatment goal appeared to achieve better outcomes.

This study examined the relatively unexplored contribution of the therapist's performance in determining outcomes of treatment. Nine therapists were studied: three performed supportive-expressive psychotherapy; three, cognitive-behavioral psychotherapy; and three, drug counseling. Profound differences were discovered in the therapists' success with the patients in their caseloads. Four potential determinants of these differences were explored: patient factors, therapist factors, patient-therapist relationship factors, and therapy factors. Results showed that patient characteristics within each caseload (after random assignments) were similar and disclosed no differences that would have explained the differences in success; therapists' personal qualities were correlated with outcomes but not significantly (mean r = .32); an early-in-treatment measure of the patient-therapist relationship, the Helping Alliance Questionnaire, yielded significant correlations with outcomes (mean r = .65); among the therapy techniques, "purity" provided significant correlations with outcomes (mean r = .44), both across therapists and within each therapist's caseload. The three therapist-related factors were moderately associated with each other.

This study evaluated the efficacy of a group cognitive treatment for pathological gambling. Gamblers, meeting DSM-IV criteria for pathological gambling, were randomly assigned to treatment (N = 34) or wait-list control (N = 24) conditions.
Cognitive correction techniques were used first to target gamblers' erroneous perceptions about randomness, and then to address issues of relapse prevention. The dependent measures used were the DSM-IV criteria for pathological gambling, perceived self-efficacy, gamblers' perception of control, desire to gamble, and frequency of gambling. Post-treatment results indicated that 88% of the treated gamblers no longer met the DSM-IV criteria for pathological gambling, compared to only 20% in the control group. Similar changes were observed on all outcome measures. Analysis of data from 6-, 12-, and 24-month follow-ups revealed maintenance of therapeutic gains. Recommendations for group interventions are discussed, focusing on the cognitive correction of erroneous perceptions toward the notion of randomness.

Individuals with addictive disorders, including substance abusers and pathological gamblers, discount or devalue rewards delayed in time more than controls. Theoretically, preference for probabilistic rewards is directly related to gambling, but limited empirical research has examined probabilistic discounting in individuals with pathological gambling. This study evaluated probability and delay discounting in treatment-seeking pathological gamblers and their association with gambling treatment outcomes during and after treatment. At time of treatment entry, 226 pathological gamblers completed probability and delay discounting tasks. They were then randomized to one of three treatment conditions, and gambling behavior was measured throughout treatment and at a 1-year follow-up assessment. After controlling for possibly confounding variables and treatment condition, more shallow probability discounting was associated with greater reductions in amounts wagered during treatment and likelihood of gambling abstinence at the end of treatment and throughout the follow-up period.
No associations were noted between delay discounting and gambling treatment outcomes. These data suggest that probability discounting may be an important construct in understanding pathological gambling and its treatment.
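Discounting tasks like those described above are typically summarized by fitting a discounting function to a participant's indifference points. As a minimal sketch, not the study's analysis code, the following assumes Mazur's hyperbolic form for delay discounting, V = A / (1 + kD), and the analogous probability-discounting form, V = A / (1 + hθ), where θ = (1 − p)/p is the odds against winning; the grid-search fit and all numbers are illustrative.

```python
def hyperbolic_value(amount, delay, k):
    # Mazur's hyperbolic delay discounting: V = A / (1 + k*D)
    return amount / (1.0 + k * delay)

def odds_against(p):
    # Odds against receiving a reward of probability p
    return (1.0 - p) / p

def probability_discounted_value(amount, p, h):
    # Probability-discounting analogue: V = A / (1 + h*theta)
    return amount / (1.0 + h * odds_against(p))

def fit_k(delays, indifference_values, amount, grid=None):
    """Grid-search the delay discount rate k that minimizes squared
    error between the model and observed indifference points."""
    if grid is None:
        grid = [i / 1000.0 for i in range(1, 2001)]  # 0.001 .. 2.0
    best_k, best_err = None, float("inf")
    for k in grid:
        err = sum((hyperbolic_value(amount, d, k) - v) ** 2
                  for d, v in zip(delays, indifference_values))
        if err < best_err:
            best_k, best_err = k, err
    return best_k
```

Steeper discounting corresponds to larger k (or h); "more shallow probability discounting", as in the finding above, means smaller h, i.e. probabilistic rewards lose less subjective value as the odds against them grow.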
2,075
15,979,232
In conclusion, an acute bout of aerobic exercise appears to have a significant impact on the BP response to a psychosocial stressor.
The beneficial impact of regular exercise on cardiovascular health is partly mediated by psychobiological mechanisms. However, the effect of acute exercise on psychobiological responses is unclear.
This study was designed to assess the effect of performance feedback on stress reactivity after recovery from maximal exercise. Forty competitive athletes were recruited to complete a maximal exercise test. Performance feedback was manipulated after the exercise test to give four groups: (1) high performance, (2) low performance, (3) accurate feedback, and (4) no-exercise control. Cardiovascular reactivity was assessed during psychological stress. The results indicate that accurate-feedback participants experienced lower relative reactivity to stress (lower mean arterial pressure) than their no-exercise counterparts. These results demonstrate that the stress-buffering effect of exercise extends to maximal exercise. In addition, high-performance participants experienced lower relative reactivity than low-performance participants. Thus, low-performance feedback was sufficient to remove the buffering effect of exercise. There were no differences between the high-performance and accurate-feedback conditions, or between the low-performance and control conditions.

Endogenous opiate peptides can regulate neuroendocrine and circulatory responses to behavioral stress and may be important in the pathogenic effects of sympathoadrenal reactivity. We tested this hypothesis by examining the effect of the opiate antagonist naloxone on blood pressure responses to behavioral stress in young adults with high, medium, or low casual blood pressures. Naloxone increased mean arterial pressure responses to stress in subjects with low casual pressure, but had no significant effect on responses in subjects with high casual pressure. These results suggest opioidergic inhibition of sympathetic nervous system responses may be deficient in persons at risk for essential hypertension.

Thirty-six competitive sportsmen and 36 inactive men participated in a two-session experiment.
Session 1 involved exercise to exhaustion so as to assess maximal oxygen consumption (VO2max). In Session 2, both groups were randomized into three experimental conditions: 20 min of exercise at high intensity (70% VO2max) or moderate intensity (50% VO2max), or a light-exercise control. Following 30 min of recovery, all subjects performed mental arithmetic and public speech tasks in a counterbalanced order. Cardiovascular, electrodermal, respiratory, and subjective variables were recorded. Sportsmen had higher VO2max, lower body fat, and lower resting heart rate (HR) than inactive men. A postexercise hypotensive response was observed among subjects in the 70% and 50% VO2max conditions, accompanied by baroreceptor reflex inhibition in the 70% condition. Systolic pressure was lower during mental arithmetic and during recovery from the speech task in the high-intensity group than in the control group. Diastolic pressure was lower following mental arithmetic in the high-intensity group. No differences in HR reactivity, electrodermal, or respiratory parameters were observed, but baroreceptor reflex sensitivity was inhibited during mental arithmetic. The results are discussed in relation to previous reports of suppressed cardiovascular reactivity to mental stress tests following vigorous exercise and the role of stress-related processes in the antihypertensive response to physical training.

The effects of exercise on subsequent psychophysiological responses to mental stress were assessed in a study of 30 normotensive male volunteers. Participants were randomly allocated to three experimental conditions: 20-min exercise at 100 Watts (high exercise), 20-min exercise at 25 Watts (low exercise), or 20-min no exercise (control). After a recovery period of 20 min, all subjects performed a mental arithmetic task for four 5-min trials.
Blood pressure and heart rate were monitored continuously using a Finapres, and respiration and electrodermal activity were also recorded. Baroreceptor reflex control of heart rate was assessed using power spectrum analysis. Exercise produced consistent increases in systolic blood pressure, heart rate, and subjective tension, together with reductions in systemic resistance and baroreflex sensitivity. The systolic and diastolic blood pressure and heart rate reactions to mental arithmetic were significantly blunted in the high-exercise compared with control conditions, with the low-exercise group showing an intermediate pattern. Subjective responses to mental stress were unaffected by prior exercise. The pattern of hemodynamic response was not a result of changes in baroreflex sensitivity. The mechanisms underlying this result are discussed in relation to the discrepancies between subjective and physiological responses to mental stress, and the implications for the use of exercise in stress management.

This investigation compared patterns of regional cerebral blood flow (rCBF) during exercise recovery both with and without postexercise hypotension (PEH). Eight subjects were studied on 3 days with randomly assigned conditions: 1) after 30 min of rest; 2) after 30 min of moderate exercise (M-Ex) at 60-70% heart rate (HR) reserve during PEH; and 3) after 30 min of light exercise (L-Ex) at 20% HR reserve with no PEH. Data were collected for HR, mean blood pressure (MBP), and ratings of perceived exertion and relaxation, and rCBF was assessed by use of single-photon-emission computed tomography. With the use of ANOVA across conditions, there were differences (P < 0.05; mean +/- SD) from rest during exercise recovery from M-Ex (HR = +12 +/- 3 beats/min; MBP = -9 +/- 2 mmHg), but not from L-Ex (HR = +2 +/- 2 beats/min; MBP = -2 +/- 2 mmHg).
After M-Ex, there were decreases (P < 0.05) for the anterior cingulate (-6.7 +/- 2%), right and left inferior thalamus (-10 +/- 3%), right inferior insula (-13 +/- 3%), and left inferior anterior insula (-8 +/- 3%), not observed after L-Ex. There were rCBF decreases for leg sensorimotor regions after both M-Ex (-15 +/- 4%) and L-Ex (-12 +/- 3%) and for the left superior anterior insula (-7 +/- 3% and -6 +/- 3%, respectively). Data show that there are rCBF reductions within specific regions of the insular cortex and anterior cingulate cortex coupled with a postexercise hypotensive response after M-Ex. Findings suggest that these cerebral cortical regions, previously implicated in cardiovascular regulation during exercise, may also be involved in PEH.

The role of endogenous opioids in aerobic fitness-induced decrements in cardiovascular stress reactivity was examined by comparing the effects of opioid antagonism with naltrexone on responses to stress in young adults with high versus low levels of aerobic fitness. Two hundred forty subjects were given an activity questionnaire, and males with the highest (Fit) and lowest (Nonfit) aerobic activity profiles were recruited for maximal oxygen consumption (VO2max) treadmill testing and psychological stress testing (final sample N = 28). Heart rate and blood pressures were measured during performance on a computer-controlled arithmetic task after pretreatment with either naltrexone (Trexan, DuPont) or a placebo. During placebo challenges, Fit subjects, compared with Nonfit, showed lower heart rate reactivity during stress and lower mean arterial blood pressures immediately before and during recovery from stress. Naltrexone eliminated these reactivity differences by increasing heart rate reactivity and raising mean arterial blood pressure in Fit subjects.
These data suggest that aerobic fitness is associated with enhanced opioidergic inhibition of circulatory stress reactivity. Opioidergic modulatory effects on stress reactivity may comprise an important mechanism in fitness-associated risk reduction for cardiovascular disease.

We evaluated the experimental hypothesis that an acute bout of aerobic exercise (AE) serves as a buffer to psychosocial stress responses in women of low to moderate physical fitness. Forty-eight (24 White, 24 Black) 25- to 40-year-old women participated in two counterbalanced experimental conditions: an attention control and a 40-min bout of AE at 70% heart rate (HR) reserve. The attention control and AE treatments were followed by (a) 30 min of quiet rest, (b) exposure to mental and interpersonal threat, and (c) 5 min of recovery. Blood pressure (BP) and HR were monitored at baseline, during the stressors, and throughout recovery. Self-reported distress was assessed before each stressor and upon completion of the recovery period. The results provided clear evidence that exercise dampens BP reactivity to psychosocial stress. Additionally, compared with the attention placebo control, AE reduced both the frequency and intensity of anxiety-related thoughts that occur in anticipation of interpersonal threat and challenge.

We examined the correspondence between laboratory measures of cardiovascular reactivity (CVR) and within-person changes in cardiovascular activity during the challenges of daily life, after adjustment for posture, activity, and other effects. Healthy adults (n = 335) were administered laboratory measures of CVR along with 6 days of ambulatory blood pressure monitoring and electronic diary reports. Compared with low reactors, high laboratory systolic blood pressure (SBP) reactors showed larger increases in SBP during periods of high task demand or low decisional control in daily life.
High diastolic blood pressure (DBP) reactors showed larger increases in ambulatory DBP during situations rated as both low control and high demand. This multilevel modeling approach may enhance our ability to detect the correspondence between laboratory and ambulatory measures of CVR, and to identify the circumstances under which it may be most clearly observed.

BACKGROUND The study objective was to determine the health and quality-of-life effects of moderate-intensity exercise among older women family caregivers. METHODS This 12-month randomized controlled trial involved a volunteer sample of 100 women aged 49 to 82 years who were sedentary, free of cardiovascular disease, and caring for a relative with dementia. Participants were randomized to 12 months of home-based, telephone-supervised, moderate-intensity exercise training or to an attention-control (nutrition education) program. Exercise consisted of four 30- to 40-minute endurance exercise sessions (brisk walking) prescribed per week at 60% to 75% of heart rate reserve based on peak treadmill exercise heart rate. Main outcomes were stress-induced cardiovascular reactivity levels, rated sleep quality, and reported psychological distress. RESULTS Compared with nutrition participants (NU), exercise participants (EX) showed significant improvements in the following: total energy expenditure (baseline and post-test means [SD] for EX = 1.4 [1.9] and 2.2 [2.2] kcal/kg/day; for NU = 1.2 [1.7] and 1.2 [1.6] kcal/kg/day; p < .02); stress-induced blood pressure reactivity (baseline and post-test systolic blood pressure reactivity values for EX = 21.6 [12.3] and 12.4 [11.2] mm Hg; for NU = 17.9 [10.2] and 17.7 [13.8] mm Hg; p < .024); and sleep quality (p < .05). NU showed significant improvements in percentages of total calories from fats and saturated fats relative to EX (p values < .01). Both groups reported improvements in psychological distress.
CONCLUSIONS Family caregivers can benefit from initiating a regular moderate-intensity exercise program in terms of reductions in stress-induced cardiovascular reactivity and improvements in rated sleep quality.

Objectives: Mild to moderate acute endurance exercise has generally been shown to reduce blood pressure (BP) in hypertensive (HT) individuals. Whether a slightly more strenuous bout of exercise can elicit a greater and more prolonged BP reduction is unknown. Therefore, the purpose of this study was to examine the effects of two 30-min exercise bouts, conducted at 50% and 75% of maximal oxygen uptake (VO2max), on the quantity and quality of BP reduction over a 24-h period. Methods: Sixteen Stage 1 and 2 non-medicated HT subjects (8 men/8 women) were matched with normotensive (NT) men and women (n = 16). All subjects were evaluated for VO2max with a symptom-limited treadmill test and then completed a 30-min exercise bout at 50% and 75% of VO2max as well as a control (no exercise) session, in random fashion on separate days. Twenty-four hour ambulatory BPs were measured after both the exercise and control sessions. Data were assessed at 1, 3, 6, 12, and 24 h post-exercise and control session. Results: A repeated-measures ANOVA showed non-significant differences between HT men and women and that both exercise intensities, relative to the control session, significantly (P < 0.05) reduced systolic (S) and diastolic (D) BPs. NT subjects showed non-significant reductions following both intensities. The reductions in the HT men and women averaged 4 and 9 mm Hg (SBP) / 5 and 7 mm Hg (DBP) for 50% and 75%, respectively. On average, the HT subjects (men and women combined) maintained significant SBP reductions for 13 h after the 75% bout compared to 4 h after the 50% intensity. Likewise, DBP was reduced for an average of 11 h following the 75% bout compared to 4 h after the 50% intensity.
Conclusions: These results suggest that an exercise bout conducted between 50–75% VO2max significantly decreases SBP and DBP in HT subjects and that a greater and longer-lasting absolute reduction is evident following a 75% of maximum bout of exercise The effect of 30 min of cycling exercise at 60% VO2max on hemodynamic responses to the Stroop and cold pressor tests in 12 normotensive males was examined. Subjects were randomly assigned in a counterbalanced design to perform the stressors pre- and postexercise and served as their own controls. Cardiac output (CO), heart rate (HR), and stroke volume were measured continuously by impedance cardiography. Blood pressure was measured beat to beat using a photoplethysmographic volume transducer. Total peripheral resistance (TPR) was calculated. The systolic blood pressure response (elevation) to the Stroop test was significantly attenuated postexercise (4.2 +/- 1.4%, p < 0.05) as compared to preexercise and control (10.1 +/- 1.8%). HR reactivity to the Stroop test was also significantly attenuated postexercise (0.3 +/- 1.7%) compared to preexercise and control (6.5 +/- 2.3%, p < 0.05). Hemodynamic variables among treatment groups, with the possible exception of HR, appeared to be unaffected during the cold pressor postexercise. Neither central (i.e., CO) nor peripheral (i.e., TPR) responses appeared to be solely responsible for the attenuated blood pressure response to the Stroop stressor The purpose of this study was to determine the effects of exercise and weight loss on cardiovascular responses during mental stress in mildly to moderately overweight patients with elevated blood pressure.
Ninety-nine men and women with high normal or unmedicated stage 1 to stage 2 hypertension (systolic blood pressure 130 to 179 mm Hg, diastolic blood pressure 85 to 109 mm Hg) underwent a battery of mental stress tests, including simulated public speaking, anger recall interview, mirror trace, and cold pressor, before and after a 6-month treatment program. Subjects were randomly assigned to 1 of 3 treatments: (1) aerobic exercise, (2) weight management combining aerobic exercise with a behavioral weight loss program, or (3) waiting list control group. After 6 months, compared with control subjects, participants in both active treatment groups had lower levels of systolic blood pressure, diastolic blood pressure, total peripheral resistance, and heart rate at rest and during mental stress. Compared with subjects in the control group, subjects in the exercise and weight management groups also had greater resting stroke volume and cardiac output. Diastolic blood pressure was lower for the weight management group than for the exercise-only group during all mental stress tasks. These results demonstrate that exercise, particularly when combined with a weight loss program, can lower both resting and stress-induced blood pressure levels and produce a favorable hemodynamic pattern resembling that targeted for antihypertensive therapy PURPOSE The objective of this study was to investigate the effects of exercise training and weight loss on blood pressure (BP) associated with physical activity and emotional stress during daily life. METHODS One hundred twelve participants with unmedicated high normal or stage 1 to stage 2 hypertension were randomized to one of three conditions: a combined exercise and behavioral weight management group (WM), an exercise-only group (EX), or a wait list control group (CON).
BP was assessed in the clinic and during 15 h of daytime ambulatory BP monitoring at baseline and after 6 months of treatment. RESULTS Increased levels of physical activity and emotional distress measured during daily life were associated with increases in systolic blood pressure (SBP), diastolic blood pressure (DBP), heart rate (HR), and rate pressure product (RPP). After treatment, the WM group had significantly lower DBP, HR, and RPP responses during both high and low levels of physical activity and emotional distress compared with the CON group. The EX group had similar BP levels as the WM group, although the EX group had significantly lower BP than the CON group during low but not high levels of physical activity and emotional distress. CONCLUSION These findings indicate that exercise, especially when combined with weight loss, reduces BP levels at rest and in situations that typically elevate BP such as intense physical activity and emotional distress An experiment was conducted to examine the acute emotional and psychophysiological effects of a single bout of aerobic exercise. Forty active and 40 inactive college students were randomly assigned to an aerobic exercise or a waiting-period control condition. Self-report measures of mood and cardiovascular response measures to challenging cognitive tasks were collected before and after the 20-min exercise/control period to examine any exercise-induced changes. The results indicated that mood was significantly altered by the exercise activity, with reductions in tension and anxiety specifically evident. Exercise was not found to have any effects on cardiovascular reactivity. A test of aerobic fitness confirmed fitness differences between active and inactive participants, but no mood or reactivity effects related to activity status were obtained.
These results suggest that both active and inactive individuals experience acute reductions in anxiety following single bouts of exercise, even in the absence of changes in cardiovascular reactivity. Implications for the continued investigation of the acute effects of exercise are discussed Design: Psychological stress is associated with the development of hypertension. Exercise is purported to have a prophylactic effect on stress. Immediately after a single bout of aerobic exercise there is a transient decrease in blood pressure Objective: We sought to examine the cardiovascular responses to a psychological stressor, the Stroop color word task, during the postexercise hypotensive period. Methods: Eight borderline hypertensive subjects (resting blood pressure 137±1.9/85±1.8 mmHg) participated in three randomly assigned experimental trials: Stroop color word task without prior exercise (Stroop); Stroop color word task administered 10 min after 60 min exercise at 60% maximal oxygen uptake (Ex + Stroop); and 60 min exercise at 60% maximal oxygen uptake followed by 20 min seated recovery (Ex). Blood pressure and heart rate were monitored at the start and end of exercise and at every 2 min of recovery. Results: During the Stroop trial there were significant increases in mean arterial (MAP), systolic (SBP), and diastolic blood pressure (DBP). During the Ex + Stroop trial the increases in MAP, SBP, and DBP during the Stroop color word task were significantly less than the increases without prior exercise. During recovery in the Ex trial there were significant decreases in MAP and SBP. However, there were no significant changes in DBP during the Ex trial. Conclusions: These results suggest that following an acute bout of exercise there is a reduction in blood pressure, and during this postexercise hypotensive period the blood pressure response to a psychological stressor is attenuated
2,076
27,048,292
Conclusion Given the high heterogeneity of the studies, the modest estimated effect on effectiveness and safety, potential conflicts of interest in the included studies, and the appreciably higher cost of insulin glargine, there is still no support for recommending analogs as first-line therapy.
Introduction The use of insulin analogs for the treatment of type 1 diabetes mellitus (T1DM) is widespread; however, the therapeutic benefits still require further evaluation given their higher costs. The objective of this study was to evaluate the effectiveness and safety of analog insulin glargine compared to recombinant DNA (rDNA) insulin in patients with T1DM in observational studies, building on previous reviews of randomized controlled trials comparing neutral protamine Hagedorn insulin and insulin glargine.
OBJECTIVE To study the pharmacodynamic properties of the subcutaneously injected long-acting insulin analog HOE901 (30 microg/ml zinc) in comparison with those of NPH insulin and placebo. RESEARCH DESIGN AND METHODS In this single-center double-blind euglycemic glucose clamp study, 15 healthy male volunteers (aged 27 +/- 4 years, BMI 22.2 +/- 1.8 kg/m2) received single subcutaneous injections of 0.4 U/kg body wt of HOE901, NPH insulin, or placebo on 3 study days in a randomized order. The necessary glucose infusion rates (GIRs) to keep blood glucose concentrations constant at 5.0 mmol/l were determined over a 30-h period after administration. RESULTS The injection of HOE901 did not induce the pronounced peak in metabolic activity observed with NPH insulin (GIRmax 5.3 +/- 1.1 vs. 7.7 +/- 1.3 mg x kg(-1) x min(-1)) (P < 0.05); after an initial rise, metabolic activity was rather constant over the study period. This lack of peak was confirmed by a lower glucose consumption in the first 4 h after injection (area under the curve from 0 to 4 h [AUC(0-4 h)] 1.02 +/- 0.34 vs. 1.48 +/- 0.34 g/kg) (P < 0.001) with HOE901, as compared with NPH insulin. In this single-dose study, the metabolic effect measured over a period of 30 h was lower with HOE901 than with NPH insulin (AUC(0-30 h) 7.93 +/- 1.82 vs. 9.24 +/- 1.29 g/kg) (P < 0.05). CONCLUSIONS This study shows that the soluble long-acting insulin analog HOE901 induces a smoother metabolic effect than NPH insulin, from which a better substitution of basal insulin requirements may follow Objective Long-acting basal insulin analogs have demonstrated positive effects on the balance between effective glycemic control and risk of hypoglycemia versus neutral protamine Hagedorn (NPH) insulin in randomized controlled trials.
Evidence of severe hypoglycemic risk with insulin detemir, insulin glargine, or NPH insulin is presented from a nationwide retrospective database study. Research design and methods Data from hospital and secondary healthcare visits due to hypoglycemic coma from 75,682 insulin-naïve type 1 or 2 diabetes patients initiating therapy with NPH insulin, insulin glargine, or insulin detemir in Finland between 2000 and 2009 were analyzed. Incidence rates with 95% confidence intervals (CIs) were calculated using Poisson regression. Hazard ratios were estimated using Cox's regression with adjustments for relevant background variables. Results The adjusted risk of hospital/secondary healthcare visits due to the first severe hypoglycemic event was 21.7% (95% CI 9.6–32.1%, p < 0.001) lower for insulin detemir and 9.9% (95% CI 1.5–17.6%, p = 0.022) lower for insulin glargine versus NPH insulin. Risk of hypoglycemic coma recurrence was 36.3% (95% CI 8.9–55.5%, p = 0.014) lower for detemir and 9.5%, but not significantly (95% CI −10.2 to 25.7%, p = 0.318), lower for glargine versus NPH insulin. Risk of all hypoglycemic coma events was 30.8% (95% CI 16.2–42.8%, p < 0.001) lower for detemir and 15.6% (95% CI 5.1–25.0%, p = 0.005) lower for glargine versus NPH. Insulin detemir had a significantly lower risk for first (13.1% lower [p = 0.034]), recurrent (29.6% lower [p = 0.021]), and all (17.9% lower [p = 0.016]) severe hypoglycemic events than insulin glargine.
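The methods above report crude incidence rates with 95% CIs from Poisson models. A minimal sketch of how such a rate and an approximate CI can be computed from an event count and person-time, using the standard normal approximation on the log scale (the event count and person-years below are hypothetical, not figures from the study):

```python
import math

def poisson_rate_ci(events: int, person_years: float, z: float = 1.96):
    """Crude incidence rate per 1,000 person-years with an approximate 95% CI.

    Uses the normal approximation on the log scale:
    exp(ln(rate) +/- z / sqrt(events)).
    """
    rate = events / person_years
    se_log = 1.0 / math.sqrt(events)          # SE of log(rate) for a Poisson count
    lo = rate * math.exp(-z * se_log)
    hi = rate * math.exp(z * se_log)
    return tuple(x * 1000 for x in (rate, lo, hi))

# Hypothetical counts for illustration only:
rate, lo, hi = poisson_rate_ci(events=120, person_years=25_000)
print(f"{rate:.2f} per 1,000 PY (95% CI {lo:.2f}-{hi:.2f})")
```

A full Poisson regression, as used in the study, would additionally adjust the rate for covariates; this sketch covers only the unadjusted rate and its interval.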
Conclusions There were considerable differences in risk of hospitalization or secondary healthcare visits due to hypoglycemic coma between basal insulin treatments in real-life clinical practice In this open study of clinical practice, 142 paediatric patients with type 1 diabetes mellitus (> 1 year duration), stratified by age, received prandial insulin (regular or lispro) and either once-daily insulin glargine (GLAR; n = 74), titrated to target fasting blood glucose (FBG) levels 4.4-7.8 mmol/l, or NPH/semilente insulin (NPH insulin, administered once, twice or three times daily; n = 68), titrated to target FBG 4.4-8.9 mmol/l. Both groups were treated for 20 +/- 10 months. HbA(1c) significantly increased in GLAR (7.3 +/- 1.0% to 7.6 +/- 1.1%; p = 0.003) and NPH/semilente insulin (7.7 +/- 1.6% to 8.3 +/- 1.5%; p = 0.0001) treated patients. The incidence of symptomatic hypoglycaemia was comparable between GLAR versus NPH/semilente insulin at endpoint (2.19 vs. 1.94 episodes/week); however, the overall incidence of severe hypoglycaemia was significantly lower with GLAR versus NPH/semilente insulin (0.14 vs. 0.73 events/patient-year; p = 0.002). The daily insulin dose was similar between the treatment groups; however, perceived quality of life (QoL) was better with GLAR. GLAR is associated with equivalent glycaemic control, less severe hypoglycaemia and improved QoL compared with NPH/semilente insulin OBJECTIVE To define the relationship between HbA(1c) and plasma glucose (PG) levels in patients with type 1 diabetes using data from the Diabetes Control and Complications Trial (DCCT). RESEARCH DESIGN AND METHODS The DCCT was a multicenter, randomized clinical trial designed to compare intensive and conventional therapies and their relative effects on the development and progression of diabetic complications in patients with type 1 diabetes.
Quarterly HbA(1c) and corresponding seven-point capillary blood glucose profiles (premeal, postmeal, and bedtime) obtained in the DCCT were analyzed to define the relationship between HbA(1c) and PG. Only data from complete profiles with corresponding HbA(1c) were used (n = 26,056). Of the 1,441 subjects who participated in the study, 2 were excluded due to missing data. Mean plasma glucose (MPG) was estimated by multiplying capillary blood glucose by 1.11. Linear regression analysis weighted by the number of observations per subject was used to correlate MPG and HbA(1c). RESULTS Linear regression analysis, using MPG and HbA(1c) summarized by patient (n = 1,439), produced a relationship of MPG (mmol/l) = (1.98 × HbA(1c)) − 4.29, or MPG (mg/dl) = (35.6 × HbA(1c)) − 77.3 (r = 0.82). Among individual time points, afternoon and evening PG (postlunch, predinner, postdinner, and bedtime) showed higher correlations with HbA(1c) than the morning time points (prebreakfast, postbreakfast, and prelunch). CONCLUSIONS We have defined the relationship between HbA(1c) and PG as assessed in the DCCT. Knowing this relationship can help patients with diabetes and their healthcare providers set day-to-day targets for PG to achieve specific HbA(1c) goals OBJECTIVE To determine the efficacy and safety of insulin glargine (IG) in children and adolescents with type 1 diabetes. In a prospective, 6-month study, 80 patients, aged 2-19 years, received IG once daily plus insulin regular or rapid analogue before meals. The data on body mass index, frequency of severe hypoglycaemia, daily mean blood glucose, fasting blood glucose, haemoglobin A1c and total daily insulin dosage before and after institution of glargine therapy were collected. RESULTS After 6 months, the average HbA1c level in the entire cohort dropped from 7.63 +/- 0.81 to 7.14 +/- 0.70% (p < 0.001).
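The DCCT regression relating HbA1c to mean plasma glucose (MPG (mmol/l) = 1.98 × HbA1c − 4.29; MPG (mg/dl) = 35.6 × HbA1c − 77.3) can be applied directly as a conversion helper; only the example input value below is chosen for illustration:

```python
def estimated_mpg(hba1c_percent: float, unit: str = "mg/dl") -> float:
    """Estimated mean plasma glucose from HbA1c using the DCCT regression.

    MPG (mmol/l) = 1.98 * HbA1c - 4.29
    MPG (mg/dl)  = 35.6 * HbA1c - 77.3
    """
    if unit == "mmol/l":
        return 1.98 * hba1c_percent - 4.29
    if unit == "mg/dl":
        return 35.6 * hba1c_percent - 77.3
    raise ValueError("unit must be 'mmol/l' or 'mg/dl'")

# Example: an HbA1c of 7% corresponds to roughly 172 mg/dl (about 9.6 mmol/l).
print(round(estimated_mpg(7.0), 1))
print(round(estimated_mpg(7.0, "mmol/l"), 2))
```

As the abstract notes, this is a population-level regression (r = 0.82), so individual patients can deviate substantially from the predicted value.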
Fasting blood glucose decreased from 161 +/- 37 to 150 +/- 35 mg/dl (p < 0.05) in the total group. Severe hypoglycaemic episodes were reduced from 0.18 events per patient in the 6 months before IG therapy to 0.11 events per patient in the 6 months after IG therapy. The total daily insulin dose was reduced in the entire group from 0.90 +/- 0.32 to 0.83 +/- 0.29 U/kg/day (p < 0.05). Body mass index (BMI) remained unchanged. In the 14 preschool children, the HbA1c dropped from 7.54 +/- 0.60 to 6.96 +/- 0.57% (p < 0.05). CONCLUSIONS Insulin glargine is an efficacious treatment to improve metabolic control in children and adolescents with type 1 diabetes. It also improved the metabolic control in preschool-age children, without increasing the number of hypoglycaemic events BACKGROUND The goal of new therapies introduced for type 1 diabetes should be to decrease hypoglycemic episodes while improving glycemic control. METHODS A database was used to computer-match the baseline A1C values in 196 subjects with type 1 diabetes receiving multiple daily injections (MDI) consisting of four or more injections per day. There were 98 patients transferred from NPH to insulin glargine (Lantus, Aventis Pharmaceuticals, Bridgewater, NJ), and 98 patients remained on NPH throughout the study. The gender distribution and mean age (approximately 32 years), duration of diabetes (approximately 16 years), and duration of treatment (approximately 13 months) were not significantly different between the groups. The majority of patients were well controlled (> 50% in both groups had an A1C < 7%). RESULTS The mean A1c values were not significantly different in the groups at baseline or at follow-up. Severe hypoglycemic episodes per patient per year were significantly lower in the glargine group compared with the NPH group (0.5 vs. 1.2, respectively; P = 0.04).
The mean end-of-study total (P = 0.03) and long-acting (P = 0.0001) doses were significantly reduced from baseline in the group that switched to glargine, but not in the group that remained on NPH, with no change in the short-acting dose in either group. The weight gain was significantly higher in the NPH group at the end of the study (P = 0.004), with no significant change in the glargine group. CONCLUSIONS Transfer to glargine treatment from NPH in MDI regimens significantly reduces severe hypoglycemic episodes despite a decline in long-acting basal insulin without significant weight gain OBJECTIVE Insulin glargine offers sustained insulin delivery for 24 h. Change to glargine treatment consistently results in lower fasting glucose and fewer hypoglycemic episodes in children with type 1 diabetes compared to continuation of NPH, although glargine has not been shown to improve HbA1c in randomized trials. Studies comparing glargine and NPH in multiple injection therapy in children treated from diagnosis of type 1 diabetes are lacking. METHODS HbA1c and insulin requirement were compared in a retrospective study of children (7-17 yr of age) with type 1 diabetes treated from diagnosis with basal insulin glargine (n = 49) or NPH (n = 49) in a multiple injection therapy (MIT) regimen with a rapid-acting insulin analogue. Patients were followed every third month for 1 yr. HbA1c, insulin dose, and weight data were retrieved. RESULTS HbA1c (mean ± SD) was lower at 3-5 months (5.5 ± 0.89 vs. 6.2 ± 0.89%, p < 0.05) and 6-9 months (5.6 ± 1.14 vs. 6.6 ± 0.99%; p < 0.001) in glargine-treated patients. After 12 months, HbA1c was significantly lower in glargine-treated patients (6.3 ± 1.56 vs. 7.1 ± 1.28; p < 0.01). Reported total insulin doses were similar at nadir (0.5 U/kg BW × 24 h), but significantly lower at 12 months in glargine-treated patients (0.64 ± 0.23 vs. 0.86 ± 0.3 U/kg BW × 24 h; p < 0.001).
CONCLUSIONS HbA1c 1 yr from diagnosis was lower in children treated with glargine from the start as compared with those on NPH. This observation should be viewed in the light of a significantly lower dose of total daily insulin in the glargine group ABSTRACT Objective: To investigate the effect of initiating insulin glargine (glargine: LANTUS*), a once-daily basal insulin analogue, plus an educational programme, on glycaemic control and body weight in patients with type 1 diabetes in clinical practice. Research design and methods: A retrospective analysis of the medical records of 65 patients (mean age: 40.7 ± 13.3 years) with type 1 diabetes was performed. Patients had previously been treated with NPH insulin (NPH; n = 54) or NPH insulin + lente insulin (NPH + lente; n = 11) and then received glargine once daily (bedtime), plus short-acting prandial insulin, for 30 months. Before initiation of glargine, patients participated in a diabetes educational programme and then received physician consultations throughout the study. Metabolic control, body weight and severe hypoglycaemia data were analysed at 9 and 30 months. Results: Following initiation of glargine, patients showed a decrease in HbA1c from 7.29 ± 1.1% to 7.06 ± 1.0%; p < 0.01 at 30 months. When the results were analysed by pre-treatment, both NPH-pre-treated and NPH+lente-pre-treated patients showed a significant reduction in HbA1c of 0.14% and 0.82%, respectively, at 30 months (7.27 ± 1.2% to 7.13 ± 1.1% and 7.42 ± 1.2 to 6.60 ± 0.3%, respectively; p < 0.01). No change in body weight was observed in the overall group. No episodes of severe hypoglycaemia (blood glucose < 40 mg/dL [< 2.2 mmol/L]) occurred.
Conclusions: In this retrospective study of medical records, patients with type 1 diabetes treated with insulin glargine over 30 months in combination with educational support and close clinical supervision decreased their HbA1c levels without weight gain versus previous treatment with NPH insulin or insulin lente. Further studies in a larger cohort of patients would help to confirm these results
2,077
20,384,548
The findings of this review have been used to generate best-practice reporting guidelines for AFO intervention studies.
Studies which have examined the effects of ankle-foot orthoses (AFOs) on children with cerebral palsy (CP) often report insufficient detail about the participants, devices and testing protocols. The aim of this systematic review was to evaluate the level and quality of detail reported about these factors in order to generate best-practice guidelines for reporting of future studies.
Developing an evidence base for making public health decisions will require using data from evaluation studies with randomized and nonrandomized designs. Assessing individual studies and using studies in quantitative research syntheses require transparent reporting of the study, with sufficient detail and clarity to readily see differences and similarities among studies in the same area. The Consolidated Standards of Reporting Trials (CONSORT) statement provides guidelines for transparent reporting of randomized clinical trials. We present the initial version of the Transparent Reporting of Evaluations with Nonrandomized Designs (TREND) statement. These guidelines emphasize the reporting of theories used and descriptions of intervention and comparison conditions, research design, and methods of adjusting for possible biases in evaluation studies that use nonrandomized designs Eighteen children with hemiplegia, mean age 8 years 5 months, underwent gait analysis and musculoskeletal modelling using specially designed software. The maximum lengths of the hamstrings were determined for each child walking in and out of an ankle-foot orthosis (AFO). The muscles were deemed to be short if shorter than the normal average −1 SD. In bare feet 8 participants had short medial hamstrings, with a higher proportion of these in the less involved individuals. All participants showed an increase in maximum hamstring length when wearing an AFO. In all but one child this was sufficient to restore hamstring length to within normal limits. These findings suggest that hamstring pathology in hemiplegic gait is usually secondary to more distal lower limb pathology This study compared the functional efficacy of three commonly prescribed ankle-foot orthosis (AFO) configurations (solid [SAFO], hinged [HAFO], and posterior leaf spring [PLS]).
Sixteen independently ambulatory children (10 males, six females; mean age 8 years 4 months, SD 2 years 4 months; range 4 years 4 months to 11 years 6 months) with spastic diplegia participated in this study. Four children were classified at level I of the Gross Motor Function Classification System (GMFCS; Palisano et al. 1997); the remaining 12 were at level II. Children were assessed barefoot (BF) at baseline (baseline assessment of energy consumption was performed with shoes on, no AFO) and in each orthotic configuration after three months of use, using gait analysis, oxygen consumption, and functional outcome measures. AFO use did not markedly alter joint kinematics or kinetics at the pelvis, hip, or knee. All AFO configurations normalized ankle kinematics in stance, increased step/stride length, decreased cadence, and decreased energy cost of walking. Functionally, all AFO configurations improved the execution of walking/running/jumping skills, upper extremity coordination, and fine motor speed/dexterity. However, the quality of gross motor skill performance and independence in mobility were unchanged. These results suggest that most children with spastic diplegia benefit functionally from AFO use. However, some children at GMFCS level II demonstrated a subtle but detrimental effect on function with HAFO use, shown by an increase in peak knee extensor moment in early stance, excessive ankle dorsiflexion, decreased walking velocity, and greater energy cost. Therefore, constraining ankle motion by using a PLS or SAFO should be considered for most, but not all, children with spastic diplegia Twenty-one subjects with spastic diplegic cerebral palsy were studied to quantify the effects of fixed and articulated ankle-foot orthoses (AFOs) on gait and delineate criteria for their use. Children underwent gait analysis under three conditions: fixed AFOs (FAFOs), articulated AFOs (AAFOs), and shoes alone.
Greater dorsiflexion occurred at initial contact with both FAFOs and AAFOs than shoes alone. Dorsiflexion at terminal stance was greatest in AAFOs. Plantarflexor power generation at preswing was preserved in AAFOs. No differences were found in knee position during stance. Knee-extensor strength was positively related to knee extension during stance. No relationships were found between dorsiflexion range of motion, calf spasticity and strength, and peak dorsiflexion during stance. AAFOs are appropriate for subjects with varying degrees of calf spasticity, as long as adequate passive range of motion is available. These findings can be applied primarily to children who do not have a preexisting tendency to crouch Adequate reporting of randomized, controlled trials (RCTs) is necessary to allow accurate critical appraisal of the validity and applicability of the results. The CONSORT (Consolidated Standards of Reporting Trials) Statement, a 22-item checklist and flow diagram, is intended to address this problem by improving the reporting of RCTs. However, some specific issues that apply to trials of nonpharmacologic treatments (for example, surgery, technical interventions, devices, rehabilitation, psychotherapy, and behavioral intervention) are not specifically addressed in the CONSORT Statement. Furthermore, considerable evidence suggests that the reporting of nonpharmacologic trials still needs improvement. Therefore, the CONSORT group developed an extension of the CONSORT Statement for trials assessing nonpharmacologic treatments. A consensus meeting of 33 experts was organized in Paris, France, in February 2006, to develop an extension of the CONSORT Statement for trials of nonpharmacologic treatments. The participants extended 11 items from the CONSORT Statement, added 1 item, and developed a modified flow diagram.
To allow adequate understanding and implementation of the CONSORT extension, the CONSORT group developed this elaboration and explanation document from a review of the literature to provide examples of adequate reporting. This extension, in conjunction with the main CONSORT Statement and other CONSORT extensions, should help to improve the reporting of RCTs performed in this field The purpose of this study was to examine the effectiveness of the hinged ankle-foot orthosis (HAFO), posterior leaf spring (PLS), and solid ankle-foot orthosis (SAFO) in preventing contracture, improving efficiency of gait, and enhancing performance of functional motor skills in 30 children (21 male, 9 female; mean age 9 years 4 months; age range 4 to 18 years) with spastic hemiplegia. Following a 3-month baseline period of no ankle-foot orthosis (AFO) use, each AFO was worn for 3 months, after which ankle range of motion, gait analysis, energy consumption, and functional motor skills were assessed. The HAFO and PLS increased passive ankle dorsiflexion and normalization of ankle rocker function during gait. Normalization of knee motion in stance was dependent upon the knee abnormality present and AFO configuration. The HAFO was the most effective in controlling knee hyperextension in stance, while the PLS was the most effective in promoting knee extension in children with > 10 degrees knee flexion in stance. Energy efficiency was improved in 21 of the children, with 13 of these children demonstrating the greatest improvement in HAFO and PLS. Improvements in functional mobility were greatest in the HAFO and PLS The effectiveness of ankle-foot orthoses (AFOs) on walking pattern was studied in 12 children with cerebral palsy between the ages of 3 and 7 years. Over a 2-year period two trials of fortnightly periods without AFOs were carried out.
The range of ankle dorsiflexion, video analysis looking specifically at footfall, and rank scoring of the mediolateral shear force obtained using an oscilloscope printout from a Kistler force platform were compared with measurements obtained during periods when the child was wearing AFOs. The results using different outcomes were consistent, and indicated that the range of movement and gait deteriorated during the two trial periods during which the splints were not worn compared with periods during which the splints were worn. This finding needs to be confirmed in a large randomised controlled trial BACKGROUND AND PURPOSE This study compared the effects of dynamic ankle-foot orthoses (DAFOs) with a plantar-flexion stop, polypropylene solid ankle-foot orthoses (AFOs), and no AFOs on the gait of children with cerebral palsy (CP). These orthoses were used to reduce the excessive ankle plantar flexion during the stance phase of gait. SUBJECTS AND METHODS Ten children with spastic CP (6 with diplegia and 4 with hemiplegia) were tested after wearing no AFOs for an initial 2-week period, solid AFOs for 1 month, no AFOs for an additional 2 weeks, and DAFOs for 1 month. The effects of the two orthoses and no AFOs on lower-extremity muscle timing, joint motions, and temporal-distance characteristics were compared. RESULTS Both orthoses increased stride length, decreased cadence, and reduced excessive ankle plantar flexion when compared with no orthoses. No differences were found for the gait variables when comparing the two orthoses.
CONCLUSION AND DISCUSSION Based on the data, the authors believe that although both orthoses would be recommended for children with spastic CP and excessive ankle plantar flexion during stance, additional individual factors should be considered when selecting either orthosis OBJECTIVE: To test the feasibility of creating a valid and reliable checklist with the following features: appropriate for assessing both randomised and non-randomised studies; provision of both an overall score for study quality and a profile of scores not only for the quality of reporting, internal validity (bias and confounding) and power, but also for external validity. DESIGN: A pilot version was first developed, based on epidemiological principles, reviews, and existing checklists for randomised studies. Face and content validity were assessed by three experienced reviewers and reliability was determined using two raters assessing 10 randomised and 10 non-randomised studies. Using different raters, the checklist was revised and tested for internal consistency (Kuder-Richardson 20), test-retest and inter-rater reliability (Spearman correlation coefficient and sign rank test; kappa statistics), criterion validity, and respondent burden. MAIN RESULTS: The performance of the checklist improved considerably after revision of a pilot version. The Quality Index had high internal consistency (KR-20: 0.89), as did the subscales apart from external validity (KR-20: 0.54). Test-retest (r = 0.88) and inter-rater (r = 0.75) reliability of the Quality Index were good. Reliability of the subscales varied from good (bias) to poor (external validity). The Quality Index correlated highly with an existing, established instrument for assessing randomised studies (r = 0.90). There was little difference between its performance with non-randomised and with randomised studies.
Raters took about 20 minutes to assess each paper (range 10 to 45 minutes). CONCLUSIONS: This study has shown that it is feasible to develop a checklist that can be used to assess the methodological quality not only of randomised controlled trials but also non-randomised studies. It has also shown that it is possible to produce a checklist that provides a profile of the paper, alerting reviewers to its particular methodological strengths and weaknesses. Further work is required to improve the checklist and the training of raters in the assessment of external validity OBJECTIVE To investigate the effectiveness of the hinged ankle-foot orthosis (AFO) on sit-to-stand (STS) transfers in children with spastic cerebral palsy. DESIGN Before-after trial. SETTING University-affiliated hospital. PARTICIPANTS Nineteen spastic diplegic children (age range, 2-6 y). INTERVENTIONS Not applicable. MAIN OUTCOME MEASURES The transitional movement of STS was tested in random order with children while wearing the barefoot and hinged AFOs. The temporal, kinematic, and kinetic data during the task were collected by using a motion analyzer (with 6 infrared cameras). Statistical comparison between barefoot and hinged AFO was done with the Wilcoxon signed-rank test. RESULTS Total duration of STS transfer was significantly shortened with the hinged AFO (P < .05). The initial knee flexion, the initial angle, and the final angle of ankle dorsiflexion were increased with the AFO, compared with when barefoot (P < .05). However, the increased pelvic tilt and hip flexion while barefoot was not reduced with the AFO. The maximal moment and power of hip and knee joints were significantly increased with the AFO (P < .05), whereas the maximal moment and power of the ankle joint were not significantly changed when wearing the AFO.
CONCLUSIONS Although the proximal compensatory strategy of increased pelvic tilt and hip flexion did not change with the hinged AFO, some improvements in temporal, kinematic, and kinetic parameters were identified during the task. These findings suggest that a hinged AFO is beneficial for STS transfer activity for children with spastic diplegia Several studies indicated that walking with an ankle foot orthosis (AFO) impaired the third rocker. The purpose of this study was to evaluate the effects of two types of orthoses, with similar goal settings, on gait, in a homogeneous group of children, using both barefoot and shoe walking as control conditions. Fifteen children with hemiplegia, aged between 4 and 10 years, received two types of individually tuned AFOs: a common posterior leaf-spring (PLS) and a Dual Carbon Fiber Spring AFO (CFO) (with carbon fibre at the dorsal part of the orthosis). Both orthoses were expected to prevent plantar flexion, thus improving the first rocker, allowing dorsiflexion to improve the second rocker, absorbing energy during the second rocker, and returning it during the third rocker. The effect of the AFOs was studied using objective gait analysis, including 3D kinematics and kinetics, in four conditions: barefoot, shoes without AFO, and PLS and CFO combined with shoes. Several gait parameters changed significantly in shoe walking compared to barefoot walking (cadence, ankle ROM and velocity, knee shock absorption, and knee angle in swing). The CFO produced a significantly larger ankle ROM and ankle velocity during push-off, and an increased plantar flexion moment and power generation at pre-swing, compared to the PLS (p < 0.01). The results of this study further support the findings of previous studies indicating that orthoses improve specific gait parameters compared to barefoot walking (velocity, step length, first and second ankle rocker, sagittal knee and hip ROM).
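The kinetic measures cited throughout these gait studies (joint moments and joint powers) are linked by a standard definition: instantaneous joint power is the net joint moment multiplied by the joint angular velocity, with positive values read as power generation (e.g. at push-off) and negative values as absorption (e.g. during loading response). A minimal sketch with hypothetical numbers, not values from any of the trials:

```python
import math

def joint_power(moment_nm, angular_velocity_deg_per_s):
    """Instantaneous joint power (W): net joint moment (N·m) × angular velocity (rad/s).
    Positive = power generation, negative = power absorption."""
    return moment_nm * math.radians(angular_velocity_deg_per_s)

# Hypothetical push-off instant: a 1.2 N·m/kg plantar-flexor moment for a 30 kg
# child, with the ankle plantar flexing at 200 deg/s
power_w = joint_power(1.2 * 30, 200)   # ≈ 125.7 W generated
```

This is why an AFO that limits ankle excursion also shows up as reduced ankle power generated in push-off: clamping angular velocity near zero drives the product toward zero regardless of the moment.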
However, compared to shoes, not all improvements were statistically significant Randomized controlled trials of public health interventions are often complex: practitioners may not deliver interventions as researchers intended, participants may not initiate interventions and may not behave as expected, and interventions and their effects may vary with environmental and social context. Reports of randomized controlled trials can be misleading when they omit information about the implementation of interventions, yet such data are frequently absent in trial reports, even in journals that endorse current reporting guidelines. Particularly for complex interventions, the Consolidated Standards of Reporting Trials (CONSORT) statement does not include all types of information needed to understand the results of randomized controlled trials. CONSORT should be expanded to include more information about the implementation of interventions in all trial arms We evaluated the effect of articulating and solid ankle-foot orthoses (AFOs) on the transitional movement of sit-to-stand for 15 children aged 2-5 years with spastic diplegia and dynamic equinus. Kinematic and kinetic data were collected for each child. The time to reach stable standing was determined by using a force plate. Seven children were comparable to age-matched normals while barefoot and were slowed by the use of AFOs. Eight patients were more than 1 standard deviation slower than normals while barefoot. All were significantly (p < 0.003) improved by the use of articulating AFOs. The clinical difference between these groups was the presence of equinus during stable standing while barefoot for patients aided by AFOs, whereas the second group remained plantigrade barefoot.
We conclude that children with spastic diplegia with uncontrolled dynamic equinus benefit from the use of articulating AFOs for the movement of sit-to-stand PURPOSE The aim of this study was to assess the effects of hinged ankle foot orthoses (AFO) on the metabolic and cardiopulmonary cost of walking and the gross motor skills of children with cerebral palsy (CP). METHODS Ten habitual users of hinged AFO with spastic diplegic CP (9.01 ± 2.10 yr) participated in the study. Expired gas and heart rate (HR) were measured during sitting and with AFO on and off during steady-state treadmill walking at three speeds: 3 km·h(-1), comfortable walking speed (CWS), and 90% of their fastest walking speed (FWS). Comfortable and fastest ground walking speed and Gross Motor Function Measure scores were also assessed with AFO on and off and analyzed with ANOVA. Because not all children could walk at all speeds on the treadmill, an ANOVA was performed on data for children who walked at 3 km·h(-1) and CWS (N = 8 for HR; N = 9 for pulmonary ventilation and metabolic variables) and a t-test on data at 90% of FWS (N = 9 for HR; N = 8 for pulmonary ventilation and metabolic variables). RESULTS When children wore their AFO, net oxygen uptake (L·min(-1), absolute minus sitting values) was significantly (P < 0.05) reduced by 8.9% at 3 km·h(-1) and by 5.9% at 90% of FWS. Net pulmonary ventilation (L·min(-1)) was significantly (P < 0.05) lower with AFO on by 10.3%, but only at 3 km·h(-1). AFO did not affect net HR (beats·min(-1)) nor the respiratory exchange ratio at any speed, nor any physiologic variable at CWS, nor gross motor skills. CONCLUSIONS Use of hinged AFO reduces the oxygen and ventilatory cost of walking in children with spastic diplegic CP This study analyzed the effects of tone-reducing features in ankle-foot orthotics (AFOs) on the gait of eight children (ages 4-11 years) with spastic diplegic cerebral palsy.
A standard gait analysis was performed on each subject in each of three trial orthotics and in a baseline shoes-only condition. A 4-week accommodation period was allotted for each of the three devices: a standard hinged AFO, an AFO with tone-reducing features, and a supramalleolar orthotic with tone-reducing features. Most significant differences were at the ankle, between free-ankle and plantar flexion-limiting conditions. No significant functional changes in gait were evident with the addition of tone-reducing properties to a standard articulating AFO Orthoses are the primary conservative treatment option for control of dynamic equinus in spastic cerebral palsy. Our purpose was to compare the effects of a fixed ankle-foot orthosis (AFO), a supramalleolar orthosis (SMO), and a no-brace condition, but including shoes. Gait analyses were performed on 11 children with spastic diplegia, using a system with four cameras and two concealed force plates. Ensemble averages of time-distance, kinematic, and kinetic parameters were obtained for each condition, and a repeated measures analysis of variance was performed (P < 0.05). Among the important findings were the following: (1) AFOs significantly reduced ankle excursion, increased dorsiflexion angle at foot strike, increased plantar flexion moment in push-off, decreased ankle power absorbed during loading response, and decreased ankle power generated in push-off; (2) SMOs did not restrict ankle range of motion or significantly alter the power and moment values at the ankle joint. Although neither brace changed stride length and walking speed, AFOs did offer some biomechanical benefits to the child with spastic diplegia, whereas SMOs appeared to have very little measurable effect
2,078
23,529,287
Conclusion Based on current findings, training frequencies of two to four resistance training sessions per muscle group/week can be prescribed to develop upper and lower body strength and power. Strength levels can be maintained for up to 3 weeks of detraining, but decay rates will increase thereafter (i.e. 5-16 weeks). The effect of explosive-ballistic training and detraining on pure power development and decay in elite rugby and American football players remains inconclusive. The long-term effects of periodized resistance training programmes on strength and power seem to follow the law of diminishing returns: as training exposure increases beyond 12-24 months, adaptation rates are reduced
Background and aims Strength and power are crucial components of excelling in all contact sports, and understanding how a player's strength and power levels fluctuate in response to various resistance training loads is of great interest, as it will inevitably dictate the loading parameters throughout a competitive season. This is a systematic review of training, maintenance and detraining studies, focusing on the development, retention and decay rates of strength and power measures in elite rugby union, rugby league and American football players.
The purpose of this study was to compare changes in velocity-specific adaptations in moderately resistance-trained athletes who trained with either low or high resistances. The study used tests of sport-specific skills across an intermediate- to high-velocity spectrum. Thirty NCAA Division I baseball players were randomly assigned to either a low-resistance (40-60% 1 repetition maximum [1RM]) training group or a high-resistance (70-90% 1RM) training group. Both of the training groups intended to maximally accelerate each repetition during the concentric phase (IMCA). The 10 weeks of training consisted of 4 training sessions a week using basic core exercises. Peak force, velocity and power were evaluated during set angle and depth jumps as well as weighted jumps using 30 and 50% 1RM. Squat 1RMs were also tested. Although no interactions for any of the jump tests were found, trends supported the hypothesis of velocity-specific training. Percentage gains suggest that the combined use of heavier training loads (70-90% 1RM) and IMCA tends to increase peak force in the lower-body leg and hip extensors. Trends also show that the combined use of lighter training loads (40-60% 1RM) and IMCA tends to increase peak power and peak velocity in the lower-body leg and hip extensors. The high-resistance group improved squats more than the low-resistance group (p < 0.05; +22.7 vs. +16.1 kg). The results of this study support the use of a combination of heavier training loads and IMCA to increase 1RM strength in the lower bodies of resistance-trained athletes The purpose of the present study was to examine the influence of direct supervision on muscular strength, power, and running speed during 12 weeks of resistance training in young rugby league players.
Two matched groups of young (16.7 ± 1.1 years [mean ± SD]), talented rugby league players completed the same periodized resistance-training program in either a supervised (SUP) (N = 21) or an unsupervised (UNSUP) (N = 21) environment. Measures of 3 repetition maximum (3RM) bench press, 3RM squat, maximal chin-ups, vertical jump, 10- and 20-m sprints, and body mass were completed pretest (week 0), midtest (week 6), and posttest (week 12) of the training program. Results show that 12 weeks of periodized resistance training resulted in increased body mass, 3RM bench press, 3RM squat, maximum number of chin-ups, vertical jump height, and 10- and 20-m sprint performance in both groups (p < 0.05). The SUP group completed significantly more training sessions, which were significantly correlated to strength increases for 3RM bench press and squat (p < 0.05). Furthermore, the SUP group significantly increased 3RM squat strength (at 6 and 12 weeks) and 3RM bench press strength (12 weeks) when compared to the UNSUP group (p < 0.05). Finally, the percent increase in the 3RM bench press, 3RM squat, and maximal chin-ups was also significantly greater in the SUP group than in the UNSUP group (p < 0.05). These findings show that the direct supervision of resistance training in young athletes results in greater training adherence and increased strength gains than does unsupervised training For many sporting activities, initial speed rather than maximal speed would be considered of greater importance to successful performance. The purpose of this study was to identify the relationship between strength and power and measures of first-step quickness (5-m time), acceleration (10-m time), and maximal speed (30-m time).
The maximal strength (3 repetition maximum [3RM]), power (30-kg jump squat, countermovement, and drop jumps), isokinetic strength measures (hamstring and quadriceps peak torques and ratios at 60°·s-1 and 300°·s-1), and 5-m, 10-m, and 30-m sprint times of 26 part-time and full-time professional rugby league players (age 23.2 ± 3.3 years) were measured. To examine the importance of the strength and power measures for sprint performance, a correlational approach and a comparison between the means of the fastest and slowest players were used. The correlations between the 3RM, drop jump, isokinetic strength measures, and the 3 measures of sport speed were nonsignificant. Correlations between the jump squat (height and relative power output) and countermovement jump height and the 3 speed measures were significant (r = −0.43 to −0.66, p < 0.05). The squat and countermovement jump heights as well as squat jump relative power output were the only variables found to be significantly greater in the fast players. It was suggested that improving the power-to-weight ratio as well as plyometric training involving countermovement and loaded jump-squat training may be more effective for enhancing sport speed in elite players Six men and six women trained the elbow flexors of both arms 3 d·wk-1 for 20 wk. In each training session, one arm did 3-5 sets of 10 maximal concentric actions on an accommodating resistance device (ARD), the other arm 3-5 sets of 8-12 coupled eccentric/concentric actions (repetitions) to volitional failure (8-12 RM) on a weight resistance device (WRD). The average "intensity" (force of concentric actions) was approximately 1.25 times greater in ARD training, the average "volume" (number of actions × force of actions) 1.6 times greater in WRD training, and the time required to complete a training session the same for each.
Both types of training produced significant increases in a single maximum weight lift (1 RM on the WRD), in the peak force of a single maximal concentric action measured on the ARD and an isovelocity dynamometer, and in biceps, brachialis, and total elbow flexor cross-sectional area (CSA). Biceps Type I and II fiber area did not change significantly. WRD training produced greater increases than ARD training in the 1 RM test on the WRD and in brachialis CSA. The data indicate that both of these common training regimens effectively increase strength and muscle mass, but the weight training regimen may be more effective for increasing muscle mass The purpose of this study was to determine if three training loads equated by volume differed in terms of the temporal, kinematic and kinetic characteristics of each set. Twelve experienced weightlifters (30.2 ± 10.6 years old and 75.8 ± 13.0 kg) performed three sets (6 x 30% 1RM, 3 x 60% 1RM and 2 x 90% 1RM) of ballistic squats on an instrumented supine squat machine. Repeated measures ANOVA and Tukey HSD post hoc comparisons were used to distinguish significant differences between the three training loads on a variety of temporal, kinematic and kinetic variables. Significantly (p < 0.05) greater total time under tension during the eccentric (41-53%) and concentric phases (27-31%) was observed for the 30% 1RM condition compared to the other two conditions. Similarly, the lighter loading intensity resulted in significantly greater total eccentric (9-19%) and concentric (14-24%) force output compared to the other two conditions. Greater total power output was associated with the 30% 1RM condition for both the eccentric (25-48%) and concentric (40-69%) phases. Greater total work (9-24%) was also associated with the 30% 1RM condition. The 60% 1RM condition produced significantly greater total work, force and power compared to the 90% 1RM condition.
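The three ballistic-squat conditions above are "equated by volume" in the repetitions × load sense: each set scheme totals 180 %1RM-repetitions (6 × 30 = 3 × 60 = 2 × 90). A quick arithmetic check (condition labels taken from the abstract):

```python
# reps per set × load (as % of 1RM) for each ballistic-squat condition
conditions = {
    "6 x 30% 1RM": (6, 30),
    "3 x 60% 1RM": (3, 60),
    "2 x 90% 1RM": (2, 90),
}

# every scheme totals 180 %1RM-reps, which is what makes the loads volume-equated
relative_volume = {name: reps * load for name, (reps, load) in conditions.items()}
assert set(relative_volume.values()) == {180}
```

Equating relative volume this way is what lets the study attribute the differences in time under tension, work, and power to loading intensity rather than to the total amount of work prescribed.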
However, greater concentric impulse (29-42%) was associated with the 90% 1RM condition. It is suggested that strength and power research needs to adopt a set kinematic and kinetic analysis approach within research designs so that a better understanding of the nature of the neuromuscular adaptations elicited by different loading parameters is achieved The purpose of this investigation was to examine the effect of an 8-week training program with heavy- vs. light-load jump squats on various physical performance measures and electromyography (EMG). Twenty-six athletic men with varying levels of resistance training experience performed sessions of jump squats with either 30% (JS30, n = 9) or 80% (JS80, n = 10) of their one repetition maximum in the squat (1RM) or served as a control (C, n = 7). An agility test, 20-m sprint, and jump squats with 30% (30J), 55% (55J), and 80% (80J) of their 1RM were performed before and after training. Peak force, peak velocity (PV), peak power (PP), jump height, and average EMG (concentric phase) were calculated for the jumps. There were significant increases in PP and PV in the 30J, 55J, and 80J for the JS30 group (p < 0.05). The JS30 group also significantly increased in the 1RM with a trend towards improved 20-m sprint times. In contrast, the JS80 group significantly increased both PF and PP in the 55J and 80J and significantly increased in the 1RM but ran significantly slower in the 20-m sprint. In the 30J the JS30 group's percentage increase in EMG activity was significantly different from the C group. In the 80J the JS80 group's percentage increase in EMG activity was significantly different from the C group.
This investigation indicates that training with light-load jump squats results in increased movement velocity capabilities and that velocity-specific changes in muscle activity may play a key role in this adaptation Abstract Seventeen subjects performed resistance training of the leg extensor and flexor muscle groups two (2/wk) or three (3/wk) times per week. Changes in the relative myosin heavy chain (MHC) isoform contents (I, IIa and IIx) of the vastus lateralis and isometric, isokinetic and squat-lift one-repetition maximum (1RM) strength were compared between conditions after both a common training period (6 weeks) and number of training sessions (18). After 6 weeks and 18 sessions (9 weeks for the 2/wk group), increments in 1RM strength for the 3/wk and 2/wk groups were similar [effect size (ES) differences ≈0.3, 3/wk > 2/wk], whereas the 2/wk group presented greater isokinetic (ES differences = 0.3-1.2) and isometric (ES differences ≈0.7) strength increases than the 3/wk condition. A significant (P < 0.05) increase in MHC IIa percentage was evident for the 2/wk group after 18 sessions. Both training groups exhibited a trend towards a reduction in the relative MHC IIx content and an increase in MHC IIa content (ES range = 0.5-1.24). However, correlations between changes in the strength and MHC profiles were weak (r2: 0.0-0.5). Thus, isometric and isokinetic strength responses to variations in training frequency differed from 1RM strength responses, and changes in strength were not strongly related to alterations in relative MHC content This study was performed to determine which of three theoretically optimal resistance training modalities resulted in the greatest enhancement of the performance of a series of dynamic athletic activities. The three training modalities included 1) traditional weight training, 2) plyometric training, and 3) explosive weight training at the load that maximized mechanical power output.
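The effect-size (ES) differences quoted in the training-frequency study above are standardized mean differences. A minimal Cohen's-d sketch using the pooled standard deviation (the group summaries below are hypothetical, not data from the study):

```python
def cohens_d(mean1, mean2, sd1, sd2, n1, n2):
    """Cohen's d: difference in group means divided by the pooled standard deviation."""
    pooled_sd = (((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2)) ** 0.5
    return (mean1 - mean2) / pooled_sd

# Hypothetical 1RM summaries for two small training groups (kg)
d = cohens_d(105.0, 100.0, 10.0, 10.0, 9, 9)   # = 0.5
```

On the usual reading, the ≈0.3 differences reported for 1RM are small, while the 0.3-1.2 range for the isokinetic measures spans small to large effects.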
Sixty-four previously trained subjects were randomly allocated to four groups that included the above three training modalities and a control group. The experimental groups trained for 10 wk performing either heavy squat lifts, depth jumps, or weighted squat jumps. All subjects were tested prior to training, after 5 wk of training and at the completion of the training period. The test items included 1) 30-m sprint, 2) vertical jumps performed with and without a countermovement, 3) maximal cycle test, 4) isokinetic leg extension test, and 5) a maximal isometric test. The experimental group which trained with the load that maximized mechanical power achieved the best overall results in enhancing dynamic athletic performance, recording statistically significant (P < 0.05) improvements on most test items and producing statistically superior results to the two other training modalities on the jumping and isokinetic tests The purpose of this study was to explore the effects of 5 weeks of eccentrically loaded and unloaded jump squat training in experienced resistance-trained athletes during the strength/power phase of a 15-week periodized off-season resistance training program. Forty-seven male college football players were randomly assigned to 1 of 3 groups. One group performed the jump squat exercise using both concentric and eccentric phases of contraction (CE; n = 15). A second group performed the jump squat exercise using the concentric phase only (n = 16), and a third group did not perform the jump squat exercise and served as control (CT; n = 16). No significant differences between the groups were seen in power, vertical jump height, 40-yd sprint speed and agility performance. In addition, no differences between the groups were seen in integrated electromyography activity during the jump squat exercise.
Significant differences between the CE and CT groups were seen in Δ 1RM squat (65.8 and 27.5 kg, respectively) and Δ 1RM power clean (25.9 and 3.8 kg, respectively). No other between-group differences were observed. Results of this study provide evidence of the benefits of the jump squat exercise during a short-duration (5-week) training program for eliciting strength and power gains. In addition, the eccentric phase of this ballistic movement appears to have important implications for eliciting these strength gains in college football players during an off-season training program. Thus, coaches incorporating jump squats (using both concentric and eccentric phases of contraction) in the off-season training programs of their athletes can see significant performance improvements during a relatively short duration of training Tribulus terrestris is an herbal nutritional supplement that is promoted to produce large gains in strength and lean muscle mass in 5-28 days (15, 18). Although some manufacturers claim T. terrestris will not lead to a positive drug test, others have suggested that T. terrestris may increase the urinary testosterone/epitestosterone (T/E) ratio, which may place athletes at risk of a positive drug test. The purpose of the study was to determine the effect of T. terrestris on strength, fat free mass, and the urinary T/E ratio during 5 weeks of preseason training in elite rugby league players. Twenty-two Australian elite male rugby league players (mean ± SD; age = 19.8 ± 2.9 years; weight = 88.0 ± 9.5 kg) were match-paired and randomly assigned in a double-blind manner to either a T. terrestris (n = 11) or placebo (n = 11) group. All subjects performed structured heavy resistance training as part of the club's preseason preparations. A T. terrestris extract (450 mg·d-1) or placebo capsules were consumed once daily for 5 weeks.
Muscular strength, body composition, and the urinary T/E ratio were monitored prior to and after supplementation. After 5 weeks of training, strength and fat free mass increased significantly without any between-group differences. No between-group differences were noted in the urinary T/E ratio. It was concluded that T. terrestris did not produce the large gains in strength or lean muscle mass that many manufacturers claim can be experienced within 5-28 days. Furthermore, T. terrestris did not alter the urinary T/E ratio and would not place an athlete at risk of testing positive based on the World Anti-Doping Agency's urinary T/E ratio limit of 4:1 Ghigiarelli, JJ, Nagle, EF, Gross, FL, Robertson, RJ, Irrgang, JJ, and Myslinski, T. The effects of a 7-wk heavy elastic band and weight chain program on upper body strength and upper body power in a sample of Division 1-AA football players. J Strength Cond Res 23(3): 756-764, 2009-The purpose of this study was to explore the effects of a 7-week heavy elastic band and weighted-chain program on maximum muscular strength and maximum power in the bench press exercise. Thirty-six (n = 36) healthy men aged 18-30 years old, from the Robert Morris University football team, volunteered to participate in this study. During the first week, predicted 1 repetition maximum (1RM) bench press and 5RM speed bench press tests were conducted. Subjects were randomly divided into 3 groups (n = 12): elastic band (EB), weighted chain (WC), and traditional bench (C). During weeks 2-8 of the study, subjects were required to follow the prescribed resistance training program. Mean and SD of the predicted 1RM bench press and 5RM speed bench press were computed. A two-factor (method × time) analysis was applied to identify significant differences between the training groups. Significance was set at α = 0.05.
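A "predicted 1RM", as used in the bench-press study above, is typically estimated from a submaximal set taken to a known repetition count rather than from a true maximal attempt. The abstract does not state which estimator was used; a common choice is the Epley equation, 1RM ≈ w(1 + r/30). A minimal sketch (numbers hypothetical):

```python
def epley_1rm(weight_kg, reps):
    """Estimate a one-repetition maximum from a submaximal set (Epley equation).
    A single rep is, by definition, the lifted weight itself."""
    if reps < 1:
        raise ValueError("reps must be >= 1")
    return weight_kg if reps == 1 else weight_kg * (1 + reps / 30)

# e.g. a 100 kg bench press performed for 5 reps predicts roughly a 116.7 kg 1RM
predicted = epley_1rm(100, 5)
```

Such estimators are convenient and safer than maximal testing in team settings, though their accuracy degrades as the repetition count rises.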
Results indicated a significant time (p < 0.05) but no group effect for both the predicted 1RM (kg) and 5RM peak power tests (watts). Although not significant, results did show greater nonsignificant improvements in the EB (848-883 W) and WC groups (856-878 W) vs. control (918-928 W) when the 2 highest and greatest values were selected regarding peak power. The use of EB and WC in conjunction with a general offseason strength and conditioning program can increase overall maximum upper-body strength in a sample of Division 1-AA football players. These types of training modalities add a unique training style and more flexibility with respect to exercise prescription for athletes and strength practitioners Mujika, I, Santisteban, J, and Castagna, C. In-season effect of short-term sprint and power training programs on elite junior soccer players. J Strength Cond Res 23(9): 2581-2587, 2009-The aim of this study was to examine the effects of 2 in-season short-term sprint and power training protocols on vertical countermovement jump height (with or without arms), sprint (Sprint-15 m) speed, and agility (Agility-15 m) speed in male elite junior soccer players. Twenty highly trained soccer players (age 18.3 ± 0.6 years, height 177 ± 4 cm, body mass 71.4 ± 6.9 kg, sum of skinfolds 48.1 ± 11.4 mm), members of a professional soccer academy, were randomly allocated to either a CONTRAST (n = 10) or SPRINT (n = 10) group. The training intervention consisted of 6 supervised training sessions over 7 weeks, targeting the improvement of the players' speed and power. The CONTRAST protocol consisted of alternating heavy-light resistance (15-50% body mass) with soccer-specific drills (small-sided games or technical skills). The SPRINT training protocol used line 30-m sprints (2-4 sets of 4 × 30 m with 180 and 90 seconds of recovery, respectively).
At baseline no difference in physical test performance was evident between the 2 groups (p > 0.05). No time × training group effect was found for any of the vertical jump and Agility-15 m variables (p > 0.05). A time × training group effect was found for Sprint-15 m performance, with the CONTRAST group showing significantly better scores than the SPRINT group (7.23 ± 0.18 vs. 7.09 ± 0.20 m·s−1, p < 0.01). In light of these findings, CONTRAST training should be preferred to line sprint training in the short term in young elite soccer players when the aim is to improve soccer-specific sprint performance (15 m) during the competitive season Twenty members of a National Collegiate Athletic Association Division III collegiate football team were assigned to either an Olympic lifting (OL) group or a power lifting (PL) group. Each group was matched by position and trained 4 days·wk-1 for 15 weeks. Testing consisted of field tests to evaluate strength (1RM squat and bench press), 40-yard sprint, agility, vertical jump height (VJ), and vertical jump power (VJP). No significant pre- to posttraining differences were observed in 1RM bench press, 40-yard sprint, agility, VJ or VJP in either group. Significant improvements were seen in 1RM squat in both the OL and PL groups. After log10 transformation, the OL group was observed to have a significantly greater improvement in ΔVJ than the PL group. Despite an 18% greater improvement in 1RM squat (p > 0.05), and a twofold greater improvement (p > 0.05) in 40-yard sprint time by the OL group, no further significant group differences were seen.
Results suggest that OL can provide a significant advantage over PL in vertical jump performance changes This study examined the impact of 4 weeks of either complete cessation of training (DTR) or a tapering period (TAP; short-term reduction of the strength training volume, while the intensity is kept high), subsequent to 16 weeks of periodized heavy resistance training (PRT), on strength/power gains and the underlying physiologic changes in basal circulating anabolic/catabolic hormones in strength-trained athletes. Forty-six physically active men were matched and randomly assigned to a TAP (n = 11), DTR (n = 14), or control group (C; n = 21), subsequent to a 16-week PRT program. Muscular and power testing and blood draws to determine basal hormonal concentrations were conducted before the initiation of training (T0), after 16 weeks of training (T1), and after 4 weeks of either DTR or TAP (T2). Short-term DTR (4 weeks) results in significant decreases in maximal strength (−6 to −9%) and muscle power output (−17 and −14%) of the arm and leg extensor muscles. However, DTR had a significantly (p > 0.01) larger effect on muscle power output than on strength measurements of both upper and lower extremity muscles. Short-term (4 weeks) TAP produced further increases in leg (2%) and arm (2%) maximal strength, whereas no further changes were observed in either upper or lower muscle power output. Short-term DTR resulted in a tendency for elevation of resting serum insulin-like growth factor (IGF)-1 concentrations, whereas the corresponding TAP group experienced elevation in resting serum insulin-like growth factor binding protein-3 (IGFBP-3).
These data indicated that DTR may induce larger declines in muscle power output than in maximal strength , whereas TAP may result in further strength enhancement ( but not muscle power ) , mediated , in part , by training-related differences in IGF-1 and IGFBP-3 concentrations Rugby union football requires muscular strength and endurance , as well as aerobic endurance . Creatine supplementation may enhance muscular performance , but it is unclear if it would interfere with aerobic endurance during running because of increased body mass . The purpose of this study was to determine if creatine supplementation during 8 weeks of a season of rugby union football can increase muscular performance , without negatively affecting aerobic endurance . Rugby union football players were randomized to receive 0.1 g.kg(-1).d(-1 ) creatine monohydrate ( n=9 ) or placebo ( n=9 ) during 8 weeks of the rugby season . Players practiced twice per week for approximately 2 h per session and played one 80 min game per week . Before and after the 8 weeks , players were measured for body composition ( air displacement plethysmography ) , muscular endurance ( number of repetitions at 75 % of one repetition maximum ( 1 RM ) for bench press and leg press ) , and aerobic endurance ( Leger shuttle-run test with 1 min stages of progressively increasing speed ) . There were time main effects for body mass ( -0.7+/-0.4 kg ; p=0.05 ) , fat mass ( -1.9+/-0.8 kg ; p<0.05 ) , and a trend for an increase in lean tissue mass ( + 1.2+/-0.5 kg ; p=0.07 ) , with no differences between groups . The group receiving creatine supplementation had a greater increase in the number of repetitions for combined bench press and leg press tests compared with the placebo group ( + 5.8+/-1.4 vs. + 0.9+/-2.0 repetitions ; p<0.05 ) . There were no changes in either group for aerobic endurance .
Creatine supplementation during a rugby union football season is effective for increasing muscular endurance , but has no effect on body composition or aerobic endurance Hoffman , JR , Ratamess , NA , Klatt , M , Faigenbaum , AD , Ross , RE , Tranchina , NM , McCurley , RC , Kang , J , and Kraemer , WJ . Comparison between different off-season resistance training programs in Division III American college football players . J Strength Cond Res 23(1 ) : 11 - 19 , 2009-The purpose of this study was to examine the efficacy of periodization and to compare different periodization models in resistance trained American football players . Fifty-one experienced resistance trained American football players of an NCAA Division III football team ( after 10 weeks of active rest ) were randomly assigned to 1 of 3 groups that differed only in the manipulation of the intensity and volume of training during a 15-week off-season resistance training program . Group 1 participated in a nonperiodized ( NP ) training program , group 2 participated in a traditional periodized linear ( PL ) training program , and group 3 participated in a planned nonlinear periodized ( PNL ) training program . Strength and power testing occurred before training ( PRE ) , after 7 weeks of training ( MID ) , and at the end of the training program ( POST ) . Significant increases in maximal ( 1-repetition maximum [ 1RM ] ) squat , 1RM bench press , and vertical jump were observed from PRE to MID for all groups ; these increases were still significantly greater at POST ; however , no MID to POST changes were seen . Significant PRE to POST improvements in the medicine ball throw ( MBT ) were seen for the PL group only . The results do not provide a clear indication as to the most effective training program for strength and power enhancements in already trained football players .
Interestingly , recovery of training-related performances was achieved after only 7 weeks of training , yet further gains were not observed . These data indicate that longer periods of training may be needed after a long-term active recovery period and that active recovery may need to be dramatically shortened to better optimize strength and power in previously trained football players The purpose of this investigation was to study the efficacy of two dietary supplements on measures of body mass , body composition , and performance in 42 American football players . Group CM ( n = 9 ) received creatine monohydrate , Group P ( n = 11 ) received calcium pyruvate , Group COM ( n = 11 ) received a combination of calcium pyruvate ( 60 % ) and creatine ( 40 % ) , and Group PL received a placebo . Tests were performed before ( T1 ) and after ( T2 ) the 50 week supplementation period , during which the subjects continued their normal training schedules . Compared to P and PL , CM and COM showed significantly greater increases for body mass , lean body mass , 1 repetition maximum ( RM ) bench press , combined 1 RM squat and bench press , and static vertical jump ( SVJ ) power output . Peak rate of force development for SVJ was significantly greater for CM compared to P and PL . Creatine and the combination supplement enhanced training adaptations associated with body mass/composition , maximum strength , and SVJ ; however , pyruvate supplementation alone was ineffective Three studies that used rugby league players experienced in power training methods as subjects were performed to investigate the resistance ( percentage of 1 repetition maximum [ 1RM ] ) that maximized the average mechanical power output ( Pmax ) during the jump squat exercise . Maximum strength was assessed via 1RM ( studies 2 and 3 ) or 3RM ( study 1 ) during the full-squat exercise . 
Pmax was assessed during barbell jump squats , using resistances of 40 , 60 , 80 , and 100 kg within the Plyometric Power System . All studies found that power output was maximized by resistances averaging circa 85–95 kg , representing 55–59 % of 1RM full-squat strength . However , loads in the range of 47–63 % of 1RM were often similarly effective in maximizing power output . The results of this investigation suggest that athletes specifically trained via both maximal strength and power training methods may generate their maximal power outputs at higher percentages of 1RM than those previously reported for solely strength-trained athletes and that there would appear to be an effective range of resistances for maximizing power output
2,079
28,298,066
There was no consistency among investigations regarding the effects of fatigue on hip , knee , or ankle joint angles and moments or surface electromyography muscle activation patterns . The fatigue protocols typically did not produce statistically significant changes in ground-reaction forces . Conclusion : Published fatigue protocols did not uniformly produce alterations in lower limb neuromuscular factors that heighten the risk of noncontact ACL injuries . Therefore , justification does not currently exist for major changes in ACL injury prevention training programs to account for potential fatigue effects .
Background : Approximately two-thirds of anterior cruciate ligament ( ACL ) tears are sustained during noncontact situations when an athlete is cutting , pivoting , decelerating , or landing from a jump . Some investigators have postulated that fatigue may result in deleterious alterations in lower limb biomechanics during these activities that could increase the risk of noncontact ACL injuries . However , prior studies have noted a wide variation in fatigue protocols , athletic tasks studied , and effects of fatigue on lower limb kinetics and kinematics . Purpose : First , to determine if fatigue uniformly alters lower limb biomechanics during athletic tasks that are associated with noncontact ACL injuries . Second , to determine if changes should be made in ACL injury prevention training programs to counter the deleterious effects of fatigue on lower limb kinetics and kinematics .
BACKGROUND In spite of ongoing prevention developments , anterior cruciate ligament injury rates and the associated sex-disparity have remained , suggesting an incomplete understanding of the injury mechanism . While both fatigue and decision making are known in isolation to directly impact anterior cruciate ligament injury risk , their combined manifestations remain unknown . We thus examined the combined effects of fatigue and decision making on lower limb kinematics during sports-relevant landings . METHODS Twenty-five female National Collegiate Athletic Association athletes had initial contact and peak stance phase 3D lower limb joint kinematics quantified during anticipated and unanticipated single ( left and right ) leg landings , both before and during the accumulation of fatigue . Jump direction was governed by light stimuli activated prior to and during the pre-land phase of respective anticipated and unanticipated trials . To induce fatigue , subjects performed repetitive squat ( n=5 ) and randomly ordered jump sequences , until squats were no longer possible . Subject-based measures of each dependent factor were then calculated across pre-fatigue trials , and for those denoting 100 % and 50 % fatigue , and submitted to a 3-way mixed design analysis of covariance to test for the main effects of fatigue time , decision and leg . FINDINGS Fatigue caused significant increases in initial contact hip extension and internal rotation , and in peak stance knee abduction and internal rotation and ankle supination angles . Fatigue-induced increases in initial contact hip rotations and in peak knee abduction angle were also significantly more pronounced during unanticipated compared to anticipated landings .
INTERPRETATION The integrative effects of fatigue and decision making may represent a worst case scenario in terms of anterior cruciate ligament injury risk during dynamic single leg landings , by perpetuating substantial degradation and overload of central control mechanisms UNLABELLED Fatigue has been shown to alter the biomechanics of the lower extremity during landing tasks . To date , no study has examined the effects of two types of fatigue on kinetics and kinematics . OBJECTIVES This study was conducted to assess biomechanical differences between two fatigue protocols [ Slow Linear Oxidative Fatigue Protocol ( SLO-FP ) and Functional Agility Short-Term Fatigue Protocol ( FAST-FP ) ] . DESIGN Single-group repeated measures design . METHODS Fifteen female collegiate soccer players had to perform five successful trials of unanticipated sidestep cutting ( SS ) pre- and post-fatigue protocols . The SLO-FP consisted of an initial VO(2peak ) test followed by 5-min rest , and a 30-min interval run . The FAST-FP consisted of 4 sets of a functional circuit . Biomechanical measures of the hip and knee were obtained at different instants while performing SS pre- and post-fatigue . Repeated 2 × 2 ANOVAs were conducted to examine task and fatigue differences . Alpha level was set a priori at 0.05 . RESULTS During the FAST-FP , participants had increased knee internal rotation at initial contact ( IC ) ( 12.5 ± 5.9 ° ) when compared to the SLO-FP ( 7.9 ± 5.4 ° , p<0.001 ) . For hip flexion at IC , pre-fatigue had increased angles ( 36.4 ± 8.4 ° ) compared to post-fatigue ( 30.4 ± 9.3 ° , p=0.003 ) , also greater knee flexion during pre-fatigue ( 25.6 ± 6.8 ° ) than post-fatigue ( 22.4 ± 8.4 ° , p=0.022 ) .
CONCLUSION The results of this study showed that hip and knee mechanics were substantially altered during both fatigue conditions OBJECTIVE : To compare the effects of an isokinetic fatigue protocol and a functional fatigue protocol on time to stabilization ( TTS ) , ground reaction force ( GRF ) , and joint kinematics during a jump landing . DESIGN AND SETTING : Subjects were assessed on 2 occasions for TTS , GRF , and joint kinematics immediately before and after completing a fatigue protocol . One week separated the 2 sessions , and the order of fatigue protocols was randomly assigned and counterbalanced . SUBJECTS : Twenty healthy male ( n = 8 , age = 21.8 + /- 1.4 years , height = 180.6 + /- 7.6 cm , and mass = 74.1 + /- 13.0 kg ) and female ( n = 12 , age = 22.2 + /- 2.1 years , height = 169.3 + /- 9.8 cm , and mass = 62.5 + /- 10.1 kg ) subjects volunteered to participate . MEASUREMENTS : Subjects performed 2-legged jumps equivalent to 50 % of maximum jump height , followed by a single-leg landing onto the center of a forceplate 70 cm from the starting position . Peak vertical GRF and vertical , medial-lateral , and anterior-posterior TTS were obtained from forceplate recordings . Maximum ankle dorsiflexion , knee-flexion , and knee-valgum angles were determined using 3-dimensional motion analysis . RESULTS : A 2-way analysis of variance with repeated measures revealed no significant differences when comparing TTS , GRF , and joint kinematics after isokinetic and functional fatigue protocols . CONCLUSIONS : No difference was noted between isokinetic and functional fatigue protocols relative to dynamic stability when landing from a jump Background Altered motor control strategies in landing and jumping maneuvers are a potential mechanism of noncontact anterior cruciate ligament injury . There are biomechanical differences between male and female athletes in the landing phase of stop-jump tasks .
Fatigue is a risk factor in musculoskeletal injuries . Hypothesis Lower extremity muscle fatigue alters the knee kinetics and kinematics during the landing phase of 3 stop-jump tasks and increases an athlete 's risk of anterior cruciate ligament injury . Study Design Controlled laboratory study . Methods Three-dimensional videography and force plate data were collected for 20 recreational athletes ( 10 male and 10 female athletes ) performing 3 stop-jump tasks before and after completing a fatigue exercise . Knee joint angles and resultant forces and moments were calculated . Results Both male and female subjects had significantly increased peak proximal tibial anterior shear forces ( P = .01 ) , increased valgus moments ( P = .03 ) , and decreased knee flexion angles ( P = .03 ) during landings of all 3 stop-jump tasks when fatigued . Fatigue did not significantly affect the peak knee extension moment for male or female athletes . Conclusion Fatigued recreational athletes demonstrate altered motor control strategies , which may increase anterior tibial shear force , strain on the anterior cruciate ligament , and risk of injury for both female and male subjects . Clinical Relevance Fatigued athletes may have an increased risk of noncontact anterior cruciate ligament injury PURPOSE Anterior cruciate ligament injuries and patellofemoral pain syndrome are both common and significant injuries to the knee that have been associated with hip weakness . Prospective studies have linked the risk of experiencing either injury to alterations in the frontal plane knee angle and moment during activity . These components of knee mechanics are theorized to be affected by hip abductor weakness . The purpose of this study was to identify the effects of isolated hip abductor fatigue-induced weakness on lower extremity kinematics and kinetics in recreationally active women .
METHODS Twenty participants performed cut , jump , and run tasks off a raised platform while three-dimensional motion analysis data were collected . Participants then performed an isolated hip abductor fatigue protocol in side lying against isokinetic resistance , followed immediately by repeated biomechanical data collection . Separate repeated-measures ANOVAs ( P < 0.05 ) were used for each dependent variable . RESULTS After the hip fatigue protocol , regardless of task , the knee angle at initial ground contact was more adducted ( pre = 0.7 degrees + /- 3.4 degrees , post = 1.2 degrees + /- 3.9 degrees , F(1,19 ) = 5.3 , P = 0.032 ) , the knee underwent greater range of motion into abduction ( pre = 0.7 degrees + /- 1.5 degrees , post = 2.1 degrees + /- 1.6 degrees , F(1,19 ) = 73.2 , P < 0.001 ) , and there was a greater internal knee adductor moment ( pre = -2.6 + /- 13.3 N x m , post = 4.7 + /- 14.1 N x m , F(1,19 ) = 41.0 , P < 0.001 ) during the weight acceptance phase of stance . CONCLUSIONS This study demonstrates that simulated hip abductor weakness causes small alterations of frontal plane knee mechanics . Although some of these alterations occurred in directions associated with increased risk of knee injury , changes were small in magnitude , and the effect of these small changes on knee injury risk is unknown PURPOSE Fatigue contributes directly to anterior cruciate ligament ( ACL ) injury via promotion of high risk biomechanics . The potential for central fatigue to dominate this process , however , remains unclear . With centrally mediated movement behaviors being trainable , establishing this link seems critical for improved injury prevention . We thus determined whether fatigue-induced landing biomechanics were governed by a centrally fatiguing mechanism .
METHODS Twenty female NCAA athletes had initial contact ( IC ) and peak stance ( PS ) three-dimensional hip and knee biomechanics quantified during anticipated and unanticipated single-leg landings , before and during unilateral fatigue accumulation . To induce fatigue , subjects performed repetitive ( n = 3 ) single-leg squats and randomly ordered landings , until squats were no longer possible . Subject-based dependent factors were calculated across prefatigue trials and for those denoting 100 % , 75 % , 50 % , and 25 % fatigue and were submitted to three-way mixed-design analyses of covariance to test for decision , fatigue time , and limb effects . RESULTS Fatigue produced significant ( P < 0.01 ) decreases in IC knee flexion angle and PS knee flexion moment and increases in PS hip internal rotation and knee abduction angles and moments , with differences maintained from 50 % fatigue through to maximum . Fatigue-induced increases in PS hip internal rotation angles and PS knee abduction angles and loads were also significantly ( P < 0.01 ) greater during unanticipated landings . Apart from PS hip moments , significant limb differences in fatigued landing biomechanics were not observed . CONCLUSIONS Unilateral fatigue induces a fatigue crossover to the contralateral limb during single-leg landings . Central fatigue thus seems to be a critical component of fatigue-induced sports landing strategies . Hence , targeted training of central control processes may be necessary to counter successfully the debilitative impact of fatigue on ACL injury risk BACKGROUND Lower extremity kinematics may change as a result of impaired hip muscle function , thereby placing athletes at increased risk of injury . The purpose of this study was to examine whether experimentally-induced hip extensor fatigue alters lower extremity kinematics during a jump-landing task in women .
METHODS Forty healthy women were randomly assigned to an experimental group in which participants performed modified Biering-Sørenson tests to fatigue the hip extensors or to a sham control group in which participants performed repeated push-ups to exhaustion . Three-dimensional hip and knee kinematics and gluteus maximus electromyography ( EMG ) signals were measured during jump-landing tests to examine the effects of hip extensor fatigue . FINDINGS Hip extension strength decreased in the experimental group by 25 % following the intervention , thereby confirming effects of the fatigue intervention . No group × time interactions in hip and knee motions were statistically significant , indicating that hip and knee kinematics did not change following the fatigue-inducing intervention . Gluteus maximus recruitment during the post-fatigue test , however , increased by 55 % in the experimental group . INTERPRETATION A 25 % reduction in hip extensor strength did not lead to changes in hip or knee kinematics . Rather , participants accommodated for the loss of strength by recruiting more Gmax activation to complete the task . Gmax recruitment may compensate when hip extensor strength is impaired , suggesting that improved neuromuscular control can influence motor performance when strength is diminished Abstract The main aim of this study was to assess neuromuscular fatigue during a typical high-load , low-repetition loading protocol . Muscle stimulations were used to assess maximum voluntary contraction , resting single- and double-pulse twitch characteristics , and superimposed double-pulse twitch force ( used to calculate voluntary activation ) before and after an acute knee extension loading protocol . In our participants , who had previous resistance training experience , the mean voluntary activation level was 96.2 % in an unfatigued state .
Maximum voluntary contraction ( −11.8 % ) , resting double-pulse twitch force ( −10.6 % ) , and voluntary activation ( −2.1 % ) were markedly decreased as a consequence of loading ( P < 0.05 ) . In addition , although potentiated twitch characteristics were observed during the loading protocol , this was short-lived , as fatigue surpassed the potentiation mechanisms . Our results show that both central and peripheral mechanisms contributed to neuromuscular fatigue during the present loading protocol
2,080
26,414,123
There is a wealth of reliable evidence on the analgesic efficacy of single dose oral analgesics . Fast acting formulations and fixed dose combinations of analgesics can produce good and often long-lasting analgesia at relatively low doses .
BACKGROUND This is an updated version of the original Cochrane overview published in Issue 9 , 2011 . That overview considered both efficacy and adverse events , but adverse events are now dealt with in a separate overview . Thirty-nine Cochrane reviews of randomised trials have examined the analgesic efficacy of individual drug interventions in acute postoperative pain . This overview brings together the results of those individual reviews and assesses the reliability of available data . OBJECTIVES To summarise the efficacy of pharmaceutical interventions for acute pain in adults with at least moderate pain following surgery who have been given a single dose of oral analgesic .
Abstract Variability in patients ' response to interventions in pain and other clinical settings is large . Many explanations such as trial methods , environment or culture have been proposed , but this paper sets out to show that the main cause of the variability may be random chance , and that if trials are small their estimate of magnitude of effect may be incorrect , simply because of the random play of chance . This is highly relevant to the questions of ‘ How large do trials have to be for statistical accuracy ? ’ and ‘ How large do trials have to be for their results to be clinically valid ? ’ The true underlying control event rate ( CER ) and experimental event rate ( EER ) were determined from single-dose acute pain analgesic trials in over 5000 patients . Trial group sizes required to obtain statistically significant and clinically relevant ( 0.95 probability of number-needed-to-treat within ±0.5 of its true value ) results were computed using these values . Ten thousand trials using these CER and EER values were simulated using varying group sizes to investigate the variation due to random chance alone . Most common analgesics have EERs in the range 0.4–0.6 and a CER of about 0.19 . With such efficacy , to have a 90 % chance of obtaining a statistically significant result in the correct direction requires group sizes in the range 30–60 . For clinical relevance nearly 500 patients are required in each group . Only with an extremely effective drug ( EER>0.8 ) will we be reasonably sure of obtaining a clinically relevant NNT with commonly used group sizes of around 40 patients per treatment arm . The simulated trials showed substantial variation in CER and EER , with the probability of obtaining the correct values improving as group size increased . We contend that much of the variability in control and experimental event rates is due to random chance alone . Single small trials are unlikely to be correct .
If we want to be sure of getting correct ( clinically relevant ) results in clinical trials we must study more patients . Credible estimates of clinical efficacy are only likely to come from large trials or from pooling multiple trials of conventional ( small ) size There is uncertainty over whether the patient group in which acute pain studies are conducted ( pain model ) has any influence on the estimate of analgesic efficacy . Data from four recently updated systematic reviews of aspirin 600/650 mg , paracetamol 600/650 mg , paracetamol 1000 mg and ibuprofen 400 mg were used to investigate the influence of pain model . Area under the pain relief versus time curve equivalent to at least 50 % maximum pain relief over 6 h was used as the outcome measure . Event rates with treatment and placebo , and relative benefit ( RB ) and number needed to treat ( NNT ) were used as outputs from the meta-analyses . The event rate with placebo was systematically statistically lower for dental than postsurgical pain for all four treatments . Event rates with analgesics , RB and NNT were infrequently different between the pain models . Systematic difference in the estimate of analgesic efficacy between dental and postsurgical pain models remains unproven , and , on balance , no major difference is likely Background Previous analysis of a single data set in acute pain following third molar extraction demonstrated a strong relationship between the speed of reduction of pain intensity and overall pain relief , as well as need for additional analgesia . Methods Individual patient data analysis of a single randomized , double-blind trial of placebo , paracetamol 1000 mg , ibuprofen sodium 400 mg and ibuprofen-poloxamer 400 mg following third molar extraction . Visual analogue scale pain intensity ( VASPI ) and other measurements were made at baseline , every 5–45 min , and at 60 , 90 , 120 , 180 , 240 , 300 and 360 min .
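The random-chance argument in the trial-size abstract above can be illustrated with a short simulation. This is a sketch, not the authors' code; it borrows the quoted CER of 0.19 and an EER of 0.50 from the ranges given, and asks how often a trial of 40 patients per arm estimates the NNT to within ±0.5 of its true value:

```python
import random

random.seed(0)  # deterministic for illustration

TRUE_CER, TRUE_EER = 0.19, 0.50          # placebo and active event rates from the abstract
true_nnt = 1.0 / (TRUE_EER - TRUE_CER)   # about 3.2 patients

def simulate_trial(n_per_arm):
    """Simulate one two-arm trial; return its observed NNT (None if undefined)."""
    cer_hat = sum(random.random() < TRUE_CER for _ in range(n_per_arm)) / n_per_arm
    eer_hat = sum(random.random() < TRUE_EER for _ in range(n_per_arm)) / n_per_arm
    diff = eer_hat - cer_hat
    return 1.0 / diff if diff > 0 else None

nnts = [simulate_trial(40) for _ in range(10_000)]
nnts = [x for x in nnts if x is not None]
# Fraction of simulated trials whose NNT lands within +/-0.5 of the true value
within = sum(abs(x - true_nnt) <= 0.5 for x in nnts) / len(nnts)
print(f"true NNT {true_nnt:.2f}, fraction within +/-0.5: {within:.2f}")
```

With 40 patients per arm, well under half of the simulated trials land within ±0.5 of the true NNT, consistent with the paper's conclusion that groups nearer 500 patients are needed for clinically relevant precision.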
Results Most patients produced consistent VASPI results over time . For placebo and paracetamol , few patients achieved low VASPI scores and maintained them . For both ibuprofen formulations , VASPI scores fell rapidly during the first hour and were then typically maintained until later re-medication . Analysis of all patients showed that rapid VASPI reduction in the first hour was strongly correlated with good overall pain relief ( high total pain relief over 0–6 h ) , and with lesser need for additional analgesia within 6 h. Results for this analysis were in very good agreement with a previous analysis , validating the relationship between fast initial pain intensity reduction and overall good pain relief in this setting . Conclusions In acute pain following third molar extraction , faster acting analgesic formulations provide earlier onset of pain relief , better overall pain relief and a less frequent need for additional analgesia , indicating longer lasting pain relief Abstract A previously established relationship for deriving dichotomous from continuous information in randomised controlled trials ( RCTs ) of analgesics has been tested using an independent data set . Individual patient information from 18 RCTs of parallel-group design in acute postoperative pain ( after abdominal , gynaecological and oral surgery ) was used to calculate the percentage of the maximum possible pain relief score ( % maxTOTPAR ) and the proportion of patients with > 50%maxTOTPAR for the different treatments . The relationship between the measures was investigated in 85 treatments with over 3400 patients . In 80 of 85 treatments ( 94 % ) agreement between calculated and actual number of patients with > 50%maxTOTPAR was within four patients per treatment and in 72 ( 85 % ) was within three ( average of 40 patients per treatment , range 21–58 patients ) .
Summing the positive and negative differences between actual and calculated numbers of patients with > 50%maxTOTPAR gave an average difference of 0.30 patients per treatment arm . Reports of RCTs of analgesics frequently describe results of studies in the form of mean derived indices , rather than using discontinuous events , such as number or proportion of patients with 50 % pain relief . Because mean data inadequately describe information with a non-normal distribution , combining mean data in systematic reviews may compromise the results . Showing that dichotomous data can reliably be derived from mean data in acute pain studies enables data published as means to be used for quantitative systematic reviews which require data in dichotomous form We assessed the quality of assessment and reporting of adverse effects in randomized , double-blind clinical trials of single-dose acetaminophen or ibuprofen compared with placebo in moderate to severe postoperative pain . Reports were identified by systematic searching of a number of bibliographic databases ( e.g. , MEDLINE ) . Information on adverse effect assessment , severity and reporting , patient withdrawals , and anesthetic used was extracted . Compliance with former guidelines for adverse effect reporting was noted . Fifty-two studies were included ; two made no mention of adverse effects . No method of assessment was given in 19 studies . Twenty trials failed to report the type of anesthetic used , eight made no mention of patient withdrawals , and nine did not state the severity of reported adverse effects . Only two studies described the method of assessment of adverse effect severity . When all adverse effect data were pooled , significantly more adverse effects were reported with active treatment than with placebo .
For individual adverse effects , there was no difference between active ( acetaminophen 1000 mg or ibuprofen 400 mg ) and placebo ; the exception was significantly more somnolence/drowsiness with ibuprofen 400 mg . Ninety percent of trials reporting somnolence/drowsiness with ibuprofen 400 mg were in dental pain . All studies published after 1994 complied with former guidelines for adverse effect reporting . Different methods of assessing adverse effects produce different reported incidence : patient diaries yielded significantly more adverse effects than other forms of assessment . We recommend guidelines for reporting adverse effect information in clinical trials Reports of RCTs of analgesics frequently describe results of studies in the form of mean derived indices , rather than using discontinuous events , such as number or proportion of patients with 50 % pain relief . Because mean data inadequately describe information with a non-normal distribution , combining mean data in systematic reviews may compromise the results . Showing that dichotomous data can reliably be derived from mean data , at least in acute pain models , indicates that more meaningful overviews or meta-analyses may be possible . This study investigated the relationship between continuous and dichotomous analgesic measures in a set of individual patient data , and then used that relationship to derive dichotomous from continuous information in randomised controlled trials ( RCTs ) of analgesics . Individual patient information from 13 RCTs of parallel-group and crossover design in acute postoperative pain was used to calculate the percentage of the maximum possible pain relief score ( % maxTOTPAR ) and the proportion of patients with greater than 50 % pain relief ( > 50%maxTOTPAR ) for the different treatments .
The relationship between the measures was investigated in 45 actual treatments and 10 000 treatments simulated using the underlying actual distribution ; 1283 patients had 45 separate treatments . Mean % maxTOTPAR correlated with the proportion of patients with > 50%maxTOTPAR ( r2 = 0.90 ) . The relationship calculated from all the 45 treatments predicted to within three patients the number of patients with more than 50 % pain relief in 42 of 45 treatments , and 98.8 % of 10 000 simulated treatments . For seven effective treatments , actual numbers-needed-to-treat ( NNT ) to achieve > 50%maxTOTPAR compared with placebo were very similar to those derived from calculated data OBJECTIVES To survey the frequency of use of indirect comparisons in systematic reviews and evaluate the methods used in their analysis and interpretation . Also to identify alternative statistical approaches for the analysis of indirect comparisons , to assess the properties of different statistical methods used for performing indirect comparisons and to compare direct and indirect estimates of the same effects within reviews . DATA SOURCES Electronic databases . REVIEW METHODS The Database of Abstracts of Reviews of Effects ( DARE ) was searched for systematic reviews involving meta-analysis of randomised controlled trials ( RCTs ) that reported both direct and indirect comparisons , or indirect comparisons alone . A systematic review of MEDLINE and other databases was carried out to identify published methods for analysing indirect comparisons . Study designs were created using data from the International Stroke Trial . Random samples of patients receiving aspirin , heparin or placebo in 16 centres were used to create meta-analyses , with half of the trials comparing aspirin and placebo and half heparin and placebo . Methods for indirect comparisons were used to estimate the contrast between aspirin and heparin .
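Once the proportion of patients with >50%maxTOTPAR has been derived for each arm, as in the abstract above, the NNT follows directly as the reciprocal of the difference in event rates. A minimal sketch; the responder counts below are illustrative, not taken from the reviews:

```python
def nnt(events_active, n_active, events_placebo, n_placebo):
    """NNT from dichotomous 'responder' counts, e.g. patients achieving >50%maxTOTPAR."""
    arr = events_active / n_active - events_placebo / n_placebo  # absolute risk reduction
    if arr <= 0:
        raise ValueError("active treatment no better than placebo")
    return 1.0 / arr

# Illustrative: 100/200 responders on active treatment vs 38/200 on placebo
print(round(nnt(100, 200, 38, 200), 1))  # → 3.2
```

An NNT of 3.2 reads as: roughly one extra patient achieves at least 50% of maximum possible pain relief for every 3.2 patients treated with the active drug rather than placebo.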
The whole process was repeated 1000 times and the results were compared with direct comparisons and also with theoretical results. Further detailed case studies comparing the results from both direct and indirect comparisons of the same effects were undertaken. RESULTS Of the reviews identified through DARE, 31/327 (9.5%) included indirect comparisons. A further five reviews including indirect comparisons were identified through electronic searching. Few reviews carried out a formal analysis and some based analysis on the naive addition of data from the treatment arms of interest. Few methodological papers were identified. Some valid approaches for aggregate data that could be applied using standard software were found: the adjusted indirect comparison, meta-regression and, for binary data only, multiple logistic regression (fixed effect models only). Simulation studies showed that the naive method is liable to bias and also produces over-precise answers. Several methods provide correct answers if strong but unverifiable assumptions are fulfilled. Four times as many similarly sized trials are needed for the indirect approach to have the same power as directly randomised comparisons. Detailed case studies comparing direct and indirect comparisons of the same effect show considerable statistical discrepancies, but the direction of such discrepancy is unpredictable. CONCLUSIONS Direct evidence from good-quality RCTs should be used wherever possible. Without this evidence, it may be necessary to look for indirect comparisons from RCTs. However, the results may be susceptible to bias. When making indirect comparisons within a systematic review, an adjusted indirect comparison method should ideally be used, employing the random effects model. If both direct and indirect comparisons are possible within a review, it is recommended that these be done separately before considering whether to pool data.
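The adjusted indirect comparison recommended above (often called the Bucher method) takes two placebo-controlled summaries and differences them on the log scale, summing the variances; because the variance of the indirect contrast is the sum of the two direct variances, roughly four times as many similarly sized trials are needed for the same power. A minimal sketch with hypothetical log odds ratios (the numbers are illustrative, not the International Stroke Trial estimates):

```python
import math

def bucher_indirect(lor_ab, se_ab, lor_cb, se_cb):
    """Adjusted indirect comparison: estimate A vs C from
    A-vs-B and C-vs-B summaries (log odds ratios + SEs).
    Variance of the indirect contrast is the sum of variances."""
    lor_ac = lor_ab - lor_cb
    se_ac = math.sqrt(se_ab ** 2 + se_cb ** 2)
    ci = (lor_ac - 1.96 * se_ac, lor_ac + 1.96 * se_ac)
    return lor_ac, se_ac, ci

# Hypothetical inputs: aspirin vs placebo, then heparin vs placebo
lor_ac, se_ac, (lo, hi) = bucher_indirect(-0.20, 0.08, -0.05, 0.09)
print(f"indirect log OR (aspirin vs heparin) = {lor_ac:.3f}, "
      f"95% CI {lo:.3f} to {hi:.3f}")
```

Note how the naive approach criticized in the review (directly pooling treatment arms as if randomised against each other) would ignore the within-trial randomisation and omit the variance sum, producing the over-precise answers described.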
There is a need to evaluate methods for the analysis of indirect comparisons for continuous data and for empirical research into how different methods of indirect comparison perform in cases where there is a large treatment effect. Further study is needed into when it is appropriate to look at indirect comparisons and when to combine both direct and indirect comparisons. Research into how evidence from indirect comparisons compares to that from non-randomised studies may also be warranted. Investigations using individual patient data from a meta-analysis of several RCTs using different protocols, and an evaluation of the impact of choosing different binary effect measures for the inverse variance method, would also be useful. Abstract One way to ensure adequate sensitivity for analgesic trials is to test the intervention on patients who have established pain of moderate to severe intensity. The usual criterion is at least moderate pain on a categorical pain intensity scale. When visual analogue scales (VAS) are the only pain measure in trials, we need to know what point on a VAS represents moderate pain, so that these trials can be included in meta-analysis when baseline pain of at least moderate intensity is an inclusion criterion. To investigate this we used individual patient data from 1080 patients from randomised controlled trials of various analgesics. Baseline pain was measured using a 4-point categorical pain intensity scale and a pain intensity VAS under identical conditions. The distribution of the VAS scores was examined for 736 patients reporting moderate pain and for 344 reporting severe pain. The VAS scores corresponding to moderate or severe pain were also examined by gender. Baseline VAS scores recorded by patients reporting moderate pain were significantly different from those of patients reporting severe pain. Of the patients reporting moderate pain, 85% scored over 30 mm on the corresponding VAS, with a mean score of 49 mm.
For those reporting severe pain, 85% scored over 54 mm, with a mean score of 75 mm. There was no difference between the corresponding VAS scores of men and women. Our results indicate that if a patient records a baseline VAS score in excess of 30 mm, they would probably have recorded at least moderate pain on a 4-point categorical scale.
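Once mean TOTPAR data have been converted into the proportion of patients achieving >50%maxTOTPAR, as in the first abstract above, the number-needed-to-treat is simply the reciprocal of the risk difference between active and placebo. A minimal sketch with made-up proportions (not the trial data):

```python
def nnt(p_active: float, p_control: float) -> float:
    """Number-needed-to-treat: reciprocal of the absolute risk
    difference between the active and control event proportions."""
    return 1.0 / (p_active - p_control)

# Hypothetical: 50% of active vs 18% of placebo patients achieve >50%maxTOTPAR
print(round(nnt(0.50, 0.18), 1))  # -> 3.1
```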
2,081
29,474,724
Nonetheless, the findings from this systematic review suggest that for patients with a low to intermediate prior probability of having obstructive CAD, computed tomography coronary angiography (CTCA) may be cost-effective as an initial diagnostic imaging test in comparison with CAG or other non-invasive diagnostic tests. If functional testing is required, stress echocardiography (SE) or single-photon emission computed tomography (SPECT) are suggested to be cost-effective initial strategies in patients with an intermediate prior probability of CAD. Immediate CAG is suggested to be a cost-effective strategy for patients at a high prior probability of having obstructive CAD who may benefit from revascularization. The study emphasizes the inextricable link between clinical effectiveness and economic efficiency. Evidence suggests that the optimal diagnostic imaging strategy for individuals suspected of having CAD is CTCA for low and intermediate disease probability, followed by SE or SPECT as necessary, and invasive CAG for high disease probability.
Coronary artery disease (CAD) remains one of the leading causes of morbidity and mortality globally. The most cost-effective imaging strategy to diagnose CAD in patients with stable chest pain is, however, uncertain. To review the evidence on the comparative cost-effectiveness of different imaging strategies for patients presenting with stable chest pain symptoms suggestive of CAD.
BACKGROUND The benefit of CT coronary angiography (CTCA) in patients presenting with stable chest pain has not been systematically studied. We aimed to assess the effect of CTCA on the diagnosis, management, and outcome of patients referred to the cardiology clinic with suspected angina due to coronary heart disease. METHODS In this prospective open-label, parallel-group, multicentre trial, we recruited patients aged 18-75 years referred for the assessment of suspected angina due to coronary heart disease from 12 cardiology chest pain clinics across Scotland. We randomly assigned (1:1) participants to standard care plus CTCA or standard care alone. Randomisation was done with a web-based service to ensure allocation concealment. The primary endpoint was certainty of the diagnosis of angina secondary to coronary heart disease at 6 weeks. All analyses were intention to treat, and patients were analysed in the group they were allocated to, irrespective of compliance with scanning. This study is registered with ClinicalTrials.gov, number NCT01149590. FINDINGS Between Nov 18, 2010, and Sept 24, 2014, we randomly assigned 4146 (42%) of 9849 patients who had been referred for assessment of suspected angina due to coronary heart disease. 47% of participants had a baseline clinic diagnosis of coronary heart disease and 36% had angina due to coronary heart disease. At 6 weeks, CTCA reclassified the diagnosis of coronary heart disease in 558 (27%) patients and the diagnosis of angina due to coronary heart disease in 481 (23%) patients (standard care 22 [1%] and 23 [1%]; p<0·0001).
Although both the certainty (relative risk [RR] 2·56, 95% CI 2·33-2·79; p<0·0001) and frequency of coronary heart disease increased (1·09, 1·02-1·17; p=0·0172), the certainty increased (1·79, 1·62-1·96; p<0·0001) and frequency seemed to decrease (0·93, 0·85-1·02; p=0·1289) for the diagnosis of angina due to coronary heart disease. This changed planned investigations (15% vs 1%; p<0·0001) and treatments (23% vs 5%; p<0·0001) but did not affect 6-week symptom severity or subsequent admissions to hospital for chest pain. After 1·7 years, CTCA was associated with a 38% reduction in fatal and non-fatal myocardial infarction (26 vs 42, HR 0·62, 95% CI 0·38-1·01; p=0·0527), but this was not significant. INTERPRETATION In patients with suspected angina due to coronary heart disease, CTCA clarifies the diagnosis, enables targeting of interventions, and might reduce the future risk of myocardial infarction. FUNDING The Chief Scientist Office of the Scottish Government Health and Social Care Directorates funded the trial, with supplementary awards from Edinburgh and Lothian's Health Foundation Trust and the Heart Diseases Research Fund. BACKGROUND Many patients have symptoms suggestive of coronary artery disease (CAD) and are often evaluated with the use of diagnostic testing, although there are limited data from randomized trials to guide care. METHODS We randomly assigned 10,003 symptomatic patients to a strategy of initial anatomical testing with the use of coronary computed tomographic angiography (CTA) or to functional testing (exercise electrocardiography, nuclear stress testing, or stress echocardiography). The composite primary end point was death, myocardial infarction, hospitalization for unstable angina, or major procedural complication. Secondary end points included invasive cardiac catheterization that did not show obstructive CAD and radiation exposure.
RESULTS The mean age of the patients was 60.8±8.3 years, 52.7% were women, and 87.7% had chest pain or dyspnea on exertion. The mean pretest likelihood of obstructive CAD was 53.3±21.4%. Over a median follow-up period of 25 months, a primary end-point event occurred in 164 of 4996 patients in the CTA group (3.3%) and in 151 of 5007 (3.0%) in the functional-testing group (adjusted hazard ratio, 1.04; 95% confidence interval, 0.83 to 1.29; P=0.75). CTA was associated with fewer catheterizations showing no obstructive CAD than was functional testing (3.4% vs. 4.3%, P=0.02), although more patients in the CTA group underwent catheterization within 90 days after randomization (12.2% vs. 8.1%). The median cumulative radiation exposure per patient was lower in the CTA group than in the functional-testing group (10.0 mSv vs. 11.3 mSv), but 32.6% of the patients in the functional-testing group had no exposure, so the overall exposure was higher in the CTA group (mean, 12.0 mSv vs. 10.1 mSv; P<0.001). CONCLUSIONS In symptomatic patients with suspected CAD who required noninvasive testing, a strategy of initial CTA, as compared with functional testing, did not improve clinical outcomes over a median follow-up of 2 years. (Funded by the National Heart, Lung, and Blood Institute; PROMISE ClinicalTrials.gov number, NCT01174550.) OBJECTIVES The study aim was to determine observational differences in costs of care by the coronary disease diagnostic test modality. BACKGROUND A number of diagnostic strategies are available, with few data to compare the cost implications of the initial test choice. METHODS We prospectively enrolled 11,372 consecutive stable angina patients who were referred for stress myocardial perfusion tomography or cardiac catheterization. Stress imaging patients were matched by their pretest clinical risk of coronary disease to a series of patients referred to cardiac catheterization.
Composite 3-year costs of care were compared for two patient management strategies: 1) direct cardiac catheterization (aggressive) and 2) initial stress myocardial perfusion tomography and selective catheterization of high-risk patients (conservative). Analysis of variance techniques were used to compare costs, adjusting for treatment propensity and pretest risk. RESULTS Observational comparisons of aggressive as compared with conservative testing strategies reveal that costs of care were higher for direct cardiac catheterization in all clinical risk subsets (range: $2,878 to $4,579), as compared with stress myocardial perfusion imaging plus selective catheterization (range: $2,387 to $3,010, p < 0.0001). Coronary revascularization rates were higher for low-, intermediate- and high-risk direct catheterization patients as compared with the initial stress perfusion imaging cohort (13% to 50%, p < 0.0001); cardiac death or myocardial infarction rates were similar (p > 0.20). CONCLUSIONS Observational assessments reveal that stable chest pain patients who undergo a more aggressive diagnostic strategy have higher diagnostic costs and greater rates of intervention and follow-up costs. Cost differences may reflect a diminished necessity for resource consumption among patients with normal test results. In patients with stable ischemic heart disease (SIHD), myocardial revascularization should be performed either to improve survival or to improve symptoms and functional status among patients who are not well controlled with optimal medical therapy (OMT). A general consensus exists on the core elements of OMT, which include both lifestyle intervention and intensive secondary prevention with proven pharmacotherapies. By contrast, however, there is less general agreement as to what constitutes the optimal approach to revascularization in SIHD patients.
The COURAGE and FAME 2 randomized trials form the foundation of the current clinical evidence base and raise the important question: "What is the impact of myocardial ischemia on myocardial revascularization in stable ischemic heart disease?" OBJECTIVES To assess the acceptability and feasibility of functional tests as a gateway to angiography for management of coronary artery disease (CAD), the ability of diagnostic strategies to identify patients who should undergo revascularisation, patient outcomes in each diagnostic strategy, and the most cost-effective diagnostic strategy for patients with suspected or known CAD. DESIGN A rapid systematic review of economic evaluations of alternative diagnostic strategies for CAD was carried out. A pragmatic and generalisable randomised controlled trial was undertaken to assess the use of the functional cardiac tests: angiography (controls); single photon emission computed tomography (SPECT); magnetic resonance imaging (MRI); stress echocardiography.
SETTING The setting was Papworth Hospital NHS Foundation Trust, a tertiary cardiothoracic referral centre. PARTICIPANTS Patients with suspected or known CAD and an exercise test result that required non-urgent angiography. INTERVENTIONS Patients were randomised to one of the four initial diagnostic tests. MAIN OUTCOME MEASURES Eighteen months post-randomisation: exercise time (modified Bruce protocol); cost-effectiveness compared with angiography (diagnosis, treatment and follow-up costs). The aim was to demonstrate equivalence in exercise time between those randomised to functional tests and those randomised to angiography [defined as the confidence interval (CI) for the mean difference from angiography lying within 1 minute]. RESULTS The 898 patients were randomised to angiography (n = 222), SPECT (n = 224), MRI (n = 226) or stress echo (n = 226). Initial diagnostic tests were completed successfully with unequivocal results for 98% of angiography, 94% of SPECT (p = 0.05), 78% of MRI (p < 0.001) and 90% of stress echocardiography patients (p < 0.001). Some 22% of SPECT patients, 20% of MRI patients and 25% of stress echo patients were not subsequently referred for an angiogram. Positive functional tests were confirmed by positive angiography in 83% of SPECT patients, 89% of MRI patients and 84% of stress echo patients. Negative functional tests were followed by positive angiograms in 31% of SPECT patients, 52% of MRI patients and 48% of stress echo patients tested. The proportions that had coronary artery bypass graft surgery were 10% (angiography), 11% (MRI) and 13% (SPECT and stress echo), and percutaneous coronary intervention 25% (angiography), 18% (SPECT) and 23% (MRI and stress echo). At 18 months, comparing SPECT and stress echo with angiography, a clinically significant difference in total exercise time can be ruled out.
The MRI group had a mean total exercise time that was significantly shorter, by 35 seconds, and the upper limit of the CI was 1.14 minutes less than in the angiography group, so a difference of at least 1 minute cannot be ruled out. At 6 months post-treatment, SPECT and angiography had equivalent mean exercise times. Compared with angiography, the MRI and stress echo groups had mean total exercise times that were significantly shorter, by 37 and 38 seconds, respectively, and the upper limit of both CIs was 1.16 minutes, so a difference of at least 1 minute cannot be ruled out. The differences were mainly attributable to revascularised patients. There were significantly more non-fatal adverse events in the stress echo group, mostly admissions for chest pain, but no significant difference in the number of patients reporting events. Mean (95% CI) total additional costs over 18 months, compared with angiography, were £415 (-£310 to £1084) for SPECT, £426 (-£247 to £1088) for MRI and £821 (£10 to £1715) for stress echocardiography, with very little difference in quality-adjusted life-years (QALYs) amongst the groups (less than 0.04 QALYs over 18 months). Cost-effectiveness was mainly influenced by test costs, clinicians' willingness to trust negative functional tests, and a small number of patients who had a particularly difficult clinical course. CONCLUSIONS Between 20 and 25% of patients can avoid invasive testing using functional testing as a gateway to angiography, without substantial effects on outcomes. The SPECT strategy was as useful as angiography in identifying patients who should undergo revascularisation and the additional cost was not significant; in fact, it would be reduced further by restricting the rest test to patients who have a positive stress test.
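Cost and QALY differences like those above are conventionally combined into a single decision metric, the incremental cost-effectiveness ratio (extra cost per extra QALY versus the comparator). A minimal sketch with hypothetical numbers, not the trial's own analysis:

```python
def icer(delta_cost: float, delta_qaly: float) -> float:
    """Incremental cost-effectiveness ratio: additional cost
    divided by additional quality-adjusted life-years gained."""
    return delta_cost / delta_qaly

# Hypothetical: a strategy costs 415 GBP more and yields 0.02 extra QALYs
print(icer(415.0, 0.02))  # -> 20750.0
```

The ICER is then compared against a willingness-to-pay threshold to judge whether the more expensive strategy is worth adopting.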
MRI had the largest number of test failures and, in this study, had the least practical use in screening patients with suspected CAD, although it had similar outcomes to stress echo and is still an evolving technology. Stress echo patients had a 10% test failure rate, significantly shorter total exercise time and time to angina at 6 months post-treatment, and a greater number of adverse events, leading to significantly higher costs. Given the level of skill required for stress echo, it may be best to reserve this test for those who have a contraindication to SPECT and are unable or unwilling to have MRI. Further research, using blinded reassessment of functional test results and angiograms, is required to formally assess diagnostic accuracy. Longer-term cost-effectiveness analysis, and further studies of MRI and new-generation computed tomography, are also required. BACKGROUND Suspected coronary artery disease (CAD) is one of the most common, potentially life-threatening diagnostic problems clinicians encounter. However, no large outcome-based randomized trials have been performed to guide the selection of diagnostic strategies for these patients. METHODS The PROMISE study is a prospective, randomized trial comparing the effectiveness of 2 initial diagnostic strategies in patients with symptoms suspicious for CAD. Patients are randomized to either (1) functional testing (exercise electrocardiogram, stress nuclear imaging, or stress echocardiogram) or (2) anatomical testing with ≥64-slice multidetector coronary computed tomographic angiography. Tests are interpreted locally in real time by subspecialty-certified physicians, and all subsequent care decisions are made by the clinical care team. Sites are provided results of central core laboratory quality and completeness assessment. All subjects are followed up for ≥1 year.
The primary end point is the time to occurrence of the composite of death, myocardial infarction, major procedural complications (stroke, major bleeding, anaphylaxis, and renal failure), or hospitalization for unstable angina. RESULTS More than 10,000 symptomatic subjects were randomized in 3.2 years at 193 US and Canadian cardiology, radiology, primary care, urgent care, and anesthesiology sites. CONCLUSION Multispecialty community practice enrollment into a large pragmatic trial of diagnostic testing strategies is both feasible and efficient. The PROMISE trial will compare the clinical effectiveness of an initial strategy of functional testing against an initial strategy of anatomical testing in symptomatic patients with suspected CAD. Quality of life, resource use, cost-effectiveness, and radiation exposure will be assessed. AIMS The aim of this study was to investigate, in patients with stable angina, the effects on costs of frontline diagnostics by exercise-stress testing (ex-test) vs. coronary computed tomography angiography (CTA). METHODS AND RESULTS In two coronary units at Lillebaelt Hospital, Denmark, 498 patients were identified in whom either ex-test (n = 247) or CTA (n = 251) was applied as the frontline diagnostic strategy in symptomatic patients with a low-intermediate pre-test probability of coronary artery disease (CAD). During 12 months of follow-up, death, myocardial infarction and costs associated with downstream diagnostic utilization (DTU), treatment, ambulatory visits, and hospitalizations were registered. There was no difference between cohorts in demographic characteristics or the pre-test probability of significant CAD. The mean (SD) age was 56 (11) years; 52% were men; and 96% were at low-intermediate pre-test probability of CAD. All serious cardiac events (n = 3) during follow-up occurred in patients with a negative ex-test result.
Mean costs per patient associated with DTU, ambulatory visits, and cardiovascular medication were significantly higher in the ex-test group than in the CTA group. The mean (SD) total costs per patient at the end of follow-up were 14% lower in the CTA group than in the ex-test group: €1510 (3474) vs. €1777 (3746) (P = 0.03). CONCLUSION Diagnostic assessment of symptomatic patients with a low-intermediate probability of CAD by CTA incurred lower costs when compared with the ex-test. These findings need confirmation in future prospective trials. CONTEXT Coronary computed tomography angiography (CCTA) is a new noninvasive diagnostic test for coronary artery disease (CAD), but its association with subsequent clinical management has not been established. OBJECTIVE To compare utilization and spending associated with functional (stress testing) and anatomical (CCTA) noninvasive cardiac testing in a Medicare population. DESIGN, SETTING, AND PATIENTS Retrospective, observational cohort study using claims data from a 20% random sample of 2005-2008 Medicare fee-for-service beneficiaries 66 years or older with no claims for CAD in the preceding year, who received nonemergent, noninvasive testing for CAD (n = 282,830). MAIN OUTCOME MEASURES Cardiac catheterization, coronary revascularization, acute myocardial infarction, all-cause mortality, and total and CAD-related Medicare spending over 180 days of follow-up. RESULTS Compared with stress myocardial perfusion scintigraphy (MPS), CCTA was associated with an increased likelihood of subsequent cardiac catheterization (22.9% vs 12.1%; adjusted odds ratio [AOR], 2.19 [95% CI, 2.08 to 2.32]; P < .001), percutaneous coronary intervention (7.8% vs 3.4%; AOR, 2.49 [2.28 to 2.72]; P < .001), and coronary artery bypass graft surgery (3.7% vs 1.3%; AOR, 3.00 [2.63 to 3.41]; P < .001).
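The odds ratios above come from adjusted regression models, but the underlying unadjusted calculation from a 2x2 table of counts is straightforward: the cross-product ratio, with a Wald confidence interval on the log scale. A minimal sketch with made-up counts (not the study's data or its adjusted models):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a/b = events/non-events in the exposed group,
    c/d = events/non-events in the comparator group."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, (lo, hi)

# Hypothetical counts: 229/1000 vs 121/1000 subsequent catheterizations
or_, (lo, hi) = odds_ratio_ci(229, 771, 121, 879)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```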
CCTA was also associated with higher total health care spending ($4200 [$3193 to $5267]; P < .001), which was almost entirely attributable to payments for any claims for CAD ($4007 [$3256 to $4835]; P < .001). Compared with MPS, there was lower associated spending with stress echocardiography (-$4981 [-$4991 to -$4969]; P < .001) and exercise electrocardiography (-$7449 [-$7452 to -$7444]; P < .001). At 180 days, CCTA was associated with a similar likelihood of all-cause mortality (1.05% vs 1.28%; AOR, 1.11 [0.88 to 1.38]; P = .32) and a slightly lower likelihood of hospitalization for acute myocardial infarction (0.19% vs 0.43%; AOR, 0.60 [0.37 to 0.98]; P = .04). CONCLUSION Medicare beneficiaries who underwent CCTA in a nonacute setting were more likely to undergo subsequent invasive cardiac procedures and have higher CAD-related spending than patients who underwent stress testing. CONTEXT Coronary artery disease (CAD) is the major cause of mortality and morbidity in patients with type 2 diabetes. But the utility of screening patients with type 2 diabetes for asymptomatic CAD is controversial. OBJECTIVE To assess whether routine screening for CAD identifies patients with type 2 diabetes as being at high cardiac risk and whether it affects their cardiac outcomes. DESIGN, SETTING, AND PATIENTS The Detection of Ischemia in Asymptomatic Diabetics (DIAD) study is a randomized controlled trial in which 1123 participants with type 2 diabetes and no symptoms of CAD were randomly assigned to be screened with adenosine-stress radionuclide myocardial perfusion imaging (MPI) or not to be screened. Participants were recruited from diabetes clinics and practices and prospectively followed up from August 2000 to September 2007. MAIN OUTCOME MEASURE Cardiac death or nonfatal myocardial infarction (MI).
RESULTS The cumulative cardiac event rate was 2.9% over a mean (SD) follow-up of 4.8 (0.9) years, for an average of 0.6% per year. Seven nonfatal MIs and 8 cardiac deaths (2.7%) occurred among the screened group and 10 nonfatal MIs and 7 cardiac deaths (3.0%) among the not-screened group (hazard ratio [HR], 0.88; 95% confidence interval [CI], 0.44-1.88; P = .73). Of those in the screened group, 409 participants with normal results and 50 with small MPI defects had lower event rates than the 33 with moderate or large MPI defects: 0.4% per year vs 2.4% per year (HR, 6.3; 95% CI, 1.9-20.1; P = .001). Nevertheless, the positive predictive value of having moderate or large MPI defects was only 12%. The overall rate of coronary revascularization was low in both groups: 31 (5.5%) in the screened group and 44 (7.8%) in the unscreened group (HR, 0.71; 95% CI, 0.45-1.1; P = .14). During the course of the study there was a significant and equivalent increase in primary medical prevention in both groups. CONCLUSION In this contemporary study population of patients with diabetes, the cardiac event rates were low and were not significantly reduced by MPI screening for myocardial ischemia over 4.8 years. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT00769275. Background — There is a paucity of randomized trials regarding diagnostic testing in women with suspected coronary artery disease (CAD). It remains unclear whether the addition of myocardial perfusion imaging (MPI) to the standard ECG exercise treadmill test (ETT) provides incremental information to improve clinical decision making in women with suspected CAD. Methods and Results — We randomized symptomatic women with suspected CAD, an interpretable ECG, and ≥5 metabolic equivalents on the Duke Activity Status Index to 1 of 2 diagnostic strategies: ETT or exercise MPI.
The primary end point was the 2-year incidence of major adverse cardiac events, defined as CAD death or hospitalization for an acute coronary syndrome or heart failure. A total of 824 women were randomized to ETT or exercise MPI. For women randomized to ETT, ECG results were normal in 64%, indeterminate in 16%, and abnormal in 20%. By comparison, the exercise MPI results were normal in 91%, mildly abnormal in 3%, and moderate to severely abnormal in 6%. At 2 years, there was no difference in major adverse cardiac events (98.0% for ETT and 97.7% for MPI; P=0.59). Compared with ETT, index testing costs were higher for exercise MPI (P<0.001), whereas downstream procedural costs were slightly lower (P=0.0008). Overall, the cumulative diagnostic cost savings was 48% for ETT compared with exercise MPI (P<0.001). Conclusions — In low-risk, exercising women, a diagnostic strategy that uses ETT versus exercise MPI yields similar 2-year posttest outcomes while providing significant diagnostic cost savings. The ETT with selective follow-up testing should be considered as the initial diagnostic strategy in symptomatic women with suspected CAD. Clinical Trial Registration — http://www.clinicaltrials.gov. Unique identifier: NCT00282711. Background — The incremental value and cost-effectiveness of stress single photon emission computed tomography (SPECT) over clinical and exercise treadmill testing data are unclear in patients with normal resting ECGs, a patient subset known to be at relatively lower risk. Methods and Results — We identified 3058 consecutive patients who underwent exercise dual-isotope SPECT, who on follow-up (mean, 1.6±0.5 years; 3.6% lost to follow-up) were found to have 70 hard events (2.3% hard-event rate). Survival analysis used a Cox proportional hazards model, and cost-effectiveness was determined by the cost per hard event identified by strategies with versus without the use of SPECT.
In this cohort, a normal study was associated with an exceedingly low hard-event rate (0.4% per year) that increased significantly as a function of the SPECT result. After adjusting for pre-SPECT information, exercise stress SPECT yielded incremental value for the prediction of hard events (χ2 52 to 85, P < 0.001) and significantly stratified patients. In patients with an intermediate to high likelihood of coronary artery disease after exercise treadmill testing, a cost-effectiveness ratio of $25,134 per hard event identified and a cost of $5,417 per reclassification of patient risk were found. Subset analyses revealed that similar prognostic and cost results were present in men, women, and patients with and without prior histories of coronary artery disease. Conclusions — Stress SPECT yields incremental prognostic value and enhanced risk stratification in patients with normal resting ECGs in a cost-effective manner. These findings are opposite those of previous studies examining anatomic end points in this same population and thus, if confirmed, have significant implications for patient management. AIMS The Euro Heart Survey of Stable Angina set out to prospectively study the presentation and management of patients with stable angina as first seen by a cardiologist in Europe, with particular reference to adherence to existing guidelines and regional variability in patient presentation and initial assessment. METHODS AND RESULTS Consecutive outpatients with a clinical diagnosis by a cardiologist of stable angina were enrolled in the study, and 3779 patients were included in the analysis. The average age was 61 years and 58% were male. The majority of patients (88%) had mild to moderate angina, CCS class I or II. Despite a high prevalence of recognized risk factors, 27% did not have cholesterol and 33% did not have glucose measured within 4 weeks of assessment. The resting ECG was abnormal in 41% of patients.
An exercise ECG was performed or planned as part of initial investigation in 76 % of patients and 18 % had a stress imaging test such as perfusion scanning or stress echo . A coronary angiogram was performed or planned in 41 % , and 64 % had an echo . The time from assessment to investigation varied widely , particularly for angiography . One in 10 patients had neither any form of stress test nor angiography , with marked regional variation . Availability of invasive facilities increased the likelihood of both non-invasive and invasive investigations . Those with more severe symptoms or longer duration of symptoms or a positive non-invasive test were more likely to have angiography . In multivariable analysis , a positive stress test was the strongest predictor of the use of angiography , associated with a six-fold increase in the likelihood of invasive investigation . However , gender and availability of facilities were also predictive . CONCLUSION Considerable variation in features at presentation and use of investigations has been identified in the stable angina population in Europe . The evaluation of biochemical cardiovascular risk factors was suboptimal . Overall rates of non-invasive investigation for angina and the clinical appropriateness of factors predictive of the use of invasive investigation were broadly in line with guidelines . However , the influence of access to facilities , and marked international variation in rates and timing of investigation suggest that factors unrelated to clinical need are also influential in the management of patients with stable angina Despite its limited sensitivity and specificity in patients with low to intermediate probability of coronary artery disease ( CAD ) , exercise treadmill testing ( ETT ) is frequently used as the initial test for investigation of chest pain . Although myocardial perfusion imaging is a significantly more accurate test , its added cost to ETT is considerable . 
The cost of a non-contrast electron beam computed tomography ( EBCT ) scan is comparable to that of ETT and the calcium score ( CS ) correlates closely with the volume of atherosclerotic plaque . Therefore , we tested the hypothesis that EBCT might be an effective and cost-beneficial technique for the identification of angiographically obstructive CAD ( > or = 50 % stenosis ) in patients with low to intermediate pretest probability of disease . We calculated the theoretic cost of attaining a diagnosis of CAD based on a Bayesian model that utilizes published sensitivity and specificity levels for ETT , EBCT , and stress myocardial perfusion imaging . We then submitted a cohort of 207 patients with low to intermediate probability of disease both to EBCT and ETT in random order , and estimated the cost of achieving a correct diagnosis by either route based on the number of expected further tests . An EBCT calcium score of 150 was chosen as a cut-point with a sensitivity of 74 % and a specificity of 89 % for the presence of obstructive CAD . The theoretic Bayesian model predicted substantial cost savings when EBCT was used as the initial test instead of ETT , with decreasing benefit as the prevalence of disease increased ( 44 % saving at 0 % prevalence ; 15 % saving at 100 % prevalence ) . In the patient cohort , the diagnostic pathway starting with EBCT provided a 45 % to 65 % cost saving over the ETT pathway . We conclude that in patients with low to intermediate pretest probability of disease , a pathway based on EBCT as the initial test to investigate presence of obstructive CAD provides a substantial cost benefit over a pathway based on ETT . 
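The Bayesian pretest-to-posttest step underlying the model in the preceding abstract can be sketched as a generic textbook calculation . This is not the study's cost model ; the 74 % sensitivity and 89 % specificity are the EBCT figures quoted above , while the 30 % pretest probability is an arbitrary illustrative value .

```python
def posttest_probability(pretest, sensitivity, specificity):
    """Probability of disease given a positive test result (Bayes' theorem)."""
    true_pos = sensitivity * pretest            # diseased patients who test positive
    false_pos = (1 - specificity) * (1 - pretest)  # healthy patients who test positive
    return true_pos / (true_pos + false_pos)

# EBCT calcium-score cut-point of 150: sensitivity 74 %, specificity 89 %.
# A pretest probability of 0.30 stands in for a hypothetical intermediate-risk patient.
p = posttest_probability(pretest=0.30, sensitivity=0.74, specificity=0.89)
print(round(p, 3))  # → 0.742
```

A cost model of the kind described would then attach the expected cost of further confirmatory testing to each branch of this probability tree .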
Such cost advantages decrease as the prevalence of disease increases Objectives To determine the costs and cost-effectiveness of a diagnostic strategy including computed tomography coronary angiography ( CTCA ) in comparison with invasive conventional coronary angiography ( CA ) for the detection of significant coronary artery disease from the point of view of the healthcare provider . Methods The average cost per CTCA was determined via a micro-costing method in four French hospitals , and the cost of CA was taken from the 2011 French National Cost Study that collects data at the patient level from a sample of 51 public or not-for-profit hospitals . Results The average cost of CTCA was estimated to be 180 € ( 95 % CI 162–206 € ) based on the use of a 64-slice CT scanner active for 10 h per day . The average cost of CA was estimated to be 1,378 € ( 95 % CI 1,126–1,670 € ) . The incremental cost-effectiveness ratio of CA for all patients over a strategy including CTCA triage in the intermediate risk group , no imaging test in the low risk group , and CA in the high risk group , was estimated to be 6,380 € ( 95 % CI 4,714–8,965 € ) for each additional correctly classified patient . This strategy correctly classifies 95.3 % ( 95 % CI 94.4–96.2 ) of all patients in the population studied . Conclusions A strategy of CTCA triage in the intermediate-risk group , no imaging test in the low-risk group , and CA in the high-risk group , has good diagnostic accuracy and could significantly cut costs . Medium-term and long-term outcomes need to be evaluated in patients with coronary stenosis potentially misclassified by CTCA due to false negative examinations BACKGROUND Coronary flow velocity reserve ( CFVR ) is an alternative for myocardial perfusion scintigraphy ( SPECT ) in assessing functional severity of coronary lesions . For the acceptance of CFVR in daily clinical decision-making , cost-effectiveness must be proven . 
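The incremental cost-effectiveness ratio reported in the CTCA abstract above ( euros per additional correctly classified patient ) is the standard quotient of cost difference over effect difference . A minimal sketch follows , with deliberately generic numbers rather than the study's own inputs .

```python
def icer(cost_new, cost_ref, effect_new, effect_ref):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return (cost_new - cost_ref) / (effect_new - effect_ref)

# Hypothetical comparison: the dearer strategy classifies 96 % of patients
# correctly, the cheaper one 92 % (illustrative values only).
extra_cost_per_correct_patient = icer(1000.0, 800.0, 0.96, 0.92)
print(extra_cost_per_correct_patient)  # cost per additional correctly classified patient
```

When a comparator is both more expensive and less effective it is simply dominated , and the ratio is not reported .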
AIM Economic evaluation of different diagnostic management strategies using CFVR compared with SPECT for making decisions regarding use of PTCA of an intermediate coronary lesion in patients with multivessel disease . METHODS The incremental cost-effectiveness analysis was based on data from a prospective multicentre study in 201 patients with multivessel coronary artery disease . Four management strategies , assuming performance of angioplasty after positive test result ( s ) , were compared : SPECT alone , CFVR alone ( cut-off value of 2.0 ) , and combined strategies of SPECT and CFVR with one ( ' extensive ' ) or two ( ' restrictive ' ) positive test(s ) . Probabilistic sensitivity analyses were performed using Monte Carlo simulation . Primary outcome was the probability of a cardiac event-free first year with respect to the intermediate lesion . RESULTS A 10 % event rate was observed , which was predominantly associated with ischaemia-driven revascularisations . A strategy based on CFVR was most effective . The restrictive strategy had the lowest costs and was most cost-effective ; with increasing willingness-to-pay values ( above € 20,000 ) a CFVR-alone strategy became equally cost-effective . CONCLUSION It is mandatory to measure CFVR to decide upon angioplasty of the intermediate lesion in patients with multivessel coronary artery disease . This decision can be based on the restrictive strategy ( i.e. performance of PTCA in case of abnormal test results of both SPECT and CFVR ) or solely on CFVR , depending on society 's willingness-to-pay to prevent cardiac events OBJECTIVE To estimate the impact of contrast stress echocardiography on resource use in the treatment of patients with suspected coronary artery disease ( CAD ) . METHODS Fifty-nine patients with suspected CAD underwent nuclear perfusion imaging and contrast echocardiography examination . 
Further treatment was planned after each test and a final treatment was recommended after reviewing the results of both examinations . Medical resources and productivity losses were then collected for a 3-month follow-up period . RESULTS Diagnosis was possible in 96.6 % of patients with nuclear perfusion imaging and 93.2 % with contrast echocardiography , resulting in a cost per successful diagnosis of $ 637 ( Can ) and $ 476 ( Can ) , respectively . For the majority of patients ( 74 % ) , both tests provided the same result , but for 12 patients nuclear imaging suggested abnormal perfusion , whereas contrast echocardiography indicated normal function and for 2 patients it was the opposite situation . Per-patient costs for the total patient population decreased from $ 316 ( Can ) after nuclear perfusion imaging to $ 250 ( Can ) when results from both tests were known . Three-month follow-up societal costs were $ 441 ( Can ) per patient , with hospitalization contributing 58 % of this total cost . CONCLUSION Contrast echocardiography has a similar success rate to nuclear perfusion imaging in diagnosing CAD , but has a 28 % lower cost and has the potential of additional cost savings through the elimination of further diagnostic tests Randomized trials have shown that fractional flow reserve ( FFR ) guided percutaneous coronary intervention ( PCI ) improves clinical outcome and reduces costs compared with visually guided PCI . FFR has been measured during invasive coronary angiography ( ICA ) , but can now be derived noninvasively from coronary computed tomography ( CT ) angiography ( cCTA ) images ( FFRCT ) . The potential value of FFRCT in clinical decision making is unknown BACKGROUND Clinical outcomes and resource utilization after coronary computed tomography angiography ( CTA ) versus myocardial perfusion single-photon emission CT ( MPS ) in patients with stable angina and suspected coronary artery disease ( CAD ) has not been examined . 
OBJECTIVE We determined the near-term clinical effect and resource utilization after cardiac CTA compared with MPS . METHODS We randomly assigned 180 patients ( age , 57.3 ± 9.8 years ; 50.6 % men ) presenting with stable chest pain and suspected CAD at 2 sites to initial diagnostic evaluation by coronary CTA ( n = 91 ) or MPS ( n = 89 ) . The primary outcome was near-term angina-specific health status ; the secondary outcomes were incident medical and invasive treatments for CAD , CAD health care costs , and estimated radiation dose . RESULTS No patients experienced myocardial infarction or death with 98.3 % follow-up at 55 ± 34 days . Both arms experienced comparable improvements in angina-specific health status . Patients who received coronary CTA had increased incident aspirin ( 22 % vs 8 % ; P = 0.04 ) and statin ( 7 % vs -3.5 % ; P = 0.03 ) use , similar rates of CAD-related hospitalization , invasive coronary angiography , noninvasive cardiac imaging tests , and increased revascularization ( 8 % vs 1 % ; P = 0.03 ) . Coronary CTA had significantly lower total costs ( $ 781.08 [ interquartile range ( IQR ) , $ 367.80-$4349.48 ] vs $ 1214.58 [ IQR , $ 978.02-$1569.40 ] ; P < 0.001 ) with no difference in induced costs . Coronary CTA had a significantly lower total estimated effective radiation dose ( 7.4 mSv [ IQR , 5.0 - 14.0 mSv ] vs 13.3 mSv [ IQR , 13.1 - 38.0 mSv ] ; P < 0.0001 ) with no difference in induced radiation . CONCLUSION In a pilot randomized controlled trial , patients with stable CAD undergoing coronary CTA and MPS experience comparable improvements in near-term angina-related quality of life . 
Compared with MPS , coronary CTA evaluation is associated with more aggressive medical therapy , increased coronary revascularization , lower total costs , and lower effective radiation dose IMPORTANCE Coronary artery disease ( CAD ) is a major cause of cardiovascular morbidity and mortality in patients with diabetes mellitus , yet CAD often is asymptomatic prior to myocardial infarction ( MI ) and coronary death . OBJECTIVE To assess whether routine screening for CAD by coronary computed tomography angiography ( CCTA ) in patients with type 1 or type 2 diabetes deemed to be at high cardiac risk followed by CCTA-directed therapy would reduce the risk of death and nonfatal coronary outcomes . DESIGN , SETTING , AND PARTICIPANTS The FACTOR-64 study was a randomized clinical trial in which 900 patients with type 1 or type 2 diabetes of at least 3 to 5 years ' duration and without symptoms of CAD were recruited from 45 clinics and practices of a single health system ( Intermountain Healthcare , Utah ) , enrolled at a single-site coordinating center , and randomly assigned to CAD screening with CCTA ( n = 452 ) or to standard national guidelines-based optimal diabetes care ( n = 448 ) ( targets : glycated hemoglobin level < 7.0 % , low-density lipoprotein cholesterol level < 100 mg/dL , systolic blood pressure < 130 mm Hg ) . All CCTA imaging was performed at the coordinating center . Standard therapy or aggressive therapy ( targets : glycated hemoglobin level < 6.0 % , low-density lipoprotein cholesterol level < 70 mg/dL , high-density lipoprotein cholesterol level > 50 mg/dL [ women ] or > 40 mg/dL [ men ] , triglycerides level < 150 mg/dL , systolic blood pressure < 120 mm Hg ) , or aggressive therapy with invasive coronary angiography , was recommended based on CCTA findings . Enrollment occurred between July 2007 and May 2013 , and follow-up extended to August 2014 . 
MAIN OUTCOMES AND MEASURES The primary outcome was a composite of all-cause mortality , nonfatal MI , or unstable angina requiring hospitalization ; the secondary outcome was ischemic major adverse cardiovascular events ( composite of CAD death , nonfatal MI , or unstable angina ) . RESULTS At a mean follow-up time of 4.0 ( SD , 1.7 ) years , the primary outcome event rates were not significantly different between the CCTA and the control groups ( 6.2 % [ 28 events ] vs 7.6 % [ 34 events ] ; hazard ratio , 0.80 [ 95 % CI , 0.49 - 1.32 ] ; P = .38 ) . The incidence of the composite secondary end point of ischemic major adverse cardiovascular events also did not differ between groups ( 4.4 % [ 20 events ] vs 3.8 % [ 17 events ] ; hazard ratio , 1.15 [ 95 % CI , 0.60 - 2.19 ] ; P = .68 ) . CONCLUSIONS AND RELEVANCE Among asymptomatic patients with type 1 or type 2 diabetes , use of CCTA to screen for CAD did not reduce the composite rate of all-cause mortality , nonfatal MI , or unstable angina requiring hospitalization at 4 years . These findings do not support CCTA screening in this population . TRIAL REGISTRATION clinicaltrials.gov Identifier : NCT00488033 BACKGROUND The preferred initial treatment for patients with stable coronary artery disease is the best available medical therapy . We hypothesized that in patients with functionally significant stenoses , as determined by measurement of fractional flow reserve ( FFR ) , percutaneous coronary intervention ( PCI ) plus the best available medical therapy would be superior to the best available medical therapy alone . METHODS In patients with stable coronary artery disease for whom PCI was being considered , we assessed all stenoses by measuring FFR . Patients in whom at least one stenosis was functionally significant ( FFR , ≤0.80 ) were randomly assigned to FFR-guided PCI plus the best available medical therapy ( PCI group ) or the best available medical therapy alone ( medical-therapy group ) . 
Patients in whom all stenoses had an FFR of more than 0.80 were entered into a registry and received the best available medical therapy . The primary end point was a composite of death , myocardial infarction , or urgent revascularization . RESULTS Recruitment was halted prematurely after enrollment of 1220 patients ( 888 who underwent randomization and 332 enrolled in the registry ) because of a significant between-group difference in the percentage of patients who had a primary end-point event : 4.3 % in the PCI group and 12.7 % in the medical-therapy group ( hazard ratio with PCI , 0.32 ; 95 % confidence interval [ CI ] , 0.19 to 0.53 ; P<0.001 ) . The difference was driven by a lower rate of urgent revascularization in the PCI group than in the medical-therapy group ( 1.6 % vs. 11.1 % ; hazard ratio , 0.13 ; 95 % CI , 0.06 to 0.30 ; P<0.001 ) ; in particular , in the PCI group , fewer urgent revascularizations were triggered by a myocardial infarction or evidence of ischemia on electrocardiography ( hazard ratio , 0.13 ; 95 % CI , 0.04 to 0.43 ; P<0.001 ) . Among patients in the registry , 3.0 % had a primary end-point event . CONCLUSIONS In patients with stable coronary artery disease and functionally significant stenoses , FFR-guided PCI plus the best available medical therapy , as compared with the best available medical therapy alone , decreased the need for urgent revascularization . In patients without ischemia , the outcome appeared to be favorable with the best available medical therapy alone . ( Funded by St. Jude Medical ; ClinicalTrials.gov number , NCT01132495 . ) BACKGROUND To determine the comparative effectiveness and costs of a CT-strategy and a stress-electrocardiography-based strategy ( standard-of-care ; SOC-strategy ) for diagnosing coronary artery disease ( CAD ) . 
METHODS A decision analysis was performed based on a well-documented prospective cohort of 471 outpatients with stable chest pain with follow-up combined with best-available evidence from the literature . Outcomes were correct classification of patients as CAD- ( no obstructive CAD ) , CAD+ ( obstructive CAD without revascularization ) and indication for Revascularization ( using a combination reference standard ) , diagnostic costs , lifetime health care costs , and quality-adjusted life years ( QALY ) . Parameter uncertainty was analyzed using probabilistic sensitivity analysis . RESULTS For men ( and women ) , diagnostic cost savings were € 245 ( € 252 ) for the CT-strategy as compared to the SOC-strategy . The CT-strategy classified 82 % ( 88 % ) of simulated men ( women ) in the appropriate disease category , whereas 83 % ( 85 % ) were correctly classified by the SOC-strategy . The long-term cost-effectiveness analysis showed that the SOC-strategy was dominated by the CT-strategy , which was less expensive ( -€229 in men , -€444 in women ) and more effective ( + 0.002 QALY in men , + 0.005 in women ) . The CT-strategy was cost-saving ( -€231 ) but also less effective compared to SOC ( -0.003 QALY ) in men with a pre-test probability of ≥ 70 % . The CT-strategy was cost-effective in 100 % of simulations , except for men with a pre-test probability ≥ 70 % in which case it was 59 % . CONCLUSIONS The results suggest that a CT-based strategy is less expensive and equally effective compared to SOC in all women and in men with a pre-test probability < 70 % OBJECTIVES The aim of this study was to project clinical outcomes , health care costs , and cost-effectiveness of coronary computed tomography angiography ( CCTA ) , as compared with conventional diagnostic technologies , in the evaluation of patients with stable chest pain and suspected coronary artery disease ( CAD ) . 
BACKGROUND CCTA has recently been found to be effective in the evaluation of patients with suspected CAD , but investigators have raised concerns related to radiation exposure , incidental findings , and nondiagnostic exams . METHODS With published data , we developed a computer simulation model to project clinical outcomes , health care costs , and cost-effectiveness of CCTA , compared with conventional testing modalities , in the diagnosis of CAD . Our target population included 55-year-old patients who present to their primary care physicians with stable chest pain . RESULTS All diagnostic strategies yielded similar health outcomes , but performing CCTA ( with or without stress testing ) or performing stress single-photon emission computed tomography marginally minimized adverse events and maximized longevity and quality-adjusted life-years ( QALYs ) . Health outcomes associated with these strategies were comparable , with CCTA in men and women yielding the greatest QALYs but only by modest margins . Overall differences were small , and performing the most effective test , compared with the least effective , decreased adverse event rates by 3 % in men and women . Comparable increases in longevity and QALYs were 2 months and 0.1 QALYs in men and 1 month and 0.03 QALYs in women . CCTA raised overall costs , partly through the follow-up of incidental findings , and when performed with stress testing , its incremental cost-effectiveness ratio ranged from $ 26,200/QALY in men to $ 35,000/QALY in women . Health outcomes were marginally less favorable in women when radiation risks were considered . CONCLUSIONS CCTA is comparable to other diagnostic studies and might hold good clinical value , but large randomized controlled trials are needed to guide policy
2,082
23,728,275
To conclude , lower limb joint proprioception is reduced in those with BJHS compared to non-BJHS cohorts , whilst the evidence for the upper limb remains unclear
Joint proprioceptive deficit is documented in a variety of musculoskeletal conditions including osteoarthritis , ligament and meniscal injuries , and individuals with increased joint hypermobility , such as those with Ehlers–Danlos . No systematic reviews have assessed joint proprioception in people with benign joint hypermobility syndrome ( BJHS ) . This study addresses this to determine whether people with BJHS exhibit reduced joint proprioception , and , if so , whether this is evident in all age groups .
OBJECTIVES Joint hypermobility ( JH ) or " ligamentous laxity " is felt to be an underlying risk factor for many types of musculoskeletal presentation in paediatrics , and joint hypermobility syndrome ( JHS ) describes such disorders where symptoms become chronic , often more generalized and associated with functional impairment . Clinical features are felt to have much in common with more severe disorders , including Ehlers-Danlos syndrome ( EDS ) , osteogenesis imperfecta and Marfan syndrome , although this has not been formally studied in children . We defined the clinical characteristics of all patients with joint hypermobility-related presentations seen from 1999 to 2002 in a tertiary referral paediatric rheumatology unit . METHODS Patients were identified and recruited from paediatric rheumatology clinic and ward , and a dedicated paediatric rheumatology hypermobility clinic at Great Ormond Street Hospital . Data were collected retrospectively on the patients from the paediatric rheumatology clinics ( 1999 - 2002 ) and prospectively on patients seen in the hypermobility clinic ( 2000 - 2002 ) . Specifically , historical details of developmental milestones , musculoskeletal or soft tissue diagnoses and symptoms , and significant past medical history were recorded . Examination features sought included measurements of joint and soft tissue laxity , and associated conditions such as scoliosis , dysmorphic features , cardiac murmurs and eye problems . RESULTS One hundred and twenty-five children ( 64 females ) were included on whom sufficient clinical data could be identified and who had clinical problems ascribed to JH present for longer than 3 months . Sixty-four were from the paediatric rheumatology clinic and 61 from the hypermobility clinic . No differences were found in any of the measures between the two populations and results are presented in a combined fashion . 
Three-quarters of referrals came from paediatricians and general practitioners but in only 10 % was hypermobility recognized as a possible cause of joint complaint . The average age at onset of symptoms was 6.2 yr and age at diagnosis 9.0 yr , indicating a 2- to 3-yr delay in diagnosis . The major presenting complaint was arthralgia in 74 % , abnormal gait in 10 % , apparent joint deformity in 10 % and back pain in 6 % . Mean age at first walking was 15.0 months ; 48 % were considered " clumsy " and 36 % as having poor coordination in early childhood . Twelve per cent had " clicky " hips at birth and 4 % actual congenital dislocatable hip . Urinary tract infections were present in 13 and 6 % of the female and male cases , respectively . Thirteen and 14 % , respectively , had speech and learning difficulties diagnosed . A history of recurrent joint sprains was seen in 20 % and actual subluxation/dislocation of joints in 10 % . Forty per cent had experienced problems with handwriting tasks , 48 % had major limitations of school-based physical education activities , 67 % other physical activities and 41 % had missed significant periods of schooling because of symptoms . Forty-three per cent described a history of easy bruising . Examination revealed that 94 % scored > or = 4/9 on the Beighton scale for generalized hypermobility , with knees ( 92 % ) , elbows ( 87 % ) , wrists ( 82 % ) , hand metacarpophalangeal joints ( 79 % ) , and ankles ( 75 % ) being most frequently involved . CONCLUSIONS JHS is poorly recognized in children with a long delay in the time to diagnosis . Although there is a referral bias towards joint symptoms , a surprisingly large proportion is associated with significant neuromuscular and motor development problems . Our patients with JHS also show many overlap features with genetic disorders such as EDS and Marfan syndrome . 
The delay in diagnosis results in poor control of pain and disruption of normal home life , schooling and physical activities . Knowledge of the diagnosis and simple interventions are likely to be highly effective in reducing the morbidity and cost to the health and social services In 30 healthy volunteers with clinically inconspicuous knee joints and nine patients with posttraumatic recurrent patella dislocation , the proprioceptive abilities of the knee joint were evaluated by an angle reproduction test . The results of the control group showed no gender- or dominant-specific difference . The patient group showed a significant deterioration of proprioceptive capability in the injured knee joint . Even in the contralateral , uninjured knee joint , the angle reproduction test result was significantly reduced compared with the control group . After applying an elastic knee bandage , the control group and the patients with patella dislocation showed a significant improvement of the proprioceptive capability The objective of the present study is to compare and quantify the postural differences and joint pain distribution between subjects with benign joint hypermobility syndrome ( BJHS ) and the normal population . This observational , non-randomized , and controlled study was conducted at Rheumatology and Physical Medicine and Rehabilitation Medicine Departments of a tertiary care teaching hospital . Subjects comprised 35 persons with a diagnosis of BJHS , and the control group was matched for age and sex . Reedco 's Posture score ( RPS ) and visual analogue scale ( VAS ) were the outcome measures . The subjects were assessed for pain in ten major joints and rated on a VAS . A standard posture assessment was conducted using the Reedco 's Posture score . The same procedure was executed for an age- and sex-matched control group . Mean RPS for the BJHS group was 55.29 ± 8.15 and for the normal group it was 67 ± 11.94 . 
The most common postural deviances in subjects with BJHS were identified in the following areas of head , hip ( Sagittal plane ) , upper back , trunk , and lower back ( Coronal plane ) . Intensity of pain was found to be more in BJHS persons than that of the normal persons , and the knee joints were the most affected . The present study compared and quantified the postural abnormalities and the pain in BJHS persons . The need for postural re-education and specific assessment and training for the most affected joints are discussed . There is a significant difference in posture between subjects with BJHS and the normal population . BJHS persons need special attention to their posture re-education during physiotherapy sessions to reduce long-term detrimental effects on the musculoskeletal system We carried out a prospective study to determine the validity of different sets of criteria to define the hypermobility syndrome ( HMS ) , as well as the frequency , reliability and clinical features of HMS items . All consecutive cases of HMS attending the rheumatological outpatient clinic of Hospital del Mar ( Barcelona , Spain ) constituted the index group ( n = 114 ) . A control group of non-HMS rheumatological patients ( n = 59 ) was randomly selected to assess suitable cutoff points and particular HMS item prevalences . Beighton 's , Carter 's and Rotés ' HMS scores correlated very highly among them . Both the correlation coefficients obtained between each pair of sets of HMS criteria and the predictive efficiencies were uniformly high , suggesting high concurrent and predictive validity . All but 2 of the major items were more frequent among women . A basic set of criteria to define HMS is proposed . In relation to previous criteria the new scale shows better internal reliability and homogeneity . 
Results suggest that it may be suitable for screening studies and in clinical rheumatological settings OBJECTIVE To determine differences between hypermobile subjects and controls in terms of maximum strength , rate of force development , and balance . METHODS We recruited 13 subjects with hypermobility and 18 controls . Rate of force development and maximal voluntary contraction ( MVC ) during single leg knee extension of the right knee were measured isometrically for each subject . Balance was tested twice on a force plate with 15-second single-leg stands on the right leg . Rate of force development ( N/second ) and MVC ( N ) were extracted from the force-time curve as maximal rate of force development ( = limit Δforce/Δtime ) and the absolute maximal value , respectively . RESULTS The hypermobile subjects showed a significantly higher value for rate of force development ( 15.2 % higher ; P = 0.038 , P = 0.453 , epsilon = 0.693 ) and rate of force development related to body weight ( 16.4 % higher ; P = 0.018 , P = 0.601 , epsilon = 0.834 ) than the controls . The groups did not differ significantly in MVC ( P = 0.767 , P = 0.136 , epsilon = 0.065 ) , and MVC related to body weight varied randomly between the groups ( P = 0.921 , P = 0.050 , epsilon = 0.000 ) . In balance testing , the mediolateral sway of the hypermobile subjects showed significantly higher values ( 11.6 % higher ; P = 0.034 , P = 0.050 , epsilon = 0.000 ) than that of controls , but there was no significant difference ( 4.9 % difference ; P = 0.953 , P = 0.050 , epsilon = 0.000 ) in anteroposterior sway between the 2 groups . 
CONCLUSION Hypermobile women without acute symptoms or limitations in activities of daily life have a higher rate of force development in the knee extensors and a higher mediolateral sway than controls with normal joint mobility Quantification of dynamic balance is often necessary to assess a patient 's level of injury or ability to function in order to initiate an appropriate plan of care . Some therapists use the star-excursion test in an attempt to quantify dynamic balance . This test requires the patient to balance on one leg while reaching with the other leg . For the purpose of this study , the reach was performed in four directions . No previous researchers have attempted to evaluate the reliability of this test . Twenty healthy subjects between the ages of 18 and 35 years participated in this study . During two testing sessions , each subject was required to perform five reaching trials in four directions . Reliability estimates , calculated using the intraclass correlation coefficient ( 2 , 1 ) , ranged from 0.67 to 0.87 . Six duplicate practice sessions were suggested to increase this range above 0.86 . Task complexity may account for the moderate reliability estimates . Subjects should engage in a learning period before being evaluated on the star-excursion test
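The intraclass correlation coefficient ( 2 , 1 ) used in the star-excursion reliability study above can be computed from a two-way ANOVA decomposition . The sketch below follows the standard Shrout–Fleiss formulation ( two-way random effects , absolute agreement , single measurement ) ; the data matrix is hypothetical , not the study's .

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single rater."""
    x = np.asarray(scores, dtype=float)
    n, k = x.shape                       # n subjects, k raters/sessions
    grand = x.mean()
    row_means = x.mean(axis=1)           # per-subject means
    col_means = x.mean(axis=0)           # per-session means
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between-subjects MS
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between-sessions MS
    sse = np.sum((x - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                        # residual MS
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical reach distances (cm) for 4 subjects over 2 sessions
print(round(icc_2_1([[62, 61], [71, 70], [55, 54], [66, 65]]), 2))  # → 0.99
```

With real five-trial data the per-session values would typically be trial means , and a learning period ( as the abstract suggests ) would be expected to raise the coefficient by reducing within-subject variability .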
2,083
29,736,815
The Drug Burden Index was the most commonly used scale or tool in community and database studies , while the Anticholinergic Risk Scale was used more frequently in care homes and hospital settings . The association between anticholinergic burden and clinical outcomes varied by index and study . Falls and hospitalisation were consistently found to be associated with anticholinergic burden . Mortality , delirium , physical function and cognition were not consistently associated . Conclusions Anticholinergic burden scales vary in their rationale , use and association with outcomes .
Background Cumulative anticholinergic exposure ( anticholinergic burden ) has been linked to a number of adverse outcomes . To conduct research in this area , an agreed approach to describing anticholinergic burden is needed . Objective This review set out to identify anticholinergic burden scales , to describe their rationale , the settings in which they have been used and the outcomes associated with them .
BACKGROUND Observational studies report a relationship between anticholinergic drug scale ( ADS ) score and cognitive function . This study investigated whether a reduced ADS score improved cognitive function in a frail elderly population . METHODS This randomized , controlled , single-blinded trial recruited long-term residents with an ADS score of greater than or equal to 3 from 22 nursing homes in Norway . The participants were randomly allocated ( 1:1 ) to intervention or control . The intervention was a pharmacist-initiated reduction of ADS score after multidisciplinary drug reviews . Primary end point was Consortium to Establish a Registry for Alzheimer 's Disease 10-wordlist test for immediate recall . Secondary end points were Mini-Mental State Examination , delayed recall and recognition of words , saliva flow , and serum anticholinergic activity ( SAA ) . The participants were retested after 4 and 8 weeks , and the study groups were compared after adjusting for baseline differences . RESULTS Eighty-seven patients were included . The median ADS score was reduced by 2 units ( p < .0001 ) in the intervention group and remained unchanged in the control group . After 8 weeks , the adjusted mean difference in immediate recall was 0.54 words between the intervention and control group ( 95 % confidence interval [ CI ] : -0.91 , 2.05 ; p = .48 ) . The study groups did not differ significantly in any of the other cognitive end points , saliva flow , or SAA at either follow-up ( p > .18 ) . CONCLUSION Pharmacist-initiated drug changes significantly reduced ADS score but did not improve cognitive function in nursing home residents . Moreover , the drug changes did not reduce SAA or mouth dryness significantly , which might indicate limited applicability of the ADS score to prevent prescription risks in this population The use of prescription drugs in older people is high and many commonly prescribed drugs have anticholinergic effects . 
We examined the relationship between ACB and mortality and in-patient length of stay in the oldest old hospitalised population . This was a retrospective analysis of prospective audit using hospital audit data from acute medical admissions in three hospitals in England and Scotland . Baseline use of possible or definite anticholinergics was determined according to the Anticholinergic Cognitive Burden Scale . The main outcome measures were in-hospital mortality , early in-hospital mortality at 3- and 7-days and in-patient length of stay . A total of 419 patients ( including 65 patients with known dementia ) were included [ median age=92.9 , inter-quartile range ( IQR ) 91.4 - 95.1 years ] . 256 ( 61.1 % ) were taking anticholinergic medications . Younger age , greater number of pre-morbid conditions , ischemic heart disease , number of medications , higher urea and creatinine levels were significantly associated with higher total ACB burden on univariate regression analysis . There were no significant differences observed in terms of in-patient mortality , in-patient hospital mortality within 3- and 7-days and likelihood of prolonged length of hospital stay between ACB categories . Compared to those without cardiovascular disease , patients with cardiovascular disease showed similar outcome regardless of ACB load ( either = 0 or > 0 ACB ) . We found no association between ACB and early ( within 3- and 7-days ) and in-patient mortality and hospital length of stay outcomes in this cohort of oldest old in the acute medical admission setting Background Delirium prevalence in the intensive care unit ( ICU ) is high . Numerous psychotropic agents are used to manage delirium in the ICU with limited data regarding their efficacy or harms . Methods / Design This is a randomized controlled trial of 428 patients aged 18 and older suffering from delirium and admitted to the ICU of Wishard Memorial Hospital in Indianapolis . 
Subjects assigned to the intervention group will receive a multicomponent pharmacological management protocol for delirium ( PMD ) and those assigned to the control group will receive no change in their usual ICU care . The primary outcomes of the trial are ( 1 ) delirium severity as measured by the Delirium Rating Scale revised-98 ( DRS-R-98 ) and ( 2 ) delirium duration as determined by the Confusion Assessment Method for the ICU ( CAM-ICU ) . The PMD protocol targets the three neurotransmitter systems thought to be compromised in delirious patients : dopamine , acetylcholine , and gamma-aminobutyric acid . The PMD protocol will target the reduction of anticholinergic medications and benzodiazepines , and introduce a low-dose of haloperidol at 0.5 - 1 mg for 7 days . The protocol will be delivered by a combination of computer ( artificial intelligence ) and pharmacist ( human intelligence ) decision support system to increase adherence to the PMD protocol . Discussion The proposed study will evaluate the content and the delivery process of a multicomponent pharmacological management program for delirium in the ICU . Trial Registration ClinicalTrials.gov : BACKGROUND Adverse effects of anticholinergic medications may contribute to events such as falls , delirium , and cognitive impairment in older patients . To further assess this risk , we developed the Anticholinergic Risk Scale ( ARS ) , a ranked categorical list of commonly prescribed medications with anticholinergic potential . The objective of this study was to determine if the ARS score could be used to predict the risk of anticholinergic adverse effects in a geriatric evaluation and management ( GEM ) cohort and in a primary care cohort . METHODS Medical records of 132 GEM patients were reviewed retrospectively for medications included on the ARS and their resultant possible anticholinergic adverse effects . 
Prospectively , we enrolled 117 patients , 65 years or older , in primary care clinics ; performed medication reconciliation ; and asked about anticholinergic adverse effects . The relationship between the ARS score and the risk of anticholinergic adverse effects was assessed using Poisson regression analysis . RESULTS Higher ARS scores were associated with increased risk of anticholinergic adverse effects in the GEM cohort ( crude relative risk [ RR ] , 1.5 ; 95 % confidence interval [ CI ] , 1.3 - 1.8 ) and in the primary care cohort ( crude RR , 1.9 ; 95 % CI , 1.5 - 2.4 ) . After adjustment for age and the number of medications , higher ARS scores increased the risk of anticholinergic adverse effects in the GEM cohort ( adjusted RR , 1.3 ; 95 % CI , 1.1 - 1.6 ; c statistic , 0.74 ) and in the primary care cohort ( adjusted RR , 1.9 ; 95 % CI , 1.5 - 2.5 ; c statistic , 0.77 ) . CONCLUSION Higher ARS scores are associated with statistically significantly increased risk of anticholinergic adverse effects in older patients Abstract Objective To assess the potential of anticholinergic drugs as a cause of non-degenerative mild cognitive impairment in elderly people . Design Longitudinal cohort study . Setting 63 randomly selected general practices in the Montpellier region of southern France . Participants 372 people aged > 60 years without dementia at recruitment . Main outcome measures Anticholinergic burden from drug use , cognitive examination , and neurological assessment . Results 9.2 % of subjects continuously used anticholinergic drugs during the year before cognitive assessment . Compared with non-users , they had poorer performance on reaction time , attention , delayed non-verbal memory , narrative recall , visuospatial construction , and language tasks but not on tasks of reasoning , immediate and delayed recall of wordlists , and implicit memory . 
Eighty per cent of the continuous users were classified as having mild cognitive impairment compared with 35 % of non-users , and anticholinergic drug use was a strong predictor of mild cognitive impairment ( odds ratio 5.12 , P = 0.001 ) . No difference was found between users and non-users in risk of developing dementia at follow-up after eight years . Conclusions Elderly people taking anticholinergic drugs had significant deficits in cognitive functioning and were highly likely to be classified as mildly cognitively impaired , although not at increased risk for dementia . Doctors should assess current use of anticholinergic drugs in elderly people with mild cognitive impairment before considering administration of acetylcholinesterase inhibitors OBJECTIVES To examine the longitudinal relationship between cumulative exposure to anticholinergic medications and memory and executive function in older men . DESIGN Prospective cohort study . SETTING A Department of Veterans Affairs primary care clinic . PARTICIPANTS Five hundred forty-four community-dwelling men aged 65 and older with diagnosed hypertension . MEASUREMENTS The outcomes were measured using the Hopkins Verbal Recall Test ( HVRT ) for short-term memory and the instrumental activity of daily living ( IADL ) scale for executive function at baseline and during follow-up . Anticholinergic medication use was ascertained using participants ' primary care visit records and quantified as total anticholinergic burden using a clinician-rated anticholinergic score . RESULTS Cumulative exposure to anticholinergic medications over the preceding 12 months was associated with poorer performance on the HVRT and IADLs . 
On average , a 1-unit increase in the total anticholinergic burden per 3 months was associated with a 0.32-point ( 95 % confidence interval (CI)= 0.05 - 0.58 ) and 0.10-point ( 95 % CI=0.04 - 0.17 ) decrease in the HVRT and IADLs , respectively , independent of other potential risk factors for cognitive impairment , including age , education , cognitive and physical function , comorbidities , and severity of hypertension . The association was attenuated but remained statistically significant with memory ( 0.29 , 95 % CI=0.01 - 0.56 ) and executive function ( 0.08 , 95 % CI=0.02 - 0.15 ) after further adjustment for concomitant non-anticholinergic medications . CONCLUSION Cumulative anticholinergic exposure across multiple medications over 1 year may negatively affect verbal memory and executive function in older men . Prescription of drugs with anticholinergic effects in older persons deserves continued attention to avoid deleterious adverse effects Although there is an understandable emphasis on the side effects of individual medications , the cumulative effects of medications have received little attention in palliative care prescribing . Anticholinergic load reflects a cumulative effect of medications that may account for several symptoms and adverse health outcomes frequently encountered in palliative care . A secondary analysis of 304 participants in a randomised controlled trial had their cholinergic load calculated using the Clinician-Rated Anticholinergic Scale ( modified version ) longitudinally as death approached from medication data collected prospectively by study nurses on each visit . Mean time from referral to death was 107 days , and mean 4.8 visits were conducted in which data were collected . Relationships to key factors were explored . Data showed that anticholinergic load rose as death approached because of increasing use of medications for symptom control . 
Symptoms significantly associated with increasing anticholinergic load included dry mouth and difficulty concentrating ( P < 0.05 ) . There were also significant associations with increasing anticholinergic load and decreasing functional status ( Australia-modified Karnofsky Performance Scale ) and quality of life ( P < 0.05 ) . This study has documented in detail the longitudinal anticholinergic load associated with medications used in a palliative care population between referral and death , demonstrating that the biggest contributor to anticholinergic load in a palliative care population is from symptom-specific medications , which increased as death approached AIM Increasing evidence from experimental studies and clinical observations suggests that drugs with anticholinergic properties can cause physical and mental impairment . The aim of the present study was to evaluate the relationship between the use of drugs with anticholinergic activity and negative outcomes in older nursing home residents . METHODS We used data from the database of the U.L.I.S.S.E project ( Un Link Informatico sui Servizi Sanitari Esistenti per l'Anziani ) , a prospective multicenter observational study . Patients from 31 facilities in Italy were assessed at baseline and at 6 and 12 months by trained personnel , using the Minimum Data Set for Nursing Home ( MDS-NH ) . The only exclusion criterion was age younger than 65 years . The Anticholinergic Risk Scale ( ARS ) , a list of commonly prescribed drugs with potential anticholinergic effects , was used to calculate the anticholinergic load . RESULTS A total population of 1490 patients was analyzed ; almost half of the sample ( 48 % ) was using drugs with anticholinergic properties . The population of patients with ARS 1 or higher had a higher comorbidity index ( P < .003 ) and greater cognitive impairment ( CPS 5 - 6 ) ( P < .007 ) . 
They were more likely to suffer from heart failure , Parkinson disease , depression , anxiety , and schizophrenia . In multivariate analysis , a higher score on the ARS scale was associated with a greater likelihood of functional decline ( described as the loss of ≥1 ADL point ) ( odds ratio [ OR ] 1.13 ; confidence interval [ CI ] 1.03 - 1.23 ) , with a higher rate of falls ( OR 1.26 ; CI 1.13 - 1.41 ) , and with a higher incidence of delirium ( OR 1.16 ; CI 1.02 - 1.32 ) during a 1-year follow-up . CONCLUSIONS The use of medications with anticholinergic properties is common among older nursing home residents . Our results suggest that among older nursing home residents the use of anticholinergic drugs is associated with important negative outcomes , such as functional decline , falls , and delirium Background Drugs with anticholinergic effects are associated with adverse events such as delirium and falls as well as cognitive decline and loss of independence . Objective The aim of the study was to evaluate the association between anticholinergic burden and both cognitive and functional status , according to the hypothesis that the cumulative anticholinergic burden , as measured by the Anticholinergic Cognitive Burden ( ACB ) Scale and Anticholinergic Risk Scale ( ARS ) , increases the risk of cognitive decline and impairs activities of daily living . Methods This cross-sectional , prospective study ( 3-month telephone follow-up ) was conducted in 66 Italian internal medicine and geriatric wards participating in the Registry of Polytherapies SIMI ( Società Italiana di Medicina Interna ) ( REPOSI ) study during 2010 . The sample included 1,380 inpatients aged 65 years or older . Cognitive status was rated with the Short Blessed Test ( SBT ) and physical function with the Barthel Index . Each patient ’s anticholinergic burden was evaluated using the ACB and ARS scores . 
Results The mean SBT score for patients treated with anticholinergic drugs was higher than that for patients receiving no anticholinergic medications as also indicated by the ACB scale , even after adjustment for age , sex , education , stroke and transient ischaemic attack [ 9.2 ( 95 % CI 8.6–9.9 ) vs. 8.5 ( 95 % CI 7.8–9.2 ) ; p = 0.05 ] . There was a dose – response relationship between total ACB score and cognitive impairment . Patients identified by the ARS had more severe cognitive and physical impairment than patients identified by the ACB scale , and the dose – response relationship between this score and ability to perform activities of daily living was clear . No correlation was found with length of hospital stay . Conclusions Drugs with anticholinergic properties identified by the ACB scale and ARS are associated with worse cognitive and functional performance in elderly patients . The ACB scale might permit a rapid identification of drugs potentially associated with cognitive impairment in a dose – response pattern , but the ARS is better at rating activities of daily living BACKGROUND The use of potentially inappropriate medications in older adults can lead to known adverse drug events , but long-term effects are less clear . We therefore conducted a prospective cohort study of older women to determine whether PIM use is associated with risk of functional impairment or low cognitive performance . METHODS We followed up 1,429 community-dwelling women ( ≥ 75 years ) for a period of 5 years at four clinical sites in the United States . The primary predictor at baseline was PIM use based on 2003 Beers Criteria . We also assessed anticholinergic load using the Anticholinergic Cognitive Burden scale . Outcomes included scores on a battery of six cognitive tests at follow-up and having one or more incident impairments in instrumental activities of daily living . 
Regression models were adjusted for baseline age , race , education , smoking , physical activity , a modified Charlson Comorbidity Index , and cognitive score . RESULTS The mean ± SD age of women at baseline was 83.2 ± 3.3 . In multivariate models , baseline PIM use and higher ACB scores were significantly associated with poorer performance in category fluency ( PIM : p = .01 ; ACB : p = .02 ) and immediate ( PIM : p = .04 ; ACB : p = .03 ) and delayed recall ( PIM : p = .04 ) . Both PIM use ( odds ratio [ OR ] : 1.36 [ 1.05 - 1.75 ] ) and higher ACB scores ( OR : 1.11 [ 1.04 - 1.19 ] ) were also strongly associated with incident functional impairment . CONCLUSIONS The results provide suggestive evidence that PIM use and increased anticholinergic load may be associated with risk of functional impairment and low cognitive performance . More cautious selection of medications in older adults may reduce these potential risks OBJECTIVES To examine whether anticholinergic medications have effects on the level of cognitive function or cognitive decline in persons in their early to mid 60s . METHODS A randomly selected community-based sample of 2058 persons aged 60 - 64 at baseline was interviewed twice over four years . Anticholinergic medication use was determined from self-report medication data using the Anticholinergic Drug Scale . Cognition was assessed with the California Verbal Learning Test I ( one trial ) , Digits Backwards , the Symbol Digit Modalities Test , the Mini-Mental State Exam and simple and choice reaction time . Persons meeting criteria for Mild Cognitive Impairment were identified in a clinical substudy . Mixed models adjusting for age , sex , self-rated depression and physical health , and total number of medications were used to analyse the data . RESULTS There was a significant main effect of anticholinergic group averaged across time for the Symbol Digits Modalities Test with poorer performance among anticholinergic medication users . 
Main effects for the other cognitive tests and mild cognitive impairment were non-significant . No time by anticholinergic group interactions were significant . CONCLUSIONS This study suggests that exposure to anticholinergic medication is associated with lower level of complex attention in the young-old , but not with greater cognitive decline over time . Although the clinical significance of this is not clear , caution should be taken when prescribing medications with anticholinergic effects to older persons Objective : To compare associations between four measures of anticholinergic exposure ( anticholinergic risk scale , ARS ; anticholinergic drug burden , DBAC ; number and use versus no use of anticholinergic drugs ) , Barthel Index ( BI , physical function ) and Abbreviated Mental Test ( AMT , cognitive function ) on admission in older hospitalized patients . Methods : Prospective observational study of a consecutive series of 271 older patients ( age 83 ± 7 years ) from community-dwelling and institutionalized settings , admitted to an acute geriatric admission unit between 28 September 2011 and 18 December 2011 . The main outcome measures were BI quartiles ( primary outcome ) and AMT ( secondary outcome ) on admission . Results : Anticholinergic prevalence was 47 % . Multinomial logistic regression showed higher DBAC was associated with a greater risk of being in the lower BI quartiles versus highest BI quartile ( Q4 ) . This risk was significant for Q3 ( p = 0.04 ) and Q2 ( p = 0.02 ) but not for Q1 ( p = 0.06 ) . A greater number of anticholinergic drugs was associated with a higher risk of being in Q2 ( p = 0.02 ) . This risk was not significant for either Q3 ( p = 0.10 ) or Q1 ( p = 0.06 ) . No significant associations were observed either with use of anticholinergic medication or with ARS and BI quartiles . AMT did not show independent associations with any of the four measures of anticholinergic exposure . 
Conclusion : In older hospitalized patients , DBAC and some crude measures of anticholinergic exposure , but not ARS , showed independent associations with lower BI , but not AMT . These results highlight differences between various measures of anticholinergic drug exposure when studying their associations with functional status OBJECTIVES To evaluate the association between the Drug Burden Index ( DBI ) , a measure of a person 's total exposure to anticholinergic and sedative medications that includes principles of dose-response and maximal effect and is associated with impaired physical function in community-dwelling older people , and falls in residents of residential aged care facilities ( RACFs ) . DESIGN Data were drawn from participants in a randomized controlled trial that investigated falls and fractures . SETTING RACFs in Sydney , Australia . PARTICIPANTS Study participants ( N=602 ; 70.9 % female ) were recruited from 51 RACFs . Mean age was 85.7 ± 6.4 , and mean DBI was 0.60 ± 0.66 . MEASUREMENTS Medication history was obtained on each participant . Drugs were classified as anticholinergic or sedative and a DBI was calculated . Falls were measured over a 12-month period . Comorbidity , cognitive impairment ( Mini-Mental State Examination ) and depression ( Geriatric Depression Scale ) were determined . RESULTS There were 998 falls in 330 individuals during a follow-up period of 574.2 person-years , equating to an average rate of 1.74 falls per person-year . The univariate negative binomial regression model for falls showed incidence rate ratios of 1.69 ( 95 % confidence interval (CI)=1.22 - 2.34 ) for low DBI ( < 1 ) and 2.11 ( 95 % CI=1.47 - 3.04 ) for high DBI ( ≥1 ) when compared with those who had a DBI of 0 . 
After adjusting for age , sex , history of falling , cognitive impairment , depression , use of a walking aid , comorbidities , polypharmacy , and incontinence , incident rate ratios of 1.61 ( 95 % CI=1.17 - 2.23 ) for low DBI and 1.90 ( 95 % CI=1.30 - 2.78 ) for high DBI were obtained . CONCLUSION DBI is significantly and independently associated with falls in older people living in RACFs . Interventional studies designed for this population are needed to determine whether reducing DBI , through dose reduction or cessation of anticholinergic and sedative drugs , can prevent falls OBJECTIVES To evaluate risk factors for preoperative and postoperative delirium . DESIGN Prospective cohort study . SETTING Departments of orthopedic surgery in two Norwegian hospitals . PARTICIPANTS Three hundred sixty-four patients with and without cognitive impairment , aged 65 and older . MEASUREMENTS Patients were screened daily for delirium using the Confusion Assessment Method . Established risk factors and risk factors regarded as clinically important according to expert opinion were explored in univariate analyses . Variables associated with the outcomes ( P<.05 ) were entered into multivariate logistic regression models . RESULTS Delirium was present in 50 of 237 ( 21.1 % ) assessable patients preoperatively , whereas 68 of 187 ( 36.4 % ) patients developed delirium postoperatively ( incident delirium ) . Multivariate logistic regression identified four risk factors for preoperative delirium : cognitive impairment ( adjusted odds ratio (AOR)=4.7 , 95 % confidence interval (CI)=1.9 - 11.3 ) , indoor injury ( AOR=3.6 , 95 % CI=1.1 - 12.2 ) , fever ( AOR=3.4 , 95 % CI=1.5 - 7.7 ) , and preoperative waiting time ( AOR=1.05 , 95 % CI=1.0 - 1.1 per hour ) . 
Cognitive impairment ( AOR=2.9 , 95 % CI=1.4 - 6.2 ) , indoor injury ( AOR=2.9 , 95 % CI=1.1 - 6.3 ) , and body mass index ( BMI ) less than 20.0 ( AOR=2.9 , 95 % CI=1.3 - 6.7 ) were independent and statistically significant risk factors for postoperative delirium . CONCLUSION Time from admission to operation is a risk factor for preoperative delirium , whereas low BMI is an important risk factor for postoperative delirium in hip fracture patients . Cognitive impairment and indoor injury are independent risk factors for preoperative and postoperative delirium Anticholinergic Drug Scale ( ADS ) scores were previously associated with serum anticholinergic activity ( SAA ) in a pilot study . To replicate these results , the association between ADS scores and SAA was determined using simple linear regression in subjects from a study of delirium in 201 long-term care facility residents who were not included in the pilot study . Simple and multiple linear regression models were then used to determine whether the ADS could be modified to more effectively predict SAA in all 297 subjects . In the replication analysis , ADS scores were significantly associated with SAA ( R2 = .0947 , P < .0001 ) . In the modification analysis , each model significantly predicted SAA , including ADS scores ( R2 = .0741 , P < .0001 ) . The modifications examined did not appear useful in optimizing the ADS . This study replicated findings on the association of the ADS with SAA . Future work will determine whether the ADS is clinically useful for preventing anticholinergic adverse effects OBJECTIVES The anticholinergic risk scale ( ARS ) score is associated with the number of anticholinergic side effects in older outpatients . We tested the hypothesis that high ARS scores are negatively associated with " global " parameters of physical function ( Barthel Index , primary outcome ) and predict length of stay and in-hospital mortality ( secondary outcomes ) in older hospitalized patients . 
DESIGN AND SETTING Prospective study in 2 acute geriatric units . PARTICIPANTS Three hundred sixty-two consecutive patients ( age 83.6 ± 6.6 years ) admitted between February 1 , 2010 , and June 30 , 2010 . MEASUREMENTS Clinical and demographic characteristics , Barthel Index , full medication exposure , and ARS score were recorded on admission . Data on length of stay and in-hospital mortality were obtained from electronic records . RESULTS After adjustment for age , gender , dementia , institutionalization , Charlson Comorbidity Index , admission site , and number of nonanticholinergic drugs , a unit increase in ARS score was associated with a 29 % reduction in the odds of being in a higher Barthel quartile than a lower quartile ( odds ratio 0.71 , 95 % confidence interval [ CI ] 0.59 - 0.86 , P = .001 ) . The Barthel components mostly affected were bathing ( P < .001 ) , grooming ( P < .001 ) , dressing ( P < .001 ) , transfers ( P = .005 ) , mobility ( P < .001 ) , and stairs ( P < .001 ) . Higher ARS scores predicted in-hospital mortality among patients with hyponatremia ( hazard ratio [ HR ] 3.66 , 95 % CI 1.70 - 7.89 , P = .001 ) but not those without hyponatremia ( HR 1.04 , 95 % CI 0.70 - 1.54 , P = .86 ) . The ARS score did not significantly predict length of stay ( HR 1.02 , 95 % CI 0.88 - 1.17 , P = .82 ) . CONCLUSION High ARS scores are negatively associated with various components of the Barthel Index and predict in-hospital mortality in the presence of hyponatremia among older patients . The ARS score may be useful in the acute setting to improve risk stratification
2,084
24,046,037
The data presented in this review encourage new clinical trials to investigate the possible use of melatonin in cancer patients suffering from sleep-wake and mood disturbances , also considering that melatonin has shown low toxicity in cancer patients
The aim of this article was to perform a systematic review on the role of melatonin in the prevention of cancer tumorigenesis ( in vivo and in vitro ) as well as in the management of cancer correlates , such as sleep-wake and mood disturbances . The International Agency for Research on Cancer recently classified “ shift-work that involves circadian disruption ” as “ probably carcinogenic to humans ” ( Group 2A ) based on “ limited evidence in humans for the carcinogenicity of shift-work that involves night-work ” , and “ sufficient evidence in experimental animals for the carcinogenicity of light during the daily dark period ( biological night ) ” . In cancer patients depression and insomnia are frequent and serious comorbid conditions which require special attention .
We have evaluated the perioperative effects of melatonin with those of midazolam in 75 women in a prospective , randomized , double-blind , placebo-controlled study . Patients were given sublingual midazolam 15 mg , melatonin 5 mg or placebo , approximately 100 min before a standard anaesthetic . Sedation , anxiety and orientation were quantified before , and 10 , 30 , 60 and 90 min after premedication , and 15 , 30 , 60 and 90 min after admission to the recovery room . Psychomotor performance was evaluated at these times also , using the digit-symbol substitution test ( DSST ) and the Trieger dot test ( TDT ) . Patients who received premedication with either midazolam or melatonin had a significant decrease in anxiety levels and increase in levels of sedation before operation compared with controls . Midazolam produced the highest scores for sedation at 30 and 60 min after administration and significant psychomotor impairment in the preoperative period compared with melatonin or placebo . After operation , patients who received midazolam or melatonin premedication had increased levels of sedation at 30 min and impairment in performance on the DSST at 15 , 30 and 90 min compared with controls . There were no significant differences between the three groups for anxiety levels or TDT performance after operation . Amnesia was notable only in the midazolam group for one preoperative event ( entry into the operating room ) . Patient satisfaction was noted in the midazolam and melatonin groups only . We have demonstrated that melatonin can be used effectively for premedication of adult patients Background Studies have shown that there is a high prevalence of depression in cancer patients . Women with breast cancer may have an even higher risk of depression particularly in a postmenopausal or estrogen deficiency state . 
A small number of randomized controlled trials have examined the efficacy of antidepressants compared to that of a placebo in cancer patients , but some results have been difficult to interpret due to a heterogeneous patient group . In the current investigation , we screened newly diagnosed early stage breast cancer patients for depressive symptoms prior to the initiation of adjuvant therapy and investigated whether the oral antidepressant fluoxetine affected depressive symptoms , completion of adjuvant treatment , and quality of life . Methods Patients with newly diagnosed early stage breast cancer were screened for depressive symptoms prior to the initiation of adjuvant therapy . Patients with depressive symptoms were randomized to a daily oral fluoxetine or a placebo . Patients were then followed for 6 months and evaluated for quality of life , completion of adjuvant treatment , and depressive symptoms . Results A high percentage of patients with newly diagnosed early stage breast cancer were found to have depressive symptoms prior to the initiation of adjuvant therapy . The use of fluoxetine for 6 months resulted in an improvement in quality of life , a higher completion of adjuvant treatment ( chemotherapy , hormonal therapy , chemotherapy plus hormonal therapy ) , and a reduction in depressive symptoms compared to patients who received placebo . Conclusions An antidepressant should be considered for early stage breast cancer patients with depressive symptoms who are receiving adjuvant treatment Introduction Breast cancer represents about one-third of all cancer diagnoses and accounts for about 15 % of cancer deaths in women . Many of these patients experience depression , anxiety , sleep disturbances and cognitive dysfunction . This may adversely affect quality of life and also contribute to morbidity and mortality . Melatonin is a regulatory circadian hormone having , among others , a hypnotic and an antidepressive effect . 
It has very low toxicity and very few adverse effects compared with the more commonly used antidepressants and hypnotics. Methods and analysis The objective of this double-blind, randomised, placebo-controlled trial is to investigate whether treatment with oral melatonin has a prophylactic or ameliorating effect on depressive symptoms, anxiety, sleep disturbances and cognitive dysfunction in women with breast cancer. Furthermore, the authors will examine whether a specific clock gene, PER3, is correlated with an increased risk of depressive symptoms, sleep disturbances or cognitive dysfunction. The MELODY trial is a prospective, double-blinded, randomised, placebo-controlled trial in which the authors intend to include 260 patients. The primary outcome is depressive symptoms measured by the Major Depression Inventory. The secondary outcomes are anxiety measured by a Visual Analogue Scale, total sleep time, sleep efficiency, sleep latency and periods awake measured by actigraphy, and changes in cognitive function measured by a neuropsychological test battery. Tertiary outcomes are fatigue, pain, well-being and sleep quality/quantity measured by Visual Analogue Scale and sleep diary, and sleepiness measured by the Karolinska Sleepiness Scale. The PER3 genotype is also to be determined in blood samples Summary Background. Fatigue can significantly interfere with a cancer patient's ability to fulfill daily responsibilities and enjoy life. It commonly co-exists with depression in patients undergoing chemotherapy, suggesting that administration of an antidepressant that alleviates symptoms of depression could also reduce fatigue. Methods.
We report on a double-blind clinical trial of 94 female breast cancer patients receiving at least four cycles of chemotherapy, randomly assigned to receive either 20 mg of the selective serotonin re-uptake inhibitor (SSRI) paroxetine (Paxil®, SmithKline Beecham Pharmaceuticals) or an identical-appearing placebo. Patients began their study medication seven days following their first on-study treatment and continued until seven days following their fourth on-study treatment. Seven days after each treatment, participants completed questionnaires measuring fatigue (Multidimensional Assessment of Fatigue, Profile of Mood States-Fatigue/Inertia subscale and Fatigue Symptom Checklist) and depression (Profile of Mood States-Depression subscale [POMS-DD] and Center for Epidemiologic Studies-Depression [CES-D]). Results. Repeated-measures ANOVAs, after controlling for baseline measures, showed that paroxetine was more effective than placebo in reducing depression during chemotherapy as measured by the CES-D (p = 0.006) and the POMS-DD (p = 0.07) but not in reducing fatigue (all measures, ps > 0.27). Conclusions. Although depression was significantly reduced in the 44 patients receiving paroxetine compared to the 50 patients receiving placebo, indicating that a biologically active dose was used, no significant differences between groups on any of the measures of fatigue were observed. Results suggest that modulation of serotonin may not be a primary mechanism of fatigue related to cancer treatment BACKGROUND Low urinary melatonin levels have been associated with an increased risk of breast cancer in premenopausal women. However, the association between melatonin levels and breast cancer risk in postmenopausal women remains unclear.
METHODS We investigated the association between melatonin levels and breast cancer risk in postmenopausal women in a prospective case-control study nested in the Hormones and Diet in the Etiology of Breast Cancer Risk cohort, which included 3966 eligible postmenopausal women. The concentration of melatonin's major metabolite, 6-sulfatoxymelatonin, was measured in a baseline 12-hour overnight urine sample from 178 women who later developed incident breast cancer and from 710 matched control subjects. We used multivariable-adjusted conditional logistic regression models to investigate associations. Relative risks are reported as odds ratios (ORs). All statistical tests were two-sided. RESULTS Increased melatonin levels were associated with a statistically significantly lower risk of invasive breast cancer in postmenopausal women (for women in the highest quartile of total overnight 6-sulfatoxymelatonin output vs the lowest quartile, multivariable OR also adjusted for testosterone = 0.56, 95% confidence interval [CI] = 0.33 to 0.97; P(trend) = .02). This association was strongest among never and past smokers (OR = 0.38, 95% CI = 0.20 to 0.74; P(trend) = .001) and after excluding women who were diagnosed with invasive breast cancer within 4 years after urine collection (OR = 0.34, 95% CI = 0.15 to 0.75; P(trend) = .002). We did not observe substantial variation in relative risks by hormone receptor status of breast tumors. Among the 3966 women in the cohort, 40 of the 992 women in the highest quartile of 6-sulfatoxymelatonin developed breast cancer during follow-up, compared with 56 of the 992 women in the lowest quartile of 6-sulfatoxymelatonin.
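The raw quartile counts reported above (40 cases among 992 women in the highest 6-sulfatoxymelatonin quartile vs. 56 among 992 in the lowest) allow a quick illustrative check. The sketch below computes the crude (unadjusted) odds ratio and an approximate Woolf 95% confidence interval from that 2×2 table; it is not the authors' analysis, which used multivariable conditional logistic regression (hence their adjusted OR of 0.56 differs from this crude value).

```python
import math

# 2x2 table from the reported cohort counts (illustrative only):
# highest quartile: 40 cases, 992 - 40 non-cases
# lowest quartile:  56 cases, 992 - 56 non-cases
a, b = 40, 992 - 40
c, d = 56, 992 - 56

crude_or = (a * d) / (b * c)

# Woolf (log) method for an approximate 95% confidence interval
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(crude_or) - 1.96 * se_log_or)
hi = math.exp(math.log(crude_or) + 1.96 * se_log_or)

print(f"crude OR = {crude_or:.2f} (95% CI {lo:.2f}-{hi:.2f})")
# crude OR ≈ 0.70, below 1 but less extreme than the adjusted OR of 0.56
```

The gap between this crude estimate and the published adjusted estimate reflects the matching and covariate adjustment in the original conditional logistic model.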
CONCLUSION Results from this prospective study provide evidence for a statistically significant inverse association between melatonin levels, as measured in overnight morning urine, and invasive breast cancer risk in postmenopausal women BACKGROUND Exposure to light at night may increase the risk of breast cancer by suppressing the normal nocturnal production of melatonin by the pineal gland, which, in turn, could increase the release of estrogen by the ovaries. This study investigated whether such exposure is associated with an increased risk of breast cancer in women. METHODS Case patients (n = 813), aged 20-74 years, were diagnosed from November 1992 through March 1995; control subjects (n = 793) were identified by random-digit dialing and were frequency matched according to 5-year age groups. An in-person interview was used to gather information on sleep habits and bedroom lighting environment in the 10 years before diagnosis and lifetime occupational history. Odds ratios (ORs) and 95% confidence intervals (CIs) were estimated by use of conditional logistic regression, with adjustment for other potential risk factors. RESULTS Breast cancer risk was increased among subjects who frequently did not sleep during the period of the night when melatonin levels are typically at their highest (OR = 1.14 for each night per week; 95% CI = 1.01 to 1.28). Risk did not increase with interrupted sleep accompanied by turning on a light. There was an indication of increased risk among subjects with the brightest bedrooms. Graveyard shiftwork was associated with increased breast cancer risk (OR = 1.6; 95% CI = 1.0 to 2.5), with a trend of increased risk with increasing years and with more hours per week of graveyard shiftwork (P = .02, Wald chi-squared test).
CONCLUSION The results of this study provide evidence that indicators of exposure to light at night may be associated with the risk of developing breast cancer BACKGROUND AND PURPOSE To estimate the prevalence of insomnia symptoms and syndrome in the general population, describe the types of self-help treatments and consultations initiated for insomnia, and examine help-seeking determinants. PATIENTS AND METHODS A randomly selected sample of 2001 French-speaking adults from the province of Quebec (Canada) responded to a telephone survey about sleep, insomnia, and its treatments. RESULTS Of the total sample, 25.3% were dissatisfied with their sleep, 29.9% reported insomnia symptoms, and 9.5% met criteria for an insomnia syndrome. Thirteen percent of the respondents had consulted a healthcare provider specifically for insomnia in their lifetime, with general practitioners being the most frequently consulted. Daytime fatigue (48%), psychological distress (40%), and physical discomfort (22%) were the main determinants prompting individuals with insomnia to seek treatment. Of the total sample, 15% had used herbal/dietary products at least once to facilitate sleep and 11% had used prescribed sleep medications in the year preceding the survey. Other self-help strategies employed to facilitate sleep included reading, listening to music, and relaxation. CONCLUSIONS These findings confirm the high prevalence of insomnia in the general population. While few insomnia sufferers seek professional consultations, many individuals initiate self-help treatments, particularly when daytime impairments such as fatigue become more noticeable.
Improved knowledge of the determinants of help-seeking behaviors could guide the development of effective public health prevention and intervention programs to promote healthy sleep Mounting evidence suggests habitual sleep duration is associated with various health outcomes; both short and long sleep duration have been implicated in increased risk of cardiovascular disease, diabetes, and all-cause mortality. However, data on the relation between sleep duration and cancer risk are sparse and inconclusive. A link between low levels of melatonin, a hormone closely related to sleep, and increased risk of breast cancer has recently been suggested, but it is unclear whether duration of sleep may affect breast cancer risk. We explored the association between habitual sleep duration reported in 1986 and subsequent risk of breast cancer in the Nurses' Health Study using Cox proportional hazards models. During 16 years of follow-up, 4,223 incident cases of breast cancer occurred among 77,418 women in this cohort. Compared with women sleeping 7 hours, covariate-adjusted hazard ratios and 95% confidence intervals for those sleeping ≤5, 6, 8, and ≥9 hours were 0.93 (0.79-1.09), 0.98 (0.91-1.06), 1.05 (0.97-1.13), and 0.95 (0.82-1.11), respectively. A moderate trend of increasing risk towards longer sleep duration was observed when analyses were restricted to participants who reported the same sleep duration in 1986 and 2000 (P(trend) = 0.05). In this prospective study, we found no convincing evidence for an association between sleep duration and the incidence of breast cancer Background: Lower urinary melatonin levels are associated with a higher risk of breast cancer in postmenopausal women. Literature for premenopausal women is scant and inconsistent.
Methods: In a prospective case-control study, we measured the concentration of 6-sulphatoxymelatonin (aMT6s) in the 12-hour overnight urine of 180 premenopausal women with incident breast cancer and 683 matched controls. Results: In logistic regression models, the multivariate odds ratio (OR) of invasive breast cancer for women in the highest quartile of total overnight aMT6s output compared with the lowest was 1.43 [95% confidence interval (CI), 0.83-2.45; Ptrend = 0.03]. Among current nonsmokers, no association was evident (OR, 1.00; 95% CI, 0.52-1.94; Ptrend = 0.29). We observed an OR of 0.68 between overnight urinary aMT6s level and breast cancer risk in women with invasive breast cancer diagnosed > 2 years after urine collection, and a significant inverse association in women with a breast cancer diagnosis > 8 years after urine collection (OR, 0.17; 95% CI, 0.04-0.71; Ptrend = 0.01). There were no important variations in ORs by tumor stage or hormone receptor status of breast tumors. Conclusion: Overall, we observed a positive association between aMT6s and risk of breast cancer. However, there was some evidence to suggest that this might be driven by the influence of subclinical disease on melatonin levels, with a possible inverse association among women diagnosed further from recruitment. Thus, the influence of lag time on the association between melatonin and breast cancer risk needs to be evaluated in further studies. Cancer Epidemiol Biomarkers Prev; 19(3); BACKGROUND The nocturnal decline of blood pressure (BP) is almost coincident with the elevation of melatonin, which may exert vasodilatating and hypotensive effects. In this study we investigated whether prolonged nocturnal administration of melatonin could influence the daily rhythm of BP in women.
METHODS In a randomized double-blind study, 18 women, 47 to 63 years of age and with normal BP (N = 9) or treated essential hypertension (N = 9), received a 3-week course of a slow-release melatonin pill (3 mg) or placebo 1 h before going to bed. They were then crossed over to the other treatment for another 3 weeks. In each woman, ambulatory BP was recorded for 41 h at baseline and at the end of each treatment period. RESULTS In comparison with placebo, melatonin administration did not influence diurnal BP but did significantly decrease nocturnal systolic (-3.77 ± 1.7 mm Hg, P = .0423), diastolic (-3.63 ± 1.3 mm Hg, P = .0153), and mean (-3.71 ± 1.3 mm Hg, P = .013) BP without modifying heart rate. The effect was inversely related to the day-night difference in BP. CONCLUSION These data indicate that prolonged administration of melatonin may improve the day-night rhythm of BP, particularly in women with a blunted nocturnal decline BACKGROUND Sleep duration has been hypothesized to be inversely associated with breast cancer risk, possibly due to greater overall melatonin production in longer sleepers. However, data are inconclusive from the three studies conducted in Western populations on sleep duration and breast cancer risk. METHODS We investigated the relationship between self-reported usual sleep duration determined at baseline and subsequent risk of breast cancer in the prospective, population-based cohort of the Singapore Chinese Health Study. We excluded from the study women with < 2 years of follow-up due to possible change in sleep pattern among breast cancer cases close to the time of diagnosis. Five hundred and twenty-five incident cases of breast cancer were identified among the remaining 33,528 women after up to 11 years of follow-up.
RESULTS Among women postmenopausal at baseline, breast cancer risk decreased with increasing sleep duration (P trend = 0.047); those who reported 9+ h of sleep showed a relative risk of 0.67 (95% confidence interval = 0.4-1.1) compared with women who reported ≤6 h of sleep. This inverse association was observed primarily in lean women [i.e., body mass index below the median value (23.2 kg/m²)] (P = 0.024). In this study population, irrespective of gender, urinary 6-sulfatoxymelatonin levels increased with increasing self-reported hours of sleep (P trend = 0.035) after adjustment for age and time of day of urine collection. Melatonin levels were 42% higher in those with 9+ versus those with ≤6 h of sleep. CONCLUSION Sleep duration may influence breast cancer risk, possibly via its effect on melatonin levels Purpose Recent data indicate that night shift work is associated with increased endometrial cancer risk, perhaps through a pathway involving lower melatonin production. Melatonin is an antiestrogenic hormone, with production in a circadian pattern that is dependent on the presence of dark at night. Sleep duration is positively associated with melatonin production and may be an indicator of melatonin levels in epidemiologic studies. Methods We evaluated associations between self-reported sleep duration and endometrial cancer risk using publicly available prospective data on 48,725 participants in the Women's Health Initiative Observational Study, among whom 452 adjudicated incident cases of endometrial cancer were diagnosed over approximately 7.5 years of follow-up. Sleep duration was self-reported at baseline. Cox proportional hazards regression was used to estimate hazard ratios (HR) and 95% confidence intervals (CI) for endometrial cancer risk with adjustment for potential confounders.
Results Most women reported sleeping ≤6 (33.3%) or 7 (38.5%) h each night; fewer reported sleeping 8 (23.4%) or ≥9 (4.8%) h each night. In adjusted analyses, there was an indication of reduced risk associated with longer sleep duration, though no statistically significant association was observed. Women who slept ≥9 h had a nonsignificant reduced risk of endometrial cancer compared with women who slept ≤6 h (HR = 0.87; 95% CI = 0.51-1.46). Conclusions We found weak evidence of an association between sleep duration and endometrial cancer risk. Self-reported sleep duration may not adequately represent melatonin levels; thus, further studies utilizing urinary melatonin levels are necessary to establish the mechanism by which night shift work increases endometrial cancer risk Five patients with winter depression received low doses of melatonin in the afternoon, and five patients received placebo capsules. Melatonin treatment significantly decreased depression ratings compared to placebo. If these findings are replicated in a larger sample with documentation of expected phase shifts, the phase shift hypothesis will be substantially supported Recent studies suggest that the pineal hormone melatonin may reduce chemotherapy-induced immune and bone marrow damage. In addition, melatonin may exert potential oncostatic effects either by stimulating host anticancer immune defenses or by inhibiting tumor growth factor production. On this basis, we have performed a randomized study of chemotherapy alone vs. chemotherapy plus melatonin in advanced non-small cell lung cancer (NSCLC) patients with poor clinical status. The study included 70 consecutive advanced NSCLC patients who were randomized to receive chemotherapy alone with cisplatin (20 mg/m2/day i.v. for 3 days) and etoposide (100 mg/m2/day i.v. for 3 days) or chemotherapy plus melatonin (20 mg/day orally in the evening). Cycles were repeated at 21-day intervals.
Clinical response and toxicity were evaluated according to World Health Organization criteria. A complete response (CR) was achieved in 1/34 patients concomitantly treated with melatonin and in none of the patients receiving chemotherapy alone. Partial response (PR) occurred in 10/34 and in 6/36 patients treated with or without melatonin, respectively. Thus, the tumor response rate was higher in patients receiving melatonin (11/34 vs. 6/36), without, however, statistically significant differences. The percentage of 1-year survival was significantly higher in patients treated with melatonin plus chemotherapy than in those who received chemotherapy alone (15/34 vs. 7/36, P < 0.05). Finally, chemotherapy was well tolerated in patients receiving melatonin, and in particular the frequency of myelosuppression, neuropathy, and cachexia was significantly lower in the melatonin group. This study shows that the concomitant administration of melatonin may improve the efficacy of chemotherapy, mainly in terms of survival time, and reduce chemotherapeutic toxicity in advanced NSCLC, at least in patients in poor clinical condition OBJECTIVES The anticancer activity of the indole melatonin has been attributed to its immunomodulatory, anti-proliferative and anti-oxidant effects, whereas at present no data are available about its possible influence on angiogenesis, which has been shown to be one of the main biological mechanisms responsible for tumor dissemination. Vascular endothelial growth factor (VEGF) is the most active angiogenic factor, and the evidence of abnormally high blood levels of VEGF has been proven to be associated with poor prognosis in cancer patients. To investigate the influence of melatonin on angiogenesis, in this preliminary study we have evaluated the effects of melatonin therapy on VEGF blood levels in advanced cancer patients.
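The 1-year survival comparison reported above (15/34 alive with melatonin plus chemotherapy vs. 7/36 with chemotherapy alone, P < 0.05) can be roughly reproduced with a Pearson chi-square test on the 2×2 table. The trial abstract does not state which test the authors used, so the stdlib sketch below is an illustrative recomputation only; for 1 degree of freedom the chi-square tail probability reduces to the complementary error function.

```python
import math

# 2x2 table of 1-year survival from the reported counts (illustrative only):
# melatonin + chemotherapy arm: 15 alive, 19 dead
# chemotherapy-alone arm:        7 alive, 29 dead
table = [[15, 34 - 15],
         [7, 36 - 7]]

row = [sum(r) for r in table]                # row totals: 34, 36
col = [sum(c) for c in zip(*table)]          # column totals: 22, 48
n = sum(row)                                 # 70 patients

# Pearson chi-square statistic: sum of (observed - expected)^2 / expected
chi2 = sum(
    (table[i][j] - row[i] * col[j] / n) ** 2 / (row[i] * col[j] / n)
    for i in range(2) for j in range(2)
)

# Survival function of chi-square with 1 df: P(X > x) = erfc(sqrt(x / 2))
p = math.erfc(math.sqrt(chi2 / 2))
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # p falls below 0.05, consistent with the abstract
```

A Yates-corrected or exact test would give a slightly larger p-value, but the conclusion at the 0.05 level is unchanged for these counts.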
MATERIAL & METHODS The study included 20 metastatic patients, who had progressed on previous conventional antitumor therapies and for whom no other effective treatment was available. Melatonin was given orally at 20 mg/day in the evening for at least 2 months. Serum levels of VEGF were measured by an enzyme immunoassay on venous blood samples collected at 15-day intervals. RESULTS The clinical response consisted of minor response (MR) in 2, stable disease (SD) in 6 and progressive disease (PD) in the remaining 12 patients. VEGF mean levels decreased on therapy, without, however, statistical differences with respect to the pre-treatment values. In contrast, by evaluating changes in VEGF levels in relation to the clinical response, non-progressing patients (MR + SD) showed a significant decline in VEGF mean concentrations, whereas no effect was achieved in progressing patients. CONCLUSIONS This study, by showing that melatonin-induced control of the neoplastic growth is associated with a decline in VEGF secretion, would suggest that the pineal hormone may control tumor growth at least in part by acting as a natural anti-angiogenic molecule, with a consequent opposition of angiogenesis-dependent cancer proliferation Experimental data have suggested that the pineal hormone melatonin (MLT) may counteract chemotherapy-induced myelosuppression and immunosuppression. In addition, MLT has been shown to inhibit the production of free radicals, which play a part in mediating the toxicity of chemotherapy. A study was therefore performed in an attempt to evaluate the influence of MLT on chemotherapy toxicity. The study involved 80 patients with metastatic solid tumors who were in poor clinical condition (lung cancer: 35; breast cancer: 31; gastrointestinal tract tumors: 14).
Lung cancer patients were treated with cisplatin and etoposide, breast cancer patients with mitoxantrone, and gastrointestinal tract tumor patients with 5-fluorouracil plus folates. Patients were randomised to receive chemotherapy alone or chemotherapy plus MLT (20 mg/day p.o. in the evening). Thrombocytopenia was significantly less frequent in patients concomitantly treated with MLT. Malaise and asthenia were also significantly less frequent in patients receiving MLT. Finally, stomatitis and neuropathy were less frequent in the MLT group, albeit without statistically significant differences. Alopecia and vomiting were not influenced by MLT. This pilot study seems to suggest that the concomitant administration of the pineal hormone MLT during chemotherapy may prevent some chemotherapy-induced side-effects, particularly myelosuppression and neuropathy. Evaluation of the impact of MLT on chemotherapy efficacy will be the aim of future clinical investigations Sleep disturbance is common in major depressive disorder (MDD), and is often characterized by early-morning waking. Melatonin is a hypnotic and synchronizes circadian rhythms. It may also be an antidepressant. The melatonin agonists, ramelteon and agomelatine, have hypnotic and antidepressant properties, but there is a dearth of trials investigating the use of melatonin in MDD. This randomized, controlled trial aimed to determine whether exogenous melatonin is a sleep promoter and antidepressant. Thirty-three participants with a Diagnostic and Statistical Manual of Mental Disorders (fourth edition) diagnosis of MDD and early-morning waking were selected for a 4-week, randomized, double-blind trial of slow-release melatonin (6 mg; vs. placebo) given at bedtime over 4 weeks. Sleep was measured subjectively using sleep diaries and the Leeds Sleep Evaluation Questionnaire and objectively using wrist actigraphy. Of the 33 participants, 31 completed the trial.
General Linear Modelling showed significant improvements in depression and sleep over time, but this was not specific to melatonin. However, there was a trend towards an improvement in mood with melatonin, and no adverse side effects were observed. In conclusion, melatonin may be beneficial for treating MDD; it seems to be safe and well tolerated, but its potential for treating depression in people who do not wish to take antidepressants requires further evaluation Background: Melatonin shows potential oncostatic activity and is acutely suppressed by light exposure. Some evidence suggests an association between night work and breast cancer risk, possibly through the melatonin pathway. Methods: In a cohort of premenopausal nurses, we prospectively studied the relation between rotating night shift work and breast cancer risk. The total number of months during which the nurses worked rotating night shifts was first assessed at baseline in 1989 and periodically updated thereafter. We used Cox proportional hazards models to calculate relative risks (RRs) and 95% confidence intervals (CIs). Results: Among 115,022 women without cancer at baseline, 1,352 developed invasive breast cancer during 12 years of follow-up. Women who reported more than 20 years of rotating night shift work experienced an elevated relative risk of breast cancer compared with women who did not report any rotating night shift work (multivariate RR = 1.79; 95% CI = 1.06-3.01). There was no increase in risk associated with fewer years of rotating night work. Conclusion: Our results suggest a modestly elevated risk of breast cancer after longer periods of rotating night work. Additional studies are warranted to rule out small sample size or uncontrolled sources of confounding as alternative explanations PURPOSE Fatigue and depression typically occur together in cancer patients, suggesting a common etiology, perhaps based on serotonin.
This randomized clinical trial tested whether paroxetine, a selective serotonin reuptake inhibitor antidepressant known to modulate brain serotonin, would reduce fatigue in cancer patients and whether any reduction was related to depression. PATIENTS AND METHODS Cancer patients undergoing chemotherapy for the first time were assessed for fatigue. Of 704 patients who reported fatigue at their second chemotherapy cycle, 549 patients were randomly assigned to receive either 20 mg of oral paroxetine hydrochloride daily or placebo for 8 weeks. The assessments of fatigue and depression were performed at cycles 3 and 4 of chemotherapy. RESULTS A total of 244 patients treated with paroxetine and 235 patients treated with placebo provided assessable data. No difference was detected in fatigue between patient groups. At the end of the study, there was a difference between groups in the mean level of depression (Center for Epidemiologic Studies Depression scores, 12.0 v 14.8, respectively; P < .01). CONCLUSION Paroxetine had no influence on fatigue in patients receiving chemotherapy. A possible explanation is that cancer-related fatigue does not involve a reduction in brain 5-HT levels Several experiments have demonstrated that the pineal gland plays a physiological anticancer role. Melatonin (MLT), its most investigated hormone, is a natural anticancer agent. However, MLT would not be the only endocrine molecule responsible for the anticancer property of the pineal gland. In fact, another pineal indole hormone, 5-methoxytryptamine (5-MTT), has appeared in vitro to exert an antitumour activity superior to that of MLT itself. Previous studies have already shown the therapeutic anticancer action of MLT in association with chemotherapy also in human neoplasms.
This study was performed to evaluate the influence of 5-MTT at physiological doses (1 mg/day orally during the light phase) on the efficacy of chemotherapy with cisplatin plus etoposide in advanced non-small cell lung cancer patients, with respect to that obtained in patients treated by chemotherapy alone or chemotherapy plus pharmacological doses of MLT (20 mg/day orally during the dark phase of the day). The study included 100 patients, who were randomised to receive chemotherapy alone or in association with MLT or 5-MTT. The overall response rate achieved in patients concomitantly treated with either MLT or 5-MTT was significantly higher with respect to that obtained in patients treated with chemotherapy alone. Moreover, both MLT and 5-MTT significantly reduced some chemotherapy-related toxicities, namely thrombocytopenia and neurotoxicity. This preliminary study shows that the less known pineal hormone 5-MTT may exert at low doses the same anticancer therapeutic effect in association with chemotherapy which may be obtained by pharmacological doses of the most investigated pineal hormone MLT Abstract Age-related changes in nutritional status can play an important role in brain functioning. Specific nutrient deficiencies in the elderly may exacerbate pathological processes in the brain. Consequently, the potential of nutritional intervention to prevent or delay cognitive impairment and the development of dementia is an important topic. A randomized, double-blind, placebo-controlled trial has been performed in 25 elderly subjects (86 ± 6 years, 20 females, 5 males) with mild cognitive impairment (MCI). These subjects were randomly assigned to supplement their diet with either an oily emulsion of docosahexaenoic acid (DHA)-phospholipids containing melatonin and tryptophan (11 subjects) or a placebo (14 matched subjects) for 12 weeks.
The main aim of this study was to evaluate the efficacy of the dietary supplement on cognition, by assessment at the start and after 12 weeks of: (1) Orientation and other cognitive functions: Mini-Mental State Examination (MMSE); (2) Short-term memory: digit, verbal, and spatial span (digit span; verbal span; Corsi's test); (3) Long-term memory: Rey's auditory-verbal learning test; 'short story' test; Rey-Osterrieth complex figure (recall); (4) Attentional abilities: attentive matrices; (5) Executive functions: Weigl's sorting test; phonological fluency 'FAS'; (6) Visuo-constructional and visuo-spatial abilities: copy of simple drawings; Rey-Osterrieth complex figure (copy); (7) Language: semantic fluency; (8) Mood: Geriatric Depression Scale (GDS). Moreover, the Sniffin' Sticks olfaction test and Mini Nutritional Assessment (MNA) were performed. After 12 weeks, a significant treatment effect for the MMSE (P < 0.001) and a positive trend for semantic verbal fluency were found in the supplement group (P < 0.06). A significant treatment effect was found for the olfactory sensitivity assessment (P < 0.009). As regards the nutrition evaluation, after 12 weeks of treatment the supplemented group showed an improvement in the MNA score, with a significant difference relative to placebo (P < 0.005). Older adults with MCI had significant improvements in several measures of cognitive function when supplemented with an oily emulsion of DHA-phospholipids containing melatonin and tryptophan for 12 weeks, compared with the placebo Background and aims: The aim of the study was to evaluate the effect of melatonin administration on sleep and behavioral disorders in the elderly and the facilitation of the discontinuation of regular hypnotic drugs. Methods: This was a prospective, randomized, double-blind, placebo-controlled, crossover trial in a community-living population.
Participants were 22 older adults (7 men, 15 women over 65) with a history of sleep disorder complaints. Fourteen of these subjects were receiving hypnotic drug therapy. Participants received 2 months of melatonin (5 mg/day) and 2 months of placebo. Sleep disorders were evaluated with the Northside Hospital Sleep Medicine Institute (NHSMI) test, discarding secondary insomnia and evaluating sleep quality. Behavioral disorders were evaluated with the Yesavage Geriatric Depression Scale (GDS) and the Goldberg Anxiety Scale (GAS). Patients discontinuing hypnotic drugs were also recorded. Results: Melatonin treatment for two months significantly improved sleep quality scores measured by the NHSMI test (1.78 ± 0.40) when compared with both the basal (3.72 ± 0.45; p = 0.001) and placebo (3.44 ± 0.56; p = 0.025) groups. Depression measured by the GDS and anxiety measured by the GAS also improved significantly after melatonin administration (p = 0.043 and p = 0.009, respectively). Nine out of 14 subjects receiving hypnotic drugs were able to discontinue this treatment during melatonin but not placebo administration; one discontinued hypnotic drugs during both melatonin and placebo administration, and four were unable to discontinue hypnotic therapy. Conclusions: The results of this study suggest that melatonin administration significantly improves sleep and behavioral disorders in the elderly and facilitates discontinuation of therapy with conventional hypnotic drugs Objective. The aim of this study was to assess cognitive function, quality of life, and psychological distress after surgery for early breast cancer but before initiation of adjuvant treatment. Material and methods. We performed a population-based study in the county of North Jutland, Denmark, including 124 women aged less than 60 years who had surgery for early breast cancer from 2004-2006.
They were compared with an age-matched group of 224 women without previous cancer selected randomly from the same population. The cognitive function of patients and controls was tested using a revised battery from the ISPOCD study. Data were collected on quality of life (EORTC QLQ-C30) and psychological distress (POMS). Results: The neuropsychological tests did not reveal significant differences between patients and controls. Compared to the control group, breast cancer patients had a significant 3–4-fold increased risk of experiencing cognitive impairment. Quality of life and psychological distress were also significantly poorer among patients. Conclusion: This study demonstrated that women diagnosed with breast cancer experience a significant deterioration of their perceived cognitive functioning, quality of life and psychological well-being. We previously reported an inverse association between sleep duration and breast cancer risk in the prospective, population-based Singapore Chinese Health Study (SCHS) cohort (Wu et al., Carcinogenesis 2008;29:1244–8). Sleep duration was significantly positively associated with 6-sulfatoxymelatonin (aMT6s) levels determined in a spot urine, but aMT6s levels in breast cancer cases were lacking (Wu et al., Carcinogenesis 2008;29:1244–8). We updated the sleep duration–breast cancer association with 14 years of follow-up of 34,028 women in the SCHS. In a nested case–control study conducted within the SCHS, randomly timed, prediagnostic urinary aMT6s concentrations were compared between 248 incident breast cancer cases and 743 individually matched cohort controls. Three female controls were individually matched to each case on age at baseline interview (within 3 years), dialect group, menopausal status, date of baseline interview (within 2 years), date of urine sample collection (within 6 months) and timing of urine collection during the day (within 1 hr).
Cox proportional hazards and conditional regression models with appropriate adjustment for confounders were used to examine the sleep– and aMT6s–breast cancer relationships. Breast cancer risk was not significantly associated with sleep duration; the adjusted odds ratio (OR) for 9+ vs. ≤6 hr is 0.89 [95% confidence interval (95% CI) = 0.64–1.22]. Prediagnostic aMT6s levels did not differ between breast cancer cases and matched controls; the adjusted OR for highest versus lowest quartiles is 1.00 (95% CI = 0.64–1.54). We conclude that sleep duration is not significantly associated with breast cancer risk reduction. Melatonin levels derived from randomly timed spot urine are unrelated to breast cancer. Randomly timed, spot urine-derived melatonin levels are noninformative as surrogates of nocturnal melatonin production. Objective: The present study aims to document the frequency of use of hypnotic medication among a large sample of randomly selected patients having been treated for various types of cancer, as well as to identify the sociodemographic, psychosocial, and medical factors that characterize the users of this type of medication. Methods: Five thousand patients who had received treatment for breast, prostate, lung, or colorectal cancer at L'Hôtel-Dieu de Québec were solicited by mail to take part in this study. Among these patients, 1,984 (39.7%) agreed to complete a battery of questionnaires. Results: Overall, 22.6% of the patients were currently consuming hypnotic medication. Factors associated with greater utilization of hypnotic medication were older age, greater difficulties initiating sleep, more stressful life events experienced in the past 6 months, higher levels of anxiety, past or current psychological difficulties, poorer role functioning, less severe urinary symptoms, greater use of opioids, and past or current chemotherapy treatments.
Conclusions: These results are consistent with those of previous studies conducted in cancer patients in showing high rates of hypnotic medication use. Moreover, this study identified several factors that might help identify persons at risk of using this type of medication and, therefore, of experiencing the potential negative effects of chronic hypnotic use. OBJECTIVE: The authors' goal was to examine the hypnotic effects of slow-release melatonin during the initial 4 weeks of treatment with fluoxetine in 19 patients with major depressive disorder. METHOD: Twenty-four outpatients with major depressive disorder were included in the study; 19 completed the study. Ten patients were treated with fluoxetine plus slow-release melatonin and nine were given fluoxetine plus placebo in a double-blind protocol for 4 weeks. Response was assessed by using rating scales for depression and sleep. RESULTS: The 10 patients given slow-release melatonin reported significantly better scores on the Pittsburgh Sleep Quality Index than the nine patients given placebo. No significant differences in the rate of improvement in depressive symptoms were noted between the two groups. No particular side effects were noted from the combination of fluoxetine and slow-release melatonin. CONCLUSIONS: Slow-release melatonin was effective in improving the sleep of patients with major depressive disorder. Slow-release melatonin had no effect on the rate of improvement in symptoms of major depressive disorder. The authors conclude that the role of slow-release melatonin for sleep disturbances in major depressive disorder should be investigated further. Background and objective: To compare the perioperative effects of melatonin and midazolam given in premedication, on sedation, orientation, anxiety scores and psychomotor performance. Methods: Exogenous administration of melatonin not only facilitates the onset of sleep but also improves its quality.
A prospective, randomized, double-blind, placebo-controlled study was performed in 66 patients undergoing laparoscopic cholecystectomy. Patients were given melatonin 5 mg, midazolam 15 mg or placebo, 90 min before anaesthesia, sublingually. Sedation, orientation and anxiety were quantified before premedication; 10, 30, 60 and 90 min after premedication; and 15, 30, 60 and 90 min after admission to the recovery room. Neurocognitive performance was evaluated at these times, using the Trail Making A and B and Word Fluency tests. The differences between the groups were analysed by ANOVA. Two-way comparisons were performed by Scheffé analysis. Sedation and amnesia were analysed by the χ2 test. Results: Patients who received premedication with either melatonin or midazolam had a significant increase in sedation and decrease in anxiety before operation compared with controls. After operation, there was no difference in the sedation scores of the groups. Whereas the melatonin and midazolam groups exhibited significantly poorer performance in the Trail Making A and B tests compared with placebo at 30, 60 and 90 min after premedication, there were no significant differences among the groups in terms of neuropsychological performance after the operation. Amnesia was notable only in the midazolam group, for one preoperative event. Conclusion: Melatonin premedication was associated with preoperative anxiolysis and sedation without postoperative impairment of psychomotor performance. Exposure to light at night suppresses melatonin production, and night-shift work (a surrogate for such exposure) has been associated with an increased risk of breast cancer. However, the association between circulating melatonin levels and breast cancer risk is unclear.
In a prospective case-control study nested within the Nurses' Health Study II cohort, we measured the concentration of the major melatonin metabolite, 6-sulphatoxymelatonin (aMT6s), in the first morning urine of 147 women with invasive breast cancer and 291 matched control subjects. In logistic regression models, the relative risk (reported as the odds ratio [OR]) of invasive breast cancer for women in the highest quartile of urinary aMT6s compared with those in the lowest was 0.59 (95% confidence interval [CI] = 0.36 to 0.97). This association was essentially unchanged after adjustment for breast cancer risk factors or plasma sex hormone levels but was slightly weakened when the analysis included 43 case patients with in situ breast cancer and their 85 matched control subjects (OR = 0.70, 95% CI = 0.47 to 1.06). The exclusion of women who had a history of night-shift work left our findings largely unchanged. These prospective data support the hypothesis that higher melatonin levels, as measured in first morning urine, are associated with a lower risk of breast cancer. BACKGROUND: Experimental data from animals suggest a protective role for the pineal hormone melatonin in the etiology of breast cancer, but results from the few retrospective case-control studies that examined the association in humans have been inconsistent. To determine whether low levels of endogenous melatonin are associated with an increased risk for developing breast cancer, we conducted a prospective nested case-control study among British women. METHODS: Concentrations of 6-sulfatoxymelatonin, the main metabolite of melatonin in urine and a validated marker of circulating melatonin levels, were measured by radioimmunoassay in 24-hour urine samples collected from women shortly after enrollment in the prospective Guernsey III Study.
Levels of 6-sulfatoxymelatonin were compared among 127 patients diagnosed with breast cancer during follow-up and among 353 control subjects, matched for age, recruitment date, menopausal status, and day of menstrual cycle for premenopausal women or number of years postmenopausal for postmenopausal women. Associations were examined by analyses of covariance and conditional logistic regression. All tests of statistical significance were two-sided. RESULTS: No statistically significant differences in urinary 6-sulfatoxymelatonin concentrations were observed between women who developed breast cancer and control subjects among premenopausal or postmenopausal women (P=.8 and P=.9, respectively). When data from premenopausal and postmenopausal women were combined in a multivariable analysis adjusted for potential confounders and grouped into three categories defined by the 6-sulfatoxymelatonin tertiles of control subjects, the level of 6-sulfatoxymelatonin excreted was not statistically significantly associated with the risk of breast cancer (odds ratio [OR] for breast cancer = 0.95, 95% confidence interval [CI] = 0.55 to 1.65, comparing the middle category with the lowest category of 6-sulfatoxymelatonin concentration, and OR = 0.99, 95% CI = 0.58 to 1.70, comparing the highest category with the lowest category). CONCLUSION: We found no evidence that the level of melatonin is strongly associated with the risk for breast cancer. BACKGROUND: Melatonin has sedative, analgesic, anti-inflammatory, antioxidative, and chronobiotic effects. We determined the impact of oral melatonin premedication on anxiolysis, analgesia, and the potency of the rest/activity circadian rhythm. METHODS: This randomized, double-blind, placebo-controlled study included 33 patients, ASA physical status I–II, undergoing abdominal hysterectomy.
Patients were randomly assigned to receive either oral melatonin 5 mg (n = 17) or placebo (n = 16) the night before and 1 h before surgery. The analysis instruments were the Visual Analog Scale, the State-Trait Anxiety Inventory, and actigraphy. RESULTS: The number of patients who needed to be treated to prevent one additional patient reporting high postoperative anxiety or moderate to intense pain in the first 24 postoperative hours was 2.53 (95% CI, 1.41–12.22) and 2.20 (95% CI, 1.26–8.58), respectively. The number-needed-to-treat was 3 (95% CI, 1.35–5.0) to prevent high postoperative anxiety in patients with moderate to intense pain, compared with 7.5 (95% CI, 1.36–∞) in the absence of pain or with mild pain. Also, the treated patients required less morphine by patient-controlled analgesia, as assessed by repeated-measures ANOVA (F[1,31] = 6.05, P = 0.02). The rest/activity cycle, assessed by actigraphy, showed that the 24-h rhythmicity percentage was higher in the intervention group in the first week after discharge (21.16 ± 8.90 versus placebo 14.00 ± 7.10; t = −2.41, P = 0.02). CONCLUSIONS: These findings suggest that preoperative melatonin produced clinically relevant anxiolytic and analgesic effects, especially in the first 24 postoperative hours. Also, it improved the recovery of the potency of the rest/activity circadian rhythm. OBJECTIVE: This study compared the efficacy and safety of paroxetine and desipramine with those of placebo in the treatment of depressive disorders in adult women with breast cancer, stages I–IV. METHOD: In a double-blind, placebo-controlled study, 35 female outpatients with breast cancer and DSM-III-R major depression or adjustment disorder with depressed mood were randomly assigned to treatment with paroxetine (N=13), desipramine (N=11), or placebo (N=11) for 6 weeks.
Primary efficacy was assessed by change from baseline in score on the 21-item Hamilton Rating Scale for Depression (HAM-D), and the secondary outcome measure was change from baseline in the Clinical Global Impressions-Severity of Illness scale (CGI-S) score. RESULTS: Mean changes in the total HAM-D and CGI-S scores from baseline to the 6-week endpoint for the paroxetine and desipramine groups were not significantly different from those for the placebo-treated group. An unusually high rate of response (defined as ≥50% improvement in the HAM-D score) was observed in the placebo group (55% [N=6]); the rate at which adverse events precipitated patient discontinuation in the active treatment groups (9% [N=1] for desipramine, 15% [N=2] for paroxetine) was similar to that in the placebo-treated patients (18% [N=2]). Improvement on symptom dimensions within the HAM-D and the Hamilton Rating Scale for Anxiety (depressive, anxiety, cognitive, neurovegetative, or somatic) was also similar between groups. CONCLUSION: The small number of women in this study most likely contributed to the lack of differences in efficacy observed during the 6 weeks of treatment. Randomized, placebo-controlled trials of adequate power seeking to determine the efficacy of antidepressants in the United States for the treatment of women with breast cancer and comorbid depression remain of paramount importance. The prognosis of brain glioblastoma is still very poor, and the median survival time is generally less than 6 months. At present, no chemotherapy has appeared to influence its prognosis. On the other hand, recent advances in brain tumor biology have suggested that brain tumor growth is at least in part under neuroendocrine control, mainly realized by opioid peptides and pineal substances.
On this basis, we evaluated the influence of concomitant administration of the pineal hormone melatonin (MLT) in patients with glioblastoma treated with radical or adjuvant radiotherapy (RT). The study included 30 patients with glioblastoma, who were randomized to receive RT alone (60 Gy) or RT plus MLT (20 mg/day orally) until disease progression. Both the survival curve and the percentage surviving at 1 year were significantly higher in patients treated with RT plus MLT than in those receiving RT alone (6/14 vs. 1/16). Moreover, RT- or steroid therapy-related toxicities were lower in patients concomitantly treated with MLT. This preliminary study suggests that a radioneuroendocrine approach with RT plus the pineal hormone MLT may prolong the survival time and improve the quality of life of patients affected by glioblastoma. BACKGROUND: Depression is a common problem in patients with Delayed Sleep Phase Syndrome (DSPS). This study used a randomized, double-blind, crossover, placebo-controlled approach to test the hypothesis that exogenous melatonin (5 mg) can attenuate depressive symptomatology in DSPS patients. METHODS: Twenty patients with an established diagnosis of DSPS were dichotomized into DSPS with depressive symptoms (Group I; n=8) and without depressive symptoms (Group II; n=12) based on structured clinical interviews and a score greater than 17 on the Center for Epidemiologic Studies Depression Scale (CES-D). Both groups received melatonin and placebo treatment for 4 weeks each, with a 1-week washout period in between. Participants underwent a clinical interview and psychometric evaluation to assess depression, and overnight polysomnographic sleep studies were carried out at baseline and at the end of the melatonin and placebo treatments. Furthermore, the melatonin secretion rhythm, as a circadian phase marker, was assessed by measuring urinary 6-sulphatoxymelatonin levels.
RESULTS: Melatonin treatment significantly reduced depression scores in the depressed patients as measured by the CES-D and the Hamilton Depression Rating Scale-17. Melatonin treatment improved sleep continuity in both groups compared to the placebo and baseline conditions. Group I individuals showed marked alterations in melatonin rhythms compared to Group II individuals. CONCLUSION: Exogenous melatonin treatment may be an effective treatment modality for individuals with circadian rhythm sleep disorders and associated comorbid depressive symptomatology.
2,085
28,185,589
Results: When considering what communities have, articles reported external linkages as the most frequently gained resource, especially when partnerships resulted in more community power over the intervention. In contrast, financial assets were the least mentioned, despite their importance for sustainability. With how communities act, articles discussed the challenges of ensuring inclusive participation and detailed strategies to improve inclusiveness. Very little was reported about strengthening community cohesiveness and collective efficacy, despite their importance in community initiatives. When reviewing for whom communities act, the importance of strong local leadership was mentioned frequently, while conflict resolution strategies and skills were rarely discussed. Synergies were found across these elements of community capability, with tangible success in one area leading to positive changes in another. Access to information and opportunities to develop skills were crucial to community participation, critical thinking, problem solving and ownership. Conclusions: Strengthening community capability is critical to ensuring that community participation leads to genuine empowerment.
Background: Community capability is the combined influence of a community's social systems and collective resources that can address community problems and broaden community opportunities. We frame it as consisting of three domains that together support community empowerment: what communities have; how communities act; and for whom communities act.
The present study aimed: (1) to assess and improve the level of women's involvement in a strategy to control onchocerciasis by community-directed treatment with ivermectin (CDTI) in three parishes of Rukungiri District, Uganda; (2) to measure the performance of female community-directed health workers (CDHWs) in comparison with males; and (3) to identify culturally acceptable means of enhancing women's involvement in community-directed healthcare. Health education sessions were used to instruct community members to select female CDHWs in Masya Parish and to stress their potential importance in Karangara Parish; this subject was not raised in Mukono Parish. In all, 403 mature women who were randomly selected from the three parishes were interviewed as to their: (1) knowledge of the classes of people not eligible to take ivermectin; (2) knowledge and beliefs about the benefits of ivermectin; (3) participation in decision-making; and (4) attitudes on the performance of female CDHWs. For analysis, the respondents were divided into: (1) those who had or had not taken ivermectin treatment during the previous year; and (2) those who had or had not attended health education sessions. During the period when face-to-face interviews with women in randomly selected households were being carried out, participatory evaluation meetings (PEMs) were conducted in selected communities from the same parishes in order to reach a consensus on issues which could not easily be included in individual face-to-face interviews. Participant observations were also made regarding how communities selected their CDHWs, how the CDHWs organised the distribution exercise and treated community members, and how the CDHWs kept records, in order to understand issues which were deliberately hidden from the researchers during face-to-face interviews and PEMs.
Significantly, the women who had been treated or health educated in Masya Parish, compared with those from Karangara and Mukono parishes: (1) were more knowledgeable about the groups who were not supposed to be treated; (2) were aware of women's involvement in the mobilisation of other community members; (3) were involved in CDTI decision-making; and (4) had a better attitude towards female CDHWs' performance relative to that of males. There were no differences between the attitudes of women in Karangara and Mukono parishes towards the performance of female CDHWs. Face-to-face interviews and records from all parishes indicated that female CDHWs achieved as good a coverage as their male counterparts, and sometimes better, in about the same time. Health education increased the number of female CDHWs from nine to 52 in Masya Parish, from 7 to 22 in Karangara Parish and from 6 to 20 in Mukono Parish. Health education improved the attitude of women towards female CDHWs, but the actual experience of having and observing female CDHWs in action in Masya Parish had a more significant positive impact on the womenfolk, as well as on the rest of the community members, and created an impetus for more of them to become actively involved in actual ivermectin distribution. The present authors conclude that recruiting more female CDHWs and supervisors would reduce the current male domination of the health delivery services, greatly strengthening the activities of CDTI programmes. Background: Few large and rigorous evaluations of participatory interventions systematically describe their context and implementation, or attempt to explain the mechanisms behind their impact. This study reports process evaluation data from the Ekjut cluster-randomised controlled trial of a participatory learning and action cycle with women's groups to improve maternal and newborn health outcomes in Jharkhand and Orissa, eastern India (2005-2008).
The study demonstrated a 45% reduction in neonatal mortality in the last two years of the intervention, largely driven by improvements in safe practices for home deliveries. Methods: A participatory learning and action cycle with 244 women's groups was implemented in 18 intervention clusters covering an estimated population of 114,141. We describe the context, content, and implementation of this intervention, identify potential mechanisms behind its impact, and report challenges experienced in the field. Methods included a review of intervention documents, qualitative structured discussions with group members and non-group members, meeting observations, as well as descriptive statistical analysis of data on meeting attendance, activities, and characteristics of group attendees. Results: Six broad, interrelated factors influenced the intervention's impact: (1) acceptability; (2) a participatory approach to the development of knowledge, skills and 'critical consciousness'; (3) community involvement beyond the groups; (4) a focus on marginalized communities; (5) the active recruitment of newly pregnant women into groups; (6) high population coverage. We hypothesize that these factors were responsible for the increase in safe delivery and care practices that led to the reduction in neonatal mortality demonstrated in the Ekjut trial. Conclusions: Participatory interventions with community groups can influence maternal and child health outcomes if key intervention characteristics are preserved and tailored to local contexts.
Scaling up such interventions requires (1) a detailed understanding of the way in which context affects the acceptability and delivery of the intervention; (2) planned but flexible replication of key content and implementation features; (3) strong support for participatory methods from implementing agencies. This article presents a detailed description of a community mobilization intervention involving women's groups in Mchinji District, Malawi. The intervention was implemented between 2005 and 2010. The intervention aims to build the capacities of communities to take control of the mother and child health issues that affect them. To achieve this, it comprises trained local female facilitators establishing groups and using a manual, participatory rural appraisal tools and picture cards to guide them through a community action cycle to identify and implement solutions to mother and child health problems. Significant resource inputs include salaries for facilitators and supervisors, and training, equipment and materials to support their work with groups. It is hypothesized that the groups will catalyse community collective action to address mother and child health issues and improve the health and reduce the mortality of mothers and children. Their impact, implementation and cost-effectiveness have been rigorously evaluated through a randomized controlled trial design. The results of these evaluations will be reported in 2011. OBJECTIVE: To determine the extent to which the community-directed approach used in onchocerciasis control in Africa could effectively and efficiently provide integrated delivery of other health interventions. METHODS: A three-year experimental study was undertaken in 35 health districts from 2005 to 2007 in seven research sites in Cameroon, Nigeria and Uganda. Four trial districts and one comparison district were randomly selected in each site.
All districts had established ivermectin treatment programmes, and in the trial districts four other established interventions (vitamin A supplementation, use of insecticide-treated nets, home management of malaria and short-course, directly observed treatment for tuberculosis patients) were progressively incorporated into a community-directed intervention (CDI) process. At the end of each of the three study years, we performed quantitative evaluations of intervention coverage and provider costs, as well as qualitative assessments of the CDI process. FINDINGS: With the CDI strategy, significantly higher coverage was achieved than with other delivery approaches for all interventions except short-course, directly observed treatment. The coverage of malaria interventions more than doubled. The district-level costs of delivering all five interventions were lower in the CDI districts, but no cost difference was found at the first-line health facility level. Process evaluation showed that: (i) participatory processes were important; (ii) recurrent problems with the supply of intervention materials were a major constraint to implementation; (iii) the communities and community implementers were deeply committed to the CDI process; (iv) community implementers were more motivated by intangible incentives than by external financial incentives. CONCLUSION: The CDI strategy, which builds upon the core principles of primary health care, is an effective and efficient model for integrated delivery of appropriate health interventions at the community level in Africa. A study of knowledge, attitudes and practice was carried out in the Rukungiri district of Uganda, in order to investigate the involvement of women in community-directed treatment with ivermectin (CDTI) for the control of onchocerciasis.
The data analysed came from interviews with 260 adult women (one from each of 260 randomly selected households in 20 onchocerciasis-endemic communities), community informants, and participatory evaluation meetings (PEMs) in eight communities. The women who had been treated with ivermectin in 1999 generally had more knowledge of the benefits of taking ivermectin, were more likely to have attended the relevant health education sessions and were more involved in community decisions on the method of ivermectin distribution than the women who had not received ivermectin in that year. There were fewer female community-directed health workers (CDHWs) than male CDHWs in the communities investigated. The reasons for not attending health education sessions, not participating in community meetings concerning the CDTI, and the reluctance of some women to serve as CDHWs were investigated. The most common reasons given were domestic chores, a reluctance to express their views in meetings outside their own kinship group, suspicions that other women might take advantage of them, and a lack of interest. Most of the women interviewed (as well as other community members) felt that there were relatively few female CDHWs. The women attributed this to a lack of interaction and trust amongst themselves, which resulted in more men than women being selected as CDHWs. The rest of the community members were not against women working as CDHWs. It is recommended that communities be encouraged to select women to serve as CDHWs in the CDTI, and that the performances of male and female CDHWs be compared. Background: Neonatal mortality rates are high in rural Nepal, where more than 90% of deliveries are in the home. Evidence suggests that death rates can be reduced by interventions at the community level. We describe an intervention which aimed to harness the power of community planning and decision making to improve maternal and newborn care in rural Nepal.
Methods: The development of 111 women's groups in a population of 86,704 in Makwanpur district, Nepal, is described. The groups, facilitated by local women, were the intervention component of a randomized controlled trial to reduce perinatal and neonatal mortality rates. Through participant observation and analysis of reports, we describe the implementation of this intervention: the community entry process and the facilitation of monthly meetings through a participatory action cycle of problem identification, community planning, and implementation and evaluation of strategies to tackle the identified problems. Results: In response to the needs of the groups, participatory health education was added to the intervention, and the women's groups developed varied strategies to tackle problems of maternal and newborn care: establishing mother and child health funds, producing clean home delivery kits and operating stretcher schemes. Close linkages with community leaders and community health workers improved strategy implementation. There were also indications of positive effects on group members and health services, and most groups remained active after 30 months. Conclusion: A large-scale and potentially sustainable participatory intervention with women's groups, which focused on pregnancy, childbirth and the newborn period, resulted in innovative strategies identified by local communities to tackle perinatal care problems. The challenges of community-directed treatment with ivermectin (CDTI) for onchocerciasis control in Africa have been: maintaining the desired treatment coverage, demand for monetary incentives, high attrition of community distributors and low involvement of women. This study assessed how these challenges could be minimised and performance improved using existing traditional kinship structures.
In classic CDTI areas, community members decide upon selection criteria for community distributors, centers for health education and training, and methods of distributing ivermectin. In kinship-enhanced CDTI, similar procedures were followed at the kinship level. We compared 14 randomly selected kinship-enhanced CDTI communities with 25 classic CDTI communities through interviews of 447 and 750 household members and 127 and 64 community distributors, respectively. Household respondents from kinship-enhanced CDTI reported better performance (P<0.001) than classic CDTI on the following measures of program effectiveness: (a) treatment coverage, (b) decision on treatment location, and (c) mobilization for CDTI activities. There were more female distributors in kinship-enhanced CDTI than in classic CDTI. Attrition was not a problem. Kinship-enhanced CDTI had a higher number of community distributors per population working among relatives, and its distributors were more likely to be involved in additional health care activities. The results suggest that kinship-enhanced CDTI was more effective than classic CDTI. In Uganda, human onchocerciasis is controlled by annual, mass, community-directed, ivermectin-treatment programmes (CDITP) in all endemic communities where the prevalence of the disease is ≥30%. This is a practical, long-term and cost-effective strategy. In some communities, this system succeeds in providing treatment at the desired level of coverage (i.e. 90% of the annual treatment objective, which is itself equivalent to all those individuals eligible to take ivermectin). Other communities, however, fail to reach this target. The aim of the present study was to determine the factors that were significantly associated with success or failure in achieving this target.
The data analysed were answers to a questionnaire completed by 10 household heads randomly selected from each of 64 randomly selected endemic communities (of which 36 succeeded and 28 failed to reach their coverage target) in the four districts of Kabale, Moyo, Nebbi and Rukungiri. Among the programme-related factors investigated, success was associated, at a statistically significant level (P ≤ 0.05), with involvement of community members in: (1) decisions about the execution of the programme; (2) attendance at health-education sessions; (3) selection of the community-based distributors (CBD); and (4) rewarding CBD in kind. In general, the involvement of community members in the planning and execution of a CDITP (and the resultant sense of pride in community ownership) was more likely to produce successful results than when external health workers or even community leaders or local councils took responsibility. OBJECTIVES To assess and compare the effectiveness of ivermectin distributors in attaining 90% treatment coverage of the eligible population with each additional health activity they take up. METHODS Random sampling was applied every year to select distributors for interviews in community-directed treatment with ivermectin (CDTI) areas of Cameroon and Uganda. A total of 288 distributors in 2004, 357 in 2005 and 348 in 2006 were interviewed in Cameroon, and 706, 618 and 789 in Uganda, respectively. The questions covered treatment coverage, involvement in additional activities, where and for how long these activities were provided, and whether they were supervised. RESULTS At least 70% of the distributors in Cameroon and Uganda during the study period were involved in CDTI and additional health activities. More of the distributors involved in CDTI alone attained 90% treatment coverage than those who combined CDTI with additional health activities.
The more additional activities they took on, the less likely the distributors were to attain 90% treatment coverage. In Uganda, distributors were more likely to attain 90% coverage (P < 0.001) if they worked within 1 km of their homesteads, were selected by community members, worked among kindred, and were responsible for fewer than 20 households. CONCLUSION Additional activities could potentially undermine the performance of distributors. However, being selected by their community members, working largely among kindred and serving fewer households improved their effectiveness. Background Studies in India have reported a high prevalence of nutritional anemia among children and adolescent girls. Nutritional anemia is associated with impaired mental, physical, and cognitive performance in children and is a significant risk factor for maternal mortality. Objective To evaluate the effect of a community-led initiative for control of nutritional anemia among children 6 to 35 months of age and unmarried rural adolescent girls 12 to 19 years of age. Methods This Participatory Action Research was done in 23 villages of the Primary Health Centre, Anji, in Wardha District of Maharashtra. In February and March 2008, a needs assessment was undertaken by interviewing the mothers of 261 children and 260 adolescent girls. Hemoglobin levels of adolescent girls and children were measured with the use of the hemoglobin color scale. The girls were given weekly iron-folic acid tablets, and the children were given daily liquid iron prophylaxis for 100 days in a year through community participation. The adolescent girls and the mothers of the children and adolescent girls were also given nutritional education on the benefits and side effects of iron supplementation. In June and July 2008, follow-up assessment was performed by survey and force field analysis.
Results There was a significant reduction in the prevalence of nutritional anemia, from 73.8% to 54.6% among the adolescent girls and from 78.2% to 64.2% among the children. There was improvement in awareness of iron-rich food items among the adolescent girls and the mothers of the children. The benefits to girls, such as increase in appetite and reduction in scanty menses, tiredness, and weakness, acted as positive factors leading to better compliance with weekly iron supplementation. The benefits to children perceived by the mothers, such as increase in appetite, weight gain, reduction in irritability, and reduction in mud-eating behavior, acted as a dominant positive force and generated demand for iron syrup. Conclusions The community-led initiative for once-weekly iron supplementation for adolescent girls and iron prophylaxis for children, in addition to nutritional education, improved the hemoglobin status of children 6 to 35 months of age and unmarried rural adolescent girls 12 to 19 years of age. This three-phase study characterized, validated, and applied community capacity domains in a health communication project evaluation in Zambia. Phase I explored community capacity domains from community members' perspectives (16 focus groups, 14 in-depth interviews, 4 sites). These were validated in Phase II with 720 randomly selected adults. The validated domains were incorporated into a program evaluation survey (2,462 adult women, 2,354 adult men; October 2009). The results indicated that the intervention had direct effects on community capacity; enhanced capacity was then associated with having taken community action for health. Finally, community capacity, mediated by community action and controlling for confounders, had a significant effect on women's contraceptive use, children's bed net use, and HIV testing.
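The reported drop in anemia prevalence among the adolescent girls (73.8% to 54.6%) can be checked with a standard two-proportion z-test. This is an illustrative sketch only: it assumes 260 girls were surveyed at both time points (only the baseline n is given) and treats the surveys as independent samples, whereas a paired test such as McNemar's would be more appropriate if the same individuals were re-measured.

```python
from math import sqrt, erfc

def two_proportion_z(p1, n1, p2, n2):
    """Two-sided two-proportion z-test using the pooled standard error."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_two_sided = erfc(abs(z) / sqrt(2))  # equals 2 * P(Z > |z|)
    return z, p_two_sided

# Anemia prevalence among adolescent girls: 73.8% before vs 54.6% after,
# assuming n = 260 at both surveys (an assumption; only baseline n is stated).
z, p = two_proportion_z(0.738, 260, 0.546, 260)
```

Under these assumptions the z statistic is well above the 1.96 threshold, consistent with the authors' description of the reduction as significant.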
The results indicate that building community capacity served both as a means to an end (improved health behaviors and reported collective action for health) and as an end in itself, both of which are essential to overall wellbeing. Malaria remains a vital concern for child survival in sub-Saharan Africa despite the existence of effective curative and preventive measures. It is known that child malaria is underpinned by factors such as socioeconomic, cultural, and environmental conditions that must be considered simultaneously in order to control it effectively. This study applied the Health Promotion concept (community participation and empowerment, contextualism, intersectorality, multi-strategy, equity, and sustainability) to a rural community in Benin (West Africa) to develop a program to control child malaria and close the gap left by unsuccessful programs. The study design was a quasi-experimental pre-post study conducted over a period of 27 months. As a result, 80% of the community members participated in six of the seven sub-projects planned. The prevalence of fever (malaria) was significantly reduced after the intervention (p = 0.008). Recourse to adequate health care increased significantly after the intervention (χ2 = 48.07, p < 0.0001). All of this contributed to a statistically significant reduction in deaths due to malaria among children under five in the village (p = 0.001). Health Promotion strategies are likely to contribute to the implementation of sustainable malaria programs that reduce malaria incidence and deaths in children under five.
2,086
27,363,819
The addition of bevacizumab to erlotinib can significantly improve PFS and the ORR in the second-line treatment of NSCLC, with an acceptable and manageable risk of rash and diarrhoea.
OBJECTIVES Bevacizumab and erlotinib inhibit different tumour growth pathways, and both exhibit beneficial effects in the treatment of non-small-cell lung cancer (NSCLC). However, the efficacy of bevacizumab in combination with erlotinib remains controversial. Therefore, we conducted a meta-analysis comparing combination treatment with bevacizumab and erlotinib to bevacizumab or erlotinib monotherapy in the treatment of NSCLC.
Background Combination of erlotinib and bevacizumab is a promising regimen in advanced non-squamous non-small-cell lung cancer (NSCLC). We are conducting a single-arm phase II trial which aims to evaluate the efficacy and safety of this regimen as second- or third-line chemotherapy. Methods Key eligibility criteria were histologically or cytologically confirmed non-squamous NSCLC; stage III/IV or recurrent NSCLC not indicated for radical chemoradiation; one or two prior regimens of chemotherapy; age 20 years or more; and performance status of two or less. The primary endpoint is objective response rate. The secondary endpoints include overall survival, progression-free survival, disease control rate and incidence of adverse events. This trial plans to accrue 80 patients based on a two-stage design employing a binomial distribution with an alternative hypothesis response rate of 35% and a null hypothesis threshold response rate of 20%. A subset analysis according to EGFR mutation status is planned. Discussion We have presented the design of a single-arm phase II trial to evaluate the efficacy and safety of the combination of bevacizumab and erlotinib in advanced non-squamous NSCLC patients. In particular, we are interested in determining the merit of further development of this regimen and whether prospective patient selection using the EGFR gene is necessary in future trials. Trial registration This trial was registered at the UMIN Clinical Trials Registry as UMIN000004255 (http://www.umin.ac.jp/ctr/index.htm). PURPOSE Bevacizumab (Avastin; Genentech, South San Francisco, CA) is a recombinant, humanized anti-vascular endothelial growth factor monoclonal antibody. Erlotinib HCl (Tarceva, OSI-774; OSI Pharmaceuticals, New York, NY) is a potent, reversible, highly selective and orally available HER-1/epidermal growth factor receptor tyrosine kinase inhibitor.
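The sample-size rationale above (null response rate 20%, alternative 35%, n = 80) can be checked with exact binomial tail probabilities. The sketch below is a simplified single-stage version of that calculation; the trial itself uses a two-stage design whose stage boundaries are not given in the abstract, and the one-sided alpha of 0.05 is an assumption.

```python
from math import comb

def binom_sf(k, n, p):
    # Exact P(X >= k) for X ~ Binomial(n, p)
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def single_stage_design(n=80, p0=0.20, p1=0.35, alpha=0.05):
    """Smallest responder count c that rejects H0: p <= p0 at one-sided
    level alpha, with the attained type I error and the power at p1."""
    c = next(k for k in range(n + 1) if binom_sf(k, n, p0) <= alpha)
    return c, binom_sf(c, n, p0), binom_sf(c, n, p1)

c, type_i_error, power = single_stage_design()
```

With these parameters the design rejects the null when the responder count reaches the low twenties, keeping the one-sided type I error under 5% while retaining high power at a true response rate of 35%, which is consistent with the planned accrual of 80 patients.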
Preclinical data in various xenograft models showed greater growth inhibition with the combination than with either agent alone. Additionally, both agents have demonstrated benefit in patients with previously treated non-small-cell lung cancer (NSCLC). PATIENTS AND METHODS A phase I/II study in two centers examined erlotinib and bevacizumab (A+T) in patients with nonsquamous stage IIIB/IV NSCLC and at least one prior chemotherapy. In phase I, erlotinib 150 mg/day orally plus bevacizumab 15 mg/kg intravenously every 21 days was established as the phase II dose, although no dose-limiting toxicities were observed. Phase II assessed the efficacy and tolerability of A+T at this dose. Pharmacokinetic parameters were evaluated. RESULTS Forty patients were enrolled and treated in this study (34 patients at the phase II dose); the median age was 59 years (range, 36 to 72 years), 21 were female, 30 had adenocarcinoma histology, nine were never-smokers, and 22 had two or more prior regimens (three patients had four or more prior regimens). The most common adverse events were mild to moderate rash, diarrhea, and proteinuria. Preliminary data showed no pharmacokinetic interaction between A and T. Eight patients (20.0%; 95% CI, 7.6% to 32.4%) had partial responses and 26 (65.0%; 95% CI, 50.2% to 79.8%) had stable disease as their best response. The median overall survival for the 34 patients treated at the phase II dose was 12.6 months, with progression-free survival of 6.2 months. CONCLUSION Encouraging antitumor activity and safety of A+T support further development of this combination for patients with advanced NSCLC and other solid tumors. BACKGROUND We conducted a randomized, placebo-controlled, double-blind trial to determine whether the epidermal growth factor receptor inhibitor erlotinib prolongs survival in non-small-cell lung cancer after the failure of first-line or second-line chemotherapy.
METHODS Patients with stage IIIB or IV non-small-cell lung cancer, with performance status from 0 to 3, were eligible if they had received one or two prior chemotherapy regimens. The patients were stratified according to center, performance status, response to prior chemotherapy, number of prior regimens, and prior platinum-based therapy, and were randomly assigned in a 2:1 ratio to receive oral erlotinib, at a dose of 150 mg daily, or placebo. RESULTS The median age of the 731 patients who underwent randomization was 61.4 years; 49 percent had received two prior chemotherapy regimens, and 93 percent had received platinum-based chemotherapy. The response rate was 8.9 percent in the erlotinib group and less than 1 percent in the placebo group (P<0.001); the median duration of the response was 7.9 months and 3.7 months, respectively. Progression-free survival was 2.2 months and 1.8 months, respectively (hazard ratio, 0.61, adjusted for stratification categories; P<0.001). Overall survival was 6.7 months and 4.7 months, respectively (hazard ratio, 0.70; P<0.001), in favor of erlotinib. Five percent of patients discontinued erlotinib because of toxic effects. CONCLUSIONS Erlotinib can prolong survival in patients with non-small-cell lung cancer after first-line or second-line chemotherapy. PURPOSE The phase III, randomized, placebo-controlled Sequential Tarceva in Unresectable NSCLC (SATURN; BO18192) study found that erlotinib maintenance therapy extended progression-free survival (PFS) and overall survival in patients with advanced non-small-cell lung cancer (NSCLC) who had nonprogressive disease following first-line platinum-doublet chemotherapy. This study included prospective analysis of the prognostic and predictive value of several biomarkers.
PATIENTS AND METHODS Mandatory diagnostic tumor specimens were collected before initiating first-line chemotherapy and were tested for epidermal growth factor receptor (EGFR) protein expression by immunohistochemistry (IHC), EGFR gene copy number by fluorescence in situ hybridization (FISH), and EGFR and KRAS mutations by DNA sequencing. An EGFR CA simple sequence repeat in intron 1 (CA-SSR1) polymorphism was evaluated in blood. RESULTS All 889 randomly assigned patients provided tumor samples. EGFR IHC, EGFR FISH, KRAS mutation, and EGFR CA-SSR1 repeat length status were not predictive of erlotinib efficacy. A profound predictive effect on PFS of erlotinib relative to placebo was observed in the EGFR mutation-positive subgroup (hazard ratio [HR], 0.10; P < .001). Significant PFS benefits were also observed with erlotinib in the wild-type EGFR subgroup (HR, 0.78; P = .0185). KRAS mutation status was a significant negative prognostic factor for PFS. CONCLUSION This large prospective biomarker study found that patients with activating EGFR mutations derive the greatest PFS benefit from erlotinib maintenance therapy. No other biomarkers were predictive of outcomes with erlotinib, although the study was not powered for clinical outcomes in biomarker subgroups other than EGFR IHC-positive [corrected]. KRAS mutations were prognostic for reduced PFS. The study demonstrated the feasibility of prospective tissue collection for biomarker analyses in NSCLC. BACKGROUND Poor PS is a negative prognostic factor for survival and a risk factor for treatment-related toxicity with standard platinum-doublet chemotherapy for advanced NSCLC. A phase II study combining erlotinib and bevacizumab for treatment of recurrent NSCLC showed encouraging efficacy and acceptable toxicity.
PATIENTS AND METHODS This single-arm phase II study evaluated erlotinib and bevacizumab as first-line therapy for newly diagnosed nonsquamous advanced NSCLC patients with Eastern Cooperative Oncology Group PS ≥ 2 or age 70 or older. Only patients eligible for bevacizumab per its label were enrolled. Patients received erlotinib 150 mg orally daily and bevacizumab 15 mg/kg intravenously on day 1 every 21 days for up to 6 cycles. The primary end point was the rate of nonprogressive disease at 4 months (alternative hypothesis > 60%). RESULTS Twenty-five patients were enrolled, with a median age of 77 years (range, 52-90 years); 44% were female and 20% were never- or remote-smokers. Ninety-two percent of patients enrolled had a PS of 2 per investigator assessment. The rate of nonprogressive disease at 4 months was 28%. There were no complete responses, 1 patient achieved a partial response, and 11 patients (44%) experienced stable disease as best response. Rash, fatigue, and diarrhea were the most common toxicities. CONCLUSION The combination of erlotinib and bevacizumab had insufficient activity in the absence of known activating epidermal growth factor receptor gene mutations to warrant study in newly diagnosed elderly or poor-PS patients with nonsquamous NSCLC. BACKGROUND Bevacizumab and erlotinib target different tumour growth pathways with little overlap in their toxic-effect profiles. On the basis of promising results from a phase 1/2 trial assessing safety and activity of erlotinib plus bevacizumab for recurrent or refractory non-small-cell lung cancer (NSCLC), we aimed to assess the efficacy and safety of this combination in a phase 3 trial. METHODS In our double-blind, placebo-controlled, randomised phase 3 trial (BeTa), we enrolled patients with recurrent or refractory NSCLC who presented to 177 study sites in 12 countries after failure of first-line treatment.
Patients were randomly allocated in a one-to-one ratio to receive erlotinib plus bevacizumab (bevacizumab group) or erlotinib plus placebo (control group) according to a computer-generated randomisation sequence by use of an interactive voice response system. The primary endpoint was overall survival in all enrolled patients. Patients, study staff, and investigators were masked to treatment assignment. We assessed safety by calculating the incidence of adverse events, and tissue was collected for biomarker analyses. This trial is registered with ClinicalTrials.gov, number NCT00130728. FINDINGS Overall survival did not differ between 317 controls and 319 patients in the bevacizumab group (hazard ratio [HR] 0.97, 95% CI 0.80-1.18, p=0.7583). Median overall survival was 9.3 months (IQR 4.1-21.6) for patients in the bevacizumab group compared with 9.2 months (3.8-20.2) for controls. Progression-free survival seemed to be longer in the bevacizumab group (3.4 months [1.4-8.4]) than in the control group (1.7 months [1.3-4.1]; HR 0.62, 95% CI 0.52-0.75), and the objective response rate suggested some clinical activity of bevacizumab and erlotinib. However, these secondary endpoint differences could not be declared significant because the study prespecified that the primary endpoint had to be significant before testing of secondary endpoints could be done, to control the type I error rate. In the bevacizumab group, 130 (42%) of 313 patients with safety data had a serious adverse event, compared with 114 (36%) controls. There were 20 (6%) grade 5 adverse events, including two arterial thromboembolic events, in the bevacizumab group, and 14 (4%) in the control group. INTERPRETATION Addition of bevacizumab to erlotinib does not improve survival in patients with recurrent or refractory NSCLC.
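The prespecified rule in the BeTa trial (secondary endpoints are only tested if the primary endpoint succeeds) is a fixed-sequence gatekeeping procedure for familywise type I error control. A minimal sketch follows; the PFS p-value used in the example is a hypothetical placeholder, since the abstract reports only the confidence interval.

```python
def fixed_sequence(p_values, alpha=0.05):
    """Fixed-sequence (hierarchical) testing: each hypothesis is tested
    at the full alpha level, but only while every earlier hypothesis in
    the prespecified order has been rejected. Once one fails, all later
    hypotheses are automatically not rejected, which controls the
    familywise type I error rate at alpha."""
    decisions = []
    gate_open = True
    for p in p_values:
        reject = gate_open and p <= alpha
        decisions.append(reject)
        gate_open = gate_open and reject
    return decisions

# BeTa: the primary endpoint (OS, p = 0.7583) fails, so the apparently
# large PFS difference cannot be declared significant regardless of its
# p-value (0.001 here is a hypothetical placeholder, not a reported value).
decisions = fixed_sequence([0.7583, 0.001])
```

This is why the abstract describes the PFS and response-rate differences as suggestive rather than significant: the gate closed at the primary endpoint.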
FUNDING Genentech. BACKGROUND With use of EGFR tyrosine-kinase inhibitor monotherapy for patients with activating EGFR mutation-positive non-small-cell lung cancer (NSCLC), median progression-free survival has been extended to about 12 months. Nevertheless, new strategies are needed to further extend progression-free survival and overall survival with acceptable toxicity and tolerability for this population. We aimed to compare the efficacy and safety of the combination of erlotinib and bevacizumab with erlotinib alone in patients with non-squamous NSCLC with activating EGFR mutation-positive disease. METHODS In this open-label, randomised, multicentre, phase 2 study, patients from 30 centres across Japan with stage IIIB/IV or recurrent non-squamous NSCLC with activating EGFR mutations, Eastern Cooperative Oncology Group performance status 0 or 1, and no previous chemotherapy for advanced disease received erlotinib 150 mg/day plus bevacizumab 15 mg/kg every 3 weeks or erlotinib 150 mg/day monotherapy as first-line therapy until disease progression or unacceptable toxicity. The primary endpoint was progression-free survival, as determined by an independent review committee. Randomisation was done with a dynamic allocation method, and the analysis used a modified intention-to-treat approach, including all patients who received at least one dose of study treatment and had tumour assessment at least once after randomisation. This study is registered with the Japan Pharmaceutical Information Center, number JapicCTI-111390. FINDINGS Between Feb 21, 2011, and March 5, 2012, 154 patients were enrolled. 77 were randomly assigned to receive erlotinib plus bevacizumab and 77 to erlotinib alone, of whom 75 patients in the erlotinib plus bevacizumab group and 77 in the erlotinib alone group were included in the efficacy analyses.
Median progression-free survival was 16.0 months (95% CI 13.9-18.1) with erlotinib plus bevacizumab and 9.7 months (5.7-11.1) with erlotinib alone (hazard ratio 0.54, 95% CI 0.36-0.79; log-rank test p=0.0015). The most common grade 3 or worse adverse events were rash (19 [25%] patients in the erlotinib plus bevacizumab group vs 15 [19%] patients in the erlotinib alone group), hypertension (45 [60%] vs eight [10%]), and proteinuria (six [8%] vs none). Serious adverse events occurred at a similar frequency in both groups (18 [24%] patients in the erlotinib plus bevacizumab group and 19 [25%] patients in the erlotinib alone group). INTERPRETATION The erlotinib plus bevacizumab combination could be a new first-line regimen in EGFR mutation-positive NSCLC. Further investigation of the regimen is warranted. FUNDING Chugai Pharmaceutical Co. PURPOSE Bevacizumab, a humanized anti-vascular endothelial growth factor monoclonal antibody, and erlotinib, a reversible, orally available epidermal growth factor receptor tyrosine kinase inhibitor, have demonstrated evidence of a survival benefit in the treatment of non-small-cell lung cancer (NSCLC). A single-arm phase I and II study of bevacizumab plus erlotinib demonstrated encouraging efficacy, with a favorable safety profile. PATIENTS AND METHODS A multicenter, randomized phase II trial evaluated the safety of combining bevacizumab with either chemotherapy (docetaxel or pemetrexed) or erlotinib, and preliminarily assessed these combinations versus chemotherapy alone, as measured by progression-free survival (PFS). All patients had histologically confirmed nonsquamous NSCLC that had progressed during or after one platinum-based regimen. RESULTS One hundred twenty patients were randomly assigned and treated. No unexpected adverse events were noted.
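As a rough consistency check on the PFS figures above: under an exponential (constant-hazard) survival model, the median scales inversely with the hazard, so a hazard ratio of 0.54 applied to the erlotinib-alone median of 9.7 months predicts roughly 18 months for the combination, in the same range as the observed 16.0 months. This is an illustrative back-of-the-envelope sketch, not an analysis the study performed, and the exponential assumption is ours.

```python
from math import log

def exponential_median(hazard):
    # Median survival time of an exponential distribution with the given hazard rate
    return log(2) / hazard

def predicted_median(reference_median, hazard_ratio):
    """Under exponential survival, the comparator arm's hazard is
    hazard_ratio times the reference hazard, so medians scale by
    1 / hazard_ratio."""
    reference_hazard = log(2) / reference_median
    return exponential_median(hazard_ratio * reference_hazard)

# Erlotinib-alone median PFS 9.7 months, HR 0.54 for the combination arm
predicted = predicted_median(9.7, 0.54)  # roughly 18 months
```

The gap between the ~18-month prediction and the observed 16.0 months is unsurprising: real PFS curves are rarely exactly exponential, and medians carry their own sampling error.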
Fewer patients (13%) in the bevacizumab-erlotinib arm discontinued treatment as a result of adverse events than in the chemotherapy-alone (24%) or bevacizumab-chemotherapy (28%) arms. The incidence of grade 5 hemorrhage in patients receiving bevacizumab was 5.1%. Although not statistically significant, relative to chemotherapy alone, the risk of disease progression or death was 0.66 (95% CI, 0.38 to 1.16) among patients treated with bevacizumab-chemotherapy and 0.72 (95% CI, 0.42 to 1.23) among patients treated with bevacizumab-erlotinib. The one-year survival rate was 57.4% for bevacizumab-erlotinib and 53.8% for bevacizumab-chemotherapy, compared with 33.1% for chemotherapy alone. CONCLUSION Results for PFS and overall survival favor the combination of bevacizumab with either chemotherapy or erlotinib over chemotherapy alone in the second-line setting. No unexpected safety signals were noted. The rate of fatal pulmonary hemorrhage was consistent with previous bevacizumab trials. The toxicity profile of the bevacizumab-erlotinib combination is favorable compared with either chemotherapy-containing group. BACKGROUND Preclinical data indicate that EGFR signals through both kinase-dependent and kinase-independent pathways, and that combining a small-molecule EGFR inhibitor, an EGFR antibody, and/or an anti-angiogenic agent is synergistic in animal models. METHODS We conducted a dose-escalation phase I study combining erlotinib, cetuximab, and bevacizumab. The subset of patients with non-small cell lung cancer (NSCLC) was analyzed for safety and response. RESULTS Thirty-four patients with NSCLC (median four prior therapies) received treatment at a range of dose levels. The most common treatment-related grade ≥2 adverse events were rash (n=14, 41%), hypomagnesemia (n=9, 27%), and fatigue (n=5, 15%).
Seven patients (21%) achieved stable disease (SD) lasting ≥6 months, two achieved a partial response (PR) (6%), and two achieved an unconfirmed partial response (uPR) (6%) (total=32%). We observed SD ≥6 months/PR/uPR in patients who had received prior erlotinib and/or bevacizumab, those with brain metastases, smokers, and patients treated at lower dose levels. Five of 16 patients (31%) with wild-type EGFR experienced SD ≥6 months or uPR. A correlation between grade of rash and rate of SD ≥6 months/PR was observed (p<0.01). CONCLUSION The combination of erlotinib, cetuximab, and bevacizumab was well tolerated and demonstrated antitumor activity in heavily pretreated patients with NSCLC. BACKGROUND Bevacizumab, a monoclonal antibody against vascular endothelial growth factor, has been shown to benefit patients with a variety of cancers. METHODS Between July 2001 and April 2004, the Eastern Cooperative Oncology Group (ECOG) conducted a randomized study in which 878 patients with recurrent or advanced non-small-cell lung cancer (stage IIIB or IV) were assigned to chemotherapy with paclitaxel and carboplatin alone (444) or paclitaxel and carboplatin plus bevacizumab (434). Chemotherapy was administered every 3 weeks for six cycles, and bevacizumab was administered every 3 weeks until disease progression was evident or toxic effects were intolerable. Patients with squamous-cell tumors, brain metastases, clinically significant hemoptysis, or inadequate organ function or performance status (ECOG performance status > 1) were excluded. The primary end point was overall survival. RESULTS The median survival was 12.3 months in the group assigned to chemotherapy plus bevacizumab, as compared with 10.3 months in the chemotherapy-alone group (hazard ratio for death, 0.79; P=0.003).
The median progression-free survival in the two groups was 6.2 and 4.5 months, respectively (hazard ratio for disease progression, 0.66; P<0.001), with corresponding response rates of 35% and 15% (P<0.001). Rates of clinically significant bleeding were 4.4% and 0.7%, respectively (P<0.001). There were 15 treatment-related deaths in the chemotherapy-plus-bevacizumab group, including 5 from pulmonary hemorrhage. CONCLUSIONS The addition of bevacizumab to paclitaxel plus carboplatin in the treatment of selected patients with non-small-cell lung cancer has a significant survival benefit, with the risk of increased treatment-related deaths. (ClinicalTrials.gov number, NCT00021060.) BACKGROUND Molecularly targeted agents for non-small cell lung cancer (NSCLC) can provide similar efficacy to chemotherapy without chemotherapy-associated toxicities. Combining two agents with different modes of action could further increase the efficacy of these therapies. The TASK study evaluated the efficacy and safety of the epidermal growth factor receptor tyrosine kinase inhibitor erlotinib in combination with the anti-angiogenic agent bevacizumab as first-line therapy in unselected, advanced non-squamous NSCLC patients. METHODS Patients were recruited from December 2007 to September 2008. The planned sample size was 200 patients; a total of 124 patients were randomized. Patients were randomized 1:1 using a minimization algorithm to receive bevacizumab (IV 15 mg/kg on day 1 of each 21-day cycle) plus chemotherapy (gemcitabine/cisplatin or carboplatin/paclitaxel at standard doses, 4-6 cycles) (BC arm) or bevacizumab plus erlotinib (orally, 150 mg/day; BE arm) until disease progression or unacceptable toxicity. The primary endpoint was progression-free survival (PFS). If the hazard ratio (HR) of PFS for BE relative to BC was above 1.25 at the pre-planned interim analysis in favor of BC, the study would be re-evaluated.
Secondary endpoints included overall survival, response rate and safety. RESULTS All randomized patients (n=63 BE; n=61 BC) were evaluated in the efficacy analyses. At the updated interim analysis, median PFS was 18.4 weeks (95% confidence interval [CI] 17.0-25.1) versus 25.0 weeks (95% CI 20.6-[not reached]) for BE versus BC, respectively (HR for death or disease progression, BE relative to BC, 2.05, p=0.0183). The incidence of death was 19% for BE treatment compared with 11.5% for BC treatment. The HR for PFS at the updated interim analysis was above 1.25; therefore patients on the BE arm were permitted to change arms or switch to another drug, and the study was terminated. Adverse events reported were as expected. CONCLUSIONS The TASK study did not show a benefit in terms of PFS for the combination of erlotinib with bevacizumab in unselected first-line advanced non-squamous NSCLC compared with chemotherapy plus bevacizumab. PURPOSE A prior study demonstrated that the addition of continuous daily erlotinib fails to improve response rate or survival in non-small-cell lung cancer (NSCLC) patients treated with carboplatin and paclitaxel. However, preclinical data support the hypothesis that intermittent administration of erlotinib before or after chemotherapy may improve efficacy. We tested this hypothesis in patients with advanced NSCLC. PATIENTS AND METHODS Eligible patients were former or current smokers with chemotherapy-naive stage IIIB or IV NSCLC. All patients received up to six cycles of carboplatin (area under the curve = 6) and paclitaxel (200 mg/m2), with random assignment to one of the following three erlotinib treatments: erlotinib 150 mg on days 1 and 2 with chemotherapy on day 3 (150 PRE); erlotinib 1,500 mg on days 1 and 2 with chemotherapy on day 3 (1,500 PRE); or chemotherapy on day 1 with erlotinib 1,500 mg on days 2 and 3 (1,500 POST). The primary end point was response rate.
RESULTS Eighty-six patients received treatment. The response rates for the 150 PRE, 1,500 PRE, and 1,500 POST arms were 18% (five of 28 patients), 34% (10 of 29 patients), and 28% (eight of 29 patients), respectively. The median overall survival times were 10, 15, and 10 months for the 150 PRE, 1,500 PRE, and 1,500 POST arms, respectively. The most common grade 3 and 4 toxicities were neutropenia (39%), fatigue (15%), and anemia (12%). Grade 3 and 4 rash and diarrhea were uncommon. CONCLUSION Patients treated on the 1,500 PRE arm had the highest response rate and longest survival, with ranges similar to those reported for carboplatin, paclitaxel, and bevacizumab in a more restricted population. Further evaluation of this strategy in a phase III trial is proposed. BACKGROUND This trial focused on optimally combining existing targeted therapies and cytotoxic chemotherapy in the treatment of unselected patients with advanced non-small-cell lung cancer (NSCLC). METHODS Patients with previously untreated advanced-stage nonsquamous NSCLC were eligible for this trial. In module A, patients received up to 4 cycles of erlotinib 150 mg daily and bevacizumab 15 mg/kg every 3 weeks. Patients then received carboplatin (AUC = 6), paclitaxel 200 mg/m2, and bevacizumab 15 mg/kg for 4 cycles in module B. Patients who did not have progressive disease in module A received maintenance erlotinib 150 mg daily and bevacizumab 15 mg/kg every 3 weeks in module C. RESULTS Forty-eight patients were enrolled in this multicenter phase II trial. Most patients were male (62.5%) and white (77.1%), with stage IV disease (93.8%) and adenocarcinoma histologic type (66.7%). The overall response rate in module A was 10.4%, in module B it was 15.1%, and in module C it was 5.5%. The study achieved its primary endpoint, with a nonprogression rate of 45.8% in module A. The median overall survival (OS) was 12.6 months.
CONCLUSION The novel systemic therapy regimen is feasible in patients with advanced NSCLC. However, there is no further role for developing this regimen in unselected patients with NSCLC. We present the treatment rationale and study design of the AvaALL (MO22097; ClinicalTrials.gov: NCT01351415) trial, a multicenter, open-label, randomized, two-arm, phase IIIb study. Patients with advanced non-squamous non-small-cell lung cancer (NSCLC) whose disease has progressed after four to six cycles of first-line treatment with bevacizumab plus a platinum-based doublet and a minimum of two cycles of bevacizumab (monotherapy) maintenance treatment will be randomized in a 1:1 ratio to one of two study arms. Patients treated on arm A will receive bevacizumab 7.5 or 15 mg/kg intravenously (IV) on day 1, every 21 days, plus investigator's choice of agents indicated for use in second-line (limited to pemetrexed, docetaxel, or erlotinib) and subsequent lines of treatment. Patients treated on arm B will receive investigator's choice of agents alone indicated for use in second-line and subsequent lines of treatment, but no further bevacizumab treatment. The primary endpoint of this study is overall survival (OS). Secondary endpoints include the 6-month, 12-month, and 18-month OS rates, progression-free survival, and time to progression at second and third progressive disease (PD), response rate, disease control rates, and duration of response at second and third PD. Additionally, efficacy in the subgroup of patients with adenocarcinoma, and the safety of bevacizumab treatment across multiple lines of treatment, will be assessed.
Exploratory objectives include assessment of the quality of life through multiple lines of treatment, comparison of the efficacy between Asian and non-Asian patients, and correlation of biomarkers with efficacy outcomes, disease response, and adverse events. We applied an established and commercially available serum proteomic classifier for survival after treatment with erlotinib (VeriStrat) in a blinded manner to pretreatment sera obtained from recurrent advanced NSCLC patients before treatment with the combination of erlotinib plus bevacizumab. We found that VeriStrat could classify these patients into two groups with significantly better or worse outcomes and may enable rational selection of patients more likely to benefit from this costly and potentially toxic regimen. BACKGROUND First-line chemotherapy for advanced non-small-cell lung cancer (NSCLC) is usually limited to four to six cycles. Maintenance therapy can delay progression and prolong survival. The oral epidermal growth factor receptor (EGFR) tyrosine-kinase inhibitor erlotinib has proven efficacy and tolerability in second-line NSCLC. We designed the phase 3, placebo-controlled Sequential Tarceva in Unresectable NSCLC (SATURN; BO18192) study to assess use of erlotinib as maintenance therapy in patients with non-progressive disease following first-line platinum-doublet chemotherapy. METHODS Between December, 2005, and May, 2008, 1949 patients were included in the run-in phase (four cycles of platinum-based chemotherapy). At the end of the run-in phase, 889 patients who did not have progressive disease were entered into the main study and were randomly allocated, using a 1:1 adaptive randomisation method through a third-party interactive voice response system, to receive erlotinib (150 mg/day; n=438) or placebo (n=451) until progression or unacceptable toxicity.
Patients were stratified by EGFR immunohistochemistry status, stage, Eastern Cooperative Oncology Group performance status, chemotherapy regimen, smoking history, and region. Co-primary endpoints were progression-free survival (PFS) in all analysable patients irrespective of EGFR status, and PFS in patients whose tumours had EGFR protein overexpression, as determined by immunohistochemistry. This study is registered with www.ClinicalTrials.gov, number NCT00556712. FINDINGS 884 patients were analysable for PFS: 437 in the erlotinib group and 447 in the placebo group. After a median follow-up of 11.4 months for the erlotinib group and 11.5 months for the placebo group, median PFS was significantly longer with erlotinib than with placebo: 12.3 weeks for patients in the erlotinib group versus 11.1 weeks for those in the placebo group (HR 0.71, 95% CI 0.62-0.82; p<0.0001). PFS was also significantly longer in patients with EGFR-positive immunohistochemistry who were treated with erlotinib (n=307) compared with EGFR-positive patients given placebo (n=311; median PFS 12.3 weeks in the erlotinib group vs 11.1 weeks in the placebo group; HR 0.69, 0.58-0.82; p<0.0001). The most common grade 3 or higher adverse events were rash (37 [9%] of 443 patients in the erlotinib group vs none of 445 in the placebo group) and diarrhoea (seven [2%] of 443 patients vs none of 445). Serious adverse events were reported in 47 patients (11%) on erlotinib compared with 34 patients (8%) on placebo. The most common serious adverse event was pneumonia (seven cases [2%] with erlotinib and four [<1%] with placebo). INTERPRETATION Maintenance therapy with erlotinib for patients with NSCLC is well tolerated and significantly prolongs PFS compared with placebo. First-line maintenance with erlotinib could be considered in patients who do not progress after four cycles of chemotherapy.
FUNDING F Hoffmann-La Roche. PURPOSE This phase II trial aimed to evaluate the feasibility and efficacy of a first-line combination of targeted therapies for advanced non-squamous NSCLC: bevacizumab (B) and erlotinib (E), followed by platinum-based CT at disease progression (PD). METHODS 103 patients with advanced non-squamous NSCLC were treated with B (15 mg/kg on day 1 of each 21-day cycle) and E (150 mg daily) until PD or unacceptable toxicity. Upon PD, patients received 6 cycles of CT (cisplatin/carboplatin and gemcitabine). The primary endpoint was disease stabilization rate (DSR) after 12 weeks of BE treatment. RESULTS 101 patients were evaluable. Under BE, DSR at week 12 was 54.5%. 73 patients had at least stable disease (SD), including 1 complete remission and 17 partial responses (PR). No unexpected toxicities were observed. Median time to progression (TTP) under BE was 4.1 months. 62 patients started CT; 35 received at least 4 cycles (6 PR, 32 SD). At a median follow-up of 36 months, median overall survival (OS) was 14.1 months. CONCLUSIONS First-line BE treatment followed by a fixed CT regimen at PD is feasible with acceptable toxicity and activity. In a non-squamous NSCLC population unselected for EGFR status, we found OS rates similar to standard CT. PURPOSE Bevacizumab, a monoclonal antibody targeting vascular endothelial growth factor, improves survival when combined with carboplatin/paclitaxel for advanced nonsquamous non-small-cell lung cancer (NSCLC). This randomized phase III trial investigated the efficacy and safety of cisplatin/gemcitabine (CG) plus bevacizumab in this setting. PATIENTS AND METHODS Patients were randomly assigned to receive cisplatin 80 mg/m2 and gemcitabine 1,250 mg/m(2) for up to six cycles plus low-dose bevacizumab (7.5 mg/kg), high-dose bevacizumab (15 mg/kg), or placebo every 3 weeks until disease progression.
The trial was not powered to compare the two doses directly. The primary end point was amended from overall survival (OS) to progression-free survival (PFS). Between February 2005 and August 2006, 1,043 patients were randomly assigned (placebo, n=347; low dose, n=345; high dose, n=351). RESULTS PFS was significantly prolonged; the hazard ratios for PFS were 0.75 (median PFS, 6.7 v 6.1 months for placebo; P=.003) in the low-dose group and 0.82 (median PFS, 6.5 v 6.1 months for placebo; P=.03) in the high-dose group compared with placebo. Objective response rates were 20.1%, 34.1%, and 30.4% for placebo, low-dose bevacizumab, and high-dose bevacizumab plus CG, respectively. Duration of follow-up was not sufficient for OS analysis. Incidence of grade 3 or greater adverse events was similar across arms. Grade ≥3 pulmonary hemorrhage rates were ≤1.5% for all arms, despite 9% of patients receiving therapeutic anticoagulation. CONCLUSION Combining bevacizumab (7.5 or 15 mg/kg) with CG significantly improved PFS and objective response rate. Bevacizumab plus platinum-based chemotherapy offers clinical benefit for bevacizumab-eligible patients with advanced NSCLC. PURPOSE This phase III trial was performed to assess the potential benefit of adding maintenance erlotinib to bevacizumab after a first-line chemotherapy regimen with bevacizumab for advanced non-small-cell lung cancer (NSCLC). PATIENTS AND METHODS One thousand one hundred forty-five patients with histologically or cytologically confirmed NSCLC (stage IIIB with malignant pleural effusion, stage IV, or recurrent) received four cycles of chemotherapy plus bevacizumab. Seven hundred forty-three patients without disease progression or significant toxicity were then randomly assigned (1:1) to bevacizumab (15 mg/kg, day 1, 21-day cycle) plus either placebo or erlotinib (150 mg per day).
The primary end point was progression-free survival (PFS). RESULTS Median PFS from time of random assignment was 3.7 months with bevacizumab/placebo and 4.8 months with bevacizumab/erlotinib (hazard ratio [HR], 0.71; 95% CI, 0.58 to 0.86; P<.001). Median overall survival (OS) times from random assignment were 13.3 and 14.4 months with bevacizumab/placebo and bevacizumab/erlotinib, respectively (HR, 0.92; 95% CI, 0.70 to 1.21; P=.5341). During the postchemotherapy phase, there were more adverse events (AEs) overall, more grade 3 and 4 AEs (mainly rash and diarrhea), more serious AEs, and more AEs leading to erlotinib/placebo discontinuation in the bevacizumab/erlotinib arm versus the bevacizumab/placebo arm. The incidence of AEs leading to bevacizumab discontinuation was similar in both treatment arms. CONCLUSION The addition of erlotinib to bevacizumab significantly improved PFS but not OS. Although generally well tolerated, the modest impact on survival and increased toxicity associated with the addition of erlotinib to bevacizumab maintenance mean that this two-drug maintenance regimen will not lead to a new postchemotherapy standard of care.
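The trial abstracts above report hazard ratios fitted by Cox models alongside median survival times. As a rough cross-check only (not the trials' actual Cox estimates), if survival in each arm is approximated as exponential, an arm's hazard is ln(2) divided by its median, so the hazard ratio reduces to the ratio of medians. A minimal sketch under that stated assumption:

```python
import math

def exponential_hazard_ratio(median_treatment, median_control):
    """Approximate a hazard ratio assuming exponentially distributed
    survival in both arms: hazard = ln(2) / median, so
    HR = hazard_treatment / hazard_control = median_control / median_treatment.
    A back-of-envelope check, not a Cox proportional-hazards estimate."""
    hazard_treatment = math.log(2) / median_treatment
    hazard_control = math.log(2) / median_control
    return hazard_treatment / hazard_control

# Medians of 6.2 vs 4.5 months (as in the first abstract) give
# an approximate HR of 4.5 / 6.2, close to but not equal to the
# reported Cox estimate of 0.66.
print(round(exponential_hazard_ratio(6.2, 4.5), 2))
```

The gap between this approximation and the reported 0.66 illustrates why trials fit Cox models to the full censored survival data rather than comparing medians.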
2,087
23,543,384
SUMMARY ANSWER The incidence of BV is significantly higher among patients with tubal infertility when compared with patients with non-tubal infertility. BV does not impinge on conception rates but is significantly associated with preclinical pregnancy loss, though not with first-trimester abortion. Still, there is strong circumstantial evidence that supports a causal link between BV and tubal infertility. Studies with a longitudinal design, on the other hand, strongly support a relation between BV and early pregnancy loss. Unfortunately, no study looked beyond first-trimester fetal loss, although it is plausible that the high preterm birth rates with IVF are, at least in part, attributable to BV.
STUDY QUESTION Is bacterial vaginosis (BV) associated with the cause of infertility, and does BV impinge on conception rates and early pregnancy loss following IVF? WHAT IS KNOWN ALREADY BV is prevalent in patients with infertility, as evident from studies across the world.
OBJECTIVE: The reported incidence of acute pelvic inflammatory disease (PID) has decreased, but rates of tubal infertility have not, suggesting that a large proportion of PID leading to infertility may be undetected. Subclinical PID is common in women with uncomplicated chlamydial or gonococcal cervicitis or with bacterial vaginosis. We assessed whether women with subclinical PID are at an increased risk for infertility. METHODS: A prospective observational cohort of 418 women with or at risk for gonorrhea or chlamydia or with bacterial vaginosis was recruited. Women with acute PID were excluded. An endometrial biopsy was performed to identify endometritis (subclinical PID). After provision of therapy for gonorrhea, chlamydia and bacterial vaginosis, participants were followed up for fertility outcomes. RESULTS: There were 146 incident pregnancies during follow-up: 50 pregnancies in 120 (42%) women with subclinical PID and 96 in 187 (51%) women without subclinical PID. Women with subclinical PID diagnosed at enrollment had a 40% reduced incidence of pregnancy compared with women without subclinical PID (hazard ratio 0.6, 95% confidence interval 0.4-0.8). Women with Neisseria gonorrhoeae or Chlamydia trachomatis, in the absence of subclinical PID, were not at increased risk for infertility. CONCLUSION: Subclinical PID decreases subsequent fertility despite provision of treatment for sexually transmitted diseases. These findings suggest that a proportion of female infertility is attributable to subclinical PID and indicate that current therapies for sexually transmitted diseases are inadequate for prevention of infertility. LEVEL OF EVIDENCE: Objective: The aim of this study was to investigate the impact of vaginal flora and vaginal inflammation on conception and early pregnancy loss following in-vitro fertilization (IVF). Methods: We enrolled 91 women who were undergoing IVF.
At embryo transfer (ET), all of the women had quantitative vaginal culture, ET catheter-tip culture, and vaginal Gram stain scored for bacterial vaginosis and quantitated for polymorphonuclear leukocytes (PMNs). Conception and early pregnancy loss were compared with culture and Gram stain results. Statistical analyses included the Chi-square test, Fisher's exact test and the Mann-Whitney U-test. Results: The overall live birth rate (LBR) was 30% (27/91), and the rate of early pregnancy loss was 34% (14/41). In women with bacterial vaginosis, intermediate flora and normal flora, the conception rates were 30% (3/10), 39% (12/31) and 52% (26/50), respectively (p=0.06 for trend). Early pregnancy loss occurred in 33% (1/3), 42% (5/12) and 31% (8/26) of women, respectively (p=0.06, comparing intermediate and normal flora). The vaginal log concentration of hydrogen peroxide-producing lactobacilli was 7.3 ± 1.7 in women with a live birth (n=27) and 4.9 ± 2.5 in those with early pregnancy loss (n=14) (p=0.1). Conclusions: IVF patients with bacterial vaginosis and with a decreased vaginal log concentration of hydrogen peroxide-producing lactobacilli may have decreased conception rates and increased rates of early pregnancy loss. A larger prospective treatment trial designed to evaluate the impact on IVF outcomes of optimizing the vaginal flora prior to IVF may be warranted. Objectives: To validate a simplified grading scheme for Gram-stained smears of vaginal fluid for the diagnosis of bacterial vaginosis (BV) against the accepted "gold" standard of Amsel's composite criteria. Methods: Women attending genitourinary medicine (GUM) clinics, as part of a multicentre study, were diagnosed as having BV if three or more of the following criteria were present: homogeneous discharge, elevated vaginal pH, production of amines, and presence of "clue" cells.
Women with fewer than three of the criteria were considered normal. Simultaneously, smears were made of vaginal fluid, Gram stained, and then assessed qualitatively as normal (grade I), intermediate (grade II), or consistent with BV (grade III). Two new grades were used: grade 0, epithelial cells only with no bacteria; and grade IV, Gram-positive cocci only. Results: BV was diagnosed in 83/162 patient visits using the composite criteria, the remainder being regarded as normal. The majority of patients with BV had a smear assessed as grade III (80/83, 96%) and the majority of normal women had a smear assessed as grade I (normal, 48/79, 61%), giving a high sensitivity (97.5%), specificity (96%), and predictive value for a positive (94.1%) and negative (96%) test; kappa index = 0.91. Smears assessed as grade II were found predominantly (12/13) among patients diagnosed as normal, with fewer than three of the composite criteria. Grades 0 and IV were both found only among normal women. Conclusion: This simplified assessment of Gram-stained smears can be used as an alternative to Amsel's criteria and is more applicable for use in busy GUM clinics. Objective: To assess whether the rate of bacterial vaginosis (BV) is higher in women with tubal factor infertility compared with those with other causes of infertility. Purpose: The aim of this study was to investigate the impact of bacterial vaginal flora on live-birth rate during ICSI and the influence of metronidazole as an antibiotic treatment course before ICSI. Method: We enrolled 71 women who were undergoing ICSI. At embryo transfer (ET), all of the women had quantitative vaginal culture, ET catheter-tip culture, and vaginal Gram stain scored for bacterial vaginosis. Results: The overall live birth rate (LBR) was 36.6% (26/71), and the rate of early pregnancy loss was 13% (4/30).
In women with bacterial vaginosis, intermediate flora and normal flora, the conception rates were 35% (9/26), 42% (14/33) and 58% (7/12), respectively (p=0.06 for trend). Metronidazole affected the bacterial flora in the vagina. The predominant species isolated from the tip of the embryo transfer catheter in negative pregnancy were Staphylococcus epidermidis (7 vs. 15.2%) and Streptococcus viridans (11 vs. 24%). Conclusions: Women with bacterial vaginosis and with a decreased vaginal concentration of hydrogen peroxide-producing lactobacilli may have decreased conception rates and increased rates of failed pregnancy. A larger prospective treatment trial designed to evaluate the impact on ICSI outcomes of optimizing the vaginal flora prior to ICSI may be warranted. The objective of this prospective cohort study was to elucidate whether bacterial vaginosis (BV) is associated with a pro-inflammatory endometrial secretion cytokine profile and whether there is a relationship between BV and the concentrations of a number of key regulatory cytokines, chemokines and growth factors. A total of 198 women undergoing IVF treatment were included. Prior to embryo transfer, participants underwent screening for BV according to Nugent criteria by a Gram-stained cervical smear. The concentrations of 17 soluble mediators of human implantation were measured by multiplex immunoassay in endometrial secretions aspirated prior to embryo transfer. Seventeen (8.6%) women had BV (Nugent score > 6). Multivariable logistic regression showed a significant positive association between interleukin-1 beta and the presence of BV (P=0.011; Nugent score > 6 versus ≤ 6) and a significant negative association between eotaxin and BV (P=0.003). No significant differences were found in the ratios of distinct pro- and anti-inflammatory cytokines in endometrial secretions from women with or without BV.
In conclusion, BV is associated with higher concentrations of interleukin-1 beta in endometrial secretions compared with women without BV. However, no distinct difference in pro- and anti-inflammatory profiles is present. An effect on endometrial receptivity is unlikely.
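The BV abstracts above diagnose bacterial vaginosis in two ways: Amsel's composite criteria (BV if at least three of the four clinical criteria are present) and the Nugent Gram-stain score (0-10, conventionally read as 0-3 normal, 4-6 intermediate, 7-10 BV; the last study uses >6 as its cut-off). A small illustrative sketch of both classification rules:

```python
def amsel_bv(discharge, elevated_ph, amines, clue_cells):
    """Amsel's composite criteria: BV if >= 3 of the 4 criteria
    (homogeneous discharge, elevated vaginal pH, amine production,
    clue cells) are present."""
    return sum([discharge, elevated_ph, amines, clue_cells]) >= 3

def nugent_category(score):
    """Conventional interpretation of a 0-10 Nugent Gram-stain score."""
    if not 0 <= score <= 10:
        raise ValueError("Nugent score must be between 0 and 10")
    if score <= 3:
        return "normal"
    if score <= 6:
        return "intermediate"
    return "BV"

# Three of four Amsel criteria present -> BV; a Nugent score of 8 -> BV.
print(amsel_bv(True, True, False, True), nugent_category(8))
```

Note that the two systems need not agree case by case, which is exactly why the grading-scheme study above validated its Gram-stain grades against Amsel's criteria.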
2,088
24,418,958
A significant benefit of RMT was revealed for five outcomes: vital capacity (mean difference (95% confidence interval)) = 0.41 (0.17-0.64) l, maximal inspiratory pressure = 10.66 (3.59-17.72) cmH2O, maximal expiratory pressure = 10.31 (2.80-17.82) cmH2O, maximum voluntary ventilation = 17.51 (5.20-29.81) l min−1 and inspiratory capacity = 0.35 (0.05-0.65) l. No effect was found for total lung capacity, peak expiratory flow rate, functional residual capacity, residual volume, expiratory reserve volume or forced expiratory volume in 1 second. Conclusion: RMT increases respiratory strength, function and endurance during the period of training.
Study design: Systematic review. Objectives: To determine the effect of respiratory muscle training (RMT) on pulmonary function in tetraplegia.
Thirteen tetraplegic patients were included in the study of the effects of respiratory muscle training and of electrical stimulation of the abdominal muscles on their respiratory capabilities. Each patient underwent three 1-month study periods: inspiratory muscle training, expiratory muscle training, and a period without training. The sequence of these three periods was random for each patient. Respiratory tests (RT) measuring forced vital capacity (FVC) and forced expiratory volume in one second (FEV1) were conducted before and following each monthly period. Measurements were taken under four sets of conditions: the patients' unassisted efforts, their efforts combined with pressure manually applied by a therapist to the upper part of their abdomen, and their efforts accompanied by electrical stimulation (ES) of the abdominal muscles during the early phase of expiration, once triggered by the therapist and once by the patients themselves. RT values were increased following respiratory muscle training, and inspiratory training apparently had a slightly greater effect than its expiratory counterpart. The increments in RT values were statistically significant (P<0.05) after the inspiratory muscle training. RT measurements were greater when the patient's voluntary effort was combined with ES of the abdominal muscles than when it was not. This study concludes that respiratory muscle training is a potentially effective approach and that ES of the abdominal muscles has the potential to improve coughing in tetraplegic patients. Study design: This was a prospective observational study. Objectives: To review airway management of patients with acute cervical spinal cord injury (CSCI) who are admitted to the intensive care unit (ICU) and to develop a classification and regression tree (CART) to direct clinical decision making in airway management. Setting: This study was carried out in Australia.
Methods: All patients with CSCI who required intubation and mechanical ventilation and who were admitted to the ICU in three tertiary hospitals in Melbourne between October 2004 and May 2009, and two other interstate hospitals between December 2004 and December 2005, were included. Airway management was recorded. Results: A total of 114 patients were included. Tracheostomy insertion occurred in 68 patients (59.7%). Using CART analysis, it was found that the variables forced vital capacity, volume of pulmonary secretions and gas exchange were predictive of airway management on 82.3% of occasions, with an 8.7% extubation failure rate. Conclusion: A CART can be useful in clinical decision making regarding airway management in CSCI. OBJECTIVE To evaluate if resistive inspiratory muscle training (RIMT) can improve lung function in patients with complete tetraplegia within half a year after trauma. DESIGN A prospective study. The experimental patients received training with a Diemolding Healthcare Division inspiratory muscle trainer for 15 to 20 minutes per session, twice per day, 7 days a week, for 6 weeks. SETTING Hospital-based rehabilitation units. PATIENTS Twenty patients who were in their first 6 months of complete cervical cord injury were randomly enrolled into RIMT (10 patients) and control (10 patients) groups. MAIN OUTCOME MEASURE Spirometry, lung volume test, maximal inspiratory pressure, maximal expiratory pressure, and modified Borg scale measurements at rest were performed before training and at the end of 6 weeks of training. RESULTS Most of the pulmonary parameters showed statistically significant improvements within the RIMT and control groups, but the improvements were greater in the RIMT group.
In addition, the improvements in total lung capacity, total lung capacity predicted percentage, vital capacity, minute ventilation, forced expiratory volume in 1 second predicted percentage, and the resting Borg scale were significantly greater in the RIMT group. CONCLUSION RIMT can improve ventilatory function, respiratory endurance, and the perceived difficulty of breathing in patients with complete cervical spinal cord injury within half a year after trauma. Study design: Prospective single-centre study. Objectives: Pulmonary rehabilitation focuses on improving expiratory muscle function in order to increase the reduced cough capacity in patients with cervical spinal cord injuries (SCI). However, an improvement in inspiratory function is also important for coughing effectively. Therefore, this study examined the significance of inspiratory muscle strength for cough capacity in patients with a cervical SCI. Setting: SCI unit, Yonsei Rehabilitation Hospital, Seoul, Korea. Methods: The vital capacity (VC), maximum inspiratory pressure (MIP), and maximum expiratory pressure (MEP) were measured. Moreover, the unassisted peak cough flow (PCF) and assisted PCF under three conditions were evaluated. Results: All three assisted cough methods showed a significantly higher value than the unassisted method (P<0.001). The VC correlated with the voluntary cough capacity, and the MIP (R=0.749) correlated more significantly with the VC than the MEP did (R=0.438) (P<0.01). The MIP showed a higher correlation with both the unassisted PCF and all three assisted PCFs than the MEP (P<0.001). Conclusions: Management of inspiratory muscle strength should be considered in pulmonary rehabilitation for cervical SCI patients. We examined the effects of ventilatory muscle endurance training on resting breathing pattern in 12 C6-C7 traumatic quadriplegics at least 1 year post-injury.
All subjects had complete motor loss below the lesion level. Subjects were randomly assigned to a training group (N=6) or a control group (N=6). Baseline tests included measurement of resting ventilation and breathing pattern using mercury-in-rubber strain gauges for 20 minutes in a seated position; maximum inspiratory mouth pressure (MIP) at FRC and sustainable inspiratory mouth pressure for 10 minutes (SIP); lung volumes; and arterial blood gases (ABGs). The training protocol consisted of breathing through an inspiratory resistor equivalent to 85% SIP for 15 minutes twice daily, 5 days a week, for 8 weeks. Both trainers and controls attended the lab every 2 weeks for reassessment of MIP and SIP, and the inspiratory resistance was increased in the training group as SIP increased. At the end of 8 weeks, baseline tests were repeated. All subjects had normal ABGs. There was a significant increase in mean MIP and SIP in both the control group (30% ± 19% and 31% ± 18%, respectively) and the training group (42% ± 24% and 78% ± 49%, respectively). Although the absolute values for both MIP and SIP were greater in the training group than in the control group, the differences were not significant. The alterations in resting breathing pattern were also the same in both groups. Mean frequency decreased significantly in the control group (20.2/minute to 16.9/minute) and, while insignificant, the change in frequency in the training group was the same, 19.4/minute to 16.4/minute. Mean tidal volume (Vt) increased by 18.2% of baseline Vt in the control group and 17.0% of baseline in the trainers, resulting in no change in minute ventilation. As MIP and SIP increased similarly in both groups, the data from the controls and trainers were pooled and timing changes re-evaluated pre- and post-study. A significant decrease in mean Ti/Ttot was observed, while no change in Vt/Ti was found.
We concluded that the testing procedure itself provided the stimulus resulting in a significant increase in MIP and SIP. The addition of training did not increase MIP and SIP further. The increased MIP and SIP resulted in a slower and deeper breathing pattern and a significantly shorter Ti/Ttot in both trainers and control subjects. OBJECTIVE To compare the effects of inspiratory resistance training and isocapnic hyperpnoea vs incentive spirometry (placebo) on respiratory function, voice, thorax mobility and quality of life in individuals with tetraplegia. DESIGN Randomized controlled trial. PATIENTS/METHODS A total of 24 individuals with traumatic, complete tetraplegia (C5-C8, American Spinal Injury Association (ASIA) Impairment Scale; AIS A) were randomly assigned to 1 of 3 groups. They completed 32 supervised training sessions over a period of 8 weeks. Before and after the training period, the following tests were performed: body plethysmography, inspiratory and expiratory muscle strength, subjective breathing parameters using a visual analogue scale, voice measurements, thorax mobility and quality of life. Cohen's effect sizes and Kruskal-Wallis tests for differences between pre- and post-training values were calculated. RESULTS Compared with placebo training, inspiratory resistance training showed high effect sizes for inspiratory muscle strength (d=1.13), the subjective ability "to blow one's nose" (d=0.97) and the physical component of quality of life (d=0.82). Isocapnic hyperpnoea compared with placebo showed a high effect size for breathlessness during exercise (d=0.81). We found a significant effect of inspiratory resistance training vs placebo (p=0.016) and vs isocapnic hyperpnoea (p=0.012) for inspiratory muscle strength.
CONCLUSION In individuals with motor and sensory complete tetraplegia during the first year post-injury, inspiratory resistance training is more advantageous than isocapnic hyperpnoea, performed 4 times a week for 10 min Adults and children with neuromuscular disease exhibit weak cough and are susceptible to recurrent chest infections, a major cause of morbidity and mortality. Mechanical insufflation/exsufflation may improve cough efficacy by increasing peak cough flow. It was hypothesised that mechanical insufflation/exsufflation would produce a greater increase in peak cough flow than other modes of cough augmentation. The acceptability of these interventions was also compared. Twenty-two patients aged 10–56 yrs (median 21 yrs) with neuromuscular disease and 19 age-matched controls were studied. Spirometry was performed and respiratory muscle strength measured. Peak cough flow was recorded during maximal unassisted coughs, followed in random order by coughs assisted by physiotherapy, noninvasive ventilation, insufflation and exsufflation, and exsufflation alone. Subjects rated strength of cough, distress and comfort on a visual analogue scale. In the neuromuscular disease group, mean±sd forced expiratory volume in one second was 0.8±0.6 L·s−1, forced vital capacity 0.9±0.8 L, maximum inspiratory pressure 25±16 cmH2O, maximum expiratory pressure 26±22 cmH2O and unassisted peak cough flow 169±90 L·min−1. The greatest increase in peak cough flow was observed with mechanical insufflation/exsufflation at 235±111 L·min−1 (p<0.01). All techniques showed similar patient acceptability. Mechanical insufflation/exsufflation produces a greater increase in peak cough flow than other standard cough augmentation techniques in adults and children with neuromuscular disease Stolzmann KL, Gagnon DR, Brown R, Tun CG, Garshick E: Risk factors for chest illness in chronic spinal cord injury: A prospective study.
Objective: Chest illnesses commonly cause morbidity in persons with chronic spinal cord injury. Risk factors remain poorly characterized because previous studies have not accounted for factors other than spinal cord injury. Design: Between 1994 and 2005, 403 participants completed a respiratory questionnaire and underwent spirometry. Participants were contacted at a median of 1.7 yrs [interquartile range: 1.3–2.5 yrs] apart over a mean (SD) of 5.1 ± 3.0 yrs and asked to report chest illnesses that had resulted in time off work, spent indoors, or in bed since prior contact. Results: In 97 participants, there were 247 chest illnesses (0.12/person-year) with 54 hospitalizations (22%). Spinal cord injury level, completeness of injury, and duration of injury were not associated with illness risk. Adjusting for age and smoking history, any wheeze (relative risk = 1.92; 95% confidence interval: 1.19, 3.08), pneumonia or bronchitis since spinal cord injury (relative risk = 2.29; 95% confidence interval: 1.40, 3.75), and physician-diagnosed chronic obstructive pulmonary disease (relative risk = 2.17; 95% confidence interval: 1.08, 4.37) were associated with a greater risk of chest illness. Each percent-predicted decrease in forced expiratory volume in 1 sec was associated with a 1.2% increase in risk of chest illness (P = 0.030). Conclusions: In chronic spinal cord injury, chest illness resulting in time spent away from usual activities was not related to the level or completeness of spinal cord injury but was related to reduced pulmonary function, wheeze, chronic obstructive pulmonary disease, a history of pneumonia and bronchitis, and smoking OBJECTIVE To assess the effectiveness of expiratory muscle training on the pulmonary function of spinal cord injured patients. DESIGN Randomized controlled trial. SETTING Acute inpatient rehabilitation hospital.
PARTICIPANTS Patients (N=29, 22 men and 7 women) with recent traumatic, motor complete, spinal cord injury (SCI) at or above level T1 consecutively admitted to an SCI rehabilitation service. Subjects were randomized to either resistance training (n=16) or sham training (n=13). INTERVENTIONS The subjects completed either sham training or expiratory muscle resistive training with maximal expiratory force using a small handheld device, which is a tube with an aperture at the distal end, for 10 repetitions twice a day, 5 days a week, for a total of 6 weeks. MAIN OUTCOME MEASURES Pulmonary function tests were measured before and after the training program and included forced vital capacity (FVC); forced expiratory volume in 1 second (FEV1); maximum expiratory pressure (MEP), which is often referred to as forced expiratory pressure; maximum inspiratory pressure (MIP), which is often referred to as negative inspiratory force; inspiratory capacity (IC); expiratory reserve volume (ERV); total lung capacity (TLC); functional residual capacity (FRC); and residual volume (RV). RESULTS FVC, FEV1, and ERV improved in both groups. Although exit values of MEP were improved in both groups compared with entry values, this increase was statistically significant only in the resistance training group. No significant improvements occurred in IC, TLC, FRC, or RV from entry to exit. MIP improved in both groups, but this increase was statistically significant only in the resistance training group. There was also a significant between-group difference in MEP exit values (98 cmH2O for the resistance training group and 59 cmH2O for the sham training group, t=3.45, P=.002). Multivariate analyses failed to reveal significant effects of treatment for any of the pulmonary function tests. CONCLUSIONS The resistance training group had significantly greater exit MEP values than the sham training group in univariate analysis only.
However, improvements in pulmonary function were noted in both the resistance training and sham training groups. Although multivariate analysis failed to reveal a significant difference between groups, these findings offer some indication that expiratory training may benefit people with SCI Impairments in respiration resulting from spinal cord injury (SCI) result in medical consequences that are leading causes of morbidity, mortality, and economic burden. Pulmonary complications of SCI include increased risk of pulmonary infection and death, and higher rates of symptoms of respiratory dysfunction. Inspiratory capacity is diminished in individuals with higher level lesions, contributing to microatelectasis, dyspnea with exertion, and, in those with more severe impairments, respiratory insufficiency. Muscles of expiration are impaired in many individuals with spinal cord injury (e.g., injury > T8) with profound effects on cough effectiveness and, presumably, on clearance of secretions and susceptibility to lower respiratory tract infections. In persons with SCI, quality of life is diminished by respiratory symptoms that include cough, phlegm, and wheezing (1,2). In those with higher lesions, asthma-like disorders of airway function have been described, which are prevented by cholinergic antagonists. This abnormality has been attributed to the unopposed effects of parasympathetic innervation on respiratory smooth muscle resulting from disruption of sympathetic efferents (3). Hope for reductions in the impact of these many respiratory complications comes from new technologies that support respiration, continued growth of knowledge about the specific characteristics and impact of the respiratory complications of SCI, and interventions to reduce their severity.
From a more fundamental viewpoint, gains in function of respiratory musculature after SCI, whether occurring spontaneously or stimulated by rehabilitation paradigms, point to the plasticity of the nervous system and its ability to form new connections after SCI. In this issue of the Journal of Spinal Cord Medicine are 7 articles about aspects of respiratory complications of SCI, which provide a perspective of current knowledge about respiratory complications and progress toward current treatments and prevention. Four research articles provide new information about the nature of these complications, the technologies to relieve them, and the neurobiology by which new connections formed within the spinal cord may limit them. Smith and coworkers report on visit rates for respiratory complications based upon analysis of administrative data for more than 18,000 veterans with SCI over a 5-year period. Although the findings of this study are limited by missing information on the level and completeness of injury and, potentially, on visits to non-VA healthcare providers for respiratory symptoms, it has unique strengths including the large number of subjects and availability of comparable data for able-bodied veterans. In agreement with two smaller prospective studies, the authors reported a positive relationship between level and completeness of injury. Risks of pneumonia were increased 2-fold as compared with able-bodied veterans. A significant number of visits were for influenza, raising an intriguing question as to how the rising rates of influenza vaccination in veterans with SCI will affect this respiratory complication. Additionally, smoking has been shown to adversely affect lung function in persons with SCI (4), raising questions regarding risks of respiratory illness posed by tobacco use for individuals with SCI. Neuroprostheses employing functional electrical stimulation carry the potential to improve cough and ventilation.
Phrenic nerve pacing provides at least partial ventilator independence for some individuals with higher lesions. Technologies that improve cough or ventilation would be expected to result in further long-term reductions in morbidity due to respiratory complications of SCI. Progress toward this goal will require development of electrodes; control systems for arrays of multiple electrodes to intercostal muscles, abdominal muscles, or both; and an understanding of which muscles of respiration must be stimulated and when. In this issue, Walters and coworkers have demonstrated the feasibility of using microstimulators for activation of abdominal and intercostal muscles and the diaphragm. On the one hand, this finding might be expected to expand the repertoire of devices available for the design of future neuroprostheses to support respiration and cough. However, as acknowledged by the authors, much work remains to be done before microstimulators become standard tools for implementation of neuroprostheses in this setting. In mammals, hemisection of the cervical spinal cord has been observed to be followed by spontaneous recovery of contraction of the ipsilateral diaphragm resulting from crossover of motor pathways to activate the phrenic nucleus ipsilateral to the hemisection (crossed phrenic pathway). Work in this field represents an intriguing intersection of the fields of SCI research, respiratory physiology, and the neurobiology of neuroplasticity. Recruitment of this pathway requires specific adaptations within the spinal cord. Insights into the molecular basis for these adaptations are provided by two studies in this issue of the Journal. Evidence that upregulation of the NR2A subunit of the NMDA receptor stimulates such neuroplasticity comes from studies by Alilain and Goshgarian demonstrating that upregulation of this subunit facilitates recovery of motor activity of the ipsilateral phrenic nerve.
Petrov and coworkers have explored the molecular basis for findings that adenosine-receptor antagonists such as theophylline accelerate ipsilateral phrenic nerve activity. These authors suggest that this effect of theophylline results from preserving normal expression of the A1-adenosine receptor subtype in phrenic nerve motor neurons. Unfortunately, as noted by Zimmer in this issue, damage to the cervical spinal cord in humans with SCI may be too extensive to expect to see the crossed phrenic pathway. Current knowledge regarding treatment of respiratory complications of SCI is reviewed by Berly and Shem and by Zimmer and coworkers. What is clear from this work is that much progress has been made in developing supportive care to minimize the impact of respiratory complications over the first days or weeks after injury. Morbidity and mortality due to pulmonary infection have proved to be more challenging problems to address. In this regard, one intervention with proven efficacy in other populations is vaccination against influenza virus and Streptococcus pneumoniae. Despite the progress in immunizing persons with SCI against influenza, approximately one third to one half (depending on age, smoking history, and other factors) are not vaccinated even with intensive vaccination paradigms (5). The findings of Trautner and coworkers reported in this issue indicate that one rational way to improve patient acceptance is administration to insensate regions. Individuals with SCI have benefited from advances in pulmonary research; however, progress has been slow toward reducing the impact of chronic impairments in cough; symptoms of cough, wheeze, phlegm, and dyspnea; and ventilatory impairments in persons with higher lesions.
Areas where rigorous study is needed to develop evidence of efficacy and establish treatment guidelines include presence or absence of benefit for respiratory muscle strength training, cough-assistive devices, and noninvasive ventilation Study design: Prospective mortality study. Objective: To assess the relationship between comorbid medical conditions and other health-related factors to mortality in chronic spinal cord injury (SCI). Setting: Boston, MA, USA. Methods: Between 1994 and 2000, 361 males ⩾1 year after injury completed a respiratory health questionnaire and underwent pulmonary function testing. Cause-specific mortality was assessed over a median of 55.6 months (range 0.33–74.4 months) through 12/31/2000 using the National Death Index. Results: At entry, mean (±SD) age was 50.6±15.0 years (range 23–87) and years since injury was 17.5±12.8 years (range 1.0–56.5). Mortality was elevated (observed/expected deaths=37/25.1; SMR=1.47; 95% CI=1.04–2.03) compared to US rates. Risk factors for death were diabetes (RR=2.62; 95% CI=1.19–5.77), heart disease (RR=3.66; 95% CI=1.77–7.78), reduced pulmonary function, and smoking. The most common underlying and contributing causes of death were diseases of the circulatory system (ICD-9 390–459) in 40%, and of the respiratory system in 24% (ICD-9 460-519). Conclusions: These results suggest that much of the excess mortality in chronic SCI is related to potentially treatable factors. Recognition and treatment of cardiovascular disease, diabetes, and lung disease, together with smoking cessation, may substantially reduce mortality in chronic SCI OBJECTIVE To determine whether pulmonary function at discharge from inpatient rehabilitation can predict respiratory infection in spinal cord injury in the first year after discharge, and to determine which pulmonary function parameter predicts best. DESIGN Multicentre prospective cohort study.
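The relative risks quoted above (e.g., RR=2.62; 95% CI=1.19–5.77 for diabetes) come from comparing event rates between exposed and unexposed groups; a standard large-sample confidence interval is built on the log-RR scale. A minimal sketch of an unadjusted version of this calculation, using hypothetical 2×2 counts rather than the study's actual (proportional-hazards-adjusted) tabulation:

```python
import math

def relative_risk_ci(a, n1, b, n2, z=1.96):
    """Unadjusted relative risk with a Wald 95% CI on the log scale.

    a/n1: events / total in the exposed group; b/n2: same for the unexposed.
    """
    rr = (a / n1) / (b / n2)
    # Standard error of log(RR) for independent binomial samples
    se_log = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Hypothetical counts: 12/60 deaths among participants with heart disease
# vs 25/301 among those without (illustrative only, not the study's data).
rr, lo, hi = relative_risk_ci(12, 60, 25, 301)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An interval that excludes 1.0, as in the diabetes and heart disease estimates above, is what licenses calling the factor a significant mortality risk.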
SUBJECTS A total of 140 persons with spinal cord injury. METHODS Pulmonary function was tested at discharge from inpatient rehabilitation. Pulmonary function parameters (expressed in absolute and percentage predicted values) were: forced vital capacity, forced expiratory volume in 1 sec, and peak expiratory flow. Respiratory infection was determined one year after discharge by a physician. Differences between the respiratory infection and non-respiratory infection groups were tested, and receiver operating characteristic curves were used to determine how accurately pulmonary function parameters could predict respiratory infection. RESULTS Of the 140 participants, 14 (10%) experienced respiratory infection in the first year after discharge. All pulmonary function parameters were significantly lower in persons who experienced respiratory infection than in those who did not. All pulmonary function parameters were almost equally accurate in predicting respiratory infection; only percentage predicted forced vital capacity was less accurate. CONCLUSION Pulmonary function at discharge from inpatient rehabilitation can be used as a predictor of respiratory infection in the first year after discharge in spinal cord injury. No single pulmonary function parameter was a clearly superior predictor of respiratory infection OBJECTIVE To explore the effects of singing training on respiratory function, voice, mood, and quality of life for people with quadriplegia. DESIGN Randomized controlled trial. SETTING Large, university-affiliated public hospital, Victoria, Australia. PARTICIPANTS Participants (N=24) with chronic quadriplegia (C4-8, American Spinal Injury Association grades A and B). INTERVENTIONS The experimental group (n=13) received group singing training 3 times weekly for 12 weeks. The control group (n=11) received group music appreciation and relaxation for 12 weeks.
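The receiver operating characteristic analysis used above to judge how accurately a pulmonary function parameter predicts respiratory infection reduces to a simple probability: the area under the ROC curve equals the chance that a randomly chosen case scores higher than a randomly chosen non-case. A minimal sketch of that rank-based AUC, on hypothetical percent-predicted FVC values (lower FVC should predict infection, so values are negated to serve as risk scores):

```python
def roc_auc(scores_pos, scores_neg):
    """AUC as P(random positive scores higher than random negative);
    ties count as one half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical % predicted FVC values (not the study's data):
fvc_infection = [45, 52, 60, 48]         # developed respiratory infection
fvc_no_infection = [70, 82, 65, 90, 75]  # did not
auc = roc_auc([-v for v in fvc_infection], [-v for v in fvc_no_infection])
print(round(auc, 2))
```

An AUC of 0.5 means the parameter is no better than chance; the study's finding that the parameters were "almost equally accurate" corresponds to their ROC areas being similar.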
Assessments were conducted pre-, mid-, immediately post-, and 6 months postintervention. MAIN OUTCOME MEASURES Standard respiratory function testing, surface electromyographic activity from accessory respiratory muscles, sound pressure levels during vocal tasks, assessments of voice quality (Perceptual Voice Profile, Multidimensional Voice Profile), and Voice Handicap Index, Profile of Mood States, and Assessment of Quality of Life instruments. RESULTS The singing group increased projected speech intensity (P=.028) and maximum phonation length (P=.007) significantly more than the control group. Trends for improvements in respiratory function, muscle strength, and recruitment were also evident for the singing group. These effects were limited by small sample sizes with large intersubject variability. Both groups demonstrated an improvement in mood (P=.002), which was maintained in the music appreciation and relaxation group after 6 months (P=.017). CONCLUSIONS Group music therapy can have a positive effect on not only physical outcomes, but can also improve mood, energy, social participation, and quality of life for an at-risk population, such as those with quadriplegia. Specific singing therapy can augment these general improvements by improving vocal intensity This study compared the use of abdominal weights (AbWts) to inspiratory resistive muscle training (IMT) on selected measures of pulmonary function. Eleven patients, aged 16 to 41 years (mean = 27.8, SD = 8.3), with complete cervical injuries were randomly assigned to either an AbWts or IMT treatment group. Subjects in both treatment groups received daily treatments (five times weekly) for 7 weeks. Forced vital capacity (FVC), inspiratory capacity (IC), maximal voluntary ventilation (MVV), peak expiratory flow rate (PEFR), and inspiratory mouth pressure (PImax) were measured weekly.
Analysis of variance for repeated measures showed no difference between the AbWts and IMT treatments; there were significant differences within each respective treatment group for all five variables. Although the data did not support the effectiveness of one method of training over the other, the larger increase in MVV with the IMT protocol may be indicative of an endurance training effect with this protocol. Future research should compare the effects of breathing exercise training to spontaneous recovery of the respiratory muscles in control subjects OBJECTIVE To describe changes in pulmonary function (PF) during the 5 years after inpatient rehabilitation in persons with spinal cord injury (SCI) and to study potential determinants of change. DESIGN Prospective cohort study. SETTING Eight rehabilitation centers with specialized SCI units. PARTICIPANTS Persons with SCI (N=180). INTERVENTIONS Not applicable. MAIN OUTCOME MEASURES PF was determined by forced vital capacity (FVC) and forced expiratory volume in 1 second (FEV1) as a percentage of the predicted value, at the start of rehabilitation, at discharge, and 1 and 5 years after discharge from inpatient rehabilitation. The population was divided into 3 subgroups on the basis of whether their PF declined, stabilized, or improved. RESULTS FVC improved on average 5.1% over the whole period between discharge from inpatient rehabilitation and 5 years thereafter, but changes differed largely between persons. FVC declined in 14.9% of the population during the first year after discharge. During this year, body mass index, inspiratory muscle strength, change in peak power output, and change in peak oxygen uptake differed significantly between subgroups. FVC declined in 28.3% of the population during the following 4 years, but no differences were found between the subgroups for this period.
Subgroups based on changes in FEV1 differed only with respect to change in peak oxygen uptake in the first year after discharge. CONCLUSIONS In our study, many persons with SCI showed a decline in PF, larger than the normal age-related decline, during the 5 years after inpatient rehabilitation. Results suggest that a decline in PF during the first year after inpatient rehabilitation is associated with higher body mass index, lower inspiratory muscle strength, and declined physical fitness BACKGROUND Functional loss of respiratory muscles in persons with spinal cord injury leads to impaired pulmonary function and respiratory complications. In addition, respiratory complications are responsible for 50-67% of the morbidity in this population. OBJECTIVE To investigate the effects of normocapnic hyperpnoea training in acute spinal cord injury. PATIENTS AND METHODS Fourteen patients were randomized between a control (sham) group and an experimental normocapnic hyperpnoea training group. Vital capacity, maximal voluntary ventilation, respiratory muscle strength and endurance, respiratory complications and symptoms were evaluated before training, after 4 and 8 weeks of training, and after 8 weeks of follow-up. RESULTS Maximal voluntary ventilation, respiratory muscle strength and endurance improved significantly in the experimental group compared with the control group (p < 0.05). Improvements in vital capacity tended to be different from the control group at 8 weeks of training. The Index of Pulmonary Dysfunction decreased after 4 weeks of training, and respiratory complications were reported less frequently in the experimental group compared with the control group. CONCLUSION Normocapnic hyperpnoea training in patients with spinal cord injury improved respiratory muscle strength and endurance.
Respiratory complications occurred less frequently after training Abstract Background/Objective: To determine the effect of respiratory resistance training (RRT) with a concurrent flow respiratory (CFR) device on respiratory function and aerobic power in wheelchair athletes. Methods: Ten male wheelchair athletes (8 with spinal cord injuries, 1 with a neurological disorder, and 1 with postpolio syndrome) were matched by lesion level and/or track rating before random assignment to either a RRT group (n = 5) or a control group (CON, n = 5). The RRT group performed 1 set of breathing exercises using Expand-a-Lung, a CFR device, 2 to 3 times daily for 10 weeks. Pre/posttesting included measurement of maximum voluntary ventilation (MVV), maximum inspiratory pressure (MIP), and peak oxygen consumption (Vo2peak). Results: Repeated measures ANOVA revealed a significant group difference in change for MIP from pre- to posttest (P < 0.05). The RRT group improved by 33.0 cm H2O, while the CON group improved by 0.6 cm H2O. Although not significant, the MVV increased for the RRT group and decreased for the CON group. There was no significant group difference in Vo2peak between pre/posttesting. Due to small sample sizes in both groups and violations of some parametric statistical assumptions, nonparametric tests were also conducted as a crosscheck of the findings. The results of the nonparametric tests concurred with the parametric results. Conclusions: These data demonstrate that 10 weeks of RRT with a CFR device can effectively improve MIP in wheelchair athletes. Further research and a larger sample size are warranted to further characterize the impact of Expand-a-Lung on performance and other cardiorespiratory variables in wheelchair athletes
2,089
22,895,918
Overall, there was no evidence of a beneficial effect of treatment with zinc sulphate on the number of ulcers healed.
BACKGROUND Leg ulcers affect up to one percent of people at some time in their life. Leg ulceration is chronic in nature, with ulcers being present for months and in some cases years without healing, and with a high risk of recurrence. Management approaches include dressings and the treatment of underlying medical problems such as malnutrition, lack of minerals and vitamins, poor blood supply or infection. OBJECTIVES To assess the effectiveness of oral zinc in healing arterial or venous leg ulcers.
Abstract The need for zinc in the optimal healing of wounds was demonstrated by the acceleration of healing of granulating wounds caused by excision of pilonidal-sinus tracts in young, healthy airmen who were given zinc sulphate U.S.P., 220 mg t.i.d. by mouth during the period of repair. The rate of healing was determined by measurements of the wound volumes and the number of days to complete repair. Ten randomly selected controls (mean age 25.0 years and wound volume 32.3 ml) were matched against ten who were given the drug (mean age 24.6 years and wound volume 54.5 ml). The days ±S.E. for healing averaged 80.1±13.7 in the controls and 45.8±2.6 (
2,090
31,367,842
Based on the main findings of this meta-analysis, patellar-resurfaced TKA performed better overall. Patellar resurfacing was associated with a lower rate of postoperative anterior knee pain and reoperation. Moreover, the resurfacing group showed greater values on the HSS, KSS and related subscales.
Total knee arthroplasty (TKA) is a feasible and cost-effective procedure. However, resurfacing of the patella sparks a heated debate. Anterior knee pain after TKA has been attributed to the patellofemoral joint, and resurfacing of the patella was believed to be effective in avoiding this complication. A meta-analysis was performed to update current evidence concerning the outcomes of patellar resurfacing versus retention in total knee arthroplasty. The primary outcomes of interest were the rates of anterior knee pain and revision surgery.
Background The primary purpose of this randomized controlled trial (RCT) was to compare knee-specific outcomes (stiffness, pain, function) between patellar retention and resurfacing up to 10 years after primary total knee arthroplasty (TKA). Secondarily, we compared re-operation rates. Methods 38 subjects with non-inflammatory arthritis were randomized at primary TKA surgery to receive patellar resurfacing (n = 21; Resurfaced group) or to retain their native patella (n = 17; Non-resurfaced group). Evaluations were performed preoperatively and one, five and 10 years postoperatively by an evaluator who was blinded to group allocation. Self-reported knee-specific stiffness, pain and function, the primary outcomes, were measured by the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC). Revision rate was determined at each evaluation and through hospital record review. Results 30 (88%) and 23 (72%) of available subjects completed the five- and 10-year review, respectively. Knee-specific scores continued to improve for both groups over the 10 years, despite diminishing overall health, with no significant group differences seen. All revisions occurred within five years of surgery (three Non-resurfaced subjects; one Resurfaced subject) (p = 0.31). Two revisions in the Non-resurfaced group were due to persistent anterior knee pain. Conclusions We found no differences in knee-specific results between groups at 5–10 years postoperatively. The Non-resurfaced group had two revisions due to anterior knee pain, similar to rates reported in other studies. Knee-specific results provide useful postoperative information and should be used in future studies comparing patellar management strategies. ClinicalTrials.gov BACKGROUND The need for patella resurfacing remains an area of considerable controversy in total knee replacement surgery.
There would appear to be no reported evidence on the effect of patella resurfacing on knee function, as measured by functional range of movement used in a series of tasks, in patients undergoing knee replacement. The object of this study was to measure knee joint motion during functional activities both prior to and following total knee replacement in a randomised group of patients with and without patella resurfacing, and to compare these patient groups with a group of normal age-matched subjects. METHODS The study design was a double-blinded, randomised, prospective, controlled trial. The knee joint functional ranges of movement of a group of patients (n=50, mean age=70 years) with knee osteoarthritis were investigated prior to and following total knee arthroplasty (4 months and 18-24 months), along with a group of normal subjects (n=20, mean age=67). Patients were randomly allocated into two groups: those who received patella resurfacing (n=25) and those who did not (n=25). Flexible electrogoniometry was used to measure the flexion-extension angle of the knees with respect to time in eleven functional activities. FINDINGS No statistically significant differences (alpha level 0.05) in joint excursion of the affected knee were found between patients who received patella resurfacing and those who did not. INTERPRETATION Routine patella resurfacing in a typical knee arthroplasty population does not result in an increase in the functional range of movement used after knee replacement BACKGROUND In total knee arthroplasty (TKA), handling of the patella surface is still quite controversial. We carried out a prospective randomized study to compare circumpatellar electrocautery plus patella resurfacing vs circumpatellar electrocautery only in single-staged bilateral TKA in a Chinese population.
METHODS One hundred five patients diagnosed with late-stage osteoarthritis who received single-staged bilateral TKA were screened and 53 patients were included. All patients received the same posterior cruciate-stabilizing total knee prostheses. Patients were randomized to receive circumpatellar electrocautery plus patellar resurfacing or circumpatellar electrocautery only for the first TKA, and the second knee received the opposite treatment. All patients were followed for a minimum of 2 years. RESULTS No differences were found with regard to Knee Society Score, Feller score, anterior knee pain, and revision rates. Fifty-two percent of patients had no preference with regard to pain and function, 27% of patients preferred the resurfacing plus circumpatellar electrocautery knee, while 21% of the patients preferred the circumpatellar electrocautery only knee. The Insall-Salvati index and the patella tilt were slightly smaller in the resurfacing group. One patient (2.1%) in the circumpatellar electrocautery group underwent a patella resurfacing revision for severe anterior knee pain and patella subluxation. CONCLUSION Equivalent clinical results for circumpatellar electrocautery plus resurfacing and circumpatellar electrocautery alone of the patella in TKA were demonstrated in a selected Chinese population with sufficiently thick patellae Background There is no difference in the functional outcomes 6 months after total knee arthroplasty (TKA) for knee osteoarthritis between patellar resurfacing and non-resurfacing. Thus, we have performed this study to compare the short-term clinical outcomes of TKA performed with and without patella resurfacing. Methods A total of 50 patients with osteoarthritis of the knee (OAK) were randomized to receive patellar resurfacing (n=24; resurfaced group) or to retain their native patella (n=26; non-resurfaced group) based on envelope selection, and provided informed consent.
Disease-specific outcomes including Knee Society Score (KSS), Knee Society Function Score (KSKS-F), Kujala Anterior Knee Pain Scale (AKPS), Western Ontario and McMaster Universities Arthritis Index (WOMAC), Short Form 36 (SF-36), and functional patella-related activities were measured within six months of follow-up. Results There was no significant difference between the resurfaced and non-resurfaced groups in pre- and post-operative improvement of range of motion (ROM) (P=0.421), KSS (P=0.782, P=0.553), KSKS-F (P=0.241, P=0.293), AKPS (P=0.128, P=0.443), WOMAC (P=0.700, P=0.282), and pain scores (P=0.120, P=0.508). There was no difference in ROM between the resurfaced and non-resurfaced groups pre-operatively (15.24° and 15.45°) and post-operatively (18.48° and 18.74°). No side effects related to the patella were observed in either group. Revision was required in none of the participants. Conclusion The results showed no significant difference between patellar resurfacing and non-resurfacing in TKA for all outcome measures in the short term. Level of evidence:

Introduction: Anterior knee pain following total knee arthroplasty is estimated to occur in 4-49% of patients. Some orthopedic surgeons use circumpatellar electrocautery (diathermy) to reduce the prevalence of postsurgical anterior knee pain; however, the extent of its use is unknown. Materials and Methodology: In April 2009, a postal questionnaire was sent to all 98 departments of orthopedic surgery in The Netherlands. The questions focused on the frequency of total knee arthroplasties, patellar resurfacing, and the use of circumpatellar electrocautery. Results: The response rate was 92%. A total of 18,876 TKAs, 2,096 unicompartmental knee arthroplasties, and 215 patellofemoral arthroplasties are performed yearly in The Netherlands by the responding orthopedic surgeons.
Of the orthopedic surgeons performing TKA, 13% always use patellar resurfacing in total knee arthroplasty for osteoarthritis, 49% use selective patellar resurfacing, and 38% never use it. Fifty-six percent of orthopedic surgeons use circumpatellar electrocautery when not resurfacing the patella, and 32% use electrocautery when resurfacing the patella. Conclusion: There is no consensus among Dutch orthopedic surgeons on the use of patellar resurfacing or circumpatellar electrocautery in total knee replacement performed for osteoarthritis. A prospective clinical trial is currently underway to fully evaluate the effect of circumpatellar electrocautery on the prevalence of anterior knee pain following total knee arthroplasty.

Complications of patellar resurfacing in total knee arthroplasty have rekindled the interest of many surgeons in patellar retention. In a prospective study, 20 of 40 randomly selected patients underwent patellar resurfacing in combination with their total knee arthroplasty. The other 20 patients were left with an unresurfaced patella. Within 24 months of follow-up, the advantages of patellar resurfacing could be seen according to the Knee Society Score. Especially in advanced osteoarthritis of the knee joint, the patients achieved better scores in climbing stairs and in function. The superior functional results are arguments for patellar resurfacing, at least in knees with advanced osteoarthritis.

Background: Whether to resurface the patella during a primary total knee arthroplasty performed for the treatment of degenerative osteoarthritis remains a controversial issue. Parameters that have been suggested as being useful in guiding this decision include patient height and weight, the presence of anterior knee pain preoperatively, and the grade of chondromalacia encountered intraoperatively.
The purpose of this study was to determine whether these parameters were predictive of the clinical result following total knee arthroplasty with or without patellar resurfacing. Methods: Eighty-six patients (118 knees) undergoing primary total knee arthroplasty for the treatment of osteoarthritis were enrolled in a prospective, randomized, double-blind study. All patients received the same posterior-cruciate-sparing total knee prosthetic components. Patients were randomized to treatment with or without resurfacing of the patella. Evaluations consisted of the determination of a Knee Society clinical score, the completion of a patient satisfaction questionnaire, specific questions relating to patellofemoral symptoms, and radiographs. Sixty-seven patients (ninety-three knees) were followed for a minimum of five years (range, sixty to eighty-four months; average, 70.5 months). Results: With the numbers available, there was no significant difference between the groups treated with and without resurfacing with regard to the overall Knee Society score or the pain and function subscores. Obesity, the degree of patellar chondromalacia, and the presence of preoperative anterior knee pain did not predict postoperative clinical scores or the presence of postoperative anterior knee pain. Conclusions: The occurrence of anterior knee pain could not be predicted with any clinical or radiographic parameter studied. On the basis of these results, it seems likely that postoperative anterior knee pain is related either to the component design or to the details of the surgical technique, such as component rotation, rather than to whether or not the patella is resurfaced.

350 knees were evaluated in a prospective, randomized, double-blinded study of selective patellar resurfacing in primary total knee arthroplasty. Knees with exposed bone on the patellar articular surface were excluded.
327 knees were evaluated at a mean follow-up of 7.8 years. 114 knees followed for greater than 10 years were analyzed separately. Satisfaction was higher in patients with a resurfaced patella. In patients followed for at least 10 years, no significant difference was found. No difference was found in KSS scores or survivorship. No complications of patellar resurfacing were identified. The vast majority of patients with remaining patellar articular cartilage do very well with total knee arthroplasty regardless of patellar resurfacing. Patient satisfaction may be slightly higher with patellar resurfacing.

The aim of this study was to assess medium-term results of patellar resurfacing in total knee arthroplasty, specifically looking at anterior knee pain, patellofemoral function and need for reoperation. A prospective cohort study was conducted with patients undergoing staged bilateral knee arthroplasty with the patella being resurfaced only on one side. This was due to a change in the clinical practice of the senior author. Sixty patients were reviewed clinically and radiologically on a regular basis. The surgery was either performed or supervised by the senior author in all cases. All patients received the cemented press-fit condylar© prosthesis. The Knee Society clinical rating system was used. Scores were recorded pre-operatively and post-operatively at three months, one year, two years and three-yearly thereafter. The mean age of patients in the study group was 75 years (range: 62–89 years). There were 42 women and 18 men in the study. The mean duration of follow-up was 4.5 years (range: 2–12 years). There was no significant difference in the pre-operative scores between the two groups. There were significantly better scores (p < 0.05) on the resurfaced side as compared to the non-resurfaced side at final follow-up. No revision was carried out for patellofemoral complications on the resurfaced side.
Four patients required revision in the form of patellar resurfacing on the non-resurfaced side for persistent anterior knee pain. Patellar resurfacing is recommended in total knee arthroplasty for better functional outcome with regard to anterior knee pain and patellofemoral function.

Patellofemoral problems are a common cause of morbidity and reoperation after total knee arthroplasty. We made a prospective study of 52 patients who had bilateral arthroplasty (104 knees) and in whom the patella was resurfaced on one side and not on the other. A movable-bearing prosthesis with an anatomical femoral groove was implanted on both sides by the same surgeon using an otherwise identical technique. The mean follow-up was 5.24 years (2 to 10). In the 30 available patients (60 knees) there was no difference between the two sides in subjective preference, performance on ascending and descending stairs or the incidence of anterior knee pain. Radiographs showed no differences in prosthetic alignment, femoral condylar height, patellar congruency or joint line position. The use of an appropriate prosthetic design and careful surgical technique can provide equivalent results after knee arthroplasty with or without patellar resurfacing. Given the indications and criteria, which we discuss, retention of the patellar surface is an acceptable option.

Purpose The purpose of this study is to assess the clinical and radiological results of patients who underwent patellar retention or resurfacing for moderate or severe patellar articular defects during total knee arthroplasty and evaluate the clinical efficacy of patellar resurfacing according to the articular defect of the patella. Materials and Methods From May 2003 to March 2006, 252 patients (277 cases) underwent total knee arthroplasty by one surgeon.
Intraoperatively, we divided these patients into a moderate articular defect group (50-75%: group I) and a severe articular defect group (75-100%: group II) and randomly performed patellar resurfacing. The average age was 67.2 years. There were 234 female and 17 male patients. The average follow-up period was 74.6 months. Clinical outcomes were analyzed using the Knee Society (KS) knee score, functional score, Hospital for Special Surgery (HSS) score, Feller patellar score and range of motion (ROM). Radiological outcomes were analyzed using the congruence angle, Insall-Salvati ratio and patella tilt angle. Results The KS knee score and functional score at the last follow-up were 84.4/73.1 in the retention group and 85.2/71.8 in the resurfacing group (p=0.80, p=0.63) in group I. In group II, the values were 82.1/75.1 and 87.0/71.2, respectively (p=0.51, p=0.26). The HSS score and Feller patella score were 86.7/20.3 in the retention group and 84.3/21.7 in the resurfacing group (p=0.31, p=0.29) in group I. In group II, the values were 91.6/21.2 and 85.5/22.1, respectively (p=0.37, p=0.30). The knee ROM (p=0.36/p=0.41), congruence angle (p=0.22/p=0.16), Insall-Salvati ratio (p=0.16/p=0.21) and patella tilt angle (p=0.12/p=0.19) were not statistically different between the two groups. Conclusions In this study, we could not find any correlation between the degree of patellar articular defect and patellar resurfacing in terms of the clinical and radiological results. Therefore, the patellar articular defect is thought to be less meaningful in determining patellar resurfacing.

Background and purpose — Recent research on outcomes after total knee arthroplasty (TKA) has raised the question of the ability of traditional outcome measures to distinguish between treatments.
We compared functional outcomes in patients undergoing TKA with and without patellar resurfacing, using the knee injury and osteoarthritis outcome score (KOOS) as the primary outcome and 3 traditional outcome measures as secondary outcomes. Patients and methods — 129 knees in 115 patients (mean age 70 (42–82) years; 67 female) were evaluated in this single-center, randomized, double-blind study. Data were recorded preoperatively, at 1 year, and at 3 years, and were assessed using repeated-measures mixed models. Results — The mean subscores for the KOOS after surgery were statistically significantly in favor of patellar resurfacing: sport/recreation, knee-related quality of life, pain, and symptoms. No statistically significant differences between the groups were observed with the Knee Society clinical rating system, with the Oxford knee score, and with the visual analog scale (VAS) for patient satisfaction. Interpretation — In the present study, the KOOS — but no other outcome measure used — indicated that patellar resurfacing may be beneficial in TKA.

Patellar resurfacing in total knee arthroplasty (TKA) remains controversial. This study evaluates the results of resurfacing and non-resurfacing of the patella. Fifty-six patients with osteoarthritis (OA) of the knee were enrolled in a prospective randomised clinical trial using a posterior-stabilised TKA. Evaluations were done preoperatively and after 1, 3, 6, 12 and 24 months. Disease-specific (Knee Society Score or KSS) and functional (patella-related activities) outcomes were measured. Patient satisfaction and anterior knee pain questionnaires were completed. No patients were lost to follow-up. No significant differences were found between groups with regard to the clinical part of the Knee Society score (KSS), not even in obese patients, the ability to perform daily activities involving the patellofemoral joint, and patient satisfaction.
Significant differences were found regarding the functional section of the KSS, passive flexion, anterior knee pain and patellar tilt and subluxation. In conclusion, the authors believe that, for the implant studied, patellar resurfacing can be indicated.

Patellar resurfacing in total knee arthroplasty is a topic debated in the literature. Concerns include fracture, dislocation, loosening, and extensor mechanism injury. Residual anterior knee pain has been reported when the patella is not resurfaced. One hundred patients with osteoarthritic knees were prospectively randomized to either have their patella resurfaced or left not resurfaced. All patients were treated with a single prosthesis that featured an anatomically designed patellofemoral articulation (Anatomic Medullary Knee, DePuy, Warsaw, IN). Two patients in the unresurfaced group and one in the resurfaced group required repeat surgery for patellofemoral complications. At 8- to 10-year follow-up evaluations, Knee Society Clinical Ratings scores were not different between the 2 groups. Rates of anterior knee pain with walking and stair climbing were significantly less in the resurfaced group. Eighty percent of patients with a resurfaced patella were extremely satisfied with their total knee arthroplasty versus 48% without patellar resurfacing. When satisfied and extremely satisfied patients were grouped together, there was no difference between the 2 groups.

A series of 100 consecutive osteoarthritic patients was randomised to undergo total knee replacement using a Miller-Galante II prosthesis, with or without a cemented polyethylene patellar component. Knee function was evaluated using the American Knee Society score, Western Ontario and McMaster University Osteoarthritis index, specific patellofemoral-related questions and radiographic evaluation until the fourth post-operative year, then via questionnaire until ten years post-operatively.
A ten-point difference in the American Knee Society score between the two groups was considered a significant change in knee performance, with alpha and beta levels of 0.05. The mean age of the patients in the resurfaced group was 71 years (53 to 88) and in the non-resurfaced group was 73 years (54 to 86). After ten years 22 patients had died, seven were suffering from dementia, three declined further participation and ten were lost to follow-up. Two patients in the non-resurfaced group subsequently had their patellae resurfaced. In the resurfaced group one patient had an arthroscopic lateral release. There was no significant difference between the two treatment groups: both had a similar deterioration of scores with time, and no further patellofemoral complications were observed in either group. We are unable to recommend routine patellar resurfacing in osteoarthritic patients undergoing total knee replacement on the basis of our findings.

Summary Two hundred and three patients with 219 Miller Galante type I knee arthroplasties have been reviewed in a multicentre study at an average of 3.5 years after operation. Three groups are compared: 83 patients were treated with a full polyethylene patellar component; 68 with a metal-backed component, and 68 with patellar debridement without resurfacing. We found no differences between the 3 groups in relation to pain, range of movement, stability, flexion or extension deficit, extension lag, alignment, walking distance, ability to climb stairs or the use of walking aids. There was also no difference in the incidence of patellofemoral complications. The number of revision operations was the same in patellar-resurfaced and non-resurfaced knees. Lateral release was accompanied by more complications in every subgroup. Résumé In a multicentre study we reviewed 219 Miller Galante type 1 arthroplasties in 203 patients with a mean postoperative follow-up of 3.5 years (range 2 to 6). Three modes of treatment were compared: 83 patients received a polyethylene patellar implant, 68 a metal-backed component, and the remaining 68 were treated by simple debridement without resurfacing of the articular surface. No difference was found between the three groups with regard to pain, mobility, stability, flexion or extension deficit, extension lag, alignment, walking distance, stair climbing or the need for walking aids. The rate of patellofemoral complications was also the same in the three groups. The number of reoperations did not differ according to whether or not the posterior surface of the patella had been resurfaced. Division of the lateral patellar retinaculum caused the most complications in all three groups.

OBJECTIVE To compare the results of primary total knee arthroplasty with patellar reshaping or resurfacing. METHODS One hundred thirty-three patients were randomized into a patellar reshaping group and a patellar resurfacing group. Patellar reshaping included resecting part of the lateral facet of the patella and the osteophytes surrounding the patella, and trimming the patella to match the trochlea of the femoral component. The minimum follow-up time was 7 years. The outcome was measured by anterior knee pain rate, Knee Society clinical score, and radiographs. RESULTS Eight patients in the reshaping group (12.5%) and 10 patients in the resurfacing group (14.7%) complained of anterior knee pain (P=0.712). Meanwhile, there were no significant differences between the two groups in terms of total Knee Society score, Knee Society pain score, Knee Society function score, as well as anterior knee pain rate.
CONCLUSIONS With the numbers available, there was no significant difference between the groups treated with patellar reshaping or patellar resurfacing with regard to the KSS, anterior knee pain rate and radiographs. We prefer reshaping the patella to resurfacing the patella because the former preserves sufficient patellar bone stock and can easily be converted to patellar replacement if patients complain of recurrent anterior knee pain.

The Insall-Burstein and Insall-Burstein II posterior-stabilized (I-B II PS) prostheses have been reported to have a high prevalence of patellar complications. This is a prospective, consecutive study of 118 primary total knee arthroplasties in 82 patients with the I-B II PS prosthesis implanted by 1 surgeon, using a specific technique for patellar resurfacing. The mean follow-up time was 4.0 years (range, 2-8 years). Clinical evaluation was performed using a standard knee score system with specific additional evaluation of the patellofemoral joint. Radiographs were evaluated for fracture, loosening, and subluxation. Ninety-four knees (80%) were rated excellent, 21 knees (17%) good, and 3 knees (3%) fair. The mean flexion was 112 degrees postoperatively. No knee required reoperation for the patellofemoral joint. There were 2 nondisplaced and 1 minimally displaced patellar fractures treated nonoperatively, no patellar clunk syndrome, and no subluxations. Using the patellar evaluation system, 109 knees had no anterior knee pain, 7 knees had mild pain, and 2 knees (1 patient) had moderate-to-severe pain only with rising from a chair. Patellofemoral crepitus with active flexion-extension in the seated position was noted in 16 knees (14%) but was painful in only 2 knees (1 patient). With this technique for patellar resurfacing with this prosthesis, patellofemoral complications occurred in only 4.2%, and no knee required reoperation for the patella or for loosening.
With attention to operative technique, patellofemoral resurfacing with this posterior-stabilized total knee arthroplasty can be highly successful.

One hundred patients with osteoarthritic knees were randomized either to have their patella resurfaced or not resurfaced using the same total knee replacement. These patients were assessed preoperatively and a minimum of 2 years postoperatively using disease-specific (Knee Society Clinical Rating System) and functional capacity (30-second stair climbing and knee flexor and extensor torques) outcome measures. Two patients in the not-resurfaced group required reoperation because of anterior knee pain. At 2 years' follow-up, the not-resurfaced group had significantly less pain and better knee flexor torques than did the resurfaced group, whereas the results of the Knee Society Function Scores, 30-second stair climbing, and knee extensor torques were similar. These results suggest that longer-term follow-up is required, but that one should keep an open mind regarding patellar resurfacing during total knee replacement.

Simultaneous bilateral total knee arthroplasty was performed in twenty-six patients who had rheumatoid arthritis, and a patellar replacement was performed concurrently in one randomly selected knee in each patient. A lateral retinacular release was performed in all knees. The patients were followed for at least six years (mean, 6.6 years; range, 6.0 to 7.5 years), and the postoperative status of the patients was evaluated with the knee score of The Hospital for Special Surgery. Pain on standing and on ascending or descending stairs as well as tenderness of the patellofemoral joint also were assessed. The overall score and the individual scores for pain, function, range of motion, muscle strength, flexion contracture, and instability were not significantly different between the knees that had had a patellar replacement and those that had not.
However, pain on standing and on ascending or descending stairs as well as tenderness of the patellofemoral joint were only noted in knees that had not had a patellar replacement. These findings suggest that, in order to diminish pain on standing and on using stairs, replacement of the patella during total knee arthroplasty is preferable for patients who have rheumatoid arthritis.

We have examined the differences in clinical outcome of total knee replacement (TKR) with and without patellar resurfacing in a prospective, randomised study of 181 osteoarthritic knees in 142 patients using the Profix total knee system, which has a femoral component with features considered to be anatomical and a domed patellar implant. The procedures were carried out between February 1998 and November 2002. A total of 159 TKRs in 142 patients were available for review at a mean of four years (3 to 7). The patients and the clinical evaluator were blinded in this prospective study. Evaluation was undertaken annually by an independent observer using the knee pain scale and the Knee Society clinical rating system. Specific evaluation of anterior knee pain, stair-climbing and rising from a seated to a standing position was also undertaken. No benefit was shown of TKR with patellar resurfacing over that without resurfacing with respect to any of the measured outcomes. In 22 of 73 knees (30.1%) with and 18 of 86 knees (20.9%) without patellar resurfacing there was some degree of anterior knee pain (p = 0.183). No revisions related to the patellofemoral joint were performed in either group. Only one TKR in each group underwent a re-operation related to the patellofemoral joint.
A significant association between knee flexion contracture and anterior knee pain was observed in those knees with patellar resurfacing (p = 0.006).

We conducted a randomized clinical trial to determine long-term outcome differences of patella resurfacing versus non-resurfacing in patients undergoing bilateral total knee arthroplasty. We questioned whether there were differences with respect to the operative procedure, anterior knee pain, Knee Society scores, patellofemoral-related revision rates, patient satisfaction and preference, and patellofemoral functional activities. Thirty-two patients (64 knees) underwent primary bilateral single-stage total knee arthroplasty for osteoarthritis. All patients received the same cruciate-retaining total knee arthroplasty. Patients were randomized to resurfacing or nonresurfacing of the patella for the first total knee arthroplasty, and the second knee received the opposite treatment. All living patients were followed to a minimum of 10 years. We found no differences with regard to range of motion, Knee Society Clinical Rating Score, satisfaction, revision rates, or anterior knee pain. Thirty-seven percent of patients preferred the resurfaced knee, 22% the nonresurfaced knee, and 41% had no preference. Two patients (7.4%) in the nonresurfaced group and one patient (3.5%) in the resurfaced group underwent revision for a patellofemoral-related complication. Equivalent clinical results for resurfaced and nonresurfaced patellae in total knee arthroplasty were demonstrated in this 10-year randomized clinical trial. Level of Evidence: Level I, therapeutic study. See the Guidelines for Authors for a complete description of levels of evidence.

Background: Anterior knee pain following total knee arthroplasty is a common complaint and typically is attributed to the patellofemoral joint.
The purpose of the present study was to compare the outcome of resurfacing and nonresurfacing of the patella, particularly with regard to anterior knee pain, and to clarify the indications for patellar resurfacing at the time of total knee arthroplasty. Methods: We performed a prospective, randomized study of 514 consecutive primary press-fit condylar total knee replacements. The patients were randomized to either resurfacing or retention of the patella. They were also randomized to either a cruciate-substituting or a cruciate-retaining prosthesis as part of a separate trial. The mean duration of follow-up was 5.3 years (range, two to 8.5 years), and the patients were assessed with use of the Knee Society rating, a clinical anterior knee pain score, and the British Orthopaedic Association patient-satisfaction score. The assessment was performed without the examiner knowing whether the patella had been resurfaced. At the time of follow-up, there were 474 knees. Thirty-five patients who had a bilateral knee replacement underwent resurfacing on one side only. Results: The overall prevalence of anterior knee pain was 25.1% (fifty-eight of 231 knees) in the nonresurfacing group, compared with 5.3% (thirteen of 243 knees) in the resurfacing group (p < 0.0001). There was one case of component loosening. Ten of eleven patients who underwent secondary resurfacing had complete relief of anterior knee pain. The overall postoperative knee scores were lower in the nonresurfacing group, and the difference was significant among patients with osteoarthritis (p < 0.01). There was no significant difference between the resurfacing and nonresurfacing groups with regard to the postoperative function score. Patients who had a bilateral knee replacement were more likely to prefer the resurfaced side.
Conclusions: As the present study showed a significantly higher rate of anterior knee pain following arthroplasty without patellar resurfacing, we recommend patellar resurfacing at the time of total knee replacement when technically possible. Level of Evidence: Therapeutic study, Level I-1a (Randomized controlled trial [significant difference]). See Instructions to Authors for a complete description of levels of evidence.

BACKGROUND Fixed bearing (FB) total knee replacement is a well established technique against which new techniques must be compared. Mobile bearing (MB) prostheses, in theory, reduce polyethylene wear, but the literature is yet to provide evidence that they are superior in terms of function or long-term survivorship. In addition there has been no comparison of the effect of patella resurfacing on the outcome of either design. The aims of this randomised prospective study were firstly to determine whether a mobile bearing prosthesis produced better clinical outcome and range of motion at two-year follow-up, and secondly to assess the effect of patella resurfacing on the outcomes of both types of bearing design. METHODS Three hundred fifty-two patients were randomised to receive a PFC Sigma© cruciate-sacrificing total knee arthroplasty with either a mobile bearing or a fixed bearing, with a sub-randomisation to either patella resurfacing or patella retention. All patients participated with standard clinical outcome measures and had their range of motion measured both pre-operatively and at follow-up. RESULTS The mobile bearing TKR design had no impact on range of motion, Oxford Knee Score, or American Knee Society knee and function scores when compared to its fixed bearing equivalent. CONCLUSIONS At two-year follow-up there was no difference between the PFC Sigma© fixed and mobile bearing designs. With no clinical difference between the cohorts, we cannot recommend one design over the other.
Long-term benefits, particularly with regard to polyethylene wear, may yet be demonstrated. Level of evidence: 1B.

Background: The management of the patella in total knee arthroplasty is still problematic. We aimed to identify differences in the clinical outcome of total knee arthroplasty according to whether or not patellar resurfacing had been performed in a prospective, randomized study of 220 osteoarthritic knees. Methods: Two hundred and twenty total knee arthroplasties in 201 patients were randomly assigned to be performed with either resurfacing or retention of the patella, and the results were followed for a mean of forty-eight months (range, thirty-six to seventy-nine months) in a double-blind (both patient and clinical evaluator blinded), prospective study. Evaluation was performed annually by an independent observer and consisted of assessment with the Knee Society clinical rating system, specific evaluation of anterior knee pain, a stair-climbing test, and radiographic examination. Results: Fifteen (12%) of the 128 knees without patellar resurfacing and nine (10%) of the ninety-two knees with patellar resurfacing underwent a revision or another type of reoperation related to the patellofemoral articulation. This difference was not significant (chi square with one degree of freedom = 0.206, p = 0.650). At the time of the latest follow-up, there was a significantly higher incidence of anterior pain (chi square with one degree of freedom = 5.757, p = 0.016) in the knees that had not had patellar resurfacing. Conclusions: Patients who underwent patellar resurfacing had superior clinical results in terms of anterior knee pain and stair descent. However, anterior knee pain still occurred in patients with patellar resurfacing, and nine (10%) of the ninety-two patients in that group underwent a revision or another type of reoperation involving the patellofemoral joint.
Weight but not body mass index was associated with the development of anterior knee pain in the patients without patellar resurfacing, a finding that suggests that patellofemoral dysfunction may be a function of joint loading rather than obesity Background: Anterior knee pain following total knee arthroplasty (TKA) remains one of the important reasons for patient dissatisfaction. The management of the patellofemoral joint is controversial, and the decision whether or not to resurface the patella is important. The present study compares the clinical and radiological outcomes between patellar resurfacing and nonresurfacing in patients undergoing bilateral TKA. Materials and Methods: This is a prospective comparative study with 60 patients undergoing bilateral simultaneous TKA (120 knees) with a posterior stabilized Hi flex fixed bearing knee (Zimmer, Warsaw, Indiana), by two surgeons. The patients were allocated to the two groups of resurfacing versus nonresurfacing of the patella. In the nonresurfacing group, patellaplasty was done. Patients with clinicoradiological signs of tricompartmental arthritis were included in the study. Exclusion criteria included unilateral TKA, rheumatoid arthritis, postseptic arthritis, previous high tibial osteotomy, or unicondylar knee arthroplasty cases. Patients were assessed using the Knee Society Score (KSS), Modified Samsung Medical Centre Score (MSMCS), and Feller patellar score. Radiological evaluation was performed at 1 year using congruence angle and patellar tilt angle. Results: Mean followup was 19 months (range 12–25 months). Mean KSS, MSMCS, and Feller patellar scores in the resurfacing group were 82.67, 10.68, and 25.97, respectively, and in the nonresurfacing group were 82.93, 10.48, and 24.90, respectively.
Mean congruence angle in the resurfacing group was −12.83° and in the nonresurfacing group was −12.383° (P = 0.917), and mean patellar tilt angle in the resurfacing group was 8.07 and in the nonresurfacing group 7.97 (P = 0.873). Conclusion: There was no statistically significant difference in short-term clinical, functional, and radiological outcomes between the two groups, and therefore routine patellar resurfacing for patients undergoing TKA is not advantageous
2,091
23,904,227
Low-quality evidence from single trials suggested no significant difference in pain or function between two types of pelvic support belt, or between osteopathic manipulation (OMT) and usual care or sham ultrasound (sham US). Very low-quality evidence suggested that a specially-designed pillow may relieve night pain better than a regular pillow. For pelvic pain, there was moderate-quality evidence that acupuncture significantly reduced evening pain better than exercise; both were better than usual care. Low-quality evidence from single trials suggested that adding a rigid belt to exercise improved average pain but not function; acupuncture was significantly better than sham acupuncture for improving evening pain and function, but not average pain; and evening pain relief was the same following either deep or superficial acupuncture. Low-quality evidence from single trials suggested that OMT significantly reduced pain and improved function; either a multi-modal intervention that included manual therapy, exercise and education (MOM) or usual care significantly reduced disability, but only MOM improved pain and physical function; acupuncture improved pain and function more than usual care or physiotherapy; pain and function improved more when acupuncture was started at 26 rather than 20 weeks' gestation; and auricular (ear) acupuncture significantly improved these outcomes more than sham acupuncture. Moderate-quality evidence suggested that acupuncture or exercise, tailored to the stage of pregnancy, significantly reduced evening pelvic pain or lumbo-pelvic pain more than usual care alone; acupuncture was significantly more effective than exercise for reducing evening pelvic pain; and a 16- to 20-week training program was no more successful than usual prenatal care at preventing pelvic pain or LBP.
Physiotherapy, OMT, acupuncture, a multi-modal intervention, or the addition of a rigid pelvic belt to exercise seemed to relieve pelvic or back pain more than usual care alone. Acupuncture was more effective than physiotherapy at relieving evening lumbo-pelvic pain and disability, and improved pain and function more when it was started at 26 rather than 20 weeks' gestation, although the effects were small. There was no significant difference in LBP and function for different support belts, exercise, neuro emotional technique or spinal manipulation (SMT), or in evening pelvic pain between deep and superficial acupuncture.
BACKGROUND More than two-thirds of pregnant women experience low-back pain (LBP) and almost one-fifth experience pelvic pain. Pain increases with advancing pregnancy and interferes with work, daily activities and sleep. OBJECTIVES To assess the effects of interventions for preventing and treating pelvic and back pain in pregnancy.
Background Previous publications indicate that acupuncture is efficient for the treatment of pelvic girdle pain, PGP, in pregnant women. However, the use of acupuncture for PGP is rare due to insufficient documentation of adverse effects of this treatment in this specific condition. The aim of the present work was to assess adverse effects of acupuncture on the pregnancy, mother, delivery and the fetus/neonate in comparison with women who received stabilising exercises as an adjunct to standard treatment or standard treatment alone. Methods In all, 386 women with PGP entered this controlled, single-blind trial. They were randomly assigned to standard treatment plus acupuncture (n = 125), standard treatment plus specific stabilising exercises (n = 131) or to standard treatment alone (n = 130) for 6 weeks. Acupuncture that may be considered strong was used and treatment was started as early as in the second trimester of pregnancy. Adverse effects were recorded during treatment and throughout the pregnancy. Influence on the fetus was measured with cardiotocography (CTG) before, during and after 43 acupuncture sessions in 43 women. A standardised computerized method to analyze the CTG reading numerically (Oxford 8000, Oxford, England) was used. After treatment, the women rated their overall experience of the treatment and listed adverse events, if any, in a questionnaire. Data on analgesia and oxytocin augmentation during labour, duration of labour, frequency of preterm birth, operative delivery, Apgar score, cord-blood gas/acid base balance and birth weight were also recorded. Results There were no serious adverse events after any of the treatments. Minor adverse events were common in the acupuncture group but women rated acupuncture favourably even despite this. The computerized or visually assessed CTG analyses of antenatal recordings in connection with acupuncture were all normal.
Conclusion This study shows that acupuncture administered with a stimulation that may be considered strong led to minor adverse complaints from the mothers but had no observable severe adverse influences on the pregnancy, mother, delivery or the fetus/neonate Background. Low back and pelvic pain is common in pregnancy and postpartum, but there is no well documented effect of treatment in pregnancy. The aim of the study was to assess whether a group intervention program for pregnant women with pelvic girdle pain has any effect on pain and daily function postpartum. Methods. Pregnant women with pelvic pain between the 18th and 32nd week of gestation were invited to participate in a randomized clinical study. Among 958 examined women, 569 (59%) fulfilled the inclusion criteria. Women randomized to the intervention group (n = 275) participated in an education program that consisted of information, ergonomics, exercises, pain management, advice for daily life movement, pelvic belt/crutches, and information about delivery. Women randomized to the control group (n = 285) were not offered any treatment, but were free to seek advice or other treatment. Clinical measures and self-evaluated utility of the intervention were measured by a visual analogue scale 0–10. Results. Mean debut of pelvic girdle pain in pregnancy was at week 15. Altogether 42% of the women reported problems with low back pain earlier, and 34% reported a family history of pelvic girdle pain in pregnancy. Median visual analogue scale score for all activities at inclusion was 6 in both the control group and the intervention group. At 6 and 12 months postpartum the score was reduced to 1.7/1.6 and 1.1/0.9. In the intervention group, 75% marked a self-evaluated utility visual analogue scale score > 7. In the control group, 60% had searched for alternative treatment. Conclusions.
Postpartum pelvic girdle pain improved with time in both the intervention group and the control group, but there were no statistically significant differences between the groups. Self-evaluated utility of the intervention was, however, high in the intervention group This study was undertaken to investigate the effects of acupuncture on low back and pelvic pain during pregnancy under real life conditions, as compared with patients undergoing conventional treatment alone. A total of 61 conventionally treated pregnant women were allocated randomly into two groups to be treated or not by acupuncture. Twenty-seven patients formed the study group and 34 the control group. They reported the severity of pain using a Numerical Rating Scale from 0 to 10, and their capacity to perform general activities, to work, and to walk. We also assessed the use of analgesic drugs. Women were followed up for eight weeks and interviewed five times, at two-week intervals. All women completed the study. In the study group the average pain during the study period showed a larger reduction (4.8 points) than in the control group (−0.3 points) (P<0.0001). Average pain scores decreased by at least 50% over time in 21 (78%) patients in the acupuncture group and in five (15%) patients in the control group (P<0.0001). Maximum pain and pain at the moment of interview were also less in the acupuncture group compared with the control group. The capacity to perform general activities, to work and to walk was improved significantly more in the study group than in the control group (P<0.05). The use of paracetamol was lower in the acupuncture group (P<0.01).
These results indicate that acupuncture seems to alleviate low back and pelvic pain during pregnancy, as well as to increase the capacity for some physical activities and to diminish the need for drugs, which is a great advantage during this period OBJECTIVE To investigate if water-gymnastics during pregnancy may reduce the intensity of back/low back pain and the number of days on sick-leave. METHODS A prospective, randomized study. One hundred and twenty-nine women were randomized to participate in water-gymnastics once a week during the second half of pregnancy and 129 were randomized to a control group. The women in both groups filled in questionnaires in gestational weeks 18 and 34 and within the first postpartum week. Every day from week 18 to labor they assessed the intensity of back/low back pain. RESULTS Back pain intensity increased during pregnancy. No excess risk for the pregnancy associated with water-gymnastics was observed. The women participating in water-gymnastics recorded a lower intensity of back/low back pain. The total number of days on sick-leave because of back/low back pain was 982 in the water-gymnastics group (124 women) compared with 1484 in the control group (120 women). After weeks 32–33, seven women in the water-gymnastics group compared with 17 in the control group were on sick-leave because of back/low back pain (p=0.031). CONCLUSIONS Intensity of back/low back pain increased with advancing pregnancy. There was no excess risk for urinary or vaginal infections associated with water-gymnastics. Water-gymnastics during the second half of pregnancy significantly reduced the intensity of back/low back pain. Water-gymnastics decreased the number of women on sick-leave because of back/low back pain.
Water-gymnastics during pregnancy can be recommended as a method to relieve back pain and may reduce the need for sick-leave Abstract Objectives To compare the efficacy of standard treatment, standard treatment plus acupuncture, and standard treatment plus stabilising exercises for pelvic girdle pain during pregnancy. Design Randomised single blind controlled trial. Settings East Hospital, Gothenburg, and 27 maternity care centres in Sweden. Participants 386 pregnant women with pelvic girdle pain. Interventions Treatment for six weeks with standard treatment (n = 130), standard treatment plus acupuncture (n = 125), or standard treatment plus stabilising exercises (n = 131). Main outcome measures Primary outcome measure was pain (visual analogue scale); secondary outcome measure was assessment of severity of pelvic girdle pain by an independent examiner before and after treatment. Results After treatment the stabilising exercise group had less pain than the standard group in the morning (median difference = 9, 95% confidence interval 1.7 to 12.8; P = 0.0312) and in the evening (13, 2.7 to 17.5; P = 0.0245). The acupuncture group, in turn, had less pain in the evening than the stabilising exercise group (−14, −18.1 to −3.3; P = 0.0130). Furthermore, the acupuncture group had less pain than the standard treatment group in the morning (12, 5.9 to 17.3; P < 0.001) and in the evening (27, 13.3 to 29.5; P < 0.001). Attenuation of pelvic girdle pain as assessed by the independent examiner was greatest in the acupuncture group. Conclusion Acupuncture and stabilising exercises constitute efficient complements to standard treatment for the management of pelvic girdle pain during pregnancy. Acupuncture was superior to stabilising exercises in this study Background.
The study was designed to evaluate the analgesic effect and possible adverse effects of acupuncture for pelvic and low-back pain during the last trimester of pregnancy A single center, prospective, randomized, single blinded, controlled study comparing the effects and safety of "sitting pelvic tilt exercise" in relieving back pain during the third trimester in primigravidas was carried out. The samples were composed of 67 primigravidas who attended the prenatal clinic at King Chulalongkorn Memorial Hospital. All subjects were selected by the random sampling technique and allocated into two groups, the experimental group and the control group, of 32 and 35 pregnant women, respectively. The experimental group received the pelvic tilt exercise program for 8 weeks during the third trimester. Pain intensity was measured by visual analogue scale (VAS) at day 0 and day 56 in both groups. The results of the study revealed: 1) the mean VAS of back pain in the experimental group was significantly lower at day 56 than at day 0 and lower than in the control group at day 56 (p < 0.05) by unpaired t-test; 2) there was no incidence of preterm labor, low birth weight or neonatal complications in the experimental group. In conclusion, the "sitting pelvic tilt exercise" during the third trimester in primigravidas could decrease back pain intensity without incidence of preterm labor, low birth weight or neonatal complications BACKGROUND AND PURPOSE Symphysis pubis pain is a significant problem for some pregnant women. The purpose of this study was to investigate the effects of exercise, advice, and pelvic support belts on the management of symphysis pubis dysfunction during pregnancy. SUBJECTS Ninety pregnant women with symphysis pubis dysfunction were randomly assigned to 3 treatment groups. METHODS A randomized masked prospective experimental clinical trial was conducted.
Specific muscle strengthening exercises and advice concerning appropriate methods for performing activities of daily living were given to the 3 groups, and 2 of the groups were given either a rigid pelvic support belt or a nonrigid pelvic support belt. The dependent variables, which were measured before and after the intervention, were a Roland-Morris Questionnaire score, a Patient-Specific Functional Scale score, and a pain score (101-point numerical rating score). RESULTS After the intervention, there was a significant reduction in the Roland-Morris Questionnaire score, the Patient-Specific Functional Scale score, and the average and worst pain scores in all groups. With the exception of average pain, there were no significant differences between groups for the other measures. DISCUSSION AND CONCLUSION The findings indicate that the use of either a rigid or a nonrigid pelvic support belt did not add to the effects provided by exercise and advice Background. Prevention of lumbopelvic pain in pregnancy has been sparsely studied. One aim of this study was to assess if a 12-week training program during pregnancy can prevent and/or treat lumbopelvic pain. A randomized controlled trial was conducted at Trondheim University Hospital and three outpatient physiotherapy clinics. Three hundred and one healthy nulliparous women were included at 20 weeks of pregnancy and randomly allocated to a training group (148) or a control group (153). Methods. The outcome measures were self-reported symptoms of lumbopelvic pain (once per week or more), sick leave, and functional status. Pain drawing was used to document the painful area of the body. The intervention included daily pelvic floor muscle training at home, and weekly group training over 12 weeks including aerobic exercises, pelvic floor muscle and additional exercises, and information related to pregnancy. Results.
At 36 weeks of gestation women in the training group were significantly less likely to report lumbopelvic pain: 65/148 (44%) versus 86/153 (56%) (p = 0.03). Three months after delivery the difference was 39/148 (26%) in the training group versus 56/153 (37%) in the control group (p = 0.06). There was no difference in sick leave during pregnancy, but women in the training group had significantly (p = 0.01) higher scores on functional status. Conclusions. A 12-week specially designed training program during pregnancy was effective in preventing lumbopelvic pain in pregnancy Study Design This study analyzed an education and training program concerning back and pelvic problems among pregnant women. Objective The program was aimed at reducing back and posterior pelvic pain during pregnancy. Summary of Background Data Low back and posterior pelvic pain accounts for the majority of sick leave among pregnant women. No previous study has suggested any type of solution to this problem. Methods Four hundred and seven consecutive pregnant women were included in the study and randomly assigned into three groups. Group A served as controls while different degrees of interventions were made in groups B and C. Results Serious back or posterior pelvic pain developed in 47% of all women. Pain-related problems were reduced in groups B and C (P < 0.05), and sick-leave frequency was reduced in group C (P < 0.01). For some of the women in this group pain intensity was also reduced 8 weeks post partum (P < 0.005). Weekly physical exercise before pregnancy reduced the risk for back pain problems in pregnancy (P < 0.05). A non-elastic sacroiliac belt offered some pain relief to 82% of the women with posterior pelvic pain. Conclusions An individually designed program reduced sick leave during pregnancy. Working with groups was less effective. Differentiation between low back and posterior pelvic pain was essential.
Good physical fitness reduced the risk of back pain in a subsequent pregnancy. Reduction of posterior pelvic pain by a non-elastic pelvic support was experienced by 82% of the women with posterior pelvic pain Background. In this prospective epidemiologic cohort study the aim was to identify possible risk factors for developing four different syndromes of pelvic girdle pain during pregnancy. Methods. Over a one-year period a total of 2,269 consecutive pregnant women – at week 33 of gestation – responded to a structured questionnaire and underwent a thorough physical examination. Women who at baseline reported daily pain from pelvic joints and had corresponding objective findings were allocated, according to symptoms, into one of four classification groups, and followed up with questionnaires and physical examinations up to two years after delivery. Results. Multivariate analysis could distinguish the four pelvic pain subgroups from the "Pelvic healthy" group with respect to 13 of 24 variables. The pelvic girdle syndrome group revealed a history of previous low back pain, trauma of the back or pelvis, multiparity, a relatively higher weight, and a higher level of self-reported stress and of job. At a higher risk of developing symphysiolysis were women who were multiparae, had a relatively higher weight, and were smokers. If a woman had vocational training or a professional education, was stressed, had a poorer experience of previous delivery, had previous low back pain, trauma of the back, or previous salpingitis, she had an increased risk of developing one-sided sacroiliac syndrome. The risk factors for developing double-sided sacroiliac syndrome were previous low back pain and trauma of the back or pelvis, multiparity, a poorer relationship with the spouse, and less job satisfaction. Conclusions.
This study demonstrates no single dominant risk factor for developing pelvic girdle pain in pregnancy, but reveals a set of physical and psychosocial factors. The risk factors for developing pelvic girdle pain in general are: history of previous low back pain, trauma of the back or pelvis, multiparity, higher level of stress, and low job satisfaction OBJECTIVE To investigate the effect of exercise during pregnancy on the intensity of low back pain and kinematics of the spine. METHOD A prospective randomized study was designed. 107 women participated in an exercise program three times a week during the second half of pregnancy for 12 weeks and 105 served as the control group. All filled in a questionnaire between 17 and 22 weeks of gestation and 12 weeks later for assessment of their back pain intensity. Lordosis and flexibility of the spine were measured by flexible ruler and side bending test, respectively, at the same times. Weight gain during pregnancy, pregnancy length and neonatal weight were recorded. RESULT Low back pain intensity was increased in the control group. The exercise group showed significant reduction in the intensity of low back pain after exercise (p<0.0001). Flexibility of the spine decreased more in the exercise group (p<0.0001). Weight gain during pregnancy, pregnancy length and neonatal weight were not different between the two groups. CONCLUSION Exercise during the second half of pregnancy significantly reduced the intensity of low back pain, had no detectable effect on lordosis and had a significant effect on flexibility of the spine Objective. An earlier publication showed that acupuncture and stabilising exercises as an adjunct to standard treatment were effective for pelvic girdle pain during pregnancy, but the post-pregnancy effects of these treatment modalities are unknown. The aim of this follow-up study was to describe regression of pelvic girdle pain after delivery in these women. Design.
A randomised, single blind, controlled trial. Setting. East Hospital and 27 maternity care centres in Göteborg, Sweden. Population. Some 386 pregnant women with pelvic girdle pain. Methods. Participants were randomly assigned to standard treatment plus acupuncture (n = 125), standard treatment plus specific stabilising exercises (n = 131) or to standard treatment alone (n = 130). Main outcome measures. Primary outcome measure: pain intensity (Visual Analogue Scale). Secondary outcome measure: assessment of the severity of pelvic girdle pain by an independent examiner 12 weeks after delivery. Results. Approximately three-quarters of all the women were free of pain 3 weeks after delivery. There were no differences in recovery between the 3 treatment groups. According to the detailed physical examination, pelvic girdle pain had resolved in 99% of the women 12 weeks after delivery. Conclusions. This study shows that irrespective of treatment modality, regression of pelvic girdle pain occurs in the great majority of women within 12 weeks after delivery The effect of Pycnogenol was studied in women in the third trimester of pregnancy complaining of lower back pain, hip joint pain, pelvic pain (pain in the inguinal region), pain due to varices or calf cramps. The women were supplemented with Pycnogenol at a dose of 30 mg/day without any other therapy. Alleviation of pain was evaluated by pain scores until delivery. A significant reduction of pain could be obtained compared with the control group, where no decrease in pain scores for any symptoms was reported. No unwanted effects were observed in the Pycnogenol group. These results indicate the potential of Pycnogenol to reduce pain associated with pregnancy OBJECTIVE To compare the effect of a land-based physical exercise program versus water aerobics on low back or pelvic pain and sick leave during pregnancy. DESIGN Randomized controlled clinical trial.
SETTING Three antenatal care centers. PARTICIPANTS 390 healthy pregnant women. INTERVENTIONS A land-based physical exercise program or water aerobics once a week during pregnancy. MAIN OUTCOME MEASURES Sick leave, pregnancy-related low back pain or pregnancy-related pelvic girdle pain, or both. RESULTS Water aerobics diminished pregnancy-related low back pain (p=.04) and sick leave due to pregnancy-related low back pain (p=.03) more than a land-based physical exercise program. CONCLUSIONS Water aerobics can be recommended for the treatment of low back pain during pregnancy. The benefits of a land-based physical exercise program are questionable and further evaluation is needed Study Design. A randomized assessor-blinded clinical trial was conducted. Objective. To compare 3 different physical therapy treatments with respect to pain and activity in women with pelvic girdle pain during pregnancy and 3, 6, and 12 months postpartum. Summary of Background Data. In spite of the high prevalence of back pain during pregnancy, documented treatment programs are limited. Methods. Based on a clinical examination, 118 women with pelvic girdle pain diagnosed during pregnancy were randomized into 3 different treatment groups: Information Group, use of a nonelastic sacroiliac belt and oral/written information about pelvic girdle pain (n = 40); Home Exercise Group, same as in the Information Group, with the addition of a home exercise program (n = 41); and the In Clinic Exercise Group, same as in the Information Group, plus participation in a training program (n = 37). Pain intensity was rated on a visual analogue scale (0–100 mm) and marked on a pain drawing concerning localization. Activity ability was scored using the Disability Rating Index, covering 12 daily activity items. Outcome measures were obtained at inclusion, on average in gestation week 38, and 3, 6, and 12 months postpartum. Results.
There was no significant difference among the 3 groups during pregnancy or at the follow-ups postpartum regarding pain and activity. In all groups, pain decreased and activity ability increased between gestation week 38 and 12 months postpartum. Conclusions. Women with pelvic girdle pain seemed to improve with time in all 3 treatment groups. Neither home nor in clinic exercises had any additional value above giving a nonelastic sacroiliac belt and information Study Design A longitudinal, prospective, observational, cohort study. Objectives To describe the natural history of back pain occurring during pregnancy and immediately after delivery. Summary of Background Data Back pain during pregnancy is a frequent clinical problem even during the early stages of pregnancy. The cause is unclear. Methods A cohort of 200 consecutive women attending an antenatal clinic were followed throughout pregnancy with repeated measurements of back pain and possible determinants by questionnaires and physical examinations. Results Seventy-six percent reported back pain at some time during pregnancy. Sixty-one percent reported onset during the present pregnancy. In this group, the prevalence rate increased to 48% until the 24th week and then remained stable and declined to 9.4% after delivery. The reported pain intensity increased with pain duration. The pain score correlated closely with self-rated disability and days of sickness benefit. Conclusions Back pain during pregnancy is a common complaint. The 30% with the highest pain scores reported great difficulties with normal activities. The back pain started early in pregnancy and increased over time. Young women had more pain than older women. Back pain starting during pregnancy may be a special entity and may have another origin than back pain not related to pregnancy Study Design. A prospective randomized controlled 6-year follow-up study of women with back pain during pregnancy. Objectives.
To describe the long-term development of back pain in relation to pregnancy and to identify the effects of a physiotherapy and patient education program attended during pregnancy. Summary of Background Data. Pain incidence and intensity during pregnancy can be reduced by physiotherapy. No study has described the development of pain experienced for a period of years after delivery or the long-term effect of physiotherapy. Methods. Pregnant women, registered consecutively, were randomly assigned to one control group and two intervention groups and were observed throughout pregnancy, with follow-up after 3 months and 6 years. Results. The first phase of the study was completed by 362 women. After 3 months, 351 and after 6 years, 303 women had been observed. Back pain among 18% of all women before pregnancy and among 71% during pregnancy declined to 16% after 6 years. Pain intensity was highest in week 36 (visual analog score, 5.4) and declined markedly 6 years later (visual analog score, 2.5). Slow regression of pain after partus correlated with having a back pain history before pregnancy (r = 0.30; P < 0.05), with high pain intensity during pregnancy (r = 0.45; P < 0.01), and with much residual pain 3 months after pregnancy (r = 0.41; P < 0.01). These correlations were not found in the intervention groups. Furthermore, frequency of back pain attacks at 6 years correlated with frequency of attacks during pregnancy (r = 0.41; P < 0.01) and with a vocational factor (r = −0.25; P < 0.01). Physiotherapy and patient education had no effects on back pain development among women without pain during pregnancy. Conclusions. Back pain during pregnancy regressed spontaneously soon after delivery and improved in few women later than 6 months post partum.
Expected correlations between back pain in relation to pregnancy and back pain 6 years later were not present in the intervention groups, who had attended a physiotherapy and education program during pregnancy. The program had no prophylactic effects on women without back or pelvic pain during pregnancy Background. The aim of this study was to describe the effects of acupuncture in the treatment of low-back and pelvic pain during pregnancy and compare it with physiotherapy Background. The efficacy of acupuncture for low-back and/or pelvic pain in late pregnancy is reviewed in few reports. Our aim was to evaluate the effects of two different acupuncture stimulation modes on pelvic pain intensity and some emotional symptoms due to the pain condition. Methods. In a prospective randomized controlled single-blind study, pregnant women with pelvic pain, median gestational age 26 weeks (range 18–35), were given 10 acupuncture treatments. Needles were inserted subcutaneously over acupuncture points without further stimulation (superficial, n=22), or intramuscularly and stimulated repeatedly until a perceived sensation of numbness, de qi (deep, n=25). Self-reported pain intensity at rest and during daily activities was assessed on a visual analog scale. The variables pain, emotional reactions, and loss of energy were assessed according to the Nottingham Health Profile questionnaire. Changes in assessed variables were analyzed with a nonparametric statistical method allowing for analysis of systematic group changes separated from additional individual changes. Results. After acupuncture stimulation, significant systematic group changes towards lower levels of pain intensity at rest and in daily activities, as well as in rated emotional reaction and loss of energy, were seen. The results also showed additional individual changes in most variables.
In this study, no differences between the effects induced by the superficial and deep acupuncture stimulation modes were observed. Conclusion. Acupuncture stimulation that is individually designed may be a valuable treatment to ameliorate suffering in the condition of pelvic pain in late pregnancy. Twenty-six pregnant women were assigned to a massage therapy or a relaxation therapy group for 5 weeks. The therapies consisted of 20-min sessions twice a week. Both groups reported feeling less anxious after the first session and less leg pain after the first and last sessions. Only the massage therapy group, however, reported reduced anxiety, improved mood, better sleep and less back pain by the last day of the study. In addition, urinary stress hormone levels (norepinephrine) decreased for the massage therapy group, and the women had fewer complications during labor and their infants had fewer postnatal complications (e.g., less prematurity).
2,092
16,034,866
The data on suburethral sling operations remain too few to address the effects of this type of surgical treatment.
BACKGROUND Traditional suburethral slings are surgical operations used to treat women with symptoms of stress urinary incontinence. OBJECTIVES To determine the effects of traditional suburethral slings on stress incontinence alone, or stress combined with other types of urinary (mixed) incontinence, in comparison with other management options.
PURPOSE As part of a continuous quality control effort to measure the interrater reliability of urodynamic studies performed at multiple centers, we compared agreement levels for urodynamic studies between central and local physician reviewers. We report interrater reliability findings for the filling cystometrogram. MATERIALS AND METHODS Following a satisfactory interrater reliability study among 4 central physician reviewers on 33 tracings, 36 urodynamic study tracings from 9 Urinary Incontinence Treatment Network continence treatment centers and 13 Urinary Incontinence Treatment Network certified urodynamic study testers were randomly selected for review. These tracings were originally interpreted by 11 local physician reviewers using standardized Urinary Incontinence Treatment Network interpretation guidelines. Each of the 4 central physician reviewers reviewed 9 randomly assigned tracings, and none reviewed tracings from his or her own center. Local and central physician reviewers were instructed to categorize values as invalid if specified technical quality assurance standards were not met or the signal pattern suggested implausible values because of technical deficiencies. An intraclass correlation coefficient was calculated for continuous (numerical) variables and a kappa statistic was calculated for qualitative values, with acceptable agreement defined a priori as an intraclass correlation coefficient greater than 0.6. RESULTS Filling cystometrogram baseline pressure, Valsalva leak point pressure, and volume and pressure measurements at maximum cystometric capacity had excellent intraclass correlation coefficients of 0.74 to 0.99. There were no significant differences between local and central physician reviewer means, indicating excellent agreement.
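The chance-corrected agreement statistic used for the qualitative ratings above, Cohen's kappa, can be sketched in a few lines of standard-library Python. The local/central classifications below are invented for illustration only; they are not data from the trial.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    # Agreement expected if each rater labelled independently at their own marginal rates
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical local vs. central validity calls on 10 tracings
local   = ["valid", "valid", "invalid", "valid", "valid",
           "invalid", "valid", "valid", "valid", "invalid"]
central = ["valid", "valid", "invalid", "valid", "invalid",
           "invalid", "valid", "valid", "valid", "invalid"]
kappa = cohens_kappa(local, central)
print(round(kappa, 3))  # → 0.783
```

Here raw agreement is 9/10, but kappa discounts the 0.54 agreement expected by chance alone, which is why it lands well below 0.9.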
CONCLUSIONS With proper quality control measures in place and a set of standardized interpretive guidelines, excellent interrater reliability between local and central physician reviewers can be achieved for numerical cystometrogram variables. OBJECTIVES The aim of this study was to compare the long-term success rates, complication rates and patient satisfaction rates for the Pelvicol pubovaginal sling (Bard) versus TVT (Gynecare) in the surgical treatment of urodynamic stress incontinence (USI) in women. DESIGN Prospective randomized cohort trial. SETTING District General Hospital, South West of England. METHODS One hundred and forty-two women with urodynamic stress incontinence were randomized to either surgical procedure (Pelvicol = 74, TVT = 68) with a median follow-up of 36 months. A postal questionnaire was sent to all women and the response rate was excellent at approximately 90% in both groups. RESULTS Cure of incontinence, as identified by a quality of life improvement > 90% and/or patient-determined continent status as dry, was comparable in both groups. When the cure rates were adjusted assuming the non-respondents to be failures, the figures were almost identical (p > 0.05). Preoperative continence pad usage was similar for both groups. Overall, a significant postoperative decrease in pad score was noted in both groups (p = 0.01), but there was no significant difference between the groups (p > 0.05). Statistical analysis failed to detect significant differences between the groups as regards complication rates such as frequency, nocturia, de-novo urgency or dyspareunia. CONCLUSION The Pelvicol sling is a safe procedure in the surgical management of USI, with success and patient satisfaction rates similar to TVT at up to three years of follow-up. Purpose: To compare short-term results of the autologous pubovaginal sling and the synthetic transobturator (TOT) SAFYRE sling in the treatment of female stress urinary incontinence (SUI).
Methods: Twenty women referred for surgical treatment of SUI were assigned randomly to an autologous pubovaginal sling or a synthetic TOT sling. Inclusion criteria were primary treatment of SUI and a urodynamic study showing SUI without detrusor overactivity. Pre- and postoperative quantification of the severity of incontinence was done by pad test and a validated questionnaire (King's Health Questionnaire). Results: There were no differences in the patients' mean age, parity, body mass index, rate of postmenopausal state, pelvic floor defects or mean Valsalva leak point pressure in the preoperative urodynamic study. Mean operating time (21.1 ± 3.8 vs. 69.5 ± 23.7 min; P<0.001) and hospital stay (28.8 ± 8.4 vs. 44.4 ± 5.8 h; P<0.001) were shorter in the TOT group than in the autologous group. The postoperative pad test result (39.4 ± 12.5 vs. 8.4 ± 5.2 g; P=0.01) and the absence of improvement in quality of life were significantly greater in the TOT group. Conclusion: Our initial results suggest that the synthetic TOT technique was less effective for treating female SUI than the autologous pubovaginal sling. OBJECTIVE The objective of the study was to compare the efficacy and the complications of tension-free vaginal tape (TVT) and Burch colposuspension in the treatment of female genuine stress incontinence (GSI). METHODS In this controlled, prospective, randomized study, 35 patients underwent Burch colposuspension and 36 patients underwent the TVT procedure. Patients with prolapse of more than first degree, previous surgical treatment of stress urinary incontinence (SUI) or detrusor instability were excluded from the study. RESULTS The operative time for TVT was significantly shorter than for Burch colposuspension. The severity and duration of postoperative pain for TVT were significantly less than for Burch colposuspension. The time needed to return to normal activity was 10 days for TVT and 21 days for Burch colposuspension.
The cure rate after 24 months of follow-up was as follows: TVT, 84% and BC, 86%, while the improvement rate was 7% for TVT and 6% for BC. CONCLUSIONS TVT and Burch colposuspension are equally effective in the management of female GSI at two years of follow-up. The TVT procedure requires much less operative time and a much shorter hospitalization, with significantly less postoperative pain and a faster return to normal daily activities than Burch colposuspension. AIM The efficacy, safety and hospital costs of the tension-free vaginal tape procedure were compared with the pubovaginal sling operation. METHODS A total of 60 women urodynamically diagnosed as having stress or mixed urinary incontinence were operated on using either the tension-free vaginal tape or the pubovaginal sling operation in a prospective manner. Preoperative characteristics of the women were not significantly different between the groups. The women were followed for up to 24 months. RESULTS In the tension-free vaginal tape group, the operation time was shorter, fewer analgesics were required postoperatively, and hospital charges were lower than for the pubovaginal sling operation (P < 0.01). Kaplan-Meier survival analysis showed a marginally significant difference (P = 0.059) in the objective cumulative cure rates at 24 months between the groups receiving the former (70.3%) and the latter (48.3%) procedures. Subjective cure rates were not significantly different (P = 0.101). In both groups, the improvement in quality of life was significant, and surgical complications were identical. De novo urge incontinence developed in 6% and 10% of the former and latter groups, respectively. CONCLUSIONS The tension-free tape was significantly superior to the pubovaginal sling in terms of operation time, postoperative pain, and hospital charges, but not in cure rates.
A longer follow-up with a larger sample size is necessary to draw definite conclusions. OBJECTIVES To describe the methods and rationale for the first randomized controlled trial conducted by the Urinary Incontinence Treatment Network. METHODS The primary objective of this clinical trial is to compare two commonly performed surgical procedures for stress urinary incontinence, the Burch colposuspension and the autologous rectus fascial sling, for overall treatment success for urinary incontinence and stress-type symptoms of incontinence at 24 months after surgery. Secondary aims include a comparison of complications, quality of life, sexual function, patient satisfaction, costs, and the need for additional treatments or surgery, and an evaluation of the prognostic value of preoperative urodynamic studies. The Stress Incontinence Surgical Treatment Efficacy Trial is being conducted on 655 women with predominant stress urinary incontinence, as determined by history and physical examination, a urinary stress test with witnessed leakage, and a voiding diary. Administration of all questionnaires and performance of examinations, tests, and both surgical procedures are standardized within and across the clinical centers. Assessments occur preoperatively and at 6 weeks and 3, 6, 12, 18, and 24 months postoperatively. A sample of 655 women ensures 80% power to detect a 12% difference (60% versus 72%) at the 5% significance level. The intent-to-treat analysis will use Fisher's exact test and time-to-failure analyses. RESULTS Enrollment was completed in June 2004, with 24 months of follow-up to end in June 2006.
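The power statement above (655 women, 80% power to detect a 12% difference, 60% versus 72%, at the 5% level) can be sanity-checked against the standard normal-approximation sample-size formula for comparing two proportions. This is a standard-library sketch of that formula, not the trial's actual calculation; the trial's larger enrollment presumably also allows for attrition and for the planned Fisher's exact analysis rather than the plain normal approximation.

```python
from math import sqrt, ceil
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Normal-approximation sample size per arm for a two-proportion comparison."""
    z = NormalDist().inv_cdf
    z_a = z(1 - alpha / 2)   # two-sided significance quantile
    z_b = z(power)           # power quantile
    p_bar = (p1 + p2) / 2
    num = (z_a * sqrt(2 * p_bar * (1 - p_bar))
           + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(num / (p1 - p2) ** 2)

n = n_per_group(0.60, 0.72)
print(n, 2 * n)  # → 244 488, per-arm and total under the plain approximation
```

The approximation gives roughly 488 women in total, comfortably below the 655 enrolled, consistent with the trial's claim of at least 80% power.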
CONCLUSIONS This is the first large, multicenter randomized clinical trial comparing these two standard-of-care procedures for stress incontinence. Various authors have reported that a pubovaginal sling is efficacious for correction of all types of female stress incontinence.7 Autologous, cadaver, and synthetic allografts have been utilized as supporting materials. Leach et al16 have suggested that autologous slings decay over time, resulting in recurrent stress incontinence. Badlani et al10 have reported that synthetic materials are more durable in maintaining stable cure rates than autologous slings. Infectious and erosive complications of synthetic materials are well known. Urethral erosion is the most feared complication of synthetic sling surgery, causing many urologic surgeons to shy away from using synthetic biomaterials. However, not all synthetic materials result in equally unfavorable reactions in human host tissues. Various synthetic materials have differing inherent biochemical and surface characteristics that result in different biological outcomes. This chapter will review these characteristics. Abstract We compared the morbidity and success rate of pubovaginal sling and Burch colposuspension operations in type I/type II genuine stress urinary incontinence (GSI). The study included patients who had no preoperative detrusor instability (DI), no recurrent GSI, no severe pelvic prolapse, and whose Valsalva leak point pressure (VLPP) values were higher than 90 cm water. Twenty-three free rectus fascial sling and 23 Burch colposuspension operations were performed randomly on the patients by a single surgeon. There was no statistical difference between patients in terms of age, BMI, parity, number of daily pads used or preoperative bladder neck mobility. Operation time, change in hematocrit, spontaneous voiding time, length of hospitalization and urinary infection were not different between the 2 procedures.
Seventeen patients from both groups could be compared after one year. The bladder neck mobility of both groups was similar. One surgical failure, 1 DI, 1 severe cystocele and 1 enterocele were found in the Burch group, while only 1 DI was found in the pubovaginal sling group. When the pubovaginal sling operation was performed as the primary surgery on patients with type I/II GSI, the morbidity, complications and 1-year success rate were the same as for the Burch procedure. OBJECTIVES To compare short-term functional outcomes, urodynamic parameters, and quality of life of the transobturator and retropubic routes in the cure of urinary stress incontinence. POPULATION AND METHODS This prospective, multicentre study involved 88 women undergoing a suburethral sling procedure for stress urinary incontinence (SUI). The retropubic route (RPR) and the transobturator route (TOR) were used in 42 and 46 women, respectively. No difference in epidemiologic characteristics or preoperative urinary functional status (SUI stage, and pollakiuria, nocturia, and urgency rates) was found between the groups. Functional results and quality of life were evaluated before surgery and at 1, 3, 6, and 12 months postoperatively. Urodynamic examinations were performed before and 3 months after surgery. RESULTS The mean follow-up was 10 months. No difference in the rate of de novo urge incontinence or immediate and late voiding dysfunction was noted between the groups. No difference in the cure rate was observed between the groups (89.3% in the RPR group and 88.6% in the TOR group). RPR was associated with a significant decrease in maximum urinary flow and an increase in residual urine volume. Quality of life was significantly improved after surgery, without difference between the groups. CONCLUSIONS The retropubic and transobturator routes for treatment of female SUI have similarly high cure rates and quality of life improvement.
Because of advantages in the rate of complications and postoperative pain previously demonstrated in the same population, the transobturator route appears to be the best option for the treatment of urinary incontinence. Objective To compare the pubovaginal sling and transurethral Macroplastique in the treatment of female stress urinary incontinence (SUI) and intrinsic sphincter deficiency (ISD). Objective To assess the effectiveness and late postoperative morbidity of the Burch procedure and the sling procedure for the treatment of recurrent urinary stress incontinence after vaginal hysterectomy and anterior repair. Methods Clinical, urodynamic, and sonographic examinations were done on 77 women suffering from recurrent urinary stress incontinence. The women were randomized to two groups, modified Burch colposuspension and lyophilized dura mater sling surgery; 72 women were reexamined 32-48 months after these procedures. Results The cure rate at 32-48 months' follow-up was 86% for the Burch procedure and 92% for the sling. Women who had had the sling procedure demonstrated a clear decrease in maximal bladder capacity, from 330 to 240 mL (P < .05). In both groups, stress profiles demonstrated a shift of the maximal pressure point toward the proximal urethra and a significant improvement in pressure transmission (P < .05). The postoperative patients who had persistent incontinence were found to have insufficient elevation of the bladder neck (less than 10 mm). The uroflow examination showed an increase of urination time in both groups. The incidence of bladder problems was 10% with the Burch procedure and 29% with the sling procedure; however, 13% of the Burch group developed rectoceles. Conclusion Both procedures offer a high rate of success.
We believe that sling surgery should be used only in certain special cases because of its higher rate of complications, but that posterior vaginal repair should be considered after modified Burch colposuspension because of the possibility of rectocele and enterocele. It is difficult to make a choice among the many surgical procedures designed for the correction of stress urinary incontinence by the vaginal route, because their results have not been correctly compared. The Bologna (B) operation uses two flaps from the anterior vaginal wall that are anchored to the abdominal wall; the Ingelman-Sundberg (IS) operation is a suburethral sling made from two transplants from the pubococcygeus muscle. A prospective randomized study was carried out to compare these two procedures. Selection of cases was based upon the presence of genuine or potential stress incontinence, genital prolapse and tissues available for both procedures (anterior vaginal wall excess and palpable pubococcygeus muscles). No significant difference was noted in clinical results (91.7% and 93.7% of patients cured by the B and IS operations, respectively) or in transmission rate gain at 3 months and 1 year. Maximum urethral closing pressure was maintained in both treatment arms. No significant postoperative complication or persistent dysuria occurred. The Bologna procedure is best indicated in case of frank anterior vaginal excess, and the Ingelman-Sundberg procedure when strong anterior parts of the pubococcygeus muscles are available. Both are excellent for the cure of stress incontinence associated with genital prolapse. PURPOSE The incidence of urinary incontinence in women of childbearing age is about 30%. Around half have stress incontinence. Many treatment modalities have been elucidated to treat stress incontinence, and among the most popular are the rectus fascia sling and tension-free vaginal tape (TVT).
The introduction of TVT to the urological armamentarium put a multiplicity of synthetic materials into use in the correction of stress urinary incontinence. A comparison of the impact of these 2 commonly used techniques is needed. MATERIALS AND METHODS A total of 53 female patients older than 21 years (mean age 45.09) were randomized, using closed envelopes, to undergo TVT or rectus fascia sling. Randomization was performed after patients received spinal anesthesia. One surgeon performed the 2 types of treatment. Associated grade 2 cystocele was simultaneously corrected. Patients with bladder or urethral pathology, as well as those with cystocele greater than grade 2, were excluded from analysis. RESULTS All 53 patients completed 6 months of followup and all had stress urinary incontinence. There were 15 patients who underwent sling surgery and 17 who underwent TVT who had concomitant grade 1 or 2 cystocele. No statistically significant difference was found between the 2 groups at baseline. Cure was accomplished in 23 of 25 (92%) with the sling and in 26 of 28 (92.9%) with TVT at the first followup visit (1 week). There were 7 patients who needed at least 1 extra week of catheterization in the sling group and 3 in the TVT group. No significant difference was detected in terms of post-void residual urine, symptom score, and filling and voiding parameters. At 6 months 1 patient had de novo detrusor overactivity and 7 had wound pain. Compared to those with TVT, 2 cases of sling were considered treatment failures, none had de novo overactivity and 2 had wound pain. None of the patients had symptoms suggestive of urethral erosion. CONCLUSIONS The rectus fascia sling and TVT seem to be equally effective regarding the primary outcome measure (ie, cure of stress incontinence). Symptom scores related to incontinence surgery as well as simultaneous correction of cystocele are comparable in the 2 groups.
The fascial sling is a longer treatment process, yet it is more economical. Longer followup is vital before rigorous conclusions can be drawn. OBJECTIVE To analyze the use of Gore-Tex in the treatment of exertion incontinence. METHODS During 1991, two gynecology units used the Mouchel technique and a Goebell-Stoeckel type technique in 72 patients with exertion incontinence, alone in 36 and in combination with a cure for prolapse in 36 others. Results were analyzed with the chi-squared test and Student's test for paired series. RESULTS The rate of incontinence was 65%, with a range from 60 to 66.7% according to the type of technique used and whether a cure for prolapse was also performed. Gore-Tex was not well tolerated in 23/72 cases. Rejection was seen in 20 to 37.5% according to the type of vaginal suture and the type of protection. CONCLUSION The high rate of rejection suggests prudence in using Gore-Tex. PURPOSE We compared 2 measures of urethral hypermobility, the Q-tip test and the voiding cystourethrogram, preoperatively in women recruited at 1 center participating in a multicenter randomized clinical trial comparing Burch colposuspension with the autologous rectus fascia sling. MATERIALS AND METHODS Following institutional review board approval, women with stress urinary incontinence and pelvic organ prolapse stage 2 or less underwent a standardized standing voiding cystourethrogram and a Q-tip test at a 45-degree reclining position preoperatively. The urethral angle at rest and on straining were measured with a radiological ruler (voiding cystourethrogram) or goniometer (Q-tip) by 2 different investigators blinded to each other's findings. RESULTS In 43 patients the mean urethral angle at rest and on straining were 20 degrees +/- 12 and 51 degrees +/- 20 by voiding cystourethrogram, compared to 16 degrees +/- 9 and 58 degrees +/- 10 by Q-tip test, respectively.
The mean angle difference (urethral angle with straining minus urethral angle at rest) was greater for the Q-tip test (42 degrees +/- 9) than for the voiding cystourethrogram test (32 degrees +/- 17; p < 0.05). Fewer patients (14% by Q-tip, 28% by voiding cystourethrogram) had urethral hypermobility using the definition of urethral angle at rest greater than 30, while almost all patients (91% by voiding cystourethrogram, 100% by Q-tip) had urethral hypermobility using the definition of urethral angle with straining greater than 30. However, using the definition of urethral angle with straining minus urethral angle at rest greater than 30, only 58% of patients had urethral hypermobility by voiding cystourethrogram compared to 98% by Q-tip. CONCLUSIONS The voiding cystourethrogram and the Q-tip test measure urethral hypermobility differently. This may affect which patients are classified as having urethral hypermobility, and the choice of anti-incontinence surgery. OBJECTIVE To compare the cure rates and confirm the clinical efficacy of the 3 most frequently performed surgical procedures for stress urinary incontinence (SUI). METHODS Between January 2001 and May 2003, 92 women with SUI were randomly assigned to undergo the Burch colposuspension (n=33), pubovaginal sling (n=28), or tension-free vaginal tape (n=31) at the Department of Obstetrics and Gynecology, Yonsei Medical Center, Seoul, Korea. Patient characteristics, urodynamic study results, cure rates at 3, 6, and 12 months, and complication rates were compared using the chi2 test. RESULTS There were no statistically significant differences in the cure rates initially, but after 12 months the cure rate of the pubovaginal sling procedure was found to be significantly higher than those of the tension-free vaginal tape or Burch colposuspension procedures.
CONCLUSION The cure rate of the pubovaginal sling procedure was significantly higher after 1 year, but no difference in efficacy was observed between the 2 other procedures. A randomized prospective study of a larger population should be conducted. OBJECTIVE To evaluate the use of the Goretex pubovaginal sling and compare the results with a typical indigenous material graft made of rectus fascia. METHODS We prospectively evaluated 48 consecutive patients in a randomized fashion in whom a sling procedure was performed to treat their type III incontinence. Sixteen women had a vesicourethral suspension (VUS) with a Goretex sling and another group of 32 patients, comparable in age range and median age with the previous group, had a similar VUS with a sling from the rectus abdominis fascia. Both groups were evaluated urodynamically 6 and 30 months later. RESULTS In the first group, cure of incontinence was observed in 87.5%, and in the remaining patients it was significantly improved. In 2 patients there was an erosion of the urethra and the Goretex sling had to be removed 3.5 years later. Three other women remained dry but complained of occasional irritative symptoms. In the remaining 11 there was no erosion and the excellent postoperative result was maintained. In the group with fascial slings there was no erosion observed, and in general they had fewer irritative symptoms. Cure or improvement of incontinence was comparable with the first group when studied 6 months after surgery, but there was a significant difference in the 30-month postoperative evaluation. CONCLUSIONS Use of a Goretex sling provides better long-term results even though it is associated with somewhat increased morbidity.
2,093
24,497,383
There is no reliable evidence to suggest that the use of structured , systematic pressure ulcer risk assessment tools reduces the incidence of pressure ulcers
BACKGROUND Use of pressure ulcer risk assessment tools or scales is a component of the assessment process used to identify individuals at risk of developing a pressure ulcer. Indeed, use of a risk assessment tool is recommended by many international pressure ulcer prevention guidelines; however, it is not known whether using a risk assessment tool makes a difference to patient outcomes. We conducted a review to provide a summary of the evidence pertaining to pressure ulcer risk assessment in clinical practice. OBJECTIVES To determine whether using structured, systematic pressure ulcer risk assessment tools, in any health care setting, reduces the incidence of pressure ulcers.
This study aimed to evaluate the effectiveness of the Norton score in predicting the likely occurrence of pressure sores, compared to the Waterlow scale, in Hong Kong. Two elderly care wards (one male and one female) were chosen; the sample size was 185 and the mean age of subjects was 80.4. Each newly admitted patient was assessed using both the Norton calculation and the Waterlow calculation. At the end of the research, there were eight patients who had sore formation. The results indicated that the Norton score identified six out of the eight patients while the Waterlow identified seven of them. The Waterlow calculation, however, seems to have misidentified 72 patients as being in the 'at risk' group. In view of a fear of the misdirection of resources, the Norton score was found to be the better of the two, and its use in the elderly care units in this study should be continued until a better scoring system is found. It is now well known that standard statistical procedures become invalidated when applied to cluster randomized trials in which the unit of inference is the individual. A resulting consequence is that researchers conducting such trials are faced with a multitude of design choices, including selection of the primary unit of inference, the degree to which clusters should be matched or stratified by prognostic factors at baseline, and decisions related to cluster subsampling. Moreover, application of ethical principles developed for individually randomized trials may also require modification. We discuss several topics related to these issues, with emphasis on the choices that must be made in the planning stages of a trial and on some potential pitfalls to be avoided. This study investigated the influence on the number of pressure sores developing in patients nursed in a hospice when three levels of pressure support were used in association with a risk assessment tool.
The study was designed as phase one of an epidemiological study examining the use of a modified Norton scoring system. This was followed by a second phase of the study, when patients were allocated to pressure sore support systems according to their risk assessment score. A total of 327 patients entered, 223 in phase one and 104 in phase two. A significant reduction in the development of pressure sores was observed in the second phase of the study. The presence of pressure ulcers imposes a huge burden on the older person's quality of life and significantly increases their risk of dying. The objective of this study was to determine patient characteristics associated with the presence of pressure ulcers and to evaluate the risk factors associated with mortality among older patients with pressure ulcers. A prospective observational study was performed between Oct 2012 and May 2013. Patients with preexisting pressure ulcers on admission and those with hospital-acquired pressure ulcers were recruited into the study. Information on patient demographics, functional status, nutritional level, stages of pressure ulcer and their complications was obtained. Cox proportional hazards analysis was used to assess the risk of death in all patients. 76/684 (11.1%) patients had pre-existing pressure ulcers on admission and 30/684 (4.4%) developed pressure ulcers in hospital. There were 68 (66%) deaths by the end of the median follow-up period of 12 (IQR 2.5-14) weeks. Our Cox regression model revealed that nursing home residence (Hazard Ratio, HR=2.33, 95% confidence interval, CI=1.30, 4.17; p=0.005), infected deep pressure ulcers (HR=2.21, 95% CI=1.26, 3.87; p=0.006) and neutrophilia (HR=1.76; 95% CI 1.05, 2.94; p=0.031) were independent predictors of mortality in our elderly patients with pressure ulcers. The prevalence of pressure ulcers in our setting is comparable to previously reported figures in Europe and North America.
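The hazard ratios reported above come with Wald-type 95% confidence intervals, which are symmetric on the log scale: the interval is exp(beta ± 1.96·SE), so ln(HR) should sit midway between the logs of the two bounds. This standard-library sketch checks that property and backs out the implied standard error for the nursing-home-residence estimate quoted in the abstract:

```python
from math import log, exp

def recover_se(lo, hi, z=1.96):
    """Back out the standard error of log(HR) from a reported Wald 95% CI."""
    return (log(hi) - log(lo)) / (2 * z)

hr, lo, hi = 2.33, 1.30, 4.17  # nursing home residence, from the abstract
se = recover_se(lo, hi)
# Geometric mean of the bounds should reproduce the point estimate
midpoint = exp((log(lo) + log(hi)) / 2)
print(round(se, 3), round(midpoint, 2))  # → 0.297 2.33
```

The geometric midpoint of the bounds reproduces the reported HR of 2.33, so the interval is internally consistent with a Wald CI on the log-hazard scale.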
Mortality in patients with pressure ulcers was high, and was predicted by institutionalization, concurrent infection and high neutrophil counts. Objectives: To study the components of two risk assessment scales for decubitus ulcer risk, Waterlow and Braden, and of the Chailey score for the same purpose. Design: Experimental study of patients at risk of developing decubitus ulcers. Setting: The West Midlands and Yorkshire. Subjects: One hundred and fifty wheelchair users from the West Midlands and 9022 patients from a District General Hospital in York, the latter consisting of all admissions to the hospital in a four-month period. Interventions: Braden and Chailey scores (wheelchair users) and Waterlow scores (all subjects) measured. Main outcome measures: Development of a pressure sore; receiver operating characteristic (ROC) curves. Results: Waterlow outperformed Braden for classification of wheelchair patients with respect to decubitus ulcer. The Chailey score performed randomly in this group. The sensitivity and specificity as seen in ROC curves were different for Waterlow scores for wheelchair users and general patients, the latter being much better classified. Only three items out of 11 in the Waterlow score appeared to have any classification ability in the wheelchair group. Conclusions: Risk indicators used for general patients are probably poorly suited for wheelchair users. There is a need for large-scale predictive studies of wheelchair users and other groups to allow regression analysis of the subscales of risk indicators. From the provisional data of this study it appears that splitting patients by gender and into full- and part-time wheelchair users classifies almost as well as the much more complicated risk assessment tools currently available. A report of a randomized, controlled trial (RCT) should convey to the reader, in a transparent manner, why the study was undertaken and how it was conducted and analyzed.
For example, a lack of adequately reported randomization has been associated with bias in estimating the effectiveness of interventions (1, 2). To assess the strengths and limitations of an RCT, readers need and deserve to know the quality of its methods. Despite several decades of educational efforts, RCTs still are not being reported adequately (3-6). For example, a review of 122 recently published RCTs that evaluated the effectiveness of selective serotonin-reuptake inhibitors as first-line management strategy for depression found that only 1 (0.8%) paper described randomization adequately (5). Inadequate reporting makes the interpretation of RCT results difficult if not impossible. Moreover, inadequate reporting borders on unethical practice when biased results receive false credibility. History of CONSORT In the mid-1990s, two independent initiatives to improve the quality of reports of RCTs led to the publication of the CONSORT (Consolidated Standards of Reporting Trials) statement (7), which was developed by an international group of clinical trialists, statisticians, epidemiologists, and biomedical editors. CONSORT has been supported by a growing number of medical and health care journals (8-11) and editorial groups, including the International Committee of Medical Journal Editors (ICMJE, also known as the Vancouver Group) (12), the Council of Science Editors (CSE), and the World Association of Medical Editors (WAME). CONSORT is also published in Dutch, English, French, German, Japanese, and Spanish. It can be accessed on the Internet, along with other information about the CONSORT group (13). The CONSORT statement comprises a checklist and flow diagram for reporting an RCT. For convenience, the checklist and diagram together are called simply CONSORT. They are primarily intended for use in writing, reviewing, or evaluating reports of simple two-group, parallel RCTs.
Preliminary data indicate that the use of CONSORT does indeed help to improve the quality of reports of RCTs (14, 15). In an evaluation (14) of 71 RCTs published in three journals in 1994, allocation concealment was not clearly reported in 43 (61%) of the RCTs. Four years later, after these three journals required that authors reporting an RCT use CONSORT, the proportion of papers in which allocation concealment was not clearly reported had dropped to 39% (30 of 77; mean difference, -22% [95% CI of the difference, -38% to -6%]). The usefulness of CONSORT is enhanced by continuous monitoring of the biomedical literature; this monitoring allows CONSORT to be modified depending on the merits of maintaining or dropping current items and including new items. For example, when Meinert (16) observed that the flow diagram did not provide important information about the number of participants who entered each phase of an RCT (enrollment, treatment allocation, follow-up, and data analysis), the diagram could be modified to accommodate the information. The checklist is similarly flexible. This iterative process makes the CONSORT statement a continually evolving instrument. While participants in the CONSORT group and their degree of involvement vary over time, members meet regularly to review the need to refine CONSORT. At the 1999 meeting, the participants decided to revise the original statement. This report reflects changes determined by consensus of the CONSORT group, partly in response to emerging evidence on the importance of various elements of RCTs. Revision of the CONSORT Statement Thirteen members of the CONSORT group met in May 1999 with the primary objective of revising the original CONSORT checklist and flow diagram, as needed. The group discussed the merits of including each item in the light of current evidence.
As in developing the original CONSORT statement, our intention was to keep only those items deemed fundamental to reporting standards for an RCT. Some items not considered essential may well be highly desirable and should still be included in an RCT report even though they are not included in CONSORT. Such items include approval of an institutional ethical review board, sources of funding for the trial, and a trial registry number (as, for example, the International Standard Randomized Controlled Trial Number [ISRCTN] used to register an RCT at its inception [17]). Shortly after the meeting, a revised version of the checklist was circulated to the group for additional comments and feedback. Revisions to the flow diagram were similarly made. All these changes were discussed when CONSORT participants met in May 2000, and the revised statement was finalized shortly afterward. The revised CONSORT statement includes a 22-item checklist (Table) and a flow diagram (Figure). Its primary aim is to help authors improve the quality of reports of simple two-group, parallel RCTs. However, the basic philosophy underlying the development of the statement can be applied to any design. In this regard, additional statements for other designs will be forthcoming from the group (13). CONSORT can also be used by peer reviewers and editors to identify reports with inadequate description of trials and those with potentially biased results (1, 2). Table. Checklist of Items To Include When Reporting a Randomized Trial Figure. Flow diagram of the progress through the phases of a randomized trial (enrollment, intervention allocation, follow-up, and data analysis). During the 1999 meeting, the group also discussed the benefits of developing an explanatory document to enhance the use and dissemination of CONSORT.
The document is patterned on reporting of statistical aspects of clinical research (18), which was developed to help facilitate the recommendations of the ICMJE's Uniform Requirements for Manuscripts Submitted to Biomedical Journals. Three members of the CONSORT group, with assistance from members on some checklist items, drafted an explanation and elaboration document. That document (19) was circulated to the group for additions and revisions and was last revised after review at the latest CONSORT group meeting. Changes to CONSORT 1. In the revised checklist, a new column for Paper Section and Topic integrates information from the Subheading column that was contained in the original statement. 2. The Was It Reported? column has been integrated into a Reported on Page Number column, as requested by some journals. 3. Each item of the checklist is now numbered, and the syntax and order have been revised to improve the flow of information. 4. Title and Abstract are now combined in the first item. 5. While the content of the revised checklist is similar to that of the original one, some items that previously were combined are now separate. For example, authors had been asked to describe primary and secondary outcome(s) measure(s) and the minimum important difference(s), and indicate how the target sample size was projected. In the new version, issues pertaining to outcomes (item 6) and sample size (item 7) are separate, enabling authors to be more explicit about each. Moreover, some items request additional information. For example, for outcomes (item 6) authors are asked to report any methods used to enhance the quality of measurements, such as multiple observations. 6. The item asking for the unit of randomization (for example, cluster) has been dropped because specific checklists have been developed for reporting cluster RCTs (20) and other design types (13) since publication of the original checklist. 7.
Whenever possible, new evidence is incorporated into the revised checklist. For example, authors are asked to be explicit about whether the analysis reported is by intention to treat (item 16). This request is based in part on the observations (21) that authors do not adequately describe and apply intention-to-treat analysis and that reports not providing this information are less likely to provide other relevant information, such as losses to follow-up (22). 8. The revised flow diagram depicts information from four stages of a trial (enrollment, intervention allocation, follow-up, and analysis). The revised diagram explicitly includes the number of participants, for each intervention group, included in the primary data analysis. Inclusion of these numbers lets the reader know whether the authors have performed an intention-to-treat analysis (21-23). Because some of the information may not always be known and to accommodate other information, the structure of the flow diagram may need to be modified for a particular trial. Inclusion of the participant flow diagram in the report is strongly recommended but may be unnecessary for simple trials, such as those without any participant withdrawals or dropouts. Discussion Specifically developed to guide authors about how to improve the quality of reporting of simple two-group, parallel RCTs, CONSORT encourages transparency with reporting of the methods and results so that reports of RCTs can be interpreted both readily and accurately. However, CONSORT does not address other facets of reporting that also require attention, such as scientific content and readability of RCT reports. Some authors, in their enthusiasm to use CONSORT, have modified the checklist (24). We recommend against such modifications because they may be based on a different process than the one used by the CONSORT group.
The use of CONSORT seems to reduce (if not eliminate) inadequate reporting of RCTs (14, 15). Potentially, the use of CONSORT should positively influence the manner in which RCTs are conducted. Granting agencies have noted this potential relationship and, in at least one case (25), have encouraged grantees to consider in their application how they have dealt with the CONSORT items. The evidence-based approach used to develop CONSORT has also been used

BACKGROUND There have been no studies that have tested the Braden Scale for predictive validity and established cutoff points for assessing risk specific to different settings. OBJECTIVES To evaluate the predictive validity of the Braden Scale in a variety of settings (tertiary care hospitals, Veterans Administration Medical Centers [VAMCs], and skilled nursing facilities [SNFs]). To determine the critical cutoff point for classifying risk in these settings and whether this cutoff point differs between settings. To determine the optimal timing for assessing risk across settings. METHOD Randomly selected subjects (N = 843) older than 19 years of age from a variety of care settings who did not have pressure ulcers on admission were included. Subjects were 63% men, 79% Caucasian, and had a mean age of 63 (±16) years. Subjects were assessed for pressure ulcers using the Braden Scale every 48 to 72 hours for 1 to 4 weeks. The Braden Scale score and skin assessment were independently rated, and the data collectors were blind to the findings of the other measures. RESULTS One hundred eight of 843 (12.8%) subjects developed pressure ulcers. The incidence was 8.5%, 7.4%, and 23.9% in tertiary care hospitals, VAMCs, and SNFs, respectively. Subjects who developed pressure ulcers were older and more likely to be female than those who did not develop ulcers.
Braden Scale scores were significantly (p = .0001) lower in those who developed ulcers than in those who did not develop ulcers. Overall, the critical cutoff score for predicting risk was 18. Risk assessment on admission is highly predictive of pressure ulcer development in all settings but not as predictive as the assessment completed 48 to 72 hours after admission. CONCLUSIONS Risk assessment on admission is important for timely planning of preventive strategies. Ongoing assessment in SNFs and VAMCs improves prediction and permits fine-tuning of the risk-based prevention protocols. In tertiary care the most accurate prediction occurs at 48 to 72 hours after admission and at this time the care plan can be refined

Abstract Objective: To evaluate whether risk assessment scales can be used to identify patients who are likely to get pressure ulcers. Design: Prospective cohort study. Setting: Two large hospitals in the Netherlands. Participants: 1229 patients admitted to the surgical, internal, neurological, or geriatric wards between January 1999 and June 2000. Main outcome measure: Occurrence of a pressure ulcer of grade 2 or worse while in hospital. Results: 135 patients developed pressure ulcers during four weeks after admission. The weekly incidence of patients with pressure ulcers was 6.2% (95% confidence interval 5.2% to 7.2%). The area under the receiver operating characteristic curve was 0.56 (0.51 to 0.61) for the Norton scale, 0.55 (0.49 to 0.60) for the Braden scale, and 0.61 (0.56 to 0.66) for the Waterlow scale; the areas for the subpopulation, excluding patients who received preventive measures without developing pressure ulcers and excluding surgical patients, were 0.71 (0.65 to 0.77), 0.71 (0.64 to 0.78), and 0.68 (0.61 to 0.74), respectively.
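The areas under the ROC curve reported above have a simple probabilistic reading: the chance that a randomly chosen patient who developed an ulcer has a higher risk score than a randomly chosen patient who did not. A minimal sketch of that rank-based computation, using toy scores rather than any study's data:

```python
def auc(scores, labels):
    """Area under the ROC curve, computed as the probability that a
    randomly chosen positive case scores higher than a randomly
    chosen negative case (ties count as half a win)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy risk scores (higher = more at risk) with ulcer outcomes (1 = ulcer).
print(auc([20, 15, 12, 9, 7, 5], [1, 1, 0, 1, 0, 0]))  # → 0.888...
```

An AUC near 0.5, as several of the scales show in the full cohort, means the score discriminates barely better than chance.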
In this subpopulation, using the recommended cut-off points, the positive predictive value was 7.0% for the Norton, 7.8% for the Braden, and 5.3% for the Waterlow scale. Conclusion: Although risk assessment scales predict the occurrence of pressure ulcers to some extent, routine use of these scales leads to inefficient use of preventive measures. An accurate risk assessment scale based on prospectively gathered data should be developed

Cluster randomized controlled trials (RCTs), in which groups or clusters of individuals rather than individuals themselves are randomized, are increasingly common. Indeed, for the evaluation of certain types of intervention (such as those used in health promotion and educational interventions) a cluster randomized trial is virtually the only valid approach. However, cluster trials are generally more difficult to design and execute than individually randomized studies, and some design features of a cluster trial may make it particularly vulnerable to a range of threats that can introduce bias. In this paper we discuss the issues that can lead to bias in cluster randomized trials and conclude with some suggestions for avoiding these problems

AIMS AND OBJECTIVES To compare the predictive value of two pressure ulcer risk assessment scales (Braden and Norton) and of clinical judgement. To evaluate the impact of effective preventive measures on the predictive validity of the two risk assessment scales. METHODS Of the 1772 participating older patients, 314 were randomly selected and assigned to the "turning" group; 1458 patients were assigned to the "non-turning" group. Using the Braden and the Norton scale the pressure ulcer risk was scored twice weekly during a four-week period. Clinical assessment was monitored daily.
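The low positive predictive values above follow directly from low incidence: even a sensitive scale flags many patients who never develop an ulcer. A minimal sketch of how sensitivity, specificity and PPV fall out of a 2x2 table; the counts are hypothetical, chosen only to illustrate how a sensitive scale can still have a PPV near 7%:

```python
def screen_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and positive predictive value from a
    2x2 table of screening result vs. pressure ulcer outcome."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
    }

# Hypothetical ward of 1000 patients, 6% ulcer incidence: the scale
# catches most true cases (sensitivity ~0.83) but flags 600 false
# positives, so PPV ~0.077 -- close to the values reported above.
m = screen_metrics(tp=50, fp=600, fn=10, tn=340)
print(m)
```

This is why the abstract concludes that acting on scale scores alone would deliver preventive measures to many patients who do not need them.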
The patients at risk in the "turning" group (Braden score < 17 or Norton score < 12) were randomly assigned to a two-hour turning schedule or to a four-hour turning schedule in combination with a pressure-reducing mattress. The "non-turning" group received preventive care based on the clinical judgement of the nurses. RESULTS The diagnostic accuracy was similar for both scales. If nurses act according to risk assessment scales, 80% of the patients would unnecessarily receive preventive measures. The use of effective preventive measures decreased the predictive value of the risk assessment scales. Nurses predicted pressure ulcer development less well than the Braden and the Norton scale. Only activity, sensory perception, skin condition and existence of old pressure ulcers were significant predictors of pressure ulcer lesions. RELEVANCE TO CLINICAL PRACTICE The effectiveness of the Norton and Braden scales is very low. Much needless work is done and expensive material is wrongly allocated. The use of effective preventive measures decreases the predictive value of the risk assessment scales. Although the performance of the risk assessment scales is poor, using a risk assessment tool seems to be a better alternative than relying on the clinical judgement of the nurses

BACKGROUND Pressure ulcers are not a plague of modern man; they have been known to exist since ancient Egyptian times. However, despite the increasing expenditure on pressure ulcer prevention, pressure ulcers remain a major health care problem. Although nurses do not have the sole responsibility for pressure ulcer prevention, nurses have a unique opportunity to have a significant impact on this problem. AIMS AND OBJECTIVES The specific aims of the study were to identify: * Staff nurses' attitudes towards pressure ulcer prevention. * The behaviour of staff nurses in relation to pressure ulcer prevention.
* Staff nurses' perceived barriers towards pressure ulcer prevention. DESIGN A cross-sectional survey method was used. METHODS A randomly selected sample of staff nurses (n = 300) working in an acute care setting in an urban location was invited to participate. Data were collected using a prepiloted questionnaire. Data analysis was carried out using SPSS version 10 and SPSS TextSmart version 1.1. RESULTS The nurses surveyed demonstrated a positive attitude towards pressure ulcer prevention. However, prevention practices were demonstrated to be haphazard and erratic and were negatively affected by lack of time and staff. These barriers prevented the nurses' positive attitude from being reflected into effective clinical practice. Education, although poorly accessed, or made available, was rarely cited as impeding practice in this area. CONCLUSION This study suggests that positive attitudes are not enough to ensure that practice change takes place, reinforcing the complex nature of behavioural change. Implementation strategies should introduce ways in which key staff can be empowered to overcome barriers to change. RELEVANCE TO CLINICAL PRACTICE This study provides a unique exploration of Irish nurses' attitudes, behaviours and perceived barriers towards pressure ulcer prevention, thereby contributing to the body of knowledge on this subject. As tissue viability is a new and emerging speciality, this information will contribute to evidence-based practice in this area of patient care and will form the basis for the development of an educational strategy for pressure ulcer prevention and management

AIM To investigate which factors predict outcome of elderly patients on discharge and at 6 months. METHODS A prospective study in an acute geriatric ward. Within 48 h of admission, patients were assessed for social factors, geriatric problems, admission diagnoses, medication, function and mental ability.
Outcome measures were mortality, length of stay, institutionalization, readmissions and attendance at accident and emergency within 6 months. RESULTS 353 patients were studied, with a mean age of 81.8 years. Logistic regression analyses showed that variables predicting hospital mortality were Barthel index on admission, pre-morbid disability and polypharmacy. The only variable independently predictive of prolonged stay in hospital was a Barthel score of < 45 on admission. Functional disability on admission was predictive of institutionalization on discharge. Variables predicting mortality within 6 months of discharge were Barthel index on admission < 65, presence of pressure sores, malnutrition and polypharmacy. Variables independently predictive of institutionalization were mental state and a low pension. Those who took more than five drugs on admission were more likely to attend accident and emergency and be readmitted. CONCLUSION Limited activities of daily living and geriatric problems on admission are the strongest predictive factors of outcome, independent of diagnoses

UNLABELLED BACKGROUND Pressure ulcers are common, costly and impact negatively on individuals. Pressure is the prime cause, and immobility is the factor that exposes individuals to pressure. International guidelines advocate repositioning; however, there is confusion surrounding the best method and frequency required. DESIGN A pragmatic, multi-centre, open-label, prospective, cluster-randomised controlled trial was conducted to compare the incidence of pressure ulcers among older persons nursed using two different repositioning regimens. METHOD Ethical approval was received. Study sites (n = 12) were allocated to study arm using cluster randomisation.
The experimental group (n = 99) were repositioned three-hourly at night, using the 30° tilt; the control group (n = 114) received routine prevention (six-hourly repositioning, using 90° lateral rotation). Data analysis was by intention to treat; follow-up was for four weeks. RESULTS All participants (n = 213) were Irish and white; among them 77% were women and 65% aged 80 years or older. Three patients (3%) in the experimental group and 13 patients (11%) in the control group developed a pressure ulcer (p = 0.035; 95% CI 0.031-0.038; ICC = 0.001). All pressure ulcers were grade 1 (44%) or grade 2 (56%). Mobility and activity were the highest predictors of pressure ulcer development (β = -0.246, 95% CI -0.319 to -0.066; p = 0.003); (β = 0.227, 95% CI 0.041-0.246; p = 0.006). CONCLUSION Repositioning older persons at risk of pressure ulcers every three hours at night, using the 30° tilt, reduces the incidence of pressure ulcers compared with usual care. The study supports the recommendations of the 2009 international pressure ulcer prevention guidelines. RELEVANCE TO CLINICAL PRACTICE An effective method of pressure ulcer prevention has been identified; in the light of the problem of pressure ulcers, current prevention strategies should be reviewed. It is important to implement appropriate prevention strategies, of which repositioning is one

OBJECTIVES To identify prognostic factors that are independently predictive of in-hospital mortality in older patients hospitalized in a medical intensive care unit (MICU). DESIGN Prospective cohort study. SETTING A MICU in an Italian university hospital. PARTICIPANTS Patients aged 65 and older consecutively admitted to the MICU directly from the first-aid unit.
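The repositioning trial above compares incidence proportions between arms (3/99 vs. 13/114). As a rough illustration only, a naive two-proportion z-test on those raw counts can be sketched as follows; note this ignores the cluster randomisation, so it does not reproduce the trial's cluster-adjusted p-value of 0.035:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test comparing two incidence proportions,
    using a pooled standard error (no clustering adjustment)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    pval = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p1, p2, pval

# Incidence in the trial above: 3/99 repositioned vs. 13/114 controls.
p1, p2, pval = two_proportion_z(3, 99, 13, 114)
print(f"{p1:.1%} vs {p2:.1%}, unadjusted p = {pval:.3f}")
```

A proper analysis of a cluster trial inflates the standard error by the design effect (driven by the intracluster correlation), which is why the reported p-value is larger than this unadjusted one.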
MEASUREMENTS Upon admission, the following variables were examined: demographics, clinical history (diabetes mellitus, active neoplasm, cognitive impairment, immobilization, pressure ulcers, use of nutritional support, home oxygen therapy), physiopathology (Acute Physiology and Chronic Health Evaluation (APACHE) II), and cognition/function (activity of daily living (ADL), instrumental activity of daily living (IADL), Short Portable Mental Status Questionnaire (SPMSQ)). The vital status of the patient at the end of hospitalization was recorded. RESULTS Over a period of 10 months, 659 patients were recruited (mean age ± standard deviation = 76.6 ± 7.5; 352 men and 307 women). There were 97 deaths (14.71%). The following factors proved to be significantly associated with in-hospital mortality: old age, low body mass index (BMI) values, low values of albumin, high scores on APACHE II, functional impairment (ADL, IADL), cognitive impairment (SPMSQ), history of cognitive deterioration, history of confinement to bed, and presence of pressure ulcers. Using multivariate analysis, the following variables were independently predictive of in-hospital mortality: lack of independence in ADLs (P < .001), moderate-to-severe cognitive impairment on SPMSQ (P < .001), score on APACHE II (P = .002), and low BMI values (P = .031). CONCLUSION The prognosis of older patients hospitalized in medical intensive care units depends not only on the acute physiological impairments, but also on a series of preexisting conditions, such as loss of functional independence, severe and moderate cognitive impairment, and low BMI

RATIONALE, AIMS AND OBJECTIVES Pressure ulcers (PUs) are a common and serious complication in critically ill patients.
The aim of this study was to evaluate the relationship between the development of a PU and hospital mortality in patients requiring mechanical ventilation (MV) in an intensive care unit (ICU). METHODS A prospective cohort study was performed over two years in patients requiring MV for ≥24 hours in a medical-surgical ICU. Primary outcome measure was hospital mortality and the main independent variable was the development of a PU grade ≥II. Hazard ratios (HRs) were calculated using a Cox model with time-dependent covariates. RESULTS Out of 563 patients in the study, 110 (19.5%) developed a PU. Overall hospital mortality was 48.7%. In the adjusted multivariate model, PU onset was a significant independent predictor of mortality (adjusted HR, 1.28; 95% confidence interval, 1.003-1.65; P = 0.047). The model also included the Acute Physiology and Chronic Health Evaluation II score, total Sequential Organ Failure Assessment on day 3, hepatic cirrhosis and medical admission. CONCLUSION Within the limitations of a single-centre approach, PU development appears to be associated with an increase in mortality among patients requiring MV for 24 hours or longer

Objective To evaluate the effectiveness of two pressure-ulcer screening tools against clinical judgement in preventing pressure ulcers. Design A single-blind randomised controlled trial. Setting A large metropolitan tertiary hospital. Participants 1231 patients admitted to internal medicine or oncology wards. Patients were excluded if their hospital stay was expected to be 2 days or less. Interventions Participants allocated to either a Waterlow (n = 410) or Ramstadius (n = 411) screening tool group or to a clinical judgement group (n = 410) where no formal risk screening instrument was used. Main outcome measure Incidence of hospital-acquired pressure ulcers ascertained by regular direct observation.
Use of any devices for the prevention of pressure ulcers, documentation of a pressure plan and any dietetic or specialist skin integrity review were recorded. Results On admission, 71 (5.8%) patients had an existing pressure ulcer. The incidence of hospital-acquired pressure ulcers was similar between groups (clinical judgement 28/410 (6.8%); Waterlow 31/411 (7.5%); Ramstadius 22/410 (5.4%), p = 0.44). Significant associations with pressure injury in regression modelling included requiring a dietetic referral, being admitted from a location other than home, and age over 65 years. Conclusion The authors found no evidence to show that two common pressure-ulcer risk-assessment tools are superior to clinical judgement to prevent pressure injury. Resources associated with use of these tools might be better spent on careful daily skin inspection and improving management targeted at specific risks. Study registration The trial was registered with the Australian and New Zealand Clinical Trials Registry (ACTRN 12608000541303)

The aims of the study were (i) to investigate the prevalence of pressure ulcers in patients with hip fracture, on arrival at a Swedish hospital, at discharge, and two weeks post-surgery; (ii) to test whether clinical use of the Modified Norton Scale (MNS) could identify patients at risk for development of pressure ulcers; and (iii) to compare the reported prevalence of pressure ulcers in the experimental group, where risk assessment and classification of pressure ulcers was performed on a daily basis, with that of the control group, where it was not. The study design was prospective, with an experimental and a control group. The intervention in the experimental group consisted of risk assessment, risk alarm and skin observation performed by the nurse on duty, in the A&E Department, and daily throughout the hospital stay.
To facilitate the nurse's assessment, a 'Pressure Ulcer Card' was developed, consisting of the MNS and descriptions of the four stages of pressure ulcers. On arrival at the hospital, approximately 20% of patients in both groups had pressure ulcers. At discharge, the rate had increased to 40% (experimental) and 36% (control). Clinical use of the MNS made it possible to identify the majority of patients at risk for development of pressure ulcers. Patients who were confused on arrival developed significantly more pressure ulcers than patients who were orientated to time and place. No significant difference was found in the reported prevalence of pressure ulcers between the experimental and control groups

The Braden scale is one of the most intensively studied risk assessment scales used in identifying the risk of developing pressure sores. However, not all studies show that the sensitivity and specificity of this scale are sufficient. This study, therefore, investigated whether adding new risk factors can enhance the sensitivity and specificity of the Braden scale. The Braden scale was tested in a prospective multi-centre design. The nurses of 11 wards filled in the Braden scale every 5 days for each patient who was admitted without pressure sores and who had a probable stay of at least 10 days. Based on a literature study and in-depth interviews with experts, the Braden scale was extended by the risk factor blood circulation. In addition, other risk factors, which are more or less stable patient characteristics, were measured during the admission of the patient. Independent research assistants measured the presence of pressure sores twice a week. As the external criterion for the risk of developing pressure sores, the presence of pressure sores and/or the use of preventive activities was used. Results showed that the original Braden scale was a reliable instrument and that the sensitivity and specificity were sufficient.
However, reformulating the factors moisture and nutrition, and adding the risk factor age, could enhance the sensitivity and specificity. Furthermore, results showed that the factors sensory perception, and friction and shear, were especially important risk factors for the Braden scale. In fact, using only the factors sensory perception, friction and shear, moisture (a reformulated factor) and age gives the highest explained variance of the risk of developing pressure sores. The added risk factor, blood circulation, did not enhance the sensitivity and specificity of the original Braden scale. Suggestions are given on how to use risk assessment scales in practice

OBJECTIVE The aims of the present study were to (i) investigate the incidence of pressure ulcers in 1997 and 1999 among patients with hip fracture, (ii) study changes of nursing and treatment routines during the same period and (iii) identify predictors of pressure ulcer development. DESIGN The present comparative study was based partly on data collected in two prospective, randomized, controlled studies conducted in 1997 and 1999. SETTING The study was carried out in the Accident & Emergency (A&E) Department and the Department of Orthopaedics at the University Hospital in Uppsala, Sweden. STUDY PARTICIPANTS Inclusion criteria: patient with hip fracture, ≥65 years, admitted without pressure ulcers. Forty-five patients were included in 1997 and 101 in 1999. INTERVENTIONS Risk assessment, pressure ulcer grading, pressure-reducing mattress and educational programme. MAIN OUTCOME MEASURES Incidence of pressure ulcers. RESULTS There was a significant reduction of the overall incidence of pressure ulcers from 55% in 1997 to 29% in 1999. The nursing notes had become significantly more informative.
Nursing and treatment routines for patients with hip fractures had changed both in the A&E Department and the orthopaedic ward through initiatives developed and implemented by pressure ulcer nurses. CONCLUSION In the framework of a quality improvement project, where research activities were integrated with practice-based developmental work, the incidence of pressure ulcers was reduced significantly in patients with hip fractures. The best predictor of pressure ulcer development was increased age. This study investigated the effect of using Norton Scale assessment data in the nursing care of patients at risk of developing pressure ulcers. The results indicated that incorporating the Norton Scale in care planning resulted in benefits to patients through earlier and more effective nursing interventions. To comprehend the results of a randomised controlled trial (RCT), readers must understand its design, conduct, analysis, and interpretation. That goal can be achieved only through total transparency from authors. Despite several decades of educational efforts, the reporting of RCTs needs improvement. Investigators and editors developed the original CONSORT (Consolidated Standards of Reporting Trials) statement to help authors improve reporting by use of a checklist and flow diagram. The revised CONSORT statement presented here incorporates new evidence and addresses some criticisms of the original statement. The checklist items pertain to the content of the Title, Abstract, Introduction, Methods, Results, and Discussion. The revised checklist includes 22 items selected because empirical evidence indicates that not reporting this information is associated with biased estimates of treatment effect, or because the information is essential to judge the reliability or relevance of the findings. We intended the flow diagram to depict the passage of participants through an RCT.
The revised flow diagram depicts information from four stages of a trial (enrollment, intervention allocation, follow-up, and analysis). The diagram explicitly shows the number of participants, for each intervention group, included in the primary data analysis. Inclusion of these numbers allows the reader to judge whether the authors have done an intention-to-treat analysis. In sum, the CONSORT statement is intended to improve the reporting of an RCT, enabling readers to understand a trial's conduct and to assess the validity of its results.
2,094
31,753,763
Interpretation A large burden of HIV is likely to be attributable to HSV-2 infection, even if the effect of HSV-2 infection on HIV had been imperfectly measured in studies providing adjusted RR estimates, potentially because of residual confounding. The contribution is likely to be greatest in areas where HSV-2 is highly prevalent, particularly Africa. New preventive interventions against HSV-2 infection could not only improve the quality of life of millions of people by reducing the prevalence of herpetic genital ulcer disease, but could also have an additional, indirect effect on HIV transmission.
Summary Background A 2017 systematic review and meta-analysis of 55 prospective studies found the adjusted risk of HIV acquisition to be at least tripled in individuals with herpes simplex virus type 2 (HSV-2) infection. We aimed to assess the potential contribution of HSV-2 infection to HIV incidence, given an effect of HSV-2 on HIV acquisition.
BACKGROUND Skin and mucosal herpes simplex virus type 2 (HSV-2) shedding predominantly occurs in short subclinical episodes. We assessed whether standard-dose or high-dose antiviral therapy reduces the frequency of such shedding. METHODS HSV-2-seropositive, HIV-seronegative people were enrolled at the University of Washington Virology Research Clinic (WA, USA). We did three separate but complementary open-label cross-over studies comparing no medication with aciclovir 400 mg twice daily (standard-dose aciclovir), valaciclovir 500 mg daily (standard-dose valaciclovir) with aciclovir 800 mg three times daily (high-dose aciclovir), and standard-dose valaciclovir with valaciclovir 1 g three times daily (high-dose valaciclovir). The allocation sequence was generated by a random number generator. Study drugs were supplied in identical, numbered, sealed boxes. Study periods lasted 4-7 weeks, separated by a 1-week wash-out. Participants collected genital swabs four times daily for quantitative HSV DNA PCR. Clinical data were masked from laboratory personnel. The primary endpoint was within-person comparison of shedding rate in each study group. Analysis was per protocol. The trials are registered at ClinicalTrials.gov (NCT00362297, NCT00723229, NCT01346475). RESULTS Of 113 participants randomised, 90 were eligible for analysis of the primary endpoint. Participants collected 23 605 swabs; 1272 (5·4%) were HSV-positive. The frequency of HSV shedding was significantly higher in the no medication group (n=384, 18·1% of swabs) than in the standard-dose aciclovir group (25, 1·2%; incidence rate ratio [IRR] 0·05, 95% CI 0·03-0·08). High-dose aciclovir was associated with less shedding than standard-dose valaciclovir (198 [4·2%] vs 209 [4·5%]; IRR 0·79, 95% CI 0·63-1·00).
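An incidence rate ratio of shedding like the ones reported here compares positive-swab rates between regimens. The sketch below is a hedged illustration of the unadjusted calculation with a log-scale Wald confidence interval; the swab denominators are invented round numbers, and the trial's published IRRs come from within-person models, so the outputs will not match the paper exactly.

```python
import math

# Hypothetical sketch: shedding rate ratio (positive swabs / swabs collected)
# between two regimens, with an approximate 95% CI on the log scale.

def rate_ratio_ci(pos1, n1, pos0, n0, z=1.96):
    """IRR of group 1 vs group 0, with (lower, upper) Wald CI bounds."""
    irr = (pos1 / n1) / (pos0 / n0)
    se = math.sqrt(1 / pos1 + 1 / pos0)  # log-scale standard error
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Illustrative counts: 25 positives on treatment vs 384 off treatment,
# with assumed (not reported) per-group swab totals.
irr, lo, hi = rate_ratio_ci(pos1=25, n1=2100, pos0=384, n0=2120)
```

An IRR well below 1 with an upper confidence bound below 1, as here, indicates a large and statistically significant reduction in shedding frequency.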
Shedding was less frequent in the high-dose valaciclovir group than in the standard-dose valaciclovir group (164 [3·3%] vs 292 [5·8%]; 0·54, 0·44-0·66). The number of episodes per person-year did not differ significantly for standard-dose valaciclovir (22·6) versus high-dose aciclovir (20·2; p=0·54), or standard-dose valaciclovir (14·9) versus high-dose valaciclovir (16·5; p=0·34), but did for no medication (28·7) versus standard-dose aciclovir (10·0; p=0·001). Median episode duration was longer for no medication than for standard-dose aciclovir (13 h vs 7 h; p=0·01) and for standard-dose valaciclovir than for high-dose valaciclovir (10 h vs 7 h; p=0·03), but did not differ significantly between standard-dose valaciclovir and high-dose aciclovir (8 h vs 8 h; p=0·23). Likewise, maximum log(10) copies of HSV detected per mL was higher for no medication than for standard-dose aciclovir (3·3 vs 2·9; p=0·02), and for standard-dose valaciclovir than for high-dose valaciclovir (2·5 vs 3·0; p=0·001), but no significant difference was recorded for standard-dose valaciclovir versus high-dose aciclovir (2·7 vs 2·8; p=0·66). 80% of episodes were subclinical in all study groups. Except for a higher frequency of headaches with high-dose valaciclovir (n=13, 30%) than with other regimens, all regimens were well tolerated. INTERPRETATION Short bursts of subclinical genital HSV reactivation are frequent, even during high-dose antiherpes therapy, and probably account for continued transmission of HSV during suppressive antiviral therapy. More potent antiviral therapy is needed to eliminate HSV transmission. FUNDING NIH. Valaciclovir was provided for trial 3 for free by GlaxoSmithKline. Summary Herpes simplex virus type 2 (HSV-2) is considered a major co-factor of both sexual transmission and acquisition of the human immunodeficiency virus (HIV).
The HIV epidemic in Senegal is characterized by a remarkably and stably low prevalence. Whether HSV-2 may also constitute a possible co-factor favouring the spread of the HIV epidemic in Senegal is yet unknown. This prompted us to evaluate the HSV-2 seroprevalence in the sentinel population of pregnant women in Senegal. Two hundred and sixty pregnant women attending Roi Baudouin maternity in the capital city Dakar (n=135) and the antenatal clinic in Kaolack (n=125), the third city of Senegal, were prospectively recruited between March and August 2003. Fifty-six women (22%) were positive for HSV-2 serology. The prevalence of HSV-2 seropositivity was higher in women living in Dakar (26%) than in those living in Kaolack (16%) (P<0.01). Only two women from Dakar and two others from Kaolack were found to be HIV-1-infected. Our observations suggest a seemingly low seroprevalence of HSV-2 infection in adult women in Senegal, comparable with those usually reported in Western countries. Further epidemiological surveys are needed to confirm these results in the general population. Background: The epidemiologic utility of STARHS hinges not only on producing accurate estimates of HIV incidence, but also on identifying risk factors for recent HIV infection. Methods: As part of an HIV seroincidence study, 800 Rwandan female sex workers (FSW) were HIV tested, with those testing positive further tested by BED-CEIA (BED) and AxSYM Avidity Index (Ax-AI) assays. A sample of HIV-negative (N=397) FSW were followed prospectively for HIV seroconversion. We compared estimates of risk factors for: 1) prevalent HIV infection; 2) recently acquired HIV infection (RI) based on three different STARHS classifications (BED alone, Ax-AI alone, BED/Ax-AI combined); and 3) prospectively observed seroconversion. Results: There was mixed agreement in risk factors between methods.
HSV-2 coinfection and recent STI treatment were associated with both prevalent HIV infection and all three measures of recent infection. A number of risk factors were associated only with prevalent infection, including widowhood, history of forced sex, regular alcohol consumption, prior imprisonment, and current breastfeeding. Number of sex partners in the last 3 months was associated with recent infection based on BED/Ax-AI combined, but not with the other STARHS-based recent infection outcomes or prevalent infection. Risk factor estimates for prospectively observed seroconversion differed in magnitude and direction from those for recent infection via STARHS. Conclusions: Differences in risk factor estimates by each method could reflect true differences in risk factors between the prevalent, recently, or newly infected populations, the effect of study interventions (among those followed prospectively), or assay misclassification. Similar investigations in other populations/settings are needed to further establish the epidemiologic utility of STARHS for identifying risk factors, in addition to incidence rate estimation. To explore the mechanism by which herpes simplex virus (HSV)-2 infection is related to HIV-1 acquisition, we conducted in situ analysis of the cellular infiltrate from sequential biopsies of HSV-2 lesions from patients on and off antiviral therapy. CD4+ and CD8+ T cells and a mixed population of plasmacytoid and myeloid dendritic cells (DCs), including cells expressing the C-type lectin receptor DC-SIGN, persisted at sites of HSV-2 reactivation for months after healing, even with daily antiviral therapy. The CD4+ T cells that persisted reacted to HSV-2 antigen, were enriched for expression of the chemokine receptor CCR5, and were contiguous to DCs expressing the interleukin-3 receptor CD123 or DC-SIGN.
Ex vivo infection with a CCR5-tropic strain of HIV-1 revealed greater concentrations of integrated HIV-1 DNA in cells derived from healed genital lesion biopsies than in cells from control skin biopsies. The persistence and enrichment of HIV receptor-positive inflammatory cells in the genitalia help explain the inability of anti-HSV-2 therapy to reduce HIV acquisition. Genital herpes continues to be epidemic throughout most sexually active populations [1-4]. A recent serosurvey indicated that 21.7% of the U.S. population have HSV-2 antibodies, which represents a 31% increase in prevalence in the last decade [5, 6]. The seroprevalence of HSV-2 averages 30% in most family practice and obstetrics clinics and 30% to 50% among sexually transmitted disease clinic attendees. Seroprevalence is consistently higher in women than in men [7, 8]. The natural history of HSV infection includes acute or subclinical first-episode mucocutaneous infection, establishment of neuronal latency, and intermittent virus reactivation with or without associated recurrent symptoms [9, 10]. Although this sequence of events has been recognized for more than five decades, little is known about the long-term natural history of genital herpes. In the late 1970s, supported by the National Institutes of Health, we began a prospective study of a large cohort of persons with recently acquired symptomatic genital HSV infection in order to define more precisely the natural history of genital herpes. Clinical, demographic, and recurrence data were collected for 457 patients who presented with virologically, serologically, and clinically confirmed first-episode genital infection. More than half of these patients did not receive antiviral therapy during their primary episode, providing a population not likely to be replicated in the future.
We summarize the natural history of symptomatic recurrences in the complete population and in the subset of untreated patients. Methods In 1974, a research clinic dedicated to the study of genital herpes infection was established at Harborview Medical Center, a King County-funded hospital affiliated with the University of Washington. Patients were referred by their private physicians or by the sexually transmitted disease clinic at Harborview, or responded to advertisements for participation in clinical studies of HSV infection. Only patients willing to participate in prospective studies with long-term follow-up or in therapeutic trials were enrolled. Between 1974 and 1988, we registered 457 patients with serologically and virologically proven first-episode infection who were followed for at least 60 days from the onset of infection. At the initial clinic visit, genital lesions were cultured and described by anatomic site, stage, and area. Patients were then generally followed at 2- to 3-day intervals until their lesions had healed. The median number of visits and genital examinations during the initial disease episode was 5 (range, 3 to 14 visits). After resolution of the first clinical episode, patients were instructed to return to the clinic during each recurrence or for routine visits at least every 2 to 3 months. Patients who were unable to return for each recurrence were instructed to maintain a diary of onset and resolution dates for each recurrence. These data were collected at the following clinic visit. In general, we insisted that patients return for recurrences until they were able to recognize the signs and symptoms of genital herpes reactivation and to fill out patient diaries about onset and healing of reactivations. We defined a recurrence of genital herpes as the presence of genital ulcerations.
We defined duration of a recurrence as the number of days from the first appearance of genital lesions to the reepithelialization of all lesions. If new lesions appeared before complete healing of other lesions, all were considered part of the same episode. We report only recognized symptomatic (lesional) recurrences and do not address subclinical shedding of HSV in the genital tract. Serum specimens from both the acute and convalescent phases were obtained from all patients and tested for HSV-specific antibodies by Western blot; all patients showed seroconversion to HSV-1 or HSV-2 [11, 12]. Some had antibodies to HSV-1 in their acute-phase specimen and antibodies to both HSV-1 and HSV-2 in their convalescent-phase specimens. Patients were categorized as having primary first-episode disease if their acute-phase serum specimen showed absence of antibodies to HSV by Western blot. Based on Western blot profiles of convalescent-phase specimens and the subtype of the HSV isolates from lesions, patients were categorized as having primary HSV-1 or primary HSV-2 infection. Patients who had HSV-1 antibodies in acute-phase specimens and antibodies to HSV-2 in convalescent-phase specimens were classified as having nonprimary first-episode HSV-2 disease [11, 13]. Patients with detectable antibodies to the homologous viral type isolated in initial specimens were classified as having recurrent disease [13]. Forty-two patients had serologic evidence of HSV-2 antibodies in acute- and convalescent-phase serum specimens, and 19 patients with HSV-1 antibodies in both their acute- and convalescent-phase specimens were classified as having reactivation infection and were not included in this study even though they claimed to be experiencing their first episode.
We also excluded 64 patients seen between 1974 and 1985 whose serum specimens were analyzed by microneutralization but whose enrollment specimens could not be retrospectively retrieved for confirmation by Western blot [14]. In this study, we compared baseline patient characteristics and severity of primary infection with the subsequent recurrence experience of the patient. Patient characteristics included sex, race, age, and measures of past sexual activity, such as number of partners and history of sexually transmitted diseases. Clinical characteristics included pain, itch, discharge, fever, headache, photophobia, and stiff neck. Because many patients were participants in our early randomized trials of acyclovir, treatment status was also recorded [15-17]. Statistical Methods Monthly recurrence rates for each patient were estimated by dividing the number of recorded recurrences by the number of months the patient was followed. Comparisons of recurrence rates were made using the Wilcoxon rank-sum test or, for comparisons between more than two groups, the Kruskal-Wallis test. The Kaplan-Meier estimate was used to compute time to first recurrence, and appropriate comparisons were made using the Cox model [18]. Recursive partitioning was used as an exploratory technique to identify potential subsets of patients at higher risk for subsequent recurrence. Classification and regression trees were used for recursive partitioning of recurrence rates, and a modification of this technique by LeBlanc and Crowley was used to establish time to first recurrence [19, 20]. These techniques identify the variable (and cutpoint, for continuous variables) most closely related to the recurrence pattern. Within each of these two splits, or nodes, of the data, the data are split again. This process is repeated within each node until no further splits appear to be important in predicting recurrences.
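The per-patient rate estimate described in the Statistical Methods (recorded recurrences divided by months of follow-up) can be sketched directly. This is a hedged illustration with invented patient records, not the study's code or data.

```python
# Hypothetical sketch of the monthly recurrence rate described above:
# rate = recorded recurrences / months the patient was followed.

patients = [
    {"id": 1, "recurrences": 8, "months_followed": 24.0},
    {"id": 2, "recurrences": 3, "months_followed": 12.0},
    {"id": 3, "recurrences": 0, "months_followed": 6.0},
]

# Per-patient monthly recurrence rates.
rates = {p["id"]: p["recurrences"] / p["months_followed"] for p in patients}

# Group comparisons in the study used rank-based tests (Wilcoxon,
# Kruskal-Wallis); here we just take the median as a summary.
median_rate = sorted(rates.values())[len(rates) // 2]
```

Because follow-up lengths differ between patients, this per-month normalisation is what makes recurrence counts comparable across the cohort.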
This partitioning of the data can be represented as a tree that shows the splits of the variables into disjoint patient subsets. Because our previous placebo-controlled treatment trials failed to detect an effect of antiviral therapy on subsequent recurrences, most of our analyses are reported for the entire patient population. However, because treatment is known to affect the duration of primary symptoms, analyses to assess the potential effect of symptoms on subsequent recurrences were also done in the subset of untreated patients. Results Viral Type and Clinical Classification of Persons with First-Episode Genital Herpes Of the 457 patients presenting with first-episode genital herpes, 399 (87%) had primary genital HSV infection; 73 (16% of the total cohort) were infected with HSV-1 and 326 (71% of the total cohort) were infected with HSV-2. Fifty-eight (13%) patients were classified as having nonprimary HSV-2 infection. The median age of the patients was 24 years; 92% were single and 91% were white. Demographic characteristics were similar among those presenting with primary HSV-1, primary HSV-2, and nonprimary HSV-2 infection and were identical to those reported previously [10, 15-17, 21]. Frequencies of the major clinical signs and symptoms of the initial episode of genital herpes are shown in Figure 1. Patients with true primary genital herpes infection, regardless of infecting viral type, tended to have more severe disease than did patients with previous HSV-1 infection. This was most evident with respect to constitutional symptoms: Seventy-nine percent of those with primary HSV-1 or HSV-2 infection reported at least one constitutional symptom (fever, headache, photophobia, or stiff neck) during their first episode, compared with only 43% of those with nonprimary infection.
Seventy-seven percent of those with primary episodes had inguinal adenopathy compared with 52% of those with nonprimary genital herpes (P=0.001). Frequencies of symptoms and signs were similar between those with primary genital HSV-1 and primary genital HSV-2 infections, except in the case of nuchal rigidity, which was reported by 42% of patients with primary HSV-2 infection and 12% of those with primary HSV-1 infection (P=0.005). Figure 1. Frequency of clinical signs and symptoms of first-episode genital herpes by viral type and evidence of previous herpes simplex virus type 1 (HSV-1) exposure. Overall Recurrence Rates after Resolution of the Initial Episode Follow-up was defined as the time from enrollment to the date of the last clinic visit or to the date that a patient initiated long-term suppressive oral acyclovir therapy. The median follow-up was 418 days (range, 61 to 4897 days) and was similar in all subsets of patients whether segregated according to viral type, sex, or severity of clinical episode. The median numbers of clinic visits per patient in the first 90, 91 to 180, 181 to 365, and 366 to 720 days of follow-up were 10, 2, 2, and 1, respectively. Objective: Vaginal colonisation with Lactobacillus species is characteristic of normal vaginal ecology. The absence of vaginal lactobacilli, particularly hydrogen peroxide (H2O2)-producing isolates, has been associated with symptomatic bacterial vaginosis (BV) and increased risk for HIV-1 acquisition. Identification of factors associated with vaginal Lactobacillus colonisation may suggest interventions to improve vaginal health. Methods: We conducted a prospective cohort study of correlates of vaginal Lactobacillus colonisation among Kenyan HIV-1-seronegative female sex workers. At monthly follow-up visits, vaginal Lactobacillus cultures were obtained.
Generalised estimating equations were used to examine demographic, behavioural and medical correlates of Lactobacillus isolation, including isolation of H2O2-producing strains. Results: Lactobacillus cultures were obtained from 1020 women who completed a total of 8896 follow-up visits. Vaginal washing, typically with water alone or with soap and water, was associated with an approximately 40% decreased likelihood of Lactobacillus isolation, including isolation of H2O2-producing strains. Recent antibiotic use, excluding metronidazole and treatments for vaginal candidiasis, reduced Lactobacillus isolation by ∼30%. H2O2-producing lactobacilli were significantly less common among women with Trichomonas vaginalis infection and those who were seropositive for herpes simplex virus type 2. In contrast, H2O2-producing lactobacilli were significantly more common among women with concurrent vaginal candidiasis. Conclusions: Modifiable biological and behavioural factors are associated with Lactobacillus colonisation in African women. Our results suggest intervention strategies to improve vaginal health in women at high risk for HIV-1. Herpes simplex virus type 2 (HSV-2) infection is associated with a 3-fold increase in the risk of human immunodeficiency virus (HIV) acquisition, perhaps through alterations in mucosal HIV-susceptible target cells. We performed a clinical trial to assess the impact of herpes therapy on cervical immunology in HSV-2-infected, HIV-uninfected women from Africa or the Caribbean who were living in Toronto, Canada. Thirty participants received 1 g of valacyclovir orally each day for 2 months in a randomized, double-blind, placebo-controlled, crossover trial. Valacyclovir did not reduce the number of cervical CD4+ T cells, the number of dendritic cells, or the expression of proinflammatory cytokines, and tended to increase the expression of the HIV coreceptor CCR5 and the activation marker CD69.
Short-term valacyclovir therapy did not reverse HSV-2-associated alterations in genital immunology. Clinical Trials Registration: NCT00946556. The clustering of human papillomavirus (HPV) infections in some individuals is often interpreted as the result of common risk factors rather than biological interactions between different types of HPV. The intraindividual correlation between times-at-risk for all HPV infections is not generally considered in the analysis of epidemiologic studies. We used a deterministic transmission model to simulate cross-sectional and prospective epidemiologic studies measuring associations between 2 HPV types. When we assumed no interactions, the model predicted that studies would estimate odds ratios and incidence rate ratios greater than 1 between HPV types even after complete adjustment for sexual behavior. We demonstrated that this residual association is due to correlation between the times-at-risk for different HPV types, where individuals become concurrently at risk for all of their partners' HPV types when they enter a partnership and are not at risk when they are single. This correlation can be controlled in prospective studies by restricting analyses to susceptible individuals with an infected sexual partner. The bias in the measured associations was largest in low-sexual-activity populations, cross-sectional studies, and studies which evaluated infection with a first HPV type as the exposure. These results suggest that current epidemiologic evidence does not preclude the existence of competitive biological interactions between HPV types. Summary Background We assessed prevalences of seven sexually transmitted infections (STIs) in Peru, stratified by risk behaviours, to help to define care and prevention priorities. Methods In a 2002 household-based survey of the general population, we enrolled randomly selected 18-29-year-old residents of 24 cities with populations greater than 50 000 people.
We then surveyed female sex workers (FSWs) in these cities. We gathered data for sexual behaviour; vaginal specimens or urine for nucleic acid amplification tests for Neisseria gonorrhoeae, Chlamydia trachomatis, and Trichomonas vaginalis; and blood for serological tests for syphilis, HIV, and (in subsamples) herpes simplex virus 2 (HSV2) and human T-lymphotropic virus. This study is a registered component of the PREVEN trial, number ISRCTN43722548. Findings 15 261 individuals from the general population and 4485 FSWs agreed to participate in our survey. Overall prevalence of infection with HSV2, weighted for city size, was 13·5% in men, 13·6% in women, and 60·6% in FSWs (all values in FSWs standardised to the age composition of women in the general population). The prevalence of C trachomatis infection was 4·2% in men, 6·5% in women, and 16·4% in FSWs; of T vaginalis infection was 0·3% in men, 4·9% in women, and 7·9% in FSWs; and of syphilis was 0·5% in men, 0·4% in women, and 0·8% in FSWs. N gonorrhoeae infection had a prevalence of 0·1% in men and women, and of 1·6% in FSWs. Prevalence of HIV infection was 0·5% in men and FSWs, and 0·1% in women. Four (0·3%) of 1535 specimens were positive for human T-lymphotropic virus 1. In men, 65·0% of infections with HIV, 71·5% of N gonorrhoeae, 41·4% of HSV2 and 60·9% of cases of syphilis were in the 13·3% who had sex with men or unprotected sex with FSWs in the past year. In women from the general population, 66·7% of infections with HIV and 16·7% of cases of syphilis were accounted for by the 4·4% who had been paid for sex by any of their past three partners. Interpretation Defining high-risk groups could guide targeting of interventions for communicable diseases, including STIs, in the general Peruvian population.
Funding Wellcome Trust-Burroughs Wellcome Fund Infectious Disease Initiative and US National Institutes of Health. BACKGROUND Risk factors influencing the incidence of human immunodeficiency virus (HIV) infection were investigated in a case-control study nested within a community-randomized trial of treatment of syndromic sexually transmitted infections (STIs) in rural Tanzania. METHODS Case patients were persons who became HIV-positive, and control subjects were randomly selected from among persons who remained HIV-negative. For each sex, we obtained adjusted odds ratios (ORs) and population-attributable fractions (PAFs) for biomedical and behavioral factors. RESULTS We analyzed 92 case patients and 903 control subjects. In both sexes, the incidence of HIV infection was significantly higher in subjects with an HIV-positive spouse than in those with an HIV-negative spouse (men: OR, 25.1; women: OR, 34.0). The incidence of HIV infection was significantly higher in those who became positive for herpes simplex virus type 2 (HSV-2) (men: OR, 5.60; women: OR, 4.76) and those who were HSV-2-positive at baseline (men: OR, 3.66; women: OR, 2.88) than in subjects who were HSV-2-negative. In women, living elsewhere (OR, 3.22) and never having given birth (OR, 4.27) were significant risk factors. After adjustment, the incidence of HIV infection was not significantly associated with a history of injections or STIs in either sex. CONCLUSION HSV-2 infection was the most important risk factor for HIV infection, which highlights the need for HSV-2 interventions in HIV infection control, and there were particularly strong associations with recent HSV-2 seroconversion.
The PAF associated with having an HIV-positive spouse was low, but this is likely to increase during the epidemic. Objectives: Several studies have demonstrated an association between herpes simplex virus type 2 (HSV-2) and HIV-1, but available data on risk factors for HSV-2 acquisition are limited. The objective of this analysis was to determine the incidence and risk factors for HSV-2 acquisition among HIV-1-seronegative female sex workers in Kenya. Methods: Between February 1993 and December 2006, HIV-1-seronegative women attending a municipal sexually transmitted infection (STI) clinic were invited to enroll in a prospective cohort study. Screening for HIV-1 and STIs was done at monthly follow-up visits. Archived blood samples were tested for HSV-2. Results: Of 1527 HIV-1-seronegative women enrolled, 302 (20%) were HSV-2-seronegative at baseline, of whom 297 had at least one follow-up visit. HSV-2 incidence was high (23 cases/100 person-years; 115 cases). In multivariate analysis, HSV-2 was significantly associated with more recent entry into sex work, workplace and higher number of sex partners per week. Condom use was protective, although this was statistically significant only for the intermediate stratum (25-75% condom use; HR 0.43; p=0.05). There were statistical trends for bacterial vaginosis to increase HSV-2 risk (HR 1.56; p=0.07) and for oral contraceptive use to decrease risk (HR 0.50; p=0.08). The 23% annual HSV-2 incidence in this study is among the highest reported anywhere in the world. Conclusions: Women were at increased risk if they had recently entered sex work, had a higher number of sex partners or worked in bars. HSV-2 risk reduction interventions are urgently needed among high-risk African women. BACKGROUND Infection with herpes simplex virus type 2 (HSV-2) is associated with an increased risk of acquiring infection with the human immunodeficiency virus (HIV).
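Incidence figures like "23 cases/100 person-years" above are events divided by accumulated person-time, scaled to 100 person-years. The sketch below is a hedged illustration: the person-time value is back-calculated from the reported 115 cases and 23/100 py rate, not taken from the paper.

```python
# Hypothetical sketch: incidence rate per 100 person-years.
# person_years below is back-calculated (115 / 23 * 100 = 500), not reported.

def incidence_per_100_py(events, person_years):
    """Incidence rate expressed per 100 person-years of follow-up."""
    return 100 * events / person_years

rate = incidence_per_100_py(events=115, person_years=500.0)
```

Person-time denominators account for the fact that each woman contributed a different length of follow-up before seroconverting or leaving the cohort.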
This study tested the hypothesis that HSV-2 suppressive therapy reduces the risk of HIV acquisition. METHODS Female workers at recreational facilities in northwestern Tanzania who were 16 to 35 years of age were interviewed and underwent serologic testing for HIV and HSV-2. We enrolled female workers who were HIV-seronegative and HSV-2-seropositive in a randomized, double-blind, placebo-controlled trial of suppressive treatment with acyclovir (400 mg twice daily). Participants attended mobile clinics every 3 months for a follow-up period of 12 to 30 months, depending on enrollment date. The primary outcome was the incidence of infection with HIV. We used a modified intention-to-treat analysis; data for participants who became pregnant were censored. Adherence to treatment was estimated by a tablet count at each visit. RESULTS A total of 821 participants were randomly assigned to receive acyclovir (400 participants) or placebo (421 participants); 679 (83%) completed follow-up. Mean follow-up for the acyclovir and placebo groups was 1.52 and 1.62 years, respectively. The incidence of HIV infection was 4.27 per 100 person-years (27 participants in the acyclovir group and 28 in the placebo group), and there was no overall effect of acyclovir on the incidence of HIV (rate ratio for the acyclovir group, 1.08; 95% confidence interval, 0.64 to 1.83). The estimated median adherence was 90%. Genital HSV was detected in a similar proportion of participants in the two study groups at 6, 12, and 24 months. No serious adverse events were attributable to treatment with acyclovir. CONCLUSIONS These data show no evidence that acyclovir (400 mg twice daily) as HSV suppressive therapy decreases the incidence of infection with HIV. (Current Controlled Trials number, ISRCTN35385041 [controlled-trials.com].
) Objectives To quantify the association between prevalent or incident Herpes simplex virus type-2 ( HSV2 ) infection and the incidence of HIV seroconversion among adults in the general population in rural Tanzania . Study population Adults aged 15–54 years sampled randomly from 12 rural communities in Mwanza Region , Tanzania and recruited to a randomized trial of improved treatment of sexually transmitted diseases . Study design Unmatched case – control study nested within trial cohort . Methods Participants included 127 cases who seroconverted to HIV during the 2-year follow-up period and 636 randomly selected controls who remained HIV negative . Subjects were tested for HSV2 serology at baseline and follow-up , and associations between HIV and HSV2 were analysed with adjustment for socio-demographic and behavioural factors . Results After adjusting for confounding factors , a strong association between HSV2 infection and HIV seroconversion was observed in men ( test for trend : P < 0.001 ) , with adjusted odds ratios ( OR ) of 6.12 [ 95 % confidence interval ( CI ) , 2.52–14.9 ] in those HSV2 positive at baseline , and 16.8 ( 95 % CI , 6.06–46.3 ) in those acquiring HSV2 infection during follow-up . A weaker association was observed in women ( tests for trend : P = 0.14 ) , with adjusted OR of 1.32 ( 95 % CI , 0.62–2.78 ) and 2.36 ( 95 % CI , 0.81–6.84 ) , respectively . Population attributable fractions of incident HIV infection due to HSV2 were estimated as 74 % in men and 22 % in women . Conclusions The results suggest that HSV2 plays an important role in the transmission of HIV infection in this population . There is an urgent need to identify effective HSV2 control measures in order to reduce HIV incidence in Africa Female sex workers in Europe have low levels of sexually transmitted infections , attributable to condom use .
The aim of this paper is to describe the seroepidemiology of HSV-1 and HSV-2 in female sex workers in London by using a 15-year prospective study of 453 sex workers . The seroprevalence of HSV-1 was 74.4 % and independently associated with birth in a ' transitional country ' ( OR 5.4 , 95 % CI 1.61 - 18.20 ) . The seroprevalence of HSV-2 was 60 % and declined over time ; it was also independently associated with time in sex work ( OR 2.12 , 95 % CI 1.23 - 3.65 ) and birth in a ' developing country ' ( OR 2.95 , 95 % CI 1.34 - 6.48 ) . We show that a cohort of sex workers with extensive condom use and little known sexually transmitted infection have high levels of HSV-1 and HSV-2 infection , suggesting that condoms may not be universally protective . Sex workers are candidates for HSV vaccine efficacy or intervention studies Background : Studies of the effect of hormonal contraceptive use on the risk of HIV-1 acquisition have generated conflicting results . A recent study from Uganda and Zimbabwe found that women using hormonal contraception were at increased risk for HIV-1 if they were seronegative for herpes simplex virus type 2 ( HSV-2 ) , but not if they were HSV-2 seropositive . Objective : To explore the effect of HSV-2 infection on the relationship between hormonal contraception and HIV-1 in a high-risk population . Hormonal contraception has previously been associated with increased HIV-1 risk in this population . Methods : Data were from a prospective cohort study of 1206 HIV-1 seronegative sex workers from Mombasa , Kenya who were followed monthly . Multivariate Cox proportional hazards analyses were used to adjust for demographic and behavioral measures and incident sexually transmitted diseases . Results : Two hundred and thirty-three women acquired HIV-1 ( 8.7/100 person-years ) . HSV-2 prevalence ( 81 % ) and incidence ( 25.4/100 person-years ) were high .
In multivariate analysis , including adjustment for HSV-2 , HIV-1 acquisition was associated with use of oral contraceptive pills [ adjusted hazard ratio ( HR ) , 1.46 ; 95 % confidence interval ( CI ) , 1.00–2.13 ] and depot medroxyprogesterone acetate ( adjusted HR , 1.73 ; 95 % CI , 1.28–2.34 ) . The effect of contraception on HIV-1 susceptibility did not differ significantly between HSV-2 seronegative versus seropositive women . HSV-2 infection was associated with elevated HIV-1 risk ( adjusted HR , 3.58 ; 95 % CI , 1.64–7.82 ) . Conclusions : In this group of high-risk African women , hormonal contraception and HSV-2 infection were both associated with increased risk for HIV-1 acquisition . HIV-1 risk associated with hormonal contraceptive use was not related to HSV-2 serostatus BACKGROUND Herpes simplex infection is responsible for substantial morbidity in patients with HIV infection . Data from less-developed countries analyzing risk factors within this population are largely unavailable . AIMS Investigate the incidence and seroprevalence of HSV-1 and HSV-2 infection in populations at high and low risk for HIV infection . MATERIALS AND METHODS A prospective cohort study was performed in a population at high risk for STDs composed of 170 HIV seronegative male homosexuals and bisexuals ( group A ) . The population at low risk for STDs was composed of 155 volunteer male blood donors ( group B ) . All blood samples were screened using a type specific ELISA to HSV-1 and HSV-2 glycoprotein G ( gG ) . RESULTS The prevalence of HSV-1 and HSV-2 infection among all the 325 patients was 83.5 % and 63.4 % , respectively . Annual incidence of HSV-1 and 2 among group A were 0.053 % and 0.08 % , respectively . Among group B , the incidence for HSV-1 was 0.04 % and for HSV-2 was 0.02 % .
Educational parameters ( P<0.001 ) , irregular use of condoms ( P<0.001 ) , and percentage of previous receptive anal intercourse ( P<0.012 ) were significantly associated with seropositivity to HSV-2 . About 8.4 % of the HSV-1 seronegative subjects presented recurrence episodes of herpes labialis as well as 7.6 % of the HSV-2 seronegative patients had genital herpes in the past . DISCUSSION The high seroprevalence detected suggests that routine screening for HSV should be performed in populations at high risk for STDs , especially in HIV-infected patients . CONCLUSION Educational campaigns , with particular focus on the transmission of HSV , and the regular use of condoms are important measures to reduce the HSV dissemination among patients with less advanced educations and at high risk for STDs Objective : To estimate the effects of reproductive tract infections ( RTIs ) on HIV acquisition among Zimbabwean and Ugandan women . Methods : A multicenter prospective observational cohort study enrolled 4439 HIV-uninfected women aged 18 to 35 attending family planning clinics in Zimbabwe and Uganda . Participants were interviewed , and tested for HIV and RTIs every 3 months for 15 to 24 months . They received HIV risk reduction counseling , male condoms , and treatment for curable RTIs . Results : Despite HIV risk reduction counseling and regular screening and treatment for RTIs , the HIV incidence did not decline during the study .
Positive HSV-2 serostatus at baseline ( hazard ratio [ HR ] = 3.69 , 95 % confidence interval = 2.45–5.55 ) , incident HSV-2 ( HR = 5.35 , 3.06–9.36 ) , incident Neisseria gonorrhoeae ( HR = 5.46 , 3.41–8.75 ) , and altered vaginal flora during the study ( bacterial vaginosis [ BV ] : HR = 2.12 , 1.50–3.01 ; and intermediate flora : HR = 2.02 , 1.39–2.95 ) were independently associated with HIV acquisition after controlling for demographic and behavioral covariates and other RTIs ( Treponema pallidum , Chlamydia trachomatis , Trichomonas vaginalis , and vaginal yeasts ) . For N. gonorrhoeae , C. trachomatis , T. vaginalis , and vaginal yeasts , the risk of HIV acquisition increased when the infection was identified at the visit before the HIV-detection visit or with the duration of infection . Population attributable risk percent ( PAR% ) calculations show that HSV-2 contributes most to acquisition of new HIV infections ( 50.4 % for baseline HSV-2 and 7.9 % for incident HSV-2 ) , followed by altered vaginal flora ( 17.2 % for bacterial vaginosis and 11.8 % for intermediate flora ) . Conclusions : A substantial proportion of new HIV infections in Zimbabwean and Ugandan women are attributable to RTIs , particularly HSV-2 and altered vaginal flora We assess the relative contribution of viral and bacterial sexually transmitted infections ( STIs ) on HIV acquisition among southern African women in a nested case-control study within the Methods for Improving Reproductive Health in Africa ( MIRA ) trial . Cases were women with incident HIV infection ; controls were HIV-uninfected at the time of case seroconversion selected in a 1 to 3 case to control ratio ( risk-set sampling ) , matched on study site and time of follow-up . Conditional logistic regression models were used to calculate adjusted odds ratios ( AORs ) and population-attributable fractions ( PAF ) . Among 4948 enrolled women , we analysed 309 cases and 927 controls .
The overall HIV incidence rate was 4.0 per 100 women-years . The incidence of HIV infection was markedly higher in women who had prevalent Herpes simplex virus type 2 ( HSV-2 ) ( AOR : 2.14 ; 95 % confidence interval [ CI ] : 1.55–2.96 ) , incident HSV-2 ( AOR : 4.43 ; 95 % CI : 1.77–11.05 ) and incident Neisseria gonorrhoeae ( AOR : 6.92 ; 95 % CI : 3.01–15.90 ) . The adjusted PAF of HIV incidence for prevalent HSV-2 was 29.0 % ( 95 % CI : 16.8–39.3 ) , for incident HSV-2 2.1 % ( 95 % CI : 0.6–3.6 ) and for incident N. gonorrhoeae 4.1 % ( 95 % CI : 2.5–5.8 ) . Women 's greatest risk factors for HIV acquisition were incident bacterial and viral STIs . Women-centred interventions aimed at decreasing HIV incidence in young African women need to address these common co-morbid conditions Seventy-seven patients with first episodes of genital herpes and 111 with recurrent episodes were enrolled in a double-blind trial comparing topical acyclovir with a placebo ( polyethylene glycol ointment ) . Among acyclovir-treated patients with first-episode primary genital herpes , the mean duration of viral shedding ( 4.1 days ) and the time to complete crusting of lesions present at the initiation of therapy ( 7.1 days ) were shorter than among placebo recipients ( 7.0 and 10.5 days , respectively ) ( P less than 0.05 ) . Acyclovir-treated patients with recurrent herpes had a shorter duration of viral shedding than placebo recipients ( 0.95 vs. 1.90 days ) ( P = 0.03 ) . Among the patients with recurrent herpes , acyclovir reduced the time to crusting of lesions in men but had no effect on the symptoms or healing times in women . Topical acyclovir shortens the duration of viral shedding and accelerates healing of some genital herpes simplex virus infections Genital herpes continues to be epidemic throughout the world ( 1 - 7 ) .
A recent population-based survey in the United States showed a 31 % increase in herpes simplex virus ( HSV ) type 2 seropositivity during the past decade ( 6 ) . The seroprevalence of HSV-2 infection ranges from 30 % to 50 % in most sexually transmitted disease clinics and from 20 % to 30 % in most family practice , obstetric , and general medicine clinics ( 5 , 7 - 9 ) . The seroprevalence of genital HSV-1 infection is also being reported with increasing frequency ( 10 ) . Previous studies of persons with initial episodes of genital herpes have shown that more than 90 % of persons infected with HSV-2 have a recurrence during the first year of follow-up ( 11 , 12 ) . However , little is known about the subsequent course of infection . Anecdotal reports have suggested that recurrences of HSV decrease over time ( 13 , 14 ) . However , in a study of the frequency of genital herpes recurrences among 22 women followed for two consecutive pregnancies , Harger and colleagues ( 15 ) were unable to detect any appreciable difference in recurrence rates between the first and second pregnancies . Fife and coworkers ( 16 ) investigated a group of patients enrolled in a long-term study of continuous daily acyclovir therapy . On discontinuation of therapy , most patients subsequently had a recurrence , although the rate was lower than that reported before treatment . The authors could not determine whether the lower rates were related to antiviral therapy or reflected the long-term natural history of the infection . Studies of objectively defined observations on the long-term clinical course of recurrent genital herpes are not available . Because decisions to use long-term suppressive therapy or episodic therapy are based on frequency of reactivation , knowledge of the infection 's natural history directly benefits clinical management ( 17 ) .
We report on the long-term history of recurrence rates among persons enrolled at a research clinic that studied the clinical course and pathogenesis of genital HSV infection . Methods Patients In 1974 , a research clinic dedicated to the study of symptomatic genital herpes infection was established at the Harborview Medical Center in Seattle , Washington ( 18 - 21 ) . From 1974 to 1991 , we recruited 664 patients with HSV infection documented by serologic studies or culture . These patients were enrolled in a prospective observational study of the frequency of genital HSV reactivation and were followed for at least 14 months . Testing for HIV infection was not part of the study protocol ; however , none of these patients demonstrated clinical immunosuppression at study entry or during follow-up . All patients gave informed consent , and institutional review board approval was obtained throughout observation . Using history , serologic studies , and viral isolation , we classified patients who presented to the clinic as having new or previous acquisition of genital herpes ( 12 , 18 - 21 ) . Patients with newly acquired HSV infection lack serum antibodies to the acquired HSV type , whereas patients with recurrent infection have these antibodies . From 1983 to 1991 , the serologic status of all patients was determined by using Western blot assay , which accurately distinguishes antibodies to HSV-1 from antibodies to HSV-2 ( 22 , 23 ) . Between 1974 and 1983 , all serum samples obtained at entry were screened by microneutralization assay ( 18 ) . All serum samples that demonstrated neutralizing activity were retrospectively analyzed by Western blot assay . Patients who had only antibodies to HSV-1 and whose HSV-2 infection was documented by culture or subsequent seroconversion to HSV-2 were included in this report as having non-primary initial HSV-2 infection . Patients who had antibodies to HSV-2 at enrollment were included as having recurrent HSV-2 infection .
Because neutralization and Western blot assays have equal sensitivity for detecting antibodies to HSV , all patients with entry serum samples that lacked neutralization were classified as having primary HSV infection ( 23 ) . A viral isolate for genital herpes was used to classify these patients as having primary HSV-1 or primary HSV-2 infection . Data Collection Patients presenting at the clinic with a symptomatic recurrence were followed every 2 to 3 days until their lesions healed . Patients were instructed to return to the clinic during each subsequent recurrence and for routine visits every 2 to 3 months ; they were also asked to return to the clinic for assessment of all genital symptoms until they were able to accurately recognize signs and symptoms of genital herpes recurrence ( 11 , 12 ) . At this time , routine clinic visits were changed from 2-month intervals to 4- to 6-month intervals to enhance long-term compliance with the protocol . Patients were asked to keep a record of onset and resolution dates for all recurrences so that we could document the time of onset of recurrences not observed by clinic personnel . Diary cards were collected and reviewed with clinic staff at each clinic visit . Data were coded on a standardized coding sheet and entered into a centralized database . Anatomic site of recurrence , date of onset , duration , and therapy were recorded for all genital lesions . A recurrence was defined as the presence of vesicles , ulcers , or crusts . Anatomic sites categorized as genital recurrences were the mons , vulva , or perineum in women ; the periurethral area , penis , and scrotum in men ; and the perianal area and buttocks in both sexes . If new lesions appeared before other lesions completely healed , all lesions were considered part of the same recurrence .
Statistical Analysis Annual recurrence rates for each patient were estimated by dividing the number of recorded recurrences in a specified period by the number of years in that period ( 12 , 20 ) . For example , the second-year recurrence rate for a patient followed the entire year is the number of recurrences in the second year . For a patient followed for only the first 2 months of the second year , the denominator is 2 divided by 12 ( 0.167 ) . Comparisons of patient groups with respect to recurrence rates were made by using the Wilcoxon rank-sum test . Recurrence rates between years 1 and 2 were compared by using the Wilcoxon signed-rank test . Exploratory data analyses ( 24 ) were used to investigate patterns over time in patients followed for more than 2 years . Two-way tables of recurrence rates with a row for each patient and a column for each year of data were created for subsets of patients with varying lengths of follow-up . Two methods were used to estimate an overall effect and patient and year effects : the least-squares fit ( the method used in a two-way analysis of variance when no values are missing ) and the median polish fit ( a method that is resistant to outliers ) . Outliers in estimated recurrence rates occurred when persons were followed for a short period ; that coincided with an unusually high number of recurrences . Examination of year effects suggested that it was reasonable to hypothesize a linear time trend in recurrence rates . We therefore used a random-effects model ( 25 ) to assess the changes in recurrence rates over time . The slope coefficient in this model represents the average annual change in the recurrence rate ( a positive slope if the recurrence rate increased and a negative slope if the recurrence rate decreased ) ; the intercept represents the average rate in the first year of follow-up . A plot of mean annual recurrence rates by year provided a visual description of the fitted models .
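The recurrence-rate arithmetic described above (recorded recurrences divided by the years of observation, with partial years prorated, e.g. 2 months contributing a denominator of 2/12 ≈ 0.167) can be sketched in a few lines. The function name and example counts below are illustrative only, not taken from the study:

```python
def annual_recurrence_rate(n_recurrences, months_observed):
    """Recurrences per year of observation.

    Numerator: count of recorded recurrences in the period.
    Denominator: fraction of a year the patient was actually followed,
    so a patient seen for only part of a year is prorated.
    """
    years_observed = months_observed / 12.0
    return n_recurrences / years_observed

# Patient followed the entire second year with 5 recurrences:
full_year = annual_recurrence_rate(5, 12)    # 5.0 per year

# Patient followed only the first 2 months of the second year
# (denominator 2/12, as in the text) with 1 recurrence:
partial_year = annual_recurrence_rate(1, 2)  # ≈ 6.0 per year
```

Prorating the denominator keeps short observation windows comparable to full years, at the cost of noisier estimates for briefly followed patients, which is why the study treats such outliers with resistant fits.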
We used a model that allowed the slope and intercept to vary from patient to patient . A consequence of this variability is correlation among the observations from a given patient . This is taken into account in the estimation procedure . We computed approximate 95 % CIs , assuming that the estimated slopes and intercepts were normally distributed . Whether such a normal distribution existed for all data on recurrence rates for all subsets of patients is unclear . Therefore , the reported CIs should be interpreted descriptively . When computing recurrence rates for individual patients , we excluded data from any time at which suppressive antiviral therapy was used . Some analyses were restricted to patients who never received suppressive therapy . Because antiviral treatment of symptomatic episodes has not been shown to affect recurrence rates ( 26 - 28 ) , such treatment was not considered in these analyses . All analyses were done by using S-Plus , version 3.1 ( Statistical Sciences , Inc. , Seattle , Washington ) . The function twoway was used for two-way table analyses , and the function varcomp was used for fitting random-effects models . Role of the Funding Source The agency had no role in the collection , analysis , or interpretation of the data or in the decision to submit this paper for publication . Results Of the 664 patients , 306 had newly acquired ( first-episode ) genital herpes . Sixty had primary HSV-1 infection , 205 had primary HSV-2 infection , and 41 had non-primary initial HSV-2 infection ( that is , HSV-2 infection after previous HSV-1 infection ) . Previously acquired ( recurrent ) HSV-2 infection was present at enrollment in the other 358 patients ( Table 1 ) . Our study included 277 men and 387 women . Demographic , socioeconomic , and sexual histories of our patients were similar to those of patients in previous reports ( 12 , 18 - 21 ) .
Of the 664 patients , 412 were followed for more than 2 years , 277 were followed for more than 3 years , 117 were followed for more than 6 years , and 52 were followed for more than 9 years . Table 1 . Characteristics of Study Sample One hundred ninety-four patients received suppressive therapy at some time during follow-up , including 38 of the patients who presented with first-episode infection . Among these patients , the median duration of suppressive therapy was 10 months ; the median follow-up time while patients were not receiving therapy was 35 months . During follow-up , patients had 11 967 BACKGROUND Across many observational studies , herpes simplex virus type 2 ( HSV-2 ) infection is associated with two-fold to three-fold increased risk for HIV-1 infection . We investigated whether HSV-2 suppression with aciclovir would reduce the risk of HIV-1 acquisition . METHODS We undertook a double-blind , randomised , placebo-controlled phase III trial in HIV-negative , HSV-2 seropositive women in Africa and men who have sex with men ( MSM ) from sites in Peru and the USA . Participants were randomly assigned by block randomisation to twice daily aciclovir 400 mg ( n=1637 ) or matching placebo ( n=1640 ) for 12 - 18 months , and were seen monthly for dispensation of study drug , adherence counselling and measurement by pill count and self-reporting , and risk reduction counselling , and every 3 months for genital examination and HIV testing . The primary outcome was HIV-1 acquisition and secondary was incidence of genital ulcers . Analysis was by intention to treat . This study is registered with ClinicalTrials.gov , number NCT00076232 . FINDINGS 3172 participants ( 1358 women , 1814 MSM ) were included in the primary data set ( 1581 in aciclovir group , 1591 in control group ) .
The incidence of HIV-1 was 3.9 per 100 person-years in the aciclovir group ( 75 events in 1935 person-years of follow-up ) and 3.3 per 100 person-years in the placebo group ( 64 events in 1969 person-years of follow-up ; hazard ratio 1.16 [ 95 % CI 0.83 - 1.62 ] ) . Incidence of genital ulcers on examination was reduced by 47 % ( relative risk 0.53 [ 0.46 - 0.62 ] ) and HSV-2 positive genital ulcers by 63 % ( 0.37 [ 0.31 - 0.45 ] ) in the aciclovir group . Adherence to dispensed study drug was 94 % in the aciclovir group and 94 % in the placebo group , and 85 % of expected doses in the aciclovir group and 86 % in the placebo group . Retention was 85 % at 18 months in both groups ( 1028 of 1212 in aciclovir group , 1030 of 1208 in placebo group ) . We recorded no serious events related to the study drug . INTERPRETATION Our results show that suppressive therapy with standard doses of aciclovir is not effective in reduction of HIV-1 acquisition in HSV-2 seropositive women and MSM . Novel strategies are needed to interrupt interactions between HSV-2 and HIV-1 Objective : The objective of this study was to understand temporal trends in the contribution of different genital tract infections to HIV incidence over 20 years of follow-up in a cohort of high-risk women . Design : A prospective cohort study . Methods : We performed monthly evaluations for HIV , vaginal yeast , bacterial vaginosis , Trichomonas vaginalis , Neisseria gonorrhoeae , nonspecific cervicitis , herpes simplex virus type two ( HSV-2 ) , genital ulcer disease ( GUD ) and genital warts . We used Cox regression to evaluate the association between sexually transmitted infections ( STIs ) and HIV acquisition over four time periods ( 1993–1997 , 1998–2002 , 2003–2007 , 2008–2012 ) . Models were adjusted for age , workplace , sexual risk behaviour , hormonal contraceptive use and other STIs . The resulting hazard ratios were used to calculate population attributable risk percentage ( PAR% ) .
Results : Between 1993 and 2012 , 1964 women contributed 6135 person-years of follow-up . The overall PAR% for each infection was prevalent HSV-2 ( 48.3 % ) , incident HSV-2 ( 4.5 % ) , bacterial vaginosis ( 15.1 % ) , intermediate microbiota ( 7.5 % ) , vaginal yeast ( 6.4 % ) , T. vaginalis ( 1.1 % ) , N. gonorrhoeae ( 0.9 % ) , nonspecific cervicitis ( 0.7 % ) , GUD ( 0.8 % ) and genital warts ( −0.2 % ) . Across the four time periods , the PAR% for prevalent HSV-2 ( 40.4 % , 61.8 % , 58.4 % , 48.3 % ) and bacterial vaginosis ( 17.1 % , 19.5 % , 14.7 % , 17.1 % ) remained relatively high and had no significant trend for change over time . The PAR% for trichomoniasis , gonorrhoea , GUD and genital warts remained less than 3 % across the four periods . Conclusion : Bacterial vaginosis and HSV-2 have consistently been the largest contributors to HIV acquisition risk in the Mombasa Cohort over the past 20 years . Interventions that prevent these conditions would benefit women 's health and could reduce their risk of becoming infected with HIV HSV-2 infection is common and generally asymptomatic , but it is associated with increased HIV susceptibility and disease progression . This may relate to herpes-mediated changes in genital and systemic immunology . Cervical cytobrushes and blood were collected from HIV-uninfected African/Caribbean women in Toronto , and immune cell subsets were enumerated blindly by flow cytometry . Immune differences between groups were assessed by univariate analysis and confirmed using a multivariate model . Study participants consisted of 46 women , of whom 54 % were infected with HSV-2 . T cell activation and expression of the mucosal homing integrin α4β7 ( 19.60 versus 8.76 % ; p < 0.001 ) were increased in the blood of HSV-2–infected women . Furthermore , expression of α4β7 on blood T cells correlated with increased numbers of activated ( coexpressing CD38/HLA-DR ; p = 0.004 ) and CCR5 + ( p = 0.005 ) cervical CD4 + T cells . 
HSV-2–infected women exhibited an increase in the number of cervical CD4 + T cells ( 715 versus 262 cells/cytobrush ; p = 0.016 ) , as well as an increase in the number and proportion of cervical CD4 + T cells that expressed CCR5 + ( 406 versus 131 cells , p = 0.001 ; and 50.70 versus 34.90 % , p = 0.004 ) and were activated ( 112 versus 13 cells , p < 0.001 ; and 9.84 versus 4.86 % , p = 0.009 ) . Mannose receptor expression also was increased on cervical dendritic cell subsets . In conclusion , asymptomatic HSV-2 infection was associated with significant systemic and genital immune changes , including increased immune activation and systemic α4β7 expression ; correlation of the latter with highly HIV-susceptible CD4 + T cell subsets in the cervix may provide a mechanism for the increased HIV susceptibility observed in asymptomatic HSV-2–infected women
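Several of the abstracts above report population attributable fractions (PAF/PAR%) derived from exposure prevalence and a relative-risk estimate (hazard ratios are often substituted in cohort data). The abstracts do not spell out the exact estimator each study used, so as a hedged illustration, Levin's classical formula can be sketched as:

```python
def levin_paf(prevalence, relative_risk):
    """Levin's population attributable fraction:

        PAF = p * (RR - 1) / (1 + p * (RR - 1))

    where p is the prevalence of the exposure in the population and RR
    is the relative risk of the outcome given exposure. Returns a
    fraction in [0, 1) for RR >= 1.
    """
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Illustrative values only (not from any study above): an exposure
# carried by 80 % of a cohort with a hazard ratio of 3.6 would account
# for roughly two-thirds of incident cases under this formula.
paf_percent = round(levin_paf(0.80, 3.6) * 100, 1)  # ≈ 67.5
```

Adjusted PAFs, as in the case-control analyses above, require estimators that account for confounding; Levin's formula is shown only because it makes the prevalence-times-excess-risk logic behind the reported PAR% figures explicit.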
2,095
26,365,102
The preponderance of evidence from all human randomized controlled trials indicates that LES do not increase EI or BW , whether compared with caloric or non-caloric ( for example , water ) control conditions . Overall , the balance of evidence indicates that use of LES in place of sugar , in children and adults , leads to reduced EI and BW , and possibly also when compared with water
By reducing energy density , low-energy sweeteners ( LES ) might be expected to reduce energy intake ( EI ) and body weight ( BW ) .
The effect of television viewing ( TVV ) and pubertal status of 9- to 14-y-old girls on mealtime food intake ( FI ) after a premeal glucose drink was determined . On four separate mornings , girls randomly received equally sweetened drinks containing Sucralose ( control ) or glucose ( 1.0 g/kg body weight ) in 250 mL of water 2 h after a standardized breakfast . FI from an ad libitum pizza meal was measured 30 min later with or without TVV . Appetite was measured at 15 min intervals to lunch and postmeal . TVV at mealtime had no effect on FI , however , glucose suppressed FI more with no TVV compared with TVV ( 24 versus 10 % , p < 0.001 ) , primarily because of its effect in peripubertal girls ( p < 0.028 ) . In postpubertal girls ( n = 8) , glucose reduced FI by ∼27 % in both the no TVV and TVV conditions , but in peripubertal girls ( n = 17 ) , reduction in FI was 22 % without TVV and only 1 % while TVV . Appetite correlated with FI at 30 min only in postpubertal girls . TVV at mealtime reduced caloric compensation after consumption of the glucose drink in peripubertal , but not postpubertal , girls , with no effect on mealtime FI . ( Clinical trial number NCT01025687 . ) The long-term physiological effects of refined carbohydrates on appetite and mood remain unclear . Reported effects when subjects are not blind may be due to expectations and have rarely been studied for more than 24 h. The present study compared the effects of supplementary soft drinks added to the diet over 4 weeks on dietary intake , mood and BMI in normal-weight women ( n 133 ) . Subjects were categorised as ' watchers ' or ' non-watchers ' of what they ate then received sucrose or artificially sweetened drinks ( 4 x 250 ml per d ) . Expectancies were varied by labelling drinks ' sugar ' or ' diet ' in a counter-balanced design . Sucrose supplements provided 1800 kJ per d and sweetener supplements provided 67 kJ per d.
Food intake was measured with a 7 d diary and mood with ten single Likert scales . By 4 weeks , sucrose supplements significantly reduced total carbohydrate intake ( F(1,129 ) = 53.81 ; P<0.001 ) , fat ( F(2,250 ) = 33.33 ; P<0.001 ) and protein intake ( F(2,250 ) = 28.04 ; P<0.001 ) compared with sweetener supplements . Mean daily energy intake increased by just under 1000 kJ compared with baseline ( t ( 67 df ) = 3.82 ; P < 0.001 ) and was associated with a non-significant trend for those receiving sucrose to gain weight . There were no effects on appetite or mood . Neither dietary restraint status as measured by the Dutch Eating Behaviour Questionnaire nor the expectancy procedure had effects . Expectancies influenced mood only during baseline week . It is concluded that sucrose satiates , rather than stimulates , appetite or negative mood in normal-weight subjects The objective was to compare the effects of ad libitum consumption of commonly consumed meal-time beverages on energy and fluid intakes and post-meal average subjective appetite and blood glucose in healthy adults . In a randomized controlled design , 29 males and females consumed to satiation an ad libitum pizza meal with one of five beverages in unlimited amount including water ( 0 kcal ) , 1 % milk ( 44 kcal/100 ml ) , regular cola ( 44 kcal/100 ml ) , orange juice ( 44 kcal/100 ml ) and diet cola ( 0 kcal ) . Food and fluid intakes were measured at the meal . Average subjective appetite and blood glucose were measured before and for 2h after the meal . Although energy intake from pizza was similar among all beverage treatments , the amount of fluid consumed ( g ) varied among the beverages with intake of orange juice higher than regular and diet cola , but not different from water or milk . Meal-time ingestion of caloric beverages , milk , orange juice and regular cola , led to higher total meal-time energy intakes compared to either water or diet cola .
Post-meal blood glucose area under the curve ( AUC ) was lower after milk than after meals with water , orange juice and regular cola and post-meal average subjective appetite AUC was lower after milk than after meals with water . Meal intakes of nutrients including protein , calcium , phosphorus , zinc , vitamins B12 , A and D were higher at the meal with milk compared to the other beverages . Thus , caloric beverages consumed ad libitum during a meal add to total meal-time energy intake , but 1 % milk favors a lower post-meal blood glucose and average subjective appetite score and adds to nutrient intake BACKGROUND The consumption of beverages that contain sugar is associated with overweight , possibly because liquid sugars do not lead to a sense of satiety , so the consumption of other foods is not reduced . However , data are lacking to show that the replacement of sugar-containing beverages with noncaloric beverages diminishes weight gain . METHODS We conducted an 18-month trial involving 641 primarily normal-weight children from 4 years 10 months to 11 years 11 months of age . Participants were randomly assigned to receive 250 ml ( 8 oz ) per day of a sugar-free , artificially sweetened beverage ( sugar-free group ) or a similar sugar-containing beverage that provided 104 kcal ( sugar group ) . Beverages were distributed through schools . At 18 months , 26 % of the children had stopped consuming the beverages ; the data from children who did not complete the study were imputed . RESULTS The z score for the body-mass index ( BMI , the weight in kilograms divided by the square of the height in meters ) increased on average by 0.02 SD units in the sugar-free group and by 0.15 SD units in the sugar group ; the 95 % confidence interval ( CI ) of the difference was -0.21 to -0.05 . Weight increased by 6.35 kg in the sugar-free group as compared with 7.37 kg in the sugar group ( 95 % CI for the difference , -1.54 to -0.48 ) .
The skinfold-thickness measurements , waist-to-height ratio , and fat mass also increased significantly less in the sugar-free group . Adverse events were minor . When we combined measurements at 18 months in 136 children who had discontinued the study with those in 477 children who completed the study , the BMI z score increased by 0.06 SD units in the sugar-free group and by 0.12 SD units in the sugar group ( P=0.06 ) . CONCLUSIONS Masked replacement of sugar-containing beverages with noncaloric beverages reduced weight gain and fat accumulation in normal-weight children . ( Funded by the Netherlands Organization for Health Research and Development and others ; DRINK ClinicalTrials.gov number , NCT00893529 . ) To examine whether artificial sweeteners aid in the control of long-term food intake and body weight , we gave free-living , normal-weight subjects 1150 g soda sweetened with aspartame ( APM ) or high-fructose corn syrup ( HFCS ) per day . Relative to when no soda was given , drinking APM-sweetened soda for 3 wk significantly reduced calorie intake of both females ( n = 9 ) and males ( n = 21 ) and decreased the body weight of males but not of females . However , drinking HFCS-sweetened soda for 3 wk significantly increased the calorie intake and body weight of both sexes . Ingesting either type of soda reduced intake of sugar from the diet without affecting intake of other nutrients . Drinking large volumes of APM-sweetened soda , in contrast to drinking HFCS-sweetened soda , reduces sugar intake and thus may facilitate the control of calorie intake and body weight . This study investigated whether the addition of the high-intensity sweetener aspartame to a multidisciplinary weight-control program would improve weight loss and long-term control of body weight .
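The DRINK abstract above quotes the BMI definition directly (weight in kilograms divided by the square of the height in meters); as a one-line sketch (the example weight and height are hypothetical):

```python
def bmi(weight_kg, height_m):
    """Body-mass index as defined in the abstract: kg / m^2."""
    return weight_kg / height_m ** 2

# Hypothetical example: a 30 kg child who is 1.25 m tall
example = bmi(30.0, 1.25)  # 19.2 kg/m^2
```

The trial's endpoint is the BMI z score, i.e. this value standardized against an age- and sex-specific reference distribution, which requires external reference tables not reproduced here.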
One hundred sixty-three obese women were randomly assigned to consume or to abstain from aspartame-sweetened foods and beverages during 16 wk of a 19-wk weight-reduction program ( active weight loss ) , a 1-y maintenance program , and a 2-y follow-up period . Women in both treatment groups lost approximately 10 % of initial body weight ( 10 kg ) during active weight loss . Among women assigned to the aspartame-treatment group , aspartame intake was positively correlated with percentage weight loss during active weight loss ( r = 0.32 , P < 0.01 ) . During maintenance and follow-up , participants in the aspartame group experienced a 2.6 % ( 2.6 kg ) and 4.6 % ( 4.6 kg ) regain of initial body weight after 71 and 175 wk , respectively , whereas those in the no-aspartame group gained an average of 5.4 % ( 5.4 kg ) and 9.4 % ( 9.4 kg ) , respectively . The aspartame group lost significantly more weight overall ( P = 0.028 ) and regained significantly less weight during maintenance and follow-up ( P = 0.046 ) than did the no-aspartame group . Percentage weight losses at 71 and 175 wk were also positively correlated with exercise ( r = 0.32 , P < 0.001 ; and r = 0.34 , P < 0.01 , respectively ) and self-reported eating control ( r = 0.37 , P < 0.001 ; and r = 0.33 , P < 0.01 , respectively ) . These data suggest that participation in a multidisciplinary weight-control program that includes aspartame may facilitate the long-term maintenance of reduced body weight . Using a within-subjects design , we gave over-weight and normal-weight subjects a 500-mL drink of fructose , glucose , or aspartame diluted in lemon-flavored water or plain water in a randomized fashion at about weekly intervals . Food intake was assessed at a buffet lunch that began 38 min after the preload was completed . Blood was drawn throughout and assayed for concentrations of glucose , insulin , glucagon , and free fatty acid .
When subjects drank the fructose preload , they subsequently ate fewer overall calories and fewer grams of fat than when they drank any of the other preloads . The aspartame load did not stimulate intake beyond the plain-water control . The effect of the oxidation of fructose as a possible mechanism for the reduction in food intake is discussed . The effects of insulin in stimulating intake are also discussed . The extent and time course of caloric compensation for surreptitious dilutions and supplements to the energy value of the diet were examined in free-living normal-weight adults . Ten subjects were provided lunches containing approximately 66 % more or less calories than their customary midday meal for 2-wk periods which were interposed between 1-wk baseline or recovery periods . Diet records were kept throughout the study . Total energy intakes did not differ among the three control periods ( weeks 1 , 4 , and 7 ) or between any of these periods and when subjects were provided the low-calorie meal . Total energy intake was significantly higher relative to all other periods when subjects ingested the high-calorie meal . To the extent that compensation occurred , it was apparent immediately and did not appear to change over the 2-wk study periods . The results suggest that humans compensate more readily for decreases than for increases in caloric intake . BACKGROUND Sensory-specific satiety has been found to play an important role in food choice and meal termination , and it might be a factor contributing to obesity . OBJECTIVE We hypothesized that obese and normal-weight people have different sensitivities to sensory-specific satiety for high-fat foods . DESIGN Sensory-specific satiety was measured in 21 obese [ x body mass index ( BMI ; in kg/m(2 ) ) : 33.1 ] and 23 normal-weight ( BMI : 22.8 ) women who were matched for restrained eating behavior , physical activity , age , and smoking behavior .
Food intake , appetite ratings , and liking scores before and after an ad libitum lunch were measured . Products differed in fat content and taste ( ie , low-fat sweet , low-fat savory , high-fat sweet , and high-fat savory ) , and the subjects tested all 4 products . In the first study , sandwiches were tested ; in the second study , snacks were tested . RESULTS Sensory-specific satiety for all products was observed in both subject groups . No significant differences were observed between the obese and normal-weight subjects in either sensory-specific satiety or food intake for any of the products or product categories tested . Taste ( sweet or savory ) had a significantly ( P < 0.05 ) stronger effect on sensory-specific satiety than did fat content . Appetite ratings strongly decreased after lunch , and appetite for a meal or snack after lunch was significantly higher in obese than in normal-weight subjects , whereas scores before lunch did not differ significantly . CONCLUSIONS Obese and normal-weight people do not differ in their sensitivity to sensory-specific satiety , and factors other than fat content have the greatest effect on sensory-specific satiety . BACKGROUND Consumption of liquid calories from beverages has increased in parallel with the obesity epidemic in the US population , but their causal relation remains unclear . OBJECTIVE The objective of this study was to examine how changes in beverage consumption affect weight change among adults . DESIGN This was a prospective study of 810 adults participating in the PREMIER trial , an 18-mo randomized , controlled , behavioral intervention trial . Measurements ( weight , height , and 24-h dietary recall ) were made at baseline , 6 mo , and 18 mo . RESULTS Baseline mean intake of liquid calories was 356 kcal/d ( 19 % of total energy intake ) .
After potential confounders and intervention assignment were controlled for , a reduction in liquid calorie intake of 100 kcal/d was associated with a weight loss of 0.25 kg ( 95 % CI : 0.11 , 0.39 ; P < 0.001 ) at 6 mo and of 0.24 kg ( 95 % CI : 0.06 , 0.41 ; P = 0.008 ) at 18 mo . A reduction in liquid calorie intake had a stronger effect than did a reduction in solid calorie intake on weight loss . Of the individual beverages , only intake of sugar-sweetened beverages ( SSBs ) was significantly associated with weight change . A reduction in SSB intake of 1 serving/d was associated with a weight loss of 0.49 kg ( 95 % CI : 0.11 , 0.82 ; P = 0.006 ) at 6 mo and of 0.65 kg ( 95 % CI : 0.22 , 1.09 ; P = 0.003 ) at 18 mo . CONCLUSIONS These data support recommendations to limit liquid calorie intake among adults and to reduce SSB consumption as a means to accomplish weight loss or avoid excess weight gain . This trial was registered at clinicaltrials.gov as NCT00000616 . OBJECTIVE : To examine the long-term relationship between changes in water and beverage intake and weight change . SUBJECTS : Prospective cohort studies of 50 013 women aged 40–64 years in the Nurses ’ Health Study ( NHS , 1986–2006 ) , 52 987 women aged 27–44 years in the NHS II ( 1991–2007 ) and 21 988 men aged 40–64 years in the Health Professionals Follow-up Study ( 1986–2006 ) without obesity and chronic diseases at baseline . MEASURES : We assessed the association of weight change within each 4-year interval , with changes in beverage intakes and other lifestyle behaviors during the same period . Multivariate linear regression with robust variance and accounting for within-person repeated measures were used to evaluate the association . Results across the three cohorts were pooled by an inverse-variance-weighted meta-analysis . RESULTS : Participants gained an average of 1.45 kg ( 5th to 95th percentile : −1.87 to 5.46 ) within each 4-year period .
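The pooling step described above (inverse-variance-weighted meta-analysis across the three cohorts) amounts to a fixed-effect pool in which each cohort's estimate is weighted by the reciprocal of its squared standard error. A minimal sketch; the per-cohort estimates and standard errors below are hypothetical placeholders, not the study's data:

```python
import math

def inverse_variance_pool(estimates, std_errors):
    """Fixed-effect inverse-variance pooling: weights w_i = 1 / se_i^2."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * b for w, b in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical 4-year weight-change estimates (kg) from three cohorts
est, se = inverse_variance_pool([0.30, 0.40, 0.35], [0.05, 0.10, 0.08])
```

The design choice here is that more precise cohorts (smaller standard errors) dominate the pooled estimate, and the pooled standard error shrinks as cohorts are added.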
After controlling for age , baseline body mass index and changes in other lifestyle behaviors ( diet , smoking habits , exercise , alcohol , sleep duration , TV watching ) , each 1 cup per day increment of water intake was inversely associated with weight gain within each 4-year period ( −0.13 kg ; 95 % confidence interval ( CI ) : −0.17 to −0.08 ) . The associations for other beverages were : sugar-sweetened beverages ( SSBs ) ( 0.36 kg ; 95 % CI : 0.24–0.48 ) , fruit juice ( 0.22 kg ; 95 % CI : 0.15–0.28 ) , coffee ( −0.14 kg ; 95 % CI : −0.19 to −0.09 ) , tea ( −0.03 kg ; 95 % CI : −0.05 to −0.01 ) , diet beverages ( −0.10 kg ; 95 % CI : −0.14 to −0.06 ) , low-fat milk ( 0.02 kg ; 95 % CI : −0.04 to 0.09 ) and whole milk ( 0.02 kg ; 95 % CI : −0.06 to 0.10 ) . We estimated that replacement of 1 serving per day of SSBs by 1 cup per day of water was associated with 0.49 kg ( 95 % CI : 0.32–0.65 ) less weight gain over each 4-year period , and the replacement estimate of fruit juices by water was 0.35 kg ( 95 % CI : 0.23–0.46 ) . Substitutions of SSBs or fruit juices by other beverages ( coffee , tea , diet beverages , low-fat and whole milk ) were all significantly and inversely associated with weight gain . CONCLUSION : Our results suggest that increasing water intake in place of SSBs or fruit juices is associated with lower long-term weight gain . BACKGROUND The consumption of sucrose-sweetened soft drinks ( SSSDs ) has been associated with obesity , the metabolic syndrome , and cardiovascular disorders in observational and short-term intervention studies . Too few long-term intervention studies in humans have examined the effects of soft drinks . OBJECTIVE We compared the effects of SSSDs with those of isocaloric milk and a noncaloric soft drink on changes in total fat mass and ectopic fat deposition ( in liver and muscle tissue ) .
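The replacement estimates quoted in the cohort analysis above follow from differencing the per-beverage coefficients: swapping a daily SSB serving for a cup of water gives 0.36 − (−0.13) = 0.49 kg less 4-year weight gain, and fruit juice for water gives 0.22 − (−0.13) = 0.35 kg. A sketch of that arithmetic (coefficients copied from the abstract; the function name is illustrative):

```python
# Per-cup 4-year weight-change coefficients (kg) quoted in the abstract
coef = {
    "water": -0.13,
    "ssb": 0.36,
    "fruit_juice": 0.22,
}

def substitution_estimate(replaced, replacement):
    """Weight-gain difference expected from swapping one beverage for another."""
    return coef[replaced] - coef[replacement]

ssb_to_water = substitution_estimate("ssb", "water")            # 0.49 kg
juice_to_water = substitution_estimate("fruit_juice", "water")  # 0.35 kg
```

(The study's confidence intervals for the substitution estimates additionally account for the covariance between coefficients, which this simple difference does not.)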
DESIGN Overweight subjects ( n = 47 ) were randomly assigned to 4 different test drinks ( 1 L/d for 6 mo ) : SSSD ( regular cola ) , isocaloric semiskim milk , aspartame-sweetened diet cola , and water . The amount of intrahepatic fat and intramyocellular fat was measured with (1)H-magnetic resonance spectroscopy . Other endpoints were fat mass , fat distribution ( dual-energy X-ray absorptiometry and magnetic resonance imaging ) , and metabolic risk factors . RESULTS The relative changes between baseline and the end of 6-mo intervention were significantly higher in the regular cola group than in the 3 other groups for liver fat ( 132 - 143 % , sex-adjusted mean ; P < 0.01 ) , skeletal muscle fat ( 117 - 221 % ; P < 0.05 ) , visceral fat ( 24 - 31 % ; P < 0.05 ) , blood triglycerides ( 32 % ; P < 0.01 ) , and total cholesterol ( 11 % ; P < 0.01 ) . Total fat mass was not significantly different between the 4 beverage groups . Milk and diet cola reduced systolic blood pressure by 10 - 15 % compared with regular cola ( P < 0.05 ) . Otherwise , diet cola had effects similar to those of water . CONCLUSION Daily intake of SSSDs for 6 mo increases ectopic fat accumulation and lipids compared with milk , diet cola , and water . Thus , daily intake of SSSDs is likely to enhance the risk of cardiovascular and metabolic diseases . This trial is registered at clinicaltrials.gov as NCT00777647 . Television viewing ( TVV ) is considered a contributing factor to the development of childhood obesity yet it is unclear whether obesity results , in part , from increased energy intake during TVV . The objective of this study was to determine the effect of TVV on food intake ( FI ) of boys at a meal and its effect on caloric compensation at the test meal after a premeal glucose drink .
On four separate mornings and in random order , boys received equally sweetened preloads containing Splenda sucralose or glucose [ 1.0 g/kg body weight ( BW ) ] in 250 mL of water 2 h after a standard breakfast . Food intake from a pizza meal was measured 30 min later with or without TVV . Both preload treatment ( p < 0.01 ) and TVV ( p < 0.001 ) affected FI ( kcal ) . TVV increased lunchtime FI by an average of 228 kcal . Glucose suppressed FI in the no TVV condition compared with control , but the effect was not statistically significant during TVV . Body composition and subjective appetite scores were positively associated with FI at the test lunch . In conclusion , TVV while eating a meal contributes to increased energy intake by delaying normal mealtime satiation and reducing satiety signals from previously consumed foods . The objective was to examine the extent to which overfeeding reduces spontaneous food intake in humans . Twelve normal-weight adults participated in the three-stage study . During the 14 day baseline period and 21 day recovery period , food intake was consumed ad libitum , beyond a minimum 5 MJ ( 1200 kcal ) basal diet . During the 13 day period of overfeeding , each subject consumed 35 % more energy than they consumed at baseline . Overfeeding resulted in a weight gain of 2.3+/-0.37 kg ( p<0.0001 ) ; approximately half the weight gain was determined to be fat ( 1.2+/-0.19 kg , p<0.0001 ) by underwater densitometry . Following overfeeding , mean daily caloric intake was not significantly suppressed , returning immediately to baseline values . Despite normal energy intake , participants lost 1.3+/-0.24 kg of body weight ( p<0.0001 ) , of which 0.75+/-0.15 kg ( p<0.0001 ) was fat .
These results indicated that ( 1 ) the physiological control of eating behavior in humans is not the major mechanism responsible for the recovery of body weight following a period of overfeeding and ( 2 ) an increase in energy expenditure of 1.28 MJ ( 307 kcal)/day or about 14 % was required to account for the weight loss following overfeeding . Aspartame administered in capsules ( i.e. , without tasting ) 1 h before a meal significantly reduces the amount eaten in that meal . In the present study 36 young men and women were divided into 3 groups of 12 to receive aspartame ( 400 mg ) or placebo ( 400 mg starch ) on separate occasions either 5 min ( Group A ) , 30 min ( Group B ) or 60 min ( Group C ) before beginning an ad lib test meal . Compared with placebo , aspartame reduced food intake in Group C ( by 18.5 % , p < 0.01 ) , but did not reliably affect intake in Groups A or B. There were , in contrast , no significant effects of aspartame on premeal ratings of hunger , desire to eat or fullness for any of the groups . These results confirm a postingestive inhibitory action of aspartame on appetite , which may involve the amplification of the satiating effects of food . The lack of effect of aspartame administered at the shorter intervals before eating suggests a postgastric or even postabsorptive mechanism of action . This observation is also important in its implications for the possible therapeutic exploitation of the anorexic effect of capsulated aspartame . Previous studies have yielded inconsistent results when documenting the association between key dietary factors and adolescent weight change over time . The purpose of this study was to examine the extent to which changes in adolescent sugar-sweetened beverage ( SSB ) , diet soda , breakfast , and fast-food consumption were associated with changes in BMI and percent body fat ( PBF ) . This study analyzed data from a sample of 693 Minnesota adolescents followed over 2 years .
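The 1.28 MJ/day ≈ 307 kcal/day equivalence in the overfeeding abstract above is a straight unit conversion (1 kcal = 4.184 kJ, the thermochemical value; using a slightly different calorie definition yields the quoted 307); a quick check:

```python
KJ_PER_KCAL = 4.184  # thermochemical kilocalorie

def mj_to_kcal(mj):
    """Convert megajoules to kilocalories."""
    return mj * 1000.0 / KJ_PER_KCAL

daily_kcal = mj_to_kcal(1.28)  # ~306 kcal/day, close to the 307 kcal quoted
```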
Random coefficient models were used to examine the relationship between dietary intake and BMI and PBF and to separate cross-sectional and longitudinal associations . Adjusting for total physical activity , total energy intake , puberty , race , socioeconomic status , and age , cross-sectional findings indicated that for both males and females , breakfast consumption was significantly and inversely associated with BMI and PBF , and diet soda intake was significantly and positively associated with BMI and PBF among females . In longitudinal analyses , however , there were fewer significant associations . Among males there was evidence of a significant longitudinal association between SSB consumption and PBF ; after adjustment for energy intake , an increase of one serving of SSB per day was associated with an increase of 0.7 units of PBF among males . This study adds to previous research through its methodological strengths , including adjustment for physical activity and energy intake assessed using state-of-the-art methods ( i.e. , accelerometers and 24-h dietary recalls ) , as well as its evaluation of both BMI and PBF . Additional research is needed to better understand the complex constellation of factors that contribute to adolescent weight gain over time . Background : People learn about a food 's satiating capacity by exposure and consequently adjust their energy intake . Objective : To investigate the effect of energy density and texture on subsequent energy intake adjustments during repeated consumption . Design : In a randomized crossover design , participants ( n=27 , age : 21±2.4 years , body mass index : 22.2±1.6 kg m−2 ) repeatedly consumed highly novel foods that were either low-energy-dense ( LE : 30 kcal per 100 g ) or high-energy-dense ( HE : 130 kcal per 100 g ) , and either liquid or semi-solid , resulting in four product conditions .
In each condition , a fixed portion of test food was consumed nine times as an obligatory part of breakfast , lunch and dinner on 3 consecutive days . All meals continued with an ad libitum buffet ; food items for evening consumption were provided and the intake ( kcal per day ) was measured . Results : Buffet intake depended on energy density and day of consumption of the test foods ( day*energy interaction : P=0.02 ) ; daily buffet intake increased from day 1 ( 1745±577 kcal ) to day 3 ( 1979±567 kcal ) in the LE conditions ; intake did not change in the HE conditions ( day 1 : 1523±429 kcal , day 3 : 1589±424 kcal ) . Food texture did not affect the intake ( P=0.56 ) . Conclusions : Intake did depend on energy density of the test foods ; participants increased their buffet intake over days in response to learning about the satiating capacity of the LE foods , but did not change buffet intake over days when repeatedly consuming a HE food as part of their meal . The adjustments in intake were made irrespective of the food texture . BACKGROUND Little is understood about the effect of increased consumption of low-calorie sweeteners in diet beverages on dietary patterns and energy intake . OBJECTIVE We investigated whether energy intakes and dietary patterns were different in subjects who were randomly assigned to substitute caloric beverages with either water or diet beverages ( DBs ) . DESIGN Participants from the Choose Healthy Options Consciously Everyday randomized clinical trial ( a 6-mo , 3-arm study ) were included in the analysis [ water groups : n = 106 ( 94 % women ) ; DB group : n = 104 ( 82 % women ) ] . For energy , macronutrient , and food and beverage intakes , we investigated the main effects of time , treatment , and the treatment-by-time interaction by using mixed models . RESULTS Overall , the macronutrient composition changed in both groups without significant differences between groups over time .
Both groups reduced absolute intakes of total daily energy , carbohydrates , fat , protein , saturated fat , total sugar , added sugar , and other carbohydrates . The DB group decreased energy from all beverages more than the water group did only at month 3 ( P-group-by-time < 0.05 ) . Although the water group had a greater reduction in grain intake at month 3 and a greater increase in fruit and vegetable intake at month 6 ( P-group-by-time < 0.05 ) , the DB group had a greater reduction in dessert intake than the water group did at month 6 ( P-group-by-time < 0.05 ) . CONCLUSIONS Participants in both intervention groups showed positive changes in energy intakes and dietary patterns . The DB group showed decreases in most caloric beverages and specifically reduced more desserts than the water group did . Our study does not provide evidence to suggest that a short-term consumption of DBs , compared with water , increases preferences for sweet foods and beverages . This trial was registered at clinicaltrials.gov as NCT01017783 . Background / Objective : The sweet-taste receptor ( T1r2+T1r3 ) is expressed by enteroendocrine L-cells throughout the gastrointestinal tract . Application of sucralose ( a non-calorific , non-metabolisable sweetener ) to L-cells in vitro stimulates glucagon-like peptide (GLP)-1 secretion , an effect that is inhibited with co-administration of a T1r2+T1r3 inhibitor . We conducted a randomised , single-blinded , crossover study in eight healthy subjects to investigate whether oral ingestion of sucralose could stimulate L-cell-derived GLP-1 and peptide YY ( PYY ) release in vivo . Methods : Fasted subjects were studied on 4 study days in random order . Subjects consumed 50 ml of either water , sucralose ( 0.083 % w/v ) , a non-sweet glucose-polymer matched for sweetness with sucralose addition ( 50 % w/v maltodextrin+0.083 % sucralose ) or a modified sham-feeding protocol ( MSF = oral stimulation ) of sucralose ( 0.083 % w/v ) .
Appetite ratings and plasma GLP-1 , PYY , insulin and glucose were measured at regular time points for 120 min . At 120 min , energy intake at a buffet meal was measured . Results : Sucralose ingestion did not increase plasma GLP-1 or PYY . MSF of sucralose did not elicit a cephalic phase response for insulin or GLP-1 . Maltodextrin ingestion significantly increased insulin and glucose compared with water ( P<0.001 ) . Appetite ratings and energy intake were similar for all groups . Conclusions : At this dose , oral ingestion of sucralose does not increase plasma GLP-1 or PYY concentrations and hence , does not reduce appetite in healthy subjects . Oral stimulation with sucralose had no effect on GLP-1 , insulin or appetite . BACKGROUND The rising prevalence of obesity in children has been linked in part to the consumption of sugar-sweetened drinks . Our aim was to examine this relation . METHODS We enrolled 548 ethnically diverse schoolchildren ( age 11.7 years , SD 0.8 ) from public schools in four Massachusetts communities , and studied them prospectively for 19 months from October , 1995 , to May , 1997 . We examined the association between baseline and change in consumption of sugar-sweetened drinks ( the independent variables ) , and difference in measures of obesity , with linear and logistic regression analyses adjusted for potentially confounding variables and clustering of results within schools . FINDINGS For each additional serving of sugar-sweetened drink consumed , both body mass index ( BMI ) ( mean 0.24 kg/m2 ; 95 % CI 0.10 - 0.39 ; p=0.03 ) and frequency of obesity ( odds ratio 1.60 ; 95 % CI 1.14 - 2.24 ; p=0.02 ) increased after adjustment for anthropometric , demographic , dietary , and lifestyle variables . Baseline consumption of sugar-sweetened drinks was also independently associated with change in BMI ( mean 0.18 kg/m2 for each daily serving ; 95 % CI 0.09 - 0.27 ; p=0.02 ) .
INTERPRETATION Consumption of sugar-sweetened drinks is associated with obesity in children . The rise in pediatric obesity since the 1970s has been well established in the United States and is becoming a major concern worldwide . As a potential means to help slow the obesity epidemic , low-calorie sweeteners ( LCS ) have gained attention as dietary tools to assist in adherence to weight loss plans or prevention of excess weight gain . Observational studies tend to show positive correlations between LCS consumption and weight gain in children and adolescents . Although the data are intriguing , these epidemiologic studies do not establish that LCS cause weight gain , because there are likely many lifestyle and genetic differences between children and families who choose to consume LCS and those who do not . Short-term randomized controlled trials have shown LCS use to be BMI neutral or to have modest weight-reducing effects in overweight and obese adolescents . The long-term effects of LCS in children and adolescents are unknown . Some compelling research is currently underway and may provide needed insight into the potential role of LCS in weight management . The paucity of data regarding the effects of LCS use in children and adolescents creates challenges in decision-making for health care providers and parents . In two experiments , 2 - 5-year-old children 's responsiveness to caloric density cues was examined . In a preloading protocol , consumption of fixed volumes of drinks ( 205 ml in Experiment 1 ; 150 ml in Experiment 2 ) , sweetened with sucrose , aspartame , aspartame plus low glucose maltodextrin , or a water control , was followed by ad lib consumption from among a variety of foods . Caloric drinks had about 90 kcal in Experiment 1 , 65 kcal in Experiment 2 . The delay interval between the preload and the ad lib consumption was 0 , 30 or 60 minutes .
In Experiment 1 , 24 4- and 5-year-old children participated in only one delay interval , while in Experiment 2 , all 20 2- and 3-year-old children were seen in all conditions . Results revealed evidence of caloric compensation , but no evidence of a preload × time delay interaction . In both experiments , aspartame also produced a significant suppression of intake relative to water , primarily due to the pattern at 30 min following the preload . Across conditions , the suppression following aspartame was usually significantly less than that produced by the caloric sweet drinks , providing evidence for postingestive effects . In Experiment 1 , suppression of intake was related to the children 's preferences for the foods , not to macronutrient content ; consumption of nonpreferred foods was most suppressed . Consumption of sweetened drinks as long as 1 hour prior to eating suppressed food intake , and this common feeding practice may also reduce dietary variety . OBJECTIVE Reduced intake of sweetened caloric beverages ( SCBs ) is recommended to lower total energy intake . Replacing SCBs with non-caloric diet beverages does not automatically lower energy intake , however . Compensatory increases in other food or beverages reportedly negate benefits of diet beverages . The purpose of this study was to evaluate drinking water as an alternative to SCBs . RESEARCH METHODS AND PROCEDURES Secondary analysis of data from the Stanford A TO Z intervention evaluated change in beverage pattern and total energy intake in 118 overweight women ( 25 to 50 years ) who regularly consumed SCBs ( > 12 ounces/d ) at baseline . At baseline and 2 , 6 , and 12 months , mean daily beverage intake ( SCBs , drinking water , non-caloric diet beverages , and nutritious caloric beverages ) , food composition ( macronutrient , water , and fiber content ) , and total energy intake were estimated using three 24-hour diet recalls .
Beverage intake was expressed in relative terms ( percentage of beverages ) . RESULTS In fixed effects models that controlled for total beverage intake , non-caloric and nutritious caloric beverage intake ( percentage of beverages ) , food composition , and energy expenditure [ metabolic equivalent ( MET ) ] , replacing SCBs with drinking water was associated with significant decreases in total energy intake that were sustained over time . The caloric deficit attributable to replacing SCBs with water was not negated by compensatory increases in other food or beverages . Replacing all SCBs with drinking water was associated with a predicted mean decrease in total energy of 200 kcal/d over 12 months . DISCUSSION The results suggest that replacing SCBs with drinking water can help lower total energy intake in overweight consumers of SCBs motivated to diet . In a study of the impact of aspartame , fat , and carbohydrate on appetite , we monitored blood glucose continuously for 431 ( SE 16 ) min . Ten healthy males ( 19 - 31 years ) participated in three time-blinded visits . As blood glucose was monitored , appetite ratings were scored at randomized times . On the first meal initiation , volunteers consumed one of three isovolumetric drinks ( aspartame , 1 MJ simple carbohydrate , and 1 MJ high-fat ; randomized order ) . High-fat and high-carbohydrate foods were available ad libitum subsequently . Blood glucose patterns following the carbohydrate drink ( + 1.78 ( SE 0.28 ) mmol/l in 38 ( SE 3 ) min ) and high-fat drink ( + 0.83 ( SE 0.28 ) mmol/l in 49 ( SE 6 ) min ) were predictive of the next intermeal interval ( R 0.64 and R 0.97 respectively ) . Aspartame ingestion was followed by blood glucose declines ( 40 % of subjects ) , increases ( 20 % ) , or stability ( 40 % ) . These patterns were related to the volunteers ' perception of sweetness of the drink ( R 0.81 , P = 0.014 ) , and were predictive of subsequent intakes ( R -0.71 , P = 0.048 ) .
For all drinks combined , declines in blood glucose and meal initiation were significantly associated ( χ2 = 16.8 , P < 0.001 ) , the duration of blood glucose responses and intermeal intervals correlated significantly ( R 0.715 , P = 0.0001 ) , and sweetness perception correlated negatively with hunger suppression ( R -0.471 , P = 0.015 ) . Effects of fat , carbohydrate , and aspartame on meal initiation , meal size , and intermeal interval relate to blood glucose patterns . Varied blood glucose responses after aspartame support the controversy over its effects , and may relate to sweetness perception . Despite some reports that aspartame (APM)-sweetened beverages may increase subjective appetite , previously we demonstrated that drinking 280 ml of an APM-sweetened soft drink ( 170 mg APM ) had no effect on appetite , and 560 ml of the same soft drink ( 340 mg APM ) reduced appetite . The present study examined this appetite reduction to determine its cause . Eighteen normal weight young adult males received five treatments ( beverage preloads ) at 1100 h in a randomized order , one per week : 280 ml of carbonated mineral water ( CMW ) ( control ) , 560 ml of CMW , 280 ml of CMW with 340 mg of encapsulated APM , 280 ml of CMW sweetened with 340 mg APM , 560 ml of an APM-sweetened soft drink ( 340 mg APM ) . Subjective hunger and food appeal were measured from 0930 h to 1230 h , and food intake from a buffet lunch offered at 1205 h was measured . Treatment had no effect on food intake or macronutrient selection . Both 560 ml of CMW or soft drink suppressed appetite , although 280 ml of APM-sweetened mineral water significantly increased subjective appetite relative to the control . Encapsulated APM had no effect on appetite . Therefore , appetite reduction following consumption of an APM-sweetened drink is likely due to drink volume and not the APM content .
In addition, consuming APM-sweetened CMW produces a short-term increase in subjective appetite. This study compared the effects of equal volumes of sugar-rich and sugar-free beverages on feelings of hunger and fullness and the ad libitum consumption of a palatable, fat-rich snack. Eleven healthy males consumed equal volumes (375 mL) of three drinks (sugar-rich cola, sugar-free cola, mineral water) in random order on separate mornings. After 20 min, the subjects were able to snack freely on potato crisps during the next 90 min. Each subject's individual bowl of potato crisps was covertly replenished at 15 min intervals while the subjects were completing appetite and mood ratings. After the 110 min experimental period, the subjects' ad libitum food intake from a buffet-style lunch was covertly recorded. On leaving the laboratory, the subjects filled in a weighed food diary for the rest of the day. The equal-volume preloads initially decreased hunger to a similar degree, and potato crisp intake during the first 15 min interval was not significantly different among the three preloads. On average, total energy intakes from the crisps and lunch were not significantly different among the preloads, and by the end of the day, total energy intakes were similar for the three test conditions. Therefore, the low-calorie/low-sugar drinks did not facilitate a reduced energy intake by the lean, non-dieting male subjects. This study examined the effects of four breakfast preloads of different sweetness and energy content on motivational ratings, taste preferences, and energy intakes of 12 obese and 12 lean women. The preloads consisted of creamy white cheese (fromage blanc) and were either plain, sweetened with sucrose or aspartame, or sweetened with aspartame and supplemented with maltodextrin. Their energy content was either 300 kcal (1,255 kJ) or 700 kcal (2,929 kJ).
Motivational ratings of hunger and the desire to eat were obtained prior to and at 30 min intervals after breakfast. Taste preferences were measured prior to and 150 min after breakfast. The subjects ate buffet-style lunch, snack, and dinner meals in the laboratory. Obese women consumed significantly more energy at meals (2,596 kcal or 10,862 kJ) than did lean women (1,484 kcal or 6,209 kJ), derived a greater proportion of energy from fat (39.9% vs. 35.5%), and had lower dietary carbohydrate-to-fat ratios. Consumption of low-energy as opposed to high-energy breakfast preloads was associated with elevated motivational ratings by noon. However, energy intakes at lunch, snack, or dinner did not vary as a function of preload type, and no compensation was observed for the energy consumed at breakfast. Taste preferences were not affected by preload ingestion or by preload type. The study provided no evidence that aspartame promotes hunger or results in increased energy intakes in obese or in lean women. This study investigated whether the energy density of foods affected energy intake when subjects were informed about the energy density of their meals. Forty normal-weight women ate breakfast, lunch, and dinner in the laboratory on three separate days. The entrée at each meal was varied in energy density to be either 1.25, 1.50, or 1.75 kcal/g (5.23, 6.28, or 7.32 kJ/g), but was held similar in macronutrient composition and palatability. On each day, the entrées at all three meals had the same energy density. All entrées were consumed ad libitum. Subjects were assigned to one of two groups. Subjects in the information group received a nutrition label with each meal, which showed the energy density of the entrée. Subjects in the no-information group did not receive any nutrition information. The results revealed that subjects in both groups had the same pattern of food intake across the three levels of energy density.
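The paired energy-density values quoted above (kcal/g alongside kJ/g) follow from the standard thermochemical conversion 1 kcal = 4.184 kJ. A minimal sketch, not taken from any of the trials, that reproduces the quoted pairs:

```python
# Convert energy density from kcal/g to kJ/g using 1 kcal = 4.184 kJ.
KCAL_TO_KJ = 4.184

def kcal_per_g_to_kj_per_g(kcal_per_g: float) -> float:
    """Return the energy density in kJ/g, rounded to two decimals."""
    return round(kcal_per_g * KCAL_TO_KJ, 2)

# The three entrée densities used in the study:
densities_kcal = [1.25, 1.50, 1.75]
densities_kj = [kcal_per_g_to_kj_per_g(d) for d in densities_kcal]
# 1.25 kcal/g -> 5.23 kJ/g, 1.50 -> 6.28 kJ/g, 1.75 -> 7.32 kJ/g
```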
Energy density significantly affected energy intake; subjects in both groups combined consumed 22% less energy in the condition of low energy density than in the condition of high energy density (p < 0.0001). These findings show that energy density can have a significant influence on energy intake, even when individuals are informed about the energy density of their meals. BACKGROUND Consumption of sugar-sweetened beverages may cause excessive weight gain. We aimed to assess the effect on weight gain of an intervention that included the provision of noncaloric beverages at home for overweight and obese adolescents. METHODS We randomly assigned 224 overweight and obese adolescents who regularly consumed sugar-sweetened beverages to experimental and control groups. The experimental group received a 1-year intervention designed to decrease consumption of sugar-sweetened beverages, with follow-up for an additional year without intervention. We hypothesized that the experimental group would gain weight at a slower rate than the control group. RESULTS Retention rates were 97% at 1 year and 93% at 2 years. Reported consumption of sugar-sweetened beverages was similar at baseline in the experimental and control groups (1.7 servings per day), declined to nearly 0 in the experimental group at 1 year, and remained lower in the experimental group than in the control group at 2 years. The primary outcome, the change in mean body-mass index (BMI, the weight in kilograms divided by the square of the height in meters) at 2 years, did not differ significantly between the two groups (change in experimental group minus change in control group, -0.3; P=0.46). At 1 year, however, there were significant between-group differences for changes in BMI (-0.57, P=0.045) and weight (-1.9 kg, P=0.04). We found evidence of effect modification according to ethnic group at 1 year (P=0.04) and 2 years (P=0.01).
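The outcome arithmetic in trials like this one rests on two definitions stated in the abstract: BMI as weight in kilograms divided by the square of the height in meters, and the net intervention effect as the change in the experimental group minus the change in the control group. An illustrative sketch with made-up numbers (the deltas below are hypothetical, not the trial's raw data):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body-mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

def net_bmi_effect(delta_experimental: float, delta_control: float) -> float:
    """Between-group difference: change in experimental minus change in control."""
    return delta_experimental - delta_control

# Hypothetical adolescent: 64 kg at 1.60 m
example_bmi = round(bmi(64.0, 1.60), 1)  # about 25.0 kg/m2

# Hypothetical 2-year BMI changes of +0.5 (experimental) and +0.8 (control)
# yield a net effect of about -0.3 kg/m2.
example_net = round(net_bmi_effect(0.5, 0.8), 2)
```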
In a prespecified analysis according to ethnic group, among Hispanic participants (27 in the experimental group and 19 in the control group), there was a significant between-group difference in the change in BMI at 1 year (-1.79, P=0.007) and 2 years (-2.35, P=0.01), but not among non-Hispanic participants (P>0.35 at years 1 and 2). The change in body fat as a percentage of total weight did not differ significantly between groups at 2 years (-0.5%, P=0.40). There were no adverse events related to study participation. CONCLUSIONS Among overweight and obese adolescents, the increase in BMI was smaller in the experimental group than in the control group after a 1-year intervention designed to reduce consumption of sugar-sweetened beverages, but not at the 2-year follow-up (the prespecified primary outcome). (Funded by the National Institute of Diabetes and Digestive and Kidney Diseases and others; ClinicalTrials.gov number, NCT00381160.) The effect of short-duration exercise (EXR) on food intake (FI) and energy balance (EB) is not well understood in either normal weight (NW) or overweight (OW) and obese (OB) 9-14-year-old children. Our purpose was to describe the effects of activity and a glucose drink on short-term FI, appetite, and EB in NW, OW, and OB boys. Each boy received in random order either a noncaloric Sucralose-sweetened control or glucose (1.0 g·kg(-1) body weight) drink 5 min after either exercise (EXR) or sedentary (SED) activity. Boys exercised for 15 min at their ventilation threshold (V(T)) in experiment 1 or at 25% above their V(T) in experiment 2. FI was measured at an ad libitum pizza meal 30 min after drink consumption. FI was lower after the glucose drink (p < 0.001) but not affected by activity, even though EXR increased appetite (p < 0.001). OW/OB boys ate more total food than NW boys (p = 0.020).
EB over the duration of the experiments was reduced by EXR in OW/OB boys (p = 0.013) but not in NW boys in either experiment (p > 0.05). We conclude that intake regulation in OW/OB boys in response to a glucose drink is similar to NW boys, but it may be less responsive to activity. OBJECTIVE. The role of sugar-sweetened beverages (SSBs) in promoting obesity is controversial. Observational data link SSB consumption with excessive weight gain; however, randomized, controlled trials are lacking and necessary to resolve the debate. We conducted a pilot study to examine the effect of decreasing SSB consumption on body weight. METHODS. We randomly assigned 103 adolescents aged 13 to 18 years who regularly consumed SSBs to intervention and control groups. The intervention, 25 weeks in duration, relied largely on home deliveries of noncaloric beverages to displace SSBs and thereby decrease consumption. Change in SSB consumption was the main process measure, and change in body mass index (BMI) was the primary end point. RESULTS. All of the randomly assigned subjects completed the study. Consumption of SSBs decreased by 82% in the intervention group and did not change in the control group. Change in BMI, adjusted for gender and age, was 0.07 ± 0.14 kg/m2 (mean ± SE) for the intervention group and 0.21 ± 0.15 kg/m2 for the control group. The net difference, −0.14 ± 0.21 kg/m2, was not significant overall. However, baseline BMI was a significant effect modifier. Among the subjects in the upper baseline-BMI tertile, BMI change differed significantly between the intervention (−0.63 ± 0.23 kg/m2) and control (+0.12 ± 0.26 kg/m2) groups, a net effect of −0.75 ± 0.34 kg/m2. The interaction between weight change and baseline BMI was not attributable to baseline consumption of SSBs. CONCLUSIONS. A simple environmental intervention almost completely eliminated SSB consumption in a diverse group of adolescents.
The beneficial effect on body weight of reducing SSB consumption increased with increasing baseline body weight, offering additional support for American Academy of Pediatrics guidelines to limit SSB consumption. Background and Objective: The role of sugars in solutions on subjective appetite and food intake (FI) has received little investigation in children. Therefore, we examined the effect of isocaloric solutions (200 kcal/250 ml) of sugars including sucrose, high-fructose corn syrup-55 (HFCS) or glucose, compared with a non-caloric sucralose control, on subjective appetite and FI in 9- to 14-year-old normal weight (NW) boys. Participants and Methods: NW boys (n=15) received each of the test solutions, in random order, 60 min before an ad libitum pizza meal. Subjective appetite was measured at baseline (0 min), and at 15, 30, 45 and 60 min. Results: Only glucose (P=0.003), but neither sucrose nor HFCS, reduced FI compared with the sucralose control. This led to a higher cumulative energy intake, compared with sucralose, after sucrose (P=0.009) and HFCS (P=0.01), but not after glucose. In all treatment sessions, subjective average appetite increased from baseline to 60 min, but change from baseline average appetite was the highest after sucrose (P<0.005). Furthermore, sucrose (r=−0.59, P=0.02) and HFCS (r=−0.56, P=0.03), but not glucose, were inversely associated with test meal FI when the treatment dose (200 kcal) was expressed on a body weight (kg) basis. Conclusions: Change from baseline subjective average appetite was the highest after sucrose, but only the glucose solution suppressed FI at the test meal 60 min later in NW boys. A study was designed to determine the effect of the consumption of the nutritive sweetener aspartame on non-insulin-dependent diabetics.
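Several of these pediatric studies re-express a fixed preload (e.g. the 200 kcal solution above, or the 1.0 g/kg glucose drink earlier) relative to each child's body weight. A hypothetical sketch of that normalization, with an assumed 40 kg body weight purely for illustration:

```python
def dose_per_kg(total_dose: float, body_weight_kg: float) -> float:
    """Express a fixed treatment dose relative to body weight (per kg)."""
    return total_dose / body_weight_kg

# A fixed 200 kcal preload given to a hypothetical 40 kg boy:
# 200 / 40 = 5.0 kcal per kg of body weight.
relative_dose = dose_per_kg(200.0, 40.0)
```

This per-kilogram framing is what lets the authors correlate a fixed-energy drink with intake across children of different sizes.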
Forty-three adult diabetics between the ages of 21 and 70 completed a 90-day study; all were diabetics whose conditions were managed by diet and/or hypoglycemic agents. Participants in the blind study were instructed to continue their usual diet and to take two capsules of an assigned preparation three times daily with meals, either the aspartame or the placebo. The 1.8 g of aspartame administered is approximately three times the expected daily consumption of aspartame if used as a sweetener to replace sugar. Throughout the study subjects were examined for (1) symptoms of intolerance, (2) fasting plasma phenylalanine levels exceeding 4 mg/100 ml, and (3) deterioration of diabetic control. At the conclusion of the study subjects exhibited no symptoms that could be traced to the administration of aspartame or the placebo, and diabetic control was unaffected by the chronic administration of these substances. Aspartame seems to be well tolerated by non-insulin-dependent diabetics. Objective: To investigate the influence of ingestion of beverages with sucrose or with intense sweeteners on food intake (FI) and on hunger ratings before and after a month of daily consumption of beverages. Design: Experimental study. Setting: Department of Physiology, University Hospital, Dijon, France. Subjects: In all, 12 men and 12 women, aged 20-25 y. Interventions: Four beverages contained either sucrose (E+: 100 g/l, 1672 kJ) or intense sweeteners (E−: null energy content) and were flavoured with either orange (O) or raspberry (R). FI was measured in the lab during two 2-consecutive-day periods, carried out on 2 successive weeks (session 1). The subjects drank 2 l of either E+ or E− beverages on the first day of both weekly periods, according to a balanced randomised design. E+ was paired with O for 50% of subjects and with R for the other 50%.
Subjects were then habituated over a 4-week period to both beverages, consuming 1 l of E+ beverage on odd days and 1 l of E− drink on even days. After this period, the measurements of session 1 were repeated (session 2, weeks 7-8). Finally, FI was measured for two more 2-day periods (weeks 9-10) after the association between flavour and energy content was reversed (session 3). Results: The E− drinks were less palatable than the E+ drinks. Besides, we observed that FI was not reduced in response to a liquid extra caloric load and there was no change in hunger ratings after the beverages in any of the sessions. Conclusion: Ingestion of caloric beverages induced a positive energy balance, and the continuous exposure phase to these beverages over 1 month did not improve FI adaptation in response to the extra energy provided by the beverages. Sponsorship: This study was sponsored by SEV, Bourg la Reine, France; the French Ministère de la Recherche et de la Technologie (Programme AGROBIO-Aliments Demain) and the Regional Council of Burgundy (Dijon, France). The purpose of this investigation was twofold: (1) to examine the role of low- to moderate-intensity, short-duration physical activity on subjective appetite and (2) to identify the role of and associations between ventilation threshold (VeT) and energy intake at a pizza lunch 30 min after glucose and whey protein drinks in normal weight boys. In 14 boys (age: 12.5 ± 0.4 years) subjective appetite was measured before and after a 12 min walking protocol designed to determine physical fitness based on the VeT. On a separate occasion food intake (FI) and subjective appetite were measured in response to sweetened preloads of either a SPLENDA Sucralose control, glucose or whey protein made up to 250 ml with water, given in random order to each boy, 2 h after a standardized breakfast.
Subjective average appetite and prospective food consumption scores increased after physical activity. VeT was positively associated with FI at a pizza lunch consumed 30 min after glucose and whey protein drinks. Glucose and whey protein reduced FI similarly at lunch compared with control. In conclusion, appetite is increased by low- to moderate-intensity, short-duration physical activity, and FI following glucose and protein preloads is positively associated with fitness levels in boys. Background/Aims: The increased incidence of obesity coincides with an increased consumption of sugar-sweetened beverages (SSBs). This study investigated the effect of SSB intake on energy intake in an ad libitum 6-month low-fat high-carbohydrate diet in a reanalysis of the CARMEN data. Methods: Forty-seven overweight-to-obese men and women participated in the Maastricht centre of the randomized controlled CARMEN study. They were allocated to a control (habitual) diet group (CD), a low-fat (−10 energy percent, En%) high simple carbohydrate group (SCHO) or a low-fat high complex carbohydrate group (CCHO) (SCHO vs. CCHO: 1.5 vs. 0.5) using a controlled laboratory shop system. Reanalyses were made for the energy, amount and density of all drinks and in particular of sweetened beverages (SBs). The SCHO and CD groups could select nondiet SBs, including soft drinks and fruit juices, while the CCHO group received SB alternatives. Results: Energy intake decreased in the CCHO and SCHO groups versus the CD group (−2.7 ± 0.4 MJ/day CCHO group vs. −0.2 ± 0.5 MJ/day CD group, p < 0.01; −1.4 ± 0.4 MJ/day SCHO group, not significant). Simple carbohydrate intake increased significantly in the SCHO group versus the CCHO and CD groups (+10.8 ± 1.6 vs. −2.0 ± 0.9 and −0.5 ± 1.1 En%; p < 0.001).
In the SCHO and CD groups, energy intake from SBs increased significantly (+187 ± 114 and +101 ± 83 kJ/day, respectively; −432 ± 72 kJ/day in the CCHO group; p < 0.001). Conclusion: Simple carbohydrate intake increased through enhanced intake of nondiet SBs in the SCHO group. Fat reduction combined with only diet SBs in an ad libitum situation has a greater impact on energy intake than fat reduction combined with nondiet SBs. BACKGROUND Obesity in adolescence has been increasing in the past several decades. Beverage habits among adolescents include increased consumption of sugar-sweetened beverages and decreased consumption of milk. OBJECTIVE This study aimed to examine the association between beverage consumption and 5-y body weight change in 2294 adolescents. DESIGN Project EAT (Eating Among Teens) is a 5-y longitudinal study of eating patterns among adolescents. Surveys were completed in 1998-1999 (time 1) and in 2003-2004 (time 2). Multivariable linear regression was used to examine the association between beverage consumption at time 2 and change in body mass index from time 1 to time 2, with adjustments for age, socioeconomic status, race, cohort, physical activity, sedentary behavior, coffee, tea, time 1 body mass index, and beverage variables. RESULTS In prospective analyses, consumption of beverages was not associated with weight gain, except for consumption of low-calorie soft drinks (positive association, P = 0.002) and white milk (inverse association, P = 0.03), but these associations did not appear to be a monotonic linear dose-response relation. The positive association with low-calorie soft drinks was no longer present after adjustment for dieting and parental weight-related concerns, which suggests that the use of low-calorie soft drinks is a marker for more general dietary behaviors and weight concerns.
CONCLUSIONS We showed no association between sugar-sweetened beverage consumption, juice consumption, and adolescent weight gain over a 5-y period. A direct association between diet beverages and weight gain appeared to be explained by dieting practices. Adolescents who consumed little or no white milk gained significantly more weight than their peers who consumed white milk. Future research that examines beverage habits and weight among adolescents should address portion sizes, adolescent maturation, and dieting behaviors. Exercise is known to cause physiological changes that could affect the impact of nutrients on appetite control. This study was designed to assess the effect of drinks containing either sucrose or high-intensity sweeteners on food intake following exercise. Using a repeated-measures design, three drink conditions were employed: plain water (W), a low-energy drink sweetened with the artificial sweeteners aspartame and acesulfame-K (L), and a high-energy, sucrose-sweetened drink (H). Following a period of challenging exercise (70% VO2 max for 50 min), subjects consumed freely from a particular drink before being offered a test meal at which energy and nutrient intakes were measured. The degree of pleasantness (palatability) of the drinks was also measured before and after exercise. At the test meal, energy intake following the artificially sweetened (L) drink was significantly greater than after the water (W) and sucrose (H) drinks (p < 0.05). Compared with the artificially sweetened (L) drink, the high-energy (H) drink suppressed intake by approximately the energy contained in the drink itself. However, there was no difference between the water (W) and the sucrose (H) drink on test meal energy intake. When the net effects were compared (i.e., drink + test meal energy intake), total energy intake was significantly lower after the water (W) drink compared with the two sweet (L and H) drinks.
The exercise period brought about changes in the perceived pleasantness of the water, but had no effect on either of the sweet drinks. The remarkably precise energy compensation demonstrated after the higher-energy sucrose drink suggests that exercise may prime the system to respond sensitively to nutritional manipulations. The results may also have implications for the effect on short-term appetite control of different types of drinks used to quench thirst during and after exercise. The effects of a sucrose drink (160 kcal/40 grams of cane sugar) on mood state (Profile of Mood States) were examined over time in a between-subjects, blind placebo design. Orosensory factors were virtually eliminated due to the prior use of a benzocaine anaesthetic lozenge. The ingestion of sucrose failed to have any substantial effect on mood immediately after intake or 30 and 60 min thereafter, although two female subjects reported an increase in energy at 30 min. There was no evidence that the carbohydrate preload increased hunger or eating; rather, a delay in food intake was noted subsequent to the ingestion of sugar which was not found following ingestion of saccharin or water. OBJECTIVE To compare the efficacy of non-nutritive sweetened beverages (NNS) or water for weight loss during a 12-week behavioral weight loss treatment program. METHODS An equivalence trial design with water or NNS beverages as the main factor was employed in a prospective randomized trial among 303 men and women. All participants took part in a behavioral weight loss treatment program. The results of the weight loss phase (12 weeks) of an ongoing trial (1 year) that is also evaluating the effects of these two treatments on weight loss maintenance are reported. RESULTS The two treatments were not equivalent, with the NNS beverage treatment group losing significantly more weight compared to the water group (5.95 kg versus 4.09 kg; P < 0.0001) after 12 weeks.
Participants in the NNS beverage group reported significantly greater reductions in subjective feelings of hunger than those in the water group during the 12 weeks. CONCLUSION These results show that water is not superior to NNS beverages for weight loss during a comprehensive behavioral weight loss program. It was reported previously that the dipeptide sweetener aspartame suppresses food intake in humans by a postingestive action. The present study examined the hypothesis that this is due to an effect of phenylalanine, one of the primary breakdown products of aspartame (phenylalanine is a potent releaser of the so-called satiety hormone cholecystokinin, CCK). Capsulated aspartame (400 mg) administered to human volunteers reduced food intake by 15% (253 kcal) in a lunchtime test meal begun 1 hour later. However, neither phenylalanine (200 mg) nor the other constituent amino acid of aspartame, aspartic acid (200 mg), altered intake compared with placebo. Despite the large effect on food intake, there were no treatment differences in pre- or postmeal ratings of motivation to eat. This suggests that aspartame may act to intensify the satiating effects of ingested food. Although high doses of phenylalanine reduce food intake, an individual action of phenylalanine cannot account for the potent anorexic effect of aspartame. In discussing alternative mechanisms, it is noted that the amino acid sequence of aspartame (Asp-Phe) is the same as the C-terminal dipeptide of CCK. A direct action of aspartame at CCK receptors appears to be unlikely; however, aspartame might act as a CCK releaser. Further studies are required to elucidate the mechanism of aspartame's anorexic action and perhaps to evaluate its therapeutic potential as an antiobesity agent. Objective The present study evaluated weight loss and compliance outcomes for overweight adolescents assigned to one of two dietary interventions differing in the type of snacks allowed.
Methods The study was a 12-week, controlled clinical trial among otherwise healthy but overweight (body mass index ≥95th percentile) 11-year-old to 15-year-old girls who were randomly assigned to either a 1,500 kcal/day free-snack program or a 1,500 kcal/day restricted-snack program. All subjects were counseled to consume three servings of dairy products per day, and were provided with a 500 mg calcium supplement as well. Subjects in the free-snack group could choose any 150-calorie item as one of their two daily snacks, including regular soda if desired; however, subjects in the restricted-snack group were limited to diet soda. Results Thirty-two adolescent girls completed the 12-week intervention. Both diets were equally effective in achieving a modest amount of weight loss, and were equally acceptable to the subjects. Significant decreases in weight, body mass index, anthropometric measures, total cholesterol and triglycerides were observed. Conclusions A 1,500 kcal/day diet allowing for a free snack of 150 calories was as effective as a more restricted snack policy in achieving a modest amount of weight loss among overweight 11-year-old to 15-year-old girls. In addition, results suggest that some soda may be included in a teen weight control diet, as long as caloric intake is maintained at recommended levels and care is taken to achieve adequate intake of essential nutrients. Calcium intake among subjects was low at baseline, and, although it increased during the study (due to supplementation), further efforts to increase consumption of naturally calcium-rich and calcium-fortified foods and beverages are needed. The present study was designed to investigate the in vivo effects of monosodium glutamate (MSG) and aspartame (ASM) individually and in combination on cognitive behavior and biochemical parameters such as neurotransmitters and oxidative stress indices in the brain tissue of mice.
Forty male Swiss albino mice were randomly divided into four groups of ten each and were exposed to MSG and ASM through drinking water for one month. Group I was the control and was given normal tap water. Groups II and III received MSG (8 mg/kg) and ASM (32 mg/kg) respectively, dissolved in tap water. Group IV received MSG and ASM together in the same doses. After the exposure period, the animals were subjected to cognitive behavioral tests in a shuttle box and a water maze. Thereafter, the animals were sacrificed and the neurotransmitters and oxidative stress indices were estimated in their forebrain tissue. Both MSG and ASM individually as well as in combination had significant disruptive effects on the cognitive responses, memory retention and learning capabilities of the mice, in the order (MSG+ASM)>ASM>MSG. Furthermore, while MSG and ASM individually were unable to alter the brain neurotransmitters and the oxidative stress indices, their combination dose (MSG+ASM) significantly decreased the levels of neurotransmitters (dopamine and serotonin) and also caused oxidative stress by increasing lipid peroxides measured in the form of thiobarbituric acid-reactive substances (TBARS) and decreasing the level of total glutathione (GSH). Further studies are required to evaluate the synergistic effects of MSG and ASM on the neurotransmitters and oxidative stress indices and their involvement in cognitive dysfunctions. Background: It is hypothesized that a solid form of food or food components suppresses subjective appetite and short-term food intake (FI) more than a liquid form. Objective: To compare the effect of eating solid vs drinking liquid forms of gelatin, sucrose and its component mixtures, and whey protein, on subjective appetite and FI in young men. Design and subjects: A randomized crossover design was used in three experiments in which the subjects were healthy males of normal weight.
Solid and liquid forms of gelatin (6 g) (experiment 1, n=14), sucrose (75 g) and a mixture of 50% glucose/50% fructose (G50:F50) (experiment 2, n=15), and acid and sweet whey protein (50 g) (experiment 3, n=14) were compared. The controls were water (experiments 1 and 3) and calorie-free sweetened water with gelatin (sweet gelatin, experiment 1) or calorie-free sweetened water (sweet control, experiment 2). Subjective average appetite was measured by visual analog scales over 1 h, and ad libitum FI was measured 1 h after treatment consumption. Results: Average appetite area under the curve was not different between solid and liquid forms of sugars, but was larger, indicating greater satiety, for solid compared with liquid forms of gelatin and sweet, but not acid, whey protein. FI did not differ from control after either solid or liquid sugar or gelatin treatments. However, both solid and liquid forms of whey protein, with no difference between them, suppressed FI compared with control (P<0.05). Conclusion: Macronutrient composition is more important than the physical state of foods in determining subjective appetite and FI. OBJECTIVE The increase in consumption of sugar-added beverages over recent decades may be partly responsible for the obesity epidemic among U.S. adolescents. Our aim was to evaluate the relationship between BMI changes and intakes of sugar-added beverages, milk, fruit juices, and diet soda. RESEARCH METHODS AND PROCEDURES Our prospective cohort study included >10,000 boys and girls participating in the U.S. Growing Up Today Study. The participants were 9 to 14 years old in 1996 and completed questionnaires in 1996, 1997, and 1998. We analyzed change in BMI (kilograms per meter squared) over two 1-year periods among children who completed annual food frequency questionnaires assessing typical past-year intakes.
We studied beverage intakes during the year corresponding to each BMI change, and in separate models, we studied 1-year changes in beverage intakes, adjusting for prior-year intakes. Models included all beverages simultaneously; further models adjusted for total energy intake. RESULTS Consumption of sugar-added beverages was associated with small BMI gains during the corresponding year (boys: +0.03 kg/m2 per daily serving, p = 0.04; girls: +0.02 kg/m2, p = 0.096). In models not assuming a linear dose-response trend, girls who drank 1 serving/d of sugar-added beverages gained more weight (+0.068, p = 0.02) than girls drinking none, as did girls drinking 2 servings/d (+0.09, p = 0.06) or 3+ servings/d (+0.08, p = 0.06). Analyses of year-to-year change in beverage intakes provided generally similar findings; boys who increased consumption of sugar-added beverages from the prior year experienced weight gain (+0.04 kg/m2 per additional daily serving, p = 0.01). Children who increased intakes by 2 or more servings/d from the prior year gained weight (boys: +0.14, p = 0.01; girls: +0.10, p = 0.046). Further adjusting our models for total energy intake substantially reduced the estimated effects, which were no longer significant. DISCUSSION Consumption of sugar-added beverages may contribute to weight gain among adolescents, probably due to their contribution to total energy intake, because adjustment for calories greatly attenuated the estimated associations. Given the potential use of a low-calorie sweetener during weight reduction, a toxicity study of chronic aspartame ingestion was conducted. Particular attention was given to possible long-term effects of aspartame on the fuel hormonal alterations characteristically caused by weight reduction. As a group, mean age was 19.3 yr, mean body weight was 164.6 lb, and mean height was 65.4 in. Subjects were an average of 33% in excess of ideal body weight.
The aspartame dose was 2.7 g/day and was compared on a double-blind randomized basis with a lactose placebo . Both materials were given in gelatin capsules . An average of 6.9 +/- 1.5 lb was lost by the aspartame group during the 13-wk study on a calculated 1,000-calorie diet . The placebo group lost 4.5 +/- 1.2 lb ( no significant difference between the two groups ) . After an overnight fast , reductions in glucose and immunoreactive insulin were seen in both groups , while rising trends in immunoreactive glucagon were observed . These changes are all characteristic of calorie restriction . In no instance was there a detectable effect of the ingested aspartame . No meaningful effect of weight reduction or aspartame was seen on plasma triglyceride and cholesterol , nor on any other parameter of hematologic , hepatic , or renal function that was measured . Similarly , side effects were equally distributed between aspartame and placebo This study explores whether the addition of aspartame-sweetened foods and beverages to a low fat , hypocaloric diet enhances compliance and resulting weight loss . Fifty-nine obese ( 130 - 225 % of ideal body weight ) , free living men and women were randomly assigned to either a Balanced Deficit Diet ( BDD ) or a BDD supplemented with aspartame . Over a 12-week weight loss period , volunteers attended weekly support group meetings including behavior modification training and exercise instruction . Males achieved a clinically significant weight loss ( greater than 23 lb ) in both study groups , while females lost an average of 12.8 lb in the control group vs. 16.5 lb in the experimental group . In both treatment groups , sleep , general energy level , level of physical activity , and feeling of well-being showed clinically meaningful improvement . This study suggests possible advantages to supplementing a BDD with aspartame-sweetened foods as part of a multidisciplinary weight loss program .
The small sample size prohibits definitive conclusions but does provide the protocol for a larger , outpatient clinical trial BACKGROUND Both dietary sucrose and the sweetener aspartame have been reported to produce hyperactivity and other behavioral problems in children . METHODS We conducted a double-blind controlled trial with two groups of children : 25 normal preschool children ( 3 to 5 years of age ) , and 23 school-age children ( 6 to 10 years ) described by their parents as sensitive to sugar . The children and their families followed a different diet for each of three consecutive three-week periods . One diet was high in sucrose with no artificial sweeteners , another was low in sucrose and contained aspartame as a sweetener , and the third was low in sucrose and contained saccharin ( placebo ) as a sweetener . All the diets were essentially free of additives , artificial food coloring , and preservatives . The children 's behavior and cognitive performance were evaluated weekly . RESULTS The preschool children ingested a mean ( +/- SD ) of 5600 +/- 2100 mg of sucrose per kilogram of body weight per day while on the sucrose diet , 38 +/- 13 mg of aspartame per kilogram per day while on the aspartame diet , and 12 +/- 4.5 mg of saccharin per kilogram per day while on the saccharin diet . The school-age children considered to be sensitive to sugar ingested 4500 +/- 1200 mg of sucrose per kilogram , 32 +/- 8.9 mg of aspartame per kilogram , and 9.9 +/- 3.9 mg of saccharin per kilogram , respectively . For the children described as sugar-sensitive , there were no significant differences among the three diets in any of 39 behavioral and cognitive variables . For the preschool children , only 4 of the 31 measures differed significantly among the three diets , and there was no consistent pattern in the differences that were observed .
CONCLUSIONS Even when intake exceeds typical dietary levels , neither dietary sucrose nor aspartame affects children 's behavior or cognitive function To examine the eating behavior of preschool children offered chocolate-flavored or plain milk at lunch , food consumption by 135 children , aged 18 - 66 months , was measured . Four different menus were served six times during a 12-week period , each menu being presented twice with each of three test beverages , plain milk ( 18.1 kcal/oz ) , sucrose-sweetened chocolate milk ( 29.4 kcal/oz ) , or aspartame-sweetened chocolate milk ( 18.6 kcal/oz ) . The type of milk beverage served had no significant effect on the consumption of other food items offered at that meal . Subjects did drink significantly more chocolate milk than plain milk during all meals and consequently consumed significantly more energy during those meals in which sucrose-sweetened chocolate milk was served . A macronutrient analysis of lunch-time food intake for each menu revealed significant differences in protein , fat , and carbohydrate content among the four menus . Older children consumed significantly more milk and more energy per lunch-time meal than did younger preschoolers , but no other consistent age-related differences were observed . No significant gender differences were detected in any of the statistical analyses conducted . These findings suggest that young children do not reduce the intake of other food items at a meal to compensate for the increased energy intake that results from excessive sucrose-sweetened milk consumption . Aspartame-sweetened milk increases milk intake in small children without providing them with the additional calories of sucrose-sweetened milk The effects of sucrose and oil preloads were explicitly compared in a single-blind controlled trial using a between-subjects design . 
Eighty adult subjects ( forty-three male , thirty-seven female ) aged 18 - 50 years received at 11.00 hours one of four yoghurt preloads . All were 80 g low-fat , unsweetened yoghurt ( 188 kJ ) , containing additionally ( 1 ) saccharin ( control , 23 kJ ) , or ( 2 ) 40 g sucrose ( 859 kJ ) , ( 3 ) 40 g maize oil ( 1569 kJ ) , ( 4 ) 20 g sucrose , 20 g maize oil ( 1213 kJ ) . Subjects were normal eaters and of normal weight ( male mean weight : 68.8 ( SD 3.2 ) kg , BMI 21.8 ( SD 1.6 ) kg/m2 ; female mean weight : 53.7 ( SD 5.1 ) kg , BMI 20.4 ( SD 1.2 ) kg/m2 ) . Food intake was measured with a food diary and mood with ten single Likert scales . ANOVA was conducted using preload type ( saccharin , sucrose , oil , sucrose + oil ) , sex ( male , female ) and early v. late breakfast times as factors . Mood was analysed using the same design , with time of rating ( immediate , 60 min , 120 min ) as an additional factor . Men ate more after the saccharin preload than after the other preloads , but did not vary the time of their next solid food . Women increased the intermeal interval only after the oil preload , which also had the highest energy content value , but did not vary the energy content of their next solid food . The saccharin preload decreased rated tiredness at 2 h compared with the sucrose preload , possibly due to its lower energy content . The preloads containing sucrose or sucrose + oil increased calmness between 1 and 2 h afterwards , compared with the saccharin preload . It is concluded that both sucrose and oil increase the intermeal interval in men , but that women are less sensitive to preloading . The mood effects suggest that tiredness after carbohydrate at 2 h may in part be a decrease in rated energy compared with the increased rated energy found after a preload with low energy content . Carbohydrate may genuinely increase calmness . 
These effects apply to non-restrained eaters of normal weight BACKGROUND Studies of cocoa suggest an array of cardiovascular benefits ; however , the effects of daily intake of sugar-free and sugar-sweetened cocoa beverages on endothelial function ( EF ) have yet to be established . METHODS 44 adults ( BMI 25 - 35 kg/m2 ) participated in a randomized , controlled , crossover trial . Participants were randomly assigned to a treatment sequence : sugar-free cocoa beverage , sugar-sweetened cocoa beverage , and sugar-sweetened cocoa-free placebo . Treatments were administered daily for 6 weeks , with a 4-week washout period . RESULTS Cocoa ingestion improved EF measured as flow-mediated dilation ( FMD ) compared to placebo ( sugar-free cocoa : change , 2.4 % [ 95 % CI , 1.5 to 3.2 ] vs. -0.8 % [ 95 % CI , -1.9 to 0.3 ] ; difference , 3.2 % [ 95 % CI , 1.8 to 4.6 ] ; p<0.001 and sugar-sweetened cocoa : change , 1.5 % [ 95 % CI , 0.6 to 2.4 ] vs. -0.8 % [ 95 % CI , -1.9 to 0.3 ] ; difference , 2.3 % [ 95 % CI , 0.9 to 3.7 ] ; p=0.002 ) . The magnitude of improvement in FMD after consumption of sugar-free versus sugar-sweetened cocoa was greater , but not significantly . Other biomarkers of cardiac risk did not change appreciably from baseline . BMI remained stable throughout the study . CONCLUSIONS Daily cocoa ingestion improves EF independently of other biomarkers of cardiac risk , and does not cause weight gain . Sugar-free preparations may further augment endothelial function The long-term effects of sucrose on appetite and mood remain unclear . Normal weight subjects compensate for sucrose added blind to the diet ( Reid et al. , 2007 ) . Overweight subjects , however , may differ . In a single-blind , between-subjects design , soft drinks ( 4x25cl per day ; 1800kJ sucrose sweetened versus 67kJ aspartame sweetened ) were added to the diet of overweight women ( n=53 , BMI 25 - 30 , age 20 - 55 ) for 4 weeks .
A 7-day food diary gave measures of total energy , carbohydrate , protein , fat , and micronutrients . Mood and hunger were measured by ten single Likert scales rated daily at 11.00 , 14.00 , 16.00 , and 20.00 . Activity levels were measured by diary and pedometer . Baseline energy intake did not differ between groups . During the first week of the intervention energy intake increased slightly in the sucrose group , but not in the aspartame group , then decreased again , so by the final week intake again did not differ from the aspartame group . Compensation was not large enough to produce significant changes in the composition of the voluntary diet . There were no effects on hunger or mood . It is concluded that overweight women do not respond adversely to sucrose added blind to the diet , but compensate for it by reducing voluntary energy intake . Alternative explanations for the correlation between sugary soft drink intake and weight gain are discussed
2,096
17,300,227
Diagnostic accuracy was the most often used outcome measure and was found in phase I , II and IV . Compared with other specialties in telemedicine ( i.e. telesurgery , telepaediatrics ) , teledermatology seems to be a mature application . However , more evaluation studies with a focus on clinical outcomes such as preventable referrals or time to recovery are needed to prove that teledermatology indeed is a promising and cost-saving technology
BACKGROUND There is a growing interest in teledermatology in today 's clinical practice , but the maturity of the evaluation research of this technology is still unclear . OBJECTIVES This systematic review describes the maturity of teledermatology evaluation research over time and explores what kind of teledermatology outcome measures have been evaluated .
The aim of this study was to determine if a teledermatology consult system , using store-and-forward digital imaging technology , results in patients achieving a shorter time from referral date to date of initial definitive intervention when compared to a traditional referral process . Patients being referred to the dermatology consult service from the primary care clinics at the Durham VA Medical Center were randomized to either a teledermatology consultation or usual care . A usual care consultation consisted of a text-based electronic consult request . A teledermatology consultation included digital images and a standardized history , in addition to the text-based electronic consult . Time to initial definitive intervention was defined as the time between referral date and the date the patient was scheduled for a clinic visit for those patients that the consultant requested a clinic-based evaluation , or the time between referral date and the date the consult was answered by the consultant if a clinic visit was not required . Patients in the teledermatology arm of the study reached a time to initial definitive intervention significantly sooner than did those patients randomized to usual care ( median 41 days versus 127 days , p = 0.0001 , log-rank test ) . Additionally , 18.5 % of patients in the teledermatology arm avoided the need for a dermatology clinic-based visit compared to zero patients avoiding a dermatology clinic visit in the usual care arm of the study ( p < 0.001 , z-test ) . Teledermatology consult systems can result in significantly shorter times to initial definitive intervention for patients compared to traditional consult modalities , and , in some cases , the need for a clinic-based visit can be avoided The diagnostic accuracy of realtime teledermatology was measured using two different video cameras .
One camera was a relatively low-cost , single-chip device ( camera 1 ) , while the other was a more expensive , three-chip camera ( camera 2 ) . The diagnosis obtained via the videolink was compared with the diagnosis made in person . Sixty-five new patients referred to a dermatology clinic were examined using camera 1 followed by a standard face-to-face consultation on the same day . A further 65 patients were examined using camera 2 and the same procedure implemented . Seventy-six per cent of conditions were correctly diagnosed by telemedicine using camera 2 compared with 62 % using camera 1 . A working differential diagnosis was obtained in 12 % of cases using camera 2 compared with 14 % using camera 1 . The percentage of ' no diagnosis ' , wrong and missed diagnoses was halved using camera 2 compared with camera 1 . These results suggest that the performance of the more expensive camera was superior for realtime teledermatology OBJECTIVES To determine the reliability of videoconferencing technology in evaluating skin tumors , the impact of the technology on the clinicians ' degree of suspicion that a skin tumor is malignant , and the recommendation to do a biopsy . MATERIALS AND METHODS Four skin cancer screenings were conducted at rural health care facilities in eastern North Carolina that were connected to East Carolina University School of Medicine . A dermatologist saw the patients in person at the local facility , and the same patient was seen by a dermatologist via a T-1 connection to Greenville , North Carolina . RESULTS The two physicians were in absolute agreement on 59 % of the 107 skin tumors evaluated . There were five lesions identified by the on-site dermatologist as a probable or definite malignancy . The degree of concern about a lesion being malignant and the decision whether to do a biopsy were not significantly different , as shown by kappa analysis .
CONCLUSION The concern about the malignancy of a particular skin lesion and the recommendation whether to do a biopsy were not significantly affected by telemedicine technology BACKGROUND The Israel Defense Forces implemented a pilot teledermatology service in primary clinics . OBJECTIVES To assess user satisfaction and clinical short-term effectiveness of a computerized store and forward teledermatology ( CSAFTD ) service in urban and rural units . METHODS A multi-center prospective uncontrolled cohort pilot trial was conducted for a period of 6 months . Primary care physicians ( PCPs ) referred patients to a board-certified dermatologist using text email accompanied by digital photographs . Diagnosis , therapy and management were sent back to the referring PCP . Patients were asked to evaluate the level of the CSAFTD service , effect of the service on accessibility to dermatologists , respect for privacy , availability of drugs , health improvement and overall satisfaction . PCPs assessed the quality of the teledermatology consultations they received , the contribution to their knowledge , and their overall satisfaction . RESULTS Tele-diagnosis alone was possible for 95 % ( n=413 ) of 435 CSAFTD referrals ; 22 % ( n=95 ) of referrals also required face-to-face consultation . Satisfaction with CSAFTD was high among patients in both rural and urban clinics , with significantly higher scores in rural units . Rural patients rated the level of service , accessibility and overall satisfaction higher than did urban patients . PCPs were satisfied with the quality of the service and its contribution to their knowledge . Rural physicians rated level of service and overall satisfaction higher than did urban physicians . Tele-referrals were completed more efficiently than referral for face-to-face appointments .
CONCLUSIONS CSAFTD provided efficient , high quality medical service to rural and urban military clinics in the IDF OBJECTIVE To determine the effect of degraded digital image resolution ( as viewed on a monitor ) on the accuracy and confidence of dermatologic interpretation . MATERIALS AND METHODS Eight dermatologists interpreted 180 clinical cases divided into three Logical Competitor Sets ( LCS ) ( pigmented lesions , non-pigmented lesions , and inflammatory dermatoses ) . Each case was digitized at three different resolutions . The images were randomized and divided into ( 9 ) 60-image sessions . The physicians were completely blinded concerning the image resolution . After 60 seconds per image , the viewer recorded a diagnosis and level of confidence . The resultant ROC curves compared the effect of LCS , level of clinical difficulty , and resolution of the digital image . One-way analysis of variance ( ANOVA ) compared the curves . RESULTS The areas beneath the ROC curves did not demonstrate any consistently significant difference between the digital image resolutions for all LCS and levels of difficulty . The only significant effect observed was amongst pigmented lesions ( LCS-A ) where the ROC curve area was significantly smaller in the easy images at high resolution compared to low and medium resolutions . For all other ROC curve comparisons within LCS-A , at all other levels of difficulty , as well as within the other LCS at all levels of difficulty , none of the differences was significant .
CONCLUSION A 720 x 500 pixel image can be considered equivalent to a 1490 x 1000 pixel image for most store-and-forward teledermatology consultations Background Increasing use of teledermatology should be based on demonstration of favourable accuracy and cost–benefit analysis for the different methods of use of this technique Teledermatology consultations were organized between two health centres and two hospitals in Northern Ireland using low-cost videoconferencing equipment . A prospective study of patient satisfaction was carried out . Following each teleconsultation , patients were asked to complete a questionnaire assessing their satisfaction with the service . Over 22 months , 334 patients were seen by a dermatologist over the video-link , and 292 patients ( 87 % ) completed the 16-item questionnaire . Patients reported universal satisfaction with the technical aspects of teledermatology . The quality of both the audio and the display was highly acceptable to patients . Personal experiences of the teledermatology consultation were also favourable : 85 % felt comfortable using the video-link . The benefits of teledermatology were generally recognized : 88 % of patients thought that a teleconsultation could save time . Patients found the teledermatology consultation to be as acceptable as the conventional dermatology consultation . These findings suggest overall patient satisfaction with realtime teledermatology We studied the perceptions of general practitioners ( GPs ) towards teledermatology , before and after its introduction into eight general practices for the purposes of a randomized controlled trial . A postal questionnaire was distributed before the trial and again one year later . Thirty-six of the 42 GPs responded on each occasion ( a response rate of 86 % ) . In the second survey , only 21 % of respondents indicated that they were satisfied with teledermatology in their practice , while 47 % said that they were dissatisfied .
Thirty-one per cent said that they felt confident about diagnosis and management of care through teledermatology , and 28 % reported that they were unconfident . Only 23 % of respondents said that they would consider using a telemedicine system in the future , while 34 % said they would not ( 43 % were unsure ) . There were no significant findings to suggest that the GPs ' perceptions changed over time . Less favourable responses to telemedicine were found than has been observed in previous studies , which suggests that the model of telemedicine described in this study would not be widely acceptable to GPs We carried out a pilot study on the feasibility and accuracy of store-and-forward teledermatology based on patient-provided images and history as a triage tool for outpatient consultation . Patients referred by their general practitioner provided a history and images via the Internet . The information was reviewed by one of 12 teledermatologists and the patient then visited a different dermatologist in person within two days . Three independent dermatologists compared the remote and in-person diagnoses in random order to determine diagnostic agreement . Broader agreement was also measured , by comparing the main disease groups into which the two diagnoses fell . The teledermatologists indicated whether an in-person consultation or further investigations were necessary . There were 105 eligible patients , aged four months to 72 years , who were willing to participate . For the 96 cases included in the analysis , complete diagnostic agreement was found in 41 % ( n= 39 ) , partial diagnostic agreement in 10 % ( n= 10 ) and no agreement in 49 % ( n= 47 ) . There was disease group agreement in 66 % of cases ( n= 63 ) .
Nearly a quarter ( 23 % ) of participating patients could have safely been managed without an in-person visit to a dermatologist OBJECTIVE This report describes the design , development , and technical evaluation of a teledermatology system utilizing digital images and electronic forms captured through , stored on , and viewed through a common web server in an urban capitated delivery system . MATERIALS AND METHODS The authors designed a system whereby a primary care physician was able to seek a dermatologic consultation electronically , provide the specialist with digital images acquired according to a standardized protocol , and review the specialist response within 2 business days of the request . The settings were two primary care practices in eastern Massachusetts that were affiliated with a large integrated delivery system . Technical evaluation of the effectiveness of the system involved 18 patients . Main outcome measures included physician and patient satisfaction and comfort and efficiency of care delivery . RESULTS In 15 cases , the consultant dermatologist was comfortable in providing definitive diagnosis and treatment recommendations . In 3 cases , additional information ( laboratory studies or more history ) was requested . There were no instances where the dermatologist felt that a face-to-face visit was necessary . CONCLUSIONS This novel approach shows promise for the delivery of specialist expertise via the internet . Cost-effectiveness studies may be necessary for more widespread implementation OBJECTIVE To determine the relative efficacy of store-and-forward teledermatology vs face-to-face dermatology consultations in triage decisions about the need for a biopsy of neoplastic skin changes . DESIGN Prospective study of consecutive patients judged by an internist to require dermatologic consultation for a skin growth . SETTING Private primary care and dermatology practices and an academic dermatology practice .
PATIENTS Patients requiring dermatology consultation for evaluation of skin growths . Patients were seen by a single primary care physician between July 10 , 1998 , and August 4 , 2000 . INTERVENTION Digital photographs of skin growths were obtained by the primary care physician and evaluated by a teledermatologist . The patient was then seen face-to-face by a dermatologist . A biopsy was performed if either dermatologist favored biopsy . MAIN OUTCOME MEASURES Decisions to perform a biopsy . Agreement between the dermatologists was assessed . RESULTS Of the 49 patients with evaluable photographs , the face-to-face dermatologist and teledermatologist recommended a biopsy for the same 26 patients , yielding a sensitivity of the teledermatologist of 1.00 ( 95 % confidence interval [ CI ] , 0.87 - 1.00 ) and a specificity of 1.00 ( 95 % CI , 0.85 - 1.00 ) . The agreement between the dermatologists ( kappa ) was 1.00 ( 95 % CI , 0.72 - 1.00 ) . CONCLUSION Store-and-forward teledermatology may provide an accurate and cost-effective method of determining whether skin growths in patients presenting to primary care physicians should undergo biopsy A randomized controlled trial was carried out to measure the cost-effectiveness of realtime teledermatology compared with conventional outpatient dermatology care for patients from urban and rural areas . One urban and one rural health centre were linked to a regional hospital in Northern Ireland by ISDN at 128 kbit/s . Over two years , 274 patients required a hospital outpatient dermatology referral - 126 patients ( 46 % ) were randomized to a telemedicine consultation and 148 ( 54 % ) to a conventional hospital outpatient consultation . Of those seen by telemedicine , 61 % were registered with an urban practice , compared with 71 % of those seen conventionally . The clinical outcomes of the two types of consultation were similar - almost half the patients were managed after a single consultation with the dermatologist .
The observed marginal cost per patient of the initial realtime teledermatology consultation was 52.85 for those in urban areas and 59.93 per patient for those from rural areas . The observed marginal cost of the initial conventional consultation was 47.13 for urban patients and 48.77 for rural patients . The total observed costs of teledermatology were higher than the costs of conventional care in both urban and rural areas , mainly because of the fixed equipment costs . Sensitivity analysis using a real-world scenario showed that in urban areas the average costs of the telemedicine and conventional consultations were about equal , while in rural areas the average cost of the telemedicine consultation was less than that of the conventional consultation The diagnostic accuracy of realtime teledermatology was measured using two different video cameras . One camera was a relatively low-cost , single-chip device ( camera 1 ) , while the other was a more expensive three-chip camera ( camera 2 ) . The diagnosis obtained via the videolink was compared with the diagnosis made in person . Sixty-five new patients referred to a dermatology clinic were examined using camera 1 followed by a standard face-to-face consultation . A further 65 patients were examined using camera 2 and the same procedure applied . Seventy-six per cent of conditions were correctly diagnosed by telemedicine using camera 2 compared with 59 % using camera 1 . A working differential diagnosis was obtained in 12 % of cases using camera 2 compared with 17 % using camera 1 . The percentage of ‘ no diagnosis ’ , wrong and missed diagnoses was halved using camera 2 compared with camera 1 . These results suggest that the performance of camera 2 was superior to that of camera 1 for realtime teledermatology In remote areas , telemedicine services can improve the quality of access to specialist medical care and dermatology is well suited to the use of this technology .
There is no published work on teledermatology services in Australia . Our purpose was to investigate the reliability of dermatological diagnoses obtained using a store and forward telemedicine system , which is being developed to offer specialist consultative services to patients in remote areas of Western Australia . We report on a small prospective non‐randomized pilot study conducted at Royal Perth Hospital , Western Australia which compared diagnoses reached following telemedicine consultations with diagnoses reached following traditional face‐to‐face consultations . In 25 out of 30 consultations , identical diagnoses were reached . In the remaining five cases , the preferred diagnosis and first differential diagnosis were reversed in order of preference . We feel this system is sufficiently promising to trial more extensively in the field The objective of this multicentre study was to undertake a systematic comparison of face‐to‐face consultations and teleconsultations performed using low‐cost videoconferencing equipment . One hundred and twenty‐six patients were enrolled by their general practitioners across three sites . Each patient underwent a teleconsultation with a distant dermatologist followed by a traditional face‐to‐face consultation with a dermatologist . The main outcome measures were diagnostic concordance rates , management plans and patient and doctor satisfaction . One hundred and fifty‐five diagnoses were identified by the face‐to‐face consultations from the sample of 126 patients . Identical diagnoses were recorded from both types of consultation in 59 % of cases . Teledermatology consultations missed a secondary diagnosis in 6 % of cases and were unable to make a useful diagnosis in 11 % of cases . Wrong diagnoses were made by the teledermatologist in 4 % of cases . Dermatologists were able to make a definitive diagnosis by face‐to‐face consultations in significantly more cases than by teleconsultations ( P = 0.001 ) .
Where both types of consultation resulted in a single diagnosis there was a high level of agreement ( κ = 0.96 , lower 95 % confidence limit 0.91–1.00 ) . Overall follow‐up rates from both types of consultation were almost identical . Fifty per cent of patients seen could have been managed using a single videoconferenced teleconsultation without any requirement for further specialist intervention . Patients reported high levels of satisfaction with the teleconsultations . General practitioners reported that 75 % of the teleconsultations were of educational benefit . This study illustrates the potential of telemedicine to diagnose and manage dermatology cases referred from primary care . Once the problem of image quality has been addressed , further studies will be required to investigate the cost‐effectiveness of a teledermatology service and the potential consequences for the provision of dermatological services in the As part of a randomized controlled trial of the costs and benefits of realtime teledermatology in comparison with conventional face-to-face appointments , patients were asked to complete a questionnaire at the end of their consultation . One hundred and nine patients took part in an initial teledermatology consultation and 94 in a face-to-face consultation . The proportion of patients followed up by the dermatologist was almost the same after teledermatology ( 24 % ) as after a hospital appointment ( 26 % ) and for similar reasons . Two hundred and three questionnaires were completed after the first visit and a further 20 after subsequent visits . Patients seen by teledermatology at their own health centre travelled an average of 12 km , whereas those who attended a conventional clinic travelled an average of 271 km . The telemedicine group spent an average of 51 min attending the appointment compared with 4.3 h for those seen at the hospital .
The results of the present study , as in a similar study conducted in Northern Ireland , show that the economic benefits of teledermatology favour the patient rather than the health-care system Our objective was to assess the economic impact of store-and-forward teledermatology in a United States Department of Veterans Affairs ( VA ) health care setting . Patients being referred to the Dermatology Consult Service from the Primary Care Clinics at the Durham , North Carolina VA Medical Center were randomized either to usual care or to a teledermatology consultation . Fixed and variable costs for both consult modalities were identified using a microcosting approach . The observed clinical outcomes from the randomized trial generated probability and effectiveness measures that were inserted into a decision model . A cost analysis and a cost-effectiveness analysis that compared the two consult modalities was performed . Teledermatology was not cost saving when compared to usual care using observed costs and outcomes . Sensitivity analyses indicated that teledermatology has the potential to be cost saving if clinic visit costs , travel costs , or averted clinic visits were higher than observed in the study . Teledermatology was cost-effective for decreasing the time required for patients to reach a point of initial definitive care . Cost-effectiveness ratios ranged from $ 0.12 - 0.17 ( U.S. ) per patient per day of time to initial intervention We evaluated the ability of subjects to capture and submit teledermatology images with a digital camera . We also examined whether participants who received individual training sessions would capture better-quality images than participants who were provided only with self-training . Fifty participants were randomized between in-person training and self-training via an online tutorial . The majority of participants were young , well educated women .
Two dermatologists reading the images for quality indicators had high agreement that digital images acquired were of high quality : images were well framed , appropriately bright , in focus and did not have a shadow . There was moderate agreement on diagnosis-related indicators , such as the presence or absence of pustules or papules and acne versus rosacea . There was no difference in the image-quality attributes between participants personally trained and those trained with the online tutorial . Subjects participating in this study were able to acquire images of good quality , irrespective of whether they received practical training or used an online tutorial A randomized controlled trial was carried out to measure the societal costs of realtime teledermatology compared with those of conventional hospital care in New Zealand . Two rural health centres were linked to a specialist hospital via ISDN at 128 kbit/s . Over 10 months , 203 patients were referred for a specialist dermatological consultation and 26 were followed up , giving a total of 229 consultations . Fifty-four per cent were randomized to the teledermatology consultation and 46 % to the conventional hospital consultation . A cost-minimization analysis was used to calculate the total costs of both types of dermatological consultation . The total cost of the 123 teledermatology consultations was NZ$34,346 and the total cost of the 106 conventional hospital consultations was NZ$30,081 . The average societal cost of the teledermatology consultation was therefore NZ$279.23 compared with NZ$283.79 for the conventional hospital consultation . The marginal cost of seeing an additional patient was NZ$135 via teledermatology and NZ$284 via conventional hospital appointment . 
From a societal viewpoint , and assuming an equal outcome , teledermatology was a more cost-efficient use of resources than conventional hospital care Teledermatology is the practice of dermatology across distances ( and time ) and involves the transfer of electronic information . To be effective and safe , the teledermatology process needs to demonstrate an acceptable level of accuracy and reliability . Accuracy is reflected by the degree of concordance ( agreement ) between the teledermatology and face‐to‐face diagnoses . Reliability is dependent on how consistently a set of results can be reproduced across different operators . Mean concordance ( primary diagnoses ) achieved by four dermatologists studying 53 store‐and‐forward diagnostic cases , originating from 49 referred patients , was 79 % ( range 73–85 % ) . When the differential diagnoses were taken into account , the variation across individual dermatologists narrowed further , with a mean of 86 % ( range 83–89 % ) . In contrast , the mean general practitioner ( GP ; n = 11 ) concordance ( GP face‐to‐face vs reference dermatologist store‐and‐forward diagnoses ) was 49 % . An interim review of all 49 teledermatology patients showed no adverse outcome at the end of 3 months . The ability to request face‐to‐face visits by dermatologists , combined with GPs maintaining primary care of the referred patient , serve as additional safeguards for patients using a telemedicine system . Our results indicate that teledermatology management of referred skin complaints is both accurate and reliable Many studies have been published recently on the effectiveness of teledermatology as a diagnostic tool ; however , much of the data comes from live 2-way video teleconferencing consultations and very little comes from more readily available " store and forward " consultations . Moreover , most published studies compare the diagnoses of 2 different dermatologists ( interobserver comparison ) . 
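The cost-minimization arithmetic in the New Zealand trial reduces to dividing each arm's total societal cost by its number of consultations . A quick check of the published figures ( the averages below differ from the reported NZ$279.23 and NZ$283.79 by a cent , because the published totals are themselves rounded ) :

```python
# Average societal cost per consultation, from the published totals above.
tele_total, tele_n = 34_346, 123   # NZ$ total and consultation count, teledermatology arm
hosp_total, hosp_n = 30_081, 106   # NZ$ total and consultation count, hospital arm

avg_tele = tele_total / tele_n     # ≈ NZ$279.24 (reported: NZ$279.23)
avg_hosp = hosp_total / hosp_n     # ≈ NZ$283.78 (reported: NZ$283.79)
saving_per_consult = avg_hosp - avg_tele  # ≈ NZ$4.55 per consultation
```

The per-consultation difference is small ; the trial's headline contrast is instead the marginal cost of an additional patient ( NZ$135 vs NZ$284 ) , where fixed equipment costs have already been sunk .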
Given the lack of data on baseline interdermatologist diagnostic variability , the interpretation of currently available diagnostic correlation data is somewhat difficult . The objective of this study is to measure the degree of diagnostic concordance between a dermatologist seeing a patient via a teledermatology consult system and the same dermatologist seeing the same patient face-to-face in a dermatology clinic at a tertiary medical center . A random sample of 404 patients was selected from patients who had routine appointments at our dermatology clinic OBJECTIVE In addition to the assessment and the management of patients with skin diseases , a considerable portion of dermatology residency involves examining clinical images and generating differential diagnoses from these images . This training , though helpful for recognizing manifestations of rare disorders , goes unused by most practicing dermatologists after certification . In contrast , dermatology residents learn and master verbal descriptions of skin diseases and continue to use this skill throughout their careers . However , problems arise when a dermatologist is not available and a non-dermatologist attempts to verbally describe a skin condition . An accurate description of a cutaneous disorder can facilitate effective triage management of a patient when a dermatologist is not available . Unfortunately , an inaccurate description by the referring provider can lead to diagnostic bias and ineffective , or even harmful , initial treatment . In recent years , digital photography has facilitated the electronic transfer of clinical images over distances . However , despite the promise that this technique shows in providing teledermatologic services to specialty-underserved areas and the availability of low-cost digital cameras , telephone consultation is still the standard of care when a dermatologist is not available . 
The purpose of this study is to compare the reliability of dermatologic consultations that use the telephone with that of dermatologic consultations that use both the telephone and digital images . DESCRIPTION After patient approval , an acute care provider randomly assigned patients with skin disorders of unclear etiology to two groups , with and without digital images . The acute care provider then performed an exam and took the patient 's history . Telephone data , with or without digital images , were then presented to the consulting dermatologist , who formulated a pre-physical exam differential diagnosis and treatment plan . The consulting dermatologist immediately examined the patient in person and refined the diagnosis and management . The confidence in diagnosis , both before and after the in-person exam , was compared in the patient group with digital images and in the patient group without digital images using a five-point scale ( 1 = no confidence , 5 = most confident ) . DISCUSSION The consulting dermatologist evaluated 12 patients ( six with digital images and six without digital images ) . In the patient group with digital images , the consulting dermatologist 's confidence in diagnosis varied very little from before to after the in-person exam ( from no change in five cases to a one-point increase in the sixth case ) . In the patient group without digital images , the consulting dermatologist 's confidence level increased significantly from before to after the in-person exam . This led to therapy changes for three of the six patients in the patient group without digital images , versus two of the six patients in the patient group with digital images . This study indicates that an acute care provider 's verbal description of a skin condition may be less reliable compared with a provider 's verbal description combined with digital images . 
Telephone-only descriptions may also lead to management discrepancies more frequently than telephone descriptions with digital images . This has at least two implications for medical education : ( 1 ) need for support of formal teaching of the language of dermatology to non-dermatologists and ( 2 ) justification of the time spent in two-dimensional clinical image interpretation by dermatology residents in light of digital image technology
2,097
20,012,774
Conclusion Despite several limitations , this meta-analysis based on high-quality studies adds weight to the hypothesis that occupational exposure to endotoxin in cotton textile production and agriculture is protective against lung cancer
Objective To examine the association between exposure to endotoxins and lung cancer risk by conducting a systematic review and meta-analysis of epidemiologic studies of workers in the cotton textile and agricultural industries , industries known for high exposure levels of endotoxins .
OBJECTIVES This large , prospective cohort study of private applicators , commercial applicators , and spouses of farmer applicators was undertaken to ascertain the etiology of cancers elevated in agriculture . METHODS The participants were matched to cancer registry files in Iowa and North Carolina . Incident cases were identified from enrollment through 31 December 2002 . Standardized incidence ratios ( SIR ) were used to compare the cancer incidence of the participants with that of the total population in the two states . RESULTS The overall cancer incidence among farmers [ SIR 0.88 , 95 % confidence interval ( 95 % CI ) 0.84 - 0.91 ] and their spouses ( SIR 0.84 , 95 % CI 0.80 - 0.90 ) were significantly lower than expected , particularly for respiratory and urinary cancers . Commercial pesticide applicators had an overall cancer incidence comparable with the expected ( SIR 1.01 , 95 % CI 0.84 - 1.20 ) . Smoking prevalence was significantly lower than the national average . Prostate cancer was elevated among private applicators ( SIR 1.24 , 95 % CI 1.18 - 1.33 ) and commercial applicators ( SIR 1.37 , 95 % CI 0.98 - 1.86 ) . Excess ovarian cancer was observed for female applicators ( SIR 2.97 , 95 % CI 1.28 - 5.85 ) , but not for female spouses ( SIR 0.55 , 95 % CI 0.38 - 0.78 ) . Female spouses had a significant excess of melanoma ( SIR 1.64 , 95 % CI 1.24 - 2.09 ) , which was not observed among pesticide applicators . CONCLUSIONS Low overall cancer incidence rates seem to be a result of low overall smoking prevalence and other lifestyle factors , while excess cancer of the prostate and ovaries among applicators may be occupationally related . The excess risk of melanoma observed among spouses was unexpected We report the immunological and clinical results of a phase II trial with intravenously administered highly purified endotoxin ( Salmonella abortus equi ) in patients with advanced cancer . 
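The standardized incidence ratio used throughout this cohort study is simply observed cases divided by the cases expected from reference-population rates . The paper does not state which confidence-interval method it used ; a common large-sample choice is a normal approximation on the log scale , sketched below with hypothetical counts :

```python
import math

# SIR with an approximate 95% CI on the log scale:
#   SIR = observed / expected
#   CI  = SIR * exp(+/- z / sqrt(observed))
# This CI method is an assumption, not stated in the abstract;
# the observed/expected counts here are hypothetical.
def sir_ci(observed, expected, z=1.96):
    sir = observed / expected
    half_width = z / math.sqrt(observed)
    return sir, sir * math.exp(-half_width), sir * math.exp(half_width)

sir, lo, hi = sir_ci(observed=50, expected=40.0)  # SIR = 1.25, CI ≈ (0.95, 1.65)
```

An SIR below 1 with an upper confidence limit below 1 , as reported for overall cancer among farmers ( 0.88 , 0.84 - 0.91 ) , indicates incidence significantly lower than the reference population .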
15 patients with non-small cell lung cancer and 27 with colorectal cancer were entered into the study . 37 evaluable patients received at least four injections of endotoxin ( 4 ng/kg body weight ) and 1600 mg ibuprofen orally in 2-week intervals . Transient renal ( WHO grade 0 - 1 ) and hepatic ( WHO grade 0 - 4 ) toxicities occurred in several patients . Constitutional side-effects such as fever , chills and hypotension could not be prevented completely by pretreatment with ibuprofen . 3 patients in the colorectal cancer group demonstrated objective responses ( 1 complete remission ( CR ) , 2 partial remission ( PR ) ) . The complete remission has been maintained for more than 3 years , while the partial remissions were stable for 7 and 8 months , respectively . Only marginal antitumour effects were seen in the lung cancer group . Tolerance of the macrophage system to the stimulatory effect of endotoxin , as measured by tumour necrosis factor alpha ( TNF-alpha ) release into serum , built up after the first administration and remained at a steady-state level after each subsequent injection . In contrast , rising CD4:CD8 ratio and release of tumour necrosis factor beta ( TNF-beta ) indicated the continuing activation of the lymphocyte system by repetitive injections of endotoxin Objective : A prospective study of newly exposed cotton workers was performed to investigate the natural history of respiratory symptoms and lung function changes . Methods : A total of 157 workers naive to cotton dust exposure were investigated by questionnaire , spirometry , and skin tests . They were examined before employment ( baseline ) and at the end of the first week , and the first , third , sixth , and 12th month after starting work . Acute airway response was defined as either a cross-first-shift or a cross-week fall in forced expiratory volume in one second ( FEV1 ) . The longitudinal change of lung function over the year was also calculated . 
Five hundred seventy-two personal dust samplings and 191 endotoxin measurements were performed to assess the exposure . Results : Forty percent of workers reported work-related symptoms in the first week of the study . Smoking , endotoxin , and dust concentrations were risk factors for all work-related symptoms . Acute airway responses were witnessed after immediate exposure . Female status was the only factor found to be predictive of acute airway response . The mean longitudinal fall in FEV1 at 1 year was 65.5 mL ( standard error = 37.2 ) . Age , early respiratory symptoms , and early fall in cross-week FEV1 were found to predict the 12-month fall in FEV1 . Cross-first-shift and cross-week falls in FEV1 reduced in magnitude during the course of the study . Conclusions : This study of workers naive to cotton dust exposure has demonstrated that respiratory symptoms and acute airway responses develop early following first exposure , and a tolerance effect develops in those workers with the continued exposure . Current smoking and increasing exposure predicts the development of work-related lower respiratory tract symptoms , while early symptoms and acute airway changes across the working week predict the longitudinal loss of lung function at 1 year Objectives Preventive workplace regulations are so far not based on an ubiquitously accepted threshold for airborne endotoxin in the bioaerosol . Methods In a cross-sectional study , 150 employees of a cotton spinning mill underwent lung function testing . Furthermore , in a random subset of 75 textile workers cross-shift lung function test and methacholine challenges were performed . Airborne current endotoxin exposure was classified as “ low ” , “ medium ” , and “ high ” ( ≤100 , > 100–≤450 , and > 450 Endotoxin Units (EU)/m3 , respectively ) based on endotoxin activity . 
Results The exposure – response relationship between current endotoxin exposure and prevalence of an obstructive ventilation pattern was significant ( test for linear trend : P = 0.019 ) ; the adjusted odds ratio for high endotoxin exposure was 11.22 ( 95 % confidence interval 1.03–121.17 ) . Within individuals , FEV1/FVC% was significantly reduced after the shift ( paired t test : P = 0.009 ) but not related to current endotoxin exposure . Twelve workers showed bronchial hyperresponsiveness ( 8.1 % before and 12.2 % after the work shift ; Fisher ’s exact test : P = 0.021 ) . Conclusion The study among German cotton textile workers suggests an exposure-dependent effect of current endotoxin exposure on lung function impairment with significant effects above 450 EU/m3 In order to evaluate chronic effects of long-term exposure to cotton dust on respiratory health , and the role of dust and endotoxin , longitudinal changes in lung function and respiratory symptoms were observed prospectively from 1981 to 2001 in 447 cotton textile workers , along with 472 silk textile controls . The results from five surveys conducted over the 20-yr period are reported , including standardised questionnaires , pre- and post-shift spirometric measurements , work-area inhalable dust sample collections and airborne Gram-bacterial endotoxin analysis . Cotton workers had more persistent respiratory symptoms and greater annual declines in forced expiratory volume in one second ( FEV1 ) and forced vital capacity as compared with silk workers . After exposure cessation , in the final 5-yr period , the rate of FEV1 decline tended to slow in nonsmoking males , but not in nonsmoking females . Workers who reported byssinotic symptoms more persistently suffered greater declines in FEV1 . Chronic loss in lung function was more strongly associated with exposure to endotoxin than to dust . 
In conclusion , the current study suggests that long-term exposure to cotton dust , in which airborne endotoxin appears to play an important role , results in substantial adverse chronic respiratory effects This case-referent study evaluated cancer risks among farmers in central Italy . Cancer cases ( N = 1674 , 17 sites ) were selected from all deceased men aged 35 - 80 years ; a random sample of 480 decedents formed the reference series . Farmers had a decreased risk of lung and bladder cancer and melanoma and nonsignificant excess risks for stomach , rectal , kidney , and nonmelanoma skin cancer . Stomach and kidney cancer were significantly increased among the farmers with > 10 years ' experience , and stomach , rectal , and pancreatic cancer were increased among licensed pesticide users with > 10 years ' experience . Possible relationships emerged between specific crops and cancer : fruit and colon and bladder cancer , wheat and prostate cancer , olives and kidney cancer , and potato and kidney cancer . The results regarding stomach , pancreatic , lung , bladder , and prostate cancer and melanoma are consistent with earlier results . The kidney cancer excess , the association of colon and bladder cancer with orchard farming , and the excess of rectal cancer among licensed farmers are new and unexpected findings BACKGROUND Cancer incidence in women textile workers has not been adequately studied . The aim of this study was to examine site-specific cancer incidence patterns in a cohort of 267,400 women textile workers in Shanghai , China . METHODS Women employed by the Shanghai Textile Industry Bureau ( STIB ) were followed for cancer incidence from 1989 to 1998 . Age-adjusted standardized incidence ratios ( SIRs ) and 95 % confidence intervals ( CI ) were computed based on Shanghai Cancer Registry ( SCR ) rates . RESULTS There was a decrease in cancer incidence for the cohort compared with urban Shanghai women ( SIR = 0.91 , 95 % CI = 0.89 - 0.93 ) . 
There were small increased risks of other endocrine tumors ( SIR = 1.31 , 95 % CI = 1.02 - 1.65 ) . There were decreased risks for esophageal ( SIR = 0.54 , 95 % CI = 0.44 - 0.66 ) , stomach ( SIR = 0.79 , 95 % CI = 0.73 - 0.85 ) , rectal ( SIR = 0.88 , 95 % CI = 0.78 - 0.98 ) , lung ( SIR = 0.80 , 95 % CI = 0.74 - 0.86 ) , cervical ( SIR = 0.37 , 95 % CI = 0.28 - 0.50 ) , ovarian ( SIR = 0.85 , 95 % CI = 0.75 - 0.96 ) , and bladder cancers ( SIR = 0.63 , 95 % CI = 0.46 - 0.85 ) . CONCLUSIONS Women employed in the textile industry had a lower than expected cancer experience compared with urban Shanghai women . Further research on this cohort will examine associations between site-specific cancers and occupational exposures to dusts and chemicals
2,098
23,256,601
The improved intake of targeted nutrients and foods , such as fruit and vegetables , could potentially reduce the rate of non-communicable diseases in adults , if the changes in diet are sustained . Associated improvements in perinatal outcomes were limited and most evident in women who smoked during pregnancy .
Background Less healthy diets are common in high income countries , although proportionally higher in those of low socio-economic status . Food subsidy programs are one strategy to promote healthy nutrition and to reduce socio-economic inequalities in health . This review summarises the evidence for the health and nutritional impacts of food subsidy programs among disadvantaged families from high income countries .
Of 824 women screened , 410 were enrolled at midpregnancy in a prospective , randomized , controlled nutrition intervention study . Of these , 226 were predicted as likely to have small or large babies , 184 to have average-sized babies . Two hundred thirty eight mothers received USDA Women , Infants and Children ( WIC ) Food Supplementation vouchers from midpregnancy , 172 did not . Leukocyte protein synthesis ( as a cell model ) was significantly higher ( p = 0.009 ) by 36 weeks gestation in supplemented mothers . Mean birth weight of their babies was greater , 3254 vs 3163 g , ( + 91 g ) p = 0.039 , adjusted for sex , gestational age , prenatal visits , pregnancy interval , smoking , and previous low birth weight infants . Controlling for entry weight obviated the significance of the difference , except for WIC supplemented smokers ( greater than 10 cigarettes/day ) whose babies were significantly heavier by + 168 g ( p = 0.017 ) than those of unsupplemented smokers . WIC partially protects fetal growth in smokers Interpregnancy WIC supplementation was evaluated by comparing maternal nutritional status indicators and subsequent birth outcomes of 703 WIC participants divided into two groups . Study group women received postpartum benefits for 5 - 7 mo while control group women received postpartum benefits for only 0 - 2 mo . Both groups received prenatal benefits during each of two study pregnancies . Infants born to study group women had a higher mean birthweight ( 131 g ) and birthlength ( 0.3 cm ) and a lower risk of being less than or equal to 2500 g. Additionally , at the onset of the second pregnancy study group women had higher mean hemoglobin levels and lower risk of maternal obesity . These results suggest that postpartum WIC supplementation has positive benefits for both the mother and her subsequent infants BACKGROUND Studies of fruit and vegetable consumption in relation to overall health are limited . 
We evaluated the relationship between fruit and vegetable intake and the incidence of cardiovascular disease and cancer and of deaths from other causes in two prospective cohorts . METHODS A total of 71 910 female participants in the Nurses ' Health Study and 37,725 male participants in the Health Professionals ' Follow-up Study who were free of major chronic disease completed baseline semiquantitative food-frequency questionnaires in 1984 and 1986 , respectively . Dietary information was updated in 1986 , 1990 , and 1994 for women and in 1990 and 1994 for men . Participants were followed up for incidence of cardiovascular disease , cancer , or death through May 1998 ( women ) and January 1998 ( men ) . Multivariable-adjusted relative risks were calculated with Cox proportional hazards analysis . RESULTS We ascertained 9329 events ( 1964 cardiovascular , 6584 cancer , and 781 other deaths ) in women and 4957 events ( 1670 cardiovascular diseases , 2500 cancers , and 787 other deaths ) in men during follow-up . For men and women combined , participants in the highest quintile of total fruit and vegetable intake had a relative risk for major chronic disease of 0.95 ( 95 % confidence interval [ CI ] = 0.89 to 1.01 ) times that of those in the lowest . Total fruit and vegetable intake was inversely associated with risk of cardiovascular disease but not with overall cancer incidence , with relative risk for an increment of five servings daily of 0.88 ( 95 % CI = 0.81 to 0.95 ) for cardiovascular disease and 1.00 ( 95 % CI = 0.95 to 1.05 ) for cancer . Of the food groups analyzed , green leafy vegetable intake showed the strongest inverse association with major chronic disease and cardiovascular disease . For an increment of one serving per day of green leafy vegetables , relative risks were 0.95 ( 95 % CI = 0.92 to 0.99 ) for major chronic disease and 0.89 ( 95 % CI = 0.83 to 0.96 ) for cardiovascular disease . 
CONCLUSIONS Increased fruit and vegetable consumption was associated with a modest although not statistically significant reduction in the development of major chronic disease . The benefits appeared to be primarily for cardiovascular disease and not for cancer The longitudinal study of pregnant women enrolled a national probability sample of 5,205 women first certified for WIC and 1,358 comparable low-income pregnant women in 174 WIC clinics located in 58 areas in the contiguous 48 states and in 55 prenatal clinics without WIC programs in counties with low program coverage . The women completed 24-h dietary recalls , histories of food expenditures , health care utilization , health and sociodemographic status , and anthropometric assessment . At late-pregnancy follow-up 3,967 WIC and 1043 control women were interviewed and 853 WIC and 762 control women completed 1-wk food expenditure diaries . Birth outcome was abstracted ( from hospital records ) for 3,863 WIC and 1058 control women . Anthropometry , dietary intake , health , and use of health services were related to WIC among 2,619 random low-income preschoolers . Psychological development was assessed in 526 children aged 4 and 5 y. Control women had higher income , education , and employment status ; therefore , WIC program benefits probably were underestimated OBJECTIVE To examine the effectiveness of two methods of increasing fruit and fruit juice intake in pregnancy : midwives ' advice and vouchers exchangeable for juice . DESIGN Pregnant women were randomly allocated to three groups : a control group , who received usual care ; an advice group , given advice and leaflets promoting fruit and fruit juice consumption ; and a voucher group , given vouchers exchangeable for fruit juice from a milk delivery firm . Dietary questionnaires were administered at ~16 , 20 and 32 weeks of pregnancy . Serum beta-carotene was measured at 16 and 32 weeks . SETTING An antenatal clinic in a deprived area . 
SUBJECTS Pregnant women aged 17 years and over . RESULTS The study comprised 190 women . Frequency of fruit consumption declined during pregnancy in all groups , but that of fruit juice increased substantially in the voucher group . Serum beta-carotene concentration increased in the voucher group , from 106.2 to 141.8 micromol l(-1 ) in women with measurements on both occasions ( P = 0.003 ) , decreased from 120.0 to 99.8 micromol l(-1 ) in the control group ( P = 0.005 ) , and was unchanged in the advice group . CONCLUSIONS Pregnant women drink more fruit juice if they receive vouchers exchangeable for juice supplied by the milk delivery service . Midwives ' advice to eat more fruit has no great effect . Providing vouchers for fruit juice is a simple method of increasing its intake in a deprived population and may be useful for other sections of the community Background There is overwhelming evidence that behavioural factors influence health , but their combined impact on the general population is less well documented . We aimed to quantify the potential combined impact of four health behaviours on mortality in men and women living in the general community . Methods and Findings We examined the prospective relationship between lifestyle and mortality in a prospective population study of 20,244 men and women aged 45–79 y with no known cardiovascular disease or cancer at baseline survey in 1993–1997 , living in the general community in the United Kingdom , and followed up to 2006 . Participants scored one point for each health behaviour : current non-smoking , not physically inactive , moderate alcohol intake ( 1–14 units a week ) and plasma vitamin C > 50 μmol/l indicating fruit and vegetable intake of at least five servings a day , for a total score ranging from zero to four . 
After an average 11 y follow-up , the age- , sex- , body mass- , and social class-adjusted relative risks ( 95 % confidence intervals ) for all-cause mortality ( 1,987 deaths ) for men and women who had three , two , one , and zero compared to four health behaviours were respectively , 1.39 ( 1.21–1.60 ) , 1.95 ( 1.70–2.25 ) , 2.52 ( 2.13–3.00 ) , and 4.04 ( 2.95–5.54 ) p < 0.001 trend . The relationships were consistent in subgroups stratified by sex , age , body mass index , and social class , and after excluding deaths within 2 y. The trends were strongest for cardiovascular causes . The mortality risk for those with four compared to zero health behaviours was equivalent to being 14 y younger in chronological age . Conclusions Four health behaviours combined predict a 4-fold difference in total mortality in men and women , with an estimated impact equivalent to 14 y in chronological age In several studies , many nutrients in fruits and vegetables , such as dietary fiber , potassium , and antioxidants , have been associated with reduced risk for cardiovascular disease ( 1 - 5 ) . However , as reviewed elsewhere ( 6 ) , most prospective studies that have specifically examined intake of fruits and vegetables in relation to risk for cardiovascular disease have been small , and their results have been inconsistent . Dietary assessments were often crude and available only at baseline , and few studies have examined the effects of specific types of vegetables or fruits . In a recent report ( 7 ) , we evaluated the association between fruit and vegetable intake and risk for ischemic stroke . We found that persons in the highest quintile of fruit and vegetable intake had a relative risk of 0.69 ( 95 % CI , 0.52 to 0.92 ) compared with the lowest quintile of intake ; moreover , a 1-serving/d increase in fruit or vegetable intake was associated with a 6 % lower risk for ischemic stroke , after controlling for standard cardiovascular risk factors . 
In the current study , we sought to evaluate the association between intake of overall and specific fruits and vegetables and incidence of coronary heart disease . Methods Study Sample The samples for this analysis consisted of participants in the Nurses ' Health Study ( 8 ) and Health Professionals ' Follow-Up Study ( 1 ) . The two studies have similar designs ; in both , participants complete mailed questionnaires about medical history , health behaviors , and occurrence of cardiovascular and other outcomes every 2 years . The Nurses ' Health Study began in 1976 , when 121 700 female registered nurses 30 to 55 years of age were recruited ; diet was first assessed in 1980 . Health Professionals ' Follow-up Study participants were recruited in 1986 and comprise 51 529 male health professionals , including dentists , veterinarians , pharmacists , optometrists , osteopaths , and podiatrists , 40 to 75 years of age . Sample for Analysis We excluded participants with incomplete dietary assessments or with previously diagnosed cancer , diabetes or cardiovascular disease that was reported before the first dietary assessment . We followed 84 251 eligible women during 14 years of follow-up and 42 148 eligible men during 8 years follow-up for incidence of coronary heart disease . The rate of follow-up for nonfatal events was 97 % of the total potential person-years of follow-up in both cohorts . Assessment of Coronary Heart Disease End Points Our primary end point was nonfatal myocardial infarction or fatal coronary disease occurring after return of the 1980 questionnaire but before 1 June 1994 in women and after return of the 1986 questionnaire but before 1 January 1994 in men . We sought to review medical records for all such reports . Records were reviewed by physicians who were blinded to the participants ' risk factor status . 
Myocardial infarction was confirmed by using World Health Organization criteria : symptoms plus either diagnostic electrocardiographic changes or elevated levels of cardiac enzymes ( 9 ) . Infarctions that required hospital admission and for which confirmatory information was obtained by interview or letter , but for which no medical records were available , were designated as probable . We included all confirmed and probable cases in our analyses because results were the same after probable cases were excluded . Deaths were identified by using state vital records and the National Death Index or were reported by next of kin and the U.S. postal system . Follow-up for deaths was more than 98 % complete ( 10 ) . Death certificates along with medical records were used to ascertain cause of death . Fatal coronary disease was categorized as definite if 1 ) it was confirmed by hospital record or autopsy or 2 ) coronary disease was listed as the cause of death on the certificate , this was the underlying and most plausible cause , and evidence of previous coronary disease was available . We did not rely on the statement of the cause of death on the death certificate alone as providing sufficient confirmation of death due to coronary heart disease . If no medical records were available , we categorized persons in whom coronary heart disease was the underlying cause on the death certificate as presumed coronary heart disease . Analyses limited to confirmed cases yielded results very similar to those obtained when all cases were included , although with less precision . Persons who experienced sudden death within 1 hour of onset of symptoms and had no plausible cause other than coronary disease were categorized as coronary heart disease cases . Fatal cases of coronary heart disease constituted 30 % of all cases of coronary heart disease among women and 33 % among men . Dietary Assessment Diet was assessed in the Nurses ' Health Study in 1980 , 1984 , 1986 , and 1990 . 
A 61-item semiquantitative food-frequency questionnaire that included 6 fruit items, 11 vegetable items, and 3 potato items was used in 1980. In 1984, the questionnaire was expanded to 126 items that covered 15 fruit items and 28 vegetable items plus potatoes; similar questionnaires were repeated in 1986 and 1990. In the Health Professionals' Follow-up Study, diet was assessed in 1986 and 1990 by using food-frequency questionnaires very similar to the 1984 Nurses' Health Study questionnaire. We excluded women who left 10 or more of the 61 items blank or who had implausible scores for total food intake (<500 or >3500 kcal/d). Men who left 70 or more of the 131 dietary questions blank or who reported daily caloric intake outside the plausible range of 800 to 4200 calories were also excluded. For each food item, a standard serving size was specified. Natural portion sizes (for example, one banana or a small glass of tomato juice) were used whenever possible; otherwise, a weight or volume of that item commonly consumed by the U.S. population at one meal was used. On dietary questionnaires, participants reported their average intake of the specified portion size (serving) for each food over the past year. For each food item on the questionnaire, nine responses were possible, ranging from never or less than once per month to six or more times per day. Detailed descriptions of the reproducibility and validity of the food-frequency questionnaire for men and women have been published elsewhere (11-13). Frequencies and portions for the individual food items were converted to average daily intake of each fruit and vegetable item for each participant. The average daily intakes of individual food items were combined to compute total fruit and vegetable intake and intakes of composite fruit and vegetable groups.
Definitions of the composite groups (all fruits, all vegetables, citrus fruit, citrus fruit juice, cruciferous vegetables, green leafy vegetables, vitamin C-rich fruits and vegetables, legumes, and potatoes) were modified for our previous study (7) by using a report by Steinmetz and colleagues (14). Vitamin C-rich fruits and vegetables were defined as those containing more than 30 mg of vitamin C per serving. We did not include potatoes, tofu and soybeans, dried beans, and lentils as vegetables; in addition, condiments such as chili sauce and garlic that had very small portion sizes were not counted in total vegetables. When aggregating items to compute the composite items, we assumed that individual foods for which values were missing implied no intake (15). Statistical Analysis We found 1063 incident cases of coronary heart disease among men and 1127 among women. Person-time for each participant was calculated from the date of return of the 1980 questionnaire in the Nurses' Health Study or the 1986 questionnaire in the Health Professionals' Follow-up Study to the first coronary heart disease event, death, or the cutoff date (1 June 1994 for women and 31 January 1994 for men), whichever occurred first. We excluded participants who reported cardiovascular disease, cancer, or diabetes before completion of the baseline dietary questionnaires. Each participant contributed only one end point, and the cohort at risk for each 2-year follow-up period included only those who remained free of reported coronary heart disease at the beginning of each follow-up period. The study hypotheses were defined before data were collected. The analyses were performed separately in each cohort because of differences in sex and the questionnaires administered to the two cohorts. This approach was selected to achieve better control of confounding.
We used pooled logistic regression with 2-year follow-up increments (16) to estimate relative risks (incidence rate ratios) and 95% CIs within each cohort. Analyses were adjusted for age (5-year categories), smoking (never, former, or current [1 to 14 cigarettes/d, 15 to 24 cigarettes/d, or 25 cigarettes/d]), alcohol consumption (five categories in women and seven categories in men), family history of myocardial infarction (before 65 years of age in women and before 60 years of age in men), body mass index (quintiles), use of multivitamin supplements, use of vitamin E, use of aspirin, physical activity (two categories in women and quintiles in men), reported hypertension and hypercholesterolemia, total daily caloric intake (17), and time period (each 2-year follow-up period). Among women, we also controlled for postmenopausal hormone use. We updated information on diet and risk factors for coronary heart disease over time to better represent long-term patterns (8, 18). In the Nurses' Health Study, we used data from the 1980, 1984, 1986, and 1990 questionnaires, and in the Health Professionals' Follow-up Study, we used data from the 1986 and 1990 questionnaires. For each 2-year follow-up period in which events were reported, we computed intake for each composite item as a cumulative average of intake from all available food-frequency questionnaires up to the start of the follow-up period. For participants who experienced angina, coronary artery bypass graft surgery or …

OBJECTIVE To test the feasibility of the "Rolling Store," an innovative food-delivery intervention, along with a nutrition education program to increase the consumption of healthy foods (fruits and vegetables) to prevent weight gain in African American women. METHODS Forty eligible African American women were enrolled in the study and randomized to intervention or control groups.
A trained peer educator and a Rolling Store operator implemented the study protocol at a local community center. RESULTS The program retention rate was 93%. Participants in the intervention group lost a mean weight of 2.0 kg, while participants in the control group gained a mean weight of 1.1 kg at six months. Overall, participants showed a mean decrease in weight of 0.4 kg (standard deviation 3.0 kg), but the intervention group lost significantly more weight and had a decreased body mass index at six months. In the intervention group, the average number of servings consumed per day of fruits/fruit juice and vegetables significantly increased at six months. CONCLUSIONS The Rolling Store, at least on the small scale on which it was implemented, is a feasible approach to producing weight loss and improvements in healthy eating when combined with an educational program in a small community center.

AIMS A higher intake of fruits and vegetables has been associated with a lower risk of ischaemic heart disease (IHD), but there is some uncertainty about the interpretation of this association. The objective was to assess the relation between fruit and vegetable intake and risk of mortality from IHD in the European Prospective Investigation into Cancer and Nutrition (EPIC)-Heart study. METHODS AND RESULTS After an average of 8.4 years of follow-up, there were 1636 deaths from IHD among 313 074 men and women without previous myocardial infarction or stroke from eight European countries. Participants consuming at least eight portions (80 g each) of fruits and vegetables a day had a 22% lower risk of fatal IHD [relative risk (RR) = 0.78, 95% confidence interval (CI): 0.65-0.95] compared with those consuming fewer than three portions a day.
After calibration of fruit and vegetable intake to account for differences in dietary assessment between the participating centres, a one-portion (80 g) increment in fruit and vegetable intake was associated with a 4% lower risk of fatal IHD (RR = 0.96, 95% CI: 0.92-1.00, P for trend = 0.033). CONCLUSION Results from this large observational study suggest that a higher intake of fruits and vegetables is associated with a reduced risk of IHD mortality. Whether this association is causal and, if so, the biological mechanism(s) by which fruits and vegetables operate to lower IHD risks remains unclear.

Electronic supermarket sales data provide a promising, novel way of estimating nutrient intakes. However, little is known about how these data reflect the nutrients consumed by an individual household member. A cross-sectional survey of 49 primary household shoppers (age [mean ± standard deviation] = 48 ± 14 years; 84% female) from Wellington, New Zealand, was undertaken. Three months of baseline electronic supermarket sales data were compared with individual dietary intakes estimated from four random 24-hour dietary recalls collected during the same 3-month period. Spearman rank correlations between household purchases and individual intakes ranged from 0.54 for percentage of energy from saturated fat (P < 0.001) to 0.06 for sodium (P = 0.68). Other correlation coefficients were: percentage of energy from carbohydrate, 0.48; protein, 0.44; energy density of nonbeverages, 0.37 (kcal/oz); percentage of energy from total fat, 0.34; sugar, 0.33 (oz/kcal); and energy density of beverages, 0.09 (oz/kcal; all P values < 0.05). This research suggests that household electronic supermarket sales data may be a useful surrogate measure of some nutrient intakes of individuals, particularly percentage of energy from saturated and total fat.
In the case of a supermarket intervention, an effect on household sales of percentage energy from saturated and total fat is also likely to impact the saturated and total fat intake of individual household members.

The aim of this study was to re-analyse a diet supplementation study conducted in the 1930s and investigate the effects of food supplementation on children's growth and later adult mortality. A non-randomised controlled trial was carried out in eight of the sixteen centres participating in the Carnegie Survey of Diet and Health in pre-war Britain (1937-39). Food supplements were given for 12 months either at school or as food parcels sent to the family home. 545 children aged 2-14 received food supplements and 494 children of a similar age acted as their controls. The children came from 465 families. The increase in childhood height and its components (leg length and trunk length) over 12 months of follow-up was measured. Mortality from all causes over 60 years of follow-up to 1998 was also assessed. There were important differences between fed and control children at baseline. Supplemented children came from larger families with poorer diets, and most were examined, on average, 12 days later than control children. After adjustment for baseline imbalances, those receiving supplements increased in height by 3.7 mm (95% CI 1.9-5.5) more than the controls. After adjustment, most of the difference in growth appeared to arise as a result of increases in leg length. After adjustment for confounding factors measured in childhood, no significant effect of childhood food supplements on adult mortality was seen. The age-adjusted hazard ratio for all-cause mortality was higher in the supplemented than in the control subjects: 1.13 (95% CI 0.77-1.64). We found that provision of childhood food supplements led to increased growth amongst supplemented children.
The increases in height in this study were mainly a result of increases in leg length and are similar to those found in a more recent randomised trial in South Wales. Whilst other analyses suggest that childhood height is important in predicting adult mortality patterns, we found no significant effect of childhood food supplements on adult mortality patterns in this study, although the study lacked statistical power to detect small but nevertheless important differences in mortality. Larger randomised trials with long-term follow-up would be required to investigate the impact of childhood food supplementation on adult health.

BACKGROUND Traditional methods to improve population diets have largely relied on individual responsibility, but there is growing interest in structural interventions such as pricing policies. OBJECTIVE The aim was to evaluate the effect of price discounts and tailored nutrition education on supermarket food and nutrient purchases. DESIGN A 2 x 2 factorial randomized controlled trial was conducted in 8 New Zealand supermarkets. A total of 1104 shoppers were randomly assigned to 1 of the following 4 interventions that were delivered over 6 mo: price discounts (12.5%) on healthier foods, tailored nutrition education, discounts plus education, or control (no intervention). The primary outcome was change in saturated fat purchased at 6 mo. Secondary outcomes were changes in other nutrients and foods purchased at 6 and 12 mo. Outcomes were assessed by using electronic scanner sales data. RESULTS At 6 mo, the difference in saturated fat purchased for price discounts on healthier foods compared with that purchased for no discount on healthier foods was -0.02% (95% CI: -0.40%, 0.36%; P = 0.91). The corresponding difference for tailored nutrition education compared with that for no education was -0.09% (95% CI: -0.47%, 0.30%; P = 0.66).
However, subjects who were randomly assigned to receive price discounts bought significantly more predefined healthier foods at 6 mo (11% more; mean difference: 0.79 kg/wk; 95% CI: 0.43, 1.16; P < 0.001) and at 12 mo (5% more; mean difference: 0.38 kg/wk; 95% CI: 0.01, 0.76; P = 0.045). Education had no effect on food purchases. CONCLUSIONS Neither price discounts nor tailored nutrition education had a significant effect on nutrients purchased. However, the significant and sustained effect of discounts on food purchases suggests that pricing strategies hold promise as a means to improve population diets.
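The primary outcome in the trial above, change in the percentage of energy from saturated fat purchased, can be derived directly from scanner sales data. A minimal sketch of that arithmetic, using the standard 9 kcal/g energy density of fat; the basket items and their nutrient totals below are hypothetical illustrations, not data from the trial:

```python
# Sketch: percentage of purchased energy coming from saturated fat,
# computed from per-item scanner data (hypothetical basket).

KCAL_PER_G_FAT = 9.0  # energy density of fat, kcal per gram


def pct_energy_from_sat_fat(purchases):
    """purchases: list of dicts with total kcal and saturated-fat grams per item."""
    total_kcal = sum(p["kcal"] for p in purchases)
    sat_fat_kcal = sum(p["sat_fat_g"] * KCAL_PER_G_FAT for p in purchases)
    return 100.0 * sat_fat_kcal / total_kcal


basket = [
    {"item": "butter", "kcal": 3700, "sat_fat_g": 257},  # hypothetical totals
    {"item": "bread", "kcal": 2650, "sat_fat_g": 5},
]
print(round(pct_energy_from_sat_fat(basket), 1))  # 37.1
```

A trial analysis would aggregate such per-household percentages by randomization arm and compare mean changes between arms, as reported above.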
2,099
15,106,262
REVIEWERS' CONCLUSIONS Reducing the number of daily doses appears to be effective in increasing adherence to blood-pressure-lowering medication and should be tried as a first-line strategy, although there is less evidence of an effect on blood pressure reduction.
BACKGROUND Lack of adherence to blood-pressure-lowering medication is a major reason for poor control of hypertension worldwide. Interventions to improve adherence to antihypertensive medication have been evaluated in randomised trials, but it is unclear which interventions are effective.
The efficacy of self-recording of blood pressure in the management of hypertension was assessed in a randomized clinical trial involving 140 persons who had been receiving antihypertensive therapy for a year or more but whose diastolic blood pressure had remained at 95 mm Hg or higher. To control for the increased attention implicit in self-recording, which might itself affect blood pressure, the patients were assigned at random to one of four groups: self-recording and monthly home visits, self-recording only, monthly home visits only, and neither self-recording nor monthly home visits. This design also permitted assessment of the effect of home visits. During the 6-month experiment no significant differences were apparent between the groups in either compliance or diastolic blood pressure. However, both self-recording and monthly home visits produced a reduction in blood pressure among patients who admitted to difficulty remembering to take their pills; a reduction was not seen among patients who said they had no such difficulty. This confirmed an earlier observation suggesting that this easily identified group of patients may be the most responsive to intervention programs.

Objectives: To compare compliance with an antihypertensive treatment administered either twice daily or three times daily. The two formulations of the antihypertensive treatment used (nicardipine), regular tablets (t.d.) and slow-release tablets (b.d.), are bioequivalent at the daily dosage used in the study. Study design: Open, controlled, parallel-designed study with centralised, randomised allocation to the treatment groups. TID group: a nicardipine 20 mg tablet three times daily for 3 months. BID group: a capsule of slow-release (SR) nicardipine, 50 mg twice daily for 3 months. Patients: 7274 hypertensive patients were investigated by 2651 general practitioners.
Compliance with the nicardipine was assessed by means of standardised interviews with the patients and by a questionnaire for the investigators. Results: Compliance was slightly higher in the BID than in the TID group; 71.2% and 24.5% of patients in the first group declared their compliance was 100% and 80%, compared with 82.3% and 15% in the second group. A statistically significant relationship was shown between compliance with nicardipine and the decrease in blood pressure after three months of therapy. However, no significant difference was noticed between the two groups of patients in the absolute decrease in blood pressure after the treatment period: 25.7/14.7 mm Hg in the TID group compared with 25.9/15.0 mm Hg in the BID group. Conclusions: A difference in compliance between the bioequivalent BID and TID formulations of the same active agent was shown in hypertensive patients. However, the difference was not large enough to lead to a difference either in the number of controlled patients or in the decrease in blood pressure. Reducing the number of daily doses does not automatically lead to greater efficacy of treatment.

Predictors of dropping out of care were examined for 171 treated hypertensive patients enrolled in a randomized trial of social support strategies designed to improve compliance and blood pressure control. Control patients who continued to receive routine care were more than twice as likely to drop out as patients who received routine care and periodic home visits by nurses or pharmacists (odds ratio [OR] = 2.7). The combination of home visits and a second intervention, having family members monitor patients' blood pressure and compliance behavior, was no more effective than home visits alone (OR = 1.1). The home visits intervention was one of six variables identified by a stepwise regression as significant predictors of dropping out.
Patients with four or more high-risk characteristics constituted 15% of the sample but contributed almost half (46%) of the dropouts. Targeting support strategies at high-risk patients may be a cost-effective means of reducing uncontrolled hypertension.

… among patients attending a primary health care (PHC) clinic in Soweto for antihypertensive drug therapy. In a 1-year follow-up study of an inception cohort of newly treated hypertensives, only 27% were compliant (defined as attending frequently enough to receive 80% or more of their required treatment), 42% were not seen after the first 3 months (early dropouts), and 22% continued to attend until the last 3 months.

The effects of metoprolol (Betaloc tablets) in a group of 193 hypertensives were compared with the effects of a slow-release formulation (Betaloc Durules) in a further group of 196 patients. Patients were selected at random for treatment. There were no differences between the groups in terms of age, weight, sex, blood pressure, concurrent illness or concomitant therapy. Blood pressure control and apparent adverse effects were similar for both groups; the overall withdrawal rate from each group was similar. Compliance, assessed by tablet counts, was significantly improved in the group receiving once-daily therapy. Simplification of the dosage regimen to once-daily therapy appears to improve the patient's willingness to comply with the physician's instructions.

A multifactorial health-education program designed to enhance compliance with a once-daily regimen of atenolol was evaluated among 453 patients enrolled in health maintenance organizations (HMOs).
The initiation of the 180-day study period was used to classify patients as either new or existing cases of hypertension. In turn, patients in these two categories were randomly assigned to a control or an experimental group. Patients assigned to the experimental groups received an enrollment kit upon exercising their initial prescription (new patients) or their first refill request (existing patients). The kit contained: a 30-day supply of atenolol; an educational newsletter about hypertension; information on nutrition and life-style changes; and an explanation of the intent and content of the program. Before the next scheduled prescription-refill date, each patient was contacted by telephone to inquire about his or her experience with the therapy and to stress the importance of adherence to the regimen. Each month thereafter, the newsletter and an enclosed prescription-refill reminder were mailed to each patient. The medication possession ratio, defined as the number of days' supply of atenolol obtained by a patient during the 180-day study period, was significantly (P ≤ 0.001) enhanced for the new and existing experimental groups relative to the control groups. Multiple regression analyses revealed that enrollment in the health-education program increased the number of days' supply of atenolol obtained by existing patients by 27 (P ≤ 0.001) and by new patients by 40 (P ≤ 0.001).

Low rates of adherence to hypertensive therapy limit patients' securing the full benefits of treatment. While some factors related to adherence have been identified, research on the effectiveness of interventions to increase adherence levels is sparse. The present study was designed to assess the impact of a series of different interventions on a group of some 400 patients, all under the care of private physicians in a small community.
A factorial design was employed to deliver four sequential educational interventions, about four months apart, to randomly selected subgroups. Interviews before and after each intervention provided information concerning self-reported adherence, health status, health beliefs, and personal characteristics. Pertinent medical records and pharmacy data were also obtained. The first intervention (printed material) did not significantly affect adherence. The second and fourth interventions (nurse telephone calls and social support) each increased medication taking, and the third intervention (self-monitoring) led to better weight control. There was no cumulative impact of the interventions, and different aspects of regimens were not significantly related to one another.

In this randomized controlled trial, the value of using occupational health nurses (OHNs) to monitor the care of hypertensive employees at work was compared with regular care (RC) delivered in the community. One year after entry, the blood pressure level, medication history, compliance with treatment, and cost of hypertensive care of the participants were determined by independent evaluators. The reduction in diastolic blood pressure (DBP), the measure of effectiveness, was 10.5 ± 1.1 mm Hg (mean ± SEM) in the OHN group and 7.7 ± 1.1 mm Hg in the RC group, and the proportion under good blood pressure control was 41.8% and 31.0%, respectively. These between-group differences were not statistically significant. Although the employees in the OHN group were more medicated and had a lower treatment dropout rate, neither difference was statistically significant. In addition, the proportion of employees who were compliant with prescribed medication was virtually identical in both groups.
The cost of the care received by employees in the OHN group, $404.14 for the year, was substantially higher than the $250.15 in the RC group, with the difference principally related to the cost of visiting the OHNs and a significant difference in drug cost (p < 0.006). The incremental cost-effectiveness (C/E) ratio of $53.67 per mm Hg DBP reduction per year for onsite blood pressure monitoring was higher than the base C/E ratio of $32.65 per mm Hg for regular care. Our findings indicate that monitoring the blood pressure of hypertensive employees at work is neither clinically effective nor cost-effective. (ABSTRACT TRUNCATED AT 250 WORDS)

A 1-year, randomized study was conducted to test the possibility of improving compliance with therapeutic regimens in hypertensives by means of certain simple arrangements. Patients were given written treatment instructions concerning hypertension, a personal blood-pressure follow-up card and, for those who failed to attend their blood-pressure check-up, an invitation for a new check-up. Using matched pairs, 202 Finnish hypertensives were randomly allocated either to an ordinary or a reorganized treatment group. By means of the latter system, patient compliance could be significantly (p < 0.01) improved: only 4% of the patients in this group dropped out of treatment, compared with 19% in the ordinary treatment group. By the end of the year, blood pressure had been lowered by at least 10% in 95% of the patients in the reorganized group and in 78% of those in the ordinary group (p < 0.01). This was achieved in approximately 60% of cases using chlorthalidone alone.

Poor compliance is a principal cause of treatment failure in hypertensive patients. Once-daily dosing improves compliance, but 24-h antihypertensive activity should be provided.
The compliance, efficacy, and safety of amlodipine and nifedipine slow-release (SR) were compared in patients with mild-to-moderate essential hypertension recruited among 24 centers in France. After a 2-week washout period, 103 patients were randomized to 12 weeks of 5 to 10 mg amlodipine once daily (n = 55) or 20 mg nifedipine SR twice daily (n = 48). Compliance was calculated by electronic drug monitoring. Efficacy was measured by ambulatory and casual BP recordings. Patients receiving amlodipine demonstrated better compliance than patients receiving nifedipine SR with respect to compliance index (the total number of doses taken divided by the total number of doses prescribed, expressed as a percentage; 98.3% v 87%; P < .0001), days on which the correct number of doses were taken (92.5% v 74.8%; P < .0001), and prescribed doses taken on schedule (88.7% v 71.6%; P < .0001). Absolute and relative therapeutic coverage were higher in patients receiving amlodipine than nifedipine SR (P < .0001). Mean SBP and DBP decreased equally in both groups, although amlodipine offered better BP control than nifedipine SR at specific times of day. Fewer patients had high nocturnal SBP with amlodipine (39.3%) than with nifedipine SR (71.4%; P = .042). Adverse events and treatment withdrawals occurred less frequently in amlodipine-treated patients than in nifedipine SR-treated patients. Amlodipine (5 to 10 mg) once daily provides improved compliance, better 24-h BP control, and fewer adverse events than 20 mg nifedipine SR twice daily in patients with mild-to-moderate hypertension.

A controlled, randomized study was conducted in two chain pharmacies to determine the clinical value of comprehensive pharmacy services for hypertensive patients in a chain pharmacy setting. Twenty-seven patients were enrolled as intervention participants, with 26 control subjects.
Monthly services for the intervention group included blood pressure and heart rate assessments and counseling on lifestyle modifications and drug therapy. Control patients received initial and final blood pressure measurements and minimal counseling. Both study and control groups completed quality-of-life questionnaires upon entering and completing the study. Results showed that blood pressure control was significantly improved in the study group. Compliance rates as well as energy/fatigue scores (a quality-of-life scale) improved in the study group compared with the control population. Community pharmacists in chain stores could have a beneficial effect on the health care of large numbers of patients if pharmaceutical care programs were developed.

This article reports a randomized controlled trial designed to test the effects of special packaging of antihypertensive medication on compliance and blood pressure control. One hundred eighty subjects who had exhibited elevated blood pressure greater than 90 mm Hg in the two years prior to the study were recruited from patients receiving care at a community hospital-based family medicine practice. After completing pre-enrollment interviews and blood pressure measurements, subjects were randomly assigned to receive their antihypertensive medications either in the usual vials or in special unit-dose reminder packaging. Follow-up interviews, pill counts, and blood pressure measurements were performed at three-month intervals. There were no statistically significant differences between the control and experimental groups with regard to age, sex, race, employment, education, marital status, insurance coverage, or blood pressure regimens. Prior to the intervention, the experimental group had slightly lower diastolic blood pressure and reported better compliance than the control group.
Analyses performed on 165 subjects completing the first follow-up visit revealed no significant improvements in blood pressure control or compliance for patients receiving special medication packaging. While some patients found it easy to remember to take pills packaged in this format, they also found the packages somewhat more difficult and inconvenient to use. In contrast to previously reported work, this study did not demonstrate any significant improvement in compliance with special packaging of antihypertensive medications.

This double-blind, double-dummy, randomized clinical trial, conducted in elderly patients with mild hypertension, compared adherence to treatment, efficacy, side effects, and quality of life during treatment with transdermal clonidine versus oral sustained-release verapamil (verapamil-SR). Blood pressure declined significantly, from 148/95 mm Hg at baseline to 139/84 after titration and 135/86 after maintenance, with transdermal clonidine (n = 29), and from 156/96 to 144/85 and 148/88, respectively, with verapamil-SR (n = 29). Adverse event rates and quality-of-life questionnaire responses were similar in the two treatment groups. Transdermal clonidine was worn as directed during more than 96% of patient-weeks of treatment. Compliance with the oral verapamil regimen was less consistent: verapamil-SR was taken as directed during approximately 50% of patient-weeks of therapy, and individual compliance, assessed by tablet counts, varied from 50-120%. In all, 86% of subjects were satisfied or highly satisfied with the convenience of transdermal therapy; 87% reported that side effects were slightly or not bothersome; 65% indicated that transdermal patches were more convenient than oral therapy; and almost 60% preferred transdermal to oral therapy. In this study transdermal clonidine and oral verapamil were equally safe and effective.
A substantial majority of patients preferred transdermal to oral therapy, and adherence to treatment was greater with transdermal therapy.

The importance of the number of tablets for patient compliance was investigated in 160 patients with mild-to-moderate essential hypertension treated with a beta-adrenoceptor blocker and a thiazide diuretic. Mean BP at entry was 146 ± 16/92 ± 8 mm Hg. All patients were given pindolol 10 mg and clopamide 5 mg, either in one combination tablet or in separate tablets, for 4 months. Approximately 90% of the patients took greater than 90% of the prescribed dose throughout the study. Mean BP decreased progressively and heart rate increased slightly. Side effects were more frequently reported during the first month of the study than previously, and 30 patients discontinued the treatment. No differences in this respect were seen between 1 and 2 tablets daily. Approximately 75% of the patients preferred 1 tablet daily, but combining two drugs in one tablet had no effect upon compliance.

This prospective, randomized, controlled study evaluated the impact of pharmacist-initiated home blood pressure monitoring and intervention on blood pressure control, therapy compliance, and quality of life (QOL). Subjects were 36 patients with uncontrolled stage 1 or 2 hypertension. Eighteen subjects received home blood pressure monitors, a diary, and instructions to measure blood pressure twice every morning. Home measurements were evaluated by a clinical pharmacist by telephone, and the patient's family physician was contacted with recommendations if mean monthly values were 140/90 mm Hg or higher. Eighteen control patients did not receive home monitors or pharmacist intervention. Office blood pressure measurements and QOL surveys (SF-36) were obtained at baseline and after 6 months.
Mean absolute reductions in systolic and diastolic pressures were significantly reduced from baseline in intervention subjects (17.0 and 10.5 mm Hg, both p < 0.0001) but not in controls (7.0 and 3.8 mm Hg, p = 0.12 and p = 0.09). More intervention subjects (8) had blood pressure values below 140/90 at 6 months compared with controls (4). During the study 83.3% (15) of intervention subjects had drug therapy changes versus 33% (6) of controls (p < 0.01). Compliance and QOL were not significantly affected. Our data suggest that the combination of pharmacist intervention with home monitoring can improve blood pressure control in patients with uncontrolled hypertension. This may be related to increased modifications of drug regimens.

38 hypertensive Canadian steelworkers who were neither compliant with medications nor at goal diastolic blood-pressure six months after starting treatment were allocated either to a control group or to an experimental group who were taught how to measure their own blood-pressures, asked to chart their home blood-pressures and pill taking, and taught how to tailor pill taking to their daily habits and rituals; these men were also seen fortnightly by a high-school graduate with no formal health professional training who reinforced the experimental manoeuvres and rewarded improvements in compliance and blood-pressure. Six months later, average compliance had fallen by 1.5% in the control group but rose 21.3% in the experimental group. Blood-pressures fell in 17 of 20 experimental patients (to goal in 6) and in 10 of 18 control patients (to goal in 2).

Information on indices of coronary heart disease (CHD) and myocardial infarction (MI) (angina pectoris by Rose Questionnaire, MI by Rose Questionnaire, history, and electrocardiogram) was obtained in the Hypertension Detection and Follow-Up Program (U.S. National Institutes of Health) at baseline, Year 2, and Year 5 of follow-up.
The presence of any of these findings at baseline markedly increased all-cause mortality during the 5 years of observation. In individuals with negative findings at baseline, the 5-year incidence of MI and angina pectoris by these indices was less in the Stepped Care than Referred Care cohort. These results are compatible with the conclusion that antihypertensive therapy reduces the incidence of symptomatic CHD.

Screening of 6,144 patients in a general practice clinic to assist physician case-finding uncovered 983 (16%) who were uncontrolled hypertensives. Following physician recommendation, 115 patients volunteered for a controlled trial to test the effectiveness of supplementary strategies to the pharmaceutical management of high blood pressure. A study of nonparticipants indicated that about 7% of the practice population was eligible for cardiovascular health education. One group received a health education program, a second was allocated to self-monitor their blood pressure for 6 months, a third group was allocated to both strategies, and the final group, acting as a control, continued to receive their usual care. Physician monitoring of patients continued for the duration of the study and blood pressures decreased in all patients. The study's most important outcome was the joint reduction of blood pressure and medication strength. These were assessed by a "blind" clinician before and after the interventions according to criteria set out in the "stepped-care" approach to management of high blood pressure. People allocated to a health education program conducted in the doctor's common room did twice as well on this measure as those who were not so educated. Daily self-monitoring of blood pressure for 6 months proved to be too much for the majority of those so instructed.
It is concluded that the general practice setting remains an important place for health education to prevent cardiac disease, and suggestions are made for incorporating this into everyday practice.

A two-phase study was conducted to assess the effect of an electronic medication compliance aid on hypertension control and pharmaceutical compliance in ambulatory patients. In Phase I (12 weeks), 36 patients were randomly assigned to a medication vial equipped with a cap containing a digital timepiece that displays the last time the cap was removed. The control group included 34 patients randomly assigned to a standard medication vial. Subjects using the timepiece cap showed an average compliance rate of 95.1%, an average decrease in systolic pressure of 7.6 mm Hg (P = .006), and an average decrease in diastolic pressure of 8.8 mm Hg (P < .001). Controls had an average compliance rate of 78% and decreases of 2.8 mm Hg and 0.2 mm Hg in systolic and diastolic pressures, respectively. Phase II (12 weeks) combined use of the timepiece cap with other compliance aids: a pocket-size card for recording blood pressure and a blood pressure cuff for self-monitoring. Patients using the timepiece cap and the card had an average compliance rate of 98.7% with mean decreases of 11 mm Hg in systolic pressure (P < .01) and 7.64 mm Hg in diastolic pressure (P = .0001). The combined use of the cap, the card, and the blood pressure cuff resulted in an average 100.2% compliance rate with mean decreases of 15 mm Hg (P = .0006) and 6.60 mm Hg (P = .0006) in systolic and diastolic pressures, respectively.
Results of the two-phase study showed statistically significant increases in medication compliance associated with statistically and clinically significant reductions in blood pressure for all patients using the timepiece cap.

A randomised trial was undertaken to discern the effect of pharmacy-based value-added utilities on prescription refill compliance with antihypertensive therapy and subsequent health care expenditures. The subjects were 304 Medicaid beneficiaries from the state of Florida, previously untreated for mild to moderate hypertension, prescribed 240 mg of the calcium channel antagonist verapamil once daily and monitored regarding prescription refill compliance and health service utilisation for one year. Subjects provided informed consent and were randomly assigned to one of four experimental groups: (1) the control cohort received standard pharmaceutical care with each dispensing of antihypertensive therapy, (2) the second cohort received standard pharmaceutical care and was mailed a medication-refill reminder ten days prior to each sequential refill date, (3) the third cohort received standard pharmaceutical care and was provided unit-of-use packaging with each prescription-refill request, and (4) the fourth cohort received standard pharmaceutical care, mailed medication-refill reminders and unit-of-use packaging. Analysis of variance (ANOVA) procedures revealed that patients receiving mailed prescription-refill reminders, unit-of-use packaging or a combination of both interventions achieved a significant (P ≤ 0.05) increase in the Medication Possession Ratio (MPR) for antihypertensive therapy relative to controls. Receipt of both interventions resulted in a significant (P ≤ 0.05) improvement in the MPR for antihypertensive therapy relative to all other groups; no significant difference was discerned between groups receiving either mailed prescription-refill reminders or unit-of-use packaging.
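The Medication Possession Ratio used as the refill-compliance endpoint above is conventionally computed as total days' supply dispensed divided by the number of days in the observation period; the exact operationalisation is not stated in the abstract, so the helper below is an illustrative sketch under that common assumption.

```python
from datetime import date

def medication_possession_ratio(fills, period_start, period_end):
    """Illustrative MPR: total days' supply dispensed during the period
    divided by the number of days in the period, capped at 1.0 (100%).

    fills: iterable of (fill_date, days_supply) pairs.
    """
    period_days = (period_end - period_start).days + 1
    supplied = sum(days for fill_date, days in fills
                   if period_start <= fill_date <= period_end)
    return min(supplied / period_days, 1.0)

# Four 90-day refills over a 365-day year -> MPR = 360/365, about 0.986
fills = [(date(2023, 1, 1), 90), (date(2023, 4, 1), 90),
         (date(2023, 7, 1), 90), (date(2023, 10, 1), 90)]
print(round(medication_possession_ratio(fills, date(2023, 1, 1), date(2023, 12, 31)), 3))
```

Capping at 1.0 reflects the common convention of not letting early refills push "compliance" above 100%, though some studies (like the timepiece-cap trial above, which reports 100.2%) leave the ratio uncapped.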
(ABSTRACT TRUNCATED AT 250 WORDS)

Compliance was compared in 52 previously noncompliant hypertensive patients randomly assigned for eight weeks to either a nurse-operated hypertension clinic (control) or a patient-operated hypertension group (experimental). Control patients listened to audiotapes on hypertension and its management and met individually with a nurse who adjusted their drug regimens. Experimental patients were trained to take their own blood pressure (BP) and select their own drugs in a group program emphasizing informed self-help. After the eight-week training period and at two- and six-month follow-up visits, both groups had significantly lower BPs. Compared with control patients, experimental patients had lower diastolic BPs, better pill counts, and better attendance (all P < .05). This study suggests that training noncompliant patients in groups to manage their own hypertension may achieve better results than traditional management programs.

Four compliance strategies were compared with education alone to investigate their impact on the control of high blood pressure. One hundred twelve subjects with documented high blood pressure were randomly assigned to receive education alone, home blood pressure monitoring, contracts, pill packs, or a combination of techniques. Groups were similar in terms of age, sex, race, initial blood pressure, and medications. At the end of the year, there was no significant change in blood pressure for the group that received education alone (-3/-1 mm Hg). There was a statistically significant change in both systolic and diastolic blood pressure for all compliance groups (-17/-10 mm Hg). Information from compliance questionnaires adds further support to the observation that education alone does not influence compliance while the specific techniques studied did improve compliance.
The study was too small to show any difference among techniques.

The clinical efficacy of using specially trained nurses to treat hypertension at the patient's place of work was compared in a controlled trial with management by the patient's family doctor. The 457 study participants were selected from 21,906 volunteers in industry and government whose blood-pressure was screened. The nurses were allowed to prescribe and change drug therapy at the work site without prior physician approval. Patients randomly allocated to receive care at work were significantly more likely to be put on antihypertensive medications (94.7% vs 62.7%), to reach goal blood-pressure in the first six months (48.5% vs 27.5%), and to take the drugs prescribed (67.6% vs 49.1%). Only 6% of patients were dissatisfied with the care provided by the nurses. Thus provision of care at work by specially trained nurses was well accepted and resulted in significantly improved blood-pressure control and medication compliance among employees with asymptomatic and uncomplicated hypertension.

The associations of diastolic blood pressure (DBP) with stroke and with coronary heart disease (CHD) were investigated in nine major prospective observational studies: in total 420,000 individuals, 843 strokes, and 4,856 CHD events, with 6-25 (mean 10) years of follow-up. The combined results demonstrate positive, continuous, and apparently independent associations, with no significant heterogeneity of effect among different studies. Within the range of DBP studied (about 70-110 mm Hg), there was no evidence of any "threshold" below which lower levels of DBP were not associated with lower risks of stroke and of CHD. Previous analyses have described the uncorrected associations of DBP measured just at "baseline" with subsequent disease rates.
But, because of the diluting effects of random fluctuations in DBP, these substantially underestimate the true associations of the usual DBP (ie, an individual's long-term average DBP) with disease. After correction for this "regression dilution" bias, prolonged differences in usual DBP of 5, 7.5, and 10 mm Hg were respectively associated with at least 34%, 46%, and 56% less stroke and at least 21%, 29%, and 37% less CHD. These associations are about 60% greater than in previous uncorrected analyses. (This regression dilution bias is quite general, so analogous corrections to the relations of cholesterol to CHD or of various other risk factors to CHD or to other diseases would likewise increase their estimated strengths.) The DBP results suggest that for the large majority of individuals, whether conventionally "hypertensive" or "normotensive", a lower blood pressure should eventually confer a lower risk of vascular disease.

This study aimed to compare the efficacy of a patient-directed management strategy with office-based management in maintaining blood pressure control in patients with chronic stable hypertension, using a randomized trial of two months' duration. The subjects had chronic stable essential hypertension without secondary causes or unstable cardiovascular disease and were selected through the offices of 11 family physicians and a tertiary care hypertension research unit. Patients were randomly assigned (2:1 ratio) to either a patient-directed management strategy using home blood pressure monitoring to adjust drug therapy if readings consistently exceeded defined limits, or office-based management through physician visits. The primary endpoint was the change from baseline in mean arterial pressure as determined by automatic ambulatory blood pressure monitoring. Secondary endpoints were changes in compliance, quality of life, and health care resource use.
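The "regression dilution" correction in the DBP analysis above amounts to dividing the observed log relative risk by a dilution ratio (the fraction of the true slope retained when a single baseline measurement stands in for usual DBP). The sketch below shows only that arithmetic; the ratio of 0.625 is an illustrative assumption chosen to reproduce the roughly 60% strengthening the abstract reports, not a value taken from the study.

```python
import math

def correct_for_regression_dilution(observed_reduction, dilution_ratio):
    """Scale an observed proportional risk reduction up to the 'usual DBP'
    association by dividing the log relative risk by the regression
    dilution ratio. Illustrative arithmetic only."""
    log_rr = math.log(1.0 - observed_reduction)  # observed log relative risk
    return 1.0 - math.exp(log_rr / dilution_ratio)

# An observed ~23% lower stroke risk per 5 mm Hg becomes ~34% after
# correction with an assumed dilution ratio of 0.625.
print(round(correct_for_regression_dilution(0.23, 0.625), 2))
```

Because the correction operates on the log scale, a 60% strengthening of the log relative risk turns a 23% risk reduction into about 34%, matching the pattern of the corrected figures quoted above.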
Ninety-one potential subjects were screened and 31 were randomized. Subjects in the patient-directed management group employed the drug adjustment protocols appropriately without complications. A significant difference in change in mean blood pressure was observed, favoring patient-directed management (-0.95 mm Hg and +1.90 mm Hg, respectively, for patient-directed management and office-based management, P = .039). Compliance rates and quality of life scores were not significantly different between groups. Physician visits were more frequent in the patient-directed management group (1.05 v 0.20 visits/8 weeks, respectively, for the patient-directed management and office-based management groups, P = .045). A patient-directed hypertensive management strategy may be feasible for patients with chronic stable hypertension. Such a strategy may improve blood pressure control compared with usual office-based care. However, physician visits may be increased using this strategy, at least in the short term.

The objective was to compare the compliance of hypertensive patients treated with captopril twice daily or trandolapril once daily. After a 2-week placebo period, hypertensive patients (diastolic BP 95-115 mm Hg) were randomly allocated to trandolapril 2 mg once daily or to captopril 25 mg twice daily for 6 months. Trandolapril and captopril were packed in electronic pill-boxes equipped with a microprocessor that recorded the date and time of each opening (MEMS). Patients' compliance was assessed both by standard pill-count and by electronic monitoring. Blood pressure was measured using a validated semi-automatic device at the end of the placebo period and of the treatment period. One hundred sixty-two patients entered the study. Compliance data were evaluable for 133 patients (62 in the captopril group and 71 in the trandolapril group). Treatment groups were comparable at baseline except for age (P = .046).
Using the electronic pill-box, overall compliance was 98.9% in the trandolapril group and 97.5% in the captopril group (P = .002). The percentage of missed doses was 2.6% in the trandolapril group and 3.3% in the captopril group (P = .06). The percentage of delayed doses was 1.8% in the trandolapril group and 11.7% in the captopril group (P = .0001). The percentage of correct dosing periods, ie, a period with only one correct recorded opening, was 94.0% in the trandolapril group and 78.1% in the captopril group (P = .0001). Results were unchanged when adjusted for age. At the end of the study, 41% of patients in the trandolapril group and 27% in the captopril group (NS) had their blood pressure normalized (systolic BP < 140 and diastolic BP < 90 mm Hg). In this 6-month study, the electronic pill-box allowed refined analysis of the compliance of hypertensive patients. Patients' compliance with once-daily trandolapril was higher than with twice-daily captopril. The between-group difference is mainly explained by an increase in delayed doses in the twice-daily group.

The objective of this study was to determine the relationship between prescribed daily dose frequency and patient medication compliance. The medication compliance of 105 patients receiving antihypertensive medications was monitored by analyzing data obtained from special pill containers that electronically record the date and time of medication removal. Inaccurate compliance estimates derived using the simple pill count method were thereby avoided. Compliance was defined as the percent of days during which the prescribed number of doses were removed. Compliance improved from 59.0% on a three-times-daily regimen to 83.6% on a once-daily regimen. Thus, compliance improves dramatically as prescribed dose frequency decreases.
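The compliance endpoint defined above (percent of days on which the prescribed number of doses was removed) can be computed directly from electronic-cap opening timestamps. The helper below is a sketch under the assumptions that timestamps are Python datetimes and that one recorded opening corresponds to one dose taken; neither study specifies its exact algorithm.

```python
from collections import Counter
from datetime import date, datetime, timedelta

def percent_correct_days(openings, doses_per_day, start, end):
    """Percent of days in [start, end] on which the container was opened
    exactly the prescribed number of times (sketch of the electronically
    monitored compliance endpoint)."""
    per_day = Counter(ts.date() for ts in openings)
    total = (end - start).days + 1
    correct = sum(per_day.get(start + timedelta(days=i), 0) == doses_per_day
                  for i in range(total))
    return 100.0 * correct / total

# Once-daily regimen, doses recorded on 2 of 3 monitored days -> 66.7%
openings = [datetime(2024, 5, 1, 8, 0), datetime(2024, 5, 3, 8, 5)]
print(round(percent_correct_days(openings, 1, date(2024, 5, 1), date(2024, 5, 3)), 1))
```

Iterating over every calendar day (rather than only days with openings) is what lets the measure penalize fully missed days, which a simple pill count cannot distinguish from dose dumping.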
Probably the single most important action that health care providers can take to improve compliance is to select medications that permit the lowest daily prescribed dose frequency.

230 Canadian steelworkers with hypertension took part in a randomised trial to see if compliance with antihypertensive drug regimens could be improved. For care and follow-up these men were randomly allocated to see either their own family doctors outside working hours or industrial physicians during work shifts; the same men were randomly allocated to receive or not receive an educational programme aimed at instructing them about hypertension and its treatment. Surprisingly, the convenience of follow-up at work had no effect upon these men's compliance with antihypertensive drug regimens. Similarly, although men receiving health education learned a lot about hypertension, they were not more likely to take their medicine.

This study was conducted to evaluate the effect of automated telephone patient monitoring and counseling on patient adherence to antihypertensive medications and on blood pressure control. A randomized controlled trial was conducted in 29 greater Boston communities. The study subjects were 267 patients recruited from community sites who were ≥ 60 years of age, on antihypertensive medication, with a systolic blood pressure (SBP) of ≥ 160 mm Hg and/or a diastolic blood pressure (DBP) of ≥ 90 mm Hg. The study compared subjects who received usual medical care with those who used a computer-controlled telephone system in addition to their usual medical care during a period of 6 months. Weekly, subjects in the telephone group reported self-measured blood pressures, knowledge and adherence to antihypertensive medication regimens, and medication side effects. This information was sent to their physicians regularly.
The main study outcome measures were change in antihypertensive medication adherence, SBP and DBP during 6 months, satisfaction of patient users, perceived utility for physicians, and cost-effectiveness. The mean age of the study population was 76.0 years; 77% were women; 11% were black. Mean antihypertensive medication adherence improved 17.7% for telephone system users and 11.7% for controls (P = .03). Mean DBP decreased 5.2 mm Hg in users compared to 0.8 mm Hg in controls (P = .02). Among nonadherent subjects, mean DBP decreased 6.0 mm Hg for telephone users, but increased 2.8 mm Hg for controls (P = .01). For telephone system users, mean DBP decreased more if their medication adherence improved (P = .03). The majority of telephone system users were satisfied with the system. Most physicians integrated it into their practices. The system was cost-effective, especially for nonadherent patient users. Therefore, weekly use of an automated telephone system improved medication adherence and blood pressure control in hypertension patients. This system can be used to monitor patients with hypertension or with other chronic diseases, and is likely to improve health outcomes and reduce health services utilization and costs.

Objective To assess the current levels of awareness, treatment and control of hypertension in England and to determine the number and type of drugs prescribed. Design A cross-sectional household-based survey of English adults. Subjects A random sample from the adult English population of 12,116 adults who participated in the 1994 Health Survey for England. Main outcome measures Prevalences of hypertension treatment, awareness and control. Results Using a definition of hypertension as a systolic blood pressure ≥ 160 mmHg or a diastolic blood pressure ≥ 95 mmHg, or a patient's being administered antihypertensive treatment, the prevalence of awareness of hypertension was 63%.
Among hypertensives, 50% were receiving treatment and 30% had their hypertension controlled (< 160 mmHg/95 mmHg). Awareness, treatment and control rates are considerably lower than the most recently reported rates from the USA. Diuretics and β-blockers remain the most common antihypertensive agents used in England. Conclusion There is considerable scope for improving the treatment and control of hypertension in the English adult population.

Using a factorial design, four aspects of an educational program for 160 hypertensive patients were manipulated: number of meetings, patient responsibility and participation, directiveness of the intervention, and emphasis on negative consequences of uncontrolled hypertension. Validity checks on the manipulations included content analysis of the nurse-patient interaction and interview-based measures of the patient's responsibility, participation, and awareness of dangers. Outcome variables included repeated measures of patient knowledge, assessment by the nurse of patient attainment of identified goals, and reduction of the patient's mean arterial blood pressure. High indirect interventions tended to lead to higher goal attainment, particularly in the psychosocial area. Emphasis on negative consequences tended to promote learning for patients with long-standing diagnoses, but to retard learning for recently diagnosed patients. Additional meetings and emphasis on patient responsibility were not helpful alone, but in combination they tended to lead to greater learning. Although as a whole patients in the program tended to reduce their blood pressures, there were no statistically significant main effects or interaction effects of the educational approach variables on blood pressure reduction.

The effects of an educational program on compliance and blood pressure (BP) control were assessed in 47 hypertensive patients hospitalized for nonhypertension-related diseases.
Patients were randomized to receive either a questionnaire and an educational program (group I, 25) or questionnaire only (group II, 22). Baseline clinical characteristics, admission diagnoses and antihypertensive medications were similar between the groups. Antihypertensive medications used by patients before the trial were not changed. Eight weeks after the initial intervention, patients in group I showed a significant reduction in both systolic and diastolic BP (137/89 vs 154/98 mm Hg, p = 0.005 and 0.006, respectively) and improved compliance (96 vs 36%, p = 0.04), compared with patients in group II. An education program in patients with high BP is an effective method to improve compliance and BP control in the short term.

OBJECTIVE To compare enalapril 20 mg once daily with 10 mg twice daily in terms of blood pressure reduction and patient compliance. DESIGN Cross-over study of patients randomly assigned to a sequence of enalapril 20 mg once daily or 10 mg twice daily in three 4-week periods following a 4-week placebo run-in. SETTING General practices in the greater Belfast and Lisburn area in Northern Ireland. PATIENTS Twenty-five hypertensive patients who had a mean diastolic blood pressure of between 90 and 110 mm Hg after receiving placebo for 4 weeks. MAIN OUTCOME MEASURES Reduction in blood pressure and estimation of patient compliance. RESULTS Patient compliance was superior on the once-daily regimen. However, the twice-daily regimen was associated with a greater blood pressure reduction which almost reached statistical significance at the 5% level. CONCLUSIONS Enalapril 20 mg should be prescribed as 10 mg twice daily and measures taken to improve patient compliance.

OBJECTIVE To evaluate the impact of pharmaceutical care on selected clinical and economic outcomes in patients with hypertension or chronic obstructive pulmonary disease (COPD) in ambulatory care settings.
DESIGN Clinic patients with hypertension or COPD were randomly assigned to a treatment group (pharmaceutical care) or a control group (traditional pharmacy care) over a six-month period. Clinical pharmacists and pharmacy residents conducted the protocols. There were 133 evaluable patients (63 treatment and 70 control) in the hypertension study arm and 98 evaluable patients (43 treatment and 55 control) in the COPD study arm. SETTING 10 Departments of Veterans Affairs medical centers and 1 academic medical center. INTERVENTIONS Patient-centered pharmaceutical care model (employing standardized care) implemented by clinical pharmacy residents. MAIN OUTCOME MEASURES Patient knowledge, medication compliance, and health resource use. RESULTS The hypertension treatment group had a significantly greater reduction in systolic blood pressure from visit 1 to visit 5 than did the control group. In the COPD study arm, trends were positive in the treatment group for patients' ratings of symptom interference with activities and dyspnea measures. There was a significant difference between the hypertension treatment and control groups for compliance. There were no significant changes in compliance scores in the COPD study arm. The mean number of hospitalizations and other health care provider visits was higher for the hypertension control group. For patients with COPD, hospitalizations increased in the control group, and the number of other health care provider visits was higher in the control group. CONCLUSION Pharmacists' participation in a pharmaceutical care program resulted in disease state improvement in ambulatory patients with hypertension and COPD.

A group of investigators at a major teaching hospital have prepared a study designed to test the hypothesis that patient outcomes related to hypertension will be improved by increased compliance, as defined by consistent daily compliance with the prescribed medical regimen.
The study will test the effectiveness of the following interventions in improving compliance: 1) improved access to medication, 2) computerized phone follow-up, and 3) one-on-one follow-up conversations concerning compliance issues. This paper is intended to describe the pending study in detail while focusing on the computerized phone system that will be used in the treatment group of the controlled study for compliance.

Abstract To study patient compliance in hypertensive outpatients, amlodipine (5 mg once daily) and slow-release nifedipine (20 mg twice daily) were compared in an open, crossover study in general practices. Four methods of assessment for patient compliance (pill count, taking compliance, days with correct dosing, timing compliance) were used in both study arms. For the latter three assessments a special device, the medication event monitoring system, was used to record the time and date of each opening and closure of the container. The compliance of the 320 hypertensive patients with once-daily amlodipine was markedly superior to twice-daily slow-release nifedipine. Therapeutic coverage was also significantly better for amlodipine in the hypertensive patients. Amlodipine was better tolerated than nifedipine slow release. Patient compliance and therapeutic coverage with the calcium antagonist amlodipine given once daily were superior to slow-release nifedipine b.d. in hypertensive outpatients recruited in general practice. Statistical Unit: Léon Kaufmann, Marie-Paule Derde, Data Investigation Company Europe, Brussels. Participating Investigators: D. Abbate, G. Armand, C.I. Authelet, J.L. Badot, J. Baeck, P. Baeck, P. Bastin, C.I. Bernard, P. Bernard, B. Beyssens, J. Bosly, P. Boudart, J. Bourdeaudhuy, W. Callens, L. Carolides, Y. Catry, E. Cerstelotte, F. Charlier, H. Charloteaux, J.M. Chaudron, L. Christiaen, G. Cornette, P. Cranskens, R. Creteur, N. De Cock, M. De Corte, A. De Vos, P.
Defrance, P. Delhaye, G. Deneckere, M. Dobbeleir, A. Dufour, P. Dumont, L. D'Haen, H. D'Haenens, P. Eloy, P. Evrard, C. Fellemans, G. Geeraerts, L. Gielen, D. Grand, J. Grosjean, J. Guffens, R. Guillaume, R. Hacquaert, V. Hamoir, W. Hens, M. Hondeghem, M.C. Humblet-Koch, L. Leven, W. Janssens, L. Jeanfils, J. Jodogne, B. Jortay, W. Ketels, J.M. Krzesinski, E. Langendries, J. Lannoy, M. Leeman, J. Leire, P. Lempereur, L. Lenaerts, F. Lustman, R. Martens, Y. Maus, M. Meroueh, J.P. Meurant, P. Meurant, A. Michiels, E. Mievis, H. Moors, K. Naesens, P. Neels, J. Neven, W. Odeurs, W. Pardon, M. Peduzzi, J. Piette, D. Plessers, P. Putzeys, A. Quoidbach, A. Renaerts, G. Rits, M. Ruhwiedel, M. Salavracos, M. Seret, P. Sibille, M. Taziaux, J. Teucq, H. Therasse, F. Tihon, F. Vandenput, J. Van Elsen, J.P. Van Liefferinge, J. Van Neck, M. Van Pelt, T. Van Vlaenderen, G. Vandenbeylaardt, M. Vandewoude, F. Veldeman, D. Ven, F. Verbruggen, A. Vlaeminck, P. Werion, J. Westerlinck