Pregnancy duration and endometrial cancer risk: nationwide cohort study
Full-term pregnancies are strongly associated with a reduced risk of endometrial cancer. Whether this protective association is driven by fecundity, a specific process that occurs during pregnancy, or the cumulative number of months of pregnancy, however, is unknown. In this cohort study, 2,311,332 Danish women born between 1935 and 2002 were followed for an average of 24.8 years to explore the association between pregnancy duration and risk of endometrial cancer. These women had 3,947,650 pregnancies, of which 671,560 were induced abortions (median gestational age 8 weeks, IQR 7-9 weeks) and 3,276,090 were births (median gestational age 40 weeks, IQR 39-41 weeks). During study follow-up, 6,743 women developed endometrial cancer. Researchers found that both induced abortions and births after a first pregnancy were strongly associated with a reduced risk of endometrial cancer after adjustment for age, time period, and socioeconomic factors (induced abortion: RR 0.53, 95% CI 0.45 to 0.64; childbirth: RR 0.66, 95% CI 0.61 to 0.72). Each subsequent pregnancy was also associated with risk reduction, whether it ended in induced abortion (RR 0.81, 95% CI 0.77 to 0.86) or childbirth (RR 0.86, 95% CI 0.84 to 0.89). There was no difference in the relative risk of endometrial cancer when stratifying by age at first pregnancy (p=0.69) or time since pregnancy (less than 10 years vs. 10 years or greater, p=0.94). The results were also not affected by obesity, birth cohort, or fecundity. This study was limited by the lack of data regarding lifetime use of oral contraceptives, which has also been associated with a decreased risk of endometrial cancer. Overall, this study indicates that pregnancy is associated with a reduced risk of endometrial cancer, regardless of whether the pregnancy ends in induced abortion or birth, suggesting that the driving protective factor is either a woman's fecundity or a biological process occurring early in gestation.
While it is known that suboptimal chest compressions are associated with worse outcomes after cardiac arrest, the optimal combination of chest compression rate (CCR) and chest compression depth (CCD), and whether this combination should be applied to all patients despite differences in age, sex, and presenting cardiac rhythm, is unknown. In this cohort study, 3,643 patients who had an out-of-hospital cardiac arrest and for whom CCR and CCD had been simultaneously recorded during the first 5 minutes of CPR were studied to determine the optimal CCR-CCD combination associated with functionally favorable survival (defined as a modified Rankin scale score of 3 or less at the time of hospital discharge). Data were obtained from the National Institutes of Health (NIH) Resuscitation Outcomes Consortium Prehospital Resuscitation Impedance Valve and Early Versus Delayed Analysis (ROC PRIMED) multicenter trial, in which patients were assigned to receive conventional CPR using either a sham (inactive) impedance threshold device (ITD) or an active ITD, with CCR and CCD measurements collected electronically using sensors linked to electrocardiographic monitor-defibrillators. At baseline, 64.4% of the 3,643 patients were men, and the mean (SD) age was 67.5 (15.7) years. CPR was performed by bystanders in 41.9% of patients, and 91.1% of patients received at least one prehospital dose of epinephrine. The first cardiac rhythm recorded was asystole in 47.8%, ventricular fibrillation or ventricular tachycardia in 24.5%, and pulseless electrical activity (PEA) in 23.7%. Researchers found that the optimal CCR-CCD combination associated with the greatest probability of a favorable functional outcome was 107 compressions per minute (cpm) and 4.7 cm, with little difference across subgroups (age, sex, or cardiac rhythm). Survival probability was significantly higher with CPR performed within 20% of this combination (86 to 128 cpm, 3.8 to 5.6 cm) than outside of this range (6.0% vs. 
4.3%, respectively, OR 1.44, 95% CI 1.07 to 1.94, p=0.02). Finally, survival probability with the identified optimal CCR-CCD was higher with active ITD use than with sham ITD use (9.6% vs. 5.3%, respectively, OR 1.90, 95% CI 1.06 to 3.38, p=0.03), and device effectiveness depended on being near the optimal CCR-CCD combination. In summary, this study suggests that a chest compression rate of 107 cpm and a depth of 4.7 cm improve outcomes in out-of-hospital cardiac arrest, especially with the use of an ITD, though further investigation and prospective validation are warranted.
Genetically engineered T-cell therapies, such as chimeric antigen receptor T-cells (CAR T-cells), have been approved for the treatment of hematologic malignancies; however, there are limited data to support this approach in the management of epithelial cancers. Human papillomavirus (HPV)-associated epithelial cancers express viral E6 and E7 oncoproteins, which are absent from healthy tissues, making them appealing targets for T-cell therapies. In this phase I/II single-center trial, 12 patients with metastatic HPV16-positive cancer from any primary tumor site who had received prior platinum-based therapy were treated with autologous genetically engineered T-cells expressing a T-cell receptor (TCR) directed against HPV16 E6 to assess safety, with secondary end points of objective tumor response rate and duration of response. The TCR was a high-avidity TCR directed against an epitope of HPV16 E6 presented by HLA-A*02:01. Autologous genetically engineered T-cells were created by transducing peripheral blood mononuclear cells with an E6 TCR retrovirus, and the conditioning regimen consisted of cyclophosphamide and fludarabine. At baseline, 6 patients had cervical cancer, 4 had anal cancer, 1 had oropharyngeal cancer, and 1 had vaginal cancer. Three of the cervical cancers were adenocarcinomas, and the remainder of the tumors were squamous cell carcinomas (SCC). The median age was 50 years (range 32 to 70 years), and two patients had previously received immunotherapy (one received checkpoint blockade with an anti-PD-1 agent, and one received adoptive T-cell therapy with tumor-infiltrating lymphocytes). Researchers observed no autoimmune adverse events or off-target toxicities attributable to E6 TCR T-cells. Two of the 12 patients attained objective tumor responses, both of whom had anal SCC (Patients 5 and 10). Patient 5 ultimately underwent resection of lung metastases that had regressed with E6 TCR T-cell treatment and had no evidence of disease 3 years after treatment. 
This patient was also the only patient with E6 TCR T-cells detected in a post-treatment tumor biopsy. Patient 10 experienced a partial response lasting 3 months before progression of a non-target lesion. All patients demonstrated peripheral blood engraftment with E6 TCR T-cells one month after treatment (median 30%, range 4% to 53%). Whole-exome sequencing of the post-treatment tumor biopsy in a patient who did not respond to treatment revealed a frameshift deletion in IFNGR1, and a tumor of a patient who experienced rapid cancer progression after treatment was found to have allelic imbalance with loss of HLA-A*02:01. Neither of these mutations occurred in Patient 5, and thus they may represent mechanisms of resistance. Overall, this study suggests that genetically engineered T-cells with virus-specific TCRs are safe and can induce tumor regression in HPV-associated epithelial cancers.
Association of Cereal, Gluten, and Dietary Fiber Intake With Islet Autoimmunity and Type 1 Diabetes
Type 1 diabetes (T1D) is thought to result from a combination of genetic and environmental factors. An increased intake of dietary fiber and gluten has been hypothesized to contribute to the development of T1D, though supporting data are scarce. In this cohort study, 6,081 infants with human leukocyte antigen (HLA)-conferred susceptibility to T1D were followed to study the association of cereal, gluten, and dietary fiber intake with the development of islet autoimmunity (IA) and T1D. During the 6-year follow-up period, 4.4% of children developed IA at a median age of 2.5 years (IQR 1.3 to 3.6 years). Of the 5,714 children with food record data available, 1.6% developed T1D at a median age of 3.8 years (IQR 2.9 to 4.8 years). Researchers found that, after adjusting for energy intake, a high intake of oats (HR 1.08, 95% CI 1.03 to 1.13, p=0.002), wheat (HR 1.09, 95% CI 1.03 to 1.15, p=0.002), rye (HR 1.13, 95% CI 1.03 to 1.23, p=0.01), gluten-containing cereals (HR 1.07, 95% CI 1.03 to 1.11, p<0.001), gluten without avenin from oats (HR 2.23, 95% CI 1.40 to 3.57, p<0.001), gluten with avenin (HR 2.06, 95% CI 1.45 to 2.92, p<0.001), and dietary fiber (HR 1.41, 95% CI 1.10 to 1.81, p=0.01) was associated with an increased risk of developing IA (HRs per 1 g/MJ increase in intake). The intake of rice and barley was not associated with risk of IA. After adjusting for energy intake, only the intake of oats (HR 1.10, 95% CI 1.00 to 1.21, p=0.04) and rye (HR 1.20, 95% CI 1.03 to 1.41, p=0.02) was associated with an increased risk of developing T1D. However, after correction for multiple testing, neither of these associations was statistically significant. In summary, findings from this study suggest that a high intake of oats, gluten-containing cereals, gluten, and dietary fiber may increase the risk of developing islet autoimmunity, but not necessarily that of developing type 1 diabetes.
The most common colorectal cancer (CRC) screening methods are colonoscopy, flexible sigmoidoscopy (FS), and fecal immunochemical testing (FIT). The diagnostic yield of any screening method is dependent on participation, and endoscopic screening consistently shows lower participation rates compared to FIT-based screening. In this analysis of three randomized studies, 30,052 asymptomatic patients aged 50 to 74 years were invited for CRC screening and assigned to either 4 rounds of FIT (15,046 patients invited), once-only FS (8,407 patients invited), or once-only colonoscopy (6,600 patients invited), to determine the number of advanced neoplasias (AN) detected with each test. Patients with positive FIT results (≥10 μg Hb/g feces), or FS patients found to have a polyp ≥10 mm, adenoma with ≥25% villous histology or high-grade dysplasia, sessile serrated adenoma, ≥3 adenomas, ≥20 hyperplastic polyps, or invasive CRC, were referred for colonoscopy. Researchers found that the participation rate was significantly higher for FIT screening (77%) than for FS (31%, p<0.001) or colonoscopy (24%, p<0.001). In the intention-to-screen analysis, the cumulative diagnostic yield of AN was higher with FIT screening (4.5%, 95% CI 4.2% to 4.9%) than with colonoscopy (2.2%, 95% CI 1.8% to 2.6%) or FS (2.3%, 95% CI 2.0% to 2.7%). In the as-screened analysis, colonoscopy detected more AN (9.1%, 95% CI 7.7% to 10.7%) than FS (7.4%, 95% CI 6.5% to 8.5%) and FIT (6.1%, 95% CI 5.7% to 6.6%). However, CRC detection rates were similar for FIT (0.8%, 95% CI 0.6% to 0.9%), FS (0.5%, 95% CI 0.3% to 0.9%), and colonoscopy (0.6%, 95% CI 0.3% to 1.2%). Among invitees, the rate of interval CRC development was 0.13% in patients with a negative result from FIT, 0.09% in patients with a negative result from flexible sigmoidoscopy, and 0.01% in patients with a negative result from colonoscopy. 
A limitation of this study was that those invited for primary endoscopic screening were invited only once, whereas those screened with FIT were approached biennially, which may have affected participation rates. In summary, this study suggests that FIT screening has a higher diagnostic yield for AN than endoscopic screening, despite requiring significantly fewer colonoscopies.
Image: PD
©2019 2 Minute Medicine, Inc. All rights reserved. No works may be reproduced without expressed written consent from 2 Minute Medicine, Inc. Inquire about licensing here. No article should be construed as medical advice and is not intended as such by the authors or by 2 Minute Medicine, Inc.