RP7214 is a potent and selective inhibitor of the human mitochondrial enzyme dihydroorotate dehydrogenase (DHODH). This paper describes the results of a Phase 1 study that evaluated the safety and pharmacokinetics of single and multiple ascending doses (SAD and MAD) of RP7214, as well as the effect of food, in healthy subjects. Target engagement of DHODH was also evaluated. A randomized, double-blind, placebo-controlled trial of single doses (100, 200, and 400 mg QD) and multiple doses (200 and 400 mg BID for 7 days), followed by a food-effect evaluation at a single 200 mg dose, was conducted. A total of 18 healthy volunteers (HVs) (6 subjects in each of three cohorts) were enrolled in the SAD part, 12 (6 subjects in each of two cohorts) in the MAD part, and 12 in the food-effect study. RP7214 was well tolerated at all dose levels; none of the subjects reported any RP7214-related adverse events. RP7214 showed dose-proportional pharmacokinetics after single and multiple dosing. Steady-state concentrations were reached within about 3–6 days, and the mean plasma half-life of RP7214 at steady state was approximately 13 h. RP7214 showed accumulation on multiple dosing. Food did not affect the absorption of RP7214. RP7214 showed dose-dependent inhibition of DHODH, as measured by accumulating dihydroorotate (DHO) levels, confirming target engagement. The rapid absorption and high systemic exposure of RP7214, together with a favorable safety profile, support the further development of RP7214 in SARS-CoV-2 infection and acute myeloid leukemia. (NCT04680429). Keywords: RP7214, dihydroorotate dehydrogenase, SAD, MAD, HV
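The reported half-life and BID regimen can be checked against standard one-compartment accumulation arithmetic. The sketch below is our own illustration, not a calculation reported by the study; the single-exponential model and the "5 half-lives to steady state" rule of thumb are assumptions.

```python
# Hedged sketch: one-compartment accumulation arithmetic applied to the
# abstract's reported values (t1/2 ~= 13 h at steady state; BID dosing,
# i.e. a 12 h interval). Model choice is ours; the study did not report
# these derived quantities.
from math import log, exp

t_half = 13.0   # steady-state half-life in hours (from the abstract)
tau = 12.0      # BID dosing interval in hours
k = log(2) / t_half                     # elimination rate constant (1/h)
accumulation = 1 / (1 - exp(-k * tau))  # predicted accumulation ratio at steady state
t_ss_days = 5 * t_half / 24             # ~5 half-lives to approach steady state, in days

print(f"accumulation ratio ~ {accumulation:.2f}, steady state in ~ {t_ss_days:.1f} days")
```

A ratio above 2 is consistent with the abstract's observation of accumulation on multiple dosing, and the predicted time to steady state is of the same order as the reported 3–6 days.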
Background: Irinotecan and temozolomide achieve objective responses in patients with Ewing sarcoma that recurs after initial therapy. Optimal dose schedules have not been defined. Procedure: We reviewed published series of patients treated with irinotecan and temozolomide for Ewing sarcoma that recurred after initial therapy. We compared objective response rates for patients who received 5-day irinotecan treatment schedules with those for patients who received 10-day irinotecan treatment schedules. Results: Among 94 patients treated with a 10-day irinotecan schedule there were 48 objective responses (51%). Among 218 patients treated with a 5-day irinotecan schedule there were 65 responses (30%). Conclusion: When irinotecan is used to treat recurrent Ewing sarcoma, a 10-day treatment schedule should be preferred.
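The response-rate comparison above can be reproduced from the raw counts. The two-proportion z-test below is our choice of test for illustration; the review did not specify its statistical method.

```python
# Hedged sketch: recomputing the 10-day vs. 5-day response rates from the
# abstract's counts and applying a two-proportion z-test (test choice ours).
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two independent proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2)) # pooled standard error
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided, via normal CDF
    return p1, p2, z, p_value

# 10-day schedule: 48/94 responses; 5-day schedule: 65/218 responses
p10, p5, z, p = two_proportion_z(48, 94, 65, 218)
print(f"10-day: {p10:.0%}, 5-day: {p5:.0%}, z = {z:.2f}, p = {p:.4f}")
```

The recomputed rates match the abstract's 51% and 30%, and the difference is well beyond conventional significance under this test.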
A set of new rasagiline derivatives is presented. They were designed as antioxidant compounds with the potential to be used for treating neurodegenerative disorders. They are expected to be multifunctional molecules that can help reduce oxidative stress, which is thought to contribute to neurodegenerative disorders. The CADMA-Chem computational protocol was used to generate the rasagiline derivatives and to evaluate their suitability as oral drugs and antioxidants. Three of them were identified as the most promising; they are proposed to be better free radical scavengers than rasagiline and are expected to retain the parent molecule's neuroprotective capability. It is hoped that the results presented here will promote further experimental and theoretical investigations of these compounds.
Introduction: Case definitions are used to guide clinical practice, surveillance, and research protocols. However, how well they identify COVID-19-hospitalised patients is not fully understood. We analysed the proportion of hospitalised patients with laboratory-confirmed COVID-19 in the ISARIC prospective cohort study database meeting widely used case definitions. Methods: Patients were assessed using the CDC, ECDC, WHO, and UKHSA case definitions by age, region, and time. Case fatality ratios (CFRs) and symptoms of those who did and did not meet the case definitions were evaluated. Patients with incomplete data and those without a laboratory-confirmed test result were excluded. Results: 263,218 patients (42%) in the ISARIC database were included. Most patients (90.4%) were from Europe and Central Asia. The proportions of patients meeting the case definitions were 56.8% (WHO), 74.4% (UKHSA), 81.6% (ECDC), and 82.3% (CDC). For each case definition, patients at the extremes of the age distribution met the criteria less frequently than those aged 30 to 70 years; geographical and time variations were also observed. Estimated CFRs were similar for the patients who met the case definitions. However, when more patients did not meet the case definition, the CFR increased. Conclusions: The performance of case definitions may differ between regions and may change over time. Similarly concerning is the fact that older patients often did not meet case definitions. While epidemiologists must balance their analytics with field applicability, ongoing revision of case definitions is necessary to improve patient care through early diagnosis and to limit potential nosocomial spread.
Background. Oral immunotherapy (OIT) is an emerging method for treating food allergy in children. However, data regarding adults undergoing this process are lacking. Methods. We retrospectively analyzed the medical records of patients with food allergy aged ≥17 years who completed OIT treatment between April 2010 and December 2020 at Shamir Medical Center. Data were compared with those of children aged 4 to <11 years and adolescents aged ≥11 to 17 years treated during the same period. Results. A total of 96 adults at a median age of 22.3 years who underwent OIT for milk (n=53), peanut (n=18), sesame (n=7), egg (n=5) and tree nuts (n=13) were analyzed and compared with 1299 children and 309 adolescents. Adults experienced more adverse reactions requiring injectable epinephrine, both during in-clinic up-dosing (49% vs. 15.9% and 26.5% for children and adolescents, respectively, p<0.0001) and during home treatment (22.9% vs. 10.5%, p=0.001 for children, and 14.2%, p=0.06 for adolescents). Most adults (61.5%) were fully desensitized, but rates of full desensitization were significantly lower compared with children (73.4%, p=0.013). Significantly more adults (28.3%) undergoing milk OIT failed treatment compared with children (14.3%, p=0.015) and adolescents (14.1%, p=0.022), while failure rates in adults undergoing OIT for other foods were low (9.3%) and comparable to those of children and adolescents. Conclusions. OIT successfully desensitizes most adults with IgE-mediated food allergy. Adults undergoing milk OIT are at increased risk for severe reactions and OIT failure, while failure rates in adults undergoing OIT for other foods are low.
Employing New Criteria for Confirmation of Conduction Pacing – Achieving True Left Bundle Branch Pacing May Be Harder Than Meets the Eye

Joshua Sink, MD1; Nishant Verma, MD, MPH2
1Northwestern University, Feinberg School of Medicine, Department of Internal Medicine
2Northwestern University, Feinberg School of Medicine, Division of Cardiology

Corresponding Author:
Nishant Verma, MD, MPH
251 East Huron Street, Feinberg 8-503
Chicago, IL 60611
312-926-2148
Nishant.Verma@nm.org

Funding: None
Disclosures: Dr. Sink has nothing to disclose. Dr. Verma receives speaker honoraria from Medtronic, Biotronik and Baylis Medical and consulting fees from Boston Scientific, Biosense Webster, AltaThera Pharmaceuticals and Knowledge 2 Practice.
Word Count: 1200

In recent years, conduction system pacing (CSP) has garnered significant attention from the electrophysiology (EP) community. This movement has been driven by the hypothesis that using the natural conduction system for activation is desirable and clinically beneficial in patients with advanced conduction disease and ventricular dyssynchrony. Permanent His-bundle pacing (PHBP) is generally seen as the purest form of conduction system activation (Figure 1). PHBP was first described over 20 years ago, but the idea has attracted substantial investigative effort in recent years.
When successfully achieved, His bundle pacing has been associated with reductions in mortality and heart failure (HF) admissions and improvement in left ventricular (LV) function compared with right ventricular (RV) pacing.1 Despite this, consistent achievability in real-world practice remains limited by a variety of factors, including a narrow anatomic target, lead stability, high pacing thresholds, low ventricular sensing, and inability to correct the QRS in bundle branch block.2 Thus, while waiting for the next iteration of improved delivery techniques, pacing leads, and programming algorithms, alternative methods of conduction system pacing have emerged with the potential to surmount the challenges described. Left bundle branch pacing (LBBP) has recently emerged as one such alternative method of CSP. The technique was first described by Huang et al. in 2017 and has seen a momentous rise in interest since.3 In 2019, Huang et al. produced a user manual for a successful LBBP procedure, in which they attempted to develop the first iteration of criteria for the confirmation of LBBP.4 Utilizing these criteria, or close variations of them, a number of subsequent studies demonstrated preliminary safety, feasibility, and efficacy of LBBP.5,6,7 LBBP became an attractive alternative to His bundle pacing because of its lower thresholds, improved lead stability, and higher procedural success rates.
When compared against RV pacing in patients requiring a high burden of pacing, LBBP has demonstrated reduced mortality, HF admissions, and need for upgrade to a biventricular (BiV) device.8 In a small, non-randomized patient sample, LBBP showed greater improvement in LV ejection fraction (EF) compared with BiV pacing.9 Perhaps most notable is the astonishing rate of lead placement success, with achievement rates reported as high as 98% in sizable studies.6 Differences between the two forms of CSP were apparent from the beginning, including in the expected QRS morphology after a successful case. Unlike PHBP, LBBP did not reproduce the native QRS, and the QRS duration was often greater than at baseline (Figure 2). The arena of LBBP underwent a notable shift in the fall of 2021, when Wu et al. proposed new criteria to prove LBBP.10 In that study, they presented an exquisite display of fundamental electrophysiologic principles by using mapping catheters positioned on the His bundle and LV septum during LBB lead placement. Through this painstaking work, they clarified the difference between true LBBP and left bundle branch area pacing (LBBAP), which can incorporate both LBBP and left ventricular septal pacing (LVSP). In their proposed framework, without a His or LV septal mapping catheter, an output-dependent QRS transition from non-selective LBBP (NS-LBBP) to selective LBBP (S-LBBP) or LVSP is necessary to prove LBBP, with a sensitivity and specificity of 100%. The present study by Shimeno et al., published in the current issue of the Journal of Cardiovascular Electrophysiology, is the first known effort to document achievement rates of LBBP using the modified criteria proposed by Wu et al.11 The primary finding of the study is that achieving true LBBP with an acceptable pacing threshold is likely harder than previously realized.
As expected, there was improvement after a learning curve, but even in the last third of patients enrolled, the achievement rate of LBBP was only 50%. This is dramatically lower than previously reported achievement rates using the original Huang et al. criteria, and it suggests that not all patients in the previously described studies were actually achieving true LBBP. An unknown subset of patients in these studies was likely only achieving LVSP. This is probably due to a prior reliance on indicators such as a paced right bundle branch block (RBBB) pattern, identification of an intrinsic LBB potential, and/or use of V6 R-wave peak time (RWPT) cutoffs without a clear output-dependent QRS transition. It is also worth noting that a variety of RWPT cutoffs have been used seemingly arbitrarily as 'evidence of LBBP'. This presents a major dilemma and highlights the need for a clear set of LBBP criteria to be defined by the collective EP community. Despite these caveats, although many of these previous studies did not fully confirm LBBP in their patients, the outcomes from these studies were still clinically promising. This raises the obvious question: does obtaining true LBBP matter? Future studies will need to explore the differences in clinical outcomes between true LBBP and LVSP. Secondarily, Shimeno et al. have provided a useful tool in identifying that an LBB potential to QRS-onset interval ≥22 ms had a specificity of 98% in predicting LBBP.11 This target measure can help future operators ensure sufficiently proximal engagement of the LBB conduction system. Additionally, the group took a close look at validating a RWPT cutoff time for the prediction of LBBP. Unfortunately, a RWPT cutoff of 68 ms (in non-LBBB patients), determined by the ROC curve, was not highly predictive. This runs contrary to previous reports by Wu et al.
and Jastrzębski et al., which reported higher predictive value of RWPT cutoffs.10,12 Looking at the data surrounding RWPT cutoffs as a whole, RWPT likely should not be used as a primary metric for confirming LBBP because of its imperfect sensitivity and specificity, but it may serve as an alternative if an output-dependent QRS transition or a change in RWPT of ≥10 ms is not observed. Additionally, in the event that capture thresholds are similar between the LBB and the adjacent myocardium, programmed stimulation is an option to try to reveal a QRS transition by exploiting differences in refractory periods. This study also highlighted one of the unique complications of LBBP by demonstrating a high rate of septal perforation. Paradoxically, more perforations were seen with increased experience, likely reflecting that deeper penetration into the septum is often sought as operators become more familiar with the procedure. The long-term clinical implications of this complication are, thus far, unknown. Looking forward, clear guidelines for confirmation of LBBP need to be defined. This is necessary to ensure quality before undertaking multi-center randomized controlled trials comparing LBBP with current pacing methods. To date, Wu et al. seem to have provided the best framework to achieve this.10 That said, there are concerns given that this framework has only been validated in 30 patients (and only 9 with LBBB). Ideally, these criteria would be validated in a larger population, though the work to accomplish this would be meticulous given the current gold standard of using an LV septal mapping catheter to prove conduction system capture. Shimeno et al. should be congratulated for their effort in putting this framework into practice. In their work, they have demonstrated that achieving true LBBP as defined by Wu et al.
may be harder than meets the eye, and this is very important in assessing the practicality of using LBBP as a widespread alternative to other pacing methods.

References:
1. Abdelrahman M, Subzposh FA, Beer D, et al. Clinical Outcomes of His Bundle Pacing Compared to Right Ventricular Pacing. J Am Coll Cardiol. 2018;71(20):2319-2330. doi:10.1016/j.jacc.2018.02.048
2. Zanon F, Abdelrahman M, Marcantoni L, et al. Long term performance and safety of His bundle pacing: A multicenter experience. J Cardiovasc Electrophysiol. 2019;30(9):1594-1601. doi:10.1111/jce.14063
3. Huang W, Su L, Wu S, et al. A Novel Pacing Strategy With Low and Stable Output: Pacing the Left Bundle Branch Immediately Beyond the Conduction Block. Can J Cardiol. 2017;33(12):1736.e1-1736.e3. doi:10.1016/j.cjca.2017.09.013
4. Huang W, Chen X, Su L, Wu S, Xia X, Vijayaraman P. A beginner's guide to permanent left bundle branch pacing. Heart Rhythm. 2019;16(12):1791-1796. doi:10.1016/j.hrthm.2019.06.016
5. Padala SK, Master VM, Terricabras M, et al. Initial Experience, Safety, and Feasibility of Left Bundle Branch Area Pacing: A Multicenter Prospective Study. JACC Clin Electrophysiol. 2020;6(14):1773-1782. doi:10.1016/j.jacep.2020.07.004
6. Su L, Wang S, Wu S, et al. Long-Term Safety and Feasibility of Left Bundle Branch Pacing in a Large Single-Center Study. Circ Arrhythm Electrophysiol. 2021;14(2):e009261. doi:10.1161/CIRCEP.120.009261
7. Huang W, Wu S, Vijayaraman P, et al. Cardiac Resynchronization Therapy in Patients With Nonischemic Cardiomyopathy Using Left Bundle Branch Pacing. JACC Clin Electrophysiol. 2020;6(7):849-858. doi:10.1016/j.jacep.2020.04.011
8. Sharma PS, Patel NR, Ravi V, et al. Clinical outcomes of left bundle branch area pacing compared to right ventricular pacing: Results from the Geisinger-Rush Conduction System Pacing Registry. Heart Rhythm. 2022;19(1):3-11. doi:10.1016/j.hrthm.2021.08.033
9. Wu S, Su L, Vijayaraman P, et al. Left Bundle Branch Pacing for Cardiac Resynchronization Therapy: Nonrandomized On-Treatment Comparison With His Bundle Pacing and Biventricular Pacing. Can J Cardiol. 2021;37(2):319-328. doi:10.1016/j.cjca.2020.04.037
10. Wu S, Chen X, Wang S, et al. Evaluation of the Criteria to Distinguish Left Bundle Branch Pacing From Left Ventricular Septal Pacing. JACC Clin Electrophysiol. 2021;7(9):1166-1177. doi:10.1016/j.jacep.2021.02.018
11. Shimeno K, Tamura S, Hayashi Y, et al. Achievement Rate and Learning Curve of Left Bundle Branch Capture in Left Bundle Branch Area Pacing Procedure Performed to Demonstrate Output-Dependent QRS Transition. J Cardiovasc Electrophysiol. 2022.
12. Jastrzębski M, Kiełbasa G, Curila K, et al. Physiology-based electrocardiographic criteria for left bundle branch capture. Heart Rhythm. 2021;18(6):935-943. doi:10.1016/j.hrthm.2021.02.021

Figure Legends
Figure 1: Permanent His-Bundle Pacing
Panel A: A 12-lead electrocardiogram (EKG) shows baseline conduction in a patient with exertional intolerance. The PR interval is markedly prolonged and, with exercise, this patient developed AV block. A permanent His-bundle pacemaker was implanted.
Panel B: An EKG demonstrating permanent His-bundle pacing in the same patient as Panel A. Selective His-bundle capture results in reproduction of the intrinsic QRS complex.
Figure 2: Non-Selective Left Bundle Branch Pacing
A 12-lead electrocardiogram showing non-selective left bundle branch pacing. The paced QRS morphology is not a direct match for native conduction, and the QRS duration is longer than at baseline. However, conduction system capture was confirmed with an output-dependent QRS morphology change.
In donation after circulatory death, procurement is performed after the heart has arrested. This technique has been adopted by clinicians to overcome the shortage of hearts available for transplant. Warm ischemia time plays a pivotal role in the survival outcomes of heart recipients. We describe a fast and safe technique to flush the heart during recovery from circulatory death donors in order to shorten the warm ischemia time.
Title: Percutaneous Lead Extraction in Patients with Large Vegetations: Limiting our Aspirations.

Robert D. Schaller, DO1
1The Section of Cardiac Electrophysiology, Cardiovascular Division, Department of Medicine, Hospital of the University of Pennsylvania, Philadelphia, Pennsylvania

Funding: This work was supported in part by the Mark Marchlinski EP Research & Education Fund
Key words: Lead extraction, vegetation, pulmonary embolism, thrombus, aspiration
Disclosures: None
Word count: 1547

Transvenous lead extraction (TLE) in the 1960s involved orthopedic-style pulley systems that joined the exposed portion of the lead to progressively heavier weights hanging from the bed. Sustained tension on the lead was maintained for minutes to days, until the patient experienced discomfort, ventricular arrhythmias occurred, or noticeable resistance developed. The location of the lead within the chest was monitored with daily chest radiographs, and the ensuing bang of the weight hitting the floor of the intensive care unit signified case conclusion, at which point the patient was assessed.
Complications were erratic and included lead laceration and possible migration, injury to the tricuspid valve (TV), myocardial avulsion, tamponade, and death.1 Due to the immature nature of the procedure at that time, it was relegated to infectious indications, including lead-related endocarditis, then referred to as "catheter fever". Contemporary TLE has evolved into a highly refined practice with a multitude of tools and predictable results, and procedural indications now span infection, venous occlusion, management of redundant leads, and access to magnetic resonance imaging.2 Procedural imaging with computed tomography (CT) and real-time ultrasound-based tools has similarly changed the TLE experience, with identification of adhesions, thrombi, vegetations, and complications.3 Large lead-related masses have historically caused angst due to the possibility of being sheared off by the extraction sheath and embolizing to the lung, and they still represent a relative contraindication to percutaneous TLE.2 In this issue of the Journal of Cardiovascular Electrophysiology, Giacopelli et al.4 present the outcomes of 25 consecutive patients (mean age 64 years, 68% male), including 5 with pacemakers, 10 with implantable cardioverter-defibrillators, and 10 with cardiac resynchronization therapy devices, who underwent TLE with vegetations ≥10 mm on transesophageal echocardiography (TEE). Contrast-enhanced CT was performed before and after TLE, with 18 (72%) patients showing subclinical pulmonary embolism (PE). Vegetation size (median 17.5 mm, maximum 30 mm) did not differ between those with and without PE (20.0 mm vs. 14.0 mm, p=0.116). Complete TLE success was achieved in all patients, with 76% requiring advanced tools and 2 needing femoral snaring, and there were no significant procedural complications. In the group with pre-TLE PE, a post-TLE scan confirmed the presence of PE in only 14/18 (78%), and there were no patients with new PE formation.
During a median follow-up period of 19.4 months, no re-infection of the newly implanted systems was reported, and there were 5 deaths (20%), with no differences between the groups. The authors concluded that subclinical PE was common in this clinical scenario but did not influence the complexity or safety of the procedure. Several aspects of this paper warrant comment. No data are reported on the size or location of the PEs, nor the time between the first and second CT. It is possible that small PEs would not be identified on subsequent studies performed days after antibiotics had been started. Patients also received acute and chronic anticoagulation if PE was identified, which, in the setting of vegetations, is generally not indicated and could potentially lead to bleeding. The authors did not provide information regarding infectious pathogens or the timing of culture clearance, which could influence treatment. Additionally, it is unclear which patients received new CIED systems, including the type and timing of reimplantation, which might influence subsequent infectious risk. A vascular occlusion balloon was not used in any patients in this report. While this tool is associated with a reduced risk of death in the setting of a superior vena cava laceration when used properly, it has also been shown to be thrombogenic during long dwell times,5 and its use could impact post-operative CTs in future studies. Despite utilizing transthoracic echocardiography during TLE, neither TEE nor intracardiac echocardiography was used intraoperatively, and thus no information regarding the precise location of the vegetations within the heart is known. Importantly, no information regarding the characteristics of the vegetations other than size was reported. Not all lead-related masses are created equal, with two distinct subtypes previously described.6 The first is composed of thickened endocardium and fibrous tissue covering the leads and ultimately forming into connective tissue.
These masses, commonly found on leads behind the TV, are caused by a vortical flow pattern that produces low shear stress on the lead surface and provokes neointimal hyperplasia,7 and they range from small fibrous strands to large, smooth, organized thrombus (Figure, left column). Despite their sterile nature, TLE in the setting of a large, mature thrombus could result in embolization and obstruction of the pulmonary artery, causing symptomatic PE. The second type, frequently seen in the setting of infective endocarditis, is composed of inflammatory cells, platelets, adhesion molecules, fresh fibrin, and bacteria binding to coagulum and forming vegetations. These are typically longer, more likely to be multi-lobular, and commonly span several chambers of the heart (Figure, right column). Such vegetations, which are typically acute with friable finger-like projections, characteristically break apart upon being sheared off during TLE, with reports showing a low risk of symptomatic PE.8 Vegetations that are lobular, however, have been associated with worse outcomes.9 Despite acute procedural success in the setting of lead-related vegetations, mortality rates at 1 year approach 25%.10 Indeed, despite successful TLE in this report, 20% of patients were dead at 1.5 years. Although the mechanism of these poor outcomes remains incompletely understood, septic emboli, lung abscesses, and infected lead "ghosts" have been implicated.11 Vegetation removal prior to TLE has thus represented an appealing therapeutic option, with reports of successful percutaneous aspiration prior to TLE showing promising results, albeit with unknown long-term benefit.12,13 Although the lack of new PEs after TLE in this report does not directly support the effort, cost, and added risk of such a strategy, "debulking" of infectious burden remains a tempting complementary treatment.
Importantly, the acute safety of TLE with large vegetations in this study should not be extrapolated to chronic, large lead-related masses, which are more likely to cause acute PE if embolized. While aspiration of these sterile masses prior to TLE is appealing from a procedural outcome perspective, their morphologic characteristics and the imperfect, but evolving, aspiration sheaths currently available are limiting, and surgical extraction requires consideration. Further advancements in aspiration catheter technology and the development of right ventricular outflow tract filters might influence future management. TLE continues to represent the gold standard for the management of lead-related infection.2 Due to the extensive work of the pathfinders in the vanguard of procedural development, the sound of crashing weights has been supplanted by that of powered advancing sheaths. Yet despite the safe and predictable nature of modern-day TLE, the sobering long-term mortality of patients with infectious indications remains out of proportion to acute procedural success. While infectious "debulking" continues to represent the most attractive and practical complementary option to address this incongruity, future studies should concentrate both on identifying mass characteristics that predict success and on determining whether long-term benefits exist above and beyond lead removal. However, if improvements in clinical outcomes that warrant this added cost and effort are not identified, we should likely limit our aspirations.
Selection on quantitative traits by divergent climatic conditions can lead to substantial trait variation across a species range. In the context of rapidly changing environments, however, it is equally important to understand selection on trait plasticity. To evaluate the role of selection in driving divergences in traits and their associated plasticity within a widespread species, we compared molecular and quantitative trait variation in Populus fremontii (Fremont cottonwood) populations throughout Arizona. Using SNP data and genotypes from 16 populations reciprocally planted in three common gardens, we first performed QST-FST analyses to detect selection on traits and trait plasticity. We then explored the mechanistic basis of selection using trait-climate and plasticity-climate regressions. Three major findings emerged: 1) There was significant genetic variation in traits expressed in each of the common gardens and in the phenotypic plasticity of traits across gardens. 2) Based on QST-FST comparisons, there was evidence of selection in all traits measured; however, this result varied from no effect in one garden to highly significant in another, indicating that detection of past selection is environmentally dependent. We also found strong evidence of divergent selection on plasticity across environments for two traits. 3) Traits and/or their plasticity were often correlated with population source climate (R2 up to 0.77 and 0.66, respectively). This suggests that steep climate gradients across the Southwest have played a major role in shaping the evolution of divergent phenotypic responses in populations and genotypes now experiencing climate change.
Acute fatty liver of pregnancy (AFLP) is a rare condition associated with other liver manifestations of pregnancy such as hemolysis, elevated liver enzymes, and low platelets (HELLP) syndrome. We present a 27-year-old pregnant woman who developed hepatic encephalopathy and disseminated intravascular coagulation (DIC) after being diagnosed with AFLP.
On Time Surgery Start: Is Standardization The Answer?

Olufunke Folasade Dada, MD; Tanaya Sparkle, M.B.B.S.
University of Toledo Medical Center, Anesthesiology Department, 3000 Arlington Avenue, Toledo, Ohio, USA

Corresponding Author: Dr. Tanaya Sparkle, M.B.B.S.
Address for correspondence: University of Toledo Medical Center, Anesthesiology Department, 3000 Arlington Avenue, Toledo, Ohio - 43614
E-mail: email@example.com
Phone: 419-383-3531
Adolescence is a critical stage of rapid biological, emotional and social change and development. Adolescents and young adults (AYA) with asthma and allergies need to develop the knowledge and skills to self-manage their health independently. Healthcare professionals (HCP), parents and their wider network play an essential role in supporting AYA in this process. Previous work showed significant limitations in transition care across Europe. In 2020, EAACI published the first evidence-based guideline on effective transition for AYA with asthma and allergies. We herein summarize practical resources to support this guideline's implementation in clinical practice. For this purpose, multi-stakeholder Task Force members searched for resources in peer-reviewed journals and the grey literature. Resources were included if relevant and of good quality, and were pragmatically rated for their evidence basis and user-friendliness. The resources identified covered a range of topics and targeted healthcare professionals, AYA, parents/carers, schools, workplaces, and the wider community. Most resources were in English, web-based, and had a limited evidence basis. This position paper provides a valuable selection of practical resources for all stakeholders to support effective transitional care for AYA with asthma and allergies. Future research should focus on developing validated, patient-centred tools to further assist evidence-based transition care.
A fracture of the mastoid bone should be considered in the work-up of head and neck traumatic injuries. A well-pneumatized mastoid can absorb forceful impacts, protecting middle and inner ear structures. Fractures of the mastoid, followed by a Valsalva maneuver, can lead to subcutaneous cervical emphysema.
Objective To test the equivalence of two doses of intravenous iron (ferric carboxymaltose) in pregnancy. Design Parallel, two-arm equivalence randomised controlled trial with an equivalence margin of 5%. Setting Single centre in Australia. Population 278 pregnant women with iron deficiency. Methods Participants received either 500 mg (n=152) or 1000 mg (n=126) of intravenous ferric carboxymaltose in the second or third trimester. Main outcome measures The proportion of participants requiring additional intravenous iron (500 mg) to achieve and maintain ferritin >30 µg/L (diagnostic threshold for iron deficiency) at 4 weeks post-infusion, and at 6 weeks and 3, 6 and 12 months postpartum. Secondary endpoints included the repeat infusion rate, iron status, birth outcomes, and safety outcomes. Results The two doses were not equivalent within a 5% margin at any timepoint. At 4 weeks post-infusion, 26/73 (36%) participants required a repeat infusion in the 500 mg group compared with 5/67 (8%) in the 1000 mg group (difference in proportions 0.283, 95% confidence interval 0.177–0.389). Overall, participants in the 500 mg arm had twice the repeat infusion rate (0.81 (SD 0.82) vs. 0.40 (SD 0.69); rate ratio 2.05, 95% CI 1.45–2.91). Conclusions Administration of 1000 mg of ferric carboxymaltose in pregnancy maintains iron stores and reduces the need for repeat infusions. A 500 mg dose requires ongoing monitoring to ensure adequate iron stores are reached and sustained.
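The 4-week repeat-infusion comparison can be recomputed from the reported counts. The Wald interval below is our own assumption for illustration; the trial may have used a different interval method, so the published CI (0.177, 0.389) need not match exactly.

```python
# Hedged sketch: recomputing the difference in repeat-infusion proportions
# from the abstract's counts (26/73 vs. 5/67). The Wald standard error is
# an assumption; the trial's exact method was not stated in the abstract.
from math import sqrt

def diff_in_proportions(x1, n1, x2, n2, z=1.96):
    """Point estimate and Wald 95% CI for the difference of two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)  # unpooled Wald SE
    return diff, (diff - z * se, diff + z * se)

diff, (lo, hi) = diff_in_proportions(26, 73, 5, 67)
print(f"difference = {diff:.3f}, Wald 95% CI ({lo:.3f}, {hi:.3f})")
```

The point estimate lands close to the reported 0.283; the Wald interval is somewhat wider than the published one, consistent with a different interval method having been used.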
Background: Hemostatic disturbances in coronavirus disease 2019 (COVID-19) can, in very rare instances, predispose to tricuspid valve and right heart thrombi. Aim: We describe a 29-year-old female patient with no previous cause of thrombosis who developed a large tricuspid valve thrombus (TVT) and moderate-to-severe tricuspid regurgitation (TR) during the course of COVID-19 infection. Materials and methods: Persistent fever and tachycardia with thrombocytopenia and a high D-dimer raised the index of suspicion. The diagnosis was made by bedside transthoracic echocardiography (TTE) and cardiac magnetic resonance (CMR). Surgery was performed for thrombectomy and tricuspid valve replacement with a tissue valve. Discussion and conclusion: Detection of TVT in COVID-19 patients on the basis of a high index of suspicion, bedside TTE, and non-invasive CMR enables early surgical treatment and a subsequent reduction in mortality and hospital stay.