Category Archives: Original Research

Association between hospital case volume and mortality in pediatric sepsis: A retrospective observational study using a Japanese nationwide inpatient database

DOI: 10.2478/jccm-2025-0006

Introduction: The survival benefits of treatment at high-volume hospitals (HVHs) are well-documented for several critical pediatric conditions. However, their impact on pediatric sepsis, a leading cause of mortality among children, remains understudied.
Aim of the study: To investigate the association between hospital case volume and mortality rates in pediatric sepsis.
Material and Methods: We conducted a retrospective cohort study using data from the Diagnosis Procedure Combination database. The study included patients who met the following criteria: 1) aged 28 days to 17 years; 2) discharged from the hospital between April 2014 and March 2018; 3) had a sepsis diagnosis coded under the International Classification of Diseases, 10th revision; 4) underwent blood cultures on hospital admission day (day 0) or day 1; 5) received antimicrobial agents on day 0 or 1; and 6) required at least one organ support measure (e.g., mechanical ventilation or vasopressors) on day 0 or 1. Hospitals were categorized by case volume during the study period, with HVHs defined as those in the highest quartile and low-volume hospitals (LVHs) as those in the remaining quartiles. In-hospital mortality rates between HVH and LVH groups were compared using mixed-effects logistic regression analysis with propensity score (PS) matching.
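As an illustration of the analysis described above, the sketch below pairs propensity score matching with a simple mortality comparison; it is a minimal example, not the study's code. The file name and column names (hvh, died, age, vasopressors, mech_vent) are hypothetical, and the study's mixed-effects logistic model with a random intercept per hospital is simplified here to an ordinary logistic regression on the matched cohort.

```python
# Hypothetical sketch only: propensity score (PS) matching of high-volume-hospital (HVH)
# patients to low-volume-hospital (LVH) patients, followed by a mortality comparison.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
import statsmodels.formula.api as smf

df = pd.read_csv("pediatric_sepsis.csv")            # hypothetical analysis file
covariates = ["age", "vasopressors", "mech_vent"]   # illustrative confounders only

# 1) Estimate each patient's propensity of treatment at an HVH (hvh = 1) given the covariates.
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["hvh"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2) 1:1 nearest-neighbour matching on the propensity score (with replacement, for brevity).
treated = df[df["hvh"] == 1]
control = df[df["hvh"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]])

# 3) Compare in-hospital mortality in the matched cohort; the study used a mixed-effects
#    model with a random intercept per hospital, simplified here to plain logistic regression.
fit = smf.logit("died ~ hvh", data=matched).fit()
print("Odds ratio (HVH vs. LVH):", np.exp(fit.params["hvh"]))
```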
Results: A total of 934 pediatric patients were included in the study, with an overall in-hospital mortality rate of 16.1%. Of these, 234 were treated at 5 HVHs (≥26 patients in 4 years) and 700 at 234 LVHs (<26 patients in 4 years). After PS matching, patients treated at HVHs had significantly lower odds of in-hospital mortality than those treated at LVHs (odds ratio, 0.42; 95% confidence interval, 0.22–0.80; P = 0.008).
Conclusions: In pediatric patients with sepsis, treatment at HVHs was associated with lower odds of in-hospital mortality.

Full text: PDF

What proteins and albumin in bronchoalveolar lavage fluid and serum could tell us in COVID-19 and influenza acute respiratory distress syndrome patients on mechanical ventilation – A prospective two-center study

DOI: 10.2478/jccm-2025-0005

Introduction: The extent of in vivo damage to the alveolar-capillary membrane in patients with primary lung injury remains unclear. In acute respiratory distress syndrome (ARDS) related to COVID-19 and influenza type A, the damage is even more complex, as viral pneumonia currently has no causal treatment.
Aims of the study: Our primary aim was to improve the understanding of ARDS by demonstrating damage to the alveolar-capillary membrane in critically ill patients with COVID-19 and influenza type A, using protein and albumin levels measured in bronchoalveolar lavage (BAL) fluid and serum. Our secondary objective was to assess patient outcomes in relation to elevated protein and albumin levels in both BAL fluid and serum.
Materials and methods: BAL fluid and serum samples were collected from 64 patients in three groups: 30 patients with COVID-19-related ARDS, 14 patients with influenza type A (H1N1)-related ARDS, and a control group of 20 patients prepared preoperatively for elective surgery without any diagnosed lung disease. BAL samples were taken within the first 24 hours after the start of invasive mechanical ventilation in the intensive care unit, together with measurement of serum albumin levels. In the control group, BAL and serum samples were collected after the induction of general endotracheal anaesthesia.
Results: Patients in the COVID-19 group were significantly older than those in the influenza type A (H1N1) group, with median ages of 72.5 and 62 years, respectively (p < 0.01, Mann-Whitney U test). Serum albumin levels (g/L) differed significantly across the three groups (p < 0.01, ANOVA) and were also significantly associated with treatment outcome (p = 0.03, Mann-Whitney U test). A reduction in serum albumin below 35 g/L, combined with elevated protein levels in BAL fluid, predicted poor outcomes in ARDS patients (p < 0.01, ANOVA).
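For illustration, the two-group (Mann-Whitney U) and three-group (one-way ANOVA) comparisons reported above can be reproduced with standard statistical routines; the albumin values below are placeholders, not study data.

```python
# Illustrative only: placeholder values stand in for the study's serum albumin data (g/L).
from scipy.stats import mannwhitneyu, f_oneway

covid_albumin     = [28.0, 31.5, 26.2, 33.1, 29.4]   # hypothetical COVID-19 ARDS group
influenza_albumin = [34.0, 36.2, 30.8, 35.5, 32.9]   # hypothetical influenza A ARDS group
control_albumin   = [41.3, 39.8, 42.0, 40.1, 43.6]   # hypothetical control group

# Two-group comparison (Mann-Whitney U), as used for the age and outcome comparisons.
u_stat, p_two_groups = mannwhitneyu(covid_albumin, influenza_albumin)

# Three-group comparison (one-way ANOVA), as used across COVID-19, influenza, and control groups.
f_stat, p_three_groups = f_oneway(covid_albumin, influenza_albumin, control_albumin)

print(f"Mann-Whitney U p = {p_two_groups:.3f}, ANOVA p = {p_three_groups:.3f}")
```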
Conclusions: Our findings indicate that protein and albumin levels in BAL fluid are elevated in severe ARDS. This suggests that BAL can be used to evaluate protein levels and fractions, which could assist in assessing damage to the alveolar-capillary membrane. In addition, increased albumin in BAL fluid, often accompanied by decreased serum albumin, may serve as a valuable indicator of compromised alveolar-capillary membrane integrity in ARDS, with potential implications for patient care.

Full text: PDF

Hypercapnia outcome in COVID-19 acute respiratory distress syndrome patients on mechanical ventilation: A retrospective observational cohort study

DOI: 10.2478/jccm-2025-0004

Introduction: Acute respiratory distress syndrome (ARDS) is characterized by progressive lung inflammation that increases dead space, which can cause hypercapnia and increase the risk of morbidity and mortality. Lung-protective ventilation improves survival in ARDS but increases the incidence of hypercapnia. The role of carbon dioxide in ARDS therefore remains unclear, with conflicting evidence. This study examines the relationship between hypercapnia and mortality in mechanically ventilated COVID-19 ARDS patients.
Methods: We conducted a retrospective cohort study. Data were collected from the medical records of patients admitted with COVID-19 ARDS to the Sindh Infectious Disease Hospital & Research Centre (SIDH & RC) between August 2020 and August 2022 who received mechanical ventilation for more than 48 hours. Patients were grouped into severe and non-severe hypercapnia groups based on their arterial carbon dioxide levels (PaCO2). To estimate the effect of hypercapnia on mortality, we performed multivariable logistic regression and inverse probability-weighted regression to adjust for time-varying confounders.
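As a sketch of the adjustment strategy described above, the example below derives stabilised inverse-probability weights from an exposure model and fits a weighted mortality regression; the file and variable names (severe_hypercapnia, died, pf_ratio, dead_space, compliance) are illustrative assumptions, and the exact weighting estimator used in the study may differ.

```python
# Hypothetical sketch only: inverse-probability-weighted (IPW) logistic regression of
# mortality on severe hypercapnia, with weights derived from a model of the exposure.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
import statsmodels.api as sm

df = pd.read_csv("covid_ards_cohort.csv")                  # hypothetical analysis file
confounders = ["pf_ratio", "dead_space", "compliance"]     # illustrative severity markers

# 1) Probability of severe hypercapnia given the measured confounders.
exposure_model = LogisticRegression(max_iter=1000).fit(df[confounders], df["severe_hypercapnia"])
p_exposed = exposure_model.predict_proba(df[confounders])[:, 1]

# 2) Stabilised inverse-probability weights (marginal exposure probability in the numerator).
p_marginal = df["severe_hypercapnia"].mean()
df["ipw"] = np.where(df["severe_hypercapnia"] == 1,
                     p_marginal / p_exposed,
                     (1 - p_marginal) / (1 - p_exposed))

# 3) Weighted logistic regression of in-hospital death on severe hypercapnia.
X = sm.add_constant(df[["severe_hypercapnia"]])
fit = sm.GLM(df["died"], X, family=sm.families.Binomial(), freq_weights=df["ipw"]).fit()
print("Adjusted OR:", np.exp(fit.params["severe_hypercapnia"]))
```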
Results: We included 288 patients, a sample size sufficient to detect an effect on mortality of at least 3%. Severe hypercapnia was associated with more severe lung injury, lower PaO2/FiO2, higher dead space, and poorer compliance. In univariate analysis, severe hypercapnia was associated with higher mortality (OR = 3.50, 95% CI [1.46–8.43]). However, after adjusting for disease severity, hypercapnia was not associated with mortality (OR = 1.08, 95% CI [0.32–3.64]). The sensitivity analysis using weighted regression likewise showed no significant effect on mortality (OR = 1.04, 95% CI [0.95–1.14]).
Conclusion: This study showed that, after adjusting for disease severity, hypercapnia was not associated with mortality in COVID-19 ARDS patients.

Full text: PDF

Intraabdominal hypertension is less common than it used to be: A pilot stepped wedge trial

DOI: 10.2478/jccm-2025-0002

Objective: This is a pilot study to determine the feasibility of a multicentre stepped wedge cluster randomized trial of implementing the 2013 World Society of the Abdominal Compartment Syndrome (WSACS) guidelines as an intervention to treat intraabdominal hypertension (IAH) and abdominal compartment syndrome (ACS) in critically ill patients.
Design: Single-centre before-and-after trial, with an observation / baseline period of 3 months followed by a 9-month intervention period.
Setting: A 35-bed medical-surgical-trauma intensive care unit in a tertiary-level Canadian hospital.
Patients: Recruitment from consecutively admitted adult intensive care unit patients.
Intervention: In the intervention period, treatment teams were prompted to implement WSACS interventions in all patients diagnosed with IAH.
Measurements and Main Results: 129 patients were recruited, 59 during the observation period and 70 during the intervention period. Only 17.0% and 12.9%, respectively, met diagnostic criteria for IAH. Many recruited patients did not have intraabdominal pressures measured regularly per study protocol. There was no difference in ICU mortality between the two cohorts or between patients with and without IAH.
Conclusions: The incidence of IAH in our patient population has decreased significantly since 2015. This is likely due to a significant change in routine care of critically ill patients, especially with respect to judicious goal-directed fluid resuscitation. Patient recruitment and protocol adherence in this study were low, exacerbated by other staffing and logistical pressures during the study period. We conclude that a larger multicentre trial is unlikely to yield evidence of a detectable treatment effect.

Full text: PDF

Evaluation of monitoring critically ill children with traumatic brain injury

DOI: 10.2478/jccm-2025-0001

Introduction: In traumatic brain injury (TBI), direct information about cerebral blood flow, brain tissue oxygenation, and cerebral perfusion pressure can be obtained. More importantly, changes in these parameters can be followed using multidimensional monitoring and widely available monitoring methods.
Aim of the study: We aimed to evaluate the monitoring of critically ill children who were followed up in our pediatric intensive care unit (PICU) due to TBI.
Material and Methods: Twenty-eight patients with head trauma who were followed up in our tertiary PICU between 2018 and 2020 were included in the study. Cerebral tissue oxygenation, optic nerve sheath diameter (ONSD), Glasgow Coma Scale (GCS) scores, and Glasgow Outcome Scale-Extended (GOSE) values were obtained from retrospective file records and examined.
Results: Males accounted for 71.4% (n=20) of the patients. When TBI was classified according to GCS, 50% (n=14) had moderate TBI and 50% had severe TBI. On the first day, ONSD and non-invasively estimated intracranial pressure (nICP) were higher in the poor prognosis group than in the good prognosis group (ONSD, p=0.01; nICP, p=0.004). On the second day of hospitalization, ONSD and nICP were again significantly higher in the poor prognosis group than in the good prognosis group (ONSD, p=0.002; nICP, p=0.001). Cerebral tissue oxygenation decreased significantly from the first to the second day in both the good and poor prognosis groups (p=0.03 and p=0.006, respectively). In the good prognosis group, ONSD and nICP measurements on the second day were significantly lower than those taken at admission (ONSD, p=0.004; nICP, p<0.001).
Conclusion: The aim of multidimensional follow-up in traumatic brain injury is to protect the brain from both primary and secondary damage; for this reason, patients should be followed closely with multimodal monitoring methods, ideally within a multidisciplinary approach.

Full text: PDF

The effect of pre-existing sarcopenia on outcomes of critically ill patients treated for COVID-19

DOI: 10.2478/jccm-2024-0045

Background: Sarcopenia, defined by a loss of skeletal muscle mass and function, has been identified as a prevalent condition associated with poor clinical outcome among critically ill patients. This study aims to evaluate the impact of pre-existing sarcopenia on outcomes in critically ill patients with acute respiratory failure (ARF) due to COVID-19.
Material and Methods: A retrospective study was carried out on COVID-19 patients admitted to intensive care. Pre-existing sarcopenia was assessed using early CT scans. Clinical outcomes, including duration of high-flow oxygenation (HFO), mechanical ventilation (MV), length of hospital stay (LOS) and ICU mortality, were evaluated according to sarcopenia status.
Results: Among the studied population, we found a high prevalence of pre-existing sarcopenia (75 patients, 50%), predominantly in older male patients. Pre-existing sarcopenia significantly prolonged HFO duration (6.8 ± 4.4 vs. 5 ± 2.9 days; p=0.005) but did not significantly affect MV requirement (21 (28%) vs. 23 (37.3%); p=0.185), MV duration (7 vs. 10 days; p=0.233), ICU mortality (12 (16%) vs. 10 (13.3%); p=0.644), or hospital LOS (27 vs. 25 days; p=0.509). No differences in outcomes were observed between sarcopenic and non-sarcopenic obese patients.
Conclusions: Pre-existing sarcopenia in critically ill COVID-19 patients is associated with longer HFO duration but not with other adverse outcomes. Further research is needed to elucidate the mechanisms and broader impact of sarcopenia on septic critically ill patient outcomes.

Full text: PDF

The Use of Biomarker Testing in the Emergency Department

DOI: 10.2478/jccm-2024-0041

Introduction: In the fast-paced environment of Emergency Departments (EDs), biomarkers are essential for the rapid diagnosis and management of critical conditions.
Aim of the study: This study evaluates current clinical practice regarding key biomarkers in Romanian EDs, the needs of emergency medicine physicians, and the challenges associated with biomarker testing.
Material and Methods: An online survey was sent to physicians working in EDs to explore their perceptions, needs, and barriers regarding biomarkers, including point-of-care testing (POCT). Data were collected anonymously through an online platform and subsequently analyzed.
Results: This survey analyzed data from 168 completed responses, with 95.2% of respondents being specialists in emergency medicine. Procalcitonin and presepsin were the biomarkers most preferred for POCT, while troponin and D-dimer were highly rated regardless of the testing method, reflecting their utility in sepsis and cardiovascular emergencies. Neuron-specific enolase (NSE), interleukin-6 (IL-6), and procalcitonin were the biomarkers considered most needed.
Conclusions: The most frequently used biomarkers in ED were troponin, D-dimer, BNP/NT-proBNP, and procalcitonin. NSE, IL-6, and procalcitonin were the most recommended for future integration. High costs, limited availability, and false-positive concerns remain significant challenges in biomarker use.

Full text: PDF

The Analgesic Effect of Morphine on Peripheral Opioid Receptors: An Experimental Research

DOI: 10.2478/jccm-2024-0042

Opioids represent one of the key pillars of postoperative pain management, but their use has been associated with a variety of serious side effects. It is therefore crucial to investigate the timing and course of opioid administration in order to ensure the best efficacy-to-side-effect profile. The aim of our study was to investigate the analgesic effects of locally administered (intraplantar) morphine sulfate in a carrageenan-induced inflammation model in rats. After carrageenan administration, the rats were divided into 10 equal groups and injected with either morphine 5 mg/kg or 0.9% saline solution at different time intervals, depending on the assigned group. The analgesic effect was assessed through thermal stimulation. Our results showed that paw withdrawal time was significantly longer in rats treated with morphine than in the control group (9.18 ± 3.38 vs. 5.14 ± 2.21 seconds, p=0.012). However, the differences were more pronounced at certain time intervals after carrageenan administration (at 180 minutes compared to 360 minutes, p=0.003, and at 180 minutes compared to 1440 minutes, p<0.001), indicating that efficacy varies depending on the timing of treatment. In conclusion, our findings support the hypothesis that locally administered morphine may alleviate pain under inflammatory conditions and underscore the importance of considering treatment timing when evaluating the analgesic effect.

Full text: PDF

The Correlation of Hemostatic Parameters with the Development of Early Sepsis-Associated Encephalopathy. A Retrospective Observational Study

DOI: 10.2478/jccm-2024-0040

Introduction: Sepsis-associated encephalopathy (SAE) is one of the most common complications seen both in early and late stages of sepsis, with a wide spectrum of clinical manifestations ranging from mild neurological dysfunction to delirium and coma. The pathophysiology of SAE is still not completely understood, and the diagnosis can be challenging especially in early stages of sepsis and in patients with subtle symptoms.
Aim of the study: The objective of this study was to assess the coagulation profile in patients with early SAE and to compare the hemostatic parameters between septic patients with and without SAE in the first 24 hours from sepsis diagnosis.
Material and methods: This retrospective observational study included 280 patients with sepsis, evaluated in the first 24 hours after sepsis diagnosis. A complete blood count was available in all patients; a complex hemostatic assessment including standard coagulation tests, plasma levels of coagulation factors, inhibitors, D-dimers, and rotational thromboelastometry (ROTEM, Instrumentation Laboratory) was performed in a subgroup of patients.
Results: Early SAE was diagnosed in 184 patients (65.7%) and was correlated with a higher platelet count after adjusting for age and leucocyte count. Compared with patients without neurological dysfunction, patients with early SAE presented a more active coagulation system, revealed by a faster propagation phase and increased clot firmness and elasticity, with a higher platelet contribution to clot strength. The initiation of coagulation and clot lysis did not differ between the groups.
Conclusion: In the early stages of sepsis, the development of SAE is correlated with increased systemic clotting activity where platelets seem to have an important role. More research is needed to investigate the role of platelets and the coagulation system in relation to the development of early SAE.

Full text: PDF

Understanding the Difficulties in Diagnosing Neonatal Sepsis: Assessing the Role of Sepsis Biomarkers

DOI: 10.2478/jccm-2024-0039

Background: Neonatal sepsis is a serious condition with high rates of morbidity and mortality, caused by the rapid growth of microorganisms that trigger a systemic reaction. Symptoms can range from mild to severe presentations. The causative microorganism is usually transmitted from the mother, especially from the urogenital tract, or can originate from the community or hospital.
Methods: Our retrospective study assessed 121 newborns, including both preterm and term infants, within the first 28 days of life, divided into three groups: early-onset sepsis (n=35), late-onset sepsis (n=39), and a control group (n=47). Blood samples and cultures were obtained upon admission or at the onset of sepsis (at 24 and 72 hours). The study aimed to evaluate the limitations of commonly used biomarkers and of newer markers such as lactate dehydrogenase and ferritin in more accurately diagnosing neonatal sepsis.
Results: Our study revealed a significant difference between the initial and final measurements of lactate dehydrogenase (LDH) and ferritin in the early-onset sepsis (EOS) and late-onset sepsis (LOS) groups.
Conclusion: Ferritin and LDH may serve as potential markers associated with the systemic response and sepsis in cases of both early- and late-onset sepsis. Monitoring these biomarkers can aid in the timely detection and management of sepsis, potentially improving patient outcomes.

Full text: PDF