Category Archives: Original Research

Evaluation of muscle strength and renal function in survivors of severe COVID-19: A 12-month follow-up study

DOI: 10.2478/jccm-2026-0022

Introduction: Severe COVID-19 is known to cause kidney injury via ACE2-mediated mechanisms, inflammation, and microvascular damage, potentially leading to long-term renal impairment. Critically ill patients are particularly vulnerable to muscle loss and sarcopenia due to immobility, poor nutrition, and cytokine storm–induced catabolism. Post-COVID-19 syndrome often includes fatigue, muscle weakness, and reduced quality of life, yet evidence on long-term outcomes remains limited. This study evaluated kidney function, sarcopenia risk, and quality of life 12 months after intensive care unit (ICU) discharge in patients without pre-existing chronic kidney disease (CKD).
Methods: This retrospective observational cohort included 82 patients without CKD admitted to the ICU between February 2020 and April 2022 who recovered from severe COVID-19. Data collected included serum creatinine, estimated glomerular filtration rate (eGFR), and sarcopenia risk assessed via the SARC-CalF (SARC-F combined with calf circumference). Functional outcomes were assessed by SF-36, pain by a Visual Analog Scale (VAS), and lower limb strength by the 30-second sit-to-stand test.
Results: The mean age was 52 ± 12 years; 90% were male, 46% had hypertension, and 31% diabetes. At 12 months, patients showed low functional scores (SF-36: 47 ± 21), high pain prevalence (VAS: 57%), reduced lower limb strength (sit-to-stand: 8 ± 5 repetitions), and high sarcopenia risk (SARC-F: 46%). Higher sarcopenia scores correlated with poorer physical functioning (r = -0.60; p < 0.001) and greater pain (r = -0.44; p < 0.001). In 49 patients without hypertension, diabetes, or prior acute kidney injury (AKI), creatinine significantly increased (0.95 ± 0.2 to 1.10 ± 0.2 mg/dL; p = 0.007) and eGFR significantly declined (87 ± 22 to 77 ± 18 mL/min; p = 0.001), representing a mean reduction of 10 mL/min.
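As an illustration of the two kinds of analysis summarized above (a correlation between sarcopenia scores and physical functioning, and a paired comparison of baseline versus 12-month creatinine), a minimal sketch on synthetic data might look as follows; the choice of Spearman correlation and a paired t-test, as well as all values and variable names, are assumptions, since the abstract does not state which procedures were used.

```python
# Illustrative sketch only: synthetic data and test choices are assumptions,
# not the authors' actual analysis pipeline.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 82  # cohort size reported in the abstract

# Hypothetical per-patient data
sarc_f = rng.integers(0, 11, size=n)                                 # SARC-F style score (0-10)
sf36_physical = np.clip(80 - 5 * sarc_f + rng.normal(0, 10, n), 0, 100)
creatinine_baseline = rng.normal(0.95, 0.2, size=n)                  # mg/dL
creatinine_12mo = creatinine_baseline + rng.normal(0.15, 0.15, size=n)

# Correlation between sarcopenia score and physical functioning
rho, p_rho = stats.spearmanr(sarc_f, sf36_physical)

# Paired comparison of creatinine at baseline vs. 12 months
t, p_paired = stats.ttest_rel(creatinine_baseline, creatinine_12mo)

print(f"Spearman rho = {rho:.2f} (p = {p_rho:.3g})")
print(f"Paired t-test for creatinine change: p = {p_paired:.3g}")
```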
Conclusion: Critically ill COVID-19 survivors experienced significant declines in kidney function, muscle strength, and functional capacity, alongside increased pain 12 months post-ICU discharge. These results underscore the need for multidisciplinary follow-up, incorporating nephrology, physiotherapy, and nutritional support.

Full text: PDF

Predictive value of NLMR, PLR, and ferritin in relation to SOFA, APACHE II, and SAPS II in sepsis patients

DOI: 10.2478/jccm-2026-0029

Introduction: Sepsis remains one of the deadliest pathologies in intensive care units. It involves a dysregulated immune response, with a hyperinflammatory phase that overlaps with a subsequent hypoinflammatory phase.
Aim of the study: To ease the burden on medical systems, this study aimed to assess the value of clinical severity scores (the Sequential Organ Failure Assessment (SOFA), the Acute Physiology and Chronic Health Evaluation II (APACHE II) score, and the Simplified Acute Physiology Score II (SAPS II)) and inflammatory biomarkers (the neutrophil-to-lymphocyte-to-monocyte ratio (NLMR), the platelet-to-lymphocyte ratio (PLR), carboxyhemoglobin (COHb), and ferritin) in predicting outcomes of critically ill intensive care unit (ICU) patients.
Material and methods: This prospective, observational study included 101 critically ill patients, for whom we assessed the parameters on the first and fifth days after confirmation of either sepsis or septic shock in the ICU, according to the Sepsis-3 Consensus.
Results: Severity scores showed significant correlations on both day 1 and day 5 across all groups. APACHE II and SAPS II correlated with ferritin on day 5 in sepsis, septic shock, and non-survivors. The severity scores correlated with COHb on day 5 in survivors, and on day 1 in non-survivors. NLMR and PLR correlated consistently across groups, with additional associations between these ratios, ferritin, and COHb, particularly in non-survivors. Regarding mortality, NLMR on day 1 showed only modest predictive value, which declined to non-significant by day 5. In contrast, the SOFA, APACHE II, and SAPS II scores demonstrated good discriminatory ability on both days, confirming their strong and reliable performance in predicting mortality.
Conclusions: The study shows that simple cellular ratios and severity scores correlate with ferritin, COHb, and each other, reflecting inflammation, oxidative stress, and organ dysfunction in sepsis. Because these markers are inexpensive and easy to monitor, they may enhance bedside risk stratification, though broader prospective studies are still required.

Full text: PDF

The effect of oral protein and carbohydrate solution administration on NLR, IL-6 and CRP levels in patients undergoing surgery

DOI: 10.2478/jccm-2026-0028

Aims: To determine the effect of administering oral protein and carbohydrate solutions on C-Reactive Protein (CRP), Interleukin-6 (IL-6), and Neutrophil-Lymphocyte Ratio (NLR) levels in patients scheduled for surgery.
Methods: This double-blind, randomized clinical study was conducted at Ulin Regional Hospital, Banjarmasin, in patients scheduled for surgery. A total of 80 patients were enrolled (40 in the control group and 40 in the intervention group). Before surgery, the intervention group received 200 mL of an oral protein and carbohydrate solution, while the control group received a placebo. Twenty-four hours after surgery, each subject’s CRP, IL-6, and NLR levels were measured. Data were analyzed with the Statistical Package for the Social Sciences, Version 29.
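For context, the NLR is simply the ratio of the absolute neutrophil count to the absolute lymphocyte count, and skewed markers reported below as medians with interquartile ranges are typically compared with a nonparametric test. The sketch below illustrates both steps on hypothetical values; the abstract does not specify which statistical tests were actually used, so the Mann-Whitney U test is an assumption here.

```python
# Minimal sketch of how the inflammatory markers above are derived and compared.
# The example counts, IL-6 values, and the Mann-Whitney U test are assumptions
# for illustration only.
import numpy as np
from scipy import stats

def nlr(absolute_neutrophils: float, absolute_lymphocytes: float) -> float:
    """Neutrophil-to-lymphocyte ratio from absolute counts (cells/µL)."""
    return absolute_neutrophils / absolute_lymphocytes

print(nlr(7800, 900))  # e.g. 7800 neutrophils/µL, 900 lymphocytes/µL -> NLR ≈ 8.7

# Hypothetical 24-hour postoperative IL-6 values (pg/mL) for the two arms
il6_intervention = np.array([9.5, 6.0, 22.7, 12.1, 8.4])
il6_control = np.array([20.1, 11.6, 50.1, 35.2, 18.9])
u, p = stats.mannwhitneyu(il6_intervention, il6_control, alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, p = {p:.3f}")
```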
Results: NLR at 24 hours postoperatively did not differ significantly between the intervention and control groups (8.65±4.33 vs. 7.86±4.65, p=0.308). The IL-6 level at 24 hours postoperatively was significantly lower in the intervention group than in the control group (9.49 (6.03-22.65) vs. 20.08 (11.64-50.11), p=0.011). The CRP level at 24 hours postoperatively was lower in the intervention group than in the control group, although the difference was not statistically significant (15.10 (7.20-41.60) vs. 34.70 (11.87-71.55), p=0.056). There was no difference in postoperative nausea or vomiting between the two groups.
Conclusion: Preoperative administration of an oral protein and carbohydrate solution reduced postoperative interleukin-6 levels in patients undergoing surgery, whereas no significant reduction was observed in NLR or CRP levels.

Full text: PDF

Early clinical outcomes and quality of life assessment after HeartMate 3 implantation: A single-centre descriptive study

DOI: 10.2478/jccm-2026-0025

Background: Advanced heart failure remains a leading cause of morbidity and mortality worldwide, with limited access to transplantation, particularly in Eastern Europe. Left ventricular assist device therapy offers improved survival and quality of life for end-stage disease. HeartMate 3, a modern form of mechanical circulatory support, is used as bridge-to-transplant or destination therapy. Despite increasing global experience with the HeartMate 3, clinical data from Romanian centers remain scarce. This study aimed to assess early clinical outcomes, postoperative complications, and quality of life after HeartMate 3 implantation at a single cardiovascular surgery center in Romania.
Materials and methods: We conducted a retrospective observational cohort study including 13 patients with advanced heart failure who underwent HeartMate 3 implantation between September 2023 and March 2025. Preoperative variables (demographics, comorbidities, INTERMACS profile, EuroSCORE II, echocardiographic findings) and postoperative outcomes were analyzed. Quality of life was assessed in 10 surviving patients using a 16-item structured telephone questionnaire addressing physical function, autonomy, social reintegration, and psychological well-being.
Results: The cohort was predominantly male (84.6%), with a mean age of 47.2 ± 12.3 years. The implant strategy was bridge-to-transplant in 76.9% of patients and destination therapy in 23.1%. Early mortality was 23.1% (n = 3), occurring primarily in patients with EuroSCORE II >8% and INTERMACS profiles I–II. The most frequent postoperative complications were significant perioperative bleeding (61.5%) and right ventricular failure (23.1%). Among survivors, all reported improved mobility, greater independence in activities of daily living, and better social reintegration; 7/10 rated overall quality of life as good or excellent. Psychological distress was frequent early after surgery but showed progressive improvement over time.
Conclusions: HeartMate 3 implantation resulted in favorable early clinical outcomes and significant improvements in quality of life, aligning with international data. Optimizing outcomes with left ventricular assist device therapy relies on timely referral, rigorous patient selection, and comprehensive postoperative management, including psychological and social support.

Full text: PDF

Impact of creatinine measurement methods on eGFR and GFR category assignment

DOI: 10.2478/jccm-2026-0023

Background: Accurate measurement of serum creatinine (SCr) is critical in estimating glomerular filtration rate (eGFR) and classifying kidney function. This study evaluated the analytical differences between the enzymatic and Jaffe methods for SCr measurement and their impact on eGFR estimation using two widely applied equations: CKD-EPI and EKFC.
Methods: The study included 427 patients over 40 years old. SCr was measured using both enzymatic and Jaffe methods on the Alinity c platform. eGFR was calculated with the CKD-EPI (2009) and EKFC equations. Agreement between methods was assessed using Bland-Altman and Passing-Bablok regression. eGFR differences were analyzed using the Wilcoxon signed-rank test and multiple linear regression. Agreement in GFR category classification was evaluated using weighted kappa and Kendall’s tau.
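For reference, the sketch below shows the CKD-EPI 2009 creatinine equation and the basic Bland-Altman statistics (bias and 95% limits of agreement) on hypothetical paired measurements; the EKFC equation and the Passing-Bablok regression used in the study are not reproduced here, and all numeric values are illustrative.

```python
# Minimal sketch of the CKD-EPI 2009 creatinine equation and Bland-Altman agreement
# statistics. Values are illustrative; the EKFC equation is not reproduced here.
import numpy as np

def ckd_epi_2009(scr_mg_dl: float, age: float, female: bool, black: bool = False) -> float:
    """eGFR in mL/min/1.73 m^2 from serum creatinine (mg/dL), CKD-EPI 2009 equation."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:  # race coefficient of the original 2009 equation; omitted by some laboratories
        egfr *= 1.159
    return egfr

# Same samples measured by the two creatinine methods (hypothetical values, mg/dL)
scr_enzymatic = np.array([0.82, 1.05, 1.30, 0.95])
scr_jaffe = np.array([0.88, 1.10, 1.38, 1.01])

# Bland-Altman statistics: bias and 95% limits of agreement
diff = scr_jaffe - scr_enzymatic
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
print(f"Bias = {bias:.3f} mg/dL, limits of agreement = {loa[0]:.3f} to {loa[1]:.3f} mg/dL")

print(ckd_epi_2009(1.05, age=62, female=True))
```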
Results: While the mean difference between methods was small, both systematic and proportional biases were statistically significant. eGFR values differed significantly between methods in both sexes (p < 0.01), regardless of the equation used. ΔeGFR was significantly associated with SCr values, but not with age. Although overall agreement in GFR categories was high (kappa > 0.91), method-dependent reclassification of patients was observed, which may influence CKD diagnosis and clinical decision-making.
Conclusions: Even minor analytical differences between enzymatic and Jaffe SCr measurements can lead to clinically relevant discrepancies in GFR categorization. These findings highlight the need for harmonization in laboratory methods to ensure consistent reporting and patient management.

Full text: PDF

Prognosis prediction by urinary liver-type fatty acid-binding protein in patients in the intensive care unit admitted from the emergency department: A single-center, historical cohort study

DOI: 10.2478/jccm-2026-0026

Introduction: Early risk stratification of critically ill patients is essential for optimizing intensive care unit (ICU) resource allocation and treatment decisions. Urinary liver-type fatty acid-binding protein (L-FABP) is a simple, noninvasive biomarker that may provide real-time information on organ dysfunction. However, its prognostic utility in patients admitted to the ICU from the emergency department remains unclear.
Aim of the study: The aim of this study was to evaluate the prognostic value of L-FABP levels measured shortly after ICU admission in predicting 28-day mortality among patients admitted from the emergency department.
Methods: This single-center retrospective observational study included patients admitted to the ICU between December 2020 and August 2022. Urinary L-FABP concentrations were measured at ICU admission (T0) and 3 hours later (T3). The primary outcome was 28-day in-hospital mortality. Prognostic performance was assessed using receiver operating characteristic curves and Cox proportional hazards models with inverse probability of treatment weighting. Results were compared with the Acute Physiology and Chronic Health Evaluation (APACHE) II score, the Sequential Organ Failure Assessment (SOFA) score, and lactate levels.
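As a minimal illustration of the discrimination analysis described above, the sketch below compares the AUC of a biomarker with that of a severity score for 28-day mortality on synthetic data; the inverse-probability-weighted Cox model is not reproduced, and all values and variable names are assumptions.

```python
# Illustrative sketch of a discrimination comparison: biomarker vs. severity score.
# Data are synthetic; this is not the study's dataset or analysis code.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 118  # number of patients in the final analysis
died_28d = rng.integers(0, 2, size=n)

# Hypothetical predictor values, loosely higher in non-survivors
lfabp_t3 = rng.lognormal(mean=3.0 + 1.2 * died_28d, sigma=0.8, size=n)
apache_ii = rng.normal(18 + 6 * died_28d, 5, size=n)

print("AUC, urinary L-FABP at T3:", round(roc_auc_score(died_28d, lfabp_t3), 3))
print("AUC, APACHE II:", round(roc_auc_score(died_28d, apache_ii), 3))
```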
Results: Data from 118 patients were included in the final analysis. Urinary L-FABP levels at T3 showed the highest area under the curve (AUC) for predicting 28-day mortality (AUC = 0.873), compared with APACHE II (AUC = 0.801), SOFA (AUC = 0.753), and lactate (AUC = 0.734). An elevated L-FABP level at T3 was independently associated with increased mortality (hazard ratio [HR] = 8.60, 95% confidence interval [CI]: 1.02–72.64, P = 0.047). The T3/T0 ratio showed only modest predictive value (AUC = 0.623).
Conclusions: Urinary L-FABP levels measured 3 hours after ICU admission were an independent predictor of short-term mortality. The marker’s simplicity and bedside applicability suggest its potential utility not only in ICUs but also in emergency departments and triage decision-making.

Full text: PDF

Predictive ability of malnutrition screening tools in enterally fed, mechanically ventilated patients with phase angle inference: A prospective observational study

DOI: 10.2478/jccm-2026-0027

Background: The prognostic abilities of malnutrition assessment tools for the critically ill are still controversial. This study aimed to assess the ability of the MUST, NRS-2002, and NUTRIC tools to predict malnutrition risk in enterally fed, mechanically ventilated patients in intensive care.
Methods: In a multicenter, prospective, observational study, patients from five ICUs in Jordan were observed at two stages. During the first 24 hours of admission, MUST, NUTRIC, and NRS-2002 scores were obtained in addition to demographic and admission characteristics. In the assessment stage, from day 6 of admission onward, bioelectrical impedance analysis (BIA), including body composition and phase angle (PhA), was performed. Machine learning (ML), structural equation modeling (SEM), and the area under the curve (AUC) were used to derive malnutrition estimates.
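The sketch below illustrates, on synthetic data, how a screening tool can be evaluated against a PhA-based reference in terms of sensitivity, specificity, and AUC; the PhA cutoff, tool scoring, and cutpoints used here are hypothetical and are not taken from the study.

```python
# Minimal sketch of evaluating a screening tool against a phase-angle reference.
# The PhA cutoff, tool scores, and thresholds are hypothetical assumptions.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 709  # observed patients

phase_angle = rng.normal(5.0, 1.2, size=n)                 # degrees, from BIA
malnourished = (phase_angle < 4.5).astype(int)              # hypothetical PhA-based reference

# Hypothetical screening-tool score that rises as phase angle falls
must_score = np.clip(np.round(6 - phase_angle + rng.normal(0, 0.7, size=n)), 0, 6)
must_high_risk = (must_score >= 2).astype(int)

tp = np.sum((must_high_risk == 1) & (malnourished == 1))
fn = np.sum((must_high_risk == 0) & (malnourished == 1))
tn = np.sum((must_high_risk == 0) & (malnourished == 0))
fp = np.sum((must_high_risk == 1) & (malnourished == 0))

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
auc = roc_auc_score(malnourished, must_score)
print(f"Sensitivity {sensitivity:.1%}, specificity {specificity:.1%}, AUC {auc:.2f}")
```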
Results: A total of 709 patients were observed. At admission, NUTRIC, MUST, and NRS-2002 were congruent in identifying high malnutrition risk (45.1%, 46.4%, and 53.6%, respectively). Relative to the PhA reference, MUST and NRS-2002 scored higher for sensitivity (77.6% and 76.8%, respectively) and specificity (93.3% and 90.7%, respectively). They showed acceptable correlation estimates by SEM (0.65 and 0.70, respectively) and ML (0.90 and 0.91, respectively). Furthermore, MUST discriminated malnutrition best, followed by NRS-2002 and NUTRIC (AUC = 0.76, 0.64, and 0.53, respectively).
Conclusions: Alongside validated BIA technology, MUST and NRS-2002 functioned as reliable prognostic indicators of malnutrition risk in the ICU.

Full text: PDF

Percutaneous tracheostomy in Costa Rican intensive care units: Multicenter epidemiology, practices, and immediate complications

DOI: 10.2478/jccm-2026-0020

Introduction: This study describes the epidemiological and clinical profile of ICU patients undergoing percutaneous tracheostomy in Costa Rica and identifies predictors of acute complications, addressing ongoing debates on timing, technique, and risk stratification.
Methods: We performed a prospective multicenter cohort study in eight hospitals of the Caja Costarricense de Seguro Social (CCSS) (2019–2022), including adult ICU patients undergoing percutaneous tracheostomy. Demographic, clinical, and procedural data were collected, and multivariable logistic regression was used to identify predictors of complications.
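As a minimal sketch of how odds ratios such as those reported below are derived, the example under stated assumptions fits a multivariable logistic regression on synthetic data and exponentiates the coefficients; the predictors, coding, and data are illustrative and do not represent the authors' model.

```python
# Minimal sketch: odds ratios from a multivariable logistic regression (OR = exp(beta)).
# Synthetic data and predictors; not the authors' model specification.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 516  # patients analyzed

df = pd.DataFrame({
    "anticoagulation": rng.integers(0, 2, size=n),
    "obesity": rng.integers(0, 2, size=n),
    "prior_neck_surgery": rng.integers(0, 2, size=n),
})
# Hypothetical outcome generated so that the risk factors raise the odds of a complication
linear_predictor = (-1.5 + 1.0 * df["anticoagulation"]
                    + 0.7 * df["obesity"] + 1.2 * df["prior_neck_surgery"])
df["complication"] = rng.binomial(1, 1 / (1 + np.exp(-linear_predictor)))

X = sm.add_constant(df[["anticoagulation", "obesity", "prior_neck_surgery"]])
model = sm.Logit(df["complication"], X).fit(disp=False)

odds_ratios = np.exp(model.params)      # OR = exp(beta)
conf_int = np.exp(model.conf_int())     # 95% CI on the OR scale
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```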
Results: A total of 516 patients were analyzed (mean age 53.2 ± 16.3 years; 68.2% male). The main indications were anticipated prolonged ventilation (32.4%), neurological deficits (26.7%), and ventilation >10 days (21.8%). The Ciaglia and Griggs techniques were used in 51.0% and 48.3% of cases, respectively. Capnography was applied in 74.2%, ultrasound in 17.7%, and bronchoscopy in 3.1%. First-pass success was achieved in 85.1%. Acute complications occurred in 28.3% of patients, predominantly minor bleeding (25.4%), while serious complications (airway loss, false passage, or bleeding requiring surgery) were rare (3.9%). No procedure-related deaths were observed. Independent predictors of complications included anticoagulation (OR 2.82), obesity (OR 2.10), coagulopathy (OR 2.29), prior neck surgery (OR 3.49), cervical immobilization (OR 4.68), and technical difficulty (OR 4.15 for any complication; OR 2.00 for serious complications). Airway management by physicians, compared with respiratory therapists, was also associated with higher risk (OR 1.52).
Conclusions: Percutaneous tracheostomy was feasible in multiple ICUs of the CCSS with complication rates comparable to international cohorts. Risk factors for complications included anticoagulation and prior neck surgery. Wider adoption of adjunctive monitoring tools and structured multidisciplinary training may further enhance procedural safety. These findings should be interpreted in the context of an observational design and a broad definition of complications.

Full text: PDF

Comparing length of stay between adult patients admitted to the intensive care unit with alcohol withdrawal syndrome treated with phenobarbital versus lorazepam

DOI: 10.2478/jccm-2026-0018

Objective: To review the use of phenobarbital and lorazepam for the treatment of alcohol withdrawal syndrome (AWS) in the ICU, compare length of stay (LOS), examine medication use trends, implement provider training, and evaluate outcomes post-training.
Methods: Design – retrospective observational, quality improvement study. Setting – tertiary care hospital with 36 ICU beds. Patients – adults admitted to the ICU and placed on the Clinical Institute Withdrawal Assessment (CIWA) protocol. Patients with epilepsy were excluded.
Results: During the 34-month baseline period, 713 patients with alcohol use disorder (AUD) on the CIWA protocol, without epilepsy, were admitted to the ICU. Of these, 189 were treated with phenobarbital, 460 received only lorazepam, and 64 received neither medication. All but 2 of the patients who received phenobarbital also received lorazepam. Compared with patients who received phenobarbital, lorazepam-only patients had a shorter ICU LOS (p<0.001; 95% CI -2.36 to -1.32) but higher mortality (13.91% vs. 4.76%; p=0.0008). We then developed and provided a training (which we refer to as an “intervention”) to all ICU providers encouraging consistent use of phenobarbital in the ICU when appropriate. During the 3-month post-intervention period, 44 patients with AUD on the CIWA protocol were admitted to the ICU. Of these 44 patients, 26 received phenobarbital, 12 received only lorazepam, and 6 received neither medication. Significantly more patients were treated with only phenobarbital (57.69%) compared with baseline (1.06%). Compared with patients treated with phenobarbital, patients treated with only lorazepam had a significantly higher mortality rate (33.33% vs. 7.69%, p=0.04).
Conclusions: We found significant variability in the use of phenobarbital and lorazepam for the treatment of AWS in the ICU. After a quality improvement training for ICU physicians, there was less frequent concurrent use of benzodiazepines and barbiturates, no difference in ICU or hospital LOS, and a significantly lower mortality rate for those treated with phenobarbital.

Full text: PDF

Influence of different short peripheral cannula materials on the incidence of phlebitis in intensive care units: A post-hoc analysis of the AMOR-VENUS study

DOI: 10.2478/jccm-2026-0014

Aim of the study: Short peripheral cannula (SPC)-related phlebitis occurs in 7.5% of critically ill patients, and mechanical irritation from cannula materials is a risk factor. Softer polyurethane cannulas reportedly reduce phlebitis, but the incidence of phlebitis may vary depending on the type of polyurethane. Differences in cannula stiffness may also affect the incidence of phlebitis; however, this relationship is not well understood. This study analyzed intensive care unit (ICU) patient data to compare the incidence of phlebitis across different cannula products, focusing on polyurethane.
Material and Methods: This is a post-hoc analysis of the AMOR-VENUS study, which involved 23 ICUs in Japan. We included patients aged ≥ 18 years who were admitted to the ICU with SPCs. The primary outcome was phlebitis, evaluated using hazard ratios (HRs) and 95% confidence intervals (CIs). Based on market share and differences in synthesis, polyurethanes were categorized into PEU-Vialon® (BD, USA), SuperCath® (Medikit, Japan), and other polyurethanes; non-polyurethane materials were also analyzed. Multivariable marginal Cox regression analysis was performed using other polyurethanes as the reference.
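A minimal sketch of a marginal Cox model of this kind is shown below, using the lifelines library with patient-level clustered (robust) standard errors; the data, covariate coding, and software choice are illustrative assumptions rather than the authors' actual analysis.

```python
# Minimal sketch of a marginal Cox model for cannula-level phlebitis risk, with
# robust standard errors clustered by patient. Data and covariate coding are
# synthetic; this is not the authors' analysis code.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n_cannulas = 300

df = pd.DataFrame({
    "patient_id": rng.integers(0, 120, size=n_cannulas),        # several SPCs per patient
    "material_supercath": rng.integers(0, 2, size=n_cannulas),  # 1 = SuperCath, 0 = reference
    "dwell_days": rng.exponential(3.0, size=n_cannulas),        # time to phlebitis or removal
    "phlebitis": rng.integers(0, 2, size=n_cannulas),           # event indicator
})

cph = CoxPHFitter()
cph.fit(df, duration_col="dwell_days", event_col="phlebitis",
        cluster_col="patient_id",          # marginal model: robust SEs by patient
        formula="material_supercath")
cph.print_summary()                        # hazard ratio = exp(coef) with 95% CI
```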
Results: In total, 1,355 patients and 3,429 SPCs were evaluated. Among polyurethane cannulas, 1,087 (33.5%) were PEU-Vialon®, 702 (21.6%) were SuperCath®, and 276 (8.5%) were other polyurethanes. Among non-polyurethane cannulas, 1,292 (39.8%) were ethylene tetrafluoroethylene (ETFE) cannulas, and 72 (2.2%) used other materials. The highest incidence of phlebitis was observed with SuperCath® (13.1%). Multivariate analysis revealed an HR of 1.45 (95% CI 0.75-2.8, p = 0.21) for PEU-Vialon®, 2.60 (95% CI 1.35-5.00, p < 0.01) for SuperCath®, 2.29 (95% CI 1.19-4.42, p = 0.01) for ETFE, and 2.2 (95% CI 0.46-10.59, p = 0.32) for others.
Conclusions: The incidence of phlebitis varied among polyurethane cannulas. Further research is warranted to determine the causes of these differences.

Full text: PDF