Category Archives: Volume 10

Acute Calculous Cholecystitis Associated with Leptospirosis: Which is the Emergency? A Case Report and Literature Review

DOI: 10.2478/jccm-2024-0033

Introduction: Leptospirosis is a zoonotic infection with a worldwide distribution, caused by bacteria of the genus Leptospira, that can affect both humans and animals. Most cases of leptospirosis present as a mild, anicteric infection. However, a small percentage of cases develop Weil’s disease, characterized by bleeding and elevated levels of bilirubin and liver enzymes. Leptospirosis can also cause inflammation of the gallbladder: acute acalculous cholecystitis has been described as a manifestation in a small percentage of cases. However, no association between leptospirosis and acute calculous cholecystitis has been reported in the literature.
Case presentation: In this report, we describe the case of a 66-year-old patient who presented to the emergency department with a clinical picture dominated by fever, an altered general condition, abdominal pain in the right hypochondrium, nausea, and repeated vomiting. Acute calculous cholecystitis was diagnosed based on clinical, laboratory, and imaging findings. During preoperative preparation, the patient exhibited signs of liver and renal failure with severe coagulation disorders. Obstructive jaundice was excluded after performing an abdominal ultrasound and computed tomography scan. The suspicion of leptospirosis was then raised, and appropriate treatment for the infection was initiated. The acute cholecystitis symptoms went into remission, and the patient had a favorable outcome. Surgery was postponed until the infection was treated entirely, and a re-evaluation of the patient’s condition was conducted six weeks later.
Conclusions: The icterohemorrhagic form of leptospirosis, Weil’s disease, can mimic acute cholecystitis, including the form with gallstones. Therefore, to ensure an accurate diagnosis, leptospirosis should be suspected if the patient has risk factors. However, the order of treatments is not strictly established and will depend on the clinical picture and the patient’s prognosis.

Full text: PDF

Understanding the Correlation between Blood Profile and the Duration of Hospitalization in Pediatric Bronchopneumonia Patients: A Cross-Sectional Original Article

DOI: 10.2478/jccm-2024-0031

Introduction: Pediatric bronchopneumonia is a prevalent, life-threatening disease, particularly in developing countries. Affordable and accessible blood biomarkers are needed to predict disease severity, which can be gauged by the Duration of Hospitalization (DOH).
Aim of the Study: To assess the significance of and correlation between differential blood profiles, especially the Neutrophil-Lymphocyte Ratio (NLR), and the DOH in children with bronchopneumonia.
Material and Methods: A record-based study was conducted at a secondary care hospital in Indonesia. After due ethical approval, and following inclusion and exclusion criteria, 284 children with a confirmed diagnosis of bronchopneumonia were included in the study. Blood cell counts and ratios were assessed against the DOH as the main criterion of severity. The Mann-Whitney test and correlation coefficients were used for the analysis.
Results: Study samples were grouped into DOH of ≤ 4 days and > 4 days, focusing on NLR values, neutrophils, lymphocytes, and leukocytes. The median NLR was higher (3.98) in patients hospitalized over 4 days (P<0.0001). Median lymphocyte counts were significantly higher in the ≤ 4 days group (P<0.0001). Median thrombocyte counts were similar in both groups (P=0.44481). Overall, NLR and the DOH were weakly positively correlated; total neutrophils and the DOH showed a moderate positive correlation, and total lymphocytes and the DOH a moderate negative correlation. The correlations between each biomarker and the DOH were stronger in the ≤ 4 days group, except for leukocytes and thrombocytes. Analysis of the longer DOH group did not yield significant correlations across the blood counts.
Conclusions: Admission levels of leukocyte count, neutrophil count, lymphocyte count, and NLR significantly correlate with the DOH, with NLR predicting severity and positively correlating with the DOH.
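Since the NLR is the study’s key biomarker, a minimal sketch of how it is computed may be useful; the counts below are hypothetical, chosen only to reproduce the reported median of 3.98:

```python
def nlr(neutrophils: float, lymphocytes: float) -> float:
    """Neutrophil-Lymphocyte Ratio: absolute neutrophil count divided by
    absolute lymphocyte count (both in the same units, e.g. cells/uL)."""
    if lymphocytes <= 0:
        raise ValueError("lymphocyte count must be positive")
    return neutrophils / lymphocytes

# Hypothetical admission counts (cells/uL); values are illustrative only.
print(nlr(7960, 2000))  # matches the 3.98 median reported for the >4-day group
```

The ratio is derived from a routine complete blood count, which is what makes it an affordable and accessible severity marker in resource-limited settings.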

Full text: PDF

Rate of Sodium Correction and Osmotic Demyelination Syndrome in Severe Hyponatremia: A Meta-Analysis

DOI: 10.2478/jccm-2024-0030

Introduction: Current guidelines recommend limiting the rate of correction in patients with severe hyponatremia to avoid severe neurologic complications such as osmotic demyelination syndrome (ODS). However, published data have been conflicting. We aimed to evaluate the association between rapid sodium correction and ODS in patients with severe hyponatremia.
Materials and methods: We searched PubMed, Embase, Scopus, Web of Science, and Cochrane Central Register of Controlled Trials from inception to November 2023. The primary outcome was ODS and the secondary outcomes were in-hospital mortality and length of hospital stay.
Results: We identified 7 cohort studies involving 6,032 adult patients with severe hyponatremia. Twenty-nine patients developed ODS, resulting in an incidence rate of 0.48%. Seventeen patients (61%) had a rapid correction of serum sodium in the first or any 24-hour period of admission. Compared with a limited rate of sodium correction, a rapid rate of sodium correction was associated with an increased risk of ODS (RR, 3.91 [95% CI, 1.17 to 13.04]; I2 = 44.47%; p = 0.03). However, a rapid rate of sodium correction reduced the risk of in-hospital mortality by approximately 50% (RR, 0.51 [95% CI, 0.39 to 0.66]; I2 = 0.11%; p < 0.001) and the length of stay by 1.3 days (Mean difference, -1.32 [95% CI, -2.54 to -0.10]; I2 = 71.47%; p = 0.03).
Conclusions: Rapid correction of serum sodium may increase the risk of ODS among patients hospitalized with severe hyponatremia. However, ODS may occur in patients regardless of the rate of serum sodium correction.
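The pooled risk ratios above come from standard inverse-variance meta-analysis; a minimal sketch of random-effects pooling of log risk ratios (DerSimonian-Laird method), using made-up study values rather than this paper’s data:

```python
import math

def pool_log_rr(log_rrs, variances):
    """Random-effects pooling of per-study log risk ratios (DerSimonian-Laird).
    Returns (pooled RR, 95% CI lower, 95% CI upper). Inputs are each study's
    log(RR) and its variance; the numbers passed below are illustrative."""
    w = [1 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))  # heterogeneity
    df = len(log_rrs) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_re = [1 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, log_rrs)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
    return math.exp(pooled), math.exp(lo), math.exp(hi)

# Three hypothetical studies: log(RR) and variance of log(RR)
rr, lo, hi = pool_log_rr([0.9, 1.5, 1.1], [0.30, 0.45, 0.25])
```

Pooling on the log scale keeps the RR multiplicative structure symmetric before exponentiating back; when the heterogeneity statistic Q falls below its degrees of freedom, the between-study variance is truncated at zero and the estimate coincides with the fixed-effect one.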

Full text: PDF

Intensive Care Fundamentals in Romania. A Critical Step in Romanian Intensive Care Education

DOI: 10.2478/jccm-2024-0029

Any doctor working in the field of Intensive Care Medicine (ICM) remembers their first placement in the Intensive Care Unit (ICU). Most would remember that time as being characterized by trepidation, confusion and worry. In order to mitigate these negative experiences, various institutions run ad hoc induction and orientation programs. The European Society of Intensive Care Medicine (ESICM) has recently developed the Intensive Care Fundamentals (ICF) course to standardize induction and orientation to ICU for this group of doctors throughout Europe and beyond. Now, the ICF has arrived in Romania.

Full text: PDF

A Comparative Analysis of the Effects of Haloperidol and Dexmedetomidine on QTc Interval Prolongation during Delirium Treatment in Intensive Care Units

DOI: 10.2478/jccm-2024-0027

Background: Haloperidol and dexmedetomidine are used to treat delirium in the intensive care unit (ICU). The effects of these drugs on the corrected QT (QTc) interval have not been compared before. This study aimed to compare the effects of haloperidol and dexmedetomidine on QTc intervals in patients who developed delirium during ICU follow-up.
Method: The study is single-center, randomized, and prospective. Half of the patients diagnosed with delirium in the ICU were treated with haloperidol and the other half with dexmedetomidine. The QTc interval was measured in the treatment groups before and after drug treatment. The study’s primary endpoints were maximal QT and QTc interval changes after drug administration.
Results: 90 patients were included in the study; the mean age was 75.2±12.9 years, and half were women. The mean time to delirium onset was 142±173.8 hours, and 53.3% of the patients died during their ICU follow-up. The most common reason for ICU admission was sepsis (37.8%). There was no significant change in QT and QTc intervals after dexmedetomidine treatment (QT: 360.5±81.7 vs. 352.0±67.0, p=0.491; QTc: 409.4±63.1 vs. 409.8±49.7, p=0.974). There was a significant increase in both QT and QTc intervals after haloperidol treatment (QT: 363.2±51.1 vs. 384.6±59.2, p=0.028; QTc: 409.4±50.9 vs. 427.3±45.9, p=0.020).
Conclusions: Based on the results obtained from the study, it can be concluded that the administration of haloperidol was associated with a significant increase in QT and QTc interval. In contrast, the administration of dexmedetomidine did not cause a significant change in QT and QTc interval.

Full text: PDF

Development of a Machine Learning-Based Model for Predicting the Incidence of Peripheral Intravenous Catheter-Associated Phlebitis

DOI: 10.2478/jccm-2024-0028

Introduction: Early and accurate identification of high-risk patients with peripheral intravascular catheter (PIVC)-related phlebitis is vital to prevent medical device-related complications.
Aim of the study: This study aimed to develop and validate a machine learning-based model for predicting the incidence of PIVC-related phlebitis in critically ill patients.
Materials and methods: Four machine learning models were created using data from patients ≥ 18 years with a newly inserted PIVC during intensive care unit admission. Models were developed and validated using a 7:3 split. Random survival forest (RSF) was used to create predictive models for time-to-event outcomes. Logistic regression with least absolute shrinkage and selection operator (LASSO), random forest (RF), and gradient boosting decision tree were used to develop predictive models that treat the outcome as a binary variable. Cox proportional hazards (COX) and logistic regression (LR) were used as comparators for time-to-event and binary outcomes, respectively.
Results: The final cohort had 3429 PIVCs, which were divided into the development cohort (2400 PIVCs) and validation cohort (1029 PIVCs). The c-statistic (95% confidence interval) of the models in the validation cohort for discrimination were as follows: RSF, 0.689 (0.627–0.750); LASSO, 0.664 (0.610–0.717); RF, 0.699 (0.645–0.753); gradient boosting tree, 0.699 (0.647–0.750); COX, 0.516 (0.454–0.578); and LR, 0.633 (0.575–0.691). No significant difference was observed among the c-statistic of the four models for binary outcome. However, RSF had a higher c-statistic than COX. The important predictive factors in RSF included inserted site, catheter material, age, and nicardipine, whereas those in RF included catheter dwell duration, nicardipine, and age.
Conclusions: The RSF model for the survival time analysis of phlebitis occurrence showed relatively high prediction performance compared with the COX model. No significant differences in prediction performance were observed among the models with phlebitis occurrence as the binary outcome.
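The c-statistic used to compare these models is, for a binary outcome, the probability that a randomly chosen case with phlebitis receives a higher predicted risk than a randomly chosen case without it. A self-contained sketch of that pairwise definition (labels and scores below are illustrative, not the study’s data):

```python
def c_statistic(labels, scores):
    """Concordance (c) statistic for a binary outcome: the fraction of
    event/non-event pairs in which the event case received the higher
    score. Tied scores count as 0.5. O(n^2) sketch for clarity."""
    pairs = concordant = 0.0
    for yi, si in zip(labels, scores):
        for yj, sj in zip(labels, scores):
            if yi == 1 and yj == 0:          # one event, one non-event
                pairs += 1
                if si > sj:
                    concordant += 1
                elif si == sj:
                    concordant += 0.5
    return concordant / pairs

# Hypothetical phlebitis labels (1 = occurred) and model risk scores
y = [1, 0, 1, 0, 0]
p = [0.8, 0.3, 0.4, 0.5, 0.2]
print(c_statistic(y, p))
```

A value of 0.5 means the model discriminates no better than chance, which is why the COX comparator’s 0.516 above indicates essentially no discrimination, while the tree-based models’ ~0.70 indicates modest discrimination.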

Full text: PDF

Uncommon Malposition of an Ultrasound-Guided Central Venous Catheter in the Renal Vein through the Superficial Femoral Vein: A Case Report

DOI: 10.2478/jccm-2024-0026

Introduction: Malposition is a relatively rare complication associated with peripherally inserted central catheters (PICCs), particularly in cases of superficial femoral vein (SFV) catheterization. To the best of our knowledge, we are the first to report this rare case of SFV PICC malposition in the contralateral renal vein.
Case presentation: An 82-year-old woman underwent bedside ultrasound-guided cannulation of the SFV for PICC placement. Subsequent radiographic examination revealed an unexpected misplacement, with the catheter tip directed toward the contralateral renal vein. After the catheter was withdrawn on the basis of the X-ray findings, it was observed to retain its function.
Conclusion: Although rare, tip misplacement should be considered in SFV PICC placement. Prompt correction of the tip position is crucial to prevent catheter malfunction and further catastrophic consequences. For critical patients receiving bedside SFV PICC insertion, a post-procedural X-ray is crucial for enhancing safety.

Full text: PDF

Evaluation of the Efficiency of the Newly Developed Needle in Emergency Room: A Single-Center Observational Study

DOI: 10.2478/jccm-2024-0025

Aim of the study: Peripheral intravascular catheter (PIVC) insertion is frequently performed in the emergency room (ER), and initial PIVC insertion often fails. To reduce these failures, new needles have been developed. This study aimed to investigate whether the use of a newly developed needle reduced the failure of initial PIVC insertion in the ER compared with the existing needle.
Material and methods: This single-centre, prospective observational study was conducted in Japan between April 1, 2022, and February 2, 2023. We included consecutive patients who visited our hospital by ambulance as a secondary emergency on a weekday during the day shift (from 8:00 AM to 5:00 PM). The practitioners performing PIVC insertion and the assessors were independent. The primary and secondary outcomes were the failure of initial PIVC insertion and the number of procedures, respectively. We defined failure as difficulty in titration, leakage, or hematoma within 30 s after insertion. To evaluate the association between the outcomes and the use of newly developed needles, we performed multivariate logistic regression and multiple regression analyses, adjusting for covariates.
Results: In total, 522 patients without missing data were analysed, and 81 (15.5%) patients showed failure of initial PIVC insertion. The median number of procedures (interquartile range) was 1 (1–1). Multivariate logistic regression analysis revealed no significant association between the use of newly developed PIVCs and the failure of initial PIVC insertion (odds ratio, 0.79; 95% confidence interval, [0.48 to 1.31]; p = 0.36). Moreover, multiple regression analysis revealed no significant association between the use of newly developed PIVCs and the number of procedures (regression coefficient, -0.0042; 95% confidence interval, [-0.065 to 0.056]; p = 0.89).
Conclusions: Our study did not show a difference between the two types of needles with respect to the failure of initial PIVC insertion and the number of procedures.

Full text: PDF

Managing Multifactorial Deep Vein Thrombosis in an Adolescent: A Complex Case Report

DOI: 10.2478/jccm-2024-0024

Introduction: Although rarely diagnosed in the pediatric population, deep vein thrombosis (DVT) has a growing incidence and is continuously acquiring new nuances as the range of risk factors widens and lifestyles change in children and adolescents.
Case presentation: A 17-year-old female was admitted to our clinic within four weeks of childbirth with a six-month history of pain in the left hypochondriac region. After a thorough evaluation, a benign splenic cyst was identified and later surgically removed. Following the intervention, the patient developed secondary thrombocytosis and a bloodstream infection which, together with pre-existing risk factors (obesity, the compressive effect of the large cyst, the postpartum period, the presence of a central venous catheter, recent surgery, and post-operative mobilization difficulties), led to extensive DVT despite anticoagulant prophylaxis and therapy with low-molecular-weight heparin.
Conclusions: DVT raises many challenges for the pediatrician, requiring a personalized approach. Although rare, pediatric patients with multiple concomitant high-risk factors should benefit from interdisciplinary care as DVT may not respond to standard therapy in such cases and rapidly become critical. Continual efforts to better understand and treat this condition will contribute to improved outcomes for pediatric patients affected by DVT.

Full text: PDF

Challenges of the Regional Anesthetic Techniques in Intensive Care Units – A Narrative Review

DOI: 10.2478/jccm-2024-0023

Effective pain management is vital for critically ill patients, particularly post-surgery or trauma, as it can mitigate the stress response and positively influence morbidity and mortality rates. The suboptimal treatment of pain in Intensive Care Unit (ICU) patients is often due to a lack of education, apprehensions about side effects, and improper use of medications. Hence, the engagement of pain management and anesthesiology experts is often necessary.
While opioids have been traditionally used in pain management, their side effects make them less appealing. Local anesthetics, typically used for anesthesia and analgesia in surgical procedures, have carved out a unique and crucial role in managing pain and other conditions in critically ill patients. This work aims to offer a comprehensive overview of the role, advantages, challenges, and evolving practices related to the use of local anesthetics in ICUs. The ability to administer local anesthetics continuously makes them a suitable choice for controlling pain in the upper and lower extremities, with fewer side effects.
Epidural analgesia is likely the most widely used regional analgesic technique in the ICU setting. It is primarily indicated for major abdominal and thoracic surgeries, trauma, and oncology patients. However, it has contraindications and complications, so its use must be carefully weighed. Critically ill patients present numerous challenges, including renal and hepatic failure, sepsis, uremia, and the use of anticoagulation therapy, all of which affect the use of regional anesthesia for pain management. Appropriate timing and indication are crucial to maximizing the benefits of these methods.
The advent of new technologies, such as ultrasonography, has improved the safety and effectiveness of neuraxial and peripheral nerve blocks, making them feasible options even for heavily sedated patients in ICUs.

Full text: PDF