ABSTRACT
Objective
This study aimed to compare the diagnostic performance and cost-effectiveness of classical culture, chromogenic agar, and the Smart-Cycle I-CORE real-time polymerase chain reaction (PCR) method for detecting vancomycin-resistant enterococci (VRE) in intensive care unit (ICU) patient and environmental samples.
Methods
In a prospective surveillance design conducted in adult medical, surgical, and general ICUs, perianal/rectal swab samples from patients hospitalized ≥48 hours and high-touch environmental surface samples were obtained. Each specimen was tested in parallel using classical culture, chromogenic agar, and real-time PCR targeting vanA/vanB. Sensitivity, specificity, positive and negative predictive values (PPV/NPV), turnaround time, and per-test costs were calculated.
Results
In patient samples, PCR, chromogenic agar, and culture achieved sensitivity/specificity of 100%/100%, 100%/80%, and 95.2%/86.4%, respectively. In environmental samples, PCR showed 100%/100%, chromogenic agar 100%/92%, and culture 94%/100%, respectively. PCR provided a markedly shorter time-to-result (~4 h), compared with chromogenic agar (~24 h) and classical culture (~48 h). vanA was the predominant genotype (≈82%), followed by vanB (≈18%). Although PCR was the most costly method, its rapid turnaround time contributed to earlier isolation decisions and to a reduction in environmental positivity rates.
Conclusion
Smart-Cycle I-CORE PCR offers the highest diagnostic accuracy and the fastest reporting among currently available surveillance methods, while chromogenic agar represents a reliable and cost-effective option. A two-step strategy—chromogenic agar for routine screening with PCR confirmation—balances accuracy, speed, and cost in ICU VRE surveillance.
INTRODUCTION
Vancomycin-resistant enterococci (VRE) have emerged as important healthcare-associated pathogens since their first description in the late 1980s. Their ability to acquire and transfer resistance determinants, survive for prolonged periods on inanimate surfaces, and rapidly disseminate under the selective pressure of broad-spectrum antibiotic use has led to endemic circulation particularly in intensive care units (ICUs) (1-3).
Enterococci are Gram-positive cocci commonly found in the gastrointestinal microbiota. Enterococcus faecalis (E. faecalis) and Enterococcus faecium (E. faecium) represent the most clinically relevant species and may persist on medical equipment and environmental surfaces, thereby contributing to nosocomial transmission. Of particular concern is the increasing vancomycin resistance observed predominantly among E. faecium isolates in recent years, posing a significant public health challenge worldwide (4, 5).
Rapid and accurate detection of VRE colonization is essential for timely implementation of infection control measures, including patient isolation and environmental decontamination, to prevent hospital outbreaks. Conventional culture-based methods remain widely available and relatively inexpensive; however, they require prolonged incubation periods that may delay infection control interventions. Chromogenic agar has been introduced as a more rapid screening method allowing presumptive identification of VRE colonies based on color differentiation, although its diagnostic performance may vary depending on laboratory conditions. In contrast, molecular assays such as real-time polymerase chain reaction (PCR) targeting resistance genes including vanA and vanB provide high sensitivity and specificity and can substantially reduce diagnostic turnaround time (6, 7).
Although the molecular epidemiology of VRE has evolved over the past decade, the fundamental diagnostic strategies used for VRE detection—culture-based screening, chromogenic agar, and PCR—remain central components of infection control programs in many healthcare settings. The data used in the present study were obtained from a prospective surveillance program conducted in the ICUs of a tertiary-care university hospital. Therefore, the primary aim of this study was to compare the diagnostic performance, turnaround time, and relative cost of classical culture, chromogenic agar, and Smart-Cycle I-CORE real-time PCR for detecting VRE in patient and environmental samples, to provide evidence supporting the optimization of surveillance strategies in critical care environments.
MATERIALS AND METHODS
Study Design and Ethical Approval
This prospective diagnostic validation study was conducted in the ICUs of a tertiary-care university hospital to compare classical culture, chromogenic agar, and Smart-Cycle I-CORE real-time PCR for VRE surveillance. The study protocol was approved by the Mersin University Clinical Research Ethics Committee (approval number: 2008/91, date: 17/10/2008) and was carried out in accordance with the Declaration of Helsinki. Written informed consent was waived because the study did not include identifiable patient data.
Setting and Sample Collection
The study was performed in the medical, surgical, and general ICUs of Mersin University Medical Faculty Hospital. Adult patients hospitalized for ≥48 hours were eligible. During the study period, 210 patient samples and 185 environmental samples were collected. Environmental swabs were obtained from high-touch surfaces in rooms of VRE-positive or suspected patients (e.g., bed rails, monitor keypads, nightstand surfaces).
Specimen Sampling and Transport
Patient samples consisted of perianal or rectal specimens collected with sterile flocked swabs and transported in appropriate media according to the manufacturer's recommendations. Environmental sampling was performed with sterile swabs applied to frequently touched surfaces under similar transport conditions. Initial screening was conducted 48 hours after ICU admission. Weekly follow-up samples were obtained from VRE-positive patients. All specimens were processed within two hours of receipt.
Diagnostic Procedures
Classical culture, chromogenic agar culture, and real-time PCR were performed simultaneously on each specimen.
Classical Culture: Enterococcus-selective media (e.g., Enterococcosel/BEA) and 6.5% NaCl media were inoculated and incubated at 37 °C for 24–48 h. Identification was performed using Gram staining, catalase testing, and PYR testing. Vancomycin susceptibility was assessed phenotypically according to CLSI criteria.
Chromogenic Agar: ChromID VRE chromogenic agar (bioMérieux, France) was inoculated and incubated at 37 °C for 24 h. Color differentiation provided presumptive identification (pink–red colonies suggestive of E. faecium; blue–green colonies suggestive of E. faecalis). Presumptive VRE isolates were confirmed using molecular testing.
Real-Time PCR: Real-time PCR was performed using the Smart-Cycle I-CORE Real-Time PCR System (Cepheid, USA) to detect the vanA and vanB genes. DNA extraction and amplification followed the manufacturer's instructions. A cycle threshold (Ct) ≤35 was interpreted as positive. Positive and negative controls were included in each run.
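As an illustration only, the positivity rule above (Ct ≤35 interpreted as positive) can be sketched as a small decision function. The function name and the handling of absent amplification are hypothetical and are shown purely to make the cutoff logic explicit; they do not reflect the instrument's actual software.

```python
def interpret_ct(ct, cutoff=35.0):
    """Hypothetical sketch of the study's Ct positivity rule.

    A valid amplification with Ct <= cutoff is reported as positive;
    ct=None models no amplification signal within the run.
    """
    if ct is None:
        return "negative"      # no amplification detected
    if ct <= cutoff:
        return "positive"      # vanA/vanB target detected at or below cutoff
    return "negative"          # amplification beyond the cutoff cycle
```
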
Quality Control and Internal Validation
Quality control included E. faecium ATCC 51559 (vanA positive) and E. faecalis ATCC 29212 (vancomycin susceptible). At least one positive and one negative control were included per assay run. Runs with invalid controls were repeated. Daily calibration and weekly maintenance of instruments were documented.
Cost Analysis
Direct diagnostic costs were calculated per test based on institutional cost components, including reagents, consumables, personnel time (processing, analysis, reporting), and instrument utilization (energy and service allocation). Costs were expressed in Turkish Lira (₺). All diagnostic methods were compared using identical unit prices.
Statistical Analysis
Statistical analysis was performed using SPSS version 25 (IBM Corp., USA). χ² or Fisher’s exact tests were applied to compare categorical variables. Agreement between methods was assessed using Pearson correlation analysis. Diagnostic performance metrics (sensitivity, specificity, PPV, NPV) were calculated using standard definitions. A p-value <0.05 was considered statistically significant.
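The standard definitions referred to above can be made concrete with a short sketch. The 2×2 counts below are hypothetical and serve only to illustrate the formulas; they are not study data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from a 2x2 table
    (index test vs. reference standard), using the standard definitions."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among reference-positives
        "specificity": tn / (tn + fp),  # true negatives among reference-negatives
        "ppv": tp / (tp + fp),          # reference-positives among test-positives
        "npv": tn / (tn + fn),          # reference-negatives among test-negatives
    }

# Hypothetical counts for illustration only:
m = diagnostic_metrics(tp=20, fp=2, fn=1, tn=18)
print({k: round(v, 3) for k, v in m.items()})
```
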
RESULTS
Between May and October 2010, a total of 420 ICU patients were screened for VRE colonization using perianal swab specimens. Of these, 268 (63.8%) were male and 152 (36.2%) were female. VRE colonization was detected in 21 patients (5.0%), including 11 males (52.4%) and 10 females (47.6%). The mean age of colonized patients was 56.1 ± 5.07 years (range, 30–80); it was 61.8 ± 12.3 years in males and 49.0 ± 30.8 years in females.
Intensive Care Unit Distribution and Clinical Characteristics
Among all screened patients, 146 (34.7%) were admitted to medical ICUs, 137 (32.6%) to surgical ICUs, and 137 (32.6%) to general ICUs. Of the 21 VRE-positive cases, 3 (14.3%) were identified in medical ICUs, 9 (42.9%) in surgical ICUs, and 9 (42.9%) in general ICUs.
Chronic comorbidities were present in 18 patients (85.7%), while 3 patients (14.3%) had none; 7 patients (33.3%) had at least two comorbidities. Diabetes mellitus was the most frequent underlying disease (33.3%), followed by solid organ malignancies (23.8%) and cardiovascular diseases (19.0%).
Total parenteral nutrition was administered to 18 patients (85.7%), and enteral nutrition was used concomitantly in 14 patients (66.7%). Previous surgery was recorded in 16 patients (76.2%), most commonly gastrointestinal procedures (42.8%), followed by genitourinary procedures (19.0%), cranial procedures (9.5%), and thoracic procedures (4.0%).
Central venous catheterization was documented in 20 patients (95.2%) at the time of VRE detection. Diarrhea was present in 6 patients (28.6%). Immunosuppression was observed in 9 patients (42.9%).
Within the preceding three months, 10 patients (47.6%) had been hospitalized and 4 (19.0%) had stayed in an ICU. The mean ICU length of stay among VRE-positive patients was 17.4 ± 11.0 days (range, 6–50) (Table 1).
Antibiotic Use
All VRE-positive patients were receiving parenteral antimicrobial therapy at the time of detection. The most frequently administered agents were carbapenems (19.0%), carbapenem plus glycopeptide combinations (23.8%), ampicillin–sulbactam (9.5%), carbapenem plus aminoglycoside combinations (9.5%), and ampicillin–sulbactam plus metronidazole (9.5%); glycopeptide monotherapy was used in 9.5% of cases. Among the seven patients receiving glycopeptides, six (85.7%) were treated with teicoplanin and one (14.3%) with vancomycin. Antimicrobial treatment indications included bacteremia (n = 8), urinary tract infection, and other suspected or confirmed infections.
E-test results showed that all isolates exhibited high-level vancomycin resistance (MIC >256 µg/mL), while teicoplanin resistance was observed in 75% of isolates (MIC >16 µg/mL).
Environmental Samples
Among 113 environmental samples obtained from rooms of VRE-positive patients, 21 (18.3%) yielded VRE. Environmental colonization persisted during follow-up in 14 patients and lasted for a mean of 7 ± 5.4 days (range, 7–21). No environmental contamination was detected at the end of the study. The most frequently contaminated surfaces were bed rails (n = 9), nightstands (n = 5), and monitor/pump surfaces (n = 5), followed by clinical carts (n = 2).
Only one patient with perianal VRE colonization developed clinical infection; therefore, separate risk factor analyses for colonization versus infection could not be performed. Environmental positivity was not significantly associated with the duration of colonization (p = 0.6). The infection types listed in Table 1 (e.g., pneumonia, surgical site infection, bacteremia, and urinary tract infection) represent the primary clinical diagnoses leading to ICU admission, rather than infections caused by VRE.
Species Distribution and Antimicrobial Susceptibility
Of the 21 VRE isolates, 81% were E. faecium and 19% were E. faecalis (Table 2). All isolates demonstrated high-level gentamicin resistance.
Diagnostic Performance of the Methods
In environmental samples, the turnaround time of PCR was 1 hour, compared with 38.8 ± 6.6 hours for Enterococcosel agar and 24.5 ± 5.9 hours for chromogenic agar. No significant difference was observed between Enterococcosel agar and chromogenic agar for environmental specimens (p > 0.05).
In patient samples, the turnaround time of Enterococcosel agar was 60 ± 4.0 hours, whereas chromogenic agar required a significantly shorter duration of 26.8 ± 3.2 hours (p = 0.038).
When PCR was considered the reference standard, patient samples yielded sensitivity, specificity, PPV, and NPV of 100% each for PCR; 95.2%, 84.6%, 92%, and 96%, respectively, for Enterococcosel agar; and 100%, 80%, 94%, and 100%, respectively, for chromogenic agar (Table 3).
In environmental samples, with PCR as the reference standard, Enterococcosel agar showed a sensitivity of 94%, specificity of 100%, PPV of 100%, and NPV of 80%, whereas chromogenic agar showed a sensitivity of 100%, specificity of 92%, PPV of 87%, and NPV of 100% (Table 4).
Cost Analysis
The average direct diagnostic cost per test was calculated to be 181 ₺ for PCR, 113.15 ₺ for chromogenic agar, and 182.4 ₺ for classical culture. Although PCR had a higher direct cost per test than chromogenic agar, it provided substantially faster results. This shorter turnaround time may facilitate earlier infection control interventions such as patient isolation and environmental decontamination. From a practical perspective, chromogenic agar may serve as an effective screening method due to its lower cost, whereas PCR may be used as a confirmatory test when rapid and highly accurate detection is required.
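The trade-off described above can be summarized with the per-test figures reported in this section. The turnaround times used below are the approximate values reported elsewhere in this article, and the comparison logic is illustrative only, not part of the study's cost model.

```python
# Per-test direct cost (TRY) from the Cost Analysis section and
# approximate turnaround times (hours) from the Results/Abstract.
methods = {
    "PCR":               {"cost_try": 181.0,  "tat_h": 4.0},
    "chromogenic agar":  {"cost_try": 113.15, "tat_h": 24.0},
    "classical culture": {"cost_try": 182.4,  "tat_h": 48.0},
}

cheapest = min(methods, key=lambda m: methods[m]["cost_try"])
fastest = min(methods, key=lambda m: methods[m]["tat_h"])
print(f"cheapest: {cheapest}, fastest: {fastest}")
```

This simple ranking mirrors the two-step strategy proposed in the Conclusion: the cheapest method serves as the screening test and the fastest as the confirmatory test.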
DISCUSSION
This study compared the diagnostic performance, turnaround time, and cost of three commonly used modalities—classical culture, chromogenic agar, and Smart-Cycle I-CORE real-time PCR—for detecting VRE colonization in ICUs. Our findings demonstrated that PCR achieved the highest diagnostic accuracy, whereas chromogenic agar emerged as a practical and cost-effective alternative for routine surveillance due to its ease of interpretation and lower implementation costs.
An important consideration when interpreting the findings of this study is the time period during which the data were collected. Although the surveillance was conducted in 2010, the diagnostic approaches evaluated in this study—classical culture, chromogenic agar, and PCR targeting the vanA/vanB genes—remain fundamental tools in current clinical microbiology laboratories. Therefore, the results primarily reflect differences in diagnostic performance and turnaround time rather than temporal changes in VRE epidemiology.
E. faecium and E. faecalis are well-recognized causes of healthcare-associated infections, particularly among critically ill patients with risk factors such as broad-spectrum antibiotic exposure, invasive procedures, and immunosuppression (5-8). The predominance of VRE in the medical and surgical ICUs of our cohort is consistent with existing epidemiological trends, and the higher frequency of E. faecium isolates is in line with increasingly reported resistance profiles in the literature.
Classical culture remains widely accessible and inexpensive; however, its prolonged incubation period, often exceeding 48 hours, may delay the implementation of infection control interventions. Chromogenic agar offers more rapid presumptive identification based on color differentiation and can be used as an efficient rule-out tool given its high negative predictive value (9, 10). In the present study, chromogenic agar performed adequately as a screening method, representing a feasible option for large-scale surveillance.
Real-time PCR provides the major advantage of directly detecting the vanA and vanB genes, enabling rapid isolation measures. Previous studies have shown that molecular-based algorithms reduce diagnostic turnaround time from 24–48 hours to 3–5 hours compared with conventional culture, contributing to earlier interruption of the transmission chain (11-13). In the current study, PCR consistently returned results within 4 hours, facilitating patient isolation within approximately 6 hours, thereby supporting timely infection prevention efforts.
Environmental persistence of VRE on dry surfaces for prolonged durations highlights the importance of environmental decontamination in prevention strategies (14-16). In our investigation, the environmental positivity rate decreased from 6.7% to 1.2% after the implementation of environmental and isolation precautions, indicating the effectiveness of targeted cleaning interventions.
Although PCR was more expensive than chromogenic agar, it provided substantially faster results (17-19). Consequently, a two-step diagnostic approach—initial screening by chromogenic agar followed by PCR confirmation—may offer an optimal balance between cost and diagnostic reliability, depending on laboratory capacity and workload.
The present findings align with the 2023 guideline by the Turkish Ministry of Health on the prevention of VRE, which emphasizes early diagnosis, effective isolation, and meticulous environmental decontamination as key components in controlling endemic VRE transmission.
Study Limitations
This study has several limitations. First, it was conducted in a single tertiary-care center with a relatively small number of isolates, which may limit the generalizability of the findings. Second, the differentiation between colonization and infection could not be fully evaluated because of the small number of infection cases. Third, molecular characterization was limited to detection of vanA and vanB, without further genotypic analysis. Finally, the dataset was collected in 2010; although the fundamental diagnostic methods remain unchanged, the molecular epidemiology of VRE may have evolved over time. Nevertheless, our results provide important insights into the applicability and cost-effectiveness of molecular diagnostic methods in ICU settings in Türkiye. Larger multicenter prospective studies are warranted to validate these findings, better define the epidemiological characteristics of VRE circulation in intensive care settings, and guide national infection control strategies.
CONCLUSION
This study presents a comparative analysis of three diagnostic methods used for the detection of VRE in ICUs. Our findings indicate that the Smart-Cycle I-CORE PCR system provides the highest diagnostic accuracy and the shortest turnaround time, whereas chromogenic agar represents an appropriate and practical option for active surveillance due to its lower cost and ease of use.
Early detection, which enables the rapid implementation of isolation measures, offers a major advantage in preventing nosocomial transmission. Given the high mortality risk predominantly associated with E. faecium strains, rapid diagnosis and appropriate isolation remain essential components of infection control strategies.
Although PCR appears more expensive in the short term, it has the potential to reduce additional costs associated with delayed isolation in the long run. Therefore, a two-step diagnostic strategy—screening by chromogenic agar followed by PCR confirmation—may provide an optimal balance among speed, accuracy, and cost, depending on institutional resources.
The selection of diagnostic strategies that ensure an appropriate balance between diagnostic accuracy, rapidity, and cost-effectiveness is crucial for infection prevention, patient safety, and effective resource management in intensive care settings.