1. A Comparative Study on the Effectiveness of Atorvastatin and Atorvastatin plus Omega-3 Fatty Acid in Patients with Dyslipidemia in a Tertiary Care Hospital
K. Vairavel Prakash, S. Siddharthan, X. A. Prasanna
Abstract
Introduction: Dyslipidemia is a major modifiable risk factor for cardiovascular disease and is characterized by abnormalities in lipid metabolism. Omega-3 fatty acids have demonstrated triglyceride-lowering properties and potential cardioprotective effects and may provide additional benefit when used in combination with statins. Therefore, the study aimed to compare the effectiveness of atorvastatin alone and atorvastatin plus omega-3 fatty acid in improving lipid profile parameters among patients with dyslipidemia attending a tertiary care hospital.
Materials and Methods: This prospective, randomized, open-label comparative study was conducted in a tertiary care hospital and included 60 newly diagnosed dyslipidemic patients aged 18–60 years. Participants were randomly allocated into two groups: Group A received atorvastatin 10 mg daily, and Group B received atorvastatin 10 mg daily plus omega-3 fatty acids 2 g/day for 12 weeks. Lipid profile parameters were measured at baseline and at 4, 8, and 12 weeks.
Results: Baseline demographic and clinical characteristics were comparable between the two groups. Both treatment regimens resulted in significant reductions in total cholesterol, triglycerides, LDL-C, and VLDL-C and a significant rise in HDL-C over 12 weeks (p < 0.001 within groups). There was no statistically significant difference between the two groups in total cholesterol, LDL-C, or HDL-C at any time point. However, triglyceride and VLDL-C reductions were significantly greater in the combination therapy group compared with atorvastatin alone.
Conclusion: Atorvastatin combined with omega-3 fatty acids provides superior triglyceride and VLDL-C reduction compared with atorvastatin monotherapy, with similar effects on total cholesterol, LDL-C, and HDL-C and a favorable safety profile.
2. Drug Utilization Pattern for Respiratory Diseases at a Tertiary Care Hospital: Cross-Sectional Observational Study
K. Vairavel Prakash, S. Siddharthan, X. A. Prasanna
Abstract
Background: A prescription-based survey of drug utilization patterns is an effective method for assessing and evaluating physicians' prescribing practices with the aim of improving rational drug use. The incidence of respiratory diseases is increasing, and most respiratory diseases require treatment with more than one class of drug and more than one route of administration, as many patients seek immediate symptomatic relief. These factors influence prescribing habits in the pulmonary medicine department. With this in view, the study was conducted with the objectives of characterizing the types of pulmonary diseases and the drug prescribing pattern through prescription analysis.
Materials and Methods: A total of 200 outpatients and inpatients, irrespective of diagnosis, attending the pulmonary medicine department of Trichy SRM Medical College Hospital & Research Centre were enrolled. Relevant demographic data and data regarding diagnosis and treatment were collected after written informed consent.
Results: The most common diagnoses were acute exacerbation of chronic obstructive pulmonary disease (40.5%), followed by lower respiratory tract infections (LRTIs) (28%), acute exacerbation of bronchial asthma (16%), and pulmonary tuberculosis (7%). The most commonly prescribed drugs were β-agonists in inhalation form (73%), followed by methylxanthines (70% of prescriptions) and antibiotics (64.5%). Among antibiotics, co-amoxiclav was the most commonly used (48.1% of antibiotic prescriptions), followed by macrolides (28.7%).
Conclusion: Although drug use in the current study was largely rational, adherence to standard institution-based antibiotic prescribing guidelines and other standard guidelines would help standardize treatment plans and prescriptions. It is recommended that the microbiological spectrum of respiratory infections be determined so as to define an institution-specific antibiotic treatment protocol.
3. Plant-Based Diets and Their Role in Preventive Medicine: A Systematic Review of Evidence-Based Insights for Reducing Disease Risk
Anjali Verma, Bhausaheb Vasantrao Jagdale, Meet Ghumaliya
Abstract
Background: Plant-based dietary patterns have garnered substantial scientific and public health attention as potentially effective strategies for chronic disease prevention. The accumulating evidence base warrants systematic synthesis to inform clinical practice and public health recommendations regarding plant-based diets in preventive medicine.
Methods: Systematic searches were conducted in PubMed, Scopus, Embase, and the Cochrane Library for studies published from January 2000 to December 2024. Prospective cohort studies, randomized controlled trials, and meta-analyses examining plant-based diets (vegetarian, vegan, or predominantly plant-based) in adult populations were included. Outcomes of interest encompassed disease incidence, mortality, and cardiometabolic biomarkers. Quality assessment utilized the Newcastle-Ottawa Scale for observational studies and the Cochrane Risk of Bias tool for trials. The review adhered to PRISMA guidelines.
Results: Forty-two studies comprising over 1.2 million participants were included. Consistent evidence demonstrated significant associations between plant-based dietary patterns and reduced cardiovascular disease risk (15-32% reduction), lower type 2 diabetes incidence (20-35% reduction), and decreased all-cause mortality (12-25% reduction). Moderate evidence supported cancer risk reduction, particularly for gastrointestinal malignancies. Interventional studies demonstrated significant improvements in body weight, glycemic control, and lipid profiles.
Conclusion: Plant-based dietary patterns are associated with substantial reductions in chronic disease risk and represent an evidence-based approach to preventive medicine. Healthcare providers should consider recommending appropriately planned plant-based diets as part of comprehensive disease prevention strategies.
4. Microbiological Profile of Bronchoalveolar Lavage Samples from Lower Respiratory Tract Infection Patients
S. Viji, N. Subathra, S. Kalaivani
Abstract
Background: A bronchoalveolar lavage (BAL) sample is the fluid specimen obtained by bronchoalveolar washing during bronchoscopy to diagnose various lung pathologies. This study aimed to isolate organisms from BAL samples and determine their antibiotic sensitivity patterns to guide treatment of infected patients.
Materials and Methods: This prospective observational study was conducted in the Department of Microbiology, Government Medical College, Namakkal, over one year and involved 75 BAL samples from patients with lower respiratory tract infections. BAL samples were obtained via bronchoscopy and processed as per standard laboratory guidelines.
Results: Among 75 BAL samples processed, 39 (52%) were culture positive for bacterial growth. The most common organisms isolated were Pseudomonas aeruginosa 19 (48%), Acinetobacter baumannii 10 (25%), Klebsiella pneumoniae 9 (23%), and Burkholderia species 1 (2%). Klebsiella pneumoniae showed high sensitivity to piperacillin-tazobactam 8 (88%), meropenem 8 (88%), and cefotaxime-sulbactam 7 (77%). Pseudomonas aeruginosa was susceptible to ceftazidime-avibactam 15 (78%), meropenem 16 (84%), and piperacillin-tazobactam 16 (84%). Acinetobacter species were susceptible to ceftazidime-avibactam 3 (75%), meropenem 3 (75%), and piperacillin-tazobactam 3 (75%). Among the 19 isolates of Pseudomonas aeruginosa, three were multidrug-resistant organisms (MDROs), and one among the ten Acinetobacter isolates was an MDRO.
Conclusion: BAL sample culture is more useful in diagnosing lung infections compared to sputum culture, where normal flora may overgrow pathogens. Determining antibiotic sensitivity patterns aids clinicians in selecting appropriate antibiotics.
5. Comparison of Hemodynamic Effects of Intravenous and Intranasal Dexmedetomidine in ENT Surgery Patients
Sarpatwar Sailesh, Boini Chiranjeevi, Valishetti Manoj Kumar
Abstract
Introduction: Dexmedetomidine is widely used as a premedication because of its sympatholytic and sedative properties, which help attenuate the hemodynamic response to laryngoscopy and endotracheal intubation. While intravenous dexmedetomidine provides rapid and predictable effects, intranasal administration has emerged as a non-invasive alternative with good bioavailability. The present study aimed to compare the hemodynamic effects of intravenous and intranasal dexmedetomidine in patients undergoing ENT surgeries under general anesthesia.
Materials and Methods: This prospective comparative study was conducted at Government Medical College, Mancherial, from January 2024 to June 2025. A total of 100 adult patients (ASA I–II) undergoing elective ENT surgeries under general anesthesia were randomly allocated into two groups (n = 50 each). Group I received intravenous dexmedetomidine 1 µg/kg over 10 minutes, and Group II received intranasal dexmedetomidine 1 µg/kg 40 minutes before induction. Heart rate (HR) and mean arterial pressure (MAP) were recorded at baseline; at 10, 20, 30, and 40 minutes after drug administration; at induction; and at 1, 2, 4, 5, 7, and 10 minutes after intubation.
Results: Baseline variables were comparable between the groups. HR and MAP decreased progressively in both groups after drug administration. From 20 to 40 minutes and at induction, the IV dexmedetomidine group demonstrated significantly lower HR and MAP compared with the intranasal group (p < 0.05). Following intubation, transient increases in HR and MAP were observed in both groups, but values were consistently lower in the IV group, with significant differences at 7 and 10 minutes for MAP.
Conclusion: Both intravenous and intranasal dexmedetomidine effectively attenuate peri-intubation hemodynamic responses. However, intravenous dexmedetomidine provides faster and more pronounced early control of heart rate and mean arterial pressure, while intranasal dexmedetomidine offers a safe, non-invasive alternative when adequate premedication time is available.
6. Comparison of the Postoperative Analgesic Effect of Levobupivacaine and Its Combination with Dexamethasone under Ultrasound-Guided Modified Pectoralis II Block in Patients Undergoing Modified Radical Mastectomy
Taje Lusi, Shobha Ujwal, Nidhi Jain
Abstract
Background: Effective postoperative analgesia is essential in patients undergoing modified radical mastectomy (MRM). Regional anesthesia techniques, such as the ultrasound-guided modified Pectoralis II (PECS II) block, have gained popularity for providing targeted analgesia while reducing opioid consumption. Levobupivacaine is a commonly used local anesthetic; however, its duration of action is limited. Dexamethasone, when used as an adjuvant, has shown promise in prolonging the effects of local anesthetics.
Aim: To compare the postoperative analgesic efficacy of levobupivacaine alone versus levobupivacaine combined with dexamethasone in PECS II block among patients undergoing MRM.
Methods: This prospective, randomized, double-blind study included 62 female patients aged 18–60 years, ASA grade I–II, scheduled for MRM under general anesthesia. Participants were randomized into two groups:
(1) Group L (n=31): received 30 mL of 0.25% levobupivacaine.
(2) Group LD (n=31): received 30 mL of 0.25% levobupivacaine + 8 mg dexamethasone.
The primary outcome was the duration of postoperative analgesia (time from block completion to first rescue analgesic). Secondary outcomes included pain scores (VAS at 1, 3, 6, 12, and 24 hours), total rescue analgesic consumption over 24 hours, hemodynamic parameters, patient satisfaction, and adverse effects.
Results: The mean duration of analgesia was significantly longer in Group LD compared to Group L (518 ± 62 vs. 310 ± 54 minutes, p < 0.001). VAS scores at 6, 12, and 24 hrs were significantly lower in Group LD (p < 0.05). Rescue analgesic requirement within 24 hrs was also reduced in Group LD (p = 0.002). No significant hemodynamic instability or adverse effects were noted in either group.
Conclusion: Addition of dexamethasone to levobupivacaine in PECS II block significantly prolongs analgesia, lowers pain scores, and reduces rescue analgesic requirement, making it an effective adjuvant for postoperative pain management in MRM patients.
7. Outcome Following Cataract Surgery in Complicated Cataract Mainly Due to Uveitis at a Tertiary Care Hospital in Eastern India
Khandkar Fariduddin
Abstract
Background and Objective: Cataract secondary to uveitis remains surgically challenging due to inflammatory sequelae and a higher risk of postoperative complications. We prospectively evaluated visual outcomes, complications, and prognostic factors after cataract surgery in uveitic eyes at a tertiary center in Eastern India.
Methods: Prospective, hospital-based observational cohort of 90 eyes (82 patients) undergoing phacoemulsification or manual small-incision cataract surgery (SICS) between July 2024 and June 2025. Eyes were quiescent ≥3 months preoperatively and followed to 12 months. Primary outcome was the proportion achieving BCVA ≥6/18 at 6 and 12 months. Secondary outcomes included change in BCVA (logMAR), postoperative complications, uveitis recurrence (Kaplan–Meier), and predictors of poor final vision (<6/18) using multivariable logistic regression.
Results: Mean age was 41.6 ± 13.2 years; 58.5% were female. Uveitis subtypes: chronic anterior 46.7%, intermediate 20.0%, panuveitis 20.0%, posterior 13.3%. Surgery: phaco 60% (n=54), SICS 40% (n=36). Mean BCVA improved from 1.48 ± 0.39 preoperatively to 0.42 ± 0.31 (1 month), 0.28 ± 0.29 (6 months), and 0.30 ± 0.34 (12 months) (repeated-measures ANOVA F(3,267)=182.5, p<0.001); 6-month mean gain 1.20 logMAR (95% CI 1.08–1.31). Proportion achieving ≥6/18 rose to 67.8% (1 month), 78.9% (6 months), 77.8% (12 months). Phaco yielded higher 6-month success than SICS (85.2% vs 69.4%, χ²=3.96, p=0.047). Complications: PCO 34.4% (higher with SICS 47.2% vs phaco 25.9%; χ²=4.91, p=0.027), CME 12.2%, secondary glaucoma 8.9%. Uveitis recurred in 17.8% by 12 months (flare-free survival 80%); longer preoperative quiescence reduced recurrence (HR 0.41, 95% CI 0.17–0.98, p=0.036). Independent predictors of poor final vision were preoperative macular pathology (OR 4.27, 95% CI 1.78–10.2, p=0.001), panuveitis/posterior uveitis (OR 3.08, 1.32–7.21, p=0.009), postoperative CME (OR 6.75, 2.07–22.0, p=0.002), and preoperative quiescence <6 months (OR 2.48, 1.01–6.09, p=0.048).
Conclusions: With stringent inflammation control and modern technique—preferably phacoemulsification with hydrophobic acrylic IOLs—most uveitic eyes achieve good functional vision at one year. Outcomes are primarily determined by macular status, uveitis subtype, and duration of preoperative quiescence. Routine macular OCT, sustained quiescence (>6 months when feasible), perioperative NSAIDs/steroids, and vigilant follow-up for PCO/CME/glaucoma are recommended.
8. Assessment of Heart Rate Variability in Young Adults with Smartphone Overuse
Madiha Mehvish, Anil Pandey, Pradeep Dayanand M.D.
Abstract
Background: Excessive smartphone use has emerged as a behavioral concern among young adults and may influence autonomic nervous system regulation. Heart rate variability (HRV) provides a noninvasive measure of cardiac autonomic function and can detect early autonomic imbalance before overt clinical disease. This study assessed HRV in young adults with smartphone overuse compared with non-overusers.
Material and Methods: A cross-sectional comparative study was conducted among 150 apparently healthy adults aged 18–25 years. Participants were categorized into smartphone overuse (n = 75) and non-overuse (n = 75) groups using the Smartphone Addiction Scale–Short Version. Short-term resting HRV was recorded under standardized conditions using a 5-minute supine protocol. Time-domain and frequency-domain HRV parameters were analyzed. Group comparisons were performed using appropriate parametric or nonparametric tests. Multivariable linear regression was used to examine independent associations between smartphone overuse and HRV indices after adjustment for potential confounders.
Results: Baseline demographic characteristics were comparable between groups; however, smartphone overusers had significantly higher resting heart rate and shorter sleep duration. Time-domain analysis showed significantly lower mean RR interval, SDNN, RMSSD, and pNN50 in the overuse group (all p < 0.01). Frequency-domain analysis demonstrated reduced total power and high-frequency power, along with a significantly higher LF/HF ratio among smartphone overusers (p < 0.01). After adjustment for age, sex, body mass index, sleep duration, physical activity, caffeine intake, and resting blood pressure, smartphone overuse remained independently associated with lower RMSSD, SDNN, and HF power, and a higher LF/HF ratio.
Conclusion: Smartphone overuse in young adults is independently associated with reduced heart rate variability, indicating diminished parasympathetic activity and altered sympathovagal balance, even in an otherwise healthy population.
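For readers unfamiliar with the time-domain HRV indices reported in the two HRV abstracts above, a minimal sketch of their standard definitions, computed from a short hypothetical series of RR intervals in milliseconds (the RR values here are illustrative, not study data):

```python
import math

def time_domain_hrv(rr_ms):
    """Standard time-domain HRV indices from a list of RR (NN) intervals in ms."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    # SDNN: sample standard deviation of all RR intervals
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / (n - 1))
    diffs = [rr_ms[i + 1] - rr_ms[i] for i in range(n - 1)]
    # RMSSD: root mean square of successive RR differences
    rmssd = math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))
    # pNN50: percentage of successive differences exceeding 50 ms
    pnn50 = 100.0 * sum(1 for d in diffs if abs(d) > 50) / len(diffs)
    return {"mean_rr": mean_rr, "sdnn": sdnn, "rmssd": rmssd, "pnn50": pnn50}

# Hypothetical excerpt of a resting RR-interval series (ms)
rr = [812, 845, 790, 860, 798, 875, 805, 840]
print(time_domain_hrv(rr))
```

Lower SDNN, RMSSD, and pNN50 values, as reported in the overuse group, reflect reduced beat-to-beat variability and diminished parasympathetic modulation.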
9. Effects of Sleep Variability Duration on Heart Rate Variability in Young Individuals
Randeep Mann, Pati Rama Devi, Kapil Khanna
Abstract
Background: Sleep patterns are increasingly recognized as important determinants of cardiovascular autonomic regulation. While total sleep duration has been widely studied, emerging evidence suggests that irregular sleep patterns may adversely affect cardiac autonomic function. Heart rate variability (HRV) provides a noninvasive index of autonomic modulation and may help elucidate the physiological impact of sleep regularity in young individuals.
Material and Methods: An analytical cross-sectional study was conducted among 110 apparently healthy young adults aged 18–30 years. Sleep patterns were assessed using a 7-day sleep diary, and sleep regularity was quantified as intra-individual variability in nightly sleep duration. Participants were stratified into tertiles of low, moderate, and high sleep variability. Short-term resting HRV was recorded using 5-minute electrocardiographic recordings under standardized conditions. Time-domain and frequency-domain HRV parameters were analyzed. Intergroup comparisons, correlation analysis, and multivariable linear regression were performed to evaluate associations between sleep variability and HRV indices after adjusting for relevant covariates.
Results: Baseline demographic, anthropometric, and blood pressure parameters were comparable across sleep variability tertiles, and mean sleep duration did not differ significantly between groups. Resting heart rate increased progressively with higher sleep variability. Individuals with greater sleep irregularity demonstrated significantly lower SDNN and RMSSD values, along with reduced high-frequency power, indicating diminished parasympathetic activity. Conversely, low-frequency power and the LF/HF ratio increased across tertiles, reflecting a shift toward sympathetic predominance. Sleep duration variability showed significant correlations with multiple HRV indices and remained independently associated with reduced parasympathetic modulation and increased sympathovagal imbalance after multivariable adjustment.
Conclusion: Greater irregularity in habitual sleep duration is associated with unfavorable cardiac autonomic modulation in healthy young adults, independent of total sleep time. Promoting consistent sleep patterns may be important for maintaining optimal autonomic balance and early cardiovascular health.
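The regularity metric described in the Methods above, intra-individual variability in nightly sleep duration over a 7-day diary, can be sketched as the within-person standard deviation of nightly hours (the sample-SD estimator and the example diaries are assumptions for illustration; the abstract does not specify the exact formulation):

```python
import statistics

def sleep_duration_variability(nightly_hours):
    """Intra-individual sleep variability: sample SD of nightly sleep
    durations (hours) across a multi-night diary."""
    if len(nightly_hours) < 2:
        raise ValueError("need at least two nights")
    return statistics.stdev(nightly_hours)

# Hypothetical 7-day diaries with similar mean duration but different regularity
regular = [7.0, 7.2, 6.9, 7.1, 7.0, 7.3, 6.8]    # consistent sleeper
irregular = [5.0, 8.5, 6.0, 9.0, 4.5, 8.0, 6.5]  # irregular sleeper
print(sleep_duration_variability(regular))
print(sleep_duration_variability(irregular))
```

Note that both example diaries average roughly seven hours per night, mirroring the study's finding that mean sleep duration did not differ across tertiles even as variability did.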
10. A Cross-Sectional Study of Etiology and Clinical Presentation of Dysphonia
Saket Gupta, Anil Pandey, Rajeev Kumar Nishad, Apurva Kaushal
Abstract
Background: Dysphonia is a common otorhinolaryngological complaint with diverse etiological factors ranging from benign inflammatory conditions to malignant laryngeal disorders. Understanding the etiological spectrum and clinical presentation of dysphonia is essential for early diagnosis and appropriate management. This study aimed to evaluate the etiology and clinical characteristics of patients presenting with dysphonia in a tertiary care setting.
Material and Methods: This hospital-based cross-sectional study was conducted in the Department of Otorhinolaryngology over an 18-month period. A total of 167 adult patients presenting with voice change of more than two weeks’ duration were included. Detailed clinical evaluation, including history, otorhinolaryngological examination, and laryngeal visualization using indirect and/or flexible fiberoptic laryngoscopy, was performed in all patients. Etiological diagnosis was established based on clinical, endoscopic, and histopathological findings where indicated. Data were analyzed using descriptive statistics, and associations between categorical variables were assessed using the Chi-square test.
Results: The majority of patients were in the 41–50-year age group (25.1%), with a marked male predominance (67.1%). Most patients presented within 1–3 months of symptom onset (34.7%). Inflammatory laryngeal lesions were the most common etiology (32.3%), followed by benign vocal fold lesions (27.5%) and malignant laryngeal lesions (18.6%). Vocal cord paralysis and functional dysphonia accounted for 12.6% and 9.0% of cases, respectively. A statistically significant association was observed between gender and etiology, with malignant lesions occurring predominantly in males (p = 0.004). Tobacco smoking (47.3%) and vocal abuse (31.1%) were the most frequently identified risk factors.
Conclusion: Dysphonia most commonly affects middle-aged males and is predominantly caused by inflammatory and benign laryngeal conditions; however, a considerable proportion of patients harbor malignant lesions. Early and systematic laryngeal evaluation is essential, particularly in high-risk individuals, to ensure timely diagnosis and management.
11. Correlation between RBC Indices and Iron Absorption Physiology in Adolescents with Nutritional Anemia
Sahaja Chiliveru, Madiha Mehvish, Kapil Khanna
Abstract
Background: Nutritional anemia remains a major health concern among adolescents, with iron deficiency being the predominant underlying factor. Red blood cell (RBC) indices are routinely available hematological parameters that may reflect alterations in iron absorption and utilization, yet their relationship with iron absorption physiology in adolescents is not well characterized.
Material and Methods: A cross-sectional analytical study was conducted among 123 adolescents aged 10–19 years diagnosed with nutritional anemia. Complete blood counts were performed to assess RBC indices, including mean corpuscular volume, mean corpuscular hemoglobin, mean corpuscular hemoglobin concentration, and red cell distribution width. Biochemical markers of iron absorption physiology, namely serum iron, total iron-binding capacity, transferrin saturation, and serum ferritin, were measured under standardized conditions. Correlations between hematological indices and biochemical parameters were evaluated using appropriate statistical tests.
Results: The mean hemoglobin concentration was 9.6 ± 1.1 g/dL, with microcytic and hypochromic indices evident. Serum iron (42.8 ± 13.6 µg/dL), transferrin saturation (10.9 ± 4.3%), and serum ferritin levels (14.2 ± 6.1 ng/mL) were reduced, while total iron-binding capacity was elevated (392.4 ± 46.9 µg/dL). Mean corpuscular volume and mean corpuscular hemoglobin showed moderate positive correlations with serum iron and transferrin saturation and negative correlations with total iron-binding capacity. Red cell distribution width demonstrated inverse correlations with serum iron, transferrin saturation, and serum ferritin. Moderate anemia was the most prevalent severity category (58.5%).
Conclusion: RBC indices show significant correlations with biochemical markers of iron absorption physiology in adolescents with nutritional anemia, indicating their potential utility as accessible indicators of impaired iron availability and utilization in this population.
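For reference, one of the iron-status markers reported above is a simple derived quantity: transferrin saturation is serum iron divided by total iron-binding capacity, expressed as a percentage. A minimal sketch, using the mean values from the Results as the worked input:

```python
def transferrin_saturation(serum_iron_ug_dl, tibc_ug_dl):
    """Transferrin saturation (%) = serum iron / TIBC x 100 (both in ug/dL)."""
    return 100.0 * serum_iron_ug_dl / tibc_ug_dl

# Mean values reported in the Results section:
# serum iron 42.8 ug/dL, TIBC 392.4 ug/dL
print(round(transferrin_saturation(42.8, 392.4), 1))  # → 10.9
```

This reproduces the reported mean transferrin saturation of 10.9%, and makes explicit why low serum iron combined with elevated TIBC yields the markedly reduced saturation characteristic of iron-deficiency states.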
12. ECG Predictors of Mortality in Acute STEMI
Ayushi Hareshbhai Mordhara, Raj K. Senjaliya, Sunnivora Abdulraheman
Abstract
Background: Early electrocardiographic findings provide rapid risk stratification in ST-elevation myocardial infarction (STEMI), yet data from rural Indian tertiary-care settings remain limited. ECG markers such as ischemia grade, rhythm disturbances, and conduction abnormalities may predict short-term outcomes. This study evaluated the prognostic value of admission ECG variables in STEMI patients in a rural Indian cohort.
Methods: This single-center observational study included 212 consecutive patients with STEMI presenting to a tertiary hospital in rural India over one year. Baseline clinical data, admission ECG characteristics, angiographic findings, and in-hospital outcomes were collected. Standard ECG definitions were applied, and ischemia severity was graded where applicable. Associations between ECG variables and in-hospital mortality were tested for statistical significance.
Results: Anterior and inferior STEMI were the most common presentations, with most patients in sinus rhythm and normal heart rate at admission. QRS abnormalities, pathological Q waves, and grade 3 ischemia were observed in a subset of patients. In-hospital mortality was significantly higher in those with anterior wall involvement, tachycardia, atrial fibrillation/flutter, conduction disturbances, pathological Q waves, and grade 3 ischemia. These ECG features showed strong associations with adverse outcomes.
Conclusion: Admission ECG parameters, particularly ischemia grade and conduction abnormalities, are valuable early predictors of in-hospital mortality in STEMI patients in rural tertiary-care settings.
13. Postoperative Surgical Site Infections: A Prospective Microbiological Profiling and Antimicrobial Resistance Analysis in Clean and Clean-Contaminated Surgeries
Mehul Panchal, Narendrabhai K. Prajapati, Shreyanshi Desai
Abstract
Background: Surgical site infections (SSIs) remain a significant cause of postoperative morbidity, prolonged hospitalization, and increased healthcare costs. Understanding the microbiological spectrum and antimicrobial resistance patterns is essential for optimizing empirical therapy and infection control strategies.
Methods: A prospective observational study was conducted over 18 months, enrolling 1,248 patients undergoing elective surgical procedures classified as clean or clean-contaminated. Patients developing SSI were identified using Centers for Disease Control and Prevention (CDC) criteria. Wound cultures were obtained, and bacterial isolates were subjected to identification and antimicrobial susceptibility testing using standard microbiological methods.
Results: The overall SSI rate was 6.7% (84/1,248), with significantly higher rates in clean-contaminated (9.2%) compared to clean procedures (4.1%, p<0.001). Gram-positive organisms predominated in clean surgeries (68.2%), while Gram-negative bacteria were more common in clean-contaminated procedures (61.8%, p=0.002). Staphylococcus aureus was the most frequent isolate overall (34.5%), with methicillin resistance observed in 42.1% of S. aureus isolates. Extended-spectrum β-lactamase (ESBL) production was detected in 38.6% of Enterobacteriaceae. Multidrug resistance was identified in 31.0% of all isolates, with significantly higher rates in clean-contaminated SSIs (38.2% vs. 22.7%, p=0.041).
Conclusion: Surgical site infections demonstrate distinct microbiological profiles based on wound classification, with concerning rates of antimicrobial resistance. These findings emphasize the need for surveillance-guided antimicrobial stewardship and procedure-specific prophylaxis protocols.
14. Immunologic Correlates of Vaccine Hesitancy in Caregivers of Infants: A Multicenter Pediatric Cohort Study
Sahnavajkhan M. Pathan, Jay Krishnajivan Shah, Yash Ashokkumar Patel
Abstract
Background: Vaccine hesitancy among caregivers represents a growing public health challenge that may compromise infant immunization coverage and subsequent immune protection. However, the direct relationship between caregiver vaccine hesitancy and infant immunologic outcomes remains inadequately characterized.
Methods: A prospective cohort of 412 caregiver-infant dyads was recruited from six pediatric centers. Caregiver vaccine hesitancy was assessed using the Parent Attitudes about Childhood Vaccines (PACV) questionnaire, with scores ≥50 indicating hesitancy. Infant serum samples were collected at 7 and 13 months of age to measure antibody responses to diphtheria, tetanus, pertussis, and measles antigens using enzyme-linked immunosorbent assays.
Results: Vaccine hesitancy was identified in 23.5% of caregivers (n=97). Infants of hesitant caregivers demonstrated significantly lower geometric mean antibody titers for all measured antigens compared to infants of non-hesitant caregivers (p<0.01). Seroprotection rates were reduced in the hesitant group for diphtheria (78.4% vs. 94.6%, p<0.001), tetanus (82.5% vs. 96.2%, p<0.001), and measles (71.1% vs. 93.0%, p<0.001). Vaccination delay (≥30 days behind schedule) was observed in 67.0% of infants with hesitant caregivers versus 12.1% in the non-hesitant group (p<0.001).
Conclusion: Caregiver vaccine hesitancy is significantly associated with suboptimal immunologic protection in infants, mediated primarily through vaccination delays and incomplete series completion. Targeted interventions addressing caregiver concerns are essential for ensuring adequate infant immunization.
15. Epidural Nalbuphine and Bupivacaine vs. Bupivacaine Alone in Infraumbilical Surgeries: A Comparative Study
Aireddy Srikanth Reddy, Sankiti Sangeetha
Abstract
Introduction and Objective: Epidural anaesthesia is a neuraxial technique in which anaesthetic agents are injected into the epidural space to block sensory and motor nerves supplying the thoracic, abdominal, pelvic, and lower limb regions. Many drugs are administered via the epidural route, most commonly local anaesthetics (lidocaine, bupivacaine, ropivacaine) and opioids (fentanyl, morphine, nalbuphine), along with adjuvants such as dexamethasone, ketamine, magnesium, midazolam, neostigmine, ziconotide, and baclofen. Few studies have directly compared epidural bupivacaine alone with bupivacaine combined with nalbuphine.
Aim: The study aimed to compare the effects of epidural nalbuphine with 0.5% bupivacaine against those of 0.5% bupivacaine alone in infraumbilical surgeries.
Methods: This prospective, randomized, comparative study included 60 ASA I–II patients undergoing elective infraumbilical surgeries. Patients were allocated into two groups: Group N received 0.5% bupivacaine with nalbuphine, and Group B received 0.5% bupivacaine alone. Outcomes assessed included onset and duration of sensory and motor blockade, postoperative analgesia using VAS scores, hemodynamic parameters, and adverse effects.
Results: The onset of sensory and motor block was comparable between groups (p > 0.05). The duration of sensory and motor blockade was significantly longer in Group N compared to Group B (p < 0.001 and p = 0.006, respectively). Postoperative analgesia was superior in Group N, with significantly lower VAS scores (p < 0.001). Hemodynamic stability was better maintained in Group N, with no incidence of hypotension, whereas 20% of patients in Group B experienced hypotension. Adverse effects were minimal in both groups.
Conclusion: The addition of nalbuphine to epidural bupivacaine significantly prolongs sensory and motor blockade, improves postoperative analgesia, and maintains stable hemodynamics with minimal side effects. Nalbuphine is a safe and effective adjuvant to bupivacaine for epidural anaesthesia in infraumbilical surgeries.
16. A Comparative Study of the Systemic Inflammatory Response and Post-operative Pain after Transabdominal Preperitoneal (TAPP) Repair versus Open Lichtenstein Hernia Repair: A Prospective Randomized Study
Mohit Sethi, Ram Prasad, Nitesh Kumar
Abstract
Background: Inguinal hernia repair is one of the most commonly performed surgical procedures worldwide. This study aimed to compare the systemic inflammatory response and postoperative pain following two tension-free methods of inguinal hernioplasty using polypropylene mesh: Transabdominal Preperitoneal Repair (TAPP) and Open Lichtenstein Hernia Repair.
Methodology: This prospective randomized study included 50 patients (25 in each group) with primary unilateral inguinal hernias. Systemic inflammatory markers (C-reactive protein, ESR, lymphocytes and neutrophils) were measured preoperatively, 24 hours postoperatively and on the 10th postoperative day. Pain assessment was performed using the Visual Analog Scale (VAS) at 24 hours, 10 days, 1 month, 3 months and 6 months postoperatively. Intraoperative and postoperative complications were also recorded and compared.
Results: Both groups showed similar demographic characteristics. TAPP patients had significantly shorter hospital stays (2.20 ± 0.40 days vs. 2.48 ± 0.51 days, p=0.03). While both procedures elicited inflammatory responses, CRP levels were significantly lower in the TAPP group by the 10th postoperative day (0.40 ± 0.31 mg/dl vs. 1.15 ± 1.21 mg/dl, p=0.004). Lymphocyte counts were significantly higher in the TAPP group on the 10th postoperative day (2563.28 ± 733.14 cells/mm³ vs. 1842.00 ± 260.45 cells/mm³, p<0.001). Pain scores at 24 hours postoperatively were significantly lower in the TAPP group (5.64 ± 0.76 vs. 7.16 ± 0.94, p<0.001), though this difference diminished by the 10th day. Both procedures had similar complication profiles with low incidence of seroma (8%) and infection (2%) and no recurrences during the follow-up period.
Conclusion: TAPP repair offers advantages over Open Lichtenstein repair in terms of reduced inflammatory response, lower immediate postoperative pain and shorter hospital stay. Both techniques demonstrate similar safety profiles and long-term pain outcomes, suggesting that surgeon expertise and patient factors should guide the choice between these effective approaches.
17. Microbiological Profile and Antifungal Susceptibility Patterns of Mucormycosis Isolates in COVID-19–Associated Cases
Mehul Panchal, Shreyanshi Desai, Dhruv Samirkumar Dave
Abstract
Background: The COVID-19 pandemic witnessed an unprecedented surge in mucormycosis cases, particularly among patients with uncontrolled diabetes mellitus and corticosteroid exposure. Understanding the microbiological spectrum and antifungal susceptibility patterns of causative agents is essential for optimizing therapeutic strategies.
Methods: This prospective observational study enrolled 156 patients with confirmed COVID-19-associated mucormycosis (CAM). Clinical specimens were processed for fungal culture, molecular identification, and antifungal susceptibility testing by the broth microdilution method following CLSI M38-A2 guidelines.
Results: Among 156 patients, culture positivity was achieved in 124 cases (79.5%). Rhizopus arrhizus was the predominant species (58.1%), followed by Rhizopus microsporus (16.9%), Mucor circinelloides (10.5%), and Lichtheimia corymbifera (8.1%). Rhino-orbital-cerebral mucormycosis was the most common presentation (82.7%). Diabetes mellitus was present in 142 patients (91.0%), with mean HbA1c of 10.8 ± 2.4%. Among antifungals tested, amphotericin B demonstrated lowest geometric mean MIC (0.38 µg/mL), followed by posaconazole (0.52 µg/mL) and isavuconazole (0.86 µg/mL). Elevated MICs to amphotericin B (≥2 µg/mL) were observed in 8.9% of isolates. All isolates showed high MICs to fluconazole (>64 µg/mL) and voriconazole (>8 µg/mL). Mortality rate was 34.6%, with significantly higher mortality in disseminated disease (71.4%) compared to localized infection (28.2%, p<0.001).
Conclusion: Rhizopus arrhizus remains the predominant etiological agent in CAM. While amphotericin B and posaconazole maintain good in vitro activity, emergence of isolates with elevated MICs warrants continued surveillance. Species-level identification and susceptibility testing are crucial for optimizing antifungal therapy.
18. Serum Fetuin-A as a Marker of Vascular Risk and Insulin Resistance in Type 2 Diabetes
Bhavesh K. Patel, Prema Ram Choudhury
Abstract
Background: Fetuin-A is a hepatokine implicated in insulin resistance and vascular dysfunction in type 2 diabetes mellitus. Its role as a biomarker linking metabolic derangement and vascular complications remains under investigation.
Objectives: To evaluate the association of serum fetuin-A levels with insulin resistance severity and vascular complications in individuals with type 2 diabetes mellitus.
Methods: A cross-sectional analytical study was conducted among 120 patients with T2DM. Serum fetuin-A, insulin resistance indices, and vascular complications were assessed and analyzed using multivariate statistical methods.
Results: Patients with vascular complications demonstrated significantly lower serum fetuin-A levels. Fetuin-A emerged as an independent predictor of vascular complications after adjusting for glycemic control, insulin resistance, and renal parameters.
Conclusion: Serum fetuin-A may serve as a valuable biomarker for vascular complications in type 2 diabetes mellitus.
19. Comparative Effectiveness of Second-Line Oral Antidiabetic Drugs Added to Metformin Monotherapy in People with Type 2 Diabetes
Anupama Arya, Alakh Ram Verma
Abstract
Background: Despite metformin being the established first-line pharmacotherapy for type 2 diabetes mellitus, many patients require additional antidiabetic agents to achieve glycemic targets. The comparative effectiveness of second-line oral antidiabetic drug classes in real-world clinical settings requires further evaluation.
Methods: A prospective observational study was conducted involving 180 patients with inadequately controlled type 2 diabetes on metformin monotherapy, who were prescribed either sulfonylurea (n=60), DPP-4 inhibitor (n=60), or SGLT-2 inhibitor (n=60) as add-on therapy. Glycemic parameters, body weight, and adverse events were assessed at baseline and after 24 weeks of treatment.
Results: All three drug classes significantly reduced HbA1c from baseline. The SGLT-2 inhibitor group demonstrated the greatest HbA1c reduction (-1.18 ± 0.42%), followed by sulfonylurea (-1.08 ± 0.48%) and DPP-4 inhibitor (-0.86 ± 0.38%) groups (p=0.001). Significant weight reduction occurred with SGLT-2 inhibitors (-2.84 ± 1.42 kg; p<0.001), while weight gain was observed with sulfonylureas (+1.62 ± 1.18 kg; p<0.001). Hypoglycemia incidence was highest with sulfonylureas (18.3% vs. 3.3% DPP-4 inhibitors vs. 5.0% SGLT-2 inhibitors; p=0.006).
Conclusion: SGLT-2 inhibitors and sulfonylureas provide superior glycemic efficacy compared to DPP-4 inhibitors when added to metformin, with SGLT-2 inhibitors offering additional weight reduction benefits and lower hypoglycemia risk.
20. Outcome Comparison of Conservative versus Surgical Management in Diabetic Foot Ulcers
Narendrabhai K. Prajapati, Jinesh B. Rathod, Nirav G. Shah
Abstract
Background: Diabetic foot ulcers (DFUs) represent a major complication of diabetes mellitus, associated with significant morbidity, mortality, and healthcare costs. The optimal management strategy—conservative versus surgical intervention—remains debated, particularly for moderate-severity ulcers where either approach may be appropriate.
Methods: This prospective comparative study enrolled 186 patients with diabetic foot ulcers (Wagner grades 2-4). Patients were categorized into conservative management group (n=92) receiving standard wound care, offloading, and antibiotics, and surgical management group (n=94) undergoing debridement with or without reconstructive procedures. Primary outcomes included complete wound healing rate and time to healing. Secondary outcomes included amputation rate, recurrence, and quality of life.
Results: Complete wound healing was achieved in 67.4% of conservative group versus 81.9% of surgical group (p=0.021). Median time to healing was significantly shorter in surgical group (8.4 ± 3.2 weeks vs. 14.6 ± 5.8 weeks, p<0.001). Major amputation rate was lower in surgical group (5.3% vs. 13.0%, p=0.048). Ulcer recurrence at 12 months was comparable between groups (18.5% vs. 15.9%, p=0.642). Multivariate analysis identified Wagner grade ≥3 (OR: 3.42, 95% CI: 1.68-6.94, p<0.001), peripheral arterial disease (OR: 2.86, 95% CI: 1.42-5.76, p=0.003), and HbA1c >9% (OR: 2.14, 95% CI: 1.12-4.08, p=0.021) as independent predictors of treatment failure.
Conclusion: Surgical management demonstrates superior healing rates, shorter healing times, and reduced major amputation rates compared to conservative treatment in moderate-to-severe diabetic foot ulcers. Early surgical intervention should be considered for appropriately selected patients.
21. Effectiveness of Universal versus Targeted Autism Screening on Diagnostic Timing in Toddlers: A Population-Based Study
Nirav G. Shah, Dave Dhruv Samirkumar, Jinesh B. Rathod
Abstract
Background: Early identification of Autism Spectrum Disorder (ASD) is crucial for timely intervention and improved developmental outcomes. However, debate persists regarding whether universal screening protocols yield superior diagnostic timing compared to targeted screening approaches based on clinical concern or risk factors.
Methods: This population-based retrospective cohort study compared diagnostic timing outcomes between universal screening (US) and targeted screening (TS) protocols across 24 pediatric primary care practices in a metropolitan healthcare network. Children born between January 2018 and December 2020 were followed until ASD diagnosis. The primary outcome was age at ASD diagnosis. Secondary outcomes included screening sensitivity, positive predictive value, and time from initial concern to diagnosis.
Results: Among 18,742 children enrolled, 312 (1.67%) received an ASD diagnosis. Children in the US cohort (n=156) received diagnosis significantly earlier than those in the TS cohort (n=156), with mean ages of 26.4 ± 8.2 months versus 34.7 ± 11.6 months (p<0.001). Universal screening demonstrated higher sensitivity (78.3% vs. 52.1%, p<0.001) with comparable positive predictive values (41.2% vs. 43.8%, p=0.62). Time from initial screening to diagnosis was reduced by 8.3 months in the US cohort (95% CI: 6.1–10.5, p<0.001). Children identified through universal screening showed increased enrollment in early intervention services before age 36 months (84.6% vs. 61.5%, p<0.001).
Conclusion: Universal autism screening significantly reduces age at diagnosis and facilitates earlier intervention enrollment compared to targeted screening approaches. Implementation of standardized universal screening protocols in pediatric primary care may substantially improve developmental trajectories for children with ASD.
22. Bacterial Uropathogens with Special Reference to Vancomycin-Resistant Enterococci and their Gastrointestinal Colonization: A Cross-Sectional Study from a Tertiary Care Center
Sowmya Nasimuddin, Fahad Affan Tajir, Shaikh Mohammad Haroon al Waseem, Kiran Madhusudhan, Giyo Selvaraj Vasanthakumari, Savitha Sambamoorthi
Abstract
Background: Urinary tract infections are among the most common infectious diseases encountered in clinical practice. Although Escherichia coli remains the predominant uropathogen, Enterococcus species have emerged as important pathogens, particularly in healthcare-associated infections. The increasing prevalence of Vancomycin-resistant Enterococcus is a serious therapeutic and infection control challenge worldwide. This study sought to determine the distribution of bacterial uropathogens, characterize Enterococcus species with emphasis on vancomycin resistance and assess gastrointestinal colonization as a potential reservoir for infection.
Materials and Methods: A total of 250 urine samples were prospectively collected from patients presenting with suspected UTIs. Uropathogens were isolated and identified using standard microbiological techniques. Enterococcus isolates were screened phenotypically for vancomycin resistance and speciated as Enterococcus faecalis or Enterococcus faecium. Stool samples from patients with enterococcal UTIs were cultured to assess gastrointestinal colonization.
Results: Of the 250 urine specimens analyzed, 147 (58.8%) showed evidence of bacteriuria. E. coli was the most frequently isolated organism (56.5%), followed by Enterococcus spp. (25.2%), Klebsiella pneumoniae (12.2%), and Staphylococcus spp. (6.1%). Of the 37 Enterococcus isolates, 12 (32.4%) were vancomycin-resistant Enterococcus (VRE) and 25 (67.6%) were susceptible to vancomycin. Of the 12 VRE isolates, 7 (58%) were E. faecium and 5 (42%) were E. faecalis; among the non-VRE isolates, 15 (60%) were E. faecium and 10 (40%) were E. faecalis. Gastrointestinal colonization was also observed among patients with enterococcal UTIs.
Conclusion: Enterococcus species, including VRE, constitute a substantial proportion of uropathogens, and their ability to colonize the gastrointestinal tract underscores its importance as a reservoir of infection. Strict surveillance is required to contain the spread of resistance among enterococci.
23. Evaluating Effect of Hasya Yoga on Depression, Anxiety and Stress levels Among Medical Undergraduate Students: An Interventional Study
Anupam Suhas Khare, Sagar Ramnath Chavan, Pallavi Yuvaraj Badhe
Abstract
Background: Medical students face tremendous academic load, clinical exposure, and emotional pressure, and consequently have a greater incidence of depression, anxiety, and stress. Conventional management methods such as counselling and pharmacotherapy have had limited success in this population. Hasya Yoga (Laughter Yoga), introduced by Dr. Madan Kataria, combines voluntary laughter, yogic breathing, clapping, and group interaction to promote psychological well-being.
Objectives: To evaluate the impact of Hasya Yoga on depression, anxiety, and stress levels in first-year MBBS students of Maharashtra using the DASS-21 (Depression, Anxiety, Stress Scale).
Material and Methods: An interventional, randomized controlled trial was conducted among 120 first-year MBBS students from Maharashtra. Participants were randomly allocated to an intervention (study) group (n=60) receiving Hasya Yoga and a control group (n=60) receiving no intervention. The study group practised Hasya Yoga for 40 minutes daily, five times a week, for 4 weeks. The DASS-21 was administered pre- and post-intervention. Statistical analysis used paired and unpaired t-tests and Pearson's correlation.
Results and Analysis: Post-intervention, the study group showed a highly significant reduction in all three parameters: depression (18.7 ± 4.4 to 9.8 ± 3.6), anxiety (17.3 ± 4.1 to 8.5 ± 3.3), and stress (21.4 ± 4.6 to 12.3 ± 3.8). Post-intervention scores in the study group were significantly lower than in the control group. A strong negative correlation was found between Hasya Yoga practice and depression, anxiety, and stress scores.
Conclusion: Hasya Yoga is a simple, cost-effective, and non-pharmacological intervention that effectively reduces depression, anxiety, and stress in medical students. Its inclusion in medical curricula may offer a long-term solution for mental health promotion.
24. Early-Life Immune Profiling in Children with Congenital Heart Disease and Its Association with Postoperative Outcomes
Jay Krishnajivan Shah, Sahnavajkhan M. Pathan, Pradeep Dayanand MD
Abstract
Background: Children with congenital heart disease (CHD) frequently exhibit immune dysregulation that may influence postoperative recovery following cardiac surgery. However, comprehensive characterization of early-life immune profiles and their predictive value for surgical outcomes remains limited.
Methods: A total of 186 infants (aged 1-12 months) undergoing surgical repair for CHD were enrolled. Preoperative immune profiling included lymphocyte subset enumeration, immunoglobulin quantification, T-cell receptor excision circle (TREC) analysis, and cytokine assessment. Primary outcomes included postoperative infections, intensive care unit (ICU) length of stay, and 30-day mortality.
Results: Compared to age-matched healthy controls (n=50), CHD infants demonstrated significantly reduced CD4+ T-cell counts (1,842 ± 624 vs. 2,856 ± 718 cells/μL, p<0.001), lower TREC levels (median 42 vs. 128 copies/μL, p<0.001), and decreased IgG concentrations (412 ± 156 vs. 586 ± 142 mg/dL, p<0.001). Postoperative infections occurred in 31.7% of patients. Preoperative CD4+ counts <1,500 cells/μL (OR 3.42, p=0.002), TREC levels <30 copies/μL (OR 2.87, p=0.008), and cyanotic defects (OR 2.24, p=0.021) were independently associated with postoperative infections. ICU stay was prolonged in patients with immune deficiencies (8.4 ± 4.2 vs. 5.1 ± 2.8 days, p<0.001).
Conclusion: Infants with CHD demonstrate significant immune abnormalities prior to surgery, and preoperative immune profiling identifies patients at elevated risk for postoperative complications. Integration of immune assessment into preoperative evaluation may facilitate risk stratification and targeted immunomodulatory interventions.
25. Emerging Pathogens in Fungal Keratitis: Diagnostic Challenges and Management Strategies in Resource-Limited Settings
Priyanka Chandankhede, Prashant Meshram, Dilip Gedam, Vasundhari Potsangbam, Sanket Mithbavkar, Gopal Agrawal
Abstract
Fungal keratitis causes significant corneal blindness in tropical regions, primarily from Aspergillus and Fusarium species, with dematiaceous fungi such as Curvularia emerging as pathogens following ocular trauma with soil or vegetation. Rare species such as Fusarium chlamydosporum and Aspergillus nidulans can produce rapidly progressive keratitis. This case series reports three instances of post-traumatic fungal keratitis due to Curvularia lunata (58-year-old male, wooden particle injury), Fusarium chlamydosporum (54-year-old male, outdoor exposure), and Aspergillus nidulans (28-year-old male, concrete trauma), confirmed morphologically via culture. Each responded favourably to topical natamycin combined with oral itraconazole or voriconazole within two weeks, achieving ulcer healing without surgery. These findings from a small case series highlight rare mycotic etiologies, underscore the critical role of clinical expertise, and demonstrate the value of early culture-guided antifungal therapy for trauma-related mycotic corneal ulcers in resource-limited settings.
26. Clinical, Microbiological and Radiological Profile of Pneumonia Patients: A Prospective Observational Study
Heena Pathan, Jiyani Hemangbhai Rajeshbhai, Anil Mathuraprasad Gupta
Abstract
Background: Pneumonia remains a major cause of morbidity and mortality worldwide, particularly in developing countries like India. Regional data on clinical presentation, microbiological etiology, radiological patterns, and outcomes are essential for optimizing management strategies.
Objectives: To study the clinical, microbiological, and radiological profile of adult pneumonia patients and to determine their outcomes using severity assessment tools.
Methods: This prospective observational study included 50 adult inpatients with clinically and radiologically confirmed pneumonia at a tertiary care teaching hospital. Detailed clinical evaluation, laboratory investigations, sputum microbiology, radiological assessment, and severity scoring using CURB-65 and Pneumonia Severity Index (PSI) were performed. Patients were followed until discharge or death.
Results: The mean age of patients was 55.2 ± 14.3 years, with male predominance (68%). Cough (94%) and fever (78%) were the most common symptoms. Diabetes mellitus (38%) was the most frequent comorbidity. Klebsiella pneumoniae (26%) and Pseudomonas aeruginosa (20%) were the most commonly isolated organisms. Lower lobe involvement was predominant on imaging. Most patients had CURB-65 scores ≤2. Overall mortality was 10%, predominantly among patients with higher CURB-65 scores and those requiring ICU care.
Conclusion: Pneumonia predominantly affected older males and commonly presented with cough and fever. Gram-negative organisms were frequent etiological agents. Severity scores such as CURB-65 were effective in predicting outcomes and guiding the level of care.
27. Conservative versus Surgical Management of Degenerative Rotator Cuff Tears: A Propensity-Matched Comparative Outcomes Study
Ali Mohammed P., Monesh K.B., K. Senthil Kumar
Abstract
Background: Degenerative full-thickness rotator cuff tears are common and often symptomatic, but their optimal management remains contested. While structured physiotherapy offers sustained symptomatic improvement for many patients, surgical repair may yield superior function in selected patients but carries a risk of retear and greater resource use.
Methods: This retrospective comparative study used data from a tertiary shoulder service. Adults aged 45-75 years with MRI-confirmed degenerative full-thickness supraspinatus ± infraspinatus tears were initially managed with either (1) structured conservative care (a 12-week standardized physiotherapy protocol with or without subacromial corticosteroid injection) or (2) arthroscopic rotator cuff repair with standardized rehabilitation. Propensity-score matching (1:1) was performed on age, sex, tear size, baseline Constant-Murley score, diabetes, smoking, and symptom duration. The primary outcome was the 12-month Constant-Murley score; secondary outcomes were ASES score, VAS pain, range of motion, strength, satisfaction, complications, and MRI integrity at 12 months. Mixed-effects models were used to estimate between-group differences.
Results: After matching, 140 patients (70 in each group) were analysed. At 12 months, surgical repair was associated with higher Constant-Murley scores (78.4 ± 10.2 vs. 71.1 ± 11.8; adjusted mean difference [AMD] 7.0, 95% CI 3.3-10.7, p<0.001) and higher ASES scores (86.8 ± 12.0 vs. 80.1 ± 13.6; AMD 6.1, 95% CI -1.8 to 10.4). Retear or nonhealing on MRI was observed in 18.6% of surgical patients, and tear enlargement >5 mm was observed in 28.6% of conservative patients (p=0.16).
Conclusion: In this propensity-matched cohort, arthroscopic repair was associated with modest but statistically significant improvements in function and pain at 12 months compared with structured conservative therapy, although the imaging results reflect a trade-off between repair integrity and tear progression. These findings are consistent with randomized evidence suggesting small to moderate functional benefits of repair in selected degenerative tears.
28. Study to Evaluate Perfusion Index as an Indicator of Hypotension Following Spinal Anaesthesia for Elective Caesarean Section
Qazi Abu Atif Amair
Abstract
Background: Post-spinal hypotension adversely affects both parturient and fetal outcomes. Parturients may suffer nausea and vomiting, while the fetus may develop acidosis and a lower Apgar score. Hence, prevention of hypotension is safer for both mother and fetus.
Method: Sixty obstetric patients undergoing spinal anesthesia for cesarean section were studied (ASA grade II, BMI 35 kg/m²). The pleth variability index (PVI) and perfusion index (PI) were measured before and after spinal anesthesia. Data were analyzed using ROC curve analysis and multiple linear regression.
Results: In the hypotension group, PVI and PI at one minute were higher than in the non-hypotension (control) group. PVI at one minute was an independent predictor of hypotension following spinal anesthesia (p < 0.001), with significant sensitivity and specificity. The comparison between the hypotension and control groups was statistically significant (p < 0.001).
Conclusion: Baseline PVI did not predict hypotension, but a one-minute PVI > 19.2 predicted hypotension following spinal anesthesia, and a one-minute PI ≥ 5.13 was a more reliable predictor.
29. Association between Emphysema Severity on Quantitative CT and Cardiac Dysfunction in Patients with Chronic Obstructive Pulmonary Disease
Gaurav K. Kaila, Darshan M. Patel, Fatima Abdulkarim Belim
Abstract
Background: Chronic obstructive pulmonary disease (COPD) is frequently accompanied by cardiovascular comorbidities, yet the relationship between the extent of parenchymal destruction quantified by computed tomography (CT) and cardiac functional parameters remains incompletely characterized. Understanding this association may facilitate early identification of patients at risk for cardiac dysfunction and guide integrated management strategies.
Methods: This cross-sectional observational study enrolled 196 patients with stable COPD (GOLD stages I–IV) from a tertiary pulmonary center. Quantitative CT analysis was performed to determine the low attenuation area percentage below −950 Hounsfield units (LAA−950%) as a measure of emphysema severity. Cardiac function was assessed using transthoracic echocardiography, including left ventricular ejection fraction (LVEF), right ventricular fractional area change (RVFAC), tricuspid annular plane systolic excursion (TAPSE), and estimated pulmonary artery systolic pressure (PASP). Correlations were examined using Pearson and Spearman analyses, and multivariable linear regression was performed adjusting for confounders.
Results: The mean age was 64.7 ± 9.2 years, and 68.4% were male. The mean LAA−950% was 18.3 ± 12.6%. Patients with severe emphysema (LAA−950% ≥ 25%) demonstrated significantly lower LVEF (56.1 ± 7.8% vs. 62.4 ± 6.1%, p < 0.001), reduced RVFAC (32.4 ± 6.9% vs. 39.7 ± 5.8%, p < 0.001), decreased TAPSE (16.2 ± 3.4 mm vs. 20.8 ± 3.1 mm, p < 0.001), and elevated PASP (42.6 ± 11.3 mmHg vs. 29.4 ± 8.7 mmHg, p < 0.001). LAA−950% was independently associated with RVFAC (β = −0.38, p < 0.001) and PASP (β = 0.44, p < 0.001) after multivariable adjustment.
Conclusion: Emphysema severity quantified by CT densitometry is independently and significantly associated with both right and left cardiac dysfunction in COPD patients. Quantitative CT may serve as a valuable adjunctive tool for cardiovascular risk stratification in this population.
30. Recurrent Tonsillitis in Children: Causes, Symptoms and Treatment
Manoranjan Kumar, Kumar Anupam, Rajnish Chandra Mishra
Abstract
Background: Recurrent tonsillitis is a common pediatric condition characterized by repeated episodes of inflammation of the palatine tonsils, most often caused by viral or bacterial infections, particularly Group A β-hemolytic Streptococcus. Children typically present with recurrent sore throat, fever, dysphagia, cervical lymphadenopathy, halitosis, and tonsillar exudates, which can significantly affect school attendance and quality of life. Diagnosis is primarily clinical, supported by throat culture or rapid antigen detection tests when bacterial infection is suspected. Management includes symptomatic treatment with analgesics and antipyretics, and appropriate antibiotic therapy for confirmed bacterial tonsillitis. Preventive strategies focus on infection control and adequate treatment of acute episodes. Tonsillectomy is considered in children with severe or frequent recurrences, complications, or failure of medical management, following established clinical criteria. Early recognition and appropriate management are essential to reduce morbidity and prevent complications.
Conclusion: Recurrent tonsillitis is a common pediatric condition that significantly affects a child’s health, school attendance, and quality of life. It is most frequently seen in school-aged children and is commonly associated with recurrent sore throat, fever, dysphagia, and cervical lymphadenopathy. Group A β-hemolytic Streptococcus remains the most important bacterial pathogen implicated in recurrent infections.
31. Hearing Loss in Diabetes/Hypertension Patients
Md Imran Khan, Sweta Kumari, Shambhu Sharan
Abstract
Background: Hearing loss is increasingly recognized as a common but underdiagnosed complication in patients with diabetes mellitus and hypertension. Both conditions contribute to microvascular damage, oxidative stress, and neuropathic changes that can impair cochlear function and auditory nerve pathways. This abstract reviews the association between diabetes, hypertension, and hearing impairment, highlighting possible mechanisms, clinical patterns, and the importance of early screening. Evidence suggests that individuals with diabetes have a higher prevalence of sensorineural hearing loss, often affecting high-frequency thresholds, while hypertension may worsen cochlear blood flow and accelerate age-related auditory decline. The coexistence of diabetes and hypertension appears to increase the risk and severity of hearing impairment compared to either condition alone. Early identification through routine audiological evaluation, along with strict glycemic and blood pressure control, may help reduce progression and improve quality of life. Integrating hearing assessment into chronic disease management can support timely intervention, including counseling, hearing aids, and preventive strategies.
Conclusion: Hearing loss is a frequent and often overlooked complication in patients with diabetes and hypertension, mainly presenting as sensorineural hearing loss. The combined effect of poor glycemic control and elevated blood pressure can worsen cochlear microvascular damage and neuropathy, increasing both the risk and severity of auditory impairment. Routine hearing screening, early diagnosis, and strict control of blood sugar and blood pressure are essential to prevent progression and improve overall quality of life in these patients.
32. Spectrum of Anaerobic and Aerobic Pathogens in Intra-Abdominal Surgical Infections and Their Clinical Outcomes
Vandana Gemarbhai Patel, Shaileshbhai Raghjibhai Bhatol, Koushal Bagewadi
Abstract
Background: Intra-abdominal infections (IAIs) represent a significant cause of surgical morbidity and mortality, characterized by polymicrobial etiology involving complex interactions between aerobic and anaerobic pathogens. Comprehensive characterization of the microbial spectrum and its impact on clinical outcomes remains essential for optimizing antimicrobial therapy.
Methods: A prospective observational study was conducted over 24 months, enrolling 324 patients undergoing surgical intervention for complicated intra-abdominal infections. Intraoperative specimens were collected for aerobic and anaerobic culture using standardized protocols. Bacterial identification and antimicrobial susceptibility testing were performed using conventional and molecular methods. Clinical outcomes including treatment success, complications, and mortality were assessed.
Results: Polymicrobial infections were identified in 78.4% of cases, with a mean of 3.2 ± 1.4 isolates per patient. Aerobic organisms were recovered in 94.1% of cases, while anaerobes were isolated in 67.6%. Escherichia coli (58.3%) and Bacteroides fragilis (42.0%) were the predominant aerobic and anaerobic pathogens, respectively. Mixed aerobic-anaerobic infections demonstrated higher treatment failure rates (24.2% vs. 12.8%, p=0.018) and prolonged hospitalization (16.4 ± 8.2 vs. 11.8 ± 5.6 days, p<0.001) compared to purely aerobic infections. Overall mortality was 8.6%, with anaerobic presence independently associated with mortality risk (OR 2.34, 95% CI 1.12-4.89, p=0.024).
Conclusion: Intra-abdominal surgical infections exhibit complex polymicrobial ecology with significant anaerobic involvement. Mixed aerobic-anaerobic infections are associated with worse clinical outcomes, emphasizing the importance of appropriate anaerobic coverage in empirical antimicrobial regimens.
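The adjusted odds ratio reported above (OR 2.34, 95% CI 1.12–4.89) comes from multivariable modelling, but the underlying crude calculation for a 2×2 exposure–outcome table is straightforward. A minimal sketch using Woolf's logit method, with hypothetical counts rather than the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    # Standard error of ln(OR), Woolf's method
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: deaths with vs. without anaerobic isolation
or_, lo, hi = odds_ratio_ci(a=20, b=200, c=8, d=180)
```

If the confidence interval excludes 1, the association is conventionally reported as statistically significant at the 5% level, as in the abstract.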
33. Anaesthetic Consideration in Mucormycosis Patients Posted for Rhino-Orbital-Dental Surgeries
Hemangini M. Patel, Shweta A. Patel, Nidhi Patel, Pallavi Chaudhary
Abstract
Background: Anesthesiologists face multiple considerations in COVID-19-recovered patients, with added problems arising from rhino-orbital-cerebral mucormycosis and the adverse effects of antifungal drugs. Overall morbidity and mortality are higher in such patients. Managing such patients posted for surgical debridement presents a unique challenge to the anesthesiologist.
Methods: A retrospective analysis of 118 patients with histopathologically confirmed mucormycosis who underwent surgical debridement was performed using medical records. Preoperative clinical and laboratory data, intraoperative anaesthesia details, and postoperative outcomes were reviewed.
Results: 118 confirmed mucormycosis patients were analysed. 98 patients had tested COVID-19 positive, of whom 73 were hospitalized. 101 patients presented with diabetes mellitus. 53 patients were operated on after 8 weeks of coronavirus infection. 41 patients had raised creatinine levels. 7 patients presented with difficult intubation. Intraoperative anesthesia concerns included tachycardia, hypertension, tachycardia combined with hypotension, arrhythmias, hyperglycemia, raised end-tidal carbon dioxide concentration (ETCO2), increased peak airway pressure, and oozing in various patients. All patients were extubated. Postoperatively, 15 patients required oxygen, 92 were discharged, and 9 patients died.
Conclusion: Knowledge of the disease, preoperative optimization, proper preparation of patients, and postoperative ICU care are essential for the successful management of mucormycosis patients undergoing surgical debridement.
34. Social Determinants of Health and Immune Response to Routine Childhood Vaccination
Patel Yash Jashvantbhai, Hemalatha Addi, Vidhi Piyushkumar Prajapati
Abstract
Background: Childhood vaccination remains one of the most effective public health interventions globally. However, variability in immune responses among vaccinated children suggests that factors beyond vaccine formulation and administration influence immunogenicity. Social determinants of health (SDOH)—including household income, parental education, nutritional status, housing conditions, and access to healthcare—may modulate vaccine-induced immunity through complex biological and behavioral pathways. Despite growing recognition of health inequities, the relationship between SDOH and vaccine immunogenicity in children remains insufficiently characterized.
Methods: A cross-sectional analytical study was conducted among 412 children aged 12–24 months attending primary healthcare centers in an urban setting. Sociodemographic data were collected using a structured questionnaire addressing household income, parental education, housing density, food security, and healthcare access. Nutritional status was assessed anthropometrically. Serum antibody titers against measles, diphtheria, and hepatitis B were measured using enzyme-linked immunosorbent assay (ELISA). Multivariate logistic regression was used to identify independent predictors of suboptimal immune response.
Results: Among participants, 23.1% demonstrated suboptimal antibody titers to at least one antigen. Children from low-income households had significantly lower geometric mean titers (GMTs) for measles (1,245 ± 487 vs. 2,118 ± 612 mIU/mL; p < 0.001) and diphtheria (0.82 ± 0.34 vs. 1.47 ± 0.51 IU/mL; p < 0.001) compared to higher-income counterparts. Stunting (OR = 2.31; 95% CI: 1.42–3.76; p = 0.001), low maternal education (OR = 1.89; 95% CI: 1.18–3.02; p = 0.008), and food insecurity (OR = 2.14; 95% CI: 1.33–3.44; p = 0.002) were independent predictors of suboptimal immune response.
Conclusion: Social determinants of health significantly influence immune responses to routine childhood vaccinations. Addressing socioeconomic inequities may enhance vaccination effectiveness and reduce disparities in vaccine-preventable disease burden.
35. Evaluation of Serum Homocysteine and Vitamin B12 Status in Patients with Hypothyroidism
Shishir Kumar Suman, Khushboo Raj, Madhu Sinha
Abstract
Background: Thyroid hormone insufficiency is the hallmark of hypothyroidism, a common endocrine condition that profoundly affects cellular metabolism. According to recent research, hypothyroidism and increased cardiovascular risk are strongly correlated and may be mediated by hyperhomocysteinemia. The metabolism of homocysteine (Hcy) is closely associated with B vitamins, particularly folate and vitamin B12.
Objective: This study's main goals were to assess serum homocysteine and vitamin B12 levels in patients with primary hypothyroidism in comparison to healthy controls and to examine the relationship between these metabolic markers and thyroid function parameters (TSH, fT3, and fT4) in an Eastern Indian population.
Methods: Over the course of six months, the Departments of General Medicine and Biochemistry at Patna Medical College and Hospital (PMCH), Patna, carried out this observational cross-sectional study. The study population comprised 115 patients with primary hypothyroidism and 115 healthy controls matched for age and sex. Chemiluminescence immunoassay (CLIA) was used to evaluate the levels of serum free T3 (fT3), free T4 (fT4), thyroid-stimulating hormone (TSH), vitamin B12, and homocysteine. Statistical analysis was carried out using SPSS version 26.0, utilizing Pearson's correlation coefficient to evaluate associations between variables and Student's t-test for group comparisons.
Results: The mean serum TSH in the case group (14.2 ± 6.1 mIU/mL) was considerably higher than in controls (2.3 ± 1.1 mIU/mL). Mean serum homocysteine levels were substantially higher in hypothyroid patients than in the control group (10.2 ± 3.1 mIU/mL; p<0.001). Conversely, hypothyroid patients had considerably lower mean vitamin B12 levels (210 ± 85 pg/mL) than controls (450 ± 112 pg/mL; p<0.001). TSH showed a substantial positive association with homocysteine (r=0.68, p < 0.01), whereas TSH and vitamin B12 levels showed a negative correlation.
Conclusion: The study demonstrates a strong association between low vitamin B12 levels, hyperhomocysteinemia, and hypothyroidism. These results imply that individuals with hypothyroidism are more susceptible to atherosclerosis and cardiovascular events. To reduce long-term metabolic and cardiovascular problems, routine monitoring of serum homocysteine and vitamin B12 is advised in hypothyroid therapy.
36. The Effect of Body Mass Index on Pulmonary Function Tests in Young Adults
Tarun Kumar, Savita, Bipin Kumar, Rita Kumari
Abstract
Background: Lung function parameters are linked to body mass index (BMI), a common indicator of body size and of overweight and obesity. The pulmonary function test is the simplest test for diagnosing lung disease.
Objective: To study the association of BMI and pulmonary function.
Material and Methods: The current cross-sectional prospective study was carried out at Nalanda Medical College and Hospital, Bihar, on 110 participants aged 18–22 years. Participants with a history of smoking, cardiovascular disease, bronchial asthma, restrictive lung disease, or other respiratory conditions were excluded. The medical college's Ethics Committee approved the project. The data included the subjects' demographic profile, anthropometry, and PFT. The resulting data were tabulated and statistically analysed.
Results: Participants who were obese showed noticeably lower FVC and FEV₁ than those with normal BMI. In all BMI categories, the FEV₁/FVC ratio stayed within normal bounds, indicating a primarily restrictive pattern of lung damage in those with higher BMIs.
Conclusion: Individuals who are underweight or obese have poorer lung function measures than those with a normal BMI. Maintaining a healthy body mass index (BMI) through regular exercise and a well-balanced diet may be important for optimal respiratory health.
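The BMI groupings compared in studies like this one follow the standard WHO cut-offs (underweight <18.5, normal 18.5–24.9, overweight 25–29.9, obese ≥30 kg/m²). A minimal sketch of the computation and classification; the function name and example values are illustrative, not taken from the study:

```python
def bmi_category(weight_kg, height_m):
    """Compute BMI (kg/m^2) and classify it using standard WHO cut-offs."""
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        cat = "underweight"
    elif bmi < 25:
        cat = "normal"
    elif bmi < 30:
        cat = "overweight"
    else:
        cat = "obese"
    return round(bmi, 1), cat

# Illustrative values only
print(bmi_category(70, 1.75))  # a 70 kg, 1.75 m adult falls in the normal range
```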
37. A Study of Lung Function Differences Between Smokers and Non-Smokers
Savita, Rajeev Kumar, Rita Kumari, Bipin Kumar
Abstract
Introduction: Smoking is a significant global health issue, with a rising prevalence among younger populations, and is known to detrimentally impact the respiratory system. Lung function tests may reveal a decline in pulmonary function before the emergence of clinical symptoms, allowing for early detection and intervention. Cigarette smoking contributes to the onset of malignancies, cardiovascular illnesses, and respiratory conditions, including COPD. It poses a major challenge to mitigating morbidity and mortality in developing nations such as India. Spirometry in smokers may indicate a decline in lung function metrics.
Materials and Methods: A cross-sectional study was conducted at Nalanda Medical College and Hospital, Patna, Bihar, on 107 asymptomatic male smokers and non-smokers. After subjects were enrolled based on inclusion and exclusion criteria, spirometry was performed with an RMS Helios 401 spirometer according to American Thoracic Society guidelines, and the collected data were analysed using the independent t-test to compare smokers and non-smokers.
Results: The groups were comparable in age, height, weight, and body mass index. The mean FEV₁, FVC, and PEFR values were considerably lower in smokers than in non-smokers (p < 0.05). Smokers also had a lower FEV₁/FVC ratio, suggesting an obstructive pattern of lung damage. The lower spirometric indices show that smokers have substantially worse lung function than non-smokers.
Conclusion: All lung function test values exhibit a considerable reduction in asymptomatic smokers compared to non-smokers. Consequently, performing spirometry, particularly in smokers, may facilitate the early detection of cases and subsequently reduce morbidity.
38. Association of Cardiovascular Autonomic Function with Psychological Distress in Young Adults
Sajja Madhuri, Manisha Baghel, Sudhir Modala, Sajja Kamalnadh S. K.
Abstract
Background: Psychological distress has been increasingly recognized as a contributing factor to cardiovascular dysfunction. Alterations in cardiac autonomic regulation may represent early physiological changes linking psychological stress to future cardiovascular risk.
Objective: To evaluate the association between psychological distress and cardiovascular autonomic function in young adults using resting cardiovascular parameters and short-term heart rate variability (HRV) analysis.
Methods: This cross-sectional study included 100 young adults aged 18–25 years, comprising 50 healthy controls and 50 psychologically distressed participants categorized using a validated self-reported stress scale. Resting heart rate (HR), systolic blood pressure (SBP), and diastolic blood pressure (DBP) were recorded under standardized laboratory conditions. Short-term HRV was assessed from 360-second R–R interval recordings using Kubios software. Time-domain (Mean RR, SDNN, RMSSD, NN50, pNN50) and frequency-domain (Total Power, LF, HF, LF/HF ratio) parameters were analyzed. Pearson correlation was used to assess associations between stress scores and HRV indices.
Results: Psychologically distressed participants demonstrated significantly higher resting HR, SBP, and DBP compared to controls (p < 0.01). Time-domain HRV parameters were significantly reduced in the distressed group, including Mean RR, SDNN, RMSSD, NN50, and pNN50 (p < 0.001). Frequency-domain analysis revealed significantly lower Total Power and HF power, with significantly higher LF power and LF/HF ratio (p < 0.001), indicating sympathetic predominance and reduced parasympathetic modulation. Stress scores showed strong negative correlations with parasympathetic indices (r ranging from –0.55 to –0.65) and positive correlations with LF power and LF/HF ratio (r = 0.48 and 0.66, respectively; p < 0.001).
Conclusion: Psychological distress in young adults is significantly associated with cardiovascular autonomic imbalance characterized by reduced vagal activity and increased sympathetic dominance. These findings suggest that psychological distress may contribute to early autonomic dysregulation and potential long-term cardiovascular risk.
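The time-domain HRV indices reported above have simple standard definitions: SDNN is the standard deviation of the normal R–R (NN) intervals, RMSSD the root mean square of successive differences, NN50 the count of successive differences exceeding 50 ms, and pNN50 that count as a percentage of all successive differences. A minimal sketch of these computations, using synthetic R–R values rather than study data:

```python
import math

def time_domain_hrv(rr_ms):
    """Standard time-domain HRV indices from a list of R-R intervals in ms."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    # SDNN: sample standard deviation of all intervals
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / (n - 1))
    diffs = [rr_ms[i + 1] - rr_ms[i] for i in range(n - 1)]
    # RMSSD: root mean square of successive differences
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    nn50 = sum(1 for d in diffs if abs(d) > 50)
    pnn50 = 100.0 * nn50 / len(diffs)
    return {"MeanRR": mean_rr, "SDNN": sdnn, "RMSSD": rmssd,
            "NN50": nn50, "pNN50": pnn50}

# Synthetic R-R series (ms), for illustration only
print(time_domain_hrv([812, 790, 845, 803, 861, 798, 830]))
```

Lower SDNN and RMSSD, as seen in the distressed group, indicate reduced overall variability and reduced parasympathetic (vagal) modulation, respectively.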
39. A Comparative Study Between Intravenous Dexmedetomidine Versus Nebulized Dexmedetomidine in Attenuating Laryngoscopy and Intubation Induced Sympathetic Response and Hemodynamic Changes in Coronary Artery Disease Patients Posted for Elective CABG Procedure
Balaji Elumalai, Jayanthi Mohanasundaram, Ayanallur Janakiraman Kavitha
Abstract
Background: Laryngoscopy and intubation trigger sympathetic surges that increase heart rate (HR) and blood pressure (BP), posing significant risks for patients with coronary artery disease (CAD). Dexmedetomidine, a sympatholytic alpha-2 agonist, may mitigate these stress responses.
Aim: To compare the effectiveness and safety of intravenous (IV) versus nebulized dexmedetomidine in attenuating hemodynamic stress responses during laryngoscopy and intubation in CAD patients undergoing elective coronary artery bypass grafting (CABG).
Materials and Methods: Seventy-eight elective CABG patients were randomized into two groups (n=39 each). Group A received nebulized dexmedetomidine (1 µg/kg) over 10 minutes, while Group B received IV dexmedetomidine (1 µg/kg) over 10 minutes. HR, systolic BP, diastolic BP, and mean arterial pressure (MAP) were monitored at baseline and peri-intubation.
Results: Both routes effectively blunted the pressor response to laryngoscopy. However, the IV group experienced a more precipitous decline in MAP and a significantly higher incidence of bradycardia and hypotension. Nebulized administration maintained superior cardiovascular stability and was associated with a lower incidence of postoperative sore throat. Post-operative recovery times were comparable between groups.
Conclusion: While both routes attenuate hemodynamic responses during laryngoscopy and intubation in CAD patients, nebulized dexmedetomidine is a clinically advantageous, non-invasive alternative for high-risk CAD patients. Its superior safety profile—specifically the reduced risk of cardiovascular instability—makes it a viable and attractive option for perioperative management.
40. Serum Copper and Zinc Status in Hypothyroidism: A Case–Control Study Evaluating Their Impact on Thyroid Function
Raman Rahi, Amit Singh, Soni Garima, Taskeen Tufail, Santosh Kumar, Anil Kumar, Prabhat Kumar Singh
Abstract
Background: Hypothyroidism is a common endocrine disorder with significant metabolic consequences. Apart from iodine, trace elements such as zinc (Zn) and copper (Cu) play an essential role in thyroid hormone synthesis, metabolism, and action. Alterations in these micronutrients may contribute to the pathophysiology of hypothyroidism.
Aim: To evaluate serum zinc and copper levels in patients with hypothyroidism and compare them with euthyroid healthy controls, and to assess their association with thyroid hormone levels.
Materials and Methods: This hospital-based case–control study included 100 subjects aged 18–65 years. Fifty patients with biochemically confirmed hypothyroidism constituted the case group, while fifty age- and sex-matched euthyroid individuals served as controls. Serum T3, T4, and TSH were estimated by chemiluminescence immunoassay. Serum zinc and copper levels were measured using spectrophotometric methods. Statistical analysis was performed using Student’s t-test and correlation analysis.
Results: Mean serum zinc levels were significantly lower in hypothyroid patients compared to controls (32.3 ± 19.1 vs 86.0 ± 17.9 µg/dL; p < 0.001). Serum copper levels showed no statistically significant difference between cases and controls (p > 0.05). Serum zinc demonstrated a significant negative correlation with TSH and a positive correlation with T3 and T4 levels.
Conclusion: Hypothyroidism is associated with significantly reduced serum zinc levels, while serum copper levels remain largely unchanged. Zinc deficiency may play an important contributory role in thyroid dysfunction and may warrant attention during clinical evaluation of hypothyroid patients.
41. Association between Serum Galectin-3 Levels and Left Ventricular Remodeling in Heart Failure with Reduced Ejection Fraction
Darshan M. Patel, Gaurav K. Kaila, Vasu Jarsaniya
Abstract
Background: Heart failure with reduced ejection fraction (HFrEF) is characterized by progressive left ventricular (LV) remodeling driven by maladaptive fibrotic and inflammatory processes. Galectin-3 (Gal-3), a β-galactoside-binding lectin implicated in myocardial fibrosis and inflammation, has emerged as a promising biomarker for risk stratification in heart failure. However, the precise relationship between circulating Gal-3 concentrations and echocardiographic parameters of LV remodeling across varying HFrEF severities remains insufficiently characterized.
Methods: This cross-sectional analytical study enrolled 286 patients with established HFrEF (left ventricular ejection fraction [LVEF] ≤40%) at a tertiary cardiac center. Serum Gal-3 levels were quantified using enzyme-linked immunosorbent assay (ELISA). Comprehensive transthoracic echocardiography was performed to assess LV end-diastolic volume index (LVEDVi), LV end-systolic volume index (LVESVi), LV mass index (LVMi), LVEF, left atrial volume index (LAVi), and global longitudinal strain (GLS). Correlation and multivariable regression analyses were performed to evaluate associations between Gal-3 and remodeling parameters.
Results: The median serum Gal-3 level was 21.4 ng/mL (IQR: 15.8–29.6). Patients in the highest Gal-3 tertile (>26.2 ng/mL) demonstrated significantly greater LVEDVi (118.4 ± 32.6 vs. 86.2 ± 24.8 mL/m²; p < 0.001), LVMi (148.6 ± 38.4 vs. 112.8 ± 28.6 g/m²; p < 0.001), and worse GLS (−8.2 ± 2.4% vs. −12.6 ± 3.1%; p < 0.001) compared to the lowest tertile. In multivariable analysis, Gal-3 remained independently associated with LVEDVi (β = 0.34; p < 0.001), LVMi (β = 0.31; p < 0.001), and GLS (β = 0.28; p = 0.001) after adjustment for age, NT-proBNP, estimated glomerular filtration rate, and NYHA functional class.
Conclusion: Elevated serum Gal-3 levels are independently associated with adverse LV remodeling parameters in HFrEF patients, suggesting that Gal-3 may serve as a clinically informative biomarker reflecting the degree of myocardial structural deterioration beyond conventional markers.
42. A Study of Hepatitis B Cases (Detecting the Scenario of Viral Hepatitis B) in the North-East Coastal Region of Andhra Pradesh Using a Screening Test (HBsAg) – Hepa Card Test and a Confirmatory Test – ELISA, Along with Viral Load by Real-Time PCR in that Area
Radhika Budumuru, Neerajakshi Reddi, Popuri Madan
Abstract
Introduction: More than two billion people have been infected worldwide, and of these, more than 350 million suffer from chronic Hepatitis B virus (HBV) infection. Its incidence and patterns of transmission vary throughout the world in different population subgroups. In Western countries, chronic HBV infection is relatively rare and acquired primarily in adulthood, whereas in Asia and Africa it is acquired from infected mother to child, from child-to-child contact in household settings, and from reuse of unsterilized needles and syringes. Because the disease is often silent, testing is imperative for public health, particularly for blood screening. Chronic carriers with undetected acute infection and low levels of viremia pose a risk of HBV transmission. The majority of laboratories still use hepatitis B surface antigen (HBsAg)-based tests. WHO's elimination plan is at risk of derailment because phases such as the window period, immune control, and occult HBV infection (OBI) are not detected by standard tests. The present study focuses on three diagnostic approaches for better diagnosis of HBV: (1) Hepacard, for qualitative detection of HBsAg in human serum or plasma; (2) HBsAg ELISA (Cortez Diagnostics, Inc.), a confirmatory test to avoid false positives and false negatives from the Hepacard test; and (3) quantitative HBV viral load testing by real-time PCR in blood, which helps to assess disease progression, monitor treatment, predict disease outcome, and prevent transmission.
Aim: To determine the prevalence of Hepatitis B in the north-east coastal region of Andhra Pradesh using a rapid screening test, with confirmation by the ELISA technique, for the diagnosis of Hepatitis B surface antigen (HBsAg), along with viral load estimation by real-time PCR, which plays a vital role in ensuring care and effective management of patients.
Materials and Methods: The present cross-sectional study was conducted at Government Medical College, Srikakulam, and Rangaraya Medical College, Kakinada, by collecting secondary data from laboratory registers of patients tested for HBsAg from September 2023 to September 2025 (a two-year duration). Each patient's serum was first subjected to detection of HBsAg by HEPA Card, a rapid immunochromatographic assay (ICA) (Alere Trueline™, Alere Medical Pvt. Ltd.). All seropositive and indeterminate cases were then subjected to detection of HBsAg by ELISA (Cortez Diagnostics, Inc.), and all seropositive cases confirmed by ELISA were subjected to real-time quantitative PCR for HBV DNA to detect the viral load in blood and guide treatment decisions. Statistical analysis was done using Microsoft Office Excel 2010.
Results: A total of 24,000 patients (12,000 males and 12,000 females) were included, of whom 167 (0.69%) were positive for HBsAg and 6 (0.025%) were indeterminate. Of the 12,000 males, 102 (0.85%) were positive and 4 (0.033%) were indeterminate; of the 12,000 females, 65 (0.54%) were positive and 2 (0.016%) were indeterminate. Positivity was highest in the 50–60-year age group (46; 0.95%), the highest number of cases came from the Surgery department (48; 0.61%), and most indeterminate cases were from Surgery (2) and OBGY (2). The most positives were among HIV patients (80; 5.47%). By ELISA, all 167 positive cases were confirmed positive (100%). Of the 6 indeterminate cases, 4 (66.6%) were confirmed positive and 2 (33.3%) were confirmed negative, bringing the total number of positive cases to 171 (0.71%). Of the 167 positive samples, on viral load testing with real-time micro PCR analyzers, 54 (32.33%) showed an undetectable viral load and 113 (67.66%) showed a detectable viral load. Real-time PCR enables decentralization and near-patient diagnosis and treatment monitoring of Hepatitis B infection, being rapid, simple, robust, and user friendly and offering 'sample to result' capability even in resource-limited settings. The assay was used to assess the virological response to antiviral treatment.
Conclusion: Over the last two decades there has been a significant improvement in the control of HBV infection owing to screening and diagnosis, the implementation of vaccination programs, and recent pharmaceutical advances in effective antiviral therapies that inhibit viral replication for long durations. Our study showed that HBV is endemic in the north-east coastal region of Andhra Pradesh. It emphasizes the need for universal vaccination of all children and the establishment of strategies to prevent mother-to-child transmission. Our research is helpful for the development of national control strategies against Hepatitis B infection, and the present study showed that rapid test kits are somewhat inferior to the ELISA confirmatory test, being associated with indeterminate results. The HBV DNA test is a crucial tool in diagnosis and management, and the viral load guides treatment decisions and monitors disease progression.
43. Evaluating the Effectiveness of Model-Making as a Learning Tool in Physiology Education
Pushpa G., Bhanu Priya H.
Abstract
Background: Physiology is foundational in medical education, yet many students find it abstract and challenging to master through traditional lectures alone. Active learning strategies, such as model making, have the potential to enhance understanding and engagement.
Objective: To evaluate the effectiveness of model making as a learning tool in improving the understanding and retention of physiological concepts in undergraduate physiology education.
Methods: A cross-sectional study was conducted among 150 first-year MBBS students at Akash Institute of Medical Sciences. Students were divided into 22 groups, each assigned a physiology topic to create cost-effective models, including working models where feasible. The models were presented in a competition judged by faculty. A self-administered feedback questionnaire using a 5-point Likert scale assessed students’ perceptions. Data were analyzed descriptively.
Results: Feedback was obtained from 145 students (96.6% response rate). Most participants agreed that model making was helpful in understanding concepts (99.4%), promoted independent exploration (86%), and enhanced teamwork (94%). Additionally, 88% perceived this method as more effective than traditional teaching, and over 90% expressed interest in future sessions. However, about 30% reported that model preparation interfered with their routine activities.
Conclusion: Model making proved to be an effective and engaging strategy to facilitate deeper understanding and active learning in physiology. Incorporating structured model-making activities alongside traditional lectures may enhance conceptual clarity, teamwork, and independent learning. Further research should include objective measures of knowledge gain and explore broader applicability.
44. Radiological Evaluation of Ovarian Cystic Lesions
Ruchit Shah, Dhaval Modi
Abstract
Ovarian cystic lesions are among the most frequently encountered findings in gynecological imaging and encompass a broad spectrum ranging from benign functional cysts to malignant neoplasms. Accurate radiological evaluation plays a pivotal role in diagnosis, risk stratification, and management planning. Imaging modalities such as ultrasonography (USG), computed tomography (CT), and magnetic resonance imaging (MRI) are essential in characterizing these lesions based on morphology, internal architecture, and associated features. This article reviews the radiological approach to ovarian cystic lesions, emphasizing imaging characteristics that aid in differentiating benign from malignant entities, and highlights standardized reporting systems.
45. Detection of Fetal Malnutrition: Clinical Assessment of Nutritional Status Score Compared with Weight for Gestation and Ponderal Index
S. Madhu, K. R. Jayashree, K. Ganesh Shankar
Abstract
Introduction: Fetal malnutrition represents a distinct clinical entity that is not synonymous with small for gestational age or intrauterine growth retardation, as these conditions may occur independently of one another. The Clinical Assessment of Nutritional Status score provides a systematic method for evaluating fetal malnutrition and offers advantages over conventional anthropometric measures used to assess fetal growth.
Methods: This prospective cross-sectional study examined 100 full-term neonates at a tertiary care hospital over three months. Neonates were categorized as small for gestational age or appropriate for gestational age using Alexander growth curves. Fetal malnutrition was evaluated using the Clinical Assessment of Nutritional Status score as the reference standard and compared with weight for gestational age and Ponderal index measurements. A Clinical Assessment of Nutritional Status score of 24 or below was designated as indicating fetal malnutrition.
Results: The Clinical Assessment of Nutritional Status score identified malnutrition in 16% of neonates while classifying 84% as well-nourished. Assessment based on weight for gestational age revealed that 63% of babies were small for gestational age while 37% were appropriate for gestational age. The Ponderal index classified 9% as malnourished. Among 63 small for gestational age babies, 43 were small but not malnourished, while 3 out of 37 appropriate for gestational age babies were malnourished. When the Clinical Assessment of Nutritional Status score served as the reference standard, weight for gestational age demonstrated sensitivity and specificity of 94.12% and 64.29% respectively, while Ponderal index showed sensitivity and specificity of 25% and 94% respectively.
Discussion: These findings indicate that small for gestational age and intrauterine growth retardation are not synonymous with fetal malnutrition. The Clinical Assessment of Nutritional Status score can identify fetal malnourishment in neonates that other methods fail to detect, proving superior to weight-based classification and Ponderal index in accurately identifying fetally malnourished neonates, including those with normal birth weights. This has important clinical implications for recognizing at-risk infants who require enhanced monitoring and intervention.
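The sensitivity and specificity figures above follow the standard 2×2 definitions of an index test (weight for gestational age, or Ponderal index) evaluated against a reference standard (here, the CANS score). A minimal sketch with hypothetical counts, not the study's data:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity and specificity of an index test against a reference standard.
    tp: reference-positive cases the test flagged; fn: those it missed;
    tn: reference-negative cases the test cleared; fp: those it wrongly flagged."""
    sensitivity = tp / (tp + fn)   # true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate
    return sensitivity, specificity

# Hypothetical 2x2 counts for illustration
sens, spec = diagnostic_accuracy(tp=15, fp=30, fn=1, tn=54)
```

The pattern in the abstract, a high-sensitivity but low-specificity test (weight for gestation) alongside a low-sensitivity but high-specificity one (Ponderal index), is a common trade-off when two screening tools are compared against one reference.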
46. Prevalence of Gaming Disorder among Middle School Students and Its Correlation with Parental Perception on Problematic Internet Usage in Chennai
Sargema Manikangtan, Sudharshini Subramaniam, Arvinth Ram A.
Abstract
Background: In recent years, the internet and internet-related activities have seen a tremendous increase in users, bringing both advantages and disadvantages. Internet gaming disorder (IGD) is one such disadvantage, recognized by WHO in 2018 in ICD-11 as a health concern. IGD, along with problematic internet usage (PIU), can have a detrimental effect on an adolescent's health and also has public health implications by paving the way for addiction at a later age if timely interventions are not made at the initial stages. This study aims to determine the prevalence of internet gaming disorder in the 10–14 age group and parents' perception of their wards' internet usage.
Methods: An analytical cross-sectional study was conducted among 10- to 14-year-old students in grades 6 to 8, selected randomly from two different schools in Chennai. The parent version of Young's diagnostic questionnaire and the Internet Gaming Disorder Scale – Short Form, along with self-report forms, were used. The data were collected over a period of two months and analyzed using SPSS software.
Results: This study with the participation of 285 adolescents and 285 parents found a prevalence of IGD to be at 2.1% (95% CI: 0.4 – 3.7) and with a prevalence of problematic internet usage at 16.8% (95% CI: 12.5 – 21). A correlation was present between IGD and PIU (p-value <0.001, sensitivity 83.3%, specificity 84.3%, positive predictive value 10.4%, negative predictive value 99.5%).
Conclusion: This study found that among the participants, 2.1% had IGD and 16.8% had PIU (by parental perception), and the two conditions showed a statistically significant association (p<0.001). Reduction in physical activity, academic performance, and interaction with family were perceived as detrimental by the parents.
47. A Comparative Follow-up Study of Metabolic Syndrome in Schizophrenia Patients Treated with Olanzapine and Risperidone
N. S. Ammaji Rao, Ram Naresh Reddy Telluri, Kanaka Mahalaxmi A., Kiran Kumar Singuru, Shaik Firoj
Abstract
Background: Schizophrenia is a severe mental illness characterized by anomalies in thought processes, perceptions, emotional response, and social relations. It affects approximately 1% of the global population, translating to millions of individuals suffering from its debilitating effects. Metabolic syndrome refers to a cluster of interrelated risk factors that significantly elevate the risk of cardiovascular diseases and type 2 diabetes. Individuals with schizophrenia are more likely to suffer physical ailments, notably cardiovascular problems. The interplay between the chronic nature of schizophrenia and physical health is multifactorial, involving genetic predisposition, lifestyle factors, and treatment-related side effects. Second-generation antipsychotics (SGAs), including olanzapine and risperidone, have revolutionized the treatment of schizophrenia by targeting both positive symptoms (e.g., delusions, hallucinations) and negative symptoms (e.g., reduced emotional expression, social withdrawal). Their use is associated with an increased risk of metabolic side effects, including weight gain, dyslipidemia, and insulin resistance, which contribute to the development of metabolic syndrome.
Methodology: This comparative observational study analyzed the metabolic impact of olanzapine (n=70) versus risperidone (n=70) over one year in 140 schizophrenia patients aged 18–50 years at the Government Hospital for Mental Care, Visakhapatnam. Participants, selected by simple random sampling and meeting ICD-11 criteria, were either drug-naïve or had completed a six-month washout period; those with pre-existing metabolic conditions were excluded. Researchers measured BMI, waist circumference, blood pressure, and lipid profiles at baseline, three months, and six months, diagnosing metabolic syndrome via IDF and NCEP ATP III criteria. In all, 121 participants completed the six-month follow-up (58 in the olanzapine group, 63 in the risperidone group).
Results: This prospective study provides valuable evidence that olanzapine is associated with significantly higher metabolic risk compared to risperidone, as demonstrated by greater changes in weight, BMI, waist circumference, triglycerides, HDL cholesterol, and incidence of metabolic syndrome over a 6-month period. Independent risk factors included high baseline BMI, physical inactivity, poor diet, and family history of cardiometabolic disease.
Conclusion: In conclusion, this study provides robust evidence supporting the metabolic safety advantages of risperidone over olanzapine. The findings align with both Indian and global literature and advocate for routine risk stratification and tailored therapeutic approaches in schizophrenia care. Given the rising burden of antipsychotic-induced metabolic syndrome, integrating physical health surveillance into psychiatric practice is not only advisable but essential for long-term patient well-being.
48. Assessment and Comparison of Markers of Inflammation and the Systemic Immune–Inflammation Index in Patients with Major Depressive Disorder and Healthy Controls at SMS Medical College, Jaipur: An Observational Case–Control Study
Rakshita Goel, Alok Kumar Tyagi, Kashish Thaper
Abstract
Background: Major depressive disorder (MDD) is increasingly recognized as a disorder with neuroimmune components, where low-grade systemic inflammation may influence neurotransmission, neuroendocrine function, and neuroplasticity. Cell count–derived indices such as the neutrophil-to-lymphocyte ratio (NLR), platelet-to-lymphocyte ratio (PLR), monocyte-to-lymphocyte ratio (MLR), and systemic immune-inflammation index (SII) are inexpensive and clinically scalable surrogate markers of systemic inflammatory status. Evidence regarding these markers in major depressive disorder has been inconsistent; meta-analytic studies suggest that NLR is often elevated in depression, whereas findings for PLR and MLR remain variable and inconclusive.
Methods: An observational case–control study was conducted at the Psychiatric Centre, SMS Medical College, Jaipur, Rajasthan, India, between October 2024 and August 2025. Adults aged 18–60 years with MDD according to DSM-5-TR were recruited consecutively (screened n=72; analyzed n=65). Healthy controls were enrolled after screening with the Mini-International Neuropsychiatric Interview (MINI, version 7.0.2) (screened n=85; analyzed n=65). Participants with systemic illnesses, pregnant participants, and those taking anti-inflammatory, immunomodulatory, or antibiotic medications were excluded. Absolute neutrophil count (ANC), absolute lymphocyte count (ALC), absolute monocyte count (AMC), ESR, NLR, PLR, MLR, and SII were determined from peripheral blood. Group comparisons were conducted using appropriate parametric/non-parametric tests (α=0.05).
Results: A total of 130 participants were included in the analysis (65 MDD; 65 controls). Mean age was higher in the MDD group (31.79±9.72 years) than in controls (22.19±6.65 years; p=0.02). There was no difference in sex distribution (female: 44 vs 43; p=0.99). MDD subjects had numerically higher ANC (5077±1856 vs 4503±2655), AMC (416±211 vs 333±309), NLR (2.54±1.64 vs 1.93±0.57), and ESR (21.51±19.10 vs 9.20±6.45). However, none of the inflammatory indices showed statistically significant between-group differences.
Conclusion: In this hospital-based sample, several peripheral indicators of inflammation trended higher in MDD, but the differences were not statistically significant. Possible reasons include limited statistical power, age mismatch between groups, and single-time-point sampling.
49. Serum Cystatin C: An Innovative Biomarker for Early Renal Impairment in Diabetes
Teena Gupta, Sapna Singh, Vibha Khare
Abstract
Background: Diabetic nephropathy has been identified as one of the primary contributors to chronic kidney disease (CKD) worldwide. Conventional indicators, such as serum creatinine and microalbuminuria, often fail to detect early renal impairment. Serum cystatin C (CysC) is a promising marker of early glomerular dysfunction.
Objective: To assess the diagnostic value of serum cystatin C for detection of early renal impairment in individuals with type 2 diabetes mellitus (T2DM).
Methods: This cross-sectional analytical study included 150 participants: 75 healthy controls (HCs) and 75 patients with type 2 diabetes. Renal function was assessed using serum creatinine (SCr), urine albumin–creatinine ratio (UACR), estimated glomerular filtration rate (eGFR), and CysC. Statistical comparisons and correlation analyses were performed.
Results: CysC levels in T2DM patients were significantly higher than in controls (p < 0.001). Even with normal serum creatinine, normoalbuminuric diabetic patients had elevated cystatin C levels (p < 0.01). Cystatin C showed a stronger negative correlation with eGFR than creatinine did (r = −0.72 vs. r = −0.58).
Conclusion: CysC is an accurate and sensitive biomarker of early renal deterioration in diabetic individuals and could enable earlier management before nephropathy becomes clinically evident.
50. Early Enteral Nutrition versus Delayed Feeding in Acute Pancreatitis: A Prospective Comparative Study
Dhruv Samirkumar Dave, Raj Tusharkumar Khanna, Gautamkumar Bhikhalal Suthar
Abstract
Background: The optimal timing for initiating enteral nutrition in acute pancreatitis remains a subject of ongoing clinical debate. While traditional management advocated prolonged fasting to achieve pancreatic rest, emerging evidence suggests that early enteral feeding may confer significant clinical benefits. This study aimed to compare clinical outcomes between early enteral nutrition and delayed feeding strategies in patients hospitalized with acute pancreatitis.
Methods: This prospective comparative study was conducted at a tertiary care hospital. A total of 174 consecutive patients diagnosed with acute pancreatitis were allocated into two groups: the early enteral nutrition group (EEN, n = 88), receiving oral or nasojejunal feeding within 24 hours of admission, and the delayed feeding group (DF, n = 86), receiving nutritional support after 72 hours or upon complete resolution of abdominal pain. Primary outcomes included length of hospital stay, organ failure incidence, and infectious complications. Secondary outcomes included pain resolution time, inflammatory marker trajectories, and mortality.
Results: Patients in the EEN group demonstrated significantly shorter hospital stays (7.3 ± 2.9 days vs. 10.6 ± 4.1 days, p < 0.001), lower rates of infectious complications (9.1% vs. 22.1%, p = 0.017), and reduced incidence of organ failure (6.8% vs. 16.3%, p = 0.048). Pain resolution occurred earlier in the EEN group (3.1 ± 1.4 days vs. 4.7 ± 2.1 days, p < 0.001). C-reactive protein levels at day 5 were significantly lower in the EEN group (48.7 ± 31.2 mg/L vs. 79.4 ± 42.6 mg/L, p < 0.001). Mortality rates did not differ significantly between groups (2.3% vs. 4.7%, p = 0.440). Feeding intolerance occurred in 11.4% of EEN patients but was manageable in all cases.
Conclusion: Early enteral nutrition within 24 hours of admission in acute pancreatitis is associated with significantly reduced hospital stay, fewer infectious complications, lower organ failure rates, and accelerated clinical recovery compared with delayed feeding. These findings support the adoption of early feeding protocols as standard practice in acute pancreatitis management.
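As an illustration of how such proportion comparisons are typically tested, the sketch below back-calculates approximate event counts from the reported infectious-complication rates (8/88 ≈ 9.1% and 19/86 ≈ 22.1%; assumed counts, not taken from the paper) and applies a pooled two-proportion z-test, which lands near the reported p = 0.017.

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    pval = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, pval

# Counts back-calculated from the reported rates (illustrative only,
# not the authors' actual analysis):
z, p = two_proportion_ztest(8, 88, 19, 86)
print(round(p, 3))  # p ≈ 0.018
```

The pooled z-test is one of several valid choices here; a chi-square test on the 2×2 table gives an equivalent result.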
51. Comparative Efficacy and Tolerability of Oxcarbazepine, Carbamazepine, and Lithium in the Treatment of Acute Mania
Sanjay Kumar Saini, Alok Kumar Sinha, Vikash Kumar Gupta
Abstract
Background: Carbamazepine is an established mood stabilizer for the treatment of acute mania; however, its clinical use is often limited by adverse effects. Oxcarbazepine, a structural analogue of carbamazepine, has been suggested as a potentially better-tolerated alternative, though evidence regarding its efficacy in mania remains limited and inconsistent.
Objectives: To compare the efficacy and tolerability of oxcarbazepine with carbamazepine in patients with acute manic episodes, using lithium as an active control.
Methods: This randomized, open-label, lithium-controlled trial was conducted at a tertiary care government hospital. Ninety patients diagnosed with a manic episode as per DSM-IV-TR criteria and having a Young Mania Rating Scale (YMRS) score ≥20 were randomized in a 1:1:1 ratio to receive oxcarbazepine, carbamazepine, or lithium and were followed for six weeks. The primary outcome was adequate response, defined as a ≥50% reduction in YMRS score from baseline. Safety and tolerability were assessed using the Systematic Assessment for Treatment-Emergent Effects (SAFTEE) checklist along with clinical and laboratory monitoring. Statistical analysis was performed using SPSS version 16, with p<0.05 considered statistically significant.
Results: No patient in any treatment group achieved adequate response at one week. At two weeks, response rates remained low and were not significantly different among groups (p=0.232). At six weeks, carbamazepine (80.0%) and lithium (73.3%) demonstrated significantly higher response rates compared with oxcarbazepine (46.7%) (p=0.015). Oxcarbazepine was associated with fewer adverse effects, whereas carbamazepine showed significantly higher rates of blurred vision, benign leukopenia, and deranged liver function tests.
Conclusion: Oxcarbazepine was less effective than carbamazepine and lithium in the treatment of acute mania, although it demonstrated a more favorable tolerability profile.
52. Protein Nanoparticles in Drug Delivery: Biocompatibility, Functionalization, and Therapeutic Applications
Ajay Kumar Saini, Jitendra Saini, Narsingh Rajpoot, Shakar Lal Saini
Abstract
Owing to their biocompatibility and biodegradability, together with a wide variety of structural characteristics, protein nanoparticles (PNPs) can act as novel drug delivery vehicles. PNPs can be constructed from natural proteins such as albumin, gelatin, silk fibroin, casein, and ferritin, providing several advantages over organic polymers and inorganic materials. PNPs can encapsulate many types of therapeutic agents, including proteins, peptides, nucleic acids, and small molecules, protecting them from degradation and premature clearance. Moreover, their unique surface chemistry permits functionalization with other polymers, targeting ligands, or biomimetic coatings to achieve enhanced circulation time, tissue specificity, and cellular uptake. PNPs also offer pharmacokinetic and pharmacodynamic profiles that increase the efficacy of therapeutic treatments while decreasing toxicity to healthy tissues. Current fabrication techniques allow precise regulation of the size, morphology, and release kinetics of PNPs. Protein nanocarriers have demonstrated potential as delivery systems for genes and vaccines, as well as for antimicrobial therapy, cancer treatment, and regenerative medicine. Despite these advantages, challenges remain in lot-to-lot reproducibility, stability, and large-scale manufacturing of the carrier system. This review highlights recent advancements in the design of protein nanoparticles, their biocompatibility, functionalization methods, and therapeutic applications, demonstrating that protein nanoparticles could serve as safe, efficient, and readily translatable drug delivery systems in the future.
53. Panorama of Soft Tissue Tumours at a Tertiary Care Centre in Bihar: A Retrospective Observational Study
Ankit Anand, Vibhuti Kumar
Abstract
Objective and Aim: Soft tissue tumors (STTs) represent a heterogeneous group of neoplasms with diverse histogenesis, biological behavior, and clinical outcomes. The present study aims to evaluate the spectrum, frequency, demographic distribution, anatomical location, and histopathological patterns of soft tissue tumors diagnosed at a tertiary care center in Bihar, India, with special emphasis on benign–malignant correlation and clinicopathological characteristics.
Materials and Methods: This retrospective observational study was conducted in the Department of Pathology at a tertiary care teaching hospital in Bihar over a period of five years (January 2019–December 2023). All histopathologically confirmed cases of soft tissue tumors were included. Tumors were classified according to the WHO Classification of Soft Tissue and Bone Tumors (2020). Statistical analysis was performed using SPSS version 26.0. Descriptive statistics, chi-square test, and logistic regression analysis were applied.
Results: A total of 312 cases of soft tissue tumors were analyzed. Benign tumors constituted 76.9%, intermediate tumors 7.4%, and malignant tumors 15.7%. The most common benign tumor was lipoma (38.1%), while undifferentiated pleomorphic sarcoma (21.4%) was the most frequent malignant tumor. Malignant tumors were significantly associated with age >40 years (p <0.001) and deep-seated location (p = 0.002).
Conclusion: Soft tissue tumors in Bihar show a predominance of benign lesions with lipoma being the most common entity. However, a significant proportion of malignant tumors underscores the importance of early diagnosis, adequate sampling, and histopathological evaluation for optimal patient management.
54. Antimicrobial Resistance Patterns in Escherichia coli Isolates from a Tertiary Care Teaching Hospital in Eastern India
Krishna Gopal, Akash Kumar Sharma, Amit Kumar Anand
Abstract
Background: Antimicrobial resistance (AMR) among Escherichia coli has emerged as a major public health concern globally, particularly in low- and middle-income countries where antibiotic consumption is high and stewardship programs are still evolving. E. coli remains the leading cause of community- and hospital-acquired urinary tract infections (UTIs), and increasing resistance to commonly used antimicrobials has significantly limited therapeutic options.
Aim: To determine the antimicrobial resistance patterns of E. coli isolates recovered from clinical samples and to analyze department-wise susceptibility trends at a tertiary care teaching hospital in Bihar, India.
Methods: A hospital-based observational study was conducted in the Department of Microbiology, Jawaharlal Nehru Medical College, Bhagalpur, from 5 February 2025 to 31 December 2025. A total of 65 non-duplicate E. coli isolates obtained from various clinical samples were included. Identification and antimicrobial susceptibility testing were performed using standard microbiological techniques and interpreted as per CLSI guidelines. Data were analyzed using descriptive and inferential statistics.
Results: E. coli constituted the predominant uropathogen. High susceptibility was observed to amikacin and gentamicin, while markedly reduced susceptibility was seen for fluoroquinolones and third-generation cephalosporins. Carbapenem resistance, though less frequent, was detected and showed inter-departmental variation.
Conclusion: The study demonstrates a high burden of multidrug-resistant E. coli isolates with preserved susceptibility to aminoglycosides. Continuous surveillance through antibiograms and rational antibiotic stewardship are essential to curb the progression of AMR.
55. Cardiac Biomarkers: Biochemical Markers for Cardiovascular Disease Diagnosis and Prognosis
Raj Kishore Thakur, Md. Afroz Alam, Rolly Bharty
Abstract
Background & Aim: Cardiovascular diseases (CVDs) remain the leading cause of morbidity and mortality worldwide, with myocardial infarction and heart failure constituting major contributors to disease burden. Early diagnosis and accurate prognostication are essential for improving clinical outcomes. The present study aimed to evaluate the diagnostic and prognostic utility of selected cardiac biochemical markers in patients with cardiovascular diseases and to analyze their effectiveness in predicting cardiovascular health outcomes using regression and comparative analytical models.
Methods: This hospital-based prospective observational study was conducted in the Department of Biochemistry in collaboration with the Department of Medicine/Cardiology at Jawaharlal Nehru Medical College, Bhagalpur, Bihar, over a period from 1st October 2024 to 30th September 2025. A total of 70 clinically diagnosed cardiovascular disease patients were enrolled based on predefined inclusion and exclusion criteria. Blood samples were collected for estimation of cardiac biomarkers including cardiac troponins, natriuretic peptides, inflammatory markers, lipid profile parameters, and other relevant biochemical indices using standardized laboratory methods.
Results: The study population predominantly comprised middle-aged and elderly patients, with a higher representation of males. Significant variability was observed in biomarker concentrations across different cardiovascular conditions. Cardiac troponins and natriuretic peptides demonstrated higher diagnostic effectiveness scores compared to other biochemical markers. Forest plot analysis revealed moderate pooled effect sizes for selected biomarkers, indicating their clinical relevance in cardiovascular disease assessment. Regression analysis exploring the relationship between two key biochemical markers and cardiovascular health outcomes showed a limited but notable association, suggesting that biomarker-based prediction is influenced by multifactorial determinants rather than isolated parameters.
Conclusion: The findings of this study underscore the critical role of cardiac biomarkers in the diagnosis and prognostic assessment of cardiovascular diseases. While traditional markers such as troponins and natriuretic peptides remain highly valuable, their predictive capability is enhanced when interpreted alongside other biochemical and clinical variables.
56. An Evaluation of Thyroperoxidase Antibody in Antenatal Mothers with Hypothyroidism to Assess the Prevalence of Autoimmune Thyroiditis in Perambalur District, Tamil Nadu
Alamelumangai D., Nageshwari A., K. Shanmugasundaram, R. Menaha
Abstract
Background: Hypothyroidism during pregnancy poses significant risks to maternal and fetal health, with autoimmune thyroiditis (Hashimoto’s thyroiditis) being a leading cause in iodine-sufficient regions. Thyroperoxidase (TPO) antibodies serve as a key marker for autoimmune thyroid disease. This study aimed to evaluate the prevalence of TPO antibody positivity in hypothyroid antenatal women in Perambalur district, Tamil Nadu, India.
Methods: A cross-sectional observational study was conducted from July 2017 to July 2018 at Dhanalakshmi Srinivasan Medical College and Hospital. Eighty antenatal women aged 18-30 years with newly diagnosed hypothyroidism (TSH >10 μIU/mL) were included. Serum levels of total T3 (TT3), total T4 (TT4), thyroid-stimulating hormone (TSH), and TPO antibodies were measured using enzyme-linked immunosorbent assay (ELISA). Participants were categorized into TPO-positive (≥50 IU/mL) and TPO-negative groups. Demographic data, thyroid profiles, and correlations were analyzed using descriptive statistics and Pearson’s correlation.
Results: Of the 80 participants, 47 (58.75%) were TPO-positive. The mean age was 25.55 ± 5.09 years in the TPO-positive group and 26.03 ± 4.87 years in the TPO-negative group (p > 0.05). TPO-positive women had significantly higher TSH (22.68 ± 6.15 μIU/mL vs. 17 ± 4.4 μIU/mL; p < 0.01) and lower TT3 (0.6 ± 0.2 ng/mL vs. 0.8 ± 0.4 ng/mL; p < 0.01) and TT4 (2.9 ± 1 μg/dL vs. 3.3 ± 1.2 μg/dL; p < 0.01). A strong positive correlation was observed between TPO antibodies and TSH (r = 0.7, R² = 0.49; p < 0.01). TPO-positive women had higher rates of family history of thyroid dysfunction (40% vs. 33%) and past miscarriages (25.5% vs. 18%; p < 0.01 for miscarriages).
Conclusion: The prevalence of Hashimoto’s thyroiditis in hypothyroid antenatal women in this region is high (58.75%), underscoring the need for routine TPO antibody screening in early pregnancy to mitigate adverse outcomes. Universal screening and early intervention could improve maternal and fetal health.
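The reported R² = 0.49 follows directly from r = 0.7, since the coefficient of determination is the square of Pearson's r in simple correlation. A minimal sketch with hypothetical TPO/TSH pairs (illustrative values only, not the study's raw data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient for paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired values for illustration (NOT the study's data):
tpo = [55, 80, 120, 150, 200, 260, 310, 400]             # TPO antibody, IU/mL
tsh = [15.2, 18.1, 19.5, 21.0, 24.3, 23.8, 27.9, 30.4]   # TSH, uIU/mL

r = pearson_r(tpo, tsh)
r2 = r ** 2   # coefficient of determination: R² = r²
```

With the study's reported r = 0.7, the same squaring step yields R² = 0.49, i.e. TPO antibody levels would explain about 49% of the variance in TSH under a simple linear model.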
57. Knowledge, Awareness and Practice of Materiovigilance among Healthcare Professionals in a Tertiary Care Hospital of Eastern India: A Cross-Sectional Observational Study
Swagata Koley, Tanmoy Gangopadhyay, Rahul Das
Abstract
Background: Medical devices are an essential part of modern health care. However, there are instances where their use has caused significant morbidity and mortality. It is therefore necessary to assess the risks and benefits associated with devices at all stages of development and use. Materiovigilance is the coordinated system of identification, collection, reporting, and analysis of any untoward occurrences associated with the use of medical devices. Although medical faculty and residents are major stakeholders in reporting adverse events related to medical devices, at present there are very few published data on their knowledge, attitude, and practice regarding materiovigilance in India. This study was conducted with the objective of assessing the knowledge, attitude, and practice of materiovigilance among doctors in a tertiary care teaching hospital of Eastern India.
Methodology: This was a cross-sectional, questionnaire-based observational study. A 15-item questionnaire was created in Google Forms and distributed to all doctors of the hospital. Of 135 doctors, 125 responded, and the recorded responses were statistically analysed.
Results: Approximately 74% of respondents were not aware of the ongoing programme for monitoring adverse events due to medical devices. Although 46.8% of respondents had encountered adverse events related to medical devices, only 32.5% had actually seen the reporting form and only 18.4% had reported the events.
Conclusion: Greater awareness efforts are needed to fill the lacunae in knowledge regarding adverse events related to medical devices.
58. Association Between Osteoarthritis and Type 2 Diabetes Mellitus
Nikila K., M. Rajesh
Abstract
Osteoarthritis (OA) and type 2 diabetes mellitus (T2DM) are two common chronic illnesses that significantly strain healthcare systems around the globe. Given the possible synergistic effects of T2DM and OA on metabolism and joint health, their comorbidity is a significant problem, particularly in people who are overweight or obese. The aim of our study was to assess and examine the literature regarding the prevalence, association, symptoms, physical function, and shared risk factors of T2DM and OA. Both conditions are characterised by chronic low-grade systemic inflammation, which is crucial to their development. We conducted a critical review of the literature to explore the association between T2DM and OA, whether any association is site-specific for OA, and whether the presence of T2DM affects OA outcomes. We found that common risk factors, such as metabolic syndrome and demographic characteristics, may influence the relationship between the two conditions; however, more study is needed on this topic. Extensive future research is also required to determine the effect of chronic medication use, as there is conflicting evidence on whether medication use contributes positively or negatively to the connection between T2DM and OA.
59. Study of Clinical Isolates of Methicillin Resistant Staphylococcus Aureus (MRSA) and its Antibiotic Susceptibility Pattern in Tertiary Care Teaching Hospital
Devanshi Rangani, Dipak Panjwani, Sanjay Mehta
Abstract
Introduction: Methicillin-resistant Staphylococcus aureus (MRSA) is a resistant organism that heavily contributes to hospital- and community-acquired infections worldwide. It spreads easily from patient to patient via the hands of health workers, contaminated objects, and the air. MRSA poses a major clinical problem in treatment, so it is essential to know its antibiotic susceptibility pattern.
Material & Methods: This study was carried out in the Department of Microbiology at C. U. Shah Medical College and Hospital, Surendranagar. A total of 180 samples were included in the study during the period 1st July 2021 to 30th June 2022. Identification of the isolated organisms was done by conventional methods using biochemical reactions. Antimicrobial susceptibility testing and MRSA detection were done using the VITEK-2 compact system.
Results: Methicillin-resistant Staphylococcus aureus (MRSA) was seen in 58% of the isolates. The sensitivity profile showed Tigecycline 100%; Nitrofurantoin, Linezolid, and Teicoplanin 99%; and Vancomycin 95%. Lower sensitivity rates were found for Levofloxacin (7%), Ciprofloxacin and Oxacillin (6%), and Cefepime (2%). Inducible clindamycin resistance was found in 37.5%. Of the isolates tested against vancomycin, 95% were sensitive, 4% were VISA, and 1% were VRSA.
Discussion: The prevalence of MRSA was 57.78% across various clinical samples. Similar studies by Fatemeh et al. and Kirti et al. reported MRSA prevalences of 63.20% and 33.7%, respectively.
Conclusion: MRSA isolates were most sensitive to Tigecycline, Linezolid, Teicoplanin, and Vancomycin, and least sensitive to Oxacillin and Cefepime. Knowledge of the antibiotic sensitivity pattern of S. aureus will therefore be helpful in controlling evolving resistance.
60. Comparative Evaluation of Dexmedetomidine and Dexamethasone as an Adjuvant to Bupivacaine in Supraclavicular Brachial Plexus Block for Upper Limb Surgeries
Jayakumar J., Asha A., Shanmuga Priya G.
Abstract
Background: Supraclavicular brachial plexus block is commonly used for upper limb surgeries. Various adjuvants have been added to local anaesthetics to improve block characteristics and prolong postoperative analgesia. Dexmedetomidine and dexamethasone are frequently used additives to bupivacaine, but their comparative efficacy remains an area of interest.
Methods: This ambispective randomised double-blind controlled study was conducted in 60 adult patients (ASA I–II) undergoing elective upper limb orthopaedic surgeries. Patients were randomly divided into two groups of 30 each. Group A received 25 ml of 0.25% bupivacaine with dexamethasone 8 mg, while Group B received 25 ml of 0.25% bupivacaine with dexmedetomidine 1 µg/kg for ultrasound-guided supraclavicular brachial plexus block. Onset and duration of sensory and motor block, sedation score, haemodynamic variables, time to first rescue analgesia, and total analgesic consumption over 24 hours were recorded and analysed.
Results: Dexmedetomidine significantly hastened the onset of sensory (6.4 ± 1.3 min vs 9.8 ± 1.5 min) and motor block (8.1 ± 1.4 min vs 12.2 ± 1.6 min) compared to dexamethasone (p < 0.001). Duration of sensory block (10.2 ± 1.3 hrs vs 7.6 ± 1.1 hrs) and motor block (9.1 ± 1.1 hrs vs 6.3 ± 1.0 hrs) were also prolonged in the dexmedetomidine group (p < 0.001). Time to first rescue analgesia was longer (12.7 ± 1.4 hrs vs 9.8 ± 1.2 hrs), and analgesic requirement over 24 hours was significantly reduced. Sedation scores were higher with dexmedetomidine. Mild bradycardia and hypotension occurred more frequently with dexmedetomidine but were manageable.
Conclusion: Dexmedetomidine is a superior adjuvant to dexamethasone when combined with bupivacaine for supraclavicular brachial plexus block, providing faster onset, prolonged block duration, improved postoperative analgesia, and acceptable haemodynamic stability.
61. Evaluation of Thyroid Dysfunction in Patients with Type 2 Diabetes Mellitus and its Impact on Metabolic Control
Chandrakant Bhaskar, Ved Prakash Ghilley, Shashikant Bhaskar
Abstract
Background: Type 2 Diabetes Mellitus (T2DM) is frequently associated with thyroid dysfunction due to their interrelated effects on metabolism. Thyroid abnormalities in diabetic patients may remain undiagnosed and can adversely affect glycemic control.
Objectives: To evaluate the prevalence of thyroid dysfunction in patients with Type 2 Diabetes Mellitus and to assess its impact on metabolic control.
Materials and Methods: This hospital-based cross-sectional observational study was conducted at the Department of Medicine, Government Medical College, Korba, Chhattisgarh, over a period of one year. A total of 100 patients with Type 2 Diabetes Mellitus were enrolled. Clinical evaluation and laboratory investigations, including fasting blood glucose, post-prandial blood glucose, HbA1c, and thyroid profile (TSH, FT3, FT4), were performed. Thyroid status was categorised based on standard reference ranges. Glycemic control was assessed using HbA1c levels. Statistical analysis was carried out using appropriate tests, with p <0.05 considered statistically significant.
Results: Thyroid dysfunction was observed in 32% of patients, with subclinical hypothyroidism being the most common abnormality (18%). Poor glycemic control (HbA1c ≥7%) was seen in 62% of patients. Patients with thyroid dysfunction had significantly higher HbA1c levels compared to euthyroid patients (p <0.05), indicating poorer metabolic control.
Conclusion: Thyroid dysfunction, particularly subclinical hypothyroidism, is prevalent among patients with Type 2 Diabetes Mellitus and is associated with poor glycemic control. Routine thyroid screening in patients with T2DM may aid in early detection and improve metabolic outcomes.
62. Optical Properties of Liquid Crystals in Biosensing Applications
Vijayshree Patil N., R. D. Mathad, Praveen R. Patil
Abstract
Introduction: Liquid crystals (LCs) combine fluidity with long-range molecular order, giving rise to strong optical anisotropy (birefringence), polarization-dependent transmission, and (in cholesterics) selective reflection. These features make LC interfaces exquisitely sensitive to biomolecular binding events that perturb anchoring and director fields, enabling label-free biosensing.
Materials and Methods: A structured literature review (2016–Jan 2026) was designed around optical LC biosensors (nematic, cholesteric, blue phase, chromonic; planar films, droplets, elastomers, microcavities). Studies were screened using predefined inclusion/exclusion criteria and extracted for platform geometry, surface chemistry, optical readout, target class, and analytical performance.
Results: Across included studies, dominant readouts were polarized optical microscopy (POM) texture change, quantitative birefringence/retardation, and wavelength shifts from cholesteric reflection bands. Sensitivity improvements were commonly achieved via nucleic-acid amplification strategies and signal-amplifying interfacial chemistries. Six synthesis tables summarize optical properties, device architectures, functionalization routes, readout metrics, representative targets, and translation considerations.
Conclusion: LC optical biosensors are maturing from qualitative “dark-to-bright” assays toward quantitative, portable formats using smartphone optics, elastomeric photonic films, and robust surface chemistries. Key remaining challenges include standardization of alignment layers, suppression of matrix effects in biofluids, and reproducible quantification across lighting/imaging conditions.
63. Unmasking the Risks: A Population-Based Cross Sectional Survey on Non-Prescribed Face Cream Usage and Associated Adverse Reactions
Manali Tyagi, Satyanarayan V., Manjula M. J.
Abstract
Introduction: Non-prescribed face cream use has increased substantially due to easy availability, social media influence, and over-the-counter access. Many such products may contain harmful ingredients including topical steroids, hydroquinone, and heavy metals, leading to preventable dermatological adverse effects. Population-level data on usage patterns and associated reactions remain limited. The present study aimed to assess the knowledge, attitudes, and practices related to non-prescribed face cream usage and to evaluate the pattern of associated adverse reactions in the community.
Materials and Methods: A population-based cross-sectional online survey was conducted in India from July to December 2025 among adults aged ≥18 years using a structured questionnaire assessing knowledge, attitudes, practices, and adverse effects related to non-prescribed face creams. Participants with face cream use in the previous 12 months were included. The minimum calculated sample size was 300. Data were analyzed using SPSS v26 with descriptive statistics and chi-square tests; p<0.05 was considered significant.
Results: Among 300 participants, most were aged 20–29 years (50%) and female (85%). Use of non-prescribed face creams was reported by 64.7%. Awareness of harmful ingredients was present in 60%, but detailed ingredient recognition was low. Only 53.3% stated they would consult a doctor before starting a new product. Commonly used creams included skin-lightening and sun-protection products. Adverse effects were reported by 70% of users, most frequently hyperpigmentation (10.3%), acneiform eruptions (10%), and rashes (8.7%). Recommendations were mainly from family/friends (30%) and social media (25%). Only 18% sought medical consultation after adverse reactions, while many relied on self-care measures.
Conclusion: Non-prescribed face cream use is highly prevalent and frequently associated with adverse effects, alongside gaps in ingredient awareness and low rates of professional consultation, underscoring the need for improved public education and cosmetovigilance.
64. The Resistant Threat in Critical Care: Comprehensive Mapping of the Burden and Outcomes of Multidrug-Resistant Infections in ICU Patients
Sambeet Swain, Rajiv Dwaipayan Mishra, Sapna Das, Jyotirmayee Dash
Abstract
Background: Multidrug-resistant organism (MDRO) infections are increasingly prevalent in intensive care units (ICUs), posing significant therapeutic and prognostic challenges. Data from low- and middle-income countries remain limited despite these settings carrying a disproportionate share of the global burden.
Objectives: To estimate the prevalence of MDRO infections in ICU patients, assess their impact on mortality and clinical outcomes, and identify predictors of infection and death.
Methods: This prospective, observational cohort study was conducted over 18 months in a tertiary-care ICU. Adult patients (≥18 years) with culture-positive bacterial infections were enrolled (N=327). MDR status was defined according to CDC/ECDC criteria. Demographics, comorbidities, APACHE II/SOFA scores, device use, and microbiological data were collected. Outcomes included ICU mortality, length of ICU/hospital stay, ventilator days, and complications. Statistical analysis employed Chi-square and t-tests for group comparisons, Kaplan–Meier survival analysis, and multivariate logistic regression to identify predictors of MDR acquisition and mortality.
Results: Of 327 patients, 146 (44.6%) had MDRO infections. Gram-negative pathogens predominated: Acinetobacter baumannii (28.1%), Klebsiella pneumoniae (24.7%), and Pseudomonas aeruginosa (19.2%). Carbapenem resistance was alarmingly high in Acinetobacter (89.5%) and Klebsiella (76.3%). Compared to non-MDR infections, MDRO infections were associated with higher ICU mortality (47.9% vs. 28.2%, p<0.001), prolonged ICU stay (19.8 vs. 13.4 days, p<0.001), hospital stay (28.6 vs. 21.3 days, p<0.001), and ventilator days (14.7 vs. 9.2, p<0.001). Complications such as septic shock (39.7% vs. 21.5%), renal replacement therapy (18.5% vs. 9.9%), and secondary infections (25.3% vs. 12.7%) were significantly more common. Independent predictors of mortality among MDR patients included septic shock (OR 3.24, 95% CI 1.76–5.95), carbapenem resistance (OR 2.87, 95% CI 1.52–5.41), APACHE II >25 (OR 2.54, 95% CI 1.33–4.82), and renal replacement therapy requirement (OR 2.11, 95% CI 1.05–4.22).
Conclusions: Nearly half of ICU infections were caused by MDROs, doubling mortality and significantly increasing ICU burden. Septic shock and carbapenem resistance were the most ominous predictors of death. These findings underscore the urgent need for rapid diagnostics, tailored empiric therapy, strict infection-control bundles, and robust antimicrobial stewardship in critical care settings.
65. Functional and Radiological Outcomes of Open Reduction and Internal Fixation in Bimalleolar Ankle Fractures
Jay Patel, Baiju Patel, Meet Patel
Abstract
Background: Bimalleolar ankle fractures are one of the most frequently encountered lower limb injuries, often requiring surgical intervention for optimal recovery. Open Reduction and Internal Fixation (ORIF) is widely regarded as the standard treatment, aimed at restoring joint congruity and functional mobility.
Aim: To study the functional outcome of surgically treated bimalleolar ankle fractures and identify postoperative complications associated with ORIF.
Materials and Methods: This observational study was conducted on 80 patients with closed bimalleolar ankle fractures. All patients underwent ORIF, followed by standard postoperative rehabilitation. Clinical and radiological evaluations were performed, and outcomes were assessed using the Baird and Jackson criteria.
Results: Out of 80 patients, 28% showed excellent outcomes, 35% good, 25% fair, and 12% poor outcomes. The majority achieved full ankle stability, near-complete range of motion, and returned to work within 3–4 weeks. Pain and limited mobility were observed in a few cases, mostly linked to delayed intervention or comorbidities.
Conclusion: ORIF remains a reliable treatment modality for bimalleolar fractures, offering excellent functional results in most cases. Early diagnosis, proper surgical technique, and postoperative care are crucial for maximizing recovery and minimizing complications.
66. To Evaluate Diastolic Dysfunction and Left Ventricular Mass in Asymptomatic Normotensive Type 2 Diabetes Mellitus
Manish Kumar, Shiv Kumar, P. K. Aggarwal
Abstract
Background and Objectives: The world today is witnessing an epidemic of diabetes mellitus. Globally and nationally, diabetes and its complications have become the most important contemporary and challenging health problem. It is estimated that there will be more than 200 million diabetics in the world within the next 10 years. India has already become the diabetes capital of the world, with over 30 million affected patients; this is alarmingly just the tip of the iceberg, and the figure is expected to reach 69.9 million by 2025. The objectives of this study were to evaluate diastolic dysfunction and to study left ventricular mass in asymptomatic normotensive Type 2 diabetes mellitus.
Material and Methods: This cross-sectional observational study was conducted in the outpatient department of VMMC and Safdarjung Hospital. A total of 100 subjects, 50 diabetic normotensive patients and 50 non-diabetic healthy individuals satisfying the inclusion and exclusion criteria, were included in this study.
Conclusion: In our study, diastolic dysfunction and increased LVM were present in 56% and 48%, respectively, of asymptomatic normotensive type 2 DM subjects. Asymptomatic normotensive type 2 DM patients had a significantly higher prevalence of diastolic dysfunction and greater left ventricular mass compared with healthy subjects. LV diastolic dysfunction and left ventricular mass were correlated with age, gender, HbA1c, duration of diabetes, BMI, smoking, and dyslipidemia. On univariate analysis of risk factors for diastolic dysfunction, poor glycemic control (HbA1c), duration of diabetes, smoking, and dyslipidemia were significantly associated with diastolic dysfunction (p < 0.05), whereas age, gender, and BMI were not (p > 0.05).
67. Intrauterine Growth Restriction (IUGR): Approach and Monitoring
Anjali, Anisha Buddhapriya, Minu Sharan
Abstract
Background: Intrauterine Growth Restriction (IUGR), also referred to as Fetal Growth Restriction (FGR), is a pathological condition in which a fetus fails to achieve its genetically determined growth potential, typically defined as an estimated fetal weight below the 10th percentile for gestational age. IUGR is associated with increased perinatal morbidity and mortality, including prematurity, hypoxia, stillbirth, and long-term neurodevelopmental impairment. The approach to IUGR begins with early identification through routine antenatal surveillance, including serial fundal height measurements and ultrasound biometry. Once suspected, diagnosis is confirmed using ultrasonographic assessment of estimated fetal weight, abdominal circumference, amniotic fluid volume, and Doppler velocimetry of uterine and umbilical arteries. Differentiation between constitutionally small fetuses and true growth restriction is essential for appropriate management. Monitoring strategies focus on fetal well-being and timely intervention. Serial growth scans are typically performed every 2–4 weeks. Doppler studies of the umbilical artery, middle cerebral artery, and ductus venosus provide information on placental function and fetal adaptation. Non-stress testing (NST) and biophysical profile (BPP) are used to assess fetal condition. Management decisions depend on gestational age, severity of Doppler abnormalities, and maternal condition. The timing of delivery balances the risks of prematurity against intrauterine compromise. Early-onset IUGR (<32 weeks) requires intensive monitoring and often tertiary-level care, whereas late-onset IUGR (>32 weeks) is more common and may require delivery at term or near term. Multidisciplinary care and individualized management plans are essential to optimize perinatal outcomes. Early detection, structured monitoring, and evidence-based timing of delivery remain the cornerstone of improving outcomes in pregnancies complicated by IUGR.
Conclusion: Intrauterine Growth Restriction (IUGR) is a significant obstetric condition associated with increased perinatal morbidity and mortality. Early identification through routine antenatal screening and accurate diagnosis using ultrasound and Doppler studies are essential for distinguishing true growth restriction from constitutionally small fetuses. Close and structured monitoring of fetal growth and well-being—using serial biometry, Doppler velocimetry, and fetal surveillance tests—plays a crucial role in guiding management decisions. The timing of delivery should be individualized, balancing the risks of prematurity against the risk of intrauterine compromise. A multidisciplinary and evidence-based approach, combined with timely intervention, is key to improving short-term neonatal outcomes and reducing long-term complications in pregnancies complicated by IUGR.
68. Evaluation of Serum Vitamin D Levels in Patients with Chronic Rhinosinusitis
Anshul Singh, Mangesh Laxman Tekade, Anil Pandey, Sameer Srivastava, Anupam Tyagi
Abstract
Background: Chronic rhinosinusitis (CRS) is a persistent inflammatory disorder of the sinonasal mucosa with multifactorial etiology. Vitamin D has recognized immunomodulatory properties, and deficiency has been implicated in several chronic inflammatory conditions. The present study evaluated serum 25-hydroxyvitamin D [25(OH)D] levels in patients with CRS and examined their association with disease severity.
Material and Methods: A prospective case–control study was conducted in a tertiary care hospital including 70 patients with clinically and radiologically confirmed CRS and 70 age- and sex-matched healthy controls. Serum 25(OH)D levels were measured using chemiluminescent immunoassay. Vitamin D status was categorized as deficient (<20 ng/mL), insufficient (20–29 ng/mL), or sufficient (≥30 ng/mL). Symptom severity was assessed using the SNOT-22 score, and radiological severity was evaluated using the Lund–Mackay scoring system. Statistical analysis was performed using appropriate parametric and non-parametric tests, with p < 0.05 considered significant.
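The vitamin D status categorization used in the Methods is a simple threshold scheme; as an illustrative sketch only (the function name is ours, the cut-offs are those stated in the study), it can be expressed as:

```python
def vitamin_d_status(level_ng_ml):
    """Categorize serum 25(OH)D by the cut-offs used in this study."""
    if level_ng_ml < 20:
        return "deficient"       # < 20 ng/mL
    elif level_ng_ml < 30:
        return "insufficient"    # 20-29 ng/mL
    else:
        return "sufficient"      # >= 30 ng/mL

# Example: the mean CRS-patient level reported below (17.8 ng/mL) is deficient,
# while the mean control level (26.4 ng/mL) is insufficient.
print(vitamin_d_status(17.8), vitamin_d_status(26.4))
```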
Results: Baseline demographic characteristics were comparable between groups (p > 0.05). Mean serum 25(OH)D levels were significantly lower in CRS patients than in controls (17.8 ± 7.2 ng/mL vs. 26.4 ± 8.1 ng/mL; p < 0.01). Vitamin D deficiency was observed in 65.7% of CRS patients compared to 25.7% of controls (p < 0.01). Parathyroid hormone levels were significantly higher in CRS patients (p < 0.01), while calcium, phosphorus, and alkaline phosphatase levels showed no significant differences. Serum vitamin D levels demonstrated a moderate negative correlation with SNOT-22 scores (r = –0.48; p < 0.01) and Lund–Mackay scores (r = –0.41; p < 0.01). Patients with nasal polyps had significantly lower vitamin D levels than those without polyps (14.9 ± 6.1 ng/mL vs. 20.2 ± 7.4 ng/mL; p < 0.01).
Conclusion: Serum vitamin D levels are significantly reduced in patients with chronic rhinosinusitis and are inversely associated with clinical and radiological severity. These findings suggest a potential role of vitamin D deficiency in the disease spectrum of CRS.
69. Impact of Early SGLT2 Inhibitor Initiation on Mortality and Rehospitalization in Patients with Acute Decompensated Heart Failure
Devarshikumar S. Patel, Patel Jay Dineshbhai, Pradeep Dayanand M.D
Abstract
Background: Sodium-glucose cotransporter 2 (SGLT2) inhibitors have demonstrated substantial cardiovascular benefits in chronic heart failure. However, the optimal timing of initiation during acute decompensated heart failure (ADHF) hospitalization and its impact on short-term and intermediate-term clinical outcomes remain insufficiently characterized. This study aimed to evaluate the effect of early in-hospital SGLT2 inhibitor initiation on all-cause mortality and heart failure rehospitalization in patients admitted with ADHF.
Methods: A retrospective cohort study was conducted across two tertiary cardiac centers. A total of 742 patients hospitalized with ADHF were included: 318 who received SGLT2 inhibitors within 48 hours of admission (early initiation group) and 424 who received standard heart failure therapy without SGLT2 inhibitors during hospitalization (standard care group). The primary composite endpoint was all-cause mortality or first heart failure rehospitalization at 180 days. Secondary endpoints included individual components of the composite, in-hospital worsening heart failure events, change in N-terminal pro-B-type natriuretic peptide (NT-proBNP), length of hospital stay, and renal safety outcomes.
Results: The primary composite endpoint occurred in 22.3% of the early initiation group versus 33.5% of the standard care group (hazard ratio [HR] 0.61, 95% CI 0.47–0.79, p < 0.001). All-cause mortality at 180 days was 8.2% versus 13.4% (HR 0.58, 95% CI 0.38–0.89, p = 0.012). Heart failure rehospitalization occurred in 16.4% versus 24.3% (HR 0.63, 95% CI 0.47–0.85, p = 0.002). The early initiation group demonstrated significantly greater NT-proBNP reduction at discharge (−48.2 ± 22.6% vs. −34.7 ± 24.1%, p < 0.001) and shorter median length of stay (6.3 ± 2.8 vs. 7.9 ± 3.4 days, p < 0.001). No significant differences in acute kidney injury, diabetic ketoacidosis, or urinary tract infections were observed between groups.
Conclusion: Early in-hospital initiation of SGLT2 inhibitors within 48 hours of admission for ADHF was associated with significantly reduced all-cause mortality, lower heart failure rehospitalization rates, and greater neurohormonal decongestion without increased adverse events. These findings support the paradigm of prompt SGLT2 inhibitor initiation during acute heart failure hospitalization.
70. A Study on Impact of Borderline Oligohydramnios on Fetomaternal Outcomes in Term Pregnancies with Cerebroplacental Ratio >1
C.P. Padmini, Ambati Uma, Megavath Rajitha
Abstract
Background: Oligohydramnios is defined as decreased amniotic fluid volume relative to gestational age. Semi-quantitatively, it is described using the Amniotic Fluid Index (AFI), which is calculated by adding the depth in centimeters of the largest vertical pocket in each of four equal uterine quadrants. An AFI of less than or equal to 5 cm is defined as oligohydramnios, and borderline oligohydramnios (BO) is defined as an AFI of 5.1–8 cm.
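The four-quadrant AFI calculation and the categories it yields can be sketched as follows (a minimal illustration using the definitions given in this abstract; the function name is ours):

```python
def classify_afi(pockets_cm):
    """Compute the Amniotic Fluid Index and classify the result.

    pockets_cm: depths (cm) of the largest vertical pocket in each of the
    four equal uterine quadrants; the AFI is their sum.
    """
    if len(pockets_cm) != 4:
        raise ValueError("AFI requires one pocket depth per quadrant")
    afi = sum(pockets_cm)
    if afi <= 5.0:
        status = "oligohydramnios"             # AFI <= 5 cm
    elif afi <= 8.0:
        status = "borderline oligohydramnios"  # AFI 5.1-8 cm
    else:
        status = "normal or increased"
    return afi, status

# Example: pockets of 1.5, 2.0, 1.8, and 1.7 cm give an AFI of 7.0 cm,
# which falls in the borderline range studied here.
print(classify_afi([1.5, 2.0, 1.8, 1.7]))
```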
Objectives: To evaluate fetomaternal outcomes in term pregnancies with borderline oligohydramnios (Amniotic Fluid Index [AFI] 5–8 cm) and a cerebroplacental ratio (CPR) >1.
Methods: This prospective analytical study was conducted at the Department of Obstetrics and Gynecology, Rajiv Gandhi Institute of Medical Sciences, Adilabad, involving 100 term (37–40 weeks) singleton pregnant women with AFI 5–8 cm and CPR >1, confirmed by Doppler ultrasound. Data were analyzed to assess obstetric and perinatal outcomes.
Results: The mean maternal age was 25.01 years (SD 3.98), with 48% aged 21–25 years. Gestational age distribution was 35% at 37 weeks, 29% at 38 weeks, 24% at 39 weeks, and 12% at 40 weeks. Obstetric outcomes included 38% NVD, 27% induced NVD, 20% assisted vaginal delivery, and 15% LSCS. Induction was used in 49% of cases (Foley’s catheter and prostaglandin E1). Perinatal complications included meconium-stained liquor (33%), low Apgar scores (23%), LBW (20%), fetal distress (19%), RDS (18%), NICU admissions (9%), and perinatal mortality (2%). Significant associations were found between gestational age and meconium-stained liquor (p=0.004) and RDS (p=0.02). No significant differences in perinatal outcomes were observed between cesarean and non-cesarean deliveries in mothers with CPR >1, suggesting safe vaginal delivery in this subgroup.
Conclusion: Term pregnancies with borderline oligohydramnios and CPR >1 can achieve favorable fetomaternal outcomes with a high rate of vaginal deliveries and low severe perinatal complications. CPR >1 is a valuable marker for identifying cases suitable for vaginal delivery, reducing unnecessary cesarean sections while ensuring fetal well-being.
71. Postoperative Complications Following Total Hip Arthroplasty: A Retrospective Analysis
Rajkumar Ashvinbhai Amrutiya, Parth Bharatkumar Patel, Dev S. Parikh, Ujwala Bhanarkar
Abstract
Background: Total hip arthroplasty (THA) is among the most frequently performed orthopedic procedures worldwide, providing substantial improvements in pain relief and functional restoration for patients with end-stage hip joint disease. Despite significant advances in surgical techniques, implant technology, and perioperative care protocols, postoperative complications remain a considerable clinical concern affecting patient outcomes, healthcare utilization, and long-term prosthetic survivorship. Comprehensive characterization of complication patterns and their associated risk factors is essential for optimizing patient selection and perioperative management strategies.
Methods: This retrospective cohort study analyzed medical records of 712 consecutive patients who underwent primary THA at a tertiary orthopedic center. Demographic variables, surgical parameters, comorbidity profiles, and postoperative complications occurring within 90 days and one year were systematically documented. Univariate and multivariable logistic regression analyses were performed to identify independent risk factors associated with overall and specific complications.
Results: The overall 90-day complication rate was 14.3% (n = 102), while the one-year complication rate was 18.8% (n = 134). The most frequent complications included periprosthetic joint infection (PJI; 2.9%), venous thromboembolism (VTE; 3.4%), dislocation (3.1%), and periprosthetic fracture (1.8%). Independent risk factors for overall complications included age ≥75 years (OR: 2.14; 95% CI: 1.38–3.32; p = 0.001), BMI ≥35 kg/m² (OR: 2.47; 95% CI: 1.51–4.04; p < 0.001), diabetes mellitus (OR: 1.89; 95% CI: 1.19–3.01; p = 0.007), ASA classification ≥III (OR: 2.08; 95% CI: 1.32–3.28; p = 0.002), and operative time exceeding 120 minutes (OR: 1.76; 95% CI: 1.12–2.78; p = 0.015).
Conclusion: Postoperative complications following primary THA occur in a clinically significant proportion of patients. Modifiable risk factors including obesity and prolonged operative time, alongside non-modifiable factors such as advanced age and comorbidity burden, significantly predict adverse outcomes. Targeted perioperative optimization strategies may reduce complication rates and improve overall surgical outcomes.
72. Incidence of Carcinoma Gallbladder in Patients Undergoing Cholecystectomy and its Correlation with Clinicopathological Profile
Satyam Jain, Sarita Das
Abstract
Background: Gallstone disease, common in India, can lead to carcinoma of the gallbladder due to chronic mucosal irritation. The rate of incidental carcinoma found after cholecystectomy differs by region, highlighting the importance of early detection for improved prognosis. The present study aimed to determine the incidence of carcinoma gallbladder in patients undergoing cholecystectomy and its correlation with the clinicopathological profile of patients.
Methodology: This prospective observational study was conducted in the Department of General Surgery at Pt. JNM Medical College from January 2016 to August 2022, including 149 patients with symptomatic, ultrasonography-confirmed gallstone disease undergoing elective cholecystectomy. Data on clinical, demographic, and histopathological parameters were analyzed to determine the incidence and associations of incidental gallbladder carcinoma.
Results: The rate of incidental carcinoma of the gallbladder in our setting was found to be 0.67%. Out of 149 patients, 1 (0.67%) had a gallbladder polyp and 148 (99.33%) did not. Gallbladder wall thickness was 2–3 mm in 91 patients (61.07%) and 3–5 mm in 58 patients (38.93%). Four patients (2.7%) had tenderness per abdomen, 4 (2.7%) had guarding, 2 (1.3%) had rigidity, 4 (2.7%) had rebound tenderness, 4 (2.7%) had a positive Murphy's sign, and 3 (2.0%) had abdominal distension.
Conclusion: The most common histopathological finding in patients undergoing cholecystectomy was chronic cholecystitis with cholelithiasis. The one case diagnosed as carcinoma of the gallbladder was an adenocarcinoma on histopathology. Although our data may not be sufficient to correlate the various variables, given that early diagnosis of carcinoma gallbladder can drastically change patient outcomes, histopathological analysis of all gallbladder specimens from elective as well as emergency cholecystectomy should be mandatory.
73. Acid–Base Disorders in Critically Ill Patients Admitted to a Tertiary Care Intensive Care Unit: An Observational Study
Saraswati Prajapati, Hema Deep Bhojani, Harsh Patel, Priyanka Patel
Abstract
Background: Disturbances in acid–base balance are frequently encountered in critically ill patients and often reflect the severity of the underlying illness. These abnormalities have been consistently associated with increased morbidity and mortality in intensive care units (ICUs).
Objectives: To evaluate the pattern of acid–base disorders in ICU patients and to determine their association with mortality.
Methods: This prospective observational study was conducted over a 12-month period in the medical ICU of a tertiary care teaching hospital in western India. One hundred adult patients admitted for more than 12 hours were included. Acid–base disorders were identified using arterial blood gas (ABG) analysis following a structured five-step interpretative approach. Acid–base and biochemical parameters were compared between survivors and non-survivors.
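The abstract does not spell out its five-step interpretative approach, so as a hedged illustration only, the first steps of a commonly taught ABG scheme (identifying the primary disorder from pH, PaCO₂, and HCO₃⁻, then checking expected respiratory compensation with Winter's formula) might be sketched as follows; the function names and exact step order are our assumptions, not the study's protocol:

```python
def primary_acid_base_disorder(ph, paco2_mmhg, hco3_meq_l):
    """Simplified first pass at identifying the primary acid-base disorder.

    Reference ranges assumed: pH 7.35-7.45, PaCO2 35-45 mmHg, HCO3 22-26 mEq/L.
    """
    if ph < 7.35:  # acidemia
        if paco2_mmhg > 45:
            return "respiratory acidosis"
        if hco3_meq_l < 22:
            return "metabolic acidosis"
        return "indeterminate"
    if ph > 7.45:  # alkalemia
        if paco2_mmhg < 35:
            return "respiratory alkalosis"
        if hco3_meq_l > 26:
            return "metabolic alkalosis"
        return "indeterminate"
    return "normal pH (possible compensated or mixed disorder)"

def winters_expected_paco2(hco3_meq_l):
    """Winter's formula: expected PaCO2 band for a pure metabolic acidosis.

    A measured PaCO2 outside this +/- 2 mmHg band suggests a coexisting
    respiratory disorder, i.e. a mixed picture.
    """
    mid = 1.5 * hco3_meq_l + 8
    return (mid - 2, mid + 2)

# Example: pH 7.25, PaCO2 26 mmHg, HCO3 11 mEq/L -> metabolic acidosis;
# the expected PaCO2 band is 22.5-26.5 mmHg, so compensation is appropriate.
print(primary_acid_base_disorder(7.25, 26, 11))
print(winters_expected_paco2(11))
```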
Results: The mean age of the study population was 52.6 ± 13.36 years, with a male predominance (63%). Mixed acid–base disorders were more frequent than isolated abnormalities (61% vs 39%). Metabolic acidosis (18%) was the most common isolated disorder, while metabolic alkalosis combined with respiratory alkalosis (23%) was the most frequent mixed disorder. ARDS (28%) and sepsis (18%) were the leading primary diagnoses. Overall mortality was 35%. Non-survivors had significantly lower pH, bicarbonate, sodium, and oxygen saturation, along with higher PaCO₂, anion gap, and serum creatinine (p < 0.05).
Conclusions: Acid–base disorders are highly prevalent among critically ill patients, with mixed abnormalities predominating. Significant derangements in acid–base parameters are associated with increased mortality, emphasizing the importance of early recognition and timely correction.
74. A Cross-Sectional Study of Interstitial Lung Disease Pattern in Rheumatoid Arthritis Patients
Nensi Singh, Tanushree Kothari, Gaurav Sahu, Shahzad Hussain Arastu, Darshi Rastogi, Sharad Singour
Abstract
Background: Interstitial lung diseases (ILD) represent a significant complication in rheumatoid arthritis (RA), contributing to increased morbidity and mortality. Despite advances in RA management, the prevalence and clinical profile of ILD in Indian RA populations remain understudied.
Objectives: To determine the prevalence of ILD in RA patients, describe demographic and clinical characteristics, and identify potential risk factors.
Material and Methods: This cross-sectional descriptive study was conducted at the Department of Pulmonary Medicine, LN Medical College, Bhopal, by enrolling a total of 84 RA patients over a period of 6 months. Data collection involved detailed history, physical examination, high-resolution computed tomography (HRCT) of the chest, pulmonary function tests (PFTs), and serological tests. ILD was classified based on HRCT patterns (e.g., usual interstitial pneumonia [UIP], nonspecific interstitial pneumonia [NSIP]).
Results: Among the 84 patients (mean age 52.3 ± 11.4 years; 62% female), ILD was detected in 28 (33.3%). The most common HRCT pattern was UIP (46.4%), followed by NSIP (32.1%) and organizing pneumonia (21.4%). Patients with ILD had longer RA duration (mean 8.7 ± 4.2 years vs. 5.1 ± 3.6 years in non-ILD; p<0.05) and higher positivity for RF (78.6% vs. 54.3%) and anti-CCP (67.9% vs. 48.2%). Smoking history was noted in 42.9% of ILD cases. PFTs showed restrictive patterns in 75% of ILD patients, with reduced forced vital capacity (mean 68.4% predicted).
Conclusion: ILD affects approximately one-third of RA patients in this cohort, with UIP as the predominant subtype. Longer disease duration and positive serology emerge as key associations. Early screening with HRCT and PFTs is recommended for high-risk RA patients to improve outcomes. Larger prospective studies are warranted to validate these findings.
75. Clinical Significance of Combining DNA Fragmentation Index with Routine Semen Analysis in Male Infertility Workup: A Multicenter Retrospective Study
Vivek Tripathi, Avijit Guha, Deepa Dave, Gaurav Nandi
Abstract
Background: Routine semen analysis is the standard diagnostic tool for evaluating male infertility, but it may not detect functional sperm defects such as DNA damage. Sperm DNA fragmentation index (DFI) has emerged as a clinically relevant marker associated with reduced fertility outcomes.
Objective: To assess the clinical significance of integrating DFI testing with routine semen analysis in infertile males.
Methods: This multicenter retrospective study was conducted across infertility laboratories in Chhattisgarh, Madhya Pradesh, and Maharashtra over one year. A total of 220 infertile men were included. Semen analysis was performed according to WHO guidelines, and DFI was assessed using standardized sperm DNA fragmentation testing. Participants were categorized into low DFI (<25%) and high DFI (≥25%) groups. Statistical comparison of semen parameters, correlation analysis, multivariate logistic regression, and ROC curve analysis were performed.
Results: High DFI was observed in 38.6% of patients. Men with high DFI had significantly lower sperm concentration (36.4 ± 17.8 vs 48.2 ± 19.5 million/mL, p = 0.001), progressive motility (29.8 ± 11.6 vs 39.6 ± 12.8%, p < 0.001), and normal morphology (3.9 ± 1.9 vs 5.8 ± 2.1%, p < 0.001). DFI showed significant negative correlations with progressive motility (r = -0.52, p < 0.001) and morphology (r = -0.44, p < 0.001). Combined DFI + semen parameters improved diagnostic performance (AUC 0.84) compared to semen analysis alone (AUC 0.72).
Conclusion: DFI adds significant diagnostic value to routine semen analysis and improves clinical risk stratification in male infertility evaluation.
76. Assessment of Hearing Outcome after Type-I Tympanoplasty – A Prospective Study
Sweta Kumari, Md. Ozair, Manoj Kumar
Abstract
Background: Chronic suppurative otitis media is a longstanding infection of the middle ear cleft characterized by persistent or recurrent aural discharge, deafness and perforation of tympanic membrane. Type I tympanoplasty is a surgical procedure that repairs tympanic membrane perforation.
Methods: A total of 62 patients aged 15–45 years with chronic otitis media, mucosal type, underwent type I tympanoplasty in the ENT Department, DMCH, Laheriasarai, Darbhanga, from February 2025 to August 2025. Pure tone audiometry was performed before surgery and at one month and three months after surgery, and hearing improvement was assessed in each case.
Results: In this study, graft uptake was highest (95%) in posterior perforations, 75% in anterior perforations and lowest (63.64%) in combined perforations. The mean hearing improvement one month following surgery was 7.16 dB and three months following surgery, it was 7.78 dB.
Conclusions: Chronic otitis media is a treatable cause of hearing loss. Type-I tympanoplasty is a safe and effective method to remove the disease of the middle ear and reconstruct the tympanic membrane perforation.
77. Evaluating Abdominal Ultrasound in Pediatric Acute Abdominal Pain Diagnosis
Poonam Tanaji Dabhade, Sudhakar Pandya, Dhanaji Sadhurao Jadhav
Abstract
Background: Acute abdominal pain in children presents significant diagnostic challenges due to diverse etiologies and overlapping clinical features.
Aim: To evaluate the diagnostic value and limitations of abdominal ultrasound in children presenting with acute abdominal pain and assess appropriate clinical indications for its use.
Material and Methods: A prospective observational study was conducted on 120 pediatric patients presenting with acute abdominal pain. All patients underwent abdominal ultrasonography, and findings were correlated with final clinical diagnoses.
Results: Ultrasound accurately identified common conditions such as mesenteric lymphadenitis and acute appendicitis, while a subset of patients demonstrated normal scans, emphasizing the importance of clinical correlation.
Conclusion: Abdominal ultrasound is a reliable first-line imaging modality in pediatric acute abdomen when used judiciously and interpreted alongside clinical findings.
78. Evaluation of Platelet Indices in Distinguishing Types of Thrombocytopenia
Archana Menon, Hemalatha A.
Abstract
Platelets originate from cytoplasmic fragmentation of mature megakaryocytes and are essential for maintaining hemostatic balance, regulating the inflammatory response, the immune response, and wound healing. A platelet count of less than 150 × 10⁹/L is indicative of thrombocytopenia, which can result from either impaired platelet production (hypoproliferative) or increased platelet destruction (hyperdestructive). Bone marrow examination is the definitive method for distinguishing between hypoproliferative and hyperdestructive thrombocytopenia, but its invasive nature and associated bleeding risk make it a less desirable option for patients. In contrast, evaluating platelet indices is a less invasive, simpler, and effective method for distinguishing types of thrombocytopenia. The platelet indices that can be used are Plateletcrit (PCT), Mean Platelet Volume (MPV), Platelet Large Cell Ratio (P-LCR), and Platelet Distribution Width (PDW). This study sought to evaluate the value of platelet indices in differentiating between various types of thrombocytopenia, thereby aiding diagnosis and management, and to determine platelet indices across the various causes of thrombocytopenia. A prospective study of 62 patients with thrombocytopenia was conducted over a period of six months in the department of pathology to determine the type of thrombocytopenia based on platelet indices. Data for all 62 cases were entered into a Microsoft Excel spreadsheet and analyzed using SPSS Statistics version 22. Quantitative data were summarized as mean ± standard deviation or median with range, depending on distribution. Qualitative variables were assessed using the chi-square test, with statistical significance set at p < 0.05. Thirty-one cases were classified as hyperdestructive thrombocytopenia and 31 as hypoproliferative thrombocytopenia; there were 35 males and 27 females. Among those with hyperdestructive thrombocytopenia, 61.3% were males and 38.7% were females.
On comparing platelet parameters between the two groups, the mean platelet count in the hyperdestructive group was 90,354.839 ± 35,465.052/cumm, whereas in the hypoproliferative group it was slightly higher at 95,451.613 ± 35,647.196/cumm. This difference was not statistically significant (p = 0.4682). A statistically significant difference was observed in the platelet large cell ratio (P-LCR), which was higher in hyperdestructive thrombocytopenia (32.616 ± 9.008) compared with hypoproliferative thrombocytopenia (28.139 ± 7.832), with a p-value of 0.03. The mean platelet volume (MPV) was also significantly higher in hyperdestructive thrombocytopenia, at 11.17 fL compared with 10.48 fL in hypoproliferative thrombocytopenia (p = 0.009).
MPV and P-LCR are markedly elevated in hyperdestructive thrombocytopenia and are excellent markers for differentiating between the types of thrombocytopenia. Their use in routine examination can aid early diagnosis, but these findings need confirmation in large-scale studies.
79. An Observational Study to Evaluate Anterior Approach Sciatic Nerve Block in Combination with Femoral Nerve or Saphenous Block for Below Knee Surgeries
Dhruva Savani, Dhavalkumar C. Patel
Abstract
Introduction: Sciatic nerve block is a useful technique for unilateral lower limb surgeries, particularly in patients who are not suitable candidates for central neuraxial blocks.
Methodology: Forty patients aged 20–70 years of either sex, scheduled for below-knee lower limb surgeries on an elective or emergency basis under sciatic nerve block by the anterior approach in combination with femoral or saphenous nerve block, were included in this trial. All blocks were performed using a landmark- and peripheral nerve stimulator (PNS)-guided technique. Nerve block characteristics, success rate, patient comfort, complications, etc., were observed.
Results: The majority of patients (77.5%) required a single attempt. Plantar flexion was observed with the nerve locator in only 87.5% of patients. In 5% of patients, the drug was injected when only paresthesia was elicited. Onset of sensory block was relatively fast (7.95 ± 0.9 min), but motor block took longer to establish (15.23 ± 0.9 min). The primary block was achieved in 21.5 ± 1.34 min, and the total procedure time with the supplemental block was 5.76 ± 0.56 min. Readiness time for surgery was 31.4 ± 1.1 min. The duration of motor block (3.86 ± 0.28 h) was shorter than that of sensory block (8.27 ± 0.5 h), and the duration of analgesia was considerably longer (9.46 ± 0.52 h). The incidence of discomfort associated with performance of the block was low; regarding the block technique, 78.9% of patients said it hardly hurt. Only 2.6% of patients required supplementation in the form of IV sedation. The 5% of patients with failed blocks were given spinal anaesthesia 35 minutes after the block.
Conclusion: Sciatic nerve block by the anterior approach lessens the total procedure time for combination blocks. Foot twitches should be considered the end point, and hamstring contractions should not be accepted as a motor response. In a difficult block, internal rotation of the leg may be tried to expose the nerve under the femur. Sciatic block takes a fairly long time (25–30 minutes) to achieve full motor and sensory block, and provides long-lasting postoperative analgesia, better tourniquet tolerance, and stable hemodynamics. The anterior sciatic nerve block is therefore a safe and effective technique and deserves to be used more widely for unilateral lower limb surgeries and post-operative analgesia.
80. Fibrinogen to Albumin Ratio and Severity of Coronary Artery Disease in Diabetes Mellitus Patients
Shruti Kolli, Sandeep Bijapur, Rajiv Konin
Abstract
Introduction: Coronary artery disease (CAD) is one of the commonest causes of mortality and morbidity. The Fibrinogen/Albumin Ratio (FAR) is a newer marker of inflammation that has been shown to be a predictor of short-term prognosis in patients with acute myocardial infarction. Utility of FAR in predicting angiographic severity of CAD and clinical outcomes is not yet clear in Indian patients.
Objective: To study the role of the fibrinogen-to-albumin ratio (FAR) as a predictor of the angiographic severity of coronary artery disease and of short-term prognosis in patients undergoing coronary angiography.
Material and Methods: The present study was conducted in the Department of Cardiology, Sri Jayadeva Institute of Cardiovascular Sciences. A detailed history, physical examination, and all routine investigations, along with serum albumin, fibrinogen, ECG, and 2-D echo, were done for all patients. Samples were drawn at admission, before angiography. All patients underwent clinically indicated invasive coronary angiography. The SYNTAX score was calculated using an online SYNTAX score calculator.
Results: Of the 237 patients included in the final analysis, the majority (81.4%) were male. The mean age was 57 years. The mean fibrinogen level was 397.97 mg/dl, the mean serum albumin level was 4.05 g/dl, and the mean FAR was 101.07 mg/g; patients with FAR below this value were classified as low FAR and those above it as high FAR. The mean CAG SYNTAX score was 22.02. The majority of patients (58.6%) had acute coronary syndrome (ACS), while the remainder had stable coronary artery disease. The two FAR groups had comparable proportions of patients across the two SYNTAX score groups (low SS < 23, high SS > 23). The p-value for the correlation between FAR and SYNTAX score was not significant (p = 0.941).
Conclusion: FAR was not found to be associated with CAD severity among Indian patients with stable CAD and ACS in the present study, and no correlation was found between FAR and short-term prognosis.
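As a unit check on the ratio reported above, dividing fibrinogen in mg/dl by albumin in g/dl yields FAR in mg/g, since the dl cancels. A minimal sketch (the function name is illustrative, not from the study):

```python
def fibrinogen_albumin_ratio(fibrinogen_mg_dl: float, albumin_g_dl: float) -> float:
    """FAR in mg/g: (mg/dl) / (g/dl) leaves mg per g, since the dl cancels."""
    return fibrinogen_mg_dl / albumin_g_dl

# Ratio of the reported group means (397.97 mg/dl fibrinogen, 4.05 g/dl albumin):
print(round(fibrinogen_albumin_ratio(397.97, 4.05), 2))  # 98.26
```

Note that the ratio of the reported means (≈98.3 mg/g) differs from the reported mean FAR (101.07 mg/g), as expected: the mean of per-patient ratios is not the same as the ratio of the group means.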
81. Impact of Oxygen Therapy on Inflammatory Markers and Lung Function in Interstitial Lung Disease: A Comparative Study
Nishant Srivastava, Vishwas Gupta, Sourabh Jain, Shiv Kumar Kaushal, Lokendra Dave, Ratan Vaish
Abstract
Background: Interstitial lung disease (ILD) is characterized by chronic inflammation and fibrosis of the lung parenchyma, often leading to progressive hypoxemia requiring long-term oxygen therapy (LTOT). While inflammatory biomarkers are known to reflect disease activity, the impact of LTOT on these markers remains incompletely understood.
Aim and Objectives: To compare serial changes in inflammatory and biochemical markers—including high-sensitivity C-reactive protein (hs-CRP), erythrocyte sedimentation rate (ESR), D-dimer, interleukin-6 (IL-6), procalcitonin, and lactate dehydrogenase (LDH)—in ILD patients receiving LTOT versus those not on oxygen therapy.
Materials and Methods: A prospective, observational study was conducted on 112 ILD patients divided into two groups: 56 patients on LTOT and 56 without oxygen therapy. Blood samples were collected at baseline and after one month to measure hs-CRP, ESR, D-dimer, IL-6, procalcitonin, and LDH. Data were analyzed using Shapiro–Wilk test for normality, followed by appropriate parametric (independent/paired t-tests) or non-parametric (Mann–Whitney U, Wilcoxon signed-rank) tests. Correlations between post-therapy inflammatory markers, lung function (FVC), and oxygen requirement were assessed using Spearman’s rank correlation.
Results: At baseline, both groups had elevated inflammatory markers, with significantly higher mean LDH and D-dimer in the LTOT group. After one month of oxygen therapy, significant reductions were observed in hs-CRP, ESR, IL-6, and D-dimer (p < 0.05), while LDH and procalcitonin also showed mild but consistent declines. Correlation analysis revealed a moderate positive association between post-therapy LDH and oxygen flow rate (r = 0.62, p < 0.01), while hs-CRP and IL-6 showed minimal correlation with FVC.
Conclusion: Long-term oxygen therapy in ILD patients led to a measurable reduction in systemic inflammation, as evidenced by decreased hs-CRP, IL-6, ESR, and D-dimer levels. LDH, though less specific, showed a moderate relationship with oxygen requirement, suggesting its potential role as an adjunctive marker for tissue injury and hypoxia. Monitoring LDH alongside traditional inflammatory markers may enhance clinical assessment of disease activity and therapeutic response in ILD patients on LTOT.
82. Role of Rapid Microbiological Diagnostics in Early Surgical Decision-Making for Necrotizing Soft Tissue Infections
Gautamkumar Bhikhalal Suthar, Dhirajkumar Muljibhai Makwana, Vinyl Kumar Pahuja
Abstract
Background: Necrotizing soft tissue infections (NSTIs) are rapidly progressive, life-threatening conditions requiring emergent surgical debridement. Early pathogen identification may optimize antimicrobial therapy and guide surgical management, yet the clinical impact of rapid microbiological diagnostics on surgical decision-making remains incompletely characterized.
Methods: A prospective cohort study was conducted over 36 months, enrolling 142 patients with confirmed NSTIs. Rapid diagnostics including Gram stain, direct molecular testing (multiplex PCR), and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) were compared to conventional culture. Time to pathogen identification, antimicrobial modification rates, surgical decision impact, and clinical outcomes were evaluated.
Results: Rapid diagnostics provided actionable results in a median of 2.8 hours compared to 48.6 hours for conventional culture (p<0.001). Gram stain sensitivity was 84.5%, while multiplex PCR demonstrated 94.4% concordance with culture. Rapid diagnostics prompted antimicrobial modification in 43.7% of cases, with escalation in 28.2% and de-escalation in 15.5%. Surgical planning was influenced in 31.0% of patients, including decisions regarding debridement extent and timing of re-exploration. Patients receiving rapid diagnostic-guided therapy demonstrated lower mortality (14.1% vs. 26.3%, p=0.048) and reduced amputation rates (8.5% vs. 18.4%, p=0.042) compared to conventional management.
Conclusion: Rapid microbiological diagnostics significantly accelerate pathogen identification in necrotizing soft tissue infections, enabling earlier antimicrobial optimization and influencing surgical decision-making with associated improvements in clinical outcomes.
83. The Efficacy and Safety of Primary Posterior Curvilinear Capsulorhexis in Adult Patients with Cataract: A Retrospective Study
M.V.D.L. Sathyanarayana, Sajja Aruna Kumari, P. Mallikarjun Raju, M. Kiranmai, M.S. Raju, B. Architha
Abstract
Background: Despite advances in surgical techniques, posterior capsular opacification (PCO) remains a predominant long-term complication of cataract surgery. It results from residual lens epithelial cells (LECs) that proliferate, migrate, and undergo metaplasia within the capsular bag, leading to visual axis obscuration. Clinically, PCO causes reduced visual acuity, diminished contrast sensitivity, and may hinder posterior segment evaluation. The reported incidence of PCO ranges from 20–50% within five years after cataract surgery in adults and approaches nearly 100% in paediatric cases. Although primary posterior continuous curvilinear capsulorhexis (PPCC) is routinely performed in paediatric cataract surgery, its use in adults is limited. PPCC involves the removal of the central posterior capsule at the time of surgery to prevent LEC migration toward the visual axis.
Aim: To evaluate the long-term safety and efficacy of phacoemulsification and manual small-incision cataract surgery combined with PPCC and posterior chamber intraocular lens (PCIOL) implantation in adult patients.
Materials and Methods: This interventional case series included 50 adult patients who underwent cataract surgery with PPCC at Government General Hospital, Eluru, between March and July 2025. Only patients with age-related cataract undergoing PPCC were included. Preoperative and postoperative visual outcomes and complications were assessed over a structured follow-up period.
Results: The study population comprised 27 females and 23 males. Postoperative corrected distance visual acuity (CDVA) showed significant improvement and remained stable throughout follow-up. No sight-threatening complications such as retinal detachment or endophthalmitis were observed.
Conclusion: Cataract surgery combined with PPCC in adults appears to be a safe and effective strategy to reduce PCO, minimize the need for Nd:YAG capsulotomy, and maintain long-term visual outcomes. Further evidence from large-scale randomized controlled trials with extended follow-up is necessary to substantiate these findings.
84. Transcutaneous Bilirubin Screening as a Predictor of Readmission for Neonatal Jaundice
Yash Jashvantbhai Patel, Roshell Brian Gonsalves, Jatin Pravin Patel
Abstract
Background: Neonatal jaundice remains one of the most common reasons for hospital readmission within the first two weeks of life. Early identification of neonates at risk of significant hyperbilirubinemia prior to discharge could reduce preventable readmissions. Transcutaneous bilirubinometry (TcB) offers a noninvasive, rapid, and cost-effective screening modality, yet its predictive value for readmission has not been thoroughly characterized.
Methods: This prospective cohort study enrolled 486 healthy term and late-preterm neonates born at a tertiary care hospital. Predischarge TcB measurements were obtained using the Dräger Jaundice Meter JM-105 within 6 hours prior to discharge. Neonates were followed for 14 days to ascertain readmission for phototherapy-requiring jaundice. Receiver operating characteristic (ROC) analysis was performed to evaluate the predictive accuracy of predischarge TcB values.
Results: Of the 486 neonates, 52 (10.7%) were readmitted for jaundice requiring phototherapy. The mean predischarge TcB was significantly higher in readmitted neonates (12.8 ± 2.1 mg/dL) compared to non-readmitted neonates (8.4 ± 2.6 mg/dL; p < 0.001). ROC analysis yielded an area under the curve (AUC) of 0.87 (95% CI: 0.83–0.91). A TcB cutoff of ≥10.5 mg/dL demonstrated a sensitivity of 84.6%, specificity of 78.3%, positive predictive value of 31.9%, and negative predictive value of 97.7%.
Conclusion: Predischarge TcB measurement is a reliable noninvasive predictor of readmission for neonatal jaundice. Implementation of TcB-based screening protocols could facilitate targeted follow-up and reduce avoidable readmissions.
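The predictive-value arithmetic in the abstract above can be sketched as follows. The 2×2 counts (TP = 44, FP = 94, FN = 8, TN = 340) are reconstructed here from the reported cohort sizes and percentages, not taken directly from the paper:

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict[str, float]:
    """Standard screening metrics from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),  # readmitted neonates flagged by the cutoff
        "specificity": tn / (tn + fp),  # non-readmitted neonates below the cutoff
        "ppv": tp / (tp + fp),          # flagged neonates actually readmitted
        "npv": tn / (tn + fn),          # below-cutoff neonates not readmitted
    }

# Counts reconstructed from 52 readmitted / 434 not readmitted and the percentages:
metrics = diagnostic_metrics(tp=44, fp=94, fn=8, tn=340)
for name, value in metrics.items():
    print(f"{name}: {100 * value:.1f}%")
```

With these counts the sketch reproduces the abstract's 84.6% / 78.3% / 31.9% / 97.7%, and makes clear why the high NPV is the clinically useful property: a TcB below 10.5 mg/dL nearly rules out readmission for jaundice.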
85. Cost-Effectiveness Analysis of Empirical Versus Culture-Guided Antibiotic Therapy in a Tertiary Care Hospital
Syeda Asma Gulnaaz, Nadia Nausheen, Md. Mustafa Ahmed
Abstract
Background: Empirical antibiotic therapy is routinely initiated in hospitalized patients with suspected bacterial infections; however, prolonged broad-spectrum use may increase antimicrobial resistance and healthcare costs. Culture-guided therapy, involving modification of antibiotics based on culture and antimicrobial susceptibility testing (AST), may optimize treatment and improve cost-effectiveness. The aim of the study was to compare the clinical outcomes and cost-effectiveness of empirical versus culture-guided antibiotic therapy in a tertiary care hospital.
Material and Methods: A prospective observational comparative study was conducted in the Department of Pharmacology in collaboration with clinical departments. A total of 75 adult inpatients receiving systemic antibiotics for suspected bacterial infections were included. Patients were categorized into empirical therapy (n=38) and culture-guided therapy (n=37) groups based on antibiotic modification following culture/AST results. Baseline characteristics, microbiological profile, antibiotic utilization, clinical outcomes, and direct medical costs were analyzed. Continuous variables were compared using the Independent t-test, and categorical variables using the Chi-square/Fisher’s exact test. Cost-effectiveness was assessed by comparing mean costs and clinical cure rates.
Results: Baseline characteristics were comparable between groups (p>0.05). The culture-guided group demonstrated a significantly higher de-escalation rate (56.8% vs 15.8%, p<0.001), shorter duration of antibiotic therapy (8.1±2.5 vs 10.4±3.2 days, p=0.002), and reduced length of hospital stay (8.9±3.6 vs 11.6±4.3 days, p=0.004). Mean antibiotic cost (₹6,320±1,980 vs ₹8,950±2,430; p<0.001) and total direct medical cost (₹45,900±11,600 vs ₹58,200±14,300; p=0.001) were significantly lower in the culture-guided group. Clinical cure rates were higher in the culture-guided group (83.8% vs 71.1%), though not statistically significant (p=0.18).
Conclusion: Culture-guided antibiotic therapy was associated with shorter hospital stay, reduced antibiotic exposure, and significant cost savings without compromising clinical outcomes. It represents a cost-effective and potentially dominant strategy in tertiary care settings.
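The "potentially dominant" claim above can be made concrete with a short cost-effectiveness sketch using the reported group means (an illustration, not an analysis from the paper): a strategy dominates its comparator when it is both cheaper and more effective, in which case no incremental cost-effectiveness ratio needs to be traded off.

```python
# Reported group means: direct medical cost (rupees) and clinical cure proportion
cost_culture, cost_empirical = 45900, 58200
cure_culture, cure_empirical = 0.838, 0.711

delta_cost = cost_culture - cost_empirical    # negative: culture-guided is cheaper
delta_effect = cure_culture - cure_empirical  # positive: culture-guided cures more

# Cheaper AND more effective => the strategy "dominates" the comparator
dominant = delta_cost < 0 and delta_effect > 0
print(delta_cost, round(delta_effect, 3), dominant)  # -12300 0.127 True
```

The caveat carried over from the abstract: the cure-rate difference did not reach statistical significance (p = 0.18), so dominance here rests mainly on the cost side.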
86. Antibiotic-Associated Adverse Drug Reactions: A Prospective Observational Study
Nadia Nausheen, Syeda Asma Gulnaaz, Syed Mustafa Ashraf
Abstract
Background: Antibiotics are commonly prescribed drugs and are frequently associated with adverse drug reactions (ADRs), contributing to patient morbidity and prolonged hospital stay. The aim of the study was to evaluate the clinical pattern, causality, severity, seriousness, and outcomes of antibiotic-associated ADRs in a tertiary care hospital.
Materials and Methods: This prospective observational study included 50 patients who developed suspected antibiotic-associated ADRs. Data were collected using a structured case record form (CRF). Causality was assessed using the WHO-UMC scale, and severity was graded using the Modified Hartwig and Siegel scale. Statistical analysis was performed using descriptive statistics and the Chi-square test.
Results: The mean age was 46.8 ± 15.2 years with male predominance (56%). Cephalosporins were the most commonly implicated antibiotics (28%). Skin manifestations (42%) were the most frequent ADRs, followed by gastrointestinal reactions (28%). Type B reactions accounted for 60% of cases. Serious ADRs occurred in 16% of patients. Most patients recovered completely (74%), with no mortality. Significant associations were observed between age and severity (p = 0.041) and antibiotic class and organ system involvement (p = 0.032).
Conclusion: Antibiotic-associated ADRs are common but largely manageable. Strengthening pharmacovigilance and rational antibiotic use can improve patient safety.
87. Clinical Profile and Hematological Parameters in Patients with Iron Deficiency Anemia
Moumita Hazra Panja, Rohan Mody, Divyesh Savjiyani
Abstract
Background: Iron deficiency anemia (IDA) is the most prevalent nutritional deficiency disorder globally, affecting a disproportionately large segment of populations in developing countries. Despite its widespread recognition, comprehensive characterization of the clinical profile and hematological parameters across varying severities of IDA remains inadequately explored in many regional populations. This study aimed to evaluate the clinical presentations and hematological parameters in patients diagnosed with iron deficiency anemia and to assess their correlation with disease severity.
Methods: A hospital-based cross-sectional observational study was conducted at a tertiary care teaching hospital. A total of 320 patients diagnosed with IDA based on standard hematological and iron study criteria were enrolled. Demographic data, clinical symptoms, and comprehensive hematological parameters including complete blood count, peripheral blood smear morphology, serum iron, serum ferritin, total iron-binding capacity (TIBC), and transferrin saturation were systematically evaluated. Patients were categorized into mild, moderate, and severe anemia groups according to World Health Organization (WHO) criteria. Statistical analysis included descriptive statistics, chi-square tests, ANOVA, and Pearson correlation coefficients.
Results: The mean age of participants was 34.6 ± 12.8 years, with a female predominance (72.5%). The most common clinical presentations were generalized fatigue (89.4%), pallor (82.2%), and exertional dyspnea (54.1%). The mean hemoglobin was 8.2 ± 2.1 g/dL, mean corpuscular volume (MCV) was 68.4 ± 8.7 fL, mean serum ferritin was 6.8 ± 4.2 ng/mL, and mean TIBC was 428.6 ± 62.3 µg/dL. Significant progressive deterioration in hematological indices was observed across mild, moderate, and severe groups (p < 0.001). Serum ferritin demonstrated the strongest negative correlation with disease severity (r = −0.72, p < 0.001). Microcytic hypochromic morphology on peripheral smear was present in 74.1% of patients.
Conclusion: Iron deficiency anemia presents with a characteristic constellation of clinical symptoms and hematological abnormalities that progressively worsen with increasing disease severity. Serum ferritin remains the most reliable single marker for assessing iron depletion severity, while a comprehensive evaluation integrating clinical assessment with multiple hematological parameters is essential for accurate diagnosis and appropriate therapeutic stratification.
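The Pearson correlation the abstract above reports between ferritin and severity (r = −0.72) can be illustrated with a minimal pure-Python sketch. The toy data below are invented for illustration and are not the study's data:

```python
from math import sqrt

def pearson_r(x: list[float], y: list[float]) -> float:
    """Pearson product-moment correlation for paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented toy data (NOT the study's): serum ferritin (ng/mL) falling as WHO
# severity grade rises (1 = mild, 2 = moderate, 3 = severe) gives a negative r,
# the direction the study reports (r = -0.72).
severity = [1, 1, 2, 2, 3, 3]
ferritin = [14.2, 11.8, 7.5, 6.9, 3.1, 2.4]
print(round(pearson_r(severity, ferritin), 2))
```

A negative r here means the two variables move in opposite directions: the more severe the anemia grade, the lower the ferritin, which is why ferritin serves as the study's strongest severity marker.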
88. Spectrum of Otorhinolaryngological Disorders in Children: A Cross-Sectional Study
Chakravarthula Madhumitha, Chakravarthula Ramanachary, Dweethi Jayaprakash
Abstract
Background: Otorhinolaryngological disorders are among the most frequent health problems encountered in the pediatric population and contribute substantially to morbidity and healthcare utilization. The pattern and relative frequency of these conditions vary with age and environmental influences. The present study was conducted to evaluate the spectrum and distribution of ENT disorders among children attending a tertiary care hospital.
Material and Methods: This descriptive cross-sectional study included 410 children aged ≤14 years presenting to the Department of Otorhinolaryngology over an 18-month period. Demographic details, clinical features, and final diagnoses were recorded using a structured proforma. Patients were categorized into otological, rhinological, and laryngological disorders based on clinical evaluation and relevant investigations. Data were analyzed using appropriate descriptive and inferential statistical methods, with p < 0.05 considered statistically significant.
Results: The mean age of participants was 8.1 ± 3.9 years. The majority belonged to the 5–9-year age group (38.0%), followed by 10–14 years (35.7%) and 0–4 years (26.3%). Males constituted 57.1% of cases. Otological disorders were most common (44.4%), followed by rhinological (33.9%) and laryngological conditions (21.7%). Acute otitis media (26.4%) was the leading otological diagnosis, allergic rhinitis (31.7%) predominated among rhinological disorders, and acute tonsillitis (32.6%) was most frequent in the laryngological group. Age-wise variation in the distribution of major disorder categories was statistically significant (p < 0.05).
Conclusion: Pediatric ENT disorders are highly prevalent, with infective and allergic conditions forming the bulk of cases. Age-specific trends underline the importance of early detection and timely intervention to minimize complications and long-term morbidity.
89. A Cross-Sectional Study of Deviated Nasal Septum and Its Clinical Presentation
Chakravarthula Madhumitha, Dweethi Jayaprakash, Chakravarthula Ramanachary
Abstract
Background: Deviated nasal septum (DNS) is one of the most frequently encountered structural abnormalities in otorhinolaryngology practice and is a major contributor to nasal obstruction and related sinonasal complaints. The clinical presentation varies depending on the type and severity of septal deviation. This study aimed to evaluate the clinical profile and morphological patterns of DNS and to assess the association between septal deviation type and presenting symptoms.
Material and Methods: A hospital-based cross-sectional study was conducted in the Department of Otorhinolaryngology of a tertiary care center over 18 months. A total of 150 patients aged ≥10 years with clinically and endoscopically confirmed DNS were enrolled. Demographic details, symptom profile, and severity of nasal obstruction using a 10-point Visual Analog Scale (VAS) were recorded. Septal morphology was classified based on endoscopic findings. Data were analyzed using appropriate statistical tests, with p < 0.05 considered statistically significant.
Results: The mean age of participants was 31.8 ± 11.6 years, with the highest proportion in the 21–30-year age group (30.7%). Males constituted 62.7% of cases. Nasal obstruction was present in all patients. Headache (58.7%) and nasal discharge (50.7%) were common associated symptoms. Moderate nasal obstruction was observed in 45.3% of patients, with a mean VAS score of 5.9 ± 2.1. C-shaped deviation was the most frequent morphological type (41.3%). Inferior turbinate hypertrophy was the most common associated endoscopic finding (54.7%). A statistically significant association was observed between the type of septal deviation and predominant symptoms (χ² = 9.84, p = 0.043).
Conclusion: DNS predominantly affects young adults and is commonly associated with moderate to severe nasal obstruction. Morphological variation influences symptom patterns, underscoring the importance of detailed anatomical assessment for optimal clinical management.
90. Study of Branching Pattern of Right Coronary Artery and Morphometric Benchmarks
Gunale Vankat Tukaram, Mohammad Farhan Rashid Hamid
Abstract
Aim: The aim of this study was to meticulously dissect and analyze the branching patterns of the right coronary artery (RCA) in 50 formalin-fixed adult human cadaveric heart specimens. This investigation sought to document the prevalence of normal anatomy versus variations.
Materials and Methods: Fifty adult human hearts from both sexes, aged 30–70 years (mean 52 ± 10), were formalin-fixed and dissected under magnification after removal of the pericardium and epicardial fat. The RCA was traced from its ostium in the right aortic sinus along the right atrioventricular groove to its termination at the crux. Branches were identified, measured (length and diameter, using a digital caliper), photographed, and classified. Variations such as myocardial bridging and duplication were noted. Data were tabulated and analyzed using SPSS v26 (chi-square tests, frequencies).
Results: Right dominance predominated in 42/50 (84%) specimens, with co-dominance in 6/50 (12%) and left dominance in 2/50 (4%). The sinoatrial nodal artery (SANA) originated from the RCA in 40/50 (80%) and the conus artery in 35/50 (70%). The acute marginal artery (AMA) was present in 48/50 (96%), single in 90% and double in 8%. The posterior descending artery (PDA) arose from the RCA in 84%, and posterolateral branches averaged 2.1 ± 0.8. Variations included early branching (10%) and myocardial bridging (4%). No ectopic origins were observed. Mean RCA length was 10.2 ± 2.1 cm and diameter 3.5 ± 0.6 mm.
Conclusion: This study affirms the RCA's predominant right dominance (84%) and consistent branching (SANA 80%, AMA 96%) in central Indian cadavers, aligning with global trends but highlighting higher co-dominance than some Western cohorts. These data underscore the need for preoperative imaging to navigate variations, potentially lowering CABG/PCI risks. Future multi-center studies with larger samples and angiographic correlation are recommended.
91. Evaluation of Glycated Hemoglobin and Lipid Profile in Type 2 Diabetes Mellitus
Soumya B., Pratibha Tripathi, Kapil Khanna
Abstract
Background: Type 2 diabetes mellitus (T2DM) is a chronic metabolic disorder characterized by impaired glycemic control and lipid abnormalities that collectively elevate cardiovascular risk. This study aimed to evaluate glycated hemoglobin (HbA1c) and lipid profile parameters in T2DM patients and assess their correlation with glycemic control.
Materials and Methods: A cross-sectional study was conducted at a tertiary care hospital in India, enrolling 120 participants — 80 confirmed T2DM patients and 40 healthy age- and sex-matched controls. Fasting venous blood samples were analyzed for HbA1c by High-Performance Liquid Chromatography (HPLC), while total cholesterol (TC), triglycerides (TG), and HDL-C were measured by enzymatic colorimetric methods. LDL-C and VLDL-C were derived using the Friedewald equation. Data were analyzed using SPSS version 26.0, with p < 0.05 considered statistically significant.
Results: HbA1c was significantly higher in T2DM patients than controls (9.1 ± 1.8% vs. 5.2 ± 0.4%, p < 0.001). Diabetic patients exhibited significantly elevated TC (5.82 ± 1.12 vs. 4.61 ± 0.83 mmol/L), TG (2.31 ± 0.87 vs. 1.24 ± 0.41 mmol/L), LDL-C (3.75 ± 0.94 vs. 2.67 ± 0.71 mmol/L), and VLDL-C (1.05 ± 0.40 vs. 0.56 ± 0.19 mmol/L), with significantly reduced HDL-C (1.02 ± 0.24 vs. 1.38 ± 0.29 mmol/L; all p < 0.001). HbA1c correlated positively with TC (r = +0.48), TG (r = +0.52), LDL-C (r = +0.44), and VLDL-C (r = +0.52), and negatively with HDL-C (r = −0.41; all p < 0.001).
Conclusion: Patients with T2DM exhibit markedly elevated HbA1c alongside a significantly atherogenic lipid profile, highlighting the critical need for routine simultaneous monitoring of glycemic control and lipid status to mitigate cardiovascular risk.
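The Friedewald derivation used in the Methods above can be sketched in its mmol/L form (the divisor is 2.2 in mmol/L, versus 5 when working in mg/dL); the function name is illustrative:

```python
def friedewald_mmol(tc: float, hdl: float, tg: float) -> tuple[float, float]:
    """Friedewald estimates in mmol/L: VLDL-C = TG / 2.2 and
    LDL-C = TC - HDL-C - VLDL-C (the divisor is 5 when working in mg/dL).
    Valid only when TG < ~4.5 mmol/L and no chylomicrons are present."""
    vldl = tg / 2.2
    ldl = tc - hdl - vldl
    return ldl, vldl

# Mean values reported for the diabetic group: TC 5.82, HDL-C 1.02, TG 2.31 mmol/L
ldl, vldl = friedewald_mmol(5.82, 1.02, 2.31)
print(round(ldl, 2), round(vldl, 2))  # 3.75 1.05, matching the reported means
```

Applied to the diabetic group's mean values, the equation reproduces the reported LDL-C (3.75 mmol/L) and VLDL-C (1.05 mmol/L) exactly, confirming the internal consistency of the lipid panel.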
92. Chronic Recurrent Abscesses in a Child: Unmasking Scrofuloderma as a Form of Cutaneous Tuberculosis
Juhi Tomar, Sanjay Purohit, Maulik Kotadia
Abstract
Cutaneous tuberculosis, a rare extrapulmonary manifestation, constitutes a minor fraction of the overall TB disease burden, with scrofuloderma being the most common variant in paediatric populations. We present the case of a 7-year-old girl with a two-year history of recurrent abscesses on her chest and back that had recently enlarged. Previous incomplete treatment provided only temporary improvement. Clinical examination revealed classic signs of scrofuloderma, including multiple undermined ulcers with seropurulent discharge, sinuses, and puckered scars, accompanied by bilateral cervical lymphadenopathy in an otherwise systemically well child. Diagnostic workup revealed microcytic anaemia, an elevated erythrocyte sedimentation rate, and a strongly positive Mantoux test. Histopathological examination of a skin biopsy showed chronic inflammatory infiltrates with multinucleated giant cells, confirming the diagnosis. The patient was started on a standard four-drug anti-tubercular therapy regimen. This case underscores the characteristic clinical presentation of scrofuloderma in a child and highlights the critical importance of a high index of suspicion and the necessity of a complete, supervised course of ATT to achieve cure and prevent relapse, even in the absence of pulmonary involvement.
93. Comparison Between Neostigmine and Fentanyl as Adjuvant to Bupivacaine in Epidural Labour Analgesia: A Randomized Controlled Trial
Kumar Kunal, Sarath Chandran CR, Ashok Rout, Ashisha, Nitesh Singmar
Abstract
Background: Epidural analgesia is considered the gold standard for labour pain management. Opioid adjuvants such as fentanyl are commonly combined with bupivacaine to enhance analgesic efficacy and reduce local anesthetic requirement; however, opioid-related side effects remain a concern. Neostigmine, a cholinesterase inhibitor, has emerged as a potential non-opioid alternative for neuraxial analgesia.
Objective: To compare the analgesic efficacy and safety of neostigmine versus fentanyl as adjuvants to bupivacaine in epidural labour analgesia.
Methods: In this prospective randomized single-blinded controlled trial, 60 ASA I–II parturients in active labour were randomly allocated into two groups (n=30 each). Group N received epidural bupivacaine (1.25 mg/mL) with neostigmine 2 µg/mL, while Group F received bupivacaine with fentanyl 2 µg/mL. Primary outcomes included duration of analgesia and total bupivacaine consumption. Secondary outcomes included time to full cervical dilatation, rescue bolus requirement, maternal side effects, and neonatal outcomes. Statistical analysis was performed using appropriate parametric and non-parametric tests, with p<0.05 considered significant.
Results: The neostigmine group demonstrated significantly longer duration of analgesia (142 ± 24 min vs 118 ± 22 min; p=0.001) and lower total bupivacaine consumption (32.4 ± 5.8 mg vs 39.8 ± 6.5 mg; p<0.001). Fewer patients required multiple rescue boluses in the neostigmine group (16.7% vs 46.7%; p=0.01). Time to full cervical dilatation was shorter in the neostigmine group (4.0 ± 1.0 h vs 4.6 ± 1.2 h; p=0.04). Pruritus was significantly higher in the fentanyl group (33.3% vs 3.3%; p=0.003), while nausea was more frequent with neostigmine (30% vs 10%; p=0.04). Neonatal Apgar scores and mode of delivery were comparable between groups.
Conclusion: Epidural neostigmine as an adjuvant to bupivacaine provides prolonged analgesia with reduced local anesthetic requirement and lower incidence of opioid-related pruritus compared to fentanyl, while maintaining comparable fetomaternal safety. Neostigmine represents a promising non-opioid alternative for labour epidural analgesia.
94. Prospective Observational Study of Anatomical Correlation between External Jugular Vein and Brachial Plexus at Cricoid and Supraclavicular Levels Using Ultrasonography
Anmol B. Lalwani, Vijay Patil, Ketaki Nirkhi
Abstract
Background: Regional anaesthesia is widely used for upper limb surgeries. Although ultrasound guidance is considered the gold standard for brachial plexus blocks, its availability is limited in many developing countries. Reliable surface anatomical landmarks therefore remain clinically important.
Aim: To evaluate the anatomical relationship between the external jugular vein (EJV) and the brachial plexus (BP) at the cricoid and supraclavicular levels using ultrasonography, and to assess the usefulness of EJV as a surface landmark for landmark-guided brachial plexus block.
Methods: This prospective observational study included 270 ASA physical status I and II adults aged 18–60 years. Ultrasound was used to identify the position of the brachial plexus relative to the external jugular vein at the cricoid and supraclavicular levels. Distances from skin to brachial plexus and from EJV to brachial plexus were measured and analysed statistically.
Results: In 90% of patients, the brachial plexus was located medial to the external jugular vein at both levels. At the cricoid level, the EJV-to-brachial plexus distance was less than 1 cm in 92.2% of cases. A strong positive correlation was observed between skin-to-brachial plexus distance at the two levels (r = 0.75), while EJV-to-brachial plexus distance showed no significant correlation (r = −0.02).
Conclusion: The external jugular vein shows a consistent anatomical relationship with the brachial plexus and may serve as a useful surface landmark for landmark-guided brachial plexus block, particularly in resource-limited settings where ultrasound is unavailable.
95. Combining AUDIT Questionnaire and Biochemical Markers to Assess Risk of Alcohol Withdrawal in Alcohol Use Disorder: A Prospective Study
Pooja Dhurvey, Ankur Nayan, Arindam Maiti
Abstract
Background: Early detection is crucial for the effective management of alcohol use disorders (AUDs) and the prevention of complications. The Alcohol Use Disorders Identification Test (AUDIT) is a popular screening tool, and biochemical markers may improve diagnostic precision.
Objective: To evaluate the effectiveness of AUDIT alone and in combination with biochemical markers in screening patients with alcohol dependence.
Methods: This prospective study included 155 patients with alcohol use disorders seen in the Department of Psychiatry, NSCB Medical College and Hospital, Jabalpur. Participants were evaluated using the AUDIT, the CIWA scale, and biochemical markers (AST, ALT, and MCV). Accuracy, sensitivity, specificity, PPV, and NPV were analysed.
Results: An AUDIT cutoff of 8 demonstrated high sensitivity (93.6%) and specificity (86.7%). Raising the cutoff to 15 decreased sensitivity while increasing specificity and PPV. Combining AUDIT with biochemical markers increased sensitivity and NPV; the maximum sensitivity (98.0%) was obtained when AST and MCV were added.
Conclusion: AUDIT is an effective screening tool for AUDs, and its accuracy is enhanced when combined with biochemical markers. This combined approach supports early diagnosis and better clinical management.
96. A Cross-Sectional Study on Knowledge and Practice of First Aid and Its Determinants among School Teachers in Karur District
Divya Vedhamoorthy, Caroline Priya Kumar, Priyadharshini S.
Abstract
Background: First aid is a vital life-saving skill, particularly in schools where children are vulnerable to injuries and medical emergencies. Teachers, as the immediate responders, play a crucial role in ensuring timely assistance and child safety.
Aims: This study aimed to assess the knowledge and practice of first aid among school teachers in Karur district, Tamil Nadu and to identify factors influencing their performance.
Methodology: An institution-based cross-sectional study was conducted among 440 school teachers selected through multistage random sampling. Data were collected using a validated, semi-structured questionnaire and analyzed with SPSS version 21.0 using appropriate descriptive and inferential statistics.
Results: The mean age of participants was 37.5±8.9 years and 85.5% were female. Overall, 57.3% reported prior first aid training. Good knowledge was observed in 54.5% of teachers, with higher knowledge regarding breathing difficulty (84.5%) and animal bites (71.1%), but lower for severe bleeding (15.5%) and epilepsy (28%). Multivariable logistic regression analysis showed that, after adjusting for potential confounders, only training status was significantly associated with knowledge (aOR 1.95, 95% CI 1.29–2.94). Good practice was reported by only 19.3% of teachers; postgraduate education and non-science subject specialization were independent predictors of good practice.
Conclusions: More than half of the teachers had good knowledge, but correct practice was notably poor, showing a knowledge-practice gap. Regular, structured and hands-on training programs, supported by adequate resources in schools, are essential to enhance preparedness in managing emergencies.
97. Association of High-Sensitivity C-Reactive Protein and Serum Triglyceride Levels with Severity of Acute Pancreatitis: A Prospective Observational Study
Manoj Kumar Banyal, Pinki Tak, Sansar Chandra Asiwal, Rajesh Jain
Abstract
Background: Acute pancreatitis (AP) is a potentially life-threatening condition with varying severity. Early identification of severe cases is crucial for improving outcomes. This study aimed to evaluate the role of high-sensitivity C-reactive protein (hs-CRP) and serum triglyceride levels in predicting the severity of AP.
Methods: A cross-sectional study was conducted on 125 patients diagnosed with AP. Serum calcium, hs-CRP, and triglyceride levels were measured within 24-48 hours of admission. Patients were classified into mild, moderately severe, and severe AP based on the Revised Atlanta Classification. Statistical analysis was performed to assess the correlation between these biomarkers and disease severity.
Results: The mean age of patients was 50±15 years, with a male predominance (72.8%). Alcohol was the most common etiology (60%), followed by gallstones (25%). Severe AP was associated with higher hs-CRP (337±21 mg/L) and triglyceride levels (359±171 mg/dL) compared to mild and moderately severe AP (p<0.001). Elevated hs-CRP and hypertriglyceridemia were independent predictors of severe AP.
Conclusion: hs-CRP and triglyceride levels are useful biomarkers for predicting the severity of AP. Early measurement of these parameters can aid in risk stratification and guide clinical management.
98. Evaluation of a Novel Commercial Rapid Test for Early Detection of Acute Dengue Infection
Ashish Bhalsod, Kaushik Tilwani, Shreyaskumar N. Shah
Abstract
Introduction: The emerging pattern and the increasing trend in the incidence of dengue infection are of great concern as there is no specific treatment of dengue, and most forms of therapy are supportive in nature. We conducted a study to evaluate a newly available commercial rapid immunochromatographic test for the early diagnosis of dengue infection. This test is designed to simultaneously detect dengue NS1 antigen and IgM antibody.
Materials and Methods: This study was carried out in the Department of Microbiology of our institute and included 300 clinically suspected dengue cases. Screening for dengue infection was performed using a rapid immunochromatographic test (ICT) designed to detect both dengue NS1 antigen and IgM antibody (Advantage Dengue NS1 Ag and Ab Combi Card). In addition, all samples were tested for dengue-specific IgM antibodies using ELISA.
Results: Of the 300 clinically suspected dengue cases, 88 samples were found to be positive by rapid immunochromatographic testing. Among these positive cases, 67 showed NS1 antigen positivity, whereas 21 were positive for IgM antibodies. All samples were subsequently analyzed using IgM ELISA.
Conclusion: Our findings support the usefulness of NS1 antigen–based diagnostic methods for the early diagnosis of acute dengue virus infection. Rapid immunochromatographic tests enable prompt detection of both dengue NS1 antigen and IgM antibodies. Detection of NS1 antigen during the symptomatic phase of illness represents a significant advancement in the early diagnosis of dengue fever. The Dengue NS1 antigen strip, as a rapid diagnostic test for early dengue virus infection, demonstrated high sensitivity and specificity and is therefore suitable as a first-line diagnostic tool, particularly in field settings.
99. An Analytical Study of Predisposing Factors for Gallbladder Perforation during Laparoscopic Cholecystectomy at a Tertiary Care Centre
T. Sreelakshmi, Mucherla VVN Suresh Babu, Venu Gopal
Abstract
Background: Laparoscopic cholecystectomy (LC) is the gold standard for symptomatic gallstone disease due to reduced postoperative pain, shorter hospital stay, and early recovery. However, gallbladder perforation (GP) remains a frequent intraoperative complication, reported in 10–35% of cases. GP may result in bile spillage, stone loss, increased operative time, infection, abscess formation, and prolonged hospitalization. Identifying predisposing factors is essential to minimize morbidity and improve surgical outcomes.
Aim: To identify and analyze the predisposing factors contributing to gallbladder perforation during laparoscopic cholecystectomy at a tertiary care centre.
Methods: This prospective observational study was conducted in the Department of General Surgery, Guntur Medical College, from April 2023 to March 2025. Sixty patients aged ≥18 years undergoing elective or emergency LC were included. Patients undergoing primary open surgery or with suspected malignancy were excluded. Preoperative assessment included clinical evaluation and ultrasonography (wall thickness, stone impaction). Intraoperative data included occurrence and cause of GP, surgeon experience, operative duration, and need for conversion. Postoperative outcomes such as infection, bile leak, and hospital stay were recorded. Statistical analysis was performed using SPSS, with p <0.05 considered significant.
Results: Gallbladder perforation occurred in 18 of 60 patients (30%). Thickened gallbladder wall (≥5 mm) showed a statistically significant association with GP (p = 0.015). Chronic cholecystitis demonstrated a higher perforation rate (42.9%) compared to acute cases (18.8%), though not statistically significant (p = 0.08). Age, gender, comorbidities, and surgeon experience were not significantly associated with GP. Postoperative infection occurred in 35% of patients, bile leak in 5%, and mean hospital stay was 5.65 days, with higher morbidity observed in GP cases.
Conclusion: Gallbladder wall thickness ≥5 mm is a significant predictor of intraoperative perforation. Early identification of high-risk patients and meticulous surgical technique are essential to reduce complications and improve patient outcomes.
100. A Prospective Randomized Single-Blind Study Comparing Cisatracurium and Rocuronium for Endotracheal Intubation in Elective Surgeries under General Anesthesia
Yedida Veera Pratap Kumar, Dasupuram Gunapriya, KNV Harish, Sambari Nikhil
Abstract
Background: Neuromuscular blocking agents play a crucial role in facilitating optimal endotracheal intubation during general anesthesia. Rocuronium is commonly used for its rapid onset, whereas cisatracurium offers organ-independent metabolism and stable hemodynamics. Comparative data on intubating conditions and physiological responses between these agents in elective non-obstetric surgeries remain limited.
Objectives: To compare cisatracurium and rocuronium with respect to intubating conditions, onset of neuromuscular blockade, hemodynamic responses, and adverse effects in adult patients undergoing elective surgeries under general anesthesia.
Methods: This prospective randomized single-blind study included 70 ASA I–II patients aged 20–60 years. Patients were randomized to receive either cisatracurium 0.1 mg/kg (Group C) or rocuronium 0.6 mg/kg (Group R). Intubating conditions were assessed using the Copenhagen Conference Score. Train-of-Four monitoring evaluated neuromuscular blockade, and hemodynamic parameters were recorded at predefined intervals.
Results: Group C demonstrated a higher proportion of excellent intubating conditions and a faster early TOF suppression. Group R showed a higher transient pressor response post-intubation. Both groups achieved adequate neuromuscular blockade by 180 seconds, with minimal adverse effects.
Conclusion: Cisatracurium provided superior intubating conditions and better hemodynamic stability compared with rocuronium in elective surgeries.
101. Micronutrient Deficiency Patterns in Children with Recurrent Infections: A Prospective Observational Study
Maloth Priyanka, Dasari Uday Kumar, Kaushal Poreddy
Abstract
Background: Recurrent infections in children are a frequent cause of healthcare visits and hospital admissions, particularly in low- and middle-income countries. Micronutrients such as iron, zinc, and vitamin D play a crucial role in immune function, and their deficiencies may predispose children to repeated infectious illnesses.
Aim: To assess the pattern of micronutrient deficiencies and their association with infection frequency and severity among children with recurrent infections.
Methods: This prospective observational study was conducted in the Department of Pediatrics, Government Medical College, Bhadradri Kothagudem, from July to November 2025. Children aged 6 months to 12 years with recurrent infections were enrolled after obtaining informed consent. Demographic details, clinical profile, and infection characteristics were recorded. Anthropometric assessment was performed using WHO growth standards. Laboratory evaluation included hemoglobin, serum ferritin, serum zinc, and serum 25-hydroxyvitamin D levels. Data were analyzed using SPSS version 21.0.
Results: Among 120 enrolled children, 82.5% had at least one micronutrient deficiency. Iron deficiency was the most common (60%), followed by vitamin D (45%) and zinc (38.3%). Nearly half of the children had multiple deficiencies. Children with two or more deficiencies experienced significantly higher infection frequency, longer illness duration, and increased hospitalization rates.
Conclusion: Micronutrient deficiencies, particularly iron, vitamin D, and zinc, are highly prevalent among children with recurrent infections and are associated with increased morbidity. Targeted screening and correction of deficiencies may help reduce infection burden in this vulnerable group.
102. Long-Term Visual Outcome and Complications of Congenital and Developmental Cataract Surgery in a Tertiary Care Hospital in Eastern India: A Prospective Observational Study
Fariduddin K., Mallick S., Lynapawngia S.
Abstract
Background: Clear vision during childhood is crucial for the proper development of the visual system; any obstruction such as a cataract can result in long-term visual impairment, including strabismus and amblyopia. For early-onset (2-3 months) and unilateral cataracts, the best visual acuity is achieved when surgery is performed at 4-6 weeks of age, which prevents amblyopia more effectively than surgery performed later. For bilateral cataracts, surgery is best carried out at 6 weeks of age, with the two eyes operated one week apart, to avoid amblyopia.
Materials and Methods: Our study focused on the outcomes of congenital and developmental cataract surgeries at the Regional Institute of Ophthalmology, Kolkata. Patients between 3 months to 12 years old with congenital or developmental cataracts undergoing surgery at the institute were evaluated over a 3-month follow-up period. Patients were examined post-surgery for early complications, followed by outpatient visits at RIO, Kolkata on specific intervals for up to 3 months, including visual acuity measurement, refraction, and monitoring for complications.
Results: Of 130 eyes in our study (each affected eye considered a separate case), 81 (62%) belonged to males and 49 (38%) to females. In patients presenting at <1 year of age, the mean visual acuity (VA) at onset was 1.271 and the final postoperative best-corrected visual acuity (BCVA) was 0.80. In those presenting at 1-3 years, the mean VA at onset was 1.065 and the final BCVA was 0.744; in those presenting at >3 years, the mean VA at onset was 1.000 and the final BCVA was 0.6776. The corresponding Pearson correlation coefficients were 0.257, 0.520, and 0.599, indicating a positive correlation between age at disease onset and both mean VA at onset and final BCVA at 3 months. Posterior capsular opacification (PCO) was the most common complication, occurring in 43.75% of cases.
Conclusion: Our study showed a positive correlation between the age of disease onset, the age at presentation to hospital, and visual acuity outcomes. Later presentation or delay in seeking medical attention adversely affects visual outcomes and increases related complications.
103. A Comparative Study of Single Suction Drain versus Double Suction Drain Following Modified Radical Mastectomy
Singh A., Sandhu P.S., Kumar A., Kaur H., Kaur R.
Abstract
Background: Breast cancer remains the most common malignancy among women worldwide and is a leading cause of cancer-related mortality. Modified Radical Mastectomy (MRM) continues to be a commonly performed surgical procedure, particularly in developing countries where patients often present with locally advanced disease. Postoperative seroma formation is one of the most frequent complications following MRM and contributes to patient discomfort, prolonged hospital stay, risk of infection, and delayed adjuvant therapy. Closed suction drainage is routinely employed to reduce seroma formation; however, the optimal number of drains remains controversial.
Objectives: This study aimed to compare the outcomes of single suction drain versus double suction drain placement following Modified Radical Mastectomy, with particular emphasis on seroma formation, postoperative pain, surgical site complications, and patient acceptability.
Methods: A comparative study was conducted over an 18-month period in the Department of General Surgery at Guru Gobind Singh Medical College and Hospital, Faridkot. Eighty female patients undergoing MRM for carcinoma breast were enrolled and divided into two equal groups: Single Drain Group (SDG, n=40) and Double Drain Group (DDG, n=40). Patients were followed postoperatively for complications including seroma, hematoma, surgical site infection, flap necrosis, pain, and need for secondary suturing. Statistical analysis was performed to compare outcomes between the two groups.
Results: Seroma formation was observed in 7 patients (17.5%) in the SDG and 4 patients (10%) in the DDG, with no statistically significant difference. Postoperative pain scores were significantly lower in the single drain group, and patient comfort and acceptability were higher compared to the double drain group. No significant differences were noted between the groups regarding drain duration, drain volume, surgical site infection, flap necrosis, or hematoma formation.
Conclusion: Single suction drain placement following Modified Radical Mastectomy is as effective as double drain placement in preventing postoperative seroma and other complications, while offering the advantages of reduced postoperative pain and improved patient comfort. Routine use of a single suction drain may therefore be recommended following MRM.
104. Perceived and Self-Stigma in People with Anxiety and Depressive Disorders: A Cross-Sectional Study in a Tertiary Care Setting
Bhavana Prasad, Suha Riyaz, Sandeep M. R., Bharat M. Mohan, Monica V. Dolli, Sindhoor V.
Abstract
Background: Stigma related to mental illness remains a major barrier to help-seeking, treatment adherence, and recovery. Individuals with anxiety and depressive disorders are particularly vulnerable to both perceived stigma and self-stigma, which may adversely influence clinical outcomes. However, data on stigma among these common mental disorders remain limited, especially in tertiary care settings in India.
Objectives: To assess perceived stigma and self-stigma among individuals with anxiety and depressive disorders and to evaluate their association with selected sociodemographic and clinical variables.
Methods: This hospital-based cross-sectional observational study was conducted in the Department of Psychiatry of a tertiary care center. A total of 52 adults diagnosed with anxiety disorders or depressive disorders as per ICD-11 criteria were recruited using consecutive sampling. Sociodemographic and clinical details were recorded using a semi-structured proforma. Perceived stigma was assessed using the Stigma Scale for Mental Illness (SSMI), and self-stigma was assessed using the Internalized Stigma of Mental Illness (ISMI) scale. Data were analyzed using SPSS version 25. Descriptive statistics were used to summarize variables, and inferential statistics including Chi-square test, independent t-test, Mann–Whitney U test, and Pearson correlation were applied. A p-value <0.05 was considered statistically significant.
Results: The mean age of participants was 34.6 ± 9.8 years, with a female predominance (57.7%). Participants with depressive disorders had significantly higher perceived stigma scores compared to those with anxiety disorders (45.1 ± 7.9 vs. 39.2 ± 8.1; p = 0.01). Internalized stigma was also significantly higher in depressive disorders (2.56 ± 0.42 vs. 2.24 ± 0.43; p = 0.004). A significant positive correlation was observed between duration of illness and self-stigma (r = 0.46, p <0.001).
Conclusion: The study demonstrated substantial levels of perceived stigma and self-stigma among patients with anxiety and depressive disorders, particularly in depressive disorders and those with longer illness duration. Targeted stigma-reduction interventions are essential to improve treatment engagement and mental health outcomes.
105. Cross-sectional Assessment of Homocysteine Levels in Patients Undergoing Cardiovascular Risk Screening
Saurav Rai, Ujjwal Kumar, Debjit Mitra
Abstract
Background: Cardiovascular diseases are a major cause of morbidity and mortality worldwide. Although traditional risk factors such as hypertension, diabetes mellitus, dyslipidemia, and smoking are well established, they do not explain cardiovascular risk in all individuals. Homocysteine, a sulfur-containing amino acid involved in methionine metabolism, has been suggested as an independent cardiovascular risk factor, yet its role in routine cardiovascular risk screening remains unclear.
Objective: To evaluate serum homocysteine levels in individuals undergoing cardiovascular risk screening and to determine their relationship with established cardiovascular risk factors.
Methods: A one-year cross-sectional observational study was carried out on 100 adults undergoing cardiovascular risk assessment. Demographic data, clinical parameters, and laboratory findings were recorded. Fasting serum homocysteine was measured using a standardized enzymatic assay, with levels above 15 µmol/L regarded as elevated. Appropriate statistical tests were applied to assess associations between cardiovascular risk factors and homocysteine levels, with p < 0.05 considered statistically significant.
Results: Elevated serum homocysteine was found in 38% of participants. Hyperhomocysteinemia was significantly more frequent among males, smokers, hypertensives, and dyslipidemic patients (p < 0.05). No statistically significant association was found between homocysteine levels and diabetes mellitus.
Conclusion: A substantial proportion of individuals undergoing cardiovascular risk screening have elevated homocysteine levels, which are strongly linked with several traditional cardiovascular risk factors. Homocysteine estimation may therefore be a valuable supplement to cardiovascular risk stratification.
106. Association of Dental Erosion with Gastroesophageal Reflux Disease (GERD) in Adult Patients
Saad Bin Saif, Maazia Sohail, Imamuddin
Abstract
Background: Gastroesophageal reflux disease (GERD) is a chronic gastrointestinal disorder with extra-esophageal manifestations, including oral complications such as dental erosion, which is seldom recognized. Dental erosion is the dissolution of the structural components of a tooth caused by acid attack.
Objective: To assess dental erosion as a consequence of gastroesophageal reflux disease in adult patients.
Methods: This cross-sectional observational study was conducted over one year at a single hospital and included 100 adults diagnosed with GERD. A comprehensive five-section questionnaire captured participants' demographics, medical history, GERD-related factors (duration and severity), dietary habits, and oral hygiene behaviors. Clinical oral assessment was performed, and dental erosion was scored using the Basic Erosive Wear Examination (BEWE) index. Data were analyzed in SPSS using descriptive statistics and the Chi-square test; a p-value < 0.05 was considered statistically significant.
Results: Of the total study participants, 62% had dental erosion. Prevalence was higher in patients with longer duration and greater severity of GERD symptoms, and the association between dental erosion and GERD symptom severity was statistically significant (p < 0.05).
Conclusion: The research identifies a clear link between GERD and dental erosion among adult patients. Tooth erosion and GERD are often overlooked in dental assessments. Early dental evaluations and interdisciplinary approaches are pivotal in obtaining a diagnosis and managing care to enhance outcomes.
107. Evaluating Effectiveness of E-learning Module in Anatomy for First Year MBBS Students
Mangesh Lone, Motiram Khandode, Anshudeep Dodake, Megha Khandode
Abstract
Background: The National Medical Commission’s Graduate Medical Education Regulations 2018 emphasize self-directed, learner-centric approaches and the adoption of contemporary educational technologies, including e-learning. First-year MBBS students often have limited classroom time for gross anatomy, and e-learning may enhance their understanding through flexible access to structured material. The COVID-19 pandemic further highlighted the need for effective online learning modalities.
Aim: To develop and introduce an e-learning module in gross anatomy and evaluate its effectiveness and students’ perceptions.
Materials and Methods: A prospective interventional study was conducted among 150 first-year MBBS students, of whom 119 consented to participate. The anatomy of the heart was divided into two topics: one taught through a traditional didactic lecture and the other through an e-learning module uploaded on Google Groups. Both topics were followed by online MCQ (Multiple Choice Question) assessments. The second topic was later taught again via didactic lecture, after which student perceptions of e-learning were collected through an online feedback questionnaire. Quantitative and qualitative data were analysed using Microsoft Excel.
Results: A total of 47 students completed both MCQ assessments. Mean scores for Topic 1 (traditional lecture) and Topic 2 (e-learning) were 8.60 ± 3.08 and 8.57 ± 3.70, respectively (p = 0.97), indicating no significant difference in performance between the two groups. More than 80% of students found the e-learning module easy to access, navigate, and useful in enhancing understanding. Over 50% felt it promoted interaction, could replace some lectures, and should be continued in future teaching. However, 22% preferred traditional learning methods.
Conclusion: Although learning outcomes did not significantly differ between traditional and e-learning methods, student feedback demonstrated high acceptance and satisfaction with the e-learning module. E-learning served as a valuable complement to conventional teaching, supporting its continued integration into the MBBS curriculum.
108. A Clinical Profile of Neuromyelitis Optica and Neuromyelitis Optica Spectrum Disorders in a Tertiary Care Hospital in South Tamil Nadu
Prabhu, K. Vignesh, S. Sivaramasubramanian
Abstract
Background: Neuromyelitis optica (NMO) and neuromyelitis optica spectrum disorders (NMOSD) are severe inflammatory demyelinating disorders predominantly affecting the optic nerves and spinal cord. Data on their clinical profile and outcomes in South Tamil Nadu remain limited. This study aimed to evaluate the clinical characteristics, serostatus, and functional outcomes of NMO/NMOSD patients presenting to a tertiary care center in this region.
Materials and Methods: This retrospective observational study included 25 patients diagnosed with NMO/NMOSD between January 2016 and May 2019. Patients fulfilling revised Wingerchuk criteria were analyzed for clinical presentation, laboratory findings, neuroimaging, treatment response, and outcomes. Disability was assessed using the Expanded Disability Status Scale (EDSS) at the last follow-up.
Results: Of the 25 patients, 7 (28%) were anti-aquaporin-4 antibody positive and 18 (72%) were seronegative. The female-to-male ratio was 2.57:1, with a median age of onset of 35 years. Combined myelitis and optic neuritis at presentation was significantly more frequent in seronegative patients (28%; p=0.047). Cervico-dorsal spinal cord involvement was observed in 68% of patients with myelitis. All patients received intravenous methylprednisolone; additional therapy included repeat steroids (40%), plasmapheresis (8%), and rituximab (4%). Median EDSS at last follow-up was lower in seronegative patients (3.5) than seropositive patients (5.0), though not statistically significant (p=0.14).
Conclusions: Seronegative NMO/NMOSD constituted the majority of cases in this South Tamil Nadu cohort and more commonly presented with combined myelitis and optic neuritis. Functional outcomes were comparable between seropositive and seronegative patients. Larger prospective studies are required to clarify regional disease patterns and optimize management strategies.
109. The Study of Clinical Profile and Role of Endoscopic Ultrasonography in Chronic Pancreatitis Patients
MD. Ashif Ali Ahmed, Bashar Imam Ahmad, Mohammad Zakiuddin
Abstract
Background: Endoscopic ultrasound (EUS) provides high-resolution images of both pancreatic parenchyma and duct and therefore is an integral component of evaluating and treating patients with pancreatitis and its complications.
Aims and Objectives: To study the clinical profile of chronic pancreatitis patients and the role of endoscopic ultrasonography in their evaluation.
Materials and Methods: A cross-sectional observational study was conducted among patients admitted to a tertiary care centre with a clinical diagnosis of chronic pancreatitis. The study was carried out over one year, from April 2023 to March 2024, in the Department of Gastroenterology, IQ City Medical College, Durgapur, West Bengal, after obtaining ethical committee approval. A total of 71 patients of both genders with chronic pancreatitis were enrolled after written consent. A detailed history, including family history, alcohol consumption, and the presence and severity of abdominal pain, was recorded. All patients underwent a thorough clinical examination, routine hematologic and biochemical investigations, and abdominal ultrasonography. Parameters such as age, sex, abdominal symptoms, and serum CEA were recorded. The diagnosis of chronic pancreatitis was established by evidence of pancreatic calcification on abdominal ultrasonography.
Results: The maximum number of cases occurred in the 21-35 years age group (27 males, 16 females), followed by 36-50 years (15 males, 13 females). Common clinical findings were pain in 57, calcification in 11, diarrhea in 34, jaundice in 22, lump in 17, vomiting in 38, and GI bleed in 31; a significant difference was observed (P < 0.05). Parenchymal features were hyperechoic foci with shadowing in 60, lobularity with honeycombing in 54, hyperechoic foci without shadowing in 36, and minor stranding (hyperechoic lines ≥3 mm in length) in 48. Ductal features were main pancreatic duct (MPD) calculi in 71, irregular MPD contour in 23, dilated side branches in 15, MPD dilatation in 38, and hyperechoic duct margin in 44; a non-significant difference was observed (P > 0.05).
Conclusion: Endoscopic ultrasound is the most sensitive imaging modality for diagnosing pancreatic disorders; it can demonstrate subtle alterations in the pancreatic parenchymal and ductal structure even before traditional imaging and functional testing demonstrate any abnormality.
110. Demographic and Histopathological Profile of Sinonasal and Nasopharyngeal Lesions Along with Relevant Immunohistochemical Markers in a Tertiary Care Hospital: A Retrospective and Prospective Analysis
Sujit Hanumant Gore, Snehal Narayan Bansode, Gore Harishchandra D., Vaishali Harishchandra Gore, Siddhi Gaurish Sinai Khandeparkar
Abstract
Introduction: Sinonasal and nasopharyngeal lesions comprise a heterogeneous group of conditions ranging from inflammatory polyps to aggressive malignant neoplasms. Their overlapping clinical presentations often necessitate histopathological evaluation for definitive diagnosis.
Aim: To evaluate the clinicopathological spectrum of sinonasal and nasopharyngeal lesions in a tertiary care hospital, with emphasis on demographic distribution, clinical presentation, and histopathological categorization.
Materials and Methods: This retrospective plus prospective observational study included 317 patients presenting with sinonasal masses. Biopsy and excision specimens were processed using routine histopathological techniques, with special stains and immunohistochemistry applied where necessary. Lesions were classified into non‑neoplastic and neoplastic categories. Demographic and clinical data were analyzed, and statistical associations were assessed using the Chi‑square test.
Results: Of 317 cases, 224 (71%) were non‑neoplastic and 93 (29%) were neoplastic. Inflammatory polyps were the most common non‑neoplastic lesion (79.02%), while sinonasal papilloma (51.57%) and hemangioma (26.56%) predominated among benign neoplasms. Squamous cell carcinoma was the leading malignant tumor (27.58%). The majority of cases occurred in the 31–40 year age group (22.71%), with a male predominance (59.94%). Age showed a statistically significant association with lesion type (p = 0.0004), whereas gender did not (p = 0.628).
Conclusion: Sinonasal and nasopharyngeal lesions are predominantly non‑neoplastic, with inflammatory polyps being the most frequent. Benign neoplasms outnumber malignant ones, though squamous cell carcinoma remains the most common malignancy. Histopathological evaluation is indispensable for accurate diagnosis, and regional variations such as higher frequencies of fungal and granulomatous lesions highlight the importance of local epidemiological data in guiding clinical management.
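The age and lesion-type association above was assessed with a Chi-square test. As a hedged illustration of how such a statistic is computed, the sketch below uses an invented 2x2 collapse of the 317 cases; the study's actual contingency table has more age groups and is not reproduced here.

```python
# Chi-square statistic for a contingency table, as used for the reported
# age vs lesion-type association. The 2x2 counts below are an invented
# illustrative collapse of the 317 cases, not the study's actual table.
def chi_square(table):
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

table = [[90, 20],    # hypothetical: age <= 40 -> non-neoplastic, neoplastic
         [134, 73]]   # hypothetical: age > 40
print(round(chi_square(table), 2))  # -> 10.11 (df = 1)
```

The column totals (224 non-neoplastic, 93 neoplastic) match the study; the age split is assumed purely for demonstration.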
111. The Role of Shear Wave Elastography to Evaluate Liver Stiffness in Patients with Fatty Liver Diagnosed on B Mode Ultrasonography in Patients with No History of Alcohol Consumption
Hardi Patel, Dipti A. Shah, Major Deepak K. Rajput, Kavita U. Vaishnav, Kakshil Patel, Jiten Modi
Abstract
Background: Non-alcoholic fatty liver disease (NAFLD) is an increasingly prevalent condition associated with hepatic steatosis in the absence of significant alcohol consumption. While B-mode ultrasound is commonly used for diagnosis, it cannot quantify liver fibrosis—a crucial determinant of prognosis. Shear wave elastography (SWE) offers a non-invasive method to assess liver stiffness and potentially detect early fibrosis.
Materials and Methods: The study was undertaken from March 2025 to August 2025, after obtaining Institutional Review Board (IRB) approval, in patients referred to the Department of Radiodiagnosis, Narendra Modi Medical College & L.G. Hospital, Ahmedabad. The study population consisted of the first 100 patients with no history of alcohol consumption who were diagnosed with fatty liver on B-mode ultrasound and underwent shear wave elastography over 6 months. Liver stiffness values were recorded using a Mindray Resona I9 ultrasound machine.
Results: Of the 100 patients in this study, 60 were male and 40 were female. The most commonly affected age group was 40-49 years. Fatty liver grading on B-mode ultrasound revealed 50 patients (50.0%) in Grade I, 35 patients (35.0%) in Grade II, and 15 patients (15.0%) in Grade III. SWE measurements showed distinct ranges: Grade I (5.04 ± 0.65 kPa), Grade II (6.99 ± 0.80 kPa), and Grade III (9.74 ± 0.77 kPa). A positive correlation was observed between fatty liver grading on B-mode ultrasound and SWE values (r = 0.912, p < 0.001).
Conclusion: Shear wave elastography demonstrates excellent correlation with B-mode fatty liver grading in NAFLD patients. The mean SWE values observed in our study (Grade I: 5.04 kPa, Grade II: 6.99 kPa, Grade III: 9.74 kPa) demonstrate a clear stepwise progression that mirrors the expected increase in liver stiffness with advancing fatty liver disease, enabling noninvasive fibrosis risk stratification without liver biopsy.
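The correlation reported above (r = 0.912) is a Pearson coefficient between B-mode grade and stiffness. The minimal sketch below shows how such a coefficient is computed; the paired grade/kPa values are invented for illustration and are not the study's raw measurements.

```python
import math

# Pearson correlation between B-mode fatty liver grade and SWE stiffness.
# The paired values below are invented for illustration only; they are not
# the study's raw data (which yielded r = 0.912).
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

grades = [1, 1, 1, 2, 2, 2, 3, 3, 3]                  # B-mode grade
kpa = [4.6, 5.0, 5.4, 6.2, 7.0, 7.7, 9.0, 9.7, 10.4]  # stiffness, kPa
print(round(pearson_r(grades, kpa), 3))  # -> 0.961
```

A coefficient this close to 1 reflects the stepwise rise in stiffness with grade that the abstract describes.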
112. Utility of Abnormal WBC Scattergram & Pseudo Eosinophilia with Thrombocytopenia in Presumptive Diagnosis of Malaria
Laxmi Aheer, Mamta, Pawan Kumar, Kishore Khatri, Yogi Raj Joshi
Abstract
Background: Malaria remains a major global health problem, and while Giemsa‑stained peripheral blood smear microscopy is the diagnostic gold standard, it is limited by interobserver variability and resource constraints in many endemic settings. Automated haematology analysers, routinely used for complete blood counts, may provide valuable presumptive diagnostic clues through characteristic haemogram changes and abnormal WBC scattergram patterns.
Aim: To evaluate the utility of abnormal WBC scattergram patterns and pseudoeosinophilia, in combination with thrombocytopenia and other haemogram abnormalities, as indicators for the presumptive diagnosis of malaria.
Materials and Methods: This was a descriptive, retrospective analysis of all malaria-positive cases, conducted over four months (August–November 2023) at Dr SN Medical College and associated hospitals, Jodhpur. A total of 1,200 EDTA blood samples from patients clinically suspected of malaria were analysed using the Sysmex XS‑800i automated haematology analyser, and corresponding Giemsa‑stained thick and thin peripheral blood smears were examined microscopically for Plasmodium species. Pseudoeosinophilia was defined as a spurious increase in eosinophil count on the analyser not corroborated by manual differential counts, thrombocytopenia as platelet count <150,000/µL, and abnormal WBC scattergram patterns were recorded as per predefined criteria.
Results: Of 1,200 samples, 61 (5.1%) were positive for Plasmodium vivax malaria on peripheral smear, with patient ages ranging from 10 to 65 years. Among malaria‑positive cases, 86.8% showed thrombocytopenia, 42.6% anaemia, 40.9% pseudoeosinophilia and 65.5% abnormal WBC scattergram findings; pancytopenia and bicytopenia were observed in 9.8% and 37.7% of cases, respectively. The most frequent scattergram abnormalities included “graying” and overlap of neutrophil and eosinophil clusters and double neutrophil or eosinophil populations. In most cases, higher degrees of parasitaemia correlated with more pronounced scattergram abnormalities and pseudoeosinophilia.
Conclusion: Thrombocytopenia, pseudoeosinophilia and abnormal WBC scattergram patterns on the Sysmex XS‑800i are frequently associated with P. vivax malaria and, when interpreted together, substantially enhance the presumptive detection of infection. In malaria‑endemic areas, careful review of haemograms and WBC scattergrams can serve as a rapid, cost‑effective screening adjunct to guide targeted smear examination and improve early laboratory recognition of malaria.
113. Dyslipidemia Incidence and Its Correlation with Outcome Prediction in Patients with a Cerebrovascular Accident Diagnosis: A Prospective Cross-Sectional Analysis
Pradeep Kumar Sharma, Kumari Suruchi, Ravindra Kumar Das
Abstract
Background: Lipid abnormalities have been identified as one of the risk factors for ischemic stroke. Nonetheless, studies comparing patients' lipid profiles with their stroke pattern (hemorrhage versus infarction) are scarce. The incidence and association of lipid abnormalities in patients with cerebrovascular accidents (CVAs) were the focus of this investigation.
Methods: Between October 2020 and March 2021, 127 participants were examined in the Department of Medicine at Darbhanga Medical College and Hospital in Laheriasarai, Bihar, after being split into Cases (n = 102, with CVA) and Controls (n = 25, without CVA). For every subject, a thorough history and lipid profile were documented. Each patient’s brain CT/MRI was used to examine their stroke pattern.
Results: The majority of respondents in the Cases and Control groups were between the ages of 61 and 85 (45.09%) and 41 and 60 (44%), respectively. The majority of patients in the Cases group were male (61.76%), while the majority in the Control group were female (84%). Smokers made up the majority of the cases (53.92%). Among smokers in the Cases group, infarction (55%) was more common than hemorrhage (51%). The prevalence of dyslipidemia was higher in cases (56.86%) than in controls (28%; p=0.009). Dyslipidemia was present in 63.07% of infarct patients compared with 45.94% of patients with hemorrhage. The most common abnormality in cases was a reduction in high-density lipoprotein (HDL) levels (74%), followed by a fall in total cholesterol (64%). Of the 14 patients in the Cases group who died, 71.42% had dyslipidemia.
Conclusion: Lower HDL levels were associated with ischemic stroke more often than with hemorrhagic stroke. Dyslipidemia was most frequently observed in patients with stroke.
114. Serial Serum Albumin Level Estimation as a Prognostic Factor in Sepsis Patients Admitted in Intensive Care Units
Kumari Suruchi, Pradeep Kumar Sharma, Vinayanand Jha
Abstract
Background: Sepsis is a clinical syndrome characterized by a dysregulated host response to infection, with a spectrum of severity ranging from sepsis to septic shock. Mortality rates in shock cases have been shown to range from 10% to 40%, although estimates vary greatly and depend on the population studied. The main objectives of this study were to determine whether serum albumin levels are quantitatively associated with mortality risk and to examine serial serum albumin monitoring as a predictor of mortality and morbidity in sepsis patients admitted to the intensive care unit.
Method: This descriptive study involved 70 sepsis patients admitted to the Medicine ICU at Darbhanga Medical College & Hospital in Laheriasarai, Bihar, between September 2020 and February 2021. All selected patients received a comprehensive evaluation on the first day following the sepsis diagnosis, and serum albumin levels were measured again on days three and five. Patients were observed during their hospital stay, and their outcome (survival or death) was recorded. Data were entered into an MS Excel spreadsheet and analysed using Statistical Product and Service Solutions (SPSS) version 18.
Results: The 70 patients selected for the study were divided into two groups: survivors and non-survivors. On day 1, the mean serum albumin level was 3.72 g/dl (±0.278) in the survivor group and 3.11 g/dl (±0.247) in the non-survivor group. On day 3, mean serum albumin levels were 3.17 g/dl (±0.248) in survivors and 2.65 g/dl (±0.172) in non-survivors. On day 5, mean serum albumin levels were 2.72 g/dl (±0.25) in survivors and 2.32 g/dl (±0.144) in non-survivors. The differences in mean serum albumin on days 1, 3, and 5 were statistically significant on unpaired t-testing (p ≤ 0.001). Mean serum albumin declined from 3.72 g/dl on day 1 to 2.72 g/dl on day 5 in survivors, and from 3.11 g/dl to 2.32 g/dl in non-survivors.
Conclusion: This study found a direct correlation between a serum albumin level below 3.5 g/dl on all three days and the prognosis of sepsis patients. Serum albumin levels gradually dropped from day 1 onward in both the survivor and non-survivor groups; however, a drop below 3.0 g/dl was associated with a higher death rate. This suggests that the rate at which serum albumin falls below the normal threshold affects the mortality prognosis of the sepsis patient. Even in resource-limited settings, serum albumin measurement is inexpensive and can aid the clinical evaluation of sepsis patients, who are at risk of a poor prognosis.
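The group comparisons above used an unpaired t test on the day-wise albumin means. A minimal sketch of the day-1 comparison from the reported summary statistics follows; the survivor/non-survivor split is not stated in the abstract, so an even 35/35 division of the 70 patients is assumed purely for illustration.

```python
import math

# Unpaired (Welch) t statistic computed from the reported day-1 summary
# statistics. The abstract does not state the survivor/non-survivor split,
# so an even 35/35 division of the 70 patients is assumed for illustration.
def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    standard_error = math.sqrt(sd1**2 / n1 + sd2**2 / n2)
    return (mean1 - mean2) / standard_error

t = welch_t(3.72, 0.278, 35, 3.11, 0.247, 35)  # day-1 albumin, g/dl
print(round(t, 2))  # -> 9.7, a large t consistent with the reported p <= 0.001
```

Even with other plausible group splits, a between-group difference of 0.61 g/dl against SDs of roughly 0.25 g/dl yields a very large t statistic, consistent with the reported significance.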
115. Study of Echocardiographic Changes in Patients with Chronic Kidney Disease
Aditya Prakash Dinkar, Gyan Ranjan, Amit Kumar, Arohi Kumar, Vijay Kumar Singh
Abstract
Background: Chronic kidney disease (CKD) is strongly associated with cardiovascular morbidity and mortality. Patients with CKD frequently have left ventricular hypertrophy (LVH), systolic and diastolic dysfunction, and other echocardiographic abnormalities that can lead to poor cardiac outcomes. The purpose of this study was to assess the frequency and type of echocardiographic alterations in patients and their correlation with the severity of CKD.
Methods: This cross-sectional study, which involved 80 CKD patients aged at least 18 years, was carried out at SKMCH in Muzaffarpur, Bihar, between June 2025 and November 2025. Individuals with chronic alcohol use, ischemic heart disease, valvular heart disease, or congenital heart disease were excluded. LVH, left ventricular ejection fraction (LVEF), systolic and diastolic dysfunction, and pericardial effusion were evaluated by two-dimensional echocardiography (2D ECHO). Data were analysed to detect correlations between CKD stage and echocardiographic findings.
Results: Participants were 60% male and 40% female, with an average age of 55.2±12.4 years. As the severity of CKD worsened, the prevalence of LVH rose as well: 35.7% in stage 3, 53.8% in stage 4, and 69.2% in stage 5. In CKD stages 3, 4, and 5, systolic dysfunction (LVEF <55%) was noted in 14.3%, 30.8%, and 46.2% of patients, respectively. In 28.6% (stage 3), 53.8% (stage 4), and 69.2% (stage 5), diastolic dysfunction was seen. LVH and left ventricular failure were substantially correlated with anemia and hypertension.
Conclusion: Echocardiographic abnormalities are very common in patients with CKD and worsen as the disease progresses. To lower cardiovascular complications and improve outcomes for individuals with chronic kidney disease, early cardiac screening and targeted therapies are crucial.
116. Study of Serum Lipoprotein(a) Status during Type 2 Diabetes Mellitus
Gyan Ranjan, Aditya Prakash Dinkar, Vijay Kumar Singh, Arohi Kumar, Amit Kumar
Abstract
Background: Lipoprotein(a) is made up of an atherogenic LDL-like lipoparticle and a potentially thrombogenic apolipoprotein and is therefore implicated in cardiovascular disease. The objective of this study was to evaluate serum lipoprotein(a) status and to investigate the correlation of elevated serum lipoprotein(a) levels with other cardiovascular risk factors in type 2 diabetics.
Methods: This is a case-control study involving 82 patients, 37 type 2 diabetic patients and 45 non-diabetic control subjects. Sociodemographic data were collected and each patient underwent routine lipid assessment and lipoprotein (a) testing.
Results: The prevalence of hyperlipoproteinemia(a) was 17.8% in control subjects and 29.7% in type 2 diabetics. HDL cholesterol was significantly higher in controls than in type 2 diabetics (p = 0.028), while LDL cholesterol and serum lipoprotein(a) levels were higher in type 2 diabetics than in controls, with statistically significant differences (p = 0.025 and p = 0.026, respectively). Mean lipoprotein(a) values were higher in women (0.36 ± 0.34 g/l) than in men (0.28 ± 0.20 g/l), though not significantly (p = 0.171). Mean serum lipoprotein(a) levels were significantly higher in type 2 diabetics (0.39 ± 0.32 g/l) than in controls (0.25 ± 0.21 g/l) (p = 0.026). Plasma concentrations of lipoprotein(a) varied with age and appeared to increase beyond the age of 45. There was no correlation between lipoprotein(a) and other cardiovascular risk factors.
Conclusion: Hyperlipoproteinemia(a) is common in type 2 diabetics, and women have the highest plasma levels. Serum lipoprotein(a) concentrations are not correlated with other cardiovascular risk factors and therefore constitute an independent risk factor.
117. Growth Failure Factors and Stunting Prevalence in Children with Type 1 Diabetes: An Observational Study
Rajesh Singh, Umese Ram, Jiteshwar Prasad Mandal, Rakesh Ranjan
Abstract
Background: India is among the countries with the largest numbers of children with Type 1 diabetes mellitus (T1DM). Little is known about growth failure in children with diabetes, particularly in those with comorbidities and complications. The aim of this study was to determine the prevalence and predictors of stunting in children with T1DM.
Methods: This cross-sectional observational study was conducted at the Pediatrics Department of SKMCH, Muzaffarpur, Bihar, from June 2025 to November 2025. A total of 125 children and adolescents aged 1–18 years with T1DM were included. Demographic data, anthropometry, diet, sexual maturity rating, and biochemical measurements were assessed using standard protocols. Short stature was defined as a height-for-age Z-score ≤ −2. A p value <0.05 was considered statistically significant.
Results: Of the 125 children in the study, 60 (48%) were male and 65 (52%) were female. The mean age was 13.0±3.5 (range 1–18) years, and the mean duration of diabetes was 7.4±4.0 years. The mean HbA1c was 13.8±0.8%. Stunting was found in 20% of the children with T1DM. Stunted children had higher urine albumin-creatinine ratios, lower hemoglobin, lower midparental height Z-scores, and higher cholesterol. On binary logistic regression, stunting was significantly predicted by pre-existing comorbidities, worse renal function, prolonged disease duration, and short midparental height.
Conclusion: About one-fifth of children with T1DM were short. Monitoring growth in these children is crucial, particularly in those with short parents, long-standing diabetes, pre-existing comorbidities, and declining renal function.
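Short stature in the study above is defined by a height-for-age Z-score at or below −2. A minimal sketch of that classification follows; the reference median and SD are invented placeholders, not values from the WHO growth standards such a study would actually use.

```python
# Height-for-age Z-score: (observed height - reference median) / reference SD.
# The reference median and SD below are invented placeholders, not values
# from the WHO growth standards.
def height_for_age_z(height_cm, ref_median_cm, ref_sd_cm):
    return (height_cm - ref_median_cm) / ref_sd_cm

z = height_for_age_z(118.0, 131.0, 5.8)  # a hypothetical child
print(round(z, 2))  # -> -2.24, at or below -2, i.e. classified as stunted
```

In practice the reference median and SD are looked up per age and sex from standard growth tables; the arithmetic of the cutoff is all this sketch is meant to show.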
118. Study On Clinical and Etiopathological Profile of Pancytopenia in Children Aged 1-18 Years: An Observational Analysis
Umese Ram, Rajesh Singh, Jiteshwar Prasad Mandal, Rakesh Ranjan
Abstract
Background: Pancytopenia is a decrease of all three blood cell lineages below normal levels, comprising anemia, leukopenia, and thrombocytopenia. This study is an attempt to fill the gap in information about pancytopenia in pediatric patients. The study aimed to describe the clinical and etiopathological profile of pancytopenia in children aged 1–18 years.
Methods: This cross-sectional observational study was conducted to characterize the demographics, clinical features, and etiopathological profile of pediatric pancytopenia. The study was conducted from June 2025 to November 2025 on 65 patients aged 1–18 years who fulfilled the inclusion criteria and were admitted to the pediatric department of Sri Krishna Medical College and Hospital, Muzaffarpur, Bihar. IBM Statistical Package for the Social Sciences (SPSS) version 23 was used for statistical analysis.
Results: The majority of the 65 patients (55%) were in the 1–6 years age group. The study revealed male predominance, with a male-to-female ratio of 2.09:1, and most patients belonged to rural areas. The most common presenting complaint was easy fatigability (90%), followed by fever (54%). The most common physical finding was pallor (100%), followed by splenomegaly (38%) and pedal edema (18%). Bone marrow cellularity was hypocellular in 62%, hypercellular in 31%, and normocellular in 7%. Peripheral smears most often showed a normocytic normochromic picture (34%), followed by macrocytic hypochromic (30%). Regarding etiology, megaloblastic anemia (30%) was the most common cause of pancytopenia, followed by malignancies (30%), including myelodysplastic syndrome (9%), multiple myeloma (3%), acute lymphocytic leukemia (9%), and acute myeloid leukemia (9%), then aplastic anemia (14%) and sepsis (8%). The study also identified rarer causes of pancytopenia, such as disseminated tuberculosis (6%), malaria (9%), and dengue (3%).
Conclusion: According to the present study, megaloblastic anemia, malignancies, and aplastic anemia are the most frequent causes of pancytopenia, with megaloblastic anemia being the leading nutritional cause.
119. Clinicopathological Profile of Limbal Dermoid: A Case Series with Review of Immunohistochemical Findings
Suravi Debnath, Amresh Kumar, Nidhi, Ashish Kumar Sharma, Sujata Kumari, Pawan Pratap Singh
Abstract
Background: Limbal dermoid is a rare, congenital, benign lesion occurring at the corneoscleral junction. Although usually harmless, it can cause cosmetic issues, astigmatism, and occasionally affect vision. Some cases are associated with Goldenhar syndrome, a congenital condition involving craniofacial, auricular, and vertebral anomalies. Histopathology and immunohistochemistry (IHC) help confirm the diagnosis and identify tissue components. This case series describes four Grade I limbal dermoids (two isolated and two associated with Goldenhar syndrome) to highlight their clinical, systemic, and histopathological features.
Methodology: This retrospective case series included four clinically diagnosed limbal dermoid patients evaluated at a tertiary care hospital. All underwent detailed ocular examination, slit-lamp assessment, and photographic documentation. Cases 1 and 4 had isolated limbal dermoids, while Cases 2 and 3 showed features of Goldenhar syndrome. Data collected included demographics, presenting complaints, lesion grade and location, refractive error, systemic findings, and surgical details. All excised specimens were processed and analyzed in our hospital for routine staining and immunohistochemical evaluation.
Discussion: All four patients were 10–13 years old, and all lesions were Grade I, mostly located inferotemporally. Two patients had systemic features of Goldenhar syndrome, such as facial asymmetry and preauricular tags. All children had astigmatism, which improved with refractive correction, and none required urgent surgery. Histopathology showed keratinized epithelium, collagenous stroma, adnexal structures, and adipose tissue in all cases, while syndromic cases also contained cartilage. IHC findings further confirmed the choristomatous nature of the lesions.
Conclusion: This case series highlights the benign nature of Grade I limbal dermoids and their occasional association with Goldenhar syndrome. Most lesions respond well to conservative management, while syndromic cases need thorough systemic evaluation. Histopathology and IHC play an important role in distinguishing isolated lesions from syndromic ones and help guide appropriate treatment planning.
120. Prevalence and Risk Factors of Poorly Controlled Hypertension in Urban Communities
Mahendra Varthi, Pallavi Harish Pandhare
Abstract
Background: Uncontrolled hypertension remains a major public health concern despite widespread availability of antihypertensive therapy, particularly in rapidly urbanizing populations.
Objective: To determine the prevalence of uncontrolled hypertension and identify associated risk factors among treated hypertensive individuals in an urban population.
Methods: A cross-sectional study was conducted among 310 treated hypertensive adults, of whom 202 were categorized as uncontrolled and 108 as controlled based on blood pressure criteria. Sociodemographic characteristics, lifestyle factors, treatment patterns, and adherence were evaluated, and multivariate logistic regression was used to identify independent predictors.
Results: The prevalence of uncontrolled hypertension was 65.2%. Older age, additional salt intake, smoking, obesity, poor medication adherence, and single-drug therapy were associated with higher uncontrolled hypertension rates. Multivariate analysis identified additional salt intake, lack of structured healthcare education, single-drug therapy, and poor adherence as significant predictors.
Conclusion: A high burden of uncontrolled hypertension persists among treated urban patients, highlighting the need for improved patient education, dietary interventions, regular follow-up, and optimized pharmacological strategies to enhance blood pressure control and reduce cardiovascular risk.
121. Influence of Body Mass Index on Quality of Life among COPD Patients
Mahendra Varthi, Pallavi Harish Pandhare
Abstract
Background: Body mass index is increasingly recognized as an important determinant of clinical outcomes and quality of life in patients with chronic obstructive pulmonary disease.
Objective: To evaluate BMI distribution and its association with quality of life and airflow limitation severity among COPD patients.
Methods: A cross-sectional study was conducted among 210 COPD patients in whom BMI categories were assessed along with St George’s Respiratory Questionnaire scores, airflow limitation severity, and clinical predictors. Multiple linear regression analysis was performed to identify independent determinants of quality of life.
Results: Obese (44.8%) and overweight (30.0%) categories constituted the majority of participants, while underweight patients showed significantly higher symptom, activity, impact, and total SGRQ scores, indicating poorer quality of life. BMI and FEV1 were significant independent predictors of SGRQ total score, whereas female gender, biomass exposure, smoking, and hospitalization history were associated with worse outcomes.
Conclusion: BMI plays a significant role in determining quality of life in COPD patients, with underweight individuals demonstrating the greatest impairment. Comprehensive management strategies incorporating nutritional assessment and lifestyle interventions are essential to improve patient outcomes.