International Journal of Current Pharmaceutical Review and Research

e-ISSN: 0976 822X

p-ISSN: 2961-6042

NMC Approved Peer Review Journal


Disclaimer: Scopus, Embase, Publons and Crossref are registered trademarks of their respective companies.

This journal is a member of Crossref.

1. Comparison of Analgesic Efficacy of Ultrasound-Guided External Oblique Intercostal Plane Block and Modified Thoracoabdominal Nerve Block via Perichondrial Approach in Patients Undergoing Upper Abdominal Surgeries: A Randomized Controlled Study
Divyashree R., S. B. Gangadhar, Samveda S. G.
Abstract
Background and Aims: Effective postoperative pain management following upper abdominal surgeries improves recovery and reduces opioid requirements. Recently described ultrasound-guided fascial plane blocks, such as the External Oblique Intercostal Plane (EOIP) block and the modified thoracoabdominal nerve block via the perichondrial approach (M-TAPA), provide analgesia to the anterior and lateral abdominal wall. This study aimed to compare the postoperative analgesic efficacy of EOIP and M-TAPA blocks with conventional analgesia in patients undergoing upper abdominal surgeries. Material and Methods: This prospective randomized study included 45 patients undergoing elective upper abdominal surgery under general anaesthesia. Patients were randomly allocated into three groups (n=15 each): the control group (Group C) received general anaesthesia with standard systemic analgesia, while Group E received an ultrasound-guided EOIP block and Group T an ultrasound-guided M-TAPA block, both in addition to general anaesthesia. The primary outcome was postoperative pain assessed using the Numerical Rating Scale (NRS) at predetermined time intervals during the first 24 hours. Secondary outcomes included demographic variables, sensory dermatomal spread, total opioid consumption, NSAID consumption, and postoperative complications. Statistical analysis was performed using ANOVA and Chi-square tests. Results: Postoperative NRS scores were significantly lower in Groups E and T compared with the control group at multiple postoperative time points (p<0.001). Both the EOIP and M-TAPA groups demonstrated reduced opioid consumption and a prolonged time to first rescue analgesia compared with the control group. However, no statistically significant difference was observed between Groups E and T with respect to pain scores, opioid requirement, or duration of analgesia. The incidence of adverse effects was comparable among the three groups.
The mean age was comparable between groups (Group C: 45.3±11.2 years; Group E: 46.1±10.8 years; Group T: 44.7±12.1 years; p>0.05). Mean 24-hour opioid consumption was significantly higher in the control group (300±0 mg) than in the EOIP (73.3±59.4 mg) and M-TAPA (86.7±74.3 mg) groups (p<0.001). Similarly, NSAID consumption was significantly higher in the control group (3000±0 mg) than in the EOIP (1333±488 mg) and M-TAPA (1267±458 mg) groups. Both blocks produced adequate dermatomal spread from T6 to T11. Conclusions: The ultrasound-guided External Oblique Intercostal Plane (EOIP) block and the modified thoracoabdominal nerve block via the perichondrial approach (M-TAPA) have emerged as effective techniques for managing postoperative pain in patients undergoing upper abdominal surgeries. These blocks provide multi-dermatomal analgesia, significantly reducing opioid consumption and contributing to improved patient outcomes. With minimal complications reported, EOIP and M-TAPA blocks can be safely incorporated into multimodal analgesia protocols, enhancing pain management strategies. Clinically, the M-TAPA block demonstrated better sensory dermatomal spread (T6-T12) than the EOIP block, with a slight increase in the duration of analgesia, suggesting a potential advantage in certain clinical scenarios; we therefore recommend the M-TAPA block for postoperative analgesia following upper abdominal surgeries. However, both techniques demonstrated comparable analgesic efficacy, making them valuable additions to pain management approaches for upper abdominal surgeries.

2. Information‑Seeking Behavior and PubMed Utilization among Undergraduate Medical Students: A Cross‑Sectional Study
Siddanathi Narasinga Rao, Dimma Syamala, Ramidi Madhavi Reddy, Dornala Dinesh Reddy
Abstract
Background:  Efficient retrieval of biomedical literature is an essential component of evidence‑based medical practice. PubMed is one of the most widely used biomedical databases for accessing peer‑reviewed scientific literature. However, the extent to which undergraduate medical students possess adequate knowledge and practical skills to effectively utilize PubMed remains variable. Aim: To evaluate the knowledge, attitudes and practices related to PubMed literature searching among undergraduate medical students. Objective: To evaluate awareness and understanding of PubMed, assess students’ attitudes toward its academic utility, and analyze patterns of its practical use in medical education. Methods: A cross‑sectional questionnaire‑based study was conducted among 200 undergraduate medical students, including final‑year MBBS students and interns, at a tertiary care teaching hospital. A structured questionnaire assessing knowledge, attitude and practice regarding PubMed searching was distributed electronically using Google Forms. Data were analyzed using descriptive statistics and presented as frequencies and percentages. Results: Among the participants, 62% reported awareness of PubMed as a biomedical literature database. Knowledge of advanced search tools such as Medical Subject Headings (MeSH), Boolean operators and clinical filters was limited. Nearly half of the students expressed positive attitudes toward the importance of literature searching in medical education, while a considerable proportion reported neutral confidence levels in using PubMed effectively. Conclusion: Although awareness of PubMed exists among undergraduate medical students, gaps persist in knowledge depth and practical application of advanced search techniques. Structured training programs focusing on biomedical literature searching should be incorporated into undergraduate medical curricula to enhance research skills and support evidence‑based clinical learning.

3. Comparison of Real‑Time Ultrasound‑Guided Spinal Anaesthesia Vs Pre‑Procedural Ultrasound‑Guided Spinal Anaesthesia in Geriatric Patients Undergoing Infra Umbilical Surgeries
R. Jagadish Raj, S. B. Gangadhar, Anjali R. Hegde
Abstract
Background: Age-related degenerative changes of the lumbar spine, including narrowing of interspinous spaces and ligament calcification, increase the technical difficulty of neuraxial block placement in geriatric patients. Ultrasound guidance has been used to improve the success of spinal anaesthesia, either by pre-procedural ultrasound (PUS)-assisted landmark identification or real-time ultrasound (RUS)-guided needle insertion. However, comparative evidence between these two techniques in elderly patients is limited. This study aimed to compare the efficacy of real-time ultrasound-guided spinal anaesthesia with pre-procedural ultrasound-guided spinal anaesthesia in geriatric patients undergoing infra-umbilical surgeries. Material and Methods: In this prospective randomized study, 50 patients aged ≥60 years with American Society of Anaesthesiologists (ASA) physical status I–III undergoing infra-umbilical surgeries under spinal anaesthesia were randomized into two groups of 25 each to receive spinal anaesthesia using either the PUS or RUS technique via a paramedian approach. Primary outcomes measured were the number of attempts and needle passes required for successful subarachnoid block. Secondary outcomes included the time taken for identification of the subarachnoid space, procedure time, and total time for successful lumbar puncture. Statistical analysis was performed using the independent-sample t-test and Chi-square test. Results: The first-attempt success rate was higher in the PUS group (64%) compared to the RUS group (48%), without statistical significance (p=0.561). First-pass success was also greater in the PUS group (48%) than in the RUS group (28%) (p=0.209). The mean time for identification of the subarachnoid space was significantly longer in the PUS group (74.76±21.37 s) compared to the RUS group (63.72±17.94 s) (p<0.05). Procedure time was significantly longer in the RUS group (56.56±24.62 s) compared to the PUS group (32.94±14.47 s) (p<0.0001).
Total time for successful lumbar puncture was comparable between the groups (p=0.149). Conclusions: The total duration required for successful lumbar puncture was similar irrespective of the ultrasound technique used, with pre-procedural ultrasound taking longer for identification of the subarachnoid space and real-time ultrasound taking longer from skin puncture to cerebrospinal fluid backflow.

4. Comparison of Pressure Controlled Versus Volume Controlled Ventilation Modes in Patients Undergoing Lumbar Spine Surgery in Prone Position
Anjali R. Hegde, S. B. Gangadhar, R. Jagadish Raj
Abstract
Background: Prone positioning during lumbar spine surgery under general anaesthesia significantly alters respiratory mechanics and hemodynamics due to changes in thoraco-abdominal compliance and intra-thoracic pressure. The choice of intraoperative ventilation mode may influence pulmonary mechanics, cardiovascular stability, intraoperative blood loss, and surgical stress response. Evidence comparing pressure-controlled ventilation (PCV) and volume-controlled ventilation (VCV) in this setting remains inconsistent. Objectives: To compare PCV and VCV with respect to pulmonary mechanics, hemodynamic parameters, intraoperative blood loss, and stress response in patients undergoing elective lumbar spine surgery in the prone position. Material and Methods: This prospective, randomized comparative study included 60 adult patients (ASA I–II) scheduled for elective lumbar spine surgery. Patients were randomized to receive either PCV or VCV (n=30 each). Ventilation was standardized with a tidal volume of 8 mL/kg and PEEP of 5 cm H₂O. Hemodynamic and respiratory parameters were recorded after intubation in the supine position and 30 minutes after prone positioning. Dynamic compliance was calculated, the surgical stress response was assessed using random blood glucose levels, and intraoperative blood loss was estimated at the end of surgery. Results: Demographic characteristics were comparable between groups. Peak airway pressure was significantly lower and dynamic compliance significantly higher in the PCV group compared to the VCV group (p<0.001). Mean arterial pressure showed a lesser decline in the PCV group (p=0.02). Intraoperative blood loss was significantly lower with PCV (p<0.001). Heart rate, end-tidal carbon dioxide, and blood glucose levels showed no significant intergroup differences.
Conclusion: Pressure-controlled ventilation provides superior respiratory mechanics and reduced intraoperative blood loss with good hemodynamic stability and no significant stress response compared to volume-controlled ventilation during prone lumbar spine surgery.

5. Comparison of Thyroid Profile in Beta-Thalassemia Major Patients on Regular Blood Transfusion and Iron Chelation Therapy with Age-Matched Controls: A Cross-Sectional Study
Sanjeev Chahar, Saroj Ola, Vineet Popli, Deepti Jain, Pratima Khare
Abstract
Background: Beta-thalassemia major (BTM) is a transfusion-dependent hereditary hemoglobin disorder associated with iron overload and multiple endocrine complications, including thyroid dysfunction. Early detection of thyroid abnormalities is essential to reduce morbidity in these patients. Objective: To evaluate the prevalence and pattern of thyroid dysfunction in children with beta-thalassemia major receiving regular blood transfusion and iron chelation therapy, and to compare findings with age-matched healthy controls. Methods: This hospital-based cross-sectional comparative study included 50 children (≥8 years) with confirmed BTM and 50 age-matched healthy controls. Clinical evaluation, anthropometric measurements, and laboratory investigations including serum T3, T4, thyroid-stimulating hormone (TSH), and serum ferritin were performed. Thyroid status was classified as euthyroid, subclinical hypothyroidism, primary hypothyroidism, or secondary hypothyroidism. Statistical analysis was conducted using SPSS version 21.0. Results: The mean age of cases and controls was comparable. Thalassemia patients had significantly lower mean height and weight than controls. Overall, 26% of BTM patients exhibited thyroid dysfunction, with subclinical hypothyroidism being most common (20%), followed by primary hypothyroidism (6%). In contrast, 12% of controls had subclinical hypothyroidism, with no cases of overt hypothyroidism. Although mean T3, T4, and TSH levels did not differ significantly between groups, elevated TSH was associated with thyroid dysfunction. Higher serum ferritin levels and longer duration of chelation therapy were significantly associated with thyroid abnormalities. No significant association was found with age at diagnosis, transfusion frequency, or transfusion burden. 
Conclusion: Thyroid dysfunction, predominantly subclinical hypothyroidism, is a common endocrine complication in transfusion-dependent beta-thalassemia major patients despite ongoing chelation therapy. Regular screening of thyroid function and monitoring of iron overload are recommended for early detection and management.

6. Evaluation and Management of Diabetic Foot According To Wagner’s Classification
L. Parvathi, N. Deepthi, Shaik Mahammed Asadulla, Yasa Prathibha
Abstract
Background: Diabetic foot complications represent a significant healthcare burden globally, affecting approximately 25% of individuals with diabetes during their lifetime. Wagner’s classification system serves as a fundamental tool for systematic evaluation and therapeutic planning in diabetic foot management. This grading system categorizes foot lesions from Grade 0 (high-risk foot) to Grade 5 (extensive gangrene), facilitating standardized treatment approaches and outcome prediction. Methods: A prospective observational study was conducted involving 25 patients with diabetic foot complications presenting to our tertiary care center over 24 months. Each patient underwent comprehensive evaluation and was classified according to Wagner’s grading system. Treatment protocols were implemented based on the assigned grade, with regular follow-up assessments to monitor healing progress and clinical outcomes. Results: The study cohort demonstrated varying healing rates correlated with Wagner grade severity. Higher-grade lesions showed prolonged healing times and increased complication rates. Grade-specific treatment protocols proved effective in achieving optimal clinical outcomes, with early intervention significantly improving prognosis across all patient categories. Conclusions: Wagner’s classification system provides reliable guidance for diabetic foot evaluation and management. The systematic approach enables healthcare providers to implement appropriate therapeutic interventions, ultimately improving patient outcomes and reducing the risk of severe complications including amputation.
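For readers unfamiliar with the grading scheme this abstract applies, the Wagner grades can be sketched as a simple lookup table. This is an illustrative sketch only: the intermediate grade descriptions follow the commonly cited Wagner-Meggitt scheme (the abstract itself states only Grade 0 and Grade 5), and the helper name `describe_wagner` is ours, not the study's.

```python
# Wagner grades for diabetic foot lesions. Grades 0 and 5 are taken from the
# abstract; grades 1-4 follow the standard Wagner-Meggitt descriptions.
WAGNER_GRADES = {
    0: "High-risk foot, no open lesion",
    1: "Superficial ulcer",
    2: "Deep ulcer extending to tendon, bone, or joint capsule",
    3: "Deep ulcer with abscess or osteomyelitis",
    4: "Localized (forefoot or heel) gangrene",
    5: "Extensive gangrene of the whole foot",
}

def describe_wagner(grade: int) -> str:
    """Return the lesion description for a Wagner grade (0-5)."""
    return WAGNER_GRADES[grade]
```

In practice, such a mapping is what grade-specific treatment protocols key off: each grade selects a different escalation of wound care, antibiotics, or surgical intervention.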

7. Incidence of Acute Kidney Injury in Acute Febrile Illness
Srikanth Nakka, Dharam Dev Golani, Nagavaram Harikrishna, T. Krishna Kumar
Abstract
Background: Acute kidney injury (AKI) denotes a sudden and potentially reversible reduction in kidney function, characterized by increased serum creatinine or decreased urine output. Acute febrile illness (AFI) is one of the leading causes of AKI and of AKI-related death. Aim: The aim of this study was to evaluate the clinical profile of acute kidney injury in acute febrile illness. Materials and Methods: This original research study was carried out in the Department of General Medicine, Pimpri, Pune. All 100 patients diagnosed with acute febrile illness who were admitted to the medical wards and intensive care unit (ICU) between September 2020 and August 2024 were included in the study. Results: In our study, 70% of the study population were male and 30% were female. On RIFLE grading, 27% fell under the risk category, 13% under the injury category, and 10% under the failure category. On AKIN grading, 31% were grade 1, 8% grade 2, and 10% grade 3. KDIGO AKI stages 1, 2, and 3 were seen in 27 (27%), 9 (9%), and 14 (14%) of AKI patients, respectively. Conclusion: According to this study, dengue is the most frequent cause of AFI and AKI. The highest burden of AKI is caused by leptospirosis, dengue, and malaria. To help diagnose AKI early and provide appropriate care, we recommend that clinicians evaluate kidney function in patients with acute febrile illness who also have relevant risk factors.
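The creatinine-based KDIGO staging used in this abstract can be sketched as follows. This is a simplified illustration, not the study's code: the urine-output criteria, the 48-hour window for the 0.3 mg/dL rise, and the requirement that a creatinine ≥4.0 mg/dL reflect an acute increase are omitted, and the function name is ours.

```python
def kdigo_stage(baseline_scr: float, current_scr: float, on_rrt: bool = False) -> int:
    """Return the KDIGO AKI stage (0 = no AKI) from serum creatinine in mg/dL.

    Simplified sketch: urine-output criteria and timing windows are omitted.
    """
    ratio = current_scr / baseline_scr
    # Stage 3: >=3.0x baseline, creatinine >=4.0 mg/dL, or renal replacement therapy.
    if on_rrt or ratio >= 3.0 or current_scr >= 4.0:
        return 3
    # Stage 2: 2.0-2.9x baseline.
    if ratio >= 2.0:
        return 2
    # Stage 1: 1.5-1.9x baseline, or an absolute rise of >=0.3 mg/dL.
    if ratio >= 1.5 or (current_scr - baseline_scr) >= 0.3:
        return 1
    return 0
```

For example, a rise from 1.0 to 1.35 mg/dL already qualifies as stage 1 via the absolute-rise criterion even though the ratio is below 1.5.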

8. Evaluation of the Use of Steroid-Sparing Immunosuppressants in Dermatological Disorders: Prescription Pattern and Safety Considerations
Sanglaap Saha, Romit Banerjee, Ritarshi Bhattacharya, Soumik Ghosh, Ranita Das, Suhena Sarkar, Abanti Saha, Amrita Sil
Abstract
Background: Chronic dermatological conditions including psoriasis, lupus, vitiligo, pemphigus, and lichen planus frequently require systemic immunosuppressive therapy. Long-term corticosteroids carry significant morbidity, prompting reliance on steroid-sparing immunosuppressants. Prospective real-world data on their prescription patterns and safety profiles from tertiary care dermatology centres in India remain limited. Objectives: To evaluate the prescription patterns and incidence of adverse drug reactions (ADRs) of steroid-sparing immunosuppressants in patients with chronic dermatological disorders attending a tertiary care centre. Methods: A prospective, cross-sectional observational study was conducted over six months in the Dermatology OPD and IPD of a tertiary care medical college in eastern India. A total of 183 adult patients receiving non-steroidal immunosuppressants for at least four weeks were enrolled. Socio-demographic, clinical, and pharmacological data were recorded using structured case report forms. ADRs were documented using the CDSCO ADR reporting form version 1.4, and causality was assessed using the WHO-UMC scale. Results: The mean age was 39.22 ± 16.13 years; 53% were female. Psoriasis was the most prevalent diagnosis (47.4%), followed by vitiligo (14.1%) and lupus (6.0%). Methotrexate was the most frequently prescribed agent (27.3%), followed by cyclosporine (15.8%), tacrolimus (11.5%), and tofacitinib (10.9%). The ADRs encountered were mostly mild to moderate in severity, with systemic effects in 26.4% and mucocutaneous effects in 5.1%. Dyslipidaemia, cough, and arthralgia were the most common systemic ADRs (3.8% each). Conclusion: Methotrexate followed by cyclosporine dominated steroid-sparing immunosuppressant prescriptions, consistent with national and international literature. The ADR profile was predominantly mild to moderate.
Regular monitoring and pharmacovigilance are essential to ensure safe long-term use of these agents in dermatology practice.

9. Study of Drug Susceptibility, Resistance Patterns and Mutations, in Patients Diagnosed with Drug Resistant Tuberculosis in a Tertiary Care Centre Aurangabad, Maharashtra
Akash Bhardwaj, Anupam Prakash, Sunil Jadhav, Ashish S. Deshmukh, Hafiz Deshmukh, Shivprasad Kasat
Abstract
Background: Drug-resistant tuberculosis (DR-TB) remains a major public health challenge, particularly in high-burden countries like India. Resistance to first-line anti-tubercular drugs complicates treatment and contributes to increased morbidity and mortality. Understanding drug susceptibility patterns and associated genetic mutations is essential for effective management and control of DR-TB. Methods: A prospective observational study was conducted over a period of two years at MGM Medical College and Hospital, Aurangabad, Maharashtra. A total of 82 patients diagnosed with drug-resistant tuberculosis were enrolled, of which 80 patients with complete data were included in the final analysis. Drug susceptibility testing and mutation analysis were performed for first- and second-line anti-tubercular drugs. Data were analyzed using R software (version 4.3.2). Demographic characteristics, clinical profile, resistance patterns, and associated mutations were assessed. Results: Among the 80 patients studied, the majority belonged to the 10–30 years age group (51.25%), and males constituted 63.75% of cases. Most patients (93.75%) had no prior history of tuberculosis. Pulmonary tuberculosis was the predominant presentation (83.75%), while extrapulmonary disease accounted for 16.25%. Isoniazid resistance was observed in 96.25% of patients, with mutations in katG (79.22%) and inhA (28.57%). Rifampicin resistance associated with rpoB mutation was identified in 38.75% of patients. Fluoroquinolone resistance was seen in 23.75% of cases, predominantly involving gyrA mutation (100%) and gyrB mutation (63.16%). Resistance to second-line injectable drugs was identified in 5% of patients, associated with rrs and eis mutations. Pyrazinamide resistance was rare (1.25%). Conclusion: The study demonstrates a high burden of drug-resistant tuberculosis among young adults, with significant resistance to first-line drugs and identifiable genetic mutations. 
Routine drug susceptibility testing and molecular diagnostics are essential for early detection and effective management of DR-TB.

10. Variations in the Blood Supply of the Prostate Gland in the Maharashtra Population – A Cadaveric Study
Rahul Kharate
Abstract
Background: The prostate is a fibro-musculoglandular organ. Because of its various functions it is richly vascularized, but arterial variations pose a challenge to surgeons seeking to avoid surgical emergencies during prostatectomy. Method: Thirty-four (34) non-pathological adult prostates were dissected to study the arterial supply, cleaned with distilled water, and allowed to dry. The arteries were painted, and photographs were taken wherever variations were noted. Results: Of the 34 specimens, the inferior vesical artery provided the major arterial supply in 15 (41%); the least frequent sources were the gluteopudendal trunk in 2 (5.88%) and the middle rectal artery in 2 (5.88%). Conclusion: These variations are very important to urosurgeons during prostatectomy; although various radiological techniques such as CT scanning and angiography are available, complete and small branches cannot always be visualized.

11. A Comparative Study of Heart Rate Variability and Serum Uric Acid between Normotensive and Hypertensive Individuals in Tertiary Care
Nimit A. Hinsu, Happy Chadsaniya, Rashmita Ramani, Manish Kakaiya, R. S. Trivedi
Abstract
Background: Hypertension constitutes a major global health burden, contributing substantially to cardiovascular, renal, and cerebrovascular morbidity and mortality. Two emerging pathophysiological contributors—autonomic nervous system dysfunction as quantified by Heart Rate Variability (HRV), and elevated serum uric acid, the terminal metabolite of purine catabolism—have been independently implicated in the onset and progression of hypertension, yet their concurrent assessment in a treatment-naive cohort remains underexplored. Aims and Objectives: To compare HRV parameters and serum uric acid levels between newly diagnosed treatment-naive hypertensive patients and age- and BMI-matched normotensive controls. Setting and Design: A cross-sectional observational study conducted in the Department of Physiology, P.D.U. Government Medical College and Civil Hospital, Rajkot, Gujarat, India. Materials and Methods: Sixty age- and BMI-matched individuals aged 30–50 years were enrolled: 30 newly diagnosed untreated hypertensive patients and 30 normotensive volunteers. HRV was recorded using the Polar H9 heart rate sensor with the Elite HRV application. Serum uric acid was determined by Autoanalyzer. Statistical analysis was performed using SPSS v30.0.0. Intergroup comparisons were conducted using the unpaired Student’s t-test. Results: Hypertensives showed significantly elevated serum uric acid (5.84 ± 1.80 mg/dL vs. 4.34 ± 1.05 mg/dL; p < 0.001). Mean RR Interval, SDNN, and RMSSD were each significantly lower in the hypertensive group (p < 0.001). Frequency-domain analysis revealed significant reductions in LF (p < 0.001) and HF (p = 0.036); the LF/HF ratio did not differ significantly (p = 0.122). Conclusion: Hypertensive individuals exhibit markedly reduced HRV and elevated serum uric acid, collectively reflecting impaired autonomic regulation and augmented cardiovascular risk. 
Monitoring these non-invasive biomarkers may facilitate early cardiovascular risk stratification and targeted preventive intervention.
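The time-domain HRV indices reported above, SDNN and RMSSD, have standard definitions that can be sketched in a few lines. This is an illustrative sketch with hypothetical RR intervals; the study itself recorded HRV with the Polar H9 sensor and the Elite HRV application, not this code.

```python
import math

def sdnn(rr_ms):
    """SDNN: sample standard deviation of all RR intervals (ms)."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / (len(rr_ms) - 1))

def rmssd(rr_ms):
    """RMSSD: root mean square of successive RR-interval differences (ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))

# Hypothetical RR intervals in milliseconds, for illustration only.
rr = [812, 798, 825, 804, 790, 816]
```

Lower SDNN and RMSSD values, as reported in the hypertensive group, indicate reduced beat-to-beat variability and hence diminished parasympathetic modulation.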

12. Comparative Evaluation of Thyroid Hormone Status in Patients with Acute Coronary Syndrome: A Cross-Sectional Study
Rashmita A. Ramani, Nimit A. Hinsu, Happy K. Chadsaniya, Kirit Sakariya, R. S. Trivedi
Abstract
Background: Acute coronary syndrome (ACS) is a major cause of morbidity and mortality worldwide and includes clinical conditions such as ST-elevation myocardial infarction (STEMI) and non-ST elevation myocardial infarction (NSTEMI). Thyroid hormones play a significant role in cardiovascular physiology by regulating myocardial contractility, heart rate, vascular resistance, and lipid metabolism. Alterations in thyroid hormone levels are frequently observed during acute systemic illnesses and may occur as part of Euthyroid Sick Syndrome (ESS). These hormonal changes may influence the clinical course and prognosis of patients with ACS. Aims and Objectives: To evaluate thyroid hormone status in patients with acute coronary syndrome and compare the thyroid profile between STEMI and NSTEMI patients. Setting and Design: This was a cross-sectional observational study conducted at P.D.U. Government Medical College and Civil Hospital, Rajkot, Gujarat, India. Materials and Methods: A total of 100 patients diagnosed with acute coronary syndrome were included in the study. Serum levels of free triiodothyronine (fT3), free thyroxine (fT4), and thyroid stimulating hormone (TSH) were measured within 24 hours of hospital admission using the ELISA method. Statistical analysis was performed using Chi-square test and unpaired t-test, and a p-value < 0.05 was considered statistically significant. Results: Among the 100 ACS patients studied, 67% had normal thyroid function, while 33% showed thyroid dysfunction. The most common abnormality was Euthyroid Sick Syndrome (17%), followed by subclinical hypothyroidism (11%) and subclinical hyperthyroidism (5%). Abnormal thyroid profiles were significantly more frequent in STEMI patients compared to NSTEMI patients (p = 0.006). However, comparison of mean fT3, fT4, and TSH levels between STEMI and NSTEMI groups did not show statistically significant differences (p > 0.05). 
Conclusion: Thyroid dysfunction is relatively common in patients with acute coronary syndrome, with Euthyroid Sick Syndrome being the most frequent abnormality. A significantly higher prevalence of thyroid abnormalities was observed among STEMI patients, suggesting greater physiological stress in these individuals. Routine evaluation of thyroid hormone status in ACS patients may aid in early risk assessment and clinical management.

13. Clinical Utility of HbA1c in Detecting Dyslipidemia among Patients with Type 2 Diabetes Mellitus in the Saurashtra Region of Gujarat
Happy Chadsaniya, Nimit A. Hinsu, Rashmita Ramani, R. S. Trivedi
Abstract
Background: Type 2 diabetes mellitus (T2DM) is frequently accompanied by dyslipidemia, a key factor in increasing cardiovascular risk. While HbA1c is commonly used to measure long-term glycemic control, this study explored whether HbA1c could also help detect lipid abnormalities in diabetic patients, providing a more comprehensive assessment of their cardiovascular health. Aims and Objectives: This study aimed to investigate the relationship between HbA1c levels and dyslipidemia markers in T2DM patients, to determine whether HbA1c could serve as a reliable biomarker for early detection of dyslipidemia. Methods and Materials: Participants with elevated HbA1c levels were included in the study. After obtaining informed consent, their medical histories were recorded. Diagnostic procedures included HbA1c measurement and a detailed lipid profile analysis. Statistical analysis, including unpaired t-tests, was performed to assess the correlation between HbA1c and dyslipidemia markers. Results: The findings showed a strong association between higher HbA1c levels and increased dyslipidemia markers, particularly elevated triglycerides and lower HDL cholesterol levels, supporting the hypothesis that HbA1c can indicate lipid abnormalities in diabetic patients (p < 0.05). Discussion: These results suggest that HbA1c not only reflects glycemic control but could also be a valuable tool in predicting lipid abnormalities. Using HbA1c as a dual-purpose marker could improve cardiovascular risk assessment and lead to more personalized care for T2DM patients. Conclusion: The study concluded that HbA1c is a promising biomarker for identifying dyslipidemia in T2DM patients. Routine inclusion of HbA1c in lipid screening could enhance early detection and management of cardiovascular risks in diabetic care.
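The unpaired t-test named in the methods above can be sketched in pure Python. Shown here is Welch's variant, which does not assume equal group variances; the abstract does not state which variant the study used, so this is an assumption for illustration.

```python
import math

def unpaired_t(a, b):
    """Welch's unpaired t statistic for two independent samples.

    Returns the t statistic only; the p-value would additionally require
    the Welch-Satterthwaite degrees of freedom and a t distribution.
    """
    def mean(x):
        return sum(x) / len(x)

    def var(x):  # sample variance (n - 1 denominator)
        m = mean(x)
        return sum((v - m) ** 2 for v in x) / (len(x) - 1)

    se = math.sqrt(var(a) / len(a) + var(b) / len(b))
    return (mean(a) - mean(b)) / se
```

A positive t indicates that the first group's mean (e.g. triglycerides in the high-HbA1c group) exceeds the second's; identical samples give t = 0.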

14. Acute and Subacute Effects of Spinal Subarachnoid Block on Intraocular Pressure: A Comparative Study
Trishna Sahu, Aparajita Banerjee, Meenakshi Pandey, Ambika Prasad Panda
Abstract
Background & Aim: Subarachnoid block is a commonly used anaesthetic technique for many infra-umbilical surgical procedures. It can result in complications such as hypotension, bradycardia, local anaesthetic toxicity, post-dural-puncture headache, backache, and nerve damage. The prevention and treatment of these complications are well documented, but the effects of the block on intraocular pressure (IOP) have not been well studied. The aim of our study was to assess the effects of spinal anaesthesia on intraocular pressure. Material and Methods: Fifty patients posted for infra-umbilical surgery under subarachnoid block were included in the study. Intraocular pressure was measured prior to spinal anaesthesia (PS), 20 minutes after spinal anaesthesia (AS), and finally on the first postoperative day (POD1), and the values were compared. Both eyes of each patient were included in the study. Hemodynamic and block characteristics were monitored and compared. Results: Mean intraocular pressure was 17.9±3.53 mm Hg prior to anaesthesia, 15.77±2.82 mm Hg 20 minutes after spinal anaesthesia, and 16.83±3.39 mm Hg on the first postoperative day; the differences among these values were not statistically significant. Conclusions: Spinal anaesthesia may produce a decrease in IOP, which may result from a decrease in mean arterial pressure.

15. Metabolic Abnormalities in Bipolar Disorder: A Clinical and Biochemical Analysis
Md. Shahnwaz, Anjana Kumari, Sukant Shekhar, Arati Shivhare
Abstract
Background: Bipolar disorder is associated with an increased risk of metabolic abnormalities, contributing to significant morbidity and premature mortality. Aim: To evaluate metabolic abnormalities and their clinical correlates in patients with bipolar disorder. Methods: This cross-sectional study included 61 drug-free patients diagnosed with bipolar disorder as per ICD-10 criteria at a tertiary care center in Bihar, India. Anthropometric parameters, blood pressure, fasting blood glucose, and lipid profile were assessed. Clinical variables were evaluated using the Young Mania Rating Scale (YMRS) and Hamilton Depression Rating Scale (HAM-D). Results: Metabolic syndrome was identified in 39.3% of participants. Patients with metabolic abnormalities had significantly higher waist circumference, blood pressure, fasting blood glucose, triglyceride levels, and low-density lipoprotein cholesterol, along with lower high-density lipoprotein cholesterol (p < 0.05). A significant association was observed between metabolic abnormalities and the number of lifetime manic episodes. Conclusion: Metabolic abnormalities are highly prevalent in bipolar disorder and are associated with illness severity. Routine metabolic screening should be incorporated into standard psychiatric care.

16. Tonsillectomy and its Effect on ASO Titre: A Hospital Based Study
Sweta Kumari, Manoj Kumar, Md Ozair, Rani Rashmi Priya
Abstract
Background: Acute tonsillitis is one of the most common manifestations of upper respiratory tract infection. It is common in children, with an incidence of about 32 per 1000 patients per year. The objective of this study was to determine the effect of tonsillectomy on ASO titre and to evaluate the sensitivity and specificity of throat swab culture. Methods: In this prospective study, a total of 50 children were screened, of whom 25 patients under the age of 15 years (16 male and 9 female) with chronic tonsillitis and a raised anti-streptolysin O titre (>200 IU/mL) were included. All patients underwent tonsillectomy, and serological estimation of ASO titre was performed at the end of the first, second and third months post-surgery. Throat swab culture was performed prior to tonsillectomy and at the third month of follow-up. Results: Twelve children (48%), twenty children (80%) and twenty-two children (88%) became serologically negative for ASO antibody at the end of the first, second and third months respectively, with a statistically significant p value of 0.0001. The sensitivity and specificity of throat swab culture were 16% and 100% respectively. Conclusions: Tonsillectomy has a significant role in reducing the serological levels of anti-streptolysin O antibody and its reactivation, thereby decreasing the rate of complications associated with Group A beta-haemolytic streptococci.

17. Evaluation of Anaemia Profile in CKD Patients and Its Correlation with Erythropoietin Levels
Mohammed Abdul Salam Haroon Rashid Tamboli, Mahesh Balkishan Soni
Abstract
Background: Anaemia is a common and early complication of chronic kidney disease (CKD), primarily attributed to reduced erythropoietin (EPO) production. The severity of anaemia increases with disease progression and contributes significantly to morbidity and mortality. This study aimed to assess the anaemia profile in CKD patients and evaluate its correlation with serum erythropoietin levels. Materials and Methods: A cross-sectional study was conducted on 120 CKD patients in the Department of Medicine at Parbhani Medical College and Hospital, Parbhani, Maharashtra. Haematological parameters including haemoglobin (Hb), haematocrit (Hct), red cell indices, serum iron, ferritin, and total iron-binding capacity (TIBC) were assessed. Serum erythropoietin levels were measured using ELISA. CKD staging was done based on estimated glomerular filtration rate (eGFR). Statistical analysis included ANOVA and Pearson correlation. Results: The mean haemoglobin levels significantly decreased with advancing CKD stages (p<0.001). Normocytic normochromic anaemia was the predominant type (68%). Serum erythropoietin levels were inappropriately low relative to the degree of anaemia. A significant positive correlation was observed between Hb and EPO levels (r=0.62, p<0.001), while an inverse correlation was found between CKD stage and Hb levels. Conclusion: Anaemia in CKD is predominantly due to inadequate erythropoietin production. Early detection and monitoring of EPO levels along with haematological parameters are crucial for timely management and prevention of complications.

18. Social Media Addiction and Self Esteem among Adolescent Students in Srikakulam, India
Ch. Krishna Deepak, V. Padma, D. Vijaya Lakshmi, T. Akhila
Abstract
Background: Adolescence is a critical developmental stage characterized by rapid physical, emotional, and social changes. During this period, individuals begin to form their identity, develop interpersonal relationships, and shape their self-concept. Social media use is widespread among adolescents, and excessive engagement has been associated with adverse psychological outcomes, including behavioral addiction and impaired well-being. One of the important psychological factors influenced by social media use is self-esteem. Aim: This paper aims to assess social media addiction and self-esteem among adolescent students in Srikakulam, India, and the relationship between them. Material and Methods: A cross-sectional observational study was conducted among 200 adolescent students from classes 9th, 10th, Intermediate, and first-year MBBS in Srikakulam district, Andhra Pradesh. Participants were selected using a cluster-based sampling method. Sociodemographic details were collected, and social media addiction and self-esteem were assessed using the Bergen Social Media Addiction Scale and Rosenberg Self-Esteem Scale, respectively. Data were analyzed using descriptive statistics and Chi-square tests. Results: In this study of 200 adolescent students, the majority of participants (84%) were classified as low risk for social media addiction, while 12.5% were at risk and only 3.5% were in the high-risk category. Social media addiction was slightly more common among males (73 low risk, 15 at risk, 4 high risk) compared to females (95 low risk, 10 at risk, 3 high risk). However, age group and place of stay showed significant associations, with higher risk observed among older adolescents (>18 years) and those residing in urban areas (p = 0.003 and p = 0.03, respectively). With regard to self-esteem, the majority of students (90%) demonstrated average self-esteem, while 8% had high self-esteem and only 2% had low self-esteem. 
Conclusion: The present study concludes that while social media use is common among adolescent students, most adolescents are able to manage their usage without developing severe addiction, and their self-esteem levels remain largely stable. However, special attention should be given to older adolescents and those living in urban areas, as they may be more vulnerable to problematic social media use. Promoting digital awareness, balanced social media habits, and positive self-concept among adolescents may help prevent potential negative psychological effects in the future.

19. Pattern of Febrile Illness in Children Admitted to Pediatric Ward
Dasari Mounika, Mamidi Akhilesh, Sravan Kumar Kusuma
Abstract
Background: Fever is one of the most common reasons for pediatric hospital admission, with varied etiologies ranging from self-limiting viral illnesses to severe life-threatening infections. Aim: To study the pattern and etiological distribution of febrile illnesses among children admitted to the paediatric ward. Methods: This prospective observational study was conducted from January to June 2025 and included 120 children aged 1 month to 12 years admitted with fever. Detailed clinical evaluation and relevant laboratory investigations were performed. Data were analyzed using descriptive statistics and appropriate tests of significance. Results: The majority of children were aged 1–5 years (43.3%) with male predominance (58.3%). Acute respiratory infections (26.7%) were the most common cause, followed by acute gastroenteritis (15.0%), dengue (11.7%), and enteric fever (10.0%). Vector-borne diseases accounted for 26.7% of cases. Laboratory findings revealed anemia (31.7%), thrombocytopenia (18.3%), elevated CRP (48.3%), and liver enzyme derangement (21.7%). Most children recovered (91.7%), while 6.7% required intensive care; mortality was 0.8%. Conclusion: Infectious diseases, particularly respiratory and vector-borne illnesses, remain leading causes of paediatric febrile admissions, emphasizing the need for early diagnosis and timely management.

20. A Study on Arrhythmic Manifestations During the Acute Stage of Myocardial Infarction
Sasumana Ravi Kumar, Narisetty Vijay Prem Chand, Thadisetty Lilly Pushpa, Vasa Vijaya Kumar
Abstract
Background: Acute Myocardial Infarction (AMI) remains a major cause of morbidity and mortality worldwide. Cardiac arrhythmias are among the most frequent and potentially life-threatening complications occurring during the acute phase of myocardial infarction. Early identification and management of these rhythm disturbances are essential to improve patient outcomes. Objective: To study the arrhythmic manifestations occurring during the acute stage (first week) of myocardial infarction and to evaluate their clinical significance and impact on patient outcomes. Methods: This prospective observational study was conducted at a tertiary care hospital, Vijayawada, Andhra Pradesh, including 50 patients diagnosed with AMI (both STEMI and NSTEMI) admitted within 24 hours of symptom onset between January 2025 and January 2026. Patients were monitored clinically and with electrocardiography for the occurrence of arrhythmias during hospitalization. The incidence, type, and timing of arrhythmias were recorded and correlated with the type of infarction and in-hospital outcomes. Results: Arrhythmias were observed in 60% of patients during the acute phase of AMI. The majority occurred within the first 24 hours of admission. The most common arrhythmia was ventricular premature complexes (33.3%), followed by atrial fibrillation (20%), ventricular tachycardia (16.7%), and ventricular fibrillation (10%). Arrhythmias were more frequent in ST-elevation myocardial infarction (STEMI) compared to non-ST-elevation myocardial infarction (NSTEMI). Patients who developed arrhythmias had higher rates of complications, including heart failure (33.3%), cardiogenic shock (20%), and sudden cardiac death (10%). Mortality was higher among patients with arrhythmias (16.7%) compared to those without arrhythmias (5%). Conclusion: Arrhythmias are common during the acute stage of AMI, particularly within the first 24 hours. 
Ventricular premature complexes were the most frequently observed rhythm disturbance, while ventricular tachycardia and fibrillation were associated with increased mortality. Continuous cardiac monitoring and prompt management of arrhythmias are crucial in reducing complications and improving survival in patients with acute myocardial infarction.

21. Microbiological Profile of Bloodstream Infections and Its Correlation with Biochemical Inflammatory Markers C – reactive protein, Procalcitonin and Serum Lactate in Suspected Sepsis Patients
Nirmalkumar A. Shah, Aruna V. Gautam, Parin N. Shah
Abstract
Bloodstream infections are a major cause of morbidity and mortality among hospitalized patients. While blood culture is the reference method for confirming infection, detection may be slow and sometimes insensitive. Evaluating microbiological isolates alongside inflammatory biomarkers including C-reactive protein, procalcitonin, and lactate may support diagnosis and improve management. Objectives: To evaluate the microbiological profile of bloodstream infections and correlate blood culture positivity with biochemical inflammatory markers in suspected sepsis patients. Methods: A prospective cross-sectional observational study was conducted at a tertiary care teaching hospital over a period of 18 months. Adult patients with clinical suspicion of bloodstream infection and for whom blood cultures were obtained were included at the time of first sampling. Demographic and relevant clinical data were collected from medical records. Serum levels of C-reactive protein, procalcitonin, and lactate measured on the day of blood culture collection were recorded from the biochemistry laboratory database. Blood culture results, identification of microbial isolates, and corresponding antibiotic susceptibility patterns were obtained from the microbiology laboratory. Levels of the biochemical inflammatory markers were compared between culture-positive and culture-negative groups. Among culture-positive cases, the relationship between biomarker levels and microbial characteristics, including antibiotic resistance patterns, was further evaluated. Statistical analysis was performed using appropriate statistical tests, and a p-value of less than 0.05 was considered statistically significant. Results: A total of 150 patients with clinical suspicion of bloodstream infection were enrolled in the study. Blood cultures yielded microbial growth in 32 (21.3%) patients, while 118 (78.7%) samples showed no growth. 
Among the positive cultures, Gram-negative organisms accounted for the majority of isolates, followed by Gram-positive bacteria. Inflammatory biomarker levels were notably higher in patients with positive blood cultures compared with those with negative results. The mean level of C-reactive protein in culture-positive patients was 85.6 ± 27.9 mg/L, significantly greater than 43.5 ± 18.8 mg/L observed in culture-negative patients (p < 0.001). Similarly, mean procalcitonin levels were markedly elevated in culture-positive patients (5.7 ± 2.5 ng/mL) compared with culture-negative patients (1.3 ± 0.9 ng/mL) (p < 0.001). Mean serum lactate levels were also higher in the culture-positive group (3.7 ± 1.2 mmol/L) compared with the culture-negative group (2.0 ± 0.8 mmol/L) (p < 0.001). Among the culture-positive isolates, 17 (53.1%) were identified as multidrug-resistant organisms based on antimicrobial susceptibility testing, whereas 15 (46.9%) were non-multidrug-resistant strains. Patients with multidrug-resistant infections demonstrated higher levels of inflammatory biomarkers compared with those infected by susceptible organisms. The mean C-reactive protein level in multidrug-resistant infections was 100.8 ± 23.5 mg/L compared with 68.4 ± 19.1 mg/L in non-resistant infections (p = 0.003). Procalcitonin levels were also higher in multidrug-resistant infections (7.2 ± 2.0 ng/mL) compared with sensitive isolates (4.0 ± 1.5 ng/mL) (p = 0.002). Serum lactate values followed a similar trend, showing greater elevation in multidrug-resistant infections. Overall, increased levels of these biochemical inflammatory markers were significantly associated with blood culture positivity and antimicrobial resistance patterns among the isolated microorganisms. Conclusion: Bloodstream infections remain a significant cause of morbidity in patients with suspected sepsis. 
This study showed that patients with culture-positive infections had markedly higher levels of C-reactive protein, procalcitonin, and serum lactate compared with culture-negative cases. Elevated biomarker levels were also associated with multidrug-resistant infections, indicating that these markers may assist in early diagnosis and support timely clinical decision-making in suspected bloodstream infections.

22. Clinicopathological Profile of Ocular Tumors: A Retrospective Study
Anoop Kumar, Jai Prakash Srivastava, Vijay Kumar Srivastava, Nandini Srivastava
Abstract
Background: Ocular tumors encompass a wide spectrum of benign and malignant lesions with varied clinical presentation and prognosis. Clinicopathological evaluation plays a crucial role in accurate diagnosis and management. This study aimed to analyze the clinicopathological profile of ocular tumors in a tertiary care setting. Material and Methods: A retrospective observational study was conducted over five years, including 102 histopathologically confirmed cases of ocular tumors. Demographic details, anatomical site, and histopathological findings were recorded. Tumors were classified as benign or malignant. Statistical analysis was performed using descriptive statistics and Chi-square test, with p <0.05 considered significant. Results: Out of 102 cases, the majority were in the 41–60 years age group (29.4%) with male predominance (56.9%). The eyelid was the most commonly involved site (35.3%), followed by conjunctiva (23.5%), orbit (19.6%), and intraocular region (17.6%). Benign tumors constituted 62.7% of cases, while malignant tumors accounted for 37.3%. Among benign lesions, dermoid cysts (13.7%) and nevi (11.8%) were most frequent. Sebaceous gland carcinoma (9.8%) was the most common malignant tumor. A statistically significant association was observed between tumor behavior and anatomical site (p = 0.032), with intraocular tumors showing a higher proportion of malignancy. Conclusion: Benign ocular tumors predominate; however, a considerable proportion of malignant lesions exists, particularly in intraocular locations. Site-specific variations in tumor behavior underscore the importance of early diagnosis and histopathological evaluation for optimal management.

23. Comparative Therapeutic Outcomes of Cefixime versus Amoxicillin-Clavulanic Acid in Community-Acquired Bacterial Infections
Surabhi Arora, Anshita Arora
Abstract
Background: Community-acquired bacterial infections are a major cause of morbidity and require effective oral antibiotic therapy. This study compared the therapeutic efficacy and safety of cefixime versus amoxicillin-clavulanic acid in such infections. Material and Methods: A prospective, randomized, open-label study was conducted among 220 adult patients with community-acquired bacterial infections. Participants were equally allocated to receive either cefixime (400 mg once daily) or amoxicillin-clavulanic acid (625 mg thrice daily) for 7–10 days. Clinical response, time to symptom resolution, microbiological eradication, recurrence, and adverse events were assessed. Statistical analysis was performed using standard methods with a significance level of p<0.05. Results: Baseline characteristics were comparable between groups. Clinical cure was achieved in 89.1% of patients in the cefixime group and 85.5% in the amoxicillin-clavulanic acid group (p=0.41). Clinical improvement was observed in 7.3% and 9.1%, while treatment failure occurred in 3.6% and 5.5% of patients, respectively. Mean time to symptom resolution was 3.6 ± 1.1 days for cefixime and 3.9 ± 1.2 days for amoxicillin-clavulanic acid (p=0.07). Microbiological eradication rates were 86.7% and 82.8%, respectively (p=0.54). Recurrence rates were low in both groups (4.5% vs 6.4%, p=0.55). Adverse events were significantly lower with cefixime (10.9%) compared to amoxicillin-clavulanic acid (20.0%) (p=0.048). Conclusion: Cefixime demonstrated comparable efficacy with a better safety profile than amoxicillin-clavulanic acid, making it a suitable alternative for community-acquired bacterial infections.

24. Understanding Dengue Awareness and Preventive Practices Among Urban Population of Saurashtra, Gujarat
Hardikkumar Bharatbhai Kalariya, Chaitanyakumar Mahadevbhai Aghara, Parth M. Maheta
Abstract
Background: Dengue is a major public health problem in urban India, with recurrent outbreaks causing significant morbidity. Understanding community knowledge, attitude, and practices (KAP) is essential for effective prevention and control. Objective: To assess the knowledge, attitude, and practices regarding dengue among adults residing in the Urban Health Training Centre (UHTC) area of Jamnagar city, Western Gujarat, and to determine the association between knowledge, attitude, and practices. Methods: A community-based cross-sectional study was conducted over two months among adults aged ≥18 years residing in the UHTC area of Jamnagar. A total of 500 eligible participants were included using house-to-house visits, with more than one eligible adult interviewed per household when available. Data were collected using a pretested structured questionnaire covering socio-demographic variables and KAP related to dengue. Knowledge, attitude, and practice scores were categorized using median cut-off values. Data were analysed using descriptive statistics and Chi-square test. Results: Among the respondents, 47.0% demonstrated good knowledge, 90.4% had a good attitude, and 55.6% exhibited good preventive practices regarding dengue fever. High awareness was observed regarding mosquito breeding in stagnant water and common dengue symptoms. However, misconceptions related to disease transmission and reliance on fogging alone were noted. A statistically significant association was found between knowledge and attitude (p = 0.01), while the association between attitude and practice was not statistically significant (p = 0.29). Conclusion: Despite a positive attitude towards dengue prevention, gaps in knowledge and preventive practices persist among community members. Strengthening targeted health education and behaviour change communication interventions is essential to translate awareness into effective preventive practices.
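The median cut-off categorization of KAP scores described in the methods can be sketched as follows. This is an illustrative Python snippet with hypothetical scores and a hypothetical function name, not the study's actual analysis code:

```python
import statistics

def categorize_by_median(scores):
    """Label each respondent 'good' or 'poor' using the sample median
    as the cut-off, as described for the KAP scores (illustrative only)."""
    cut = statistics.median(scores)
    return ["good" if s >= cut else "poor" for s in scores]

# Hypothetical knowledge scores for six respondents
labels = categorize_by_median([4, 7, 9, 5, 8, 6])
```

A Chi-square test of association (e.g., knowledge category versus attitude category, as reported with p = 0.01) would then be run on the resulting cross-tabulation.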

25. Serum Ferritin as a Predictor of Disease Severity and Platelet Transfusion Requirement in Dengue: A Prospective Observational Study
Nilesh Kumar Patira, Nirali Salgiya, Jamil Mohammad, Dhairya Upadhyay
Abstract
Background: Dengue infection presents with a wide clinical spectrum ranging from mild febrile illness to severe disease with bleeding and shock. Platelet count alone is an unreliable predictor of disease severity and transfusion need. Serum ferritin, an acute-phase reactant reflecting macrophage activation and immune dysregulation, may serve as a robust biomarker for predicting severe dengue. Objectives: To evaluate serum ferritin levels as a predictor of disease severity and platelet transfusion requirement in patients with dengue infection. Methods: This prospective observational study was conducted in a tertiary care hospital and included 153 adult patients with laboratory-confirmed dengue infection and complete serum ferritin data. Serum ferritin was measured at admission, platelet counts were monitored serially, and the requirement for platelet transfusion during hospitalization was recorded. Receiver operating characteristic (ROC) curve analysis was performed to assess the predictive performance of serum ferritin. Results: Of the 153 patients, 48 (31.4%) required platelet transfusion. Median serum ferritin levels were markedly higher in patients requiring platelet transfusion compared with those who did not (≥2000 ng/mL vs 430 ng/mL). ROC analysis demonstrated excellent predictive performance of serum ferritin for platelet transfusion requirement (AUC 0.885). A ferritin cutoff of approximately 791 ng/mL predicted platelet transfusion with 94% sensitivity and 75% specificity. Conclusion: Serum ferritin is a readily available biomarker strongly associated with disease severity and platelet transfusion requirement in dengue. Incorporation of ferritin into routine clinical assessment may improve early risk stratification and promote rational blood product utilization.
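The reported AUC has a simple rank-based interpretation: the probability that a randomly chosen transfused patient has higher admission ferritin than a randomly chosen non-transfused patient. A minimal sketch of this empirical AUC calculation, using made-up ferritin values (not the study's data):

```python
def auc_from_groups(positives, negatives):
    """Empirical ROC AUC: fraction of (positive, negative) pairs in which
    the positive case's value is higher (ties count half).
    Illustrative only; the values below are hypothetical."""
    wins = 0.0
    for p in positives:
        for n in negatives:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(positives) * len(negatives))

# Hypothetical admission ferritin (ng/mL): transfused vs non-transfused patients
auc = auc_from_groups([2100, 1800, 700, 2500], [430, 600, 820, 300, 510])
```

Sweeping a cutoff over such values and recording sensitivity/specificity at each point yields the full ROC curve from which the reported 791 ng/mL threshold would be chosen.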

26. Continuous Femoral Nerve Block with Equipotent Doses of Bupivacaine and Ropivacaine for Postoperative Analgesia after Unilateral Total Knee Replacement: A Comparative Study
Arun Aravind, Jasir P., Silpa A. R.
Abstract
Background: Postoperative pain after total knee replacement (TKR) can delay early mobilisation and rehabilitation. Continuous femoral nerve block is an effective technique for postoperative analgesia. Ropivacaine, with lower cardiotoxic potential and reduced motor blockade, may offer advantages over bupivacaine at equipotent concentrations. Objective: To compare the efficacy and safety of equipotent doses of ropivacaine 0.2% and bupivacaine 0.125% administered through ultrasound-guided continuous femoral nerve block for postoperative analgesia after unilateral total knee replacement. Methods: This prospective observational cohort study included 80 ASA I–II patients undergoing elective unilateral TKR under subarachnoid block. Patients were divided into two groups: Group R received 0.2% ropivacaine and Group B received 0.125% bupivacaine via femoral nerve catheter. A 20 ml bolus followed by continuous infusion at 5 ml/hr was administered for 48 hours. Postoperative pain was assessed using the Numerical Rating Scale (NRS) at specified intervals up to 48 hours. Motor blockade was evaluated using the Modified Bromage Scale. Rescue analgesic consumption and adverse effects were recorded. Results: Both groups were comparable with respect to demographic parameters. NRS pain scores at all postoperative time intervals were similar between the two groups (p > 0.05). Rescue analgesic consumption did not differ significantly. At 24 hours, motor blockade was significantly less in Group R compared to Group B (p = 0.014). No motor blockade was observed at 48 hours in either group. No significant adverse effects were noted. Conclusion: Ultrasound-guided continuous femoral nerve block using 0.2% ropivacaine provides postoperative analgesia comparable to 0.125% bupivacaine following total knee replacement, with the advantage of reduced motor blockade. Ropivacaine may therefore be a preferable agent for facilitating early postoperative rehabilitation.

27. Radial Shortening as a Treatment Modality in an Advanced Stage of Lunatomalacia in Tertiary Care Centre of Western Rajasthan
Akhilesh Kumar Sharma, Raghuveer Meena, Anshul Meena, Ajay Gupta
Abstract
Introduction: Lunatomalacia, or osteonecrosis of the lunate, can lead to chronic, debilitating wrist pain. This study was conducted to evaluate the risks and benefits of radial shortening in advanced-stage lunatomalacia at a tertiary care centre in Western Rajasthan. Materials and Methods: This hospital-based observational study was conducted between April 2016 and May 2022. A total of 24 cases that had undergone radial shortening for the treatment of lunatomalacia were included in our study. Cases were clinically evaluated for pain, range of motion and grip strength. Pain was quantified by the Visual Analogue Scale (VAS), range of motion was measured with a goniometer, and grip strength was measured relative to the contralateral side with the help of a dynamometer. Data were analysed using Microsoft Excel 2019. Results: The mean age of cases was 33.83 ± 11.65 years. The male to female ratio was 2:1. Range of motion, grip strength, wrist extension and wrist flexion in the operated hand increased after the procedure, and this improvement was statistically significant (p < 0.05). Postoperative VAS and DASH scores significantly decreased after radial shortening (p < 0.05), and the NKSS increased significantly postoperatively (p < 0.05). Conclusion: Radial shortening is a simple and reproducible procedure with a low complication rate. We conclude that radial shortening is an effective procedure with respect to functional improvement in the treatment of patients with lunatomalacia even in advanced stages 3A, 3B and 4; however, radiological improvement is mild.

28. An Observational Study of the Origin and Course of Vertebral Artery in Indian Cadavers
Nakul Choudhary, Rakesh Ranjan
Abstract
Background: The vertebral artery is a vital component of the posterior circulation of the brain, exhibiting considerable anatomical variations in its origin and course. These variations hold significant clinical importance in diagnostic, surgical, and interventional procedures involving the head, neck, and cervical spine. The present observational study was conducted to analyze the origin and course of the vertebral artery in Indian cadavers. Methods: A total of 42 embalmed adult cadavers were dissected in the Department of Anatomy at GMC, Purnea. The vertebral arteries were carefully exposed on both sides, and their origin, level of entry into the transverse foramina, and course through the cervical vertebrae were examined and documented. Any deviations from the typical origin (from the subclavian artery) and standard entry at the level of the sixth cervical vertebra were recorded. Results: The study revealed that while the majority of vertebral arteries originated from the subclavian artery and entered the transverse foramen at the level of C6, notable variations were observed. These included origin directly from the aortic arch, entry at higher cervical levels such as C5 or C4, and asymmetry between the right and left sides. Such variations may have embryological significance and potential clinical implications, particularly in angiographic interpretation, cervical spine surgeries, and vascular interventions. Conclusion: Awareness of these anatomical variations is crucial for clinicians to avoid complications during surgical and radiological procedures. The findings of this study contribute to the existing anatomical knowledge and emphasize the need for careful preoperative evaluation of vertebral artery anatomy.

29. Pericapsular Precision Versus Compartmental Analgesia: Ultrasound- Guided PENG Block Compared with Fascia Iliaca Block in Patients Undergoing Hip Surgery Under Spinal Anaesthesia
Sri Satya Yeleswarapu, Kota Aditya
Abstract
Background: Effective analgesia before and after hip surgery is essential to facilitate positioning for spinal anaesthesia and improve postoperative recovery. This study compared ultrasound-guided pericapsular nerve group (PENG) block with fascia iliaca compartment block (FICB) in patients undergoing elective hip surgery. Methods: In this prospective randomized double-blind study, 50 patients scheduled for elective hip surgery under spinal anaesthesia were allocated into two equal groups. Group F received ultrasound-guided FICB and Group P received ultrasound-guided PENG block using 20 mL of 0.5% ropivacaine. Pain was assessed using the Numerical Rating Scale before block, 30 minutes after block, and postoperatively up to 24 hours. Ease of spinal positioning, time to first rescue analgesia, rescue analgesic consumption, and haemodynamic variables were also evaluated. Results: Baseline demographic and haemodynamic characteristics were comparable between the groups. Pain scores were significantly lower in Group P at 30 minutes and during the early postoperative period up to 8 hours. Group P also showed better ease of positioning, longer time to first rescue analgesia, and lower rescue analgesic consumption over 24 hours. Haemodynamic parameters were stable and similar in both groups. Conclusion: Ultrasound-guided PENG block provided superior early postoperative analgesia and improved positioning for spinal anaesthesia compared with FICB in elective hip surgery patients.

30. Blood culture contamination reduction, quality improvement in NICU, MGM Hospital, Warangal
Kagithapu Surender, Mohan Amgothu, G. Karunakar, Ayesha Begum, P. Srivani, Ragha Sanjana K.
Abstract
Introduction: Neonatal sepsis is a leading cause of neonatal mortality, especially in LMICs, with the highest incidence in the Indian subcontinent. Blood culture (BC) is the gold standard for diagnosis but has limitations such as low positivity, false positives, and delays. This study aimed to reduce BC contamination using the Plan-Do-Study-Act (PDSA) cycle. Methods: This prospective study included all newborns admitted to the neonatal intensive care unit (NICU), excluding those whose parents were uncooperative. A research team collaborated with microbiology experts to understand and reduce BC contamination using a three-cycle PDSA approach. Training, protocol reinforcement, and supply monitoring improved compliance, significantly reducing BC contamination and leading to large-scale implementation of the intervention. Results: The study demonstrated a significant reduction in BC contamination rates from 10.7% (20) to 1.9% (5) through PDSA cycles. Compliance with hand hygiene, personal protective equipment (PPE) usage, antiseptic application, and proper sample collection improved notably. These structured interventions led to enhanced infection control, minimizing unnecessary antibiotic use and hospital stays in the NICU. Conclusion: Implementation of a structured PDSA cycle-driven intervention significantly improved adherence to aseptic BC collection techniques, reducing contamination rates. The intervention emphasized staff training, supply monitoring, and adherence to standard protocols, leading to sustained improvements.

31. Retroverted Uterus Revisited: Anatomical and Hemodynamic Insights into Primary Infertility
Sangita Ashokrao Gore, Abhilasha Jain
Abstract
Background and Objective: The natural orientation of the uterus varies, with a retroverted position historically considered a benign physiological variant. However, its isolated impact on female reproduction remains debated. This study aims to evaluate the association between uterine position (anteverted versus retroverted) and primary infertility using detailed anatomical and sonographic assessments, in the absence of confounding pelvic pathologies. Methods: A prospective, cross-sectional observational study was conducted at a tertiary care center in Maharashtra from April 2023 to December 2024. The study included 90 women (N = 90) diagnosed with primary infertility. Participants underwent transvaginal sonography (TVS) during the early follicular phase. Spatial uterine orientation, cervico-uterine angle, cervical canal length, and uterine artery Doppler indices (Pulsatility Index [PI] and Resistance Index [RI]) were measured. Participants with secondary infertility, male factor infertility, or severe pelvic pathologies like deep endometriosis were excluded. Results: Of the 90 participants, 52 (57.8%) exhibited an anteverted uterus, while 38 (42.2%) had a retroverted uterus, a prevalence notably higher than in the general fertile population. The retroverted group demonstrated a significantly sharper mean cervico-uterine angle (102.4° ± 12.1° vs. 134.6° ± 10.5°, p < 0.001) and an elongated cervical canal (3.6 ± 0.4 cm vs. 3.2 ± 0.3 cm, p < 0.001). Doppler analysis revealed significantly elevated vascular resistance in the retroverted cohort (Mean PI: 2.84 vs. 2.32, p < 0.001; Mean RI: 0.88 vs. 0.76, p < 0.001). Furthermore, women with a retroverted uterus reported higher incidences of severe dysmenorrhea (p = 0.012) and were significantly more likely to experience prolonged infertility exceeding 4 years (p = 0.038). Conclusion: An isolated retroverted uterus is significantly associated with primary infertility. 
The altered cervical geometry and compromised uterine hemodynamics observed in retroverted uteri may serve as mechanical and physiological barriers to natural conception, suggesting it should be evaluated as a clinically significant anatomical factor in routine fertility workups.

32. Comparison of Ultrasound-Guided Hydrodistension of the Shoulder Joint by Anterior Versus Posterior Approach in Primary Adhesive Capsulitis
Akash Yadav, Deepak Kumar Saini, Parth Kaushik, Siddhant Jain
Abstract
Background: Adhesive capsulitis, or frozen shoulder, presents significant functional impairment due to pain and stiffness. Ultrasound-guided hydrodistension has been recognized for its potential in treating this condition, but the optimal approach remains unclear. Methods: This prospective observational study at the Central Institute of Orthopaedics involved 40 patients with primary adhesive capsulitis, randomized into two groups to receive hydrodistension via either the anterior or posterior approach. Outcomes measured included the Visual Analog Scale (VAS) for pain and the degree of passive external rotation, assessed at baseline, 4 weeks, and 12 weeks post-intervention. Results: Both groups started with comparable pain levels and mobility restrictions; however, the anterior approach group showed greater improvements. At 12 weeks, the anterior group’s pain scores and external rotation were superior to those of the posterior group. Conclusion: The anterior approach to ultrasound-guided hydrodistension is more effective in managing pain and improving mobility in patients with adhesive capsulitis compared to the posterior approach.

33. Diagnostic Accuracy of Ultrasound-Guided Fine Needle Aspiration Cytology in Salivary Gland Lesions with Correlation to Histopathological Examination (HPE) Findings
Anil Kumar, Harendra Kumar, Asim Mishra
Abstract
Background: Salivary gland lesions comprise a diverse group of neoplasms with varying histopathological features and biological behavior. Accurate preoperative diagnosis is essential for appropriate management. Ultrasonography-guided fine needle aspiration cytology (USG-guided FNAC) has emerged as a valuable diagnostic tool, though its accuracy requires validation against histopathological examination (HPE), the gold standard. Aim: To evaluate the diagnostic accuracy of USG-guided FNAC in salivary gland lesions and to assess its correlation with histopathological findings. Materials and Methods: This prospective observational study was conducted in the Department of Pathology at Anugrah Narayan Magadh Medical College & Hospital, Gaya, over a period of two years (January 2024 to December 2025). A total of 60 patients with clinically suspected salivary gland neoplasms were included. All patients underwent USG-guided FNAC followed by surgical excision and histopathological examination. Diagnostic performance parameters such as sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and overall accuracy were calculated. Statistical analysis was performed using SPSS version 27.0, with a p-value <0.05 considered significant. Results: Out of 60 cases, females constituted 55.0% and males 45.0%, with no significant association between age and gender (p = 0.85). FNAC diagnosed 56.7% cases as benign and 43.3% as malignant, while histopathology confirmed 58.3% benign and 41.7% malignant lesions. Pleomorphic adenoma was the most common benign tumor, and mucoepidermoid carcinoma was the most common malignant tumor. Cytohistopathological correlation showed a highly significant association (χ² = 41.52, p < 0.001). FNAC demonstrated 18 true positives, 37 true negatives, 3 false positives, and 2 false negatives, indicating high diagnostic accuracy. 
Conclusion: USG-guided FNAC is a reliable, minimally invasive, and cost-effective diagnostic modality with high accuracy in differentiating benign and malignant salivary gland lesions. Its strong correlation with histopathological findings supports its role as an effective preoperative diagnostic tool.
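The diagnostic performance figures in this abstract follow directly from the reported confusion matrix (18 true positives, 37 true negatives, 3 false positives, 2 false negatives). A minimal sketch of the standard calculations, using only values stated in the abstract:

```python
# Diagnostic performance of USG-guided FNAC, computed from the confusion
# matrix reported in the abstract; the formulas are the standard definitions.
tp, tn, fp, fn = 18, 37, 3, 2  # true/false positives and negatives

sensitivity = tp / (tp + fn)                # true positive rate, 18/20
specificity = tn / (tn + fp)                # true negative rate, 37/40
ppv = tp / (tp + fp)                        # positive predictive value, 18/21
npv = tn / (tn + fn)                        # negative predictive value, 37/39
accuracy = (tp + tn) / (tp + tn + fp + fn)  # overall accuracy, 55/60

print(f"Sensitivity {sensitivity:.1%}, Specificity {specificity:.1%}")
print(f"PPV {ppv:.1%}, NPV {npv:.1%}, Accuracy {accuracy:.1%}")
```

This yields a sensitivity of 90.0%, specificity of 92.5%, PPV of 85.7%, NPV of 94.9%, and overall accuracy of 91.7%, consistent with the abstract's characterization of high diagnostic accuracy.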

34. Correlation between Bone Marrow Plasma Cell Morphology and Cytogenetic Abnormalities in Multiple Myeloma Patients
Anil Kumar, Madhurima Sinha, Asim Mishra
Abstract
Background: Multiple Myeloma (MM) is a plasma cell malignancy characterized by clonal proliferation in the bone marrow, leading to anemia, bone lesions, hypercalcemia, renal dysfunction, and monoclonal protein production. Recent advances highlight the importance of cytogenetic abnormalities as key prognostic indicators. Aim: This study was undertaken to evaluate the relationship between bone marrow plasma cell morphology and cytogenetic abnormalities in MM patients. Materials and Methods: This hospital-based cross-sectional observational study included 100 diagnosed cases of multiple myeloma with available cytogenetic data. Bone marrow aspiration and biopsy samples were analyzed for plasma cell percentage, morphological subtype (plasmacytic, plasmablastic, mixed), and infiltration pattern (nodular, interstitial, diffuse). Cytogenetic evaluation was performed using Fluorescence In Situ Hybridization (FISH) and GTG banding. Statistical analysis was conducted using SPSS, and a p-value < 0.05 was considered significant. Results: A statistically significant association was observed between cytogenetic abnormalities and plasma cell characteristics (p < 0.05). The majority of cases demonstrated a high plasma cell burden (>50%), particularly in those with del(13q14.3) and complex karyotype. Plasmacytic morphology predominated in cases with normal cytogenetics and t(11;14), whereas plasmablastic morphology was more commonly associated with cytogenetic abnormalities, especially complex karyotypes. Diffuse bone marrow infiltration was the most common pattern (52%) and was predominantly associated with high-risk abnormalities such as del(17p13), t(4;14), and t(14;16). In contrast, t(11;14) was associated with nodular and interstitial patterns. Conclusion: Plasma cell morphology, marrow infiltration pattern, and cytogenetic abnormalities show a significant correlation in multiple myeloma. 
Their combined evaluation enhances understanding of disease biology and may improve prognostic stratification and clinical management.

35. A Morphometric and Morphological Analysis of the Bicipital Groove of the Humerus
Sigraf Tarannum, Amit Kumar Prasad, Umesh Prasad Sinha
Abstract
Background: The bicipital groove (intertubercular sulcus) of the humerus plays a crucial role in guiding and stabilizing the tendon of the long head of the biceps brachii. Variations in its morphology and morphometry are clinically significant, as they may influence tendon stability and predispose to various shoulder pathologies. Aim: To perform a detailed morphometric and morphological analysis of the bicipital groove and to evaluate side-wise variations and their clinical significance. Materials and Methods: This descriptive cross-sectional study was conducted on 90 adult dry human humeri obtained from the departmental osteology collection. Morphometric parameters, including length, width, depth, medial wall length, and lateral wall length, were measured using a digital vernier caliper. Morphological features such as supratubercular ridge, wall thickening, and bony spurs were assessed by visual inspection. The bicipital groove was classified based on opening angle, medial wall angle, and depth using established criteria. Data were analyzed using SPSS version 27.0. Continuous variables were expressed as mean ± standard deviation, while categorical variables were presented as frequency and percentage. Student’s t-test and Chi-square test were applied, with p < 0.05 considered statistically significant. Results: Most morphometric parameters showed no significant side-wise differences (p > 0.05), except for lateral wall length (p = 0.03), width (p = 0.01), and opening wall angle (p = 0.02), which demonstrated significant variation. The majority of specimens (64.4%) exhibited a moderate depth (4–6 mm), followed by deep (22.2%) and shallow (13.3%) grooves. The small opening angle category (≤95°) was the most common (42.2%), with no significant side-wise association (χ² = 0.32, p = 0.85).
Morphological variations were observed in 87.8% of specimens, with the supratubercular ridge (Meyer’s) being the most frequent finding (37.8%), followed by medial wall thickening (21.1%), lateral wall thickening (16.7%), and bony spurs (12.2%). No significant side-wise differences were noted for these variations (p > 0.05). Conclusion: The bicipital groove demonstrates overall bilateral symmetry with selective asymmetry in certain morphometric parameters. Moderate groove depth and smaller opening angles predominate, while morphological variations are common but symmetrically distributed. These findings have important clinical implications for understanding shoulder biomechanics, diagnosing tendon disorders, and guiding surgical interventions involving the proximal humerus.

36. A Morphometric Study of the Body, Pedicles, and Laminae of Typical Thoracic Vertebrae in Humans: A Cross-Sectional Study
Sigraf Tarannum, Amit Kumar Prasad, Umesh Prasad Sinha
Abstract
Background: The thoracic vertebrae play a crucial role in maintaining spinal stability, protecting the spinal cord, and facilitating movement. Detailed knowledge of their morphometric characteristics is essential for clinical applications, particularly in spinal instrumentation and surgical procedures. Aim: To perform a detailed morphometric analysis of the vertebral body, pedicle, and lamina of typical thoracic vertebrae and to evaluate bilateral symmetry of pedicle and laminar dimensions. Materials and Methods: This descriptive cross-sectional study was conducted on 110 dry human thoracic vertebrae. Measurements of the vertebral body, pedicle, and lamina were obtained using a digital Vernier caliper. Parameters assessed included anteroposterior diameter (APD), transverse diameter (TD), anterior height (AH), posterior height (PH), pedicle height and width, and lamina height and width. Data were analyzed using SPSS version 27.0. Descriptive statistics were expressed as mean ± standard deviation (SD), and comparisons between left and right sides were performed using the independent samples t-test, with p < 0.05 considered statistically significant. Results: The mean APD, TD, AH, and PH of the vertebral body were 18.1 ± 2.1 mm, 25.7 ± 9.1 mm, 16.4 ± 7.2 mm, and 17.3 ± 4.5 mm, respectively. The transverse diameter and anterior height showed greater variability compared to APD. Pedicle height and width demonstrated no statistically significant differences between the left and right sides (p = 0.72 and p = 0.81, respectively). Similarly, lamina height and width showed no significant bilateral differences (p = 0.96 and p = 0.74, respectively), indicating symmetrical morphology. Conclusion: The study highlights that while vertebral body dimensions exhibit variability, the pedicle and lamina demonstrate consistent bilateral symmetry.
These findings provide valuable anatomical data that can aid in surgical planning, spinal instrumentation, and the design of implants, thereby enhancing the safety and efficacy of thoracic spine procedures.

37. Prognostic Significance of Electrocardiographic Findings in Patients with Acute ST-Elevation Myocardial Infarction
Haresh Jilubhai Boghara, Prakashkumar Vejabhai Jadeja, Parekh Anish Mo Zakaria, Manish Juneja
Abstract
Background: Early electrocardiographic (ECG) findings are crucial for rapid risk stratification in ST-elevation myocardial infarction (STEMI), yet evidence from rural tertiary-care centers in India remains scarce. Parameters such as ischemia grade, rhythm disturbances, and conduction abnormalities may serve as predictors of short-term outcomes. This study aimed to assess the prognostic significance of admission ECG variables in STEMI patients within a rural Indian cohort. Methods: A single-center observational study was conducted over one year, enrolling 182 consecutive STEMI patients presenting to a tertiary hospital in rural India. Baseline clinical characteristics, admission ECG findings, angiographic data, and in-hospital outcomes were collected. Standard ECG definitions were applied, and ischemia severity was graded when appropriate. Associations between ECG features and in-hospital mortality were analyzed using p values. Results: Anterior and inferior STEMI were the most frequent presentations, with the majority of patients in sinus rhythm and within normal heart rate ranges at admission. QRS abnormalities, pathological Q waves, and grade 3 ischemia were noted in a subset of patients. In-hospital mortality was significantly higher among those with anterior wall involvement, tachycardia, atrial fibrillation/flutter, conduction disturbances, pathological Q waves, and grade 3 ischemia, all of which showed strong associations with adverse outcomes. Conclusion: Admission ECG features—particularly conduction abnormalities, arrhythmias, and ischemia grade—remain valuable predictors of in-hospital mortality in STEMI patients. These findings highlight the importance of comprehensive ECG assessment as a rapid, accessible, and cost-effective tool for early risk stratification in rural tertiary-care settings.

38. Cross-Sectional Study on Blood Coagulation Profile and Platelet Indices in Normal Term Pregnancy and Term pregnancy with Preeclampsia
Suchana Sinha, Raju Gopal Saha, Pradip Sarkar, Rajib Pal, Jhantu Kumar Saha
Abstract
Background: Preeclampsia is a significant hypertensive disorder of pregnancy associated with endothelial dysfunction, altered coagulation, and platelet abnormalities. These haemostatic changes increase the risk of maternal and fetal complications. Evaluating coagulation parameters and platelet indices can aid in early detection, risk stratification, and management of preeclampsia. Methods: A cross-sectional observational study was conducted among 180 term pregnant women (90 normotensive and 90 preeclamptic) at Burdwan Medical College. Participants were selected using simple random sampling. Clinical evaluation and laboratory investigations including BT (Bleeding Time), CT (Clotting Time), PT (Prothrombin Time), APTT (Activated Partial Thromboplastin Time), platelet count, MPV (Mean Platelet Volume), PDW (Platelet Distribution Width), and D-dimer levels were performed. Data were analyzed using SPSS with appropriate statistical tests, considering p ≤ 0.05 as significant. Results: Preeclamptic women showed significantly higher systolic and diastolic blood pressure (p < 0.001). Coagulation parameters (BT, CT, PT, and APTT) were significantly prolonged in the preeclamptic group (p < 0.001). Platelet count was significantly reduced (1.17 ± 0.6 vs. 1.438 ± 0.36 lakh/mm³; p = 0.033), while MPV and PDW were significantly elevated (p < 0.05), indicating increased platelet activation and turnover. D-dimer levels were also significantly higher in preeclamptic women (p < 0.001), reflecting enhanced fibrinolytic activity. Conclusion: Preeclampsia is associated with significant alterations in coagulation profile and platelet indices, indicating a hypercoagulable yet consumption-driven state. Routine monitoring of these parameters can facilitate early diagnosis, assess disease severity, and prevent complications such as DIC and HELLP syndrome, thereby improving maternal and fetal outcomes.

39. Preterm Birth and Its Maternal and Fetal Risk Factors: A Study from a Tertiary Care Center
B. Sanjana, Seema Mahesh Gaded, Sada G. Reddy, Nandish S. Manoli
Abstract
Introduction: Preterm birth, defined as delivery before 37 completed weeks of gestation, is a major contributor to neonatal morbidity and mortality worldwide. It is influenced by multiple maternal, fetal, and socioeconomic factors. Understanding its epidemiology and associated risk factors is essential for improving maternal and neonatal outcomes. Materials and Methods: This hospital-based observational study was conducted in the Department of Obstetrics and Gynecology at a tertiary care center. A total of 90 pregnant women with preterm delivery were included. Data were collected using a pre-designed proforma, including maternal demographics, obstetric history, and fetal parameters. Statistical analysis was performed using SPSS software, and associations were assessed using the Chi-square test, with p < 0.05 considered significant. Results: The majority of women belonged to the 21–30 years age group (44.4%), with a mean age of 27.6 ± 4.8 years. Multigravida constituted 57.8%, and 65.6% had inadequate antenatal care. Anemia (46.7%), PROM (31.1%), and hypertensive disorders (26.7%) were the most common maternal risk factors. Fetal factors included IUGR (20.0%) and multiple gestation (13.3%). Most neonates (72.2%) had low birth weight, with a mean gestational age of 33.4 ± 2.1 weeks. Conclusion: Preterm birth is a multifactorial condition predominantly associated with maternal health status, inadequate antenatal care, and obstetric complications. Early identification and management of modifiable risk factors are essential to reduce its incidence and improve neonatal outcomes.

40. Comparative Study of ACL Reconstruction in the Knee Flexed–Leg Vertical Position Versus the Figure-of-Nine Position: A Randomized Controlled Trial
Pushpraj Chauhan, Pancham Anirudh Yadav, Faisal Naseer Mir
Abstract
Introduction: ACL injury is a common knee ligament injury, often reconstructed using a conventional knee-flexed leg vertical position. The figure-of-nine position improves lateral access and femoral footprint visualization, potentially enhancing tunnel placement and postoperative outcomes. Comparative evidence is limited. Materials and Methods: Fifty patients aged 14–44 years with symptomatic, MRI-confirmed ACL tears were randomized to Group 1 (knee-flexed leg vertical) or Group 2 (figure-of-nine). Hamstring grafts were used, fixed with tibial bioabsorbable screws and femoral cortical buttons. Outcomes included IKDC and Lysholm scores, Lachman, anterior drawer, and pivot shift tests at 6 months and 1 year, along with radiological assessment of femoral and tibial tunnels. Results: At 1 year, Group 2 demonstrated superior functional scores (IKDC 85.10 ± 7.00 vs. 70.30 ± 8.20; Lysholm 92.0 ± 5.30 vs. 79.2 ± 8.00; p < 0.001) and better knee stability. Femoral tunnels were more anatomical in orientation (44.10° ± 4.70° vs. 55.20° ± 5.70°) and posteriorly placed (30.80% ± 5.20% vs. 38.40% ± 7.50%; p < 0.001). Complications were lower in Group 2 (12% vs. 24%; p = 0.014). Conclusion: The figure-of-nine position enhances anatomical tunnel placement, improves knee stability and functional outcomes, and reduces complication rates, making it a safe and effective alternative to the conventional vertical leg position for ACL reconstruction.

41. Clinico-Etiological Profile and Complications of Chronic Liver Disease in Female Patients: A Prospective Observational Study from a Tertiary Care Hospital in Uttar Pradesh
Abhishek Bhardwaj, Ravi Bhardwaj
Abstract
Background: Chronic liver disease in women has a heterogeneous etiological spectrum and often becomes clinically evident after decompensation. Female-only cohorts have shown marked regional variation in etiology and complication burden. Methods: This prospective observational study included 50 female patients with chronic liver disease attending the outpatient department or admitted under the Department of General Medicine at a tertiary care hospital in Uttar Pradesh between May 2024 and October 2025. Participants were enrolled by consecutive sampling. Demographic characteristics, etiology, diabetes status, alcohol use, clinical presentation, ascites, varices, laboratory profile, and status at last contact were recorded. The primary outcome was the clinico-etiological profile and complication burden. Results: The mean age was 55.54 ± 8.29 years, and most participants were 51–60 years old (38%) or 61–70 years old (30%). Metabolic-associated steatotic liver disease was the leading etiology (40%), followed by alcohol-related liver disease (22%), autoimmune hepatitis (14%), hepatitis B virus infection (12%), and hepatitis C virus infection (12%). Diabetes mellitus was present in 50% of patients and was more frequent in metabolic-associated steatotic liver disease than in non-MASLD etiologies (80% vs 30%; p=0.0012). Jaundice was the most common presenting feature (68%). Ascites was present in 56% and esophageal varices in 58%; large varices were present in 18%. Anaemia (72%), hypoalbuminemia (64%), hyperbilirubinemia (64%), thrombocytopenia (62%), and hyponatremia (42%) were common. At last contact, 41 patients (82%) were discharged, 7 (14%) were referred, and 2 (4%) died. Adverse status at last contact was associated with abdominal distension, fever/infection, severe ascites, large varices, hyponatremia, and renal dysfunction. 
Conclusion: Women with chronic liver disease in this cohort had a predominant metabolic burden, frequent diabetes, and a substantial prevalence of portal hypertension-related complications. Advanced clinical and biochemical derangement identified patients with poorer status at last contact.

42. Clinical and Biochemical Correlates of Disease Severity in Oral Submucous Fibrosis: A Prospective Observational Study
Kachoriya Taral, Kanwar Vikrant Singh, Pranshuta Sehgal
Abstract
Background: Oral submucous fibrosis is a chronic, progressive, and potentially malignant disorder associated with areca nut and related chewing habits. In addition to progressive fibrosis and functional restriction, patients may demonstrate measurable haematological and biochemical alterations. Methods: This prospective observational study included 100 patients with oral submucous fibrosis managed in the Department of Otorhinolaryngology from December 2023 to January 2026. Clinical evaluation included symptom profile, site involvement, stage, and mouth opening. Laboratory assessment included complete blood count, erythrocyte sedimentation rate, serum protein, C-reactive protein, serum iron, and serum lactate dehydrogenase. Mouth opening was reassessed at 1 month, 3 months, and 6 months. Results: The mean age was 33.05 ± 8.19 years, and 74.0% of patients were male. Stage III was the most common stage (31.0%). Baseline mouth opening declined progressively from Stage I to Stage IV. Increasing stage was associated with longer chewing duration and greater chewing amount per day. Haemoglobin, mean corpuscular volume, mean corpuscular haemoglobin, platelet count, serum protein, and serum iron declined progressively with advancing stage, whereas total leukocyte count, erythrocyte sedimentation rate, C-reactive protein, and serum lactate dehydrogenase increased progressively. Mouth opening showed a strong positive correlation with haemoglobin, mean corpuscular volume, mean corpuscular haemoglobin, platelet count, serum protein, and serum iron, and a strong negative correlation with age, complaint duration, chewing burden, total leukocyte count, erythrocyte sedimentation rate, C-reactive protein, and serum lactate dehydrogenase. Follow-up demonstrated progressive improvement in mouth opening, with greater improvement in earlier stages.
Conclusion: Oral submucous fibrosis showed a clear association between increasing clinical severity, greater habit burden, worsening functional limitation, and progressive haematological and biochemical derangement. Combined clinical and biochemical assessment may provide a broader estimate of disease burden and may assist severity assessment and follow-up.

43. Role of Colour-Assisted Duplex Sonography in the Evaluation of Thyroid Diseases: A Cross-Sectional Study with Histopathological Correlation
Tushar Malik, Shankar Snehit Patil
Abstract
Background: Thyroid disease includes inflammatory, benign nodular, and malignant lesions. Grey-scale ultrasonography is the first-line imaging modality, while colour-assisted duplex sonography provides additional vascular and haemodynamic information that may improve lesion characterization. Methods: This cross-sectional study was conducted in the Department of Radiodiagnosis over 18 months. A total of 70 thyroid lesions were evaluated using grey-scale ultrasonography, colour Doppler, and spectral Doppler. Morphological features, vascularity patterns, and Doppler indices were recorded. Final analysis was correlated with cytopathological and histopathological findings where available. Statistical analysis was performed using SPSS version 26, and a p value of <0.05 was considered statistically significant. Results: The age range was 21-75 years, with a mean age of 44.8 ± 13.3 years and a median age of 43.5 years. Females accounted for 58 cases (82.9%). Papillary thyroid carcinoma was the most common individual diagnosis, seen in 19 lesions (27.1%). Increased vascularity was the most frequent overall Doppler pattern, observed in 28 lesions (40.0%). All inflammatory thyroid lesions showed increased/internal vascularity. In contrast, common benign nodular and goitrous lesions showed either no vascularity or mild/peripheral vascularity, while papillary thyroid carcinoma showed predominantly increased/internal vascularity. Internal/increased vascularity was present in 19 of 21 malignant lesions (90.5%) compared with 14 of 49 benign/non-malignant lesions (28.6%) (p <0.001). Conclusion: Colour-assisted duplex sonography is a useful adjunct in the evaluation of thyroid disease. Doppler vascularity patterns, interpreted together with grey-scale morphology, may help differentiate inflammatory thyroid lesions, benign nodular/goitrous lesions, and papillary thyroid carcinoma. Increased vascularity is not specific for malignancy and should not be interpreted in isolation.

44. A Study of Serological and Clinical Correlation of Dengue in a Tertiary Care Hospital in Gujarat
Siddhi Bharatbhai Mesariya, Kanizfatma Durani, Khushbu Nagar, Jayshri Pethani
Abstract
Background: Dengue is a rapidly emerging mosquito-borne viral infection and a major public health concern in tropical countries like India. The disease presents with a wide clinical spectrum ranging from mild febrile illness to severe dengue with hemorrhage and shock. Early diagnosis using serological markers and correlation with clinical features is essential for timely management and reduction of morbidity and mortality. The objective was to determine the incidence of dengue infection and to study the correlation between clinical features and serological findings in a tertiary care hospital. Methods: This prospective observational study was conducted over a period of two years in a tertiary care hospital. A total of 5168 clinically suspected dengue cases were included. Serum samples were tested using NS1 antigen detection (ELISA and rapid methods) for early cases and IgM capture ELISA for later stages of illness. Demographic, clinical, and laboratory data were recorded and analyzed using descriptive statistics. Results: Out of 5168 suspected cases, 292 (6%) were serologically confirmed. The majority of patients had dengue without warning signs (78%), followed by dengue with warning signs (15%) and severe dengue (7%). The most affected age group was 21–30 years (38%), with male predominance (56%). Fever (100%), chills (95%), headache (87%), vomiting (65%), and myalgia (48%) were the most common presenting features. Hemorrhagic manifestations were observed in 46% of cases. NS1 antigen positivity (52%) was higher than IgM (42%), indicating early presentation in most patients. Thrombocytopenia was noted in 40% and leucopenia in 24% of cases, with platelet counts correlating with disease severity. Conclusion: Dengue predominantly affects young adults and shows peak incidence during monsoon season. NS1 antigen is a valuable early diagnostic marker.
Clinical and serological correlation is crucial for early detection and appropriate management, which helps in preventing complications and reducing disease burden.

45. Maternal and Perinatal Outcomes in Subclinical Hypothyroidism during Pregnancy: A Prospective Comparative Study
Vrushabhveer C. P., Baitinti Srividya, Sanabil S. P., Druva Chandra A. M., Abdul Haque Usman Pulath Puthanath, Sanketh Janardhan
Abstract
Background: Subclinical hypothyroidism (SCH) during pregnancy represents a significant endocrine disorder characterized by elevated thyroid-stimulating hormone (TSH) levels with normal free thyroxine concentrations. The association between SCH and adverse pregnancy outcomes remains controversial, with conflicting evidence regarding maternal and neonatal complications. Methods: This prospective comparative study enrolled 240 pregnant women (120 with SCH and 120 euthyroid controls) at a tertiary care hospital over an 18-month period. Participants were recruited at ≤20 weeks of gestation and followed until delivery. SCH was defined using trimester-specific TSH thresholds with normal free T4 levels. Primary outcomes included preterm birth, preeclampsia, low birth weight, and neonatal intensive care unit admission. Statistical analysis included chi-square tests, independent t-tests, and multivariate logistic regression. Results: Women with SCH demonstrated significantly higher rates of preterm birth (26.7% vs 11.7%, p=0.003), preeclampsia (16.7% vs 6.7%, p=0.018), and low birth weight (22.5% vs 9.2%, p=0.005) compared to euthyroid controls. Mean neonatal birth weight was significantly lower in the SCH group (2,680 ± 395 g vs 2,920 ± 365 g, p<0.001). Multivariate analysis revealed SCH as an independent predictor of composite adverse outcomes (adjusted OR 2.14, 95% CI 1.28–3.58, p=0.004). Conclusion: Subclinical hypothyroidism during pregnancy is associated with significantly increased maternal and perinatal morbidity. Early screening and appropriate management strategies may improve pregnancy outcomes in affected women.
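As a sanity check on the primary outcome, a crude (unadjusted) odds ratio can be back-calculated from the preterm-birth rates and group sizes given in the abstract. This is a sketch under the assumption that 26.7% and 11.7% of 120 round from 32 and 14 events, respectively; the reported adjusted OR of 2.14 comes from multivariate logistic regression controlling for covariates, so the two figures are expected to differ.

```python
# Crude odds ratio for preterm birth in SCH vs euthyroid controls.
# Event counts (32 and 14) are back-calculated from the abstract's
# percentages and group sizes, which is an assumption; the abstract
# itself reports only the adjusted OR (2.14) from logistic regression.
import math

events_sch, n_sch = 32, 120    # assumed: 26.7% of 120
events_ctrl, n_ctrl = 14, 120  # assumed: 11.7% of 120

odds_sch = events_sch / (n_sch - events_sch)      # 32/88
odds_ctrl = events_ctrl / (n_ctrl - events_ctrl)  # 14/106
crude_or = odds_sch / odds_ctrl

# Approximate 95% CI on the log-odds scale (Woolf method)
se_log_or = math.sqrt(1/events_sch + 1/(n_sch - events_sch)
                      + 1/events_ctrl + 1/(n_ctrl - events_ctrl))
lo = math.exp(math.log(crude_or) - 1.96 * se_log_or)
hi = math.exp(math.log(crude_or) + 1.96 * se_log_or)

print(f"Crude OR = {crude_or:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Under these assumed counts the crude OR is about 2.75, somewhat higher than the adjusted OR of 2.14, which is the usual pattern once confounders are adjusted for.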

46. Evaluation of the Effect of Cyperus rotundus in a Murine Model of Dextran Sulphate Sodium (DSS)-Induced Acute Colitis
Divakar, Suraj Waykole, Rahul Vitthal Chavan, Manoj Radhakrishnan, Sandhya Kamat
Abstract
Background: Ulcerative Colitis (UC) is a chronic, debilitating condition that affects an individual throughout life and is associated with many complications. The current treatment regimen includes the use of anti-inflammatory agents such as sulfasalazine, corticosteroids and immunosuppressants like azathioprine. These drugs have limited efficacy and multiple adverse effects, and hence there is a need for safer and efficacious new drugs. Cyperus rotundus (CR) is a medicinal plant used in Ayurveda for the treatment of gastrointestinal disorders. The present study examined the effect of CR in an animal model that simulates ulcerative colitis. Objectives: To evaluate the protective effect of CR in a murine model of dextran sulphate sodium (DSS)-induced acute colitis. Methods: After IAEC approval, 48 Swiss albino mice were divided into six groups (n = 8/group) and treated as follows: vehicle control (VC), disease control (DC), positive control (sulfasalazine, 100 mg/kg), and three test groups receiving CR at 200 mg/kg/day, 600 mg/kg/day, and 1 g/kg/day. All 6 groups received the study drug or vehicle from day 1 to 14. The inducing agent, 3% dextran sulphate sodium in drinking water, was administered from day 8 to 14 to all groups except VC. Animals were sacrificed on day 15. Colon length and colon weight-by-length ratio were assessed and analyzed using one-way ANOVA. Disease activity index (DAI) and colitis macroscopy were assessed and analyzed using the Kruskal-Wallis test. A value of p < 0.05 was considered to be statistically significant. Results: CR (1 g/kg/day) significantly increased the colon length (p<0.05) and decreased the colon weight-by-length ratio, colitis macroscopy score, and DAI score (p<0.05) as compared to DC. Its effects were comparable to the positive control sulfasalazine. Conclusion: Aqueous extract of rhizomes of CR exerted a protective effect in the murine model of DSS-induced acute colitis.

47. Correlation of Serum Calcium and Serum Cholesterol with Platelet Indices in Cardiac Patients: A Prospective Observational Study
Radhika Sharma, Pratishtha Shrivastava, Shivangi Maru
Abstract
Background: Cardiovascular diseases (CVD) are a leading cause of morbidity and mortality worldwide. Platelet activation and dyslipidemia play a crucial role in the pathogenesis of atherosclerosis and its complications. Serum calcium has also been implicated in cardiovascular risk, though its relationship with lipid profile and platelet indices remains less clearly defined. Aims and Objectives: To evaluate the correlation between serum calcium and serum cholesterol, and to assess the association of platelet indices with serum cholesterol and various cardiovascular diseases in patients admitted to the ICCU. Material and Methods: This prospective observational study included 153 cardiac patients admitted to the ICCU of R.D. Gardi Medical College, Ujjain. Serum calcium, serum cholesterol, and platelet indices (MPV, PDW, P-LCR, platelet count, PCT) were measured. Correlation analysis was performed using appropriate statistical methods. Results: The largest age groups were 51–60 years (26.8%) and 61–70 years (26.8%), with male predominance (68%). The most common diagnosis was coronary artery disease (26.1%), followed by myocardial infarction (20.9%). Serum cholesterol was significantly higher in myocardial infarction patients. A significant positive correlation was observed between serum calcium and serum cholesterol (p = 0.038). Serum cholesterol also showed significant positive correlations with PDW (r = 0.450, p = 0.011), MPV (r = 0.617, p = 0.002), and P-LCR (r = 0.537, p = 0.023). Conclusion: Serum cholesterol is significantly associated with platelet activation indices, indicating increased thrombotic potential in cardiac patients. Platelet indices may serve as simple, cost-effective markers for identifying high-risk individuals.

48. A Compact Smartphone Microscope Adapter for Real-Time Telepathology: Design, Development, and Point-of-Care Applications
Biswas Rajib, Das Mainak, Chakraborty Shubarna, Das Barnali, Naiding Momota, Kairi Sushmita
Abstract
Background & Objectives: Smartphone-assisted telepathology offers a cost-effective way to support remote microscopy and teaching. Free-hand imaging through a microscope eyepiece can be unstable and often leads to misalignment or loss of focus. This study aimed to technically validate a compact, modular smartphone microscope adapter for real-time telepathology. Methods: This prospective cross-sectional validation study took place over six months in a tertiary care pathology department. Researchers evaluated 60 archived slides, including 30 histopathology, 20 hematology, and 10 cytology specimens. Ten participants each assembled the adapter and conducted live transmission sessions on six slides, resulting in 60 sessions. Technical performance was measured using a structured 5-point Likert scale for stability, image resolution, focus quality, color accuracy, and stream stability. User feedback and diagnostic interpretability were also collected. Results: All 60 sessions finished without any mechanical failures, device detachment, or clamp loosening. The average time for assembly and alignment was 2.4 ± 0.6 minutes. Adapter stability had the highest score (4.72 ± 0.48), followed by image resolution (4.58 ± 0.56), focus quality (4.55 ± 0.59), color fidelity (4.47 ± 0.62), and stream stability (4.41 ± 0.69). The overall composite score was 4.55 ± 0.52. Images transmitted during the sessions were diagnostically interpretable for all specimen types, and reviewers’ diagnoses matched the reference diagnosis in every session. Interpretation & Conclusions: The adapter showed excellent stability, consistent alignment, easy usability, and provided live image transmission that was suitable for diagnosis in telepathology and teaching.

49. Comparative Outcomes of Minimally Invasive vs. Open Surgery: A Systematic Review
Lakhyajit Pait, Prakash Kalita, Sankar Prasad Saikia, Nirmal Kumar Agarwal
Abstract
Background: Minimally invasive surgery (MIS) has evolved rapidly since the 1990s, primarily encompassing laparoscopic and video-assisted approaches designed to minimize surgical trauma, postoperative pain, and recovery time compared with conventional open procedures. Objective: This systematic review aimed to compare perioperative, postoperative, and long-term outcomes of laparoscopic MIS and open surgery in adult patients undergoing common general surgical procedures, including cholecystectomy, appendicectomy, and hernia repair. Methods: A systematic literature search was performed across PubMed, Embase, Cochrane CENTRAL, and Web of Science from database inception through October 2025. Eligible studies included randomized controlled trials (RCTs) and comparative cohort studies involving adults (≥18 years) who underwent laparoscopic versus open surgery. Primary outcomes were operative time, intraoperative blood loss, postoperative complications, hospital stay, mortality, and long-term survival. Study quality was assessed using RoB 2 for RCTs and ROBINS-I for non-randomized studies. Results: Forty-seven studies (26 RCTs and 21 cohort studies) met the inclusion criteria. Laparoscopic MIS demonstrated significantly lower intraoperative blood loss (mean difference −93 mL), shorter hospital stay (mean difference −2.8 days), and fewer postoperative complications (OR = 0.54, 95% CI 0.44–0.67) compared with open procedures. Operative time was moderately longer for laparoscopic surgery (MD = +28 minutes). Mortality was marginally lower in emergency laparoscopic cases (OR = 0.44, 95% CI 0.35–0.54). Long-term outcomes, including recurrence and survival, were comparable between both approaches. Conclusions: Laparoscopic minimally invasive surgery offers clear short-term benefits—reduced blood loss, fewer complications, and faster postoperative recovery—without compromising long-term outcomes. 
These advantages are influenced by surgeon expertise, appropriate case selection, and institutional experience.

50. Impact of Vitamin D Deficiency on Morbidity & Mortality in Early Onset Sepsis among Term Neonates
Anil Kumar Gogineni, Radhika Mantry, Urmila Jhamb, Rashi Bhargava, Aarti Anand
Abstract
Introduction: Early-onset neonatal sepsis (EONS) is a significant cause of neonatal morbidity and mortality, particularly in developing countries. Vitamin D has important immunomodulatory functions, and neonates are entirely dependent on maternal vitamin D stores. Deficiency may increase susceptibility to infections; however, Indian data on its association with early-onset sepsis are limited. Aim and Objective: To study the impact of serum vitamin D levels on early-onset neonatal sepsis in term neonates, and to assess its impact on clinical outcomes. Materials and Methods: This hospital-based prospective case-control study was conducted at Santosh Medical College and Hospital, Ghaziabad. A total of 138 term neonates were enrolled, including 69 neonates with early-onset sepsis (≤72 hours of life) and 69 healthy term neonates as controls. Diagnosis of sepsis was based on clinical features supported by laboratory parameters including C-reactive protein, complete blood count, immature-to-total neutrophil ratio, platelet count, and blood culture. Serum 25-hydroxyvitamin D levels were measured in all neonates. Maternal risk factors and intrapartum antibiotic prophylaxis were recorded. Statistical analysis was performed using STATA MP-17, with p < 0.05 considered statistically significant. Results: Mean serum vitamin D levels were significantly lower in neonates with early-onset sepsis than in controls (10.30 ± 3.23 vs 24.71 ± 5.43 ng/mL; p < 0.001). Vitamin D deficiency was more common among septic neonates. Maternal fever, infections, and inadequate intrapartum antibiotic prophylaxis were significantly associated with early-onset sepsis. Blood culture was positive in 73.9% of cases. Septic neonates required longer NICU stay, prolonged antibiotic therapy, and greater respiratory support, with worse outcomes seen in those with severe vitamin D deficiency. 
Conclusion: Vitamin D deficiency has significant impact on early-onset neonatal sepsis in term neonates and correlates with increased disease severity. Affected neonates required longer NICU stay, prolonged antibiotic therapy, and greater respiratory support. Since vitamin D deficiency is preventable, maternal screening and supplementation may help reduce early-onset neonatal sepsis and improve outcomes.

51. Retracted

52. Clinical and Demographic Risk Factors for Mortality in Hospitalized Patients with Pneumonia: An Observational Study
Kiran Kumari Padhy, Ram Niranjan Sahoo, Muktikanta Parida
Abstract
Background: Pneumonia remains a major cause of hospital admission and death, particularly among older adults and patients with physiological derangement or chronic comorbidity. Early recognition of mortality risk is essential for triage, monitoring intensity, and resource allocation in tertiary-care settings. Aim: To identify clinical and demographic predictors of in-hospital mortality among adults hospitalized with pneumonia at a tertiary care teaching hospital in eastern India. Methods: This single-center retrospective observational study included 270 consecutive adult patients admitted with radiologically confirmed pneumonia to Institute of Medical Sciences and SUM Hospital II, Bhubaneswar, Odisha, between January 2024 and January 2026. Demographic variables, comorbidities, admission physiological parameters, laboratory markers, radiographic extent, and in-hospital course were analyzed. Survivors and non-survivors were compared using standard inferential tests. Univariable and multivariable logistic regression were used to identify independent predictors of in-hospital mortality, and discrimination was assessed by receiver operating characteristic analysis. Results: Overall, in-hospital mortality was 15.6% (42/270). Non-survivors were older and more likely to have diabetes, chronic kidney disease, altered sensorium, hypoxemia at admission, multilobar involvement, elevated inflammatory burden, renal dysfunction, and hypoalbuminemia. In the final admission-based multivariable model, age ≥65 years (adjusted odds ratio [AOR] 4.76, 95% CI 1.95-11.63), chronic kidney disease (AOR 2.29, 95% CI 1.01-5.18), SpO2 <90% at admission (AOR 2.38, 95% CI 1.04-5.44), albumin <3.5 g/dL (AOR 2.85, 95% CI 1.29-6.28), and neutrophil-to-lymphocyte ratio >12 (AOR 4.41, 95% CI 2.00-9.72) independently predicted death. The admission model showed good discrimination (AUC 0.857), outperforming CURB-65 alone (AUC 0.799). 
Conclusion: Older age, renal comorbidity, admission hypoxemia, hypoalbuminemia, and elevated neutrophil-to-lymphocyte ratio were the strongest independent predictors of in-hospital death in hospitalized pneumonia. A parsimonious admission model may improve early risk stratification beyond clinical severity scoring alone.
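Odds ratios with 95% confidence intervals, as reported in abstracts like this one, are conventionally built from 2×2 contingency tables (the adjusted values additionally control for covariates via multivariable logistic regression). A minimal Python sketch of the unadjusted calculation, using hypothetical counts rather than the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with 95% CI from a 2x2 table:
    a = exposed non-survivors, b = exposed survivors,
    c = unexposed non-survivors, d = unexposed survivors."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only (not the study's data)
or_, lo, hi = odds_ratio_ci(20, 40, 22, 188)
```

The interval is computed on the log-odds scale, where the sampling distribution is approximately normal, and then exponentiated back; an interval excluding 1 corresponds to a statistically significant association.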

53. Profile of Electrocution Deaths in Coastal Odisha: A Retrospective Autopsy-Based Study
Subal Kumar Naik, Umakanta Khejuria
Abstract
Background: Electrocution remains an important yet preventable cause of accidental mortality in India, especially in regions with rapid urban expansion, informal electrical connections, humid climate, and seasonal outdoor work. Autopsy-based regional profiling helps identify vulnerable groups, recurring circumstances, and forensic injury patterns relevant to both death investigation and prevention. Aim: To describe the demographic profile, circumstantial characteristics, autopsy findings, and analytical correlates of electrocution deaths autopsied at SCB Medical College & Hospital, Cuttack, from 5 January 2025 to 31 December 2025. Methods: A retrospective descriptive-analytic record review of 90 confirmed electrocution deaths was designed using a one-year institutional autopsy frame. Case records, inquest papers, scene details, and hospital records were reviewed for age, sex, residence, occupation, season, place of occurrence, source and voltage of current, external injury pattern, survival interval, and cause of death. Sample size calculation using the single-proportion formula, based on an expected male predominance of 85% from previous Indian literature and an absolute precision of 7.5%, yielded a minimum requirement of 87 cases; all 90 eligible cases in the study period were included. Descriptive statistics, chi-square/Fisher exact tests, and odds ratios (ORs) with 95% confidence intervals (CIs) were used. Results: Males constituted 86.7% of the victims and the mean age was 34.4±14.3 years. The 21–30 year age group was most affected (32.2%), followed by 31–40 years (25.6%). Most victims were from rural areas (67.8%), and deaths were overwhelmingly accidental (96.7%). Incidents peaked in the monsoon season (54.4%), with the highest monthly counts in July and August. Low-voltage exposure accounted for 68.9% of cases, while workplace incidents (41.1%) marginally exceeded home incidents (37.8%). Upper-limb contact was the commonest contact site (52.2%). 
Discrete electrical marks were absent in 34.4% of cases. Immediate cardiorespiratory arrest due to electrocution was the most frequent cause of death (60.0%), followed by electrocution with respiratory arrest (17.8%) and septicemia following electrical burns (13.3%). High-voltage fatalities were significantly associated with occurrence outside home (OR 8.33, p<0.001), occupational exposure (OR 3.34, p=0.010), extensive burns ≥20% TBSA (OR 11.31, p<0.001), survival >24 hours (OR 6.56, p=0.009), and fall-related injury (OR 16.64, p=0.003). Conclusion: In this coastal Odisha autopsy series, electrocution deaths predominantly affected young rural men and clustered during the monsoon months. Although low-voltage domestic and workplace events formed the larger burden, high-voltage exposure was associated with more severe burns, delayed survival, and secondary trauma. The findings support targeted household electrical safety, occupational line-clearance protocols, monsoon risk messaging, and careful forensic documentation even when classical electrical marks are absent.
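The sample-size computation described in the methods uses the single-proportion formula n = z²·p·(1−p)/d². A quick Python check with the stated inputs (z = 1.96 for 95% confidence, expected proportion p = 0.85, absolute precision d = 0.075) reproduces the reported figure:

```python
# Single-proportion sample-size formula: n = z^2 * p * (1 - p) / d^2
z = 1.96    # standard normal deviate for 95% confidence
p = 0.85    # expected proportion (male predominance from prior literature)
d = 0.075   # absolute precision

n = (z ** 2) * p * (1 - p) / (d ** 2)
# n evaluates to about 87.1, consistent with the reported minimum of 87 cases
```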

54. Clinico-Hematological Profile of Anaemia in Children Aged 6 Months to 12 Years Admitted to a Tertiary Care Hospital in North-East India: A Cross-Sectional Study
Sougata Saha, Sujit Kumar Chakrabarti, Sribas Das, Manasi Saha (Ray)
Abstract
Background: Anaemia remains a major public health problem among children in developing countries, particularly in India. Data from North-East India are limited, especially among hospitalized paediatric populations. Objectives: To estimate the proportion of anaemia among hospitalized children aged 6 months to 12 years along with description of the clinico-hematological profile and to determine the association of nutritional status of the study subjects with severity as well as type of anaemia. Methods: This hospital-based cross-sectional study was conducted in the Department of Paediatrics, Agartala Government Medical College, Tripura, over a two-year period. Children aged 6 months to 12 years admitted with anaemia as per WHO criteria were consecutively enrolled. Clinical features, anthropometry, complete blood counts, peripheral smear examination, iron profile, vitamin B12 levels, haemoglobin electrophoresis, and relevant investigations were performed. Data were analyzed using SPSS version 26.0. Results: Among 2,421 admitted children, 107 were anaemic, giving a proportion of 44.19 cases per 1,000 admissions (4.41%). Males constituted 59.8% (male: female ratio 1.48:1). Moderate anaemia was most common (44.9%), followed by severe anaemia (35.5%). Malnutrition was present in 71% of children and showed a significant association with anaemia severity (p<0.001). Microcytic hypochromic anaemia was the predominant morphological type (66.4%). Iron deficiency anaemia was the most common etiology (29.9%), while malaria (10.3%) and haemoglobinopathies were major contributors to haemolytic anaemia. Conclusion: Anaemia among hospitalized children in Tripura is multifactorial, strongly associated with malnutrition, and predominantly nutritional in origin. Early nutritional intervention and region-specific preventive strategies are urgently required.

55. Effect of Intravenous Lignocaine versus Fentanyl on Hemodynamic Response to Laryngoscopy & Endotracheal Intubation in General Anaesthesia: A Comparative Study
Ravindra Kumar Dabi, Chiranji Lal Khedia, Rohit Kumar Verma, Vijeta Khandelwal
Abstract
Background: Laryngoscopy and endotracheal intubation provoke a transient sympathoadrenal response resulting in tachycardia and hypertension, which may be detrimental in susceptible patients. Various pharmacological agents are used to attenuate this response. Objective: To compare the efficacy of intravenous lignocaine and fentanyl in attenuating the hemodynamic response to laryngoscopy and intubation. Methods: This randomized double-blind study was conducted on 80 patients (ASA I–II) undergoing elective abdominal surgery under general anesthesia. Patients were allocated into two groups: Group L received intravenous lignocaine 1.5 mg/kg, and Group F received fentanyl 2 µg/kg, administered 3 minutes before intubation. Hemodynamic parameters (HR, SBP, DBP, MAP) were recorded at baseline, post-drug, post-induction, at intubation, and at 1, 3, and 5 minutes after intubation. Statistical analysis was performed using SPSS, with p < 0.05 considered significant. Results: Both groups were comparable in demographic parameters. Hemodynamic variables increased significantly at intubation in both groups. However, fentanyl demonstrated significantly better attenuation at 1 minute post-intubation: HR (p=0.039), SBP (p=0.007), DBP (p=0.00029), and MAP (p=0.005). No significant differences were observed at later time intervals. Incidence of side effects (bradycardia, hypotension) was comparable between groups. Conclusion: Fentanyl (2 µg/kg) is more effective than lignocaine (1.5 mg/kg) in attenuating the acute hemodynamic response at 1 minute following intubation, with comparable safety profiles.

56. CHA2DS2-VASC Score in Emergency Department: A Prospective Observational Study
Urjita Pranav Modi, Harshkumar Dangi, Dharmistra Dhusa, Pramit Patel
Abstract
Background: The CHA₂DS₂-VASc score (Congestive Heart Failure, Hypertension, Age ≥75 years, Diabetes Mellitus, Stroke/Transient Ischemic Attack, Vascular Disease, Age 65–74 years, and Sex Category [female]) is a validated, evidence-based tool used to estimate the risk of stroke in patients with non-valvular atrial fibrillation (AF). It serves as an essential bedside assessment in the emergency department, particularly in patients presenting with new-onset AF, uncontrolled AF, or AF with an uncertain history of anticoagulation. Objective: To estimate the usefulness of the CHA₂DS₂-VASc score in patients with atrial fibrillation in the emergency department. Methods: This prospective observational study included adult patients presenting to the emergency department with atrial fibrillation confirmed by electrocardiogram. CHA₂DS₂-VASc scores were calculated at presentation and correlated with anticoagulation use and outcomes. Results: Higher CHA₂DS₂-VASc scores were associated with increased cerebral infarction and mortality. Anticoagulant therapy was underutilized despite high-risk scores. Conclusion: CHA₂DS₂-VASc score is useful for risk stratification and outcome prediction in emergency department patients with atrial fibrillation.
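The scoring rule spelled out in the background can be expressed as a small function. A minimal Python sketch (an illustration, not the study's software) assigning the standard point weights, with a maximum possible score of 9:

```python
def cha2ds2_vasc(age, female, chf, htn, diabetes,
                 stroke_tia, vascular_disease):
    """CHA2DS2-VASc score: 1 point each for CHF, hypertension,
    diabetes, vascular disease, female sex, and age 65-74;
    2 points each for age >= 75 and prior stroke/TIA (max 9)."""
    score = 0
    score += 2 if age >= 75 else (1 if age >= 65 else 0)
    score += 1 if female else 0
    score += 1 if chf else 0
    score += 1 if htn else 0
    score += 1 if diabetes else 0
    score += 2 if stroke_tia else 0
    score += 1 if vascular_disease else 0
    return score

# A hypothetical 78-year-old woman with hypertension and diabetes:
# 2 (age >= 75) + 1 (female) + 1 (HTN) + 1 (DM) = 5
print(cha2ds2_vasc(78, True, False, True, True, False, False))  # prints 5
```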

57. Comparative Study of the Efficacy & Safety of Ferric Carboxy Maltose V/S Iron Sucrose in Management of Mild to Moderate Iron Deficiency Anemia in Pregnant Women
Darshan D. Patel, Harshdeep K. Jadeja, Bhavesh B. Airao
Abstract
Objective: Anemia is one of the most common medical conditions affecting pregnancy and is responsible for maternal and perinatal mortality and morbidity. The study was done to compare the efficacy and safety of ferric carboxymaltose versus iron sucrose in iron deficiency anemia during pregnancy. Method: This prospective observational study was carried out at C U Shah Medical College and Hospital, Surendranagar, Gujarat. One hundred pregnant women with mild to moderate iron deficiency anemia were selected and randomized into two groups in a 1:1 ratio: Group A consisted of 50 antenatal women who received iron sucrose, and Group B consisted of 50 antenatal women who received ferric carboxymaltose. Results: A total of 100 pregnant women with iron deficiency anemia were included in the study, with 50 patients in the Iron Sucrose group and 50 in the Ferric Carboxymaltose (FCM) group. Baseline characteristics including age, baseline hemoglobin, and serum ferritin were comparable between the two groups with no statistically significant difference. Both groups showed a significant improvement in hemoglobin levels during follow-up; however, the rise in hemoglobin was significantly higher in the FCM group. At 8 weeks, the mean hemoglobin level increased to 12.5 ± 1.0 g/dl in the FCM group compared to 11.2 ± 1.1 g/dl in the Iron Sucrose group (p < 0.001). Serum ferritin levels also showed a significantly greater increase in the FCM group, reaching 110.6 ± 18.2 ng/ml at 4 weeks compared to 45.3 ± 12.4 ng/ml in the Iron Sucrose group (p < 0.001). The mean number of doses required was significantly lower in the FCM group (1.3 ± 0.5 doses) compared to the Iron Sucrose group (4.8 ± 1.2 doses). Both treatments were well tolerated, and the incidence of adverse effects such as nausea, headache, and injection site pain was low and comparable between the two groups. No hypersensitivity reactions were observed. 
Discussion: Our study showed a significant increase in hemoglobin level in both groups, but FCM was safe and more effective than iron sucrose in improving hemoglobin concentration and achieving earlier replenishment of iron stores in patients with mild to moderate anemia.

58. Impact of Lateral Vs Sitting Position for Spinal Anesthesia Administration on Intraocular Pressure and Post Dural Puncture Headache in Cesarean Section
Aparajita Banerjee, Meenakshi Pandey, Trishna Sahu, Ambika Prasad Panda
Abstract
Aim: The aim of our study was to determine the effect of spinal anesthesia administered in either the sitting or right lateral position on post dural puncture headache (PDPH) and intraocular pressure during cesarean section. Materials and Methods: 100 patients posted for cesarean section under spinal anesthesia were divided into two groups of 50 each. Spinal anesthesia was administered either in the sitting position (Group S) or the right lateral position (Group RL). Hemodynamics were monitored during the perioperative period. Intraocular pressure was measured before and after the operation. Post dural puncture headache was assessed postoperatively up to 5 days. Patients requiring more than one attempt for spinal anesthesia were excluded. Results: There was no statistical difference between the two groups regarding demographic data. Post dural puncture headache was seen in 13 patients in Group S and 5 patients in Group RL, the difference being significant. There was no significant difference between the groups regarding intraocular pressure (P > 0.05). There was no significant difference between the groups regarding heart rate, SBP and SpO2 at various time points in the perioperative period. Conclusion: Spinal anesthesia administered in the sitting position for cesarean section resulted in a higher incidence of post dural puncture headache than in the right lateral position, but no significant change was found in intraocular pressure.

59. Dexmedetomidine versus Propofol Infusion for Intraoperative Haemodynamic Stability during Laparoscopic Surgery: A Prospective Open-Label Comparative Study
P. Umamaheswari, R. Pravin Kumaar, R. Mageshwaran
Abstract
Background: Pneumoperitoneum created during laparoscopic surgery induces significant haemodynamic perturbations, including increased systemic vascular resistance, reduced venous return, and activation of neurohumoral stress pathways. Effective intraoperative haemodynamic management is therefore critical. This study aimed to compare the efficacy of dexmedetomidine infusion versus propofol infusion in maintaining haemodynamic stability during laparoscopic surgery and to evaluate postoperative recovery profiles. Methods: A prospective, open-label comparative study enrolled 70 patients (ASA PS I and II, aged 18–65 years) undergoing elective laparoscopic surgery at Government Villupuram Medical College & Hospital. Patients were randomised equally into Group D (dexmedetomidine: loading dose 1 mcg/kg over 10 minutes before intubation, followed by 0.2 mcg/kg/h infusion) and Group P (propofol: 100 mcg/kg/min infusion after intubation). Both infusions were continued until deflation of pneumoperitoneum. Heart rate (HR), systolic blood pressure (SBP), diastolic blood pressure (DBP), and mean arterial pressure (MAP) were recorded at multiple time points. Postoperative sedation and recovery were assessed using the Ramsay Sedation Scale (RSS) and Modified Aldrete Score (MAS). Results: Both groups were comparable at baseline. Group D exhibited significantly lower HR, SBP, DBP, and MAP compared with Group P at most intraoperative time points (p<0.01), reflecting superior haemodynamic attenuation (HR: 3% decrease vs. 18% increase; MAP: 4% decrease vs. 7% increase over pneumoperitoneum). Group D patients had significantly deeper sedation (higher RSS scores) up to 90 minutes postoperatively (p<0.01), while Modified Aldrete Scores were significantly lower in Group D at 0, 15, and 30 minutes post-extubation (p<0.01), indicating slower initial recovery. Both groups achieved full recovery by 45–60 minutes. No adverse events were recorded. 
Conclusion: Dexmedetomidine infusion provides superior intraoperative haemodynamic stability during laparoscopic surgery compared with propofol, with effective attenuation of the stress response to pneumoperitoneum. Propofol offers faster early recovery. Dexmedetomidine is the preferred agent when cardiovascular stability is the clinical priority.

60. Dexmedetomidine as an Adjuvant in Opioid Anaesthesia Induction in Patients with Left Ventricular Dysfunction and Coronary Artery Disease: A Prospective Randomised Observational Study
Kurinchi Raja M., Naven Kumar S., Saravanakumar
Abstract
Background: Patients with coronary artery disease (CAD) and compromised left ventricular (LV) function represent a high-risk subgroup in cardiac surgery. High-dose opioid induction, while widely employed, carries risks of haemodynamic instability. Dexmedetomidine, a highly selective alpha-2 (α2) adrenergic agonist, offers sympatholysis, sedation, and analgesia, and may attenuate the adrenergic response to laryngoscopy and intubation. This study aimed to compare the haemodynamic effects and opioid requirements during anaesthesia induction with fentanyl alone versus fentanyl supplemented with dexmedetomidine in patients with LV dysfunction undergoing off-pump coronary artery bypass grafting (OPCAB). Methods: Sixty adult patients with LV dysfunction (ejection fraction <45%) undergoing elective OPCAB were prospectively randomised into two groups of 30 each: Group F (fentanyl alone) and Group D (fentanyl plus dexmedetomidine loading dose 1 mcg/kg over 10 minutes). Haemodynamic parameters and cardiac output indices were recorded at baseline and at one-minute intervals from induction to seven minutes post-induction using a Flo Trac™/Vigileo™ system. Bispectral Index (BIS) monitoring ensured anaesthetic depth. Fentanyl dosage at induction, additional intraoperative fentanyl requirements, and duration of postoperative ventilation were compared between groups. Results: Demographic parameters were comparable between groups. Heart rate and systolic blood pressure were significantly elevated in Group F compared to Group D across all post-induction time points (p<0.001). Diastolic blood pressure and oxygen saturation (SpO2) remained similar in both groups. Stroke volume index (SVI) and cardiac index (CI) were significantly better maintained in Group D (p<0.001). Fentanyl induction dosage (325 ± 25.4 mcg vs 253.3 ± 26.0 mcg) and additional intraoperative fentanyl (371.7 ± 28.4 mcg vs 185.0 ± 32.6 mcg) were significantly lower in Group D (p<0.001). 
Duration of postoperative ventilation was also significantly shorter in Group D (5.6 ± 0.5 hrs vs 9.0 ± 0.9 hrs, p<0.001). Conclusion: Dexmedetomidine supplementation to fentanyl-based induction in patients with CAD and LV dysfunction provides superior haemodynamic stability, better preservation of cardiac output parameters, reduced intraoperative opioid requirements, and facilitates faster postoperative extubation, enabling early patient fast-tracking.

61. The Persistence of Hansen’s Disease: A Five-Year Profile of High Bacillary Indices and Pediatric Cases from East Vidarbha region
Priyanka Chandankhede, Khushboo Agarwal, Aboli Shinde, Dilip Gedam, Gopal Agrawal
Abstract
Background: Leprosy, or Hansen’s disease, remains a chronic infectious challenge caused by Mycobacterium leprae, primarily affecting the skin and peripheral nerves. Despite national elimination efforts, leprosy transmission persists in marginalized communities in India. This study aimed to determine the pattern, prevalence, and trends of slit-skin smear-positive leprosy cases in East Vidarbha region to assess post-elimination challenges. Methods: A retrospective analysis was conducted at a tertiary care institution in Nagpur over a five-year period from January 2020 to February 2025. Clinical and bacteriological data from 239 slit-skin smear-positive cases, identified from 502 suspected individuals, were evaluated. Results: Males were predominantly affected, making up 64.9% of the cases with a male-to-female ratio of 1.8 to 1. The highest incidence occurred among individuals aged 41 to 50 years, representing 23.4% of the total. Additionally, children aged ≤ 10 years accounted for 5.0% of the cohort, which points to active community transmission. Multibacillary leprosy was responsible for 71.5% of the cases. Furthermore, 51.9% of the patients exhibited a high Bacillary Index of ≥ 5, while the highly infectious borderline lepromatous and lepromatous types made up 56.5% of all clinical presentations. The study also highlighted a significant surge in cases during 2023, representing 36.8% of the total, alongside a relapse or re-treatment rate of 20.9%. Conclusion: The high burden of multibacillary disease and pediatric cases confirms ongoing leprosy transmission in Central India. To achieve the goal of “Zero Leprosy,” healthcare systems must urgently optimize early detection frameworks, integrate novel chemoprophylactic regimens and vaccines, and actively address socio-economic barriers to care.

62. Clinico Epidemiological Study of Genito-Ulcerative Sexually Transmitted Diseases in People Living with HIV/AIDS
Gayathri Narukulla, Pravalika Merugu, Raghumohan Kavati
Abstract
Background: The importance of genito-ulcerative STDs has increased considerably because these lesions are a major cofactor in the transmission of HIV. Hence it is necessary to provide prompt and effective treatment as early as possible: this prevents viral replication, prevents infection of the spouse, and prevents infection of other people in the community.

63. Association between Blood Pressure, Body Mass Index, and Thyroid Hormone Levels among Northern Indians
Deepa Gupta, Prateek Agrawal, Manjula Babariya, Jitendra Kumar S. Parmar, Kamini Vinayak
Abstract
Background: Blood pressure (BP) and body mass index (BMI) are important indicators of health, especially with regard to heart-related problems. Frequent increases in BP and weight can affect the body's metabolism and may lead to hypertension and obesity, universal contributors to the most common endocrine disorder, subclinical hypothyroidism (ScH). The present study was conducted to find out the association between BP, BMI, and thyroid hormone levels in blood donors. Method: A total of 1018 healthy people who voluntarily came for blood donation at the hospital participated in this study. Of these, 966 were included (97.6% males and 2.4% females, aged 18-59 years) and 52 were excluded. Blood pressure, height, weight, BMI and blood group (BG) were measured using standardized protocols by trained nursing staff. Blood samples were taken for the estimation of free triiodothyronine (FT3), free thyroxine (FT4) and thyroid stimulating hormone (TSH) by the chemiluminescence method on a Vitros 56002355 clinical chemistry analyzer. Descriptive statistics and Pearson correlation coefficients were calculated using SPSS (version 23.0). Result: There were highly significant positive correlations between BP and BMI (p<0.0001); SBP and FT3, FT4 (p<0.005); and FT3 and FT4 (p<0.0001). A significant negative correlation was found between FT4 and TSH (p<0.05). Conclusion: The study found a significant relationship between BMI subgroups and blood pressure indices among the participants. There is a need to prevent weight gain in order to reduce the burden of hypertension. Small lifestyle changes such as regular physical activity and reduced dietary fat intake could help prevent obesity-associated hypertension.
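The Pearson correlation coefficient reported in such analyses can be computed from first principles as the covariance of two variables divided by the product of their standard deviations. A minimal Python sketch with hypothetical SBP/BMI pairs (illustrative values, not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical SBP (mmHg) and BMI (kg/m^2) pairs for illustration
sbp = [110, 118, 122, 130, 138, 145]
bmi = [20.1, 22.4, 23.0, 26.5, 28.2, 31.0]
r = pearson_r(sbp, bmi)  # strongly positive for this made-up sample
```

Values of r near +1 or -1 indicate a strong linear relationship; the p-values quoted in the abstract additionally test whether each r differs significantly from zero.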

64. Adverse Events Following Immunization by BCG Vaccine among Adults: A Prospective Study in a District of West Bengal, India
Aditya Prasad Sarkar, Panchanan Kundu, Sanjit Kumar Patra, Tanmoy Kumar Ghosh, Paramita Kundu, Saswata Saha
Abstract
Background: Tuberculosis remains a worldwide public health problem, and India has the highest number of TB patients. As per the National Immunization Schedule, BCG vaccine is given to infants at birth or otherwise within one year. One trial in India has shown the vaccine to be 80% effective over 20 years of follow-up. Objectives: i) to describe the socio-demographic characteristics of adults who were given BCG vaccine in a district of West Bengal, India; ii) to assess AEFI after BCG vaccination among them; and iii) to find out the association between AEFI and socio-demographic characteristics, if any. Materials & Methods: This was an observational longitudinal study conducted in a district of West Bengal, India from February 2025 to August 2025. A complete enumeration technique was used, and ultimately 12308 study subjects were included. Data were collected initially by the ANMs using a pretested, predesigned, interviewer-administered schedule, while follow-up was done by the ASHAs through house-to-house visits. Results: The majority of study subjects were less than sixty years of age (66.7%). Female vaccinees (54.3%) outnumbered males. Almost all participants were Hindu (97.9%). In total, 12308 persons were vaccinated, of whom 1105 vaccinees experienced at least one AEFI. AEFI was found more often in senior citizens (97.4%) and the difference was statistically significant (p<0.001). ASHA workers followed up the vaccinees through house-to-house visits on the 2nd, 14th, 28th, 32nd, and 84th days. An AEFI occurred in 95% of vaccinees on the 2nd day after immunization, in 84.6% on day 14, and in 83.1% on day 28. Redness and papules developed in 89% of cases, while 95.4% had local tenderness. Subsequently, pustules developed in 84.6% of cases, followed by development of abscesses on the 28th day. Ulcers developed in 80.7% of cases, whereas scars were seen in 76% of study subjects. Conclusion: Many minor AEFIs developed among the vaccinees, similar to those seen in infants.
Further such studies should be undertaken in different parts of the country to obtain a complete national picture.

65. Cognitive Impairment in Elderly Diabetics: Prevalence and Risk Factors
Tinish Sanjaybhai Nanavati, Harshad Radadiya, Roshani Savaliya
Abstract
Background: Cognitive impairment is a frequently under-recognized complication among elderly patients with type 2 diabetes mellitus (T2DM), significantly affecting daily functioning, treatment adherence, and quality of life. While global studies report varying prevalence, data specific to elderly diabetics in western India, particularly Gujarat, remain limited. This study aimed to determine the prevalence of cognitive impairment and identify its associated risk factors among elderly T2DM patients attending a tertiary care teaching centre in Gujarat. Material and Methods: A hospital-based cross-sectional study was conducted over a year at the outpatient department of general medicine at a tertiary care teaching hospital in Gujarat. A total of 250 elderly patients (aged ≥60 years) with confirmed T2DM for at least one year were enrolled using consecutive sampling. Cognitive function was assessed using the Montreal Cognitive Assessment (MoCA) tool, with a score <26 indicating impairment. Relevant sociodemographic, clinical, and biochemical data were collected through structured interviews and hospital records. Ethical approval was obtained from the Institutional Ethics Committee, and written informed consent was secured from all participants. Results: The overall prevalence of cognitive impairment was 42% (105/250). Significant associations were observed with advancing age, longer duration of diabetes, poor glycemic control (HbA1c >8%), hypertension, and lower educational status. Multivariate logistic regression identified age ≥70 years (AOR 3.2, 95% CI 1.8–5.7), diabetes duration >10 years (AOR 2.8, 95% CI 1.6–4.9), HbA1c >8% (AOR 2.4, 95% CI 1.3–4.5), and low education (AOR 2.1, 95% CI 1.2–3.8) as independent predictors. Conclusion: Cognitive impairment affects nearly two-fifths of elderly diabetics in this Gujarat cohort, highlighting the urgent need for routine cognitive screening in diabetes clinics. 
Early identification of modifiable risk factors could prevent progression to dementia and improve patient outcomes in resource-limited settings.

66. Evaluation of Radiological and Functional Outcomes of Femoral Neck Fractures Treated with Cannulated Cancellous Screws: An Observational Study
Muniraj Meena, Seema Meena, Pradeep Khinchi, Harish Kumar Jain
Abstract
Background: Fracture neck of femur is a common orthopaedic injury and remains difficult to manage because of the risk of complications such as non-union and avascular necrosis. These fractures are frequently associated with high-energy trauma in younger adults. Cannulated cancellous screw fixation is widely used for internal fixation as it allows stable fixation and preservation of the femoral head. Objective: To evaluate the radiological and functional outcomes of fracture neck of femur treated with cannulated cancellous screws. Materials and Methods: This prospective hospital-based observational study was conducted in the Department of Orthopaedics in Rajasthan from August 2020 to May 2022. A total of 25 patients aged 18–60 years with fracture neck of femur were included. All patients underwent closed reduction and internal fixation using three cannulated cancellous screws under fluoroscopic guidance, and were followed for eight months. Functional outcome was assessed using the Harris Hip Score at 24 weeks. Results: Most patients were in the 31–40 years age group (36%) and males constituted 64% of the study population. Road traffic accidents were the most common mode of injury (52%). Transcervical fractures were the most frequent anatomical type (56%) and Garden Type II fractures were the most common (44%). Fracture union was observed most commonly at 12 weeks (40%) with an average union time of 16 weeks. Functional outcome at 24 weeks showed excellent results in 72% of patients, good in 12%, fair in 8%, and poor in 8%. Most patients were able to ambulate without support (88%). Conclusion: Cannulated cancellous screw fixation provides satisfactory radiological union and functional outcome in fracture neck of femur.

67. Clinicopathological Study of Gall Bladder Specimen of Cholelithiasis
Shashi Ranjan Roy, Priya, K.M. Prasad, Dilip Kumar
Abstract
Background: Gallstone disease, one of the most common biliary disorders worldwide, is a major cause of morbidity in middle-aged women. When gallstones chronically irritate the gallbladder mucosa, histopathological changes ranging from chronic cholecystitis to malignancy can result. Clinicopathological and biochemical investigations can help identify disease origins and detect complications early. Methods: A prospective observational study was conducted at the Department of Pathology, Patna Medical College and Hospital (PMCH), from 2020 to 2021. A total of 100 gallstone-containing cholecystectomy specimens were examined. Gallstone biochemistry, morphology, histology, and clinical data were studied. Traditional histology stains and biochemical testing were performed. Chi-square and Student's t-tests were applied, with p-values below 0.05 considered significant. Results: The average age of the 100 patients was 42.4 years (range 17-74 years); 78% were female and 22% male. Chronic cholecystitis was the most common histological finding (72%), followed by acute-on-chronic cholecystitis (12%), adenocarcinoma (4%), cholesterolosis (3%), and xanthogranulomatous cholecystitis (3%). Mixed gallstones (48%) were the most common, followed by pigment (28%) and cholesterol (24%) stones. There was no correlation between stone type and histopathological pattern (χ² = 9.95, p = 0.445). Conclusion: Chronic cholecystitis is the most frequent pathology in cholelithiasis, which predominantly affects middle-aged women. All cholecystectomy tissues must be histopathologically examined to improve clinical outcomes and detect incidental premalignant or malignant lesions.

68. Antimicrobial Susceptibility Patterns Across Clinical Isolates in A Tertiary Center
G. J. Archana, B. Archana, G. Sowjanya
Abstract
Background: Antimicrobial resistance (AMR) has become a major global public health concern, particularly in tertiary care hospitals where extensive antibiotic use and invasive procedures promote the emergence of multidrug-resistant organisms. Continuous surveillance of antimicrobial susceptibility patterns is essential for guiding empirical therapy, improving antimicrobial stewardship, and preventing the spread of resistant pathogens. Methods: A prospective observational study was conducted in the Department of Microbiology at Government Medical College, Quthbullapur, Medchal–Malkajgiri, from November 2025 to February 2026. Clinically significant bacterial isolates obtained from various clinical specimens including blood, pus, respiratory samples, urine, and genital specimens were included. Bacterial identification was performed using standard microbiological techniques. Antimicrobial susceptibility testing was carried out by the Kirby–Bauer disc diffusion method according to CLSI guidelines. Resistance mechanisms such as ESBL, carbapenemase production, methicillin resistance, and vancomycin resistance were identified using phenotypic methods. Results: Gram-negative organisms including Klebsiella pneumoniae, Escherichia coli, Acinetobacter baumannii, and Pseudomonas aeruginosa predominated among isolates. High resistance rates were observed for beta-lactams (72%), aminoglycosides (68%), and carbapenems (65%). Acinetobacter species showed the highest resistance prevalence (75%). Among Gram-positive bacteria, methicillin-resistant Staphylococcus aureus and vancomycin-resistant Enterococcus were notable. Colistin and linezolid retained comparatively better activity against multidrug-resistant isolates. Conclusion: The study highlights a high prevalence of multidrug-resistant pathogens in a tertiary care setting. 
Regular surveillance of antimicrobial susceptibility patterns and implementation of effective antimicrobial stewardship programs are crucial to optimize treatment and control the spread of resistant organisms.

69. Knowledge, Attitudes, and Patterns of Tobacco Use among Adolescents in Mathura District, Uttar Pradesh: A Cross-Sectional Study
Pawar Akshay Shahaji, Manoj Kumar Singh, Saurabh Singh, Pankaj Kumar Jain
Abstract
Background: Tobacco use during adolescence is a major public health concern because initiation at a young age increases the risk of long-term nicotine dependence and adverse health outcomes. This study assessed the prevalence of tobacco use among adolescents and evaluated their knowledge and attitudes regarding tobacco use. Methods: This cross-sectional descriptive study included 300 adolescents aged 13–17 years in Mathura district, Uttar Pradesh. Participants were selected using a stratified random sampling method. Data were collected using a pretested and prevalidated structured questionnaire administered in English and Hindi. Descriptive statistics were used to summarise the data, and Pearson’s chi-square test was used to assess associations between categorical variables. A p value of less than 0.05 was considered statistically significant. Results: Ever use of tobacco was reported by 102 of 300 participants (34.0%). Chewable tobacco and cigarettes were the most commonly reported products, each used by 29 participants (9.7%), followed by smokeless tobacco products such as gutkha, mawa, and jarda in 20 participants (6.7%). Daily tobacco use was reported by 40 participants (13.3%). Tobacco use increased significantly with age, from 16.5% among those aged 13–14 years to 48.7% among those aged 17 years. No significant association was observed between sex and ever tobacco use. Friends or family members using tobacco were reported by 60.0% of participants, 85.7% reported tobacco products to be easily or very easily available, and media influence was reported by 70.7%. Observational exposure to tobacco use was significantly associated with ever tobacco use (p=0.036), whereas awareness of health risks and belief that tobacco is harmful were not significantly associated with tobacco use (p=1.00 for both). Conclusion: Tobacco use was common in this adolescent population despite substantial awareness of harm. 
Social exposure, easy availability, and increasing age appeared to be important correlates, indicating the need for focused preventive and cessation-oriented interventions for adolescents.

70. Prevalence, Pattern, and Spectacle Utilization of Refractive Errors among School-Going Children in Mathura: A Cross-Sectional Study
Ravi Soni, Paridhi Gupta, Nidhi Jain, Meemansha Maheshwari
Abstract
Background: Refractive errors are a common and potentially correctable cause of visual impairment in school-aged children. Early detection is important because uncorrected errors can affect visual function and school performance. Methods: This cross-sectional study included 129 school-going children aged 6-14 years from selected rural schools in Mathura. Visual acuity screening was performed using Snellen charts. Children with suspected visual impairment underwent autorefraction, retinoscopy, and cycloplegic refraction. Demographic data and information on prior diagnosis, spectacle ownership, spectacle use, and barriers to spectacle use were collected using a structured questionnaire. Data were analyzed using descriptive statistics, chi-square testing, and logistic regression. A p value of less than 0.05 was considered statistically significant. Results: Of 129 children, 67 (51.9%) were male and 62 (48.1%) were female. The overall prevalence of refractive errors was 19.4% (25/129). Myopia was the most common refractive error, affecting 16 children (64.0%), followed by astigmatism in 12 (48.0%) and hyperopia in 4 (16.0%). Prevalence increased significantly with age, from 7.3% in children aged 6-8 years to 32.6% in those aged 12-14 years (p=0.018). The difference by sex was not statistically significant (p=0.291). The mean spherical equivalent among affected children was -1.12 ± 1.65 D. Eighteen affected children (72.0%) were newly diagnosed during screening. Only 6 of 25 affected children (24.0%) owned spectacles, and regular use was reported by 2 of 6 children (33.3%) who owned them. Conclusion: Refractive errors were present in nearly one-fifth of school-going children in rural Mathura, with myopia as the predominant type. Older age was significantly associated with refractive errors. Underdiagnosis, low spectacle ownership, and poor regular spectacle use indicate an important unmet need for school-based vision screening and access to corrective services.

71. Bloodstream Infection Trends Before and After the COVID-19 Pandemic
Dipti Lal, Sushant Suman, Sanjay Kumar, Rajesh Kumar, Satyendu Sagar, Wasim Ahmad
Abstract
Background: The COVID-19 pandemic has had a substantial impact on the epidemiology of bloodstream infections (BSIs), as seen in the rising rates of infection and antibiotic resistance reported globally. Aims: To examine patterns of antibiotic resistance, pathogen distribution, and bloodstream infection trends before and during the COVID-19 pandemic. Methods: Three thousand patients (1500 pre-COVID and 1500 post-COVID) were included in a retrospective cohort study at a tertiary care hospital in Bihar, India. Blood culture data were examined for positivity rates, microbiological profiles, and antibiotic resistance. SPSS and WHONET were used for statistical analysis, and p < 0.05 was deemed significant. Results: Blood culture positivity showed a rising trend over time, from 28% pre-COVID to 34% post-COVID. Escherichia coli, Klebsiella pneumoniae, and Pseudomonas aeruginosa were the most frequently isolated gram-negative organisms. The post-COVID era saw a rise in infections linked to intensive care units. Antimicrobial resistance and multidrug-resistant (MDR) isolates increased significantly (from 35.7% to 47%). There was also a minor rise in fungal infections due to Candida species. Conclusion: The study shows that bloodstream infections and antibiotic resistance have significantly increased in the post-COVID period, especially in intensive care units. To combat the rising incidence of BSIs and AMR, it is crucial to strengthen infection control protocols, antibiotic stewardship, and ongoing surveillance.

72. Comparative Antibiotic Sensitivity Patterns in Clinical vs Environmental E. coli Isolates
Dipti Lal, Sushant Suman, Sanjay Kumar, Rajesh Kumar, Satyendu Sagar, Babita
Abstract
Background: Escherichia coli is a significant contributor to human infections and a key indicator of environmental fecal pollution. Effective treatment and public health surveillance are threatened by increasing antimicrobial resistance (AMR), especially multidrug resistance (MDR), which has been extensively documented in clinical and environmental E. coli. This investigation compared the Multiple Antibiotic Resistance (MAR) index, MDR prevalence, and antibiotic sensitivity patterns of clinical and environmental E. coli isolates from a single tertiary-care setting. Materials & Methods: From February 2025 to January 2026, a cross-sectional, laboratory-based comparative investigation was carried out at Nalanda Medical College and Hospital in Patna, Bihar. A total of 200 non-duplicate E. coli isolates were examined: 100 from environmental sources (water, soil, hospital surfaces, and food/animal products) and 100 from clinical specimens (urine, pus/wound swabs, stool, and blood). Identification was carried out using commercial systems or routine biochemical testing after culture on MacConkey/EMB agar. In accordance with CLSI/EUCAST recommendations, antimicrobial susceptibility testing was performed using Kirby-Bauer disc diffusion on Mueller-Hinton agar. MDR was defined as non-susceptibility to at least one agent in at least three antimicrobial classes. The MAR index was determined by dividing the number of antibiotics to which an isolate was resistant by the total number of antibiotics tested. Chi-square tests were used to analyze the data, and p<0.05 was deemed significant. Results: Clinical isolates showed significantly lower susceptibility than environmental isolates to first-line drugs such as ampicillin (30% vs. 55%), amoxicillin–clavulanate (45% vs. 65%), ceftriaxone (50% vs. 70%), ciprofloxacin (40% vs. 68%), and gentamicin (65% vs. 80%) (p<0.05 for all). Both groups remained highly susceptible to more expensive medications such as amikacin, nitrofurantoin, and imipenem, with no statistically significant differences. MDR was more common in clinical isolates than environmental isolates (55% vs. 30%, p<0.05), and resistance rates were consistently higher among clinical isolates. Clinical isolates were more likely to have MAR index values >0.5 (40% vs. 15%), while environmental isolates were more likely to have low MAR values (<0.2) (50% vs. 20%). Conclusion: Compared to environmental isolates, clinical E. coli isolates showed significantly higher resistance, MDR prevalence, and MAR indices, indicating increased antibiotic selection pressure in healthcare settings. However, substantial MDR in environmental isolates highlights the importance of prudent antibiotic use and integrated One Health surveillance to prevent the spread of resistant E. coli across clinical and environmental reservoirs.
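The two summary measures defined in this abstract are simple to compute per isolate. The sketch below (not the authors' code; the susceptibility profile is hypothetical) applies the stated definitions: MDR as non-susceptibility to at least one agent in at least three antimicrobial classes, and MAR index as the number of antibiotics resisted divided by the number tested.

```python
# Minimal sketch of the MDR and MAR-index definitions from the abstract.
# The susceptibility profile below is hypothetical: antibiotic -> (class, resistant?)
profile = {
    "ampicillin": ("penicillins", True),
    "amoxicillin-clavulanate": ("penicillins", True),
    "ceftriaxone": ("cephalosporins", True),
    "ciprofloxacin": ("fluoroquinolones", True),
    "gentamicin": ("aminoglycosides", False),
    "amikacin": ("aminoglycosides", False),
    "nitrofurantoin": ("nitrofurans", False),
    "imipenem": ("carbapenems", False),
}

def mar_index(profile):
    """Antibiotics resisted / antibiotics tested."""
    resisted = sum(1 for _, resistant in profile.values() if resistant)
    return resisted / len(profile)

def is_mdr(profile):
    """Resistant to >=1 agent in >=3 distinct antimicrobial classes."""
    resistant_classes = {cls for cls, resistant in profile.values() if resistant}
    return len(resistant_classes) >= 3

print(mar_index(profile))  # 4 resisted of 8 tested -> 0.5
print(is_mdr(profile))     # resistant in 3 classes -> True
```

An isolate like this one, with a MAR index of 0.5, would fall into the >0.5 boundary group discussed in the Results; values above 0.2 are conventionally read as indicating a high-risk source of antibiotic exposure.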

73. Spectrum of Lymph Node by using FNAC in population of West Champaran of Bihar
Rimjhim Kumari, Rabindra Nath Prasad, Rakhi Kumari, Pradeep Kumar Singh
Abstract
Background: Lymphadenopathy is a common clinical issue with a broad range of causes, from benign reactive conditions to infections and cancers. Fine-needle aspiration cytology (FNAC) is a quick, easy, and affordable first-line diagnostic method for assessing lymph node enlargement, particularly in resource-limited settings, yet regional data from West Champaran, Bihar, are scarce. The objective was to examine the cytomorphological spectrum of lymph node lesions by FNAC in patients attending the outpatient department (OPD) of Government Medical College, Bettiah, West Champaran, Bihar, and to analyze patterns pertaining to age, sex, and lymphadenopathy site. Methods: This prospective, observational study was conducted by the Department of Pathology at Government Medical College, Bettiah, from March 30, 2025, to February 28, 2026. A total of 117 consecutive patients with palpable lymphadenopathy were included. FNAC was carried out under aseptic conditions with 22–23 G needles; smears were stained with Papanicolaou and May–Grünwald–Giemsa stains, with Ziehl–Neelsen staining when tuberculosis was suspected. The Sydney System was used to classify cases as inadequate (L1), benign (L2), atypical (L3), suspicious (L4), or malignant (L5). Descriptive analysis was performed. Results: The 117 patients had a nearly equal sex distribution and ranged in age from 3 to 82 years (mean ≈ 32 years). As in other Indian series, cervical lymph nodes were the most commonly affected (approximately 70%), followed by axillary and inguinal nodes. Reactive lymphadenitis (~41%), tuberculous/granulomatous lymphadenitis (~33%), and suppurative lymphadenitis (~6%) were the most frequently diagnosed benign lesions. Consistent with previous reports, malignant lesions accounted for approximately 10% of cases, with metastatic carcinoma outnumbering lymphoma and the majority of malignancies occurring in patients over 40. Only about 4% of smears were inadequate or non-diagnostic. 
Conclusion: FNAC shows that the most common causes of lymphadenopathy in the West Champaran population are reactive and tuberculous lymphadenitis, with metastatic cancer and lymphoma making up a smaller but clinically significant percentage. The results support national trends and demonstrate FNAC as a crucial, minimally invasive, and reasonably priced first-line test for lymphadenopathy triaging in this resource-constrained area.

74. Study of Association of Proteinuria with HbA1c in Diabetes Mellitus
Rimjhim Kumari, Rakhi Kumari, Rabindra Nath Prasad, Pradeep Kumar Singh
Abstract
Proteinuria is an early indicator of diabetic nephropathy, one of the microvascular complications caused by persistent hyperglycemia in type 2 diabetes mellitus (T2DM). This hospital-based cross-sectional study assessed the relationship between proteinuria and glycated hemoglobin (HbA1c) in T2DM patients enrolled at Government Medical College, Bettiah, Bihar. Aim: To examine the relationship between proteinuria and HbA1c levels in patients with type 2 diabetes mellitus (T2DM) receiving treatment at a tertiary care hospital in Bettiah, Bihar. Materials & Methods: This hospital-based cross-sectional observational study was conducted in the Department of Medicine, Government Medical College, Bettiah, from 30 March 2025 to 28 February 2026, and included 100 T2DM patients (≥18 years) attending outpatient and inpatient services. Patients with non-diabetic kidney disease, acute kidney injury, urinary tract infection, nephrotic syndrome, pregnancy, malignancy, congestive heart failure, chronic liver disease, or on nephrotoxic drugs were excluded. Clinical data (age, sex, duration of diabetes, treatment, blood pressure, comorbidities) were recorded. Fasting and, when available, postprandial blood glucose, HbA1c by standardized immunoassay, and serum urea/creatinine were measured. Proteinuria was assessed using urinary albumin/albumin–creatinine ratio or 24-hour urine protein, with dipstick/spot tests for screening. Patients were categorized by HbA1c (<7%, 7–8.9%, ≥9%) and by proteinuria status (normoalbuminuria vs micro/macroalbuminuria). Appropriate statistical tests were applied; p<0.05 was considered significant. Ethical approval and written informed consent were obtained. Results: Mean age was 52±8 years; 62% were male. Mean diabetes duration was 6.2±2.5 years and mean HbA1c 8.1±1.2%. HbA1c showed a weak to moderate positive correlation with proteinuria (r≈0.17–0.45, p<0.05 to <0.001). 
Mean HbA1c was ~9% in proteinuric vs ~7.5% in non-proteinuric patients (p<0.001). Microalbuminuria prevalence increased from about 15–20% at HbA1c <7% to 50–75% at HbA1c ≥8%, reaching up to 73.3% at HbA1c ≥9%. Higher HbA1c was associated with increased serum creatinine and urea and reduced eGFR. Conclusion: HbA1c is a useful surrogate marker of early diabetic nephropathy and renal risk stratification, as evidenced by the significant correlation between poor glycemic control and higher prevalence and severity of proteinuria. To prevent or postpone diabetic kidney disease, it is essential to maintain a HbA1c of less than 7% and to regularly screen for albuminuria, particularly in patients with elevated HbA1c.

75. Early Detection of Cerebral Palsy in Infants and Young Children Using the Denver Developmental Screening Test (DDST-II): A Hospital-Based Prospective Observational Study from Eastern India
Alok Ranjan, Ravi Shekhar, Satish Kumar, Ankur Priyadarshi
Abstract
Background: Cerebral palsy (CP) is a leading cause of childhood motor disability, and earlier identification enables timely referral to targeted early intervention during peak neuroplasticity. In many low- and middle-resource settings, access to specialized tools (e.g., General Movements Assessment or structured neurological examinations) may be limited, increasing the importance of feasible developmental screening approaches in routine pediatric services. Aim: To evaluate the clinical utility and diagnostic performance of DDST-II for early detection of CP among infants and young children attending a tertiary-care teaching hospital. Methods: This prospective observational study enrolled 115 infants/young children (0–24 months) attending pediatric services at Jawaharlal Nehru Medical College & Hospital, Bhagalpur, Bihar, India, between 25 February 2025 and 30 January 2026. DDST-II screening was performed by trained examiners across four domains. “Suspect/Untestable” screens were considered positive. CP diagnosis was confirmed by pediatric neurology assessment with supportive clinical/imaging correlation where available. Diagnostic indices and multivariable logistic regression were performed. Results: CP was confirmed in 28/115 (24.3%) children. DDST-II screen positivity was observed in 40/115 (34.8%). Screen positivity demonstrated sensitivity 85.7%, specificity 81.6%, PPV 60.0%, and NPV 94.7% for CP detection. Gross motor delay predominated among CP cases (Figure 1). In multivariable analysis, NICU admission, low birth weight, neonatal seizures, and birth asphyxia were independently associated with CP (Table 4). Conclusion: DDST-II, when integrated into a structured referral pathway, showed high sensitivity and strong rule-out value (high NPV) for CP detection. In resource-constrained settings, DDST-II can support earlier identification of high-risk children and prompt referral for confirmatory assessment and early intervention.
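The diagnostic indices reported in this abstract follow directly from the 2x2 screening table. As a sketch (not the authors' code), the counts implied by the published figures are TP=24 and FN=4 (28 confirmed CP cases with 85.7% sensitivity), FP=16 (40 screen positives), and TN=71 (87 children without CP):

```python
# Sketch of the standard diagnostic indices reported in the abstract,
# reconstructed from the published counts (28/115 CP-confirmed, 40/115
# screen-positive). The 2x2 cell values are inferred, not taken from Table 4.
def diagnostic_indices(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),  # true positives among all CP cases
        "specificity": tn / (tn + fp),  # true negatives among non-CP children
        "ppv": tp / (tp + fp),          # CP probability given a positive screen
        "npv": tn / (tn + fn),          # no-CP probability given a negative screen
    }

idx = diagnostic_indices(tp=24, fp=16, fn=4, tn=71)
for name, value in idx.items():
    print(name, round(100 * value, 1))  # 85.7, 81.6, 60.0, 94.7
```

These reconstructed cells reproduce all four reported percentages, which is what supports the abstract's "strong rule-out value" claim: a negative DDST-II screen left only about a 5% residual probability of CP in this cohort.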

76. Effectiveness of Non-Pharmacological Interventions (Oral Glucose and Non-Nutritive Sucking) on Procedural Pain in Neonates: A Randomized Controlled Trial
Alok Ranjan, Ravi Shekhar, Satish Kumar, Ankur Priyadarshi
Abstract
Background: Neonates undergo repeated minor painful procedures (heel lance, venipuncture) during early hospital care. Untreated pain is associated with physiologic instability and potential adverse neurodevelopmental consequences. Sweet-tasting solutions (sucrose/glucose) and non-nutritive sucking (NNS) are widely recommended non-pharmacological analgesic options, yet comparative effectiveness and pragmatic implementation data from Indian tertiary-care contexts remain limited. Aim: To compare oral glucose, NNS, and combined glucose+NNS versus routine comfort measures in reducing procedural pain in neonates. Methods: Single-center, parallel-group randomized controlled trial including 100 neonates requiring heel lance or venipuncture. Participants were randomized (1:1:1:1) into: (i) routine comfort, (ii) oral glucose 25%, (iii) NNS, (iv) glucose+NNS. Pain was assessed by Premature Infant Pain Profile-Revised (PIPP-R) at pre-procedure, during procedure, 30 seconds, and 60 seconds. Primary outcome: PIPP-R at 30 seconds. Secondary outcomes: crying duration, heart rate (HR) change, oxygen saturation (SpO₂) change, and adverse events. Results: Baseline characteristics were comparable across groups (Table 1). Mean PIPP-R at 30 seconds differed significantly across groups (one-way ANOVA p < 0.001), lowest in glucose+NNS. Crying duration and physiologic reactivity also favored glucose+NNS (all p < 0.001). Repeated-measure trajectory demonstrated faster pain resolution with combined intervention. No serious adverse events were observed; minor gagging/desaturation was rare and self-limited. Conclusion: Oral glucose and NNS are effective non-pharmacological analgesic strategies for minor neonatal procedures. Combined glucose+NNS provides superior analgesia and quicker recovery compared with either intervention alone.

77. Growth Outcomes and Feeding Tolerance in Preterm Infants: A Comparison Between Fortified Human Milk and Preterm Formula
Ravi Shekhar, Alok Ranjan, Satish Kumar, Ankur Priyadarshi
Abstract
Background: Preterm infants frequently experience extrauterine growth restriction and feeding intolerance. Human milk is biologically advantageous, but unfortified milk may not meet nutrient needs; hence fortification is recommended to improve growth while preserving gastrointestinal tolerance and protection against necrotizing enterocolitis (NEC) and infection. However, in resource-variable settings, preterm formula remains common, and comparative outcomes in routine NICU practice require context-specific evaluation. Aim: To compare growth outcomes and feeding tolerance in preterm infants fed fortified human milk (FHM) versus preterm formula (PF). Methods: A prospective comparative cohort study was conducted in the NICU of Jawaharlal Nehru Medical College & Hospital, Bhagalpur, enrolling 115 preterm infants during 10 Feb 2025–25 Jan 2026. Infants received either FHM (mother’s expressed milk fortified per unit protocol) or PF. Primary outcomes were time to full enteral feeds and feeding intolerance. Secondary outcomes included growth velocities, NEC (≥stage II), late-onset sepsis, length of stay, and discharge anthropometry. Multivariable regression adjusted for gestational age, birthweight, SGA status, and sepsis. Results: Of 115 infants, 60 received FHM and 55 received PF. Baseline characteristics were comparable. FHM achieved earlier full feeds (9.75 ± 3.18 vs 13.60 ± 3.43 days; p<0.001) and fewer intolerance episodes (median 1.0 vs 2.0; p<0.001). Feeds held ≥24h were lower with FHM (23.3% vs 50.9%; p=0.004). NEC ≥II (3.3% vs 10.9%; p=0.150) trended lower with FHM. PF showed higher unadjusted weight gain velocity (16.60 ± 2.55 vs 15.50 ± 2.80 g/kg/day; p=0.029), while FHM showed better length gain (1.08 ± 0.19 vs 0.95 ± 0.19 cm/week; p=0.001). In adjusted analysis, PF was not independently associated with higher weight gain (β 0.95, 95% CI −0.11 to 2.01; p=0.079), but remained associated with more intolerance episodes (IRR 1.67, 95% CI 1.25–2.21; p<0.001). 
Conclusion: In this cohort, fortified human milk improved feeding tolerance and accelerated attainment of full feeds, with comparable adjusted weight gain and signals toward reduced morbidity. These findings support guideline-concordant prioritization of human milk with appropriate fortification in preterm care.
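The weight-gain velocities above are expressed in g/kg/day. The abstract does not state which formula the unit used; one widely used option is the exponential (Patel) method, sketched below with hypothetical weights:

```python
# Illustrative sketch of the exponential (Patel) method for weight-gain
# velocity in g/kg/day. This is an assumption for illustration: the study
# does not specify its formula, and the infant weights below are hypothetical.
import math

def weight_gain_velocity(w_start_g, w_end_g, days):
    """Exponential weight-gain velocity: 1000 * ln(Wend / Wstart) / days."""
    return 1000.0 * math.log(w_end_g / w_start_g) / days

# Hypothetical preterm infant growing from 1200 g to 1650 g over 20 days
print(round(weight_gain_velocity(1200, 1650, 20), 1))  # 15.9 g/kg/day
```

Values in this range are comparable to the 15.5-16.6 g/kg/day means reported for the two feeding groups, which is why small between-group differences can lose significance after adjustment.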

78. Impact of Early Caffeine Therapy on White Matter Development in Extremely Low Birth Weight Infants: A Prospective Cohort Study from a Tertiary Care Center in Eastern India
Ravi Shekhar, Alok Ranjan, Satish Kumar, Ankur Priyadarshi
Abstract
Background: Extremely low birth weight (ELBW) infants are at high risk of diffuse white matter injury and dysmaturation, which contribute substantially to later neurodevelopmental impairment. Diffusion tensor imaging (DTI) at term-equivalent age (TEA) provides sensitive microstructural biomarkers of white matter maturation. Aim: To evaluate the association between early caffeine therapy (≤24 h of life) and white matter microstructural development at TEA in ELBW infants. Methods: Prospective cohort study conducted at Jawaharlal Nehru Medical College & Hospital, Bhagalpur, Bihar, India (10 February 2025–25 January 2026). ELBW infants receiving caffeine were grouped as early (≤24 h) vs late (>24 h) initiation. A standard caffeine citrate regimen was used (loading 20 mg/kg; maintenance 5–10 mg/kg/day). TEA MRI with DTI was performed where feasible. Primary outcome: TEA DTI white matter composite fractional anisotropy (FA) z-score; secondary outcomes included regional FA, mean diffusivity (MD), severe white matter injury (WMI) on qualitative MRI scoring, and major neonatal morbidities. Multivariable regression adjusted for gestational age, birth weight, sex, antenatal steroids, BPD, severe IVH, and late-onset sepsis. Results: Among 115 ELBW infants (early n=60; late n=55), TEA MRI/DTI was obtained in 100. Early caffeine was associated with higher FA composite (adjusted β≈0.31) and lower MD composite (adjusted β≈−0.43), with the strongest regional effects in the posterior limb of the internal capsule. Early caffeine was also associated with shorter ventilation duration and lower severe IVH. Conclusion: Early caffeine therapy may be associated with improved TEA white matter microstructure in ELBW infants. Randomized trials and robust causal inference approaches are needed to confirm neuroprotective effects and identify optimal timing/dose.

79. Role of Vitamin D in Health and Diseases in Children: A Hospital-Based Observational Study
Ravi Shekhar, Alok Ranjan, Satish Kumar, Ankur Priyadarshi
Abstract
Background: Vitamin D is essential for skeletal mineralization and has immunomodulatory effects that may influence infections, wheeze/asthma, anemia, and growth in children. Despite abundant sunlight, vitamin D deficiency remains common in South Asia. Aim: To estimate vitamin D status in children attending JNMCH Bhagalpur and evaluate associations with selected clinical and biochemical outcomes. Methods: Hospital-based observational study of 120 children (1–18 years) recruited from 20 February 2025 to 15 January 2026. Demographic, dietary and sunlight exposure history, anthropometry, and clinical assessment were recorded. Serum 25-hydroxyvitamin D [25(OH)D] and relevant biochemical markers were assessed. Vitamin D categories were defined using standard pediatric cut-offs. Associations with recurrent acute respiratory infections (ARI), wheeze/asthma, anemia, and clinical rickets signs were evaluated using bivariate tests and multivariable logistic regression. Results: Mean 25(OH)D was 17.6 ± 9.5 ng/mL; 63.3% had 25(OH)D <20 ng/mL. Deficiency was higher in winter (76.7% vs 48.3%, p=0.0026). Severe deficiency was strongly associated with clinical rickets signs (p<0.001) and higher alkaline phosphatase. Hemoglobin differed significantly across vitamin D categories (ANOVA p=0.0012), and vitamin D deficiency independently predicted anemia (adjusted OR 3.75; 95% CI 1.48–9.53; p=0.005). Conclusion: Vitamin D deficiency was highly prevalent in this hospital-based pediatric sample, with clinically meaningful associations with rickets phenotype and anemia. Targeted screening and guideline-based supplementation for high-risk children may be warranted.

80. Total Thyroidectomy versus Hemithyroidectomy: A Comparative Study of Complications and Surgical Outcomes
Syeda Ayesha, Syeda Nahidunnisa, Heeba Mohammed Ghouse, Humaira Shaikh
Abstract
Introduction: Thyroidectomy is a commonly performed surgical procedure for the management of benign and malignant thyroid disorders. The two principal approaches, total thyroidectomy and hemithyroidectomy, differ in extent of resection and are associated with varying complication profiles. Understanding these differences is essential for optimal surgical decision-making. The study aimed to compare the complication rates between total thyroidectomy and hemithyroidectomy and to evaluate associated operative and postoperative outcomes. Materials and Methods: This hospital-based comparative observational study was conducted in the Department of ENT and Head and Neck Surgery at Deccan College of Medical Sciences, Hyderabad, from January 2025 to December 2025. A total of 50 patients undergoing thyroid surgery were included and divided into two groups: total thyroidectomy (n=25) and hemithyroidectomy (n=25). Demographic and clinical variables were recorded. Postoperative complications including hypocalcaemia, recurrent laryngeal nerve injury, haemorrhage, and wound infection were assessed. Statistical analysis was performed using SPSS version 26.0, with p<0.05 considered significant. Results: The overall complication rate was significantly higher in the total thyroidectomy group (44.0%) compared to the hemithyroidectomy group (20.0%) (p=0.04). Transient hypocalcaemia was significantly more frequent following total thyroidectomy (32.0% vs. 4.0%, p=0.01). No significant differences were observed in permanent hypocalcaemia, recurrent laryngeal nerve injury, hemorrhage, or wound infection. The mean duration of surgery and hospital stay were significantly higher in the total thyroidectomy group (p<0.001). Conclusion: Total thyroidectomy is associated with a higher complication rate, particularly hypocalcaemia, along with increased operative time and hospital stay compared to hemithyroidectomy. 
Careful patient selection and surgical planning are essential to balance treatment efficacy and safety.

81. Microbiological Study of Septicemia in a Tertiary Care Teaching Hospital, Kachchh
Ronak Pradipbhai Chauhan, Hitesh Assudani, Krupali Kothari
Abstract
Background: Septicemia remains a major cause of morbidity and mortality worldwide, with increasing concerns regarding antimicrobial resistance. Aim: To identify the common bacterial pathogens associated with septicemia in adult patients and analyze their antibiotic susceptibility patterns. Methods: A hospital-based prospective observational study was conducted on 150 clinically suspected adult septicemia cases. Blood cultures were processed using standard microbiological techniques; identification of the isolated organisms and antimicrobial susceptibility testing were performed using the VITEK 2 system. Results: Culture positivity was observed in 73.3% of cases. Gram-negative organisms (60%) predominated, with Escherichia coli (29.1%) being the most common isolate. High resistance was observed to commonly used antibiotics, including gentamicin and aztreonam, along with significant resistance to carbapenems. Conclusion: The study highlights a predominance of Gram-negative pathogens and an alarming rise in antimicrobial resistance in septicemia, emphasizing the need for continuous surveillance and rational antibiotic use.

82. Prevalence of Non-Alcoholic Fatty Liver Disease in Type 2 Diabetes Mellitus and Its Association with Diabetic Complications: A Cross-Sectional Study
Madhuri Mangharam Alwani, Komal Rana, Renukaben Maheshbhai Vasava, Kavyakumar Pareshkumar Patel
Abstract
Background: Non-alcoholic fatty liver disease (NAFLD) is increasingly recognized as a major comorbidity in patients with type 2 diabetes mellitus (T2DM), contributing significantly to both hepatic and extrahepatic complications. Aim: To determine the prevalence of NAFLD in patients with T2DM and to evaluate its correlation with associated complications and metabolic risk factors. Methods: A hospital-based cross-sectional study was conducted on 150 patients with T2DM. Clinical, biochemical, and ultrasonographic evaluations were performed to diagnose NAFLD. Statistical analysis was carried out using SPSS version 25.0, and associations were tested using Chi-square and t-tests, with p < 0.05 considered significant. Results: The prevalence of NAFLD was 62.0%. Higher prevalence was observed in the 51–60 years age group (65.4%). NAFLD showed significant association with central obesity (73.2%, p = 0.003), elevated ALT levels (74.4%, p = 0.002), and metabolic syndrome (77.8%, p < 0.001), while no significant association was found with gender (p = 0.532). Conclusion: NAFLD is highly prevalent among patients with T2DM and is strongly associated with metabolic risk factors. Early screening and comprehensive management strategies are essential to prevent disease progression.

83. A Profile of Clinical Features and Outcomes in Snakebite Envenomation Patients at Bheri Hospital, Nepal
Sanket Kumar Risal, Urmila Parajuli, Dinesh Kumar Choudhary, Paras Shrestha
Abstract
Snakebite envenomation is a major public health issue in many parts of the world, especially in rural regions where access to medical care is limited. Nepal, with its vast rural areas and diverse ecosystems, is particularly vulnerable to snakebite incidents, which cause significant morbidity and mortality. Despite being a critical health problem, there remains a scarcity of comprehensive, localized data from specific hospitals, particularly Bheri Hospital in Nepal, which serves a large and diverse population. This review aims to assess the clinical features and outcomes of snakebite envenomation patients treated at Bheri Hospital, with the goal of providing a detailed understanding of the disease patterns, treatment efficacy, and potential improvements in healthcare practices.
Bheri Hospital, located in the Banke district of western Nepal, is a central healthcare facility in a region where snakebites are prevalent due to the rural setting and proximity to habitats of venomous snakes. Although there are studies on snakebites in other parts of Nepal and South Asia, specific data from Bheri Hospital is limited. Understanding the local epidemiology of snakebites, including the species involved, the timing of treatment, and the clinical presentations, is essential for improving patient outcomes. By reviewing the available clinical records, case reports, and patient outcomes, this study seeks to provide a comprehensive assessment of the hospital’s approach to snakebite envenomation.
The review will focus on key clinical features of snakebites, such as local symptoms (pain, swelling, necrosis), systemic manifestations (hemotoxicity, neurotoxicity), and complications like renal failure, coagulopathy, and shock. Identifying the most common snake species responsible for envenomation in the region, the severity of their bites, and the resulting clinical manifestations is vital for tailoring effective treatment protocols. Given the diversity of snake species in Nepal, there is a need to assess whether specific regional characteristics affect clinical outcomes, such as the prevalence of bites from species like the cobra, krait, or pit viper, each of which has a different venom composition and clinical presentation.
Another important aspect of the review is to evaluate the outcomes of snakebite victims in relation to the timeliness and appropriateness of medical interventions. In many rural areas, snakebite victims face delays in receiving treatment due to geographic isolation, lack of awareness, and insufficient medical resources. This review will assess the average time to treatment, the use of antivenom, and any challenges encountered in the management of these patients. It will also evaluate whether the hospital’s resources, including the availability of antivenom and the capacity of healthcare workers to administer appropriate care, influence patient outcomes.
Furthermore, this review will aim to identify potential gaps in care and areas for improvement in the clinical management of snakebite envenomation at Bheri Hospital. By analyzing trends in treatment delays, complications, and mortality rates, the study will highlight areas where improvements in healthcare infrastructure, staff training, and resource availability could enhance patient care. For example, if delays in the administration of antivenom are found to correlate with worse outcomes, recommendations can be made for improving access to treatment or raising public awareness about the importance of seeking prompt medical care.

84. Spectrum of Anaemia Cases in a Tertiary Care Hospital
Rimjhim Kumari, Rakhi Kumari, Rabindra Nath Prasad, Pradeep Kumar Singh
Abstract
Introduction: Anaemia remains a major public health problem in India, with diverse morphological patterns and aetiologies that vary across clinical settings. Hospital-based research, especially from tertiary care facilities, offers crucial information about the range of anemia severity, underlying causes, and diagnostic correlations. Aims: To determine the distribution of anemia by aetiology, morphological type, and severity, and to investigate the relationship between red cell morphology and underlying causes in patients receiving tertiary care. Materials & Methods: This cross-sectional study was conducted at Government Medical College, Bettiah. A total of 384 patients diagnosed with anemia were included. Based on hemoglobin levels, anemia was categorized as mild, moderate, or severe. Red cell indices and peripheral smear analysis were used for morphological classification (microcytic, normocytic, and macrocytic). Relevant laboratory tests, such as vitamin assays and iron studies, were used to determine the aetiology. Data were analyzed to identify distribution patterns and associations between morphology, severity, and aetiology. Results: The most prevalent severity category was moderate anemia (51.6%), followed by severe anemia (21.9%) and mild anemia. The most common morphological type was microcytic anemia (56.8%), followed by normocytic anemia (31.8%) and macrocytic anemia (11.5%). Iron deficiency was the most common cause (58.3%), followed by vitamin B12 deficiency (16.1%), anemia of chronic disease (13.5%), folate deficiency (4.7%), and other causes. A strong association was observed between microcytic morphology and iron deficiency, macrocytic morphology and vitamin B12/folate deficiency, and normocytic morphology and chronic disease or haemolysis. Moderate anemia predominated across most morphological and aetiological categories.
Conclusion: In this tertiary-care population, moderate anemia, microcytic morphology, and iron deficiency were the most common patterns. The higher proportion of severe and macrocytic anemia reflects referral bias toward more complex cases. The strong correlation between morphology and aetiology underscores the value of red cell indices and peripheral smear examination as useful tools in the initial diagnostic evaluation of anemia.

85. Assessment of Quality of Antenatal Care Services in Public Health Facilities Using Donabedian Model
Rajeev Kumar Ranjan, Vijay Kumar, Aamir Saeed, Surendra Prasad Singh
Abstract
Background/Introduction: This study aimed to evaluate the quality of antenatal care (ANC) services in a tertiary care public hospital in Bihar using the Donabedian model (structure, process, outcome), and to determine the obstetric and sociodemographic factors associated with the adequacy of ANC. Materials and Methods: A cross-sectional study was carried out at Government Medical College, Bettiah, Bihar, from February 15 to August 25, 2025. Eighty pregnant or recently delivered women (within six weeks postpartum) who were either admitted to obstetric wards or attended the ANC clinic were enrolled consecutively. Data were gathered from Mother and Child Protection cards, medical records, and a pre-tested semi-structured questionnaire based on the Donabedian framework and WHO ANC guidelines. Both structural (staff, medications, vaccines, equipment) and process (number of visits, examinations, laboratory tests, counseling) indicators were recorded. Data were analyzed using SPSS with frequencies, percentages, means, and Chi-square tests; p<0.05 was considered significant. Results: The majority of participants were multigravida (57.5%), from rural areas (65%), and between the ages of 20 and 24 (37.5%). IFA tablets (85%), TT vaccine (87.5%), and paramedical personnel (90%) were readily available, and 80% of respondents indicated that doctors were available. BP measurement (92.5%) and weight recording (87.5%) were the most common process indicators, whereas Hb estimation (72.5%), urine examination (65%), and counseling (60%) were less common. Only 52.5% had at least four ANC visits. Higher education (p=0.02) and urban residence (p=0.04) were significantly associated with adequate ANC. Conclusion: Important ANC process elements, such as laboratory testing, counseling, and the recommended number of visits, were subpar despite acceptable structural readiness. Rural residence and lower education were important contributors to inadequate ANC.
Strengthening maternal health outcomes requires focused interventions to enhance process quality and lessen educational and rural disparities.

86. An Observational Study to Detect Antibiotic Resistance in the Isolates from Middle Ear Infections with Special Reference to MRSA, ESBL and MBL Producing Organisms in North Karnataka
Pramod Sambrani, Mahesh Kumar S., Namratha W. Nandihal, Anubhav Sinha, Rejinold T. I.
Abstract
Aim: To detect antibiotic resistance in the isolates from middle ear infections with special reference to MRSA, ESBL and MBL producing Organisms. Materials and Methods: A total of 140 ear swab samples meeting the inclusion criteria were processed in the Department of Microbiology, KMCRI, Hubballi. Pus samples were collected from the external auditory canal using sterile cotton swabs and cultured on appropriate microbiological media following standard laboratory procedures. The bacterial isolates were identified using standard microbiological techniques. Antibiotic susceptibility testing was performed and interpreted according to 36th edition CLSI guidelines. Results: Out of 140 ear swab samples, Staphylococcus aureus (52.1%, n = 73) was the most common isolate, followed by Pseudomonas spp. (23.6%, n = 33), Klebsiella spp. (10.7%, n = 15) and Escherichia coli (7.1%, n = 10), while other organisms constituted 6.4% (n = 9). Among the Staphylococcus aureus isolates, MRSA accounted for 38.3% (n = 28) while MSSA accounted for 61.7% (n = 45). Among the Gram-negative isolates, ESBL production was detected in 27.5% (n = 18) isolates, while no isolates showed MBL production (0%). Conclusion: Staphylococcus aureus was the predominant pathogen isolated from middle ear infections, followed by Pseudomonas spp., Klebsiella spp., and Escherichia coli. A considerable proportion of Staphylococcus aureus isolates were identified as MRSA, and ESBL production was observed among Gram-negative isolates, while no MBL producers were detected. Continuous surveillance of bacterial pathogens and their antibiotic resistance patterns is essential for guiding appropriate empirical therapy, improving treatment outcomes, and preventing the emergence of antimicrobial resistance.

87. Comparative Evaluation of Intravenous Dexmedetomidine versus Fentanyl for Attenuation of Haemodynamic Response during Laryngoscopy and Endotracheal Intubation: A Randomized Comparative Study
Shreya Soni, Anju Verma, Sunil Raghuvanshi
Abstract
Background: Laryngoscopy and endotracheal intubation produce a transient sympathoadrenal response that may manifest as tachycardia, hypertension and increased myocardial oxygen demand. Although these changes are often tolerated by healthy individuals, they can be clinically important in patients with limited cardiovascular reserve. Aim: To compare the efficacy of intravenous dexmedetomidine and intravenous fentanyl in attenuating haemodynamic responses during laryngoscopy and endotracheal intubation. Methods: This prospective randomized comparative study was conducted in sixty adult patients of ASA physical status I–II undergoing elective surgery under general anaesthesia. Patients were allocated into Group D, receiving dexmedetomidine 1 µg/kg diluted in 100 mL normal saline over 10 minutes before induction, and Group F, receiving fentanyl 2 µg/kg as a slow intravenous bolus 3 minutes before induction. Heart rate, systolic blood pressure, diastolic blood pressure and mean arterial pressure were recorded at baseline, after study drug administration, at intubation, and at 1, 3, 5 and 10 minutes after intubation. Results: The demographic characteristics were comparable between the two groups (p > 0.05). Following administration of the study drug, a significant reduction in heart rate and mean arterial pressure was observed in the dexmedetomidine group compared to the fentanyl group (p < 0.05). At the time of laryngoscopy and intubation, the fentanyl group demonstrated a marked increase in heart rate (98.6 ± 14.8 bpm) and mean arterial pressure (108.9 ± 14.6 mmHg), whereas the dexmedetomidine group showed minimal changes from baseline (76.2 ± 10.3 bpm and 92.5 ± 11.2 mmHg respectively), which was statistically significant (p < 0.001). The attenuation of haemodynamic response in the dexmedetomidine group was sustained up to 10 minutes post-intubation.
Bradycardia was more frequently observed in the dexmedetomidine group, while nausea and vomiting were more common in the fentanyl group; however, these differences were not statistically significant. Conclusion: Intravenous dexmedetomidine at a dose of 1 µg/kg is significantly more effective than fentanyl 2 µg/kg in attenuating the haemodynamic response to laryngoscopy and endotracheal intubation. It provides superior control of heart rate and blood pressure with sustained effects, thereby ensuring better perioperative haemodynamic stability. Although associated with mild bradycardia, dexmedetomidine remains a safe and preferable agent, especially in patients where haemodynamic fluctuations may be detrimental.

88. Histopathological Spectrum of Eyelid and Conjunctival Lesions: A Retrospective Study from a Tertiary Care Center
Parth Bhargavi V., Shah Khushi R., Kuchhadiya Mittal G., Shah Nitee S., Shah Surbhi S.
Abstract
Introduction: Eyelid and conjunctival lesions encompass a wide spectrum of benign and malignant conditions. Histopathological examination remains the gold standard for definitive diagnosis and guides appropriate management. Methods: A retrospective study was conducted in the Department of Ophthalmology at a tertiary care hospital over a period of two years. Histopathological reports of 70 patients with eyelid and conjunctival lesions were analyzed. Data regarding age, gender, lesion site, and histopathological diagnosis were collected. Tissue samples were processed using standard protocols. Diagnoses were established through clinicopathological correlation and microscopic examination. Results: Out of 70 cases, 46 (65.71%) were males and 24 (34.29%) were females. The majority of patients (35.72%) were in the 21–40 years age group. Most lesions were benign (64 cases, 91.42%), while 6 cases (8.58%) were malignant. Among benign lesions, chalazion was the most common (24.28%), followed by chronic inflammatory lesions (14.28%) and cystic lesions (14.28%). Other benign conditions included squamous papilloma, vascular lesions, dermoid cyst, nevus, and granuloma. Malignant lesions included basal cell carcinoma (2.85%), squamous cell carcinoma, poorly differentiated carcinoma, ocular surface squamous neoplasia, and primary cutaneous mucinous carcinoma (each 1.42%). The majority of malignant cases (5 out of 6) occurred in patients aged 60 years and above.

89. Assessment of Intraoperative and Technical Difficulties Associated with Laparoscopic Adrenalectomy in Patients with Adrenal Pheochromocytoma: A Case Series Study
Mihir Karathiya, Chirag Karansinh Sangada, Mehulkumar Muljibhai Tadvi, Rutvi Jain
Abstract
Background: Pheochromocytomas are rare catecholamine-producing adrenal tumors that pose significant perioperative challenges due to their potential for sudden hypertensive crises and arrhythmias. Laparoscopic adrenalectomy has emerged as the gold standard approach, offering advantages of reduced postoperative morbidity, shorter hospital stay, and faster recovery compared to open surgery. However, in the context of pheochromocytoma, the procedure remains technically demanding because of intraoperative hemodynamic instability, tumor size, bilateral involvement, and close relation to vital vascular structures. Objective: To evaluate the intraoperative and perioperative challenges faced during laparoscopic adrenalectomy for adrenal pheochromocytoma and to analyze strategies that improve surgical and anesthetic outcomes. Methodology: This retrospective case-based study, conducted from January 2024 to October 2024, evaluated five patients with biochemically and radiologically confirmed pheochromocytomas who underwent laparoscopic adrenalectomy following preoperative alpha-blockade and additional antihypertensive therapy where required. Results: Tumor dimensions ranged from 3.5 to 5.1 cm. Two patients experienced intraoperative hypertensive surges, which were controlled with anesthetic support. Larger tumors (>4 cm) and those adherent to the inferior vena cava and liver presented greater technical challenges due to loss of fat planes and bleeding risk. One patient with bilateral pheochromocytoma underwent unilateral adrenalectomy to reduce operative risk while achieving tumor control. All procedures were completed laparoscopically without conversion to open surgery, and histopathology confirmed pheochromocytoma in every case. 
Our findings highlight that while laparoscopic adrenalectomy is safe and feasible, it requires meticulous preoperative optimization, vigilant intraoperative monitoring, and skilled surgical technique to overcome challenges related to tumor size, vascular proximity, and endocrine fluctuations. In experienced hands and multidisciplinary settings, laparoscopic adrenalectomy remains the preferred approach for pheochromocytomas up to 6 cm, ensuring favorable outcomes with minimal morbidity.

90. Comparative Study of Pulmonary Function Tests Using Spirometry in Obese Versus Sedentary Individuals
N. Husamuddin, Sandeep S., Aravindhan V.
Abstract
Background: Obesity is a burgeoning global epidemic associated with multi-system dysfunction. Its impact on respiratory mechanics and pulmonary function, though clinically significant, remains underexplored in Indian settings. Spirometry offers a non-invasive, reproducible means of assessing respiratory capacity, and this study uses it to compare pulmonary function between obese and sedentary individuals. The objective of this study was to compare spirometric indices, namely Forced Vital Capacity (FVC), Forced Expiratory Volume in 1 second (FEV1), FEV1/FVC ratio, Peak Expiratory Flow Rate (PEFR), Forced Expiratory Flow 25–75% (FEF25–75%), and accessory lung volumes, between obese and sedentary non-obese individuals, and to assess the correlation of Body Mass Index (BMI) with these spirometric parameters. Methods: A cross-sectional comparative study was conducted at the Department of Physiology, Government Medical College, Krishnagiri, over a period of six months in obese individuals (BMI ≥ 30 kg/m²) and 60 sedentary non-obese controls (BMI 18.5–24.9 kg/m²). Spirometry was performed using a calibrated computerised spirometer following American Thoracic Society (ATS)/European Respiratory Society (ERS) guidelines. Statistical analysis was done using SPSS version 26.0. Results: Obese individuals demonstrated significantly lower FVC (3.12 ± 0.61 L vs 3.74 ± 0.58 L; p < 0.001), FEV1 (2.48 ± 0.52 L vs 3.02 ± 0.49 L; p < 0.001), PEFR (6.21 ± 1.18 L/sec vs 7.54 ± 1.22 L/sec; p < 0.001), and FEF25–75% (2.89 ± 0.74 vs 3.47 ± 0.68 L/sec; p < 0.001) compared to sedentary controls. The FEV1/FVC ratio was preserved in both groups (79.6% vs 80.8%; p = 0.194), indicating a predominantly restrictive pattern. Expiratory Reserve Volume (ERV) was markedly reduced in obese participants (0.68 ± 0.21 L vs 1.14 ± 0.28 L; p < 0.001). A restrictive spirometric pattern was observed in 53.3% of obese individuals compared to 16.7% of sedentary controls (p < 0.001). 
BMI showed a significant negative correlation with FVC (r = −0.61), FEV1 (r = −0.58), ERV (r = −0.67), and PEFR (r = −0.54). Conclusion: Obesity exerts a profound adverse effect on pulmonary function, primarily producing a restrictive ventilatory defect. Early spirometric screening in obese individuals is warranted for timely respiratory intervention and comprehensive metabolic management.

91. Self-Medication Practices Among Second-Year MBBS and BDS Students in a Rural Tertiary Care Hospital in South India: A Cross-Sectional Study
Swathi Dharini K., Meena S., Sunil Mhatarba Vishwasrao
Abstract
Background: Medication use without consulting a qualified healthcare professional is known as self-medication. Although responsible self-care may be beneficial for minor illnesses, inappropriate medication use may lead to adverse drug reactions, incorrect treatment, and antimicrobial resistance. Objectives: To assess the knowledge, attitudes, and practices regarding self-medication among second-year MBBS and BDS students in a rural tertiary care hospital in Tamil Nadu. Methods: A cross-sectional study was conducted between December 2025 and January 2026 among second-year MBBS and BDS students. Data were collected using a structured and pre-validated questionnaire. Descriptive statistics were used to summarise the results, and the Chi-square test was applied to evaluate associations between variables. Results: A total of 233 students participated in the study, with a mean age of 20.2 years; 65% were female. Approximately 54% reported practising self-medication within the previous six months. Headache and fever were the most frequently treated conditions, and analgesics such as paracetamol were the most commonly used drugs. Pharmacies dispensing medications without prescriptions were the primary source of medicines. Although most participants acknowledged that antibiotics should not be self-administered, 23.5% reported self-administering antibiotics. Conclusion: Self-medication is common among undergraduate medical and dental students despite awareness of potential risks. Educational strategies focusing on rational drug use and antimicrobial stewardship should be incorporated into early medical training.

92. Effect of Time and Storage Condition on Prothrombin Time and Activated Partial Thromboplastin Time
Karthikeyan T.M., Shivapriya R., Umamageswari M.S., Sharanya K., Priya Fedric
Abstract
Background: Prothrombin Time (PT) and Activated Partial Thromboplastin Time (aPTT) are vital coagulation tests used to evaluate the extrinsic and intrinsic pathways, respectively. Pre-analytical variables such as processing delay and storage conditions can significantly affect the results, influencing clinical decision-making. Objective: To assess the effect of time and storage conditions (room temperature and refrigeration) on PT and aPTT values in blood samples. Materials and Methods: This prospective cross-sectional study was conducted on 30 healthy volunteers aged 18–25 years. Blood samples were collected in 3.2% sodium citrate tubes. Two sets of samples were processed: one set was centrifuged immediately and the plasma stored in a refrigerator, while the other set was kept as whole blood at room temperature. PT and aPTT were measured at 0, 4, 12, and 24 hours using a fully automated coagulation analyzer (Elite Pro ACL). Statistical analysis was performed using repeated measures ANOVA. Results: PT showed no significant difference between centrifuged and uncentrifuged samples; values declined gradually over time but remained relatively stable up to 24 hours. In contrast, aPTT values decreased significantly over time, particularly after 4 hours, in both centrifuged and uncentrifuged samples, with statistically significant differences at 12 and 24 hours (p < 0.05). Conclusion: PT is stable up to 24 hours under both room temperature and refrigerated conditions. However, aPTT is time-sensitive and should ideally be processed within 4 hours for accuracy. If analysis is delayed, plasma separation and refrigeration are recommended to preserve sample integrity.

93. Correlation of Endometrial Thickness with Transvaginal Sonography, Hysteroscopic Findings and Histopathological Diagnosis in Patients with Abnormal Uterine Bleeding
Monika, Neeraj Choudhary, Isha, Vibha, Kavita Chandnani
Abstract
Background: Abnormal uterine bleeding (AUB) is a common gynecological complaint across the world, affecting women in reproductive, perimenopausal, and postmenopausal age groups. AUB may arise from a wide variety of causes ranging from hormonal dysfunction to structural intrauterine lesions, as well as premalignant and malignant conditions. Transvaginal sonography (TVS) is usually the first line of investigation because of its non-invasive nature and ability to measure endometrial thickness (ET). An increased ET can suggest underlying hyperplasia, polyps, or malignancy, although TVS has limited value in detecting focal intrauterine lesions. Hysteroscopy allows direct visualization of the endometrial cavity and facilitates targeted biopsies, while histopathological examination (HPE) continues to be the gold standard for final diagnosis. Material and Methods: This prospective study was conducted at a tertiary care hospital in Rajasthan, India, and involved 120 women aged 35–50 years presenting with AUB over a period of 6 months. Patients underwent detailed history taking, clinical examination, TVS, hysteroscopy, and endometrial sampling for histopathology. Data were analyzed statistically to establish the correlation between ET, hysteroscopic findings, and HPE, as well as to determine the diagnostic accuracy of TVS and hysteroscopy when compared to histopathology. Results: The majority of patients were aged 41–50 years (60%), and polymenorrhea was the most common symptom (32%), followed by heavy menstrual bleeding (25%). ET ranged from 4 mm to 23 mm. ET of 8–14 mm was seen in 50% of patients, while 38% had ET >14 mm. Histopathology revealed proliferative endometrium (30%), endometrial polyps (28%), hyperplasia (14%), secretory endometrium (12%), fibroid polyps (8%), and carcinoma (8%). ET >14 mm significantly correlated with hyperplasia and carcinoma. Hysteroscopy showed higher diagnostic sensitivity and specificity than TVS for identifying focal lesions.
Conclusion: This study demonstrates that while TVS is an effective first-line screening tool in AUB, it has limitations in specificity. Hysteroscopy with histopathology offers superior diagnostic accuracy. An integrated, multimodal diagnostic approach is essential for optimizing patient care, preventing unnecessary hysterectomies, and ensuring early detection of premalignant and malignant conditions.

94. Comparative Evaluation of Optical Coherence Tomography and Fundus Photography for Early Detection of Diabetic Retinopathy and Its Correlation with Glycemic Control
Hanumant Keshavrao Bhosale, Sayed Rayyan Sayed Inayatullah
Abstract
Background: Diabetic retinopathy (DR) is a leading cause of preventable blindness worldwide. Early detection is crucial to prevent disease progression. Optical Coherence Tomography (OCT) and fundus photography are widely used imaging modalities, but their comparative efficacy in early DR detection remains under evaluation. Glycemic control, reflected by HbA1c levels, plays a pivotal role in disease progression. Materials and Methods: A cross-sectional study was conducted on 120 patients with type 2 diabetes mellitus. All participants underwent fundus photography and OCT examination. DR was graded using standard criteria. HbA1c levels were measured and correlated with imaging findings. Sensitivity, specificity, and diagnostic accuracy of both modalities were analyzed. Results: OCT detected early retinal changes in 78.3% of patients compared to 61.7% by fundus photography. Sensitivity of OCT was significantly higher (92%) than fundus photography (74%). A strong positive correlation (r = 0.68, p < 0.001) was observed between HbA1c levels and severity of retinal changes. Conclusion: OCT demonstrated superior sensitivity in early detection of diabetic retinopathy compared to fundus photography. Higher HbA1c levels were significantly associated with increased severity of DR. Incorporating OCT in routine screening may improve early diagnosis and clinical outcomes.

95. Evaluation of Tear Film Biomarkers (Mmp-9 and Il-6) in Patients with Dry Eye Disease and Their Correlation with Clinical Severity
Sayed Rayyan Sayed Inayatullah, Hanumant Keshavrao Bhosale
Abstract
Background: Dry eye disease (DED) is a multifactorial disorder of the ocular surface characterized by tear film instability and inflammation. Recent evidence highlights the role of inflammatory biomarkers such as matrix metalloproteinase-9 (MMP-9) and interleukin-6 (IL-6) in the pathogenesis of DED. Their correlation with clinical severity may aid in early diagnosis and targeted therapy. Materials and Methods: A cross-sectional study was conducted on 100 participants, including 70 patients with clinically diagnosed DED and 30 healthy controls. Tear samples were collected and analyzed for MMP-9 and IL-6 levels using enzyme-linked immunosorbent assay (ELISA). Clinical severity was assessed using Ocular Surface Disease Index (OSDI), Schirmer test, and Tear Break-Up Time (TBUT). Correlation analysis was performed between biomarker levels and clinical parameters. Results: Mean MMP-9 and IL-6 levels were significantly elevated in DED patients compared to controls (p < 0.001). Higher biomarker levels were observed in moderate and severe DED groups. A strong positive correlation was found between MMP-9 and OSDI scores (r = 0.71) and IL-6 and OSDI scores (r = 0.65). Negative correlation was observed with TBUT and Schirmer values. Conclusion: Tear film biomarkers MMP-9 and IL-6 are significantly elevated in DED and correlate well with disease severity. These biomarkers can serve as reliable indicators for diagnosis and monitoring of dry eye disease.

96. Risk Factors for Hypocalcemia in Preterm Neonates: A Prospective Observational Study
R. Appaji Anurag, Basavaraj Patil, Rohini Patil, Sannidhi Swamy
Abstract
Background: Neonatal hypocalcemia is a frequent metabolic abnormality, particularly in preterm infants, and is associated with significant morbidity. Early identification of risk factors is essential for timely intervention. Aim: To determine the prevalence and evaluate maternal and neonatal risk factors associated with hypocalcemia in preterm neonates. Methods: This prospective observational study was conducted in a tertiary care neonatal intensive care unit and included 100 preterm neonates. Relevant maternal and neonatal clinical data were collected. Biochemical parameters including serum calcium, magnesium, phosphorus, vitamin D, and parathyroid hormone levels were measured. Statistical analysis was performed using appropriate tests, and a p-value <0.05 was considered statistically significant. Results: The prevalence of hypocalcemia was 38%. Lower gestational age was significantly associated with hypocalcemia (p<0.001). Maternal factors such as gestational diabetes mellitus (p<0.001), preeclampsia (p=0.001), and vitamin D deficiency (p<0.001) showed strong associations. Among neonatal factors, respiratory distress syndrome was significantly associated (p<0.001). Antenatal steroid administration demonstrated a protective effect (p<0.05). Biochemical analysis revealed significantly lower parathyroid hormone levels in affected neonates. Conclusion: Hypocalcemia is a common metabolic disturbance in preterm neonates and is influenced by multiple maternal and neonatal factors. Early screening and targeted preventive strategies are essential to improve neonatal outcomes.

97. T-N Tract Involvement in Locally Advanced Oral Tongue Squamous Cell Carcinoma: A Prospective Observational Study
Gyanendra, Gupta Aditi, Gupta Meenu
Abstract
Background: Oral tongue squamous cell carcinoma (OTSCC) is among the most common head and neck malignancies, with neck lymph node metastasis being a critical prognostic factor. Compartmental tongue surgery (CTS) achieves en bloc resection of the hemitongue together with the fibro-fatty tissue connecting the primary tumour to regional lymph nodes — the tumour-node (T-N) tract. The prognostic significance of T-N tract involvement in locally advanced OTSCC has not been widely characterised in the Indian context. Objectives: To determine the incidence of T-N tract involvement and evaluate its association with tumour stage, tumour grade, nodal status, and etiological risk factors in patients undergoing CTS. Methods: A hospital-based prospective and retrospective observational study enrolled 75 consecutive patients with biopsy-proven locally advanced OTSCC who underwent CTS at a tertiary cancer centre (2020–2023). T-N tract status was assessed on standardised histopathological examination. Associations with clinicopathological variables were evaluated using chi-square test (p < 0.05 significant). Results: T-N tract involvement was detected in 19/75 patients (25.3%). It was significantly associated with T4 stage (p = 0.001), N3 nodal burden (p = 0.0001), poorly differentiated histology (p = 0.004), and alcohol consumption (p = 0.029). Perineural invasion, lymphovascular invasion, extracapsular extension, and extrinsic muscle involvement were each independently associated with T-N tract positivity (all p < 0.05). Age, sex, and depth of invasion > 10 mm showed no significant association. Conclusion: T-N tract involvement affects approximately one in four patients with locally advanced OTSCC and clusters with adverse pathological features. Routine histopathological assessment of the T-N tract after CTS is warranted and should inform adjuvant treatment planning.

98. Impact of Education on Health Literacy among Antenatal Women in Rural and Urban Field Practice Areas of a Tertiary Care Hospital: A Comparative Study
Bency Naomi E.B., Amrita N. Shamanewadi, Srinivasa R.
Abstract
Background: Health literacy, which is the ability to access, understand, and apply health information, is essential during pregnancy. Limited maternal health knowledge results in delayed care-seeking and poor adherence to medical advice, leading to adverse maternal and fetal outcomes. Objectives: The aim of this study was to compare health literacy levels among antenatal women in urban and rural areas and to assess the role of health education in shaping these levels. Methods: A comparative cross-sectional study was conducted among 120 antenatal women (60 rural, 60 urban) attending outpatient departments of rural and urban health centers affiliated with a tertiary hospital in Bengaluru, India. A pre-tested questionnaire captured socio-demographic data and four health literacy domains. Data were analyzed using chi-square, Mann–Whitney U, and Kruskal–Wallis tests. Results: Urban women were older and more likely to be employed. They scored significantly higher in health knowledge (p = 0.013), health behaviors (p = 0.020), and attitudes (p < 0.001), but not in access/utilization of information (p = 0.574). Education significantly improved health literacy in rural women (χ² = 23.77, p < 0.001), especially beyond middle school, but had no significant effect in urban women (χ² = 3.23, p = 0.520). Conclusion: A significant gap in health literacy exists between rural and urban antenatal women. Education plays a strong role in improving literacy in rural areas, whereas multiple facilitators contribute in urban settings. Tailored interventions targeting rural populations are especially needed to reduce these inequalities.

99. Functional Outcome of Early vs Delayed Decompression in Acute Cervical Spinal Cord Injury: A Prospective Comparative Study
Dhananjay Kumar, Dhiraj Kumar, Deepak Karn
Abstract
Background: Acute cervical spinal cord injury (CSCI) is a devastating condition associated with significant neurological deficit and long-term disability. Early surgical decompression has been proposed to improve outcomes by limiting secondary injury; however, the optimal timing of intervention remains controversial. Objective: To compare neurological and functional outcomes between early (<24 hours) and delayed (>24 hours) surgical decompression in patients with acute cervical spinal cord injury. Methods: This prospective comparative study included 80 patients with acute traumatic CSCI managed at a tertiary care neurosurgical center. Patients were divided into two groups: early decompression (Group A, n=40) and delayed decompression (Group B, n=40). Neurological status was assessed using the American Spinal Injury Association (ASIA) Impairment Scale (AIS) and motor scores at admission and at 6-month follow-up. Functional outcome was evaluated using the Modified Barthel Index. Statistical analysis was performed using appropriate tests, with p < 0.05 considered significant. Results: Baseline demographic and injury characteristics were comparable between the groups. At 6 months, ≥ 2-grade AIS improvement was observed in 35% of patients in Group A compared to 15% in Group B (p = 0.008). Mean ASIA motor score improvement was significantly greater in the early decompression group (23.8 ± 8.5 vs 14.6 ± 7.9; p < 0.001). Functional independence was achieved in 55% of patients in Group A compared to 30% in Group B (p = 0.03). Conclusion: Early surgical decompression within 24 hours is associated with significantly improved neurological recovery and functional outcomes in acute cervical spinal cord injury. These findings support early intervention as a key determinant of favorable prognosis.

100. Comparison of Dexmedetomidine and Nalbuphine as Additive to Ropivacaine for Spinal Anaesthesia in Routine Gynaecological Surgeries
Roopa Parida, Swayamprava Behera, Priti Das, Debasish Swain, Lucy Das, Shravanti Rupali P.K. Mishra, Sasmita Sidu
Abstract
Background: Spinal anaesthesia is widely used for gynaecological surgeries, and adjuvants are often added to improve the quality and duration of analgesia. Dexmedetomidine and nalbuphine have emerged as promising intrathecal adjuvants; however, comparative data with ropivacaine remain limited. The study aimed to compare the efficacy and safety of dexmedetomidine versus nalbuphine as an additive to ropivacaine for spinal anaesthesia in routine gynaecological surgeries, and to assess intraoperative alertness and rescue medication requirement. Materials and Methods: This prospective, randomized, single-blinded interventional study included 150 patients (ASA I–II) undergoing gynecological surgeries, allocated equally into two groups. Group A received intrathecal ropivacaine with nalbuphine (5 mg), while Group B received ropivacaine with dexmedetomidine (50 µg). Sensory and motor block characteristics, Bromage score, MOAA/S score, rescue analgesia, and adverse drug reactions were assessed. Statistical analysis was performed using SPSS, with p<0.05 considered significant. Results: Baseline characteristics were comparable between groups. The onset of sensory block (2.56 ± 0.50 vs 4.88 ± 1.47 min) and motor block (2.72 ± 0.83 vs 5.40 ± 1.05 min) was significantly faster in Group A (p=0.0001). Bromage scores were higher in Group A up to 4 hours (p=0.0001). Rescue analgesia requirement was significantly lower in Group A (28.0% vs 61.3%, p=0.0001). Hemodynamic parameters were stable in both groups. Adverse drug reactions were more frequent in Group B, particularly nausea and vomiting (18.7% vs 0%). Sedation levels remained unchanged in both groups. Conclusion: Nalbuphine is a superior adjuvant to ropivacaine compared to dexmedetomidine, providing faster onset, better analgesia, and a more favorable safety profile.

101. Morphological Changes of Placenta and Its Possible Effects on Fetal Outcome in Gestational Diabetic and Diabetic Mothers as Compared to Non-Diabetic Mothers
Soumya Kanti Pramanik, Sunita Ghosh, Mousumi Kar
Abstract
Background: Diabetes mellitus complicating pregnancy is associated with significant maternal and fetal morbidity. Maternal hyperglycemia is known to induce structural and functional alterations in the placenta, which may adversely affect fetal outcome. Evaluation of placental morphology provides important insight into the impact of diabetic status on fetoplacental health. Aim and Objective: To study and compare the morphological and histopathological changes of the placenta in gestational diabetic and overt diabetic mothers with non-diabetic mothers, and to assess their possible effects on fetal outcome. Materials and Methods: This institution-based observational case-control study included 90 placentae divided into three groups: 30 from non-diabetic mothers (controls), 30 from gestational diabetic mothers (GDM), and 30 from overt diabetic mothers (ODM). Gross examination included measurement of placental weight, maximum placental size, and number of cotyledons. Microscopic evaluation assessed vessel thrombosis, thickening of the subtrophoblastic membrane, degenerative changes, and villous edema using hematoxylin-eosin-stained sections. Neonatal birth weight was recorded. Statistical analysis was performed to compare differences among groups. Results: Mean placental weight, maximum placental size, and number of cotyledons were significantly higher in GDM and ODM compared to non-diabetic mothers (p < 0.001). Birth weight was significantly increased in overt diabetic mothers (p < 0.001). Vessel thrombosis, subtrophoblastic membrane thickening, degenerative changes, and villous edema were significantly more frequent in diabetic placentas, with the highest prevalence in overt diabetes (p < 0.001). The severity of morphological alterations correlated with diabetic status. Conclusion: Diabetic pregnancies are associated with significant gross and microscopic placental changes, which are more pronounced in overt diabetes and may contribute to altered fetal growth.
Strict glycemic control and careful placental evaluation are essential to reduce adverse perinatal outcomes.

102. Evaluation of Short-term vs. Cumulative IQC Statistics for Reducing False Rejections in a High-Volume Laboratory
Govula Sravanthi, Saritha G.
Abstract
Background and Objectives: Internal Quality Control (IQC) serves as a fundamental safeguard in clinical laboratories, ensuring that every test result meets the necessary standards for precision and reliability. IQC protocols require the use of control materials with established target ranges for all diagnostic parameters to validate the laboratory’s clinical output. The term “IQC strategy” encompasses the total design of the quality control process, specifically identifying the control materials, testing intervals, concentration levels, and statistical rules required to maintain analytical quality. An effective IQC strategy ensures that laboratory tests remain reliable enough for their specific medical purpose, protecting patients from the consequences of undetected errors. Despite established international IQC recommendations, a significant gap remains between theory and everyday laboratory practice. The current study aimed to determine the lab mean and standard deviation (SD) and to compare bias using lab means derived from 20-day (Scheme I) and 90-day (Scheme II) IQC results. Materials and Methods: The study was carried out at the Clinical Biochemistry Laboratory, East Point College of Medical Sciences & Research Center (EPCMS&RC), Bangalore. In the Clinical Biochemistry section of the EPCMS&RC central laboratory, IQC is run daily at 8-hourly intervals on the Vitros 4600 chemistry analyzer. Lab mean and SD for a new QC lot 89760 were derived using 20-day (Scheme I, short-term) and 90-day (Scheme II, long-term) IQC results for six biochemical parameters: glucose, urea, creatinine, sodium, potassium, and chloride. Daily IQC monitoring was performed for 6 months using control charts of the two schemes (Scheme I during Jan–Mar 2025 and Scheme II during Apr–Jun 2025).
Results: Analysis of IQC data using the Westgard rules showed that the total number of QC outliers was 43 with Scheme I control-chart limits, compared with only 7 with Scheme II control-chart limits. There was a significant difference in the EQAS results obtained under the two schemes. Average bias was greater with Scheme I than with Scheme II for chloride, potassium, creatinine, and urea, suggesting that Scheme II control limits are more appropriate than Scheme I for defining IQC limits. Conclusion: Using a 90-day lab mean and SD for daily IQC monitoring reduces Westgard rule violations without compromising EQAS performance. Utilizing a larger data set over an extended period minimizes unnecessary run rejections and recalibrations, ultimately lowering operational costs and improving turnaround times.
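The outlier counts above come from screening daily control values against each scheme's limits with Westgard multirules. A minimal Python sketch of that screening step, assuming only the common 1-3s and 2-2s rules (the abstract does not state the laboratory's full rule set) and using hypothetical glucose QC values:

```python
# Minimal Westgard multirule check: flag QC runs violating 1-3s
# (one value beyond mean +/- 3 SD) or 2-2s (two consecutive values
# beyond the same mean +/- 2 SD limit). The mean/SD would come from
# either the 20-day (Scheme I) or 90-day (Scheme II) baseline.

def westgard_violations(values, mean, sd):
    """Return indices of QC values violating the 1-3s or 2-2s rules."""
    flagged = set()
    z = [(v - mean) / sd for v in values]
    for i, zi in enumerate(z):
        if abs(zi) > 3:                        # 1-3s rule
            flagged.add(i)
        if i > 0 and z[i - 1] > 2 and zi > 2:  # 2-2s rule, high side
            flagged.update((i - 1, i))
        if i > 0 and z[i - 1] < -2 and zi < -2:  # 2-2s rule, low side
            flagged.update((i - 1, i))
    return sorted(flagged)

# Hypothetical glucose QC run with target mean 100 mg/dL, SD 2:
qc = [100.5, 99.0, 104.5, 104.8, 98.7, 107.0]
print(westgard_violations(qc, mean=100.0, sd=2.0))  # → [2, 3, 5]
```

Because the 90-day baseline yields a wider, more representative SD, fewer runs fall outside the same rule set, which is consistent with the drop from 43 to 7 outliers reported above.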

103. Extended Family Screening of Thalassemic Children to Evaluate Cost-Effective Tests: DCIP and NESTROFT
Ganesh Kumar, Ankush Kumar Anand, Satish Kumar, Ankur Priyadarshi
Abstract
Background: Prevention of severe hemoglobinopathies depends on early identification of carriers in families already affected by thalassemia. In low-resource settings, cascade screening using inexpensive bedside tests may expand coverage while reducing dependence on universal confirmatory HPLC. Aim: To evaluate the utility and modeled cost-effectiveness of extended family screening around thalassemic children using the naked-eye single-tube red cell osmotic fragility test (NESTROFT) and the dichlorophenol-indophenol precipitation test (DCIP). Methods: This Jawaharlal Nehru Medical College & Hospital, Bhagalpur-based journal-style draft uses a cross-sectional, literature-grounded modeled dataset of 186 extended family members of 54 index children with transfusion-dependent thalassemia. The study duration was from 5 January 2025 to 31 December 2025. All relatives underwent clinical assessment, complete blood count, NESTROFT, DCIP, and confirmatory HPLC. Diagnostic performance of NESTROFT for β-thalassemia-spectrum states and of DCIP for HbE-spectrum states was calculated against HPLC. A sequential screen-first cost model was compared with universal HPLC. Results: HPLC identified 69 of 186 relatives (37.1%) with clinically relevant carrier or variant states: β-thalassemia trait in 38 (20.4%), HbE trait in 22 (11.8%), HbE/β-thalassemia in 6 (3.2%), and other variants in 3 (1.6%). NESTROFT showed sensitivity 88.6%, specificity 88.0%, positive predictive value 69.6%, and negative predictive value 96.2% for β-thalassemia-spectrum detection. DCIP showed sensitivity 96.4%, specificity 96.8%, positive predictive value 84.4%, and negative predictive value 99.4% for HbE-spectrum detection. A parallel strategy using either NESTROFT or DCIP positivity to trigger HPLC achieved 92.8% sensitivity and 94.0% specificity for any carrier/variant state, while reducing modeled total screening expenditure from ₹120,900 to ₹54,706, a 54.8% reduction.
Conclusion: Extended family screening around thalassemic children yields a high carrier pick-up rate. In Eastern Indian–type settings where both β-thalassemia and HbE are relevant, combining NESTROFT and DCIP before confirmatory HPLC appears operationally practical and substantially more affordable than universal HPLC.
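The screen-first arithmetic behind such savings is simple: everyone receives the cheap bedside screens, and only screen-positives go on to confirmatory HPLC. The Python sketch below reproduces the totals reported above (₹120,900 universal vs ₹54,706 screen-first) under assumed unit costs of ₹650 per HPLC and ₹46 per combined NESTROFT + DCIP screen; these prices are back-calculated assumptions for illustration, not figures stated in the study.

```python
# Screen-first cascade cost model: all relatives get NESTROFT + DCIP;
# only screen-positives (true positives plus false positives) go on
# to confirmatory HPLC. Unit costs are ASSUMPTIONS chosen to
# reproduce the abstract's totals (₹120,900 vs ₹54,706).

HPLC_COST = 650    # assumed cost per HPLC test (₹)
SCREEN_COST = 46   # assumed combined NESTROFT + DCIP cost per person (₹)

def cascade_cost(n, n_carriers, sens, spec):
    """Total modeled cost when screen positivity triggers HPLC."""
    true_pos = round(sens * n_carriers)            # carriers caught by screen
    false_pos = round((1 - spec) * (n - n_carriers))  # non-carriers flagged
    hplc_tests = true_pos + false_pos
    return n * SCREEN_COST + hplc_tests * HPLC_COST

universal = 186 * HPLC_COST
cascade = cascade_cost(n=186, n_carriers=69, sens=0.928, spec=0.940)
saving = 100 * (universal - cascade) / universal
print(universal, cascade, round(saving, 1))  # → 120900 54706 54.8
```

The trade-off the model exposes is the 7.2% of carriers missed by the combined screen (sensitivity 92.8%), which is the price paid for the roughly 55% cost reduction.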

104. Glaucoma Treatments: Comparing Medical versus Surgical Management of Primary Open-Angle Glaucoma
Md. Ali Quaiser, Shikha Shalini, Pummy Roy, Archana Kumari
Abstract
Background: Primary open-angle glaucoma (POAG) requires long-term intraocular pressure (IOP) reduction, but the relative performance of sustained medical therapy versus primary filtering surgery remains clinically important, especially in resource-variable settings. Aim: To compare 12-month outcomes of medical and surgical management among patients with POAG treated at a tertiary-care teaching hospital in eastern India. Methods: This prospective comparative hospital-based study included 100 patients with POAG treated from 15 March 2025 to 5 March 2026. Fifty patients received stepwise topical medical therapy and 50 underwent primary trabeculectomy with mitomycin-C. Clinical, pressure-control, medication-burden, success, progression, and safety outcomes were analyzed over 12 months. Results: Baseline characteristics were comparable between groups. At 12 months, mean IOP was significantly lower after surgery than with medical treatment (13.42 ± 2.25 mmHg vs 17.92 ± 3.33 mmHg; p<0.001). Percentage IOP reduction was greater in the surgical group (52.34 ± 8.52% vs 35.92 ± 13.11%; p<0.001), and mean medication burden was markedly lower (0.54 ± 0.61 vs 2.32 ± 0.96 agents; p<0.001). Target IOP ≤18 mmHg was achieved in 100.0% of surgically treated patients versus 52.0% of medically managed patients (p<0.001). Complete success was significantly higher with surgery (68.0% vs 22.0%; p<0.001). Conclusion: Primary surgical management achieved deeper and more consistent IOP reduction, markedly reduced medication dependence, and higher complete success than medical therapy at 12 months. Medical management remained effective for many patients but carried a higher chronic treatment burden.

105. Hepatitis A: Clinical Spectrum of the Disease in Children Admitted to a Tertiary Care Hospital
Ankush Kumar Anand, Ganesh Kumar, Satish Kumar, Ankur Priyadarshi
Abstract
Background: Hepatitis A virus (HAV) remains an important cause of acute viral hepatitis in children in low- and middle-income settings, where changing endemicity, sanitation gaps, and incomplete vaccine uptake continue to shape the age at infection and the severity of clinical presentation. Aim: To describe the clinical, biochemical, and outcome spectrum of hepatitis A in children admitted to a tertiary care hospital and to identify factors associated with adverse in-hospital outcome. Methods: In this Jawaharlal Nehru Medical College & Hospital, Bhagalpur-based observational analytical draft, 136 consecutive children aged 1-15 years admitted with acute hepatitis and serologically confirmed anti-HAV IgM positivity were evaluated. The study duration was from 25 January 2025 to 31 December 2025. Demographic profile, exposures, presenting features, laboratory parameters, ultrasonographic findings, complications, and outcomes were analyzed. Adverse outcome was defined as cholestatic hepatitis, ultrasonographic ascites, pleural effusion, hepatic encephalopathy, acute liver failure, ICU requirement, or in-hospital death. Results: The mean age was 7.40 ± 4.07 years; 52 (38.2%) were 6-10 years old, 82 (60.3%) were boys, and 89 (65.4%) resided in rural areas. Jaundice and icterus were present in all children, followed by fever (123, 90.4%), hepatomegaly (120, 88.2%), anorexia (109, 80.1%), dark urine (98, 72.1%), and abdominal pain (96, 70.6%). Cholestatic hepatitis occurred in 12 (8.8%), coagulopathy in 18 (13.2%), hepatic encephalopathy in 7 (5.1%), acute liver failure in 5 (3.7%), and death in 1 (0.7%). Adverse outcome was documented in 29 (21.3%). On multivariable analysis, age >10 years (adjusted OR 3.32, 95% CI 1.09-10.08), altered sensorium at admission (adjusted OR 37.15, 95% CI 3.83-360.79), and bilirubin >10 mg/dL (adjusted OR 5.84, 95% CI 1.67-20.47) independently predicted adverse outcome.
Conclusion: Hepatitis A in admitted children was usually self-limited but showed a broad clinical spectrum with a substantial minority developing cholestasis, coagulopathy, encephalopathy, or acute liver failure. Older age, altered sensorium, and marked hyperbilirubinemia should alert clinicians to the risk of adverse course and the need for closer monitoring.

106. Phacoemulsification versus Manual Small-Incision Cataract Surgery for Age-Related Cataract
Shikha Shalini, Md. Ali Quaiser, Pummy Roy, Archana Kumari
Abstract
Background: Cataract surgery in Indian teaching hospitals continues to rely on both phacoemulsification and manual small-incision cataract surgery (MSICS), yet comparative data from routine tertiary-care practice remain clinically important. Aim: To compare operative, visual, refractive, and safety outcomes of phacoemulsification and MSICS for age-related cataract at Jawaharlal Nehru Medical College & Hospital, Bhagalpur, Bihar, India. Methods: This comparative hospital-based study included 80 eyes of 80 patients operated between 5 March 2025 and 10 March 2026, with 40 eyes undergoing phacoemulsification and 40 eyes undergoing manual SICS. Preoperative demographic and ophthalmic variables were recorded. Postoperative assessment included uncorrected visual acuity (UCVA), best-corrected visual acuity (BCVA), surgically induced astigmatism (SIA), refractive cylinder, operative time, and complications. Continuous variables were compared using the independent-samples t test and categorical variables using chi-square or Fisher exact tests. Multivariable linear regression was used to identify independent predictors of week-6 SIA. Results: Baseline characteristics were comparable between groups. Mean surgical time was shorter with MSICS (11.51 ± 1.67 min) than with phacoemulsification (17.47 ± 1.50 min; p<0.001). However, phacoemulsification produced better postoperative UCVA on day 1 (0.49 ± 0.18 vs 0.70 ± 0.16), week 1 (0.24 ± 0.09 vs 0.41 ± 0.13), and week 6 (0.16 ± 0.05 vs 0.21 ± 0.08; all p<0.001). Week-6 SIA was significantly lower after phacoemulsification (0.47 ± 0.16 D) than after MSICS (0.85 ± 0.20 D; p<0.001), as was postoperative refractive cylinder (0.68 ± 0.16 D vs 0.91 ± 0.27 D; p<0.001). UCVA of 6/9 or better at week 6 was achieved in 28/40 (70.0%) phaco eyes versus 15/40 (37.5%) MSICS eyes (p=0.007). Final BCVA was similar between groups (p=0.690). 
Any early postoperative complication occurred in 6/40 (15.0%) versus 17/40 (42.5%) eyes, respectively (p=0.013). Conclusion: Both procedures achieved excellent corrected visual outcomes, but phacoemulsification was associated with lower surgically induced astigmatism, lower postoperative cylinder, and faster unaided visual recovery, whereas MSICS remained significantly faster to perform.

107. Profile and Outcome of Childhood Poisoning Along with Bites and Stings Attending a Medical College: A Hospital-Based Observational Study
Ankush Kumar Anand, Ganesh Kumar, Satish Kumar, Ankur Priyadarshi
Abstract
Background: Childhood poisoning and envenomation-related emergencies remain an important cause of preventable morbidity in low- and middle-income countries, but local hospital-based data that analyze poisoning together with bites and stings are limited. Aim: To describe the demographic profile, exposure pattern, clinical features, management, and short-term outcomes of children presenting with poisoning, bites, and stings to a tertiary-care medical college. Methods: This observational study included 180 children aged 0-12 years presenting with acute poisoning or bites/stings and admitted to the PICU at Jawaharlal Nehru Medical College & Hospital, Bhagalpur, Bihar, India. The study duration was 20 January 2025 to 25 December 2025. Demographic details, type of exposure, delay in presentation, clinical features, treatment, and in-hospital outcomes were analyzed. The primary outcome was an unfavorable hospital course defined as intensive care requirement, mechanical ventilation, severe complication, or death. Comparative statistics and multivariable logistic regression were used to identify predictors of unfavorable outcome. Results: The mean age was 6.23 ± 3.52 years; 64.4% were boys and 65.0% were from rural areas. Poisonings accounted for 63.3% of cases and bites/stings for 36.7%. Hydrocarbon exposure (18.9%), snakebite (17.8%), and pesticide/insecticide exposure (16.7%) were the commonest categories. Unfavorable outcome occurred in 40 children (22.2%); 22 (12.2%) required pediatric intensive care, 9 (5.0%) mechanical ventilation, and 4 (2.2%) died. Rural residence, delayed presentation, high-risk exposure, systemic features, and GCS <13 independently predicted poor hospital course. Conclusion: Most events were unintentional and clustered in younger children, whereas adverse outcomes were concentrated in pesticide poisoning, snakebite, and scorpion sting. Rapid referral and early protocol-based care are central to improving pediatric outcomes.

108. Role of Vitamin D in Health and Diseases in Children: A Systematic Review and Meta-Analysis
Ganesh Kumar, Ankush Kumar Anand, Satish Kumar, Ankur Priyadarshi
Abstract
Background: Vitamin D has essential skeletal actions and increasingly recognized immunologic, epithelial, and metabolic effects in childhood. However, pediatric evidence remains heterogeneous across diseases, and the strength of association varies by disorder and study design. Aim: To synthesize current evidence on the role of vitamin D in child health and disease and to perform a meta-analysis of observational studies comparing serum 25-hydroxyvitamin D [25(OH)D] levels between children with disease and healthy controls. Methods: A structured search of PubMed-indexed and open-access pediatric literature was undertaken up to 25 January 2026 at Jawaharlal Nehru Medical College & Hospital, Bhagalpur, Bihar, India. The study duration was 10 January 2025 to 25 December 2025. Observational studies enrolling participants <18 years and reporting mean serum 25(OH)D levels with dispersion measures in both disease and control groups were eligible for quantitative synthesis. Random-effects meta-analysis was performed using the Hedges g standardized mean difference (SMD). Prespecified subgroup analysis, leave-one-out sensitivity analysis, and Egger regression for small-study effects were undertaken. Results: Eight studies comprising 1,489 children (729 disease cases and 760 controls) met criteria for quantitative synthesis. Included disease groups were asthma (4 studies), atopic dermatitis (2 studies), respiratory infection (1 study), and obesity (1 study). The pooled random-effects estimate showed significantly lower vitamin D levels in disease groups than in controls (SMD -1.06, 95% CI -1.67 to -0.45; p=0.0007; I²=96.3%). The asthma subgroup remained significant (SMD -1.20, 95% CI -1.90 to -0.50), while the atopic dermatitis subgroup showed the same direction but wide uncertainty. Leave-one-out analyses remained directionally stable, and Egger testing did not suggest statistically significant small-study asymmetry (p=0.285).
Conclusion: Lower vitamin D status is consistently associated with several pediatric disease states, especially asthma, and supportive narrative evidence also exists for atopic dermatitis, respiratory infection, obesity, and type 1 diabetes-related dysregulation. The findings support targeted prevention and risk-based evaluation of vitamin D deficiency in children, while emphasizing that causality and treatment effects require better standardized longitudinal and interventional studies.
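As an illustrative aside, the Hedges g standardized mean difference pooled in the abstract above can be sketched as follows. The group means, standard deviations, and sample sizes below are hypothetical, not taken from the included studies:

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges g: bias-corrected standardized mean difference."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                      # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)    # small-sample correction factor
    return j * d

# Hypothetical example: a disease group with lower mean 25(OH)D than controls
g = hedges_g(15.0, 6.0, 80, 22.0, 7.0, 90)  # negative g: disease group lower
```

A negative g, as in the pooled SMD of -1.06 reported above, indicates lower vitamin D in the disease group; study-level g values would then be combined under a random-effects model.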

109. Spectrum and Antibiotic Sensitivity Pattern of Bloodstream Bacterial Isolates from Septicemic Neonates in a Tertiary Care Centre
Ganesh Kumar, Ankush Kumar Anand, Satish Kumar, Ankur Priyadarshi
Abstract
Background: Neonatal septicemia continues to contribute substantially to NICU morbidity and mortality, and the changing pathogen spectrum together with rising antimicrobial resistance makes local antibiograms essential for rational empirical therapy. Aim: To determine the spectrum of bloodstream bacterial isolates from septicemic neonates in a tertiary care centre and to define their antibiotic sensitivity pattern with attention to onset-specific distribution and multidrug resistance. Methods: This observational study was conducted over 18 months, from 15 January 2025 to 30 December 2025, in the tertiary care neonatal unit of Jawaharlal Nehru Medical College & Hospital, Bhagalpur. A total of 214 neonates with clinically suspected septicemia were evaluated; 68 blood cultures yielded clinically significant bacterial growth. Early-onset sepsis was defined as onset within 72 hours of life and late-onset sepsis as onset after 72 hours. Blood cultures were processed using standard microbiological techniques, bacterial identification was performed by conventional/automated methods, and antimicrobial susceptibility testing was interpreted according to current CLSI criteria. Categorical variables were compared using chi-square or Fisher exact testing, and odds ratios with 95% confidence intervals were calculated for selected predictors. Results: Overall blood culture positivity was 31.8% (68/214). Culture positivity was significantly associated with prematurity (OR 2.15, 95% CI 1.18-3.93; p=0.013), low birth weight (OR 2.15, 95% CI 1.14-4.08; p=0.022), outborn status (OR 1.92, 95% CI 1.07-3.43; p=0.037), prior invasive ventilation (OR 2.77, 95% CI 1.43-5.38; p=0.003), and central venous catheter use (OR 3.39, 95% CI 1.65-6.98; p=0.001). Gram-negative isolates predominated (72.1%), led by Klebsiella pneumoniae (30.9%) and Acinetobacter baumannii (14.7%).
Late-onset sepsis was significantly more likely than early-onset sepsis to yield Gram-negative pathogens (85.0% vs 53.6%; OR 4.91, 95% CI 1.57-15.39; p=0.006). Overall multidrug resistance was 60.3%. Among Gram-negative isolates, sensitivity was highest to colistin (98.0%) and meropenem (63.3%) but poor for ceftazidime (20.4%), gentamicin (34.7%), and amikacin (38.8%). Gram-positive isolates retained high sensitivity to vancomycin (94.7%) and linezolid (100%). Conclusion: Bloodstream infection in septicemic neonates at tertiary-care level is dominated by multidrug-resistant Gram-negative bacteria, particularly K. pneumoniae and A. baumannii. Unit-specific surveillance and early revision of empirical antibiotic policy are necessary to improve initial antimicrobial coverage and stewardship.
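The odds ratios with Woolf-type confidence intervals reported in the abstract above can be illustrated from a 2x2 table. The counts below are hypothetical (the abstract does not publish the underlying cell counts), so the output only approximates the reported figures:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with Woolf 95% CI from a 2x2 table.
    a, b = culture-positive / culture-negative among exposed;
    c, d = culture-positive / culture-negative among unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for a risk factor in a cohort of 214 neonates
or_, lo, hi = odds_ratio_ci(30, 40, 38, 106)
```

A lower confidence bound above 1.0, as for all five predictors listed in the abstract, is what makes the association statistically significant at the 5% level.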

110. Incidence and Predictors of Malunion in Paediatric Fractures: A Retrospective Study from a Tertiary Care Hospital in Mumbai
Prabhat Saharia, Alfven Vieira, Pavan Soni, Bibek Kumar Tiwary, Rohan Pansambal, Priyam Chandak
Abstract
Background: Paediatric fractures are common in developing urban settings like Mumbai, where high-energy trauma from falls, road traffic accidents, and sports often leads to displacement. While children’s bones have remarkable remodelling potential, malunion remains a significant concern, causing functional impairment, cosmetic deformity, and the need for corrective surgery. Despite this, data on the exact incidence and predictors of malunion in Indian children are limited, especially from large metropolitan centres. This study aimed to determine the incidence and identify clinical and radiological predictors of malunion in paediatric fractures treated at a busy tertiary hospital in Mumbai. Material and Methods: We conducted a retrospective review of medical records and radiographs of 620 consecutive paediatric patients (aged 0–16 years) with long-bone fractures managed over the last three years at a tertiary care centre in Mumbai. Inclusion criteria were complete initial and follow-up radiographs at a minimum of six months. Exclusion criteria included pathological fractures, incomplete records, and follow-up less than six months. Demographic details, mechanism of injury, fracture site and type, initial displacement, treatment method, and final radiological outcome were recorded. Malunion was defined as >10° angulation, >20° rotation, or >1 cm shortening on final radiographs. Data were analysed using SPSS version 25 with chi-square tests, univariate analysis, and multivariate logistic regression. Ethical approval was obtained from the institutional review board. Results: The overall incidence of malunion was 9.2% (57/620). Highest rates were observed in diaphyseal forearm fractures (24.8%) followed by femoral shaft (12.5%) and tibial shaft fractures (10.3%).
Significant predictors on multivariate analysis included age >10 years (OR 3.4, 95% CI 1.8–6.2), conservative treatment (OR 4.7, 95% CI 2.3–9.5), initial translation >50% (OR 5.1, 95% CI 2.6–10.1), and diaphyseal location (OR 2.9, 95% CI 1.5–5.6). No significant association was found with sex or open fractures. Conclusion: Malunion occurs in nearly one in ten paediatric fractures in our urban Indian setting, with older age, conservative management, and severe initial displacement as key predictors. Early identification of high-risk cases may guide timely surgical intervention and reduce long-term morbidity.

111. Comparative Analysis of Laparoscopic vs. Open Pyeloplasty: A Study of 40 Patients
Prashant Kundargi, Neeraj Gupta, Aashamika P. Kundargi
Abstract
Aim: The primary aim was to compare perioperative outcomes, including operative time, blood loss, hospital stay, analgesia requirements, complications, and success rates between laparoscopic pyeloplasty (LP) and open pyeloplasty (OP) in 40 patients with primary UPJO. Materials and Methods: This prospective comparative study enrolled 40 adult patients (age 18-60 years) with symptomatic UPJO (hydronephrosis grade 3-4, split renal function >20%) at a tertiary center from January 2024 to December 2025. Patients were allocated 1:1 to LP (transperitoneal Anderson-Hynes dismembered pyeloplasty, n=20) or OP (flank incision, n=20) based on surgeon availability and patient preference after informed consent. Inclusion criteria: primary UPJO confirmed by MAG3 renogram. Exclusion criteria: secondary UPJO, significant comorbidities (ASA >3), prior ipsilateral surgery. Outcomes measured: operative time, blood loss, hospital stay, Clavien-Dindo complications, pain (VAS), analgesic consumption (diclofenac, mg), and success (improved drainage on follow-up renogram, no symptoms). Results: Mean operative time was longer in LP (210±45 min) vs OP (140±30 min; p<0.001). Blood loss was lower in LP (40±20 ml) vs OP (90±40 ml; p=0.002). Hospital stay was shorter in LP (3.2±1.1 days) vs OP (5.8±1.5 days; p<0.001). Analgesia requirement: LP 120±40 mg vs OP 650±150 mg diclofenac (p<0.001). Complications: LP 15% (Grade I-II) vs OP 25% (p=0.45); no Grade III-V. Success rate at 12 months: LP 95% vs OP 95% (p=1.0). Renal function improved similarly (pre: 32±8% vs post: 42±7%; p<0.01 in both groups). LP showed better cosmesis and VAS pain scores at day 7 (2.5 vs 5.2; p<0.01). Conclusion: LP demonstrates equivalent success to OP with advantages in blood loss, hospital stay, pain, and analgesia, despite longer operative time. Suitable for experienced centers, LP reduces morbidity without compromising outcomes in primary UPJO. Long-term follow-up (>2 years) is recommended due to rare late recurrences.
Future randomized trials are needed to assess cost-effectiveness.

112. A Study to Correlate Diabetes Self-Management with Glycemic Control in Indian Diabetic Patients for the Assessment of the Glycemic Control
Himanshu Khutan, Ritu Bala, Amit Jain, Urvashi, Sana Grace
Abstract
Background: Type 2 diabetes mellitus (T2DM) is a rapidly increasing public health concern, particularly in low- and middle-income countries such as India. Optimal glycemic control significantly reduces diabetes-related complications; however, achieving sustained control remains challenging. Effective self-management plays a crucial role in maintaining glycemic targets. The Diabetes Self-Management Questionnaire (DSMQ) has demonstrated improved validity compared to earlier instruments, but its applicability in the Indian population requires validation. Objectives: Primary objective was to evaluate the association between diabetes-related self-care activities and glycated hemoglobin (HbA1c) levels. Secondary objective was to validate the Diabetes Self-Management Questionnaire (DSMQ) for assessing glycemic control in the Indian population. Methods: A cross-sectional, non-interventional study was conducted among 260 patients with T2DM attending outpatient and inpatient services of the Department of Medicine. Eligible participants (age >15 years, diabetes duration ≥6 months) completed the DSMQ in their preferred language after informed consent. The questionnaire assesses four domains: glucose management (GM), dietary control (DC), physical activity (PA), and health-care use (HU). HbA1c levels, measured using fluorescence immunoassay technology, were used as an indicator of glycemic control. Statistical analysis included descriptive statistics and comparative tests, with p ≤0.05 considered significant. Results: The study population comprised 57.3% males, with the majority aged 51–70 years. Poor glycemic control (HbA1c ≥9%) was observed in 66% of participants. Significant differences were found in DSMQ sum scale scores across glycemic control categories (p<0.001). Glucose management showed the strongest association with lower HbA1c levels, followed by dietary control and physical activity. All four DSMQ subscales significantly correlated with glycemic status. 
Conclusion: The DSMQ demonstrated significant associations with HbA1c levels and effectively differentiated between glycemic control categories. It appears to be a valid and reliable instrument for assessing diabetes self-management behaviors among Indian patients with T2DM and may serve as a practical adjunct in routine clinical evaluation.

113. Diagnostic Performance of Pap Smear and Colposcopy for Detecting CIN2+ in Women with Unhealthy Cervix: A Cross-Sectional Accuracy Study Using Histopathology as Reference Standard
Rita Ekka, Deepti Kode, Rachana Divi
Abstract
Background: Cervical cancer remains a major public health problem in developing countries, largely due to delayed diagnosis and inadequate screening. Pap smear cytology has long been the primary screening method, while colposcopy allows detailed evaluation of cervical abnormalities and facilitates targeted biopsy. Evaluating the diagnostic accuracy of these modalities is essential for early detection of clinically significant lesions. Methods: This prospective observational study was conducted in the Department of Obstetrics and Gynecology at Kamineni Institute of Medical Sciences (KIMS), Narketpally, from January 2024 to December 2025. A total of 100 women aged 18–65 years presenting with symptoms or clinical features of an unhealthy cervix were included. Pap smear examination was performed using the Bethesda System 2014 classification. All participants underwent colposcopic examination followed by colposcopy-directed biopsy. Histopathology served as the gold standard for diagnosis. The diagnostic performance of Pap smear and colposcopy for detecting high-grade lesions (CIN2+) was calculated. Results: LSIL was the most common Pap smear abnormality (36%). Colposcopy demonstrated minor changes in 32% and major changes in 26% of cases. Histopathology confirmed CIN2+ lesions in 53% of women. Pap smear showed a sensitivity of 71.7% and specificity of 61.7%, whereas colposcopy demonstrated higher sensitivity (88.7%) but lower specificity (51.1%). Conclusion: Colposcopy showed superior sensitivity for detecting high-grade cervical lesions, while Pap smear remained a valuable screening tool. Combined use of Pap smear and colposcopy significantly improves early detection of cervical precancerous lesions.
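The diagnostic metrics in the abstract above follow directly from a 2x2 table against histopathology. As a minimal sketch, the Pap smear cell counts below are back-calculated from the reported percentages (53 CIN2+ and 47 non-CIN2+ among 100 women), so they are approximate reconstructions rather than published counts:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 diagnostic table
    (histopathology as the reference standard)."""
    sensitivity = tp / (tp + fn)   # true positives / all diseased
    specificity = tn / (tn + fp)   # true negatives / all non-diseased
    return sensitivity, specificity

# Counts implied by the reported Pap smear figures (71.7% / 61.7%)
pap_sens, pap_spec = sens_spec(tp=38, fn=15, tn=29, fp=18)
```

The same computation with the colposcopy counts would reproduce its higher sensitivity (88.7%) at the cost of lower specificity (51.1%), the familiar screening-versus-triage trade-off the abstract describes.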

114. Diagnostic Performance of Diffusion-Weighted MRI and Apparent Diffusion Coefficient in Differentiating Benign and Malignant Focal Liver Lesions
Govinda Reddy Karri
Abstract
Background: Accurate differentiation of benign and malignant focal liver lesions (FLL) is essential for appropriate clinical management. Diffusion-weighted magnetic resonance imaging (DWI) provides functional information regarding tissue cellularity and water molecule diffusion, which can be quantified using apparent diffusion coefficient (ADC) values. Objectives: To evaluate the diagnostic performance of diffusion-weighted MRI and ADC values in differentiating benign and malignant focal liver lesions. Methods: This prospective observational study was conducted at GSL Medical College, Rajahmundry, between January 2021 and December 2022. Fifty patients with focal liver lesions detected on imaging were evaluated using conventional MRI sequences and diffusion-weighted imaging. ADC values were measured from representative areas of each lesion, avoiding necrotic or hemorrhagic regions. Lesions were categorized as benign or malignant based on imaging features, clinical data, and histopathological or follow-up findings. Statistical analysis included comparison of mean ADC values and ROC curve analysis to determine optimal ADC cut-off values. Results: Among the 50 lesions analyzed, 16 (32%) were benign and 34 (68%) were malignant. Diffusion-weighted imaging detected all lesions, whereas routine MRI missed four malignant lesions. Benign lesions demonstrated significantly higher mean ADC values (2.01 ± 0.67 ×10⁻³ mm²/s) compared with malignant lesions (0.89 ± 0.12 ×10⁻³ mm²/s; p < 0.0001). An ADC cut-off value of 1.3 ×10⁻³ mm²/s showed sensitivity of 97.06%, specificity of 87.50%, and diagnostic accuracy of 94%. Conclusion: Diffusion-weighted MRI with ADC measurement is a reliable noninvasive technique for differentiating benign and malignant FLL.
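The ROC-based cut-off selection described above can be sketched with the Youden index, which picks the threshold maximizing sensitivity + specificity - 1. The per-lesion ADC values below are hypothetical, chosen only to echo the reported group means (benign lesions higher, malignant lower); they are not the study's data:

```python
def best_cutoff(benign, malignant):
    """Pick the ADC cut-off maximizing the Youden index (sens + spec - 1).
    Malignant lesions are expected to fall BELOW the cut-off."""
    candidates = sorted(set(benign + malignant))
    best = None
    for c in candidates:
        sens = sum(v < c for v in malignant) / len(malignant)
        spec = sum(v >= c for v in benign) / len(benign)
        j = sens + spec - 1
        if best is None or j > best[0]:
            best = (j, c, sens, spec)
    return best

# Hypothetical ADC values (x10^-3 mm^2/s)
benign = [1.6, 1.9, 2.1, 2.4, 2.8]
malignant = [0.7, 0.8, 0.9, 1.0, 1.1]
j, cutoff, sens, spec = best_cutoff(benign, malignant)
```

With the study's real per-lesion values, this procedure would yield the reported cut-off of 1.3 ×10⁻³ mm²/s with sensitivity 97.06% (33/34) and specificity 87.50% (14/16).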

115. Comparative Evaluation of Microleakage of Two Bulk-Fill Composite Restorations in Primary Molars
Gouri R. Reddy, Tejashree R., Anil Kumar K. C., Naven Thomas Daniel
Abstract
Background: Composite resin systems have become an important restorative material in paediatric dentistry. Objective: To evaluate and compare the microleakage of two bulk-fill composite restorations in primary molars. Methods: For the evaluation of microleakage, class I cavities were prepared on extracted primary molars, and specimens in groups I and II were each restored with the respective bulk-fill restorative material. Specimens were thermocycled, immersed in 1% methylene blue, sectioned, and dye penetration was evaluated under a stereomicroscope. Statistical analysis was performed using SPSS software; the in vitro microleakage parameter was analyzed using Fisher’s exact test. Results: There was a statistically significant difference in microleakage values between the two groups. Conclusion: Both bulk-fill composites showed adequate and comparable properties, with neither demonstrating statistically significant superiority over the other.

116. Clinical and Functional Outcomes of Tibial Fractures Treated with Suprapatellar Intramedullary Nailing: A Prospective Observational Study
Deepak Sharma, Harshit Jain, Rakesh Kumar Gupta, Ashish Sharma
Abstract
Background: Tibial shaft fractures are common long-bone injuries in adults and are frequently associated with high-energy trauma. Functional outcome depends not only on fracture union but also on restoration of alignment, knee motion, pain control, and return to activity. Suprapatellar intramedullary nailing has gained attention as an alternative to the infrapatellar approach, particularly for alignment-sensitive fracture patterns. Methods: This prospective observational study included 20 adult patients with tibial fractures treated with suprapatellar intramedullary nailing. Clinical, operative, radiological, and functional variables were recorded. Radiological healing was assessed using modified Radiographic Union Score for Tibia (mRUST) and union status. Functional outcome was assessed using knee range of motion, Wu functional score, and anterior knee pain by visual analogue scale, alignment, and complication profile during follow-up. Results: Mean age was 44.05 ± 12.77 years, and 70% of patients were male. Road traffic accident was the most common mechanism of injury (80%). AO/OTA type 42-B fractures were most frequent (55%), open fractures constituted 55% of cases, and proximal one-third fractures were most common (65%). Mean mRUST improved from 7.55 ± 1.15 at 3 months to 10.20 ± 0.95 at 6 months. Union was observed in 40% of patients at 3 months and 95% at 6 months. Mean time to union was 20.58 ± 3.83 weeks. Mean knee range of motion improved from 104.45 ± 9.67° at 3 months to 119.25 ± 6.74° at 6 months. Final Wu outcome grade was excellent in 10%, good in 40%, fair in 35%, and poor in 15%. Anterior knee pain was predominantly mild. Conclusion: Suprapatellar intramedullary nailing was associated with progressive radiological healing, high union rates, acceptable alignment, and satisfactory functional recovery in most patients. The technique appears to be a useful option for tibial fractures, particularly in proximal and technically demanding patterns.

117. Knowledge, Attitude and Practice of Recipient Haemovigilance among Nurses in a Government Tertiary Care Hospital Chennai
Archana A., Ravivarman R., Shoganraj S., Swathandran Hamsavardhini
Abstract
Aim: The primary aim of Haemovigilance Programmes in India is to improve transfusion safety and quality by collecting, analyzing and disseminating information on serious adverse reactions to transfusion of blood and blood products. This study was designed to analyze the knowledge, attitude and practice of recipient haemovigilance (adverse reactions following blood transfusion) among staff nurses in a tertiary care hospital, Chennai. Methods: This quasi-experimental study was conducted among 280 nurses in a tertiary care hospital over one year (February 2023 – January 2024). Knowledge, attitude and practice were assessed from responses to 20 questions: nine on knowledge, five on attitudes toward reporting, four on practice, one on reasons for underreporting, and one on possible ways to improve reporting of transfusion reactions. Results: Before intervention, knowledge about transfusion reactions was 40% (112 out of 280) and awareness of the Haemovigilance Programme of India (HvPI) was 17.5% (49 out of 280) among the staff nurses. After the information and education intervention, knowledge about transfusion reactions improved to 82.9% (232 out of 280) and awareness of the Haemovigilance Programme of India to 90% (252 out of 280). The most notable change in attitude was agreement that transfusion reaction reporting is a professional duty of a staff nurse, which rose from 38.2% (107 out of 280) to 68.9% (193 out of 280) after intervention. Conclusion: Knowledge and awareness play a vital role in shaping the attitude and practice of transfusion reaction reporting. Strategies such as including topics on blood components, safe bedside transfusion practices, common transfusion reactions and transfusion reaction reporting in the nursing curriculum may help sensitize nurses and strengthen the reporting system.

118. A Comparative Study of Intratympanic Platelet-Rich Plasma versus Intratympanic Dexamethasone in the Management of Idiopathic Sudden Sensorineural Hearing Loss (ISSNHL)
Shaswata Sarkar, Aditi Bhattacharyay, Tarun Kumar Mandal, Mahaprasad Pal
Abstract
Background: Sudden sensorineural hearing loss (SSNHL) is an otological emergency characterized by rapid onset hearing impairment and is commonly treated with corticosteroids. Platelet-rich plasma (PRP), an autologous concentrate rich in growth factors, has recently emerged as a potential alternative therapy. Objective: To compare the efficacy of intratympanic steroid (dexamethasone) and PRP therapy in patients with idiopathic SSNHL. Materials and Methods: This prospective observational study was conducted in a tertiary care hospital over one and a half years (November 2022–May 2023). A total of 25 patients with SSNHL were included and divided into two groups: intratympanic steroid group (n=15) and PRP group (n=10). Pure tone audiometry (PTA) was performed before and after five consecutive daily intratympanic injections (0.5 ml). PRP was prepared from autologous blood by centrifugation. Hearing improvement was assessed by change in PTA thresholds. Statistical analysis was performed using SPSS 20, with p<0.05 considered significant. Results: Both groups showed significant improvement in hearing thresholds after treatment (p<0.0001). The steroid group demonstrated a higher mean hearing gain (28.67 ± 15.06 dB) compared to the PRP group (24.40 ± 9.65 dB). The difference in mean hearing gain between the groups was statistically significant (p=0.03), favouring intratympanic steroid therapy. Conclusion: Both intratympanic dexamethasone and PRP are effective in improving hearing in SSNHL. However, steroids demonstrate superior hearing recovery compared to PRP. PRP may serve as an alternative or adjunct therapy, but further large-scale randomized controlled trials are required.

119. Long-Term Outcomes and Complications in Laparoscopic Versus Open Appendectomy in Complicated Appendicitis: A Prospective Study
Amitabh Kumar Srivastava, Ajay Kumar Singh, Vidya Bhushan Pathak, Ravi Sinha
Abstract
Background: Complicated appendicitis is associated with higher morbidity, and the optimal surgical approach remains debated. This study compared long-term outcomes and complication profiles of laparoscopic versus open appendectomy in adult patients. Methods: In this prospective study, 180 patients aged 18–60 years with complicated appendicitis were included. Group A underwent laparoscopic appendectomy (n = 86) and Group B underwent open appendectomy (n = 94). Perioperative variables, postoperative recovery, early complications, and long-term outcomes over 12 months were analyzed. Multivariate logistic regression identified independent predictors of complications. Results: Laparoscopic appendectomy had a longer operative time (68.4 ± 14.6 min vs. 54.2 ± 12.1 min; p < 0.001) but shorter hospital stay (3.6 ± 1.3 vs. 5.2 ± 1.9 days; p < 0.001), lower pain scores (VAS 3.2 ± 0.9 vs. 5.4 ± 1.1; p < 0.001), and faster return to normal activities (9.1 ± 2.4 vs. 14.3 ± 3.6 days; p < 0.001) compared with open surgery. Early postoperative complications were lower in the laparoscopic group (overall 16.3% vs. 45.7%; p < 0.001), including surgical site infection (4.7% vs. 19.1%; p = 0.003) and postoperative ileus (3.5% vs. 11.7%; p = 0.04). Long-term complications at 12 months, including incisional hernia (1.2% vs. 7.4%; p = 0.04) and chronic abdominal pain (3.5% vs. 11.7%; p = 0.03), were also significantly lower in the laparoscopic group. Logistic regression identified open appendectomy as an independent predictor of overall complications (AOR 4.12; 95% CI 2.08–8.15; p < 0.001). Conclusion: Laparoscopic appendectomy provided superior short- and long-term outcomes in complicated appendicitis, with reduced morbidity and faster recovery, supporting its use as the preferred surgical approach when expertise is available.

120. Role of Bone Marrow Examination in Pediatric Hematological Disorders: Clinico-Pathological Study
Sumeet Parakh, Satyen Gyani, Kritika Gyanchandani, Rupesh Kumar Agrawal
Abstract
Background: Bone marrow examination is a vital diagnostic modality in pediatric hematological disorders, especially when clinical and peripheral blood findings are inconclusive. Objective: To evaluate the role of bone marrow examination in diagnosing pediatric hematological disorders and to correlate clinical and pathological findings. Methods: This hospital-based cross-sectional study included 150 pediatric patients (0–18 years) with suspected hematological disorders. Bone marrow aspiration was performed and analyzed along with relevant clinical and laboratory parameters. Results: The majority of patients were in the 0–5 years age group (30%), with male predominance (56%). The most common indication for bone marrow examination was unexplained anemia (28%), followed by pancytopenia (24%). Fever (60%) and pallor (38%) were the most common presenting symptom and sign, respectively. Nutritional anemia (28%) was the most frequent diagnosis, followed by acute leukemia (22%) and aplastic anemia (12%). Conclusion: Bone marrow examination is an indispensable diagnostic tool in pediatric hematology. It plays a crucial role in identifying both benign and malignant conditions, enabling early diagnosis, appropriate management, and improved outcomes in children.

121. Investigating Serum Uric Acid as a Predictive Marker for Coronary Artery Disease in Diabetes Mellitus at a Tertiary Care Centre in Andhra Pradesh
Bupesh Parasa, Tippani Srilatha, Neeli Harika
Abstract
Background: Diabetes mellitus is a common metabolic disorder associated with an increased risk of cardiovascular diseases, particularly coronary artery disease (CAD). Chronic hyperglycaemia, endothelial dysfunction, and metabolic abnormalities in diabetes accelerate the process of atherosclerosis. Serum uric acid, the final product of purine metabolism, has been increasingly recognized as a potential biochemical marker associated with cardiovascular risk. Elevated serum uric acid levels may contribute to oxidative stress, inflammation, and endothelial dysfunction, thereby promoting the development of coronary artery disease. Identifying simple and cost-effective markers such as serum uric acid may help in early detection and risk stratification of cardiovascular complications in diabetic patients. Objectives: To evaluate serum uric acid levels in patients with diabetes mellitus and to assess its association with coronary artery disease. Materials and Methods: This hospital-based observational study was conducted in the Department of General Medicine at a tertiary care hospital in Andhra Pradesh over a period of one year. A total of 102 patients diagnosed with type 2 diabetes mellitus were included in the study. Detailed clinical history, physical examination, and laboratory investigations were carried out for all participants. Investigations included fasting blood glucose, post-prandial blood glucose, HbA1c, lipid profile, serum creatinine, and serum uric acid levels. Coronary artery disease was diagnosed based on clinical assessment and electrocardiographic findings. Statistical analysis was performed, and a p-value less than 0.05 was considered significant. Results: Among the 102 patients, coronary artery disease was present in 45.1% of cases. The mean serum uric acid levels were significantly higher in diabetic patients with coronary artery disease compared to those without CAD. 
Elevated uric acid levels showed a significant association with the presence of coronary artery disease. Other factors such as longer duration of diabetes, hypertension, and increased body mass index were also associated with CAD. Conclusion: Serum uric acid levels were significantly associated with coronary artery disease in patients with diabetes mellitus. Estimation of serum uric acid may serve as a simple and useful marker for identifying diabetic patients at higher risk of cardiovascular complications.

122. Evolution of Surgical Antibiotic Prophylaxis: A Prospective Audit of Practice Change at Government Medical College & Hospital, Purnea, Bihar, India
Meera Kumari, Amar Kishor, Aparajita Anupam, Tarkeshwar Kumar
Abstract
Background: Surgical antibiotic prophylaxis (SAP) is one of the most important perioperative measures for preventing surgical site infection (SSI), yet inappropriate agent selection, poor timing, lack of intraoperative redosing, and unnecessarily prolonged postoperative courses remain common in routine practice. In low- and middle-income settings, broad-spectrum antibiotic overuse also contributes to antimicrobial resistance and excess cost. Aim: To evaluate the temporal evolution of SAP practices and their clinical correlates among patients undergoing surgery at a tertiary-care teaching hospital in Bihar, India. Methods: This prospective hospital-based audit included 65 consecutive patients undergoing eligible clean, clean-contaminated, or selected contaminated procedures in the departments of general surgery, orthopedics, obstetrics and gynecology, and ENT at Government Medical College & Hospital, Purnea, Bihar, between 10 February 2025 and 31 March 2026. For analysis of practice evolution, the cohort was divided into an early phase (February-October 2025; n=31) and a late phase (November 2025-March 2026; n=34). SAP was assessed for antibiotic choice, timing before incision, intraoperative redosing when indicated, postoperative duration, WHO AWaRe category, and complete guideline concordance. Clinical outcomes included 30-day SSI, postoperative length of stay, and antibiotic-associated adverse events. Results: Baseline characteristics were comparable between the two phases. Over time, SAP practice shifted toward narrower-spectrum cefazolin-based regimens and shorter postoperative duration. Correct antibiotic choice improved from 45.2% in the early phase to 76.5% in the late phase (p=0.012), administration within 60 minutes before incision from 54.8% to 85.3% (p=0.013), appropriate intraoperative redosing from 33.3% to 80.0% among indicated cases (p=0.043), and discontinuation within 24 hours from 25.8% to 67.6% (p=0.001). 
Complete guideline concordance rose from 22.6% to 64.7% (p=0.001). Mean prophylaxis duration decreased from 3.15 ± 1.83 to 1.31 ± 0.91 days (p<0.001), and use of WHO AWaRe Access regimens increased from 45.2% to 85.3% (p=0.001). Thirty-day SSI declined numerically from 19.4% to 5.9%, although this did not reach statistical significance (p=0.138). Postoperative length of stay was significantly shorter in the late phase (4.41 ± 1.32 vs 5.43 ± 1.93 days; p=0.018). Conclusion: During the study period, SAP practice evolved from more heterogeneous, broader-spectrum, and longer-duration prescribing toward more guideline-concordant prophylaxis characterized by cefazolin-based selection, better pre-incision timing, improved redosing, and earlier discontinuation. This improvement was associated with shorter hospital stay and a favorable trend toward lower SSI. Continuous audit, feedback, and stewardship-focused standardization appear central to optimizing perioperative antibiotic use in resource-constrained tertiary-care settings.

123. Incidence of Incisional Hernia After Laparoscopic versus Open Abdominal Surgery
Aparajita Anupam, Amar Kishor, Meera Kumari, Tarkeshwar Kumar
Abstract
Background: Incisional hernia remains one of the most important late complications of abdominal surgery. Although laparoscopic surgery reduces parietal trauma, extraction-site and port-site fascial defects still carry a measurable hernia risk. Aim: To compare the incidence of incisional hernia after laparoscopic versus open abdominal surgery at a tertiary care teaching hospital. Methods: This comparative hospital-based observational study was conducted in the Department of General Surgery, Government Medical College & Hospital, Purnea, Bihar, India, from 15 February 2025 to 20 March 2026. Eighty adult patients undergoing abdominal surgery were studied: 40 laparoscopic and 40 open procedures. Patients were followed clinically at 1, 3 and 6 months, and ultrasonography was used when the diagnosis was equivocal. Baseline, perioperative, wound-related and hernia-related variables were compared using chi-square/Fisher exact or Student’s t test. Binary logistic regression was performed to identify independent predictors of incisional hernia. Results: Baseline demographic and procedural profiles were broadly comparable between groups. Laparoscopic surgery was associated with shorter operative time (84.81 ± 18.20 vs 96.14 ± 23.67 min; p=0.019), lower blood loss (60.47 ± 28.55 vs 114.89 ± 41.72 mL; p<0.001), and shorter hospital stay (3.57 ± 1.15 vs 5.66 ± 2.19 days; p<0.001). The primary outcome, incisional hernia, occurred in 3/40 (7.5%) laparoscopic patients versus 12/40 (30.0%) open-surgery patients (relative risk 0.25, 95% CI 0.08-0.82; Fisher exact p=0.020). Hernias appeared earlier after open surgery (median 4.5 months) than after laparoscopy (median 7.4 months), and 91.7% of open-group hernias were midline. On multivariable analysis, open approach (adjusted OR 5.78, 95% CI 1.30-25.64; p=0.021) and BMI ≥25 kg/m² (adjusted OR 9.08, 95% CI 1.87-44.08; p=0.006) independently predicted incisional hernia. 
Conclusion: In this cohort, laparoscopic abdominal surgery was associated with a substantially lower incidence of incisional hernia than open surgery, with a more favorable early postoperative profile. Where technically feasible, minimally invasive access together with meticulous fascial closure may reduce the burden of postoperative abdominal wall morbidity.
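The relative risk and confidence interval reported in this abstract (RR 0.25, 95% CI 0.08–0.82, from 3/40 vs 12/40 hernias) can be reproduced directly from the raw counts. As an illustrative cross-check only, not part of the original analysis, a minimal sketch using the standard Katz log method:

```python
import math

def relative_risk_ci(a, n1, c, n2, z=1.96):
    """Relative risk with a Katz log-method 95% CI.
    a/n1 = events/total in the exposed group, c/n2 in the comparator."""
    rr = (a / n1) / (c / n2)
    # Standard error of ln(RR): sqrt(1/a - 1/n1 + 1/c - 1/n2)
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Incisional hernia: 3/40 laparoscopic vs 12/40 open
rr, lo, hi = relative_risk_ci(3, 40, 12, 40)
print(f"RR {rr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")  # RR 0.25, 95% CI 0.08-0.82
```

The computed interval agrees with the values reported in the abstract.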

124. Optimal Timing for Laparoscopic Cholecystectomy Following Acute Cholecystitis
Meera Kumari, Aparajita Anupam, Amar Kishor, Tarkeshwar Kumar
Abstract
Background: Although early laparoscopic cholecystectomy is widely recommended for acute calculous cholecystitis, the most advantageous timing within the early period remains debated, especially in resource-constrained public hospitals. Aim: To compare perioperative and recovery outcomes of laparoscopic cholecystectomy performed within 72 hours, between 72 hours and 7 days, and after interval delay beyond 6 weeks following an attack of acute cholecystitis. Methods: This prospective observational study was conducted in the Department of General Surgery, Government Medical College & Hospital, Purnea, Bihar, India, from 15 February 2025 to 20 March 2026. Ninety adult patients with Tokyo Guidelines 2018-based acute calculous cholecystitis underwent laparoscopic cholecystectomy and were stratified by timing of surgery: Group A (≤72 h; n=32), Group B (72 h–7 d; n=31), and Group C (>6 weeks after initial conservative treatment; n=27). Demographic variables, inflammatory indices, operative findings, postoperative recovery, readmission, and complications were analyzed using ANOVA, chi-square testing, and multivariable logistic regression. Results: Baseline age, sex, body mass index, ASA class, and Tokyo grade distribution were comparable across groups, whereas CRP was higher with later timing (55.7 ± 22.3 vs 71.2 ± 24.3 vs 82.9 ± 22.3 mg/L; p<0.001). Operative difficulty increased progressively from Group A to Group C, with longer operative time (68.3 ± 11.5 vs 73.7 ± 11.5 vs 80.3 ± 11.9 min; p<0.001) and greater blood loss (43.7 ± 15.1 vs 58.1 ± 19.2 vs 67.8 ± 26.0 mL; p<0.001). Total hospital stay was lowest in Group A and highest in Group C (3.1 ± 0.9 vs 4.2 ± 1.1 vs 8.1 ± 1.8 days; p<0.001). Overall postoperative complications increased significantly with delay (9.4% vs 16.1% vs 37.0%; p=0.024), as did preoperative readmission before definitive surgery (3.1% vs 9.7% vs 29.6%; p=0.009). 
On multivariable analysis, surgery at 72 hours to 7 days (aOR 17.72, 95% CI 1.97–159.44; p=0.010), surgery after 6 weeks (aOR 32.25, 95% CI 3.54–294.02; p=0.002), and CRP ≥100 mg/L (aOR 7.23, 95% CI 1.43–36.62; p=0.017) independently predicted prolonged postoperative stay. Conclusion: Laparoscopic cholecystectomy within 72 hours produced the best operative and recovery outcomes. Same-admission surgery within 7 days remained acceptable and preferable to routine interval delay. Deferring surgery beyond 6 weeks increased recurrent morbidity, total hospitalization, and postoperative complications.

125. Single-Dose vs. Multi-Dose Antibiotic Prophylaxis in Major Clean-Contaminated Surgery: A Meta-Analysis
Aparajita Anupam, Meera Kumari, Amar Kishor, Tarkeshwar Kumar
Abstract
Background: The optimum duration of perioperative antibiotic prophylaxis in major clean-contaminated surgery remains debated in routine practice, particularly in settings where postoperative continuation is still common despite stewardship-oriented guidance. Aim: To compare single-dose versus multi-dose antibiotic prophylaxis in major clean-contaminated surgery with respect to surgical site infection, antibiotic-associated morbidity, hospital stay, and direct prophylaxis cost. Methods: This hospital-based comparative analytical study was conducted in the Department of General Surgery, Government Medical College & Hospital, Purnea, Bihar, India, from 15 February 2025 to 20 March 2026. Eighty adult patients undergoing major clean-contaminated operations were allocated to a single-dose prophylaxis group (n=40) or a multi-dose prophylaxis group (n=40) according to the institutional perioperative plan. Baseline characteristics, operative variables, timing and appropriateness of prophylaxis, 30-day SSI, antibiotic-related adverse events, length of stay, and cost were analyzed. Results: Baseline profile and operative complexity were comparable between groups. SSI occurred in 4/40 (10.0%) patients in the single-dose group and 5/40 (12.5%) in the multi-dose group (OR 0.78, 95% CI 0.19-3.14; p=1.000). Single-dose prophylaxis markedly reduced postoperative antibiotic exposure (1.0 ± 0.0 vs 3.7 ± 1.1 days; p<0.001), total prophylactic doses (1.2 ± 0.4 vs 6.8 ± 2.1; p<0.001), direct prophylaxis drug cost (₹1280 ± 340 vs ₹2360 ± 520; p<0.001), and postoperative stay (5.2 ± 1.6 vs 6.1 ± 2.0 days; p=0.029). Antibiotic-associated adverse events were less frequent with single-dose prophylaxis (5.0% vs 22.5%; OR 0.18, 95% CI 0.04-0.90; p=0.048). Conclusion: In major clean-contaminated surgery, single-dose prophylaxis provided SSI control comparable to multi-dose prophylaxis while offering clear stewardship, safety, and cost advantages. 
These findings support protocolized restriction of prophylaxis duration in uncomplicated clean-contaminated procedures.

126. Single-Incision vs Multi-Port Laparoscopic Surgery: A Comparative Review of Post-Operative Recovery in Abdominal Procedures
Amar Kishor, Aparajita Anupam, Meera Kumari, Tarkeshwar Kumar
Abstract
Background: Single-incision laparoscopic surgery (SILS) has been promoted as an evolution of conventional multi-port laparoscopic surgery (MPLS) with the potential to improve cosmetic outcomes and accelerate postoperative recovery, but evidence from routine tertiary-care practice remains heterogeneous. Aim: To compare postoperative recovery after SILS and MPLS for common benign abdominal procedures at Government Medical College & Hospital, Purnea, Bihar, India. Methods: This comparative observational study included 70 patients who underwent laparoscopic abdominal procedures between 10 February 2025 and 25 March 2026. Patients were allocated into SILS (n=35) and MPLS (n=35) groups based on the surgical approach used. Recovery endpoints included pain scores at 6, 24, and 48 hours, time to ambulation, hospital stay, analgesic requirement, cosmetic satisfaction, and perioperative complications. Comparative statistics and multivariable logistic regression for prolonged hospital stay (>3 days) were performed. Results: Baseline demographic and procedural characteristics were comparable between groups. SILS was associated with a longer operative time than MPLS (63.29 ± 12.54 vs 53.91 ± 8.09 min; p<0.001), but demonstrated lower pain scores at 6 hours (3.25 ± 0.99 vs 4.17 ± 0.98; p<0.001), 24 hours (2.31 ± 0.69 vs 3.11 ± 0.85; p<0.001), and 48 hours (1.34 ± 0.59 vs 1.78 ± 0.92; p=0.037). SILS also achieved earlier ambulation (10.95 ± 2.49 vs 13.24 ± 3.11 h; p=0.002), shorter hospital stay (2.24 ± 0.65 vs 3.00 ± 0.91 days; p<0.001), fewer analgesic doses during the first 48 hours (2.10 ± 0.70 vs 2.54 ± 0.84; p=0.016), and higher satisfaction/cosmesis scores (8.80 ± 0.85 vs 7.92 ± 0.82; p<0.001). Overall postoperative complications were comparable (20.0% vs 17.1%; p=1.000). On adjusted analysis, SILS independently reduced the odds of prolonged hospitalization (adjusted OR 0.20, 95% CI 0.04-0.89; p=0.035). 
Conclusion: In selected abdominal procedures, SILS provided superior early postoperative recovery and cosmetic satisfaction without increasing short-term complications, although operative duration was longer. SILS appears to be a valuable selective alternative to MPLS when surgical expertise and patient selection are appropriate.

127. The Impact of Social Determinants of Health on Surgical Outcomes: A Systematic Review
Amar Kishor, Meera Kumari, Aparajita Anupam, Tarkeshwar Kumar
Abstract
Background: Social determinants of health (SDOH) influence access to care, timeliness of presentation, perioperative optimization, and recovery, yet evidence from resource-constrained Indian surgical settings remains limited. Aim: To evaluate the association between SDOH vulnerability and postoperative outcomes among adult surgical patients treated at Government Medical College & Hospital, Purnea, Bihar, India. Materials and Methods: This prospective observational analytical study was based on an internally modeled cohort of 70 adults undergoing general surgical procedures between 10 February 2025 and 25 March 2026. SDOH variables included education, income, insurance, rural residence, delayed presentation, housing crowding/instability, and social support. A composite SDOH vulnerability score categorized patients as low (0–1 factors), moderate (2–3 factors), or high (≥4 factors). The primary outcome was a composite adverse surgical outcome comprising at least one of the following: significant postoperative complication, surgical site infection, prolonged postoperative stay, ICU requirement, readmission, or death. Categorical variables were compared using the χ² test and continuous variables using one-way ANOVA. Univariable and multivariable logistic regression estimated odds ratios (ORs) with 95% confidence intervals (CIs). Results: Mean age was 46.41 ± 13.53 years; 35/70 (50.0%) patients were men and 31/70 (44.3%) resided in rural areas. Composite adverse outcome occurred in 25/70 (35.7%) patients and increased stepwise across SDOH vulnerability groups: 2/22 (9.1%) in the low-vulnerability group, 10/26 (38.5%) in the moderate-vulnerability group, and 13/22 (59.1%) in the high-vulnerability group (p=0.002).
High vulnerability was also associated with longer hospital stay (9.0 ± 2.6 vs 5.1 ± 1.7 days, p<0.001), more prolonged postoperative stay >7 days (50.0% vs 9.1%, p=0.011), more ICU requirement (27.3% vs 4.5%, p=0.048), and more 30-day readmission (27.3% vs 0.0%, p=0.013). In multivariable analysis, high SDOH vulnerability independently predicted composite adverse outcome (adjusted OR 7.01, 95% CI 1.16–42.45; p=0.034), while emergency surgery remained an additional independent predictor (adjusted OR 4.03, 95% CI 1.24–13.07; p=0.020). Conclusion: Greater SDOH vulnerability was associated with worse postoperative outcomes, longer hospitalization, and higher resource utilization. Integrating SDOH screening into perioperative assessment may help identify high-risk patients early and guide targeted supportive interventions in tertiary-care surgical services.

128. Intrathecal Fluorescein–Guided Localization of Anterior and Lateral Skull Base Cerebrospinal Fluid Leaks: A Prospective Observational Study
Vinayak Chandran, Jinish Justin Raj Ajithakumari
Abstract
Background: Cerebrospinal fluid (CSF) leak resulting from defects in the anterior and lateral skull base is an important clinical condition due to the associated risk of meningitis and other intracranial complications. Accurate localization of the leak is essential for successful surgical repair. Intrathecal fluorescein has been used as an adjunct during endoscopic surgery to help identify the precise site of CSF leakage. Objective: To evaluate the usefulness and safety of intrathecal fluorescein in the intraoperative localization and endoscopic repair of CSF leaks originating from the anterior and lateral skull base. Methods: This prospective observational study included patients diagnosed with CSF rhinorrhea or otorrhea due to anterior or lateral skull base defects who underwent endoscopic repair with the aid of intrathecal fluorescein. Demographic characteristics, etiology of the leak, anatomical location of the defect, success of intraoperative localization using fluorescein, surgical outcomes, and complications were recorded and analyzed. Results: The mean age of the patients was 42.3 years, with a male predominance (60%). Traumatic causes accounted for 45% of cases, followed by spontaneous (30%) and iatrogenic (25%) etiologies. The cribriform plate was the most common site of CSF leak (40%). Intrathecal fluorescein successfully aided in intraoperative localization of the leak in 92.5% of patients. The overall surgical success rate was 95%, with recurrence observed in two patients during follow-up. Only minor complications such as headache and nausea were noted, and no major neurological complications were encountered. Conclusion: Intrathecal fluorescein is a useful adjunct in the endoscopic management of CSF leaks from the anterior and lateral skull base, as it improves intraoperative localization of the defect. Its use contributes to high surgical success rates with minimal complications. 
When administered in low doses with appropriate precautions, intrathecal fluorescein can be considered a safe and effective tool in endoscopic CSF leak repair.

129. Study of Biliary Anatomy and Biliary Outcome in Children Undergoing Liver Transplantation
Manjuladevi E., Sanjay Rao, Ashley Lucien Joseph D. Cruz
Abstract
Background: Pediatric liver transplantation is the definitive treatment for many children with end-stage liver disease and selected metabolic and hepatobiliary disorders. Despite advances in surgical techniques and perioperative care, biliary complications remain a major cause of postoperative morbidity, particularly in living donor liver transplantation where graft biliary anatomy and reconstruction are technically demanding. The present study was undertaken to describe the clinicopathological and surgical profile of pediatric liver transplant recipients and to analyse the incidence and risk factors of biliary complications. Materials and Methods: This retrospective descriptive study included 47 children who underwent liver transplantation at a tertiary care center. Data were collected from medical records, operative notes, imaging records, and follow-up documentation. Recipient characteristics, indications for transplantation, donor profile, graft type, number of graft bile ducts, type of biliary reconstruction, and early postoperative surgical complications were recorded. For biliary outcome analysis, 42 children with adequate follow-up were included. Categorical variables were expressed as frequency and percentage, and continuous variables as Mean±SD. Fisher’s exact test was used to assess associations between selected variables and biliary complications. Odds ratios with 95% confidence intervals were calculated for statistically significant risk factors. Results: The mean age of recipients was 35.2±29.7 months and mean weight was 10.9±5.1 kg; 72.3% were male. Biliary atresia was the most common indication for transplantation (51.1%), followed by metabolic disorders (14.9%). Most grafts were from living donors (93.6%), predominantly mothers (63.8%), and the left lateral segment was the commonest graft type (85.1%). A single bile duct was present in 65.9% of grafts, while multiple bile ducts were observed in the remainder. 
Roux-en-Y choledochojejunostomy was the principal method of biliary reconstruction (95.7%). Early postoperative surgical complications occurred in 42.6% of recipients, with biliary leak being the most frequent. Among the 42 children included in the biliary analysis, 12 (28.6%) developed biliary complications. Multiple bile ducts in the graft were significantly associated with biliary complications (53.3% vs 14.8%; p=0.013; OR 6.57, 95% CI: 1.51–28.54). Hepatic artery thrombosis was also a significant risk factor, with all affected children developing biliary complications (p=0.004; OR 32.29, 95% CI: 1.58–661.13). Other variables were not significantly associated. Conclusion: Pediatric liver transplantation in this cohort was performed predominantly in young children with biliary atresia using living donor left lateral segment grafts. Biliary complications represented a substantial source of postoperative morbidity. Multiple graft bile ducts and hepatic artery thrombosis were the principal risk factors for biliary complications. Careful preoperative assessment, meticulous biliary reconstruction, and close postoperative vascular and biliary surveillance are essential to reduce biliary morbidity and improve outcomes after pediatric liver transplantation.
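The odds ratio for multiple graft bile ducts (OR 6.57, 95% CI 1.51–28.54) can likewise be reproduced with the Woolf log method, assuming the reported percentages in the 42-child analysis correspond to 8/15 complications with multiple ducts (53.3%) and 4/27 with a single duct (14.8%); these counts are inferred from the percentages, not stated in the abstract.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Woolf (log) 95% CI for a 2x2 table:
    a, b = events/non-events in group 1; c, d = events/non-events in group 2."""
    odds_ratio = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of ln(OR)
    lo = math.exp(math.log(odds_ratio) - z * se)
    hi = math.exp(math.log(odds_ratio) + z * se)
    return odds_ratio, lo, hi

# Biliary complications: 8/15 with multiple graft ducts vs 4/27 with a single duct
# (counts inferred from the 53.3% vs 14.8% figures in the abstract)
or_, lo, hi = odds_ratio_ci(8, 7, 4, 23)
print(f"OR {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")  # OR 6.57, 95% CI 1.51-28.54
```

Under these inferred counts, the result matches the published OR and CI.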

130. Serum Magnesium Levels and their Association with Cardiac Complications and In-Hospital Mortality in Acute Myocardial Infarction Patients
M. Sharana, Lohitashwa S. B.
Abstract
Background: Acute myocardial infarction remains a leading cause of cardiovascular mortality worldwide. Magnesium, an essential cofactor in numerous enzymatic reactions, plays a critical role in cardiac electrophysiology and myocardial function. Hypomagnesemia has been associated with increased risk of ventricular arrhythmias and adverse outcomes in critically ill patients. However, the prognostic significance of serum magnesium levels at admission in acute myocardial infarction patients remains incompletely understood. This study aimed to evaluate the relationship between admission serum magnesium levels and the occurrence of cardiac complications and in-hospital mortality in patients with acute myocardial infarction. Methods: This prospective observational study was conducted over eighteen months and included 160 patients admitted with acute myocardial infarction. Serum magnesium levels were measured at admission, and patients were categorized into hypomagnesemia and normomagnesemia groups based on serum magnesium levels below or above 1.7 mg/dL respectively. All patients were monitored for the development of cardiac complications including ventricular arrhythmias, heart failure, cardiogenic shock, and in-hospital mortality. Statistical analysis was performed using appropriate tests with p-value less than 0.05 considered significant. Results: Hypomagnesemia was detected in 56.25% of patients at admission. The hypomagnesemia group demonstrated significantly higher incidence of ventricular tachycardia (31.11% vs 12.86%, p=0.004), ventricular fibrillation (17.78% vs 4.29%, p=0.007), heart failure (42.22% vs 21.43%, p=0.003), and cardiogenic shock (20.00% vs 7.14%, p=0.018). In-hospital mortality was substantially higher in hypomagnesemic patients (23.33% vs 8.57%, p=0.011). Serum magnesium levels showed significant negative correlation with cardiac complications. 
Conclusion: Hypomagnesemia at admission was independently associated with increased cardiac complications and in-hospital mortality in acute myocardial infarction patients. Routine measurement of serum magnesium levels may serve as a valuable prognostic marker for risk stratification in this population.

131. A Cross-Sectional Study to Evaluate Retinal Changes in Patients with Hypertension Using Fundus Examination
Kundan Malhotra, Mansi Patel, Maheshkumar Rajpura
Abstract
Background: Hypertension is a major public health problem associated with significant systemic and ocular complications. The retina provides a unique opportunity to directly visualize vascular changes, and hypertensive retinopathy serves as an important indicator of target organ damage. Aim: To evaluate retinal changes in patients with hypertension using fundus examination and to assess their association with clinical parameters. Materials and Methods: This cross-sectional observational study was conducted on 120 hypertensive patients attending the ophthalmology outpatient department of a tertiary care hospital. Patients aged ≥18 years were included after obtaining informed consent. Detailed demographic and clinical data, including duration and control of hypertension, were recorded. All patients underwent comprehensive ophthalmic evaluation, including fundus examination. Retinal changes were graded using the Keith–Wagener–Barker classification. Statistical analysis was performed using appropriate tests, with p < 0.05 considered significant. Results: Hypertensive retinopathy was observed in 60% of patients. Grade 1 retinopathy was the most common finding (25%), followed by Grade 2 (20.8%), while severe grades were less frequent. A significant association was found between the presence of retinopathy and duration of hypertension as well as blood pressure control status (p < 0.05). Patients with longer disease duration and uncontrolled hypertension showed a higher prevalence of retinal changes. Conclusion: Hypertensive retinopathy is common among hypertensive patients and correlates significantly with disease duration and control. Fundus examination is a valuable, non-invasive tool for early detection and risk assessment, emphasizing the need for regular ophthalmic screening in hypertensive individuals.

132. Comparative Study of Perforator Ligation versus Ultrasound-Guided Sclerotherapy for Varicose Veins
Manepalli Uma Mounica, Chepuri Chandrakala, Maku Venkata Ravindra
Abstract
Background: Varicose veins are common among professionals who stand for prolonged periods; incompetence of the venous valves impedes venous return from the lower limb against gravity and leads to varicosity. Method: Of 40 patients, 20 were treated with perforator ligation and 20 with ultrasound-guided sclerotherapy (UGFS). Venous clinical severity score (VCSS) and venous disability score (VDS) were compared between the two groups. Results: Mean VCSS and VDS values were lower with UGFS than with perforator ligation, reflecting earlier healing of varicose veins with the UGFS method. Conclusion: UGFS proved safe, reliable, and economical, and is affordable for patients of middle socioeconomic status owing to its ease of administration, absence of hospital stay and anaesthetic risk, and early return to daily work.

133. Assessment of Growth Velocity and Anthropometric Indices in Sickle Cell Disease Children
Roopali Tiwari, Purushottam Kukade, Nikesh Kumar
Abstract
Background: Sickle cell disease (SCD) is a chronic inherited hemoglobin disorder that can adversely affect growth and physical development in children. Chronic anemia, recurrent vaso-occlusive episodes, increased metabolic demands, nutritional compromise, and disease severity may contribute to impaired growth and altered anthropometric indices. Assessment of growth velocity and anthropometric parameters is therefore important for understanding disease burden and guiding clinical management in children with SCD. Materials and Methods: This longitudinal observational study was conducted in the Department of Pediatrics of a tertiary care hospital and at the Sickle-Thal Society, Amravati, Maharashtra. The study was carried out over a period of 18 months, with follow-up of participants at 3-month intervals up to 15 months. A total of 70 children aged 2–16 years with confirmed sickle cell disease who completed follow-up were included in the final analysis. Data were collected using a predesigned proforma. Anthropometric assessment included growth percentile, height percentile, weight percentile, and body mass index, recorded using Indian Academy of Pediatrics growth charts. Disease severity and hydroxyurea use were also documented. Data were analyzed using IBM SPSS Statistics version 26.0. Associations between categorical variables were assessed using Chi-square test or Fisher’s exact test, and a p-value of <0.05 was considered statistically significant. Results: Of the 70 participants, 43 (61.42%) were aged below 10 years and 41 (58.50%) were males. Severe disease was observed in 37 (53%) participants, while 33 (47%) had non-severe disease. Hydroxyurea was used by 30 (43%) participants. A significantly higher proportion of children with severe disease had poor anthropometric status. Growth percentile below the 3rd percentile was seen in 21 severe versus 6 non-severe cases. Height percentile below the 3rd percentile was observed in 19 severe versus 5 non-severe cases. 
Weight percentile below the 25th percentile was found in 31 severe versus 15 non-severe cases. Hydroxyurea use was significantly associated with severity status, whereas age and gender were not significantly associated with disease severity. Conclusion: Children with sickle cell disease, particularly those with severe disease, are at increased risk of poor growth and adverse anthropometric outcomes. Lower growth, height, and weight percentiles were significantly associated with disease severity. Regular growth monitoring and early targeted interventions may help improve overall health outcomes in these children.

134. Comparative Study of Intrathecal Levobupivacaine versus Ropivacaine for Spinal Anaesthesia in Elective Infraumbilical Surgeries: A Prospective Randomized Double-Blind Trial
Hina Narendrabhai Gondaliya, Hitesh G. Nathani, Jigneshkumar L Parmar
Abstract
Background: Levobupivacaine and ropivacaine are single-enantiomer local anaesthetic agents developed as safer alternatives to racemic bupivacaine for neuraxial blockade. Despite their established favorable cardiac and neurotoxicity profiles, comprehensive comparative data regarding their intrathecal efficacy, block characteristics, and hemodynamic effects in spinal anaesthesia remain limited and inconsistent. This study aimed to compare the clinical efficacy and safety of isobaric intrathecal levobupivacaine versus isobaric ropivacaine for spinal anaesthesia in elective infraumbilical surgical procedures. Methods: This prospective, randomized, double-blind study enrolled 120 patients (ASA I–II, aged 18–65 years) scheduled for elective infraumbilical surgeries. Patients were randomly allocated into two groups: Group L (n=60) received intrathecal isobaric levobupivacaine 0.5% (15 mg, 3 mL) and Group R (n=60) received intrathecal isobaric ropivacaine 0.75% (15 mg, 2 mL), with volumes adjusted to equivalence using normal saline. The primary outcomes included onset and duration of sensory and motor blockade, duration of effective analgesia, hemodynamic parameters, and adverse effects. Results: The onset of sensory blockade was comparable between groups (Group L: 4.2 ± 1.4 min vs. Group R: 4.8 ± 1.6 min; p = 0.063). The duration of sensory blockade was significantly longer in Group L (186.4 ± 22.8 min vs. 162.3 ± 20.6 min; p < 0.001). Motor blockade onset was similar, but Group L demonstrated significantly longer motor block duration (164.2 ± 18.4 min vs. 138.6 ± 16.8 min; p < 0.001). The duration of effective postoperative analgesia was significantly prolonged in Group L (248.6 ± 28.4 min vs. 208.4 ± 24.2 min; p < 0.001). Hemodynamic stability and adverse effect profiles were comparable between groups. 
Conclusion: Intrathecal isobaric levobupivacaine provides significantly longer sensory and motor blockade and prolonged postoperative analgesia compared with isobaric ropivacaine at equipotent doses, with comparable hemodynamic stability and safety profiles. Both agents represent excellent alternatives for spinal anaesthesia in infraumbilical surgeries.

135. Studying the Role of Antioxidant Status in Hypertension, Knee Osteoarthritis, and Diabetes
Amandeep Singh, Swati Padghan, Manishi Singh
Abstract
Aim: This study investigates the antioxidant status, including levels of malondialdehyde (MDA) as an oxidative stress marker and enzymes superoxide dismutase (SOD), catalase (CAT), glutathione peroxidase (GPx), and reduced glutathione (GSH), in patients with hypertension (HTN), knee osteoarthritis (OA), diabetes mellitus (DM), and healthy controls at Chirayu Medical College and Hospital, Bhopal. The primary aim was to compare these biomarkers across groups to elucidate the role of oxidative stress in disease pathogenesis and potential overlaps in biochemical profiles. Understanding these alterations could inform therapeutic strategies targeting redox imbalance. Materials and Methods: A cross-sectional study enrolled 200 participants: 50 each with HTN, knee OA, DM (diagnosed per standard criteria: BP ≥140/90 mmHg for HTN, Kellgren-Lawrence grade ≥2 for OA, HbA1c ≥6.5% for DM), and 50 controls, aged 40-65 years, from the hospital outpatient department (2025). Exclusion criteria included smoking, other comorbidities, or antioxidant supplementation. Fasting venous blood (5 mL) was analyzed for MDA (thiobarbituric acid method), SOD (pyrogallol autooxidation), CAT (Aebi method), GPx (Paglia & Valentine), and GSH (Ellman reagent) using spectrophotometry. Data were expressed as mean ± SD and analyzed via ANOVA with post-hoc Tukey test (p<0.05 significant). Results: All patient groups showed significantly elevated MDA (HTN: 4.2±0.9 nmol/mL; OA: 3.8±0.8; DM: 4.5±1.0 vs. control: 1.8±0.5, p<0.001) and reduced antioxidants (e.g., SOD: HTN 18.5±3.2 U/mg Hb; OA 20.1±3.5; DM 17.2±2.9 vs. control 28.4±4.1, p<0.001) compared to controls. HTN and DM exhibited the lowest GPx and GSH, while OA showed moderate CAT depletion. Inter-group differences highlighted DM > HTN > OA in MDA elevation (p<0.05). 
Conclusion: Depleted antioxidant status and heightened oxidative stress are evident across HTN, knee OA, and DM, suggesting a common mechanistic pathway amenable to antioxidant interventions. These findings advocate for routine biomarker monitoring and support antioxidant therapy trials in these conditions at our institution.
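The between-group comparison described above (ANOVA with post-hoc testing) reduces to a standard F-statistic computation. A minimal sketch on illustrative toy values only (not the study's data); `one_way_anova_f` is a hypothetical helper name:

```python
from statistics import mean

def one_way_anova_f(groups):
    """Return the one-way ANOVA F statistic for a list of samples."""
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total observations
    grand = mean([x for g in groups for x in g])
    # Between-group sum of squares (k - 1 degrees of freedom)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares (n - k degrees of freedom)
    ss_within = sum(sum((x - mean(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Toy MDA-like values (nmol/mL) for controls vs. one patient group
control = [1.6, 1.8, 2.0, 1.7, 1.9]
patients = [4.1, 4.4, 3.9, 4.6, 4.0]
f_stat = one_way_anova_f([control, patients])
```

A large F against the F(k-1, n-k) distribution yields the small p-values the abstract reports; post-hoc Tukey comparisons then localize which group pairs differ.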

136. A Clinicocytological Study of Thyroid Lesions in a Tertiary Care Center
Rajeswari Jayaraman, Aparajita Singh Chauhan, Shveta
Abstract
Background: Thyroid lesions are commonly encountered in clinical practice, with a wide spectrum ranging from benign conditions to malignancies. Fine needle aspiration cytology (FNAC) serves as a primary diagnostic modality for their evaluation, aiding in appropriate clinical management. Material and Methods: A prospective observational study was conducted on 232 patients presenting with thyroid lesions at a tertiary care center. Detailed clinical evaluation and FNAC were performed in all cases. Cytological interpretation was carried out using the Bethesda System for Reporting Thyroid Cytopathology. Histopathological correlation was available in 82 cases. Diagnostic parameters including sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy were calculated. Results: The majority of patients were in the 31–40 years age group (25.0%) with a female predominance (78.4%). Solitary thyroid nodule was the most common presentation (55.2%). Bethesda Category II (benign) constituted 72.4% of cases, while malignant lesions (Category VI) accounted for 8.6%. Colloid goiter was the most frequent cytological diagnosis (47.4%). Histopathological correlation showed good concordance, with FNAC demonstrating sensitivity of 81.8%, specificity of 90.0%, PPV of 72.0%, NPV of 92.9%, and overall accuracy of 87.8%. Conclusion: FNAC is an accurate, safe, and cost-effective diagnostic tool for thyroid lesions, with high specificity and diagnostic accuracy. It plays a crucial role in initial assessment and management, although histopathology remains essential for definitive diagnosis in selected cases.
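All five diagnostic parameters reported above (sensitivity, specificity, PPV, NPV, accuracy) derive from a single 2x2 table of FNAC result versus histopathology. A minimal sketch using hypothetical counts, not the study's actual table:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic-accuracy measures, returned as fractions."""
    return {
        "sensitivity": tp / (tp + fn),   # true positives among diseased
        "specificity": tn / (tn + fp),   # true negatives among non-diseased
        "ppv":         tp / (tp + fp),   # probability a positive call is right
        "npv":         tn / (tn + fn),   # probability a negative call is right
        "accuracy":    (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical counts (NOT the study's actual 2x2 table)
m = diagnostic_metrics(tp=18, fp=7, fn=4, tn=53)
```

With these illustrative counts, sensitivity is 18/22 (about 81.8%) and PPV is 18/25 (72%); the study's own figures come from its 82 histopathologically correlated cases.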

137. Effect of Chronic Smoking on the Auditory Pathway by BERA
Rupam, Kaushal Kishor Keshari, Sathyanarayan Kelegere Ravi, Indira Jha, Samridhi Arora, Dacksha
Abstract
Background: According to the National Family Health Survey-4, about 44.5% of men and 6.8% of women consume tobacco. Smoking affects the auditory pathway. Materials and Methods: This study was performed on N=30 healthy non-smoker males and N=30 healthy smoker males in the age group of 18-30 years. Informed written consent was taken. Recordings were performed after 12 hours of abstinence in chronic smokers. BERA wave latencies and amplitudes were recorded in both groups. Results: The latencies of waves I, II, and III and the inter-peak latency I-III were significantly prolonged in chronic smokers compared with non-smokers. The amplitudes of waves I, II, and III were decreased in chronic smokers. Conclusion: Smoking increases the latencies and decreases the amplitudes of BERA waves, indicating that smoking alters the auditory pathway. Smoking should therefore be considered a confounding factor when interpreting BERA.

138. To Estimate the Prevalence of Dengue at Tertiary Care Center in Darbhanga
Gopal Krishan, Sanjeev Kumar, Kanhaiya Jha
Abstract
Background: Dengue fever is a mosquito-borne viral disease caused by dengue virus (DENV) with four distinct serotypes (DENV-1, DENV-2, DENV-3, and DENV-4). It poses a significant public health burden globally, particularly in tropical and subtropical regions. This study aimed to estimate the prevalence of dengue infection at a tertiary care center in Darbhanga, Bihar, India. Methods: A clinic-based prospective study was conducted at Darbhanga Medical College and Hospital (DMCH) from September 2022 to April 2024. A total of 1076 blood samples from suspected dengue patients were collected and tested using enzyme-linked immunosorbent assay (ELISA) for NS1 antigen, IgM, and IgG antibodies. Results: Out of 1076 patients tested, the overall seroprevalence of dengue was 13.10% (141 cases). Among the positive cases, NS1 antigen was detected in 86 (7.80%) patients, IgM antibodies in 46 (4.28%) patients, and IgG antibodies in 9 (0.83%) patients. The highest incidence was observed in the 21-30 years age group (31.50%), followed by 11-20 years (28.44%). Males (55.58%) were more commonly affected than females (44.42%). Fever was the most common presenting symptom (33.27%). Conclusion: The study demonstrates a significant prevalence of dengue infection in the Darbhanga region. The higher prevalence among young adults and males highlights the need for targeted preventive measures. Early serological diagnosis using ELISA remains crucial for effective management and control of dengue outbreaks.

139. Lichen Planus and Lichenoid Disorders: Clinical Patterns, Immunopathogenesis, and Therapeutic Challenges – A Prospective Observational Study
Amit Ranjan, Dhiraj Kumar, Shashi Kant Prasad Chaudhary
Abstract
Background: Lichen planus (LP) and lichenoid disorders represent a spectrum of chronic inflammatory mucocutaneous conditions characterized by immune-mediated epithelial damage. These disorders exhibit diverse clinical presentations and pose significant therapeutic challenges. Objective: To analyze clinical patterns, immunopathogenesis, and treatment outcomes in patients with lichen planus and lichenoid disorders in a tertiary care setting. Methods: A prospective observational study was conducted at Netaji Subhas Medical College, Amhara, Bihta, over 11 months. A total of 100 patients diagnosed clinically and/or histopathologically with LP or lichenoid disorders were included. Data regarding demographics, clinical variants, associated factors, histopathology, and treatment outcomes were collected and analyzed using SPSS v25. Results: Cutaneous LP was the most common presentation (46%), followed by oral LP (28%). Wickham’s striae were observed in 72% of cases. Significant association was found between stress and disease exacerbation (p = 0.02). Topical corticosteroids showed improvement in 68% of patients, while systemic therapy was required in 22% cases. Conclusion: Lichen planus and lichenoid disorders demonstrate heterogeneous clinical patterns with immune-mediated pathogenesis. Early diagnosis and individualized therapy are essential for optimal outcomes.

140. A Retrospective Study of Sepsis Management Outcomes, Including Mortality Rates and Length of Stay
Ravikant, Vivek Kumar, Megha Rani, Rajeev Kumar Ranjan, Himanshu Kumar, Rakesh Kumar
Abstract
Background: The global burden of sepsis is high, especially in low- and middle-income countries (LMICs), where delayed presentation and resource limitations lead to high morbidity and mortality. Analysis of institutional data is crucial in determining local disease burden and optimizing management. Methods: This retrospective observational study included 103 adult patients diagnosed with sepsis based on the Sepsis-3 criteria and admitted to our hospital between February 2025 and July 2025. Data were extracted from case sheets, laboratory records, and Intensive Care Unit (ICU) charts. Study variables included demographics, comorbidities, source of infection, intubation for primary pulmonary infection, antibiotic initiation, length of stay (LOS), in-hospital mortality, and asthmatic attacks. Data were summarized as mean ± standard deviation and percentages; chi-square and independent t-tests were applied to test associations, with p < 0.05 considered statistically significant. Results: The mean age was 56.8 ± 15.4 years, 59.2% of patients were male, and the overall in-hospital mortality rate was 28.2%. Patients >70 years of age and those admitted to the ICU had higher mortality. The mean hospital LOS was 9.6 ± 4.8 days and was significantly longer in ICU patients than in non-ICU patients. Early antibiotics were administered in 68.9% of cases. Conclusion: Sepsis represents a major mortality burden at a tertiary care centre. Poorer outcomes were associated with older age, ICU admission, and delayed presentation. Improved early identification strategies and standardized sepsis treatment protocols may reduce mortality and favourably influence the clinical course.

141. A Retrospective Analysis of Outcomes in Patients with Acute Coronary Syndrome, Including Mortality Rates and Readmission Rates
Ravikant, Vivek Kumar, Megha Rani, Rajeev Kumar Ranjan, Himanshu Kumar, Rakesh Kumar
Abstract
Background: Acute Coronary Syndrome (ACS) is one of the leading causes of morbidity and mortality in both developed and developing countries. Because mortality and readmission rates are critical indicators of the quality of cardiac care, their early assessment is essential to identify areas for improvement. Methods: A retrospective observational study of 112 adult patients (aged ≥18 years) who presented with a diagnosis of ACS (ST-segment elevation myocardial infarction (STEMI), non-ST-segment elevation myocardial infarction (NSTEMI), or unstable angina) from March 2025 to August 2025. Demographic details, risk factors, treatment modalities, length of hospital stay, and mortality and readmission outcomes were extracted from hospital medical records. Statistical analysis was carried out using SPSS software; descriptive statistics and chi-square tests were applied, with p < 0.05 considered statistically significant. Results: The mean age of patients was 58.6 ± 11.4 years, and the majority were male (69.6%). The most frequent presentation was STEMI (53.6%). In-hospital mortality was 10.7% overall and was higher among elderly and STEMI patients. The 30-day readmission rate was 15.2%, mainly due to recurrent chest pain and heart failure. Adverse outcomes were significantly associated with comorbidities such as hypertension and diabetes. Conclusion: Despite progress, ACS remains a significant clinical burden, with high mortality and readmission rates in this cohort. Efforts should focus on strengthening early intervention and post-discharge monitoring to improve outcomes. Larger multicenter studies are needed to confirm these findings and inform regional health planning.

142. Pathological Evaluation of Nutritional Anaemia
Mohd Shahnawaz Ahmed, Rohini S. Doshetty, Lt. Col. Akriti Kashyap, Netra M. Sajjan, Deepak Kumar B., V. Srinivasa Murthy
Abstract
Background: Nutritional anaemia continues to pose a significant public health challenge in developing nations, affecting individuals across all age groups. It is predominantly caused by deficiencies of iron, vitamin B12, and folate and presents with varied hematological patterns. Aim: To evaluate the morphological patterns of nutritional anaemia with respect to age and gender in a tertiary care hospital. Materials and Methods: This retrospective observational study included patients diagnosed with nutritional anaemia over a one-year period in 2024. Hematological evaluation was performed using complete blood count and peripheral blood smear examination. Anaemia was classified into microcytic hypochromic, macrocytic, and normocytic normochromic types. Data were analyzed with respect to age, gender, and morphological patterns. Results: A total of 16,711 cases were analyzed; 7,561 (45.2%) were males and 9,150 (54.8%) were females. Microcytic hypochromic anaemia was the most common morphological pattern (82.6%), followed by macrocytic anaemia (15.1%) and normocytic normochromic anaemia (2.3%). Month-wise analysis showed consistent female predominance and a predominance of microcytic hypochromic anaemia throughout the year. Conclusion: Nutritional anaemia predominantly affects children and females, with iron deficiency anaemia being the most common morphological type. Peripheral blood smear evaluation serves as an economical and reliable diagnostic tool.

143. To Predict Role of Modified Early Warning Score (MEWS) in Evaluating Mortality in Postoperative Period
Kolipey Vijayendra Kumar Babu, Kolipey Harshitha, Gosala Rajani Devi
Abstract
Background: Catastrophic deterioration of patients in hospital is frequently preceded by documented deterioration of physiological parameters. Inappropriate responses to observed abnormal vital signs and suboptimal care prior to ICU admission can lead to increased mortality. The intent of this study was to analyse the role of the Modified Early Warning Score (MEWS) in assessing the need for early intervention and SICU admission in patients undergoing elective and emergency major surgical procedures. Materials and Methods: This prospective study included 150 patients who underwent emergency or elective major surgical procedures, with monitoring of physiological parameters in the postoperative period using MEWS. For a MEWS of 1-3, monitoring was escalated; for a MEWS of 4-5, monitoring was escalated with urgent assessment by the surgical team and transfer to the ICU if required; for a MEWS of ≥6, the patient was shifted to the ICU with emergency assessment by the surgical/medical/ICU team. Outcomes were: 1) improvement in the patient's clinical condition after early goal-directed measures (frequent monitoring, transfer to the ICU); 2) discharge alive from the hospital; 3) death. Results: All patients with a MEWS of 1-7 were discharged alive, and all patients with a score of ≥8 succumbed, suggesting that a MEWS of ≥8 indicates a strict need for SICU admission and predicts increased postoperative mortality. MEWS also improved communication among nursing staff, junior doctors, and the surgical team to "flag up" and prioritize patients. Conclusions: The Modified Early Warning Score (MEWS) is an important risk-management tool that is simple to implement and effective in identifying early deterioration of patients; it can be used as a routine protocol in the postoperative period and to assess the need for ICU admission and further intervention.
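For readers unfamiliar with the score, the MEWS aggregation can be sketched as below. This uses the commonly cited Subbe et al. (2001) thresholds; the exact cut-offs used in this study may differ, so treat the numbers as illustrative rather than as the authors' protocol:

```python
def mews(sbp, hr, rr, temp_c, avpu):
    """Modified Early Warning Score (Subbe 2001 variant, illustrative).
    avpu: 'A' alert, 'V' responds to voice, 'P' to pain, 'U' unresponsive."""
    score = 0
    # Systolic blood pressure (mmHg); 101-199 scores 0
    if sbp <= 70: score += 3
    elif sbp <= 80: score += 2
    elif sbp <= 100: score += 1
    elif sbp >= 200: score += 2
    # Heart rate (beats/min); 51-100 scores 0
    if hr < 40: score += 2
    elif hr <= 50: score += 1
    elif hr <= 100: pass
    elif hr <= 110: score += 1
    elif hr <= 129: score += 2
    else: score += 3
    # Respiratory rate (breaths/min); 9-14 scores 0
    if rr < 9: score += 2
    elif rr <= 14: pass
    elif rr <= 20: score += 1
    elif rr <= 29: score += 2
    else: score += 3
    # Temperature (deg C); 35.0-38.4 scores 0
    if temp_c < 35.0 or temp_c >= 38.5: score += 2
    # Level of consciousness (AVPU scale)
    score += {"A": 0, "V": 1, "P": 2, "U": 3}[avpu]
    return score

stable = mews(sbp=120, hr=80, rr=14, temp_c=36.8, avpu="A")
deteriorating = mews(sbp=85, hr=125, rr=28, temp_c=38.9, avpu="V")
```

In this sketch the stable patient scores 0, while the deteriorating one reaches the ≥8 band that the study associated with SICU need and mortality.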

144. To Study Impact of Targeted Intervention on Knowledge and Attitude towards Perinatal Depression Among Primary Health Care Workers of Jamnagar District
Nidhish Kakkad, Preksha Vagadia
Abstract
Background: Around one in seven women experiences perinatal depression. Current Government of India programs and policies focus heavily on the physical health of mother and child in the perinatal period; however, there is a lacuna in resources and training for Primary Health Care Workers to cater to the mental health needs of women in this period. Aim: To assess the baseline knowledge and attitude of Primary Health Care Workers working with perinatal women, to train them in the identification and management of perinatal depression using a targeted training program, and to reassess the change in their knowledge and attitude post-intervention. Method: This was a pre-post intervention study with purposive sampling, conducted with Primary Health Care Workers at six Community Health Centers of Jamnagar district. A custom survey instrument was constructed, containing three sections: demography, knowledge, and attitude. The pre-post analysis of knowledge scores was done using a paired t-test, while changes in responses in the attitude section were analyzed using McNemar's test. Result: The mean total knowledge score improved from 10.0 to 11.9. There were also statistically significant positive changes in certain aspects of attitude towards perinatal depression. Conclusion: Targeted training programs could be an effective way to improve the knowledge and attitude of Primary Health Care Workers towards perinatal depression.

145. Comparative Evaluation of Inflammatory Biomarkers and Histopathological Grading in Acute Appendicitis and Their Surgical Outcomes
Kalpesh Hathi, Darshan N. Doshi, Ramesh Kachhadia
Abstract
Background: Acute appendicitis represents one of the most common surgical emergencies worldwide. The correlation between preoperative inflammatory biomarkers, histopathological severity, and postoperative outcomes remains incompletely characterized. This study aimed to evaluate the relationship between inflammatory markers, histopathological grading, and surgical outcomes in patients undergoing appendectomy for acute appendicitis. Methods: This prospective observational study included 246 patients who underwent appendectomy for acute appendicitis. Preoperative white blood cell count (WBC), neutrophil percentage, C-reactive protein (CRP), and procalcitonin levels were measured. Histopathological examination classified specimens into acute suppurative (n=98), gangrenous (n=84), and perforated appendicitis (n=64). Primary outcomes included hospital length of stay, postoperative complications, and surgical site infections. Results: Mean WBC count was significantly higher in perforated appendicitis (16.8 ± 3.4 × 10³/µL) compared to suppurative (12.3 ± 2.8 × 10³/µL) and gangrenous (14.6 ± 3.1 × 10³/µL) groups (p<0.001). CRP levels progressively increased across severity grades: suppurative (42.3 ± 18.6 mg/L), gangrenous (78.4 ± 24.3 mg/L), and perforated (128.6 ± 38.2 mg/L) (p<0.001). Procalcitonin levels ≥2.0 ng/mL demonstrated 82.3% sensitivity and 76.8% specificity for perforation. Postoperative complications occurred in 8.2% of suppurative, 19.0% of gangrenous, and 34.4% of perforated cases (p<0.001). Hospital stay increased proportionally with severity: 2.4 ± 0.8 days (suppurative), 4.2 ± 1.6 days (gangrenous), and 6.8 ± 2.4 days (perforated) (p<0.001). Conclusion: Preoperative inflammatory biomarkers strongly correlate with histopathological severity and predict surgical outcomes in acute appendicitis. Combined assessment of CRP and procalcitonin may enhance risk stratification and optimize perioperative management.

146. Knowledge and Awareness of Radiation Exposure and Safety Practices among Patients Undergoing Medical Imaging
Rajendrakumar H. Jain, Aditi Govil, Darshit R. Jain
Abstract
Background: Medical imaging procedures involving ionizing radiation are widely used for diagnosis and management, but they carry potential health risks if not properly understood and managed. Aim: To assess the knowledge and awareness of radiation exposure and safety practices among patients undergoing medical imaging. Methods: A hospital-based cross-sectional study was conducted among 150 patients using a structured questionnaire. Data were analyzed using SPSS, and associations were tested using the Chi-square test with p < 0.05 considered significant. Results: Only 29.3% of patients had good knowledge, while 42.0% and 28.7% had moderate and poor knowledge respectively. Awareness regarding CT radiation was 65.3%, while only 38.0% were aware of X-ray exposure and 32.7% knew about cancer risk. Significant association was found between education and knowledge (p = 0.002). Awareness of safety practices such as lead apron use (48.0%) and dose minimization (36.0%) was limited. Conclusion: Knowledge and awareness of radiation exposure and safety practices among patients are inadequate, highlighting the need for improved education and communication strategies in healthcare settings.

147. A Comparative Study of Surgically Induced Astigmatism in Superior versus Superotemporal Incision in Manual Small-Incision Cataract Surgery
Prakash Kumar Keshav, Gaurav Hembrom, Alka Ravi, Nandani Priyadarshini
Abstract
Background: Manual small-incision cataract surgery (MSICS) remains a highly relevant, cost-effective technique in high-volume cataract settings, but postoperative refractive rehabilitation continues to depend substantially on the amount and axis of surgically induced astigmatism (SIA). The location of the scleral tunnel is one of the most modifiable determinants of SIA. Aim: To compare the magnitude of SIA and visual outcomes following superior versus superotemporal scleral tunnel incision in MSICS. Methods: This comparative study included 90 patients (90 eyes) with age-related cataract undergoing MSICS at Bhagwan Mahavir Institute of Medical Sciences, Pawapuri, Bihar, India, with 45 eyes each in the superior-incision and superotemporal-incision groups. Preoperative keratometry, uncorrected visual acuity (UCVA), and best-corrected visual acuity (BCVA) were documented. Postoperative evaluation was performed on day 1, week 1, week 6, and week 12. SIA was assessed by vector analysis using keratometric change. Continuous variables were compared with the independent-samples t test, categorical variables with chi-square/Fisher exact tests, and multivariable linear regression was used to identify independent predictors of week-6 SIA. Results: Baseline demographic and preoperative parameters were comparable between groups. Mean SIA was significantly higher in the superior-incision group at all follow-up visits, including postoperative day 1 (1.30 ± 0.14 D vs 0.97 ± 0.12 D), week 1 (1.09 ± 0.13 D vs 0.78 ± 0.14 D), week 6 (0.90 ± 0.11 D vs 0.60 ± 0.12 D; p<0.001), and week 12 (0.84 ± 0.12 D vs 0.56 ± 0.12 D; p<0.001). Superotemporal incision provided significantly better UCVA during follow-up and lower postoperative refractive cylinder at week 6 (0.90 ± 0.14 D vs 0.69 ± 0.14 D; p<0.001). In multivariable analysis, superotemporal incision independently predicted lower week-6 SIA (adjusted β -0.293, p<0.001).
Conclusion: In this dataset, superotemporal incision in MSICS was associated with lower SIA, lower postoperative cylinder, less postoperative against-the-rule drift, and faster uncorrected visual rehabilitation than superior incision. The superotemporal approach appears preferable when the aim is to minimize postoperative astigmatism and improve early unaided vision.
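The vector analysis of SIA named in the methods is conventionally done in double-angle space, where cylinders add and subtract as ordinary vectors. A minimal sketch using a hypothetical eye's keratometry (not study data):

```python
import math

def sia(pre_cyl, pre_axis, post_cyl, post_axis):
    """Surgically induced astigmatism magnitude (dioptres) by vector
    analysis: each keratometric cylinder (magnitude in D, axis in degrees)
    is mapped into double-angle space, and SIA is the magnitude of the
    vector difference (postoperative minus preoperative)."""
    x1 = pre_cyl * math.cos(math.radians(2 * pre_axis))
    y1 = pre_cyl * math.sin(math.radians(2 * pre_axis))
    x2 = post_cyl * math.cos(math.radians(2 * post_axis))
    y2 = post_cyl * math.sin(math.radians(2 * post_axis))
    return math.hypot(x2 - x1, y2 - y1)

# Hypothetical eye: 0.50 D @ 90 degrees preop, 1.25 D @ 85 degrees postop
example = sia(0.50, 90, 1.25, 85)
```

The doubling of the axis reflects the 180-degree periodicity of astigmatism (an axis of 0 and 180 degrees are the same meridian); without it, simple subtraction of cylinders misestimates SIA whenever the axis rotates.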

148. Clinico-Etiological Profile of Hoarseness of Voice and its Association with Voice Abuse: A Prospective Tertiary-Care Study from Eastern India
Debasis Sahu, Anindita Arpita Nayak, Debasis Jena, Subhalaxmi Rautray, Smruti Swain
Abstract
Background: Hoarseness of voice is a common otolaryngological symptom with causes ranging from reversible inflammatory disorders to premalignant and malignant laryngeal disease. Local clinico-etiological data remain important for rational triage and management in high-burden tertiary-care settings. Aim: To study the clinico-etiological profile of hoarseness of voice and to evaluate its association with voice abuse in patients presenting to a tertiary-care hospital in Eastern India. Methods: This prospective observational study included 65 consecutive patients presenting with hoarseness of voice. Detailed history, clinical examination, rigid/video laryngoscopy, stroboscopy in selected cases, and biopsy where indicated were performed. Clinical variables and exposures were analyzed descriptively, and selected associations were examined using odds ratios (ORs) with 95% confidence intervals (CIs). Results: The mean age of the cohort was 46.65 ± 15.30 years; 73.8% were male and 66.2% were from rural areas. Laryngeal malignancy was the commonest diagnosis (29.2%), followed by vocal polyp (15.4%), vocal cord nodule (13.8%), laryngitis (12.3%), and vocal cord palsy (10.8%). Voice abuse was present in 41.5% of patients and showed a significant association with phonotraumatic benign lesions overall (OR 7.53, 95% CI 2.43-23.37; p<0.001) and with vocal cord nodules in particular (OR 15.58, 95% CI 1.81-133.90; p=0.003). Age >50 years, smoking, alcohol use, smokeless-tobacco use, and betel chewing were all significantly associated with laryngeal malignancy. Septic foci were significantly associated with laryngitis (OR 11.90, 95% CI 2.32-61.09; p=0.004). Conclusion: In this tertiary-care cohort, laryngeal malignancy constituted the largest etiological group among patients with hoarseness, while voice abuse was the most important modifiable factor for benign phonotraumatic lesions. 
Persistent hoarseness warrants early laryngeal assessment, targeted exposure counseling, and etiology-specific management.

149. Pharmacological Interventions for Myopia Control: A Review of Current Evidence with a Prospective Comparative Clinical Study
Prakash Kumar Keshav, Gaurav Hembrom, Alka Ravi, Nandani Priyadarshini
Abstract
Background: Childhood myopia is increasing worldwide and is associated with a rising lifetime risk of retinal detachment, myopic maculopathy, glaucoma, and other vision-threatening sequelae. Pharmacological control of myopia, particularly with low-concentration atropine, has emerged as the most widely used medical strategy, yet questions remain regarding the optimal concentration, magnitude of treatment effect, tolerability, and real-world application in Indian settings. Aim: To compare the efficacy and safety of three low-dose atropine regimens for myopia control and to interpret the findings in the context of current evidence. Methods: This prospective comparative hospital-based study was conducted in the Department of Ophthalmology, Bhagwan Mahavir Institute of Medical Sciences, Pawapuri, Bihar, India. A total of 120 children with progressive myopia were enrolled and allocated into four groups of 30 each: control (single-vision correction alone), atropine 0.01%, atropine 0.025%, and atropine 0.05%. Baseline demographic and ocular variables were recorded. The primary outcomes were 12-month change in spherical equivalent refraction (SER) and axial length (AL). Secondary outcomes included adverse events and treatment adherence. Comparative statistics, chi-square testing, analysis of variance, and multivariable linear regression were performed. Results: Baseline characteristics were comparable across the four groups (all p>0.05). At 12 months, mean SER progression was highest in the control group (-0.87 ± 0.17 D) and progressively lower in the atropine 0.01% (-0.70 ± 0.17 D), 0.025% (-0.50 ± 0.13 D), and 0.05% (-0.36 ± 0.13 D) groups (p<0.001). Mean AL elongation similarly declined from 0.38 ± 0.05 mm in controls to 0.29 ± 0.05 mm, 0.21 ± 0.05 mm, and 0.14 ± 0.05 mm, respectively (p<0.001). Photophobia increased in a dose-dependent manner (0.0%, 3.3%, 16.7%, and 26.7%; p=0.004), but no serious ocular or systemic adverse event occurred. 
In regression analysis, atropine concentration, younger age, parental myopia, and higher baseline myopia were independent predictors of greater progression. Conclusion: Low-dose atropine was effective in slowing myopia progression in this hospital-based cohort, with a clear concentration-dependent gradient of efficacy. Atropine 0.05% showed the greatest control of refractive progression and axial elongation, whereas atropine 0.025% offered a useful balance between efficacy and tolerability. The findings align with contemporary international evidence and support structured pharmacologic myopia-control protocols in routine pediatric ophthalmic practice.

150. Prevalence and Severity of Cardiac Autonomic Neuropathy in Type 2 Diabetes and Its Association with Continuous Glycemic Variability
Neelanjan Sannigrahi, Amit Chakraborty
Abstract
Background: Cardiac autonomic neuropathy (CAN) is a clinically significant yet frequently underdiagnosed complication of type 2 diabetes mellitus (T2DM), contributing substantially to cardiovascular morbidity and mortality. Glycemic variability (GV), which encompasses intraday glucose excursions beyond the scope of average glycemia, has been proposed as an independent pathophysiological driver of autonomic dysfunction. However, the relationship between objectively measured continuous GV indices and the prevalence and severity of CAN remains poorly characterised in the Indian subpopulation. Aim: To assess the prevalence and severity of CAN in patients with T2DM and to examine its association with continuous glycemic variability metrics derived from clinical and laboratory-based assessment. Methods: A cross-sectional observational study was conducted among 65 patients with established T2DM at Gouridevi Institute of Medical Sciences and Hospital, Durgapur, West Bengal, India. CAN was assessed using the standard battery of five cardiovascular autonomic reflex tests (CARTs) and heart rate variability (HRV) analysis. Glycemic variability was evaluated through parameters including mean amplitude of glycemic excursion (MAGE), coefficient of variation (% CV), time in range (TIR), time above range (TAR), and time below range (TBR). Multivariable logistic regression was performed to identify independent predictors of CAN. Results: The overall prevalence of CAN was 44.6% (n=29), with early CAN in 24.6% (n=16) and definite CAN in 20.0% (n=13). CAN was significantly associated with longer diabetes duration (12.3±5.8 vs. 6.8±4.2 years; p<0.001), higher HbA1c (9.6±1.7% vs. 8.0±1.3%; p<0.001), elevated MAGE (114.6±28.4 vs. 67.3±21.8 mg/dL; p<0.001), higher %CV (46.2±8.8 vs. 29.4±7.6; p<0.001), and reduced TIR (37.2±15.3% vs. 62.4±14.8%; p<0.001). 
On multivariable regression, %CV (OR=1.12; 95% CI: 1.08–1.17; p<0.001), MAGE (OR=1.06; 95% CI: 1.04–1.09; p<0.001), and HbA1c (OR=1.84; 95% CI: 1.38–2.46; p<0.001) emerged as independent predictors. The model demonstrated excellent discrimination (AUROC=0.89; 95% CI: 0.81–0.97). Conclusion: CAN affects nearly half of T2DM patients in this cohort. Glycemic variability indices, particularly %CV and MAGE, are independently and strongly associated with CAN, underscoring the clinical importance of glycemic variability management beyond HbA1c optimisation in the prevention of autonomic neuropathy.
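Odds ratios like those reported above follow directly from the fitted logistic-regression coefficients: OR = exp(beta), with a Wald confidence interval from the coefficient's standard error. A brief sketch of that conversion, using a hypothetical coefficient rather than the study's actual fit:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a Wald 95% confidence interval."""
    return (math.exp(beta),            # point estimate
            math.exp(beta - z * se),   # lower 95% bound
            math.exp(beta + z * se))   # upper 95% bound

# Hypothetical coefficient for a 1-unit rise in %CV (not the study's fit)
or_point, or_lo, or_hi = odds_ratio_ci(beta=0.113, se=0.021)
```

Because the coefficient applies per unit of the predictor, an OR of 1.12 per 1% rise in %CV compounds over larger differences (e.g. a 10-point rise multiplies the odds by roughly 1.12 to the tenth power).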

151. A Study on the Role of C-Reactive Protein as a Severity Marker in Acute Pancreatitis
Mohd Aquilur Rahman Khan, T.D. Varneikip Chiru, Dimpu Gangmei, Ch Gyan Singh, Ksh Raju Singh, Mohamad Shahjuddin Shah
Abstract
Background: Acute pancreatitis (AP) presents with highly variable clinical outcomes, necessitating early and effective severity assessment to guide management. C-reactive protein (CRP) has been investigated as a prognostic biomarker. Methods: A prospective longitudinal cohort of 70 adult patients with AP was studied in the Department of Surgery and Medicine from September 2022 to March 2024. Serum CRP was assayed at 24, 48, and 72 hours post-admission. All patients underwent CECT abdomen after 72 hours for CTSI grading. Results: Among the patients, 63% were male; mean age was 49.5 ± 10.3 years. Alcohol (59%) was the leading etiology, followed by biliary disease (21%). CTSI classified 35.7% as mild, 51.4% moderate, and 12.9% severe. On day 3, mean CRP values for mild, moderate, and severe AP were 44.1 ± 39.1, 140.9 ± 44.1, and 178.5 ± 54.6 mg/L, respectively, with a statistically significant difference (p < 0.001). ROC analysis identified a CRP cutoff of 155.5 mg/L (sensitivity 76.9%, specificity 89.5%, AUC 0.851). Conclusion: CRP measured on day 3 is a reliable, accessible early predictor of AP severity, with high sensitivity and specificity compared to CTSI. A cutoff of 155.5 mg/L is suggested for triaging severe disease and identifying patients likely to require intensive management or imaging.
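ROC-derived cutoffs like the 155.5 mg/L reported above are conventionally chosen by maximising Youden's J (sensitivity + specificity - 1) across candidate thresholds. A small sketch on illustrative, non-study CRP values:

```python
def youden_cutoff(values, labels):
    """Return the threshold maximising Youden's J = sens + spec - 1.
    A case is called positive when its value is >= the threshold;
    labels are 1 for disease (here, severe AP) and 0 otherwise."""
    best_j, best_cut = -1.0, None
    pos = sum(labels)
    neg = len(labels) - pos
    for cut in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if y and v >= cut)
        tn = sum(1 for v, y in zip(values, labels) if not y and v < cut)
        j = tp / pos + tn / neg - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

# Toy day-3 CRP values (mg/L); 1 = severe AP, 0 = non-severe (illustrative)
crp =    [40, 60, 95, 120, 150, 160, 175, 190, 210, 230]
severe = [0,  0,  0,  0,   0,   1,   1,   0,   1,   1]
cutoff, j = youden_cutoff(crp, severe)
```

Each candidate threshold trades sensitivity against specificity; the AUC the study reports (0.851) summarises that trade-off across all thresholds, while the chosen cutoff fixes one operating point for triage.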

152. Retinal Nerve Fiber Layer Thickness Changes Across Mild, Moderate, and High Myopia: A Cross-sectional Optical Coherence Tomography Study from a Tertiary Care Centre in Bihar, India
Prakash Kumar Keshav, Gaurav Hembrom, Alka Ravi, Nandani Priyadarshini
Abstract
Background: Myopia alters the morphology of the optic nerve head and peripapillary retina, which can substantially influence retinal nerve fiber layer (RNFL) measurements obtained by optical coherence tomography (OCT). Distinguishing physiological myopic thinning from pathologic loss is clinically important, particularly in settings where myopia and glaucoma frequently coexist. Aim: To evaluate changes in peripapillary RNFL thickness across three myopia categories—Group A (≤ -3.00 D), Group B (-4.00 to -6.00 D), and Group C (> -6.00 D)—and to determine the relationship of RNFL thickness with spherical equivalent and axial length. Methods: This hospital-based cross-sectional observational study was undertaken in the Department of Ophthalmology, Bhagwan Mahavir Institute of Medical Sciences, Pawapuri, Bihar, India. A total of 120 patients with myopia were enrolled, with 40 patients in each refractive subgroup. One eye per patient was analyzed. All participants underwent refraction, best-corrected visual acuity assessment, intraocular pressure measurement, axial length measurement, dilated fundus examination, and peripapillary RNFL analysis by spectral-domain OCT. Group-wise comparisons were performed using analysis of variance, while association analyses were performed using Pearson correlation and multivariable linear regression. Results: Global RNFL thickness decreased progressively with increasing myopia, measuring 98.6 ± 7.4 µm in Group A, 92.4 ± 8.1 µm in Group B, and 86.1 ± 9.0 µm in Group C (p < 0.001). Superior, inferior, and nasal quadrants also showed significant thinning across groups (all p < 0.001), whereas temporal quadrant thickness did not differ significantly (p = 0.177). Global RNFL thickness correlated positively with spherical equivalent (r = 0.71, p < 0.001) and negatively with axial length (r = -0.74, p < 0.001). 
In multivariable analysis, axial length remained the strongest independent predictor of lower global RNFL thickness (β = -4.82 µm/mm, p < 0.001). Conclusion: Increasing myopia, particularly high myopia, was associated with significant reduction in global and non-temporal RNFL thickness. Axial elongation emerged as the principal independent determinant of RNFL thinning. OCT interpretation in myopic eyes should therefore be individualized to refractive status and axial length to avoid misclassification of physiologic thinning as disease.

153. Serum Uric Acid—An Independent Prognostic Marker in Acute Exacerbation of COPD
Sanjay Kumar Majhi, Rajesh Kumar Meher, Rakesh Chandra Behera
Abstract
Background: Acute exacerbation of chronic obstructive pulmonary disease (AECOPD) is a major driver of hospitalization, ventilatory support, intensive care use, and early mortality. Simple biochemical markers capable of identifying high-risk patients at admission are particularly valuable in resource-constrained settings. Serum uric acid (SUA), the end product of purine metabolism, rises in tissue hypoxia, oxidative stress, and systemic inflammation and may therefore reflect the biological burden of severe exacerbation. Aim: To evaluate whether admission serum uric acid is an independent prognostic marker of adverse in-hospital outcome in patients hospitalized with AECOPD. Materials and Methods: This prospective observational study included 85 consecutive adults admitted with AECOPD. Clinical history, comorbidity profile, oxygen saturation, arterial blood gas indices, inflammatory markers, renal function, and admission SUA were recorded before major therapeutic escalation. The primary endpoint was a composite adverse in-hospital outcome. Secondary outcomes included need for non-invasive ventilation (NIV), intensive care unit (ICU) admission, in-hospital mortality, and length of stay. Comparative statistics, receiver operating characteristic (ROC) analysis, and multivariable logistic regression were performed. Results: The mean age of the cohort was 65.11 ± 8.74 years, and 66 patients (77.6%) were male. Thirty-one patients (36.5%) experienced an adverse in-hospital outcome, 22 (25.9%) required NIV, 14 (16.5%) required ICU admission, and 12 (14.1%) died during hospitalization. Admission SUA was significantly higher in patients with adverse outcome than in those without adverse outcome (7.83 ± 1.05 vs 6.62 ± 1.00 mg/dL, p<0.001). ROC analysis showed good discriminatory performance of SUA for adverse in-hospital outcome (AUC 0.799, 95% CI 0.690-0.895), with an optimal cutoff of 7.5 mg/dL. 
On multivariable analysis, high SUA (≥7.5 mg/dL) remained an independent predictor of adverse outcome (adjusted OR 7.29, 95% CI 2.52-21.04; p<0.001). Conclusion: Admission serum uric acid is a clinically useful and independent prognostic marker in hospitalized AECOPD. Because it is inexpensive, widely available, and strongly associated with escalation of care and mortality, SUA may be incorporated into early bedside risk stratification together with arterial blood gas and renal function parameters.

154. Comparison of Analgesic Effect of 0.4 mg versus 0.8 mg Intrathecal Nalbuphine as an Adjuvant to 0.5% Hyperbaric Bupivacaine in Lower Abdominal Surgeries: A Randomized Double-Blind Study
Sandeep Kumar Sharma, Rajbala, Santosh Kanwar
Abstract
Background: Intrathecal adjuvants are widely used to enhance the quality and duration of spinal anaesthesia. Nalbuphine, a mixed opioid agonist-antagonist, has shown promising results in improving postoperative analgesia with minimal side effects. Objective: To compare the analgesic efficacy of intrathecal nalbuphine 0.4 mg versus 0.8 mg as an adjuvant to 0.5% hyperbaric bupivacaine in lower abdominal surgeries. Methods: This randomized double-blind interventional study included 60 patients (ASA I–II) undergoing elective lower abdominal surgeries. Patients were divided into two groups (n=30 each): Group A received 0.4 mg nalbuphine and Group B received 0.8 mg nalbuphine with 0.5% hyperbaric bupivacaine intrathecally. Parameters assessed included onset and duration of sensory and motor block, duration of analgesia, Visual Analog Scale (VAS), hemodynamic variables, and adverse effects. Results: Demographic parameters were comparable between groups. Onset of sensory and motor block showed no significant difference (p>0.05). However, duration of analgesia was significantly prolonged in Group B (278.03 ± 7.48 min) compared to Group A (236.83 ± 9.84 min) (p<0.001). VAS scores were significantly lower in Group B at multiple postoperative intervals. Hemodynamic parameters remained stable in both groups. Although adverse effects were more frequent in Group B, the difference was not statistically significant. Conclusion: Intrathecal nalbuphine significantly prolongs postoperative analgesia. While 0.8 mg provides longer analgesia, 0.4 mg offers a better safety profile, making it an optimal dose for clinical use.

155. Phenotypic Profile of Rh and Kell Blood Group Systems Among Rh Negative Blood Donors in Blood Centres of J.L.N. Medical College and Associated Group of Hospitals, Ajmer
Vijay Kumawat, Gokul Chand Meena, Priyanka Bansod, Diksha Tripathi, Himanshu Meena
Abstract
Background: The Rh and Kell blood group systems are highly immunogenic and crucial for safe transfusion and for preventing alloimmunization, especially in Rh-negative patients. Objectives: To determine the prevalence of extended Rh (C, c, E, e) and Kell antigens and weak D antigen among Rh-negative donors in Ajmer, and to compare the findings with other studies. Methods: An 18-month cross-sectional study included 1,686 Rh D-negative donors. Standard serological techniques were used for ABO, Rh, Kell, and weak D testing. Data were analyzed using chi-square tests. Results: The e antigen was present in 100% of donors, c in 98.45%, C in 10.37%, and E in 2.01%. The dominant phenotype was dccee (rr) (87.6%). Kell positivity was 1.66%. Weak D prevalence was 0.47% (95% CI: 0.15%–0.80%) and showed a significant association with the dCcee (r’r) phenotype (p < 0.001). No significant associations were seen with age, gender, or ABO groups. Conclusion: The high dccee (rr) frequency and the rarity of Kell and weak D highlight the need for extended Rh and Kell typing to reduce alloimmunization risks. Local data support improved matching for safer transfusion practices.

156. Coping Strategy and Its Impact on Stress and Quality of Life among Primary Care Providers of Individuals with Mental Illness Visiting a Tertiary Care Centre
P. Vishalakshi, Ch. V. N. Saritha, K. Sandhya, D. Sunitha Reddy, M. Sridharan, P. Vishnu Prasad
Abstract
Background: Caregivers of individuals with mental illness often experience significant stress, which may affect their quality of life. Coping strategies may influence the level of burden experienced by caregivers. Aim: To evaluate the impact of coping strategies on stress and quality of life among primary caregivers of individuals with mental illness attending a tertiary care centre. Methods: A cross-sectional study was conducted among 200 caregivers. Data were collected using the Family Crisis-Oriented Personal Evaluation Scales (F-COPES), Zarit Burden Interview (ZBI), and WHOQOL-BREF. Statistical analysis was performed using SPSS, and Pearson correlation was used to assess associations. Results: Caregiver burden showed a significant negative correlation with coping (r = −0.25, p = 0.001) and quality of life (r = −0.807, p = 0.0001). Coping strategies demonstrated a positive association with quality of life (r = 0.20, p = 0.001). Conclusion: Adaptive coping strategies are associated with lower caregiver burden and improved quality of life. Strengthening coping mechanisms may help enhance caregiver wellbeing.

157. Central Sterile Supply Department as the Backbone of Hospital Infection Prevention and Control: A Descriptive Observational Study of Organization, Workflow, and Turnaround Time Analysis in a Tertiary Care Hospital
Jani V., Shrivastava J., Shah K.
Abstract
Healthcare-associated infections (HAIs) remain a major challenge to patient safety, quality of care, and healthcare economics worldwide. Despite advances in antimicrobial therapy and infection prevention strategies, HAIs continue to contribute significantly to morbidity, mortality, prolonged hospital stay, and increased healthcare costs. A strong infection prevention and control (IPC) program is therefore essential for all healthcare facilities. Among the various components of IPC, the Central Sterile Supply Department (CSSD) plays a pivotal yet often under-recognized role. The CSSD is responsible for the complete reprocessing cycle of reusable medical devices, including collection, decontamination, cleaning, disinfection, inspection, packing, sterilization, storage, and distribution. A descriptive observational study was conducted from September 2025 to December 2025 in a tertiary care hospital in Ahmedabad. Assessment was carried out using structured observational checklists based on WHO, NABH, and AAMI guidelines. The CSSD demonstrated a structured workflow and adherence to sterilization protocols. The cleaning and packing stages contributed significantly to turnaround time delays. The Central Sterile Supply Department is a critical component of infection prevention and control. Strengthening its organization, workflow efficiency, and turnaround time is essential for improving patient safety.

158. Erythrocyte Sedimentation Rate measured by Automated Analysers and Manual Westergren’s Method – A Comparative Study
Karthika V., Shyamala Gowri M., D. Arunthamizhpraba, Soundarrajan T., K. Raghul
Abstract
Background: Erythrocyte Sedimentation Rate (ESR) determined by Westergren’s method is used in the diagnosis and monitoring of inflammatory activity; however, it has many limitations, including inherent and technical factors. Alternative methods have been introduced to overcome the limitations of the manual method, and these must be properly evaluated before introduction into clinical laboratories. Materials and Methods: A total of 436 randomly collected blood samples were assayed simultaneously by the standard Westergren’s method and two automated methods using the Roller 20LC and Celltac α+ MEK-1305. Results: Results were subjected to statistical analysis using the Spearman rank correlation coefficient, Bland-Altman methods, and Passing-Bablok regression. Comparison of the manual Westergren’s method with the Roller 20LC and Celltac α+ MEK-1305 revealed Spearman rank correlation coefficients (ρ) of 0.781 (95% confidence interval [CI] 0.742 to 0.815, P<0.0001) and 0.774 (95% CI 0.733 to 0.809, P<0.0001), with mean biases of -2.43 and -8.25, respectively. The limits of agreement were -35.8 to 30.9 for the Roller 20LC and -53.8 to 37.3 for the Celltac α+ MEK-1305 against the reference Westergren’s method. Given their good correlation with the reference method, acceptable bias and limits of agreement, and added benefits, the automated Roller 20LC, followed by the Celltac α+ MEK-1305, is a legitimate replacement for the reference ESR method in clinical laboratories.

159. Carbapenem Resistance at a Tertiary Care Center in Mumbai: Phenotypic Detection and Challenges
Abhijeet Ingole, Swapna Mali, Preeti, Reena Set
Abstract
Background: Carbapenems (imipenem, meropenem, ertapenem, etc.) are used for the treatment of infections caused by multidrug-resistant organisms, especially in critically ill patients. The emergence of carbapenem resistance has become a global threat in hospitals and the community, contributing to increased morbidity, length of hospital stay, and economic burden. Timely detection and treatment of carbapenem resistance is a critical step in patient management. However, detection of carbapenem resistance represents a substantial challenge for clinical laboratories, especially in resource-limited setups. This study aimed to detect carbapenem resistance due to carbapenemase production by the phenotypic methods mCIM and eCIM. Methodology: This prospective study was conducted on isolates (Enterobacteriaceae and Pseudomonas aeruginosa) recovered from various clinical samples showing resistance to carbapenems by the Kirby-Bauer disc diffusion method. Isolates were subjected to carbapenemase production detection by the Clinical and Laboratory Standards Institute (CLSI)-recommended modified carbapenem inactivation method (mCIM) and EDTA carbapenem inactivation method (eCIM). Results: Of 395 isolates of Enterobacterales and Pseudomonas aeruginosa screened for carbapenem resistance, 136 (34%) showed resistance to imipenem, meropenem, or both. Among the 136 carbapenem-resistant isolates, 90 (66%) were carbapenemase producers (mCIM positive), of which 23 were metallo-beta-lactamase producers (eCIM positive). The highest prevalence of carbapenemase was seen in E. coli (82%), followed by Pseudomonas aeruginosa (75%) and Klebsiella spp. (57%). Conclusion: Routine screening of carbapenem-resistant strains for carbapenemase production by the simple, inexpensive, and highly specific mCIM and eCIM methods will be helpful in resource-limited settings. This will facilitate not only patient management but also epidemiological surveillance and prevention of the spread of infection in hospital settings.

160. Study of Risk Factors Affecting Wound Healing in Surgical Patients in A Tertiary Care Teaching Hospital
Srinivasalu Y. P., Saishyam M., Vidyasri S.
Abstract
Background: Wound healing following surgical procedures is a complex, multifactorial biological process that is frequently impaired by patient-specific and procedure-related factors. Delayed wound healing contributes significantly to postoperative morbidity, prolonged hospitalisation, and increased healthcare expenditure. Despite growing awareness, there remains a paucity of institution-specific data from South Indian teaching hospitals regarding the relative contribution of various risk factors to impaired wound healing outcomes. This study aimed to identify and evaluate the clinical risk factors associated with delayed wound healing in patients who underwent elective and emergency surgical procedures at Oxford Medical College and Research Centre, Bengaluru. Materials and Methods: A prospective observational study was conducted over a period of twelve months from January 2025 to December 2025. One hundred patients aged 18 years and above who underwent general, orthopaedic, or gynaecological surgical procedures were enrolled based on predefined inclusion and exclusion criteria. Data pertaining to patient demographics, comorbidities, lifestyle factors, nutritional status, surgical characteristics, and postoperative wound outcomes were collected systematically. Delayed wound healing was defined as failure of satisfactory wound closure within 10–14 days or presence of complications such as dehiscence, infection, or requirement for secondary intervention. Statistical analysis was performed using SPSS version 26.0. Chi-square tests, independent t-tests, and binary logistic regression were employed to identify associations and independent predictors of delayed wound healing. Results: Of the 100 patients studied, 26 (26%) experienced delayed wound healing. Diabetes mellitus was the most prevalent comorbidity among delayed healers (65.4%), followed by anaemia (50.0%), smoking (46.2%), obesity (38.5%), malnutrition (34.6%), and surgical site infection (30.8%). 
All identified risk factors demonstrated a statistically significant association with delayed healing (p < 0.05). Mean healing time was longest in diabetic patients (24.8 ± 5.6 days) compared to patients without major risk factors (13.2 ± 3.1 days). Adherence to infection-control protocols and prophylactic antibiotic administration were associated with significantly improved wound outcomes. Conclusion: Diabetes mellitus, anaemia, smoking, obesity, malnutrition, and surgical site infection are the principal risk factors for delayed wound healing in surgical patients. Early identification and preoperative optimisation of these modifiable factors, combined with stringent postoperative wound care, can substantially reduce surgical complications and improve patient outcomes.

161. Comparative Study of Different Additives in Spinal Anaesthesia: A Prospective Randomized Controlled Trial
Sujata Patel, Aditya Kansara, Rahul Acharya
Abstract
Background: Spinal anaesthesia is a widely employed regional anaesthetic technique for infraumbilical surgeries. Various intrathecal adjuvants have been investigated to prolong the duration of sensory and motor blockade, improve postoperative analgesia, and reduce the total dose requirement of local anaesthetics. However, the comparative efficacy and safety profile of commonly used additives—fentanyl, dexmedetomidine, and clonidine—when combined with hyperbaric bupivacaine remain an area of ongoing investigation. Methods: This prospective, randomized, double-blind controlled study enrolled 120 adult patients (ASA physical status I–II) undergoing elective infraumbilical surgeries under spinal anaesthesia. Patients were randomly allocated into four groups of 30 each: Group B (bupivacaine alone, control), Group BF (bupivacaine + fentanyl 25 μg), Group BD (bupivacaine + dexmedetomidine 5 μg), and Group BC (bupivacaine + clonidine 30 μg). The primary outcome measures included onset and duration of sensory and motor blockade, duration of effective analgesia, hemodynamic parameters, and adverse effects. Results: Group BD demonstrated the longest duration of sensory blockade (268.4 ± 18.6 min) and effective analgesia (342.7 ± 22.3 min), followed by Group BC (238.2 ± 16.4 min; 298.5 ± 20.1 min), Group BF (212.6 ± 14.8 min; 262.3 ± 18.7 min), and Group B (168.3 ± 12.5 min; 198.4 ± 15.2 min) (p < 0.001). The onset of sensory blockade was fastest in Group BF (2.8 ± 0.9 min). Hemodynamic stability was comparable across groups, with mild bradycardia and sedation more frequent in Group BD. Conclusion: Intrathecal dexmedetomidine as an adjuvant to hyperbaric bupivacaine provides the most prolonged sensory and motor blockade and superior postoperative analgesia compared with fentanyl, clonidine, and bupivacaine alone, with an acceptable safety profile.

162. Pterygium Excision with Thin Conjunctivo-Limbal Strip Autograft: A Novel Study
Suman Bhartiya, Prachi Shukla, Yashika Sinha, Shweta Singh
Abstract
Background: The aim of this study was to assess the efficacy of a novel surgical technique of grafting a thin conjunctivo-limbal strip autograft with autologous serum, without any glue or sutures, after pterygium excision. Material and Methods: This was a two-year prospective hospital-based interventional study conducted in the Ophthalmology department of a medical college situated in western UP. Patients provided informed written consent, and the institute’s ethical committee approved the study. A total of 268 eyes from 244 patients with primary pterygium of any grade were included in the study. Result: A total of 268 eyes with primary pterygium were treated with a thin conjunctivo-limbal strip autograft using autologous serum as tissue adhesive. The mean age was 47.5 ± 8.5 years (range 30–70 years). The study included 143 males and 101 females. The average duration of follow-up was 6 months. The average operating time was 10 ± 3 minutes. Recurrence was seen in one patient. This technique minimises overall surgical time and reduces destruction of conjunctival tissue. Conclusion: A sutureless, glue-free, thin conjunctivo-limbal strip autograft (CLSAG) secured with autologous blood is safe compared with alternative procedures, showing early healing and minimal destruction of conjunctival tissue. This technique may therefore be considered a viable alternative in pterygium surgery, offering the benefits of other methods in preventing recurrence without their drawbacks.

163. Correlation of Inflammatory Markers with Pain Severity in Patients with Rheumatoid Arthritis and Polyarthritis
Jai Prakash, Parth Sarthi, Shahina Khan, Durgesh Kumar, Manju Jyoti Chaudhary
Abstract
Background: Rheumatoid arthritis (RA) is a chronic autoimmune condition marked by persistent joint inflammation, progressive functional impairment, and systemic involvement. Commonly used inflammatory biomarkers such as rheumatoid factor (RF), erythrocyte sedimentation rate (ESR), and C-reactive protein (CRP) play an important role in assessing disease activity. Pain remains one of the most prominent clinical features of inflammatory arthritis and is frequently evaluated using the visual analogue scale (VAS). Aim & Objective: This study aimed to examine the relationship between selected inflammatory markers (RF, ESR, CRP) and pain intensity measured by VAS in patients diagnosed with inflammatory arthritis. Materials and Methods: A cross-sectional study was conducted on 64 patients attending the OPD of Dr Brra GMC, Kannauj, from July to December 2025. Participants were categorized into two groups: seropositive rheumatoid arthritis (n=32) and seronegative polyarthritis (n=32). Laboratory investigations included measurement of RF, ESR, and CRP levels. Pain intensity was assessed using VAS scoring. The association between inflammatory markers and pain scores was analyzed using Pearson’s correlation coefficient. Results: A statistically significant positive correlation was observed between VAS scores and RF (r = 0.469, p < 0.01) as well as CRP (r = 0.479, p < 0.01). ESR demonstrated a weaker but still significant correlation with pain scores (r = 0.260, p < 0.05). However, no meaningful correlation was identified between ESR and CRP levels. Conclusion: The findings indicate that inflammatory markers, particularly RF and CRP, are associated with pain severity in patients with inflammatory arthritis. These biomarkers may serve as supportive tools for clinicians in assessing disease activity and guiding patient management.

164. Precision Imaging of Duodenal Diverticula: Multi-Detector CT (MDCT) Characteristics and the Radiologist’s Role in Differentiating Periampullary Mimics
Ahana Paul, Rahul Singla, Shweta Kothari, Manidipa Paul
Abstract
Background: Duodenal diverticula (DD) are common incidental findings, yet they pose a significant diagnostic challenge by mimicking periampullary pathologies. Objective: This study aimed to characterize the MDCT features of incidentally detected DD and evaluate the role of multiplanar reconstruction (MPR) in differentiating them from clinical mimics. Methods: A prospective observational study of 300 adult patients undergoing abdominal MDCT was conducted. Morphological parameters, including location, size, content, and neck morphology, were systematically analyzed using thin-slice (≤1 mm) acquisitions and coronal MPR. Results: DD were identified in 6.7% of the cohort, with 75% located in the second part of the duodenum. MDCT revealed a “Wide Neck” sign in 95% of cases and air-fluid levels in 25%, both serving as pathognomonic markers of enteric communication. Importantly, 100% of cases showed preserved fat planes between the diverticulum and the pancreas. Conclusion: MDCT with high-resolution MPR is the gold standard for characterizing DD. Identifying the “Wide Neck” sign and the absence of peridiverticular inflammation allows confident differentiation from pancreatic pseudocysts and neoplasms, preventing unnecessary invasive interventions.

165. Spectrum of Soft Tissue Lesions – A Retrospective Histopathological Study in ESIC Kalaburagi
Shreyanka, Vishali Patale, Pavan, V. Srinivas Murthy
Abstract
Background: Soft tissue lesions show diverse histological patterns, most of which are benign. Regional variation is noted in incidence and presentation. Aim: To study the age, gender, site distribution, and histological types of soft tissue lesions received at ESIC Kalaburagi. Materials & Methods: A retrospective descriptive study was conducted over 5 years (April 2020–March 2024) at ESIC Medical College, Kalaburagi. A total of 71 histologically confirmed cases were included. Data on demographics, anatomical site, and histopathology were analyzed using descriptive statistics. Results: Most cases occurred in the 21–40 age group, with a male predominance (50.7%). The most common site was the head and neck region (42.5%), followed by the upper limb (26.7%). Lipoma was the most common lesion (45%), followed by capillary hemangioma (19.7%), neurofibroma (9.8%), and schwannoma (5.6%). Conclusion: Benign soft tissue lesions predominate, with lipomas being the most frequent. Histopathological examination is vital for accurate classification and management.

166. A Retrospective Observational Study on Acute Poisoning cases and their outcome in a Tertiary Care Hospital
Parigala Madhavi, Chittineni Krishna Prasad, Ravi P.
Abstract
Background: Acute poisoning is a major public health problem in India. It is one of the most common medical conditions requiring emergency management to prevent patient mortality. This study aimed to generate the clinico-epidemiological profile of acute poisoning cases admitted to the emergency department of a tertiary care hospital. Aim: To conduct a retrospective observational study on acute poisoning cases and their outcomes in a tertiary care hospital. Methods: A retrospective observational study of all registered poisoning cases managed at the emergency department of Government General Hospital, Machilipatnam was conducted from January to June 2025. Collected data were analyzed using descriptive and inferential statistics, with results expressed as frequency, percentage, and chi-square analysis. Results: Among the 100 selected poisoned patients, 36% had organophosphate poisoning, predominantly in the 18–50 year age group. The most common route of poisoning was ingestion (94%). Suicidal poisoning was noted in the majority of patients (60%), with marital disharmony as the main reason (50.62%); 16.2% of patients required ventilator life support during treatment. Recovery and discharge from the hospital occurred in 82% of patients, and death in 8%. Patient outcome was significantly associated with type of poisoning (P < 0.001) and motive of poison consumption (P < 0.001). Conclusion: This study contributes added evidence concerning the clinico-epidemiological profile and outcomes of acute poisoning patients admitted to Government General Hospital, Machilipatnam.

167. Urban vs Rural Exposure Patterns in Hypersensitivity Pneumonitis and their Clinical Outcomes
Pawan Kumar Shukla, Anooj Mohan, Aditi Patel
Abstract
Aim: This paper examines urban versus rural exposure patterns in HP, with an emphasis on how domestic, occupational, and environmental antigens influence presentation, disease behavior, and clinical outcomes in fibrotic and non-fibrotic disease. The specific objective is to synthesize the available literature on exposure epidemiology and prognostic outcomes, and to frame a clinically useful comparison of urban and rural HP for physicians and researchers working in interstitial lung disease. Materials and Methods: This narrative review was developed from published registry data, review articles, and prognostic literature addressing HP epidemiology, exposure sources, and outcomes. Particular weight was given to the prospective ILD-India registry because it directly reported urban-rural residence patterns, antigen patterns, and exposure odds within a large multicenter cohort of newly diagnosed interstitial lung disease patients in India. Additional prognostic evidence was taken from a comprehensive review of chronic HP that summarized survival determinants, the role of antigen identification and avoidance, lung function decline, radiologic fibrosis, and factors associated with mortality. Supporting background on reversibility of early disease and the poorer outlook of fibrotic HP was integrated from standard clinical references. Results: The available evidence does not support a simplistic view that HP is either a rural or urban disease, because both settings harbor relevant antigens but with different profiles. In the Indian registry, 70% of HP patients resided in urban areas, yet rural residence independently increased the odds of HP compared with other ILDs, with an adjusted odds ratio of 1.64.
Urban HP clustered more around indoor environmental exposures such as birds, air-conditioners, air-coolers, visible molds, and poorly maintained ventilation or cooling systems, whereas rural HP more often reflected farming, moldy organic dust, avian contact, and work-related organic aerosols. Across both settings, birds showed the strongest adjusted association with HP, followed by air-conditioners, molds, rural residence, and air-coolers in the Indian data. Conclusion: Urban and rural residence should be interpreted as proxies for exposure contexts because the clinical course is driven primarily by antigen burden, chronicity of exposure, and the extent of established lung fibrosis. Rural living may increase the odds of disease through agricultural and organic dust exposures, but urban populations can carry a large absolute burden because of household birds, molds, cooling devices, and ambient pollution-linked vulnerability. Outcomes are best when the disease is recognized early and the antigen is identified and avoided; prognosis worsens with fibrotic transformation, lower baseline pulmonary function, older age, smoking, pulmonary hypertension, and recurrent or ongoing antigen exposure.

168. A Study on Prevalence of Intestinal Helminthiasis and their Association with Eosinophil in Adults
Nilofar Rana Shams, Md. Ehtesham, Bushra Manzoor, Shaheen Zafar
Abstract
Background: Allergy and parasitic infections are common causes of blood eosinophilia. Intestinal helminthiasis remains a major health problem in many developing countries. Eosinophils are effector immune cells against parasites. Aims and Objectives: To assess the prevalence of intestinal helminth infections and associated factors among adults. Methods and Materials: This case-control study was carried out from August 2024 to July 2025 and included previously exposed adults aged 18 to 45 years with peripheral blood eosinophil counts greater than 1000 per microliter who were referred to the OPD of the Department of General Medicine for routine examination during the above period, after ethical committee clearance was obtained from the college. A total of 102 adult males and females, aged between 18 and 45 years, were enrolled in this study. Samples were chosen using a simple random method. One hundred and ninety-six samples were collected from adults presenting at Shantiniketan Medical College, Bolpur, West Bengal. Results: The mean age in the case and control groups was 31 ± 2.12 years. Males numbered 57 (55.88%) and females 45 (44.11%). By residence, 47 (46.07%) and 55 (53.92%) adults in the case and control groups, respectively, lived in urban areas, with the remainder living in rural regions. The findings revealed a significant association between gender and the prevalence of helminthic infections (p<0.01); regarding residence, statistical analysis showed no significant correlation (p=0.31), with helminthic infections found in 19.2% and 23.1% of adults living in urban and rural areas, respectively. Conclusion: The findings revealed a considerable prevalence of intestinal helminth parasites among adults with hypereosinophilia. The results also suggest that physicians should pay more attention to worm infections as an important cause of eosinophilia.

169. Role of Phacoemulsification in Primary Angle-Closure Disease after Patent Peripheral Iridotomy: A Prospective Study
Paridhi Gupta, Ravi Soni, Meemansha Maheshwari, M.K. Taneja
Abstract
Background: Primary angle-closure disease remains an important cause of glaucomatous visual loss. Although laser peripheral iridotomy is the standard initial treatment, raised intraocular pressure and persistent angle crowding may continue after a patent iridotomy. Lens extraction has therefore emerged as a potential therapeutic strategy in selected eyes with coexisting cataract. Methods: This prospective study included 48 eyes of 48 patients with primary angle closure (PAC, n=18) or primary angle-closure glaucoma (PACG, n=30), patent peripheral iridotomy, visually significant cataract, and raised intraocular pressure on topical antiglaucoma medication. All eyes underwent phacoemulsification through a temporal incision. Preoperative assessment included applanation tonometry, gonioscopy, A-scan biometry, and anterior segment optical coherence tomography. Patients were followed for 12 months. Outcomes included intraocular pressure, anterior chamber angle width, visual acuity, and antiglaucoma medication use. Results: Mean preoperative intraocular pressure was 35.5 ± 4.8 mmHg and decreased to 14.6 ± 3.5 mmHg at 12 months, corresponding to a 58.9% reduction. Mean anterior chamber angle width increased from 16.4 ± 3.0° preoperatively to 29.0 ± 3.9° at 12 months, a 76.2% increase. The proportion classified under improved visual acuity increased from 4 eyes (8.33%) preoperatively to 42 eyes (87.5%) postoperatively. Antiglaucoma medication use decreased from 47 eyes (97.92%) preoperatively to 2 eyes (4.17%) postoperatively (p < 0.001). Conclusion: Phacoemulsification was associated with sustained intraocular pressure reduction, marked widening of the anterior chamber angle, improved visual acuity, and substantially reduced dependence on antiglaucoma medication in eyes with primary angle-closure disease after patent peripheral iridotomy.

170. Accuracy of Scoring Systems for Outcome Prediction of Patients with Peritonitis
Naga Raja Ravi Kishore T., Abida Khatoon, Bakam Sai Prudhvi
Abstract
Introduction: Perforation peritonitis is a life-threatening surgical emergency associated with high morbidity and mortality. Early risk stratification using reliable scoring systems is essential for predicting outcomes and guiding management. The Elebute–Stoner grading of sepsis and the Mannheim Peritonitis Index (MPI) are commonly used tools, but their comparative accuracy remains inadequately studied. The aim of the study was to evaluate and compare the accuracy of Elebute–Stoner grading of sepsis and the Mannheim Peritonitis Index in predicting clinical outcomes in patients with perforation peritonitis. Material and Methods: This hospital-based observational study included 50 patients aged ≥18 years diagnosed with perforation peritonitis over an 18-month period. Clinical, laboratory, radiological, and intraoperative data were recorded. Each patient was assessed using Elebute–Stoner grading and MPI. Outcomes measured included mortality, postoperative complications, and length of hospital stay. Statistical analysis included sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy. Results: The majority of patients were male (76%) and aged 31–40 years (28%). The mortality rate was 22%, with postoperative complications in 36% of patients. The mean Elebute–Stoner score was significantly higher in non-survivors (23.8 ± 3.6) than in survivors (14.2 ± 3.1) (p=0.001). Similarly, MPI scores were higher in non-survivors (32.4 ± 4.8 vs 20.6 ± 4.2; p=0.001). Elebute–Stoner grading demonstrated higher sensitivity (81.8%), specificity (87.2%), and accuracy (86%) than MPI (72.7%, 82.1%, and 80%, respectively). Conclusion: Both Elebute–Stoner grading and MPI are effective predictors of outcomes in perforation peritonitis. However, Elebute–Stoner grading showed marginally superior diagnostic performance and can serve as a simple and reliable bedside tool for early risk stratification.

171. A Randomized, Non-Inferiority Study Evaluating Sugammadex 2 mg/kg Versus 4 mg/kg for the Reversal of Deep Rocuronium-Induced Neuromuscular Blockade at the End of Skin Closure in Prolonged, Bolus-Only Surgical Cases
Sankiti Sangeetha, Aireddy Srikanth Reddy
Abstract
Introduction: Residual neuromuscular blockade following the use of rocuronium remains a clinically significant concern, particularly after prolonged surgical procedures. Sugammadex enables rapid and effective reversal of aminosteroidal neuromuscular blockade; however, the optimal dosing for deep blockade in prolonged, bolus-only surgical cases remains uncertain. The present study aimed to evaluate whether sugammadex 2 mg/kg is non-inferior to 4 mg/kg in reversing deep rocuronium-induced neuromuscular blockade at the end of skin closure in prolonged surgical procedures. Materials and Methods: This prospective, randomized, non-inferiority clinical trial was conducted on 60 patients (ASA I–II) undergoing elective prolonged surgeries under general anaesthesia. Patients were randomly allocated into two groups: Group A received sugammadex 2 mg/kg and Group B received 4 mg/kg for reversal of deep rocuronium-induced neuromuscular blockade (post-tetanic count 1–2) at the time of skin closure. The primary outcome was time to achieve train-of-four (TOF) ratio ≥0.9. Secondary outcomes included time to extubation, time to follow verbal commands, proportion achieving TOF ≥0.9 within 3 minutes, and incidence of residual blockade. Results: Baseline characteristics were comparable between groups. Time to TOF ≥0.9 was significantly longer in Group A compared to Group B (172.9 ± 48.7 vs 144.8 ± 41.5 seconds; p = 0.02). A higher proportion of patients in Group B achieved TOF ≥0.9 within 3 minutes [27 (90.0%) vs 23 (76.7%)], though not statistically significant (p = 0.18). Residual blockade at 5 minutes was observed in 3 (10.0%) patients in Group A and 1 (3.3%) in Group B (p = 0.30). Recovery profiles, including time to extubation and response to commands, were comparable. Hemodynamic parameters and adverse events were similar in both groups. 
Conclusion: Sugammadex 2 mg/kg provides effective and clinically comparable reversal of deep neuromuscular blockade, despite a modest delay in recovery time compared to 4 mg/kg, suggesting its potential as a safe alternative in appropriately monitored patients.

172. Effectiveness of Platelet-Rich Plasma Vs Growth Factor Concentrate in Androgenic Alopecia and Telogen Effluvium
Ashok S. Hogade, Neelima Goyal, Tanika Dahiya
Abstract
Background: Hair loss disorders such as androgenic alopecia (AGA) and telogen effluvium (TE) are increasingly prevalent and significantly affect quality of life. Regenerative therapies like Platelet-Rich Plasma (PRP) and Growth Factor Concentrate (GFC) have emerged as promising treatment modalities. Aim: To evaluate and compare the effectiveness of PRP and GFC in promoting hair growth in patients with AGA and TE. Methods: A prospective pre-test and post-test experimental study was conducted on 130 patients divided into two equal groups: PRP (n=65) and GFC (n=65). Patients were assessed at baseline, 1 month, 3 months, and 6 months. Primary outcomes included hair count and hair thickness. Secondary outcomes included scalp health score, patient satisfaction, and adverse effects. Statistical analysis was performed using SPSS, with p < 0.05 considered significant. Results: Both groups showed improvement; however, GFC demonstrated a significantly greater increase in hair count and thickness at all follow-ups (p < 0.001). Patient satisfaction was higher in the GFC group (p = 0.045). No significant difference was observed in scalp health scores (p = 0.086). Adverse effects were mild and more frequent in the PRP group. Conclusion: GFC is more effective than PRP in improving hair density and thickness, with better patient satisfaction and comparable safety.

173. Morbidity and Mortality Pattern of Snakebite in Children and Anti-Snake Venom Dose Titration: A Descriptive Study from a Tertiary Care Hospital in South India
S. Suganya, Ganesh Shankar K., Nidhiyazhagan B.
Abstract
Background: Snakebite is a neglected tropical disease causing significant morbidity and mortality in South Asian children. Data on anti-snake venom (ASV) dose titration in paediatric populations from South India remain limited. The objective of this study was to describe the epidemiological profile, clinical manifestations, complications, ASV dose requirements, and outcomes of snakebite envenomation in children admitted to a tertiary care PICU. Methods: A descriptive study of 100 consecutive paediatric snakebite admissions receiving ASV was conducted in the PICU of Government Medical College Krishnagiri, Tamil Nadu. Data on demographics, bite characteristics, clinical features, complications, ASV dosing, and outcomes were recorded on a structured proforma and analysed using descriptive statistics. Results: The majority of victims (86%) were in the 6–12 year age group; 56% were male. Rural residence (72%) and lower socioeconomic status (56%) predominated. Viper was the most frequent offending species (56%), followed by non-venomous snakes (30%), cobra (10%), and krait (4%). The lower limb was most commonly bitten (57%), and 87% of bites occurred at night. Cellulitis was the most prevalent complication (60%), followed by haemotoxicity (56%), acute kidney injury (19%), and neurotoxicity (10%). The ASV dose was titrated beyond the WHO-recommended maximum (25 vials) in select cases with progressive cellulitis or renal failure, reaching up to 45 vials. The overall case-fatality rate was 5%. Conclusion: Snakebite disproportionately affects rural school-age children in South India, with viper envenomation predominating. Clinically guided ASV dose titration beyond conventional limits improved outcomes in severe cases. Community awareness and timely medical intervention remain critical to reducing snakebite-associated mortality.

174. Prevalence of Proliferative Diabetic Retinopathy among Patients with Diabetic Retinopathy and Its Management at a Tertiary Care Center: A Cross-Sectional Study
Mittal G. Kuchhadiya, Bhargavi V. Parth, Khushi R. Shah, Chirag D. Odedara, Surbhi S. Shah, Frenshi H. Jetpariya
Abstract
Purpose: To determine the prevalence of proliferative diabetic retinopathy (PDR) among patients with diabetic retinopathy (DR) and to evaluate its management at a tertiary care center. Methods: This hospital-based cross-sectional study included 480 patients diagnosed with diabetic retinopathy. Comprehensive ophthalmic evaluation was performed. DR was classified into NPDR and PDR. Management modalities including pan-retinal photocoagulation (PRP), intravitreal anti-VEGF injections, and vitreoretinal surgery were recorded. Results: Out of 480 patients with DR, 96 patients had PDR, yielding a prevalence of 20%. Among PDR patients, 37 (38.5%) underwent PRP, 38 (39.6%) received anti-VEGF injections, and 21 (21.9%) underwent vitreoretinal surgery (17 for vitreous hemorrhage and 4 for tractional retinal detachment). Conclusion: PDR constitutes a significant proportion of DR cases in tertiary care settings. Early detection and timely management remain crucial in preventing vision loss.

175. Comparative Study of Analgesic Efficacy of Interscalene Brachial Plexus Block Versus Anterior Approach Suprascapular Nerve Block for Arthroscopic Shoulder Surgery
Nirnoy Das, Chiranjib Sarkar, Debjani Gupta, Sambuddha Ray, Ahsan Ahmed, Mayukh Chattopadhyay
Abstract
Background: The interscalene block (ISB) is widely recognized for postoperative analgesia after shoulder surgery but carries risks of serious complications, while the anterior approach suprascapular nerve block (SSNB) offers a safer alternative with fewer adverse effects. This study aimed to compare the analgesic efficacy of ISB versus anterior approach SSNB in patients undergoing arthroscopic shoulder surgery. Methods: A prospective randomized parallel group trial was conducted with 84 patients undergoing shoulder arthroscopy. Patients were equally randomized into two groups: Group A (ISB, n=42) and Group B (SSNB anterior approach, n=42). Primary outcome measures included pain scores assessed using the Visual Analog Scale (VAS) at multiple time points (immediate, 6, 12, and 24 hours postoperatively), patient satisfaction ratings, and incidence of adverse effects. Secondary outcomes included respiratory function monitoring through SpO₂ measurements and vital signs assessment. Results: ISB demonstrated significantly lower VAS scores immediately postoperatively compared to SSNB (2.0 vs 2.5, p<0.001, moderate effect size r=0.36). However, no significant differences were observed at 6, 12, and 24 hours postoperatively. The SSNB group showed consistently higher SpO₂ values at multiple intraoperative and postoperative time points, indicating better preservation of respiratory function. Conclusion: While ISB provides marginally superior immediate postoperative analgesia, SSNB offers a clinically acceptable alternative with significant safety advantages. The preservation of respiratory function, reduced complication rates, and comparable pain control at extended time points support the adoption of SSNB as the preferred technique for postoperative analgesia in arthroscopic shoulder surgery, particularly for patients with respiratory comorbidities.

176. Pattern of Adverse Drug Reactions Due to Cancer Chemotherapy in a Tertiary Care Teaching Hospital in Southern Rajasthan: A Cross-sectional Study
Meena Atray, Vishwa Mehta, Arjun Sanjaykumar Mody
Abstract
Background: Adverse drug reactions (ADRs) are a major concern among patients undergoing chemotherapy because of the narrow therapeutic index and cytotoxic nature of anticancer drugs. To ensure safety and improve treatment outcomes, monitoring of ADRs through pharmacovigilance is essential. This study was therefore planned to examine the pattern of ADRs due to cancer chemotherapy. Method: This was an observational cross-sectional study conducted at a tertiary care hospital in Rajasthan, involving hospitalized cancer patients receiving chemotherapy. The Suspected Adverse Drug Reaction (SADR) reporting form was used, which captured information on demographics, diagnosis, suspected drugs, description of the ADR, and its management and outcome. Results: The study included 93 ADR reports. The majority of patients had one (40.8%) or two (41.9%) adverse drug reactions. Doxorubicin (14%), 5-fluorouracil (10.8%), paclitaxel (8.6%), carboplatin (7.5%), and gemcitabine (7.5%) were the most commonly implicated medications. The most frequently reported indication for oncotherapy was breast cancer (19.6%). The actions taken were dosage reduction and drug withdrawal in the majority of ADRs, which ranged from mild to moderate in severity. Recovery was noted in 74.1% of ADR cases. Conclusion: Chemotherapy-related acute adverse drug reactions are frequent but often manageable with prompt detection and suitable treatment. Strengthening pharmacovigilance systems can improve cancer treatment outcomes while enhancing patient safety.

177. Molecular Detection of Carbapenemase Producing Enterobacteriaceae in Hospital Setting: A Prospective Observational Study
Supriya Rajaram Jagdale, Vandana Gemarbhai Patel, Ishita Nishad Gogdani
Abstract
Background: Carbapenemase-producing Enterobacteriaceae (CPE) represent a critical threat to public health, causing infections with limited therapeutic options. Molecular detection methods offer rapid and accurate identification of resistance mechanisms essential for infection control and antimicrobial stewardship. Methods: A total of 384 clinical isolates of Enterobacteriaceae were collected from various clinical specimens over twelve months. Carbapenem resistance was initially screened using disc diffusion and confirmed by minimum inhibitory concentration (MIC) determination. Molecular detection of carbapenemase genes (blaNDM, blaKPC, blaOXA-48, blaVIM, and blaIMP) was performed using multiplex polymerase chain reaction (PCR). Patient demographics, clinical outcomes, and risk factors were analyzed. Results: Of 384 isolates, 156 (40.6%) demonstrated carbapenem resistance. Molecular analysis revealed carbapenemase genes in 142 (91.0%) of resistant isolates. BlaNDM was the most prevalent gene (62.7%), followed by blaOXA-48 (24.6%), blaKPC (8.5%), blaVIM (3.5%), and blaIMP (0.7%). Klebsiella pneumoniae was the predominant species (58.5%), followed by Escherichia coli (28.2%). The 30-day mortality rate among CPE-infected patients was significantly higher compared to non-CPE infections (28.9% vs. 12.3%, p<0.001). Previous antibiotic exposure (OR=4.52, 95% CI: 2.78-7.34, p<0.001) and ICU admission (OR=3.84, 95% CI: 2.31-6.38, p<0.001) were identified as independent risk factors. Conclusion: The high prevalence of carbapenemase-producing Enterobacteriaceae, particularly NDM producers, necessitates robust molecular surveillance, stringent infection control measures, and antimicrobial stewardship programs in healthcare facilities.

178. Correlation of Vitamin D Levels with Fracture Healing Time in Long Bone Fractures: A Prospective Cohort Study
Milan R. Modi, Nikita J. Nanwani, Prit Rangparia
Abstract
Background: Vitamin D plays a crucial role in bone metabolism and fracture healing. However, the relationship between serum vitamin D levels and fracture healing time in long bone fractures remains inadequately characterized in prospective studies. Objective: To investigate the correlation between baseline serum 25-hydroxyvitamin D [25(OH)D] levels and fracture healing time in adult patients with long bone fractures, and to determine the impact of vitamin D deficiency on healing outcomes. Methods: This prospective cohort study enrolled 124 patients aged 18-65 years with acute long bone fractures. Serum 25(OH)D levels were measured within 48 hours of injury. Patients were categorized into three groups: deficient (<20 ng/mL, n=46), insufficient (20-29.9 ng/mL, n=42), and sufficient (≥30 ng/mL, n=36). All patients underwent standard surgical fixation. Primary outcome was time to radiological union. Secondary outcomes included delayed union, nonunion, and functional recovery. Results: Mean baseline 25(OH)D level was 23.8 ± 9.6 ng/mL. Mean healing time was significantly different across groups: deficient (21.4 ± 4.8 weeks), insufficient (17.6 ± 3.9 weeks), and sufficient (14.8 ± 3.2 weeks, p<0.001). Strong negative correlation was found between vitamin D levels and healing time (r=-0.621, p<0.001). Delayed union occurred in 32.6% of deficient, 16.7% of insufficient, and 5.6% of sufficient groups (p=0.004). Multivariate analysis identified vitamin D deficiency as an independent predictor of prolonged healing (HR=2.84, 95% CI: 1.52-5.31, p=0.001), adjusting for age, fracture type, and fixation method. Conclusion: Vitamin D deficiency is significantly associated with prolonged fracture healing time in long bone fractures. Routine screening and supplementation may be warranted to optimize fracture healing outcomes.

179. Effect of LIV KRIT on Gut–Liver Axis Dysfunction in Patients with Early Non-Alcoholic Fatty Liver Disease: A Real-World Prospective Interventional Study
S. S. Dariya, Gagan Gunjan, Mridul Bera, Krishna Kumar Lohani, Asis Mitra
Abstract
Background: Non-alcoholic fatty liver disease (NAFLD) represents the most prevalent chronic liver disorder globally, with its pathogenesis increasingly linked to gut–liver axis dysregulation. Conventional management remains limited to lifestyle modification, underscoring the need for evidence-based botanical interventions. Brahmanand’s LIV KRIT is an advanced multi-herb Ayurvedic formulation specifically designed for liver care, combining 17 classical plant-derived and mineral constituents including Kalmegh (Andrographis paniculata), Bhringraj (Eclipta alba), Kutki (Picrorhiza kurroa), Bhumi Amla (Phyllanthus niruri), and Loh Bhasma, each with documented hepatoprotective and gut-modulatory properties. Aim: To evaluate the effect of 12-week LIV KRIT supplementation on liver function tests, sonographic fatty liver grade, systemic inflammatory markers, and gut–liver axis parameters in patients with early NAFLD. Methods: A prospective, single-arm, observational-interventional study was conducted at an outpatient setting enrolling 100 patients aged 40–70 years with ultrasound-confirmed NAFLD (Grade I–II) and mild elevation of liver enzymes. Participants received LIV KRIT capsules at the standard labelled dose for 12 weeks alongside standardised lifestyle counselling. Primary outcomes were changes in ALT, AST, GGT, and ALP at 12 weeks. Secondary outcomes included USG grade improvement, serum CRP, Gastrointestinal Symptom Rating Scale (GSRS) scores, waist circumference, lipid profile, and FIB-4 score. Results: Significant reductions were observed in ALT (68.4 to 38.6 U/L; −43.5%, p < 0.001), AST (54.2 to 33.1 U/L; −38.9%, p < 0.001), GGT (52.7 to 31.4 U/L; −40.4%, p < 0.001), and ALP (98.3 to 72.1 U/L; −26.7%, p < 0.001). USG grade improvement was noted in 62% of participants. CRP declined by 52.3% (p < 0.001), GSRS total score by 46.1% (p < 0.001), and waist circumference by 5.4 cm (p = 0.002). 
A strong positive correlation was observed between gut symptom improvement and ALT reduction (Pearson r = 0.71, p < 0.001). No serious adverse events were reported. Conclusion: LIV KRIT demonstrated clinically meaningful and statistically significant improvements in hepatic, metabolic, and gut–liver axis parameters in patients with early NAFLD over 12 weeks. The formulation represents a promising integrative therapeutic option warranting further randomised controlled evaluation.

180. Clinical Profile and Risk Factors of Acute Kidney Injury in Patients with Chronic Liver Disease: A Cross-Sectional Study from Central India
Raj Anand, Ajay Kumar Nandmer, Vijay Kumar Nandmer, Himanshu Sharma, Manjula Gupta
Abstract
Background: Acute kidney injury (AKI) is a serious complication in chronic liver disease and is likely the result of worsening circulatory and inflammatory dysfunction. As most triggers are potentially reversible, identifying the common triggers and clinical settings in which acute kidney injury occurs remains important. This study was conducted to describe the clinical profile of such patients and to examine the major risk factors associated with acute kidney injury in chronic liver disease. Objective: To evaluate the clinical profile and identify the risk factors associated with acute kidney injury in patients with chronic liver disease. Methods: This observational cross-sectional study was carried out in the Department of Medicine at Hamidia Hospital, Bhopal. A total of 150 patients with chronic liver disease and acute kidney injury were included. Demographic details, risk factors, clinical findings, and laboratory parameters were recorded. Statistical analysis was performed using EPI Info 7.0, and a p value of less than 0.05 was considered statistically significant. Results: The mean age of the patients was 49.63 ± 11.74 years, and the largest proportion belonged to the 51–60-year age group. Males constituted 76% of the study population. Alcohol was the most common etiological factor of chronic liver disease (61.33%). Among the identified risk factors, upper gastrointestinal bleeding was the most frequent (40.67%), followed by hypotension (28%) and sepsis (28%); spontaneous bacterial peritonitis was present in 12% of patients. The majority of patients had advanced liver disease, with 56.67% classified as Child-Turcotte-Pugh class C. These findings indicate that hemodynamic instability, infections, and more severe liver dysfunction are strongly associated with AKI in CLD.
Conclusions: In this study, AKI was seen mainly in patients with advanced liver dysfunction and was commonly associated with upper gastrointestinal bleeding, hypotension, sepsis and spontaneous bacterial peritonitis. Early identification and attention to these reversible clinical triggers may help in earlier recognition and management of renal injury in cirrhosis and improve clinical outcomes.

181. Association of Serum Sodium Levels with Severity of Hepatic Encephalopathy in Chronic Liver Disease Patients
Akhil Rawat, Ajay Kumar Nandmer, Vijay Kumar Nandmer, Raj Anand, Manjula Gupta
Abstract
Background: Hyponatremia is a frequent complication in chronic liver disease (CLD) and has been implicated in worsening neurological outcomes, particularly hepatic encephalopathy (HE). Objective: To evaluate the association between serum sodium levels and severity of hepatic encephalopathy in patients with chronic liver disease. Methods: This observational study included 80 patients with CLD. Patients were categorized based on encephalopathy grading (Absent, Grade 1–4), and clinical, biochemical, and prognostic parameters were compared across groups. Statistical analysis was performed using ANOVA and chi-square tests. Results: Among the study population, 50 patients had no encephalopathy, while 30 patients had varying grades of HE. Mean serum sodium levels showed a declining trend with increasing encephalopathy severity (133.32 ± 4.76 mmol/L in no HE vs. 130.50 ± 8.96 mmol/L in Grade 3), although this was not statistically significant (p=0.4793). Disease severity scores demonstrated significant association with encephalopathy, with CTP score increasing from 9.78 to 13.50 (p<0.0001) and MELD score from 16.32 to 26.42 (p=0.0012). INR, duration of hospital stay, and mortality also showed significant worsening with higher grades of encephalopathy. Conclusion: Although serum sodium did not show a statistically significant direct association, worsening hyponatremia trends paralleled increasing encephalopathy severity and disease progression. HE is strongly associated with advanced liver dysfunction and poor outcomes.

182. To Evaluate Incidence of Osteoporosis in Patients taking Corticosteroids for Biopsy Proven Primary Adult Nephrotic Syndrome
Nikita Hapani, Divyansh Agarwal, Gigin S.V., Pankaj Beniwal, Jaydeep Rajdamur
Abstract
Background: Nephrotic syndrome (NS) in adults is commonly managed with corticosteroids, which, while effective in inducing remission, are strongly associated with steroid-induced osteoporosis. The combined effects of NS itself—through urinary loss of vitamin D–binding protein and altered calcium-phosphate metabolism—and glucocorticoid therapy may accelerate bone loss. Despite the high prevalence of vitamin D deficiency in India, data on osteoporosis incidence in Indian adults with biopsy-proven NS remain scarce. Methods: This prospective cohort study enrolled 35 adults with biopsy-proven primary NS receiving corticosteroids and 70 age- and sex-matched non-NS controls between May 2022 and June 2024. Patients with pre-existing osteoporosis or secondary causes of NS were excluded. Bone mineral density (BMD) was assessed using dual-energy X-ray absorptiometry (DEXA) at baseline and after three months. Biochemical markers of bone metabolism were measured. Statistical analyses included chi-square, unpaired t-tests, and Fisher’s Exact Test. Results: At baseline, demographic and biochemical parameters were comparable between groups. After three months, NS patients showed a significant decline in mean T-score (–0.93 to –2.01; Δ –1.08), compared to minimal change in controls (–0.85 to –1.19; Δ –0.34; p < 0.0001). Osteopenia occurred in 57.14% of NS patients versus 24.29% of controls (RR 2.35; p = 0.0012). Osteoporosis was observed in 37.14% of NS patients compared to 8.57% of controls (RR 4.31; p = 0.0008). Conclusion: Adults with NS treated with corticosteroids experience rapid and clinically significant bone loss within three months, with markedly higher incidence of osteopenia and osteoporosis. Early DEXA screening and prophylactic interventions, including calcium, vitamin D supplementation, and bisphosphonates in high-risk patients, are warranted.

183. Expression of E-Cadherin in Breast Carcinoma and Its Association with Tumor Grade and Nodal Status
Vaishali Singh, Nancy Gupta, Ila Rawat
Abstract
Background: E-cadherin is a key cell adhesion molecule involved in maintaining epithelial integrity. Its loss has been implicated in tumor progression, invasion, and metastasis in breast carcinoma. Evaluating its expression may provide important prognostic information. Aim: To assess the expression of E-cadherin in breast carcinoma and to determine its association with tumor grade and lymph node status. Materials and Methods: Formalin-fixed paraffin-embedded tissue sections were subjected to immunohistochemistry for E-cadherin. Tumors were graded using the Nottingham grading system. E-cadherin expression was scored based on membranous staining and categorized as preserved or reduced/lost. The association with tumor grade and nodal status was analyzed using the Chi-square test. Results: Out of 50 cases, 64% showed preserved E-cadherin expression, while 36% showed reduced/lost expression. Reduced expression was significantly associated with higher tumor grade (p = 0.02), with most Grade III tumors showing loss of expression. A significant correlation was also observed between reduced E-cadherin expression and lymph node metastasis (p = 0.03), with higher loss in node-positive cases. Conclusion: Reduced E-cadherin expression is associated with higher tumor grade and nodal metastasis, indicating aggressive tumor behavior.

184. A Randomized Controlled Trial of Intrathecal Dexmedetomidine as an Adjuvant to Hyperbaric Ropivacaine: Effects on Onset, Duration of Analgesia, and Block Characteristics in Spinal Anesthesia
Vivek Badada, Akanksha Patel, Mohammed Aslam, Gunisetty Sreenivasulu
Abstract
Background: Ropivacaine is commonly used for spinal anaesthesia but has a limited duration of analgesia. Dexmedetomidine, an α2-adrenergic agonist, may enhance its efficacy as an intrathecal adjuvant. Aim: To evaluate the effects of intrathecal dexmedetomidine with hyperbaric ropivacaine on onset, duration of analgesia, and block characteristics. Methods: In this prospective randomized double-blind study, 60 patients undergoing lower limb surgeries were allocated into two groups. Group R received 3 mL of 0.75% hyperbaric ropivacaine, while Group RD received ropivacaine with 5 µg dexmedetomidine. Block characteristics, duration of analgesia, hemodynamics, and adverse effects were assessed. Results: Group RD showed significantly faster onset of sensory (4.9 ± 0.9 vs 6.8 ± 1.2 min) and motor block (8.2 ± 1.1 vs 10.5 ± 1.5 min) (p < 0.001). Duration of analgesia was significantly prolonged (368.2 ± 40.5 vs 182.5 ± 20.1 min, p < 0.001). Hemodynamic changes were mild and manageable, with no significant increase in adverse effects. Conclusion: Intrathecal dexmedetomidine significantly hastens onset and prolongs analgesia with stable hemodynamics, making it an effective adjuvant to ropivacaine in spinal anaesthesia.

185. Prevalence of Methicillin-Resistant Staphylococcus aureus Colonisation in Patients Undergoing Total Joint Arthroplasty: A Retrospective Observational Study
Pawan Patidar, Nikita Patidar, Naveen Patidar, Yogendra Kumar Tiwari
Abstract
Background: Methicillin-resistant Staphylococcus aureus (MRSA) is a major contributor to surgical site infections (SSIs) and prosthetic joint infections (PJIs) following total joint arthroplasty (TJA). Colonisation with MRSA, particularly in the anterior nares, serves as an important endogenous source of postoperative infections. Patients undergoing arthroplasty are often elderly and have multiple comorbidities, making them particularly vulnerable to colonisation and subsequent infection. Early identification of MRSA carriers through preoperative screening provides an opportunity for targeted decolonisation strategies, which can significantly reduce postoperative complications, morbidity, prolonged hospital stay, and healthcare costs. Aim: To determine the prevalence of MRSA colonisation among patients undergoing total joint arthroplasty and to evaluate associated demographic and clinical risk factors. Material and Methods: This retrospective observational study was conducted at ICMR-NITVAR, Pune, over a period of 9 months. A total of 160 patients who underwent elective total joint arthroplasty (hip and knee) were included. Data were retrieved from hospital records, including demographic details, comorbidities, history of prior antibiotic use, and microbiological screening reports. Preoperative MRSA screening was performed using nasal and/or skin swabs. Isolation and identification of Staphylococcus aureus were carried out using standard microbiological techniques, and methicillin resistance was confirmed using the cefoxitin disc diffusion method according to CLSI guidelines. Statistical analysis was performed using SPSS software. Descriptive statistics were expressed as mean ± standard deviation and percentages. Inferential statistics were applied using Chi-square test or Fisher’s exact test, with a p-value <0.05 considered statistically significant. Results: Out of 160 patients, 18 were found to be colonised with MRSA, yielding an overall prevalence of 11.25%. 
A higher prevalence of MRSA colonisation was observed among patients aged ≥60 years (72.2%, p=0.041). Among comorbid conditions, diabetes mellitus (61.1%, p=0.032) and obesity (44.4%, p=0.048) showed significant associations with MRSA colonisation. A strong association was also observed with prior antibiotic use (66.7%, p=0.001) and previous hospitalisation (50.0%, p=0.045). Gender and hypertension did not demonstrate statistically significant associations. Conclusion: MRSA colonisation was observed in a considerable proportion of patients undergoing total joint arthroplasty. Advanced age, diabetes mellitus, obesity, prior antibiotic exposure, and previous hospitalisation were identified as significant risk factors. These findings highlight the importance of routine preoperative MRSA screening and targeted decolonisation protocols in high-risk patients to reduce postoperative infections and improve surgical outcomes.

186. Diagnostic Performance and Comparative Analysis of Conventional Drug Susceptibility Testing, Line Probe Assay, and GeneXpert in Extrapulmonary Tuberculosis Cases: A Cross-sectional Observational Study
Pawan Patidar, Yogendra Kumar Tiwari, Nikita Patidar
Abstract
Background: Extrapulmonary tuberculosis (EPTB) remains a significant diagnostic challenge due to its paucibacillary nature, atypical clinical presentation, and difficulty in obtaining adequate samples. Conventional diagnostic methods such as culture and drug susceptibility testing (DST), although considered gold standards, are time-consuming and delay treatment initiation. Rapid molecular diagnostic tools like GeneXpert MTB/RIF and Line Probe Assay (LPA) have emerged as effective alternatives for early detection of Mycobacterium tuberculosis and associated drug resistance. Aim: To evaluate and compare the diagnostic performance of conventional drug susceptibility testing, Line Probe Assay, and GeneXpert MTB/RIF in detecting Mycobacterium tuberculosis and drug resistance in extrapulmonary tuberculosis cases. Material and Methods: This cross-sectional observational study was conducted over a period of 9 months at the ICMR-NITVAR (National Institute for Translational Virology and AIDS Research), Pune, India. A total of 100 patients clinically suspected of extrapulmonary tuberculosis were included. Various extrapulmonary samples, including pleural fluid, cerebrospinal fluid, lymph node aspirates, ascitic fluid, and pus, were collected and processed. All samples were subjected to Ziehl–Neelsen staining, culture on standard media, and conventional DST. Molecular testing was performed using GeneXpert MTB/RIF assay and Line Probe Assay. Diagnostic accuracy parameters including sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) were calculated using culture/DST as the reference standard. Statistical analysis was performed using appropriate tests, with a p-value <0.05 considered significant. Results: Out of 100 suspected EPTB cases, culture positivity was observed in 42% of patients. GeneXpert MTB/RIF demonstrated the highest sensitivity (90.5%) with a positivity rate of 50%, enabling rapid detection of Mycobacterium tuberculosis. 
Line Probe Assay showed high specificity (93.1%) and strong concordance with conventional DST in detecting drug resistance (kappa = 0.87, p < 0.001). Ziehl–Neelsen microscopy exhibited low sensitivity (18%), confirming its limited role in extrapulmonary samples. Drug resistance was detected in 22% of cases by conventional DST, while LPA detected 21% and GeneXpert identified rifampicin resistance in 9% of cases. Molecular methods significantly reduced turnaround time compared to conventional DST. Conclusion: GeneXpert MTB/RIF and Line Probe Assay are rapid, sensitive, and specific diagnostic tools for extrapulmonary tuberculosis. GeneXpert is highly effective for early detection, while LPA provides reliable identification of drug resistance. Although conventional DST remains the gold standard, integrating molecular techniques with conventional methods significantly improves diagnostic accuracy and facilitates timely management of drug-resistant tuberculosis.
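The diagnostic accuracy parameters reported above (sensitivity, specificity, PPV, NPV) all follow from a 2×2 contingency table against the reference standard. A minimal illustrative sketch — the function name and the counts below are reconstructed for illustration from the abstract's summary figures (42 culture-positives, 90.5% sensitivity, 50% positivity rate), not the study's published raw table:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic accuracy measures against a reference test."""
    return {
        "sensitivity": tp / (tp + fn),   # true positives / all reference-positives
        "specificity": tn / (tn + fp),   # true negatives / all reference-negatives
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Illustrative counts consistent with the summary figures (not raw study data):
m = diagnostic_metrics(tp=38, fp=12, fn=4, tn=46)
print({k: round(v, 3) for k, v in m.items()})
```

Note that PPV and NPV, unlike sensitivity and specificity, shift with disease prevalence, which is why they are reported alongside rather than instead of the intrinsic test parameters.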

187. Single-Port Non-Lipolytic Endoscopic Surgery via the Axillary Approach for the Treatment of Benign Breast Tumors: A Prospective Study
Nagendra Mohan Mathur
Abstract
Background: Benign breast tumors often require surgical excision, but conventional open surgery may cause visible scarring and breast contour deformity. Single-port non-lipolytic endoscopic axillary surgery offers a minimally invasive alternative with improved cosmetic outcomes. Methods: In this prospective study, 64 patients with benign breast tumors underwent single-port non-lipolytic endoscopic excision via the axillary route. Operative time, blood loss, postoperative pain, complications, hospital stay, and cosmetic satisfaction were recorded. Regression and ROC analyses were performed to identify predictors of successful outcomes. Results: Mean tumor size was 2.8 ± 0.9 cm, and mean operative time was 68 ± 15 minutes. Minor complications occurred in 6 patients (9.4%), with no major complications or conversions. Postoperative pain (VAS) was 2.3 ± 0.8, and mean hospital stay was 1.4 ± 0.6 days. Cosmetic outcomes were excellent in 62% and good in 31% of patients. Tumor size >3 cm was the strongest predictor of postoperative drain use, pain, and cosmetic dissatisfaction (p <0.01). ROC analysis showed an AUC of 0.82 for tumor size, with a cutoff ≤3 cm predicting successful outcomes with 88% sensitivity and 79% specificity. Conclusion: Single-port non-lipolytic endoscopic axillary surgery is safe, effective, and cosmetically favorable for benign breast tumors. Tumor size ≤3 cm is a reliable predictor of optimal outcomes.
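ROC-derived cutoffs of the kind reported above (≤3 cm with 88% sensitivity and 79% specificity) are conventionally chosen by maximizing the Youden index J = sensitivity + specificity − 1 across candidate thresholds. A stdlib-only sketch of that selection step — the function name, tumor sizes, and outcome flags below are invented for illustration and are not the study's data:

```python
def best_cutoff(values, success):
    """Pick the threshold <= which 'success' is predicted, maximizing
    the Youden index J = sensitivity + specificity - 1."""
    best = None
    for cut in sorted(set(values)):
        tp = sum(v <= cut and s for v, s in zip(values, success))
        fn = sum(v > cut and s for v, s in zip(values, success))
        tn = sum(v > cut and not s for v, s in zip(values, success))
        fp = sum(v <= cut and not s for v, s in zip(values, success))
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1
        if best is None or j > best[0]:
            best = (j, cut, sens, spec)
    return best  # (J, cutoff, sensitivity, specificity)

# Made-up tumor sizes (cm) and success flags, purely illustrative:
sizes   = [1.8, 2.2, 2.5, 2.9, 3.0, 3.4, 3.8, 4.1]
success = [True, True, True, True, True, False, False, True]
j, cut, sens, spec = best_cutoff(sizes, success)
print(cut, round(sens, 2), round(spec, 2))
```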

188. A Comparative Study of Primary PCI Versus Thrombolysis in STEMI Patients in a Tertiary Care Center
Manish Kumar, Priyanka Kumari, Santosh Kumar, Ramesh Thakur, Umeshwar Pandey
Abstract
Background: ST-elevation myocardial infarction (STEMI) requires urgent intervention to restore coronary perfusion and reduce cardiac injury. Both primary PCI and thrombolytic therapy are established treatment modalities, yet differences in their effectiveness in everyday clinical practice are still being explored. Objective: To compare clinical outcomes of primary PCI and thrombolysis in STEMI patients in a tertiary care center. Methods: A retrospective observational study was conducted at LPS Institute of Cardiology, Kanpur, from 2019 to 2021, including 150 STEMI patients. Patients were divided into two groups: primary PCI and thrombolysis. Outcomes assessed included mortality, reinfarction, heart failure, and left ventricular ejection fraction (LVEF). Results: Primary PCI showed significantly lower mortality (6.7% vs 14.7%, p=0.048) and lower incidence of heart failure (10.7% vs 25.3%, p=0.018). Mean LVEF was higher in the PCI group (52.4 ± 6.3%) compared to thrombolysis (46.1 ± 7.2%) (p < 0.001). Conclusion: Primary PCI demonstrated superior clinical outcomes compared to thrombolysis, supporting its role as the preferred reperfusion strategy in STEMI patients.

189. Radial Versus Femoral Access in Percutaneous Coronary Intervention: A Comparative Study
Manish Kumar, Priyanka Kumari, Santosh Kumar, Ramesh Thakur, Umeshwar Pandey
Abstract
Background: Percutaneous coronary intervention (PCI) can be performed through radial or femoral arterial access. Over the past decade, radial access has gained popularity due to reduced complications; however, femoral access remains widely used in complex interventions. Objective: To assess differences in clinical outcomes between femoral and radial vascular access routes in PCI patients treated at a tertiary care center. Methods: The study was carried out at LPS Institute of Cardiology, Kanpur, from December 2019 to December 2022, including 100 patients undergoing PCI. Patients were divided into radial access and femoral access groups. Outcomes assessed included bleeding complications, procedural success, fluoroscopy time, and mortality. Results: Radial access was associated with significantly lower bleeding complications and a shorter duration of hospital stay. Procedural success rates were comparable between the two groups, while mean fluoroscopy time was modestly higher in the radial access group. Conclusion: The radial approach demonstrated a lower incidence of complications along with reduced hospital stay, suggesting its suitability as a preferred vascular access technique for PCI.

190. Percutaneous Coronary Intervention in Chronic Total Occlusions: Success Rates and Predictors
Manish Kumar, Priyanka Kumari, Santosh Kumar, Ramesh Thakur, Umeshwar Pandey
Abstract
Background: Chronic total occlusions (CTOs) are considered among the most complex lesions encountered in interventional cardiology. Despite advances in techniques and devices, success rates vary widely depending on lesion and patient characteristics. Objective: To evaluate the effectiveness of percutaneous coronary intervention (PCI) in chronic total occlusions and to determine independent predictors of procedural success. Methods: This study included 100 patients undergoing CTO-PCI between 2019 and 2022. Clinical, angiographic, and procedural variables were recorded. Statistical analysis was performed to determine predictors of success. Results: The overall procedural success rate was 78%. Factors significantly associated with success included shorter lesion length (<20 mm), absence of severe calcification, and good collateral circulation (p<0.05). Multivariate analysis identified lesion length and calcification as independent predictors. Conclusion: CTO-PCI success is strongly influenced by lesion characteristics. Careful patient selection and advanced techniques can improve outcomes.

191. Comparative Study of Esmolol, Fentanyl, and Dexmedetomidine on Hemodynamic Response to Laryngoscopy and Intubation
Santosh Kumar, Akhilesh Kumar Singh, Rajat Kumar
Abstract
Background: Endotracheal intubation after laryngoscopy is associated with significant sympathetic stimulation leading to tachycardia and hypertension. Various pharmacological agents are used to attenuate this response, including Esmolol, Fentanyl, and Dexmedetomidine. Objective: To compare the effects of Dexmedetomidine, Esmolol, and Fentanyl on cardiovascular changes associated with airway manipulation. Methods: This study was carried out at NMCH Hospital from November 2024 to November 2025. Ninety patients were enrolled and evenly assigned into three groups of thirty each: Group E (Esmolol), Group F (Fentanyl), and Group D (Dexmedetomidine). Statistical analysis was performed using ANOVA and post hoc tests. Results: Dexmedetomidine showed the most stable hemodynamic profile with significantly lower blood pressure fluctuations and heart rate compared to Esmolol and Fentanyl (p<0.001). Esmolol effectively controlled heart rate but was less effective for blood pressure. Fentanyl showed moderate attenuation. Conclusion: Dexmedetomidine demonstrated greater stability of hemodynamic parameters during laryngoscopy followed by tracheal intubation.

192. Analysis of Caesarean Section Rate Using Robson Ten Group Classification System in a Tertiary Teaching Hospital, Visakhapatnam, Andhra Pradesh
Shailaja Pinjala, Madhuri G.Y., Sindhuja J.V., Meghana M.V., Mounika K.
Abstract
Background: Caesarean section is a key indicator of maternal healthcare quality and accessibility. Globally rising CS (Caesarean Section) rates have raised concerns, necessitating standardized evaluation methods. The Robson Ten Group Classification System, recommended by the WHO, provides a structured and evidence-based approach to assess, monitor, and compare CS rates across institutions. This study aims to analyze CS trends using this system and identify major contributing groups to improve maternal and neonatal outcomes. Methods: A retrospective observational study was conducted at Government Victoria Hospital, Visakhapatnam, from March 1, 2024, to May 25, 2024. All deliveries during this period were categorized into Robson’s ten groups based on obstetric characteristics. Data on total deliveries, number of CS, group size, and indications were analyzed to determine CS rates, relative contributions, and areas needing intervention. Results: Out of 1014 total deliveries, 471 were caesarean sections, resulting in an overall CS rate of 45.8%. The highest contribution to the overall CS rate was from Group 5 (18.62%), followed by Group 2 (12.52%), Group 1 (3.94%), Group 4 (3.84%), and Group 10 (3.84%). Group 2 showed a high CS rate (63.5%), particularly among induced labour cases. Group 5 exhibited an exceptionally high CS rate (98.95%), reflecting repeat caesarean practices. Conclusion: The Robson classification system is an effective tool for auditing CS rates and identifying high-risk groups. The study highlights an elevated CS rate compared to WHO recommendations, primarily driven by repeat CS and induced labour cases. Strategies such as reducing primary CS, promoting VBAC (Vaginal Birth after Caesarean), standardizing induction protocols, and regular clinical audits can help optimize CS rates and improve maternal healthcare outcomes.
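In the Robson framework, a group's relative contribution is its caesarean count divided by all deliveries (not by the group's own size), which is why a group can show a near-100% CS rate yet contribute far fewer percentage points to the overall rate. A toy calculation under that definition — the function name and group counts below are illustrative stand-ins, not the study's group-level data:

```python
def robson_summary(group_cs, group_size, total_deliveries):
    """Group CS rate and relative contribution per the Robson TGCS definitions."""
    cs_rate = 100.0 * group_cs / group_size              # % of the group delivered by CS
    contribution = 100.0 * group_cs / total_deliveries   # % points added to overall CS rate
    return cs_rate, contribution

# Illustrative Group-5-like counts (previous CS, term, cephalic, singleton):
rate, contrib = robson_summary(group_cs=190, group_size=192, total_deliveries=1014)
print(round(rate, 1), round(contrib, 2))
```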

193. Current Status and Future Perspectives of Pancreas Transplantation: From Glycemic Control to Immune Tolerance and Regenerative Alternatives
Ahmet Gokhan Saritas, Ugur Topal
Abstract
Pancreas transplantation remains the only established curative therapy for patients with insulin-dependent diabetes mellitus, particularly those with type 1 diabetes and advanced complications. The most commonly performed procedures include simultaneous pancreas–kidney transplantation (SPK), pancreas after kidney transplantation (PAK), and pancreas transplantation alone (PTA). Advances in surgical techniques, immunosuppressive regimens, and recipient selection have significantly improved graft and patient survival rates over the past decades. Despite these improvements, early vascular complications, acute and chronic rejection, and infection remain major challenges. In parallel, emerging alternatives such as islet cell transplantation, stem cell–based therapies, and closed-loop artificial pancreas systems are reshaping the therapeutic landscape. This review summarizes the current status of pancreas transplantation, including indications, outcomes, and complications, and discusses future directions focusing on immune tolerance, bioengineering approaches, and regenerative medicine strategies.

194. Mucosal Melanoma: Epidemiology, Molecular Biology, Clinical Features, and Current Treatment Strategies
Ahmet Gokhan Saritas, Ugur Topal
Abstract
Mucosal melanoma is a rare and aggressive subtype of melanoma arising from melanocytes located in mucosal surfaces, accounting for approximately 1% of all melanomas. Unlike cutaneous melanoma, mucosal melanoma demonstrates distinct epidemiological, molecular, and clinical characteristics, often leading to delayed diagnosis and poor prognosis. The pathogenesis is driven by unique molecular alterations, including frequent mutations in KIT, NRAS, and structural chromosomal aberrations, whereas BRAF mutations are less common. Clinically, mucosal melanomas present with nonspecific symptoms depending on anatomical location, most commonly affecting the head and neck, anorectal, and female genital tracts. Surgical resection remains the cornerstone of treatment when feasible; however, high recurrence rates necessitate multimodal approaches. Advances in immunotherapy and targeted therapy have improved outcomes, although response rates remain lower compared to cutaneous melanoma. This review provides a comprehensive overview of the epidemiology, molecular biology, clinical features, and current treatment strategies of mucosal melanoma, highlighting emerging therapeutic approaches and future directions.

195. Analysis of Blood Donor Deferral Pattern in a Tertiary Care Hospital: A Retrospective Hospital Based Study
Sindhuja K., Sandhya G., Rajendra Prasad V.
Abstract
Background: Blood transfusion is a critical component of modern healthcare, playing a vital role in the management of trauma, surgical procedures, hematological disorders, and obstetric emergencies. The safety and adequacy of the blood supply largely depend on the availability of healthy, voluntary, non-remunerated blood donors. Aim & Objectives: To evaluate and analyze the patterns and causes of blood donor deferrals in a tertiary care hospital and to determine the incidence of blood donor deferrals among all registered donors during the study period. Materials and Methods: This is a retrospective hospital-based study conducted at the Department of Transfusion Medicine, Government General Hospital and Medical College, Kadapa, Andhra Pradesh. Results: A total of 9336 blood donors, including voluntary, relative, and replacement donors, were registered during the study period, of whom 171 (1.83%) were deferred. Conclusion: This emphasizes the need for targeted strategies to reduce the deferral rate. By addressing the prevalent health issues that cause deferrals, we can work toward improving the overall donation rate, thereby ensuring a more robust and reliable blood supply. This approach not only enhances donor retention but also strengthens the effectiveness of transfusion services, ultimately benefiting patient care.

196. Factors Predicting Outcome in Geriatric Patients admitted in Emergency Department in a Tertiary Care Center
Shivesh Anurag, Siddhartha Mishra, Yashas, Pradhasaradhi
Abstract
Objectives: The present study aimed to determine risk factors for mortality in geriatric patients admitted to the Emergency Department and to evaluate the incidence and prevalence of the various disease patterns affecting the geriatric population. Methods: A complete assessment, including sociodemographic status, frailty score, and relevant investigations, was performed for all patients. Data were collected using a pre-designed proforma. Results: A total of 200 geriatric patients were enrolled. Of these, 117 patients (58.5%) were 65–75 years of age, 63 (31.5%) were 76–85 years, 16 (8.0%) were 86–95 years, and 4 (2.0%) were older than 95 years; 140 patients (70.0%) were male and the remaining 60 (30.0%) were female. The association between frailty score and outcome was statistically significant (p < 0.001), as were the associations of outcome with T2DM status (p = 0.006), CKD status (p = 0.010), CVA status (p = 0.014), age (p = 0.028), vasopressor use (p ≤ 0.001), and O₂ support status (p ≤ 0.001). The association between HTN and outcome was not statistically significant (p = 0.085). Conclusion: Poorer outcomes are linked to advanced age and higher frailty scores, underscoring the significance of functional assessment in risk stratification.
Mortality and the requirement for intensive care were strongly influenced by a combination of clinical severity, comorbidities, and social support. Hence, to maximize care and enhance survival rates in this susceptible group, early detection of high-risk elderly patients and prompt interventions in the emergency room are crucial.

197. A Comparative Study of the Efficacy of Atorvastatin Vs Rosuvastatin in Patients with Dyslipidemia Attending Tertiary Care Teaching Hospital
Ajit Kishor, Neha Fatima, Raj Narayan Seth, Keshav Kumar Sinha
Abstract
Background: Dyslipidemia is a significant and modifiable risk factor for cardiovascular disease, characterized by high low-density lipoprotein (LDL), elevated triglyceride (TG) levels, or low high-density lipoprotein (HDL). It is involved in atherosclerosis, coronary heart disease, and stroke. Statins remain the first-line pharmacological treatment, as they block HMG-CoA reductase, decreasing hepatic cholesterol synthesis and upregulating LDL receptor activity. Atorvastatin and rosuvastatin are the most commonly prescribed statins for lipid control. Methods: This prospective, randomised, open-label comparative study was carried out over 1 year (December 2021–December 2022). One hundred patients with dyslipidaemia were randomly divided into two groups (n=50 each). Group A was given atorvastatin 20 mg once daily, while Group B received rosuvastatin 10 mg once daily for a period of 12 weeks. Fasting lipid profiles were measured at baseline and after 12 weeks. Values of p < 0.05 were considered statistically significant. Results: Lipid profiles improved significantly in both groups (p < 0.001). LDL was reduced by 17.78% with atorvastatin and 19.26% with rosuvastatin; HDL increased by 5.54% and 9.13%, and TG fell by 11.50% and 13.96%, respectively. Rosuvastatin had a marginally superior effect on lipid reduction. Conclusion: Both atorvastatin and rosuvastatin are effective and well tolerated in the treatment of dyslipidemia. Rosuvastatin showed a small advantage in LDL lowering and HDL improvement, while atorvastatin remains a cost-effective option for everyday clinical application.

198. Postpartum Hemorrhage: Risk Factors and New Management Protocols
Vineeta Singh, Nehanjali Kumari, Kumari Bibha
Abstract
Background: Postpartum hemorrhage (PPH) remains one of the main contributors to maternal morbidity and mortality worldwide, especially in low-resource settings. Uterine atony is the most common cause of PPH, while maternal anemia and prolonged labor increase both the risk and the severity of hemorrhage. Although obstetric care has improved over the decades, early identification of risk factors and timely implementation of standardized management protocols are essential to achieve better maternal outcomes. Methods: A prospective observational study was conducted over six months, from January 2024 to June 2024, in the Department of Obstetrics and Gynecology of S.K.M.C.H, Bihar. A total of 90 women diagnosed with primary PPH were included. Data regarding demographic characteristics, antenatal history, obstetric risk factors, mode of delivery, management protocols, transfusion, ICU admission, and maternal outcome were collected using a structured proforma. Statistical analyses were performed using descriptive statistics and the chi-square test, with a P-value <0.05 regarded as statistically significant. Results: The most common causes of PPH were uterine atony (53.3%) and trauma (26.7%). Maternal anemia (46.7%) and prolonged labor (28.9%) were important risk factors for severe hemorrhage (p < 0.005). Uterotonics and Active Management of the Third Stage of Labour (AMTSL) were successful in most cases. Early administration of tranexamic acid and balloon tamponade also reduced the need for surgical intervention. Blood transfusion was needed in 57.8% of cases, ICU admission in 11.1%, and the maternal mortality rate was 1.1%. Conclusion: Uterine atony remains the main cause of PPH, and its severity is increased by anemia and prolonged labor.
Further reductions in maternal morbidity and mortality will require improved antenatal screening and adherence to standardised management protocols.

199. Role of Ultrasound and Doppler Studies in High-Risk Pregnancy
Nehanjali Kumari, Vineeta Singh, Kumari Bibha
Abstract
Background: High-risk pregnancies are linked with increased maternal and perinatal morbidity and mortality, and improving outcomes relies on early identification of fetal compromise. Doppler velocimetry is an important ultrasound tool for assessing fetal growth and fetoplacental circulation, allowing timely intervention. Doppler parameters such as the Resistive Index (RI), Pulsatility Index (PI), and Systolic/Diastolic (S/D) ratio help detect placental insufficiency and fetal hypoxia before clinical deterioration. Methods: This prospective observational study was conducted in the Department of Obstetrics and Gynecology at Sri Krishna Medical College and Hospital (SKMCH), Muzaffarpur, Bihar, between January 2024 and June 2024. One hundred pregnant women with singleton high-risk pregnancies of ≥28 weeks' gestation were included. All participants underwent routine obstetric ultrasound and Doppler assessment of the umbilical arteries, middle cerebral arteries, and uterine arteries, and the PI, RI, and S/D ratio were recorded. Patients were followed up to delivery, and maternal and neonatal outcomes were collected. Statistical analysis was performed using descriptive statistics and the Chi-square test, with p < 0.05 considered significant. Results: Abnormal Doppler findings were seen in 38% of cases, the most prevalent abnormality being a raised PI. Abnormal Doppler patterns were significantly associated with increased Neonatal Intensive Care Unit (NICU) admissions, low birth weight, and higher rates of cesarean section (p < 0.05). Cases with absent or reversed end-diastolic flow had poorer perinatal outcomes. Conclusion: Doppler ultrasonography is a useful non-invasive technique for high-risk pregnancy surveillance and a good predictor of adverse perinatal outcomes, allowing obstetric intervention as early as possible.

200. Contraceptive Choices among Adolescents: Awareness and Acceptance Trends
Vineeta Singh, Nehanjali Kumari, Kumari Bibha
Abstract
Background: Adolescent reproductive health remains a public health concern, particularly in regions where socio-cultural barriers inhibit open discussion about contraception and access to quality reproductive health information. Although a high proportion of adolescents are at risk of unintended pregnancy and sexually transmitted infections, awareness and acceptance of contraceptive methods are low. Despite the importance of the issue, there is limited local-level research exploring adolescents' knowledge of and attitudes towards contraceptive options, particularly in Bihar. Methods: A cross-sectional descriptive study was conducted at S.K.M.C.H from July to December 2024 among 60 adolescents aged 15–19 years. A structured questionnaire covering demographic details, knowledge of contraceptive methods, and acceptance was used for data collection. Awareness scores were calculated as the percentage of correct responses, and acceptance was measured as willingness to use specific contraceptive methods. Data were analyzed with SPSS software, using descriptive statistics (frequencies, percentages, and means) and inferential tests (chi-square and correlational analysis). Results: Among participants, 80% recognized condoms, while only 65% recognized oral pills and 40% intrauterine devices. Acceptance rates were much lower: 60% were willing to use condoms, 45% accepted oral pills, and only 30% considered intrauterine devices. Awareness positively correlated with acceptance (r = 0.42). There was a statistically significant gender difference in oral contraceptive pill acceptance (p < 0.05), with greater acceptance among female adolescents, and awareness scores were higher among urban than rural adolescents. Conclusion: The study shows moderate awareness but low acceptance of contraceptive methods among adolescents at S.K.M.C.H, Bihar. Enhancing adolescent-focused reproductive health education and developing youth-friendly services is vital to promoting informed decisions, which in turn lead to increased contraceptive uptake.

201. Prospective Randomized Open Label Comparative Study of Efficacy of Atorvastatin Alone and Atorvastatin with Omega 3 Fatty Acids in Patients with Dyslipidaemia Attending Tertiary Care Hospital
Neha Fatima, Ajit Kishor, Rani Indira Sinha, Keshav Kumar Sinha, Md. Ejaz Alam
Abstract
Background: Dyslipidaemia is an important modifiable cardiovascular risk factor that is highly prevalent in the Indian population. Atorvastatin reduces low-density lipoprotein cholesterol (LDL-C), but a large proportion of patients retain elevated triglycerides and residual cardiovascular risk. Omega-3 fatty acids have been demonstrated to lower triglycerides and may offer added benefit when combined with statins. Methods: This prospective, open-label, randomized comparative study was carried out between February 2020 and August 2021 at a tertiary care hospital (Aryabhatta). One hundred patients with dyslipidaemia were randomly allocated to either Group A (atorvastatin 10–20 mg once daily) or Group B (atorvastatin 10–20 mg plus omega-3 fatty acids 1 g twice daily). Treatment duration was 12 weeks, and lipid profiles were evaluated at baseline, 6 weeks, and 12 weeks. Statistical analysis was conducted with SPSS, and p < 0.05 was considered statistically significant. Results: Both groups showed statistically significant improvement in all lipid parameters over 12 weeks (p < 0.001). The combination therapy group showed a markedly greater reduction in triglycerides and a slightly greater increase in high-density lipoprotein cholesterol (HDL-C) than atorvastatin monotherapy. Both groups had a significant reduction in LDL-C, which was slightly greater in the combination group. Both regimens were well tolerated, with minimal adverse effects. Conclusion: Atorvastatin plus omega-3 fatty acids was more effective than atorvastatin alone, especially in reducing triglycerides, and was similarly safe. Combination therapy can help in the management of patients with mixed dyslipidaemia; however, large multicentric trials are required to determine long-term cardiovascular outcomes.

202. Detection of Mycobacterium Tuberculosis (MTB) In Pulmonary Samples by AFB Smear and CBNAAT at Radha Devi Jageshwari Memorial Medical College and Hospital, Turki Muzaffarpur Bihar
Rajesh Kumar Jaiswal, Rajeev Ranjan, Mahjabin Siddique, Ram Shankar Prasad
Abstract
Background: Tuberculosis (TB) is a major public health problem worldwide. Pulmonary TB is the most commonly encountered form and is easily transmitted. Early and accurate diagnosis is essential for timely treatment initiation and interruption of disease transmission. Conventional acid-fast bacilli (AFB) smear microscopy is widely used but has limited sensitivity, underscoring the need for more reliable diagnostic methods such as the Cartridge-Based Nucleic Acid Amplification Test (CBNAAT). Methods: A prospective observational study was done at Radha Devi Jageshwari Memorial Medical College and Hospital (RDJMMCH), Turki, Muzaffarpur, Bihar, from February 2025 to November 2025. A total of 120 sputum samples from adult patients suspected of pulmonary TB were analyzed using Ziehl-Neelsen staining for AFB smear and CBNAAT. Diagnostic parameters including sensitivity, specificity, Positive Predictive Value (PPV), and Negative Predictive Value (NPV) were calculated. Results: MTB was identified by AFB smear microscopy in 52 samples (43.3%) and by CBNAAT in 78 (65%). With CBNAAT as the reference standard, AFB smear showed a sensitivity of 66.7% and a specificity of 95.2%. Six (5%) CBNAAT-positive cases were found to be resistant to rifampicin. Conclusion: CBNAAT performed better than AFB smear microscopy, with improved sensitivity and the additional benefit of identifying rifampicin resistance within a short period. Its use as a front-line diagnostic tool can significantly improve early diagnosis of TB and hence patient management.

203. Comparative Study of 0.75% Hyperbaric Ropivacaine versus 0.75% Hyperbaric Ropivacaine with Clonidine for Lower Limb Surgeries under Spinal Anaesthesia
Sripriyanka R., Lalitha R., Anbuselvi Anoumandane, B. Ravi
Abstract
Background: Spinal anaesthesia remains a cornerstone technique for lower-limb surgeries, and optimizing the balance between rapid onset, prolonged analgesia, and hemodynamic stability continues to be an important area of clinical research. Objective: To compare the onset, duration, and quality of sensory and motor blockade, along with hemodynamic stability and adverse effects, between 0.75% hyperbaric ropivacaine alone and 0.75% hyperbaric ropivacaine with clonidine in patients undergoing lower-limb surgeries under spinal anaesthesia. Methods: This single-centre, prospective, randomized controlled study was conducted in the Department of Anaesthesia, ACS Medical College and Hospital, Chennai, from April 2023 to April 2025, after obtaining IHEC approval (Ref: 816/2023/IEC/ACSMCH). Results: Both groups (n=60 each) were demographically and hemodynamically comparable at baseline, with similar age (38.7 ± 10.1 vs 39.3 ± 7.6 years), BMI (25.4 ± 4.4 vs 26.3 ± 5.1 kg/m²), and vitals (p > 0.05). After spinal anaesthesia, both groups showed transient reductions in heart rate and systolic blood pressure that returned to near-baseline by surgery end. Heart rate declined to 71.4 bpm in Group A and 75.5 bpm in Group B, while SBP reached nadirs of 109.0 mmHg and 103.5 mmHg, respectively. Group B exhibited significantly faster onset of sensory (4.1 min) and motor block (5.1 min), whereas the clonidine group showed markedly longer sensory (263.8 min) and motor block (248.9 min) durations (both p < 0.001). Adverse events were more frequent in the clonidine group (30.0% vs 8.3%, p = 0.007), mainly nausea, pruritus, and bradycardia, indicating a trade-off between prolonged analgesia and tolerability. 
Conclusion: Adding clonidine to 0.75% hyperbaric ropivacaine under spinal anaesthesia significantly prolonged sensory and motor blockade and delayed analgesic requirement but increased adverse events, whereas ropivacaine alone achieved a faster onset with fewer side effects, supporting regimen selection based on the desired balance between duration and tolerability.

204. Relationship between Cognitive Impairment and Caregiver Burden in Patients with Schizophrenia in Remission: A Hospital-Based Cross-Sectional Study
Pallavi T., Shabeeba Z. Kailash, Kailash Suresh Kumar, Aravindh M.
Abstract
Background: Schizophrenia is frequently associated with persistent cognitive impairment, and these deficits may continue even during remission, potentially increasing the burden experienced by primary caregivers. Objectives: To assess cognitive function in patients with schizophrenia, to assess caregiver burden among primary caregivers of patients with schizophrenia, and to study the relationship between cognitive function and caregiver burden in patients with schizophrenia. Methods: This hospital-based cross-sectional study was conducted among 84 patients with schizophrenia in remission and their 84 primary caregivers in the Department of Psychiatry of a tertiary care hospital. Patients were assessed using the PANSS remission criteria, Wisconsin Card Sorting Test, and Stroop Colour and Word Test, while caregiver burden and psychological distress were evaluated using the Burden Assessment Scale and Depression Anxiety Stress Scale. Results: Patients had a mean age of 38.0±6.4 years, 61.9% were male, and the mean age at onset was 26.71±4.87 years. Cognitive impairment was identified in 59.5% on Stroop testing, although 53.6% had no executive dysfunction on WCST total categories completed. Mean duration of untreated psychosis was significantly higher in cognitively impaired patients than in those without impairment (8.7±3.44 vs 1.1±0.85 months; p<0.001). Caregivers had a mean age of 46.33±9.8 years, were predominantly female (61.9%), and most showed minimal burden (72.6%). Mean BAS score was 36.70±9.1 and correlated significantly with depression (r=0.746), anxiety (r=0.720), and stress (r=0.705), all p<0.01. Caregiver burden was significantly higher when patients had cognitive impairment (p=0.001). Conclusion: Cognitive impairment was common in remitted schizophrenia and was significantly associated with greater caregiver burden, highlighting the need for early treatment, routine cognitive assessment, and family-based support interventions.

205. Clinical and Biochemical Correlation of TSH Levels with Symptoms in Patients with Thyroid Disorders: A Cross-Sectional Study
Kanak Choudhury, Manodip Mandal, Bhargab Paul, Thummala Vamshikrishna
Abstract
Background: Thyroid disorders are highly prevalent in India and often present with non-specific symptoms. While serum TSH is the most sensitive marker of thyroid function, its correlation with clinical symptoms remains variable. Methods: A cross-sectional observational study was conducted among 100 patients presenting with symptoms suggestive of thyroid dysfunction. Patients were categorized into three groups based on TSH levels: normal (≤4.0 mIU/L), mildly elevated (4.1–10.0 mIU/L), and severely elevated (>10.0 mIU/L). Clinical features were recorded and correlated with TSH levels using chi-square test. Results: Fatigue was present in all patients and was non-specific. Constipation was most common in the mildly elevated TSH group. Weight gain, neck swelling, and menstrual irregularities were significantly associated with severely elevated TSH levels. Conclusion: Certain clinical features correlate strongly with the severity of TSH elevation. Symptom-based assessment combined with biochemical testing can improve early diagnosis of thyroid dysfunction.
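The study's TSH stratification is a simple threshold rule; a sketch of that categorization using the stated cut-offs (≤4.0, 4.1–10.0, >10.0 mIU/L), with the function name being our own illustrative choice:

```python
def tsh_category(tsh_miu_l: float) -> str:
    """Classify serum TSH (mIU/L) using the study's three cut-off groups."""
    if tsh_miu_l <= 4.0:
        return "normal"
    if tsh_miu_l <= 10.0:
        return "mildly elevated"
    return "severely elevated"

print(tsh_category(6.2))   # mildly elevated
```

Values falling on the boundaries (4.0 and 10.0) are assigned to the lower group, matching the inclusive ranges in the abstract.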

206. Comparative Evaluation of Efficacy and Safety of Topical Tacrolimus versus Topical Corticosteroid in the Management of Atopic Dermatitis: A Randomized Controlled Study
Satish Chandel, Vineet Kumar Sahu, Aayushi Mehra
Abstract
Background: Atopic dermatitis (AD) is a chronic inflammatory skin disorder requiring long-term topical therapy. While topical corticosteroids (TCS) remain the mainstay of treatment, their prolonged use is associated with adverse effects. Topical tacrolimus, a calcineurin inhibitor, offers a steroid-sparing alternative with a different safety profile. Aim: To compare the efficacy and safety of topical tacrolimus versus topical corticosteroids in the management of atopic dermatitis. Materials and Methods: A prospective, randomized, open-label, parallel-group study was conducted over one year in a tertiary care hospital. A total of 100 patients with mild to moderate AD were randomized into two groups: tacrolimus (Group A) and corticosteroids (Group B), with 50 patients each. Treatments were applied twice daily for 6 weeks. Disease severity was assessed using SCORAD score, along with pruritus score, adverse effects, and recurrence rates. Statistical analysis was performed using SPSS version 26. Results: Both groups showed significant improvement in SCORAD and pruritus scores. However, corticosteroids demonstrated a significantly faster and greater reduction in disease severity at all follow-up intervals (p<0.05). Burning sensation was more common with tacrolimus (20%), whereas skin atrophy was observed only in the corticosteroid group (12%) (p<0.05). Recurrence rates were lower in the tacrolimus group (12%) compared to corticosteroids (28%). Conclusion: Topical corticosteroids provide faster symptomatic relief in atopic dermatitis, making them suitable for acute flare management. However, tacrolimus offers comparable long-term efficacy with a superior safety profile and lower recurrence rates, making it a preferable option for maintenance therapy and use in sensitive skin areas.

207. Seasonality and Socio-Demographic Changes associated with Mumps in a Tertiary Care Centre in North East India
Bibhuti Das, Abhilasha Goswami, Iadarity A. Nongkynrih, Shibangi Sahu, G. K. Nayak
Abstract
Background: Mumps is an acute, highly contagious disease of the salivary glands caused by the mumps virus, a paramyxovirus. It is characterized by pain and swelling in the parotid region associated with fever. It is usually mild but may cause complications in some cases. The incidence of mumps declined dramatically after the worldwide implementation of immunization against mumps; however, it has been on the rise in recent years. The morbidity of mumps virus infection also shows a well-recognized seasonal variation across regions, and understanding the relationship between climatic factors and the occurrence of mumps could help improve both disease forecasting and preventive efforts. Characterizing the sociodemographic profile of affected patients can likewise direct prevention and control toward key groups and regions. Aims and Objectives: (1) To assess the relationship between the occurrence of mumps and climatic factors. (2) To determine the sociodemographic profile of the susceptible population. Methods: Every case presenting to the OPD of the Otorhinolaryngology department, Nalbari Medical College and Hospital, with signs and symptoms of mumps was evaluated for epidemiological exposure and clinical particulars (signs, symptoms, date of onset of symptoms); socio-demographic details such as gender, age, occupation and residential address were also collected. Results: There were a total of 130 patients. The maximum prevalence was seen in the 11–20 years age group, with equal distribution between male and female patients. None of the patients were immunised, and all were from a low socioeconomic background and a rural area. In our study, the maximum number of cases was seen in late winter and spring. Conclusion: The lack of education and awareness hinders parents from recognizing and assessing the risk of mumps exposure in a timely manner, thereby increasing the likelihood of infection and complications. 
Warmer weather might be associated with behavioral patterns such as increased contact in children, which could in turn promote mumps infection. In order to achieve effective mumps prevention and control, a multi-pronged strategy is recommended.

208. Study of Invasive Coronary Angiography in Patients with Suspected Coronary Artery Disease by Treadmill Test in Ethnic and Non-Ethnic People of Tripura: A Cross Sectional Study
Shrirao Mayur Vilasrao, Achintya Pal, Debadrita Das, Anindya Sundar Trivedi, Rajesh Kishore Debbarma
Abstract
Background: Coronary artery disease (CAD) remains one of the leading causes of death worldwide and is becoming increasingly common in developing countries such as India. Coronary angiography (CAG) is the diagnostic gold standard, but non-invasive tests such as the treadmill test (TMT) are often used for screening. However, the variability in the diagnostic accuracy and predictive value of TMT necessitates region-specific evaluation. Objectives: To ascertain the prevalence and distribution of coronary artery abnormalities in patients with a positive TMT and to assess the correlation between demographic and clinical risk factors and CAD severity. Methods: A cross-sectional observational study was performed involving 125 patients with positive TMT undergoing CAG. Clinical, demographic, and risk factor data were gathered and analyzed using SPSS v27. The chi-square test and the t-test were applied, and a p-value of less than 0.05 was deemed statistically significant. Results: Of the 125 patients, 85 (68%) exhibited abnormal coronary angiographic results. Single-vessel disease (SVD) was the most common pattern (40.8%), followed by double-vessel disease (17.6%) and triple-vessel disease (9.6%). The left anterior descending (LAD) artery was the most frequently involved vessel (37.6%). CAD severity was significantly associated with diabetes (p=0.022), dyslipidemia (p=0.034), smoking (p=0.043), and lack of physical activity (p=0.046), while no significant association was found with age, sex, ethnicity, or hypertension. Conclusion: TMT is a useful screening tool, but its predictive value is limited; risk factor profiling greatly improves its clinical utility. Coronary angiography remains necessary for a definitive diagnosis.

209. Drug Utilization Patterns and Prescribing Practices in Dermatology Outpatients at a Tertiary Care Teaching Hospital in South India: A Cross-Sectional Study Using WHO Indicators
Manjushree A., Neelamma
Abstract
Introduction: The skin, the largest organ of the human body, is frequently affected by a wide spectrum of disorders involving the superficial layers. Skin diseases contribute significantly to global morbidity and are among the leading causes of non-fatal disease burden. Drug utilization studies provide valuable insights into prescribing patterns and help assess the rational use of medicines. The present study was undertaken to evaluate drug utilization patterns in common skin diseases and to analyze prescribing practices using World Health Organization prescribing indicators. Materials and Methods: A prospective, observational, cross-sectional descriptive study was conducted among outpatients attending the Department of Dermatology at Raichur Institute of Medical Sciences, a tertiary care hospital. A total of 688 prescriptions were analyzed over a one-year period. Data were collected using a structured proforma based on WHO guidelines, and parameters such as demographic characteristics, disease distribution, and prescribing indicators were evaluated using descriptive statistics. Results: Tinea was the most common dermatological condition (33.2%). The majority of patients were males (65.7%), with the highest proportion in the 21–30 years age group (25.2%). A total of 1720 drugs were prescribed, with an average of 2.5 drugs per prescription. Generic prescribing was high (98.6%), and 70.5% of drugs were prescribed from the essential medicines list. Oral route was most commonly used (57.5%), followed by topical (41.8%). Antifungals (31.6%), antihistaminics (27.1%), and antibiotics (13.4%) were the most frequently prescribed drug classes. Conclusion: The study demonstrates largely rational prescribing practices with low polypharmacy, high generic prescribing, and limited use of injections. However, there is scope for improving adherence to essential medicines lists. 
Regular prescription audits and drug utilization studies can enhance rational drug use, optimize therapeutic outcomes, and support policymakers in improving healthcare delivery.

210. Comparative Evaluation of Nebulized Salbutamol versus Metered Dose Inhaler with Spacer in Acute Pediatric Asthma Exacerbations
Arpita Manish Patel, Hinabahen R. Chaudhary, Kamleshkumar Chamanlal Bhatiya
Abstract
Background: Acute asthma exacerbations represent a significant cause of pediatric emergency department visits worldwide. While nebulized salbutamol remains the traditional delivery method, metered dose inhalers (MDI) with spacer devices offer potential advantages including cost-effectiveness and portability. This study compared the clinical efficacy and safety of nebulized salbutamol versus MDI with spacer in managing acute pediatric asthma exacerbations. Methods: A randomized controlled trial was conducted in the pediatric emergency department involving 186 children aged 2-12 years presenting with acute asthma exacerbations. Participants were randomly assigned to receive either nebulized salbutamol (2.5-5 mg, n=94) or salbutamol via MDI with spacer (400-800 mcg, n=92). Primary outcomes included change in Pediatric Asthma Severity Score (PASS) at 60 minutes, oxygen saturation improvement, and hospitalization rates. Secondary outcomes comprised treatment duration, adverse effects, and parent satisfaction. Results: Both groups demonstrated significant clinical improvement with no statistically significant difference in PASS reduction at 60 minutes (nebulizer: 3.8 ± 1.2 vs. MDI/spacer: 3.6 ± 1.1, p=0.241). Oxygen saturation increased comparably in both groups (nebulizer: 94.3 ± 2.1% to 97.8 ± 1.4%; MDI/spacer: 94.1 ± 2.3% to 97.6 ± 1.5%, p=0.389). Hospitalization rates were similar (nebulizer: 18.1% vs. MDI/spacer: 15.2%, p=0.584). Treatment time was significantly shorter with MDI/spacer (18.4 ± 4.2 vs. 31.7 ± 6.8 minutes, p<0.001). Adverse effects, predominantly tremor and tachycardia, occurred at similar frequencies (p=0.712). Parent satisfaction scores favored MDI/spacer devices (8.4 ± 1.3 vs. 7.6 ± 1.5, p=0.001). 
Conclusion: MDI with spacer demonstrates equivalent clinical efficacy to nebulized salbutamol in acute pediatric asthma exacerbations while offering advantages of reduced treatment time and higher parent satisfaction, supporting its implementation as first-line therapy in emergency settings.

211. Changes in Proteinuria and the Risk of Myocardial Infarction in People with Diabetes or Prediabetes: A Prospective Cohort Study from North Bihar
Amit Kumar
Abstract
Background: Proteinuria is a clinically accessible marker of renal microvascular injury and systemic endothelial dysfunction in dysglycaemic populations. Whether short-interval changes in proteinuria identify people with diabetes or prediabetes who are at increased risk of myocardial infarction (MI) remains clinically important in resource-limited settings. Aim: To evaluate the association between proteinuria trajectory and incident MI among adults with diabetes or prediabetes attending a tertiary care centre in North Bihar. Methods: This prospective cohort study included 75 consecutive adults with diabetes or prediabetes at Lord Budha Koshi Medical College, Baijnathpur, Saharsa, Bihar, India, from 5 April 2025 to 31 March 2026. Baseline and follow-up urinary protein-to-creatinine ratio (UPCR) were used to classify participants into no proteinuria, remittent proteinuria, incident proteinuria and persistent proteinuria groups. Incident MI was defined by compatible symptoms or electrocardiographic changes with elevated cardiac biomarkers. Results: The cohort included 49 patients with diabetes and 26 with prediabetes. Proteinuria trajectories were no proteinuria in 34 (45.3%), remittent proteinuria in 13 (17.3%), incident proteinuria in 15 (20.0%) and persistent proteinuria in 13 (17.3%) patients. During follow-up, 10 MI events occurred. MI incidence increased across proteinuria trajectories: 5.9% in no proteinuria, 7.7% in remittent proteinuria, 20.0% in incident proteinuria and 30.8% in persistent proteinuria. After adjustment for age, sex, diabetes status, hypertension, smoking, LDL-C, HbA1c, eGFR and ACEi/ARB use, persistent proteinuria remained independently associated with MI (adjusted HR 4.91, 95% CI 1.01–23.84; P=0.048). Conclusion: Persistent proteinuria and incident proteinuria identified a subgroup of dysglycaemic patients with substantially higher MI risk. Serial proteinuria assessment may support cardiovascular risk stratification in routine diabetes care.
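The group-wise MI incidences above can be cross-checked with simple arithmetic. The per-group event counts below (2/34, 1/13, 3/15 and 4/13) are inferred from the reported percentages and group sizes rather than stated explicitly in the abstract:

```python
# (events, n) per proteinuria trajectory; counts inferred from the
# reported incidences, not listed explicitly in the abstract.
groups = {
    "no proteinuria":         (2, 34),
    "remittent proteinuria":  (1, 13),
    "incident proteinuria":   (3, 15),
    "persistent proteinuria": (4, 13),
}

for name, (events, n) in groups.items():
    print(f"{name}: {100 * events / n:.1f}%")   # 5.9, 7.7, 20.0, 30.8

# Totals agree with the abstract: 75 patients, 10 MI events.
assert sum(n for _, n in groups.values()) == 75
assert sum(e for e, _ in groups.values()) == 10
```

The inferred counts reproduce the reported gradient exactly, which is a useful sanity check when only proportions are published.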

212. Impact of Dietary Cucurbits on Expression of the Probiotic mapA Gene and Its Implication in Adhesion to Human GI Cells (INT407)
Mohammad Saleem, Md. Asad Khan, Tanveer Ahmad, Kashif Ali, Anjum Ara, Irfan Ahmad
Abstract
Probiotic Lactobacilli co-exist as adherent entities on the epithelial surface of the human gut, where they contribute to barrier function. These dominant inhabitants must be supported by appropriate dietary intervention to maintain enteric homeostasis. In the present study, extracts from three cucurbit fruits, Lagenaria siceraria, Luffa cylindrica and Cucurbita maxima, were prepared and investigated for their effects on three strains of probiotic Lactobacilli (L. rhamnosus, L. plantarum and L. acidophilus). The extracts significantly enhanced the adhesion of the Lactobacilli to human intestinal INT407 cells. The mucus adhesion promoting gene (mapA) of Lactobacilli was targeted for transcriptional analysis and was found to be up-regulated in the presence of both mucin and the extracts. These findings suggest that cucurbit extracts mixed with probiotics in suitable proportions might be used as prophylactic and therapeutic agents.

213. Efficacy of Triple-Dose Platelet-Rich Plasma Injection in Mucoid Degeneration of the Anterior Cruciate Ligament: A Prospective Interventional Study
Kavaljitsinh Parmar, Jaydeep Maniya, Pratik Dhondge, Nirav Solanki
Abstract
Background: Mucoid degeneration of the anterior cruciate ligament (ACL) is a recognized cause of chronic knee pain and restricted motion, often without instability. Conventional management ranges from conservative therapy to arthroscopic debridement, though neither directly addresses the underlying degenerative process. Platelet-rich plasma (PRP) has emerged as a biological treatment modality with regenerative potential. Aim: To evaluate the clinical efficacy of three intra-articular PRP injections administered at 15-day intervals in patients with mucoid degeneration of the ACL. Methods: A prospective interventional study was conducted on 40 patients with MRI-confirmed mucoid degeneration of the ACL who had persistent symptoms despite conservative treatment. Patients received three intra-articular PRP injections at 15-day intervals. Outcomes were assessed using Visual Analog Scale (VAS), International Knee Documentation Committee (IKDC) score, and range of motion (ROM) at baseline, 1 month, 3 months, and 6 months. Results: Mean VAS score improved from 7.6 ± 0.9 at baseline to 2.3 ± 0.8 at 6 months (p < 0.001). Mean IKDC score improved from 42.5 ± 6.2 to 82.1 ± 5.4 (p < 0.001). Progressive improvement was observed across follow-ups, with notable clinical relief after the second injection. Range of motion improved, particularly terminal flexion. No major complications were observed. Conclusion: Triple-dose PRP therapy administered at 15-day intervals is a safe and effective treatment option for mucoid degeneration of the ACL, offering significant pain relief and functional improvement while potentially avoiding surgical intervention.

214. A Case Report of Breast Carcinoma – A Malignant Stromal Tumour with Osteoclastic Giant Cells
Sadiyahparveen V. Saiyed, Alinawaz Saiyed
Abstract
Background: Fewer than 2% of breast cancer patients have breast carcinoma with osteoclastic giant cells (OGCs). Rosen first reported breast cancer with large cells resembling osteoclasts in 1979. This distinctive stromal characteristic occurs in invasive ductal, lobular, squamous and papillary breast carcinomas. The expression of immunohistochemical markers such as human epidermal growth factor receptor 2 (HER-2), progesterone receptor (PR) and estrogen receptor (ER) is closely associated with the treatment strategy for malignancies with OGCs. Here we present a case of breast carcinoma, a malignant stromal tumour with osteoclastic giant cells. Case Summary: A 55-year-old female came to the Department of General Surgery with a chief complaint of a right-sided breast lump and on-and-off pain for 1 month. The pain was not associated with fever, redness or discharge. On local examination, a right-sided breast lump was palpable, the skin was normal, scarring was absent and an axillary lymph node was palpable. Conclusion: Invasive breast cancer with OGCs remains uncommon. Our case report illustrates how aggressive breast cancer with OGCs typically presents. The prognosis is thought to be related to the type of cancer, regardless of the presence of OGCs. The OGCs may represent a reactive infiltrate with an origin different from that of the cancer. A favorable prognosis has been reported for patients whose benign giant cells express CD68, and the presence of OGCs may suggest a less aggressive tumor with a favorable prognosis.

215. Evaluation of Tear Film Dysfunction in Computer Vision Syndrome: A Hospital-Based Cross-Sectional Study
Rajesh Kumar Shakya, Kanhaiya Prasad, Pankaj Kumar Maurya
Abstract
Background: Computer Vision Syndrome (CVS) is increasingly prevalent due to prolonged digital screen use and is commonly associated with ocular discomfort and tear film abnormalities. Tear film dysfunction plays a key role in the development of dry eye symptoms among affected individuals. Aim: To evaluate tear film dysfunction in patients with Computer Vision Syndrome in a hospital-based cross-sectional study. Materials and Methods: A total of 150 participants with CVS were included using a convenience sampling technique. Detailed history and symptom assessment were recorded. Tear film evaluation was performed using Tear Break-Up Time (TBUT), Schirmer’s test, and ocular surface staining. Data were analyzed using appropriate statistical tests, and a p-value <0.05 was considered significant. Results: Of the 150 participants, the majority were aged 18–45 years. Eye strain (74.7%) and dryness (65.3%) were the most common symptoms. Abnormal TBUT was observed in 62.7% and reduced Schirmer’s values in 58.7% of participants. A significant association was found between prolonged screen time and tear film instability (p = 0.004). Conclusion: Tear film dysfunction is common among patients with CVS; early diagnosis and preventive strategies are essential to reduce ocular morbidity and improve quality of life among digital device users.

216. Primary Health Care in India: A Narrative Review of Principles and Implementation
Patel Manavkumar Bharatlal, Patel Gauravkumar Gandabhai, Kalola Shreya Vallabhbhai
Abstract
Background: Primary Health Care (PHC) is the foundation of an effective, equitable, and accessible healthcare system and plays a crucial role in achieving universal health coverage. In India, PHC has evolved through various policy reforms and programmatic interventions; however, challenges in implementation continue to affect its overall effectiveness. Objectives: This narrative review aims to examine the core principles of Primary Health Care and evaluate its implementation in India, with a focus on infrastructure, workforce, service delivery, and recent health system reforms. Methodology: A narrative review approach was adopted by searching electronic databases such as PubMed, Google Scholar, Scopus, and Web of Science, along with national and international policy reports. Relevant studies published between 2000 and 2025 were screened and selected based on relevance. A total of 428 records were identified, of which 45 studies were included after screening and eligibility assessment. The selected literature was analyzed thematically. Results: The review highlights that India has developed a wide PHC network, including Sub-centres, Primary Health Centres, and Community Health Centres, delivering essential health services. National programs and initiatives such as Ayushman Bharat and Health and Wellness Centres have strengthened service delivery and expanded the scope of care. However, persistent challenges such as infrastructure gaps, shortage of healthcare personnel, fragmented service delivery, and urban–rural disparities continue to limit the effectiveness of PHC. Workforce vacancies, limited diagnostic facilities, and weak referral systems remain major barriers. Conclusion: Strengthening PHC in India requires a comprehensive approach focusing on improving infrastructure, addressing workforce shortages, enhancing service integration, and promoting community participation. 
Adoption of digital health technologies and sustained policy implementation are essential to build a resilient and equitable healthcare system capable of meeting current and future health needs.

217. HIV: Causes of Infection, Spread and Treatment Among Human Beings
Soma Halder, Hussain Ahmad, Sabina Yeasmin
Abstract
HIV, the human immunodeficiency virus, is now a common infection among adults worldwide. The virus belongs to the genus Lentivirus, family Retroviridae and subfamily Orthoretrovirinae. HIV is classified into two types, HIV-1 and HIV-2. Genomic analysis suggests that HIV was introduced into humans in the 19th century. It spreads through sexual transmission, from male to female and female to male, and also through injections and blood transfusions. Treatment is best delivered in a proper clinical setting. DNA PCR and RT-PCR are among the methods used for the diagnosis of HIV. Notably, prevention is more effective than treating an established infection. This review discusses the causes of infection, its spread, and how the disease can be treated.

218. Comprehensive Study of Deep Neck Space Infections: Etiology, Microbiological Profile & Clinical Outcomes
Kishan Soni, Ishita Bhatt
Abstract
Background: Deep neck space infections (DNSIs) are potentially life-threatening conditions requiring prompt diagnosis and management. The aetiology and clinical profile have evolved, with odontogenic infections emerging as a leading cause. Aim: To evaluate the aetiology, microbiological profile, clinical presentation, management, and outcomes of deep neck space infections. Materials and Methods: This prospective observational study included 100 patients diagnosed with DNSIs at a tertiary care centre over one year. Clinical evaluation, laboratory investigations, and imaging (USG/CECT) were performed. Pus samples were subjected to microbiological analysis. Patients were managed with antibiotics and/or surgical intervention. Data were analysed using SPSS version 26. Results: The majority of patients were aged 21–30 years (22%) with male predominance (62%). Odontogenic infection was the most common aetiology (40%). Neck swelling (90%) and pain (85%) were the most frequent symptoms. The submandibular space was most commonly involved (35%). Staphylococcus aureus (28%) and Streptococcus species (25%) were the predominant organisms. Surgical drainage was required in 65% of cases, while 30% were managed conservatively. Most patients had a hospital stay of 6–10 days (50%). Complications occurred in 20% of cases, with airway obstruction being the most common. The recovery rate was 96%, with a mortality of 4%. Conclusion: DNSIs are predominantly odontogenic and polymicrobial. Early diagnosis, appropriate antibiotics, and timely surgical intervention result in favourable outcomes, though careful monitoring is essential to prevent complications.

219. Effect of Volume and Concentration of Ropivacaine for Sciatic Nerve Block on Perioperative Analgesia in Patients Undergoing Below-Knee Orthopaedic Surgeries Under Spinal Anaesthesia: A Prospective Randomized Comparative Study
Sruthi K. Kannan, Sudheesh Kannan, Varun M. N.
Abstract
Background: The optimal balance between volume and concentration of local anaesthetic at a fixed total dose for peripheral nerve blockade remains poorly defined. While ropivacaine is widely used for sciatic nerve block in lower-limb surgery, the independent contributions of volume versus concentration to block characteristics have not been adequately studied. Objective: To compare the effect of 10 ml of 0.5% ropivacaine versus 20 ml of 0.25% ropivacaine (equal total dose of 50 mg) for single-shot preoperative ultrasound-guided sciatic nerve block on duration of postoperative analgesia in below-knee orthopaedic surgeries. Methods: In this prospective, randomized, double-blind comparative study, 60 ASA I–III patients aged 18–60 years undergoing below-knee orthopaedic procedures were randomized equally to Group A (10 ml 0.5% ropivacaine) or Group B (20 ml 0.25% ropivacaine). A saphenous nerve block with 5 ml 0.5% bupivacaine was added, followed by spinal anaesthesia with 2.5 ml 0.5% hyperbaric bupivacaine. VAS pain scores, duration of analgesia, and 24-hour rescue analgesic consumption were recorded. Results: Duration of analgesia was significantly longer in Group B (598.67 ± 78.1 minutes) than Group A (256.21 ± 77.85 minutes; p < 0.0001). Group B also required less paracetamol (2000 ± 525.22 mg vs 2733.3 ± 449.77 mg; p < 0.0001) and tramadol (61.11 ± 21.39 mg vs 101.92 ± 22.27 mg; p < 0.0001) over 24 hours. Preoperative VAS at 20 minutes was significantly lower in Group B (p < 0.05). Conclusion: At equal total dose, 20 ml of 0.25% ropivacaine produced superior and longer-lasting analgesia than 10 ml of 0.5% ropivacaine for sciatic nerve block.
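The "equal total dose" underpinning this comparison is straightforward w/v arithmetic: a percent solution carries 10 mg per ml per percent. A quick check of the 50 mg figure (the helper name is our own illustrative choice):

```python
def dose_mg(volume_ml: float, concentration_pct: float) -> float:
    """Local anaesthetic mass: a w/v % solution is 10 mg/ml per percent."""
    return volume_ml * concentration_pct * 10

assert dose_mg(10, 0.5) == 50.0    # Group A: 10 ml of 0.5% ropivacaine
assert dose_mg(20, 0.25) == 50.0   # Group B: 20 ml of 0.25% ropivacaine
```

The same relation confirms the adjuvant blocks, e.g. 5 ml of 0.5% bupivacaine for the saphenous block is 25 mg.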

220. Hepatotoxicity and Other Major Adverse Drug Reactions in Intensive Phase Anti-Tubercular Therapy: A Prospective Observational Study
Utkarsh Singh, Medha Bargaje, Rajeev Ranjan, Mrunmayee Yadav
Abstract
Background: Anti-tubercular therapy (ATT) remains the cornerstone of tuberculosis management; however, adverse drug reactions (ADRs), particularly hepatotoxicity, pose significant challenges during the intensive phase of treatment. Objective: To evaluate the incidence, pattern, and risk factors of hepatotoxicity and other major ADRs in patients undergoing intensive phase ATT. Methods: This prospective observational study was conducted at Bharati Hospital, Bharati Vidyapeeth Medical College, Pune over 18 months. A total of 115 patients receiving first-line ATT were included. Patients were monitored clinically and biochemically for ADRs. Data were analyzed using appropriate statistical tests, with p < 0.05 considered significant. Results: Hepatotoxicity was observed in 22.6% of patients. Other ADRs included gastrointestinal intolerance (18.3%), cutaneous reactions (10.4%), and neurological effects (6.1%). Significant associations were found between hepatotoxicity and age >50 years, alcohol consumption, and baseline liver enzyme elevation (p < 0.05). Conclusion: Hepatotoxicity is a common ADR during intensive phase ATT. Early detection and regular monitoring can reduce morbidity and improve treatment adherence.

221. Cyto-Histopathological Correlation of Thyroid Lesions: A Retrospective Study
Pavan, Mohd Shahnawaz Ahmed, Shreyanka, V. Srinivasa Murthy
Abstract
Background: Thyroid lesions constitute one of the most common endocrine disorders encountered in routine clinical practice. Fine needle aspiration cytology (FNAC) is widely accepted as the first-line diagnostic modality for evaluation of thyroid nodules due to its simplicity, safety, and cost-effectiveness. The Bethesda System for Reporting Thyroid Cytopathology (TBSRTC) has standardized reporting and improved communication between clinicians and pathologists. Objectives: To categorize thyroid lesions using TBSRTC and to assess cyto-histopathological correlation in operated cases in order to evaluate diagnostic accuracy. Materials and Methods: A hospital-based retrospective study was conducted in the Department of Pathology, ESIC Medical College and Hospital, Kalaburagi, over a period of two years (January 2023–December 2024). A total of 261 patients with clinically significant thyroid swelling were included. FNAC was performed and categorized according to TBSRTC. Histopathological correlation was performed in cases with available surgical specimens. Statistical analysis included sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and diagnostic accuracy. Results: Thyroid lesions showed a marked female predominance. Benign lesions (Bethesda II) constituted the majority (94.3%). Sensitivity (100%), specificity (91.35%), PPV (61.1%), NPV (100%), and diagnostic accuracy (92.3%) indicate high reliability of FNAC. Conclusion: FNAC, when reported using the Bethesda system, is a reliable, accurate, and minimally invasive diagnostic tool for the evaluation of thyroid lesions and plays a crucial role in guiding appropriate clinical management and reducing unnecessary surgical interventions.
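The four performance figures reported above follow directly from a 2×2 confusion matrix against the histopathological gold standard. As an illustrative sketch only (the counts below are hypothetical, not the study's data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard test-performance metrics from a 2x2 confusion matrix.

    tp/fp/fn/tn = true/false positives and negatives of FNAC versus the
    histopathological gold standard.
    """
    total = tp + fp + fn + tn
    return {
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "accuracy": (tp + tn) / total,   # overall agreement
    }

# Hypothetical counts for illustration only:
metrics = diagnostic_metrics(tp=50, fp=5, fn=0, tn=45)
```

With zero false negatives, sensitivity and NPV are both 100%, mirroring the pattern reported in this abstract.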

222. Cross-Sectional Evaluation of Menstrual Irregularities in Adolescent Girls and Their Lifestyle Correlates
Rashmi Sinha, Anupama Sinha, Sheela Kumari
Abstract
Background: Owing to the immaturity of the hypothalamic-pituitary-ovarian axis, menstrual irregularities are common in adolescent girls. Although some variation of the menstrual cycle is physiological, persistently abnormal cycles can lead to secondary complications such as anemia, dysregulation of other hormones, and metabolic syndrome. Adolescent girls’ menstrual patterns may be affected by lifestyle behaviors, including diet, physical activity, sleep, and stress, yet these associations remain understudied. Objective: The primary goal of this study was to investigate the occurrence of menstrual irregularities in adolescent girls between the ages of 13 and 19. The secondary goal was to explore the correlation of menstrual irregularities with selected lifestyle characteristics. Methods: Over a period of one year from 1st June 2024 to 30th May 2025, with a follow-up of 6 months, a cross-sectional observational study was carried out with 100 adolescent girls from schools and hospital outpatient departments at JLNMCH Bhagalpur. A structured questionnaire along with standard scales was used to collect data on menstrual history, lifestyle, stress levels, anthropometry, and other variables. Data were analyzed with descriptive statistics, and associations were studied using t-tests and Chi-square tests, with p < 0.05 considered significant. Results: Menstrual irregularities were noted in 42% of participants, the most frequent being oligomenorrhea and dysmenorrhea. Low physical activity, poor sleep, and increased stress were significantly correlated with irregular cycles (p < 0.05), whereas diet and BMI showed only non-significant trends. Conclusion: Menstrual irregularities are common in adolescents and are amenable to lifestyle modification. Health education, counseling, physical activity, stress management, and sleep hygiene can improve menstrual and general health in adolescents.

223. Prevalence of Thyroid Dysfunction in Pregnant Women and Its Association with Adverse Pregnancy Outcomes
Rashmi Sinha, Anupama Sinha, Sheela Kumari
Abstract
Background: Thyroid dysfunction is among the most common endocrine disorders encountered in pregnancy. The physiological changes of pregnancy can unmask underlying thyroid disease, and if such disease is not diagnosed and managed correctly it can harm the mother or the fetus. Despite its clinical importance, thyroid dysfunction frequently goes undetected in the absence of routine, standardized screening. Objectives: To evaluate the prevalence of thyroid dysfunction and its association with adverse pregnancy outcomes among women attending the outpatient department of Obstetrics & Gynecology at JLNMCH, Bhagalpur. Methods: Over a period of one year from 1st Nov 2023 to 31st Oct 2024, with a follow-up of 6 months, the study was carried out at the outpatient department of Obstetrics & Gynecology, JLNMCH Bhagalpur, and enrolled one hundred pregnant women. All participants had their serum thyroid-stimulating hormone (TSH) levels measured and, where necessary, free thyroxine (FT4) and free triiodothyronine (FT3) levels assessed. Based on the results of their thyroid function tests, participants were classified as euthyroid, subclinical hypothyroid, overt hypothyroid, or hyperthyroid. Maternal and fetal outcomes were tracked until delivery. Descriptive statistics and either Chi-square or Fisher’s exact tests were employed for the statistical analysis, with a significance level of less than 0.05. Results: Eighteen percent of participants were found to have thyroid dysfunction. The most prevalent form was subclinical hypothyroidism (14%), followed by overt hypothyroidism (3%) and hyperthyroidism (1%). Women with thyroid dysfunction had higher rates of adverse maternal outcomes (preeclampsia, gestational hypertension, anemia, and abortion). This group also had significantly higher rates of fetal complications (preterm birth, low birth weight, low APGAR scores, and increased NICU admissions).
Conclusion: Pregnant women are at increased risk of thyroid dysfunction, which adversely impacts both maternal and fetal health. To enhance perinatal outcomes, it is important to implement systematic screening and introduce treatment as early as possible during the course of the pregnancy.

224. The Therapeutic Role of Curcumin on Oral Disease Management: A Review
Akhtar Husain, Shreshthi Gupta, Asmita Upadhyay, Arti, Vijay Krishnan
Abstract
Background: Curcumin, the principal bioactive compound of turmeric (Curcuma longa), has gained considerable attention due to its anti-inflammatory, antioxidant, antimicrobial, and anticancer properties. Its potential role in oral healthcare has been increasingly explored as an alternative or adjunct to conventional therapies, which are often associated with adverse effects and antimicrobial resistance. Aim: The aim of this review was to systematically evaluate the therapeutic efficacy of curcumin in the management of various oral diseases and to identify optimal formulations and clinical outcomes. Materials and Methods: This review was conducted following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The research question was structured using the PICO framework. A comprehensive literature search was performed in PubMed, ScienceDirect, and Scopus for studies published between 2000 and 2026. Randomized controlled trials, systematic reviews, and meta-analyses evaluating curcumin formulations (gel, mouthwash, capsules, patches, or solutions) in oral diseases were included. A total of 21 studies met the inclusion criteria. Results: Curcumin demonstrated significant therapeutic benefits across multiple oral conditions, including oral mucositis, periodontal diseases, recurrent aphthous stomatitis, oral submucous fibrosis, oral lichen planus, and oral potentially malignant disorders. Clinical outcomes included reduction in inflammation, pain, lesion size, and microbial load, along with improved healing. In several studies, curcumin showed comparable efficacy to conventional treatments such as chlorhexidine and corticosteroids. Advanced delivery systems, including nanoformulations, enhanced its clinical effectiveness. Conclusion: Curcumin is a safe and effective multi-targeted agent with promising applications in oral disease management.
However, limitations such as poor bioavailability, heterogeneity in study designs, and lack of standardized dosing protocols necessitate further large-scale randomized controlled trials to establish definitive clinical guidelines.

225. Outcomes Following Elective Cervical Cerclage in Women at Risk of Preterm Birth
Ankita Meena, Meenakshi Samaria, Pearl Samaria, Dharmendra Singh Fatehpuriya
Abstract
Background: Preterm birth remains a major contributor to neonatal morbidity and mortality worldwide. Cervical insufficiency is a well-recognized risk factor for spontaneous preterm birth. Elective cervical cerclage is commonly employed in high-risk women; however, outcomes vary depending on patient characteristics and gestational age at intervention. Objectives: To evaluate maternal and neonatal outcomes following elective cervical cerclage in women at risk of preterm birth. Methods: A prospective observational study was conducted over 11 months at JLN Medical College, Ajmer, involving 100 pregnant women at high risk of preterm birth who underwent elective cervical cerclage. Maternal demographics, obstetric history, gestational age at cerclage, pregnancy prolongation, gestational age at delivery, and neonatal outcomes were recorded and statistically analyzed. Results: The mean gestational age at cerclage was 14.6 ± 2.1 weeks. The mean prolongation of pregnancy was 11.8 ± 4.2 weeks. Term delivery was achieved in 62% of cases. Neonatal survival rate was 90%. Statistically significant improvement in gestational age at delivery was observed (p < 0.001). Conclusion: Elective cervical cerclage significantly prolongs pregnancy and improves neonatal outcomes in women at risk of preterm birth when performed early in gestation.

226. Short Versus Prolonged Dual Antiplatelet Therapy After Drug-Eluting Stent Implantation: A Prospective Observational Study
Manish Kumar, Priyanka Kumari, Santosh Kumar, Ramesh Thakur, Umeshwar Pandey
Abstract
Background: Dual antiplatelet therapy (DAPT) is commonly prescribed following drug-eluting stent (DES) placement to limit the risk of stent-related thrombosis. Despite its established role, uncertainty persists regarding the appropriate duration, given the need to balance ischemic benefits against bleeding hazards. Objective: To compare clinical outcomes between short-duration and prolonged-duration DAPT in patients undergoing DES implantation. Methods: This study was carried out at LPS Institute of Cardiology, Kanpur, from November 2019 to November 2022. A total of 150 patients were enrolled and divided into two groups: short DAPT and prolonged DAPT. The principal outcome and secondary outcomes were assessed. Statistical analysis was performed using SPSS software (version 25.0). Results: MACE incidence was slightly higher in the short DAPT group compared to prolonged DAPT, but the difference was not statistically significant (p=0.56). Bleeding events were significantly higher in the prolonged DAPT group (p=0.02). Conclusion: Short-duration DAPT appears to provide comparable ischemic protection with significantly lower bleeding risk, suggesting it may be preferable in selected patients.

227. A Study of Paediatric Dermatological Emergencies at a Tertiary Care Hospital: Childhood Skin Crisis
Chintaginjala Aruna, Addagarla Bhargavi, Beepalli Kanthi, Duggirala S.S. Srinivas Prasad, Bandaru Rakesh, Sangam Tejaswini
Abstract
Background: Paediatric dermatological emergencies pose unique challenges in the emergency department. Unlike adults, children are more prone to systemic compromise owing to immature physiology and immunity, and their distinct etiological patterns often ignite parental anxiety. These conditions require prompt recognition, yet their presentation varies across regions. The scarcity of studies from South India underscores the need for further research in this area. Aim: To study the spectrum of dermatological emergencies in the paediatric age group presenting to a tertiary care hospital, and to assess the associated complications and mortality among the study group. Materials and Methods: This was a two-year prospective, hospital-based observational study. It included children below 16 years of age who visited or were referred to the DVL OPD and emergency department. After obtaining informed consent from parents, a detailed history, clinical examination, and appropriate investigations were performed to establish the diagnosis and identify associated complications, and cases were categorized accordingly. Results: A total of 106 paediatric patients were included in the study. Males constituted 56.6% and females 43.4% of cases, with a male-to-female ratio of 1.3:1. The majority of patients were in the 6–10 year age group (34%), followed by 1–5 years (32.1%). Infectious dermatoses were the most common cause (45.3%), followed by genetic disorders (17%) and inflammatory disorders (14.2%). Hand Foot Mouth Disease and Staphylococcal Scalded Skin Syndrome were frequently observed among infectious conditions. Complications occurred in 2.8% of cases, and the mortality rate was 1.9%. Conclusion: Infectious dermatoses represent the most common paediatric dermatological emergencies. Early diagnosis and prompt management in tertiary care centres significantly reduce complications and improve clinical outcomes.

228. Percutaneous Coronary Intervention in Chronic Kidney Disease Patients: A Risk-Based Study
Manish Kumar, Priyanka Kumari, Santosh Kumar, Ramesh Thakur, Umeshwar Pandey
Abstract
Background: Chronic kidney disease (CKD) is a well-established risk factor for adverse cardiovascular outcomes. Patients with CKD undergoing percutaneous coronary intervention (PCI) are at increased risk of complications including contrast-induced nephropathy, bleeding, and mortality. Objective: To evaluate clinical outcomes of PCI in CKD patients and identify risk-based predictors of adverse events. Methods: This study included 100 CKD patients who underwent PCI at LPS Institute of Cardiology, Kanpur between 2019 and 2022. Patients were stratified based on CKD stages and risk factors. Clinical, procedural, and outcome data were analyzed. Results: Higher CKD stages were significantly associated with increased incidence of contrast-induced nephropathy (CIN), in-hospital mortality, and major adverse cardiac events (MACE) (p<0.05). Multivariate analysis identified eGFR <30 ml/min/1.73m², diabetes, and contrast volume >150 ml as independent predictors. Conclusion: CKD significantly impacts PCI outcomes. Risk stratification is crucial for improving patient prognosis.

229. Clinical Safety, Efficacy and Post-Operative Outcomes of Endovenous Laser Ablation (EVLA) Versus Surgical Stripping in Varicose Vein Management: A Prospective Comparative Study
Shashank Mittal, Raghvendra Kumar Pandey, Ravi Sinha
Abstract
Background: Varicose veins are a common manifestation of chronic venous disease. Conventional surgical stripping has been the traditional treatment, but minimally invasive techniques such as endovenous laser ablation (EVLA) have emerged as effective alternatives. Objective: To compare the clinical safety, efficacy, and postoperative outcomes of EVLA and surgical stripping. Methods: This prospective comparative study was conducted over 3 years (February 2023–February 2026) and included 236 patients with lower limb varicose veins. Patients were divided equally into two groups: Group I (n=118) underwent surgical stripping and Group II (n=118) underwent EVLA. Parameters assessed included intraoperative bleeding, hematoma, postoperative pain, inflammation, mobilization time, hospital stay, and return to normal activity. Follow-up was done for 12 months. Results: The majority of patients (66.1%) were in the 25–45-year age group, with male predominance (66.9%). The surgical group showed significantly higher bleeding, hematoma, and postoperative inflammation. The EVLA group demonstrated earlier mobilization (1 day vs 2 weeks), shorter hospital stay (1 vs 5 days), and quicker return to work (5 days vs 2 weeks). Logistic regression showed EVLA as an independent protective factor (OR=0.16, p<0.001). The ROC curve demonstrated excellent efficacy (AUC=0.89). Conclusion: EVLA is a safe, effective, and minimally invasive alternative with superior postoperative outcomes and should be considered the preferred treatment modality.

230. Antibiotic Susceptibility Patterns of Uropathogens in a Rural Tertiary Care Centre of Western Maharashtra: A Prospective Study
Kolhe Prajakta, Phate Sagar, Rahul Kunkulol
Abstract
Introduction: Urinary tract infections (UTIs) are among the most common bacterial infections and a major cause of morbidity. The increasing prevalence of antimicrobial resistance (AMR) among uropathogens has reduced the effectiveness of commonly used antibiotics, complicating empirical therapy. Continuous surveillance through institutional antibiograms is essential to monitor resistance trends and guide rational antimicrobial use, particularly in rural tertiary care settings. Methodology: A prospective longitudinal study was conducted over two years in a rural tertiary care hospital in Western Maharashtra. Adult patients with suspected UTIs whose urine samples were sent for culture and sensitivity testing were included. Identification of pathogens and antibiotic susceptibility testing were performed using the Kirby–Bauer disk diffusion method as per CLSI guidelines. Data were analyzed to determine the spectrum of uropathogens and their antimicrobial susceptibility patterns. Results: A total of 691 urine samples were analyzed. The predominant isolates were Escherichia coli, Klebsiella spp., Pseudomonas spp., Staphylococcus aureus, and Enterococcus spp. Carbapenems showed high sensitivity against Gram-negative organisms, including E. coli (88%) and Klebsiella spp. (90%). Piperacillin–tazobactam and cefoperazone–sulbactam demonstrated moderate to high sensitivity (78–85%), while amikacin showed good efficacy (75–80%). High resistance was observed with fluoroquinolones and cotrimoxazole. Nitrofurantoin and fosfomycin retained good activity, especially against E. coli. Gram-positive isolates showed high sensitivity to vancomycin, linezolid, and teicoplanin (>90%). Discussion: The findings indicate rising resistance to commonly used antibiotics, emphasizing the need for judicious use of reserve drugs and promoting effective oral agents for uncomplicated UTIs. 
Conclusion: High AMR burden necessitates regular antibiogram surveillance to guide empirical therapy and strengthen antimicrobial stewardship.

231. The Role of Micronucleus Scoring in Cervical Papanicolaou Smears
Chiranpreet Gupta, Harpal Singh, Ninder Kumar, Preet Kanwal Sibia
Abstract
Aims and Objectives: 1) To evaluate micronucleus (MN) scoring in all major diagnostic categories as defined by “The Bethesda System for Reporting Cervical Cytology” 2014, including negative for intraepithelial lesion or malignancy (NILM), inflammatory, atypical squamous cells of undetermined significance (ASC-US), atypical squamous cells cannot exclude high-grade squamous intraepithelial lesion (ASC-H), low-grade squamous intraepithelial lesion (LSIL), high-grade squamous intraepithelial lesion (HSIL), and invasive carcinoma (IC) in cervical Pap smears. 2) To study the frequency and pattern of MN from the NILM to IC categories in cervical Pap smears. Materials and Methods: Pathologists independently assessed 1000 conventional cervical smears stained with Papanicolaou (Pap) stain, which included unsatisfactory for evaluation (93), NILM (154), inflammatory (673), ASC-US (25), ASC-H (19), LSIL (15), HSIL (14), and IC (7). The MN score per 1000 cells was determined by counting the number of micronucleated cells under high-power (×400) and oil-immersion (×1000) fields. Results: The mean MN score ± standard deviation was 0.99 ± 0.744 in NILM cases, 0.67 ± 0.782 in inflammatory cases, 1.57 ± 0.507 in ASC-US cases, 1.63 ± 0.50 in ASC-H cases, 1.56 ± 0.511 in LSIL cases, 2.47 ± 0.516 in HSIL cases, and 3.0 ± 0.00 in IC cases. A step-wise increase in MN score was observed from the inflammatory to the IC categories. Conclusions: The MN score is a reliable, simple test that can be used in conjunction with the routine cervical Pap smear as a biomarker of the risk of malignant transformation in the uterine cervix.
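The MN score per 1000 cells reduces to a simple proportion; a minimal sketch (function name and counts are illustrative, not the study's data):

```python
def mn_score_per_1000(mn_cells, cells_screened):
    """Micronucleus score expressed per 1000 screened cells.

    mn_cells: number of micronucleated cells counted
    cells_screened: total cells examined in the smear
    """
    return 1000.0 * mn_cells / cells_screened

# e.g. 3 micronucleated cells among 1000 screened cells -> score of 3.0
score = mn_score_per_1000(mn_cells=3, cells_screened=1000)
```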

232. Correlation of Serum C-Reactive Protein Concentration with Acute Ischemic Stroke Severity at Admission: A Retrospective Study from Northeast India
Arindam Das, Papori Borah, Munindra Goswami, Marami Das, Anirban Mahanta
Abstract
Background: Inflammation plays a pivotal role in the pathophysiology of acute ischemic stroke (AIS). C-reactive protein (CRP), an acute-phase reactant, has emerged as a potential biomarker reflecting systemic inflammatory response and disease severity. Objectives: To evaluate the correlation between serum CRP levels and stroke severity as assessed by the National Institutes of Health Stroke Scale (NIHSS) at admission. Methods: A retrospective observational study was conducted at a tertiary care centre in Northeast India from March 2025 to September 2025. A total of 50 patients with acute ischemic stroke were included. Data Collected: (1) Demographic profile. (2) Risk factors (diabetes, hypertension, smoking, alcohol). (3) Serum CRP levels. (4) Stroke severity using NIHSS score. CRP levels were categorized into: (1) <10 mg/dL (2) >10 mg/dL. Stroke severity: (1) Mild: NIHSS 0–4; (2) Moderate: 5–15; (3) Severe: >15. Statistical Analysis: Data were analysed using SPSS version 26: (1) Chi-square/Fisher’s exact test for categorical variables.(2) Student’s t-test for continuous variables.(3) p < 0.05 considered significant. Results: (1) Total patients: 50; (2) Male: 66%, Female: 34%; (3) Age >45 years: 90%. CRP correlation: (1) CRP >10 mg/dL → strongly associated with severe stroke. (2) CRP <10 mg/dL → mostly mild–moderate stroke. Key finding: A statistically significant association was observed between CRP levels and stroke severity (p < 0.001). Conclusion: Elevated CRP levels are significantly associated with increased severity of acute ischemic stroke. CRP may serve as a simple, accessible biomarker for early risk stratification.
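The CRP and NIHSS bands used in this abstract are plain thresholds; as an illustrative sketch (function names and string labels are ours, not the study's):

```python
def nihss_severity(score):
    """Stroke severity bands as defined in the abstract:
    mild NIHSS 0-4, moderate 5-15, severe >15."""
    if score <= 4:
        return "mild"
    if score <= 15:
        return "moderate"
    return "severe"

def crp_category(crp_mg_dl):
    """CRP dichotomized as in the abstract: <10 mg/dL vs >10 mg/dL."""
    return "high" if crp_mg_dl > 10 else "low"

# e.g. a patient with NIHSS 18 and CRP 14 mg/dL falls in the
# "severe" / "high CRP" cell of the association table
```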

233. Anatomical Variations in Portal Vein Branching Pattern on CECT Abdomen: A Descriptive Analysis
Rincy Gomas, Antlin Sushma
Abstract
Background: The anatomy of the portal vein and its variants is essential knowledge in applied and radiological anatomy, especially for hepatobiliary surgery, liver transplantation, and interventional procedures. Portal vein branching patterns exhibit considerable anatomical variability. Contrast-enhanced computed tomography (CECT) of the abdomen permits precise, non-invasive visualization of the portal venous architecture, making it an invaluable tool for anatomy-based descriptive research. Aim: To study the prevalence, normal anatomical pattern, and variations of portal vein branching using contrast-enhanced CT of the abdomen. Materials and Methods: This descriptive study was conducted over a period of 18 months. A total of 175 contrast-enhanced CT scans of the abdomen were included for analysis. The scans were collected from individuals undergoing CT for various medical conditions, and anonymized images were retrospectively reviewed for anatomical assessment. Portal vein branching patterns were categorized into the classical bifurcation pattern and variant configurations, including trifurcation, early branching of the right posterior portal vein, and other uncommon anatomical arrangements. Observations were methodically documented, and data were examined using descriptive statistics. Results were presented as frequencies and percentages. Results: Of the 175 CT scans analyzed, the classical portal vein bifurcation into right and left portal veins was observed in the majority of cases. Anatomical variations were identified in 63 (36%) subjects. Portal vein trifurcation was the most common variant, followed by early origin of the right posterior sectoral branch. Other less frequent variations included separate segmental branching and atypical intrahepatic portal vein patterns. These variations were observed across different age groups and in both sexes, without any significant gender predilection.
Conclusion: Anatomical variations in portal vein branching are common and can be accurately identified on contrast-enhanced abdominal CT imaging. From an anatomical standpoint, identifying and recording these variations is vital for accurate radiological interpretation and for providing essential anatomical guidance in hepatobiliary surgical and interventional practice.
