International Journal of Current Pharmaceutical Review and Research

e-ISSN: 0976 822X

p-ISSN: 2961-6042

NMC Approved Peer Review Journal


Disclaimer: Scopus, Embase, Publons, and Crossref are registered trademarks of their respective companies.

This journal is a member of Crossref.

1. Comparison of Analgesic Efficacy of Ultrasound-Guided External Oblique Intercostal Plane Block and Modified Thoracoabdominal Nerve Block via Perichondrial Approach in Patients Undergoing Upper Abdominal Surgeries: A Randomized Controlled Study
Divyashree R., S. B. Gangadhar, Samveda S. G.
Abstract
Background and Aims: Effective postoperative pain management following upper abdominal surgeries improves recovery and reduces opioid requirements. Recently described ultrasound-guided fascial plane blocks, such as the External Oblique Intercostal Plane (EOIP) block and the modified thoracoabdominal nerve block via the perichondrial approach (M-TAPA), provide analgesia of the anterior and lateral abdominal wall. This study aimed to compare the postoperative analgesic efficacy of EOIP and M-TAPA blocks with conventional analgesia in patients undergoing upper abdominal surgeries. Material and Methods: This prospective randomized study included 45 patients undergoing elective upper abdominal surgery under general anaesthesia. Patients were randomly allocated into three groups (n=15 each): a control group (Group C) receiving general anaesthesia with standard systemic analgesia, Group E receiving an ultrasound-guided EOIP block, and Group T receiving an ultrasound-guided M-TAPA block in addition to general anaesthesia. The primary outcome was postoperative pain assessed using the Numerical Rating Scale (NRS) at predetermined time intervals in the first 24 hours. Secondary outcomes included demographic variables, sensory dermatomal spread, total opioid consumption, NSAID consumption, and postoperative complications. Statistical analysis was performed using ANOVA and Chi-square tests. Results: Postoperative NRS scores were significantly lower in Groups E and T compared with the control group at multiple postoperative time points (p<0.001). Both the EOIP and M-TAPA groups demonstrated reduced opioid consumption and prolonged time to first rescue analgesia compared with the control group. However, no statistically significant difference was observed between Groups E and T with respect to pain scores, opioid requirement, or duration of analgesia. The incidence of adverse effects was comparable among the three groups.
The mean age was comparable among the groups (Group C: 45.3±11.2 years, Group E: 46.1±10.8 years, Group T: 44.7±12.1 years; p>0.05). Mean 24-hour opioid consumption was significantly higher in the control group (300±0 mg) compared with the EOIP (73.3±59.4 mg) and M-TAPA (86.7±74.3 mg) groups (p<0.001). Similarly, NSAID consumption was significantly higher in the control group (3000±0 mg) compared with the EOIP (1333±488 mg) and M-TAPA (1267±458 mg) groups. Both blocks produced adequate dermatomal spread from T6 to T11. Conclusions: The ultrasound-guided External Oblique Intercostal Plane (EOIP) block and the modified thoracoabdominal nerve block via the perichondrial approach (M-TAPA) have emerged as effective techniques for managing postoperative pain in patients undergoing upper abdominal surgeries. These blocks provide multi-dermatomal analgesia, significantly reducing opioid consumption and contributing to improved patient outcomes. With minimal complications reported, EOIP and M-TAPA blocks can be safely incorporated into multimodal analgesia protocols, enhancing pain management strategies. Clinically, the M-TAPA block demonstrates better sensory dermatomal spread (T6-T12) than the EOIP block, with a slight increase in duration of analgesia, suggesting a potential advantage in certain clinical scenarios. Hence, we recommend the M-TAPA block for better postoperative analgesia following upper abdominal surgeries. However, both techniques demonstrate comparable analgesic efficacy, making them valuable additions to pain management approaches for upper abdominal surgeries.
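The three-group comparisons above use one-way ANOVA. As a minimal illustration of the F statistic that test computes (stdlib only; the NRS-like scores below are made up for demonstration, not study data):

```python
from statistics import mean

def one_way_anova_F(*groups):
    """One-way ANOVA F statistic: between-group variance over within-group variance."""
    k = len(groups)                      # number of groups
    N = sum(len(g) for g in groups)      # total observations
    grand = mean(x for g in groups for x in g)
    # Between-group sum of squares (k - 1 degrees of freedom)
    ssb = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares (N - k degrees of freedom)
    ssw = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ssb / (k - 1)) / (ssw / (N - k))

# Hypothetical NRS pain scores for three groups (illustration only)
control = [6, 7, 5, 6, 7]
eoip    = [2, 3, 2, 1, 2]
mtapa   = [2, 2, 3, 2, 1]
F = one_way_anova_F(control, eoip, mtapa)
```

A large F (here, block groups far below the control) is what drives the reported p<0.001; the actual p-value would come from the F distribution with (k-1, N-k) degrees of freedom.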

2. Information‑Seeking Behavior and PubMed Utilization among Undergraduate Medical Students: A Cross‑Sectional Study
Siddanathi Narasinga Rao, Dimma Syamala, Ramidi Madhavi Reddy, Dornala Dinesh Reddy
Abstract
Background:  Efficient retrieval of biomedical literature is an essential component of evidence‑based medical practice. PubMed is one of the most widely used biomedical databases for accessing peer‑reviewed scientific literature. However, the extent to which undergraduate medical students possess adequate knowledge and practical skills to effectively utilize PubMed remains variable. Aim: To evaluate the knowledge, attitudes and practices related to PubMed literature searching among undergraduate medical students. Objective: To evaluate awareness and understanding of PubMed, assess students’ attitudes toward its academic utility, and analyze patterns of its practical use in medical education. Methods: A cross‑sectional questionnaire‑based study was conducted among 200 undergraduate medical students, including final‑year MBBS students and interns, at a tertiary care teaching hospital. A structured questionnaire assessing knowledge, attitude and practice regarding PubMed searching was distributed electronically using Google Forms. Data were analyzed using descriptive statistics and presented as frequencies and percentages. Results: Among the participants, 62% reported awareness of PubMed as a biomedical literature database. Knowledge of advanced search tools such as Medical Subject Headings (MeSH), Boolean operators and clinical filters was limited. Nearly half of the students expressed positive attitudes toward the importance of literature searching in medical education, while a considerable proportion reported neutral confidence levels in using PubMed effectively. Conclusion: Although awareness of PubMed exists among undergraduate medical students, gaps persist in knowledge depth and practical application of advanced search techniques. Structured training programs focusing on biomedical literature searching should be incorporated into undergraduate medical curricula to enhance research skills and support evidence‑based clinical learning.

3. Comparison of Real‑Time Ultrasound‑Guided Spinal Anaesthesia vs Pre‑Procedural Ultrasound‑Guided Spinal Anaesthesia in Geriatric Patients Undergoing Infra-Umbilical Surgeries
R. Jagadish Raj, S. B. Gangadhar, Anjali R. Hegde
Abstract
Background: Age-related degenerative changes of the lumbar spine, including narrowing of interspinous spaces and ligament calcification, increase the technical difficulty of neuraxial block placement in geriatric patients. Ultrasound guidance has been used to improve the success of spinal anaesthesia, either through pre-procedural ultrasound (PUS)-assisted landmark identification or real-time ultrasound (RUS)-guided needle insertion. However, comparative evidence between these two techniques in elderly patients is limited. This study aimed to compare the efficacy of real-time ultrasound-guided spinal anaesthesia with pre-procedural ultrasound-guided spinal anaesthesia in geriatric patients undergoing infra-umbilical surgeries. Material and Methods: In this prospective randomized study, 50 patients aged ≥60 years with American Society of Anaesthesiologists (ASA) physical status I–III undergoing infra-umbilical surgeries under spinal anaesthesia were randomized into two groups of 25 each to receive spinal anaesthesia using either the PUS or RUS technique via a paramedian approach. Primary outcomes measured were the number of attempts and needle passes required for successful subarachnoid block. Secondary outcomes included time taken for identification of the subarachnoid space, procedure time, and total time for successful lumbar puncture. Statistical analysis was performed using the independent-sample t-test and Chi-square test. Results: The first-attempt success rate was higher in the PUS group (64%) than in the RUS group (48%), without statistical significance (p=0.561). First-pass success was also greater in the PUS group (48%) than in the RUS group (28%) (p=0.209). The mean time for identification of the subarachnoid space was significantly longer in the PUS group (74.76±21.37 s) compared to the RUS group (63.72±17.94 s) (p<0.05). Procedure time was significantly longer in the RUS group (56.56±24.62 s) compared to the PUS group (32.94±14.47 s) (p<0.0001).
Total time for successful lumbar puncture was comparable between the groups (p=0.149). Conclusions: The total duration of a successful lumbar puncture remains the same irrespective of the ultrasound technique used, with pre-procedural ultrasound taking longer for identification of the subarachnoid space and real-time ultrasound taking longer from skin prick to cerebrospinal fluid backflow.

4. Comparison of Pressure-Controlled Versus Volume-Controlled Ventilation Modes in Patients Undergoing Lumbar Spine Surgery in the Prone Position
Anjali R. Hegde, S. B. Gangadhar, R. Jagadish Raj
Abstract
Background: Prone positioning during lumbar spine surgery under general anaesthesia significantly alters respiratory mechanics and hemodynamics due to changes in thoraco-abdominal compliance and intra-thoracic pressure. The choice of intraoperative ventilation mode may influence pulmonary mechanics, cardiovascular stability, intraoperative blood loss, and surgical stress response. Evidence comparing pressure-controlled ventilation (PCV) and volume-controlled ventilation (VCV) in this setting remains inconsistent. Objectives: To compare PCV and VCV with respect to pulmonary mechanics, hemodynamic parameters, intraoperative blood loss, and stress response in patients undergoing elective lumbar spine surgery in the prone position. Material and Methods: This prospective, randomized comparative study included 60 adult patients (ASA I–II) scheduled for elective lumbar spine surgery. Patients were randomized to receive either PCV or VCV (n=30 each). Ventilation was standardized with a tidal volume of 8 mL/kg and PEEP of 5 cm H₂O. Hemodynamic and respiratory parameters were recorded after intubation in the supine position and 30 minutes after prone positioning. Dynamic compliance was calculated, the surgical stress response was assessed using random blood glucose levels, and intraoperative blood loss was estimated at the end of surgery. Results: Demographic characteristics were comparable between groups. Peak airway pressure was significantly lower and dynamic compliance significantly higher in the PCV group compared to the VCV group (p<0.001). Mean arterial pressure showed a lesser decline in the PCV group (p=0.02). Intraoperative blood loss was significantly lower with PCV (p<0.001). Heart rate, end-tidal carbon dioxide, and blood glucose levels showed no significant intergroup differences.
Conclusion: Pressure-controlled ventilation provides superior respiratory mechanics and reduced intraoperative blood loss with good hemodynamic stability and no significant stress response compared to volume-controlled ventilation during prone lumbar spine surgery.

5. Comparison of Thyroid Profile in Beta-Thalassemia Major Patients on Regular Blood Transfusion and Iron Chelation Therapy with Age-Matched Controls: A Cross-Sectional Study
Sanjeev Chahar, Saroj Ola, Vineet Popli, Deepti Jain, Pratima Khare
Abstract
Background: Beta-thalassemia major (BTM) is a transfusion-dependent hereditary hemoglobin disorder associated with iron overload and multiple endocrine complications, including thyroid dysfunction. Early detection of thyroid abnormalities is essential to reduce morbidity in these patients. Objective: To evaluate the prevalence and pattern of thyroid dysfunction in children with beta-thalassemia major receiving regular blood transfusion and iron chelation therapy, and to compare findings with age-matched healthy controls. Methods: This hospital-based cross-sectional comparative study included 50 children (≥8 years) with confirmed BTM and 50 age-matched healthy controls. Clinical evaluation, anthropometric measurements, and laboratory investigations including serum T3, T4, thyroid-stimulating hormone (TSH), and serum ferritin were performed. Thyroid status was classified as euthyroid, subclinical hypothyroidism, primary hypothyroidism, or secondary hypothyroidism. Statistical analysis was conducted using SPSS version 21.0. Results: The mean age of cases and controls was comparable. Thalassemia patients had significantly lower mean height and weight than controls. Overall, 26% of BTM patients exhibited thyroid dysfunction, with subclinical hypothyroidism being most common (20%), followed by primary hypothyroidism (6%). In contrast, 12% of controls had subclinical hypothyroidism, with no cases of overt hypothyroidism. Although mean T3, T4, and TSH levels did not differ significantly between groups, elevated TSH was associated with thyroid dysfunction. Higher serum ferritin levels and longer duration of chelation therapy were significantly associated with thyroid abnormalities. No significant association was found with age at diagnosis, transfusion frequency, or transfusion burden. 
Conclusion: Thyroid dysfunction, predominantly subclinical hypothyroidism, is a common endocrine complication in transfusion-dependent beta-thalassemia major patients despite ongoing chelation therapy. Regular screening of thyroid function and monitoring of iron overload are recommended for early detection and management.

6. Evaluation and Management of Diabetic Foot According to Wagner’s Classification
L. Parvathi, N. Deepthi, Shaik Mahammed Asadulla, Yasa Prathibha
Abstract
Background: Diabetic foot complications represent a significant healthcare burden globally, affecting approximately 25% of individuals with diabetes during their lifetime. Wagner’s classification system serves as a fundamental tool for systematic evaluation and therapeutic planning in diabetic foot management. This grading system categorizes foot lesions from Grade 0 (high-risk foot) to Grade 5 (extensive gangrene), facilitating standardized treatment approaches and outcome prediction. Methods: A prospective observational study was conducted involving 25 patients with diabetic foot complications presenting to our tertiary care center over 24 months. Each patient underwent comprehensive evaluation and was classified according to Wagner’s grading system. Treatment protocols were implemented based on the assigned grade, with regular follow-up assessments to monitor healing progress and clinical outcomes. Results: The study cohort demonstrated varying healing rates correlated with Wagner grade severity. Higher-grade lesions showed prolonged healing times and increased complication rates. Grade-specific treatment protocols proved effective in achieving optimal clinical outcomes, with early intervention significantly improving prognosis across all patient categories. Conclusions: Wagner’s classification system provides reliable guidance for diabetic foot evaluation and management. The systematic approach enables healthcare providers to implement appropriate therapeutic interventions, ultimately improving patient outcomes and reducing the risk of severe complications including amputation.
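The Wagner grades summarized in the background (Grade 0, high-risk foot, through Grade 5, extensive gangrene) can be sketched as a simple lookup. The descriptions below paraphrase how the grades are commonly stated in the literature, so they should be confirmed against the original classification before any clinical use:

```python
# Wagner diabetic foot grades (descriptions paraphrased from common usage)
WAGNER_GRADES = {
    0: "High-risk foot, no open lesion",
    1: "Superficial ulcer",
    2: "Deep ulcer reaching tendon, capsule, or bone",
    3: "Deep ulcer with abscess or osteomyelitis",
    4: "Localized gangrene (e.g., forefoot or heel)",
    5: "Extensive gangrene of the whole foot",
}

def describe_wagner(grade: int) -> str:
    """Return the textual description for a Wagner grade 0-5."""
    if grade not in WAGNER_GRADES:
        raise ValueError("Wagner grade must be an integer 0-5")
    return WAGNER_GRADES[grade]
```

Encoding the grade as data rather than prose is what makes grade-specific treatment protocols, as used in the study, straightforward to standardize.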

7. Incidence of Acute Kidney Injury in Acute Febrile Illness
Srikanth Nakka, Dharam Dev Golani, Nagavaram Harikrishna, T. Krishna Kumar
Abstract
Background: Acute kidney injury (AKI) denotes a sudden and reversible reduction in kidney function characterized by increased creatinine or decreased urine volume. Acute febrile illness is one of the leading causes of AKI. Aim: The aim of this study was to evaluate the clinical profile of acute kidney injury in acute febrile illness. Materials and Methods: This original research study was carried out in the department of General Medicine in Pimpri, Pune. All 100 patients diagnosed with acute febrile illness who were admitted to the medical wards and intensive care unit (ICU) between September 2020 and August 2024 were included in the study. Results: In our study, 70% of the study population were male and 30% were female. On RIFLE grading, 27% were in the risk category, 13% in the injury category, and 10% in the failure category. On AKIN grading, 31% were grade 1, 8% grade 2, and 10% grade 3. KDIGO AKI stages 1, 2, and 3 were seen in 27 (27%), 9 (9%), and 14 (14%) of AKI patients, respectively. Conclusion: According to this study, dengue is the most frequent cause of AFI and AKI. The highest burden of AKI is caused by leptospirosis, dengue, and malaria. To help diagnose AKI early and provide appropriate care, we recommend that doctors evaluate kidney function in patients with acute febrile illness who also have certain risk factors.
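The KDIGO staging used above assigns AKI stages by the rise in serum creatinine. A minimal sketch of the creatinine criteria, with cut-offs as commonly stated in the KDIGO guideline (urine-output criteria and the 48-hour/7-day timing windows are omitted, so this is illustrative, not diagnostic):

```python
def kdigo_stage(baseline_cr: float, current_cr: float, on_rrt: bool = False) -> int:
    """KDIGO AKI stage from serum creatinine (mg/dL); 0 means no AKI
    by these creatinine criteria alone."""
    ratio = current_cr / baseline_cr
    # Stage 3: >=3.0x baseline, creatinine rising to >=4.0 mg/dL, or renal
    # replacement therapy initiated
    if on_rrt or ratio >= 3.0 or current_cr >= 4.0:
        return 3
    # Stage 2: 2.0-2.9x baseline
    if ratio >= 2.0:
        return 2
    # Stage 1: 1.5-1.9x baseline or an absolute rise >= 0.3 mg/dL
    if ratio >= 1.5 or (current_cr - baseline_cr) >= 0.3:
        return 1
    return 0
```

For example, a rise from 1.0 to 1.4 mg/dL meets the absolute-increase criterion for stage 1 even though the ratio is below 1.5.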

8. Evaluation of the Use of Steroid-Sparing Immunosuppressants in Dermatological Disorders: Prescription Pattern and Safety Considerations
Sanglaap Saha, Romit Banerjee, Ritarshi Bhattacharya, Soumik Ghosh, Ranita Das, Suhena Sarkar, Abanti Saha, Amrita Sil
Abstract
Background: Chronic dermatological conditions including psoriasis, lupus, vitiligo, pemphigus, and lichen planus frequently require systemic immunosuppressive therapy. Long-term corticosteroids carry significant morbidity, prompting reliance on steroid-sparing immunosuppressants. Prospective real-world data on their prescription patterns and safety profiles from tertiary care dermatology centres in India remain limited. Objectives: To evaluate the prescription patterns and incidence of adverse drug reactions (ADRs) of steroid-sparing immunosuppressants in patients with chronic dermatological disorders attending a tertiary care centre. Methods: A prospective, cross-sectional observational study was conducted over six months in the Dermatology OPD and IPD of a tertiary care medical college in eastern India. A total of 183 adult patients receiving non-steroidal immunosuppressants for at least four weeks were enrolled. Socio-demographic, clinical, and pharmacological data were recorded using structured case report forms. ADRs were documented using the CDSCO ADR reporting form version 1.4, and causality was assessed by the WHO-UMC scale. Results: The mean age was 39.22 ± 16.13 years; 53% were female. Psoriasis was the most prevalent diagnosis (47.4%), followed by vitiligo (14.1%) and lupus (6.0%). Methotrexate was the most frequently prescribed agent (27.3%), followed by cyclosporine (15.8%), tacrolimus (11.5%), and tofacitinib (10.9%). The ADRs encountered were mostly mild to moderate in severity, with systemic effects in 26.4% and mucocutaneous effects in 5.1% of patients. Dyslipidaemia, cough, and arthralgia were the most common systemic ADRs (3.8% each). Conclusion: Methotrexate followed by cyclosporine dominated steroid-sparing immunosuppressant prescriptions, consistent with national and international literature. The ADR profile was predominantly mild to moderate.
Regular monitoring and pharmacovigilance are essential to ensure safe long-term use of these agents in dermatology practice.

9. Study of Drug Susceptibility, Resistance Patterns, and Mutations in Patients Diagnosed with Drug-Resistant Tuberculosis in a Tertiary Care Centre, Aurangabad, Maharashtra
Akash Bhardwaj, Anupam Prakash, Sunil Jadhav, Ashish S. Deshmukh, Hafiz Deshmukh, Shivprasad Kasat
Abstract
Background: Drug-resistant tuberculosis (DR-TB) remains a major public health challenge, particularly in high-burden countries like India. Resistance to first-line anti-tubercular drugs complicates treatment and contributes to increased morbidity and mortality. Understanding drug susceptibility patterns and associated genetic mutations is essential for effective management and control of DR-TB. Methods: A prospective observational study was conducted over a period of two years at MGM Medical College and Hospital, Aurangabad, Maharashtra. A total of 82 patients diagnosed with drug-resistant tuberculosis were enrolled, of which 80 patients with complete data were included in the final analysis. Drug susceptibility testing and mutation analysis were performed for first- and second-line anti-tubercular drugs. Data were analyzed using R software (version 4.3.2). Demographic characteristics, clinical profile, resistance patterns, and associated mutations were assessed. Results: Among the 80 patients studied, the majority belonged to the 10–30 years age group (51.25%), and males constituted 63.75% of cases. Most patients (93.75%) had no prior history of tuberculosis. Pulmonary tuberculosis was the predominant presentation (83.75%), while extrapulmonary disease accounted for 16.25%. Isoniazid resistance was observed in 96.25% of patients, with mutations in katG (79.22%) and inhA (28.57%). Rifampicin resistance associated with rpoB mutation was identified in 38.75% of patients. Fluoroquinolone resistance was seen in 23.75% of cases, predominantly involving gyrA mutation (100%) and gyrB mutation (63.16%). Resistance to second-line injectable drugs was identified in 5% of patients, associated with rrs and eis mutations. Pyrazinamide resistance was rare (1.25%). Conclusion: The study demonstrates a high burden of drug-resistant tuberculosis among young adults, with significant resistance to first-line drugs and identifiable genetic mutations. 
Routine drug susceptibility testing and molecular diagnostics are essential for early detection and effective management of DR-TB.

10. Variations in the Blood Supply of the Prostate Gland in the Maharashtra Population – A Cadaveric Study
Rahul Kharate
Abstract
Background: The prostate is a fibro-musculoglandular organ. Owing to its various functions it is richly vascular, but arterial variations pose a challenge to surgeons seeking to avoid surgical emergencies during prostatectomy. Method: Thirty-four (34) non-pathological adult prostates were dissected to study the arterial supply, cleaned with distilled water, and allowed to dry. The arteries were painted, and photographs were taken wherever variations were noted. Results: Out of 34 specimens, the major arterial supply was the inferior vesical artery in 15 (41%); the least frequent were the gluteopudendal trunk in 2 (5.88%) and the middle rectal artery in 2 (5.88%). Conclusion: These variations are very important to urosurgeons during prostatectomy; although various radiological techniques such as CT scans and angiography are available, complete and small branches cannot be visualized.

11. A Comparative Study of Heart Rate Variability and Serum Uric Acid between Normotensive and Hypertensive Individuals in Tertiary Care
Nimit A. Hinsu, Happy Chadsaniya, Rashmita Ramani, Manish Kakaiya, R. S. Trivedi
Abstract
Background: Hypertension constitutes a major global health burden, contributing substantially to cardiovascular, renal, and cerebrovascular morbidity and mortality. Two emerging pathophysiological contributors—autonomic nervous system dysfunction as quantified by Heart Rate Variability (HRV), and elevated serum uric acid, the terminal metabolite of purine catabolism—have been independently implicated in the onset and progression of hypertension, yet their concurrent assessment in a treatment-naive cohort remains underexplored. Aims and Objectives: To compare HRV parameters and serum uric acid levels between newly diagnosed treatment-naive hypertensive patients and age- and BMI-matched normotensive controls. Setting and Design: A cross-sectional observational study conducted in the Department of Physiology, P.D.U. Government Medical College and Civil Hospital, Rajkot, Gujarat, India. Materials and Methods: Sixty age- and BMI-matched individuals aged 30–50 years were enrolled: 30 newly diagnosed untreated hypertensive patients and 30 normotensive volunteers. HRV was recorded using the Polar H9 heart rate sensor with the Elite HRV application. Serum uric acid was determined by Autoanalyzer. Statistical analysis was performed using SPSS v30.0.0. Intergroup comparisons were conducted using the unpaired Student’s t-test. Results: Hypertensives showed significantly elevated serum uric acid (5.84 ± 1.80 mg/dL vs. 4.34 ± 1.05 mg/dL; p < 0.001). Mean RR Interval, SDNN, and RMSSD were each significantly lower in the hypertensive group (p < 0.001). Frequency-domain analysis revealed significant reductions in LF (p < 0.001) and HF (p = 0.036); the LF/HF ratio did not differ significantly (p = 0.122). Conclusion: Hypertensive individuals exhibit markedly reduced HRV and elevated serum uric acid, collectively reflecting impaired autonomic regulation and augmented cardiovascular risk. 
Monitoring these non-invasive biomarkers may facilitate early cardiovascular risk stratification and targeted preventive intervention.
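The time-domain indices reported above (Mean RR, SDNN, RMSSD) have standard definitions over the series of RR (NN) intervals. A minimal stdlib sketch, computed on a short illustrative RR series rather than study data:

```python
from math import sqrt
from statistics import mean, stdev

def sdnn(rr_ms):
    """SDNN: sample standard deviation of the RR (NN) intervals, in ms."""
    return stdev(rr_ms)

def rmssd(rr_ms):
    """RMSSD: root mean square of successive RR-interval differences, in ms."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return sqrt(mean(d * d for d in diffs))

# Illustrative 5-beat RR series in ms; per the findings above, lower SDNN
# and RMSSD would be expected in the hypertensive group.
rr = [800, 810, 790, 820, 800]
mean_rr = mean(rr)
```

SDNN reflects overall variability, while RMSSD emphasizes beat-to-beat (largely parasympathetically mediated) variation, which is why both are reported separately.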

12. Comparative Evaluation of Thyroid Hormone Status in Patients with Acute Coronary Syndrome: A Cross-Sectional Study
Rashmita A. Ramani, Nimit A. Hinsu, Happy K. Chadsaniya, Kirit Sakariya, R. S. Trivedi
Abstract
Background: Acute coronary syndrome (ACS) is a major cause of morbidity and mortality worldwide and includes clinical conditions such as ST-elevation myocardial infarction (STEMI) and non-ST elevation myocardial infarction (NSTEMI). Thyroid hormones play a significant role in cardiovascular physiology by regulating myocardial contractility, heart rate, vascular resistance, and lipid metabolism. Alterations in thyroid hormone levels are frequently observed during acute systemic illnesses and may occur as part of Euthyroid Sick Syndrome (ESS). These hormonal changes may influence the clinical course and prognosis of patients with ACS. Aims and Objectives: To evaluate thyroid hormone status in patients with acute coronary syndrome and compare the thyroid profile between STEMI and NSTEMI patients. Setting and Design: This was a cross-sectional observational study conducted at P.D.U. Government Medical College and Civil Hospital, Rajkot, Gujarat, India. Materials and Methods: A total of 100 patients diagnosed with acute coronary syndrome were included in the study. Serum levels of free triiodothyronine (fT3), free thyroxine (fT4), and thyroid stimulating hormone (TSH) were measured within 24 hours of hospital admission using the ELISA method. Statistical analysis was performed using Chi-square test and unpaired t-test, and a p-value < 0.05 was considered statistically significant. Results: Among the 100 ACS patients studied, 67% had normal thyroid function, while 33% showed thyroid dysfunction. The most common abnormality was Euthyroid Sick Syndrome (17%), followed by subclinical hypothyroidism (11%) and subclinical hyperthyroidism (5%). Abnormal thyroid profiles were significantly more frequent in STEMI patients compared to NSTEMI patients (p = 0.006). However, comparison of mean fT3, fT4, and TSH levels between STEMI and NSTEMI groups did not show statistically significant differences (p > 0.05). 
Conclusion: Thyroid dysfunction is relatively common in patients with acute coronary syndrome, with Euthyroid Sick Syndrome being the most frequent abnormality. A significantly higher prevalence of thyroid abnormalities was observed among STEMI patients, suggesting greater physiological stress in these individuals. Routine evaluation of thyroid hormone status in ACS patients may aid in early risk assessment and clinical management.

13. Clinical Utility of HbA1c in Detecting Dyslipidemia among Patients with Type 2 Diabetes Mellitus in the Saurashtra Region of Gujarat
Happy Chadsaniya, Nimit A. Hinsu, Rashmita Ramani, R. S. Trivedi
Abstract
Background: Type 2 diabetes mellitus (T2DM) is often accompanied by dyslipidemia, a key factor in increasing cardiovascular risk. While HbA1c is commonly used to measure long-term blood sugar control, this study explored whether HbA1c could also help detect lipid abnormalities in diabetic patients, providing a more comprehensive assessment of their cardiovascular health. Aims and Objectives: This study aimed to investigate the relationship between HbA1c levels and dyslipidemia markers in T2DM patients, to determine whether HbA1c could serve as a reliable biomarker for early detection of dyslipidemia. Methods and Materials: Participants with elevated HbA1c levels were included in the study. After obtaining informed consent, their medical histories were recorded. Diagnostic procedures included HbA1c measurement and a detailed lipid profile analysis. Statistical analysis, including unpaired t-tests, was performed to assess the correlation between HbA1c and dyslipidemia markers. Results: The findings showed a strong association between higher HbA1c levels and increased dyslipidemia markers, particularly elevated triglycerides and lower HDL cholesterol levels. This supported the hypothesis that HbA1c can indicate lipid abnormalities in diabetic patients (p < 0.05). Discussion: These results suggest that HbA1c not only reflects glycemic control but could also be a valuable tool in predicting lipid abnormalities. Using HbA1c as a dual-purpose marker could improve cardiovascular risk assessment and lead to more personalized care for T2DM patients. Conclusion: The study concluded that HbA1c is a promising biomarker for identifying dyslipidemia in T2DM patients. Routine inclusion of HbA1c in lipid screening could enhance early detection and management of cardiovascular risk in diabetic care.

14. Acute and Subacute Effect of Spinal Subarachnoid Block on Intraocular Pressure: A Comparative Study
Trishna Sahu, Aparajita Banerjee, Meenakshi Pandey, Ambika Prasad Panda
Abstract
Background & Aim: Subarachnoid block is a commonly used anesthetic technique for many infra-umbilical surgical procedures. It can result in complications such as hypotension, bradycardia, local anesthetic toxicity, post-dural-puncture headache, backache, and nerve damage. Prevention and treatment of these complications are well documented, but its effects on intraocular pressure (IOP) have not been well studied. The aim of our study was to assess the effects of spinal anesthesia on intraocular pressure. Material and Methods: Fifty patients posted for infra-umbilical surgery under subarachnoid block were included in the study. Intraocular pressure was measured prior to spinal anesthesia (PS), 20 minutes after spinal anesthesia (AS), and finally on the first postoperative day (POD1), and the values were compared. Both eyes of each patient were included in the study. Hemodynamic and block characteristics were monitored and compared. Results: Mean intraocular pressure was 17.9±3.53 mm Hg prior to anesthesia, 15.77±2.82 mm Hg 20 minutes after spinal anesthesia, and 16.83±3.39 mm Hg on the first postoperative day, the difference among them being statistically non-significant. Conclusions: Spinal anesthesia can result in a decrease in IOP, which may result from a decrease in mean arterial pressure.

15. Metabolic Abnormalities in Bipolar Disorder: A Clinical and Biochemical Analysis
Md. Shahnwaz, Anjana Kumari, Sukant Shekhar, Arati Shivhare
Abstract
Background: Bipolar disorder is associated with an increased risk of metabolic abnormalities, contributing to significant morbidity and premature mortality. Aim: To evaluate metabolic abnormalities and their clinical correlates in patients with bipolar disorder. Methods: This cross-sectional study included 61 drug-free patients diagnosed with bipolar disorder as per ICD-10 criteria at a tertiary care center in Bihar, India. Anthropometric parameters, blood pressure, fasting blood glucose, and lipid profile were assessed. Clinical variables were evaluated using the Young Mania Rating Scale (YMRS) and Hamilton Depression Rating Scale (HAM-D). Results: Metabolic syndrome was identified in 39.3% of participants. Patients with metabolic abnormalities had significantly higher waist circumference, blood pressure, fasting blood glucose, triglyceride levels, and low-density lipoprotein cholesterol, along with lower high-density lipoprotein cholesterol (p < 0.05). A significant association was observed between metabolic abnormalities and the number of lifetime manic episodes. Conclusion: Metabolic abnormalities are highly prevalent in bipolar disorder and are associated with illness severity. Routine metabolic screening should be incorporated into standard psychiatric care.

16. Tonsillectomy and its Effect on ASO Titre: A Hospital Based Study
Sweta Kumari, Manoj Kumar, Md Ozair, Rani Rashmi Priya
Abstract
Background: Acute tonsillitis is one of the most common manifestations of upper respiratory tract infection. It is common in children, with an incidence of about 32 per 1000 patients per year. The objective of this study was to determine the effect of tonsillectomy on ASO titre and to evaluate the sensitivity and specificity of throat swab culture. Methods: In this prospective study, a total of 50 children were screened, of whom 25 patients under the age of 15 years (16 male and 9 female) with chronic tonsillitis and a raised anti-streptolysin O titre (>200 IU/ml) were included. All patients underwent tonsillectomy, and serological estimation of the ASO titre was done at the end of the first, second, and third month post-surgery. Throat swab culture was performed prior to tonsillectomy and at the third month of follow-up. Results: Twelve children (48%), twenty children (80%), and twenty-two children (88%) became serologically negative for ASO antibody at the end of the first, second, and third month respectively, a statistically significant change (p = 0.0001). The sensitivity and specificity of throat swab culture were 16% and 100% respectively. Conclusions: Tonsillectomy has a significant role in reducing serological levels of anti-streptolysin O antibody and its reactivation, thereby decreasing the rate of complications associated with Group A beta-haemolytic streptococci.

17. Evaluation of Anaemia Profile in CKD Patients and Its Correlation with Erythropoietin Levels
Mohammed Abdul Salam Haroon Rashid Tamboli, Mahesh Balkishan Soni
Abstract
Background: Anaemia is a common and early complication of chronic kidney disease (CKD), primarily attributed to reduced erythropoietin (EPO) production. The severity of anaemia increases with disease progression and contributes significantly to morbidity and mortality. This study aimed to assess the anaemia profile in CKD patients and evaluate its correlation with serum erythropoietin levels. Materials and Methods: A cross-sectional study was conducted on 120 CKD patients in the Department of Medicine at Parbhani Medical College and Hospital, Parbhani, Maharashtra. Haematological parameters including haemoglobin (Hb), haematocrit (Hct), red cell indices, serum iron, ferritin, and total iron-binding capacity (TIBC) were assessed. Serum erythropoietin levels were measured using ELISA. CKD staging was done based on estimated glomerular filtration rate (eGFR). Statistical analysis included ANOVA and Pearson correlation. Results: The mean haemoglobin levels significantly decreased with advancing CKD stages (p<0.001). Normocytic normochromic anaemia was the predominant type (68%). Serum erythropoietin levels were inappropriately low relative to the degree of anaemia. A significant positive correlation was observed between Hb and EPO levels (r=0.62, p<0.001), while an inverse correlation was found between CKD stage and Hb levels. Conclusion: Anaemia in CKD is predominantly due to inadequate erythropoietin production. Early detection and monitoring of EPO levels along with haematological parameters are crucial for timely management and prevention of complications.

18. Social Media Addiction and Self Esteem among Adolescent Students in Srikakulam, India
Ch. Krishna Deepak, V. Padma, D. Vijaya Lakshmi, T. Akhila
Abstract
Background: Adolescence is a critical developmental stage characterized by rapid physical, emotional, and social changes. During this period, individuals begin to form their identity, develop interpersonal relationships, and shape their self-concept. Social media use is widespread among adolescents, and excessive engagement has been associated with adverse psychological outcomes, including behavioral addiction and impaired well-being. One of the important psychological factors influenced by social media use is self-esteem. Aim: This study aimed to assess social media addiction and self-esteem among adolescent students in Srikakulam, India, and the relationship between them. Material and Methods: A cross-sectional observational study was conducted among 200 adolescent students from classes 9th, 10th, Intermediate, and first-year MBBS in Srikakulam district, Andhra Pradesh. Participants were selected using a cluster-based sampling method. Sociodemographic details were collected, and social media addiction and self-esteem were assessed using the Bergen Social Media Addiction Scale and the Rosenberg Self-Esteem Scale, respectively. Data were analyzed using descriptive statistics and Chi-square tests. Results: In the present study of 200 adolescent students, the majority of participants (84%) were classified as low risk for social media addiction, while 12.5% were at risk and only 3.5% were in the high-risk category. Social media addiction was slightly more common among males (73 low risk, 15 at risk, 4 high risk) compared to females (95 low risk, 10 at risk, 3 high risk). However, age group and place of stay showed significant associations, with higher risk observed among older adolescents (>18 years) and those residing in urban areas (p = 0.003 and p = 0.03, respectively). With regard to self-esteem, the majority of students (90%) demonstrated average self-esteem, while 8% had high self-esteem and only 2% had low self-esteem.
Conclusion: The present study concludes that while social media use is common among adolescent students, most adolescents are able to manage their usage without developing severe addiction, and their self-esteem levels remain largely stable. However, special attention should be given to older adolescents and those living in urban areas, as they may be more vulnerable to problematic social media use. Promoting digital awareness, balanced social media habits, and positive self-concept among adolescents may help prevent potential negative psychological effects in the future.

19. Pattern of Febrile Illness in Children Admitted to Pediatric Ward
Dasari Mounika, Mamidi Akhilesh, Sravan Kumar Kusuma
Abstract
Background: Fever is one of the most common reasons for pediatric hospital admission, with varied etiologies ranging from self-limiting viral illnesses to severe life-threatening infections. Aim: To study the pattern and etiological distribution of febrile illnesses among children admitted to the paediatric ward. Methods: This prospective observational study was conducted from January to June 2025 and included 120 children aged 1 month to 12 years admitted with fever. Detailed clinical evaluation and relevant laboratory investigations were performed. Data were analyzed using descriptive statistics and appropriate tests of significance. Results: The majority of children were aged 1–5 years (43.3%) with male predominance (58.3%). Acute respiratory infections (26.7%) were the most common cause, followed by acute gastroenteritis (15.0%), dengue (11.7%), and enteric fever (10.0%). Vector-borne diseases accounted for 26.7% of cases. Laboratory findings revealed anemia (31.7%), thrombocytopenia (18.3%), elevated CRP (48.3%), and liver enzyme derangement (21.7%). Most children recovered (91.7%), while 6.7% required intensive care; mortality was 0.8%. Conclusion: Infectious diseases, particularly respiratory and vector-borne illnesses, remain leading causes of paediatric febrile admissions, emphasizing the need for early diagnosis and timely management.

20. A Study on Arrhythmic Manifestations During the Acute Stage of Myocardial Infarction
Sasumana Ravi Kumar, Narisetty Vijay Prem Chand, Thadisetty Lilly Pushpa, Vasa Vijaya Kumar
Abstract
Background: Acute Myocardial Infarction (AMI) remains a major cause of morbidity and mortality worldwide. Cardiac arrhythmias are among the most frequent and potentially life-threatening complications occurring during the acute phase of myocardial infarction. Early identification and management of these rhythm disturbances are essential to improve patient outcomes. Objective: To study the arrhythmic manifestations occurring during the acute stage (first week) of myocardial infarction and to evaluate their clinical significance and impact on patient outcomes. Methods: This prospective observational study was conducted at a tertiary care hospital in Vijayawada, Andhra Pradesh, including 50 patients diagnosed with AMI (both STEMI and NSTEMI) admitted within 24 hours of symptom onset between January 2025 and January 2026. Patients were monitored clinically and with electrocardiography for the occurrence of arrhythmias during hospitalization. The incidence, type, and timing of arrhythmias were recorded and correlated with the type of infarction and in-hospital outcomes. Results: Arrhythmias were observed in 60% of patients during the acute phase of AMI. The majority occurred within the first 24 hours of admission. The most common arrhythmia was ventricular premature complexes (33.3%), followed by atrial fibrillation (20%), ventricular tachycardia (16.7%), and ventricular fibrillation (10%). Arrhythmias were more frequent in ST-elevation myocardial infarction (STEMI) compared to non-ST-elevation myocardial infarction (NSTEMI). Patients who developed arrhythmias had higher rates of complications, including heart failure (33.3%), cardiogenic shock (20%), and sudden cardiac death (10%). Mortality was higher among patients with arrhythmias (16.7%) compared to those without arrhythmias (5%). Conclusion: Arrhythmias are common during the acute stage of AMI, particularly within the first 24 hours. 
Ventricular premature complexes were the most frequently observed rhythm disturbance, while ventricular tachycardia and fibrillation were associated with increased mortality. Continuous cardiac monitoring and prompt management of arrhythmias are crucial in reducing complications and improving survival in patients with acute myocardial infarction.

21. Microbiological Profile of Bloodstream Infections and Its Correlation with Biochemical Inflammatory Markers C – reactive protein, Procalcitonin and Serum Lactate in Suspected Sepsis Patients
Nirmalkumar A. Shah, Aruna V. Gautam, Parin N. Shah
Abstract
Bloodstream infections are a major cause of morbidity and mortality among hospitalized patients. While blood culture is the reference method for confirming infection, detection may be slow and sometimes insensitive. Evaluating microbiological isolates alongside inflammatory biomarkers including C-reactive protein, procalcitonin, and lactate may support diagnosis and improve management. Objectives: To evaluate the microbiological profile of bloodstream infections and correlate blood culture positivity with biochemical inflammatory markers in suspected sepsis patients. Methods: A prospective cross-sectional observational study was conducted at a tertiary care teaching hospital over a period of 18 months. Adult patients with clinical suspicion of bloodstream infection and for whom blood cultures were obtained were included at the time of first sampling. Demographic and relevant clinical data were collected from medical records. Serum levels of C-reactive protein, procalcitonin, and lactate measured on the day of blood culture collection were recorded from the biochemistry laboratory database. Blood culture results, identification of microbial isolates, and corresponding antibiotic susceptibility patterns were obtained from the microbiology laboratory. Levels of the biochemical inflammatory markers were compared between culture-positive and culture-negative groups. Among culture-positive cases, the relationship between biomarker levels and microbial characteristics, including antibiotic resistance patterns, was further evaluated. Statistical analysis was performed using appropriate statistical tests, and a p-value of less than 0.05 was considered statistically significant. Results: A total of 150 patients with clinical suspicion of bloodstream infection were enrolled in the study. Blood cultures yielded microbial growth in 32 (21.3%) patients, while 118 (78.7%) samples showed no growth. 
Among the positive cultures, Gram-negative organisms accounted for the majority of isolates, followed by Gram-positive bacteria. Inflammatory biomarker levels were notably higher in patients with positive blood cultures compared with those with negative results. The mean C-reactive protein level in culture-positive patients was 85.6 ± 27.9 mg/L, significantly greater than the 43.5 ± 18.8 mg/L observed in culture-negative patients (p < 0.001). Similarly, mean procalcitonin levels were markedly elevated in culture-positive patients (5.7 ± 2.5 ng/mL) compared with culture-negative patients (1.3 ± 0.9 ng/mL) (p < 0.001). Mean serum lactate levels were also higher in the culture-positive group (3.7 ± 1.2 mmol/L) compared with the culture-negative group (2.0 ± 0.8 mmol/L) (p < 0.001). Among the culture-positive isolates, 17 (53.1%) were identified as multidrug-resistant organisms based on antimicrobial susceptibility testing, whereas 15 (46.9%) were non-multidrug-resistant strains. Patients with multidrug-resistant infections demonstrated higher levels of inflammatory biomarkers compared with those infected by susceptible organisms. The mean C-reactive protein level in multidrug-resistant infections was 100.8 ± 23.5 mg/L compared with 68.4 ± 19.1 mg/L in non-resistant infections (p = 0.003). Procalcitonin levels were also higher in multidrug-resistant infections (7.2 ± 2.0 ng/mL) compared with sensitive isolates (4.0 ± 1.5 ng/mL) (p = 0.002). Serum lactate values followed a similar trend, showing greater elevation in multidrug-resistant infections. Overall, increased levels of these biochemical inflammatory markers were significantly associated with blood culture positivity and antimicrobial resistance patterns among the isolated microorganisms. Conclusion: Bloodstream infections remain a significant cause of morbidity in patients with suspected sepsis. This study showed that patients with culture-positive infections had markedly higher levels of C-reactive protein, procalcitonin, and serum lactate compared with culture-negative cases. Elevated biomarker levels were also associated with multidrug-resistant infections, indicating that these markers may assist in early diagnosis and support timely clinical decision-making in suspected bloodstream infections.

22. Clinicopathological Profile of Ocular Tumors: A Retrospective Study
Anoop Kumar, Jai Prakash Srivastava, Vijay Kumar Srivastava, Nandini Srivastava
Abstract
Background: Ocular tumors encompass a wide spectrum of benign and malignant lesions with varied clinical presentation and prognosis. Clinicopathological evaluation plays a crucial role in accurate diagnosis and management. This study aimed to analyze the clinicopathological profile of ocular tumors in a tertiary care setting. Material and Methods: A retrospective observational study was conducted over five years, including 102 histopathologically confirmed cases of ocular tumors. Demographic details, anatomical site, and histopathological findings were recorded. Tumors were classified as benign or malignant. Statistical analysis was performed using descriptive statistics and Chi-square test, with p <0.05 considered significant. Results: Out of 102 cases, the majority were in the 41–60 years age group (29.4%) with male predominance (56.9%). The eyelid was the most commonly involved site (35.3%), followed by conjunctiva (23.5%), orbit (19.6%), and intraocular region (17.6%). Benign tumors constituted 62.7% of cases, while malignant tumors accounted for 37.3%. Among benign lesions, dermoid cysts (13.7%) and nevi (11.8%) were most frequent. Sebaceous gland carcinoma (9.8%) was the most common malignant tumor. A statistically significant association was observed between tumor behavior and anatomical site (p = 0.032), with intraocular tumors showing a higher proportion of malignancy. Conclusion: Benign ocular tumors predominate; however, a considerable proportion of malignant lesions exists, particularly in intraocular locations. Site-specific variations in tumor behavior underscore the importance of early diagnosis and histopathological evaluation for optimal management.

23. Comparative Therapeutic Outcomes of Cefixime versus Amoxicillin-Clavulanic Acid in Community-Acquired Bacterial Infections
Surabhi Arora, Anshita Arora
Abstract
Background: Community-acquired bacterial infections are a major cause of morbidity and require effective oral antibiotic therapy. This study compared the therapeutic efficacy and safety of cefixime versus amoxicillin-clavulanic acid in such infections. Material and Methods: A prospective, randomized, open-label study was conducted among 220 adult patients with community-acquired bacterial infections. Participants were equally allocated to receive either cefixime (400 mg once daily) or amoxicillin-clavulanic acid (625 mg thrice daily) for 7–10 days. Clinical response, time to symptom resolution, microbiological eradication, recurrence, and adverse events were assessed. Statistical analysis was performed using standard methods with a significance level of p<0.05. Results: Baseline characteristics were comparable between groups. Clinical cure was achieved in 89.1% of patients in the cefixime group and 85.5% in the amoxicillin-clavulanic acid group (p=0.41). Clinical improvement was observed in 7.3% and 9.1%, while treatment failure occurred in 3.6% and 5.5% of patients, respectively. Mean time to symptom resolution was 3.6 ± 1.1 days for cefixime and 3.9 ± 1.2 days for amoxicillin-clavulanic acid (p=0.07). Microbiological eradication rates were 86.7% and 82.8%, respectively (p=0.54). Recurrence rates were low in both groups (4.5% vs 6.4%, p=0.55). Adverse events were significantly lower with cefixime (10.9%) compared to amoxicillin-clavulanic acid (20.0%) (p=0.048). Conclusion: Cefixime demonstrated comparable efficacy with a better safety profile than amoxicillin-clavulanic acid, making it a suitable alternative for community-acquired bacterial infections.

24. Understanding Dengue Awareness and Preventive Practices Among Urban Population of Saurashtra, Gujarat
Hardikkumar Bharatbhai Kalariya, Chaitanyakumar Mahadevbhai Aghara, Parth M. Maheta
Abstract
Background: Dengue is a major public health problem in urban India, with recurrent outbreaks causing significant morbidity. Understanding community knowledge, attitude, and practices (KAP) is essential for effective prevention and control. Objective: To assess the knowledge, attitude, and practices regarding dengue among adults residing in the Urban Health Training Centre (UHTC) area of Jamnagar city, Western Gujarat, and to determine the association between knowledge, attitude, and practices. Methods: A community-based cross-sectional study was conducted over two months among adults aged ≥18 years residing in the UHTC area of Jamnagar. A total of 500 eligible participants were included using house-to-house visits, with more than one eligible adult interviewed per household when available. Data were collected using a pretested structured questionnaire covering socio-demographic variables and KAP related to dengue. Knowledge, attitude, and practice scores were categorized using median cut-off values. Data were analysed using descriptive statistics and Chi-square test. Results: Among the respondents, 47.0% demonstrated good knowledge, 90.4% had a good attitude, and 55.6% exhibited good preventive practices regarding dengue fever. High awareness was observed regarding mosquito breeding in stagnant water and common dengue symptoms. However, misconceptions related to disease transmission and reliance on fogging alone were noted. A statistically significant association was found between knowledge and attitude (p = 0.01), while the association between attitude and practice was not statistically significant (p = 0.29). Conclusion: Despite a positive attitude towards dengue prevention, gaps in knowledge and preventive practices persist among community members. Strengthening targeted health education and behaviour change communication interventions is essential to translate awareness into effective preventive practices.

25. Serum Ferritin as a Predictor of Disease Severity and Platelet Transfusion Requirement in Dengue: A Prospective Observational Study
Nilesh Kumar Patira, Nirali Salgiya, Jamil Mohammad, Dhairya Upadhyay
Abstract
Background: Dengue infection presents with a wide clinical spectrum ranging from mild febrile illness to severe disease with bleeding and shock. Platelet count alone is an unreliable predictor of disease severity and transfusion need. Serum ferritin, an acute-phase reactant reflecting macrophage activation and immune dysregulation, may serve as a robust biomarker for predicting severe dengue. Objectives: To evaluate serum ferritin levels as a predictor of disease severity and platelet transfusion requirement in patients with dengue infection. Methods: This prospective observational study was conducted in a tertiary care hospital and included 153 adult patients with laboratory-confirmed dengue infection and complete serum ferritin data. Serum ferritin was measured at admission, platelet counts were monitored serially, and the requirement for platelet transfusion during hospitalization was recorded. Receiver operating characteristic (ROC) curve analysis was performed to assess the predictive performance of serum ferritin. Results: Of the 153 patients, 48 (31.4%) required platelet transfusion. Median serum ferritin levels were markedly higher in patients requiring platelet transfusion compared with those who did not (≥2000 ng/mL vs 430 ng/mL). ROC analysis demonstrated excellent predictive performance of serum ferritin for platelet transfusion requirement (AUC 0.885). A ferritin cutoff of approximately 791 ng/mL predicted platelet transfusion with 94% sensitivity and 75% specificity. Conclusion: Serum ferritin is a readily available biomarker strongly associated with disease severity and platelet transfusion requirement in dengue. Incorporation of ferritin into routine clinical assessment may improve early risk stratification and promote rational blood product utilization.
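As an illustrative aside, an ROC-derived cutoff like the one reported above is commonly chosen by maximising Youden's index (sensitivity + specificity − 1) across candidate thresholds. The sketch below is a minimal pure-Python illustration of that procedure on invented ferritin values; the data, variable names, and function are hypothetical, not the study's dataset or code.

```python
# Hypothetical sketch: pick a biomarker cutoff by maximising Youden's J.
# "ferritin" and "transfused" below are invented illustration data only.
def youden_cutoff(ferritin, transfused):
    """Return (J, cutoff, sensitivity, specificity) for the best threshold.

    A patient is called 'test positive' when ferritin >= cutoff; the
    reference standard is whether a platelet transfusion was required.
    """
    best = None
    for c in sorted(set(ferritin)):
        tp = sum(f >= c and t for f, t in zip(ferritin, transfused))
        fn = sum(f < c and t for f, t in zip(ferritin, transfused))
        tn = sum(f < c and not t for f, t in zip(ferritin, transfused))
        fp = sum(f >= c and not t for f, t in zip(ferritin, transfused))
        sens = tp / (tp + fn)   # denominator = all transfused patients
        spec = tn / (tn + fp)   # denominator = all non-transfused patients
        j = sens + spec - 1
        if best is None or j > best[0]:
            best = (j, c, sens, spec)
    return best

# Invented example: perfectly separable data, so the best cutoff
# achieves sensitivity = specificity = 1.0 at the lowest "case" value.
ferritin = [300, 400, 500, 900, 1200, 2500]
transfused = [False, False, False, True, True, True]
print(youden_cutoff(ferritin, transfused))  # (1.0, 900, 1.0, 1.0)
```

With real, overlapping distributions the maximiser trades sensitivity against specificity, which is how a cutoff can show 94% sensitivity but only 75% specificity, as reported in the study.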

26. Continuous Femoral Nerve Block with Equipotent Doses of Bupivacaine and Ropivacaine for Postoperative Analgesia after Unilateral Total Knee Replacement: A Comparative Study
Arun Aravind, Jasir P., Silpa A. R.
Abstract
Background: Postoperative pain after total knee replacement (TKR) can delay early mobilisation and rehabilitation. Continuous femoral nerve block is an effective technique for postoperative analgesia. Ropivacaine, with lower cardiotoxic potential and reduced motor blockade, may offer advantages over bupivacaine at equipotent concentrations. Objective: To compare the efficacy and safety of equipotent doses of ropivacaine 0.2% and bupivacaine 0.125% administered through ultrasound-guided continuous femoral nerve block for postoperative analgesia after unilateral total knee replacement. Methods: This prospective observational cohort study included 80 ASA I–II patients undergoing elective unilateral TKR under subarachnoid block. Patients were divided into two groups: Group R received 0.2% ropivacaine and Group B received 0.125% bupivacaine via a femoral nerve catheter. A 20 ml bolus followed by continuous infusion at 5 ml/hr was administered for 48 hours. Postoperative pain was assessed using the Numerical Rating Scale (NRS) at specified intervals up to 48 hours. Motor blockade was evaluated using the Modified Bromage Scale. Rescue analgesic consumption and adverse effects were recorded. Results: Both groups were comparable with respect to demographic parameters. NRS pain scores at all postoperative time intervals were similar between the two groups (p > 0.05). Rescue analgesic consumption did not differ significantly. At 24 hours, motor blockade was significantly less in Group R compared to Group B (p = 0.014). No motor blockade was observed at 48 hours in either group. No significant adverse effects were noted. Conclusion: Ultrasound-guided continuous femoral nerve block using 0.2% ropivacaine provides postoperative analgesia comparable to 0.125% bupivacaine following total knee replacement, with the advantage of reduced motor blockade. Ropivacaine may therefore be a preferable agent for facilitating early postoperative rehabilitation.

27. Radial Shortening as a Treatment Modality in an Advanced Stage of Lunatomalacia in Tertiary Care Centre of Western Rajasthan
Akhilesh Kumar Sharma, Raghuveer Meena, Anshul Meena, Ajay Gupta
Abstract
Introduction: Lunatomalacia, or osteonecrosis of the lunate, can lead to chronic, debilitating wrist pain. This study was done with the objective of evaluating the risks and benefits of radial shortening in advanced-stage lunatomalacia at a tertiary care centre in Western Rajasthan. Materials and Methods: This hospital-based observational study was conducted between April 2016 and May 2022. A total of 24 cases that had undergone radial shortening for the treatment of lunatomalacia were included in our study. Cases were clinically evaluated for pain, range of motion, and grip strength. Pain was quantified by the Visual Analogue Scale (VAS), range of motion was measured with a goniometer, and grip strength was measured relative to the contralateral side with the help of a dynamometer. Data were analysed using Microsoft Excel 2019. Results: The mean age of cases was 33.83 ± 11.65 years. The male to female ratio was 2:1. Range of motion, grip strength, wrist extension, and wrist flexion in the operated hand increased after the procedure, and this improvement was statistically significant (p < 0.05). Postoperative VAS and DASH scores decreased significantly after radial shortening (p < 0.05), and the NKSS increased significantly postoperatively (p < 0.05). Conclusion: Radial shortening is a simple and reproducible procedure with a low complication rate. We conclude that radial shortening is an effective procedure with respect to functional improvement in the treatment of patients with lunatomalacia, even in advanced stages 3A, 3B, and 4; however, radiological improvement is mild.

28. An Observational Study of the Origin and Course of Vertebral Artery in Indian Cadavers
Nakul Choudhary, Rakesh Ranjan
Abstract
Background: The vertebral artery is a vital component of the posterior circulation of the brain, exhibiting considerable anatomical variation in its origin and course. These variations hold significant clinical importance in diagnostic, surgical, and interventional procedures involving the head, neck, and cervical spine. The present observational study was conducted to analyze the origin and course of the vertebral artery in Indian cadavers. Methods: A total of 42 embalmed adult cadavers were dissected in the Department of Anatomy at GMC, Purnea. The vertebral arteries were carefully exposed on both sides, and their origin, level of entry into the transverse foramina, and course through the cervical vertebrae were examined and documented. Any deviations from the typical origin (from the subclavian artery) and the standard entry at the level of the sixth cervical vertebra were recorded. Results: The study revealed that while the majority of vertebral arteries originated from the subclavian artery and entered the transverse foramen at the level of C6, notable variations were observed. These included origin directly from the aortic arch, entry at higher cervical levels such as C5 or C4, and asymmetry between the right and left sides. Such variations may have embryological significance and potential clinical implications, particularly in angiographic interpretation, cervical spine surgeries, and vascular interventions. Conclusion: Awareness of these anatomical variations is crucial for clinicians to avoid complications during surgical and radiological procedures. The findings of this study contribute to the existing anatomical knowledge and emphasize the need for careful preoperative evaluation of vertebral artery anatomy.

29. Pericapsular Precision Versus Compartmental Analgesia: Ultrasound- Guided PENG Block Compared with Fascia Iliaca Block in Patients Undergoing Hip Surgery Under Spinal Anaesthesia
Sri Satya Yeleswarapu, Kota Aditya
Abstract
Background: Effective analgesia before and after hip surgery is essential to facilitate positioning for spinal anaesthesia and improve postoperative recovery. This study compared ultrasound-guided pericapsular nerve group (PENG) block with fascia iliaca compartment block (FICB) in patients undergoing elective hip surgery. Methods: In this prospective randomized double-blind study, 50 patients scheduled for elective hip surgery under spinal anaesthesia were allocated into two equal groups. Group F received ultrasound-guided FICB and Group P received ultrasound-guided PENG block using 20 mL of 0.5% ropivacaine. Pain was assessed using the Numerical Rating Scale before block, 30 minutes after block, and postoperatively up to 24 hours. Ease of spinal positioning, time to first rescue analgesia, rescue analgesic consumption, and haemodynamic variables were also evaluated. Results: Baseline demographic and haemodynamic characteristics were comparable between the groups. Pain scores were significantly lower in Group P at 30 minutes and during the early postoperative period up to 8 hours. Group P also showed better ease of positioning, longer time to first rescue analgesia, and lower rescue analgesic consumption over 24 hours. Haemodynamic parameters were stable and similar in both groups. Conclusion: Ultrasound-guided PENG block provided superior early postoperative analgesia and improved positioning for spinal anaesthesia compared with FICB in elective hip surgery patients.

30. Blood Culture Contamination Reduction: A Quality Improvement Initiative in the NICU, MGM Hospital, Warangal
Kagithapu Surender, Mohan Amgothu, G. Karunakar, Ayesha Begum, P. Srivani, Ragha Sanjana K.
Abstract
Introduction: Neonatal sepsis is a leading cause of neonatal mortality, especially in LMICs, with the highest incidence in the Indian subcontinent. Blood culture (BC) is the gold standard for diagnosis but has limitations such as low positivity, false positives, and delays. This study aimed to reduce BC contamination using the Plan-Do-Study-Act (PDSA) cycle. Methods: This prospective study included all newborns admitted to the neonatal intensive care unit (NICU), excluding those whose parents were uncooperative. A research team collaborated with microbiology experts to understand and reduce BC contamination using a three-cycle PDSA approach. Training, protocol reinforcement, and supply monitoring improved compliance, significantly reducing BC contamination and leading to large-scale implementation of the intervention. Results: The study demonstrated a significant reduction in BC contamination rates from 10.7% (20) to 1.9% (5) through PDSA cycles. Compliance with hand hygiene, personal protective equipment (PPE) usage, antiseptic application, and proper sample collection improved notably. These structured interventions enhanced infection control, minimizing unnecessary antibiotic use and hospital stays in the NICU. Conclusion: Implementation of a structured PDSA cycle-driven intervention significantly improved adherence to aseptic BC collection techniques, reducing contamination rates. The intervention emphasized staff training, supply monitoring, and adherence to standard protocols, leading to sustained improvements.

31. Retroverted Uterus Revisited: Anatomical and Hemodynamic Insights into Primary Infertility
Sangita Ashokrao Gore, Abhilasha Jain
Abstract
Background and Objective: The natural orientation of the uterus varies, with a retroverted position historically considered a benign physiological variant. However, its isolated impact on female reproduction remains debated. This study aims to evaluate the association between uterine position (anteverted versus retroverted) and primary infertility using detailed anatomical and sonographic assessments, in the absence of confounding pelvic pathologies. Methods: A prospective, cross-sectional observational study was conducted at a tertiary care center in Maharashtra from April 2023 to December 2024. The study included 90 women (N = 90) diagnosed with primary infertility. Participants underwent transvaginal sonography (TVS) during the early follicular phase. Spatial uterine orientation, cervico-uterine angle, cervical canal length, and uterine artery Doppler indices (Pulsatility Index [PI] and Resistance Index [RI]) were measured. Participants with secondary infertility, male factor infertility, or severe pelvic pathologies like deep endometriosis were excluded. Results: Of the 90 participants, 52 (57.8%) exhibited an anteverted uterus, while 38 (42.2%) had a retroverted uterus, a prevalence notably higher than in the general fertile population. The retroverted group demonstrated a significantly sharper mean cervico-uterine angle (102.4° ± 12.1° vs. 134.6° ± 10.5°, p < 0.001) and an elongated cervical canal (3.6 ± 0.4 cm vs. 3.2 ± 0.3 cm, p < 0.001). Doppler analysis revealed significantly elevated vascular resistance in the retroverted cohort (Mean PI: 2.84 vs. 2.32, p < 0.001; Mean RI: 0.88 vs. 0.76, p < 0.001). Furthermore, women with a retroverted uterus reported higher incidences of severe dysmenorrhea (p = 0.012) and were significantly more likely to experience prolonged infertility exceeding 4 years (p = 0.038). Conclusion: An isolated retroverted uterus is significantly associated with primary infertility. 
The altered cervical geometry and compromised uterine hemodynamics observed in retroverted uteri may serve as mechanical and physiological barriers to natural conception, suggesting it should be evaluated as a clinically significant anatomical factor in routine fertility workups.

32. Comparison of Ultrasound Guided Hydrodistension of the Shoulder Joint By Anterior Versus Posterior Approach in Primary Adhesive Capsulitis
Akash Yadav, Deepak Kumar Saini, Parth Kaushik, Siddhant Jain
Abstract
Background: Adhesive capsulitis, or frozen shoulder, presents significant functional impairment due to pain and stiffness. Ultrasound-guided hydrodistension has been recognized for its potential in treating this condition, but the optimal approach remains unclear. Methods: This prospective observational study at the Central Institute of Orthopaedics involved 40 patients with primary adhesive capsulitis, randomized into two groups to receive hydrodistension via either the anterior or posterior approach. Outcomes measured included the Visual Analog Scale (VAS) for pain and the degree of passive external rotation, assessed at baseline, 4 weeks, and 12 weeks post-intervention. Results: Both groups started with comparable pain levels and mobility restrictions; however, the anterior approach group showed greater improvement. At 12 weeks, the anterior group’s pain scores and external rotation were superior to those of the posterior group. Conclusion: The anterior approach to ultrasound-guided hydrodistension is more effective in managing pain and improving mobility in patients with adhesive capsulitis compared to the posterior approach.

33. Diagnostic Accuracy of Ultrasound-Guided Fine Needle Aspiration Cytology in Salivary Gland Lesions with Correlation to Histopathological Examination (HPE) Findings
Anil Kumar, Harendra Kumar, Asim Mishra
Abstract
Background: Salivary gland lesions comprise a diverse group of neoplasms with varying histopathological features and biological behavior. Accurate preoperative diagnosis is essential for appropriate management. Ultrasonography-guided fine needle aspiration cytology (USG-guided FNAC) has emerged as a valuable diagnostic tool, though its accuracy requires validation against histopathological examination (HPE), the gold standard. Aim: To evaluate the diagnostic accuracy of USG-guided FNAC in salivary gland lesions and to assess its correlation with histopathological findings. Materials and Methods: This prospective observational study was conducted in the Department of Pathology at Anugrah Narayan Magadh Medical College & Hospital, Gaya, over a period of two years (January 2024 to December 2025). A total of 60 patients with clinically suspected salivary gland neoplasms were included. All patients underwent USG-guided FNAC followed by surgical excision and histopathological examination. Diagnostic performance parameters such as sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and overall accuracy were calculated. Statistical analysis was performed using SPSS version 27.0, with a p-value <0.05 considered significant. Results: Out of 60 cases, females constituted 55.0% and males 45.0%, with no significant association between age and gender (p = 0.85). FNAC diagnosed 56.7% cases as benign and 43.3% as malignant, while histopathology confirmed 58.3% benign and 41.7% malignant lesions. Pleomorphic adenoma was the most common benign tumor, and mucoepidermoid carcinoma was the most common malignant tumor. Cytohistopathological correlation showed a highly significant association (χ² = 41.52, p < 0.001). FNAC demonstrated 18 true positives, 37 true negatives, 3 false positives, and 2 false negatives, indicating high diagnostic accuracy. 
Conclusion: USG-guided FNAC is a reliable, minimally invasive, and cost-effective diagnostic modality with high accuracy in differentiating benign and malignant salivary gland lesions. Its strong correlation with histopathological findings supports its role as an effective preoperative diagnostic tool.
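The diagnostic performance parameters quoted above follow directly from the reported confusion matrix (18 true positives, 37 true negatives, 3 false positives, 2 false negatives). A minimal illustrative sketch of that arithmetic, not the authors' analysis code:

```python
# Diagnostic performance from the reported 2x2 confusion matrix
# (TP=18, TN=37, FP=3, FN=2; histopathology as the gold standard).
tp, tn, fp, fn = 18, 37, 3, 2

sensitivity = tp / (tp + fn)                 # proportion of malignant lesions detected
specificity = tn / (tn + fp)                 # proportion of benign lesions correctly ruled out
ppv = tp / (tp + fp)                         # positive predictive value
npv = tn / (tn + fn)                         # negative predictive value
accuracy = (tp + tn) / (tp + tn + fp + fn)   # overall agreement with HPE

print(f"Sensitivity {sensitivity:.1%}, Specificity {specificity:.1%}, "
      f"PPV {ppv:.1%}, NPV {npv:.1%}, Accuracy {accuracy:.1%}")
```

These figures yield a sensitivity of 90.0% and specificity of 92.5%, consistent with the "high diagnostic accuracy" reported.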

34. Correlation between Bone Marrow Plasma Cell Morphology and Cytogenetic Abnormalities in Multiple Myeloma Patients
Anil Kumar, Madhurima Sinha, Asim Mishra
Abstract
Background: Multiple Myeloma (MM) is a plasma cell malignancy characterized by clonal proliferation in the bone marrow, leading to anemia, bone lesions, hypercalcemia, renal dysfunction, and monoclonal protein production. Recent advances highlight the importance of cytogenetic abnormalities as key prognostic indicators. Aim: This study was undertaken to evaluate the relationship between bone marrow plasma cell morphology and cytogenetic abnormalities in MM patients. Materials and Methods: This hospital-based cross-sectional observational study included 100 diagnosed cases of multiple myeloma with available cytogenetic data. Bone marrow aspiration and biopsy samples were analyzed for plasma cell percentage, morphological subtype (plasmacytic, plasmablastic, mixed), and infiltration pattern (nodular, interstitial, diffuse). Cytogenetic evaluation was performed using Fluorescence In Situ Hybridization (FISH) and GTG banding. Statistical analysis was conducted using SPSS, and a p-value < 0.05 was considered significant. Results: A statistically significant association was observed between cytogenetic abnormalities and plasma cell characteristics (p < 0.05). The majority of cases demonstrated a high plasma cell burden (>50%), particularly in those with del(13q14.3) and complex karyotype. Plasmacytic morphology predominated in cases with normal cytogenetics and t(11;14), whereas plasmablastic morphology was more commonly associated with cytogenetic abnormalities, especially complex karyotypes. Diffuse bone marrow infiltration was the most common pattern (52%) and was predominantly associated with high-risk abnormalities such as del(17p13), t(4;14), and t(14;16). In contrast, t(11;14) was associated with nodular and interstitial patterns. Conclusion: Plasma cell morphology, marrow infiltration pattern, and cytogenetic abnormalities show a significant correlation in multiple myeloma. 
Their combined evaluation enhances understanding of disease biology and may improve prognostic stratification and clinical management.

35. A Morphometric and Morphological Analysis of the Bicipital Groove of the Humerus
Sigraf Tarannum, Amit Kumar Prasad, Umesh Prasad Sinha
Abstract
Background: The bicipital groove (intertubercular sulcus) of the humerus plays a crucial role in guiding and stabilizing the tendon of the long head of the biceps brachii. Variations in its morphology and morphometry are clinically significant, as they may influence tendon stability and predispose to various shoulder pathologies. Aim: To perform a detailed morphometric and morphological analysis of the bicipital groove and to evaluate side-wise variations and their clinical significance. Materials and Methods: This descriptive cross-sectional study was conducted on 90 adult dry human humeri obtained from the departmental osteology collection. Morphometric parameters, including length, width, depth, medial wall length, and lateral wall length, were measured using a digital Vernier caliper. Morphological features such as supratubercular ridge, wall thickening, and bony spurs were assessed by visual inspection. The bicipital groove was classified based on opening angle, medial wall angle, and depth using established criteria. Data were analyzed using SPSS version 27.0. Continuous variables were expressed as mean ± standard deviation, while categorical variables were presented as frequency and percentage. Student’s t-test and Chi-square test were applied, with p < 0.05 considered statistically significant. Results: Most morphometric parameters showed no significant side-wise differences (p > 0.05), except for lateral wall length (p = 0.03), width (p = 0.01), and opening wall angle (p = 0.02), which demonstrated significant variation. The majority of specimens (64.4%) exhibited a moderate depth (4–6 mm), followed by deep (22.2%) and shallow (13.3%) grooves. The small opening angle category (≤95°) was the most common (42.2%), with no significant side-wise association (χ² = 0.32, p = 0.85). 
Morphological variations were observed in 87.8% of specimens, with the supratubercular ridge (Meyer’s) being the most frequent finding (37.8%), followed by medial wall thickening (21.1%), lateral wall thickening (16.7%), and bony spurs (12.2%). No significant side-wise differences were noted for these variations (p > 0.05). Conclusion: The bicipital groove demonstrates overall bilateral symmetry with selective asymmetry in certain morphometric parameters. Moderate groove depth and smaller opening angles predominate, while morphological variations are common but symmetrically distributed. These findings have important clinical implications for understanding shoulder biomechanics, diagnosing tendon disorders, and guiding surgical interventions involving the proximal humerus.

36. A Morphometric Study of the Body, Pedicles, and Laminae of Typical Thoracic Vertebrae in Humans: A Cross-Sectional Study
Sigraf Tarannum, Amit Kumar Prasad, Umesh Prasad Sinha
Abstract
Background: The thoracic vertebrae play a crucial role in maintaining spinal stability, protecting the spinal cord, and facilitating movement. Detailed knowledge of their morphometric characteristics is essential for clinical applications, particularly in spinal instrumentation and surgical procedures. Aim: To perform a detailed morphometric analysis of the vertebral body, pedicle, and lamina of typical thoracic vertebrae and to evaluate bilateral symmetry of pedicle and laminar dimensions. Materials and Methods: This descriptive cross-sectional study was conducted on 110 dry human thoracic vertebrae. Measurements of the vertebral body, pedicle, and lamina were obtained using a digital Vernier caliper. Parameters assessed included anteroposterior diameter (APD), transverse diameter (TD), anterior height (AH), posterior height (PH), pedicle height and width, and lamina height and width. Data were analyzed using SPSS version 27.0. Descriptive statistics were expressed as mean ± standard deviation (SD), and comparisons between left and right sides were performed using the independent samples t-test, with p < 0.05 considered statistically significant. Results: The mean APD, TD, AH, and PH of the vertebral body were 18.1 ± 2.1 mm, 25.7 ± 9.1 mm, 16.4 ± 7.2 mm, and 17.3 ± 4.5 mm, respectively. The transverse diameter and anterior height showed greater variability compared to APD. Pedicle height and width demonstrated no statistically significant differences between the left and right sides (p = 0.72 and p = 0.81, respectively). Similarly, lamina height and width showed no significant bilateral differences (p = 0.96 and p = 0.74, respectively), indicating symmetrical morphology. Conclusion: The study highlights that while vertebral body dimensions exhibit variability, the pedicle and lamina demonstrate significant bilateral symmetry. 
These findings provide valuable anatomical data that can aid in surgical planning, spinal instrumentation, and the design of implants, thereby enhancing the safety and efficacy of thoracic spine procedures.

37. Prognostic Significance of Electrocardiographic Findings in Patients with Acute ST-Elevation Myocardial Infarction
Haresh Jilubhai Boghara, Prakashkumar Vejabhai Jadeja, Parekh Anish Mo Zakaria, Manish Juneja
Abstract
Background: Early electrocardiographic (ECG) findings are crucial for rapid risk stratification in ST-elevation myocardial infarction (STEMI), yet evidence from rural tertiary-care centers in India remains scarce. Parameters such as ischemia grade, rhythm disturbances, and conduction abnormalities may serve as predictors of short-term outcomes. This study aimed to assess the prognostic significance of admission ECG variables in STEMI patients within a rural Indian cohort. Methods: A single-center observational study was conducted over one year, enrolling 182 consecutive STEMI patients presenting to a tertiary hospital in rural India. Baseline clinical characteristics, admission ECG findings, angiographic data, and in-hospital outcomes were collected. Standard ECG definitions were applied, and ischemia severity was graded when appropriate. Associations between ECG features and in-hospital mortality were assessed for statistical significance using p-values. Results: Anterior and inferior STEMI were the most frequent presentations, with the majority of patients in sinus rhythm and within normal heart rate ranges at admission. QRS abnormalities, pathological Q waves, and grade 3 ischemia were noted in a subset of patients. In-hospital mortality was significantly higher among those with anterior wall involvement, tachycardia, atrial fibrillation/flutter, conduction disturbances, pathological Q waves, and grade 3 ischemia, all of which showed strong associations with adverse outcomes. Conclusion: Admission ECG features, particularly conduction abnormalities, arrhythmias, and ischemia grade, remain valuable predictors of in-hospital mortality in STEMI patients. These findings highlight the importance of comprehensive ECG assessment as a rapid, accessible, and cost-effective tool for early risk stratification in rural tertiary-care settings.

38. Cross-Sectional Study on Blood Coagulation Profile and Platelet Indices in Normal Term Pregnancy and Term Pregnancy with Preeclampsia
Suchana Sinha, Raju Gopal Saha, Pradip Sarkar, Rajib Pal, Jhantu Kumar Saha
Abstract
Background: Preeclampsia is a significant hypertensive disorder of pregnancy associated with endothelial dysfunction, altered coagulation, and platelet abnormalities. These haemostatic changes increase the risk of maternal and fetal complications. Evaluating coagulation parameters and platelet indices can aid in early detection, risk stratification, and management of preeclampsia. Methods: A cross-sectional observational study was conducted among 180 term pregnant women (90 normotensive and 90 preeclamptic) at Burdwan Medical College. Participants were selected using simple random sampling. Clinical evaluation and laboratory investigations including BT (Bleeding Time), CT (Clotting Time), PT (Prothrombin Time), APTT (Activated Partial Thromboplastin Time), platelet count, MPV (Mean Platelet Volume), PDW (Platelet Distribution Width), and D-dimer levels were performed. Data were analyzed using SPSS with appropriate statistical tests, considering p ≤ 0.05 as significant. Results: Preeclamptic women showed significantly higher systolic and diastolic blood pressure (p < 0.001). Coagulation parameters (BT, CT, PT, and APTT) were significantly prolonged in the preeclamptic group (p < 0.001). Platelet count was significantly reduced (1.17 ± 0.6 vs. 1.438 ± 0.36 lakh/mm³; p = 0.033), while MPV and PDW were significantly elevated (p < 0.05), indicating increased platelet activation and turnover. D-dimer levels were also significantly higher in preeclamptic women (p < 0.001), reflecting enhanced fibrinolytic activity. Conclusion: Preeclampsia is associated with significant alterations in coagulation profile and platelet indices, indicating a hypercoagulable yet consumption-driven state. Routine monitoring of these parameters can facilitate early diagnosis, assess disease severity, and prevent complications such as DIC and HELLP syndrome, thereby improving maternal and fetal outcomes.

39. Preterm Birth and Its Maternal and Fetal Risk Factors: A Study from a Tertiary Care Center
B. Sanjana, Seema Mahesh Gaded, Sada G. Reddy, Nandish S. Manoli
Abstract
Introduction: Preterm birth, defined as delivery before 37 completed weeks of gestation, is a major contributor to neonatal morbidity and mortality worldwide. It is influenced by multiple maternal, fetal, and socioeconomic factors. Understanding its epidemiology and associated risk factors is essential for improving maternal and neonatal outcomes. Materials and Methods: This hospital-based observational study was conducted in the Department of Obstetrics and Gynecology at a tertiary care center. A total of 90 pregnant women with preterm delivery were included. Data were collected using a pre-designed proforma, including maternal demographics, obstetric history, and fetal parameters. Statistical analysis was performed using SPSS software, and associations were assessed using the Chi-square test, with p < 0.05 considered significant. Results: The majority of women belonged to the 21–30 years age group (44.4%), with a mean age of 27.6 ± 4.8 years. Multigravida constituted 57.8%, and 65.6% had inadequate antenatal care. Anemia (46.7%), PROM (31.1%), and hypertensive disorders (26.7%) were the most common maternal risk factors. Fetal factors included IUGR (20.0%) and multiple gestation (13.3%). Most neonates (72.2%) had low birth weight, with a mean gestational age of 33.4 ± 2.1 weeks. Conclusion: Preterm birth is a multifactorial condition predominantly associated with maternal health status, inadequate antenatal care, and obstetric complications. Early identification and management of modifiable risk factors are essential to reduce its incidence and improve neonatal outcomes.

40. Comparative Study of ACL Reconstruction in Knee Flexed–Leg Vertical Position versus Figure-of-Nine Position: A Randomized Controlled Trial
Pushpraj Chauhan, Pancham Anirudh Yadav, Faisal Naseer Mir
Abstract
Introduction: ACL injury is a common knee ligament injury, often reconstructed using a conventional knee-flexed leg vertical position. The figure-of-nine position improves lateral access and femoral footprint visualization, potentially enhancing tunnel placement and postoperative outcomes. Comparative evidence is limited. Materials and Methods: Fifty patients aged 14–44 years with symptomatic, MRI-confirmed ACL tears were randomized to Group 1 (knee-flexed leg vertical) or Group 2 (figure-of-nine). Hamstring grafts were used, fixed with tibial bioabsorbable screws and femoral cortical buttons. Outcomes included IKDC and Lysholm scores, Lachman, anterior drawer, and pivot shift tests at 6 months and 1 year, along with radiological assessment of femoral and tibial tunnels. Results: At 1 year, Group 2 demonstrated superior functional scores (IKDC 85.10 ± 7.00 vs. 70.30 ± 8.20; Lysholm 92.0 ± 5.30 vs. 79.2 ± 8.00; p < 0.001) and better knee stability. Femoral tunnels were more anatomical in orientation (44.10° ± 4.70° vs. 55.20° ± 5.70°) and posteriorly placed (30.80% ± 5.20% vs. 38.40% ± 7.50%; p < 0.001). Complications were lower in Group 2 (12% vs. 24%; p = 0.014). Conclusion: The figure-of-nine position enhances anatomical tunnel placement, improves knee stability and functional outcomes, and reduces complication rates, making it a safe and effective alternative to the conventional vertical leg position for ACL reconstruction.

41. Clinico-Etiological Profile and Complications of Chronic Liver Disease in Female Patients: A Prospective Observational Study from a Tertiary Care Hospital in Uttar Pradesh
Abhishek Bhardwaj, Ravi Bhardwaj
Abstract
Background: Chronic liver disease in women has a heterogeneous etiological spectrum and often becomes clinically evident after decompensation. Female-only cohorts have shown marked regional variation in etiology and complication burden. Methods: This prospective observational study included 50 female patients with chronic liver disease attending the outpatient department or admitted under the Department of General Medicine at a tertiary care hospital in Uttar Pradesh between May 2024 and October 2025. Participants were enrolled by consecutive sampling. Demographic characteristics, etiology, diabetes status, alcohol use, clinical presentation, ascites, varices, laboratory profile, and status at last contact were recorded. The primary outcome was the clinico-etiological profile and complication burden. Results: The mean age was 55.54 ± 8.29 years, and most participants were 51–60 years old (38%) or 61–70 years old (30%). Metabolic-associated steatotic liver disease was the leading etiology (40%), followed by alcohol-related liver disease (22%), autoimmune hepatitis (14%), hepatitis B virus infection (12%), and hepatitis C virus infection (12%). Diabetes mellitus was present in 50% of patients and was more frequent in metabolic-associated steatotic liver disease than in non-MASLD etiologies (80% vs 30%; p=0.0012). Jaundice was the most common presenting feature (68%). Ascites was present in 56% and esophageal varices in 58%; large varices were present in 18%. Anaemia (72%), hypoalbuminemia (64%), hyperbilirubinemia (64%), thrombocytopenia (62%), and hyponatremia (42%) were common. At last contact, 41 patients (82%) were discharged, 7 (14%) were referred, and 2 (4%) died. Adverse status at last contact was associated with abdominal distension, fever/infection, severe ascites, large varices, hyponatremia, and renal dysfunction. 
Conclusion: Women with chronic liver disease in this cohort had a predominant metabolic burden, frequent diabetes, and a substantial prevalence of portal hypertension-related complications. Advanced clinical and biochemical derangement identified patients with poorer status at last contact.

42. Clinical and Biochemical Correlates of Disease Severity in Oral Submucous Fibrosis: A Prospective Observational Study
Kachoriya Taral, Kanwar Vikrant Singh, Pranshuta Sehgal
Abstract
Background: Oral submucous fibrosis is a chronic, progressive, and potentially malignant disorder associated with areca nut and related chewing habits. In addition to progressive fibrosis and functional restriction, patients may demonstrate measurable haematological and biochemical alterations. Methods: This prospective observational study included 100 patients with oral submucous fibrosis managed in the Department of Otorhinolaryngology from December 2023 to January 2026. Clinical evaluation included symptom profile, site involvement, stage, and mouth opening. Laboratory assessment included complete blood count, erythrocyte sedimentation rate, serum protein, C-reactive protein, serum iron, and serum lactate dehydrogenase. Mouth opening was reassessed at 1 month, 3 months, and 6 months. Results: The mean age was 33.05 ± 8.19 years, and 74.0% of patients were male. Stage III was the most common stage (31.0%). Baseline mouth opening declined progressively from Stage I to Stage IV. Increasing stage was associated with longer chewing duration and greater chewing amount per day. Haemoglobin, mean corpuscular volume, mean corpuscular haemoglobin, platelet count, serum protein, and serum iron declined progressively with advancing stage, whereas total leukocyte count, erythrocyte sedimentation rate, C-reactive protein, and serum lactate dehydrogenase increased progressively. Mouth opening showed strong positive correlation with haemoglobin, mean corpuscular volume, mean corpuscular haemoglobin, platelet count, serum protein, and serum iron, and strong negative correlation with age, complaint duration, chewing burden, total leukocyte count, erythrocyte sedimentation rate, C-reactive protein, and serum lactate dehydrogenase. Follow-up demonstrated progressive improvement in mouth opening, with greater improvement in earlier stages. 
Conclusion: Oral submucous fibrosis showed a clear association between increasing clinical severity, greater habit burden, worsening functional limitation, and progressive haematological and biochemical derangement. Combined clinical and biochemical assessment may provide a broader estimate of disease burden and may assist severity assessment and follow-up.

43. Role of Colour-Assisted Duplex Sonography in the Evaluation of Thyroid Diseases: A Cross-Sectional Study with Histopathological Correlation
Tushar Malik, Shankar Snehit Patil
Abstract
Background: Thyroid disease includes inflammatory, benign nodular, and malignant lesions. Grey-scale ultrasonography is the first-line imaging modality, while colour-assisted duplex sonography provides additional vascular and haemodynamic information that may improve lesion characterization. Methods: This cross-sectional study was conducted in the Department of Radiodiagnosis over 18 months. A total of 70 thyroid lesions were evaluated using grey-scale ultrasonography, colour Doppler, and spectral Doppler. Morphological features, vascularity patterns, and Doppler indices were recorded. Final analysis was correlated with cytopathological and histopathological findings where available. Statistical analysis was performed using SPSS version 26, and a p value of <0.05 was considered statistically significant. Results: The age range was 21-75 years, with a mean age of 44.8 ± 13.3 years and a median age of 43.5 years. Females accounted for 58 cases (82.9%). Papillary thyroid carcinoma was the most common individual diagnosis, seen in 19 lesions (27.1%). Increased vascularity was the most frequent overall Doppler pattern, observed in 28 lesions (40.0%). All inflammatory thyroid lesions showed increased/internal vascularity. In contrast, common benign nodular and goitrous lesions showed either no vascularity or mild/peripheral vascularity, while papillary thyroid carcinoma showed predominantly increased/internal vascularity. Internal/increased vascularity was present in 19 of 21 malignant lesions (90.5%) compared with 14 of 49 benign/non-malignant lesions (28.6%) (p <0.001). Conclusion: Colour-assisted duplex sonography is a useful adjunct in the evaluation of thyroid disease. Doppler vascularity patterns, interpreted together with grey-scale morphology, may help differentiate inflammatory thyroid lesions, benign nodular/goitrous lesions, and papillary thyroid carcinoma. Increased vascularity is not specific for malignancy and should not be interpreted in isolation.

44. A Study of Serological and Clinical Correlation of Dengue in a Tertiary Care Hospital in Gujarat
Siddhi Bharatbhai Mesariya, Kanizfatma Durani, Khushbu Nagar, Jayshri Pethani
Abstract
Background: Dengue is a rapidly emerging mosquito-borne viral infection and a major public health concern in tropical countries like India. The disease presents with a wide clinical spectrum ranging from mild febrile illness to severe dengue with hemorrhage and shock. Early diagnosis using serological markers and correlation with clinical features is essential for timely management and reduction of morbidity and mortality. The objective was to determine the incidence of dengue infection and to study the correlation between clinical features and serological findings in a tertiary care hospital. Methods: This prospective observational study was conducted over a period of two years in a tertiary care hospital. A total of 5168 clinically suspected dengue cases were included. Serum samples were tested using NS1 antigen detection (ELISA and rapid methods) for early cases and IgM capture ELISA for later stages of illness. Demographic, clinical, and laboratory data were recorded and analyzed using descriptive statistics. Results: Out of 5168 suspected cases, 292 (6%) were serologically confirmed. The majority of patients had dengue without warning signs (78%), followed by dengue with warning signs (15%) and severe dengue (7%). The most affected age group was 21–30 years (38%), with male predominance (56%). Fever (100%), chills (95%), headache (87%), vomiting (65%), and myalgia (48%) were the most common presenting features. Hemorrhagic manifestations were observed in 46% of cases. NS1 antigen positivity (52%) was higher than IgM (42%), indicating early presentation in most patients. Thrombocytopenia was noted in 40% and leucopenia in 24% of cases, with platelet counts correlating with disease severity. Conclusion: Dengue predominantly affects young adults and shows peak incidence during monsoon season. NS1 antigen is a valuable early diagnostic marker. 
Clinical and serological correlation is crucial for early detection and appropriate management, which helps in preventing complications and reducing disease burden.

45. Maternal and Perinatal Outcomes in Subclinical Hypothyroidism during Pregnancy: A Prospective Comparative Study
Vrushabhveer C. P., Baitinti Srividya, Sanabil S. P., Druva Chandra A. M., Abdul Haque Usman Pulath Puthanath, Sanketh Janardhan
Abstract
Background: Subclinical hypothyroidism (SCH) during pregnancy represents a significant endocrine disorder characterized by elevated thyroid-stimulating hormone (TSH) levels with normal free thyroxine concentrations. The association between SCH and adverse pregnancy outcomes remains controversial, with conflicting evidence regarding maternal and neonatal complications. Methods: This prospective comparative study enrolled 240 pregnant women (120 with SCH and 120 euthyroid controls) at a tertiary care hospital over an 18-month period. Participants were recruited at ≤20 weeks of gestation and followed until delivery. SCH was defined using trimester-specific TSH thresholds with normal free T4 levels. Primary outcomes included preterm birth, preeclampsia, low birth weight, and neonatal intensive care unit admission. Statistical analysis included chi-square tests, independent t-tests, and multivariate logistic regression. Results: Women with SCH demonstrated significantly higher rates of preterm birth (26.7% vs 11.7%, p=0.003), preeclampsia (16.7% vs 6.7%, p=0.018), and low birth weight (22.5% vs 9.2%, p=0.005) compared to euthyroid controls. Mean neonatal birth weight was significantly lower in the SCH group (2,680 ± 395 g vs 2,920 ± 365 g, p<0.001). Multivariate analysis revealed SCH as an independent predictor of composite adverse outcomes (adjusted OR 2.14, 95% CI 1.28–3.58, p=0.004). Conclusion: Subclinical hypothyroidism during pregnancy is associated with significantly increased maternal and perinatal morbidity. Early screening and appropriate management strategies may improve pregnancy outcomes in affected women.
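The reported adjusted odds ratio (2.14) comes from multivariate logistic regression on a composite outcome, but the unadjusted odds ratio for a single outcome such as preterm birth can be reconstructed from the quoted rates. An illustrative sketch, assuming the rates 26.7% and 11.7% correspond to exactly 32 and 14 events per 120 women (an assumption for illustration, not the authors' analysis):

```python
import math

# Unadjusted odds ratio for preterm birth, reconstructed from the
# reported rates: 26.7% of 120 SCH women vs 11.7% of 120 controls,
# i.e. roughly 32 and 14 events (assumed counts, for illustration).
a, b = 32, 120 - 32   # SCH group: preterm, not preterm
c, d = 14, 120 - 14   # control group: preterm, not preterm

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # Woolf's logit method
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"Unadjusted OR {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```

Under these assumed counts the unadjusted OR is about 2.75; it differs from the reported 2.14 because the latter is adjusted for covariates and covers the composite outcome.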

46. Evaluation of the Effect of Cyperus rotundus in a Murine Model of Dextran Sulphate Sodium (DSS) Induced Acute Colitis
Divakar, Suraj Waykole, Rahul Vitthal Chavan, Manoj Radhakrishnan, Sandhya Kamat
Abstract
Background: Ulcerative colitis (UC) is a chronic, debilitating condition that affects an individual throughout life and is associated with many complications. The current treatment regimen includes the use of anti-inflammatory agents such as sulfasalazine, corticosteroids and immunosuppressants like azathioprine. These drugs have limited efficacy and multiple adverse effects, and hence there is a need for safer and more efficacious new drugs. Cyperus rotundus (CR) is a medicinal plant used in Ayurveda for the treatment of gastrointestinal disorders. The present study examined the effect of CR in an animal model which simulates ulcerative colitis. Objectives: To evaluate the protective effect of CR in a murine model of dextran sulfate sodium (DSS) induced acute colitis. Methods: After IAEC approval, 48 Swiss albino mice were divided into six groups (n = 8/group) and treated as follows: vehicle control (VC), disease control (DC), positive control (sulfasalazine, 100 mg/kg) and three test groups with CR at 200 mg/kg/day, 600 mg/kg/day and 1 g/kg/day. All 6 groups received the study drug or vehicle from day 1 to 14. The inducing agent, 3% dextran sulphate sodium in drinking water, was administered from day 8 to 14 to all groups except VC. Animals were sacrificed on day 15. Colon length and colon weight-by-length ratio were assessed and analyzed using one-way ANOVA. Disease activity index (DAI) and colitis macroscopy were assessed and analyzed using the Kruskal-Wallis test. A value of p < 0.05 was considered to be statistically significant. Results: CR (1 g/kg/day) significantly increased the colon length (p<0.05) and decreased the colon weight-by-length ratio, colitis macroscopy score and DAI score (p<0.05) as compared to DC. Its effects were comparable to the positive control sulfasalazine. Conclusion: Aqueous extract of rhizomes of CR exerted a protective effect in the murine model of DSS induced acute colitis.

47. Correlation of Serum Calcium and Serum Cholesterol with Platelet Indices in Cardiac Patients: A Prospective Observational Study
Radhika Sharma, Pratishtha Shrivastava, Shivangi Maru
Abstract
Background: Cardiovascular diseases (CVD) are a leading cause of morbidity and mortality worldwide. Platelet activation and dyslipidemia play a crucial role in the pathogenesis of atherosclerosis and its complications. Serum calcium has also been implicated in cardiovascular risk, though its relationship with lipid profile and platelet indices remains less clearly defined. Aims and Objectives: To evaluate the correlation between serum calcium and serum cholesterol, and to assess the association of platelet indices with serum cholesterol and various cardiovascular diseases in patients admitted to ICCU. Material and Methods: This prospective observational study included 153 cardiac patients admitted to the ICCU of R.D. Gardi Medical College, Ujjain. Serum calcium, serum cholesterol, and platelet indices (MPV, PDW, P-LCR, platelet count, PCT) were measured. Correlation analysis was performed using appropriate statistical methods. Results: Many patients were in the 51–60 years (26.8%) and 61–70 years (26.8%) age groups, with male predominance (68%). The most common diagnosis was coronary artery disease (26.1%), followed by myocardial infarction (20.9%). Serum cholesterol was significantly higher in myocardial infarction patients. A significant positive correlation was observed between serum calcium and serum cholesterol (p = 0.038). Serum cholesterol also showed significant positive correlations with PDW (r = 0.450, p = 0.011), MPV (r = 0.617, p = 0.002), and P-LCR (r = 0.537, p = 0.023). Conclusion: Serum cholesterol is significantly associated with platelet activation indices, indicating increased thrombotic potential in cardiac patients. Platelet indices may serve as simple, cost-effective markers for identifying high-risk individuals.

48. A Compact Smartphone Microscope Adapter for Real-Time Telepathology: Design, Development, and Point-of-Care Applications
Biswas Rajib, Das Mainak, Chakraborty Shubarna, Das Barnali, Naiding Momota, Kairi Sushmita
Abstract
Background & Objectives: Smartphone-assisted telepathology offers a cost-effective way to support remote microscopy and teaching. Free-hand imaging through a microscope eyepiece can be unstable and often leads to misalignment or loss of focus. This study aimed to technically validate a compact, modular smartphone microscope adapter for real-time telepathology. Methods: This prospective cross-sectional validation study took place over six months in a tertiary care pathology department. Researchers evaluated 60 archived slides, including 30 histopathology, 20 hematology, and 10 cytology specimens. Ten participants each assembled the adapter and conducted live transmission sessions on six slides, resulting in 60 sessions. Technical performance was measured using a structured 5-point Likert scale for stability, image resolution, focus quality, color accuracy, and stream stability. User feedback and diagnostic interpretability were also collected. Results: All 60 sessions finished without any mechanical failures, device detachment, or clamp loosening. The average time for assembly and alignment was 2.4 ± 0.6 minutes. Adapter stability had the highest score (4.72 ± 0.48), followed by image resolution (4.58 ± 0.56), focus quality (4.55 ± 0.59), color fidelity (4.47 ± 0.62), and stream stability (4.41 ± 0.69). The overall composite score was 4.55 ± 0.52. Images transmitted during the sessions were diagnostically interpretable for all specimen types, and reviewers’ diagnoses matched the reference diagnosis in every session. Interpretation & Conclusions: The adapter showed excellent stability, consistent alignment, easy usability, and provided live image transmission that was suitable for diagnosis in telepathology and teaching.

49. Comparative Outcomes of Minimally Invasive vs. Open Surgery: A Systematic Review
Lakhyajit Pait, Prakash Kalita, Sankar Prasad Saikia, Nirmal Kumar Agarwal
Abstract
Background: Minimally invasive surgery (MIS) has evolved rapidly since the 1990s, primarily encompassing laparoscopic and video-assisted approaches designed to minimize surgical trauma, postoperative pain, and recovery time compared with conventional open procedures. Objective: This systematic review aimed to compare perioperative, postoperative, and long-term outcomes of laparoscopic MIS and open surgery in adult patients undergoing common general surgical procedures, including cholecystectomy, appendicectomy, and hernia repair. Methods: A systematic literature search was performed across PubMed, Embase, Cochrane CENTRAL, and Web of Science from database inception through October 2025. Eligible studies included randomized controlled trials (RCTs) and comparative cohort studies involving adults (≥18 years) who underwent laparoscopic versus open surgery. Primary outcomes were operative time, intraoperative blood loss, postoperative complications, hospital stay, mortality, and long-term survival. Study quality was assessed using RoB 2 for RCTs and ROBINS-I for non-randomized studies. Results: Forty-seven studies (26 RCTs and 21 cohort studies) met the inclusion criteria. Laparoscopic MIS demonstrated significantly lower intraoperative blood loss (mean difference −93 mL), shorter hospital stay (mean difference −2.8 days), and fewer postoperative complications (OR = 0.54, 95% CI 0.44–0.67) compared with open procedures. Operative time was moderately longer for laparoscopic surgery (MD = +28 minutes). Mortality was significantly lower in emergency laparoscopic cases (OR = 0.44, 95% CI 0.35–0.54). Long-term outcomes, including recurrence and survival, were comparable between both approaches. Conclusions: Laparoscopic minimally invasive surgery offers clear short-term benefits—reduced blood loss, fewer complications, and faster postoperative recovery—without compromising long-term outcomes.
These advantages are influenced by surgeon expertise, appropriate case selection, and institutional experience.

50. Impact of Vitamin D Deficiency on Morbidity & Mortality in Early Onset Sepsis among Term Neonates
Anil Kumar Gogineni, Radhika Mantry, Urmila Jhamb, Rashi Bhargava, Aarti Anand
Abstract
Introduction: Early-onset neonatal sepsis (EONS) is a significant cause of neonatal morbidity and mortality, particularly in developing countries. Vitamin D has important immunomodulatory functions, and neonates are entirely dependent on maternal vitamin D stores. Deficiency may increase susceptibility to infections; however, Indian data on its association with early-onset sepsis are limited. Aim and Objective: To study the impact of serum vitamin D levels on early-onset neonatal sepsis in term neonates, and to assess their association with clinical outcomes. Materials and Methods: This hospital-based prospective case-control study was conducted at Santosh Medical College and Hospital, Ghaziabad. A total of 138 term neonates were enrolled, including 69 neonates with early-onset sepsis (≤72 hours of life) and 69 healthy term neonates as controls. Diagnosis of sepsis was based on clinical features supported by laboratory parameters including C-reactive protein, complete blood count, immature-to-total neutrophil ratio, platelet count, and blood culture. Serum 25-hydroxyvitamin D levels were measured in all neonates. Maternal risk factors and intrapartum antibiotic prophylaxis were recorded. Statistical analysis was performed using STATA MP-17, with p < 0.05 considered statistically significant. Results: Mean serum vitamin D levels were significantly lower in neonates with early-onset sepsis than in controls (10.30 ± 3.23 vs 24.71 ± 5.43 ng/mL; p < 0.001). Vitamin D deficiency was more common among septic neonates. Maternal fever, infections, and inadequate intrapartum antibiotic prophylaxis were significantly associated with early-onset sepsis. Blood culture was positive in 73.9% of cases. Septic neonates required longer NICU stay, prolonged antibiotic therapy, and greater respiratory support, with worse outcomes seen in those with severe vitamin D deficiency.
Conclusion: Vitamin D deficiency has a significant impact on early-onset neonatal sepsis in term neonates and correlates with increased disease severity. Affected neonates required longer NICU stay, prolonged antibiotic therapy, and greater respiratory support. Since vitamin D deficiency is preventable, maternal screening and supplementation may help reduce early-onset neonatal sepsis and improve outcomes.

51. Enhancing Early Initiation of Breastfeeding through Quality Improvement Interventions in Newborns Delivered at a Tertiary Care Centre
Sakshi, Veenu Agarwal, Abhinav Taneja, Aarti Anand, Rashi Bhargava
Abstract
Background: Early initiation of breastfeeding (EIBF) and immediate skin-to-skin contact (SSC) are evidence-based practices that significantly reduce neonatal morbidity and mortality. Despite strong recommendations, EIBF remains suboptimal in many institutional delivery settings due to multifactorial barriers. This study aimed to identify barriers to EIBF and evaluate the effectiveness of a structured quality improvement (QI) intervention using Plan–Do–Study–Act (PDSA) cycles in a tertiary care hospital. Methods: A prospective QI initiative was conducted over 20 months (May 2024–December 2025) in the Departments of Paediatrics and Obstetrics & Gynaecology of a tertiary care hospital. Baseline data were collected for 285 eligible mother–neonate dyads to assess EIBF and SSC practices and identify barriers through root cause analysis. Four sequential PDSA cycles were implemented, focusing on staff role allocation, training, and development of standard operating procedures, maternal counselling, simulation-based learning, and inclusion of caesarean deliveries, followed by a sustenance phase. EIBF and SSC rates were tracked across phases. Results: At baseline, EIBF and SSC rates were 12.6% and 20%, respectively. Major barriers included lack of awareness, inadequate counselling, absence of standardized protocols, staffing constraints, and procedural separation of mother and newborn. Progressive improvement was observed across PDSA cycles, with EIBF increasing to 24%, 35%, 40.1%, and 72% by the fourth cycle, and SSC rising to 32%, 51.3%, 73.2%, and 82%, respectively. During the sustenance phase, rates declined to 56% for EIBF and 62.1% for SSC, highlighting the importance of continued supervision and institutional support. Conclusion: A structured, context-specific QI approach using PDSA cycles significantly improved EIBF and SSC rates. Sustained gains require ongoing supervision, clear policies, and integration into routine clinical practice.

52. Clinical and Demographic Risk Factors for Mortality in Hospitalized Patients with Pneumonia: An Observational Study
Kiran Kumari Padhy, Ram Niranjan Sahoo, Muktikanta Parida
Abstract
Background: Pneumonia remains a major cause of hospital admission and death, particularly among older adults and patients with physiological derangement or chronic comorbidity. Early recognition of mortality risk is essential for triage, monitoring intensity, and resource allocation in tertiary-care settings. Aim: To identify clinical and demographic predictors of in-hospital mortality among adults hospitalized with pneumonia at a tertiary care teaching hospital in eastern India. Methods: This single-center retrospective observational study included 270 consecutive adult patients admitted with radiologically confirmed pneumonia to the Institute of Medical Sciences and SUM Hospital II, Bhubaneswar, Odisha, between January 2024 and January 2026. Demographic variables, comorbidities, admission physiological parameters, laboratory markers, radiographic extent, and in-hospital course were analyzed. Survivors and non-survivors were compared using standard inferential tests. Univariable and multivariable logistic regression were used to identify independent predictors of in-hospital mortality, and discrimination was assessed by receiver operating characteristic analysis. Results: Overall, in-hospital mortality was 15.6% (42/270). Non-survivors were older and more likely to have diabetes, chronic kidney disease, altered sensorium, hypoxemia at admission, multilobar involvement, elevated inflammatory burden, renal dysfunction, and hypoalbuminemia. In the final admission-based multivariable model, age ≥65 years (adjusted odds ratio [AOR] 4.76, 95% CI 1.95–11.63), chronic kidney disease (AOR 2.29, 95% CI 1.01–5.18), SpO2 <90% at admission (AOR 2.38, 95% CI 1.04–5.44), albumin <3.5 g/dL (AOR 2.85, 95% CI 1.29–6.28), and neutrophil-to-lymphocyte ratio >12 (AOR 4.41, 95% CI 2.00–9.72) independently predicted death. The admission model showed good discrimination (AUC 0.857), outperforming CURB-65 alone (AUC 0.799).
Conclusion: Older age, renal comorbidity, admission hypoxemia, hypoalbuminemia, and elevated neutrophil-to-lymphocyte ratio were the strongest independent predictors of in-hospital death in hospitalized pneumonia. A parsimonious admission model may improve early risk stratification beyond clinical severity scoring alone.
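The discrimination comparison in the abstract above (admission model AUC 0.857 versus CURB-65 AUC 0.799) rests on receiver operating characteristic analysis. As an illustrative sketch only, the AUC can be computed from predicted risks and observed outcomes via its Mann-Whitney (rank-sum) equivalence; the values below are invented toy data, not the study's:

```python
def roc_auc(labels, scores):
    """AUC via the Mann-Whitney equivalence: the probability that a
    randomly chosen positive case (label 1) receives a higher predicted
    score than a randomly chosen negative case (label 0); ties count 0.5."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: perfectly separated risks give AUC = 1.0
print(roc_auc([0, 0, 1, 1], [0.1, 0.2, 0.8, 0.9]))
```

An AUC of 0.5 corresponds to no discrimination; values approaching 1.0 indicate that the model ranks non-survivors above survivors almost always, which is the sense in which the admission model "outperforms" CURB-65.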

53. Profile of Electrocution Deaths in Coastal Odisha: A Retrospective Autopsy-Based Study
Subal Kumar Naik, Umakanta Khejuria
Abstract
Background: Electrocution remains an important yet preventable cause of accidental mortality in India, especially in regions with rapid urban expansion, informal electrical connections, humid climate, and seasonal outdoor work. Autopsy-based regional profiling helps identify vulnerable groups, recurring circumstances, and forensic injury patterns relevant to both death investigation and prevention. Aim: To describe the demographic profile, circumstantial characteristics, autopsy findings, and analytical correlates of electrocution deaths autopsied at SCB Medical College & Hospital, Cuttack, from 5 January 2025 to 31 December 2025. Methods: A retrospective descriptive-analytic record review of 90 confirmed electrocution deaths was conducted over a one-year institutional autopsy frame. Case records, inquest papers, scene details, and hospital records were reviewed for age, sex, residence, occupation, season, place of occurrence, source and voltage of current, external injury pattern, survival interval, and cause of death. Sample size calculation using the single-proportion formula, based on an expected male predominance of 85% from previous Indian literature and an absolute precision of 7.5%, yielded a minimum requirement of 87 cases; all 90 eligible cases in the study period were included. Descriptive statistics, chi-square/Fisher exact tests, and odds ratios (ORs) with 95% confidence intervals (CIs) were used. Results: Males constituted 86.7% of the victims and the mean age was 34.4±14.3 years. The 21–30 year age group was most affected (32.2%), followed by 31–40 years (25.6%). Most victims were from rural areas (67.8%), and deaths were overwhelmingly accidental (96.7%). Incidents peaked in the monsoon season (54.4%), with the highest monthly counts in July and August. Low-voltage exposure accounted for 68.9% of cases, while workplace incidents (41.1%) marginally exceeded home incidents (37.8%). Upper-limb contact was the commonest contact site (52.2%).
Discrete electrical marks were absent in 34.4% of cases. Immediate cardiorespiratory arrest due to electrocution was the most frequent cause of death (60.0%), followed by electrocution with respiratory arrest (17.8%) and septicemia following electrical burns (13.3%). High-voltage fatalities were significantly associated with occurrence outside home (OR 8.33, p<0.001), occupational exposure (OR 3.34, p=0.010), extensive burns ≥20% TBSA (OR 11.31, p<0.001), survival >24 hours (OR 6.56, p=0.009), and fall-related injury (OR 16.64, p=0.003). Conclusion: In this coastal Odisha autopsy series, electrocution deaths predominantly affected young rural men and clustered during the monsoon months. Although low-voltage domestic and workplace events formed the larger burden, high-voltage exposure was associated with more severe burns, delayed survival, and secondary trauma. The findings support targeted household electrical safety, occupational line-clearance protocols, monsoon risk messaging, and careful forensic documentation even when classical electrical marks are absent.
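The sample-size arithmetic quoted in the Methods of the abstract above (expected proportion 85%, absolute precision 7.5%) follows the standard single-proportion formula n = z²p(1−p)/d². A short illustrative check, assuming the conventional 95% confidence level (z = 1.96):

```python
def sample_size_single_proportion(p, d, z=1.96):
    """Minimum sample size for estimating a single proportion p
    to within absolute precision d, at the confidence level
    implied by the standard normal deviate z (1.96 for 95%)."""
    return z * z * p * (1 - p) / (d * d)

# Expected male predominance 85%, absolute precision 7.5%
n = sample_size_single_proportion(p=0.85, d=0.075)
print(round(n, 1))  # ≈ 87.1, consistent with the stated minimum of 87 cases
```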

54. Clinico-Hematological Profile of Anaemia in Children Aged 6 Months to 12 Years Admitted to a Tertiary Care Hospital in North-East India: A Cross-Sectional Study
Sougata Saha, Sujit Kumar Chakrabarti, Sribas Das, Manasi Saha (Ray)
Abstract
Background: Anaemia remains a major public health problem among children in developing countries, particularly in India. Data from North-East India are limited, especially among hospitalized paediatric populations. Objectives: To estimate the proportion of anaemia among hospitalized children aged 6 months to 12 years, to describe their clinico-hematological profile, and to determine the association of nutritional status with the severity and type of anaemia. Methods: This hospital-based cross-sectional study was conducted in the Department of Paediatrics, Agartala Government Medical College, Tripura, over a two-year period. Children aged 6 months to 12 years admitted with anaemia as per WHO criteria were consecutively enrolled. Clinical features, anthropometry, complete blood counts, peripheral smear examination, iron profile, vitamin B12 levels, haemoglobin electrophoresis, and relevant investigations were performed. Data were analyzed using SPSS version 26.0. Results: Among 2,421 admitted children, 107 were anaemic, giving a proportion of 44.19 cases per 1,000 admissions (4.41%). Males constituted 59.8% (male: female ratio 1.48:1). Moderate anaemia was most common (44.9%), followed by severe anaemia (35.5%). Malnutrition was present in 71% of children and showed a significant association with anaemia severity (p<0.001). Microcytic hypochromic anaemia was the predominant morphological type (66.4%). Iron deficiency anaemia was the most common etiology (29.9%), while malaria (10.3%) and haemoglobinopathies were major contributors to haemolytic anaemia. Conclusion: Anaemia among hospitalized children in Tripura is multifactorial, strongly associated with malnutrition, and predominantly nutritional in origin. Early nutritional intervention and region-specific preventive strategies are urgently required.

55. Effect of Intravenous Lignocaine versus Fentanyl on Hemodynamic Response to Laryngoscopy & Endotracheal Intubation in General Anaesthesia: A Comparative Study
Ravindra Kumar Dabi, Chiranji Lal Khedia, Rohit Kumar Verma, Vijeta Khandelwal
Abstract
Background: Laryngoscopy and endotracheal intubation provoke a transient sympathoadrenal response resulting in tachycardia and hypertension, which may be detrimental in susceptible patients. Various pharmacological agents are used to attenuate this response. Objective: To compare the efficacy of intravenous lignocaine and fentanyl in attenuating the hemodynamic response to laryngoscopy and intubation. Methods: This randomized double-blind study was conducted on 80 patients (ASA I–II) undergoing elective abdominal surgery under general anesthesia. Patients were allocated into two groups: Group L received intravenous lignocaine 1.5 mg/kg, and Group F received fentanyl 2 µg/kg, administered 3 minutes before intubation. Hemodynamic parameters (HR, SBP, DBP, MAP) were recorded at baseline, post-drug, post-induction, at intubation, and at 1, 3, and 5 minutes after intubation. Statistical analysis was performed using SPSS, with p < 0.05 considered significant. Results: Both groups were comparable in demographic parameters. Hemodynamic variables increased significantly at intubation in both groups. However, fentanyl demonstrated significantly better attenuation at 1 minute post-intubation: HR (p=0.039), SBP (p=0.007), DBP (p=0.00029), and MAP (p=0.005). No significant differences were observed at later time intervals. Incidence of side effects (bradycardia, hypotension) was comparable between groups. Conclusion: Fentanyl (2 µg/kg) is more effective than lignocaine (1.5 mg/kg) in attenuating the acute hemodynamic response at 1 minute following intubation, with comparable safety profiles.

56. CHA2DS2-VASC Score in Emergency Department: A Prospective Observational Study
Urjita Pranav Modi, Harshkumar Dangi, Dharmistra Dhusa, Pramit Patel
Abstract
Background: The CHA₂DS₂-VASc score (Congestive Heart Failure, Hypertension, Age ≥75 years, Diabetes Mellitus, Stroke/Transient Ischemic Attack, Vascular Disease, Age 65–74 years, and Sex Category [female]) is a validated, evidence-based tool used to estimate the risk of stroke in patients with non-valvular atrial fibrillation (AF). It serves as an essential bedside assessment in the emergency department, particularly in patients presenting with new-onset AF, uncontrolled AF, or AF with an uncertain history of anticoagulation. Objective: To evaluate the usefulness of the CHA₂DS₂-VASc score in patients with atrial fibrillation in the emergency department. Methods: This prospective observational study included adult patients presenting to the emergency department with atrial fibrillation confirmed by electrocardiogram. CHA₂DS₂-VASc scores were calculated at presentation and correlated with anticoagulation use and outcomes. Results: Higher CHA₂DS₂-VASc scores were associated with increased cerebral infarction and mortality. Anticoagulant therapy was underutilized despite high-risk scores. Conclusion: The CHA₂DS₂-VASc score is useful for risk stratification and outcome prediction in emergency department patients with atrial fibrillation.
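The components expanded in the abstract above translate directly into a bedside scoring rule: two points each for age ≥75 years and prior stroke/TIA, one point for each remaining factor. A minimal sketch of that logic (parameter names are illustrative, not from the study):

```python
def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 stroke_or_tia, vascular_disease):
    """CHA2DS2-VASc score (0-9) for stroke risk in non-valvular AF.

    Two points each for age >= 75 years and prior stroke/TIA;
    one point each for congestive heart failure, hypertension,
    diabetes, vascular disease, age 65-74 years, and female sex.
    """
    score = 0
    if age >= 75:          # the two age bands are mutually exclusive
        score += 2
    elif 65 <= age <= 74:
        score += 1
    score += 2 if stroke_or_tia else 0
    score += 1 if chf else 0
    score += 1 if hypertension else 0
    score += 1 if diabetes else 0
    score += 1 if vascular_disease else 0
    score += 1 if female else 0
    return score

# Example: a 78-year-old woman with hypertension and diabetes
# scores 2 (age) + 1 (female) + 1 (HTN) + 1 (DM) = 5
print(cha2ds2_vasc(78, True, False, True, True, False, False))
```

Because the two age bands cannot co-occur, the maximum attainable score is 9, which matches the conventional range of the tool.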

57. Comparative Study of the Efficacy & Safety of Ferric Carboxy Maltose V/S Iron Sucrose in Management of Mild to Moderate Iron Deficiency Anemia in Pregnant Women
Darshan D. Patel, Harshdeep K. Jadeja, Bhavesh B. Airao
Abstract
Objective: Anemia is one of the most common medical conditions affecting pregnancy and is responsible for maternal and perinatal mortality and morbidity. The study was done to compare the efficacy and safety of ferric carboxymaltose versus iron sucrose in iron deficiency anemia during pregnancy. Method: This prospective observational study was carried out at C U Shah Medical College and Hospital, Surendranagar, Gujarat. One hundred pregnant women with mild to moderate iron deficiency anemia were selected and randomized into two groups in a 1:1 ratio. Group A consisted of 50 antenatal women who received iron sucrose; Group B consisted of 50 antenatal women who received ferric carboxymaltose. Results: A total of 100 pregnant women with iron deficiency anemia were included in the study, with 50 patients in the Iron Sucrose group and 50 in the Ferric Carboxymaltose (FCM) group. Baseline characteristics including age, baseline hemoglobin, and serum ferritin were comparable between the two groups with no statistically significant difference. Both groups showed a significant improvement in hemoglobin levels during follow-up; however, the rise in hemoglobin was significantly higher in the FCM group. At 8 weeks, the mean hemoglobin level increased to 12.5 ± 1.0 g/dl in the FCM group compared to 11.2 ± 1.1 g/dl in the Iron Sucrose group (p < 0.001). Serum ferritin levels also showed a significantly greater increase in the FCM group, reaching 110.6 ± 18.2 ng/ml at 4 weeks compared to 45.3 ± 12.4 ng/ml in the Iron Sucrose group (p < 0.001). The mean number of doses required was significantly lower in the FCM group (1.3 ± 0.5 doses) compared to the Iron Sucrose group (4.8 ± 1.2 doses). Both treatments were well tolerated, and the incidence of adverse effects such as nausea, headache, and injection site pain was low and comparable between the two groups. No hypersensitivity reactions were observed.
Discussion: Our study showed a significant increase in hemoglobin levels in both groups, but FCM was safe and more effective in improving hemoglobin concentration, with earlier replenishment of iron stores, compared with iron sucrose in patients with mild to moderate anemia.

58. Impact of Lateral Vs Sitting Position for Spinal Anesthesia Administration on Intraocular Pressure and Post Dural Puncture Headache in Cesarean Section
Aparajita Banerjee, Meenakshi Pandey, Trishna Sahu, Ambika Prasad Panda
Abstract
Aim: The aim of this study was to determine the effect of spinal anesthesia administered in either the sitting or right lateral position on post dural puncture headache (PDPH) and intraocular pressure during cesarean section. Materials and Methods: 100 patients posted for cesarean section under spinal anesthesia were divided into two groups of 50 each. Spinal anesthesia was administered either in the sitting position (Group S) or the right lateral position (Group RL). Hemodynamics were monitored during the perioperative period. Intraocular pressure was measured before and after the operation. Post dural puncture headache was assessed postoperatively up to 5 days. Patients requiring more than 1 attempt for spinal anesthesia were excluded. Results: There was no statistical difference between the two groups regarding demographic data. Post dural puncture headache was seen in 13 patients in Group S and 5 patients in Group RL, the difference being significant. There was no significant difference between the groups regarding intraocular pressure (P > 0.05). There was no significant difference between the groups regarding heart rate, SBP and SpO2 at various time points in the perioperative period. Conclusion: Spinal anesthesia administered in the sitting position for cesarean section resulted in a higher incidence of post dural puncture headache than in the right lateral position, but no significant change was found in the intraocular pressure.

59. Dexmedetomidine versus Propofol Infusion for Intraoperative Haemodynamic Stability during Laparoscopic Surgery: A Prospective Open-Label Comparative Study
P. Umamaheswari, R. Pravin Kumaar, R. Mageshwaran
Abstract
Background: Pneumoperitoneum created during laparoscopic surgery induces significant haemodynamic perturbations, including increased systemic vascular resistance, reduced venous return, and activation of neurohumoral stress pathways. Effective intraoperative haemodynamic management is therefore critical. This study aimed to compare the efficacy of dexmedetomidine infusion versus propofol infusion in maintaining haemodynamic stability during laparoscopic surgery and to evaluate postoperative recovery profiles. Methods: A prospective, open-label comparative study enrolled 70 patients (ASA PS I and II, aged 18–65 years) undergoing elective laparoscopic surgery at Government Villupuram Medical College & Hospital. Patients were randomised equally into Group D (dexmedetomidine: loading dose 1 mcg/kg over 10 minutes before intubation, followed by 0.2 mcg/kg/h infusion) and Group P (propofol: 100 mcg/kg/min infusion after intubation). Both infusions were continued until deflation of pneumoperitoneum. Heart rate (HR), systolic blood pressure (SBP), diastolic blood pressure (DBP), and mean arterial pressure (MAP) were recorded at multiple time points. Postoperative sedation and recovery were assessed using the Ramsay Sedation Scale (RSS) and Modified Aldrete Score (MAS). Results: Both groups were comparable at baseline. Group D exhibited significantly lower HR, SBP, DBP, and MAP compared with Group P at most intraoperative time points (p<0.01), reflecting superior haemodynamic attenuation (HR: 3% decrease vs. 18% increase; MAP: 4% decrease vs. 7% increase over pneumoperitoneum). Group D patients had significantly deeper sedation (higher RSS scores) up to 90 minutes postoperatively (p<0.01), while Modified Aldrete Scores were significantly lower in Group D at 0, 15, and 30 minutes post-extubation (p<0.01), indicating slower initial recovery. Both groups achieved full recovery by 45–60 minutes. No adverse events were recorded. 
Conclusion: Dexmedetomidine infusion provides superior intraoperative haemodynamic stability during laparoscopic surgery compared with propofol, with effective attenuation of the stress response to pneumoperitoneum. Propofol offers faster early recovery. Dexmedetomidine is the preferred agent when cardiovascular stability is the clinical priority.

60. Dexmedetomidine as an Adjuvant in Opioid Anaesthesia Induction in Patients with Left Ventricular Dysfunction and Coronary Artery Disease: A Prospective Randomised Observational Study
Kurinchi Raja M., Naven Kumar S., Saravanakumar
Abstract
Background: Patients with coronary artery disease (CAD) and compromised left ventricular (LV) function represent a high-risk subgroup in cardiac surgery. High-dose opioid induction, while widely employed, carries risks of haemodynamic instability. Dexmedetomidine, a highly selective alpha-2 (α2) adrenergic agonist, offers sympatholysis, sedation, and analgesia, and may attenuate the adrenergic response to laryngoscopy and intubation. This study aimed to compare the haemodynamic effects and opioid requirements during anaesthesia induction with fentanyl alone versus fentanyl supplemented with dexmedetomidine in patients with LV dysfunction undergoing off-pump coronary artery bypass grafting (OPCAB). Methods: Sixty adult patients with LV dysfunction (ejection fraction <45%) undergoing elective OPCAB were prospectively randomised into two groups of 30 each: Group F (fentanyl alone) and Group D (fentanyl plus dexmedetomidine loading dose 1 mcg/kg over 10 minutes). Haemodynamic parameters and cardiac output indices were recorded at baseline and at one-minute intervals from induction to seven minutes post-induction using a Flo Trac™/Vigileo™ system. Bispectral Index (BIS) monitoring ensured anaesthetic depth. Fentanyl dosage at induction, additional intraoperative fentanyl requirements, and duration of postoperative ventilation were compared between groups. Results: Demographic parameters were comparable between groups. Heart rate and systolic blood pressure were significantly elevated in Group F compared to Group D across all post-induction time points (p<0.001). Diastolic blood pressure and oxygen saturation (SpO2) remained similar in both groups. Stroke volume index (SVI) and cardiac index (CI) were significantly better maintained in Group D (p<0.001). Fentanyl induction dosage (325 ± 25.4 mcg vs 253.3 ± 26.0 mcg) and additional intraoperative fentanyl (371.7 ± 28.4 mcg vs 185.0 ± 32.6 mcg) were significantly lower in Group D (p<0.001). 
Duration of postoperative ventilation was also significantly shorter in Group D (5.6 ± 0.5 hrs vs 9.0 ± 0.9 hrs, p<0.001). Conclusion: Dexmedetomidine supplementation to fentanyl-based induction in patients with CAD and LV dysfunction provides superior haemodynamic stability, better preservation of cardiac output parameters, reduced intraoperative opioid requirements, and facilitates faster postoperative extubation, enabling early patient fast-tracking.

61. The Persistence of Hansen’s Disease: A Five-Year Profile of High Bacillary Indices and Pediatric Cases from East Vidarbha region
Priyanka Chandankhede, Khushboo Agarwal, Aboli Shinde, Dilip Gedam, Gopal Agrawal
Abstract
Background: Leprosy, or Hansen’s disease, remains a chronic infectious challenge caused by Mycobacterium leprae, primarily affecting the skin and peripheral nerves. Despite national elimination efforts, leprosy transmission persists in marginalized communities in India. This study aimed to determine the pattern, prevalence, and trends of slit-skin smear-positive leprosy cases in East Vidarbha region to assess post-elimination challenges. Methods: A retrospective analysis was conducted at a tertiary care institution in Nagpur over a five-year period from January 2020 to February 2025. Clinical and bacteriological data from 239 slit-skin smear-positive cases, identified from 502 suspected individuals, were evaluated. Results: Males were predominantly affected, making up 64.9% of the cases with a male-to-female ratio of 1.8 to 1. The highest incidence occurred among individuals aged 41 to 50 years, representing 23.4% of the total. Additionally, children aged ≤ 10 years accounted for 5.0% of the cohort, which points to active community transmission. Multibacillary leprosy was responsible for 71.5% of the cases. Furthermore, 51.9% of the patients exhibited a high Bacillary Index of ≥ 5, while the highly infectious borderline lepromatous and lepromatous types made up 56.5% of all clinical presentations. The study also highlighted a significant surge in cases during 2023, representing 36.8% of the total, alongside a relapse or re-treatment rate of 20.9%. Conclusion: The high burden of multibacillary disease and pediatric cases confirms ongoing leprosy transmission in Central India. To achieve the goal of “Zero Leprosy,” healthcare systems must urgently optimize early detection frameworks, integrate novel chemoprophylactic regimens and vaccines, and actively address socio-economic barriers to care.

62. Clinico Epidemiological Study of Genito-Ulcerative Sexually Transmitted Diseases in People Living with HIV/AIDS
Gayathri Narukulla, Pravalika Merugu, Raghumohan Kavati
Abstract
Background: The importance of genito-ulcerative STDs has increased considerably because these lesions are a major cofactor in the transmission of HIV. Prompt and effective treatment must therefore be provided as early as possible: it prevents viral replication, infection of the spouse, and infection of other people in the community.

63. Association between Blood Pressure, Body Mass Index, and Thyroid Hormone Levels among Northern Indians
Deepa Gupta, Prateek Agrawal, Manjula Babariya, Jitendra Kumar S. Parmar, Kamini Vinayak
Abstract
Background: Blood pressure (BP) and body mass index (BMI) are important indicators of health, especially for heart-related problems. Frequent increases in BP and weight can affect the body's metabolism, which may lead to hypertension and obesity, universal contributors to the most common endocrine disorder, subclinical hypothyroidism (ScH). The present study was conducted to find the association between BP, BMI, and thyroid hormone levels in blood donors. Method: A total of 1018 healthy people who voluntarily came to the hospital for blood donation participated in this study. Of these, 966 were included (97.6% males and 2.4% females, aged 18-59 years) and 52 were excluded. Blood pressure, height, weight, BMI and blood group (BG) were measured using a standardized protocol by trained nursing staff. Blood samples were taken for the estimation of free triiodothyronine (FT3), free thyroxine (FT4) and thyroid stimulating hormone (TSH) by chemiluminescence method on a Vitros 56002355 clinical chemistry analyzer. Descriptive data and Pearson correlation coefficients were calculated using SPSS (version 23.0). Result: There were highly significant positive correlations between BP and BMI (p<0.0001); SBP and FT3, FT4 (p<0.005); and FT3 and FT4 (p<0.0001). A significant negative correlation was found between FT4 and TSH (p<0.05). Conclusion: The study found a significant relationship between BMI subgroups and blood pressure indices among the participants. Prevention of weight gain is needed to reduce the burden of hypertension. Regular physical activity and reduced dietary fat intake, achievable through small lifestyle changes, could help prevent obesity-associated hypertension.

64. Adverse Events Following Immunization by BCG Vaccine among Adults: A Prospective Study in a District of West Bengal, India
Aditya Prasad Sarkar, Panchanan Kundu, Sanjit Kumar Patra, Tanmoy Kumar Ghosh, Paramita Kundu, Saswata Saha
Abstract
Background: Tuberculosis is still a worldwide public health problem, and India has the highest number of TB patients. As per the National Immunization Schedule, the BCG vaccine is given to infants at birth or otherwise within one year. One trial in India has shown it to be 80% effective over 20 years of follow-up. Objectives: i) to describe the socio-demographic characteristics of the adults who were given the BCG vaccine in a district of West Bengal, India, ii) to assess the AEFI after BCG vaccination among them and iii) to find out the association of AEFI with socio-demographic characteristics, if any. Materials & Methods: It was an observational longitudinal study conducted in a district of West Bengal, India from February 2025 to August 2025. A complete enumeration technique was used and ultimately 12308 study subjects were included. Data were collected initially by the ANMs using a pretested, predesigned interviewer-administered schedule, while follow-up was done by the ASHAs through house-to-house visits. Results: The majority of the study subjects were less than sixty years of age (66.7%). Female vaccinees (54.3%) outnumbered males. Almost all of the participants were Hindu (97.9%). In total, 12308 persons were vaccinated, of whom 1105 vaccinees experienced at least one AEFI. AEFI was found more often in senior citizens (97.4%) and the difference was statistically significant (p<0.001). ASHA workers followed up the vaccinees through house-to-house visits on the 2nd, 14th, 28th, 32nd and 84th day. Any AEFI occurred in 95% of the vaccinees on the 2nd day after immunization, 84.6% on day 14 and 83.1% on day 28. Redness and papules developed in 89% of cases, while 95.4% had local tenderness. Subsequently, pustules developed in 84.6% of cases, followed by development of abscesses on the 28th day. Ulcers developed in 80.7% of cases, whereas a scar was seen in 76% of study subjects. Conclusion: Many minor AEFIs developed among the vaccinees, similar to those seen in infants. Further such studies should be undertaken in different parts of the country to obtain the complete national picture.

65. Cognitive Impairment in Elderly Diabetics: Prevalence and Risk Factors
Tinish Sanjaybhai Nanavati, Harshad Radadiya, Roshani Savaliya
Abstract
Background: Cognitive impairment is a frequently under-recognized complication among elderly patients with type 2 diabetes mellitus (T2DM), significantly affecting daily functioning, treatment adherence, and quality of life. While global studies report varying prevalence, data specific to elderly diabetics in western India, particularly Gujarat, remain limited. This study aimed to determine the prevalence of cognitive impairment and identify its associated risk factors among elderly T2DM patients attending a tertiary care teaching centre in Gujarat. Material and Methods: A hospital-based cross-sectional study was conducted over a year at the outpatient department of general medicine at a tertiary care teaching hospital in Gujarat. A total of 250 elderly patients (aged ≥60 years) with confirmed T2DM for at least one year were enrolled using consecutive sampling. Cognitive function was assessed using the Montreal Cognitive Assessment (MoCA) tool, with a score <26 indicating impairment. Relevant sociodemographic, clinical, and biochemical data were collected through structured interviews and hospital records. Ethical approval was obtained from the Institutional Ethics Committee, and written informed consent was secured from all participants. Results: The overall prevalence of cognitive impairment was 42% (105/250). Significant associations were observed with advancing age, longer duration of diabetes, poor glycemic control (HbA1c >8%), hypertension, and lower educational status. Multivariate logistic regression identified age ≥70 years (AOR 3.2, 95% CI 1.8–5.7), diabetes duration >10 years (AOR 2.8, 95% CI 1.6–4.9), HbA1c >8% (AOR 2.4, 95% CI 1.3–4.5), and low education (AOR 2.1, 95% CI 1.2–3.8) as independent predictors. Conclusion: Cognitive impairment affects nearly two-fifths of elderly diabetics in this Gujarat cohort, highlighting the urgent need for routine cognitive screening in diabetes clinics. 
Early identification of modifiable risk factors could prevent progression to dementia and improve patient outcomes in resource-limited settings.

66. Evaluation of Radiological and Functional Outcomes of Femoral Neck Fractures Treated with Cannulated Cancellous Screws: An Observational Study
Muniraj Meena, Seema Meena, Pradeep Khinchi, Harish Kumar Jain
Abstract
Background: Fracture neck of femur is a common orthopaedic injury and remains difficult to manage because of the risk of complications such as non-union and avascular necrosis. These fractures are frequently associated with high-energy trauma in younger adults. Cannulated cancellous screw fixation is widely used for internal fixation as it allows stable fixation and preservation of the femoral head. Objective: To evaluate the radiological and functional outcomes of fracture neck of femur treated with cannulated cancellous screws. Materials and Methods: This prospective hospital-based observational study was conducted in the Department of Orthopaedics in Rajasthan from August 2020 to May 2022. A total of 25 patients aged 18–60 years with fracture neck of femur were included. All patients underwent closed reduction and internal fixation using three cannulated cancellous screws under fluoroscopic guidance, and were followed for eight months. Functional outcome was assessed using the Harris Hip Score at 24 weeks. Results: Most patients were in the 31–40 years age group (36%) and males constituted 64% of the study population. Road traffic accidents were the most common mode of injury (52%). Transcervical fractures were the most frequent anatomical type (56%) and Garden Type II fractures were the most common (44%). Fracture union was observed most commonly at 12 weeks (40%) with an average union time of 16 weeks. Functional outcome at 24 weeks showed excellent results in 72% of patients, good in 12%, fair in 8%, and poor in 8%. Most patients were able to ambulate without support (88%). Conclusion: Cannulated cancellous screw fixation provides satisfactory radiological union and functional outcome in fracture neck of femur.

67. Clinicopathological Study of Gall Bladder Specimen of Cholelithiasis
Shashi Ranjan Roy, Priya, K.M. Prasad, Dilip Kumar
Abstract
Background: Gallstone disease, one of the most common biliary disorders worldwide, is a major cause of morbidity in middle-aged women. When gallstones irritate the gallbladder mucosa, histopathological changes can range from chronic cholecystitis to malignancy. Clinicopathological and biochemical investigations can help identify disease origins and detect complications early. Methods: A prospective observational study was conducted at the Department of Pathology, Patna Medical College and Hospital (PMCH) during 2020–2021. A total of 100 gallstone-containing cholecystectomy specimens were examined. Gallstone biochemistry, morphology, histology, and clinical data were studied. Traditional histology stains and biochemical testing were performed. For the Chi-square and Student's t-tests, p-values below 0.05 were considered significant. Results: The average age of the 100 patients was 42.4 years (range 17-74 years); 78% were female and 22% male. Chronic cholecystitis was the most common histological finding (72%), followed by acute-on-chronic cholecystitis (12%), cholesterolosis (3%), xanthogranulomatous cholecystitis (3%), and adenocarcinoma (4%). Mixed gallstones (48%), pigment stones (28%), and cholesterol stones (24%) were the most common types. There was no correlation between stone type and histopathological pattern (χ² = 9.95, p = 0.445). Conclusion: Chronic cholecystitis is the most frequent pathology in cholelithiasis, particularly in middle-aged women. All cholecystectomy specimens must be histopathologically examined to improve clinical outcomes and detect incidental premalignant or malignant abnormalities.

68. Antimicrobial Susceptibility Patterns Across Clinical Isolates in A Tertiary Center
G. J. Archana, B. Archana, G. Sowjanya
Abstract
Background: Antimicrobial resistance (AMR) has become a major global public health concern, particularly in tertiary care hospitals where extensive antibiotic use and invasive procedures promote the emergence of multidrug-resistant organisms. Continuous surveillance of antimicrobial susceptibility patterns is essential for guiding empirical therapy, improving antimicrobial stewardship, and preventing the spread of resistant pathogens. Methods: A prospective observational study was conducted in the Department of Microbiology at Government Medical College, Quthbullapur, Medchal–Malkajgiri, from November 2025 to February 2026. Clinically significant bacterial isolates obtained from various clinical specimens including blood, pus, respiratory samples, urine, and genital specimens were included. Bacterial identification was performed using standard microbiological techniques. Antimicrobial susceptibility testing was carried out by the Kirby–Bauer disc diffusion method according to CLSI guidelines. Resistance mechanisms such as ESBL, carbapenemase production, methicillin resistance, and vancomycin resistance were identified using phenotypic methods. Results: Gram-negative organisms including Klebsiella pneumoniae, Escherichia coli, Acinetobacter baumannii, and Pseudomonas aeruginosa predominated among isolates. High resistance rates were observed for beta-lactams (72%), aminoglycosides (68%), and carbapenems (65%). Acinetobacter species showed the highest resistance prevalence (75%). Among Gram-positive bacteria, methicillin-resistant Staphylococcus aureus and vancomycin-resistant Enterococcus were notable. Colistin and linezolid retained comparatively better activity against multidrug-resistant isolates. Conclusion: The study highlights a high prevalence of multidrug-resistant pathogens in a tertiary care setting. 
Regular surveillance of antimicrobial susceptibility patterns and implementation of effective antimicrobial stewardship programs are crucial to optimize treatment and control the spread of resistant organisms.

69. Knowledge, Attitudes, and Patterns of Tobacco Use among Adolescents in Mathura District, Uttar Pradesh: A Cross-Sectional Study
Pawar Akshay Shahaji, Manoj Kumar Singh, Mukesh Bhut, Pankaj Kumar Jain
Abstract
Background: Tobacco use during adolescence is a major public health concern because initiation at a young age increases the risk of long-term nicotine dependence and adverse health outcomes. This study assessed the prevalence of tobacco use among adolescents and evaluated their knowledge and attitudes regarding tobacco use. Methods: This cross-sectional descriptive study included 300 adolescents aged 13–17 years in Mathura district, Uttar Pradesh. Participants were selected using a stratified random sampling method. Data were collected using a pretested and prevalidated structured questionnaire administered in English and Hindi. Descriptive statistics were used to summarise the data, and Pearson’s chi-square test was used to assess associations between categorical variables. A p value of less than 0.05 was considered statistically significant. Results: Ever use of tobacco was reported by 102 of 300 participants (34.0%). Chewable tobacco and cigarettes were the most commonly reported products, each used by 29 participants (9.7%), followed by smokeless tobacco products such as gutkha, mawa, and jarda in 20 participants (6.7%). Daily tobacco use was reported by 40 participants (13.3%). Tobacco use increased significantly with age, from 16.5% among those aged 13–14 years to 48.7% among those aged 17 years. No significant association was observed between sex and ever tobacco use. Friends or family members using tobacco were reported by 60.0% of participants, 85.7% reported tobacco products to be easily or very easily available, and media influence was reported by 70.7%. Observational exposure to tobacco use was significantly associated with ever tobacco use (p=0.036), whereas awareness of health risks and belief that tobacco is harmful were not significantly associated with tobacco use (p=1.00 for both). Conclusion: Tobacco use was common in this adolescent population despite substantial awareness of harm. 
Social exposure, easy availability, and increasing age appeared to be important correlates, indicating the need for focused preventive and cessation-oriented interventions for adolescents.

70. Prevalence, Pattern, and Spectacle Utilization of Refractive Errors among School-Going Children in Mathura: A Cross-Sectional Study
Ravi Soni, Paridhi Gupta, Nidhi Jain, Meemansha Maheshwari
Abstract
Background: Refractive errors are a common and potentially correctable cause of visual impairment in school-aged children. Early detection is important because uncorrected errors can affect visual function and school performance. Methods: This cross-sectional study included 129 school-going children aged 6-14 years from selected rural schools in Mathura. Visual acuity screening was performed using Snellen charts. Children with suspected visual impairment underwent autorefraction, retinoscopy, and cycloplegic refraction. Demographic data and information on prior diagnosis, spectacle ownership, spectacle use, and barriers to spectacle use were collected using a structured questionnaire. Data were analyzed using descriptive statistics, chi-square testing, and logistic regression. A p value of less than 0.05 was considered statistically significant. Results: Of 129 children, 67 (51.9%) were male and 62 (48.1%) were female. The overall prevalence of refractive errors was 19.4% (25/129). Myopia was the most common refractive error, affecting 16 children (64.0%), followed by astigmatism in 12 (48.0%) and hyperopia in 4 (16.0%). Prevalence increased significantly with age, from 7.3% in children aged 6-8 years to 32.6% in those aged 12-14 years (p=0.018). The difference by sex was not statistically significant (p=0.291). The mean spherical equivalent among affected children was -1.12 ± 1.65 D. Eighteen affected children (72.0%) were newly diagnosed during screening. Only 6 of 25 affected children (24.0%) owned spectacles, and regular use was reported by 2 of 6 children (33.3%) who owned them. Conclusion: Refractive errors were present in nearly one-fifth of school-going children in rural Mathura, with myopia as the predominant type. Older age was significantly associated with refractive errors. Underdiagnosis, low spectacle ownership, and poor regular spectacle use indicate an important unmet need for school-based vision screening and access to corrective services.

71. Bloodstream Infection Trends Before and After the COVID-19 Pandemic
Dipti Lal, Sushant Suman, Sanjay Kumar, Rajesh Kumar, Satyendu Sagar, Wasim Ahmad
Abstract
Background: The COVID-19 pandemic has had a substantial impact on the epidemiology of bloodstream infections (BSIs), as seen in the rising rates of infection and antibiotic resistance reported globally. Aims: To examine patterns of antibiotic resistance, pathogen distribution, and bloodstream infection trends before and after the COVID-19 pandemic. Methods: Three thousand patients (1500 pre-COVID and 1500 post-COVID) were included in a retrospective cohort study at a tertiary care hospital in Bihar, India. Blood culture data were examined for positivity rates, microbiological profiles, and antibiotic resistance. SPSS and WHONET were used for statistical analysis, and p < 0.05 was deemed significant. Results: Blood culture positivity showed a rising trend over time, from 28% pre-COVID to 34% post-COVID. Escherichia coli, Klebsiella pneumoniae, and Pseudomonas aeruginosa were the most frequently isolated gram-negative organisms. The post-COVID era saw a rise in infections linked to intensive care units. Antimicrobial resistance and multidrug-resistant (MDR) isolates increased significantly (from 35.7% to 47%). There was also a minor rise in fungal infections due to Candida species. Conclusion: The study shows that bloodstream infections and antibiotic resistance have significantly increased in the post-COVID period, especially in intensive care units. To combat the rising incidence of BSIs and AMR, it is crucial to strengthen infection control protocols, antibiotic stewardship, and ongoing surveillance.

72. Comparative Antibiotic Sensitivity Patterns in Clinical vs Environmental E. coli Isolates
Dipti Lal, Sushant Suman, Sanjay Kumar, Rajesh Kumar, Satyendu Sagar, Babita
Abstract
Background: Escherichia coli is a significant contributor to human infections and a key indicator of environmental fecal pollution. Effective treatment and public health surveillance are at risk due to increasing antimicrobial resistance (AMR), especially multidrug resistance (MDR), which has been extensively documented in clinical and environmental E. coli. This investigation compared the Multiple Antibiotic Resistance (MAR) index, MDR prevalence, and antibiotic sensitivity patterns of clinical and environmental E. coli isolates from a single tertiary-care setting. Materials & Methods: From February 2025 to January 2026, a cross-sectional, laboratory-based comparative investigation was carried out at Nalanda Medical College and Hospital in Patna, Bihar. A total of 200 non-duplicate E. coli isolates were examined: 100 from environmental sources (water, soil, hospital surfaces, and food/animal products) and 100 from clinical specimens (urine, pus/wound swabs, stool, and blood). Identification was carried out using commercial techniques or routine biochemical testing after culture on MacConkey/EMB agar. In accordance with CLSI/EUCAST recommendations, antimicrobial susceptibility testing was performed using Kirby-Bauer disc diffusion on Mueller-Hinton agar. MDR was defined as non-susceptibility to at least one agent in at least three antimicrobial classes. The MAR index was determined by dividing the number of antibiotics to which an isolate was resistant by the total number of antibiotics tested. Chi-square tests were used to analyze the data, and p<0.05 was deemed significant. Results: Susceptibility of clinical isolates to first-line drugs such as ampicillin (30% vs. 55%), amoxicillin–clavulanate (45% vs. 65%), ceftriaxone (50% vs. 70%), ciprofloxacin (40% vs. 68%), and gentamicin (65% vs. 80%) was significantly lower than that of environmental isolates (p<0.05 for all).
Both groups remained highly susceptible to more expensive agents such as amikacin, nitrofurantoin, and imipenem, with no statistically significant differences. MDR was more common in clinical isolates than environmental isolates (55% vs. 30%, p<0.05), and resistance rates were consistently greater among clinical isolates. Clinical isolates were more likely to have MAR index values >0.5 (40% vs. 15%), while environmental isolates were more likely to have low MAR values (<0.2) (50% vs. 20%). Conclusion: Compared to environmental isolates, clinical E. coli isolates showed significantly higher resistance, MDR prevalence, and MAR indices, indicating increased antibiotic selection pressure in healthcare settings. However, notable MDR in environmental isolates highlights the importance of prudent antibiotic usage and integrated One Health surveillance to prevent the spread of resistant E. coli across clinical and environmental reservoirs.
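The MAR index used in this abstract is simple arithmetic (resistant count over tested count, per Krumperman's convention). A minimal sketch in Python; the function name and the illustrative isolate counts are ours, not taken from the study:

```python
def mar_index(resistant: int, tested: int) -> float:
    """Multiple Antibiotic Resistance (MAR) index: the number of
    antibiotics an isolate is resistant to, divided by the number tested."""
    if tested <= 0:
        raise ValueError("at least one antibiotic must be tested")
    if not 0 <= resistant <= tested:
        raise ValueError("resistant count must lie between 0 and tested")
    return resistant / tested

# Hypothetical isolates: resistant to 7 of 12 vs. 2 of 12 tested antibiotics
print(round(mar_index(7, 12), 2))  # 0.58 — in the >0.5 band more common among clinical isolates
print(round(mar_index(2, 12), 2))  # 0.17 — in the <0.2 band more common among environmental isolates
```

A MAR index above 0.2 is conventionally read as exposure to a high-risk source of antibiotic contamination, which is why the abstract's >0.5 versus <0.2 contrast is informative.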

73. Spectrum of Lymph Node by using FNAC in population of West Champaran of Bihar
Rimjhim Kumari, Rabindra Nath Prasad, Rakhi Kumari, Pradeep Kumar Singh
Abstract
Background: A common clinical issue, lymphadenopathy has a broad range of causes, from benign reactive conditions to infections and cancers. Fine-needle aspiration cytology (FNAC) is a quick, easy, and affordable first-line diagnostic method for assessing lymph node enlargements, particularly in settings with limited resources. There are few regional data from West Champaran, Bihar. The objective was to examine the cytomorphological spectrum of lymph node lesions by FNAC in patients visiting the outpatient department (OPD) of Government Medical College in Bettiah, West Champaran, Bihar, and to analyze patterns pertaining to age, sex, and lymphadenopathy site. Methods: From March 30, 2025, to February 28, 2026, the Department of Pathology at Government Medical College in Bettiah conducted this prospective, observational study. A total of 117 consecutive patients with palpable lymphadenopathy were included. FNAC was carried out under aseptic conditions with 22–23 G needles; smears were stained with Papanicolaou, May–Grünwald–Giemsa, and Ziehl–Neelsen stains when tuberculosis was suspected. The Sydney System was used to classify cases as inadequate (L1), benign (L2), atypical (L3), suspicious (L4), and malignant (L5). Descriptive analysis was done on the data. Results: The 117 patients had a nearly equal sex distribution and ranged in age from 3 to 82 years (mean ≈ 32 years). As in other Indian series, cervical lymph nodes were the most commonly affected (approximately 70%), followed by axillary and inguinal nodes. Reactive lymphadenitis (~41%), tuberculous/granulomatous lymphadenitis (~33%), and suppurative lymphadenitis (~6%) were the most frequently diagnosed benign lesions. Consistent with previous reports, malignant lesions accounted for approximately 10% of cases, with metastatic carcinoma outnumbering lymphoma and the majority of malignancies occurring in patients over 40. Only about 4% of smears were inadequate or non-diagnostic.
Conclusion: FNAC shows that the most common causes of lymphadenopathy in the West Champaran population are reactive and tuberculous lymphadenitis, with metastatic cancer and lymphoma making up a smaller but clinically significant percentage. The results support national trends and demonstrate FNAC as a crucial, minimally invasive, and reasonably priced first-line test for lymphadenopathy triaging in this resource-constrained area.

74. Study of Association of Proteinuria with HbA1c in Diabetes Mellitus
Rimjhim Kumari, Rakhi Kumari, Rabindra Nath Prasad, Pradeep Kumar Singh
Abstract
Proteinuria is an early indicator of diabetic nephropathy, one of the microvascular complications caused by persistent hyperglycemia in type 2 diabetes mellitus (T2DM). The purpose of this hospital-based cross-sectional study was to assess the relationship between proteinuria and glycated hemoglobin (HbA1c) in T2DM patients enrolled at Government Medical College, Bettiah, Bihar. Aim: The purpose of this study is to examine the relationship between proteinuria and HbA1c levels in patients with type 2 diabetes mellitus (T2DM) who are receiving treatment at a tertiary care hospital in Bettiah, Bihar. Materials & Methods: This hospital-based cross-sectional observational study was conducted in the Department of Medicine, Government Medical College, Bettiah, from 30 March 2025 to 28 February 2026, and included 100 T2DM patients (≥18 years) attending outpatient and inpatient services. Patients with non-diabetic kidney disease, acute kidney injury, urinary tract infection, nephrotic syndrome, pregnancy, malignancy, congestive heart failure, chronic liver disease, or on nephrotoxic drugs were excluded. Clinical data (age, sex, duration of diabetes, treatment, blood pressure, comorbidities) were recorded. Fasting and, when available, postprandial blood glucose, HbA1c by standardized immunoassay, and serum urea/creatinine were measured. Proteinuria was assessed using urinary albumin/albumin–creatinine ratio or 24-hour urine protein, with dipstick/spot tests for screening. Patients were categorized by HbA1c (<7%, 7–8.9%, ≥9%) and by proteinuria status (normoalbuminuria vs micro/macroalbuminuria). Appropriate statistical tests were applied; p<0.05 was considered significant. Ethical approval and written informed consent were obtained. Results: Mean age was 52±8 years; 62% were male. Mean diabetes duration was 6.2±2.5 years and mean HbA1c 8.1±1.2%. HbA1c showed a weak to moderate positive correlation with proteinuria (r≈0.17–0.45, p<0.05 to <0.001).
Mean HbA1c was ~9% in proteinuric vs ~7.5% in non-proteinuric patients (p<0.001). Microalbuminuria prevalence increased from about 15–20% at HbA1c <7% to 50–75% at HbA1c ≥8%, reaching up to 73.3% at HbA1c ≥9%. Higher HbA1c was associated with increased serum creatinine and urea and reduced eGFR. Conclusion: HbA1c is a useful surrogate marker for early diabetic nephropathy and renal risk stratification, as evidenced by the significant correlation between poor glycemic control and higher prevalence and severity of proteinuria. To prevent or postpone diabetic kidney disease, it is essential to maintain an HbA1c of less than 7% and to regularly screen for albuminuria, particularly in patients with elevated HbA1c.

75. Early Detection of Cerebral Palsy in Infants and Young Children Using the Denver Developmental Screening Test (DDST-II): A Hospital-Based Prospective Observational Study from Eastern India
Alok Ranjan, Ravi Shekhar, Satish Kumar, Ankur Priyadarshi
Abstract
Background: Cerebral palsy (CP) is a leading cause of childhood motor disability, and earlier identification enables timely referral to targeted early intervention during peak neuroplasticity. In many low- and middle-resource settings, access to specialized tools (e.g., General Movements Assessment or structured neurological examinations) may be limited, increasing the importance of feasible developmental screening approaches in routine pediatric services. Aim: To evaluate the clinical utility and diagnostic performance of DDST-II for early detection of CP among infants and young children attending a tertiary-care teaching hospital. Methods: This prospective observational study enrolled 115 infants/young children (0–24 months) attending pediatric services at Jawaharlal Nehru Medical College & Hospital, Bhagalpur, Bihar, India, between 25 February 2025 and 30 January 2026. DDST-II screening was performed by trained examiners across four domains. “Suspect/Untestable” screens were considered positive. CP diagnosis was confirmed by pediatric neurology assessment with supportive clinical/imaging correlation where available. Diagnostic indices and multivariable logistic regression were performed. Results: CP was confirmed in 28/115 (24.3%) children. DDST-II screen positivity was observed in 40/115 (34.8%). Screen positivity demonstrated sensitivity 85.7%, specificity 81.6%, PPV 60.0%, and NPV 94.7% for CP detection. Gross motor delay predominated among CP cases (Figure 1). In multivariable analysis, NICU admission, low birth weight, neonatal seizures, and birth asphyxia were independently associated with CP (Table 4). Conclusion: DDST-II, when integrated into a structured referral pathway, showed high sensitivity and strong rule-out value (high NPV) for CP detection. In resource-constrained settings, DDST-II can support earlier identification of high-risk children and prompt referral for confirmatory assessment and early intervention.
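The DDST-II indices reported above follow from a standard 2×2 screening table. A brief sketch in Python; the cell counts (TP=24, FP=16, FN=4, TN=71) are our back-calculation from the abstract's 28 confirmed CP cases and 40 positive screens among 115 children, not values stated by the authors:

```python
def diagnostic_indices(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard screening-test indices from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among children with CP
        "specificity": tn / (tn + fp),  # true negatives among children without CP
        "ppv": tp / (tp + fp),          # probability of CP given a positive screen
        "npv": tn / (tn + fn),          # probability of no CP given a negative screen
    }

# Cell counts inferred from the abstract (assumption, not reported directly)
idx = diagnostic_indices(tp=24, fp=16, fn=4, tn=71)
print({k: round(v * 100, 1) for k, v in idx.items()})
# → {'sensitivity': 85.7, 'specificity': 81.6, 'ppv': 60.0, 'npv': 94.7}
```

The output matches the abstract's reported figures, illustrating why a high NPV (94.7%) supports the "rule-out" role claimed for DDST-II despite its modest PPV.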

76. Effectiveness of Non-Pharmacological Interventions (Oral Glucose and Non-Nutritive Sucking) on Procedural Pain in Neonates: A Randomized Controlled Trial
Alok Ranjan, Ravi Shekhar, Satish Kumar, Ankur Priyadarshi
Abstract
Background: Neonates undergo repeated minor painful procedures (heel lance, venipuncture) during early hospital care. Untreated pain is associated with physiologic instability and potential adverse neurodevelopmental consequences. Sweet-tasting solutions (sucrose/glucose) and non-nutritive sucking (NNS) are widely recommended non-pharmacological analgesic options, yet comparative effectiveness and pragmatic implementation data from Indian tertiary-care contexts remain limited. Aim: To compare oral glucose, NNS, and combined glucose+NNS versus routine comfort measures in reducing procedural pain in neonates. Methods: Single-center, parallel-group randomized controlled trial including 100 neonates requiring heel lance or venipuncture. Participants were randomized (1:1:1:1) into: (i) routine comfort, (ii) oral glucose 25%, (iii) NNS, (iv) glucose+NNS. Pain was assessed by the Premature Infant Pain Profile-Revised (PIPP-R) at pre-procedure, during the procedure, 30 seconds, and 60 seconds. Primary outcome: PIPP-R at 30 seconds. Secondary outcomes: crying duration, heart rate (HR) change, oxygen saturation (SpO₂) change, and adverse events. Results: Baseline characteristics were comparable across groups (Table 1). Mean PIPP-R at 30 seconds differed significantly across groups (one-way ANOVA p < 0.001), lowest in glucose+NNS. Crying duration and physiologic reactivity also favored glucose+NNS (all p < 0.001). Repeated-measures trajectories demonstrated faster pain resolution with the combined intervention. No serious adverse events were observed; minor gagging/desaturation was rare and self-limited. Conclusion: Oral glucose and NNS are effective non-pharmacological analgesic strategies for minor neonatal procedures. Combined glucose+NNS provides superior analgesia and quicker recovery compared with either intervention alone.

77. Growth Outcomes and Feeding Tolerance in Preterm Infants: A Comparison Between Fortified Human Milk and Preterm Formula
Ravi Shekhar, Alok Ranjan, Satish Kumar, Ankur Priyadarshi
Abstract
Background: Preterm infants frequently experience extrauterine growth restriction and feeding intolerance. Human milk is biologically advantageous, but unfortified milk may not meet nutrient needs; hence fortification is recommended to improve growth while preserving gastrointestinal tolerance and protection against necrotizing enterocolitis (NEC) and infection. However, in resource-variable settings, preterm formula remains common, and comparative outcomes in routine NICU practice require context-specific evaluation. Aim: To compare growth outcomes and feeding tolerance in preterm infants fed fortified human milk (FHM) versus preterm formula (PF). Methods: A prospective comparative cohort study was conducted in the NICU of Jawaharlal Nehru Medical College & Hospital, Bhagalpur, enrolling 115 preterm infants during 10 Feb 2025–25 Jan 2026. Infants received either FHM (mother’s expressed milk fortified per unit protocol) or PF. Primary outcomes were time to full enteral feeds and feeding intolerance. Secondary outcomes included growth velocities, NEC (≥stage II), late-onset sepsis, length of stay, and discharge anthropometry. Multivariable regression adjusted for gestational age, birthweight, SGA status, and sepsis. Results: Of 115 infants, 60 received FHM and 55 received PF. Baseline characteristics were comparable. FHM achieved earlier full feeds (9.75 ± 3.18 vs 13.60 ± 3.43 days; p<0.001) and fewer intolerance episodes (median 1.0 vs 2.0; p<0.001). Feeds held ≥24h were lower with FHM (23.3% vs 50.9%; p=0.004). NEC ≥II (3.3% vs 10.9%; p=0.150) trended lower with FHM. PF showed higher unadjusted weight gain velocity (16.60 ± 2.55 vs 15.50 ± 2.80 g/kg/day; p=0.029), while FHM showed better length gain (1.08 ± 0.19 vs 0.95 ± 0.19 cm/week; p=0.001). In adjusted analysis, PF was not independently associated with higher weight gain (β 0.95, 95% CI −0.11 to 2.01; p=0.079), but remained associated with more intolerance episodes (IRR 1.67, 95% CI 1.25–2.21; p<0.001). 
Conclusion: In this cohort, fortified human milk improved feeding tolerance and accelerated attainment of full feeds, with comparable adjusted weight gain and signals toward reduced morbidity. These findings support guideline-concordant prioritization of human milk with appropriate fortification in preterm care.

78. Impact of Early Caffeine Therapy on White Matter Development in Extremely Low Birth Weight Infants: A Prospective Cohort Study from a Tertiary Care Center in Eastern India
Ravi Shekhar, Alok Ranjan, Satish Kumar, Ankur Priyadarshi
Abstract
Background: Extremely low birth weight (ELBW) infants are at high risk of diffuse white matter injury and dysmaturation, which contribute substantially to later neurodevelopmental impairment. Diffusion tensor imaging (DTI) at term-equivalent age (TEA) provides sensitive microstructural biomarkers of white matter maturation. Aim: To evaluate the association between early caffeine therapy (≤24 h of life) and white matter microstructural development at TEA in ELBW infants. Methods: Prospective cohort study conducted at Jawaharlal Nehru Medical College & Hospital, Bhagalpur, Bihar, India (10 February 2025–25 January 2026). ELBW infants receiving caffeine were grouped as early (≤24 h) vs late (>24 h) initiation. Standard caffeine citrate regimen was used (loading 20 mg/kg; maintenance 5–10 mg/kg/day). TEA MRI with DTI was performed where feasible. Primary outcome: TEA DTI white matter composite fractional anisotropy (FA) z-score; secondary outcomes included regional FA, mean diffusivity (MD), severe white matter injury (WMI) on qualitative MRI scoring, and major neonatal morbidities. Multivariable regression adjusted for gestational age, birth weight, sex, antenatal steroids, BPD, severe IVH, and late-onset sepsis. Results: Among 115 ELBW infants (early n=60; late n=55), TEA MRI/DTI was obtained in 100. In adjusted analyses, early caffeine was associated with higher FA composite (adjusted β≈0.31) and lower MD composite (adjusted β≈−0.43), with strongest regional effects in the posterior limb of the internal capsule. Early caffeine was also associated with shorter ventilation duration and lower severe IVH. Conclusion: Early caffeine therapy may be associated with improved TEA white matter microstructure in ELBW infants. Randomized trials and robust causal inference approaches are needed to confirm neuroprotective effects and identify optimal timing/dose.

79. Role of Vitamin D in Health and Diseases in Children: A Hospital-Based Observational Study
Ravi Shekhar, Alok Ranjan, Satish Kumar, Ankur Priyadarshi
Abstract
Background: Vitamin D is essential for skeletal mineralization and has immunomodulatory effects that may influence infections, wheeze/asthma, anemia, and growth in children. Despite abundant sunlight, vitamin D deficiency remains common in South Asia. Aim: To estimate vitamin D status in children attending JNMCH Bhagalpur and evaluate associations with selected clinical and biochemical outcomes. Methods: Hospital-based observational study of 120 children (1–18 years) recruited from 20 February 2025 to 15 January 2026. Demographic, dietary and sunlight exposure history, anthropometry, and clinical assessment were recorded. Serum 25-hydroxyvitamin D [25(OH)D] and relevant biochemical markers were assessed. Vitamin D categories were defined using standard pediatric cut-offs. Associations with recurrent acute respiratory infections (ARI), wheeze/asthma, anemia, and clinical rickets signs were evaluated using bivariate tests and multivariable logistic regression. Results: Mean 25(OH)D was 17.6 ± 9.5 ng/mL; 63.3% had 25(OH)D <20 ng/mL. Deficiency was higher in winter (76.7% vs 48.3%, p=0.0026). Severe deficiency was strongly associated with clinical rickets signs (p<0.001) and higher alkaline phosphatase. Hemoglobin differed significantly across vitamin D categories (ANOVA p=0.0012), and vitamin D deficiency independently predicted anemia (adjusted OR 3.75; 95% CI 1.48–9.53; p=0.005). Conclusion: Vitamin D deficiency was highly prevalent in this hospital-based pediatric sample, with clinically meaningful associations with rickets phenotype and anemia. Targeted screening and guideline-based supplementation for high-risk children may be warranted.

80. Total Thyroidectomy versus Hemithyroidectomy: A Comparative Study of Complications and Surgical Outcomes
Syeda Ayesha, Syeda Nahidunnisa, Heeba Mohammed Ghouse, Humaira Shaikh
Abstract
Introduction: Thyroidectomy is a commonly performed surgical procedure for the management of benign and malignant thyroid disorders. The two principal approaches, total thyroidectomy and hemithyroidectomy, differ in extent of resection and are associated with varying complication profiles. Understanding these differences is essential for optimal surgical decision-making. This study aimed to compare the complication rates between total thyroidectomy and hemithyroidectomy and to evaluate associated operative and postoperative outcomes. Materials and Methods: This hospital-based comparative observational study was conducted in the Department of ENT and Head and Neck Surgery at Deccan College of Medical Sciences, Hyderabad, from January 2025 to December 2025. A total of 50 patients undergoing thyroid surgery were included and divided into two groups: total thyroidectomy (n=25) and hemithyroidectomy (n=25). Demographic and clinical variables were recorded. Postoperative complications including hypocalcaemia, recurrent laryngeal nerve injury, haemorrhage, and wound infection were assessed. Statistical analysis was performed using SPSS version 26.0, with p<0.05 considered significant. Results: The overall complication rate was significantly higher in the total thyroidectomy group (44.0%) compared to the hemithyroidectomy group (20.0%) (p=0.04). Transient hypocalcaemia was significantly more frequent following total thyroidectomy (32.0% vs. 4.0%, p=0.01). No significant differences were observed in permanent hypocalcaemia, recurrent laryngeal nerve injury, haemorrhage, or wound infection. The mean duration of surgery and length of hospital stay were significantly greater in the total thyroidectomy group (p<0.001). Conclusion: Total thyroidectomy is associated with a higher complication rate, particularly hypocalcaemia, along with increased operative time and hospital stay compared to hemithyroidectomy.
Careful patient selection and surgical planning are essential to balance treatment efficacy and safety.

81. Microbiological Study of Septicemia in a Tertiary Care Teaching Hospital, Kachchh
Ronak Pradipbhai Chauhan, Hitesh Assudani, Krupali Kothari
Abstract
Background: Septicemia remains a major cause of morbidity and mortality worldwide, with increasing concerns regarding antimicrobial resistance. Aim: To identify the common bacterial pathogens associated with septicemia in adult patients and analyze their antibiotic susceptibility patterns. Methods: A hospital-based prospective observational study was conducted on 150 clinically suspected adult septicemia cases. Blood cultures were processed using standard microbiological techniques; identification of the isolated organisms and antimicrobial susceptibility testing were performed using the VITEK 2 system. Results: Culture positivity was observed in 73.3% of cases. Gram-negative organisms (60%) predominated, with Escherichia coli (29.1%) being the most common isolate. High resistance was observed to commonly used antibiotics, including gentamicin and aztreonam, along with significant resistance to carbapenems. Conclusion: The study highlights a predominance of Gram-negative pathogens and an alarming rise in antimicrobial resistance in septicemia, emphasizing the need for continuous surveillance and rational antibiotic use.

82. Prevalence of Non-Alcoholic Fatty Liver Disease in Type 2 Diabetes Mellitus and Its Association with Diabetic Complications: A Cross-Sectional Study
Madhuri Mangharam Alwani, Komal Rana, Renukaben Maheshbhai Vasava, Kavyakumar Pareshkumar Patel
Abstract
Background: Non-alcoholic fatty liver disease (NAFLD) is increasingly recognized as a major comorbidity in patients with type 2 diabetes mellitus (T2DM), contributing significantly to both hepatic and extrahepatic complications. Aim: To determine the prevalence of NAFLD in patients with T2DM and to evaluate its correlation with associated complications and metabolic risk factors. Methods: A hospital-based cross-sectional study was conducted on 150 patients with T2DM. Clinical, biochemical, and ultrasonographic evaluations were performed to diagnose NAFLD. Statistical analysis was carried out using SPSS version 25.0, and associations were tested using Chi-square and t-tests, with p < 0.05 considered significant. Results: The prevalence of NAFLD was 62.0%. Higher prevalence was observed in the 51–60 years age group (65.4%). NAFLD showed significant association with central obesity (73.2%, p = 0.003), elevated ALT levels (74.4%, p = 0.002), and metabolic syndrome (77.8%, p < 0.001), while no significant association was found with gender (p = 0.532). Conclusion: NAFLD is highly prevalent among patients with T2DM and is strongly associated with metabolic risk factors. Early screening and comprehensive management strategies are essential to prevent disease progression.

83. A Profile of Clinical Features and Outcomes in Snakebite Envenomation Patients at Bheri Hospital, Nepal
Sanket Kumar Risal, Urmila Parajuli, Dinesh Kumar Choudhary, Paras Shrestha
Abstract
Snakebite envenomation is a major public health issue in many parts of the world, especially in rural regions where access to medical care is limited. Nepal, with its vast rural areas and diverse ecosystems, is particularly vulnerable to snakebite incidents, which cause significant morbidity and mortality. Despite being a critical health problem, there remains a scarcity of comprehensive, localized data from specific hospitals, particularly Bheri Hospital in Nepal, which serves a large and diverse population. This review aims to assess the clinical features and outcomes of snakebite envenomation patients treated at Bheri Hospital, with the goal of providing a detailed understanding of the disease patterns, treatment efficacy, and potential improvements in healthcare practices.
Bheri Hospital, located in the Banke district of western Nepal, is a central healthcare facility in a region where snakebites are prevalent due to the rural setting and proximity to habitats of venomous snakes. Although there are studies on snakebites in other parts of Nepal and South Asia, specific data from Bheri Hospital is limited. Understanding the local epidemiology of snakebites, including the species involved, the timing of treatment, and the clinical presentations, is essential for improving patient outcomes. By reviewing the available clinical records, case reports, and patient outcomes, this study seeks to provide a comprehensive assessment of the hospital’s approach to snakebite envenomation.
The review will focus on key clinical features of snakebites, such as local symptoms (pain, swelling, necrosis), systemic manifestations (hemotoxicity, neurotoxicity), and complications like renal failure, coagulopathy, and shock. Identifying the most common snake species responsible for envenomation in the region, the severity of their bites, and the resulting clinical manifestations is vital for tailoring effective treatment protocols. Given the diversity of snake species in Nepal, there is a need to assess whether specific regional characteristics affect clinical outcomes, such as the prevalence of bites from species like the cobra, krait, or pit viper, each of which has a different venom composition and clinical presentation.
Another important aspect of the review is to evaluate the outcomes of snakebite victims in relation to the timeliness and appropriateness of medical interventions. In many rural areas, snakebite victims face delays in receiving treatment due to geographic isolation, lack of awareness, and insufficient medical resources. This review will assess the average time to treatment, the use of antivenom, and any challenges encountered in the management of these patients. It will also evaluate whether the hospital’s resources, including the availability of antivenom and the capacity of healthcare workers to administer appropriate care, influence patient outcomes.
Furthermore, this review will aim to identify potential gaps in care and areas for improvement in the clinical management of snakebite envenomation at Bheri Hospital. By analyzing trends in treatment delays, complications, and mortality rates, the study will highlight areas where improvements in healthcare infrastructure, staff training, and resource availability could enhance patient care. For example, if delays in the administration of antivenom are found to correlate with worse outcomes, recommendations can be made for improving access to treatment or raising public awareness about the importance of seeking prompt medical care after a snakebite.

84. Spectrum of Anaemia Cases in a Tertiary Care Hospital
Rimjhim Kumari, Rakhi Kumari, Rabindra Nath Prasad, Pradeep Kumar Singh
Abstract
Introduction: Anaemia remains a major public health problem in India, with diverse morphological patterns and aetiologies that vary across clinical settings. Hospital-based research, especially from tertiary care facilities, offers crucial information about the range of anaemia severity, underlying causes, and diagnostic correlations. Aims: To determine the distribution of anaemia by aetiology, morphological type, and severity, and to investigate the relationship between red cell morphology and underlying causes in patients receiving tertiary care. Materials & Methods: This cross-sectional study was conducted at Government Medical College, Bettiah. A total of 384 patients diagnosed with anaemia were included. Based on haemoglobin levels, anaemia was categorized as mild, moderate, or severe. Red cell indices and peripheral smear analysis were used for morphological classification (microcytic, normocytic, and macrocytic). Relevant laboratory tests, such as vitamin assays and iron studies, were used to determine the aetiology. Data were analysed to identify distribution patterns and associations between morphology, severity, and aetiology. Results: The most prevalent severity category was moderate anaemia (51.6%), followed by severe anaemia (21.9%) and mild anaemia. The most common morphological type was microcytic anaemia (56.8%), followed by normocytic anaemia (31.8%) and macrocytic anaemia (11.5%). The most common cause was iron deficiency (58.3%), followed by vitamin B12 deficiency (16.1%), anaemia of chronic disease (13.5%), folate deficiency (4.7%), and other causes. Strong associations were observed between microcytic morphology and iron deficiency, macrocytic morphology and vitamin B12/folate deficiency, and normocytic morphology and chronic disease or haemolysis. Moderate anaemia predominated across most morphological and aetiological categories.
Conclusion: In this tertiary-care population, iron deficiency, microcytic morphology, and moderate anaemia were the most common patterns. The relatively high proportions of severe and macrocytic anaemia likely reflect referral bias toward more complex cases. The strong correlation between morphology and aetiology underscores the value of red cell indices and peripheral smear examination as first-line tools in the initial diagnostic evaluation of anaemia.

85. Assessment of Quality of Antenatal Care Services in Public Health Facilities Using Donabedian Model
Rajeev Kumar Ranjan, Vijay Kumar, Aamir Saeed, Surendra Prasad Singh
Abstract
Background/Introduction: This study aimed to evaluate the quality of antenatal care (ANC) services in a tertiary care public hospital in Bihar using the Donabedian model (structure, process, outcome), and to determine the obstetric and sociodemographic factors associated with the adequacy of ANC. Materials and Methods: A cross-sectional study was carried out at Government Medical College, Bettiah, Bihar, from February 15 to August 25, 2025. Eighty pregnant or recently delivered women (within six weeks postpartum) who were either admitted to obstetric wards or attended the ANC clinic were enrolled consecutively. Data were gathered from Mother and Child Protection cards, medical records, and a pre-tested semi-structured questionnaire based on the Donabedian framework and WHO ANC guidelines. Both structural (staff, medications, vaccines, equipment) and process (number of visits, examinations, laboratory tests, counseling) indicators were recorded. Data were analyzed in SPSS using frequencies, percentages, means, and Chi-square tests; p<0.05 was considered significant. Results: The majority of participants were multigravida (57.5%), from rural areas (65%), and between the ages of 20 and 24 (37.5%). IFA tablets (85%), TT vaccine (87.5%), and paramedical personnel (90%) were readily available; 80% of respondents indicated that doctors were available. BP measurement (92.5%) and weight recording (87.5%) were the most commonly performed process indicators, whereas Hb estimation (72.5%), urine examination (65%), and counseling (60%) were less common. Only 52.5% had at least four ANC visits. Higher education (p=0.02) and urban residence (p=0.04) were significantly associated with adequate ANC. Conclusion: Despite acceptable structural readiness, important ANC process elements, such as laboratory testing, counseling, and the recommended number of visits, were subpar. Rural residence and low education were two important factors contributing to insufficient ANC.
Strengthening maternal health outcomes requires focused interventions to enhance process quality and lessen educational and rural disparities.

86. An Observational Study to Detect Antibiotic Resistance in the Isolates from Middle Ear Infections with Special Reference to MRSA, ESBL and MBL Producing Organisms in North Karnataka
Pramod Sambrani, Mahesh Kumar S., Namratha W. Nandihal, Anubhav Sinha, Rejinold T. I.
Abstract
Aim: To detect antibiotic resistance in the isolates from middle ear infections with special reference to MRSA, ESBL and MBL producing organisms. Materials and Methods: A total of 140 ear swab samples meeting the inclusion criteria were processed in the Department of Microbiology, KMCRI, Hubballi. Pus samples were collected from the external auditory canal using sterile cotton swabs and cultured on appropriate microbiological media following standard laboratory procedures. The bacterial isolates were identified using standard microbiological techniques. Antibiotic susceptibility testing was performed and interpreted according to the CLSI guidelines (36th edition). Results: Out of 140 ear swab samples, Staphylococcus aureus (52.1%, n = 73) was the most common isolate, followed by Pseudomonas spp. (23.6%, n = 33), Klebsiella spp. (10.7%, n = 15) and Escherichia coli (7.1%, n = 10), while other organisms constituted 6.4% (n = 9). Among the Staphylococcus aureus isolates, MRSA accounted for 38.3% (n = 28) while MSSA accounted for 61.7% (n = 45). Among the Gram-negative isolates, ESBL production was detected in 27.5% (n = 18) of isolates, while no isolates showed MBL production (0%). Conclusion: Staphylococcus aureus was the predominant pathogen isolated from middle ear infections, followed by Pseudomonas spp., Klebsiella spp., and Escherichia coli. A considerable proportion of Staphylococcus aureus isolates were identified as MRSA, and ESBL production was observed among Gram-negative isolates, while no MBL producers were detected. Continuous surveillance of bacterial pathogens and their antibiotic resistance patterns is essential for guiding appropriate empirical therapy, improving treatment outcomes, and preventing the emergence of antimicrobial resistance.

87. Comparative Evaluation of Intravenous Dexmedetomidine versus Fentanyl for Attenuation of Haemodynamic Response during Laryngoscopy and Endotracheal Intubation: A Randomized Comparative Study
Shreya Soni, Anju Verma, Sunil Raghuvanshi
Abstract
Background: Laryngoscopy and endotracheal intubation produce a transient sympathoadrenal response that may manifest as tachycardia, hypertension and increased myocardial oxygen demand. Although these changes are often tolerated by healthy individuals, they can be clinically important in patients with limited cardiovascular reserve. Aim: To compare the efficacy of intravenous dexmedetomidine and intravenous fentanyl in attenuating haemodynamic responses during laryngoscopy and endotracheal intubation. Methods: This prospective randomized comparative study was conducted in sixty adult patients of ASA physical status I–II undergoing elective surgery under general anaesthesia. Patients were allocated into Group D, receiving dexmedetomidine 1 µg/kg diluted in 100 mL normal saline over 10 minutes before induction, and Group F, receiving fentanyl 2 µg/kg as a slow intravenous bolus 3 minutes before induction. Heart rate, systolic blood pressure, diastolic blood pressure and mean arterial pressure were recorded at baseline, after study drug administration, at intubation, and at 1, 3, 5 and 10 minutes after intubation. Results: The demographic characteristics were comparable between the two groups (p > 0.05). Following administration of the study drug, a significant reduction in heart rate and mean arterial pressure was observed in the dexmedetomidine group compared to the fentanyl group (p < 0.05). At the time of laryngoscopy and intubation, the fentanyl group demonstrated a marked increase in heart rate (98.6 ± 14.8 bpm) and mean arterial pressure (108.9 ± 14.6 mmHg), whereas the dexmedetomidine group showed minimal changes from baseline (76.2 ± 10.3 bpm and 92.5 ± 11.2 mmHg respectively), a difference that was statistically significant (p < 0.001). The attenuation of the haemodynamic response in the dexmedetomidine group was sustained up to 10 minutes post-intubation.
Bradycardia was more frequently observed in the dexmedetomidine group, while nausea and vomiting were more common in the fentanyl group; however, these differences were not statistically significant. Conclusion: Intravenous dexmedetomidine at a dose of 1 µg/kg is significantly more effective than fentanyl 2 µg/kg in attenuating the haemodynamic response to laryngoscopy and endotracheal intubation. It provides superior control of heart rate and blood pressure with sustained effects, thereby ensuring better perioperative haemodynamic stability. Although associated with mild bradycardia, dexmedetomidine remains a safe and preferable agent, especially in patients where haemodynamic fluctuations may be detrimental.

88. Histopathological Spectrum of Eyelid and Conjunctival Lesions: A Retrospective Study from a Tertiary Care Center
Parth Bhargavi V., Shah Khushi R., Kuchhadiya Mittal G., Shah Nitee S., Shah Surbhi S.
Abstract
Introduction: Eyelid and conjunctival lesions encompass a wide spectrum of benign and malignant conditions. Histopathological examination remains the gold standard for definitive diagnosis and guides appropriate management. Methods: A retrospective study was conducted in the Department of Ophthalmology at a tertiary care hospital over a period of two years. Histopathological reports of 70 patients with eyelid and conjunctival lesions were analyzed. Data regarding age, gender, lesion site, and histopathological diagnosis were collected. Tissue samples were processed using standard protocols. Diagnoses were established through clinicopathological correlation and microscopic examination. Results: Out of 70 cases, 46 (65.71%) were males and 24 (34.29%) were females. The largest proportion of patients (35.72%) were in the 21–40 years age group. Most lesions were benign (64 cases, 91.42%), while 6 cases (8.58%) were malignant. Among benign lesions, chalazion was the most common (24.28%), followed by chronic inflammatory lesions (14.28%) and cystic lesions (14.28%). Other benign conditions included squamous papilloma, vascular lesions, dermoid cyst, nevus, and granuloma. Malignant lesions included basal cell carcinoma (2.85%), squamous cell carcinoma, poorly differentiated carcinoma, ocular surface squamous neoplasia, and primary cutaneous mucinous carcinoma (each 1.42%). The majority of malignant cases (5 out of 6) occurred in patients aged 60 years and above.

89. Assessment of Intraoperative and Technical Difficulties Associated with Laparoscopic Adrenalectomy in Patients with Adrenal Pheochromocytoma: A Case Series Study
Mihir Karathiya, Chirag Karansinh Sangada, Mehulkumar Muljibhai Tadvi, Rutvi Jain
Abstract
Background: Pheochromocytomas are rare catecholamine-producing adrenal tumors that pose significant perioperative challenges due to their potential for sudden hypertensive crises and arrhythmias. Laparoscopic adrenalectomy has emerged as the gold standard approach, offering advantages of reduced postoperative morbidity, shorter hospital stay, and faster recovery compared to open surgery. However, in the context of pheochromocytoma, the procedure remains technically demanding because of intraoperative hemodynamic instability, tumor size, bilateral involvement, and close relation to vital vascular structures. Objective: To evaluate the intraoperative and perioperative challenges faced during laparoscopic adrenalectomy for adrenal pheochromocytoma and to analyze strategies that improve surgical and anesthetic outcomes. Methodology: This retrospective case-based study, conducted from January 2024 to October 2024, evaluated five patients with biochemically and radiologically confirmed pheochromocytomas who underwent laparoscopic adrenalectomy following preoperative alpha-blockade and additional antihypertensive therapy where required. Results: Tumor dimensions ranged from 3.5 to 5.1 cm. Two patients experienced intraoperative hypertensive surges, which were controlled with anesthetic support. Larger tumors (>4 cm) and those adherent to the inferior vena cava and liver presented greater technical challenges due to loss of fat planes and bleeding risk. One patient with bilateral pheochromocytoma underwent unilateral adrenalectomy to reduce operative risk while achieving tumor control. All procedures were completed laparoscopically without conversion to open surgery, and histopathology confirmed pheochromocytoma in every case.
Our findings highlight that while laparoscopic adrenalectomy is safe and feasible, it requires meticulous preoperative optimization, vigilant intraoperative monitoring, and skilled surgical technique to overcome challenges related to tumor size, vascular proximity, and endocrine fluctuations. In experienced hands and multidisciplinary settings, laparoscopic adrenalectomy remains the preferred approach for pheochromocytomas up to 6 cm, ensuring favorable outcomes with minimal morbidity.

90. Comparative Study of Pulmonary Function Tests Using Spirometry in Obese Versus Sedentary Individuals
N. Husamuddin, Sandeep S., Aravindhan V.
Abstract
Background: Obesity is a burgeoning global epidemic associated with multi-system dysfunction. Its impact on respiratory mechanics and pulmonary function, though clinically significant, remains underexplored in Indian settings. Spirometry offers a non-invasive, reproducible means of assessing respiratory capacity, which this study uses to compare pulmonary function between obese and sedentary individuals. The objective of this study was to compare spirometric indices, namely Forced Vital Capacity (FVC), Forced Expiratory Volume in 1 second (FEV1), the FEV1/FVC ratio, Peak Expiratory Flow Rate (PEFR), Forced Expiratory Flow 25–75% (FEF25–75%), and accessory lung volumes, between obese and sedentary non-obese individuals, and to assess the correlation of Body Mass Index (BMI) with these spirometric parameters. Methods: A cross-sectional comparative study was conducted at the Department of Physiology, Government Medical College, Krishnagiri, over a period of six months in obese individuals (BMI ≥ 30 kg/m²) and 60 sedentary non-obese controls (BMI 18.5–24.9 kg/m²). Spirometry was performed using a calibrated computerised spirometer following American Thoracic Society (ATS)/European Respiratory Society (ERS) guidelines. Statistical analysis was done using SPSS version 26.0. Results: Obese individuals demonstrated significantly lower FVC (3.12 ± 0.61 L vs 3.74 ± 0.58 L; p < 0.001), FEV1 (2.48 ± 0.52 L vs 3.02 ± 0.49 L; p < 0.001), PEFR (6.21 ± 1.18 L/sec vs 7.54 ± 1.22 L/sec; p < 0.001), and FEF25–75% (2.89 ± 0.74 vs 3.47 ± 0.68 L/sec; p < 0.001) compared to sedentary controls. The FEV1/FVC ratio was preserved in both groups (79.6% vs 80.8%; p = 0.194), indicating a predominantly restrictive pattern. Expiratory Reserve Volume (ERV) was markedly reduced in obese participants (0.68 ± 0.21 L vs 1.14 ± 0.28 L; p < 0.001). A restrictive spirometric pattern was observed in 53.3% of obese individuals compared to 16.7% of sedentary controls (p < 0.001).
BMI showed a significant negative correlation with FVC (r = −0.61), FEV1 (r = −0.58), ERV (r = −0.67), and PEFR (r = −0.54). Conclusion: Obesity exerts a profound adverse effect on pulmonary function, primarily producing a restrictive ventilatory defect. Early spirometric screening in obese individuals is warranted for timely respiratory intervention and comprehensive metabolic management.

91. Self-Medication Practices Among Second-Year MBBS and BDS Students in a Rural Tertiary Care Hospital in South India: A Cross-Sectional Study
Swathi Dharini K., Meena S., Sunil Mhatarba Vishwasrao
Abstract
Background: Medication use without consulting a qualified healthcare professional is known as self-medication. Although responsible self-care may be beneficial for minor illnesses, inappropriate medication use may lead to adverse drug reactions, incorrect treatment, and antimicrobial resistance. Objectives: To assess the knowledge, attitudes, and practices regarding self-medication among second-year MBBS and BDS students in a rural tertiary care hospital in Tamil Nadu. Methods: A cross-sectional study was conducted between December 2025 and January 2026 among second-year MBBS and BDS students. Data were collected using a structured and pre-validated questionnaire. Descriptive statistics were used to summarise the results, and the Chi-square test was applied to evaluate associations between variables. Results: A total of 233 students participated in the study, with a mean age of 20.2 years; 65% were female. Approximately 54% reported practising self-medication within the previous six months. Headache and fever were the most frequently treated conditions, and analgesics such as paracetamol were the most commonly used drugs. Pharmacies dispensing medications without prescriptions were the primary source of medicines. Although most participants acknowledged that antibiotics should not be self-administered, 23.5% reported self-administering antibiotics. Conclusion: Self-medication is common among undergraduate medical and dental students despite awareness of potential risks. Educational strategies focusing on rational drug use and antimicrobial stewardship should be incorporated into early medical training.

92. Effect of Time and Storage Condition on Prothrombin Time and Activated Partial Thromboplastin Time
Karthikeyan T.M., Shivapriya R., Umamageswari M.S., Sharanya K., Priya Fedric
Abstract
Background: Prothrombin Time (PT) and Activated Partial Thromboplastin Time (aPTT) are vital coagulation tests performed to evaluate the extrinsic and intrinsic pathways, respectively. Pre-analytical variables such as delays in processing and storage conditions can significantly affect the results, influencing clinical decision-making. Objective: To assess the effect of time and storage conditions (room temperature and refrigeration) on PT and aPTT values in blood samples. Materials and Methods: This prospective cross-sectional study was conducted on 30 healthy volunteers aged 18–25 years. Blood samples were collected in 3.2% sodium citrate tubes. Two sets of samples were processed: one set was centrifuged immediately and the separated plasma stored under refrigeration, while the other set was kept as whole blood at room temperature. PT and aPTT were measured at 0, 4, 12, and 24 hours using a fully automated coagulation analyzer (Elite Pro ACL). Statistical analysis was performed using repeated measures ANOVA. Results: PT values showed no significant difference between centrifuged and uncentrifuged samples; although a gradual decline occurred over time, values remained relatively stable up to 24 hours. In contrast, aPTT values decreased significantly over time, particularly after 4 hours, in both centrifuged and uncentrifuged samples, with statistically significant differences at 12 and 24 hours (p < 0.05). Conclusion: PT is stable up to 24 hours under both room temperature and refrigerated conditions. However, aPTT is time-sensitive and should ideally be processed within 4 hours for accuracy. If analysis is delayed, plasma separation and refrigeration are recommended to preserve sample integrity.

93. Correlation of Endometrial Thickness with Transvaginal Sonography, Hysteroscopic Findings and Histopathological Diagnosis in Patients with Abnormal Uterine Bleeding
Monika, Neeraj Choudhary, Isha, Vibha, Kavita Chandnani
Abstract
Background: Abnormal uterine bleeding (AUB) is a common gynecological complaint worldwide, affecting women in the reproductive, perimenopausal, and postmenopausal age groups. AUB may arise from a wide variety of causes ranging from hormonal dysfunction to structural intrauterine lesions, as well as premalignant and malignant conditions. Transvaginal sonography (TVS) is usually the first-line investigation because of its non-invasive nature and ability to measure endometrial thickness (ET). An increased ET can suggest underlying hyperplasia, polyps, or malignancy, although TVS has limited value in detecting focal intrauterine lesions. Hysteroscopy allows direct visualization of the endometrial cavity and facilitates targeted biopsies, while histopathological examination (HPE) remains the gold standard for final diagnosis. Material and Methods: This prospective study was conducted at a tertiary care hospital in Rajasthan, India, over a period of 6 months and involved 120 women aged 35–50 years presenting with AUB. Patients underwent detailed history taking, clinical examination, TVS, hysteroscopy, and endometrial sampling for histopathology. Data were analyzed statistically to establish the correlation between ET, hysteroscopic findings, and HPE, and to determine the diagnostic accuracy of TVS and hysteroscopy compared with histopathology. Results: The majority of patients were aged 41–50 years (60%), and polymenorrhea was the most common symptom (32%), followed by heavy menstrual bleeding (25%). ET ranged from 4 mm to 23 mm; an ET of 8–14 mm was seen in 50% of patients, while 38% had an ET >14 mm. Histopathology revealed proliferative endometrium (30%), endometrial polyps (28%), hyperplasia (14%), secretory endometrium (12%), fibroid polyps (8%), and carcinoma (8%). An ET >14 mm correlated significantly with hyperplasia and carcinoma. Hysteroscopy showed higher diagnostic sensitivity and specificity than TVS for identifying focal lesions. Conclusion: This study demonstrates that while TVS is an effective first-line screening tool in AUB, it has limited specificity. Hysteroscopy with histopathology offers superior diagnostic accuracy. An integrated, multimodal diagnostic approach is essential for optimizing patient care, preventing unnecessary hysterectomies, and ensuring early detection of premalignant and malignant conditions.

94. Comparative Evaluation of Optical Coherence Tomography and Fundus Photography for Early Detection of Diabetic Retinopathy and Its Correlation with Glycemic Control
Hanumant Keshavrao Bhosale, Sayed Rayyan Sayed Inayatullah
Abstract
Background: Diabetic retinopathy (DR) is a leading cause of preventable blindness worldwide. Early detection is crucial to prevent disease progression. Optical Coherence Tomography (OCT) and fundus photography are widely used imaging modalities, but their comparative efficacy in early DR detection remains under evaluation. Glycemic control, reflected by HbA1c levels, plays a pivotal role in disease progression. Materials and Methods: A cross-sectional study was conducted on 120 patients with type 2 diabetes mellitus. All participants underwent fundus photography and OCT examination. DR was graded using standard criteria. HbA1c levels were measured and correlated with imaging findings. Sensitivity, specificity, and diagnostic accuracy of both modalities were analyzed. Results: OCT detected early retinal changes in 78.3% of patients, compared with 61.7% detected by fundus photography. The sensitivity of OCT (92%) was significantly higher than that of fundus photography (74%). A strong positive correlation (r = 0.68, p < 0.001) was observed between HbA1c levels and the severity of retinal changes. Conclusion: OCT demonstrated superior sensitivity in the early detection of diabetic retinopathy compared with fundus photography. Higher HbA1c levels were significantly associated with increased severity of DR. Incorporating OCT into routine screening may improve early diagnosis and clinical outcomes.

95. Evaluation of Tear Film Biomarkers (MMP-9 and IL-6) in Patients with Dry Eye Disease and Their Correlation with Clinical Severity
Sayed Rayyan Sayed Inayatullah, Hanumant Keshavrao Bhosale
Abstract
Background: Dry eye disease (DED) is a multifactorial disorder of the ocular surface characterized by tear film instability and inflammation. Recent evidence highlights the role of inflammatory biomarkers such as matrix metalloproteinase-9 (MMP-9) and interleukin-6 (IL-6) in the pathogenesis of DED. Their correlation with clinical severity may aid in early diagnosis and targeted therapy. Materials and Methods: A cross-sectional study was conducted on 100 participants, including 70 patients with clinically diagnosed DED and 30 healthy controls. Tear samples were collected and analyzed for MMP-9 and IL-6 levels using enzyme-linked immunosorbent assay (ELISA). Clinical severity was assessed using the Ocular Surface Disease Index (OSDI), Schirmer test, and Tear Break-Up Time (TBUT). Correlation analysis was performed between biomarker levels and clinical parameters. Results: Mean MMP-9 and IL-6 levels were significantly elevated in DED patients compared with controls (p < 0.001). Higher biomarker levels were observed in the moderate and severe DED groups. Strong positive correlations were found between MMP-9 and OSDI scores (r = 0.71) and between IL-6 and OSDI scores (r = 0.65). Negative correlations were observed with TBUT and Schirmer values. Conclusion: The tear film biomarkers MMP-9 and IL-6 are significantly elevated in DED and correlate well with disease severity. These biomarkers can serve as reliable indicators for the diagnosis and monitoring of dry eye disease.
