1. Comparative Study of Ultrasound-Guided Quadratus Lumborum Block Versus Transversus Abdominis Plane Block for Postoperative Analgesia in Patients Undergoing Lower Abdominal Surgeries
K. M. Nithish, Anusha M. S., Rajeswara Rao Sarvasiddhi
Abstract
Background: Effective postoperative pain management is crucial for early mobilization, reduced morbidity, and improved patient satisfaction. Ultrasound-guided truncal blocks such as the transversus abdominis plane (TAP) block and quadratus lumborum (QL) block are increasingly used as part of multimodal analgesia for lower abdominal surgeries.
Objective: To compare the efficacy of ultrasound-guided quadratus lumborum block versus transversus abdominis plane block for postoperative analgesia in patients undergoing elective lower abdominal surgeries.
Materials and Methods: This quasi-experimental study was conducted at a tertiary care teaching hospital between October 2022 and March 2024. Sixty patients aged 20–40 years with ASA physical status I–II undergoing elective lower abdominal surgeries under spinal anesthesia were enrolled. Patients were divided into two groups: Group Q (Quadratus Lumborum block, n=30) and Group T (Transversus Abdominis Plane block, n=30). Both groups received bilateral blocks using 0.125% bupivacaine at 0.3–0.4 ml/kg. Postoperative pain was assessed using the Visual Analogue Scale (VAS) at predefined intervals up to 24 hours. Duration of analgesia, number of rescue analgesic doses, hemodynamic parameters, and adverse effects were recorded.
Results: Demographic variables, ASA status, type and duration of surgery, and hemodynamic parameters were comparable between groups (p>0.05). VAS scores were similar up to 4 hours postoperatively. From 8 hours onward, Group Q demonstrated significantly lower VAS scores compared to Group T (p<0.05). The mean duration of analgesia was significantly longer in Group Q (12.23 ± 1.94 hours) than in Group T (8.76 ± 0.81 hours; p<0.0001). Rescue analgesic requirement was significantly lower in Group Q (p<0.0001). No block-related complications or adverse effects were observed in either group.
Conclusion: Ultrasound-guided quadratus lumborum block provides superior and prolonged postoperative analgesia with reduced rescue analgesic requirements compared to transversus abdominis plane block in patients undergoing elective lower abdominal surgeries.
2. Comparative Study to Evaluate Ease of Nasogastric Tube Insertion in Intubated Patients with Three Different Techniques
Shruti Garg, Deepesh Gupta, Shashi Kumari, Sonu Pandoliya, Devanshu Saraf, Aishwarya Shrivastava
Abstract
Background: Nasogastric tube (NGT) insertion in anaesthetized and intubated patients is often challenging due to altered airway anatomy and decreased muscle tone. Several bedside techniques have been described to facilitate smooth insertion, but evidence directly comparing commonly practiced methods remains limited.
Aim and Objective: To compare the ease of NGT insertion using three techniques—additional neck flexion, standard sniffing position with lateral neck pressure, and reverse Sellick’s manoeuvre—in intubated adult patients undergoing elective surgeries.
Materials and Methods: This prospective, randomized comparative study included 120 adult patients (ASA I–II) undergoing elective surgery under general anaesthesia. Patients were allocated into three groups (n = 40 each): Group A (additional neck flexion), Group B (standard sniffing position with lateral neck pressure), and Group C (reverse Sellick's manoeuvre). The primary outcomes assessed were the number of attempts and the time required for successful NGT insertion. Secondary outcomes included hemodynamic changes and complications such as kinking, coiling, and nasal bleeding.
Results: Baseline demographic and clinical characteristics were comparable across all groups. Group A demonstrated the highest first-attempt success rate and the shortest insertion time. Group B showed moderate ease of insertion, while Group C had the lowest first-attempt success and longest insertion time. Complications were least frequent in Group A and most common in Group C. Hemodynamic parameters remained stable in all groups, and no major adverse events occurred.
Conclusion: Additional neck flexion is the most effective technique for NGT insertion in intubated patients, offering superior first-attempt success, shorter insertion time, and fewer complications compared with lateral neck pressure and reverse Sellick’s manoeuvre. Its simplicity and safety make it a preferred method in routine anaesthetic practice.
3. Clinical Assessment between Measurement of Mandibular Condylar Mobility (USG Guided) Versus Maximum Condyle-Tragus Distance in Predicting Difficult Laryngoscopy
Varsha M., Surendra Raikwar, Neelesh Nema, Vignesh Rajan V., Aishwarya Shrivastava, Vighna Rajan R.
Abstract
Background: Prediction of difficult laryngoscopy remains a critical component of preoperative airway evaluation, as unanticipated airway difficulty can lead to severe complications.
Aim and Objective: To compare ultrasound-guided mandibular condylar mobility with traditional airway assessment parameters: inter-incisor gap (IIG), upper lip bite test (ULBT), mandibular protrusion distance, and maximum condyle–tragus distance, in predicting difficult laryngoscopy.
Methods: This prospective observational study included 90 adult patients undergoing elective surgery under general anaesthesia. Preoperative measurements included ultrasound-guided condylar mobility and four clinical airway tests. Laryngoscopy was performed using a standard technique, and Cormack–Lehane (CL) grading was recorded. CL grade III–IV was defined as difficult laryngoscopy. Diagnostic accuracy was analysed using sensitivity, specificity, predictive values, and odds ratios.
Results: The majority of patients [79 (87.8%)] had easy laryngoscopy, and 11 (12.2%) had difficult laryngoscopy. Ultrasound-guided mandibular condylar mobility demonstrated the highest sensitivity (81.8%) and perfect specificity (100%). Maximum condyle–tragus distance and IIG also showed strong diagnostic performance, with sensitivities of 72.7% and 97.5%, respectively, and a specificity of 98.7% each. Mandibular protrusion distance and ULBT had perfect specificity (100%) but lower sensitivity (36.4% and 27.3%, respectively). All parameters showed a significant association with difficult laryngoscopy (p < 0.0001).
Conclusion: Ultrasound-guided mandibular condylar mobility is the most accurate single predictor of difficult laryngoscopy, demonstrating superior sensitivity and perfect specificity. However, multivariate analysis showed that no parameter independently predicted difficult laryngoscopy. A combined approach using both ultrasound-based and conventional tests enhances the reliability of airway assessment and improves preparedness for difficult laryngoscopy.
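The diagnostic accuracy metrics reported in this and later abstracts (sensitivity, specificity, predictive values, overall accuracy) all derive from a standard 2×2 contingency table. A minimal sketch of these calculations, using illustrative counts rather than the study's actual data:

```python
# Diagnostic accuracy metrics from a 2x2 contingency table.
# Counts below are illustrative, not taken from any of the studies above.
def diagnostic_accuracy(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV, NPV, and overall accuracy."""
    sensitivity = tp / (tp + fn)                # test-positive among diseased
    specificity = tn / (tn + fp)                # test-negative among non-diseased
    ppv = tp / (tp + fp)                        # positive predictive value
    npv = tn / (tn + fn)                        # negative predictive value
    accuracy = (tp + tn) / (tp + fp + fn + tn)  # overall correct classification
    return sensitivity, specificity, ppv, npv, accuracy

# Hypothetical example: 9 of 11 difficult airways flagged, 1 false alarm among 79 easy.
sens, spec, ppv, npv, acc = diagnostic_accuracy(tp=9, fp=1, fn=2, tn=78)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} PPV={ppv:.1%} NPV={npv:.1%}")
```

With only 11 difficult laryngoscopies, sensitivity can only take multiples of 1/11 (9.1%), which is why values such as 81.8% (9/11) and 72.7% (8/11) recur in the reported results.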
4. Impact of Statin Treatment on Liver Enzyme Levels in Patients with Dyslipidemia
Joshi Abhishek, Modi Vansh Kanaiyalal, Modi Shraddhaben Kanaiyalal
Abstract
Background: Dyslipidemia frequently coexists with non-alcoholic fatty liver disease (NAFLD) and cardiovascular disease, contributing to increased morbidity and mortality. While statins are widely prescribed to manage lipid abnormalities and reduce cardiovascular risk, their effect on liver enzyme levels in dyslipidemic patients with NAFLD remains incompletely understood. Evaluating this effect is critical to ensure both efficacy and safety of statin therapy in this high-risk population.
Methods: This prospective observational study enrolled 146 adult patients aged 18–80 years with dyslipidemia, NAFLD, and cardiovascular disease at a tertiary care hospital over one year. Patients receiving statins (n = 88) were compared with non-statin users (n = 58). Liver enzymes (ALT, AST) and lipid profiles (total cholesterol, LDL-C) were measured at baseline and follow-up. Demographic, clinical, and treatment-related data were also collected.
Results: At baseline, ALT and AST levels were similar between the statin and non-statin groups (43.1 ± 18.0 vs 40.1 ± 15.9 U/L and 38.0 ± 14.9 vs 35.2 ± 13.1 U/L, respectively; p > 0.1). After follow-up, statin therapy significantly reduced ALT (36.0 ± 14.0 U/L; p < 0.01) and AST (31.1 ± 11.7 U/L; p = 0.04), whereas non-statin patients showed minimal change. Total cholesterol decreased from 210.5 ± 30.9 to 180.9 ± 24.6 mg/dL (p < 0.01) and LDL-C from 135.4 ± 28.7 to 104.6 ± 21.0 mg/dL (p < 0.01) in the statin group, with no significant reductions in the non-statin group. Statins were well tolerated, with only minor side effects reported.
Conclusion: Statin therapy significantly improves liver enzyme levels and lipid profiles in dyslipidemic patients with NAFLD and cardiovascular disease. These results support the dual hepatic and cardiovascular benefits of statins in this population.
5. Correlation of Blood Sodium and Potassium Levels with the Extent of Stroke
Fulwani Dhirajkumar Mahendrabhai, Modi Vansh Kanaiyalal, Joshi Abhishek
Abstract
Background: Stroke is a leading cause of disability, often resulting in motor and neurological impairments. Electrolyte disturbances, particularly in sodium, potassium, and calcium, may influence stroke severity and outcomes. This study aimed to evaluate the association between serum electrolyte levels and functional outcomes in ischemic stroke patients.
Methods: A prospective study was conducted over one year at a tertiary care hospital including 168 adult ischemic stroke patients. Stroke severity and motor function were assessed using NIHSS and MAS scores. Serum sodium, potassium, and calcium levels were measured at admission. The primary outcome was death or major disability at 3 months (mRS 3–6).
Results: Patients with death or major disability were older (74.2 vs. 66.5 years) and had higher NIHSS scores (median 6 vs. 3) and lower MAS scores (median 15 vs. 20). Abnormal calcium levels were significantly associated with adverse outcomes (p = 0.01), while sodium and potassium showed no significant correlation (p = 0.12 and p = 0.43, respectively).
Conclusion: Calcium disturbances are linked to worse functional outcomes in ischemic stroke. Monitoring and correcting calcium levels may help improve prognosis.
6. Association of Serum Catecholamine Concentrations with Heart Rate Variability in Patients with Chronic Heart Failure
Amiben Manojbhai Patel, Bhavikaben Jayantilal Maru, Patel Vishvaben Narendrabhai
Abstract
Background: A wide range of factors related to chronic heart failure (CHF) can be assessed to identify higher-risk patients who may benefit from additional treatment measures. Patients who exhibit symptoms and signs at rest are readily identified on bedside examination; even with optimal medical care, these patients have an annual mortality rate of more than 40%, although they represent only a modest share of the overall heart failure population.
Objectives: In patients with chronic heart failure, the study sought to determine the association between HRV parameters and blood catecholamine levels as well as the usefulness of these measurements in indicating autonomic dysfunction and the severity of the condition.
Materials and Methods: This retrospective, observational study was carried out at a tertiary care centre, with data retrieved over a one-year period. Data from 184 participants were retrieved for the study. Patients aged 18 years and older with a diagnosis of chronic heart failure confirmed by clinical assessment and echocardiography, and whose medical records contained information on heart rate variability, serum catecholamine levels, and NYHA functional class, were included.
Results: A significant proportion of patients (64.1%) demonstrated reduced left ventricular ejection fraction (<40%). Common associated conditions included hypertension in 55.4%, diabetes mellitus in 41.3%, and ischemic heart disease in 48.4% of patients. Heart rate variability analysis showed reduced autonomic control of the heart. The mean SDNN was 92.6 ms, and RMSSD was 21.4 ms.
Conclusion: This study shows that patients with chronic heart failure have severe autonomic dysfunction, reflected in reduced time-domain and frequency-domain HRV parameters. Elevated serum catecholamine levels, and their moderate negative correlation with HRV indices, indicate that increased sympathetic activation is linked to compromised autonomic regulation.
Recommendations: Larger studies are required to validate the predictive utility of HRV and catecholamine monitoring, which can help evaluate autonomic dysfunction in CHF and direct tailored therapy.
7. Association Between Serum Iron Indices and Neurodevelopmental Delay (NDD) in Children
Patel Vishvaben Narendrabhai, Bhavikaben Jayantilal Maru, Amiben Manojbhai Patel
Abstract
Background: Neurodevelopmental disorders such as autism spectrum disorder (ASD), attention deficit hyperactivity disorder (ADHD), and intellectual disability (ID) are commonly associated with nutritional deficiencies, including altered iron status. Iron plays a critical role in brain development, and disturbances in iron metabolism may influence neurodevelopmental outcomes. This study aimed to evaluate differences in serum iron indices among children with different neurodevelopmental disorders.
Methods: This hospital-based observational study included 186 children aged 4–12 years diagnosed with ASD, ADHD, or ID. Neurodevelopmental diagnoses were established using standardised assessment tools. Serum iron, serum ferritin, and serum transferrin levels were measured using standard laboratory methods. Iron parameters were compared across the three diagnostic groups using appropriate statistical analyses, with a p-value <0.05 considered statistically significant.
Results: Among the 186 children enrolled, 68 had ASD, 79 had ADHD, and 39 had ID. Serum ferritin levels showed a statistically significant difference among the three groups (p = 0.003), with higher mean ferritin levels observed in children with ASD and lower levels in children with ADHD and ID. In contrast, no statistically significant differences were observed in serum iron (p = 0.087) or serum transferrin levels (p = 0.156) among the diagnostic groups.
Conclusion: Serum ferritin levels differ significantly among children with ASD, ADHD, and ID, indicating variations in iron storage status across neurodevelopmental disorders. These findings suggest that assessment of serum ferritin may be useful in the clinical evaluation of children with neurodevelopmental disorders, even when serum iron and transferrin levels are within normal limits.
8. Psoriasis and Its Association with Metabolic Syndrome and Cardiovascular Outcomes
Dixit D. Chhatrawala, Riya M. Chaudhari, Vidhi M. Maniya
Abstract
Background: Psoriasis is a chronic immune-mediated inflammatory disease increasingly recognised to be associated with metabolic syndrome and cardiovascular morbidity. Systemic inflammation in psoriasis may contribute to metabolic abnormalities and accelerated atherosclerosis.
Objectives: To evaluate the prevalence of metabolic syndrome and assess subclinical cardiovascular risk markers among patients with psoriasis.
Methods: This retrospective observational study included 190 adult patients with clinically diagnosed psoriasis attending a tertiary care centre. Demographic data, clinical characteristics, metabolic parameters, and cardiovascular risk markers were extracted from medical records. Metabolic syndrome was defined using modified NCEP ATP III criteria. Subclinical cardiovascular disease was assessed using carotid intima-media thickness (CIMT), high-sensitivity C-reactive protein (hs-CRP), ankle-brachial index (ABI), and echocardiographic evaluation.
Results: The mean age of participants was 44.8 ± 11.3 years, with a mean disease duration of 8.6 ± 4.8 years and a mean PASI score of 13.4 ± 5.6. Metabolic syndrome was present in 103 patients (54%). Abdominal obesity was the most common component (78%), followed by elevated triglycerides (63%) and low HDL cholesterol (59%). Increased CIMT (>0.8 mm) was observed in 48% of patients, elevated hs-CRP (>3 mg/L) in 61%, reduced ABI (<0.9) in 12%, and echocardiographic diastolic dysfunction in 18%, indicating a high burden of subclinical cardiovascular disease.
Conclusion: Patients with psoriasis demonstrate a high prevalence of metabolic syndrome, systemic inflammation, and subclinical cardiovascular abnormalities, even at moderate disease severity. These findings support routine cardiometabolic screening and integrated multidisciplinary management to reduce long-term cardiovascular risk in psoriasis patients.
9. Association of Acid–Base Disturbances with Severity and Outcomes in Sepsis
Vidhi M. Maniya, Riya M. Chaudhari, Dixit D. Chhatrawala
Abstract
Background: Complex acid-base and electrolyte abnormalities are prevalent in intensive care units. Although in most cases the acid-base changes are small and self-limited, blood pH can move rapidly toward either extreme and result in serious multi-organ complications.
Objectives: The purpose of the study was to assess the relationship between acid-base abnormalities and the severity and clinical outcomes of sepsis patients admitted to the intensive care unit.
Materials and Methods: This retrospective, observational study was carried out at a tertiary care centre, with data retrieved over a one-year period. Data from 162 participants were retrieved for the study. Adult patients with sepsis or septic shock who were admitted to the intensive care unit (ICU) and had complete clinical, laboratory, and arterial blood gas (ABG) data at the time of admission were included.
Results: The largest subgroup consisted of patients with metabolic acidosis, who also had the highest mean SOFA score (9.6 ± 3.2) and the highest proportion of septic shock (58.8%). Similarly, patients with a mixed acid-base disorder had a significantly higher mean SOFA score (8.9 ± 2.8) and a high rate of septic shock (52.9%).
Conclusion: In this study, acid-base imbalances were closely linked to the severity and outcomes of sepsis. Compared with patients with normal acid-base status, those with metabolic acidosis or mixed acid-base disorders had markedly higher rates of septic shock, higher SOFA scores, longer ICU stays, and higher mortality.
Recommendations: Since patients with metabolic acidosis or mixed acid-base abnormalities are more likely to experience septic shock, organ failure, and death, early evaluation and monitoring of acid-base status should be a crucial component of sepsis care.
10. Accuracy of Clinical and Biochemical Methods for Detection of Ovulation in Infertile Women: A Hospital-Based Observational Study
Achala Rawat, Shubha Pandey, Jyoti
Abstract
Background: Ovulatory dysfunction is one of the most common and potentially treatable causes of female infertility. Accurate identification of ovulation is essential for appropriate infertility evaluation and management. Various clinical and biochemical methods are used to detect ovulation; however, their diagnostic accuracy varies, and a comparative evaluation is required to guide optimal clinical practice.
Aim and Objectives: To assess the accuracy of clinical and biochemical methods for the detection of ovulation in infertile women.
Materials and Methods: This hospital-based retrospective observational cross-sectional study was conducted in the Department of Obstetrics and Gynaecology at Kamala Nehru Memorial Hospital, Prayagraj, over a period of two years from October 2020 to October 2022. A total of 100 infertile women of reproductive age were included. Ovulation was assessed using basal body temperature charting, cervical mucus examination, and mid-luteal serum progesterone estimation. Detection of the urinary luteinizing hormone (LH) surge using a commercial LH kit was considered the reference standard. Diagnostic accuracy parameters, including sensitivity, specificity, positive predictive value, negative predictive value, and overall accuracy, were calculated. Statistical analysis was performed using SPSS version 23.0, and a p-value <0.05 was considered statistically significant.
Results: Out of 100 infertile women, 78 (78.0%) were ovulatory and 22 (22.0%) were anovulatory based on LH surge detection. Serum progesterone estimation showed the highest diagnostic accuracy (84.0%) with high specificity (94.87%), followed by cervical mucus examination with an accuracy of 80.0%. Basal body temperature monitoring demonstrated lower sensitivity and an overall accuracy of 71.0%. Serum prolactin levels were significantly higher in anovulatory women (p<0.05), while other hormonal parameters showed no significant difference between ovulatory and anovulatory groups.
Conclusion: Ovulatory dysfunction contributes significantly to female infertility. Among the evaluated methods, serum progesterone estimation is the most accurate biochemical marker for ovulation detection, while cervical mucus examination serves as a reliable and cost-effective clinical indicator. Basal body temperature monitoring alone is insufficient for accurate ovulation assessment. A combined approach incorporating clinical assessment and biochemical confirmation provides a more reliable strategy for ovulation detection, particularly in resource-limited settings.
11. Efficacy of Toothbrushing Aids versus Interdental Devices in Gingivitis Management: A Randomized Study
Manoj Meena, Akshay Verma
Abstract
Background: Gingivitis remains a prevalent oral health condition affecting a significant proportion of the global population. Effective plaque control through mechanical oral hygiene devices is fundamental to gingivitis prevention and management. However, comparative evidence regarding the efficacy of toothbrushing aids versus interdental cleaning devices remains limited.
Methods: A total of 120 participants diagnosed with moderate gingivitis were randomly allocated to three groups: powered toothbrush group (n=40), interdental brush group (n=40), and dental floss group (n=40). Clinical parameters including Gingival Index (GI), Plaque Index (PI), and Bleeding on Probing (BOP) were assessed at baseline, 4 weeks, 8 weeks, and 12 weeks.
Results: All three groups demonstrated significant improvements in clinical parameters. The powered toothbrush group exhibited the greatest reduction in GI (1.82 ± 0.31 to 0.68 ± 0.22, p<0.001) and PI (2.14 ± 0.28 to 0.72 ± 0.19, p<0.001). The interdental brush group showed superior improvement in interproximal sites (BOP reduction: 78.4% to 22.6%, p<0.001) compared to dental floss (76.2% to 34.8%, p<0.001). Combined use of powered toothbrush with interdental devices yielded optimal outcomes.
Conclusion: Both toothbrushing aids and interdental devices effectively manage gingivitis, with powered toothbrushes demonstrating superior overall plaque removal and interdental brushes showing enhanced efficacy at interproximal sites. A combined approach is recommended for comprehensive gingivitis management.
12. Evaluating the Coexistence of Bronchial Asthma in Bronchiectasis Patients: A Cross-Sectional Study
Dinesh C. Patel, Rajesh B. Makwana, Meet P. Shah
Abstract
Background: Bronchiectasis is a chronic airway disease frequently associated with comorbid conditions that influence clinical outcomes. Bronchial asthma represents an important overlapping airway disorder.
Aim: To determine the coexistence of bronchial asthma in patients with bronchiectasis and to compare the clinico-radiological profile between patients with bronchiectasis alone and those with concomitant bronchial asthma.
Materials and Methods: A cross-sectional observational study was conducted on 80 patients with radiologically confirmed bronchiectasis. Clinical features, radiological findings, and exposure history were compared between patients with bronchiectasis alone and those with coexisting bronchial asthma.
Results: Bronchial asthma was present in a substantial proportion of patients with bronchiectasis and was associated with increased breathlessness, wheezing, atopic manifestations, and specific environmental exposures.
Conclusion: Coexisting bronchial asthma significantly modifies the clinical profile of bronchiectasis, highlighting the need for routine evaluation and tailored management strategies.
13. Effect of 4 mg Dexamethasone for Prevention of Postoperative Nausea and Vomiting in Laparoscopic Surgeries
Dinesh C. Patel, Rajesh B. Makwana, Meet P. Shah
Abstract
Background: Laparoscopy was first introduced as a therapeutic alternative to laparotomy more than a century ago. Since then, laparoscopic surgery has undergone enormous development and expansion, to the point where it is now the standard approach for a wide range of procedures, including cholecystectomy, appendicectomy, gynecologic surgeries, bariatric surgery, hernia repair, and even complex oncologic operations. However, laparoscopic surgeries are associated with a high incidence of postoperative nausea and vomiting (PONV), reported at 40–80%. A number of drugs have been used for its prevention. Dexamethasone, a glucocorticoid with antiemetic, anti-inflammatory, and analgesic effects, has been shown to reduce the incidence of PONV; however, the optimal dose has not been clearly defined. In this study, we aimed to evaluate the effect of a 4 mg dose of dexamethasone on the incidence of PONV in patients undergoing laparoscopic surgery.
Methods: A double-blind randomized controlled study was performed on 70 patients scheduled for elective laparoscopic surgeries under general anesthesia to assess the efficacy of a 4 mg dose of dexamethasone in preventing PONV. Patients were randomly assigned to two groups: 4 mg dexamethasone (1 ml) or 1 ml normal saline. The incidence of nausea and vomiting and the need for antiemetics were evaluated during the first 24 postoperative hours.
Results: Patients who received IV dexamethasone 4 mg had a significant reduction in PONV (p<0.01), and the need for rescue antiemetic drugs was also lower in the dexamethasone group compared with the normal saline group.
Conclusion: Intravenous dexamethasone 4 mg given before induction of anesthesia effectively controls postoperative nausea and vomiting in laparoscopic surgeries.
14. Magnesium Sulfate: Seeking the Unknown
Rupal Sharma, Anuradha Salvi, Kritika Kaushik, Sakshi Sharma, Asha Verma
Abstract
Introduction: Although mortality among preterm infants and gestational age-specific mortality have improved dramatically over the last three to four decades, infants born preterm remain vulnerable to many complications, including respiratory distress syndrome, chronic lung disease, intestinal injury, a compromised immune system, cardiovascular disorders, hearing and vision problems, and neurological insult.
Aims and Objectives: To determine the effect of antenatal magnesium sulfate exposure on neonatal Apgar scores at 1 and 5 minutes after birth, on the need for respiratory support, and on the duration of neonatal hospitalization.
Materials and Methods: This prospective interventional study was conducted on 70 women admitted to the labor room in the Department of Obstetrics and Gynaecology, Sawai Mansingh Medical College, Jaipur. Women with singleton pregnancy between 28 and 32 weeks of gestational age with expected delivery within 24 hours were included. Participants were allocated to cases and controls by the coin-flip method, and the treatment group was administered magnesium sulfate. Apgar scores at 1 and 5 minutes were recorded at birth, along with the duration of hospital stay and the need for oxygen, CPAP, or mechanical ventilation.
Results: The Apgar scores at 1 and 5 minutes after birth were improved in the MgSO4 group. The mean duration of hospital stay among neonates who received MgSO4 was 9.86 days, compared with 11.06 days in controls. More babies in the control group required CPAP/ventilator support than those who received MgSO4.
Conclusion: Antenatal MgSO4, if judiciously used, can substantially improve neonatal outcomes; however, thorough further evaluation is needed before generalized recommendations can be made.
15. A Study of Fetomaternal Outcome in Pregnancies Complicated by Gestational Diabetes Mellitus at a Tertiary Care Hospital in Northeast India
Neha Joshi, Manoj Kumar
Abstract
Background: Gestational diabetes mellitus (GDM) is a common metabolic disorder of pregnancy and is associated with significant adverse maternal and neonatal outcomes. With the rising prevalence of GDM in India and limited region-specific data from Northeast India, evaluating fetomaternal outcomes in this population is essential.
Aim and Objectives: To assess and compare maternal and neonatal outcomes in pregnancies complicated by gestational diabetes mellitus with those in normoglycaemic pregnancies, and to evaluate the association between glycaemic control and pregnancy outcomes.
Materials and Methods: This hospital-based observational comparative study was conducted at a tertiary care hospital in Tezpur, Assam, Northeast India, from August 2021 to July 2024. A total of 120 pregnant women were enrolled, comprising 60 women diagnosed with GDM and 60 normoglycaemic controls. Participants were followed from diagnosis until delivery and the early neonatal period. Maternal demographic characteristics, antenatal complications, mode of delivery, and neonatal outcomes were recorded.
Results: Women with GDM were significantly older and had higher body mass index compared to controls. Gestational hypertension and polyhydramnios were more common in the GDM group. Caesarean section rates were substantially higher among women with GDM. Neonates born to mothers with GDM had significantly higher birth weight, increased incidence of macrosomia, neonatal hypoglycaemia, and higher rates of NICU admission. Poor glycaemic control within the GDM group was significantly associated with increased operative delivery and adverse neonatal outcomes.
Conclusion: Pregnancies complicated by gestational diabetes mellitus are associated with increased risk of adverse fetomaternal outcomes. Effective glycaemic control plays a crucial role in improving maternal and neonatal prognosis. Early screening, timely diagnosis, and appropriate management of GDM are essential to reduce pregnancy-related complications, particularly in high-risk populations.
16. Comparative Clinical Outcomes of Alcohol-Induced and Gallstone-Induced Acute Pancreatitis
Sneha Ninama, Girish N. Pratap, Rahul Agarwal
Abstract
Aim: To compare the clinical outcomes, disease severity patterns, and complications between alcohol-induced and gallstone-induced acute pancreatitis in a tertiary care center.
Materials and Methods: This prospective observational comparative study was conducted over an 18-month period. A total of 152 consecutive patients diagnosed with acute pancreatitis were enrolled and categorized into two groups: alcohol-induced (n=101) and gallstone-induced (n=51) pancreatitis. Patients were evaluated for demographics, clinical presentation, severity assessment using Revised Marshall Score and BISAP criteria, and documented for outcomes including duration of nil per oral (NPO), length of hospital stay (LOS), organ failure, local complications (pancreatic necrosis, pseudocyst, acute necrotic collection), and mortality.
Results: Alcohol-induced pancreatitis demonstrated higher prevalence (66.45%) with predominance in younger males (mean age 37.8±8.2 years, 97% male; p<0.0001). Gallstone-induced pancreatitis was more frequent in older females (mean age 46.5±12.1 years, 84.3% female). No mortality was recorded in either group. Mean NPO duration was comparable (alcohol: 2.49±1.12 days vs gallstone: 2.75±1.02 days; p=0.1656). Length of hospital stay was similar (alcohol: 3.55±1.81 days vs gallstone: 3.41±1.3 days; p=0.617). Alcohol-induced cases demonstrated significantly higher incidence of acute necrotic collection (ANC) at 21.8% versus 3.92% in gallstone group.
Conclusion: Both alcohol-induced and gallstone-induced acute pancreatitis demonstrated favorable short-term clinical outcomes with zero in-hospital mortality when managed with appropriate supportive care and timely interventions. While complication patterns differed between etiologies, with alcohol-induced cases prone to necrosis and gallstone-induced cases predisposed to pseudocyst formation, overall outcome measures remained comparable. Etiology-specific monitoring protocols are recommended to optimize patient management and enable early intervention for anticipated complications based on the causative factor.
17. A Study of Fetomaternal Outcome in Pregnancies Complicated by Gestational Diabetes Mellitus at a Tertiary Care Hospital in Northeast India
Neha Joshi, Manoj Kumar
Abstract
Background: Gestational diabetes mellitus (GDM) is a common metabolic disorder of pregnancy and is associated with significant adverse maternal and neonatal outcomes. With the rising prevalence of GDM in India and limited region-specific data from Northeast India, evaluating fetomaternal outcomes in this population is essential.
Aim and Objectives: To assess and compare maternal and neonatal outcomes in pregnancies complicated by gestational diabetes mellitus with those in normoglycaemic pregnancies, and to evaluate the association between glycaemic control and pregnancy outcomes.
Materials and Methods: This hospital-based observational comparative study was conducted at a tertiary care hospital in Tezpur, Assam, Northeast India, from August 2021 to July 2024. A total of 120 pregnant women were enrolled, comprising 60 women diagnosed with GDM and 60 normoglycaemic controls. Participants were followed from diagnosis until delivery and the early neonatal period. Maternal demographic characteristics, antenatal complications, mode of delivery, and neonatal outcomes were recorded.
Results: Women with GDM were significantly older and had higher body mass index compared to controls. Gestational hypertension and polyhydramnios were more common in the GDM group. Caesarean section rates were substantially higher among women with GDM. Neonates born to mothers with GDM had significantly higher birth weight, increased incidence of macrosomia, neonatal hypoglycaemia, and higher rates of NICU admission. Poor glycaemic control within the GDM group was significantly associated with increased operative delivery and adverse neonatal outcomes.
Conclusion: Pregnancies complicated by gestational diabetes mellitus are associated with increased risk of adverse fetomaternal outcomes. Effective glycaemic control plays a crucial role in improving maternal and neonatal prognosis. Early screening, timely diagnosis, and appropriate management of GDM are essential to reduce pregnancy-related complications, particularly in high-risk populations.
18. A Quasi-Experimental Study to Assess the Effect of Structured Nutritional Education Program on the Dietary Practices of Middle School Children of Private Schools in Urban Chennai
Hemamalini B., Santha Sheela Kumari K., Sameeya Furmeen S., Seenivasan P.
Abstract
Background: Dietary practices formed in childhood strongly influence long-term health. During adolescence, parental supervision declines and peer influence rises, resulting in unhealthy habits such as skipping meals, television viewing during meals, eating out, and increased junk food intake. Early nutritional education can positively shape dietary habits.
Objectives: 1. To assess the dietary practices of middle school children. 2. To evaluate the effect of a structured nutritional education program on their dietary practices.
Methods: A quasi-experimental study was conducted among 90 middle school children aged 11–14 years from three private schools in North Chennai (January 2019–November 2020). Group A (n=30) received structured nutritional education with periodic reinforcement; Group B (n=30) received a one-time intervention; Group C (n=30) served as control. Educational tools included trifold brochures, food plate models, display boards, painted pots, stadiometers, and digital weighing machines.
Results: The mean dietary practices scores improved significantly in all groups (p < 0.001), with the highest gain in Group A (6.73 ± 3.48), a moderate improvement in Group B (2.90 ± 0.49), and the smallest improvement in Group C (2.06 ± 2.03). In Group A, breakfast intake increased from 66.7% to 93.3%, television viewing during meals reduced from 83.3% to 53.3%, family meal participation rose from 66.7% to 76.7%, and hotel food consumption declined in 80% of children. Balanced diet adherence improved from 10% to 43.3%, while healthy dietary practices showed a remarkable rise from 40% to 96.7%, highlighting the superior effectiveness of structured nutritional education in Group A.
Conclusion: Structured nutritional education with periodic reinforcement significantly improved knowledge and dietary practices among middle school children compared with one-time interventions. Continuous engagement of children and parents is crucial for fostering healthy eating behaviours and nurturing healthier adolescents.
19. Effectiveness of an Educational Intervention on Parental Knowledge in Management of Children with Beta Thalassemia
Nilamadhaba Panda, Jyoti Ranjan Behera, Snigdha Rani Panigrahy, Sadhana Panda, Bharata Chandra Choudhury, Narendra Behera
Abstract
Introduction: This study evaluates the impact of an educational intervention on the knowledge, attitude, and practice (KAP) of parents of children with beta thalassemia. Involving 147 participants, it provides a comprehensive overview of how targeted educational efforts can enhance understanding and management of the disease.
Material and Methods: This hospital-based, prospective, quasi-experimental pre-post test study evaluated the effectiveness of an educational intervention. Children with beta thalassemia aged 6 months to 14 years, presenting with their parents (caregivers) to MKCG Medical College & Hospital, Berhampur, for regular blood transfusion, were enrolled. The study was conducted in the Department of Paediatrics from October 2022 to September 2024, and the study population comprised the parents or caregivers of these children.
Result: A greater number of KAP questionnaire items on thalassemia would strengthen the study, and scoring or grading of KAP responses could shed more light on performance. Long-term follow-up would allow a better assessment of the KAP of parents of children with thalassemia. A smaller sample size introduces more bias, whereas a larger sample would yield a more reliable KAP assessment.
Conclusion: The study concludes that significant improvement in the knowledge of parents and caregivers was observed across different aspects of thalassemia, including general awareness, symptoms, disease transmission, screening and diagnosis, and healthy living and treatment. Overall, the study emphasizes the value of education as a tool for enhancing health literacy and promoting proactive health behaviours among parents of children with beta thalassemia. By continuing to invest in and develop such educational initiatives, we can significantly improve the quality of life for affected families and contribute to better disease management and prevention strategies.
20. Knowledge, Attitude and Practice of Medical Undergraduate Students towards Over the Counter (OTC) Drugs Usage: A Cross-Sectional Study
Devasish Panda, Bikas Ranjan Mohanty, Baijayanti Rath, Om Gopal Mishra, Dev Shivam Mishra, Sandeep Yadav, Surendra
Abstract
Introduction: The World Health Organization defines over-the-counter (OTC) drugs as medications that can be bought without a prescription. They are generally used for common, symptomatically treated complaints such as fever, cough, cold, headache, and toothache, and are not a substitute for prescription drugs. However, their easy availability, combined with a lack of proper knowledge about their adverse effects, leads to irrational use. Although OTC medicines allow greater access to treatment at lower cost for minor or self-limiting illnesses and give General Practitioners (GPs) more time to deal with serious health problems, there are risks associated with OTC medications, including increased drug resistance, increased cost to patients, failure to follow label instructions, increased risk of drug-drug interactions, and potential for misuse and abuse. The prevalence of over-the-counter drug usage worldwide varies from 32.5% to 81.5%, while that for India is 53.57%.
Objectives: The study was conducted to assess the knowledge, attitude, and practice of MBBS undergraduate students regarding Over the Counter (OTC) medicine, along with the prevalence of OTC drug usage among undergraduate medical students of BBMCH.
Methodology: This cross-sectional observational study was conducted in a tertiary care teaching hospital among 405 undergraduate medical students. A questionnaire comprising questions on knowledge, attitude, and practice toward OTC drugs was framed. After educational activities, the same questionnaire on the knowledge and attitude aspects was shared with participants as a Google Form, and their responses were collected and analyzed.
Results: Of the 405 MBBS undergraduate students who participated in the study, 87 (21.5%), 83 (20.5%), 83 (20.5%), 90 (22.2%), and 62 (15.3%) belonged to the 1st, 2nd, 3rd, 4th, and 5th years of MBBS, respectively. 246 (60.7%) students did not have any relatives from a medical background. Around 45 (11.1%) students had first-degree relatives (parents) from a medical background, while 31 (7.7%) and 45 (11.1%) students had second-degree (brothers, sisters, grandparents) and third-degree relatives (uncle, aunt, nephew, niece) from a medical background, respectively. Of the 405 students, only 250 (61.7%) had previously heard the term “Over the Counter” (OTC) medicine, and only 282 (69.6%) correctly answered that medicine that can be purchased without a prescription is called OTC medicine. 153 (37.8%) students were of the opinion that the availability of over-the-counter medicine was beneficial to the general public, and 364 (89.9%) participants were of the opinion that consumption of OTC medicine contributed to antimicrobial resistance. 282 (69.6%) had purchased OTC medicines at least once during the last 3 months; 245 (60.5%) students had purchased medicine for self-consumption, while 122 (30.1%) and 83 (20.5%) purchased it for family members and friends, respectively. OTC medicines were purchased most commonly for fever (51.6%), followed by common cold/cough (40.7%), acidity/gastritis (38%), headache and myalgia (29.6%), loose motion (26.2%), and allergy (16.0%).
Conclusions: This study highlights the high prevalence of self-medication with OTC drugs among medical students. While many have basic knowledge, significant gaps remain regarding drug safety, regulations, and potential risks. The casual attitude toward OTC drug misuse—such as exceeding doses or ignoring expiry dates—is concerning. Raising awareness among medical students is crucial, as they will serve as future healthcare providers and influence public health behaviors.
21. Comparative Evaluation of Pudendal versus Dorsal Penile Nerve Block for Analgesia in Pediatric Circumcision: A Randomized Controlled Study
Pranchil Pandey, Brijesh Tiwari
Abstract
Introduction: Topical analgesics, caudal block, and ring block of the penis are examples of regional anesthesia techniques that have been used during circumcision, with varying degrees of effectiveness. Caudal block has been linked to transient motor block. According to earlier investigations, the dorsal penile nerve block is a successful anesthetic technique with extended postoperative analgesia; however, a failure rate of 4–6.7% has been documented. Against this background, we conducted a prospective randomized controlled clinical trial to compare the analgesic and anesthetic efficacy of bilateral nerve stimulator-guided pudendal nerve block with that of dorsal penile nerve block for perioperative and postoperative analgesia in children undergoing circumcision.
Methods: A prospective, single-blinded, randomized investigation was carried out from March 2020 to February 2025 with the approval of the institutional review board and written parental consent. Fifty ASA I male children aged 3 to 5 years scheduled for elective circumcision were included in the study. Exclusion criteria included pre-existing coagulopathy, infection at the injection site, and known allergy to local anesthetics. One group received a pudendal nerve block, while the other group received a dorsal penile nerve block. Pain ratings were taken at 0, 6, and 12 h on the first day and once daily for the next 5 days, using the Objective Pain Scale as modified by Hannallah et al.
Result: Age, hemodynamic stability, and duration of surgery were comparable between the two groups (Table 1). In the pudendal nerve block group, every patient underwent circumcision as planned without requiring additional analgesics. Three patients (12%) in the dorsal nerve block group experienced an incomplete block necessitating further local infiltration, and one patient (4%) had total block failure and underwent general anesthesia (Table 2). In conclusion, the nerve stimulator-guided pudendal nerve block proved more precise and successful than the dorsal nerve block for circumcision in children.
22. A Comparative Study of Sociodemographic Correlates and Quality of Life of Caregiver of Patients of Schizophrenia and Bipolar Affective Disorder
Chakit Sharma, Gaurav Kumar, Amit Kumar Jangir, Alok Tyagi
Abstract
Background: Mental health disorders, such as schizophrenia and bipolar affective disorder (BPAD), impose significant burdens on patients and their caregivers. Schizophrenia is characterized by psychotic symptoms and cognitive decline, while BPAD involves episodic mood fluctuations. Both conditions require long-term caregiving, often leading to emotional, physical, and financial strain on family members. Despite the critical role of caregivers, their quality of life (QoL) remains understudied. This study aimed to compare the sociodemographic profiles of patients with schizophrenia and BPAD and assess the QoL of their caregivers.
Materials & Methods: A cross-sectional study was conducted over 16 months at a tertiary care psychiatric centre, involving 120 participants (60 schizophrenia and 60 BPAD patients) and their primary caregivers. Caregivers were assessed using the WHOQOL-BREF questionnaire, while patient symptom severity was measured using PANSS (schizophrenia), YMRS, and HAM-D (BPAD). Statistical analysis included chi-square tests, independent t-tests, and correlation analyses.
Results: Sociodemographic analysis showed no significant differences between schizophrenia and BPAD patients except for illness duration (p=0.002), with BPAD patients having longer illness durations. Caregiver QoL did not differ significantly between groups across physical, psychological, social, and environmental domains. However, illness duration negatively correlated with psychological QoL in both groups (schizophrenia: r=-0.226, p=0.013; BPAD: r=-0.220, p=0.018). Negative symptoms in schizophrenia (PANSS-N) were linked to poorer environmental QoL (r=-0.265, p=0.04), while depressive symptoms in BPAD (HAM-D) correlated with worse psychological QoL (r=-0.360, p=0.027).
Conclusion: Caregivers of schizophrenia and BPAD patients experience similar QoL challenges, though symptom-specific burdens exist. Longer illness duration worsens psychological well-being, highlighting the need for targeted caregiver support programs. Interventions should address chronicity, symptom management, and psychosocial support to improve caregiver resilience and mental health outcomes.
23. Assessment of Awareness and Prevalence of Allergic Rhinitis in North Bihar, India: A Cross-Sectional Study
Sujeet Kumar, Novelesh Bachchan, Shashi Kumar, Pawan Kumar Lal, Pankaj Patel
Abstract
Background: Allergic Rhinitis (AR) is an IgE (Immunoglobulin E)-mediated immunological response of the nasal mucosa characterized by watery nasal discharge, nasal obstruction, sneezing, and itching in the nose. AR is a troublesome disease whose prevalence is rising at a very fast rate. Greater knowledge of AR and its complications is expected to lead to better disease outcomes.
Objective: This cross-sectional study, conducted in North Bihar with 600 participants, aimed to assess awareness and diagnosis of AR.
Method: Demographic data and participants’ knowledge of various aspects of AR were collected using a well-prepared questionnaire administered to patients of the ENT Department from January 2025 to December 2025. Nasal cytology was taken from the inferior turbinate of selected patients.
Results: Diagnosis was based on symptoms and nasal eosinophilia on cytology of nasal smears. It was found that 39.83% of participants had knowledge of AR, and only 10.67% knew that allergic rhinitis was caused by insufficient antihistamines. 73.67% of respondents did not know any allergic rhinitis symptoms, 61% did not know how the disease can be prevented, and 88.33% did not have any idea about the complications of allergic rhinitis. This study indicated that awareness of allergic rhinitis was very poor, especially among subjects with low education.
Conclusion: The study concluded that there is an urgent requirement of different strategies like Allergic Rhinitis health campaigns, issuing pamphlets of information about AR, public speaking sessions, etc. to spread awareness among the general population.
24. Impact of Various Regional Anesthesia Techniques on Perioperative Outcomes in Holmium Laser Enucleation of the Prostate: A Randomized Study
Pranchil Pandey, Brijesh Tiwari
Abstract
Introduction: HoLEP (Holmium Laser Enucleation of Prostate) surgery has lower morbidity due to a lower transfusion rate and lesser risk of dilutional hyponatremia; nevertheless, the disadvantages are attributed to a longer procedure time and a steep learning curve. The anesthetic aspects of HoLEP have not yet been fully established. The majority of published studies concentrated on HoLEP’s urological features. The aim of the current study was to assess different regional anesthesia techniques for HoLEP surgery and to identify the most effective regional anesthesia technique.
Material and Methods: Following approval from Institutional Ethics Committee (IES-SSMC-0145), a prospective, randomized, comparative study was conducted. The study included 45 patients who were scheduled for HoLEP. Patients who had severe systemic infections or local infections at the injection site, coagulopathy, serious disorders of the central nervous system or peripheral nerves, and history of allergies to local anesthetics, were excluded from the study. Patients (n=15) were randomly assigned to one of three groups (epidural block, spinal block, or saddle block) using the sealed envelope method.
Result: Time to T10 dermatome block (P=.024) and time to maximal sensory level block (P=.003) differed statistically between groups A and B. The maximal sensory block level was comparable between groups B and C but higher in group A than in group C. The time to 2-segment sensory regression differed statistically between groups and was significantly longer in group A than in group C (P=.007).
Conclusion: We conclude that saddle block offers a quicker onset, a more effective sensory block, and faster recovery in HoLEP surgery.
25. Assessment of Perception of Medical Students Regarding Competency Based Medical Education [CBME]: A Cross-Sectional Study
Manjari Kishore, Jitender Pratap Singh, Pooja Jain, Mritunjay Kaushik, Aditi Suri
Abstract
The current research evaluates the effectiveness and perceptions of Competency-Based Medical Education (CBME) among 400 medical students. The survey explored CBME’s focus on medical competencies, clarity of learning objectives, adequacy of feedback, engagement in self-directed learning, and the value of clinical-oriented practical experiences. Results indicate that CBME encourages a focused approach to learning competencies, offers clearer objectives, and improves clinical readiness. Participants reported receiving adequate feedback and viewed self-directed learning positively within their medical education. Clinical-oriented practical experiences were highly valued, enhancing motivation and preparedness for future practice. Areas for improvement include the foundation course for CBME and enhancing clinical practical experiences. Challenges in implementing CBME involve adapting teaching methods, ensuring resources, and aligning assessments with competency outcomes. The study concludes that CBME shows promise for preparing future medical professionals, but ongoing adjustments are needed to address challenges and optimize its implementation.
26. Effect of Preoperative Glycemic Management on Surgical Site Infections among Diabetic Patients
Mohit Vajera, Afrin Khan
Abstract
Background and Aim: Surgical site infections (SSI) are a major postoperative complication in diabetic patients, with hyperglycemia recognized as a significant risk factor. This study aimed to evaluate the impact of preoperative glycemic control on postoperative wound infections in diabetic patients.
Methods: A hospital-based prospective study was conducted over one year in a tertiary care hospital, enrolling 184 diabetic patients undergoing general surgery. Preoperative fasting blood glucose and HbA1c levels were recorded, and patients were followed postoperatively for wound infections and other complications. Patients were categorized based on glycemic control as good, fair, or poor. Data were analyzed to assess the association between glycemic status and postoperative outcomes.
Results: The study included predominantly female patients (67.4%) with the most common age group being 51–60 years (26.1%). Overall, 50% of patients developed SSI, with the highest incidence in the poor glycemic control group (67.6%). Postoperative blood glucose levels on days 1, 3, and 7 were significantly higher in patients with SSI (p < 0.05). Other complications, including delayed wound healing and urinary tract infections, were also more frequent in patients with suboptimal glycemic control.
Conclusion: Poor preoperative glycemic control is associated with higher rates of surgical site infections and postoperative complications in diabetic patients. Optimizing blood glucose before surgery may reduce morbidity and improve outcomes.
27. Retrospective Study of Antidiabetic Medication Use in Type 2 Diabetes Patients at a Tertiary Healthcare Center
Visarg Patel, Jaimin Mohanbhai Desai, Neelkumar Girishkumar Patel
Abstract
Background: Type 2 diabetes mellitus is a growing public health challenge requiring long-term pharmacotherapy, and evaluating prescribing patterns helps assess the rational use of antidiabetic drugs. This study aimed to analyze the prescription pattern of antidiabetic medications among patients with type 2 diabetes mellitus attending a tertiary care hospital.
Methods: A hospital-based cross-sectional study was conducted over one year at a tertiary care hospital. A total of 166 patients with type 2 diabetes mellitus attending the diabetic outpatient clinic were included. Data on demographic characteristics, comorbidities, and prescribed antidiabetic drugs were collected from prescriptions and medical records. The data were analyzed using descriptive statistics with SPSS software.
Results: Most patients were aged 51–60 years (38.0%) and had a diabetes duration of 1–5 years (46.4%). Hypertension was the most common comorbidity (49.4%). The mean number of antidiabetic drugs per prescription was 2.9, with oral agents alone prescribed in 66.9% of patients. Metformin was the most commonly prescribed oral drug, while lispro mix insulin was the predominant injectable, and combination therapy was frequently used.
Conclusion: The study demonstrates a preference for metformin-based combination therapy in the management of type 2 diabetes mellitus, reflecting contemporary and rational prescribing practices.
28. Hemodynamic Changes After Spinal Anesthesia in Cesarean Section: A Prospective Observational Study
Jaydipkumar Manubhai Chauhan, Rahulkumar Jagdhishbhai Taral, Meetkumar Rameshbhai Moradiya
Abstract
Background: Spinal anesthesia is the preferred anesthetic technique for cesarean section because of its rapid onset, effective sensory blockade, and minimal fetal drug exposure. However, post-spinal hypotension remains the most common complication and may adversely affect both maternal comfort and uteroplacental perfusion.
Aim: To determine the incidence of post-spinal hypotension and identify associated risk factors in patients undergoing cesarean section.
Methodology: This prospective observational study was conducted on 90 patients undergoing elective or emergency cesarean section under spinal anesthesia. Maternal demographics, obstetric variables, baseline hemodynamic parameters, sensory block level, intraoperative management, and neonatal outcomes were recorded. Hypotension was defined as a fall in systolic blood pressure ≥20% from baseline or an absolute systolic blood pressure <90 mmHg. Statistical analysis was performed to identify factors associated with post-spinal hypotension.
Results: Post-spinal hypotension occurred in 64.4% of patients, most commonly within the first 10 minutes following spinal anesthesia. Higher body mass index, lower baseline systolic blood pressure, higher sensory block level (≥T4), primigravida status, and emergency cesarean section were significantly associated with hypotension. Vasopressor support was required in the majority of affected patients.
Conclusion: Post-spinal hypotension remains a frequent and clinically significant complication during cesarean section under spinal anesthesia. Early identification of high-risk patients and timely preventive strategies are essential to improve maternal and neonatal outcomes.
29. Comparison of HbA1c Levels in Diabetic Patients with and without Retinopathy
Narendra Singh, Pankaj Tyagi, Yashika Sinha, Prachi Shukla
Abstract
Background: Diabetic retinopathy is a common microvascular complication of type 2 diabetes mellitus and a leading cause of preventable blindness. Glycated haemoglobin (HbA1c) reflects long-term glycaemic control and is strongly associated with the risk of retinopathy. This study aimed to compare HbA1c levels in diabetic patients with and without retinopathy.
Methods: A hospital-based cross-sectional comparative study was conducted at Muzaffarnagar Medical College and Hospital over one year. A total of 180 patients with type 2 diabetes mellitus were enrolled and divided into two groups: 90 patients with diabetic retinopathy (DR group) and 90 without retinopathy (Non-DR group). Clinical evaluation, fundoscopic examination, and HbA1c estimation by high-performance liquid chromatography (HPLC) were performed. Data were analysed using SPSS v21; mean HbA1c levels were compared using the Student’s t-test, with p < 0.05 considered significant.
Results: The mean HbA1c level was significantly higher in the DR group (9.1 ± 1.4%) compared to the Non-DR group (7.2 ± 1.1%; p < 0.001). A higher proportion of patients with diabetic retinopathy had HbA1c ≥9%. Longer duration of diabetes and older age were also associated with retinopathy.
Conclusion: Poor glycaemic control, reflected by elevated HbA1c, is strongly associated with diabetic retinopathy. Regular monitoring of HbA1c and timely ophthalmological screening are essential to prevent vision-threatening complications in patients with type 2 diabetes mellitus.
30. Association of Vitamin D Status with Disease Severity in Infants Hospitalized with Bronchiolitis
Kavita Meena, Jitendra Kumar Chholak, Yogesh Yadav
Abstract
Background: Vitamin D has immunomodulatory properties and may influence the clinical course of lower respiratory tract infections in infants; however, its association with bronchiolitis severity remains inconsistent.
Objectives: To evaluate the association between serum vitamin D status at hospitalization and disease severity among infants admitted with bronchiolitis.
Methods: In this prospective observational study conducted from January to December 2024 at SMS Medical College, Jaipur, infants aged <12 months hospitalized with bronchiolitis were enrolled. Serum total 25-hydroxyvitamin D [25(OH)D], albumin, and vitamin D–binding protein were measured within 24 hours of admission. Free and bioavailable 25(OH)D concentrations were calculated. Disease severity was assessed by intensive care unit (ICU) admission, need for continuous positive airway pressure (CPAP) or mechanical ventilation, and length of hospital stay. Statistical analyses were performed using SPSS version 25.
Results: A total of 403 infants were included (mean age, 5.8 ± 3.1 months). Vitamin D deficiency and insufficiency were present in 38.5% and 52.9% of infants, respectively. ICU admission was required in 24.3%, CPAP in 19.6%, and mechanical ventilation in 10.4%. Vitamin D status was not significantly associated with ICU admission or the requirement for CPAP. However, lower serum 25(OH)D levels were significantly associated with the need for mechanical ventilation (p = 0.035). Total and free 25(OH)D concentrations demonstrated weak but significant negative correlations with duration of hospitalization (p = 0.004 and p = 0.026, respectively).
Conclusions: Hypovitaminosis D is highly prevalent among Indian infants hospitalized with bronchiolitis. While vitamin D status does not predict ICU admission or CPAP requirement, lower vitamin D levels are associated with prolonged hospitalization and increased need for mechanical ventilation.
31. Strengthening Medico-Legal Evidence and Administrative Accountability in Rajasthan: The Role of MedLEaPR
Dipender Singh, Yashika Saini, Anupam Johry, Surya Bhan Kushwaha
Abstract
The Medico-Legal Examination and Post-Mortem Reporting System (MedLEaPR) represents a major digital transformation in the medico-legal infrastructure of Rajasthan following the enforcement of the new criminal codes: the Bharatiya Sakshya Adhiniyam (BSA), the Bharatiya Nagarik Suraksha Sanhita (BNSS), and the Bharatiya Nyaya Sanhita, 2023. Historically, medico-legal documentation in the state relied on paper-based reports that were often handwritten, non-standardized, and vulnerable to loss, manipulation, and chain-of-custody breaches. These limitations frequently resulted in procedural delays and impaired judicial efficiency. MedLEaPR, developed by the National Informatics Centre (NIC), provides a secure, centralized, and standardized digital platform for generating, authenticating, and transmitting medico-legal case reports (MLCs) and post-mortem reports (PMRs). Its technical architecture incorporates digital signatures, structured templates, graphical tools, and real-time integration with police systems through the Crime and Criminal Tracking Network & Systems (CCTNS) and the Inter-operable Criminal Justice System (ICJS). Rajasthan’s government mandated daily uploading of all MLCs and PMRs from May 2025, ensuring statewide compliance and enhancing accountability. Early outcomes indicate improved evidence integrity, reduced documentation errors, faster interdepartmental communication, and greater transparency in the medico-legal workflow. While infrastructural limitations and training needs persist, MedLEaPR establishes a foundational digital framework critical for timely, reliable, and legally defensible medico-legal evidence under India’s reformed criminal justice system.
32. Intravenous Magnesium Sulphate versus Oral Nifedipine for Tocolysis: Maternal and Neonatal Outcomes
Paaka Madhurima, Kavitha Dharavath, Vemula Sravanthi
Abstract
Background: Preterm labour is a major cause of neonatal morbidity and mortality. Tocolytic therapy aims to delay delivery to allow corticosteroid administration and improve neonatal outcomes. This study compared the efficacy and safety of magnesium sulphate and nifedipine in managing preterm labour.
Methods: A prospective observational study was conducted at Government Maternity Hospital, Hanumakonda, from July 2023 to December 2024. A total of 100 women with preterm labour were enrolled, with 50 receiving intravenous magnesium sulphate and 50 receiving oral nifedipine. Baseline characteristics, tocolytic efficacy, maternal adverse effects, and neonatal outcomes were systematically recorded and analysed.
Results: Baseline demographics were comparable between groups. Nifedipine achieved a significantly greater mean delay in delivery (6.1 ± 3.2 days) compared with magnesium sulphate (4.6 ± 2.4 days). Prolongation of pregnancy beyond 48 hours and 7 days was higher in the nifedipine group. Maternal adverse effects were mild; nifedipine produced more headache and flushing, while magnesium sulphate showed occasional hypotension and reduced reflexes. Neonatal outcomes, including birth weight, APGAR scores, and NICU admissions, were similar between groups.
Conclusion: Nifedipine demonstrated superior tocolytic efficacy with good maternal tolerability, making it a preferable first-line agent for preterm labour.
33. From Speculum to Scope: Advancing Sinonasal Disease
Shubhangi Singh, Shiv Shanker Kaushik, Richa Gupta
Abstract
Background: Sinonasal diseases are common in otorhinolaryngology, ranging from inflammation to complex infections and neoplasms. Anterior rhinoscopy is often used for initial evaluation but is limited to the anterior nasal structures, thus missing deeper pathologies. Nasal endoscopy offers a more comprehensive view, allowing detailed visualization of both the anterior and posterior nasal cavities. Despite its superior diagnostic capability, nasal endoscopy remains underused owing to factors such as cost and the expertise required.
Objective: To compare the merits and demerits of nasal endoscopy versus anterior rhinoscopy in the diagnosis of sinonasal disease.
Methods: A prospective observational study was conducted on 110 patients presenting to the Department of ENT, PMCH, Udaipur, from April 2024 to March 2025 with symptoms of sinonasal disease. Each patient underwent both anterior rhinoscopy and diagnostic nasal endoscopy.
Results: Nasal endoscopy proved significantly better than anterior rhinoscopy for detecting sinonasal abnormalities, identifying conditions such as concha bullosa (40.90% vs. 2.72%) and ethmoidal polyps (19.09% vs. 9.09%). It also provided a more detailed assessment of regions such as the sphenoethmoidal recess, superior turbinates, and nasopharynx, which are not accessible with anterior rhinoscopy.
Conclusion: The study highlights the critical role of nasal endoscopy in accurately diagnosing sinonasal disease, particularly in chronic or refractory cases where subtle or posterior pathology may be present. Despite being more resource-intensive, nasal endoscopy should be integrated into routine clinical practice for comprehensive evaluation, as it enables precise diagnosis, better treatment planning, and improved patient outcomes.
34. Predictors of Outcomes of Neonatal Acute Kidney Injury in a Tertiary Care Hospital
Akash Parashar, Sunita Khandelwal, Anjali Singh, Jai Singh
Abstract
Background: Acute kidney injury (AKI) is a common clinical syndrome in hospitalized children and imposes a heavy burden of mortality and morbidity. AKI is an acute, reversible rise in serum creatinine (SCr) levels accompanied by a reduction in urine output (oliguria or anuria).
Objective: To study the etiology, clinical profile, and outcome of acute kidney injury (AKI) in neonates admitted to the NICU of JK Lon Hospital, Kota.
Materials and Methods: A prospective cross-sectional study was conducted in the NICU of JK Lon Hospital with 255 neonates. Neonates (≤28 days of age) meeting AKI criteria were included.
Results: Among 255 neonates, mortality was 16.9%. Low birth weight, sex, gestational age, and mode of delivery showed no significant association with outcome. Sepsis was the most common etiology, while asphyxia and higher HIE grades, especially grade 3, were strongly linked to mortality. Significant predictors of death included metabolic acidosis, elevated urea and creatinine levels, and AKI stage 3. Most cases occurred in summer, but deaths were more common during monsoon. Overall, severe metabolic and renal abnormalities were key determinants of poor outcome.
Conclusion: Severity of illness, hypoxic injury, metabolic acidosis, and advanced AKI stage are the primary determinants of mortality in neonatal AKI, rather than demographic factors.
35. A Study on Thyroid Profile in Chronic Kidney Disease
S. Deepika, S. Sujatha, M. Priyanka
Abstract
Background: Chronic kidney disease (CKD) is associated with significant alterations in thyroid hormone metabolism. Understanding these changes is crucial for comprehensive patient management. The aim of this study was to evaluate thyroid function abnormalities in patients with chronic kidney disease and to correlate these changes with the severity of renal impairment.
Methods: A cross-sectional observational study was conducted on 100 CKD patients and 50 age and sex-matched healthy controls. Thyroid function tests including T3, T4, TSH, FT3, and FT4 were measured. Patients were categorized according to CKD stages based on estimated glomerular filtration rate (eGFR). Statistical analysis was performed using appropriate tests.
Results: The mean age of CKD patients was 52.3±12.4 years with male predominance (64%). Significantly lower levels of T3 and FT3 were observed in CKD patients compared to controls (p < 0.001). T4, FT4, and TSH levels showed no significant difference. The prevalence of low T3 syndrome increased with advancing CKD stages, being present in 78% of stage 5 CKD patients. A significant negative correlation was found between serum creatinine and T3 levels (r = -0.542, p < 0.001).
Conclusion: Thyroid dysfunction, particularly low T3 syndrome, is highly prevalent in CKD patients and correlates with disease severity. Regular thyroid function monitoring should be considered in CKD management.
36. A Study on Menstrual Hygiene and Its Association with Perceived Reproductive Morbidity in Adolescent Girls of Slum Region
Sudiksha Rana, Sumit Kumar Singh, Himanshu Mamgain, Anupama Arya, Shalini Rawat, Shivani Dhyani
Abstract
Objectives: The present study aimed to evaluate various aspects of menstrual hygiene and to assess reproductive morbidities among adolescent girls in a slum area of Dehradun, Uttarakhand, India.
Methods: Data were collected through a house-to-house survey in the community, and girls were interviewed using a predesigned questionnaire. The questionnaire covered sociodemographic details, knowledge about menstruation, menstrual patterns and practices, hygiene measures followed, and associated morbidities such as dysmenorrhea, genital tract infections, urinary tract infections, and bad odour.
Results: Of 250 adolescent girls, most were in the 14–16 years age group. The mean age at menarche was 12.7 years. Most mothers, 136 (54.4%), were illiterate, and 170 (68%) of girls belonged to lower socioeconomic strata. Non-disposable linen was used by 55.2% of girls, 67.6% used 2–3 pads per day, and 78.8% reused pads. Of the 250 girls, 188 (75.2%) had reproductive morbidities. The most common morbidities were dysmenorrhoea 84 (33.6%), menstrual irregularities 55 (22%), itching in the genitalia 18 (7.2%), and burning micturition 13 (5.2%). Only 35.2% of girls had availed health care services.
Conclusions: Reproductive morbidities are common among adolescent girls in slum regions, with dysmenorrhea and menstrual irregularities the most frequent. Maternal illiteracy, lower socioeconomic status, and lack of awareness about menstruation and menstrual hygiene are the factors most commonly associated with these morbidities. Regular health check-up camps should therefore be organised in slum areas to diagnose and treat reproductive morbidities in adolescent girls, and to educate mothers and adolescent girls about menstrual hygiene and the prevention of these morbidities.
37. Electrolyte Disturbances and Cardiac Complications in Postoperative Patients
Priyambada Patra, Kinjal Rameshbhai Balva, Drashti Kamleshbhai Patel
Abstract
Background: Normal cardiac function is contingent on electrolyte balance. Post-surgical changes, including surgical stress, fluid shifts, blood loss, anesthetic agents, and alterations in renal function, make the post-operative period highly vulnerable to electrolyte disturbances. Such disturbances can grossly affect cardiac electrophysiology, resulting in arrhythmias and other cardiac complications. Early recognition and correction of electrolyte disturbances are therefore imperative to reduce morbidity and mortality during this period.
Objectives: The study aimed to determine the prevalence of electrolyte disturbances in post-operative patients and their relationship with cardiac complications during the early post-operative period.
Materials and Methods: A total of 176 patients were enrolled in this prospective observational study conducted over a period of one year in a tertiary care hospital. A consecutive sampling technique was used. Serum electrolytes, namely sodium, potassium, calcium, and magnesium, were measured within 72 hours following surgery. Cardiac complications were recorded based on clinical assessment with the aid of electrocardiography. Data analysis used descriptive statistics, and the association of electrolyte disturbances with cardiac complications was assessed using the Chi-square test, with a p-value <0.05 considered statistically significant.
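As an illustration of the Chi-square analysis described in the Methods, the sketch below computes a Yates-corrected chi-square statistic and an odds ratio for a 2×2 exposure-outcome table. The counts are hypothetical, invented purely for demonstration; they are not the study's data.

```python
# Illustrative only: hypothetical 2x2 counts, NOT the study's data.
# Rows: hypokalemia present / absent; columns: cardiac complication yes / no.
obs = [[30, 10],
       [20, 40]]

n = sum(sum(row) for row in obs)
row_tot = [sum(row) for row in obs]
col_tot = [sum(obs[i][j] for i in range(2)) for j in range(2)]

# Pearson chi-square with Yates' continuity correction (usual for 2x2 tables)
chi2 = 0.0
for i in range(2):
    for j in range(2):
        expected = row_tot[i] * col_tot[j] / n
        chi2 += (abs(obs[i][j] - expected) - 0.5) ** 2 / expected

significant = chi2 > 3.841  # critical chi-square value for df = 1 at alpha = 0.05

# Odds ratio for complication given the exposure
odds_ratio = (obs[0][0] * obs[1][1]) / (obs[0][1] * obs[1][0])
```

With these made-up counts the statistic is about 15.04 (p < 0.05) and the odds ratio 6.0; the study's actual tables would of course yield different values.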
Results: Electrolyte imbalance was a common finding in post-operative patients, and the most common imbalance was hyponatremia and hypokalemia. Cardiac complications, especially arrhythmias, occurred relatively often in patients who had electrolyte imbalance. The odds ratio for cardiac complications was highest for hypokalemia, followed by hyponatremia and then hypocalcemia. Cardiac complications occurred significantly less often in patients who had normal electrolyte values.
Conclusion: Electrolyte imbalance is common in the postoperative period and is independently associated with cardiac complications. Electrolyte imbalances must be closely monitored and corrected postoperatively in order to prevent cardiac morbidity.
38. In silico Evaluation of Promising Epigenetic Biomarkers for the Detection of Colon Adenocarcinoma
Payal Kulhari, Suman Kumar Ray, Ram Rattan Negi
Abstract
Introduction: Colon adenocarcinoma (COAD) is a common and fatal cancer worldwide, with a high death rate in India as a result of late diagnosis. Conventional screening techniques, such as stool tests and colonoscopies, are expensive, invasive, and often insensitive in identifying early-stage disease. The development of reliable, non-invasive biomarkers is therefore crucial to improving prognosis and early diagnosis. Epigenetic alterations, especially DNA methylation changes, occur early in tumor development and can be detected in circulating cell-free DNA (cfDNA), making them promising candidates for liquid biopsy-based diagnostics.
Objective: The purpose of this study was to identify and validate epigenetic biomarkers for the non-invasive diagnosis and prognosis of COAD, with a particular emphasis on SEPT9 and SDC2. The objective was to utilize computational techniques to assess their expression patterns and methylation status, with a focus on developing methylation assays suitable for early diagnosis and disease monitoring.
Materials and Methods: The Human Protein Atlas, TCGA, UALCAN, GEPIA, and other publicly available multi-omics datasets were utilized to evaluate gene expression, promoter methylation, protein localization, and survival relationships for SDC2 and SEPT9. By comparing tumor and normal tissues, bioinformatics analyses revealed variations in methylation. The analysis of single-cell RNA sequencing data, with an emphasis on epithelial lineage, confirmed the expression of specific genes in distinct cell types.
Results and Discussion: Bioinformatics analysis revealed significant promoter hypermethylation of SEPT9 and SDC2 in COAD samples compared to normal colon tissue. SDC2 demonstrated subtype-specific downregulation, whereas SEPT9 showed significant overexpression, especially in non-mucinous malignancies. Immunohistochemistry confirmed variable SDC2 expression and elevated SEPT9 protein levels. Single-cell RNA sequencing showed that both genes are highly expressed in epithelial cells, supporting their specificity as epigenetic biomarkers. Increased expression of both genes correlated with reduced overall survival on survival analysis, underscoring their potential as prognostic indicators.
39. Analyzing the Incidence and Risk Factors of Retinopathy in Premature Infants
Shipra Singhi, Sunita Bishnoi
Abstract
Background: Retinopathy of prematurity is a leading cause of preventable childhood blindness, particularly among preterm and low birth weight neonates. Understanding its incidence and associated risk factors is essential for effective screening and prevention.
Objectives: To determine the incidence of retinopathy of prematurity in preterm and low birth weight neonates and to assess the association between various perinatal and neonatal risk factors with its occurrence.
Material and Methods: This prospective observational study included 520 preterm and/or low birth weight neonates admitted to a tertiary care neonatal intensive care unit. All eligible neonates underwent serial retinal examinations, and relevant maternal and neonatal risk factors were analyzed.
Results: Retinopathy of prematurity was diagnosed in 84 neonates, with an incidence of 16.15%. Lower gestational age, lower birth weight, prolonged oxygen therapy, and respiratory distress syndrome were significantly associated with ROP development, while sex, twin status, prenatal steroid exposure, and maternal systemic diseases showed no significant association.
Conclusion: Retinopathy of prematurity remains a significant morbidity among preterm neonates. Early screening and identification of high-risk infants, along with careful management of modifiable risk factors, are crucial in preventing disease progression and visual impairment.
40. Onychoscopic Analysis of Nail Disorders among Older Adults
Vivek Nikam
Abstract
Background: Nail disorders are common in the geriatric population and often pose diagnostic challenges due to overlapping clinical features and age-related changes. Onychoscopy has emerged as a useful non-invasive tool for detailed nail assessment.
Objectives: To study the clinico-epidemiological profile and onychoscopic patterns of nail disorders in the geriatric population.
Material and Methods: A cross-sectional observational study was conducted on 120 geriatric patients with nail disorders. Detailed clinical examination and onychoscopic evaluation of nail fold, nail plate, nail bed, and hyponychium were performed and correlated.
Results: Degenerative nail changes were predominant. Onychoscopy identified a higher frequency of nail abnormalities compared to clinical examination and showed strong correlation with clinical findings across all nail components.
Conclusion: Onychoscopy significantly enhances the evaluation of nail disorders in elderly patients and should be incorporated into routine geriatric dermatological practice.
41. Cutaneous Manifestations of Chronic Kidney Disease
S. S. Yadav, Bulbul Yadav
Abstract
Background: Chronic kidney disease (CKD) is a progressive systemic disorder associated with multiple dermatological manifestations that significantly affect patients’ quality of life.
Objectives: To study the spectrum and frequency of cutaneous manifestations in patients with chronic kidney disease.
Materials and Methods: This hospital-based observational study was conducted from February 2023 to November 2025 at Nirmala Hospital & Research Center, Jaipur. All diagnosed CKD patients were included. Patients with acute kidney injury or pre-existing primary dermatological disorders unrelated to CKD were excluded. Detailed clinical, dermatological, and laboratory evaluations were performed. Data were analyzed using descriptive statistics.
Results: A total of 327 CKD patients were studied (mean age 52.4 ± 11.6 years; male:female ratio 1.8:1). The most common etiology of CKD was obstructive uropathy (41%). Non-specific cutaneous manifestations were predominant. Pruritus (72.7%), hyperpigmentation (70%), and xerosis (67.8%) were the most frequent findings. Among specific lesions, acquired perforating dermatosis (8.5%) and porphyria cutanea tarda (3.2%) were observed.
Conclusion: Cutaneous manifestations are highly prevalent in CKD patients, with non-specific lesions being more common than specific dermatoses. Early identification and appropriate dermatological care should be integrated into routine CKD management.
42. Assessing Hemoglobinopathy Occurrence via High-Performance Liquid Chromatography in a Tertiary Care Setting
Darshanaben Kanabhai Gohel, Nishant Pujara, Sandip Patel
Abstract
Background: Hemoglobinopathies are among the most common inherited disorders worldwide, with a significant burden in India. High-performance liquid chromatography (HPLC) has become the gold standard for detecting and classifying these disorders, offering precise quantification and identification of hemoglobin variants.
Aim: To estimate the prevalence and distribution of various hemoglobinopathies detected by HPLC in a tertiary care center in India.
Material and Methods: This cross-sectional observational study included 310 patients screened over a period of one year. Patients of all age groups and both sexes referred for hemoglobinopathy screening were enrolled. Detailed clinical data were collected, and venous blood samples were analyzed using HPLC. Demographic details, age-wise distribution, and prevalence of hemoglobinopathies were recorded.
Results: Of the 310 patients, 125 (40.3%) were male and 185 (59.7%) were female. The majority of patients belonged to the 21–30 years age group (33.9%). HPLC analysis showed 235 (75.8%) had normal hemoglobin while 38 (12.3%) had β-thalassemia trait, 12 (3.9%) had sickle cell trait, 4 (1.3%) had sickle cell-β-thalassemia compound heterozygosity, and 2 (0.6%) had sickle cell homozygosity. Thalassemia trait was most commonly diagnosed in the 21–30 years group. Among thalassemia carriers, RDW was predominantly in the 16–20 range.
Conclusion: This study highlights the considerable burden of hemoglobinopathies, particularly β-thalassemia trait, in the regional population. HPLC proved highly effective for screening and diagnosis. Strengthening screening programs, especially among young adults, along with public awareness initiatives, is essential to reduce the hemoglobinopathy burden in India.
43. Vitamin D as a Novel Biomarker for Grading the Severity of Preeclampsia: A Cross-Sectional Case Control Study
Vibha Khare, Akshatha R., Akhilesh Bhamoriya, Tapan Sing Pendro
Abstract
Introduction: Preeclampsia remains a leading cause of maternal and perinatal morbidity worldwide. The importance of vitamin D for placental function, endothelial integrity, and immunological regulation is increasingly recognized. The purpose of this study was to measure serum vitamin D levels in pregnant women with preeclampsia and in normotensive pregnant women, and to assess whether vitamin D can serve as a biomarker for grading the severity of the condition.
Aims and Objectives: The objectives of this study were: (i) to estimate serum vitamin D levels in normotensive pregnancies and in mild and severe pre-eclampsia, and (ii) to determine the relationship between disease severity and vitamin D levels.
Material and Methods: A tertiary care teaching hospital served as the site of this cross-sectional case-control study. A total of 120 third-trimester pregnant women were recruited and divided into three groups: 40 normotensive controls, 40 with mild preeclampsia, and 40 with severe preeclampsia. A competitive enzyme-linked immunosorbent assay (ELISA) was used to measure serum vitamin D levels. Biochemical and clinical parameters were compared among groups. One-way ANOVA and Pearson correlation were used for statistical analysis.
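As a minimal sketch of the two statistical procedures named in the Methods, one-way ANOVA and Pearson correlation are implemented below from their textbook formulas on synthetic vitamin D values. All numbers are invented for illustration and are not the study's measurements.

```python
# Synthetic vitamin D values (ng/mL) for three groups; invented, not study data.
control = [28, 30, 27, 31, 29]
mild    = [22, 20, 23, 21, 24]
severe  = [14, 15, 13, 16, 12]
groups = [control, mild, severe]

def mean(xs):
    return sum(xs) / len(xs)

grand = mean([x for g in groups for x in g])
k = len(groups)
n = sum(len(g) for g in groups)

# One-way ANOVA: ratio of between-group to within-group mean squares
ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))

# Pearson correlation between vitamin D and systolic BP (synthetic pairs)
vitd = [28, 25, 22, 18, 15, 12]
sbp  = [118, 124, 132, 141, 150, 158]

def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

r = pearson(vitd, sbp)  # strongly negative, mirroring the reported trend
```

On this toy data the F statistic is large and r is close to -1, the pattern the abstract reports; real clinical data would be far noisier.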
Results: Women with preeclampsia had significantly lower mean serum vitamin D levels than normotensive controls (p < 0.00001). Vitamin D levels declined progressively as the severity of preeclampsia increased. Diastolic blood pressure (DBP), systolic blood pressure (SBP), and proteinuria all showed significant negative correlations with serum vitamin D.
Conclusion: Serum vitamin D levels are markedly lower in preeclampsia and show a negative correlation with the severity of the illness. One possible biomarker for identifying high-risk pregnancies and grading preeclampsia is vitamin D.
44. Effect of 4mg Dexamethasone for Prevention of Post-Operative Nausea and Vomiting in Laparoscopic Surgeries
Rakshitha R., Prashantha Kumar H. M., Holy Joy, Narasimha Reddy B., Saraswathi P. Devi
Abstract
Background: Laparoscopy was first introduced as a therapeutic alternative to laparotomy more than a century ago. Since then, the field of laparoscopic surgery has undergone enormous development and expansion, to the point where it is now the standard treatment for a wide range of surgical procedures, including cholecystectomy, appendicectomy, gynecologic surgeries, bariatric surgery, hernia repair, and even complex oncologic operations. However, laparoscopic surgeries are associated with a high incidence of postoperative nausea and vomiting (PONV), reported at 40%–80%. A number of drugs have been used for its prevention. Dexamethasone, a glucocorticoid with antiemetic, anti-inflammatory, and analgesic effects, has been shown to reduce the incidence of PONV, but the optimal dose has not been clearly defined. In this study, we aimed to evaluate the effect of a 4 mg dose of dexamethasone on the incidence of PONV in patients undergoing laparoscopic surgery.
Methods: A double-blind randomized controlled study was performed on 70 patients posted for elective laparoscopic surgeries under general anesthesia to assess the efficacy of a 4 mg dose of dexamethasone in preventing PONV. Patients were randomly assigned to two groups: a 4 mg dexamethasone (1 ml) group and a 1 ml normal saline group. The incidence of nausea and vomiting and the need for rescue antiemetics were evaluated during the first 24 postoperative hours.
Results: Patients who received IV dexamethasone 4 mg had a significant reduction in PONV (p < 0.01), and the need for rescue antiemetic drugs was also lower in the dexamethasone group than in the normal saline group.
Conclusion: Dexamethasone 4 mg given intravenously before induction of anesthesia effectively controls postoperative nausea and vomiting in laparoscopic surgeries.
45. Prevalence and Anatomical Distribution of Lateral Canals in Maxillary Premolars Assessed Using Cone-Beam Computed Tomography: An Original Study
Manoj Meena, Monika Sharma, Akshay Verma, Purusharth Kumar Sharma
Abstract
Aim: To evaluate the presence, frequency, and anatomical location of lateral canals in extracted maxillary premolars using cone-beam computed tomography (CBCT).
Materials and Methods: Three hundred extracted human maxillary premolars were subjected to CBCT imaging under standardized parameters. Axial, sagittal, and coronal sections were evaluated for root canal configuration according to Vertucci’s classification and for the presence of lateral canals. Data were recorded and analyzed using descriptive statistics.
Results: The majority of maxillary premolars exhibited Vertucci Type I canal configuration. Lateral canals were identified in 1.0% of specimens (3 out of 300 teeth). When present, lateral canals were predominantly located in the middle and apical thirds of the root. Complex canal configurations including Vertucci Types II, IV, V, VI, and VIII were also observed.
Conclusion: Lateral canals in maxillary premolars are relatively rare but clinically significant anatomical variations. CBCT is a reliable non-destructive imaging modality for detecting lateral canals and complex root canal morphology, thereby aiding in improved endodontic diagnosis and treatment planning.
46. Efficacy of Dexmedetomidine as an Adjuvant with Ropivacaine in Paravertebral Block in Surgery for Breast Cancer – A Study of 50 Cases
Yagnik Jagdishbhai Vaja, Jaykishan J. Gol, Krishna Dhamat
Abstract
Background: Effective postoperative pain control after breast cancer surgery is essential to reduce morbidity, opioid consumption, and patient discomfort. Thoracic paravertebral block (TPVB) is an established regional anesthesia technique that provides unilateral analgesia with minimal systemic effects. Ropivacaine is commonly used for TPVB due to its favorable safety profile. Dexmedetomidine, a highly selective α₂-adrenergic agonist, has been increasingly used as an adjuvant to local anesthetics to enhance analgesic efficacy. This study evaluated the efficacy of dexmedetomidine as an adjuvant to ropivacaine in TPVB for patients undergoing modified radical mastectomy.
Material and Methods: This prospective, randomized, controlled study was conducted on 50 female patients aged ≥18 years, belonging to ASA physical status I–III, scheduled for modified radical mastectomy. Patients were randomly allocated into two groups: Group PR received TPVB with 0.5% ropivacaine, while Group PRD received TPVB with 0.5% ropivacaine plus dexmedetomidine (1 μg/kg). TPVB was performed at T1, T3, and T5 levels before induction of general anesthesia. Primary outcomes included duration of analgesia and total postoperative opioid consumption. Secondary outcomes included onset of sensory block, hemodynamic parameters, and postoperative pain scores using the Visual Analogue Scale (VAS), Ramsay Sedation Scores, patient satisfaction, and adverse effects.
Results: Group PRD demonstrated a significantly faster onset of sensory block, prolonged duration of analgesia, lower postoperative VAS scores at all time intervals, and significantly reduced tramadol consumption compared to Group PR (p < 0.001). Hemodynamic parameters showed a controlled and stable reduction in heart rate and blood pressure in the dexmedetomidine group without clinical instability. Patient satisfaction was higher in Group PRD, with no significant increase in adverse effects.
Conclusion: Dexmedetomidine as an adjuvant to ropivacaine in TPVB significantly improves postoperative analgesia, reduces opioid requirement, and enhances patient satisfaction without increasing complications.
47. Correlation of Clinical, Computed Tomographic, and Intraoperative Findings in Chronic Rhinosinusitis
Zeel Patel, Nimisha Nimkar, Rachana Prajapati
Abstract
Background: Chronic rhinosinusitis (CRS) is a common inflammatory condition of the nasal cavity and paranasal sinuses persisting for more than 12 weeks and causing significant morbidity worldwide. Accurate diagnosis and precise delineation of disease extent are essential for effective management. With the advent of Functional Endoscopic Sinus Surgery (FESS), diagnostic nasal endoscopy and computed tomography (CT) of paranasal sinuses have become integral to preoperative evaluation. However, discrepancies between radiological findings and intraoperative observations still exist, particularly regarding anatomical variations and sinus involvement. Establishing a correlation between clinical, radiological, and operative findings is therefore crucial to optimize surgical planning and outcomes.
Material and Methods: This prospective observational study was conducted in the Department of Otorhinolaryngology at a tertiary care teaching hospital in Western India between May 2023 and August 2024. A total of 42 patients diagnosed with chronic rhinosinusitis and planned for endoscopic sinus surgery were included. All patients underwent detailed clinical evaluation, diagnostic nasal endoscopy, and CT scan of paranasal sinuses prior to surgery. CT findings were compared with intraoperative observations to assess diagnostic accuracy. Sensitivity, specificity, accuracy, and Cohen’s kappa coefficient were calculated to evaluate agreement between CT and operative findings.
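The agreement metrics listed in the Methods (sensitivity, specificity, accuracy, and Cohen's kappa) can be computed from a CT-versus-intraoperative 2×2 table as sketched below. The cell counts are hypothetical, chosen only to sum to the study's n = 42; they are not the actual results.

```python
# Hypothetical CT-vs-intraoperative counts for one finding (invented; n = 42
# mirrors the study size only). Intraoperative observation is the reference.
tp, fp, fn, tn = 30, 4, 2, 6
n = tp + fp + fn + tn

sensitivity = tp / (tp + fn)        # true positives / all reference-positive
specificity = tn / (tn + fp)        # true negatives / all reference-negative
accuracy = (tp + tn) / n

# Cohen's kappa: observed agreement corrected for chance agreement
po = (tp + tn) / n
pe = ((tp + fp) / n) * ((tp + fn) / n) + ((fn + tn) / n) * ((fp + tn) / n)
kappa = (po - pe) / (1 - pe)
```

With these illustrative counts, sensitivity is 0.94, specificity 0.60, and kappa about 0.58; the study's real tables per sinus and per anatomical variation would each yield their own values.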
Results: Among the 42 patients, males predominated (54.76%), with the most affected age group being 31–40 years. Nasal obstruction was the most common symptom (92.85%). Maxillary sinus was the most frequently involved sinus on CT, followed by ethmoid sinuses. CT scan demonstrated high sensitivity for detecting sinus disease and osteomeatal complex obstruction, with substantial agreement for osteomeatal complex blockage and deviated nasal septum. However, lower sensitivity and agreement were observed for certain anatomical variations such as concha bullosa and Onodi cells.
Conclusion: CT scan of paranasal sinuses is a highly sensitive tool for evaluating CRS and guiding surgical management. When combined with clinical assessment and nasal endoscopy, it provides optimal preoperative planning and improves intraoperative safety.
48. Mental Health Stigma and Attitudes: A Comparative Cross-Sectional Study among Psychiatric Patients and Their Caregivers in the Malwa Region
Akshay Soni, Priya Rai, Maitreyee Dale, Lovepreet Chabarwal
Abstract
Background: Stigma is one of the major obstacles to timely mental health service use and engagement. While caregivers may help shorten treatment delay, they may also transmit stigmatizing beliefs that influence patients' help-seeking behavior. This study compared self-stigma of seeking help and community attitudes toward mental illness between psychiatric patients and their primary caregivers in the Malwa region, using the Self-Stigma of Seeking Help scale (SSOSH) and the 12-item Community Attitudes toward the Mentally Ill scale (CAMI-12).
Methods: In a hospital-based comparative cross-sectional design, psychiatric patients and their primary family caregivers were recruited consecutively from outpatient and inpatient psychiatry services. Sociodemographic and clinical data were collected. The SSOSH (10 items; higher scores = greater self-stigma of help-seeking) and the CAMI-12 (12 items; higher scores = less stigmatizing community attitudes after reverse coding) were administered to both groups. Group differences were tested with independent-samples t-tests, and effect sizes were calculated. Multivariable linear regression examined predictors of SSOSH and CAMI-12 scores, controlling for key covariates.
Results: A total of 160 patients and 160 caregivers were analyzed. Patients had higher SSOSH scores than caregivers, 31.6 (SD 7.5) versus 25.4 (SD 6.8), mean difference 6.2, p < 0.001, Cohen's d = 0.88. Caregivers held more stigmatizing community attitudes (lower CAMI-12 total) than patients (40.1 ± 6.5 vs 43.2 ± 6.0; p < 0.001; d = 0.49). In the adjusted models, rural residence and lower education were independently associated with higher SSOSH and lower CAMI-12 scores in both groups. Caregiver CAMI-12 "prejudice/exclusion" scores were negatively associated with patient SSOSH (β = -0.24 per unit of CAMI-12; p = 0.002), suggesting attitudinal contagion within patient-caregiver dyads.
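As a check on the reported effect size, Cohen's d can be reconstructed from the summary statistics quoted above (SSOSH means, SDs, and equal group sizes of 160). Rounding in the published means and SDs gives d ≈ 0.87, close to the reported 0.88.

```python
# Summary statistics as reported in the abstract (SSOSH scores)
m1, s1, n1 = 31.6, 7.5, 160   # patients
m2, s2, n2 = 25.4, 6.8, 160   # caregivers

# Pooled standard deviation; with equal n this reduces to sqrt((s1^2 + s2^2) / 2)
sp = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5

# Cohen's d = standardized mean difference; ~0.87 from the rounded summaries
d = (m1 - m2) / sp
```

The small gap between 0.87 and the reported 0.88 is consistent with the authors computing d from unrounded raw data.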
Conclusion: Patients carried greater internalized barriers to help-seeking, whereas caregivers exhibited comparatively more negative community attitudes. Interventions in Malwa should be dyad-focused, combining patient-centred stigma reduction with caregiver psychoeducation, to enhance engagement and continuity of care.
49. Morphological Spectrum of Bone Marrow Findings and its role in the Evaluation of Haematological Disorders
Krishnadeep Sahu, Puja Singh, Amar Gangwani, Himani Yadav
Abstract
Introduction: Bone marrow examination (BME) is a cornerstone diagnostic procedure for evaluating hematological disorders, providing critical insights into cellular morphology, architecture, and iron stores. This study aimed to describe the clinico-morphological spectrum of bone marrow findings and assess the diagnostic utility of bone marrow aspiration (BMA) and biopsy (BMB) in a tertiary care center in the Bundelkhand region of Madhya Pradesh, India.
Materials and Methods: A prospective, observational study was conducted over a specified period, including 90 patients who underwent BME for various hematological indications. Peripheral blood parameters, clinical features, and bone marrow morphology from both aspiration and trephine biopsy were analyzed. Special stains (Perls’ Prussian blue, Reticulin) were employed as needed.
Results: The study population skewed towards younger adults (20-29 years, 31.1%), with a slight male predominance (53.3%). The most common clinical features were weakness (90%) and pallor (88.9%). Megaloblastic anemia (MA) was the most frequent diagnosis (31.1%), followed by mixed deficiency anemia (MDA, 20.0%) and iron deficiency anemia (IDA, 11.1%). Non-neoplastic disorders constituted 85.5% of cases, while conditions such as acute leukemia, aplastic anemia, and myelofibrosis accounted for 14.4%. Bone marrow biopsy was pivotal in cases of dry tap or when architectural assessment was crucial, such as in myelofibrosis and aplastic anemia.
Conclusion: Nutritional deficiency anemias, particularly megaloblastic anemia, are the predominant hematological disorders in the Bundelkhand region. Bone marrow examination remains an indispensable, cost-effective tool for definitive diagnosis, especially in differentiating between nutritional deficiencies, marrow failure syndromes, and hematological malignancies.
50. Histopathological Spectrum of Lesions in Nasopharynx and Sinonasal Sinuses: A Tertiary Care Experience
Sujata Lawa, Deepak Maini, Sharda Dawan
Abstract
Background: Lesions of the nasal cavity, paranasal sinuses, and nasopharynx encompass a wide range from non-neoplastic inflammatory conditions to benign and malignant neoplasms. Because of overlapping clinical features, histopathological examination remains the gold standard for diagnosis.
Aims: To evaluate the histopathological spectrum of lesions in the nasopharynx and sinonasal region, analyze their demographic distribution, and compare findings with previous studies.
Materials and Methods: This retrospective cross-sectional study was conducted from June 2022 to June 2024 in the Department of Pathology, Sardar Patel Medical College, Bikaner. A total of 150 biopsies from sinonasal and nasopharyngeal regions were studied. Hematoxylin and eosin staining was performed; special stains were used when indicated. Data were analyzed statistically.
Results: Among 150 cases, 81 (54%) were non-neoplastic and 69 (46%) were neoplastic. Males predominated (63.3%), and the mean age was 40.08 years. The nasal cavity was the most common site (46.7%), followed by tonsillar region (26%). Inflammatory polyp was the most frequent non-neoplastic lesion, while squamous cell carcinoma was the most common malignant tumor. The association between age and lesion type was statistically significant (p < 0.001).
Conclusion: The sinonasal and nasopharyngeal regions show a diverse spectrum of lesions. Non-neoplastic inflammatory conditions predominate, but malignant neoplasms, particularly squamous cell carcinoma, constitute a significant subset, emphasizing the role of histopathology in accurate diagnosis and management.
51. Effect of Smartphone Use and Prolonged Screen Time on Digital Eye Strain (DES), Visual Acuity and Refraction among Medical Students: A Cross Sectional Study
Asima Hassan, Sajad Khanday, Javed Alikhan, Sadiya Sajad
Abstract
Aim: To determine the effect of smartphone use and prolonged screen time on digital eye strain (DES), visual acuity, refraction and overall ocular health among medical students.
Materials and Methods: A cross-sectional study was conducted at Government Medical College Srinagar, Kashmir, India. The study included 225 students of 2nd to 5th year MBBS who consented to participate. Information regarding participants' bio-data, screen time, and DES symptoms was gathered through a carefully designed self-administered questionnaire. A Snellen's chart was used to assess best corrected visual acuity, and refraction was noted. Chi-square and Pearson correlation tests were used, and analysis was conducted using SPSS software.
Results: Out of 225 participants, 186 (82.6%) reported at least one symptom of digital eye strain. Headache (n=96; 42.6%) and eye pain/discomfort (n=73; 32.44%) were the most commonly reported symptoms. Refractive error was found in 102 (45.33%) students, including myopia (n=78; 34.66%), hyperopia (n=13; 5.77%), and astigmatism (n=11; 4.88%). Mobile phones (n=225; 100%), laptops (n=175; 78.22%), and tablets/iPads (n=76; 33.77%) were the main electronic gadgets used by participants. Headache, ocular discomfort, redness, watering, itching, and burning of the eyes, along with neck/shoulder pain, were significantly associated with increased screen time; the most common refractive error among students with prolonged screen time was myopia (p<0.05).
Conclusion: This study reveals an alarming 82.6% prevalence of DES among medical students at GMC Srinagar attributable to increased screen time and smartphone usage, with headache and eye pain/discomfort being the most common symptoms, and a strong association of prolonged screen exposure with refractive errors, especially myopia.
52. Correlation between Red Cell Distribution Width and Severity of Ischemic Cerebrovascular Stroke
Namrata Patel, Fenil Vaghasiya, Ashok Kumar Choudhary, Purvi Patel
Abstract
Ischemic cerebrovascular stroke constitutes the majority of stroke burden worldwide and remains a leading cause of mortality and long-term disability. Despite advances in neuroimaging and reperfusion therapies, early identification of individuals at increased risk remains a major clinical challenge. Red cell distribution width (RDW), a routinely reported parameter in complete blood count, reflects variability in erythrocyte size and has emerged as a novel biomarker associated with adverse cardiovascular and cerebrovascular outcomes. Elevated RDW has been linked to systemic inflammation, oxidative stress, endothelial dysfunction, altered blood rheology, and prothrombotic states—key mechanisms implicated in ischemic stroke pathogenesis. This review examines the association between elevated RDW levels and ischemic stroke, synthesizing epidemiological evidence, exploring biological mechanisms, and evaluating clinical implications. Particular emphasis is placed on the relevance of RDW as a cost-effective biomarker in resource-limited settings such as South Gujarat. The potential role of RDW in stroke risk stratification and future research directions is discussed.
53. Evaluation of Posterior Segment in Advanced and Mature Cataract by B Scan Ultrasonography – A Prospective Study
N. Jayanthi, S. Sivapriya, Nikita N. Bhujang
Abstract
Background: Cataract is the most common preventable cause of bilateral blindness in India and the leading cause of vision loss in the elderly worldwide. It is also the primary cause of reversible blindness globally. Cataract refers to any opacity in the lens of the eye or its capsule, whether developmental or acquired. B-scan ultrasonography is a powerful, safe, cost-effective, non-ionizing, and non-invasive diagnostic tool for evaluating the hidden posterior segment lesions in eyes with opaque media caused by corneal opacities, dense cataracts, or vitreous hemorrhage, which complicate the ophthalmic evaluation. B-scan ultrasound, a two-dimensional imaging system, is particularly useful when the fundus cannot be accessed through direct or indirect ophthalmoscopy, such as in the presence of dense cataracts. It is routinely performed preoperatively in cases of dense cataract to evaluate posterior segment abnormalities that may influence visual prognosis after surgery.
Aim: To evaluate posterior segment pathology in eyes with opaque media due to mature and advanced cataract, in order to plan management and determine visual prognosis accordingly.
Objectives: (1) To evaluate posterior segment pathology in patients with mature and advanced cataract by B-scan. (2) To plan the management protocol based on B-scan findings. (3) To determine visual prognosis preoperatively.
Materials and Methods: This prospective study was conducted in 200 patients with mature and advanced cataract. Relevant history and details were collected. Detailed ophthalmic examination was done for classification into groups.
Results: Out of 200 patients, 77 eyes had advanced IMSC, 90 eyes MSC, and 33 eyes HMSC. Our findings demonstrated that B-scan ultrasonography is a useful tool for evaluating posterior segment pathology in patients with advanced and mature cataract. In this study, the majority of patients (89.5%) had a normal B-scan, indicating no significant abnormalities in most cases. Mild vitreous degeneration was observed in 3.5% of patients, a common finding in the aging population and often not associated with severe visual impairment. Moderate vitreous degeneration was found in 2.0% of patients, while severe vitreous degeneration was observed in 1.0%, indicating progression of the degenerative changes affecting the vitreous humor.
Conclusion: B-scan ultrasonography should be used in the preoperative evaluation of advanced and mature cataract to diagnose hidden posterior segment pathology, enhancing surgical planning and prognostication.
54. A Clinical Study of Anterior Chamber Depth Measurement as a Screening Tool for Primary Angle Closure Glaucoma
N. Jayanthi, S. Sivapriya, K. Indulatha
Abstract
Background: Glaucoma is the second leading cause of blindness worldwide, producing permanent visual loss. Angle-closure glaucoma is regarded as the primary cause of permanent blindness globally, with a greater incidence among Asians [1]. PACG is characterized by a narrow or closed anterior chamber angle, which leads to increased intraocular pressure and optic nerve damage. Anatomical risk factors include shallow anterior chamber depth, short axial length, and increased lens thickness.
Need for Screening: Gonioscopy is the gold standard for angle evaluation, but it is technique-sensitive and not always practical for mass screening. ACD measurement is a simple, non-invasive screening alternative.
Objectives: (1) To evaluate anterior chamber depth measurement as a method of screening for PACG. (2) To compare the parameters in eyes with PACS, PAC, and PACG.
Materials and Methods: This is a prospective study conducted on 150 patients with shallow anterior chamber and patients presenting with signs and symptoms of angle closure. Detailed history was collected. Detailed ophthalmic examination was done for classification into groups.
Results: Out of 150 patients, 36 eyes had open angles, 46 eyes PACS, 33 eyes PAC, and 35 eyes PACG. Our findings demonstrated that ACD is a significant parameter for identifying individuals at risk of primary angle closure glaucoma. There was a statistically significant difference between the mean ACD of the PACS, PAC, and PACG groups.
Conclusion: Anterior chamber depth measurement is an effective screening tool for primary angle closure glaucoma, especially in primary outreach centres where sophisticated equipment may not be available.
55. Diverse Cutaneous Reactions to Diclofenac Sodium: A Case Series of Three Patients
Sunil Mhatarba Vishwasrao, Sufala Sunil Vishwasrao, Amar Nagesh Kumar, Pollilan G. R.
Abstract
Diclofenac sodium is a frequently prescribed analgesic in the OPD and IPD practice of most clinicians and is also preferred for postoperative pain. The drug is economical and efficacious. Dermatological adverse reactions to diclofenac may manifest in moderate to severe forms. Early identification of the ADR and prompt treatment aid faster recovery, whereas delayed identification can allow severe manifestations that may prove fatal. Here we report three cases of diclofenac-induced ADRs with varied cutaneous manifestations.
56. A Study on Diastolic Dysfunction in Newly Diagnosed Type 2 Diabetes Mellitus and Its Correlation with Glycosylated Hemoglobin
Chiranjeevi Parnapalli, Bhargav Kiran Gaddam, Mani Ratnam Kothamasu
Abstract
Background: Diabetes mellitus (DM) is a long-term metabolic disease marked by elevated blood glucose levels, resulting from inadequate insulin production, resistance to insulin effects, or a combination of both. The interplay between diabetes and cardiovascular health is particularly important in the context of diastolic dysfunction, a precursor to heart failure with preserved ejection fraction (HFpEF).
Objective: This study investigates the occurrence of diastolic dysfunction in individuals newly diagnosed with type 2 diabetes mellitus (T2DM) and examines its association with glycosylated haemoglobin (HbA1c), a key indicator of long-term blood glucose regulation.
Methods: This prospective, non-interventional study was conducted over one year on 52 newly diagnosed type 2 diabetes mellitus patients aged 18 to 60 years. ECG, 2D echocardiography, FBS, PPBS, and HbA1c were performed. Patients were analysed on the basis of the presence of diastolic dysfunction on echocardiography. Quantitative data were analysed with the t test and qualitative data with the Chi-square and Fisher exact tests. Statistical significance was taken as p < 0.05.
Results: Among 52 participants, of those aged ≤55 years only 5 (22.7%) had LV diastolic dysfunction, while among those aged ≥56 years a substantial 24 (80.0%) had LV diastolic dysfunction. The gender distribution showed equal prevalence of LV diastolic dysfunction among males and females. Among individuals with HbA1c <6.4%, 4 (50.0%) had LV diastolic dysfunction, while among those with HbA1c ≥6.5%, a larger proportion, 25 (56.8%), had LV diastolic dysfunction. Among individuals with FBS <125 mg/dL, 9 (42.9%) had LV diastolic dysfunction, while among those with FBS ≥126 mg/dL, 20 (64.5%) did. Among individuals with PPBS <200 mg/dL, 10 had LV diastolic dysfunction, while among those with PPBS ≥200 mg/dL, 19 did.
Conclusion: Left ventricular diastolic dysfunction (LVDD) is frequently present in newly diagnosed, normotensive type 2 diabetes mellitus (T2DM) patients, indicating that subclinical cardiac involvement may start early in the disease course. Implementation of early cardiac evaluation, combined with stringent glycemic control and lifestyle modifications, may potentially delay or prevent the progression to overt heart failure in diabetic individuals.
57. Abnormal CTG Findings and Perinatal Outcome in Low-Risk Term Pregnancies
Talwar Karishma, K. Smitha, T. Kiruthika
Abstract
Background: Cardiotocography is a fetal surveillance modality used to detect fetal hypoxia and help reduce perinatal morbidity and mortality. Abnormal findings on CTG can lead to early intervention and improve perinatal outcomes.
Methods: This was an observational study where 94 low-risk pregnant patients with abnormal CTG tracings were selected. All of them underwent emergency caesarean section. Perinatal outcome was measured by noting APGAR scores at 1 minute and 5 minutes and the need for NICU admission.
Results: Out of the 94 patients, 42 (44.6%) were ≤25 years and 52 (55.4%) were >25 years. Primigravida accounted for 64 (68%) and multigravida 30 (32%). Gestational age was <37 weeks in 21 (22.3%) and ≥37 weeks in 73 (77.6%). There were 52 (55.4%) male babies and 42 (44.6%) female babies. Birth weight was <2.5 kg in 26 (27%) and ≥2.5 kg in 68 (73%). APGAR scores at 1 minute were ≥7 in 89 (95%) and <7 in 5 (5%). At 5 minutes, APGAR scores were ≥7 in 90 (96%) and <7 in 4 (4%). NICU admission was required for 57 (60%) babies. CTG findings were suspicious in 61 (65%) and abnormal in 33 (35%). NICU admission was noted in 22 (23.4%) of the abnormal CTG group and 35 (37.3%) of the suspicious CTG group. No statistical significance was found in the association between CTG findings and NICU admission (p=0.247) or between CTG findings and low APGAR scores at 1 minute (p=0.353) and 5 minutes (p=0.304).
Conclusion: The study showed that while CTG abnormalities lead to emergency interventions, they do not necessarily predict poor immediate neonatal outcomes. The association between abnormal CTG findings and NICU admission or low APGAR scores was not statistically significant. Further research with larger sample sizes is needed to explore these associations more definitively.
58. To Assess the Prevalence, Severity, and Long-Term Impact of Thyroid Dysfunction Following Intensity-Modulated Radiotherapy (IMRT) in Patients with Non-Thyroidal Head and Neck Cancers and to Evaluate the Potential Need for Routine Thyroid Function Monitoring in This Patient Population
A. Satish Kumar, Dalin Xavier, G.R. Santhilatha, G. Padma Sree
Abstract
Background: To assess the prevalence, severity, and long-term impact of thyroid dysfunction following intensity-modulated radiotherapy (IMRT) in patients with non-thyroidal head and neck cancers, and to evaluate the potential need for routine thyroid function monitoring in this patient population.
Methodology: This study is a prospective cohort study conducted at a single tertiary care institute, specifically the Radiotherapy Department of Government General Hospital (GGH), Guntur. The primary aim of this study is to assess the incidence and pattern of thyroid dysfunction following Intensity-Modulated Radiotherapy (IMRT) in patients with non- thyroid head and neck cancers. A total of 70 patients with non-thyroid head and neck squamous cell carcinoma (HNSCC) were prospectively evaluated for thyroid dysfunction following radiation treatment.
Results: In this prospective study, 70 patients with non-thyroid head and neck squamous cell carcinoma (HNSCC) were recruited. Of these, 8 patients (11.4%) developed subclinical hypothyroidism following treatment, whereas 62 (88.6%) retained normal thyroid function during the follow-up period. The median time to development of subclinical hypothyroidism was 3 months.
Conclusion: In conclusion, the study strongly advocates for early and sustained thyroid function monitoring post-radiotherapy, even in asymptomatic patients. Detecting subclinical hypothyroidism early opens a window for potential intervention before clinical symptoms arise. Future research should aim to explore long-term outcomes with larger sample sizes, integrate autoimmune and endocrine markers, and optimize radiation planning to mitigate risks. Such efforts are essential for improving survivorship quality and reducing the burden of preventable late effects in cancer care.
59. A Comparative Study on Locking Plate versus Intramedullary Nail in the Management of Proximal Humerus Fractures
Lavudi Rambabu, Shuja Nazim, C. Abednego
Abstract
Background: Proximal humerus fractures make up about 5% of all fractures, with a higher occurrence in older adults due to osteoporosis. Non-displaced fractures can be treated with conservative methods, but displaced fractures need surgery. Two main fixation methods, locking plates and intramedullary nails, are commonly used in orthopedic practice, but there is still debate about which is more effective. Locking plates offer better control in the metaphysis with fixed-angle constructs, while intramedullary nails provide biological fixation with less disruption to soft tissue. This study aimed to compare functional outcomes, complication rates, and radiological union between these two techniques in an Indian population.
Methods: A prospective comparative cohort study was carried out over 18 months at the Department of Orthopedic Surgery from January 2023 to June 2024. Seventy-five patients with displaced proximal humerus fractures (Neer classification II-IV) were assigned to either locking plate fixation (n=37) or intramedullary nail fixation (n=38). We measured functional outcomes using the Constant-Murley Score and the American Shoulder and Elbow Surgeons (ASES) score at 6 weeks, 3 months, 6 months, and 12 months. Secondary outcomes included the rates of radiological union, complication rates, need for revision surgery, and time to union. We performed statistical analysis with independent t-tests for parametric data, Mann-Whitney U tests for non-parametric data, and Chi-square tests for categorical data (p < 0.05).
Results: At 12 months, the mean Constant-Murley score was 76.2±8.4 for the locking plate group and 74.8±9.1 for the intramedullary nail group (p=0.436). The ASES scores were 78.4±7.6 for locking plates and 76.9±8.3 for intramedullary nails (p=0.352). Radiological union was seen in 94.6% of locking plate cases and 92.1% of intramedullary nail cases (p=0.601). Varus collapse occurred in 8.1% of locking plate cases compared to 5.3% for intramedullary nails (p=0.486). Revision surgery was necessary for 5.4% of the locking plate group and 7.9% of the intramedullary nail group (p=0.512). Both groups showed similar functional recovery and acceptable complication rates.
Conclusion: Both locking plate and intramedullary nail fixation are effective surgical options for displaced proximal humerus fractures, with similar functional outcomes and complication rates. Treatment should be tailored to the individual, considering fracture complexity, bone quality, and the surgeon’s expertise. These results support using both techniques as primary surgical options in Indian orthopedic practice, providing evidence-based outcomes that facilitate a return to daily activities and work.
60. Study of Correlation of Serum Ascitic Albumin Gradient with Oesophageal Varices in Patients with Portal Hypertension in Chronic Liver Disease – Retrospective Study
Sandip Kashinath Ghule, Shubhangi Kashinath Ghule, Umesh Badrinath Khedkar
Abstract
Background: Ascitic fluid analysis helps establish the etiology of ascites, and an elevated serum-ascites albumin gradient (SAAG) accurately indicates portal hypertension.
Method: 90 adult patients with liver disease were studied. USG was carried out for the diagnosis of cirrhosis of the liver. Blood examination included CBC, liver function tests, renal function tests, and a coagulation profile; Child-Pugh scores were calculated for disease severity, and paracentesis of ascitic fluid was performed. Ascitic fluid was analyzed for SAAG calculation.
Results: The grading of esophageal varices showed a statistically significant association with SAAG (p<0.001). In addition to the elevation of SAAG (based on serum albumin, g/dL), bilirubin (mg/dL) levels also increased.
Conclusion: The present pragmatic study shows there is a strong correlation between SAAG and the presence and severity of esophageal varices in patients with chronic liver disease having portal hypertension.
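For readers unfamiliar with the gradient discussed above, SAAG is a simple difference with a conventional 1.1 g/dL cut-off for portal hypertension; below is a minimal Python sketch (function names are illustrative, not from the study):

```python
def saag(serum_albumin, ascitic_albumin):
    """Serum-ascites albumin gradient; both values in g/dL from same-day samples."""
    return serum_albumin - ascitic_albumin

def high_saag(gradient):
    """A SAAG of 1.1 g/dL or more conventionally indicates portal hypertension."""
    return gradient >= 1.1

print(round(saag(3.5, 1.8), 1))   # 1.7 -> high-SAAG ascites (portal hypertension)
print(high_saag(saag(3.0, 2.4)))  # False: low-SAAG ascites suggests non-portal causes
```

The 1.1 g/dL threshold is the widely used convention for distinguishing portal-hypertensive from non-portal causes of ascites.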
61. Introduction and Impact of Mini Clinical Evaluation Exercise as an Assessment Tool for MBBS Interns Posted in the Department of Dermatology
Chandra Shekhar Jaiswal, Abhay Kumar Sinha, Vivekanand Waghmare
Abstract
Background: Competency-based medical education (CBME) has highlighted the importance of formative assessment methodologies that evaluate real-time clinical performance. Traditional assessment methods frequently fail to assess critical qualities like communication, professionalism, and clinical reasoning during actual patient encounters. Mini-Clinical Evaluation Exercise (Mini-CEX) is a systematic workplace-based assessment method that aims to close this gap.
Objectives: The present study was conducted to introduce Mini-CEX as a formative assessment tool for MBBS interns during their dermatology posting. The study also aimed to assess the impact of Mini-CEX on interns’ clinical competencies, including history taking, examination, clinical judgment, communication skills, and professionalism. Additionally, the perceptions of interns and faculty regarding the usefulness and feasibility of Mini-CEX were evaluated.
Methods: A prospective interventional study was conducted among MBBS interns posted in the Department of Dermatology. Interns underwent multiple Mini-CEX encounters using a standardized assessment proforma. Scores from initial and subsequent encounters were compared and feedback responses were analyzed.
Results: There was a statistically significant improvement in mean Mini-CEX scores across all assessed domains after repeated encounters. Interns and faculty reported high satisfaction with Mini-CEX, particularly highlighting its role in improving clinical confidence and feedback-based learning.
Conclusion: Mini-CEX is an effective, feasible, and acceptable formative assessment tool for MBBS interns in dermatology, contributing significantly to competency development.
62. Comparative Assessment of Systemic Inflammatory Response Between Drug-Sensitive and Rifampicin-Resistant Pulmonary Tuberculosis Under NTEP Guidelines
Vikas Kumar Mishra, Nishant Srivastava, Pravin Gulab Dandekar, Rajesh Kharadee, Sourabh Jain, Vinod Kumar Kurmi
Abstract
Background: Tuberculosis (TB) remains one of the most significant infectious diseases globally, with inflammatory markers serving as potential indicators of disease severity and treatment response.
Aim and Objectives: To compare the levels of hs-CRP, ESR, WBC, LDH, and procalcitonin between patients with drug-sensitive pulmonary tuberculosis (DS-PTB) and rifampicin-resistant pulmonary tuberculosis (RR-PTB) before and after two months of anti-tubercular therapy (ATT) under NTEP guidelines.
Materials and Methods: A prospective comparative study was conducted from March 2025 to September 2025 including 70 DS-PTB and 68 RR-PTB patients. Baseline and 2-month inflammatory markers were analyzed. Statistical tests included Shapiro-Wilk, Mann-Whitney U, independent t-test, paired t-test/Wilcoxon signed-rank test, Chi-square/Fisher’s exact test, and Spearman correlation. A p-value <0.05 was considered statistically significant.
Results: Baseline inflammatory markers were higher in the RR-PTB group than in the DS-PTB group, although the differences were not statistically significant for most markers (p>0.05). After two months of ATT, both groups showed a decline in hs-CRP, ESR, LDH, and procalcitonin, though not statistically significant (p>0.05). No significant correlation was observed between post-therapy inflammatory markers and treatment type.
Conclusion: The study found no statistically significant improvement in inflammatory markers after two months of therapy in either arm, suggesting that short-term biochemical response may not directly correspond with microbiological improvement.
63. Use of an Artificial Intelligence–Based Language Model as a Clinical Support Tool in a Resource-Limited Indian Hospital: A Real-World Physician Experience
Ashutosh Kumar, Rubi Kumari
Abstract
Background: Language models based on artificial intelligence (AI) are increasingly being explored as clinical support tools to improve healthcare efficiency and aid decision-making. Their role in resource-constrained hospital environments remains insufficiently researched.
Objectives: To assess the feasibility, usability, and safety of an AI-based language model as a supervised clinical support tool in a resource-constrained Indian hospital.
Methods: This prospective observational study was carried out over two years (2023–2025) at Hariram Hospital, Motihari. Approximately 200 clinical encounters, both inpatient and outpatient, were evaluated. The AI tool was used under physician supervision to assist with documentation, support differential diagnosis, and provide patient education. Outcomes assessed were workflow improvement, time spent on documentation, physician satisfaction, and safety incidents.
Results: The AI tool reduced report-writing time and improved the clarity of clinical notes. No AI-related adverse clinical events occurred, and physician satisfaction was high.
Conclusion: Supervised use of an AI-based language model is feasible and beneficial in resource-limited hospitals, improving efficiency without compromising patient safety.
64. Regional Anaesthesia in Patients with Pregnancy-Induced Hypertension: Labour Analgesia Considerations
Archana Rathore, Shruthi C. Sheelavanth, Roshan Kumar
Abstract
Background: Pregnancy-induced hypertension (PIH) represents a significant obstetric complication affecting maternal and fetal outcomes. Regional anaesthesia has emerged as the preferred modality for labour analgesia in these patients, offering potential benefits in blood pressure modulation and stress response attenuation. However, careful consideration of haemodynamic implications and coagulation status is essential.
Methods: This prospective observational study was conducted at a tertiary care hospital over 18 months, enrolling 156 parturients with PIH requiring labour analgesia. Participants were categorized into epidural analgesia (Group E, n=82) and combined spinal-epidural analgesia (Group CSE, n=74). Haemodynamic parameters, analgesic efficacy, maternal complications, and neonatal outcomes were assessed.
Results: Mean arterial pressure demonstrated significant reduction following regional anaesthesia initiation in both groups (Group E: 112.4 ± 8.6 to 94.2 ± 7.3 mmHg; Group CSE: 114.1 ± 9.2 to 91.8 ± 6.9 mmHg; p<0.001). Pain scores decreased significantly from baseline (7.8 ± 1.2 to 2.1 ± 0.9; p<0.001). Hypotension occurred in 14.6% of Group E versus 23.0% of Group CSE patients (p=0.042). No cases of post-dural puncture headache or neurological complications were observed. Neonatal Apgar scores at 5 minutes were comparable between groups (8.7 ± 0.6 vs 8.5 ± 0.7; p=0.108).
Conclusion: Regional anaesthesia provides safe and effective labour analgesia in PIH patients with favourable haemodynamic profiles. Epidural analgesia demonstrated greater haemodynamic stability compared to combined spinal-epidural technique, suggesting its preference in moderate-to-severe PIH cases.
65. A Comparative Study to Evaluate the Efficacy and Safety of Two Different Doses of Oxytocin Boluses in Elective Caesarean Sections
Noorjit Sidhu, Mehak Dureja, Neha Yadav
Abstract
Introduction: Oxytocin is routinely used to prevent uterine atony in parturients. However, it must be administered cautiously, as it is associated with hemodynamic alterations and desensitisation of its receptors. In this study we therefore compared the changes in heart rate and mean arterial pressure that occur after administration of 1-unit and 3-unit IV oxytocin boluses. Additionally, we studied the adequacy of subsequent uterine contractions, the need for additional uterotonic agents, and any adverse events.
Material and Methods: Pregnant women posted for elective LSCS under spinal anaesthesia who met the inclusion criteria were recruited. They were randomly divided into two groups of 52 each and given a 1-unit or 3-unit oxytocin bolus respectively, followed by a maintenance oxytocin infusion of 10 units/hour (20 units of drug added to 500 mL of 0.9% NS) after delivery of the baby. Hemodynamic parameters (HR, SBP, DBP, and MAP) were recorded and compared with baseline. Uterine tone adequacy was checked at 2, 5, and 15 minutes. Use of additional uterotonic drugs and any adverse effects were noted.
Results: Better uterine tone was achieved in the group administered the 3-unit oxytocin bolus. Rescue uterotonics were required only in the 1-unit bolus group; none of the participants in the 3-unit group needed them. Hemodynamic variations in SBP, DBP, MAP and HR were observed in the 3-unit bolus group. There was no significant difference in the adverse effects of nausea and vomiting between the two groups.
Conclusion: A 3-unit oxytocin bolus followed by an infusion of 10 units per hour could provide satisfactory uterine tone with minimal hemodynamic changes and adverse effects.
66. Effectiveness of a Faculty Development Programme in Improving Faculty Understanding of AETCOM in the MBBS Curriculum: A Quasi-Experimental Pretest–Posttest Study
Abhay Kumar, Bharti Badlani, Vivekanand Waghmare
Abstract
Background: Attitude, Ethics and Communication (AETCOM) forms the affective-domain spine of competency-based undergraduate medical education in India, but its translation into the classroom depends heavily on faculty readiness. In many institutions, faculty report uncertainty about AETCOM competencies, teaching-learning methods and assessment practices, resulting in variable implementation.
Methods: A single-group quasi-experimental study was carried out at Chhindwara Institute of Medical Sciences, Chhindwara, Madhya Pradesh, India. Faculty involved in undergraduate teaching were enrolled in a 3-day structured Faculty Development Programme (FDP) on AETCOM. Understanding was evaluated using a 30-item AETCOM Faculty Understanding Questionnaire (AFUQ; score range 0–120) and a 25-item knowledge test (0–25) at baseline (T0), immediately after the FDP (T1) and at 12 weeks (T2). Secondary outcomes were self-efficacy for facilitation and assessment (10-item scale, 10–50) and implementation products (session plans, assessment blueprints). Paired comparisons and repeated-measures analyses were conducted, and effect sizes were reported.
Results: Of 54 eligible faculty, 48 participated (mean age 38.9±7.6 years; 54.2% female). Mean AFUQ scores increased at T1 and were largely maintained at T2 (58.1±10.9 at T0 vs. 94.7±9.8 at T1 and 88.3±10.7 at T2; p<0.001; ηp²=0.62). Knowledge scores increased from 11.3±3.4 to 19.8±2.7 (p<0.001; Cohen's d=2.79). Self-efficacy improved from 27.6±5.9 to 40.9±4.8 (p<0.001). Faculty developed 36 AETCOM micro-session plans and a common assessment blueprint emphasising workplace-based assessment and reflective writing, addressing previously reported overreliance on written examinations.
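The reported effect size for the knowledge gain can be approximated from the means and SDs above. A minimal sketch, using the pooled-SD form of Cohen's d (the exact reported value of 2.79 likely reflects a paired-samples formulation, which the abstract does not detail):

```python
import math

def cohens_d(m1: float, sd1: float, m2: float, sd2: float) -> float:
    """Pooled-SD Cohen's d: (m2 - m1) / sqrt((sd1^2 + sd2^2) / 2)."""
    pooled_sd = math.sqrt((sd1 ** 2 + sd2 ** 2) / 2)
    return (m2 - m1) / pooled_sd

# Pre/post knowledge scores reported in the abstract: 11.3±3.4 to 19.8±2.7
print(round(cohens_d(11.3, 3.4, 19.8, 2.7), 2))  # ~2.77, close to the reported 2.79
```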
Conclusion: A structured FDP produced large and sustained gains in faculty understanding and readiness to implement AETCOM. Institutionalising longitudinal mentoring and assessment support could consolidate these gains and reduce variability in implementation.
67. Our Experience with Total Ossicular Replacement Prosthesis for Ossiculoplasty in Mastoid Surgeries
Deepthi Bhimanapati, Praveen Surana, Somya Choubey, Anurag M. Srivastava, Digant Patni, Vishal R. Munjal
Abstract
This study aims to evaluate hearing improvement in patients undergoing ossiculoplasty with a total ossicular replacement prosthesis (TORP). The correlation between posterior canal wall status and hearing outcomes has also been analysed. Preoperative and postoperative audiometry results were analysed alongside patient demographics and complications. Of 30 patients, 27 showed significant hearing improvement, with an average gain of 25 dB. Hearing improvement was greater at lower frequencies than at higher frequencies. Although the presence of the posterior canal wall did not show any significant impact on hearing, a larger sample size may be needed to generalise this observation. The study highlights the significance of surgical technique, patient selection, and ossiculoplasty technique in improving auditory outcomes. The findings are particularly relevant in the Indian context, where resource limitations, anatomical variability and a high prevalence of complicated cases pose unique challenges.
68. Prolonged Sedentary Bouts Predict Adverse Glycemic Outcomes Over 12 Months in Adults with Type 2 Diabetes: A Smartphone-Based Prospective Cohort Study
Jyoti Verma, Jyoti Pankaj, Sumit, Shashank
Abstract
Background: Lifestyle behaviours such as physical activity, sedentary time and sleep play a central role in the management of Type 2 diabetes. Advances in smartphone technology now allow continuous, real-world monitoring of these behaviours at scale. While overall physical activity has a well-established relationship with glycemic control, less is known about how the patterning of sedentary behaviour, sleep regularity and routine post-meal walking influences long-term glycemic outcomes and cardiometabolic risk.
Objectives: To evaluate whether smartphone-derived sedentary bout length, daily step count, post-meal steps, and sleep regularity predict 12-month change in HbA1c and incident metabolic syndrome among overweight and obese adults.
Methods: We conducted a 12-month observational cohort study of 252 adults (18–60 years) with type 2 diabetes using smartphone-based activity and sleep tracking. Average sedentary bout length (minutes), daily steps, post-meal steps and sleep regularity score were derived from passively collected smartphone data. Clinical measurements included baseline and 12-month HbA1c and 12-month metabolic syndrome status. Sleep regularity was assessed using the Sleep Regularity Questionnaire (SRQ). For Aim 1, we fit a multivariable linear regression with 12-month HbA1c change as the outcome and sedentary bout length as the primary predictor, adjusted for age, sex, BMI, baseline HbA1c, and daily steps. For Aims 2 and 3, we used multivariable logistic regression models with incident metabolic syndrome at 12 months as the outcome and sleep regularity (Aim 2) or post-meal steps (Aim 3) as the primary predictor, adjusting for the same covariates.
Results: Longer sedentary bouts were independently associated with less favorable HbA1c changes. Each extra minute in average sedentary bout length was associated with about 0.002 percentage points higher HbA1c over 12 months (β = 0.0021; 95% CI approximately 0.001–0.003; p < 0.001) after adjustment, corresponding to a 0.06 percentage point increase in HbA1c over 12 months for a 30-minute difference in sedentary bout length. In contrast, sleep regularity score was not significantly associated with incident metabolic syndrome at 12 months in adjusted logistic regression (odds ratio near 1.0; 95% CI including 1.0). Similarly, post-meal steps showed no statistically significant protective association with incident metabolic syndrome after adjustment.
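The arithmetic behind the 30-minute projection can be sketched directly from the reported adjusted coefficient; this is an illustration of the linear-model interpretation, not a re-analysis of the study data:

```python
# Reported adjusted coefficient: ~0.0021 percentage points of HbA1c
# per extra minute of average sedentary bout length over 12 months.
BETA = 0.0021

def projected_hba1c_change(extra_minutes: float, beta: float = BETA) -> float:
    """Projected 12-month HbA1c change (percentage points) for a given
    difference in average sedentary bout length, under the linear model."""
    return beta * extra_minutes

# A 30-minute difference in average sedentary bout length:
print(round(projected_hba1c_change(30), 3))  # 0.063, i.e. ~0.06 pp as reported
```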
Conclusions: In this smartphone-based cohort of adults with diabetes, longer sedentary bouts, independent of total daily steps, predicted worse 12-month HbA1c trajectories. Sleep regularity and post-meal stepping were not clearly associated with incident metabolic syndrome, though limited power and measurement error may have influenced results. Targeting interruptions in sedentary periods could be a practical focus for digital health interventions aimed at improving glycemic control.
69. Comparison of Functional Outcomes of Scaphoid Fractures Treated by Open Reduction Internal Fixation versus Percutaneous Fixation: A Retrospective Study from a Tertiary Care Center in South India
Buddharaju Suraj Verma, C. Likhitha, Goli Ganesh
Abstract
Background: Scaphoid fractures are common carpal injuries with a high propensity for nonunion and long-term functional impairment. Surgical fixation using open reduction internal fixation (ORIF) or percutaneous fixation is widely employed, yet the optimal approach remains debated.
Objectives: To compare functional outcomes of scaphoid fractures treated with ORIF versus percutaneous fixation.
Methods: This retrospective comparative study was conducted at the NRI Institute of Medical Sciences, Sangivalasa, Visakhapatnam. Records of 30 patients treated surgically over two years were reviewed. Functional outcomes, union rates, and complications were analyzed.
Results: Both techniques achieved satisfactory union. Percutaneous fixation allowed earlier functional recovery, whereas ORIF was effective for displaced fractures. Final functional outcomes were comparable.
Conclusions: Both techniques are effective; the surgical approach should be individualized based on fracture characteristics.
70. Correlation of Haematological and Biochemical Parameters in Sickle Cell Disease in Tertiary Care Hospital
Harsh Pandya, Charmi Kotak, Vijay Parmar
Abstract
Introduction: Sickle cell anaemia is a hemoglobinopathy caused by a single point mutation in the beta-chain of human haemoglobin. Homozygous inheritance of this mutation produces haemoglobin SS, whereas the heterozygous form constitutes sickle cell trait. Individuals with the Hb-SS genotype suffer from sickle cell anaemia.
Methodology: A total of 120 patients were included in this observational cross-sectional study. Patients with sickle cell disease (homozygous Hb-SS) were assessed for their haematological profile, including complete blood count on a Sysmex 3-part analyser, and for biochemical parameters, including liver function tests, serum LDH and renal function tests, all measured on an EM-200 fully automated chemistry analyser.
Results: Of 120 cases, 85 patients (70.84%) had elevated total bilirubin, while in the remaining 35 patients (29.16%) total bilirubin was within normal limits; the mean total bilirubin was 1.8 mg%. Serum creatinine was within the normal range in 62 (51.66%) patients and outside it in 58 (48.44%); the mean serum creatinine was 1.18 mg%. Serum urea was within the normal range in 106 (88.4%) patients, with only 14 (11.6%) showing abnormal values; the mean serum urea was 24.48 mg%.
Conclusion: Parameters of liver function were elevated, attributable to the haemolysis that occurs in sickle cell disease; in the present study 67.5%, 78.3%, 87.5% and 70.84% of cases had elevated values for SGOT, SGPT, serum ALP and serum total bilirubin respectively. Renal function was less altered than liver function, with only 11.6% and 48.4% of cases showing elevated serum urea and serum creatinine respectively.
71. Teaching beyond Physiology: Assessing AETCOM Learning in First-Year MBBS Students
Nachane M., Rajput U., Gavali Y.
Abstract
Background: The Competency-Based Medical Education (CBME) curriculum introduced by the National Medical Commission emphasizes the AETCOM (Attitude, Ethics, and Communication) module to foster humanistic qualities in medical students. Integrating AETCOM into preclinical subjects like Physiology may enhance students’ ethical sensitivity, empathy, and communication skills, which are not effectively assessed through traditional examinations.
Objectives: To integrate selected AETCOM modules into the first-year MBBS Physiology curriculum and evaluate their impact on students' knowledge, communication skills, attitudes, and perceptions.
Methods: A cross-sectional mixed-methods study was conducted among 100 first-year MBBS students at a tertiary medical college (June–December 2024). Two AETCOM modules—"What does it mean to be a patient?" and "The Doctor–Patient Relationship"—were integrated with relevant Physiology topics. Sessions employed small-group discussions, role plays, simulated encounters, and reflective journaling. Assessment included pre-/post-tests, OSCEs with a 3-point rubric, reflective writing evaluated by a 5-criteria rubric, and structured feedback on a 5-point Likert scale. Quantitative data were analyzed descriptively; qualitative reflections underwent thematic analysis.
Results: Mean knowledge scores improved significantly from 61.3 ± 9.2 to 79.6 ± 7.8 (p < 0.001). In OSCEs, 68% of students performed satisfactorily and 22% exemplarily in communication and empathy skills. Reflective writings showed 54% analytical and 28% critical reflections, highlighting themes of enhanced empathy, ethical sensitivity, and professional identity formation. Student feedback indicated high satisfaction: 94% reported increased empathy, 91% improved communication skills, and 96% recommended broader AETCOM integration.
Conclusion: Embedding AETCOM modules within Physiology effectively enhanced cognitive learning while strengthening empathy, ethical understanding, and communication skills. This integrative model is feasible, well-received, and supports early professional identity development in line with CBME goals. Broader implementation across preclinical subjects is recommended.
72. Establishing the Physiological Range of Various Thyroid Hormones in Ethnic Healthy Adult Kashmiri Population of Himalayan Region – A Hospital Based Study
Suhail A. Gilkar, Maria Bashir
Abstract
Introduction: Thyroid hormones are secreted by the thyroid gland, an important endocrine organ. These hormones are important in regulating metabolism, growth, weight, and thermogenesis. Kashmir is part of the Himalayan belt of India, where the soil is deficient in iodine, and the prevalence of thyroid disorders in the Kashmir Valley is rising at an alarming rate. The study was carried out to determine the normal physiological range of thyroid hormones (TSH, T3, and T4) in healthy, euthyroid subjects of ethnic Kashmiri origin, which would aid in the screening, diagnosis, and treatment of various thyroid disorders.
Methods: In this study, 400 healthy volunteers falling in the age group 20-60 years were enrolled and their serum TSH, T3, and T4 levels were assessed after taking proper history and general physical and systemic examination.
Results: We found that the TSH levels ranged from 0.27-4.20μIU/ml, T3 levels were 0.50-1.80 ng/ml, and T4 levels were 5.30-13.60 μg/dl in the study subjects. There was no significant difference in thyroid hormone levels with respect to the age and sex of the study participants.
Conclusion: The results show that the range of serum TSH, T3, and T4 levels in the Kashmiri population is in close agreement with ranges established in other populations. Studies with larger sample sizes are needed to better define the reference range in this population. We also recommend that free T3 and free T4 levels be assessed.
73. Educational Impact of Mini-Clinical Evaluation Exercise (Mini-CEX) on Psychiatry Residents: A Prospective Observational Study
Ritu Meena, Leena Saini, Dharmdeep Singh, Harjot Singh Ranu, Anurag Sharma, Ketaki Poorey, Parveen Singh
Abstract
Introduction: Workplace-based assessment has gained importance in postgraduate medical education as it allows evaluation of clinical performance in real-life settings. The Mini-Clinical Evaluation Exercise (Mini-CEX) is a structured assessment tool that involves direct observation of clinical encounters and provision of immediate feedback, which may enhance learning and professional development. Its educational impact in psychiatry residency training warrants evaluation.
Aim: To assess the educational impact of introducing the Mini-Clinical Evaluation Exercise (Mini-CEX) on psychiatry residents.
Materials and Methods: This prospective observational educational study was conducted in the Department of Psychiatry at a tertiary care teaching hospital in Jaipur, India, after Institutional Ethics Committee approval. Ten first- and second-year psychiatry residents were included after obtaining informed consent. Faculty members were sensitized and trained in the use of Mini-CEX. Each resident underwent five Mini-CEX encounters involving patients of moderate clinical complexity. Residents were assessed across seven domains using a 9-point rating scale. Perceptions of residents and evaluators were collected using validated questionnaires. Data were analysed using descriptive statistics with SPSS version 24.
Results: Among the residents, 70% rated the Mini-CEX experience as satisfactory and 30% as just satisfactory. Progressive improvement was observed across successive encounters. Resident satisfaction scores increased from 5.5 in the first encounter to 7.2 by the fifth encounter, while evaluator satisfaction scores increased from 6.0 to 8.1, which was statistically significant (p<0.05). Residents reported improvement in communication skills, history taking, and clinical examination, along with appreciation for immediate feedback.
Conclusion: Mini-CEX is a feasible, acceptable, and effective formative assessment tool for psychiatry residents. Its implementation in routine clinical settings has a positive educational impact and contributes to improvement in multiple clinical competency domains.
74. Correlation between Pretreatment Serum Prostate-Specific Antigen (PSA) Level and Gleason Grade Group in Prostatic Adenocarcinoma
Junu Devi, Mithun Chandra Das
Abstract
Introduction: Adenocarcinoma of the prostate may be clinically suspected based on an elevated serum PSA (prostate-specific antigen) and/or an abnormal digital rectal examination. Patients with high serum PSA are at increased risk of advanced prostatic carcinoma, and screening at an earlier stage would help guide management accordingly. The aim of this study was to determine the association between pretreatment serum PSA levels and Gleason grade group in prostatic adenocarcinoma patients.
Material and Methods: A total of 50 prostatic carcinoma cases were studied in this hospital-based cross-sectional study carried out in the Department of Pathology, Gauhati Medical College and Hospital. Transrectal ultrasound (TRUS)-guided biopsy specimens sent from the Urology and Renal Transplant Department were processed for H&E staining and studied histomorphologically. Adenocarcinoma cases were assigned a Gleason score and Gleason grade group. Pretreatment serum PSA level was correlated with Gleason grade group using the chi-square test; a p value of <0.05 was considered statistically significant.
Results: All 50 cases were adenocarcinoma. The mean PSA level was 77.38±49.14 ng/ml. Grade group 1 was seen in 5 (10%) cases, grade group 2 in 11 (22%), grade group 3 in 22 (44%), grade group 4 in 5 (10%) and grade group 5 in 7 (14%). A statistically significant correlation was found between pretreatment serum PSA level and Gleason grade group (p = 0.028).
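The chi-square test of independence used here can be sketched in a few lines; the contingency counts below are hypothetical, since the abstract does not report the full PSA-by-grade-group table:

```python
def chi_square_statistic(table):
    """Pearson chi-square statistic and degrees of freedom for a
    rows x cols contingency table of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (obs - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

# HYPOTHETICAL 2x2 table: rows = PSA <=20 / >20 ng/ml,
# columns = grade group 1-2 / 3-5 (for illustration only)
observed = [[8, 4],
            [8, 30]]
stat, df = chi_square_statistic(observed)
print(round(stat, 2), df)  # compare the statistic against the critical value for df
```

The p value would then be obtained from the chi-square distribution with the computed degrees of freedom (e.g. via `scipy.stats.chi2_contingency` in practice).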
Conclusion: Carcinoma of prostate associated with an elevated serum PSA level is more likely to be of higher grade than carcinoma with normal PSA levels.
75. The Viral Contribution to Lymphoma: Pathogenesis, Diagnosis, and Therapeutic Opportunities
Debadatta Bhanjadeo, Liza Das, Sasmita Sahu
Abstract
Introduction: Several human viruses contribute directly or indirectly to lymphomagenesis by driving chronic antigenic stimulation, immune evasion, oncogenic signaling, and/or genomic instability. Epstein–Barr virus (EBV), Kaposi sarcoma herpesvirus (KSHV/HHV-8), human T-cell leukemia virus type 1 (HTLV-1), hepatitis C virus (HCV), and HIV (largely via immune dysregulation and cooperation with EBV/KSHV) represent the most clinically relevant viral associations in modern lymphoma classification.
Materials and Methods: A structured narrative review was performed using recent WHO/ICC classifications and post-2015 peer-reviewed literature addressing viral mechanisms, diagnostic approaches (histopathology, EBER-ISH, IHC, PCR/viral load, serology), and therapeutics including antivirals, immunotherapy, and adoptive cellular therapy.
Results: Viral-associated lymphomas show recognizable clinicopathologic patterns: EBV-associated B/T/NK lymphomas and PTLD; KSHV-driven primary effusion lymphoma and HHV8-associated large B-cell lymphomas; HTLV-1–driven adult T-cell leukemia/lymphoma; HCV-associated indolent B-cell lymphomas with potential regression after direct-acting antiviral therapy; and HIV-associated lymphomas influenced by immune suppression and high EBV/KSHV burden. Key therapeutic opportunities include etiologic viral suppression (HCV DAAs), immune reconstitution, targeted antibodies, checkpoint blockade in selected EBV-associated entities, and EBV-specific T-cell therapy (e.g., tabelecleucel) for EBV+ PTLD.
Conclusion: Integrating viral testing into lymphoma diagnostics improves classification, prognostication, and enables mechanism-based therapy—particularly antiviral cure for HCV-related lymphomas and cellular immunotherapy for EBV+ PTLD.
76. A Prospective Study on Effect of Vitamin D Levels on Forced Vital Capacity in COPD Patients in a Tertiary Care Hospital
Billa Vikas, Bottu Kalyani, A. Raviumar, Theerdhala Keshmaswaraj
Abstract
Background: Chronic Obstructive Pulmonary Disease (COPD) is frequently associated with vitamin D deficiency, which may worsen lung function and exacerbate disease severity. Forced Vital Capacity (FVC) is a major indicator of pulmonary performance.
Objective: To evaluate the relationship between serum vitamin D levels and FVC among COPD patients in a tertiary care hospital.
Methods: A prospective study was conducted over 12 months involving 50 stable COPD patients. Serum 25-hydroxyvitamin D [25(OH) D] levels were measured and categorized into deficient (<20 ng/mL), insufficient (20–30 ng/mL), and sufficient (>30 ng/mL). FVC (% predicted) was assessed using spirometry.
Results: Of 50 patients, 32 (64%) had vitamin D deficiency, 12 (24%) had insufficiency, and 6 (12%) had sufficient levels. Mean FVC was significantly lower in vitamin D–deficient patients (56.8 ± 9.4%) compared to insufficient (63.2 ± 8.1%) and sufficient groups (70.1 ± 7.3%) (P < 0.01). A positive correlation was observed between serum vitamin D and FVC (r = 0.52, p = 0.001).
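The Pearson correlation coefficient reported above (r = 0.52) can be computed as follows; the paired values in this sketch are hypothetical examples, not the study's patient data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient for paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

vit_d = [12, 18, 22, 27, 31, 35]   # serum 25(OH)D, ng/mL (hypothetical)
fvc   = [55, 58, 62, 64, 69, 72]   # FVC % predicted (hypothetical)
print(round(pearson_r(vit_d, fvc), 3))
```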
Conclusion: Lower vitamin D levels are significantly associated with reduced FVC in COPD patients. Screening and correction of vitamin D deficiency may improve lung function outcomes.
77. Ocular Manifestations in Patients with Tuberculosis: A Cross-Sectional Study from a Tertiary Care Centre in South Gujarat
Omkar S. Soni, Mital Patel, Manali Shah, Krishan Kumar, Darshan Jashvantbhai Thacker
Abstract
Background: Tuberculosis remains a major public health problem in developing countries, with pulmonary being most common and extrapulmonary involvement contributing significantly to morbidity. Ocular tuberculosis, though often underdiagnosed, can lead to visual impairment if not detected and treated early. Understanding the pattern and determinants of ocular involvement is essential for timely intervention.
Objectives: To evaluate the prevalence and pattern of ocular manifestations in patients with tuberculosis, assess associated visual impairment, and determine the relationship between disease duration and ocular involvement.
Materials and Methods: This cross-sectional study was conducted at a tertiary care centre in South Gujarat from April 2023 to October 2024. A total of 167 patients with tuberculosis were included. All patients underwent a detailed ophthalmic examination, including visual acuity assessment, slit-lamp biomicroscopy, and fundus evaluation. Data were analyzed using SPSS version 20. Associations between categorical variables were assessed using the chi-square test, with a p value <0.05 considered statistically significant.
Results: Ocular manifestations were observed in 33 patients (19.8%). Posterior segment involvement was more common than anterior segment involvement, with choroiditis (36.4%) being the most frequent manifestation, followed by retinitis (21.2%) and retinal vasculitis (15.2%). Visual impairment was present in 42.4% of patients with ocular tuberculosis. No significant association was found between gender and ocular involvement or visual impairment. A statistically significant association was noted between longer duration of tuberculosis and the presence of ocular manifestations (p = 0.042).
Conclusion: Ocular involvement was present in a substantial proportion of tuberculosis patients, particularly in those with prolonged disease duration. Posterior segment manifestations were the main contributors to visual impairment. Routine ophthalmic screening in tuberculosis patients, especially those with long-standing disease, may facilitate early diagnosis and help prevent irreversible visual loss.
78. Clinical and Microbiological Profile of Catheter Associated Bacteriuria at a Tertiary Care Hospital
Tanvi Prajapati, Bimal Chauhan, Tanmay Mehta, Nimish Rathod, Shivani Mehta, Jayshri Pethani
Abstract
Introduction: Catheter-associated bacteriuria (CA-bacteriuria) is the presence of significant bacterial growth in the urine of patients who are currently or recently catheterized, with or without urinary symptoms. When symptoms are present, the condition is termed a catheter-associated urinary tract infection (CAUTI). CAUTI is the most prevalent healthcare-associated infection worldwide, primarily linked to the widespread and often inappropriate use of urinary catheters in hospitals and long-term healthcare facilities.
Aim and Objectives: 1) To identify urinary pathogens and assess their antimicrobial resistance patterns in catheterized patients. 2) To determine the rate of CA-bacteriuria. 3) To explore the relationship between CA-bacteriuria and high-risk clinical conditions.
Methodology: The study analyzed 682 urine samples from patients catheterized for over 48 hours (June 2022–May 2023), processing them per standard protocols, with pathogen identification and antibiotic susceptibility testing performed using the VITEK 2 Compact system.
Results: Catheter-associated bacteriuria was detected in 132 patients (19.35%), with gram-negative bacilli accounting for 61.36% of isolates, predominantly Escherichia coli (21.21%). The highest prevalence was observed among male patients and individuals aged 50 years or older, with diabetes emerging as the most frequent associated risk factor. In most cases, catheterization duration was under 10 days. E. coli isolates showed complete susceptibility to colistin, while Candida species demonstrated 100% sensitivity to micafungin.
Conclusion: Catheter-associated bacteriuria is a significant healthcare issue. Key factors include age, diabetes, catheter duration, and antimicrobial resistance. Colistin is effective against gram-negative bacteria, while vancomycin, daptomycin, teicoplanin, and linezolid are effective against gram-positive bacteria.
79. Seroprevalence of Hepatitis A Virus [HAV] and Hepatitis E Virus [HEV] in Patients Presenting with Acute Viral Hepatitis at a Tertiary Care Hospital of Southern Rajasthan
Neelam Chauhan, Geetavani B., Md. Tofiq, Arun More, Nikki Paliwal, Anchal Pamecha
Abstract
Background: Hepatitis A virus (HAV) and Hepatitis E virus (HEV) cause acute viral hepatitis in humans and are transmitted mainly through the fecal-oral route.
Objective: This study was done to determine the seroprevalence of Hepatitis A virus (HAV) and Hepatitis E virus (HEV) among the patients presenting with symptoms of acute viral hepatitis (AVH) in southern Rajasthan.
Setting and Design: It was a prospective observational study conducted from July 2024 to June 2025 involving patients presenting with acute viral Hepatitis visiting our tertiary care hospital.
Material and Methods: The study population included 333 patients (outdoor and hospitalized) having clinical features of AVH. Serum samples collected from those patients were tested in duplicate for anti‑HAV IgM and anti‑HEV IgM antibodies using commercially available enzyme‑linked immunosorbent assay (ELISA) kits.
Result: A total of 333 samples were included in this study, of which 133 were positive for HAV or HEV: IgM anti-HAV antibodies were detected in 132 (39.6%) serum samples, and 1 (0.30%) serum sample was positive for IgM anti-HEV antibodies. The HAV positivity rate was significantly higher among males (27.6%) than females (12.31%). HAV infection occurred across all age groups but was most prevalent between 1 and 15 years of age, which accounted for the highest number of cases, 77 (23.12%). The maximum number of cases was observed in September and October.
Conclusion: Health and civic authorities should make efforts to increase awareness among the general public and to prepare for outbreaks or epidemics, thereby reducing morbidity, mortality and economic burden. The data collected through this study can be used in planning vaccination strategies and underscore the need for better sanitation programmes and hygiene measures in our geographical region.
80. Comparative Evaluation of Serotonergic and Cholinergic Modulation of Intestinal Motility in Isolated Rabbit Ileum: An Experimental Study
Shuchi Chaudhary, Vinay Singh, Chhavi Bajpai
Abstract
Gastrointestinal motility is regulated by complex neurohormonal mechanisms involving multiple neurotransmitter systems, with serotonin and acetylcholine playing pivotal roles in modulating intestinal smooth muscle contractility. This experimental study aimed to evaluate and compare the effects of serotonergic and cholinergic agents on intestinal motility using isolated rabbit ileum preparations. The study was conducted using Dale’s organ bath assembly with ileal segments from six healthy rabbits. Contractile responses including amplitude and frequency were recorded following exposure to acetylcholine chloride (10⁻⁶M), serotonin hydrochloride (10⁻⁶M), and fluoxetine (1.4×10⁻⁵M). Basal amplitude and frequency were 5.50±2.25 mm and 8.16±2.85 contractions/min respectively. Acetylcholine significantly increased amplitude (8.00±2.75 mm, p<0.01) and frequency (13.66±4.32/min, p<0.01) compared to baseline. Serotonin produced comparable excitatory effects with amplitude of 7.41±3.20 mm (p<0.05) and frequency of 17.16±5.19/min (p<0.01). Conversely, fluoxetine demonstrated inhibitory effects, reducing amplitude to 2.58±2.01 mm (p<0.01) and frequency to 4.33±2.94/min (p<0.01). The percentage increase in frequency was highest with serotonin (110.3%) compared to acetylcholine (67.4%), while fluoxetine produced 46.9% reduction from baseline. These findings demonstrate that while direct serotonergic stimulation enhances gut motility comparable to cholinergic activation, selective serotonin reuptake inhibition paradoxically decreases intestinal contractility, potentially through receptor desensitization mechanisms. The study provides experimental evidence supporting the differential roles of serotonergic pathways in gastrointestinal physiology and has implications for understanding drug-induced gastrointestinal effects.
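The percentage changes in contraction frequency quoted above follow directly from the reported means (baseline 8.16, acetylcholine 13.66, serotonin 17.16, fluoxetine 4.33 contractions/min); a minimal sketch reproducing that calculation:

```python
def percent_change(value: float, baseline: float) -> float:
    """Percentage change from baseline (positive = increase)."""
    return (value - baseline) / baseline * 100

BASELINE = 8.16  # reported basal frequency, contractions/min
print(round(percent_change(17.16, BASELINE), 1))  # serotonin: 110.3
print(round(percent_change(13.66, BASELINE), 1))  # acetylcholine: 67.4
print(round(percent_change(4.33, BASELINE), 1))   # fluoxetine: -46.9
```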
81. Functional Outcomes and Mortality Predictors in Geriatric Hip Fracture Patients with Multiple Comorbidities: A One-Year Study at a Tertiary Care Centre in Western Gujarat
Nitin Rajnikant Popat, Hasmukh Khodidas Panchal, Bhatt Hrishita Dhavalkumar
Abstract
Background: Hip fractures in the elderly carry high morbidity and mortality, and the risks rise sharply when several comorbidities are present. In India, late presentation, delayed surgery and a heavy burden of chronic disease often lead to poorer results than in Western countries. This study evaluated functional recovery and identified factors predicting mortality in such high-risk geriatric patients treated at a tertiary hospital in western Gujarat.
Material and Methods: A prospective observational study was conducted over one year in the Departments of Orthopaedics and Medicine at a tertiary care centre in Gujarat. Patients aged ≥65 years presenting with low-energy hip fractures and having at least two major comorbidities were included. Polytrauma, pathological fractures and patients refusing surgery were excluded. Data on demographics, comorbidities, Charlson Comorbidity Index (CCI), time to surgery, laboratory parameters and type of surgery were recorded. Follow-up was done at 3, 6 and 12 months for mortality and for functional assessment using the Modified Harris Hip Score (MHHS) and Parker Mobility Score (PMS) in survivors.
Results: Of 138 patients enrolled (mean age 78.6 ± 8.2 years, 56% female), 61% had intertrochanteric fractures. Mean CCI was 5.4 ± 1.8 and mean delay to surgery 4.9 ± 3.1 days. In-hospital mortality was 9.4%, 30-day mortality 13.8%, and one-year mortality 27.5%. On multivariate analysis, independent predictors of one-year mortality were age >80 years (p=0.002), male gender (p=0.009), CCI ≥6 (p<0.001), delay to surgery >72 hours (p=0.004), serum albumin <3.0 g/dL (OR 3.88, p=0.007) and preoperative haemoglobin <10 g/dL (p=0.012). Among survivors at 12 months (n=100), mean MHHS improved from 42.6 ± 12.4 at 3 months to 68.3 ± 16.8 at 12 months; only 38% regained pre-fracture mobility (PMS ≥7).
Conclusion: One-year mortality in geriatric hip fracture patients with multiple comorbidities remains high (27.5%) in western India and is largely driven by advanced age, higher comorbidity burden, male sex, delayed surgery and malnutrition. Functional recovery is modest, with less than two-fifths returning to pre-injury mobility levels. Aggressive perioperative optimisation and reduction in surgical delay may improve survival and function.
82. Hormonal Dysregulation and Its Metabolic Correlates in Polycystic Ovary Syndrome: A Case-Control Study of LH, FSH and Insulin Resistance
Renu, Mahadev, P. Satyanarayana
Abstract
Polycystic ovary syndrome (PCOS) is a complex endocrine-metabolic disorder characterized by chronic anovulation, hyperandrogenism, and insulin resistance, affecting women of reproductive age worldwide. This study aimed to evaluate key hormonal and metabolic alterations in women with PCOS by comparing luteinizing hormone (LH), follicle-stimulating hormone (FSH), and insulin resistance (HOMA-IR) between affected women and healthy controls. A total of 400 women aged 18–45 years were enrolled in a hospital-based case-control design, including 200 clinically diagnosed PCOS cases and 200 age-matched controls. Fasting blood samples were collected and analyzed for LH, FSH, fasting glucose, and fasting insulin using standardized chemiluminescent immunoassays, and insulin resistance was calculated using the HOMA-IR formula. The results demonstrated significantly elevated LH levels in PCOS subjects compared with controls, indicating altered hypothalamic-pituitary regulation. FSH levels also differed significantly between the groups, reflecting possible population-specific endocrine variations. HOMA-IR values were markedly higher in the PCOS group, confirming substantial insulin resistance as a key metabolic feature of the disorder. Correlation analysis showed a positive association between LH and HOMA-IR and a negative association between FSH and HOMA-IR, highlighting the strong interaction between hormonal imbalance and metabolic dysfunction. These findings reinforce PCOS as a multifaceted condition driven by interconnected endocrine and metabolic mechanisms. The study emphasizes that early recognition of hormonal abnormalities and insulin resistance is essential for improving reproductive outcomes, preventing long-term metabolic complications, and guiding comprehensive clinical management in women with PCOS.
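The HOMA-IR index referenced above is, in its standard form, a simple arithmetic derivation from the fasting values: HOMA-IR = (fasting glucose in mg/dL × fasting insulin in µU/mL) / 405, equivalent to (glucose in mmol/L × insulin) / 22.5. A minimal sketch of the calculation (the example values are illustrative, not study data):

```python
def homa_ir(glucose_mg_dl: float, insulin_uU_ml: float) -> float:
    """HOMA-IR from fasting glucose (mg/dL) and fasting insulin (uU/mL).

    Equivalent to (glucose in mmol/L * insulin) / 22.5, since
    1 mmol/L of glucose = 18 mg/dL and 18 * 22.5 = 405.
    """
    return (glucose_mg_dl * insulin_uU_ml) / 405.0

# Illustrative example: fasting glucose 90 mg/dL, fasting insulin 12 uU/mL.
print(round(homa_ir(90, 12), 2))  # 90 * 12 / 405 = 2.67
```

Values above roughly 2.5 are often taken to indicate insulin resistance, although cut-offs vary by population and assay.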
83. Diagnostic Discordance between Cefoxitin Screening by Vitek 2 Compact and Oxacillin Disc Diffusion in Staphylococcus aureus: A Cross-Sectional Study from Central India
Rituja Prakash, Sonu Maity, Kamlesh Kumar B. Patel, Jay Patel
Abstract
Background: Automated antimicrobial susceptibility testing (AST) systems such as Vitek 2 are increasingly used in diagnostic microbiology. Cefoxitin screening via Vitek 2 serves as a surrogate for mecA-mediated methicillin resistance. However, discrepancies with conventional oxacillin susceptibility testing remain a concern.
Objectives: To assess the diagnostic concordance between Vitek 2 cefoxitin screening and oxacillin disc diffusion in Staphylococcus aureus isolates.
Materials and Methods: This cross-sectional study analyzed 604 consecutive S. aureus isolates from diverse clinical samples tested at a tertiary care hospital in Central India (January–December 2024). Cefoxitin screening was performed using the Vitek 2 Compact (AST-GP 628 cards). Oxacillin susceptibility was determined by Kirby–Bauer disc diffusion (1 µg) on Mueller–Hinton agar with 2% NaCl as per CLSI M100 (2024). Diagnostic indices were calculated using the oxacillin results as the comparator.
Results: Among 604 isolates, 471 (78.0%) were cefoxitin-positive and 133 (22.0%) cefoxitin-negative by Vitek 2. Oxacillin disc diffusion identified 386 (63.9%) resistant and 218 (36.1%) sensitive isolates. Vitek 2 cefoxitin screening demonstrated sensitivity 99.5%, specificity 60.1%, positive predictive value 81.5%, negative predictive value 98.5%, and overall accuracy 85.2%. Eighty-seven (18.5%) isolates were cefoxitin-positive but oxacillin-sensitive, while two (0.3%) were cefoxitin-negative yet oxacillin-resistant.
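With oxacillin disc diffusion as the comparator, the counts above imply a 2×2 table of 384 true positives, 87 false positives, 2 false negatives and 131 true negatives, from which the reported indices can be reproduced. A minimal sketch:

```python
def diagnostic_indices(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, PPV, NPV and accuracy from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Counts implied by the abstract: 471 cefoxitin-positive (384 oxacillin-
# resistant + 87 sensitive), 133 cefoxitin-negative (2 resistant + 131 sensitive).
indices = diagnostic_indices(tp=384, fp=87, fn=2, tn=131)
for name, value in indices.items():
    print(f"{name}: {100 * value:.1f}%")
```

This reproduces the reported sensitivity (99.5%), specificity (60.1%), PPV (81.5%) and NPV (98.5%); accuracy, 515/604, rounds to 85.3%, a hair above the 85.2% quoted.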
Conclusion: Vitek 2 cefoxitin screening is highly sensitive but only moderately specific for MRSA detection. Discordant results underline the need for confirmatory mecA/PBP2a testing to prevent overestimation of MRSA prevalence.
84. Incidence of Hyperbilirubinemia in Asphyxiated Newborn in a Tertiary Care Hospital
Deepak Kumar Behera, Simanta Das, Pragnyojit Nayak, Sumanta Panigrahi
Abstract
Introduction: Our study is a single-centre study; multicentre studies with larger numbers of cases from different geographic locations would provide additional insight. More case-control studies with large sample sizes are needed to determine the relationship between serum bilirubin severity and birth asphyxia, particularly in term neonates. In cases of prolonged high serum bilirubin levels, additional newborn screening should be performed for conditions such as hypothyroidism, G-6-PD deficiency, hemolytic anemia and galactosemia so that they can be addressed early: hypothyroidism is a treatable cause whose early treatment prevents developmental delay and hearing loss, and G-6-PD screening can help prevent drug-induced hematuria and hemoglobinuria.
Material and Methods: This prospective study was conducted in the newborn unit of a tertiary care teaching hospital in Odisha from August 2022 to July 2024 (2 years). All neonates satisfying the inclusion criteria admitted to the SCBMCH and SVPPGIP newborn units were included in the study. Informed consent was taken from the parents/guardians of all enrolled patients for obtaining relevant history, clinical examination and necessary investigations. The mother's antenatal history was collected in detail to study the relevant risk factors. Cases were diagnosed clinically using previously established criteria for hypoxic-ischemic encephalopathy (HIE), and general and systemic examinations were done in detail to establish the clinical diagnosis.
Result: This institution-based prospective observational study, undertaken from August 2022 to July 2024 at SVPPGIP and SCBMCH, Cuttack, included 80 newborns with birth asphyxia and 50 non-asphyxiated neonates satisfying the inclusion criteria.
Conclusion: Birth asphyxia continues to be one of the major causes of perinatal mortality in developing nations, where obstetric care and neonatal resuscitation are still evolving. Birth asphyxia is a substantial cause of unconjugated hyperbilirubinemia in newborns. Assay of total and indirect serum bilirubin can be used as an easy early diagnostic marker to differentiate babies with birth asphyxia from those without.
85. Prolonged Mobile Phone Usage and Hearing Status in Medical Students: A Cross-Sectional Study
Antarikhya Borah, Shiv Shanker Kaushik, Richa Gupta, Kashmira Kumawat
Abstract
Background: With the rapid proliferation of mobile phones, their impact on health, particularly auditory function, has garnered increasing attention. Prolonged exposure to electromagnetic radiation, acoustic overload, and thermal effects from mobile phone use may pose risks to hearing, especially among young adults who are frequent users. Previous studies have reported mixed findings on the association between mobile phone usage and hearing loss, with some highlighting high-frequency hearing impairments in the dominant ear. This study investigates the impact of prolonged mobile phone usage on hearing thresholds in medical students, emphasizing differences in air and bone conduction across frequencies.
Objective: To evaluate the effect of prolonged mobile phone usage on hearing status and to determine the correlation between the duration of mobile phone use and auditory function.
Methodology: A total of 146 medical students were included in this prospective observational study, conducted in the ENT Department, PMCH, Udaipur, over a period of one year (March 2024 to February 2025), to explore the relationship between mobile phone usage and hearing status in medical students.
Result: The study revealed that prolonged mobile phone usage (>60 minutes daily) was significantly associated with mild hearing loss in the right ear, commonly used for calls. Among prolonged users, 40% exhibited mild hearing loss in the right ear compared to only 1% among short-duration users (<60 minutes daily) (p < 0.0001). Air conduction thresholds showed significant elevation at 4000 Hz, with the left ear exhibiting higher thresholds than the right (p < 0.0001). Bone conduction thresholds also showed significant differences at specific frequencies, particularly 250 Hz (p = 0.038) and 4000 Hz (p = 0.004). The findings suggest that prolonged mobile phone usage contributes to high-frequency hearing impairment, particularly in the dominant ear.
Conclusions: Prolonged mobile phone usage is linked to mild high-frequency hearing loss, especially in the dominant ear commonly used for calls. This study emphasizes the potential auditory health risks associated with extended mobile phone use and underscores the importance of preventive strategies, such as limiting call duration and promoting hands-free devices, to safeguard hearing among frequent users.
86. Liver Dysfunction in Type 2 Diabetes: Association with Microalbuminuria
Nirav Purohit, Prema Ram Choudhury
Abstract
Background: Liver dysfunction is increasingly recognized as an important complication of type 2 diabetes mellitus and may coexist with early renal microvascular damage.
Objectives: To evaluate liver dysfunction in patients with type 2 diabetes mellitus and to assess its correlation with microalbuminuria.
Methods: A cross-sectional study was conducted among 120 patients with type 2 diabetes mellitus. Liver function tests, ultrasonography, glycated hemoglobin, body mass index, and urinary albumin excretion were assessed and correlated.
Results: Liver dysfunction was significantly associated with longer duration of diabetes, poor glycemic control, higher body mass index, abnormal ultrasonography findings, and presence of microalbuminuria.
Conclusion: Liver dysfunction is common in type 2 diabetes mellitus and shows a significant correlation with microalbuminuria, highlighting the need for integrated hepatic and renal assessment in diabetic care.
87. Urbanization, Biodiversity Loss, and Allergy Epidemics: A One-Year Observational Study from an Urban Allergy Clinic in Eastern India
Gautam Modi
Abstract
Background: Rapid urbanization and biodiversity loss are increasingly recognized as major contributors to the global epidemic of allergic diseases. Reduced environmental microbial exposure, increased air pollution, and altered lifestyle patterns associated with urban living are believed to dysregulate immune tolerance and promote allergic sensitization.
Objectives: To evaluate the clinical pattern of allergic diseases in an urban population and assess the association between urban exposure indicators and allergy burden.
Methods: A one-year observational study was conducted among 100 patients attending an urban allergy clinic. Demographic characteristics, clinical diagnosis, and urban exposure indicators were recorded and analyzed.
Results: Allergic rhinitis was the most prevalent condition (42%), followed by asthma (25%). Higher allergy burden was observed among patients with high urban exposure. Age-wise distribution, allergy pattern, and urban exposure associations are presented in tables and figures.
Conclusion: Urbanization and biodiversity loss appear to significantly influence the rising burden of allergic diseases. Integrating biodiversity conservation and urban green planning may be essential strategies to reduce allergy epidemics.
88. Thyroid Dysfunction Among CKD Patients — A Retrospective Analysis of Prevalence and Hormonal Correlations
Srushti S. Koti, Rajeev Agarwal
Abstract
Background: Chronic kidney disease (CKD) is frequently associated with thyroid dysfunction, yet the prevalence and hormonal patterns remain inadequately characterized in the Indian population. This study aimed to determine the prevalence of thyroid dysfunction among CKD patients and evaluate correlations between thyroid hormone levels and renal function parameters.
Methods: A retrospective cross-sectional study was conducted over 24 months (January 2023 to December 2024) at a tertiary care hospital. Medical records of 168 CKD patients (stages 3-5) were analyzed. Demographic data, renal function tests (serum creatinine, blood urea, eGFR), and thyroid profiles (TSH, FT3, FT4) were extracted. Thyroid dysfunction was classified based on standard reference ranges. Statistical analysis included Pearson correlation, independent t-tests, ANOVA, and multiple regression analysis with significance set at p<0.05.
Results: Of 168 patients (mean age 54.6±12.3 years, 61.9% male), 72.6% had thyroid dysfunction. Low T3 syndrome was most prevalent (45.8%), followed by subclinical hypothyroidism (18.5%) and overt hypothyroidism (8.3%). FT3 showed strong positive correlation with eGFR (r=0.642, p<0.001) and significant negative correlations with serum creatinine (r=-0.598, p<0.001) and blood urea (r=-0.567, p<0.001). TSH levels showed weak negative correlation with eGFR (r=-0.243, p=0.002). Hemodialysis patients had significantly higher prevalence of thyroid dysfunction compared to conservatively managed patients (84.2% vs 63.5%, p=0.002). Thyroid dysfunction prevalence increased progressively from CKD stage 3 (58.3%) to stage 5 (86.4%).
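The Pearson coefficients quoted above measure the strength of linear association between each thyroid hormone and the renal parameters. A minimal pure-Python sketch of the calculation (the FT3/eGFR values below are made up for illustration, not study data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired values: FT3 (pg/mL) against eGFR (mL/min/1.73 m^2).
ft3 = [1.8, 2.1, 2.4, 2.9, 3.2]
egfr = [18, 25, 34, 48, 55]
print(round(pearson_r(ft3, egfr), 3))
```

A positive r (as reported for FT3 vs. eGFR, r=0.642) means the two quantities rise together; a negative r (as for FT3 vs. creatinine) means one falls as the other rises.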
Conclusion: Thyroid dysfunction is highly prevalent among CKD patients, with low T3 syndrome being the predominant pattern. Significant inverse correlations between thyroid hormones and declining renal function suggest routine thyroid screening in CKD patients for early detection and appropriate management.
89. Prevalence of Anemia among Adolescent Girls in Rural Vs. Urban Areas of Gujarat, India: A Cross-Sectional Study
Megal Chittalben Raningbhai, Solanki Dhaval Rameshbhai, Makwana Mili Prakashbhai
Abstract
Background: Anemia remains a critical public health issue in India, particularly among adolescent girls, driven by nutritional demands of puberty and menstruation. In Gujarat, where adolescents form a significant demographic, anemia prevalence is high, influenced by socioeconomic disparities, dietary patterns, and healthcare access. Previous studies in Gujarat suggest higher rural prevalence, yet recent data are limited. This study aimed to assess anemia prevalence among adolescent girls in rural versus urban Gujarat, evaluating hemoglobin levels and associated factors to inform targeted interventions.
Material and Methods: A cross-sectional study was conducted over one year among 500 adolescent girls aged 10–19 years, equally divided between rural (n=250) and urban (n=250) areas. Participants were selected via multistage random sampling from schools and community centers. Hemoglobin was measured using the cyanmethemoglobin method (anemia: <12 g/dL, WHO criteria). Data on sociodemographics, diet, and menstrual history were collected via questionnaires. Ethical approval was obtained, with informed consent from guardians. Chi-square tests and logistic regression were applied using SPSS 25.0 (p<0.05).
Results: Anemia prevalence was 48% overall, significantly higher in rural (56%) than urban (40%) areas (p<0.001). Mild anemia predominated (58% of cases), followed by moderate (32%) and severe (10%). Rural girls had higher odds of anemia (OR=1.89, 95% CI: 1.32–2.70), linked to lower income and limited access to iron-rich foods. Urban girls benefited from better nutrition and healthcare.
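The odds ratio above can be illustrated from the reported prevalences: 56% of 250 rural girls (140) and 40% of 250 urban girls (100) were anemic. The sketch below computes a crude OR with a Wald 95% CI; note that the published 1.89 (1.32–2.70) is the adjusted estimate from logistic regression, so the crude value differs slightly:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Wald 95% CI for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Rural (exposed): 140 anemic, 110 not; urban (unexposed): 100 anemic, 150 not.
or_, lo, hi = odds_ratio_ci(a=140, b=110, c=100, d=150)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

This yields roughly OR 1.91 (95% CI 1.34–2.72), consistent with the adjusted estimate reported by the study.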
Conclusion: This study highlights the need for region-specific interventions, particularly in rural Gujarat, emphasizing nutritional supplementation and education to bridge disparities. Addressing anemia will enhance adolescent health and future maternal outcomes, aligning with national nutrition goals.
90. Artificial Intelligence–Assisted Risk Stratification for Congenital Syphilis in Antenatal Care
Kinnari Amin, Niravkumar Patel
Abstract
Background: Congenital syphilis remains a preventable cause of perinatal morbidity and mortality, particularly in low- and middle-income countries. Despite universal antenatal screening policies, cases continue to occur due to delayed diagnosis, inadequate treatment, and poor follow-up. Artificial intelligence (AI) offers a novel approach to integrate multiple maternal risk factors for early prediction of adverse outcomes.
Objectives: To develop and evaluate an AI-assisted risk stratification model for predicting congenital syphilis in pregnant women diagnosed with syphilis.
Methods: A prospective cohort study was conducted among pregnant women with confirmed syphilis attending antenatal clinics of a tertiary care hospital. Clinical, demographic, serological, and treatment-related variables were collected. Machine learning models including logistic regression, random forest, and gradient boosting were developed and compared. Model performance was assessed using area under the receiver operating characteristic curve (AUC), sensitivity, and specificity.
Results: Out of 280 enrolled participants, 23 neonates (8.2%) were diagnosed with congenital syphilis. The gradient boosting model demonstrated the highest predictive accuracy (AUC 0.88), significantly outperforming conventional risk assessment. High maternal non-treponemal titers, delayed initiation of treatment, inadequate serological response, and lack of partner treatment were the strongest predictors.
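The AUC used to compare the models above has a direct probabilistic reading: the chance that a randomly chosen affected case receives a higher risk score than a randomly chosen unaffected one (the Mann–Whitney formulation). A minimal sketch with made-up scores, not study data:

```python
def auc(pos_scores, neg_scores):
    """AUC via the Mann-Whitney statistic: the fraction of (positive,
    negative) pairs where the positive case scores higher; ties count 0.5."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical model risk scores for illustration only.
affected = [0.9, 0.8, 0.6]        # e.g. neonates with congenital syphilis
unaffected = [0.7, 0.4, 0.2, 0.1]
print(round(auc(affected, unaffected), 3))  # 11 of 12 pairs -> 0.917
```

An AUC of 0.5 is chance-level discrimination and 1.0 is perfect; the study's gradient boosting model reached 0.88 on this scale.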
Conclusion: AI-assisted risk stratification improves prediction of congenital syphilis and may enable targeted antenatal interventions, optimized follow-up, and reduction of preventable adverse neonatal outcomes.
91. Impact of Digital Health Literacy on Preventive Care Uptake in Rural Populations
Ravi Nandini Singh, Vijna, Arti Singh
Abstract
Background: Digital health literacy (DHL) is the ability to seek, understand, and use digital health information and services with the aim of promoting and maintaining one's own health and that of one's environment. Preventive healthcare is critical for reducing disease burden, yet rural populations in India continue to face barriers to accessing and utilizing preventive services, and DHL has the potential to bridge these gaps. However, limited evidence exists on how digital health literacy directly influences preventive care uptake in rural settings.
Methods: Around 100 rural participants took part in a cross-sectional study from January to June 2024. Data collection drew on medical records, systematic digital usage surveys, and community health intervention records. Digital health literacy was assessed using the Digital Health Literacy Instrument (DHLI); based on DHLI scoring guidelines, participants were classified into low, medium, and high literacy groups according to their total score distribution. Digital health literacy level (low, medium, or high) was the independent variable; dependent variables included immunisations, prenatal checkups, screenings, and health camp attendance. Descriptive statistics summarised participant demographics, and chi-square tests and logistic regression examined the association between digital literacy and preventive care.
Results: Of 100 participants, 36% had low digital health literacy, 44% medium, and 20% high. Uptake differed significantly between high and low literacy groups for immunisation (90%), prenatal visits (82% vs. 43%), and screenings (70% vs. 28%). Overall, a significant association (p < 0.05) was found between digital health literacy and preventive healthcare use.
Conclusion: Digital health literacy plays a pivotal role in improving preventive care uptake in rural Varanasi. Policy measures focused on digital training, infrastructure development, and localized awareness campaigns could substantially enhance preventive healthcare outcomes in similar rural settings across India.
92. Comparative Analysis of Adverse Drug Reactions (ADR) of Commonly Used Antibiotics
Gulam Mohammad, Mejoy Manoj Biswas, Eram Shamshad, Pratik
Abstract
Background: Antibiotics are among the most commonly prescribed medications worldwide, playing a vital role in combating bacterial infections. However, their extensive use is often associated with Adverse Drug Reactions (ADRs), which can range from mild allergic manifestations to severe life-threatening events. Understanding the frequency and pattern of antibiotic-induced ADRs is essential for promoting patient safety and guiding rational drug use.
Methods: A prospective observational study was conducted over eight months (January–August 2025) involving 100 patients who developed ADRs following antibiotic therapy. Data were collected and analyzed using descriptive statistics and chi-square tests to compare ADR frequency, type, and severity across antibiotic classes.
Results: Penicillins were the most frequently implicated antibiotics (32%), followed by fluoroquinolones (26%), cephalosporins (24%), and macrolides (18%). Cutaneous reactions (38%) and gastrointestinal disturbances (33%) were the most common ADRs observed. Most reactions were mild to moderate in severity, and approximately 44% were deemed preventable. A significant association (p < 0.05) was found between antibiotic class and type of ADR.
Conclusion: The study highlights that while most antibiotic-related ADRs are mild and manageable, a substantial proportion are preventable through rational prescribing, improved pharmacovigilance, and better patient education. Continuous monitoring and active ADR reporting systems are imperative for enhancing antibiotic safety and minimizing adverse outcomes.
93. Evaluation of Bacteriological Profile and Antimicrobial Sensitivity Pattern of Neonatal Sepsis in a Tertiary Care Centre
Jyotsna, Mukesh Kumar, Ankur Priyadarshi
Abstract
Background: Globally, neonatal sepsis is a leading cause of morbidity and mortality. In order to guide effective treatment, improve patient outcomes, and prevent the establishment of antibiotic resistance, it is essential to comprehend the bacterial profiles and antibiotic susceptibility patterns underlying neonatal sepsis. The purpose of this study is to assess the antimicrobial sensitivity pattern and bacteriological profile of neonatal sepsis.
Methods: This prospective observational study was conducted in the Department of Pediatrics, Jawaharlal Nehru Medical College and Hospital, Bhagalpur, Bihar, from February 2025 to October 2025. Blood samples from ninety neonates with clinically suspected sepsis were collected, processed according to standard microbiological techniques, and tested to identify the antimicrobial sensitivity pattern.
Results: Sixty-one (67.78%) of the ninety cases had positive blood cultures. There were 23 (37.71%) Gram-negative isolates and 48 (78.79%) Gram-positive isolates. Among the isolates, Staphylococcus aureus was the most prevalent organism (75.51%), followed by E. coli (27.87%), Klebsiella pneumoniae (9.84%), and Streptococcus pneumoniae (3.28%). All three common isolates showed 100% resistance to ampicillin. Gram-negative isolates were susceptible to gentamicin and meropenem, while Gram-positive isolates were susceptible to amikacin and amoxicillin-clavulanic acid.
Conclusion: In this study, neonatal septicemia was confirmed in 67.78% of cases by blood culture, the gold-standard investigation for diagnosis. The results also demonstrated rising resistance to widely used antibiotics. The changes in the microbiological spectrum and antimicrobial susceptibility pattern shown in this study should aid in the appropriate antibiotic treatment of such cases, thereby reducing neonatal morbidity and mortality.
94. A Hospital Based Case Control Study to Estimate Zinc Levels in Children with Acute Lower Respiratory Tract Infections
Mukesh Kumar, Jyotsna, Ankur Priyadarshi
Abstract
Background: The importance of zinc in childhood illnesses such as acute lower respiratory tract infections (ALRTIs) has been demonstrated, increasing research interest in its role. This study therefore aimed to measure serum zinc levels in ALRTI patients and correlate them with the disease's clinical course.
Methods: In this hospital-based case-control study, 61 patients aged 2 months to 5 years with ALRTI were compared with 61 controls matched for age and diet. Serum zinc was measured at admission. A thorough history, sociodemographic information and examination findings were recorded, along with details of the clinical course, including duration of stay, oxygen requirement, severity of illness, and outcome.
Results: The difference between the mean serum zinc levels of patients and controls (patients 58.88±12.40 µg/dl, controls 85.36±16.27 µg/dl) was statistically significant (p = 0.0001). Zinc levels and length of stay showed a weak negative correlation (r = -0.052, p = 0.691). Compared with cases of pneumonia (WHO IMNCI grading), cases of severe pneumonia had considerably lower mean serum zinc levels (p = 0.0001). Compared with patients who were discharged, those who required higher O2 concentrations and those who died had considerably lower mean serum zinc levels (p = 0.0001).
Conclusion: Lower serum zinc levels are significantly associated with ALRTI; the lower the serum zinc level, the greater the severity of disease, the longer the hospital stay, the higher the oxygen requirement, and the higher the incidence of mortality.
95. Evaluation of Risk Factors for Mortality in Elderly Patients with Hypoglycemia: A Cross-sectional Study
Rakesh Kumar, Amit Kumar, Arohi Kumar, Rahul Kumar
Abstract
Background: Hypoglycemia is a side effect of stringent diabetes control, especially in the geriatric population above sixty years of age, who number approximately 90 million in India. This study was undertaken to explore the risk factors for hypoglycemia in elderly inpatients and to determine the risk factors for short-term (15-day and 3-month) mortality in elderly patients presenting with a hypoglycemic attack to the emergency room (ER).
Methods: A total of 111 geriatric patients (>65 years; 41 males, 70 females) presenting with hypoglycemia between October 2024 and March 2025 were included in the study. Data were obtained by screening the patients' hospital records for age, sex, laboratory parameters, co-morbidities, time of admission, admission outcomes, and 15-day and 3-month mortality rates.
Results: 63.1% of patients had diabetes mellitus (DM), 51.4% were using oral antidiabetic drugs (OADs) and 21.6% were using insulin. Blood glucose levels were significantly lower in patients who died within the first 15 days after admission than in those who survived (p=0.014). Blood urea nitrogen (BUN) and creatinine values were significantly higher in patients who died within the first 15 days (p=0.002, p<0.001), while blood calcium levels and platelet (PLT) counts were significantly lower (p=0.006, p=0.001). Red cell distribution width (RDW) was significantly higher in patients who died within the first 15 days (p<0.001). Neutrophil count was significantly higher in patients who died within both 15 days and 3 months (p=0.002, p=0.012), and lymphocyte count was significantly lower in patients who died within 3 months (p=0.007). Coexisting DM, coronary artery disease (CAD), hypertension (HT) and malignancy, and OAD use, were significantly more frequent in patients who died within the first 15 days.
Conclusion: In this study, lower glucose level, impaired renal function, lower calcium and platelet counts, increased RDW and neutrophil count, lower lymphocyte count, coexisting DM, CAD, HT or malignancy, and OAD use were risk factors for mortality in geriatric patients with hypoglycemia.
96. Estimation on Prevalence and Factors associated with Cardiovascular Emergencies in SKMCH, Muzaffarpur, Bihar
Rakesh Kumar, Arohi Kumar, Amit Kumar, Rahul Kumar
Abstract
Background: Worldwide, cardiovascular disease (CVD) is the leading cause of death and a major cause of disability and lost productivity in adults. This study therefore aimed to estimate the prevalence of, and factors associated with, cardiovascular emergencies and their outcomes at SKMCH, Muzaffarpur, Bihar.
Methods: This cross-sectional study was conducted from April 2025 to September 2025 in the Emergency Department of SKMCH, Muzaffarpur, Bihar. A total of 422 patients aged 15 years or older were included, excluding those with incomplete medical records. Data were collected from questionnaire-based interviews with patients and from medical files, and were processed with Epi Info® 7.2.2.6 and MS Excel® 2019. The dependent variable was cardiovascular emergency, regardless of type. A multiple logistic regression model was used to estimate adjusted odds ratios (95% confidence intervals) for the relation between cardiovascular emergencies and patients' characteristics, with a statistical significance threshold of p<0.05.
Results: Among the 422 patients included in our study, 116 had cardiovascular emergencies, a prevalence of 27.5%. Of these, 68% (79/116) were over 50 years old and 50.9% (59) were male. The most common personal medical history was hypertension, in 50.8% (59/116). Stroke was the main cardiovascular emergency (32/116) and accounted for one third of the 15 recorded deaths. Hypertensive crisis and heart failure together accounted for 46.5% (54/116) of all emergencies. A history of hypertension (adjusted OR 6.09, 95% CI 3.31–11.21, p<0.0001) and age over 50 years (adjusted OR 2.34, 95% CI 1.33–4.10, p=0.002) were independently associated with cardiovascular emergencies.
Conclusion: Health promotion strategies targeting adequate management of high blood pressure and positive lifestyle habits in people over 50 years could help reduce the frequency of cardiovascular emergencies in hospital settings.
97. Study of Patient with Diabetic Foot Infection – Observational Study in Tertiary Care Center
Chirag Panara, Deepak Sanwal, Sandeep Rao
Abstract
Background: Diabetic foot ulcers (DFUs) are a major complication of diabetes mellitus (DM), particularly in India, where high prevalence and poor control exacerbate risks of infection, gangrene, and amputation. This study correlated Wagner grades with outcomes and identified associated risk factors in a tertiary care setting.
Methods: In this prospective observational study at SSG Hospital, Government Medical College, Vadodara, 40 adult patients admitted for DFU management were purposively sampled. Data on sociodemographic, DM duration/control, neuropathy, peripheral vascular disease (PVD), comorbidities, foot self-care, and Wagner grades (1–5) were collected via structured proforma, clinical exams, and labs. Outcomes included amputation rates, mortality, and hospital stay. Analysis used descriptive statistics, t-tests, and chi-square/Fisher’s exact tests (SPSS v25; p≤0.05).
Results: Mean age was 57.62 years; 65% were male, 92.5% had type 2 DM (mean duration >10 years in most). Poor glycaemic control affected 60%; neuropathy (42.5%) and PVD (37.5%) were common; 95% had poor foot self-care. Amputation rates rose with Wagner grade: 0% (grades 1–2), 23.5% (grade 3), 75% (grade 4), 100% (grade 5). Neuropathy (76.5% vs. 17.4% in amputees/non-amputees; p<0.05) and gangrene (76.5% vs. 8.7%; p<0.05) significantly predicted amputation. Mortality (12.5%) occurred in advanced grades, mainly non-surgical cases due to sepsis.
Conclusion: Higher Wagner grades strongly predict amputation and mortality in DFUs. Neuropathy, gangrene, prolonged DM duration, and poor control worsen outcomes. Emphasizing prevention via glycaemic management, neuropathy screening, and foot self-care education is crucial to reduce amputations in resource-limited settings.
98. Sleep Patterns in Children and Adolescents With and Without Chronic Diseases – A Comparative Study
Harsha Reddy, Prema R., Arpitha B.
Abstract
Background: Sleep plays a vital role in physical growth, cognitive development, emotional regulation, and overall health in children and adolescents. Chronic illnesses in childhood are frequently associated with physiological, psychological, and behavioral factors that can disrupt normal sleep patterns. Despite this, sleep disturbances in children with chronic diseases often remain under-recognized and under-managed. This study aimed to compare sleep patterns and sleep-related disturbances among children and adolescents with chronic illnesses and age- and sex-matched healthy controls.
Methods: A prospective, observational, cross-sectional study was conducted in the Department of Paediatrics at a tertiary care hospital. A total of 100 children aged 6–15 years were enrolled, including 50 children with chronic illnesses of more than six months’ duration (Group A) and 50 healthy controls (Group B). Sleep patterns were assessed using the Sleep Timing Questionnaire (STQ) and the Modified Sleep Disturbance Scale for Children (SDSC). Parameters evaluated included sleep latency, total sleep time, night awakenings, daytime sleepiness, and domain-specific sleep disturbances. Data were analyzed using appropriate descriptive and inferential statistical methods.
Results: Children with chronic illnesses had significantly later bedtimes (22:48 ± 0.6 vs. 22:05 ± 0.5 hours), longer sleep latency (42.6 ± 18.4 vs. 24.3 ± 12.1 minutes), and shorter total sleep duration (7.1 ± 0.9 vs. 8.2 ± 0.8 hours) compared to healthy controls (p < 0.001). Disorders of initiating and maintaining sleep, excessive daytime somnolence, sleep–wake transition disorders, and sleep breathing disturbances were significantly more prevalent in Group A. Overall, 88% of children with chronic illness experienced at least one sleep disturbance compared to 42% of healthy controls.
Conclusion: Sleep disturbances were significantly more common and severe among children and adolescents with chronic illnesses. Routine screening and early intervention for sleep problems should be integrated into comprehensive pediatric chronic disease management to improve health outcomes and quality of life.
99. Assessing Cardiac Manifestations in Dengue Patients
Bhavna Bamaniya, Pankaj Parmar
Abstract
Background: Dengue infection is increasingly recognized to involve the cardiovascular system, contributing to disease severity and adverse outcomes.
Objective: To evaluate electrocardiographic changes and clinical cardiac manifestations in dengue patients and assess their association with warning signs, in-hospital morbidity, and mortality.
Methods: A prospective observational study was conducted on 150 patients with laboratory-confirmed dengue infection. Clinical features, warning signs, ECG findings, and in-hospital outcomes were analyzed.
Results: ECG abnormalities were observed in a significant proportion of patients, with sinus bradycardia being the most common finding. ECG changes showed significant association with severe warning signs such as shock, respiratory distress, and ARDS, correlating with increased morbidity.
Conclusion: Cardiac involvement is common in dengue infection and is strongly associated with disease severity. Routine ECG evaluation may aid in early identification of high-risk patients and improve clinical outcomes.
100. Evaluating Pulmonary Artery Hypertension: A Doppler Echocardiography Study Correlated with Right Heart Catheterization
Pankaj Parmar, Bhavna Bamaniya
Abstract
Background: Doppler echocardiography is widely used for non-invasive assessment of pulmonary hypertension, but right heart catheterization remains the diagnostic gold standard.
Objective: To assess pulmonary hypertension using Doppler echocardiography and evaluate its correlation with right heart catheterization.
Methods: A prospective observational study was conducted in 35 patients with suspected pulmonary hypertension who underwent both echocardiography and right heart catheterization. Echocardiographic indices were correlated with invasive pulmonary artery pressures.
Results: Pulmonary artery acceleration time and PAAT/RVET ratio showed strong inverse correlations with invasively measured systolic and mean pulmonary artery pressures, while TR-derived systolic pulmonary artery pressure demonstrated a strong direct correlation.
Conclusion: Doppler echocardiography provides reliable estimation of pulmonary hemodynamics and serves as an effective non-invasive screening and follow-up tool, although right heart catheterization remains essential for definitive diagnosis.
101. Comparative Effectiveness of Autologous Platelet-Rich Plasma vs. Conventional Dressings in Chronic Ulcer Management
Ashok Nanalal Parmar
Abstract
Background: Chronic ulcers pose a major clinical challenge due to delayed healing and high recurrence rates. Autologous platelet-rich plasma (PRP) has emerged as a regenerative therapy capable of enhancing wound repair through concentrated growth factors.
Objective: To compare the efficacy of autologous PRP and conventional dressing in reducing ulcer surface area and promoting healing in chronic ulcers.
Methods: This prospective comparative study included 80 patients with chronic ulcers, randomized into PRP and control groups. Ulcer surface area was measured at baseline and at Days 14, 28, and 56, and percentage reduction was analyzed.
Results: While baseline ulcer size was comparable, the PRP group demonstrated significantly greater reduction in ulcer surface area from Day 14 onward, with highly significant differences observed at Days 28 and 56 compared with the control group.
Conclusion: Autologous PRP therapy significantly accelerates healing and reduces ulcer surface area in chronic ulcers compared with conventional dressing, supporting its role as an effective adjunctive treatment.
102. The Role of Serum Vitamin D in Rheumatoid Arthritis: Correlation with Disease Activity and Neuropathic Pain
Ashok Nanalal Parmar
Abstract
Background: Vitamin D deficiency is common in rheumatoid arthritis and may influence inflammatory activity and pain processing.
Objective: To evaluate the relationship between serum vitamin D levels, disease activity, and neuropathic pain in rheumatoid arthritis patients.
Methods: A cross-sectional study of 120 rheumatoid arthritis patients assessed serum vitamin D levels, DAS28 disease activity, and neuropathic pain using DN4 scores.
Results: Vitamin D deficiency was significantly associated with higher DAS28 scores and increased prevalence of neuropathic pain.
Conclusion: Low serum vitamin D levels are linked to greater disease activity and neuropathic pain in rheumatoid arthritis.
103. High-Sensitivity C‑Reactive Protein and Lipid Profiles in Early Acute Coronary Syndrome
Darshan Patel, Amit Maheshwari, Pralhad Potdar
Abstract
Background: Acute coronary syndrome is associated with inflammation and lipid abnormalities that influence plaque instability and myocardial injury. Early biomarker assessment may improve risk stratification and clinical outcomes.
Objectives: To evaluate the association of high-sensitivity C-reactive protein and lipid profile within 24 hours of symptom onset in patients with acute coronary syndrome.
Material and Methods: A hospital-based observational study was conducted on 150 participants comprising 100 ACS patients and 50 controls. ACS patients were subdivided based on symptom onset into <6 hours and ≥6–24 hours groups. hs-CRP and lipid profile were analyzed and compared.
Results: hs-CRP levels were significantly higher in ACS patients and increased with delayed presentation. ACS patients showed significantly higher triglycerides, LDL-C, VLDL-C, and lipid ratios with lower HDL-C compared to controls. Late presenters exhibited more pronounced inflammatory and lipid abnormalities.
Conclusion: hs-CRP and lipid profile assessment within 24 hours of ACS onset provides valuable insight into inflammatory burden and disease severity and may aid in early risk stratification and therapeutic planning.
104. A Comparative Study on Subcuticular Monocryl Suture Closure Versus Conventional Closure of Surgically Incised Wounds
Jyotsna V., Ashwitha Crasta, Santosh Sairoba Nagekar
Abstract
Background: Wound closure techniques significantly impact surgical outcomes, wound complications, and cosmetic results. Subcuticular monocryl sutures offer potential advantages over conventional interrupted sutures, but comparative evidence remains limited.
Objective: To compare subcuticular monocryl suture closure with conventional closure of surgically incised wounds, evaluating wound complications, cosmetic outcomes, and post-operative pain.
Methods: A hospital-based cross-sectional study was conducted at Navodaya Medical College Hospital and Research Centre, Raichur, from January 2021 to June 2022. One hundred patients undergoing clean surgical procedures were randomly allocated to either subcuticular monocryl suture (n=50) or conventional suture (n=50) groups. Primary outcomes included wound complications, duration of hospital stay, antibiotic coverage requirements, wound healing time, post-operative pain, and cosmetic outcomes assessed at follow-up visits up to one month post-operatively.
Results: Wound complications occurred in 20% of conventional suture patients versus 6% of monocryl suture patients (p=0.037). Mean hospital stay was significantly shorter in the monocryl group (5.38±1.11 days) compared to conventional group (7.6±1.85 days, p<0.0001). Monocryl sutures required shorter antibiotic coverage (5.16±0.58 vs 7.18±1.47 days, p<0.0001), faster wound healing (5.34±0.97 vs 7.32±1.50 days, p<0.0001), and resulted in less post-operative pain (1.44±0.64 vs 2.24±1.03, p<0.0001). Excellent cosmetic outcomes were achieved in 92% of monocryl cases versus 60% of conventional cases (p=0.0002).
Conclusion: Subcuticular monocryl suture closure demonstrated superior outcomes compared to conventional sutures, with significantly reduced wound complications, shorter hospitalization, decreased antibiotic requirements, accelerated wound healing, reduced post-operative pain, and improved cosmetic results.
105. Comparison of Vasopressor Requirements in Elective versus Emergency Cesarean Section Under Spinal Anesthesia
Nidhiben S. Patel, Akshay Pandya, Prapti Patel
Abstract
Background: Spinal anesthesia is the preferred anesthetic technique for cesarean section; however, spinal-induced hypotension frequently necessitates vasopressor use. Emergency cesarean sections are often associated with greater hemodynamic instability compared to elective procedures.
Aim: To compare vasopressor requirements in patients undergoing elective versus emergency cesarean section under spinal anesthesia.
Methodology: This prospective observational study included 100 patients, divided into elective (n = 50) and emergency (n = 50) cesarean section groups. Hypotension was defined as a fall in systolic blood pressure ≥20% from baseline or an absolute systolic blood pressure <90 mmHg. Vasopressor requirement, cumulative dose, and maternal and neonatal outcomes were recorded and analyzed.
Results: Hypotension occurred in 56% of elective and 76% of emergency cesarean sections. Vasopressor support was required in 48% of elective cases compared to 80% of emergency cases. The mean cumulative vasopressor dose was significantly higher in the emergency group (15.8 ± 4.6 mg) than in the elective group (9.4 ± 3.2 mg). Maternal symptoms such as nausea and vomiting were more frequent in emergency cesarean sections, and neonatal outcomes showed a higher incidence of low Apgar scores and NICU admissions in this group.
Conclusion: Emergency cesarean section under spinal anesthesia is associated with significantly higher vasopressor requirements than elective cesarean section. Anticipation of hypotension and early vasopressor preparedness are essential to optimise maternal and neonatal outcomes.
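The hypotension criterion used in the methodology above (a fall in systolic blood pressure of ≥20% from baseline, or an absolute value below 90 mmHg) is a simple threshold rule; a minimal sketch, with the blood-pressure readings invented for illustration:

```python
def is_hypotensive(baseline_sbp: float, current_sbp: float) -> bool:
    """Flag spinal-induced hypotension: systolic blood pressure has
    fallen >=20% from baseline, or is below 90 mmHg in absolute terms."""
    relative_fall = (baseline_sbp - current_sbp) / baseline_sbp
    return relative_fall >= 0.20 or current_sbp < 90

# Illustrative readings in mmHg, not study data
print(is_hypotensive(120, 95))   # 20.8% fall from baseline -> True
print(is_hypotensive(120, 100))  # 16.7% fall, still >=90 mmHg -> False
```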
106. Effect of Topical Phenytoin with Normal Saline Dressing in Patients of Diabetic Foot Ulcers
Ekanta Apon Antarikshwa Sarma, Jon Bordalai, Damayanti Das
Abstract
Introduction: Chronic wounds, especially non-healing ones, are among the most common surgical conditions a surgeon encounters. One of the most feared complications of long-term diabetes is loss of a leg or foot; it has been estimated that one in five of all diabetic admissions to hospital are for foot ulcers. Diabetic foot ulcers arrest in the inflammatory stage of healing due to neuropathy, angiopathy, and infection.
Aims: The aim of this study was to compare the effectiveness of topical phenytoin dressing with normal saline dressing in promoting healing of diabetic foot ulcers. It also aimed to evaluate associated outcomes such as infection rates and need for surgical intervention.
Materials & Methods: This hospital-based prospective randomized comparative study was conducted at Tezpur Medical College and Hospital from January 2024 to June 2025 and included 100 patients with diabetic foot ulcers.
Results: In our study, secondary infection was significantly lower in the phenytoin group (6 patients, 12%) than in the saline group (14 patients, 28%) (p = 0.04). The need for surgical intervention was also lower in the phenytoin group (4 patients, 8%) than in the saline group (11 patients, 22%) (p = 0.04).
Conclusion: We concluded that topical phenytoin dressing is markedly more efficacious than regular saline dressing in the treatment of diabetic foot ulcers. Due to similar baseline characteristics across the two groups, the enhanced outcomes may be reliably ascribed to the intervention.
107. Management of Oesophageal Atresia and Tracheo-Oesophageal Fistula: A Prospective Study of 30 Consecutive Cases
Ashwitha Crasta, Jyotsna V., Santosh Sairoba Nagekar
Abstract
Background: Oesophageal atresia with tracheo-oesophageal fistula (OA/TOF) is a life-threatening congenital anomaly requiring immediate surgical intervention. Despite advances in surgical techniques and perioperative care, the condition continues to present significant challenges, particularly in developing countries where late presentation and associated complications are common.
Objectives: To describe the management outcomes of OA/TOF, identify perioperative complications, and evaluate the association of prognostic factors with surgical outcomes.
Methods: A prospective observational study was conducted over 22 months (November 2018 to August 2020) at J.J.M Medical College, Davanagere, involving 30 consecutive neonates with OA/TOF. All patients underwent detailed clinical assessment, radiological evaluation, and surgical management via right posterolateral thoracotomy with primary oesophago-oesophagostomy.
Results: The cohort comprised 60% males and 40% females (male:female ratio 1.5:1). Gross type C was the predominant anatomical variant (96.66%). Low birth weight was present in 66.66% of cases, pneumonia in 56.66%, and associated congenital anomalies in 46.66%. Twenty-nine patients (96.66%) underwent primary repair. Overall mortality was 53.33%, with sepsis (50%) and respiratory failure (37.5%) being the leading causes of death. Postoperative complications included anastomotic leak (43.5%), dysphagia (48.3%), and stricture formation (14.3%). Waterston group A patients demonstrated 100% survival, group B 60% survival, and group C 29.4% survival.
Conclusion: Primary thoracotomy with oesophageal anastomosis remains the definitive surgical treatment for Gross type C OA/TOF. Significant postoperative complications and high mortality rates are influenced by low birth weight, pneumonia, sepsis, associated congenital anomalies, and anastomotic leak. Multidisciplinary care involving neonatologists, paediatric surgeons, and intensivists is essential for improving outcomes.
108. Posterior Versus Lateral Approach in Total Hip Replacement: A Comparative Study
Panchiwala Nikhil Bharatbhai, Patel Jaimin Ashvinbhai, Panchiwala Vinit Jayeshkumar
Abstract
Background: Total hip replacement (THR) is a highly successful surgical procedure for end-stage hip disorders. The posterior and lateral approaches are the most commonly used techniques, yet controversy persists regarding their comparative outcomes.
Objectives: To compare posterior and lateral approaches in primary THR with respect to early functional outcome, perioperative parameters, and short-term complications.
Methodology: This prospective comparative study was conducted at a tertiary care hospital over 4–6 months. A total of 100 patients undergoing primary THR were equally divided into posterior (n=50) and lateral (n=50) approach groups. Functional outcome was assessed using the Harris Hip Score (HHS). Data were entered in Microsoft Excel and analyzed using SPSS version 26. Appropriate statistical tests were applied.
Results: Both groups showed significant improvement in functional outcome. The posterior approach demonstrated slightly higher early HHS and lower incidence of abductor weakness. No dislocations were observed in either group.
Conclusion: Both posterior and lateral approaches provide comparable short-term outcomes following THR. The posterior approach may offer better early functional recovery, while overall complication rates remain similar.
109. Study of Effectiveness of Case Based Learning in Microbiology for 2nd Professional M.B.B.S. Students
Ahir Hitesh, Patel Parimal, Patel Hiral, Soni Payal, Gamit Mital
Abstract
Background: Traditional teaching in microbiology is largely lecture-based, leading to passive learning and limited student engagement. To enhance understanding and motivation, active learning strategies such as Case-Based Learning (CBL) can be introduced.
Aim: To evaluate the effectiveness of case-based learning compared to didactic lectures and to assess students’ perception toward this method.
Methods: A total of 70 volunteer students from the 2nd Professional M.B.B.S. were enrolled after informed consent and randomly divided into two groups (A and B, 35 each). Group A underwent Case-Based Learning, further divided into smaller groups of 11–12 students, while Group B attended traditional lectures on the same topic. Validated clinical cases, post-test MCQs, and a feedback questionnaire were used for evaluation. Faculty were sensitized and trained for CBL sessions. Learning outcomes were assessed through post-test performance and feedback analysis.
Results: Students exposed to CBL showed higher engagement, improved understanding of microbiology concepts, and better problem-solving skills compared to those attending lectures. Feedback revealed that CBL increased student interest, promoted active learning, and improved teacher–student interaction.
Conclusion: Case-Based Learning is an effective student-centered teaching method in microbiology. It enhances motivation, understanding, and application of concepts and should be incorporated as a regular component of undergraduate medical education.
110. Comparison of Atherogenic Index of Plasma between Patients Suffering from Myocardial Infarction and Adults without Known Cardiovascular Disease
Manidip Chakraborty, Soumyamoy Das, Kaushik Tripura, Sankar Roy, Arkadip Choudhury
Abstract
Background: Studies suggest that myocardial infarction is a major cause of mortality worldwide. The atherogenic index of plasma (AIP) is a sensitive marker in the diagnosis of myocardial infarction. In this study, we aimed to assess the association of AIP with myocardial infarction.
Methods: AIP was calculated as the logarithm of the ratio of triglyceride (TG) to high-density lipoprotein cholesterol (HDL-C). Data were analyzed using odds ratios (OR) with 95% confidence intervals (CI) and univariable logistic regression.
Results: A total of 28 patients with diagnosed MI from the emergency ward and 28 patients attending the OPD of Tripura Medical College with no known cardiovascular disease were included in the study. TG levels were higher in MI patients than in non-MI patients, while HDL-C was higher in non-MI patients. The mean atherogenic index of plasma was markedly higher among myocardial infarction patients (0.358 ± 0.310) than among non-MI patients (0.034 ± 0.159), and the difference was statistically significant (p < 0.001).
Conclusion: The atherogenic index of plasma (AIP) is strongly associated with myocardial infarction and may be used as a predictive risk marker for cardiovascular diseases such as myocardial infarction.
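The AIP calculation described in the Methods (the logarithm of the TG to HDL-C ratio, conventionally base 10 with both lipids in mmol/L) can be sketched as follows; the lipid values in the example are illustrative, not study data:

```python
import math

def atherogenic_index(tg_mmol_l: float, hdl_c_mmol_l: float) -> float:
    """Atherogenic index of plasma: log10(TG / HDL-C),
    with both concentrations expressed in mmol/L."""
    return math.log10(tg_mmol_l / hdl_c_mmol_l)

# Illustrative concentrations in mmol/L, not taken from the study
print(round(atherogenic_index(2.3, 1.0), 3))  # elevated TG -> positive AIP
print(round(atherogenic_index(1.2, 1.3), 3))  # HDL-C > TG -> negative AIP
```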
111. Study of Microvessel Density (MVD) and Vascular Endothelial Growth Factor (VEGF) as Prognostic Indicator in Colorectal Cancer and Correlate with PTNM Staging
K. Lakshmi Chinmayee, R. Saranya, Nakka Anusha, Mahalakshmi Innamuri
Abstract
Background: The aim and objective of this study was to evaluate microvessel density (MVD) and vascular endothelial growth factor (VEGF) as prognostic indicators in colorectal carcinoma and to correlate them with pTNM staging.
Materials and Methods: This cross-sectional observational study was conducted on 50 colorectal resection specimens received by the Department of Pathology, Gandhi Hospital, Secunderabad, over a period of 18 months (October 2022 to March 2024). Relevant clinical details of all 50 patients diagnosed with colorectal carcinoma were documented. The specimens were fixed, processed, and embedded in paraffin wax. Serial sections of 4–5 µm thickness were obtained and stained with H&E.
Results: Routine processing and H&E staining were followed by immunohistochemistry for MVD and VEGF. Patient age ranged from 36 to 70 years, and the majority of subjects were male (M:F ratio 1.77:1). Most tumours were moderately or poorly differentiated adenocarcinomas belonging to stage III or IV. MVD and VEGF expression was correlated with clinicopathological parameters such as tumour grade and stage to assess the prognostic usefulness of these immunomarkers. Expression of both markers was higher in moderately and poorly differentiated tumours than in well-differentiated tumours, and there was a statistically significant correlation of MVD and VEGF positivity with tumour stage, with higher expression in stage III and IV tumours.
Conclusion: The present study concluded that MVD and VEGF are important prognostic indicators in colorectal carcinoma. As VEGF is a predominant angiogenic factor in the growth and maturation of new vessels, high MVD and VEGF expression are associated with an increased incidence of metastasis and decreased survival. Combined targeting of the MVD and VEGF pathways may offer a novel and potentially promising chemotherapeutic strategy for the treatment and/or prevention of colorectal neoplasia.
112. Comparing the Prognostic Values of Serum hs-CRP and IL-6 with SOFA in Patients with Sepsis and Septic Shock
Sansar Chandra Asiwal, Pinki Tak, Monika Kumawat, Manoj Kumar Banyal, Rajesh Jain
Abstract
Background: Sepsis and septic shock are life-threatening conditions with high mortality rates. Early identification and accurate prognostic markers are crucial for improving patient outcomes. This study aims to compare the prognostic values of serum high-sensitivity C-reactive protein (hs-CRP) and interleukin-6 (IL-6) with the Sequential Organ Failure Assessment (SOFA) score in patients with sepsis and septic shock.
Methods: A cross-sectional observational study was conducted on 400 patients admitted to the medical ward and ICU of Jawahar Lal Nehru Hospital, Ajmer, Rajasthan, India, from October 2022 to September 2023. Patients were categorized into two groups based on outcome: mortality (death) and survival. Serum hs-CRP, IL-6, and SOFA scores were measured on days 0, 3, and 7. Statistical analysis was performed using SPSS 20 software.
Results: The mortality rate in the sepsis cohort was 15%. The mean age of the patients was 52.13±13.82 years, with no significant difference between the mortality and survival groups. The prevalence of diabetes and hypertension was significantly higher in the mortality group. Mean hs-CRP and IL-6 levels were significantly higher in the mortality group compared to the survival group at all time points. The SOFA score was also significantly higher in the mortality group. ROC analysis showed that IL-6 had the highest sensitivity (98%) and specificity (99%) in predicting mortality, followed by hs-CRP (95% sensitivity, 95.6% specificity).
Conclusion: Serum hs-CRP and IL-6 are valuable prognostic markers for sepsis and septic shock. IL-6, in particular, demonstrated superior sensitivity and specificity in predicting mortality. Combining these biomarkers with the SOFA score can enhance prognostic accuracy and guide early intervention strategies.
113. Incidence and Complications of Meconium Aspiration Syndrome in Term Neonates – A Prospective Observational Study
Sondarva Prakash Chhaganlal, Tejas Pramod Hapani
Abstract
Background: Meconium aspiration syndrome (MAS) is a major respiratory condition in term neonates, occurring when meconium-stained amniotic fluid (MSAF) is aspirated, causing airway obstruction, inflammation, and surfactant dysfunction. MSAF is seen in 10-20% of term deliveries worldwide, with MAS developing in 5-30% of those cases depending on setting and meconium thickness. In India, higher rates are often reported in tertiary centers due to referral patterns and delayed care. Risk factors include post-term gestation, fetal distress, and thick meconium. Complications range from respiratory failure and pneumothorax to persistent pulmonary hypertension of the newborn (PPHN), hypoxic-ischemic encephalopathy, and mortality. Despite improvements in perinatal care, MAS contributes significantly to neonatal intensive care admissions. This study examined the incidence and complications of MAS in term neonates at a tertiary hospital in Gujarat, India, to provide local insights and support improved protocols.
Material and Methods: This prospective observational study was carried out over a year at a tertiary care hospital in western Gujarat, screening 940 term deliveries. Neonates with MSAF were followed for MAS signs. Institutional Ethics Committee approval was obtained, following Helsinki guidelines, with parental informed consent. Inclusion: term (37-42 weeks) neonates with MSAF. Exclusion: preterm/post-term extremes, congenital anomalies, non-MSAF births, or incomplete data. Data included demographics, clinical features, complications, and outcomes via records and follow-up. SPSS version 25 was used; chi-square for associations, logistic regression for risks, p<0.05 significant.
Results: Of 940 term deliveries, 132 (14%) had MSAF, and 16 (12.1%) developed MAS, giving an incidence of 1.7 per 100 live births. Males were 62.5% (10/16). Complications included respiratory failure (68.8%), pneumothorax (18.8%), PPHN (12.5%), neurological injury (12.5%), and sepsis (6.3%). Mortality was 6.25% (1 case). Detailed tables cover demographics, incidence by gestation, complications, and outcomes.
Conclusion: MAS incidence in this cohort matches regional Indian trends, with notable preventable complications. Strengthening antenatal surveillance and timely delivery interventions could lower burden. Enhanced neonatal management in high-risk cases remains key.
114. K-Wire Fixation versus Volar Locking Plate for Distal End Radius Fractures: A Comparative Outcome Study
Deep Nileshkumar Shah, Parth Yashvantbhai Solanki, Ashwin Mehta
Abstract
Background: Distal end radius fractures require stable fixation to achieve anatomical restoration and optimal functional recovery.
Aim: To compare the functional and radiological outcomes of distal end radius fractures treated with K-wire fixation and volar locking plate fixation.
Materials and Methods: A prospective comparative study was conducted on 50 patients with distal end radius fractures treated by either percutaneous K-wire fixation or volar locking plate fixation. Functional outcome was assessed using the Mayo Wrist Score, while radiological outcomes were evaluated using radial height, radial inclination, volar tilt, and ulnar variance at follow-up.
Results: The volar locking plate group demonstrated superior functional outcomes and better maintenance of radiological parameters compared to the K-wire group.
Conclusion: Volar locking plate fixation provides better functional and radiological outcomes; however, K-wire fixation remains a viable option in selected cases.
115. Clinical Evaluation of Dexmedetomidine for Sedation during Regional Anesthesia
Rashmi Shankarrao Jawanjal, Sachin Ramesh Gondane, Nitin Ramesh Gondane
Abstract
Background: Adequate sedation during regional anesthesia improves patient comfort and operating conditions while preserving spontaneous respiration and hemodynamic stability. Dexmedetomidine, a selective α₂-adrenergic agonist, has emerged as a useful sedative agent due to its anxiolytic and analgesic properties with minimal respiratory depression.
Material and Methods: This prospective, randomized, double-blind, controlled study included 60 adult patients (ASA physical status I–II) undergoing elective surgery under regional anesthesia. Patients were randomly allocated into two groups of 30 each. Group D received intravenous dexmedetomidine with a loading dose of 0.5 µg/kg followed by a maintenance infusion of 0.2–0.5 µg/kg/h, while Group C received an equivalent volume of normal saline. Sedation was assessed using the Ramsay Sedation Scale. Hemodynamic parameters, adverse events, need for rescue sedation, and patient and surgeon satisfaction were recorded and analyzed statistically.
Results: Demographic variables and baseline characteristics were comparable between the groups. Sedation scores were significantly higher in Group D at all measured time points (10, 30, and 60 minutes; p < 0.01). The lowest recorded heart rate and mean arterial pressure were significantly lower in the dexmedetomidine group compared to the control group (p < 0.01), though these changes were clinically manageable. The requirement for rescue sedation was significantly greater in Group C (20%) compared to Group D (0%; p = 0.01). The incidence of bradycardia and hypotension was higher in Group D but did not reach statistical significance. Patient and surgeon satisfaction scores were significantly higher in the dexmedetomidine group (p < 0.01).
Conclusion: Dexmedetomidine provides effective and reliable sedation during regional anesthesia, with improved sedation quality and satisfaction and a reduced need for rescue sedation, while maintaining acceptable hemodynamic safety.
116. Propofol versus Etomidate for Induction of Anesthesia in Adult Surgical Patients: A Randomized Study
Sachin Ramesh Gondane, Rashmi Shankarrao Jawanjal, Nitin Ramesh Gondane
Abstract
Background: Induction of general anesthesia is frequently associated with hemodynamic fluctuations, which may be clinically significant, particularly in susceptible patients. Propofol and etomidate are commonly used intravenous induction agents with differing cardiovascular effects. This study aimed to compare the hemodynamic responses and adverse effect profile of propofol and etomidate during induction of anesthesia in adult surgical patients.
Material and Methods: This prospective, randomized study included 60 adult patients (ASA physical status I–II) scheduled for elective surgery under general anesthesia. Patients were randomly allocated into two equal groups: Group P received propofol (2 mg/kg) and Group E received etomidate (0.3 mg/kg) for induction. Heart rate and mean arterial pressure were recorded at baseline, after induction, immediately after intubation, and at 1, 3, 5, and 10 minutes post-intubation. Adverse events such as hypotension, bradycardia, pain on injection, myoclonus, and need for vasopressor support were documented. Statistical analysis was performed using appropriate parametric and non-parametric tests, with a p value <0.05 considered significant.
Results: Demographic variables and baseline hemodynamic parameters were comparable between the two groups. Following induction, heart rate and mean arterial pressure were significantly lower in the propofol group compared to the etomidate group. Post-intubation increases in heart rate and mean arterial pressure were observed in both groups, with significantly higher values in the propofol group during the early post-intubation period. Hypotension and pain on injection were significantly more frequent with propofol, whereas myoclonus occurred exclusively in the etomidate group. A higher proportion of patients in the propofol group required vasopressor support, although this difference was not statistically significant.
Conclusion: Etomidate provides superior hemodynamic stability during induction of anesthesia compared to propofol, while propofol is associated with a higher incidence of hypotension and pain on injection. Etomidate may therefore be preferred when cardiovascular stability is a priority.
117. Correlation of Serum Lactate Dehydrogenase, Indirect Bilirubin, and Haptoglobin with Hematological Parameters in Hemolytic Anemia: A Cross-Sectional Study from a Tertiary Care Center in Rajasthan
Yogendra Madan, Deeksha Buliwal, Ashima Madan
Abstract
Background: Hemolytic anemia is characterized by increased destruction of red blood cells leading to release of intracellular enzymes and hemoglobin metabolites into circulation. Simple biochemical markers such as serum lactate dehydrogenase (LDH), indirect bilirubin, and haptoglobin are routinely available in most medical college laboratories and can assist in assessing the degree of hemolysis, particularly in resource-limited settings.
Objectives: (1) To assess serum LDH, indirect bilirubin, and haptoglobin levels in patients with hemolytic anemia. (2) To correlate biochemical markers of hemolysis with hematological parameters, namely hemoglobin level and reticulocyte count. (3) To evaluate the diagnostic utility of routine biochemical investigations in assessing severity of hemolysis.
Materials and Methods: This hospital-based cross-sectional study was conducted at Government Medical College, Jhalawar, Rajasthan, over a period of one and a half years. A total of 51 patients diagnosed with hemolytic anemia were included. Hematological investigations included complete blood count, reticulocyte count, and peripheral blood smear examination. Biochemical parameters included serum LDH, total and indirect bilirubin, and serum haptoglobin. Statistical analysis was performed using SPSS software, and correlations were assessed using appropriate tests.
Results: Serum LDH and indirect bilirubin levels were significantly elevated, while serum haptoglobin levels were reduced in the majority of patients. A significant negative correlation was observed between hemoglobin levels and serum LDH and indirect bilirubin, whereas reticulocyte count showed a significant positive correlation with these biochemical markers. Serum haptoglobin demonstrated a positive correlation with hemoglobin concentration and a negative correlation with LDH.
Conclusion: Routine biochemical markers such as serum LDH, indirect bilirubin, and haptoglobin correlate significantly with hematological indices of hemolysis and can serve as reliable, cost-effective indicators for assessing hemolytic anemia in resource-limited tertiary care settings.
118. Correlation of Hemoglobin Levels with Oxygen Saturation and Resting Heart Rate in Patients with Chronic Anemia: A Cross-Sectional Study from a Tertiary Care Center
Yogendra Madan, Poonam Nagori, Deeksha Buliwal
Abstract
Background: Chronic anemia leads to reduced oxygen-carrying capacity of blood and triggers physiological compensatory mechanisms, particularly involving the cardiovascular and respiratory systems. Assessment of these adaptations using simple, non-invasive physiological parameters may provide valuable insights into the severity and functional impact of anemia, especially in resource-limited settings.
Objectives: (1) To assess resting oxygen saturation and heart rate in patients with chronic anemia. (2) To correlate hemoglobin levels with oxygen saturation and resting heart rate. (3) To evaluate the physiological compensatory response to chronic anemia using routinely available clinical parameters.
Materials and Methods: This hospital-based cross-sectional study was conducted at a tertiary care center over a defined study period. Patients diagnosed with chronic anemia were enrolled after obtaining informed consent. Hemoglobin levels were estimated using automated hematology analyzers. Resting oxygen saturation was measured using a finger pulse oximeter, and heart rate was recorded after adequate rest. Statistical analysis was performed to assess correlations between hemoglobin concentration and physiological parameters.
Results: A significant negative correlation was observed between hemoglobin levels and resting heart rate, indicating increasing tachycardia with declining hemoglobin concentration. Hemoglobin levels showed a mild but statistically significant positive correlation with oxygen saturation. These findings reflect physiological cardiovascular compensation in chronic anemia.
Conclusion: Hemoglobin levels in chronic anemia correlate significantly with resting heart rate and, to a lesser extent, with oxygen saturation. Simple physiological measurements such as pulse rate and oxygen saturation can serve as useful adjuncts in assessing the functional severity of chronic anemia, particularly in resource-constrained healthcare settings.
119. Effect of Intravenous versus Nebulised Dexmedetomidine on Intubating Conditions during Awake Fiberoptic Intubation: A Prospective Randomised Double Blind Comparative Study
Sandeep Sharma, Anush Jain, Ashish Ameta, Santosh Choudhary, Prakriti Singh, Manisha Jain
Abstract
Background and Aims: Awake fibreoptic intubation (AFOI) tends to be safer than conventional laryngoscopy in difficult airway conditions. Dexmedetomidine can be used for sedation as it preserves spontaneous breathing and airway patency. We aimed to assess the effect of intravenous versus nebulized dexmedetomidine on intubating conditions during AFOI.
Methods: In this prospective randomized double-blind comparative study, after IEC approval, CTRI registration, and informed written consent, ASA grade I–II patients of either gender aged 18–60 years with Mallampati grade I–II undergoing elective surgery under general anaesthesia were allocated into two groups, Group N and Group I, to receive dexmedetomidine 1 mcg/kg via the nebulized and intravenous routes respectively. Patients were intubated with a fibreoptic bronchoscope. The primary outcome assessed was cough score. Secondary outcomes were time taken for AFOI, intubating conditions, vocal cord position, post-intubation score, Ramsay sedation scale, haemodynamic parameters, and patient comfort, cooperativeness, and satisfaction. Data were analysed using SPSS version 25. Continuous, ordinal, and categorical data were presented as mean ± SD, median (IQR), and proportion respectively. Student's t-test, Mann-Whitney U test, and chi-square test were applied where deemed appropriate; p<0.05 was considered statistically significant.
Results: A greater proportion of patients in Group I (56.66%) showed lower cough severity as compared to Group N (10%; p=0.0001). Group I also showed a shorter intubation time (176 ± 58.54 vs. 245 ± 84.14; p=0.0001), better vocal cord position (p=0.00003), higher sedation scores [median (IQR) 2 (2–2) vs. 1 (1–1); p<0.001], and more stable haemodynamic parameters.
Conclusion: Compared with the nebulized route, dexmedetomidine administered intravenously provides better intubating conditions for AFOI with reduced cough severity.
120. Early Versus Delayed Enteral Feeding After Emergency Laparotomy: Impact on Morbidity and Hospital Stay – A Prospective Study
Jagdish Chavda, Jayant Uperia, Harshadray Parmar, Zeel Bhanderi, Jigar Dave, Saloni Choudhary
Abstract
Background: Emergency laparotomy is commonly performed for acute abdominal conditions and is associated with significant postoperative morbidity and prolonged hospital stay. Traditionally, enteral feeding is delayed to allow bowel recovery; however, recent evidence supports early enteral nutrition for improved outcomes. This prospective study, conducted over a year at a tertiary care center in North Gujarat, India, compared early (within 24 hours) versus delayed (48–72 hours) enteral feeding following emergency laparotomy. Adult patients with conditions such as perforation peritonitis, intestinal obstruction, and trauma were included, while those with severe comorbidities were excluded. The study aimed to assess postoperative morbidity and length of hospital stay.
Material and Methods: This was a randomized controlled trial involving 120 patients divided equally into two groups: early enteral feeding (EEF) starting within 24 hours via nasogastric tube with gradual progression to oral intake, and delayed enteral feeding (DEF) initiated after return of bowel sounds (typically 48-72 hours). Inclusion criteria encompassed patients aged 18-65 years undergoing emergency laparotomy for non-malignant conditions. Exclusion criteria included preoperative malnutrition, mechanical ventilation >48 hours, or anastomotic concerns. Data collection involved daily monitoring for complications like wound infections, ileus, anastomotic leaks, and pneumonia. Statistical analysis used SPSS software, with chi-square tests for categorical variables and t-tests for continuous data; p<0.05 was considered significant.
Results: The EEF group showed significantly lower morbidity rates (18% vs. 35% in DEF, p=0.02), with reduced incidences of wound infections (8% vs. 18%), paralytic ileus (5% vs. 12%), and pneumonia (3% vs. 8%). There was no significant difference in anastomotic leak rates (2% vs. 3%). Mean hospital stay was shorter in the EEF group (7.2 ± 2.1 days) compared to the DEF group (10.5 ± 3.4 days, p<0.001). Nutritional status improved faster with EEF, with better albumin levels at discharge.
Conclusion: Early enteral feeding post-emergency laparotomy is safe and beneficial, reducing morbidity and hospital stay without increasing complications. This approach should be adopted in resource-limited settings like North Gujarat to optimize patient outcomes and healthcare efficiency.
121. Risk Factors and Outcomes of Anastomotic Leak Following Colorectal Resections at a Tertiary Care Center in North Gujarat
Harshadray Parmar, Jagdish Chavda, Jayant Uperia, Saloni Choudhary, Zeel Bhanderi, Jigar Dave
Abstract
Background: Anastomotic leak (AL) is a serious complication after colorectal resections, leading to increased morbidity, prolonged hospitalization, and mortality. Despite advances in surgical care, AL rates remain variable, especially in India where colorectal cancer incidence is rising. Risk factors such as male gender, malnutrition, diabetes, and low rectal anastomosis are recognized, but Indian prospective data are limited. This study aimed to identify risk factors and evaluate outcomes of AL in patients undergoing colorectal resections at a tertiary care center in North Gujarat, providing region-specific evidence for improved risk assessment and management.
Material and Methods: This prospective observational study enrolled 182 consecutive patients undergoing elective colorectal resections with primary anastomosis over a year. Data on demographics, comorbidities, preoperative nutrition, operative details, and postoperative outcomes were collected prospectively. Ethical clearance was obtained from the institutional review board, with written informed consent from all participants. Inclusion criteria included adults aged 18-80 years with benign or malignant colorectal diseases requiring resection and anastomosis. Exclusions were emergency surgeries, diverting stomas, or incomplete follow-up. AL was diagnosed clinically or radiologically within 30 days. Risk factors were evaluated via univariate and multivariate logistic regression. Outcomes like mortality and reintervention were compared. Analysis used SPSS version 26, with p<0.05 significance.
Results: AL occurred in 7.7% (14/182) of patients. Independent risk factors on multivariate analysis were male gender (OR 2.4, 95% CI 1.3-4.5, p=0.008), diabetes (OR 2.1, 95% CI 1.2-3.8, p=0.015), preoperative albumin <3.5 g/dL (OR 2.8, 95% CI 1.5-5.2, p=0.002), rectal anastomosis (OR 3.5, 95% CI 1.9-6.4, p<0.001), and operative time >180 minutes (OR 2.6, 95% CI 1.4-4.8, p=0.004). AL patients had higher 30-day mortality (14.3% vs. 1.8%, p=0.002), reoperation rates (50% vs. 5.4%, p<0.001), and longer hospital stays (mean 19.5 vs. 8.1 days, p<0.001).
Conclusion: This prospective study from North Gujarat identifies male gender, diabetes, hypoalbuminemia, rectal anastomosis, and prolonged surgery as key risk factors for anastomotic leak, with a 7.7% incidence significantly affecting outcomes.