1. Comparative Study of Ultrasound-Guided Quadratus Lumborum Block Versus Transversus Abdominis Plane Block for Postoperative Analgesia in Patients Undergoing Lower Abdominal Surgeries
K. M. Nithish, Anusha M. S., Rajeswara Rao Sarvasiddhi
Abstract
Background: Effective postoperative pain management is crucial for early mobilization, reduced morbidity, and improved patient satisfaction. Ultrasound-guided truncal blocks such as the transversus abdominis plane (TAP) block and quadratus lumborum (QL) block are increasingly used as part of multimodal analgesia for lower abdominal surgeries.
Objective: To compare the efficacy of ultrasound-guided quadratus lumborum block versus transversus abdominis plane block for postoperative analgesia in patients undergoing elective lower abdominal surgeries.
Materials and Methods: This quasi-experimental study was conducted at a tertiary care teaching hospital between October 2022 and March 2024. Sixty patients aged 20–40 years with ASA physical status I–II undergoing elective lower abdominal surgeries under spinal anesthesia were enrolled. Patients were divided into two groups: Group Q (Quadratus Lumborum block, n=30) and Group T (Transversus Abdominis Plane block, n=30). Both groups received bilateral blocks using 0.125% bupivacaine at 0.3–0.4 ml/kg. Postoperative pain was assessed using the Visual Analogue Scale (VAS) at predefined intervals up to 24 hours. Duration of analgesia, number of rescue analgesic doses, hemodynamic parameters, and adverse effects were recorded.
Results: Demographic variables, ASA status, type and duration of surgery, and hemodynamic parameters were comparable between groups (p>0.05). VAS scores were similar up to 4 hours postoperatively. From 8 hours onward, Group Q demonstrated significantly lower VAS scores compared to Group T (p<0.05). The mean duration of analgesia was significantly longer in Group Q (12.23 ± 1.94 hours) than in Group T (8.76 ± 0.81 hours; p<0.0001). Rescue analgesic requirement was significantly lower in Group Q (p<0.0001). No block-related complications or adverse effects were observed in either group.
Conclusion: Ultrasound-guided quadratus lumborum block provides superior and prolonged postoperative analgesia with reduced rescue analgesic requirements compared to transversus abdominis plane block in patients undergoing elective lower abdominal surgeries.
2. Comparative Study to Evaluate Ease of Nasogastric Tube Insertion in Intubated Patients with Three Different Techniques
Shruti Garg, Deepesh Gupta, Shashi Kumari, Sonu Pandoliya, Devanshu Saraf, Aishwarya Shrivastava
Abstract
Background: Nasogastric tube (NGT) insertion in anaesthetized and intubated patients is often challenging due to altered airway anatomy and decreased muscle tone. Several bedside techniques have been described to facilitate smooth insertion, but evidence directly comparing commonly practiced methods remains limited.
Aim and Objective: To compare the ease of NGT insertion using three techniques—additional neck flexion, standard sniffing position with lateral neck pressure, and reverse Sellick’s manoeuvre—in intubated adult patients undergoing elective surgeries.
Materials and Methods: This prospective, randomized comparative study included 120 adult patients (ASA I–II) undergoing elective surgery under general anaesthesia. Patients were allocated into three groups (n = 40 each): Group A—additional neck flexion, Group B—standard sniffing position with lateral neck pressure, and Group C—reverse Sellick’s manoeuvre. The primary outcomes assessed were number of attempts and time required for successful NGT insertion. Secondary outcomes included hemodynamic changes and complications such as kinking, coiling, and nasal bleeding.
Results: Baseline demographic and clinical characteristics were comparable across all groups. Group A demonstrated the highest first-attempt success rate and the shortest insertion time. Group B showed moderate ease of insertion, while Group C had the lowest first-attempt success and longest insertion time. Complications were least frequent in Group A and most common in Group C. Hemodynamic parameters remained stable in all groups, and no major adverse events occurred.
Conclusion: Additional neck flexion is the most effective technique for NGT insertion in intubated patients, offering superior first-attempt success, shorter insertion time, and fewer complications compared with lateral neck pressure and reverse Sellick’s manoeuvre. Its simplicity and safety make it a preferred method in routine anaesthetic practice.
3. Clinical Assessment between Measurement of Mandibular Condylar Mobility (USG Guided) Versus Maximum Condyle-Tragus Distance in Predicting Difficult Laryngoscopy
Varsha M., Surendra Raikwar, Neelesh Nema, Vignesh Rajan V., Aishwarya Shrivastava, Vighna Rajan R.
Abstract
Background: Prediction of difficult laryngoscopy remains a critical component of preoperative airway evaluation, as unanticipated airway difficulty can lead to severe complications.
Aim and Objective: To compare ultrasound-guided mandibular condylar mobility with traditional airway assessment parameters, namely the inter-incisor gap (IIG), upper lip bite test (ULBT), mandibular protrusion distance, and maximum condyle–tragus distance, in predicting difficult laryngoscopy.
Methods: This prospective observational study included 90 adult patients undergoing elective surgery under general anaesthesia. Preoperative measurements included ultrasound-guided condylar mobility and four clinical airway tests. Laryngoscopy was performed using a standard technique, and Cormack–Lehane (CL) grading was recorded. CL grade III–IV was defined as difficult laryngoscopy. Diagnostic accuracy was analysed using sensitivity, specificity, predictive values, and odds ratios.
Results: The majority of patients [79 (87.8%)] had easy laryngoscopy, and 11 (12.2%) had difficult laryngoscopy. Ultrasound-guided mandibular condylar mobility demonstrated the highest sensitivity (81.8%) and perfect specificity (100%). Maximum condyle–tragus distance and IIG also showed strong diagnostic performance, with sensitivities of 72.7% and 97.5%, respectively, and specificities of 98.7% each. Mandibular protrusion distance and ULBT had perfect specificity (100%) but lower sensitivity (36.4% and 27.3%, respectively). All parameters showed a significant association with difficult laryngoscopy (p < 0.0001).
Conclusion: Ultrasound-guided mandibular condylar mobility is the most accurate single predictor of difficult laryngoscopy, demonstrating superior sensitivity and perfect specificity. However, multivariate analysis showed that no parameter independently predicted difficult laryngoscopy. A combined approach using both ultrasound-based and conventional tests enhances the reliability of airway assessment and improves preparedness for difficult laryngoscopy.
4. Impact of Statin Treatment on Liver Enzyme Levels in Patients with Dyslipidemia
Joshi Abhishek, Modi Vansh Kanaiyalal, Modi Shraddhaben Kanaiyalal
Abstract
Background: Dyslipidemia frequently coexists with non-alcoholic fatty liver disease (NAFLD) and cardiovascular disease, contributing to increased morbidity and mortality. While statins are widely prescribed to manage lipid abnormalities and reduce cardiovascular risk, their effect on liver enzyme levels in dyslipidemic patients with NAFLD remains incompletely understood. Evaluating this effect is critical to ensure both efficacy and safety of statin therapy in this high-risk population.
Methods: This prospective observational study enrolled 146 adult patients aged 18–80 years with dyslipidemia, NAFLD, and cardiovascular disease at a tertiary care hospital over one year. Patients receiving statins (n = 88) were compared with non-statin users (n = 58). Liver enzymes (ALT, AST) and lipid profiles (total cholesterol, LDL-C) were measured at baseline and follow-up. Demographic, clinical, and treatment-related data were also collected.
Results: At baseline, ALT and AST levels were similar between the statin and non-statin groups (43.1 ± 18.0 vs 40.1 ± 15.9 U/L and 38.0 ± 14.9 vs 35.2 ± 13.1 U/L, respectively; p > 0.1). After follow-up, statin therapy significantly reduced ALT (36.0 ± 14.0 U/L; p < 0.01) and AST (31.1 ± 11.7 U/L; p = 0.04), whereas non-statin patients showed minimal change. Total cholesterol decreased from 210.5 ± 30.9 to 180.9 ± 24.6 mg/dL (p < 0.01) and LDL-C from 135.4 ± 28.7 to 104.6 ± 21.0 mg/dL (p < 0.01) in the statin group, with no significant reductions in the non-statin group. Statins were well tolerated, with only minor side effects reported.
Conclusion: Statin therapy significantly improves liver enzyme levels and lipid profiles in dyslipidemic patients with NAFLD and cardiovascular disease. These results support the dual hepatic and cardiovascular benefits of statins in this population.
5. Correlation of Blood Sodium and Potassium Levels with the Extent of Stroke
Fulwani Dhirajkumar Mahendrabhai, Modi Vansh Kanaiyalal, Joshi Abhishek
Abstract
Background: Stroke is a leading cause of disability, often resulting in motor and neurological impairments. Electrolyte disturbances, particularly in sodium, potassium, and calcium, may influence stroke severity and outcomes. This study aimed to evaluate the association between serum electrolyte levels and functional outcomes in ischemic stroke patients.
Methods: A prospective study was conducted over one year at a tertiary care hospital including 168 adult ischemic stroke patients. Stroke severity and motor function were assessed using NIHSS and MAS scores. Serum sodium, potassium, and calcium levels were measured at admission. The primary outcome was death or major disability at 3 months (mRS 3–6).
Results: Patients with death or major disability were older (74.2 vs. 66.5 years) and had higher NIHSS scores (median 6 vs. 3) and lower MAS scores (median 15 vs. 20). Abnormal calcium levels were significantly associated with adverse outcomes (p = 0.01), while sodium and potassium showed no significant correlation (p = 0.12 and p = 0.43, respectively).
Conclusion: Calcium disturbances are linked to worse functional outcomes in ischemic stroke. Monitoring and correcting calcium levels may help improve prognosis.
6. Association of Serum Catecholamine Concentrations with Heart Rate Variability in Patients with Chronic Heart Failure
Amiben Manojbhai Patel, Bhavikaben Jayantilal Maru, Patel Vishvaben Narendrabhai
Abstract
Background: A wide range of factors related to chronic heart failure (CHF) can be assessed to identify higher-risk patients who might be the focus of additional treatment measures. Patients who exhibit symptoms and signs at rest are readily identified by bedside examination. Even with optimal medical care, these patients have an annual mortality rate of more than 40%, although they constitute only a modest share of the overall heart failure population.
Objectives: The study sought to determine the association between HRV parameters and serum catecholamine levels in patients with chronic heart failure, and to assess the usefulness of these measurements in indicating autonomic dysfunction and disease severity.
Materials and Methods: This was a retrospective, observational study conducted at a tertiary care centre. Data covering a one-year period were retrieved for 184 participants. Patients aged 18 years and older with a diagnosis of chronic heart failure confirmed by clinical assessment and echocardiography, and whose medical records contained information on heart rate variability, serum catecholamine levels, and NYHA functional class, were included in the study.
Results: A significant proportion of patients (64.1%) demonstrated reduced left ventricular ejection fraction (<40%). Common associated conditions included hypertension in 55.4%, diabetes mellitus in 41.3%, and ischemic heart disease in 48.4% of patients. Heart rate variability analysis showed reduced autonomic control of the heart. The mean SDNN was 92.6 ms, and RMSSD was 21.4 ms.
Conclusion: This study shows that patients with chronic heart failure have severe autonomic dysfunction, reflected in reduced time-domain and frequency-domain HRV parameters. Elevated serum catecholamine levels, which showed a moderate negative correlation with HRV indices, indicate that increased sympathetic activation is linked to compromised autonomic regulation.
Recommendations: Larger studies are required to validate the predictive utility of HRV and catecholamine monitoring, which can help evaluate autonomic dysfunction in CHF and direct tailored therapy.
7. Association Between Serum Iron Indices and Neurodevelopmental Delay (NDD) in Children
Patel Vishvaben Narendrabhai, Bhavikaben Jayantilal Maru, Amiben Manojbhai Patel
Abstract
Background: Neurodevelopmental disorders such as autism spectrum disorder (ASD), attention deficit hyperactivity disorder (ADHD), and intellectual disability (ID) are commonly associated with nutritional deficiencies, including altered iron status. Iron plays a critical role in brain development, and disturbances in iron metabolism may influence neurodevelopmental outcomes. This study aimed to evaluate differences in serum iron indices among children with different neurodevelopmental disorders.
Methods: This hospital-based observational study included 186 children aged 4–12 years diagnosed with ASD, ADHD, or ID. Neurodevelopmental diagnoses were established using standardised assessment tools. Serum iron, serum ferritin, and serum transferrin levels were measured using standard laboratory methods. Iron parameters were compared across the three diagnostic groups using appropriate statistical analyses, with a p-value <0.05 considered statistically significant.
Results: Among the 186 children enrolled, 68 had ASD, 79 had ADHD, and 39 had ID. Serum ferritin levels showed a statistically significant difference among the three groups (p = 0.003), with higher mean ferritin levels observed in children with ASD and lower levels in children with ADHD and ID. In contrast, no statistically significant differences were observed in serum iron (p = 0.087) or serum transferrin levels (p = 0.156) among the diagnostic groups.
Conclusion: Serum ferritin levels differ significantly among children with ASD, ADHD, and ID, indicating variations in iron storage status across neurodevelopmental disorders. These findings suggest that assessment of serum ferritin may be useful in the clinical evaluation of children with neurodevelopmental disorders, even when serum iron and transferrin levels are within normal limits.
8. Psoriasis and Its Association with Metabolic Syndrome and Cardiovascular Outcomes
Dixit D. Chhatrawala, Riya M. Chaudhari, Vidhi M. Maniya
Abstract
Background: Psoriasis is a chronic immune-mediated inflammatory disease increasingly recognised to be associated with metabolic syndrome and cardiovascular morbidity. Systemic inflammation in psoriasis may contribute to metabolic abnormalities and accelerated atherosclerosis.
Objectives: To evaluate the prevalence of metabolic syndrome and assess subclinical cardiovascular risk markers among patients with psoriasis.
Methods: This retrospective observational study included 190 adult patients with clinically diagnosed psoriasis attending a tertiary care centre. Demographic data, clinical characteristics, metabolic parameters, and cardiovascular risk markers were extracted from medical records. Metabolic syndrome was defined using modified NCEP ATP III criteria. Subclinical cardiovascular disease was assessed using carotid intima-media thickness (CIMT), high-sensitivity C-reactive protein (hs-CRP), ankle-brachial index (ABI), and echocardiographic evaluation.
Results: The mean age of participants was 44.8 ± 11.3 years, with a mean disease duration of 8.6 ± 4.8 years and a mean PASI score of 13.4 ± 5.6. Metabolic syndrome was present in 103 patients (54%). Abdominal obesity was the most common component (78%), followed by elevated triglycerides (63%) and low HDL cholesterol (59%). Increased CIMT (>0.8 mm) was observed in 48% of patients, elevated hs-CRP (>3 mg/L) in 61%, reduced ABI (<0.9) in 12%, and echocardiographic diastolic dysfunction in 18%, indicating a high burden of subclinical cardiovascular disease.
Conclusion: Patients with psoriasis demonstrate a high prevalence of metabolic syndrome, systemic inflammation, and subclinical cardiovascular abnormalities, even at moderate disease severity. These findings support routine cardiometabolic screening and integrated multidisciplinary management to reduce long-term cardiovascular risk in psoriasis patients.
9. Association of Acid–Base Disturbances with Severity and Outcomes in Sepsis
Vidhi M. Maniya, Riya M. Chaudhari, Dixit D. Chhatrawala
Abstract
Background: Complex acid-base and electrolyte abnormalities are prevalent in intensive care units. Although in most cases the acid-base changes are small and self-limited, blood pH can shift rapidly toward either extreme, resulting in serious multi-organ complications.
Objectives: To assess the relationship between acid-base abnormalities and the severity and clinical outcomes of sepsis in patients admitted to the intensive care unit.
Materials and Methods: This was a retrospective, observational study conducted at a tertiary care centre. Data covering a one-year period were retrieved for 162 participants. Adult patients with sepsis or septic shock who were admitted to the intensive care unit (ICU) and had complete clinical, laboratory, and arterial blood gas (ABG) data at admission were included.
Results: The largest subgroup consisted of patients with metabolic acidosis, who also had the highest mean SOFA score (9.6 ± 3.2) and the highest percentage of septic shock (58.8%). In a similar vein, patients with a combined acid-base problem had a significantly higher mean SOFA score (8.9 ± 2.8) and a high rate of septic shock (52.9%).
Conclusion: In this study, acid-base imbalances were closely linked to the severity and outcomes of sepsis. Compared with patients with normal acid-base status, those with metabolic acidosis and mixed acid-base disorders had markedly higher rates of septic shock, higher SOFA scores, longer ICU stays, and greater mortality.
Recommendations: Since patients with metabolic acidosis or mixed acid-base abnormalities are more likely to experience septic shock, organ failure, and death, early evaluation and monitoring of acid-base status should be a crucial component of sepsis care.
10. Accuracy of Clinical and Biochemical Methods for Detection of Ovulation in Infertile Women: A Hospital-Based Observational Study
Achala Rawat, Shubha Pandey, Jyoti
Abstract
Background: Ovulatory dysfunction is one of the most common and potentially treatable causes of female infertility. Accurate identification of ovulation is essential for appropriate infertility evaluation and management. Various clinical and biochemical methods are used to detect ovulation; however, their diagnostic accuracy varies, and a comparative evaluation is required to guide optimal clinical practice.
Aim and Objectives: To assess the accuracy of clinical and biochemical methods for the detection of ovulation in infertile women.
Materials and Methods: This hospital-based retrospective observational cross-sectional study was conducted in the Department of Obstetrics and Gynaecology at Kamala Nehru Memorial Hospital, Prayagraj, over a period of two years from October 2020 to October 2022. A total of 100 infertile women of reproductive age were included. Ovulation was assessed using basal body temperature charting, cervical mucus examination, and mid-luteal serum progesterone estimation. Detection of the urinary luteinizing hormone (LH) surge using a commercial LH kit was considered the reference standard. Diagnostic accuracy parameters, including sensitivity, specificity, positive predictive value, negative predictive value, and overall accuracy, were calculated. Statistical analysis was performed using SPSS version 23.0, and a p-value <0.05 was considered statistically significant.
Results: Out of 100 infertile women, 78 (78.0%) were ovulatory and 22 (22.0%) were anovulatory based on LH surge detection. Serum progesterone estimation showed the highest diagnostic accuracy (84.0%) with high specificity (94.87%), followed by cervical mucus examination with an accuracy of 80.0%. Basal body temperature monitoring demonstrated lower sensitivity and an overall accuracy of 71.0%. Serum prolactin levels were significantly higher in anovulatory women (p<0.05), while other hormonal parameters showed no significant difference between ovulatory and anovulatory groups.
Conclusion: Ovulatory dysfunction contributes significantly to female infertility. Among the evaluated methods, serum progesterone estimation is the most accurate biochemical marker for ovulation detection, while cervical mucus examination serves as a reliable and cost-effective clinical indicator. Basal body temperature monitoring alone is insufficient for accurate ovulation assessment. A combined approach incorporating clinical assessment and biochemical confirmation provides a more reliable strategy for ovulation detection, particularly in resource-limited settings.
11. Efficacy of Toothbrushing Aids versus Interdental Devices in Gingivitis Management: A Randomized Study
Manoj Meena, Akshay Verma
Abstract
Background: Gingivitis remains a prevalent oral health condition affecting a significant proportion of the global population. Effective plaque control through mechanical oral hygiene devices is fundamental to gingivitis prevention and management. However, comparative evidence regarding the efficacy of toothbrushing aids versus interdental cleaning devices remains limited.
Methods: A total of 120 participants diagnosed with moderate gingivitis were randomly allocated to three groups: powered toothbrush group (n=40), interdental brush group (n=40), and dental floss group (n=40). Clinical parameters including Gingival Index (GI), Plaque Index (PI), and Bleeding on Probing (BOP) were assessed at baseline, 4 weeks, 8 weeks, and 12 weeks.
Results: All three groups demonstrated significant improvements in clinical parameters. The powered toothbrush group exhibited the greatest reduction in GI (1.82 ± 0.31 to 0.68 ± 0.22, p<0.001) and PI (2.14 ± 0.28 to 0.72 ± 0.19, p<0.001). The interdental brush group showed superior improvement in interproximal sites (BOP reduction: 78.4% to 22.6%, p<0.001) compared to dental floss (76.2% to 34.8%, p<0.001). Combined use of powered toothbrush with interdental devices yielded optimal outcomes.
Conclusion: Both toothbrushing aids and interdental devices effectively manage gingivitis, with powered toothbrushes demonstrating superior overall plaque removal and interdental brushes showing enhanced efficacy at interproximal sites. A combined approach is recommended for comprehensive gingivitis management.
12. Evaluating the Coexistence of Bronchial Asthma in Bronchiectasis Patients: A Cross-Sectional Study
Dinesh C. Patel, Rajesh B. Makwana, Meet P. Shah
Abstract
Background: Bronchiectasis is a chronic airway disease frequently associated with comorbid conditions that influence clinical outcomes. Bronchial asthma represents an important overlapping airway disorder.
Aim: To determine the coexistence of bronchial asthma in patients with bronchiectasis and to compare the clinico-radiological profile between patients with bronchiectasis alone and those with concomitant bronchial asthma.
Materials and Methods: A cross-sectional observational study was conducted on 80 patients with radiologically confirmed bronchiectasis. Clinical features, radiological findings, and exposure history were compared between patients with bronchiectasis alone and those with coexisting bronchial asthma.
Results: Bronchial asthma was present in a substantial proportion of patients with bronchiectasis and was associated with increased breathlessness, wheezing, atopic manifestations, and specific environmental exposures.
Conclusion: Coexisting bronchial asthma significantly modifies the clinical profile of bronchiectasis, highlighting the need for routine evaluation and tailored management strategies.
13. Effect of 4mg Dexamethasone for Prevention of Post-Operative Nausea and Vomiting in Laparoscopic Surgeries
Dinesh C. Patel, Rajesh B. Makwana, Meet P. Shah
Abstract
Background: Laparoscopy was first introduced as a therapeutic alternative to laparotomy more than a century ago. Since then, laparoscopic surgery has undergone enormous development and expansion, to the point where it is now the standard approach for a wide range of procedures, including cholecystectomy, appendicectomy, gynecologic surgeries, bariatric surgery, hernia repair, and even complex oncologic operations. However, laparoscopic surgeries are associated with a high incidence of postoperative nausea and vomiting (PONV), reported at 40%-80%. A number of drugs have been used for its prevention. Dexamethasone, a glucocorticoid with antiemetic, anti-inflammatory, and analgesic effects, has been shown to reduce the incidence of PONV; however, the optimal dose has not been clearly defined. In this study, we aimed to evaluate the effect of a 4 mg dose of dexamethasone on the incidence of PONV in patients undergoing laparoscopic surgery.
Methods: A double-blind, randomized controlled study was performed on 70 patients scheduled for elective laparoscopic surgeries under general anesthesia to assess the efficacy of a 4 mg dose of dexamethasone in preventing PONV. Patients were randomly assigned to two groups: 4 mg dexamethasone (1 ml) or 1 ml normal saline. The incidence of nausea and vomiting and the need for antiemetics were evaluated during the first 24 postoperative hours.
Results: Patients who received IV dexamethasone 4 mg had a significant reduction in PONV (p<0.01), and the need for rescue antiemetic drugs was also lower in the dexamethasone group than in the normal saline group.
Conclusion: Dexamethasone 4 mg given intravenously before induction of anesthesia effectively controls postoperative nausea and vomiting in laparoscopic surgeries.
14. Magnesium Sulfate: Seeking the Unknown
Rupal Sharma, Anuradha Salvi, Kritika Kaushik, Sakshi Sharma, Asha Verma
Abstract
Introduction: Although mortality rates for preterm infants, including gestational age-specific mortality, have improved dramatically over the last three to four decades, infants born preterm remain vulnerable to many complications, including respiratory distress syndrome, chronic lung disease, intestinal injury, a compromised immune system, cardiovascular disorders, hearing and vision problems, and neurological insult.
Aims and Objectives: To determine the effect of antenatal magnesium sulfate exposure on neonatal APGAR scores at 1 and 5 minutes after birth, on the need for respiratory support, and on the duration of neonatal hospitalization.
Materials and Methods: This prospective interventional study was conducted on 70 women admitted to the labor room of the Department of Obstetrics and Gynaecology, Sawai Mansingh Medical College, Jaipur. Women with singleton pregnancies between 28 and 32 weeks of gestation with expected delivery within 24 hours were included. Participants were allocated to treatment and control groups by coin flip, and the treatment group received magnesium sulphate. Apgar scores at 1 and 5 minutes after birth, duration of hospitalisation, and need for oxygen and CPAP/ventilator support were recorded.
Results: The APGAR scores at 1 and 5 minutes after birth were improved in the MgSO4 group. The mean duration of hospital stay among neonates who received MgSO4 was 9.86 days, compared with 11.06 days in controls. More babies in the control group required CPAP/ventilator support than in the MgSO4 group.
Conclusion: Antenatal MgSO4, when used judiciously, can markedly improve neonatal outcomes. However, thorough evaluation in larger studies is needed before generalized recommendations can be made.
15. A Study of Fetomaternal Outcome in Pregnancies Complicated by Gestational Diabetes Mellitus at a Tertiary Care Hospital in Northeast India
Neha Joshi, Manoj Kumar
Abstract
Background: Gestational diabetes mellitus (GDM) is a common metabolic disorder of pregnancy and is associated with significant adverse maternal and neonatal outcomes. With the rising prevalence of GDM in India and limited region-specific data from Northeast India, evaluating fetomaternal outcomes in this population is essential.
Aim and Objectives: To assess and compare maternal and neonatal outcomes in pregnancies complicated by gestational diabetes mellitus with those in normoglycaemic pregnancies, and to evaluate the association between glycaemic control and pregnancy outcomes.
Materials and Methods: This hospital-based observational comparative study was conducted at a tertiary care hospital in Tezpur, Assam, Northeast India, from August 2021 to July 2024. A total of 120 pregnant women were enrolled, comprising 60 women diagnosed with GDM and 60 normoglycaemic controls. Participants were followed from diagnosis until delivery and the early neonatal period. Maternal demographic characteristics, antenatal complications, mode of delivery, and neonatal outcomes were recorded.
Results: Women with GDM were significantly older and had higher body mass index compared to controls. Gestational hypertension and polyhydramnios were more common in the GDM group. Caesarean section rates were substantially higher among women with GDM. Neonates born to mothers with GDM had significantly higher birth weight, increased incidence of macrosomia, neonatal hypoglycaemia, and higher rates of NICU admission. Poor glycaemic control within the GDM group was significantly associated with increased operative delivery and adverse neonatal outcomes.
Conclusion: Pregnancies complicated by gestational diabetes mellitus are associated with increased risk of adverse fetomaternal outcomes. Effective glycaemic control plays a crucial role in improving maternal and neonatal prognosis. Early screening, timely diagnosis, and appropriate management of GDM are essential to reduce pregnancy-related complications, particularly in high-risk populations.
16. Comparative Clinical Outcomes of Alcohol-Induced and Gallstone-Induced Acute Pancreatitis
Sneha Ninama, Girish N. Pratap, Rahul Agarwal
Abstract
Aim: To compare the clinical outcomes, disease severity patterns, and complications between alcohol-induced and gallstone-induced acute pancreatitis in a tertiary care center.
Materials and Methods: This prospective observational comparative study was conducted over an 18-month period. A total of 152 consecutive patients diagnosed with acute pancreatitis were enrolled and categorized into two groups: alcohol-induced (n=101) and gallstone-induced (n=51) pancreatitis. Patients were evaluated for demographics, clinical presentation, severity assessment using Revised Marshall Score and BISAP criteria, and documented for outcomes including duration of nil per oral (NPO), length of hospital stay (LOS), organ failure, local complications (pancreatic necrosis, pseudocyst, acute necrotic collection), and mortality.
Results: Alcohol-induced pancreatitis demonstrated higher prevalence (66.45%) with predominance in younger males (mean age 37.8±8.2 years, 97% male; p<0.0001). Gallstone-induced pancreatitis was more frequent in older females (mean age 46.5±12.1 years, 84.3% female). No mortality was recorded in either group. Mean NPO duration was comparable (alcohol: 2.49±1.12 days vs gallstone: 2.75±1.02 days; p=0.1656). Length of hospital stay was similar (alcohol: 3.55±1.81 days vs gallstone: 3.41±1.3 days; p=0.617). Alcohol-induced cases demonstrated significantly higher incidence of acute necrotic collection (ANC) at 21.8% versus 3.92% in gallstone group.
Conclusion: Both alcohol-induced and gallstone-induced acute pancreatitis demonstrated favorable short-term clinical outcomes with zero in-hospital mortality when managed with appropriate supportive care and timely interventions. While complication patterns differed between etiologies, with alcohol-induced cases prone to necrosis and gallstone-induced cases predisposed to pseudocyst formation, overall outcome measures remained comparable. Etiology-specific monitoring protocols are recommended to optimize patient management and enable early intervention for anticipated complications based on the causative factor.
17. A Study of Fetomaternal Outcome in Pregnancies Complicated by Gestational Diabetes Mellitus at a Tertiary Care Hospital in Northeast India
Neha Joshi, Manoj Kumar
Abstract
Background: Gestational diabetes mellitus (GDM) is a common metabolic disorder of pregnancy and is associated with significant adverse maternal and neonatal outcomes. With the rising prevalence of GDM in India and limited region-specific data from Northeast India, evaluating fetomaternal outcomes in this population is essential.
Aim and Objectives: To assess and compare maternal and neonatal outcomes in pregnancies complicated by gestational diabetes mellitus with those in normoglycaemic pregnancies, and to evaluate the association between glycaemic control and pregnancy outcomes.
Materials and Methods: This hospital-based observational comparative study was conducted at a tertiary care hospital in Tezpur, Assam, Northeast India, from August 2021 to July 2024. A total of 120 pregnant women were enrolled, comprising 60 women diagnosed with GDM and 60 normoglycaemic controls. Participants were followed from diagnosis until delivery and the early neonatal period. Maternal demographic characteristics, antenatal complications, mode of delivery, and neonatal outcomes were recorded.
Results: Women with GDM were significantly older and had higher body mass index compared to controls. Gestational hypertension and polyhydramnios were more common in the GDM group. Caesarean section rates were substantially higher among women with GDM. Neonates born to mothers with GDM had significantly higher birth weight, increased incidence of macrosomia, neonatal hypoglycaemia, and higher rates of NICU admission. Poor glycaemic control within the GDM group was significantly associated with increased operative delivery and adverse neonatal outcomes.
Conclusion: Pregnancies complicated by gestational diabetes mellitus are associated with increased risk of adverse fetomaternal outcomes. Effective glycaemic control plays a crucial role in improving maternal and neonatal prognosis. Early screening, timely diagnosis, and appropriate management of GDM are essential to reduce pregnancy-related complications, particularly in high-risk populations.
18. A Quasi-Experimental Study to Assess the Effect of Structured Nutritional Education Program on the Dietary Practices of Middle School Children of Private Schools in Urban Chennai
Hemamalini B., Santha Sheela Kumari K., Sameeya Furmeen S., Seenivasan P.
Abstract
Background: Dietary practices formed in childhood strongly influence long-term health. During adolescence, parental supervision declines and peer influence rises, resulting in unhealthy habits such as skipping meals, television viewing during meals, eating out, and increased junk food intake. Early nutritional education can positively shape dietary habits.
Objectives: 1. To assess the dietary practices of middle school children. 2. To evaluate the effect of a structured nutritional education program on their dietary practices.
Methods: A quasi-experimental study was conducted among 90 middle school children aged 11–14 years from three private schools in North Chennai (January 2019–November 2020). Group A (n=30) received structured nutritional education with periodic reinforcement; Group B (n=30) received a one-time intervention; Group C (n=30) served as control. Educational tools included trifold brochures, food plate models, display boards, painted pots, stadiometers, and digital weighing machines.
Results: The mean dietary practices scores improved significantly across all groups (p < 0.001), with the highest gain in Group A (6.73 ± 3.48), followed by moderate improvement in Group B (2.90 ± 0.49) and a smaller improvement in Group C (2.06 ± 2.03). In Group A, breakfast intake increased from 66.7% to 93.3%, television viewing during meals reduced from 83.3% to 53.3%, family meal participation rose from 66.7% to 76.7%, and hotel food consumption declined in 80% of children. Balanced diet adherence improved from 10% to 43.3%, while healthy dietary practices showed a remarkable rise from 40% to 96.7%, highlighting the superior effectiveness of structured nutritional education in Group A.
Conclusion: Structured nutritional education with periodic reinforcement significantly improved knowledge and dietary practices among middle school children compared with one-time interventions. Continuous engagement of children and parents is crucial for fostering healthy eating behaviours and nurturing healthier adolescents.
19. Effectiveness of an Educational Intervention on Parental Knowledge in Management of Children with Beta Thalassemia
Nilamadhaba Panda, Jyoti Ranjan Behera, Snigdha Rani Panigrahy, Sadhana Panda, Bharata Chandra Choudhury, Narendra Behera
Abstract
Introduction: This study evaluated the impact of an educational intervention on the knowledge, attitude, and practice (KAP) of parents of children with beta thalassemia. The study involved 147 participants, providing a comprehensive overview of how targeted educational efforts can enhance understanding and management of the disease.
Material and Method: A hospital-based prospective quasi-experimental pre–post test design was used to evaluate the effectiveness of an educational intervention. The study was conducted in the Department of Paediatrics, MKCG Medical College and Hospital, Berhampur, from October 2022 to September 2024. The study population comprised parents or caregivers of children with beta thalassemia aged 6 months to 14 years who presented to MKCG Medical College and Hospital for regular blood transfusion.
Result: A questionnaire covering more aspects of KAP regarding thalassemia would strengthen the study. Scoring or grading of KAP responses could shed more light on the performance of the study. Long-term follow-up would yield better results in assessing the KAP of parents of children with thalassemia. A smaller sample size introduces more bias, while a larger sample would give more reliable results in KAP assessment regarding thalassemia.
Conclusion: The study observed significant improvement in the knowledge of parents and caregivers across different aspects of thalassemia, including general awareness, symptoms, disease transmission, screening and diagnosis, and healthy living and treatment. Overall, the study emphasizes the value of education as a tool for enhancing health literacy and promoting proactive health behaviours among parents of children with beta thalassemia. By continuing to invest in and develop such educational initiatives, we can significantly improve the quality of life for affected families and contribute to better disease management and prevention strategies.
20. Knowledge, Attitude and Practice of Medical Undergraduate Students towards Over the Counter (OTC) Drugs Usage: A Cross-Sectional Study
Devasish Panda, Bikas Ranjan Mohanty, Baijayanti Rath, Om Gopal Mishra, Dev Shivam Mishra, Sandeep Yadav, Surendra
Abstract
Introduction: The World Health Organization defines over-the-counter (OTC) drugs as medications that can be bought without a prescription. They are generally used to treat common, self-limiting symptoms such as fever, cough, cold, headache, and toothache symptomatically, and are not a substitute for prescription drugs. However, their easy availability and a lack of proper knowledge about their adverse effects can lead to irrational use. Although OTC medicines allow greater access to treatment for minor or self-limiting illnesses at lower cost and give general practitioners (GPs) more time to deal with serious health problems, OTC medication carries risks, including increased drug resistance, increased cost to patients, failure to follow label instructions, increased risk of drug–drug interactions, and potential for misuse and abuse. The prevalence of over-the-counter drug use worldwide varies between 32.5% and 81.5%, while the corresponding figure for India is 53.57%.
Objectives: The study was conducted to assess the knowledge, attitude, and practice of MBBS undergraduate students regarding over-the-counter (OTC) medicine, along with the prevalence of OTC drug usage among undergraduate medical students of BBMCH.
Methodology: This was a cross-sectional observational study conducted in a tertiary care teaching hospital among 405 undergraduate medical students. A questionnaire consisting of questions about knowledge, attitude, and practice toward OTC drugs was framed. After educational activities, the same questionnaire on knowledge and attitude aspects was shared with the participants as a Google Form, and their responses were collected and analyzed.
Results: Of the 405 MBBS undergraduate students who participated in the study, 87 (21.5%), 83 (20.5%), 83 (20.5%), 90 (22.2%), and 62 (15.3%) belonged to the 1st, 2nd, 3rd, 4th, and 5th year of MBBS, respectively. A total of 246 (60.7%) students did not have any relatives from a medical background. Around 45 (11.1%) students had first-degree relatives (parents) from a medical background, while 31 (7.7%) and 45 (11.1%) students had second-degree (brothers, sisters, grandparents) and third-degree relatives (uncle, aunt, nephew, niece) from a medical background, respectively. Of the 405 students, only 250 (61.7%) had previously heard the term "over the counter" (OTC) medicine, and only 282 (69.6%) correctly answered that medicine that can be purchased without a prescription is called OTC medicine. A total of 153 (37.8%) students were of the opinion that the availability of over-the-counter medicine was beneficial to the general public, and 364 (89.9%) participants were of the opinion that consumption of OTC medicine contributed to antimicrobial resistance. In all, 282 (69.6%) had purchased OTC medicines at least once during the last 3 months; 245 (60.5%) students had purchased medicine for self-consumption, while 122 (30.1%) and 83 (20.5%) students purchased it for family members and friends, respectively. OTC medicines were purchased most commonly for fever (51.6%), followed by common cold/cough (40.7%), acidity/gastritis (38%), headache and myalgia (29.6%), loose motion (26.2%), and allergy (16.0%).
Conclusions: This study highlights the high prevalence of self-medication with OTC drugs among medical students. While many have basic knowledge, significant gaps remain regarding drug safety, regulations, and potential risks. The casual attitude toward OTC drug misuse—such as exceeding doses or ignoring expiry dates—is concerning. Raising awareness among medical students is crucial, as they will serve as future healthcare providers and influence public health behaviors.
21. Comparative Evaluation of Pudendal versus Dorsal Penile Nerve Block for Analgesia in Pediatric Circumcision: A Randomized Controlled Study
Pranchil Pandey, Brijesh Tiwari
Abstract
Introduction: Topical analgesics, caudal block, and ring block of the penis are examples of regional anesthesia techniques that have been used during circumcision, with varying degrees of effectiveness. Caudal block has been linked to transient motor block. Earlier investigations have shown the dorsal penile nerve block to be a successful anesthetic technique with extended postoperative analgesia; however, a failure rate of 4–6.7% has been documented. Against this background, we conducted a prospective randomized controlled clinical trial to compare the analgesic and anesthetic efficacy of bilateral nerve stimulator-guided pudendal nerve block with that of dorsal nerve block for perioperative and postoperative analgesia in children undergoing circumcision.
Methods: A prospective, single-blinded, randomized investigation was carried out from March 2020 to February 2025 with the approval of the institutional review board and the signed consent of the parents. Fifty ASA I male children between the ages of 3 and 5 years who were scheduled for elective circumcision were included in the study. Exclusion criteria included pre-existing coagulopathy, infection at the injection site, and known allergy to local anesthetics. One group received a pudendal nerve block, while the other group received a dorsal nerve block. Pain ratings were recorded at 0, 6, and 12 hours on the first day, and once per day for the next 5 days, using the Objective Pain Scale as modified by Hannallah et al.
Result: Age, hemodynamic stability, and duration of surgery were comparable across the two groups (Table 1). In the pudendal nerve block group, every patient underwent circumcision as planned without requiring any analgesics. In the dorsal nerve block group, 3 patients (12%) experienced an incomplete block necessitating further local infiltration, and one patient (4%) had total block failure and underwent general anesthesia (Table 2). In conclusion, compared with the dorsal nerve block, the guided pudendal nerve block has proved more precise and successful for circumcision in children.
22. A Comparative Study of Sociodemographic Correlates and Quality of Life of Caregiver of Patients of Schizophrenia and Bipolar Affective Disorder
Chakit Sharma, Gaurav Kumar, Amit Kumar Jangir, Alok Tyagi
Abstract
Background: Mental health disorders, such as schizophrenia and bipolar affective disorder (BPAD), impose significant burdens on patients and their caregivers. Schizophrenia is characterized by psychotic symptoms and cognitive decline, while BPAD involves episodic mood fluctuations. Both conditions require long-term caregiving, often leading to emotional, physical, and financial strain on family members. Despite the critical role of caregivers, their quality of life (QoL) remains understudied. This study aimed to compare the sociodemographic profiles of patients with schizophrenia and BPAD and assess the QoL of their caregivers.
Materials & Methods: A cross-sectional study was conducted over 16 months at a tertiary care psychiatric centre, involving 120 participants (60 schizophrenia and 60 BPAD patients) and their primary caregivers. Caregivers were assessed using the WHOQOL-BREF questionnaire, while patient symptom severity was measured using PANSS (schizophrenia), YMRS, and HAM-D (BPAD). Statistical analysis included chi-square tests, independent t-tests, and correlation analyses.
Results: Sociodemographic analysis showed no significant differences between schizophrenia and BPAD patients except for illness duration (p=0.002), with BPAD patients having longer illness durations. Caregiver QoL did not differ significantly between groups across physical, psychological, social, and environmental domains. However, illness duration negatively correlated with psychological QoL in both groups (schizophrenia: r=-0.226, p=0.013; BPAD: r=-0.220, p=0.018). Negative symptoms in schizophrenia (PANSS-N) were linked to poorer environmental QoL (r=-0.265, p=0.04), while depressive symptoms in BPAD (HAM-D) correlated with worse psychological QoL (r=-0.360, p=0.027).
Conclusion: Caregivers of schizophrenia and BPAD patients experience similar QoL challenges, though symptom-specific burdens exist. Longer illness duration worsens psychological well-being, highlighting the need for targeted caregiver support programs. Interventions should address chronicity, symptom management, and psychosocial support to improve caregiver resilience and mental health outcomes.
23. Assessment of Awareness and Prevalence of Allergic Rhinitis in North Bihar, India: A Cross-Sectional Study
Sujeet Kumar, Novelesh Bachchan, Shashi Kumar, Pawan Kumar Lal, Pankaj Patel
Abstract
Background: Allergic rhinitis (AR) is an IgE (immunoglobulin E)-mediated immunological response of the nasal mucosa characterized by watery nasal discharge, nasal obstruction, sneezing, and itching in the nose. AR is a hazardous disease whose prevalence is rising at a very fast rate. Increased knowledge of AR and its complications is expected to improve the outcome of the disease.
Objective: This cross-sectional study was planned in north Bihar to assess awareness and diagnosis of AR, and included 600 participants.
Method: Demographic data and knowledge of participants on various aspects of AR were collected using a well-prepared questionnaire administered to patients of the ENT Department from January 2025 to December 2025. Nasal cytology was taken from the inferior turbinate of selected patients.
Results: Diagnosis was based on symptoms and nasal eosinophilia on cytology of nasal smears. It was found that 39.83% of participants had knowledge of AR. Only 10.67% knew that allergic rhinitis was caused by insufficient antihistamines. 73.67% of respondents did not know any allergic rhinitis symptoms, 61% did not know how the disease can be prevented, and 88.33% did not have any idea about the complications of allergic rhinitis. This study indicated that awareness of allergic rhinitis was very poor, especially in subjects with low education.
Conclusion: The study concluded that there is an urgent requirement of different strategies like Allergic Rhinitis health campaigns, issuing pamphlets of information about AR, public speaking sessions, etc. to spread awareness among the general population.
24. Impact of Various Regional Anesthesia Techniques on Perioperative Outcomes in Holmium Laser Enucleation of the Prostate: A Randomized Study
Pranchil Pandey, Brijesh Tiwari
Abstract
Introduction: HoLEP (Holmium Laser Enucleation of Prostate) surgery has lower morbidity due to a lower transfusion rate and lesser risk of dilutional hyponatremia; nevertheless, the disadvantages are attributed to a longer procedure time and a steep learning curve. The anesthetic aspects of HoLEP have not yet been fully established. The majority of published studies concentrated on HoLEP’s urological features. The aim of the current study was to assess different regional anesthesia techniques for HoLEP surgery and to identify the most effective regional anesthesia technique.
Material and Methods: Following approval from Institutional Ethics Committee (IES-SSMC-0145), a prospective, randomized, comparative study was conducted. The study included 45 patients who were scheduled for HoLEP. Patients who had severe systemic infections or local infections at the injection site, coagulopathy, serious disorders of the central nervous system or peripheral nerves, and history of allergies to local anesthetics, were excluded from the study. Patients (n=15) were randomly assigned to one of three groups (epidural block, spinal block, or saddle block) using the sealed envelope method.
Result: Time to T10 dermatome block (P=.024) and time to maximal sensory level block (P=.003) differed statistically between groups A and B. The maximal sensory block level was comparable between groups B and C but was higher in group A than in group C. The time to two-segment sensory regression differed significantly between groups and was significantly longer in group A than in group C (P=.007).
Conclusion: We conclude that saddle block provides a quicker onset, a more effective sensory block, and faster recovery in HoLEP surgery.
25. Assessment of Perception of Medical Students Regarding Competency Based Medical Education [CBME]: A Cross-Sectional Study
Manjari Kishore, Jitender Pratap Singh, Pooja Jain, Mritunjay Kaushik, Aditi Suri
Abstract
The current research evaluates the effectiveness and perceptions of Competency-Based Medical Education (CBME) among 400 medical students. The survey explored CBME’s focus on medical competencies, clarity of learning objectives, adequacy of feedback, engagement in self-directed learning, and the value of clinical-oriented practical experiences. Results indicate that CBME encourages a focused approach to learning competencies, offers clearer objectives, and improves clinical readiness. Participants reported receiving adequate feedback and viewed self-directed learning positively within their medical education. Clinical-oriented practical experiences were highly valued, enhancing motivation and preparedness for future practice. Areas for improvement include the foundation course for CBME and enhancing clinical practical experiences. Challenges in implementing CBME involve adapting teaching methods, ensuring resources, and aligning assessments with competency outcomes. The study concludes that CBME shows promise for preparing future medical professionals, but ongoing adjustments are needed to address challenges and optimize its implementation.
26. Effect of Preoperative Glycemic Management on Surgical Site Infections among Diabetic Patients
Mohit Vajera, Afrin Khan
Abstract
Background and Aim: Surgical site infections (SSI) are a major postoperative complication in diabetic patients, with hyperglycemia recognized as a significant risk factor. This study aimed to evaluate the impact of preoperative glycemic control on postoperative wound infections in diabetic patients.
Methods: A hospital-based prospective study was conducted over one year in a tertiary care hospital, enrolling 184 diabetic patients undergoing general surgery. Preoperative fasting blood glucose and HbA1c levels were recorded, and patients were followed postoperatively for wound infections and other complications. Patients were categorized based on glycemic control as good, fair, or poor. Data were analyzed to assess the association between glycemic status and postoperative outcomes.
Results: The study included predominantly female patients (67.4%) with the most common age group being 51–60 years (26.1%). Overall, 50% of patients developed SSI, with the highest incidence in the poor glycemic control group (67.6%). Postoperative blood glucose levels on days 1, 3, and 7 were significantly higher in patients with SSI (p < 0.05). Other complications, including delayed wound healing and urinary tract infections, were also more frequent in patients with suboptimal glycemic control.
Conclusion: Poor preoperative glycemic control is associated with higher rates of surgical site infections and postoperative complications in diabetic patients. Optimizing blood glucose before surgery may reduce morbidity and improve outcomes.
27. Retrospective Study of Antidiabetic Medication Use in Type 2 Diabetes Patients at a Tertiary Healthcare Center
Visarg Patel, Jaimin Mohanbhai Desai, Neelkumar Girishkumar Patel
Abstract
Background: Type 2 diabetes mellitus is a growing public health challenge requiring long-term pharmacotherapy, and evaluating prescribing patterns helps assess the rational use of antidiabetic drugs. This study aimed to analyze the prescription pattern of antidiabetic medications among patients with type 2 diabetes mellitus attending a tertiary care hospital.
Methods: A hospital-based cross-sectional study was conducted over one year at a tertiary care hospital. A total of 166 patients with type 2 diabetes mellitus attending the diabetic outpatient clinic were included. Data on demographic characteristics, comorbidities, and prescribed antidiabetic drugs were collected from prescriptions and medical records. The data were analyzed using descriptive statistics with SPSS software.
Results: Most patients were aged 51–60 years (38.0%) and had a diabetes duration of 1–5 years (46.4%). Hypertension was the most common comorbidity (49.4%). The mean number of antidiabetic drugs per prescription was 2.9, with oral agents alone prescribed in 66.9% of patients. Metformin was the most commonly prescribed oral drug, while lispro mix insulin was the predominant injectable, and combination therapy was frequently used.
Conclusion: The study demonstrates a preference for metformin-based combination therapy in the management of type 2 diabetes mellitus, reflecting contemporary and rational prescribing practices.
28. Hemodynamic Changes After Spinal Anesthesia in Cesarean Section: A Prospective Observational Study
Jaydipkumar Manubhai Chauhan, Rahulkumar Jagdhishbhai Taral, Meetkumar Rameshbhai Moradiya
Abstract
Background: Spinal anesthesia is the preferred anesthetic technique for cesarean section because of its rapid onset, effective sensory blockade, and minimal fetal drug exposure. However, post-spinal hypotension remains the most common complication and may adversely affect both maternal comfort and uteroplacental perfusion.
Aim: To determine the incidence of post-spinal hypotension and identify associated risk factors in patients undergoing cesarean section.
Methodology: This prospective observational study was conducted on 90 patients undergoing elective or emergency cesarean section under spinal anesthesia. Maternal demographics, obstetric variables, baseline hemodynamic parameters, sensory block level, intraoperative management, and neonatal outcomes were recorded. Hypotension was defined as a fall in systolic blood pressure ≥20% from baseline or an absolute systolic blood pressure <90 mmHg. Statistical analysis was performed to identify factors associated with post-spinal hypotension.
Results: Post-spinal hypotension occurred in 64.4% of patients, most commonly within the first 10 minutes following spinal anesthesia. Higher body mass index, lower baseline systolic blood pressure, higher sensory block level (≥T4), primigravida status, and emergency cesarean section were significantly associated with hypotension. Vasopressor support was required in the majority of affected patients.
Conclusion: Post-spinal hypotension remains a frequent and clinically significant complication during cesarean section under spinal anesthesia. Early identification of high-risk patients and timely preventive strategies are essential to improve maternal and neonatal outcomes.
29. Comparison of HbA1c Levels in Diabetic Patients with and without Retinopathy
Narendra Singh, Pankaj Tyagi, Yashika Sinha, Prachi Shukla
Abstract
Background: Diabetic retinopathy is a common microvascular complication of type 2 diabetes mellitus and a leading cause of preventable blindness. Glycated haemoglobin (HbA1c) reflects long-term glycaemic control and is strongly associated with the risk of retinopathy. This study aimed to compare HbA1c levels in diabetic patients with and without retinopathy.
Methods: A hospital-based cross-sectional comparative study was conducted at Muzaffarnagar Medical College and Hospital over one year. A total of 180 patients with type 2 diabetes mellitus were enrolled and divided into two groups: 90 patients with diabetic retinopathy (DR group) and 90 without retinopathy (Non-DR group). Clinical evaluation, fundoscopic examination, and HbA1c estimation by high-performance liquid chromatography (HPLC) were performed. Data were analysed using SPSS v21; mean HbA1c levels were compared using the Student’s t-test, with p < 0.05 considered significant.
Results: The mean HbA1c level was significantly higher in the DR group (9.1 ± 1.4%) compared to the Non-DR group (7.2 ± 1.1%; p < 0.001). A higher proportion of patients with diabetic retinopathy had HbA1c ≥9%. Longer duration of diabetes and older age were also associated with retinopathy.
Conclusion: Poor glycaemic control, reflected by elevated HbA1c, is strongly associated with diabetic retinopathy. Regular monitoring of HbA1c and timely ophthalmological screening are essential to prevent vision-threatening complications in patients with type 2 diabetes mellitus.
30. Association of Vitamin D Status with Disease Severity in Infants Hospitalized with Bronchiolitis
Kavita Meena, Jitendra Kumar Chholak, Yogesh Yadav
Abstract
Background: Vitamin D has immunomodulatory properties and may influence the clinical course of lower respiratory tract infections in infants; however, its association with bronchiolitis severity remains inconsistent.
Objectives: To evaluate the association between serum vitamin D status at hospitalization and disease severity among infants admitted with bronchiolitis.
Methods: In this prospective observational study conducted from January to December 2024 at SMS Medical College, Jaipur, infants aged <12 months hospitalized with bronchiolitis were enrolled. Serum total 25-hydroxyvitamin D [25(OH)D], albumin, and vitamin D–binding protein were measured within 24 hours of admission. Free and bioavailable 25(OH)D concentrations were calculated. Disease severity was assessed by intensive care unit (ICU) admission, need for continuous positive airway pressure (CPAP) or mechanical ventilation, and length of hospital stay. Statistical analyses were performed using SPSS version 25.
Results: A total of 403 infants were included (mean age, 5.8 ± 3.1 months). Vitamin D deficiency and insufficiency were present in 38.5% and 52.9% of infants, respectively. ICU admission was required in 24.3%, CPAP in 19.6%, and mechanical ventilation in 10.4%. Vitamin D status was not significantly associated with ICU admission or the requirement for CPAP. However, lower serum 25(OH)D levels were significantly associated with the need for mechanical ventilation (p = 0.035). Total and free 25(OH)D concentrations demonstrated weak but significant negative correlations with duration of hospitalization (p = 0.004 and p = 0.026, respectively).
Conclusions: Hypovitaminosis D is highly prevalent among Indian infants hospitalized with bronchiolitis. While vitamin D status does not predict ICU admission or CPAP requirement, lower vitamin D levels are associated with prolonged hospitalization and increased need for mechanical ventilation.
31. Strengthening Medico-Legal Evidence and Administrative Accountability in Rajasthan: The Role of MedLEaPR
Dipender Singh, Yashika Saini, Anupam Johry, Surya Bhan Kushwaha
Abstract
The Medico-Legal Examination and Post-Mortem Reporting System (MedLEaPR) represents a major digital transformation in the medico-legal infrastructure of Rajasthan following the enforcement of the new criminal codes: the Bharatiya Sakshya Adhiniyam (BSA), the Bharatiya Nagarik Suraksha Sanhita (BNSS), and the Bharatiya Nyaya Sanhita, 2023. Historically, medico-legal documentation in the state relied on paper-based reports that were often handwritten, non-standardized, and vulnerable to loss, manipulation, and chain-of-custody breaches. These limitations frequently resulted in procedural delays and impaired judicial efficiency. MedLEaPR, developed by the National Informatics Centre (NIC), provides a secure, centralized, and standardized digital platform for generating, authenticating, and transmitting medico-legal case reports (MLCs) and post-mortem reports (PMRs). Its technical architecture incorporates digital signatures, structured templates, graphical tools, and real-time integration with police systems through the Crime and Criminal Tracking Network & Systems (CCTNS) and the Inter-operable Criminal Justice System (ICJS). The Government of Rajasthan mandated daily uploading of all MLCs and PMRs from May 2025, ensuring statewide compliance and enhancing accountability. Early outcomes indicate improved evidence integrity, reduced documentation errors, faster interdepartmental communication, and greater transparency in the medico-legal workflow. While infrastructural limitations and training needs persist, MedLEaPR establishes a foundational digital framework critical for timely, reliable, and legally defensible medico-legal evidence under India's reformed criminal justice system.
32. Intravenous Magnesium Sulphate versus Oral Nifedipine for Tocolysis: Maternal and Neonatal Outcomes
Paaka Madhurima, Kavitha Dharavath, Vemula Sravanthi
Abstract
Background: Preterm labour is a major cause of neonatal morbidity and mortality. Tocolytic therapy aims to delay delivery to allow corticosteroid administration and improve neonatal outcomes. This study compared the efficacy and safety of magnesium sulphate and nifedipine in managing preterm labour.
Methods: A prospective observational study was conducted at Government Maternity Hospital, Hanumakonda, from July 2023 to December 2024. A total of 100 women with preterm labour were enrolled, with 50 receiving intravenous magnesium sulphate and 50 receiving oral nifedipine. Baseline characteristics, tocolytic efficacy, maternal adverse effects, and neonatal outcomes were systematically recorded and analysed.
Results: Baseline demographics were comparable between groups. Nifedipine achieved a significantly greater mean delay in delivery (6.1 ± 3.2 days) compared with magnesium sulphate (4.6 ± 2.4 days). Prolongation of pregnancy beyond 48 hours and 7 days was higher in the nifedipine group. Maternal adverse effects were mild; nifedipine produced more headache and flushing, while magnesium sulphate showed occasional hypotension and reduced reflexes. Neonatal outcomes, including birth weight, APGAR scores, and NICU admissions, were similar between groups.
Conclusion: Nifedipine demonstrated superior tocolytic efficacy with good maternal tolerability, making it a preferable first-line agent for preterm labour.
33. From Speculum to Scope: Advancing Sinonasal Disease
Shubhangi Singh, Shiv Shanker Kaushik, Richa Gupta
Abstract
Background: Sinonasal diseases are common in otorhinolaryngology, ranging from inflammation to complex infections and neoplasms. Anterior rhinoscopy is often used for initial evaluation but is limited to the anterior nasal structures, thus missing deeper pathologies. Nasal endoscopy offers a more comprehensive view, allowing detailed visualization of both the anterior and posterior nasal cavities. Despite its superior diagnostic capabilities, nasal endoscopy remains underused owing to factors such as cost and expertise.
Objective: To compare the merits and demerits of nasal endoscopy versus anterior rhinoscopy in the diagnosis of sinonasal disease.
Methods: A prospective observational study was conducted on 110 patients presenting to the Department of ENT, PMCH, Udaipur, from April 2024 to March 2025 with symptoms of sinonasal disease. Each patient underwent both anterior rhinoscopy and diagnostic nasal endoscopy.
Results: Nasal endoscopy proved significantly better than anterior rhinoscopy for detecting sinonasal abnormalities, identifying conditions such as concha bullosa (40.90% vs. 2.72%) and ethmoidal polyps (19.09% vs. 9.09%). It also provided a more detailed assessment of regions such as the sphenoethmoidal recess, superior turbinates, and nasopharynx, which were not accessible with anterior rhinoscopy.
Conclusion: The study highlights the critical role of nasal endoscopy in accurately diagnosing sinonasal disease, particularly in chronic or refractory cases where subtle or posterior pathology may be present. Despite being more resource-intensive, nasal endoscopy should be integrated into routine clinical practice for comprehensive evaluation, as it enables precise diagnosis, better treatment planning, and improved patient outcomes.
34. Predictors of Outcomes of Neonatal Acute Kidney Injury in Tertiary Care Hospital
Akash Parashar, Sunita Khandelwal, Anjali Singh, Jai Singh
Abstract
Background: Acute kidney injury (AKI) is a common clinical syndrome in hospitalized children and imposes a heavy burden of mortality and morbidity. It is defined as an acute, reversible rise in serum creatinine (SCr) levels accompanied by a reduction in urine output (oliguria or anuria).
Objective: To study the etiology, clinical profile, and outcome of acute kidney injury (AKI) in neonates admitted to the NICU of JK Lon Hospital, Kota.
Materials and Methods: A prospective cross-sectional study was conducted in the NICU of JK Lon Hospital with 255 neonates. Neonates (≤28 days) with acute kidney injury according to AKI criteria were included.
Results: Among 255 neonates, mortality was 16.9%. Low birth weight, sex, gestational age, and mode of delivery showed no significant association with outcome. Sepsis was the most common etiology, while asphyxia and higher HIE grades, especially grade 3, were strongly linked to mortality. Significant predictors of death included metabolic acidosis, elevated urea and creatinine levels, and AKI stage 3. Most cases occurred in summer, but deaths were more common during monsoon. Overall, severe metabolic and renal abnormalities were key determinants of poor outcome.
Conclusion: Severity of illness, hypoxic injury, metabolic acidosis, and advanced AKI stage are the primary determinants of mortality in neonatal AKI, rather than demographic factors.
35. A Study on Thyroid Profile in Chronic Kidney Disease
S. Deepika, S. Sujatha, M. Priyanka
Abstract
Background: Chronic kidney disease (CKD) is associated with significant alterations in thyroid hormone metabolism. Understanding these changes is crucial for comprehensive patient management. The aim of this study was to evaluate thyroid function abnormalities in patients with chronic kidney disease and to correlate these changes with the severity of renal impairment.
Methods: A cross-sectional observational study was conducted on 100 CKD patients and 50 age and sex-matched healthy controls. Thyroid function tests including T3, T4, TSH, FT3, and FT4 were measured. Patients were categorized according to CKD stages based on estimated glomerular filtration rate (eGFR). Statistical analysis was performed using appropriate tests.
Results: The mean age of CKD patients was 52.3±12.4 years with male predominance (64%). Significantly lower levels of T3 and FT3 were observed in CKD patients compared to controls (p < 0.001). T4, FT4, and TSH levels showed no significant difference. The prevalence of low T3 syndrome increased with advancing CKD stages, being present in 78% of stage 5 CKD patients. A significant negative correlation was found between serum creatinine and T3 levels (r = -0.542, p < 0.001).
Conclusion: Thyroid dysfunction, particularly low T3 syndrome, is highly prevalent in CKD patients and correlates with disease severity. Regular thyroid function monitoring should be considered in CKD management.
36. A Study on Menstrual Hygiene and Its Association with Perceived Reproductive Morbidity in Adolescent Girls of Slum Region
Sudiksha Rana, Sumit Kumar Singh, Himanshu Mamgain, Anupama Arya, Shalini Rawat, Shivani Dhyani
Abstract
Objectives: To evaluate the various factors of menstrual hygiene and to assess reproductive morbidities among adolescent girls in a slum area of Dehradun, Uttarakhand, India.
Methods: Data were collected by a house‑to‑house survey in the community, and girls were interviewed using a predesigned questionnaire covering sociodemographic details, knowledge about menstruation, menstrual patterns and practices, hygiene followed, and associated morbidities such as dysmenorrhoea, genital tract infections, urinary tract infections, and bad odour.
Results: Of 250 adolescent girls, most were in the 14–16 years age group. The mean age at menarche was 12.7 years. Most mothers were illiterate (136, 54.4%) and belonged to the lower socioeconomic strata (170, 68%). Non-disposable linen was used by 55.2% of girls; 67.6% used 2–3 pads per day, and 78.8% reused pads. Of the 250 girls, 188 (75.2%) had reproductive morbidities. The most common morbidities were dysmenorrhoea (84, 33.6%), menstrual irregularities (55, 22%), itching in the genitalia (18, 7.2%), and burning micturition (13, 5.2%). Only 35.2% of girls had availed health care services.
Conclusions: Reproductive morbidities are common among adolescent girls of slum regions, with dysmenorrhoea and menstrual irregularities being the most frequent. Maternal illiteracy, lower socioeconomic status, and lack of awareness about menstruation and its hygiene are the most common factors underlying reproductive morbidity in adolescent girls. Regular health check-up camps should therefore be organised in slum areas to diagnose and treat reproductive morbidities in adolescent girls, and to educate mothers and girls about menstrual hygiene and the prevention of morbidities.
37. Electrolyte Disturbances and Cardiac Complications in Post Operative Patients
Priyambada Patra, Kinjal Rameshbhai Balva, Drashti Kamleshbhai Patel
Abstract
Background: Normal cardiac function is contingent on electrolyte balance. Post-surgical changes, including surgical stress, fluid shifts, blood loss, anesthetic agents, and alterations in renal function, make the post-operative period highly vulnerable to electrolyte disturbances. Such disturbances may grossly affect cardiac electrophysiology, resulting in arrhythmias and other cardiac complications. Early recognition and correction of electrolyte disturbances are therefore imperative to reduce morbidity and mortality during this period.
Objectives: The study aimed to determine the prevalence of electrolyte disturbances in post-operative patients and their relationship with cardiac complications during the early post-operative period.
Materials and Methods: A total of 176 patients were enrolled in this prospective observational study conducted over a period of one year in a tertiary care hospital. A consecutive sampling technique was used. Serum electrolytes, namely sodium, potassium, calcium, and magnesium, were measured within 72 hours following surgery. Cardiac complications were recorded based on clinical assessment with the aid of electrocardiography. Data were analyzed using descriptive statistics, and the association of electrolyte disturbances with cardiac complications was assessed using the Chi-square test, with a p-value <0.05 considered statistically significant.
Results: Electrolyte imbalance was a common finding in post-operative patients, and the most common imbalance was hyponatremia and hypokalemia. Cardiac complications, especially arrhythmias, occurred relatively often in patients who had electrolyte imbalance. The odds ratio for cardiac complications was highest for hypokalemia, followed by hyponatremia and then hypocalcemia. Cardiac complications occurred significantly less often in patients who had normal electrolyte values.
Conclusion: Electrolyte imbalance is common in the postoperative period and is independently associated with cardiac complications. It must be closely monitored and corrected in order to prevent cardiac morbidity.
38. In silico Evaluation of Promising Epigenetic Biomarkers for the Detection of Colon Adenocarcinoma
Payal Kulhari, Suman Kumar Ray, Ram Rattan Negi
Abstract
Introduction: Colon adenocarcinoma (COAD) is a common and fatal cancer worldwide, with a high death rate in India as a result of late diagnosis. Conventional screening techniques, such as stool tests and colonoscopy, are expensive, invasive, and often insensitive in identifying early-stage disease. The development of reliable, non-invasive biomarkers is therefore crucial to improving early diagnosis and prognosis. Epigenetic alterations, especially DNA methylation changes, occur early in tumor development and can be detected in circulating cell-free DNA (cfDNA), making them promising candidates for liquid biopsy-based diagnostics.
Objective: The purpose of this study was to identify and validate epigenetic biomarkers for the non-invasive diagnosis and prognosis of COAD, with a particular emphasis on SEPT9 and SDC2. The objective was to utilize computational techniques to assess their expression patterns and methylation status, with a focus on developing methylation assays suitable for early diagnosis and disease monitoring.
Materials and Methods: The Human Protein Atlas, TCGA, UALCAN, GEPIA, and other publicly available multi-omics datasets were utilized to evaluate gene expression, promoter methylation, protein localization, and survival relationships for SDC2 and SEPT9. By comparing tumor and normal tissues, bioinformatics analyses revealed variations in methylation. The analysis of single-cell RNA sequencing data, with an emphasis on epithelial lineage, confirmed the expression of specific genes in distinct cell types.
Results and Discussion: Bioinformatics analysis revealed significant promoter hypermethylation of SEPT9 and SDC2 in COAD samples compared to normal colon tissue. SDC2 demonstrated subtype-specific downregulation, whereas SEPT9 showed significant overexpression, especially in non-mucinous malignancies. Immunohistochemistry confirmed variable SDC2 expression and elevated SEPT9 protein levels. RNA sequencing of single cells has shown that both genes are highly expressed in epithelial cells, indicating their specificity as epigenetic biomarkers. The increased expression of both genes correlated with reduced overall survival, as indicated by survival analysis, underscoring their potential as prognostic indicators.
39. Analyzing the Incidence and Risk Factors of Retinopathy in Premature Infants
Shipra Singhi, Sunita Bishnoi
Abstract
Background: Retinopathy of prematurity is a leading cause of preventable childhood blindness, particularly among preterm and low birth weight neonates. Understanding its incidence and associated risk factors is essential for effective screening and prevention.
Objectives: To determine the incidence of retinopathy of prematurity in preterm and low birth weight neonates and to assess the association between various perinatal and neonatal risk factors with its occurrence.
Material and Methods: This prospective observational study included 520 preterm and/or low birth weight neonates admitted to a tertiary care neonatal intensive care unit. All eligible neonates underwent serial retinal examinations, and relevant maternal and neonatal risk factors were analyzed.
Results: Retinopathy of prematurity was diagnosed in 84 neonates, with an incidence of 16.15%. Lower gestational age, lower birth weight, prolonged oxygen therapy, and respiratory distress syndrome were significantly associated with ROP development, while sex, twin status, prenatal steroid exposure, and maternal systemic diseases showed no significant association.
Conclusion: Retinopathy of prematurity remains a significant morbidity among preterm neonates. Early screening and identification of high-risk infants, along with careful management of modifiable risk factors, are crucial in preventing disease progression and visual impairment.
40. Onychoscopic Analysis of Nail Disorders among Older Adults
Vivek Nikam
Abstract
Background: Nail disorders are common in the geriatric population and often pose diagnostic challenges due to overlapping clinical features and age-related changes. Onychoscopy has emerged as a useful non-invasive tool for detailed nail assessment.
Objectives: To study the clinico-epidemiological profile and onychoscopic patterns of nail disorders in the geriatric population.
Material and Methods: A cross-sectional observational study was conducted on 120 geriatric patients with nail disorders. Detailed clinical examination and onychoscopic evaluation of nail fold, nail plate, nail bed, and hyponychium were performed and correlated.
Results: Degenerative nail changes were predominant. Onychoscopy identified a higher frequency of nail abnormalities compared to clinical examination and showed strong correlation with clinical findings across all nail components.
Conclusion: Onychoscopy significantly enhances the evaluation of nail disorders in elderly patients and should be incorporated into routine geriatric dermatological practice.
41. Cutaneous Manifestations of Chronic Kidney Disease
S. S. Yadav, Bulbul Yadav
Abstract
Background: Chronic kidney disease (CKD) is a progressive systemic disorder associated with multiple dermatological manifestations that significantly affect patients’ quality of life.
Objectives: To study the spectrum and frequency of cutaneous manifestations in patients with chronic kidney disease.
Materials and Methods: This hospital-based observational study was conducted from February 2023 to November 2025 at Nirmala Hospital & Research Center, Jaipur. All diagnosed CKD patients were included. Patients with acute kidney injury or pre-existing primary dermatological disorders unrelated to CKD were excluded. Detailed clinical, dermatological, and laboratory evaluations were performed. Data were analyzed using descriptive statistics.
Results: A total of 327 CKD patients were studied (mean age 52.4 ± 11.6 years; male:female ratio 1.8:1). The most common etiology of CKD was obstructive uropathy (41%). Non-specific cutaneous manifestations were predominant. Pruritus (72.7%), hyperpigmentation (70%), and xerosis (67.8%) were the most frequent findings. Among specific lesions, acquired perforating dermatosis (8.5%) and porphyria cutanea tarda (3.2%) were observed.
Conclusion: Cutaneous manifestations are highly prevalent in CKD patients, with non-specific lesions being more common than specific dermatoses. Early identification and appropriate dermatological care should be integrated into routine CKD management.
42. Assessing Hemoglobinopathy Occurrence via High-Performance Liquid Chromatography in a Tertiary Care Setting
Darshanaben Kanabhai Gohel, Nishant Pujara, Sandip Patel
Abstract
Background: Hemoglobinopathies are among the most common inherited disorders worldwide, with a significant burden in India. High-performance liquid chromatography (HPLC) has become the gold standard for detecting and classifying these disorders, offering precise quantification and identification of hemoglobin variants.
Aim: To estimate the prevalence and distribution of various hemoglobinopathies detected by HPLC in a tertiary care center in India.
Material and Methods: This cross-sectional observational study included 310 patients screened over a period of one year. Patients of all age groups and both sexes referred for hemoglobinopathy screening were enrolled. Detailed clinical data were collected, and venous blood samples were analyzed using HPLC. Demographic details, age-wise distribution, and prevalence of hemoglobinopathies were recorded.
Results: Of the 310 patients, 125 (40.3%) were male and 185 (59.7%) were female. The majority of patients belonged to the 21–30 years age group (33.9%). HPLC analysis showed 235 (75.8%) had normal hemoglobin while 38 (12.3%) had β-thalassemia trait, 12 (3.9%) had sickle cell trait, 4 (1.3%) had sickle cell-β-thalassemia compound heterozygosity, and 2 (0.6%) had sickle cell homozygosity. Thalassemia trait was most commonly diagnosed in the 21–30 years group. Among thalassemia carriers, RDW was predominantly in the 16–20 range.
Conclusion: This study highlights the considerable burden of hemoglobinopathies, particularly β-thalassemia trait, in the regional population. HPLC proved highly effective for screening and diagnosis. Strengthening screening programs, especially among young adults, along with public awareness initiatives, is essential to reduce the hemoglobinopathy burden in India.
43. Vitamin D as a Novel Biomarker for Grading the Severity of Preeclampsia: A Cross-Sectional Case Control Study
Vibha Khare, Akshatha R., Akhilesh Bhamoriya, Tapan Sing Pendro
Abstract
Introduction: Preeclampsia remains a leading cause of maternal and perinatal morbidity worldwide. The importance of vitamin D for placental function, endothelial integrity, and immunological regulation is increasingly recognized. The purpose of this study was to measure serum vitamin D levels in pregnant women with preeclampsia and in normotensive pregnant women, and to assess whether vitamin D could serve as a biomarker for grading the severity of the condition.
Aims and Objectives: The objectives of this study were: (i) to estimate serum vitamin D levels in normotensive, mild pre-eclamptic, and severe pre-eclamptic pregnancies, and (ii) to determine the relationship between disease severity and vitamin D levels.
Material and Methods: A tertiary care teaching hospital served as the site of this cross-sectional case-control study. A total of 120 third-trimester pregnant women were recruited and divided into three groups: 40 normotensive controls, 40 with mild preeclampsia, and 40 with severe preeclampsia. A competitive enzyme-linked immunosorbent assay (ELISA) was used to measure serum vitamin D levels. Biochemical and clinical parameters were compared among groups. One-way ANOVA and Pearson correlation were used for statistical analysis.
Results: Women with preeclampsia had significantly lower mean serum vitamin D levels than normotensive controls (p <0.00001). As the severity of preeclampsia increased, vitamin D levels progressively decreased. Diastolic blood pressure (DBP), systolic blood pressure (SBP), and proteinuria all showed significant negative correlations with serum vitamin D.
Conclusion: Serum vitamin D levels are markedly lower in preeclampsia and show a negative correlation with the severity of the illness. One possible biomarker for identifying high-risk pregnancies and grading preeclampsia is vitamin D.
44. Effect of 4mg Dexamethasone for Prevention of Post-Operative Nausea and Vomiting in Laparoscopic Surgeries
Rakshitha R., Prashantha Kumar H. M., Holy Joy, Narasimha Reddy B., Saraswathi P. Devi
Abstract
Background: Laparoscopy was first introduced as a therapeutic alternative to laparotomy more than a century ago. Since then, the field of laparoscopic surgery has undergone enormous development and expansion, to the point where it is now the standard treatment for a wide range of surgical procedures, including cholecystectomy, appendicectomy, gynecologic surgeries, bariatric surgery, hernia repair, and even complex oncologic operations. However, laparoscopic surgeries are associated with a high incidence of postoperative nausea and vomiting (PONV), reported at 40%–80%. A number of drugs have been used for its prevention. Dexamethasone, a glucocorticoid with antiemetic, anti-inflammatory, and analgesic effects, has been shown to reduce the incidence of PONV; however, the optimal dose has not been clearly defined. In this study, we aimed to evaluate the effect of a 4 mg dose of dexamethasone on the incidence of PONV in patients undergoing laparoscopic surgery.
Methods: A double-blind randomized controlled study was performed on 70 patients posted for elective laparoscopic surgeries under general anesthesia to assess the efficacy of a 4 mg dose of dexamethasone in preventing PONV. Patients were randomly assigned to two groups: 4 mg dexamethasone (1 ml) or 1 ml normal saline. The incidence of nausea and vomiting and the need for antiemetics were evaluated during the first 24 postoperative hours.
Results: Patients who received IV dexamethasone 4 mg had a significant reduction in PONV (p<0.01), and the need for rescue antiemetic drugs was also lower in the dexamethasone group compared with the normal saline group.
Conclusion: Injection dexamethasone 4 mg given before induction of anesthesia effectively controls postoperative nausea and vomiting in laparoscopic surgeries.
45. Prevalence and Anatomical Distribution of Lateral Canals in Maxillary Premolars Assessed Using Cone-Beam Computed Tomography: An Original Study
Manoj Meena, Monika Sharma, Akshay Verma, Purusharth Kumar Sharma
Abstract
Aim: To evaluate the presence, frequency, and anatomical location of lateral canals in extracted maxillary premolars using cone-beam computed tomography (CBCT).
Materials and Methods: Three hundred extracted human maxillary premolars were subjected to CBCT imaging under standardized parameters. Axial, sagittal, and coronal sections were evaluated for root canal configuration according to Vertucci’s classification and for the presence of lateral canals. Data were recorded and analyzed using descriptive statistics.
Results: The majority of maxillary premolars exhibited Vertucci Type I canal configuration. Lateral canals were identified in 1.0% of specimens (3 out of 300 teeth). When present, lateral canals were predominantly located in the middle and apical thirds of the root. Complex canal configurations including Vertucci Types II, IV, V, VI, and VIII were also observed.
Conclusion: Lateral canals in maxillary premolars are relatively rare but clinically significant anatomical variations. CBCT is a reliable non-destructive imaging modality for detecting lateral canals and complex root canal morphology, thereby aiding in improved endodontic diagnosis and treatment planning.
46. Efficacy of Dexmedetomidine as an Adjuvant with Ropivacaine in Paravertebral Block in Surgery for Breast Cancer – A Study of 50 Cases
Yagnik Jagdishbhai Vaja, Jaykishan J. Gol, Krishna Dhamat
Abstract
Background: Effective postoperative pain control after breast cancer surgery is essential to reduce morbidity, opioid consumption, and patient discomfort. Thoracic paravertebral block (TPVB) is an established regional anesthesia technique that provides unilateral analgesia with minimal systemic effects. Ropivacaine is commonly used for TPVB due to its favorable safety profile. Dexmedetomidine, a highly selective α₂-adrenergic agonist, has been increasingly used as an adjuvant to local anesthetics to enhance analgesic efficacy. This study evaluated the efficacy of dexmedetomidine as an adjuvant to ropivacaine in TPVB for patients undergoing modified radical mastectomy.
Material and Methods: This prospective, randomized, controlled study was conducted on 50 female patients aged ≥18 years, belonging to ASA physical status I–III, scheduled for modified radical mastectomy. Patients were randomly allocated into two groups: Group PR received TPVB with 0.5% ropivacaine, while Group PRD received TPVB with 0.5% ropivacaine plus dexmedetomidine (1 μg/kg). TPVB was performed at T1, T3, and T5 levels before induction of general anesthesia. Primary outcomes included duration of analgesia and total postoperative opioid consumption. Secondary outcomes included onset of sensory block, hemodynamic parameters, and postoperative pain scores using the Visual Analogue Scale (VAS), Ramsay Sedation Scores, patient satisfaction, and adverse effects.
Results: Group PRD demonstrated a significantly faster onset of sensory block, prolonged duration of analgesia, lower postoperative VAS scores at all time intervals, and significantly reduced tramadol consumption compared to Group PR (p < 0.001). Hemodynamic parameters showed a controlled and stable reduction in heart rate and blood pressure in the dexmedetomidine group without clinical instability. Patient satisfaction was higher in Group PRD, with no significant increase in adverse effects.
Conclusion: Dexmedetomidine as an adjuvant to ropivacaine in TPVB significantly improves postoperative analgesia, reduces opioid requirement, and enhances patient satisfaction without increasing complications.
47. Correlation of Clinical, Computed Tomographic, and Intraoperative Findings in Chronic Rhinosinusitis
Zeel Patel, Nimisha Nimkar, Rachana Prajapati
Abstract
Background: Chronic rhinosinusitis (CRS) is a common inflammatory condition of the nasal cavity and paranasal sinuses persisting for more than 12 weeks and causing significant morbidity worldwide. Accurate diagnosis and precise delineation of disease extent are essential for effective management. With the advent of Functional Endoscopic Sinus Surgery (FESS), diagnostic nasal endoscopy and computed tomography (CT) of paranasal sinuses have become integral to preoperative evaluation. However, discrepancies between radiological findings and intraoperative observations still exist, particularly regarding anatomical variations and sinus involvement. Establishing a correlation between clinical, radiological, and operative findings is therefore crucial to optimize surgical planning and outcomes.
Material and Methods: This prospective observational study was conducted in the Department of Otorhinolaryngology at a tertiary care teaching hospital in Western India between May 2023 and August 2024. A total of 42 patients diagnosed with chronic rhinosinusitis and planned for endoscopic sinus surgery were included. All patients underwent detailed clinical evaluation, diagnostic nasal endoscopy, and CT scan of paranasal sinuses prior to surgery. CT findings were compared with intraoperative observations to assess diagnostic accuracy. Sensitivity, specificity, accuracy, and Cohen’s kappa coefficient were calculated to evaluate agreement between CT and operative findings.
Results: Among the 42 patients, males predominated (54.76%), with the most affected age group being 31–40 years. Nasal obstruction was the most common symptom (92.85%). Maxillary sinus was the most frequently involved sinus on CT, followed by ethmoid sinuses. CT scan demonstrated high sensitivity for detecting sinus disease and osteomeatal complex obstruction, with substantial agreement for osteomeatal complex blockage and deviated nasal septum. However, lower sensitivity and agreement were observed for certain anatomical variations such as concha bullosa and Onodi cells.
Conclusion: CT scan of paranasal sinuses is a highly sensitive tool for evaluating CRS and guiding surgical management. When combined with clinical assessment and nasal endoscopy, it provides optimal preoperative planning and improves intraoperative safety.
48. Mental Health Stigma and Attitudes: A Comparative Cross-Sectional Study among Psychiatric Patients and Their Caregivers in the Malwa Region
Akshay Soni, Priya Rai, Maitreyee Dale, Lovepreet Chabarwal
Abstract
Background: Stigma is one of the major obstacles to timely mental health service use and engagement. While caregivers may help shorten treatment delay, they may also transmit stigmatizing beliefs that influence the help-seeking behavior of patients. This study compared self-stigma of seeking help and community attitudes toward mental illness between psychiatric patients and their primary caregivers in the Malwa region, using the Self-Stigma of Seeking Help scale (SSOSH) and the 12-item Community Attitudes toward the Mentally Ill scale (CAMI-12).
Methods: In a hospital-based comparative cross-sectional design, psychiatric patients and their primary family caregivers were recruited consecutively from outpatient and inpatient psychiatry services. Sociodemographic and clinical data were collected. The SSOSH (10 items; higher scores = greater self-stigma of help-seeking) and CAMI-12 (12 items; higher scores = less stigmatizing community attitudes after reverse coding) were administered to both groups. Group differences were tested with independent-samples t-tests, and effect sizes were calculated. Multivariable linear regression was conducted to examine predictors of SSOSH and CAMI-12 scores, controlling for key covariates.
Results: A total of 160 patients and 160 caregivers were analyzed. Patients had higher SSOSH scores than caregivers: 31.6 (SD 7.5) vs 25.4 (SD 6.8), mean difference 6.2, p<0.001, Cohen’s d=0.88. Caregivers held more stigmatizing community attitudes (lower CAMI-12 total) than patients (40.1 ± 6.5 vs 43.2 ± 6.0; p<0.001; d=0.49). In the adjusted models, rural residence and lower education were independently associated with higher SSOSH and lower CAMI-12 scores in both groups. Caregiver CAMI-12 “prejudice/exclusion” scores were negatively associated with patient SSOSH (β = −0.24 per unit CAMI-12; p=0.002), suggesting attitudinal contagion within patient–caregiver dyads.
Conclusion: Patients carried greater internalized barriers to help-seeking, whereas caregivers exhibited comparatively more negative community attitudes. Interventions in Malwa should be dyad-focused (patient-centred stigma reduction alongside caregiver psychoeducation) to enhance engagement and continuity of care.
49. Morphological Spectrum of Bone Marrow Findings and Its Role in the Evaluation of Haematological Disorders
Krishnadeep Sahu, Puja Singh, Amar Gangwani, Himani Yadav
Abstract
Introduction: Bone marrow examination (BME) is a cornerstone diagnostic procedure for evaluating hematological disorders, providing critical insights into cellular morphology, architecture, and iron stores. This study aimed to describe the clinico-morphological spectrum of bone marrow findings and assess the diagnostic utility of bone marrow aspiration (BMA) and biopsy (BMB) in a tertiary care center in the Bundelkhand region of Madhya Pradesh, India.
Materials and Methods: A prospective, observational study was conducted over a specified period, including 90 patients who underwent BME for various hematological indications. Peripheral blood parameters, clinical features, and bone marrow morphology from both aspiration and trephine biopsy were analyzed. Special stains (Perls’ Prussian blue, Reticulin) were employed as needed.
Results: The largest age group in the study population was younger adults (20-29 years, 31.1%), with a slight male predominance (53.3%). The most common clinical features were weakness (90%) and pallor (88.9%). Megaloblastic anemia (MA) was the most frequent diagnosis (31.1%), followed by mixed deficiency anemia (MDA, 20.0%) and iron deficiency anemia (IDA, 11.1%). Non-neoplastic disorders constituted 85.5% of cases, while conditions such as acute leukemia, aplastic anemia, and myelofibrosis accounted for the remaining 14.4%. Bone marrow biopsy was pivotal in cases of dry tap or when architectural assessment was crucial, such as in myelofibrosis and aplastic anemia.
Conclusion: Nutritional deficiency anemias, particularly megaloblastic anemia, are the predominant hematological disorders in the Bundelkhand region. Bone marrow examination remains an indispensable, cost-effective tool for definitive diagnosis, especially in differentiating between nutritional deficiencies, marrow failure syndromes, and hematological malignancies.
50. Histopathological Spectrum of Lesions in Nasopharynx and Sinonasal Sinuses: A Tertiary Care Experience
Sujata Lawa, Deepak Maini, Sharda Dawan
Abstract
Background: Lesions of the nasal cavity, paranasal sinuses, and nasopharynx encompass a wide range from non-neoplastic inflammatory conditions to benign and malignant neoplasms. Because of overlapping clinical features, histopathological examination remains the gold standard for diagnosis.
Aims: To evaluate the histopathological spectrum of lesions in the nasopharynx and sinonasal region, analyze their demographic distribution, and compare findings with previous studies.
Materials and Methods: This retrospective cross-sectional study was conducted from June 2022 to June 2024 in the Department of Pathology, Sardar Patel Medical College, Bikaner. A total of 150 biopsies from the sinonasal and nasopharyngeal regions were studied. Hematoxylin and eosin staining was performed; special stains were used when indicated. Data were analyzed statistically.
Results: Among 150 cases, 81 (54%) were non-neoplastic and 69 (46%) were neoplastic. Males predominated (63.3%), and the mean age was 40.08 years. The nasal cavity was the most common site (46.7%), followed by the tonsillar region (26%). Inflammatory polyp was the most frequent non-neoplastic lesion, while squamous cell carcinoma was the most common malignant tumor. The association between age and lesion type was statistically significant (p < 0.001).
Conclusion: The sinonasal and nasopharyngeal regions show a diverse spectrum of lesions. Non-neoplastic inflammatory conditions predominate, but malignant neoplasms, particularly squamous cell carcinoma, constitute a significant subset, emphasizing the role of histopathology in accurate diagnosis and management.
51. Effect of Smartphone Use and Prolonged Screen Time on Digital Eye Strain (DES), Visual Acuity and Refraction among Medical Students: A Cross Sectional Study
Asima Hassan, Sajad Khanday, Javed Alikhan, Sadiya Sajad
Abstract
Aim: To determine the effect of smartphone use and prolonged screen time on digital eye strain (DES), visual acuity, refraction and overall ocular health among medical students.
Materials and Methods: A cross-sectional study was conducted at Government Medical College Srinagar, Kashmir, India. This study included 225 students from the 2nd to 5th year of MBBS who consented to participate. Information regarding participants’ bio-data, screen time, and DES symptoms was gathered through a meticulously crafted self-administered questionnaire. A Snellen’s chart was used to assess best corrected visual acuity, and the refraction of participants was noted. Chi-square tests and Pearson correlation were used, and analysis was conducted using SPSS software.
Results: Out of 225 participants, 186 (82.6%) reported at least one symptom of digital eye strain. Headache (n=96; 42.6%) and eye pain/discomfort (n=73; 32.44%) were the most commonly reported symptoms. Refractive error was present in 102 (45.33%) students, including myopia (n=78; 34.66%), hyperopia (n=13; 5.77%), and astigmatism (n=11; 4.88%). Mobile phone (n=225; 100%), laptop (n=175; 78.22%), and tablet/iPad (n=76; 33.77%) were the main electronic gadgets used by participants. Headache, ocular discomfort, redness, watering of the eyes, itching of the eyes, and burning of the eyes, along with neck/shoulder pain, were significantly associated with increased screen time. The most common refractive error among students with prolonged screen time was myopia (p<0.05).
Conclusion: This study reveals an alarming 82.6% prevalence of DES among medical students at GMC Srinagar, strongly associated with increased screen time and smartphone usage, as well as a significant association of prolonged screen exposure with refractive errors, especially myopia. Headache and eye pain/discomfort were the most common symptoms.
52. Correlation between Red Cell Distribution Width and Severity of Ischemic Cerebrovascular Stroke
Namrata Patel, Fenil Vaghasiya, Ashok Kumar Choudhary, Purvi Patel
Abstract
Ischemic cerebrovascular stroke constitutes the majority of stroke burden worldwide and remains a leading cause of mortality and long-term disability. Despite advances in neuroimaging and reperfusion therapies, early identification of individuals at increased risk remains a major clinical challenge. Red cell distribution width (RDW), a routinely reported parameter in complete blood count, reflects variability in erythrocyte size and has emerged as a novel biomarker associated with adverse cardiovascular and cerebrovascular outcomes. Elevated RDW has been linked to systemic inflammation, oxidative stress, endothelial dysfunction, altered blood rheology, and prothrombotic states—key mechanisms implicated in ischemic stroke pathogenesis. This review examines the association between elevated RDW levels and ischemic stroke, synthesizing epidemiological evidence, exploring biological mechanisms, and evaluating clinical implications. Particular emphasis is placed on the relevance of RDW as a cost-effective biomarker in resource-limited settings such as South Gujarat. The potential role of RDW in stroke risk stratification and future research directions is discussed.
53. Evaluation of Posterior Segment in Advanced and Mature Cataract by B Scan Ultrasonography – A Prospective Study
N. Jayanthi, S. Sivapriya, Nikita N. Bhujang
Abstract
Background: Cataract, defined as any opacity in the lens of the eye or its capsule, whether developmental or acquired, is the most common preventable cause of bilateral blindness in India, the leading cause of vision loss in the elderly worldwide, and the primary cause of reversible blindness globally. B-scan ultrasonography is a powerful, safe, cost-effective, non-ionizing, and non-invasive diagnostic tool for evaluating hidden posterior segment lesions in eyes with opaque media caused by corneal opacities, dense cataracts, or vitreous hemorrhage, which complicate ophthalmic evaluation. As a two-dimensional imaging system, B-scan ultrasound is particularly useful when the fundus cannot be visualized by direct or indirect ophthalmoscopy, such as in the presence of a dense cataract. It is routinely performed preoperatively in cases of dense cataract to evaluate posterior segment abnormalities that may influence visual prognosis after surgery.
Aim: To evaluate posterior segment pathology in eyes with opaque media due to mature and advanced cataract, and thereby plan management and determine visual prognosis.
Objectives: (1) To evaluate posterior segment pathology in patients with mature and advanced cataract by B-scan. (2) To plan the management protocol based on B-scan findings. (3) To determine visual prognosis preoperatively.
Materials and Methods: This prospective study was conducted in 200 patients with mature and advanced cataract. Relevant details and history were collected. A detailed ophthalmic examination was done for classification into groups.
Results: Out of 200 patients, 77 eyes had advanced immature senile cataract (IMSC), 90 had mature senile cataract (MSC), and 33 had hypermature senile cataract (HMSC). Our findings demonstrated that B-scan ultrasonography is a useful tool for evaluating posterior segment pathology in patients with advanced and mature cataract. In this study, the majority of patients (89.5%) had a normal B-scan, indicating no significant abnormalities in most cases. Mild vitreous degeneration was observed in 3.5% of patients, a common finding in the aging population and often not associated with severe visual impairment. Moderate vitreous degeneration was found in 2.0% of patients, while severe vitreous degeneration was observed in 1.0%, indicating progression of the degenerative changes affecting the vitreous humor.
Conclusion: This study concluded that B-scan ultrasonography should be used in the preoperative evaluation of advanced and mature cataract to diagnose hidden posterior segment pathology, enhancing surgical planning and prognosis.
54. A Clinical Study of Anterior Chamber Depth Measurement as a Screening Tool for Primary Angle Closure Glaucoma
N. Jayanthi, S. Sivapriya, K. Indulatha
Abstract
Background: Glaucoma is the world’s second leading cause of blindness, causing permanent visual loss. Angle-closure glaucoma is regarded as the primary cause of permanent blindness globally, with a greater incidence among Asians. PACG is characterized by a narrow or closed anterior chamber angle, which leads to increased intraocular pressure and optic nerve damage. Anatomical risk factors include shallow anterior chamber depth, short axial length, and increased lens thickness.
Need for Screening: Gonioscopy is the gold standard for angle evaluation, but it is technique-sensitive and not always practical for mass screening. ACD measurement is a simple, non-invasive screening alternative.
Objectives: (1) To evaluate anterior chamber depth measurement as a method of screening for PACG. (2) To compare the parameters in eyes with PACS, PAC, and PACG.
Materials and Methods: This is a prospective study conducted on 150 patients with shallow anterior chamber and patients presenting with signs and symptoms of angle closure. Detailed history was collected. Detailed ophthalmic examination was done for classification into groups.
Results: Out of 150 patients, 36 eyes with open angles, 46 eyes with PACS, 33 eyes with PAC, and 35 eyes with PACG were identified. Our findings demonstrated that ACD measurement is significant for identifying individuals at risk of primary angle closure glaucoma. This study revealed a statistically significant difference between the mean ACD of the PACS, PAC, and PACG groups.
Conclusion: This study concluded that anterior chamber depth measurement is an effective screening tool for primary angle closure glaucoma, especially in primary outreach centres where sophisticated equipment may not be available.
55. Diverse Cutaneous Reactions to Diclofenac Sodium: A Case Series of Three Patients
Sunil Mhatarba Vishwasrao, Sufala Sunil Vishwasrao, Amar Nagesh Kumar, Pollilan G. R.
Abstract
Diclofenac sodium is an analgesic frequently prescribed by most clinicians in both outpatient (OPD) and inpatient (IPD) settings, and is also preferred for managing postoperative pain. The drug is economical and efficacious. Dermatological adverse reactions to diclofenac may manifest in moderate to severe forms. Early identification of an adverse drug reaction (ADR) and prompt treatment are necessary and aid the patient’s faster recovery; delayed identification can allow progression to a severe form that may prove fatal. Here we report three cases of diclofenac-induced ADRs with varied cutaneous manifestations.
56. A Study on Diastolic Dysfunction in Newly Diagnosed Type 2 Diabetes Mellitus and Its Correlation with Glycosylated Hemoglobin
Chiranjeevi Parnapalli, Bhargav Kiran Gaddam, Mani Ratnam Kothamasu
Abstract
Background: Diabetes mellitus (DM) is a long-term metabolic disease marked by elevated blood glucose levels, resulting from inadequate insulin production, resistance to insulin effects, or a combination of both. The interplay between diabetes and cardiovascular health is particularly important in the context of diastolic dysfunction, a precursor to heart failure with preserved ejection fraction (HFpEF).
Objective: This study investigates the occurrence of diastolic dysfunction in individuals newly diagnosed with type 2 diabetes mellitus (T2DM) and examines its association with glycosylated haemoglobin (HbA1c), a key indicator of long-term blood glucose regulation.
Methods: This prospective, non-interventional study was conducted over one year on 52 newly diagnosed type 2 diabetes mellitus patients aged 18 to 60 years. ECG, 2D echocardiography, FBS, PPBS, and HbA1c were performed. Patients were grouped on the basis of the presence of diastolic dysfunction on echocardiography. Quantitative data were analysed with the t-test and qualitative data with the Chi-square and Fisher exact tests. Statistical significance was taken as p < 0.05.
Results: Among the 52 participants, only 5 of those aged ≤55 years (22.7%) had LV diastolic dysfunction, whereas a substantial 24 (80.0%) of those older than 55 years did. The gender distribution showed equal prevalence of LV diastolic dysfunction among males and females. Among individuals with HbA1c ≤6.4%, 4 (50.0%) had LV diastolic dysfunction, while among those with HbA1c ≥6.5%, a larger proportion, 25 (56.8%), did. Among individuals with FBS <125 mg/dL, 9 (42.9%) had LV diastolic dysfunction, versus 20 (64.5%) of those with FBS ≥126 mg/dL. Among individuals with PPBS <200 mg/dL, 10 had LV diastolic dysfunction, versus 19 of those with PPBS ≥200 mg/dL.
Conclusion: Left ventricular diastolic dysfunction (LVDD) is frequently present in newly diagnosed, normotensive type 2 diabetes mellitus (T2DM) patients, indicating that subclinical cardiac involvement may start early in the disease course. Implementation of early cardiac evaluation, combined with stringent glycemic control and lifestyle modifications, may potentially delay or prevent the progression to overt heart failure in diabetic individuals.
57. Abnormal CTG Findings and Perinatal Outcome in Low-Risk Term Pregnancies
Talwar Karishma, K. Smitha, T. Kiruthika
Abstract
Background: Cardiotocography is a fetal surveillance modality used to detect fetal hypoxia and help reduce perinatal morbidity and mortality. Abnormal findings on CTG can lead to early intervention and improve perinatal outcomes.
Methods: This was an observational study where 94 low-risk pregnant patients with abnormal CTG tracings were selected. All of them underwent emergency caesarean section. Perinatal outcome was measured by noting APGAR scores at 1 minute and 5 minutes and the need for NICU admission.
Results: Out of the 94 patients, 42 (44.6%) were ≤25 years and 52 (55.4%) were >25 years. Primigravida accounted for 64 (68%) and multigravida 30 (32%). Gestational age was <37 weeks in 21 (22.3%) and ≥37 weeks in 73 (77.6%). There were 52 (55.4%) male babies and 42 (44.6%) female babies. Birth weight was <2.5 kg in 26 (27%) and ≥2.5 kg in 68 (73%). APGAR scores at 1 minute were ≥7 in 89 (95%) and <7 in 5 (5%). At 5 minutes, APGAR scores were ≥7 in 90 (96%) and <7 in 4 (4%). NICU admission was required for 57 (60%) babies. CTG findings were suspicious in 61 (65%) and abnormal in 33 (35%). NICU admission was noted in 22 (23.4%) of the abnormal CTG group and 35 (37.3%) of the suspicious CTG group. No statistical significance was found in the association between CTG findings and NICU admission (p=0.247) or between CTG findings and low APGAR scores at 1 minute (p=0.353) and 5 minutes (p=0.304).
Conclusion: The study showed that while CTG abnormalities lead to emergency interventions, they do not necessarily predict poor immediate neonatal outcomes. The association between abnormal CTG findings and NICU admission or low APGAR scores was not statistically significant. Further research with larger sample sizes is needed to explore these associations more definitively.
58. To Assess the Prevalence, Severity, and Long-Term Impact of Thyroid Dysfunction Following Intensity-Modulated Radiotherapy (IMRT) in Patients with Non-Thyroidal Head and Neck Cancers and to Evaluate the Potential Need for Routine Thyroid Function Monitoring in This Patient Population
A. Satish Kumar, Dalin Xavier, G.R. Santhilatha, G. Padma Sree
Abstract
Background: To assess the prevalence, severity, and long-term impact of thyroid dysfunction following intensity-modulated radiotherapy (IMRT) in patients with non-thyroidal head and neck cancers, and to evaluate the potential need for routine thyroid function monitoring in this patient population.
Methodology: This prospective cohort study was conducted at a single tertiary care institute, the Radiotherapy Department of Government General Hospital (GGH), Guntur. The primary aim was to assess the incidence and pattern of thyroid dysfunction following intensity-modulated radiotherapy (IMRT) in patients with non-thyroid head and neck cancers. A total of 70 patients with non-thyroid head and neck squamous cell carcinoma (HNSCC) were prospectively evaluated for thyroid dysfunction following radiation treatment.
Results: In this prospective study, 70 patients with non-thyroid head and neck squamous cell carcinoma (HNSCC) were recruited. Of these, 8 patients (11.4%) developed subclinical hypothyroidism following treatment, whereas 62 (88.6%) retained normal thyroid function during the follow-up period. The median time to development of subclinical hypothyroidism was 3 months.
Conclusion: The study strongly advocates early and sustained thyroid function monitoring post-radiotherapy, even in asymptomatic patients. Detecting subclinical hypothyroidism early opens a window for intervention before clinical symptoms arise. Future research should explore long-term outcomes with larger sample sizes, integrate autoimmune and endocrine markers, and optimize radiation planning to mitigate risk. Such efforts are essential for improving survivorship quality and reducing the burden of preventable late effects in cancer care.
59. A Comparative Study on Locking Plate versus Intramedullary Nail in the Management of Proximal Humerus Fractures
Lavudi Rambabu, Shuja Nazim, C. Abednego
Abstract
Background: Proximal humerus fractures make up about 5% of all fractures, with a higher occurrence in older adults due to osteoporosis. Non-displaced fractures can be treated with conservative methods, but displaced fractures need surgery. Two main fixation methods, locking plates and intramedullary nails, are commonly used in orthopedic practice, but there is still debate about which is more effective. Locking plates offer better control in the metaphysis with fixed-angle constructs, while intramedullary nails provide biological fixation with less disruption to soft tissue. This study aimed to compare functional outcomes, complication rates, and radiological union between these two techniques in an Indian population.
Methods: A prospective comparative cohort study was carried out over 18 months at the Department of Orthopedic Surgery from January 2023 to June 2024. Seventy-five patients with displaced proximal humerus fractures (Neer classification II-IV) were assigned to either locking plate fixation (n=37) or intramedullary nail fixation (n=38). We measured functional outcomes using the Constant-Murley Score and the American Shoulder and Elbow Surgeons (ASES) score at 6 weeks, 3 months, 6 months, and 12 months. Secondary outcomes included the rates of radiological union, complication rates, need for revision surgery, and time to union. We performed statistical analysis with independent t-tests for parametric data, Mann-Whitney U tests for non-parametric data, and Chi-square tests for categorical data (p < 0.05).
Results: At 12 months, the mean Constant-Murley score was 76.2±8.4 for the locking plate group and 74.8±9.1 for the intramedullary nail group (p=0.436). The ASES scores were 78.4±7.6 for locking plates and 76.9±8.3 for intramedullary nails (p=0.352). Radiological union was seen in 94.6% of locking plate cases and 92.1% of intramedullary nail cases (p=0.601). Varus collapse occurred in 8.1% of locking plate cases compared to 5.3% for intramedullary nails (p=0.486). Revision surgery was necessary for 5.4% of the locking plate group and 7.9% of the intramedullary nail group (p=0.512). Both groups showed similar functional recovery and acceptable complication rates.
Conclusion: Both locking plate and intramedullary nail fixation are effective surgical options for displaced proximal humerus fractures, with similar functional outcomes and complication rates. Treatment should be tailored to the individual, considering fracture complexity, bone quality, and the surgeon’s expertise. These results support using both techniques as primary surgical options in Indian orthopedic practice, providing evidence-based outcomes that facilitate a return to daily activities and work.
60. Study of Correlation of Serum Ascitic Albumin Gradient with Oesophageal Varices in Patients with Portal Hypertension in Chronic Liver Disease – Retrospective Study
Sandip Kashinath Ghule, Shubhangi Kashinath Ghule, Umesh Badrinath Khedkar
Abstract
Background: Ascitic fluid analysis helps establish the etiology of ascites, and an elevated serum-ascites albumin gradient (SAAG, calculated as serum albumin minus ascitic fluid albumin) accurately indicates portal hypertension.
Method: Ninety adult patients with liver disease were studied. USG was carried out to diagnose cirrhosis of the liver. Blood examination included CBC, liver function tests, renal function tests, and a coagulation profile; Child-Pugh scores were calculated to grade disease severity, and paracentesis of ascitic fluid was performed. Ascitic fluid was analyzed for SAAG calculation.
Results: The association between SAAG and the grading of esophageal varices was statistically significant (p<0.001). Apart from the elevation of SAAG (derived from serum albumin, g/dL), bilirubin (mg/dL) levels were also increased.
Conclusion: The present pragmatic study shows a strong correlation between SAAG and the presence and severity of esophageal varices in patients with chronic liver disease and portal hypertension.
61. Introduction and Impact of Mini Clinical Evaluation Exercise as an Assessment Tool for MBBS Interns Posted in the Department of Dermatology
Chandra Shekhar Jaiswal, Abhay Kumar Sinha, Vivekanand Waghmare
Abstract
Background: Competency-based medical education (CBME) has highlighted the importance of formative assessment methodologies that evaluate real-time clinical performance. Traditional assessment methods frequently fail to assess critical qualities like communication, professionalism, and clinical reasoning during actual patient encounters. Mini-Clinical Evaluation Exercise (Mini-CEX) is a systematic workplace-based assessment method that aims to close this gap.
Objectives: The present study was conducted to introduce Mini-CEX as a formative assessment tool for MBBS interns during their dermatology posting. The study also aimed to assess the impact of Mini-CEX on interns’ clinical competencies, including history taking, examination, clinical judgment, communication skills, and professionalism. Additionally, the perceptions of interns and faculty regarding the usefulness and feasibility of Mini-CEX were evaluated.
Methods: A prospective interventional study was conducted among MBBS interns posted in the Department of Dermatology. Interns underwent multiple Mini-CEX encounters using a standardized assessment proforma. Scores from initial and subsequent encounters were compared, and feedback responses were analyzed.
Results: There was a statistically significant improvement in mean Mini-CEX scores across all assessed domains after repeated encounters. Interns and faculty reported high satisfaction with Mini-CEX, particularly highlighting its role in improving clinical confidence and feedback-based learning.
Conclusion: Mini-CEX is an effective, feasible, and acceptable formative assessment tool for MBBS interns in dermatology, contributing significantly to competency development.