- The document analyzes the essential causes of hypertension using data collected over 20 years from a hospital in Ibadan, Nigeria.
- It applies principal component analysis to identify the most contributory factors among 9 variables related to hypertension causes.
- The results of the analysis reveal that obesity accounted for 76.2% of the total variance, while smoking, excess salt, and excess alcohol accounted for 8.6%, 5.7%, and 4.4% respectively, together explaining 94.8% of the variance. Obesity is therefore identified as the most contributory factor to hypertension.
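The variance-explained percentages above come from the standard PCA computation: the eigenvalues of the covariance matrix, each divided by their sum. A minimal sketch on synthetic data (the study's actual nine hypertension-related variables are not available here, so the numbers below are illustrative only):

```python
import numpy as np

# Synthetic stand-in for the study's data: 200 patients, 9 variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 9))
X[:, 0] += 3 * X[:, 1]                    # induce correlation so PC1 dominates

Xc = X - X.mean(axis=0)                   # centre each variable
cov = np.cov(Xc, rowvar=False)            # 9x9 covariance matrix
eigvals = np.linalg.eigvalsh(cov)[::-1]   # eigenvalues, largest first
explained = eigvals / eigvals.sum()       # proportion of total variance per PC

print(np.round(100 * explained, 1))       # first PC carries most of the variance
```

Reporting "obesity accounted for 76.2%" corresponds to the first principal component's share of the total variance computed this way.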
PREVENTION OF HEART PROBLEM USING ARTIFICIAL INTELLIGENCE (ijaia)
This document discusses building a machine learning model to predict the probability of patients experiencing heart problems based on their medical data. It analyzes data from 1000 patients across India on risk factors like family history, smoking, hypertension, cholesterol levels, blood sugar, obesity, lifestyle, previous bypass surgery, and iron levels. The model aims to help doctors make treatment decisions and minimize false negatives, where the model predicts no problem when one exists. It finds certain risk factors like family history, age over 50, and being male are correlated with higher heart problem rates. The model will be trained on this data to predict new patients' heart problem probability.
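The paper's stated goal of minimising false negatives (predicting "no problem" when one exists) amounts to choosing a decision threshold on the model's predicted probabilities. A sketch of that trade-off, using invented labels and probabilities rather than the paper's model output:

```python
import numpy as np

# Illustrative ground-truth labels (1 = heart problem) and model probabilities.
y_true = np.array([1, 1, 1, 0, 0, 0, 0, 1])
y_prob = np.array([0.9, 0.6, 0.35, 0.4, 0.2, 0.1, 0.55, 0.45])

def false_negatives(threshold):
    """Count true cases the model misses at a given probability threshold."""
    y_pred = (y_prob >= threshold).astype(int)
    return int(np.sum((y_true == 1) & (y_pred == 0)))

print(false_negatives(0.5))   # stricter threshold misses more true cases
print(false_negatives(0.3))   # lower threshold catches more of them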
1. Cardiovascular diseases are the leading cause of death globally, accounting for 31% of all deaths in 2016. Accurately assessing CVD risk allows for early prevention efforts.
2. The original Framingham Risk Score from 1998 predicts 10-year risk of coronary heart disease based on age, sex, cholesterol, blood pressure, smoking, and diabetes. However, it does not include other outcomes like stroke and underestimates risk in other populations.
3. Several risk calculators have been developed to address limitations of the FRS. Models like SCORE and QRISK2 predict fatal cardiovascular risk and account for additional risk factors like ethnicity and deprivation level respectively.
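Calculators in the Framingham/SCORE family generally share one functional form: a Cox-model baseline survival raised to the exponential of a linear predictor over the risk factors. A sketch of that form only; the coefficients, baseline survival, and mean linear predictor below are hypothetical placeholders, not published values from any of these scores:

```python
import math

S0_10YR = 0.95    # hypothetical baseline 10-year survival (NOT a published value)
LP_MEAN = 4.0     # hypothetical population-mean linear predictor

def ten_year_risk(age, sbp, total_chol, smoker):
    """10-year risk = 1 - S0 ** exp(lp - lp_mean); coefficients are illustrative."""
    lp = 0.05 * age + 0.01 * sbp + 0.005 * total_chol + 0.5 * smoker
    return 1.0 - S0_10YR ** math.exp(lp - LP_MEAN)

# Risk rises monotonically with each factor:
print(round(ten_year_risk(50, 120, 200, 0), 4))
print(round(ten_year_risk(60, 140, 240, 1), 4))
```

The population-specific pieces (S0, the coefficients, the mean linear predictor) are exactly what SCORE and QRISK2 re-estimate for their target populations, which is why a score calibrated on one cohort can under- or overestimate risk in another.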
A comparative study for some atherogenic indices in sera of... (Alexander Decker)
This document summarizes a study that evaluated lipid profiles and atherogenic indices in patients with myocardial infarction (MI), ischemic heart disease (IHD), and a control group. The study found:
1) Levels of total cholesterol and triglycerides did not significantly differ between MI/IHD patients and controls, while LDL cholesterol was significantly higher in MI/IHD patients.
2) VLDL cholesterol was significantly lower in MI patients and lower in IHD patients compared to controls.
3) Atherogenic indices like cardiogenic risk ratio, atherogenic coefficient, and atherogenic index of plasma were significantly higher in MI patients compared to controls, but only non-significantly higher in IHD patients.
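The indices named in this abstract have widely used textbook definitions, sketched below (units assumed mmol/L; the lipid panel values are illustrative, not the study's data):

```python
import math

def cardiogenic_risk_ratio(tc, hdl):
    """Castelli risk index I: total cholesterol over HDL-C."""
    return tc / hdl

def atherogenic_coefficient(tc, hdl):
    """Non-HDL cholesterol relative to HDL-C."""
    return (tc - hdl) / hdl

def atherogenic_index_of_plasma(tg, hdl):
    """AIP = log10(TG / HDL-C), both in mmol/L."""
    return math.log10(tg / hdl)

tc, tg, hdl = 5.2, 1.7, 1.0          # illustrative lipid panel
print(cardiogenic_risk_ratio(tc, hdl))
print(atherogenic_coefficient(tc, hdl))
print(round(atherogenic_index_of_plasma(tg, hdl), 3))
```

Because all three indices divide by HDL-C, a low HDL-C inflates them even when total cholesterol and triglycerides are unremarkable, which is why such ratios can discriminate MI patients from controls when single lipid levels do not.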
16 cardiovascular disease prevention 01 2003 estimation of ten-year risk of f... (YassinHasan)
The SCORE project developed a risk scoring system to estimate 10-year risk of fatal cardiovascular disease in Europe. It analyzed data from over 200,000 individuals across 12 European cohort studies, following participants for up to 30 years. Separate risk estimation models were created for regions of high and low cardiovascular risk. The models estimate risk based on age, sex, smoking status, total cholesterol, and total cholesterol/HDL cholesterol ratio. Risk charts were produced to facilitate clinical use of the risk estimation system.
Modeling of Longitudinal Pulse Rate, Respiratory Rate and Blood Pressure Meas... (Premier Publishers)
Congestive heart failure (CHF) is a chronic condition in which the heart muscle becomes too damaged to pump blood around the body adequately. The main objective of this study was to model longitudinal pulse rate, respiratory rate, and blood pressure measurements from congestive heart failure patients under follow-up at Tikur Anbessa Specialized Hospital. This retrospective cohort study was based on secondary data obtained from Tikur Anbessa Specialized Hospital. Linear mixed models for longitudinal data were applied to identify risk factors and to compare the efficiency of the models. Fit statistics showed that the joint model fitted the data better than the separate models, implying a significant association between the two endpoints. Based on the joint model, diagnosis history, family history, NYHA class, and time were significant factors for SBP, while age, weight, sex, family history, NYHA class, and time were significant for DBP, at the 5% level of significance. The joint model also suggested a strong and slowly increasing association between the evolutions of PR and RR, and between SBP and DBP. Joint modeling is therefore recommended for analyzing multivariate longitudinal response variables together.
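The core structure such longitudinal models assume is repeated measurements per patient with patient-specific random effects. A minimal numerical sketch of a random-intercept model for SBP over time, using simulated data and a simple within-patient slope estimate (the study itself fit full linear mixed models; this only illustrates the idea):

```python
import numpy as np

# Simulate 50 patients, 6 visits each: SBP = patient intercept + 1.5*time + noise.
rng = np.random.default_rng(1)
n_patients, n_visits = 50, 6
time = np.tile(np.arange(n_visits), n_patients)
patient = np.repeat(np.arange(n_patients), n_visits)

intercepts = 130 + rng.normal(0, 8, n_patients)   # random intercept per patient
sbp = intercepts[patient] + 1.5 * time + rng.normal(0, 3, time.size)

# Demeaning within patient removes the random intercepts, leaving the time slope.
patient_means = np.bincount(patient, sbp) / n_visits
sbp_dm = sbp - patient_means[patient]
t_dm = time - time.mean()                         # same as within-patient demeaning here
slope = (t_dm * sbp_dm).sum() / (t_dm ** 2).sum()
print(round(slope, 2))                            # close to the true time effect of 1.5
```

A joint model goes further by letting the random effects of the two outcomes (e.g. SBP and DBP) correlate, which is what the fit statistics in the study compared against fitting each outcome separately.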
11 economic evidence of interventions for acute myocardial infarction a revie... (NPSAIC)
The document reviews 14 studies that evaluate the cost-effectiveness of primary angioplasty (PPCI) versus thrombolysis (TL) for treating acute myocardial infarction (AMI). Most studies found PPCI to be cost-effective compared to TL, though some found it cost-saving or cost-neutral. The cost-effective evidence is mainly from randomized controlled trials with strict inclusion criteria and established catheter labs, which may limit generalizability. More real-world analyses evaluating multiple strategies for delivering PPCI are needed to better inform policymakers.
This systematic literature review analyzed 15 studies on the costs of adverse events from cancer treatment in the US. The studies estimated costs for a variety of adverse events including neutropenia, thrombocytopenia, vomiting, nausea, peripheral neuropathy, sepsis, diarrhea, and fatigue. Costs were reported per patient, per event, and per year. Inpatient costs ranged from $6,000 to $48,000, while outpatient costs ranged from $213 to $9,800 per year. Total annual healthcare costs per patient ranged from $15,000 to $21,000. Neutropenia was the most commonly studied adverse event. The studies found that cancer treatment adverse events pose a significant economic burden on both patients and the healthcare system.
This study assessed the impact of austerity measures in Greece on survival rates of out-of-hospital cardiac arrest victims. Data was collected from a Greek hospital on immediate and 24-hour survival pre-crisis (2007-2010) and during the crisis (2011-2014). Results showed no significant difference in return of spontaneous circulation or 24-hour survival between the two periods, suggesting healthcare workers were working hard to maintain standards despite budget cuts. However, overall survival rates remained low compared to international studies, highlighting the need for more Greek data on cardiac arrests.
Study on achievement of target LDL-C in Dyslipidemic patients (pharmaindexing)
This study analyzed 80 dyslipidemic patients to assess achievement of target LDL-C levels as recommended by ATP III guidelines. The majority of patients had high LDL-C levels and were receiving statin therapy. Based on risk factors, patients were categorized as CHD, high risk non-CHD, or low risk non-CHD. Only 22.5% of patients achieved their target LDL-C levels, which was unsatisfactory. More aggressive lipid management is needed to help more patients reach targets through interventions like pharmacist counseling and medication adjustments.
Aim: to conduct a comparative analysis of inflammatory markers in patients with stable and unstable coronary heart disease. Methods: 78 patients aged 36 to 75 years were enrolled in this study (mean age 58.2±12.6 years). Laboratory and instrumental data were obtained and assessed. IL-6 and TNF-α in blood plasma were measured by enzyme immunoassay on a «Humareader Single» solid-phase analyzer. Statistical processing of the results was carried out on an IBM PC using variational statistics methods recommended for biomedical research. Results: In patients with unstable angina (UA), elevated levels of CRP, TNF-α, and leukocytes occurred significantly more often than in the group with stable ischemic heart disease (P<0.05). The mean levels of these markers were also significantly higher in patients with UA than in patients with the stable form of coronary heart disease (CHD): CRP (4.3±2.4 vs 2.9±2.3 mg/L, p<0.05), TNF-α (10.5±2.5 vs 7.7±3.4 pg/mL, p<0.05), and leukocytes (9.2±2.5 vs 6.9±2.3×10⁹/L, p<0.05). Interleukin-6 was higher in patients with UA than in patients with stable angina (SA; 3.4±1.7 vs 2.9±0.5 pg/mL), but the difference was not statistically significant (p>0.05). There were no significant differences in fibrinogen or ESR between patients with UA and SA. Conclusion: Signs of inflammation are detected in patients with both unstable and stable forms of CHD, but the degree of inflammation (TNF-α, CRP, and leukocyte levels) is higher in patients with UA than in patients with stable ischemic heart disease.
Clinical Profile of Acute Coronary Syndrome among Young Adults (Premier Publishers)
Acute Coronary Syndrome accounts for 30% of hospital admissions for cardiovascular diseases. The risk of this syndrome is increasing among younger adults, and a deep insight into the clinical profile of these patients will help in devising a preventive strategy to alleviate the morbidity and mortality due to the syndrome. A cross-sectional study was done among 125 subjects admitted to our tertiary care hospital with Acute Coronary Syndrome. Their risk factors were assessed, and a 12-lead electrocardiogram and 2D echocardiogram were taken. The Cardio III panel, which consists of Troponin I, CK-MB, and BNP measured on a COBAS meter, was also obtained. STEMI was present in 73.6% of the patients, while unstable angina was present in 16%. About 90% of STEMI patients were male and 62% of them were hypertensive. LV ejection fraction <30% was found in 9% of STEMI patients. This study elucidates the need for a primordial prevention strategy against cardiovascular events among young adults, and it revealed a male, urban preponderance towards these events.
Hypertension is commonly associated with other cardiovascular risk factors, such as obesity, diabetes, and dyslipidaemia. The presence of these risk factors and the resulting endothelial dysfunction may play a role in the pathophysiology of hypertension; dyslipidaemia in particular is a strong predictor of cardiovascular disease.
This cross-sectional study was conducted in Shendi locality from February 2011 to July 2012. 100 hypertensive patients aged 40 to 60 years underwent a clinical assessment, which included history-taking (a questionnaire) and clinical examination.
There was a sharp and definite increase in the percentage of patients with total cholesterol >200 mg/dl after four years of diabetes mellitus, from 28-34% to 41%. The percentage of patients with low density lipoprotein >150 mg/dl rose sharply after six years of diabetes mellitus, from 8-9% to 14.2%. The percentage of patients with triglycerides <160 mg/dl also increased after four years of diabetes mellitus, from 53% to 61%.
Increasing lipid abnormality in hypertensive patients is associated with a higher incidence of CAD.
This article compares the efficacy and safety of transcatheter arterial chemoembolization (TACE) and transcatheter arterial chemotherapy infusion (TACI) for treating hepatocellular carcinoma (HCC) based on a systematic review and meta-analysis of 11 clinical studies including over 13,000 patients. The analysis found that TACE was associated with a 23% lower risk of death compared to TACI, as well as a 28% higher disease control rate and 162% higher objective response rate, though only the increase in response rate was statistically significant. Subgroup and sensitivity analyses also indicated survival benefits for TACE in both HBV-dominant and HCV-dominant HCC patient populations.
Effects of aspirin for primary prevention in persons with Diabetes mellitus (Shadab Ahmad)
This study evaluated the effects of low-dose aspirin (100 mg daily) for primary prevention of vascular events in 15,480 adults with diabetes but no known cardiovascular disease. It found that aspirin led to a 12% lower risk of serious vascular events but also a 29% higher risk of major bleeding. The number of vascular events prevented was similar to the number of major bleeding events caused. Therefore, the benefits of aspirin did not clearly outweigh the risks for primary prevention among adults with diabetes but no known cardiovascular disease.
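The "benefits did not clearly outweigh the risks" conclusion is absolute-risk arithmetic: a 12% relative reduction applied to a modest baseline event risk can be offset by a 29% relative increase applied to the bleeding risk. A sketch with placeholder baseline risks (illustrative, not the trial's exact figures):

```python
# Assumed control-group 10-year-ish risks (placeholders for illustration):
base_vascular = 0.096    # serious vascular events without aspirin
base_bleed = 0.032       # major bleeding without aspirin

events_prevented = base_vascular * 0.12   # absolute risk reduction (12% relative)
bleeds_caused = base_bleed * 0.29         # absolute risk increase (29% relative)

per_10000_prevented = round(10000 * events_prevented)
per_10000_caused = round(10000 * bleeds_caused)
print(per_10000_prevented, per_10000_caused)   # comparable magnitudes per 10,000 treated
```

Under these assumed baselines, roughly as many major bleeds are caused as vascular events are prevented per 10,000 people treated, which is the shape of the trade-off the study reports.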
This study analyzed 201 patients with lung cancer who were treated with the immune checkpoint inhibitor nivolumab. The study found that imaging findings of airway obstruction adjacent to lung tumors (IAOT) and a history of radiation pneumonitis were associated with a significantly higher risk of developing interstitial lung disease (ILD) after treatment with nivolumab. Patients with IAOT were more likely to develop severe ILD and had significantly shorter progression-free survival compared to those without IAOT. The presence of IAOT may indicate active lung inflammation that could cause an excessive immune response when treated with immune checkpoint inhibitors.
Journal of Cardiovascular Disorders is a peer-reviewed, open access journal published by Austin Publishers. It provides easy access to high-quality manuscripts covering a wide range of diseases that affect the cardiovascular system, primarily cardiac disease, vascular diseases of vital organs such as the brain and kidney, and peripheral arterial disease. Coverage spans observational and analytic epidemiology, randomized clinical trials, screening, causes, risk factor assessment, age factors, diagnosis, and prognosis, reflecting the entire spectrum of disease from its earliest emergence to the fatal stages.
Austin Publishing Group hosts more than one hundred peer-reviewed, open access journals across various fields of science and technology, with the intent to bridge the gap between academia and research access.
Journal of Cardiovascular Disorders accepts original research articles, review articles, case reports, clinical images, mini reviews, rapid communication, opinions and editorials on all the related aspects of diseases that affect the cardiovascular system, primarily cardiac disease, vascular diseases of the vital organs such as brain and kidney, and peripheral arterial disease.
Short Term Outcomes after Use of Intracardiac Bone Stem Cell Transplantation ... (crimsonpublishersOJCHD)
Short Term Outcomes after Use of Intracardiac Bone Stem Cell Transplantation for Management of Heart Failure: A Meta-Analysis by Rohit SL in Open Journal of Cardiology & Heart Diseases
This systematic review and meta-analysis analyzed 19 studies comprising over 326,000 patients to investigate the effect of digoxin use on mortality in patients with atrial fibrillation (AF) or congestive heart failure (CHF). The analysis found that digoxin use was associated with an increased relative risk of all-cause mortality, with a 29% higher mortality risk seen in AF patients and a 14% higher risk in CHF patients. The results suggest that digoxin therapy may increase mortality risk, particularly in patients with AF. However, significant heterogeneity was seen between the individual studies.
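Pooled relative risks like the 29% and 14% figures above are typically produced by inverse-variance weighting of the per-study log relative risks. A sketch of fixed-effect pooling with made-up study inputs (not the digoxin studies' actual estimates; a real analysis with the heterogeneity noted here would likely use a random-effects model):

```python
import math

# Made-up per-study inputs: (relative risk, 95% CI lower, 95% CI upper).
studies = [
    (1.35, 1.10, 1.66),
    (1.20, 0.95, 1.52),
    (1.28, 1.05, 1.56),
]

log_rrs, weights = [], []
for rr, lo, hi in studies:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from CI width
    log_rrs.append(math.log(rr))
    weights.append(1 / se ** 2)                      # inverse-variance weight

pooled = math.exp(sum(w * l for w, l in zip(weights, log_rrs)) / sum(weights))
print(round(pooled, 3))   # pooled RR lies among the individual study estimates
```

Tighter confidence intervals yield larger weights, so big registry studies dominate the pooled estimate; that is also why heterogeneity between studies, as flagged in this meta-analysis, matters for how the pooled number is interpreted.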
Physical Activity and White Matter Hyperintensities, A Systematic Review of Q... (Emily Strack)
Physical activity and white matter hyperintensities: A systematic review of quantitative studies
This systematic review examined 12 quantitative studies on the association between physical activity and white matter hyperintensities (WMH). Six studies found that greater physical activity was associated with less WMH, while six found no association. Longitudinal studies and those accounting for lifetime physical activity were more likely to find an association. Greater physical activity was associated with less WMH in individuals without advanced disease.
Aspirin as Prevention Therapy for Cardiovascular Events in patients with Diab... (Stefania Dumitrescu)
The Role of Aspirin in the Primary Prevention of Cardiovascular Disease in Patients with Diabetes, especially T2DM: Current Knowledge and Recommendations
1) A meta-analysis of 27 randomized trials found that treating anemia in chronic kidney disease patients with erythropoiesis-stimulating agents (ESAs) to achieve higher hemoglobin levels increased risks of stroke, worsening hypertension, and vascular access thrombosis compared to lower hemoglobin targets.
2) No significant differences were found between higher and lower hemoglobin targets for risks of death, cardiovascular events, or progression to kidney failure requiring dialysis.
3) While higher hemoglobin targets reduced need for blood transfusions, they increased use of intravenous iron therapy.
This study assessed the impact of austerity measures in Greece on survival rates of out-of-hospital cardiac arrest victims. Data was collected from a Greek hospital on immediate and 24-hour survival pre-crisis (2007-2010) and during the crisis (2011-2014). Results showed no significant difference in return of spontaneous circulation or 24-hour survival between the two periods, suggesting healthcare workers were working hard to maintain standards despite budget cuts. However, overall survival rates remained low compared to international studies, highlighting the need for more Greek data on cardiac arrests.
Study on achievement of target LDC-C in Dyslipidimic patientspharmaindexing
This study analyzed 80 dyslipidemic patients to assess achievement of target LDL-C levels as recommended by ATP III guidelines. The majority of patients had high LDL-C levels and were receiving statin therapy. Based on risk factors, patients were categorized as CHD, high risk non-CHD, or low risk non-CHD. Only 22.5% of patients achieved their target LDL-C levels, which was unsatisfactory. More aggressive lipid management is needed to help more patients reach targets through interventions like pharmacist counseling and medication adjustments.
Aim: of the study was to conduct a comparative analysis of inflammatory markers in patients with coronary heart disease of stable and unstable flow. Methods: 78 patients aged 36 to 75 years were enrolled in this study (mean age 58.2±12.6 years). Laboratory and instrumental data were obtained and assessed. IL-6, TNF-α in blood plasma was carried out by the method of enzyme immunoassay on a solid-phase analyzer «Humareader Single». Statistical processing of the obtained results was carried out using vibrational statistics methods recommended for biomedical research on the IBM PC AT Pentium IV. Results: In patients with unstable angina (UA), the frequency of elevated levels of CRP, TNF-α, and leukocytes was statistically significantly higher than in the group with stable ischemic heart disease (P<0.05). The mean levels of these markers were statistically significantly higher in patients with UA compared with patients with stable form of coronary heart disease (CHD, P<0.05): CRP (4.3 ± 2.4 and 2.9 ± 2.3 mg / L, p <0.05, respectively), TNF-α (10.5 ± 2.5 and 7.7 ± 3.4 pg / ml, p <0.05) and leukocytes (9.2 ± 2.5 6.9 ± 2.3x109 / l, p <0.05). The level of interleukin-6 in patients with UA was higher in comparison with patients with stable angina (SA, 3.4 ± 1.7 and 2.9 ± 0.5 pg/ml), but the difference was statistically not significant (p> 0.05 ). There were no significant differences in the level of fibrinogen and ESR between patients with UA and SA. Conclusion: It was noted that the signs of inflammation are detected both in patients with unstable forms and in patients with stable form of CHD, but the degree of inflammation in patients with UA (level of TNF-α, CRP and leukocytes) is higher than in patients with stable ischemic heart disease.
Clinical Profile of Acute Coronary Syndrome among Young AdultsPremier Publishers
Acute Coronary Syndrome accounts for 30% of hospital admissions with cardiovascular diseases. The risk of this syndrome is increasing among the younger adults, and a deep insight into the clinical profile among these patients will help in devising a preventive strategy, in order to alleviate the morbidity and mortality due to the syndrome. A cross sectional study was done among 125 subjects admitted to our tertiary care hospital with Acute Coronary Syndrome. Their risk factors were assessed and a 12 Lead electrocardiogram and 2D Echocardiogram were taken. Cardio III panel which consists of Troponin I, CK MB, BNP by COBAS meter machine was also measured. STEMI was present in 73.6% of the patients, while unstable angina was present in 16%. About 90% of STEMI patients were males and 62% of them were hypertensives. LV Ejection Fraction <30% was found in 9% of STEMI patients. This study elucidates the need for a preventive strategy for primordial prevention of cardiovascular events among young adults. The study envisaged the male, urban preponderance towards these events.
Hypertension is commonly associated with other cardiovascular risk factors, such as obesity, diabetes, and dyslipidaemia. The presence of these cardiovascular risk factors and the resulting endothelial dysfunction may play a role in the pathophysiology of hypertension. Dyslipidaemia, a strong predictor of cardiovascular disease.
This cross-sectional study was conducted at Shendi locality from February 2011 to July 2012. The patients underwent a clinical assessment, which included history (a questionnaire) and clinical examination. 100 hypertensive patients. The age limits was 40 to 60 years.
There was sharp and definite increase in the percentage of patients having >200mg/dl total cholesterol after four years of diabetes mellitus from (28-34%) to (41%). There was a sharp increase in the percentage of patients having >150mg/dl of low density lipoproteins after 6 years of diabetes mellitus from ( 8 - 9 )% to (14.2%). There was also an increase in the percentage of patients having <160mg/dl of triglycerides after four years of diabetes mellitus from 53% to 61% of diabetes.
Increasing lipid abnormality of hypertensive is associated with higher incidence of CAD.
This article compares the efficacy and safety of transcatheter arterial chemoembolization (TACE) and transcatheter arterial chemotherapy infusion (TACI) for treating hepatocellular carcinoma (HCC) based on a systematic review and meta-analysis of 11 clinical studies including over 13,000 patients. The analysis found that TACE was associated with a 23% lower risk of death compared to TACI, as well as a 28% higher disease control rate and 162% higher objective response rate, though only the increase in response rate was statistically significant. Subgroup and sensitivity analyses also indicated survival benefits for TACE in both HBV-dominant and HCV-dominant HCC patient populations.
Effects of aspirin for primary prevention in persons with diabetes mellitus
This study evaluated the effects of low-dose aspirin (100 mg daily) for primary prevention of vascular events in 15,480 adults with diabetes but no known cardiovascular disease. It found that aspirin led to a 12% lower risk of serious vascular events but also a 29% higher risk of major bleeding. The number of vascular events prevented was similar to the number of major bleeding events caused. Therefore, the benefits of aspirin did not clearly outweigh the risks for primary prevention among adults with diabetes but no known cardiovascular disease.
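The benefit/harm balance described above can be made concrete with number-needed-to-treat arithmetic. The sketch below uses invented event rates for illustration, not the trial's published figures:

```python
def number_needed(rate_control, rate_treated):
    """Patients who must be treated for one event prevented (or caused):
    the reciprocal of the absolute risk difference."""
    arr = abs(rate_control - rate_treated)
    return 1.0 / arr

# Invented illustrative rates (NOT the trial's published figures):
# serious vascular events: 9.6% on placebo vs 8.5% on aspirin
nnt_benefit = number_needed(0.096, 0.085)
# major bleeding: 3.2% on placebo vs 4.1% on aspirin
nnh_harm = number_needed(0.032, 0.041)

print(round(nnt_benefit))  # patients treated to prevent one vascular event
print(round(nnh_harm))     # patients treated to cause one major bleed
```

When the two numbers are of similar size, as the study reports, the absolute benefit and harm roughly cancel, which is why the authors conclude the benefits did not clearly outweigh the risks.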
This study analyzed 201 patients with lung cancer who were treated with the immune checkpoint inhibitor nivolumab. The study found that imaging findings of airway obstruction adjacent to lung tumors (IAOT) and a history of radiation pneumonitis were associated with a significantly higher risk of developing interstitial lung disease (ILD) after treatment with nivolumab. Patients with IAOT were more likely to develop severe ILD and had significantly shorter progression-free survival compared to those without IAOT. The presence of IAOT may indicate active lung inflammation that could cause an excessive immune response when treated with immune checkpoint inhibitors.
Journal of Cardiovascular Disorders is a peer-reviewed, open access journal published by Austin Publishers. It provides easy access to high-quality manuscripts covering a wide range of diseases that affect the cardiovascular system, primarily cardiac disease, vascular diseases of vital organs such as the brain and kidney, and peripheral arterial disease. Coverage spans observational and analytic epidemiology, randomized clinical trials, screening, causes, risk factor assessment, age factors, diagnosis, and prognosis, across the entire spectrum of disease from its earliest emergence to its fatal stages.
Austin Publishing Group hosts more than a hundred peer-reviewed, open access journals in various fields of science and technology, with the intent of bridging the gap between academia and research access.
Journal of Cardiovascular Disorders accepts original research articles, review articles, case reports, clinical images, mini reviews, rapid communication, opinions and editorials on all the related aspects of diseases that affect the cardiovascular system, primarily cardiac disease, vascular diseases of the vital organs such as brain and kidney, and peripheral arterial disease.
Short Term Outcomes after Use of Intracardiac Bone Stem Cell Transplantation for Management of Heart Failure: A Meta-Analysis by Rohit SL in Open Journal of Cardiology & Heart Diseases
This systematic review and meta-analysis analyzed 19 studies comprising over 326,000 patients to investigate the effect of digoxin use on mortality in patients with atrial fibrillation (AF) or congestive heart failure (CHF). The analysis found that digoxin use was associated with an increased relative risk of all-cause mortality, with a 29% higher mortality risk seen in AF patients and a 14% higher risk in CHF patients. The results suggest that digoxin therapy may increase mortality risk, particularly in patients with AF. However, significant heterogeneity was seen between the individual studies.
IOSR journal of VLSI and Signal Processing (IOSRJVSP) is an open access journal that publishes articles which contribute new results in all areas of VLSI Design & Signal Processing. The goal of this journal is to bring together researchers and practitioners from academia and industry to focus on advanced VLSI Design & Signal Processing concepts and establishing new collaborations in these areas.
Physical activity and white matter hyperintensities: A systematic review of quantitative studies
This systematic review examined 12 quantitative studies on the association between physical activity and white matter hyperintensities (WMH). Six studies found that greater physical activity was associated with less WMH, while six found no association. Longitudinal studies and those accounting for lifetime physical activity were more likely to find an association. Greater physical activity was associated with less WMH in individuals without advanced disease.
The Role of Aspirin in the primary prevention of cardiovascular disease in patients with diabetes, especially T2DM - current knowledge and recommendations -
1) A meta-analysis of 27 randomized trials found that treating anemia in chronic kidney disease patients with erythropoiesis-stimulating agents (ESAs) to achieve higher hemoglobin levels increased risks of stroke, worsening hypertension, and vascular access thrombosis compared to lower hemoglobin targets.
2) No significant differences were found between higher and lower hemoglobin targets for risks of death, cardiovascular events, or progression to kidney failure requiring dialysis.
3) While higher hemoglobin targets reduced need for blood transfusions, they increased use of intravenous iron therapy.
Cemig plans to install photovoltaic panels on the roof of the Mineirão stadium to generate 1.5 MW of power, enough to supply 1,200 homes. The energy generated will be fed into the grid, and the stadium will receive a discount on its electricity bill in exchange for ceding 10% of the energy produced. Installing the panels will cost R$10 million and is part of a program to install solar plants in Confederations Cup and World Cup stadiums.
Hybrid hosting combines cloud and dedicated hosting to maximize resource utilization. It uses cloud hosting for workloads that benefit from cloud flexibility and scalability, while dedicated hosting is used for workloads that are not well-suited to the cloud. A qualified assessment should be done before dismissing the cloud as unfeasible for any workload.
The document discusses two ancillary tasks and a main product created for a coursework assignment on designing newspaper advertisements. For the first ancillary task, the author created a billboard advertisement for the newspaper following conventions of minimal text, large font, and inclusion of the masthead. For the second ancillary task, the author created a radio advertisement lasting 20-30 seconds with a music bed, catchy slogan, and script using rhetorical questions. The main product was a newspaper with consistent branding across the front page and second page, using the same colors, masthead, and tagline as the other products to create synergy.
Expense Reduction Analysts helped TaylorMade-Adidas Golf Company reduce costs across multiple expenditure categories. A dedicated specialist from Expense Reduction Analysts reviewed each category, identifying savings of 22% in contract cleaning and 23% total across all categories. They introduced new suppliers and negotiated better prices and service levels with existing suppliers. They also helped implement a more cost-conscious culture within the company. Similarly, Expense Reduction Analysts helped F Hinds identify savings across stationery (19%), couriers (30%), and other expenditure areas through negotiating rates and operational changes while maintaining service quality.
Brady James Murphy suffered severe injuries after being electrocuted while working on a power line in 2009. He had to have both arms amputated but has shown remarkable resilience and positivity in his recovery. He inspires his step-sister and broke records at the amputee Olympics. The author's goal is to become an elementary school teacher, especially of kindergarten, and in 10 years she envisions being married with children, working as a teacher in Arizona.
Our company is looking for fresh project ideas and partners. We are a top-10 content provider in Russia with experience processing peak traffic events and working with foreign companies. We have an IVR platform that can handle 1500 simultaneous calls and 150 voice applications running in parallel, along with SMS and call center services. Please contact us to discuss partnership opportunities.
National web archiving in Australia by Dr Paul Koerbin
1) Provides a brief history of the PANDORA Web Archive project at the National Library of Australia from 1996 to present, including major milestones and growth.
2) Describes current web archiving activities at the NLA, including collections, staffing, collaboration partners, and workflows using the PANDAS system.
3) Discusses future challenges for web archiving at NLA related to increasing scale, complexity of content, technology changes, and access/discovery of archived web materials.
Evolving Towards an Integrated Segmentation Approach
The document discusses different types of customer segmentation approaches and argues that an integrated approach is best. It describes four main types of segmentation: lifestyle, product usage, needs-based, and value-based. On a macro level, needs-based segmentation can identify broad customer segments for targeting and strategy. Value-based segmentation helps define service levels and prioritize retention efforts. Usage and lifestyle segmentations are best used on a micro level for tactics like promotions and communications. An integrated approach aligns these segmentations across the customer lifecycle for optimal marketing capabilities.
The document discusses future classroom learning spaces and eLearning strategies. It describes how personalization, collaboration, mobile technology, and rich media will converge to create transformative learning experiences through virtual and remote capabilities. It also lists tools like podcast services and media aggregators that can help make eLearning more sustainable and scalable. The document emphasizes developing eLearning strategies through gradual upskilling and focusing on return on investment, participation, interaction, usability, and lifelong learning.
The student outlines their plans for a two-page newspaper focusing on local stories from Manchester. They will use red and white colors representing Manchester along with black text. The main story will feature a local hero who rescued a child from a fire using a large mid-shot image. A second story will focus on Manchester United and the potential renaming of Old Trafford stadium, using images of the stadium and fans. Two smaller stories will cover a local armed robbery and a missing girl from Manchester.
Project presented at the .idea session of Ecotendències CosmoCaixa on 11 June. More info: http://www.ecotendenciescosmocaixa.org/ca/web/eco/sesion/3/2/401/804/11-juny-2013-Idea
This document discusses viewing businesses and human resource management systems through the lens of cognitive modeling and neural networks. It draws similarities between how a logical management system operates and how the brain functions. The key functions of human resource management like performance management, knowledge management, and health and safety are represented in a neural network structure. How these functions are carried out and interact with each other can be interpreted as organizational "feelings" that impact its development, like being nurtured, lonely, or under stress.
The document presents a series of images of places and landscapes around the world, including snow-covered boats in Germany, gondolas in Venice, a bakery in Paris, the "abyss" in Venezuela that inspired Sir Arthur Conan Doyle, snow-covered trees in the mountains of Franconia, a geothermal spa in Iceland, rafters in Arizona's Grand Canyon, the Angkor Wat temple in Cambodia, dunes in the Sahara Desert in Libya, and the city
This document contains an alphabetical list of geography terms for the second year of bachillerato (upper secondary school). Each term is assigned to a different student for explanation. Topics include hydrological, urban, demographic, agricultural, geological, and territorial concepts.
The document discusses solar energy as a renewable energy source. It explains that the sun is a star that provides light and heat to the Earth, which can be harnessed through solar collectors and photovoltaic panels. It also describes the advantages of solar energy, such as being clean and renewable, and disadvantages such as output varying with the weather.
The Role of Genetic Factors in Hypertension among Iraqi Citizens
This document summarizes a study examining the role of genetic factors in hypertension among Iraqi citizens. The study included 140 patients divided into a case group of 120 hypertensive patients and a control group of 30 normotensive patients. Data on demographics, family history, blood pressure, and other medical variables were collected and analyzed. Statistical analysis found a significant relationship between genetic factors and hypertension, with a p-value of 0.001. Patients with a positive family history of hypertension in a first-degree relative had 3.98 times higher odds of having hypertension themselves. The study concluded that genetics play an important role in hypertension risk among Iraqi citizens.
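The odds ratio of 3.98 reported above is the kind of quantity computed from a 2×2 exposure-by-outcome table. A minimal sketch with invented counts (not the study's data):

```python
def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Odds ratio from a 2x2 table: (a/b) / (c/d) = a*d / (b*c)."""
    return (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)

# Invented counts: family history of hypertension vs hypertensive status
or_family_history = odds_ratio(80, 10, 40, 20)
print(or_family_history)  # 4.0
```

An odds ratio above 1 means the exposure (here, a positive family history) is associated with higher odds of the outcome; the study's significance test (p = 0.001) addresses whether that excess could be due to chance.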
The Indian Consensus Document on Cardiac Biomarkers
Despite recent advances, the diagnosis and management of heart failure continue to elude clinicians. The etiology of congestive heart failure (CHF) in the Indian scenario comprises coronary artery disease, diabetes mellitus, and hypertension. With better insight into the pathophysiology of CHF, biomarkers have evolved rapidly and acquired diagnostic and prognostic value. In CHF, biomarkers serve as measures of the extent of pathophysiological derangement; examples include biomarkers of myocyte necrosis, myocardial remodeling, neurohormonal activation, etc.
This document provides information on thyroid conditions. Some key points:
- Thyroid nodules, hypothyroidism, hyperthyroidism, thyroiditis and autoimmune thyroiditis are common thyroid conditions in the US and globally.
- Thyroid conditions affect 5-10 times more women than men. In the US, treatment costs for thyroid disease in women totaled $4.3 billion in 2008.
- Recent research has explored the role of epigenetics in Graves' disease and selenium supplementation for thyroid disorders. Molecular testing of thyroid nodules may help determine cancer risk before surgery.
This document discusses sepsis case studies and the importance of timely diagnosis and treatment of sepsis. It defines sepsis and its criteria according to SIRS (Systemic Inflammatory Response Syndrome). Early diagnosis is important as each hour of delayed treatment can increase mortality rates by 5-10%. While microbiological cultures are traditionally used for definitive diagnosis, they can take 48-72 hours for results. Biomarkers like PCT and CRP can provide faster results and guide early empiric therapy.
This meta-analysis reviewed 15 randomized controlled trials involving over 188,000 participants to determine the effect of antioxidant vitamin supplementation on cardiovascular outcomes. The trials assessed supplements containing vitamin E, beta-carotene, and/or vitamin C compared to placebo. The analysis found that antioxidant vitamin supplementation had no significant effect on major cardiovascular events, myocardial infarction, stroke, total death, cardiac death, or other outcomes. There was no evidence that antioxidant vitamin supplements provide cardiovascular benefits.
A comparative study for some atherogenic indices in sera of …
This document summarizes a study that evaluated lipid profiles and atherogenic indices in patients with myocardial infarction (MI), ischemic heart disease (IHD), and a control group. The study found:
1) Levels of total cholesterol and triglycerides did not significantly differ between MI/IHD patients and controls, while LDL cholesterol was significantly higher and VLDL cholesterol was significantly lower in MI/IHD patients.
2) Atherogenic indices like cardiogenic risk ratio (CRR), atherogenic coefficient (AC), and atherogenic index of plasma (AIP) were significantly higher in MI patients compared to controls, but only non-significantly higher in IHD patients.
3) Elevated
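The indices named above are simple functions of the lipid profile; a sketch using their textbook definitions (CRR = TC/HDL, AC = (TC − HDL)/HDL, AIP = log10(TG/HDL)), with invented values rather than the study's data:

```python
import math

def atherogenic_indices(tc, tg, hdl):
    """Standard atherogenic indices from a lipid profile.

    CRR and AC are unit-free ratios; AIP is conventionally computed
    with TG and HDL in mmol/L, so mmol/L values are used throughout.
    """
    crr = tc / hdl             # cardiogenic (Castelli) risk ratio
    ac = (tc - hdl) / hdl      # atherogenic coefficient = CRR - 1
    aip = math.log10(tg / hdl) # atherogenic index of plasma
    return crr, ac, aip

# Invented lipid panel in mmol/L: TC 5.2, TG 1.7, HDL 1.3
crr, ac, aip = atherogenic_indices(tc=5.2, tg=1.7, hdl=1.3)
print(round(crr, 2), round(ac, 2), round(aip, 3))
```

Note that AC is always exactly CRR minus one, which is why studies that report both tend to find them significant (or not) together.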
Physical activity and risk of cardiovascular disease—a meta-analysis
High levels of leisure-time physical activity and moderate levels of occupational physical activity are associated with a 20–30% lower risk of cardiovascular disease among both men and women. The meta-analysis included 21 prospective cohort studies with over 650,000 participants followed for an average of 10 years; the lower risk applied to both coronary heart disease and stroke. No evidence of publication bias was found across the studies.
This document outlines a study on jointly modeling multivariate longitudinal measures of hypertension (blood pressure and pulse rate) and time to develop cardiovascular disease among hypertensive outpatients in Ethiopia. The study aims to identify factors affecting changes in blood pressure and pulse rate over time as well as time to develop cardiovascular complications. The study will collect longitudinal data on blood pressure, pulse rate and cardiovascular events from 178 hypertensive patients and analyze it using joint longitudinal-survival models. Preliminary results show changes in blood pressure and pulse rate over time differ between patients who did and did not develop cardiovascular events. Key factors like diabetes, family history of hypertension and clinical stage of hypertension affect both longitudinal outcomes and survival.
Repurposing large datasets for exposomic discovery in disease
This document discusses the need for large-scale studies of environmental exposures (the exposome) analogous to genome-wide association studies (GWAS) to better understand environmental contributions to health and disease. It notes that while genetics research has made great strides with GWAS, understanding of environmental influences lacks comparable methods and data. The author argues that characterizing the exposome through high-throughput methods could discover new environmental factors in phenotypes, as GWAS did for genetics. National Health and Nutrition Examination Survey is presented as a model for collecting comprehensive human exposure data on a large scale.
This study analyzed epidemiological data on hypertension collected from 53 patients at a tertiary hospital in India. The results showed that hypertension was more prevalent in males than females, and most common in the 40-60 year old age group. Risk factors like urban living, lower education, higher BMI, smoking, drinking, sedentary lifestyle and comorbid conditions were associated with higher rates of hypertension. The most commonly prescribed medication for hypertension was a combination of atenolol and amlodipine.
INTERHEART: modifiable risk factors in myocardial infarction (2004)
This document summarizes the objectives and methods of the INTERHEART study, a large international case-control study designed to assess the importance of cardiovascular risk factors worldwide. The study aimed to enroll approximately 15,000 cases of acute myocardial infarction and a similar number of controls from 52 countries representing all inhabited continents. The study investigated the association between nine modifiable risk factors (smoking, lipids, hypertension, diabetes, obesity, diet, physical activity, alcohol consumption, psychosocial factors) and the risk of myocardial infarction. Standardized questionnaires and physical examinations were used to collect information from all participants. Blood samples were also collected to analyze lipid levels. The results of this large, global study could help determine whether cardiovascular risk factors have similar or different effects across regions and populations.
Awareness of Hypertension and Adult Education as a Preventive Measure among Adults in Ikereku Community of Akinyele Local Government Area, Oyo State, Nigeria
This study investigated the awareness of hypertension, and adult education as a preventive measure, among adults in the Ikereku community of Akinyele Local Government Area, Oyo State. Its purpose was to establish the level and rate of awareness of hypertension among adults in Ikereku and the steps to be taken by individuals and health workers in controlling the disease. Four research questions were generated and two hypotheses were formulated. A descriptive survey design was used, involving male and female adults aged 18 years and above in the community; simple random sampling was used to select 150 participants drawn from across the community. Two self-constructed questionnaires, "Awareness of Hypertension among Adults in Ikereku Community (AHAIC)" and "How Social Health Workers Could Help in Controlling the Disease (HSHWCD)", served as instruments. Data were analyzed using descriptive statistics to answer the research questions, while the hypotheses were tested using Pearson Product Moment Correlation and ANOVA. The findings showed that adults in the Ikereku community were aware of hypertension as a disease, but not all were aware of their own hypertensive status. Findings also revealed that social health workers help control the disease by creating awareness of it, providing free medical care, and other measures. It is therefore recommended that individuals go for regular checkups and adhere strictly to instructions on food consumption.
Dr. Francis O. Olaniyi | Dr. Oyekunle Oyelami, "Awareness of Hypertension and Adult Education as a Preventive Measure among Adults in Ikereku Community of Akinyele Local Government Area, Oyo State, Nigeria", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-3, Issue-5, August 2019. URL: https://www.ijtsrd.com/papers/ijtsrd26551.pdf Paper URL: https://www.ijtsrd.com/humanities-and-the-arts/education/26551/awareness-of-hypertension-and-adult-education-as-a-preventive-measure-among-adults-in-ikereku-community-of-akinyele-local-government-area-oyo-state-nigeria/dr-francis-o-olaniyi
Statistical analysis of risk factors associated with gallstone disease
Gallstones are crystal-like collections formed from the merging of normal and abnormal gallbladder contents. There are usually two types of gallstones: cholesterol stones and pigment stones. The current paper focuses on the symptoms of the disease, its major causes, and the treatments that the majority of patients preferred. For this purpose, a sample of 170 patients from different hospitals in Multan was collected using convenience sampling. The main demographic factors in this study were gender, age group, and marital status of GSD patients. Frequency distributions were formed for these demographic and social factors, and a bar chart was constructed to differentiate by gender, since gender is also an important factor in GSD. For the weight factor, a paired t-test was applied to compare patients' weight before and after treatment. Findings show that 67 percent of patients preferred government hospitals, as many of those suffering from the disease came from rural areas and villages and their incomes could not cover private hospital expenses.
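The paired t-test used for the weight comparison reduces to a simple statistic on the before/after differences; a sketch with invented weights (not the study's data):

```python
import math
from statistics import mean, stdev

def paired_t_statistic(before, after):
    """t statistic for a paired t-test: mean difference divided by the
    standard error of the differences."""
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    se = stdev(diffs) / math.sqrt(n)  # standard error of the mean difference
    return mean(diffs) / se

# Invented weights (kg) before and after treatment
before = [80, 78, 85, 90, 76, 82, 79, 88]
after = [78, 75, 84, 86, 74, 79, 77, 85]
t = paired_t_statistic(before, after)
print(round(t, 2))  # 7.64
# |t| above the two-sided 5% critical value (2.365 for df = 7)
# indicates a significant before/after difference
```

The test is "paired" because each patient serves as their own control, which removes between-patient variability from the comparison.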
This white paper discusses Innoception Technologies' new integrated health platform that combines various technologies to allow customers to track their health online in a dynamic and accurate way. The platform aims to go beyond traditional medicine by revealing a person's optimal physiological baseline and supporting personalized healthcare. A unique collaboration between existing businesses and their proprietary technologies will allow for comprehensive analysis of bio-signals and medical data to better understand health issues and enable early intervention.
The document provides an overview of epidemiology and causal inference. It discusses the difference between using epidemiology to explain mechanisms versus predict outcomes. For explanation, causal diagrams (DAGs) can be used along with subject matter expertise to define exposures, outcomes, confounders, mediators and colliders. Regression analysis is then used to estimate the association between the exposure and outcome while appropriately adjusting for other variables.
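The confounder adjustment described above can be illustrated numerically: a crude association can vanish once a confounder is held fixed. A toy sketch with invented counts, where the outcome risk depends only on the confounder (age), not on the exposure:

```python
def risk_ratio(cases_exp, n_exp, cases_unexp, n_unexp):
    """Risk ratio: incidence in the exposed over incidence in the unexposed."""
    return (cases_exp / n_exp) / (cases_unexp / n_unexp)

# Invented counts. Pooled over everyone, exposure looks harmful ...
crude = risk_ratio(170, 500, 80, 500)   # ~2.125
# ... but within each age stratum the association disappears,
# because the exposed group simply contains more older subjects
rr_young = risk_ratio(10, 100, 40, 400)  # 1.0
rr_old = risk_ratio(160, 400, 40, 100)   # 1.0
print(round(crude, 3), rr_young, rr_old)
```

This is the numerical face of the causal-diagram reasoning in the document: whether to adjust for a variable depends on whether it is a confounder, mediator, or collider, which the regression alone cannot tell you.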
Perspective of Cardiac Troponin and Membrane Potential in People Living with ...
Background: Hypertension is a condition in which the force of the blood against the artery walls is too high, leading to severe health complications and an increased risk of heart disease, stroke, and sometimes death. Aim: This study was carried out to determine the levels of cardiac troponin I and membrane potential in hypertensive subjects in Owerri, Imo State. Materials and Methods: A total of 120 subjects aged 30–70 years were recruited: 60 subjects diagnosed with hypertension and 60 apparently healthy individuals of the same age bracket who served as controls. The levels of cardiac troponin I and membrane potential were analyzed using an enzyme-linked immunosorbent assay technique. Data were assessed using SPSS version 20; P < 0.05 was considered statistically significant. Results: The levels of cardiac troponin I were significantly increased, and the levels of membrane potential significantly decreased, in hypertensive subjects compared with controls (P < 0.05). Conclusion: Increased serum cardiac troponin I and decreased membrane potential may constitute risk factors in patients with hypertension.
The document presents a proposal for a study on assessing knowledge regarding prevention of complications among hypertensive clients in Eastern Nepal. Hypertension is introduced as a major public health problem and precursor to diseases like heart attack, stroke, and kidney disease. The study aims to assess knowledge regarding prevention of complications and its association with demographic variables. It will use a cross-sectional design involving 285 hypertensive clients surveyed using semi-structured questionnaires. Data analysis will include descriptive statistics and chi-square tests. Results could help modify prevention strategies and awareness of risk factors.
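The chi-square tests planned for the association analysis operate on contingency-table counts; a minimal sketch of the 2×2 case with invented counts:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], e.g. knowledge (adequate/inadequate) by sex."""
    n = a + b + c + d
    observed = [a, b, c, d]
    # expected count in each cell = row total * column total / grand total
    expected = [
        (a + b) * (a + c) / n, (a + b) * (b + d) / n,
        (c + d) * (a + c) / n, (c + d) * (b + d) / n,
    ]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Invented counts: adequate knowledge in 60/100 of one group vs 30/100 of another
chi2 = chi_square_2x2(60, 40, 30, 70)
print(round(chi2, 2))  # 18.18, well above the 5% critical value of 3.84 (df = 1)
```

A statistic above the critical value leads to rejecting the null hypothesis that knowledge level is independent of the demographic variable.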
This document proposes methods for generating electricity from speed breakers. It discusses 5 classifications of speed breaker power generators that use different mechanisms: 1) a chain drive mechanism, 2) a rack and pinion system, 3) direct use of the load through a reciprocating device, 4) a translator and stator topology, and 5) a pressure lever mechanism. The document also outlines the advantages of using speed breakers for power generation such as low cost and maintenance and being a renewable source. Some challenges are also noted such as selecting a suitable generator and dealing with rain damage.
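The power available from a speed breaker can be estimated with back-of-the-envelope mechanics; all figures below are assumptions for illustration, not values from the document:

```python
# A vehicle of mass m depresses the breaker plate by height h, releasing
# potential energy m*g*h, of which only a fraction is converted to electricity.
def energy_per_pass(mass_kg, drop_m, efficiency):
    g = 9.81  # gravitational acceleration, m/s^2
    return mass_kg * g * drop_m * efficiency

# Assumed: 1000 kg car, 0.10 m plate travel, 50% conversion efficiency
e_joules = energy_per_pass(1000, 0.10, 0.5)
print(e_joules)  # joules recovered per vehicle pass

# At an assumed 600 vehicles/hour, average power = energy per pass * passes per second
power_w = e_joules * 600 / 3600
print(power_w)
```

Estimates like this show why speed-breaker generation suits low-power local loads (street lighting, signals) rather than grid-scale supply.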
Cassava waste water was used as an admixture to replace distilled water in ratios of 5%, 10%, 15%, and 20% for producing sandcrete blocks. 60 sandcrete blocks of size 450mm x 150mm x 225mm were produced with the different admixture ratios and a control with 0% admixture. The blocks were cured for 7, 14, 21, and 28 days and then tested for moisture content, specific gravity, water absorption, and compressive strength. Test results showed that blocks with 20% cassava waste water admixture met the minimum compressive strength requirement of 3.30 N/mm2 set by Nigerian standards, indicating the potential of cassava waste water to improve sandcrete block quality.
The document presents a theorem on random fixed points in metric spaces. It begins with introductions to fixed point theory, random fixed point theory, and relevant definitions. The main result is Theorem 3.1, which proves that if a self-mapping E on a complete metric space X satisfies certain contraction conditions involving parameters between 0 and 1, then E has a unique fixed point. The proof constructs a Cauchy sequence that converges to the unique fixed point. The document contributes to the study of random equations and random fixed point theory, which has applications in nonlinear analysis, probability theory, and other fields.
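The contraction condition at the heart of such theorems can be seen numerically: iterating a contraction mapping converges to its unique fixed point (the deterministic Banach setting, a simpler cousin of the randomized result above). A sketch:

```python
import math

def fixed_point(f, x0, tol=1e-10, max_iter=1000):
    """Iterate x -> f(x) until successive values agree within tol;
    for a contraction this converges to the unique fixed point."""
    x = x0
    for _ in range(max_iter):
        nxt = f(x)
        if abs(nxt - x) < tol:
            return nxt
        x = nxt
    raise RuntimeError("did not converge")

# cos is a contraction near its fixed point, so the iteration converges
# to the unique solution of x = cos(x) from any real starting point
x_star = fixed_point(math.cos, 1.0)
print(round(x_star, 6))  # 0.739085
```

The proof in the document follows the same pattern: the contraction conditions force the iterates to form a Cauchy sequence, whose limit in the complete metric space is the unique fixed point.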
1. The document discusses applying multi-curve reconstruction technology to seismic inversion to improve accuracy and reliability. It focuses on reconstructing SP and RMN curves from well logs that are affected by various distortions.
2. The process of reconstructing the curves involves removing baseline drift, standardizing values, applying linear filtering, and fitting the curves. This removes interference and retains valid lithological information.
3. Reconstructing high quality curves improves the resolution and credibility of seismic inversion results. The method is shown to effectively predict sand distribution with little error.
This document compares the performance of a Minimum-Mean-Square-Error (MMSE) adaptive receiver and a conventional Rake receiver for receiving Ultra-Wideband (UWB) signals over a multipath fading channel. It first describes the UWB pulse shapes and channel model used, including the 6th derivative of the Gaussian pulse and the IEEE 802.15.3a modified Saleh-Valenzuela channel model. It then discusses the Direct-Sequence and Time-Hopping transmission and multiple access schemes for UWB. The document presents the receiver structures for the MMSE adaptive receiver and Rake receiver and compares their performance using MATLAB simulations.
This document summarizes a study on establishing logging interpretation models for reservoir parameters like porosity, permeability, oil saturation, and gas saturation in the Gaotaizi Reservoir of the L Oilfield. Models were developed using core data from 4 wells and include:
1) A porosity model relating acoustic travel time to porosity with an error of 0.92%
2) A permeability model relating permeability to porosity with an error of 0.31%
3) An oil saturation model using resistivity data with empirically determined parameters
4) A method to determine original gas saturation from mercury injection data.
Application of the models improved interpretation precision and allowed recalculation of oil and gas reserves for the reservoir.
This document discusses predicting spam videos on social media platforms using machine learning. It proposes using attributes like number of likes, comments, and view count to classify videos as spam or not spam. A predictive algorithm is developed that uses threshold values for attributes and natural language processing of comments to classify videos. Testing of the algorithm on a dataset achieved a spam prediction precision of 93.6%. Issues with small datasets decreasing accuracy are also discussed, along with continuing work to address this issue.
1) The study experimentally evaluated the compatibility relationship between polymer solutions and oil layers through core flooding tests with different permeability cores.
2) The results showed that injection rate decreased with increasing polymer concentration and molecular weight, and increased with permeability.
3) Based on the results, boundaries for injection capability were established and a compatibility chart was proposed to guide polymer solution selection for different sedimentary microfacies in the field based on permeability and pore size.
1. The document discusses the identification of lithologic traps in the D3 Member of the Gaonan Region using seismic attribute analysis, acoustic impedance inversion, and sedimentary microfacies analysis.
2. Several lithologic traps were identified in the I and II oil groups of the D3 Member, with the largest trap located between wells G46 and G146X1 covering an area of about 2.35 km2.
3. Impedance inversion, seismic attribute analysis, and sedimentary microfacies characterization using 3D seismic data helped determine the location and development of effective lithologic traps in the thin sandstone-shale interbeds of the target stratum.
This document examines using coal ash as a partial replacement for cement in concrete. Coal ash was substituted for cement at rates of 5%, 10%, and 15% by weight. Testing found that concrete with a 5% substitution of coal ash exhibited only a slight decrease in compressive strength of 2% at 28 days while gaining improved workability. Higher substitution rates of 10% and 15% coal ash led to greater decreases in compressive and tensile strength. The study concludes that a 5% substitution of coal ash for cement provides benefits of reduced cost and improved workability with minimal strength impacts, representing an effective use of a waste material that addresses sustainability.
Accounting professional judgment involves handling accounting events and compiling financial reports according to regulations and standards. However, professional judgment is sometimes manipulated to distort accounting information. The document discusses three ways manipulation occurs: 1) abandoning accounting principles, 2) optional changes to accounting policies, and 3) abuse of accounting estimates. The causes of manipulation include distorted motivations from corporate governance issues and catering to various stakeholder interests. Strengthening supervision and improving the accounting system are proposed to manage manipulation of professional judgment.
The document discusses research on the distribution of oil and water in the eastern block of the Chao202-2 area in China. It establishes standards for identifying oil, poor oil, dry, and water layers using well logging data. Analysis shows structural reservoirs are dominant and fault and sand body configuration control oil-water distribution. Oil-water distribution varies between fault blocks from "up oil, bottom water" to "up water, bottom oil" depending on structure and sand body development.
The document describes an intelligent fault diagnosis system for reciprocating pumps that uses pressure and flow signals as inputs. It consists of hardware for data acquisition and a software system for signal processing, feature extraction, and fault diagnosis using wavelet neural networks. The system was able to accurately diagnose three main fault types - seal ring faults, valve damage, and spring faults - based on differences observed in the pressure curves. Testing on over 12 samples of each fault type achieved a correct diagnosis rate of over 94%. The system provides a fast and effective means of remotely monitoring reciprocating pumps and identifying faults.
This document discusses the application of meta-learning algorithms in banking sector data mining for fraud detection. It proposes using Classification and Regression Tree (CART), AdaBoost, LogitBoost, Bagging and Dagging algorithms for classification of banking transaction data. The experimental results show that Bagging algorithm has the best performance with the lowest misclassification rate, making it effective for banking fraud detection through data mining. Data mining can help banks detect patterns for applications like credit scoring, payment default prediction, fraud detection and risk management by analyzing customer transaction history and loan details.
This document presents a numerical solution for unsteady heat and mass transfer flow past an infinite vertical plate with variable thermal conductivity, taking into account Dufour number and heat source effects. The governing equations are non-linear and coupled, and were solved numerically using an implicit finite difference scheme. Various parameters, including Dufour number and heat source, were found to influence the velocity, temperature, and concentration profiles. Skin friction, Nusselt number, and Sherwood number were also calculated.
The document discusses methods for obtaining a background image using depth information from a depth camera to more accurately extract foreground objects. It finds that accumulating depth images and taking the median value at each pixel provides the most accurate background image. The accuracy of three methods - average, median, and mode - are evaluated using simulated depth data of a captured plane. The median method provides the best results, followed by average, while mode performs worst. More accumulated images provide a more accurate background image across all methods.
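The median-over-accumulated-frames idea described above can be illustrated with a tiny NumPy sketch (purely synthetic depth values, not the study's data): a few frames contain a transient foreground object, and the per-pixel median recovers the static background more accurately than the per-pixel mean.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical stack of 50 noisy depth frames of a static 4x4 scene.
true_depth = np.full((4, 4), 2.0)                  # metres
frames = true_depth + 0.05 * rng.standard_normal((50, 4, 4))
frames[:5] += 1.0                                  # transient foreground in 5 frames

bg_median = np.median(frames, axis=0)              # per-pixel median over frames
bg_mean = frames.mean(axis=0)                      # per-pixel average, for contrast

# The median largely ignores the transient outlier frames; the mean does not.
print(np.abs(bg_median - true_depth).max() < np.abs(bg_mean - true_depth).max())
```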
This document presents a mathematical model for determining the minimum overtaking sight distance (OSDm) required for an ascending vehicle to safely pass another slower vehicle on a single lane highway with an incline. It defines sight distance, stopping sight distance, perception-reaction time and derives equations to calculate the reaction distance (d1), overtaking distance (d2), vehicle travel distance during overtaking (d3), and total minimum OSDm based on vehicle characteristics, road geometry, and coefficients of friction. The safe overtaking zone is defined as 3 times the minimum OSDm. The model accounts for effects of slope angle and aims to satisfy laws of mechanics for overtaking maneuvers on inclined two-way single lane highways.
This document discusses a novel technique for better analysis of ice properties using Kalman filtering. It summarizes previous research on sea ice segmentation using SAR imagery and dual polarization techniques. It proposes using an automated SAR algorithm along with Kalman filtering to more accurately detect sea ice properties from RADARSAT1 and RADARSAT2 imagery data. The document reviews techniques for image segmentation, dual polarization, PMA detection, and related work on sea ice classification using statistical ice properties, edge preserving region models, and object extraction methods.
This document summarizes a study on the bioaccumulation of heavy metals in bass fish (Morone saxatilis) caught at Rodoni Cape in the Adriatic Sea in Albania. Samples of bass fish were collected from five sites and analyzed for mercury, lead, and cadmium levels in their muscles. The concentrations of heavy metals varied between fish and sites but were below international limits for human consumption. While the fish were found to be safe for eating, the study recommends continuous monitoring of metal levels in fish from the area due to various factors that can influence metal uptake over time.
This document discusses optimal maintenance policies for repairable systems with linearly increasing hazard rates. It considers a system with a constant repair rate and predetermined availability requirement. There are two maintenance policies: corrective maintenance only, and preventive maintenance at set time intervals. The goal is to determine the preventive maintenance interval that guarantees the availability requirement at minimum cost. Equations are developed to calculate the availability under each policy and the optimal preventive maintenance interval based on both availability and cost. A numerical example is provided to demonstrate the decision process in determining the optimal policy.
IOSR Journal of Engineering (IOSRJEN) www.iosrjen.org
ISSN (e): 2250-3021, ISSN (p): 2278-8719
Vol. 04, Issue 10 (October. 2014), ||V3|| PP 33-38
International organization of Scientific Research 33 | P a g e
On Analysis of Essential Causes of Hypertension: A Statistical Approach
Adeniji, J. O., Bello, K. M., Alfa, M. S., Akoh, D. and Hassan, A. S.
Department of Mathematics and Statistics, The Federal Polytechnic, Bida.
Abstract: - The paper focuses on the identification of the most contributory causes of hypertension. The data used are secondary data collected from the records of the University College Hospital (UCH), Ibadan. Principal Component Analysis is applied to the data on essential causes of hypertension. The nine research variables under consideration are Genetics, Smoking, Stress, Old Age, Obesity, Excess Salt, Excess Alcohol, Kidney Diseases and Lack of Exercise, observed over twenty years. The results of the analysis revealed that only four (4) variables, Obesity, Smoking, Excess Salt and Excess Alcohol, accounted for 76.2%, 8.6%, 5.7% and 4.4% respectively (a cumulative 94.8%) of the total variance. This clearly shows that the most contributory factor to hypertension is Obesity, which accounted for 76.2% of the total variance.
Key-words: - Principal Component Analysis (PCA), Hypertension, Obesity, Correlation Matrix, Contributory Causes, Eigenvalue.
I. INTRODUCTION
Generally, in all nations of the world, the essential purpose of all governmental bodies or functions is to promote the health of the entire citizenry. It is obvious that a healthy nation is a wealthy nation. Since the human resources of a country account to a large extent for its productivity, it becomes imperative for its standard of health to be adequately addressed. Certain factors are known to militate against the attainment of a sustainable health standard in a country. These include diseases and improper health policies, among others. Some diseases have been ravaging mankind over the past three to four decades, and hypertension is one of them. Statistics have revealed a disturbing trend in this direction. According to Mulatero and Bertello (2009), hypertension (HTN) or High Blood Pressure (HBP) is a chronic medical condition in which the blood pressure in the arteries is elevated. It is classified as either primary (essential) or secondary. About 90-95% of the cases are termed "primary hypertension", which refers to high blood pressure for which no medical cause can be found. The remaining 5-10% of the cases are termed secondary hypertension, which is caused by other conditions that affect the kidneys, arteries, heart or endocrine system. Persistent hypertension is one of the risk factors for stroke, heart attack, heart failure and arterial aneurysm, and is a leading cause of chronic kidney failure. Even moderate elevation of arterial blood pressure shortens life expectancy. Both dietary and lifestyle changes as well as medicines can improve blood pressure control and reduce the risk of associated health complications (Sacks and Svetkey, 2008).
II. TYPES OF HYPERTENSION
Primary (Essential) Hypertension
According to Ucheyama (2008) and Kite (2008), primary hypertension is the most prevalent hypertension type, affecting 90-95% of hypertension patients. This type of hypertension is a risk factor for hardening of the arteries (atherosclerosis). It also predisposes individuals to heart disease, is a leading cause of death in industrialized countries, and increases the risk of stroke and heart failure.
Secondary Hypertension
Secondary hypertension results from an identifiable cause. This type is important to recognize since it is treated differently from essential hypertension, namely by treating the underlying cause of the elevated blood pressure. Many conditions cause secondary hypertension; some are common and well recognized, such as Cushing's syndrome, a condition in which the adrenal glands overproduce the hormone cortisol (Ucheyama, 2008; Kite, 2008).
III. CAUSES OF HIGH BLOOD PRESSURE
According to Segnella (2009), essential (primary) hypertension is the most prevalent hypertension type. Although no single direct cause has been identified, many factors increase the risk of developing hypertension, such as a sedentary lifestyle, stress, visceral obesity, potassium deficiency (hypokalemia), obesity (more than 85% of cases occur in those with a body mass index greater than 25), salt (sodium) sensitivity, alcohol intake and vitamin D deficiency. Risk also increases with aging, some inherited genetic mutations and a family history of hypertension. An elevation of renin, an enzyme secreted by the kidney, is another risk factor, as is sympathetic nervous system overactivity. Insulin resistance, which is a component of syndrome X or the metabolic syndrome, is also thought to contribute to hypertension.

Consuming food that contains high-fructose corn syrup may increase one's risk of developing hypertension (Piller and Davis, 2007). According to Mulatero and Bertello (2009), recent studies have also implicated low birth weight as a risk factor for adult essential hypertension.

Statistics show that between 2000 and 2008 there was about a forty percent (40%) increase in the number of people with high blood pressure. They further show that in 2005, sixty percent (60%) of people suffering from high blood pressure were also suffering from kidney failure and heart disease, and that high blood pressure was identified as the remote cause of the disease that later led to the death of a large percentage of the patients (Mulatero, 2009).
IV. PRINCIPAL COMPONENT ANALYSIS
According to Jollife (2002), Principal Component Analysis (PCA) is probably the oldest and best known of the techniques of multivariate analysis. The central idea of PCA is to reduce the dimensionality of a data set containing a large number of interrelated (correlated) variables while retaining as much as possible of the variation present in the data set. This is achieved by transforming to a new set of variables, the principal components, which are uncorrelated and ordered so that the first "few" retain most of the variation present in "all" of the original variables.

PCA is a standard tool in modern data analysis in diverse fields because it is a simple parametric method for extracting relevant information from confusing and complex data sets. With minimal effort, PCA provides a roadmap for reducing a complex data set to a lower dimension, revealing the sometimes hidden, simplified structures that often underlie it (Jonathan, 2009).

According to Oyebande (2011), PCA generally entails transforming a set of data into a smaller number of new variables. It is one of the methods for handling the problem of multicollinearity, in which independent variables are highly correlated with each other. Also, in multiple regression where there is a large number of independent variables relative to the sample size, PCA is used to reduce the dimensionality of the original data to a smaller number of independent variables that can be used for the regression.

Edokpayi et al. (2010), cited in Oyebande (2011), used PCA to study data on soil properties in the oil palm belt of Nigeria, in which 12 variables (soil properties) from 58 locations were reduced to 4 components: soil pH, organic matter, H (hydrogen ions) and Fe (iron).

In this study, therefore, the technique is applied to the data on essential causes of hypertension in order to determine which among the essential causes retain most of the variability in the smallest possible number of components. The data used are secondary data collected from the University College Hospital (UCH), Ibadan, Records Department. The data cover 20 years of hypertension cases within the period 1992 to 2011, with nine possible essential causes of hypertension.

The basic idea of the method is to describe the variation of a set of multivariate data in terms of a set of uncorrelated variables, each of which is a particular linear combination of the original variables. The usual objective of this type of analysis is to see whether the first few components account for most of the variation in the data. It is argued that if they do, then they can be used to summarize the data with little loss of information.
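To make this dimensionality-reduction idea concrete, here is a minimal NumPy sketch on entirely hypothetical data standing in for the 20-year, nine-variable hypertension records: a single latent factor drives most of the variables, so the first few principal components capture nearly all of the variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the 20-year, 9-variable data set:
# one latent factor drives most variables, so a few components suffice.
n_years, n_vars = 20, 9
latent = rng.normal(size=(n_years, 1))
X = latent @ rng.normal(size=(1, n_vars)) + 0.3 * rng.normal(size=(n_years, n_vars))

S = np.cov(X, rowvar=False)              # variance-covariance matrix S
eigvals = np.linalg.eigvalsh(S)[::-1]    # eigenvalues, largest first

explained = eigvals / eigvals.sum()      # proportion of variance per component
print(np.cumsum(explained)[:4])          # first few components dominate
```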
The first principal component of the observations is the linear combination, y1, of the original variables,

y1 = a11 x1 + a12 x2 + ... + a1p xp, i.e. y1 = a1'x .........................(1)

whose sample variance is greatest for all coefficients a11, ..., a1p (which we may write as the vector a1). Since the variance of y1 could be increased without limit simply by increasing the elements of a1, a restriction must be placed on these coefficients; a sensible constraint is to require that the sum of squares of the coefficients, a1'a1, should be set at a value of unity.

If, for example, 80 percent of the variation in an investigation involving six variables could be accounted for by a simple weighted average of the variable values, then almost all the variation could be expressed along a single continuum rather than in six-dimensional space. This would provide a highly parsimonious summary of the data that might be useful in later analysis.
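The unit-norm constraint a1'a1 = 1 introduced above is easy to motivate numerically: rescaling the coefficient vector inflates the variance of the combination without changing its direction. A small NumPy sketch with made-up observations:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))          # hypothetical observations, 3 variables
a = np.array([0.5, 0.5, 0.5])          # arbitrary coefficient vector

v1 = np.var(X @ a, ddof=1)             # variance of y = a'x
v2 = np.var(X @ (10 * a), ddof=1)      # scale a by 10 ...
print(v2 / v1)                         # ... and the variance grows 100-fold

a_unit = a / np.linalg.norm(a)         # normalising fixes the scale
print(np.dot(a_unit, a_unit))          # a'a = 1
```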
The second principal component, 2 y , is that linear combination
. . .............................2( )
...
2 2
2 21 1 22 2 2
i e y a X
y a x a x a xp p
which has the greatest variance subject to the two conditions,
0 ( )
1 ( )
2 1 1 2
2 2
and a a so that y and y are uncorrelated
a a for the reason indicated previously
Similarly, the $j$th principal component is the linear combination

$$ y_j = a_j'X \qquad\qquad (3) $$

which has greatest variance subject to

$$ a_j'a_j = 1, \qquad a_j'a_i = 0 \quad (i < j). $$
To find the coefficients defining the first principal component we need to choose the elements of $a_1$ so as to maximize the variance of $y_1$ subject to the constraint $a_1'a_1 = 1$. The variance of $y_1$ is given by

$$ \operatorname{var}(y_1) = \operatorname{var}(a_1'X) = a_1'Sa_1 \qquad\qquad (4) $$

where $S$ is the variance-covariance matrix of the original variables.
The standard procedure for maximizing a function of several variables subject to one or more constraints is the method of Lagrange multipliers. Applying this technique to maximize the variance of $y_1$, as given by $\operatorname{var}(y_1) = a_1'Sa_1$, subject to the constraint $a_1'a_1 = 1$, leads to the solution that $a_1$ is the eigenvector of $S$ corresponding to the largest eigenvalue.
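As an illustrative sketch of this result (simulated data, not the UCH records), the first principal component can be obtained as the leading eigenvector of the sample variance-covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated correlated data: 200 observations on 3 variables
X = rng.normal(size=(200, 3)) @ np.array([[3.0, 0.5, 0.2],
                                          [0.0, 1.0, 0.3],
                                          [0.0, 0.0, 0.5]])

S = np.cov(X, rowvar=False)            # sample variance-covariance matrix
eigvals, eigvecs = np.linalg.eigh(S)   # eigh returns eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]      # re-sort in descending order
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

a1 = eigvecs[:, 0]   # coefficients a_1 of the first principal component
y1 = X @ a1          # component scores y_1 = a_1'X

# a_1'a_1 = 1 and var(y_1) equals the largest eigenvalue of S
assert np.isclose(a1 @ a1, 1.0)
assert np.isclose(np.var(y1, ddof=1), eigvals[0])
```

The assertions verify the two facts derived above: the coefficient vector has unit length, and the sample variance of the first component equals the largest eigenvalue.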
To determine the second component, the Lagrange multiplier technique is again used to maximize the variance of $y_2$, i.e. $\operatorname{var}(y_2) = a_2'Sa_2$, subject to the two constraints $a_2'a_2 = 1$ and $a_2'a_1 = 0$. This leads to the solution that $a_2$ is the eigenvector of $S$ corresponding to its second largest eigenvalue. Similarly, the $j$th principal component is defined by the eigenvector associated with the $j$th largest eigenvalue.
If the eigenvalues of $S$ are $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_p$, then it is easy to show that, with the choice $a_i'a_i = 1$, the variance of the $i$th principal component is $\lambda_i$. For example, $y_1$ has variance $a_1'Sa_1$, and since $a_1$ is an eigenvector of $S$ we know that

$$ Sa_1 = \lambda_1 a_1 \qquad\qquad (5) $$

Therefore (4), i.e. $\operatorname{var}(y_1) = \operatorname{var}(a_1'X) = a_1'Sa_1$, may be rewritten as

$$ \operatorname{var}(y_1) = a_1'\lambda_1 a_1 = \lambda_1 a_1'a_1 = \lambda_1, \quad \text{since } a_1'a_1 = 1. $$
The total variance of the $p$ principal components will equal the total variance of the original variables, so that

$$ \sum_{i=1}^{p} \lambda_i = \operatorname{trace}(S) \qquad\qquad (6) $$

Consequently the $j$th principal component accounts for a proportion

$$ P_j = \frac{\lambda_j}{\operatorname{trace}(S)} \qquad\qquad (7) $$
of the total variation in the original data. The first $p^*$ components ($p^* < p$) account for a proportion

$$ P^{(p^*)} = \frac{\sum_{i=1}^{p^*} \lambda_i}{\operatorname{trace}(S)} \qquad\qquad (8) $$
of the total variation.
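Equations (6)-(8) are easy to verify numerically; the sketch below uses hypothetical eigenvalues (illustrative values only, not those of the hypertension data):

```python
import numpy as np

# Hypothetical eigenvalues of a covariance matrix S, in descending order
lam = np.array([5.0, 2.5, 1.5, 1.0])

trace_S = lam.sum()   # equation (6): the eigenvalues sum to trace(S)
P = lam / trace_S     # equation (7): gives 0.5, 0.25, 0.15, 0.1
cum = np.cumsum(P)    # equation (8): gives 0.5, 0.75, 0.9, 1.0

# All p components together account for all of the variance
assert np.isclose(cum[-1], 1.0)
```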
Although the derivation of principal components given above has been in terms of the eigenvalues and
eigenvectors of the covariance matrix S, it is much more usual in practice to derive them from the corresponding
quantities of the correlation matrix, R.
The reasons are not difficult to appreciate if we imagine a set of multivariate data where the variables $x_1, x_2, \ldots, x_p$ are of completely different types, for example length, temperature, blood pressure, anxiety ratings, etc. In such a case the structure of the principal components derived from the covariance matrix will depend upon the essentially arbitrary choice of units of measurement; additionally, if there are large differences between the variances of $x_1, x_2, \ldots, x_p$, those variables whose variances are largest will tend to dominate the first few principal components.
Extracting the components as the eigenvectors of R is equivalent to calculating the principal components from the original variables after each has been standardised to have unit variance. It is important to realise, however, that the eigenvalues and eigenvectors of R will generally not be the same as those of S; indeed there is rarely any simple correspondence between the two, and choosing to analyse R rather than S involves a definite, but possibly arbitrary, decision to make the variables “equally important”.
V. PRINCIPAL COMPONENTS FROM THE CORRELATION MATRIX
Generally, extracting components from S rather than R remains closer to the spirit and intent of
principal component analysis, especially if the components are to be used in further computations.
However, in some cases, the principal components will be more interpretable if R is used. For example, if the
variances differ widely or if the measurement units are not commensurate, the components of S will be
dominated by the variables with large variances. The other variable(s) will contribute very little. For a more
balanced representation in such cases, components of R may be used.
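A small simulation (hypothetical variables and scales, chosen only to illustrate the point) shows how a large-variance variable dominates the components of S but not those of R:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two independent variables on very different scales
X = np.column_stack([rng.normal(scale=100.0, size=500),
                     rng.normal(scale=1.0, size=500)])

S = np.cov(X, rowvar=False)        # covariance matrix: scale-dependent
R = np.corrcoef(X, rowvar=False)   # correlation matrix: scale-free

prop_S = np.linalg.eigvalsh(S)[::-1]
prop_S = prop_S / prop_S.sum()
prop_R = np.linalg.eigvalsh(R)[::-1]
prop_R = prop_R / prop_R.sum()

# From S the first component is scale-driven and absorbs nearly all the variance;
# from R the two standardised variables contribute almost equally.
assert prop_S[0] > 0.99
assert prop_R[0] < 0.6
```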
VI. COMPARISONS OF PRINCIPAL COMPONENTS FROM R WITH THOSE FROM S
We now list some general comparisons of principal components from R with those from S.
(1) The percent of variance accounted for, by the components of R will differ from the percent from S.
(2) The coefficients of the principal components from R differ from those obtained from S.
(3) If we express the components from R in terms of the original variables, they still will not agree with the
components from S.
(4) The principal components from R are scale invariant, because R itself is scale invariant.
(5) The components from a given matrix R are not unique to that R.
VII. DECIDING HOW MANY COMPONENTS TO RETAIN
In every application, a decision must be made on how many principal components should be retained in order to
effectively summarize the data. The following guidelines have been proposed.
(1) Retain sufficient components to account for a specified percentage of the total variance, say 80%.
(2) Retain the components whose eigenvalues are greater than the average of the eigenvalues, $\sum_{i=1}^{p} \lambda_i / p$. For a correlation matrix, this average is 1.
(3) Use the scree graph, a plot of $\lambda_i$ versus $i$, and look for a natural break between the large and the small eigenvalues.
(4) Test the significance of the larger components, that is, the components corresponding to the larger eigenvalues.
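Guidelines (1) and (2) can be sketched in a few lines; the eigenvalues here are hypothetical:

```python
import numpy as np

# Hypothetical eigenvalues, sorted in descending order (total = 9.0)
lam = np.array([4.0, 2.0, 1.4, 0.9, 0.4, 0.3])

# Guideline (1): retain enough components to reach, say, 80% of total variance
cumulative = np.cumsum(lam) / lam.sum()
k_80 = int(np.searchsorted(cumulative, 0.80)) + 1

# Guideline (2): retain components whose eigenvalue exceeds the average eigenvalue
k_avg = int(np.sum(lam > lam.mean()))

print(k_80, k_avg)  # 3 2
```

With these values the two guidelines disagree (three components versus two), which is common in practice; the scree graph of guideline (3) is often used to arbitrate.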
VIII. RESEARCH VARIABLES
The essential causes of hypertension considered as variables in this paper are: (1) Genetics, (2) Smoking, (3) Stress, (4) Old Age, (5) Obesity, (6) Excess Salt, (7) Excess Alcohol, (8) Kidney Disease, and (9) Lack of Exercise.
IX. METHOD OF ANALYSIS
The method of analysis utilized in this work is Principal Component Analysis. Stata statistical software was
used for the analysis and the results are presented below.
X. RESULTS OF PRINCIPAL COMPONENT ANALYSIS FROM CORRELATION MATRIX
The results of the analysis are shown below.

              λ1       λ2      λ3      λ4      λ5      λ6      λ7      λ8      λ9
Eigenvalue    410.69   46.20   30.63   23.53   11.06   8.19    5.29    2.58    1.08
Proportion    0.762    0.086   0.057   0.044   0.021   0.015   0.010   0.005   0.002
Cumulative    0.762    0.847   0.904   0.948   0.968   0.983   0.993   0.998   1.000
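Taking the eigenvalues in the table at face value, the proportion and cumulative rows can be reproduced directly (a quick arithmetic check, not a re-analysis of the data):

```python
import numpy as np

# Eigenvalues as reported in the table above
lam = np.array([410.69, 46.20, 30.63, 23.53, 11.06, 8.19, 5.29, 2.58, 1.08])

proportion = lam / lam.sum()        # each eigenvalue over trace(S)
cumulative = np.cumsum(proportion)

print(np.round(proportion[:4], 3))     # first four: 0.762, 0.086, 0.057, 0.044
print(round(float(cumulative[3]), 3))  # 0.948, matching the reported 94.8%
```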
Variables          PC1      PC2      PC3      PC4      PC5      PC6      PC7      PC8      PC9
Obesity           -0.729    0.101   -0.317    0.120   -0.312    0.172    0.335    0.092   -0.309
Smoking           -0.334   -0.261   -0.004    0.020    0.046    0.114    0.113    0.101    0.884
Excess Salt       -0.076   -0.305   -0.408    0.107   -0.404   -0.446   -0.503   -0.323    0.057
Excess Alcohol    -0.161   -0.206   -0.510   -0.175    0.762   -0.036   -0.149    0.120   -0.149
Stress            -0.364   -0.104    0.502    0.611    0.338   -0.152   -0.173   -0.215   -0.131
Old Age           -0.306    0.715    0.095   -0.320    0.097   -0.455   -0.186   -0.002    0.181
Genetics          -0.253   -0.061    0.237   -0.406   -0.076    0.591   -0.566   -0.171   -0.083
Kidney Disease    -0.154   -0.427    0.353   -0.302   -0.142   -0.369   -0.047    0.621   -0.185
Lack of Exercise  -0.109   -0.279    0.176   -0.460    0.079   -0.204    0.461   -0.634   -0.076
XI. DISCUSSION OF RESULTS
From the table, the first eigenvalue $\lambda_1$ is 410.69 and its proportion of variance is 0.762, indicating that the factor with the largest eigenvalue (obesity) accounted for 76.2% of the total variance. The second eigenvalue $\lambda_2$ is 46.20 with a proportion of 0.086; the corresponding factor, smoking, accounted for 8.6% of the total variance. The third eigenvalue $\lambda_3$ is 30.63 with a proportion of 0.057; the corresponding factor, excess salt, accounted for 5.7% of the total variance. The fourth eigenvalue $\lambda_4$ is 23.53 with a proportion of 0.044; the corresponding factor, excess alcohol, accounted for 4.4% of the total variance. The fifth to ninth eigenvalues ($\lambda_5$ to $\lambda_9$) were discarded, since each accounts for a small proportion of the total variance (at most 2.1%). The cumulative proportion of the first four eigenvalues ($\lambda_1$ to $\lambda_4$) is 94.8%, which is a sufficient percentage of the total variance.
XII. CONCLUSION
The analysis shows that among the nine essential causes of hypertension considered, only four (obesity, smoking, excess salt and excess alcohol) constitute the major causes of hypertension. The remaining factors also contribute, but they are not as serious a threat as the four major causes identified by the PCA.
XIII. RECOMMENDATION
The seriousness of hypertension lies not just in the fact that it can lead to more serious illness or complications, but that it raises the risk of stroke, kidney failure, heart disease and heart attack. The situation is made worse by excess weight or body fat, which tends to make these conditions more severe. On this basis, the following recommendations are made.
(1) It is the duty of all stakeholders in the health sector to keep people informed about the implications of the causes of hypertension by providing accurate, timely and up-to-date information on this important disease.
(2) It is also the duty of health personnel to warn and educate adults, who are more prone to high blood pressure, about the dangers of poorly managed high blood pressure.
(3) More awareness should be created about people's feeding habits, smoking habits, health-care habits and general pattern of life, including the avoidance of poor combinations of the classes of food.
(4) Campaigns on “stay healthy and fit” and “health is wealth” should be intensified and sustained.