This document discusses a study on using a separation-based approach for analyzing big data from various sources. It proposes a system with four main components: a main switch to separate relevant data types, a key generator to encrypt data, a subnet switch master to assign work to map workers, and map workers that process assigned data. This bottom-up approach aims to more easily analyze large amounts of data by starting with smaller subsets. The document also covers challenges of big data, existing grid computing systems, Hadoop tools, and concludes the proposed framework would improve upon previous methods for big data analysis.
2. A STUDY ON BIG DATA AND ITS TRAFFIC MANAGEMENT – SEPARATION BASED ON VARIETIES OF DATA
N. Naveen Kumar
itnaveenkumarn@gmail.com
S. Vijayabaskaran
vijay.shiva68@gmail.com
B.Tech Information Technology,
Sri Sai Ram Engineering College,
Chennai.
3. BIG DATA
• Massive data sets that are difficult to process with traditional techniques within an acceptable elapsed time.
• The data involved is measured in exabytes.
4. BIG DATA - PROPERTIES
• Variety
• Volume
• Velocity
• Veracity
• Value
5. Where is Big Data found?
• Log storage in IT industries
• Sensor Data
• Risk analysis
• Social media
6. Challenges and issues in Big Data
• Privacy & Security
• Data access & sharing of information
• Storage & processing issues
• Analytical challenges
• Skill requirement
• Technical challenges
7. TOOLS AND TECHNIQUES
• Hadoop:
– Open-source software.
– Processing is handled by MapReduce (a minimal sketch follows below).
– Fault tolerance is managed by replication of data.
– Reads data from multiple disks in parallel.
• Grid Computing:
– Distributed computing.
– Used to save, distribute and analyze data.
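To make the MapReduce idea on this slide concrete, here is a minimal word-count sketch in plain Python. It only illustrates the map and reduce phases, not Hadoop's actual API; the function names and sample splits are invented for the example.

```python
from collections import defaultdict

def map_phase(document):
    """Map: emit a (word, 1) pair for every word in one input split."""
    for word in document.split():
        yield word.lower(), 1

def reduce_phase(pairs):
    """Reduce: sum the counts that share the same key."""
    totals = defaultdict(int)
    for word, count in pairs:
        totals[word] += count
    return dict(totals)

if __name__ == "__main__":
    splits = ["big data needs new tools", "big data is measured in exabytes"]
    # On Hadoop the splits would sit on different nodes and be mapped in parallel;
    # here everything runs in one process purely for illustration.
    intermediate = [pair for split in splits for pair in map_phase(split)]
    print(reduce_phase(intermediate))
```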
9. Existing System
Grid Computing:
SETI (Search for Extra-Terrestrial Intelligence) was created to save, distribute and analyze data on a grid.
Grid Centre:
A number of servers interconnected by a high-speed network.
Elements of a Grid Centre:
– Computing Elements manage the resources of the grid and manage jobs.
– Storage Elements provide storage and data-transfer services.
– Worker Nodes are the servers that offer the processing power.
10. Contd…
• A User Interface node is also present.
• A Virtual Organization is the set of people who define the rules for accessing the grid.
• The grid provides workload management.
• Users access the User Interface through Secure Shell (SSH).
• The user then sends the job, written in Job Description Language (JDL) together with the Python code (PYC), to the Workload Management System (WMS).
• The WMS checks the availability of Computing Elements and assigns Worker Nodes to process the job (see the sketch below).
• When a Worker Node completes a job, it sends the result to the WMS and the state of the job to the Computing Element.
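The job flow just described (User Interface → WMS → Worker Node) can be pictured with a small simulation. This is a hedged sketch of the idea only, assuming a toy in-memory WMS; the class, field and node names are invented and do not reflect any real WMS implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Job:
    jdl: dict                      # toy stand-in for a JDL job description
    state: str = "SUBMITTED"

@dataclass
class WorkloadManagementSystem:
    """Toy WMS: match a submitted job with a free worker node."""
    free_worker_nodes: list = field(default_factory=lambda: ["wn01", "wn02"])

    def submit(self, job: Job) -> str:
        if not self.free_worker_nodes:
            job.state = "WAITING"          # no capacity available yet
            return ""
        node = self.free_worker_nodes.pop()
        job.state = f"RUNNING on {node}"
        return node

    def job_done(self, job: Job, node: str) -> None:
        job.state = "DONE"                 # worker node reports result and state back
        self.free_worker_nodes.append(node)

if __name__ == "__main__":
    wms = WorkloadManagementSystem()
    job = Job(jdl={"Executable": "analysis.py", "Arguments": "input.dat"})
    node = wms.submit(job)
    print(job.state)                       # RUNNING on wn02
    wms.job_done(job, node)
    print(job.state)                       # DONE
```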
11. Hadoop with HPC and Grid Computing
• In Hadoop, the MapReduce component makes use of the data-locality property.
• A grid basically uses the Message Passing Interface (MPI).
• MapReduce itself reports failed tasks and reschedules them.
15. Functions
i) Main switch: separates the relevant data (see the sketch below).
The varieties of data are:
1. Real-time
2. Social media
3. Business organization
4. Other data
The MapReduce concept has two main functions:
a. Indexing
b. Key generation
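A minimal sketch of the main switch separating records by variety, assuming each incoming record carries a `source` tag; the record format and the `main_switch` function are invented for illustration, while the four categories are taken from the slide.

```python
from collections import defaultdict

VARIETIES = ("real-time", "social media", "business organization", "other")

def main_switch(records):
    """Route each record into the bucket for its variety, so each
    small subset can be handed to its own MapReduce job."""
    buckets = defaultdict(list)
    for record in records:
        variety = record.get("source", "other")
        if variety not in VARIETIES:
            variety = "other"
        buckets[variety].append(record)
    return buckets

if __name__ == "__main__":
    stream = [
        {"source": "social media", "text": "new post"},
        {"source": "real-time", "reading": 42},
        {"source": "sales", "amount": 10},   # unknown source falls into "other"
    ]
    for variety, subset in main_switch(stream).items():
        print(variety, len(subset))
```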
16. ii) Key Generator:
• Generates a key using the Kerberos algorithm.
• Encrypts the input data with the generated key (a hedged sketch follows below).
iii) Subnet Switch:
- The subnet switch works like a master.
- It assigns work to the map workers.
- A large piece of work is subdivided and given to the map workers.
- It also checks whether all workers have finished their jobs.
- If a map worker crashes, its jobs are reassigned to another worker.
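The slides name Kerberos for key generation but give no further detail, so the sketch below only illustrates the idea of "generate a key, then encrypt the assigned data with it". The random key derivation and XOR keystream are placeholders, not Kerberos and not secure; they merely stand in for whatever cipher the authors intend.

```python
import hashlib
import secrets

def generate_key(principal: str, nbytes: int = 32) -> bytes:
    """Placeholder for the Kerberos-issued session key named on the slide."""
    return hashlib.sha256(principal.encode() + secrets.token_bytes(nbytes)).digest()

def keystream(key: bytes, length: int) -> bytes:
    """Expand the key into a pseudo-random byte stream (illustration only, not secure)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(data: bytes, key: bytes) -> bytes:
    """XOR the data with the keystream; applying it a second time decrypts."""
    return bytes(b ^ k for b, k in zip(data, keystream(key, len(data))))

if __name__ == "__main__":
    key = generate_key("map-worker-01")
    ciphertext = encrypt(b"social media record", key)
    assert encrypt(ciphertext, key) == b"social media record"   # round trip
```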
17. iv) Map workers:
• Work on the data and key assigned to them by the subnet switch.
• After completing the work, a map worker sends the result back to the subnet switch and receives the next job to be processed (see the sketch below).
• Our concept mainly follows a bottom-up approach: at the bottom the amount of data is small, so it can be analyzed easily, and the amount grows as we move upward.
• A top-down approach fails for big data because analyzing the entire data set at once is not practical.
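Putting the last two slides together, this sketch shows a subnet switch master that subdivides a job, hands chunks to map workers, reassigns a chunk when a worker crashes, and merges the small per-worker results bottom-up. The crash simulation and record format are invented for illustration and are not the authors' implementation.

```python
import random

def map_worker(chunk):
    """One map worker: count occurrences in its small subset of the data."""
    counts = {}
    for record in chunk:
        counts[record] = counts.get(record, 0) + 1
    return counts

def subnet_switch(records, n_workers=3, crash_rate=0.2):
    """Master role: subdivide the work, hand chunks to map workers,
    and reassign a chunk whenever a worker 'crashes'."""
    size = max(1, len(records) // n_workers)
    pending = [records[i:i + size] for i in range(0, len(records), size)]
    partial_results = []
    while pending:
        chunk = pending.pop()
        if random.random() < crash_rate:   # simulated map-worker crash
            pending.append(chunk)          # the same chunk is reassigned later
            continue
        partial_results.append(map_worker(chunk))
    # Bottom-up merge: small per-worker results combine into the overall answer.
    total = {}
    for part in partial_results:
        for key, value in part.items():
            total[key] = total.get(key, 0) + value
    return total

if __name__ == "__main__":
    data = ["real-time", "social media", "real-time", "other", "business"] * 4
    print(subnet_switch(data))
```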
18. BIG DATA GOOD PRACTICES
• Create dimensions for all the data being stored.
• All data should have durable keys.
• Analyze structured and unstructured data together.
• The importance of big data to consumers is growing.
• Data quality needs to be improved.
• Scalability of data storage has certain limits.
• Investment in data quality and metadata reduces processing time.
19. CONCLUSION
• The developed framework provides better data analysis than previous methods.
• The method uses the MapReduce concept, which reduces the effort of analyzing the data.
• The bottom-up approach makes data analysis easier and yields better analysis results.
• The map-worker concept ensures that the data is analyzed in the most useful way.