Regulations in Customs and Excise serve an important function by providing a legal basis for every policy and decision. The type of rule issued varies according to the hierarchy of the regulator, and these regulations change in response to policy developments at higher levels of government. What happens, then, if an organization cannot manage its regulatory archive, or cannot even track regulatory changes? A well-organized system is needed that can accommodate all of the rules, their associated changes, and their connections with other types of regulations. Such a system makes it easy for users to search, manage, and track the history of changes, as well as the relationships between the rules used by the organization. This paper proposes the design of an ontology-based semantic network using a graph database, Neo4j 2.3.1, as a solution. Applied to the sample data, the model contains 13 node types with 242 child nodes and 22 relation types with 548 relations connecting all the nodes, loaded within 3,305 ms.
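The node/relation structure described above can be pictured with a small in-memory sketch. The node types (MinisterialRegulation, Law) and relation types (AMENDED_BY, IMPLEMENTS) below are illustrative assumptions, not the paper's actual schema, and the dict-based store only mimics what the paper keeps in Neo4j:

```python
# Minimal in-memory sketch of a regulation semantic network. Node types and
# relation types are hypothetical stand-ins for the paper's Neo4j schema.

class RegulationGraph:
    def __init__(self):
        self.nodes = {}      # id -> {"type": ..., "title": ...}
        self.relations = []  # (source_id, relation_type, target_id)

    def add_node(self, node_id, node_type, title):
        self.nodes[node_id] = {"type": node_type, "title": title}

    def add_relation(self, source, rel_type, target):
        self.relations.append((source, rel_type, target))

    def amendment_history(self, node_id):
        """Follow AMENDED_BY links to list successive versions of a rule."""
        history, current = [node_id], node_id
        while True:
            nxt = next((t for s, r, t in self.relations
                        if s == current and r == "AMENDED_BY"), None)
            if nxt is None:
                return history
            history.append(nxt)
            current = nxt

g = RegulationGraph()
g.add_node("PMK-1", "MinisterialRegulation", "Customs valuation 2014")
g.add_node("PMK-2", "MinisterialRegulation", "Customs valuation 2016")
g.add_node("UU-1", "Law", "Customs Law")
g.add_relation("PMK-1", "AMENDED_BY", "PMK-2")
g.add_relation("PMK-1", "IMPLEMENTS", "UU-1")
print(g.amendment_history("PMK-1"))  # ['PMK-1', 'PMK-2']
```

In the actual system such a traversal would be a Cypher query over the graph database rather than a Python loop.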
Determining relevance of “best practice” based on interoperability in Europea... (ePractice.eu)
Authors: Robert Deller, Guilloux Véronique.
eGovernment is one of Europe’s big challenges, and interoperability is a necessary condition encouraged by the European Commission. Interoperability is believed to ensure effective service to citizens and to perform governmental functions effectively as well as efficiently.
Semantic-Driven e-Government: Application of Uschold and King Ontology Buildi... (dannyijwest)
This document describes a study that applies the Uschold and King ontology building methodology to develop semantic ontology models for electronic government (e-government) domains. Specifically:
1. The Uschold and King methodology is used to build a domain ontology for monitoring development projects in Sub-Saharan Africa.
2. The domain ontology is evaluated for semantic consistency using a description logic representation.
3. The domain ontology is aligned with the DOLCE upper-level ontology to allow for wider integration.
4. The domain ontology is formally written in OWL to enable computer processing.
The goal is to provide guidance for applying ontology building methodologies to develop e-government domain ontologies and
Knowledge management system SOP using semantic networks connected with person... (TELKOMNIKA JOURNAL)
Standard Operating Procedure (SOP) documents can manage business processes and employee performance in an organization. This study aims to develop a system that publishes SOP documents and automatically distributes them to employees. An analysis of the relationship between SOP documents and employees is carried out so that documents can be allocated directly to employees according to their position. This knowledge analysis is done using semantic network analysis: semantic networks are used to analyze the components of knowledge and the relationships between SOP documents and employees. The analysis found 25 SOP documents consisting of 6 types of central nodes with 156 child nodes, and 7 types of relations containing 207 relations. The SOP knowledge management system is connected to the personnel information system (SIPEG), making it easier for users to find, accommodate, and manage the knowledge contained in SOP documents. The system is implemented using the PHP programming language with the CodeIgniter framework, a REST API, and a MySQL database.
Thoughtful Approaches to Implementation of Electronic Rulemaking (ijmpict)
The Internet could fundamentally change the way both government and the public participate in making public policy, but changes must be made in our existing civic infrastructure for that to happen. A change in the law causes modifications in the policies and strategies through which that law is implemented. This paper focuses on how thoughtful implementation of electronic rulemaking could capture public interest, particularly in higher-profile rulemakings, and how online forums and dialogues can foster greater public participation in policy-making processes. It also presents the need for semantic perspectives and semantic web technology to enhance inference and decision-making capability.
CITIZENS’ ACCEPTANCE OF E-GOVERNMENT SERVICES (ijcseit)
The rate of computer and internet usage has been increasing rapidly around the world. In parallel with technological developments in computer science, the transformation from traditional services to online services has gained speed. The aim of this study is to identify the factors that affect e-government service usage, and a research model is developed to achieve this aim. The proposed model is based on the Technology Acceptance Model and the Theory of Planned Behaviour. A questionnaire is developed to evaluate the model, composed of two parts: a demographics part and an items part. In the items part, 32 items covering the factors of the proposed model are presented to participants; 100 participants filled in the questionnaire. The reliability of the questionnaire is evaluated with the internal consistency method. Results show that all items satisfy the reliability conditions: Cronbach's alpha for the whole questionnaire is 0.885, and Cronbach's alpha for the overall scale of each of the factors ranges from 0.878 to 0.890. Regression analysis showed that all hypotheses are supported. This study provides valuable references for understanding citizens' acceptance of e-government services.
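The internal-consistency figure reported above (Cronbach's alpha of 0.885) can be computed directly from a respondents-by-items score matrix. The Likert scores below are toy data, not the study's, so the resulting alpha differs from the paper's:

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / variance of totals),
# where k is the number of items. Scores are fabricated toy data.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

def cronbach_alpha(scores):
    k = len(scores[0])                 # number of items
    items = list(zip(*scores))         # transpose to per-item columns
    item_var = sum(variance(list(col)) for col in items)
    total_var = variance([sum(row) for row in scores])
    return k / (k - 1) * (1 - item_var / total_var)

scores = [  # 5 respondents x 4 Likert items
    [4, 4, 3, 4],
    [3, 3, 3, 3],
    [5, 4, 5, 5],
    [2, 2, 3, 2],
    [4, 5, 4, 4],
]
print(round(cronbach_alpha(scores), 3))  # 0.937
```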
Adoption of internal web technologies by OECD Turkish government officials (ijmpict)
Use of the OECD's communication and information channels has been increasingly encouraged by new channels such as the OECD's Committee Information Service (OLIS) and Clearspace (CS) web portals. A logit regression model was used to estimate the influence of the government's supply-side policy tools and organisational factors on the decision to open OLIS and Clearspace accounts. Additionally, a probability analysis was conducted to give insights into the usage frequency of information channels. The study used a dataset of 126 Turkish top-level country and municipal government officials working on different OECD study topics in 2010. The findings imply that the influence of the explanatory variables tested differs between the two web-portal models. Satisfaction with the timing of information provided by the OECD Permanent Delegation (timing issues in reaching reports) is the only variable that consistently has a positive influence on the adoption of both web-portal applications. The probability analysis shows that while duration of employment and degree of expertise increase the probability of using online information channels, work duration on OECD topics and meeting participation decrease the probability of using face-to-face communication channels.
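The logit model used in the study above estimates the probability of adoption as P(adopt) = 1 / (1 + exp(-(b0 + b1·x))). The fabricated data and gradient-ascent fit below only illustrate the model's shape; the study itself would use a standard statistics package on its real dataset:

```python
import math

# Toy sketch of a logit (logistic regression) model fitted by gradient ascent.
# The predictor and outcomes are fabricated, purely for illustration.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logit(xs, ys, lr=0.1, epochs=2000):
    b0, b1, n = 0.0, 0.0, len(xs)
    for _ in range(epochs):
        g0 = sum(y - sigmoid(b0 + b1 * x) for x, y in zip(xs, ys)) / n
        g1 = sum((y - sigmoid(b0 + b1 * x)) * x for x, y in zip(xs, ys)) / n
        b0 += lr * g0   # ascend the log-likelihood gradient
        b1 += lr * g1
    return b0, b1

# x = satisfaction with information timing (toy scale), y = opened an account
xs = [1, 2, 2, 3, 4, 4, 5, 5]
ys = [0, 0, 0, 0, 1, 1, 1, 1]
b0, b1 = fit_logit(xs, ys)
print(b1 > 0, sigmoid(b0 + b1 * 5) > 0.5)  # True True: positive effect
```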
Innovative Technologies And Software For Higher Education... (Robin Anderson)
The president of a company realized they were no longer competitive because they lacked a project management methodology. A consultant had previously recommended developing an enterprise project management methodology (EPM) and a project management office. The president approved a five-step plan to implement the EPM within six months, which included assessing current templates, determining what could remain, developing a method to capture best practices, and educating and training employees on the new methodology.
Three dimensions of organizational interoperability. Insights from recent stu... (ePractice.eu)
Authors: Herbert Kubicek, Ralf Cimander.
Interoperability (IOP) is considered a critical success factor for forging ahead in the online provision of public services. Interoperability frameworks shall give practitioners guidance on what to consider and do in order to enable seamless interaction with other public authorities and clients.
Building a recommendation system based on the job offers extracted from the w... (IJECEIAES)
Recruitment, or job search, is increasingly used throughout the world by a large population of users through various channels, such as websites, platforms, and professional networks. Given the large volume of information related to job descriptions and user profiles, it is complicated to appropriately match a user's profile with a job description, and vice versa. The traditional job search approach has drawbacks, since the job seeker needs to search for job offers on each recruitment platform, manage their accounts, and apply for the relevant vacancies, which wastes considerable time and effort. The contribution of this research work is the construction of a recommendation system based on job offers extracted from the web and on the e-portfolios of job seekers. After the data is extracted, natural language processing is applied so that the structured data is ready for filtering and analysis. The proposed system is a content-based system: it measures the degree of correspondence between the attributes of the e-portfolio and those of each job offer in the same list of competence specialties using the Euclidean distance, and the results are ranked in decreasing order of relevance to display the most relevant job offers first.
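The content-based matching step described above can be sketched as follows. The attribute vectors and job titles are fabricated; in the real system they would come from the parsed e-portfolio and the extracted offers:

```python
import math

# Sketch of content-based matching: profiles and offers are reduced to numeric
# attribute vectors, Euclidean distance measures correspondence, and offers
# are ranked from most relevant (smallest distance) to least relevant.

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def rank_offers(profile, offers):
    return sorted(offers, key=lambda offer: euclidean(profile, offer[1]))

profile = (0.9, 0.2, 0.7)  # hypothetical (python, sales, databases) weights
offers = [
    ("data engineer", (0.8, 0.1, 0.9)),
    ("sales rep", (0.1, 0.9, 0.2)),
    ("backend dev", (0.9, 0.0, 0.6)),
]
for name, vec in rank_offers(profile, offers):
    print(name, round(euclidean(profile, vec), 3))
```

The smallest-distance offer appears first, which matches the ascending-distance ranking the abstract describes.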
Financial Management Information System within Government Institution and Sup... (sececonf)
The provision of comprehensive financial information by government institutions is needed by the wider community to boost the effectiveness of information for society, government, and decision-making. Such a system produces information that encourages the realization of a clean, transparent government able to respond effectively to changing demands. An information system succeeds when users accept the system and the underlying information technology, which in turn improves their performance. This research aims to examine the acceptance of regional financial information systems in government using the Technology Acceptance Model (TAM) and to provide evidence of its influence on performance. The study surveyed 556 respondents who are civil servants in Lampung Province. The findings, using SEM analysis, show that all constructs have an effect in conformity with the concept of TAM in a government institution. This study reveals that all variables included in the TAM model have a positive impact on users' performance. It also improves the effectiveness of the information system within the government institution, especially the implementation of the financial management information system.
Keywords: Financial Management Information System, supply chain strategy, Technology Acceptance Model (TAM), User’s Performance
A Combinational Approach of GIS and SOA for Performance Improvement of Organi... (dannyijwest)
A GIS (Geographic Information System) is a system designed to capture, store, manipulate, analyze, manage, and present all types of geographically referenced data. In other words, GIS is the merging of statistical analysis, database technology, and cartography, and it can be integrated into any enterprise information system framework. Despite all these advantages, geographic information systems used in organizational structures face problems such as lack of interoperability, business alignment, and agility.
Cloud Computing and Content Management Systems: A Case Study in Macedonian E... (neirew J)
Technologies have become inseparable from our lives, the economy, and society as a whole. For example, clouds provide numerous computing resources that can facilitate our lives, whereas Content Management Systems (CMSs) can provide the right content for the right user. Thus, education must embrace these emerging technologies in order to prepare citizens for the 21st century. The research explored if and how cloud computing influences the application of CMSs, and if and how it fosters the usage of mobile technologies to access cloud resources. The analyses revealed that some of the respondents have sound experience in using clouds and CMSs. Nevertheless, it was evident that a significant number of respondents have limited or no experience with cloud computing concepts, cloud security, and CMSs. Institutions should update educational policies in order to enable education innovation, provide means and support, and continuously update and upgrade educational infrastructure.
CLOUD COMPUTING AND CONTENT MANAGEMENT SYSTEMS: A CASE STUDY IN MACEDONIAN ED... (ijccsa)
- The document discusses a case study on how cloud computing and content management systems are being applied in Macedonian education. A questionnaire was distributed to teachers and students to examine their experiences with cloud computing, content management systems, and how cloud computing influences the use of mobile technologies.
- The results found that teachers had significant experience using ICTs for both professional and personal purposes. However, they had more experience using cloud computing than content management systems. Many respondents were unsure about how cloud computing influences the use of content management systems and had misconceptions about cloud data security.
- Younger respondents felt that cloud computing moderately facilitates the use of mobile devices, while older teachers felt it significantly facilitates mobile device use. This suggests that
Management Information System - Human Resource & Payroll System for Sri Lanka... (Dr. Amarjeet Singh)
Payroll is one of the most critical processes in every large organization; it has many functions to fulfil on time for a smooth payroll run. When the number of employees varies from time to time due to retirements and new appointments, and the calculation method differs accordingly, the complexity of the system increases. Managing data can be a complicated scenario when the amount of data involved is large. More importantly, when the data requires a high level of accuracy, the management role becomes hectic. Payroll is a sector of management in which accuracy is required, and a lapse in the data can have an adverse effect on employee motivation and production. In such a system, cost estimation is vital for top management to proceed with future organizational plans.
Towards legally-compliant governmental case work with Dynamic Condition Respo... (Hugo Andrés López)
We present ongoing efforts to generate computational models of Danish regulations, as well as to identify the fragments that are amenable to formal verification.
National archival platform system design using the microservice-based service... (CSITiaesprime)
Archives play a vital role in the dynamics of peoples and nations as an instrument for preserving information in diverse domains of politics, society, economics, culture, science, and technology. The acceleration of digital transformation triggers the implementation of a smart government that supports better public services, and the smart government in turn encourages a national archival system to serve archive producers and users. The four electronic-based government system (SPBE) factors in the archival sector and the open archival information system (OAIS) data preservation standard are the benchmarks in developing this study's national archival platform system. An improved service computing system engineering (SCSE) framework, adapted to the microservice architecture, is used to guide the design of the national archival platform system. The proposed design met the four-factor service design validation of coupling, cohesion, complexity, and reusability. The prototype also suggests what resources are needed to put the design into action by passing the performance test of availability measurement.
IRJET- Content Analysis of Websites of Health Ministries in ECOWAS English Sp... (IRJET Journal)
This document evaluates the websites of the Ministries of Health of five English-speaking countries in the Economic Community of West African States (ECOWAS) using the Website Attribute Evaluation System (WAES). The evaluation found that the websites were at a similar basic level of development, focusing primarily on information dissemination. Content analysis was conducted on attributes such as information provided, page layout, multimedia elements, and rate of updates. The results showed minor variations across countries but overall consistency in the basic level of development of the health ministry websites.
A Framework for Automated Association Mining Over Multiple Databases (Gurdal Ertek)
Literature on association mining, the data mining methodology that investigates associations between items, has primarily focused on efficiently mining larger databases. The motivation for association mining is to use the rules obtained from historical data to influence future transactions. However, associations in transactional processes change significantly over time, implying that rules extracted for a given time interval may not be applicable for a later time interval. Hence, an
analysis framework is necessary to identify how associations change over time. This paper presents such a framework, reports the implementation of the framework as a tool, and demonstrates the applicability of and the necessity for the framework through a case study in the domain of finance.
http://research.sabanciuniv.edu.
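The idea that associations drift over time can be made concrete by recomputing a rule's support and confidence per time window. The transactions and windows below are fabricated toy data:

```python
# Tracking an association rule {A} -> {B} across time windows: support and
# confidence are recomputed per window, so a rule that held early can be
# seen to fade later. All transactions here are fabricated.

def rule_stats(transactions, antecedent, consequent):
    n = len(transactions)
    both = sum(1 for t in transactions if antecedent <= t and consequent <= t)
    ante = sum(1 for t in transactions if antecedent <= t)
    support = both / n
    confidence = both / ante if ante else 0.0
    return support, confidence

windows = {
    "2022-Q1": [{"A", "B"}, {"A", "B"}, {"A", "B", "C"}, {"C"}],
    "2022-Q2": [{"A"}, {"A", "C"}, {"A", "B"}, {"B", "C"}],
}
for window, txns in windows.items():
    s, c = rule_stats(txns, {"A"}, {"B"})
    print(window, round(s, 2), round(c, 2))
```

A framework like the one in the paper would automate exactly this comparison across many rules and intervals.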
Employees Adoption of E-Procurement System: An Empirical Study (IJMIT JOURNAL)
Today, organizations are investing heavily in their IT infrastructure and reengineering their business processes by digitizing their firms. If an organization's employees do not make optimal use of its IT infrastructure, the productivity gain is reduced enormously. In Uttarakhand, an e-procurement system has been implemented by the public sector under the e-governance integrated mission mode projects, so there is a need to find the determinants that influence employees' adoption and use of e-procurement systems. This research study assesses the organizational and individual determinants that influence the use of the e-procurement system in the Uttarakhand public sector, providing managers with valuable information for intervention programs to achieve greater acceptance and usage of the system. Data for this study was collected by means of a survey conducted in Uttarakhand state in 2011: a total of 1,200 questionnaire forms were distributed, personally and online, to employees using the e-procurement system in Uttarakhand.
Visualizing stemming techniques on online news articles text analytics (journalBEEI)
Stemming is the process of converting words into their root words using a stemming algorithm. It is one of the main processes in text analytics, where text data needs to go through stemming before further analysis. Text analytics is a very common practice nowadays, used to analyze the contents of text data from various sources such as the mass media and social media. In this study, two different stemming techniques, Porter and Lancaster, are evaluated. The differences in the outputs resulting from the two stemming techniques are discussed based on the stemming errors and the resulting visualizations. The findings show that Porter stemming performs better than Lancaster stemming, by 43%, based on the stemming errors produced. Visualization can still be accommodated by the stemmed text data, but tool users need some background understanding of the text data to ensure that the visualization outputs are interpreted correctly.
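The Porter vs Lancaster contrast above comes down to how aggressively suffixes are stripped. The few-rule strippers below only mimic that contrast; they are not the real Porter or Lancaster algorithms:

```python
# Toy illustration of conservative vs aggressive suffix stripping. The rule
# lists are hypothetical stand-ins, not the actual Porter/Lancaster rules.

CONSERVATIVE = ["ing", "ed", "s"]              # Porter-like: few, short suffixes
AGGRESSIVE = ["ational", "ization", "ness", "ing", "ed", "es", "s", "e"]

def stem(word, suffixes, min_stem=3):
    """Strip the first matching suffix, keeping at least min_stem characters."""
    for suf in suffixes:
        if word.endswith(suf) and len(word) - len(suf) >= min_stem:
            return word[: len(word) - len(suf)]
    return word

for w in ["running", "organization", "stemmed", "news"]:
    print(w, stem(w, CONSERVATIVE), stem(w, AGGRESSIVE))
```

The aggressive rule set reduces "organization" to "organ" while the conservative one leaves it alone, the same kind of over-stemming difference the study measures between Lancaster and Porter.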
Hot Topic Detection and Technology Trend Tracking for Patents utilizing Term ... (Ly Nguyen)
This paper proposes a methodology for identifying hot topics and tracking technology trends in the patent domain. The methodology uses frequency information in combination with the International Patent Classification (IPC) to capture semantic information on word categorization, doing so in a way that heretofore has not been employed for topic detection and trend tracking. Term Frequency and Proportional Document Frequency (TF*PDF) is employed as a means to detect hot topics from patents, and IPCs are used to calculate the semantic importance of terms based on the IPCs across which the terms are distributed. Aging Theory is also used to calculate the variation of trends over time. Four types of trends, very stable, stable, normal, and unstable, are defined and evaluated based on TF*PDF alone and TF*PDF combined with Aging Theory. Experimental results show that for very stable trends, the combination of TF*PDF and Aging Theory achieves a precision of 0.976; for stable trends and all trends, TF*PDF achieves precisions of 0.959 and 0.84, respectively. By applying TF*PDF in consideration of semantic information, we also show a new criterion for weighting hot topics and tracking technology trends.
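A minimal sketch of the TF*PDF weight, assuming the usual Bun and Ishizuka formulation: for a term t, W(t) = Σ over channels c of F(t,c)·exp(n(t,c)/N(c)), where F(t,c) is the term's normalized frequency in channel c and n(t,c)/N(c) is the proportion of channel documents containing t. The IPC-like channel names and documents below are fabricated:

```python
import math

# Sketch of the TF*PDF weight (assumed Bun & Ishizuka form). Channels stand
# in for IPC groups; documents are toy token lists.

def tf_pdf(term, channels):
    weight = 0.0
    for docs in channels.values():
        freqs = {}
        for doc in docs:
            for tok in doc:
                freqs[tok] = freqs.get(tok, 0) + 1
        norm = math.sqrt(sum(f * f for f in freqs.values()))
        f_tc = freqs.get(term, 0) / norm if norm else 0.0   # normalized frequency
        n_tc = sum(1 for doc in docs if term in doc)        # docs containing term
        weight += f_tc * math.exp(n_tc / len(docs))
    return weight

channels = {
    "H04": [["antenna", "gain"], ["antenna", "patch"], ["filter"]],
    "G06": [["neural", "network"], ["antenna", "neural"]],
}
print(tf_pdf("antenna", channels) > tf_pdf("filter", channels))  # True
```

A term that is frequent and widely spread across a channel's documents gets an exponentially boosted weight, which is what makes it a "hot topic" candidate.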
This article analyzes the terminology of instructional technology from a historical perspective. It discusses how the development of terms has impacted the nomenclature, role, domain, and object of study of instructional technology over time. Originally, instructional technology was seen as instructional media. It was then considered a science and process that includes design, development, utilization, management and evaluation of learning resources and processes. Currently, the focus is on facilitating learning and improving performance. The article aims to provide a historical overview of how the terminology of instructional technology has evolved.
Process Mining in Supply Chains: A Systematic Literature Review (IJECEIAES)
Performance analysis and continuous process improvement efforts are often supported by the construction of process models representing the interactions of the partners in the supply chain. This study was conducted to determine the state of the art in the process mining field, specifically in the context of cross-organizational processes. The Systematic Literature Review (SLR) method is used to review a collection of twenty-one papers, classified according to the artifact framework of Hevner et al. and within the process mining framework of van der Aalst. In the reviewed papers, the authors used a variety of techniques to establish the event log, which is then used to perform the process mining analysis. Eight of the reviewed papers focus on the definition of concepts or measures. Five of the papers describe models and other abstractions used as a theoretical basis for process mining in the context of supply chains. The majority, twenty of the papers, describe some kind of informal method or formal algorithm to perform process mining analysis, and nine of the papers that propose a formal algorithm also present an accompanying software implementation. Eight papers discuss data preparation challenges and twelve papers discuss process discovery techniques.
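The simplest process-discovery step that such studies build on is deriving a directly-follows graph from an event log. The order-to-pay traces below are fabricated, purely to show the mechanism:

```python
from collections import Counter

# Building a directly-follows graph from an event log: each trace is the
# ordered activity list of one case; the log is toy data.

def directly_follows(log):
    pairs = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            pairs[(a, b)] += 1   # count each adjacent activity pair
    return pairs

log = [
    ["order", "ship", "invoice", "pay"],
    ["order", "invoice", "ship", "pay"],
    ["order", "ship", "invoice", "pay"],
]
dfg = directly_follows(log)
print(dfg[("order", "ship")], dfg[("invoice", "pay")])  # 2 2
```

In a cross-organizational setting, the hard part the reviewed papers address is assembling that event log from several partners' systems before a step like this can run.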
LEAN THINKING IN SOFTWARE ENGINEERING: A SYSTEMATIC REVIEW (ijseajournal)
The field of Software Engineering has undergone considerable transformation in the last decades under the influence of the philosophy of Lean Thinking. The purpose of this systematic review is to identify practices and approaches proposed in the last 5 years by researchers in this area who have worked under the influence of this thinking. The search strategy brought together 549 studies, 80 of which were classified as relevant for synthesis in this review. Seventeen tools of Lean Thinking adapted to Software Engineering were catalogued, as well as 35 practices created for software development under the influence of this philosophy. The study provides a roadmap of results with the current state of the art and identifies gaps pointing to opportunities for further research.
Amazon products reviews classification based on machine learning, deep learni... (TELKOMNIKA JOURNAL)
In recent times, the trend of online shopping through e-commerce stores and websites has grown to a huge extent. Whenever a product is purchased on an e-commerce platform, people leave reviews about the product. These reviews are very helpful to store owners and product manufacturers for improving their work processes as well as product quality. An automated system is proposed in this work that operates on two datasets, D1 and D2, obtained from Amazon. After certain preprocessing steps, N-gram and word embedding-based features are extracted using term frequency-inverse document frequency (TF-IDF), bag of words (BoW), global vectors (GloVe), and Word2vec. Four machine learning (ML) models, support vector machine (SVM), random forest (RF), logistic regression (LR), and multinomial Naïve Bayes (MNB), two deep learning (DL) models, convolutional neural network (CNN) and long short-term memory (LSTM), and standalone bidirectional encoder representations from transformers (BERT) are used to classify reviews as either positive or negative. The results obtained by the standard ML and DL models and BERT are evaluated using standard performance measures. BERT turns out to be the best-performing model on D1, with an accuracy of 90% on features derived by word embedding models, while CNN provides the best accuracy of 97% on word embedding features in the case of D2. The proposed model shows better overall performance on D2 than on D1.
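The TF-IDF feature step mentioned above can be sketched in a few lines: each review becomes a vector where a term's weight is its term frequency times log(N/df). The reviews below are toy examples, not the Amazon datasets:

```python
import math

# TF-IDF sketch: weight(term, doc) = tf(term, doc) * log(N / df(term)).
# Reviews are fabricated toy data.

def tfidf(docs):
    n = len(docs)
    df = {}
    for doc in docs:
        for term in set(doc):
            df[term] = df.get(term, 0) + 1
    vectors = []
    for doc in docs:
        tf = {t: doc.count(t) / len(doc) for t in set(doc)}
        vectors.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return vectors

reviews = [
    "great product works great".split(),
    "bad product broke fast".split(),
    "works as described".split(),
]
vecs = tfidf(reviews)
print(max(vecs[0], key=vecs[0].get))  # great
```

Terms frequent in one review but rare across the corpus get the highest weights, which is what makes these vectors useful inputs to the SVM, RF, LR, and MNB classifiers the paper compares.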
Design, simulation, and analysis of microstrip patch antenna for wireless app...TELKOMNIKA JOURNAL
In this study, a microstrip patch antenna that works at 3.6 GHz was built and tested to see how well it works. In this work, Rogers RT/Duroid 5880 has been used as the substrate material, with a dielectric permittivity of 2.2 and a thickness of 0.3451 mm; it serves as the base for the examined antenna. The computer simulation technology (CST) studio suite is utilized to model the recommended antenna design. The goal of this study was to obtain a wider bandwidth, a lower voltage standing wave ratio (VSWR), and a lower return loss, but the main goal was to achieve higher gain, directivity, and efficiency. After simulation, the return loss, gain, directivity, bandwidth, and efficiency of the presented antenna are found to be -17.626 dB, 9.671 dBi, 9.924 dBi, 0.2 GHz, and 97.45%, respectively. Besides, the simulation revealed that the side-lobe level at phi, at -28.8 dB, was much better than those of earlier works. Thus, the antenna makes a solid contender for wireless technology and more robust communication.
Similar to Modeling Ontology and Semantic Network of Regulations in Customs and Excise
Adoption of internal web technologies by oecd turkish government officialsijmpict
Use of the OECD's communication and information channels has been increasingly encouraged through new channels such as the OECD's Committee Information Service (OLIS) and Clearspace (CS) web portals. A logit regression model was used to estimate the influence of the government's supply-side policy tools and organisational factors on the decision to open OLIS and Clearspace accounts. Additionally, a probability analysis was conducted to give insights into the usage frequency of information channels. The study used a dataset of 126 Turkish top-level country and municipal government officials working on different OECD study topics in 2010. The findings imply that the influence of the explanatory variables tested differs between the two web-portal models. Satisfaction with the timing of information provided by the OECD Permanent Delegation (timing issues in reaching reports) among officials is the only variable that consistently has a positive influence on the adoption of both web-portal applications. The probability analysis shows that while duration of employment and degree of expertise increase the probability of using online information channels, work duration on OECD topics and meeting participation are the variables that decrease the probability of using face-to-face communication channels.
Innovative Technologies And Software For Higher Education...Robin Anderson
The president of a company realized they were no longer competitive because they lacked a project management methodology. A consultant had previously recommended developing an enterprise project management methodology (EPM) and a project management office. The president approved a five-step plan to implement the EPM within six months, which included assessing current templates, determining what could remain, developing a method to capture best practices, and educating and training employees on the new methodology.
Three dimensions of organizational interoperability. Insights from recent stu...ePractice.eu
Authors: Herbert Kubicek, Ralf Cimander.
Interoperability (IOP) is considered a critical success factor for forging ahead in the online provision of public services. Interoperability frameworks are intended to give practitioners guidance on what to consider and do in order to enable seamless interaction with other public authorities and clients.
Building a recommendation system based on the job offers extracted from the w...IJECEIAES
Recruitment, or job search, is increasingly used throughout the world by a large population of users through various channels, such as websites, platforms, and professional networks. Given the large volume of information related to job descriptions and user profiles, it is complicated to appropriately match a user's profile with a job description, and vice versa. The traditional job search approach has drawbacks, since the job seeker needs to search for job offers on each recruitment platform, manage multiple accounts, and apply for the relevant vacancies, which wastes considerable time and effort. The contribution of this research work is the construction of a recommendation system based on job offers extracted from the web and on the e-portfolios of job seekers. After the extraction of the data, natural language processing is applied so that the data is structured and ready for filtering and analysis. The proposed system is content-based: it measures the degree of correspondence between the attributes of the e-portfolio and those of each job offer in the same list of competence specialties using the Euclidean distance, and the results are ranked in decreasing order of relevance so that the most relevant job offers are displayed first.
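A content-based matcher of this kind can be sketched in a few lines: each e-portfolio and job offer is reduced to a numeric attribute vector, offers are scored by Euclidean distance to the profile, and results are sorted ascending (smaller distance means a better match). The attribute names and vectors below are invented for illustration, not taken from the paper:

```python
from math import dist  # Euclidean distance, available since Python 3.8

def rank_offers(profile, offers):
    """Rank job offers by Euclidean distance to a candidate profile.

    profile: list of numeric attribute scores (e.g. skill levels)
    offers:  dict mapping offer id -> attribute vector of equal length
    Returns offer ids sorted from most to least relevant.
    """
    return sorted(offers, key=lambda oid: dist(profile, offers[oid]))

# Hypothetical attribute vectors: [python, sql, communication]
profile = [0.9, 0.6, 0.7]
offers = {
    "data_engineer": [0.8, 0.9, 0.5],
    "web_dev":       [0.7, 0.2, 0.6],
    "analyst":       [0.9, 0.7, 0.7],
}
print(rank_offers(profile, offers))  # most relevant offer id comes first
```

In practice the vectors would be produced by the NLP pipeline described in the abstract; the ranking step itself stays this simple.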
Financial Management Information System within Government Institution and Sup...sececonf
The provision of comprehensive financial information by government institutions is needed by the wider community to boost the effectiveness of information flows to society and government, and to support decision-making. This system produces information that encourages the realization of a clean, transparent government able to respond to changing demands effectively. The success of an information system therefore depends on users accepting the system and the underlying information technology, which in turn improves their performance. This research aims to examine the acceptance of regional financial information systems in government using the Technology Acceptance Model (TAM) and to provide evidence of its influence on performance. The study surveyed 556 respondents who are civil servants in Lampung Province. The findings, using SEM analysis, show that all constructs have an effect in conformity with the concept of TAM in a government institution. This study reveals that all variables included in the TAM model have a positive impact on users' performance. It also improves the effectiveness of the information system within the government institution, especially the implementation of the financial management information system.
Keywords: Financial Management Information System, supply chain strategy, Technology Acceptance Model (TAM), User's Performance
A Combinational Approach of GIS and SOA for Performance Improvement of Organi...dannyijwest
A GIS (Geographic Information System) is a system designed to capture, store, manipulate, analyze, manage, and present all types of geographically referenced data. In other words, GIS is the merging of statistical analysis, database technology, and cartography, and it can be integrated into any enterprise information system framework. Despite all these advantages, geographic information systems used in organizational structures face problems such as lack of interoperability, business alignment, and agility.
Cloud Computing and Content Management Systems : A Case Study in Macedonian E...neirew J
Technologies have become inseparable from our lives, the economy, and society as a whole. For example, clouds provide numerous computing resources that can facilitate our lives, whereas Content Management Systems (CMSs) can provide the right content for the right user. Thus, education must embrace these emerging technologies in order to prepare citizens for the 21st century. The research explored ‘if’ and ‘how’ Cloud Computing influences the application of CMSs, and ‘if’ and ‘how’ it fosters the usage of mobile technologies to access cloud resources. The analyses revealed that some of the respondents have sound experience in using clouds and CMSs. Nevertheless, it was evident that a significant number of respondents have limited or no experience with cloud computing concepts, cloud security, and CMSs as well. Institutions of the system should update educational policies in order to enable education innovation, provide means and support, and continuously update/upgrade educational infrastructure.
CLOUD COMPUTING AND CONTENT MANAGEMENT SYSTEMS: A CASE STUDY IN MACEDONIAN ED...ijccsa
- The document discusses a case study on how cloud computing and content management systems are being applied in Macedonian education. A questionnaire was distributed to teachers and students to examine their experiences with cloud computing, content management systems, and how cloud computing influences the use of mobile technologies.
- The results found that teachers had significant experience using ICTs for both professional and personal purposes. However, they had more experience using cloud computing than content management systems. Many respondents were unsure about how cloud computing influences the use of content management systems and had misconceptions about cloud data security.
- Younger respondents felt that cloud computing moderately facilitates the use of mobile devices, while older teachers felt it significantly facilitates mobile device use. This suggests that
Management Information System - Human Resource & Payroll System for Sri Lanka...Dr. Amarjeet Singh
Payroll is one of the most critical processes in every large organization; it has many functionalities to fulfill on time for a smooth payroll process. When the number of employees varies over time due to retirements and new appointments, and the calculation method differs accordingly, the complexity of the system increases. Management operations concerning data can be complicated when the amount of data involved is large. More importantly, when the data involved requires a high level of accuracy, the management role becomes hectic. Payroll is a sector of management in which accuracy is required, and a lapse in the data can have an adverse effect on employee motivation and production. In such a system, cost estimation is vital for top management to proceed with future organizational plans.
Towards legally-compliant governmental case work with Dynamic Condition Respo...Hugo Andrés López
We present ongoing efforts to generate computational models of Danish regulations, as well as to identify the fragments that are amenable to formal verification.
National archival platform system design using the microservice-based service...CSITiaesprime
Archives play a vital function in the dynamics of people and nations as an instrument for preserving information in diverse domains of politics, society, economics, culture, science, and technology. The acceleration of digital transformation triggers the implementation of a smart government that supports better public services. The smart government encourages a national archival system to facilitate archive producers and users. The four electronic-based government system (SPBE) factors in the archival sector and the open archival information system (OAIS) as a data preservation standard are the benchmarks in developing this study's national archival platform system. An improved service computing system engineering (SCSE) framework adapted to the microservice architecture is used to aid the design of the national archival platform system. The proposed design met the four-factor service design validation of coupling, cohesion, complexity, and reusability. The prototype also suggests what resources are needed to put the design into action by passing the performance test of availability measurement.
IRJET- Content Analysis of Websites of Health Ministries in ECOWAS English Sp...IRJET Journal
This document evaluates the websites of the Ministries of Health of five English-speaking countries in the Economic Community of West African States (ECOWAS) using the Website Attribute Evaluation System (WAES). The evaluation found that the websites were at a similar basic level of development, focusing primarily on information dissemination. Content analysis was conducted on attributes such as information provided, page layout, multimedia elements, and rate of updates. The results showed minor variations across countries but overall consistency in the basic level of development of the health ministry websites.
A Framework for Automated Association Mining Over Multiple DatabasesGurdal Ertek
Literature on association mining, the data mining methodology that investigates associations between items, has primarily focused on efficiently mining larger databases. The motivation for association mining is to use the rules obtained from historical data to influence future transactions. However, associations in transactional processes change significantly over time, implying that rules extracted for a given time interval may not be applicable for a later time interval. Hence, an analysis framework is necessary to identify how associations change over time. This paper presents such a framework, reports the implementation of the framework as a tool, and demonstrates the applicability of and the necessity for the framework through a case study in the domain of finance.
http://research.sabanciuniv.edu.
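The core of such a framework is comparing rule metrics across time windows. A minimal sketch (the transaction data is invented for illustration) computes the support and confidence of an association rule in two periods and reports the drift:

```python
def support(transactions, items):
    """Fraction of transactions containing every item in `items`."""
    items = set(items)
    return sum(items <= set(t) for t in transactions) / len(transactions)

def confidence(transactions, lhs, rhs):
    """Confidence of the association rule lhs -> rhs."""
    return support(transactions, set(lhs) | set(rhs)) / support(transactions, lhs)

# Hypothetical transactions from two time intervals
period1 = [{"a", "b"}, {"a", "b", "c"}, {"a"}, {"b", "c"}]
period2 = [{"a"}, {"a", "c"}, {"b"}, {"a", "c"}]

c1 = confidence(period1, {"a"}, {"b"})  # rule holds in period 1
c2 = confidence(period2, {"a"}, {"b"})  # rule has vanished in period 2
print(f"rule a->b drifted by {c1 - c2:+.2f} confidence")
```

A real implementation would mine the rules first (e.g. with Apriori) and then track each rule's metrics over a sliding sequence of windows; the comparison step is exactly this.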
Employees Adoption of E-Procurement System: An Empirical StudyIJMIT JOURNAL
Today, organizations are investing a lot in their IT infrastructure and reengineering their business processes by digitizing their firms. If organizational employees do not optimally utilize the IT infrastructure, the productivity gain is reduced enormously. In Uttarakhand, an e-procurement system was implemented by the public sector under the e-governance integrated mission mode projects. So, there is a need to find the determinants which influence employees' adoption and use of e-procurement systems. This research study assesses the organizational and individual determinants that influence the use of the e-procurement system in the Uttarakhand public sector. This study provides managers with valuable information for undertaking intervention programs to achieve greater acceptance and usage of the e-procurement system. Data were collected for this study by means of a survey conducted in Uttarakhand state in 2011. A total of 1,200 questionnaire forms were distributed personally and online to employees using the e-procurement system in Uttarakhand.
Visualizing stemming techniques on online news articles text analyticsjournalBEEI
Stemming is the process of converting words into their root words by a stemming algorithm. It is one of the main processes in text analytics, where the text data needs to go through a stemming process before proceeding to further analysis. Text analytics is a very common practice nowadays that is used to analyze the content of text data from various sources such as the mass media and social media. In this study, two different stemming techniques, Porter and Lancaster, are evaluated. The differences in the outputs that result from the two stemming techniques are discussed based on the stemming errors and the resulting visualizations. The finding from this study shows that Porter stemming performs better than Lancaster stemming, by 43%, based on the stemming errors produced. Visualization can still be accommodated by the stemmed text data, but some understanding of the background of the text data is needed by the tool users to ensure that correct interpretation can be made of the visualization outputs.
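The contrast between a conservative stemmer (Porter-like) and an aggressive one (Lancaster-like) can be illustrated with a toy suffix-stripping function. The rule lists below are invented and far simpler than the real algorithms; they only demonstrate why an aggressive stemmer produces more over-stemming errors:

```python
def strip_suffixes(word, suffixes):
    """Toy stemmer: repeatedly strip the longest matching suffix,
    keeping at least a 3-character stem."""
    changed = True
    while changed:
        changed = False
        for suf in sorted(suffixes, key=len, reverse=True):
            if word.endswith(suf) and len(word) - len(suf) >= 3:
                word = word[: -len(suf)]
                changed = True
                break
    return word

CONSERVATIVE = ["ing", "ed", "s"]                          # few rules
AGGRESSIVE = ["ing", "ed", "s", "er", "ion", "al", "ly"]   # strips harder

for w in ["connections", "generally", "running"]:
    print(w, strip_suffixes(w, CONSERVATIVE), strip_suffixes(w, AGGRESSIVE))
```

Note how the aggressive rule set reduces "generally" all the way to "gen", an over-stemming error of exactly the kind the study counts when comparing Porter and Lancaster.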
Hot Topic Detection and Technology Trend Tracking for Patents utilizing Term ...Ly Nguyen
This paper proposes a methodology for identifying hot topics and tracking technology trends in the patent domain. The methodology uses frequency information in combination with the International Patent Classification (IPC) to capture semantic information on word categorization, doing so in a way that heretofore has not been employed for topic detection and trend tracking. Term Frequency and Proportional Document Frequency (TF*PDF) is employed as a means to detect hot topics from patents, and IPCs are used to calculate the semantic importance of terms based on the IPCs where the terms are distributed. Aging Theory is also used to calculate the variation of trends over time. Four types of trends, including very stable trends, stable trends, normal trends, and unstable trends, are defined and evaluated based on TF*PDF and on TF*PDF combined with Aging Theory. Experimental results show that for very stable trends, the combination of TF*PDF and Aging Theory achieves 0.976 in precision; for stable trends and all trends, TF*PDF achieves 0.959 and 0.84 in precision, respectively. By applying TF*PDF in consideration of semantic information, we also show a new criterion for weighting hot topics and tracking technology trends.
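The TF*PDF weight, as commonly formulated by Bun and Ishizuka, sums over every channel the normalized term frequency scaled exponentially by the proportion of that channel's documents containing the term. A minimal sketch with invented counts (the channel structure and numbers are illustrative only):

```python
from math import exp, sqrt

def tf_pdf(term, channels):
    """TF*PDF weight of `term`: sum over channels of
    (normalized frequency) * exp(docs containing term / total docs).

    channels: list of dicts, one per channel, each with
      'tf': {term: raw frequency}, 'df': {term: doc count}, 'N': total docs
    """
    weight = 0.0
    for ch in channels:
        norm = sqrt(sum(f * f for f in ch["tf"].values()))
        f_jc = ch["tf"].get(term, 0) / norm           # normalized frequency
        weight += f_jc * exp(ch["df"].get(term, 0) / ch["N"])
    return weight

# Hypothetical two-channel patent corpus
channels = [
    {"tf": {"battery": 30, "antenna": 40}, "df": {"battery": 25, "antenna": 10}, "N": 50},
    {"tf": {"battery": 12, "antenna": 5},  "df": {"battery": 10, "antenna": 2},  "N": 20},
]
print(tf_pdf("battery", channels) > tf_pdf("antenna", channels))  # True
```

The exponential document-frequency factor is what makes a term appearing in many documents of a channel count as "hot", even when its raw frequency is lower than a term concentrated in a few documents.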
This article analyzes the terminology of instructional technology from a historical perspective. It discusses how the development of terms has impacted the nomenclature, role, domain, and object of study of instructional technology over time. Originally, instructional technology was seen as instructional media. It was then considered a science and process that includes design, development, utilization, management and evaluation of learning resources and processes. Currently, the focus is on facilitating learning and improving performance. The article aims to provide a historical overview of how the terminology of instructional technology has evolved.
Process Mining in Supply Chains: A Systematic Literature Review IJECEIAES
Performance analysis and continuous process improvement efforts are often supported by the construction of process models representing the interactions of the partners in the supply chain. This study was conducted to determine the state of the art in the process mining field, specifically in the context of cross-organizational processes. The Systematic Literature Review (SLR) method is used to review a collection of twenty-one papers that are classified according to the artifact framework of Hevner et al. and within the process mining framework of Van der Aalst. In the reviewed papers, the authors employed a variety of techniques to establish the event log, which is then used to perform the process mining analysis. Eight of the reviewed papers focus on the definition of concepts or measures. Five of the papers describe models and other abstractions that are used as a theoretical basis for process mining in the context of supply chains. The majority of the papers, twenty, describe some kind of informal method or formal algorithm to perform process mining analysis. Nine of the papers that propose a formal algorithm also present an accompanying software implementation. Eight papers discuss the data preparation challenges, and twelve papers discuss process discovery techniques.
LEAN THINKING IN SOFTWARE ENGINEERING: A SYSTEMATIC REVIEWijseajournal
The field of Software Engineering has undergone considerable transformation in the last decades due to the influence of the philosophy of Lean Thinking. The purpose of this systematic review is to identify practices and approaches proposed by researchers in this area in the last 5 years who have worked under the influence of this thinking. The search strategy brought together 549 studies, 80 of which were classified as relevant for synthesis in this review. Seventeen tools of Lean Thinking adapted to Software Engineering were catalogued, as well as 35 practices created for the development of software influenced by this philosophy. The study provides a roadmap of results with the current state of the art and the identification of gaps pointing to opportunities for further research.
Similar to Modeling Ontology and Semantic Network of Regulations in Customs and Excise (20)
Design and simulation an optimal enhanced PI controller for congestion avoida...TELKOMNIKA JOURNAL
This document describes using a snake optimization algorithm to tune the gains of an enhanced proportional-integral controller for congestion avoidance in a TCP/AQM system. The controller aims to maintain a stable and desired queue size without noise or transmission problems. A linearized model of the TCP/AQM system is presented. An enhanced PI controller combining nonlinear gain and original PI gains is proposed. The snake optimization algorithm is then used to tune the parameters of the enhanced PI controller to achieve optimal system performance and response. Simulation results are discussed showing the proposed controller provides a stable and robust behavior for congestion control.
Improving the detection of intrusion in vehicular ad-hoc networks with modifi...TELKOMNIKA JOURNAL
Vehicular ad-hoc networks (VANETs) are wireless-equipped vehicles that form networks along the road. The security of this network has been a major challenge. The identity-based cryptosystem (IBC) previously used to secure the networks suffers from membership authentication security features. This paper focuses on improving the detection of intruders in VANETs with a modified identity-based cryptosystem (MIBC). The MIBC is developed using a non-singular elliptic curve with Lagrange interpolation. The public key of vehicles and roadside units on the network are derived from number plates and location identification numbers, respectively. Pseudo-identities are used to mask the real identity of users to preserve their privacy. The membership authentication mechanism ensures that only valid and authenticated members of the network are allowed to join the network. The performance of the MIBC is evaluated using intrusion detection ratio (IDR) and computation time (CT) and then validated with the existing IBC. The result obtained shows that the MIBC recorded an IDR of 99.3% against 94.3% obtained for the existing identity-based cryptosystem (EIBC) for 140 unregistered vehicles attempting to intrude on the network. The MIBC shows lower CT values of 1.17 ms against 1.70 ms for EIBC. The MIBC can be used to improve the security of VANETs.
Conceptual model of internet banking adoption with perceived risk and trust f...TELKOMNIKA JOURNAL
Understanding the primary factors of internet banking (IB) acceptance is critical for both banks and users; nevertheless, our knowledge of the role of users’ perceived risk and trust in IB adoption is limited. As a result, we develop a conceptual model by incorporating perceived risk and trust into the technology acceptance model (TAM) theory for IB. Prior research emphasized that the most essential component in explaining IB adoption behavior is the behavioral intention to use IB. TAM is helpful for figuring out how the elements that affect IB adoption are connected to one another. According to previous literature on IB and the use of such technology in Iraq, one has to choose a theoretical foundation that can justify the acceptance of IB from the customer’s perspective. The conceptual model was therefore constructed using the TAM as a foundation. Furthermore, perceived risk and trust were added to the TAM dimensions as external factors. The key objective of this work was to extend the TAM to construct a conceptual model for IB adoption and to gather sufficient theoretical support from the existing literature for the essential elements and their relationships, in order to unearth new insights about the factors responsible for IB adoption.
Efficient combined fuzzy logic and LMS algorithm for smart antennaTELKOMNIKA JOURNAL
The smart antennas are broadly used in wireless communication. The least mean square (LMS) algorithm is a procedure concerned with controlling the smart antenna pattern to accommodate specified requirements, such as steering the beam toward the desired signal and placing deep nulls in the direction of unwanted signals. The conventional LMS (C-LMS) has some drawbacks, like slow convergence speed and high steady-state fluctuation error. To overcome these shortcomings, the present paper adopts an adaptive fuzzy control step size least mean square (FC-LMS) algorithm to adjust the step size. Computer simulation outcomes illustrate that the given model has a fast convergence rate as well as a low steady-state mean square error.
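The C-LMS update at the heart of both variants is a one-line gradient step, w ← w + μ·e·x. A scalar-weight sketch identifying an unknown gain (the signal and step size are invented; real smart-antenna LMS adapts a vector of complex array weights):

```python
def lms_identify(x, d, mu=0.1, w0=0.0):
    """Adapt a single weight w so that w*x[k] tracks the desired signal d[k].

    Classic LMS update: w <- w + mu * e * x, with error e = d - w*x.
    Returns the final weight and the per-step squared errors.
    """
    w, errors = w0, []
    for xk, dk in zip(x, d):
        e = dk - w * xk
        w += mu * e * xk
        errors.append(e * e)
    return w, errors

# Desired signal is the input scaled by an unknown gain of 3.0
x = [1.0, -0.5, 0.8, 1.2, -1.0, 0.6, 0.9, -0.7] * 10
d = [3.0 * xk for xk in x]
w, errors = lms_identify(x, d, mu=0.2)
print(round(w, 3), errors[0] > errors[-1])  # weight converges toward 3.0
```

A fixed μ trades convergence speed against steady-state fluctuation, which is exactly the tension the fuzzy-controlled step size in the abstract is meant to resolve.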
Design and implementation of a LoRa-based system for warning of forest fireTELKOMNIKA JOURNAL
This paper presents the design and implementation of a forest fire monitoring and warning system based on long range (LoRa) technology, a novel ultra-low power consumption and long-range wireless communication technology for remote sensing applications. The proposed system includes a wireless sensor network that records environmental parameters such as temperature, humidity, wind speed, and carbon dioxide (CO2) concentration in the air, as well as taking infrared photos. The data collected at each sensor node will be transmitted to the gateway via LoRa wireless transmission. Data will be collected, processed, and uploaded to a cloud database at the gateway. An Android smartphone application that allows anyone to easily view the recorded data has been developed. When a fire is detected, the system will sound a siren and send a warning message to the responsible personnel, instructing them to take appropriate action. Experiments in Tram Chim Park, Vietnam, have been conducted to verify and evaluate the operation of the system.
Wavelet-based sensing technique in cognitive radio networkTELKOMNIKA JOURNAL
Cognitive radio is a smart radio that can change its transmitter parameter based on interaction with the environment in which it operates. The demand for frequency spectrum is growing due to a big data issue as many Internet of Things (IoT) devices are in the network. Based on previous research, most frequency spectrum was used, but some spectrums were not used, called spectrum hole. Energy detection is one of the spectrum sensing methods that has been frequently used since it is easy to use and does not require license users to have any prior signal understanding. But this technique is incapable of detecting at low signal-to-noise ratio (SNR) levels. Therefore, the wavelet-based sensing is proposed to overcome this issue and detect spectrum holes. The main objective of this work is to evaluate the performance of wavelet-based sensing and compare it with the energy detection technique. The findings show that the percentage of detection in wavelet-based sensing is 83% higher than energy detection performance. This result indicates that the wavelet-based sensing has higher precision in detection and the interference towards primary user can be decreased.
A novel compact dual-band bandstop filter with enhanced rejection bandsTELKOMNIKA JOURNAL
In this paper, we present the design of a new wide dual-band bandstop filter (DBBSF) using nonuniform transmission lines. The method used to design this filter is to replace conventional uniform transmission lines with nonuniform lines governed by a truncated Fourier series. Based on how impedances are profiled in the proposed DBBSF structure, the fractional bandwidths of the two 10 dB-down rejection bands are widened to 39.72% and 52.63%, respectively, and the physical size has been reduced compared to that of the filter with the uniform transmission lines. The results of the electromagnetic (EM) simulation support the obtained analytical response and show an improved frequency behavior.
Deep learning approach to DDoS attack with imbalanced data at the application...TELKOMNIKA JOURNAL
A distributed denial of service (DDoS) attack is where one or more computers attack or target a server computer, by flooding internet traffic to the server. As a result, the server cannot be accessed by legitimate users. A result of this attack causes enormous losses for a company because it can reduce the level of user trust, and reduce the company’s reputation to lose customers due to downtime. One of the services at the application layer that can be accessed by users is a web-based lightweight directory access protocol (LDAP) service that can provide safe and easy services to access directory applications. We used a deep learning approach to detect DDoS attacks on the CICDDoS 2019 dataset on a complex computer network at the application layer to get fast and accurate results for dealing with unbalanced data. Based on the results obtained, it is observed that DDoS attack detection using a deep learning approach on imbalanced data performs better when implemented using synthetic minority oversampling technique (SMOTE) method for binary classes. On the other hand, the proposed deep learning approach performs better for detecting DDoS attacks in multiclass when implemented using the adaptive synthetic (ADASYN) method.
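SMOTE's core idea is simple enough to sketch: each synthetic minority sample is placed at a random point on the segment between a minority sample and one of its minority-class neighbors. This is a toy 2-D version using the single nearest neighbor (real SMOTE draws from the k nearest neighbors); the data points are invented:

```python
import random

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def smote(minority, n_new, seed=0):
    """Generate n_new synthetic minority samples by interpolation."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        # nearest minority neighbor other than x itself
        nn = min((p for p in minority if p is not x), key=lambda p: euclidean(x, p))
        gap = rng.random()  # position along the segment x -> nn
        synthetic.append(tuple(xi + gap * (ni - xi) for xi, ni in zip(x, nn)))
    return synthetic

minority = [(1.0, 1.0), (1.2, 0.9), (2.0, 2.1)]
new = smote(minority, n_new=4)
print(len(new))  # 4 synthetic samples lying between minority neighbors
```

ADASYN follows the same interpolation scheme but allocates more synthetic samples to minority points that are harder to learn, which is why the two methods can behave differently on binary versus multiclass DDoS data.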
The appearance of uncertainties and disturbances often affects the characteristics of both linear and nonlinear systems. Moreover, the stabilization process may deteriorate, incurring a catastrophic effect on system performance. As such, this manuscript addresses the concept of the matching condition for systems that suffer from mismatched uncertainties and exogenous disturbances. The perturbation of the system at hand is assumed to be known and unbounded. To reach this outcome, uncertainties and their classifications are reviewed thoroughly. The structural matching condition is proposed and stated in Proposition 1. Two types of mathematical expressions are presented to distinguish the system with matched uncertainty from the system with mismatched uncertainty. Lastly, two-dimensional numerical examples are provided to exercise the proposed proposition. The outcome shows that the matching condition has the ability to transform the system into a design-friendly model for asymptotic stabilization.
Implementation of FinFET technology based low power 4×4 Wallace tree multipli...TELKOMNIKA JOURNAL
Many systems, including digital signal processors, finite impulse response (FIR) filters, application-specific integrated circuits, and microprocessors, use multipliers. The demand for low power multipliers is gradually rising day by day in the current technological trend. In this study, we describe a 4×4 Wallace multiplier based on a carry select adder (CSA) that uses less power and has a better power delay product than existing multipliers. HSPICE tool at 16 nm technology is used to simulate the results. In comparison to the traditional CSA-based multiplier, which has a power consumption of 1.7 µW and power delay product (PDP) of 57.3 fJ, the results demonstrate that the Wallace multiplier design employing CSA with first zero finding logic (FZF) logic has the lowest power consumption of 1.4 µW and PDP of 27.5 fJ.
Evaluation of the weighted-overlap add model with massive MIMO in a 5G systemTELKOMNIKA JOURNAL
The flaw in 5G orthogonal frequency division multiplexing (OFDM) becomes apparent in high-speed situations. Because the doppler effect causes frequency shifts, the orthogonality of OFDM subcarriers is broken, lowering both their bit error rate (BER) and throughput output. As part of this research, we use a novel design that combines massive multiple input multiple output (MIMO) and weighted overlap and add (WOLA) to improve the performance of 5G systems. To determine which design is superior, throughput and BER are calculated for both the proposed design and OFDM. The results of the improved system show a massive improvement in performance ver the conventional system and significant improvements with massive MIMO, including the best throughput and BER. When compared to conventional systems, the improved system has a throughput that is around 22% higher and the best performance in terms of BER, but it still has around 25% less error than OFDM.
Reflector antenna design in different frequencies using frequency selective s...TELKOMNIKA JOURNAL
In this study, it is aimed to obtain two different asymmetric radiation patterns obtained from antennas in the shape of the cross-section of a parabolic reflector (fan blade type antennas) and antennas with cosecant-square radiation characteristics at two different frequencies from a single antenna. For this purpose, firstly, a fan blade type antenna design will be made, and then the reflective surface of this antenna will be completed to the shape of the reflective surface of the antenna with the cosecant-square radiation characteristic with the frequency selective surface designed to provide the characteristics suitable for the purpose. The frequency selective surface designed and it provides the perfect transmission as possible at 4 GHz operating frequency, while it will act as a band-quenching filter for electromagnetic waves at 5 GHz operating frequency and will be a reflective surface. Thanks to this frequency selective surface to be used as a reflective surface in the antenna, a fan blade type radiation characteristic at 4 GHz operating frequency will be obtained, while a cosecant-square radiation characteristic at 5 GHz operating frequency will be obtained.
Reagentless iron detection in water based on unclad fiber optical sensorTELKOMNIKA JOURNAL
A simple and low-cost fiber based optical sensor for iron detection is demonstrated in this paper. The sensor head consist of an unclad optical fiber with the unclad length of 1 cm and it has a straight structure. Results obtained shows a linear relationship between the output light intensity and iron concentration, illustrating the functionality of this iron optical sensor. Based on the experimental results, the sensitivity and linearity are achieved at 0.0328/ppm and 0.9824 respectively at the wavelength of 690 nm. With the same wavelength, other performance parameters are also studied. Resolution and limit of detection (LOD) are found to be 0.3049 ppm and 0.0755 ppm correspondingly. This iron sensor is advantageous in that it does not require any reagent for detection, enabling it to be simpler and cost-effective in the implementation of the iron sensing.
Impact of CuS counter electrode calcination temperature on quantum dot sensit...TELKOMNIKA JOURNAL
In place of the commercial Pt electrode used in quantum sensitized solar cells, the low-cost CuS cathode is created using electrophoresis. High resolution scanning electron microscopy and X-ray diffraction were used to analyze the structure and morphology of structural cubic samples with diameters ranging from 40 nm to 200 nm. The conversion efficiency of solar cells is significantly impacted by the calcination temperatures of cathodes at 100 °C, 120 °C, 150 °C, and 180 °C under vacuum. The fluorine doped tin oxide (FTO)/CuS cathode electrode reached a maximum efficiency of 3.89% when it was calcined at 120 °C. Compared to other temperature combinations, CuS nanoparticles crystallize at 120 °C, which lowers resistance while increasing electron lifetime.
In place of the commercial Pt electrode used in quantum sensitized solar cells, the low-cost CuS cathode is created using electrophoresis. High resolution scanning electron microscopy and X-ray diffraction were used to analyze the structure and morphology of structural cubic samples with diameters ranging from 40 nm to 200 nm. The conversion efficiency of solar cells is significantly impacted by the calcination temperatures of cathodes at 100 °C, 120 °C, 150 °C, and 180 °C under vacuum. The fluorine doped tin oxide (FTO)/CuS cathode electrode reached a maximum efficiency of 3.89% when it was calcined at 120 °C. Compared to other temperature combinations, CuS nanoparticles crystallize at 120 °C, which lowers resistance while increasing electron lifetime.
A progressive learning for structural tolerance online sequential extreme lea...TELKOMNIKA JOURNAL
This article discusses the progressive learning for structural tolerance online sequential extreme learning machine (PSTOS-ELM). PSTOS-ELM can save robust accuracy while updating the new data and the new class data on the online training situation. The robustness accuracy arises from using the householder block exact QR decomposition recursive least squares (HBQRD-RLS) of the PSTOS-ELM. This method is suitable for applications that have data streaming and often have new class data. Our experiment compares the PSTOS-ELM accuracy and accuracy robustness while data is updating with the batch-extreme learning machine (ELM) and structural tolerance online sequential extreme learning machine (STOS-ELM) that both must retrain the data in a new class data case. The experimental results show that PSTOS-ELM has accuracy and robustness comparable to ELM and STOS-ELM while also can update new class data immediately.
Electroencephalography-based brain-computer interface using neural networksTELKOMNIKA JOURNAL
This study aimed to develop a brain-computer interface that can control an electric wheelchair using electroencephalography (EEG) signals. First, we used the Mind Wave Mobile 2 device to capture raw EEG signals from the surface of the scalp. The signals were transformed into the frequency domain using fast Fourier transform (FFT) and filtered to monitor changes in attention and relaxation. Next, we performed time and frequency domain analyses to identify features for five eye gestures: opened, closed, blink per second, double blink, and lookup. The base state was the opened-eyes gesture, and we compared the features of the remaining four action gestures to the base state to identify potential gestures. We then built a multilayer neural network to classify these features into five signals that control the wheelchair’s movement. Finally, we designed an experimental wheelchair system to test the effectiveness of the proposed approach. The results demonstrate that the EEG classification was highly accurate and computationally efficient. Moreover, the average performance of the brain-controlled wheelchair system was over 75% across different individuals, which suggests the feasibility of this approach.
Adaptive segmentation algorithm based on level set model in medical imagingTELKOMNIKA JOURNAL
For image segmentation, level set models are frequently employed. It offer best solution to overcome the main limitations of deformable parametric models. However, the challenge when applying those models in medical images stills deal with removing blurs in image edges which directly affects the edge indicator function, leads to not adaptively segmenting images and causes a wrong analysis of pathologies wich prevents to conclude a correct diagnosis. To overcome such issues, an effective process is suggested by simultaneously modelling and solving systems’ two-dimensional partial differential equations (PDE). The first PDE equation allows restoration using Euler’s equation similar to an anisotropic smoothing based on a regularized Perona and Malik filter that eliminates noise while preserving edge information in accordance with detected contours in the second equation that segments the image based on the first equation solutions. This approach allows developing a new algorithm which overcome the studied model drawbacks. Results of the proposed method give clear segments that can be applied to any application. Experiments on many medical images in particular blurry images with high information losses, demonstrate that the developed approach produces superior segmentation results in terms of quantity and quality compared to other models already presented in previeous works.
Automatic channel selection using shuffled frog leaping algorithm for EEG bas...TELKOMNIKA JOURNAL
Drug addiction is a complex neurobiological disorder that necessitates comprehensive treatment of both the body and mind. It is categorized as a brain disorder due to its impact on the brain. Various methods such as electroencephalography (EEG), functional magnetic resonance imaging (FMRI), and magnetoencephalography (MEG) can capture brain activities and structures. EEG signals provide valuable insights into neurological disorders, including drug addiction. Accurate classification of drug addiction from EEG signals relies on appropriate features and channel selection. Choosing the right EEG channels is essential to reduce computational costs and mitigate the risk of overfitting associated with using all available channels. To address the challenge of optimal channel selection in addiction detection from EEG signals, this work employs the shuffled frog leaping algorithm (SFLA). SFLA facilitates the selection of appropriate channels, leading to improved accuracy. Wavelet features extracted from the selected input channel signals are then analyzed using various machine learning classifiers to detect addiction. Experimental results indicate that after selecting features from the appropriate channels, classification accuracy significantly increased across all classifiers. Particularly, the multi-layer perceptron (MLP) classifier combined with SFLA demonstrated a remarkable accuracy improvement of 15.78% while reducing time complexity.
CHINA’S GEO-ECONOMIC OUTREACH IN CENTRAL ASIAN COUNTRIES AND FUTURE PROSPECTjpsjournal1
The rivalry between prominent international actors for dominance over Central Asia's hydrocarbon
reserves and the ancient silk trade route, along with China's diplomatic endeavours in the area, has been
referred to as the "New Great Game." This research centres on the power struggle, considering
geopolitical, geostrategic, and geoeconomic variables. Topics including trade, political hegemony, oil
politics, and conventional and nontraditional security are all explored and explained by the researcher.
Using Mackinder's Heartland, Spykman Rimland, and Hegemonic Stability theories, examines China's role
in Central Asia. This study adheres to the empirical epistemological method and has taken care of
objectivity. This study analyze primary and secondary research documents critically to elaborate role of
china’s geo economic outreach in central Asian countries and its future prospect. China is thriving in trade,
pipeline politics, and winning states, according to this study, thanks to important instruments like the
Shanghai Cooperation Organisation and the Belt and Road Economic Initiative. According to this study,
China is seeing significant success in commerce, pipeline politics, and gaining influence on other
governments. This success may be attributed to the effective utilisation of key tools such as the Shanghai
Cooperation Organisation and the Belt and Road Economic Initiative.
International Conference on NLP, Artificial Intelligence, Machine Learning an...gerogepatton
International Conference on NLP, Artificial Intelligence, Machine Learning and Applications (NLAIM 2024) offers a premier global platform for exchanging insights and findings in the theory, methodology, and applications of NLP, Artificial Intelligence, Machine Learning, and their applications. The conference seeks substantial contributions across all key domains of NLP, Artificial Intelligence, Machine Learning, and their practical applications, aiming to foster both theoretical advancements and real-world implementations. With a focus on facilitating collaboration between researchers and practitioners from academia and industry, the conference serves as a nexus for sharing the latest developments in the field.
Advanced control scheme of doubly fed induction generator for wind turbine us...IJECEIAES
This paper describes a speed control device for generating electrical energy on an electricity network based on the doubly fed induction generator (DFIG) used for wind power conversion systems. At first, a double-fed induction generator model was constructed. A control law is formulated to govern the flow of energy between the stator of a DFIG and the energy network using three types of controllers: proportional integral (PI), sliding mode controller (SMC) and second order sliding mode controller (SOSMC). Their different results in terms of power reference tracking, reaction to unexpected speed fluctuations, sensitivity to perturbations, and resilience against machine parameter alterations are compared. MATLAB/Simulink was used to conduct the simulations for the preceding study. Multiple simulations have shown very satisfying results, and the investigations demonstrate the efficacy and power-enhancing capabilities of the suggested control system.
Embedded machine learning-based road conditions and driving behavior monitoringIJECEIAES
Car accident rates have increased in recent years, resulting in losses in human lives, properties, and other financial costs. An embedded machine learning-based system is developed to address this critical issue. The system can monitor road conditions, detect driving patterns, and identify aggressive driving behaviors. The system is based on neural networks trained on a comprehensive dataset of driving events, driving styles, and road conditions. The system effectively detects potential risks and helps mitigate the frequency and impact of accidents. The primary goal is to ensure the safety of drivers and vehicles. Collecting data involved gathering information on three key road events: normal street and normal drive, speed bumps, circular yellow speed bumps, and three aggressive driving actions: sudden start, sudden stop, and sudden entry. The gathered data is processed and analyzed using a machine learning system designed for limited power and memory devices. The developed system resulted in 91.9% accuracy, 93.6% precision, and 92% recall. The achieved inference time on an Arduino Nano 33 BLE Sense with a 32-bit CPU running at 64 MHz is 34 ms and requires 2.6 kB peak RAM and 139.9 kB program flash memory, making it suitable for resource-constrained embedded systems.
Harnessing WebAssembly for Real-time Stateless Streaming PipelinesChristina Lin
Traditionally, dealing with real-time data pipelines has involved significant overhead, even for straightforward tasks like data transformation or masking. However, in this talk, we’ll venture into the dynamic realm of WebAssembly (WASM) and discover how it can revolutionize the creation of stateless streaming pipelines within a Kafka (Redpanda) broker. These pipelines are adept at managing low-latency, high-data-volume scenarios.
Presentation of IEEE Slovenia CIS (Computational Intelligence Society) Chapte...University of Maribor
Slides from talk presenting:
Aleš Zamuda: Presentation of IEEE Slovenia CIS (Computational Intelligence Society) Chapter and Networking.
Presentation at IcETRAN 2024 session:
"Inter-Society Networking Panel GRSS/MTT-S/CIS
Panel Session: Promoting Connection and Cooperation"
IEEE Slovenia GRSS
IEEE Serbia and Montenegro MTT-S
IEEE Slovenia CIS
11TH INTERNATIONAL CONFERENCE ON ELECTRICAL, ELECTRONIC AND COMPUTING ENGINEERING
3-6 June 2024, Niš, Serbia
Literature Review Basics and Understanding Reference Management.pptxDr Ramhari Poudyal
Three-day training on academic research focuses on analytical tools at United Technical College, supported by the University Grant Commission, Nepal. 24-26 May 2024
KuberTENes Birthday Bash Guadalajara - K8sGPT first impressionsVictor Morales
K8sGPT is a tool that analyzes and diagnoses Kubernetes clusters. This presentation was used to share the requirements and dependencies to deploy K8sGPT in a local environment.
ISSN: 1693-6930
TELKOMNIKA Vol. 15, No. 4, December 2017: 1927-1935
that are able to provide for their needs. The availability of rules on such a system, however, is not guaranteed, and this can cause the organization to fail in conveying information about the rules to the community. People, companies, importers, warehouse operators and others would then be unable to find and understand the customs rules they need to know. It is therefore necessary to assess and develop a repository that can solve these problems. Modelling an ontology and a semantic network is considered one solution that can handle them.
An ontology is considered a set of representational primitives that models a domain of knowledge or discourse [1]. Few studies discuss semantic network design and ontologies for regulatory authorities, but the following are some related studies on ontology development. Engers et al. [2] used the POWER programme, which supports a systematic translation of (new) legislation in the Dutch Tax and Customs Administration (DTCA). The results of that research can improve the quality of law enforcement and decrease the time needed to implement changes in legislation and regulations by detecting and reporting anomalies. Klischewski [3] tried to devise a solution for organizing and presenting documents of the state administration of Schleswig-Holstein (Germany) so that users are able to retrieve the documents they need. He proposed an ontology-based approach, but the proposed design of the ontology and its semantics cannot be found in that work.
Corcho et al. [4] built a juridical ontology using the Methontology development methodology and WebODE as the software platform. Furthermore, Saias and Quaresma [5] applied natural language processing (NLP) techniques. The ontology of that study was defined in OWL with a logic programming framework using EVOLP + ISCO. EVOLP is a dynamic logic programming framework that allows the definition of rules for actions and events (law), used by the researchers to instantiate ontologies and object representations, while ISCO allows easy and efficient integration. ISCO is able to connect to an external database and be used as an inference engine that can answer semantic queries over document content.
Gangemi [6] used Content Ontology Design Patterns (CODePs) as a resource and design method for engineering ontology content over the semantic web for legal knowledge engineering, and showed a taxonomy of ontology-driven tasks for legal information in general. Priya [7] proposed a personalized search engine, a hybrid system for personalized search and reasoning over user profiles. They used Protégé as the tool for modelling the ontology, but the relations are still not intuitive because the relations between the classes still look like a hierarchy. This differs from the work of Putra et al. [8], who created a prototype SOP navigator that manages the SOP documents of Bogor Agricultural University. This SOP navigator used the Ruby programming language, the Sinatra web framework and the Neo4j graph database, and the relations between the nodes/classes look very intuitive. Xu [9] concentrated on modelling ontology knowledge management in bicycle product design through UML class diagrams, because bicycle product design does not have a fuzzy hierarchy.
Based on these studies, the OTK method has advantages similar to Methontology, but its steps are more complete, running from the project management process and the ontology development process through to the integration process, whereas Methontology defines the details of the features within those steps. So, OTK is the methodology used in this study. For the semantic network design, this study uses Neo4j 2.3.1 as the main tool. We design the nodes, the relationships and the entire set of regulation documents. This choice is motivated by the minimal research on ontologies using Neo4j, despite its strengths as a tool for graph databases, ontologies and semantics. Kivikangas [10] described that semantic data is easily represented as graphs, so a graph database provides a more natural abstraction for such data than a relational database, while Miller [11] said that graph databases have natural applications in biology, semantics, network systems and recommender systems, which require the type of data model only they can offer. A graph database is also described by Malhotra [12] as an evolution of technical knowledge representation based on the semantic web.
2. Research Method
To obtain research results in accordance with the purpose, the following research steps are taken.
Modeling Ontology and Semantic Network of Regulations in Customs ... (Eva Maulina Aritonang)
The steps of this research follow an ontology structure with the On-To-Knowledge (OTK) development method. The selection of this method is based on the advantage that OTK is more specific and detailed in every one of its steps. Corcho [13] even said that this methodology can identify, based on an analysis of usage scenarios, the goals that must be achieved by knowledge management tools.
a. Kick Off
At this step, the current system is analyzed: the form of knowledge management, the knowledge stored, and the types of knowledge and information produced by the current system. The weaknesses of the current system are analyzed, along with how to model a new system that can cover those weaknesses. Furthermore, the users of the system and the information and knowledge they need are analyzed. To reach the goal of this step, literature studies are carried out and all the needed data is collected (both from interviews and observations).
b. Refinement
This step determines the domain of the ontology, the parts of the ontology and the relations that will be built within it. It starts from designing the hierarchical part of the ontology, then builds the class and subclass relationships within the semantic network, tailored to the needs and the current system. Furthermore, the design uses Neo4j as the graph database software.
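As a sketch of this refinement step, a class can be expressed as a labelled node and a relation as a relationship in Cypher. The `PMK` label, the `amanded_by` relationship type and the regulation numbers below follow this study; the `number` property key is an assumption for illustration only.

```cypher
// Sketch: two PMK regulations and the amendment between them.
// The "number" property key is assumed; the label and relationship
// type follow the classes and relations defined in this study.
CREATE (a:PMK {number: "36/PMK.04/2005"})
CREATE (b:PMK {number: "37/PMK.04/2005"})
CREATE (a)-[:amanded_by]->(b)
```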
c. Evaluation
This step tests the new system that has been built with queries provided by Neo4j. If there is a failure in the test results, then the system must be repaired and improved.
3. Results and Analysis
3.1 Kick Off
The documents of regulation were obtained from the competent directorate. Documents to be entered and displayed in the semantic network should be final documents, so this study uses the regulation documents set in the years 2007-2011. Regulations established after 2011 are not put into the semantic network; this is done to avoid changes in the construction of the semantic network.
From the 755 items of rules managed in the repository, with 15 kinds of rules and 179 regulation tags, we set 106 objects that represent the themes of regulation.
In this study, we used some objects as a prototype of the semantic network. The scope of this research is 2 Areas that have 10 objects of Customs and Excise, 39 Topics, 29 articles of UU, 2 PP, 10 KMK, 50 PMK, 13 Users, 25 Per Dirjen, 16 SE and 2 Kep Dirjen. These regulations are called classes or nodes in the diagram, and all of these nodes are connected by 22 relationships. After determining the scope of the research, we also defined the scope of the topics as well as the regulations to be incorporated into the semantic network.
3.2 Refinement
In this step, the knowledge is captured from the documents that have been collected. This knowledge is represented in the form of a domain ontology and a semantic network. However, in the ontology building process, the computer needs a precise definition of formal knowledge, and the knowledge engineer's participation is also needed; in addition, perfecting a domain ontology is an iterative, evolutionary process of construction [14]. Based on the data and information relating to the regulation of customs and excise, the ontology of customs and excise regulations has been designed.
The objects in the domain knowledge of the regulation of customs and excise were identified and compiled into the classes and subclasses forming a taxonomy. Some classes that have been identified are:
1. Objects of customs and excise. This class is the main class, displayed as the initial class in building the class hierarchy. Objects of customs and excise indicates the domain of the regulation of customs and excise, so all classes will be defined in the domain under this class.
2. UU or Legislation. This class stores information about the content of regulations in the form of legislation. Legislation has the highest level within the hierarchy of rules in the finance ministry.
3. UU Details. This class contains the clauses of UU (Legislation).
4. PP or Government Regulations. This class stores information about government regulations issued by the President. These regulations hold the number two position in the hierarchy of regulations of the finance ministry, after legislation.
5. KMK or Minister of Finance Decisions. This class contains information about the rules made by the finance minister. A KMK is an administrative decision made by the finance minister.
6. PMK or Minister of Finance Regulations. This class contains information about the rules made by the finance minister. These regulations contain detailed implementation rules based on the rules above them, such as legislation, government regulations, presidential decrees, regulations of the president and decisions of the finance minister.
7. KepDirjen or Director General's Decisions. This class contains information about directorate general regulations. A KepDirjen is an administrative decision made by the director general of customs and excise.
8. PerDirjen or Director General's Regulations. This class contains information about directorate general regulations made by the director general of customs and excise.
9. SE or Circular Letters. This class contains information about the collection of circulars prepared by the director general of customs and excise. These circulars serve as guidance in the implementation of business processes that require additional rules detailing their implementation instructions.
10. Activities Diagram. This class contains the set of diagrams that summarize all the specific business processes elaborated within the entire body of regulations.
11. Users. This class contains information about the actors involved in the business processes within all regulations.
12. Area. This class contains two subclasses, namely customs and excise. This class is the location of the objects of customs and excise.
13. Topic. This class contains information on every topic that exists in all regulations related to specific objects of customs and excise.
Figure 1. Regulation Ontology Of Customs and Excise
Figure 2. Ontology of Excise
Figure 3. Ontology of Customs
After identifying the classes of the ontology, the relations and properties of these classes are defined. The classes that are constructed are represented as nodes using the Neo4j graph database software, whereas the relations are represented as relationship types and their properties as property keys.
In this step, we built a large semantic network. We used 13 types of nodes containing 242 nodes in total, with details as follows:
Table 1. 13 Types of Nodes
No. Type of node: Number of nodes
1. Area: 2
2. Objects of customs and excise: 10
3. UU: 29 law articles
4. Topics: 39
5. PP: 2
6. KMK: 10
7. PMK: 50
8. Users: 13
9. Per Dirjen: 25
10. SE: 16
11. Kep Dirjen: 2
12. UU details: 29
13. Activity Diagram: 41
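The per-type node counts in Table 1 can be recomputed from the populated database with a single aggregate query; a possible sketch, assuming each type in Table 1 is stored as a node label:

```cypher
// Count nodes per label; if the labels match the types in Table 1,
// the output corresponds to the table above.
MATCH (n)
RETURN labels(n) AS type, count(n) AS number_of_nodes
ORDER BY number_of_nodes DESC
```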
Furthermore, we used 22 types of relationships containing 548 relations in total, with details as follows:
Table 2. 22 Types of Relationship
No. Type of relation: Number of relations
1. Set_with: 41
2. Equipped_with: 3
3. Confirmed_by: 3
4. Addressed_to: 72
5. Amanded_by: 19
6. Completes_UU: 39
7. Completes_PMK: 46
8. Have_activity: 42
9. Have_detail_rules: 32
10. Have_provision: 19
11. Have_objects: 10
12. Have_clauses: 29
13. Have_topics: 38
14. Have_first_amandement: 25
15. Have_second_amandement: 11
16. Have_third_amandement: 5
17. Have_fourth_amandement: 2
18. Have_fifth_amandement: 1
19. Have_six_amandement: 1
20. Have_implementation_guidelines: 9
21. Detailing_UU: 59
22. Refer_to: 42
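Similarly, the relation counts in Table 2 can be checked against the database with one aggregate query over all relationship types; a sketch:

```cypher
// Count relationships per type; for the data in this study the
// per-type totals sum to the 548 relations reported above.
MATCH ()-[r]->()
RETURN type(r) AS relation, count(r) AS number_of_relations
ORDER BY number_of_relations DESC
```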
We split the semantic network into two parts, namely customs and excise. The result of the semantic network generated by Neo4j is shown in Figure 2 (ontology of excise), displaying 95 nodes and 227 relationships (completed with 130 additional relationships), whereas Figure 3 (ontology of customs) displays 155 nodes and 328 relationships (completed with 166 additional relationships).
3.3 Evaluation
This step verifies the semantic network that has been built, so query testing is done using Cypher. This test aims to see the nodes connected by each relation, such as:
1. Displaying the relationship “amanded_by” by using the query:
MATCH ()-[r:amanded_by]->() RETURN r
The search result for the relation “amanded_by” is as follows (Figure 4).
The result of the evaluation indicates several nodes, such as PMK (19), PerDirjen (3), SE (5) and UU (4), that are connected by “amanded_by” (19) and “completes_PMK” (3); the details are:
1. PMK No. 36/PMK.04/2005 “amanded_by” PMK No. 37/PMK.04/2005;
2. PMK No. 107/PMK.04/2009 “amanded_by” PMK No. 212/PMK.011/2011;
3. SE-11/BC/2010 “completes_PMK” PMK No. 190/PMK.04/2010 “amanded_by” PMK No. 167/PMK.011/2011; SE-11/BC/2010 “amanded_by” SE-27/BC/2010 “amanded_by” SE-19/BC/2011 “completes_PMK” PMK No. 167/PMK.011/2011; SE-27/BC/2010 “completes_PMK” PMK No. 167/PMK.011/2011;
4. PMK No. 09/PMK.04/2009 “amanded_by” PMK No. 159/PMK.04/2009; and
5. PMK No. 22/PMK.04/2006 “amanded_by” PMK No. 67/PMK.04/2006 “amanded_by”
PMK No. 64/PMK.04/2007 “amanded_by” PMK No. 177/PMK.04/2009 “amanded_by”
PMK No. 27/PMK.011/2011 “amanded_by” PMK No. 70/PMK.04/2012
6. Etc.
Figure 4. Search Result of amanded_by
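Chains of successive amendments, such as the one in item 5, can be retrieved in a single query with a variable-length pattern instead of following "amanded_by" one step at a time (a sketch; the node patterns carry no assumed labels):

```cypher
// Follow amanded_by edges transitively: start from regulations that are
// not themselves an amendment, and stop at ones never amended again
MATCH p = (first)-[:amanded_by*]->(last)
WHERE NOT ()-[:amanded_by]->(first)
  AND NOT (last)-[:amanded_by]->()
RETURN p
```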
2. Displaying the relationship "addressed_to" with the query:
MATCH ()-[r:addressed_to]->() RETURN r
The search result for the relation "addressed_to" is shown in Figure 5.
The evaluation shows several node types, namely Activity Diagram (41) and Users (13), connected by "addressed_to" (72); the details are:
1. Activity Diagrams connected with Importer:
- Diagram of Submission of Tariffs Application of Tobacco Importer Import Tariff
- Procedure of Payment, Receipt, Deposit of Import Duty, Administrative Penalty
and Interest Diagram
- Customs Registration Diagram
- Diagram Submission Application for Customs Tariff of Alcohol Importer
- Diagram Imports Military Supplies
- Diagram of Submission of Import Duty and / or Excise
- Import Moving Goods Diagram
- Request for Exemption of Sample Goods import Diagram
- Diagram of application for the Exemption of Import Duty on the Import of Goods
already exported
- Diagram of application for the Exemption of Import Duty on the Import of Human
Therapeutic Material
- Diagram of Importing Government-Financed Medicines
- Physical Examination Diagram
- Etc
2. Activity Diagrams connected with Manufacturer:
- Diagram of Submission of Tariffs for Tobacco Excise Tax;
- Custom Imported Ribbon Design Diagram
- Customs Ribbon Design Diagram
- Retail price pricing of Tobacco Products Diagram
- Design of Excise Band of Fiscal Year 2011 Diagram
- Excise repayment Diagram
- Customize Excise Ribbon Diagram
- Administration of Reception, Storage, Delivery and Refund of Excise Taxes
Diagram
- Diagram of Submission of Application of Tariffs for Tobacco Excise of Factory
Entrepreneurs
- The method of payment of state revenue Diagram
- Design of Excise Ribbon for Fiscal Year 2012 Diagram
- Diagram of Submission of Application of Excise Tax Rate of Factory Entrepreneur
- Pricing of Retail Selling of Ethyl Alcohol, MMEA and Concentrates containing
Ethyl Alcohol Diagram
- Etc
3. Etc.
Figure 5. Search Result of addressed_to
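The same relation can also be filtered for a single user, for example to list only the diagrams addressed to the Importer (a sketch; the Activity_Diagram and Users labels and the name property are assumptions about how the nodes are stored):

```cypher
// Activity diagrams addressed to one specific user
MATCH (d:Activity_Diagram)-[:addressed_to]->(u:Users)
WHERE u.name = 'Importer'
RETURN d.name AS diagram
```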
4. Conclusion
The ontology and semantic network of the regulations of the Minister of Finance concerning customs and excise have been successfully modeled. The resulting models include the nodes, relationships, and the properties owned by each node. In the application using the sample data, we found 11 node types containing 242 child nodes and 22 relation types containing 548 relations that connect all the nodes within 3305 ms. Mapping the relationships between nodes can be done using the Neo4j graph database software. The models have been evaluated using Cypher, the Neo4j query language. The system is now able to manage the rules well; in the future it can accommodate all relevant regulations, their changes, and their linkages with other types of regulations. This system will make it easier for users to search, organize, and track the history of changes and the relationships between the rules used by the organization.
Further development of this ontology is to map the entire semantic network of the regulations of the Ministry of Finance and to build a user interface using software such as Ruby, Popoto.js, or Java. This interface is expected to show the activity diagrams that have been authorized as government SOPs (standard operating procedures), making it easier for all users to search the regulations of customs and excise.