- The document discusses implementing a real-time fraud detection solution for online banking that analyzes transactions as they occur to identify potentially fraudulent activity.
- It proposes a system that collects data from transactions and user profiles, analyzes them for anomalies, makes decisions about transactions, and responds by flagging suspicious transactions or notifying incident handlers.
- Key components include rules engines to classify transactions based on multiple factors, complex event processing to handle high transaction volumes, and integrating the system with existing banking processes and authentication methods.
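To make the rules-engine component concrete, here is a minimal sketch of how such a transaction classification might look in code; the rule names, weights and thresholds are illustrative assumptions, not taken from the whitepaper.

```python
# Minimal sketch of a rules-engine style fraud check (illustrative only;
# rule names, weights and thresholds are assumptions, not from the whitepaper).
from dataclasses import dataclass

@dataclass
class Transaction:
    account_id: str
    amount: float
    country: str
    recent_tx_count: int  # transactions in the last hour for this account

# Each rule returns a risk-score contribution; zero means "no concern".
def high_amount(tx: Transaction) -> int:
    return 40 if tx.amount > 5_000 else 0

def unusual_country(tx: Transaction, home_country: str = "US") -> int:
    return 30 if tx.country != home_country else 0

def high_velocity(tx: Transaction) -> int:
    return 30 if tx.recent_tx_count > 10 else 0

def score_transaction(tx: Transaction) -> str:
    score = high_amount(tx) + unusual_country(tx) + high_velocity(tx)
    # Decision thresholds are illustrative assumptions.
    if score >= 70:
        return "block"   # respond: stop the transaction
    if score >= 40:
        return "flag"    # respond: notify incident handlers
    return "allow"

if __name__ == "__main__":
    tx = Transaction("acct-1", amount=7_200, country="RO", recent_tx_count=2)
    print(score_transaction(tx))  # -> "block" (high amount + unusual country)
```

A production system would evaluate rules like these inside a complex event processing engine over streaming transactions, rather than one call at a time.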
Modern IT Service Management Transformation - ITIL Indonesia by Eryk Budi Pratama
Presented at Online ITIL Indonesia Webinar #5.
Content:
> Setting up the context
> Understanding holistic IT Management point of view
> IT Service Management Transformation
> Key Performance Indicator (KPI)
> IT Service Catalogue
> IT Sourcing
> Agile Incident Management
RPA (Robotic Process Automation), POA (Process Oriented Architecture) And BPM... by Alan McSweeney
RPA (Robotic Process Automation) is an opportunity to add value by creating (partially or completely) automated meta processes that control one or more existing applications, automating the interactions with those applications and thus enabling the successful operation of the process.
RPA can reduce manual effort, reduce manual errors, improve quality and accuracy, and ensure consistency. RPA-based processes are always available, can respond to changes more quickly and are more scalable than manual processes. They capture process information for reporting, analysis and process improvement, and provide greater visibility and control.
Successful RPA is a pre-requisite to exploiting other technologies and approaches such as artificial intelligence.
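As a rough sketch of what an RPA "meta process" looks like in practice, the loop below drives a hypothetical legacy application's interface to re-key records; `LegacyApp` and its methods are invented stand-ins for whatever UI-automation or API client a real deployment would use.

```python
# Sketch of an RPA-style meta process: a bot that drives an existing
# application to automate a repetitive task. "LegacyApp" is a hypothetical
# stand-in for a real UI-automation or API client.
import csv
import io

class LegacyApp:
    """Pretend interface to an existing application the bot controls."""
    def open_invoice_form(self) -> None:
        print("form opened")
    def fill(self, field: str, value: str) -> None:
        print(f"  {field} = {value}")
    def submit(self) -> bool:
        print("submitted")
        return True

def run_bot(csv_text: str) -> int:
    """Re-key each CSV row into the legacy app; return the count processed.
    Consistency comes from the bot applying identical steps every time, and
    the log lines give the visibility and reporting mentioned above."""
    app, processed = LegacyApp(), 0
    for row in csv.DictReader(io.StringIO(csv_text)):
        app.open_invoice_form()
        for field, value in row.items():
            app.fill(field, value)
        if app.submit():
            processed += 1
    return processed

if __name__ == "__main__":
    data = "invoice_id,amount\nINV-1,120.50\nINV-2,89.00\n"
    print(run_bot(data), "invoices processed")
```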
POA (Process Oriented Architecture) is concerned with linking process areas to actual (desired) interactions – customer (external interacting party) service journeys through the organisation.
BPM (Business Process Management) is the disciplined approach to identify, design, execute, document, measure, monitor and control both automated and non-automated business processes to achieve consistent, targeted results aligned with an organisation’s strategic goals.
Increasing velocity of change means that informal, undocumented expertise makes reaction slow, exceptions are only known and understood locally – process architecture ensures knowledge is documented and change can happen quickly.
A change to digital operations means that internal processes are exposed – the potentially inefficient and manual processes must be made efficient and external interactions must be masked from the internal complexity.
Moving the organisation from one that is internally focussed around its siloed structures to one that is focussed on customer (external interacting party) straight-through interactions.
Automating existing processes requires a structured approach to process analysis.
A structured approach to designing new optimised processes is important to successful RPA implementation.
ICT Association Suriname Presentation On eGovernment 2012 by Cyril Soeri
This presentation was delivered to raise awareness of eGovernment and is mainly based on the World Bank's eGovernment Handbook for developing countries.
This is a general orientation for beginners who want to build a career in IT audit. It contains few technical terms and focuses more on counselling-style guidance and topics.
“The organizing logic for business processes and IT infrastructure reflecting the integration and standardization requirements of the firm’s operating model.” [1]
“A conceptual blueprint that defines the structure and operation of an organization. The intent of an enterprise architecture is to determine how an organization can most effectively achieve its current and future objectives.”[2]
Enterprise Architecture and Information Security by John Macasio
A thinking tool to ask about and describe the alignment requirements of business, information, technology and security, to improve and secure the management of process, data, application and infrastructure performance.
The strategic importance of Information Security for organisations is gaining momentum. The current surge in cyber threats is compelling organisations to invest in information security to protect their assets. Rushing to protect assets often results in excessive technology adoption without a valid strategic foundation. Enterprise Security Architecture is geared to address these issues, but is frequently misaligned with Enterprise Architecture. In this presentation we explore avenues for the adoption and enforcement of Security-By-Design in the Enterprise Architecture value-chain so as to position Risk, Security and IT as true business enablers.
This presentation was originally delivered in three parts at the SharePoint Evolutions conference in London on 15th April 2013. It was designed for a business audience - project leads and decision makers responsible for delivering intranet projects.
This presentation describes a systematic, repeatable and co-ordinated approach to agile solution architecture and design. It describes a set of practical steps and activities embedded within a framework that allows an agile method to be adopted and used for solution design and delivery. This approach ensures consistency in the assessment of solution design options and in subsequent solution design and solution delivery activities. This process leads to the rapid design and delivery of realistic and achievable solutions that meet real solution consumer needs. The approach provides for effective solution decision-making. It generates options and results quickly and consistently. Implementing a framework such as this creates a knowledgebase of previous solution design and delivery exercises, leading to an accumulated body of knowledge within the organisation.
Fortify Your Enterprise with IBM Smarter Counter-Fraud Solutions by Perficient, Inc.
Organizations lose an estimated five percent of annual revenues to fraud, totaling nearly $1 trillion in the U.S. alone. Cyber criminals are more organized and better equipped than ever, and continue to evolve their strategies in order to undermine even the strongest protections.
We continue to hear about major security breaches across all industries, but what is being done to fix the problem? There must be a tight interlock between risk, security, fraud and financial crimes management. Current solutions are proving inadequate as point solutions and a corporate silo mentality directly contribute to the risk of fraudulent activities going undetected.
Our webinar covered:
-How IBM’s Smarter Counter Fraud initiative can help public and private organizations prevent, identify and investigate fraudulent activities
-Real-world use cases including how one financial institution stopped $1M in fraud in the first week after implementing a counter-fraud solution
-Perficient’s multi-tiered approach to help guide successful business outcomes
It’s time to stop the bad guys with IBM Smarter Counter Fraud and Perficient – learn how now!
Security For Business: Are You And Your Customers Safe? by woodsy01
This presentation takes a look at issues affecting cyber-security. It also covers some of SHBO Technologies' capabilities for supporting and protecting clients.
As requested, these are the presentation notes for Securing Citizen Facing Applications. I hope these help with your IDM planning and implementation.
Project failure tends to be embedded in a project from the start. There is a spectrum of failures from complete collapse to a range of lesser failures associated with behind schedule and over budget. The reasons are all too well known. Yet the lessons from project failures are not being learned and the behaviours that give rise to failures continue to persist. Project failures will continue to occur until the reasons and behaviours are explicitly understood, acknowledged and addressed.
The reasons for project failure across project phases include:
Requirements
• Poor initial requirement definition
• Poor requirements validation
• Poor management of requirements
• Requirements not linked to business benefits
Solution Design
• Solution design not validated
• Solution design not linked to business needs
• Solution design too complex
• Solution design does not capture necessary complexity
• Solution design based on unproven technology
• Solution not implementable
• Underlying business processes not defined adequately
Estimation
• Errors due to limitations in estimating procedures
• Failure to understand and account for technical risks
• Deliberate underestimation/misrepresentation of costs
• Poor inflation estimates
• Top down pressure to reduce estimates
• Lack of valid independent cost estimates
Project Management
• Lack of program management expertise
• Mismanagement/human error
• Over optimism
• Schedule concurrency
• Program stretch outs to keep production lines open
• Lack of communication
• Poor management of change and scope creep
Development and Implementation
• Lack of competition when selecting suppliers, poor supplier selection process
• Poor supplier engagement
• Poor contract design
• Inconsistent contract management/administration procedures, too much or too little oversight
• Waste
• Excess profits by supplier, supplier overstaffed
• Supplier indirect costs unreasonable
• Inadequate resource allocation and prioritisation
• Organisation cannot handle change
Finance and Budgeting
• Business case incomplete
• Funding instabilities caused by trying to fund too many projects
• Funding instabilities caused by management decisions
• Inefficient production rates due to stretching out programmes
• Failure to fund for contingency
• Failure to fund projects at realistic cost
Forget Big Data. It's All About Smart Data by Alan McSweeney
This proposes an initial smart data framework and structure to allow the nuggets of value contained in the deluge of largely irrelevant and useless data to be isolated and extracted. It enables your organisation to ask the questions to understand where it should be in terms of its data state and profile and what it should do to achieve the desired skills level across the competency areas of the framework.
Every organisation operates within a data landscape with multiple sources of data relating to its activities that are acquired, transported, stored, processed, retained, analysed and managed. Interactions across the data landscape generate primary data. When you extend the range of possible interactions and business processes outside the organisation, you generate a lot more data.
Smart data means being:
• Smart in what data to collect, validate and transform
• Smart in how data is stored, managed, operated and used
• Smart in taking actions based on results of data analysis including organisation structures, roles, devolution and delegation of decision-making, processes and automation
• Smart in being realistic, pragmatic and even skeptical about what can be achieved and knowing what value can be derived and how to maximise value obtained
• Smart in defining an achievable, benefits-led strategy integrated with the needs of the business and in its implementation
• Smart in selecting the channels and interactions to include – smart data use cases
Smart data competency areas comprise a complete set of required skills and abilities to design, implement and operate an appropriate smart data programme.
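As a small illustration of the first two bullets above, the sketch below validates and transforms records at the point of ingestion; the field names and validation rules are assumptions made purely for illustration.

```python
# Sketch of "smart" collection: validate and transform records at the point
# of ingestion so only useful, well-formed data is kept. Field names and
# validation rules are illustrative assumptions.
from datetime import datetime, timezone

def validate(record: dict) -> bool:
    """Keep only well-formed records: numeric non-negative amount, non-empty id."""
    try:
        return float(record["amount"]) >= 0 and bool(record["customer_id"].strip())
    except (KeyError, TypeError, ValueError, AttributeError):
        return False

def transform(record: dict) -> dict:
    """Normalise the fields we decided are worth keeping."""
    return {
        "customer_id": record["customer_id"].strip().upper(),
        "amount": round(float(record["amount"]), 2),
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }

def ingest(raw_records: list[dict]) -> list[dict]:
    # Validate first, then transform what survives.
    return [transform(r) for r in raw_records if validate(r)]

if __name__ == "__main__":
    raw = [{"customer_id": " c42 ", "amount": "19.999"},
           {"customer_id": "", "amount": -5}]
    print(ingest(raw))  # only the first record survives, normalised
```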
Getting Good And Staying Good At (Out)Sourcing by Alan McSweeney
There is an increasing and continuing trend of organisations moving from in-house solution delivery to sourcing solutions externally. Organisations are divesting themselves of what they see as non-core functions. This is intended to improve operational efficiencies by using external suppliers’ perceived abilities to provide cost-effective, fit-for-purpose solutions quickly using the right technology. The responsibility and accountability for solution delivery and operation still lies with the acquiring organisation. An organisation’s outsourcing zone of opportunity represents a challenge both for suppliers and for the acquisition function. Learn lessons from the experience of others to define exactly what you want of your outsourcing arrangement.
The myths of requirements are that:
• Requirements gathered from business users through requirements gathering meetings and workshops define the scope and functionality of the solution
• Requirements gathering workshops at the start of a project are sufficient to understand business needs
• Requirements change
The reality is that what is gathered during requirements workshops, meetings, interviews, questionnaires and other activities is not a set of solution requirements but a set of business stakeholder requirements.
Stakeholder requirements must be translated into solution requirements, which in turn must be translated into a solution design. A solution is a Resolver, a Provider or an Enabler.
Good solution design requires solution ownership and technical leadership throughout the process.
Any solution is always greater than the sum of the gathered requirements. Requirements do not equal a solution.
Any solution also causes problems in terms of:
• Required organisational changes to implement and operate solution
• Additional operational overhead
• Cost to implement
The solution is the minimum set of components that works and that solves the problem at the minimum cost with minimum additional costs.
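One way to read that last statement operationally is as a coverage problem: choose the cheapest set of components that together satisfy every requirement. The greedy sketch below illustrates the idea; the component names, costs and requirement mappings are invented for illustration.

```python
# Sketch: choose a near-minimum-cost set of components covering all
# requirements (greedy weighted set cover). All data here is invented
# purely to illustrate the "minimum set that works" idea.

components = {            # component -> (cost, requirements it satisfies)
    "crm_module":    (50, {"track_customers", "log_contacts"}),
    "report_tool":   (30, {"monthly_report"}),
    "suite_bundle":  (70, {"track_customers", "log_contacts", "monthly_report"}),
}
required = {"track_customers", "log_contacts", "monthly_report"}

def choose_components(components, required):
    chosen, uncovered = [], set(required)
    while uncovered:
        # Pick the component with the best cost per newly covered requirement.
        name, (cost, covers) = min(
            ((n, c) for n, c in components.items() if c[1] & uncovered),
            key=lambda item: item[1][0] / len(item[1][1] & uncovered),
        )
        chosen.append(name)
        uncovered -= covers
    return chosen

print(choose_components(components, required))
# -> ['suite_bundle'] (total cost 70, beating crm_module + report_tool at 80)
```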
Enterprise Business Analysis Capability - Strategic Asset for Business Alignm... by Alan McSweeney
Introducing the concept of Enterprise Business Analysis as a strategic resource to achieve business and IT alignment. Alignment means being able to draw a straight line from business strategy through to delivered and operational solutions implemented in response to business needs. Business and IT alignment requires more than just relationship management – it requires actual engagement by IT with the needs of the business.
Don’t Mention The “A” Word – Trends In Continuing Business And IT Misalignment by Alan McSweeney
Despite years of emphasising the need for IT and business alignment, the disconnect between business and IT continues. IT focuses too much on pure technology. However, business expectations can be unrealistic, based in part on IT not explaining itself to the business. IT technology trends are not, in themselves, relevant to the business. The business is concerned with the results of investment in IT and sees technology as a means to an end, not as an end in itself. IT needs to structure itself so alignment pervades the entire IT function. IT must embed business alignment in the way it operates to ensure it remains relevant to the business. IT needs to mediate between the business and suppliers and technologies, acting as a lens focussing business needs on appropriate solutions. The gulf between business and IT seems to be getting wider. Failure to ensure this alignment may lead to the business bypassing IT and going straight to suppliers and service providers. Disintermediation of IT is central to the business plans of many internet-based service providers. Failure to systematise alignment will expose IT to the danger of becoming irrelevant.
Conway's Law, Cognitive Diversity, Organisation Transformation And Solution D... by Alan McSweeney
These topics may appear to be separate but are closely related to the need for an effective solution design process, approach and function.
Nearly 50 years ago, Dr Melvin Conway wrote a short and insightful article titled How Do Committees Invent? where he made a number of observations on the system and solution design process including “… organizations which design systems … are constrained to produce designs which are copies of the communication structures of these organizations.” which has become known as Conway’s Law. He identified organisation problems that lead to poor solution design.
Conway’s Law is a warning rather than a prediction. It provides an insight into the solution design problems that can occur if the solution design structures, processes and function are not optimised. What he describes does not have to happen but all too frequently does.
Cognitive Diversity has become a fashionable concept that is talked about more than implemented. It has been written about extensively by Dr Scott Page. The core concept is that “… a random group of intelligent problem solvers will outperform a group of the best problem solvers”.
The value of cognitive diversity to organisations is greatest in thinking areas such as the solution design function. Managing diverse teams can be difficult, and achieving cognitive diversity can be painful and challenging. Cognitive diversity is of less value in purely operational and transactional areas where there is a reduced need for problem-solving.
Cognitive diversity protects the organisation against factors such as Cognitive Bias, Strategic Misrepresentation, Planning Fallacy, Optimism Bias, Focalism and Groupthink and their consequences.
Cognitive diversity protects against the effects of Conway’s Law.
Many organisations are attempting to transform themselves in response to external changes and drivers. Organisation transformation is frequently concerned with a migration from product-orientation to services-orientation characterised by responsiveness, customer centricity, self-service and flexibility. Information technology underpins successful and effective organisation transformation.
This is especially true of initiatives such as digital transformation. Digital transformation involves designing and implementing solutions across a wide range of application and system areas.
Being good at solution design means that solutions are defined, designed and delivered in a reliable, stable and innovative way to ensure that cost, time, required functionality and quality are constantly optimised to meet the needs of the business.
Good solution design means:
• Being aware of all the options and selecting the most appropriate one subject to all constraints
• Avoiding all the conscious and unconscious biases that lead to bad solutions
Put simply, a cognitively diverse team designs better solutions.
Translating Big Raw Data Into Small Actionable Information by Alan McSweeney
Any approach to Big Data needs to be based rigorously on business value. Big Data exists across the organisation’s operating landscape and not just for customers. Such data presents the potential for significant value that can enhance the way organisations do business and interact with external parties. There is a need for a realistic and achievable approach to translating Big Raw Data into Small Actionable Information.
Big Data is intrinsically linked to digital operations and associated digital transformation.
So ignore the issues of scope, lack of definition, conflicts, differences and complexity, and focus on the identification, specification, development and implementation of approaches, strategies, processes, expertise, solutions, systems and data that can provide actionable information to achieve outcomes that produce business value.
The approach to generating real value needs to encompass:
1. Definition and understanding of the Big Raw Data landscape, including data sources, platforms, systems and applications, parties, journeys and interactions
2. Identification and selection of high potential value use cases for implementation for selected parties
3. Definition of IT strategies, facilities, tools, techniques and resources to reduce the volume of Big Raw Data to translate it into Small Actionable Information
4. System and application changes to actualise use cases
5. Understanding and appreciation of wider operational context – Campaign Management, Customer Relationship Management, Customer Experience Management, Customer Value Management
6. Implementation of underpinning data governance and data privacy protocols
7. Organisational and process changes to identify, implement and operate use cases
There are only a limited number of actionable insights available from Big Raw Data. There are only a limited number of actions the organisation can reasonably take. It is important not to swamp the organisation with lots of irrelevant pseudo insights. It is important to prioritise the actions recommended from the derived insights.
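A trivial sketch of that prioritisation step: score each candidate insight by expected value and actionability and surface only the top few, so the organisation is not swamped. The scoring fields and numbers are illustrative assumptions.

```python
# Sketch: reduce many candidate insights to a short, prioritised action list.
# The scoring fields (value, actionability) are illustrative assumptions.

insights = [
    {"action": "call top-10 churn-risk customers", "value": 9, "actionability": 8},
    {"action": "redesign entire website",          "value": 6, "actionability": 2},
    {"action": "fix broken signup email",          "value": 7, "actionability": 9},
    {"action": "monitor competitor pricing",       "value": 3, "actionability": 5},
]

def top_actions(insights, n=2):
    # Rank by a simple value-times-actionability score and keep the top n.
    ranked = sorted(insights,
                    key=lambda i: i["value"] * i["actionability"],
                    reverse=True)
    return [i["action"] for i in ranked[:n]]

print(top_actions(insights))
# -> ['call top-10 churn-risk customers', 'fix broken signup email']
```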
Exploiting Big Raw Data to generate business value requires resources. This means management commitment and sponsorship.
Stopping Analysis Paralysis And Decision Avoidance In Business Analysis And S... by Alan McSweeney
Analysis paralysis and decision avoidance occur all too frequently in the business and solution analysis and design process. They waste time and money. Analysis paralysis occurs when you cannot escape the analysis stage – you are always looking for more information and for perfection. Decision avoidance and evasion occur when there is a decision-making request/response loop with seemingly endless requests for more information – there are always requests for more details, additional options and more clarifications.
There are two possible loops:
1. Analysis Loop – where analysis never finishes. Analysts and designers do not want to let go – always looking for perfection and wanting to retain ownership.
2. Decision/Analysis Loop – where decision making is deferred because of requests for more analysis. Fear of decision-making is masked by endless requests for more information and options.
You cannot avoid analysis, but do not perform analysis in isolation, without a business and solution context.
The Conceptual Solution Architecture framework focusses on the core functional and system components of the solution. This enables effective decision-making on the available options, implementation time-frames, implementation approaches and likely budget requirements.
Effective analysis and solution design minimise the Solution Space while maximising the size of the Requirements Space encompassed within it.
You need to measure the progress of analysis and design and decision making to identify when progress is stalling.
The IT function needs to be a lens concentrating solution need onto solution options. It needs to successfully mediate between the business as the originator of a solution need and the solution provider, either internal or external or both. The IT function needs to be good at moving from analysis and option identification to an implementation decision quickly and effectively.
You need a systematic, structured and measurable approach to decision making. Decision making that follows a systematic approach is more productive and results in better decisions.
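One common systematic technique (a generic one, not necessarily the approach in the presentation) is a weighted decision matrix: score each option against weighted criteria and record the result, which forces a decision point and makes the reasoning auditable. A minimal sketch, with invented criteria, weights and scores:

```python
# Sketch: weighted decision matrix for choosing between solution options.
# Criteria, weights and scores are invented for illustration.

criteria = {"cost": 0.4, "fit": 0.4, "risk": 0.2}    # weights sum to 1
options = {                                          # scores out of 10
    "buy_package":   {"cost": 6, "fit": 7, "risk": 8},
    "build_custom":  {"cost": 4, "fit": 9, "risk": 5},
    "extend_legacy": {"cost": 8, "fit": 5, "risk": 6},
}

def decide(options, criteria):
    # Weighted total per option; rounding keeps the output readable.
    totals = {
        name: round(sum(scores[c] * w for c, w in criteria.items()), 2)
        for name, scores in options.items()
    }
    return max(totals, key=totals.get), totals

best, totals = decide(options, criteria)
print(totals)  # {'buy_package': 6.8, 'build_custom': 6.2, 'extend_legacy': 6.4}
print(best)    # 'buy_package'
```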
Introduction To Business Architecture – Part 1 by Alan McSweeney
This is the first of a proposed four part introduction to Business Architecture. It is intended to focus on activities associated with Business Architecture work and engagements.
Business change without a target business architecture and a plan is likely to result in a lack of success and even failure. An effective approach to business architecture and business architecture competency is required to address effectively the pressures on businesses to change. Business architecture connects business strategy to effective implementation and operation:
• Translates business strategic aims to implementations
• Defines the consequences and impacts of strategy
• Isolates focussed business outcomes
• Identifies the changes and deliverables that achieve business success
Enterprise Architecture without Solution Architecture and Business Architecture will not deliver on its potential. Business Architecture is an essential part of the continuum from theory to practice.
After unnecessary complexity has been reduced from the problem being solved, the scope of the solution to the problem is governed by the complexity of the problem. Complexity is needed to handle and process complexity. Systems acquire or accrete unnecessary complexity over time as originally unforeseen exceptions or changes are incorporated. It may be possible to reduce complexity by collapsing/compressing/combining/consolidating elements and by removing non-value-adding, duplicate, redundant activities. When unnecessary or accreted complexity in the problem being solved has been removed, you are left with necessary complexity that must be incorporated into the solution. Simple problems do not have complex solutions. Complex problems do not have simple solutions. The complexity factor of the proposed solution must match the complexity factor of the problem being resolved. Many system implementation and operational failures arise because of failure to understand and address the core complexity of the problem.
Digital Transformation And Enterprise Architecture by Alan McSweeney
Digital transformation – extending and exposing business processes outside the organisation by implementing a digital strategy (a statement about the organisation’s digital positioning, operating model, competitors and customer and collaborator needs and behaviour) through the delivery of digital solutions defined in a digital architecture (a future-state application, data and technology view to achieve digital operating status) – is potentially very complex.
Digital architecture does not exist in isolation, entirely separate from an organisation’s overall enterprise architecture. Digital architecture must exist within the wider enterprise architecture context.
Enterprise architecture provides the tools and the approaches to manage the complexity of digital transformation.
The management function that drives digital transformation needs to involve the enterprise architecture function in the design and implementation of digital strategy and organisation, process and policies and the creation of a digital architecture. Management must appreciate the technology focus and the benefits of an enterprise architecture approach.
The early involvement of enterprise architecture increases successes and reduces failures. Management must trust and involve enterprise architecture. The enterprise architecture function must accept and rise to the challenge and deliver. The enterprise architecture function must allow its value to be measured.
Structured Approach to Solution Architecture by Alan McSweeney
The role of solution architecture is to identify the answer to a business problem and a set of solution options and their components. There will be many potential solutions to a problem, with varying degrees of suitability to the underlying business need. Solution options are derived from a combination of Solution Architecture Dimensions/Views, which describe characteristics, features, qualities and requirements, and Solution Design Factors, Limitations And Boundaries, which delineate limitations. Use of a structured approach can assist with solution design and create consistency. The TOGAF approach to enterprise architecture can be adapted to perform some of the analysis and design for elements of Solution Architecture Dimensions/Views.
Competence in sourcing is a core skill of the IT function. The IT function is becoming largely a manager of suppliers and service providers across a wide range of products, solutions and services. IT mediates between the business and the supplier ecosystem, acting as a lens focussing business needs on appropriate suppliers. When products and services are outsourced, the risks of the suppliers and service providers are inherited by the acquiring organisation. Sourcing should not be a “fire and forget” activity. Effective supplier selection and ongoing assessment, validation and management are important skills for the IT function. The Service Organisation Controls audit approach can be adapted for use by the IT function to develop an approach to vendor governance.
Why is cyber security a disruption in the digital economy? by Mark Albala
As we enter the digital economy, companies will quickly realize that its differentiator is information, and information, being a valuable resource, is subject to theft, hacking, phishing and a host of other issues that compromise a company’s ability to participate in the digital economy. Cybersecurity misfires compromise the trust of buyers and partners necessary to participate in the digital economy. It is up to every company to ensure that the information shared with it is protected to the best of its ability, and to proactively notify the persons and organizations who entrust it with the information necessary to transact business (any personal identity information, including but not limited to addresses, credit card information, social security numbers, account information, credit information and medical records) of any potential compromise that could harm them through that information being used maliciously or shared with others.
The digital economy differs from other forms of commerce because in it, information is the lifeblood of digital commerce, passing through the hands of the many platforms involved in a digital event. Each of these platforms is an opportunity to wreak havoc on your well-intended but incomplete efforts to protect the information contained within the network you control. In the digital economy, it is not only the network you control that matters but also the platforms that touch the personal data entrusted to you as a means of enabling digital commerce, and several techniques have begun to emerge to protect personal information contained within your information domain and the domains of platforms participating in digital commerce.
Because the lifeblood of the digital economy is information, information hacked in the digital economy is akin to shrinkage in the legacy economy. Both directly attack your bottom line, whether by redirecting customers elsewhere because they do not trust your privacy program, by ransomware that makes your site or a partner platform’s site dangerous to use, or by some other challenge to your ability to participate in the digital economy. Shrinking your potential market share because of information safety and security challenges is a disruption, making cyber-security a disruptive activity, particularly if it is not dealt with swiftly.
If your cyber-security program is focused entirely on protecting the information housed within your four walls, you are exposed to problems whose source and entry point you will have difficulty identifying.
Banking and Modern Payments System Security Analysis by CSCJournals
Cyber-criminals have benefited from on-line banking (OB), despite the extensive research on financial cyber-security. To be better prepared for what the future might bring, we try to predict how hacking tools might evolve. We briefly survey the state-of-the-art tools developed by black-hat hackers and conclude that they could be automated dramatically. To demonstrate the feasibility of our predictions and prove that many two-factor authentication schemes can be bypassed, we have analyzed banking and modern payments system security.
In this research we will review different payment protocols and security methods that are used to run banking systems. We will survey some of the popular systems in use today, with a deeper focus on chips, cards, NFC, authentication, etc. In addition, we will also discuss the weaknesses in these systems that can compromise the customer's trust.
Ethical Hacking Interview Questions and Answers.pdf by ShivamSharma909
Ethical hacking is an exciting career opportunity for individuals with excellent problem-solving skills and a passion for information security. Ethical hackers are responsible for safeguarding the critical infrastructure of the organization. They organize penetration tests to identify vulnerabilities and help the organization take the necessary measures to prevent possible cyber-attacks. There has been an increased demand for ethical hackers in government agencies (military and intelligence agencies) and private organizations in recent times. Becoming an ethical hacker requires sound knowledge of networking and hacking systems.
https://www.infosectrain.com/blog/ethical-hacking-interview-questions-and-answers/
Cybersecurity in BFSI - Top Threats & Importance by manoharparakh
Cybersecurity was a major area of concern throughout 2022, and 2023 is set to witness a new wave of cyber-attacks using advanced technologies.
ETHICAL HACKING AND SOCIAL ENGINEERING
Topics Covered: Ethical Hacking Concepts and Scopes, Threats and Attack Vectors, Information Assurance, Threat Modelling, Enterprise Information Security Architecture, Vulnerability, Assessment and Penetration Testing, Types of Social Engineering, Insider Attack, Preventing Insider Threats, Social Engineering Targets and Defence Strategies
Organizations are increasingly looking to their Internal Auditors to provide independent assurance about cyber risks and the organization's ability to defend against cyber attacks. With information technology becoming an inherent critical success factor for every business and the emerging cyber threat landscape, every internal auditor needs to equip themselves with IT audit essentials and an understanding of cyber issues.
In part 12 of our Cyber Security Series you will learn about the current cyber risks and attack methods from Richard Cascarino, including:
Where are we now and Where are we going?
Current Cyberrisks
• Data Breach and Cloud Misconfigurations
• Insecure Application Programming Interfaces (APIs)
• The growing impact of AI and ML
• Malware Attack
• Single factor passwords
• Insider Threat
• Shadow IT Systems
• Crime, espionage and sabotage by rogue nation-states
• IoT
• CCPA and GDPR
• Cyber attacks on utilities and public infrastructure
• Shift in attack vectors
How to build a highly secure fin tech application by nimbleappgenie
Indeed, the FinTech industry is a sector where developing a successful mobile solution necessitates extraordinary measures to capture clients’ loyalty. The takeaway is that a good FinTech app is more than simply an excellent companion.
The data architecture of solutions is frequently not given the attention it deserves or needs. Too little attention is paid to designing and specifying the data architecture within individual solutions and their constituent components. This is due to the behaviours of both solution architects and data architects.
Solution architecture tends to concern itself with the functional, technology and software components of the solution.
Data architecture tends not to get involved with the data aspects of individual technology solutions, leaving a data architecture gap. Solution architecture, for its part, frequently omits the detail of the data aspects of solutions, leaving a solution data architecture gap. Together these gaps result in a data blind spot for the organisation.
Data architecture tends to concern itself with data only after individual solutions have been delivered. It needs to shift left into the domain of solutions and their data and engage more actively with the data dimensions of individual solutions. Data architecture can provide the lead in sealing these data gaps through a shift-left of its scope and activities, as well as by providing standards and common data tooling for solution data architecture.
The objective of data design for solutions is the same as that for overall solution design:
• To capture sufficient information to enable the solution design to be implemented
• To unambiguously define the data requirements of the solution and to confirm and agree those requirements with the target solution consumers
• To ensure that the implemented solution meets the requirements of the solution consumers and that no deviations have taken place during the solution implementation journey
Solution data architecture avoids problems with solution operation and use:
• Poor and inconsistent data quality
• Poor performance, throughput, response times and scalability
• Poorly designed data structures can lead to long data update times and therefore long response times, affecting solution usability and causing loss of productivity and transaction abandonment
• Poor reporting and analysis
• Poor data integration
• Poor solution serviceability and maintainability
• Manual workarounds for data integration, data extract for reporting and analysis
Data-design-related solution problems frequently become evident and manifest themselves only after the solution goes live. The benefits of solution data architecture are not always evident initially.
Solution Architecture and Solution Estimation.pdf by Alan McSweeney
Solution architects and the solution architecture function are ideally placed to create solution delivery estimates.
Solution architects have the knowledge and understanding of a solution's constituent components and structure that is needed to create solution estimates:
• Knowledge of solution options
• Knowledge of solution component structure to define a solution breakdown structure
• Knowledge of available components and the options for reuse
• Knowledge of specific solution delivery constraints and standards that both control and restrain solution options
Accurate solution delivery estimates are needed to understand the likely cost/resources/time/options needed to implement a new solution within the context of a range of solutions and solution options. These estimates are a key input to investment management and to making effective decisions on the portfolio of solutions to implement. They enable informed decision-making as part of IT investment management.
An estimate is not a single value. It is a range of values depending on a number of conditional factors such as level of knowledge, certainty, complexity and risk. The range will narrow as the level of knowledge increases and uncertainty decreases.
There is no easy or magic way to create solution estimates. You have to engage with the complexity of the solution and its components. The more effort that is expended the more accurate the results of the estimation process will be. But there is always a need to create estimates (reasonably) quickly so a balance is needed between effort and quality of results.
The notes describe a structured solution estimation process and an associated template. They also describe the wider context of solution estimates in terms of IT investment and value management and control.
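As a simple illustration of an estimate being a range rather than a single value, the classic three-point (PERT) technique combines optimistic, most-likely and pessimistic figures; this is a standard generic technique, not necessarily the template the notes describe.

```python
# Sketch: three-point (PERT) estimation, one generic way to express an
# estimate as a range that narrows as uncertainty decreases. A standard
# technique, not necessarily the template described in the notes.

def pert(optimistic: float, most_likely: float, pessimistic: float):
    mean = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return mean, std_dev

# Person-day estimates for one solution component (illustrative numbers).
mean, sd = pert(optimistic=10, most_likely=15, pessimistic=30)
print(f"expected: {mean:.1f} days, range ~ {mean - sd:.1f}-{mean + sd:.1f} days")
# -> expected: 16.7 days, range ~ 13.3-20.0 days
```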
Validating COVID-19 Mortality Data and Deaths for Ireland March 2020 – March ... by Alan McSweeney
This analysis seeks to validate published COVID-19 mortality statistics using mortality data derived from general mortality statistics, mortality estimated from population size and mortality rates, and death notice data.
Analysis of the Numbers of Catholic Clergy and Members of Religious in Irelan... by Alan McSweeney
This analysis looks at the changes in the numbers of priests and nuns in Ireland for the years 1926 to 2016. It combines data from a range of sources to show the decline in the numbers of priests and nuns and their increasing age profile.
This analysis consists of the following sections:
• Summary - this highlights some of the salient points in the analysis.
• Overview of Analysis - this describes the approach taken in this analysis.
• Context – this provides background information on the number of Catholics in Ireland as a context to this analysis.
• Analysis of Census Data 1926 – 2016 - this analyses occupation age profile data for priests and nuns. It also includes sample projections on the numbers of priests and nuns.
• Analysis of Catholic Religious Mortality 2014-2021 - this analyses death notice data from RIP.ie to show the numbers of priests and nuns that died in the years 2014 to 2021. It also looks at deaths of Irish priests and nuns outside Ireland and at the numbers of countries where Irish priests and nuns have worked.
• Analysis of Data on Catholic Clergy From Other Sources - this analyses data on priests and nuns from other sources.
• Notes on Data Sources and Data Processing - this lists the data sources used in this analysis.
IT Architecture’s Role In Solving Technical Debt.pdf by Alan McSweeney
Technical debt is an overworked term without an effective and commonly agreed understanding of what exactly it is, what causes it, what its consequences are, how to assess it and what to do about it.
Technical debt is the sum of additional direct and indirect implementation and operational costs incurred and risks and vulnerabilities created because of sub-optimal solution design and delivery decisions.
Technical debt is the sum of all the consequences of all the circumventions, budget reduction, time pressure, lack of knowledge, manual workarounds, short-cuts, avoidance, poor design and delivery quality and decisions to remove elements from solution scope and failure to provide foundational and backbone solution infrastructure.
Technical debt leads to a negative feedback cycle with short solution lifespan, earlier solution replacement and short-term tactical remedial actions.
All the disciplines within IT architecture have a role to play in promoting an understanding of and in the identification of how to resolve technical debt. IT architecture can provide the leadership in both remediating existing technical debt and preventing future debt.
Failing to take a complete view of the technical debt within the organisation means problems and risks remain unrecognised and unaddressed. The real scope of the problem is substantially underestimated. Technical debt is always much more than poorly written software.
Technical debt can introduce security risks and vulnerabilities into the organisation’s solution landscape. Failure to address technical debt leaves exploitable security risks and vulnerabilities in place.
Shadow IT or ghost IT is a largely unrecognised source of technical debt including security risks and vulnerabilities. Shadow IT is the consequence of a set of reactions by business functions to an actual or perceived inability or unwillingness of the IT function to respond to business needs for IT solutions. Shadow IT is frequently needed to make up for gaps in core business solutions, supplementing incomplete solutions and providing omitted functionality.
Solution Architecture And Solution Security by Alan McSweeney
This describes an approach to embedding security within the technology solution landscape. It describes a security model that encompasses the range of individual solution components up to the entire solution landscape. The solution security model allows the security status of a solution and its constituent delivery and operational components to be tracked wherever those components are located. This provides an integrated approach to solution security across all solution components and across the entire organisation topology of solutions. It allows the solution architect to validate the security of an individual solution. It enables the security status of the entire solution landscape to be assessed and recorded. Solution security is a wicked problem because there is no certainty about when the problem has been resolved and a state of security has been achieved. The security state of a solution can only be expressed along a subjective spectrum of better or worse rather than as a binary true or false. Solution security can have negative consequences: it prevents types of access, limits availability in different ways, restricts the functionality provided, makes the solution harder to use, lengthens solution delivery times, increases costs along the entire solution lifecycle, and leads to loss of usability, utility and rate of use.
Data Privatisation, Data Anonymisation, Data Pseudonymisation and Differentia... by Alan McSweeney
This paper describes how technologies such as data pseudonymisation and differential privacy enable access to sensitive data and unlock data opportunities and value while ensuring compliance with data privacy legislation and regulations.
Data Privatisation, Data Anonymisation, Data Pseudonymisation and Differentia... by Alan McSweeney
Your data has value to your organisation and to relevant data sharing partners. It has been expensively obtained. It represents a valuable asset on which a return must be generated. To achieve the value inherent in the data you need to be able to make it appropriately available to others, both within and outside the organisation.
Organisations are frequently data rich and information poor, lacking the skills, experience and resources to convert raw data into value.
These notes outline technology approaches to achieving compliance with data privacy regulations and legislation while providing access to data.
There are different routes to making data accessible and shareable within and outside the organisation without compromising compliance with data protection legislation and regulations, while removing the risk associated with allowing access to personal data (a sketch of two of these routes follows the list):
• Differential Privacy – source data is summarised and individual personal references are removed. The one-to-one correspondence between original and transformed data has been removed
• Anonymisation – identifying data is destroyed and cannot be recovered so individual cannot be identified. There is still a one-to-one correspondence between original and transformed data
• Pseudonymisation – identifying data is encrypted and recovery data/token is stored securely elsewhere. There is still a one-to-one correspondence between original and transformed data
These technologies and approaches are not mutually exclusive – each is appropriate to differing data sharing and data access use cases
The data privacy regulatory and legislative landscape is complex and getting even more complex so an approach to data access and sharing that embeds compliance as a matter of course is required.
Appropriate technology appropriately implemented and operated is a means of managing and reducing risks of re-identification by making the time, skills, resources and money necessary to achieve this unrealistic.
Technology is part of a risk management approach to data privacy. There is wider operational data sharing and data privacy framework that includes technology aspects, among other key areas. Using these technologies will embed such compliance by design into your data sharing and access facilities. This will allow you to realise value from your data successfully.
Solution architects must be aware of the need for solution security and of the need to have enterprise-level controls that solutions can adopt.
The sets of components that comprise the extended solution landscape, including those components that provide common or shared functionality, are located in different zones, each with different security characteristics.
The functional and operational design of any solution and therefore its security will include many of these components, including those inherited by the solution or common components used by the solution.
The complete solution security view should refer explicitly to the components and their controls.
While each individual solution should be able to inherit the security controls provided by these components, the solution design should include explicit reference to them for completeness and to avoid unvalidated assumptions.
There is a common and generalised set of components, many of which are shared, within the wider solution topology that should be considered when assessing overall solution architecture and solution security.
Individual solutions must be able to inherit security controls, facilities and standards from common enterprise-level controls, standards, toolsets and frameworks.
Individual solutions must not be forced to implement individual infrastructural security facilities and controls. This is wasteful of solution implementation resources, results in multiple non-standard approaches to security and represents a security risk to the organisation.
The extended solution landscape potentially consists of a large number of interacting components and entities located in different zones, each with different security profiles, requirements and concerns. Different security concerns and therefore controls apply to each of these components.
Solution security is not covered by a single control. It involves multiple overlapping sets of controls providing layers of security.
Solution Architecture And (Robotic) Process Automation SolutionsAlan McSweeney
Automation is a technology trend IT architects should be aware of and know how to respond to business requests as well as recommend automation technologies and solutions where appropriate. Automation is a bigger topic than just RPA (Robotic Process Automation).
Automation solutions, like all other technology solutions, should be subject to an architecture and design process. There are many approaches to and options for the automation of business activities. Too often automation solutions are tactical applications layered over existing business systems
The objective of all IT solutions is to automate manual business processes and their activities to a certain extent. The requirement for RPA-type applications arises in part because of automation failures within existing applications or the need to automate the interactions with or integrations between separate, possibly legacy, applications.
One of the roles of IT architecture is to always seek to take the wider architectural view and to ensure that solutions are designed and delivered within a strategic framework to avoid, as much as is practical and realistic, short-term tactical solutions and approaches that lead to an accumulation of design, operations and support debt. Tactical solutions will always play a part in the organisation’s solution landscape.
The objective of these notes is to put automation into its wider and larger IT architecture context while accepting the need for tactical approaches in some instances.
These notes cover the following topics:
• Solution And Process Automation – The Wider Technology And Approach Landscape
• Business Processes, Business Solutions And Automation
• Organisation Process Model
• Strategic And Tactical Automation
• Deciding On The Scope Of Automation
• Digital Strategy, Digital Transformation And Automation
• Specifying The Automation Solution
• Business Process Model and Notation (BPMN)
• Sample Business Process – Order To Cash
• RPA (Robotic Process Automation)
Data Profiling, Data Catalogs and Metadata HarmonisationAlan McSweeney
These notes discuss the related topics of Data Profiling, Data Catalogs and Metadata Harmonisation. It describes a detailed structure for data profiling activities. It identifies various open source and commercial tools and data profiling algorithms. Data profiling is a necessary pre-requisite activity in order to construct a data catalog. A data catalog makes an organisation’s data more discoverable. The data collected during data profiling forms the metadata contained in the data catalog. This assists with ensuring data quality. It is also a necessary activity for Master Data Management initiatives. These notes describe a metadata structure and provide details on metadata standards and sources.
Comparison of COVID-19 Mortality Data and Deaths for Ireland March 2020 – Mar...Alan McSweeney
This document compares published COVID-19 mortality statistics for Ireland with publicly available mortality data extracted from informal public data sources. This mortality data is taken from published death notices on the web site www.rip.ie. This is used a substitute for poor quality and long-delayed officially published mortality statistics.
Death notice information on the web site www.rip.ie is available immediately and contains information at a greater level of detail than published statistics. There is a substantial lag in officially published mortality data and the level of detail is very low. However, the extraction of death notice data and its conversion into a usable and accurate format requires a great deal of processing.
The objective of this analysis is to assess the accuracy of published COVID-19 mortality statistics by comparing trends in mortality over the years 2014 to 2020 with both numbers of deaths recorded from 2020 to 2021 and the COVID-19 statistics. It compares number of deaths for the seven 13-month intervals:
1. Mar 2014 - Mar 2015
2. Mar 2015 - Mar 2016
3. Mar 2016 - Mar 2017
4. Mar 2017 - Mar 2018
5. Mar 2018 - Mar 2019
6. Mar 2019 - Mar 2020
7. Mar 2020 - Mar 2021
It focuses on the seventh interval which is when COVID-19 deaths have occurred. It combines an analysis of mortality trends with details on COVID-19 deaths. This is a fairly simplistic analysis that looks to cross-check COVID-19 death statistics using data from other sources.
The subject of what constitutes a death from COVID-19 is controversial. This analysis is not concerned with addressing this controversy. It is concerned with comparing mortality data from a number of sources to identify potential discrepancies. It may be the case that while the total apparent excess number of deaths over an interval is less than the published number of COVID-19 deaths, the consequence of COVID-19 is to accelerate deaths that might have occurred later in the measurement interval.
Accurate data is needed to make informed decisions. Clearly there are issues with Irish COVID-19 mortality data. Accurate data is also needed to ensure public confidence in decision-making. Where this published data is inaccurate, this can lead of a loss of this confidence that can exploited.
Analysis of Decentralised, Distributed Decision-Making For Optimising Domesti...Alan McSweeney
This analysis looks at the potential impact that large numbers of electric vehicles could have on electricity demand, electricity generation capacity and on the electricity transmission and distribution grid in Ireland. It combines data from a number of sources – electricity usage patterns, vehicle usage patterns, electric vehicle current and possible future market share – to assess the potential impact of electric vehicles.
It then analyses a possible approach to electric vehicle charging where the domestic charging unit has some degree of decentralised intelligence and decision-making capability in deciding when to start vehicle charging to minimise electricity usage impact and optimise electricity generation usage.
The potential problem to be addressed is that if large numbers of electric cars are plugged-in and charging starts immediately when the drivers of those cars arrive home, the impact on demand for electricity will be substantial.
Operational Risk Management Data Validation ArchitectureAlan McSweeney
This describes a structured approach to validating data used to construct and use an operational risk model. It details an integrated approach to operational risk data involving three components:
1. Using the Open Group FAIR (Factor Analysis of Information Risk) risk taxonomy to create a risk data model that reflects the required data needed to assess operational risk
2. Using the DMBOK model to define a risk data capability framework to assess the quality and accuracy of risk data
3. Applying standard fault analysis approaches - Fault Tree Analysis (FTA) and Failure Mode and Effect Analysis (FMEA) - to the risk data capability framework to understand the possible causes of risk data failures within the risk model definition, operation and use
Data Integration, Access, Flow, Exchange, Transfer, Load And Extract Architec...Alan McSweeney
These notes describe a generalised data integration architecture framework and set of capabilities.
With many organisations, data integration tends to have evolved over time with many solution-specific tactical approaches implemented. The consequence of this is that there is frequently a mixed, inconsistent data integration topography. Data integrations are often poorly understood, undocumented and difficult to support, maintain and enhance.
Data interoperability and solution interoperability are closely related – you cannot have effective solution interoperability without data interoperability.
Data integration has multiple meanings and multiple ways of being used such as:
- Integration in terms of handling data transfers, exchanges, requests for information using a variety of information movement technologies
- Integration in terms of migrating data from a source to a target system and/or loading data into a target system
- Integration in terms of aggregating data from multiple sources and creating one source, with possibly date and time dimensions added to the integrated data, for reporting and analytics
- Integration in terms of synchronising two data sources or regularly extracting data from one data sources to update a target
- Integration in terms of service orientation and API management to provide access to raw data or the results of processing
There are two aspects to data integration:
1. Operational Integration – allow data to move from one operational system and its data store to another
2. Analytic Integration – move data from operational systems and their data stores into a common structure for analysis
Ireland 2019 and 2020 Compared - Individual ChartsAlan McSweeney
This analysis compares some data areas - Economy, Crime, Aviation, Energy, Transport, Health, Mortality. Housing and Construction - for Ireland for the years 2019 and 2020, illustrating the changes that have occurred between the two years. It shows some of the impacts of COVID-19 and of actions taken in response to it, such as the various lockdowns and other restrictions.
The first lockdown clearly had major changes on many aspects of Irish society. The third lockdown which began at the end of the period analysed will have as great an impact as the first lockdown.
The consequences of the events and actions that have causes these impacts could be felt for some time into the future.
Analysis of Irish Mortality Using Public Data Sources 2014-2020Alan McSweeney
This describes the use of published death notices on the web site www.rip.ie as a substitute to officially published mortality statistics. This analysis uses data from RIP.ie for the years 2014 to 2020.
Death notice information is available immediately and contains information at a greater level of detail than published statistics. There is a substantial lag in officially published mortality data.
This analysis compares some data areas - Economy, Crime, Aviation, Energy, Transport, Health, Mortality. Housing and Construction - for Ireland for the years 2019 and 2020, illustrating the changes that have occurred between the two years. It shows some of the impacts of COVID-19 and of actions taken in response to it, such as the various lockdowns and other restrictions.
The first lockdown clearly had major changes on many aspects of Irish society. The third lockdown which began at the end of the period analysed will have as great an impact as the first lockdown.
The consequences of the events and actions that have causes these impacts could be felt for some time into the future.
Review of Information Technology Function Critical Capability ModelsAlan McSweeney
IT Function critical capabilities are key areas where the IT function needs to maintain significant levels of competence, skill and experience and practise in order to operate and deliver a service. There are several different IT capability frameworks. The objective of these notes is to assess the suitability and applicability of these frameworks. These models can be used to identify what is important for your IT function based on your current and desired/necessary activity profile.
Capabilities vary across organisation – not all capabilities have the same importance for all organisations. These frameworks do not readily accommodate variability in the relative importance of capabilities.
The assessment approach taken is to identify a generalised set of capabilities needed across the span of IT function operations, from strategy to operations and delivery. This generic model is then be used to assess individual frameworks to determine their scope and coverage and to identify gaps.
The generic IT function capability model proposed here consists of five groups or domains of major capabilities that can be organised across the span of the IT function:
1. Information Technology Strategy, Management and Governance
2. Technology and Platforms Standards Development and Management
3. Technology and Solution Consulting and Delivery
4. Operational Run The Business/Business as Usual/Service Provision
5. Change The Business/Development and Introduction of New Services
In the context of trends and initiatives such as outsourcing, transition to cloud services and greater platform-based offerings, should the IT function develop and enhance its meta-capabilities – the management of the delivery of capabilities? Is capability identification and delivery management the most important capability? Outsourced service delivery in all its forms is not a fire-and-forget activity. You can outsource the provision of any service except the management of the supply of that service.
The following IT capability models have been evaluated:
• IT4IT Reference Architecture https://www.opengroup.org/it4it contains 32 functional components
• European e-Competence Framework (ECF) http://www.ecompetences.eu/ contains 40 competencies
• ITIL V4 https://www.axelos.com/best-practice-solutions/itil has 34 management practices
• COBIT 2019 https://www.isaca.org/resources/cobit has 40 management and control processes
• APQC Process Classification Framework - https://www.apqc.org/process-performance-management/process-frameworks version 7.2.1 has 44 major IT management processes
• IT Capability Maturity Framework (IT-CMF) https://ivi.ie/critical-capabilities/ contains 37 critical capabilities
The following model has not been evaluated
• Skills Framework for the Information Age (SFIA) - http://www.sfia-online.org/ lists over 100 skills
Critical Review of Open Group IT4IT Reference ArchitectureAlan McSweeney
This reviews the Open Group’s IT4IT Reference Architecture (https://www.opengroup.org/it4it) with respect to other operational frameworks to determine its suitability and applicability to the IT operating function.
IT4IT is intended to be a reference architecture for the management of the IT function. It aims to take a value chain approach to create a model of the functions that IT performs and the services it provides to assist organisations in the identification of the activities that contribute to business competitiveness. It is intended to be an integrated framework for the management of IT that emphasises IT service lifecycles.
This paper reviews what is meant by a value-chain, with special reference to the Supply Chain Operations Reference (SCOR) model (https://www.apics.org/apics-for-business/frameworks/scor). the most widely used and most comprehensive such model.
The SCOR model is part of wider set of operations reference models that describe a view of the critical elements in a value chain:
• Product Life Cycle Operations Reference model (PLCOR) - Manages the activities for product innovation and product and portfolio management
• Customer Chain Operations Reference model (CCOR) - Manages the customer interaction processes
• Design Chain Operations Reference model (DCOR) - Manages the product and service development processes
• Managing for Supply Chain Performance (M4SC) - Translates business strategies into supply chain execution plans and policies
It also compares the IT4IT Reference Architecture and its 32 functional components to other frameworks that purport to identify the critical capabilities of the IT function:
• IT Capability Maturity Framework (IT-CMF) https://ivi.ie/critical-capabilities/ contains 37 critical capabilities
• Skills Framework for the Information Age (SFIA) - http://www.sfia-online.org/ lists over 100 skills
• European e-Competence Framework (ECF) http://www.ecompetences.eu/ contains 40 competencies
• ITIL IT Service Management https://www.axelos.com/best-practice-solutions/itil
• COBIT 2019 https://www.isaca.org/resources/cobit has 40 management and control processes
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor...Neo4j
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
GridMate - End to end testing is a critical piece to ensure quality and avoid...ThomasParaiso2
End to end testing is a critical piece to ensure quality and avoid regressions. In this session, we share our journey building an E2E testing pipeline for GridMate components (LWC and Aura) using Cypress, JSForce, FakerJS…
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Pushing the limits of ePRTC: 100ns holdover for 100 daysAdtran
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024Neo4j
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
The Art of the Pitch: WordPress Relationships and SalesLaura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if sometime changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
Essentials of Automations: The Art of Triggers and Actions in FMESafe Software
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdfPaige Cruz
Monitoring and observability aren’t traditionally found in software curriculums and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is a part of your current company’s observability stack.
While the dev and ops silo continues to crumble….many organizations still relegate monitoring & observability as the purview of ops, infra and SRE teams. This is a mistake - achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party will share these foundational concepts to build on:
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
A tale of scale & speed: How the US Navy is enabling software delivery from l...sonjaschweigert1
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATO’s (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
PHP Frameworks: I want to break free (IPC Berlin 2024)Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
SAP Sapphire 2024 - ASUG301 building better apps with SAP Fiori.pdfPeter Spielvogel
Building better applications for business users with SAP Fiori.
• What is SAP Fiori and why it matters to you
• How a better user experience drives measurable business benefits
• How to get started with SAP Fiori today
• How SAP Fiori elements accelerates application development
• How SAP Build Code includes SAP Fiori tools and other generative artificial intelligence capabilities
• How SAP Fiori paves the way for using AI in SAP apps
Communications Mining Series - Zero to Hero - Session 1DianaGray10
This session provides introduction to UiPath Communication Mining, importance and platform overview. You will acquire a good understand of the phases in Communication Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How can it help today’s business and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
Whitepaper: Real Time Transaction Analysis and Fraudulent Transaction Detection for Online Banking
Alan McSweeney
Contents
• Online Bank Fraud
• Real Time Fraud Detection Solution Architecture
  • Internet Banking Logical Transaction Layers
  • Real-Time Fraud Detection Solution Framework
  • Real-Time Fraud Detection Solution Architecture
  • Rules Engine and Decision Making Facility
• Complex Event Processing/Event Driven Application Architecture and Approaches to Fraud Analysis
• Implementing a Real-Time Fraud Detection System
The behaviour characteristics of online banking fraud are:
• Continuous behaviour changes by criminals
• Very high growth rates
• Sophisticated, advanced and changing fraud techniques
To effectively detect and stop fraud before it happens, banks will require insight into user activity in real time. This will be provided by a real-time online banking fraud detection and analysis solution.
There are many small software vendors operating in this area and the market is still quite fragmented. There will be consolidation as vendors merge, are taken over or go out of business.
There is an emerging technology in the form of Complex Event Processing (CEP) that is suitable for real-time online banking fraud detection.
As part of the implementation of any real-time online fraud solution, banks will need to implement new business processes to support the solution. This will be a key element of any overall solution.
A complete solution will consist of the following components:
• Continuing customer education
• Possible additional two-factor authentication for customers using some form of key generation tool
• Profiling customer access and maintaining an up-to-date list of fraud sources to determine if access originates from a known source of fraudulent activity
• Implementation of a real-time fraud detection and handling system or systems
• Checking transactions in real time
• Handling of suspicious transactions
• Processes to link all these elements together
Online Bank Fraud
This whitepaper provides an introduction to the end-to-end landscape of online banking fraud and its detection and handling. Online banking fraud can arise in a number of ways:
1. By some form of identity theft where the banking authentication details of legitimate users are stolen and used for criminal and fraudulent purposes, such as through phishing and crimeware attacks
2. By some form of security breach that allows criminals access to banking systems
3. By fraudulent activity by bank employees
4. By persons closely associated with legitimate users gaining access to their authentication details and performing fraud
The common thread in all this is people, who are the weakest link in any security system.
[Sidebar – The number of crimeware-spreading URLs infecting PCs with password-stealing code rose 93 percent in Q1 2008 to 6,500 sites, nearly double the previous high of November 2007 and an increase of 337 percent from the number detected at the end of Q1 2007. Source: Anti-Phishing Working Group, http://www.antiphishing.org, Q1 2008 Phishing Activity Trends Summary]
Of these sources of fraud, phishing in all its forms will be the one that gives rise to most concern. It will be the mechanism by which criminals get access to account information in order to defraud customers.
Phishing typically employs both a social engineering and a technical approach (crimeware) to steal consumers' personal identity data and financial account access details.
Crimeware is software that performs illegal actions not requested by the user running the software, typically intended to yield financial benefit to the distributor of the software.
Social-engineering schemes use spoofed e-mails purporting to be from legitimate sources to lead consumers to counterfeit websites designed to trick recipients into divulging financial account authentication data.
Essentially, crimeware is divided into two broad categories:
1. Social Engineering – this involves an e-mail with an address or an attachment that directs the user to the fraudulent site or infects the user's PC with criminal software
2. Security Exploits – these take advantage of flaws in software such as the user's operating system, browser or elements of the internet infrastructure to gain access to the bank's online banking site
Unfortunately, crimeware is a fact of life in the online world. Crimeware is distributed in many ways, such as:
• Social engineering attacks convincing users to open a malicious email attachment containing crimeware
• Injection of crimeware into legitimate web sites via content injection attacks such as cross-site scripting
• Exploiting security vulnerabilities through worms and other attacks on security flaws in operating systems, browsers and other commonly installed software
• Insertion of crimeware into downloadable software that otherwise performs a desirable function
[Chart – Number of Attacks. Source: Anti-Phishing Working Group, http://www.antiphishing.org, Q1 2008 Phishing Activity Trends Summary]
Any approach to preventing fraud needs to take account of these mechanisms and to ensure that the bank does not perform any actions that could be mistaken for or misused in these contexts, such as:
• Sending mails to customers that could then be confused with phishing mails
• Providing users with separate downloadable software to perform functions such as security checking and PC fingerprint generation
Real Time Fraud Detection Solution Architecture
Internet Banking Logical Transaction Layers
In terms of examining the options for real-time fraudulent transaction analysis and determining the architectures and solutions available, there are five relevant logical layers:
1. User Physical Access and Location – this layer consists of the device being used by the user to perform the access, its characteristics, its physical location and other user details such as mobile telephone and mail address
2. Internet Communication – this refers to the physical internet layer
3. User Authentication Layer – this layer consists of the authentication information users must supply and other authentication mechanisms, such as physical tokens, that users might use during the authentication process
4. Front-End Internet Banking Application – this is the suite of applications that form the Internet-accessible layer of the banking systems
5. Back-End Banking Systems and Data Warehouse and User History – this consists of the back-end banking systems and the data warehouse storing user access history
[Chart – Frequency and Cost of Attack by Type of Attack. Source: US National Consumer League, 2007]
Real-Time Fraud Detection Solution Framework
Implementing an effective mechanism for preventing Internet fraud will involve a multi-layer approach with multi-factor authentication and verification. It is important to understand that incidents will occur: any system involving people will at some stage be compromised.
Also, it may not be possible or worthwhile to implement a solution that is 100% secure. This may involve substantial incremental cost, over a solution that is close to 100% secure, that may not be justified.
An integral part of any fraud detection solution is an incident handling system and associated processes. At a minimum, and as sketched after this list, these should:
• Contain the damage
• Preserve/duplicate the compromised system's state for further analysis
• Contact the Police and the bank's legal department, if required
• Restore operations of the compromised system, if relevant
• Analyse the problem and determine the incident cause
• Document incident and recovery details
• Update control agents/implementation details based on analysis
• Update the incident response plan, if required
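As an illustration only, the minimum checklist above could be tracked programmatically. The following Python sketch is a hypothetical outline, not part of any vendor product; all step names and identifiers are invented.

```python
# Hypothetical sketch: tracking the minimum incident-handling steps above.
# All identifiers are illustrative, not a real incident-management API.
from dataclasses import dataclass, field
from datetime import datetime, timezone

INCIDENT_STEPS = [
    "contain_damage",
    "preserve_system_state",     # preserve/duplicate for further analysis
    "contact_police_and_legal",  # only if required
    "restore_operations",        # only if relevant
    "analyse_cause",
    "document_incident",
    "update_controls",           # control agents/implementation details
    "update_response_plan",      # only if required
]

@dataclass
class Incident:
    incident_id: str
    opened_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    completed: list = field(default_factory=list)

    def complete_step(self, step: str) -> None:
        if step not in INCIDENT_STEPS:
            raise ValueError(f"unknown incident step: {step}")
        self.completed.append(step)

    def outstanding(self) -> list:
        """Steps from the minimum checklist not yet carried out."""
        return [s for s in INCIDENT_STEPS if s not in self.completed]
```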
The illusion of 100% security can be dangerous: it can lead to complacent behaviour, become a substitute for sound practices and cause IT users to behave more recklessly. Note that security compliance covers an overall environment, including technology and processes, and not just a specific technology.
The elements of an overall solution can include some or all of the components shown below.
[Diagram – elements of an overall real-time fraud detection solution]
Real-Time Fraud Detection Solution Architecture
A real-time fraudulent transaction analysis and detection system will operate in parallel to the normal transaction pipeline.
The transaction pipeline will consist of the following steps:
1. The user will initiate the transaction using a device such as, but not limited to, a work or home PC
2. The user will use an internet connection to access the bank's internet banking system
3. The user will authenticate with the bank's internet banking system
4. The user will perform banking transactions
5. The data warehouse will be updated with information collected during the transaction
In parallel, the real-time fraudulent transaction analysis and detection system will operate. It should not insert itself into the transaction pipeline, as this will delay transaction processing as well as involve higher implementation costs due to the integration effort. Details of transactions should be taken in real time at two key points:
1. User access – to gather details on how the user is accessing the system
2. Transaction – to gather details on what transactions the user is performing
This real-time information is then compared with user access history and transaction history details to determine if the transaction is likely to be fraudulent.
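To make this non-intrusive design concrete, here is a minimal sketch in which an in-process queue stands in for whatever messaging infrastructure a bank would actually use. The banking application publishes events at the two capture points and never waits on the fraud system; all names are illustrative assumptions.

```python
# Illustrative sketch: the banking application emits access and transaction
# events to a queue without waiting, so fraud analysis runs in parallel and
# never delays the transaction pipeline itself.
import json
import queue
import threading

fraud_events: "queue.Queue[dict]" = queue.Queue()

def emit_event(event_type: str, payload: dict) -> None:
    """Fire-and-forget: enqueue the event and return immediately."""
    fraud_events.put({"type": event_type, **payload})

def process_transaction(user_id: str, session: dict, txn: dict) -> None:
    # Capture point 1: how the user is accessing the system
    emit_event("access", {"user": user_id, "ip": session["ip"],
                          "browser": session["browser"]})
    # ... normal authentication and banking transaction processing here ...
    # Capture point 2: what transaction the user is performing
    emit_event("transaction", {"user": user_id, "amount": txn["amount"],
                               "kind": txn["type"]})
    # The pipeline completes without waiting on fraud analysis.

def fraud_consumer() -> None:
    """Runs in parallel to the pipeline, draining events for analysis."""
    while True:
        event = fraud_events.get()
        # Here: compare against user access and transaction history
        print("analysing", json.dumps(event))

threading.Thread(target=fraud_consumer, daemon=True).start()
```

In a production system, durable messaging would replace the in-process queue so that captured events survive restarts.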
At a high level, the real-time fraudulent transaction analysis and detection system will consist of a core Collect-Analyse-Decide-Respond cycle. These stages will perform the following tasks:
• Collect – information on the transaction will be collected. This will consist of access information, session information and transaction details. The collection component will gather information from multiple sources at multiple stages, both through the transaction life cycle and offline from other sources such as watchlists of addresses involved in fraud.
• Analyse – the transaction information collected will be analysed both in itself and in comparison with historical information collected. Based on the two sets of data, the transaction will be scored with respect to the probability that it is fraudulent.
• Decide – a decision engine will determine if the transaction is fraudulent.
• Respond – based on the decision taken, a response action will be determined.
This process needs to happen in real time as transactions are happening. It needs to be scalable to handle large volumes of transactions without delaying overall transaction processing. A minimal sketch of the cycle follows.
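The following Python sketch wires the four stages together. The scoring logic, weights and threshold are placeholders invented for illustration, not a real fraud model.

```python
# Illustrative Collect-Analyse-Decide-Respond cycle; scores and thresholds
# are placeholders, not a production fraud model.
def collect(raw_event: dict, watchlist: set) -> dict:
    """Collect: access, session and transaction details plus offline sources."""
    return {**raw_event, "on_watchlist": raw_event.get("ip") in watchlist}

def analyse(event: dict, history: list) -> float:
    """Analyse: score the transaction against itself and the user's history."""
    score = 0.6 if event["on_watchlist"] else 0.0
    amounts = [h["amount"] for h in history] or [event["amount"]]
    if event["amount"] > 3 * (sum(amounts) / len(amounts)):
        score += 0.3  # unusually large relative to historical average
    return min(score, 1.0)

def decide(score: float, threshold: float = 0.5) -> bool:
    """Decide: classify the transaction from its fraud score."""
    return score >= threshold

def respond(event: dict, suspicious: bool) -> str:
    """Respond: choose the action for the decision taken."""
    return "suspend-and-notify" if suspicious else "process-normally"

# Example: a watchlisted IP pushes the score over the threshold.
event = collect({"ip": "203.0.113.9", "amount": 9500.0}, {"203.0.113.9"})
print(respond(event, decide(analyse(event, history=[]))))  # suspend-and-notify
```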
The real-time fraudulent transaction analysis and detection system will also provide additional functions:
• Reporting and Monitoring – the system should provide reporting and monitoring facilities to report on fraud analysis activities, system throughput, performance and other areas
• Offline Analysis – this will provide other non-real-time analysis facilities that allow patterns across multiple transactions to be identified
• Administration – the system can be administered and managed, allowing actions such as new rules to be defined and the operational system to be tuned and modified
Rules Engine and Decision Making Facility
This is a flexible rules engine that takes data from multiple sources to identify transactions as potentially fraudulent. The classification will be based on multiple factors, such as:
• Current Transaction Details – Transaction Amount, Transaction Type
• Transaction History Details – Transaction Frequency, Transaction Type Frequency, Account Activity
• User Profile – User Age, User Location, User Job
• Users Profiles – Users Ages, Users Locations, Users Jobs
• Session Details – IP Address, Browser Type
• Session History Details – IP Addresses, Browser Types
• Previously Known Sources of Fraud – IP Addresses Associated With Fraud
This information will be combined to assess the probability of the transaction being fraudulent:
• Current Transaction Details – this will provide a profile of the transaction being performed
• Transaction History Details – this will allow the current transaction to be compared against previous transactions
• User Profile – this will provide a profile of the user performing the transaction
• Users Profiles – this will provide a profile of all users, against which the current user's profile can be compared, and against which the current transaction can be compared with transactions performed by similar users
• Session Details – this will provide details on the internet access session
• Session History Details – this will allow the current session details to be compared against previous sessions so that changes can be identified
• Previously Known Sources of Fraud – this will allow the current session details to be compared against known access details associated with fraud
A sketch of how a rules engine might combine these factors follows.
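As a sketch of how such a rules engine might combine these factor groups, the fragment below scores one transaction. Every rule, weight and field name is an invented assumption for illustration only.

```python
# Illustrative rules-engine sketch: each rule inspects the factor groups
# listed above and contributes a weight; rules and weights are invented.
RULES = [
    ("IP previously associated with fraud",
     lambda d: d["session"]["ip"] in d["known_fraud_ips"], 0.6),
    ("Browser differs from session history",
     lambda d: d["session"]["browser"] not in d["session_history"]["browsers"], 0.2),
    ("Amount far above the user's historical average",
     lambda d: d["txn"]["amount"] > 5 * d["txn_history"]["avg_amount"], 0.3),
    ("Transaction type unusual for similar users",
     lambda d: d["txn"]["type"] not in d["peer_profile"]["common_types"], 0.2),
]

def score_transaction(data: dict) -> tuple:
    """Evaluate every rule against the combined factor data and sum weights."""
    fired = [(desc, weight) for desc, rule, weight in RULES if rule(data)]
    probability = min(sum(weight for _, weight in fired), 1.0)
    return probability, [desc for desc, _ in fired]
```

Keeping the rules as data rather than code paths is what makes the engine flexible: new rules can be added through administration facilities without changing the engine itself.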
Complex Event Processing/Event Driven Application Architecture and Approaches to Fraud Analysis
There is an emerging technology in the form of Complex Event Processing (CEP) that is suitable for real-time online banking fraud detection. The topic of CEP is itself very complex; this section provides some very brief information to support its inclusion as an option for implementing a real-time fraud analysis solution.
[Diagram – high-level architecture of a Complex Event Processing (CEP)/Event Driven Application (EDA) solution]
The core logical elements of this approach are:
• Continuous Query Engine – processes high volumes of streaming data
• SQL-based Event Processing Language (EPL) – extends SQL to handle streaming events
EPL is SQL-based, which provides easier integration with relational data and the data storage facility. The key extension within EPL is the ability to handle streaming data, provided by WHEN ... THEN statements rather than conventional IF ... THEN statements.
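EPL syntax varies by vendor, so rather than reproduce any product's dialect, here is the continuous-query idea sketched in Python: a standing rule evaluated whenever an event arrives, over a sliding time window, instead of a one-off IF check. The rule and window size are illustrative assumptions.

```python
# Illustrative continuous-query sketch: a standing rule is evaluated WHEN
# each event arrives, over a sliding window, mimicking EPL's streaming
# WHEN ... THEN semantics rather than one-off IF ... THEN checks.
from collections import deque
import time

WINDOW_SECONDS = 60
window: deque = deque()  # (arrival_time, event) pairs

def on_event(event: dict) -> None:
    """Evaluated WHEN each event arrives, like a standing EPL query."""
    now = time.time()
    window.append((now, event))
    while window and window[0][0] < now - WINDOW_SECONDS:
        window.popleft()  # expire events that fall outside the sliding window
    # Standing rule: more than 3 transactions from one IP inside the window
    same_ip = [e for _, e in window if e["ip"] == event["ip"]]
    if len(same_ip) > 3:
        print(f"ALERT: {len(same_ip)} transactions from {event['ip']} in "
              f"{WINDOW_SECONDS}s")
```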
[Chart – levels of spending by US banks on consumer authentication and fraud detection in 2006, classified by the value of their deposits. Source: Gartner]
A CEP application typically comprises four main component types:
1. Adapters interface directly to the inbound event sources. Adapters understand the inbound protocol and are responsible for converting the event data into a normalised form that can be queried by a processor (i.e. an event processing agent). Adapters forward the normalised event data into streams.
2. Streams are event processing endpoints. Among other things, streams are responsible for queuing event data until the event processing agent can act upon it.
3. The event processing agent removes the event data from the stream, processes it, and may generate new events to an output stream.
4. The Decide step listens to the output stream and forwards the generated events to external event sinks such as a case management system.
A minimal sketch of these four component types follows.
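The following sketch maps the four component types onto simple Python classes to show how they fit together. A real CEP engine supplies these as managed runtime components; the inbound protocol and standing rule here are toy assumptions.

```python
# Illustrative mapping of the four CEP component types onto plain classes.
import queue

class Stream:
    """Event processing endpoint: queues events until an agent consumes them."""
    def __init__(self) -> None:
        self._q: queue.Queue = queue.Queue()
    def put(self, event: dict) -> None:
        self._q.put(event)
    def get(self) -> dict:
        return self._q.get()

class Adapter:
    """Understands the inbound protocol; normalises events into a stream."""
    def __init__(self, out: Stream) -> None:
        self.out = out
    def on_raw(self, raw: str) -> None:
        user, amount = raw.split(",")  # toy inbound protocol: "user,amount"
        self.out.put({"user": user, "amount": float(amount)})

class ProcessingAgent:
    """Removes events from a stream, processes them, emits derived events."""
    def __init__(self, inbound: Stream, outbound: Stream) -> None:
        self.inbound, self.outbound = inbound, outbound
    def step(self) -> None:
        event = self.inbound.get()
        if event["amount"] > 10_000:  # toy standing rule
            self.outbound.put({"alert": "large-transaction", **event})

def decide_step(outbound: Stream) -> None:
    """Decide step: forward generated events to an external event sink."""
    print("forward to case management:", outbound.get())
```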
Implementing a Real-Time Fraud Detection System
Any practical approach to real-time anti-fraud will consist of the following activities:
• Continuing customer education
• Possible additional two-factor authentication for customers using some form of key generation tool
• Profiling customer access and maintaining an up-to-date list of fraud sources to determine if access originates from a known source of fraudulent activity
• Implementation of a real-time fraud detection and handling system or systems
• Checking transactions in real time
• Handling of suspicious transactions
• Processes to link all these elements together
Each of these will go some way to preventing fraud. Taken together they will form a comprehensive solution.
[Chart – planned increase in spending intentions in 2007 from 2006 by these banks. Source: Gartner]
In terms of the previous transaction pipeline, the additional steps required will be:
1. Before completing the transaction, the banking system would invoke a function to check the status of the transaction within the decision engine.
2. The checking function will interrogate the decision engine to get the result of the transaction check.
3. If the decision engine has reached a decision about the transaction, this would be provided to the application status check.
4. If the transaction was determined to be suspicious, it would be written to a suspend queue where it would be held according to defined rules.
5. If the transaction was determined not to be suspicious, it would be processed as normal.
6. The incident handling component would be notified.
A sketch of this status check follows.
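Here is a minimal sketch of these steps, assuming a hypothetical decision-engine interface (result_for) and an in-process suspend queue standing in for real infrastructure; the behaviour when no decision has yet been reached is a bank policy decision, not prescribed here.

```python
# Illustrative sketch of the status-check steps above; the decision-engine
# interface and queue are assumptions, not a real product API.
import queue

suspend_queue: queue.Queue = queue.Queue()

def check_transaction_status(txn: dict, decision_engine) -> str:
    """Invoked by the banking system before completing a transaction."""
    verdict = decision_engine.result_for(txn["id"])  # hypothetical interface
    if verdict is None:
        return "no-decision-yet"   # completion policy is a bank decision
    if verdict == "suspicious":
        suspend_queue.put(txn)     # hold according to defined rules
        notify_incident_handling(txn)
        return "suspended"
    return "process-normally"

def notify_incident_handling(txn: dict) -> None:
    print(f"incident handling notified for transaction {txn['id']}")
```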
For more information, please contact:
alan@alanmcsweeney.com