The document discusses software development, platforms, and modernization. It provides an overview of .NET and Java development platforms and compares their advantages and disadvantages. It also covers approaches to software design, delivery, and modernizing existing software. Key factors to consider include application lifecycles, existing systems, standards, and taking an integrated approach.
Learn About:
Newly added features such as Forefront Endpoint Protection and connectors to Intune and Azure
Support for platforms like iOS and Linux, including extended device management
Architectural layout for design considerations, including the CAS Server and the elimination of Native Mode
New ways to configure and deploy your software updates
System Center integration and automation with other System Center products, such as Service Manager and the Data Warehouse connector
The Business Case for Hosting JD Edwards in the Cloud (NERUG)
This presentation will cover in detail the business case for hosting JD Edwards in the cloud. Hear from industry expert, John Bassett, CTO at GSI, Inc. During the presentation, John will address the following topics, comparing hosting to a more traditional in-house approach:
- Total cost of ownership (TCO)
- IT staffing and support costs
- Licensing costs
- Cost predictability
- Security and compliance
- Scalability, performance and reliability
- Business continuity and redundancy
- System deployment
- System management, ongoing maintenance and upgrades
- Market adaptability, agility and innovation
Application Management Service Offerings (GSS America)
GSS America extends support for Application Development and Maintenance initiatives and helps customers focus on their strategic priorities. The cost-effective AMS solutions from GSS ensure that customers achieve significant savings on running their business operations, so that they can channel these savings towards new product development initiatives that contribute to additional revenue generation.
Provides an overview of various standards adopted by telecom ISPs/ISVs in the management and monetization space. These standards provide an open framework for OSS/BSS business processes to support interoperability and to avoid vendor lock-in.
Powering Virtualization, Applications, and Data Center Transformation with Co... (Dell World)
How can you streamline management and administration, provide services and applications fast, and empower your customers and users? Having to run IT more efficiently is a given; infrastructure operations and the data center must become simpler, easier to manage and less costly. And IT must be "cloud ready." Convergence is a path to that efficiency—and if that sounds like a mix of virtualization, cloud, utility, on-demand and shared computing services, you are right. The majority of "converged solutions" today are complex and rigid, resulting in more inflexible islands of technology rather than integrating with existing processes, investments and strategies. Organizations are demanding solutions designed with the key tenets of modularity and flexibility at their core. The ROI of convergence can be realized in many ways, but involves reevaluating your platform strategy to consider solutions that are not only optimized for virtualization density, but designed to work for you at any scale, from office to enterprise. Find out more: http://del.ly/DjC9Dj
Application Consolidation and Retirement (IBM Analytics)
Originally Published: Feb 04, 2015
Multiple, disconnected systems or an outdated application infrastructure can negatively impact your business and increase your costs. Consolidating applications, retiring outdated databases and modernizing systems can streamline your infrastructure and free resources to focus on important new projects.
Infogain’s Application Management Services increase the return on application investments and improve IT organization effectiveness freeing IT resources to focus on business priorities.
Eficaz is a DW suite that has evolved from an efficient ETL suite into a complete data warehouse suite, catering to the organization's requirement for a strong base for reporting and maintenance of data. All your key data can now be stored and made available for consolidated reporting, analysis and dashboards.
A Practical Guide to CMDB Deployment in a Tivoli Environment (Antonio Rolle)
This presentation focuses on the significance of the CMDB to your organization and offers practical guidelines for successful population of the CMDB utilizing the Tivoli Netcool suite of products. Specific products discussed include Precision for IP Networks, Tivoli Application Dependency Discovery Manager (TADDM), Tivoli Business Service Manager (TBSM) and Maximo.
21 Secrets of Self-Service IT Request Fulfillment (newScale)
This presentation was given at the itSMF Australia ‘Power On’ Conference in Sydney - 18 August 2009. For more information, contact newScale at www.newscale.com
Forget Big Data. It's All About Smart Data (Alan McSweeney)
This proposes an initial smart data framework and structure to allow the nuggets of value contained in the deluge of largely irrelevant and useless data to be isolated and extracted. It enables your organisation to ask the questions to understand where it should be in terms of its data state and profile and what it should do to achieve the desired skills level across the competency areas of the framework.
Every organisation operates within a data landscape with multiple sources of data relating to its activities that are acquired, transported, stored, processed, retained, analysed and managed. Interactions across the data landscape generate primary data. When you extend the range of possible interactions and business processes outside the organisation, you generate a lot more data.
Smart data means being:
• Smart in what data to collect, validate and transform
• Smart in how data is stored, managed, operated and used
• Smart in taking actions based on results of data analysis including organisation structures, roles, devolution and delegation of decision-making, processes and automation
• Smart in being realistic, pragmatic and even skeptical about what can be achieved and knowing what value can be derived and how to maximise value obtained
• Smart in defining an achievable, benefits-led strategy integrated with the needs of the business, and in its implementation
• Smart in selecting the channels and interactions to include – smart data use cases
Smart data competency areas comprise a complete set of required skills and abilities to design, implement and operate an appropriate smart data programme.
Comprehensive And Integrated Approach To Project Management And Solution Deli... (Alan McSweeney)
Describes a complete and integrated approach to solution delivery that encompasses project management, project portfolio management, business analysis and solution architecture and design
Effective solution delivery requires an integrated approach to projects across all key disciplines
Project portfolio management
Project management
Business analysis
Solution design
Having silos of expertise that do not communicate or co-operate leads to significant risk
Digital strategy is a statement about the organisation’s digital positioning, competitors and customer and collaborator needs and behaviour to achieve a direction for innovation, communication, transaction and promotion.
This describes facets of exploring the options for digital to ensure that the resulting strategy is realistic, achievable and will deliver a return.
Enterprise Architecture needs to be involved in the development of digital architecture. Digital architecture needs to be at the core of the organisation’s wider Enterprise Architecture.
Technology generally accelerates existing business momentum rather than being the originator of momentum. Digital is not a panacea. Digital interactions with third parties give rise to expectations.
Digital will make weaknesses in business processes and underlying technology very evident very quickly. Iterate through digital initiatives, starting small and focussed, learning from experience.
Notes on an ITO Appliance Approach to Productising and Industrialising IT Out... (Alan McSweeney)
Describes an approach to the development of an IT outsourcing reference architecture that enables rapid and repeatable take-on and delivery of IT outsourcing services consistently through productisation and industrialisation
This describes the concept of a Process Oriented Architecture. A Process Oriented Architecture is a way of linking process areas to actual (desired) interactions – customer (external interacting party) service journeys through the organisation. It allows two views of any process to be maintained and operated:
1. External view – that experienced by user
2. Internal view – that worked on by the organisational competency
An organisation will interact with multiple external parties. Each external party will have a number of interaction paths or journeys. These journeys are the routes of experience of external parties. These routes of experience need to be mapped (as) seamlessly (as possible) to internal organisational operational process competency groupings.
The interaction paths or journeys represent the Straight Through Processing that the customer (external party) wants to experience. The complexity of internal organisational operational process competency groupings needs to be masked from the customer (external party). Process Oriented Architecture is a key enabler of successful digital transformation.
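The two views described above can be sketched as a simple mapping from the steps of an external journey (the customer's view) to internal process competency groupings (the organisation's view). The journey steps and competency names below are invented purely for illustration:

```java
import java.util.List;
import java.util.Map;
import java.util.LinkedHashMap;

public class ProcessOrientedArchitecture {
    // External view: the journey steps as the customer experiences them (keys).
    // Internal view: the competency groupings each step is routed to (values).
    static Map<String, List<String>> orderJourney() {
        Map<String, List<String>> mapping = new LinkedHashMap<>();
        mapping.put("Place order", List.of("Order Capture", "Credit Check"));
        mapping.put("Track order", List.of("Fulfilment", "Logistics"));
        mapping.put("Receive invoice", List.of("Billing"));
        return mapping;
    }

    public static void main(String[] args) {
        // The customer sees only the journey steps; the internal complexity
        // behind each step is masked, giving straight-through processing.
        orderJourney().forEach((step, competencies) ->
            System.out.println(step + " -> " + competencies));
    }
}
```

The point of holding both views in one structure is that either side can change (a journey step is redesigned, or a competency grouping is reorganised) without breaking the other view.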
Integrated Project and Solution Delivery And Business Engagement Model (Alan McSweeney)
Projects are a continuum from initial concept to planning, design, implementation and management and operation of the implemented solution (and ultimate decommissioning) and across IT and business functions.
Therefore it is important to have an integrated project delivery approach that crosses these core dimensions.
This describes an integrated approach to solution delivery encompassing Stages - project stages/timeline, Activities - IT and business functions/ roles/ activities, Gates - project review and decision gates and Artefacts - project results and deliverables. This combines project management into all other aspects and activities of project and solution delivery:
• Business
• Business Analysis
• Solution Architecture
• Implementation and Delivery
• Test and Quality
• Organisation Readiness
• Service Management
• Infrastructure
It emphasises early business engagement and solution definition and validation to detail a solution that meets a clear and articulated business need and that will deliver a realisable and achievable set of business benefits. It ensures that the complexity of what has to be delivered is understood so there is a strong and solid foundation for solution implementation, delivery and management and operation.
Introduction To Business Architecture – Part 1 (Alan McSweeney)
This is the first of a proposed four part introduction to Business Architecture. It is intended to focus on activities associated with Business Architecture work and engagements.
Business change without a target business architecture and a plan is likely to result in a lack of success and even failure. An effective approach to business architecture and business architecture competency is required to address effectively the pressures on businesses to change. Business architecture connects business strategy to effective implementation and operation:
• Translates business strategic aims to implementations
• Defines the consequences and impacts of strategy
• Isolates focussed business outcomes
• Identifies the changes and deliverables that achieve business success
Enterprise Architecture without Solution Architecture and Business Architecture will not deliver on its potential. Business Architecture is an essential part of the continuum from theory to practice.
Review existing data management maturity models to identify a core set of characteristics of an effective data maturity model:
DMBOK (Data Management Body of Knowledge) from DAMA (Data Management Association)
MIKE2.0 (Method for an Integrated Knowledge Environment) Information Maturity Model (IMM)
IBM Data Governance Council Maturity Model
Enterprise Data Management Council Data Management Maturity Model
Digital Transformation And Enterprise Architecture (Alan McSweeney)
Digital transformation (extending and exposing business processes outside the organisation) is potentially very complex. It means implementing a digital strategy (a statement about the organisation's digital positioning, operating model, competitors, and customer and collaborator needs and behaviour) through the delivery of digital solutions defined in a digital architecture (a future-state application, data and technology view designed to achieve digital operating status).
Digital architecture does not exist in isolation, entirely separate from an organisation's overall enterprise architecture. Digital architecture must exist within the wider enterprise architecture context.
Enterprise architecture provides the tools and the approaches to manage the complexity of digital transformation.
The management function that drives digital transformation needs to involve the enterprise architecture function in the design and implementation of digital strategy and organisation, process and policies and the creation of a digital architecture. Management must appreciate the technology focus and the benefits of an enterprise architecture approach.
The early involvement of enterprise architecture increases successes and reduces failures. Management must trust and involve enterprise architecture. The enterprise architecture function must accept and rise to the challenge and deliver. The enterprise architecture function must allow its value to be measured.
Structured Approach to Solution Architecture (Alan McSweeney)
The role of solution architecture is to identify an answer to a business problem and a set of solution options and their components. There will be many potential solutions to a problem, with varying degrees of suitability to the underlying business need. Solution options are derived from a combination of Solution Architecture Dimensions/Views, which describe characteristics, features, qualities and requirements, and Solution Design Factors, Limitations And Boundaries, which delineate limitations. Use of a structured approach can assist with solution design and create consistency. The TOGAF approach to enterprise architecture can be adapted to perform some of the analysis and design for elements of Solution Architecture Dimensions/Views.
Software Modernization and Legacy Migration Primer (Probal DasGupta)
Software modernization is usually the remedy wherever software maintenance costs are high, business agility is low, integration is poor or interoperability is deficient - which are also the commonest problems affecting most companies. This document explains the Automated Software Modernization option based on OMG's Model Driven Architecture and Architecture Driven Modernization standards.
This presentation was given at the KM Singapore conference in Singapore on 15 Aug 09. I introduced a governance cycle and presented 4 key areas of governance: information organisation, publishing, collaboration and apps.
OCA Java SE 8 Exam Chapter 6 Exceptions (İbrahim Kürce)
A program can fail for just about any reason. Here are just a few possibilities:
The code tries to connect to a website, but the Internet connection is down.
You made a coding mistake and tried to access an invalid index in an array.
One method calls another with a value that the method doesn't support.
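The failure modes listed above map naturally onto Java's exception handling. A minimal sketch (the class and method names are illustrative, not taken from the exam guide): the second failure mode surfaces as an `ArrayIndexOutOfBoundsException`, and the third is conventionally signalled with an `IllegalArgumentException`.

```java
public class ExceptionsDemo {
    // Failure mode 2: accessing an invalid index in an array.
    // Catch the runtime exception and return a fallback value instead.
    static int safeGet(int[] values, int index, int fallback) {
        try {
            return values[index];
        } catch (ArrayIndexOutOfBoundsException e) {
            return fallback;
        }
    }

    // Failure mode 3: a method called with a value it doesn't support.
    // Signal the bad argument explicitly rather than returning NaN silently.
    static double sqrt(double x) {
        if (x < 0) {
            throw new IllegalArgumentException("negative input: " + x);
        }
        return Math.sqrt(x);
    }

    public static void main(String[] args) {
        int[] data = {1, 2, 3};
        System.out.println(safeGet(data, 5, -1)); // prints -1
        try {
            sqrt(-4.0);
        } catch (IllegalArgumentException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

The first failure mode (a downed network connection) would surface as a checked `java.io.IOException` from the networking APIs, which the compiler forces the caller to handle or declare.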
Service Oriented Architecture (SOA) is an architectural style for creating and using business processes, packaged as services, throughout their lifecycle. This short presentation looks at how SOA fits in the world of IBM System i (AS/400, iSeries, IBM i) and how using the LANSA toolset can set you on the right path.
I am submitting my resume for the position of Full Stack Java Developer. As a skilled and highly educated professional with 7+ years of experience testing Web-based applications, I am confident of my ability to make a significant contribution to your organization.
DISCLAIMER: The views are entirely that of the author of the presentation and ESS does not associate itself with the content whatsoever. ESS cannot be held liable in anyway for any claims arising out of the presentation or any repercussions from partial/complete implementation of any of the ideas presented.
The development world is changing at a dizzying pace, with Microsoft's Visual Studio .NET development environment leading the race: version chases version, and innovative tools and technologies emerge constantly. The developer community is in constant pursuit to keep pace with technological development and to apply it in the applications and products they build.
This poses an enormous challenge to development teams in business organizations:
On the one hand, they are under pressure to deliver the functional requirements, applications and products that are the core business of the organization; on the other hand, they must deliver those requirements in a particularly dynamic development environment that, as noted above, advances at an incredible rate.
Managers of development organizations face a complex dilemma: whether to allocate time and resources for learning new technology, examining how to implement it and assimilating it among development teams, all at the expense of development time for the company's applications and products. In most organizations the scales tip in favour of developing products and applications, leading to the erosion of knowledge and the non-adoption of new technologies and tools that could streamline and increase the productivity of the development team and improve the products and applications the organization develops.
This session surveys current trends in the development world and Microsoft's plans ahead, and shows how a technology organization can and should organize itself, both professionally and commercially, to maximize its resources and make the most of what technology has to offer.
Microsoft has won the war for ‘the hearts and minds’ of mid-market customers against the Java camp. Java has failed to gain traction outside of the big enterprise users because it is too heavy (read complex and expensive). Microsoft’s approach is less disruptive and therefore well suited for extending, not just replacing, existing systems.
So, how might a classic System i shop take advantage of the .NET Framework and the Windows platform? This presentation gives some examples of the potential intersections between a System i server (running core RPG or COBOL programs and a DB2 database) and various Microsoft products and technologies like ASP.NET, SharePoint, Office and CRM.
The data architecture of solutions is frequently not given the attention it deserves or needs. Frequently, too little attention is paid to designing and specifying the data architecture within individual solutions and their constituent components. This is due to the behaviours of both solution architects and data architects.
Solution architecture tends to concern itself with functional, technology and software components of the solution
Data architecture tends not to get involved with the data aspects of technology solutions, leaving a data architecture gap. Solution architecture, in turn, frequently omits the detail of the data aspects of solutions, leading to a solution data architecture gap. Together, these gaps result in a data blind spot for the organisation.
Data architecture tends to concern itself with post-individual solutions. Data architecture needs to shift left into the domain of solutions and their data and more actively engage with the data dimensions of individual solutions. Data architecture can provide the lead in sealing these data gaps through a shift-left of its scope and activities, as well as providing standards and common data tooling for solution data architecture.
The objective of data design for solutions is the same as that for overall solution design:
• To capture sufficient information to enable the solution design to be implemented
• To unambiguously define the data requirements of the solution and to confirm and agree those requirements with the target solution consumers
• To ensure that the implemented solution meets the requirements of the solution consumers and that no deviations have taken place during the solution implementation journey
Solution data architecture avoids problems with solution operation and use:
• Poor and inconsistent data quality
• Poor performance, throughput, response times and scalability
• Poorly designed data structures can lead to long data update times leading to long response times, affecting solution usability, loss of productivity and transaction abandonment
• Poor reporting and analysis
• Poor data integration
• Poor solution serviceability and maintainability
• Manual workarounds for data integration, data extract for reporting and analysis
Data-design-related solution problems frequently become evident and manifest themselves only after the solution goes live. The benefits of solution data architecture are not always evident initially.
Solution Architecture and Solution Estimation (Alan McSweeney)
Solution architects and the solution architecture function are ideally placed to create solution delivery estimates
Solution architects have the knowledge and understanding of the solution's constituent components and structure that is needed to create solution estimates:
• Knowledge of solution options
• Knowledge of solution component structure to define a solution breakdown structure
• Knowledge of available components and the options for reuse
• Knowledge of specific solution delivery constraints and standards that both control and restrain solution options
Accurate solution delivery estimates are needed to understand the likely cost/resources/time/options needed to implement a new solution within the context of a range of solutions and solution options. These estimates are a key input to investment management and making effective decisions on the portfolio of solutions to implement. They enable informed decision-making as part of IT investment management.
An estimate is not a single value. It is a range of values depending on a number of conditional factors such as level of knowledge, certainty, complexity and risk. The range will narrow as the level of knowledge increases and uncertainty decreases.
There is no easy or magic way to create solution estimates. You have to engage with the complexity of the solution and its components. The more effort that is expended the more accurate the results of the estimation process will be. But there is always a need to create estimates (reasonably) quickly so a balance is needed between effort and quality of results.
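One common way to express an estimate as a range rather than a single value is a three-point (PERT-style) estimate. The formula and weights below are a standard illustration of the idea, not part of the notes' own template; the figures in `main` are invented:

```java
public class SolutionEstimate {
    // Classic three-point (PERT) expected value: the most-likely case is
    // weighted four times as heavily as the optimistic and pessimistic cases.
    static double expected(double optimistic, double mostLikely, double pessimistic) {
        return (optimistic + 4 * mostLikely + pessimistic) / 6.0;
    }

    // A rough spread (standard deviation). As knowledge improves, the
    // optimistic and pessimistic values converge and the range narrows.
    static double spread(double optimistic, double pessimistic) {
        return (pessimistic - optimistic) / 6.0;
    }

    public static void main(String[] args) {
        // e.g. effort in person-days for one component of a solution
        // breakdown structure
        double e = expected(4, 6, 11);
        double s = spread(4, 11);
        System.out.printf("estimate: %.1f +/- %.2f person-days%n", e, s);
    }
}
```

Summing the expected values (and combining the spreads) across the components of a solution breakdown structure yields the range for the whole solution.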
The notes describe a structured solution estimation process and an associated template. They also describe the wider context of solution estimates in terms of IT investment and value management and control.
Validating COVID-19 Mortality Data and Deaths for Ireland March 2020 – March ... (Alan McSweeney)
This analysis seeks to validate published COVID-19 mortality statistics using mortality data derived from general mortality statistics, mortality estimated from population size and mortality rates and death notice data
Analysis of the Numbers of Catholic Clergy and Members of Religious in Irelan... (Alan McSweeney)
This analysis looks at the changes in the numbers of priests and nuns in Ireland for the years 1926 to 2016. It combines data from a range of sources to show the decline in the numbers of priests and nuns and their increasing age profile.
This analysis consists of the following sections:
• Summary - this highlights some of the salient points in the analysis.
• Overview of Analysis - this describes the approach taken in this analysis.
• Context – this provides background information on the number of Catholics in Ireland as a context to this analysis.
• Analysis of Census Data 1926 – 2016 - this analyses occupation age profile data for priests and nuns. It also includes sample projections on the numbers of priests and nuns.
• Analysis of Catholic Religious Mortality 2014-2021 - this analyses death notice data from RIP.ie to show the numbers of priests and nuns that have died in the years 2014 to 2021. It also looks at deaths of Irish priests and nuns outside Ireland and at the numbers of countries where Irish priests and nuns have worked.
• Analysis of Data on Catholic Clergy From Other Sources - this analyses data on priests and nuns from other sources.
• Notes on Data Sources and Data Processing - this lists the data sources used in this analysis.
IT Architecture’s Role In Solving Technical Debt (Alan McSweeney)
Technical debt is an overworked term, without an effective and commonly agreed understanding of what exactly it is, what causes it, what its consequences are, how to assess it and what to do about it.
Technical debt is the sum of additional direct and indirect implementation and operational costs incurred and risks and vulnerabilities created because of sub-optimal solution design and delivery decisions.
Technical debt is the sum of all the consequences of all the circumventions, budget reduction, time pressure, lack of knowledge, manual workarounds, short-cuts, avoidance, poor design and delivery quality and decisions to remove elements from solution scope and failure to provide foundational and backbone solution infrastructure.
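Since technical debt is defined above as a sum of consequences, a complete view of it can be sketched as an aggregation over named contributing causes. The cause names and cost figures below are purely illustrative; real figures would come from the organisation's own assessment:

```java
import java.util.Map;

public class TechnicalDebt {
    // Technical debt as the sum of direct and indirect costs attributed
    // to sub-optimal design and delivery decisions.
    static double total(Map<String, Double> costsByCause) {
        return costsByCause.values().stream()
                .mapToDouble(Double::doubleValue)
                .sum();
    }

    public static void main(String[] args) {
        // Illustrative annual cost of each contributing cause
        Map<String, Double> causes = Map.of(
            "Manual workarounds", 40_000.0,
            "Design short-cuts under time pressure", 25_000.0,
            "Descoped foundational infrastructure", 60_000.0);
        System.out.printf("estimated annual debt cost: %.0f%n", total(causes));
    }
}
```

Itemising the causes, rather than tracking a single number, is what lets the organisation see that the debt is much more than poorly written software.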
Technical debt leads to a negative feedback cycle with short solution lifespan, earlier solution replacement and short-term tactical remedial actions.
All the disciplines within IT architecture have a role to play in promoting an understanding of and in the identification of how to resolve technical debt. IT architecture can provide the leadership in both remediating existing technical debt and preventing future debt.
Failing to take a complete view of the technical debt within the organisation means problems and risks remain unrecognised and unaddressed. The real scope of the problem is substantially underestimated. Technical debt is always much more than poorly written software.
Technical debt can introduce security risks and vulnerabilities into the organisation’s solution landscape. Failure to address technical debt leaves exploitable security risks and vulnerabilities in place.
Shadow IT or ghost IT is a largely unrecognised source of technical debt including security risks and vulnerabilities. Shadow IT is the consequence of a set of reactions by business functions to an actual or perceived inability or unwillingness of the IT function to respond to business needs for IT solutions. Shadow IT is frequently needed to make up for gaps in core business solutions, supplementing incomplete solutions and providing omitted functionality.
Solution Architecture And Solution SecurityAlan McSweeney
This describes an approach to embedding security within the technology solution landscape. It describes a security model that encompasses the range of individual solution components up to the entire solution landscape. The solution security model allows the security status of a solution and its constituent delivery and operational components to be tracked wherever those components are located. This provides an integrated approach to solution security across all solution components and across the entire organisation topology of solutions. It allows the solution architect to validate the security of an individual solution. It enables the security status of the entire solution landscape to be assessed and recorded. Solution security is a wicked problem because there is no certainty about when the problem has been resolved and a state of security has been achieved. The security state of a solution can only be expressed along a subjective spectrum of better or worse rather than as a binary true or false. Solution security can also have negative consequences: it prevents types of access, limits availability in different ways, restricts the functionality provided, makes the solution harder to use, lengthens solution delivery times, increases costs along the entire solution lifecycle, and leads to loss of usability, utility and rate of use.
Data Privatisation, Data Anonymisation, Data Pseudonymisation and Differentia...Alan McSweeney
This paper describes how technologies such as data pseudonymisation and differential privacy technology enables access to sensitive data and unlocks data opportunities and value while ensuring compliance with data privacy legislation and regulations.
Data Privatisation, Data Anonymisation, Data Pseudonymisation and Differentia...Alan McSweeney
Your data has value to your organisation and to relevant data sharing partners. It has been expensively obtained. It represents a valuable asset on which a return must be generated. To achieve the value inherent in the data you need to be able to make it appropriately available to others, both within and outside the organisation.
Organisations are frequently data rich and information poor, lacking the skills, experience and resources to convert raw data into value.
These notes outline technology approaches to achieving compliance with data privacy regulations and legislation while providing access to data.
There are different routes to making data accessible and shareable within and outside the organisation without compromising compliance with data protection legislation and regulations and removing the risk associated with allowing access to personal data:
• Differential Privacy – source data is summarised and individual personal references are removed. The one-to-one correspondence between original and transformed data has been removed
• Anonymisation – identifying data is destroyed and cannot be recovered, so the individual cannot be identified. There is still a one-to-one correspondence between original and transformed data
• Pseudonymisation – identifying data is encrypted and recovery data/token is stored securely elsewhere. There is still a one-to-one correspondence between original and transformed data
These technologies and approaches are not mutually exclusive – each is appropriate to differing data sharing and data access use cases
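The distinction between the second and third routes can be sketched in a few lines of Python. This is a minimal illustration using invented field names and the standard library, not a production privacy mechanism: anonymisation keeps no mapping back to the individual, while pseudonymisation stores the recovery token securely elsewhere.

```python
import secrets

# Hypothetical record; the field names are illustrative assumptions.
record = {"name": "Mary Murphy", "county": "Cork", "diagnosis": "asthma"}

def anonymise(rec):
    """Destroy the identifier: no mapping is kept, so the link
    back to the individual cannot be recovered."""
    rec = dict(rec)
    del rec["name"]
    return rec

def pseudonymise(rec, vault):
    """Replace the identifier with a random token; the token-to-name
    mapping (the 'vault') is stored separately under stricter controls,
    so authorised re-identification remains possible."""
    rec = dict(rec)
    token = secrets.token_hex(8)
    vault[token] = rec["name"]      # recovery data kept securely elsewhere
    rec["name"] = token
    return rec

vault = {}
anon = anonymise(record)
pseud = pseudonymise(record, vault)

assert "name" not in anon                      # identity destroyed
assert pseud["name"] != "Mary Murphy"          # identity masked
assert vault[pseud["name"]] == "Mary Murphy"   # recoverable via the vault
```

In both cases each transformed record still corresponds to one original record; only pseudonymisation retains a controlled route back to the identity.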
The data privacy regulatory and legislative landscape is complex and getting even more complex, so an approach to data access and sharing that embeds compliance as a matter of course is required.
Appropriate technology appropriately implemented and operated is a means of managing and reducing risks of re-identification by making the time, skills, resources and money necessary to achieve this unrealistic.
Technology is part of a risk management approach to data privacy. There is a wider operational data sharing and data privacy framework that includes technology aspects, among other key areas. Using these technologies will embed such compliance by design into your data sharing and access facilities. This will allow you to realise value from your data successfully.
Solution architects must be aware of the need for solution security and of the need to have enterprise-level controls that solutions can adopt.
The sets of components that comprise the extended solution landscape, including those components that provide common or shared functionality, are located in different zones, each with different security characteristics.
The functional and operational design of any solution and therefore its security will include many of these components, including those inherited by the solution or common components used by the solution.
The complete solution security view should refer explicitly to the components and their controls.
While each individual solution should be able to inherit the security controls provided by these components, the solution design should include explicit reference to them for completeness and to avoid unvalidated assumptions.
There is a common and generalised set of components, many of which are shared, within the wider solution topology that should be considered when assessing overall solution architecture and solution security.
Individual solutions must be able to inherit security controls, facilities and standards from common enterprise-level controls, standards, toolsets and frameworks.
Individual solutions must not be forced to implement individual infrastructural security facilities and controls. This is wasteful of solution implementation resources, results in multiple non-standard approaches to security and represents a security risk to the organisation.
The extended solution landscape potentially consists of a large number of interacting components and entities located in different zones, each with different security profiles, requirements and concerns. Different security concerns and therefore controls apply to each of these components.
Solution security is not covered by a single control. It involves multiple overlapping sets of controls providing layers of security.
Solution Architecture And (Robotic) Process Automation SolutionsAlan McSweeney
Automation is a technology trend IT architects should be aware of: they need to know how to respond to business requests for automation and to recommend automation technologies and solutions where appropriate. Automation is a bigger topic than just RPA (Robotic Process Automation).
Automation solutions, like all other technology solutions, should be subject to an architecture and design process. There are many approaches to and options for the automation of business activities. Too often, automation solutions are tactical applications layered over existing business systems.
The objective of all IT solutions is to automate manual business processes and their activities to some extent. The requirement for RPA-type applications arises in part from automation failures within existing applications, or from the need to automate interactions with, or integrations between, separate and possibly legacy applications.
One of the roles of IT architecture is to always seek to take the wider architectural view and to ensure that solutions are designed and delivered within a strategic framework to avoid, as much as is practical and realistic, short-term tactical solutions and approaches that lead to an accumulation of design, operations and support debt. Tactical solutions will always play a part in the organisation’s solution landscape.
The objective of these notes is to put automation into its wider and larger IT architecture context while accepting the need for tactical approaches in some instances.
These notes cover the following topics:
• Solution And Process Automation – The Wider Technology And Approach Landscape
• Business Processes, Business Solutions And Automation
• Organisation Process Model
• Strategic And Tactical Automation
• Deciding On The Scope Of Automation
• Digital Strategy, Digital Transformation And Automation
• Specifying The Automation Solution
• Business Process Model and Notation (BPMN)
• Sample Business Process – Order To Cash
• RPA (Robotic Process Automation)
Data Profiling, Data Catalogs and Metadata HarmonisationAlan McSweeney
These notes discuss the related topics of Data Profiling, Data Catalogs and Metadata Harmonisation. They describe a detailed structure for data profiling activities and identify various open source and commercial tools and data profiling algorithms. Data profiling is a necessary prerequisite for constructing a data catalog. A data catalog makes an organisation's data more discoverable. The data collected during data profiling forms the metadata contained in the data catalog. This assists with ensuring data quality. It is also a necessary activity for Master Data Management initiatives. These notes describe a metadata structure and provide details on metadata standards and sources.
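The kind of column-level metadata that data profiling feeds into a catalog can be sketched in a few lines. This is a minimal illustration with invented rows and column names, computing two common profile measures: completeness (non-null rate) and cardinality (distinct values).

```python
from collections import defaultdict

# Illustrative rows; the column names are assumptions for the sketch.
rows = [
    {"id": 1, "country": "IE", "email": "a@example.com"},
    {"id": 2, "country": "IE", "email": None},
    {"id": 3, "country": "UK", "email": "c@example.com"},
]

def profile(rows):
    """Basic column profiling: completeness (share of non-null values)
    and cardinality (count of distinct non-null values) - the sort of
    metadata a data catalog records for each column."""
    stats = defaultdict(lambda: {"nulls": 0, "distinct": set()})
    for row in rows:
        for col, val in row.items():
            if val is None:
                stats[col]["nulls"] += 1
            else:
                stats[col]["distinct"].add(val)
    return {
        col: {
            "completeness": 1 - s["nulls"] / len(rows),
            "cardinality": len(s["distinct"]),
        }
        for col, s in stats.items()
    }

report = profile(rows)
# e.g. report["email"]["completeness"] == 2/3 and
#      report["country"]["cardinality"] == 2
```

Real profiling tools add many more measures (patterns, value distributions, candidate keys, cross-column dependencies), but they are built on the same per-column scan.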
Comparison of COVID-19 Mortality Data and Deaths for Ireland March 2020 – Mar...Alan McSweeney
This document compares published COVID-19 mortality statistics for Ireland with publicly available mortality data extracted from informal public data sources. This mortality data is taken from published death notices on the web site www.rip.ie. It is used as a substitute for poor-quality and long-delayed officially published mortality statistics.
Death notice information on the web site www.rip.ie is available immediately and contains information at a greater level of detail than published statistics. There is a substantial lag in officially published mortality data and the level of detail is very low. However, the extraction of death notice data and its conversion into a usable and accurate format requires a great deal of processing.
The objective of this analysis is to assess the accuracy of published COVID-19 mortality statistics by comparing trends in mortality over the years 2014 to 2020 with both the numbers of deaths recorded from 2020 to 2021 and the COVID-19 statistics. It compares the numbers of deaths for the seven 13-month intervals:
1. Mar 2014 - Mar 2015
2. Mar 2015 - Mar 2016
3. Mar 2016 - Mar 2017
4. Mar 2017 - Mar 2018
5. Mar 2018 - Mar 2019
6. Mar 2019 - Mar 2020
7. Mar 2020 - Mar 2021
It focuses on the seventh interval, which is when COVID-19 deaths occurred. It combines an analysis of mortality trends with details on COVID-19 deaths. This is a fairly simplistic analysis that looks to cross-check COVID-19 death statistics using data from other sources.
The subject of what constitutes a death from COVID-19 is controversial. This analysis is not concerned with addressing this controversy. It is concerned with comparing mortality data from a number of sources to identify potential discrepancies. It may be the case that while the total apparent excess number of deaths over an interval is less than the published number of COVID-19 deaths, the consequence of COVID-19 is to accelerate deaths that might have occurred later in the measurement interval.
Accurate data is needed to make informed decisions. Clearly there are issues with Irish COVID-19 mortality data. Accurate data is also needed to ensure public confidence in decision-making. Where this published data is inaccurate, it can lead to a loss of that confidence, which can then be exploited.
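The cross-check described above amounts to comparing excess deaths in the final interval against the published COVID-19 figure. A minimal sketch, using invented interval totals rather than the actual RIP.ie or official numbers:

```python
# Illustrative figures only - not the actual Irish mortality data.
baseline_intervals = [30500, 30900, 31200, 31400, 31800, 32100]  # the six pre-COVID intervals
observed = 34200                 # deaths recorded for Mar 2020 - Mar 2021
reported_covid_deaths = 4500     # hypothetical published COVID-19 figure

# Excess deaths: observed deaths minus the pre-COVID baseline
# (a simple mean here; a trend fit would be a refinement).
baseline = sum(baseline_intervals) / len(baseline_intervals)
excess = observed - baseline

# If excess deaths fall well short of the reported COVID-19 deaths,
# one explanation noted in the analysis is mortality displacement:
# deaths brought forward within the measurement interval.
discrepancy = reported_covid_deaths - excess
```

With these invented figures the excess is roughly 2,883 deaths against 4,500 reported, the kind of gap the analysis sets out to investigate.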
Analysis of Decentralised, Distributed Decision-Making For Optimising Domesti...Alan McSweeney
This analysis looks at the potential impact that large numbers of electric vehicles could have on electricity demand, electricity generation capacity and on the electricity transmission and distribution grid in Ireland. It combines data from a number of sources – electricity usage patterns, vehicle usage patterns, electric vehicle current and possible future market share – to assess the potential impact of electric vehicles.
It then analyses a possible approach to electric vehicle charging where the domestic charging unit has some degree of decentralised intelligence and decision-making capability in deciding when to start vehicle charging to minimise electricity usage impact and optimise electricity generation usage.
The potential problem to be addressed is that if large numbers of electric cars are plugged-in and charging starts immediately when the drivers of those cars arrive home, the impact on demand for electricity will be substantial.
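A minimal sketch of such decentralised decision-making, with invented hours and charge durations: rather than starting on plug-in, each charger independently picks a random start slot that still finishes by a morning deadline, spreading demand away from the evening peak.

```python
import random
from collections import Counter

def choose_start_hour(arrival_hour, needed_hours, deadline_hour=7, rng=random):
    """Decentralised charging logic: pick a uniformly random start hour
    between plug-in and the latest start that still completes the charge
    by the morning deadline. No central coordinator is needed."""
    latest_start = (deadline_hour - needed_hours) % 24
    window = (latest_start - arrival_hour) % 24   # feasible hours after arrival
    return (arrival_hour + rng.randrange(window + 1)) % 24

# 1,000 cars all plugging in at 18:00, each needing 4 hours of charge:
rng = random.Random(42)
starts = Counter(choose_start_hour(18, 4, rng=rng) for _ in range(1000))
# Demand is now spread over the 18:00-03:00 window instead of every
# charger drawing power simultaneously at 18:00.
```

Real schemes would weight the choice by tariff or grid signals, but even blind randomisation removes the synchronous demand spike that immediate charging would create.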
Operational Risk Management Data Validation ArchitectureAlan McSweeney
This describes a structured approach to validating data used to construct and use an operational risk model. It details an integrated approach to operational risk data involving three components:
1. Using the Open Group FAIR (Factor Analysis of Information Risk) risk taxonomy to create a risk data model that reflects the required data needed to assess operational risk
2. Using the DMBOK model to define a risk data capability framework to assess the quality and accuracy of risk data
3. Applying standard fault analysis approaches - Fault Tree Analysis (FTA) and Failure Mode and Effect Analysis (FMEA) - to the risk data capability framework to understand the possible causes of risk data failures within the risk model definition, operation and use
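The Fault Tree Analysis arithmetic in step 3 combines independent fault probabilities through OR and AND gates. A small sketch with invented probabilities for a hypothetical top event, "the risk model uses bad data":

```python
def or_gate(*probs):
    """Top event occurs if any input fault occurs (independent faults):
    P = 1 - product(1 - p_i)."""
    p = 1.0
    for q in probs:
        p *= (1 - q)
    return 1 - p

def and_gate(*probs):
    """Top event requires all input faults to occur: P = product(p_i)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical tree: bad data reaches the risk model if the source data
# is wrong, OR a faulty transformation slips past validation.
p_source_wrong = 0.02
p_transform_fault = 0.05
p_validation_miss = 0.10

p_top = or_gate(p_source_wrong, and_gate(p_transform_fault, p_validation_miss))
# p_top = 1 - (1 - 0.02) * (1 - 0.005) = 0.0249
```

Walking such a tree bottom-up identifies which data failures dominate the top-event probability, which is what directs remediation effort within the risk data capability framework.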
Data Integration, Access, Flow, Exchange, Transfer, Load And Extract Architec...Alan McSweeney
These notes describe a generalised data integration architecture framework and set of capabilities.
In many organisations, data integration has evolved over time through many solution-specific tactical approaches. The consequence is a frequently mixed, inconsistent data integration topography. Data integrations are often poorly understood, undocumented and difficult to support, maintain and enhance.
Data interoperability and solution interoperability are closely related – you cannot have effective solution interoperability without data interoperability.
Data integration has multiple meanings and multiple ways of being used such as:
- Integration in terms of handling data transfers, exchanges, requests for information using a variety of information movement technologies
- Integration in terms of migrating data from a source to a target system and/or loading data into a target system
- Integration in terms of aggregating data from multiple sources and creating one source, with possibly date and time dimensions added to the integrated data, for reporting and analytics
- Integration in terms of synchronising two data sources or regularly extracting data from one data source to update a target
- Integration in terms of service orientation and API management to provide access to raw data or the results of processing
There are two aspects to data integration:
1. Operational Integration – allow data to move from one operational system and its data store to another
2. Analytic Integration – move data from operational systems and their data stores into a common structure for analysis
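The second aspect can be sketched with in-memory SQLite stores standing in for an operational system and a warehouse; the table and column names are invented for the illustration. The key move is adding a date dimension while loading into a common structure for reporting.

```python
import sqlite3
from datetime import date

# Illustrative operational store (in-memory for the sketch).
orders = sqlite3.connect(":memory:")
orders.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL)")
orders.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                   [(1, "acme", 120.0), (2, "globex", 80.0)])

# Illustrative analytic store: same data plus a load-date dimension.
warehouse = sqlite3.connect(":memory:")
warehouse.execute(
    "CREATE TABLE fact_orders (id INTEGER, customer TEXT, amount REAL, load_date TEXT)")

# Analytic integration: extract from the operational store, enrich with
# the date dimension, and load into the common reporting structure.
load_date = date.today().isoformat()
for row in orders.execute("SELECT id, customer, amount FROM orders"):
    warehouse.execute("INSERT INTO fact_orders VALUES (?, ?, ?, ?)", (*row, load_date))

total = warehouse.execute("SELECT SUM(amount) FROM fact_orders").fetchone()[0]
```

Operational integration would instead push individual records between live systems (via messages, APIs or file transfers); the extract-enrich-load loop above is the analytic pattern in miniature.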
Ireland 2019 and 2020 Compared - Individual ChartsAlan McSweeney
This analysis compares some data areas - Economy, Crime, Aviation, Energy, Transport, Health, Mortality, Housing and Construction - for Ireland for the years 2019 and 2020, illustrating the changes that have occurred between the two years. It shows some of the impacts of COVID-19 and of actions taken in response to it, such as the various lockdowns and other restrictions.
The first lockdown clearly caused major changes to many aspects of Irish society. The third lockdown, which began at the end of the period analysed, will have as great an impact as the first.
The consequences of the events and actions that have caused these impacts could be felt for some time into the future.
Analysis of Irish Mortality Using Public Data Sources 2014-2020Alan McSweeney
This describes the use of published death notices on the web site www.rip.ie as a substitute for officially published mortality statistics. This analysis uses data from RIP.ie for the years 2014 to 2020.
Death notice information is available immediately and contains information at a greater level of detail than published statistics. There is a substantial lag in officially published mortality data.
Review of Information Technology Function Critical Capability ModelsAlan McSweeney
IT Function critical capabilities are key areas where the IT function needs to maintain significant levels of competence, skill, experience and practice in order to operate and deliver a service. There are several different IT capability frameworks. The objective of these notes is to assess the suitability and applicability of these frameworks. These models can be used to identify what is important for your IT function based on your current and desired/necessary activity profile.
Capabilities vary across organisations – not all capabilities have the same importance for all organisations. These frameworks do not readily accommodate variability in the relative importance of capabilities.
The assessment approach taken is to identify a generalised set of capabilities needed across the span of IT function operations, from strategy to operations and delivery. This generic model is then used to assess individual frameworks to determine their scope and coverage and to identify gaps.
The generic IT function capability model proposed here consists of five groups or domains of major capabilities that can be organised across the span of the IT function:
1. Information Technology Strategy, Management and Governance
2. Technology and Platforms Standards Development and Management
3. Technology and Solution Consulting and Delivery
4. Operational Run The Business/Business as Usual/Service Provision
5. Change The Business/Development and Introduction of New Services
In the context of trends and initiatives such as outsourcing, transition to cloud services and greater platform-based offerings, should the IT function develop and enhance its meta-capabilities – the management of the delivery of capabilities? Is capability identification and delivery management the most important capability? Outsourced service delivery in all its forms is not a fire-and-forget activity. You can outsource the provision of any service except the management of the supply of that service.
The following IT capability models have been evaluated:
• IT4IT Reference Architecture https://www.opengroup.org/it4it contains 32 functional components
• European e-Competence Framework (ECF) http://www.ecompetences.eu/ contains 40 competencies
• ITIL V4 https://www.axelos.com/best-practice-solutions/itil has 34 management practices
• COBIT 2019 https://www.isaca.org/resources/cobit has 40 management and control processes
• APQC Process Classification Framework - https://www.apqc.org/process-performance-management/process-frameworks version 7.2.1 has 44 major IT management processes
• IT Capability Maturity Framework (IT-CMF) https://ivi.ie/critical-capabilities/ contains 37 critical capabilities
The following model has not been evaluated:
• Skills Framework for the Information Age (SFIA) - http://www.sfia-online.org/ lists over 100 skills
Critical Review of Open Group IT4IT Reference ArchitectureAlan McSweeney
This reviews the Open Group’s IT4IT Reference Architecture (https://www.opengroup.org/it4it) with respect to other operational frameworks to determine its suitability and applicability to the IT operating function.
IT4IT is intended to be a reference architecture for the management of the IT function. It aims to take a value chain approach to create a model of the functions that IT performs and the services it provides to assist organisations in the identification of the activities that contribute to business competitiveness. It is intended to be an integrated framework for the management of IT that emphasises IT service lifecycles.
This paper reviews what is meant by a value chain, with special reference to the Supply Chain Operations Reference (SCOR) model (https://www.apics.org/apics-for-business/frameworks/scor), the most widely used and most comprehensive such model.
The SCOR model is part of a wider set of operations reference models that describe a view of the critical elements in a value chain:
• Product Life Cycle Operations Reference model (PLCOR) - Manages the activities for product innovation and product and portfolio management
• Customer Chain Operations Reference model (CCOR) - Manages the customer interaction processes
• Design Chain Operations Reference model (DCOR) - Manages the product and service development processes
• Managing for Supply Chain Performance (M4SC) - Translates business strategies into supply chain execution plans and policies
It also compares the IT4IT Reference Architecture and its 32 functional components to other frameworks that purport to identify the critical capabilities of the IT function:
• IT Capability Maturity Framework (IT-CMF) https://ivi.ie/critical-capabilities/ contains 37 critical capabilities
• Skills Framework for the Information Age (SFIA) - http://www.sfia-online.org/ lists over 100 skills
• European e-Competence Framework (ECF) http://www.ecompetences.eu/ contains 40 competencies
• ITIL IT Service Management https://www.axelos.com/best-practice-solutions/itil
• COBIT 2019 https://www.isaca.org/resources/cobit has 40 management and control processes
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
"Impact of front-end architecture on development cost", Viktor TurskyiFwdays
I have heard many times that architecture is not important for the front-end. Also, many times I have seen how developers implement features on the front-end just following the standard rules for a framework and think that this is enough to successfully launch the project, and then the project fails. How to prevent this and what approach to choose? I have launched dozens of complex projects and during the talk we will analyze which approaches have worked for me and which have not.
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Connector Corner: Automate dynamic content and events by pushing a buttonDianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
State of ICS and IoT Cyber Threat Landscape Report 2024 previewPrayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Transcript: Selling digital books in 2024: Insights from industry leaders - T...BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
Search and Society: Reimagining Information Access for Radical Futures (Bhaskar Mitra)
The field of information retrieval (IR) is currently undergoing a transformative shift, at least partly due to the emerging applications of generative AI to information access. In this talk, we will deliberate on the sociotechnical implications of generative AI for information access. We will argue that there is both a critical necessity and an exciting opportunity for the IR community to re-center our research agendas on societal needs while dismantling the artificial separation between work on fairness, accountability, transparency, and ethics in IR and the rest of IR research. Instead of adopting a reactionary strategy of trying to mitigate potential social harms from emerging technologies, the community should aim to proactively set the research agenda for the kinds of systems we should build, inspired by diverse, explicitly stated sociotechnical imaginaries. The sociotechnical imaginaries that underpin the design and development of information access technologies need to be explicitly articulated, and we need to develop theories of change in the context of these diverse perspectives. Our guiding future imaginaries must be informed by other academic fields, such as democratic theory and critical theory, and should be co-developed with social science scholars, legal scholars, civil rights and social justice activists, and artists, among others.
Notes On Software Development, Platform And Modernisation
1. Notes on Software Development, Platform and Modernisation Alan McSweeney
5. High Level View of Application and System Landscape (diagram)
- Business drivers: accountability, reduce cost, new channels, visibility, new services, customer service, shareholder value, governance
- Business requirements: better information insight, support business requirements faster, efficient delivery of new services, automation of existing processes, reusable standard services, standard integration of services
- Landscape layers: business units, business processes, business services and services, IT assets (legacy systems, web sites, databases, core applications)
6. View of Long-Term Application Landscape (diagram)
- The same business drivers and requirements, now addressed through Business Process Management and Business Process Improvement
- Service Oriented Architecture: business processes composed from reusable, standard business services with standard integration of services
- IT assets: business units, legacy systems, web sites, databases, core applications
11. Overview of J2EE and .NET
- Type of technology: J2EE is a standard; .NET is a product
- Middleware vendors: J2EE has Oracle, IBM, Apache and many others; .NET has Microsoft
- Web pages and HTML: JSPs (J2EE); ASP.NET (.NET)
- Middle-tier components: EJBs (J2EE); .NET managed components (.NET)
- Database connectivity: JDBC (J2EE); ADO.NET (.NET)
- Execution engine: JVM (Java Virtual Machine) for J2EE; CLR (Common Language Runtime) for .NET
- Portability: variety of operating systems (J2EE); Windows (.NET)
- Interpreter: JRE (J2EE); CLR (.NET)
- Language support: Java only (J2EE); multiple languages (.NET)
- Tools support: Rational, Eclipse, JBuilder etc. (J2EE); Visual Studio .NET (.NET)
12. Detail of J2EE and .NET
- Tools support: Java features a wide variety of tools (Rational, Eclipse, JBuilder, JDeveloper etc.); .NET has Visual Studio .NET, the single IDE for building .NET Windows applications, web applications and XML web services
- Language support: only Java is supported in J2EE; .NET is language independent and can use any language for which a mapping to IL exists
- Portability: J2EE offers complete cross-platform portability; .NET only supports the Windows platform
- Execution engine: Java source code compiles into machine-independent byte code, which the JVM interprets at runtime; the CLR environment executes .NET's Microsoft Intermediate Language code
- Database connectivity: JDBC (J2EE); ADO.NET (.NET)
- Data access: J2EE models persistent data with two main types of entity beans, container-managed and bean-managed; .NET developers access a variety of data sources through ADO.NET classes
- Calling remote objects: JNDI finds server-side components such as EJBs or JMS queues (J2EE); .NET remoting allows calls to remote objects distributed across application domains, processes and machine boundaries
- Transactions: manual transaction management, or automated through containers (J2EE) or the CLR (.NET)
- HTML generation: JSPs and servlets (J2EE); ASP.NET under Internet Information Server (IIS) (.NET)
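The transactions row is easiest to see in code. Below is a toy, in-memory sketch of the manual commit/rollback pattern both platforms expose (in JDBC this would be `Connection.setAutoCommit(false)`, `commit()` and `rollback()`); the `ToyTransaction` class and its rule that a commit rolls back if any staged balance goes negative are invented purely for illustration.

```java
import java.util.HashMap;
import java.util.Map;

// Toy account store illustrating manual transaction management:
// stage changes, then commit all-or-nothing, as with
// Connection.setAutoCommit(false) / commit() / rollback() in JDBC.
public class ToyTransaction {
    private final Map<String, Integer> balances = new HashMap<>();
    private final Map<String, Integer> pending = new HashMap<>();

    public ToyTransaction(Map<String, Integer> initial) {
        balances.putAll(initial);
    }

    // Stage a debit or credit without touching committed state.
    public void adjust(String account, int delta) {
        int current = pending.getOrDefault(account,
                balances.getOrDefault(account, 0));
        pending.put(account, current + delta);
    }

    // Commit fails (and rolls back) if any staged balance is negative.
    public boolean commit() {
        for (int v : pending.values()) {
            if (v < 0) {
                pending.clear(); // rollback: discard staged changes
                return false;
            }
        }
        balances.putAll(pending);
        pending.clear();
        return true;
    }

    public int balance(String account) {
        return balances.getOrDefault(account, 0);
    }
}
```

A transfer that would overdraw an account leaves the committed balances untouched, which is exactly the guarantee container- or CLR-managed transactions automate for you.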
13. Generic Framework for Web Applications (diagram)
- Clients connect over web-based and -related protocols (HTTP, SMTP, ...) to a frontend layer (web server)
- Web service user/provider, with service description (WSDL) and service description, discovery and integration (UDDI)
- Service context (who, where, when, why, ...)
- Core services (calendar, preferences, transactions, ...), micro/macro services, workflow engine and virtual machine
- Integration layer connecting to legacy backends (server, mainframe)
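The frontend layer (web server) in this framework can be sketched with the JDK's built-in `com.sun.net.httpserver` package; the `/hello` endpoint, the response body and the `Frontend` class name are placeholders invented for this sketch, not part of any real framework.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

// Minimal "frontend layer" from the framework diagram: one HTTP
// endpoint that would sit in front of the core and micro services.
public class Frontend {
    public static HttpServer start(int port) {
        try {
            HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
            server.createContext("/hello", exchange -> {
                byte[] body = "<html><body>Hello from the frontend layer</body></html>"
                        .getBytes(StandardCharsets.UTF_8);
                exchange.getResponseHeaders().set("Content-Type", "text/html");
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream out = exchange.getResponseBody()) {
                    out.write(body);
                }
            });
            server.start();
            return server;
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    // Convenience client, standing in for the "clients" box.
    public static String get(String url) {
        try {
            return HttpClient.newHttpClient()
                    .send(HttpRequest.newBuilder(URI.create(url)).build(),
                          HttpResponse.BodyHandlers.ofString())
                    .body();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```

Passing port 0 binds an ephemeral port, so the sketch can run anywhere without clashing with an existing server.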
14. .NET Implementation of the Framework
- .NET devices and .NET servers: SQL Server, BizTalk, Commerce, Exchange, Mobile Information, Host Integration, Application Center
- .NET Foundation Services: Passport, calendar, directory and search, notification and messaging, personalisation, web store/XML, dynamic delivery of software and services
- Common Language Runtime: memory management, common type system, lifecycle monitoring
- .NET Framework and tools: base classes (ADO.NET, XML, threading, IO, ...), ASP.NET (web services, web forms, ASP.NET application services), Windows Forms (controls, drawing, Windows application services)
15. Java Implementation of the Framework
- Service creation and assembly: JB, JSP, EJB
- Service container: J2EE, EJB, JSP, J2SE, J2ME, MIDP, Java Card
- Service integration: SQL, JDBC, XML, XSLT, XP, JMS, RMI, J2EE Connectors, ...
- Web services: smart delivery (XML, HTML, XHTML, WML, VoiceXML, XSLT, HTTP, SSL, XP, SOAP, WSDL, UDDI, ebXML, ...), smart process (ebXML, XAML), smart policy (LDAP, Kerberos, PKI, OASIS Security), smart management (SNMP, CIM, WBEM, JMX)
- Process management, service platform and service interface
21. Process and Portfolio Management (IBM)
- Create, customise and deploy an SOA governance process using IBM Rational Method Composer
- Use IBM Rational Portfolio Manager to identify and manage your software development projects and resources, assess cost and ROI, and comply with your SOA governance policies
22. Change and Release Management (IBM)
- Use IBM Rational ClearCase for full lifecycle management and version control of development artifacts
- Use IBM Rational ClearQuest for geographically distributed activity, change and defect management
- Use IBM Rational Build Forge to achieve a repeatable, automated build process that accelerates software delivery
- Use IBM Rational Asset Manager to define, create, group, store, search, retrieve, measure and govern the reuse of development assets
23. Requirements and Quality Management
- Use IBM Rational ClearQuest and Rational Functional Tester for integrated test, activity and quality management
- Use Rational RequisitePro to manage requirements throughout the development lifecycle, ensuring business needs drive IT investment and validating that deployed solutions meet quality measures
- Use IBM Rational Performance Tester for performance and load testing with local/remote execution and monitoring
24. Analysis, Design and Construction
- Use WebSphere Business Modeler and Integration Developer to model business processes, simulate and socialise business cases, and make human and automated workflows executable
- Use IBM Rational Software Architect to understand your existing architecture and drive its evolution, using analysis, modelling and transformation capabilities across UML and source code boundaries
- Use IBM's Eclipse-powered Rational Application Developer to code, generate, unit test, analyse and debug your applications and services across the Java, web, portal and open-standards landscapes
25. Govern the Process of Software Development
- Budget status from Rational Portfolio Manager
- Defect glide path from Rational ClearQuest
- Testing status from Rational TestManager
- Requirements volatility from Rational RequisitePro
- Code churn from Rational ClearCase
29. Application Transformation (diagram)
- Application environments: zSeries, iSeries, Unix, Windows
- Approaches: automated conversion and refactoring, using Rational Application Developer (RAD), Rational Business Developer (RBD) and the Rational management tools
- Deployment targets: iSeries, Windows, Linux
31. Spectrum of Options for Application Modernisation
- UI refacing or rewriting: give existing core business applications a web browser or rich UI with modern screens, or surface them through a portal
- Discovery and analysis of the existing application
- Refactoring: produce a "rationalised", efficient version of the application
- Transformation/conversion: move code between EGL, Java, COBOL and ILE
- Extract design and re-build: create a new application in EGL or Java
- SOA: expose reusable components or services
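A hypothetical before/after sketch of the refactoring option: a credit-limit rule that was buried in a monolithic order routine is extracted into a reusable, separately testable method. All names, values and the rule itself are invented for illustration, not taken from any real system.

```java
// Refactoring sketch: the business rule is extracted from a
// monolithic routine so it can be reused from any channel
// (green-screen program, web UI, or an SOA service wrapper).
public class OrderRules {
    // Extracted, reusable business rule.
    public static boolean creditLimitOk(int orderTotal, int creditLimit,
                                        int openBalance) {
        return orderTotal + openBalance <= creditLimit;
    }

    // The legacy routine shrinks to orchestration around the rule.
    public static String processOrder(int orderTotal, int creditLimit,
                                      int openBalance) {
        if (!creditLimitOk(orderTotal, creditLimit, openBalance)) {
            return "REJECTED: credit limit exceeded";
        }
        return "ACCEPTED";
    }
}
```

Once the rule stands alone it can also be exposed as one of the "reusable components or services" the SOA option on this slide describes.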
33. Automated Repository Generation (diagram)
- Automated extraction and refresh from the existing system (all objects, all source, all languages, all variables) into the X-Analysis repository
- Cross-reference data: programs, displays, data sources and databases
- Data model: data dictionary, logical files/views, key maps, relationships, special fields
- Business rule logic: business logic, validation, calculations, secondary reads, secondary updates, batch calls
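A toy version of the automated cross-reference extraction described above: scan source lines and record, per identifier, the line numbers on which it appears. The `XrefBuilder` name and the identifier regex are illustrative, not X-Analysis's actual repository format.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Toy cross-reference extractor: the core of a repository like
// the one on the slide is a map from identifier to usage sites.
public class XrefBuilder {
    private static final Pattern IDENT =
            Pattern.compile("[A-Za-z_][A-Za-z0-9_]*");

    public static Map<String, List<Integer>> build(List<String> sourceLines) {
        Map<String, List<Integer>> index = new LinkedHashMap<>();
        for (int lineNo = 0; lineNo < sourceLines.size(); lineNo++) {
            Matcher m = IDENT.matcher(sourceLines.get(lineNo));
            while (m.find()) {
                index.computeIfAbsent(m.group(), k -> new ArrayList<>())
                     .add(lineNo + 1); // 1-based line numbers
            }
        }
        return index;
    }
}
```

Refreshing the repository is then just re-running the scan, which is what makes the "automated extraction and refresh" step cheap to repeat.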
35. Software Rewrites
- Discovery, analysis and maintenance with X-Analysis: cross-referencing and documentation, relational data model, business rule logic, graphical function diagrams, RPG as pseudo code, data flow charting, UML and DDL extraction
- Software modelling tools (RSx, Together, CA Gen): activity, use-case and class diagrams; data model redesign; persistence and CRUD; SOA
- New IDE tools: Eclipse, Rational, Visual Studio, EGL, Plex, LANSA
- X-Migrate targets: JSF/Java/RCP, Web 2.0/Ajax, .NET/ASP/C#, Silverlight/XAML, PHP/MySQL
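As a hedged sketch of the persistence/CRUD contract such a rewrite might target, here is a minimal data-access object with an in-memory store standing in for the redesigned relational data model. The `CustomerDao` and `Customer` names and fields are invented; a generated layer would back the same four operations with SQL.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Minimal CRUD contract a rewrite might target; an in-memory map
// stands in for the redesigned relational data model.
public class CustomerDao {
    public static class Customer {
        public final int id;
        public final String name;
        public Customer(int id, String name) {
            this.id = id;
            this.name = name;
        }
    }

    private final Map<Integer, Customer> store = new HashMap<>();

    public void create(Customer c)          { store.put(c.id, c); }
    public Optional<Customer> read(int id)  { return Optional.ofNullable(store.get(id)); }
    public void update(Customer c)          { store.put(c.id, c); }
    public void delete(int id)              { store.remove(id); }
}
```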