Mainframe users are continuously challenged to keep pace with rising data volumes from distributed applications that depend on mainframe transaction processing power. The pressure to squeeze more performance and value out of existing mainframes, while avoiding or deferring major upgrades, never stops.
There are ways to improve the efficiency of core workloads, like sorting, that help you uncover additional capacity, save money, and increase the ROI for mainframe expenditures. In addition, you can deliver more value to your business by integrating mainframe data into next-generation cloud and data platforms like Databricks, Snowflake, Splunk, ServiceNow, and more.
Organizations are struggling to make sense of their data within antiquated data platforms. Snowflake, the data warehouse built for the cloud, can help.
Overview of Data Loss Prevention Policies in Office 365 - Dock 365
A presentation about identifying, monitoring, and automatically protecting sensitive information across Office 365.
With a DLP Policy, you can:
- Identify sensitive information across many locations, such as SharePoint Online and OneDrive for Business.
- Prevent the accidental sharing of sensitive information.
- Monitor and protect sensitive information in the desktop versions of Excel 2016, PowerPoint 2016, and Word 2016.
- Help users learn how to stay compliant without interrupting their workflow.
- View DLP reports showing content that matches your organization's DLP policies.
Visit www.mydock365.com to learn more about SharePoint with Dock.
The session theme is "Enabling Business Continuity During Challenging Times With Virtual Desktops". The session will be conducted by Microsoft.
In the last few weeks, the lives of people around the world have been impacted. Daily work has been compromised, particularly with regard to business continuity. Remote working, in the best interest of organizations, is becoming a necessity.
Travel restrictions and new rules on large public gatherings have changed the daily routines of millions. Many organizations are quickly moving to remote working environments. If your customers are thinking of similar options, we at Microsoft are here to support you in this endeavor.
Data stewards are the implementation arm of Data Governance. They are also the first line of defense against bad data practices. Whether it’s data profiling or in-depth root cause analysis, data stewards ensure the organization’s shared data is reliably interconnected. Whether starting or restarting your Data Stewardship program, success comes from:
- Understanding the cadence/role of foundational data practices supporting organizational operations
- Proving value with tangible ROI
- Improving effectiveness/efficiencies using organization-wide insight
- Comprehending how stewards need to be multifunctional and dexterous, especially at first
- Integrating the fight against data debt into the steward's role
Power BI Overview, Deployment and Governance - James Serra
Deploying Power BI in a large enterprise is a complex task, and one that requires a lot of thought and planning. The purpose of this presentation is to help you make your Power BI deployment a success. After a quick Power BI overview, I’ll discuss deployment strategies, common usage scenarios, how to store and refresh data, prototyping options, how to share externally, and then finish with how to administer and secure Power BI. I’ll outline considerations and best practices for achieving an optimal, well-performing, enterprise level Power BI deployment.
Data Strategy - Executive MBA Class, IE Business School - Gam Dias
For today's enterprise, data is very much a corporate asset, vital to delivering products and services efficiently and cost-effectively. Few organizations can survive without harnessing data in some way.
Viewed as a strategic asset, data can be a source of new internal efficiencies, improved competitive advantage or a source of entirely new products that can be targeted at your existing or new customers.
This slide deck contains the highlights of a one day course on Data Strategy taught as part of the Executive MBA Program at IE Business School in Madrid.
Northwestern Mutual Journey – Transform BI Space to Cloud - Databricks
The volume of available data is growing by the second (to an estimated 175 zettabytes by 2025), and it is becoming increasingly granular. With that change, every organization is moving toward building a data-driven culture. We at Northwestern Mutual share a similar story of driving toward data-driven decisions to improve both efficiency and effectiveness. Legacy system analysis revealed bottlenecks, excesses, and duplications. Based on the ever-growing need to analyze more data, our BI team decided to move to a more modern, scalable, cost-effective data platform. As a financial company, data security is as important as data ingestion. In addition to fast ingestion and compute, we needed a solution that could support column-level encryption and role-based access for different teams to our data lake.
In this talk, we describe our journey to move hundreds of ELT jobs from our MSBI stack to Databricks and to build a data lake (using Lakehouse), and how we reduced our daily data load time from 7 hours to 2 hours while gaining the capability to ingest more data. We share our experience, challenges, learnings, architecture, and the design patterns used in this large migration effort, along with the tools and frameworks our engineers built to ease the learning curve for our non-Apache Spark engineers during the migration. You will leave this session with a better understanding of what migrating to Apache Spark/Databricks could mean for you and your organization.
Using Amazon Neptune to power identity resolution at scale - ADB303 - Atlanta... - Amazon Web Services
IgnitionOne, a global marketing technology and services leader, announced a strategic partnership with Amazon Neptune, a fast, reliable, and fully managed graph database service offered by AWS. In this session, we discuss how this partnership further enhances the IgnitionOne Customer Intelligence Platform (CIP) with amplified identity resolution capabilities, allowing for greater cross-device performance and cross-browser audience activations. Learn how the IgnitionOne CIP with Amazon Neptune goes beyond traditional Customer Data Platform capabilities, giving brands deeper insights for better omnichannel engagement.
This session will provide a basic overview of Microsoft 365 and will then dive into how to position its benefits for customers. You'll learn how Microsoft 365 features help resolve many common business challenges today and how you should speak with customers about them.
Jade Global Digital Transformation & Cloud Consulting Partner - Overview - Jade Global
Jade Global is well-positioned to be your strategic IT services partner. We create value through our vast portfolio of IT services delivered by highly skilled and experienced consultants. Our services include business application implementations, integrations, software product engineering, Cloud services, technology advisory, testing, and managed services. We have domain expertise in a variety of industries including manufacturing, high-tech, energy, pharmaceuticals, and warehouse distribution. Jade Global is a member of the Oracle Cloud Excellence Implementor (CEI) Program, a Salesforce Ridge Partner, Boomi Platinum Partner, ServiceNow Elite Services Partner, NetSuite Systems Integrator Partner, SAP Partner, and Snowflake Select Partner, providing comprehensive implementation, integration, and optimization services across these mature technologies' ecosystems. The Company has been recognized as one of the fastest-growing companies in North America by Inc. 5000 and the Stevie Awards.
The main aim of Data-Centric Architecture is to reduce complexity of information systems by using shared data with clear meaning. But how can you trust your data? How do you know if it is accurate and reliable?
Migrating your traditional Data Warehouse to a Modern Data Lake - Amazon Web Services
In this session, we discuss the latest features of Amazon Redshift and Redshift Spectrum, and take a deep dive into its architecture and inner workings. We share many of the recent availability, performance, and management enhancements and how they improve your end user experience. You also hear from 21st Century Fox, who presents a case study of their fast migration from an on-premises data warehouse to Amazon Redshift. Learn how they are expanding their data warehouse to a data lake that encompasses multiple data sources and data formats. This architecture helps them tie together siloed business units and get actionable 360-degree insights across their consumer base.
Want to see a high-level overview of the products in the Microsoft data platform portfolio in Azure? I’ll cover products in the categories of OLTP, OLAP, data warehouse, storage, data transport, data prep, data lake, IaaS, PaaS, SMP/MPP, NoSQL, Hadoop, open source, reporting, machine learning, and AI. It’s a lot to digest but I’ll categorize the products and discuss their use cases to help you narrow down the best products for the solution you want to build.
Actionable Insights with AI - Snowflake for Data Science - Harald Erb
Talk @ ScaleUp 360° AI Infrastructures DACH, 2021: Data scientists spend 80% or more of their time searching for and preparing data. This talk explains Snowflake's platform capabilities, such as near-unlimited data storage and instant, near-infinite compute resources, and how the platform can be used to seamlessly integrate and support the machine learning libraries and tools data scientists rely on.
New Mainframe Sort Innovations Built on IBM Z Platform Enhancements - Precisely
As the IBM Z platform continues to transform to support the needs of today's hybrid cloud environments, it is focused on being robust, resilient, and securable. Hybrid cloud combines and unifies public cloud, private cloud, and on-premises infrastructure to create a single, flexible, cost-optimal IT infrastructure. As IBM drives its hybrid cloud model forward, it has delivered significant enhancements in encryption and performance, creating opportunities to deliver more value to clients as they continue to look for ways to improve their mainframe environments.
Join us for a discussion with Denise Tabor, Product Management Director at Precisely, on IBM’s recent enhancements in resource optimization and security and how Precisely is leveraging these technological improvements.
Watch this on-demand webinar to hear about:
• IBM’s recent enhancements in IBM Z Platform resource optimization and security
• Performance enhancements in Syncsort MFX that take advantage of new IBM capabilities
• How we are leveraging IBM’s Pervasive Encryption in our Syncsort MFX solution
z Systems redefining Enterprise IT for digital business - Alain Poquillon - NRB
IBM z Systems with the new z13 is the backbone infrastructure for the evolving digital era. Built on over 50 years of experience and billions of dollars in developing leading-edge technology, it is at the forefront of modern information technology across many domains. Mr. Poquillon illustrates IBM's z13 pre-eminence by highlighting assets such as its shared-everything approach and centralized management of resources, which make it a natural fit for cloud; its hybrid transaction/analytics processing capabilities, which deliver real-time analytics more efficiently on in-process transactional data; and its ability to provide the scale and performance a business needs to survive the mobile and social onslaught.
MT01 The business imperatives driving cloud adoption - Dell EMC World
Cloud adoption has reached an inflection point, pushing organizations into an "adapt or die" state, forcing new operating models, effective management of internal and external resources, and a transformation toward an application-centric mentality. Cloud approaches are maturing past the point of public cloud domination, shifting focus to private and hybrid cloud and to effective management of multi-cloud environments. Attend this session to learn how to realize true business value when the friction of the business dynamic is supported by flexible cloud services delivered with predictability and speed.
Liberate Legacy Data Sources with Precisely and Databricks - Precisely
Mainframe and IBM i data continues to be prevalent in several industries including financial services, insurance, and retail where critical customer information lives on legacy systems. In fact, in 2019 alone, studies show that there was a 55% increase in transaction volumes on the mainframe across all industries. To thrive in highly competitive markets, you must quickly break down legacy data silos to swiftly gain a full picture of data for insights for strategic action.
Traditional storage solutions that are mainframe proprietary struggle to scale for high data volumes and real-time analytics use cases. This results in increased costs, diminished performance, and missed SLAs. To solve this, Precisely and Databricks provide a modern approach for organizations to optimize volumes of data by leveraging the massive scalability of the cloud to power high-performance analytics, AI, and machine learning, regardless of where data lives.
In this webinar, we discuss:
- Quickly ingesting data from on-premises sources – such as mainframe and IBM i – to the cloud with the Databricks Unified Data Analytics Platform and Delta Lake
- Modernizing ETL processes and reducing development costs with visual data pipelines that use the elastic scalability of Databricks
- Empowering business users with the most up-to-date data by populating Delta Lake with real-time data changes from legacy systems
View this webinar on-demand to see a live demo of the joint solution and how it can modernize your legacy infrastructure.
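The "real-time data changes from legacy systems" pattern described above is change data capture (CDC) applied to a target table. A minimal, vendor-neutral sketch in plain Python (the event shape is an illustrative assumption, not a Precisely or Databricks format; a real pipeline would use Delta Lake's MERGE) shows the core upsert/delete semantics:

```python
# Illustrative CDC apply step: fold a stream of insert/update/delete
# events into a target "table" keyed by primary key. In production this
# would be a Delta Lake MERGE on Databricks; this sketch only shows the
# semantics, with an assumed event format.

def apply_cdc(table: dict, events: list) -> dict:
    """Apply change events to an in-memory table, keyed by primary key."""
    for ev in events:
        op, key = ev["op"], ev["key"]
        if op in ("insert", "update"):
            table[key] = ev["row"]    # upsert: last writer wins
        elif op == "delete":
            table.pop(key, None)      # idempotent delete
    return table

events = [
    {"op": "insert", "key": 1, "row": {"name": "Ada", "balance": 100}},
    {"op": "update", "key": 1, "row": {"name": "Ada", "balance": 250}},
    {"op": "insert", "key": 2, "row": {"name": "Grace", "balance": 75}},
    {"op": "delete", "key": 2},
]
print(apply_cdc({}, events))  # {1: {'name': 'Ada', 'balance': 250}}
```

Because updates overwrite and deletes are idempotent, replaying the same event stream leaves the table in the same state, which is what makes real-time population of a lake table from legacy change logs tractable.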
Cloudera + Syncsort: Fuel Business Insights, Analytics, and Next Generation T... - Precisely
Effective AI and ML projects require a perfect blend of scalable, clean data funneled from a variety of sources across the business. The only problem? Uncleaned data often lives in hard-to-access legacy systems, and it costs time and money to build the right foundation to deliver that data to answer ever-changing questions from business users. Together, Cloudera and Syncsort enable you to build a scalable foundation of data connections to reinvent the data lifecycle of all your projects in the most efficient way possible.
View this webinar on-demand to learn how innovative solutions from Cloudera and Syncsort enable AI and ML success. You will learn:
• Best practices for transforming complex data into clear, actionable insights for AI and ML projects
• How to visually assess the quality of the sources in your data lake and their completeness, consistency, and accuracy
• The value of an Enterprise Data Cloud and the newly unveiled Cloudera Data Platform
• How Syncsort Connect integrates natively with the Cloudera Data Platform
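Two of the data-quality dimensions listed above, completeness and consistency, reduce to simple per-column metrics. A small Python sketch (column names, sample rows, and the pattern are illustrative assumptions, not Syncsort or Cloudera APIs) shows how they can be computed:

```python
# Sketch of two data-quality metrics: completeness (share of non-null
# values in a column) and consistency (share of values matching an
# expected pattern). Sample data and pattern are assumptions for
# illustration only.
import re

def completeness(rows: list, column: str) -> float:
    values = [r.get(column) for r in rows]
    return sum(v is not None for v in values) / len(values)

def consistency(rows: list, column: str, pattern: str) -> float:
    values = [r.get(column) for r in rows if r.get(column) is not None]
    return sum(bool(re.fullmatch(pattern, v)) for v in values) / len(values)

rows = [
    {"id": "A-001", "amount": 10},
    {"id": "A-002", "amount": None},
    {"id": "bad id", "amount": 7},
]
print(completeness(rows, "amount"))         # 2 of 3 values are non-null
print(consistency(rows, "id", r"A-\d{3}"))  # 2 of 3 ids match the pattern
```

Profiling tools essentially compute metrics like these across every column and source, then surface the scores visually so low-quality feeds can be fixed before they reach AI/ML projects.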
IBM Z Cost Reduction Opportunities. Are you missing out? - Precisely
Large companies continue to use mainframes for their most business-critical IT workloads. For these companies, finding ways to get more bang for the mainframe buck, in terms of both costs and performance, is always a high priority. Several converging trends in recent years have made it more challenging than ever to achieve the needed organizational performance at the best possible price point. IT leaders in mainframe departments are seeking out ways to speed processing, especially mundane processing tasks such as sorting, copying, merging, compression, and report generation.
Whether you are looking to get more value from your mainframe investment with enhanced performance, improved efficiency, or modernization, Precisely has multiple solutions for customers running IBM Z systems that can have a dramatic impact on cost and efficiency.
Watch this on-demand webinar to learn about:
• Optimizing mainframe sort workloads
• Leveraging your zIIP processors
• Modernizing your database environment
• Improving visibility into mainframe processing
Going Beyond the Cloud to Modernize Your Banking Infrastructure - Cloudflare
View this presentation to learn about digital transformation in banking and how Cloudflare can help. You will learn about:
- Common challenges banks are facing when migrating to the cloud
- How to integrate your existing on-premises infrastructure alongside public-facing workloads
- Why global load balancers are an essential part of any multi-cloud strategy
- What banks can do to support faster innovation across the organization
- What banks should be aware of regarding compliance and monitoring
Exploring the Applications of Cloud Computing in the IT Industry.pdf - TechnoMark Solutions
Welcome to TechnoMark Solutions, a cutting-edge tech company that is revolutionizing the digital landscape one solution at a time. At TechnoMark, we specialize in providing innovative tech services to startups, tailored to their unique business requirements. With a commitment to excellence and innovation, TechnoMark Solutions is your trusted partner in building your brand from the ground up.
Streaming IBM i to Kafka for Next-Gen Use Cases - Precisely
Your team is always under pressure to accelerate the adoption of the most modern and powerful technologies. Simultaneously, your existing investments, such as IBM i, your organization’s most critical data asset, remain in a silo. The only practical path forward is to connect the new and existing with a streaming technology like Apache Kafka to feed real-time applications that power use cases ranging from marketing and order replenishment to fraud detection.
Join this Precisely webinar to learn how to unlock the potential of your IBM i data by creating data pipelines that integrate, transform, and deliver it to users when and where they need it. Additionally, hear how Stark Denmark uses Precisely Connect CDC to provide data to their organization in real time.
Join this webinar to:
- Understand the benefits and challenges of building data pipelines that access and integrate data from IBM i systems to modern data platforms
- Learn how Precisely can help you build real-time data pipelines
- Hear from Stark Denmark on how they are using Connect CDC from Precisely and the benefits they are getting
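The pipeline described above boils down to turning each row-level change on IBM i into a keyed Kafka message. A hedged Python sketch (the envelope fields and topic name are assumptions for illustration; Precisely Connect defines its own formats, and the actual send is commented out because it needs a live broker):

```python
# Sketch: wrap an IBM i row change as a keyed Kafka message. Keying by
# table + primary key keeps all changes for one row in one partition,
# preserving their order for downstream consumers.
import json

def to_kafka_message(topic: str, change: dict) -> tuple:
    """Build (topic, key_bytes, value_bytes) for one change event."""
    key = json.dumps({"table": change["table"], "pk": change["pk"]}).encode()
    value = json.dumps(change, sort_keys=True).encode()
    return topic, key, value

change = {"table": "ORDERS", "pk": 42, "op": "update",
          "after": {"status": "SHIPPED"}}
topic, key, value = to_kafka_message("ibmi.orders.cdc", change)
print(key.decode())  # {"table": "ORDERS", "pk": 42}

# With a real broker, sending would look like (kafka-python):
# producer = KafkaProducer(bootstrap_servers="broker:9092")
# producer.send(topic, key=key, value=value)
```

Consumers for fraud detection or replenishment then subscribe to the topic and react to each change as it arrives, rather than waiting for a nightly batch extract.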
New high-density storage server - IBM System x3650 M4 HD - Cliff Kinard
IBM System x3650 M4 HD
Flexibility, simplicity and scale for today's workloads.
The ultimate high density storage server, designed for data-intensive analytics and business critical workloads. Optimal blend of performance, uptime, and dense storage.
Watch our North America webcast replay here:
http://event.on24.com/r.htm?e=670225&s=1&k=FC5CD17AB42385B40BCED29B8B61E2E8&partnerref=IBM09
EDB Postgres is more than just a DBMS. It is a platform that liberates enterprises from expensive DBMSs to enable digital business. EDB enterprise customers drive new IT and modernization initiatives using EDB Postgres to minimize risk and maximize savings.
What if IT could free their core database management systems from traditional limitations and become a differentiator instead of a bottleneck to the business?
Learn how the open source based EDB Postgres Platform helps enterprises confidently deliver on the promise of digital business. This presentation covers:
* The updated platform,
* Value for the enterprise,
* Technical differentiators, and
* Customer success stories showing EDB Postgres in action.
This presentation is intended for IT leaders, both management and technical. If you want to lead your organization to accelerate the transformation to digital business, this webinar is for you.
Transform Your Mainframe and IBM i Data for the Cloud with Precisely and Apac... - HostedbyConfluent
Your mainframe and IBM i platforms do hard work for your business, supporting essential computing transactions every day. However, mainframe data does not easily integrate with the cloud platforms driving data-driven, real-time, analytics-focused business processes. Integrating data from this critical technology often results in high costs, missed deadlines, and unhappy customers. So, what can you do? Join us to hear how Precisely Connect can help use the power of Apache Kafka to eliminate data silos and make cloud-based, event-driven data architectures a reality. Start your cloud transformation journey today, knowing you don't need to leave essential transaction data behind! Learn more about:
• Where to begin your cloud transformation journey using mainframe and IBM i data and Apache Kafka
• What you need to move mainframe and IBM i data to the cloud while reducing costs, modernizing architectures, and using the staff you have today
• How Precisely Connect customers are using change data capture and Apache Kafka to deliver real-time insights to the cloud
The emergence of social, mobile, cloud, big data and analytics are fundamentally changing how we live, work and interact.
Mobile devices are ubiquitous, changing consumer behaviors, supplanting PCs, generating massive amounts of data, and putting new demands on the enterprise not only to support these devices but to adjust the way it does business.
Social technologies are changing the way we interact, communicate and share information – equally generating vast amounts of data and impacting business as they try to unlock the full potential social has to offer.
Cloud technologies bring new scale and efficiency to service delivery, enable more agile ways of doing business, and drive business model innovation. For companies, cloud also brings information and applications to people at the right time and place.
All of these trends are fueling an explosion of data. Not only do enterprises need to store, manage and secure this data, they also need to derive meaningful insight from these vast amounts of data. Data is the basis of significant opportunity and a source of competitive advantage for all organizations. Data is a new economic asset, the next natural resource.
These trends are spawning new workloads, business processes and technology deployments that are putting unprecedented demands on our IT environments.
Innovative and Agile Data Delivery, using 'A Logical Data Fabric'Denodo
Watch full webinar here: https://bit.ly/3eBEoKH
Presented at BIGIT's World Tech Festival 2022, ASEAN
Ongoing digital transformation is generating new data assets that have the potential to offer organisations unprecedented insights into operations, business processes, customer behaviour, the competition, and much more. But if organisations cannot effectively access, integrate, and govern data that is distributed across on-premises and multiple cloud providers’ data platforms, they will fall short of realizing its value. A logical data fabric that uses virtualization capabilities can avoid the traditional approach of physically integrating data.
In this session, you will learn how organisations can create a logical data fabric with data virtualization technology to:
- Minimize data movement and data replication, which can be time-consuming and expensive, and can pose security and compliance risks
- Virtually integrate, manage and govern enterprise data across on-premises and cloud for insight generation and business decision making
- Examine how and why a logical data fabric could benefit your organization today and future-proof your data architecture to meet new demands
Corporate presentation: Ανδρέας Τσαγκάρης, Chief Technology Officer, Performance Technologies
Title: «OpenShift and IBM Cloud Paks on Power for Digital transformation»
Similar to Optimize the Value of Your Mainframe (20)
AI-Ready Data - The Key to Transforming Projects into Production.pptxPrecisely
Moving AI projects from the laboratory to production requires careful consideration of data preparation. Join us for a fireside chat where industry experts, including Antonio Cotroneo (Director, Product Marketing, Precisely) and Sanjeev Mohan (Principal, SanjMo), will discuss the crucial role of AI-ready data in achieving success in AI projects. Gain essential insights and considerations to ensure your AI solutions are built on a solid foundation of accurate, consistent, and context-rich data. Explore practical insights and learn how data integrity drives innovation and competitive advantage. Transform your approach to AI with a focus on data readiness.
Building a Multi-Layered Defense for Your IBM i SecurityPrecisely
In today's challenging security environment, new vulnerabilities emerge daily, leaving even patched systems exposed. While IBM works tirelessly to release fixes as they discover vulnerabilities, bad actors are constantly innovating. Don't settle for reactive defense – secure your IT with a layered approach!
This holistic strategy builds multiple security walls, making it far harder for attackers to breach your defenses. Even if a certain vulnerability is exploited, one of the controls could stop the attack or at least delay it until you can take action.
Join us for this webcast to hear about:
• How security risks continue to evolve and change
• The importance of keeping all your systems patched and up-to-date
• A multi-layered approach to network, system object and data security
Navigating the Cloud: Best Practices for Successful MigrationPrecisely
In today's digital landscape, migrating workloads and applications to the cloud has become imperative for businesses seeking scalability, flexibility, and efficiency. However, executing a seamless transition requires strategic planning and careful execution. Join us as we delve into key insights around cloud migration, where we will explore three key topics:
i. Considerations to take when planning for cloud migration
ii. Best practices for successfully migrating to the cloud
iii. Real-world customer stories
Unlocking the Power of Your IBM i and Z Security Data with Google ChroniclePrecisely
In today's ever-evolving threat landscape, siloed systems and data leave organizations vulnerable. This is especially true when mission-critical systems like IBM i and IBM Z mainframes are not included in your security planning. Valuable security data from these systems often remains isolated, hindering your ability to detect and respond to threats effectively.
Ironstream can bridge this gap for IBM systems by integrating the important security data from these mission-critical systems into Google Chronicle, where it can be seen, analyzed and correlated with data from other enterprise systems. Here's what you'll learn:
• The unique challenges of securing IBM i and Z mainframes
• Why traditional security tools fall short for mainframe data
• The power of Google Chronicle for unified security intelligence
• How to gain comprehensive visibility into your entire IT ecosystem
• Real-world use cases for integrating IBM i and Z security data with Google Chronicle
Join us for this webcast to hear about:
• The unique challenges of securing IBM i and IBM Z systems
• Real-world use cases for integrating IBM i and IBM Z security data with Google Chronicle
• Combining Ironstream and Google Chronicle to deliver faster threat detection, investigation, and response times
Unlocking the Potential of the Cloud for IBM Power SystemsPrecisely
Are you considering leveraging the cloud alongside your existing IBM AIX and IBM i systems infrastructure? There are likely benefits to be realized in scalability, flexibility and even cost.
However, to realize these benefits, you need to be aware of the challenges and opportunities that come with integrating your IBM Power Systems in the cloud. These challenges range from data synchronization to testing to planning for fallback in the event of problems.
Join us for this webcast to hear about:
• Seamless migration strategies
• Best practices for operating in the cloud
• Benefits of cloud-based HA/DR for IBM AIX and IBM i
It can be challenging to display and share capacity data that is meaningful to end users. There is an overabundance of data points related to capacity, and a summary of this data is difficult to construct and display.
You are already spending time and money to handle the critical need to manage systems capacity and performance and to estimate future needs. Are you spending it wisely? Are you getting the level of results from your investment that you really need? Can you prove it?
The good news is that the return on investment of implementing capacity management and capacity planning is most definitely positive and provable, both in terms of tangible monetary value and in some less tangible but no-less-valuable benefits.
Join us for this webinar and learn:
• Top Trends in Capacity Management
• Common customer pain points
• Ways to demonstrate these benefits to your company
Automate Studio Training: Materials Maintenance Tips for Efficiency and Ease ...Precisely
Ready to improve efficiency, provide easy-to-use data automations and take materials master (MM) data maintenance to the next level?
Find out how during our Automate Studio training on March 28 – led by Sigrid Kok, Principal Sales Engineer, and Isra Azam, Sales Engineer, at Precisely.
This session’s for you if you want to discover the best approaches for creating, extending or maintaining different types of materials, as well as automating the tricky parts of these processes that slow you down.
Greater control over your Automate Studio business processes means bigger, better results. We’ll show you how to enable your business users to interact with SAP from Microsoft Office and other familiar platforms – resulting in more efficient SAP data management, along with improved data integrity and accuracy.
This 90-minute session will be filled with a variety of topics, including:
• real-world approaches for creating multiple types of materials, balancing flexibility and power with simplicity and ease of use
• tips on material creation, including:
- downloading the generated material number
- using formulas to format prior to upload, such as capitalization or zero padding, to make it easy to get the data right the first time
- conditionally requiring fields based on other field entries
- using LOVs for fields that are free-form entry for standard values
• tips on modifying alternate units of measure, building from scratch using GUI scripting
• modifying multiple language descriptions, building from scratch using a standard BAPI
• making end-to-end MM process flows more of a reality with features including APIs and predictive AI
Through these topics, you’ll gain plenty of actionable takeaways that you can start implementing right away, including how to:
• improve your data integrity and accuracy
• make scripts flexible and usable for automation users
• seamlessly handle both simple and complex parts of the material master
• interact with SAP from both business users' and script developers’ perspectives
• easily upload and download data between SAP and Excel, and format the data before upload using simple formulas
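As a plain-Python sketch of the kind of formatting those simple formulas perform before upload (this is illustrative, not Automate Studio functionality; the 18-character width mirrors SAP's classic MATNR field length):

```python
def format_material_number(value: str, width: int = 18) -> str:
    """Format a material number before upload to SAP.

    Numeric values get leading zeros (zero padding to the field
    width); text values are upper-cased, so the data is right the
    first time.
    """
    value = value.strip()
    if value.isdigit():
        return value.zfill(width)   # "4711" -> zero-padded to 18 chars
    return value.upper()            # "pump-100" -> "PUMP-100"

print(format_material_number(" 4711 "))
print(format_material_number("pump-100"))
```

In spreadsheet terms this roughly corresponds to formulas like TEXT(A1,"000000000000000000") for zero padding and UPPER(A1) for capitalization.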
You’ll leave this session feeling ready and empowered to save time, boost efficiency, and change the way you work.
Automate Studio reduces your dependency on technical resources to help you create automation scenarios – and our team of experts is here to make sure you get the most out of our solution throughout the journey.
Questions? Sigrid & Isra will be ready to answer them during a live Q&A at the end of the session.
Who should attend:
Attendees who will get the most out of this session are Automate Studio developers and runners familiar with SAP MM. Knowledge of Automate Studio script creation is nice to have, but not required.
Leveraging Mainframe Data in Near Real Time to Unleash Innovation With Cloud:...Precisely
Join us for an insightful roundtable discussion featuring experts from AWS, Confluent, and Precisely as they delve into the complexities and opportunities of migrating mainframe data to the cloud.
In this engaging webinar, participants will learn about the various considerations, strategies, and customer challenges associated with replicating mainframe data to cloud environments.
Our panelists will share practical insights, real-world experiences, and best practices to help organizations successfully navigate this transformative journey.
Whether you're considering migrating and modernizing your mainframe applications to cloud, or augmenting mainframe-based applications with data replication to cloud, this roundtable will provide valuable perspectives and insights to maximize the benefits of migrating mainframe data to the cloud.
Join us on March 27 to gain a deeper understanding of the opportunities and challenges in this evolving landscape.
Data Innovation Summit: Data Integrity TrendsPrecisely
Data integrity remains an evolving process of discovery, identification, and resolution. With public confidence in data used for decision-making at an all-time low, attention has gradually shifted to data quality and data integration across multiple systems and frameworks. Data integrity is again a focal point for companies making strategic moves in an evolving economy.
Key takeaways:
· How to build a data-driven culture within your organization
· Tips to engage with key stakeholders in your business and examples from other businesses around the world
· How to establish and maintain a business-first approach to data governance
· A summary of the findings from a recent survey of global data executives by Drexel University's LeBow College of Business
AI You Can Trust - Ensuring Success with Data Integrity WebinarPrecisely
Artificial Intelligence (AI) has become a strategic imperative in a rapidly evolving business landscape. However, the rush to embrace AI comes with risks, as illustrated by instances of AI-generated content with fake citations and potentially dangerous recommendations. The critical factor underpinning trustworthy AI is data integrity, ensuring data is accurate, consistent, and full of rich context.
Attend our upcoming webinar, "AI You Can Trust: Ensuring Success with Data Integrity," as we explore organizational challenges in maintaining data integrity for AI applications and real-world use cases showcasing the transformative impact of high-integrity data on AI success.
During this panel discussion, we'll highlight everything from personalized recommendations and AI-powered workflows to machine learning applications and innovative AI assistants.
Key Topics:
AI Use Cases with Data Integrity: Discover how data integrity shapes the success of AI applications through six compelling use cases.
Solving AI Challenges: Uncover practical solutions to common AI challenges such as bias, unreliable results, lack of contextual relevance, and inadequate data security.
Three Considerations of Data Integrity for AI: Learn the essential pillars—complete, trusted, and contextual—that underpin data integrity for AI success.
Precisely and AWS Partnership: Explore how the collaboration between Precisely and Amazon Web Services (AWS) addresses these challenges and empowers organizations to achieve AI-ready data.
Join our panelists to unlock the full potential of AI by starting your data integrity journey today. Trust in AI begins with trusted data – let's future-proof your AI together.
Less Bias. More Accurate. Relevant Outcomes.
Optimize the Finance Function by Automating Your SAP ProcessesPrecisely
The finance function is at the heart of business success, and it must evolve to meet today's challenges: moving faster, processing more information, and ensuring flawless data quality.
Join us to discover how to meet these challenges, including the following points:
• Manage accounting and financial master data: G/L accounts, customers, vendors, cost centers, profit centers…
• Accelerate period-end closes: post the necessary accounting entries, run the appropriate reports, and extract information in real time
• Organize tasks by assigning them in an orchestrated way to their owners or launching them automatically, and track them at a granular level
Our webinar will be an opportunity to discuss and illustrate this range of capabilities, available to business users with little or no code. We hope to see many of you there.
In this presentation, we discuss which tools, in our view, help you shape the transformation to SAP S/4HANA optimally. But we also look ahead!
Our contribution focuses not only on short-term solutions; it is also about sustainability and about investments for the future.
That includes developments that will change the SAP world for the long term.
We look at future technologies, such as AI and machine learning, that help optimize data-intensive SAP processes, improve data quality, reduce manual processes, and relieve employees.
Join us in taking a look into the future and help shape the digital transformation in your company.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
Transcript: Selling digital books in 2024: Insights from industry leaders - T...BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
UiPath Test Automation using UiPath Test Suite series, part 3DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation introduction
UI automation sample
Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, combined with traditionally slow and manual security checks, has caused gaps in continuous security, an important piece of the software supply chain. Today, organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Securing your Kubernetes cluster_ a step-by-step guide to success !KatiaHIMEUR1
Today, after several years of existence, with an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been easier to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
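As one small example of the kind of check such a hardening pass performs, here is a hypothetical sketch (names and rules are illustrative; real-world audits should rely on Pod Security Admission or policy engines such as Kyverno or OPA Gatekeeper) that flags common misconfigurations in a pod spec:

```python
def audit_pod_spec(pod: dict) -> list:
    """Flag a few common pod-security misconfigurations.

    Checks each container for privileged mode, missing
    runAsNonRoot, and absent resource limits.
    """
    findings = []
    for c in pod.get("spec", {}).get("containers", []):
        sc = c.get("securityContext", {})
        if sc.get("privileged"):
            findings.append(c["name"] + ": runs privileged")
        if sc.get("runAsNonRoot") is not True:
            findings.append(c["name"] + ": may run as root")
        if "limits" not in c.get("resources", {}):
            findings.append(c["name"] + ": no resource limits")
    return findings

pod = {"spec": {"containers": [{
    "name": "web",
    "securityContext": {"privileged": True},
    "resources": {},
}]}}
for finding in audit_pod_spec(pod):
    print(finding)
```

The same checks can be enforced at admission time so that a non-compliant pod never reaches the cluster in the first place.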
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Smart TV Buyer Insights Survey 2024 by 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
JMeter webinar - integration with InfluxDB and GrafanaRTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
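For reference, JMeter's Backend Listener ships metrics to InfluxDB in InfluxDB's line protocol; the sketch below builds such a point by hand (the measurement, tag, and field names here are illustrative, not the listener's actual schema):

```python
def to_line_protocol(sample: dict) -> str:
    """Render a load-test sample as an InfluxDB line-protocol point:
    measurement,tag=value field=value timestamp."""
    # Spaces in tag values must be backslash-escaped in line protocol.
    label = sample["label"].replace(" ", "\\ ")
    tags = "label=" + label + ",status=" + sample["status"]
    # The trailing 'i' marks an integer field value.
    fields = "elapsed_ms=" + str(sample["elapsed_ms"]) + "i"
    return "requests," + tags + " " + fields + " " + str(sample["timestamp_ns"])

point = to_line_protocol({
    "label": "home page",
    "status": "ok",
    "elapsed_ms": 123,
    "timestamp_ns": 1700000000000000000,
})
print(point)
```

Grafana then queries points like these from InfluxDB to chart response times and error rates in real time.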
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
Key Trends Shaping the Future of Infrastructure.pdfCheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud, and open source: exploring how these areas are likely to mature and develop over the short and long term, and considering how organisations can position themselves to adapt and thrive.
Optimize the Value of Your Mainframe
1. Optimize the Value
of Your Mainframe
Bill Hammond | Director, Product Marketing
Denise Tabor | Product Management Director
2. Today’s Conversation
• Precisely offerings
• IBM Z Platform optimization
• IBM Z Platform data integration
• Precisely’s IBM Z Platform projects
• How we work with IBM
• Encryption
• Sort Performance
3. The leader in data integrity
Our software, data enrichment products and strategic services deliver accuracy, consistency, and context in your data, powering confident decisions.
• 99 of the Fortune 100
• 100 countries
• 2,500 employees
• 12,000 customers
Brands you trust, trust us
Data leaders partner with us
4. Precisely offers broad capabilities across five areas: Integrate, Verify, Locate, Enrich and Engage
• Integrate: Change Data Capture, ETL, Machine Data Integration, Process Automation
• Verify: Data Governance, Data Catalog, Data Quality, Master Data Management, Self-Service Analytics
• Locate: Spatial Analysis, Geocoding, Routing, Visualization
• Enrich: Geographic Data, Business Data, Industry-Specific Data
• Engage: Integrated Comms, Personalized Video, Chatbots, Responsive Messaging, Digital Self Service
5. Mainframes host the most critical applications
• 71% of the Fortune 500
• 2.5 billion transactions per day per mainframe
• 92 of the world's top banks
• 10 of the world's top insurers
• 23 of the top 25 US retailers
6. Mainframes deliver increasing value with each new technology wave
• 92% of executives say their customer-facing applications are completely or very reliant on mainframe processing
• 72% see the mainframe as a platform for long-term growth and new workloads
• $1.65 trillion invested by enterprise IT to support data warehouse & analytics workloads over the past decade
Sources: Forrester Consulting, 2019; Wikibon, “10-Year Worldwide Enterprise IT Spending 2008-2017”; BMC Mainframe Survey, 2021
7. Precisely Mainframe Products
• Syncsort MFX: Cutting-edge sort, copy and join, while offloading workloads to IBM zIIP engines to slash CPU costs
• Syncsort Optimize IMS: Low-risk, efficient migration tools for businesses switching from IMS to Db2
• Syncsort Capacity Management: Ensure high-performing IT resources are available when and where you need them
• Syncsort Network Management: Network monitoring, security and performance management to increase your network security and efficiency
• Syncsort Optimize IDMS and Syncsort Optimize DB2: Improve IBM and CA database performance
• Ironstream: Integrate mainframe and IBM i systems into leading IT analytics and operations platforms for an enterprise-wide view
• Connect: Integrate data seamlessly from legacy systems into next-gen cloud and data platforms with one solution
• ACR: Analyze data across your mainframe and host-based environments for accurate, consistent and reliable data
10. Syncsort MFX: the high-performance solution that delivers better performance and saves money
• Proven solution: Almost 50 years of continual development and enhancements
• Performance: Improves sort performance while optimizing overall system efficiency
• zIIP offload: Sort workloads can be directed to the zIIP, thereby lowering CPU time and costs
• Encryption: Enhanced security and compliance with regulations such as GDPR
11. History of Product Innovation
Syncsort MFX has a commitment to innovation with over 50 years of consistent product leadership.
Unique Capabilities of Syncsort MFX:
• Support Encryption of Basic/Large datasets*
• Sort Performance for large datasets*
• Encryption of Sortwork
• zIIP Offload of Sort
• zHPF Exploitation
• Create & E-Mail PDF, RTF, HTML Files
• zIIP Exploitation
• MIDAW Exploitation
• PAV Exploitation
• Direct DB2 Access
• PARASORT
• XSUM
• PipeSort
• Global DSM
• SAS PROC SORT Replacement
• Incremental DYNALLOC
• MAXSORT
First to Deliver (Syncsort MFX):
• JOIN Processing, 2004
• Horizontal Arithmetic on OUTREC, 1999
• Data Space Exploitation, 1989
• Variable/Fixed Conversion, 1989
• Multiple Output, 1983
• Report Writing, 1983
• IEBGENER Replacement, 1979
• INREC, 1979
• INCLUDE/OMIT, SUM, OUTREC, 1977
• COBOL Exits, 1976
• EXCPVR, 1975
12. IMS to Db2 Migration Challenges
• Minimize risk
• Lack of skills/resources
• Time to value
13. Syncsort Optimize IMS: the smart approach to IMS data migration, providing low-risk and efficient migration tools for businesses wanting to eliminate IMS DB and consolidate to Db2
• Minimize risk: Time and effort to complete the project is minimal compared with an application rewrite
• Information access: Make data securely available across the entire enterprise; consolidate to a single RDBMS, Db2
• Transparent migration: Syncsort Optimize IMS is a transparent data migration solution; no application changes required
• Proven solution: Has proven itself around the world in a wide variety of vertical industries
15. Syncsort Capacity Management: brings metrics from across your enterprise into one place for a comprehensive, real-time view of your infrastructure
• Single solution with 360° view: Covers multiple platforms and everything from servers to storage to networks for visibility into all your performance and capacity metrics
• Replaces manual processes: Interactive automated reporting and flexible templates let you quickly hone in on potential problems, without spreadsheets and manual processes
• Prediction and alerting: See the impact of potential changes and set up monitors to identify when and where thresholds are exceeded
• Flexible deployment options: Deploy on-premises or in the cloud. Management of the solution in the cloud is available through Precisely Services
17. Data silos impact availability, performance, services and security
• Lack of data for operational analytics
• No single view of the entire IT infrastructure
• Status and security of systems is unknown
18. Ironstream: integrate mainframe and IBM i systems into leading IT analytics and operations platforms for an enterprise-wide view to support your digital business
• Complete visibility: Get a comprehensive, real-time view of your IT landscape for better, faster decision-making
• Improved agility: Proactively identify, and quickly resolve, security threats and performance problems
• Service availability: Avoid downtime, meet demanding SLAs, and deliver great customer and employee experiences
• Cost savings: Eliminate manual processes and inefficient tools, and optimize your infrastructure spend
19. Strategic data projects without legacy data sources = missed opportunity
Confidential: Prepared for Precisely Customers and Prospects
When you leave mainframe and IBM i out of your strategic projects such as analytics, AI and machine learning:
1. The value of big data investments is diminished
2. Large, rich datasets never even get analyzed
3. Analytics are inaccurate or incomplete
20. Connect: integrate data seamlessly from legacy systems into next-gen cloud and data platforms with one solution
• Simplify integration: Take a one-solution approach to integrate, prepare, load, cleanse, transform and stream data across clustered or cloud frameworks
• Data access: Integrate all enterprise data, from mainframe and IBM i to cloud, while keeping integrations secure
• Real-time replication: Stream changes to data instantly for use in downstream applications, data lakes and warehouses
• Future-proof: Quickly and easily add new sources and targets. Deploy in new environments with no redevelopment required
25. Encryption
Customer value: Provides data security at the data set level (file system), minimizing administration costs
Challenge: Consumes additional processor cycles to decrypt/encrypt, adding costs
Solution: Syncsort MFX and encryption on zIIP
26. Integrated Sort Accelerator
Customer value: Reduces CPU usage and improves elapsed time by speeding up sorting
Challenge: Limited ability to take advantage of the chip
Solution: Syncsort MFX and offloading to zIIP
27. Future Projects
• Additional optimization improvements
• Leverage new technologies
• Encryption enhancements
• Compression enhancements
Precisely's IBM Z Platform product portfolio
Why does that matter?
Well….it turns out that mainframes are still the backbone for the biggest organizations in the world
71% of the Fortune 500 rely on the mainframe for their mission-critical transactional systems
They span every vertical from FinServ to Insurance to Retail
When talking to these organizations, it’s not unusual to hear that up to 80% of their corporate data originates on the mainframe. That’s Big Data.
And organizations cannot afford to neglect it.
Most large enterprises have made major investments in data environments over a period of many years - legacy data can provide a treasure-trove of information that can transform your business when leveraged via a streaming paradigm
These environments contain the data that these business run on and that today power the strategic initiatives driving the business forward – machine learning, AI and predictive analytics
Legacy platforms (mainframe and IBM i) continue to adapt with each new wave of technology and are not going away anytime soon
Integrating legacy data into your projects brings several advantages such as:
Connect applications together, leveraging the existing transactional capabilities of the current application platform, and the wealth of new capabilities of the cloud
Feed analytics with up-to-date information so your business runs on current insight
Port workloads to less-expensive, strategic platforms
Eliminate Risk
Developed to provide rapid, efficient migration from IMS to Db2
Applications remain unchanged
Set of powerful mapping and migration tools
Stop worrying about retaining IMS skills and third-party IMS tools
Increased Access to Information
Data can be locked up in IMS
Db2 is a more accessible solution
Reduced software expenses by eliminating IMS
Transparent Migration
No need to rewrite software
Applications continue to issue requests as before
Proven Solution
Many referenceable customer success stories
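The hierarchical-to-relational mapping such a migration performs can be illustrated in miniature. This is an illustrative sketch only, with hypothetical segment and table names — it stands in for the kind of mapping the migration tooling automates, not for Precisely's actual implementation:

```python
# Hypothetical sketch: flattening an IMS-style hierarchical record
# (a parent ORDER segment with child line-item segments) into the
# relational rows a Db2 target would store. Names are illustrative.

def flatten(order):
    """Split one hierarchical record into a parent row and child rows."""
    parent_row = {"order_id": order["order_id"], "customer": order["customer"]}
    child_rows = [
        # each child segment becomes a row keyed back to the parent
        {"order_id": order["order_id"], "line_no": i + 1, "item": item}
        for i, item in enumerate(order["items"])
    ]
    return parent_row, child_rows

order = {"order_id": 42, "customer": "ACME", "items": ["bolt", "nut"]}
parent, children = flatten(order)
# parent   -> one row for an ORDER table
# children -> rows for an ORDER_LINE table, joined on order_id
```

Because the tool handles this mapping and the applications keep issuing the same requests, the relational structure is introduced without rewriting software.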
Q: [Bill] Thanks for joining us today, Denise. Why don’t we start off with you telling us a little about your role here at Precisely?
A: [Denise] Direct from the university, I started my career at a state agency, learning MVS systems programming skills with a small but highly intelligent group of people. They mentored me for several years, and I learned the basics of maintaining several systems, but then focused on Db2. From there, I became a production and application Db2 DBA and then transitioned to a vendor company. I am fortunate enough to have held positions in technical support and sales engineering, where I really enjoyed being a customer advocate and internal liaison across the different departments. This was a natural segue into my current experience in Product Management, where I’ve been for several years. I joined Precisely in early 2020, and I’m currently responsible for the mainframe and capacity management products. I love working with customers, listening to their challenges and formulating solutions that meet their needs. In order to do that, of course, I need to work closely with sales, support, engineering, marketing, and of course, our IBM partnership feeds into that collaboration as well.
Q: [Bill] You mentioned you need to work closely with IBM. Tell us a little about the interactions you have with IBM people or departments.
A: [Denise] We have a very close relationship with IBM – the Db2 Utility team, the z/OS hardware architects, and the DFSMS engineering team know us very well. We have been working with the Db2 Utility team for almost 10 years to develop the IBM Db2 SORT product together. About 5 years ago, we started work with the z/OS Poughkeepsie team and hardware architects on research for the z15’s integrated accelerator for Z Sort. And finally, about 2 years ago, we started work with the DFSMS team to implement the EXCP encryption API for basic and large format datasets. This collaboration allows us to provide a performant solution for sorting with encryption (more on that later). We will continue to work with IBM in the future to ensure that we take advantage of any opportunities provided by their engineers.
Q: [Bill] Thanks Denise… obviously we value our close working relationship with IBM very highly. I would like to move us into the meat of our discussion today. Here at Precisely, we have a number of products designed to help customers get the most from their IBM Z platform. Across our portfolio, what projects are you working on that will help customers as they continue to optimize their IBM Z Platform environments?
A: [Denise] Over the past few months, we’ve delivered a couple of major enhancements to assist our customers who want to process encrypted datasets and take advantage of some new hardware on the z15 machine.
Q: [Bill] So for the project to improve sorting for encrypted data sets with pervasive encryption, could you explain in detail why our customers need that improvement, what solution has been implemented in our products, and any performance benchmark information you can share here?
A: [Denise] Let me first explain our need to support the encryption initiatives of our customers.
Regulatory requirements such as GDPR and CCPA are designed to protect data and safeguard privacy. Loss of data or compromised information can come with high penalties, and new innovations to assist with this data protection have been implemented to fortify security.
The IBM Z Pervasive Encryption solution is a method to enable extensive encryption of data in-flight and at-rest to meet these protection standards. This solution is enabled by administrative policy controls and is designed to be application transparent, without requiring application changes. Data set encryption provides data security at the data set level using DFSMS access methods. This system-wide solution is more cost effective than traditional software encryption solutions. And DFSMS access methods provide data set encryption for sequential, basic, large, and extended format datasets.
The challenge to implementing data encryption is that it consumes processor cycles, so it doesn’t come without penalty. And although mainframe customers love the pervasive encryption approach, no one likes the additional resource consumption. For Syncsort MFX users, if the input or output data set is encrypted, BSAM must be used instead of our high-performance low-level I/O access methods, and there is an extra cost from that perspective.
Last year, IBM’s DFSMS team developed an EXCP encryption API solution called IGGENC to help vendors like us use our low-level I/O for encrypted data sets and improve encryption performance. DFSMS splits the encryption/decryption processing from the I/O processing, so we will use our low-level I/O methods to read and write the data sets and use IGGENC to encrypt and decrypt data on the zIIP processors. This will significantly reduce the CPU cost and elapsed time.
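The division of labor Denise describes — fast sequential I/O on the main path, cipher work handed off to zIIP engines — can be sketched in miniature. This is a toy model, not the IGGENC API: the thread pool stands in for zIIP offload, and XOR stands in for the real cipher.

```python
# Illustrative sketch only: the read path yields raw blocks while
# decryption is delegated to a worker pool, mimicking how splitting
# encryption/decryption from I/O lets a sort keep its low-level I/O
# and push cipher work elsewhere. XOR is a stand-in for AES.
from concurrent.futures import ThreadPoolExecutor

KEY = 0x5A  # toy single-byte key

def read_blocks(data, block_size=4):
    """Low-level I/O stand-in: yield fixed-size raw blocks in order."""
    for i in range(0, len(data), block_size):
        yield data[i:i + block_size]

def decrypt(block):
    """Cipher stand-in -- the part handed to the worker pool."""
    return bytes(b ^ KEY for b in block)

def read_encrypted(data):
    with ThreadPoolExecutor(max_workers=4) as pool:
        # pool.map preserves block order, so the record stream
        # reassembles correctly even though decryption is offloaded
        return b"".join(pool.map(decrypt, read_blocks(data)))

ciphertext = bytes(b ^ KEY for b in b"SORT INPUT RECORDS")
assert read_encrypted(ciphertext) == b"SORT INPUT RECORDS"
```

The point of the pattern is that the main path spends its cycles on I/O, while decryption cost lands on the offload workers — in the real solution, on zIIP processors that don't count against general-purpose CPU.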
We’ve seen some great performance improvements from our preliminary benchmark testing. For Syncsort MFX, we have seen up to 45% CPU and 40% elapsed savings, and for ZPSaver we have seen up to 80% CPU and 40% elapsed time savings when processing encrypted, basic and large format datasets.
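For readers who want to translate those percentages back into raw numbers, the savings arithmetic behind claims like "up to 45% CPU savings" is simply the relative drop from the baseline:

```python
# Percent-savings arithmetic: (before - after) / before * 100,
# rounded to one decimal place for reporting.

def pct_savings(before, after):
    """Percent saved when a metric drops from `before` to `after`."""
    return round((before - after) / before * 100, 1)

# e.g. a job whose CPU time drops from 100s to 55s saved 45%,
# and elapsed time dropping from 50s to 30s is a 40% saving
assert pct_savings(100.0, 55.0) == 45.0
assert pct_savings(50.0, 30.0) == 40.0
```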
Q: [Bill] You also mentioned you are working on a project to improve sorting performance using the integrated accelerator for Z SORT. What is the integrated accelerator for Z SORT? What performance improvements have you seen by leveraging Z Sort technology?
A: [Denise] The IBM Integrated Accelerator for Z Sort is a new coprocessor designed for the z15. Z Sort can help reduce CPU usage and improve elapsed time for eligible workloads by speeding up sorting, shortening batch windows, and improving select database functions. The challenge to taking advantage of this new hardware is that customers need our utilities, Syncsort MFX and its zIIP-offload capabilities, to optimize sorting workload and achieve performance gains. As I mentioned before, we worked closely with the IBM z/OS Poughkeepsie team and hardware architects for the sort accelerator. We have developed new algorithms in Syncsort MFX to take advantage of the coprocessor, and with the new algorithms, our customers can see dramatic improvements to batch sort job performance.
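The phrase "eligible workloads" implies a dispatch decision: route a sort to the accelerated path when it qualifies, otherwise fall back to the software path. A minimal sketch of that pattern, with a wholly hypothetical eligibility rule and stand-in sort routines (this is not Precisely's actual algorithm):

```python
# Hypothetical dispatch sketch: eligible sort workloads take the
# hardware-accelerated path; everything else falls back to software.
# The eligibility rule and both sort routines are illustrative only.

ACCELERATOR_AVAILABLE = True    # in reality, a hardware-feature probe
MAX_ACCEL_RECORD_LEN = 256      # assumed eligibility limit, made up here

def eligible(records):
    """Decide whether this workload can use the accelerator."""
    return ACCELERATOR_AVAILABLE and all(
        len(r) <= MAX_ACCEL_RECORD_LEN for r in records
    )

def hardware_sort(records):
    return sorted(records)      # stand-in for the coprocessor path

def software_sort(records):
    return sorted(records)      # conventional fallback path

def z_sort(records):
    return hardware_sort(records) if eligible(records) else software_sort(records)

assert z_sort([b"zeta", b"alpha", b"mu"]) == [b"alpha", b"mu", b"zeta"]
```

Either path yields the same sorted output; the value of the accelerated path is in CPU and elapsed-time cost, not in the result.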
In the Precisely lab, we conducted benchmarks in several modes to compare performance metrics with and without the sort accelerator. We have seen average savings of 26% CPU and 19% elapsed time using the accelerator; however, ZPSaver and its zIIP offload capabilities still offer superior performance advantages.
We also collaborated with the Db2 Utilities team to develop Z Sort enablement in Db2 Sort – used in the LOAD, REBUILD INDEX and REORG utilities. The support was delivered at the end of 2020, and Db2 Sort customers can see both CPU and elapsed time benefits up to 37% better than using DFSORT (which only supports the Db2 REORG utility).
Q: [Bill] Those sound like really impactful enhancements for our IBM Z Platform customers. Precisely has a long history… over 50 years… of software solutions for mainframe customers. As you look at continuing that long support, what are some of the areas of innovation you are looking forward to?
A: [Denise] We are always assessing performance and efficiency improvements to make to our products, so we will be focusing on these in the future. In addition to that, as new technologies come from IBM, we will examine them to see if we can exploit them for further performance boosts. We’re currently working with IBM to address similar challenges this year and will continue to collaborate with them to help our mutual customers.
By the way, Bill, we have spent the majority of our time discussing MFX, but I will remind our audience that we have performance and efficiency solutions that address Db2, IMS, IDMS, and network challenges for the z/OS platform. Ironstream and Connect will solve problems for our mainframe clients as well.