This document discusses how Microsoft SQL Server 2012 and Intel Xeon E7 series processors provide a powerful platform for mission-critical databases. Key capabilities include built-in support for enterprise analytics, big data integration, and hybrid cloud extensibility. When combined, SQL Server 2012 and Intel Xeon E7 processors deliver high performance, scalability, reliability and integrated analytics capabilities for transactional databases and data warehouses.
Harness Enterprise Data with SQL Server 2008 R2 and New Intel Xeon Processors (ReadWriteEnterprise)
The document discusses how SQL Server 2008 R2 and new Intel Xeon processors can help enterprises harness large volumes of data. It describes innovations in SQL Server 2008 R2 like self-service BI and complex event processing. It also outlines developments in the Intel Xeon 7500 and 5600 series processors that improve scalability, performance, virtualization, and reliability. The combination of these software and hardware advances provides massive processing power, flexibility, and lower costs to help organizations address the challenges of rapidly growing and complex enterprise data.
This document provides an overview of the new features in Microsoft SQL Server 2008, including enhancements that make it more trusted, productive, and intelligent. Key updates include improved security features like transparent data encryption, enhanced high availability options like automatic page repair for database mirroring, and new management capabilities like the policy-based framework to simplify administration.
This document provides an overview of an IBM reference architecture for implementing high availability and disaster recovery for Microsoft SQL Server 2012 using AlwaysOn Availability Groups. The solution leverages IBM Flex System x240 compute nodes, Flex System V7000 storage, and Storwize V7000 and V3700 storage. It allows for automatic failover between two nodes in the main data center and manual failover to a node in a remote disaster recovery site, with the possibility of some data loss during asynchronous replication. The document describes the requirements, hardware components, architectural design, and deployment considerations for the solution.
This white paper provides an overview of EMC's data protection solutions for the data lake - an active repository to manage varied and complex Big Data workloads
5 Reasons To Choose Informatica PowerCenter As Your ETL Tool (Edureka!)
The document discusses Informatica PowerCenter as an ETL tool. It outlines 5 reasons to choose Informatica PowerCenter including its universal data access, ability to handle mission-critical enterprise data integration, cost effective scalability, ability to meet every data integration need, and enable collaboration between global IT teams. It then goes into details on the PowerCenter architecture, high availability features, partitioning, parallel processing, and recovery capabilities to support these reasons.
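The partitioning and parallel-processing idea behind PowerCenter's scalability can be sketched in plain Python. This is an illustrative toy pipeline using the standard library, not PowerCenter's actual API; the row shape and transformation are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

def partition(rows, n_partitions):
    """Round-robin rows into n partitions, as a partitioned ETL session might."""
    parts = [[] for _ in range(n_partitions)]
    for i, row in enumerate(rows):
        parts[i % n_partitions].append(row)
    return parts

def transform(part):
    """Toy transformation applied independently to one partition."""
    return [{**r, "name": r["name"].upper()} for r in part]

def run_parallel(rows, n_partitions=4):
    """Partition, transform each partition concurrently, then merge."""
    parts = partition(rows, n_partitions)
    with ThreadPoolExecutor(max_workers=n_partitions) as pool:
        results = pool.map(transform, parts)
    return [row for part in results for row in part]

rows = [{"id": i, "name": f"customer{i}"} for i in range(10)]
out = run_parallel(rows)
print(len(out))  # all 10 rows survive the partition/transform/merge
```

The point is that each partition is processed independently, so throughput scales with the number of workers until a shared resource saturates.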
The IBM BladeCenter Foundation for Cloud white paper provides an overview of the platform and its advantages for enterprises. It discusses how the solution combines servers, storage, networking, and software into an optimized unified architecture. The paper highlights how the platform delivers outstanding performance through its converged networking and scalable architecture. It also emphasizes how the solution provides reliability through redundancy and quality support from IBM.
Informatica provides the market's leading data integration platform. Tested on nearly 500,000 combinations of platforms and applications, the platform interoperates with the broadest possible range of disparate standards, systems, and applications. This unbiased, universal view makes Informatica unique in today's data integration market and the ideal strategic platform for companies looking to solve data integration issues of any size.
SQL Server 2008 R2 introduces two new premium editions, Datacenter and Parallel Data Warehouse, to meet the needs of large scale datacenters and data warehouses. It also offers expanded functionality across Enterprise and Standard editions. Datacenter is designed for high performance with support for over 8 processors, 256 logical processors, and memory limits up to the OS maximum. Parallel Data Warehouse provides a massively parallel processing architecture for scaling from tens of terabytes to petabytes of data.
White Paper: EMC Greenplum Data Computing Appliance Enhances EMC IT's Global ... (EMC)
This White Paper illustrates a synergistic model for deploying EMC's Greenplum Data Computing Appliance (DCA) with EMC IT's incumbent Global Data Warehouse infrastructure.
How Windows Server 2008 R2 Helps Optimize IT and Save You Money (upendhra90)
This document discusses how Windows Server 2008 R2 can help optimize IT and save costs. It focuses on three key areas:
1) Reducing costs through streamlined management, server consolidation, lower power consumption, and reduced WAN bandwidth.
2) Improving IT service levels with increased reliability features in areas like Failover Clustering and Active Directory.
3) Enabling new business scenarios by providing resources and tools for IT administrators to quickly respond to changing business needs.
Early customers report significant cost savings through higher server consolidation rates, lower IT staff hours spent on tasks, and energy savings of up to 18% compared to prior versions of Windows Server.
IBM Power Systems servers—the smart choice for JD Edwards EnterpriseOne deployments:
- Delivering new services faster via joint IBM and Oracle testing, planning, and support
- Offering secure, reliable performance with on-demand expansion, ideal for today’s dynamic JD Edwards EnterpriseOne environments
- Leveraging industry-leading PowerVM virtualization to economically consolidate workloads
- Boosting performance with POWER7+ processors
- Delivering outstanding availability and higher quality compute services with PowerHA and AIX
Nationwide, a large UK financial institution, implemented a new data warehouse solution using Microsoft SQL Server 2005 to help comply with the Basel II regulation, which requires maintaining extensive historical records. The solution includes a Historical Data Store using SQL Server 2005 to store vast amounts of historical data from 80 source systems. Partitioning and the scalability of SQL Server 2005 allow the data warehouse to efficiently store and analyze the large volumes of data required by Basel II. Visual Studio 2005 was also used to aid development of the solution. The new system gives Nationwide the ability to rapidly access business information and create reports as required by regulators to demonstrate Basel II compliance.
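The range-partitioning idea that lets a historical data store like Nationwide's scale can be sketched in Python: a partition function maps each row's date to the partition holding it. The monthly boundaries below are purely illustrative, not Nationwide's actual scheme, and this mimics the concept rather than SQL Server's partition-function syntax.

```python
import datetime as dt

# Hypothetical monthly range partitions for one year of historical data.
BOUNDARIES = [dt.date(2005, m, 1) for m in range(1, 13)]

def partition_for(record_date):
    """Return the index of the range partition holding record_date:
    the rightmost boundary <= record_date, mirroring how a partition
    function maps rows to filegroups."""
    idx = 0
    for i, boundary in enumerate(BOUNDARIES):
        if record_date >= boundary:
            idx = i
    return idx

print(partition_for(dt.date(2005, 3, 15)))  # falls in the March partition (index 2)
```

Because each partition can be loaded, indexed, and archived independently, queries over a narrow date range touch only a fraction of the stored history.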
Oracle Database 12c is available in various editions and includes many optional features. It provides tools for management, security, analytics, and more. Additional products related to Oracle Database are also discussed.
SQL Server Integration Services with Oracle Database 10g (Leidy Alexandra)
This document provides instructions for using SQL Server Integration Services (SSIS) to extract data from an Oracle Database 10g source and load it into a SQL Server destination. It begins with installing the Oracle 10g client software and testing the connection. Then it describes how to create an SSIS package with tasks to extract data from Oracle and load it into SQL Server tables. Additional transformations like data conversion and derived columns are also demonstrated. The goal is to provide an easy way for customers to interface SQL Server with an Oracle data source for extraction, transformation and loading of data.
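The extract, data-conversion, derived-column, and load steps that the SSIS package performs can be approximated in a few lines of Python. Here in-memory sqlite3 databases stand in for the Oracle source and SQL Server destination, and the table and column names are illustrative.

```python
import sqlite3

# Stand-in "Oracle" source with amounts stored as text.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, "10.50"), (2, "4.25")])

# Stand-in "SQL Server" destination with a derived column.
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE orders (id INTEGER, amount REAL, amount_with_tax REAL)")

# Extract each row, apply a data conversion (TEXT -> REAL) and a
# derived column (amount * 1.2), then load into the destination.
for oid, amount in src.execute("SELECT id, amount FROM orders"):
    amt = float(amount)  # data conversion transformation
    dst.execute("INSERT INTO orders VALUES (?, ?, ?)", (oid, amt, round(amt * 1.2, 2)))

rows = list(dst.execute("SELECT * FROM orders"))
print(rows)
```

In SSIS the same flow is expressed declaratively as a source adapter, Data Conversion and Derived Column transformations, and a destination adapter.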
Microsoft SQL Server 2008 R2 - Enterprise for Mid Market Organizations Datasheet (Microsoft Private Cloud)
SQL Server 2008 R2 Enterprise helps organizations reduce costs, improve productivity and optimize business processes through high performance, scalability, availability and advanced security features. It allows for server consolidation, virtualization, data compression and centralized management to lower hardware and storage costs. The platform also provides powerful business intelligence and data warehousing capabilities to help organizations gain insights from data and make better business decisions.
This document discusses Project REAL, a joint effort between Microsoft and its partners to discover best practices for implementing business intelligence applications using SQL Server 2005. It used real customer data and scenarios to address issues in data warehouse design, ETL processes, reporting, sizing systems, and ongoing management. The project tested different storage configurations and hardware to determine optimal solutions. Topics covered include data partitioning, cube partitioning, data migration strategies, backup methodologies, and the storage hardware used.
The document discusses Dell's PowerEdge server portfolio and solutions for enterprise applications. It provides an overview of next-generation PowerEdge technologies including more processing power from Intel Xeon processors, high-capacity low-power memory, scalable efficient local storage, simplified intelligent management, and energy efficiency innovations. It also describes PowerEdge platforms for traditional and converged infrastructure and Dell's comprehensive enterprise solutions.
White Paper: Rethink Storage: Transform the Data Center with EMC ViPR Softwar... (EMC)
This white paper discusses the software-defined data center (SDDC) and the challenge that heterogeneous storage silos pose to making the SDDC a reality. It introduces EMC ViPR software-defined storage, which enables enterprise IT departments and service providers to transform physical storage arrays into a simple, extensible, open virtual storage platform.
Rethink Storage: Transform the Data Center with EMC ViPR Software-Defined Sto... (EMC)
This white paper discusses the evolution of the Software-Defined Data Center and the challenges of heterogeneous storage silos in making the SDDC a reality.
Microsoft SQL Server 2005 (64-bit) offers significantly improved memory availability and parallel processing performance compared to the 32-bit version. It is optimized for 64-bit processors like Itanium 2, AMD Opteron, and Intel Xeon. The 64-bit architecture allows for a larger addressable memory space, enhanced parallelism up to 64 processors, and faster data movement. These advantages enable SQL Server 2005 (64-bit) to handle more complex workloads, consolidate databases, and scale processing demands better than the 32-bit version. The document discusses considerations for choosing the 64-bit version and deployment best practices.
This document provides a summary of the Gartner Cool Vendors in In-Memory Computing Technologies report from 2014. It identifies four vendors as cool vendors: Diablo Technologies, GridGain, MemSQL, and Relex. For each vendor, it provides a brief overview of the company and technology, as well as challenges they may face. It recommends IT leaders consider these vendors' in-memory computing solutions for opportunities like hybrid transaction/analytical processing, big data analytics, and supply chain planning. The report evaluates these vendors' innovations in in-memory technologies and how they can help organizations leverage digital business opportunities through improved agility and fast data processing.
This document discusses Intel's Intelligent Power Node Manager solution for managing power usage in data centers. It notes that power and cooling costs are a major challenge as data needs increase exponentially. The Power Node Manager allows data center managers to set a power budget or "cap" for each server to better utilize existing power capacity and increase compute density. It provides real-time power monitoring and enables power capping to maintain performance within the set limit or adjust if the limit is exceeded. This allows data centers to squeeze more performance from their existing power infrastructure.
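The power-capping behavior described above can be sketched as a simple control rule: when measured draw exceeds the budget, throttle performance proportionally; otherwise run flat out. The numbers and the proportional rule are illustrative, not Node Manager's actual control algorithm.

```python
def apply_power_cap(readings_watts, cap_watts):
    """Toy power-capping loop over per-server power readings.
    Returns a performance state per server: 1.0 = full speed,
    below 1.0 = throttled to stay within the cap."""
    states = []
    for watts in readings_watts:
        if watts > cap_watts:
            states.append(round(cap_watts / watts, 2))  # throttle below the cap
        else:
            states.append(1.0)                          # within budget: full speed
    return states

states = apply_power_cap([180, 250, 300], cap_watts=250)
print(states)  # [1.0, 1.0, 0.83]
```

Capping lets a rack be provisioned for realistic aggregate draw rather than the sum of nameplate maxima, which is how existing power infrastructure yields more compute density.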
Microsoft SQL Server 2008 R2 - Upgrading to SQL Server 2008 R2 Whitepaper (Microsoft Private Cloud)
More than ever, organizations rely on data storage and analysis for business operations. Companies need the ability to deploy data-driven solutions quickly. Microsoft SQL Server 2008 R2 data management software provides a trusted, productive, and intelligent data platform that makes it possible for you to run your most demanding mission-critical applications, reduce time and cost of application deployment and maintenance, and deliver actionable insights to your entire organization.
SQL Server 2014 Platform for Hybrid Cloud Technical Decision Maker White Paper (David J Rosenthal)
The document discusses options for running SQL Server in hybrid cloud environments, including both public and private clouds. In a public cloud, SQL Server can run in either Windows Azure Virtual Machines, which provides full feature parity with on-premises SQL Server, or Windows Azure SQL Database, which offers scalability to millions of users but less control over the operating system. A hybrid approach allows organizations to deploy applications across on-premises and cloud environments to realize the benefits of each.
Data virtualization allows applications to access and manipulate data without knowledge of physical data structures or locations. Teiid is a data virtualization system composed of tools, components, and services for creating and executing bidirectional data services across distributed, heterogeneous data sources in real time without moving data. Teiid includes a query engine, embedded driver, server, connectors, and tools for creating virtual databases (VDBs) containing models that define data structures and views. Models represent data sources or abstractions and must be validated and configured with translators and resource adapters to access physical data when a VDB is deployed.
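The core idea of a virtual view over heterogeneous sources can be sketched in Python: rows are produced on demand by joining two differently shaped sources at query time, with nothing copied into a central store. This mimics the concept of a VDB view, not Teiid's actual engine or APIs; the sources and fields are hypothetical.

```python
# Two "heterogeneous sources": a key-value store and a tuple list.
customers = {1: "Acme", 2: "Globex"}                         # id -> name
orders = [(101, 1, 250.0), (102, 2, 99.0), (103, 1, 10.0)]  # (order_id, cust_id, total)

def virtual_order_view():
    """A virtual view: each row is assembled lazily by joining the two
    sources when the view is read, so no data is moved or duplicated."""
    for order_id, cust_id, total in orders:
        yield {"order": order_id, "customer": customers[cust_id], "total": total}

# "Query" the view as if it were a single table.
acme_orders = [r for r in virtual_order_view() if r["customer"] == "Acme"]
print(acme_orders)
```

An engine like Teiid goes much further, pushing predicates down to each source and optimizing the federated plan, but the consumer-facing contract is the same: one logical view, many physical sources.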
This lesson covers creating a DQS knowledge base named "Suppliers" to be used for cleansing and matching supplier data. The following key tasks are covered:
1. Creating the Suppliers knowledge base and domains for fields to be cleansed and matched like "SupplierID".
2. Adding values to domains manually, by importing from Excel, or through knowledge discovery on sample data.
3. Setting domain rules to validate, correct, and standardize values.
4. Setting term relationships to standardize values like treating "Inc." as "Incorporated".
5. Specifying synonym values where one is the leading value used for cleansing.
6. Creating a composite "AddressValidation"
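The term relationships and synonym handling in steps 4 and 5 can be sketched in Python. The mappings below are hypothetical examples; in DQS they live in the knowledge base and are applied by the cleansing engine, not written as code.

```python
# Hypothetical term relationships (normalize terms inside a value) and
# synonyms (map whole values to their leading value).
TERM_RELATIONS = {"Inc.": "Incorporated", "Corp.": "Corporation"}
SYNONYMS = {"Contoso Incorporated": "Contoso", "Contoso Ltd": "Contoso"}

def cleanse(value):
    """Apply term relationships first, then collapse synonyms onto the
    leading value used for cleansing."""
    for term, standard in TERM_RELATIONS.items():
        value = value.replace(term, standard)
    return SYNONYMS.get(value, value)

print(cleanse("Contoso Inc."))  # -> "Contoso"
```

Running every incoming supplier name through the same two passes is what makes later matching reliable: two spellings of one supplier converge on a single canonical value before comparison.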
The Pivotal Business Data Lake provides a flexible blueprint to meet your business's future information and analytics needs while avoiding the pitfalls of typical EDW implementations. Pivotal’s products will help you overcome challenges like reconciling corporate and local needs, providing real-time access to all types of data, integrating data from multiple sources and in multiple formats, and supporting ad hoc analysis.
This document discusses how businesses can transform their data centers into highly optimized private clouds using Intel Xeon processors. It outlines key challenges in virtualized data centers like explosive data growth and I/O bottlenecks. It then describes how the Intel Xeon processor E5 family addresses these challenges through features like improved storage solutions, increased I/O performance, stronger security and reduced energy costs. The processors provide a foundation for flexible, efficient private clouds that simplify application deployments and provide scalability.
This document summarizes the results of benchmark testing of Misys Opics Plus 3.1, a financial software solution, running on SQL Server 2008 R2. The testing showed that Opics Plus exceeded performance goals by processing FX trades with a response time below 0.25 seconds for 200 users and a throughput of over 64 deals per second. The conclusions are that Opics Plus provides enterprise-class performance and scalability on SQL Server and can meet the demands of large financial organizations in a cost-effective manner.
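As a sanity check on the reported figures, Little's Law relates throughput and response time to the average number of requests in flight. At 64 deals per second with a 0.25-second response time, only about 16 deals are being processed at any instant, so 200 users with think time between trades are comfortably served. This is an interpretation of the published numbers, not a calculation from the benchmark report itself.

```python
def in_flight(throughput_per_sec, response_time_sec):
    """Little's Law: average requests in flight = throughput x response time."""
    return throughput_per_sec * response_time_sec

print(in_flight(64, 0.25))  # 16.0 concurrent deals at peak reported load
```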
Dell EMC Ready Solutions for Big Data are powered by the BlueData EPIC software platform - for on-demand provisioning and automation. These integrated solutions enable a cloud-like experience for Big-Data-as-a-Service (BDaaS) while ensuring the enterprise-grade security and performance of on-premises infrastructure.
With Dell EMC Ready Solutions for Big Data, customers can rapidly deploy their analytics and machine learning workloads in a secure multi-tenant architecture, with multiple user groups running on shared infrastructure. Their users can quickly and easily provision distributed environments for Cloudera, Hortonworks, Kafka, MapR, Spark, TensorFlow, and other tools.
The new Ready Solutions include everything that customers need to enable BDaaS on-premises – including BlueData EPIC software as well as Dell EMC hardware, consulting, deployment, and support services.
To learn more, visit www.dellemc.com/bdaas
Data is everything in today's corporate world; how your company harnesses it can make the difference between scaling and failing. For many businesses, MS SQL Server 2019 Standard is a critical lifeline. SQL Server is a relational database management system (RDBMS) that supports transaction processing, business intelligence, and analytics applications.
How SQL Server Can Streamline Database Administration - Direct Deals, LLC
In addition to ensuring data integrity, this approach delivers heightened scalability and availability for critical data sets. SQL Server 2022 is available in several editions, including SQL Server 2022 Standard. By embracing this solution, businesses can scale their operations to accommodate increased workloads and user demands without sacrificing responsiveness.
Read more: https://www.directdeals.com/how-sql-server-can-streamline-database-administration
A comparison of the Dell AI portfolio vs. similar offerings from HPE
Conclusion
Harnessing the power of AI to streamline and improve business operations can be a challenging task, with significant business implications. With technology advancing more rapidly than ever, partnering with the right vendor for AI is key. By choosing a company like Dell that not only offers a comprehensive AI portfolio, but can also provide planning, preparation, implementation and management services, customers can face these challenges head on. MLPerf® Benchmark testing shows that offerings in the Dell AI portfolio offer consistent, strong performance for AI workloads. With high-performing and flexible server options, along with multiple storage choices, validated solutions, and professional services specifically tailored for AI, Dell can help businesses embrace AI and its benefits.
Future of Power: Power Strategy and Offerings for Denmark - Steve Sibley, IBM Danmark
IBM is promoting its Power Systems as optimized for big data, analytics, mobile, social, and cloud workloads. Key highlights include Power Systems providing a flexible, secure infrastructure to support next generation applications and analytics on big data. IBM also emphasizes partnerships with software vendors and the open innovation through the OpenPOWER consortium to deliver client choice and drive down costs.
Intel demonstrated its 'Cloud-in-a-Box' tool, which showcases technologies that can deliver a more secure and energy efficient cloud that is faster to deploy. The cloud largely runs on Intel Xeon processors. Technologies presented include Intel Trusted Execution Technology for security, Intel AES New Instructions for encryption, and Intel Intelligent Power Node Manager for power optimization. Intel aims to help make clouds simpler and more open and secure through initiatives like Cloud Builders and collaboration with the Open Data Center Alliance.
Tamilarasu Uthirasamy has over 10 years of experience in data warehousing, database design, ETL processes, and analytics. He has skills in technologies like Python, Spark, UNIX shell scripting, databases like Netezza and Oracle, and tools like Datastage and R. He has worked on projects in healthcare, retail, and banking domains, designing data models and warehouses and developing ETL processes.
Learn about IBM PureFlex System: The Future of Data Center Management. The 'Expert Integrated System' delivers a combination of hardware, software and built-in expertise that makes implementing and applying the power of computing simpler, easier, faster and more effective than ever before. For more information, visit http://ibm.co/J7Zb1v.
Learn about IBM PureFlex System: The Future of Datacenter Management. The system has the unique management capabilities of the IBM Flex System Manager. IBM's goals in designing and building these systems were to return agility, efficiency, simplicity and control to data center operations. To know more, visit http://ibm.co/J7Zb1v.
Meeting the challenges of AI workloads with the Dell AI portfolio - Summary - Principled Technologies
A comparison of the Dell AI portfolio vs. similar offerings from HPE
In brief: Additional contributors to AI success
AI models require more than high performance servers for success. You must also consider storage for unstructured data and professional services to plan, prepare, deploy, and manage your AI solution. The Dell portfolio offers storage for AI datasets with the PowerScale series for file storage and Elastic Cloud Storage or ObjectScale storage for object storage.
Organizations can reap the advantages of Dell's professional and consulting services for AI, which offer some services that HPE does not, such as data preparation. The Dell portfolio also includes Validated Designs for AI, which take the guesswork out of designing and deploying AI.
Rajesh S has over 3 years of experience in developing ETL applications using IBM Datastage. He has extensive experience designing and developing Datastage jobs to extract, transform and load data from various sources such as Oracle and Teradata databases into data warehouses. Some of his key skills include Datastage, Unix scripting, Oracle, Teradata and working on projects in the healthcare and retail domains.
Diane Bryant of Intel's Data Center & Connected Systems Group shares the latest news on the Intel Xeon E5 v2 family of processors and technologies like Intel Network Builders that enable the re-architecture of the data center.
Pedro P. Gatica Jr. provides his contact information and an extensive professional profile summarizing his experience and skills as an IT Lead Software Developer/Integrator with over 25 years of experience in data warehousing, ETL development, and project management. He has expertise in technologies like DataStage, Informatica, DB2, Oracle, and SQL Server. His experience includes roles at USAA from 1997 to 2014 where he led many strategic data integration projects and established batch and real-time ETL environments.
This document provides an overview of SQL Server 2008 R2 and its key capabilities. It discusses how SQL Server 2008 R2 helps customers scale efficiently, reduce risk and gain agility, and respond quickly to business opportunities. It highlights SQL Server 2008 R2's high performance data warehouse capabilities, enhanced security features, virtualization and management improvements, and self-service business intelligence tools.
This document discusses the features and benefits of upgrading to SQL Server 2014. It highlights new performance enhancing features like In-Memory OLTP and ColumnStore. It also covers improved availability options, security features like transparent data encryption, and cloud-readiness capabilities like backup to Microsoft Azure. The document provides overviews of the Standard and Enterprise editions, and includes examples of how specific companies have benefited from upgrading.
How to Swiftly Operationalize the Data Lake for Advanced Analytics Using a Lo... - Denodo
Watch full webinar here: https://bit.ly/3mfFJqb
Presented at Chief Data Officer Live Series 2021, ASEAN (August Edition)
While big data initiatives have become necessary for any business to generate actionable insights, big data fabric has become a necessity for any successful big data initiative. The best-of-breed big data fabrics should deliver actionable insights to the business users with minimal effort, provide end-to-end security to the entire enterprise data platform, and provide real-time data integration while delivering a self-service data platform to business users.
Watch this on-demand session to learn how big data fabric enabled by Data Virtualization:
- Provides lightning fast self-service data access to business users
- Centralizes data security, governance, and data privacy
- Fulfills the promise of data lakes to provide actionable insights
Teradata, Oracle, Sybase (SAP), and IBM lead the enterprise data warehousing market according to Forrester's evaluation. Teradata provides the most scalable and flexible EDW solution. Oracle has built its Exadata Database Machine into a formidable product family. Sybase continues to enhance its massively parallel columnar technology for real-time analytics. IBM has ramped up its focus on petabyte-scale Hadoop integration. EMC Greenplum, Netezza, Microsoft, and Vertica Systems also demonstrate strengths in the competitive market.
Mohammad Amjad Khan has over 11 years of experience working as an ETL Developer and BI Analyst. He has extensive experience extracting, transforming, and loading data from various sources like SAP, legacy systems, and file systems into data warehouses using tools like Informatica and Cognos Data Manager. He also has experience writing queries, creating reports and dashboards using Cognos and Business Objects. Some of the key projects he has worked on include data migration projects for Amazon, Vodafone, Friends Life, and KPMG.
Oracle provides modern cloud applications including CX Cloud, HCM Cloud, ERP Cloud, SCM Cloud, and Data Cloud. The applications are complete, data-driven, personalized, connected, and secure. Oracle has many customers using its cloud applications worldwide. The document discusses Oracle's leadership in the SaaS market and how its applications and services provide benefits such as faster delivery, better products, feedback and innovation. It also introduces new Adaptive Intelligent applications that will add business value by providing smarter and more contextual experiences using data and machine learning.
K1 keynote 1_oracle_integrated_cloud_strategy_and_vision_for_journey_to_cloud... - Dr. Wilfred Lin (Ph.D.)
1) Jason Tsang discusses Oracle's cloud strategy and vision, which provides a complete, open, and secure platform spanning all layers of the cloud to support customers' journey to the cloud.
2) Oracle's cloud platform offers choice and access to innovation for all through its integrated infrastructure (IaaS), platform (PaaS), and software (SaaS) offerings.
3) Oracle aims to support customers on every stage of their journey to the cloud through deployment choice, migration paths, and personalized support.
This document introduces Oracle's Zero Data Loss Recovery Appliance, which provides a new approach to protecting Oracle databases. Traditional backup appliances treat databases as files and copy them periodically, which can slow production and leave gaps in protection during backups. The Recovery Appliance uses real-time redo transport to instantly protect ongoing transactions with no data loss. It offloads backup processing and scales to protect all databases, providing database-level recoverability without impacting production. Customers have achieved faster restores, reduced backup windows, capacity savings, and elimination of data loss.
C6 oracles storage_strategy_from_databases_to_engineered_systems_to_cloud - Dr. Wilfred Lin (Ph.D.)
The document discusses Oracle's storage strategy of integrating storage with its other products like databases, engineered systems, and cloud offerings. It outlines how storage is moving from standalone products to being engineered as part of complete systems to improve functionality, performance and reduce costs. Oracle storage products are designed to work closely with databases, engineered systems and public cloud to simplify management and move data seamlessly between on-premises and cloud environments. Examples are given of customers benefiting from faster database provisioning, data protection and time to market using Oracle's engineered storage solutions.
This document discusses Oracle's SPARC systems and their ability to modernize legacy Unix applications and provide a path to the cloud. It describes how SPARC systems offer a modern, cloud-ready infrastructure that can leverage existing investments while improving security, capacity, and flexibility. It provides examples of SPARC solutions that delivered benefits like reduced costs, increased throughput, and scalability for customers in various industries.
The document discusses optimizing application infrastructure by moving from a generic approach to an engineered approach using Oracle technologies. It describes how generic infrastructure treats all workloads equally and may not optimize for cost or performance needs. Oracle offers engineered systems like the Private Cloud Appliance to optimize for cost through features like integrated software and hardware. It also offers Exalogic to optimize for extreme performance requirements through tight integration and software enhancements. Case studies show customers achieving significant cost savings and performance improvements versus generic infrastructure.
The document discusses Oracle Cloud Machine, which brings the capabilities of Oracle's public cloud behind a customer's firewall. It allows customers to maintain control over critical systems while gaining the agility, flexibility, and cost structure of the public cloud. Oracle Cloud Machine delivers Oracle's PaaS and IaaS software on-premises and manages it as a service. This gives customers a cloud-like experience with their data and applications on their own premises and under their control.
The document discusses five journeys organizations can take to evolve their infrastructure to the cloud using Oracle Engineered Systems: 1) Streamline the enterprise on-premises infrastructure to dramatically improve costs and performance, 2) Extend an existing private cloud to optimize it for enterprise applications, 3) Deploy a hybrid cloud for development and testing, 4) Bring the public cloud model on-premises, and 5) Lift and shift workloads to Oracle's public cloud. It provides examples of customers who achieved savings and benefits by taking these journeys.
The document discusses Oracle's approach to enterprise cloud strategy. It notes that most public and private cloud offerings are incompatible and incomplete for enterprise needs. Oracle proposes an integrated cloud solution providing enterprise SaaS, PaaS and IaaS that can span both private and public clouds. This would allow enterprises to run applications and workloads across their on-premises infrastructure and Oracle's public cloud platform. Oracle argues this integrated approach is needed to bring true cloud agility to enterprise applications and IT.
The document discusses Oracle's API management platform. It enables digital transformation by allowing organizations to manage, secure, and monetize their APIs. The platform provides tools to build and deploy APIs, secure access through authentication and authorization, monitor API usage through analytics, and publish APIs to developers through a portal. It helps organizations improve agility, gain visibility into API usage, and ensure security of digital assets.
B6 improve operational_efficiency_through_process_and_document_collaboration - Dr. Wilfred Lin (Ph.D.)
The document discusses Oracle Process Cloud Service, a platform for automating business processes and collaborating on documents. It provides self-service process management, actionable insights from process data, and rapid iteration of processes. Example use cases described include managing lease requests and contracts in property management. The platform allows non-technical users to design processes, access them from any device, and has pre-built integrations with common systems.
This document discusses connecting on-premises systems and applications to the cloud using Oracle's integration services. It begins with an overview of common integration use cases such as development/testing in the cloud, migrating systems to the cloud, integrating SaaS applications, enabling analytics in the cloud, and integrating IoT device data. It then describes Oracle's integration cloud services for application integration, data integration, IoT, and APIs. The document demonstrates connecting applications using integration cloud service and provides two customer case studies on extending Oracle SaaS using platform services and consolidating integrations. It concludes with next steps around following Oracle's social media channels for integration.
The document discusses DevOps principles and how Oracle provides tools to enable DevOps. It summarizes that DevOps is a cultural movement that emphasizes collaboration between development and operations teams. Oracle offers cloud services that allow for fast provisioning of development environments and comprehensive tools that support the full development lifecycle from coding to deployment to management. The document provides examples of how Oracle tools have helped a DevOps team of 170 members work collaboratively through automated processes for code management, testing, deployment and monitoring across development and production environments.
The document discusses getting started with cloud native development and provides an overview of Oracle's cloud platform for application development, which supports building modern cloud-native applications using technologies like microservices, containers, and mobile development tools, and allows developers to test and deploy applications in the cloud with services for continuous delivery, scaling, and monitoring. It also highlights Oracle's developer automation, Java, and container cloud services that help developers build, deploy, and manage applications in a cloud environment.
- The document discusses building better mobile apps faster using Oracle Mobile Cloud services. It highlights trends of high rewrite rates for existing mobile apps and challenges with traditional mobile development.
- Oracle Mobile Cloud aims to address these challenges by offering mobile-first services, rapid development tools, and an integrated platform to build better apps faster across the mobile lifecycle.
- The platform provides services for push notifications, data storage, location, and analytics alongside tools for low-code development and code-based approaches.
B1 keynote reimagine_application_development_and_delivery_with_oracle_platform - Dr. Wilfred Lin (Ph.D.)
The document discusses new demands for application development and how Oracle Cloud Platform addresses these demands. It highlights key trends like microservices, DevOps, and mobile development. Oracle Cloud Platform provides a unified platform for building, deploying, and managing applications across cloud infrastructure, backend services, and development tools. It also allows migrating workloads between on-premises and public cloud environments. The presentation includes a demo of developing a sample application using various Oracle Cloud Platform services.
The document discusses Oracle Analytics Cloud and its capabilities for data visualization and storytelling. It describes how the tool allows anyone to access and analyze data from various sources to gain insights. It provides rich visualization features, collaborative sharing abilities, and can be accessed on mobile, desktop or browsers to tell data-driven stories. The key benefits highlighted are that it offers powerful yet easy-to-use analytics accessible to all users.
This document discusses how big data and analytics are moving from on-premises data warehouses to hybrid cloud environments that leverage technologies like Hadoop, Spark, and machine learning. It provides examples of how Oracle is helping customers with this transition by offering big data cloud services that give them flexibility to run workloads both on-premises and in the cloud while simplifying data management and enabling new types of advanced analytics.
The document discusses security considerations for cloud computing. It notes that trust is paramount when choosing a cloud partner and that many customers are concerned about cloud providers accessing their data without permission. The document advocates for a shared security model between cloud providers and customers based on mutual trust and verification. It outlines Oracle's approach to cloud security which focuses on secure architecture, products, maintenance, and deployment backed by physical, technology, process, and people controls.
This document describes an intelligent, unified platform for managing applications and infrastructure across multiple clouds, containers, and on-premise environments. The platform provides complete visibility from the end user experience to infrastructure, and is powered by machine learning for anomaly detection, clustering, prediction, and correlation. It offers a suite of services including discovery and monitoring, configuration and compliance, automation and orchestration, and analytics and planning. The platform is designed to provide greater agility, increased efficiency, and fewer outages for managing applications and infrastructure at scale.
Programming Foundation Models with DSPy - Meetup Slides - Zilliz
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
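The programming-over-prompting idea can be illustrated with a minimal sketch in plain Python. This is not DSPy's actual API; the `Signature` class and `render_prompt` helper below are invented stand-ins for the concept of declaring a task as typed input/output fields and deriving the prompt from the declaration rather than hand-writing prompt strings:

```python
# Minimal sketch of "programming, not prompting": declare a task as a
# typed signature and compile it into a prompt. Illustrative only; the
# names here are NOT DSPy's real API.
from dataclasses import dataclass, field

@dataclass
class Signature:
    instruction: str
    inputs: list = field(default_factory=list)
    outputs: list = field(default_factory=list)

def render_prompt(sig: Signature, **values) -> str:
    """Compile a signature plus concrete input values into a prompt string."""
    lines = [sig.instruction]
    for name in sig.inputs:
        lines.append(f"{name}: {values[name]}")
    for name in sig.outputs:
        lines.append(f"{name}:")  # the model is asked to complete these fields
    return "\n".join(lines)

qa = Signature(
    instruction="Answer the question concisely.",
    inputs=["question"],
    outputs=["answer"],
)

prompt = render_prompt(qa, question="What does DSPy optimize?")
print(prompt)
```

Because the prompt is derived from a declaration, an optimizer can rewrite the instruction or add demonstrations without the developer touching prompt strings, which is the core appeal the talk describes.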
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
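One practical guard in AI-assisted XML workflows is to check that model-generated markup is well-formed before accepting it into the pipeline. A minimal sketch using Python's standard library (real pipelines would additionally validate against an XSD or Schematron schema; the sample strings below are invented):

```python
# Guard for AI-generated markup: reject anything that is not well-formed
# XML before it enters the authoring pipeline. Sketch only; production
# use would add schema validation (XSD, Schematron).
import xml.etree.ElementTree as ET

def is_well_formed(xml_text: str) -> bool:
    try:
        ET.fromstring(xml_text)
        return True
    except ET.ParseError:
        return False

good = "<para>XML continues to play a <emphasis>vital</emphasis> role.</para>"
bad = "<para>Unclosed <emphasis>tag</para>"  # mismatched close tag

print(is_well_formed(good))  # True
print(is_well_formed(bad))   # False
```

The same pattern applies when AI enriches plain text with markup: generate, verify, and only then hand the result to downstream XSLT or refactoring steps.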
Climate Impact of Software Testing at Nordic Testing Days - Kari Kakkonen
My slides at Nordic Testing Days 6.6.2024
The talk discusses the climate impact and sustainability of software testing. ICT and testing must carry their part of the global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint: a positive impact on the climate. Quality characteristics can be extended with sustainability and then measured continuously. Test environments can be used less, at smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
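One of the test-technique ideas mentioned, minimizing the number of tests while still covering parameter interactions, can be sketched as greedy pairwise selection. This is an illustrative toy, not a production tool such as PICT or AllPairs, and the parameters below are invented:

```python
# Greedy pairwise test selection: cover every pair of parameter values
# with far fewer cases than the full cartesian product, reducing runs
# (and their energy use). Toy sketch, not an optimal algorithm.
from itertools import combinations, product

def pairwise_suite(params):
    names = list(params)
    candidates = list(product(*params.values()))
    # Every (parameter-index, value) pair combination that must be covered.
    uncovered = {
        ((i, a), (j, b))
        for i, j in combinations(range(len(names)), 2)
        for a in params[names[i]]
        for b in params[names[j]]
    }

    def pairs_of(case):
        return {((i, case[i]), (j, case[j]))
                for i, j in combinations(range(len(case)), 2)}

    suite = []
    while uncovered:
        # Pick the candidate covering the most still-uncovered pairs.
        best = max(candidates, key=lambda c: len(pairs_of(c) & uncovered))
        suite.append(dict(zip(names, best)))
        uncovered -= pairs_of(best)
    return suite

params = {"browser": ["chrome", "firefox"],
          "os": ["linux", "windows"],
          "locale": ["en", "fi", "de"]}
suite = pairwise_suite(params)
print(len(suite), "cases instead of", len(list(product(*params.values()))))
```

Every pair of values still appears in at least one selected case, so interaction bugs between any two parameters remain detectable with a fraction of the runs.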
AI 101: An Introduction to the Basics and Impact of Artificial Intelligence - IndexBug
Imagine a world where machines not only perform tasks but also learn, adapt, and make decisions. This is the promise of Artificial Intelligence (AI), a technology that's not just enhancing our lives but revolutionizing entire industries.
GraphRAG for Life Science to increase LLM accuracy - Tomaz Bratanic
GraphRAG for the life science domain, where you retrieve information from biomedical knowledge graphs using LLMs to increase the accuracy and performance of generated answers.
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices want to take full advantage of the features available on those devices, but many features provide convenience and capability at the expense of security. This best practices guide outlines steps users can take to better protect personal devices and information.
TrustArc Webinar - 2024 Global Privacy Survey - TrustArc
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program
HCL Notes and Domino license cost reduction in the world of DLAU - panagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-und-domino-lizenzkostenreduzierung-in-der-welt-von-dlau/
DLAU and licensing under the CCB and CCX models have been a hot topic in the HCL community since last year. As a Notes or Domino customer, you may be struggling with unexpectedly high user counts and license fees. You may be wondering how this new type of licensing works and what benefits it brings you. Above all, you certainly want to stay within budget and save costs wherever possible. We understand that, and we want to help!
We explain how to resolve common configuration problems that can cause more users to be counted than necessary, and how to identify and remove superfluous or unused accounts to save money. Some setups can also lead to unnecessary spending, for example when a person document is used instead of a mail-in database for shared mailboxes. We show you such cases and their solutions. And of course we explain the new license model.
Join this webinar, in which HCL Ambassador Marc Thomas and guest speaker Franz Walder introduce you to this new world. It gives you the tools and the know-how to keep track of everything. You will be able to reduce your costs through an optimized Domino configuration and keep them low in the future.
Topics covered
- Reducing license costs by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how best to use it
- Tips for common problem areas, such as team mailboxes, functional/test users, etc.
- Real-world examples and best practices you can apply immediately
Building Production Ready Search Pipelines with Spark and Milvus - Zilliz
Spark is the widely used ETL tool for processing, indexing and ingesting data to serving stack for search. Milvus is the production-ready open-source vector database. In this talk we will show how to use Spark to process unstructured data to extract vector representations, and push the vectors to Milvus vector database for search serving.
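At serving time, the core operation is nearest-neighbor search over the vectors the ETL job produced. A brute-force toy version in plain Python (real deployments use Milvus with approximate indexes; the embeddings and document IDs below are invented for illustration):

```python
# Toy vector search: rank documents by cosine similarity to a query
# embedding. Brute force for clarity; a vector database like Milvus
# replaces this with ANN indexes at scale.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Pretend these embeddings came out of the Spark preprocessing job.
index = {
    "doc-1": [0.9, 0.1, 0.0],
    "doc-2": [0.1, 0.9, 0.1],
    "doc-3": [0.8, 0.2, 0.1],
}

def search(query_vec, k=2):
    scored = sorted(index.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

print(search([1.0, 0.0, 0.0]))  # ['doc-1', 'doc-3']
```

The Spark side of the pipeline fills `index` at scale; the serving side is conceptually just this ranking, delegated to the vector database.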
Removing Uninteresting Bytes in Software Fuzzing - Aftab Hussain
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speedup fuzzing campaigns by pinpointing and eliminating those uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined tools from two critical Linux packages: Libxml's xmllint, a tool for parsing XML documents, and Binutils' readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format) files. Our preliminary results show that AFL+DIAR not only discovers new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns, and DIAR helps you find such seeds.
These are slides from the talk given at the IEEE International Conference on Software Testing, Verification and Validation Workshop (ICSTW 2022).
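The general idea of dropping uninteresting seed bytes can be sketched as a ddmin-style pass: delete each byte and keep the deletion if a cheap behavior signature of the target stays unchanged. This is an illustrative toy, not the actual DIAR algorithm; the `signature` function below is an invented stand-in for coverage feedback:

```python
# Toy seed minimization in the spirit of removing uninteresting bytes:
# a byte is "uninteresting" if deleting it leaves the target's behavior
# signature unchanged. NOT the real DIAR algorithm.
def signature(data: bytes):
    """Stand-in for coverage feedback: a toy 'parser' that only reacts
    to digits and brackets in the input."""
    return tuple(b for b in data if b in b"0123456789[]")

def shrink_seed(seed: bytes) -> bytes:
    target = signature(seed)
    i = 0
    while i < len(seed):
        candidate = seed[:i] + seed[i + 1:]
        if signature(candidate) == target:
            seed = candidate       # byte was uninteresting; drop it
        else:
            i += 1                 # byte matters; keep it
    return seed

seed = b"xx[12]yy--[3]zz"
print(shrink_seed(seed))  # only bytes the toy 'parser' reacts to remain
```

A real fuzzing setup would replace `signature` with edge-coverage feedback from the instrumented target, so mutations are no longer wasted on bytes that cannot influence execution.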
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/building-and-scaling-ai-applications-with-the-nx-ai-manager-a-presentation-from-network-optix/
Robin van Emden, Senior Director of Data Science at Network Optix, presents the “Building and Scaling AI Applications with the Nx AI Manager,” tutorial at the May 2024 Embedded Vision Summit.
In this presentation, van Emden covers the basics of scaling edge AI solutions using the Nx tool kit. He emphasizes the process of developing AI models and deploying them globally. He also showcases the conversion of AI models and the creation of effective edge AI pipelines, with a focus on pre-processing, model conversion, selecting the appropriate inference engine for the target hardware and post-processing.
van Emden shows how Nx can simplify the developer's life and facilitate a rapid transition from concept to production-ready applications. He provides valuable insights into developing scalable and efficient edge AI solutions, with a strong focus on practical implementation.
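The pipeline shape described in the talk (pre-processing, inference, post-processing) can be sketched as a simple function composition. All stages below are invented placeholders, not Nx AI Manager APIs:

```python
# Edge AI pipeline shape: pre-process -> inference -> post-process.
# Each stage is a placeholder standing in for the real component
# (camera frame decoding, a converted model on an inference engine, etc.).
def preprocess(frame):
    return [p / 255.0 for p in frame]          # normalize pixel values

def infer(tensor):
    # Fake "model": argmax over the input, standing in for a real
    # inference engine call on the target hardware.
    return max(range(len(tensor)), key=tensor.__getitem__)

def postprocess(class_id):
    labels = ["person", "car", "bicycle"]      # invented label set
    return labels[class_id]

def pipeline(frame):
    return postprocess(infer(preprocess(frame)))

print(pipeline([10, 250, 30]))  # -> car
```

Keeping the stages as separate functions mirrors the talk's point: model conversion and inference-engine selection only touch the middle stage, while pre- and post-processing stay stable across hardware targets.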
Ivanti’s Patch Tuesday breakdown goes beyond patching your applications and brings you the intelligence and guidance needed to prioritize where to focus your attention first. Catch early analysis on our Ivanti blog, then join industry expert Chris Goettl for the Patch Tuesday Webinar Event. There we’ll do a deep dive into each of the bulletins and give guidance on the risks associated with the newly-identified vulnerabilities.
Infrastructure Challenges in Scaling RAG with Custom AI Models (Zilliz)
Building Retrieval-Augmented Generation (RAG) systems with open-source and custom AI models is a complex task. This talk explores the challenges in productionizing RAG systems, including retrieval performance, response synthesis, and evaluation. We’ll discuss how to leverage open-source models like text embeddings, language models, and custom fine-tuned models to enhance RAG performance. Additionally, we’ll cover how BentoML can help orchestrate and scale these AI components efficiently, ensuring seamless deployment and management of RAG systems in the cloud.
Driving Business Innovation: Latest Generative AI Advancements & Success Story (Safe Software)
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
SQL Server 2012 and Intel E7 Processor: More Capability and Higher Value for Mission-Critical Databases
More Capability and Higher Value for Mission-Critical Databases
Microsoft SQL Server* 2012 and the Intel® Xeon® processor E7-8800/4800/2800
product families offer a powerful, cost-effective platform for mission-critical
databases with rich, built-in support for:
• Enterprise-wide analytics
• Big data integration
• Hybrid cloud extensibility
This white paper discusses the key capabilities SQL Server 2012 and the Intel Xeon
processor E7 family bring to enterprise computing. It offers useful information
for any IT decision-maker looking for ways to manage and use information more
effectively to grow their business and improve profitability.
White Paper
Intel® Xeon® Processor E7-8800/4800/2800 Product Families
Microsoft SQL Server* 2012
White Paper: More Capability and Higher Value for Mission-Critical Databases
Table of Contents
• Data-Driven Business: The Race is On
• A Rock-Solid Foundation for Mission-Critical Databases
• Scalable Performance for Tier-1 Workloads
• Pervasive Insight—Built-in Analytics and Big Data Integration
• Cloud-Ready for Elastic Scalability
• Looking Ahead: End-to-End Data Integration
• Conclusion
• Additional Resources
• Appendix: Technologies at a Glance

Executive Summary
Information is the fuel that drives success in today’s business world.
When the right information is available at the right time, core business
processes run more efficiently and employees make better, faster
decisions. Some of the world’s most successful companies are using
innovative new database and analytics solutions to help ensure the
right information is available at the right time more consistently
throughout their businesses.
Intel and Microsoft can help you stay at the forefront of this revolution
by taking advantage of advanced analytics capabilities that are tightly
integrated into a complete information platform for mission-critical
enterprise computing. Microsoft SQL Server* 2012 and the Intel®
Xeon® processor E7 family deliver world-class performance and
reliability for tier-1 transactional workloads across native, virtualized,
and cloud computing deployment models. The combined platform
also provides integrated support for advanced data analytics and
big data integration, with no need for costly add-ons.
From client systems to public clouds, Intel and Microsoft provide a
unified infrastructure, management, and security framework that
helps IT organizations manage cost and risk more effectively as they
continue to expand their information infrastructure and add new
functionality. Advanced performance, reliability, and security features
are built into the hardware and software platform at all levels, enabling
business users to access data and insights more quickly and securely
using some of their most familiar tools, such as Microsoft Excel* and
Microsoft SharePoint.*
“SQL Server 2012 offers breakthrough capabilities,
including integration with big data. With Windows Server
2012 Hyper-V and the Intel Xeon processor E7 family, you
gain a game-changing software and hardware platform
for mission-critical enterprise computing that can help
you virtualize and consolidate SQL Server workloads
with confidence.”
– Herain Oberoi, director, Server and Tools Marketing, Microsoft
Enterprise data has been the lifeblood of enterprise transactional systems for many years, and still is. Fast, uninterrupted access to data remains essential to support core, revenue-generating business functions. Yet today’s businesses need more from their databases than scalable performance and high availability. They also need to be able to analyze their data quickly and effectively, and they need to do so across rapidly growing and increasingly diverse data sets.
According to Gartner, total worldwide data volumes are growing at 59 percent per year,[1] and up to 85 percent of that data is unstructured
content, such as documents, e-mails, videos, logs, social networking
posts, and networked sensor data. Properly analyzed, this data offers
unprecedented insight into the minds of customers and the operations
of the business.
New business intelligence (BI) and data analytics solutions have arisen to enable near-real-time analysis across all these data types. At the same time, cloud computing is emerging as a viable adjunct to traditional data center computing models. As data volumes and analytics requirements continue to grow, the elastic scalability and superior cost models of cloud computing offer new options for storing and analyzing massive data sets without breaking IT budgets.
Taking advantage of these emerging capabilities is increasingly important to build and sustain competitive advantage. SQL Server 2012 running on servers based on the Intel Xeon processor E7 family offers a uniquely powerful and cost-effective solution for the most demanding enterprise requirements (Figure 1). The combined platform provides world-class performance, scalability, and reliability for mission-critical transactional applications, along with built-in, fully integrated analytics capabilities to address the enormous challenges—and the even more enormous opportunities—presented by the ongoing explosion of enterprise data. Just as importantly, these capabilities are provided with cost models that are much more favorable than traditional database solutions, so IT organizations can deliver higher value to the business at lower total cost.
Data-Driven Business: The Race is On

Figure 1. Powerful and cost-effective support for the full range of enterprise database requirements. Microsoft SQL Server* 2012 runs on mainstream 2-socket and 4-socket servers based on the Intel® Xeon® processor E5 family for departmental databases, departmental data marts, and scale-up enterprise data warehouses, and on scalable enterprise servers (2-socket to 8-socket and larger) based on the Intel® Xeon® processor E7 family for mission-critical enterprise databases. Decision support databases (data warehouses, data marts, and business intelligence systems) stress data volume, query complexity, and query performance; transactional databases (ERP, CRM, etc.) stress performance, reliability and availability, transaction volume, and data volume.
A Rock-Solid Foundation for Mission-Critical Databases
SQL Server and Intel Xeon processor-based servers have been supporting mission-critical enterprise deployments for many years and in some of the world’s most demanding business environments, including financial services, the telecommunications industry, and many others.
Customers have successfully achieved five-nines, six-nines, and higher availability on the combined platform.[2] SQL Server 2012 and the Intel Xeon processor E7 family offer an even more attractive platform for mission-critical computing, with better Reliability, Availability, and Serviceability (RAS) capability than previous product generations, and with tighter integration and simpler implementation.[3]
Highly Available Servers
The Intel Xeon processor E7 family is engineered specifically for
scalable performance and high availability. Errors occur in all server
platforms and are among the most common causes of downtime and
data corruption. Hardened circuits and rigorous testing and validation
help to prevent errors throughout server platforms based on the Intel
Xeon processor E7 family. When errors do occur, automated error
correcting technologies help to resolve them transparently so they
don’t impact systems and applications.
The Intel Xeon processor E7 family also includes Machine Check Architecture (MCA) Recovery, which enables automatic system recovery from many uncorrectable errors that would be fatal in other server platforms (Figure 2). MCA Recovery is fully supported in Windows Server* 2012, Hyper-V,* and SQL Server 2012. This comprehensive support delivers integrated error management throughout the software stack to provide higher levels of data integrity and system uptime.
Figure 2. The Intel® Xeon® processor E7 family includes Machine Check Architecture (MCA) Recovery, which is supported in SQL Server* 2012, Windows Server* 2012, and Hyper-V* to enable higher availability through sophisticated error recovery. How it works: (1) during normal operation, an error is detected (2); hardware-correctable errors are corrected (3), while uncorrectable errors are contained (3) and resolved through software-assisted system recovery (4).
Driving Real-World Business Success: Telecommunications

Scaling Redknee TCB to handle 250 million subscribers

Telecommunications providers are transforming their operations and business models in order to deliver new services to their subscribers faster and at lower cost. Redknee Turnkey Converged Billing (TCB) is a pivotal component of this transformation for many providers, delivering centralized business intelligence, a personalized subscriber experience, and a full suite of billing and customer care capabilities.

A recent 250 million-subscriber performance test by Redknee and Microsoft verified that SQL Server* 2012 running on an eight-socket server based on the Intel® Xeon® processor E7 family could support record-breaking performance. At peak performance, the system processed an average of 1,249 invoices per second and mediated an average of 113,402 Call Detail Records per second. The combined platform exceeded performance objectives and clearly demonstrated it could support peak workloads for a tier-1 telecommunications provider.
For more information, read the complete report. http://download.microsoft.com/download/3/D/D/3DDCC479-E303-401F-9093-942549FF8A33/Redknee_Solution_Brief_with_XIO_NEC_Intel_Mar2012.pdf
Servers based on the Intel Xeon processor E7 family provide many
additional features for optimizing server uptime. Examples include:
• Self-healing communication channels within the server platform to
enable continued operation in the event of I/O or memory link failures.
• Electronically isolated hardware partitioning to protect critical
workloads more effectively in consolidated environments. This
capability also allows IT organizations to perform physical server
maintenance without bringing down an entire server.
• Full and partial memory mirroring to provide complete data
redundancy for the most sensitive applications.
• Online component replacement so hardware repairs can be performed without downtime. In combination with predictive failure analysis (supported in Windows Server 2012), IT staff can identify and replace failing components proactively before they can impact applications.[4]
Fast, Automated Failover
Microsoft SQL Server extends the inherent resilience of the hardware
platform by providing complete software support for high availability
and disaster recovery. AlwaysOn Availability Groups is a new integrated
solution for providing data and hardware redundancy within and across
data centers to increase the availability of mission-critical applications.
Database administrators can deploy up to four secondary, or replica, databases to provide multiple levels of redundancy. Up to two of the secondary databases can be replicated synchronously so they stay in lockstep with the primary production database to enable almost instant failover with no data loss. Additional secondaries can be replicated asynchronously and combined with log shipping to provide complete recovery capabilities to a remote data center for disaster recovery (Figure 3).
Figure 3. SQL Server* 2012 includes AlwaysOn Availability Groups, which provides an integrated, unified solution for high availability and disaster recovery through fast manual and automated failover across LANs and WANs. In the primary data center, synchronous replication (on shared or separate storage) keeps secondary databases in lockstep with the primary production database for near-instant recovery with no data loss. In the disaster recovery data center, asynchronous replication with log application supports manual recovery to a remote facility. Secondaries can be used for queries, backups, and other read-only workloads, and the whole configuration is managed through a setup wizard, dashboard, and Windows PowerShell.*
Failover to a synchronous replica can be fully automated to avoid the delays of manual solutions. SQL Server 2012 also introduces a new capability called Failover Clustering Dynamic Quorum, which simplifies setup by as much as 80 percent compared with previous-generation solutions and helps to ensure uptime down to the last standing replica.[5] In addition, secondary databases are now active: they can be used for data backups, queries, and other read-only workloads to reduce the load on the primary database and to improve overall resource utilization.
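As an illustrative sketch only (the server names, database name, and endpoints below are hypothetical, and an underlying Windows Server Failover Cluster is assumed to be configured already), a Transact-SQL definition of an availability group with two synchronous replicas and one asynchronous disaster-recovery replica might look like:

```sql
-- Hypothetical names: SalesDB, SQLNODE1/SQLNODE2 (primary data center), SQLDR1 (DR site)
CREATE AVAILABILITY GROUP SalesAG
FOR DATABASE SalesDB
REPLICA ON
    N'SQLNODE1' WITH (
        ENDPOINT_URL = N'TCP://sqlnode1.example.com:5022',
        AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,   -- stays in lockstep with the primary
        FAILOVER_MODE = AUTOMATIC),               -- near-instant failover, no data loss
    N'SQLNODE2' WITH (
        ENDPOINT_URL = N'TCP://sqlnode2.example.com:5022',
        AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE = AUTOMATIC,
        SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY)),  -- active secondary for reads
    N'SQLDR1' WITH (
        ENDPOINT_URL = N'TCP://sqldr1.example.com:5022',
        AVAILABILITY_MODE = ASYNCHRONOUS_COMMIT,  -- remote replica for disaster recovery
        FAILOVER_MODE = MANUAL);
```

Allowing read-only connections on a secondary is what makes it usable for backups and reporting queries without adding load to the primary.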
SQL Server 2012 provides a number of additional advanced RAS
features to further increase database uptime.
• Faster patching and more effective testing. SQL Server 2012 supports the Windows Server Core role, which can reduce the need for patching by as much as 50 to 60 percent.[5] When patching is required, Cluster-Aware Updating (CAU) speeds updates across clustered environments. Administrators can also reduce the likelihood of problems following software updates by using Distributed Replay to provide realistic simulations of complex, production workloads for more effective testing.
• Simpler failover and recovery. Hyper-V Replica provides a complementary option for disaster recovery. Implemented at the virtual machine level, it provides simple support for asynchronous replication and disaster recovery across wide area networks. Administrators can also take advantage of Recovery Advisor to provide simpler and more accurate point-in-time backup and recovery, as well as snapshots to the Windows Azure* platform to scale backups more economically.
Scalable Performance for Tier-1 Workloads
Maintaining performance levels in the face of growing data volumes can be a tough challenge. SQL Server 2012 has demonstrated world-record transactional performance for tier-1 workloads running on Intel Xeon processor E7 family-based servers.[6] Both the hardware platform and the software stack can scale to support today’s most demanding requirements, not only in native environments, but also in virtualized servers and cloud infrastructures.

An eight-socket server based on the Intel Xeon processor E7 family provides massive resources for heavy workloads, with up to 80 cores, 160 threads, and support for up to four terabytes of memory. Each processor includes up to 30 MB of cache and a scalable, high-speed I/O subsystem to support network- and storage-intensive database workloads. Intel® Turbo Boost Technology[7] automatically increases core frequencies past rated values for peak workloads when thermals allow, and Intel® Hyper-Threading Technology[8] allows each core to process two simultaneous software threads, which improves computing efficiency when processing multiple simultaneous data requests.
Microsoft SQL Server 2012 and Windows Server 2012 scale to take full
advantage of all the hardware resources in these servers—and in even
larger future server generations—with support for up to 256 logical
processors (threads) and four terabytes of memory per instance.
A number of additional capabilities in the Intel and Microsoft database
platform help to deliver higher scalability for business data at lower
total cost, including:
• Quick, simple cloud extensibility. Businesses can scale their databases into the cloud to meet truly massive requirements. SQL Azure Federation provides “click and split” support for database scaling in hybrid cloud scenarios, and built-in connection points for Windows Azure make it easy to extend or migrate workloads from one environment to another.
• Powerful compression. Compression ratios of up to 90 percent can be realized for some data types,[5] and ratios of 20 to 60 percent are typical.[9] These high compression ratios not only help to reduce storage requirements, but also drive higher performance by enabling more data to be held in main memory, where it can be accessed orders of magnitude faster than from disk storage. The large and efficient memory and cache subsystems of the Intel Xeon processor E7 family help IT organizations make the most of these in-memory capabilities, reducing the need for performance-sapping data transfers between memory and storage.
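The compression described above is enabled per table or index in Transact-SQL. A minimal sketch, assuming a hypothetical dbo.FactSales table; the estimation procedure reports projected size savings before any rebuild is committed:

```sql
-- Estimate the savings PAGE compression would deliver for a (hypothetical) table
EXEC sp_estimate_data_compression_savings
     @schema_name = 'dbo', @object_name = 'FactSales',
     @index_id = NULL, @partition_number = NULL,
     @data_compression = 'PAGE';

-- Rebuild the table with page-level compression enabled
ALTER TABLE dbo.FactSales REBUILD WITH (DATA_COMPRESSION = PAGE);
```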
Driving Real-World Business Success: Core Banking

Record-breaking banking performance for Temenos T24*

Temenos is the market-leading provider of banking software systems worldwide, serving more than 1,500 customers in 125 countries. In a recent high-water benchmark, Temenos and Microsoft engineers tested the performance and scalability of the Temenos T24 core banking application using an Intel and Microsoft solution stack, including SQL Server* 2012 running on a server based on the Intel® Xeon® processor E7 family.

At peak performance, the system processed 11,592 transactions per second, more than tripling the performance of Temenos T24 running on a previous-generation Microsoft and Intel database platform. It also reduced end-of-day batch processing times by almost 50 percent. These record-breaking performance results provide a compelling demonstration that SQL Server 2012 and the Intel Xeon processor E7 family can readily handle mission-critical workloads for even the largest banks.
Temenos T24* High-Water Benchmark test results

                        SQL Server* 2008 R2 on the      SQL Server* 2012 on the
                        Intel® Xeon® Processor          Intel® Xeon® Processor
                        7500 Series                     E7 Family
Online Transactions     3,347 transactions per second   11,592 transactions per second
Batch Processing        1 hour 20 minutes               41 minutes 38 seconds
Processor Utilization   No more than 70%                No more than 75%
Scalability             95%                             95%

For more information, read the complete report. http://download.microsoft.com/download/6/A/1/6A146F75-66B2-4D19-97F5-94B8B0611516/Temenos and SQL Server Highwater Benchmark Report with XIO.PDF
• Guaranteed resources. Enterprise databases have multiple applications vying for resources, so it’s essential to ensure critical applications get the resources they need. With Resource Governor in SQL Server 2012, administrators can create up to 64 isolated resource pools, place hard caps on CPU resource usage, and establish workload affinity to schedulers and non-uniform memory access (NUMA) nodes to provide better support for service-level agreements.
• Affordable, high-end storage. With Server Message Block (SMB),
Windows Server 2012 allows IT organizations to support demanding
storage requirements using remote shared folders on affordable file
servers. Features such as SMB Direct and SMB Multichannel enable fast,
scalable, and highly available storage connectivity to support SQL Server
and other demanding applications. Intelligent storage arrays based on
Intel Xeon processors add to these advantages by providing advanced
storage capabilities, such as on-demand scaling, data compression, data
deduplication, and thin provisioning. Many also support automated
storage tiering, which, when combined with Intel® Solid-State Drives, can enable dramatic performance gains at relatively low cost.[10]
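The Resource Governor capability described in the first bullet above is configured in Transact-SQL. The sketch below (pool and group names are hypothetical) caps a reporting workload using the hard CPU cap and scheduler affinity options introduced in SQL Server 2012:

```sql
-- A resource pool with a hard CPU cap and scheduler (NUMA-aligned) affinity
CREATE RESOURCE POOL ReportingPool
WITH (
    CAP_CPU_PERCENT = 30,          -- hard cap on CPU usage (new in SQL Server 2012)
    AFFINITY SCHEDULER = (0 TO 3)  -- pin the pool to specific schedulers
);

-- A workload group that routes reporting sessions into the pool
CREATE WORKLOAD GROUP ReportingGroup USING ReportingPool;

-- Apply the new configuration
ALTER RESOURCE GOVERNOR RECONFIGURE;
```

In practice, a classifier function assigns incoming sessions to the workload group; that step is omitted here for brevity.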
Figure 4. SQL Server* 2012 provides out-of-the-box support for data warehousing and business intelligence, with optimized support for all user groups, from expert data analysts to typical business users. The complete, out-of-the-box BI solution is optimized for Intel® processor-based clients and servers: self-service BI using powerful and familiar tools (PowerPivot and Power View, Excel* workbooks, SharePoint collaboration, and SharePoint dashboards and scorecards, running on Microsoft Office* with the Intel® Core™ processor family and Microsoft SharePoint* with the Intel® Xeon® processor E5 family), backed by complete support for data warehouses and data marts (Integration Services, Analysis Services, Data Quality Services, Master Data Services, and Reporting Services in Microsoft SQL Server* 2012 on the Intel® Xeon® processor E7 and E5 families, drawing on SQL Server, Microsoft Dynamics, line-of-business applications, and data feeds).
Pervasive Insight – Built-in Analytics and Big Data Integration
All the capabilities discussed so far are about doing business as usual,
only faster, more reliably, and more affordably. Today’s revolution in
data analytics is about growing the business by empowering business
users to identify opportunities and risks and act on them more quickly.
With advanced analytics, users across the business can make better,
faster decisions based on comprehensive and credible data.
Intel and Microsoft are uniquely positioned to help businesses grow
their analytics capability. SQL Server 2012, combined with Microsoft
business intelligence (BI) tools and Intel® processor-based servers,
laptops, and tablets, provides a flexible and affordable foundation for
extending advanced analytics throughout the business (Figure 4).
Self-Service Analytics with Integrated Security and Control
With Microsoft SQL Server 2012, business users can take advantage of familiar tools, such as Microsoft Excel* and Microsoft SharePoint,* to access and analyze massive volumes of data and share their insights with other users across the business. There is no need for specialized expertise in data analysis, and recent innovations, such as PowerPivot and Power View, enable speed-of-thought analytics and interactive data presentations, which provide users with a simpler and more immersive experience.[11]
Intel processor-based PCs, laptops, and tablets provide the performance needed to make the most of these tools. They also offer advanced security technologies, such as hardware-accelerated encryption, two-factor authentication, and automatic lock-downs for lost or stolen laptops. When combined with the access controls built into Windows Server 2012, these security enhancements empower IT to deliver wider access to business-critical data and insights—with less risk and improved compliance. SQL Server 2012 provides additional support for access controls, combining high levels of granularity with centralized management based on Active Directory* and SharePoint security models.
SQL Server 2012 also provides user-defined server roles for enhanced administrative security. Authorized IT administrators can use dashboards and familiar tools to monitor end-user activity, data source usage, and server metrics. This allows them to regulate the analytics environment as needed to ensure security, compliance, performance, and reliability, without disrupting the end-user experience.
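As a small illustration of the user-defined server roles mentioned above (the role and group names are hypothetical), a monitoring role limited to reading server-state metrics might be created as:

```sql
-- A custom server role whose members may read server metrics but change nothing
CREATE SERVER ROLE MonitoringRole;
GRANT VIEW SERVER STATE TO MonitoringRole;

-- Add an (example) Windows group to the role
ALTER SERVER ROLE MonitoringRole ADD MEMBER [CONTOSO\OpsTeam];
```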
Integrated Analytics across the Enterprise
SQL Server and Intel® architecture provide a common information
platform that can be extended across the enterprise and configured
flexibly at each point to match specific workload requirements—all
the way from core transactional systems, to enterprise-scale data
warehouses, to departmental data marts and end-user access points.
The consistency of the platform offers important benefits, enabling centralized staff to maintain unified management, security, and compliance models so they can deliver greater capability with less risk and reduced costs (see the sidebar, Higher Value through Tight Integration).
Driving Real-World Business Success: Insurance

Scaling Accenture Duck Creek to support nearly 70,000 transactions per day
Accenture, Microsoft, and Intel worked together to test the
performance and scalability of Accenture Duck Creek insurance
policy-administration software for Commercial Package Policies
(CPPs) running on Microsoft SQL Server* and servers based
on the Intel® Xeon® processor E7 family and the Intel® Xeon®
processor E5 family. At peak production, the joint solution was
able to process approximately $142 million in premium per hour
for a simulated user community of 20,000, with policy-page
response times of less than 2 seconds for average-size and
large-size policies.
The system also demonstrated tremendous scalability, with new business premium tripling from $8.55 million per hour to $25.19 million per hour when the workload and number of processing servers were tripled. With this level of performance and scalability,
the combined solution can clearly meet the requirements of very
large tier-1 insurance carriers.
For more information, read the complete report. http://blogs.technet.com/b/sql_server_isv/archive/2013/02/16/proven-performance-for-accenture-duck-creek-policy-administration-commercial-lines-on-sql-server-2012.aspx
SQL Server 2012 also provides a common BI Semantic Model across all enterprise analytics requirements. Self-service analytics based on new tabular data models can be integrated with more advanced capabilities based on traditional multidimensional data models. With this integration, data analysts and IT professionals can extend end-user analyses to deliver corporate-grade reports using more sophisticated tools. They can also amplify traditional insights with integrated support for social networking concepts, text analytics, spatial data, and streaming data analytics.
Flexible Data Warehouse Solutions
A flexible and scalable data warehouse provides the foundation for high-quality analytics by enabling diverse data sets across the enterprise to be cleansed, integrated, and stored in a scalable repository specifically optimized for queries. SQL Server 2012 and servers based on Intel Xeon processors provide complete support for data warehousing, with no costly add-ons and with advanced features for managing data and accelerating query performance.
• Fast analytics across massive data sets. The In-Memory ColumnStore Index in SQL Server 2012 improves analytics performance by as much as 10x to 100x by allowing columnar tables, which are far more efficient for many queries, to be built on top of traditional row-based tables. By building these columnar tables in memory, data can be accessed much faster than from disk. The large and efficient cache and memory subsystems of the Intel Xeon processor E7 family are ideal for this in-memory processing strategy, enabling large and complex datasets to be processed more quickly.
• Built-in support for all data types. SQL Server 2012 lets database administrators store and manage complex data types in a variety of ways. Binary large objects (BLOBs) can be stored within the database, or they can be stored on remote file systems and managed as if they were in the database. This flexibility makes it easier to build rich and innovative applications without paying premiums for alternative database solutions or high-end storage systems. All data types are treated as first-class citizens. They can even be protected by AlwaysOn Availability Groups to provide mission-critical service levels.
• Integration with Apache Hadoop.* Businesses are implementing Apache Hadoop to store and analyze up to petabytes of data in near real time through massively parallel processing across large numbers of low-cost servers. Microsoft will offer HDInsight*, a 100 percent Apache-compatible Hadoop distribution that is highly optimized for Intel Xeon processor-based servers. Licensed users of SQL Server 2012 can also download bi-directional Apache Hadoop connectors at no cost. With integrated support for Microsoft management and security tools and frameworks, HDInsight provides significant advantages in comparison with many alternative big data solutions. IT organizations can also take advantage of the Microsoft BI Platform to provide a simpler, more immersive, and more interactive user experience.
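The columnstore index from the first bullet above is created with ordinary index DDL. A minimal sketch on a hypothetical fact table; note that in SQL Server 2012 a table becomes read-only while it carries a columnstore index, so the index is typically dropped and rebuilt as part of a scheduled data load:

```sql
-- Nonclustered columnstore index over the columns queries aggregate most
-- (SQL Server 2012 supports nonclustered columnstore indexes only)
CREATE NONCLUSTERED COLUMNSTORE INDEX IX_FactSales_ColumnStore
ON dbo.FactSales (OrderDateKey, ProductKey, SalesAmount, Quantity);
```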
Fast, Low-Risk Deployment Models
Designing and deploying a data warehouse from the ground up can
be a complex process entailing high upfront costs, long timelines, and
considerable risk. Validated data warehouse reference architectures
based on SQL Server 2012 and Intel Xeon processor-based servers
provide complete hardware and software templates to simplify and
speed deployment, while delivering higher and more predictable
performance. IT organizations can build their own systems using
these reference architectures or they can take advantage of
production-ready appliances from third-party vendors.
Higher Value through Tight Integration
Adding new functionality to a highly complex IT environment can
be both complex and costly, especially when it requires patching
together diverse hardware and software solutions from multiple
vendors. Gaps in capability and interoperability must be
identified and remediated to ensure the new solution will perform
reliably, securely, and as expected.
Microsoft and Intel offer a high-value alternative. SQL Server*
2012, Windows Server* 2012, and System Center* 2012 provide
a tightly integrated foundation for enterprise computing, and all
are highly optimized for performance, energy-efficiency,
manageability, and security on Intel® processor-based servers and clients.
The combined platform enables IT to seamlessly build, deploy,
and manage applications across multiple sites and all deployment
models—including physical, virtual, and cloud—with policy-based
monitoring and deep insight into application performance
and health.
For more information, read the Microsoft white paper: http://download.microsoft.com/download/1/2/4/124ACD46-1E7A-4EC5-BAE3-B940096E9AAA/SQL_Server_2012_Windows_Server_2012_System_Center_2012_Better_Together_White_Paper.pdf
White Paper: More Capability and Higher Value for Mission-Critical Databases
Options include departmental-sized data warehouses based on the
SQL Server 2012 Fast Track Reference Architecture and massive,
enterprise-class data warehouses based on the Microsoft Parallel
Data Warehouse Reference Architecture. Using these reference
architectures, businesses can establish a hub-and-spoke environment that
allows them to maximize overall business agility, while simultaneously
maintaining a consistent enterprise analytics platform to control data
quality and establish a single view of the truth.
Cloud-Ready for Elastic Scalability
The elastic scalability and inherent resilience of cloud computing are
particularly valuable for supporting mission-critical databases. A cloud
architecture provides a more agile and efficient foundation for scaling
workloads and data volumes. Yet databases have, until recently, been
among the most difficult applications to virtualize and migrate into a
private or public cloud infrastructure.
Scalable Virtualization for Tier-1 Workloads
Windows Server 2012, SQL Server 2012, and Intel Xeon processor-based
servers simplify the move toward virtualization and cloud computing for
all workloads, including large databases with demanding storage and
networking requirements. Windows Server 2012 Hyper-V is optimized
for Intel® Virtualization Technology,12
which provides hardware-assists
for core virtualization functions throughout the server platform. This
newer version of Hyper-V also provides dramatic improvements in
scalability versus the prior generation, with support for up to
64 virtual CPUs and 1 TB of RAM per virtual machine (Table 1).
Enterprise Strategy Group (ESG) ran a series of tests to verify
performance and scalability for SQL Server running in virtual machines on
physical servers based on the Intel Xeon processor E7 family. The
results showed a 6x performance improvement versus the previous
version of Hyper-V for a tier-1 online transaction processing (OLTP)
workload, with a 5x improvement in average transaction response
time (Figure 5). Performance scaled linearly up to 64 vCPUs, at which
point the platform was able to support 797 transactions per second
with an average response time of 0.10 seconds.13 With this level of
scalability, the great majority of SQL Server implementations can
now be successfully virtualized.14
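Those two figures are consistent under Little's law (in-flight work = throughput × response time), which is a useful sanity check when sizing virtualized database tiers. A quick back-of-the-envelope in Python follows; the throughput and response-time values are the ESG results quoted above, while the concurrency figure is derived, not reported:

```python
# ESG-reported results at 64 vCPUs (quoted in the text above)
throughput_tps = 797    # transactions per second
avg_response_s = 0.10   # average transaction response time in seconds

# Little's law: average in-flight transactions = arrival rate * response time
in_flight = throughput_tps * avg_response_s  # roughly 80 concurrent transactions
```

At roughly 80 transactions in flight at any instant, the single VM is handling genuinely concurrent tier-1 load rather than a serialized trickle of requests.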
Table 1. Windows Server* 2012 Hyper-V* provides dramatic increases in physical and virtual scalability.

System Resource            SQL Server* 2008 R2 on     SQL Server* 2012 on
(Maximum Quantity)         Windows Server* 2008 R2    Windows Server* 2012
Physical Host
  Logical processors       64                         320
  Physical memory          1 TB                       4 TB
  Virtual processors       512                        1,024
Virtual Machine
  Virtual processors       4                          64
  Memory                   64 GB                      1 TB
  Active VMs               384                        1,024
  Virtual disk size        2 TB                       64 TB
Cluster
  Nodes                    16                         64
  VMs                      1,000                      8,000
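The Table 1 limits also bound how many maximum-size VMs a single host can run. The short Python check below derives that bound from the table values; the result is simple arithmetic, not a published Microsoft figure:

```python
# Windows Server 2012 Hyper-V limits from Table 1
host_virtual_processors = 1024   # virtual processors per physical host
host_physical_memory_tb = 4      # TB of physical memory per host

vm_virtual_processors = 64       # virtual processors per VM (maximum)
vm_memory_tb = 1                 # TB of memory per VM (maximum)

# How many maximum-size VMs fit on one host, by each resource:
by_vcpu = host_virtual_processors // vm_virtual_processors   # 16
by_memory = host_physical_memory_tb // vm_memory_tb          # 4

# Memory is the binding constraint for full-size VMs on this host.
max_full_size_vms = min(by_vcpu, by_memory)
```

In practice most VMs use far fewer resources than the maximum, which is why the platform supports up to 1,024 active VMs per host.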
Driving Real-World Business Success:
Product Lifecycle Management
Scaling Siemens Teamcenter* to support 10,000 users
Teamcenter provides a comprehensive portfolio of end-to-end
product lifecycle management (PLM) solutions used by companies
around the globe to deliver world-class products more quickly and
efficiently. To validate performance and scalability, Intel, Microsoft,
and Siemens PLM Software performed benchmark tests for
Teamcenter 8.3 using SQL Server* 2012 running on a server
based on the Intel® Xeon® processor E7 family.
The platform scaled easily to support 10,000 concurrent users,
with linear growth in resource usage and a weighted average
response time on par with some of the fastest results ever
measured in the Siemens Automated Performance Analysis facility.
For more information, read the complete report. http://download.
microsoft.com/download/6/A/7/6A7D9A40-8944-4ACB-BD8D-
91D97E3BACF7/SQL_Server_2012_Datasheet_Benchmark
Performance_Apr2012.pdf
Advanced Automation for Higher Value
With the addition of Windows Server 2012 and Microsoft System
Center 2012 SP1, SQL Server 2012 and servers based on the Intel
Xeon processor E7 family provide the foundation for hosting
mission-critical database applications in private cloud environments, with
support for secure resource sharing and comprehensive infrastructure,
application, and traffic monitoring. System Center provides a
consistent and unified set of tools for deploying and managing applications,
workloads, and infrastructure across multiple sites and across physical,
virtual, and cloud computing models. In combination with mainstream
servers based on the Intel Xeon processor E5 family, this optimized
cloud platform can be extended across your data center to improve
agility and cost models, and to help solve some of your toughest data
center challenges (see the sidebar, Tough Challenges, Extraordinary
Opportunities).
Figure 5. Independent tests by Enterprise Strategy Group demonstrated linear scalability up to 64 vCPUs for SQL Server* 2012 on Intel® Xeon® processor-
based servers, with average transaction response times as low as 0.10 seconds for a tier-1 OLTP workload.
[Figure 5 chart: "Virtualize Tier-1 Workloads with Confidence" (Windows Server* 2012, SQL Server* 2012, single VM, 64 GB of RAM). X-axis: number of virtual CPUs (4, 8, 16, 32, 64); left y-axis: transactions/sec (0 to 900); right y-axis: average transaction response in seconds (0.0 to 1.0).]
Tough Challenges, Extraordinary
Opportunities
Transform Your Data Center—One Server at a Time
Pressures are mounting for IT organizations. Workloads and data
sets continue to grow rapidly and new functionality must be
integrated almost continuously to keep pace with new business
demands. Microsoft Windows Server* 2012 and servers based
on the Intel® Xeon® processor E5 family provide extensive new
capabilities for addressing these fast-growing IT requirements.
The combined platform can help you expand at lower cost by
transitioning to unified 10 Gb networking and to new storage
technologies that let you address high-end requirements using
cost-effective file servers. It can also help you consolidate more
workloads onto fewer and more energy-efficient servers. Most
importantly, it provides a foundation for moving incrementally
and non-disruptively toward next-generation cloud functionality
that can fundamentally improve the efficiency and agility of your
IT infrastructure.
For more information, see the Intel white paper, “Game-changing
Capability for Your Data Center—and a Solid Foundation for Your
Cloud” (http://software.intel.com/sites/billboard/sites/default/files/downloads/Intel_MSFT_white%20paper_8%2015%2012_secured.pdf).
Looking Ahead: End-to-End Data Integration
Database and analytics technologies are evolving rapidly to
accommodate the ongoing explosion in the volume, variety, and velocity of
business data. However, integrating new capabilities can be complex,
requiring specialized software and hardware, as well as new
development, management, and security strategies.
Intel and Microsoft are delivering advanced capabilities in a fully
integrated software environment optimized to run on affordable,
industry-standard servers based on Intel Xeon processors. There is
no need for separate software components or specialized appliances
that have to be integrated into existing environments—and IT
organizations can take advantage of existing development, management, and
security tools and skills. It’s a simpler, faster, and less risky way to
implement transformative new capabilities. Some of the most important
near-term advances are described below.15
Major Performance Gains for Transactional Databases
As already discussed, Microsoft SQL Server 2012 delivers 10x to 100x
faster performance for many analytics queries through its In-Memory
ColumnStore Index. The next major release of SQL Server will include
in-memory OLTP processing, which will provide comparable
performance gains for transactional applications.
Integrated Analytics for All Data Types
Microsoft has announced the release of the Microsoft SQL Server
Parallel Data Warehouse v2 (PDW v2) for the first half of 2013. This
new release will provide full support for SQL Server 2012, so it will
incorporate the advanced analytics capabilities already discussed in
this paper. It will also include a new feature called PolyBase, which
will enable federated querying against both SQL Server 2012 and
HDInsight Server.
Using PolyBase, customers will be able to perform queries that
operate simultaneously across their relational and non-relational
data. This ground-breaking capability will help IT organizations unify
analytics across all data types, without the complexity, delays, and
administrative overhead of moving and integrating data.
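Conceptually, a federated query engine normalizes rows from each store to a common shape and evaluates a single predicate over the union. The Python sketch below illustrates that idea only; PolyBase itself is driven through T-SQL external tables, and the sample customers and revenue figures are invented:

```python
import json

# Hypothetical relational rows (as if read from SQL Server) and
# non-relational JSON records (as if read from an HDFS store).
relational = [
    {"customer": "Contoso", "revenue": 120000},
    {"customer": "Fabrikam", "revenue": 45000},
]
non_relational = [
    '{"customer": "Adventure Works", "revenue": 98000}',
]

def federated_query(predicate):
    # Normalize both sources to a common shape, then evaluate one
    # predicate over the combined row set.
    rows = relational + [json.loads(r) for r in non_relational]
    return [r["customer"] for r in rows if predicate(r)]

big_accounts = federated_query(lambda r: r["revenue"] > 90000)
```

The point of federation is exactly this single query surface: the caller never stages or moves the non-relational data into the relational store before analyzing it.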
A More Productive End-User Experience
A key benefit of the Intel and Microsoft information platform is the
ease with which end-users can access, analyze, and share information
using their most familiar tools. In the upcoming release of Microsoft
Office* 2013, PowerPivot and Power View will be included as fully
integrated features in Excel.16 This integration will simplify
implementation and management for IT. End-users will enjoy simpler access to
powerful data exploration and visualization tools—directly from the
Excel ribbon menu. In combination with the advanced performance,
visualization, and security technologies of Intel processor-based
laptops and PCs, every user will be able to access, analyze, and share
enterprise information with greater ease and flexibility and with
reduced risk.
Conclusion
The race to harness the rising flood of enterprise data and turn it into
a sustainable competitive advantage is well underway. Microsoft SQL
Server 2012 running on Intel Xeon processor-based servers provides
a uniquely powerful, flexible, and affordable platform for addressing
the full range of enterprise needs—without costly add-ons and with a
consistent and integrated hardware and software architecture across
the enterprise information infrastructure.
The combined platform offers world-class performance for tier-1
transactional workloads, along with advanced reliability and security
features for maintaining compliant, always-on operation in
mission-critical environments. Just as importantly, it includes integrated support
for analytics, big data integration, and cloud extensibility, so IT
organizations have better and simpler options for managing the growing
flood of business data and turning it into a competitive advantage.
Additional Resources
Product Information
• Intel® Xeon® processor E7-8800/4800/2800 product families:
www.intel.com/content/www/us/en/processors/xeon/
xeon-processor-e7-family.html
• SQL Server 2012: www.microsoft.com/sqlserver
Mission-Critical Computing
• SQL Server 2012: www.microsoft.com/en-us/sqlserver/
solutions-technologies/mission-critical-operations.aspx
• Intel Xeon processor-based servers and solutions:
www.intel.com/content/www/us/en/mission-critical/mission-critical-
meeting-todays-it-challenges.html
Data Analytics
• Intel IT Center—Big Data: www.intel.com/content/www/us/en/
big-data/big-data-analytics-turning-big-data-into-intelligence.html
• Microsoft BI and Analytics Solutions: www.microsoft.com/en-us/
sqlserver/solutions-technologies/business-intelligence/big-data.aspx
Technical Information
• Intel IT Center: Modernize Your Mission Critical Data Center:
www.intel.com/content/www/us/en/mission-critical/mission-critical-
meeting-todays-it-challenges.html
• Official Microsoft SQL Server Blog: blogs.technet.com/b/
dataplatforminsider/
• Microsoft Technet Blog: technet.microsoft.com/en-us/sqlserver/
bb265254
Appendix: Technologies at a Glance
SQL Server 2012 and the Intel Xeon processor E7 family include many new features and enhancements versus previous-generation solutions, and
only some of them are discussed in this white paper. Tables A1 and A2 provide more comprehensive lists. Table A3 (on page 15) provides guidance
for choosing between servers based on the Intel Xeon processor E7 family and the Intel Xeon processor E5 family.
For more complete and detailed information, visit the Microsoft website at www.microsoft.com/en-us/sqlserver/product-info.aspx and the
Intel website at www.intel.com/content/www/us/en/processors/xeon/xeon-processor-e7-family.html.
Table A1. Overview of Microsoft SQL Server* 2012 capabilities

High Availability (HA)
• HA for Microsoft StreamInsight*
• Reliable integrated failover detection
• Application-centric failover
• Multiple, readable secondaries
• Online operations
• Microsoft SQL Server* AlwaysOn
• Windows Server* core support

Scalability and Performance
• Up to 15K partitions per table
• xVelocity in-memory columnstore index
• Fast FILESTREAM
• Fast full-text search
• Fast spatial performance
• DBC and PDW appliances
• Resource Governor enhancements

Security and Manageability
• Contained Database Authentication
• User-defined server roles
• Distributed replay
• Audit enhancements
• Management Pack for System Center
• Backup secondaries
• Default schema for Windows* group
• Active Directory with Microsoft SharePoint* for SSR
• SQL Server Management Studio enhancements

Business Intelligence
• BI semantic model
• Power View
• xVelocity in-memory compression algorithms
• Alerting
• PowerPivot enhancements
• ODBC for Linux*
• SQL Server Data Tools

Beyond Relational
• FileTable
• Statistical semantic search
• Full Globe Spatial Support
• DAC enhancements

Web and Breadth, and EIM
• ODBC for Linux*
• PHP driver
• LocalDB runtime
• UTF-16
• Paging results sets
• JDBC 4.0 driver
• MDS add-in for Excel*
• Data quality services
• Enhanced MDS
• SSIS server
Table A2. Overview of advanced Intel® server technologies in the Intel® Xeon® processor E7 family

Scalable, Energy-Efficient Performance

Intel® Turbo Boost Technology7
  Description: Increases processor frequencies beyond rated values to take
  advantage of power and thermal headroom.
  Benefit: Delivers peak performance for heavy workloads without increasing
  power consumption for lighter workloads.

Intel® Intelligent Power Technology17
  Description: Dynamically conserves power and enables advanced power
  management at the rack, group, and data center levels.
  Benefit: Helps to reduce operating costs and improve system reliability
  through dynamic power optimization and policy-based power management.

Intel® Hyper-Threading Technology8
  Description: Doubles the number of execution threads that can be supported
  by each processor core.
  Benefit: Increases processing efficiency for multi-threaded applications
  and for multiple simultaneous tasks.

Advanced Reliability, Availability, and Serviceability (RAS)

Machine Check Architecture (MCA) Recovery
  Description: Enables operating systems and applications to participate in
  correcting and recovering from system errors.
  Benefit: Provides the foundation for robust, mission-critical error
  management to sustain higher levels of data integrity and system uptime.

Hardware-Enhanced Security

Intel® Advanced Encryption Standard New Instructions (Intel® AES-NI)18
  Description: Seven new instructions help to accelerate compute-intensive
  steps of the AES encryption and decryption algorithms.
  Benefit: Improves performance and reduces overhead, so encryption can be
  implemented pervasively to protect data and transactions.
Table A3. High-level criteria for choosing the Intel® Xeon® processor E7 family

IT Requirement: Reliability, Data Security
  Intel® Xeon® Processor E5 Family: Good
  Intel® Xeon® Processor E7 Family: Better

IT Requirement: Performance, Memory, and I/O Scalability
  Intel® Xeon® Processor E5 Family:
  • 1-4 sockets
  • 768 GB memory (2 sockets)
  • 1.5 TB memory (4 sockets)
  Intel® Xeon® Processor E7 Family:
  • 2-256 sockets
  • 2 TB memory (4 sockets)
  • 4 TB memory (8 sockets)
  (for larger systems, check with the vendor)

IT Requirement: Targeted Deployment Models
  Intel® Xeon® Processor E5 Family:
  Scale OUT: Rapid, incremental addition of servers to meet business growth
  demands. Pedestal, Rack, Blade
  Intel® Xeon® Processor E7 Family:
  Scale UP: Fewer, more powerful servers with headroom for demanding
  applications, heavier peak periods, and business growth. Rack and Blade