1) National Gypsum was using multiple outdated platforms for document management and collaboration that did not meet compliance requirements. This led to exponential file growth and difficulties in using and managing content.
2) The company implemented SharePoint across several phases to consolidate platforms and provide a single solution for portal, intranet, records management, eDiscovery and collaboration needs.
3) Future plans included extending the SharePoint implementation to more users, building applications, improving search and analytics capabilities, and automating records management and eDiscovery functions.
M Nagasree is seeking an Analyst Programmer position with experience in SAP BW/BI. She has over 2 years of experience as a Software Engineer working on projects involving requirements gathering, implementation, testing, data modeling, extraction, loading, scheduling, monitoring and reporting. Her technical skills include SAP BW 3.5, SAP BI 7.0, Oracle, Windows, C, Java, HTML, and BEx Explorer. Her past project experience includes characteristic expansion, year-end processes, time flags, 3.x to 7.x conversion, and agent subtypes for an insurance client.
Waleed Mohamed Abdel Wahab is an experienced telecommunications engineer seeking a new position. He has over 10 years of experience working for Nokia, Misr Insurance Company, and The International Company for Contracting in roles such as Senior OSS Engineer, Optical Networks Transmission Engineer, IBM Z-series system Administrator, and Power site Engineer. He has extensive technical skills and certifications in areas such as project management, Cisco networking, Nokia OSS systems, and Linux/Windows administration.
IBM Spectrum Scale ECM - Winning Combination (Sasikanth Eda)
This presentation describes various deployment options to configure IBM enterprise content management (ECM) FileNet® Content Manager components to use IBM Spectrum Scale™ (formerly known as IBM GPFS™) as back-end storage. It also describes various IBM Spectrum Scale value-added features with FileNet Content Manager to facilitate an efficient and effective data-management solution.
The document provides a summary of Prasenjit Chowdhury's experience and qualifications. He has over 12 years of experience in information technology, including as an Exadata administrator. He has skills in Oracle database administration, Exadata administration, and backup and recovery. His objective is to work as a customer-focused solution architect utilizing skills in technologies like Teradata and Hadoop.
FAQ on developing and deploying applications on MACH11 (Informix Dynamic Serv...) (Keshav Murthy)
MACH11 technology in IDS v11 brings unique opportunities to deploy applications in a scalable and continuously available system. Applications need to be aware of the available isolation levels, manage transactions appropriately, and handle failover situations effectively. This talk focuses on understanding the MACH11 environment and the hooks applications can exploit, and also covers API enhancements for exploiting MACH11 technology.
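Application-side failover handling of the kind described above can be sketched generically. Below is a minimal illustration in Python: the node list and the injected connect factory are hypothetical placeholders, and a real MACH11 deployment would rely on its Informix client driver and Connection Manager rather than this hand-rolled loop.

```python
import time

CLUSTER_NODES = ["primary.example.com", "secondary.example.com"]  # hypothetical hosts

def run_with_failover(operation, connect, max_attempts=3, backoff_s=1.0):
    """Run a transactional operation, moving to the next node on failure."""
    last_error = None
    for attempt in range(max_attempts):
        for node in CLUSTER_NODES:
            try:
                conn = connect(node)          # injected, driver-specific factory
                try:
                    result = operation(conn)  # caller's reads/writes
                    conn.commit()             # transactions managed explicitly
                    return result
                finally:
                    conn.close()
            except Exception as exc:          # catch driver-specific errors in practice
                last_error = exc
        time.sleep(backoff_s * (attempt + 1))  # back off before sweeping nodes again
    raise last_error
```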
Planning your move to the cloud: SaaS Enablement and User Experience (Oracle ...) (Lucas Jellema)
IT organizations face many challenges when integrating cloud applications with existing on-premises applications, and keeping a cohesive user interface is among the top challenges. You want content from one application displayed in another, consolidated views, easy navigation between apps, and a consistent user experience for all. This session highlights a number of Oracle tools and best practices to help you find your path to the cloud.
This presentation focuses on the inevitable journey to the cloud and up the stack, the advent of (a plethora of) SaaS applications, and the challenges around integrating these applications at the data & event level and at the User Experience level. The key questions and challenges are identified, a number of cases are illustrated, and the key pieces of the Oracle PaaS portfolio for dealing with these challenges are highlighted.
Best practices for application migration to public clouds - Interop presentation (esebeus)
Best Practices for Application Migration to Public Clouds
Talk given at Interop May, 2013.
Whether you are thinking of migrating 1 application or 8000 applications to the cloud, the odds of success increase if best practices are followed. Do you know what those best practices are?
As hustler Mike McDermott said in the 1998 poker movie Rounders, “If you can't spot the sucker in the first half hour at the table, then you ARE the sucker.”
Anyone with a credit card can sit down at the table and try to move applications to public clouds. Those who want to succeed study and learn from consistent winners. There are some hands to fold, some to play cautiously, and some to play aggressively.
This session covered best practices from helping 15 Fortune 1000 companies successfully migrate to cloud solutions.
Who should attend?
Anyone who wants to improve their odds of successfully migrating applications to public clouds.
Key Takeaways
• What are the key business considerations to address prior to migration?
• Which application workloads are suitable for public clouds?
• Which applications to replatform? Which to refactor?
• What are key considerations for replatforming and refactoring?
• What are key cloud application design concepts?
Webinar: Leveraging New Technologies with Migration (panagenda)
Webinar Recording: http://ow.ly/vvRG30gxDnS
Whether you’re dealing with cloud, browser clients, Microsoft Outlook, Office 365 Cloud, SharePoint or something else – panagenda offers software solutions and the necessary expertise for all phases of your migration project. Learn how to identify the business value of your existing infrastructure, what is worth your time and money to migrate, how to identify dependencies, and other important migration steps to ensure a smooth – and on-budget – transformation. Join us to hear how to automate client-side changes, ranging from analyzing local archives and removing migrated applications to removing IBM Notes, decrypting email, and setting up Outlook or Office 365 Cloud.
Find out how to maintain control over servers during and after your project, and how to test server transformation without compromising SLAs. Learn how to analyze your security infrastructure so you can adapt your IBM Notes and Domino landscape in manageable steps while leveraging new technologies.
Oracle Data Integrator is an ETL tool that has three main differentiators: 1) It uses a declarative, set-based design approach which allows for shorter implementation times and reduced learning curves compared to specialized ETL skills. 2) It can transform data directly in the existing RDBMS for high performance and lower costs versus using a separate ETL server. 3) It has hot-pluggable knowledge modules that provide a library of reusable templates to standardize best practices and reduce costs.
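To make the declarative, set-based, in-database transformation idea concrete, here is a minimal sketch (not ODI-generated code) using Python's built-in sqlite3 module as a stand-in for the target RDBMS: a single INSERT...SELECT performs the transform inside the database engine rather than row by row on a separate ETL server.

```python
import sqlite3

# SQLite stands in here for the existing target RDBMS.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL, status TEXT);
    CREATE TABLE dw_orders  (id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0, 'OK'), (2, -5.0, 'BAD'), (3, 7.5, 'OK');
""")

# One declarative, set-based statement runs inside the database engine,
# instead of streaming rows through an intermediate ETL server.
conn.execute("""
    INSERT INTO dw_orders (id, amount)
    SELECT id, amount FROM src_orders WHERE status = 'OK' AND amount > 0
""")
print(conn.execute("SELECT * FROM dw_orders").fetchall())  # [(1, 10.0), (3, 7.5)]
```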
Worldwide Scalable and Resilient Messaging Services by CQRS and Event Sourcin... (DataWorks Summit)
ChatWork is one of the major business communication platforms in Japan. We have kept growing for more than five years since the service's inception. We now serve 110k+ customer organizations, including large ones such as telecom companies, and the service is used across 200+ countries and regions.
Recently we have faced a drastic increase in message traffic, but our conventional backend was based on a traditional LAMP architecture. Transforming it into a highly available, scalable and resilient backend was imperative.
To achieve this, we placed "Command Query Responsibility Segregation" (CQRS) and Event Sourcing at the heart of the architecture. The simple idea of segregation gives us independent command-side and query-side system components, which in turn yields highly available, scalable and resilient systems. This is a desirable property for a messaging service: even if the command side goes down, users can keep reading messages as long as the query side stays up. Event Sourcing is another key technique, enabling us to build systems optimized for heterogeneous write/read loads, which means we can choose an optimized storage platform for each side. Moreover, the event data is a rich source for real-time analysis of users' communication behavior. We chose Kafka as the command-side event store, HBase as the query-side store, and Kafka Streams as the core library providing eventual consistency between the two sides. In the application layer, we chose Akka as the core framework; Akka is a good abstraction layer for building highly concurrent, distributed, resilient, message-driven applications effectively. Backpressure, as provided by Akka Streams, is an important technique for preventing overflow in our backend data flows and contributes greatly to system stability.
In this session, we talk about how the above architecture works, how we reached these architectural decisions among many trade-offs, what the architecture achieved, what the pain points were (e.g., how to guarantee eventual consistency, how to migrate systems in a real project), and several tips we learned while building our highly distributed and resilient messaging systems.
ChatWork is a business communication platform for global teams. Our four main features are enterprise-grade group chat, file sharing, task management and video chat. NTT DATA is one of the biggest solution providers in Japan, providing technical support for Open Source Software and distributed computing. The project was conducted in cooperation between ChatWork and NTT DATA.
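As a rough, self-contained sketch of the command/query split described above: in the real architecture the append-only log is Kafka, the read model lives in HBase, and Kafka Streams keeps the two sides eventually consistent; here plain in-memory Python structures stand in for all three.

```python
from collections import defaultdict

event_log = []                    # command side: append-only source of truth (Kafka's role)
read_model = defaultdict(list)    # query side: messages indexed by room (HBase's role)

def post_message(room, user, text):
    """Command handler: only appends an event, never touches the read model."""
    event_log.append({"type": "MessagePosted", "room": room,
                      "user": user, "text": text})

def project(events):
    """Projector: rebuilds the read model from events (Kafka Streams' role)."""
    for e in events:
        if e["type"] == "MessagePosted":
            read_model[e["room"]].append((e["user"], e["text"]))

post_message("room-1", "alice", "hello")
post_message("room-1", "bob", "hi!")
project(event_log)                # the projection step is where eventual consistency lives
print(read_model["room-1"])       # reads keep working even if the command side pauses
```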
Sudhir Gajjela has over 3 years of experience as a Big Data Hadoop Administrator and Informatica Administrator. He has expertise in architecting, building, supporting and troubleshooting both Cloudera and Hortonworks Big Data clusters as well as various Informatica tools. He has also worked as an Informatica Developer. His skills include Hadoop services like HDFS, Hive, HBase, Zookeeper, Flume, Sqoop, Oozie and Storm as well as Informatica tools such as PowerCenter, PowerExchange, Data Quality, MDM, Cloud, BDE and BDM. He has delivered training sessions to colleagues on topics such as Big Data, Hadoop, In
OpenITSM - IT Service Management with Open Source (Julian Hein)
Building an infrastructure for modern systems management can be a very expensive project - with commercial software. But Open Source can be an alternative, because today tools are available for all the important service management processes. The talk gives an overview of the most mature tools for Incident & Problem Management, Event Management, Operations Management, Service Desk and CMDB.
David Kragness has over 20 years of experience as an Oracle DBA. He has experience managing Oracle databases from versions 8i to 12c. He has expertise in database administration, performance tuning, high availability, and disaster recovery. Kragness has worked with various organizations in industries such as manufacturing, retail, and election systems. He provides database support and assists development teams with data modeling.
Living objects network performance_management_v2 (Yoan SMADJA)
LivingObjects provides network management software solutions to telecommunications companies. It was originally developed for SFR, a major French telecom provider, and has since been commercialized as a generic product. The software suite helps technicians optimize network performance and quality of service for fixed and mobile networks through data collection, processing, and visualization tools. LivingObjects has 35 employees and is headquartered in Toulouse, France.
DBCS Office Hours - Modernization through Migration (Tammy Bednar)
Speakers:
Kiran Tailor - Cloud Migration Director, Oracle
Kevin Lief – Partnership and Alliances Manager - (EMEA), Advanced
Modernisation of mainframe and other legacy systems allows organizations to capitalise on existing assets as they move toward more agile, cost-effective and open technology environments. Do you have legacy applications and databases that you could modernise with Oracle, allowing you to apply cutting edge technologies, like machine learning, or BI for deeper insights about customers or products? Come to this webcast to learn about all this and how Advanced can help to get you on the path to modernisation.
AskTOM Office Hours offers free, open Q&A sessions with Oracle Database experts. Join us to get answers to all your questions about Oracle Database Cloud Service.
This document compares different ETL (extract, transform, load) tools. It begins with introductions to ETL tools in general and four specific tools: Pentaho Kettle, Talend, Informatica PowerCenter, and Inaplex Inaport. The document then compares the tools across various criteria like cost, ease of use, speed, and connectivity. It aims to help readers evaluate the tools for different use cases.
In this presentation Yekasa Kosuru talks about the challenges associated with Big Data at Nokia. As well as discussing the challenges, Kosuru talks through the solutions Nokia uses across its different platforms. The solutions are broken into phases, which Kosuru walks through in detail with the help of stats and flow charts.
Terry Hendrickson has over 15 years of experience as a software/database developer with expertise in SQL Server, Oracle, Visual Basic, VB.NET, T-SQL, PL/SQL, ETL, Crystal Reports, and other technologies. He has worked on projects involving database development, reporting, ETL, and application development for companies across various industries. Hendrickson holds an Associate's Degree in Computer Information Systems and Business Management.
Oracle Real Application Clusters (RAC) Roadmap for New Features describes and discusses best practices for new features introduced with Oracle RAC 12c as well as Oracle RAC 18c and provides a short outlook of the road ahead.
This document discusses object storage and EMC's object storage solutions. It begins with an overview of how traditional storage is becoming inadequate to handle growing unstructured data and the advantages of object storage. Key characteristics of object storage like scalability, geo-distribution and support for large files are described. Example use cases that can benefit from object storage like global content repositories and IoT data collection are provided. The document then discusses EMC's object storage offerings like ECS and how they address the needs of these use cases through scalability, various access protocols and geo-distribution. It also covers EMC's HDFS data service and how it can address limitations of traditional HDFS.
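Since object stores such as ECS expose data through standard access protocols, client access can be illustrated with an S3-style sketch. This is a minimal example assuming an S3-compatible endpoint and using boto3; the endpoint URL, bucket name, and credentials are hypothetical placeholders.

```python
import boto3

# Point the standard S3 client at the object store's S3-compatible endpoint.
s3 = boto3.client(
    "s3",
    endpoint_url="https://ecs.example.com",   # hypothetical endpoint
    aws_access_key_id="ACCESS_KEY",           # hypothetical credentials
    aws_secret_access_key="SECRET_KEY",
)

# Objects are addressed by bucket and key rather than file paths or LUNs.
s3.put_object(Bucket="content-repo", Key="docs/report.pdf", Body=b"...")
obj = s3.get_object(Bucket="content-repo", Key="docs/report.pdf")
print(obj["Body"].read())
```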
AMIS and Oracle JET - Oracle OpenWorld 2017 Panel on JET (Lucas Jellema)
This presentation provides a brief overview of some of the activities around Oracle JET at AMIS. This presentation was prepared for the Oracle OpenWorld 2017 conference - for the JET Panel Session - SUN4389
Oracle JavaScript Extension Toolkit is a modern architecture for front-end JavaScript development. Wherever an Oracle organization uses JavaScript for web or mobile development, they use Oracle JavaScript Extension Toolkit to solve common enterprise problems. However, what are organizations outside Oracle doing with Oracle JavaScript Extension Toolkit? In this session, meet representatives from organizations where Oracle JavaScript Extension Toolkit is used, let them show you where they're using it and what their mix of technology, architectures, and clouds is, and discuss with them the applicability of Oracle JavaScript Extension Toolkit in your context. Prepare to come to this session with questions!
Laserfiche document management solutions allow special districts to cost-effectively provide government services. It integrates with existing applications, enforces records management policies, automates complex business processes, and supports effective business continuity planning. Laserfiche provides centralized secure storage, allows remote access, simplifies workflows, and ensures long-term accessibility of critical information through portable media publishing.
This document provides a summary of Robert Nase's qualifications and experience in cloud architecture, infrastructure design, and operations management over 20 years. Key areas of expertise include cloud implementations on AWS, Azure, Rackspace; data center design; systems integration; disaster recovery; and virtualization technologies. Recent experience includes developing cloud migration strategies and architectures for various government agencies moving applications to AWS GovCloud.
• 11+ Years of IT Industry experience in Analysis, Design, Development, Maintenance and Support of various software applications mainly in Data Warehousing (Informatica Power Center, OWB, SSIS and Business Objects), Oracle (SQL, PL/SQL) and Teradata in industry verticals like Finance, Telecom, Retail and Healthcare.
• Work experience in client facing roles in UK and Ireland.
• Performed numerous roles in Business Intelligence projects, including Data Warehouse System Analyst, ETL Designer, Onshore Coordinator, Technical Lead and Senior Data Warehouse Developer, with multinational, result-driven IT organizations.
• Extensive experience on Data integration projects accessing sources like Teradata, Oracle and SQL server.
• Created robust EDW solutions from various types of sources such as flat files, XML files, EBCDIC COBOL copybooks from Mainframe systems and DB2 unload files.
• Extensive experience in data discovery and cleansing using Informatica IDQ.
• Resolved Inconsistent and Duplicate Data issues during Data Analysis to Support Strategic EDW Goals.
• Extensive experience of Data Integration using Informatica Power center Tool stack.
• Strong knowledge on Data Warehousing concepts, ETL concepts, Data Modeling, Dimensional Modeling.
• Conducted training on Informatica and received awards for training proficiency.
• Excellent understanding of OLTP and OLAP concepts and expert in writing SQL, Stored procedure on Teradata, Oracle and SQL Server.
• Extensive experience in implementing Data Warehousing methodologies including STAR SCHEMA and SNOW-FLAKE SCHEMAS & 3NF for huge data warehouses.
• Extensive knowledge on Change Data Capture (CDC) and SCD Type 1, Type 2, Type 3 Implementations.
• Excellent understanding of Kimball and Inmon Methodologies.
• Provided leadership when addressing high level technical issues and questions with the functionality of the reporting and business intelligence applications.
• Managed current engineering needs and strategized to foresee and plan for future needs in the Data Integration space.
• Acted as an interface and coordinator between Database Administration, ETL Development, Testing and Reporting teams to eliminate roadblocks and keep information flowing smoothly.
• Hands on experience in tuning ETL mappings, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings, and sessions.
• Expert in designing and developing complicated ETL mappings using Informatica PowerCenter.
• Proficient in resolving performance issues using Informatica PowerCenter and Teradata.
• Experienced in using Teradata utilities (TPT, BTEQ, FastLoad, MultiLoad, FastExport, TPump).
• Experience writing shell scripts as per given requirements.
• Worked extensively with the Teradata GCFR tool.
• Experience in SAP ECC integration with Informatica.
• Trained in Tableau, QlikView and SAP BW 3.5, and completed POCs for the same.
Using Spark Streaming and NiFi for the next generation of ETL in the enterprise (DataWorks Summit)
In recent years, big data has moved from batch processing to stream-based processing, since no one wants to wait hours or days to gain insights. Dozens of stream processing frameworks exist today, and the same trend that occurred in the batch-based big data realm has taken place in the streaming world: nearly every streaming framework now supports higher-level relational operations.
On paper, combining Apache NiFi, Kafka, and Spark Streaming provides a compelling architecture option for building your next-generation ETL data pipeline in near real time. But what does it look like to deploy and operationalize this in an enterprise production environment?
The newer Spark Structured Streaming provides fast, scalable, fault-tolerant, end-to-end exactly-once stream processing with elegant code samples, but is that the whole story?
We discuss the drivers and expected benefits of changing the existing event processing systems. In presenting the integrated solution, we will explore the key components of using NiFi, Kafka, and Spark, then share the good, the bad, and the ugly when trying to adopt these technologies into the enterprise. This session is targeted toward architects and other senior IT staff looking to continue their adoption of open source technology and modernize ingest/ETL processing. Attendees will take away lessons learned and experience in deploying these technologies to make their journey easier.
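As a concrete sketch of the Spark leg of such a pipeline, the following PySpark Structured Streaming job reads from a Kafka topic (which a NiFi flow might feed) and writes to Parquet, with the exactly-once guarantee resting on checkpointing. The broker address, topic name, and paths are hypothetical, and the spark-sql-kafka connector package is assumed to be on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-etl").getOrCreate()

# Source: a Kafka topic, e.g. one that a NiFi flow publishes into.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
          .option("subscribe", "ingest-topic")               # hypothetical topic
          .load()
          .select(col("value").cast("string").alias("payload")))

# Sink: fault-tolerant, end-to-end exactly-once delivery relies on the checkpoint.
query = (events.writeStream
         .format("parquet")
         .option("path", "/data/etl/out")                    # hypothetical paths
         .option("checkpointLocation", "/data/etl/chk")
         .start())
query.awaitTermination()
```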
Salesforce and SAP Integration with Informatica Cloud (Darren Cunningham)
Webinar recorded in July 2010. Available for viewing at http://www.informaticacloud.com. Focuses on SAP and Salesforce.com application and data integration best practices
Oracle data integrator 12c - getting started (Michael Rainey)
Oracle Data Integrator (ODI) is a data integration tool that can extract, load, and transform heterogeneous data sources. It is flexible and uses a flow-based mapping approach. The presentation provided an overview of ODI and guidance on installation, configuration, and getting started with the developer quickstart to create models, schemas, and mappings between data stores. Key components like knowledge modules generate integration code, while packages and load plans orchestrate the data integration processes.
Mobile Devices Securely Accessing SharePoint (Mike Brannon)
Mobile Devices can access SharePoint Securely. Presentation details MobileIron, Juniper Secure Access and iPad Apps for making the most of SharePoint on your Mobile Device
The document discusses how a company can securely manage employee-owned mobile devices (BYOD) using MobileIron. It summarizes the company's transition from company-owned Blackberries to allowing any device. MobileIron provides centralized policy enforcement and security across all devices. It allows separating personal and work data, enforcing access controls, and remotely wiping lost devices. The document also discusses providing secure access to additional corporate resources beyond email and ensuring privacy and international roaming policies are followed.
Embrace BYOD - Help your customers be more productive and use their mobile device of choice. At the same time be VERY SECURE - manage your mobile content!
BYOD - Mobility - Protection: security partnering with business (Mike Brannon)
Presentation delivered to the Charlotte CISO Summit and Ballantyne IT Pro security summit events. I cover how security has positively partnered with the business at NGC to very securely deploy BYOD and enable mobile access to email, documents and business data.
NGC records management - SP2010 RM Features (Mike Brannon)
National Gypsum Company uses SharePoint 2010 as a records management tool to facilitate efficient electronic discovery responses. SharePoint 2010 provides capabilities for records storage, retention policies, metadata management, legal holds, and discovery that help NGC establish governance over electronic records and meet compliance needs. While past systems caused issues, NGC believes the integrated Microsoft stack including SharePoint 2010 and Office 2010 can now meet all of its records management and eDiscovery requirements.
Mobile Device Security - Responsible Not Repressive (Mike Brannon)
Users want mobility and their own devices on the network - IT wants security! How can both groups get what they need? Tools exist to make that happen and this presentation provides an overview of what National Gypsum did recently (2011/2)
Search for Overview for SC Upstate SP users (Mike Brannon)
This document discusses using search effectively in SharePoint. It provides an overview of SharePoint search and highlights the importance of planning search, managing metadata, and user training. Key points include allocating time and resources to configure search and crawl metadata, educating users on best practices for naming and tagging content, and leveraging features like refiners and promoted results. The presentation also provides resources for learning more about advanced search configuration and third-party tools.
Mastering Cloud Data Cost Control: A FinOps Approach (Denodo)
Watch full webinar here: https://buff.ly/3uu8dEy
With the rise of cloud-first initiatives and pay-per-use systems, forecasting IT costs has become a challenge. It's easy to start small, but it's equally easy to get skyrocketing bills with little warning. FinOps is a discipline that tries to tackle these issues, by providing the framework to understand and optimize cloud costs in a more controlled manner. The Denodo Platform, being a middleware layer in charge of global data delivery, sits in a privileged position not only to help us understand where costs are coming from, but also to take action, manage, and reduce them.
Attend this session to learn:
- The importance of FinOps in a cloud architecture
- How the Denodo Platform can help you collect and visualize key FinOps metrics to understand where your costs are coming from
- What actions and controls the Denodo Platform offers to keep costs at bay
Lunch and Learn ANZ: Mastering Cloud Data Cost Control: A FinOps Approach (Denodo)
Watch full webinar here: https://buff.ly/4bYOOgb
With the rise of cloud-first initiatives and pay-per-use systems, forecasting IT costs has become a challenge. It's easy to start small, but it's equally easy to get skyrocketing bills with little warning. FinOps is a discipline that tries to tackle these issues, by providing the framework to understand and optimize cloud costs in a more controlled manner. The Denodo Platform, being a middleware layer in charge of global data delivery, sits in a privileged position not only to help us understand where costs are coming from, but also to take action, manage, and reduce them.
Attend this session to learn:
- The importance of FinOps in a cloud architecture.
- How the Denodo Platform can help you collect and visualize key FinOps metrics to understand where your costs are coming from.
- What actions and controls the Denodo Platform offers to keep costs at bay.
About a knowledge-graph-driven portal project for telco operators that we built for Nokia (Siemens) Network a while ago. Reuploaded some older but still relevant material, since I noticed this set became hidden after LinkedIn took over SlideShare :-?
ADV Slides: Platforming Your Data for Success – Databases, Hadoop, Managed Ha... (DATAVERSITY)
Thirty years is a long time for a technology foundation to be as active as relational databases. Are their replacements here? In this webinar, we say no.
Databases have not sat around while Hadoop emerged. The Hadoop era generated a ton of interest and confusion, but is it still relevant as organizations are deploying cloud storage like a kid in a candy store? We’ll discuss what platforms to use for what data. This is a critical decision that can dictate two to five times additional work effort if it’s a bad fit.
Drop the herd mentality. In reality, there is no “one size fits all” right now. We need to make our platform decisions amidst this backdrop.
This webinar will distinguish these analytic deployment options and help you platform 2020 and beyond for success.
DM Radio Webinar: Adopting a Streaming-Enabled Architecture (DATAVERSITY)
Architecture matters. That's why today's innovators are taking a hard look at streaming data, an increasingly attractive option that can transform business in several ways: replacing aging data ingestion techniques like ETL; solving long-standing data quality challenges; improving business processes ranging from sales and marketing to logistics and procurement; or any number of activities related to accelerating data warehousing, business intelligence and analytics.
Register for this DM Radio Deep Dive Webinar to learn how streaming data can rejuvenate or supplant traditional data management practices. Host Eric Kavanagh will explain how streaming-first architectures can relieve data engineers from time-consuming, error-prone processes, ideally bidding farewell to those unpleasant batch windows. He'll be joined by Kevin Petrie of Attunity, who will explain (with real-world success stories) how streaming data solutions can keep the business fueled with trusted data in a timely, efficient manner for improved business outcomes.
Pradeep Kumar Pandey has over 10 years of experience as a data/systems integration specialist and ETL expert. He has extensive experience designing and implementing data warehouses using tools like IBM DataStage, Informatica, Oracle OBIEE, and Oracle OBIA. He has led teams and taken on roles such as developer, technical lead, and team lead. Pradeep has worked on projects across various industries including telecom, financial services, HR, and retail.
Maximizing Oil and Gas (Data) Asset Utilization with a Logical Data Fabric (A...) (Denodo)
Watch full webinar here: https://bit.ly/3g9PlQP
It is no news that Oil and Gas companies are constantly faced with immense pressure to stay competitive, especially in the current climate while striving towards becoming data-driven at the heart of the process to scale and gain greater operational efficiencies across the organization.
Hence, the need for a logical data layer to help Oil and Gas businesses move towards a unified secure and governed environment to optimize the potential of data assets across the enterprise efficiently and deliver real-time insights.
Tune in to this on-demand webinar where you will:
- Discover the role of data fabrics and Industry 4.0 in enabling smart fields
- Understand how to connect data assets and the associated value chain to high impact domain areas
- See examples of organizations accelerating time-to-value and reducing NPT
- Learn best practices for handling real-time/streaming/IoT data for analytical and operational use cases
The document contains the resume of Naveen Reddy Tamma which summarizes his work experience and qualifications. He has over 7 years of experience working as an Associate at Cognizant Technology Solutions on various projects involving Informatica ETL development, data quality testing, and report generation. He holds a B.Tech in Computer Science and has experience working with technologies like Informatica, Teradata, Oracle, and Cognos.
The document contains the resume of Naveen Reddy Tamma which summarizes his work experience and qualifications. He has over 7 years of experience working as an Associate at Cognizant Technology Solutions on various projects involving Informatica ETL development, data quality, and reporting. He holds a B.Tech in Computer Science and has experience with technologies like Informatica, Teradata, Oracle, and Cognos.
This document contains a resume summary for M. Abdul Kareem Khan, an IT professional with over 6 years of experience developing web applications using technologies like Java, PL/SQL, and C++. He has worked as a senior systems analyst and software engineer for various clients in both the US and India, with a focus on banking and telecommunications projects. The summary highlights his technical skills and experience leading projects involving all phases of the software development life cycle.
Webinar: Improve Splunk Analytics and Automate Processes with SnapLogic (SnapLogic)
Last week SnapLogic sponsored partner event Splunk Worldwide Users' Conference in Las Vegas. The theme of the conference was "Your Data, No Limits." In keeping with this theme, SnapLogic helps Splunk customers access more comprehensive analytics by integrating as much data as possible from as many sources as possible, and by streamlining the business process of loading data in Splunk, detecting problems, and facilitating actions that result in a prompt resolution.
To learn more, visit: http://www.snaplogic.com/.
Manigandan Narasimhan is a senior consultant and application database administrator (DBA) with over 14 years of experience developing client/server applications using Oracle technologies. He has extensive experience building data marts and data warehouses, performing Oracle performance tuning, and managing database migrations. Some of his key skills include Oracle, SQL, PL/SQL, Unix, data modeling, ETL tools like DataStage, and project management. He has worked on several projects for clients like JP Morgan Chase and General Motors.
The document provides an overview of Oracle's WebCenter product and roadmap. Key elements discussed include enhanced support for mobile, social, and cloud capabilities. Upcoming features for WebCenter Portal, Content, and infrastructure include improved user experiences across devices, integration with Oracle Fusion applications and social networks, and cloud-based document and file sharing services. The document also summarizes Oracle's vision for business process management, including adaptive case management, mobile and social integration, enhanced analytics and process simulation capabilities.
How can Oracle Forms (or other legacy) applications be modernized to fit in a contemporary IT architecture? Trends, concepts and technologies are discussed.
Implementing a Data Mesh with Apache Kafka with Adam Bellemare | Kafka Summit... (HostedbyConfluent)
Have you heard about Data Mesh but never really understood how you actually build one? Data mesh is a relatively recent term that describes a set of principles that good modern data systems uphold. Although the data mesh is not a technology-specific pattern, it requires that organizations make choices and investments into specific technologies and operational policies when implementing the mesh. Establishing "paved roads" for creating, publishing, evolving, deprecating, and discovering data products is essential for bringing the benefits of the mesh to those who would use it.
In this talk, Adam covers implementing a self-service data mesh with events streams in Apache Kafka®. Event streams as a data product are an essential part of a real-world data mesh, as they enable both operational and analytical workloads from a common source of truth. Event streams provide full historical data along with realtime updates, letting each individual data product consumer decide what to consume, how to remodel it, and where to store it to best suit their needs.
Adam structures this talk around a hypothetical SaaS business question: "What is the relationship between feature usage and user retention?" This example explores each team's role in the data mesh, including the data products they would (and wouldn't) publish, how other teams could use those products, and the organizational dynamics and principles underpinning it all.
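As a minimal sketch of the consumer side of such an event-stream data product (not from Adam's talk; the topic name and broker address are assumptions), a Python client using the kafka-python library can replay a stream's full history and then follow real-time updates:

```python
import json
from kafka import KafkaConsumer  # pip install kafka-python

# Hypothetical event-stream data product published by a feature team.
consumer = KafkaConsumer(
    "feature-usage.events.v1",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",  # start from full history, then follow new events
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for record in consumer:
    event = record.value
    # Each consumer decides how to remodel the data and where to store it,
    # e.g. updating a per-user retention table in its own datastore.
    print(event.get("user_id"), event.get("feature"), event.get("timestamp"))
```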
OpenStack in the Enterprise - Interop Las Vegas 2014 (Seth Fox)
OpenStack has been making tremendous progress, with production deployments proliferating globally. But is OpenStack hardened and ready for the enterprise? Is it mature enough to run production and mission-critical workloads? Does it adequately address security and compliance requirements? We believe that the answer is a resounding "yes".
This session will deliver the insights you need to fully embrace OpenStack by addressing:
Common Pitfalls - common reasons why OpenStack deployments typically fail in enterprise environments
Economics - the total cost of ownership of a typical OpenStack footprint within the enterprise, and the areas where the benefits are primarily achieved
Ecosystem - the importance of the OpenStack ecosystem, and why it helps the enterprise in both the short and long term
Private, Public or Hybrid - where to deploy each model, and why OpenStack is the right choice for all of them
Real world enterprise case studies - successful deployment models
ADV Slides: When and How Data Lakes Fit into a Modern Data Architecture (DATAVERSITY)
Whether to take data ingestion cycles off the ETL tool and the data warehouse or to facilitate competitive Data Science and building algorithms in the organization, the data lake – a place for unmodeled and vast data – will be provisioned widely in 2020.
Though it doesn't have to be complicated, the data lake has a few key design points that are critical, and it does need to follow some principles for success. Build the data lake, but avoid building the data swamp! The tool ecosystem is building up around the data lake, and soon many organizations will have both a robust lake and a data warehouse. We will discuss policy to keep them straight, send data to its best platform, and keep users' confidence in their data platforms high.
Data lakes will be built in cloud object storage. We’ll discuss the options there as well.
Get this data point for your data lake journey.
Tamilarasu Uthirasamy has over 10 years of experience in data warehousing, database design, ETL processes, and analytics. He has skills in technologies like Python, Spark, UNIX shell scripting, databases like Netezza and Oracle, and tools like Datastage and R. He has worked on projects in healthcare, retail, and banking domains, designing data models and warehouses and developing ETL processes.
Bridging the Last Mile: Getting Data to the People Who Need It (APAC) (Denodo)
Watch full webinar here: https://bit.ly/34iCruM
Many organizations are embarking on strategically important journeys to embrace data and analytics. The goal can be to improve internal efficiencies, improve the customer experience, drive new business models and revenue streams, or – in the public sector – provide better services. All of these goals require empowering employees to act on data and analytics and to make data-driven decisions. However, getting data – the right data at the right time – to these employees is a huge challenge and traditional technologies and data architectures are simply not up to this task. This webinar will look at how organizations are using Data Virtualization to quickly and efficiently get data to the people that need it.
Attend this session to learn:
- The challenges organizations face when trying to get data to the business users in a timely manner
- How Data Virtualization can accelerate time-to-value for an organization’s data assets
- Examples of leading companies that used data virtualization to get the right data to the users at the right time
Similar to SharePoint Best Practices Conference 2013
Have you ever been confused by the myriad of choices offered by AWS for hosting a website or an API?
Lambda, Elastic Beanstalk, Lightsail, Amplify, S3 (and more!) can each host websites + APIs. But which one should we choose?
Which one is cheapest? Which one is fastest? Which one will scale to meet our needs?
Join me in this session as we dive into each AWS hosting service to determine which one is best for your scenario and explain why!
TrustArc Webinar - 2024 Global Privacy Survey (TrustArc)
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program
Monitoring and Managing Anomaly Detection on OpenShift.pdf (Tosin Akinosho)
Monitoring and Managing Anomaly Detection on OpenShift
Overview
Dive into the world of anomaly detection on edge devices with our comprehensive hands-on tutorial. This SlideShare presentation will guide you through the entire process, from data collection and model training to edge deployment and real-time monitoring. Perfect for those looking to implement robust anomaly detection systems on resource-constrained IoT/edge devices.
Key Topics Covered
1. Introduction to Anomaly Detection
- Understand the fundamentals of anomaly detection and its importance in identifying unusual behavior or failures in systems.
2. Understanding Edge (IoT)
- Learn about edge computing and IoT, and how they enable real-time data processing and decision-making at the source.
3. What is ArgoCD?
- Discover ArgoCD, a declarative, GitOps continuous delivery tool for Kubernetes, and its role in deploying applications on edge devices.
4. Deployment Using ArgoCD for Edge Devices
- Step-by-step guide on deploying anomaly detection models on edge devices using ArgoCD.
5. Introduction to Apache Kafka and S3
- Explore Apache Kafka for real-time data streaming and Amazon S3 for scalable storage solutions.
6. Viewing Kafka Messages in the Data Lake
- Learn how to view and analyze Kafka messages stored in a data lake for better insights.
7. What is Prometheus?
- Get to know Prometheus, an open-source monitoring and alerting toolkit, and its application in monitoring edge devices.
8. Monitoring Application Metrics with Prometheus
- Detailed instructions on setting up Prometheus to monitor the performance and health of your anomaly detection system.
9. What is Camel K?
- Introduction to Camel K, a lightweight integration framework built on Apache Camel, designed for Kubernetes.
10. Configuring Camel K Integrations for Data Pipelines
- Learn how to configure Camel K for seamless data pipeline integrations in your anomaly detection workflow.
11. What is a Jupyter Notebook?
- Overview of Jupyter Notebooks, an open-source web application for creating and sharing documents with live code, equations, visualizations, and narrative text.
12. Jupyter Notebooks with Code Examples
- Hands-on examples and code snippets in Jupyter Notebooks to help you implement and test anomaly detection models (a combined sketch follows this list).
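To make topics 5, 8, and 12 concrete, the following is a minimal, hypothetical Python sketch (not taken from the tutorial) that consumes sensor readings from a Kafka topic, scores them with a scikit-learn IsolationForest, and exposes the results as Prometheus metrics. The topic name, broker address, and three-feature payload layout are all assumptions.

```python
import json
import numpy as np
from kafka import KafkaConsumer                # pip install kafka-python
from sklearn.ensemble import IsolationForest   # pip install scikit-learn
from prometheus_client import Gauge, Counter, start_http_server

# Train on a batch of "normal" readings (placeholder data for the sketch).
rng = np.random.default_rng(42)
X_train = rng.normal(loc=0.0, scale=1.0, size=(1000, 3))
model = IsolationForest(contamination=0.01, random_state=42).fit(X_train)

anomaly_score = Gauge("edge_anomaly_score", "Latest IsolationForest decision score")
anomalies_total = Counter("edge_anomalies_total", "Readings flagged as anomalous")

start_http_server(8000)  # Prometheus scrapes http://<device>:8000/metrics

consumer = KafkaConsumer(
    "sensor-readings",                         # hypothetical topic name
    bootstrap_servers="kafka:9092",            # hypothetical broker address
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)
for record in consumer:
    reading = record.value
    x = np.array([[reading["x"], reading["y"], reading["z"]]])
    anomaly_score.set(float(model.decision_function(x)[0]))  # lower = more anomalous
    if model.predict(x)[0] == -1:              # -1 marks an outlier
        anomalies_total.inc()
```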
GraphRAG for Life Science to increase LLM accuracy (Tomaz Bratanic)
GraphRAG for the life science domain, where you retrieve information from biomedical knowledge graphs using LLMs to increase the accuracy and performance of generated answers.
Skybuffer SAM4U tool for SAP license adoption (Tatiana Kojar)
Manage and optimize your license adoption and consumption with SAM4U, an SAP free customer software asset management tool.
SAM4U, an SAP complimentary software asset management tool for customers, delivers a detailed and well-structured overview of license inventory and usage with a user-friendly interface. We offer a hosted, cost-effective, and performance-optimized SAM4U setup in the Skybuffer Cloud environment. You retain ownership of the system and data, while we manage the ABAP 7.58 infrastructure, ensuring fixed Total Cost of Ownership (TCO) and exceptional services through the SAP Fiori interface.
A Comprehensive Guide to DeFi Development Services in 2024 (Intelisync)
DeFi represents a paradigm shift in the financial industry. Instead of relying on traditional, centralized institutions like banks, DeFi leverages blockchain technology to create a decentralized network of financial services. This means that financial transactions can occur directly between parties, without intermediaries, using smart contracts on platforms like Ethereum.
In 2024, we are witnessing an explosion of new DeFi projects and protocols, each pushing the boundaries of what’s possible in finance.
In summary, DeFi in 2024 is not just a trend; it’s a revolution that democratizes finance, enhances security and transparency, and fosters continuous innovation. As we proceed through this presentation, we'll explore the various components and services of DeFi in detail, shedding light on how they are transforming the financial landscape.
At Intelisync, we specialize in providing comprehensive DeFi development services tailored to meet the unique needs of our clients. From smart contract development to dApp creation and security audits, we ensure that your DeFi project is built with innovation, security, and scalability in mind. Trust Intelisync to guide you through the intricate landscape of decentralized finance and unlock the full potential of blockchain technology.
Ready to take your DeFi project to the next level? Partner with Intelisync for expert DeFi development services today!
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdf (Malak Abu Hammad)
Discover how MongoDB Atlas and vector search technology can revolutionize your application's search capabilities. This comprehensive presentation covers:
* What is Vector Search?
* Importance and benefits of vector search
* Practical use cases across various industries
* Step-by-step implementation guide
* Live demos with code snippets
* Enhancing LLM capabilities with vector search
* Best practices and optimization strategies
Perfect for developers, AI enthusiasts, and tech leaders. Learn how to leverage MongoDB Atlas to deliver highly relevant, context-aware search results, transforming your data retrieval process. Stay ahead in tech innovation and maximize the potential of your applications.
#MongoDB #VectorSearch #AI #SemanticSearch #TechInnovation #DataScience #LLM #MachineLearning #SearchTechnology
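As a hedged illustration of the implementation steps above, here is a minimal PyMongo sketch of an Atlas `$vectorSearch` aggregation. The connection string, index name, field names, and embedding dimension are hypothetical, and you would supply a real query embedding from your own model.

```python
from pymongo import MongoClient  # pip install pymongo

client = MongoClient("mongodb+srv://user:pass@cluster0.example.mongodb.net")  # hypothetical URI
articles = client["demo"]["articles"]

query_vector = [0.0] * 1536  # replace with the embedding of the user's query

pipeline = [
    {
        "$vectorSearch": {
            "index": "vector_index",   # hypothetical Atlas Vector Search index name
            "path": "embedding",       # field that stores the document embeddings
            "queryVector": query_vector,
            "numCandidates": 100,      # breadth of the approximate search
            "limit": 5,                # number of results returned
        }
    },
    {"$project": {"title": 1, "score": {"$meta": "vectorSearchScore"}}},
]

for doc in articles.aggregate(pipeline):
    print(doc["title"], doc["score"])
```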
HCL Notes and Domino License Cost Reduction in the World of DLAU (panagenda)
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-und-domino-lizenzkostenreduzierung-in-der-welt-von-dlau/
DLAU and licensing under the CCB and CCX models have been a hot topic for many in the HCL community since last year. As a Notes or Domino customer, you may be struggling with unexpectedly high user counts and license fees. You may be wondering how this new type of licensing works and what benefits it brings you. Above all, you certainly want to stay within your budget and save costs wherever possible. We understand that, and we want to help!
We explain how to solve common configuration problems that can cause more users to be counted than necessary, and how to identify and remove redundant or unused accounts to save money. There are also some approaches that can lead to unnecessary expenses, for example when a person document is used instead of a mail-in database for shared mailboxes. We show you such cases and their solutions. And of course we explain the new license model.
Join this webinar, in which HCL Ambassador Marc Thomas and guest speaker Franz Walder introduce you to this new world. It gives you the tools and the know-how to keep track of everything. You will be able to reduce your costs through an optimized Domino configuration and keep them low in the future.
Topics covered:
- Reducing license costs by finding and fixing misconfigurations and redundant accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how best to use it
- Tips for common problem areas, such as team mailboxes, functional/test users, etc.
- Real-world examples and best practices you can apply immediately
Driving Business Innovation: Latest Generative AI Advancements & Success Story (Safe Software)
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdf (Chart Kalyan)
A Mix Chart displays historical data of numbers in a graphical or tabular form. The Kalyan Rajdhani Mix Chart specifically shows the results of a sequence of numbers over different periods.
Ocean lotus Threat actors project by John Sitima 2024 (1).pptx (SitimaJohn)
Ocean Lotus cyber threat actors represent a sophisticated, persistent, and politically motivated group that poses a significant risk to organizations and individuals in the Southeast Asian region. Their continuous evolution and adaptability underscore the need for robust cybersecurity measures and international cooperation to identify and mitigate the threats posed by such advanced persistent threat groups.
Trusted Execution Environment for Decentralized Process Mining (LucaBarbaro3)
Presentation of the paper "Trusted Execution Environment for Decentralized Process Mining" given during the CAiSE 2024 Conference in Cyprus on June 7, 2024.
Taking AI to the Next Level in Manufacturing.pdf (ssuserfac0301)
Read Taking AI to the Next Level in Manufacturing to gain insights on AI adoption in the manufacturing industry, such as:
1. How quickly AI is being implemented in manufacturing.
2. Which barriers stand in the way of AI adoption.
3. How data quality and governance form the backbone of AI.
4. Organizational processes and structures that may inhibit effective AI adoption.
5. Ideas and approaches to help build your organization's AI strategy.
Programming Foundation Models with DSPy - Meetup Slides (Zilliz)
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
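As a minimal sketch of what "programming" rather than "prompting" looks like in DSPy (the model name is an assumption, and the exact configuration API varies across DSPy versions):

```python
import dspy  # pip install dspy

# Configure a language model; provider and model name are assumptions.
dspy.configure(lm=dspy.LM("openai/gpt-4o-mini"))

# Declare WHAT the module should do via a signature; DSPy constructs the
# prompt, and its optimizers can later tune the module against a metric.
qa = dspy.ChainOfThought("question -> answer")

result = qa(question="Why is the sky blue?")
print(result.answer)
```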
Skybuffer AI: Advanced Conversational and Generative AI Solution on SAP Busin... (Tatiana Kojar)
Skybuffer AI, built on the robust SAP Business Technology Platform (SAP BTP), is the latest and most advanced version of our AI development, reaffirming our commitment to delivering top-tier AI solutions. Skybuffer AI harnesses all the innovative capabilities of the SAP BTP in the AI domain, from Conversational AI to cutting-edge Generative AI and Retrieval-Augmented Generation (RAG). It also helps SAP customers safeguard their investments into SAP Conversational AI and ensure a seamless, one-click transition to SAP Business AI.
With Skybuffer AI, various AI models can be integrated into a single communication channel such as Microsoft Teams. This integration empowers business users with insights drawn from SAP backend systems, enterprise documents, and the expansive knowledge of Generative AI. And the best part of it is that it is all managed through our intuitive no-code Action Server interface, requiring no extensive coding knowledge and making the advanced AI accessible to more users.
Generating privacy-protected synthetic data using Secludy and Milvus (Zilliz)
During this demo, the founders of Secludy will demonstrate how their system utilizes Milvus to store and manipulate embeddings for generating privacy-protected synthetic data. Their approach not only maintains the confidentiality of the original data but also enhances the utility and scalability of LLMs under privacy constraints. Attendees, including machine learning engineers, data scientists, and data managers, will witness first-hand how Secludy's integration with Milvus empowers organizations to harness the power of LLMs securely and efficiently.
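As a hedged sketch of the storage-and-retrieval layer described above (the collection name, embedding dimension, and random placeholder vectors are assumptions; real embeddings of privacy-protected synthetic records would come from an LLM pipeline), pymilvus's MilvusClient can store and search embeddings like this:

```python
import numpy as np
from pymilvus import MilvusClient  # pip install pymilvus

client = MilvusClient("synthetic_demo.db")  # Milvus Lite: a local, file-backed instance
client.create_collection(collection_name="synthetic_embeddings", dimension=384)

# Placeholder vectors standing in for embeddings of synthetic records.
rng = np.random.default_rng(0)
rows = [
    {"id": i, "vector": rng.random(384).tolist(), "source": "synthetic"}
    for i in range(100)
]
client.insert(collection_name="synthetic_embeddings", data=rows)

# Nearest-neighbor search over the stored embeddings.
hits = client.search(
    collection_name="synthetic_embeddings",
    data=[rng.random(384).tolist()],
    limit=3,
    output_fields=["source"],
)
print(hits)
```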
SharePoint Best Practices Conference 2013
1. SharePoint Solutions at National Gypsum
Mike Brannon
mikeb@natgyp.com
@mike_moss
http://www.slideshare.net/gbcmeb/
From The SharePoint Strategic Planning Process
5. NGC Technical Overview
3 platforms:
- IBM z/OS / Software AG
- SAP (hosted externally, SAP/FIT hosting)
- Windows .NET / SQL
Windows clients and servers at the plants and in process control.
[Network diagram: the Rexford HQ data center (Exchange 2010, SharePoint 2003 and SharePoint 2007, .NET apps and SQL DBMS exposed via SOA and web services, and IBM z/OS) sits on the HQ campus LAN; plant LANs carry a plant server, process PCs, and PLCs on Ethernet over the trusted WAN; a DMZ LAN hosts the EDI and web servers; remote users - employees and partners, including mobile - come in from the untrusted Internet through a firewall and Juniper VPN.]
6. Key Milestones - NGC SharePoint Journey
1999-2002: Tahoe, then SharePoint Portal Server 2001, supporting sales laptops and internally developed "CRM / document shares".
2003 to present: Updated to SharePoint 2003 - lots of light customization with CorasWorks templates and tools, plus some .NET code (event receivers and "Report Sweeper"). InfoPath forms and libraries also emerge.
2006 to present: New servers added for the intranet effort. Connected payroll/HR to AD and user profiles, including photos. Corp-Comm and HR leverage out-of-the-box functionality well. Initial broad effort to set up department sites with user-driven publishing, plus IT project sites and a PMO. Downsizing issues due to loss of knowledgeable staff.
7. NGC Problems - Similar to Yours?
• Our network file-share based architecture (28 years old) is obsolete, growing exponentially in size, and difficult to use.
• Federal and state legal statutes concerning document management aren't being fully met (since 2009).
• We have old, out-of-support versions of SharePoint supporting customer-critical business processes that need to be upgraded or replaced.
• Business intelligence and reporting products have advanced in power, ease of use, and functionality.
• NGC needs a roadmap for a technical makeover that will reduce vendors, mitigate risks, and deploy new capabilities to the business within the standard Microsoft toolset.
8. NGC Plans for Next Steps
• Multi-department review by all stakeholders.
• NGC's core development and sysadmin skill set is based on the Microsoft platform (Exchange, SharePoint, Office) and .NET code / SQL databases.
• Determined that several "non-core" technologies could be replaced by SharePoint (Accolade, Documentum, Hyperion, uPerform, file shares).
• Potential software cost reductions and lower TCO/support costs!
• Successfully proved in a pilot test that SharePoint could be a records management replacement (with additions).
9. What Do We Need?
• Commitment to SharePoint as the enterprise portal/application/content platform.
• Licensing changes: far fewer EA software licenses - drop 70% of full Office and 100% of System Center, deploy ECALs for focused users, then deploy deskless-worker Office / device seats.
• Build new platforms and decommission "non-core" systems.
12. Establishing Context: eDiscovery Strategy and Information Lifecycle
A roadmap for bridging two operational scenarios. Business users create ESI.
ESI types and their sources:
- Unstructured data: messages (email, v-mail, IM), documents, media, logs, etc. - from PCs, servers, SharePoint, etc.
- Structured data: invoices, POs, credit memos, etc. - from enterprise systems (SAP, POPS/MIDAS, NGC4ME, etc.)
Scenario 1: Normal operations, governed under "Safe Harbor". Policies and procedures dictate the information lifecycle: the Legal Department handles records management, and Information Systems handles storage, backup, archiving, and deletion.
Scenario 2: eDiscovery, governed under "Duty to Preserve". The ESI discovery process runs: possible legal action → legal hold issued → discovery scope determined → ESI sources identified → search → hold → process → review → deliver.
Our eDiscovery strategy provides a framework to address eDiscovery in the context of normal operations.
Definitions:
1) ESI - Electronically Stored Information
2) Safe Harbor - possible protection from sanctions if ESI is destroyed as a result of routine policies and procedures
3) Duty to Preserve - a defendant's obligation to preserve evidence in advance of litigation
13. ECM/RM Learning
▪ Policy/procedure dictates that records move to Documentum.
▪ "Giving up" a document / email / file to the "records store" met USER RELUCTANCE - active resistance in some cases!
▪ Unexpected finding: people will still FIGHT to keep their documents stored in something they are comfortable with.
▪ Multiple places to put user content made it impossible to have ONE WAY TO DO IT.
▪ LEARNING: We will not ever have enough .NET staff to BUILD a solution to this wicked problem.
▪ Find a configurable solution - deploy ONE way to do it!
▪ Let users keep their content in familiar-looking places.
14. NGC Enterprise Portal and eDiscovery Time Line, 2012-2015 (last updated 8/15/12)
PORTAL PLATFORM phases:
- Phase 1 (2012), Stabilize Platform: address the risk of an unsupported platform. Cost option A: $200k, SP2010 with enterprise capabilities. Cost option B: $80k, SP2010 without enterprise capabilities.
- Phase 2 (2013), Migrate Content: all content under management; ease of use for 300 users (Charlotte HQ). Cost option A: $775k, full content management with auto-classification software/services. Cost option B: $475k, content/records management only. (Assumes Phase 1A.)
- Phase 3 (2014-2015), Extend to Sales & Manufacturing: +700 users (plants and field sales). Cost: $550k. (Assumes Phases 1A and 2A, and that other technology constraints - WAN, LANs, HQ storage - have been resolved.)
SITES and COMMUNITIES functionality: more productive, lower-cost searches. Cost: $250k.
INSIGHTS functionality (2013), Deliver Information: make data and analytical services directly available to business users - plant performance, sales performance, supply chain performance, financial performance, new product development. Cost: manpower. (Assumes Phase 1A.)
COMPOSITES functionality (2013), Deliver Applications: build and deploy portal applications for NGC - Product Portal, CSC Portal 2.0, Supply Chain Portal, Legal Portal, Training Portal. Cost: manpower. (Assumes Phase 1A.)
Audit & Compliance: automated audit, notification, and compliance reporting. Cost: $100k.
eMail Management & Archive: email archiving and Outlook Personal Storage (PST) management. Cost: TBD.
Today vs. tomorrow, by capability:
- COLLABORATION. Today: departmental team sites. Tomorrow: shift from email and shared folders to active, living meeting places using social tools to connect team members across the organization.
- SEARCH. Today: bookmarks (a.k.a. "I can't find anything"). Tomorrow: Google-like search (and "find-ability") across ALL NGC information with controlled access.
- BUSINESS INTELLIGENCE. Today: multiple sets of numbers, limited data access, few users. Tomorrow: one set of numbers, easy secured access, many users, developed and reported against with standard office productivity tools.
- APPLICATIONS. Today: IT builds and manages custom applications for each business use. Tomorrow: business process automation built and managed by business users.
- CONTENT MANAGEMENT / RECORDS MANAGEMENT. Today: only "records" have a defined life-cycle. Tomorrow: all content is created, secured, and deleted according to NGC policies.
- eDiscovery SEARCH. Today: multiple tools, labor intensive, large data sets. Tomorrow: single tool, efficient and productive, focused data set.
- AUDIT & COMPLIANCE. Today: manual effort. Tomorrow: automated audit, notification, and compliance reporting.
- eMAIL MANAGEMENT. Today: manual effort. Tomorrow: centralized, compliant, easily discovered, and appropriately preserved and/or purged.
15. What Do We Need?
• DoD 5015.2 standard: GimmalSoft provides a suite of software and consulting - fully certified - supported by Colligo / MetaLogix.
• Robust eDiscovery case management plus legal-specific search / text / conceptual analysis: StoredIQ and Concept Searching tools - save thousands to possibly millions in legal fees!
• Baked-in records compliance: content management across the lifecycle and automatic categorization!
16. What Have We Learned?
• Commitment to ONE: retire "specialty software" and consolidate into SharePoint - license the Enterprise features and leverage them!
• More than one size/color needed: content creators / publishers, heavy content consumers, and 'browser only' consumers each need different solutions. (NOTE: see the Mobility presentation tomorrow!!)
• Records/ECM is CORE: reliance on manual compliance is futile - the risks are huge, and penalties happen!!
Editor's Notes
Have you (or your company) been affected by a lawsuit? How about a legal hold? Have you ever declared and filed a record (in the legal sense of the word 'record')? In the last several years the IT teams I lead at NGC have become VERY aware of how lawsuits, e-discovery, and legal holds can affect our company. This presentation covers how we've been able to use SharePoint and related tools to help deal with those issues.
How does the economy impact YOUR business? It has a HUGE impact on construction, which in turn has a huge impact on National Gypsum! Current business climate: a basic chart showing the BIG trends, with enough history to cover the entire technical period, say from 1995 to now. Our industry and business has ALWAYS been very cyclical, but from 1991 to 2001 we generally saw overall growth over time. From 2001 to 2006 we saw a huge HOUSING BUBBLE, and from 2006 to 2012 we experienced a GIANT BUST. We now seem to be emerging from that NUCLEAR WINTER, but there were times when things were in doubt; the industry has been substantially restructured and some competitors are now gone. How does SharePoint HELP us during these challenges? INTRANET and HR connections: great employee communications facilitated by executives and corp-comm "internal publishing". SW licensing: SharePoint can be a SWISS ARMY knife kind of application. Retire filing systems: EMC Documentum seats versus SharePoint document libraries... the functionality may not EXCEED, but it will suffice. Retire expensive software and stop supporting customized apps; set up a user-managed platform; turn off a system that requires a lot of developer / admin effort to deliver to users. We have also entered a NEW era of litigation and legal issues, which we were not fully prepared for. More on that later.
IT Environment Overview slide: application / system platforms, WAN overview, Windows emphasis.
SharePoint evolution; current deployment; issues tied to the Nuclear Winter (staff laid off); stuck and unable to make much progress.
IT wanted to reduce its application portfolio to those products that had broad adoption, were core to its skill set, and provided value. We felt there were several applications with limited audiences that SharePoint could replace, and that by doing so we would reposition costs to a more mainstream use. We felt we were underutilizing SharePoint, but we had to justify the license costs with business projects.
Policies for user accounts and the features needed created major issues. The SP enterprise client access license gives end users the tools to develop scorecards, metrics, workflow, enhanced Excel analytics, and business processes/applications. SharePoint is a business application platform (call it Windows XP for knowledge workers) that enables normal associates to use common productivity software to develop their own processes and publish their own key indicators. We have several new processes, such as the new product development process, raw material testing, garnishment tracking, and cross-departmental product management, that just need some time and attention to be delivered within SharePoint to highlight a new way of delivering business capabilities. The 300k is the first portion of the money and resources to be invested.
NGC deployed TrueArc Foremost for paper and limited electronic records in 1999/2000. Microsoft Office users needed better usability, so we enhanced TrueArc with a custom Office add-in (EDCAR). The add-in pushed all users to classify electronic documents as they were being created, attaching a document property that indicated the file plan (retention) designation and the user's department. Documentum acquired TrueArc in 2002-2003, and we updated our custom EDCAR system to use Documentum and Office 2003. We created a "filer interface" that leveraged Microsoft BITS for background file transfer into the document repository (fire-and-forget filing), and a pilot "sweeper" application service that supported easier, more automatic filing of documents NOT sent to the repository and not in policy compliance. Major issues emerged with user adoption, and the drive to full filing stalled out. Windows 7 changed the security model for all users and Office 2010 changed as well, so we updated the system to preserve existing functionality and improve the sweeper / automation: a complete rewrite of our system into managed code, and another complete rewrite to support Office 2010 and the Ribbon. Much better integration!
Feeling PAIN around ongoing development costs, the business downturn, and reduced budgets / .NET resources on staff and on the legal team. After a large amount of effort we had embedded / forced classification, but actual record filing wasn't working well. "Giving up" a document / email / file to the "records store" met REAL RELUCTANCE, and there were definite usability barriers. Unexpected finding: despite policy and some technical help, people will still FIGHT to keep their documents stored in something they are comfortable with. Multiple places to put things made it impossible to have ONE WAY TO DO IT. LEARNING: We will not ever have enough .NET staff to BUILD a solution to this wicked problem.
Roadmap project: a brief explanation of Documentum and EDCAR (current version), a review of their use, and the integration into Office. Gimmal has moved toward SharePoint with GimmalSoft, so SharePoint can now meet DoD 5015.2, the standard for records management that is well recognized by courts and government agencies as well as the military. The roadmap was built to move ALL unstructured data / content management to SharePoint and replace Documentum.