A brief background and strategy for Kuali OLE at Lehigh University: context on our planning process, three levers for change (process, staffing, and technology), and a short discussion of the promise of Kuali OLE at Lehigh.
Qubole provides self-managing Hadoop infrastructure as a service that lets companies across industries including adtech, media, healthcare, retail, and ecommerce analyze large-scale data without needing Hadoop skills. In 2014, Qubole managed over 2.5 million nodes on AWS that processed over 40 million queries and 519 petabytes of data. Qubole offers an easy-to-use, unified interface that provides data discovery, query templates, and administration/monitoring for automated, optimized performance on Hadoop clusters in the cloud. Customers choose Qubole for its managed services, single unified interface, and 24/7 expert support.
MSRCOSMOS has focused initiatives on Big Data and has capabilities to help customers adopt Big Data solutions. The capabilities range from discovering Big Data for adoption to implementation of domain specific solutions. The capabilities are addressed using three major dimensions.
Oracle Cloud Infrastructure (OCI) is a comprehensive IaaS platform that provides computing power for cloud-native applications, including on-premises deployments. It offers autonomous services, integrated security, and seamless performance. OCI delivers infrastructure and platform cloud services like compute, storage, networking, security, and databases around the world. It provides benefits like automated management through machine learning, lower costs than AWS, and easy migration of on-premises Oracle applications. OCI also includes cloud analytics and business analytics products to help customers gain insights.
Slide deck of Scalable Analytics on the Cloud session delivered by me at "Data Analytics Explained" meetup at Amazon Web Services office in Melbourne on 24 May, 2018.
Covers the following topics:
1. What is Machine Learning and the pros and cons of traditional machine learning approaches
2. How big data complements Machine Learning
3. How the integration of Machine Learning, Big Data and Analytics enables unprecedented value.
Pivotal Digital Transformation Forum: Requirements to Become a Data-Driven En... (VMware Tanzu)
To become a data-driven enterprise, companies must move from inflexible legacy data infrastructure that cannot scale to agile data architectures based on scaled-up, open-source systems that can handle any type or source of data. This involves storing both structured and unstructured high-volume, high-velocity data and then analyzing it through machine learning, predictive analytics, and real-time analytics to develop advanced analytical applications and globally scaled, data-driven applications. Achieving this requires expertise in agile development, DevOps, hybrid cloud, and continuous delivery to innovate with closed-loop applications.
This document discusses creating a data-driven culture. It states that a data-driven organization acquires, processes, and leverages data in a timely manner to create efficiencies, iterate products, and navigate competition. It recommends focusing on process, people, platform, and products through metrics, iterative development, data access for staff, centralized data storage, and choosing tools for users. Initial tactics include talking to stakeholders, defining key metrics and entities, and finding easy wins to share results.
The Business Benefits of a Data-Driven, Self-Service BI Organization (Looker)
The document discusses the benefits of self-service business intelligence (BI) and data-driven organizations. It notes that self-service BI allows users to access and analyze data with less dependence on IT, which streamlines processes, makes business and IT more productive, opens analytics to more users, and helps organizations become more data-driven. The document also uses Twilio as a case study, explaining that Twilio provides a communications API and has evolved its data use from engineers writing custom queries to using a modeling layer to reuse logic on underlying data.
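The "modeling layer" idea behind Twilio's evolution can be illustrated with a small sketch in plain Python (this is an illustration of the concept, not Looker's actual LookML syntax; the table and metric names are invented for the example): business logic is defined once in a shared model, and every query is generated from that definition, so analysts reuse vetted logic instead of rewriting custom SQL.

```python
# A toy "modeling layer": dimensions and measures are defined once in a
# shared model, and queries are generated from it, so every consumer gets
# the same definition of a metric like "revenue".
# Illustrative sketch only -- not Looker's LookML, and the "orders" schema
# is an assumption made up for this example.

MODEL = {
    "table": "orders",
    "dimensions": {
        "region": "orders.region",
        "month": "DATE_TRUNC('month', orders.created_at)",
    },
    "measures": {
        "revenue": "SUM(orders.amount)",
        "order_count": "COUNT(*)",
    },
}

def build_query(model, dimensions, measures):
    """Generate a SQL string from the shared model definition."""
    select_parts = [f"{model['dimensions'][d]} AS {d}" for d in dimensions]
    select_parts += [f"{model['measures'][m]} AS {m}" for m in measures]
    sql = f"SELECT {', '.join(select_parts)} FROM {model['table']}"
    if dimensions:
        # Group by ordinal position of each selected dimension.
        sql += " GROUP BY " + ", ".join(str(i + 1) for i in range(len(dimensions)))
    return sql

print(build_query(MODEL, ["region"], ["revenue"]))
# SELECT orders.region AS region, SUM(orders.amount) AS revenue FROM orders GROUP BY 1
```

The payoff is the one described in the summary: change the definition of `revenue` once in the model, and every generated query picks up the fix.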
Business Intelligence Solution on Windows Azure (Infosys)
The document discusses a proposed cloud-based business intelligence (BI) solution on Microsoft Azure. It outlines challenges with traditional on-premise BI implementations and how a hybrid cloud solution addresses these issues through scalability, availability, cost efficiency and other benefits. The proposed solution features on-premise components that cleanse and transfer data to cloud components, which include an Azure table storage data warehouse, reporting and analytics tools, and delivery of reports to both internal and external users.
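The on-premises "cleanse and transfer" step in such a hybrid design can be sketched in plain Python. This is a minimal illustration under assumptions (the field names `id` and `amount` are invented for the example), not the solution's actual pipeline: normalize records and drop unusable rows before shipping them to cloud storage.

```python
# Minimal sketch of an on-premises cleansing step before cloud transfer:
# normalize field names and values, and drop rows missing required fields.
# The required fields ("id", "amount") are assumptions for illustration.

def cleanse(records, required=("id", "amount")):
    """Return cleaned records, skipping rows that lack required fields."""
    cleaned = []
    for row in records:
        # Normalize keys to lowercase and strip whitespace from strings.
        row = {k.strip().lower(): (v.strip() if isinstance(v, str) else v)
               for k, v in row.items()}
        if all(row.get(field) not in (None, "") for field in required):
            cleaned.append(row)
    return cleaned

raw = [{" ID ": "a1", "Amount": " 10 "},   # messy but complete -> kept
       {"id": "", "amount": "5"}]          # missing id -> dropped
print(cleanse(raw))  # [{'id': 'a1', 'amount': '10'}]
```

Only the cleaned output would be transferred to cloud storage for warehousing and reporting, which keeps bad rows from polluting downstream analytics.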
Innovative Integrated Sustainable Development of Malwa Region: Fazilka Region... (Love Fazilka)
A Sampuran initiative where farmers and entrepreneurs create a profitable partnership for mutual benefit and the economic development of the Malwa region. It follows an Integrated Sustainable Business Model covering:
* Agronomically sustainable
* Environmentally sensitive through management of waste straw in farms
* Economically viable, engaging every farmer in the region to be a part of the value chain
* Management and treatment of polluted ground water for general health improvement
* Development of various economic activities, including fish farms, dairy farms, and a biogas power plant
This presentation provides an introductory approach to “Sustainability 2.0” and FISDEV (Framework for Integrated Sustainable Development), an open-source, collaborative methodology for corporate sustainable development.
The document discusses sustainable development. It defines sustainable development as meeting present needs without compromising future generations' ability to meet their own needs. Gro Harlem Brundtland first introduced the concept in 1987 as Prime Minister of Norway. Sustainability concerns balancing equity, environmental integrity, and economic efficiency. Achieving sustainable development means simultaneously achieving social progress, environmental protection, prudent resource use, and economic growth. It requires respecting environmental limits and improving quality of life. Analyzing sustainability issues at global, regional, national and local levels is important. For sustainable development to occur, there must be cooperation, wealth redistribution, respecting nature's limits, and community self-sufficiency.
This document provides an introduction to the topic of sustainability. It discusses key concepts like resources, population growth, sustainable development, and interdependence. Resources are defined as anything taken from the environment to make goods and products for human needs and wants. However, using resources leads to waste. Sustainable development means meeting current needs without compromising the ability of future generations to meet their needs. It involves considering consumption, waste, and responsible use of Earth's finite resources. The document stresses thinking globally about how our actions impact others and acting locally through sustainable practices in our own communities.
Sustainable Development Webinar Series: SD 101 (EOTO World)
This webinar provided an overview of sustainable development and how youth can get involved. It defined sustainable development as meeting present needs without compromising future generations' ability to meet their own needs. Sustainable development involves balancing environmental protection, economic growth, and social equality. The webinar discussed the three pillars of sustainable development - environment, economy and society. It emphasized that youth have an important role to play by advocating for sustainable policies that will shape their future. The webinar equipped participants with talking points about sustainable development and tips for spreading awareness to others.
Growth leads to the depletion of the planet's natural resources, and wood is one of them: we use unnecessary paper, create too much clutter, and worsen the CO2 imbalance. An immediate way to stop destroying forests is the dematerialization of exchanges with convincing legal value: zero paper. Electronic originals are sealed and encrypted in a nominative, communicating electronic safe, and counterparties are identified via Magicaxess, a new identification technology that does not require downloading a digital certificate.
This document discusses various aspects of sustainable development at the neighborhood level, including water, waste management, green space, food, and energy. It emphasizes meeting environmental, economic, and social goals simultaneously (the triple bottom line). Some key sustainable practices mentioned are rainwater harvesting, composting, farmers markets, green roofs, solar panels, recycled and local building materials, and forms of renewable energy like wind and solar.
Multi-Disciplinary Teams Operating Model (Garrie Irons)
The document discusses an operating model for high-performing agile teams that addresses common challenges such as silos, duplication, and lack of innovation. It proposes transforming teams into multi-disciplinary teams that are self-sufficient, flat-structured, and embedded within business areas. Guiding principles include having self-sufficient business and IT teams with a product mindset. The model structure uses scrum and agile project management. Key learnings discussed include starting small, change management, portfolio planning, and using centers of excellence.
transforming how the world operates software (Andrew Shafer)
A quick run-through of some ideas about continuously DevOps-ing microservices, for a Velocity NY keynote. A bit about Pivotal, a bit about me, the industry, and you. Yes, you...
Fusion Applications are intended to unify the best functionality from Oracle's traditional applications into a single suite delivered using Oracle's open technology platform. This will provide a complete, integrated, and next-generation user experience. Key aspects of Fusion Applications include consolidating features from Oracle's E-Business Suite, PeopleSoft, and Siebel applications, as well as new capabilities. Fusion is built using a service-oriented architecture and Oracle Fusion Middleware to enable integration, extensibility, and manageability. While Fusion Applications are still in development, Oracle is promoting strategies for customers to prepare for and eventually transition to the new platform.
Power BI offers customers rapid time-to-value by providing intuitive visual analytics tools that reduce the time needed to gain insights from data. It allows users to connect to various data sources, transform the data, create interactive visualizations and dashboards, and share insights collaboratively. Power BI provides a full stack of business intelligence capabilities including querying, modeling, visualizing, analyzing, and sharing on desktop, online, and mobile platforms.
This document provides a status report on the State Library's project to upgrade its website from a FrontPage-based system to one using the open source content management framework Drupal. It discusses the library's needs for a website that is easier for users, presents the library as a cohesive organization, and leverages collaboration. It outlines the challenges of the current system and how Drupal can help by allowing for improved information architecture, easier content creation and updating, and a platform that can adapt to changing technologies. The report notes that while consultants were originally planned to assist, the library is now building out the Drupal environment itself, undergoing training, testing modules, and planning next steps to move content over from the existing site.
Actionable Data: Mastering the Hybrid Analytics Mix (Perficient, Inc.)
With an increase in the adoption of cloud applications, most organizations today are in some form of hybrid state (i.e. using a combination of on-premise and cloud applications to run their business). Regardless of where the data resides, you need a complete view of the company spanning across different parts of the business, combining insightful data across both onsite and public cloud instances.
In this webinar, we looked at multiple approaches that organizations have successfully used to consolidate data from multiple cloud and on-premise applications and to perform seamless analytics across these varied data sources.
Traditional data warehouses need to be modernized to handle big data, integrate multiple data silos, reduce costs, and shorten time to market. A modern data warehouse blueprint includes a data lake, alongside the traditional warehouse, to land and ingest structured, unstructured, external, social, machine, and streaming data. Key challenges for modernization include making data discoverable and usable for business users, rethinking ETL to allow for data blending, and enabling self-service BI over Hadoop. Common tactics include using the data lake as a landing zone, offloading infrequently accessed data to Hadoop, and exploring data in Hadoop to discover new insights.
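The "data lake as a landing zone" tactic can be sketched with nothing but the Python standard library. This is an illustrative sketch under assumptions (the `raw/<source>/<date>/` layout and the `crm` source name are invented for the example): raw files are landed untouched, partitioned by source system and ingestion date, and transformed later.

```python
# Sketch of a raw landing zone: land incoming data as-is under a
# source/date partition layout (ELT style -- transform later, downstream).
# The directory layout and source names are assumptions for illustration.
import tempfile
from datetime import date
from pathlib import Path

def land_raw_file(lake_root, source, filename, payload, landed_on=None):
    """Write an incoming payload into the lake's raw landing zone,
    partitioned by source system and ingestion date."""
    landed_on = landed_on or date.today()
    target_dir = Path(lake_root) / "raw" / source / landed_on.isoformat()
    target_dir.mkdir(parents=True, exist_ok=True)
    target = target_dir / filename
    target.write_bytes(payload)  # store the bytes untouched
    return target

lake = tempfile.mkdtemp()  # stand-in for the lake's root storage location
path = land_raw_file(lake, "crm", "accounts.json", b'{"id": 1}',
                     landed_on=date(2024, 1, 15))
print(path.relative_to(lake))  # raw/crm/2024-01-15/accounts.json
```

Because the raw bytes are preserved, downstream jobs can re-derive warehouse tables at any time, and infrequently accessed history can stay in the cheap landing tier rather than the warehouse.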
top 5 ways SharePoint can help your business (McOWLMarketing)
SharePoint can help businesses in five key ways:
1. Provide multi-platform access through features like BYOD, cross-platform browsers, and non-enterprise device authentication.
2. Enable business process management with workflows, version control, managed metadata, and workspaces for collaboration.
3. Facilitate work performance reporting using business intelligence, PerformancePoint Services, Excel Services, and Power View.
4. Implement federated access and control through single data aggregation and internal, partner, and public access controls.
5. Establish a SKMS (Secure Knowledge Management System) to provide the right information to the right people at the right time through various SharePoint features.
Optimizing IT Costs & Services With Big Data (Little Effort!) - Case Studies ... (TeamQuest Corporation)
IT organizations have a wealth of Service Management and Service Delivery tools, processes and metrics that typically exist in relative isolation. This session will present detailed real-life examples of how existing tools and metrics can be brought together using big data techniques to optimize costs and performance of IT environments.
Modernize and Transform your IT with NetApp Storage and Catalogic Copy Data Management (Catalogic Software)
1. Catalogic ECX is a software-only copy data management platform that can help organizations modernize and transform their IT through use cases like hybrid cloud infrastructure, transforming operations, and empowering application development.
2. Most organizations recognize the need to modernize and transform IT but many are struggling to do so. Catalogic ECX can automate processes like infrastructure provisioning and copy management to help organizations modernize more efficiently.
3. Catalogic ECX works with existing storage infrastructure from vendors like NetApp, IBM, and EMC to provide capabilities like automated replication to the cloud and on-demand spin-up of cloud compute resources.
SOA IT is an IT services company founded in 2006 with over 200 employees specializing in Oracle and SAP implementations, support, and testing. They have experience implementing Oracle ERP modules like HCM, financials, supply chain management, and CRM for over 25 clients in various industries. SOA IT offers the full project lifecycle from implementation to support, upgrades, testing, and training. They have partnerships with Oracle and Worksoft for test automation and have experience implementing and supporting large Oracle and SAP systems.
Big Data 2.0: YARN Enablement for Distributed ETL & SQL with Hadoop (Caserta)
In our most recent Big Data Warehousing Meetup, we learned about the transition from Big Data 1.0, built on Hadoop 1.x and nascent technologies, to Hadoop 2.x with YARN, which enables distributed ETL, SQL, and analytics solutions. Caserta Concepts Chief Architect Elliott Cordo and an Actian engineer covered the complete data value chain of an enterprise-ready platform, including data connectivity, collection, preparation, optimization, and analytics with end-user access.
For more information on our services or upcoming events, please visit our website at http://www.casertaconcepts.com/.
5 Reasons Content Strategy & Content Engineering Go Together Like Milk and Or... (Kanban Solutions)
This document discusses how content strategy and content engineering work together to create effective content. It outlines five opportunities to create magic through content: 1) personalization, 2) presentation, 3) authoring experience, 4) reuse, and 5) governance. Each opportunity includes both strategic and engineering considerations. The document emphasizes that both strategy and engineering are needed to prioritize, plan, and execute content initiatives that drive results. It encourages readers to evaluate their own content opportunities using these lenses.
WebCenter Content & Portal Methodology Deep Dive with Case Studies (Brian Huff)
This document provides an overview and agenda for a WebCenter 101 session on Web development techniques, WebCenter architecture, and real-world solutions. The speakers are Jason Clarkin and Brian "Bex" Huff from Bezzotech. The agenda includes discussions on WebCenter overview, content and portal case studies, and unified solution tips and tricks. Other related sessions at the conference are also listed.
Integrating Workday with the Rest of the Enterprise (SnapLogic)
In this webinar, we talk about working with leading IT organizations on two primary challenges – the “cloudification” of your enterprise application portfolio and managing the resulting shift in data gravity. The webinar features integrating Workday with the rest of your data, applications and APIs using the SnapLogic Elastic Integration Platform, a unified, elastic iPaaS that is purpose-built for the new era of social, mobile, big data and cloud computing. We also review common use cases such as new employee on-boarding, sales compensation, payroll, talent analytics, active directory and back-office integration.
To learn more, visit: www.SnapLogic.com/Workday
Innovative Integrated Sustainable Development of Malwa Region :Fazilka Region...Love Fazilka
A Sampuran initiative where farmers and entrepreneurs shall create a profitable partnershipfor mutual benefit for the economic development of Malwa region. This has Integrated Sustainable Business Model covering;
* Agronomically sustainable
*Environmentally sensitive through management of waste straw in farms
*Economically viable engaging every farmer in the region to be a part of the value chain
*Management and Treatment of polluted ground water for general health improvement
*Development of various economic activities including, fish farms, dairy farms, biogas power plant
This presentation provides an introductory approach to “Sustainability 2.0” and FISDEV (Framework for Integrated Sustainable Development) an open source, collaborative methodology for corporate Sustainable Development.
The document discusses sustainable development. It defines sustainable development as meeting present needs without compromising future generations' ability to meet their own needs. Gro Harlem Brundtland first introduced the concept in 1987 as Prime Minister of Norway. Sustainability concerns balancing equity, environmental integrity, and economic efficiency. Achieving sustainable development means simultaneously achieving social progress, environmental protection, prudent resource use, and economic growth. It requires respecting environmental limits and improving quality of life. Analyzing sustainability issues at global, regional, national and local levels is important. For sustainable development to occur, there must be cooperation, wealth redistribution, respecting nature's limits, and community self-sufficiency.
This document provides an introduction to the topic of sustainability. It discusses key concepts like resources, population growth, sustainable development, and interdependence. Resources are defined as anything taken from the environment to make goods and products for human needs and wants. However, using resources leads to waste. Sustainable development means meeting current needs without compromising the ability of future generations to meet their needs. It involves considering consumption, waste, and responsible use of Earth's finite resources. The document stresses thinking globally about how our actions impact others and acting locally through sustainable practices in our own communities.
Sustainable Development Webinar Series: SD 101EOTO World
This webinar provided an overview of sustainable development and how youth can get involved. It defined sustainable development as meeting present needs without compromising future generations' ability to meet their own needs. Sustainable development involves balancing environmental protection, economic growth, and social equality. The webinar discussed the three pillars of sustainable development - environment, economy and society. It emphasized that youth have an important role to play by advocating for sustainable policies that will shape their future. The webinar equipped participants with talking points about sustainable development and tips for spreading awareness to others.
The growth leads to the depletion of natural resources of the planet. One of them is wood. We use unnecessary paper! Too much mess! Beware of CO2 imbalance... The immediate solution to stop destroying forests: dematerialization of exchanges with legal convincing value. Zero paper! The electronic originals are sealed and encrypted in a nominative and communicating electronic safe. The identification of counterparts is made via Magicaxess, a new high tech of identification WITHOUT having to download a digital certificate!
This document discusses various aspects of sustainable development at the neighborhood level, including water, waste management, green space, food, and energy. It emphasizes meeting environmental, economic, and social goals simultaneously (the triple bottom line). Some key sustainable practices mentioned are rainwater harvesting, composting, farmers markets, green roofs, solar panels, recycled and local building materials, and forms of renewable energy like wind and solar.
Multi Disciplinary Teams operating modelGarrie Irons
The document discusses an operating model for high-performing agile teams that addresses common challenges such as silos, duplication, and lack of innovation. It proposes transforming teams into multi-disciplinary teams that are self-sufficient, flat-structured, and embedded within business areas. Guiding principles include having self-sufficient business and IT teams with a product mindset. The model structure uses scrum and agile project management. Key learnings discussed include starting small, change management, portfolio planning, and using centers of excellence.
transforming how the world operates softwareAndrew Shafer
Quick run through of some ideas about continuously devopsing microservices for Velocity NY keynote. A bit about Pivotal, a bit about me, the industry, and you. Yes you...
Fusion Applications are intended to unify the best functionality from Oracle's traditional applications into a single suite delivered using Oracle's open technology platform. This will provide a complete, integrated, and next-generation user experience. Key aspects of Fusion Applications include consolidating features from Oracle's E-Business Suite, PeopleSoft, and Siebel applications, as well as new capabilities. Fusion is built using a service-oriented architecture and Oracle Fusion Middleware to enable integration, extensibility, and manageability. While Fusion Applications are still in development, Oracle is promoting strategies for customers to prepare for and eventually transition to the new platform.
Power BI offers customers rapid time-to-value by providing intuitive visual analytics tools that reduce the time needed to gain insights from data. It allows users to connect to various data sources, transform the data, create interactive visualizations and dashboards, and share insights collaboratively. Power BI provides a full stack of business intelligence capabilities including querying, modeling, visualizing, analyzing, and sharing on desktop, online, and mobile platforms.
This document provides a status report on the State Library's project to upgrade its website from a FrontPage-based system to one using the open source content management framework Drupal. It discusses the library's needs for a website that is easier for users, presents it as a cohesive organization, and leverages collaboration. It outlines the challenges of the current system and how Drupal can help by allowing for improved information architecture, easier content creation and updating, and a platform that can adapt to changing technologies. The report notes that while consultants were originally planned to assist, the library is now building out the Drupal environment itself, undergoing training, testing modules, and planning next steps to move content over from the existing site.
Actionable Data: Mastering the Hybrid Analytics MixPerficient, Inc.
With an increase in the adoption of cloud applications, most organizations today are in some form of hybrid state (i.e. using a combination of on-premise and cloud applications to run their business). Regardless of where the data resides, you need a complete view of the company spanning across different parts of the business, combining insightful data across both onsite and public cloud instances.
In this webinar, we looked at multiple approaches that organizations have successfully used to consolidate data from multiple cloud and on-premise applications and to perform seamless analytics across these varied data sources.
Modern data warehouses need to be modernized to handle big data, integrate multiple data silos, reduce costs, and reduce time to market. A modern data warehouse blueprint includes a data lake to land and ingest structured, unstructured, external, social, machine, and streaming data alongside a traditional data warehouse. Key challenges for modernization include making data discoverable and usable for business users, rethinking ETL to allow for data blending, and enabling self-service BI over Hadoop. Common tactics for modernization include using a data lake as a landing zone, offloading infrequently accessed data to Hadoop, and exploring data in Hadoop to discover new insights.
top 5 ways sharepoint can help your businessMcOWLMarketing
SharePoint can help businesses in five key ways:
1. Provide multi-platform access through features like BYOD, cross-platform browsers, and non-enterprise device authentication.
2. Enable business process management with workflows, version control, managed metadata, and workspaces for collaboration.
3. Facilitate work performance reporting using business intelligence, performance point services, Excel services, and Power View.
4. Implement federated access and control through single data aggregation and internal, partner, and public access controls.
5. Establish a SKMS (Secure Knowledge Management System) to provide the right information to the right people at the right time through various SharePoint features.
Optimizing IT Costs & Services With Big Data (Little Effort!) - Case Studies... (TeamQuest Corporation)
IT organizations have a wealth of Service Management and Service Delivery tools, processes and metrics that typically exist in relative isolation. This session will present detailed real-life examples of how existing tools and metrics can be brought together using big data techniques to optimize costs and performance of IT environments.
Modernize and Transform your IT with NetApp Storage and Catalogic Copy Data Management (Catalogic Software)
1. Catalogic ECX is a software-only copy data management platform that can help organizations modernize and transform their IT through use cases like hybrid cloud infrastructure, transforming operations, and empowering application development.
2. Most organizations recognize the need to modernize and transform IT but many are struggling to do so. Catalogic ECX can automate processes like infrastructure provisioning and copy management to help organizations modernize more efficiently.
3. Catalogic ECX works with existing storage infrastructure from vendors like NetApp, IBM, and EMC to provide capabilities like automated replication to the cloud and spin up of cloud compute resources on
SOA IT is an IT services company founded in 2006 with over 200 employees specializing in Oracle and SAP implementations, support, and testing. They have experience implementing Oracle ERP modules like HCM, financials, supply chain management, and CRM for over 25 clients in various industries. SOA IT offers the full project lifecycle from implementation to support, upgrades, testing, and training. They have partnerships with Oracle and Worksoft for test automation and have experience implementing and supporting large Oracle and SAP systems.
Big Data 2.0: YARN Enablement for Distributed ETL & SQL with Hadoop (Caserta)
In our most recent Big Data Warehousing Meetup, we learned about the transition from Big Data 1.0, built on Hadoop 1.x and its nascent technologies, to Hadoop 2.x with YARN, which enables distributed ETL, SQL, and analytics solutions. Caserta Concepts Chief Architect Elliott Cordo and an Actian engineer covered the complete data value chain of an enterprise-ready platform, including data connectivity, collection, preparation, optimization, and analytics with end-user access.
For more information on our services or upcoming events, please visit our website at http://www.casertaconcepts.com/.
5 Reasons Content Strategy & Content Engineering Go Together Like Milk and Or... (Kanban Solutions)
This document discusses how content strategy and content engineering work together to create effective content. It outlines five opportunities to create magic through content: 1) personalization, 2) presentation, 3) authoring experience, 4) reuse, and 5) governance. Each opportunity includes both strategic and engineering considerations. The document emphasizes that both strategy and engineering are needed to prioritize, plan, and execute content initiatives that drive results. It encourages readers to evaluate their own content opportunities using these lenses.
WebCenter Content & Portal Methodology Deep Dive with Case Studies (Brian Huff)
This document provides an overview and agenda for a WebCenter 101 session on Web development techniques, WebCenter architecture, and real-world solutions. The speakers are Jason Clarkin and Brian "Bex" Huff from Bezzotech. The agenda includes discussions on WebCenter overview, content and portal case studies, and unified solution tips and tricks. Other related sessions at the conference are also listed.
Integrating Workday with the Rest of the Enterprise (SnapLogic)
In this webinar, we talk about working with leading IT organizations on two primary challenges – the “cloudification” of your enterprise application portfolio and managing the resulting shift in data gravity. The webinar features integrating Workday with the rest of your data, applications and APIs using the SnapLogic Elastic Integration Platform, a unified, elastic iPaaS that is purpose-built for the new era of social, mobile, big data and cloud computing. We also review common use cases such as new employee on-boarding, sales compensation, payroll, talent analytics, active directory and back-office integration.
To learn more, visit: www.SnapLogic.com/Workday
This document discusses search, discovery, and productivity in the digital age. It notes that with hundreds of millions of emails and videos posted every day, a parallel digital universe has been created. Building a digital mirror of beliefs, motivations, and objectives is becoming a critical business challenge. It also discusses how search engines work, indexing the web, and making sense of the growing amount of online information.
1. Introduction to the Course "Designing Data Bases with Advanced Data Models... (Fabio Fumarola)
Information technology has led us into an era in which the production, sharing, and use of information are part of everyday life, often without our being aware of it: it is now almost impossible not to leave a digital trail of many of the actions we perform every day, for example through digital content such as photos, videos, and blog posts, and everything that revolves around social networks (Facebook and Twitter in particular). Added to this, with the "Internet of Things" we see a growing number of devices such as watches, bracelets, thermostats, and many other items that can connect to the network and therefore generate large data streams. This explosion of data explains the rise of the term Big Data: data produced in large quantities, at remarkable speed, and in varied formats, requiring processing technologies and resources that go far beyond conventional data management and storage systems. It is immediately clear that (1) storage models based on the relational model and (2) processing systems based on stored procedures and computational grids are not applicable in these contexts. Regarding point 1, RDBMSs, widely used for a great variety of applications, run into problems when the amount of data grows beyond certain limits. Scalability and implementation cost are only part of the disadvantages: very often, when facing the management of big data, variability, that is, the lack of a fixed structure, is also a significant problem. This has given a boost to the development of NoSQL databases. The website NoSQL Databases defines NoSQL databases as "Next Generation Databases mostly addressing some of the points: being non-relational, distributed, open source and horizontally scalable."
These databases are distributed, open source, horizontally scalable, schema-free (key-value, column-oriented, document-based, and graph-based), easily replicable, free of ACID guarantees, and able to handle large amounts of data. They are typically integrated with processing tools based on the MapReduce paradigm proposed by Google in 2004. MapReduce, together with the open source Hadoop framework, represents the new model for distributed processing of large amounts of data, supplanting techniques based on stored procedures and computational grids (point 2). The relational model taught in basic database design courses has many limitations compared to the demands posed by new applications based on Big Data, which use NoSQL databases to store data and MapReduce to process large amounts of data.
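The MapReduce paradigm mentioned above is easy to illustrate with a single-process word-count sketch in Python; a framework such as Hadoop distributes the same three phases (map, shuffle, reduce) across a cluster. This is a didactic toy, not Hadoop code:

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit (word, 1) pairs from each input document.
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def shuffle_phase(pairs):
    # Shuffle: group all emitted values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data big ideas", "data streams"]
counts = reduce_phase(shuffle_phase(map_phase(docs)))
# counts == {"big": 2, "data": 2, "ideas": 1, "streams": 1}
```

In a cluster, the map and reduce functions stay essentially this simple; the framework handles partitioning the shuffle across machines and recovering from failures.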
Course Website http://pbdmng.datatoknowledge.it/
Bridging Oracle Database and Hadoop by Alex Gorbachev, Pythian, from Oracle Op... (Alex Gorbachev)
Modern big data solutions often incorporate Hadoop as one of the components and require the integration of Hadoop with other components including Oracle Database. This presentation explains how Hadoop integrates with Oracle products focusing specifically on the Oracle Database products. Explore various methods and tools available to move data between Oracle Database and Hadoop, how to transparently access data in Hadoop from Oracle Database, and review how other products, such as Oracle Business Intelligence Enterprise Edition and Oracle Data Integrator integrate with Hadoop.
Kuali OLE is an open source library services platform developed by librarians for flexibility and integration. It has 66 members from 10 institutions and is funded by partners and the Mellon Foundation. The platform has four modules and provides selection/acquisition, ERM and linked data functionality. It offers hosted, local or hybrid implementation options and seeks to expand consortial support and full ERM functions.
Kuali OLE is an open source library management system that is entering its second year of development. In year one, the project formed partnerships, hired a core team, developed prototypes, and released an initial version. Key goals for year two include releasing version 1.0 in December 2012 and implementing strategies for partner institutions. The document highlights the collaborative development approach and innovative technologies being used to build a next-generation library system.
Presented at the 2011 SNRG Conference at Franklin & Marshall College, Lancaster, PA. Introduces MarcEdit and provides examples of its use in two different libraries using the SirsiDynix products Horizon and Symphony.
This document summarizes a presentation given by Judy McNally and Doreen Herold of Lehigh University about the challenges facing their technical services department and how they are adapting workflows to address changing trends. Key challenges include acquiring fewer print materials, an explosion of digital resources, reduced budgets, and changing staff roles. The department is shifting from print to electronic serials, outsourcing more work, and cross-training staff. Staff are taking on new roles like resolving access issues for electronic journals and doing more batch cataloging of materials like ETDs and SpringerLink titles. The department is also exploring new cataloging solutions like OLE.
Come to the Fiesta! Join the OLE Project (Doreen Herold)
Led by Duke University, the OLE Project intends to build a design document for an open source library management system which will be based on the software design philosophy of service oriented architecture (SOA). SOA is becoming a dominant trend in technology as early adopters have shown that it provides the benefit of an agile system, one that is flexible in response to information demands. Lehigh’s Doreen Herold and Tim McGeary will present the status of the OLE Project, its process, its goals, and how other PALINET members can participate.
Batch Cataloging: a Case Study of Loading Records for Lecture Notes in Compu... (Doreen Herold)
With consideration for issues in batch copy cataloging, such as record quality and labor, batch vs. one-at-a-time and in-house vs. vendor record sets, we'll share our experience with searching for, selecting, and loading a batch of records for the titles in the Lecture Notes in Computer Science e-book series, covering: search strategies in WorldCat; deduping and selecting; editing in MARCEdit and, finally; loading into Sirsi Dynix WorkFlows.
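The dedupe-and-select step in a batch workflow like this can be sketched with plain Python dictionaries standing in for parsed MARC records. A real workflow would use a MARC library (e.g., pymarc) and richer matching rules; the control-number key and the field-count "fullness" score below are illustrative assumptions only:

```python
# Given candidate records for the same titles from different sources,
# keep one record per control number, preferring the fuller record.
# Records are plain dicts standing in for parsed MARC; the field count
# as a crude fullness score is an illustrative heuristic only.

def dedupe(records, key="oclc_number", score="field_count"):
    best = {}
    for rec in records:
        current = best.get(rec[key])
        if current is None or rec[score] > current[score]:
            best[rec[key]] = rec
    return list(best.values())

batch = [
    {"oclc_number": "100", "source": "vendor", "field_count": 18},
    {"oclc_number": "100", "source": "WorldCat", "field_count": 31},
    {"oclc_number": "200", "source": "WorldCat", "field_count": 24},
]
keepers = dedupe(batch)
# One record per OCLC number; the fuller record wins for "100".
```

The same shape works whether candidates come from WorldCat searches or a vendor record set; only the key and scoring rule change.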
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
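Whatever model produces or enriches the markup, its output needs checking before it enters a pipeline. A minimal guardrail can be sketched with Python's standard library; the expected root element and required child below are assumptions for illustration:

```python
import xml.etree.ElementTree as ET

def check_generated_xml(xml_text, expected_root, required_path):
    """Return (ok, message) for a string an AI produced as XML."""
    try:
        root = ET.fromstring(xml_text)  # rejects non-well-formed output
    except ET.ParseError as err:
        return False, f"not well-formed: {err}"
    if root.tag != expected_root:
        return False, f"unexpected root <{root.tag}>"
    if root.find(required_path) is None:  # find() supports a limited XPath subset
        return False, f"missing required element {required_path}"
    return True, "ok"

# Hypothetical AI output; "article" and "title" are illustrative names.
generated = "<article><title>AI and XML</title><body>...</body></article>"
ok, msg = check_generated_xml(generated, "article", "title")
# ok is True for this input; malformed or incomplete output is flagged.
```

In practice you would go further and validate against the project's XSD or Schematron, but even a well-formedness gate catches a large share of model mistakes cheaply.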
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
HCL Notes and Domino License Cost Reduction in the World of DLAU (panagenda)
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-und-domino-lizenzkostenreduzierung-in-der-welt-von-dlau/
DLAU and licensing under the CCB and CCX model have been a hot topic for many in the HCL community since last year. As a Notes or Domino customer, you may be struggling with unexpectedly high user counts and license fees. You may be wondering how this new style of licensing works and what benefit it brings you. Above all, you surely want to stay within your budget and save costs wherever possible. We understand that, and we want to help!
We explain how to resolve common configuration problems that can cause more users to be counted than necessary, and how to identify and remove superfluous or unused accounts to save money. There are also practices that can lead to unnecessary spending, for example using a person document instead of a mail-in for shared mailboxes. We will show you such cases and their solutions. And of course we will explain the new licensing model.
Join this webinar, in which HCL Ambassador Marc Thomas and guest speaker Franz Walder introduce you to this new world. It will give you the tools and know-how to keep track of everything. You will be able to reduce your costs through an optimized Domino configuration and keep them low going forward.
These topics will be covered:
- Reducing license costs by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how best to use it
- Tips for common problem areas, such as team mailboxes, functional/test users, etc.
- Practical examples and best practices to implement right away
TrustArc Webinar - 2024 Global Privacy Survey (TrustArc)
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0! (SOFTTECHHUB)
As the digital landscape continually evolves, operating systems play a critical role in shaping user experiences and productivity. The launch of Nitrux Linux 3.5.0 marks a significant milestone, offering a robust alternative to traditional systems such as Windows 11. This article delves into the essence of Nitrux Linux 3.5.0, exploring its unique features, advantages, and how it stands as a compelling choice for both casual users and tech enthusiasts.
UiPath Test Automation using UiPath Test Suite series, part 6 (DianaGray10)
Welcome to UiPath Test Automation using UiPath Test Suite series part 6. In this session, we will cover Test Automation with generative AI and Open AI.
The UiPath Test Automation with generative AI and OpenAI webinar offers an in-depth exploration of leveraging cutting-edge technologies for test automation within the UiPath platform. Attendees will delve into integrating generative AI, as a test automation solution, with OpenAI's advanced natural language processing capabilities.
Throughout the session, participants will discover how this synergy empowers testers to automate repetitive tasks, enhance testing accuracy, and expedite the software testing life cycle. Topics covered include the seamless integration process, practical use cases, and the benefits of harnessing AI-driven automation for UiPath testing initiatives. By attending this webinar, testers and automation professionals can gain valuable insights into harnessing the power of AI to optimize their test automation workflows within the UiPath ecosystem, ultimately driving efficiency and quality in software development processes.
What will you get from this session?
1. Insights into integrating generative AI.
2. Understanding how this integration enhances test automation within the UiPath platform
3. Practical demonstrations
4. Exploration of real-world use cases illustrating the benefits of AI-driven test automation for UiPath
Topics covered:
What is generative AI
Test Automation with generative AI and Open AI.
UiPath integration with generative AI
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Infrastructure Challenges in Scaling RAG with Custom AI Models (Zilliz)
Building Retrieval-Augmented Generation (RAG) systems with open-source and custom AI models is a complex task. This talk explores the challenges in productionizing RAG systems, including retrieval performance, response synthesis, and evaluation. We’ll discuss how to leverage open-source models like text embeddings, language models, and custom fine-tuned models to enhance RAG performance. Additionally, we’ll cover how BentoML can help orchestrate and scale these AI components efficiently, ensuring seamless deployment and management of RAG systems in the cloud.
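The retrieval step at the core of a RAG system can be sketched independently of any particular model stack. The toy example below uses bag-of-words vectors in place of a real embedding model, and a three-document corpus invented for illustration; only the rank-by-similarity structure carries over to a production system:

```python
import math

def embed(text, vocab):
    # Toy embedding: bag-of-words counts over a fixed vocabulary.
    # A real RAG system would call an embedding model here instead.
    words = text.lower().split()
    return [words.count(term) for term in vocab]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def retrieve(query, corpus, k=2):
    # Rank documents by similarity to the query; the top-k become the
    # context passed to the language model for response synthesis.
    vocab = sorted({w for doc in corpus for w in doc.lower().split()})
    q = embed(query, vocab)
    ranked = sorted(corpus, key=lambda d: cosine(q, embed(d, vocab)), reverse=True)
    return ranked[:k]

corpus = ["Milvus is a vector database",
          "BentoML serves machine learning models",
          "Paris is the capital of France"]
context = retrieve("serve machine learning models", corpus, k=1)
```

The scaling challenges the talk addresses live around this loop: batching embedding calls, keeping the vector index fresh, and evaluating whether the retrieved context actually improves the generated answer.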
HCL Notes and Domino License Cost Reduction in the World of DLAU (panagenda)
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-and-domino-license-cost-reduction-in-the-world-of-dlau/
The introduction of DLAU and the CCB & CCX licensing model caused quite a stir in the HCL community. As a Notes and Domino customer, you may have faced challenges with unexpected user counts and license costs. You probably have questions on how this new licensing approach works and how to benefit from it. Most importantly, you likely have budget constraints and want to save money where possible. Don’t worry, we can help with all of this!
We’ll show you how to fix common misconfigurations that cause higher-than-expected user counts, and how to identify accounts which you can deactivate to save money. There are also frequent patterns that can cause unnecessary cost, like using a person document instead of a mail-in for shared mailboxes. We’ll provide examples and solutions for those as well. And naturally we’ll explain the new licensing model.
Join HCL Ambassador Marc Thomas in this webinar with a special guest appearance by Franz Walder. It will give you the tools and know-how to stay on top of what is going on with Domino licensing. You will be able to lower your costs through an optimized configuration and keep them low going forward.
These topics will be covered
- Reducing license cost by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how to best utilize it
- Tips for common problem areas, like team mailboxes, functional/test users, etc
- Practical examples and best practices to implement right away
Essentials of Automations: The Art of Triggers and Actions in FME (Safe Software)
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
Unlock the Future of Search with MongoDB Atlas: Vector Search Unleashed (Malak Abu Hammad)
Discover how MongoDB Atlas and vector search technology can revolutionize your application's search capabilities. This comprehensive presentation covers:
* What is Vector Search?
* Importance and benefits of vector search
* Practical use cases across various industries
* Step-by-step implementation guide
* Live demos with code snippets
* Enhancing LLM capabilities with vector search
* Best practices and optimization strategies
Perfect for developers, AI enthusiasts, and tech leaders. Learn how to leverage MongoDB Atlas to deliver highly relevant, context-aware search results, transforming your data retrieval process. Stay ahead in tech innovation and maximize the potential of your applications.
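As a concrete anchor for the implementation-guide material, an Atlas vector search query is expressed as an aggregation pipeline whose first stage is `$vectorSearch`. The sketch below only constructs the pipeline document; the index name, embedding field, and query vector are placeholders, and the stage options should be checked against current MongoDB Atlas documentation:

```python
# Build an aggregation pipeline for MongoDB Atlas Vector Search.
# "plot_index", "plot_embedding", and the query vector are placeholders;
# verify the stage options against the current Atlas documentation.

def vector_search_pipeline(query_vector, index="plot_index",
                           path="plot_embedding", limit=5):
    return [
        {
            "$vectorSearch": {
                "index": index,               # name of the vector index
                "path": path,                 # field holding the embeddings
                "queryVector": query_vector,  # embedding of the user query
                "numCandidates": limit * 20,  # wider candidate pool than limit
                "limit": limit,
            }
        },
        # Project the fields callers care about, plus the relevance score.
        {"$project": {"title": 1, "score": {"$meta": "vectorSearchScore"}}},
    ]

pipeline = vector_search_pipeline([0.1, 0.2, 0.3], limit=3)
# With pymongo this would run as: collection.aggregate(pipeline)
```

Keeping `numCandidates` a multiple of `limit` trades a little latency for better recall from the approximate index, which is a common tuning knob in these systems.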
#MongoDB #VectorSearch #AI #SemanticSearch #TechInnovation #DataScience #LLM #MachineLearning #SearchTechnology
UiPath Test Automation using UiPath Test Suite series, part 5 (DianaGray10)
Welcome to UiPath Test Automation using UiPath Test Suite series, part 5. In this session, we will cover CI/CD with DevOps.
Topics covered:
CI/CD within UiPath
End-to-end overview of a CI/CD pipeline with Azure DevOps
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
AI 101: An Introduction to the Basics and Impact of Artificial Intelligence (IndexBug)
Imagine a world where machines not only perform tasks but also learn, adapt, and make decisions. This is the promise of Artificial Intelligence (AI), a technology that's not just enhancing our lives but revolutionizing entire industries.
How to Get CNIC Information System with Paksim Ga (danishmna97)
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
Pushing the limits of ePRTC: 100ns holdover for 100 days (Adtran)
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor... (Neo4j)
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor... (SOFTTECHHUB)
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
8.
• Operations: aging and inflexible; proprietary market not responsive
• Technology: digital content (own to access); social and mobile
• New practices: work-arounds
• Staffing
• Sustainable funding
• Managing change
11. WHAT IS APPLICATION LIFECYCLE MANAGEMENT?
by David Chappell, December 2008, Sponsored by Microsoft Corporation
17. Levers for Change: Processes (Current → Future)
• Manual → Batch
• Duplicate → Once
• Multiple systems, sources of information → Systems consolidated, centralized
• Print-based → Flexible for the unforeseen
• Library services left out in the cold → Room in the inn
19. Kuali OLE as an opportunity for strategic alignment of systems: apply one set of goals and resources to a collection of systems to move toward an organizational goal.
• Aligned with strategic goals and values (customer focused, effective, flexible, sustainable, secure)
• Aligned technologically for efficient workflows, communication, management
20. ILS and Enterprise Systems: Pre-Kuali OLE Challenges. Disconnected systems included:
• ILLiad (ILL)
• WebCat (OPAC)
• Palci EZ-Borrow (ILL)
• ContentDM (DMS)
• E-Resource Management
• License Management
• Oracle (IDM)
• SirsiDynix Symphony (ILS)
• Banner (Finance)
24. Integration: ILS Development Environment
Feature | Closed System | Open System
User Interface | Vendor supplied | Programmer access
Staff Interface | Vendor supplied | Programmer access
Data Stores | API (maybe) | DBA access, API
Modules | Vendor supplied | Programmer access
Bug fixes, enhancements | Vote! | Community
Support | Vendor | Community
Pricing | Vendor | Foundation
25. Open System Integration. Systems integrated around shared data stores:
• VuFind (Discovery)
• Palci E-ZBorrow via NCIP
• Patron Data (Oracle)
• Authentication (LDAP, SSO)
• ILLiad (NCIP add-ons)
• Financial (Oracle)
• E-Resources
• Digital Collections
• License Management
• Data Stores
Editor's Notes
Welcome and introductions. My name is James Young. I'm director of administration and planning for Library and Technology Services at Lehigh University. I'd like to introduce my partners and co-presenters: Mark Canney, manager of our Lending Services team and system administrator of our current integrated library system, and Doreen Herold, manager of library technical services. Before we get started I'd like to thank a couple of folks in the room: our Vice Provost and leader, Bruce Taggart, who is largely responsible for bringing Lehigh into the Kuali OLE project, and two of the directors most closely associated with the project: Sharon Wiles-Young, Director of Access Services, and Chulin Meng, Director of Library Technology.
Here is a quick overview of what we’ll be discussing over the next 45 minutes: a brief background and strategy for the OLE Project at Lehigh, some context on our planning process, three levers for changes: process, staffing, and technology. We’ll end the talk with the synthesis and a short conversation about the promise of Kuali OLE at Lehigh.
The overarching goal of our talk is to use software development as a vehicle for change: in processes; in how we design, deploy, sustain and bridge relationships among disparate technologies; in our staffing and larger organizational change; and in our long-standing (and expected to continue) rich collaborations with other institutions.
We settled on an extended metaphor for these sets of activities and actions that we will carry through the talk: "from scattered buffaloes to big tent."
Lehigh University is a residential research university in Bethlehem, PA (an hour and a half west of New York City and an hour north of Philadelphia). We have an enrollment of 6,800 FTE (4,800 of whom are undergraduates) and four colleges with distinct cultures and service environments. The libraries are part of a larger, merged organization called Library and Technology Services (aka LTS). Of the 170 or so staff in LTS, approximately 40 report to the library. We have three buildings housing about 1 million volumes and, as you might imagine, have been riding the wave of a rapid move to digital content for quite a while now. The university's strategic plan was adopted in 2009 and the campus budget is $425 million.
The defining variables for our participation in the Kuali OLE project are the following.
First, our organizational ethos (the charge often espoused by Bruce) stems, in part, from the vast resources and deliberate design of the libraries being part of a merged organization. Even more, it is our organization strategically choosing a handful of high-profile projects that demonstrate a tolerance for risk and an increased openness: openness in our choice of systems to deploy, support, and grow, and openness in our culture, in how we leverage resources and intelligence across boundaries. Finally, and perhaps assumed, we deliberately link technology, budget, and organizational planning in a way that supports and extends our efforts.
Moreover, we were dealing with three forces which served as impetus for change: new practices on behalf of our users, who desire flexible and mobile access to content; an aging and inflexible ILS, around which we had historically developed a culture of work-arounds; and a recognition that our operating culture needed to be one of anticipating change rather than reacting to it. Part of our impetus for change was a forward-thinking and sustainable funding model.
Knowing that we'd eventually want to go out to market for a new ILS, Bruce and two of our directors had the foresight to plan ahead and put some monies in a reserve fund. This money accrued over time, which gave us the freedom to join the Kuali OLE project as one of the original partners. This is coupled with a shift we're seeing on our campus, in spots, from a maintenance-centric operating culture to a more development-centric one. Long term, we hope our funding model will play out with a lower cost of ownership and a return on investment; unfortunately, it's too early to address either.
So what are we planning for? Our goal is better software at a better value: a library system that more flexibly allows us to manage and provide robust access to information and content of all formats; a system that is substantially more open and agile and that will serve as a vehicle for creating efficiencies, sparking collaboration, and, ultimately, providing greater value to our students and our campus. I will now hand off to Doreen to tell you more about our planning processes and staffing.
When the idea of OLE first arose, application lifecycle management wasn’t something that was in front of us as a means of approaching the project in a structured fashion. But this graphic of the basics of application lifecycle management provides a good illustration of how things have moved since the beginning.

The first phase of OLE, which was called the OLE Project and took place during 2008 and 2009, started with the idea of developing a next-generation library management system. Phase one lasted for a year and included asking ourselves what we wanted and needed and how we should go about making that happen. Towards the end of that first year, we knew that, to sustain the project, we’d have to develop a governance structure, and that’s when it was decided to move under the umbrella of Kuali.

Development of Kuali OLE started in the fall of 2010 and, fortunately, we had a lot of information to build on: information we collected during phase one of the project. In a little bit, I’ll tell you about the workshops we held during phase one that laid the groundwork to begin development of OLE. And the “operations line is intimately connected to the development line” (p. 5 of What Is Application Lifecycle Management), which I’ll also tell you a little about in a bit.
The first phase of OLE, which we had the benefit of the Andrew W. Mellon Foundation’s support to make happen, was a year-long investigation of what libraries’ needs were, whether our current systems were meeting those needs, and what to do about it. Towards the end of that year, it was decided to take the next step and actually begin developing a new system. But, before diving into that work, we had discussions about sustaining such work. The Kuali Foundation was already alive and well and welcomed OLE to the Foundation.

In the graphic you can see what is now the governance structure of OLE. Our OLE member schools have representation throughout this structure. For Lehigh, that means our Vice Provost Bruce Taggart represents Lehigh on the Project Board, and our Director of Access Services Sharon Wiles-Young has been serving on the Functional Council, recently joined by Chulin Meng, our new Director of Library Technology. The Functional Council discusses desired functionalities and votes to prioritize and help determine what goes into which releases. Our Senior Developer Michelle Suranofsky serves on the Technical Council, where members also vote on functionalities. Other staff from Lehigh also participate in OLE, and we’re seeing how participating helps influence decisions on how OLE will develop. We’re also glad to represent small- to mid-size institutions, as we’re one of the smaller schools in the partnership.
Development began with 12 regional design workshops held in 2008-2009, which resulted in ideas and feedback from 370 people representing 125 institutions. As the project progressed, subject-matter experts (SMEs) from partner libraries continued to refine those ideas and move towards granularity as OLE began development. Staff participate in discussions with their peers from the OLE partner libraries: building and prioritizing lists of desired functionalities, writing functional specification documents, dialoguing with programmers upon the completion of new functionalities, testing those functionalities, wiping their brows, and repeating the process again.
Deployment, testing, implementation. Operations is intertwined with development.
I want to tell you a few things about how our involvement in OLE has helped us begin to rein in our “scattered buffaloes”. Since we’re still in the development process of OLE and will be, with the University of Chicago, one of the first implementers, we don’t yet know the full scope of value that we’ll see from OLE. But here are some of the changes that we have seen, as well as some of our expected changes.
Jim has already provided some background on the structure of Lehigh’s Library and Technology Services department. You can see in the subunit list that the library’s services are performed directly by three subunits: Library Access Services, Library Collections, and Library Technology.

We have seen the Library Technology team change since our involvement with OLE started in 2008. At that time, it was a component of the Library Collections and Systems unit, consisting of three staff: two technical staff and a digital library project manager. In five years, it became an independent unit and now has 5.5 FTE, including a director. Two of those staff are Java programmers, which was a significant development for Lehigh.

Being involved with OLE during these early stages has not only had an impact on the growth of our staffing, as seen with the Library Technology unit; existing staff have also benefitted from being in a position to contemplate our current practices. This contemplation occurs as we interact more, going beyond the interactions of our daily routines and moving towards determining what we need and want out of OLE and how we’ll go about implementing it. We’ve been collaborating more within our teams (e.g., acquisitions and cataloging), within library groups (e.g., technical services and library technology), within LTS (e.g., Mark and Sarah), and within the University (Michelle and ????? discussing how OLE will connect with our financial system, Banner).

This reflection by our library staff speaks to one of the primary visions of OLE: that we won’t just build an open source system with the exact same functionality as our current ILSs, but a system based on questioning the library’s place in the university’s ecosystem (and how we can better share data across the university), and on questioning our current processes and how we can improve them.
Some of our current processes are what they are because of the limitations of our ILSs and the limitations of our funds. Some of our systems do have modules to support processes we’d like to adopt, but they’re sold as separate modules, requiring additional monetary outlays and annual fees, rather than as an evolution of our current systems to meet the changing needs of the resources we acquire (and, yes, I’m talking electronic resources here).

Building such a system means rethinking old practices. One example is letting go of predictive check-in: the initial struggle (given the amount of our resources that are print-based), balancing the need against the resources required to develop such functionality, and acknowledging that we are more alike than different (one of the OLE Project assumptions that grew out of the regional workshops), including revisiting local practices to see which are still valid and which are done purely by tradition.

As we seriously question our current processes, we realize we have the opportunity to increase the value of this new system: to make it one that will foster progress and flexibility so that, whatever resources come down the pike 10 or 20 years from now, we’ll have a system that can evolve to help us manage those resources and make sure they get to our users.

I’m now going to pass the discussion on to Mark.
Services. Good information professionals are discerners; they are authenticators; they are practiced in objective judgment. They have a broad perspective on campus life. They partner widely to cultivate and frame services that are useful, relevant, and integrative. Without a doubt, users have access to plenty of information. What they need, however, is facilitative guidance in finding and using the right information at the right time to complete an assignment or solve a particular problem. They also need tools to help them do this.
We end with excitement. We realize that going first brings a certain amount of risk. However, we also know that our users demand that our staffing, our resources, our services, and our technologies be agile. We can’t wait to plan to meet their needs; we need to anticipate them: with open content, with flexible services for mobile students that allow for a personal experience, and with applications that are flexible enough to grow and develop in ways we can’t yet predict.
Solidify roles for content managed web channels (news, events, directories)