Presentation on the main changes in the requirements model for electronic document management systems, including records management.
This document provides an update on MoReq2010 and discusses its future. Some key points:
- MoReq2010 was launched in 2011 to broaden adoption and introduce interoperability between records systems.
- It incorporates a service-oriented architecture and modular approach to extend compliance to more software.
- Planned additions through 2012 include import/export services and industry-specific modules like healthcare.
- The DLM Forum will accredit testing centers and develop an international network to certify MoReq2010 compliance.
- Upcoming conferences will discuss further development and extending MoReq2010's use through standards and regulations.
Talk by Anatoly Levenchuk, "Why the ISO 15926 oil and gas data integration technology can solve problems that could not be solved by previous generations of technologies and previous generations of data integration standards", at the conference on industrial data integration, 29 April 2013, Moscow.
The document discusses the Semantic Web and provides examples of how it can improve upon current web technologies. It describes the Semantic Web as adding meaning to web content that is accessible to machines, allowing more advanced capabilities like natural language search and integration of information across documents. Examples given include allowing more precise knowledge management in organizations and enabling sophisticated shopping agents for both B2C and B2B commerce.
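The idea of machine-accessible meaning can be made concrete with a toy example. The sketch below (plain Python with invented identifiers, not a real RDF library) stores statements as subject-predicate-object triples and answers a question that keyword search over documents could not:

```python
# Minimal illustration of machine-readable statements; all names invented.
from typing import List, Optional, Tuple

Triple = Tuple[str, str, str]  # (subject, predicate, object)

# A few "semantic" statements about documents and people.
graph: List[Triple] = [
    ("doc:report-2024", "dc:creator", "person:alice"),
    ("doc:report-2024", "dc:subject", "topic:records-management"),
    ("person:alice", "foaf:worksFor", "org:acme"),
]

def query(graph: List[Triple],
          s: Optional[str] = None,
          p: Optional[str] = None,
          o: Optional[str] = None) -> List[Triple]:
    """Return all triples matching the given pattern (None = wildcard)."""
    return [t for t in graph
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "Who created documents about records management?" -- a join of two patterns.
docs = [t[0] for t in query(graph, p="dc:subject", o="topic:records-management")]
creators = [t[2] for d in docs for t in query(graph, s=d, p="dc:creator")]
```

The point is that the join across statements is mechanical once meaning is encoded as data rather than prose.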
First Name: Piotr
Last Name: Palacz
Email: piotr@palacz.net
Title: Pharo in Corner Cases of the Enterprise
Abstract: Based on my experiences from using Pharo over the last three years in three different industrial projects, I propose to discuss the following:
- What are the areas where Pharo can be used successfully in an enterprise context?
- What are ways of introducing Pharo to organizations that are averse to non-mainstream technology?
- What in Pharo is attractive to the independents and small companies who provide IT services pertaining to the lifecycle of enterprise systems?
- What in Pharo could be improved to increase this attractiveness?
Bio: My first contact with Smalltalk was around '87 - I took a Smalltalk course at school, using Digitalk/V.
I worked with Smalltalk full time between 1990 and 2002 or so, in Australia and in the US, mostly in enterprise-level applications and systems.
After several years of working with big companies and big systems in the US as Technical/Solution/Application/Lead/etc Architect,
I went independent and consequently had the opportunity to use Smalltalk (Pharo specifically) on three different projects over the last 3+ years.
This document discusses the Anuket Multi-Cloud Interaction Model, which provides an abstract model to represent interactions across a distributed "Telco Cloud" landscape. It defines core roles like the Communications Service Provider (CSP) and Cloud Providers. It outlines high-level interactions around managing accounts, connectivity, resources, applications/VNFs, and transactions. It presents sample scenarios and discusses API/cloud brokerage approaches. It proposes criteria for identifying relevant industry standards and technologies. Finally, it suggests surveying options for a "Hybrid, Edge, and Multi-Cloud unified management Platform" and bringing related projects together in the next phase of Anuket "Nile".
Spatial Data Infrastructure (SDI) aims to provide access to harmonized geographic data through distributed information systems. Key components of an SDI include organizational governance, spatial data and metadata, geospatial services, technical infrastructure, and standards to ensure interoperability. Open standards like those from ISO and OGC provide interfaces and data models to allow disparate systems and data sources to work together efficiently for semantic and technical interoperability. Ensuring data quality and developing terminology to describe accuracy is also important for effective use of data in an SDI.
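Standardized service interfaces are what make this interoperability work in practice. As a hedged sketch, the snippet below builds an OGC WMS 1.3.0 GetMap request from typed arguments; the endpoint URL and layer name are placeholders, not real services:

```python
# Build a WMS 1.3.0 GetMap URL: interoperability here is just an agreed
# set of query parameters that any compliant client and server understand.
from urllib.parse import urlencode

def wms_getmap_url(base: str, layers: list, bbox: tuple,
                   width: int, height: int, crs: str = "EPSG:4326") -> str:
    """Assemble a WMS GetMap request URL from typed arguments."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "CRS": crs,
        "BBOX": ",".join(str(c) for c in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/wms", ["roads"],
                     (47.0, 5.0, 55.0, 15.0), 800, 600)
```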
Service Management Board (SMB) and Service Providers' Forum (SPF), EOSC-hub project
The document discusses communications requirements and logistics for coordinating service providers in EOSC-hub. It proposes establishing a Service Provider Forum and Service Management Board to facilitate scalable communication with the potentially large number of service providers. The SPF would be open to all providers while the SMB would focus on high participation providers. Meetings would be virtual to reduce demands on time. Harmonization of policies and practices across services is also discussed to provide consistent user experiences despite heterogeneous backgrounds.
EDF2013: Selected Talk Josep-L. Larriba-Pey: The Linked Data Benchmark Counci..., European Data Forum
Selected talk of Josep-L. Larriba-Pey, DAMA-UPC, Universitat Politècnica de Catalunya, BarcelonaTech, Director, at the European Data Forum 2013, 9 April 2013 in Dublin, Ireland: The Linked Data Benchmark Council, benchmarking RDF and Graph technologies.
Istio as an Enabler for Migrating Monolithic Applications to Microservices v1.3, Ahmed Misbah
Migrating application architectures to microservices is considered a key area of transformation in the IT world. Modernizing legacy applications to Kubernetes-based microservices can prove to be very challenging if not planned correctly, taking into consideration the right technologies and enablers.
This session explains how Istio can be used as an enabler for modernizing legacy monolithic applications to microservices. Topics covered in the presentation will include:
1- Advantages of migrating to microservices and service mesh
2- Designing a microservice application based on splitting an existing monolithic application
3- Implementing microservices iteratively as a strangler fig application with Istio
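The strangler fig approach in point 3 boils down to weighted traffic splitting: Istio's VirtualService routes a configurable share of requests to each extracted service, and that share grows as the migration progresses. The Python sketch below (service names and weights are illustrative, not from the session) shows the routing logic such a rule implements:

```python
# Weighted routing, the core mechanism behind a strangler-fig migration:
# most traffic still hits the monolith, a growing slice goes to the new service.
import random

def route(weights: dict, rng=random.random) -> str:
    """Pick a backend according to its traffic weight (weights sum to 100)."""
    point = rng() * 100
    cumulative = 0
    for backend, weight in weights.items():
        cumulative += weight
        if point < cumulative:
            return backend
    return backend  # fall through on rounding error

# Early in the migration: 90% monolith, 10% new 'orders' microservice.
weights = {"legacy-monolith": 90, "orders-service": 10}
```

Shifting the migration forward is then just a change of the weight values, with no change to clients.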
Up to €67.4 million is foreseen from the 2020 CEF Telecom Work Programme for grants managed by INEA in the area of Generic Services. The grants under CEF Telecom helped European public administrations and businesses to hook up to the core platforms of the digital services that are the object of the calls.
In particular, €5 million was made available in 2019 and €3 million in 2020 for projects oriented towards 'Open Data' management.
GreenMov, ODALA and INTERSTAT have developed services and products that can be easily adopted by public administrations and beyond, thanks to the funding of the CEF programme targeted at Open Data.
The purpose of this event is not only to present results and demos or to provide technical guidelines for developers; it is also a moment of reflection on the lessons learned and best practices from years of project activity, to analyse the impact for public administrations and, finally, to test the value of GreenMov, INTERSTAT and ODALA in solving future problems.
The MMI Device Ontology: Enabling Sensor Integration, Carlos Rueda
The document summarizes a presentation about the MMI Device Ontology project. The project aims to develop an ontology of marine devices to help integrate sensor data. It involves defining classes and properties to characterize devices, measurements, and deployments. The ontology is being developed through use cases and community input, with the goal of enabling discovery and integration of sensor and observation data.
Moreq2 is a European standard that describes requirements for electronic records management systems. It aims to standardize how electronic records are organized, captured, stored, retained, and disposed of. Moreq2 compliance will be tested through a standardized test framework. This will allow ECM systems suppliers to be certified and trusted across Europe. However, implementing Moreq2 presents challenges as countries have different existing laws and practices regarding records management. National chapter zeros will need to align each country's legislation with the Moreq2 standard.
SOA Mainframe Service Architecture and Enablement Practices Best and Worst Pr..., Michael Erichsen
This document outlines best and worst practices for mainframe service architecture and enablement. It discusses seven case studies of implementing service-oriented architectures on mainframe systems. The case studies demonstrate different technical approaches to exposing legacy mainframe applications as web services, including using CICS, WebSphere, and middleware to interface with COBOL and other applications. The document also discusses challenges of mapping data between XML and legacy formats like COBOL and ensuring interface definitions are compatible.
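One of the mapping challenges mentioned, turning a fixed-width COBOL record into XML, can be illustrated with a short sketch. The copybook layout below (field names, offsets, PIC clauses) is invented for the example:

```python
# Map a fixed-width mainframe record to XML by slicing at copybook offsets.
import xml.etree.ElementTree as ET

# Hypothetical copybook: CUST-ID PIC 9(6), CUST-NAME PIC X(20), BALANCE PIC 9(9).
# Each entry is (element name, start offset, end offset).
LAYOUT = [("CustomerId", 0, 6), ("Name", 6, 26), ("Balance", 26, 35)]

def record_to_xml(record: str) -> ET.Element:
    """Slice a fixed-width record into named XML elements."""
    root = ET.Element("Customer")
    for name, start, end in LAYOUT:
        ET.SubElement(root, name).text = record[start:end].strip()
    return root

record = "000042John Smith          000123456"
xml_text = ET.tostring(record_to_xml(record), encoding="unicode")
```

Real copybooks add packed decimals, implied decimal points and REDEFINES, which is exactly where the mapping pain described above comes from.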
The document discusses advances in standardizing the Industry Foundation Classes (IFC) data schema for building information modeling (BIM) as an OWL ontology (ifcOWL) to enable semantic web technologies. It provides an overview of converting the IFC EXPRESS schema to ifcOWL, examples of IFC models represented as RDF graphs, and an application scenario for a sustainable factories semantic framework that integrates software tools using ifcOWL and additional domain ontologies.
The document summarizes the European Open Science Cloud (EOSC). It discusses the first phase of EOSC from 2018-2020 which is addressing six roadmap action lines through various H2020 projects. The second phase beginning in 2020 is dependent on an evaluation of the first phase. Current EOSC governance is working to steer initial implementation and transition to the second stage. Several working groups have been established to work on key outputs around rules of participation, landscape and sustainability analysis, architecture, and FAIR data principles. The transition to the second phase will require addressing issues around governance, funding, and establishing a core infrastructure.
Enabling IoT Devices’ Hardware and Software Interoperability, IPSO Alliance (...), Open Mobile Alliance
Presentation delivered during the Internet of Things World, Santa Clara pre-event workshop by Christian Legare - IPSO Alliance Chairman, Chief of Software Engineering, Micrium (Part of Silicon Labs)
Internet Protocol for Smart Objects (IPSO) is an alliance that, among other things, defines a data model to represent sensor values and attributes. OMA uses IPSO Smart Objects v1.0 as its resource model to expose sensor information to a remote LwM2M Server. From the IPSO Alliance speaker, you will learn:
● What is an IPSO Smart Object data model
● What do these Objects and Resources look like
● How to create and register your own resources
● What is next for IPSO Alliance
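The Object/Resource model behind the first two bullets can be sketched in a few lines: sensor data is exposed as numbered Objects containing numbered Resources, addressed by a uniform path (Object ID / instance / Resource ID). The IDs below follow the commonly published IPSO temperature object (3303) and sensor-value resource (5700), but treat them as illustrative rather than a normative registry excerpt:

```python
# A rough sketch of the IPSO Smart Object data model as plain Python data.
temperature_object = {
    "object_id": 3303,          # IPSO Temperature (illustrative)
    "instance": 0,
    "resources": {
        5700: {"name": "Sensor Value", "value": 21.5, "units": "Cel"},
        5601: {"name": "Min Measured Value", "value": 18.2, "units": "Cel"},
    },
}

def read_resource(obj: dict, resource_id: int):
    """Resolve a path like /3303/0/5700 to its current value."""
    return obj["resources"][resource_id]["value"]

value = read_resource(temperature_object, 5700)
```

Because every vendor uses the same numeric IDs, an LwM2M server can read a temperature from any compliant device without device-specific code.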
This presentation was provided by Peter Collins of OCLC, Sebastian Hammer of Index Data, Allen Jones of The New School, and Nettie Lagace of NISO, as part of the NISO Standards Update on "Interoperability of Systems: Controlled Digital Lending (IS-CDL)" held during ALA Annual on June 25, 2023.
This document discusses best practices for migrating to a service-oriented architecture (SOA). It recommends embracing heterogeneity and complexity by abstracting across different software systems using standards like XML and web services. A successful SOA migration requires changing organizational structures and skills to focus on reusable services, incremental changes, and architectural best practices. Service contracts that separate interface from implementation are also key to enabling reuse and flexibility.
This presentation provides the latest information on the OASIS Topology Orchestration Specification for Cloud Applications (TOSCA) v1.0 standard. TOSCA is a standard language used to describe a topology of cloud based web services, their components, relationships, and the processes that manage them. Key TOSCA concepts such as operational policy modeling, declarative composition and lifecycle management are covered along with the benefits both cloud customers and providers derive from using this standard. In addition, open source tooling support for TOSCA in projects such as OpenStack and the newly announced Aria project from Cloudify are discussed. Insight is given to the direction of the v1.1 specification and its timeline.
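TOSCA's declarative composition means the orchestrator, not the user, derives the deployment order from typed relationships (HostedOn, DependsOn) between node templates. A minimal sketch of that derivation, with invented node names and no real TOSCA parsing:

```python
# Derive a deploy order from a dependency graph, as a TOSCA orchestrator
# does from node templates and their relationships.
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# node -> the nodes it depends on (which must be deployed first)
topology = {
    "web_app": {"app_server"},
    "app_server": {"vm"},
    "database": {"vm"},
    "vm": set(),
}

deploy_order = list(TopologicalSorter(topology).static_order())
```

The user states only what depends on what; ordering, and by extension teardown in reverse, falls out of the graph.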
The document discusses micro-ROS, an open source project to extend ROS2 (Robot Operating System) for use on microcontrollers. It aims to allow ROS to be used on resource-constrained embedded devices. Micro-ROS will use a modified version of micro-XRCE-DDS as its middleware and integrate with ROS2 and FIWARE for interoperability. It will provide predictable scheduling, lifecycle management, and an efficient transform library. Micro-ROS will be tested and benchmarked on a reference hardware platform running NuttX. Potential use cases include drones, robot sensors, lawnmowers, and smart warehouses.
The document discusses micro-ROS, an open source project that aims to extend ROS2 (Robot Operating System 2) to allow its use in microcontrollers. It will create a common framework called micro-ROS to support ROS on resource-constrained embedded devices. Micro-ROS will select NuttX as its operating system and test on STM32 microcontrollers. It will use micro-XRCE DDS as its middleware and provide interoperability with ROS1, ROS2 and FIWARE. The project aims to enhance ROS for microcontrollers in areas like predictable scheduling, system lifecycles, and an efficient transform library. It will also include full benchmarking and test multiple use cases.
The public sector and industry have common data challenges
Semantic Web
Information and knowledge management
The Semantic Days conference
Semantic Network
OLF’s Integrated Operations
Generation 1 and 2
Way of working
Achieving generation 2
Repository
Importance of data
Ontology
Reference IT architecture
POSC Caesar Association (PCA)
E&P Information Management (EPIM)
Summing up
IPTC Rights Expression Working Group 2013 June AGM, Stuart Myles
Rights expression for the news industry, using RightsML and ODRL. Updates on the IPTC's Machine Readable Rights Day and the Rights 1.0 spec. Plus documentation and examples of using RightsML and the work so far to embed rights in photo binaries, using JSON.
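What an embedded machine-readable rights statement might look like can be sketched as JSON. The field names below follow the general ODRL shape (permission, target, action, constraint) but are illustrative, not a normative RightsML serialization:

```python
# An ODRL-style rights policy serialized as JSON, of the kind that could be
# embedded in a photo's metadata. Structure and values are illustrative.
import json

policy = {
    "policytype": "http://www.w3.org/ns/odrl/2/Set",
    "permission": [{
        "target": "http://example.com/photos/12345.jpg",
        "action": "display",
        "constraint": [{"leftOperand": "spatial",
                        "operator": "eq",
                        "rightOperand": "DE"}],
    }],
}

embedded = json.dumps(policy)    # string to embed alongside the photo binary
restored = json.loads(embedded)  # a downstream consumer reads it back
```

A consumer can then enforce the constraint mechanically instead of interpreting prose rights notes.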
By law, public entities must incorporate planning and management control models such as GP1000 and MECI into their practices. Private companies, for their part, do so as a market requirement, in order to guarantee customer satisfaction, continuously improve their organizational processes, and control their various processes through standards such as ISO 9001, 14001, 18000 and 27000, among others.
All management systems involve the production of enormous quantities of documents, forms and records that must be controlled, since they are used to describe and characterize processes, to indicate the standardized way of carrying out the organization's various activities, and to leave evidence or support of their execution.
Records management comprises all the technical activities and practices used to produce, organize, classify and preserve documents, guaranteeing timely and controlled access to information.
The current scenario is that records management and enterprise management systems are not articulated with one another, so the proper management of documents and records, and especially their preservation, cannot be guaranteed.
For all these reasons we have designed the content of this important seminar, so that quality leaders and records management leaders can harmonize and articulate their work and contribute to the continuous improvement of their workplaces.
This decree regulates the use of electronic signatures in Colombia in accordance with Law 527 of 1999. It defines an electronic signature as methods such as codes, passwords or private cryptographic keys that make it possible to identify a person in relation to a data message. It establishes that an electronic signature has the same validity and legal effects as a traditional signature if it meets certain reliability requirements. It also mandates technological neutrality and equal treatment.
Este documento modifica el artículo 12 de la Resolución 8934 de 2014 de la Superintendencia de Industria y Comercio de Colombia. La modificación amplía el plazo para que las empresas e instituciones privadas elaboren sus tablas de retención documental y adopten sus programas de gestión documental hasta el 31 de octubre de 2015. La resolución busca dar más tiempo a los vigilados por la Superintendencia para cumplir con las directrices sobre gestión documental establecidas en la legislación colombiana.
El objetivo de utilizar Business Process Management es op mizar la eficiencia y mejorar la rentabilidad. La implementación de una solución de Business Process Management ofrece beneficios inmediatos y a largo plazo, y un buen comienzo ayudará a asegurar su obtención.
Esta guía de Mejores Prácticas BPM le ayudará a preparar su organización y la implementación de BPM hacia el éxito.
Gestión de Comunicaciones oficiales - correspondencia
La gestión de la información y documentación es una ventaja competitiva y duradera para las organizaciones dados los ahorros de recursos y tiempos que genera la adecuada gestión de la información y documentación de manera electrónica y centralizada.
La gestión de comunicaciones oficiales y correspondencia es una de las principales dificultades de la gestión de documentos en las organizaciones, la centralización del proceso de radicación de correspondencia, el registro de las comunicaciones de entrada, salida e interna y la posibilidad de diferenciar los descriptores (metadatos) para el registro de los diferentes tipos de documentos (facturas, comunicaciones oficiales, derechos de petición, flujos de trabajo, etc.
Radar ha desarrollado un módulo de gestión de correspondencia integrado al Sistema de Gestión de Contenidos Alfresco
La inteligencia artificial es la capacidad de una máquina para imitar tareas cognitivas humanas como el aprendizaje y la resolución de problemas. Algunas de las aplicaciones más comunes de la IA incluyen reconocimiento de voz y de imágenes, traducción automática y asistentes virtuales. A medida que la tecnología continúa avanzando, la IA se está volviendo más sofisticada y está desempeñando un papel cada vez más importante en nuestras vidas.
Manual de acceso a la información pública recopila los derechos y deberes de los ciudadanos asociados al acceso a la información pública, la información del estado no necesariamente es pública
Este documento introduce el concepto de archivamiento web, explicando que es un proceso para recolectar y preservar parte de la World Wide Web para investigadores y el público en general. Describe tres tipos de archivamiento web, los retos administrativos y técnicos, y explica que existen proyectos a gran y pequeña escala. El objetivo es proveer una guía conceptual y de buenas prácticas para proyectos de archivamiento web.
Este documento presenta una metodología para la formulación del Plan Institucional de Archivos (PINAR) de una entidad. Explica la importancia del PINAR para planear la función archivística y articularla con los planes estratégicos de la entidad. Luego, detalla los pasos para formular el PINAR, que incluyen identificar la situación actual de los archivos, definir los aspectos críticos, priorizarlos y formular la visión, objetivos, planes y proyectos del PINAR. Finalmente, presenta un ejemplo de cómo aplic
Este documento presenta un manual para la implementación de un Programa de Gestión Documental (PGD) en entidades públicas. El manual describe los elementos que debe contener un PGD de acuerdo con la legislación colombiana, incluyendo la formulación, los procesos de gestión documental, fases de implementación, programas específicos y armonización con otros sistemas de gestión de la entidad. El objetivo del PGD es racionalizar los procesos documentales de acuerdo con los principios de transparencia, eficiencia y protección del patrimonio documental en
Este documento presenta una guía para la formulación de un esquema de metadatos para la gestión de documentos. Explica que los metadatos proporcionan información estructurada que permite la creación, gestión, acceso y preservación de los documentos a lo largo del tiempo. Además, describe los componentes clave de un modelo de metadatos como las entidades, relaciones y conceptos relacionados con su implementación. Finalmente, ofrece una metodología para el diseño e implementación de un esquema de metadatos que permita asegurar la gest
Este documento compila normas nacionales e internacionales sobre documento electrónico, preservación a largo plazo, sistemas de gestión de documentos electrónicos, digitalización certificada, firma digital e interoperabilidad promulgadas entre 1995 y 2013. Presenta la metodología cualitativa utilizada y define conceptos como norma, ley y decreto. Finalmente, menciona algunas normas generales como referente como la Constitución Política de Colombia de 1991 y su énfasis en el acceso a documentos públicos y la protección del patrimon
El documento describe los componentes clave de un Sistema de Gestión de Documentos Electrónicos (SGDE), incluyendo la captura y registro de documentos, su clasificación y organización según un Cuadro de Clasificación Documental, y los procedimientos para su búsqueda, recuperación, conservación, eliminación o transferencia de acuerdo a una Tabla de Retención Documental. El SGDE también incluye controles de seguridad, administración y referenciación de los documentos electrónicos.
El documento habla sobre la necesidad de que las entidades públicas establezcan políticas y planes para permitir el intercambio electrónico de comunicaciones oficiales a través de sistemas de gestión de documentos electrónicos e implementando esquemas de interoperabilidad, con el fin de digitalizar y compartir documentos de manera eficiente entre las entidades y con los ciudadanos.
El documento describe los diferentes elementos que componen un Plan Institucional de Archivos (PINAR), incluyendo herramientas de diagnóstico, contexto estratégico, objetivos, visión estratégica, mapa de ruta y herramientas de seguimiento. El PINAR permite articular la función archivística con el plan estratégico institucional de una entidad.
Los instrumentos archivísticos son herramientas con propósitos específicos para apoyar la gestión documental y la función archivística en las entidades. Algunos instrumentos archivísticos mencionados son las Tablas de Control de Acceso, los Bancos Terminológicos, la Tabla de Retención Documental, el Programa de Gestión Documental, el Plan Institucional de Archivos de la Entidad, el Modelo de Requisitos para Gestión Documental Electrónica, el Cuadro de Clasificación Documental y el Inventario Document
El documento define varios términos relacionados con expedientes electrónicos y digitalización de documentos. Define expediente electrónico como un conjunto de documentos electrónicos correspondientes a un procedimiento administrativo. También define índice electrónico como una relación ordenada de los documentos que conforman un expediente electrónico o serie documental. Además, explica que un expediente híbrido está conformado por documentos análogos y electrónicos pertenecientes a la misma unidad documental.
3. MoReq2010 (2011)
• The DLM Forum announced the MoReq2010 work programme at its
Gen. Meeting in Madrid in May 2010.
• May 2011: new specification launched at DLM Forum GM in
Budapest (delayed)
• The objectives of MoReq2010 are to broaden the appeal of
MoReq, introduce interoperability, significantly increase its
adoption, and make compliance accessible to non-traditional
software suppliers.
• While MoReq2 introduced a model of software
compliance, MoReq2010 incorporates the flexibility and modularity
to extend that compliance to a wider range of software
(interoperability).
• MoReq2010 is more loosely coupled, allowing it to be extended to
meet the needs of different industries and markets, as required.
4. 2011: Differences to the consultation version of 2010
• Adoption of a service oriented architecture model:
• All the requirements in the MoReq2010 core have been bundled into ten services. A MoReq2010
compliant records system (MCRS) must be capable of offering its functionality as services that can be consumed by one
or more other information systems within the organisation.
• For example, several records systems within an organisation could all consume the classification service of one
MCRS, enabling the organisation to hold its file plan in one place whilst having it used by several systems.
• An MCRS must possess the capability to provide ten services:
• a records service (the capability to hold aggregations of records)
• a metadata service (the capability to maintain metadata about objects within the system)
• a classification service (the capability to hold a classification, to apply it to aggregations of records, and to link
headings within the classification to retention rules)
• a disposal service (the capability to hold retention rules, and to dispose of records in accordance with retention
rules)
• a disposal hold service (the capability to prevent the application of a retention rule to a record, for example because
the record is required in a legal case)
• a search and report service (the capability to retrieve and present records and metadata in response to queries)
• a user and groups service (the ability to maintain information about people and groups that have permissions to
use the system)
• a role service (the ability to assign roles to people and groups to determine what those people and groups can and
can't do within the system)
• system services (the capability to maintain event histories in relation to objects held within the system)
• an export service (the capability to export records together with their metadata and event histories in a form that
another MCRS could understand)
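The shared-service idea described above can be sketched in code. The following Python sketch is purely illustrative (the class and method names are invented, not taken from the specification): a single classification service holds the organisation's file plan, and several records systems consume it.

```python
# Hypothetical sketch of MoReq2010's service-oriented model (names are
# illustrative, not from the specification): each MCRS exposes its
# functionality as services that other information systems can consume.

class ClassificationService:
    """Holds the file plan and links headings to retention rules."""
    def __init__(self):
        self._headings = {}          # heading id -> retention rule id

    def add_heading(self, heading_id, retention_rule_id):
        self._headings[heading_id] = retention_rule_id

    def default_retention_rule(self, heading_id):
        return self._headings[heading_id]

class RecordsSystem:
    """A records system consuming a (possibly shared) classification service."""
    def __init__(self, name, classification):
        self.name = name
        self.classification = classification
        self.records = {}            # record id -> assigned heading

    def classify(self, record_id, heading_id):
        self.records[record_id] = heading_id

    def retention_rule_for(self, record_id):
        # By default the rule is inherited from the record's classification.
        return self.classification.default_retention_rule(self.records[record_id])

# One file plan, held in one place, consumed by several systems:
fileplan = ClassificationService()
fileplan.add_heading("HR/contracts", "retain-7-years")

hr_system = RecordsSystem("HR", fileplan)
finance_system = RecordsSystem("Finance", fileplan)
hr_system.classify("rec-001", "HR/contracts")
print(hr_system.retention_rule_for("rec-001"))  # retain-7-years
```

Both systems here share one `ClassificationService` instance, which is the point of the slide: the file plan lives in one place while being used by several systems.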
5. 2011: Differences to the consultation version of 2010 (2)
• Abandonment of the notion of a "primary classification"
• The notion of a "primary classification" for records has been dropped. Instead, a record is
assigned a classification, from which it inherits a retention rule by default. A person with
appropriate permissions can, however, override that inherited retention rule and assign the
record a different retention rule, or have the record receive a retention rule from a different
part of the classification scheme than the one it is assigned to.
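A minimal sketch of this inheritance-with-override behaviour, using invented names (MoReq2010 itself defines no programming API):

```python
# Illustrative sketch, not the specification's API: a record inherits its
# retention rule from its classification by default, but an authorised
# user may override it with a different rule.

class Record:
    def __init__(self, heading, default_rule):
        self.heading = heading
        self._default_rule = default_rule   # inherited from the classification
        self._override_rule = None

    def override_retention(self, rule, user_can_override):
        if not user_can_override:
            raise PermissionError("overriding a retention rule requires permission")
        self._override_rule = rule

    @property
    def retention_rule(self):
        # The override, when present, takes precedence over the inherited rule.
        return self._override_rule or self._default_rule

rec = Record("HR/contracts", "retain-7-years")
assert rec.retention_rule == "retain-7-years"            # inherited by default
rec.override_retention("retain-permanently", user_can_override=True)
assert rec.retention_rule == "retain-permanently"        # override wins
```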
• Reduction in the number of requirements
• The number of requirements has been significantly reduced: the consultation draft
contained 436 requirements, which have now been consolidated into 170. The final core
requirements document will nevertheless be longer than the consultation draft, because the
introductory explanations have grown to 90 pages.
Comment J. Lappin / J. Garde (link to blog)
6. Critical aspects
• MoReq was never officially endorsed or recommended in any directive by the
European Commission
• MoReq is currently in a triangle of disorientation (Kampffmeyer):
• (1) not fully accepted by traditional records managers & archivists in
leading organizations and institutions; in addition, IT often does not
understand the functional integration of RM requirements (incl. NFR)
• (2) MoReq is considered irrelevant by users outside the RM and
archivist community (mainly IT)
• (3) not (yet) really supported by leading vendors, and even considered an
additional barrier, a cost driver (certification) and a technology break
• There is therefore a danger that the development of the new standard
gets squeezed between a traditional RM environment/community, which
tends to become "incestuous", and an open information & office
environment that welcomes basic record-keeping principles amid
uncontrolled data growth (not giving away opportunities in practice for the
sake of ideology)
http://jhagmann.twoday.net/stories/19464155/ (comments Kampffmeyer)
7. MoReq Future: planned (1)
• MoReq diversifies against a common core set of best-practice
requirements, reaching not only different software and technologies
but also different industry sectors.
• By 2012 and beyond we see the start of a trend to package MoReq for
"Health", for "Defence", for "Oil & Gas", etc.
• The DLM Forum is planning to have a first wave of additional modules for MoReq2010 available
by the time of its triennial conference (Brussels, Dec. 2011). Unlike the core requirements, the
additional modules will be optional rather than mandatory.
• Included in the first wave will be:
• an import service – providing the ability to import records and associated metadata from
another MCRS. Note that the ability to export records is a core requirement, while the ability to
import records is an additional module: an organisation implementing its
first MoReq2010 compliant system does not need that system to be able to import from
another MoReq2010 compliant system.
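The asymmetry between the core export service and the optional import module might be sketched as follows; the JSON record format here is invented purely for illustration, since MoReq2010 defines its own XML-based export format.

```python
# Hedged sketch of export (core capability of every MCRS) versus import
# (optional module, only needed when migrating from another MCRS).
# The payload format is invented; MoReq2010 specifies an XML schema.
import json

class ExportingMCRS:
    def __init__(self):
        self.store = {}   # record id -> {"metadata": ..., "events": [...]}

    def add(self, record_id, metadata):
        self.store[record_id] = {"metadata": metadata, "events": ["captured"]}

    def export(self):
        # Core requirement: records leave together with their metadata
        # and event histories, in a form another MCRS could understand.
        return json.dumps(self.store)

class ImportingMCRS(ExportingMCRS):
    # Optional module: an organisation's first MCRS has nothing to import.
    def import_(self, payload):
        for record_id, entry in json.loads(payload).items():
            self.store[record_id] = entry
            self.store[record_id]["events"].append("imported")

# Migrating records from an old system into a new one:
old = ExportingMCRS()
old.add("rec-001", {"title": "Contract A"})
new = ImportingMCRS()
new.import_(old.export())
```

The event history travels with the record and is extended on import, so the record's provenance survives the migration.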
8. MoReq Future: planned (2)
• Modules that provide backwards compatibility with MoReq2
• a scanning module
• a file module (MoReq2010 replaced the concept of the "file" with the broader concept of an "aggregation". The additional
module would ensure that a system can enforce MoReq2 style "files", which can only be split into volumes and parts.
In MoReq2010 terms, a MoReq2 file is simply one possible means of aggregating records)
• a vital records module
• an e-mail module (the core requirements of MoReq2010 talk generically about "records" and do not focus
on any one particular format)
• It is hoped that more additional modules will follow. Jon Garde would like to see MoReq2010 additional modules
that cover record-keeping requirements for cloud computing, mobile devices and social software. He
urged anyone who feels that there are needs MoReq2010 could usefully address to come forward and develop
a module for them, for example modules that provide functionality specific to a single sector (health
sector, defence sector, etc.)
• Development of test centers:
• The MoReq Governance Board plans to accredit an international network of testing centres, to whom vendors
can submit products for testing against MoReq 2010. Six organisations have already expressed an interest in
becoming testing centres. There is no limit to the number of test centres that may be established. The test
centres will use test scripts and templates created by the MoReq Governance Board. Vendors will pay a fee to
the test centres to have their products tested, and (assuming they are successful) a fee to the DLM Forum to
validate the recommendation of the test centre and to award the certificate.
Source: Comment J. Lappin / J. Garde (link to blog)
9. MoReq Future: What is needed
• A "Lex MoReq" is needed! A leaner MoReq2011 should be anchored in a European Directive
and, as a consequence, be followed and implemented in all national legislation of the EU as a
mandatory standard beyond the classic notion of a "record".
• "Embedded Records Management: every time, everywhere and for everybody"!
• "Interoperability" is the new buzzword for the DLM Forum in Dec. 2011 (Brussels)
• What we need is seamless and automated records management in the background; it is
not about Web 2.0 or social media but about integrating new technology concepts (in-place
or in-app RM and SOA)
• James Lappin: RMJ
• Alan Pelz-Sharpe (comment: Is Moreq 2010 a DoD 5015 slayer?)
• “Slayer because it does what it's supposed to do and no more. It's a standard that tells you
what you must do, but not how to do it, or for that matter where to do it. In fact with this new
standard, you may potentially even have your own internal RM program certified, rather than
the standard simply being restricted to a particular vendor's software solution. This is a huge
change in direction, and one that I certainly welcome.”
10. DLM Forum Dec. 2011 Brussels
• Call for papers: contributions to support a Health / Pharma specific
committee are welcome (http://www.dlmforum.eu/)
• In July 2011 the DLM Forum launched the MoReq2010 Technical
Committee to manage and extend MoReq2010, and published the first
XML schema that enables interoperability between records systems.
• Leading vendors such as Automated Intelligence, EMC, Fabasoft, Gimmal Group, HP, Open Text, and
Oracle, together with Records Management consultants and industry analysts have already joined the new
Committee …
• The creation of this committee has triggered overwhelming interest from all parts of the industry. Numerous
specialists and professionals want to be involved in implementing and extending information compliance
solutions. In addition to the technical committee, the DLM Forum will also be establishing working groups for
practitioners, translators and accredited MoReq2010 Test Centres.
• “MoReq2010 is the first records and information management specification that enables interoperability
between different MoReq2010 compliant records systems even when built by different suppliers through the
use of a defined shared data model. It enables commercial and government organisations to secure and
develop critical information independent of email, document content management, cloud and mobile
systems, so that when systems are changed, updated, migrated or integrated, the security, value and
probity of the records is maintained. We expect that all future information compliance products and systems
across Europe will exploit this platform to meet regulatory requirements”.
• Details link
12. ISO 15489 RM processes: model requirements for the ERMS (MoReq2)
• Capture
• Which objects (company guidelines)
• Created and received, incl. metadata
• Physical and electronic objects
• Registration
• Formalizing capture, incl. metadata
• Unique identifier, date-time, title, author
• Classification
• According to classification scheme (taxonomy)
• Sequence of business activities (links)
• Indexing
• Access and security classification
• According to classification scheme
• Identification of disposition status
• Identify retention period of the record
• Storage
• Physical and electronic (backup)
• Use and location tracking
• Records management transactions
• Implementation of disposition
• Continuing retention (incl. disposition hold)
• Transfer, migration
• Destruction
ISO 15489 and MoReq2 both cover the entirety of the processes affecting records.
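The process chain above can be illustrated with a small sketch; the function and field names are hypothetical, not taken from ISO 15489 or MoReq2.

```python
# Minimal illustrative sketch of the ISO 15489 record processes listed
# above: capture, registration, classification, retention/disposition
# status, tracking, and implementation of disposition.
import datetime

def capture_and_register(title, author, heading, retention_years):
    """Capture a record and formalise it with registration metadata."""
    return {
        "id": f"{heading}/{title}",                # unique identifier
        "title": title,
        "author": author,
        "registered": datetime.date(2011, 5, 1).isoformat(),
        "heading": heading,                        # classification
        "retention_years": retention_years,        # disposition status
        "events": ["captured", "registered", "classified"],  # tracking
    }

def apply_disposition(record, current_year):
    """Implement disposition once the retention period has elapsed."""
    registered_year = int(record["registered"][:4])
    if current_year - registered_year >= record["retention_years"]:
        record["events"].append("destroyed")
    return record

rec = capture_and_register("Contract A", "J. Smith", "HR/contracts", 7)
rec = apply_disposition(rec, current_year=2019)
print(rec["events"][-1])  # destroyed
```

Each step appends to the record's event history, mirroring the "use and location tracking" requirement: every records management transaction leaves a trace.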
13. Functional Integration of RM Process Requirements: Aligning ISO 15489
with MoReq2
ISO 15489 RM process controls (Section 9, partly non-IT processes) mapped to MoReq2 functional specification chapters:
• 9.1 Capture process → MoReq2 chapter 6
• 9.2 Retention policy → chapters 3 / 5
• 9.3-9.8 Manage life cycle (use, tracking) → chapters 7-9
• 9.9 Legal hold / disposition → chapter 5.3
• 9.10-9.11 Monitoring, audit, training → chapter 4 (controls)
14. Best practices for RM
Non-IT specific
• Preserve the right information for the correct length of time
• Meet legal requirements faster and more cost effectively
• Control and manage records management storage and destruction fees
• Demonstrate proven practices of good faith through consistent implementation
• Archive vital information for business continuity and disaster recovery
• Provide information in a timely and efficient manner regardless of urgency of
request
• Use appropriate technology to manage and improve the program
• Integrate policies and procedures throughout the organization
• Establish ownership and accountability of the records management program
• Arrange for continuous training and communication throughout the organization
• Project an image of good faith, responsiveness and consistency
• Review, audit and improve the program continuously