Why you would want both a CMS and an ECM platform together, and the different technical options available for integrating Hippo and Nuxeo
MongoDB Days UK: Using MongoDB to Build a Fast and Scalable Content Repositor...MongoDB
Presented by Alain Escaffre, Director of Product Management, Nuxeo
Experience level: Beginner
MongoDB can be used in the Nuxeo Platform as a replacement for traditional SQL databases. Nuxeo's content repository, which is the cornerstone of this open source software platform, can now rely completely on MongoDB for data storage. This presentation will explain the motivation for using MongoDB and will discuss different implementation strategies. In this session, you will learn more about the migration to MongoDB and how we were able to achieve significant performance gains.
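To give a flavor of the document-oriented model the talk describes, here is a hypothetical sketch (the field names are illustrative, not Nuxeo's actual schema): a repository entry that would span several SQL tables, with its hierarchy, metadata and ACLs, fits in one MongoDB-style document, and an equality query needs no joins.

```python
# Hypothetical sketch: a repository entry held as one MongoDB-style
# document instead of rows across several SQL tables. Field names are
# illustrative, not Nuxeo's actual schema.
doc = {
    "_id": "doc-0001",
    "ecm:parentId": "folder-042",          # hierarchy, no JOIN needed
    "ecm:primaryType": "File",
    "dc:title": "Quarterly report",
    "dc:contributors": ["alice", "bob"],   # lists map naturally to arrays
    "ecm:acl": [{"user": "alice", "perm": "ReadWrite"}],
}

def matches(document, criteria):
    """Minimal single-document filter, mimicking a MongoDB equality query."""
    return all(document.get(k) == v for k, v in criteria.items())

print(matches(doc, {"ecm:primaryType": "File"}))  # → True
```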
Manage Complex Digital Assets at Massive ScaleNuxeo
Today’s most successful global organizations efficiently manage, track and customize campaign content for target markets around the world. Doing so requires new digital asset management applications capable of handling today’s complexities surrounding video and other rich media, using automated workflows, renditions, usage rights management and more. This presentation covers:
- Why global brand and campaign management is more complex than ever before
- Why legacy tools may no longer be enough to meet the challenges of running successful campaigns
- How to assess which new DAM/MAM technology is right for your organization
How do MongoDB drivers discover and monitor your servers? How do they determine your replica set config, and how do they resolve a crisis like the loss of a primary? And finally, what's the best way to write your application to handle errors and outages?
Staff Engineer A. Jesse Jiryu Davis covers these topics for the first time at any MongoDB conference, in his talk at MongoDB World 2015 about the new Server Discovery And Monitoring spec.
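The error-handling pattern the talk points at can be sketched generically: transient failures (such as the brief window after a primary is lost, while the driver rediscovers the topology) are worth retrying with backoff, while persistent failures should surface to the caller. This is a plain-Python illustration of that pattern, not pymongo's actual API.

```python
import random
import time

def with_retries(op, attempts=3, base_delay=0.05):
    """Retry a flaky operation with exponential backoff and jitter.

    Generic sketch of the recommended pattern: transient errors are
    retried; on the final attempt the error is re-raised to the caller.
    """
    for attempt in range(attempts):
        try:
            return op()
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            # Back off so the driver has time to rediscover the topology.
            time.sleep(base_delay * (2 ** attempt) * (1 + random.random()))

# Simulated flaky operation: fails twice, then succeeds.
calls = {"n": 0}
def flaky_write():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("not master")  # primary stepped down
    return "ok"

print(with_retries(flaky_write))  # → ok
```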
5 Ways to Transform your Digital Content into Valuable Digital AssetsNuxeo
Hear from the DAM experts at TBWA, a leading international advertising agency, and learn how they put their digital assets to work by leveraging the Nuxeo Platform for digital asset management.
Watch the webinar on demand: http://www.nuxeo.com/resources/tbwa-manages-global-digital-media-with-nuxeo/
Nuxeo Webinar: Getting Started with your DAM ApplicationNuxeo
Developing an application to manage your organization’s digital assets requires building complex content models, internal and external workflows, integrations to other systems, and a custom user interface. Learn why the platform approach will make this task more manageable and agile.
[Webinar] Introduction to Workflow Design for the Nuxeo PlatformNuxeo
The Nuxeo Platform has an integrated workflow engine with a full-featured workflow designer to help you define custom workflows that model your business processes. This series of two webinars will give a complete functional overview of workflow design in the Nuxeo Platform.
Join Alain Escaffre, Director of Product Management, for this live webinar with a demo and Q&A to find out how the workflow designer works.
If you’d like to try it out first, you can download the Nuxeo Platform and register for a free 30-day trial of Nuxeo Studio.
The second webinar will cover advanced workflow design techniques.
Data science apps powered by Jupyter NotebooksNatalino Busa
Jupyter notebooks are transforming the way we look at computing, coding, and science. But is this the only "data scientist experience" that this technology can provide? In this presentation we will look at how to create interactive web applications for data exploration and machine learning. In the background this code is still powered by the well-understood and well-documented Jupyter Notebooks.
Code on github: https://github.com/natbusa/kernelgateway_demos
DocDokuPLM : Domain Specific PaaS and Business Oriented API, OW2con'16, Paris. OW2
Totally replacing our SOAP web services with HTTP web services behind an API has been a real challenge for us this year. We chose to generate our Java and JavaScript APIs with Swagger. Swagger lets us generate a JSON file describing our REST layer services, and then generate code from that description file. Today we are able to deliver an SDK to other applications in Java and JavaScript.
Using the same codebase and the same method names is really useful for developers, and modifying our REST layer doesn't mean modifying our SDKs by hand: they are generated! They are quite easy to deploy and use: our APIs are simply Maven and NodeJS modules. Having interactive documentation for all the SDKs is really valuable; it lets us discover every service and test it.
We can now address specific use cases by developing new applications with this API. Our SDK is currently used in two separate projects and languages (a GUI written with NodeWebkit and a Java EE server application), and fits our needs.
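The core idea, generating client methods from a machine-readable description of the REST layer, can be sketched in a few lines. This is an illustrative toy, not DocDoku's actual toolchain: the paths and operation ids below are made up, and a real generator (like Swagger Codegen) emits full HTTP client code rather than stubs.

```python
# Illustrative sketch: turn an OpenAPI/Swagger-style description into
# callable SDK stubs, so the client always mirrors the REST layer.
# Paths and operationIds are hypothetical.
spec = {
    "paths": {
        "/documents/{id}": {"get": {"operationId": "getDocument"}},
        "/documents":      {"post": {"operationId": "createDocument"}},
    }
}

def build_sdk(spec):
    """Turn each (path, verb) pair in the spec into a callable stub."""
    sdk = {}
    for path, verbs in spec["paths"].items():
        for verb, op in verbs.items():
            def call(path=path, verb=verb, **params):
                # A real generator would issue an HTTP request here.
                return f"{verb.upper()} {path.format(**params)}"
            sdk[op["operationId"]] = call
    return sdk

api = build_sdk(spec)
print(api["getDocument"](id="42"))  # → GET /documents/42
```

Because the stubs are derived from the spec, changing the REST layer and regenerating keeps every client in sync, which is exactly the maintenance win described above.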
Building Modern Data Pipelines on GCP via a FREE online BootcampData Con LA
Data Con LA 2020
Description
You just got hired by a large "tech startup". They're a hip travel agency like Kayak, "revolutionizing the airline industry" by developing an AI that negotiates the best airline deals on behalf of passengers. But in reality they are developing the AI to jack up ticket prices as it learns the passengers' preferences. They run their tech on the latest Google Cloud technologies, so you figured it's a great place to sharpen your skills as a Data Engineer despite the company's broken ethical compass. We teach Cloud Data Engineering to beginner/intermediate developers via a fun and engaging story. You will build a complete data-driven AI pipeline. Ingest six years' worth of real flight records, analyze 30M+ user profiles and process 100M+ live streaming events while learning tools such as BigQuery, Dataflow (Apache Beam), DataProc (Apache Spark), Pub/Sub (Kafka), BigTable, and Airflow (Cloud Composer). During our talk, we will:
*Discuss the latest Serverless Data Architecture on GCP
*Explore the architectural decisions behind our Data Pipeline
*Run a live demo from our course
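The streaming part of such a pipeline boils down to grouping unbounded events into time windows and aggregating per window. Here is a plain-Python sketch of the fixed-window aggregation you would express in Dataflow/Apache Beam over a Pub/Sub stream; the event fields and window size are illustrative.

```python
from collections import defaultdict

def window_counts(events, window_secs=60):
    """Group a stream of (timestamp, key) events into fixed windows.

    Plain-Python stand-in for a Beam FixedWindows + Count aggregation;
    `events` simulates what would arrive from Pub/Sub.
    """
    counts = defaultdict(int)
    for ts, airline in events:
        window_start = ts - (ts % window_secs)
        counts[(window_start, airline)] += 1
    return dict(counts)

events = [(0, "UA"), (30, "UA"), (61, "DL"), (90, "UA")]
print(window_counts(events))
# → {(0, 'UA'): 2, (60, 'DL'): 1, (60, 'UA'): 1}
```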
Speaker
Parham Parvizi, Tura Labs, Founder / Data Engineer
Drupal 8: A story of growing up and getting off the islandAngela Byron
The Drupal project has traditionally held a strong internal value for doing things "The Drupal Way." As a result, Drupal developers have historically needed to build up reams and reams of tricks and workarounds that were specific to Drupal itself, and Drupal was inaccessible to people with a more traditional programming background. Starting in Drupal 8, however, we've effectively done a ground-up rewrite of the underlying code and in the process made major inroads toward getting more in line with the rest of the PHP world. Procedural code is out, OO code is in. "Creative" hacks have been replaced with FIG standards. "Not invented here" is now "Proudly found elsewhere." This story will talk about the journey that Drupal 8 and the Drupal core development team have taken during this transition over the past 3+ years, including some of the pros and cons of this approach and how we dealt (and are dealing) with some of the community management challenges that resulted.
My slides about creating web sites that remain usable even when you are offline! From Web Storage to Service Workers.
Presented at Mobiletech Conference in Munich March 2017
Scaling ML-Based Threat Detection For Production Cyber AttacksDatabricks
Vulnerabilities such as Spectre and Meltdown continue to plague many production servers, based on Intel CPUs. Our solution involves software-based monitoring of hardware counters and sending that data to Apache Spark clusters for threat detection. We leverage Spark's support for support vector machine (SVM) inference. Our machine learning models are trained off-line by a data scientist within a Jupyter notebook environment. As new models are validated, they can be easily deployed to the Spark cluster from the notebook. We have standardized model export and import using the ONNX machine learning open file format. In our presentation, we will demo the full pipeline, from model training to deployment. We will discuss the various challenges when deploying ML-based cyber-threat detection at scale using Apache Spark. For example, we found that gaps in detection can occur when Spark models are updated. We will describe a novel data ingestion architecture, based on Apache Kafka, that we developed to deal with this issue.
Connecting the Dots: Integrating Apache Spark into Production PipelinesDatabricks
Have you ever struggled to smoothly integrate Spark into a production workflow? Spark is an excellent tool for processing data on the terabyte scale, but building a system to move from raw data through featurization, modeling, and prediction serving involves interacting with numerous other components. Over the past year and a half, my team at ShopRunner has built a production Spark workflow for data science from scratch. In this talk you'll learn about the tools we use, the challenges we encountered, an open-source library we wrote to work through them, and how you can avoid the detours we took along the way. Data science work often begins in an interactive notebook environment, exploring data and testing out different modeling approaches. However, moving towards a production environment means building reproducible workflows, packaging libraries, setting up scheduling and monitoring of jobs, and figuring out ways to serve results to clients in real time. After testing out a variety of tools, we at ShopRunner have settled on a stack including Databricks, Snowflake, Datadog, Jenkins, and S3, ECS, and RDS from the suite of AWS services. These tools each offer unique benefits for their area of focus, but crafting a cohesive pipeline from this range of tools presented a challenge. Come learn how to integrate a Spark workflow into a pipeline that analyzes many terabytes of data, builds machine learning models at scale, and serves predictions to a variety of customer-facing tools. Whether you're just getting started using Spark in production systems or you already have Spark running in production and want to smooth the process, this talk will leave you better equipped to find and connect the tools that suit your needs.
Metaflow: The ML Infrastructure at NetflixBill Liu
Metaflow was started at Netflix to answer a pressing business need: How to enable an organization of data scientists, who are not software engineers by training, to build and deploy end-to-end machine learning workflows and applications independently. We wanted to provide the best possible user experience for data scientists, allowing them to focus on the parts they like (modeling using their favorite off-the-shelf libraries) while providing robust built-in solutions for the foundational infrastructure: data, compute, orchestration, and versioning.
Today, the open-source Metaflow powers hundreds of business-critical ML projects at Netflix and other companies from bioinformatics to real estate.
In this talk, you will learn about:
- What to expect from a modern ML infrastructure stack.
- Using Metaflow to boost the productivity of your data science organization, based on lessons learned from Netflix.
- Deployment strategies for a full stack of ML infrastructure that plays nicely with your existing systems and policies.
https://www.aicamp.ai/event/eventdetails/W2021080510
Peter Hoddie, Kinoma VP, gave a talk at the IoT-themed API-Craft meet-up at the Tradeshift HQ in San Francisco. He discusses connectivity, the challenges and demands of IoT, and how Kinoma is building a set of APIs for the IoT.
Upgrading to the next major Drupal release can be complex—even stressful. In this webinar, we’ll show you how to take the pain out of the process and get your Drupal 6, 7, or 8 site up and running on D9—the latest and greatest Drupal release.
During this webinar, we’ll share tips on environment setup, explain how to automate the script process, and discuss the overlaps between D8 and D9. We’ll also walk you through the out-of-the-box D9 migration process.
We’ll be joined by our partners at Mediacurrent, who will share their recognized Drupal expertise and thought leadership. They’ll offer insight on identifying, tracking, and fixing common accessibility mistakes in a Drupal upgrade. And, they’ll explain the best ways to prep for a transition, including setting strategy, budgeting, and identifying KPIs to effectively demonstrate ROI.
We’ll wrap it all up with a look at some new Acquia tools that can streamline and simplify the migration process even further.
With this much information and expertise to gain, this is certain to be a can’t-miss event for any Drupal professional.
Removing Uninteresting Bytes in Software FuzzingAftab Hussain
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speed up fuzzing campaigns by pinpointing and eliminating those uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux libraries -- libxml's xmllint, a tool for parsing XML documents, and Binutils' readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format) files. Our preliminary results show that AFL+DIAR not only discovers new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
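The pruning idea can be illustrated with a toy: repeatedly drop a byte from the seed and keep the smaller seed whenever the observed coverage is unchanged. This is a simplified sketch of the concept, not DIAR's actual algorithm; the `coverage` callback stands in for running the instrumented target and collecting edges.

```python
def prune_uninteresting_bytes(seed, coverage):
    """Drop bytes whose removal leaves observed coverage unchanged.

    Simplified sketch of DIAR's idea: `coverage` is a stand-in for
    executing the target and collecting its coverage signature.
    """
    pruned = bytearray(seed)
    i = 0
    while i < len(pruned):
        candidate = pruned[:i] + pruned[i + 1:]
        if coverage(bytes(candidate)) == coverage(bytes(pruned)):
            pruned = candidate          # byte i was uninteresting
        else:
            i += 1                      # byte i matters, keep it
    return bytes(pruned)

# Toy target: only the bytes of the header "<x>" influence coverage.
def toy_coverage(data):
    return frozenset(b for b in data if b in b"<x>")

seed = b"<x>###padding###"
print(prune_uninteresting_bytes(seed, toy_coverage))  # → b'<x>'
```

Mutation effort then concentrates on the bytes that actually drive behavior, which is why lean seeds make campaigns reach interesting paths sooner.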
- These are the slides of the talk given at the IEEE International Conference on Software Testing, Verification and Validation Workshops (ICSTW), 2022.
More Related Content
Similar to Integrating the Nuxeo Content Management Platform with Hippo
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
The New Frontiers of AI in RPA with UiPath Autopilot™UiPathCommunity
In this free online event, organized by the Italian UiPath Community, you can explore the new features of Autopilot, the tool that brings Artificial Intelligence into the development and use of automations.
📕 Together we will look at some examples of using Autopilot across different tools in the UiPath Suite:
Autopilot for Studio Web
Autopilot for Studio
Autopilot for Apps
Clipboard AI
GenAI applied to Document Understanding
👨🏫👨💻 Speakers:
Stefano Negro, UiPath MVPx3, RPA Tech Lead @ BSP Consultant
Flavio Martinelli, UiPath MVP 2023, Technical Account Manager @UiPath
Andrei Tasca, RPA Solutions Team Lead @NTT Data
zkStudyClub - Reef: Fast Succinct Non-Interactive Zero-Knowledge Regex ProofsAlex Pruden
This paper presents Reef, a system for generating publicly verifiable succinct non-interactive zero-knowledge proofs that a committed document matches or does not match a regular expression. We describe applications such as proving the strength of passwords, the provenance of email despite redactions, the validity of oblivious DNS queries, and the existence of mutations in DNA. Reef supports the Perl Compatible Regular Expression syntax, including wildcards, alternation, ranges, capture groups, Kleene star, negations, and lookarounds. Reef introduces a new type of automata, Skipping Alternating Finite Automata (SAFA), that skips irrelevant parts of a document when producing proofs without undermining soundness, and instantiates SAFA with a lookup argument. Our experimental evaluation confirms that Reef can generate proofs for documents with 32M characters; the proofs are small and cheap to verify (under a second).
Paper: https://eprint.iacr.org/2023/1886
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview. Including the concepts of Customer Key and Double Key Encryption.
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He brings around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Generative AI Deep Dive: Advancing from Proof of Concept to ProductionAggregage
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
Integrating the Nuxeo Content Management Platform with Hippo
1. follow the Hippo trail
Create Digital Miracles
ae@nuxeo.com @aescaffre
Integrating the Nuxeo Content Management Platform with a Java or JavaScript Application
Friday, June 21st, 2013, Amsterdam – Hippo GetTogether
Alain Escaffre, Nuxeo Presales and Products Director
2. Nuxeo: ECM platform
• Nuxeo Platform is a full-stack software platform for building content-centric business applications
• Designed for software developers, architects, and business managers who create software for internal use or for customers
• A foundation for business applications in the areas of content management, document management, digital asset management, and case management
• Trusted by large organizations for mission-critical applications
• Nuxeo Platform is created as open source software
3. Nuxeo: the company
• Nuxeo supports customers in creating, building, maintaining, deploying, and operating apps
• Nuxeo covers the full lifecycle of applications:
  • Application designer: Nuxeo Studio
  • Development environment: Nuxeo IDE (Eclipse), Maven tooling
  • Testing toolset: unit, functional (Selenium/WebDriver), performance
  • Deployment tools: Nuxeo Marketplace, Update Center
• We focus on the complete experience for our customers, not just the software you run
• We are based in France, New York, and California.
4. Nuxeo – Hippo, a very good fit!
• Java based
• Maven
• Freemarker
• Plugin architecture
• …
8. Nuxeo and Hippo GetTogether?
[Diagram] Hippo as the front end (web experience, online marketing) and Nuxeo as the back office (document management, case management, digital asset management), serving customers, prospects, and employees across HR, Sales, Accounting, and Marketing.
13. Nuxeo – Hippo Integration options
• CMIS
• OpenSocial
• Custom Java development
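Of these options, CMIS is the most standards-based route: Nuxeo exposes a CMIS binding that Hippo (or any CMIS client, such as Apache Chemistry OpenCMIS) can consume. The following is a minimal sketch using only the standard library; the AtomPub path shown is the usual default on a Nuxeo install and the query uses the CMIS Query Language, both stated here as assumptions for illustration rather than guaranteed values.

```java
// Sketch of the CMIS integration route. The endpoint path below is the
// typical Nuxeo AtomPub binding location (an assumption, not guaranteed),
// and the query string is plain CMISQL, which is close to SQL-92.
public class CmisIntegrationSketch {

    static String atomPubUrl(String nuxeoBase) {
        // Commonly served at <base>/atom/cmis on a default Nuxeo install
        return nuxeoBase + "/atom/cmis";
    }

    static String nameQuery(String pattern) {
        // cmis:document is the CMIS base type for documents
        return "SELECT cmis:objectId, cmis:name FROM cmis:document"
                + " WHERE cmis:name LIKE '" + pattern + "'";
    }

    public static void main(String[] args) {
        System.out.println(atomPubUrl("http://localhost:8080/nuxeo"));
        System.out.println(nameQuery("report%"));
    }
}
```

In a real integration, a CMIS client library would take the AtomPub URL plus credentials, open a session against the repository, and execute queries like the one above.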
14. Nuxeo – Hippo using CMIS
15. Nuxeo remote API: Automation
Ordered set of operations, used in:
• User actions
• Events
• REST calls
• Workflows
16. Operation
An operation has:
• A unique name
• An input
• An output
• Parameters
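The model above can be sketched in plain Java. This is an illustrative toy only, not Nuxeo's actual API: each step has a name and a parameter map, and a chain is an ordered list of steps where each step's output becomes the next step's input.

```java
import java.util.List;
import java.util.Map;
import java.util.function.BiFunction;

// Illustrative model only (not Nuxeo's real API): an operation is a named
// step with parameters; a chain pipes each step's output into the next.
public class OperationChainSketch {

    static class Step {
        final String name;
        final Map<String, String> params;
        final BiFunction<String, Map<String, String>, String> body;

        Step(String name, Map<String, String> params,
             BiFunction<String, Map<String, String>, String> body) {
            this.name = name;
            this.params = params;
            this.body = body;
        }

        String run(String input) {
            return body.apply(input, params);
        }
    }

    static String runChain(List<Step> chain, String input) {
        String current = input;
        for (Step step : chain) {
            current = step.run(current); // output feeds the next operation
        }
        return current;
    }

    public static void main(String[] args) {
        List<Step> chain = List.of(
                new Step("Document.Rename", Map.of("name", "report"),
                        (doc, p) -> doc + ":renamed-" + p.get("name")),
                new Step("Document.Lock", Map.of(),
                        (doc, p) -> doc + ":locked"));
        System.out.println(runChain(chain, "/default-domain/doc"));
        // prints /default-domain/doc:renamed-report:locked
    }
}
```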
17. Some existing uses of the Automation client in the CMS/portal world
• Queries
• Form creation
• Content listing and search screens
Used for managing workflows, training scheduling, training requests, etc.
18. A custom operation? One Java class!
@Operation(id = CreateDocument.ID, category = Constants.CAT_DOCUMENT, label = "Create",
        description = "Create a new document in the input folder ...")
public class CreateDocument {

    public final static String ID = "Document.Create";

    @Context
    protected CoreSession session;

    @Param(name = "type")
    protected String type;

    @Param(name = "name", required = false)
    protected String name;

    @Param(name = "properties", required = false)
    protected Properties content;

    @OperationMethod
    public DocumentModel run(DocumentModel doc) throws Exception {
        if (name == null) {
            name = "Untitled";
        }
        DocumentModel newDoc = session.createDocumentModel(
                doc.getPathAsString(), name, type);
        if (content != null) {
            DocumentHelper.setProperties(session, newDoc, content);
        }
        return session.createDocument(newDoc);
    }
}
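Once deployed, an operation like this is reachable over Automation's REST endpoint, where each `@Param` becomes an entry in the request's JSON `params` map. The sketch below only builds the request pieces with the standard library; the base URL and parameter values are illustrative assumptions, and the JSON shape follows the Automation protocol's `params` map.

```java
import java.util.Map;
import java.util.StringJoiner;

// Builds the endpoint URL and JSON body for invoking the CreateDocument
// operation over Automation REST. Values here are placeholders; only the
// request shape is the point of the sketch.
public class CreateDocumentCallSketch {

    static String endpoint(String nuxeoBase) {
        // Operations are addressed by id under /site/automation
        return nuxeoBase + "/site/automation/Document.Create";
    }

    static String paramsJson(Map<String, String> params) {
        // Automation expects a JSON object carrying a "params" map
        StringJoiner entries = new StringJoiner(",", "{\"params\":{", "}}");
        params.forEach((k, v) -> entries.add("\"" + k + "\":\"" + v + "\""));
        return entries.toString();
    }

    public static void main(String[] args) {
        System.out.println(endpoint("http://localhost:8080/nuxeo"));
        // "type" matches the @Param name declared by the operation
        System.out.println(paramsJson(Map.of("type", "File")));
    }
}
```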
19. Example of an operation
Example: "Follow Life Cycle Transition"
- Input: Documents, Document
- Output: Documents, Document
- Parameter: the name of the transition to follow
- Process: follows the transition given in the parameter on the input document, then returns the modified document, which can be used in the next operation
20. Automation Client
- Easy to include
- Doesn't require you to be too much of a geek!

import org.nuxeo.ecm.automation.client.Session;
import org.nuxeo.ecm.automation.client.jaxrs.impl.HttpAutomationClient;
import org.nuxeo.ecm.automation.client.model.Documents;

public static void main(String[] args) throws Exception {
    HttpAutomationClient client = new HttpAutomationClient(
            "http://localhost:8080/nuxeo/site/automation");
    Session session = client.getSession("Administrator", "Administrator");
    Documents docs = (Documents) session.newRequest("Document.Query").set(
            "query", "SELECT * FROM Document").execute();
    System.out.println(docs);
    client.shutdown();
}
21. What to find in the Nuxeo Platform
• Fully featured repository (complex metadata management, versioning, proxies, mixins, …)
• Advanced workflow engine
• Conversion services (PDF, videos, images, …)
• Search and query engine
• Audit service
• Higher-level services: comments, activity, …
• Media management services
22. Questions & Answers
Thank you so much for your time.
Let’s stay in touch.
@aescaffre
www.nuxeo.com