EmployeePages: The next generation staff directory - TIMETOACT GROUP
Employees need a user interface with all relevant and accurate information. With a Corporate Directory and a chart view that visualizes all organizational changes, the organizational structure becomes clearer, more transparent, and more personal.
Traditional BI promises security and scale, but at what cost? Often, working with data, finding answers and sharing them can be laborious and time intensive. The rapid growth and maturation of cloud technologies offers an easier path.
With Tableau and AWS you can move your BI to the cloud and deliver the security and scale of your traditional BI, but with accessibility, flexibility, and speed. Take a closer look at the benefits of cloud BI, and how you can get started today.
Slides from my talk at Big Data Conference 2018 in Vilnius
Doing data science today is far more difficult than it will be in the next 5-10 years. Sharing and collaborating on data science workflows is painful, and pushing models into production is challenging.
Let’s explore what Azure provides to ease Data Scientists’ pains. What tools and services can we choose based on a problem definition, skillset or infrastructure requirements?
In this talk, you will learn about Azure Machine Learning Studio, Azure Databricks, Data Science Virtual Machines and Cognitive Services, with all the perks and limitations.
How to build your own Delve: combining machine learning, big data and SharePoint - Joris Poelmans
You are experiencing the benefits of machine learning every day through product recommendations on Amazon & Bol.com, credit card fraud prevention, etc. So how can we leverage machine learning together with SharePoint and Yammer? We will first look into the fundamentals of machine learning and big data solutions, and next we will explore how we can combine tools such as Windows Azure HDInsight, R, and Azure Machine Learning to extend and support collaboration and content management scenarios within your organization.
Enabling digital business with governed data lake - Karan Sachdeva
Digital business is enabled by artificial intelligence, machine learning, and data science. Artificial intelligence and machine learning depend on the right information architecture and data foundation. A governed data lake, infused with governance and a data science platform, gives you the power to take your organization on the digital transformation and AI journey.
Current Microsoft Power BI governance guidance and recommendations, including the changes following the November Power BI release and the PASS conference announcements.
SPS London 2017 - Building applications with PowerApps, Microsoft Flow and Of... - Ahmad Najjar
PowerApps and Flow are services for building and using custom business apps that connect to your data and work across the web and mobile - without the time and expense of custom software development.
This session demonstrates how to build custom business applications, describes the vision behind PowerApps and Flow, and features key scenarios.
Big Data beyond Apache Hadoop - How to integrate ALL your Data - Kai Wähner
Big data represents a significant paradigm shift in enterprise technology. Big data radically changes the nature of the data management profession as it introduces new concerns about the volume, velocity and variety of corporate data.
Apache Hadoop is the open source de facto standard for implementing big data solutions on the Java platform. Hadoop consists of its kernel, MapReduce, and the Hadoop Distributed File System (HDFS). A challenging task is to send all data to Hadoop for processing and storage (and then get it back to your application later), because in practice data comes from many different applications (SAP, Salesforce, Siebel, etc.) and databases (file, SQL, NoSQL), uses different technologies and concepts for communication (e.g. HTTP, FTP, RMI, JMS), and consists of different data formats such as CSV, XML, binary data, or other alternatives.
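The MapReduce model at Hadoop's core can be illustrated with a tiny, pure-Python word count. This is a sketch of the programming model only, not the Hadoop API:

```python
from collections import defaultdict

def map_phase(document):
    # Mapper: emit a (word, 1) pair for every word in the input split
    for word in document.split():
        yield (word.lower(), 1)

def reduce_phase(pairs):
    # Shuffle: group intermediate pairs by key
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    # Reducer: sum the counts for each word
    return {key: sum(values) for key, values in groups.items()}

counts = reduce_phase(map_phase("big data big insights"))
print(counts)  # {'big': 2, 'data': 1, 'insights': 1}
```

In real Hadoop, the mapper and reducer run distributed across the cluster and the shuffle happens between nodes, but the data flow is the same.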
This session shows the powerful combination of Apache Hadoop and Apache Camel to solve this challenging task. Learn how to use virtually any kind of data with Hadoop, without a lot of complex or redundant boilerplate code. Besides supporting the integration of all these different technologies and data formats, Apache Camel also offers an easy, standardized DSL to transform, split, or filter incoming data using the Enterprise Integration Patterns (EIP). Therefore, Apache Hadoop and Apache Camel are a perfect match for processing big data on the Java platform.
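The patterns named above (splitter, filter, message translator) can be sketched in plain Python. The CSV payload and field names are hypothetical; a real Camel route would express the same flow declaratively in its DSL:

```python
# Incoming message: one CSV payload from a hypothetical source system
payload = "id,amount\n1,250\n2,-30\n3,4000"

def split(message):
    # Splitter: break one message into individual records
    header, *rows = message.split("\n")
    fields = header.split(",")
    return [dict(zip(fields, row.split(","))) for row in rows]

def keep(record):
    # Filter: drop records that fail a (hypothetical) business rule
    return int(record["amount"]) > 0

def translate(record):
    # Message translator: convert to the format the sink expects
    return {"order_id": int(record["id"]),
            "amount_cents": int(record["amount"]) * 100}

out = [translate(r) for r in split(payload) if keep(r)]
print(out)
```

Chaining small, single-purpose steps like this is exactly what the EIP vocabulary standardizes.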
Presentation of use cases of Master Data Management for customer data. It presents the business drivers and how the Talend platform for MDM can address them.
Simplifying AI and Machine Learning with Watson Studio - DataWorks Summit
Are you seeing benefits from big data, AI and machine learning? Some companies are challenged by the complexity of the tools, access to quality data and the ability to operationalize these technologies. IBM’s Watson Studio addresses the needs of developers, data scientists and business analysts – who need to create, train and deploy machine and deep learning models, analyze and visualize data – all in an easy-to-use platform. Watson Studio supports Apple’s Core ML with Watson Visual Recognition service. It provides a suite of tools for data scientists, application developers and subject matter experts to collaboratively and easily work with data and use that data to build, train and deploy models at scale. When coupled with IBM Watson Knowledge Catalog, it enables companies to create a secure catalog of AI assets including datasets, documents and models. In this session, you will learn how to use these new offerings to solve real world business problems and infuse AI into your business to drive innovation.
Speaker: Sumit Goyal, Software Engineer, IBM
Analyzing Billions of Data Rows with Alteryx, Amazon Redshift, and Tableau - DATAVERSITY
Got lots of data? So does Amaysim, a leading Australian telecom provider, with its billions of rows of data. The organization successfully empowers its small team of data analysts with self-service data analytics platforms so they can easily access the data they need, perform advanced analytics, and visualize findings for all stakeholders. Register for this session and learn how Amaysim uses the Alteryx-Redshift-Tableau BI stack to easily and quickly:
Extract data from their data warehouse and blend and enrich it with other sources
Give data analytical context by running statistical, predictive, and deep geo-spatial analytics
Create visualizations from analytics and then update Tableau Workbooks directly from Alteryx, or publish the results in Amazon Redshift for easy, direct access by their stakeholders from Tableau
Hear from Adrian Loong, Alteryx Analytics Certified Expert (ACE), and product marketers from AWS and Alteryx on how organizations can use Alteryx, Amazon Redshift and Tableau to enable data analysts to spin up new self-service analytics instances to enable fast investigation for critical business decisions.
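The extract-blend-enrich step described above can be sketched as a simple lookup join in Python. The table and column names are invented for illustration; Alteryx performs this kind of blend visually:

```python
# Rows extracted from a hypothetical warehouse table
usage = [
    {"customer_id": 1, "gb_used": 12.5},
    {"customer_id": 2, "gb_used": 3.1},
]

# Enrichment source, e.g. a CRM export, keyed by customer id
crm = {1: {"plan": "unlimited"}, 2: {"plan": "prepaid"}}

# Blend: left-join the enrichment attributes onto each usage row
blended = [{**row, **crm.get(row["customer_id"], {})} for row in usage]
print(blended)
```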
Strata 2017 (San Jose): Building a healthy data ecosystem around Kafka and Ha... - Shirshanka Das
So, you finally have a data ecosystem with Kafka and Hadoop both deployed and operating correctly at scale. Congratulations. Are you done? Far from it.
As the birthplace of Kafka and an early adopter of Hadoop, LinkedIn has 13 years of combined experience using Kafka and Hadoop at scale to run a data-driven company. Both Kafka and Hadoop are flexible, scalable infrastructure pieces, but using these technologies without a clear idea of what the higher-level data ecosystem should be is perilous. Shirshanka Das and Yael Garten share best practices around data models and formats, choosing the right level of granularity for Kafka topics and Hadoop tables, and moving data efficiently and correctly between Kafka and Hadoop. They also explore Dali, a data abstraction layer that can help you process data seamlessly across Kafka and Hadoop.
Beyond pure technology, Shirshanka and Yael outline the three components of a great data culture and ecosystem and explain how to create maintainable data contracts between data producers and data consumers (like data scientists and data analysts) and how to standardize data effectively in a growing organization to enable (and not slow down) innovation and agility. They then look to the future, envisioning a world where you can successfully deploy a data abstraction of views on Hadoop data, like a data API as a protective and enabling shield. Along the way, Shirshanka and Yael discuss observations on how to enable teams to be good data citizens in producing, consuming, and owning datasets and offer an overview of LinkedIn’s governance model: the tools, process and teams that ensure that its data ecosystem can handle change and sustain #datasciencehappiness.
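One way to make the data contracts mentioned above maintainable is to validate every produced record against an agreed schema before consumers see it. This Python sketch uses a hypothetical event schema, not LinkedIn's actual tooling:

```python
# Agreed contract between producer and consumers: field name -> required type
CONTRACT = {"user_id": int, "event": str, "timestamp_ms": int}

def validate(record, contract=CONTRACT):
    """Return a list of violations; an empty list means the record honors the contract."""
    errors = [f"missing field: {f}" for f in contract if f not in record]
    errors += [
        f"bad type for {f}: expected {t.__name__}"
        for f, t in contract.items()
        if f in record and not isinstance(record[f], t)
    ]
    return errors

good = {"user_id": 7, "event": "page_view", "timestamp_ms": 1500000000000}
bad = {"user_id": "7", "event": "page_view"}
print(validate(good))  # []
print(validate(bad))
```

Running such a check at the producer boundary is what keeps downstream Hadoop tables and Kafka consumers insulated from upstream change.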
Migrating to Alfresco Part II: The “How” – Tools & Best Practices for Renovat... - Zia Consulting
In the first presentation of this Migration series from Alfresco Partner of the Year, Zia Consulting, we focused on the “Why” and the “What”: why should you migrate to Alfresco, and what people are migrating from. We looked at the costs associated with legacy ECM systems, both license and maintenance costs, as well as the costs associated with systems that aren’t being used, won’t integrate with your critical business applications, or won’t support modern initiatives including cloud and mobile. We then discussed moving from technologies like Documentum or SharePoint, as well as moving from embedded or vertical-specific ECM systems, or even moving from content repositories in file shares or email.
For this second presentation of the Migration series, we focused on the “How”. Specifically, we covered:
-Best practices for migrating your content to Alfresco based on experience from dozens of successful Alfresco migration projects
-Recommended approaches for “phased” migrations
-Opportunities for “multi-repository” solutions, keeping existing documents within legacy systems
-Migrating records to Alfresco Records Management (RM) 2.1
Jet Reports: Your Newest Tool by Jon Phipps - KTL Solutions
With the slow demise of Management Reporter (MR), Microsoft Dynamics GP has partnered with Jet Reports to leverage a different financial reporting solution. Are you an avid MR guru who is concerned about the functionality? Want to understand the basics of this new flexible BI and reporting solution? Whether you are a spreadsheet expert or new to the business reporting world, this session covers the basics to help you streamline the report creation process using Jet Reports and set your mind at ease about this transition.
Webinar: It's the 21st Century - Why Isn't Your Data Integration Loosely Coup... - SnapLogic
In this webinar, learn from digital transformation and SOA thought leader Jason Bloomberg about traditional enterprise application integration (EAI), the rise of SOA and Web Services, and the latest REST and JSON initiatives.
This presentation also features a discussion of the age-old problem of implementing loosely coupled data integration, an architectural approach to solving this difficult problem and a demonstration of SnapLogic.
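The loose coupling the webinar argues for can be sketched as two components that share only a JSON document, never each other's internals. The message fields here are hypothetical:

```python
import json

def export_orders():
    # Producer side: serialize internal state into a neutral JSON document
    internal = [{"id": 1, "total": 9.99}]
    return json.dumps({"orders": internal, "version": 1})

def count_orders(document):
    # Consumer side: depends only on the JSON shape, not on the producer's code
    data = json.loads(document)
    return len(data["orders"])

message = export_orders()
print(count_orders(message))  # 1
```

Because the two sides agree only on the document format, either one can be rewritten or replaced without touching the other, which is the essence of loosely coupled integration.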
To learn more, visit: www.snaplogic.com/connect-faster
The IBM governed data lake is a value-driven big data platform journey. The journey starts by ingesting a wide variety of data, governing it, and applying data science and machine learning to it to produce actionable insights.
This presentation contains an introduction to Tableau software, covering in particular connecting to data, visual analytics, dashboards and stories, calculations, mapping, and Tableau Online and its competitors.
I often hear from clients: “We don’t know much about Big Data – can you tell us what it is and how it can help our business?” Yes! The first step is this vendor-free presentation, where I start with a business-level discussion, not a technical one. Big Data is an opportunity to re-imagine our world, to track new signals that were once impossible, to change the way we experience our communities, our places of work, and our personal lives. I will help you to identify the business value opportunity from Big Data and how to operationalize it. Yes, we will cover the buzzwords: modern data warehouse, Hadoop, cloud, MPP, Internet of Things, and Data Lake, but I will show use cases to better understand them. In the end, I will give you the ammo to go to your manager and say “We need Big Data and here is why!” Because if you are not utilizing Big Data to help you make better business decisions, you can bet your competitors are.
You are not Facebook or Google? Why you should still care about Big Data and ... - Kai Wähner
Big data represents a significant paradigm shift in enterprise technology. Big data radically changes the nature of the data management profession as it introduces new concerns about the volume, velocity and variety of corporate data.
This session goes beyond the well-known examples of huge companies such as Facebook or Google with millions of users. Instead, this session explains the "big" paradigm and technology shift for your company. See several use cases showing how big data enables small and medium-sized companies to gain insight into new business opportunities (and threats), and how big data stands to transform much of what the modern enterprise is today.
Learn how to solve the unique challenges of big data without your own research lab or several big data experts in your company. Learn how to implement the relevant use cases for your company with low cost and effort by using open source frameworks, which greatly simplify working with big data.
Data-driven companies need to make their data easily accessible to those who analyze it. Many organizations have adopted the Looker application with LookML on AWS: a centralized analytical platform with a user-friendly interface that allows employees to ask and answer their own questions and make informed business decisions.
Join our webinar to learn how our customer, Casper, an online mattress retailer, made the switch from a transactional database to Looker’s data analytics program on Amazon Redshift. Looker on Amazon Redshift can help you greatly reduce your analytics lifecycle with a simplified infrastructure and rapid cloud scaling.
Join us to learn:
• How to utilize LookML to build reusable definitions and logic for your data
• Best practices for architecting a centralized analytical database
• How Casper leveraged Looker and Amazon Redshift to provide all their employees access to their data and metrics
Who should attend: Heads of Analytics, Heads of BI, Analytics Managers, BI Teams, Senior Analysts
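The reusable-definitions idea behind LookML can be sketched in Python: define each measure once, centrally, and reuse it in every report. The field names are hypothetical, and real LookML is a declarative modeling language, not Python:

```python
# Define each measure once, centrally, instead of repeating SQL per report
MEASURES = {
    "total_revenue": lambda rows: sum(r["revenue"] for r in rows),
    "order_count": lambda rows: len(rows),
}

def run_report(rows, measure_names):
    # Every report reuses the same, single definition of each measure
    return {name: MEASURES[name](rows) for name in measure_names}

orders = [{"revenue": 100.0}, {"revenue": 250.0}]
print(run_report(orders, ["total_revenue", "order_count"]))
```

Centralizing definitions this way is what lets every analyst get the same answer to "what is revenue?" regardless of which dashboard they open.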
Enterprise Content Management in Microsoft SharePoint 2007 - ukdpe
This is a CIO-level presentation discussing how Microsoft approaches the topic of ECM. SharePoint is discussed as a technical framework and a foundation to build a solution. By the end of the presentation, if you would like more information on SharePoint or related collaboration products from Microsoft, please email viralta@microsoft.com or visit www.microsoft.com/sharepoint
Microsoft’s Power Platform training provides the tools for organizations to better manage and analyze their data. Understanding the components of the Power Platform and how to best utilize them could be a critical differentiator for your business for one very simple reason: organizations that harness their data – to gain insights which are then used to drive intelligent business decisions – will outperform those that don’t.
It’s easy to recognize that there aren’t enough programmers, data scientists, and tech professionals to go around. Microsoft’s goal was to build a platform targeting these technology experts and the millions of other frontline workers who have never been equipped with the proper tools to do more with the data they work with every day.
The guiding vision was a framework called the “Triple-A Loop” – a closed-loop system allowing users to gain insights from data (Analyze) used to drive intelligent business processes via apps they build (Act) and processes they automate (Automate). The Microsoft Power Platform training implements this vision via three cloud-based services: Power BI, PowerApps, and Power Automate.
PowerApps is a low-code application development platform that allows anyone to build web and mobile applications without writing code. The natural connection between Power BI and PowerApps makes it effortless to put insights in the hands of maintenance workers, teachers, and others on the frontline. Tailored insights and task-specific applications can increase their productivity and make their work less tedious.
Like Power BI, PowerApps connects to hundreds of business systems and databases, making it easy to connect workers with the existing processes and data that drive the business. The data captured in PowerApps can make its way right back to those same systems for further analysis in Power BI, creating a closed-loop process for continuous improvement.
Additionally, PowerApps comes with a built-in, fully-managed, enterprise-grade data store called the Common Data Service (CDS) for those applications that generate data not destined for a legacy system. Power BI and Power Automate have deep connections to CDS, making it that much easier to gain more value from the data stored there.
Extending the Power Platform via connectors to other Microsoft products allows you to leverage those investments for an even greater return.
Here is a brief rundown of the four applications included in the Power Platform:
Power Apps
Build highly customized task- and role-based canvas apps with data from one or multiple sources
Generate immersive model-driven apps, starting from your data model and business processes
Consume fully accessible apps across web and mobile, embedded, or standalone, on any device
Making Informed Business Decisions with an Enterprise Information Management ... - Perficient, Inc.
Perficient presents: An Enterprise Information Management (EIM) solution provides an integration of structured and unstructured information in a context that users draw on to make decisions.
EIM solutions provide a seamless, role-based set of tools that let users be more efficient in completing their key tasks.
These tools can include:
Business Intelligence
Enterprise Content Management
Portal
Enterprise Search
Collaboration
E-Mail Management
Introduction to why there is a need to use unstructured web data in mashups, and how to get to that data using openkapow.com. Brief overview of enterprise mashup use cases.
Presentation from Mashup Camp 5 in Dublin.
Why Data Virtualization? An Introduction by DenodoJusto Hidalgo
Data Virtualization means Real-time Data Access and Integration. But why do I need it? This presentation tries to answer that question in a simple yet clear way.
By Alberto Pan, CTO of Denodo, and Justo Hidalgo, VP Product Management.
Enterprise Integration Patterns Revisited (again) for the Era of Big Data, In...Kai Wähner
In 2015, I gave two talks about Enterprise Integration Patterns, at OOP 2015 in Munich, Germany and at JavaDay 2015 in Kiev, Ukraine. I reused a talk from 2013 and updated it with current trends to show how relevant Enterprise Integration Patterns (EIP) remain today and in the coming years.
Business Process Automation Solutions | BPA Services | DynaTech Systemshenrryfor680
We offer Business Process Automation Solutions that focus on analyzing all of your business functions to identify areas where automation can be implemented.
Elucidating the Mashup Hype: Definition, Challenges, Methodical Guide and Too...dflejter
Paper presented at 2nd Workshop on Mashups, Enterprise Mashups and Lightweight Composition on the Web (MEM 2009 @ WWW 2009; http://integror.net/mem2009/)
Abstract:
Mashups are a current hype attracting high interest from academia and industry, now and in the years ahead. The idea behind a mashup is to create new content by reusing and combining existing content from heterogeneous sources. An advantage of mashups is that even people with no knowledge of programming languages can easily build new Web applications and create new forms of visualization. To support the mashup construction process, several tools with easy-to-use functionality have been proposed. From the research perspective, however, it is dissatisfying that neither a clear definition and classification model for mashups nor a separation between mashups and other forms of application integration exists. The aim of this paper is to elucidate the mashup hype by providing a definition and classification model for mashups and by sketching a methodical engineering guide for mashups. Additionally, an overview of tools and languages supporting mashup creation is presented.
While many enterprises consider cloud computing the savior of their data strategy, there is a process they should follow when looking to leverage database-as-a-service. This includes understanding their own data requirements, selecting the right cloud computing candidate, and then planning for the migration and operations. A huge number of issues and obstacles will inevitably arise, but fortunately best practices are emerging. This presentation will take you through the process of moving data to cloud computing providers.
Similar to Enterprise Mashup Infrastructure Kapow Mashup Server
Describes how to work with target audiences to efficiently drive adoption of open data, which in turn increases the value of publishing open data. The most important thing is to adapt data to external needs, not internal possibilities.
From Naturvårdsverket's data-host meeting, 5 December 2017.
I hate developers - at least I strongly dislike how developers are worshipped...Andreas Krohn
APIs are marketed to developers through Developer Portals and by Developer Evangelists, but in doing so we are missing a huge market, i.e., all the non-developers in the world. Why is this, and what do we do about it?
Presentation from the API Strategy & Practice conference in Amsterdam March 28, 2014
Building a successful API is primarily not a technical challenge; there are other challenges that are much harder and much more important. The content of this presentation is based on years of experience with customers' APIs.
Presentation from Nordic APIs (nordicapis.com), Stockholm, September 2013.
Make Everything Social: Integrating Social MediaAndreas Krohn
Use and work with social media using tools, and integrate social media with widgets and sharing functions. An overview of how to integrate social media with APIs, what APIs are, and how to work with them.
Presentation from 2013-09-05, from a seminar with Leif Kajrup (kajrup.se) at Dataföreningen and Marknadsföreningen in Malmö.
Successful Data Publishing, from Nordic APIs SundsvallAndreas Krohn
An overview of what is required to successfully publish data, i.e. to publish open data or a paid API that achieves the goals defined for the project.
State of APIs: API trends from Nordic APIs Copenhagen & SundsvallAndreas Krohn
An overview of where we are coming from, where we are and where we are going in the API world. Presentation from Nordic APIs in Copenhagen and Sundsvall in May 2013.
A walkthrough of the current state of APIs and the future trends of APIs. Presented at Nordic APIs in Stockholm in March 2013. More information about Nordic APIs at nordicapis.com or twitter.com/nordicapis.
Cloud API introduction from Infosec 2012Andreas Krohn
Presentation from the panel about Cloud API Security at Infosec 2012. It is an introduction to APIs, the developer's role, and the difference between SOAP and REST.
Presenting why APIs are important and what advantages APIs offer when used internally, shared with partners, or opened to the public. Highlighting what is needed to create a successful API. Presented at Bisnode HQ in Stockholm, Feb 1st 2012.
Presenting 20 APIs in 20 minutes to give a quick overview of what APIs are out there, from cloud application platforms and facial recognition to email and UFO sightings. Presented at Bisnode HQ in Stockholm Feb 1st 2012.
Presentation from Software Architect Community Day 2011, organized by Edument in Malmö, Sweden on June 17th.
Sorry for the bad formatting of the presentation; it seems Keynote and SlideShare do not play that well together.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview, including the concepts of Customer Key and Double Key Encryption.
Pushing the limits of ePRTC: 100ns holdover for 100 daysAdtran
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
PHP Frameworks: I want to break free (IPC Berlin 2024)Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices want to take full advantage of the features available on those devices, but many of those features provide convenience and capability at the expense of security. This best-practices guide outlines steps users can take to better protect personal devices and information.
Dr. Sean Tan, Head of Data Science, Changi Airport Group
Discover how Changi Airport Group (CAG) leverages graph technologies and generative AI to revolutionize their search capabilities. This session delves into the unique search needs of CAG’s diverse passengers and customers, showcasing how graph data structures enhance the accuracy and relevance of AI-generated search results, mitigating the risk of “hallucinations” and improving the overall customer journey.
SAP Sapphire 2024 - ASUG301 building better apps with SAP Fiori.pdfPeter Spielvogel
Building better applications for business users with SAP Fiori.
• What is SAP Fiori and why it matters to you
• How a better user experience drives measurable business benefits
• How to get started with SAP Fiori today
• How SAP Fiori elements accelerates application development
• How SAP Build Code includes SAP Fiori tools and other generative artificial intelligence capabilities
• How SAP Fiori paves the way for using AI in SAP apps
UiPath Test Automation using UiPath Test Suite series, part 5DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series, part 5. In this session, we will cover CI/CD with DevOps.
Topics covered:
CI/CD within UiPath
End-to-end overview of a CI/CD pipeline with Azure DevOps
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024Neo4j
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
Removing Uninteresting Bytes in Software FuzzingAftab Hussain
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behavior in your code. We introduce DIAR, a technique designed to speed up fuzzing campaigns by pinpointing and eliminating uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux libraries: Libxml's xmllint, a tool for parsing XML documents, and Binutils' readelf, an essential debugging and security-analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format) files. Our preliminary results show that AFL+DIAR not only discovers new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns, and DIAR helps you find such seeds.
- These are slides of the talk given at IEEE International Conference on Software Testing Verification and Validation Workshop, ICSTW 2022.
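The seed-reduction idea described above can be sketched as a greedy loop that drops chunks of bytes whose removal leaves observed coverage unchanged. This is an illustrative sketch under stated assumptions, not the published DIAR algorithm: `coverage_of` is a toy stand-in for running the instrumented target and collecting covered edges.

```python
def coverage_of(seed: bytes) -> frozenset:
    """Toy stand-in for instrumented execution: a tiny 'parser' that
    only reacts to a few marker bytes, so most seed bytes are
    uninteresting for coverage."""
    edges = set()
    if seed.startswith(b"<"):
        edges.add("open_tag")
    if b">" in seed:
        edges.add("close_tag")
    if b"&" in seed:
        edges.add("entity")
    return frozenset(edges)

def shrink_seed(seed: bytes, chunk: int = 4) -> bytes:
    """Greedily delete chunks whose removal leaves coverage unchanged,
    so later mutations are spent only on bytes that matter."""
    baseline = coverage_of(seed)
    i = 0
    while i < len(seed):
        candidate = seed[:i] + seed[i + chunk:]
        if coverage_of(candidate) == baseline:
            seed = candidate      # chunk was uninteresting: drop it
        else:
            i += chunk            # chunk matters: keep it, move on
    return seed

original = b"<padpadpadpad>&entity-junk-bytes"
lean = shrink_seed(original)
# The reduced seed is shorter but reaches the same coverage.
assert coverage_of(lean) == coverage_of(original)
```

A real implementation would measure edge coverage from the instrumented binary (as AFL does) rather than from a toy function, but the reduction loop has the same shape.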
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work, along with a knack for helping others understand how things work. He brings around 20 years of solution-engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series, part 4. In this session, we will cover an overview of Test Manager along with the SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test-management principles while learning practical approaches to optimizing testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
8. The Long Tail of Projects: Using traditional approaches, only the most important projects can be implemented. With mashups, the long tail of projects can be implemented. (Chart axes: # Users, # Projects)
11. So, Where Is The Data? (table, three source types per row)
Source type: Web Based (Unstructured) / File System (Unstructured) / Database (Structured)
Examples: Web Based Content & Data / MS Office Doc's, PDF, email / RDMS, Flat files
Data Access Method: Page Views / Document Retrieval / SQL Queries
Access Granularity: Full Page / Document Level Only / Data Element
Knowledge Worker Productivity Potential: High / Med / Low
Programmatic APIs: No / Proprietary / Yes
Growth Trend: Exploding / Exploding / Modest
Enterprise Scope: 80%-85% of data (unstructured) / 15%-20% of data (structured)
13. Inside a Mashup (architecture diagram): Users interact with widgets in a mashup. A mashup builder assembles the mashup, and a mashup enabler connects to the data sources (databases and web apps) through REST, Atom, RSS, web scraping (HTML/AJAX), SQL, RPC, JMS, and WS* APIs.
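The mashup-enabler role on this slide, promoting page-level web content to data-element access, can be illustrated with a minimal stdlib-only scraper. The HTML snippet, class names, and field names below are hypothetical, a sketch of the technique rather than the Kapow product's API:

```python
from html.parser import HTMLParser

# Hypothetical competitor pricing page: the kind of web-based
# (unstructured) source the slides describe.
PAGE = """
<table>
  <tr><td class="product">Widget A</td><td class="price">19.99</td></tr>
  <tr><td class="product">Widget B</td><td class="price">24.50</td></tr>
</table>
"""

class PriceScraper(HTMLParser):
    """Turns page-level access into data-element access: each <td>
    with a known class becomes a field in a structured record."""
    def __init__(self):
        super().__init__()
        self.rows = []
        self._field = None
    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self.rows.append({})
        elif tag == "td":
            self._field = dict(attrs).get("class")
    def handle_data(self, data):
        if self._field and data.strip():
            self.rows[-1][self._field] = data.strip()
            self._field = None

scraper = PriceScraper()
scraper.feed(PAGE)
# Structured records, ready for SQL-style use downstream.
records = [{"product": r["product"], "price": float(r["price"])}
           for r in scraper.rows]
```

After this step the data has the granularity of a database row, which is exactly the jump in access granularity the table on slide 11 contrasts (Full Page versus Data Element).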
25. Demo Scenario – Competitive Intelligence: Web-Based Pricing Data