Lufei Kui: Pioneer of China's Information Revolution (陆费逵:中国信息革命先驱), by Daqing Zhao
Invited talk at the Fudan University symposium "Chung Hwa Book Company and China's Modern and Contemporary Culture", 2012.06.30. Article published in 《中华书局与中国近现代文化》 (Chung Hwa Book Company and China's Modern and Contemporary Culture), Shanghai People's Publishing House (上海人民出版社), 2013. ISBN 978-7-208-11462-3.
Memory forensics using VMI for cloud computing, by Priyanka Aash
The relocation of systems and services into cloud environments is on the rise. With this trend, users lose direct control over their machines and depend on the services offered by cloud providers. In the field of digital forensics those services are very rudimentary, and users' possibilities for analyzing their virtual machines with forensic methods are very limited. The research underlying this talk develops a practical approach that gives users additional capabilities for forensic investigations, focused on a memory forensics service offering. To reach this goal, a management solution for cloud environments has been extended with memory forensic services. The basis of the solution is a set of self-developed memory forensic services that are installed on each cloud node and managed through the cloud management component. Forensic data is gained via virtual machine introspection (VMI) techniques; compared to other approaches, it is possible to obtain trustworthy data without influencing the running system. Additionally, a general overview of the underlying technologies is provided and their pros and cons are discussed. The approach is described generically and implemented in a prototype that uses OpenNebula to manage the cloud infrastructure, Xen as the virtualization component, LibVMI as the virtual machine introspection library, and Volatility as the forensic tool.
(Source: Black Hat USA 2016, Las Vegas)
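The full LibVMI/Volatility pipeline the abstract describes needs a hypervisor, but the core analysis step it relies on (carving known signatures out of a raw memory image) can be sketched in a few lines. All names, bytes, and offsets below are invented for illustration; this is not the talk's implementation.

```python
# Minimal sketch of signature scanning over a raw memory image, the kind
# of primitive a memory-forensics plugin builds on. Data is synthetic.

def scan_for_signatures(memory: bytes, signatures: list[bytes]) -> dict[bytes, list[int]]:
    """Return every offset at which each signature occurs in the image."""
    hits: dict[bytes, list[int]] = {sig: [] for sig in signatures}
    for sig in signatures:
        start = 0
        while (pos := memory.find(sig, start)) != -1:
            hits[sig].append(pos)
            start = pos + 1
    return hits

# Toy "memory image" with two embedded process-name strings.
image = b"\x00" * 64 + b"svchost.exe" + b"\xff" * 32 + b"bash" + b"\x00" * 16
found = scan_for_signatures(image, [b"svchost.exe", b"bash"])
print(found[b"svchost.exe"])  # [64]
```

In a real VMI setup the `memory` buffer would be read live from the guest through the introspection library rather than from a file, which is what lets the analysis run without influencing the guest.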
CDAO Public Sector 2018 Presentation - Forces Shaping Data Science and Machin..., by Felix Liao
Emerging trends and technological breakthroughs are rapidly changing the data science and machine learning landscape. These forces bring new challenges as well as new opportunities. This session will cover the trends and highlight the common pitfalls to avoid in order to be successful, regardless of where you are on this journey.
Although decision trees have been in development and use for over 50 years, many new forms are evolving that promise exciting new capabilities in the areas of data mining and machine learning.
Learn the importance and concept of Decision Tree Analysis and how one can analyse data.
Understanding Metadata: Why it's essential to your big data solution and how ..., by Zaloni
In this O'Reilly webcast, Ben Sharma (cofounder and CEO of Zaloni) and Vikram Sreekanti (software engineer in the AMPLab at UC Berkeley) discuss the value of collecting and analyzing metadata, and its potential to impact your big data solution and your business.
Watch the replay here: http://oreil.ly/28LO7IW
AUSOUG - NZOUG-GroundBreakers-Jun 2019 - AI and Machine Learning, by Sandesh Rao
Autonomous Database is one of the hottest Oracle products, and we have applied machine learning to several aspects of the service. This presentation reviews the current state of our diagnostic methodology in the Autonomous Database Cloud services and how we process that data to find and troubleshoot anomalies at a scale of several petabytes a year, conducting AIOps. One use case is a log anomaly timeline, in which semi-supervised machine learning techniques reduce significant volumes of logs and match them in near real time. We will cover techniques for analyzing database issues using machine learning methods such as k-means, TF-IDF, random forests, and z-scores to predict whether a spike in CPU is normal or abnormal. We will also talk about RNNs with LSTM/GRU cells as one way to predict faults before they happen. Other use cases include using convolution filters to determine maintenance windows within database workloads, determining the best times to run database backups, and building security anomaly timelines, among others. This is a production service that can be used if you have a customer SR/defect today; the service is much more extensive inside the Oracle Autonomous Database Cloud. The presentation includes several examples of how to apply these techniques; machine learning knowledge is preferred but not a prerequisite.
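Of the techniques the abstract lists, the z-score check is simple enough to sketch directly: a CPU sample is flagged as abnormal when it lies too many standard deviations from the historical mean. The threshold and data below are invented for the example; this is not Oracle's implementation.

```python
# Illustrative z-score spike detection over a CPU-utilisation history.
from statistics import mean, stdev

def is_abnormal_spike(history: list[float], sample: float, threshold: float = 3.0) -> bool:
    """Flag `sample` as abnormal if it lies more than `threshold`
    standard deviations from the mean of `history`."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return sample != mu
    return abs(sample - mu) / sigma > threshold

cpu_history = [22.0, 25.0, 21.0, 24.0, 23.0, 26.0, 22.0, 24.0]
print(is_abnormal_spike(cpu_history, 25.0))  # False: normal fluctuation
print(is_abnormal_spike(cpu_history, 95.0))  # True: abnormal spike
```

The same shape of check generalises to the other signals mentioned (backups, maintenance windows) by swapping in the relevant metric history.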
Learn about data lifecycle best practices in the AWS Cloud. Discover how to optimise performance and lower the costs of data ingestion, staging, storage, cleansing, analytics, visualisation, and archiving.
Data Management for High Performance Analytics, by Mary Snyder
High-performance analytics is only as good as the data management supporting it.
In fact, high-performance data management plays a key role when it comes to in-database, in-memory and in-stream analytics.
In this webinar, Dan Socenau from SAS explores:
• The data management building blocks needed to succeed with high-performance analytics.
• Assessing, planning and executing these bedrock data management capabilities.
• How to deploy a modern data analysis practice.
View the on-demand webinar: http://www.sas.com/en_us/webinars/data-management-high-performance-analytics.html
Accelerate Your Big Data Analytics Efforts with SAS and Hadoop, by DataWorks Summit
Analytics and machine learning continue to be the top use cases for deploying big data platforms such as Hadoop. SAS recognised the potential and power of the Hadoop platform early on and has been integrating its analytical solutions with Hadoop to leverage the platform's power and flexibility for analytical workloads. The combination of SAS and Hadoop offers developers and organisations an approach that can accelerate the development and deployment of big data analytics applications that are mature, proven and scalable. Furthermore, by giving developers and analysts analytical applications that are rich, proven and collaborative, SAS allows more users across different skill levels to unleash the value of data stored in big data platforms more easily and quickly.
In this session, we will cover common big data analytics use cases, the depth and breadth of SAS analytical capabilities on Hadoop, and how SAS solutions are integrated into the Hadoop ecosystem via technologies such as Hive, YARN and Spark.
Speaker
Felix Liao, SAS Institute Australia & New Zealand
An introduction to Texperts' value-added service tools, which use virtualization and cloud technology to achieve a zero footprint in the SAP customer's data center and a pay-as-you-go flat cost structure. vUpgrade and vBreakFix are tools-as-a-service (TaaS) offerings delivered through Texperts' private cloud that aim to increase business agility, cost reduction and optimization during an SAP upgrade. These solutions are built on VMware software and supplement the technical SAP upgrade process: they accelerate the upgrade while increasing flexibility through independence from fixed timelines and controllable costs. This is made possible by innovative SAP upgrade services injected with virtualization and cloud technology.
University of Petroleum and Energy Studies is the first Indian university to implement SAP. SAP for HE&R has provided UPES with real-time access to student data, seamless integration of data across all business units, and a single portal with complete and controlled access to the entire organization's data, information and knowledge resources.
Virtual Sandbox for Data Scientists at Enterprise Scale, by Denodo
View the full webinar here: https://goo.gl/rMQEQK
The Virtual Sandbox is an overarching framework to support the enterprise-scale roll out of data science programs using the industry standard, CRISP-DM methodology.
Attend this session to learn how the Virtual Sandbox optimizes analytical model generation, testing, deployment and subsequent refinement by:
• Easing data access for exploration and mash ups via a governed, self-service data access platform.
• Supporting the creation of logical views using data virtualization for reuse across the organization.
• Facilitating quick and repeatable generation of data sets for analytical model testing and refinement.
• Hastening model deployment by operationalizing the model using shared development pipelines.
Agenda:
• Review the challenges faced by enterprise-scale data science programs.
• Overview of the Virtual Sandbox and its benefits.
• Product Demonstration.
• Q&A
FINRA's Managed Data Lake: Next-Gen Analytics in the Cloud - ENT328 - re:Inve..., by Amazon Web Services
The Financial Industry Regulatory Authority (FINRA)'s Technology Group has changed its customers' relationship with data by creating a managed data lake that enables discovery on petabytes of capital markets data, while saving time and money over traditional analytics solutions. FINRA's managed data lake unlocks the value in its data to accelerate analytics and machine learning at scale. The data lake includes a centralized data catalog and separates storage from compute, allowing users to query petabytes of data in seconds. Learn how FINRA uses Spot Instances and services such as Amazon S3, Amazon EMR, Amazon Redshift, and AWS Lambda to provide the right tool for the right job at each step in the data processing pipeline. All of this is done while meeting FINRA's security and compliance responsibilities as a financial regulator.
5th Qatar BIM User Day, BIM Interoperability Issues: Lessons learned from PLM, by BIM User Day
Author: Dr. Abdelaziz Bouras | Qatar University
Content:
- Interoperability issues from the PLM experience
- Understanding the limitation of open and hybrid interoperability standards
- Discovering current data preservation solutions
About the Qatar BIM User Day: Qatar University, HOCHTIEF ViCon and Teesside University proudly take the initiative to facilitate modern and innovative methods in the Gulf construction industry. The focus is Building Information Modeling (BIM), and our aim is to establish a knowledge platform with government, research and industry experts. The User Day aims to help people to share knowledge, discuss new technologies, and identify new potentials for BIM.
Connector Corner: Automate dynamic content and events by pushing a button, by DianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Neuro-symbolic is not enough, we need neuro-*semantic*, by Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, machine learning over just any symbolic structure is not sufficient to really harvest the gains of NeSy. Those gains come only when the symbolic structures have an actual semantics. I give an operational definition of semantics as "predictable inference".
All of this is illustrated with link prediction over knowledge graphs, but the argument is general.
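The idea of "semantics as predictable inference" can be made concrete with a toy knowledge graph: once a relation carries a semantics (here, transitivity of a hypothetical `subclass_of` relation), every entailed link is derivable in advance rather than merely guessed. The graph and relation name below are invented for the example.

```python
# Close a tiny knowledge graph under transitivity of one relation:
# a semantics makes the inferred links predictable, not statistical.

def infer_transitive(triples: set[tuple[str, str, str]], rel: str = "subclass_of"):
    """Return the graph closed under transitivity of `rel` (fixpoint)."""
    inferred = set(triples)
    changed = True
    while changed:
        changed = False
        for (a, r1, b) in list(inferred):
            for (b2, r2, c) in list(inferred):
                if r1 == r2 == rel and b == b2 and (a, rel, c) not in inferred:
                    inferred.add((a, rel, c))
                    changed = True
    return inferred

kg = {("cat", "subclass_of", "mammal"), ("mammal", "subclass_of", "animal")}
closed = infer_transitive(kg)
print(("cat", "subclass_of", "animal") in closed)  # True: the link follows necessarily
```

A learned link predictor might also propose `("cat", "subclass_of", "animal")`, but only the semantics guarantees it.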
The Art of the Pitch: WordPress Relationships and Sales, by Laura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers, without pulling teeth or pulling your hair out. Practical tips and strategies for successful relationship building that leads to closing the deal.
Smart TV Buyer Insights Survey 2024, by 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Accelerate your Kubernetes clusters with Varnish Caching, by Thijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
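Varnish itself is configured in VCL and deployed via the Helm chart the talk mentions, but the core idea it brings to a cluster, serving repeated requests from a short-lived cache instead of the backend, can be sketched generically. Everything below (class name, URLs, response bodies) is illustrative, not Varnish's behavior.

```python
# Toy TTL cache illustrating why a caching layer offloads the backend:
# repeated fetches within the TTL never reach the origin.
import time

class TTLCache:
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self.store: dict[str, tuple[float, str]] = {}  # url -> (stored_at, body)
        self.backend_hits = 0

    def fetch(self, url: str) -> str:
        entry = self.store.get(url)
        if entry and time.monotonic() - entry[0] < self.ttl:
            return entry[1]                      # cache hit
        self.backend_hits += 1                   # cache miss: go to origin
        body = f"response for {url}"             # stand-in for a real request
        self.store[url] = (time.monotonic(), body)
        return body

cache = TTLCache(ttl_seconds=60)
cache.fetch("/products")
cache.fetch("/products")
print(cache.backend_hits)  # 1: the second request never reached the backend
```

In the Kubernetes setting, Varnish sits as exactly this kind of layer in front of application pods, with VCL deciding what is cacheable and for how long.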
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do..., by UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
GraphRAG is All You need? LLM & Knowledge Graph, by Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
PHP Frameworks: I want to break free (IPC Berlin 2024), by Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo..., by James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. The constant focus on speed to release software to market, combined with traditionally slow and manual security checks, has caused gaps in continuous security, an important piece of the software supply chain. Today, organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their application supply chains and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
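One of the deliverables the talk names, the deployment bill of materials (DBOM), can be sketched minimally: record each deployed artifact with a content hash so the exact bits in production can be audited later. This is an illustrative sketch under that assumption, not OpsMx's implementation, and the artifact names are invented.

```python
# Minimal DBOM sketch: map each deployed artifact to a SHA-256 digest
# and emit the record as JSON for later audit.
import hashlib
import json

def build_dbom(artifacts: dict[str, bytes]) -> str:
    """Return a JSON DBOM mapping artifact names to SHA-256 digests."""
    entries = {name: hashlib.sha256(blob).hexdigest() for name, blob in artifacts.items()}
    return json.dumps({"artifacts": entries}, indent=2, sort_keys=True)

dbom = build_dbom({"app.jar": b"fake jar bytes", "config.yaml": b"replicas: 3\n"})
print(dbom)
```

A real DBOM would hash container images and manifests pulled from the delivery pipeline, but the audit property is the same: any change to the deployed bits changes the digest.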
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He brings around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms, and is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova..., by Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. Fostering a culture of innovation, however, takes real work: it takes vision, leadership, and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.