Global Big Data Conference, Hyderabad, 2 Aug 2013: Finance/Manufacturing Use Cases - Sanjay Sharma
Financial institutions today are under intense pressure to provide more value to their customers, reduce IT costs, and grow year over year. The challenge is further complicated by the huge amounts of data being generated and by mandatory federal compliance requirements.
Similarly, the manufacturing industry faces the challenge of processing huge amounts of data in real time and predicting failures as early as possible to reduce cost and increase production efficiency.
The session covers high-level Big Data use cases in the financial and manufacturing domains and how Big Data technologies are being used to solve these challenges, with examples from the credit card/banking industry on the financial side and semiconductor production on the manufacturing side.
Deploying Massive-Scale Graphs for Real-Time Insights - Neo4j
Graph databases have been at the forefront of helping organizations manage and generate insights from data relationships, and of applying those insights in real time to drive competitive advantage. As organizations gain value from deploying graph databases, the data volumes they manage are growing exponentially, pushing the limits of large-scale in-memory graph processing. Neo4j and IBM Power Systems combined forces to deliver a market-leading scalable graph database platform capable of affordably storing and processing extremely large graphs and offering real-time insights, using flash and FPGA accelerators. In this session we will cover the use cases driving the need for this extremely scalable platform and how it offers an easy-to-deploy model for extreme-scale graph databases.
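The core idea, traversing relationships without join explosions, can be sketched in a few lines of plain Python. The toy graph and names below are invented for illustration; this has nothing to do with Neo4j's actual storage engine or Cypher API:

```python
from collections import deque

# Toy property graph: node -> list of (relationship, neighbor) edges.
# Names are illustrative only, not real Neo4j data.
GRAPH = {
    "alice":  [("FOLLOWS", "bob"), ("BOUGHT", "widget")],
    "bob":    [("FOLLOWS", "carol")],
    "carol":  [("BOUGHT", "gadget")],
    "widget": [],
    "gadget": [],
}

def shortest_path(graph, start, goal):
    """Breadth-first search: the kind of relationship traversal a
    graph database answers directly, edge by edge."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for _, nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no connecting path exists

print(shortest_path(GRAPH, "alice", "gadget"))
# → ['alice', 'bob', 'carol', 'gadget']
```

A production graph database adds index-free adjacency, transactions, and a query language on top of this traversal idea, which is what makes it viable at the extreme scales the session discusses.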
IMS on the mainframe hosts many enterprise-critical assets: transactional and batch applications as well as data. Analytics solutions apply to both!
Contact me for more details
Big Data has emerged as a powerful new technology paradigm. To manage the massive data generated by social media, online transactions, web logs, or sensors, Big Data incorporates innovative technologies in data management (unstructured, semi-structured, and structured), processing, real-time analytics, and visualization. It is also useful for reporting in circumstances where a relational database approach is not effective or is too costly. This Big Data project primarily exposes participants to the tools; tool usage, programming, algorithms, and application development are covered in the relevant courses.
Break free from the limits of today’s hyperconverged solutions. Focus on innovation that can transform you into a data-centric organization. Explore four ways NetApp HCI can help you build a next-generation data center.
Learn how flash enables e-commerce businesses to handle 14,000 transactions per second, allows oil & gas companies to shorten data analysis time from days to hours, and allows banks to double their speed to market, among other data points.
This presentation covers the challenges of large-scale sensor data processing for autonomous vehicle development, testing, and validation using ROS (Robot Operating System). We will show insights from using frameworks for large-scale data processing and distributed applications running on on-premise clusters and in the cloud, and share our experiences and lessons learned on accelerating the end-to-end engineering process from data ingest and cataloguing to analysis, development, and safety validation.
Keywords: Big Data, Data Science, Data Engineering, Deep Learning, Safety Validation, Testing, Automotive R&D
Reducing the Total Cost of Ownership of Big Data - Impetus White Paper - Impetus Technologies
For Impetus’ White Papers archive, visit http://www.impetus.com/whitepaper
The paper discusses the challenges that relate to the cost of Big Data solutions and looks at the technology options available to overcome these problems.
How Market Intelligence From Hadoop on Azure Shows Trucking Companies a Clear... - DataWorks Summit
TMW Systems (a Trimble Company) has been in the business of long-haul trucking, logistics operations and fleet management for more than thirty years, but we wanted more data, so we turned to our customer community. Now, we turn that data into market intelligence, which we then provide back to our customers. To do this, we invested heavily in Hortonworks Data Platform running on Microsoft Azure in the cloud. In our talk, we’ll share our strategy for capturing operational, maintenance, financial and mobile communications information and how we provide that back to our customer base. Our approach enables advanced analytics by leveraging Big Data technologies to find new relationships in data that may have been previously overlooked. Survey responses capture business performance metrics, strategy and emerging trends from 150 businesses, representing more than 31 billion dollars in freight movement. Learn how we combine that survey data with other sources like machine and sensor data to help guide our customers to profitability.
Presentation given 7th March 2017, including recent withdrawal announcement about POWER7 servers, the new AIX website, AIX Enterprise Edition, PowerVC and Cloud, IBM Design Thinking, Project Monocle, IBM Systems PoV, my Insurance story where I took a surprise trip to Lisbon, Hybrid Cloud, IBM Power Systems Enterprise servers for Cloud, reference architectures with PowerVC and OpenStack, OpenPOWER Foundation, LC servers, MongoDB, GPU and NVLink, Deep Learning, PowerAI and POWER9
This lecture aims to give some food for thought on how current High Performance Computing systems (hardware and software) tend to merge with Big Data systems (Machine Learning, Analytics, and Enterprise workloads) so that both kinds of workloads can share the same clusters.
Cloud and Software as a Service (SaaS) can make a huge impact on a business. Unfortunately, most organizations start the evaluation of SaaS from an IT perspective and traditional data center advantages (e.g., on-premises costs, staffing, and savings). While savings are important, cloud is about agility and speed. For these reasons, line-of-business (LOB) leaders have been more interested in SaaS solutions. Learn how Cognos Business Intelligence on Cloud and IBM dashDB make it simple to get started with collaboration, reporting, and analytics.
NetApp IT sought to adopt game changers like hyperscale computing, flash storage, and SaaS as part of its cloud-first strategy. After taking a hard look at its data centers, NetApp IT established a strategic roadmap to downsize data centers where possible to lower costs, enhance services, and become more dynamic.
DataWorks Summit 2017 - Sydney Keynote
Scott Gnau, Chief Technology Officer, Hortonworks
Data has become the most valuable asset for every enterprise. As businesses undergo data transformation, leading organizations are turning to data science and machine learning to drive more business value out of their data. In this talk, Scott will examine the trends and the key requirements needed to evolve to next-generation analytics and operations.
Here are 10 good reasons why you should be leveraging NetApp HCI! For more information, visit https://www.netapp.com/us/campaigns/why-hyperconverged-infrastructure/index.aspx
This presentation discusses how cloud computing is Big Data's best friend and how AWS cloud components fit in to complete your Big Data life cycle.
Agenda:
- How big is Big Data actually growing?
- How the cloud has the potential to become Big Data's best friend
- A tour of the Big Data life cycle
- How AWS cloud components fit into this life cycle
- A case study of our log analytics tool, Cloudlytics, built as a Big Data implementation on AWS Cloud
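As a rough illustration of what a log-analytics pipeline does at its core, here is a minimal sketch in Python. The log format and field layout are assumptions made up for this example, not Cloudlytics' actual schema:

```python
import re
from collections import Counter

# Hypothetical web-access log lines (format invented for illustration).
LOG = """\
10.0.0.1 GET /index.html 200
10.0.0.2 GET /missing 404
10.0.0.1 POST /api/login 200
10.0.0.3 GET /missing 404
"""

LINE = re.compile(r"(\S+) (\S+) (\S+) (\d{3})")

def status_counts(log_text):
    """Extract the HTTP status code from each log line, then
    aggregate into per-status counts -- the map/reduce shape most
    log-analytics jobs follow at scale."""
    counts = Counter()
    for line in log_text.splitlines():
        m = LINE.match(line)
        if m:
            counts[m.group(4)] += 1
    return dict(counts)

print(status_counts(LOG))  # → {'200': 2, '404': 2}
```

On AWS the same shape typically runs over S3-resident logs with a distributed engine doing the extraction and aggregation; the per-line parse and the count merge are exactly the parallelizable parts.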
I've taught AppCademy teams about metrics several times since 2013. This is the latest slide deck.
Main goal: dispel convenient default metrics; instead, focus on your own business problems and derive metrics that solve them.
AppCademy is a 4-week accelerator camp run by AppCampus, a training program for Windows Phone dev teams.
Opportunities in Sensor Networks and Big Data in 2014 (for NIKKEI Big Data Co... - Rainer Sternfeld
1. Market trends in some of the biggest industries using scientific sensor data
2. Technology trends
3. How Planet OS is solving these challenges
4. The Industrial Internet (GE), The Internet of Everything (Cisco)
5. Security and trust
Slides for a presentation given at Fortum's internal big data event. The presentation sketches a journey from big data, through sensor data in general, to the automated use of cutting-edge sensors, drones, and data technology for inspecting infrastructure.
IoT (M2M) - Big Data - Analytics: Emulation and Demonstration - Chaker Allaoui
Study and simulation of modern IoT systems with examples of connected objects such as GPS (Global Positioning System), Philips Hue, a thermometer, and connected cars, implemented with Node.js and Node-RED using the M2M communication protocol MQTT.
Also included is an analytical study based on Elasticsearch, MongoDB, Apache Hadoop, Apache Hive, and Jaspersoft.
Independent of the source of data, the integration of event streams into an enterprise architecture is getting more and more important in the world of sensors, social media streams, and the Internet of Things. Events have to be accepted quickly and reliably, and they have to be distributed and analysed, often with many consumers or systems interested in all or part of the events. Storing such huge event streams into HDFS or a NoSQL datastore is feasible and no longer much of a challenge. But if you want to be able to react fast, with minimal latency, you cannot afford to first store the data and do the analysis/analytics later: you have to be able to include part of your analytics right after you consume the event streams. Products for event processing, such as Oracle Event Processing or Esper, have been available for quite a long time and used to be called Complex Event Processing (CEP). In the last 3 years, another family of products has appeared, mostly out of the Big Data technology space, called Stream Processing or Streaming Analytics. These are mostly open source products/frameworks such as Apache Storm, Spark Streaming, and Apache Samza, as well as supporting infrastructures such as Apache Kafka. In this talk I will present the theoretical foundations of event and stream processing, show what differences you might find between the more traditional CEP and the more modern Stream Processing solutions, and argue that a combination of both will bring the most value.
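The "analyse before you store" idea at the heart of stream processing can be sketched with a simple sliding window in Python. This is a conceptual toy, not Storm, Spark Streaming, or Samza code:

```python
from collections import deque

def windowed_average(events, window):
    """Emit a running average over the last `window` readings as each
    event arrives -- the analysis happens in-stream, before any event
    is ever written to long-term storage."""
    buf = deque(maxlen=window)  # old readings fall off automatically
    out = []
    for value in events:
        buf.append(value)
        out.append(sum(buf) / len(buf))
    return out

# Sensor-style event stream (values made up for illustration).
readings = [10, 12, 11, 30, 13]
print(windowed_average(readings, window=3))
```

A real stream processor distributes exactly this kind of windowed operator across a cluster and feeds it from a durable log such as Kafka, so alerts (e.g. a sudden spike like the 30 above) fire with minimal latency.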
Real Time Data Processing using Spark Streaming | Data Day Texas 2015 - Cloudera, Inc.
Speaker: Hari Shreedharan
Data Day Texas 2015
Apache Spark has emerged over the past year as the imminent successor to Hadoop MapReduce. Spark can process data in memory at very high speed, while still being able to spill to disk if required. Spark’s powerful yet flexible API allows users to write complex applications very easily without worrying about the internal workings and how the data gets processed on the cluster.
Spark comes with an extremely powerful Streaming API to process data as it is ingested. Spark Streaming integrates with popular data ingest systems like Apache Flume, Apache Kafka, and Amazon Kinesis, allowing users to process data as it comes in.
In this talk, Hari will discuss the basics of Spark Streaming, its API, and its integration with Flume, Kafka, and Kinesis. Hari will also discuss a real-world example of a Spark Streaming application, and how code can be shared between a Spark application and a Spark Streaming application. Each stage of the application's execution will be presented, which can help readers understand good practices for writing such an application. Hari will finally discuss how to write a custom application and a custom receiver to receive data from other systems.
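Spark Streaming discretizes a stream into micro-batches and runs a batch computation on each one. The model, not the pyspark API, can be simulated in plain Python:

```python
def micro_batches(stream, batch_size):
    """Group a potentially unbounded iterator into fixed-size
    micro-batches, the way Spark Streaming discretizes a DStream
    into per-interval RDDs (here the interval is a count, not time)."""
    batch = []
    for record in stream:
        batch.append(record)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # flush the final partial batch

def word_count(batch):
    """The per-batch computation: the same code works on a static
    dataset, which is the code-sharing point made in the talk."""
    counts = {}
    for word in batch:
        counts[word] = counts.get(word, 0) + 1
    return counts

stream = iter(["spark", "flume", "spark", "kafka", "spark"])
for b in micro_batches(stream, batch_size=2):
    print(word_count(b))
```

Because each micro-batch is just a small dataset, batch logic like `word_count` can be reused unchanged between a Spark job and a Spark Streaming job, which is what makes the model attractive.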
This is the talk I gave at the Big Data Meetup in Seattle in March. In this talk, I discuss the fundamentals of Spark Streaming and Flume, and how they integrate with each other.
Learn about IBM's Hadoop offering called BigInsights. We will look at the new features in version 4 (including a discussion on the Open Data Platform), review a couple of customer examples, talk about the overall offering and differentiators, and then provide a brief demonstration on how to get started quickly by creating a new cloud instance, uploading data, and generating a visualization using the built-in spreadsheet tooling called BigSheets.
Getting started with Hadoop on the Cloud with Bluemix - Nicolas Morales
Silicon Valley Code Camp -- October 11, 2014.
Session: Getting started with Hadoop on the Cloud.
Hadoop and the cloud are an almost perfect marriage. Hadoop is a distributed computing framework that leverages a cluster built on commodity hardware; the cloud simplifies provisioning of machines and software. Getting started with Hadoop on the cloud makes it simple to provision your environment quickly and actually start using Hadoop. IBM Bluemix has democratized Hadoop for the masses! This session will provide a brief introduction to what Hadoop is and how the cloud works, and will then focus on how to get started via a series of demos. We will conclude with a discussion of the tutorials and public datasets - all the tools needed to get you started quickly.
Learn more about BigInsights for Hadoop: https://developer.ibm.com/hadoop/
InfoSphere BigInsights - Analytics power for Hadoop - field experience - Wilfried Hoge
How to analyze binary data as a technical business user. Use InfoSphere BigInsights to bring analytics on Hadoop closer to a user.
Presented at the OOP conference in Munich, 27.01.2015
Open source Apache Hadoop is a great framework for distributed processing of large data sets. But there’s a difference between “playing” with big data versus solving real problems. The reality is that Hadoop alone is not enough. In fact, almost every organization that plans to use Hadoop for production use quickly discovers that it lacks the required features for enterprise use. And, fewer still have the Hadoop specialists on hand to navigate through the complexity to build reliable, robust applications. As a result, many Hadoop projects never make it to production as executives say, “we just don’t have the skills.” In this session, we will discuss these enterprise capabilities and why they’re important: analytics, visualization, security, enterprise integration, developer/admin tools, and more. Additionally, we will share several real-world client examples who have found it necessary to use an enterprise-grade Hadoop platform to tackle some of the most interesting and challenging business problems.
Enterprise analytics journey - Helene Lyon
Somewhere in every customer organization there are data lake and analytics projects aimed at getting better insights for LOB projects. Let's be active participants in those projects, understanding how the evolution of Hadoop, Spark, and Machine Learning can change the perception of your mainframe assets, applications, and data!
The right architecture is key for any IT project. This is especially the case for big data projects, where there are no standard architectures which have proven their suitability over years. This session discusses the different Big Data Architectures which have evolved over time, including traditional Big Data Architecture, Streaming Analytics architecture as well as Lambda and Kappa architecture and presents the mapping of components from both Open Source as well as the Oracle stack onto these architectures.
Learn how you can increase performance in IBM Cognos. Learn about PureData for Analytics, why it's fast and how to integrate with IBM Cognos Analytics. For more information about PureData for Analytics and for a free whitepaper, email info@crescointl.com, or visit http://www.crescointl.com.
Hadoop and the Future of SQL: Using BI Tools with Big Data - Senturus
Hadoop is changing how businesses operate; learn about this emerging technology stack. View the webinar video recording and download this deck: http://www.senturus.com/resource-video/hadoop-future-sql/?rId=3410.
Learn the role SQL queries play for big data, and how SQL-on-Hadoop technologies enable organizations to leverage their existing SQL skills and investments in business intelligence (BI) tools to dramatically improve: 1) Recommendation engines for online retail, 2) Transactional fraud prevention for financial services, 3) Customized advertising and 4) Predictive failure analytics for manufacturing.
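The point about reusing existing SQL skills can be made concrete with a toy query. Here sqlite3 stands in for a SQL-on-Hadoop engine such as Hive or Impala, and the fraud-style schema and data are invented for illustration:

```python
import sqlite3

# In-memory database standing in for a table of transactions that
# would live in Hadoop; schema and values are made up.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE txns (account TEXT, amount REAL);
INSERT INTO txns VALUES
  ('a1', 20.0), ('a1', 9500.0), ('a2', 35.0), ('a2', 40.0);
""")

# The same aggregate-and-filter SQL a BI tool would issue to flag
# unusually large transactions per account.
rows = conn.execute("""
    SELECT account, MAX(amount) AS biggest
    FROM txns
    GROUP BY account
    HAVING MAX(amount) > 1000
""").fetchall()
print(rows)  # → [('a1', 9500.0)]
```

The value of SQL-on-Hadoop is precisely that this query text does not change when the table grows from four rows to billions; only the engine underneath does.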
Senturus, a business analytics consulting firm, has a resource library with hundreds of free recorded webinars, trainings, demos and unbiased product reviews. Take a look and share them with your colleagues and friends: http://www.senturus.com/resources/.
Imagine an entire IT infrastructure controlled not by hands and hardware, but by software. One in which application workloads such as big data, analytics, simulation and design are serviced automatically by the most appropriate resource, whether running locally or in the cloud. A Software Defined Infrastructure enables your organization to deliver IT services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs. It is the foundation for a fully integrated software defined environment, optimizing your compute, storage and networking infrastructure so you can quickly adapt to changing business requirements. A comprehensive portfolio of management tools dynamically manage workloads and data, transforming a static IT infrastructure into a workload- , resource- and data-aware environment.
Learn more: http://ibm.co/1wkoXtc
Watch the video presentation: http://insidehpc.com/2015/03/slidecast-software-defined-infrastructure/
Simplifying Real-Time Architectures for IoT with Apache Kudu - Cloudera, Inc.
3 Things to Learn About:
*Building scalable real time architectures for managing data from IoT
*Processing data in real time with components such as Kudu & Spark
*Customer case studies highlighting real-time IoT use cases
2016 Sept 1st - IBM Consultants & System Integrators Interchange - Big Data -... - Anand Haridass
An unprecedented increase in the use of digital devices is causing an explosion in the amount of data generated & captured by businesses. The need to extract economic value from all this "Big Data", that has the potential to transform businesses completely, is immense and drives a whole slew of new workloads. Organizations need to continuously align strategy, business processes and infrastructure investments to derive these insights. This session will talk to how solutions based on POWER deliver this in a cost-effective, open, scalable, high performing and reliable manner.
UiPath Test Automation using UiPath Test Suite series, part 3 - DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
- UI automation introduction
- UI automation sample
- Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality - Inflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -... - DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
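To give a flavour of what a load-flow computation does, here is a hand-rolled one-line DC power flow in Python. This is a pedagogical toy, not the pypowsybl API; PowSyBl's load-flow engines handle full AC networks with many buses:

```python
def dc_power_flow(p_inj, x):
    """One-line DC power flow approximation.

    Bus 1 is the slack bus (angle fixed at 0 rad); bus 2 has a net
    injection p_inj (negative = load) across a line of reactance x
    (per unit). In the DC approximation the line flow from i to j is
    (theta_i - theta_j) / x, so the bus-2 power balance gives:
        p_inj = (theta2 - 0) / x  =>  theta2 = p_inj * x
    Returns (theta2, flow from bus 1 to bus 2).
    """
    theta2 = p_inj * x
    flow_1_to_2 = (0.0 - theta2) / x
    return theta2, flow_1_to_2

# 100 (per-unit MW) load at bus 2 over a line of reactance 0.1 p.u.
# Numbers are illustrative only.
theta2, flow = dc_power_flow(p_inj=-100.0, x=0.1)
print(round(theta2, 6), round(flow, 6))  # → -10.0 100.0
```

The load must be fully supplied over the single line, so the bus-1-to-bus-2 flow equals the load; real tools solve the same balance equations simultaneously for thousands of buses, plus the AC effects this sketch ignores.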
Connector Corner: Automate dynamic content and events by pushing a button - DianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Transcript: Selling digital books in 2024: Insights from industry leaders - T... - BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
Generating a custom Ruby SDK for your web service or Rails API using Smithy - g2nightmarescribd
Have you ever wanted a Ruby client API to communicate with your web service? Smithy is a protocol-agnostic language for defining services and SDKs. Smithy Ruby is an implementation of Smithy that generates a Ruby SDK using a Smithy model. In this talk, we will explore Smithy and Smithy Ruby to learn how to generate custom feature-rich SDKs that can communicate with any web service, such as a Rails JSON API.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do... - UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
- See how to accelerate model training and optimize model performance with active learning
- Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
- Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
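As background on the active-learning idea mentioned above (a generic illustration, not UiPath's actual implementation): a common approach is uncertainty sampling, where the documents the model is least confident about are routed to a human for labeling first, so each label improves the model as much as possible. A minimal sketch:

```python
def uncertainty(probs):
    """Uncertainty of a prediction: 1 minus the top class probability."""
    return 1.0 - max(probs)


def pick_for_labeling(predictions, k=2):
    """Select the k documents the model is least confident about.

    `predictions` maps a document id to its predicted class probabilities.
    """
    ranked = sorted(
        predictions,
        key=lambda doc: uncertainty(predictions[doc]),
        reverse=True,
    )
    return ranked[:k]


# Example: four documents with (made-up) class-probability vectors
preds = {
    "invoice_1": [0.98, 0.01, 0.01],  # confident
    "invoice_2": [0.40, 0.35, 0.25],  # very uncertain
    "email_1":   [0.90, 0.05, 0.05],  # confident
    "email_2":   [0.55, 0.30, 0.15],  # somewhat uncertain
}
print(pick_for_labeling(preds))  # → ['invoice_2', 'email_2']
```

Labeling the uncertain documents first is what lets active learning reach a target accuracy with far fewer manual annotations than labeling at random.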
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
Neuro-symbolic is not enough, we need neuro-*semantic*Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, applying machine learning to just any symbolic structure is not sufficient to really harvest the gains of NeSy. Those gains materialize only when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this is illustrated with link prediction over knowledge graphs, but the argument is general.
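To make link prediction over knowledge graphs concrete (a generic illustration, not the speaker's method): embedding models such as TransE score a candidate triple (head, relation, tail) by how closely head + relation lands on tail in embedding space, and the lowest-distance candidates are predicted as missing links. A toy sketch with hand-picked 2-D embeddings:

```python
import math

# Toy 2-D embeddings for entities and relations (hand-picked, not trained)
entities = {
    "Alice": (0.0, 0.0),
    "Bob":   (1.0, 0.0),
    "Paris": (0.0, 1.0),
}
relations = {
    "knows":    (1.0, 0.0),
    "lives_in": (0.0, 1.0),
}


def transe_score(head, relation, tail):
    """TransE distance ||head + relation - tail||; lower = more plausible."""
    h, r, t = entities[head], relations[relation], entities[tail]
    return math.hypot(h[0] + r[0] - t[0], h[1] + r[1] - t[1])


def predict_tail(head, relation):
    """Rank all entities as candidate tails for (head, relation, ?)."""
    return min(entities, key=lambda e: transe_score(head, relation, e))


print(predict_tail("Alice", "knows"))     # → Bob   (Alice + knows = (1, 0))
print(predict_tail("Alice", "lives_in"))  # → Paris (Alice + lives_in = (0, 1))
```

The “predictable inference” point is visible even here: the predictions are only meaningful because the relation vectors carry a consistent, interpretable semantics rather than being arbitrary symbols.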
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
The Art of the Pitch: WordPress Relationships and SalesLaura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers, without pulling teeth or pulling your hair out. Practical tips and strategies for successful relationship building that leads to closing the deal.
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Tobias Schneck
As AI technology pushes into IT, I found myself wondering, as an “infrastructure container Kubernetes guy”, how does this fancy AI technology get managed from an infrastructure operations view? Is it possible to apply our lovely cloud-native principles as well? What benefits could both technologies bring to each other?
Let me take these questions and guide you on a short journey through existing deployment models and use cases for AI software. Using practical examples, we will discuss which cloud/on-premises strategy we may need to apply AI to our own infrastructure and make it work from an enterprise perspective. I will give an overview of infrastructure requirements and technologies, and of what could benefit or limit your AI use cases in an enterprise environment. An interactive demo will give you some insights into the approaches I already have working for real.