Seven practical demonstrations covering Material Design in an Android app, Polymer versions 0.5 and 1.0, AdMob integration with Native Ads, Firebase, Cloud Platform, integration with the G+ API, and BigQuery.
The 'macro view' on Big Query:
We started with an overview and some typical uses, then moved on to project hierarchy, access control, and security.
We finished with a look at tools and demos.
BigQuery JavaScript User-Defined Functions by THOMAS PARK and FELIPE HOFFA at...Big Data Spain
Performing ETL on big data can be slow, expensive and painful - but it doesn't have to be! In this session, we'll take an in-depth look at several real-world examples of computations that don't fit well with the SQL language model and how to solve them with user-defined functions in Google BigQuery.
Session presented at Big Data Spain 2014 Conference
18th Nov 2014
Kinépolis Madrid
http://www.bigdataspain.org
Event promoted by: http://www.paradigmatecnologico.com
Abstract: http://www.bigdataspain.org/2014/conference/hands-on-with-bigquery-javascript-user-defined-functions
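The session above is about moving row-wise logic that fights the SQL model into a user-defined function. As a hedged illustration of the kind of transform involved, here is the same idea sketched in plain Python rather than a BigQuery JavaScript UDF; the function and field names are invented for this example.

```python
# Hypothetical illustration: a row-wise transform that is awkward in plain
# SQL but trivial in a UDF. Here sketched in Python; in BigQuery this logic
# would live in a JavaScript user-defined function applied per row.
from urllib.parse import urlparse

def extract_domain(row):
    """Return the input row plus a 'domain' field parsed from its 'url' field."""
    out = dict(row)
    out["domain"] = urlparse(row["url"]).netloc
    return out

rows = [{"url": "https://www.bigdataspain.org/2014/conference"},
        {"url": "http://example.com/page?x=1"}]
transformed = [extract_domain(r) for r in rows]
print(transformed[0]["domain"])  # www.bigdataspain.org
```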
MongoDB IoT City Tour LONDON: Managing the Database Complexity, by Arthur Vie...MongoDB
Arthur Viegers, Senior Solutions Architect, MongoDB.
The value of the fast growing class of NoSQL databases is the ability to handle high velocity and volumes of data while enabling greater agility with dynamic schemas. MongoDB gives you those benefits while also providing a rich querying capability and a document model for developer productivity. Arthur Viegers outlines the reasons for MongoDB's popularity in IoT applications and how you can leverage the core concepts of NoSQL to build robust and highly scalable IoT applications.
MongoDB .local Chicago 2019: MongoDB Atlas Data Lake Technical Deep DiveMongoDB
MongoDB Atlas Data Lake is a new service offered by MongoDB Atlas. Many organizations store long term, archival data in cost-effective storage like S3, GCP, and Azure Blobs. However, many of them do not have robust systems or tools to effectively utilize large amounts of data to inform decision making. MongoDB Atlas Data Lake is a service allowing organizations to analyze their long-term data to discover a wealth of information about their business.
This session will take a deep dive into the features that are currently available in MongoDB Atlas Data Lake and how they are implemented. In addition, we'll discuss future plans and opportunities and offer ample Q&A time with the engineers on the project.
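To make the "analyze long-term data" idea concrete, here is a hedged sketch of an aggregation pipeline of the shape you might run against archival data through Atlas Data Lake. The collection and field names ("year", "region", "total") are hypothetical; with real credentials you would pass a pipeline like this to PyMongo's `collection.aggregate()`.

```python
# Sketch only: build a MongoDB aggregation pipeline as plain data. Field
# names are invented for this example; no connection is made here.
def yearly_revenue_pipeline(year):
    return [
        {"$match": {"year": year}},
        {"$group": {"_id": "$region", "revenue": {"$sum": "$total"}}},
        {"$sort": {"revenue": -1}},
    ]

pipeline = yearly_revenue_pipeline(2018)
print(pipeline[0])  # {'$match': {'year': 2018}}
```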
In this webinar you'll learn about the best practices for Google BigQuery—and how Matillion ETL makes loading your data faster and easier. Find out from our experts how to leverage one of the largest, fastest, and most capable cloud data warehouses to improve your business and save money.
In this webinar:
- Discover how to work fast and efficiently with Google BigQuery
- Find out the best ways to monitor and control costs
- Learn to leverage Matillion ETL and optimize Google BigQuery
- Get tips and tricks for better performance
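On the cost-control point above, one common practice is to estimate bytes scanned (e.g. via a dry run) before executing a query. The sketch below converts a byte estimate into an approximate on-demand cost, assuming BigQuery's historical $5-per-TiB on-demand rate; actual pricing varies by region and changes over time, so treat the rate as an assumption.

```python
# Sketch: turning a dry-run byte estimate into an approximate on-demand
# query cost. The $5/TiB rate is an assumption based on historical BigQuery
# on-demand pricing; check current pricing before relying on this.
PRICE_PER_TIB_USD = 5.0

def estimated_query_cost(bytes_processed):
    tib = bytes_processed / (1024 ** 4)
    return round(tib * PRICE_PER_TIB_USD, 4)

# A query scanning 512 GiB (= 0.5 TiB):
cost = estimated_query_cost(512 * 1024 ** 3)
print(cost)  # 2.5
```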
Beyond the Basics 3: Introduction to the MongoDB BI ConnectorMongoDB
Watch this presentation to learn how the MongoDB BI Connector lets you use MongoDB as a data source for your SQL-based BI and analytics platforms.
Learn how to seamlessly create the visualizations and dashboards that will help you extract the insights and hidden value in your multi-structured data.
BitRot (The silent corruption of data on disk ) detection in GlusterFS (Glust...Gaurav Kumar Garg
BitRot, or data rot, occurs when a file or object sits on disk undisturbed and unaccessed for days, months, or years, and when you finally return to your pen drive or hard disk, or take out your DVD, you find you can't access the data because bits have silently flipped.
BitRot detection is a technique used to identify certain “insidious”
type of disk errors where data is silently corrupted with no
indication from the disk to the storage software layer that an error
has occurred.
With GlusterFS 3.7, it's possible to detect corruption caused by
bitrot and take steps to rectify it. When bitrot detection is
enabled on a Gluster volume, files are signed after they have been
written. A periodic filesystem scrubber verifies the integrity of
signed files and flags (or marks) files that have a mismatching
signature. Clients are typically denied access to corrupted files,
unless it's a replicated volume, where it's still possible to access
the "good" copy and repair the corrupted file.
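The sign-then-scrub mechanism described above can be illustrated in a few lines: record a checksum when data is written, then detect silent corruption by re-hashing. GlusterFS uses its own signing scheme; SHA-256 here is purely illustrative.

```python
# Minimal illustration of bitrot detection: a stored signature from write
# time no longer matches after a single silently flipped bit.
import hashlib

def sign(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original = b"important archival data"
signature = sign(original)          # stored at write time

corrupted = bytearray(original)
corrupted[0] ^= 0x01                # one silently flipped bit

print(sign(original) == signature)          # True: clean copy verifies
print(sign(bytes(corrupted)) == signature)  # False: the scrubber flags this file
```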
Decentralized logging becomes increasingly laborious as the number of processes to monitor grows. For several years there have therefore been tools that support centralized logging. This talk presents the Elastic Stack as one such tool.
Webinar: Introducing the MongoDB Connector for BI 2.0 with TableauMongoDB
Pairing your real-time operational data stored in a modern database like MongoDB with first-class business intelligence platforms like Tableau enables new insights to be discovered faster than ever before.
Many leading organizations already use MongoDB in conjunction with Tableau including a top American investment bank and the world’s largest airline. With the Connector for BI 2.0, it’s never been easier to streamline the connection process between these two systems.
In this webinar, we will create a live connection from Tableau Desktop to a MongoDB cluster using the Connector for BI. Once we have Tableau Desktop and MongoDB connected, we will demonstrate the visual power of Tableau to explore the agile data storage of MongoDB.
You’ll walk away knowing:
- How to configure MongoDB with Tableau using the updated connector
- Best practices for working with documents in a BI environment
- How leading companies are using big data visualization strategies to transform their businesses
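Conceptually, the BI Connector's job is to present MongoDB documents as relational rows that SQL-based tools like Tableau can consume. The hypothetical flattener below shows that idea only; the real connector derives a relational schema (DRDL) and speaks a SQL wire protocol, it does not run in Python.

```python
# Conceptual sketch: flatten a nested document into a single tabular row,
# dotting nested field names the way a relational view of documents might.
def flatten(doc, prefix=""):
    row = {}
    for key, value in doc.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            row.update(flatten(value, name + "."))
        else:
            row[name] = value
    return row

doc = {"name": "Ada", "address": {"city": "London", "zip": "N1"}}
print(flatten(doc))  # {'name': 'Ada', 'address.city': 'London', 'address.zip': 'N1'}
```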
Digital Creative Design course - UX Module day 01 (EDIT interactive design an...DanNobre CarlaFonseca
Presentation for the first day of the UX module class we were invited to teach by EDIT - interactive design and technology school (Lisbon) in 2014. The module was a part of the Digital Creative Design course and was essentially targeted towards visual designers.
"With microservices, every outage is like a murder mystery" is a
common complaint. But it doesn't have to be! This talk gives an
overview of how to monitor distributed applications. We dive into:
System metrics: Keep track of network traffic and system load.
Application logs: Collect structured logs in a central location.
Audit info: Watch for user and process activity in the system.
Uptime monitoring: Ping services and actively monitor their availability and response time.
Application metrics: Get metrics and health information for your application via REST or JMX.
Request tracing: Gather timing data by using tools like Zipkin to retrieve and show call traces.
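The "structured logs in a central location" item above usually means emitting log events as JSON so a shipper (Filebeat to Elasticsearch, for instance) can index them. A minimal sketch, with illustrative rather than required field names:

```python
# Sketch: emit log events as JSON lines so a log shipper can collect and
# index them centrally. Field names here are illustrative, not a schema.
import json
import time

def log_event(service, level, message, **fields):
    event = {"ts": time.time(), "service": service,
             "level": level, "message": message, **fields}
    print(json.dumps(event, sort_keys=True))  # in production: file/collector
    return event

evt = log_event("checkout", "ERROR", "payment declined", order_id=42)
```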
MongoDB .local San Francisco 2020: Powering the new age data demands [Infosys]MongoDB
Our clients have unique use cases and data patterns that mandate the choice of a particular strategy. To implement these strategies, it is mandatory that we unlearn a lot of relational concepts while designing and rapidly developing efficient applications on NoSQL. In this session, we will talk about some of our client use cases, the strategies we have adopted, and the features of MongoDB that assisted in implementing these strategies.
MongoDB Solution for Internet of Things and Big DataStefano Dindo
The Internet of Things is one of the most important market scenarios to invest in by 2020.
The Internet of Things brings people's real lives onto the web through interaction with physical objects and spaces, exchanging large volumes of data.
The lab provided a description of the architecture needed to support Internet of Things projects, with a focus on how data is organized inside MongoDB, the market-leading NoSQL database, to collect and analyze large volumes of data efficiently and in real time.
Introductory talk on how MongoDB can enable new-age software, taking into account the expected growth rates, the constant availability of services, and the new business models that appear on a daily basis.
Building Pinterest Real-Time Ads Platform Using Kafka Streams confluent
Building Pinterest Real-Time Ads Platform Using Kafka Streams (Liquan Pei + Boyang Chen, Pinterest) Kafka Summit SF 2018
In this talk, we are sharing the experience of building Pinterest’s real-time Ads Platform utilizing Kafka Streams. The real-time budgeting system is the most mission-critical component of the Ads Platform as it controls how each ad is delivered to maximize user, advertiser and Pinterest value. The system needs to handle over 50,000 impressions per second (QPS), requires less than five seconds of end-to-end latency and recovers within five minutes during outages. It also needs to be scalable to handle the fast growth of Pinterest’s ads business.
The real-time budgeting system is composed of real-time stream-stream joiner, real-time spend aggregator and a spend predictor. At Pinterest’s scale, we need to overcome quite a few challenges to make each component work. For example, the stream-stream joiner needs to maintain terabyte size state while supporting fast recovery, and the real-time spend aggregator needs to publish to thousands of ads servers while supporting over one million read QPS. We choose Kafka Streams as it provides milliseconds latency guarantee, scalable event-based processing and easy-to-use APIs. In the process of building the system, we performed tons of tuning to RocksDB, Kafka Producer and Consumer, and pushed several open source contributions to Apache Kafka. We are also working on adding a remote checkpoint for Kafka Streams state to reduce the time of code start when adding more machines to the application. We believe that our experience can be beneficial to people who want to build real-time streaming solutions at large scale and deeply understand Kafka Streams.
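The stream-stream joiner mentioned above pairs events (e.g. ad impressions and clicks) that share a key and arrive within a join window. Kafka Streams does this with RocksDB-backed state stores; the toy in-memory version below only shows the matching logic, with event shapes invented for the sketch.

```python
# Toy windowed stream-stream join: match events by key when their
# timestamps fall within the join window. Real Kafka Streams joins keep
# windowed state in RocksDB and handle out-of-order arrival.
JOIN_WINDOW_SECS = 10

def window_join(impressions, clicks):
    """Each event is (key, timestamp); return matched (key, imp_ts, click_ts)."""
    joined = []
    for key, imp_ts in impressions:
        for ckey, click_ts in clicks:
            if key == ckey and abs(click_ts - imp_ts) <= JOIN_WINDOW_SECS:
                joined.append((key, imp_ts, click_ts))
    return joined

imps = [("ad1", 100), ("ad2", 105)]
clicks = [("ad1", 104), ("ad2", 130)]
print(window_join(imps, clicks))  # [('ad1', 100, 104)]
```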
Sexual Reproduction in Animals (Internal and External Fertilization)Princess Piñero
Lesson Objectives
At the end of the one hour period, the Grade 5 learners should be able to:
1. identify the methods of sexual reproduction in animals;
2. categorize different animals as having internal or external fertilization; and
3. state the difference between internal and external fertilization using a Venn diagram.
A series of seven technology demonstrations: Material Design, Polymer 0.5 and 1.0, AdMob integration with Native Ads, Firebase, Cloud Platform, the G+ API, and BigQuery.
A Key to Real-time Insights in a Post-COVID World (ASEAN)Denodo
Watch full webinar here: https://bit.ly/2EpHGyd
Presented at Data Champions, Online Asia 2020
Businesses and individuals around the world are experiencing the impact of a global pandemic. With many workers and potential shoppers still sequestered, COVID-19 is proving to have a momentous impact on the global economy. Regardless of the current situation and post-pandemic era, real-time data becomes even more critical to healthcare practitioners, business owners, government officials, and the public at large where holistic and timely information are important to make quick decisions. It enables doctors to make quick decisions about where to focus the care, business owners to alter production schedules to meet the demand, government agencies to contain the epidemic, and the public to be informed about prevention.
In this on-demand session, you will learn about the capabilities of data virtualization as a modern data integration technique and how organisations can:
- Rapidly unify information from disparate data sources to make accurate decisions and analyse data in real-time
- Build a single engine for security that provides audit and control by geographies
- Accelerate delivery of insights from your advanced analytics project
CodeCamp Iasi - Creating serverless data analytics system on GCP using BigQueryMárton Kodok
Teaser: provide developers a new way of understanding advanced analytics and choosing the right cloud architecture
The new buzzword is #serverless, as there are many great services that helps us abstract away the complexity associated with managing servers. In this session we will see how serverless helps on large data analytics backends.
We will see how to architect for the cloud and add to an existing project the components that take us to a #serverless architecture: one that ingests our streaming data and runs advanced analytics on petabytes of data using BigQuery on Google Cloud Platform, all alongside the existing stack, without being forced to re-engineer our app.
BigQuery enables super-fast, SQL/Javascript queries against petabytes of data using the processing power of Google’s infrastructure. We will cover its core features, SQL 2011 standard, working with streaming inserts, User Defined Functions written in Javascript, reference external JS libraries, and several use cases for everyday backend developer: funnel analytics, email heatmap, custom data processing, building dashboards, extracting data using JS functions, emitting rows based on business logic.
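Among the use cases listed above is funnel analytics. As a hedged illustration of the computation the talk has in mind running in BigQuery, here is the funnel logic in plain Python; the step names and event format are invented for this sketch.

```python
# Hypothetical funnel computation: a user counts for a step only if they
# completed all earlier steps too. Step names are invented for this sketch.
STEPS = ["visit", "signup", "purchase"]

def funnel_counts(events):
    """events: list of (user, step) pairs."""
    by_user = {}
    for user, step in events:
        by_user.setdefault(user, set()).add(step)
    counts = []
    for i, step in enumerate(STEPS):
        required = set(STEPS[: i + 1])
        counts.append(sum(1 for done in by_user.values() if required <= done))
    return dict(zip(STEPS, counts))

events = [("u1", "visit"), ("u1", "signup"), ("u1", "purchase"),
          ("u2", "visit"), ("u2", "signup"), ("u3", "visit")]
print(funnel_counts(events))  # {'visit': 3, 'signup': 2, 'purchase': 1}
```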
Making Your Site Vendor Agnostic via a Modern Data LayerEnsighten
(Presented by Matt Gallion, Implementation Engineer, Ensighten)
Properly built and maintained data layers allow a company the freedom to swap vendors out as needed without having to completely restructure their page code. In this session we will discuss why they are useful, how to properly implement one and how to govern the data across web and mobile properties.
Supercharge your data analytics with BigQueryMárton Kodok
Powering interactive data analysis requires massive architecture and know-how to build a fast real-time computing system. BigQuery solves this problem by enabling super-fast, SQL-like queries against petabytes of data using the processing power of Google’s infrastructure. We will cover its core features, creating tables, columns, views, working with partitions, clustering for cost optimizations, streaming inserts, User Defined Functions, and several use cases for the everyday developer: funnel analytics, behavioral analytics, exploring unstructured data.
The other part will be about BigQuery ML, which enables users to create and execute machine learning models in BigQuery using standard SQL queries. BigQuery ML democratizes machine learning by enabling SQL practitioners to build models using existing SQL tools and skills. BigQuery ML increases development speed by eliminating the need to move data.
[Webinar] Getting Started with BigQuery: Basics, Its Applications & Use CasesTatvic Analytics
This webinar aims to provide the BigQuery product walkthrough right from the basics. Our core focus will be on the use cases and applications that help to gain additional customer insights from the data integrated within BigQuery.
BigQuery is equipped with the ability to crunch TBs of data in seconds while ensuring scalability and speed. It also enables us to perform advanced statistical analysis by providing unsampled raw hit level analytics data.
Agile Data Engineering: Introduction to Data Vault 2.0 (2018)Kent Graziano
(updated slides used for North Texas DAMA meetup Oct 2018) As we move more and more towards the need for everyone to do Agile Data Warehousing, we need a data modeling method that can be agile with us. Data Vault Data Modeling is an agile data modeling technique for designing highly flexible, scalable, and adaptable data structures for enterprise data warehouse repositories. It is a hybrid approach using the best of 3NF and dimensional modeling. It is not a replacement for star schema data marts (and should not be used as such). This approach has been used in projects around the world (Europe, Australia, USA) for over 15 years and is now growing in popularity. The purpose of this presentation is to provide attendees with an introduction to the components of the Data Vault Data Model, what they are for and how to build them. The examples will give attendees the basics:
• What the basic components of a DV model are
• How to build, and design structures incrementally, without constant refactoring
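To ground the "basic components" bullet, here is an illustrative sketch of a Data Vault hub row: the business key plus a hash key and load metadata. The column names follow common DV 2.0 conventions (MD5 over the upper-cased business key is one frequently used choice), but this is a toy, not a full implementation.

```python
# Illustrative Data Vault 2.0 hub row: hash key, business key, load
# timestamp, and record source. Conventions vary; this is a sketch only.
import hashlib
from datetime import datetime, timezone

def hub_customer_row(business_key, record_source):
    return {
        "hub_customer_hk": hashlib.md5(business_key.upper().encode()).hexdigest(),
        "customer_bk": business_key,
        "load_dts": datetime.now(timezone.utc).isoformat(),
        "record_source": record_source,
    }

row = hub_customer_row("CUST-001", "crm")
print(row["customer_bk"])  # CUST-001
```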
Slides from Michelle Ufford's talk, Data-Driven @ Netflix. Talk given at PASS Summit 2016 in October 2016.
Netflix is the quintessential data-driven company. Its 83 million members stream more than 125 million hours in over 190 countries every day and generate more than 700 billion events in the process. In this session, we’ll share how data is used to make informed decisions across the entire business — from content acquisition to content delivery, and everything in between. We’ll look at how Netflix successfully employs a scalable cloud-based data platform to support a constant deluge of data and a small army of data analysts, engineers, and scientists. We’ll discuss the advanced analytical capabilities that are enabled through modern data technologies. Lastly, we’ll explore some of the architectural and operational principles that enable Netflix to so effectively make use of its data.
Visualizing IoT: Rapid Business Data Discovery for the Internet of ThingsMia Yuan Cao
As the Internet of Things (IoT) is making our world more connected, there is a growing need to understand the data through data visualization, analysis and discovery across different types of connected device platforms.
Gimel at Teradata Analytics Universe 2018Romit Mehta
This is our presentation of Gimel at Teradata's annual conference, Teradata Analytics Universe.
Gimel is the open source unified data API which enables connectivity to any data store with a single API. Along with the API which works with Scala and Python, we are also surfacing a SQL interface to access any data store with just SQL.
Now data scientists and analysts can directly consume data from big data platforms like Kafka for real-time streaming data access or Elastic for search-related data all with SQL just like they can access Oracle or Teradata.
On the other hand, data engineers can relax now with this abstracted API since it isolates the ever-changing world of big data infrastructure from their code. No longer do they need to worry about API versions, connector versions, data store-specific semantics, or compute engine and version.
Gimel is also tightly integrated with Jupyter notebooks so all of the power is now available to anyone with a browser.
gimel.io
unifieddatacatalog.io
ppextensions.io
Ping me on LinkedIn for more info!
The Accelerate to Cloud SF keynote will help you understand the current state of cloud adoption, identify the business value for your organization, and provide you a framework to plot your course to cloud adoption.
Driving Business Transformation with Real-Time Analytics Using Apache Kafka a...confluent
Watch this talk here: https://www.confluent.io/online-talks/driving-business-transformation-real-time-analytics-using-apache-kafka-and-ksql
Digital transformation is more than just a buzzword, it’s become a necessity in order to compete in the modern era. At the heart of digital transformation is real-time data. Your organization must respond in real time to every customer experience transaction, sale, and market movement in order to stay competitive.
Streaming data technologies like Apache Kafka® and Confluent KSQL, the streaming SQL engine for Apache Kafka, are being used to detect and react to events as they occur. Combining this technology with the analytics insights from RCG and visualizations from Arcadia Data delivers a powerful foundation for driving real time business decisions. Use cases span across industries and include retail transaction cost analysis, automotive maintenance and loyalty program management, and credit card fraud detection.
Join experts from Confluent, RCG and Arcadia Data for a discussion and demo on how companies are integrating streaming data technologies to transform their business.
You will learn:
-Why Apache Kafka is widely used for real-time event monitoring and decisioning
-How to integrate real-time analytics and visualizations to drive business processes
-How KSQL, streaming SQL for Kafka, can easily transform and filter streams of data in real time
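The last bullet's "filter streams of data in real time" maps to something like `CREATE STREAM big_orders AS SELECT * FROM orders WHERE amount > 100;` in KSQL (stream and column names invented here). Mimicked as a Python generator over an in-memory stream:

```python
# Sketch of a KSQL-style stream filter: lazily keep only events whose
# amount exceeds a threshold. Event shape is invented for this example.
def big_orders(orders, threshold=100):
    for order in orders:
        if order["amount"] > threshold:
            yield order

stream = [{"id": 1, "amount": 250}, {"id": 2, "amount": 40}, {"id": 3, "amount": 120}]
result = list(big_orders(stream))
print([o["id"] for o in result])  # [1, 3]
```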
Cloud study jams workshop - classify images of clouds in the cloud with aut...Nicolas Bortolotti
AutoML Vision helps developers with limited ML expertise train high-quality image recognition models. Once you upload images to the AutoML UI, you can train a model that will be immediately available on GCP for generating predictions via an easy-to-use REST API.
In this lab you will upload images to Cloud Storage and use them to train a custom model to recognize different types of clouds (cumulus, cumulonimbus, etc.)
A #Polymer demonstration: routing information, how to use web components transparently, and a simple #Firebase integration. Full presentation: http://goo.gl/MyY2PV
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Welcome to the first live UiPath Community Day Dubai! Join us for this unique occasion to meet our local and global UiPath Community and leaders. You will get a full view of the MEA region's automation landscape and the AI Powered automation technology capabilities of UiPath. Also, hosted by our local partners Marc Ellis, you will enjoy a half-day packed with industry insights and automation peers networking.
📕 Curious on our agenda? Wait no more!
10:00 Welcome note - UiPath Community in Dubai
Lovely Sinha, UiPath Community Chapter Leader, UiPath MVPx3, Hyper-automation Consultant, First Abu Dhabi Bank
10:20 A UiPath cross-region MEA overview
Ashraf El Zarka, VP and Managing Director MEA, UiPath
10:35 Customer Success Journey
Deepthi Deepak, Head of Intelligent Automation CoE, First Abu Dhabi Bank
11:15 The UiPath approach to GenAI with our three principles: improve accuracy, supercharge productivity, and automate more
Boris Krumrey, Global VP, Automation Innovation, UiPath
12:15 Discover how Marc Ellis leverages tech-driven solutions in recruitment and managed services.
Brendan Lingam, Director of Sales and Business Development, Marc Ellis
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
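One of the Object Calisthenics constraints the abstract alludes to, "wrap all primitives", maps directly onto the Value Object tactical pattern from DDD. A minimal sketch (the domain names here are illustrative, not taken from the talk):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Money:
    """Value Object: wraps a primitive amount and enforces its invariant.

    Frozen, so instances are immutable and compare by value,
    which is exactly the DDD Value Object contract.
    """
    amount_cents: int

    def __post_init__(self):
        if self.amount_cents < 0:
            raise ValueError("Money cannot be negative")

    def add(self, other: "Money") -> "Money":
        return Money(self.amount_cents + other.amount_cents)

# Instead of passing bare ints around, the domain model speaks in Money:
total = Money(1500).add(Money(250))
```

The calisthenics rule is "mechanical" in the sense the abstract describes: apply it everywhere and the Value Objects your model needs tend to fall out on their own.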
Pushing the limits of ePRTC: 100ns holdover for 100 daysAdtran
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
Climate Impact of Software Testing at Nordic Testing DaysKari Kakkonen
My slides at Nordic Testing Days 6.6.2024
The climate impact / sustainability of software testing is discussed in the talk. ICT and testing must carry their part of the global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint: a positive impact on the climate. Quality characteristics can be extended with sustainability, which can then be measured continuously. Test environments can be used less, at smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
Essentials of Automations: The Art of Triggers and Actions in FMESafe Software
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
The Metaverse and AI: how can decision-makers harness the Metaverse for their...Jen Stirrup
The Metaverse is popularized in science fiction, and now it is becoming closer to being a part of our daily lives through the use of social media and shopping companies. How can businesses survive in a world where Artificial Intelligence is becoming the present as well as the future of technology, and how does the Metaverse fit into business strategy when futurist ideas are developing into reality at accelerated rates? How do we do this when our data isn't up to scratch? How can we move towards success with our data so we are set up for the Metaverse when it arrives?
How can you help your company evolve, adapt, and succeed using Artificial Intelligence and the Metaverse to stay ahead of the competition? What are the potential issues, complications, and benefits that these technologies could bring to us and our organizations? In this session, Jen Stirrup will explain how to start thinking about these technologies as an organisation.
PHP Frameworks: I want to break free (IPC Berlin 2024)Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdfPaige Cruz
Monitoring and observability aren’t traditionally found in software curriculums and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is a part of your current company’s observability stack.
While the dev and ops silo continues to crumble, many organizations still relegate monitoring & observability to ops, infra, and SRE teams. This is a mistake: achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and will share these foundational concepts to build on.
35. Confidential & Proprietary
For the past 15 years, Google has been building out the fastest, most powerful, highest quality cloud infrastructure on the planet.
41. BigQuery: Big Data Analytics in the Cloud
Unrivaled Performance and Scale
● Scan multiple TBs in seconds
● Interactive query performance
● No limits on amount of data
Ease of Use and Adoption
● No administration / provisioning
● Convenience of SQL
● Open interfaces (REST, Web UI, ODBC)
● First 1 TB of data processed per month is free
Advanced "Big Data" Storage
● Familiar database structure
● Easy data management and ACLs
● Fast, atomic imports
42. How BigQuery Works
Tree-structured query dispatch and aggregation over distributed storage:
● Leaf servers scan the distributed storage (O(~140M rows)), apply the filter WHERE year >= 1980 AND year < 1990, and compute partial COUNT(*) ... GROUP BY state.
● Mixer 1 nodes merge the partial COUNT(*) ... GROUP BY state results from the leaves (O(50 states)).
● Mixer 0 applies the final ORDER BY count_babies DESC and LIMIT 10 over the ~50 aggregated states.
The full query being dispatched:
SELECT state, COUNT(*) count_babies
FROM [publicdata:samples.natality]
WHERE year >= 1980 AND year < 1990
GROUP BY state
ORDER BY count_babies DESC
LIMIT 10
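The dispatch tree on this slide can be mimicked in a few lines of Python: leaves compute partial GROUP BY counts over their shard of the rows, mixers merge those partials, and the root applies ORDER BY and LIMIT. A toy simulation (not BigQuery code; the shard data is made up):

```python
from collections import Counter

def leaf_scan(rows):
    """Leaf server: apply the WHERE filter, partial COUNT(*) GROUP BY state."""
    return Counter(r["state"] for r in rows if 1980 <= r["year"] < 1990)

def mixer_merge(partials):
    """Mixer: merge partial aggregates coming up from child nodes."""
    merged = Counter()
    for p in partials:
        merged.update(p)
    return merged

def root(merged, limit=10):
    """Root mixer: ORDER BY count DESC, LIMIT n."""
    return merged.most_common(limit)

# Two shards of toy natality rows, as two leaf servers would see them.
shard1 = [{"state": "CA", "year": 1985}, {"state": "TX", "year": 1982},
          {"state": "CA", "year": 1979}]   # 1979 row filtered out by WHERE
shard2 = [{"state": "CA", "year": 1988}, {"state": "NY", "year": 1990}]  # 1990 filtered out

result = root(mixer_merge([leaf_scan(shard1), leaf_scan(shard2)]))
# → [('CA', 2), ('TX', 1)]
```

The point of the tree is that each level only ships small partial aggregates upward, so the O(140M)-row scan parallelizes across leaves while the mixers handle at most ~50 groups.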
44. "You never change things by fighting the existing reality. To change something, build a new model that makes the existing model obsolete."
- Buckminster Fuller
45. "Especially in technology, we need revolutionary change, not incremental change."
- Larry Page
"Basically, our goal is to organize the world's information and to make it universally accessible and useful."
- Larry Page
Thanks!