You will learn how to build a Machine Learning model that you can deploy in your applications. I will walk through each of the steps to follow, the types of problems that can be solved, the data you need to make it work and, finally, the options for integrating the model into your applications.
Artificial intelligence for Android: how to get started (Isabel Palomar)
You will learn the basic concepts of deep learning and how to create an Android application that can detect and label images using a TensorFlow Lite model.
Creating a custom Machine Learning Model for your applications - Java Dev Day... (Isabel Palomar)
You will learn how to build a Machine Learning model that you can deploy in your mobile or Java application. I will walk through each of the steps to follow, the types of problems that can be solved, the data you need to make it work and, finally, the options for integrating the model into your applications.
Building a custom machine learning model on Android (Isabel Palomar)
This document provides an overview of building a custom machine learning model for image classification on Android. It begins with discussing challenges and ideas, then covers key deep learning concepts like data, tasks, models, loss functions, learning algorithms and evaluation. It explains that a MobileNet model will be retrained for classifying images of artisanal beers. The document also discusses converting the model to TensorFlow Lite and implementing image classification in an Android app using the camera and a TensorFlow Lite interpreter to get classification results.
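The last step the summary describes, turning the TensorFlow Lite interpreter's raw output into a classification result, amounts to picking the label with the highest probability. A minimal Python sketch; the label names and probability values below are invented for illustration and are not taken from the talk:

```python
# Hypothetical post-processing of an image classifier's output:
# pick the label with the highest probability.
labels = ["ale", "lager", "porter", "stout"]   # assumed contents of a label file
probs = [0.05, 0.10, 0.70, 0.15]               # assumed interpreter output

best = max(range(len(labels)), key=lambda i: probs[i])
result = (labels[best], probs[best])
print(result)  # ('porter', 0.7)
```

On Android the same logic runs in Java or Kotlin over the output tensor the interpreter fills in.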
Start machine learning in 5 simple steps (Renjith M P)
Simple steps to get started with machine learning.
The use case is implemented in Python; the target audience is expected to have very basic Python knowledge.
The document provides an overview and agenda for an introduction to running AI workloads on PowerAI. It discusses PowerAI and how it combines popular deep learning frameworks, development tools, and accelerated IBM Power servers. It then demonstrates AI workloads using TensorFlow and PyTorch, including running an MNIST workload to classify handwritten digits using basic linear regression and convolutional neural networks in TensorFlow, and an introduction to PyTorch concepts like tensors, modules, and softmax cross entropy loss.
Analytics Zoo: Building Analytics and AI Pipeline for Apache Spark and BigDL ... (Databricks)
A long time ago, there was Caffe and Theano; then came Torch, CNTK, TensorFlow, Keras, MXNet, PyTorch, and Caffe2… a sea of deep learning tools, but none for Spark developers to dip into. Finally, there was BigDL, a deep learning library for Apache Spark. While BigDL is integrated into Spark and extends its capabilities to address the challenges of Big Data developers, will a library alone be enough to simplify and accelerate the deployment of ML/DL workloads on production clusters? From high-level pipeline API support to feature transformers to pre-defined models and reference use cases, a rich repository of easy-to-use tools is now available with the ‘Analytics Zoo’. We’ll unpack the production challenges and opportunities with ML/DL on Spark and what the Zoo can do.
"Deployment for free": removing the need to write model deployment code at St... (Stefan Krawczyk)
At Stitch Fix we have a dedicated Data Science organization called Algorithms. It has more than 130 Full Stack Data Scientists who build and own a variety of models. These models span from classic prediction and classification models through time-series forecasts, simulations, and optimizations. Rather than handing off models for productionization to someone else, Data Scientists own and are on-call for that process; we love for our Data Scientists to have autonomy. That said, Data Scientists aren’t without engineering support, as there’s a Data Platform team dedicated to building tooling, services, and abstractions to increase their workflow velocity. One data science task that we have been speeding up is getting models to production and increasing their usability and stability. This is a necessary task that can take a considerable chunk of a Data Scientist’s time, either in development or in debugging issues; historically everyone largely carved their own path in this endeavor, which meant many different approaches, implementations, and little to leverage across teams.
In this talk I’ll cover how the Model Lifecycle team on Data Platform built a system dubbed the “Model Envelope” to enable “deployment for free”. That is, no code needs to be written by a data scientist to deploy any Python model to production, where production means either a micro-service or a batch Python/Spark job. With our approach we can remove the need for data scientists to worry about Python dependencies or instrumenting model monitoring, since we can take care of it for them, in addition to other MLOps concerns.
Specifically the talk will cover:
* Our API interface we provide to data scientists and how it decouples deployment concerns.
* How we approach automatically inferring a type-safe API for models of any shape.
* How we handle python dependencies so Data Scientists don’t have to.
* How our relationship & approach enables us to inject & change MLOps approaches without having to coordinate much with Data Scientists.
The document discusses CNN Lab 256 and various labs involving image classification using ImageNet and MNIST datasets. Lab 2 focuses on image classification using ImageNet, which contains over 14 million images across 20,000 categories. The script classify_image.py is used to classify images using a pre-trained model. Retraining the model on a custom dataset is also discussed. Lab 5 involves classifying handwritten digits from the MNIST dataset using a convolutional neural network model defined in TensorFlow. The model achieves an accuracy of over 99% after training for 15,000 epochs in batches of 100 images.
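The convolutional layers at the heart of the CNN labs summarized above all build on one core operation, a sliding-window dot product over the image. A minimal NumPy sketch of the "valid" 2-D case (illustrative only, not code from the labs):

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2-D cross-correlation: slide the kernel over the image
    and take a dot product at each position."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(1.0, 10.0).reshape(3, 3)   # 3x3 toy "image" with values 1..9
edge = np.array([[1.0, -1.0]])               # simple horizontal edge detector
response = conv2d(image, edge)               # shape (3, 2)
```

A real CNN layer applies many such kernels in parallel, learns their weights by gradient descent, and stacks layers with nonlinearities in between.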
MLflow: Platform for Complete Machine Learning Lifecycle (Databricks)
Description
Data Science and ML development bring many new complexities beyond the traditional software development lifecycle. Unlike in traditional software development, ML developers want to try multiple algorithms, tools, and parameters to get the best results, and they need to track this information to reproduce work.
MLflow addresses some of these challenges during an ML model development cycle.
Abstract
ML development brings many new complexities beyond the traditional software development lifecycle. Unlike in traditional software development, ML developers want to try multiple algorithms, tools, and parameters to get the best results, and they need to track this information to reproduce work. In addition, developers need to use many distinct systems to productionize models. To address these problems, many companies are building custom “ML platforms” that automate this lifecycle, but even these platforms are limited to a few supported algorithms and to each company’s internal infrastructure.
In this session, we introduce MLflow, a new open source project from Databricks that aims to design an open ML platform where organizations can use any ML library and development tool of their choice to reliably build and share ML applications. MLflow introduces simple abstractions to package reproducible projects, track results, and encapsulate models that can be used with many existing tools, accelerating the ML lifecycle for organizations of any size.
Through a short demo of a complete ML model lifecycle example, you will walk away with: MLflow concepts and abstractions for models, experiments, and projects; how to get started with MLflow; using the tracking Python APIs during model training; and using the MLflow UI to visually compare and contrast experimental runs with different tuning parameters and evaluation metrics.
An introduction to Machine Learning for newcomers. It shows basic concepts such as supervised learning, unsupervised learning, classification, regression, under/overfitting, clustering, and anomaly detection, and how to evaluate models. It illustrates these concepts with scikit-learn and TensorFlow code examples.
The document provides an overview of a presentation about Google Cloud developer tools and an easier path to machine learning. It introduces the speaker and their background and experience. It then outlines the agenda which includes introductions to machine learning and Google Cloud, Google APIs, Cloud ML APIs, and other APIs to consider. It provides examples of using various Cloud ML APIs like Vision, Natural Language, and Speech for tasks like image labeling, text analysis, and speech recognition. The goal is to demonstrate how APIs powered by machine learning can help ease the burden of learning machine learning by allowing users to leverage pre-built models if they can call APIs.
The document discusses Google Cloud AI services including Cloud ML Engine for machine learning model training and prediction. It provides examples of using Cloud ML Engine to train models locally and in the cloud, perform distributed training, and hyperparameter tuning. It also covers deploying trained models and making predictions against them.
Apache Liminal (Incubating)—Orchestrate the Machine Learning Pipeline (Databricks)
Apache Liminal is an end-to-end platform for data engineers & scientists, allowing them to build, train and deploy machine learning models in a robust and agile way. The platform provides the abstractions and declarative capabilities for data extraction & feature engineering followed by model training and serving; using standard tools and libraries (e.g. Airflow, K8S, Spark, scikit-learn, etc.).
1. Machine learning is the use and development of computer systems that are able to learn and adapt without explicit instructions by using algorithms and statistical models to analyze patterns in data.
2. The document provides examples of machine learning applications like facial recognition, voice recognition in healthcare, weather forecasting, and more. It also discusses the process of machine learning and popular machine learning algorithms.
3. The document demonstrates machine learning using a decision tree algorithm on music purchase data to predict whether a customer is male or female based on attributes like age and number of songs purchased. It imports relevant Python libraries and splits the data into training and test sets to evaluate the model's performance.
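A minimal scikit-learn sketch of the decision-tree workflow the summary describes, using invented synthetic data in place of the talk's music-purchase dataset:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Invented data: features [age, songs_purchased] -> label 0/1.
rng = np.random.default_rng(0)
ages = rng.integers(18, 60, size=100)
songs = rng.integers(1, 50, size=100)
X = np.column_stack([ages, songs])
y = (songs > 25).astype(int)   # a deterministic rule the tree can learn

# Split into training and test sets, as the talk does.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = DecisionTreeClassifier(random_state=0)
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)
```

Holding out a test set, as above, is what lets you report an honest accuracy rather than a memorized one.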
This document discusses on-device machine learning using TensorFlow Lite. It introduces why on-device ML is important, options for implementing on-device ML like TensorFlow Lite, ML Kit and MediaPipe. It then provides an end-to-end example of training a model on MNIST data using TensorFlow Keras, converting it to TFLite format and deploying it on Android. It also discusses optimizing models for mobile and edge devices and running inference on microcontrollers and Coral Edge TPUs.
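The convert-and-run path this summary describes can be sketched end to end in a few lines. A trivially small stand-in model replaces the talk's Keras MNIST classifier so the snippet stays self-contained; everything else follows the standard TensorFlow Lite conversion flow:

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in model (the talk trains a Keras MNIST classifier; this
# placeholder just makes the conversion step runnable).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(2, activation="softmax"),
])

# Convert the Keras model to the TensorFlow Lite flatbuffer format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Run inference with the TFLite interpreter; on Android the same
# interpreter is driven through its Java/Kotlin bindings.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.zeros((1, 4), dtype=np.float32))
interpreter.invoke()
probs = interpreter.get_tensor(out["index"])
```

The resulting `tflite_model` bytes are what you ship in the app's assets.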
This document provides an overview and introduction to data structures and algorithms. It discusses the need for data structures to efficiently organize and store data as applications and data grow increasingly large and complex. It also covers some basic terminology used in data structures and algorithms. The document then discusses setting up both an online and local environment for writing and executing code in C programming language to work through examples.
This document is an introduction to data structures and algorithms. It discusses the need for data structures to efficiently store and organize data for enterprise applications. It also discusses algorithms and different algorithm analysis techniques like asymptotic analysis, greedy algorithms, divide and conquer, and dynamic programming. The document is intended for computer science graduates and software professionals looking to learn about data structures and algorithms. It requires a basic understanding of C programming.
Using AI to create smart applications - DroidCon Tel Aviv (Sarit Tamir)
Educating your app – Easily adding the ML edge to your applications
Creating smart applications that can predict, translate, and advise online is easier than ever, and today every developer can fill in the gap.
The aim of the talk is to give developers a kick start in adding ML to their applications today. I will cover the different methods programmers have today for adding an ML solution to their application, from using an external ML Kit to embedding a TensorFlow Lite model.
1. An overview of what capabilities we can add to our applications with ML.
2. I will cover the main frameworks that are available today.
3. A focus on the 'on-device' solutions.
4. Code examples for each of the frameworks.
My goal is that every participant will leave this lecture understanding that adding ML to an application to solve a specific problem is a doable practice for every programmer willing to spend the time and learn.
These are our contributions to the Data Science projects developed at our startup. They are part of partner trainings and the in-house design, development, and testing of course material and concepts in Data Science and Engineering. The work covers data ingestion, data wrangling, feature engineering, data analysis, data storage, data extraction, querying, and formatting and visualizing data for various dashboards. Data is prepared for accurate ML model predictions and Generative AI apps.
This is our project work at our startup for Data Science. It is part of our internal training and focuses on data management for AI, ML, and Generative AI apps.
The document discusses using Python for data science and machine learning. It outlines the objectives of gaining knowledge about machine learning, data visualization, web scraping, and natural language processing. Key libraries for machine learning using Python are described, including NumPy, Pandas, Matplotlib, and scikit-learn. Examples of machine learning applications are provided such as traffic prediction, virtual assistants, and image recognition. The author's project to predict car prices using a random forest regression model built in Python is summarized.
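A hedged sketch of the car-price idea with a random forest regressor; the features and pricing rule below are invented stand-ins, not the author's dataset:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Invented features: [age_years, mileage_thousands] -> price.
rng = np.random.default_rng(42)
age = rng.uniform(0, 15, size=200)
mileage = rng.uniform(0, 200, size=200)
X = np.column_stack([age, mileage])
price = 20000 - 800 * age - 40 * mileage   # assumed pricing rule

# An ensemble of decision trees averages out individual trees' errors.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, price)
estimate = model.predict([[5.0, 60.0]])[0]  # price of a hypothetical car
```

With the assumed rule, a 5-year-old car with 60k miles is worth 13,600, and the forest's estimate lands close to that.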
An In-Depth Exploration of Natural Language Processing: Evolution, Applicatio... (DharmaBanothu)
Natural language processing (NLP) has recently garnered significant interest for the computational representation and analysis of human language. Its applications span multiple domains such as machine translation, email spam detection, information extraction, summarization, healthcare, and question answering. This paper first delineates four phases by examining various levels of NLP and components of Natural Language Generation, followed by a review of the history and progression of NLP. Subsequently, we delve into the current state of the art by presenting diverse NLP applications, contemporary trends, and challenges. Finally, we discuss some available datasets, models, and evaluation metrics in NLP.
Sachpazis_Consolidation Settlement Calculation Program-The Python Code and th... (Dr. Costas Sachpazis)
Consolidation Settlement Calculation Program-The Python Code
By Professor Dr. Costas Sachpazis, Civil Engineer & Geologist
This program calculates the consolidation settlement for a foundation based on soil layer properties and foundation data. It allows users to input multiple soil layers and foundation characteristics to determine the total settlement.
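The classic one-dimensional consolidation settlement formula for a normally consolidated clay layer, which a program like the one described computes per layer and sums, can be sketched as follows. Symbols: Cc compression index, H layer thickness, e0 initial void ratio, p0 initial effective stress, dp stress increase; the numeric inputs are invented for the example:

```python
import math

def consolidation_settlement(Cc, H, e0, p0, dp):
    """Settlement of a normally consolidated clay layer:
    S = Cc * H / (1 + e0) * log10((p0 + dp) / p0)."""
    return Cc * H / (1.0 + e0) * math.log10((p0 + dp) / p0)

# Invented example layer: Cc=0.3, H=2 m, e0=0.8, p0=100 kPa, dp=100 kPa.
S = consolidation_settlement(0.3, 2.0, 0.8, 100.0, 100.0)

# Total settlement of a profile would be the sum over its soil layers.
```

For this example layer the settlement comes out to roughly 0.10 m.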
Determination of Equivalent Circuit parameters and performance characteristic... (pvpriya2)
Includes testing of an induction motor to draw its circle diagram, with a step-wise procedure and the corresponding calculations. Also explains the working and applications of the induction generator.
Prediction of Electrical Energy Efficiency Using Information on Consumer's Ac... (PriyankaKilaniya)
Energy efficiency has been important since the latter part of the last century. The main object of this survey is to determine the energy-efficiency knowledge among consumers. Two districts in Bangladesh were selected to survey households, showrooms, and sellers about energy use. The survey data is used to fit regression equations from which energy-efficiency knowledge can be predicted. The data is analyzed and calculated based on five important criteria. The initial target was to find factors that help predict a person's energy-efficiency knowledge. The survey found that energy-efficiency awareness among the people of the country is very low. Relationships between household energy-use behaviors are estimated using a unique dataset of about 40 households and 20 showrooms in Bangladesh's Chapainawabganj and Bagerhat districts. Knowledge of energy consumption and energy-efficiency technology options is found to be associated with household use of energy-conservation practices. Household characteristics also influence household energy-use behavior. Younger household cohorts are more likely to adopt energy-efficient technologies and energy-conservation practices, and place primary importance on energy saving for environmental reasons. Education also influences attitudes toward energy conservation in Bangladesh. Low-education households indicate they primarily save electricity for the environment, while high-education households indicate they are motivated by environmental concerns.
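Fitting a regression equation of the kind the survey describes can be sketched with NumPy least squares; the numbers below are invented stand-ins for survey responses, not the study's data:

```python
import numpy as np

# Invented stand-in data: x = years of schooling, y = energy-efficiency
# knowledge score on a 0-10 scale.
x = np.array([5, 8, 10, 12, 16], dtype=float)
y = np.array([2, 3, 5, 6, 8], dtype=float)

# Least-squares fit of y = slope * x + intercept.
slope, intercept = np.polyfit(x, y, 1)

def predict_knowledge(years_of_schooling):
    # The fitted regression equation used for prediction.
    return slope * years_of_schooling + intercept
```

The study would fit one such equation per criterion, with coefficients estimated from the real household and showroom responses.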
A high-Speed Communication System is based on the Design of a Bi-NoC Router, ...DharmaBanothu
The Network on Chip (NoC) has emerged as an effective
solution for intercommunication infrastructure within System on
Chip (SoC) designs, overcoming the limitations of traditional
methods that face significant bottlenecks. However, the complexity
of NoC design presents numerous challenges related to
performance metrics such as scalability, latency, power
consumption, and signal integrity. This project addresses the
issues within the router's memory unit and proposes an enhanced
memory structure. To achieve efficient data transfer, FIFO buffers
are implemented in distributed RAM and virtual channels for
FPGA-based NoC. The project introduces advanced FIFO-based
memory units within the NoC router, assessing their performance
in a Bi-directional NoC (Bi-NoC) configuration. The primary
objective is to reduce the router's workload while enhancing the
FIFO internal structure. To further improve data transfer speed,
a Bi-NoC with a self-configurable intercommunication channel is
suggested. Simulation and synthesis results demonstrate
guaranteed throughput, predictable latency, and equitable
network access, showing significant improvement over previous
designs
Applications of artificial Intelligence in Mechanical Engineering.pdfAtif Razi
Historically, mechanical engineering has relied heavily on human expertise and empirical methods to solve complex problems. With the introduction of computer-aided design (CAD) and finite element analysis (FEA), the field took its first steps towards digitization. These tools allowed engineers to simulate and analyze mechanical systems with greater accuracy and efficiency. However, the sheer volume of data generated by modern engineering systems and the increasing complexity of these systems have necessitated more advanced analytical tools, paving the way for AI.
AI offers the capability to process vast amounts of data, identify patterns, and make predictions with a level of speed and accuracy unattainable by traditional methods. This has profound implications for mechanical engineering, enabling more efficient design processes, predictive maintenance strategies, and optimized manufacturing operations. AI-driven tools can learn from historical data, adapt to new information, and continuously improve their performance, making them invaluable in tackling the multifaceted challenges of modern mechanical engineering.
3. Agenda
● Challenges and Initial ideas
● Main Deep Learning concepts
● Using the model!
○ Android applications
○ In your iOS, web or backend applications
… that’s all
8. 🤯 After the class…..
The key outcome of this lesson is that we'll have trained an
image classifier which can recognize pet breeds at state of
the art accuracy. The key to this success is the use of
transfer learning, which will be a key platform for much of
this course.
We also discuss how to set the most important
hyper-parameter when training neural networks:
the learning rate, using Leslie Smith's fantastic learning rate
finder method. Finally, we'll look at the important but rarely
discussed topic of labeling, and learn about some of the
features that fastai provides for allowing you to easily add
labels to your images. https://course.fast.ai/videos/?lesson=1
9. Challenges…
‐ Many courses, even basic ones, assume that
you already know the subject.
‐ Reaching the final result without
learning the basics is not good.
10. “When you are starting to learn about
Deep Learning it seems that there
are thousands of concepts,
mathematical functions and
scientific articles that you have to
read.
Myths
13. Let’s take a look at the
implementation
We are going to build an app to
classify the artisanal beers of
Cervecería Colima
14. 1. Data
Data is distinct pieces of information that
act as fuel for the model.
17. How? Where do we get data from?
Data curation is the organization and integration
of data collected from various sources.
Techniques
You can collect data through questionnaires and
surveys, interviews, and data scraping and
crawling techniques.
18. Where do we get data from?
Public datasets
● Google AI
● UCI ML Repository
● Data.gov.in
● Kaggle
Crowdsourcing marketplaces
● Amazon Mechanical Turk
● Dataturks
● Figure-eight
19. BACK TO OUR EXAMPLE...
● Google Images
● https://github.com/hardikvasa/google-images-download
● https://forums.fast.ai/t/tips-for-building-large-image-datasets/26688
22. TASK FOR OUR EXAMPLE
Classify Images of
Artisanal Beers
23. Image classification
A common use of machine learning is to identify
what an image represents.
The task of predicting what an image
represents is called image classification.
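An image classifier does not answer with a single label; it produces one score per class, and those scores are normalized into probabilities. A minimal pure-Python sketch of that normalization step (the raw scores here are made up for illustration):

```python
import math

def softmax(scores):
    """Turn raw model scores into probabilities that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical raw scores for three classes
probs = softmax([2.0, 1.0, 0.1])
```

The highest raw score always maps to the highest probability, which is why the class with the largest probability is taken as the prediction.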
25. Models
Many models have been created over
the years.
Each model has its own advantages and
disadvantages depending on the type of
data we are working with.
26. IMAGE CLASSIFICATION MODEL
An image classification model is trained to recognize various
classes of images.
When we subsequently
provide a new image as input
to the model, it will output the
probabilities of the image
representing each of the
types it was trained on.
27. An example output might be as follows:
Beer type      Probability
Cayaco         0.02
Colimita       0.96
Piedra Lisa    0.01
Ticus          0.00
Paramo         0.01
Based on the output, we can see that the
classification model has predicted that the
image has a high probability of representing
a Colimita beer.
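Picking the predicted class from such an output is just taking the label with the highest probability. A small sketch, using the probabilities from the example above:

```python
# Classifier output as label → probability
# (numbers taken from the example output above)
output = {
    "Cayaco": 0.02,
    "Colimita": 0.96,
    "Piedra Lisa": 0.01,
    "Ticus": 0.00,
    "Paramo": 0.01,
}

# The predicted class is the label with the highest probability
best_label = max(output, key=output.get)
```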
28. Model for our example
In this example, we will retrain a
MobileNet. MobileNet is a small, efficient
convolutional neural network.
https://ai.googleblog.com/2017/06/mobilenets-open-source-models-for.html
29. Retraining the MobileNet model
We use the MobileNet model and retrain it.
IMAGE_SIZE=224
ARCHITECTURE="mobilenet_0.50_${IMAGE_SIZE}"
python3 -m scripts.retrain \
  --bottleneck_dir=tf_files/bottlenecks \
  --model_dir=tf_files/models/"${ARCHITECTURE}" \
  --summaries_dir=tf_files/training_summaries/"${ARCHITECTURE}" \
  --output_graph=tf_files/retrained_graph.pb \
  --output_labels=tf_files/retrained_labels.txt \
  --architecture="${ARCHITECTURE}" \
  --image_dir=tf_files/beer_photos
The result…
30. USING THE RETRAINED MODEL
python3 -m scripts.label_image \
  --graph=tf_files/retrained_graph.pb \
  --image=tf_files/beer_photos/ticus/"3. ticus.jpg"
Evaluation time (1-image): 0.250s
ticus (score=0.99956)
paramo (score=0.00043)
cayaco (score=0.00000)
piedra lisa (score=0.00000)
colimita (score=0.00000)
31. 4. Loss function
How do we know which model is better?
The loss function (also known as the error)
answers this question.
32. Loss functions
Regression losses:
● Mean Square Error / L2 Loss
● Mean Absolute Error / L1 Loss
Classification losses:
● Hinge Loss / Multi-class SVM Loss
● Cross Entropy Loss / Negative Log Likelihood
To know which model is good for our data,
we compute the loss function by comparing
the predicted outputs to the actual outputs.
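To make the distinction concrete, here is a small pure-Python sketch of one loss from each family. The numbers are toy values for illustration, not outputs of the beer model:

```python
import math

def mse(y_true, y_pred):
    """Mean Square Error (L2 loss), a regression loss."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def cross_entropy(true_index, probs):
    """Negative log likelihood of the true class, a classification loss."""
    return -math.log(probs[true_index])

# Perfect predictions give zero loss; worse predictions give larger loss.
reg_loss = mse([3.0, 5.0], [2.5, 5.5])           # 0.25
cls_loss = cross_entropy(1, [0.02, 0.96, 0.02])  # ≈ 0.041
```

Either way, a lower loss means the model's predictions are closer to the actual outputs, which is what lets us compare models.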
33. 5. Learning algorithm
Learning algorithms, also known as
optimization algorithms, help us minimize
the error.
34. It is something you do every day...
You are optimizing variables and basing your
personal decisions on them all day long, most
of the time without even recognizing the
process consciously.
https://mitsloan.mit.edu/ideas-made-to-matter/how-to-use-algorithms-to-solve-everyday-problems
35. Types of learning algorithms
First Order Optimization Algorithms
● Gradient Descent
Second Order Optimization Algorithms
● Hessian
https://towardsdatascience.com/types-of-optimization-algorithms-used-in-neural-networks-and-ways-to-optimize-gradient-95ae5d39529f
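Gradient descent itself fits in a few lines: repeatedly step against the derivative of the function you want to minimize. A toy one-dimensional sketch (the function and learning rate are arbitrary choices for illustration):

```python
# Minimal gradient descent on f(x) = (x - 3)**2, whose minimum is at x = 3.
def gradient(x):
    return 2 * (x - 3)  # derivative of (x - 3)**2

x = 0.0              # arbitrary starting point
learning_rate = 0.1
for _ in range(100):
    x -= learning_rate * gradient(x)
# x is now very close to 3, the minimizer
```

Training a neural network does the same thing in many dimensions at once: the loss plays the role of f, and the weights play the role of x.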
41. MACHINE LEARNING IN YOUR APPS
● ML Kit For Firebase
● Core ML (Apple)
● TensorFlow Lite
● Cloud-based web services
● Your own service
43. USING THE RETRAINED MODEL
python3 -m scripts.label_image \
  --graph=tf_files/retrained_graph.pb \
  --image=tf_files/beer_photos/ticus/"3. ticus.jpg"
Evaluation time (1-image): 0.250s
ticus (score=0.99956)
paramo (score=0.00043)
cayaco (score=0.00000)
piedra lisa (score=0.00000)
colimita (score=0.00000)
44. TENSORFLOW LITE
TensorFlow Lite is a set of tools to
help developers run TensorFlow
models on mobile, embedded, and
IoT devices.
● TensorFlow Lite converter
● TensorFlow Lite interpreter
TensorFlow Lite converter
Converts TensorFlow models into
an efficient form for use by the
interpreter
45. Command line: tflite_convert
Starting from TensorFlow 1.9, the
command-line tool tflite_convert is
installed as part of the Python package.
pip install --upgrade "tensorflow==1.9.*"
50. TensorFlow Lite interpreter
The TensorFlow Lite interpreter is designed to be
lean and fast. The interpreter uses a static graph
ordering and a custom (less-dynamic) memory
allocator to ensure minimal load, initialization, and
execution latency.
Dependencies
repositories {
    maven {
        url 'https://google.bintray.com/tensorflow'
    }
}
dependencies {
    // ...
    compile 'org.tensorflow:tensorflow-lite:+'
}
Settings
android {
    aaptOptions {
        noCompress "tflite"
        noCompress "lite"
    }
}
51. Load model and create interpreter
// Name of the model file stored in Assets.
const val MODEL_PATH = "graph.lite"
// Name of the label file stored in Assets.
const val LABEL_PATH = "labels.txt"
class ImageClassifier(private val assetManager: AssetManager) {
    private val interpreter: Interpreter
    private val labels: List<String>
    init {
        interpreter = Interpreter(loadModelFile(assetManager, MODEL_PATH))
        labels = loadLabelList(assetManager)
        ...
    }
}
52. Camera, read the labels…
https://developer.android.com/training/camerax
// Convert the image to bytes
convertBitmapToByteBuffer(bitmap)
// An array to hold the inference results returned
// by the TensorFlow Lite interpreter.
val recognitions = ArrayList<Result>()
val recognitionsSize = Math.min(pq.size, MAX_RESULTS)
for (i in 0 until recognitionsSize) recognitions.add(pq.poll())
return@flatMap Single.just(recognitions)
53. Show the results
// Get the results
textToShow = String.format("\n%s: %4.2f", label.key, label.value)
// Label (in this case PARAMO)
label.key
// Value (in this case 1.0)
label.value
ticus (score=0.00000)
paramo (score=1.00000)
cayaco (score=0.00000)
piedra lisa (score=0.00000)
colimita (score=0.00000)
54. Call to action!
Now you are ready to
start building your first
custom ML model.
55. 3.1 Use the model - other applications
iOS, web or backend applications
57. TensorFlow Lite options - Mobile & IoT
TensorFlow Lite provides all the
tools you need to convert and
run TensorFlow models on
mobile, embedded, and IoT
devices.
58. TensorFlow platform
TensorFlow is an end-to-end open source
platform for machine learning.
● Web Apps
● Back end - APIs
https://www.tensorflow.org/
59. YOU CAN BUILD GREAT THINGS!
https://ai.google/social-good/
There are a lot of
problems to solve!