
Transformers for time series

EzeLanza
Transformers for Time Series
Is the new State of the Art (SOTA) approaching?
#ossummit @eze_lanza
AI Open Source Evangelist, Intel
Agenda
– Transformers 101
– Time Series + Transformers
– Informer & Spacetimeformer
– Use case
– Conclusions
Transformers 101
Transformers Family Tree
https://arxiv.org/abs/2302.07730v2
Stable Diffusion
Transformer
What is a Transformer? 1/4
• Vanilla transformer (Language Translation)
– 1. Embedding + Positional encoding
– 2. Encoder: Multi-Head SELF-ATTENTION
– 3. Decoder: Multi-Head SELF-ATTENTION
https://arxiv.org/abs/1706.03762 “Attention is all you need”


Editor's Notes

1. This is my learning process for solving a specific problem: latency prediction in Kubernetes. I believe the process can be useful to others, since the amount of papers and information to absorb is huge, and I would like to share my experience with you; this is a hot topic, and it can be beneficial wherever a XXXX time series problem has to be solved. I promise it won't be overly technical and you won't need full prior knowledge. My idea is to introduce the topic and point to further reading: I will cite and mention the papers that were most useful in my journey.
2. [Animation: everything appears, starting from the vanilla transformer.] Transformers have changed the world since their appearance in 2017. Why are they important? They have come to monopolize state-of-the-art performance across virtually all NLP tasks.
3. [Circle or square highlighting the transformer.] The architecture has evolved in different directions as researchers found new ways to transform it. Diffusion models are one example: they use a Transformer + U-Net, and that is the SOTA today. For time series, though, the picture is not yet clear; there are multiple approaches and multiple ongoing conversations. It's not just the transformer, but...
4. Why is the transformer so important, and why are we seeing these architectures in almost every use case today? To explain a transformer we need to start with the concepts. There is a lot of math behind it, and I'll try to explain it in an easy way. Let's try to understand the basics and how everything started. The parts are almost the same across models: this is the initial idea, just the basics; some models keep this architecture, others modify it. It's important to understand how it works so that later we can understand the problems, why we might need a different architecture for our use case, and how it can be adapted to time series. For instance, GPT uses the DECODER, while BERT uses the ENCODER. Initially the transformer was used for text translation, and it was disruptive because of its results. We can divide it into three big groups: how it adapts the information so it can be interpreted, how it extracts the relationships between the words in a seq2seq setting, that is... (see the sketch below).
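A minimal PyTorch sketch of this encoder/decoder split, to make the note concrete; the model sizes and tensor shapes are illustrative choices, not values from the talk:

import torch
import torch.nn as nn

# Vanilla encoder-decoder transformer as in "Attention Is All You Need".
# Sizes are illustrative, not from the talk.
model = nn.Transformer(d_model=64, nhead=4,
                       num_encoder_layers=2, num_decoder_layers=2,
                       batch_first=True)

src = torch.randn(1, 10, 64)   # source sequence: (batch, seq_len, d_model)
tgt = torch.randn(1, 7, 64)    # target sequence fed to the decoder
out = model(src, tgt)          # (1, 7, 64)

# BERT-style models keep only the encoder stack; GPT-style models keep
# only the decoder stack. An encoder-only stack from the same pieces:
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True),
    num_layers=2)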
5. Let's focus only on text. Computers don't understand words; they get numbers, vectors and matrices. Embeddings map words with similar meanings close together, but those words can have different meanings based on their position, so we need to provide positional context. The input embeddings capture the dependencies across different variables without considering position information. Transformers are permutation invariant, meaning they cannot interpret the order of input tokens by default, so we provide that information through positional encoding. This part is important and has seen multiple changes in how models interpret positions: some add the position as a token, and there are hybrid positional encodings (sketch below).
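A self-contained sketch of the fixed sin/cos positional encoding described here (the dimensions are illustrative):

import torch

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> torch.Tensor:
    # Fixed sin/cos table from "Attention Is All You Need": each position
    # gets a unique vector, giving the otherwise permutation-invariant
    # attention a sense of token order.
    position = torch.arange(seq_len).unsqueeze(1)              # (seq_len, 1)
    div_term = torch.exp(torch.arange(0, d_model, 2)
                         * (-torch.log(torch.tensor(10000.0)) / d_model))
    pe = torch.zeros(seq_len, d_model)
    pe[:, 0::2] = torch.sin(position * div_term)   # even dimensions
    pe[:, 1::2] = torch.cos(position * div_term)   # odd dimensions
    return pe

embeddings = torch.randn(10, 64)   # 10 tokens, d_model = 64
inputs = embeddings + sinusoidal_positional_encoding(10, 64)  # order-aware input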
6. https://jalammar.github.io/illustrated-transformer/ Model the relationship between each word and the others; as we can see, "children" has a strong relationship. The idea now is to convert each word into a vector that contains its relationship information with the other words, so the resulting vector is the product of each relationship with the other words. For that, three matrices are used, and these are what get trained. Q1 alone would give us just one relationship, but we need multiple relationships! This is why we use multi-head attention (sketch below).
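A minimal sketch of the three trained matrices and the attention computation (my illustration of the mechanism, with toy sizes):

import torch

torch.manual_seed(0)
seq_len, d_model = 10, 64
x = torch.randn(seq_len, d_model)   # one embedded sentence

# The three trained matrices: they project each token into query,
# key and value spaces.
W_q = torch.randn(d_model, d_model)
W_k = torch.randn(d_model, d_model)
W_v = torch.randn(d_model, d_model)
Q, K, V = x @ W_q, x @ W_k, x @ W_v

# Score every token against every other token, then use the scores
# to take a weighted sum of the value vectors.
scores = (Q @ K.T) / d_model ** 0.5
weights = torch.softmax(scores, dim=-1)   # each row sums to 1
z = weights @ V   # (seq_len, d_model): one relationship-aware vector per token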
7. https://jalammar.github.io/illustrated-transformer/ We are showing just one, but the vectors contain information about each word in relation to the other words. Multi-head attention allows the model to capture multiple relationships: the head outputs are concatenated and multiplied by a new weight matrix to get back a single vector (sketch below).
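The concatenate-and-project step in miniature; the per-head outputs are faked with random tensors just to show the shapes:

import torch

seq_len, d_model, n_heads = 10, 64, 4
d_head = d_model // n_heads

# Stand-ins for the context vectors produced by each attention head.
heads = [torch.randn(seq_len, d_head) for _ in range(n_heads)]

# Concatenate the heads and multiply by one more trained weight matrix
# W_o to get a single vector per token again.
W_o = torch.randn(n_heads * d_head, d_model)
z = torch.cat(heads, dim=-1) @ W_o   # (seq_len, d_model)

# torch.nn.MultiheadAttention packages this whole mechanism.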
8. https://jalammar.github.io/illustrated-transformer/ We are showing just one, but the vectors contain information about each word in relation to the other words. The inputs are encoded, so you now have the relationships between them, but you still need to predict the output; this is where the model uses the DECODER part. The decoder works the same way the encoder does: the Z outputs are transformed into K and V matrices, so...
9. Now let's dive into time series: can the transformer be used as-is for time series? There is a trend of reusing what works well in one field in others, mainly because it was such a huge advance, so we can probably do the same for time series. But, much as we saw for NLP, we face some problems when we try.
10. Time series appear in many use cases around the world: weather forecasting or, as in this case, Bitcoin prices. Time series forecasting plays an important role in daily life, helping people manage resources and make decisions; for example, in the retail industry, probabilistic forecasting of product demand and supply based on historical data helps with inventory planning to maximize profit. Anything with a dependency on the previous values can be considered a time series: point x will depend on the points before it, and it is very important to mention this because it is a differentiator of time series. Explain the difference between series and stress the importance of a point's dependence on the previous ones. A context window is used to predict a target window, which can be 1 point or 10 (windowing sketch below).
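A small sketch of the context-window / target-window split; the window sizes are arbitrary:

import numpy as np

def make_windows(series: np.ndarray, context: int, horizon: int):
    # Slice a 1-D series into (past, future) training pairs: each target
    # window depends on the context points before it.
    X, y = [], []
    for i in range(len(series) - context - horizon + 1):
        X.append(series[i:i + context])                       # past points
        y.append(series[i + context:i + context + horizon])   # future points
    return np.stack(X), np.stack(y)

series = np.sin(np.linspace(0, 20, 200))          # toy series
X, y = make_windows(series, context=24, horizon=6)
print(X.shape, y.shape)                           # (171, 24) (171, 6)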
11. Following this reasoning: if one time step depends on the previous ones, isn't that similar to language? It can be associated with language, but language is more flexible: we can slightly change the word order and still get the message, whereas in a time series the order is fixed. It's important to have a description or a model. We can imagine that an algorithm that performs well on language could perform well on time series. What if I have a long phrase?
12. Classical methods involve an exhaustive study of the characteristics of the time series in order to model it: parameters such as trend and seasonality are studied, and well-known methods such as ARIMA or autoregressive models are used. Fixed length, autoregressive, linear dependencies (example below).
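A classical baseline in a few lines with statsmodels; the order (p, d, q) here is an illustrative pick, since in practice it comes from that study of trend, seasonality and autocorrelation:

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

series = np.cumsum(np.random.randn(300))   # toy random-walk series

# order=(p, d, q): autoregressive lags, differencing, moving-average lags.
model = ARIMA(series, order=(2, 1, 1)).fit()
print(model.forecast(steps=10))            # next 10 points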
13. The main problem here is that a special network has to be designed for the task: if I want to predict the next value, the architecture has to be built that way; if I then want to use the previous 8 values, I couldn't. That makes it a challenge. RNN: one of the attractions of RNNs is the idea that they can connect previous information to the current task. Where the gap between the relevant information and the place it is needed is small, RNNs can learn to use past information. There are also cases where more context is needed, and it is quite possible for the gap between the relevant information and the point where it is needed to become very large. Because of the vanishing gradient, earlier values count less and less. LSTM: LSTMs can be used to create large recurrent networks that tackle difficult sequence problems in machine learning and achieve state-of-the-art results. Instead of neurons, LSTM networks have memory blocks connected across layers (sketch below).
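A minimal PyTorch LSTM forecaster to make the limitation concrete: the context length is fixed by how we feed the data, and old points fade from the hidden state (sizes are illustrative):

import torch
import torch.nn as nn

class LSTMForecaster(nn.Module):
    # Predict the next value of a series from a fixed-length context.
    def __init__(self, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                   # x: (batch, context, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])     # last hidden state -> next point

model = LSTMForecaster()
context = torch.randn(8, 24, 1)   # 8 windows of 24 past points each
next_point = model(context)       # (8, 1)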
14. Since self-attention enables the Transformer to capture both long- and short-term dependencies, and different attention heads learn to focus on different aspects of temporal patterns, the Transformer is a good candidate for time series forecasting. Transformer models are based on a multi-headed attention mechanism that offers several key advantages and renders them particularly suitable for time series data. They can concurrently take into account long contexts of input sequence elements and learn to represent each sequence element by selectively attending to those input sequence elements which the model considers most relevant. They do so without position-dependent prior bias; this is to be contrasted with RNN-based models: a) even bi-directional RNNs treat elements in the middle of the input sequence differently from elements close to the two endpoints, and b) despite careful design, even LSTM (Long Short-Term Memory) and GRU (Gated Recurrent Unit) networks practically only retain information from a limited number of time steps stored inside their hidden state (the vanishing gradient problem; Hochreiter, 1998; Pascanu et al., 2013), and thus the context used for representing each sequence element is inevitably local. Multiple attention heads can consider different representation subspaces, i.e., multiple aspects of relevance between input elements. For example, in the context of a signal with two frequency components, 1/T1 and 1/T2, one attention head can attend to neighboring time points, while another may attend to points spaced a period T1 before the currently examined time point, a third to a period T2 before, etc. This is to be contrasted with attention mechanisms in RNN models, which learn a single global aspect/mode of relevance between sequence elements. After each stage of contextual representation (i.e., each transformer encoder layer), attention is redistributed over the sequence elements, taking into account progressively more abstract representations of the input elements as information flows from the input towards the output. By contrast, RNN models with attention use a single distribution of attention weights to extract a representation of the input, and most typically attend over a single layer of representation (hidden states).
15. But there are some problems, and this one relates to the algorithm itself. If you have worked with algorithms, you know that when you write a function its cost can grow in several ways, and this part matters because we would like an algorithm whose cost does not explode as the input size grows. Constant: 1+1 costs the same as 200+200, always the same amount of work. Linear: a single for loop, e.g. accumulating sum_elements + prev. Quadratic: a for loop inside a for loop. This is the case of transformers: you compare each word with all the other words, so the cost is quadratic in the sequence length, which is a challenge (demonstration below).
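A quick demonstration of the quadratic cost: the attention score matrix alone is n by n, so doubling the sequence length quadruples the memory:

import numpy as np

for n in (512, 1024, 2048):
    Q = np.random.randn(n, 64)
    K = np.random.randn(n, 64)
    scores = Q @ K.T                   # every token vs every other: (n, n)
    print(n, scores.shape, f"{scores.nbytes / 1e6:.0f} MB")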
16. Therefore the transformer has to be adapted, and there are different ways to do it. To summarize the existing time series Transformers, the paper proposes a taxonomy from the perspectives of network modifications and application domains. From the network-modification perspective, we can summarize the changes made at both the module level and the architecture level to accommodate the architecture for time series modeling. From the application-domain perspective, there are multiple ways to use time series, and this part focuses on the application domain: forecasting, anomaly detection, classification.
17. As you may imagine, it is extremely important to encode the positions of the input time series. In the vanilla transformer we first encode the positional information as a vector and then inject it into the model as an additional input. Vanilla: using the vanilla encoding as-is will definitely not work. Learnable: instead of using the fixed sin/cos encoding, they decided to learn it, adapting it to multiple tasks (sketch below).
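The learnable variant in a few lines of PyTorch: one trainable vector per position, looked up by index and added to the inputs (max_len and d_model are illustrative):

import torch
import torch.nn as nn

max_len, d_model = 512, 64
pos_embed = nn.Embedding(max_len, d_model)   # trained with the rest of the model

x = torch.randn(1, 96, d_model)              # 96 time steps
positions = torch.arange(96).unsqueeze(0)    # indices 0..95
x = x + pos_embed(positions)                 # position-aware inputs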
18. Now let's dive into time series: can the transformer be used as it is? There is a trend of reusing what works well in one field in others; as with NLP, we face some problems when we try.
19. The vanilla transformer (Vaswani et al. 2017; Devlin et al. 2018) uses a point-wise self-attention mechanism, and the time stamps serve as local positional context. However, in the LSTF problem, the ability to capture long-range dependence requires global information such as hierarchical time stamps (week, month and year) and agnostic time stamps (holidays, events). These are hardly leveraged in canonical self-attention, and the consequent query-key mismatches between the encoder and decoder degrade the forecasting performance. We propose a uniform input representation to mitigate the issue; Fig. 6 gives an intuitive overview. It can be hardcoded.
20. The main goal of sparse attention is to reduce the computation. Sparse attention reduces the computation time and memory requirements of the attention mechanism by computing a limited selection of similarity scores from a sequence rather than all possible pairs, resulting in a sparse matrix rather than a full matrix. The main idea of ProbSparse is that the canonical self-attention scores form a long-tail distribution, where the "active" queries lie in the "head" scores and the "lazy" queries lie in the "tail" area. By an "active" query we mean a query q_i such that the dot product <q_i, k_i> contributes to the major attention, whereas a "lazy" query forms a dot product that generates trivial attention. Here, q_i and k_i are the i-th rows of the Q and K attention matrices respectively. ProbSparse takes care of selecting which queries to attend to (sketch below).
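A deliberately simplified sketch of the query-selection idea. Informer's real ProbSparse also subsamples the keys when measuring sparsity; here every query is scored exactly, just to show the top-u selection:

import math
import torch

def probsparse_like_attention(Q, K, V, u):
    # Illustration only, not Informer's exact implementation.
    d = Q.shape[-1]
    scores = Q @ K.T / d ** 0.5                 # (L, L)

    # Sparsity measurement M(q, K): max score minus mean score.
    # "Active" queries have a peaked score distribution.
    M = scores.max(dim=-1).values - scores.mean(dim=-1)
    top = M.topk(u).indices                     # keep only the top-u queries

    out = V.mean(dim=0).expand_as(Q).clone()    # lazy queries get the mean of V
    out[top] = torch.softmax(scores[top], dim=-1) @ V   # active queries: full attention
    return out

L, d = 96, 64
Q, K, V = (torch.randn(L, d) for _ in range(3))
z = probsparse_like_attention(Q, K, V, u=int(5 * math.log(L)))   # u ~ O(log L)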
21. Because of the ProbSparse self-attention, the encoder's feature map has some redundancy that can be removed. The distilling operation is therefore used to reduce the input size between encoder layers to half, in theory removing this redundancy. In practice, Informer's "distilling" operation just adds 1D convolution layers with max pooling between each pair of encoder layers. Let X_n be the output of the n-th encoder layer; the distilling operation is then X_{n+1} = MaxPool(ELU(Conv1d(X_n))) (sketch below).
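The distilling block in PyTorch terms, matching the formula above; the channel count stands in for an illustrative d_model:

import torch
import torch.nn as nn

# Conv1d + ELU + stride-2 max pooling: the sequence length is halved
# between encoder layers.
distill = nn.Sequential(
    nn.Conv1d(in_channels=64, out_channels=64, kernel_size=3, padding=1),
    nn.ELU(),
    nn.MaxPool1d(kernel_size=3, stride=2, padding=1),
)

x = torch.randn(1, 64, 96)    # (batch, d_model, seq_len)
print(distill(x).shape)       # torch.Size([1, 64, 48]): half the length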
22. Of course, we don't have time to explain everything. It's not a huge improvement, but it's better.
23. There is a recent approach that builds on the Informer style. Informer generates d-dimensional embeddings of the sequence, with the result expressed as a matrix. This approach modifies the token-embedding input sequence by flattening each multivariate vector into N scalars, each with a copy of its timestamp, leading to a new sequence. Unlike Informer, Spacetimeformer uses a different representation of the inputs: each token carries position information. It converts a seq2seq problem of length L into a new format of length L*N, but we can already imagine a problem here as the number of variables grows (sketch below).
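A toy sketch of that flattening: L time steps of N variables become L*N scalar tokens, each tagged with its time stamp and variable id (the 3-column token layout is my simplification):

import torch

L, N = 5, 3
x = torch.randn(L, N)                           # (time, variable)

values = x.reshape(L * N, 1)                    # one scalar per token
times = torch.arange(L).repeat_interleave(N)    # 0,0,0,1,1,1,...
variables = torch.arange(N).repeat(L)           # 0,1,2,0,1,2,...

tokens = torch.cat([values,
                    times.float().unsqueeze(1),
                    variables.float().unsqueeze(1)], dim=1)
print(tokens.shape)   # (15, 3): value, time stamp, variable id
# The sequence the attention sees is now L*N long, which is why many
# variables quickly become a problem.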
24. To run our experiments, we selected the problem of predicting intra-microservice latency. For those who aren't aware of what it is, it's basically an implementation of an e-commerce site whose architecture is based on microservices. We will predict latencies: we have the information across all the services, and we will predict the latency to the front-end service. We could pick any of them, but we selected user latency.
25. As we said, there are multiple ways and moments to use these models. We can be at a stage where we have very few points for training. How many points is enough? It depends on each series. But transformers can be useful when we have long series and want to predict a long span of data. The easiest option, of course, is to predict the next point based on the past; we will instead target predicting long sequences.
26. Be careful with the MSE. The prediction is close to the original values.
27. It seems to be better, but the reality is that what matters are the peaks. MSE is a mean over all the measurements, so even if the model is detecting well on average, the score is not able to tell us whether the peaks are captured (demonstration below).
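A small numeric illustration of the caveat: a forecast that flattens the one spike can report a better MSE than a noisier forecast that actually tracks it:

import numpy as np

rng = np.random.default_rng(0)
n = 100
actual = np.ones(n)
actual[50] = 9.0                          # one important peak

flat = np.ones(n)                         # misses the peak completely
tracker = actual + rng.normal(0, 1, n)    # noisy, but follows the peak

mse = lambda a, b: np.mean((a - b) ** 2)
print(mse(actual, flat))      # ~0.64: lower MSE despite missing the spike
print(mse(actual, tracker))   # ~1.0: higher MSE although the peak is captured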
28. But why are transformers now being used in all environments? Before moving to the architecture details, see this survey: https://arxiv.org/pdf/2202.07125.pdf
29. Seasonality is important. Three important things:
30. The goal is to be able to observe; of course, the most valuable thing would be an algorithm that can predict a long sequence, and that is a challenge. [Pose the Kubernetes problem here? Is it worth it?]