A talk at Black Hat USA 2013 discussing recent advances in academic research that imperil two critical algorithms upon which Internet trust rests: RSA and Diffie-Hellman.
The Factoring Dead: Preparing for the Cryptopocalypse
1. The Factoring Dead
Preparing for the Cryptopocalypse
Thomas Ptacek, Matasano
Tom Ritter, iSEC Partners
Javed Samuel, iSEC Partners
Alex Stamos, Artemis Internet
3. • There is a significant disconnect between theory and reality
in security.
• Lots of great, continuous academic research in cryptography.
• Few engineers get beyond Applied Cryptography before
shipping code.
• In the 2010s, it is no longer acceptable to just use standard
libraries and claim ignorance.
• We wanted to see if we could bridge this gap a bit.
• We certainly are not the only ones to do so.
Why are we here?
4. • Numerous attacks on the current TLS infrastructure.
• BEAST 1
• CRIME 2
• Lucky 13 3
• RC4 Bias 4
• Even a new compression oracle attack here at BlackHat
USA 2013! 5
• Were any of these attacks really unpredictable to people
paying attention? (Hint: no 6)
Recent TLS Problems
[1] http://vnhacker.blogspot.com/2011/09/beast.html
[2] https://www.isecpartners.com/blog/2012/september/details-on-the-crime-attack.aspx
[3] http://www.isg.rhul.ac.uk/tls/TLStiming.pdf
[4] http://infoscience.epfl.ch/record/152526/files/RC4_1.pdf
[5] http://www.blackhat.com/us-13/briefings.html#Prado
[6] John Kelsey. Compression and information leakage of plaintext. Fast Software Encryption, 9th International Workshop, February 2002!
5. • 1998 – EFF Deep Crack defeats DES in 56 hours
• 2005 - Collision attacks against MD5 discussed
• 2008 - Appelbaum, Sotirov et al. use MD5 collision attack against a CA
• 2011 - CA/Browser Forum forbids MD5
• 2012 - Somebody (cough) uses related attack against
Microsoft for FLAME
• SIM Card Attack at BlackHat 2013 using DES
Comparison to Academic Time Line
6. • Most systems are not designed for cryptographic agility
• Cryptography is an ecosystem
• Few companies employ full-time cryptographers
• Hard for InfoSec practitioners to keep up-to-speed
• Lots of momentum in the professional consulting core.
We have failed as an industry
to address these structural problems.
Why such a disconnect?
7. • Looking for the next crypto black swan.
• Our thesis:
• The last six months have seen huge leaps in solving the DLP
• These leaps have parallels to the past.
• There is a small but real chance that both RSA and non-
ECC DH will soon become unusable.
• Ecosystem currently cannot support a quick pivot to ECC
We want this room to become the seed of change
Why are we here?
9. • Key part of modern cryptosystems
Why Asymmetric Cryptography?
10. • We need a “trap-door” function, something that is easy
to do but hard to undo
• We also need a way to cheat with more information
• Rarely is the difficulty of this function proved, only
assumed
How does asymmetric crypto work?
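To make "easy to do but hard to undo" concrete, here is a minimal Python sketch with toy, hypothetical numbers: the forward direction, modular exponentiation, is essentially instant even at large sizes, while inverting it without extra (trapdoor) information is a search problem.

    # Forward direction: fast even with a 127-bit modulus and a huge exponent
    p = 2**127 - 1                 # a Mersenne prime (illustrative modulus)
    g = 3                          # base (illustrative; need not be a generator)
    x = 123456789012345678901      # the "secret"
    h = pow(g, x, p)               # square-and-multiply: effectively instant

    # Backward direction: recovering x from (g, h, p) has no comparable
    # shortcut without trapdoor information; naive search would need on the
    # order of x modular multiplications, which is hopeless at real key sizes.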
11. • Diffie-Hellman - 1976 - Secure key exchange
• RSA - 1977 - Encryption, signing
• Elliptic Curve Cryptography
• Suite B - 2007 - Key exchange, signing and encryption
• GOST - 2010 - Key exchange, signing and encryption
What are the common primitives?
12. • First published by Whitfield Diffie and Martin Hellman in
1976
• Establishes shared secret by exchanging data over a
public network.
• Security relies on the hardness of the discrete logarithm
problem.
Diffie-Hellman Overview
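As a hedged illustration of the exchange, the toy Python sketch below uses an artificially small, hypothetical prime; real deployments use 2048-bit or larger groups. Both sides derive the same secret without ever transmitting their private exponents.

    import secrets

    # Toy public parameters (illustrative only; far too small for real use)
    p = 0xFFFFFFFFFFFFFFC5             # a 64-bit prime
    g = 5                              # public base (not necessarily a generator)

    a = secrets.randbelow(p - 2) + 1   # Alice's private exponent
    b = secrets.randbelow(p - 2) + 1   # Bob's private exponent

    A = pow(g, a, p)                   # Alice publishes A = g^a mod p
    B = pow(g, b, p)                   # Bob publishes   B = g^b mod p

    shared_alice = pow(B, a, p)        # (g^b)^a mod p
    shared_bob = pow(A, b, p)          # (g^a)^b mod p
    assert shared_alice == shared_bob  # both arrive at the same shared secret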
13. • Solve the discrete logarithm problem:
• Suppose h = g^x for some g in the finite field and secret
integer x.
• The discrete logarithm problem is to find the exponent x,
when only g and h are known.
• Also how you attack El-Gamal and DSA
How do I attack DH?
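To see why group size is everything, here is a hedged sketch of baby-step giant-step, a generic algorithm that solves the DLP in roughly sqrt(p) operations. The prime below is a hypothetical toy value; against 2048-bit parameters the same attack is hopeless.

    from math import isqrt

    def bsgs(g, h, p):
        """Baby-step giant-step: return x with g^x = h (mod p), in O(sqrt(p))."""
        m = isqrt(p) + 1
        baby = {pow(g, j, p): j for j in range(m)}   # baby steps: g^j
        step = pow(g, -m, p)                         # g^(-m) mod p
        gamma = h
        for i in range(m):                           # giant steps: h * g^(-i*m)
            if gamma in baby:
                return i * m + baby[gamma]
            gamma = (gamma * step) % p
        return None

    p, g = 1000003, 2                  # toy prime and base
    x_secret = 789654
    h = pow(g, x_secret, p)
    x_found = bsgs(g, h, p)
    assert pow(g, x_found, p) == h     # a valid discrete log recovered in ~1000 steps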
14. • Key Generation to compute the public and private key
exponents (e, d)
• Encryption by raising the message to the public key
exponent e
• Decryption by raising the ciphertext to the private key
exponent d
• Security relies on the hardness of factoring.
RSA Overview
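A toy end-to-end sketch of key generation, encryption, and decryption, with deliberately tiny, hypothetical primes (real keys use primes of 1024+ bits and padding such as OAEP):

    from math import gcd

    # Key generation
    p, q = 61, 53
    n = p * q                      # public modulus (3233)
    phi = (p - 1) * (q - 1)        # Euler's totient of n
    e = 17                         # public exponent, coprime to phi
    assert gcd(e, phi) == 1
    d = pow(e, -1, phi)            # private exponent: e*d = 1 (mod phi)

    # Encryption: c = m^e mod n
    m = 1234
    c = pow(m, e, n)

    # Decryption: m = c^d mod n
    assert pow(c, d, n) == m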
15. • Factoring!
• Find the p & q such that p*q = N
• Factoring an RSA modulus allows an attacker to
compute the secret d and thus figure out the private key.
How do I attack RSA?
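Continuing the toy sketch above (hypothetical numbers): once the modulus is factored, the private exponent falls out immediately. Pollard's rho, shown here, handles small moduli; against real 2048-bit keys even the number field sieve is far out of reach today.

    from math import gcd

    def pollard_rho(n, c=1):
        """Pollard's rho: return a non-trivial factor of a composite n."""
        if n % 2 == 0:
            return 2
        x = y = 2
        d = 1
        while d == 1:
            x = (x * x + c) % n            # tortoise: one step
            y = (y * y + c) % n            # hare: two steps
            y = (y * y + c) % n
            d = gcd(abs(x - y), n)
        return d if d != n else pollard_rho(n, c + 1)   # retry with a new constant

    n, e = 3233, 17                        # the toy public key from above
    p = pollard_rho(n)
    q = n // p
    d = pow(e, -1, (p - 1) * (q - 1))      # recovered "private" exponent
    c = pow(1234, e, n)
    assert pow(c, d, n) == 1234            # the attacker can now decrypt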
16. • An elliptic curve E over the real numbers R is defined by a
Weierstrass equation, e.g. y^2 = x^3 - 3x + 5
• Cryptographic schemes require fast and accurate arithmetic and
use one of the following finite fields:
• Prime field F_p, where p is a prime, for software applications.
• Binary field F_2^m, where m is a positive integer, for hardware applications.
Elliptic Curve Cryptography Overview
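A minimal sketch of the group law over a prime field, on a tiny hypothetical curve (real curves such as NIST P-256 use 256-bit primes and carefully chosen coefficients):

    # Toy short-Weierstrass curve y^2 = x^3 + a*x + b over F_p (illustrative parameters)
    p, a, b = 97, 2, 3
    O = None                                    # point at infinity (group identity)

    def ec_add(P, Q):
        """Point addition on the curve over F_p."""
        if P is O:
            return Q
        if Q is O:
            return P
        (x1, y1), (x2, y2) = P, Q
        if x1 == x2 and (y1 + y2) % p == 0:
            return O                            # P + (-P) = O
        if P == Q:
            lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
        else:
            lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
        x3 = (lam * lam - x1 - x2) % p
        return (x3, (lam * (x1 - x3) - y1) % p)

    def scalar_mul(k, P):
        """Double-and-add: compute k*P."""
        R = O
        while k:
            if k & 1:
                R = ec_add(R, P)
            P = ec_add(P, P)
            k >>= 1
        return R

    G = (3, 6)   # on the curve: 6^2 = 36 = 3^3 + 2*3 + 3 (mod 97)
    assert scalar_mul(5, G) == ec_add(G, scalar_mul(4, G))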
20. • Generic algorithms (for any G)
• Example: Pohlig-Hellman
• Shows that the discrete logarithm can be solved by breaking
the group up into subgroups of prime order.
• Generic algorithms are exponential time algorithms.
• Specific algorithms which make use of group
representation
• Example: Index calculus algorithms
• They leverage particular properties of the group
• Result in sub-exponential running time
Discrete Logarithm Algorithms
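A hedged sketch of the Pohlig-Hellman idea with toy, hypothetical parameters: when the group order is smooth (here p - 1 = 2*3*5*7*11), the discrete log is solved in each small prime-order subgroup and reassembled with the Chinese Remainder Theorem. This is exactly why real DH groups are chosen so that the order has a large prime factor.

    # Toy prime whose multiplicative group has smooth order: p - 1 = 2*3*5*7*11
    p = 2311
    primes = [2, 3, 5, 7, 11]

    def find_generator(p, primes):
        """Return an element of full order p - 1."""
        for g in range(2, p):
            if all(pow(g, (p - 1) // q, p) != 1 for q in primes):
                return g

    def subgroup_dlog(gq, hq, q, p):
        """Brute-force x in gq^x = hq inside the order-q subgroup (q is tiny)."""
        for x in range(q):
            if pow(gq, x, p) == hq:
                return x

    g = find_generator(p, primes)
    secret = 1337
    h = pow(g, secret, p)

    # Solve x mod q in each small subgroup, then recombine with the CRT.
    x, modulus = 0, 1
    for q in primes:
        gq = pow(g, (p - 1) // q, p)        # generator of the order-q subgroup
        hq = pow(h, (p - 1) // q, p)
        xq = subgroup_dlog(gq, hq, q, p)    # this is x mod q
        t = (xq - x) * pow(modulus, -1, q) % q
        x = x + modulus * t                 # lift the solution to mod (modulus*q)
        modulus *= q

    assert x == secret and pow(g, x, p) == h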
21. Exponential vs Polynomial
L(1) – Exponential: Way Too Slow
L(0) – Polynomial: Fast enough to scare you
[Plots: running time on linear and logarithmic scales]
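For reference, the L-notation on these slides is the standard sub-exponential complexity measure, where the constant c depends on the algorithm (the slides suppress it and write just L(alpha)):

    L_N[\alpha] = \exp\left( (c + o(1)) \, (\ln N)^{\alpha} \, (\ln \ln N)^{1-\alpha} \right)

L(0) is polynomial in the bit-length of N, L(1) is fully exponential, and L(1/2), L(1/3), L(1/4) are the sub-exponential regimes in between where the best factoring and discrete-log algorithms live.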
23. Exponential vs Polynomial
[Scale from L(0) to L(1): L(1/3) – 1984, L(1/2) – 1979, L(1) – current fastest ECDLP algorithms]
Factoring and Discrete Logs stay here (at L(1/3)) for the next 30 years
24. Exponential vs Polynomial
[Scale from L(0) to L(1): L(1/3) – 1984 (Factoring), L(1/2) – 1979, L(1) – current fastest ECDLP algorithms]
L(1/4) for Discrete Logs with restrictions on the types of group – 2013
25. Exponential vs Polynomial
[Scale from L(0) to L(1): L(1/4) – 2013, L(1/3) – 1984 (Factoring), L(1/2) – 1979, L(1) – current fastest ECC algorithms]
L(0) for discrete logs with restrictions on the types of groups – 2013
26. • Rapid progress in DL research in the past 6 months
• February 20, 2013: Joux published an L(1/4) algorithm to
solve DLP in small characteristic fields.
• April 6, 2013: Barbulescu et al. solve the DLP in F_2^809
using the Function Field Sieve (FFS) algorithm
• June 18, 2013: Barbulescu, Gaudry, Joux, Thomé publish a
quasi-polynomial algorithm for DLP in finite fields of small
characteristic.
New Developments in 2013
27. • Uses a judicious change of variables to find multiplicative
relations more easily.
• Uses a specific polynomial with linear factors to simplify
the computation.
• Uses a new descent algorithm to express arbitrary
elements of the finite field.
• Complexity is L(1/4 + o(1)), which is considerably faster
than any discrete logarithm algorithm published before.
Joux’s New Discrete Log Algorithm (Feb 2013)
28. • Quasi-polynomial algorithm for DL in finite fields of small
characteristic.
• Improves Joux’s February 2013 algorithm using special matrix
properties.
• The fastest discrete logarithm algorithm has been improved significantly in the past 6 months, after marginal progress for 25 years.
• However, there is no clear jump to more practical implementations which use finite fields with larger characteristic YET!
More Improvements
June 2013, Barbulescu, Gaudry, Joux, Thomé
29. • Pairing based cryptography (PBC) over small
characteristics is no longer secure.
• PBC can be used for identity-based encryption, keyword
searchable encryption where traditional public key
cryptography may be unsuitable.
• Currently used mainly in academic circles.
• Improves the Function Field Sieve (FFS) in most cases.
• The Function Field Sieve can currently be used to solve the DLP in small- to medium-characteristic fields.
Implications of Discrete Log Progress
31. • Function Field Sieve has Four Steps
• Choose a Polynomial
• Relation Filtering
• Linear Algebra
• The Descent
• In the last 6 months, all of them have been improved
• Makes it more likely that something can be applied to a problem we care about
• Joux's record-setting calculation, in May, took 550 hours
• Factoring a 512-bit RSA key takes 652 hours
Function Field Sieve
32. • Joux has attacked fields of a small characteristic
• We use fields of a large characteristic
• Joux’s…
• Polynomial choice probably would not help
• Sieving Improvements may help
• Descent Algorithm needs tweaking, but definitely helps
• Renewed interest could result in further improvements.
Attacking DH, DSA, ElGamal
33. • Factoring advances tend to lead to advances in Discrete Log
• Discrete Log advances tend to lead to advances in Factoring
• Degrees of difficulty of both problems are closely linked.
Attacking RSA
34. • 1975 Pollard's Rho in Factoring -> 1978 Pollard's Rho in
Discrete Log.
• 1984 Quadratic Sieve Factoring -> 1987 improvements in
Discrete Log Index Calculus Algorithms.
• 1993/4 Discrete Log Number & Function Field Sieves -> 1994
General Number Field Sieve for Factoring.
Mutual Advances over the years
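To give a sense of how simple the 1975 starting point was, here is a sketch of Pollard's rho factoring method on a toy modulus; the 1978 discrete-log variant applies the same random-walk idea inside a group.

import math, random

def pollard_rho(n):
    # Pollard's rho: return a non-trivial factor of composite n.
    if n % 2 == 0:
        return 2
    while True:
        c = random.randrange(1, n)
        x = y = random.randrange(2, n)
        d = 1
        while d == 1:
            x = (x * x + c) % n                 # tortoise: one step
            y = (y * y + c) % n
            y = (y * y + c) % n                 # hare: two steps
            d = math.gcd(abs(x - y), n)
        if d != n:                              # the collision revealed a factor
            return d

n = 8051                                        # 83 * 97
f = pollard_rho(n)
assert n % f == 0 and 1 < f < n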
35. Factoring
1. Polynomial Selection
2. Sieving
3. Linear Algebra
4. Square Root
Discrete Logs
1. Polynomial Selection
2. Sieving
3. Linear Algebra
4. The Descent
Factoring vs Discrete Logs
36. Factoring
1. Polynomial Selection – not that slow
2. Sieving
3. Linear Algebra
4. Square Root
Discrete Logs
1. Polynomial Selection – constant time
2. Sieving
3. Linear Algebra
4. The Descent
Factoring vs Discrete Logs
37. Factoring
1. Polynomial Selection – not that slow
2. Sieving – easy to parallelize
3. Linear Algebra
4. Square Root
Discrete Logs
1. Polynomial Selection – constant time
2. Sieving – easy to parallelize
3. Linear Algebra
4. The Descent
Factoring vs Discrete Logs
38. Factoring
1. Polynomial Selection – not that slow
2. Sieving – easy to parallelize
3. Linear Algebra – slow & difficult to parallelize
4. Square Root
Discrete Logs
1. Polynomial Selection – constant time
2. Sieving – easy to parallelize
3. Linear Algebra – slow & difficult to parallelize
4. The Descent
Factoring vs Discrete Logs
39. Factoring
1. Polynomial Selection – not that slow
2. Sieving – easy to parallelize
3. Linear Algebra – slow & difficult to parallelize
4. Square Root – very fast
Discrete Logs
1. Polynomial Selection – constant time
2. Sieving – easy to parallelize
3. Linear Algebra – slow & difficult to parallelize
4. The Descent – very slow
Factoring vs Discrete Logs
40. • No obvious technique right now from Joux’s improved
discrete logarithm algorithm that applies directly to
factoring.
• But I’m not a mathematician, I just play one on stage – I
wouldn’t bet the farm on that
• Public colloquia and publications seem to indicate that NSA/NIST may already be very concerned.
Attacking RSA
42. • ECC is still standing - still requires exponential time
algorithms
• If Joux or others hit upon a general-purpose discrete logarithm algorithm as fast as his special-purpose one...
• Diffie-Hellman, DSA, and El-Gamal are toast
• If that leaps to factoring - RSA is toast
• Technically not dead, but…
• RSA key sizes may have to go up to 16,384 bits
• Wildly impractical for actual use, never mind that nothing supports key sizes that large
Implications
44. • Widespread active and passive attacks against live and recorded TLS
• PFS is not necessarily a panacea
• Failure of code-signing and update mechanisms
• How do you fix your software?
• Failure of PGP, S/MIME and most end-to-end encryption
• Almost total failure of trust in the Internet
What Happens If DH or RSA Fails Now?
45. • We need to move to ECC, rather quickly
• Alex says that ECC is perfectly secure, YAY!
• Not really
• <30 years of research (versus ~400 for factoring):
• Uses some of the same ideas
• Right now it’s all we have
• Long-term, we need more research into alternatives
• RSA was 1977, RC4 was 1987. Give Rivest a break.
So, what now?
46. • Lots of push from academia and government into ECC
• DH/RSA are here and they are easily understood
• Legal risks have slowed ECC adoption
• ECC had compatibility problems, but NIST has specified
15 standard curves
Why has ECC uptake been so slow?
47. • In 2005, the NSA released the Suite B set of
interoperable standards
• Suite B specifies:
• The encryption algorithm (AES-256)
• The key exchange algorithm (Elliptic Curve DH)
• The digital signature algorithm (Elliptic Curve DSA)
• The hashing algorithms (SHA-256 and SHA-384)
Hmm, what’s missing?
Overview of Suite B
48. • The patent issue for elliptic curve cryptosystems is the
opposite of that for RSA and Diffie-Hellman.
• RSA and Diffie-Hellman had patents for the cryptosystems but
not the implementation.
• Several important ECC patents owned by Certicom
(Blackberry)
• Efficient GF(2^n) multiplication in normal basis representation.
• Technique of validating key exchange messages to prevent a
man-in-the-middle attack.
• Technique for compressing elliptic curve point representations.
ECC Patents
49. • NSA purchased from Certicom (now Blackberry) a license that
covers all of their intellectual property in a restricted field of use.
• License is limited to implementations that were for national
security uses and certified under FIPS 140-2 or were approved by
NSA.
• Commercial vendors may receive a license from NSA provided
their products fit within the field of use of NSA’s license.
• Commercial vendors may contact Blackberry for a license for the
same 26 patents.
ECC and Suite B
51. Table: Windows and OSX ECC Support
ECC Support on Operating Systems
OS      | Library     | ECDH | ECDSA | Others | Version
OSX/iOS | ssl-36800   | Yes  | Yes   | None   | 10.6
OSX/iOS | smime-36873 | Yes  | Yes   | None   | 10.6
Windows | CNG         | Yes  | Yes   | None   | Vista
Windows | TLS         | Yes  | Yes   | None   | Vista
Windows | Suite B     | Yes  | Yes   | None   | Vista SP1, Windows 7
52. Android ECC Support
ECC Support on Android
OS      | Library       | ECDH | ECDSA | Others | Version
Android | Bouncy Castle | Yes  | Yes   | None   | 4.0
Android | TLS           | Yes  | Yes   | None   | 3.2.4
Android | CyaSSL        | Yes  | Yes   | None   | 2.4.6
Android | NSS           | Yes  | Yes   | NTRU   | 3.11
53. Programming Languages ECC Support
ECC Support in Programming Languages
Language | Library       | ECDH | ECDSA | Others             | Version
Python   | PyECC         | Yes  | Yes   | ECIES              | 2.4
C        | OpenSSL       | Yes  | Yes   | None               | 3.2.4
Java SE6 | Bouncy Castle | Yes  | Yes   | None               | Java 6
Java SE7 | Native        | Yes  | Yes   | ECIES, ECDSA, ECHR | Java 7
Ruby     | OpenSSL       | Yes  | Yes   | None               | 1.8
54. • Windows Code Signing
• Default is RSA
• ECC is supported through CSPs but not default
• Android Code Signing
• Both DSA and RSA are currently supported.
• iOS Code Signing
• Uses CMS
• Supports ECDH and ECDSA.
Code Signing
55. • TLSv1.2 is the first to include ECC options
• Only TLS_RSA_WITH_AES_128_CBC_SHA is
required
• Before TLS 1.2, CA and Cert had to match.
• With 1.2 you can cross-sign
• Can use DH_DSS, DH_RSA, ECDH_ECDSA, and
ECDH_RSA with either ECC or RSA
• TLS 1.1 supports ECDH(E) for PFS
Transport Encryption
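One rough way to survey an endpoint (a sketch using Python's standard ssl module; "example.com" is just a placeholder) is to offer only ECDHE cipher suites, so the TLS 1.2 handshake succeeds only if the server can do elliptic-curve key exchange:

import socket, ssl

ctx = ssl.create_default_context()
ctx.maximum_version = ssl.TLSVersion.TLSv1_2   # keep the test to TLS <= 1.2
ctx.set_ciphers("ECDHE")                       # OpenSSL cipher-string alias

host = "example.com"                           # placeholder host
with socket.create_connection((host, 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        print(tls.version(), tls.cipher())     # negotiated protocol and suite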
56. • ECC roots exist, but buying a cert is not so easy
• There would be significant work required in the transition from RSA to ECC certificates.
• Thawte Root Certificate – Root CA is not used today. Intended for use in the future for SSL certificates.
• Verisign/Symantec Root Certificate – ECC root certificate for 5 years; just begun offering commercial certificates this year.
• Entrust ECC Certificate – No global root certificate currently available today. Will use a Public ECC-256 Root.
• Comodo – 384-bit ECC Root certificate.
PKI Infrastructure
57. • Current Root KSK generated in 2010
(algorithm 8)
• Standard specifies rotated “when
necessary” or at five years
• IANA, Verisign, and ICANN SSAC are looking at options
• ECC being considered
• Helps with zone file size
• Interestingly enough, check out .ru
DNSSEC
58. • BlackBerry uses ECC extensively
• OpenVPN uses OpenSSL, which includes ECC support, but it doesn't seem to work
• IPSEC – Cisco, Shiva, and Nortel gateways support ECDH IKE.
• OpenSSH has ECC support, but it is not the default.
Other Popular Applications
60. • Make ECC easy to use
• See NaCl’s box() and unbox()
• Update documentation to push developers away from
RSA
• Get aggressive about compatibility testing
• Eat your own dogfood
If you are a… OS or language vendor
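As a sketch of what "easy to use" can look like, here is the two-call pattern via the PyNaCl bindings to NaCl (assuming PyNaCl is installed; NaCl's C API exposes this as crypto_box / crypto_box_open):

from nacl.public import PrivateKey, Box

# Curve25519-based authenticated encryption in a couple of lines.
alice = PrivateKey.generate()
bob = PrivateKey.generate()

ciphertext = Box(alice, bob.public_key).encrypt(b"attack at dawn")
plaintext = Box(bob, alice.public_key).decrypt(ciphertext)
assert plaintext == b"attack at dawn"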
61. • TLS 1.2 needs to be a P1 feature
• Only IE 11 and Chrome 29 support it (both pre-release)
• Push at CA/B Forum for standardized process for cross-
signed certificates
If you are… a browser vendor
62. • You need to support TLS 1.2 on endpoints
• Build systems with pluggable primitives
• Versioning
• Handshake and negotiation
• If this sounds too hard, use TLS 1.2
• Use ECC for any new cryptosystems
• Retrofit old mechanisms using wrapping
• ECC signed binary inside of legacy RSA signature
If you are a… software maker
63. • Make it easy to buy an ECC cert
• Change documentation to include ECC CSR instructions
• The CA/Browser Forum should promulgate standards
pushing this
If you are a… Certificate Authority
64. • Make the world a safer place…
• License the ECC patents
openly to any implementation
of Suite B, regardless of use
If you are… BlackBerry
65. • Use ECC certificates where possible
• Bug vendors for TLS 1.2 and ECC support
• Turn on ECDHE PFS today!
• Survey your exposure, so when the cryptopocalypse comes
you are like this guy:
If you are… just a normal company
66. • Current cryptosystems depend on the discrete logarithm and factoring problems, which have seen some major new developments in the past 6 months.
• We need to move to stronger cryptosystems that
leverage more difficult mathematical problems such as
ECC.
• There is a huge amount of work to be done, so please
get started now.
Summary
Asymmetric cryptography is an essential part of all modern cryptosystems. It has allowed us to move from the old Enigma machines used by WWII cryptographers to TLS, which is used to secure communication over the Internet. I am sure that we have all used TLS, and we have asymmetric cryptography to thank for this.
Asymmetric cryptography relies on certain information being computationally hard to compute without a secret. Asymmetric cryptosystems contain a public component which can be known by everyone, including an adversary. However, it must not be possible to compute the private or secret key from this information; that would completely break the cryptosystem. These mathematical functions are currently computationally difficult but not provably hard. An efficient algorithm may exist and simply has not been found. Our cryptosystems rely on that efficient algorithm not being discovered. We will take a closer look at some of these mathematical functions now.
Both Diffie-Hellman and RSA were first published in the late 1970s and are used in almost all of today's cryptosystems. They are used for a variety of purposes such as secure key exchange, encryption, and signing. Elliptic Curve Cryptography was first published in the 1980s; there has been significant academic interest in it but limited use in industry. ECC is performed over a specified curve, unlike Diffie-Hellman and RSA, which are performed over the integers. In recent years, the NSA published the Suite B recommendations and the Russians declassified the GOST recommendation, both of which recommend the use of elliptic curve cryptography. Like Diffie-Hellman and RSA, ECC can be used for key exchange, signing, and encryption.
Now we will take a closer look at the Diffie Hellman key exchange protocol. This was first published by Diffie and Hellman in 1976. As some of you may know, Diffie Hellman allows one to establish a shared secret by exchanging data over a public untrusted network. This shared secret can then be used in a symmetric cryptosystem. The security of the Diffie Hellman key exchange completely relies on the computational hardness of the discrete logarithm problem.
How does one break Diffie Hellman? Simple: you solve the discrete logarithm problem. Suppose you have h = g^x. The discrete logarithm problem is to find the element x when only g and h are known. This seemingly simple problem is the basis of the Diffie Hellman key exchange protocol. To reiterate, an efficient discrete logarithm algorithm will completely break DH. Also, since both El-Gamal and DSA rely on slight modifications of the DLP, an efficient generic DL algorithm will break them as well.
The first phase in RSA is to compute an RSA modulus from 2 large primes. From that RSA modulus, a public exponent (e) and a private exponent (d) are computed that satisfy a particular mathematical relation. The public key is used to encrypt any message sent to the receiver. This is done by raising the message to the recipient's public exponent e. The recipient of the message then raises the ciphertext to their private exponent d. As with Diffie Hellman, the security of RSA relies on a mathematical problem. In this case the mathematical problem is factoring.
We attack RSA by attacking the underlying mathematical function, which in this case is factoring. Factoring, as we remember from grade school mathematics, is a seemingly simple task, e.g. 35 = 5 * 7. This is true for small numbers at least; however, there currently exists no efficient algorithm to factor an arbitrary number. Factoring an RSA modulus would allow us to compute the two constituent primes of that modulus, and with the user's public key we would then be able to compute the user's private key. We can simply use the same mathematical relation which was used to generate it in the first place. To reiterate, an efficient factoring algorithm will completely break RSA.
Now let us switch gears a bit and discuss Elliptic Curve Cryptography. As mentioned earlier, ECC was first published in the 1980s and work has continued in the field over the past 30 years. An elliptic curve E over the real numbers R is defined by a Weierstrass equation; I have shown an example on the slide. This funky looking curve is special and allows us to build secure cryptosystems. Generally either a prime field or a binary field is used, depending on the application.
As with both Diffie Hellman and RSA, ECC also depends on a fundamental mathematical problem. In this case ECC is secure due to the hardness of the elliptic curve discrete logarithm problem (ECDLP). This should not be confused with the discrete logarithm problem we just saw. The underlying mathematical problem is: given two points on the elliptic curve, P and Q, compute the integer d such that Q = dP. In the diagram on the screen I have shown the simple case where d = 2. The key pair (d, Q) can be used for a variety of cryptosystems including signature and encryption/decryption. As with Diffie Hellman and RSA, if an efficient algorithm for solving the underlying mathematical problem is found, then the entire cryptosystem is broken.
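To make Q = dP concrete, here is a toy sketch in Python of curve arithmetic (point addition plus double-and-add scalar multiplication) and a brute-force ECDLP solve. The parameters (p = 17, y^2 = x^3 + 2x + 2, generator (5, 1), group order 19) are a standard textbook toy curve, nowhere near secure.

# Toy elliptic-curve arithmetic over F_p for y^2 = x^3 + a*x + b.
p, a, b = 17, 2, 2
O = None                                   # the point at infinity

def add(P, Q):
    if P is O: return Q
    if Q is O: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:    # P + (-P) = O
        return O
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def mul(d, P):                             # double-and-add: compute dP
    R = O
    while d:
        if d & 1:
            R = add(R, P)
        P = add(P, P)
        d >>= 1
    return R

G = (5, 1)                                 # generator of the 19-point group
Q = mul(13, G)                             # public key for secret d = 13
# Brute-forcing the ECDLP is only possible because the group is tiny.
d = next(k for k in range(1, 20) if mul(k, G) == Q)
assert d == 13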
Now we show the NIST recommended key sizes for symmetric algorithms, Diffie Hellman, and ECC. As can be seen, NIST recommends significantly smaller key sizes for ECC. This is due to the increased computational difficulty of solving the ECDLP as opposed to factoring or the regular DLP. Furthermore, given current research advances, even key sizes in the same row are not computationally equivalent.
Now we will move on to the next section and look at some of the new advances in the academic world, and why we need to be very concerned about this new research progress.
There are two types of discrete logarithm algorithms, namely generic algorithms and specific algorithms. Generic algorithms work with a divide and conquer approach by breaking up groups into smaller groups. They are very slow and take exponential time. We will discuss the complexity of algorithms in the next slide. Specific algorithms make use of particular group representations and can be much faster; they work by leveraging certain properties of the group. Examples such as the index calculus algorithm currently result in mainly sub-exponential algorithms.
Now let us take a look at algorithmic complexity and why that matters. Algorithmic complexity is simply how fast a given algorithm runs. The discrete logarithm and factoring literature generally uses L-notation to indicate complexity: L(0) indicates that an algorithm is polynomial, while L(1) is a fully exponential algorithm; anything in between is sub-exponential. On the linear running time plot we can see that the exponential time algorithm dominates all other algorithms, which can barely be seen on the x-axis. The difference in running time can be seen more clearly in the logarithmic plot, where the running time of the exponential algorithm continues to grow while for polynomial time it plateaus.
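For reference, the L-notation used here is conventionally defined, for an input of size N, as

\[ L_N[\alpha, c] = \exp\bigl( (c + o(1)) \, (\ln N)^{\alpha} \, (\ln \ln N)^{1 - \alpha} \bigr), \]

so that \alpha = 0 is polynomial in ln N, \alpha = 1 is fully exponential, and values strictly between 0 and 1 are sub-exponential.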
Sporadic progress in DL research for 30+ years. 1979: Adleman published a sub-exponential L(1/2) algorithm to solve the DLP. Note that this is not half the running time of an L(1) algorithm. The fastest ECDLP algorithm has been fully exponential for the last 30 years, and while there has been progress at the margins there have been no major breakthroughs.
A few years later there was a further improvement in academia when an L(1/3) algorithm was published. 1984: Odlyzko published an L(1/3) algorithm to solve the DLP in finite fields. And then there was little progress in the algorithmic complexity over the past 30 years: there were some improvements at the margin to the constants, but there was no substantial progress until this year.
Then suddenly, in February 2013, a paper was released by Antoine Joux in which he published an L(1/4) algorithm for discrete logarithms. Note this is not a generic algorithm and only applies to cases with certain restrictions on the types of group.
And then within a few months Antoine Joux and other researchers improved this algorithm to be quasi-polynomial, L(0). Again, this only applies to discrete logarithms with particular properties.
As we just saw, there has been rapid progress in the discrete logarithm field in the past 6 months, although these algorithms are currently limited to certain circumstances, namely small-characteristic fields. Now for the math nerds: the characteristic of a field is the number of copies of the multiplicative identity that must be summed to obtain the additive identity of the field. Generally, practical cryptosystems use large-characteristic fields. These recent developments will bring more attention to the discrete logarithm problem, and this will spur researchers into looking more closely at the problem, most likely resulting in even further improvements in the near future.
Let us now take a brief look at some of the mathematics used in the new discrete logarithm algorithm. The main thing to note is that no new fundamental mathematical technique was required; this did not require the invention of a new branch of mathematics. Instead he used several mathematical tricks to speed up the running time of the algorithm significantly. It is remarkable that such techniques were not seen earlier by any previous researchers in the area. Some of these techniques include a clever change of variables, a specific polynomial to simplify the computation, and a new descent algorithm to express arbitrary elements in the finite field. This resulted in a discrete logarithm algorithm that is much faster than anything published earlier.
And then, given these insights, in less than 6 months other researchers including Joux were able to improve this algorithm even further with some more special mathematics. They used special matrix properties which sped up the slowest step and resulted in a quasi-polynomial algorithm for discrete logarithms in certain circumstances. This is a big deal, since there was marginal progress for 25+ years but then in 6 months there has been significant progress in discrete logarithm research. Note that there is no obvious jump to more practical implementations yet. However, with the renewed interest in the field in academia we could see much more progress in the immediate future.
One of the current implications of this discrete logarithm research is that pairing-based cryptography, which is used mainly in academic circles, is no longer secure when done over small characteristics. There are limited practical implementations of pairing-based cryptography, though there is a pairing-based crypto library maintained by the Stanford cryptography group. Also, the function field sieve, which will be discussed in more detail by Tom Ritter in the next section, is improved by these new developments. The function field sieve is used mainly for small- to medium-characteristic fields.
And now I'll pass it to Tom Ritter, who will discuss how this may apply to factoring. All right, so that's a lot of math, let's talk about how this impacts or doesn't impact the _algorithms_ we use today, before we talk about how it impacts the _applications_ we use today.
The Function Field Sieve, which is what's used for solving Discrete Logs, has four steps. In the last six months, all of them have been improved. That means it's way more likely that something will be applicable to an algorithm we care about. And it's worthwhile to note that the computation times that people are setting records with are not supercomputer-worthy: less than a month on a single core. The 652-hour figure breaks down as roughly 460 hours for sieving and 8 core-days (192 hours) for linear algebra.
So Joux has attacked fields of a small characteristic. But we use fields of a large characteristic in Diffie Hellman, DSA, and ElGamal. Joux's specific improvements are hit or miss on applying to these types of fields. The polynomial selection probably doesn't, the sieving might, and the descent algorithm needs some tweaking and further work – but it will definitely lead to improvements. And of course, the simple fact that everyone in the Academic Community is really excited about this stuff means we'll probably see even more improvements down the pipe.
So what about RSA? Everybody uses RSA, and we use it everywhere. And traditionally, Discrete Logs and Factoring have been very closely linked. When we improve one, we tend to improve the other in short order.
And we’ve seen this over the years. The dates that Javed threw out – those have seen advances right next to each other on the other algorithm. In the 70s, in the 80s, and in the 90s. And while I hate to think we’re going to call this decade the ‘10s, we’ll probably see a reflective paper nonetheless.
But WHY are these two algorithms so closely related? Well, they have about the same steps. They both select a polynomial, sieve for relations, perform a big linear algebra step, and then solve for the specific number you want to factor or compute the discrete log of. So if that's how they're similar, let me explain how they're different.
In Factoring, the polynomial selection takes some time, but it's not that slow. In Discrete Logs, Joux has chosen his polynomial as a constant, based on the type of group he's working in.
The Relationship Sieving in both takes time – but it’s trivial to parallelize. In the era of EC2 and Google Compute Engine – any problem that’s embarrassingly parallel and doesn’t require the energy output of the sun tends to just have cores thrown at it.
Now the Linear Algebra for both of them is slow, and difficult to parallelize. It requires a lot of memory, and a lot of memory bandwidth, plus a lot of CPU time. It's also harder for Discrete Logs than it is for Factoring.
And the most notable difference is that the last step is way more difficult for Discrete Logs. The Descent is extremely painful for Discrete Logs, but the analogous step, the Square Root, takes minutes for Factoring. So they're very closely related, but they're not exactly homogeneous between the two.
So coming back to RSA – there's no obvious technique from Joux's work to directly apply to the General Number Field Sieve, and factoring RSA public keys. That said, if there's even a 5% chance, that's basically a 5% chance to throw every single Certificate Authority, every single SSL session, every single software update mechanism into complete and utter disarray. And based on NIST's publications and colloquia it seems like they're concerned about this too.
And it's worthwhile to note that running the General Number Field Sieve, and factoring 512-bit, and even 768-bit, RSA keys is within the grasp of you, the audience members. The software used to do it is public and open source, and there are tutorials on how to factor 512-bit keys in under 30 hours.
So right now, ECC is in pretty good shape. But we have to keep in mind that ECC has been around and studied for 30 years, while RSA and DH, or more importantly factoring and discrete logs, have been studied for hundreds. If Joux or others hit upon a general-purpose discrete log algorithm, Diffie Hellman and the other algorithms we rely on are toast. And if it leaps to factoring, RSA will be toast too. And to give you an idea of what I mean when I say toast: key sizes might have to go from 2,048 to 16,384 bits. Besides being wildly impractical for any actual use because it's way too slow, there's, like, no software that supports key sizes that large. So let me hand it over to Alex to talk about how screwed we all are.