Sampling Spectrahedra: Volume Approximation and Optimization - Apostolos Chalkis
My talk to SIAM Conference on Applied Algebraic Geometry (AG21) on volume approximation of spectrahedra and convex optimization with randomized methods based on MCMC sampling with geometric random walks
Practical volume estimation of polytopes by billiard trajectories and a new a... - Apostolos Chalkis
A new randomized method to approximate the volume of a convex polytope based on simulated annealing for cooling convex bodies and MCMC sampling with geometric random walks
In general dimension, there is no known total polynomial algorithm for either convex hull or vertex enumeration, i.e. an algorithm whose complexity depends polynomially on the input and output sizes. It is thus important to identify problems (and polytope representations) for which total polynomial-time algorithms can be obtained. We offer the first total polynomial-time algorithm for computing the edge-skeleton (including vertex enumeration) of a polytope given by an optimization or separation oracle, where we are also given a superset of its edge directions. We also offer a space-efficient variant of our algorithm by employing reverse search. All complexity bounds refer to the (oracle) Turing machine model. There are a number of polytope classes naturally defined by oracles; for some of them neither vertex nor facet representation is obvious. We consider two main applications, where we obtain (weakly) total polynomial-time algorithms: signed Minkowski sums of convex polytopes, where polytopes can be subtracted provided the signed sum is a convex polytope, and computation of secondary, resultant, and discriminant polytopes. Further applications include convex combinatorial optimization and convex integer programming, where we offer a new approach, thus removing the complexity's exponential dependence on the dimension.
We experimentally study the fundamental problem of computing the volume of a convex polytope given as an intersection of linear inequalities. We implement and evaluate practical randomized algorithms for accurately approximating the polytope’s volume in high dimensions (e.g. one hundred). To carry this out efficiently we experimentally correlate the effect of parameters, such as random walk length and number of sample points, on accuracy and runtime. Moreover, we exploit the problem’s geometry by implementing an iterative rounding procedure, computing partial generations of random points and designing fast polytope boundary oracles. Our publicly available code is significantly faster than exact computation and more accurate than existing approximation methods. We provide volume approximations for the Birkhoff polytopes B11, ..., B15, whereas exact methods have only computed that of B10.
Code of the multidimensional fractional pseudo-Newton method using recursive ... - mathsjournal
The following paper presents one way to define and classify the fractional pseudo-Newton method through a group of fractional matrix operators, as well as a code written in recursive programming to implement this method, which through minor modifications, can be implemented in any fractional fixed-point method that allows solving nonlinear algebraic equation systems.
Bagging Exponential Smoothing procedures have recently arisen as an innovative way to improve forecast accuracy. The idea is to use the Bootstrap to generate multiple versions of the time series and, subsequently, apply an Exponential Smoothing (ETS) method to produce forecasts for each of them. The final result is obtained by aggregating the forecasts. The main drawback of existing procedures is that Bagging itself does not avoid generating highly correlated ensembles that might affect the forecast error. In this paper we propose and evaluate procedures that try to enhance existing Bagging Exponential Smoothing methods by the addition of a clustering phase. The general idea is to generate Bootstrapped versions of the series and use clustering to select series that are less similar to each other. The expectation is that this would reduce the covariance and, consequently, the forecast error. Since there are several clustering algorithms and dissimilarity measures, we consider some of them in the study. The proposed procedures were evaluated on monthly, quarterly and yearly data from the M3-competition. The results were quite promising, indicating that the introduction of a clustering phase in Bagging Exponential Smoothing procedures can reduce the forecast error.
RuleML 2015: Constraint Handling Rules - What Else? - RuleML
Constraint Handling Rules (CHR) is both a versatile theoretical formalism based on logic and an efficient practical high-level programming language based on rules and constraints.
Procedural knowledge is often expressed by if-then rules; events and actions are related by reaction rules; change is expressed by update rules. Algorithms are often specified using inference rules, rewrite rules, transition rules, sequents, proof rules, or logical axioms. All these kinds of rules can be written directly in CHR. The clean logical semantics of CHR facilitates non-trivial program analysis and transformation. About a dozen implementations of CHR exist in Prolog, Haskell, Java, Javascript and C. Some of them can apply millions of rules per second. CHR is also available as WebCHR for online experimentation with more than 40 example programs. More than 200 academic and industrial projects worldwide use CHR, and about 2000 research papers reference it.
Optimal order a posteriori error bounds for semilinear parabolic equations are derived by using a discontinuous Galerkin method in time and continuous (conforming) finite elements in space. Our main tools in this analysis are the reconstruction in time and the elliptic reconstruction in space, combined with Gronwall’s lemma and a continuation argument.
Slides for the paper titled "Towards Mapping Analysis in Ontology-Based Data Access" as presented at the 8th International Conference On Web Reasoning And Rule Systems in Athens, September 15th of 2014.
Computing the volume of a convex body is a fundamental problem in computational geometry and optimization. In this talk we discuss the computational complexity of this problem from a theoretical as well as a practical point of view. We show examples of how volume computation appears in applications ranging from combinatorics to algebraic geometry.
Next, we design the first practical algorithm for polytope volume approximation in high dimensions (a few hundred).
The algorithm utilizes uniform sampling from a convex region and efficient polytope boundary oracles.
Interestingly, our software provides a framework for exploring theoretical advances since it is believed, and our experiments provide evidence for this belief, that the current asymptotic bounds are unrealistically high.
A new practical algorithm for volume estimation using annealing of convex bodies - Vissarion Fisikopoulos
We study the problem of estimating the volume of convex polytopes, focusing on H- and V-polytopes, as well as zonotopes. Although a lot of effort is devoted to practical algorithms for H-polytopes there is no such method for the latter two representations. We propose a new, practical algorithm for all representations, which is faster than existing methods. It relies on Hit-and-Run sampling, and combines a new simulated annealing method with the Multiphase Monte Carlo (MMC) approach. Our method introduces the following key features to make it adaptive: (a) It defines a sequence of convex bodies in MMC by introducing a new annealing schedule, whose length is shorter than in previous methods with high probability, and the need of computing an enclosing and an inscribed ball is removed; (b) It exploits statistical properties in rejection-sampling and proposes a better empirical convergence criterion for specifying each step; (c) For zonotopes, it may use a sequence of convex bodies for MMC different than balls, where the chosen body adapts to the input. We offer an open-source, optimized C++ implementation, and analyze its performance to show that it outperforms state-of-the-art software for H-polytopes by Cousins-Vempala (2016) and Emiris-Fisikopoulos (2018), while it undertakes volume computations that were intractable until now, as it is the first polynomial-time, practical method for V-polytopes and zonotopes that scales to high dimensions (currently 100). We further focus on zonotopes, and characterize them by their order (number of generators over dimension), because this largely determines sampling complexity. We analyze a related application, where we evaluate methods of zonotope approximation in engineering.
Large Language Models (in 2023) - OpenAI - SamuelButler15
1. Emergent Abilities with Scale: The presentation underscores the significance of viewing the development of language models with a perspective of “yet”, highlighting that many ideas may not work now but could become viable as models scale. This perspective challenges traditional scientific experimentation by suggesting that axioms in the field of language models are subject to change with advancements in model capabilities.
2. The Need for Constant Unlearning: As models scale, previously held intuitions may become outdated. The presentation discusses the necessity for researchers to unlearn invalidated ideas, noting that newcomers may sometimes have an advantage due to having fewer entrenched misconceptions.
3. Scaling Challenges and Techniques: The presentation elaborates on the technical challenges and complexities involved in scaling LLMs, using examples from training processes that include unexpected loss spikes. It also touches upon the importance of documenting experiments that fail due to insufficient model “intelligence” and retesting them as models evolve.
4. Instruction Fine-Tuning and RLHF: The presentation discusses instruction fine-tuning as a method to improve model performance across a wide range of tasks by framing tasks in natural language. However, it also points out the limitations of instruction fine-tuning and the potential of reinforcement learning from human feedback (RLHF) to address some of these challenges by learning the objective function.
5. Technical Insights on Transformer Models: Detailed technical insights into the functioning of Transformer models are provided, including tokenization, embedding, and the sequential processing that underpins these models’ ability to understand and generate language.
6. Scaling Infrastructure: The presentation gives an overview of the infrastructure considerations for scaling LLMs, including the use of tensor processing units (TPUs) and the role of software tools like JAX for parallelizing model training across multiple hardware units.
7. The Bitter Lesson and Future Directions: Reiterating “the bitter lesson” in AI research—that progress often comes from scalable general methods rather than specialized approaches—the presentation hints at ongoing and future directions in LLM research, emphasizing scalability, the reduction of inductive biases, and the exploration of novel training paradigms.
Utilocate offers a comprehensive solution for locate ticket management by automating and streamlining the entire process. By integrating with Geospatial Information Systems (GIS), it provides accurate mapping and visualization of utility locations, enhancing decision-making and reducing the risk of errors. The system's advanced data analytics tools help identify trends, predict potential issues, and optimize resource allocation, making the locate ticket management process smarter and more efficient. Additionally, automated ticket management ensures consistency and reduces human error, while real-time notifications keep all relevant personnel informed and ready to respond promptly.
The system's ability to streamline workflows and automate ticket routing significantly reduces the time taken to process each ticket, making the process faster and more efficient. Mobile access allows field technicians to update ticket information on the go, ensuring that the latest information is always available and accelerating the locate process. Overall, Utilocate not only enhances the efficiency and accuracy of locate ticket management but also improves safety by minimizing the risk of utility damage through precise and timely locates.
Climate Science Flows: Enabling Petabyte-Scale Climate Analysis with the Eart... - Globus
The Earth System Grid Federation (ESGF) is a global network of data servers that archives and distributes the planet’s largest collection of Earth system model output for thousands of climate and environmental scientists worldwide. Many of these petabyte-scale data archives are located in proximity to large high-performance computing (HPC) or cloud computing resources, but the primary workflow for data users consists of transferring data and applying computations on a different system. As a part of the ESGF 2.0 US project (funded by the United States Department of Energy Office of Science), we developed pre-defined data workflows, which can be run on demand, capable of applying many data reduction and data analysis operations to the large ESGF data archives, transferring only the resultant analysis (e.g. visualizations, smaller data files). In this talk, we will showcase a few of these workflows, highlighting how Globus Flows can be used for petabyte-scale climate analysis.
First Steps with Globus Compute Multi-User Endpoints - Globus
In this presentation we will share our experiences around getting started with the Globus Compute multi-user endpoint. Working with the Pharmacology group at the University of Auckland, we have previously written an application using Globus Compute that can offload computationally expensive steps in the researcher's workflows, which they wish to manage from their familiar Windows environments, onto the NeSI (New Zealand eScience Infrastructure) cluster. Some of the challenges we have encountered were that each researcher had to set up and manage their own single-user globus compute endpoint and that the workloads had varying resource requirements (CPUs, memory and wall time) between different runs. We hope that the multi-user endpoint will help to address these challenges and share an update on our progress here.
Understanding Globus Data Transfers with NetSage - Globus
NetSage is an open, privacy-aware network measurement, analysis, and visualization service designed to help end-users visualize and reason about large data transfers. NetSage traditionally has used a combination of passive measurements, including SNMP and flow data, as well as active measurements, mainly perfSONAR, to provide longitudinal network performance data visualization. It has been deployed by dozens of networks worldwide, and is supported domestically by the Engagement and Performance Operations Center (EPOC), NSF #2328479. We have recently expanded the NetSage data sources to include logs for Globus data transfers, following the same privacy-preserving approach as for flow data. Using the logs for the Texas Advanced Computing Center (TACC) as an example, this talk will walk through several example use cases that NetSage can answer, including: Who is using Globus to share data with my institution, and what kind of performance are they able to achieve? How many transfers has Globus supported for us? Which sites are we sharing the most data with, and how is that changing over time? How is my site using Globus to move data internally, and what kind of performance do we see for those transfers? What percentage of data transfers at my institution used Globus, and how did the overall data transfer performance compare to that of the Globus users?
Check out the webinar slides to learn more about how XfilesPro transforms Salesforce document management by leveraging its world-class applications. For more details, please connect with sales@xfilespro.com
If you want to watch the on-demand webinar, please click here: https://www.xfilespro.com/webinars/salesforce-document-management-2-0-smarter-faster-better/
Software Engineering, Software Consulting, Tech Lead.
Spring Boot, Spring Cloud, Spring Core, Spring JDBC, Spring Security,
Spring Transaction, Spring MVC,
Log4j, REST/SOAP WEB-SERVICES.
Prosigns: Transforming Business with Tailored Technology Solutions - Prosigns
Unlocking Business Potential: Tailored Technology Solutions by Prosigns
Discover how Prosigns, a leading technology solutions provider, partners with businesses to drive innovation and success. Our presentation showcases our comprehensive range of services, including custom software development, web and mobile app development, AI & ML solutions, blockchain integration, DevOps services, and Microsoft Dynamics 365 support.
Custom Software Development: Prosigns specializes in creating bespoke software solutions that cater to your unique business needs. Our team of experts works closely with you to understand your requirements and deliver tailor-made software that enhances efficiency and drives growth.
Web and Mobile App Development: From responsive websites to intuitive mobile applications, Prosigns develops cutting-edge solutions that engage users and deliver seamless experiences across devices.
AI & ML Solutions: Harnessing the power of Artificial Intelligence and Machine Learning, Prosigns provides smart solutions that automate processes, provide valuable insights, and drive informed decision-making.
Blockchain Integration: Prosigns offers comprehensive blockchain solutions, including development, integration, and consulting services, enabling businesses to leverage blockchain technology for enhanced security, transparency, and efficiency.
DevOps Services: Prosigns' DevOps services streamline development and operations processes, ensuring faster and more reliable software delivery through automation and continuous integration.
Microsoft Dynamics 365 Support: Prosigns provides comprehensive support and maintenance services for Microsoft Dynamics 365, ensuring your system is always up-to-date, secure, and running smoothly.
Learn how our collaborative approach and dedication to excellence help businesses achieve their goals and stay ahead in today's digital landscape. From concept to deployment, Prosigns is your trusted partner for transforming ideas into reality and unlocking the full potential of your business.
Join us on a journey of innovation and growth. Let's partner for success with Prosigns.
Gamify Your Mind; The Secret Sauce to Delivering Success, Continuously Improv... - Shahin Sheidaei
Games are powerful teaching tools, fostering hands-on engagement and fun. But they require careful consideration to succeed. Join me to explore factors in running and selecting games, ensuring they serve as effective teaching tools. Learn to maintain focus on learning objectives while playing, and how to measure the ROI of gaming in education. Discover strategies for pitching gaming to leadership. This session offers insights, tips, and examples for coaches, team leads, and enterprise leaders seeking to teach from simple to complex concepts.
GraphSummit Paris - The art of the possible with Graph Technology - Neo4j
Sudhir Hasbe, Chief Product Officer, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
AI Pilot Review: The World’s First Virtual Assistant Marketing Suite - Google
👉👉 Click Here To Get More Info 👇👇
https://sumonreview.com/ai-pilot-review/
AI Pilot Review: Key Features
✅Deploy AI expert bots in Any Niche With Just A Click
✅With one keyword, generate complete funnels, websites, landing pages, and more.
✅More than 85 AI features are included in the AI pilot.
✅No setup or configuration; use your voice (like Siri) to do whatever you want.
✅You Can Use AI Pilot To Create your version of AI Pilot And Charge People For It…
✅ZERO Manual Work With AI Pilot. Never write, Design, Or Code Again.
✅ZERO Limits On Features Or Usages
✅Use Our AI-powered Traffic To Get Hundreds Of Customers
✅No Complicated Setup: Get Up And Running In 2 Minutes
✅99.99% Up-Time Guaranteed
✅30 Days Money-Back Guarantee
✅ZERO Upfront Cost
See My Other Reviews Article:
(1) TubeTrivia AI Review: https://sumonreview.com/tubetrivia-ai-review
(2) SocioWave Review: https://sumonreview.com/sociowave-review
(3) AI Partner & Profit Review: https://sumonreview.com/ai-partner-profit-review
(4) AI Ebook Suite Review: https://sumonreview.com/ai-ebook-suite-review
An Enterprise Resource Planning System includes various modules that reduce any business's workload. Additionally, it organizes workflows, which enhances productivity. Here is a detailed explanation of the ERP modules. Going through the points will help you understand how the software is changing work dynamics.
To know more details here: https://blogs.nyggs.com/nyggs/enterprise-resource-planning-erp-system-modules/
We describe the deployment and use of Globus Compute for remote computation. This content is aimed at researchers who wish to compute on remote resources using a unified programming interface, as well as system administrators who will deploy and operate Globus Compute services on their research computing infrastructure.
Globus Connect Server Deep Dive - GlobusWorld 2024 - Globus
We explore the Globus Connect Server (GCS) architecture and experiment with advanced configuration options and use cases. This content is targeted at system administrators who are familiar with GCS and currently operate—or are planning to operate—broader deployments at their institution.
OpenFOAM solver for Helmholtz equation, helmholtzFoam / helmholtzBubbleFoam - takuyayamamoto1800
In these slides, we show a simulation example and how to compile the solver.
The Helmholtz equation can be solved by helmholtzFoam; the Helmholtz equation with uniformly dispersed bubbles can be simulated by helmholtzBubbleFoam.
Top Features to Include in Your Winzo Clone App for Business Growth - rickgrimesss22
Discover the essential features to incorporate in your Winzo clone app to boost business growth, enhance user engagement, and drive revenue. Learn how to create a compelling gaming experience that stands out in the competitive market.
Large Language Models and the End of Programming - Matt Welsh
Talk by Matt Welsh at Craft Conference 2024 on the impact that Large Language Models will have on the future of software development. In this talk, I discuss the ways in which LLMs will impact the software industry, from replacing human software developers with AI, to replacing conventional software with models that perform reasoning, computation, and problem-solving.
1. High-dimensional sampling and volume computation
Apostolos Chalkis
Dept. of Informatics & Telecommunications, University of Athens
GeomScale Project
2. package volesti: sampling and volume estimation
https://github.com/GeomScale/volume_approximation
open source, written in C++, R package in CRAN.
depends on Eigen, LPSolve, Boost.
currently 3 volume algorithms, 4 sampling algorithms.
main contributors: A. Chalkis and V. Fisikopoulos plus several
community contributors.
3. Problems
Given a convex body P in Rn:
compute the volume of P.
sample from a distribution π truncated in P.
minimize a convex function f in P (convex optimization).
compute the integral of a function f over P.
volesti provides randomized approximation algorithms based on
sampling from P with geometric random walks.
4. volume computation of a polytope P
First thoughts
Compute a triangulation of P and sum up the volumes of the
triangles.
Sample from a bounding box, count the number of points in P.
both approaches fail in high dimensions (curse of
dimensionality).
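The failure of naive bounding-box sampling is easy to see numerically. The following pure-Python sketch (an illustration, not part of volesti) estimates the fraction of points drawn uniformly from the box [-1, 1]^n that land inside the inscribed unit ball; the acceptance rate collapses as n grows:

```python
import random

def acceptance_rate(n, trials=20000, seed=1):
    """Fraction of uniform samples from the bounding box [-1, 1]^n
    that land inside the inscribed unit ball."""
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(trials)
        if sum(rng.uniform(-1.0, 1.0) ** 2 for _ in range(n)) <= 1.0
    )
    return hits / trials

for n in (2, 5, 10, 20):
    print(n, acceptance_rate(n))
```

In dimension 2 roughly 78% of the box samples hit the ball (pi/4), but by dimension 20 essentially none do, so the count-based volume estimate becomes useless.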
5. Multiphase Monte Carlo
There are several algorithms based on MMC.
The only alternative in R: package geometry (exact
computations).
An example:
#exact computation with package geometry
P = gen_rand_vpoly(20, 40)
geom_values = geometry::convhulln(P$V, options = 'FA')
QH6082 qhull error (qh_memalloc): insufficient memory
to allocate 1329542232 bytes
#computation with volesti
time2 = system.time({ vol2 = volume(P) })
cat(time2[3], vol2)
14.524 2.68443e-07
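The telescoping idea behind Multiphase Monte Carlo can be sketched on a body of known volume. The toy Python example below (an illustration of the MMC principle only, not volesti's algorithm: it uses a cube as the target body, nested balls as the phases, and naive rejection sampling instead of geometric random walks) writes vol(K) as vol(B(r0)) times a product of ratios of intersection volumes:

```python
import math
import random

def sample_ball(n, r, rng):
    """Uniform point in the n-ball of radius r (Gaussian direction,
    radius distributed as r * u^(1/n))."""
    g = [rng.gauss(0.0, 1.0) for _ in range(n)]
    s = math.sqrt(sum(x * x for x in g))
    rad = r * rng.random() ** (1.0 / n)
    return [rad * x / s for x in g]

def in_cube(p):
    """Membership oracle for the target body K = [-1, 1]^n."""
    return all(abs(x) <= 1.0 for x in p)

def ratio(n, r_lo, r_hi, N, rng):
    """Estimate vol(K ∩ B(r_lo)) / vol(K ∩ B(r_hi)) by rejection
    sampling from the larger intersection."""
    inside = total = 0
    while total < N:
        p = sample_ball(n, r_hi, rng)
        if not in_cube(p):
            continue
        total += 1
        if sum(x * x for x in p) <= r_lo * r_lo:
            inside += 1
    return inside / total

def mmc_volume(n=3, N=20000, seed=7):
    """Telescoping product: vol(K) = vol(B(r0)) * prod(1 / ratio_i),
    radii growing from the inscribed to the circumscribed ball."""
    rng = random.Random(seed)
    radii = [1.0]
    while radii[-1] < math.sqrt(n):
        radii.append(min(radii[-1] * 2.0 ** (1.0 / n), math.sqrt(n)))
    vol = math.pi ** (n / 2.0) / math.gamma(n / 2.0 + 1.0)  # vol(B(1))
    for r_lo, r_hi in zip(radii, radii[1:]):
        vol /= ratio(n, r_lo, r_hi, N, rng)
    return vol

est = mmc_volume()  # the cube [-1, 1]^3 has volume 8
```

Each phase roughly doubles the ball volume, so every ratio stays bounded away from 0, which is what keeps the variance of the product under control.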
6. Convex bodies
H-polytope : P = {x | Ax ≤ b, A ∈ Rm×n, b ∈ Rm}
V-polytope : P is the convex hull of a set of points in Rn
Z-polytope : Minkowski sum of k segments (projection of k-cube)
Spectrahedron : P = {x | A0 + x1A1 + · · · + xnAn ⪰ 0},
where the Ai are symmetric matrices and B ⪰ 0 means B is
positive semidefinite
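Each representation comes with a natural membership oracle, which is all a geometric random walk needs. A minimal Python sketch (illustrative; the spectrahedron test is restricted to 2x2 matrices, where PSD is equivalent to nonnegative trace and determinant):

```python
def in_hpolytope(A, b, x, tol=1e-9):
    """H-polytope membership: is Ax <= b componentwise?"""
    return all(
        sum(aij * xj for aij, xj in zip(row, x)) <= bi + tol
        for row, bi in zip(A, b)
    )

def is_psd_2x2(M, tol=1e-12):
    """A symmetric 2x2 matrix is PSD iff trace >= 0 and det >= 0
    (then both eigenvalues are nonnegative)."""
    (a, b), (_, c) = M
    return a + c >= -tol and a * c - b * b >= -tol

def in_spectrahedron_2x2(A0, As, x):
    """Is A0 + x1*A1 + ... + xn*An positive semidefinite (2x2 case)?"""
    M = [[A0[i][j] + sum(xk * Ak[i][j] for xk, Ak in zip(x, As))
          for j in range(2)] for i in range(2)]
    return is_psd_2x2(M)

# The unit square as an H-polytope: |x1| <= 1, |x2| <= 1
A = [[1, 0], [-1, 0], [0, 1], [0, -1]]
b = [1, 1, 1, 1]

# Spectrahedron {x | diag(1 + x, 1 - x) PSD} = the interval [-1, 1]
A0 = [[1, 0], [0, 1]]
A1 = [[1, 0], [0, -1]]
```

For general matrix sizes a PSD check would use a Cholesky or eigenvalue routine instead of the 2x2 trace/determinant shortcut.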
7. volesti provides three volume algorithms
volesti is the first software that performs volume
computations in thousands of dimensions.
Algorithm and supported representation:
Sequence of Balls [1] : H-polytopes, n ≤ 20
Cooling Gaussians [2] : H-polytopes, n > 20
Cooling Bodies [3] : V- and Z-polytopes
[1] I.Z. Emiris, V. Fisikopoulos, 2014
[2] S. Vempala, B. Cousins, 2014
[3] A. Chalkis, I.Z. Emiris, V. Fisikopoulos, 2019
8. volesti provides four random walks
Ball walk
Random Directions Hit and Run
Coordinate Directions Hit and Run
Billiard walk
Each walk targets the uniform, the Gaussian, or the exponential
distribution.
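Coordinate Directions Hit and Run is the simplest of these walks to sketch: pick a random axis, compute the chord of the body through the current point along that axis, and jump to a uniform point on it. A pure-Python illustration for an H-polytope (not volesti's implementation):

```python
import random

def cdhr_chord(A, b, x, j):
    """Range of t such that x + t*e_j stays inside {y | Ay <= b}."""
    lo, hi = float("-inf"), float("inf")
    for row, bi in zip(A, b):
        slack = bi - sum(aij * xj for aij, xj in zip(row, x))
        a = row[j]
        if a > 1e-12:
            hi = min(hi, slack / a)
        elif a < -1e-12:
            lo = max(lo, slack / a)
    return lo, hi

def cdhr_sample(A, b, x0, walk_length, rng):
    """Coordinate Directions Hit and Run, starting from the interior
    point x0 and taking walk_length steps."""
    x = list(x0)
    for _ in range(walk_length):
        j = rng.randrange(len(x))
        lo, hi = cdhr_chord(A, b, x, j)
        x[j] += rng.uniform(lo, hi)
    return x

rng = random.Random(42)
A = [[1, 0], [-1, 0], [0, 1], [0, -1]]   # the square [-1, 1]^2
b = [1, 1, 1, 1]
samples = [cdhr_sample(A, b, [0.0, 0.0], 30, rng) for _ in range(500)]
```

Every step stays inside the polytope by construction, and for the square the empirical mean of the samples is close to the centroid.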
9. Comparing mixing times
Uniform sampling from the hypercube [−1, 1]^200 and projection
to R3.
Rows: Ball Walk, Coordinate Directions Hit and Run, Random
Directions Hit and Run, Billiard Walk.
Columns: walk length, {1, 50, 100, 150, 200}
10. Rounding a convex body
Example
Let P = [−1, 1] × [−100, 100]. We want to sample uniformly
distributed points from P.
Left: we sample directly from P.
Right: we round the body to obtain a higher-quality sample.
volesti provides three rounding methods.
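The effect of rounding can be sketched with a simple axis-aligned rescaling. This is a simplification (volesti's rounding methods apply general affine transformations); here we only divide each coordinate by its sample standard deviation, which already fixes the elongated box from the example:

```python
import random
import statistics

def round_axis_aligned(points):
    """Rescale each coordinate to unit standard deviation: a
    simplified, axis-aligned stand-in for rounding the body."""
    n = len(points[0])
    scales = [statistics.pstdev([p[j] for p in points]) for j in range(n)]
    rounded = [[p[j] / scales[j] for j in range(n)] for p in points]
    return rounded, scales

rng = random.Random(0)
# The skewed body from the slide: P = [-1, 1] x [-100, 100]
pts = [[rng.uniform(-1, 1), rng.uniform(-100, 100)] for _ in range(2000)]
rounded, scales = round_axis_aligned(pts)
```

After rescaling, both coordinates have unit spread, so a random walk no longer wastes its steps on the short axis.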
11. Application
Multivariate integration
Let I = ∫P f(x)dx.
Sample N uniformly distributed points x1, . . . , xN from P; then
I ≈ vol(P) · (1/N) · (f(x1) + · · · + f(xN)).
Example:
# assuming the dimension n has been set earlier, e.g. n = 10
P = gen_rand_vpoly(n, 2 * n, seed = 127)
f = function(x) {sum(x^2) + (2 * x[1]^2 + x[2] + x[3])}
num_of_points = 5000
points = sample_points(P, random_walk =
list("walk" = "BiW", "walk_length" = 1),
n = num_of_points, seed = 5)
int = 0
for (i in 1:num_of_points) int = int + f(points[, i])
V = volume(P, settings = list("error" = 0.05))
I = (int * V) / num_of_points
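The same estimator is easy to sketch in Python for a body where uniform sampling is direct: here the cube [-1, 1]^n, so vol(P) is known exactly (for a general P one would plug in random-walk samples and a volume estimate, as in the R code above). This is an illustration, not volesti:

```python
import random

def mc_integrate_cube(f, n, N=100000, seed=3):
    """I ≈ vol(P) * (1/N) * sum_i f(x_i), with P = [-1, 1]^n so that
    vol(P) = 2^n is known and uniform samples are direct."""
    rng = random.Random(seed)
    vol = 2.0 ** n
    total = 0.0
    for _ in range(N):
        x = [rng.uniform(-1.0, 1.0) for _ in range(n)]
        total += f(x)
    return vol * total / N

# Exact value of the integral of x^2 + y^2 over [-1, 1]^2 is 8/3
est = mc_integrate_cube(lambda x: x[0] ** 2 + x[1] ** 2, 2)
```

With 100000 samples the estimate lands within a few hundredths of 8/3.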
12. Application
Multivariate integration
Comparison with exact computation: R packages geometry
(triangulation) and SimplicialCubature (integration over
each triangle).
dimension error exact time volesti time
5 0.05129854 0.41 3.095
10 0.08964482 2.945 12.01
15 0.01541695 471.479 33.256
20 – – 64.058
volesti can do general convex bodies (not only triangles).
Future work: improve accuracy with more advanced MC
integration methods such as importance sampling.
13. Application in finance
Crisis detection in stock markets
Data: 52 popular exchange traded funds (ETFs) and the US
central bank (FED)
https://stanford.edu/class/ee103/portfolio.html.
Approximation of the joint distribution (copula) between portfolio return
and volatility (risk) in a given time period.
Left: a copula that corresponds to a normal period (07/03/2007 -
31/05/2007).
Right: a copula that corresponds to a crisis period
(18/12/2008 - 13/03/2009).
14. Application in finance
Crisis detection in stock markets
We define an indicator as the ratio between two masses (red /
blue).
If the indicator is ≥ 1, the copula corresponds to a crisis;
otherwise it corresponds to a normal period.
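The indicator can be sketched as a function of a binned copula. The corner layout below is a hypothetical reading of the slide's red/blue regions (red = lower-left plus upper-right blocks, blue = the other two corners, block side 20% of each axis); the real indicator from the talk may bin the mass differently:

```python
def crisis_indicator(grid, frac=0.2):
    """Ratio of two corner masses of an m x m copula histogram.
    Hypothetical layout: 'red' = lower-left + upper-right corner
    blocks, 'blue' = upper-left + lower-right corner blocks."""
    m = len(grid)
    k = max(1, int(m * frac))
    red = (sum(grid[i][j] for i in range(k) for j in range(k)) +
           sum(grid[i][j] for i in range(m - k, m) for j in range(m - k, m)))
    blue = (sum(grid[i][j] for i in range(k) for j in range(m - k, m)) +
            sum(grid[i][j] for i in range(m - k, m) for j in range(k)))
    return red / blue

# Diagonal-heavy copula: return and volatility move together
diag = [[10 if i == j else 1 for j in range(10)] for i in range(10)]
# Anti-diagonal-heavy copula: the opposite dependence
anti = [[10 if i + j == 9 else 1 for j in range(10)] for i in range(10)]
```

With this layout the diagonal-heavy histogram gives an indicator above 1 and the anti-diagonal one below 1, matching the crisis/normal classification rule stated above.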