This document presents research on long memory and volatility in the foreign exchange market using Hurst exponent analysis. It includes sections on objectives, methodology, data analysis, findings, and limitations. The data analysis finds a Hurst exponent of 0.028 for USD, indicating mean-reverting behavior, and suggests that the Hurst exponent can provide insight into predictability. Limitations include difficulties in estimation and a lack of data. The conclusion notes that fractal analysis is becoming popular in finance research areas such as econophysics.
2. • Introduction
• Objectives
• Hurst exponent
• Research methodology
• Data analysis and interpretation
• Findings and suggestions
• Limitations of the study
• Conclusion
3. Introduction
Foreign exchange is the exchange of one currency for another, or the conversion of one currency into another currency.
Foreign exchange also refers to the global market where currencies are traded virtually around the clock. The largest trading centers are London, New York, Singapore, and Tokyo. The term foreign exchange is usually abbreviated as "forex".
The global foreign exchange market is the largest financial market in the world, with average daily volumes in the trillions of dollars. The U.S. dollar is the most actively traded currency.
Long memory means that a system is characterized by its ability to remember events far back in the history of a time series and to make decisions on the basis of such memories.
4. • 1. To determine currency convertibility and liquidity in the foreign exchange market.
• 2. To determine currency volatility.
• 3. To determine long memory in the foreign exchange market.
5. Top 10 currency traders, % of overall volume, May 2015

Rank  Name                            Market share
1     Citi                            16.11%
2     Deutsche Bank                   14.54%
3     Barclays Investment Bank         8.11%
4     JPMorgan                         7.65%
5     UBS AG                           7.30%
6     Bank of America Merrill Lynch    6.22%
7     HSBC                             5.40%
8     BNP Paribas                      3.65%
9     Goldman Sachs                    3.40%
10    Royal Bank of Scotland           3.38%
6. • Originally invented for the field of hydrology by Harold Edwin Hurst, the technique was developed to predict Nile River flooding in advance of the construction of the Aswan High Dam. The dam needed to fulfill multiple and divergent purposes, including serving both as a store of water to protect farmers down river against drought and as flood protection for those same farmers during the typical annual flooding. Rainfall levels in Central Africa were seemingly random each year, yet the Nile River flows seemed to show autocorrelation; that is, flows in one time period seemed to influence flows in subsequent periods. Hurst needed to see whether there was a hidden long-term trend, statistically known as a long-memory process, in the Nile River data that might guide him in building a better dam for Egypt.
7. • Rescaled range analysis is a statistical technique designed to assess the nature and magnitude of variability in data over time. In investing, rescaled range analysis has been used to detect and evaluate the amount of persistence, randomness, or mean reversion in financial market time-series data.
8. The Hurst exponent ranges between 0 and 1 and measures three types of behavior in a time series: persistence, randomness, or mean reversion.
• If a time series is persistent, with H > 0.5, then a future data point is likely to resemble the data point preceding it. So an equity with H = 0.77 that has been up for the past week is more likely to be up next week as well, because its Hurst exponent is greater than 0.5.
• If the Hurst exponent of a time series is H < 0.5, then the series is likely to reverse trend over the time frame considered. Thus, an equity with H = 0.26 that was up last month is more likely than chance to be down next month.
• Time series with Hurst exponents near 0.5 display a random (i.e., stochastic) process, in which knowing one data point does not provide insight into predicting future data points in the series.
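The rescaled range procedure behind these H values can be sketched in a few lines. This is a minimal illustration, not code from the original study: for each window size n, the series is split into blocks, the range of the cumulative mean-adjusted sums is divided by the block's standard deviation, and H is taken as the slope of log(R/S) against log(n).

```python
import numpy as np

def hurst_rs(series, min_window=8):
    """Estimate the Hurst exponent via rescaled range (R/S) analysis."""
    series = np.asarray(series, dtype=float)
    n_obs = len(series)
    # A spread of window sizes between min_window and half the series length.
    window_sizes = np.unique(np.logspace(
        np.log10(min_window), np.log10(n_obs // 2), num=10).astype(int))
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, n_obs - n + 1, n):
            block = series[start:start + n]
            dev = block - block.mean()          # mean-adjusted values
            z = np.cumsum(dev)                  # cumulative deviation series
            r = z.max() - z.min()               # range of cumulative deviations
            s = block.std(ddof=1)               # block standard deviation
            if s > 0:
                rs_vals.append(r / s)
        if rs_vals:
            log_n.append(np.log(n))
            log_rs.append(np.log(np.mean(rs_vals)))
    # The slope of the log-log fit is the Hurst exponent estimate.
    h, _ = np.polyfit(log_n, log_rs, 1)
    return h
```

Applied to uncorrelated white noise, the estimate lands near 0.5, consistent with the "random process" case above; note that R/S estimates on short samples are known to be biased slightly upward.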
9. Data analysis and interpretation

Table 1

Data (USD)   Result
Start date   29/2/2012
End date     29/3/2016
R            1894.59
Volatility   0.0043
log(R/S)     12.97
Hurst        0.028
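As a hypothetical reconstruction (the slides do not show the computation), the volatility figure in Table 1 may be derived as the standard deviation of daily log-returns of the exchange rate series:

```python
import numpy as np

def daily_volatility(prices):
    """Volatility as the sample standard deviation of daily log-returns
    (a hypothetical sketch of how Table 1's 0.0043 figure may be derived)."""
    prices = np.asarray(prices, dtype=float)
    log_returns = np.diff(np.log(prices))   # r_t = ln(P_t / P_{t-1})
    return log_returns.std(ddof=1)
```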
12. Findings and suggestions
Findings: A Hurst exponent above 0.5 indicates persistence, and such positive price appreciation would be attractive to a growth manager seeking future capital appreciation. Here, however, the estimated Hurst exponent for USD is 0.028, well below 0.5, indicating mean-reverting behavior.
Suggestions:
The Hurst exponent for each time series is computed as the slope of the linear fit of the log-log graph of the standard deviation (volatility) of the log-returns series versus the time delay.
Volatility refers to the amount of uncertainty or risk involved with the size of changes in a currency exchange rate. The higher the volatility, the riskier the trading of the currency pair. Volatility is usually considered a negative, as it represents uncertainty and risk; however, higher volatility can make forex trading more attractive to market players.
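The estimator described in the suggestion above (slope of the log-log fit of the standard deviation of lagged log-price differences versus the time delay) can be sketched as follows; this is an illustrative implementation under that reading, not the study's actual code:

```python
import numpy as np

def hurst_from_volatility(prices, max_lag=50):
    """Estimate the Hurst exponent as the slope of the log-log fit of
    the standard deviation of lagged log-price differences versus lag."""
    log_p = np.log(np.asarray(prices, dtype=float))
    lags = np.arange(2, max_lag)
    # For a self-similar process, std of lag-k differences scales like k**H.
    tau = [np.std(log_p[lag:] - log_p[:-lag]) for lag in lags]
    h, _ = np.polyfit(np.log(lags), np.log(tau), 1)
    return h
```

On a simulated geometric random walk this yields H close to 0.5, the no-memory benchmark against which the USD estimate of 0.028 would be compared.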
13. Limitations of the study
• i. Adequate information is not available.
• ii. Testing software to estimate the Hurst exponent is difficult to obtain.
• iii. The Hurst exponent is estimated rather than calculated exactly, and assessing the accuracy of the estimate can be a complicated issue.
• iv. Searching for and collecting the data is time-consuming.
14. Conclusion
Computing the Hurst exponent of a time series gives valuable information on the predictability of the process that generated it.
Recently, fractal analysis has become popular in finance research, particularly in the context of econophysics, a relatively new area of study developed through cooperation between economists, mathematicians, and physicists. It applies ideas, methods, and models from statistical physics and complexity theory to analyze data from economic phenomena.
The normality tests on the daily exchange rate returns for the last four years or so indicate the need to explore non-linear modeling techniques when seeking to understand exchange rate behavior. However, the results from the persistence tests are split.