The document describes Synergy, a blockchain-based platform for collaborative data science and AI product development. Key points:
- Synergy uses blockchain and smart contracts to build a research platform that incentivizes contributors to improve each other's solutions through competitive rounds. Contributors are rewarded with tokens.
- The platform provides an analytics and modeling interface where users can build AI solutions visually, without code, by dragging and dropping components. Pre-built models can also be deployed.
- Synergy addresses market needs by providing off-the-shelf AI products and a platform for outsourcing data science work through competitions. Contributors are assured rewards and credit for their work.
- The platform aims to make state-of-the-art data science and AI products accessible to non-technical users.
A Blockchain-Based Collaborative Platform for Developing Data Science and Artificial Intelligence Products
White Paper
November 15, 2017
Written By: Yousef Fadila
Executive Summary
● Synergy utilizes blockchain distributed-trust technology to build a smart-contract-powered research and development platform for developing state-of-the-art machine learning and artificial intelligence products. Collaborators and researchers are incentivized to improve fellow contributors' solutions in rounds and in exchange receive SNG tokens, thereby driving innovation and continuous improvement.
● Synergy provides a fair and decentralized development tournament platform that tracks and rewards all contributors to the final solution.
● Synergy organizes all developed products in a web-based analytics and modeling platform. A product can be available in two modes: (1) as a model architecture with optimally tuned parameters, to be trained from scratch with user-specific data, and (2) as a pre-built model that is ready for deployment. The platform lets users build AI solutions as a flow of drag-and-drop components, making them accessible to non-technical users. As a result, they can quickly build state-of-the-art data science products and AI agents.
● The Synergy analytics and modeling platform features smooth deployment of data science products and AI agents. A model can be deployed by simply dragging and dropping a deployment component into the flow, after which it becomes accessible through REST queries.
● Subscriptions to the Synergy analytics and modeling platform are fully managed through smart contracts using SNG tokens. The smart contract fairly distributes tokens to all contributors and therefore provides perpetual rewards to all developers and contributors. This is an extra incentive for our community of developers and data scientists to continually participate in Synergy competitions and improve models.
Abstract
The value of data science and the power of machine learning are growing exponentially. It is only a matter of time before a business loses to its competitors if it does not utilize its data efficiently. Data mining and machine learning are advanced sciences that can accurately predict customer behavior patterns, logistics and distribution issues, future trends, and more. Without data-mining and machine-learning technologies, a company is at a significant disadvantage: it won't know what its competitors know about future trends and the current market.
Data scientist has been called "The Sexiest Job of the 21st Century".¹ There is no doubt that almost every business needs to integrate data science into its decision-making process. However, the veracity and variety of big data,² together with the need for it across many different fields, make it very challenging to hire the right data scientists. Furthermore, many companies are moving to data-based decision-making models, and as a result there aren't enough data scientists to meet the increasing demand in the current market. The rapid innovation in the field makes it even more challenging for companies to keep up with the state-of-the-art techniques needed to evaluate their data-science models. Synergy solves this by providing a rich data analytics and model-building platform featuring human-on-demand machine learning and reusable off-the-shelf models. Synergy provides a fully transparent and decentralized data science competition management platform powered by smart contracts on a public blockchain. The platform is able to track all contributors to the final solution across all rounds, encouraging collaboration to improve other solutions and to build state-of-the-art models that meet the sponsor's needs.
1. Harvard Business Review: https://hbr.org/2012/10/data-scientist-the-sexiest-job-of-the-21st-century
2. The four Vs of big data: http://www.ibmbigdatahub.com/infographic/four-vs-big-data
Table of Contents
1. Introduction
  1.1. What is Data Science?
  1.2. Why Data Science Matters?
  1.3. Why Companies Need Data Science Tools?
  1.4. How Synergy is Addressing the Market Needs?
2. Synergy Analytics and Modeling Platform Overview
  2.1. Synergy Analytics and Modeling Platform
  2.2. ZeroDriver Artificial Intelligence and Machine Learning Modeling
    2.2.1. Smooth Deployment of Data Science and AI Apps
  2.3. Human-on-demand Machine Learning
    2.3.1. Host a Competition in a Platform that Drives Innovation
      2.3.1.1. Submission and Reward
    2.3.2. Perpetual Compensations
  2.4. Summary of Platform Top Features
3. Why Blockchain?
  3.1. What Problem does a Blockchain Solve?
  3.2. What are Smart Contracts?
  3.3. How Synergy is utilizing Blockchain Technology?
4. High-level Roadmap
  4.1. 2018-2019 Roadmap of Synergy Competition Host Platform
  4.2. 2018-2019 Roadmap of Synergy Analytics and Modeling Platform
5. Token Info
  5.1. Token Distribution
1. Introduction
1.1. What is Data Science?
Data science is an interdisciplinary field that uses scientific methods and processes to extract insights from data. It employs techniques and theories drawn from many fields within the broad areas of mathematics, statistics, information science, and computer science, in particular from the subdomains of machine learning, classification, cluster analysis, data mining, databases, and visualization. Data scientists usually have advanced degrees and training in statistics, mathematics, business, computer science, and information management.
Today, many more organizations are opening up their doors to big data and unlocking its power. Advances in big data technologies that bring low-cost storage and cheap computing power make it feasible for businesses to collect big data about external entities, such as competitors and market trends, or for internal use, such as the company's own operations. However, data without the right people to process it, or without the right tools to extract insight from it, is worth nothing. The real business value lies in processing and analyzing data, and that is where a data scientist steps into the spotlight.
1.2. Why Data Science Matters?
Data science can add significant value to businesses by bringing statistics and insights to all business processes. It helps in making better decisions through measuring, tracking, and modeling performance metrics. Furthermore, data science supports forecasting future trends, identifying opportunities, discovering current flaws and outliers, predicting future behavioral patterns, providing personalized experiences through recommendation systems, and identifying target audiences.
1.3. Why Companies Need Data Science Tools?
The number of devices sending data is growing rapidly while the cost of storing data continues to decline. Companies today collect tremendous amounts of data. However, they still struggle to extract business value from that data. There is no doubt that traditional tools are becoming obsolete because they can't handle the current scale of data. In the current rapid-growth market, there is a need to continually improve existing tools and develop new ones to add automation and to maximize the benefits that data can bring to companies. Luckily, the rapid innovation in the field has brought up many tools that fit the new scale and allow quick modeling and prototyping. But that also means new talent and skills are required to run these tools efficiently and build better models.
Today, the role of a data scientist is one of the most in-demand positions. This means that data scientists can be difficult to find and expensive to hire and retain. At Synergy we try to solve this by providing two things: (1) a platform to outsource data science and artificial intelligence research and development - human-on-demand machine learning - and (2) a zero-code model-building and model-tuning (selecting optimal parameters) platform. This automation replaces the skills that data scientists bring to developing machine learning models, such as feature selection, model selection, and model tuning for specific datasets.
1.4. How Synergy is Addressing the Market Needs?
In the current market, there is a growing demand for data scientists and AI products that is likely to continue for the foreseeable future. Almost no company can survive without integrating data science products and AI. For some companies, off-the-shelf products may be sufficient, while for many others they are not. In addition, many companies may not be able to hire enough engineers to build complete AI products internally. Synergy addresses all of these needs by offering (1) off-the-shelf products through the Synergy analytics and modeling platform and (2) human-on-demand machine learning for companies that don't want to use off-the-shelf products.
The Synergy platform deploys products developed through human-on-demand competitions to the analytics and modeling platform, allowing non-technical users and companies to access state-of-the-art models developed by talented data scientists.
In addition to addressing the market needs of organizations, Synergy also meets the market need of developers, freelance data scientists, and researchers to monetize their work with the assurance that credit is tracked back to them. Synergy is able to provide a fair reward to all contributors to the final solution by tracking all forks and improvements to the solution across all rounds. Not only that, but Synergy also guarantees perpetual rewards for contributors after their models are deployed in the Synergy analytics and modeling platform. By using a blockchain distributed-trust network and the power of smart contracts, Synergy is able to both (1) incentivize researchers and developers without trust concerns and (2) guarantee to the competition's sponsor that its minimum acceptance quality is met before the promised reward is distributed. All of this happens in a fully transparent manner, thanks to blockchain technology.
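The fork-tracked reward scheme described above can be sketched as follows. This is a minimal illustration in Python, not Synergy's actual smart-contract logic; the lineage representation, contributor names, and improvement weights are assumptions chosen for the example.

```python
from collections import defaultdict

def distribute_reward(total_reward, lineage):
    """Split a competition reward across every contributor in a
    solution's fork history, weighted by each round's improvement.

    `lineage` is a list of (contributor, improvement_score) pairs,
    ordered from the original submission to the final winning fork.
    Each contributor's share is proportional to the improvement
    their rounds added; a contributor appearing in several rounds
    accumulates a share from each.
    """
    total_improvement = sum(score for _, score in lineage)
    payouts = defaultdict(float)
    for contributor, score in lineage:
        payouts[contributor] += total_reward * score / total_improvement
    return dict(payouts)

# Example: three rounds of improvement on one solution.
lineage = [("alice", 0.60), ("bob", 0.25), ("alice", 0.15)]
print(distribute_reward(1000, lineage))
# alice, who contributed two of the three rounds, receives 75% of
# the reward (750 tokens); bob receives the remaining 250
```

In the on-chain version this split would run inside a smart contract, so neither the sponsor nor any contributor needs to trust a central operator to pay out correctly.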
2. Synergy Analytics and Modeling Platform Overview
Value extraction from business data is a crucial mission for every company: data mining and machine learning can accurately predict customer behavior patterns, product matching, and logistics and inventory needs. However, the shortage of data scientists in the market, and the high demand that makes hiring an experienced data scientist very expensive, cause many businesses to lag behind, forcing them to use outdated analytics tools or simple machine learning models that cannot maximize the value of business data.
We at Synergy believe that for many companies, especially small and medium-sized ones, better technology can be affordable. As companies prefer to invest less for more, we are proposing a rich data analytics platform combined with a ZeroDriver machine learning modeling platform, which makes machine learning accessible to non-expert users. We do this by auto-tuning the process from data cleaning and feature selection to model evaluation. The platform is backed by a community of data scientists who are incentivized to build reusable off-the-shelf models in a collaborative manner, improving each other's solutions.
To summarize, the Synergy platform consists of:
1) A component-rich data analytics platform combined with ZeroDriver artificial intelligence and machine learning modeling
2) A fully transparent and decentralized human-on-demand machine learning development platform with rewards, powered by smart contracts on a public blockchain.
2.1. Synergy Analytics and Modeling Platform
Synergy aims to build a component-rich data analytics web platform backed by a community of highly talented data scientists and data engineers who are incentivized to continuously improve these components. Synergy aims to bring predictive intelligence to judgments made by decision makers by offering a wide range of techniques and algorithms, all without writing a single line of code.
Synergy relies on several open-source platforms to act as infrastructure for the analytics and modeling platform. The modeling platform PoC (proof of concept) was built on Apache Zeppelin, but we are still considering other candidates for the final solution. Apache Zeppelin is an open-source, award-winning, web-based notebook that enables data-driven and interactive data analytics in the browser. The following is a screenshot of the advanced analytics page of Synergy's PoC, for illustration purposes only.
A screenshot of Synergy's proof of concept - advanced analytics page.
2.2. ZeroDriver Artificial Intelligence and Machine Learning Modeling
In the ZeroDriver artificial intelligence approach, the user defines a flow by assembling components, starting with a data connection and ending with a model evaluation component. In addition, the user defines a measurement factor (evaluation metric) such as accuracy, precision, F1 measure, or any other custom measure that can be calculated in the model evaluation component. The Synergy platform will then tune the model parameters to maximize the measurement value (optimal tuning).
A screenshot of Synergy's proof of concept - build-a-flow page. The right side shows samples of configurable components that can be used to build a flow.
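The tuning step described above, maximizing a user-chosen metric over candidate parameter values, can be sketched as a simple grid search. This is a self-contained toy in Python, not the ZeroDriver engine; the `f1_score` and `tune_threshold` helpers, the decision-threshold parameter, and the toy scores and labels are all assumptions for illustration.

```python
def f1_score(y_true, y_pred):
    """F1 = harmonic mean of precision and recall for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def tune_threshold(scores, labels, candidates):
    """Pick the decision threshold that maximizes F1 on held-out data."""
    best_t, best_f1 = None, -1.0
    for t in candidates:
        preds = [1 if s >= t else 0 for s in scores]
        f1 = f1_score(labels, preds)
        if f1 > best_f1:
            best_t, best_f1 = t, f1
    return best_t, best_f1

# Toy classifier scores and true labels for a held-out set.
scores = [0.9, 0.8, 0.6, 0.4, 0.3, 0.1]
labels = [1, 1, 1, 0, 0, 0]
print(tune_threshold(scores, labels, [0.2, 0.5, 0.7]))
# the 0.5 threshold separates this toy data perfectly (F1 = 1.0)
```

A production tuner would search many parameters at once (and more cleverly than an exhaustive grid), but the contract is the same: the user supplies the metric, and the platform searches parameter values to maximize it.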
2.2.1. Smooth Deployment of Data Science and AI Apps
Synergy’s artificial intelligence and machine learning modeling platform enables
smooth deployment of pre-built or user-developed models as web apps. Users
can easily initiate the deployment process by adding an API access component
to the flow and connecting it to the corresponding model. Synergy allows full
customization of the flow by letting users add input filtering components or
output customization components at any stage in the flow.
The platform enables many advanced features, among them bagging and
aggregation. The aggregation feature allows users to build AI apps composed
of multiple models rather than a single model. This is achieved by enabling the
user to build a flow that forwards API queries to multiple models.
For example, suppose a user wants to deploy an image classification model to
distinguish cats from dogs. Assume that Synergy's catalog contains a pre-built
model for this task and the user has created two additional models with
different architectures for the same purpose. In many cases, no single model is
more accurate than the others on all inputs. In such a case, the user can
consider a multiple-model approach to build a more accurate app: instead of
deploying one model, the user can deploy all three models to respond to API
queries, with a policy, such as a majority vote, to agree on the final answer.
Synergy enables building such a scenario using drag-and-drop components
without writing a single line of code.
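The majority-vote policy in this example can be sketched in a few lines of Python. The three classifier stubs are hypothetical placeholders, not Synergy components:

```python
from collections import Counter

def majority_vote(query, models):
    """Forward one API query to every model in the flow and
    return the label most models agree on."""
    votes = [model(query) for model in models]
    return Counter(votes).most_common(1)[0][0]

# Three hypothetical cat/dog classifiers with different behavior.
catalog_model = lambda img: "cat"
user_model_a  = lambda img: "cat"
user_model_b  = lambda img: "dog"

label = majority_vote("photo.jpg", [catalog_model, user_model_a, user_model_b])
# label == "cat" (2 of 3 votes)
```

In the platform this policy would be one configurable aggregation component placed between the API access component and the three model components.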
2.3. Human-on-demand Machine Learning
2.3.1. Host a Competition in a Platform that Drives Innovation
Building state-of-the-art machine learning models and data science products
cannot be done without collaboration between mathematicians, scientists, and
domain-specific researchers. Synergy incentivizes collaboration through a
decentralized iteration process to build better data science and artificial
intelligence models. Synergy comprises a research and development
environment with a reward system and an evaluation platform where developed
models can be evaluated on new datasets. Synergy features multi-stage,
multi-round, fully transparent, smart-contract-powered competitions that
encourage collaborators to build and expand upon each other’s work. The
Synergy platform tracks all contributions, at all stages, to the final solution.
A screenshot of Synergy’s PoC - host a competition page. The proof of concept features a
user-friendly interface to build and deploy contracts on the blockchain.
2.3.1.1. Submission and Reward
Let’s look at an example to illustrate how it works:
⇒ A new competition is published to develop an artificial-intelligence-based
trading algorithm for cryptocurrencies.
Reward rules (all rules are set in a smart contract that escrows all tokens):
1) 1000 SNG tokens for the final solution.
2) 1000 SNG tokens for the top two solutions in each round.
3) 5000 SNG tokens for all contributors in the chain leading to the final solution.
4) Number of rounds: three.
5) Evaluation metric: return on investment on cryptocurrency data.
6) Minimum accepted quality: the minimum value of the assigned metric that
the final solution must reach in order to be accepted and for rewards to be
distributed.
In round 1, the submitters of solutions A and B share the 1000-token round
reward.
In round 2, all solutions must be forked from round 1 solutions with
enhancements. The submitters of solutions E and F share the second-round
reward of 1000 tokens.
In round 3 (final), the submitter of solution K is rewarded with 1000 tokens.
The 5000 tokens for all contributors in the chain leading to the final solution
are shared by B, F, and K.
⇒ Tokens are distributed from the competition’s smart contract to the
submitters’ SNG wallets.
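The reward rules above can be illustrated with a small off-chain simulation; this is a sketch only, as the real logic would live in the smart contract. Solution names and the forking chain follow the example:

```python
def distribute_rewards(round_winners, final_chain,
                       round_reward=1000, final_reward=1000,
                       chain_reward=5000):
    """Simulate the example's payout rules (integer token amounts;
    remainders from division are ignored for simplicity)."""
    payouts = {}
    def credit(who, amount):
        payouts[who] = payouts.get(who, 0) + amount
    for winners in round_winners:            # non-final rounds
        for w in winners:
            credit(w, round_reward // len(winners))
    credit(final_chain[-1], final_reward)    # final solution (K)
    for c in final_chain:                    # chain leading to K
        credit(c, chain_reward // len(final_chain))
    return payouts

payouts = distribute_rewards(round_winners=[["A", "B"], ["E", "F"]],
                             final_chain=["B", "F", "K"])
# B earns a round-1 share plus a chain share; K earns the final
# reward plus a chain share.
```

A production contract would also have to handle rounding remainders and failed transfers, which this sketch deliberately omits.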
2.3.2. Perpetual Compensations
At the end of each competition, if the solution can be reused for different
problems, Synergy or a contributor will wrap the solution with the Synergy
catalog API and add it for later use in the Synergy analytics and modeling
platform. In addition, a smart contract is created that records all contributors
to the final solution for future rewards. The smart contract can receive tokens
and automatically distributes them to all contributors according to predefined
rules. The contributors thus gain perpetual tokens based on their model’s
usage rate.
To illustrate, let’s continue with the previous example and assume the winning
solution, K, has the potential to be applied to trading other commodities.
Assume the perpetual compensation rules are:
1. 60% shared equally among the chain leading to the final solution (B and F
in the example).
2. 30% for the submitter of the final solution.
3. 10% for the contributor who did the API wrapping.
The deployed smart contract has an attached ERC20 payment address that
automatically distributes 60% of the received SNG tokens to the submitters of B
and F, 30% to the submitter of K, and 10% to the submitter of the solution to the
catalog (or to the Synergy company in case the wrapping and submission were
done by an internal employee). To learn more about how this works, see section
3.3, How Synergy is utilizing Blockchain Technology.
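Under the assumed 60/30/10 rules, the contract's distribution step amounts to a fixed percentage split. Here is a minimal off-chain Python sketch with hypothetical recipient labels, not the deployed contract code:

```python
def split_payment(amount, shares):
    """Distribute an incoming SNG payment according to fixed
    percentage shares (integer division mirrors whole-token payouts)."""
    return {who: amount * pct // 100 for who, pct in shares.items()}

# Hypothetical recipients: chain members B and F split the 60% equally,
# K takes 30%, and the API wrapper takes 10%.
shares = {"B": 30, "F": 30, "K": 30, "wrapper": 10}
payout = split_payment(1000, shares)
# payout == {"B": 300, "F": 300, "K": 300, "wrapper": 100}
```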
2.4. Summary of Platform Top Features
Synergy Analytics and Modeling Platform
● Automated multiple modeling
● Support for multiple data sources
● Multiple visualization methods
● Multiple analytics methods
● Rich data pre-processing library
● Build a flow through drag-and-drop components
● Code-less, optimal-tuning mode
● Pre-built statistical models
● Pre-built classical machine learning models
● Acquire state-of-the-art models from the models submitted to Synergy's
competitions
Synergy Competition Host Platform
● Human-on-demand: host a competition or challenge in a single click
● The competition host can set minimum acceptance quality criteria that
proposed solutions must meet in order to be rewarded
● Multi-round, multi-stage competitions to maximize proposed model
performance
● Based on distributed-trust blockchain technology, encouraging data
scientists to participate without trust concerns
● Winning models can generate perpetual compensation for contributors
3. Why Blockchain?
3.1. What Problem does a Blockchain Solve?
Blockchain technology allows the exchange of digital assets without an
intermediary trust entity. It enables a distributed trust network with no single
trusted arbiter to verify trust and the transfer of value. It transfers power and
control from a single entity to many entities, enabling safe, fast, and cheaper
transactions even though we may not know the entities we are dealing with. A
blockchain lets us agree on the state of the system even if we do not all trust
each other.
3.2. What are Smart Contracts?
Smart contracts are self-executing pieces of logic that run on the blockchain. A
smart contract contains logic that two parties agree on and that cannot be
altered by either party. Smart contracts can also act as payment recipients,
with defined logic governing when and how to distribute received assets.
3.3. How Synergy is utilizing Blockchain Technology?
The Synergy platform consists of two major services:
(1) A human-on-demand machine learning development platform with rewards.
(2) A component-rich data analytics and modeling platform.
First, Synergy leverages blockchain to provide a fully transparent, incentivized
research and development platform. This enables trustworthy transactions in a
trustless world. Collaborators and developers can expand on each other’s
solutions without fear of losing credit or of manipulation. Collaborators do not
need to trust the sponsor of the challenge, as the SNG tokens are escrowed in
the smart contract. The sponsor of the challenge does not need to trust the
developers, as the contract will not execute the payment if the final solution
does not meet the minimum accepted quality, that is, the minimum value of the
assigned metric that the final solution must reach in order to be accepted and
for rewards to be distributed.
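The escrow-plus-quality-gate behavior described above can be sketched as follows. This is an illustrative off-chain model with made-up numbers, not the deployed Solidity contract:

```python
class CompetitionEscrow:
    """Sketch of the escrow rule: tokens locked at creation are
    released only if the final solution meets the minimum accepted
    quality; otherwise nothing is paid out."""
    def __init__(self, tokens, min_quality):
        self.tokens = tokens          # SNG escrowed by the sponsor
        self.min_quality = min_quality

    def settle(self, final_metric):
        if final_metric >= self.min_quality:
            paid, self.tokens = self.tokens, 0
            return paid               # released for distribution
        return 0                      # tokens remain escrowed

escrow = CompetitionEscrow(tokens=7000, min_quality=0.15)
locked = escrow.settle(0.12)    # below threshold: nothing released
released = escrow.settle(0.20)  # threshold met: full escrow released
```

This is why neither side needs to trust the other: the sponsor cannot withhold a reward from a qualifying solution, and contributors cannot collect for a solution that falls short of the metric.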
Secondly, Synergy utilizes blockchain technology to provide a fully automatic
and transparent way to give perpetual rewards to the developers of off-the-shelf
models and analytical components. Subscriptions to the data analytics and
modeling platform are paid to a global smart contract address that splits all
received SNG tokens between the company and the models’ developers. This is
done through a hierarchy of smart contracts, with attached ERC20 addresses,
that distribute all SNG tokens sent to them to the contributors of off-the-shelf
models based on model usage ratios. The following figure shows how Synergy
provides perpetual rewards to all developers using smart contracts on a public
blockchain.
The figure illustrates Synergy’s smart contract hierarchy that manages subscriptions to the
Synergy Analytics and Modeling Platform. All contracts are deployed on a public blockchain
and can be audited by the public.
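A rough off-chain model of this hierarchy, assuming a hypothetical company share, usage counts, and per-model contributor shares (none of which are specified in the source), might look like:

```python
def distribute_subscription(amount, company_pct, usage_counts, model_shares):
    """Split a subscription payment: a company share first, then the
    remainder across models by usage ratio, then each model's slice
    across its contributors by their fixed percentage shares."""
    company = amount * company_pct // 100
    pool = amount - company
    total_usage = sum(usage_counts.values())
    payouts = {"company": company}
    for model, uses in usage_counts.items():
        model_pool = pool * uses // total_usage   # usage-weighted slice
        for contributor, pct in model_shares[model].items():
            payouts[contributor] = (payouts.get(contributor, 0)
                                    + model_pool * pct // 100)
    return payouts

# Hypothetical: model K was used 3x as often as another catalog model.
payouts = distribute_subscription(
    amount=1000, company_pct=40,
    usage_counts={"K": 3, "other": 1},
    model_shares={"K": {"B": 30, "F": 30, "K_sub": 30, "wrap": 10},
                  "other": {"X": 100}})
```

On chain, each level of this calculation would correspond to one contract in the hierarchy forwarding tokens to the addresses below it.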
4. High-level Roadmap
Synergy was started in Q2 2017 by a group of Fulbright scholars and data
engineers at WPI and Brown University. The team built the first proof of concept
during Q3 2017. Afterwards, the idea attracted a lot of interest. Later, during
Q4 2017, Synergy’s core and advisory teams were expanded with highly talented
technologists and business leaders who have accumulated 100+ years of
experience in engineering, technology management, entrepreneurship, and
marketing.
During Q4 2017, we surveyed a group of R&D managers in our network. The
survey shows an increasing demand for a decentralized competition host
platform rather than for the Synergy Analytics and Modeling Platform in
businesses with mature AI integration. One respondent explained that he would
be willing to outsource parts of his development or model improvement efforts
to the Synergy community even in a beta release, as it does not involve internal
bureaucracy thanks to its risk-free, “pay only for a quality solution” nature.
Conversely, switching to a different cloud-based analytics platform is a complex
process inside an organization that involves training efforts, risk assessment,
and top management approval.
The results of the survey led us to adopt an agile go-to-market strategy that
allows us to respond quickly to growing market needs. We decided to deploy the
platform in two phases. In the first phase, we plan to launch the Synergy
competition host platform only. That means the Synergy Analytics and Modeling
Platform would have only one function: human-on-demand machine learning,
which allows hosting a competition or a development challenge. This gives
Synergy quick access to the market and a head start in building a community of
data scientists and data engineers.
We believe that being backed by a strong, diverse technical community will also
boost the development of the Synergy Analytics and Modeling Platform. We can
outsource part of the development to our community, who will be willing to
participate in exchange for perpetual rewards once their models are deployed.
Furthermore, the risk-free nature of the Synergy competition host platform is
likely to motivate many R&D managers to outsource part of their modeling work
to our community. This will allow us to increase our initial offering of
off-the-shelf models when launching the Synergy Analytics and Modeling
Platform in the second phase.
4.1. 2018-2019 Roadmap of Synergy Competition Host Platform
The Synergy team has scheduled the beta release of the essential smart
contracts that power the Synergy Competition Host Platform for the end of Q3
2018. The smart contracts will be developed using Solidity [3] on the Ethereum
testnet.
During Q1 2019, Synergy will release a Graphical User Interface (GUI) to provide
seamless interaction with the Ethereum public blockchain.
During Q2 2019, after completing testing and fixes on the testnet, Synergy will
deploy the contracts to the Ethereum mainnet and the GUI to a separate domain.
4.2. 2018-2019 Roadmap of Synergy Analytics and Modeling Platform
With the focus on Synergy Competition Host Platform during phase one, there
will be no releases of Synergy Analytics and Modeling Platform during 2018.
Synergy team is currently working with industrial partners to integrate their
requirements in the project roadmap. The alpha release of Synergy Analytics and
Modeling Platform is scheduled to Q3 2019, the release will feature data
connectors components that support various proprietary and open-source
databases, pre-processing components and classical machine learning models.
The following release will feature big data analytics and resource allocation for
the algorithms. Storage and computation will be provided either by cloud
computing, such as AWS, or by exchanging SNG tokens for other ERC20 tokens
to buy resources (for example, filecoin.io, which provides a decentralized
storage network).
[3] http://solidity.readthedocs.io/en/develop/index.html
Synergy adopts the agile methodology for all internal development. Feedback
from our industrial partners will be integrated into all sprints, which means the
roadmap may shift to meet future market needs and trends.
5. Token Info
Token type: ERC-20 | Ticker: SNG | Total supply: 1 billion
The SNG token is an ERC-20 token that acts as a utility token. Its main purpose
is to interact with Synergy’s distributed network.
The SNG token serves the following three segments of interactors with the
Synergy network:
1) Parties who want to sponsor data science, machine learning, or artificial
intelligence challenges.
2) Organizations that want to utilize the services of the Synergy Advanced
Analytics and Modeling platform.
3) Data scientists, developers, and teams who want to submit their solutions
to the platform.
○ Submitting, in general, does not consume tokens, but in order to
protect a competition from massive spam-submission attacks,
sponsors may require that only accounts holding tokens in their
wallets be able to participate. In some cases, to protect against
multiple submissions from the same user, sponsors may also
require participants to deposit a certain amount of tokens, which is
returned to participants after the competition due date.
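The sponsor-set anti-spam rules above (a minimum token holding plus a refundable deposit) can be sketched like this; the thresholds and account names are hypothetical:

```python
class SubmissionGuard:
    """Sketch of the anti-spam deposit rule: participants must hold a
    minimum balance and lock a deposit that is returned after the
    competition due date."""
    def __init__(self, min_balance, deposit):
        self.min_balance = min_balance
        self.deposit = deposit
        self.deposits = {}            # account -> locked deposit

    def join(self, account, balance):
        if balance < self.min_balance + self.deposit:
            return False              # cannot cover holding + deposit
        self.deposits[account] = self.deposit
        return True

    def refund_all(self):
        """After the due date, return every locked deposit."""
        refunds, self.deposits = self.deposits, {}
        return refunds

guard = SubmissionGuard(min_balance=10, deposit=20)
joined_alice = guard.join("alice", balance=50)   # True: enough tokens
joined_bob = guard.join("bob", balance=15)       # False: too few tokens
refunds = guard.refund_all()                     # alice gets 20 back
```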
The SNG token is designed to achieve two main objectives:
(1) To guarantee non-discriminatory access to Synergy services. For
instance, the tokens enable unrestricted access to consume, evaluate, and
integrate state-of-the-art AI solutions.
(2) To enable building and growing a community of talented data scientists
and data engineers who develop innovative solutions.
The economic logic of the SNG token is designed to support continuous demand
for the token from the three segments of interactors with the Synergy network.
After the initial token generation event, Synergy will not take an official position
on the value of the SNG token, nor will it be formally involved in secondary
trading or valuation of SNG tokens. The future value of the token is set by the
market, fully independently of us.
5.1. Token Distribution
The SNG token is minted through a smart contract on the Ethereum blockchain.
The contract is set to mint a total of 1,000,000,000 tokens, without an option to
increase the maximum supply.
SNG tokens will be distributed as follows:
● 50% will be distributed at the token generation event (token sale).
● 20% plus unsold tokens will be reserved for the company. The company
plans to use part of these tokens as rewards for Synergy’s community and
to support development of the initial offering of advanced analytics models.
● 15% will be distributed as bounties and to partners. Tokens remaining
after bounties are distributed will be reserved for partners.
● 10% will be distributed to the Synergy team and advisers.
● 5% will be distributed to the founders of Synergy.