14. Research
Patents
Sales
Marketing
19 engineers, 2 marketing, 4 sales, 1 solution architect
40% of the engineering team hold a bachelor's degree, 40% a master's, and 20% a PhD from top-20 schools.
Technology
Products
Team (1)
George
CEO, co-founder
Iakovos
CTO, co-founder
Ulli
marketing/sales
George
CSO
15. Advisory Board
Sales Channel
Region | Sales Rep | Experience
North America | Roger Milton | VP of Sales at MIPS (US)
EMEA | Stefan Buechmann | VP of Sales at MIPS (EMEA); Product Sales Manager at Synopsys
Taiwan | Grace Lin | Taiwan Country Manager at MIPS; Sr. Manager at Infineon
Japan | Christos Makiyama | Japan Business Development Director at Helic
Gideon Intrader
VP of Marketing at MIPS Technologies
CTO of Adesto
Stefanos Sidiropoulos
Founder/CEO of Aeluros (acquired by NetLogic)
Founder/CEO of nusemi (acquired by Cadence)
Team (2)
19. TRAINING
Learning new capabilities from existing data: a new training dataset turns an untrained neural network into a trained neural network, producing a trained NN model with new capabilities and optimized performance.
INFERENCE
Applying trained behavior to new data input.
Artificial Intelligence
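The training/inference split on the slide above can be illustrated with a minimal sketch: a single-neuron "network" learns the AND function from an existing dataset (training), then applies that learned behavior to new inputs (inference). The model, dataset, and hyperparameters here are illustrative, not Think Silicon's.

```python
import math

# Minimal sketch of the TRAINING / INFERENCE split described on the slide.
# A tiny single-neuron model learns AND from existing data (training),
# then applies the learned behavior to new data input (inference).

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(dataset, epochs=2000, lr=0.5):
    """TRAINING: turn an untrained model into a trained one."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, target in dataset:
            pred = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            err = pred - target          # cross-entropy gradient for logistic output
            w[0] -= lr * err * x[0]
            w[1] -= lr * err * x[1]
            b -= lr * err
    return w, b

def infer(model, x):
    """INFERENCE: apply trained behavior to a new data input."""
    w, b = model
    return sigmoid(w[0] * x[0] + w[1] * x[1] + b) > 0.5

training_dataset = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
model = train(training_dataset)
print(infer(model, (1, 1)))  # True
print(infer(model, (0, 1)))  # False
```

The key point of the slide survives even at this toy scale: training is the expensive, data-hungry loop; inference is a single cheap forward pass, which is what an edge accelerator needs to run.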
20. EDGE
EDGE: billions of devices (INFERENCE)
FOG nodes: millions of devices (INFERENCE)
CLOUD: thousands of devices (TRAINING)
Artificial Intelligence
21. NO LOW-POWER INFERENCE ON THE EDGE:
HIGH ENERGY
Device battery runs out due to transmission power
HIGH LATENCY
Request-response over a network cannot work for streaming data
LOW PRIVACY
The data must leave the device
LIMITED AVAILABILITY
Cannot run outside network coverage
Challenges
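A back-of-envelope comparison makes the energy and latency challenges above concrete: offloading one camera frame to the cloud costs radio energy and a network round trip that local inference avoids. Every figure below is an illustrative assumption, not a Think Silicon measurement.

```python
# Illustrative sketch of the HIGH ENERGY / HIGH LATENCY challenges:
# cloud offload pays radio energy plus a round trip; local inference does not.
# All constants are assumed order-of-magnitude values, not measured data.

FRAME_BITS = 640 * 480 * 8          # one grayscale VGA camera frame
RADIO_NJ_PER_BIT = 100e-9           # assumed radio energy: ~100 nJ/bit
LOCAL_PJ_PER_MAC = 1e-12            # assumed accelerator energy: ~1 pJ/MAC
MACS_PER_FRAME = 500e6              # assumed small CNN: ~500 MMACs/frame
ROUND_TRIP_S = 0.15                 # assumed network round trip: ~150 ms
LOCAL_LATENCY_S = 0.01              # assumed on-device inference: ~10 ms

cloud_energy_j = FRAME_BITS * RADIO_NJ_PER_BIT      # energy to transmit the frame
local_energy_j = MACS_PER_FRAME * LOCAL_PJ_PER_MAC  # energy to process it locally

print(f"cloud: {cloud_energy_j * 1e3:.1f} mJ, {ROUND_TRIP_S * 1e3:.0f} ms")
print(f"local: {local_energy_j * 1e3:.1f} mJ, {LOCAL_LATENCY_S * 1e3:.0f} ms")
```

Under these assumptions, transmitting the frame costs orders of magnitude more energy than computing on it locally, which is the case the slide is making for low-power inference on the edge.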
Think Silicon is moving into AI, developing
the most power-efficient inference accelerator.
Existing technology, patents, workflow, and core team are the
foundation of this new development.
The revolutionary Inference xNN Accelerator (IxNN®) combines the
performance of a GPU with Think Silicon's low-power technology,
bringing real-time inference to the edge
NEMA Inference Accelerator
23. AI Camera Device Market
Video Analytics – Cameras
Wearables, Drones, Robots, Toys,
E-Scooter, AR-Glasses etc.
ENERGY CONSCIOUS
Patented technology for compression, data reuse, and approximate computing
LOW LATENCY
A revolutionary real-time inference accelerator reduces reliance on the cloud
HIGH PERFORMANCE
Massively parallel execution, fast MAC operations, memory latency hiding
HIGH PRIVACY
Local inference reduces data communication with the cloud
NEMA Inference Accelerator
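One of the approximate-computing ideas behind low-power inference can be sketched in a few lines: quantize weights and activations to 8-bit integers so each MAC becomes a cheap integer operation, then rescale once at the end. The scales and values here are illustrative; this is not the NEMA/IxNN design.

```python
# Hedged sketch of approximate computing for inference: int8 quantization
# trades a small accuracy loss for much cheaper MAC operations.
# Scales and example values are illustrative assumptions.

def quantize(values, scale):
    """Map floats to int8 range [-128, 127] with a fixed scale."""
    return [max(-128, min(127, round(v / scale))) for v in values]

def int8_dot(xq, wq, x_scale, w_scale):
    """MAC loop entirely in integer arithmetic; dequantize once at the end."""
    acc = 0
    for xi, wi in zip(xq, wq):
        acc += xi * wi                  # int8 x int8 multiply-accumulate
    return acc * x_scale * w_scale      # rescale the integer accumulator

x = [0.513, -1.234, 0.371, 1.988]       # example activations
w = [1.13, 0.44, -0.71, 0.26]           # example weights
exact = sum(a * b for a, b in zip(x, w))
approx = int8_dot(quantize(x, 0.02), quantize(w, 0.01), 0.02, 0.01)
print(exact, approx)   # approx tracks exact with a small quantization error
```

The integer accumulator is why hardware MAC units can be small and fast: the expensive floating-point work is reduced to one rescale per dot product, at the cost of a bounded approximation error.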
24. Technology:
Listen to your customers
Apply for patents
Business:
Invest in Sales
Be in front of your customers
Team:
Widen your team
Give shares to key people
Funding:
Talk to investors
Stay focused
Key takeaways
25. www.think-silicon.com
HQ & DC
Patras Science Park
Rion Achaias, 26504
Greece
T. +30 2610 911543
info@think-silicon.com
North America
Ulli Mueller
Toronto / Canada
T. +1 647.824.2006
u.mueller@think-silicon.com
Sales
Roger Milton
North America / San Jose, CA
T. +1 408.677.6070
r.milton@think-silicon.com
Sales
Stefan Buechmann
EMEA / Germany
T. +49 170.636.5370
s.buechmann@think-silicon.com
Sales
Christos Makiyama
Japan / Tokyo
T. +81 90.9854.1132
c.makiyama@think-silicon.com
Sales
Grace Lin
Taiwan / Taipei
T. +886 9630.31076
g.lin@think-silicon.com