Internet of Things (IOT) - impact on databases and DBAs
The Internet of Things: as if we didn't have enough "things" already, everything is getting connected lately. IDC projects that the digital universe will reach 40 zettabytes. Even if only a fraction of that huge data needs to be processed, organizations will need a lot of processing power. How do DBAs prepare for this challenge?

Published in: Business, Technology

Transcript

  • 1. The Impact of the Internet of Things (IoT) on Databases: the stuff CIOs and DBAs should prepare for. Mordechai Danielov, bitwiseMnm.com
  • 2. "The Internet of Things will augment your brain." (Eric Schmidt) "The Internet of Things is reaching a tipping point that will make it a sustainable paradigm for practical applications." (Massimiliano Claps, research director, IDC EMEA Government Insights)
  • 3. In 2008, the number of devices on the Internet already exceeded the number of people. By 2020, that number is expected to reach 50 billion devices.
  • 4. Today, IT is dependent on data created by people. With IoT, computers will gather data independently of humans and track and count everything. The next generation of Internet applications, built on IPv6, will communicate with devices attached to virtually all human-made objects, enabled by the protocol's extremely large address space.
  • 5. Technologies that make IoT possible: devices and "things", sensors (RFID, pressure, etc.), cloud servers, and big data databases. Sensors send out information; applications analyze it and send instructions back to the devices.
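
To make this sensor-to-application loop concrete, here is a minimal sketch of one reading flowing into a database and an instruction coming back. All names (SensorReading, the readings table, the pressure threshold) are illustrative assumptions, and SQLite stands in for whatever database the back end actually uses.

```python
import sqlite3
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical sensor payload; the field names are illustrative, not from the slides.
@dataclass
class SensorReading:
    device_id: str
    sensor_type: str          # e.g. "rfid" or "pressure"
    value: float
    recorded_at: str

def ingest(conn: sqlite3.Connection, reading: SensorReading) -> None:
    """Store a reading so back-end applications can analyze it later."""
    conn.execute(
        "INSERT INTO readings (device_id, sensor_type, value, recorded_at) VALUES (?, ?, ?, ?)",
        (reading.device_id, reading.sensor_type, reading.value, reading.recorded_at),
    )
    conn.commit()

def instruction_for(reading: SensorReading) -> Optional[str]:
    """Toy analysis step: send a command back to the device when a threshold is crossed."""
    if reading.sensor_type == "pressure" and reading.value > 8.0:
        return "open_relief_valve"
    return None

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE readings (device_id TEXT, sensor_type TEXT, value REAL, recorded_at TEXT)")
    reading = SensorReading("pump-17", "pressure", 9.2, datetime.now(timezone.utc).isoformat())
    ingest(conn, reading)
    print(instruction_for(reading))  # -> open_relief_valve
```
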
  • 6. With IoT, "Big Data" will turn into "Huge Data." IDC projects that the digital universe will reach 40 zettabytes; 40 ZB is equivalent to 57 times the number of grains of sand on all the beaches on Earth. http://www.digitalnewsasia.com/digital-economy/massive-amounts-of-data-but-only-05percent-being-analyzed#sthash.1W4wJcp9.dpuf
  • 7. Processing power: even if only a fraction of this huge data needs to be processed, that is a lot of processing power organizations will need to have available. Decoding the human genome took 10 years; today it would take less than a week. http://wikibon.org/blog/big-data-statistics/
  • 8. Peaks and valleys: transaction-rate fluctuations can create inadvertent DoS situations. Scenario: an alarm company hooked up to many home devices may get "sensory overload" during an earthquake, as multiple endpoints report at once. Servers and databases will receive a high-volume surge of data that requires speedy processing.
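
One common way to survive such a surge is to buffer incoming events and write them to the database in fixed-size batches rather than one at a time. The sketch below assumes an in-memory deque and SQLite purely for illustration; a production system would more likely put a message queue in front of the database.

```python
import sqlite3
from collections import deque
from typing import Deque, Tuple

# Illustrative surge buffer: absorb a burst of alarm events in memory and flush them
# to the database in fixed-size batches, so a spike from thousands of endpoints
# does not become a self-inflicted DoS. The batch size is an arbitrary example.
BATCH_SIZE = 500

def flush(conn: sqlite3.Connection, buffer: Deque[Tuple[str, str]]) -> None:
    batch = [buffer.popleft() for _ in range(min(BATCH_SIZE, len(buffer)))]
    if batch:
        conn.executemany("INSERT INTO events (device_id, payload) VALUES (?, ?)", batch)
        conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (device_id TEXT, payload TEXT)")
    buffer: Deque[Tuple[str, str]] = deque(
        (f"sensor-{i}", "earthquake_alert") for i in range(2000)  # simulated surge
    )
    while buffer:
        flush(conn, buffer)
    print(conn.execute("SELECT COUNT(*) FROM events").fetchone()[0])  # 2000
```
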
  • 9. When scaling up is just not good enough, scale out your resources: distribute read/write activity smartly, design with writing hubs and reading spokes, and spread out your databases by replicating to read-only copies.
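
A minimal sketch of the hub-and-spoke idea: all writes go to one primary connection and reads are spread round-robin across read-only replicas. The connection objects here are SQLite stand-ins sharing one in-memory database; real replication between separate servers is assumed to happen out of band.

```python
import itertools
import sqlite3
from typing import Sequence

class ReadWriteRouter:
    """Route writes to the primary ("writing hub") and reads across replicas ("spokes")."""

    def __init__(self, primary: sqlite3.Connection, replicas: Sequence[sqlite3.Connection]):
        self.primary = primary
        self._replica_cycle = itertools.cycle(replicas)

    def execute_write(self, sql: str, params=()):
        cur = self.primary.execute(sql, params)   # all writes go to the hub
        self.primary.commit()
        return cur

    def execute_read(self, sql: str, params=()):
        replica = next(self._replica_cycle)       # spread reads across the spokes
        return replica.execute(sql, params)

if __name__ == "__main__":
    # For the demo, every "server" points at one shared in-memory database.
    uri = "file:demo?mode=memory&cache=shared"
    primary = sqlite3.connect(uri, uri=True)
    replicas = [sqlite3.connect(uri, uri=True) for _ in range(2)]
    router = ReadWriteRouter(primary, replicas)
    router.execute_write("CREATE TABLE IF NOT EXISTS readings (device_id TEXT, value REAL)")
    router.execute_write("INSERT INTO readings VALUES (?, ?)", ("pump-17", 9.2))
    print(router.execute_read("SELECT COUNT(*) FROM readings").fetchall())  # [(1,)]
```
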
  • 10. Know thy peak: the 30%-70% rule. Utilize no more than 30% of your resources during "off-peak" time and reserve 70% for peaks.
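
As a trivial illustration of the 30%/70% rule, a monitoring job could flag any resource whose off-peak utilization already exceeds the 30% budget. The resource names and utilization figures below are made up.

```python
# Off-peak utilization above this fraction leaves too little headroom for peaks.
OFF_PEAK_BUDGET = 0.30

def over_budget(off_peak_utilization: dict) -> dict:
    """Return the resources that are using more than 30% during off-peak hours."""
    return {name: pct for name, pct in off_peak_utilization.items() if pct > OFF_PEAK_BUDGET}

if __name__ == "__main__":
    sample = {"cpu": 0.22, "db_connections": 0.41, "iops": 0.28}  # illustrative numbers
    print(over_budget(sample))  # {'db_connections': 0.41}
```
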
  • 11. Distribute, cache, and share nothing. Sharing is caring? Not with data. Data must be available everywhere in the world, and fast: highly distributed databases with local application caching. The most popular approach is to build loosely connected "shared nothing" database instances that can be brought online in no time.
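
A small cache-aside sketch of the local application caching mentioned above: reads are served from an in-process cache and only fall through to the (possibly remote) database on a miss or after the entry expires. fetch_from_db is a placeholder for a real query.

```python
import time

class LocalCache:
    """Tiny TTL cache kept next to the application, in front of a distributed database."""

    def __init__(self, ttl_seconds: float = 30.0):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expires_at)

    def get(self, key, loader):
        entry = self._store.get(key)
        if entry and entry[1] > time.monotonic():
            return entry[0]                                  # cache hit: no network round trip
        value = loader(key)                                  # cache miss: go to the database
        self._store[key] = (value, time.monotonic() + self.ttl)
        return value

def fetch_from_db(key):
    return f"row-for-{key}"  # placeholder for a real database lookup

if __name__ == "__main__":
    cache = LocalCache(ttl_seconds=5)
    print(cache.get("device-42", fetch_from_db))  # loaded from the "database"
    print(cache.get("device-42", fetch_from_db))  # served from the local cache
```
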
  • 12. Master the cloud: it's all about the money. Use cloud computing to get a handle on your cost of computing. Instead of reserving 70% of capacity for peak traffic, procure it on demand with cloud-based databases and a smart load-balancing middle tier. Master working with cloud vendors and remote utilities. Fully distributed files?
  • 13. Parallel Processing Adaptive Architecture Data Compression Learn from the Big Guys Facebook DB Architecture http://www.flickr.com/photos/ikhnaton2/533233247/ Google BigTable "It is not a relational database and can be better defined as a sparse, distributed multi-dimensional sorted map" http://en.wikipedia.org/wiki/BigTable
  • 14. Data must be available under adverse conditions: redundant data pathways, a smart middle layer, and a fully meshed topology.
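
One way to exploit redundant data pathways at the application level is to try each database endpoint in turn and use the first one that answers. The host names in the usage comment are placeholders; a smarter middle layer would also track endpoint health and prefer the fastest path.

```python
import socket
from typing import Sequence, Tuple

def connect_with_failover(endpoints: Sequence[Tuple[str, int]], timeout: float = 2.0) -> socket.socket:
    """Return a connection to the first endpoint that answers, trying them in order."""
    last_error = None
    for host, port in endpoints:
        try:
            return socket.create_connection((host, port), timeout=timeout)
        except OSError as exc:          # this pathway is down; fall through to the next one
            last_error = exc
    raise ConnectionError(f"no database endpoint reachable: {last_error}")

# Example usage (placeholder hosts):
# conn = connect_with_failover([("db-primary.example.com", 5432),
#                               ("db-standby.example.com", 5432)])
```
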
  • 15. Distributed data is more exposed, yet data must be secured: encrypt anything you can; automate certificate management; know how to secure data into, out of, and between clouds; distribute "meaningless" data fragments and assemble them as needed.
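
For the "encrypt anything you can" point, here is a minimal sketch using the third-party cryptography package's Fernet recipe (an assumption; any symmetric encryption library would do): the payload is encrypted before it is distributed, so replicated copies are meaningless without the key. Key rotation and certificate management are deliberately out of scope here.

```python
from cryptography.fernet import Fernet  # assumes the third-party "cryptography" package

# Encrypt the payload before it leaves the local network, so copies replicated to
# remote or cloud-hosted databases are useless without the key.
key = Fernet.generate_key()              # in practice, load this from a secrets manager
cipher = Fernet(key)

token = cipher.encrypt(b'{"device_id": "pump-17", "pressure": 9.2}')
print(token)                             # safe to store or replicate anywhere
print(cipher.decrypt(token))             # only holders of the key can read it back
```
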
  • 16. Data must not be lost: build smart transaction control into the data access tier; make sure you can get data out of the local cache if something goes wrong; watch data replication latency, make sure everyone is comfortable with it, and see that it doesn't deteriorate; set up alerts so that you detect a problem before it's too late.
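
The replication-latency and alerting advice can be reduced to a small watchdog: compare each replica's lag against an agreed threshold and alert before stale reads become a problem. get_replica_lag_seconds and the 5-second threshold are placeholders; on PostgreSQL, for instance, lag can be derived from the pg_stat_replication view.

```python
# Alert when a replica falls too far behind the primary.
LAG_THRESHOLD_SECONDS = 5.0

def get_replica_lag_seconds(replica_name: str) -> float:
    """Stand-in for a real lag query against the replica."""
    return {"replica-1": 0.8, "replica-2": 12.4}.get(replica_name, 0.0)

def alert(message: str) -> None:
    print(f"ALERT: {message}")  # swap in email/pager integration in a real setup

def check_replication(replicas) -> None:
    for name in replicas:
        lag = get_replica_lag_seconds(name)
        if lag > LAG_THRESHOLD_SECONDS:
            alert(f"{name} is {lag:.1f}s behind the primary")

if __name__ == "__main__":
    check_replication(["replica-1", "replica-2"])
```
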
  • 17. Automate everything: source control, deployment automation, scripting, and automation for input management. When the peak traffic comes rushing at you, it's not the time to think. Have everything automated and scripted so that capacity can be added with the click of a button.
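
"Capacity with the click of a button" ultimately means a single scripted entry point, something like the sketch below, where provision_replica is a placeholder for a call to your cloud vendor's API or an infrastructure-as-code tool.

```python
def provision_replica(index: int) -> None:
    # Placeholder: in practice, invoke your cloud vendor's API or an
    # infrastructure-as-code tool here.
    print(f"provisioning read replica #{index}")

def scale_read_replicas(current: int, target: int) -> None:
    """Bring the read-replica pool up to the target size with one call."""
    for i in range(current, target):
        provision_replica(i)

if __name__ == "__main__":
    scale_read_replicas(current=2, target=5)   # one command, three new replicas
```
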
  • 18. Want to talk databases? Contact us with your ideas and suggestions: Mordechai Danielov, www.bitwiseMnm.com