The document covers the evolution of apps and five key emerging technologies: 1) the cloud and your data, as data is increasingly stored in the cloud; 2) changes in critical infrastructure, such as data centers moving to the cloud; 3) virtualization of new things, such as containers for portable apps; 4) big data and analytics; and 5) password technologies, which are moving away from static passwords toward one-time passwords and multifactor authentication. The document concludes with case studies of organizations applying these technologies.
2. My background
• Freelance writer and IT speaker
• Former Editor-in-Chief at Network Computing, Tom's Hardware.com
• Worked in end-user computing back in the mid 1980s
• Written two computer technology books
• Last spoke to Daly customers in 2003
3. Agenda
• The evolution of apps
• Five key techs for further study:
– The cloud and your data
– Changes in critical infrastructure
– Virtualization of new things
– Big data
– Password technologies
• Case studies
Think about how much the past 12 years have changed what we are doing in IT. In the old days, IT built networks and data centers that supported computing monocultures of servers, desktops and routers. Back then, IT owned everything from the user's keyboard on up. Those days are quickly coming to a close. Now we have companies that deliver their IT infrastructure completely from the cloud and don't own much of anything. IT has moved to being more of a renter than a real estate baron.
What I am seeing is that IT departments are changing the way they deliver their applications, making them accessible from the Internet and migrating their apps to become more browser-based. This is happening because end users are moving from desktops to phones and tablets, and demand a more pluralistic approach to how they run their apps.
One acronym that you will hear a lot more of these days is SMAC – Social, Mobile, Analytics and Cloud. SMAC is changing the way that IT delivers its services, builds its enterprise architectures, and selects its systems. While the PC revolution was pushed forward by then-cheaper desktops running Lotus, SMAC isn't just about one monolithic app. Instead, IT will have to evolve towards managing multiple app integrations, provisioning several different services, and negotiating more vendor relationships. They will have to examine business processes from a wider lens.
So in the first part of my talk, let's look at this evolution and how apps are built, deployed, and updated. There are several steps in this process, and not every company follows this path the same way. But you can see it happening, and several patterns developing.
First is the notion of file sharing and how it became a collaboration mechanism. Before PCs were first connected to the Internet, there were local area networks and floppy disks. File sharing was cumbersome and crude, because PCs were essentially personal devices and collaboration was difficult. Then came the Internet, and one of the first basic ways that work teams used this connectivity was to share documents, usually as email attachments or through tools such as Microsoft SharePoint. But while it solved document version and access issues, SharePoint is a very "heavy" client, meaning that there is a lot of software to install and maintain. It isn't very friendly to mobile devices, or to Web access. The first step towards changing how apps are supported usually begins when enterprises realize that there is a better way to share files, and we have seen IT organizations ditch their SharePoint implementations in favor of services such as Box or Dropbox, or newer services that are more security-aware such as SpiderOak or Wuala.
Once these file sharing portals take hold, it isn't much more of a step toward running general office productivity apps in the cloud, such as Google Docs and Microsoft Office 365. These apps were once the exclusive domain of the desktop, but as endpoints have blossomed into tablets and Web-only access, office productivity means something more pluralistic and functional than merely sharing documents. They are also IT's first tentative steps toward supporting the public cloud.
Many of these decisions are being driven by the fact that the endpoint device has become almost irrelevant. Gone are the days, even back in the early 2000s, when an IT department would studiously determine what kind of PC brand or operating system would be the corporate standard. Now the particular endpoint, whether it is a desktop or a mobile, no longer matters. Mobiles are being used more and more as the main endpoint browser: nearly half of Facebook posts come from mobile devices and more than 75% of Tweets are posted from phones.
This is the ultimate consequence of a "bring your own device" policy, because in effect the IT department recognizes that the apps trump whatever device they are running on. There are some big benefits here for IT: they no longer have to invest time in a "nanny state" approach, tracking which users are running which endpoints. Instead, they can free up these staffers to improve their apps.
The next step is in delivering a single place where end users can consume the necessary business apps to be productive. Users don’t want to wait on IT to finish a requirements analysis study or go through a lengthy approvals process: they want their apps here and now. Users want apps that are intuitive, purposeful and easy to use, and they now carry these same expectations into the workplace. There is dwindling patience for the convoluted, frustrating user experiences that many enterprise users have tolerated from corporate systems of the past. One of the best ways to enable this new app universe is in the form of an app portal or corporate app store where users can download the most current apps to their endpoint devices, or login and connect to them in the cloud using some kind of single-sign-on tool, or a combination of approaches.
As these app stores take hold, more companies are now testing their apps in production. This movement grew out of strategies that Google, Amazon, and others used to roll out new features of their SaaS services and code several years ago. The tools used for these kinds of testing include ramped or limited deployment and A/B tests.
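To make the mechanics concrete, here is a minimal sketch of how a ramped (percentage-based) rollout can work: hash each user into a stable bucket and expose the feature only below a cutoff. The feature name and percentages are illustrative assumptions, not any particular vendor's tooling.

```python
import hashlib

def in_rollout(user_id: str, feature: str, percent: int) -> bool:
    # Hash user and feature together so each feature ramps independently;
    # the same user always lands in the same bucket, so raising the
    # percentage steadily exposes more users without flip-flopping.
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % 100 < percent

# Illustrative: expose a hypothetical "new-checkout" flow to 10% of users.
for uid in ("alice", "bob", "carol"):
    print(uid, in_rollout(uid, "new-checkout", 10))
```

The same bucketing trick drives A/B tests: users below the cutoff see variant A, the rest see variant B, and the assignments stay stable for the life of the experiment.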
This has created a new kind of IT department: one that delivers continuous app upgrades, just like the consumer social clouds of Facebook and Twitter are doing. Today these IT groups add improvements without waiting for formal requirements documents from a ponderous and seemingly endless architecture review process. Instead, user interfaces are added almost on a whim, and these continual changes make the notion of a “version number” for software seem almost quaint.
Think about this for a moment: back in the early 2000s, the thought of actually doing this kind of testing would have probably gotten an IT director fired. Now it is becoming common practice.
So let’s take a closer look at five broad categories of emerging tech and how they are helping to accelerate this app evolution. Some of them are still not well adopted, while others are underway in many IT shops.
These days, everyone talks about the cloud. And while you may be sick of hearing the term, there’s no doubt that cloud technologies have had — and will continue to have — a direct impact on the business world. And while there are many myths about cloud computing, one thing about the cloud in business remains apparent: Cloud computing is here to stay — at least for now. According to IDG research, 61 percent of technology decision-makers are utilizing at least one cloud-based app and are expecting to build more in coming years.
In order to understand how the cloud can play a more critical role in your company’s IT operations, take a look at current trends and opportunities — and what they can mean for your business. Here are five tips for cloud integration:
1. Build your apps with the cloud in mind
The days of having to custom build software from scratch are over. Running your apps in the cloud means you can quickly assemble what you need and have it scale up or down to match your demand. You don’t have to maintain any servers, or have much in the way of a large data center infrastructure.
2. Invest in real-time analytics
Today’s analytics tools have the ability to collect and correlate events in real time (or nearly so). And you don’t have to have a data background to understand reports: Most tools can clearly display analyses on a dashboard, so that even the most non-technical user can easily spot and act on particular trends. With these analytics tools, you can collect information from a variety of different sources, including already-running software programs and logs from servers, as well as network events like firewalls and security systems. These tools avoid the delay in analysis that older business intelligence products incurred, and enable more iterative, ad-hoc types of queries. Users can refine their questions as they go along in the data, gaining deeper insights into complex questions.
This kind of real-time analysis enables your business to be a truly data-driven organization, where analysts can drill down to a particular user or event and discover what happened and why. With this information, you can adapt to secure the best opportunities for your business, understand why your customers’ purchase patterns are changing and focus on how to meet demand.
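As a rough illustration of the idea, here is a minimal sketch of near-real-time aggregation over a sliding window; production analytics tools do this at much larger scale, and the event feed here is simulated.

```python
import time
from collections import Counter, deque

WINDOW_SECONDS = 60

events = deque()    # (timestamp, event_type) pairs inside the window
counts = Counter()  # rolling per-type counts for the dashboard

def record(event_type: str) -> None:
    now = time.time()
    events.append((now, event_type))
    counts[event_type] += 1
    # Evict anything older than the window so the counts stay current.
    while events and events[0][0] < now - WINDOW_SECONDS:
        _, old_type = events.popleft()
        counts[old_type] -= 1

# Simulated feed; in practice these stream from app logs, servers, firewalls.
for evt in ("login", "purchase", "login", "firewall_block"):
    record(evt)
print(counts.most_common())
```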
3. Understand network latency
In order to utilize the cloud to its fullest potential, you will need a basic understanding of how it operates with your network. Network latency, or how long it takes data to move from point A to point B, matters with cloud computing because you don't own the end-to-end paths connecting your users to your servers. As such, there are wider latency variations, which can create more room for a network slowdown. You may need better Internet infrastructure connections for your business to handle more traffic, or you may consider investing in a content delivery network or other technologies to speed up access.
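A quick way to get a feel for your own latency variation is to time repeated requests and look at the spread, not just the average. A minimal sketch, where the URL is a placeholder for your own endpoint:

```python
import statistics
import time
import urllib.request

URL = "https://example.com/"  # placeholder: substitute your own endpoint

samples = []
for _ in range(10):
    start = time.perf_counter()
    urllib.request.urlopen(URL).read()
    samples.append((time.perf_counter() - start) * 1000)  # milliseconds

print(f"median: {statistics.median(samples):.1f} ms")
print(f"worst:  {max(samples):.1f} ms")  # in the cloud, the spread matters
```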
4. Consider hybrid cloud options
Hybrid clouds can offer businesses significant utility, so spend some time thinking about ways these options can work for your business. Hybrids are a great option to handle network loads and demands for storage resources at peak times, for example.
5. Automate provisioning of your cloud resources
Yes, having your apps in the cloud means you can scale up and down as demands and needs change. But the hidden bonus behind this scalability is you are also able to automate the provisioning of cloud-based servers. This includes policy-based workload management and deployments, as well as real-time resources and orchestration. Understanding what these automation tools do and how they work with your cloud deployments can save time and money for your business.
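For instance, with Amazon's boto3 library you can script the whole lifecycle of a server instead of clicking through a console. A minimal sketch, assuming AWS credentials are already configured; the AMI ID is a placeholder:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch a small instance; the image ID below is a placeholder, not real.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print("launched", instance_id)

# Tear it down when demand drops: the other half of elastic provisioning.
ec2.terminate_instances(InstanceIds=[instance_id])
```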
These are the key questions to ask as you are trying to evolve your IT infrastructure towards this new collection of apps.
Can your current internal apps be converted into something with a Web front end?
I will talk more about this when I come to my case study with the Red Cross. But lots of companies are building their own web portals for their internal apps. This conversion frees IT from supporting outdated endpoint devices and from maintaining either customized apps or outdated mainframe terminal communication tools.
Can your business logic be hosted elsewhere and be made more scalable?
If you move your servers to the cloud, you can ramp up (or down) your capacity quickly without having to purchase the hardware. One IT shop calls it "buying our baseline capacity but renting what we need for handling seasonal spikes."
Can you provide security as a service layer for your apps?
When companies employ a single-sign-on tool, they migrate their security needs to a single point of service delivery, and make things easier for both end users and their IT department. But single-sign-on alone isn't sufficient. Security needs to be part of every app, more of a security-as-a-service model, moving from the network edge to the individual app. This is what Mitsubishi Motors did to connect its North American car dealers to its headquarters infrastructure. In the past they relied on a VPN to get their users inside a secure perimeter; now each app authenticates each user individually. We'll talk more about this in a moment.
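One common way to push security into each app is to have every request carry a signed token that the app verifies itself, rather than trusting a network perimeter. A minimal sketch of the idea using an HMAC-signed token; the secret and user names are illustrative, and real deployments would use a standard such as SAML or JWT:

```python
import hashlib
import hmac

SECRET = b"shared-secret-from-your-identity-provider"  # illustrative only

def sign(user: str) -> str:
    tag = hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()
    return f"{user}:{tag}"

def verify(token: str) -> bool:
    user, _, tag = token.partition(":")
    expected = hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)  # constant-time compare

token = sign("dealer42")
print(verify(token))                        # True: the app admits the user
print(verify("dealer42:forged-signature"))  # False: rejected per-app
```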
Can you virtualize each of your servers without losing performance, security and reliability?
You want to look at server CPU utilization as a good decision point before virtualizing. When you do this, you will find that many of your physical servers are operating at very low levels and could easily be converted into either virtual machines or migrated to the cloud. This frees up other data center resources and also spreads the cost of an expensive server across equipment that can run at higher loads.
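One simple way to gather that decision data is to sample CPU utilization on each candidate server. A sketch using the psutil library; the 20 percent threshold is an illustrative rule of thumb, not a universal cutoff:

```python
import psutil

THRESHOLD = 20.0  # percent; illustrative rule of thumb

# Sample average utilization over five seconds; in practice you would
# collect this over days or weeks before deciding.
usage = psutil.cpu_percent(interval=5)
if usage < THRESHOLD:
    print(f"CPU at {usage:.0f}%: good candidate for virtualization")
else:
    print(f"CPU at {usage:.0f}%: measure longer before deciding")
```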
As I said, the cloud is playing a larger role in our enterprise computing decisions and infrastructure. Let me highlight three emerging trends in this new virtual world.
First is the notion that data can be a service layer. The ultimate end state of a completely cloud-based universe is one where everything becomes a feed: data is streamed from where it is curated to where it is consumed, and the user doesn't know and doesn't necessarily care. Think of how Netflix once only mailed out its movie DVDs; now a goodly percentage of its content is streamed to its customers. In the early days of digital music, people ripped CDs into MP3 files; now we have providers like Spotify or Pandora that stream music directly to smartphones. The same is becoming true for enterprise data usage.
There are now DaaS (data-as-a-service) offerings from Microsoft Azure Marketplace, Amazon, Google, SalesForce.com, Infochimps, Informatics and Intuit's Quickbooks.com.
Microsoft's five-year-old Azure Marketplace is composed of two major pieces: apps and data. Both are available through monthly subscriptions that you as an individual select. Microsoft has gone after some of the largest data repositories to show they are a "leader." For example, included so far are collections from D&B, geo-related Esri, Wolfram, and sports-related Stats.com.
Azure Marketplace works with a set of query tools that can integrate with desktop apps (PowerPivot, Tableau and Excel links), an open data protocol (OData), its own preview engine (Service Explorer) and of course the cloud service which serves as the backend for hosting all these datasets and running various SQL databases.
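Because the marketplace exposes its datasets over OData, any HTTP client can query them with standard query options. A minimal sketch; the feed URL is a placeholder, as each dataset publishes its own:

```python
import json
import urllib.request

# Placeholder feed URL; real marketplace datasets publish their own.
FEED = "https://example.com/odata/Cities"

# $top and $format are standard OData system query options.
url = FEED + "?$top=5&$format=json"
with urllib.request.urlopen(url) as resp:
    data = json.load(resp)

for row in data.get("value", []):  # OData JSON returns rows under "value"
    print(row)
```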
Docker is an open platform for developers and sysadmins to build, ship, and run distributed applications. It consists of several different pieces so that apps can be quickly assembled from components and eliminates the friction between development, QA, and production environments. As a result, IT can ship faster and run the same app, unchanged, on laptops, data center VMs, and any cloud.
With Docker, developers can build any app in any language using any set of development tools. “Dockerized” apps are completely portable and can run anywhere. Think of them as taking a Linux process, and adding the configuration information so the two can travel together.
In the past, we set up individual VMs that ran specific applications but were dependent on a lot of virtual infrastructure. With Docker, an app can run as an isolated process in userspace on the host operating system, sharing the kernel with other containers. Thus, it enjoys the resource isolation and allocation benefits of VMs but is much more portable and efficient.
Docker is less than two years old but is already getting a lot of traction, and Google has blessed it as one of its preferred technologies. Expect to hear more about it over the coming years.
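To get a feel for how lightweight containers are, here is a minimal sketch using the Docker SDK for Python (the `docker` package); it assumes a local Docker daemon is running:

```python
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Run a throwaway container: the image is pulled if missing, the process
# starts in moments, and it shares the host kernel rather than booting a VM.
output = client.containers.run("alpine", "echo hello from a container")
print(output.decode().strip())
```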
Founded by Rackspace and NASA, OpenStack has grown to be a global software community of developers collaborating on a standard and massively scalable open source cloud operating system. And it is on a very fast track, with the upcoming tenth (or is it 11th?) major release. The builds are more sophisticated with more mature components, there are more distributions available, VMware integration is better, and training programs have also blossomed.
There are some notable exemplary uses of OpenStack by established businesses: Best Buy was spending $20,000 to spin up a single managed virtual machine (VM) back in 2011. That was motivation to rebuild its ecommerce site and use OpenStack to serve up its Web product pages. Now they integrate 40 different development teams' work using this software and produce pages that take on average 2.5 seconds to load. Before OpenStack, it was taking anywhere from seven to 30 seconds to load the same content.
OpenStack can be a very cost-effective alternative to expensive enterprise virtualization tools, and it is based on Apache open source projects that have proven to scale up and have a boatload of APIs.
Big Data has some interesting bedfellows. FedEx is collaborating with General Electric, which is providing the company with commercial charging stations for its electric vehicles. While FedEx can tell you where a particular package is located in its network, it has other Big Data dilemmas, including whether it makes sense to use electric power for its delivery trucks. They got together with GE, the utility Con Edison, and Columbia University researchers. The group is developing artificial intelligence programs to manage when and where the electric trucks charge in a 10-vehicle pilot project. They collect data on the load on the facility, the load of each truck, and how many miles each truck drives. The algorithms from Columbia will identify that a truck is going to drive 16 miles tomorrow, so don't give it 30 amps, give it 8 amps, minimizing the load on the entire facility.
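The scheduling idea is easy to picture with a toy version: size each truck's charging current to just cover tomorrow's route, spread over the night, instead of charging everything at full current. All figures here are illustrative assumptions, not the project's actual parameters.

```python
# Toy charge-allocation sketch; every figure is an illustrative assumption.
VOLTS = 240          # charger voltage
HOURS = 10           # overnight charging window
KWH_PER_MILE = 1.0   # energy a truck consumes per mile driven

def amps_needed(miles_tomorrow: float) -> float:
    kwh = miles_tomorrow * KWH_PER_MILE
    return kwh * 1000 / (VOLTS * HOURS)  # spread the charge over the night

# A truck driving 16 miles tomorrow needs only a trickle, not full current.
for miles in (16, 80):
    print(f"{miles} miles -> {amps_needed(miles):.1f} A")
```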
Still think big data is a lot of bull? Well, not according to the USDA. Of the 8 million Holstein dairy cows in the United States, there is exactly one bull that has been scientifically calculated to be the very best in the land. He goes by the name of Badger-Bluff Fanny Freddie, and he has 346 daughters on the books already. The equations predicted from his DNA that he would be the best bull in terms of milk production from his cattle progeny.
A USDA research geneticist reviewed pedigree records and looked at things such as milk production and fat and protein content to optimize the breed. To give you an idea of how this industry has changed, in 1942 the average dairy cow produced less than 5,000 pounds of milk in its lifetime. Now, the average cow produces over 21,000 pounds of milk.
Monsanto has purchased several big data analytics companies and is incorporating these tools into its programs, to help farmers become more productive with their crops.
A few weeks ago John Hancock Life Insurance became the first American insurer to offer reward points to its customers based on lifestyle choices. People who sign up for this program get a Fitbit monitor and agree to let the company monitor their activities. The most active customers may earn a discount of up to 15 percent on their premiums, in addition to Amazon gift cards, half-price stays at Hyatt hotels and other perks. For example, nonsmokers automatically earn 1,000 points.
This raises all sorts of data privacy issues, but is an example of a program that is working in other countries and how Big Data can be used for one of the more conservative businesses around. Essentially, John Hancock’s term and universal life policies will be priced continuously for these customers. Of course, they claim that they won’t sell this data to others.
I would like to move into one last topic before looking at some specific case studies of typical leading IT organizations, and focus on something very practical: how broken we are with respect to handling our passwords. I was reminded of this recently when I was trying to help my wife manage her passwords. She gets very wiggy when she has to enter a password, and sometimes spends minutes trying to remember what she chose for a particular website.
On the enterprise side, we hear almost daily about password breaches and attacks where millions of user accounts and passwords have been stolen. So let me give you a quick tour of where I see the password industry changing and how you can get on top of things.
Twelve years ago or so, we had these old-school one-time password tokens that were used to strengthen our logins. They worked great, but users and IT managers hated them for different reasons. They were easily lost, difficult to manage, and often you left them at work when you needed them at home, and vice-versa.
Another mechanism was to set up a series of password policies that strengthened logins but were also annoying to our end users.
This is the main policy screen for LastPass Enterprise, showing the granularity of its policies.
Okta also has wide multifactor authentication support, including its own mobile soft tokens, a security question, and Google Authenticator. You can enforce the multiple factors when users are outside the corporate network, or for specific groups, but not for specific applications.
OneLogin also has a long list of multifactor token choices that can be used to secure your account.
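Soft tokens like Google Authenticator are built on the TOTP standard (RFC 6238), which any app can generate and verify itself. A minimal sketch using only Python's standard library; the shared secret is illustrative:

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, digits: int = 6, interval: int = 30) -> str:
    """Compute the current RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = struct.pack(">Q", int(time.time()) // interval)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Illustrative secret; real apps provision this via a QR code at enrollment.
print(totp("JBSWY3DPEHPK3PXP"))
```

The server computes the same six digits from its own copy of the secret, which is why the codes match without any network round trip to the token.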
Chicago WindyGrid http://www.mongodb.com/customers/city-of-chicago and their data portal https://data.cityofchicago.org/
STL County GIS apps http://gis.stlouisco.com/
The site hosts over 200 datasets presented in easy-to-use formats about City departments, services, facilities and performance. The catalog presented below lists datasets alphabetically and links back to each set on the Portal.
The site is a great place for those interested in building applications, or creating interesting visualizations that use City data.
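Portals like these typically expose each dataset through a simple JSON API, so pulling City data into an application takes only a few lines. A sketch against a Socrata-style endpoint of the kind the Chicago portal uses; the dataset ID here is a placeholder:

```python
import json
import urllib.request

# Placeholder dataset ID; each dataset on the portal has its own endpoint.
URL = "https://data.cityofchicago.org/resource/xxxx-xxxx.json?$limit=3"

with urllib.request.urlopen(URL) as resp:
    rows = json.load(resp)

for row in rows:
    print(row)  # each row is one record from the dataset
```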
Then there is the Baltimore Co. school district. Through the BCPS One system, all the district's programs and initiatives around student data, assessments, curriculum, instruction, reporting, and analysis are being fully integrated into a single, user-friendly, platform. It provides a web-based communication portal to distribute loads of information and enable educators to communicate with colleagues, parents, and students.
The district is planning on providing digital devices to every student, and put wireless and broadband infrastructure into every one of its elementary, middle and high schools.
1. Building fiber is more than just Internet. You can point to civic pride, or to attracting broadband geeks with startup ventures, but there are other reasons you want gigabit fiber in your city. What made Chattanooga's fiber network work was the backing of its municipal electric utility, which used it as a means of improving its power delivery system. Having a smarter grid minimized power outages. Once they had capitalized their fiber grid, the 'Net followed.
2. It has to go everywhere. When done correctly, gigabit fiber becomes a community asset that can benefit both small and large employers, so wiring up just homes is somewhat self-defeating. The early Google Fiber installations were just for residential addresses, which had the effect of creating "hacker homes," whereby entrepreneurs would buy up homes and open them up as unofficial coworking spaces.
Chattanooga was able to attract a new Volkswagen auto assembly plant and an Amazon.com distribution warehouse because of many reasons, but having a ubiquitous business fiber network was certainly one of them. And smaller entrepreneurial efforts have blossomed everywhere around town.
3. Work directly with academia and City Hall. Chattanooga figured out early on that they needed a university-based research partner to help establish a supercomputing center and a way to help license and commercialize new technologies. They also got the municipal government involved, and the city has more than 50 different apps that make use of their ubiquitous fiber connection, such as apps to monitor street and traffic lights and road conditions. Both will eventually come with Google Fiber cities, just because of Google's size and impact.
Some organizations have always had their infrastructure in the cloud. This is the strategy that the startup CrazyForEducation.com used when it began operations last year. We are looking at a diagram of its internal operations providers. Let me tell you how they put all this technology together.
The company is a SaaS provider of tutorials, used by K12 classroom teachers to post short online video lessons that explain common concepts such as short algebra or geography lessons to students. These lectures are viewed by students the world over. The notion is referred to as flip teaching, meaning that classroom time is used for working on what would traditionally be homework assignments, and the readings and lectures that were normally part of the classroom day are done in the evenings at home. To deploy their solution, the startup uses a complete online infrastructure. The company is also using a variety of customer-facing apps and SaaS/IaaS infrastructure so that they can quickly scale as demand for their services rises. This means that there is no single cloud provider that is used; rather, they leverage more than a dozen different vendors for their various infrastructure needs.
When the company began operations, the principals wanted to build their infrastructure incrementally, using a Lego approach to build interchangeable parts that could easily connect together. They understood that each part could be replaced if the provider went out of business or when they found something more appropriate or cost-effective. As they added new providers, they looked at what the incremental return on their investment would be for that particular tool. In some cases, they found they could build their own tool for less than the monthly cost for one of their providers. In other cases, such as for CRM providers, they found that there were many solid alternatives and so they shouldn't even attempt to build their own.
As another example, they needed a solid video-rendering engine since so much of their content was video-related. They looked at a number of providers but eventually ended up using the UK-based provider Vzaar.com, which was much less expensive than any American provider they could find.
The firm spends about $1500 a month on their infrastructure, and has purchased services from vendors around the world for its accounting, Web hosting, payment processing and databases. They have chosen more than a dozen different vendors, some of them offering consumer apps and some that are geared towards businesses. As another example, they purchased their email using Google’s business-grade hosting service and Box for their file sharing but use Join.me for their video conferencing solution.
For each provider, they look at what happens to their performance when they scale up and support more traffic as the company grows. They do all sorts of stress testing to see what happens when their loads are ten times what they currently support and make sure that any of their providers continue to deliver the same latency and performance they currently have.
They also have segmented their data security so that they don’t store customer financial data in the cloud, other than using their payment processor to handle credit card transactions when it is time for their teachers to be paid for their video lessons. They originally looked at Paypal but ended up with Stripe.cc because they had a better and more developed API that could be incorporated into their other programs.
The American Red Cross has changed how it deploys its apps. A few years ago it was one of the more conservative IT shops around. Most of its apps ran on its own mainframes or were installed on specially provisioned PCs that were under the thumb of the central IT organization based in Washington, D.C. But then people started bringing their own devices along to staff its disaster response teams. Their IT department started out trying to manage their users' mobiles and standardize on them. But within two or three months their IT staff found the mobile vendors came out with newer versions, making their recommendations obsolete. Like many IT shops, they found that their teams would rather use their own devices. In the end, they realized that they had to change the way they delivered their applications to make them accessible from the Internet and migrate their apps to become more browser-based. The Red Cross, like many other IT organizations, has learned that they have to be able to adapt to the rapidly changing mobile environment. But the good news is that they don't have to buy as many laptops.
Today, the American Red Cross is using a variety of cloud providers: Microsoft's Office 365, Unisys for its regular web hosting needs, an Oracle-hosted cloud service for several database apps, and Teradata's Aprimo cloud CRM apps.