2. INTRODUCTION.
• In the last decade, we have been bewildered by new technologies and changes to the
way we live and work. But if you want the pace of change to slow down, says
Bryan Glick, editor-in-chief of Computer Weekly, you won't be pleased.
• We are about as far into the digital revolution now as we were into the transport
revolution of the 19th century, when a person had to walk in front of an
automobile waving a red flag as a warning.
3. INTRODUCTION CONT.
• According to research by the Leading Edge Forum, it takes about 30 to 50 years for
a technology to move from invention to ubiquity, sometimes longer. Many of
today's "next big things", such as 3D printing, artificial intelligence and robotics, are
1960s inventions that can now be produced at a scale, quality and cost that make
them generally affordable.
4. WHY IT IS GOING TO GET EVEN FASTER.
• A decade ago, smartphones (as we know them by today's standards) didn't exist.
Three decades earlier, no one even owned a computer. Think about that: the first
personal computers arrived about 40 years ago. Today, it seems nearly everyone
owns a handheld computer.
• Because each generation of technology improves over the last, the rate of
progress from version to version speeds up.
5. WHY IT IS GOING TO GET EVEN FASTER CONT.
• “The first computers were designed on paper and assembled by hand. Today, they
are designed on computer workstations with the computers themselves working
out many details of the next generation’s design, and are then produced in fully
automated factories with only limited human intervention.” – Ray Kurzweil.
6. WHY IT IS GOING TO GET EVEN FASTER CONT.
• Computer chips have become increasingly powerful while costing less. That's
because, over the last five decades, the number of transistors (the tiny electrical
components that perform basic operations) on a single chip has been doubling
regularly.
• This exponential doubling, known as Moore's Law, is the reason modern
smartphones keep getting smaller while gaining functionality.
7. MOORE'S LAW
• Moore's Law refers to the observation that the number of transistors on a
microchip doubles roughly every two years, while the cost of computers is halved.
In other words, we can expect the speed and capability of our computers to
increase every couple of years, and to pay less for them (see the sketch below).
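• Below is a minimal Python sketch of the doubling described above. The starting
count of 2,300 transistors roughly matches the Intel 4004 (1971); the 50-year
horizon and the fixed two-year doubling period are illustrative assumptions, not
historical data.

    # Minimal sketch of Moore's Law: counts double every `doubling_period` years.
    def transistors(start_count: float, years: float, doubling_period: float = 2.0) -> float:
        """Project a transistor count forward under a fixed doubling period."""
        return start_count * 2 ** (years / doubling_period)

    # 2,300 transistors (roughly the Intel 4004, 1971) projected 50 years on:
    # about 77 billion, in line with today's largest chips.
    print(f"{transistors(2_300, 50):,.0f}")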
9. COMMODITIZATION OF I.T.
• The digital revolution is powered by the commoditization of core IT capabilities. The
most transformative has been the internet: the commoditization of networking
and communications, which have become cheap and ubiquitous.
• The internet started the wave of innovation that led to the creation and growth of
the web, commerce and social media.
• Smartphones are the commoditization of end-user computing. The cloud has
turned compute power and storage into an affordable, highly available commodity.
10. COMMODITIZATION OF I.T. CONT.
• Marketing experts cite the speed of Facebook's user adoption as evidence of
accelerating change, but those statistics are a result of comparative cheapness
and mass availability, not of faster technological progress. If every TV had been
offered free in the 1950s, its adoption would have been quicker too.
11. COMMODITIZATION OF I.T. CONT.
• Our experience of the digital revolution feels like incredibly rapid change because
so many aspects of our lives are changing at the same time. It is not that
technology itself is changing any faster; rather, we have innovation powered by
commoditization to thank.
12. COMMODITIZATION OF I.T. CONT.
• The only certainty is that social change is going to come: look at the
reactions to its existing influence, such as the gig economy, with taxi
drivers blockading cities in protest at Uber.
• Look at how Facebook has been demonized over its use of our personal
data, or YouTube for spreading extremist videos. There is plenty of
evidence to suggest a serious tech backlash in the next few years.
13. CONCLUSION.
• When people talk about the pace of change speeding up, are they right? Or does
it only feel that way because there is so much innovation, as opposed to
invention, going on?
14. • Quantum computing: performs calculations based on the probability of an object's
state before it is measured, rather than on definite 1s or 0s (see the sketch below).
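• As a loose illustration of that idea, here is a minimal Python sketch: a qubit in
superposition yields 0 or 1 at measurement, with probabilities set by its state.
Plain Python is used, no quantum SDK is assumed, and the 50/50 split models an
equal superposition.

    import random

    # Simulate measuring a qubit that reads 1 with probability p_one per shot.
    def measure(p_one: float, shots: int = 10_000) -> float:
        """Estimate the fraction of measurements that read 1."""
        return sum(random.random() < p_one for _ in range(shots)) / shots

    # An equal superposition reads 0 or 1 about half the time each.
    print(measure(0.5))  # ~0.5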