With the computer revolution, a vast amount of digital data has become available. With the Internet and smart connected products, data is growing exponentially. It is estimated that every year more data is generated than in all of prior history, and this has repeated for several years running.
With all this data, a platform for something new emerges. In this lecture, we look at what big data is and at several examples of how to use data. There are many well-known algorithms for analysing data, such as clustering and machine learning.
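Clustering, such as k-means, is one of those well-known algorithms. Here is a minimal sketch in plain Python; the one-dimensional data points and the choice of k=2 are invented for the illustration:

```python
import random

def kmeans(points, k, iterations=10):
    """Cluster 1-D points into k groups by repeatedly assigning each
    point to its nearest centroid and recomputing the centroids."""
    random.seed(0)
    centroids = random.sample(points, k)
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign the point to the nearest centroid
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # move each centroid to the mean of its cluster
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# Two obvious groups, around 1 and around 10
data = [0.9, 1.0, 1.1, 9.8, 10.0, 10.2]
centroids, clusters = kmeans(data, k=2)
```

After a few iterations the two centroids settle near 1.0 and 10.0, the means of the two natural groups. Real big-data clustering works the same way, just over far larger and higher-dimensional data.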
Short introduction to Big Data Analytics, the Internet of Things, and their s... (Andrei Khurshudov)
Invited talk at the 26th ASME Annual Conference on Information Storage and Processing Systems (ISPS 2017), held at the Hilton San Francisco Financial District, San Francisco, California, USA, August 29–30, 2017.
IoT and AI Services in Healthcare | AWS Public Sector Summit 2017 (Amazon Web Services)
In this session we will overview the latest AWS Artificial Intelligence (AI) and Internet of Things (IoT) services and show examples of how these services are enabling transformative new capabilities in health care. Join us in exploring various architectures and discussing the art of the possible in how AI and IoT services can be applied in different scenarios. Listen to the inspiring story of how one AWS-savvy father is using Amazon Polly, Lex, and IoT buttons to create a verbal assistant for his autistic son. We’ll also hear how the American Heart Association is leveraging Amazon Alexa and Lex chat bots as part of a new initiative to engage communities and individuals through innovative new feature offerings. Learn More: https://aws.amazon.com/government-education/
Harbor Research recently completed a review of a new cloud-based platform that takes a refreshingly new approach to machine data analytics. Glassbeam jumps ahead of the current market's noise and confusion about Big Data by viewing critical machine data analytics from a business and operational perspective that can be addressed by a single, scalable solution. In so doing, Glassbeam is re-defining how value is created from machine data.
Big data and digital ecosystem, Jan 2014 v1 (Mark Skilton)
The convergence of data and technology in business has created a range of digital experiences that are revolutionizing all industries and organizations.
This is not just a “top and tail” exercise of innovation and development resulting in quick fixes that let parts of your organization launch marketing services on a mobile device or run data analysis; it needs to be placed in a serious framework for your operating strategy that drives all touch points in the front and back of your organization. Often “technoconfusion” is created, or even encouraged, with many technologies and integration layers, which results in a piecemeal digital strategy for business performance and an ineffective business case and governance process. Often the results are seen in a lack of data visibility, increasing costs of platform integration, and complex services and contractual issues limiting longer-term choice.
The realities of the digital world are more subtle and expansive in 2014 and beyond; the old style of layers of technology is giving way to a new era of digital modularity of systems and devices that enables generative business growth effects from self-service and massively scaled social and marketplace services. This is borne out by the exponential scaling of open APIs, massive data and social networking, and the growth of a range of internet-enabled modular devices and cloud-enabled platforms.
Takeaways
§ This session looks at the trends driving industry today.
§ It introduces work on Open Platform 3.0 from The Open Group. http://www.opengroup.org/subjectareas/platform3.0
§ It includes an analysis of trends in technology and the emerging patterns and roles of big data in the end-to-end operating context of your organization.
§ We conclude with a modern framework for a modular and generative digital ecosystem strategy and the focus for next generation platforming and services.
Look beyond the hype and create a strategy that will unlock the potential of the Internet of Your Things to realize real, transformative results in your organization.
Big Data Expo 2015 - IBM: Outside the Comfort Zone (BigDataExpo)
When it comes to high tech, we tend to wear blinders. We only want to see what's right in front of us. At times, we look forward but we rarely look around us to see how other industries are succeeding. This is especially true with organizations who want to look beyond business intelligence and reveal answers you never thought to ask. For example, what would demand forecasting for a Chief Marketing Officer in Media & Entertainment mean to a Chief Data Officer in banking? Or what would Customer Operations Transformation in Energy & Utilities mean to a Chief Customer Officer at a major retail operation? In this interactive and energetic session, we'll explore valuable cross-industry use cases to help get you "outside your comfort zone" and take a completely different look at how applications of advanced and predictive analytics on big data - or any data - can help you to act on real-time insights to fundamentally transform your business.
Independent of the source of data, the integration of event streams into an Enterprise Architecture is becoming more and more important in a world of sensors, social media streams and the Internet of Things. Events have to be accepted quickly and reliably, and they have to be distributed and analysed, often with many consumers or systems interested in all or part of the events. Depending on the size and quantity of such events, this can quickly reach the range of Big Data. How can we efficiently collect and transmit these events? How can we make sure that we can always report on historical events? How can these new events be integrated into a traditional infrastructure and application landscape?
Starting with a product and technology neutral reference architecture, we will then present different solutions using Open Source frameworks and the Oracle Stack both for on premises as well as the cloud.
Big Data may well be the Next Big Thing in the IT world.
Big data burst upon the scene in the first decade of the 21st century.
The first organizations to embrace it were online and startup firms. Firms like Google, eBay, LinkedIn, and Facebook were built around big data from the beginning.
Like many new information technologies, big data can bring about dramatic cost reductions, substantial improvements in the time required to perform a computing task, or new product and service offerings.
Over the past decade, cloud computing has acted as a disrupter in several areas of the IT business. Soon, it will overhaul one area of technology that has itself been in rapid growth: Data Analytics. Nicky will focus on the recent study by the IBM Institute for Business Value, which shows that capabilities enabling an organization to consume data faster – to move from raw data to insight-driven actions – are now the key differentiator in creating value with data and analytics. He will also talk about the requirements for the underlying infrastructure as a critical component allowing real-time crunching and analysis of high volumes of data. Based on real cases, such as retailers and energy companies, we will look at five predictions in five years, based on:
Analytics, Big data, and Cloud coming together will energize the Speed Advantage.
7 Amazing Examples of Digital Twin Technology In Practice (Bernard Marr)
Digital twins are virtual simulations of real-world objects. By using Internet of Things sensors that feed data from the physical object to computers, digital twins provide the exact same situation to study and test without the consequences of testing in the real world. The uses for the technology are nearly limitless.
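The mechanism can be sketched in a few lines of Python. This is a toy illustration only; the object ID, sensor names and the "overheat" scenario are invented for the example:

```python
class DigitalTwin:
    """Keeps a virtual copy of a physical object's latest state,
    built from the sensor readings the object streams in."""
    def __init__(self, object_id):
        self.object_id = object_id
        self.state = {}      # latest reading per sensor
        self.history = []    # every update, kept for later analysis

    def update(self, sensor, value):
        # a new IoT reading arrives from the physical object
        self.state[sensor] = value
        self.history.append((sensor, value))

    def simulate_overheat(self, extra_degrees):
        # run a "what if" on the copy -- the real machine is untouched
        return self.state.get("temperature", 0) + extra_degrees

# Readings stream in from a hypothetical pump's sensors
twin = DigitalTwin("pump-42")
twin.update("temperature", 71.5)
twin.update("vibration", 0.02)

# Test a scenario on the twin instead of the real pump
projected = twin.simulate_overheat(10)
```

The point of the pattern is exactly what the description above says: experiments run against the mirrored state, never against the physical asset.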
After the computing industry got started, a new problem quickly emerged: how do you operate these machines, and how do you program them? The development of operating systems was relatively slow compared to the advances in hardware. The first systems were primitive but slowly got better as demand for computing power increased. The ideas behind the Graphical User Interface, or GUI ("gooey"), go back to Doug Engelbart's Demo of the Century. However, this did not have much impact on the computer industry at the time. One company, though, Xerox, a photocopier company, explored these ideas at its Palo Alto Research Center. Steve Jobs of Apple and Bill Gates of Microsoft took notice, and Apple introduced first the Apple Lisa and then the Macintosh. In this lecture we look at lessons from the development of algorithms and software, and see how our business theories apply.
In the second part we look at where software is going, namely Artificial Intelligence. Recent developments in AI are causing an AI boom, and new AI applications are appearing all the time. We look at machine learning and deep learning to get an understanding of the current trends.
This presentation covers Big Data Analytics in detail, explaining its three key characteristics, why and where it can be used, how it is evaluated, what kinds of tools we use to store data, and its impact on the IT industry, along with some applications and risk factors.
1. Introduction
2. Overview
3. Why Big Data
4. Applications of Big Data
5. Risks of Big Data
6. Benefits & Impact of Big Data
7. Conclusion
‘Big Data’ is similar to ‘small data’, but bigger in size. Having bigger data, however, requires different approaches: new techniques, tools and architecture, with the aim of solving new problems, or old problems in a better way. Big Data generates value from the storage and processing of very large quantities of digital information that cannot be analyzed with traditional computing techniques.
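The classic non-traditional technique here is to split the work into a map step and a reduce step, in the style of MapReduce and Hadoop. A minimal single-machine sketch in Python; the text chunks are invented for the example, and real systems run these same two steps across thousands of machines:

```python
from collections import Counter

def map_step(chunk):
    """Map: turn one chunk of raw text into partial (word -> count) tallies."""
    return Counter(chunk.split())

def reduce_step(partials):
    """Reduce: merge the partial tallies from every chunk into one total."""
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

# Each chunk could live on a different machine in a real cluster
chunks = ["big data big value", "data beats opinion", "big ideas"]
word_counts = reduce_step(map_step(c) for c in chunks)
```

Because each map call touches only its own chunk, the map step parallelises trivially; only the reduce step needs to see the combined partial results.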
Big Data Trends - WorldFuture 2015 Conference (David Feinleib)
David Feinleib's Big Data Trends presentation from the World Future Society's Annual Conference, WorldFuture 2015, held at the Hilton Union Square, San Francisco, California July 25, 2015.
Forecast to contribute £216 billion to the UK economy via business creation, efficiency and innovation, and generate 360,000 new jobs by 2020, big data is a key area for recruiters.
In this QuickView:
- Big data in numbers
- Top 10 industries hiring big data professionals
- Top 10 qualifications sought by hirers
- Top 10 database and BI skills sought by hirers
- Getting started in big data: popular big data techniques and vendors
This presentation covers the ins and outs of big data's growing trends: market potential, solutions provided by big data, and its advantages and disadvantages.
Abstract:
Big Data concerns large-volume, complex, growing data sets with multiple, autonomous sources. With the fast development of networking, data storage, and data collection capacity, Big Data is now rapidly expanding in all science and engineering domains, including the physical, biological and biomedical sciences. This paper presents a HACE theorem that characterizes the features of the Big Data revolution, and proposes a Big Data processing model from the data mining perspective. This data-driven model involves demand-driven aggregation of information sources, mining and analysis, user interest modeling, and security and privacy considerations. We analyze the challenging issues in the data-driven model and in the Big Data revolution more broadly.
Lecture for Félag tölvunarfræðinga and Verkfræðingafélagið on 18 May 2022.
Innovation is a prerequisite for technological progress, which in turn drives development forward. Innovation usually starts small and needs many iterations to work. Entrepreneurs creating something new must contend not only with the technology and its limitations, but also with the opinions and judgments of their contemporaries, who do not always see the point of a new technology. In this lecture, Ólafur Andri examines innovation and the progress that has been made, and considers where today's technological advances will lead us in the coming years.
Ólafur Andri Ragnarsson is an adjunct at Háskólinn í Reykjavík (Reykjavik University), where he teaches a course on technological development and how technological change affects companies. He holds an MSc in computer science from Oregon University in the United States. Ólafur Andri is an entrepreneur and co-founded Margmiðlun and later Betware. He also took part in founding the game company Raw Fury AB in Stockholm.
Lecture given for the technology group of Stjórnvísi on 13 October 2020.
In recent decades we have seen enormous progress in technology and innovation worldwide. This progress has brought greater prosperity to all of humanity. Despite a global pandemic, progress is not slowing down; it will only accelerate in the coming years. Artificial intelligence, robots, virtual reality, the Internet of Things and much more are creating new solutions and new opportunities. The future is shrouded in mystery and can be both exciting and frightening at once. The only thing we know for certain is that the future will always be better. In this lecture, Ólafur Andri Ragnarsson, a teacher at HR, discusses the latest technology and the future.
Technology is one of the factors of change. When new disruptive technology is introduced, it can change industries. We have many examples of that, and we will start this journey with one of the most important innovations of our lifetimes: the smartphone. We will explore the impact of the smartphone and the fate of the companies that existed when the iPhone, the first smartphone as we know them, was introduced to the world.
We will also look at other examples from history. Then we look at the broader picture: past industrial revolutions and the one we are experiencing now, the fourth industrial revolution. Specifically, we look briefly at the technologies that fuel this revolution, for example artificial intelligence, robotics, drones, the internet of things and more.
Manlike machines have fascinated humans since ancient times. The modern robot started to take shape with the industrial revolution. In the 20th century robots were mostly industrial machines you would see in factories, such as car factories.
Today, robots can have sensors and vision; they can hear and understand. They can connect to the cloud for more information. However, we are still in the early stages of robotics, and robots have a long way to go before becoming useful as ubiquitous general-purpose devices.
The normal interaction with computers is with a keyboard and a mouse. For display, a rectangular, somewhat small screen is used, with 2D windowing systems. The mouse was invented more than 40 years ago and has been the dominant input device for 20 years. Now we are seeing new types of input devices. Multi-touch adds new dimensions and new applications. Natural user interfaces, or gesture interfaces, let people point to drag objects. Computers are also beginning to recognize people's facial expressions, so they know if you are smiling. Voice and natural language understanding is getting to a usable stage. All this calls for all kinds of new applications.
Displays are getting bigger. What if any surface were a screen? What if you could spray the wall with screen, or have your phone project images onto the wall?
This lecture explores some of these new types of interaction with computers and software. It makes the old mouse look dated.
Local is the "Lo" in SoLoMo, the buzzword. Local is not only about location; it is also about your digital track record. Over 70% of Netflix users watch the films recommended to them. Mining data to understand people's behaviour is becoming a huge and valuable business. Advertisers see opportunities in getting directly to their target groups. Predictive intelligence is also about where you will be at some time in the future, and where somebody you know will be.
It turns out that Facebook and Google know you better than you think you know yourself. The world is about to get really scary.
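A recommendation like the one Netflix makes can be sketched very simply: find the user whose past ratings most resemble yours, and suggest what they liked that you have not seen. The users, films and ratings below are invented for the example, and real recommenders use far richer similarity measures:

```python
def similarity(a, b):
    """Count how many films two users rated identically."""
    shared = set(a) & set(b)
    return sum(1 for film in shared if a[film] == b[film])

def recommend(target, others):
    # pick the other user whose taste most resembles the target's...
    best = max(others, key=lambda u: similarity(target, others[u]))
    # ...and suggest their well-rated films the target hasn't seen
    return [film for film, rating in others[best].items()
            if film not in target and rating >= 4]

me = {"Alien": 5, "Heat": 4}
others = {
    "ann": {"Alien": 5, "Heat": 4, "Blade Runner": 5},
    "bob": {"Alien": 1, "Casablanca": 5},
}
picks = recommend(me, others)
```

Here "ann" matches the target on both shared films while "bob" matches on none, so the suggestion comes from ann's remaining favourites.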
Over two billion people have signed up for Facebook. It is the most used site on the Internet. People are not watching TV so much anymore; they are using Facebook, YouTube, Netflix and a number of popular web sites.
Some people devote their time to working for others online. What drives people to write an article on Wikipedia? They don't get paid. Companies are enlisting people to help with innovations, and sites such as Galaxy Zoo ask people to help identify images. And why do people film themselves singing, when they cannot sing, and post the video on YouTube?
In this lecture we talk about how people are using the web to interact in new ways and to get things done.
We are currently living in times of great transformation. Over the last couple of decades we have seen the Internet become the most powerful disrupting force in the world, connecting everyone and transforming businesses. Now everyday objects - the things we use - are getting smart, with sensors and software. And they are connecting. What does this mean?
We will see the world become alive. Cars will talk to road sensors that talk to systems that guide traffic. Plants will talk to weather systems that talk to scientists who research climate change. Farming fields will talk to the farming system that talks to robots that do fertilising and harvesting. Home appliances like refrigerators, ovens, coffee machines and microwave ovens will talk to the home food and cooking system, which will inform the store that you are running out of butter, cheese, laundry detergent and coffee beans, which will inform the robot driver to bring these to your house after consulting your calendar on when someone is at home.
In this lecture we explore the Internet of Things, IoT.
The Internet grew out of US efforts to build the ARPANET, a network of peer computers built during the cold war. The two major players were the military and academia. The network was simple and required no efforts towards security or social responsibility. The early Internet community consisted mainly of highly educated and respectable scientists. In the early 1990s the World Wide Web, a hypertext system, was introduced, and soon browsers started to appear, leading to the commercialization of the Net. New businesses emerged, along with a technology boom known as the dot-com era.
The network, now over 40, is being stretched. Problems such as spam, viruses, antisocial behaviour, and demands for more content are prompting reinvention of the Net and threatening its neutrality. Add to this government efforts to regulate and limit the network.
In this lecture we look at the Internet and the impact of the network. We will also look at the future of the Internet.
The ideas for cellular phones were developed in the 1940s. However, it was not until the microprocessor became available that practical commercial solutions were possible.
Today there are more than 5 billion unique mobile phone subscriptions in the world, and of them about 2.5 billion are smartphones. This device is so powerful that people check it over 40 times a day.
In this lecture we look at mobile. We also look at the history of communication since the telegraph, and at how the mobile market developed in the 80s and 90s until the iPhone was released in 2007. That same year, Western Union stopped sending telegraph messages.
Did you know that the term "computer" once meant a profession? And what did these people, the computers, actually do? They computed mathematical problems. Some problems were tedious and error-prone, so it is not surprising that people started to develop machines to aid in the effort. The first mechanical computers were created to get rid of errors in human computation. Then came tabulating machines and cash registers. It was not until telephone companies were well established that computing machines became practical.
The first computers were huge mainframes, but soon minicomputers like DEC's PDP started to appear. The transistor was introduced in 1947, but its usefulness was not truly realized until 1958, when the integrated circuit was invented. This led to the invention of the microprocessor. Intel marketed the 4004 in 1971, and the personal computer revolution started. One of the first personal computers was MITS' Altair. This was a simple device, and soon others saw the opportunities.
In this lecture we start our coverage of computing and look at some of the early machines and the impact they had.
Software is changing the way traditional business operate. People now have smartphones in their pockets - a supercomputer that is 25,000 times more powerful and the minicomputers of the 1960s. This is changing people's behaviour and how people shop and use services. The organisational structure created in the 20th century cannot survive when new digital solution are being offered. Software is changing the way traditional business operate. People now have smartphones in their pockets - a supercomputer that is 25,000 times more powerful and the minicomputers of the 1960s. This is changing people's behaviour and how people shop and use services. The organisational structure created in the 20th century cannot survive when new digital solution are being offered. The hierarchical structure of these established companies assumes high coordination cost due to human activity. But when the coordination cost drops
The organisational structure that companies established in the 20th century was based on the fact that employees needed to do all the work. The coordination cost was high due to the effort and cost of employees, housing, etc. Now we have software that can do this for us, and the coordination cost drops to close to zero. Another consequence is that things become free. Consider Flickr. Anybody can sign up and use the service for free. Only a fraction of the users get a pro account and pay. How can Flickr make money on that? It turns out that services like this can.
Many businesses make money by giving things away. How can that possibly work? The music business has suffered severely from the digital distribution of content. Should musicians put all their songs on YouTube? What is the future business model for music?
One of the great ironies of successful companies is how easily they can fail. New companies are founded to take advantage of some new technology. They become highly successful, but when the technology shifts and something new comes along, they are unable to adapt and fail. This is the innovator's dilemma.
Then there are companies that manage to survive. For example, Kodak survived two platform shifts, only to fail at the third. IBM has survived over 100 years. What do successful companies do differently?
History has many examples of great innovators who had a difficult time convincing their contemporaries of new technology. Even incumbent and powerful companies regarded new technologies as inferior and dismissed them as "toys". Then, when disruptive technologies take off, they are often overhyped and can cause bubbles like the Internet bubble of the late 1990s.
In this lecture we look at some examples of disruptive technologies and the impact they had. We look at the Disruptive Innovation Theory by Harvard professor Clayton Christensen.
Technology evolves in big waves that we call revolutions. The first was the Industrial Revolution, which started in Britain in 1771. Since then we have seen more revolutions come, and now we are in the fifth. These revolutions follow a similar path. First there is an installation period, where the new technologies are installed and deployed, creating wealth for those who were at the right place at the right time. This is followed by a frenzy, where the financial markets want to be a part. Then there is a crash and a turning point, followed by synergy, a golden age.
In 1908, a new technological revolution started: the Age of Oil and the Automobile. The technology trigger was Henry Ford's new assembly-line technique, which allowed the manufacturing of standardised, low-cost automobiles. This created the car industry and other manufacturing companies. It also created demand for gasoline, thus creating the oil industry. During the Roaring Twenties stock prices rose to new levels, until a crash and the Great Depression. Only after World War II came a turning point, followed by a golden age in the post-war boom.
In this lecture we look at a framework for understanding technological revolutions. These revolutions completely change societies and replace the old with new technologies. We will explore how these revolutions take place. We should now be in the golden age phase.
We also look at generations.
In the early days of product development, the technology is inferior and lacking in performance. The focus is very much on the technology itself. The users are enthusiasts who like the idea of the product, find uses for it, and accept the lack of performance. Then, as the product becomes more mature, other factors become important, such as price, design, features and portability. The product moves from being a technology to becoming a consumer item, and even a community.
In this lecture we explore the change from technology focus to consumer focus, and look at why people stand in line overnight to buy the latest gadgets.
When innovators try to envision how people will use their product, they often have different ideas about what people want. Products with superior technology may fail and inferior ones succeed, simply because the inferior product has some features that people are looking for.
In this lecture we look at how new products or technologies get adopted by markets. We look at the Law of Diffusion of Innovation, which explains how this adoption happens. We also look at what it takes for a new innovation to move from being a visionary idea to a practical product - crossing the chasm. Finally, we explore the hype cycle.
In this lecture we look at how innovation happens. We look at the slow hunch, the liquid network, the hummingbird effect, and serendipity.
3. Data Gathering in US 1950+ (1955-1965)
Social Security: calculate benefits for 15MM recipients (62MM now)
NASA: calculate real-time orbital determination
IRS: calculate / store 55MM records (126MM now)
Source: Mary Meeker Slide Deck 2019
4. Data Gathering in US 1950+ (1955-1975)
Banks: process checks
Telecoms: optimise telephone switching
Hospitals: manage patient data
Airlines: process transactions / data
Insurance: optimise insurance policies
Retail: track inventory / logistics
Credit Cards: manage merchant network
Source: Mary Meeker Slide Deck 2019
5. Big Bangs in Data
2006: Amazon AWS
"Until now, a sophisticated & scalable data storage infrastructure has been beyond the reach of small developers." — Amazon S3 Launch FAQ, 2006
2007: Apple iPhone
"Why run such a sophisticated operating system on a mobile device? Well, because it's got everything we need." — Steve Jobs, iPhone Launch, 2007
Source: Mary Meeker Slide Deck 2019
13. Big Data
With the computer revolution, digital data becomes possible
Over the years, data has grown exponentially
"Big Data" has become a platform by itself, with new possibilities
14. Global Data is Growing Fast
Data in Digital Universe vs. Data Storage Cost, 2010-2015
Source: Mary Meeker, KPCB
16. Data is a New Growth Platform
The Network: large investments in fibre optic & last-mile cable created the connectivity that facilitated early Internet growth
The Software: optimising the network with software became far more capital efficient than additional capital-expenditure buildouts, ultimately resulting in the creation of pervasive networks (siloed DCs -> AWS) and pervasive software (Siebel -> Salesforce)
The Infrastructure: the emergence of pervasive software created the need to optimise the performance of the network and store extraordinary amounts of data at extremely low prices
The Data: the next big wave is leveraging this unlimited connectivity and storage to collect / aggregate / correlate / interpret all of this data to improve people's lives and enable enterprises to operate more efficiently
21. Big Data Examples
Macy's Inc. and real-time pricing
The retailer adjusts pricing in near-real time for 73 million items, based on demand and inventory.
Source: Ten big data case studies in a nutshell
22. Big Data Examples
Tipp24 AG, a platform for placing bets
The company uses software to analyse billions of transactions and hundreds of customer attributes, and to develop predictive models that target customers and personalise marketing messages on the fly.
Source: Ten big data case studies in a nutshell
23. Big Data Examples
Wal-Mart Stores Inc. and search
The mega-retailer's latest search engine for Walmart.com includes semantic search. The platform, designed in-house, relies on text analysis, machine learning and even synonym mining to produce relevant search results. Wal-Mart says adding semantic search has increased the share of online shoppers completing a purchase by 10% to 15%.
Source: Ten big data case studies in a nutshell
24. Big Data Examples
PredPol Inc. and repurposing
The Los Angeles and Santa Cruz police departments, a team of educators and a company called PredPol have taken an algorithm used to predict earthquakes, tweaked it and started feeding it crime data. The software can predict where crimes are likely to occur down to 500 square feet. In LA, there has been a 33% reduction in burglaries and a 21% reduction in violent crimes in areas where the software is being used.
Source: Ten big data case studies in a nutshell
25. Big Data Examples
American Express and business intelligence
AmEx started looking for indicators that could really predict loyalty, and developed sophisticated predictive models to analyse historical transactions and 115 variables to forecast potential churn. The company believes it can now identify 24% of Australian accounts that will close within the next four months.
Source: Ten big data case studies in a nutshell
26. Big Data Examples
A bank and IBM
A large US bank uses IBM machine learning technologies to analyse credit card transactions, using machine learning and stream computing to detect financial fraud.
30. What is Big Data?
"Big data is high-volume, high-velocity and/or high-variety information assets that demand cost-effective, innovative forms of information processing that enable enhanced insight, decision making, and process automation." — Gartner
31. What is Big Data?
"Big data refers to a process that is used when traditional data mining and handling techniques cannot uncover the insights and meaning of the underlying data. Data that is unstructured or time sensitive or simply very large cannot be processed by relational database engines. This type of data requires a different processing approach called big data, which uses massive parallelism on readily-available hardware." — Techopedia
32. What is Big Data?
"Big data is the oil of the 21st century and analytics is the combustion engine." — Peter Sondergaard, Gartner Research
33. What is Big Data?
How do you measure numbers at large scale?
45. What is Big Data?
Byte: one grain of rice
Kilobyte: handful of rice
Megabyte: big pot of rice
Gigabyte: truck full of rice
Terabyte: container ship full of rice
Petabyte: covers Manhattan
Exabyte: covers the west coast of the US
Zettabyte: fills the Pacific Ocean
Yottabyte: Earth-sized rice ball
Early computers -> Computers -> Internet -> Big Data
David Wellman: What is Big Data?
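The rice ladder maps directly onto powers of 1024. A quick sketch (an illustration added for this summary, using the binary convention; decimal units use factors of 1000):

```python
# Each step up the ladder multiplies the previous unit by 1024 = 2^10.
units = ["byte", "kilobyte", "megabyte", "gigabyte", "terabyte",
         "petabyte", "exabyte", "zettabyte", "yottabyte"]
for power, name in enumerate(units):
    # e.g. "1 terabyte = 2^40 = 1,099,511,627,776 bytes"
    print(f"1 {name} = 2^{10 * power} = {1024 ** power:,} bytes")
```

A yottabyte, the Earth-sized rice ball, works out to 2^80, roughly 1.2 x 10^24 bytes.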
46. What is Big Data?
Big Data is not just about the size of the data; it's about the value within the data.
This value can be used for marketing, business optimisation, gaining insights, improving health, security, etc.
48. Why Big Data Analytics?
Understand the data the company has
Process data to see patterns, correlations and information that can be used to make better decisions
Obtain insights that are otherwise not known
49. Data Analytics
TRADITIONAL APPROACH: structured and repeatable analyses. Business users determine what questions to ask; IT structures the data to answer the question.
BIG DATA APPROACH: iterative and exploratory analyses. IT delivers a platform to enable creative discovery; business users explore what questions could be asked.
50. Tools for Data Analytics
NoSQL databases: MongoDB, Cassandra, HBase, Hypertable
Storage: S3, Hadoop Distributed File System
Servers: EC2, Google App Engine, Heroku
MapReduce: Hadoop, Hive, Pig, Cascading, S4, MapR
Processing: R, Yahoo! Pipes, Solr/Lucene, BigSheets
51. Two Types of Data Analysis Problems
Supervised learning: learn from data where we have labels for all the data we have seen so far. Example: determining spam emails.
Unsupervised learning: learn from data where we don't have any labels. Example: grouping emails, AlphaZero.
Learning is about discovering hidden patterns in data.
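To make the supervised case concrete, here is a toy illustration (invented for this summary, not taken from the lecture): we learn from a handful of labelled emails and predict a label for unseen text by vocabulary overlap. A real spam filter would use a probabilistic model such as naive Bayes, but the shape of the problem is the same: labelled examples in, predictions out.

```python
# A handful of labelled training emails: (text, label) pairs.
training = [
    ("win money now claim your prize", "spam"),
    ("free prize click now", "spam"),
    ("meeting agenda for tomorrow", "ham"),
    ("lunch tomorrow with the team", "ham"),
]

def train(examples):
    # "Learning" here is just collecting each class's vocabulary.
    vocab = {"spam": set(), "ham": set()}
    for text, label in examples:
        vocab[label].update(text.split())
    return vocab

def predict(vocab, text):
    # Predict the label whose training vocabulary overlaps most
    # with the words of the unseen email.
    words = set(text.split())
    return max(vocab, key=lambda label: len(words & vocab[label]))

vocab = train(training)
print(predict(vocab, "claim your free prize now"))    # -> spam
print(predict(vocab, "agenda for the team meeting"))  # -> ham
```

The point is that the labels do the heavy lifting: without them, the same word counts could only be grouped, not named, which is exactly the unsupervised setting.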
52. Clustering
One of the oldest problems in unsupervised data analysis
In clustering the goal is to group data according to similarity
Algorithms such as K-means are used for clustering
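As a concrete sketch of the K-means idea (an illustration written for this summary, not the lecture's own code): the algorithm alternates between assigning each point to its nearest centroid and moving each centroid to the mean of its assigned points, until the assignments stop changing.

```python
import math

def kmeans(points, k, iters=100):
    # Naive initialisation: use the first k points as starting centroids.
    # (Real implementations use random restarts or k-means++, and must
    # also handle clusters that end up empty.)
    centroids = [list(p) for p in points[:k]]
    labels = [0] * len(points)
    for _ in range(iters):
        # Step 1: assign each point to its nearest centroid.
        labels = [min(range(k), key=lambda j: math.dist(p, centroids[j]))
                  for p in points]
        # Step 2: move each centroid to the mean of its assigned points.
        new = []
        for j in range(k):
            members = [p for p, lab in zip(points, labels) if lab == j]
            new.append([sum(c) / len(members) for c in zip(*members)])
        if new == centroids:  # converged: centroids stopped moving
            break
        centroids = new
    return labels, centroids

# Two well-separated groups of 2-D points:
pts = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1),
       (5.0, 5.0), (5.1, 5.2), (4.9, 5.1)]
labels, centroids = kmeans(pts, 2)
# labels -> [0, 0, 0, 1, 1, 1]
```

The archaeology example on the following slides fits the same mould: with locations as points and the historian's three families as k = 3, each artefact would be assigned to exactly one cluster.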
53. Clustering
For each artefact found, the location to N and E from the marker is recorded. That is a data set.
Before the dig, a historian has said that three families lived in the location.
54. Clustering
Similar: close in physical distance.
You assign each data point to one and only one group. The groups are called clusters.
55. Clustering
Clustering is the unsupervised learning problem where you take your data and assign each data point to exactly one group, or cluster.
Uses unlabelled data.
56. Clustering
We may have collected data but not know what to do with it.
We might want to explore the data without a particular end goal in mind; perhaps the data will suggest interesting avenues for further analysis.
In this case, we say that we're performing exploratory data analysis.
57. Exploratory data analysis
We don't know what we are looking for.
Data point = colour and location of a pixel; dissimilarity is the distance in colour.
58. Exploratory data analysis
In some cases labelling is too expensive.
For example, news changes every day, and there is too much of it.
60. Alexander Nix, CEO of Cambridge Analytica
Ted Cruz's campaign for the US Republican presidential nomination
62. Data Analysis as a Platform
THEN: complex tools operated by data analysts; a chaos of data silos across the company
NOW: real-time data analytics platforms like Looker
63. Customer Data as a Platform
THEN: difficult to customise; lack of automated customer insights
NOW: real-time intelligence that automatically tracks and analyses interactions with customers
64. Mapping Data as a Platform
THEN: difficult and expensive to collect data; limited in-app digital map usage
NOW: mapping platforms like Mapbox
65. Cloud Data Monitoring as a Platform
THEN: expensive and clunky point solutions; lengthy implementation cycles; only used by system administrators
NOW: cloud monitoring platforms like Datadog