Application of information communication technology (ICT) - Kishor Satpathy
Presented in National Seminar on Networking of Library and Information Centres of North East India in Digital Environment (NLICDE-2011)
(21-23 March 2011)
ORGANISED BY
NIT Silchar
This slide deck gives an overall description of Apple: its history, a SWOT analysis, its competitors, its industry position, hardware and software quality, and market position.
Web 1.0 was an early stage evolution focused on how users could connect to the web through the user interface. Web 2.0 emerged around 2004 and focused mainly on interactivity and collaboration through social media; it too has peaked.
Through the evolution of smart phones and the ongoing improvement of technology, Web 3.0 offers more solutions for browsing and enables consumers to browse application data from anywhere in the world.
Hassan Bawab will share how Web 3.0 started as merely a trend but is quickly becoming the standard.
Capitalizing on Web 3.0 requires providing a mobile experience to end-users. It also means more effective communication and ease of reach. Implementing a Web 3.0 strategy can ultimately lead to improved intelligence and customer engagement for organizations in any industry.
This presentation includes 60+ slides that mainly deal with three Computer Security aspects, i.e.
1. Security Attacks and Threats
2. Security Services
3. Security Mechanisms
Along with that, we've also included Security Awareness and Security Policies.
Web 3.0, or the Decentralised Web, is set to revolutionise the Internet era through Blockchain, Big Data Analytics and Artificial Intelligence.
There has been a buzz around Web 3.0 and the disruption it will bring to the industry, but only a few actually know why it emerged and what it is about to transform. Let us travel back in time to examine its predecessors, Web 1.0 and 2.0.
The Blockchain, the Internet of Things, Advanced analytics, and Artificial Intelligence are potent technologies that will have a profound effect on society. They will take us much further into this new world of the information age as power shifts in a radical way from people in hierarchical institutions to automated networks and the algorithms that can coordinate in the Web 3.0 era.
Web 3.0 knowledge management should give rise to an exciting and game-changing environment: the Social Semantic Web. The technology is still in its early stages, but anyone who has used Google Search recently knows that Google already uses natural language to find the answer to your question. You are therefore already experiencing the revolutionary benefits of the next chapter in the story of the "World Wide Web (WWW)."
This is an analysis of Accenture and the businesses and activities it is involved in. It includes a brief history of Accenture, its business, market growth, competitor details, how it is positioned in the market, and Accenture's strategy for business consulting. The slides are helpful for students for academic purposes.
Latest trends in information technology - Atifa Aqueel
This ppt includes the latest trends in information technology such as big data analytics, cloud computing, virtual reality, 5G wireless technology etc.
Science and language are the main factors fueling the mechanisms behind the headlong transformation of our private and social lives. It is poetry and philosophy that will give it meaning.
Novelty is a good in itself. There is a certain fascination today with technological progress. Very recently, the pace of these developments suddenly accelerated, projecting science fiction into our daily lives. Yet we tend to focus on the movement of a change rather than on its final goal. Being mobile, always adapting, innovating again, changing ever faster: these have become the principles of our Western consciousness, our new religion. It is therefore important to question the purpose of transforming our organisations, in order to give it meaning.
In this first document, I try to understand, through the prism of companies, the origins of this transformation, to which digital technology and globalisation have contributed heavily. I then propose an approach for taking it in hand. Being an actor in one's own evolution amid this whirlwind of innovations is a first step towards inhabiting this world and putting humanity at the heart of our activities.
DC10 - IBM, Kees Donker - Servitization for manufacturing - from hw and sw su... - Jaak Vlasveld
Kees Donker from IBM presented at the Servitization for manufacturing session at the Service Innovation Congres 2010 (DC10) in Almere, the Netherlands.
Perspectives on the optical fiber industry: where do we go from here - Pulkit Bhatnagar
Strategy Paper on how successful countries and companies were driving Broadband (... and Optical Fiber usage) and what Fiber manufacturers could learn from these case studies.
First presented - June 2009
Capturing Value from The Next 10 Billion Devices - Paul Brody
What can we learn from the last major diffusions of technology into our society (mobile & PC), and how will that apply to the Internet of Things? What strategies and business models should we consider to build sustainably profitable solutions?
BCO 117 IT Software for Business Lecture Reference Notes.docx - jesuslightbody
BCO 117 IT Software for Business
Lecture Reference Notes
Cloud computing
Eras in IT infrastructure evolution
Chapter 5. IT Infrastructure and EmergingTechnologies
Management Information Systems (Kenneth P. Laudon, Jane C. Laudon)
An information technology (IT) paradigm: a model for enabling ubiquitous access to shared pools of configurable resources (such as computer networks, servers, storage, applications and services) which can be rapidly provisioned with minimal management effort, often over the Internet.
· Computing as a service
· Computing on the Internet
· Business line for computing corporations
Hassan, Qusay (2011). "Demystifying Cloud Computing" (PDF). The Journal of Defense Software Engineering.
Cloud computing
Cloud computing examples
Software as a Service
Platform as a Service
Infrastructure as a Service
https://aws.amazon.com/products/?hp=tile&so-exp=below
Cloud computing success
Key concepts
· Reliability – reliability of the system, measured as Mean Time Between Failures (MTBF)
· Availability – uptime of the system or application, measured in parts per million (PPM) of downtime
· Serviceability – how easily the system can be restored after a failure, measured as Mean Time To Repair (MTTR)
· Manageability – the ease with which the entire system can be managed, measured in systems per headcount
· Scalability – the ability of an information system to be used or produced across a range of capacities
· "Updatability" – a key factor linked to performance, integration with other IS, and security
https://software.intel.com/en-us/articles/total-cost-of-ownership-factors-to-consider
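The MTBF and MTTR figures above combine into a single availability number. A minimal sketch in Python, assuming the standard steady-state formula A = MTBF / (MTBF + MTTR), which is a common convention rather than something stated in these notes:

```python
def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability from mean time between failures and mean time to repair."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def downtime_ppm(avail: float) -> float:
    """Unavailability expressed in parts per million of downtime."""
    return (1.0 - avail) * 1_000_000

# One 1-hour repair per ~1000 hours of operation:
a = availability(mtbf_hours=999.0, mttr_hours=1.0)
print(round(a, 3))               # 0.999, i.e. "three nines"
print(round(downtime_ppm(a)))    # 1000 PPM of downtime
```

This also shows why serviceability matters as much as reliability: halving MTTR improves availability as much as doubling MTBF does.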
Top Benefits of Cloud Computing
http://www.mushibhuiyan.com/category/cloud/
Debate
https://www.forbes.com/sites/louiscolumbus/2013/08/13/idg-cloud-computing-survey-security-integration-challenge-growth/#268d6d3755cb
Cloud Computing strategy
https://www-01.ibm.com/common/ssi/cgi-bin/ssialias?htmlfid=WUW12350USEN
Quantum Computing in Financial Services - Executive Summary - MEDICI Inner Circle
MEDICI’s 'Quantum Computing in Financial Services' report, a deep dive into the impact of Quantum Computing on the financial services sector, highlights key players in the ecosystem across hardware, software, and services, discusses the adoption of Quantum Computing by the financial services industry, and analyzes collaborative efforts exploring its early use cases in financial services.
GridMate - End to end testing is a critical piece to ensure quality and avoid... - ThomasParaiso2
End to end testing is a critical piece to ensure quality and avoid regressions. In this session, we share our journey building an E2E testing pipeline for GridMate components (LWC and Aura) using Cypress, JSForce, FakerJS…
UiPath Test Automation using UiPath Test Suite series, part 5 - DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series, part 5. In this session, we will cover CI/CD with DevOps.
Topics covered:
CI/CD within UiPath
End-to-end overview of the CI/CD pipeline with Azure DevOps
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
A tale of scale & speed: How the US Navy is enabling software delivery from l... - sonjaschweigert1
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATO’s (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor... - SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024 - Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview. Including the concepts of Customer Key and Double Key Encryption.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... - James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
DevOps and Testing slides at DASA Connect - Kari Kakkonen
Slides by me and Rik Marselis at the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps is. We ended with a lovely workshop in which the participants tried to find different ways to think about quality and testing in different parts of the DevOps infinity loop.
Securing your Kubernetes cluster: a step-by-step guide to success! - KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
zkStudyClub - Reef: Fast Succinct Non-Interactive Zero-Knowledge Regex Proofs - Alex Pruden
This paper presents Reef, a system for generating publicly verifiable succinct non-interactive zero-knowledge proofs that a committed document matches or does not match a regular expression. We describe applications such as proving the strength of passwords, the provenance of email despite redactions, the validity of oblivious DNS queries, and the existence of mutations in DNA. Reef supports the Perl Compatible Regular Expression syntax, including wildcards, alternation, ranges, capture groups, Kleene star, negations, and lookarounds. Reef introduces a new type of automata, Skipping Alternating Finite Automata (SAFA), that skips irrelevant parts of a document when producing proofs without undermining soundness, and instantiates SAFA with a lookup argument. Our experimental evaluation confirms that Reef can generate proofs for documents with 32M characters; the proofs are small and cheap to verify (under a second).
Paper: https://eprint.iacr.org/2023/1886
Observability Concepts EVERY Developer Should Know - DeveloperWeek Europe.pdf - Paige Cruz
Monitoring and observability aren’t traditionally found in software curriculums and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is a part of your current company’s observability stack.
While the dev and ops silo continues to crumble….many organizations still relegate monitoring & observability as the purview of ops, infra and SRE teams. This is a mistake - achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and will share these foundational concepts to build on.
Dr. Sean Tan, Head of Data Science, Changi Airport Group
Discover how Changi Airport Group (CAG) leverages graph technologies and generative AI to revolutionize their search capabilities. This session delves into the unique search needs of CAG’s diverse passengers and customers, showcasing how graph data structures enhance the accuracy and relevance of AI-generated search results, mitigating the risk of “hallucinations” and improving the overall customer journey.
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
Communications Mining Series - Zero to Hero - Session 1 - DianaGray10
This session provides an introduction to UiPath Communication Mining, its importance, and a platform overview. You will acquire a good understanding of the phases in Communication Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How can it help today’s business and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
Enhancing adoption of Open Source Libraries. A case study on Albumentations.AI - Vladimir Iglovikov, Ph.D.
Presented by Vladimir Iglovikov:
- https://www.linkedin.com/in/iglovikov/
- https://x.com/viglovikov
- https://www.instagram.com/ternaus/
This presentation delves into the journey of Albumentations.ai, a highly successful open-source library for data augmentation.
Created out of a necessity for superior performance in Kaggle competitions, Albumentations has grown to become a widely used tool among data scientists and machine learning practitioners.
This case study covers various aspects, including:
People: The contributors and community that have supported Albumentations.
Metrics: The success indicators such as downloads, daily active users, GitHub stars, and financial contributions.
Challenges: The hurdles in monetizing open-source projects and measuring user engagement.
Development Practices: Best practices for creating, maintaining, and scaling open-source libraries, including code hygiene, CI/CD, and fast iteration.
Community Building: Strategies for making adoption easy, iterating quickly, and fostering a vibrant, engaged community.
Marketing: Both online and offline marketing tactics, focusing on real, impactful interactions and collaborations.
Mental Health: Maintaining balance and not feeling pressured by user demands.
Key insights include the importance of automation, making the adoption process seamless, and leveraging offline interactions for marketing. The presentation also emphasizes the need for continuous small improvements and building a friendly, inclusive community that contributes to the project's growth.
Vladimir Iglovikov brings his extensive experience as a Kaggle Grandmaster, ex-Staff ML Engineer at Lyft, sharing valuable lessons and practical advice for anyone looking to enhance the adoption of their open-source projects.
Explore more about Albumentations and join the community at:
GitHub: https://github.com/albumentations-team/albumentations
Website: https://albumentations.ai/
LinkedIn: https://www.linkedin.com/company/100504475
Twitter: https://x.com/albumentations
Computer industry report
The Computer Industry
Computing is not about computers any more. It is about living. - Nicholas Negroponte
Historical development, importance and growth of the industry under consideration
The birth of the computer
1642: The first adding machine, a precursor of the digital computer, was devised by Blaise Pascal.
1670s: Gottfried Wilhelm Leibniz improved on this machine by devising one that could also multiply, instead of just adding and subtracting.
1801: Joseph-Marie Jacquard invented the Jacquard weaving loom, which is controlled by punched cards. In its operation, holes are strategically punched in cards and the cards are sequenced to indicate a particular weaving design.
1821: The British mathematician and inventor Charles Babbage, the pioneer of the digital computer, worked out the principles of the modern digital computer. His Analytical Engine was designed to handle complicated mathematical problems.
1843: Lady Ada Augusta Lovelace suggested that cards could be prepared that would instruct
Babbage’s analytical engine to repeat certain operations. Because of her suggestion, some
people call Lady Lovelace the first programmer.
1880s: The American statistician Herman Hollerith conceived the idea of using perforated cards
for processing data. Employing a system that passed punched cards over electrical contacts, he
was able to compile statistical information for the 1890 U.S. census. He applied the Jacquard
loom concept to computing
1940: Dr. John V. Atanasoff, a professor at Iowa State University, developed the first electronic digital computer. He called his invention the Atanasoff-Berry Computer, or ABC. This is one of the most significant events in the history of computing.
1946: A large-scale, fully operational electronic computer called the ENIAC (Electronic Numerical Integrator and Computer) was born. The ENIAC was a major breakthrough in computer technology. It could do about 5,000 additions and several hundred multiplications per second. It weighed 30 tons and occupied 1,500 square feet of floor space.
Categorisation
Broadly, computers are categorised as analog, digital and hybrid computers. Based on size, speed, processing capabilities and price, computers have been categorised as microcomputers (PCs), minicomputers, mainframe computers and supercomputers. The performance growth rates for supercomputers, minicomputers and mainframes have been just under 20% per year, versus about 35% for microcomputers. Performance growth of the PC has been the fastest, partly because these machines take the most direct advantage of improvements in IC technology.
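To see what those annual growth rates mean cumulatively, a minimal sketch assuming simple compounding (the ten-year horizon is chosen here purely for illustration):

```python
def cumulative_growth(annual_rate: float, years: int) -> float:
    """Total performance multiple after compounding an annual growth rate."""
    return (1.0 + annual_rate) ** years

# ~20%/yr (supercomputers, minis, mainframes) vs ~35%/yr (microcomputers):
print(round(cumulative_growth(0.20, 10), 1))  # 6.2x over a decade
print(round(cumulative_growth(0.35, 10), 1))  # 20.1x over a decade
```

A seemingly modest gap of 15 percentage points per year compounds into more than a threefold difference in total performance gained over ten years.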
Technology and Trends
Generations of computers have been made possible by the following technologies:
1950-1959: Vacuum tubes, which led to electronic computers
1960-1968: Transistors, which led to cheaper computers
1969-1977: Integrated circuits, which led to minicomputers
1978-1999: LSI and VLSI, which led to personal computers and workstations
1999 onwards: Parallel processing, which led to multiprocessors
The IC permitted the miniaturisation of computer-memory circuits and the microprocessor
reduced the size of a computer’s CPU to the size of a single silicon chip.
Trends in PCs are greatly influenced by the technology trends of both hardware and software. Both trends determine a PC's performance and capability.
Logic: transistor count on a chip increases by about 25% every year, doubling in 3 years; device speed increases nearly as fast.
DRAM: density increases by about 60% every year, quadrupling in 3 years; cycle time has improved very slowly, decreasing by about one-third in 10 years.
Disk: density increases by about 25% every year, doubling in 3 years; access time has improved by one-third in 10 years.
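The doubling and quadrupling figures quoted above follow directly from the annual rates. A small check, assuming plain compound growth and solving for the doubling (or quadrupling) time:

```python
import math

def years_to_multiply(annual_rate: float, target: float) -> float:
    """Years of compounding at annual_rate needed to reach a target multiple."""
    return math.log(target) / math.log(1.0 + annual_rate)

print(round(years_to_multiply(0.25, 2.0), 1))  # ~3.1 years to double at 25%/yr
print(round(years_to_multiply(0.60, 4.0), 1))  # ~2.9 years to quadruple at 60%/yr
```

Both come out close to the "3 years" figure the notes quote for doubling and quadrupling respectively.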
Performance over the years
PC sales by region
The rise of the computer in India
1956: India bought its first computer, called the HEC-2M, for a sum of Rs. 10 lakh. It was 10 ft in length, 7 ft in breadth and 6 ft in height.
1962: India produced its next generation of computers, including its first indigenous computer, the TIFRAC (Tata Institute of Fundamental Research Automatic Calculator).
1970: Following the Homi Bhabha Committee's recommendations, the Department of Electronics (DOE) was formed under the Prime Minister.
1978: The "Minicomputer policy" opened up computer manufacturing to the private sector. The National Informatics Centre (NIC), set up in 1977, played a major role in the later decades, becoming a "decisive support system for the government" (both the central and state governments).
1986: Project IMPRESS (computerisation of Railways ticketing), started in 1986 as a pilot project in Secunderabad, ushered in the first application targeted at the "aam admi" (common man).
1988: The formation of NASSCOM, which sprang into action from 1990, gave a boost to the nascent software industry.
The nineties also saw a direct "push" from the government through a number of policy measures. Some of them include the IT Task Force led by the Prime Minister in 1999, an IT Ministry at the centre, IT fairs (like IT.com and now IT.in), IT parks, the launch of public internet access through VSNL on August 15, 1995, and the launch of mobile telephony on August 23, 1995.
Many software companies were born in the 80s, such as Infosys, Patni, Satyam, Softek, Tata Infotech and Wipro. Another interesting trend was the setting up of offshore development centres (ODCs) by multinational corporations, starting with Texas Instruments in 1986.
The growth of the Indian Software Industry
[Chart residue: legible data points include $200 mn (1995), $1 bn (2006) and $8.4 bn (2010), plus figures of $35 bn and $50 bn; the remaining year labels (1990, 2000) could not be reliably matched to values]
Total PC (desktops and notebooks) sales (2004-10) in India
Sales of other peripherals
Importance and application of computers in the contemporary world
• Computer systems have increasingly cut down the paperwork involved in millions of
industries around the world
• Products from meats to magazines are packed with zebra-striped bar codes that can be
read by computer scanners at supermarkets. These codes also help manage inventory
• Energy companies use computers to locate oil, coal, natural gas and uranium. These
companies can figure out the site of a natural resource, its concentration and other
related figures
• Computers are used in cars to monitor fluid levels, temperatures and electrical systems.
Computers also help run rapid transit systems, load container ships and track
railroad cars across the country
• Computers speed up record keeping and allow banks to offer same-day services.
Computers have helped fuel the cashless economy, enabling the widespread use of
credit cards, debit cards and instantaneous credit checks by banks and retailers.
• Computers are helping immensely to monitor extremely ill patients in the intensive care
unit and to provide cross-sectional views of the body
• Computers are used to control production precisely. Robots and machinery are
controlled by computers, making the production process faster and cheaper
• Computers are perhaps most popular for connecting people with one another on the World
Wide Web
Type of market structure of the computer industry
Our research suggests that the computer industry is an oligopoly. There are a few dominant
firms, such as Apple and Dell, along with many small ones, such as Intex. The products
manufactured by the big players are either similar or differentiated. The bigger players
have pricing power but must always live with the fear of retaliation from competitors. Entry
barriers in this industry are high: regular technological advances are needed, so R&D costs
are high. The extensive use of non-price competition, driven by the fear of price wars,
raises the barriers considerably; patent wars and advertising are typical arenas of
non-price competition. This is also a market with large economies of scale, one that
eliminates weak competitors over its business cycles. Companies also gain benefits from
mergers and acquisitions in this segment, and there is the possibility of behind-the-scenes
cartels formed to take undue advantage of demand.
There are two unique aspects of oligopoly competition
- Mutual Interdependence
- Repeated Interaction
Whenever the actions of one firm have a major impact on another firm, mutual
interdependence exists. If Dell reduces its price to sell more of its product, HP will
notice that its sales have fallen and will kick in countermeasures. An oligopoly firm
affects the sales of other firms by changing its prices, marketing strategies and so on.
Very often, oligopolists in an industry have been competing with one another for years.
Dell and HP have competed in the same market for many years now. This lays the ground for
repeated interaction, as each company remembers what happened in the past when strategies
were changed. Repeated interaction does not mean that the rival's strategy is known with
certainty.
A firm in an oligopoly market faces a very complex environment. Its sales depend on the
prices set by the other oligopolists in the industry. Consider a situation where Dell is
currently selling 10,000 laptops at $1,000 each. Dell decreases the price by $100, which
most likely increases sales, but the size of the increase depends on the competitor's new
pricing. If HP holds its price, Dell would sell 14,000 units; but if HP also decreases its
price for this product, Dell's sales would be only around 11,000 units. The rise in demand
for Dell depends on both its own price cut and the competitor's price. The oligopoly dual
demand curve shown below pertains to this example; it shows Dell's demand curve under both
responses.
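The revenue consequences of the example above can be worked out directly; a minimal sketch using only the figures quoted in the text:

```python
# Dell's revenue in the example above, under each of HP's possible responses.

def revenue(price, units):
    return price * units

base = revenue(1000, 10_000)       # $10.0M at the original price
hp_holds = revenue(900, 14_000)    # $12.6M if HP keeps its price
hp_matches = revenue(900, 11_000)  # $9.9M if HP matches the cut

# The cut pays off only if HP does not follow.
print(hp_holds > base, hp_matches > base)  # True False
```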
Dell and HP have had repeated interaction over the years, from which each would have noted
patterns in the other's strategies. Dell might not know with certainty what HP will do if
Dell raises prices, but Dell does know that 80% of the time HP did not raise its prices in
response, and that 85% of the time HP matched a price reduction. Dell can now set its prices
under reasonable assumptions: it can make a good guess about the demand curve along which it
will move if it changes its price. A double line marks this demand curve (the kinked demand
curve), which is bent at Dell's initial price and number of units offered.
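Combining the historical response rates with the earlier sales figures gives a probability-weighted estimate of demand after a cut (this combination of the text's numbers is ours, for illustration):

```python
# Probability-weighted demand for Dell after a $100 price cut, using the
# historical response rates quoted above (illustrative combination).
p_match = 0.85              # HP matched a price reduction 85% of the time
units_if_matched = 11_000   # Dell's sales if HP follows the cut
units_if_not = 14_000       # Dell's sales if HP holds its price

expected_units = p_match * units_if_matched + (1 - p_match) * units_if_not
expected_revenue = 900 * expected_units  # at the new $900 price

print(round(expected_units))    # 11450
print(round(expected_revenue))  # 10305000
```

In expectation the cut lifts revenue above the original $10.0M, which is one reason price cuts stay tempting even when matching is likely.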
For firms in an oligopoly, a way to avoid this nasty competition, and even to increase
profits, often exists: explicit (overt) or tacit collusion, which is illegal in most
markets. That is, if both firms move up the solid demand curve in the above diagram, both
are better off. The problem, however, is that each firm fears that if it raises prices
unilaterally, the other will stab it in the back by keeping its price low.
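This unilateral-raise dilemma is the classic prisoner's dilemma; a sketch with hypothetical payoffs (the dollar figures are invented for illustration, not taken from the source):

```python
# Hypothetical per-firm profits ($M) for each (own move, rival move) pair.
# Both raising prices is jointly best, but each firm gains by staying low
# while the other raises -- so "low" dominates and collusion is unstable.
payoff = {
    ("raise", "raise"): 12,  # tacit collusion: both better off
    ("raise", "low"):    6,  # unilateral raise: the rival undercuts you
    ("low",   "raise"): 15,  # you undercut the rival
    ("low",   "low"):   10,  # status quo competition
}

def best_response(rival_move):
    """Own profit-maximising move given the rival's move."""
    return max(("raise", "low"), key=lambda m: payoff[(m, rival_move)])

# "low" is the best response either way, even though (raise, raise) pays more.
print(best_response("raise"), best_response("low"))  # low low
```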
Major Players and their Shares in the Market
• Over the past half century, computers have grown rapidly from bulky calculating tools
into an integral part of our lives, permeating every task we do. This growth has also
given rise to a business that generates money like no other, producing some of the
richest men in the world. Ever since the idea took hold that the computer could move
beyond being a scientific and military tool to make business faster and smarter and ease
the life of the common man, the entrepreneurs and business houses who envisioned this
future and invested in it have become some of the most recognizable names in the world.
Microsoft, Dell, Sony and IBM were among those who rode this revolution to become leaders
in their business areas.
• More than 1 billion personal computers have been sold since the mid-1970s. 75 percent
were professional or work related, while the rest were sold for personal or home use.
About 81.5 percent of personal computers shipped had been desktop computers, 16.4
percent laptops and 2.1 percent servers. The United States had received 38.8 percent
(394 million) of the computers shipped, Europe 25 percent, and 11.7 percent had gone to
the Asia-Pacific region, the fastest-growing market as of 2002. The second billion was
expected to be sold by 2008. Almost half of all households in Western Europe had a
personal computer, and a computer could be found in 40 percent of homes in the United
Kingdom, compared with only 13 percent in 1985.
• In 2001, 125 million personal computers were shipped in comparison to 48 thousand in
1977. More than 500 million personal computers were in use in 2002.
• The major players in this business are Dell, HP, Apple, Acer, Sony, Lenovo, Panasonic,
Samsung, Toshiba and others. There is intense competition among these companies, and
their rising and falling market shares indicate the same.
• The computer manufacturers' market can also be divided along the lines of stationary
(desktop) computers and mobile computers (laptops), which are further subdivided into
subgroups like business use, personal use, etc.
• The Operating Systems Market
• The operating system (OS) market is largely ruled by Microsoft's hugely successful
Windows, which has a number of versions, the latest being Windows 7. Microsoft's
policies of making its software exclusive to its OS, and its patenting, discourage the
use of other OSs, carving out a near-monopoly for itself. Still, a number of other OSs,
such as iOS, Mac OS X and Linux, are used to a smaller extent.
• Servers
• In the server market, Microsoft-based servers have a share of 36.1%, and other players,
including Linux, BSD and Solaris, have a share of 63.9%.
• Mainframes
• In the mainframe market, IBM System z has a market share of 90-95%, making it by far
the largest player in this segment.
Global activities in the computer industry
As the computer industry grows, develops and matures, global personal computer (PC)
installations keep rising and are poised to touch the 1.7 billion mark by the end of 2012.
Canalys estimates that the industry grew approximately 19.2% in 2010 over 2009, with Lenovo
and Apple the major gainers in market share. Apple registered a staggering 241% growth in
the same period.
This major shift in Apple's market share has been primarily due to the introduction of the
iPad and its unprecedented sales, which have fuelled Apple's rise to third position in the
world rankings as a hardware manufacturer; it is expected to rise to second, behind only
HP, by the end of the year. The global computer scenario at present is thus characterised
by rising volumes and falling market share for most major players.
The fastest-growing computer economies are countries in the Asia-Pacific, Central and
Eastern Europe, Latin America and the Middle East. The spurt in growth of the computer
industry is due to the expansion in the definition of the computer as a product. Laptops,
tablets, mini-notes and notebook PCs are the different forms in which consumers perceive
the computer today, and these products are powering the industry's strong growth.
Shifting focus from the growth of the industry to that of firms across the globe, 40 of the
top 100 hardware companies in the world are still based in the United States of America
(USA). A closer look, however, shows that the developing and developed Asian economies
together are home to over 50 of the top 100 hardware companies in the world, led by Japan
(21) and Taiwan (18). This means that the growing markets are playing, and will continue to
play, a very important role in the progress of these companies and of the industry at large.
According to Gartner, even though mature and developed markets still contribute more than
half of the global computer industry, this share is coming down fast and currently stands
at around 58%. The developing markets, especially India and China, are now major
technological development and manufacturing hubs and are predicted to be the future leaders
in the industry.
The impact of global changes in the industry can be seen in the Indian economy as well. An
Intel study says that in India:
• Smaller cities contribute heavily to PC market growth
• Computers are a youth-driven category
• Buyers seek value and don't buy cheap
• Notebooks and other new computer items have become desirable and bring new users into
the fold
The development and exponential rise in the sales of smartphones is a major cause of
concern for the computer industry. The mobile industry is empowered by portability, rural
penetration and ease of use. Handset makers such as LG, HTC, Motorola, Nokia, Samsung and
Apple are the major players in this industry, and they give strong competition at times to
their own computer products as well as their competitors' products. Mobile penetration has
surged due to availability and interactive user interfaces. All this may pose even more
serious competition to the computer industry in the future.
Mergers and Acquisitions in the Computer Industry
A firm seeking to capitalize on the sales-growth benefit of increasing its relative size,
or the profitability benefit of introducing new products, may find a merger or acquisition
the optimal means of achieving its ends. Organic growth may also deliver these benefits,
and may be especially attractive both to firms with reservations about their capacity to
effectively integrate new businesses into their existing management structures and to those
generally sceptical of structuring an M&A transaction. But organic growth typically has
features some firms find unattractive: a given firm may find the time required to grow
organically prohibitive, or may simply lack the expertise needed to do so.
Lenovo and IBM
Global perspective:
With IBM taking an 18.9% stake in Lenovo, IBM has become Lenovo's second-largest
shareholder. The deal is likely to quadruple Lenovo's personal computing business. And it
will be a partnership: IBM will be the preferred services and customer financing provider
to Lenovo, while the Chinese company will be the preferred supplier of PCs to IBM.
With IBM's global presence, Lenovo Group has already moved its worldwide PC business
headquarters to New York and has added some 10,000 IBM employees, about 40 per cent of whom
are already in China.
Manufacturing of IBM-branded desktop and laptop products will continue. However, over the
next five years, the brands on those products will be phased out in an orderly fashion.
With the help of this multi-billion-dollar deal, the erstwhile Chinese company has
catapulted itself into a cut-throat competitive world, with Dell Inc. and Hewlett-Packard
as rivals.
More importantly, Lenovo must face off against a host of established Japanese names and
Taiwanese rivals that are beginning to get dangerously global and expanding into North
American, European and other markets.
Lenovo has opted for an expansion strategy, in which it is, in principle, partnering with IBM to
move overseas. This Sino-American approach differs from the approaches taken by other
companies, especially in regions like Taiwan.
However, on the branded side, Lenovo has not tasted global success so far. With IBM's
global name now tagged to Lenovo, China's branded-product story may change on the global
front.
LENOVO 2010-11 Q1 Product Performance
• IdeaPads
– Shipments up 74% year-to-year (YTY); sales up 74% YTY
– Launched the first 3D multimedia notebook, the Y560d, and the IdeaCentre A700 AIO
• ThinkPads
– Shipments up 29% YTY; sales up 32% YTY
– Launched the L series and the ThinkCentre M90z AIO
• Notebooks
– Shipments up 58% YTY; sales up 50% YTY
– Notebook market share gained 2.0 points, making Lenovo the fourth-largest notebook PC
company
• Desktops
– Shipments up 36% YTY; sales up 35% YTY
– Maintained solid performance through AIO and SMB-targeted desktops
• Mobiles
– Shipments up 82% YTY; Sales up 89% YTY
– Rapid and widespread customer acceptance and encouraging initial sales
HP – COMPAQ (A Failed Merger)
HP bought Compaq for US$ 24 billion in stock. This was the largest ever deal in the history of
the computer industry. The deal meant combined operations in more than 160 countries and
more than 145,000 employees. HP-Compaq would offer the most complete set of products and
services in the computer industry.
Whether an HP-Compaq merger made economic sense, and the problems encountered in merging
operations, make for an interesting discussion: the stock prices of both HP and Compaq fell
within two days of the merger announcement, and an estimated 13 billion dollars of market
capitalization was lost in this time frame.
Industry analysts failed to see the benefits HP would derive from acquiring Compaq. HP was
a market leader in the high-margin printer business, and Compaq a low-margin personal
computer (PC) manufacturer. Moreover, established players such as the direct marketer Dell
and the leading IT services and consulting company IBM would give fierce competition even
if economies of scale were achieved.
Source: http://www.casestudyinc.com/hp-and-compaq-merger
PROBLEMS FACED BY THE COMPUTER INDUSTRY:-
• Software updates and the need for innovation-> With competition in the field of
technology and ever newer software versions, keeping software upgraded at all times
becomes a task of utmost importance.
• Patent Wars-> Apple and Samsung fighting over smart phone patents.
• Increasing costs due to expensive labor, land and expenditure on technology->
Our research suggests that costs have increased manifold, due to the rise in the
prices of land, labor and technology.
• Need for skilled manpower-> The IT industry across the world faces a shortage of
skilled IT graduates.
• Increase in competition and narrow profit margins-> Competition has brought down
the prices of computers, software and cutting-edge technology.
• Infringement of Intellectual property rights.
• Increase in the bargaining power of buyers and suppliers.
• Handling the e-waste-> There is a large amount of e-waste generated in the
form of outdated computers and wires etc.
• Software piracy-> Pirated versions of all the latest software are easily available
online and elsewhere.
• Labor issues in the supply chain, due to different geographic locations with
different cultural backgrounds-> With rapid globalization, bridging the cultural
gap is really important, as companies operate in different countries and must get
the work done across them.
• Rising attrition rate in the IT industry-> Skilled manpower leaves a company after
2-3 years for various reasons, and the industry bears the loss of employing and
training a new resource all over again.
Problem 1. Foot-in-the-door Software
The recipe for creating foot-in-the-door software is really quite simple:
1. Design software that can do anything with “a little customization”.
2. Make it hard to customize. Make every protocol and specification proprietary
and hard to understand.
3. Don't go anywhere near any standards.
4. Provide a horde of overpriced consultants to fix all of the above problems,
and have them apply the “Ninja Technique” (Problem 2) so that they can stay
on-site indefinitely.
Voilà! Now just wait for the money to pour in from miserable customers.
Solution: Empower your customers by creating standards-compliant APIs and plugin
environments built on open, commonly used technologies, for which problem solvers can be
found everywhere. Even better, stop creating problems for your customers in the first
place. Stop sticking your foot in the door and focus on creating something that makes your
customers stronger. They will rely on you all the more for it, and it will be a
relationship based on trust, not desperation or despair.
Problem 2. The Ninja Distraction Technique (using Tech Jargon)
The software industry has spent years (or maybe decades) educating its customers in tech
jargon. It's all a part of the ninja technique of distraction. It is. Really. The theory
goes: keep throwing words such as “Java, JBoss, caching layers, Multi-Tier Software
Development Housing Facilities Campus” at the customers, and you will not only sound very
professional, but, even better, the customers will soon forget what they really asked you
for, so there's less of a chance you have to deliver.
Imagine being the customer in this scenario: here you were, looking for (a) a safe car
with (b) comfy seats, (c) low fuel consumption, (d) good stereo sound and (e) a large
trunk for all your groceries, and suddenly a car salesman is giving you a primer on
everything from the new four-layer varnish coating technology to the latest in
air-pressured suspension theory, revolutions in fuel injection and what not. You don't
want to hear about that; you want to know whether it will hold your coffee cup steady
while playing your Mozart in perfect pitch. Well, the industry seems to have distracted
you from all that.
Solution: It’s about time the industry starts talking about the metrics that the
customers can relate to and understand. And more notably, the ones that they
need. Let’s talk about ease of use. Let’s talk about performance (can you handle
100 users registrations a second?). Let’s talk about modifiability (can you deliver a
medium sized product feature change in 1 week, or less?). Let’s talk
about reliability. Is your up-time average more than 99,97%? Will your software
automatically restore upon any hardware problems? Can you upgrade our software
frequently without involving our tech-department? What is your fix time for bugs?
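Uptime percentages become far more relatable once translated into downtime; a small sketch of that conversion:

```python
# Translate an uptime percentage into downtime customers can picture.

def downtime_minutes_per_year(uptime_pct, days=365):
    """Minutes of allowed downtime per year at the given uptime percentage."""
    return (1 - uptime_pct / 100) * days * 24 * 60

print(round(downtime_minutes_per_year(99.97)))  # 158
print(round(downtime_minutes_per_year(99.9)))   # 526
```

So a 99.97% uptime average means under three hours of downtime a year, a figure any customer can weigh against their business.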