The document discusses the history and evolution of the World Wide Web from its inception in the late 1980s to the present day. It traces key developments from the static, text-based webpages of the early Web to today's highly dynamic web of user-generated content. The increasing amount of content from diverse sources created a need for dynamic websites that could efficiently manage and automate updates to vast amounts of information.
AJAX the Great: The Origin and Development of the Dynamic Web (2007)
Fran Fabrizio
This is my all-time favorite presentation that I've delivered. I was invited to address the ACM Student Chapter at UAB, and I thought this topic would appeal to them. Having watched the Web grow up (I got on the Web in 1992 when there was still an index page that listed every new page that had appeared on the web that day!), I thought it would be neat to trace the path from completely static, totally text pages to completely dynamic, asynchronous data delivery that was state of the art in 2007.
2. History of the WWW
• To understand why a need for dynamic websites arose, it helps to know a bit about the evolution of the World Wide Web...
3. History of the WWW
• The Internet as we know it today was not so much a single invention as the culmination of many different technologies and fields of research.
• We might categorise some of these as:
• The physical (network infrastructure)
• The logical (information organisation and transport)
• The representational (how we represent the data - usually visually)
• The interactive (how we interact with the data - interfaces)
4. 1958
• US Defense Advanced Research Projects Agency (DARPA) created.
• Early research included the development of robust networking technologies for connecting remote military assets.
6. 1969
• The Advanced Research Projects Agency Network (ARPANET) was the world's first operational packet-switching network and the core network of a set that came to compose the global Internet.
7. 1988
• US National Science Foundation (NSF) commissioned the construction of the NSFNET, a university network backbone.
• NSFNET was decommissioned in 1995 when it was replaced by new backbone networks operated by commercial Internet Service Providers.
8. US Internet backbone networks (colours represent different ISPs)
http://source-report.com/internetbackbone/internetbackbone_20.htm
9. 1989 - 1990
• Tim Berners-Lee, while working at CERN, invents the World Wide Web in a proposal for an information management system that presented data in a common and consistent way.
• He creates the HyperText Transfer Protocol (HTTP), the HyperText Markup Language (HTML), the first Web browser and the first HTTP server software.
10. 6 August 1991
• First website goes online.
• It defines the WorldWideWeb as “a wide-area hypermedia information retrieval initiative aiming to give universal access to a large universe of documents.”
• Makes no mention of anything we might associate with visual interface design.
11. An archived copy of the first webpage
http://www.w3.org/History/19921103-hypertext/hypertext/WWW/TheProject.html
12. 1992 - 1995
• Early adopters of the World Wide Web were primarily university-based scientific departments or research laboratories.
• A turning point was the introduction of Mosaic - a graphical browser released in 1993.
13. • Mosaic was the first web browser to display images inline with text (this was seen as a huge leap forward at the time).
14. 1992 - 1995
• Bandwidth was limited by the network technologies.
• The Web began to grow from a few hundred web pages.
• Any sense of web design was severely limited by these constraints...
• ...but there was a clear trend towards a more visual, more accessible web.
15. Web organisation
• In 1993, CERN agrees that anyone can use the web protocol and code royalty-free.
• In 1994, Tim Berners-Lee founds the World Wide Web Consortium (W3C) - the main international standards organization for the WWW.
16. 1995 - 1998
• Commercial interest in capitalising on the growth of the web (eCommerce).
• Increased commercial investment pushed the technology to a point where there was a legitimate role for web designers.
• Early examples of User Created Content (UCC) - e.g. GeoCities.
17. 1995 - 1998
Browser wars (Netscape vs Internet Explorer)
• Feature ‘arms race’
• Tables and frames for more complex layouts
• Animated gifs
• Javascript (button rollovers etc.)
• ...
18. 1995 - 1998
• Trend towards advertising a “web presence” rather than offering useful content or services.
• This led to websites which were stuffed full of attention-seeking ‘bells and whistles’ whether they served a purpose or not:
• Splash pages
• Tiled background images
• Crazy background and text colour combinations
• Animated gifs/flash
• Blinking/scrolling/marching-ants etc. text effects
• http://www.htmlprimer.com/articles/90s-web-design-nostalgic-look-back
• http://www.webpagesthatsuck.com/gorgeous-websites-from-the-late-90s-to-inspire-you-if-you-have-no-taste.html
• More often than not this approach distracted from the content and made it less accessible.
20. 1998 - 2000
• ‘Traditional’ interface design principles start to be seriously applied to web site designs.
• Web development tools like Dreamweaver promote a more ‘visual’ approach/workflow to web-interface design.
• Content is becoming more important and web design begins to focus on servicing that content.
• But... presentation and content are still combined - specified within HTML markup. It is not possible to update one independently of the other.
• Website layouts of this period still look square, based mostly on HTML tables (an abuse of their intended use) and sliced images.
22. 1999 - 2001: "Dot-com" boom and bust
• Everyone wanted to jump on the dot-com bandwagon at the end of the 20th Century.
• A lot of money was thrown at entrepreneurs without solid business plans because of the novelty of the dot-com concept, leading to the tech bubble and subsequent bust.
23. 2000 - 2004
• High-speed Internet connectivity becomes more affordable.
• Push towards web standards, headed by the World Wide Web Consortium (W3C).
• Continuing trend of more content, more often.
• Separation of presentation and content, allowing each to be updated independently of the other:
• Cascading Style Sheets (CSS) for presentation
• HTML for content
• Move away from static web pages towards dynamic web sites (more on this later).
25. 2004 - 2007
• Web 2.0 era
• Web applications vs websites
• Highly dynamic
• Community oriented
• User-contributed multimedia content (lots of it!)
• Interactivity and functionality approaching native desktop applications
• Social networking takes off
• Purchasing goods and services online via sites like eBay and Amazon becomes mainstream to the point where it threatens traditional retailers.
27. 2008 onwards
• (Almost) real-time content updates
• Trend for content to ‘find’ users (RSS feed subscriptions, Twitter updates etc.)
• Storing personal data “in the cloud”
• Content ‘mash-ups’
• Embedded widgets, feeds, services etc. using external APIs
• Design for multiple devices (especially mobile)
29. So what are the trends?
• More content
• More frequently (up-to-date and on-demand)
• From more sources (crowd sourcing, mashups etc.)
• Moving away from a static web towards a dynamic web.
30. So what are the trends?
• More contributors. As a web designer you need to at least have an understanding of all these areas and how they fit together.
31. Hypertext Transfer Protocol (HTTP)
• HTTP functions as a request-response protocol in the client-server computing model.
• In the most common example the web browser is the client and an application running on a computer hosting a web site is the server.
• The client submits an HTTP request message to the server.
• The server returns a response message to the client containing completion status information about the request, and it may also contain the requested content in its message body.
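The request-response cycle described above can be sketched with Python's standard library. This is an illustrative toy only (the handler class, port choice, and page content are invented for the example): a throwaway server answers GET requests, and a client on the same machine submits a request and reads back the status line and body.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# A minimal server: answers every GET with a status line, headers, and a body.
class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<h1>Hello, Web</h1>"
        self.send_response(200)                      # completion status
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)                       # requested content in the body

    def log_message(self, *args):                    # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), HelloHandler)  # port 0 = pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client submits an HTTP request message to the server...
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/index.html")

# ...and the server returns a response message.
response = conn.getresponse()
status, reason = response.status, response.reason
body_text = response.read().decode()
server.shutdown()

print(status, reason)   # 200 OK
print(body_text)        # <h1>Hello, Web</h1>
```

Note that the client gets both completion status ("200 OK") and content - exactly the two parts of the response message described in the bullet above.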
34. Static website
• Each logical page is represented by a physical file on the web server.
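That one-to-one mapping from URL path to file is the essence of a static server. A small sketch (the document root and the `index.html` fallback are assumptions for the example; real servers must also reject ".." traversal before doing this):

```python
from pathlib import PurePosixPath

# Hypothetical document root; on a real server this might be /var/www/html.
DOC_ROOT = PurePosixPath("/var/www/example.com")

def url_path_to_file(url_path: str) -> PurePosixPath:
    """Map a requested URL path onto the physical file that backs it."""
    path = PurePosixPath(url_path)
    if url_path.endswith("/"):           # directory requests fall back to index.html
        path = path / "index.html"
    # Strip the leading "/" so joining stays inside the document root.
    return DOC_ROOT / path.relative_to("/")

print(url_path_to_file("/"))                  # /var/www/example.com/index.html
print(url_path_to_file("/about/team.html"))   # /var/www/example.com/about/team.html
```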
35. Advantages of static websites
• Lower entry barrier for development (just plain old HTML and CSS files).
• Simple hosting requirements
• Easily cacheable
• Can be viewed “offline”
36. Disadvantages of static websites
• Much less scope for personalisation and interactivity - any scripting has to be done client-side.
• Every little change/update needs to be done manually...
37. Some stats
• 24 hours of video is uploaded to YouTube every minute. (source)
• More than 30 billion pieces of content (web links, news stories, blog posts, notes, photo albums, etc.) shared each month in over 70 languages. (source)
• 50 million tweets are sent per day. (source)
38. Disadvantages of static websites
• Can you even fathom updating this much content by hand? And these numbers are growing at an exponential rate.
• Fortunately computers are very good at automating repetitive tasks in a dynamic way.
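A toy illustration of letting the computer do the repetitive work: regenerate one HTML file per record instead of editing pages by hand. The records and template here are made up for the example; adding a record and re-running the loop is all it takes to "update" the site.

```python
import tempfile
from pathlib import Path
from string import Template

PAGE = Template("<html><body><h1>$title</h1><p>$body</p></body></html>")

records = [
    {"slug": "welcome", "title": "Welcome", "body": "Our first post."},
    {"slug": "news",    "title": "News",    "body": "Updated daily."},
]

out_dir = Path(tempfile.mkdtemp())   # stand-in for the web server's document root
for rec in records:
    # One physical file per logical page, rebuilt on every run.
    (out_dir / f"{rec['slug']}.html").write_text(PAGE.substitute(rec))

print(sorted(p.name for p in out_dir.glob("*.html")))  # ['news.html', 'welcome.html']
```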
39. Dynamic website
• Website content is stored in a database (and/or other external sources) and assembled with markup and output by a web server script or application.
40. Advantages of dynamic websites
• Content can be updated in a decentralised way (a single “webmaster” does not have the sole privilege/responsibility of updating the website).
• Modularisation and reuse of common code (e.g. headers, menus).
• Automation
41. Disadvantages of dynamic websites
• Higher entry barrier / learning curve for development
• More complex web server requirements
• Issues with pages being indexed by search engines.
• Overall, the benefits will almost always outweigh the disadvantages.