Table of Contents
History of Core technology
Magnetic Core Memory
Who Invented Core Memory?
What is Core-i Technology?
Single core, Dual core, Quad core and Octa core
ADVANTAGES
CURRENT LINEUP OF CORE PROCESSORS
Q2: Computer transformation
Q3: Vendors of Technology Hardware
Q4: Counter Argument
Will this be a hard sell? Why or why not?
Q5: Specialized Organizations in Creating Customized Software Applications for Clients
History of Core technology
Magnetic Core Memory
Tiny donuts made of magnetic material strung on wires into an array: the idea revolutionized computer
memory. Each donut was a bit, magnetized one way for “zero,” and the other way for “one.” The wires
could both detect and change the magnetization. In 1953, MIT’s Whirlwind became the first computer
to use this technology.
Core memory swiftly swept away competing technologies. But manufacturing it was a delicate job,
entrusted mostly to women using microscopes and steady hands to thread thin wires through holes
about the diameter of a pencil lead.
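The bit storage described above can be sketched in code. The following is a minimal, hypothetical simulation of a core plane (an illustration, not a hardware model): each core holds one bit selected by an X and a Y wire, and, as in real core memory, reading is destructive, so the controller has to write the sensed value back.

```python
# Minimal, illustrative simulation of a magnetic-core memory plane.
# Each "core" holds one bit, addressed by an (x, y) wire pair.
# Reads are destructive (sensing a core resets it), so the controller
# rewrites the value afterwards, mirroring how real core memory worked.

class CorePlane:
    def __init__(self, width, height):
        self.bits = [[0] * width for _ in range(height)]

    def write(self, x, y, value):
        """Magnetize the core at (x, y) one way for 0, the other way for 1."""
        self.bits[y][x] = 1 if value else 0

    def read(self, x, y):
        """Destructive read: sense the core, then restore its value."""
        value = self.bits[y][x]
        self.bits[y][x] = 0          # sensing flips the core to the 0 state
        self.write(x, y, value)      # the controller writes the bit back
        return value

plane = CorePlane(width=64, height=64)   # a 64 x 64 plane stores 4096 bits
plane.write(3, 5, 1)
print(plane.read(3, 5))                  # -> 1
```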
Who Invented Core Memory?
Success has a thousand fathers, or in this case, at least five.
Amateur inventor (and street inspector for Los Angeles) Frederick Viehe filed a core memory patent in
1947. Harvard physicist An Wang filed one in 1949. RCA’s Jan Rajchman and MIT’s Jay Forrester filed in
1950 and 1951 respectively.
Core memory proved extraordinarily successful. Success brought extraordinary profits…which in turn
ignited ownership disputes.
In 1964, after years of legal wrangling, IBM paid MIT $13 million for rights to Forrester’s patent—the
largest patent settlement to that date.
What is Core-i Technology?
Core is an Intel processor brand introduced in January 2006, originally based on the Yonah core. It uses an architecture in which a single physical processor contains the core logic of two or more processors, all packaged into a single integrated circuit. Core processor technology is offered in many varieties for PCs and mobile devices, and the brand has become one of the leading processor technologies around the world. The older Intel Core 2 Solo, Intel Core 2 Duo, Intel Core 2 Quad and Intel Core 2 Extreme lines have been followed by the latest Intel Core i7, Intel Core i5, and Intel Core i3. Core technology is getting faster and better day by day.
Single core, Dual core, Quad core and Octa core
ADVANTAGES:
Allow higher performance at lower energy
Improved Cache Latency with Smart l3 Cache
Improved Virtualization Performance
CPU Performance Boost via Intel TurboBoostTechnology
Optimized Multithreaded Performance through Hyper-Threading
1066 and 1333 MHz system bus
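To make the single-core versus multi-core distinction concrete, here is a small sketch using only Python's standard library; the work function, job sizes, and any timings it prints are illustrative assumptions, not benchmarks. It asks the operating system how many logical cores are available and spreads a CPU-bound task across them with a process pool.

```python
# Illustrative sketch: spreading a CPU-bound task across the available cores.
# On a dual-, quad- or octa-core machine the pooled version can finish several
# times faster than the sequential loop; the exact speedup depends on the CPU.
import os
import time
from multiprocessing import Pool

def busy_work(n):
    """A deliberately CPU-bound toy task (sum of squares)."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * 8
    print("Logical cores reported by the OS:", os.cpu_count())

    start = time.perf_counter()
    sequential = [busy_work(n) for n in jobs]
    print("Sequential:", round(time.perf_counter() - start, 2), "s")

    start = time.perf_counter()
    with Pool() as pool:                 # one worker process per core by default
        parallel = pool.map(busy_work, jobs)
    print("Parallel:  ", round(time.perf_counter() - start, 2), "s")

    assert sequential == parallel
```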
CURRENT LINEUP OF CORE PROCESSORS
In general, processors sold as Core are more powerful variants of the same processors marketed as
entry-level Celeron and Pentium. Similarly, identical or more capable versions of Core processors are
also sold as Xeon processors for the server and workstation market.
As of 2013, the current lineup of Core processors includes the latest Intel Core i7, Intel Core i5, and Intel Core i3, as well as the older Intel Core 2 Solo, Intel Core 2 Duo, Intel Core 2 Quad, and Intel Core 2 Extreme lines.
Q2: Computer transformation
Microcomputers have been adopted so widely around the world that this era of technology will take a long time to end; it will end eventually, but not now. One study used the technology acceptance model and sought to extend it by investigating the impact of external factors (i.e., individual, organizational, and system characteristics) on the user acceptance of microcomputer technology. The paper reports the results of a field study investigating the determinants of microcomputer usage. The analyses of the measurement model confirm the existence of three distinct constructs: (1) beliefs, covering perceived usefulness and perceived ease of use; (2) organizational support, covering management support and end-user computing (EUC) support; and (3) microcomputer usage, covering perceived usage and variety of use. The tested conceptual model confirms the effects of individual, organizational, and system characteristics on perceived ease of use and perceived usefulness. The model also confirms the influence of perceived ease of use on perceived usefulness, and the effects of perceived usefulness on perceived usage and variety of use. The results confirm several previously proposed notions, including the effects of individual, organizational, and system characteristics on ease of use and usefulness, the influence of ease of use on usefulness, and the effects of perceived usefulness on usage and variety of use. They also demonstrate the utility of investigating the factors contributing to microcomputer usage and the external factors affecting endogenous variables such as system usefulness. The importance of EUC support and management support is corroborated, as well as the need for designing mechanisms such as training programs and newsletters to improve user perceptions of microcomputers.
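The path structure described above (individual, organizational, and system characteristics feed perceived ease of use and usefulness; ease of use feeds usefulness; usefulness feeds usage) can be sketched as a toy simulation. Every coefficient, variable name, and value below is an invented illustration of the direction of those hypothesized effects, not a result from the study.

```python
# Toy simulation of the technology-acceptance paths described above.
# All coefficients and noise terms are made-up illustrations, chosen only
# to show the direction of the hypothesized effects, not their real size.
import random

random.seed(0)

def simulate_user():
    training       = random.random()           # an individual characteristic
    mgmt_support   = random.random()           # an organizational characteristic
    system_quality = random.random()           # a system characteristic

    ease_of_use = 0.4 * training + 0.3 * system_quality + random.gauss(0, 0.1)
    usefulness  = 0.5 * ease_of_use + 0.3 * mgmt_support + random.gauss(0, 0.1)
    usage       = 0.6 * usefulness + random.gauss(0, 0.1)
    return usage

average_usage = sum(simulate_user() for _ in range(1000)) / 1000
print("Average simulated usage:", round(average_usage, 3))
```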
The term microcomputer has now fallen out of common usage. An early use of the term personal computer in 1962 predates microprocessor-based designs. A microcomputer used as an embedded control system may have no human-readable input and output devices. Personal computer may be used generically or may denote an IBM PC compatible machine.
Microcomputers Today and Tomorrow: Soon after the integrated circuit was developed, just two decades ago, it became apparent that very complex circuits would be possible. In some cases, a major portion of a computer, called a module, could be constructed as a single circuit. In fact, this made it possible to build a large system by breaking it up into a number of different kinds of modules, each of which could be a circuit. The integrated circuit business was changing: every year it became possible to make more and more complicated circuits, so it became desirable to make modules that were more complicated. The problem was that an increase in the complexity of a module meant a decrease in its versatility, so that eventually a computer or other piece of equipment would have no more than one of each type of module. The economics of the integrated circuit business made this undesirable: high volume was the goal, but a different type of module for each type of system being built meant that no more than a few hundred or a few thousand of any one type of module would be made.
Programming of Microprocessors:
The microprocessor represented a solution to this problem. It was a new type of module that was very versatile. With some very simple programming techniques, this one module can appear to be many different kinds of modules; in other words, it could be customized by programming. Microprocessors, or microcomputers, are just miniature computers. In general, they can be used for most of the applications for which larger computers are used. The earliest microprocessor was very poor in performance, and there were many things it could not do. Over the years, performance has improved so that microprocessors now rival machines that in some cases cost a thousand times more. The history of personal computers can be traced through the progress of the microprocessors available as their brains. Each microchip manufacturer has brought out improved progeny within families of processors, retaining similar instruction sets for upward compatibility. Though each successive design may not be a quantum leap forward, there is a steady movement toward faster and more powerful handling of more bits of data.
Era of Continuous Improvement in Microcomputers: In the case of the computer industry, smaller really is better. Everyone wants the smallest possible microcomputer, because it is easy to carry and can be taken anywhere the user wishes to go. For that reason, it is hard to say where the shrinking of the microcomputer will end.
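The earlier claim that a microprocessor "appears to be many different kinds of modules" because it can be customized by programming can be illustrated with a short sketch. The module names and behaviours here are hypothetical examples chosen for illustration, not real hardware interfaces.

```python
# Illustrative sketch: the same general-purpose "processor" acts as different
# special-purpose modules purely by being given a different program.
def counter_module(inputs):
    """Behaves like a dedicated counter circuit."""
    return len(inputs)

def adder_module(inputs):
    """Behaves like a dedicated adder circuit."""
    return sum(inputs)

def parity_module(inputs):
    """Behaves like a dedicated parity-checking circuit."""
    return sum(inputs) % 2

# One physical processor, many logical modules: the "program" selects the role.
programs = {"counter": counter_module, "adder": adder_module, "parity": parity_module}

signal = [1, 0, 1, 1]
for role, program in programs.items():
    print(role, "->", program(signal))
```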
Q3: Vendors of Technology Hardware
There are many vendors that provide hardware and solutions for all kinds of IT problems; if one vendor does not carry a specific piece of hardware, a competitor usually offers it. So almost every type of hardware is available on the market; many of the categories and devices involved are listed below:
Adapter
Blu-ray
Bus
Cable Modem
Card Reader
CCD
CD
CD-R
CD-ROM
CD-RW
Chip
Chipset
Coaxial Cable
Compact Flash
Component
Computer
Console
Controller Card
CPU
CRT
DAC
DDR
DDR2
DDR3
Disk Drive
Dongle
Dot Pitch
DRAM
Drive
DSL
DSLAM
Dual-Core
DVD
DVD+R
DVD+RW
DVD-R
DVD-RAM
DVD-RW
DVI
External Hard Drive
Fiber-Optic Cable
File Server
FireWire
Flash Drive
Flash Memory
GPU
Hard Disk
Hard Drive
HDD
HDMI
Heat Sink
Impact Printer
Inkjet
Input Device
Integrated Circuit
Interface
Internal Hard Drive
iPad
iPhone
iPod
IRQ
Joystick
Jumper
Keyboard
Kindle
Laptop
Laser Printer
LCD
LED
Lightning
MAC Address
Memory
Microcomputer
Microphone
Modem
Modifier Key
Monitor
Motherboard
Mouse
Mouse Pad
Multi-Core
Multiprocessing
PCI-X
PCMCIA
PDA
Power Cycle
Power Supply
Printer
Processor
Processor Core
PROM
PS/2
Quad-Core
RAM
Router
SATA
SD
SDRAM
Secondary Memory
Secondary Storage
Smartphone
Solid State
Sound Card
Speakers
Storage Device
Tablet
Touchscreen
VGA
Video Card
Webcam
Q4: Counter Argument
My three fellow employees and I need new computers because technology is getting better day by day, while our current computers are becoming outdated and slow. New computer technology is fast and helps us complete our tasks in less time compared with the old technology. It is better in every way.
New computer prices are dropping constantly, but this technology is new and it is what we need; it will not become outdated soon. New computers help us complete our tasks before the deadline, and the company benefits from that: the sooner tasks are completed, the better the company will grow. So we should not focus on the fact that prices are constantly dropping; we should think about the company first, and I believe the company can easily purchase these computers for its employees.
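A rough back-of-the-envelope calculation makes the tradeoff concrete. Every figure below is a hypothetical assumption used only to illustrate the reasoning, not a number from this assignment.

```python
# Hypothetical cost comparison: buy new computers now, or wait for prices to drop.
# All figures are invented assumptions used only to illustrate the reasoning.
price_now            = 1200          # price of one new computer today ($)
expected_drop        = 0.10          # assumed 10% price drop if we wait six months
employees            = 4             # me plus three colleagues
hours_saved_per_week = 3             # assumed productivity gain per employee
hourly_value         = 25            # assumed value of one work hour ($)
weeks_waited         = 26            # roughly six months

savings_from_waiting = employees * price_now * expected_drop
productivity_lost    = employees * hours_saved_per_week * hourly_value * weeks_waited

print("Saved by waiting:      $", savings_from_waiting)
print("Productivity lost:     $", productivity_lost)
print("Net effect of waiting: $", savings_from_waiting - productivity_lost)
```

Under these assumed numbers, waiting saves a few hundred dollars on hardware but costs far more in lost productivity, which supports buying now.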
There is a famous quote by Benjamin Franklin:
"Never leave that till tomorrow which you can do today."
So I think we should purchase the computers now instead of waiting.
Will this be a hard sell? Why or why not?
This will be a hard sell, but not an impossible one, because new computers are something both the employees and the company need. The company should think about its employees instead of only saving money. If the company fulfills its employees' wishes and needs, the employees can do their work happily because they have new technology, and that technology saves both the employees' and the company's time.
Q5: Specialized Organizations in Creating Customized Software Applications for Clients
Software Specialized Organizations: Expect surprises—exciting and positive opportunities. The latest
technological developments offer you new opportunities to extend your range of computer
competency. Software that for years was available only for mainframes has recently become available
for microcomputers. This new generation of software, called specialized applications, now makes it
possible to perform advanced tasks at home.
The latest technological developments have created an opportunity for home users to take advantage of
software previously used only in professional environments. For example, it is now possible, and quite
common, for people to create their own Web sites. Home users also have access to software that helps
manipulate and create graphic images. Many musicians and artists work from home to create complex
and beautiful work using specialized applications. Some of these same technological advances have
allowed researchers and computer scientists to make advances in the field of artificial intelligence that
previously were envisioned only in science fiction. Robots now provide security and assistance in homes.
Virtual reality is providing opportunities in the fields of medicine and science but also commonly
appears in video games. Competent end users need to be aware of specialized applications. They need
to know who uses them, what they are used for, and how they are used. These advanced applications
include graphics programs, audio and video editing software, multimedia, Web authoring, and artificial
intelligence, including virtual reality, knowledge-based systems, and robotics.
In the last 15 years, software development has improved enormously. These days, many organizations provide customized software applications to their clients. These organizations are experts in the rapid custom development of web-based, distributed, and standalone applications designed to meet an organization's specific requirements and business needs. The expertise they possess embraces a wide range of custom programming skills involving the latest and most effective development technologies, which largely defines the quality and reliability of the custom software applications they develop. While in many cases professional customization of a pre-developed platform is enough for a solution, there are often situations in which this approach is not suitable or even applicable. These organizations deliver cost-effective and reliable custom software solutions that match each client's unique goals, requirements, and processes.
Technology Migration: They can help if you are thinking of moving your existing technology to open-source technologies, or if you want your application on the Internet instead of as a desktop application.
Web-Based Applications: These organizations develop feature-rich custom web applications, sites, and portals for corporate and Internet-focused projects: e-commerce, online servicing and order processing, and much more.
Desktop Application Development: They create cross-platform standalone and client-server business applications, ensuring stable functioning, high performance, and usability.
Mostly, these organizations use the following computer and IT-related technologies:
Database servers: Oracle Database 10g/11g, MySQL, MS SQL Server, DB2, PostgreSQL
Web servers: Oracle Application Server, Apache, Tomcat, JBoss, IIS
Middle tier: OC4J, Oracle Forms Server, Oracle Reports Server, J2EE, mod_plsql, mod_perl, .NET, PHP, Tomcat, JBoss, WebLogic
Frameworks: Oracle PSP/PL/SQL Gateway, ADF for Oracle Java technology, JSF, Struts, Hibernate, Spring, .NET, LINQ, WF, WPF, WCF, CakePHP, CodeIgniter, Zend, and web front-end frameworks (AJAX, jQuery, Dojo, Prototype), etc.
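As a small, self-contained illustration of the "web-based application" category above, here is a minimal sketch using only Python's standard-library WSGI server. It stands in for the far richer stacks listed in the text (PHP frameworks, J2EE, .NET, and so on) and is not any particular vendor's framework; the endpoint name and behaviour are invented for illustration.

```python
# Minimal web-based application sketch using only the Python standard library.
# Real custom business applications would use one of the stacks named above
# (PHP frameworks, J2EE, .NET, ...), but the request/response idea is the same.
from wsgiref.simple_server import make_server

def order_status_app(environ, start_response):
    """Tiny 'online order processing' endpoint: echoes the requested path."""
    path = environ.get("PATH_INFO", "/")
    body = f"Order status requested for: {path}".encode("utf-8")
    start_response("200 OK", [("Content-Type", "text/plain; charset=utf-8"),
                              ("Content-Length", str(len(body)))])
    return [body]

if __name__ == "__main__":
    with make_server("127.0.0.1", 8000, order_status_app) as server:
        print("Serving a demo order-status app on http://127.0.0.1:8000/")
        server.serve_forever()
```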