Cloud computing and network security issues


  • 1. An attempt to clear the confusion that engulfs Cloud Computing
  • 2. On the LinkedIn professional network there are two discussion threads currently running in the Telecom Professionals Group, and some running in the IT Next Group. The ones in the former group are:

1. Clinton: "I hear a lot about the Cloud and Cloud Computing. Can someone explain to me what that is?"
2. Ramiro: "Telecom trends 2011 - what do you think?"

A lot of views have been expressed by the participants in these two discussion threads, by both proponents and opponents of Cloud Computing. Philippe Portes, participating in the first of these discussion threads, brought out some hard truths about his experience with Cloud Computing about two days ago, which led to some invigorating discussion by Ariel Gollon and Tim Templeton. Although Ramiro Gonzales had suggested various topics while opening the second discussion thread, according to the statistics he has compiled, Cloud Computing Communications has held centre stage amongst all the other possible topics of discussion on Telecom Trends 2011. I have been chipping in with comments in both these discussion threads to clear some of the confusion that exists.

Finally, Dirk de Vos' post 23 hours ago, on the 5th May 2011, provoked me to try and prepare this note, which is aimed at touching on some fundamental aspects of Cloud Computing and network security with respect to an organisation's internal databases. To do this I will need to address the fundamentals of cloud computing, and also the fundamentals of network security issues.
  • 3. Cloud Computing
Let me reproduce here the under-noted extracts of Wikipedia information on Cloud Computing, with my comments along the way.

Cloud computing refers to the provision of computational resources on demand via a computer network. Because the cloud is an underlying delivery mechanism, cloud-based applications and services may support any type of software application or service in use today. Before the advent of computer networks, both data and software were stored and processed on or near the computer. The development of Local Area Networks (LANs) allowed for a tiered architecture in which multiple CPUs and storage devices may be organized to increase the performance of the entire system. LANs were widely deployed in corporate environments in the 1990s, and are notable for vendor-specific connectivity limitations. These limitations gave rise to the marketing term "Islands of Information", which was widely used within the computing industry. The widespread implementation of the TCP/IP protocol stack and the subsequent popularization of the web has led to multi-vendor networks that are no longer limited by company walls.

Cloud computing fundamentally allows for a functional separation between the resources used and the user's computer. The computing resources may or may not reside outside the local network, for example in an internet-connected datacenter. What is important to the individual user is that they simply work. This separation between the resources used and the user's computer has also allowed for the development of new business models. All of the development and maintenance tasks involved in provisioning the application are performed by the service provider. The user's computer may contain very little software or data (perhaps a minimal operating system and web browser only), serving as little more than a display terminal for processes occurring on a network of computers far away.

Consumers now routinely use data-intensive applications driven by cloud technology which were previously unavailable due to cost and deployment complexity. In many companies, employees and company departments are bringing a flood of consumer technology into the workplace, and this raises legal compliance and security concerns for the corporation.

The common shorthand for a provided cloud computing service (or even an aggregation of all existing cloud services) is "The Cloud". The most common analogy to explain cloud computing is that of public utilities such as electricity, gas, and water. Just as centralized and standardized utilities free individuals from the difficulties of generating electricity or pumping water, cloud computing frees users from certain hardware and software installation and maintenance tasks through the use of simpler hardware that accesses a vast network of computing resources (processors, hard drives, etc.). The sharing of resources reduces the cost to individuals.

The phrase "cloud computing" originated from the cloud symbol that is usually used in flow charts and diagrams to symbolize the internet. The principle behind the cloud is that any computer connected to the internet is connected to the same pool of computing power, applications, and files. Users can store and access personal files such as music, pictures, videos, and bookmarks, or play games or use productivity applications on a remote server, rather than physically carrying around a storage medium such as a DVD or thumb drive. Almost all users of the internet may be using a form of cloud computing, though few realize it. Those who use web-based email such as Gmail, Hotmail, Yahoo or a company-owned email, or even an email client program such as Outlook, Evolution, Mozilla Thunderbird or Entourage, are making use of cloud email servers. Hence, desktop applications which connect to cloud email would be considered cloud applications.
  • 4. Cloud computing utilizes the network as a means to connect user end-point devices (end points) to resources that are centralized in a data center. The data center may be accessed via the internet or a company network, or both. In many cases a cloud service may allow access from a variety of end points such as a mobile phone, a PC or a tablet. Cloud services may be designed to be vendor-agnostic, working equally well with Linux, Mac and PC platforms. They can also allow access from any internet-connected location, allowing mobile workers to access business systems remotely, as in telecommuting, and extending the reach of business services provided by outsourcing.

A user endpoint with minimal software requirements may submit a task for processing. The service provider may pool the processing power of multiple remote computers in "the cloud" to achieve the task, such as data warehousing of hundreds of terabytes, managing and synchronizing multiple documents online, or computationally intensive work. These tasks would normally be difficult, time-consuming, or expensive for an individual user or a small company to accomplish. The outcome of the processing task is returned to the client over the network. In essence, the heavy lifting of a task is outsourced to an external entity with more resources and expertise.

The services - such as data storage and processing - and software are provided by the company hosting the remote computers. The clients are only responsible for having a simple computer with a connection to the Internet, or a company network, in order to make requests to and receive data from the cloud.
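The workflow just described - an endpoint submitting a task and the provider pooling many machines to do the heavy lifting, returning only the outcome - can be sketched in miniature. This is a hypothetical illustration only: the function names are my own, and a local thread pool stands in for the provider's pool of remote computers, not any real cloud API.

```python
from concurrent.futures import ThreadPoolExecutor

def compute_chunk(chunk):
    """Stand-in for one remote computer doing a slice of the heavy lifting."""
    return sum(x * x for x in chunk)

def submit_to_cloud(data, workers=4):
    """The 'provider' splits the task across pooled workers and returns
    only the combined outcome to the client over the network."""
    chunks = [data[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(compute_chunk, chunks))

print(submit_to_cloud(list(range(1000))))  # → 332833500
```

The client never sees which worker handled which chunk - exactly the functional separation between the resources used and the user's computer that the extract describes.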
Computation and storage are divided among the remote computers in order to handle large volumes of both; thus the client need not purchase expensive hardware to handle the task.

Technical description
The National Institute of Standards and Technology (NIST) provides a concise and specific definition:

Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.

Cloud computing provides computation, software, data access, and storage services that do not require end-user knowledge of the physical location and configuration of the system that delivers the services. Parallels to this concept can be drawn with the electricity grid, where end-users consume power without needing to understand the component devices or infrastructure required to provide the service.

Cloud computing describes a new supplement, consumption, and delivery model for IT services based on Internet protocols, and it typically involves provisioning of dynamically scalable and often virtualized resources. It is a byproduct and consequence of the ease of access to remote computing sites provided by the Internet. This may take the form of web-based tools or applications that users can access and use through a web browser as if they were programs installed locally on their own computers.

Cloud computing providers deliver applications via the internet, which are accessed from a web browser, while the business software and data are stored on servers at a remote location. In some cases, legacy applications (line-of-business applications which until now have been prevalent in thick-client Windows computing) are delivered via a screen-sharing technology such as Citrix XenApp, while the compute resources are consolidated at a
  • 5. remote data center location; in other cases entire business applications have been coded using web-based technologies such as AJAX.

Most cloud computing infrastructures consist of services delivered through shared data centers. The Cloud may appear as a single point of access for consumers' computing needs; notable examples include the iTunes Store and the iPhone App Store. Commercial offerings may be required to meet service level agreements (SLAs), but specific terms are less often negotiated by smaller companies.

Characteristics
The key characteristic of cloud computing is that the computing is "in the cloud"; that is, the processing (and the related data) is not in a specified, known or static place(s). This is in contrast to a model in which the processing takes place in one or more specific servers that are known. All the other concepts mentioned are supplementary or complementary to this concept.

Architecture
Cloud computing sample architecture
Cloud architecture - the systems architecture of the software systems involved in the delivery of cloud computing - typically involves multiple cloud components communicating with each other over application programming interfaces (APIs), usually web services and 3-tier architecture. This resembles the Unix philosophy of having multiple programs each doing one thing well and working together over universal interfaces. Complexity is controlled, and the resulting systems are more manageable than their monolithic counterparts.

The two most significant components of cloud computing architecture are known as the front end and the back end. The front end is the part seen by the client, i.e. the computer user. This includes the client's network (or computer) and the applications used to access the cloud via a user interface such as a web browser. The back end of the cloud computing architecture is the 'cloud' itself, comprising various computers, servers and data storage devices.
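The front-end / back-end split can be made concrete with a minimal sketch: a tiny "back end" HTTP service that holds the data and does the work, and a "front end" that is nothing more than a thin client over the network. The service, its endpoint and its payload are invented for illustration - this is a toy, not a real cloud stack.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# --- Back end: the "cloud" side, holding the data and doing the processing ---
class BackEnd(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"answer": 6 * 7}).encode()  # the "heavy" computation
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

def start_back_end():
    server = HTTPServer(("127.0.0.1", 0), BackEnd)  # port 0: pick a free port
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

# --- Front end: the client side, just a thin interface over the network ---
def front_end(port):
    with urlopen(f"http://127.0.0.1:{port}/") as resp:
        return json.load(resp)["answer"]

if __name__ == "__main__":
    server = start_back_end()
    print(front_end(server.server_address[1]))  # → 42
    server.shutdown()
```

All the front end knows is the interface; everything else - storage, computation, configuration - lives behind it, which is the separation the architecture section describes.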
  • 6. Deployment models
Cloud computing types

Public cloud
Public cloud or external cloud describes cloud computing in the traditional mainstream sense, whereby resources are dynamically provisioned on a fine-grained, self-service basis over the Internet, via web applications / web services, from an off-site third-party provider who bills on a fine-grained utility computing basis.

The best examples of public clouds are the various search engines like Google, which serve as our information bank for all information available in the public domain, and the email applications of email service providers like Gmail, Yahoo, Hotmail, etc.

Community cloud
A community cloud may be established where several organizations have similar requirements and seek to share infrastructure so as to realize some of the benefits of cloud computing. The costs are spread over fewer users than a public cloud (but more than a single tenant). This option may offer a higher level of privacy, security and/or policy compliance. In addition it can be economically attractive, as the resources (storage, workstations) utilized and shared in the community are already exploited and have reached their return on investment. Examples of community clouds include Google's "Gov Cloud".

Hybrid cloud and hybrid IT delivery
The main responsibility of the IT department is to deliver services to the business. With the proliferation of cloud computing (both private and public) and the fact that IT departments must also deliver services via traditional, in-house methods, the newest catch-phrase has become "hybrid cloud computing". Hybrid cloud is also called hybrid delivery by the major vendors, including HP, IBM, Oracle, and VMware, who offer technology to manage the complexity of managing the performance, security and privacy concerns that result from the mixed delivery methods of IT services.

A hybrid storage cloud uses a combination of public and private storage clouds. Hybrid storage clouds are often useful for archiving and backup functions, allowing local data to be replicated to a public cloud.
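The hybrid arrangement just described - local data replicated out to a public cloud for archiving and backup - can be sketched with two toy object stores. The class and function names here are illustrative inventions, not any real storage API.

```python
class StorageCloud:
    """Toy object store standing in for either the private or the public side."""
    def __init__(self):
        self.objects = {}

    def put(self, key, data):
        self.objects[key] = bytes(data)

def replicate(private, public):
    """Copy every object from the private store to the public store,
    as a scheduled archiving/backup pass would, and report what was copied."""
    copied = []
    for key, data in private.objects.items():
        public.put(key, data)
        copied.append(key)
    return copied

private, public = StorageCloud(), StorageCloud()
private.put("orders.db", b"...local business data...")
print(replicate(private, public))          # → ['orders.db']
print(public.objects == private.objects)   # → True
```

The point of the sketch is the direction of flow: the private store remains the system of record, while the public cloud holds a replica for recovery.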
  • 7. Another perspective on deploying a web application in the cloud is using hybrid web hosting, where the hosting infrastructure is a mix between cloud hosting and managed dedicated servers - this is most commonly achieved as part of a web cluster in which some of the nodes are running on real physical hardware and some are running on cloud server instances.

Combined cloud
Two clouds that have been joined together are more correctly called a "combined cloud". A combined cloud environment consisting of multiple internal and/or external providers "will be typical for most enterprises". By integrating multiple cloud services, users may be able to ease the transition to public cloud services while avoiding issues such as PCI compliance.

Private cloud
Douglas Parkhill first described the concept of a "private computer utility" in his 1966 book The Challenge of the Computer Utility. The idea was based upon direct comparison with other industries (e.g. the electricity industry) and the extensive use of hybrid supply models to balance and mitigate risks.

"Private cloud" and "internal cloud" have been described as neologisms, but the concepts themselves pre-date the term cloud by 40 years. Even within modern utility industries, hybrid models still exist despite the formation of reasonably well-functioning markets and the ability to combine multiple providers.

Some vendors have used the terms to describe offerings that emulate cloud computing on private networks. These (typically virtualization automation) products offer the ability to host applications or virtual machines in a company's own set of hosts. These provide the benefits of utility computing - shared hardware costs, the ability to recover from failure, and the ability to scale up or down depending upon demand.

Private clouds have attracted criticism because users "still have to buy, build, and manage them" and thus do not benefit from lower up-front capital costs and less hands-on management, essentially "[lacking] the economic model that makes cloud computing such an intriguing concept". Enterprise IT organizations use their own private cloud(s) for mission-critical and other operational systems to protect their critical infrastructure. Therefore, for all intents and purposes, "private clouds" are not an implementation of cloud computing at all, but are in fact an implementation of a technology subset: the basic concept of virtualized computing.

However, as will be seen from the notes on network security issues to follow, private clouds are absolutely essential for 100% security of an organization's or enterprise's internal databases.

Cloud engineering
Cloud engineering is the application of a systematic, disciplined, quantifiable, and interdisciplinary approach to the ideation, conceptualization, development, operation, and maintenance of cloud computing, as well as the study and applied research of the approach, i.e., the application of engineering to the cloud. It is a maturing and evolving discipline that facilitates the adoption, strategisation, operationalisation, industrialisation, standardization, productisation, commoditisation, and governance of cloud solutions, leading towards a cloud ecosystem. Cloud engineering is also known as cloud service engineering.
  • 8. Cloud storage
Cloud storage is a model of networked computer data storage where data is stored on multiple virtual servers, generally hosted by third parties, rather than being hosted on dedicated servers. Hosting companies operate large data centers; people who require their data to be hosted buy or lease storage capacity from them and use it for their storage needs. The data centre operators, in the background, virtualise the resources according to the requirements of the customer and expose them as virtual servers, which the customers can themselves manage. Physically, the resource may span multiple servers.

Again, from the notes on network security issues to follow, you will see that it is not advisable to leave sensitive business data on such hosted storage.

The Intercloud
The Intercloud is an interconnected global "cloud of clouds" and an extension of the Internet "network of networks" on which it is based. The term was first used in the context of cloud computing in 2007, when Kevin Kelly stated that "eventually we'll have the Intercloud, the cloud of clouds. This Intercloud will have the dimensions of one machine comprising all servers and attendant cloud-books on the planet". It became popular in 2009 and has also been used to describe the data centre of the future.

The Intercloud scenario is based on the key concept that each single cloud does not have infinite physical resources. If a cloud saturates the computational and storage resources of its virtualization infrastructure, it would not be able to satisfy further requests for service allocations sent from its clients. The Intercloud scenario aims to address such situations: in theory, each cloud can use the computational and storage resources of the virtualization infrastructures of other clouds. Such a form of pay-for-use may introduce new business opportunities among cloud providers, if they manage to go beyond the theoretical framework. Nevertheless, the Intercloud raises many more challenges than solutions concerning cloud federation, security, interoperability, quality of service, vendor lock-in, trust, legal issues, monitoring and billing.

The concept of a competitive utility computing market which combined many computer utilities together was originally described by Douglas Parkhill in his 1966 book, The Challenge of the Computer Utility. This concept has been used many times over the last 40 years and is identical to the Intercloud.

Issues

Privacy
The cloud model has been criticized by privacy advocates for the greater ease with which the companies hosting the cloud services can control, and thus monitor at will, lawfully or unlawfully, the communication and data stored between the user and the host company. Instances such as the secret NSA programme, working with AT&T and Verizon, which recorded over 10 million phone calls between American citizens, cause uncertainty among privacy advocates about the greater powers this gives to telecommunication companies to monitor user activity. While there have been efforts (such as US-EU Safe Harbour) to "harmonise" the legal environment, providers such as Amazon still cater to major markets
  • 9. (typically the United States and the European Union) by deploying local infrastructure and allowing customers to select "availability zones".

Compliance
In order to obtain compliance with regulations including FISMA, HIPAA, and SOX in the United States, the Data Protection Directive in the EU, and the credit card industry's PCI DSS, users may have to adopt community or hybrid deployment modes, which are typically more expensive and may offer restricted benefits. This is how Google is able to "manage and meet additional government policy requirements beyond FISMA" and Rackspace Cloud is able to claim PCI compliance. Customers in the EU contracting with cloud providers established outside the EU/EEA have to adhere to the EU regulations on export of personal data.

Many providers also obtain SAS 70 Type II certification (e.g. Amazon, Salesforce.com, Google and Microsoft), but this has been criticised on the grounds that the hand-picked set of goals and standards determined by the auditor and the audited are often not disclosed and can vary widely. Providers typically make this information available on request, under non-disclosure agreement.

Legal
In March 2007, Dell applied to trademark the term "cloud computing" (U.S. Trademark 77,139,082) in the United States. The "Notice of Allowance" the company received in July 2008 was cancelled in August, resulting in a formal rejection of the trademark application less than a week later. Since 2007, the number of trademark filings covering cloud computing brands, goods and services has increased at an almost exponential rate. As companies sought to better position themselves for cloud computing branding and marketing efforts, cloud computing trademark filings increased by 483% between 2008 and 2009. In 2009, 116 cloud computing trademarks were filed, and trademark analysts predicted that over 500 such marks could be filed during 2010.

Other legal cases may shape the use of cloud computing by the public sector. On October 29, 2010, Google filed a lawsuit against the U.S. Department of the Interior, which had opened up a bid for software that required that bidders use Microsoft's Business Productivity Online Suite. Google sued, calling the requirement "unduly restrictive of competition". Scholars have pointed out that, beginning in 2005, the prevalence of open standards and open source may have had an impact on the way that public entities choose to select vendors.

Open source
Open source software has provided the foundation for many cloud computing implementations. In November 2007, the Free Software Foundation released the Affero General Public License, a version of GPLv3 intended to close a perceived legal loophole associated with free software designed to be run over a network.

Open standards
Most cloud providers expose APIs which are typically well documented (often under a Creative Commons license) but also unique to their implementation and thus not interoperable. Some vendors have adopted others' APIs, and there are a number of open standards under development, including the OGF's Open Cloud Computing Interface. The Open Cloud Consortium (OCC) is working to develop consensus on early cloud computing standards and practices.
  • 10. Security
The relative security of cloud computing services is a contentious issue which may be delaying their adoption. Issues barring the adoption of cloud computing are due in large part to the private and public sectors' unease surrounding the external management of security-based services. It is the very nature of cloud computing based services, private or public, that they promote external management of the provided services. This gives cloud computing service providers a strong incentive to give priority to building and maintaining secure management of their services.

Organizations have been formed in order to provide standards for a better future in cloud computing services. One organization in particular, the Cloud Security Alliance, is a non-profit organization formed to promote the use of best practices for providing security assurance within cloud computing.

The notes that follow on network security issues will expose the folly of endeavouring to arrange security of cloud computing in general, although measures like PCI DSS, and some measures taken for ensuring security of online banking transactions, are relevant.

Availability and performance
In addition to concerns about security, businesses are also worried about acceptable levels of availability and performance of applications hosted in the cloud. There are also concerns about a cloud provider shutting down for financial or legal reasons, which has happened in a number of cases.

Strong network connectivity is an essential requirement for availability and performance of cloud computing.

Sustainability and siting
Although cloud computing is often assumed to be a form of "green computing", there is as yet no published study to substantiate this assumption. Siting the servers affects the environmental effects of cloud computing. In areas where the climate favours natural cooling and renewable electricity is readily available, the environmental effects will be more moderate. Thus countries with favourable conditions, such as Finland, Sweden and Switzerland, are trying to attract cloud computing data centres.

SmartBay, a marine research infrastructure of sensors and computational technology, is being developed using cloud computing, an emerging approach to shared infrastructure in which large pools of systems are linked together to provide IT services.

Research
A number of universities, vendors and government organizations are investing in research around the topic of cloud computing. Academic institutions include the University of Melbourne (Australia), Georgia Tech, Yale, Wayne State, Virginia Tech, University of Wisconsin-Madison, Carnegie Mellon, MIT, Indiana University, University of Massachusetts, University of Maryland, IIT Bombay, North Carolina State University, Purdue University, University of
  • 11. California, University of Washington, University of Virginia, University of Utah, and the University of Minnesota, among others.

Joint government, academic and vendor collaborative research projects include the IBM/Google Academic Cloud Computing Initiative (ACCI). In October 2007 IBM and Google announced the multi-university project, designed to enhance students' technical knowledge to address the challenges of cloud computing. In April 2009, the National Science Foundation joined the ACCI and awarded approximately million in grants to 14 academic institutions.

In July 2008, HP, Intel Corporation and Yahoo announced the creation of a global, multi-data-centre, open source test bed called Open Cirrus, designed to encourage research into all aspects of cloud computing, service and data centre management. Open Cirrus partners include the NSF, the University of Illinois (UIUC), Karlsruhe Institute of Technology, the Infocomm Development Authority (IDA) of Singapore, the Electronics and Telecommunications Research Institute (ETRI) in Korea, the Malaysian Institute for Microelectronic Systems (MIMOS), and the Institute for System Programming at the Russian Academy of Sciences (ISPRAS). In September 2010, more researchers joined the HP/Intel/Yahoo Open Cirrus project for cloud computing research: the China Mobile Research Institute (CMRI), Spain's Supercomputing Center of Galicia (CESGA by its Spanish acronym), Georgia Tech's Center for Experimental Research in Computer Systems (CERCS), and China Telecom.

In July 2010, HP Labs India announced a new cloud-based technology designed to simplify taking content and making it mobile-enabled, even from low-end devices. Called SiteonMobile, the new technology is designed for emerging markets where people are more likely to access the internet via mobile phones than via computers. In November 2010, HP formally opened its Government Cloud Theatre, located at the HP Labs site in Bristol, England. The demonstration facility highlights high-security, highly flexible cloud computing based on intellectual property developed at HP Labs. The aim of the facility is to lessen fears about the security of the cloud. HP Labs Bristol is HP's second-largest central research location and is currently responsible for researching cloud computing and security.

The IEEE Technical Committee on Services Computing in the IEEE Computer Society sponsors the IEEE International Conference on Cloud Computing (CLOUD). CLOUD 2010 was held on July 5-10, 2010 in Miami, Florida.

On March 23, 2011, Google, Microsoft, HP, Yahoo, Verizon, Deutsche Telekom and 17 other companies formed a non-profit organisation called the Open Networking Foundation, focused on providing support for a new cloud initiative called Software-Defined Networking. The initiative is meant to speed innovation through simple software changes in telecommunications networks, wireless networks, data centres and other networking areas.

Criticism of the term
Some have come to criticize the term as being either too unspecific or even misleading. CEO Larry Ellison of Oracle Corporation asserts that cloud computing is "everything that we already do", claiming that the company could simply "change the wording on some of our ads" to deploy their cloud-based services. Forrester Research VP Frank Gillett questions the very nature of and motivation behind the push for cloud computing, describing what he calls "cloud washing": companies simply relabeling their products as "cloud computing", resulting in mere marketing innovation instead of "real" innovation. GNU's Richard Stallman insists that the industry will only use the model to deliver services at ever increasing rates over proprietary systems, otherwise likening it to a "marketing hype campaign".
  • 12. I could not agree more with the critics of the term. Oracle has pioneered web-based computing with centralised data centres for organisations and enterprises for many years now. They can well change the label of their products to be Cloud Compliant. Basically, companies are taking advantage of the new movement to re-package old wine in new bottles.

The availability of higher-speed Internet access from both stationary and mobile devices today, compared to what was available a few years back, is what has given the push for Cloud Computing - the utilisation of public cloud services.

Conclusion
There is absolutely no doubt that the public cloud will thrive with the search engines, the emailing services, the payment services, online banking services, and some other inter-people applications which are normally carried out over the Internet.

However, organisations and enterprises will have to be circumspect about how much of their business operations they can offload to Cloud Applications, if at all.

The next section of this note, for my fellow contributors and the data communications and IT fraternity across the world, addresses this issue.

Network Security Issues
Each organisation has its own security perceptions and requirements. To get a grip on this, they need to ask themselves the following very pertinent questions to determine how they should lay out their IT infrastructure:

Is your organisation's internal data important and exclusive to you?
If so, how secure is this information?
If leaked, what would result? Business loss? Revenue loss? Erosion of profit? Erosion of corporate value?
If damaged, what would result? Breakdown of business operations? Unfulfilled delivery commitments? Unfulfilled commercial commitments? Erosion of corporate value?
Would you like to protect your organisation from such an eventuality?
  • 13. If the answers to all the above questions are positive, then they need to understand wherethe security threats are emanating from. To enable them to do this I reproduce for readyreference the series of discussions started by me in the IT Next Group of LinkedIn availablein URL http://www.linkedin.com/groups?search=&answerCategory=myq&gid=2261770.How secure is VPN (MPLS or otherwise) for MLO (multi-locationalorganisation) INTRANET connectivity? The MLOs may be banks, corporateorganisations, and Govt. Organisations.Before we address this question it is necessary to bring to the fore some basic facts aboutVPN connectivity MPLS or otherwise, which may or may not be known to the readers of thispost.All VPNs irrespective of the protocols being used (MPLS, Frame Relay, ATM), are laid outover the IP Backbone over different telephone service providers (TSPs). These IPbackbones of TSPs not only serve the VPN networks of different subscribers, but also servethe public data networks through PSTN, ISDN, PDN, Broadband, and are also connected tothe National Internet Exchange Gateways (NIEX). All these networks connect to the nationalIP backbone of the TSP through a Tier 1 switch at each city / town.As is known by all those who are aware of the functioning of IP networks, in such a networkall routers connected to the network through these Tier 1 switches have continuous physicalaccess to each other. Further another characteristic of IP networks is that it supportsconcatenous or simultaneous communications between all routers connected to the networkthrough the Tier 1 switches at each POP (point of presence). Thus while routers A and B arein communication, a router C in the network could be simultaneously communicating with Aor B or both. This is the beauty and also the bane of IP networks. 
Beauty, because unlike circuit-switched networks there is no blocking of communications between any pair of routers even if one of them is already engaged in communication with another router. In the circuit-switched scenario the third communication device would be blocked from communicating with either of the devices already engaged in communication, resulting in a busy tone.

This feature is a bane, since in networks which have public domain access, as do the IP backbones of all TSPs, the third router could be that of a hacker sitting in the public domain, who is provided continuous physical access to the VPN router port of an organisation through the Tier 1 switches in each city / town. Once this continuous physical access is available to a hacker / cracker, he / she can get into the LAN associated with the VPN router through the process of snooping and spoofing, and thence to the internal databases residing in the INTRANET.

Thus, while VPN facilitates the secure transport of data between points A and B in the network through the TSP IP backbone using security protocols like IPsec, it exposes the internal databases of the organisation to outside intrusion, since they have public domain access from the TSP IP backbone.

Thus we see that the internal databases of an organisation are vulnerable when the INTRANET connectivity of an MLO is arranged through VPN (MPLS or otherwise).

To give you a view of how a VPN is connected through a typical TSP's IP backbone, I would refer the reader to the first two slides of VPN.ppt, which show schematics of the topology and architecture of a typical TSP IP backbone. This is available at http://www.slideshare.net/pankajmitra
(9 months ago)
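The "continuous physical access" point above can be made concrete with a minimal TCP reachability probe. This is a sketch only: the function name is invented here, and the commented example address comes from an IPv4 documentation range (RFC 5737), not from any real network in the note.

```python
import socket

def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port can be completed.

    On a shared IP backbone, any host able to complete this handshake
    has network-layer access to the target port -- the precondition
    for the snooping / spoofing attacks described in the text.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical usage: probe a (placeholder) router port on the backbone.
# 192.0.2.0/24 is reserved for documentation, so this address is illustrative.
# print(is_reachable("192.0.2.10", 22))
```

The probe succeeding says nothing about whether the target is *exploitable*; it only demonstrates the continuous reachability that the argument rests on.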
  • 14. Why is VPN growing in popularity in the IT world despite the inherent vulnerability of internal databases connected through VPN-based INTRANETs?

VPNs are themselves laid out over the telecom service providers' IP networks (see the PowerPoint presentation VPN.ppt) along with all other public data services and the Internet. Thus internal databases connected through such VPN / MPLS VPN networks can be accessed from the public domain networks, for the reasons explained in Slide 3 of that presentation. However, most IT consultants and system integrators lead their customers to believe that their databases are secure when connected through VPN / MPLS VPN networks. They do it for the following reasons:

A. It means less work for them: they do not have to write router tables as is required for point-to-point leased lines.

B. They lead customers to believe that it is cheaper to have VPN / MPLS VPN networks than point-to-point leased line networks. This is again a myth, as is shown in the document MPLS-P2P.doc. See http://www.slideshare.net/pankajmitra.

C. Customer IT managers also find this convenient, as their work is likewise reduced: they are connected to the service provider through a single- or two-WAN-port router to the nearest VPN node of the service provider. For any network problem they haul up the service provider and sit back themselves.

D. Thus customer IT managers choose the easy way. This is fine as long as there is no intrusion into the databases from hackers sitting in the public domain, who have continuous physical access to the VPN router ports. The trouble will start if and when the databases get hacked. They will then be in a nightmarish situation trying to retrieve the databases, if there is anything left to retrieve. The easy way is the hard way.

E.
If, on the other hand, the consultant, the system integrator, and the IT managers of the company took the trouble of setting up a point-to-point leased line network by configuring the router tables of their private network (the hard way), the network would then be free from any intrusion by hackers, since such a network denies physical access to the public domain and consequently to hackers. There will be no hacking, and the network administrators and IT managers will have a trouble-free life (the easy way). Thus the hard way is the easy way.

"The hard way is the easy way, and the easy way is the hard way"
(9 months ago)

Are firewalls breakable?

A firewall is a dedicated appliance with embedded software, or software running on a computer, which inspects network traffic passing through it and denies or permits passage based on a set of rules.

It is normally placed between a protected network and an unprotected network and acts like a gate to protect assets, ensuring that nothing private goes out and nothing malicious comes in.

A firewall's basic task is to regulate some of the flow of traffic between computer networks of different trust levels. Typical examples are the Internet, which is a zone with no trust, and an internal network, which is a zone of higher trust. A zone with an intermediate trust level, situated between the Internet and a trusted internal network, is often referred to as a "perimeter network" or de-militarised zone (DMZ).

There are several types of firewall techniques:

1. Packet filter: Packet filtering inspects each packet passing through the network and accepts or rejects it based on user-defined rules. Although difficult to configure, it is fairly
  • 15. effective and mostly transparent to its users. It is susceptible to IP spoofing.

2. Application gateway: Applies security mechanisms to specific applications, such as FTP and Telnet servers. This is very effective, but can impose performance degradation.

3. Circuit-level gateway: Applies security mechanisms when a TCP or UDP connection is established. Once the connection has been made, packets can flow between the hosts without further checking.

4. Proxy server: Intercepts all messages entering and leaving the network. The proxy server effectively hides the true network addresses.

For all types of firewalls, the rules and filters are set using software algorithms. The hackers / crackers have a technique of masking their data packets to conform to the filters and rules for access to the network. This is also known as spoofing. Once through into the network, they can seize the computers or the proxy through Telnet / SSH access, disable the software algorithms which were used to set the rules and filters, and open up the system to do whatever they wish. The IDS (intrusion detection systems) and IDP (intrusion detection and prevention) systems available today can make it difficult for hackers to get into the network, but not impossible. It will take them more time, but they can eventually get through. The only thing the hackers need to be able to do this is continuous physical connectivity to the routers between the Internet connection and the proxy server or the direct bank of computers, which is available to them through a broadband Internet connection. The race between the protector and the spoofer is a continuing process. You may go on spending money to increase the deterrence, depending on the value you ascribe to the asset (the databases) you are trying to protect. If the asset is also perceived to be valuable by the hacker / cracker, or by those who have employed them, then they too will be willing to devote more time to the job at hand.
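The packet-filter technique in (1), and the spoofing weakness just described, can be sketched in a few lines. Everything here is invented for illustration: the rule set, the function names, and the addresses (which are drawn from the RFC 5737 documentation ranges). The point the sketch makes is that a pure packet filter judges only the header fields it can see, so a forged source address passes the same test as a genuine one.

```python
from dataclasses import dataclass

@dataclass
class Packet:
    src_ip: str
    dst_ip: str
    dst_port: int

# User-defined rules, first match wins: (source prefix, destination port, verdict).
# A trailing catch-all gives the usual default-deny behaviour.
RULES = [
    ("203.0.113.", 443, "ACCEPT"),   # hypothetical trusted partner range, HTTPS only
    ("",           None, "DROP"),    # default: drop everything else
]

def filter_packet(pkt: Packet) -> str:
    """Return the verdict of the first rule whose prefix and port match."""
    for prefix, port, verdict in RULES:
        if pkt.src_ip.startswith(prefix) and (port is None or pkt.dst_port == port):
            return verdict
    return "DROP"

genuine  = Packet("203.0.113.7", "198.51.100.1", 443)
spoofed  = Packet("203.0.113.7", "198.51.100.1", 443)  # src_ip forged by the attacker
outsider = Packet("192.0.2.99",  "198.51.100.1", 443)

print(filter_packet(genuine))    # ACCEPT
print(filter_packet(spoofed))    # ACCEPT -- the filter cannot tell the header is forged
print(filter_packet(outsider))   # DROP
```

A stateful or application-layer gateway inspects more than these header fields, but as the text argues, any filter that ultimately trusts attacker-controllable packet contents can in principle be conformed to.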
A high-end security system with IDS / IDP built in and several layers of firewall would cost multiple crores of rupees. And yet it would not be totally invincible to hackers.

The only real way to prevent the hackers / crackers from getting in is to segregate the corporate network (INTRANET) from any form of public domain access: the Internet, or VPN (which also provides continuous access to the corporate network from the public domain). See VPN.ppt at http://www.slideshare.net/pankajmitra. This way we can disable the continuous physical access to the INTRANET from the Internet and from the intruders. By adding systems like IDS / IDP you can delay the cracking, but not prevent it totally. Thus all firewalls are breakable, although the effort required depends on the degree of protection built in. This is what prompts one to say, "For every firewall there is a water hose".
(9 months ago)

If we isolate our INTRANET from the Internet to achieve 100% security of internal databases, how do we communicate with clients, vendors, the general public, and consultants, and participate in e-Commerce activity and online banking activity (in the case of banks)?

This may be done by placing the MLO's (multi-locational organisation's) Web server, together with its storage, on a separate Internet LAN at the organisation's central location. This LAN may be connected through a standard firewall from Cisco or others, and a point-to-point (p2p) leased line of appropriate bandwidth (depending on the busyness of the link and the number of hits on the server within a certain time), to the nearest POP (point of presence) of the Internet Service Provider (ISP). The Web server, designated PS, will carry the e-Commerce or online banking client services, all the publishable information of the organisation, and the Internet email gateway for the organisation. The various fields in the PS will be replicated in an Intermediate Server (IS), and a Company Communications Server (CS) which resides in
  • 16. the INTRANET. All in and out communications between the INTRANET and the Internet are routed through the CS, to and from the various databases and the organisation's mail server. The IS is connected to a three-position, microcontroller-driven electro-mechanical RJ45 switch which connects it to either the INTRANET LAN or the Internet LAN, never to both together. Two-way synchronisation takes place between the CS and the IS when the IS is connected to the INTRANET LAN, and between the IS and the PS when it is connected to the Internet LAN. This ensures a free flow of information between the INTRANET and the Internet without impairing the 100% security of the databases residing in the INTRANET. This system is designated STS (the acronym for Secure Transfer System) and may be seen in slides 6 to 10 of PVDTN Presentation 1.ppt at http://www.slideshare.net/pankajmitra. The denial of direct Internet access to the INTRANET ensures the 100% security of the databases connected to it.

The MLO, and the people who wish to communicate or interact with it, do so through the PS and its associated storage, which are connected to the Internet, from anywhere in the world.
(9 months ago)

Integrate voice and fax over your 100% secure, 100% uptime INTRANET to save inter-locational telecom and travelling costs.

There are two ways in which voice and fax may be integrated over INTRANETs built with p2p leased lines: the conventional VoIP, or the path-breaking patented PVDTN system. In the former, voice / fax packets are sent along with data packets through the p2p links of the WAN. In the latter, using channel splitters, two parallel networks are run over the same p2p leased line backbone: an IP packet-switched network (for data communications and all IP services) and a circuit-switched network (for voice and fax communications).
The latter is more bandwidth efficient; see FAQ3 in the presentation PVDTN FAQs.ppt at http://www.slideshare.net/pankajmitra.

In the latter system, by adding between 30 and 40% to the bandwidth required for data communications over your IP data network built over point-to-point (p2p) leased lines, the total inter-locational voice and fax communications, and multiple simultaneous NET meetings using voice and voice-data conferencing for different work groups, with officers joining from their respective workplaces at a moment's notice, can be carried out over the secure INTRANET. This will eliminate the PSTN and other public telephony costs currently being incurred for such communications, and also save the travelling costs and time of conducting such meetings. How the savings take place is shown in FAQ1 in the presentation PVDTN FAQs.ppt at http://www.slideshare.net/pankajmitra.

The money so saved will help to recover the capital expenditure within a year or two.

Since the inter-locational voice / fax communications are carried out over the 100% secure INTRANET, they are free from the eavesdropping possible in public telephone networks.
(9 months ago)

Inferences

The following inferences may be drawn from the above discussions.

A. Any network which is laid out over the IP backbone of a telephone service provider shares this space with various public domain networks like the
  • 17. PSTN, ISDN, PDN, and Broadband services, and hence provides continuous physical access from the public domain, being part of the same IP backbone.

B. Networks like VPN networks, if used to connect an organisation's internal databases, will expose them to hacker / cracker attacks from the public domain.

C. Firewalls are breakable, and hence cannot give 100% security to the organisation's internal databases as long as there is continuous physical access from the public domain.

D. The only way to ensure 100% security of an organisation's internal databases is to connect them through an INTRANET which has no physical access from the public domain. This is done by building the INTRANET WAN using point-to-point leased lines, and ensuring that there is no public domain access (Internet, ISDN, PSTN, and shared networks like VPN) to the INTRANET LAN at any organisation location.

E. Since all organisations have to have a presence on the Internet and participate in the World Wide Web (www), this is enabled through Web-based proxy servers (PS) connected to the Internet. The PS will house the organisation's external mail gateway, the collaboration tools, the e-Commerce applications, and the online banking applications (in the case of banks), and will be open to access by all members of the Internet community, with different levels of access based on their relationship with the organisation.

F. To facilitate e-Commerce, e-Banking, etc., there will have to be a free flow of information back and forth between this PS and its associated storage and the INTRANET Communications Gateway Server (CS), through a secure transfer system which ensures that there is no direct contact between the INTRANET LAN and the Internet LAN, so that there is no impairment of the 100% security of the INTRANET.

G.
Since the INTRANET provides a dedicated, 100% secure network to the organisation, technology now exists to integrate voice, fax, and conferencing (voice and voice-data) cost-effectively by increasing the link bandwidths by 40 to 50%, thereby eliminating public telephony between all company locations and saving approximately 50 to 75% of present telephone costs, as well as a substantial part of the inter-locational travelling costs. The savings thus made can pay back the infrastructure for 100% security of the organisation's internal databases within a few years.

H. Thus all data and information which is important and exclusive to the organisation should reside in the private cloud connected through the organisation's INTRANET.

I. The e-Commerce activity, and transactions like the online banking transactions, may be secured by measures like dynamic passwords passed on to the user's dedicated mobile number for every transaction. Thus while hackers and crackers may view your accounts with the bank by hacking
  • 18. into your static username and password, they will not be able to make any transactions without the account holder's mobile phone. This can of course be overcome by them if they can get the messages coming into the user's mobile phone diverted to theirs. To do this they will need to have access to this dedicated phone number, or have a contact in the mobile service provider who can help to get this done. Arduous processes, no doubt.

J. Your computer in the workplace may be connected either to the Internet for browsing activity or to the INTRANET for other computing activity, but never to both together. This may be achieved through a Secure Switch which connects your machine to either the INTRANET or the Internet LAN.

K. Public clouds like the search engines and the email services are meant for universal access and do not need any security measures. The individual applications can determine what may be done on these clouds. By hacking through your static username and password, hackers / crackers may gain access to your Web mail. It is best to delete from the Web mail storage all emails sensitive to your business if you do not want intruders to have access to these.

L. Individual organisation / bank Web proxy servers may be accessed by all people in the Internet community, and these people will get information / access depending on their relationship with the organisation / bank. Hackers / crackers may hack through to the static usernames and passwords of any person and acquire their access rights to these Web sites. However, the additional precautions mentioned in (I) above will prevent them from carrying out transactions, unless they go further and get the messages sent to the individual's dedicated mobile phone diverted to theirs, so as to access the dynamic passwords associated with each transaction. This is definitely a more difficult task, though not impossible.

M.
Claims by cloud storage, cloud infrastructure, or SaaS service providers that they will be able to provide security for your data are belied, as indicated in A, B, and C above. Hackers / crackers will find a way to get into your data and applications residing in the public cloud through the continuous physical access they enjoy via the common IP backbone of the telephone service provider(s). Hence take such assurances with a pinch of salt.

N. The answers to the security questions posed above, and the inferences drawn from the series of discussions listed, should, we hope, help each organisation's IT infrastructure planners to choose the way they wish to lay out their IT infrastructure.

O. If you still need further help and / or clarifications, you may contact us at midautel@bsnl.in or pankajmitra@gmail.com
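The dynamic-password scheme in inference (I) is, in effect, a single-use code delivered out of band. A minimal sketch follows; the class and method names are invented here, and the actual delivery step (an SMS to the registered mobile) is deliberately stubbed out, since that channel is exactly what the scheme assumes the attacker does not control.

```python
import hashlib
import hmac
import secrets

class OTPService:
    """Issue and verify single-use transaction codes (illustrative only)."""

    def __init__(self):
        self._pending = {}  # txn_id -> SHA-256 hash of the outstanding code

    def issue(self, txn_id: str) -> str:
        """Create a 6-digit one-time code for a transaction."""
        code = f"{secrets.randbelow(10**6):06d}"
        self._pending[txn_id] = hashlib.sha256(code.encode()).hexdigest()
        # In a real deployment the code would be sent by SMS to the
        # account holder's registered mobile, never returned to the caller.
        return code

    def verify(self, txn_id: str, code: str) -> bool:
        """Accept a code at most once; a replayed or unknown code fails."""
        expected = self._pending.pop(txn_id, None)  # single use: removed here
        if expected is None:
            return False
        supplied = hashlib.sha256(code.encode()).hexdigest()
        return hmac.compare_digest(expected, supplied)

svc = OTPService()
code = svc.issue("txn-42")
print(svc.verify("txn-42", code))   # True: correct code, first use
print(svc.verify("txn-42", code))   # False: the same code replayed is rejected
```

Because each code is bound to one transaction and consumed on first use, a stolen static username and password alone cannot authorise a transaction, which is the property (I) and (L) rely on; diverting the mobile channel remains the residual attack the text identifies.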