CREATE TABLE [dbo].[posts](
	[Id] [numeric](18, 0) IDENTITY(1,1) NOT NULL,
	[Slug] [nvarchar](255) NOT NULL,
	[Title] [nvarchar](255) NOT NULL,
	[Body] [text] NOT NULL,
	[Author] [nvarchar](255) NOT NULL,
	[PubDate] [datetime] NOT NULL,
 CONSTRAINT [PK_posts] PRIMARY KEY CLUSTERED
(
	[Id] ASC
) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
In a classic IT model, there are a lot of inefficiencies.

Barrier for innovation: for every solution to be deployed, an investment must be made. For example, 2 servers are bought and configured by corporate IT. These investments are repeated: whenever capacity is estimated too low, a new investment is made. For example, 2 extra servers are bought and configured by corporate IT every quarter.

Waste of capacity: at a given moment we have 4 servers, but the actual load could have been handled with only 1. This means we invested 400% too much!

Under-supply of capacity: at a given moment we get some press coverage and the actual load increases more than estimated. Imagine we are 1 server short. This may be no problem, but if our checkout process is running on that machine, the number of users that 1 server can handle cannot make a purchase. That means we lose a lot of possible income!

Fixed cost of IT capacity: when the load drops for a period of time and only 1 server is actually required, we still have to maintain and support all the other servers. This comes at a cost that should not be incurred at this time: we are not using them!

In a cloud model, the actual capacity can be kept closer to the actual load, with less over-capacity. No upfront investment needs to be made: pay a small amount at the start, and pay more as your solution is used more. Whenever over-capacity exists, scale the solution down to fewer resources and see an immediate impact on costs. IT is no longer treated as an investment, but as a cost. We don’t invest in power generators; we pay for electricity by use. This paradigm can become true for IT by using a cloud model. IT can become a business enabler!

Ideas can be tried without a large investment upfront, reducing risks and costs. Imagine an application try-out in an insurance company that wants to test whether a cost simulation on their site is used and attracts customers and revenue. In a classic IT model, an investment would have to be made.
In a cloud model, we pay only a fraction of the cost to try this out. If it succeeds and attracts revenue, we can easily cope with the higher usage costs. If the simulation tool does not attract revenue, we can take it down and the cost disappears, without having invested upfront.

Cloud computing, in short, means that capacity/resources in terms of networking, data, storage, or compute power are provided in an on-demand model: go to a portal and the resources are there fast. This is unlike traditional IT, where a whole process must be followed. Being agile means you can respond to the market faster! The economics are different too: there is a pay-per-use model, and upfront investments are no longer needed.

Cloud != virtualization, despite what a lot of vendors tell you! Cloud is an addition to virtualization: it allows you to transparently scale the demand for resources across virtual machines, without having to know which machine you are working on. This increases the usage and effectiveness of virtualization.

Cloud services have inherent resilience to hardware or software failures, thanks to redundant, self-healing service models combined with deep integration between operations and development/test, providing a lights-out experience.

Cloud is the future: most vendors are projecting a future of devices (PCs, mobiles, TVs, …), interconnected through the Internet and using hosted services. To achieve the best cost-effectiveness, these hosted services run in a cloud, whether a public cloud or a private cloud. Cloud computing will drive competition in the future due to its cost-efficiency.
Take your application to the next level
http://eric.blob.core.windows.net/music/rock/rush/xanadu.mp3

Blobs – provide a simple interface for storing named files along with metadata for the file.
Tables – provide structured storage. A table is a set of entities, which contain a set of properties.
Queues – provide reliable storage and delivery of messages for an application.
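The example URL above follows the public blob addressing scheme: account name, then container, then blob name. A minimal sketch of that convention (the account, container, and file names here are just the ones from the example):

```python
def blob_url(account: str, container: str, blob_name: str) -> str:
    """Compose a public blob URL: http://<account>.blob.core.windows.net/<container>/<blob>."""
    return f"http://{account}.blob.core.windows.net/{container}/{blob_name}"

# Reconstructs the example URL from the slide.
url = blob_url("eric", "music", "rock/rush/xanadu.mp3")
```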
Use queues as a way of communicating with the backend worker roles. Worker roles call GetMessage and pass a timeout. The timeout value is important: for the duration of the timeout, the message is marked invisible in the queue. When we’re done processing, we remove the message with a DeleteMessage call. The reason we do this: imagine we have a second worker role. If something goes wrong, once the timeout expires the message becomes visible again, and the next worker to call GetMessage will receive it.
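To make the visibility-timeout mechanics concrete, here is a minimal toy simulation of the get/delete pattern described above (this is a sketch of the behavior, not the real Azure queue API; all names are illustrative):

```python
import time

class SimQueue:
    """Toy model of a queue with visibility timeouts, as described above."""

    def __init__(self):
        # Each message is a dict: its body plus the time until which it is invisible.
        self.messages = []

    def put(self, body):
        self.messages.append({"body": body, "invisible_until": 0.0})

    def get(self, visibility_timeout):
        """Return the first visible message and hide it for `visibility_timeout` seconds."""
        now = time.monotonic()
        for msg in self.messages:
            if msg["invisible_until"] <= now:
                msg["invisible_until"] = now + visibility_timeout
                return msg
        return None  # nothing visible right now

    def delete(self, msg):
        """Remove a processed message so it can never reappear."""
        self.messages.remove(msg)
```

If a worker crashes without calling `delete`, the message simply becomes visible again after the timeout and another worker picks it up, which is exactly the failover behavior the slide describes.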
The PartitionKey combined with the RowKey uniquely identifies an entity in a table.
Getting all of dunnry’s posts is fast because we’re selecting the entities by a partition key. Getting all of the posts after a certain date is slow because we may have to traverse multiple servers: we’re selecting entities that span partition keys. A query without the partition key is really a scan.
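A minimal sketch of why this matters, modeling a table as partitions keyed by PartitionKey and then RowKey (the data and helper names are illustrative, not real API):

```python
# Toy model of a partitioned table: each partition could live on a different
# server, so a query inside one partition touches one server, while a query
# without a PartitionKey must visit every partition.
table = {
    "dunnry": {"0001": {"Title": "Post A", "PubDate": "2009-01-01"}},
    "eric":   {"0001": {"Title": "Post B", "PubDate": "2009-02-01"}},
}

def posts_by_author(table, partition_key):
    # Fast: one partition, one server.
    return list(table[partition_key].values())

def posts_after(table, date):
    # Slow: no PartitionKey in the filter, so this is a scan of every partition.
    return [e for partition in table.values()
            for e in partition.values() if e["PubDate"] > date]
```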
We have included this feature comparison table in anticipation of your likely questions about the differences between a relational database table, as you may currently use in your SQL Server databases, and the new Windows Azure Tables.
As I stated earlier, SQL Azure is based on SQL Server 2008. At this time it offers only a subset of the features of the server product. My intention here is to convey, at a high level, the features that are supported and the ones that are not. SQL Azure will support most of the things we need: tables, indexes, views, stored procedures, triggers, and constraints… in my book, that’s all the functionality I need for most of my applications. There are some adjunct technologies that ship as part of SQL Server 2008, such as SQL Reporting Services and Analysis Services, which are not supported. Service Broker is also not supported.
So let’s assume that we have designed our relational database with local developer and data modeling tools. We can begin our story by assuming that we want to get our database deployed to the cloud. There are some tools that will expedite this process, which I will show you later, but for now let’s assume that we have scripted our database schema. We apply this script to SQL Azure, which speaks native TDS. If you created your database through the SQL Azure Portal, SQL Azure will have created one master database and three replicas of that database. If you create your database with the script, the same will be true. These replicas are stored in data centers different from the master’s, to provide redundancy and protection against geographical catastrophe.
Configuring our application to use SQL Azure storage instead of SQL Server is simply a matter of modifying the connection string in our application’s configuration file. When our application requests data, ADO.NET speaks TDS, which directs our queries to the master database server. The master database server performs our query and returns the results to our application.
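As a sketch, the modified connection string in app.config might look like the following (the server, database, user, and password are placeholders; the real server name comes from the SQL Azure Portal):

```xml
<connectionStrings>
  <!-- Placeholder values only; never check real credentials into config. -->
  <add name="BlogDb"
       connectionString="Server=tcp:myserver.database.windows.net;Database=mydb;User ID=myuser@myserver;Password=...;Encrypt=True;"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```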
From our application’s point of view, there is only one SQL Azure database. As we make updates to our database, those updates are replicated to the copies stored in other data centers, so that in the event our database fails for any reason, the other databases will be standing by, ready to take its place.
But what if that master database server fails for some reason? TDS receives notification of the database failure and automatically redirects the call to a replica! The Azure Cloud Fabric is self-healing… the details are outside the scope of this presentation, but the fabric will get busy repairing itself like drones on a Borg mother ship, essentially with the objective of keeping three replicas online at all times.
I will demonstrate creating a SQL Azure account in session 3, where I will walk you through the entire process. For now I simply want to give you some background information to prepare you for our first demonstration. When we create our SQL Azure database server, we’ll be prompted for an administrator’s name and a password. This username and password will be granted a system administrator role similar to the “sa” account on a local SQL Server 2008 box. The account has permission to create and drop databases, and database ownership authority in any database that you create with it.
After creating your SQL Azure database server, you will want to grant appropriate access through the SQL Azure firewall. SQL Azure provides a very simple, easy-to-maintain firewall; it is so easy to use that it’s only going to get one slide in my deck! The firewall allows us to expose our database to Windows Azure services via a checkbox, and to add ranges of IP addresses, such as your home office and your business… or possibly the address of a 3rd-party server hosting some application that needs data access. I’ll do a thorough demo of this feature in session 3…
When you created your SQL Azure database server, you supplied an administrator’s user name and password. I have named my user accordingly, to remind me of its power. The SQL Portal will offer to copy these credentials, in connection string format, to your clipboard, tempting you into believing that you should just paste them into your configuration file. This is terrific for demos like mine… BUT you should NEVER, EVER do this. A database server system administrator password placed in a configuration file in clear text… there has got to be something naive in the extreme going on here. And worse: there is no way to create non-sa-like users through the UI; you must script your database users and then apply the script to the database. And to anticipate your question… no, you can’t use SQL Server Management Studio to do this either. I will demo this as well in session 3… so hang tight…
Questions: are there any managers present?
THOMAS: We should not include a Q&A in the track session; refer people with questions to the “Meet the experts” setup. Only allowed if we are running ahead of schedule.