SQL Portfolio: Admin Practicals




  1. 1. <ul><li>SQL Portfolio </li></ul><ul><li>Name </li></ul><ul><li>Email </li></ul>
  2. 2. <ul><li>Acme Traders Research and Development Department </li></ul><ul><li>Piggy Bank-Developing an Archive Plan </li></ul><ul><li>Securing SQL Server </li></ul><ul><li>Using Bulk Insert </li></ul><ul><li>Choosing the Right Replication Type </li></ul><ul><li>Library Maintenance Plan </li></ul><ul><li>Triggers-Library Database </li></ul>
  3. 3. <ul><li>Case Study – Web application using XML </li></ul><ul><li>Acme Traders' Research and Development database has, up until now, only been available in house. Given the need for engineers to review this data abroad, it will now be available via a web application. This application will use a server running Windows Server 2003 and, of course, SQL Server 2005. </li></ul><ul><li>Research data (many different types exist, but just use this one example for this case study) entered through the web application is stored as the XML data type using the following defined type: </li></ul><ul><li><Product> </li></ul><ul><li><ProductID>1</ProductID> </li></ul><ul><li><ProductName>Widget</ProductName> </li></ul><ul><li><ProductText> <Intro>Introduction</Intro> <SubIntro Title="Web Application">Clever web application</SubIntro> <SubIntro Title="Unit of data">Used as a theoretical unit of data</SubIntro> </ProductText> </li></ul><ul><li></Product> </li></ul><ul><li>The XML data type column is named ProductText and is in the FutureProjects table. The FutureProjects table has ProductID, MasterID, TeamID, ProductText, ProductCategoryID, and Date columns. </li></ul><ul><li>This web server should be available to engineers 24/7. It is important that this data is kept secure and backed up regularly. </li></ul><ul><li>  </li></ul><ul><li>1. You are creating a procedure for the web application to return each of the product names from the reference data in the FutureProjects table. You want only this data returned, with no XML tags. Which of the following should you use? 
</li></ul><ul><li>a) SELECT ProductText.query('/Product/ProductName') FROM FutureProjects </li></ul><ul><li>b) SELECT ProductText.nodes('/Product/ProductName') FROM FutureProjects </li></ul><ul><li>c) SELECT ProductText.value('(/Product/ProductName)[1]', 'varchar') FROM FutureProjects </li></ul><ul><li>d) SELECT ProductText.exist('/Product/@ProductName') FROM FutureProjects </li></ul><ul><li>  </li></ul><ul><li>2. You have been asked to create a new feature for the web application that will list the SubIntro titles for a given product stored in the database. Which of the following should you use? </li></ul><ul><li>a) nodes() method </li></ul><ul><li>b) OPENXML </li></ul><ul><li>c) value() method </li></ul><ul><li>d) query() method </li></ul><ul><li>  </li></ul><ul><li>3. Since research engineers are reading, updating, and inserting the data (on the same projects) concurrently, how should you handle the concurrency requirements? </li></ul><ul><li>a) Set the transaction isolation level to SERIALIZABLE </li></ul><ul><li>b) Set the transaction isolation level to READ_UNCOMMITTED </li></ul><ul><li>c) Set the transaction isolation level to REPEATABLE_READ </li></ul><ul><li>d) Set the transaction isolation level to READ_COMMITTED_SNAPSHOT </li></ul>
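For reference, the three XML data type methods behind questions 1 and 2 can be sketched as follows. This is a sketch against the case-study FutureProjects schema; the exact XPath paths and varchar lengths are assumptions.

```sql
-- Sketch against the case-study schema (paths and lengths assumed).

-- value() returns a single scalar with no XML tags, which is what question 1 asks for:
SELECT ProductText.value('(/Product/ProductName)[1]', 'varchar(50)') AS ProductName
FROM FutureProjects;

-- query() returns typed XML, tags included, so it would not satisfy question 1:
SELECT ProductText.query('/Product/ProductName')
FROM FutureProjects;

-- nodes() shreds repeating elements, e.g. listing the SubIntro titles for question 2:
SELECT T.c.value('@Title', 'nvarchar(100)') AS SubIntroTitle
FROM FutureProjects
CROSS APPLY ProductText.nodes('/Product/ProductText/SubIntro') AS T(c);
```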
  4. 4. <ul><li>4. The ProductGuides table has the document listing of all available product research materials. Engineers will often use this in the initial stages of research and design, and it is queried very heavily, resulting in some large result sets. The content is relatively static, however. As this is an extremely large table, you will need to create an efficient index to minimize the locking in the table for the most common queries, which are based on the ProductCategoryID. Which of the following should you use? </li></ul><ul><li>a) ALLOW_ROW_LOCKS = OFF ALLOW_PAGE_LOCKS = OFF b) ALLOW_ROW_LOCKS = ON ALLOW_PAGE_LOCKS = ON </li></ul><ul><li>c) ALLOW_ROW_LOCKS = OFF ALLOW_PAGE_LOCKS = ON d) ALLOW_ROW_LOCKS = ON ALLOW_PAGE_LOCKS = OFF </li></ul><ul><li>  </li></ul><ul><li>5. This database was originally designed for SQL Server 2000, and you need to make sure that all aspects are compliant with SQL Server 2005 and later versions. Several columns use the image data type. Which of the following would you use to replace this data type? </li></ul><ul><li>a) Nvarchar(max) b) Varbinary(max) </li></ul><ul><li>c) Varbinary d) Nvarchar </li></ul><ul><li>  </li></ul><ul><li>6. Several Vista notebook clients need to have full functionality with your SQL Server 2005 application. Which of the following client libraries should they be using? </li></ul><ul><li>a) OLEDB b) SQLCMD </li></ul><ul><li>c) SQLNCLI d) ODBC </li></ul><ul><li>  </li></ul><ul><li>7. Users are complaining that various queries are taking too long to process. You would like to capture these queries and analyze the data with a minimum of impact on the current server. Which of the following represents the best answer? </li></ul><ul><li>a) Create a SQL Server Profiler replay trace and save the data to a file on the SQL Server. </li></ul><ul><li>b) Create a SQL Server Profiler replay trace and save the data to a file on another server. 
</li></ul><ul><li>c) Create a SQL Server Profiler replay trace and save the data to a table in tempdb. </li></ul><ul><li>d) Monitor the queries in System Monitor using SQL Server: Memory Manager counters. </li></ul><ul><li>  </li></ul><ul><li>8. When resolving a problem with a specific query, you would like to determine if a different index would make the query perform more efficiently. Which of the following SET options could you use to determine this? </li></ul><ul><li>a) SET SHOWPLAN_TEXT ON b) SET STATISTICS XML ON </li></ul><ul><li>c) SET SHOWPLAN_XML ON d) SET FORCEPLAN ON </li></ul><ul><li>  </li></ul><ul><li>9. What solution should be used to provide secure access between the web server on the perimeter network and the research and development SQL Server database in the internal network? </li></ul><ul><li>a) Install IIS on the SQL Server 2005 server and access the data using XML queries. b) Create an HTTP endpoint and access the server data using stored procedures. </li></ul><ul><li>c) Create an HTTP endpoint and access the server data using ad hoc queries. d) Stop and restart the perimeter web server and access the server data using ad hoc queries. </li></ul><ul><li>  </li></ul><ul><li>10. You need to create indexes to improve the performance of queries for the FutureProjects table. The most common queries return the ProductID and ProductName. Which indexes should you create? Choose all that apply. </li></ul><ul><li>a) Create a clustered index on TeamID b) Create a clustered index on ProductText </li></ul><ul><li>c) Create a clustered index on ProductID d) Create a primary XML index on ProductText </li></ul><ul><li>e) Create a property index for ProductText </li></ul>
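Question 8's estimated-plan option can be exercised like this (a sketch; the query shown is illustrative, not part of the case study):

```sql
-- SET SHOWPLAN_XML ON returns the estimated plan as XML without executing the query,
-- so you can check which index the optimizer would choose before and after a change.
SET SHOWPLAN_XML ON;
GO
SELECT ProductID FROM FutureProjects WHERE ProductCategoryID = 1;
GO
SET SHOWPLAN_XML OFF;
GO
```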
  5. 5. <ul><li>PiggyBank has survived the subprime mortgage fallout and is now in a relative position of strength among its competitors. It now serves 2.4 million customers over a very expansive area in the U.S. and Canada. The company is headquartered in Parsippany, NJ. </li></ul><ul><li>543 employees now work for PiggyBank in Parsippany. </li></ul><ul><li>There are 3 regional offices, which for convenience we will call North, West and South. </li></ul><ul><li>PiggyBank now has a 6TB OLTP database that tracks more than 4 billion transactions each year. This main database is stored in Parsippany. The regional offices only process Deposit and Withdrawal information and update the Parsippany office daily. </li></ul><ul><li>The departmental servers in Parsippany have been experiencing some problems. Server capacity is often overloaded, resulting in subpar performance and several frustrating delays. </li></ul><ul><li>Company growth has been hurt by the general economy, but within 2 years the company should return to 3% annual growth. The database is growing by 8% each year and will exceed disk capacity within the next 2 years. </li></ul><ul><li>Most of this database is historical information. </li></ul><ul><li>Government regulations require that 7-year records be kept and that this data be available within 24 hours. </li></ul><ul><li>You must design a data-archiving plan for PiggyBank's ATM transactions. Only the current month will now be kept in the online database. </li></ul><ul><li>Keep in mind that once transactions are made, they cannot be modified. Any changes would be reflected by an additional transaction at a later date. This makes all of the historical information read-only. </li></ul><ul><li>1. Fill out this table (modify as you like) to show the online and archived data-accessibility requirements. Classify the data based on the time divisions, and indicate the storage format for each. 
</li></ul><ul><li>Data Source | Time Division | Accessibility Requirements | Storage Format </li></ul><ul><li>Online | Current month | High availability, immediate access | OLTP database server </li></ul><ul><li>Archived | Last 7 years | Accessible within 24 hours | Disk </li></ul><ul><li>Offline | Older than 7 years | Offsite | Tapes </li></ul>
  6. 6. <ul><li>2. What is your proposed data-movement schedule? </li></ul><ul><li>Data Movement | Frequency </li></ul><ul><li>From Online to Archive | Monthly </li></ul><ul><li>From Archive to Tape | Yearly </li></ul><ul><li>  </li></ul><ul><li>3. Which of the following should be considered when designing the archival strategy? (Choose all that apply) </li></ul><ul><li>a) Cost b) Government/Industry Regulations </li></ul><ul><li>c) Accessibility Requirements d) Granularity </li></ul><ul><li>4. Which data structure would you use if you wished to maintain the historical context of the archival data, but you cannot archive all the related data together? </li></ul><ul><li>a) Partitioned Data b) Normalized Tables </li></ul><ul><li>c) Denormalized Tables d) Summary Tables </li></ul><ul><li>5. If the requirements for the case study were to 1) maintain 24 months of data online for immediate access for queries and updates and 2) maintain a total of 7 years for accounting and reporting requirements, which of the following would be the most appropriate storage format? </li></ul><ul><li>a) Place the current 24 months' data on an OLTP database server and 5 years' data on an archive server. </li></ul><ul><li>b) Place all the data on the OLTP server, and use partitioning to separate the data between the current 24 months and the remaining 60 months. </li></ul><ul><li>c) Place the current 24 months of data on an OLTP server and the remainder on tape. </li></ul><ul><li>d) Use summary tables to reduce the load on the OLTP server, and store all detailed data on an archive server. </li></ul><ul><li>6. The data-movement strategy should contain which of the following steps? 
(Choose all that apply) </li></ul><ul><li>a) Verification that data has been copied to the destination storage format b) Means to ensure the security of data during movement </li></ul><ul><li>c) Specification of the frequency of data movement d) Scheduling of data movement to minimize impact on the production server </li></ul><ul><li>7. Which of the following roles can a single server have in a replication topology? </li></ul><ul><li>a) Distributor b) Publisher </li></ul><ul><li>c) Subscriber d) All of the Above e) A and B only </li></ul>
  7. 7. <ul><li>8. Which of the following statements regarding replication topologies in SQL Server 2005 are true? (choose all that apply) </li></ul><ul><li>A) Wizards are available in SSMS to simplify the setup once you’ve designed it B) Specific tables of a database can be replicated, not necessarily the entire database C) Schema changes can be automatically sent to subscribers without using any special stored procedures. </li></ul><ul><li>D) All of the Above. </li></ul><ul><li>9. You are a database administrator for LoveMyLube, a small chain of auto service shops that provide oil changes and similar services. Requirements are that 48 months must be stored online in the Sales database and that older data must be sent to an archival database. Which of the following is the best way to structure the SalesTransactions table? </li></ul><ul><li>A) Partitioned View </li></ul><ul><li>B) Table Partitioning </li></ul><ul><li>C) Denormalization </li></ul><ul><li>D) Summary tables </li></ul><ul><li>10. Refer to the previous question. Which archival frequency would you use? </li></ul><ul><li>A) Daily </li></ul><ul><li>B) Monthly </li></ul><ul><li>C) Quarterly </li></ul><ul><li>D) Annually </li></ul>
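Table partitioning (question 9, answer B) might be sketched along these lines; the partition boundaries, filegroup, and column names are assumptions for illustration:

```sql
-- Partition SalesTransactions by month so months older than 48 can be
-- switched out to the archival database with minimal impact.
CREATE PARTITION FUNCTION pfSalesByMonth (datetime)
AS RANGE RIGHT FOR VALUES ('2008-01-01', '2008-02-01', '2008-03-01');
GO
CREATE PARTITION SCHEME psSalesByMonth
AS PARTITION pfSalesByMonth ALL TO ([PRIMARY]);
GO
CREATE TABLE SalesTransactions (
    TransactionID   int IDENTITY(1,1) NOT NULL,
    TransactionDate datetime NOT NULL,
    Amount          money NOT NULL
) ON psSalesByMonth (TransactionDate);
```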
  8. 8. <ul><li>"Ensuring your solution is safe from code-injection attacks and minimizing the attack surface area" </li></ul><ul><li>1. How would you mitigate code-injection attacks? (Please give at least 4 examples) </li></ul><ul><li>Whenever you use string concatenation to build SQL code dynamically and accept user input as part of the concatenated string, treat your application as insecure. There are too many different techniques to exploit this vulnerability, and new techniques evolve all the time. You can mitigate the problem by using the minimal-privilege approach. Disable all unnecessary services and features, such as extended procedures, to minimize the attack surface area. You should not return SQL Server error messages to the client application directly because they can inform the attacker that your application is using string concatenation. Validate all user input, testing the size and type of the input. Validate XML input against XML schemas. Check and reject special characters that can be used to modify the intended execution of your SQL string, such as semicolons (command delimiter), apostrophes (string delimiter), and double hyphens (inline comments). Do not accept strings that an attacker can use to construct file names, such as AUX, CON, and so on. </li></ul><ul><li>2. How can you minimize the attack surface area for your SQL Server services and components quickly? (What tool would you use?) </li></ul><ul><ul><li>Stop or disable services </li></ul></ul><ul><ul><li>SQL Server Surface Area Configuration tool </li></ul></ul><ul><li>3. How can you secure the sa login? (Please give at least 3 examples) </li></ul><ul><ul><li>Use Windows authentication </li></ul></ul><ul><ul><li>Encrypt communications for the logon process </li></ul></ul><ul><ul><li>Implement password aging </li></ul></ul><ul><li>4. How would you implement the principle of least privilege for Notification Services service accounts? (What accounts should you not use?) 
</li></ul><ul><ul><li>Configure the engine to use Windows Authentication for database access. </li></ul></ul><ul><ul><li>Run the engine under a low-privileged domain or local account. Do not use the Local System, Local Service, or Network Service account or any account in the Administrators group. However, a delivery protocol may require additional privileges for the account that the service runs under. </li></ul></ul><ul><ul><li>When you deploy an instance of Notification Services, make sure that each engine has only the necessary permissions. For single-server deployments, the engine runs all of the instance's hosted event providers, generators, and distributors. The account used by the engine should obtain the required database permissions through membership in the NSRunService database role. For scale-out deployments, restrict the permissions of individual engines. </li></ul></ul><ul><li>5. Your application uses the xp_cmdshell extended stored procedure. After you upgrade your database to SQL Server 2005, your application no longer runs. What went wrong, and what can you do to mitigate the problem? </li></ul><ul><ul><li>xp_cmdshell is disabled by default on new installs; it can be enabled with the SQL Server Surface Area Configuration tool or by running the sp_configure system stored procedure. Enable xp_cmdshell and the application should run again. </li></ul></ul>
  9. 9. <ul><li>First export the data from AdventureWorks.Sales.CreditCard to a text file using the bcp.exe command prompt utility from the command line or from within SSMS. </li></ul><ul><li>Create 2 new clean databases (Names: Test, Test2 would be fine). Set the recovery model for both to Bulk_Logged. Add the CreditCard table to both these databases via the "Create Table" script included below: </li></ul><ul><li>USE **[Whatever you've named your 2 databases – i.e. Test and Test2]** </li></ul><ul><li>GO </li></ul><ul><li>SET ANSI_NULLS ON </li></ul><ul><li>SET QUOTED_IDENTIFIER ON </li></ul><ul><li>GO </li></ul><ul><li>CREATE TABLE [CreditCard]( </li></ul><ul><li>[CreditCardID] [int] IDENTITY(1,1) NOT NULL, </li></ul><ul><li>[CardType] [nvarchar](50) NOT NULL, </li></ul><ul><li>[CardNumber] [nvarchar](25) NOT NULL, </li></ul><ul><li>[ExpMonth] [tinyint] NOT NULL, </li></ul><ul><li>[ExpYear] [smallint] NOT NULL, </li></ul><ul><li>[ModifiedDate] [datetime] NOT NULL CONSTRAINT [DF_CreditCard_ModifiedDate] DEFAULT (getdate()), </li></ul><ul><li>CONSTRAINT [PK_CreditCard_CreditCardID] PRIMARY KEY CLUSTERED </li></ul><ul><li>( [CreditCardID] ASC) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] </li></ul><ul><li>  GO </li></ul><ul><li>   Populate the first table (in the first database) by using INSERT… SELECT (listing the non-identity columns, since the target table generates its own CreditCardID values): </li></ul><ul><li>INSERT INTO dbo.CreditCard (CardType, CardNumber, ExpMonth, ExpYear, ModifiedDate) </li></ul><ul><li>SELECT CardType, CardNumber, ExpMonth, ExpYear, ModifiedDate </li></ul><ul><li>FROM AdventureWorks.Sales.CreditCard; </li></ul><ul><li>GO </li></ul><ul><li>Check the size of the transaction log. Use DBCC SQLPERF('Logspace'); GO </li></ul><ul><li>Using the second database and second CreditCard table, use bcp.exe to complete the same import task </li></ul><ul><li>Use the DBCC SQLPERF command again to test the size of the transaction log, and summarize the results for me (include the sizes for both transaction logs). </li></ul>
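The export and import steps above might look like this from the command line; the server name, file path, and trusted-connection switch are assumptions:

```
rem 1) Export AdventureWorks.Sales.CreditCard to a character-format text file:
bcp AdventureWorks.Sales.CreditCard out C:\temp\CreditCard.txt -c -T -S MYSERVER

rem 2) Bulk-import into the second test database (-E keeps the exported identity values):
bcp Test2.dbo.CreditCard in C:\temp\CreditCard.txt -c -T -E -S MYSERVER
```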
  10. 10. <ul><li>Result: </li></ul><ul><li>The size of both transaction logs was the same: 9.929688 MB. There was a difference in Log Space Used (%), but not much of one: Test had 50.96873% and Test2 slightly less at 50.95889%. There really isn't a difference between the inserts, so it comes down to whichever method you prefer, T-SQL or bcp. </li></ul>
  11. 11. <ul><li>Acme Traders has several locations throughout the world. Users in these various locations need access to data stored in SQL Server, and they need it as fast as possible. </li></ul><ul><li>One of the servers, located in Los Angeles, CA, contains a Sales database that needs to be replicated to 3 primary satellite offices in San Francisco, Portland and Seattle, </li></ul><ul><li>which are connected via a T1 connection that runs at 85% capacity. 1. The sales associates make changes through the day, but the users in the satellite offices do not need to see the changes to the data immediately. What type of replication should be used? </li></ul><ul><li>a) Merge b) Transactional with queued updates </li></ul><ul><li>c) Snapshot d) Transactional with subscribers that immediately update </li></ul><ul><li>e) Snapshot with subscribers that periodically update </li></ul><ul><li>2. The accounting departments at each of the primary satellite locations need a copy of the data from the main accounting database in L.A. that they can make changes to locally. They need this data to be as current as possible. Which type of replication suits them best? </li></ul><ul><li>a) Merge b) Transactional with queued updates </li></ul><ul><li>c) Snapshot d) Transactional with subscribers that immediately update </li></ul><ul><li>e) Snapshot with subscribers that periodically update </li></ul><ul><li>3. Several additional smaller sales offices are located in the west. The L.A. office needs an up-to-date copy of the sales offices' databases. When the L.A. office sends new inventory to these sales offices, they want to update the database in L.A. and have this new data replicated to the proper office. Which replication type should be used? </li></ul><ul><li>a) Merge b) Transactional with queued updates </li></ul><ul><li>c) Snapshot d) Transactional with subscribers that immediately update </li></ul><ul><li>e) Snapshot with subscribers that periodically update </li></ul><ul><li>4. 
The retail division of Acme manages shops in various cities. Each of these shops maintains its own inventory database. The retail manager in Oakland, CA wants his shop to be able to share inventory with his other stores in the east bay. The employees will be able to update their local copy of the inventory database, subtract appropriately from the other store's inventory, and then go pick up the part. The part will certainly be there, as it has been taken out of inventory. Which replication type should you use to accomplish this? </li></ul><ul><li>a) Merge b) Transactional with queued updates </li></ul><ul><li>c) Snapshot d) Transactional with subscribers that immediately update </li></ul><ul><li>e) Snapshot with subscribers that periodically update </li></ul>
  12. 12. <ul><li>Please provide a detailed backup and maintenance plan for the library database. Please either include screenshots of the setup or the scripts needed to create the objects (Jobs, Schedules, etc.) </li></ul><ul><li>  </li></ul><ul><li>The Library experiences medium-heavy traffic on weekends and evenings and fairly light traffic during the days and mornings. The library closes at 9pm each week night and opens at 7am each morning Sunday-Saturday. Over the weekends it closes at 5pm. </li></ul><ul><li>  </li></ul><ul><li>A fairly general recommendation is fine, but please address any potential concerns in the plan (be creative). A TWO paragraph minimum of rationale for your backup/maintenance plan is required if you are going the screenshot route; otherwise, you will probably need a page or two to outline each option. </li></ul><ul><li>  </li></ul><ul><li>Use of Wizards is perfectly acceptable for this exercise. </li></ul><ul><li>  </li></ul><ul><li>I used the Maintenance Plan wizard to create a backup and maintenance plan for the Library database. A FULL backup should be done on a weekly basis, Sunday night at 11:30pm. A Differential backup will be done at 1am every night of the week except Sunday. A Transaction Log backup will be done on an hourly basis from 8am until 8pm daily. By using this backup schedule, all data will be preserved in the event of a failure. Due to the light workload during the daytime hours, an hourly backup of the transaction log should be sufficient. I also set up a job to check data integrity nightly at 10pm, followed by a reorganization of the indexes. This way the data is all set for backup. </li></ul><ul><li>  </li></ul><ul><li>Due to the way scheduling works in the wizard, the transaction log backup job could not be altered for the weekend hours. This will create a problem in the event of a failure, as you would be restoring empty backups of the transaction log. This needs to be adjusted so that the transaction log backup stops at 5pm on the weekend. I also included a cleanup maintenance job to remove backups that are older than 4 weeks. This data is not needed for anything, and this way the disks are wiped clean and ready to reuse for backups. </li></ul>
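The schedule described above maps onto these backup statements (a sketch; the jobs themselves would be created by the wizard, and the backup paths are assumptions):

```sql
-- Weekly FULL backup (Sunday 11:30pm)
BACKUP DATABASE Library TO DISK = N'E:\Backups\Library_Full.bak' WITH INIT;

-- Nightly DIFFERENTIAL backup (Monday-Saturday 1am)
BACKUP DATABASE Library TO DISK = N'E:\Backups\Library_Diff.bak' WITH DIFFERENTIAL;

-- Hourly transaction log backup (8am-8pm)
BACKUP LOG Library TO DISK = N'E:\Backups\Library_Log.trn';
```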
  13. 13. <ul><li>The SetFocus Library Human Resource application uses stored procedures for all access to tables in the human resource database. However, in addition to permissions to execute the procedures, end users need permissions on base tables because the owners of the procedures are often different from the owners of the base tables. In addition, some personnel use Excel to create ad hoc reports from the base tables. You notice that end users frequently change the data in base tables directly, in an uncontrolled and risky manner, instead of using the stored procedures and the application. </li></ul><ul><li>1. How can you force the end users to access the tables through the programmable objects? </li></ul><ul><ul><li>Remove their rights to access the base tables, forcing them to use the stored procedures. The stored procedures have the rights to update the tables, and the users have the rights to execute the procedures. </li></ul></ul><ul><li>2. If you revoke the end users' permissions on base tables, how can you enable the users to still be able to generate ad hoc reports in Excel? </li></ul><ul><ul><li>Create snapshots of the tables and give the users read permissions on the snapshots. They can run their ad hoc reports against the snapshots instead of the actual tables. Or you can create views and grant the users read permissions on them for their ad hoc reports. </li></ul></ul><ul><li>3. How can you force the end users to access the tables through the programmable objects? </li></ul><ul><li>You can force end users to access tables through the stored procedures by eliminating the broken-ownership-chain problem in the human resources database. You can do this by changing the owner of the objects to a single owner or by altering the procedures to use a different execution context and, for example, impersonate a single fictitious user who has permissions to access the base tables. Then, you can revoke all permissions on base tables from end users. 
</li></ul><ul><li>  </li></ul><ul><li>4. If you revoke the end users' permissions on base tables, how can you enable the users to still be able to generate ad hoc reports in Excel? </li></ul><ul><li>  If you revoke all permissions on base tables from end users, they will no longer be able to create ad hoc reports. You can create views that have the same owner as the base tables and then grant SELECT permission on them to your end users. However, when you create a view, you cannot specify a different execution context. Therefore, you can use views only if there is a single owner of all base tables; otherwise, you would encounter the same broken ownership chain problem as soon as your view joins data from two base tables with different owners. In such a case, you could use stored procedures and multi-statement table-valued functions instead of views as the intermediate data-access layer. </li></ul>
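Changing the execution context as described can be sketched like this; the procedure, user, role, and table names are hypothetical:

```sql
-- The procedure runs as HRProxyUser, who has rights on the base table,
-- so end users need only EXECUTE permission and no base-table rights.
CREATE PROCEDURE dbo.UpdateEmployeePhone
    @EmployeeID int,
    @Phone nvarchar(25)
WITH EXECUTE AS 'HRProxyUser'
AS
    UPDATE dbo.Employee
    SET Phone = @Phone
    WHERE EmployeeID = @EmployeeID;
GO
GRANT EXECUTE ON dbo.UpdateEmployeePhone TO HRUsers;
```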
  14. 14. <ul><li>After reviewing the Library Database and considering the importance of various data contained within it, please develop appropriate DDL and DML trigger(s) to prevent unwanted changes to important data in the library. </li></ul><ul><li>Trigger 1 </li></ul><ul><li>CREATE TRIGGER table_safety </li></ul><ul><li>ON DATABASE </li></ul><ul><li>FOR DROP_TABLE, ALTER_TABLE </li></ul><ul><li>AS </li></ul><ul><li>PRINT 'You must disable trigger &quot;table_safety&quot; to drop or alter tables!' </li></ul><ul><li>ROLLBACK; </li></ul><ul><li>  </li></ul><ul><li>Trigger 2 </li></ul><ul><li>CREATE TRIGGER proc_safety </li></ul><ul><li>ON DATABASE </li></ul><ul><li>FOR DROP_PROCEDURE, ALTER_PROCEDURE </li></ul><ul><li>AS </li></ul><ul><li>PRINT 'You must disable trigger &quot;proc_safety&quot; to drop or alter procedures!' </li></ul><ul><li>ROLLBACK; </li></ul><ul><li>USE [library] </li></ul><ul><li>GO </li></ul><ul><li>SET ANSI_NULLS ON </li></ul><ul><li>GO </li></ul><ul><li>SET QUOTED_IDENTIFIER ON </li></ul><ul><li>GO </li></ul><ul><li>  </li></ul>
  15. 15. <ul><li>Trigger 3 </li></ul><ul><li>CREATE TRIGGER [dbo].[dmldeletemember] </li></ul><ul><li>ON [dbo].[member] </li></ul><ul><li>INSTEAD OF DELETE </li></ul><ul><li>AS RAISERROR ('You cannot delete a member from the database', 16, 10) </li></ul><ul><li>  </li></ul><ul><li>Trigger 4  </li></ul><ul><li>CREATE TRIGGER [dbo].[reminder1] </li></ul><ul><li>ON [dbo].[member] </li></ul><ul><li>AFTER INSERT </li></ul><ul><li>AS RAISERROR ('Be sure to give the member a card', 16, 10) </li></ul>
  16. 16. <ul><li>There is not a need for many DML triggers in this database, as rows can't be deleted from tables with foreign key constraints, and most of these tables have foreign key constraints on them. Are there performance considerations that you should take into account when implementing these triggers? Please give a thorough description of any potential drawbacks of implementing these triggers. </li></ul><ul><li>With triggers, as long as you stay away from using cursors and nested triggers, the overhead is relatively low. If you nest triggers within triggers, this will increase the overhead and could impact system performance depending on how deep the nesting goes. Using cursors is never a good idea; there is always significant overhead with them, so they should be used only when absolutely necessary. </li></ul><ul><li>  </li></ul><ul><li>Your supervisor is also interested in hearing what additional security features you might suggest with regards to auditing in general. From your experience with the library (or libraries in general), please give your supervisor a brief description of the capabilities of SQL Server 2005 and what types of auditing you might suggest (2 sentence minimum). </li></ul><ul><li>An audit trail could be set up to record changes to the books table: when books are checked in or out, or when new ones arrive. This should be set up on a different server if at all possible so that the overhead generated from it won't impact the current server. This could be set up as a mirror on another server, and any of the table data changes could be tracked. </li></ul>
  17. 17. <ul><li>Please develop a log table to store DDL event data in the library, and a trigger that inserts such data after it occurs. </li></ul><ul><li>CREATE TRIGGER [ddlDatabaseTriggerLog] </li></ul><ul><li>ON DATABASE FOR DDL_DATABASE_LEVEL_EVENTS </li></ul><ul><li>AS </li></ul><ul><li>BEGIN </li></ul><ul><li>SET NOCOUNT ON;  </li></ul><ul><li>DECLARE @data XML; DECLARE @schema sysname; DECLARE @object sysname; </li></ul><ul><li>DECLARE @eventType sysname; </li></ul><ul><li>  </li></ul><ul><li>SET @data = EVENTDATA(); </li></ul><ul><li>SET @eventType = @data.value('(/EVENT_INSTANCE/EventType)[1]', 'sysname'); </li></ul><ul><li>SET @schema = @data.value('(/EVENT_INSTANCE/SchemaName)[1]', 'sysname'); </li></ul><ul><li>SET @object = @data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'sysname') </li></ul><ul><li>  </li></ul>
  18. 18. <ul><li>IF @object IS NOT NULL </li></ul><ul><li>PRINT ' ' + @eventType + ' - ' + @schema + '.' + @object; </li></ul><ul><li>ELSE </li></ul><ul><li>PRINT ' ' + @eventType + ' - ' + @schema; </li></ul><ul><li>  </li></ul><ul><li>IF @eventType IS NULL </li></ul><ul><li>PRINT CONVERT(nvarchar(max), @data); </li></ul><ul><li>  </li></ul><ul><li>INSERT [dbo].[DatabaseLog] </li></ul><ul><li>( [PostTime], [DatabaseUser], [Event], </li></ul><ul><li>[Schema], [Object], [TSQL], [XmlEvent] ) </li></ul><ul><li>VALUES </li></ul><ul><li>( GETDATE(), CONVERT(sysname, CURRENT_USER), @eventType, </li></ul><ul><li>CONVERT(sysname, @schema), CONVERT(sysname, @object), </li></ul><ul><li>@data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'nvarchar(max)'), @data ); </li></ul><ul><li>END; </li></ul><ul><li>  </li></ul><ul><li>  </li></ul>
  19. 19. <ul><li>GO </li></ul><ul><li>SET ANSI_NULLS OFF </li></ul><ul><li>GO </li></ul><ul><li>SET QUOTED_IDENTIFIER OFF </li></ul><ul><li>GO </li></ul><ul><li>DISABLE TRIGGER [ddlDatabaseTriggerLog] ON DATABASE </li></ul><ul><li>GO </li></ul><ul><li>EXEC sys.sp_addextendedproperty @name=N'MS_Description', @value=N'Database trigger to audit all of the DDL changes made to the library database.' , @level0type=N'TRIGGER',@level0name=N'ddlDatabaseTriggerLog' </li></ul><ul><li>  </li></ul><ul><li>  </li></ul>
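The trigger above inserts into dbo.DatabaseLog, so the exercise also needs the log table itself. A sketch, with column types assumed to match the INSERT statement:

```sql
CREATE TABLE [dbo].[DatabaseLog] (
    [DatabaseLogID] int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    [PostTime]      datetime NOT NULL,
    [DatabaseUser]  sysname NOT NULL,
    [Event]         sysname NOT NULL,
    [Schema]        sysname NULL,      -- NULL for events without a schema
    [Object]        sysname NULL,      -- NULL for events without an object name
    [TSQL]          nvarchar(max) NOT NULL,
    [XmlEvent]      xml NOT NULL
);
```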