Transcript

  • 1. Developing Applications for Microsoft® SharePoint® 2010: Design Patterns for Decomposition, Coordination, and Scalable Sharing. Feedback: http://spg.codeplex.com
  • 2. Introduction

Microsoft SharePoint® 2010 includes many new areas of functionality that extend the capabilities of the platform and provide exciting new opportunities for developers. The Developing Applications for SharePoint 2010 release provides technical insight and best practice guidance for architects and developers who want to design and develop applications for SharePoint 2010. This introductory section includes the following topics:

- Overview. This topic provides a brief description of the Developing Applications for SharePoint 2010 release. It identifies the scope of the guidance and describes the different types of components included in the release.
- Intended Audience. This topic describes the prerequisites for understanding the contents of this release, and identifies the people who will benefit most from the guidance within.
- Getting Started. This topic describes how to get started with the various components that comprise the Developing Applications for SharePoint 2010 release.
- Copyright and Terms of Use. This topic explains the terms under which you may use the components in the Developing Applications for SharePoint 2010 release.
  • 3. Overview

SharePoint 2010 introduces new ways of developing applications for the SharePoint platform. With SharePoint 2010, you can build multi-tenant, hosted applications on an infrastructure that is scalable, secure, and stable, while taking advantage of modern browser capabilities for an improved user experience. SharePoint 2010 also introduces improved tools and APIs that address many of the application development and configuration challenges of SharePoint 2007. The new features, operational models, and tooling introduce new design possibilities for developers and architects. This guidance will help you understand the key decisions you face and learn how best to take advantage of the new capabilities that SharePoint 2010 provides.

The Developing Applications for SharePoint 2010 release includes three different types of resources:

- Guidance documentation that provides deep technical insight into different aspects of application development with SharePoint 2010.
- Reusable components that can help you implement best practice design patterns in your own applications.
- Reference implementations that illustrate how best to work with particular areas of SharePoint functionality.

The guide itself is divided into four broad areas of design and development for SharePoint 2010 applications: execution models, data models, client-side development, and application foundations. Each represents a key area of architectural decision making for SharePoint developers, as shown in the following illustration.

Illustration: Key architectural decision drivers in SharePoint 2010

Execution Models provides insight into how different types of SharePoint solutions are managed and executed. It describes how you can develop effective applications in different operating environments and under a variety of constraints. In particular, it provides deep technical insight into the new sandboxed solution model, and it explains the different ways in which you can extend the sandbox environment with various types of full-trust functionality.

Illustration: Execution model decision points
  • 4. Data Models addresses the challenges involved in consuming and manipulating data in SharePoint applications. SharePoint 2010 includes a great deal of new functionality in the data area, particularly with the introduction of external content types, external lists, and the ability to build relationships and constraints between SharePoint lists. This section of the documentation provides insights that can help you choose between standard SharePoint lists and external data sources as a platform for your SharePoint applications, and it offers approaches and patterns that you can use to mitigate the performance degradation associated with large lists. It also provides detailed insights into data access techniques, including the new LINQ to SharePoint capability.

Illustration: Data model decision points
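To give a flavor of the LINQ to SharePoint capability mentioned above, here is a minimal sketch that queries a task list with strongly typed entities. The data context and entity names (TeamSiteDataContext, Tasks, DueDate) are assumptions standing in for classes you would generate from your own site with the SPMetal tool, and the site URL is a placeholder; this is an illustrative sketch, not code from the guidance itself.

    using System;
    using System.Linq;
    using Microsoft.SharePoint.Linq;

    // Hypothetical: TeamSiteDataContext and its Tasks entity list are assumed to have
    // been generated from the target site with the SPMetal command-line tool.
    public class TaskReport
    {
        public static void PrintOverdueTasks()
        {
            using (TeamSiteDataContext context = new TeamSiteDataContext("http://intranet/sites/team"))
            {
                // LINQ to SharePoint translates this query into CAML and runs it against the list.
                var overdue = from task in context.Tasks
                              where task.DueDate < DateTime.Now
                              orderby task.DueDate
                              select new { task.Title, task.DueDate };

                foreach (var task in overdue)
                {
                    Console.WriteLine("{0} was due on {1:d}", task.Title, task.DueDate);
                }
            }
        }
    }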
  • 5. Client Application Models shows how you can make effective use of the new client-side development features in SharePoint 2010. These features include several new mechanisms for data access, such as client-side APIs for JavaScript, Silverlight, and managed clients, as well as a Representational State Transfer (REST) interface. The SharePoint 2010 platform also provides more out-of-the-box support for rich Internet application (RIA) technologies such as Ajax and Silverlight®.

Illustration: Client-side development decision points

Application Foundations shows how best to meet common development challenges in SharePoint applications, such as providing effective isolation of classes and services, managing configuration settings, logging events and trace information, and performing unit testing and integration testing. Addressing these challenges enables you to build flexible, robust, and modular solutions that are easy to maintain as your applications evolve. The concepts described in this section of the documentation should underpin everything you do in the areas of execution models, data models, and client-side development.

The scope of technical material that falls under the umbrella of SharePoint development grows increasingly broad with every release, and so several decisions were needed to constrain the scope of this guidance. In the client section, the guidance concentrates on new opportunities for building RIA user interfaces with Silverlight and Ajax. Office client development also includes many new areas of functionality, but this area is worthy of its own book and is too broad a topic to include here. In the data section, the guidance concentrates on lists and libraries, external data, and data access. While site structure is also an important component of an effective SharePoint deployment, in many ways it is more of an operational issue than a development issue, and as such it is not included in the guidance. The new service application model in SharePoint is powerful, but as most organizations will not need to build their own service applications, the guidance does not address this area.

Note: The Developing Applications for SharePoint 2010 release has an associated community site on CodePlex. You can use this site to post questions, participate in discussions, provide feedback, and download interim releases.
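As a simple illustration of the managed client API mentioned above, the following sketch reads list items from a site in a console application using the SharePoint 2010 client object model. The site URL and list title are placeholders; this is an illustrative example rather than code from the reference implementations.

    using System;
    using Microsoft.SharePoint.Client;

    // Hypothetical console client: the site URL and list title are placeholders.
    public class AnnouncementReader
    {
        public static void PrintAnnouncementTitles()
        {
            using (ClientContext context = new ClientContext("http://intranet/sites/team"))
            {
                List announcements = context.Web.Lists.GetByTitle("Announcements");
                ListItemCollection items = announcements.GetItems(CamlQuery.CreateAllItemsQuery());

                context.Load(items);      // declare what to retrieve
                context.ExecuteQuery();   // single round trip to the server

                foreach (ListItem item in items)
                {
                    Console.WriteLine(item["Title"]);
                }
            }
        }
    }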
  • 6. Intended Audience

The Developing Applications for SharePoint 2010 release is intended primarily for software architects and experienced developers with some prior experience with SharePoint products and technologies. It offers technical insights, guidance, and design patterns for developers who know the basics and want to extend their skills to the design of robust, enterprise-scale applications. Although the release is not intended to provide an introduction to SharePoint development, experienced developers will benefit from many areas of the guidance even if they are new to SharePoint.

To get the greatest benefit from this guidance, you should have some experience with or an understanding of the following technologies:

- Microsoft® Office SharePoint Server 2007 or Windows® SharePoint Services 3.0
- Microsoft .NET Framework 3.5
- Microsoft Visual C#®
- Microsoft ASP.NET 3.5

The release does not assume that you are already familiar with all the new functionality in SharePoint 2010. Links to the product documentation are provided where appropriate.
  • 7. Getting Started

As described previously, the Developing Applications for SharePoint 2010 release includes three different types of resources: guidance documentation, reusable class libraries, and reference implementations. These resources span the key areas covered by this release, as summarized below.

- Guidance documentation (key areas: Application Foundations, Execution Models, Data Models, Client-Side Development). The guidance documentation associated with this release includes detailed technical insights and design patterns for developers in key functional areas. It also includes accompanying documentation for each of the reference implementations, together with how-to topics that can help you meet specific technical challenges. For more information on the guidance documentation, see Documentation Overview.
- Reusable class libraries, the SharePoint Guidance Library (key area: Application Foundations). The SharePoint Guidance Library is a downloadable collection of class libraries that you can compile and deploy with your own SharePoint solutions. It includes three components (the SharePoint Service Locator, the Configuration Manager, and the SharePoint Logger) that are designed to help you address common development challenges in SharePoint applications. For information on how to download and compile the SharePoint Guidance Library, see The SharePoint Guidance Library.
- Reference implementations (key areas: Execution Models, Data Models, Client Models). The Developing Applications for SharePoint 2010 release includes eight reference implementations that illustrate how best to work with different areas of functionality in SharePoint 2010. Each reference implementation is a fully functional SharePoint solution, consisting of source code, supporting resources, and an installation script, that you can deploy to a test environment and explore at your leisure. The guidance documentation includes a walkthrough of each reference implementation that explains each aspect of the design and implementation. For more information on how to use the reference implementations, see Reference Implementations.

The best way to get started with the Developing Applications for SharePoint 2010 release is to allow the documentation to guide you through the other resources. For example, the documentation explains how and why you should use the components in the SharePoint Guidance Library, and the reference implementations reinforce the execution and data concepts introduced in the documentation.
  • 8. Documentation Overview

The guidance documentation for the Developing Applications for SharePoint 2010 release is organized into four chapters. These chapters map to the four key areas of SharePoint 2010 design and development that are targeted by this release.

- Application Foundations for SharePoint 2010 provides guidance on how to build your SharePoint applications on solid foundations. In particular, it explains how to address the challenges of testability, flexibility, configuration, logging and exception handling, and maintainability. This chapter introduces the reusable components in the SharePoint Guidance Library and provides in-depth guidance on how to use them in your own applications.
- Execution Models in SharePoint 2010 provides guidance on how to meet the challenges of different execution environments. It provides deep technical insights into the mechanics of the full-trust execution environment, the sandbox execution environment, and various hybrid approaches to the execution of logic on the SharePoint 2010 platform. The chapter includes accompanying documentation for each of the reference implementations in the execution area. It also includes several how-to topics on how to meet various execution-related challenges in SharePoint 2010 development.
- Data Models in SharePoint 2010 provides guidance on how to meet common challenges in working with data on SharePoint 2010. It explains key design decision points that can help you to choose between standard SharePoint lists and external lists, and it describes techniques that you can use to mitigate performance degradation when you work with large lists. It also introduces application patterns for data aggregation with SharePoint lists. The chapter includes accompanying documentation for each of the reference implementations in the data area.
- Client Application Models in SharePoint 2010 provides guidance on how best to use the new client-side development features in SharePoint 2010, such as the client data access object model, the REST-based service architecture, and the support for RIA technologies such as Silverlight and Ajax.

In addition to these Web pages, the guidance documentation is included as a CHM file in the download associated with this release.
  • 9. The SharePoint Guidance Library

The SharePoint Guidance Library is a collection of reusable code-based utilities that address common challenges in application development for the SharePoint platform. You can use these components in your own SharePoint applications to help you improve your development speed and follow best practice guidance. The SharePoint Guidance Library consists of three key components:

- The SharePoint Service Locator provides a simple implementation of the service location pattern for SharePoint applications. This enables you to isolate your code from dependencies on external types, which makes your code more modular, easier to test, and easier to maintain.
- The Application Setting Manager provides a robust and consistent mechanism for storing and retrieving configuration settings at each level of the SharePoint hierarchy, from individual sites (SPWeb) to the entire server farm (SPFarm).
- The SharePoint Logger provides easy-to-use utility methods that you can employ to write information to the Windows event log and the SharePoint Unified Logging Service (ULS) trace log. It also enables you to create custom diagnostic areas and categories for logging.

To get started with the SharePoint Guidance Library, we recommend that you read Application Foundations for SharePoint 2010. It puts each component in the SharePoint Guidance Library into context and provides detailed information on using each component.

Deploying the SharePoint Guidance Library

The SharePoint Guidance Library is distributed as source code. To build and install the library, you must open and build the Microsoft.Practices.SharePoint solution in Microsoft Visual Studio® 2010. This creates the Microsoft.Practices.SharePoint.Common assembly, which includes all three library components.

To use the SharePoint Service Locator, you must also deploy the Microsoft.Practices.ServiceLocation assembly to the global assembly cache. This assembly defines the common service locator interfaces on which the SharePoint Service Locator is developed. The Microsoft.Practices.ServiceLocation assembly is included in this release as a compiled assembly. The end user license agreement for the Developing Applications for SharePoint 2010 release permits you to include and distribute these assemblies within your SharePoint solutions.

System Requirements

To build the SharePoint Guidance Library, your development environment must include Visual Studio 2010 Professional Edition or higher and any version of SharePoint 2010. The library is compatible with all versions of SharePoint 2010, including SharePoint Foundation 2010.

Using the SharePoint Guidance Library in Sandboxed Solutions

SharePoint 2010 introduces a new restricted execution environment, known as the sandbox, that allows you to run partially trusted solutions in a strictly controlled environment within the scope of an individual site collection. The sandbox environment limits the APIs that can be used by code contained in a sandboxed solution, and restricts a sandboxed solution to the resources of the site collection where it is deployed. Wherever possible, the SharePoint Guidance Library components include enhancements that enable them to run within the sandbox environment. Certain capabilities, such as logging and tracing, are not possible within sandboxed solutions because the security restrictions of the environment do not permit these activities. For these scenarios, the SharePoint Guidance Library provides full-trust proxies, which must be installed by a farm administrator, to deliver the capability. The SharePoint Guidance Library also provides several extensibility points where the existing logic can be replaced by an approach that can be used within the sandbox, for example by logging events to a list within the site collection rather than through the SharePoint APIs. You will find more details on the capabilities and limitations of the SharePoint Guidance Library within the sandbox environment in the chapter Application Foundations for SharePoint 2010.
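As a rough illustration of how the library components are typically used together, the sketch below resolves the SharePoint Logger through the SharePoint Service Locator and writes a message to the event and trace logs. The type and method names shown (SharePointServiceLocator, IServiceLocator, ILogger, LogToOperations, TraceToDeveloper) are based on the SharePoint Guidance Library API as described here; confirm them against the Application Foundations chapter, and treat the class name and message text as placeholders.

    using Microsoft.Practices.ServiceLocation;
    using Microsoft.Practices.SharePoint.Common.ServiceLocation;
    using Microsoft.Practices.SharePoint.Common.Logging;

    public class ProvisioningHelper
    {
        public void LogProvisioningResult()
        {
            // Resolve the logger through the SharePoint Service Locator rather than
            // constructing a concrete type, so the dependency can be replaced in tests.
            IServiceLocator serviceLocator = SharePointServiceLocator.GetCurrent();
            ILogger logger = serviceLocator.GetInstance<ILogger>();

            // LogToOperations writes to the Windows event log and the ULS trace log;
            // TraceToDeveloper records diagnostic detail for developers.
            logger.LogToOperations("Provisioning completed for the Contoso solution.");
            logger.TraceToDeveloper("Feature activation finished without errors.");
        }
    }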
  • 10. Reference Implementations

The Developing Applications for SharePoint 2010 release includes several reference implementations that illustrate different execution models and data models in SharePoint 2010 solutions. Each reference implementation contains a complete working solution, including source code, template files, and other resources, together with an installation script. Deploying a reference implementation to your SharePoint 2010 test environment enables you to explore and debug best practice implementations at your leisure.

The following list provides an overview of the reference implementations in the Developing Applications for SharePoint 2010 release, together with the key points illustrated by each implementation.

- Sandboxed Solution (list aggregation within a site collection). Key points: effective use of feature partitioning for reliable solution deployment; using query objects to retrieve data from multiple lists; using the Model-View-Presenter pattern in a Web Part to isolate business logic from the presentation layer and the underlying data source; using constructor injection patterns to isolate business logic from dependencies; using an exception shielding pattern to stop unhandled Web Part errors from preventing the host page from loading.
- Sandboxed Solution with Full Trust Proxy (list aggregation with supplemental data from a CRM system). Key points: creating, deploying, and registering a full-trust proxy; consuming a full-trust proxy from sandboxed code; deploying pages and client-side scripts to the sandbox environment; launching pages as modal dialogs from the sandbox environment.
- Sandboxed Solution with External List (interaction with external data from sandboxed Web Parts). Key points: creating and consuming external lists from a sandboxed solution; creating and managing external content types in the Business Data Connectivity (BDC) service application; configuring the Secure Store Service (SSS) to enable sandboxed code to impersonate credentials for external data sources.
- Sandboxed Solution with Custom Workflow Actions (deployment of declarative workflows with custom full-trust activities and custom sandboxed actions). Key points: creating and deploying full-trust workflow activities; creating and deploying sandboxed workflow actions; consuming custom activities and sandboxed actions from a declarative workflow.
- Farm Solution (aggregation of data from multiple site collections using a timer job). Key points: creating, deploying, and registering a timer job; managing configuration data for timer jobs; deploying custom application pages to the Central Administration Web site.
- SharePoint List Data Models (modeling application data with standard SharePoint lists). Key points: modeling data with lists, including many-to-many relationships; using Lookup fields and list relationships; using LINQ to SharePoint for list access.
- External Data Models (modeling application data with an external data source and consuming the data in SharePoint). Key points: connecting to a database with Business Connectivity Services (BCS); modeling many-to-many relationships with BCS; using the BCS API to access external data; using the Business Data Web Parts to access external data; representing associations between entities using stored procedures; integrating with external data sources using a .NET Connectivity Assembly.
- Client Application Models (using rich Internet applications (RIA) with SharePoint). Key points: using the Client-Side Object Model (CSOM) from Silverlight and JavaScript; using REST services from Silverlight and JavaScript; using the Model-View-ViewModel pattern with Silverlight; accessing non-SharePoint services using Silverlight; accessing binary data with Silverlight; accessing SharePoint Web services from Silverlight.

  • 11. System Requirements

To deploy the reference implementations, you need a functional SharePoint 2010 test deployment with Visual Studio 2010 Professional Edition or higher. The supported editions of SharePoint 2010 vary according to the features used by each individual reference implementation. The following list shows the edition of SharePoint 2010 required for each reference implementation.

- Sandboxed Solution: SharePoint Foundation 2010
- Sandboxed Solution with Full Trust Proxy: SharePoint Foundation 2010
- Sandboxed Solution with External List: SharePoint Server 2010*
- Sandboxed Solution with Custom Workflow Actions: SharePoint Server 2010**
- Farm Solution: Timer Job: SharePoint Foundation 2010
- SharePoint List Data Models: SharePoint Foundation 2010
- External List Data Models: SharePoint Server 2010***
- Client: SharePoint Foundation 2010

* Uses the Secure Store Service, which is only available in SharePoint Server 2010.
** Workflows must be deployed to the same SharePoint edition used to create them. This workflow is built using SharePoint Server 2010. You can build the workflow for SharePoint Foundation 2010.
*** Uses Business Data Web Parts, which are only available in SharePoint Server 2010.
  • 12. Copyright and Terms of Use

This document is provided "as-is". Information and views expressed in this document, including URL and other Internet Web site references, may change without notice. You bear the risk of using it.

Some examples depicted herein are provided for illustration only and are fictitious. No real association or connection is intended or should be inferred.

This document does not provide you with any legal rights to any intellectual property in any Microsoft product. You may copy and use this document for your internal, reference purposes. You may modify this document for your internal, reference purposes.

Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Microsoft, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.

© 2010 Microsoft Corporation. All rights reserved.

Microsoft, Active Directory, IntelliSense, MSDN, SharePoint, Silverlight, TechNet, Visual C#, Visual Basic, Visual Studio, Windows, and Windows Server are trademarks of the Microsoft group of companies. All other trademarks are property of their respective owners.
  • 13. Execution Models in SharePoint 2010

SharePoint solution development encompasses many different types of applications. These applications are deployed to different locations, are loaded by different processes, and are subject to different execution conditions and constraints. The topics in this section are designed to give you an understanding of your execution options when you develop solutions for SharePoint 2010.

When you plan how you want your code to execute, the factors driving your decisions fall into two broad areas:

- Execution environment. This is the security and processing boundary that contains your running code. If you have worked with earlier versions of SharePoint, you are familiar with the full-trust execution model and the bin/code access security (commonly known as bin/CAS) execution model. SharePoint 2010 provides new options in this area with the introduction of the restricted, site collection-scoped sandbox execution model.
- Execution logic. This is the means by which your code actually gets invoked. For example, execution logic defines whether your code runs synchronously or asynchronously, whether it is invoked by a timer job or a workflow or a Web page, and whether it impersonates the current user or runs using the identity of the process account.

The following illustration shows some of the options available to you in each of these areas.

Illustration: Execution considerations for SharePoint 2010 applications

Execution environment and execution logic are heavily interrelated. Some of the choices you make under execution logic prescribe the use of a particular execution environment model. For example, if you develop a timer job, you must deploy it as a full-trust application. Likewise, if you want to run your code with elevated permissions, you cannot use a sandboxed application. Some execution logic patterns can also be used to bridge the divide between different execution environments, as described later in this section.

This guidance is largely structured around the capabilities and constraints of each execution environment model. However, in each topic, the guidance is informed by the options and constraints imposed by the various different approaches to execution logic in SharePoint applications.
  • 14. This section includes the following topics that will help you to understand execution in SharePoint 2010:

- Understanding SharePoint Execution Models. This section introduces the different execution and deployment models that are available to you when you develop solutions for SharePoint 2010. It provides an overview of each execution model, explains the benefits and limitations of each approach, and describes when it may be appropriate to use a particular model.
- Farm Solutions. This topic provides a detailed insight into how farm solution code is executed by the SharePoint environment. It describes what you can do with farm solutions, and it identifies the core issues that you should consider when you write farm solution code.
- Sandboxed Solutions. This topic provides a similar insight into the sandbox execution environment. In addition, it aims to give you a detailed understanding of how sandboxed solutions are monitored and managed by the SharePoint environment.
- Hybrid Approaches. This topic provides a detailed review of execution models that enable you to combine sandboxed solutions with full-trust functionality. It explains how each of these execution models works, and it identifies issues specific to the deployment and execution of hybrid solutions.

This documentation uses the term "execution model" to describe the different approaches that are available to you in the execution environment area.
  • 15. Understanding SharePoint Execution Models

In earlier versions of SharePoint, there were limited options for deploying custom solutions to a SharePoint environment. You would deploy assemblies either to the global assembly cache or to the Web application's bin folder within the Internet Information Services (IIS) file structure. You would deploy other resources, such as images, configuration files, user controls, and SharePoint features, to the SharePoint file structure (commonly referred to as the "SharePoint root") on each server. In order to manage the installation, deployment, and retraction of these assemblies and resources over multiple servers, you would use a SharePoint solution package (WSP). The solution package would have to be placed on a file system available to a SharePoint server in the farm, installed using the stsadm command-line tool, and then deployed to one or more Web applications from either the command line or the SharePoint Central Administration Web site.

This approach works well, as long as you meet the following criteria:

- You have server-side access to the SharePoint farm.
- You are a member of the Farm Administrators group.
- You have the confidence of the IT team.

This is increasingly unlikely to be the case. Many large companies provide a single, centrally managed SharePoint platform and simply provision site collections for disparate divisions, departments, and teams as required. Many smaller companies look to hosting companies to provide a SharePoint environment, which is also typically provided on a per-site collection basis. In both cases, developers who are looking to provide custom solutions are unlikely to have the server-side access they need to deploy their solutions. Hosting companies in particular may be understandably reluctant to permit anyone to deploy code that may jeopardize the performance, stability, or security of the SharePoint farm and, therefore, their other tenants.

In response to the market need to allow developers to create code that can be run in shared environments, SharePoint 2010 supports an additional deployment and execution model: the sandboxed solution. This model allows users who do not have access to the server file system to deploy managed code applications into individual site collections. Sandboxed solutions are deployed using a SharePoint solution package to a specialized gallery (document library) in the root site of the site collection.

These applications run in an environment of reduced trust, the sandbox, and are executed within an isolated process that uses a low-trust account. When you develop solutions that target the sandbox execution model, you are restricted to using a subset of the SharePoint APIs, and your code must observe more stringent code access security policies for the rest of the .NET Framework base class libraries. These constraints offer additional safeguards to the IT team, because the inherently lower-trust environment reduces the risk of a security exploit by the sandboxed application. In return, the sandbox execution model offers developers the opportunity to customize and extend the functionality of their SharePoint sites in circumstances where the deployment of custom code would otherwise be prohibited, such as hosted solutions or large, regulated corporate deployments.

In order to balance this newfound freedom to deploy managed code without the involvement of the IT team, SharePoint 2010 includes various safeguards against inefficient or resource-intensive sandboxed applications. In addition to the restrictions on the APIs that are available to the developer, the sandboxed solution framework monitors the execution of sandboxed applications and can terminate code that runs for too long or consumes too many resources. This contributes to the overall stability of the system. Administrators may configure a points-based system to throttle the system resources that are made available to sandboxed applications.

This section provides an overview of each execution model, from the familiar full-trust approach to the new sandbox option. It identifies the benefits and drawbacks of each approach, and it examines when it is appropriate to use a particular model. The remainder of the chapter then provides a detailed technical insight into the workings of each execution model.

Note: This documentation focuses on server-side execution models. You can also interact with a SharePoint environment from client platforms such as Silverlight or Windows Presentation Foundation (WPF) through the new SharePoint client object model. For more information about the client object model, see Client Application Models.
  • 16. What Are the SharePoint Execution Models?

In terms of execution models, there are two principal types of solution in SharePoint 2010: farm solutions and sandboxed solutions. Within each type of solution, there are various execution models available to you. Farm solutions can include components that run in a full-trust environment or components that run under code access security policy restrictions. Sandboxed solutions can include components that run entirely within the sandbox environment as well as hybrid approaches that can include various full-trust components. This topic introduces these execution models and describes the key concepts behind each approach.

Farm Solutions

A farm solution is a collection of resources that you deploy through the server-side file system in your SharePoint environment. These resources execute within the same process space as the SharePoint application, which means that your code can use the full SharePoint object model and has access to all the same resources as SharePoint itself. When you deploy a farm solution, you can choose from two different execution models: the full-trust execution model and the bin folder/code access security (bin/CAS) execution model. These models will already be familiar to you if you have worked with Office SharePoint Server 2007 and Windows SharePoint Services 3.0.

The Full Trust Execution Model

When you use the full-trust execution model, you deploy your assemblies to the global assembly cache on each Web front-end server and application server in the server farm. The SharePoint Web application process loads the assembly from the global assembly cache and your code runs with full trust; in other words, it runs without any code access security restrictions.

Illustration: The full trust execution model

Because the assemblies are deployed to the global assembly cache, you can make your solution available to any Web application on the server farm. For more information about the full-trust execution model, see Farm Solutions.

The Bin/CAS Execution Model

The bin/CAS approach is a partial-trust execution model. When you use the bin/CAS execution model, you deploy your assemblies to the bin directory associated with a SharePoint Web application. The worker process associated with the SharePoint Web application loads the assembly from the bin directory. However, the operations your code may perform are restricted by the code access security policies that are applied in the Web.config file to assemblies in the bin directory.

Illustration: The bin/CAS execution model
  • 17. Because the assemblies are deployed to the bin folder of a specific Web application, your solution is, by definition, scoped to that Web application instead of to the farm as a whole.

In terms of deployment, the only differences between the full-trust execution model and the bin/CAS execution model are the location where you deploy your assemblies and the code access security policies associated with that location. In both cases, any non-compiled items, such as ASP.NET markup files, XML files, or resource files, are typically deployed to the SharePoint root on each Web front-end server. If you want to deploy a farm solution using either of the farm solution execution models, you must have access to the server file system and be a member of the Farm Administrators security group. For more information about the bin/CAS execution model, see Farm Solutions.

Sandboxed Solutions

Sandboxed solutions are new to SharePoint 2010. A sandboxed solution is a collection of resources that you deploy directly to a specialized gallery (library) in the root site of a site collection. This library is referred to as the Solutions Gallery. Just like a farm solution, you package a sandboxed solution as a SharePoint solution package (WSP). However, you can deploy a sandboxed solution without physical access to the server file system and without the involvement of the IT team by directly uploading the WSP through the Web user interface (UI). Instead, the site collection administrator determines who has permissions to add sandboxed solutions to his or her site collection.

To counterbalance this newfound freedom to deploy solutions without the explicit approval of the IT team, SharePoint includes several constraints that restrict what you can do with a sandboxed solution. The following are some examples:

- Your code has access to a limited, "safe" subset of the SharePoint object model.
- Your assemblies are loaded by an isolated process that uses a low-privilege identity.
- The solution framework terminates your code if it does not respond to requests within a specified duration.

The IT team allocates a resource quota to each site collection that defines the boundaries within which the sandboxed solution must operate. The solution framework shuts down all sandboxed solutions within a site collection if the site collection uses up its daily resource quota for sandboxed solutions. Within an individual site collection, administrators can review the resources consumed by individual sandboxed solutions from the site collection user interface.

There are two approaches to execution using the sandboxed solution environment. You can deploy a solution that runs entirely within the sandbox environment, which is referred to as the sandbox execution model. However, the sandbox environment also allows you to call out to full-trust components under certain conditions. For example, you can consume specially developed, fully trusted, global assembly cache–deployed classes from your sandboxed solutions via a full-trust proxy. These approaches are referred to as hybrid execution models.

Note: It is important to draw a distinction between components that you can deploy within a sandboxed solution and components that actually execute in the sandbox environment. For example, you can deploy a declarative workflow in a sandboxed solution. However, all workflow logic actually executes with full trust. Any calls to the SharePoint object model actually execute with full trust. These concepts are explained in greater detail in the topics that follow.
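To make these constraints more concrete, the following sketch shows the kind of Web Part that can run in the sandbox: it uses only the site-collection-scoped object model that is exposed to sandboxed code. The list name and message text are hypothetical, and this is an illustrative example rather than code from the reference implementations.

    using System.Web.UI;
    using System.Web.UI.WebControls.WebParts;
    using Microsoft.SharePoint;

    // A simple Web Part suitable for the sandbox: it stays within the current
    // site collection and uses only APIs available to sandboxed code.
    public class TaskCountWebPart : WebPart
    {
        protected override void CreateChildControls()
        {
            SPWeb web = SPContext.Current.Web;               // do not dispose objects obtained from SPContext
            SPList tasks = web.Lists.TryGetList("Tasks");    // hypothetical list name
            string message = (tasks == null)
                ? "No task list found in this site."
                : string.Format("This site contains {0} tasks.", tasks.ItemCount);

            Controls.Add(new LiteralControl(message));
        }
    }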
  • 18. The Sandbox Execution Model

When a SharePoint Web application process receives a request that invokes your sandboxed solution, the Web application process does not directly load your assembly. Instead, the Web application process loads an execution wrapper that loads your assembly into an isolated sandbox process.

Illustration: The sandbox execution model

When you use the sandbox execution model, your solution is limited in scope to the site collection in which it is deployed. In addition to the constraints outlined previously, the solution cannot access content or resources from other site collections. For more information about the sandbox execution model, see Sandboxed Solutions.

Hybrid Execution Models

There are times when the benefits of the sandbox approach are appealing to an organization, but the limitations of the sandbox environment prevent you from creating a complete solution. In these cases, a hybrid approach may offer an attractive solution. Sandboxed solutions can access full-trust components through various mechanisms. For example, sandboxed solutions can do the following:

- They can use a full-trust proxy to access logic that runs with full trust, such as calls to APIs that are not permitted in the sandbox or calls to external services.
- They can use a declarative workflow to access a code-based custom workflow activity.
- They can use an external list to access external data through Business Connectivity Services (BCS).

These full-trust components could be developed in parallel with the sandboxed functionality, or they might be developed and deployed by the IT team to make additional functionality available to sandboxed solution developers. For example, the SharePoint Guidance Library includes a full-trust proxy that you can use to enable sandbox developers to log events and trace information from their sandboxed solutions.

In the first hybrid approach described in this topic, you can execute global assembly cache–deployed, full-trust code from a sandboxed solution by using a full-trust proxy. The full-trust proxy is a controlled exit point that allows your sandboxed code to make a synchronous call out to logic that executes outside of the sandbox process.

Illustration: Hybrid execution using a full-trust proxy

It is important to understand that the full-trust proxy is implemented by the fully trusted component, instead of by the sandboxed solution. If sandboxed solution developers could use a proxy to run any global assembly cache–deployed code, this would subvert the restrictions placed on the sandbox environment. In order to provide services to sandboxed solutions, your fully trusted classes must inherit from the SPProxyOperation abstract class. After your full-trust proxies are deployed to the global assembly cache, they can be consumed from any sandboxed solution in the SharePoint farm.

Creating a full-trust proxy should be carefully considered and managed, because it increases the scope for sandboxed applications to cause security or performance issues. Generally speaking, you should aim to keep the functionality that you expose to sandboxed applications through a full-trust proxy to the minimum required.
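The following sketch outlines what such a proxy might look like and how sandboxed code could invoke it. The class derives from SPProxyOperation as described above; the argument class, assembly name, type name, and geocoding scenario are hypothetical, and the call from the sandbox is shown using the SPUtility.ExecuteRegisteredProxyOperation method, assuming the proxy has already been registered by a farm administrator.

    using System;
    using Microsoft.SharePoint.UserCode;
    using Microsoft.SharePoint.Utilities;

    // Arguments passed from the sandbox to the proxy must be serializable.
    [Serializable]
    public class GeocodeOperationArgs : SPProxyOperationArgs
    {
        public string Address { get; set; }
    }

    // The full-trust proxy itself: deployed to the global assembly cache and
    // registered with the user code service by a farm administrator.
    public class GeocodeOperation : SPProxyOperation
    {
        public override object Execute(SPProxyOperationArgs args)
        {
            GeocodeOperationArgs geocodeArgs = args as GeocodeOperationArgs;
            if (geocodeArgs == null)
            {
                return null;
            }

            // Full-trust logic runs here, for example a call to an external geocoding service.
            return "47.64,-122.13"; // placeholder result for geocodeArgs.Address
        }
    }

    // From sandboxed code, the registered proxy would be invoked like this:
    //
    // object result = SPUtility.ExecuteRegisteredProxyOperation(
    //     "Contoso.Proxies, Version=1.0.0.0, Culture=neutral, PublicKeyToken=1111222233334444",
    //     "Contoso.Proxies.GeocodeOperation",
    //     new GeocodeOperationArgs { Address = "1 Main Street, Redmond" });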
  • 19. In the second hybrid approach described in this topic, the full-trust component is a custom workflow activity that is deployed to the global assembly cache. You can consume the custom workflow activity in a declarative workflow from your sandboxed solution.

Illustration: Hybrid execution using a declarative workflow

Using this approach, the fully trusted logic in the custom workflow activity is invoked asynchronously when the sandbox process executes the declarative workflow.

In the final hybrid approach described in this topic, the full-trust component is an external content type defined in the BCS. The sandboxed solution includes an external list that connects to the external content type. As a result, the sandboxed solution can access data from other applications through the external list, even though the sandbox is prohibited from directly making external connections.

Illustration: Hybrid execution using an external list

Note: The external content type is a new SharePoint 2010 feature that enables you to define a connection to an external data source. External content types can also define a set of CRUD (Create, Retrieve, Update, and Delete) operations that allow you to manipulate that external data from your SharePoint environment. External lists connect to external content types and provide a SharePoint list wrapper around external data, so that you can access and manipulate that external data from the familiar format of a SharePoint list. For more information about external lists and external content types, see Business Connectivity Services Fundamentals on MSDN.

For more information about hybrid execution models, see Hybrid Approaches.
  • 20. Examples and Scenarios

Before you take a detailed look at the functionality and constraints of each execution model, it is worth taking some time to review some high-level examples that illustrate when each model might be appropriate. There are several factors that should influence your choice of execution model. In some cases, the decision may be made for you. If your code will be deployed to a shared or hosted environment, you may well be limited to the deployment of sandboxed solutions. If you are designing a third-party application, targeting the sandbox environment could make your product viable to a wider audience. In other cases, despite the inherent benefits to the overall stability, security, and performance of the farm as a whole, sandboxed solutions may not allow you to do everything you need to. The intended scope of your solution is another important factor: sandboxed solutions are constrained to a single site collection, bin/CAS solutions are restricted in scope to a single Web application, and full-trust solutions are available to the entire server farm.

Typical Scenarios for Farm Solutions

You can use the full-trust farm solution approach to deploy absolutely any combination of functionality and resources from the entire spectrum of SharePoint development. However, that does not mean that the full-trust model is always the best choice. When you deploy a solution that uses the full-trust execution model, you lose all the safeguards that are offered by the sandbox and hybrid approaches. Typically, you might choose to deploy a full-trust farm solution when the functionality you need is not available in the sandbox environment and the additional effort of building a hybrid solution is not justified. You might also consider full-trust solutions for high-volume, public-facing sites where the performance impact of using a sandboxed solution is unacceptable.

When you consider a bin/CAS deployment, remember that code access security policies can be difficult to get right and difficult to maintain. With the introduction of the sandbox environment, the combination of sandboxed solutions with full-trust components where necessary is preferred to the use of the bin/CAS approach in most cases. However, there are still some circumstances where you may want to consider using the bin/CAS execution model. For example, if you have a high-volume site and you want to take advantage of granular security for your application, the bin/CAS model might meet your requirements. If you invest a great deal of time in Web Part development, and the lack of support for Visual Web Parts within the sandbox causes an unacceptable decrease in productivity, the bin/CAS model could offer an alternative solution. The bin/CAS approach cannot be used to run feature receivers, coded workflow activities, timer jobs, or service applications. These components must be deployed to the global assembly cache.

Common scenarios for full-trust farm solutions include the following:

- Asynchronous timer jobs for large, regular batch operations. For example, you might want to aggregate data from lists and sites on different site collections. Alternatively, you might want to run a bulk import or export of external data on a daily or weekly basis.
- Fully coded workflows or activities. You can model many business processes by creating your own custom workflow activities and consuming these activities from declarative workflows. However, in some cases, only a fully coded workflow will provide the functionality you need, particularly if you require complex or parallel branching logic. For example, suppose you implement a process to create swipe cards for secure access. The workflow must connect to an external security system to create the user record and request card production. You might use a fully coded workflow to support a parallel approval process with the Human Resources department and the security team.

Typical Scenarios for Sandboxed Solutions

In many cases, the sandbox execution model may be the only option available to you. This is particularly likely if you want to deploy a solution to a highly regulated environment, such as a shared or hosted deployment, or if your organization cannot justify the management costs of an environment that allows full-trust solutions. However, there are also many scenarios in which the sandbox execution model might be your preferred approach, regardless of the options available to you. In addition to the farm-wide benefits of stability, security, performance, and monitoring, sandboxed solutions offer benefits to solution developers. For example, you can upload your sandboxed solutions through a Web interface and without the involvement of the IT team; this enables hassle-free deployments and faster development iterations. If the capabilities of the sandbox environment meet the requirements of your application, the sandbox execution model will often be an attractive choice.

Common scenarios for sandboxed solutions include the following:

- Data aggregation. For example, you might want to create a Web Part or a Silverlight control that shows a summary of all tasks assigned to the current user from across the site collection, or that aggregates sales data from individual team sites.
- Data capture. For example, suppose you are responsible for organizing and posting job vacancies at your organization. You might deploy a content type and an InfoPath form to collect and organize the information. You could also include a declarative workflow to manage the process through the received, approved, and posted phases.
  • 21.
- Document management. For example, suppose you need to create a document repository for resumes. You might create a solution package that includes a document template and a content type. You deploy the document template to a document library, and you include feature receiver classes to register the content type with the library.

Typical Scenarios for Hybrid Solutions

Hybrid approaches can offer an attractive choice when the sandbox execution model alone does not provide all the capabilities that you need. You can use a hybrid approach to minimize the amount of full-trust code in your solution, both to maximize the performance and stability benefits you gain from the sandbox environment and to limit the management and review costs associated with the deployment of full-trust code. Hybrid approaches also enable you to make additional functionality available to sandboxed solution developers across your organization. For example, you could develop and deploy a full-trust proxy that provides logging functionality. Other developers can use your full-trust proxy in sandboxed solutions, from any site collection, without exceeding the limitations of the sandbox environment.

Common scenarios for hybrid approaches include the following:

- Interaction with external services. For example, suppose you create a sandboxed solution that tracks help desk requests from external customers. Your solution might use a full-trust proxy to submit each customer's location details to a geo-coding service. The geo-coding service returns a latitude and longitude, which your sandboxed solution can use to calculate the nearest available engineer for each customer.
- Full-trust workflow activities. For example, suppose you want to extend the job postings data capture example from the sandbox scenarios. You might create and deploy a full-trust workflow activity that takes the data from a posting form and then uses a Web service to publish the information to an external job board Web site. You can consume this workflow activity from the declarative workflow within your sandboxed solution.
- Extension of sandbox capabilities. For example, suppose you want to allow sandboxed solution developers to use personalization. You might create a full-trust proxy to expose properties from the profile store. Similarly, you might create proxies to enable sandboxed solution developers to use logging functionality or read configuration settings from the farm-scoped property bag.
- Integration with business data. For example, suppose you want to show a list of custom activities from your CRM system alongside a proposal workspace in SharePoint 2010. You could create an external content type to enable SharePoint solutions to interact with the CRM data. External content types are full-trust components. Within the sandboxed solution, you could create an external list that binds to the CRM external content type and enables you to query customer data.

How Does My Execution Logic Affect My Choice of Model?

Before you make design decisions about how to build a SharePoint application, it is important to understand how various SharePoint components execute their logic. First, knowing where your logic will execute can provide a useful context when you choose an implementation strategy. The following table maps different approaches to execution logic to the actual processes in which they execute.

Table: SharePoint Components and Where Execution Happens. Columns: IIS worker process; sandbox worker processes; timer job process; service application processes. Rows: declarative components; Web Parts*; Web pages; event receivers*; coded workflow activities**; full-trust assemblies; fully coded workflows; timer jobs; service applications.

  • 22.
* Restrictions apply; see text for details.
** SharePoint 2010 provides a wrapper activity that can call custom code in the sandbox. See text for details.

Note: Typically, workflows run in the IIS worker process when they are first initiated. After rehydration, they execute within the same process as the event that triggered the rehydration. For example, if there is a timed delay in the workflow, the workflow will be restarted from the timer process when the timer fires. If an approval causes the workflow to rehydrate, the workflow runs in the IIS worker process where the approval was received from the user. In some circumstances, workflow activities may also run in the sandbox worker proxy process (for example, if the sandbox code creates an item in a list that causes a workflow to run).

In addition to understanding where logic executes, it is important to know which execution logic patterns are supported by each execution model. The following table shows which execution models you can use with different execution logic patterns.

Table: SharePoint Components and Supported Execution Models. Columns: sandboxed solution; hybrid solution; full-trust farm solution. Rows: declarative components; Web Parts*; content pages; application pages; event receivers*; coded workflow activities*; full-trust assemblies; fully coded workflows; timer jobs; service applications.

* Restrictions apply; see text for details.

Some of these execution logic patterns are subject to restrictions when they run within a sandboxed solution. Visual Web Parts cannot be used in the sandbox without employing a workaround, because this would require the deployment of .ascx files to the SharePoint root on the server. Web Parts that run within the sandbox cannot use user controls for the same reason. Event receivers that run within the sandbox are limited to events that occur within the boundaries of the site collection, and they can only be registered declaratively.

Full-trust coded workflow activities can only be used within sandboxed solutions when they are consumed by a declarative workflow. You can also create sandbox code that is invoked by a wrapper workflow activity provided by SharePoint. For more information, see Sandboxed Solutions.
  • 23. Farm Solutions

Typically, farm solutions are packaged as SharePoint solution package (WSP) files that contain assemblies, other non-compiled components, and an XML manifest file. A farm administrator uses Windows PowerShell, the STSADM command-line tool, or the SharePoint Central Administration Web site to install solution packages to the server environment. After a solution package is installed, the farm administrator can activate the solution to a specific Web application (or multiple Web applications, if you use the full-trust model).

As described in other topics in this section, you can configure your farm solutions to use a full-trust execution model or a bin/CAS execution model. When you use the full-trust approach, the solution package deploys your assembly to the global assembly cache on each Web server. When you use the bin/CAS approach, the solution package deploys your assembly to the bin folder of a specific Web application in the Internet Information Services (IIS) file structure on each Web server. In both cases, the solution package can deploy other components such as resource files, ASCX user controls, and ASPX Web pages to the SharePoint directory structure on each Web server (commonly referred to as the "SharePoint root").

This topic explains the technical details behind the execution models for farm solutions, and it identifies some of the key execution issues that you should consider when you work with farm solutions. The topic largely focuses on the full-trust execution model, because the bin/CAS model is no longer considered a recommended approach.

How Does the Full-Trust Execution Model Work?

The precise details of how a full-trust solution executes vary slightly, depending on the type of SharePoint component that you have deployed. For example, Web Part assemblies and event receivers are loaded by an IIS worker process (W3wp.exe), while timer jobs are loaded by the SharePoint timer job process (Owstimer.exe). However, the concepts remain broadly the same (although the timer process typically runs under an account with higher permission levels than the IIS worker process). The example that follows assumes you have deployed a Web Part. A request that invokes your Web Part logic is directed to the IIS worker process that manages the Web application associated with the request. The IIS worker process loads the appropriate assembly from the global assembly cache. Because the assembly is located in the global assembly cache, and as such is not subject to code access security policies, it has unrestricted access to the SharePoint object model and to any other APIs that are accessible from the worker process. The assembly is also able to access remote resources such as databases, Web services, and Windows Communication Foundation (WCF) services. The following illustration shows the various components of full-trust execution.

Illustration: The full-trust execution model
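As an illustration of a component that requires the full-trust farm solution model, the following sketch outlines a minimal timer job. Timer jobs derive from SPJobDefinition and are executed by the Owstimer.exe process; the class name, constructor arguments, and aggregation logic shown here are hypothetical placeholders rather than code from the Farm Solution reference implementation.

    using System;
    using Microsoft.SharePoint.Administration;

    // Hypothetical job that aggregates data on a schedule; names and logic are placeholders.
    public class SalesAggregationJob : SPJobDefinition
    {
        public SalesAggregationJob()
            : base()
        {
        }

        public SalesAggregationJob(string jobName, SPWebApplication webApplication)
            : base(jobName, webApplication, null, SPJobLockType.Job)
        {
        }

        public override void Execute(Guid targetInstanceId)
        {
            // Runs in the SharePoint timer process (Owstimer.exe) under the timer service
            // account, so it can aggregate data across site collections without an
            // interactive request.
        }
    }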
  • 24. How Does the Bin/CAS Execution Model Work?When you deploy a farm solution using the bin/C AS execution model, the assembly is added to the bin folder inthe IIS file structure for your SharePoint Web application. As a result, the assembly can be loaded only by the IISworker process associated with that Web application (in contrast to the full-trust execution model, where yourglobal assembly cache–deployed assemblies can be loaded by any process). This difference precludes the use ofbin/C AS solutions to deploy various SharePoint components, such as timer jobs, event receivers, serviceapplications, and workflows, which require your assemblies to be available to other processes.Requests that invoke your code are directed to the IIS worker process that runs the Web application associatedwith the request. The IIS worker process loads the appropriate assembly from the Web applications bin folder inthe IIS file system. Because the assembly is located in the bin folder, it is subject to the code access securitypolicies defined in the configuration file for the Web application. These policies define the degree to which yourassembly can use the SharePoint object model as well as other APIs, databases, and services. The followingillustration shows the various components of bin/C AS execution.The bin/CAS execution modelWhat Can I Do with Farm Solutions?Full-trust farm solutions have no limitations in terms of functionality or scope. You can deploy every type ofSharePoint component with a full-trust solution, and you can make your components available to site collectionsacross the server farm.Bin/C AS solutions are more limited. Scope is restricted to the target Web application, and functionality isconstrained by the code access security policies that are applied to the Web application. Bin/C AS solutions arealso unsuitable for the deployment of timer jobs, event receivers, service applications, and workflows. Thesecomponents require assemblies to be deployed to the global assembly cache, as explained earlier in this topic.What Are the Core Issues for Farm Solutions?Each execution model creates a different set of challenges for the developer. Farm solution development createsparticular issues for consideration in the areas of deployment, capabilities, stability, and security. The nextsections describe each of these.Generated from CHM, not final book. Will be superseded in the future. Page 24
• 25. Deployment
When you create a full-trust farm solution, there are no limits to the types of resources that you can deploy. Nor are there restrictions on the locations within the server file system to which you can add these resources. However, your organization may limit or prohibit the deployment of farm solutions due to security or performance concerns. In many cases, your application may also have to undergo a formal code review before you can deploy the solution to the server environment.
Capabilities
Full-trust farm solutions execute without any code access security restrictions and run using the same process identity as the code that invokes your solution. Typically, your code will run in the IIS worker process (W3wp.exe), the SharePoint Timer process (Owstimer.exe), or a service application process, depending on your execution logic. As a result, your code executes without any restrictions—in other words, your code can do whatever the SharePoint platform itself can do. In cases where security or stability are not significant issues, or where the application undergoes a high level of functional and scale testing, a farm solution is an appropriate choice. Otherwise, consider running only the components that specifically require a farm solution deployment within a farm solution. Components that can run within the sandbox environment should be deployed in a sandboxed solution.
Stability
Farm solutions are not subject to any monitoring or resource allocation throttling. Poorly written code in a farm solution can jeopardize the performance and stability of the server farm as a whole. To prevent these issues, you should carefully review your farm solution code to identify issues that could cause memory leaks or process timeouts. For example, developers often encounter the following pitfalls that can adversely affect performance:
 The developer could fail to dispose of SPSite and SPWeb objects after use.
 The developer could iterate through items in large lists instead of executing queries on the lists.
 The developer could use for or foreach loops to aggregate data, instead of using SPSiteDataQuery or other recommended data aggregation methods.
 The developer could use recursive method calls to iterate through information in every site within a site collection.
 The developer could fail to close connections to external systems after use.
 The developer could fail to trap timeouts when connecting to external systems.
 The developer could overuse, or improperly use, session state.
This is not an exhaustive list; instead, it simply illustrates that there are many different ways in which you can unnecessarily slow your SharePoint environment (a brief sketch that addresses the first few of these pitfalls follows at the end of this topic). To minimize risks to farm stability, you should review your solution code against all best practice guidance in the relevant functional areas.
Security
Farm solution code runs in the same process space as SharePoint itself. These processes run using privileged accounts. Both of these factors increase the scope for harm if your code is compromised or exploited. Even if you deploy your code using the bin/CAS approach and apply restrictive code access security policies, the risk of a damaging security exploit is substantially higher than you would encounter through a sandboxed solution. You should take care to review your code for security vulnerabilities before you deploy your solution.
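To illustrate the first few pitfalls listed above, the following is a minimal sketch of farm-solution code that disposes of SPSite and SPWeb objects deterministically and uses SPSiteDataQuery to aggregate data instead of looping through lists. The site URL and query details are hypothetical examples, not part of the guidance.
C#
using System.Data;
using Microsoft.SharePoint;

// Hypothetical example: aggregate recent announcements across a site collection.
public static class AnnouncementAggregator
{
    public static DataTable GetRecentAnnouncements()
    {
        // Dispose of SPSite and SPWeb objects deterministically with using blocks.
        using (SPSite site = new SPSite("http://intranet.contoso.com"))
        using (SPWeb web = site.OpenWeb())
        {
            // Let the content database do the aggregation rather than
            // iterating over every list in every site with foreach loops.
            SPSiteDataQuery query = new SPSiteDataQuery();
            query.Lists = "<Lists ServerTemplate='104' />";   // Announcements lists
            query.Webs = "<Webs Scope='SiteCollection' />";
            query.ViewFields = "<FieldRef Name='Title' />";
            query.Query = "<OrderBy><FieldRef Name='Modified' Ascending='FALSE' /></OrderBy>";
            query.RowLimit = 20;

            return web.GetSiteData(query);
        }
    }
}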
  • 26. Sandboxed SolutionsSandboxed solutions are packaged as SharePoint solution package (WSP) files that contain assemblies, othernon-compiled components, and an XML manifest file. A site collection administrator, or another user with sufficientpermissions, uploads the solution package to a specialized library—the solution gallery—in the root site of the sitecollection. Every sandboxed solution is executed in a unique application domain. Because the application domainis unique to your solution, SharePoint is able to monitor your solution for performance issues and resource use,and it can terminate your code if it exceeds the boundaries set by the IT team. The application domain runs withinan isolated process, using an account with a lower set of permissions than the Web application service account,and is subject to various restrictions on functionality and scope.The remainder of this topic explains the technical details behind the execution model for sandboxed solutions. Itdescribes in detail what you can and cannot do in the sandbox environment, and it explains how IT professionalscan manage, configure, and constrain the execution of sandboxed solutions. It also identifies some of the keyexecution issues that you should consider when you work with sandboxed solutions.How Does the Sandbox Execution Model Work?When your solution runs within the sandbox environment, requests that invoke your code are first directed to theInternet Information Services (IIS) worker process that runs the Web application associated with the request. Therequest is handled by the Execution Manager, a component that runs in the same application pool as the Webapplication.The Execution Manager routes the request to a server that runs the SharePoint User C ode Service(SPUC HostService.exe). Depending on your farm configuration, this could be a Web front-end server or it couldbe a dedicated application server. When the user code service receives a request, it will either start a newsandbox worker process (SPUC WorkerProcess.exe) or route the request to an existing sandbox worker process.More specifically, the execution manager routes the request to a specific sandbox worker process if that processis already hosting an application domain for the solution in question. If no loaded application domain is found, theexecution manager will route the request to the sandbox worker process that is under least load. The workerprocess then creates a new application domain and loads the solution assembly. If the worker process hasreached the maximum number of application domains it is configured to host, it unloads an existing applicationdomain before it creates a new one.After the sandbox worker process loads the solution assembly into an application domain, it executes your code.Because the assembly runs in the context of the sandbox worker process, it has a limited set of permissions touse the SharePoint object model and it is prevented from interacting with any other APIs, services, or resources.The code access security policies that limit access to the SharePoint object model are described by theconfiguration file associated with the sandbox worker process. When your sandboxed code makes calls into thepermitted subset of the SharePoint API, the sandbox worker process forwards these requests to a proxy process(SPUC WorkerProcessProxy.exe) that executes the SharePoint object model code. 
A sandbox worker process and a sandbox proxy process always work as a pair.
The following illustration shows the different components of the sandbox execution architecture.
The sandbox execution model
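As a concrete illustration, the following is a minimal sketch of the kind of Web Part code that executes inside the sandbox worker process: it uses only the permitted subset of the object model and stays within the current site collection. The class name and list name are hypothetical.
C#
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using Microsoft.SharePoint;

// Hypothetical sandboxed Web Part.
public class TaskCountWebPart : WebPart
{
    protected override void CreateChildControls()
    {
        // Objects obtained from SPContext are managed by SharePoint; do not dispose of them.
        SPWeb web = SPContext.Current.Web;
        SPList tasks = web.Lists.TryGetList("Tasks");

        int count = (tasks != null) ? tasks.ItemCount : 0;
        this.Controls.Add(new Label { Text = "Open tasks: " + count });
    }
}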
  • 27. The following are the three key processes that drive the execution of sandboxed solutions:  User Code Service (SPUC HostService.exe). This is responsible for creating the sandbox worker processes that execute individual sandboxed solutions and for allocating requests to these processes. You must start this service through the SharePoint C entral Administration Web site on each server that will host sandboxed solutions.  Sandbox Worker Process (SPUC WorkerProcess.exe). This is the process in which any custom code in your sandboxed solution executes. When a sandbox worker process receives a request that invokes a particular solution, it loads an application domain for that solution (unless it is already loaded). If the worker process reaches the limit on the number of application domains that it can host, it will unload one of the application domains for another solution and load the application domain required to serve the current request. The sandbox worker process throttles the resources accessed by your solution and destroys processes that take too long to execute. Each sandbox worker process is monitored by the SharePoint environment against the criteria specified by the IT team.  Sandbox Worker Process Proxy (SPUC WorkerProcessProxy.exe). This provides a full-trust environment that hosts the SharePoint API. This enables sandboxed solutions to make calls into the subset of the SharePoint object model that is accessible to sandboxed solutions. These calls are actually executed in the proxy process. Note:Generated from CHM, not final book. Will be superseded in the future. Page 27
• 28. The executable files that drive sandboxed solutions are stored in the folder 14\UserCode (under the SharePoint root) on each SharePoint server.
What Can I Do with Sandboxed Solutions?
When you develop solutions that target the sandbox execution model, you need to understand the constraints that apply to the sandbox environment. This section reviews some common SharePoint development scenarios for their compatibility with the sandbox execution model. The following list shows several common development scenarios; in each case, one or more of the sandbox, hybrid, and full-trust execution models is available to you, as indicated by the footnotes and the surrounding text. This is not an exhaustive list; however, it serves to give you a feel for the types of scenarios that you can implement with a sandboxed solution.
 Create a Web Part that aggregates data from multiple SharePoint lists within the same site collection.*
 Create a Web Part that aggregates data from multiple SharePoint lists from different site collections within the same SharePoint farm.
 Create a Web Part that aggregates data from multiple SharePoint lists from different site collections from different SharePoint farms.
 Create a Web Part that displays data from an external list.
 Create a Web Part that interacts with a Web service or a Windows Communication Foundation (WCF) service.
 Create a workflow in SharePoint Designer.
 Create a sandbox workflow action (a method call).
 Create a full-trust workflow activity.
 Create a workflow in SharePoint Designer that uses a full-trust custom coded workflow activity.
 Create a fully coded workflow.
 Deploy a new list definition.
 Deploy a new list definition with list item event receivers.
 Deploy a list definition with list event receivers.
 Deploy a site definition.
 Create a content type.
 Create an external content type.**
 Create a new ribbon element.
 Create a new Site Actions menu item.
 Create an instance of a SharePoint list.
 Programmatically create a SharePoint subsite.
 Bind a content type to the home page of a SharePoint subsite.
 Deploy a new application page.
 Create a timer job.
 Create a service application.
• 29. *The Visual Web Part supplied with Visual Studio 2010 will not run in the sandbox. You must use the Visual Studio 2010 SharePoint Power Tools to build a Visual Web Part that runs in the sandbox.
**External content types are typically created by using the External Content Type Designer in SharePoint Designer 2010. However, they must be deployed using a farm solution or through the Central Administration Web site.
Note: The standard Visual Web Part is not supported in the sandbox environment. This is because Visual Web Parts effectively host an ASCX user control within the Web Part control. The ASCX file is deployed to the _controltemplates virtual directory in the physical file system on each Web front-end server. The sandbox environment does not allow you to deploy physical files to the SharePoint root, so you cannot use a sandboxed solution to deploy a Visual Web Part based on the Visual Studio 2010 Visual Web Part project template.
A Visual Studio Power Tool is available that addresses this issue. A Power Tool is a plug-in for Visual Studio. The tool generates and compiles code representing the user control (.ascx) as part of the assembly, which avoids the file deployment issue. You can download a Power Tool for Visual Studio 2010 that supports Visual Web Parts in the sandbox from Visual Studio 2010 SharePoint Power Tools on MSDN.
Code Access Security Restrictions
The execution of sandboxed solutions is governed by a restrictive code access security policy. This limits sandboxed solutions to the use of a specific subset of the Microsoft.SharePoint namespace. The code access security policy also prevents sandboxed solution code from accessing external resources or systems. The directory 14\UserCode contains the Web.config file that specifies the CAS policies that apply to sandboxed solutions as a trust level. For a complete list of the namespaces and classes that are available in the sandbox environment, together with details of the code access security policies that apply to sandboxed solutions, see the following articles in the SharePoint Foundation SDK:
 Sandboxed Solutions Architecture
 Namespaces and Types in Sandboxed Solutions
Note: If you attempt to use a SharePoint method that is not permitted in the sandbox environment, the method call will throw a MissingMethodException. This occurs for all methods in the blocked namespaces. The Visual Studio 2010 SharePoint Power Tools include a Sandbox Compilation extension that generates build errors when the sandboxed solution project uses types that are not permitted.
There are various nuances that apply to these API restrictions:
 Within the sandbox, you can use an assembly that includes blocked types and methods, as long as those blocked types and methods are not used within the sandbox environment.
 Any methods that are called from the sandbox must not include any blocked types or methods, even if those blocked types or methods are not actually invoked when the method is called from the sandbox environment.
Permission Restrictions
In addition to code access security policy restrictions, the sandbox worker process uses an account with a limited permission set. Using a low-privileged account further limits the amount of harm that a compromised sandboxed solution can do within the production environment.
This further restricts the actions that you can perform fromsandboxed code.Because sandboxed code is executed in a partial trust environment, any assembly that contains code that will becalled from the sandbox must include the AllowPartiallyTrustedC allersAttribute.Retrieving User IdentityWithin your sandboxed solutions, you can programmatically retrieve the SPUser object associated with thecurrent request. However, you cannot access the underlying authentication token for the current user. In mostcases, this is not a problem, because the restrictions of the sandbox environment generally prevent you fromperforming operations in which the underlying identity is required, such as impersonating a user in order toaccess an external system.Using Event ReceiversYou can create event receiver classes within sandboxed solutions for events that fire on list items, lists, andGenerated from CHM, not final book. Will be superseded in the future. Page 29
• 30. individual sites—in other words, events that fire within the boundaries of a site collection. Specifically, you can only create event receivers that derive from the following classes:
 SPItemEventReceiver
 SPListEventReceiver
 SPWebEventReceiver
You cannot use the object model to register event receivers within sandboxed solutions. For example, you cannot use a feature receiver class to register an event receiver on feature activation. However, you can register event receivers declaratively in your feature elements file. For more information about how to register an event receiver declaratively, see Registering an Event Handler on MSDN.
Note: To determine whether your application code is running in the sandbox process, check whether the application domain name contains the text "Sandbox". You can use the following code to accomplish this:
if (System.AppDomain.CurrentDomain.FriendlyName.Contains("Sandbox"))
{
    // Your code is running in the sandbox.
}
In the SharePoint Guidance Library, the SharePointEnvironment class contains a static method named InSandbox that returns true if this condition is met.
Accessing External Data
Broadly speaking, there are two main approaches that you can use to access external data in SharePoint 2010 solutions:
 Business Data Connectivity Object Model (BDC OM). You can use this to work with external content types and external lists.
 SharePoint Object Model. You can use this, namely the SPList API, to work with external lists.
You can use both the BDC OM and the SPList API to access data from external lists. In fact, the SPList API actually uses the BDC OM to perform CRUD (Create, Read, Update, and Delete) operations on external list data. However, the SPList API is available in the sandbox environment, whereas the BDC OM is not.
The SPList API performs well when the external list contains simple field types and when the built-in BDC formatter is able to "flatten" (serialize) more complex types. However, there are certain scenarios in which the SPList API will not work; for example, it will not work when you need to retrieve custom data types or binary large objects, when a list has bi-directional associations, or when the back-end system uses non-integer identifier fields. For a complete list of these scenarios, see Using the SharePoint List Object Model and the SharePoint Client Object Model with External Lists. In these cases, you must use the BDC OM. The BDC OM is not directly available within the sandbox environment; instead, you need to create a full-trust solution or a hybrid solution that uses a full-trust proxy to access the BDC APIs. For more information about this approach, see Hybrid Approaches.
Note: The BDC OM is present in SharePoint Foundation 2010, SharePoint Server 2010, and Office 2010. For more information, see Business Connectivity Services Object Model Reference on MSDN.
Using Workflows
You can use sandboxed solutions to deploy declarative workflows that were created in SharePoint Designer. These declarative workflows are stored in the content database. As with any declarative logic, declarative workflows execute with full trust, regardless of whether you define them in a sandboxed solution or a farm solution. However, you cannot deploy coded workflows to the sandbox environment.
As you probably already know, you can define custom-coded workflow activities that run in the full-trust execution environment. You can also create sandboxed code that is invoked by a workflow action.
Note: Workflow activities and workflow actions are related concepts.
A workflow activity is any class that derives from System.Workflow.ComponentModel.Activity. A workflow action is a SharePoint Designer concept that describes any activity or group of activities that can be composed into a human-readable sentence in the SharePoint workflow engine. A workflow action is represented by an Action element in a feature manifest file or an .actions file, as you will see in the code examples that follow.
Technically, you cannot create a workflow activity that runs in the sandbox. However, you can create a
• 31. sandboxed method that is packaged as a workflow action. In the case of sandboxed workflow logic, the workflow activity is the SharePoint-provided wrapper class that calls your sandboxed code. For the sake of readability and simplicity, this topic refers to sandboxed code that is invoked by a workflow action as a sandboxed workflow action.
To create a sandboxed workflow action, you must create a class with a method that accepts an SPUserCodeWorkflowContext object as the first parameter. You can also have additional parameters, which will be defined in the Elements.xml file for the solution. The following example is taken from the workflow reference implementation.
C#
public Hashtable CopyLibraryAction(SPUserCodeWorkflowContext context, string libraryName,
    string targetSiteUrl)
{
    // This is the logic to copy a library to a target site.
}
The action is then defined in the Elements.xml file, which tells SharePoint about the action and the implementing class. It also enables SharePoint Designer to use the activity in a declarative workflow for the site collection.
XML
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <WorkflowActions>
    <Action Name="Copy Library"
            SandboxedFunction="true"
            Assembly="..."
            ClassName="..."
            FunctionName="CopyLibraryAction"
            AppliesTo="list"
            UsesCurrentItem="true"
            Category="Patterns and Practices Sandbox">
      <RuleDesigner Sentence="Copy all items from library %1 to site %2">
        <FieldBind Field="libraryName" Text="Library Name" Id="1" DesignerType="TextBox" />
        <FieldBind Field="targetSiteUrl" Text="Target Site" Id="2" DesignerType="TextBox" />
      </RuleDesigner>
      <Parameters>
        <Parameter Name="__Context" Type="Microsoft.SharePoint.WorkflowActions.WorkflowContext, Microsoft.SharePoint.WorkflowActions" Direction="In" DesignerType="Hide" />
        <Parameter Name="libraryName" Type="System.String, mscorlib" Direction="In" DesignerType="TextBox" Description="The library to copy" />
        <Parameter Name="targetSiteUrl" Type="System.String, mscorlib" Direction="In" DesignerType="TextBox" Description="The URL of the target site" />
      </Parameters>
    </Action>
  </WorkflowActions>
</Elements>
The workflow execution environment calls the method specified in the Action element to launch your sandboxed workflow action.
Suppose that you have deployed a declarative workflow and a sandboxed workflow action to the sandbox environment. SharePoint executes the declarative workflow with full trust, because all the actual run-time code invoked by the workflow is deployed with full trust; therefore, it is considered safe. SharePoint defines a sandboxed activity wrapper that executes with full trust and provides a wrapper for all sandboxed actions. The sandbox activity wrapper makes the method call into your sandboxed method. The method defined in the sandboxed solution—CopyLibraryAction in the previous example—actually executes within a sandbox worker process. The following illustration shows this, where the details of the user code service processes have been omitted for brevity.
  • 32. Using a sandboxed workflow action from a declarative workflowWhen you create the declarative workflow, SharePoint Designer hides the relationship between the sandboxactivity wrapper and the sandboxed action implementation. SharePoint Designer also enables you to defineparameters and field bindings as inputs to the custom sandbox action.Sandboxed workflow actions offer advantages in many scenarios, particularly because you can deploy theseactions as sandboxed solutions without access to the server environment. However, there are limitations on thetasks that a sandboxed workflow action can perform. Declarative workflows can also use certain approvedfull-trust workflow activities. For information about how to add custom full trust workflow activities that can beconsumed by a declarative workflow, see Hybrid Approaches.How Do I Manage Sandboxed Solutions?Farm administrators can customize many aspects of how sandboxed solutions are executed, validated, andmonitored. As a solution architect or a senior developer, it is important to have an awareness of these featuresbecause they can impact how your solutions behave and perform.Understanding Operational ModesThe IT team can configure the SharePoint farm to execute sandboxed solutions in one of two operational modes.The operational mode determines where the sandbox worker process that executes each sandboxed solutionGenerated from CHM, not final book. Will be superseded in the future. Page 32
  • 33. actually resides:  When the farm is configured in local mode, each sandboxed solution executes on the Web front-end server that receives the request.  When the farm is configured in remote mode, sandboxed solutions can execute on servers other than the server that receives the request.When you configure the farm to run sandboxed solutions in remote mode, you can use dedicated applicationservers to run sandboxed solutions. Alternatively, the server farm can use load balancing to distribute theexecution of sandboxed solutions across Web front-end servers. You must start the user code service on eachWeb front-end server that will run sandboxed solutions.The following illustration shows the difference between these approaches. When your farm is configured in localmode, sandboxed solution code executes on the Web front-end server that receives the request. The Webfront-end server will spin up a new sandbox worker process and load the solution, unless a process already existsfor that solutions unique application domain.Sandbox execution in local modeWhen your farm is configured in remote mode with dedicated sandbox servers, the Web front-end server thatreceives the request will first establish whether any of the sandbox servers are already running a sandbox workerprocess for the required solution. If this is the case, the Web front-end server will route the request to thatsandbox server. This is known as solution affinity. If the process is not running on any of the sandbox servers,the Web front-end server will route the request to the sandbox server currently experiencing least load. Thissandbox server will spin up a sandbox worker process and load the solution.Sandbox execution in remote mode with dedicated sandbox serversGenerated from CHM, not final book. Will be superseded in the future. Page 33
  • 34. When your farm is configured in remote mode, and the user code service is running on more than one Webfront-end server, the Web front-end servers will distribute requests that invoke sandboxed solutions according toserver load. If one of the Web front-end servers is already running a sandbox worker process that has loaded therequired solution into an application domain, the request is routed to that server. If the solution is not loaded onany of the Web front-end servers, the request is routed to the Web front-end server currently experiencing leastload. This server will spin up a new application domain in the sandbox worker process and load the solution.Remote mode with Web front-end servers hosting the user code serviceThe IT team should use capacity planning to select the best operational mode for a particular SharePointenvironment. In general, it is recommended to use remote mode. However, if you expect a small number ofsandboxed solutions, and response latency is a major concern, local mode may be preferable. This is becausethere is a minor latency cost in cases where the request is received on one server and the sandbox processes runon a different server.As your server farm grows larger, or the expected number of sandboxed solutions increases, remote mode canbecome increasingly advantageous. When more than one server runs the user code service, load balancing andsolution affinity mean that each server needs to host only a subset of the deployed sandboxed solutions. This isimportant because every server that runs the user code service can host only a finite number of sandbox workerprocesses and application domains. When a server hits these limits, what happens when it receives a request fora sandboxed solution that is not already loaded? To serve the request, it must recycle an existing applicationdomain to be able to load the new application domain. This results in requests queuing for a free applicationGenerated from CHM, not final book. Will be superseded in the future. Page 34
• 35. domain and increased recycling of application domains. In addition, it becomes increasingly unlikely that there will be an already loaded ("warm") application domain for particular solutions. These factors can substantially impact server performance and response times. Using remote mode to distribute application domains across multiple servers clearly mitigates these issues.
In addition to simply setting the operational mode to local or remote, the IT team can make various configuration changes that affect the performance issues described here. For example, farm administrators can configure the number of sandbox worker processes and application domains that can be hosted on each server that runs the user code service. They can also constrain the number of connections allowed per process. Finally, the user code service includes a flag named AlwaysRecycleAppDomains. By default, this is set to false. When it is set to true, the user code service recycles application domains after every request. This increases the response time of the server, because a "warm" application domain is never available, but it can reduce the risk of data crossover in poorly designed sandboxed solutions.
Deploying and Upgrading Solutions
Sandboxed solutions are deployed as SharePoint solution package (WSP) files to the solutions gallery, a specialized library in the root site of each site collection. You can find the solutions gallery at the site-relative URL _catalogs/solutions. The site collection administrator can activate and deactivate the solutions within the gallery.
If you need to deploy a sandboxed solution to multiple site collections, you must upload it to each site collection gallery separately. Alternatively, you could create a central repository for sandboxed solutions and register a custom solution provider with the solution galleries on each site collection. Site collection administrators can then choose which centrally available solutions they want to activate on their individual site collection. The custom provider approach essentially allows you to upload and manage a solution in a single location while making it available to multiple site collections.
You can upgrade sandboxed solution packages through the user interface. If you upload a solution package with a new file name but the same solution ID as an existing solution, SharePoint will prompt you to upgrade the existing solution. Alternatively, you can use the Update-SPUserSolution command in Windows PowerShell to upgrade your solutions. However, this requires access to the server environment, which is not necessarily available to sandboxed solution developers or site collection administrators.
For more information about managing sandboxed solutions in the solutions gallery, see Sandboxed Solutions Architecture in the SharePoint Foundation SDK.
Understanding Solution Monitoring
SharePoint 2010 monitors the performance and resource use of your sandboxed solutions through a system of resource points. Farm administrators can set limits on the number of resource points that a site collection containing sandboxed solutions may consume daily. Because these limits are set on a per-site collection basis, the available resources are effectively shared between every sandboxed solution in the site collection. Site collection administrators can monitor the resource points used by individual solutions from the site collection solution gallery.
If the solutions in a site collection exceed the daily resource point allocation for that site collection, SharePoint will take every sandboxed solution in the site collection offline for the rest of the day. Resource points are calculated according to 14 different measurements known as resource measures, including CPU execution time, memory consumption, and unhandled exceptions. Each resource measure defines a property named Resources per Point. This is the quantity of that particular resource that constitutes an individual resource point. For example, suppose you deploy a solution named Contoso Project Management to the sandbox environment. The following table shows a hypothetical example of the resources it consumes across two sample resource measures.
Resource measure | Resources per Point | Used by Contoso Project Management solution in one day | Points consumed
SharePointDatabaseQueryCount | 20 queries | 300 queries | 15
SharePointDatabaseQueryTime | 120 seconds cumulative | 240 seconds cumulative | 2
SharePoint counts the most expensive resource measure toward the total for the solution, instead of the sum of all measures. In this example, because the number of database queries represents the highest resource point usage, the Contoso Project Management solution consumes 15 resource points from the total allocated to the site
• 36. collection.
To prevent rogue sandboxed solutions from causing instability, SharePoint also monitors individual sandboxed solutions per request. Each of the 14 resource measures includes an AbsoluteLimit property that defines a hard limit of the resources that a sandboxed solution can consume in a single request. If an absolute limit is exceeded, SharePoint terminates the request by stopping and restarting the sandbox worker process. For example, the CPU execution time resource measure has a default absolute limit of 60 seconds. If a single request takes more than 60 seconds to execute, the user code service will stop and restart the sandbox worker process that is executing the request. Individual solutions will not be disabled for violating an absolute limit, although the utilization will count toward the resource points for the site collection; therefore, they will be expensive.
In addition, the user code service includes a property named WorkerProcessExecutionTimeout with a default value of 30 seconds. If this time limit is exceeded during a single request, the user code service will recycle the application domain in question and the request will return an error. These two settings are measured independently by different parts of the system, but they effectively measure the same thing. In general, setting the WorkerProcessExecutionTimeout is preferred over the absolute limit, because exceeding it recycles only the application domain instead of the entire process. Exceeding an absolute limit will result in a worker process recycle. When a worker process is recycled, any requests running within the process will fail. In production installations, it is likely that multiple solutions will be running within multiple application domains within one process, so a single rogue solution can disrupt users of more benign solutions. For more information about configuring sandbox environments for resiliency in a production environment, see Performance and capacity management (SharePoint Server 2010) on TechNet.
Farm administrators can use Windows PowerShell to change the Resources per Point for a resource measure. However, the default measurement weightings were carefully chosen, and understanding the impact of adjustments to these weightings can be complex. You should carefully consider the impact of changing these weightings before you make any modifications.
You can also use Windows PowerShell to investigate how many resource points are being used by specific individual solutions. Resource point consumption depends on the capacity of your server farm and on how you configure measurement weightings, so it is hard to provide an absolute recommendation on where to cap resource point allocations for sandboxed solutions. Instead, you should determine limits by testing against a representative production environment.
For full details of the measurements used to calculate resource points, see Developing, Deploying, and Monitoring Sandboxed Solutions in SharePoint 2010 and Plan sandboxed solutions (SharePoint Server 2010) on MSDN.
On a final note, farm administrators can use the Central Administration Web site to block poorly performing or otherwise undesirable sandboxed solutions. This ensures that the solution in question cannot be deployed to any site collection in the farm.
Understanding Solution Validation
In SharePoint 2010, farm administrators can install solution validators to provide additional verification of sandboxed solutions. SharePoint 2010 runs these solution validators when you attempt to activate a sandboxed solution package.
Each solution validator can run various validation checks on the solution package and can blockactivation if your solution fails any of these checks.By default, SharePoint 2010 includes a single default solution validator that simply sets the Valid property of eachsolution to true. To create your own custom solution validator, you must create a class that inherits from theSPSolutionValidator abstract class. This class includes two key methods:  ValidateSolution. This method validates the solution package and its contents. This method has access to the name of the solution package and any files that the package contains.  ValidateAssembly. This method validates each assembly in the solution package.Both methods enable you to set an error message, together with an error URL to which the user should bedirected if validation fails.To register a solution validator with the SharePoint farm, you can use a feature receiver to add your class to theSolutionValidators collection in the local SPUserCodeService object.What Are the Core Issues for Sandboxed Solutions?Sandboxed solutions introduce a fresh set of challenges for SharePoint developers. When you develop aGenerated from CHM, not final book. Will be superseded in the future. Page 36
  • 37. sandboxed solution, you should pay particular attention to the areas described in the following sections.Scope and CapabilitiesSandboxed solutions can only interact with resources within the site collection in which they reside. If yourapplication must access data in multiple site collections, you can rule out a sandboxed solution at an early stagein the design process. Similarly, the restricted capabilities of the sandbox environment may prohibit the use of asandboxed solution.Security (Authentication)Sandboxed solutions do not maintain the full identity of the user originating the request, and they cannotimpersonate a different user account or provide credentials to authenticate to other systems. The SPUser objectis maintained, but the related security tokens are not. With this in mind, you should consider whether asandboxed solution is capable of accessing the data or resources that your application requires.In particular, the constraints on authentication prevent you from executing your code with elevated permissions.In farm solutions, developers will often use the SPSecurity.RunWithElevatedPrivileges method to execute amethod with full control privileges, even if the user has a lesser set of permissions. However, you should considercarefully whether elevated permissions are really necessary before you reject a sandboxed approach altogether.Although there are scenarios in which you need to elevate permissions, there are also many cases where propermanagement of user groups and permission sets within the SharePoint site allow your logic to execute from thesandbox environment.Performance (Throughput)If a sandbox worker process runs for more than 30 seconds, the user code service will terminate the process. Ifyou need to use long-running processes to deliver your functionality, a sandboxed solution is unlikely to be thebest choice. However, in these circumstances, you should probably be using an asynchronous executionmechanism instead. For example, use a timer job, a workflow, or a service application to execute your logic as abackground task within a farm solution.Executing code within the sandbox environment also incurs a small amount of performance overhead. This is onlylikely to have a noticeable impact in high volume applications, such as in Internet-facing portal environments. Inthese cases, you may want to consider deploying your code within a farm solution.LoggingLogging functionality is unavailable within the sandbox environment. Sandboxed solutions cannot write entries tothe Windows Event log or the Unified Logging Service (ULS) trace log, nor can they create or retrieve diagnosticareas or categories. This should not come as too much of a surprise—writing to the Windows Event log has alwaysrequired a relatively permissive code access security policy, and creating diagnostic areas requires access to theWindows registry.Exposing logging functionality to sandboxed solutions is a good example of a scenario in which you might considercreating a full-trust proxy. For example, the SharePoint Logger component includes a full-trust proxy to enabledevelopers of sandboxed solutions to use the full range of logging features in SharePoint 2010.Configuration SettingsYour ability to read and write configuration settings is somewhat restricted in the sandbox environment. Thefollowing are some examples:  You cannot read configuration settings from or write configuration settings to the Web.config file. 
 You can store and retrieve settings in the SPWeb.AllProperties hash table, but you cannot use property bags at any level of the SharePoint hierarchy.  You cannot read or write settings to the hierarchical object store, because you do not have access to an SPWebApplication object or an SPF arm object.  You can read or write settings to a SharePoint list within the same site collection as your sandboxed solution.DeploymentSandboxed solutions are deployed and activated to a single site collection. If you need to deploy a solution tomultiple site collections, the sandbox approach can be less convenient. You can either manually distribute solutionpackages to individual site collection administrators, or you could implement a custom centralized solutionsGenerated from CHM, not final book. Will be superseded in the future. Page 37
  • 38. gallery to make solutions available to individual site collection administrators.When you use a sandboxed solution to deploy document templates, you cannot deploy the document templates tothe file system. The sandboxed environment does not permit you to deploy any files to the server file system. Tosolve this issue, use a Module element with a Type attribute value of Ghostable or GhostableInLibrary todeploy your templates. This indicates that the templates are deployed to the content database instead of to theserver file system.Similarly, you cannot deploy JavaScript files or modal dialog Web pages to the server file system. Instead, youmight choose to deploy Web pages to the Pages document library and scripts to a subfolder within the MasterPage Gallery. By deploying JavaScript files to the Master Page Gallery, you ensure that all users, evenanonymous users, have the permissions required to access the file at run time. This approach is especially usefulin Internet-facing deployments where anonymous users have read access to the site.Generated from CHM, not final book. Will be superseded in the future. Page 38
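As noted in the Configuration Settings section above, sandboxed code can persist simple settings in the SPWeb.AllProperties hash table (or in a SharePoint list within the same site collection). The following is a minimal sketch; the key name is a hypothetical example.
C#
using Microsoft.SharePoint;

public static class SandboxSettings
{
    private const string Key = "Contoso.ReportListUrl";   // Hypothetical setting name.

    public static void Save(SPWeb web, string value)
    {
        web.AllProperties[Key] = value;
        web.Update();   // Persists the property change.
    }

    public static string Load(SPWeb web)
    {
        return web.AllProperties.ContainsKey(Key) ? (string)web.AllProperties[Key] : null;
    }
}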
• 39. Hybrid Approaches
When describing execution models, the term hybrid approaches refers to applications that run in the sandbox yet can call out to full-trust code through various mechanisms. In other words, hybrid approaches combine components that execute in the sandbox environment with components that run with full trust, and these components are deployed in separate solutions. Essentially, you can think of a hybrid approach as two (or more) distinct, loosely coupled components. The sandboxed component is deployed in a SharePoint solution package (WSP) to a site collection solutions gallery, and the full-trust component is deployed in a WSP to the server farm. These components are typically developed in isolation, often at different times, and a single full-trust component can be consumed by multiple sandboxed applications. In many environments, the full-trust components are built or validated by the central IT team in order to make them available to multiple sandboxed solutions.
Because a hybrid approach involves creating a sandboxed solution and a full-trust solution, it is important to fully understand the sandbox execution model and the full-trust execution model before you start to work with hybrid approaches. To recap, there are three different types of full-trust components that you can consume from within a sandboxed solution:
 Full-trust proxies. You can implement your full-trust functionality in classes that derive from the SPProxyOperation abstract class and deploy the assembly to the global assembly cache. These classes expose a full-trust proxy that you can call from within the sandbox environment.
 External content types. You can use an external content type to retrieve data from line-of-business (LOB) applications and other external sources through Business Connectivity Services (BCS). External content types must be deployed as full-trust solutions. However, you can create external lists from within the sandbox environment that use these external content types to retrieve data.
 Custom workflow activities. You can create custom, code-based workflow activities and deploy these activities as full-trust assemblies to the global assembly cache. You can then consume these activities in declarative workflows from within the sandbox environment.
This topic explains the technical details behind each of these hybrid execution models. It explains in detail how you can use each model, and it identifies some of the key execution issues that you should consider when you work with hybrid solutions.
How Do Hybrid Execution Models Work?
When you use a hybrid approach to solution deployment, the execution process varies according to the type of full-trust component you use.
Hybrid Execution with a Full Trust Proxy
When you use a full-trust proxy from the sandbox environment, requests follow the normal execution path of sandboxed solutions. The code access security policy for the sandbox environment allows sandboxed code to make calls to full-trust proxy assemblies, provided that the proxy assembly is registered with the server farm. You can programmatically register a proxy assembly from a feature receiver or by using Windows PowerShell. Your sandboxed code must use the SPProxyOperationArgs class to structure the arguments that you want to pass to the full-trust proxy. When you call the SPUtility.ExecuteRegisteredProxyOperation method, the sandbox worker process invokes the full-trust proxy and passes in your arguments. The proxy code executes with full trust in a proxy process.
The full-trust proxy then passes any return arguments back to the sandboxedsolution, and normal execution within the sandbox environment resumes. Note:The SharePoint context (SPContext) is not available within the proxy operation class. If you requirecontextual information in the proxy operation class, you will need to pass the information required to recreatethe context to the proxy. For example, if you need to access a site within the proxy, you would pass the site IDas a property on the proxy arguments passed into the proxy operation. The proxy would then recreate the siteusing the site ID. You can then access the SPUser in the sandbox through site.RootWeb.CurrentUser.The following illustration shows the key components of hybrid execution with a full trust proxy.Hybrid execution with a full-trust proxyGenerated from CHM, not final book. Will be superseded in the future. Page 39
  • 40. The following describes the three key code components behind full-trust proxies:  SPProxyOperation. This class provides an abstract base class for full-trust proxies. The class includes a method named Execute, within which you can define your full trust functionality. Your full-trust proxy classes must be deployed to the global assembly cache and registered with the SharePoint server farm, either programmatically or by using Windows PowerShell.  SPProxyOperationArgs. This class provides an abstract base class for the parameter that you pass to the full-trust proxy. To pass arguments to the full-trust proxy, you must create a serializable class that derives from SPProxyOperationArgs. Add properties within this class to get and set your arguments.  SPUtility.ExecuteRegisteredProxyOperation. This static method enables you to invoke the full-trust proxy from your sandboxed code. This method requires a string assembly name, a string type name, and an SPProxyOperationArgs object. The method returns an argument of type Object to the caller.Generated from CHM, not final book. Will be superseded in the future. Page 40
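The following is a minimal sketch of these three pieces working together. The operation name, argument class, and property are hypothetical, and the proxy classes would be deployed to the global assembly cache and registered with the farm before the sandboxed call is made.
C#
using System;
using Microsoft.SharePoint.UserCode;
using Microsoft.SharePoint.Utilities;

// Argument class: must be serializable because it is passed between processes.
[Serializable]
public class VendorLookupArgs : SPProxyOperationArgs
{
    public string VendorId { get; set; }
}

// Full-trust proxy operation, deployed to the global assembly cache.
public class VendorLookupOperation : SPProxyOperation
{
    public override object Execute(SPProxyOperationArgs args)
    {
        VendorLookupArgs typedArgs = (VendorLookupArgs)args;
        // Full-trust logic here, for example calling a Web service.
        return "Details for vendor " + typedArgs.VendorId;
    }
}

// Invocation from sandboxed code (assembly name and public key token are placeholders):
// object result = SPUtility.ExecuteRegisteredProxyOperation(
//     "Contoso.Proxies, Version=1.0.0.0, Culture=neutral, PublicKeyToken=...",
//     "VendorLookupOperation",
//     new VendorLookupArgs { VendorId = "V001" });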
  • 41. Note:Any types you include in the proxy arguments class must be marked as serializable. Similarly, the typereturned by the proxy operation must be marked as serializable. This is because arguments and return valuesare serialized when they are passed between processes. Both the proxy operation class and the proxyargument class must be deployed to the global assembly cache. You cannot pass any types defined in thesandboxed code into the proxy, because the proxy will not have access to load the sandboxed assembly;therefore, it will not be able to load the passed-in type.Hybrid Execution with External Content TypesWhen you want to use external content types with sandboxed solutions, deployment constraints alone mean youmust use a hybrid approach. External content types must be defined in a farm-scoped feature; therefore, theycannot be deployed as part of a sandboxed solution. You do not need to deploy any fully trusted code to theserver. Instead, you can create external content types from the External C ontent Type Designer in SharePointDesigner 2010 or from the Business C onnectivity Services Model Designer in Visual Studio 2010. After theexternal content types are in place, you can define an external list from within your sandboxed solution to accessthe external data. For an example of this scenario, see the external list reference implementation.Hybrid Execution with external content typesGenerated from CHM, not final book. Will be superseded in the future. Page 41
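Once the external content type and the external list are in place, sandboxed code reads the external data through the ordinary SPList API, as sketched below. The list and field names are hypothetical and would match whatever the external content type defines.
C#
using Microsoft.SharePoint;

public static class VendorReader
{
    public static string GetFirstVendorName(SPWeb web)
    {
        // "Vendors" is an external list backed by an external content type
        // that was deployed separately in a farm solution.
        SPList vendors = web.Lists["Vendors"];

        SPQuery query = new SPQuery { RowLimit = 1 };
        SPListItemCollection items = vendors.GetItems(query);

        return (items.Count > 0) ? (string)items[0]["VendorName"] : null;
    }
}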
• 42. The only means to access external data from custom code in a sandboxed solution is through an external list, by using the SPList object model. You cannot use the BCS runtime APIs directly in sandboxed code.
There are special considerations for securing services for access from the sandbox. When you access external data from the sandbox, it is important to understand how credentials must be configured and used. When code in the sandbox requests access to external data through the external list, the external list implementation calls the BCS runtime. Because this code is part of the internal SharePoint implementation, it will execute within the user code
  • 43. proxy service. For security reasons SharePoint removes the authentication tokens for the user from the contextwhen it enters the sandbox worker process. As a result, the Windows identity associated with the user is notavailable in either the sandbox worker process or the sandbox proxy process. Because a Windows identity is notavailable, the managed account for the sandbox proxy process must be used as the basis for securing calls to anexternal service or a database through the BC S. All users will authenticate to the service based upon themanaged account that runs the user code proxy service. This is an example of the trusted subsystem model.When the BDC runtime receives a request for external data, it determines if the Secure Store Service (SSS) isused to manage credentials to access the service. If the SSS is being used, then the identity of the userassociated with the request is typically provided to the SSS, which maps the user (or a group or role to which theuser belongs) to a credential that can be used to access the service. Because the user authentication token is notavailable in this case, the BDC uses impersonation mode, which results in the identity of the managed accountthat runs the user code proxy service being passed to the SSS rather than the identity of the user. In the SSS,the credentials of the managed account are mapped to the credentials that you want to use to access the externalsystem. The SSS returns the mapped credentials to the BDC runtime, which then uses the credentials toauthenticate to the external system. Because the BDC runtime does not receive the credentials of individualusers, you cannot constrain access to the external system to specific user accounts when the request originatesfrom the sandbox environment. The following illustration shows this process, using the example of an externalvendor management system from the external list reference implementation.Identity flow and external service accessThe following describes the numbered steps in the preceding illustration: 1. C ustom user code, executing in the sandbox environment, uses the SPList object model (OM) to request data from an external list. The user authentication tokens for the user submitting the request have been removed from the context. 2. The SPList OM call is delegated to the user code proxy service. The user code proxy service passes the request to the BDC runtime, which also runs within the user code proxy service process. 3. The BDC runtime calls the Secure Store Service (SSS). The identity associated with the request is that of the managed account that runs the user code proxy service. The SSS returns the vendor management system credentials that are mapped to the identity of the user code proxy service. 4. The BDC runtime retrieves the external content type metadata from the BDC metadata cache. If the metadata is not already in the cache, the BDC runtime retrieves the external content type metadata from the BDC service. The external content type metadata provides the information the BDC runtime needs to be able to interact with the external vendor management system. 5. The BDC runtime uses the vendor management logon credentials retrieved from the SSS to authenticateGenerated from CHM, not final book. Will be superseded in the future. Page 43
• 44. to the service and access data from the external vendor management system.
The SharePoint user (SPUser) context is available within the sandbox environment. As such, the credentials of the user are used to control access to SharePoint resources within the sandbox.
Note: For more information about creating external content types, see How to: Create External Content Types on MSDN.
Hybrid Execution with Custom Workflow Activities
Workflow logic is not executed synchronously in response to a user request. Instead, it is executed asynchronously by the workflow engine. This results in a significantly different execution model.
Within the sandbox environment, you can deploy a declarative workflow that defines connections between individual workflow activities. Many commonly used workflow activities are provided out-of-the-box by SharePoint 2010. The IT team can make additional workflow activities available to your declarative workflow by deploying custom, code-based workflow activities to the global assembly cache as full-trust solutions.
In order to make a full-trust workflow activity available to declarative workflows, the IT team must add an authorizedType entry to the Web.config file for the content Web application. This gives your custom activity the same status as the built-in workflow activities that come with SharePoint 2010. The following code example shows the format of an authorizedType entry.
XML
<configuration>
  <System.Workflow.ComponentModel.WorkflowCompiler>
    <authorizedTypes>
      <authorizedType Assembly="..."
                      Namespace="..."
                      TypeName="*"
                      Authorized="True" />
    </authorizedTypes>
  </System.Workflow.ComponentModel.WorkflowCompiler>
</configuration>
When you add an authorized type, set the Assembly attribute to the strong name of your assembly and set the Namespace attribute to the fully qualified namespace of your activity class. For an example of how to use a feature receiver class to add an authorized type entry for a custom workflow activity, see the Workflow Activities Reference Implementation.
Note: As described earlier, you can also create and deploy custom sandboxed workflow actions. These actions run within the constraints of the sandbox environment. If you need to take advantage of capabilities outside the sandbox, you must deploy your custom workflow activities as full-trust solutions.
The workflow engine itself always runs in full trust, regardless of whether you deploy your workflow as a sandboxed solution or a full-trust solution. When you deploy a declarative workflow to the sandbox environment, it simply specifies the rules that determine how execution will proceed through the full-trust activities in the workflow.
The following illustration shows the key components involved in workflow execution. The declarative workflow is loaded from a sandbox solution, whereas custom full-trust workflow activities are loaded from the global assembly cache. However, the workflow is executed entirely in a full-trust environment.
Hybrid execution with a custom workflow activity
  • 45. Note: The declarative workflow is defined as part of the sandboxed solution, but it always executes in a full-trust process such as Owstimer.exe, W3wp.exe, or the user code proxy process. Generally, the process in which the workflow runs is determined by where the workflow is initiated or where an action is taken that causes the workflow to be "rehydrated" from the database. There are some performance mechanisms that can push execution into the timer process under high load conditions. A full-trust custom activity included in the declarative workflow also runs in a full-trust process.
Declarative workflows cannot be moved between SharePoint Foundation and SharePoint Server. In general, you can create equivalent workflows for each environment, although there are, of course, more activities available for SharePoint Server. Declarative workflows are managed slightly differently on each platform, and the workflow is packaged and deployed with the expectation that the server version is the same. You must develop workflows on the same SharePoint version that you expect them to run on in production.
Generated from CHM, not final book. Will be superseded in the future. Page 45
  • 46. What Can I Do with Hybrid Solutions?Hybrid approaches enable you to bypass the code access security policies and the limited permission set of thesandbox environment. When you create a full-trust proxy or a custom workflow activity, your code runs with fulltrust and there are no restrictions on what it can do. Any resources consumed by the full-trust code are notcounted against the sandbox resource limits of the site collection. However, by definition, hybrid approachesrequire that your full-trust code is invoked from the sandbox environment. Because sandboxed logic can only runin the context of pages, event receivers, or workflows, hybrid approaches are inherently inappropriate for otherapplication types such as timer jobs or service applications. In these cases, you must look to a full-trust solution.How Do I Manage Hybrid Solutions?When you design a SharePoint application to use a hybrid execution model, you will use two or more solutionpackages to deploy the components to the SharePoint environment. C omponents that target the sandbox aredeployed as sandboxed solutions to a site collection solution gallery, and components that run with full trust aredeployed as farm solutions to the server environment.For example, suppose your solution includes a Web Part that uses an external list to query a LOB application.Behind the scenes, the external list relies on an external content type to provide an interaction with the LOB datasource. The following illustration shows an example where the Web Part is deployed in a sandboxed solution,while the external content type is deployed as a farm solution.Hybrid approach with a Web Part and an external content typeAlternatively, suppose your Web Part uses a full trust proxy to access parts of the object model that areinaccessible to sandboxed code. The following illustration shows an example where the Web Part is deployed in asandboxed solution, while the full trust proxy is deployed as a farm solution.Hybrid approach with a Web Part and a full trust proxyFrom an administrative perspective, these types of deployment are managed as two separate solutions. Thesandboxed solution is subject to the monitoring, resource throttling, and permission limitations of the sandboxenvironment, while the farm solution is subject to any organizational constraints on the deployment of full trustcode.What Are the Core Issues for Hybrid SolutionsGenerated from CHM, not final book. Will be superseded in the future. Page 46
  • 47. When you design your applications to use a hybrid execution model, you will deploy components in both sandboxed solutions and full-trust solutions. As such, you need to consider the issues that relate to each individual solution type in addition to those issues that apply specifically to hybrid approaches. When you develop a hybrid solution, you should pay particular attention to the areas described in the following sections.
Security
Hybrid solutions expose a smaller surface area of full-trust code compared to farm solutions. This can reduce the amount of security review time that you require before deploying your solution. Because some of your solution code runs in full trust, you can impersonate the application pool (in other words, elevate permissions) if necessary. However, the boundaries of the sandbox environment were carefully designed when SharePoint 2010 was developed. You should consider the impact of any full-trust functionality that you expose to the sandbox, because this code runs without the security restrictions that apply to the sandbox environment.
Deployment
If you want to use a hybrid approach, your organization must permit you to deploy farm solutions to your SharePoint environment. If you do not have permission to deploy assemblies and other resources to the server environment, you cannot deploy a hybrid solution. Organizations may be more permissive about external content types, because you can create an external content type from SharePoint Designer without deploying any managed code.
Capabilities
It is important to understand which components of your solution can run within the sandbox environment and which components require a full-trust proxy. The full-trust proxy should include only those elements that need to execute with full trust; the other elements should remain within the sandboxed solution. This helps to minimize the surface area of code that you expose to performance or security vulnerabilities. It can also help to reduce the time required for code review, as described in the preceding paragraphs.
Logging and Configuration
Full-trust proxies can provide a useful way to expose logging and configuration functionality to sandboxed applications. For example, the SharePoint Logger component includes a full-trust proxy that enables sandboxed applications to write events to the Windows Event log and the Unified Logging Service (ULS) trace log.
Similarly, a full-trust proxy can provide a sandboxed application with access to configuration settings that would otherwise be unavailable. Full-trust proxies can read and write any data within the Web application and can read data stored in the SPFarm object. However, if you create a full-trust proxy to expose additional configuration functionality to sandboxed solutions, take care to include safeguards against improper use of that functionality. For example, developers risk corrupting the content or configuration database if they attempt to persist a non-serializable object to a property bag. In this case, it would be wise to include functionality within the full-trust proxy to verify that the object can be serialized before you proceed with the operation (a minimal sketch of such a check appears at the end of this topic).
Stability
Because full-trust proxies are deployed to the global assembly cache, they are not subject to the resource throttling and monitoring constraints that are applied to sandboxed solutions. It is important to verify that the code in your full-trust proxy performs to a high standard.
For example, ensure that your code does not cause excessive memory use or process timeouts, just as you would for code in a farm solution. This can help to ensure that your full-trust proxy does not jeopardize the stability of the farm as a whole.
Performance (Throughput)
As with sandboxed solutions, there is a marginal reduction in performance when you use a hybrid solution instead of a farm solution. This is because of the marshaling of data across application domains.
Generated from CHM, not final book. Will be superseded in the future. Page 47
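The serialization safeguard suggested in the Logging and Configuration section above is not listed in the guidance. The following is a minimal sketch, assuming a hypothetical proxy helper that writes a value to a Web application property bag; the BinaryFormatter round-trip is simply one way to confirm that the value can actually be persisted.
C#
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;
using Microsoft.SharePoint.Administration;

public static class ConfigurationProxyHelper
{
    // Hypothetical helper: persist a value into a Web application property bag,
    // but only after confirming that the value can be serialized.
    public static void SetWebApplicationProperty(SPWebApplication webApp, string key, object value)
    {
        if (webApp == null) throw new ArgumentNullException("webApp");
        if (string.IsNullOrEmpty(key)) throw new ArgumentNullException("key");

        if (value != null && !IsSerializable(value))
        {
            throw new ArgumentException(
                "The supplied value cannot be serialized and will not be persisted.", "value");
        }

        webApp.Properties[key] = value;
        webApp.Update();
    }

    private static bool IsSerializable(object value)
    {
        // The [Serializable] check is cheap; the round-trip catches object graphs
        // that declare the attribute but still fail to serialize.
        if (!value.GetType().IsSerializable) return false;

        try
        {
            using (var stream = new MemoryStream())
            {
                new BinaryFormatter().Serialize(stream, value);
            }
            return true;
        }
        catch (Exception)
        {
            return false;
        }
    }
}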
  • 48. Conclusion
This section reviewed the different ways you can deploy and run your custom solutions in SharePoint 2010. It described two key aspects of logic execution in SharePoint 2010: the execution environment and the execution logic. The execution environment is the security and processing boundary that contains your running code, while the execution logic is the means by which your code actually gets invoked.
Together, the decisions you make about execution environment and execution logic form the execution model for your solution. Execution models in SharePoint 2010 fall into three categories: farm solutions, sandboxed solutions, and hybrid approaches. This section provided a detailed insight into each of these models, including their advantages, limitations, functionality, and manageability.
The SharePoint Guidance Library includes several reference implementations that illustrate different execution models. It is recommended that you deploy these implementations in order to explore practical examples of the concepts described in this section.
Generated from CHM, not final book. Will be superseded in the future. Page 48
  • 49. Reference Implementation: The Sandbox ExecutionModelMany rich SharePoint applications can run within the confines of the sandbox. One of the most common scenariosfor custom development in a SharePoint environment is the aggregation of data from several different lists withinthe same site collection. The Sandbox Reference Implementation (Sandbox RI) uses this scenario to illustrate areasonably complex application that can run as a sandboxed solution.The implementation highlights techniques and best practices in the following areas:  It demonstrates effective use of feature partitioning for reliable solution deployment.  It demonstrates the use of query objects to retrieve data from multiple lists.  It demonstrates the use of the Model-View-Presenter (MVP) pattern in a Web Part to isolate business logic from the presentation layer and the underlying data source.  It demonstrates the use of the C onstructor Injection pattern to isolate business logic from dependencies.  It demonstrates the use of an Exception Shielding pattern to stop unhandled Web Part errors that prevent the host page from loading.Solution ScenarioIn this example, suppose you are providing consultancy services to a pharmaceutical company named C ontosoInc. C ontoso has production plants in several locations, each of which has several departments, including Design,Maintenance, and C onstruction. Each department has a separate team site on your SharePoint 2010 intranetportal within the production plant site collection. Among other things, each department uses its individual teamsites to keep records of statements of work (SOWs) and cost estimates provided to clients tracked against acentral list of projects for the plant. Each project can have one or more SOWs associated with it.The general manager for the Springfield production plant, Janet Schorr, wants to be able to monitor the progressof SOWs and estimates across all departments within the plant. However, the central IT team at C ontosoheadquarters is reluctant to permit farm solution deployments. To meet Janets requirements, you implement asandboxed solution that retrieves details of SOWs and estimates from each team site. The solution presents keydetails of these SOWs and estimates in a Web Part on the landing page of the Manufacturing site collection, asshown in the following illustration.The Aggregate View Web PartDeploying the Sandbox RIThe Sandbox Reference Implementation (Sandbox RI) includes an automated installation script that creates a sitecollection, deploys the reference implementation components, and adds sample data. After running theinstallation script, browse to the new Manufacturing site collection at http://<Hostname>/sites/Manufacturing. Youcan open and run the solution in Visual Studio, but this does not create a site collection or add sample data. Tosee the system fully functioning, you must run the installation script. The following table summarizes how to getstarted with the Sandbox RI. Note: The Auto-retract after debugging property is disabled for this Visual Studio project. This prevents Visual Studio from retracting the solution if you choose to step through the code. For more information about auto-retract, see Developing SharePoint Solutions.Question AnswerGenerated from CHM, not final book. Will be superseded in the future. Page 49
  • 50. Where can I find the <install location>SourceExecutionModelSandboxedSandbox RI?What is the name of the ExecutionModels.Sandboxed.slnsolution file?What are the system SharePoint Foundation 2010requirements?What preconditions are  You must be a member of SharePoint Farm Admin.required for installation?  You must be a memberof the Windows admin group.  SharePoint must be installed at http://<Hostname:80>. If you want to install to a different location you can edit these settings in the Settings.xml file located in the Setup directory for the solution.  SharePoint 2010 Administration service must be running. By default, this service is set to a manual start. To start the service, click Start on the taskbar, point to Administrative Tools, click Serv ices, double-click SharePoint 2010 Administration serv ice, and then click Start.How do I Install the Follow the instructions in the readme file located in the project folder.Sandbox RI?What is the default http://<Hostname>/sites/ Manufacturinginstallation location? (This location can be altered by changing the Settings.xml in the Setup directory.)How do I download the The Sandbox RI is included in the download Developing Applications forSandbox RI? SharePoint 2010.Generated from CHM, not final book. Will be superseded in the future. Page 50
  • 51. Solution OverviewThis topic provides a high-level overview of the various components that make up the Sandbox ReferenceImplementation (Sandbox RI). It does not examine the design of the solution or the implementation details of theWeb Part, both of which are described later in this guidance. Instead, it illustrates how the referenceimplementation works at a conceptual level.The Sandbox RI consists of various components, as shown in the following illustration.Conceptual overview of the Sandbox RIThe next sections describe each of these components in more detail.Libraries and Content TypesEach team site includes a document library named Estimates with two assigned content types:  SOW (statement of work). This content type includes a Microsoft Word document template that defines a standardized format for a statement of work, together with various site column references for the SOW.  Estimation. This content type includes a Microsoft Excel document template that facilitates budget calculations, together with various site column references for the estimate.The root site for the plant site collection includes a list named Projects. Every SOW and estimation must be linkedto a project record.Site ColumnsThe SOW and Estimation content types both include the following site columns:  SOW Status. This is used to indicate the progress of the SOW and the estimation through the sales process. This is a choice field with the values Draft, Submitted, and Approved.  Estimate Value. This is a currency field that represents the estimated value of the proposal to the organization.  Projects Lookup. This is a lookup field that is used to link the SOW or estimation to a project record in the Projects list. The lookup returns the Title field of the project record.  Client Name. This is a text field that is used to record the name of the client.  Client ID. This is a text field that represents a unique identifier for the client.Generated from CHM, not final book. Will be superseded in the future. Page 51
  • 52. Web Part
The AggregateView Web Part uses the SPSiteDataQuery class to retrieve data from the Estimates library in each team site. The Web Part uses an ASP.NET GridView control to present the data. This enables the user to view a tabular summary of SOWs and estimations.
Note: The AggregateView Web Part uses the ASP.NET GridView control because the SharePoint equivalent, SPGridView, is not available in the sandbox environment.
Generated from CHM, not final book. Will be superseded in the future. Page 52
  • 53. Solution Design
To implement the solution as described in Solution Overview, the solution package needs to perform various deployment tasks. The following diagram shows these deployment tasks together with the order in which they must be performed.
Deployment tasks and task ordering
The dependencies between these tasks are fairly intuitive:
  The site columns include a lookup field that retrieves metadata from the Projects list, so you must deploy the Projects list before you can deploy the site columns.
  The SOW and Estimation content types include site columns and document templates, so you must deploy the site columns and the document templates before you can create the content types.
  To associate the content types with the Estimates libraries, you must first create both the content types and the libraries.
Note: The deployment of the Web Part does not directly depend on the deployment of any other components. Although the Web Part will not do anything useful until these components are in place, there is nothing to prevent you from deploying it up front.
In order to manage these tasks and dependencies, the ExecutionModels.Sandboxed solution consists of three features, as shown in the following diagram. The arrows represent feature activation dependencies (a sketch of how such a dependency is declared follows below). For more information about feature activation dependencies, see Activation Dependencies and Scope on MSDN.
Solution design for the Sandbox RI
Generated from CHM, not final book. Will be superseded in the future. Page 53
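The feature manifests themselves are not shown at this point in the guidance. As an illustrative sketch only, an activation dependency between two of the features described in the next topic could be declared in the dependent feature's Feature.xml along these lines; both GUIDs shown are placeholders, not values taken from the reference implementation.
XML
<Feature xmlns="http://schemas.microsoft.com/sharepoint/"
         Id="00000000-0000-0000-0000-000000000000"
         Title="EstimateCTs"
         Scope="Site">
  <ActivationDependencies>
    <!-- Placeholder GUID: the ID of the ProjectsList feature that must be active first. -->
    <ActivationDependency FeatureId="11111111-1111-1111-1111-111111111111" />
  </ActivationDependencies>
</Feature>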
  • 54. Note:When you use Visual Studio 2010 SharePoint tools to package a solution, be aware that features are installedin the order in which they appear in the Package Designer. If your features include activation dependencies,make sure that you order your features correctly. In this solution, the ProjectsList feature appears at the top ofthe list, followed by the EstimateC Ts feature, followed by the EstimatesInstance feature. For more informationabout feature ordering, see How to: C hange Deployment Order.These three features perform the following actions:  ProjectsList. This feature creates an instance of the Projects list on the root site of the site collection. It also adds the AggregateView Web Part to the Web Part gallery on the site collection.  EstimateCTs. This feature deploys the document templates and the new site columns to the site collection. When the feature is activated, the event receiver class uses the document templates and the site columns to create the SOW and Estimation content types. Programmatic creation of content types is new in SharePoint 2010.  EstimatesInstance. This feature creates an instance of the Estimates document library on each subsite. When the feature is activated, the event receiver class associates the SOW and Estimation content types with each Estimates library.The next topics in this section describe these components in more detail.Generated from CHM, not final book. Will be superseded in the future. Page 54
  • 55. Web Part DeploymentYou can deploy Web Parts within sandboxed solution packages using the same approach that you would use forfull-trust Web Part deployment. Note that the SharePoint tools in Visual Studio 2010 actually automatically createthe XML artifacts described in this topic. However, it is useful to understand what happens behind the scenes. Ifyou are an advanced user, you may also choose to manually edit these XML artifacts to fine tune your solution.First, note that the solution includes a .webpart file that defines how the Web Part is listed in the Web Part gallery.The following code example shows the contents of the Sandbox-AggregateView.webpart file.XML<webParts> <webPart xmlns="http://schemas.microsoft.com/WebPart/v3"> <metaData> <type name="ExecutionModels.Sandboxed.AggregateView.AggregateView,$SharePoint.Project.AssemblyFullName$" /> <importErrorMessage>$Resources:core,ImportErrorMessage;</importErrorMessage> </metaData> <data> <properties> <property name="Title" type="string">P&amp;P SPG V3 - Execution Models SandboxAggregate View</property> <property name="Description" type="string">AggregateView Web Part - SandboxExecution Model.This web part shows aggregating multiple lists in the same site collection.</property> </properties> </data> </webPart></webParts>Essentially, the .webpart file points to the type and assembly of the Web Part and provides a title and descriptionto display within the gallery.Next, the solution includes a feature manifest to deploy the .webpart file to the Web Part gallery in your sitecollection. The following code example shows the contents of the Elements.xml file for the AggregateView WebPart.XML<Elements xmlns="http://schemas.microsoft.com/sharepoint/" > <Module Name="AggregateView" List="113" Url="_catalogs/wp"> <File Path="AggregateViewSandbox-AggregateView.webpart" Url="Sandbox-AggregateView.webpart" Type="GhostableInLibrary"> <Property Name="Group" Value="P&amp;P SPG V3" /> </File> </Module></Elements>In feature manifest files, modules are used to deploy one or more files to a specified location. In the Moduleelement, the List="113" attribute value identifies the deployment target as a Web Part gallery, and the Url="_catalogs/wp" attribute value is the site-relative URL of the Web Part gallery. The Module element contains aFile element for the .webpart file. In the F ile element, the Path attribute specifies the physical location of the filewithin your feature, and the Url attribute specifies a virtual path for the file. The Type="GhostableInLibrary"attribute indicates that the deployment target is a type of document library and that SharePoint should create aparent list item for the file. Using this approach, the file is stored in the content database. Note:For more information about the Module element schema, see Modules on MSDN.Although the feature manifest for a sandboxed module is no different to the manifest for any other module,SharePoint processes feature manifests in sandboxed solutions differently. In contrast to a farm solutiondeployment, feature files in sandboxed solutions are not deployed to the TEMPLATEFEATURES folder in theSharePoint root on each Web Front End server. Instead, they remain in the solution package and are retrievedfrom the content database when required.Generated from CHM, not final book. Will be superseded in the future. Page 55
  • 56. Finally, you must ensure your Web Part assembly is included in the solution manifest file. The following codeexample shows the contents of the solution manifest for the ExecutionModels.Sandboxed solution, edited forreadability.XML<Solution xmlns="..." SolutionId="..." SharePointProductVersion="14.0"> <Assemblies> <Assembly Location="ExecutionModels.Common.dll" DeploymentTarget="GlobalAssemblyCache" /> <Assembly Location="ExecutionModels.Sandboxed.dll" DeploymentTarget="GlobalAssemblyCache"> <SafeControls> <SafeControl Assembly="ExecutionModels.Sandboxed, ..." Namespace="ExecutionModels.Sandboxed.AggregateView" TypeName="*" /> </SafeControls> </Assembly> </Assemblies> <FeatureManifests> <FeatureManifest Location="...ProjectsListFeature.xml" /> <FeatureManifest Location="...EstimateCTsFeature.xml" /> <FeatureManifest Location="...EstimatesInstanceFeature.xml" /> </FeatureManifests></Solution>As you can see, the solution manifest file looks broadly the same as the manifest for a farm solution. There aretwo key points to note in this example:  The manifest specifies that the assemblies should be deployed to the global assembly cache. Sandboxed assemblies are not deployed to the global assembly cache, but the Assembly elements must specify DeploymentTarget="GlobalAssemblyCache" regardless. Sandboxed assemblies are actually stored in the content database within the SharePoint solution package (WSP) and are loaded as required by the user code worker process.  The manifest includes a SafeControl entry for the Web Part type. When you deploy a sandboxed solution, safe control entries are not added to the Web.config file. However, SharePoint verifies these entries behind the scenes when your sandboxed solution is loaded. If you do not include a SafeControl entry for your Web Part type, the Web Part will fail to load. Note:If you used the Visual Studio 2010 SharePoint tools to build your solution, the solution manifest is automaticallygenerated by default.Generated from CHM, not final book. Will be superseded in the future. Page 56
  • 57. List Instances
The ExecutionModels.Sandboxed solution deploys two list instances: the Projects list and the Estimates library. The Projects list is based on the custom list template, which by default includes a single field named Title. The Estimates library is a standard document library. You can deploy list instances by creating a ListInstance element within a feature manifest file. The SharePoint tools in Visual Studio 2010 will automatically generate the feature manifest file for you, but it is useful to understand how it works behind the scenes.
The following code example shows the feature manifest for the Projects list.
XML
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <ListInstance Id="{877CD3C2-DDE1-4EF3-82BA-4367D2FC079B}"
                Title="Projects"
                OnQuickLaunch="TRUE"
                TemplateType="100"
                FeatureId="00bfea71-de22-43b2-a848-c05709900100"
                Url="Lists/Projects"
                Description="">
  </ListInstance>
</Elements>
The following code example shows the feature manifest for the Estimates list.
XML
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <ListInstance Title="Estimates"
                OnQuickLaunch="TRUE"
                TemplateType="101"
                FeatureId="00bfea71-e717-4e80-aa17-d0c71b360101"
                Url="Lists/Estimates"
                Description="">
  </ListInstance>
</Elements>
Note that the schema for list instances is the same regardless of whether you are deploying a sandboxed solution or a farm solution. The Projects list instance is deployed by a site-scoped feature; because of this, the Projects list is created in the root site of the site collection. The Estimates list instance is deployed by a Web-scoped feature; because of this, the Estimates library is created on every site in the site collection where the feature is activated. The key point of interest in the feature manifest files is the TemplateType attribute. A template type of 100 indicates that SharePoint should create a custom list, and a template type of 101 indicates that SharePoint should create a document library.
Note: You can also declaratively associate content types with a list instance within the feature manifest file (a sketch follows below). However, this implementation demonstrates programmatic creation of content types in the feature receiver class, because this functionality is new to SharePoint 2010.
For more information about the ListInstance element schema, see ListInstance Element on MSDN.
Generated from CHM, not final book. Will be superseded in the future. Page 57
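The reference implementation associates content types with the lists programmatically, so no declarative binding is included in the solution. For comparison only, a declarative association could look roughly like the following; the content type ID is a placeholder, not the actual SOW content type ID from the implementation.
XML
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <!-- Placeholder content type ID; ListUrl is relative to the web where the feature is activated. -->
  <ContentTypeBinding ContentTypeId="0x0101..." ListUrl="Lists/Estimates" />
</Elements>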
  • 58. Site ColumnsThe ExecutionModels.Sandboxed solution deploys five site columns for use by the SOW content type and theEstimation content type. Site columns are defined by F ield elements within a feature manifest file. As with allfeature manifests, the contents of the manifest file remain the same regardless of whether you target asandboxed solution or a farm solution. For example, the following code shows the definition of the SOW Statussite column. This is taken from the Elements.xml file for the SiteC olumns project item.XML<Field ID="{91EBB5B9-D8C5-43C5-98A2-BCB1400438B7}" Name="SOWStatus" DisplayName="SOW Status" StaticName="SOWStatus" DisplaceOnUpgrade="TRUE" Group="SiteColumns" Type="Choice" Format="Dropdown"> <CHOICES> <CHOICE>Draft</CHOICE> <CHOICE>Submitted</CHOICE> <CHOICE>Approved</CHOICE> </CHOICES></Field>The site columns include a column named Projects Lookup that retrieves data from the Projects list, as shown inthe following code example. Because this column retrieves data from the Projects list, you must make sure thatthe Projects list is in place before you deploy the lookup column. This is achieved through the feature activationdependencies described in Solution Design.XML<Field ID="{F52FAC8A-7028-4BE1-B5C7-2A316AB1B88E}" Name="ProjectsLookup" DisplayName="Projects Lookup" StaticName="ProjectsLookup" Group="SiteColumns" DisplaceOnUpgrade="TRUE" Type="Lookup" ShowField="Title" WebId="" List="Lists/Projects"></Field>The ability to create lookup fields declaratively is a new feature in SharePoint 2010. For more information aboutthe F ield element schema, see Field Definition Schema on MSDN.Generated from CHM, not final book. Will be superseded in the future. Page 58
  • 59. Event ReceiversYou can use event receivers within sandboxed solutions to handle events on list items, lists, individual sites, andfeatures. The ExecutionModels.Sandboxed solution includes two event receivers that handle theFeatureActivated event:  EstimateCTs.Ev entReceiver. This class programmatically creates the SOW and Estimation content types when the EstimateCTs feature is activated.  EstimatesInstance.Ev entReceiver. This class associates the SOW and Estimation content types with each instance of the Estimates library when the EstimatesInstance feature is activated.The next sections describe these feature receiver classes.EstimateCTs Event ReceiverThe EstimateC Ts feature deploys site columns and document templates to the site collection. Because the SOWand Estimation content types rely on these columns and templates, it makes sense to use a feature receiver tocreate the content types when the feature is activated, because you can be certain that the prerequisites for thecontent types are in place at this point. It is also easier to modify the content types in the case of an upgradewhen you use this approach. At a high level, the EstimateCTs.F eatureActivated method performs thefollowing actions for each content type: 1. It creates a new content type in the ContentTypes collection of the root SPWeb object. 2. It links the site columns to the content type. 3. It adds the document template to the content type. 4. It calls the Update method on the content type to persist changes to the database.This process is illustrated by the following extract from the F eatureActivated method. The code has beensimplified for readability.C#SPSite site = properties.Feature.Parent as SPSite;using (SPWeb web = site.RootWeb){ SPContentType sowContentType = web.ContentTypes[Constants.sowContentTypeId]; if (sowContentType == null) { sowContentType = new SPContentType(Constants.sowContentTypeId, web.ContentTypes, Constants.sowContentTypeName); web.ContentTypes.Add(sowContentType); } sowContentType.DocumentTemplate = string.Concat(web.Url, Constants.sowTemplatePath); AddFieldsToContentType(web, sowContentType, fieldsToAdd); sowContentType.Update(true);}The process is repeated for the Estimation content type. The AddF ieldsToContentType method links sitecolumns to the content type using the field ID values defined in the feature manifest for the site columns. Formore details, see the EstimateCTs.Ev entReceiver.cs class in the reference implementation.There are various tradeoffs to consider when you choose whether to create content types declaratively orprogrammatically. You cannot update a content type that has been created declaratively—all content types mustbe upgraded programmatically. However, declarative approaches are simple; therefore, they are a better choiceif your content types are not expected to evolve or change.EstimatesInstance Event ReceiverThe EstimatesInstance feature creates an instance of the Estimates document library on every site in which thefeature is activated. The Estimates library exists solely to store documents based on the SOW and Estimationcontent types. Because of this, it makes sense to use a feature receiver to associate the content types with eachEstimates library when the feature is activated. At a high level, the EstimatesInstance.F eatureActivatedmethod performs the following actions: 1. It locates the Estimates list in the current site. 2. It enables content types on the Estimates list.Generated from CHM, not final book. Will be superseded in the future. Page 59
  • 60. 3. It adds the SOW and Estimation content types to the Estimates list.
 4. It removes the Document content type from the Estimates list.
The FeatureActivated method uses a helper method to add the content types to the list, as shown in the following code example.
C#
private static void AddContentTypeToList(SPContentTypeId spContentTypeId, SPList list, SPWeb web)
{
    var contentType = web.AvailableContentTypes[spContentTypeId];
    if (contentType != null)
    {
        list.ContentTypesEnabled = true;
        list.Update();
        if (!ListContains(list, spContentTypeId))
        {
            list.ContentTypes.Add(contentType);
            list.Update();
        }
    }
}
Before the method attempts to add a content type, it first makes sure that content types are enabled on the list. If the ContentTypesEnabled property is false, any attempt to add content types to the list will fail. The method then checks to see whether the list already contains the content type. ListContains is a simple helper method, as shown in the following code example.
C#
static bool ListContains(SPList list, SPContentTypeId id)
{
    var matchId = list.ContentTypes.BestMatch(id);
    return matchId.IsChildOf(id);
}
Essentially, this method retrieves the content type ID from the list that is closest to the ID of the new content type. It then checks to see whether the closest match is a child content type of the content type you are attempting to add. This is necessary because of the way that SharePoint manages content type IDs. When you copy a site content type to a list, SharePoint gives the list content type a new ID in the form site content type ID + "00" + 32-character hexadecimal GUID. If the closest match to your site content type ID in the Estimates list is a "child" of your site content type, your content type has already been added to the list. For more information on content type IDs, see Content Type IDs on MSDN.
Generated from CHM, not final book. Will be superseded in the future. Page 60
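The AddFieldsToContentType helper called by the EstimateCTs receiver earlier in this topic is not listed in the guidance. As a rough sketch only, a helper of that shape might link existing site columns to a content type along the following lines; the parameter names and the use of a GUID array are assumptions rather than the actual implementation.
C#
private static void AddFieldsToContentType(SPWeb web, SPContentType contentType, Guid[] fieldIds)
{
    // Link each deployed site column to the content type by its field ID.
    foreach (Guid fieldId in fieldIds)
    {
        if (contentType.FieldLinks[fieldId] == null)
        {
            SPField field = web.AvailableFields[fieldId];
            contentType.FieldLinks.Add(new SPFieldLink(field));
        }
    }
    // The caller persists the changes by calling Update on the content type.
}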
  • 61. Web Part Design
The ExecutionModels.Sandboxed solution illustrates various best practices for the design of sandboxed Web Parts in SharePoint 2010. The following class diagram shows the structure of the AggregateView Web Part. This design is an example of the Model-View-Presenter (MVP) pattern, which is designed to isolate your business logic from the details of your user interface (UI).
Class diagram for the Sandbox RI
The AggregateViewPresenter class represents the Presenter component in the MVP pattern. This class performs the following tasks:
  It retrieves a DataTable from the model, represented by the IEstimatesService.
  It sets the DataTable as the data source in the view, represented by IAggregateView.
The AggregateView class represents the View component in the MVP pattern. This class is the actual Web Part. This class performs the following tasks:
  It instantiates the Presenter object, represented by the AggregateViewPresenter. To do this, it creates an instance of the Model, represented by EstimatesService, and then it constructs the AggregateViewPresenter, passing in itself as the View and the EstimatesService as the Model. This is an example of constructor injection.
  It renders the data supplied by the presenter to the UI.
Finally, the EstimatesService class represents the Model component in the MVP pattern. This class performs the following tasks:
  It executes a query to retrieve data from the Estimates list on each subsite.
  It returns the data to the caller in a DataTable.
The use of the MVP pattern increases the modularity, flexibility, and testability of the application. If you want to display the data differently, you can modify or replace the view, without changing any of the business logic, by providing an alternative implementation of IAggregateView. In other words, you can create a view that displays the data in any way you want, as long as it exposes a public write-only property of type DataTable named SetSiteData. Similarly, if you change the way you store your SOWs and estimations, you can provide an alternative implementation of IEstimatesService without editing the view or the presenter. Finally, the design makes it easy to test your presenter logic by providing mock implementations of IEstimatesService and IAggregateView.
The following illustration shows how execution passes between the view, model, and presenter classes.
Generated from CHM, not final book. Will be superseded in the future. Page 61
  • 62. Flow of execution in the Sandbox RI
It is important to note that the view is relatively passive and is entirely driven by the presenter logic. The view class simply provides a write-only property setter that the presenter can use to set the data source for the view.
Generated from CHM, not final book. Will be superseded in the future. Page 62
  • 63. View ClassesIn terms of contracts, the sole responsibility of a view class is to expose a public property that enables apresenter class to insert data into the view. What the view class does with the data is of no concern to thepresenter. Because of this, the interface that underpins the view class in the Sandbox Reference Implementation(Sandbox RI) simply defines a write-only property of type DataTable.C#public interface IAggregateView{ DataTable SetSiteData { set; }}In this case, the view class should be a Web Part. In addition to implementing the IAggregateView interface,the AggregateView class must inherit from the abstract WebPart class. This class provides the functionalitythat enables the AggregateView class to plug into the SharePoint Web Part framework.C#public class AggregateView : WebPart, IAggregateViewBecause the Web Part provides the entry point for the application, the AggregateView class must instantiatethe Presenter class. You can do this in the CreateChildControls method, which is called early in the page lifecycle before the Web Part is rendered. You can then call the SetSiteData method on the Presenter object,which invokes the presenter logic.C#private AggregateViewPresenter presenter;protected override void CreateChildControls(){ base.CreateChildControls(); // Configure the grid view. presenter = new AggregateViewPresenter(this, new EstimatesService()); presenter.SetSiteData(); Controls.Add(gridView); IErrorVisualizer errorVisualizer = new ErrorVisualizer(this); presenter.ErrorVisualizer = errorVisualizer;} Note:The ErrorVisualizer class is a Web control that the presenter class uses to display exception information if anunhandled exception is caught. This is part of an exception shielding strategy that stops unhandled Web Partexceptions that prevent the host page from loading. For more information, see Exception Shielding.The AggregateView class provides an implementation of the SetSiteData property setter that performs twotasks:  It extracts column names from the passed-in data table and creates corresponding data columns in the grid view.  It binds the grid view to the passed-in data table.This property setter is used by the presenter class to provide the view with a data source. The following codeexample shows the SetSiteData implementation in the AggregateView class.C#public DataTable SetSiteData{ set { PresenterUtilities.FormatGridDisplay(gridView, value); gridView.DataSource = value;Generated from CHM, not final book. Will be superseded in the future. Page 63
  • 64. gridView.DataBind(); } Note:IsNotSystemColumn is a helper method that ensures column headers are not added for hidden systemcolumns such as index values.Finally, it is worth noting that the Presenter class simply requires a view object that implementsIAggregateView. In unit testing scenarios, you can instantiate the presenter class using a mock implementationof IAggregateView, such as the MockAggregateView class shown in the following code example.C#class MockAggregateView : IAggregateView{ public DataTable Data { get; set; } public DataTable SetSiteData { set { this.Data = value; } }}This ability to substitute a fake view class allows you to test your presenter logic in isolation, without anydependencies on the SharePoint environment or the implementation details of the user interface (UI). In theassert phase of your unit test, you can simply read the Data property of the MockAggregateView object toverify that the presenter class is sending valid data to the view.Generated from CHM, not final book. Will be superseded in the future. Page 64
  • 65. Presenter Classes
Presenter classes have one primary task: to retrieve data from the model and to send that data to the view. When you create a presenter class, you must pass in a view object and a model object. The following code example shows the constructor of the AggregateViewPresenter class.
C#
private IAggregateView view;
private IEstimatesService estimatesService;

public AggregateViewPresenter(IAggregateView view, IEstimatesService estimatesService)
{
    this.view = view;
    this.estimatesService = estimatesService;
}
The AggregateViewPresenter class has no knowledge of how the view class and the model class are implemented; it simply requires that they implement the specified interfaces:
  IAggregateView. This interface defines a single write-only property named SetSiteData that requires an object of type DataTable.
  IEstimatesService. This interface defines a single method named GetSiteData that returns an object of type DataTable.
In the AggregateViewPresenter class, the presenter logic is contained in a method named SetSiteData. In the reference implementation, this method is invoked by the view class. However, you could just as easily invoke this method from a unit test.
C#
public void SetSiteData()
{
    try
    {
        view.SetSiteData = estimatesService.GetSiteData();
    }
    catch (Exception ex)
    {
        // The exception shielding logic is removed from here for simplicity.
    }
}
As you can see, the presenter logic itself is extremely straightforward and consists of a single line of code. However, in many real-world examples, the presenter class will include substantial business logic and is likely to be larger and more complex.
Generated from CHM, not final book. Will be superseded in the future. Page 65
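The unit tests themselves are not included in this guidance. As an illustrative sketch only, a test of the presenter could combine the MockAggregateView class shown earlier with an assumed MockEstimatesService stub, roughly as follows; the MSTest attributes are used here purely as an example of a test framework.
C#
using System.Data;
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical stub for the model; returns a fixed DataTable instead of querying SharePoint.
class MockEstimatesService : IEstimatesService
{
    public DataTable GetSiteData()
    {
        var table = new DataTable();
        table.Columns.Add("SOWStatus", typeof(string));
        table.Columns.Add("EstimateValue", typeof(double));
        table.Rows.Add("Approved", 15000d);
        return table;
    }
}

[TestClass]
public class AggregateViewPresenterTests
{
    [TestMethod]
    public void SetSiteData_PassesModelDataToView()
    {
        // Arrange: wire the presenter to a mock view and a mock model.
        var view = new MockAggregateView();
        var presenter = new AggregateViewPresenter(view, new MockEstimatesService());

        // Act: run the presenter logic.
        presenter.SetSiteData();

        // Assert: the view received the data produced by the model.
        Assert.IsNotNull(view.Data);
        Assert.AreEqual(1, view.Data.Rows.Count);
        Assert.AreEqual("Approved", view.Data.Rows[0]["SOWStatus"]);
    }
}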
  • 66. Exception ShieldingThe ExecutionModels.Sandboxed solution demonstrates the use of an Exception Shielding pattern. This pattern isdesigned to trap unhandled exceptions at the system boundary before they are propagated to the hostenvironment. In the case of Web Parts, unhandled exceptions that occur in the Web Part code are propagated tothe page that hosts the Web Part, where they can prevent the page from loading. By trapping unhandledexceptions that occur at the Web Part boundary, you can make sure that problems in your Web Part do not causebroader problems in the host environment. This approach also enables you to troubleshoot and diagnose errormessages more quickly and to provide users with a friendly error message. Note:The Developing SharePoint Applications release describes exception handling strategies in detail. This topicprovides a brief summary of how these strategies are used within the ExecutionModels.Sandboxed solution. Formore information about how to implement exception shielding strategies for SharePoint applications, seeException Management in SharePoint on MSDN.In this case, the exception shielding functionality primarily consists of two key components:  The error visualizer is a Web control that adds itself to the child controls collection of the Web Part. It renders error messages as HTML markup.  The view exception handler writes unhandled exception messages to the event log and sends the exception message to the error visualizer. The view exception handler is invoked by the presenter class when the presenter logic fails.The rest of this topic describes how this works in the reference implementation. First, in theCreateChildControls method, the AggregateView class creates a new instance of the ErrorVisualizer class,passing a reference to itself as an argument.C#IErrorVisualizer errorVisualizer = new ErrorVisualizer(this);presenter.ErrorVisualizer = errorVisualizer;In the constructor, the ErrorVisualizer class adds itself to the Controls collection of the AggregateView WebPart.C#public ErrorVisualizer(Control hostControl, params Control[] childControls){ Validation.ArgumentNotNull(hostControl, "hostControl"); hostControl.Controls.Add(this); if (childControls == null) return; foreach (Control child in childControls) this.Controls.Add(child);} Note:Notice that the ErrorVisualizer constructor can also accept an array of child controls. Essentially, this enablesyou to contain all your Web Part content in the ErrorVisualizer object. The advantage of doing this is thatthe ErrorVisualizer will prevent this content from being rendered if an error occurs. However, theExecutionModels.Sandboxed solution does not use this approach.In the AggregateViewPresenter class, the SetSiteData method includes a catch block that traps unhandledexceptions. The catch block creates or retrieves an instance of ViewExceptionHandler, passing in theunhandled exception and the ErrorVisualizer instance provided by the view.C#public IErrorVisualizer ErrorVisualizer { get; set; }public ViewExceptionHandler ExceptionHandler { get; set; }public void SetSiteData(){ try {Generated from CHM, not final book. Will be superseded in the future. Page 66
  • 67. view.SetSiteData = estimatesService.GetSiteData(); } catch (Exception ex) { ViewExceptionHandler viewExceptionHandler = this.ExceptionHandler ?? new ViewExceptionHandler(); viewExceptionHandler.HandleViewException(ex, this.ErrorVisualizer); }}In the ViewExceptionHandler class, the HandleViewException method writes the exception message to theevent log and instructs the ErrorVisualizer instance to render the error message.C#ILogger logger = GetLogger(exception);logger.LogToOperations(exception, eventId, EventSeverity.Error, null);EnsureErrorVisualizer(errorVisualizer, exception);errorVisualizer.ShowDefaultErrorMessage();In the ErrorVisualizer class, the ShowDefaultErrorMessage method simply sets the value of a propertynamed ErrorMessage to the exception message. When the Web Part renders its child controls, the Rendermethod of the ErrorVisualizer instance is invoked. If an error message is present, the ErrorVisualizer rendersthe message. By omitting the call base.Render, this also ensures that any child controls that were passed to theErrorVisualizer are not rendered.C#protected override void Render(HtmlTextWriter writer){ if (string.IsNullOrEmpty(ErrorMessage)) { base.Render(writer); } else { RenderErrorMessage(writer, this.ErrorMessage); }}Generated from CHM, not final book. Will be superseded in the future. Page 67
  • 68. Data Models and Service Classes
The responsibility of a Model class is to interact with the underlying data source and to hide the details of that interaction from the presenter. The interface that underpins the Model class in the Sandbox Reference Implementation simply defines a single method, GetSiteData, that returns a DataTable object.
C#
public interface IEstimatesService
{
    DataTable GetSiteData();
}
The interface specifies nothing about the location or format of the data, so you can easily provide alternative implementations of IEstimatesService that retrieve data in different formats or from alternative locations. You can also easily create mock implementations of IEstimatesService to test the presenter logic.
In the Sandbox RI, the EstimatesService class provides a simple implementation of IEstimatesService. When the EstimatesService class is instantiated, it builds an SPSiteDataQuery object. The implementation of the GetSiteData method simply runs the query.
C#
public class EstimatesService : IEstimatesService
{
    private SPSiteDataQuery query;

    public EstimatesService()
    {
        query = new SPSiteDataQuery();
        query.Lists = "<Lists BaseType='1' />";
        query.ViewFields = "<FieldRef Name='SOWStatus' />" +
                           "<FieldRef Name='EstimateValue' />";
        query.Query = "<OrderBy><FieldRef Name='EstimateValue' /></OrderBy>";
        query.Webs = "<Webs Scope='SiteCollection' />";
    }

    public System.Data.DataTable GetSiteData()
    {
        SPWeb web = SPContext.Current.Web;
        return web.GetSiteData(query);
    }
}
Note: The SPSiteDataQuery object uses the Collaborative Application Markup Language (CAML) query syntax. For more information about the CAML query syntax, see Collaborative Application Markup Language Core Schemas on MSDN. There are also several community and third-party tools that you can use to automatically generate CAML code.
Generated from CHM, not final book. Will be superseded in the future. Page 68
  • 69. Conclusion
The Sandbox Reference Implementation (Sandbox RI) demonstrates best practice approaches to various aspects of sandboxed solution development. The key points illustrated by the Sandbox RI include the following:
  Deployment of list instances, site columns, content types, document templates, and Web Parts to the sandbox environment.
  Use of feature receiver classes within the sandbox environment, and organization of feature activation dependencies.
  Use of the SPSiteDataQuery object to aggregate data from multiple lists, which obviates the need to enumerate sites and lists.
  Use of various patterns and techniques, including Model-View-Presenter (MVP), Constructor Injection, and exception shielding, to improve the reliability, modularity, and testability of your code.
We recommend deploying the reference implementation and exploring the different components and code in the ExecutionModels.Sandboxed solution. For more information about the sandbox execution environment, see Execution Models in SharePoint 2010.
Generated from CHM, not final book. Will be superseded in the future. Page 69
  • 70. Reference Implementation: Full-Trust Proxies for Sandboxed Solutions
Full-trust proxies allow you to expose additional functionality to your SharePoint applications while maintaining many of the benefits of the sandbox environment. The Sandbox Reference Implementation (Sandbox RI) includes a sandboxed solution that retrieves statement of work (SOW) documents and estimations from multiple lists in a site collection. The solution includes a Web Part that displays a summary of the aggregated content in a central location. The Full-Trust Proxy Reference Implementation (Full-Trust Proxy RI) extends the list aggregation scenario to supplement the aggregated data with accounts payable information from a vendor system. The solution uses a full-trust proxy that connects to an enterprise resource planning (ERP) Web service and retrieves information about the current vendor.
The Full-Trust Proxy RI highlights details and best practices in the following areas:
  Creating, deploying, and registering a full-trust proxy
  Consuming a full-trust proxy from sandboxed code
  Deploying pages and client-side scripts to the sandbox environment
  Launching pages as modal dialogs from the sandbox environment
Note: This solution builds on the Sandbox Reference Implementation. The documentation for the Sandbox RI describes many issues, such as feature partitioning, exception shielding, and the use of various design patterns, that are not repeated in this topic. We recommend that you familiarize yourself with the documentation for the Sandbox RI before you review this topic.
Solution Scenario
In this example, suppose you are a SharePoint consultant working for a pharmaceutical company named Contoso Inc. You have already designed and implemented a sandboxed solution that aggregates list data from across the site collection for the Springfield manufacturing plant, as described in the Sandbox Reference Implementation. The solution includes a Web Part that displays an approval status and an estimated value for work items from various departments within the Springfield plant.
The head of the Springfield plant, Janet Schorr, now wants you to extend this solution. In addition to a summary of approval status and estimated value, Janet wants to be able to view details of the vendor associated with each work item. In particular, Janet wants to be able to view the amount owed (accounts payable) to the vendor to make sure the plant is balancing work across preferred vendors.
At present, this vendor information is stored in Contoso's ERP system. The ERP system exposes various Web services that allow external systems to retrieve and interact with financial data, including the retrieval of financial information related to vendors. To meet Janet's requirements, you implement a full-trust proxy that interacts with the ERP Web services for vendor data. You then modify your sandboxed solution to use this full-trust proxy to retrieve and display client details. The user can click the name of a vendor to launch a modal dialog that displays the amount currently owed to the vendor, as shown in the following illustrations.
Full Trust Proxy RI user interface
Generated from CHM, not final book. Will be superseded in the future. Page 70
  • 71. Deploying the Full-Trust Proxy RIThe Full-Trust Proxy RI includes an automated installation script that creates a site collection, deploys thereference implementation components, and adds sample data. After running the installation script, browse to thenew SpringfieldProxy site collection at http://<Hostname>/sites/SpringfieldProxy. You can open and run theproject in Visual Studio, but this does not create a site collection or add sample data. To see the system fullyfunctioning, you must run the installation script. The following table summarizes how to get started with theFull-Trust Proxy RI.Question AnswerWhere can I find the <install location>SourceExecutionModelProxyFull-Trust Proxy RI?What is the name of ExecutionModels.Sandboxed.Proxy.slnthe solution file?What are the system SharePoint Foundation 2010requirements?What preconditions  You must be a member of SharePoint Farm Admin.are required forinstallation?  You must be a memberof the Windows admin group.  SharePoint must be installed at http://<Hostname:80>. If you want to install to a different location you can edit these settings in the Settings.xml file located in the Setup directory for the solution.  SharePoint 2010 Administration service must be running. By default, this service is set to a manual start. To start the service, click Start on the taskbar, point to Administrative Tools, click Serv ices, double-click SharePoint 2010 Administration serv ice, and then click Start.How do I Install the Follow the instructions in the Readme.txt file located in the project folder.Full-Trust Proxy RI?What is the default http://<Hostname>/sites/SpringfieldProxyinstallation location? (This location can be altered by changing the Settings.xml in the Setup directory.)Generated from CHM, not final book. Will be superseded in the future. Page 71
  • 72. How do I download The Full-Trust Proxy RI is included in the download Developing Applications forthe Full-Trust Proxy SharePoint 2010.RI?Generated from CHM, not final book. Will be superseded in the future. Page 72
  • 73. Solution Overview
This topic provides a high-level conceptual overview of the Full-Trust Proxy Reference Implementation (Full-Trust Proxy RI). The solution consists of various sandboxed components and various full-trust components, as shown in the following illustration.
Conceptual overview of the Full-Trust Proxy RI
The Full-Trust Proxy RI consists of two solution packages, one for the sandboxed components and one for the full-trust proxy. The sandboxed components are deployed to the site collection solution gallery through the user interface (UI), while the full-trust proxy is deployed as a full-trust solution to the server environment. The implementation also includes a simple Windows Communication Foundation (WCF) service named VendorServices.
In the sandboxed solution, the Aggregate View Web Part renders vendor names as hyperlinks. When a user clicks a vendor name, a client-side JavaScript function launches the Vendor Details page within a modal dialog. The JavaScript function passes the vendor name to the details page as a query string parameter.
The Vendor Details page is a Web Part page that hosts the Vendor Details Web Part. The Vendor Details Web Part loads the vendor name from the page query string. The Web Part makes a call to the full-trust proxy to request the accounts payable associated with the current vendor name. The full-trust proxy calls the appropriate Vendor Web service method and then returns the outstanding balance to the Vendor Details Web Part.
Note: Why does the Full-Trust Proxy RI use a Web Part within a Web Part page to display vendor details within a modal dialog? It might seem more straightforward to use a simple ASCX user control. However, deployment constraints prevent the Full-Trust Proxy RI from deploying application pages or user controls to the sandbox environment.
The WCF service, VendorServices, is a simple proof-of-concept Web service that exposes a single method named GetAccountsPayable. This method accepts a string vendor name and returns a random double to represent what is owed to the vendor.
Generated from CHM, not final book. Will be superseded in the future. Page 73
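The VendorServices contract itself is not listed in this overview. A minimal sketch of a WCF contract matching the description above might look like the following; the namespace and interface name are assumptions rather than the names used in the reference implementation.
C#
using System;
using System.ServiceModel;

namespace Contoso.VendorServices
{
    // Hypothetical contract: a single operation that accepts a vendor name
    // and returns the amount currently owed to that vendor.
    [ServiceContract]
    public interface IVendorService
    {
        [OperationContract]
        double GetAccountsPayable(string vendorName);
    }

    // Proof-of-concept implementation that returns a random balance,
    // which is how the reference implementation's service is described as behaving.
    public class VendorService : IVendorService
    {
        private static readonly Random random = new Random();

        public double GetAccountsPayable(string vendorName)
        {
            return Math.Round(random.NextDouble() * 100000, 2);
        }
    }
}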
Proxy Components

The ExecutionModels.Sandboxed.Proxy solution illustrates everything you need to know to create and deploy a full-trust proxy. At a high level, you must complete three key tasks to make a full-trust proxy available to sandboxed solutions:

• Create a proxy operation arguments class that represents the data that you want to pass to your full-trust proxy. This class should inherit from the SPProxyOperationArgs base class. The proxy operation arguments class is essentially a serializable collection of user-defined public properties.
• Create a proxy operations class that implements your full-trust logic. This class should inherit from the SPProxyOperation base class. The base class defines a single method named Execute that takes a single argument of type SPProxyOperationArgs and returns a value of type Object.
• Register your full-trust proxy with the user code service. This is best accomplished through the use of a feature receiver class. The registration process makes your proxy operation arguments class available in the sandbox environment, and it enables you to invoke your proxy logic through the SPUtility.ExecuteRegisteredProxyOperation method.
The Proxy Operation Arguments Class

When you create a full-trust proxy, you need to provide developers with a way to pass arguments from their sandboxed code to your full-trust logic. This must be carefully managed, because SharePoint needs to marshal these arguments across process boundaries. Fortunately, the proxy operation arguments class handles these issues for you.

Every call to a full-trust proxy must include an argument of type SPProxyOperationArgs. This is an abstract, serializable class that you can find in the Microsoft.SharePoint.UserCode namespace. When you create a full-trust proxy, you should create your own proxy operation arguments class that inherits from SPProxyOperationArgs. Your class should define public properties for any arguments that you want to pass to your full-trust logic.

Note:
A good practice is to define the assembly name and the type name of your proxy operations class as constants in your proxy operation arguments class. The assembly name and type name are required when your full-trust proxy is registered with SharePoint and every time the operation is invoked from sandbox code.

In the ExecutionModels.Sandboxed.Proxy solution, the proxy operation arguments class is named AccountsPayableProxyArgs. This is shown in the following code example.

C#
[Serializable]
public class AccountsPayableProxyArgs : SPProxyOperationArgs
{
    public static readonly string AssemblyName = "...";
    public static readonly string TypeName = "VendorSystemProxy.AccountsPayableProxyOps";

    private string vendorName;

    public string VendorName
    {
        get { return (this.vendorName); }
        set { this.vendorName = value; }
    }

    public AccountsPayableProxyArgs(string vendorName)
    {
        this.vendorName = vendorName;
    }

    public AccountsPayableProxyArgs() { }

    public string ProxyOpsAssemblyName
    {
        get { return AssemblyName; }
    }

    public string ProxyOpsTypeName
    {
        get { return TypeName; }
    }
}

As you can see, the proxy operation arguments class is relatively straightforward. The class provides the following functionality:

• It defines a default constructor, which is required for serialization.
• It defines a public string property to store and retrieve the vendor name.
• It defines a constructor that accepts a string vendor name value.
• It defines public read-only string properties to retrieve the assembly name and the type name of the proxy operations class.
The Proxy Operations Class

The proxy operations class contains the logic that is executed when you invoke your full-trust proxy from a sandboxed solution. The proxy operations class does not expose methods directly to the sandbox environment. Instead, the logic in the Execute method is invoked when the sandboxed code calls the SPUtility.ExecuteRegisteredProxyOperation method. In the proxy operations class, the Execute method must do the following:

• Take a single argument of type SPProxyOperationArgs.
• Return a serializable value as an object.

In the Full-Trust Proxy Reference Implementation (Full-Trust Proxy RI), the proxy operations class is named AccountsPayableProxyOps. This is shown in the following code example.

C#
public class AccountsPayableProxyOps : SPProxyOperation
{
    private const string address =
        Vendor.Services.Implementation.VendorServices.DeploymentLocation;

    public override object Execute(SPProxyOperationArgs args)
    {
        // Perform error checking.
        try
        {
            AccountsPayableProxyArgs proxyArgs = args as AccountsPayableProxyArgs;
            string vendorName = proxyArgs.VendorName;
            double accountsPayable = 0.0;

            WSHttpBinding binding = new WSHttpBinding();
            EndpointAddress endpointAddress = new EndpointAddress(address);

            using (ChannelFactory<IVendorServices> factory =
                new ChannelFactory<IVendorServices>(binding, endpointAddress))
            {
                IVendorServices proxy = factory.CreateChannel();
                accountsPayable = proxy.GetAccountsPayable(vendorName);
                factory.Close();
            }

            return accountsPayable;
        }
        catch (Exception ex)
        {
            return ex;
        }
    }
}

Essentially, the Execute method in the AccountsPayableProxyOps class performs the following tasks:

• It retrieves the vendor name value from the AccountsPayableProxyArgs instance.
• It calls the Web service's GetAccountsPayable method, passing in the vendor name.
• It returns the outstanding account balance for the vendor to the caller.
• If an exception occurs, it is caught and returned. This gives the calling code more information about what failed.

Because the proxy operations class, AccountsPayableProxyOps, is deployed in a farm solution and runs with full trust, there are no limitations on the actions that you can perform within the Execute method.
Registering a Full-Trust Proxy

To enable sandboxed solution developers to use your full-trust proxy, you must register your proxy operations class and assembly with the user code service. The best way to do this is to create a farm-scoped feature with an event receiver class that inherits from SPFeatureReceiver. You can register the full-trust proxy in the FeatureActivated method of the event receiver.

The following code example shows the FeatureActivated method from the RegisterVendorFullTrustProxyEventReceiver class.

C#
public override void FeatureActivated(SPFeatureReceiverProperties properties)
{
    SPUserCodeService userCodeService = SPUserCodeService.Local;

    SPProxyOperationType vendorPayablesOperation = new SPProxyOperationType(
        AccountsPayableProxyArgs.AssemblyName,
        AccountsPayableProxyArgs.TypeName);

    userCodeService.ProxyOperationTypes.Add(vendorPayablesOperation);
    userCodeService.Update();
}

Essentially, the proxy registration code performs the following actions:

• It creates a new SPProxyOperationType from the proxy operations type name and assembly name.
• It adds the SPProxyOperationType to the local user code service's ProxyOperationTypes collection.
• It calls the Update method to persist the changes to the user code service.

For information about how to invoke a proxy operation from the sandbox environment, see Consuming a Full-Trust Proxy.
Sandboxed Components

Primarily, the sandboxed components in the ExecutionModels.Sandboxed.Proxy solution are designed to show you how to consume a full-trust proxy from sandboxed code. However, these components also demonstrate approaches to various other challenges that you may face when you work with sandboxed applications. Essentially, the sandboxed components consist of two Web Parts:

• AggregateView. This Web Part displays aggregated list data from across the site collection. This is largely unchanged from the sandbox reference implementation.
• VendorDetails. This Web Part displays the name, ID, and current accounts payable balance for an individual vendor. It uses the full-trust proxy to retrieve the outstanding balance from the Vendors service exposed by the enterprise resource planning (ERP) system. This Web Part is embedded in a Web Part page named VendorDetails.aspx, which is displayed as a modal dialog when the user clicks the name of a vendor in the AggregateView Web Part.

The following illustration shows how these Web Parts interact at a conceptual level.

Conceptual overview of Web Part interaction

To make this approach work, you need to perform various tasks:

• Deploy a custom Web Part page to the sandbox environment.
• Create a JavaScript function to launch the custom Web Part page as a modal dialog and to provide the Web Part with a vendor name.
• Modify the AggregateView Web Part to call the JavaScript function when the user clicks a vendor name.
• Program the VendorDetails Web Part to call the full-trust proxy.
Deploying Custom Pages to the Sandbox Environment

The ExecutionModels.Sandboxed.Proxy solution includes one new type of resource, in addition to the components you have already seen in the Sandbox Reference Implementation (Sandbox RI). The VendorDetail.aspx file is a custom Web Part page. It is used to display the VendorDetails Web Part in a modal dialog.

When you work with farm solutions, the most common way to make pages and scripts available in a SharePoint environment is to deploy them to the Template\Layouts folder in the SharePoint root on the server file system. The Layouts folder is mapped to the _layouts virtual directory on each SharePoint site, so your files are always available. However, you cannot use this approach in the sandbox environment because sandboxed solutions are not permitted to deploy any files to the server file system. In the sandbox environment, you must deploy these resources to an appropriate document library in the site collection or as inline script. The reference implementation uses inline script instead of a separate JavaScript file, because it requires only a small amount of JavaScript that is specific to the page.

The following code example shows the feature manifest for the VendorDetail.aspx file.

XML
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <Module Name="Pages" List="101" Url="Lists/Pages">
    <File Url="VendorDetail.aspx" Type="GhostableInLibrary">
      <AllUsersWebPart WebPartZoneID="Left" WebPartOrder="1">
        <![CDATA[<webParts>
          <webPart xmlns="http://schemas.microsoft.com/WebPart/v3">
            <metaData>…</metaData>
            <data>…</data>
          </webPart>
        </webParts>]]>
      </AllUsersWebPart>
    </File>
  </Module>
  <CustomAction Location="ScriptLink" ScriptBlock="
    function ShowVendorDetailsDialog(url) {
      var options = SP.UI.$create_DialogOptions();
      options.url = url;
      options.height = 300;
      SP.UI.ModalDialog.showModalDialog(options);
    }">
  </CustomAction>
</Elements>

The preceding code uses a module to deploy the file to the Pages library in the site collection. In the File element, the Type="GhostableInLibrary" attribute indicates that the deployment target is a type of document library and that SharePoint should create a parent list item for the file. This also specifies that the file is stored in the content database and not on the server file system.

You can also see that a CustomAction section is used to define a JavaScript function inline. This approach to script deployment is recommended if you have a relatively small amount of JavaScript, as opposed to deploying a JavaScript file. SharePoint is able to optimize the script load performance more effectively when you use this mechanism for a small amount of script. For information about the behavior of the JavaScript function, see Launching a Modal Dialog within the Sandbox Environment.

Note:
If you require a large amount of JavaScript, it makes sense to deploy your script within a file. For more information about this approach, see How to: Display a Page as a Modal Dialog.
Launching a Modal Dialog in the Sandbox Environment

The ExecutionModels.Sandboxed.Proxy solution uses a modal dialog to display financial details when the user clicks the name of a vendor. This dialog is actually a Web Part page (VendorDetail.aspx) that contains a Web Part (VendorDetails.cs). To launch the modal dialog, the following three tasks must be performed:

• Create a JavaScript function that uses the SharePoint 2010 client object model to launch a specified page as a modal dialog.
• Deploy the JavaScript function to the sandbox environment. For more information, see Deploying Custom Pages to the Sandbox Environment.
• Render vendor names as hyperlinks that call the JavaScript function when clicked.

The feature manifest for the Vendor Details page defines a function named ShowVendorDetailsDialog. This function uses the SharePoint 2010 client object model to launch a page at a specified URL as a modal dialog, as shown in the following code example.

JavaScript
function ShowVendorDetailsDialog(url) {
  var options = SP.UI.$create_DialogOptions();
  options.url = url;
  options.height = 300;
  SP.UI.ModalDialog.showModalDialog(options);
}

In the AggregateView Web Part, you must render the vendor names as hyperlinks that call the JavaScript function when clicked. To do this, handle the RowDataBound event for the grid view control. Within the grid view, the content of each vendor name cell is replaced with a HyperLink control, as shown in the following code example.

C#
protected void GridView_RowDataBound(object sender, GridViewRowEventArgs e)
{
    const int VendorNameCellIndex = 3;
    const int EstimateValueCellIndex = 2;

    DataControlRowType rowType = e.Row.RowType;
    if (rowType == DataControlRowType.DataRow)
    {
        TableCell vendorNameCell = e.Row.Cells[VendorNameCellIndex];

        HyperLink hlControl = new HyperLink();
        hlControl.Text = vendorNameCell.Text;
        hlControl.NavigateUrl = string.Concat(
            "JavaScript: ShowVendorDetailsDialog('",
            SPContext.Current.Site.RootWeb.Url,
            "/Lists/Pages/VendorDetail.aspx?VendorName=" + vendorNameCell.Text + "');");
        vendorNameCell.Controls.Add(hlControl);

        TableCell estimateValueCell = e.Row.Cells[EstimateValueCellIndex];
        double currencyValue;
        if (double.TryParse(estimateValueCell.Text, out currencyValue))
        {
            estimateValueCell.Text = String.Format("{0:C}", currencyValue);
        }
    }
}

This renders each vendor name as a hyperlink in the following format.

HTML
<a href="JavaScript: ShowVendorDetailsDialog('Page URL')">Vendor Name</a>

Note that the URL that is passed to the ShowVendorDetailsDialog function includes a query string value for VendorName. This value is retrieved by the VendorDetails Web Part and is used in the proxy operation call.
Consuming a Full-Trust Proxy

Essentially, consuming a full-trust proxy from sandboxed code consists of two steps. First, you must create an instance of your proxy operation arguments class and provide values for any properties.

C#
AccountsPayableProxyArgs proxyArgs = new AccountsPayableProxyArgs();
proxyArgs.VendorName = VendorName;

Next, you must call the SPUtility.ExecuteRegisteredProxyOperation method, passing in the assembly name of the proxy operations class, the type name of the proxy operations class, and the proxy operation arguments instance.

C#
var result = SPUtility.ExecuteRegisteredProxyOperation(
    proxyArgs.ProxyOpsAssemblyName,
    proxyArgs.ProxyOpsTypeName,
    proxyArgs);

The VendorDetails Web Part uses the Model-View-Presenter (MVP) pattern, as shown in the following illustration.

The VendorDetails Web Part

For more information about how the MVP pattern works in practice, see Sandbox Reference Implementation. The remainder of this topic focuses on how to consume a full-trust proxy within the context of the MVP pattern.

First, note that the presenter class, VendorDetailsPresenter, includes a public property for the vendor name.

C#
private string vendorName;

public string VendorName
{
    get { return (vendorName); }
    set { vendorName = value; }
}

The VendorDetails Web Part (the view class) is responsible for instantiating the presenter class. In the CreateChildControls method, the Web Part retrieves the vendor name from the page query string and sets the VendorName property on the presenter. Next, it invokes the presenter logic by calling the SetVendorDetails method.

C#
private VendorDetailsPresenter presenter;

protected override void CreateChildControls()
{
    ...
    string vendorName = Page.Request.QueryString["VendorName"];
    ...
    presenter = new VendorDetailsPresenter(this, new VendorService());
    presenter.VendorName = vendorName;
    presenter.ErrorVisualizer = errorVisualizer;
    presenter.SetVendorDetails();
}

In the presenter class, the SetVendorDetails method constructs an instance of the proxy operation arguments class, AccountsPayableProxyArgs. The presenter logic then calls the model's GetVendorAccountsPayable method to retrieve the outstanding balance, passing in the AccountsPayableProxyArgs instance as an argument.

C#
public void SetVendorDetails()
{
    try
    {
        AccountsPayableProxyArgs proxyArgs = new AccountsPayableProxyArgs();
        proxyArgs.VendorName = vendorName;
        string assemblyName = proxyArgs.ProxyOpsAssemblyName;
        view.AccountsPayable = vendorService.GetVendorAccountsPayable(
            proxyArgs, assemblyName);
    }
    catch (Exception ex)
    {
        // Exception shielding logic removed for clarity.
    }
}

In the model class, VendorService, the GetVendorAccountsPayable method invokes the proxy operation. The return value is cast to a double and returned to the caller.

C#
public double GetVendorAccountsPayable(AccountsPayableProxyArgs proxyArgs,
    string assemblyName)
{
    var result = SPUtility.ExecuteRegisteredProxyOperation(
        assemblyName,
        proxyArgs.ProxyOpsTypeName,
        proxyArgs);

    if (result.GetType().IsSubclassOf(typeof(Exception)))
    {
        throw result as Exception;
    }

    double cred = (double)result;
    return (cred);
}

Finally, the presenter class sets the accounts payable balance on the view instance. To facilitate this, the VendorDetails class provides a simple write-only property named AccountsPayable that sets the Text property of a label to the value supplied by the presenter.

C#
public double AccountsPayable
{
    set
    {
        PayablesValueLabel.Text = string.Format("{0:C}", value);
    }
}
Vendor Service Security Configuration

Because the Vendor Service exposes sensitive accounts payable data, the service must be secured. The reference implementation uses a trusted subsystem model to authenticate to the service. Because the sandbox runtime removes any security tokens associated with the user, many of the standard authentication mechanisms will not work because the user credentials are unavailable. Using a trusted subsystem model, in which the credentials of the process account are used to authenticate to the service, avoids this issue, but it provides only coarse granularity for performing authorization.

In this reference implementation, the service is secured using Windows authentication with transport-level security, which provides a balance between security and complexity. Because this is designed for a test environment, an unencrypted HTTP connection is used for the service. In a production environment, this would provide an insufficient level of security, so you should use a Secure Sockets Layer (SSL) connection.

To use transport-level security with Windows and Internet Information Services (IIS), you must enable Windows authentication for the Web site that hosts the service, and you must configure IIS to support URL authorization. URL authorization is configured by default in Windows Server 2008, but if you are using Windows 7, you will need to configure it manually. To do this, in the Turn Windows features on or off dialog box, expand Internet Information Services, expand World Wide Web Services, expand Security, and then ensure the URL Authorization check box is selected, as shown in the following illustration.

Enabling URL Authorization in Windows 7

The service is installed to the Contoso Web site in IIS. You must use IIS Manager to configure authentication for the site. Ensure that Windows authentication is configured for the Web site that hosts the service, as shown in the following illustration.

Authentication settings in IIS Manager

When the Full-Trust Proxy RI deploys the VendorService service, it also adds an authorization policy to the Web site that hosts the service. The installer achieves this by adding the following code to the Web.config file for the Contoso Web site.

XML
<system.webServer>
  <security>
    <authorization>
      <remove users="*" roles="" verbs="" />
      <add accessType="Allow" roles="ContosoUsers" />
      <add accessType="Deny" users="?" />
      <add accessType="Allow" users="SandboxSvcAcct" />
    </authorization>
  </security>
</system.webServer>

As you can see, the policy allows members of the ContosoUsers role and the SandboxSvcAcct user to access the service. SandboxSvcAcct is a managed account that the installer creates to run the Microsoft SharePoint Foundation Sandboxed Code Service, and this is the identity that is provided when you access the service from the sandbox environment. The ContosoUsers role is added to support the Client Reference Implementation, which uses client-side code to access the same service.

The full-trust proxy programmatically configures the service binding in such a way that the Windows identity of the sandbox proxy process is provided to the service. This causes the SandboxSvcAcct identity to be used, as shown by the following code.

C#
BasicHttpBinding binding = new BasicHttpBinding();
binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Windows;
binding.Security.Mode = BasicHttpSecurityMode.TransportCredentialOnly;
EndpointAddress endpointAddress = new EndpointAddress(address);

using (ChannelFactory<IVendorServices> factory =
    new ChannelFactory<IVendorServices>(binding, endpointAddress))
{
    IVendorServices proxy = factory.CreateChannel();
    accountsPayable = proxy.GetAccountsPayable(vendorName);
    factory.Close();
}

For more sophisticated approaches to security, you could consider using a Trusted Façade pattern, which combines the trusted subsystem model with providing the name of the user. You can obtain the name of the user, but not the user's security token, from the SPUser instance in the SPContext object.
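As an illustration of that last point, the following is a minimal sketch, not part of the reference implementation, of how sandboxed code could read the current user's login name from the SharePoint context. The idea of forwarding that name to the Web service is described in the comments and is an assumption about how a Trusted Façade approach might be applied here, not the RI's actual design.

C#
// Illustrative sketch only: the reference implementation does not include this code.
// Obtain the caller's name (but not a security token) from the SharePoint context.
SPUser currentUser = SPContext.Current.Web.CurrentUser;
string callerLoginName = currentUser.LoginName;

// In a Trusted Façade approach, this name could be carried as an extra property on the
// proxy operation arguments class and forwarded to the Web service, which would then
// perform per-user authorization checks while still trusting the calling process identity.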
Conclusion

The Full-Trust Proxy Reference Implementation (Full-Trust Proxy RI) demonstrates proven practice approaches to creating and consuming full-trust proxies for sandboxed solutions. The key points of interest include the following:

• The creation and deployment of full-trust proxies.
• The registration process for full-trust proxy assemblies.
• The consumption of full-trust proxies from sandboxed code.
• The deployment of pages and scripts to the sandbox environment.
• The use of the SharePoint 2010 client object model to launch modal dialogs from the sandbox environment.

We recommend that you deploy the Full-Trust Proxy RI and explore the different components and code within the ExecutionModels.Sandboxed.Proxy solution. For more information about full-trust proxies, see Execution Models in SharePoint 2010.
Reference Implementation: Farm Solutions

Although you can use sandboxed solutions and hybrid approaches to deploy a wide variety of SharePoint applications, there are some components, such as timer jobs, that must be deployed as full-trust farm solutions. This reference implementation illustrates the use of a timer job to aggregate data across multiple site collections.

The Sandbox Reference Implementation includes a sandboxed solution that retrieves statement of work (SOW) documents and estimations from multiple lists in a site collection. The solution includes a Web Part that displays a summary of the aggregated content on the site collection root site. The Full-Trust Proxy Reference Implementation extends the list aggregation scenario to supplement the aggregated data with accounts payable details from a vendor system. The solution uses a full-trust proxy that connects to an ERP Web service and retrieves information about the current vendor. This reference implementation further extends these scenarios by using a timer job to aggregate content from multiple site collections.

The reference implementation highlights details and best practices in the following areas:

• Creating, deploying, and registering a timer job
• Managing configuration data for timer jobs
• Deploying custom application pages to the Central Administration Web site

Note:
This solution builds on the Sandbox Reference Implementation and the Full-Trust Proxy Reference Implementation. To fully understand the concepts in this solution, we recommend that you familiarize yourself with the documentation for these implementations before you study the Full-Trust Reference Implementation (Full-Trust RI).

Solution Scenario

In this example, suppose you are a SharePoint consultant working for a pharmaceutical company named Contoso Inc. The Northwest division of Contoso currently includes manufacturing plants at Blue Bell and New Brunswick. The construction and maintenance teams in each plant all have their own sites on the corporate intranet portal, which they use to track statements of work (SOWs) and corresponding budget estimates for their teams.

The general manager of the Northwest division, Phyllis Harris, now wants you to extend this solution. Phyllis wants to be able to view a rollup of all open estimates to gain an understanding of the potential uncommitted work in the division. To meet Phyllis's requirements, you implement a timer job that retrieves approved estimates from each site collection in the division. The timer job then populates an Approved Estimates library on the divisional site collection, as shown in the following illustration.

The Approved Estimates library

Deploying the Farm Solution RI

The Farm Solution RI includes an automated installation script that creates a site collection, deploys the reference implementation components, and adds sample data. After running the installation script, browse to the new Headquarters site collection at http://<Hostname>/sites/HeadQuarters. You can open and run the solution in Visual Studio, but this does not create a site collection or add sample data. To see the system fully functioning, you must run the installation script. The following table summarizes how to get started with the Farm Solution RI.

Question: Where can I find the Farm Solution RI?
Answer: <install location>\Source\ExecutionModel\FullTrust

Question: What is the name of the solution file?
Answer: ExecutionModels.FarmSolution.sln

Question: What are the system requirements?
Answer: SharePoint Foundation 2010

Question: What preconditions are required for installation?
Answer:
• You must be a member of the SharePoint Farm Admin group.
• You must be a member of the Windows admin group.
• SharePoint must be installed at http://<Hostname:80>. If you want to install to a different location, you can edit these settings in the Settings.xml file located in the Setup directory for the solution.
• The SharePoint 2010 Administration service must be running. By default, this service is set to a manual start. To start the service, click Start on the taskbar, point to Administrative Tools, click Services, double-click SharePoint 2010 Administration service, and then click Start.
• The timer service must be running. To start the timer service, click Start on the taskbar, point to Administrative Tools, click Services, double-click SharePoint 2010 Timer, and then click Start.

Question: How do I install the Farm Solution RI?
Answer: Follow the instructions in the readme file located in the project folder.

Question: What is the default installation location?
Answer: http://<Hostname>/sites/HeadQuarters (This location can be altered by changing the Settings.xml file in the Setup directory.)

Question: How do I download the Farm Solution RI?
Answer: The Farm Solution RI is included in the download Developing Applications for SharePoint 2010.
Solution Overview

This topic provides a high-level conceptual overview of the Full-Trust Reference Implementation (Full-Trust RI). The key component of the solution is a timer job, as shown in the following illustration.

Conceptual overview of the Full-Trust RI

The timer job relies on a custom application page to provide a list of site collections from which to retrieve approved estimates. This application page is deployed to the Central Administration Web site, where a custom action is used to create a navigation link to the page under the Timer Jobs heading.

Configuration link and application page in Central Administration

After it retrieves the list of site collections, the timer job looks for an Estimates list in all of the subsites in each site collection. It then copies any approved estimates in these lists to a central Approved Estimates list on the divisional headquarters site collection.

The Full-Trust RI consists of two solution packages:

• ExecutionModels.FarmSolution.Jobs. This solution package deploys the timer job to the SharePoint server environment, and it uses a feature receiver class to register the timer job. It also deploys the Approved Estimates list to the divisional headquarters site collection.
• ExecutionModels.FarmSolution. This solution package deploys the custom application page to the Central Administration Web site, together with the custom action that creates the navigation link.
The Approved Estimates Timer Job

Essentially, a timer job definition class exposes full-trust logic that can be invoked by the SharePoint timer service (Owstimer.exe) on a scheduled basis. A SharePoint 2010 timer job can be associated with either a SharePoint Web application or a SharePoint service application. In the Full-Trust Reference Implementation (Full-Trust RI), the timer job class is named ApprovedEstimatesJob and targets the Web application scope.

To make its logic available to the timer service, the timer job class must meet certain requirements. The class must be public and must derive from the SPJobDefinition base class. It must also have a default constructor, so that it can be serialized by the timer service. In the Full-Trust RI, the timer job is scoped at the Web application level, so it must also define a constructor that accepts an argument of type SPWebApplication. This allows you to target a specific Web application when you register the timer job.

The following code example shows the class definition and the constructors for the ApprovedEstimatesJob class.

C#
public class ApprovedEstimatesJob : SPJobDefinition
{
    public const string JobName = "ApprovedEstimatesJob";
    ...

    public ApprovedEstimatesJob() : base()
    {
    }

    public ApprovedEstimatesJob(SPWebApplication webApplication)
        : base(JobName, webApplication, null, SPJobLockType.Job)
    {
        Title = Constants.jobTitle;
    }
    ...

The timer job definition overrides a method named Execute. This method is called by the SharePoint timer service when the timer job is run.

C#
public override void Execute(Guid targetInstanceId)
{
    ...
}

Essentially, the Execute method performs the following actions:

• It retrieves the source sites and the destination site from configuration settings.
• It runs a query on each source site to retrieve any approved estimate items.
• It adds the approved estimate items to the Approved Estimates list on the destination site.

The logic that retrieves approved estimates from each site collection is provided by a helper method named GetAllApprovedEstimatesFromSiteCollection. This method uses an SPSiteDataQuery object to retrieve estimate documents that have a SOWStatus value of Approved.

C#
private static SPSiteDataQuery GetAllApprovedEstimatesFromSiteCollection()
{
    var query = new SPSiteDataQuery();
    query.Lists = "<Lists BaseType='1' />";
    query.ViewFields = "<FieldRef Name='Title' Nullable='TRUE' />" +
                       "<FieldRef Name='FileRef' Nullable='TRUE' />";
    query.Query = "<Where>" +
                    "<Eq>" +
                      "<FieldRef Name='SOWStatus' />" +
                      "<Value Type='Choice'>Approved</Value>" +
                    "</Eq>" +
                  "</Where>";
    query.Webs = "<Webs Scope='SiteCollection' />";

    return query;
}

As you can see, the query retrieves the value of the Title field and the FileRef field for each approved estimate. The Execute method uses the FileRef value to access the estimate document and create a copy in the Approved Estimates list.

Note:
For step-by-step guidance on how to implement a timer job, see How to: Create a Web Application-Scoped Timer Job.
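The body of the Execute method is not reproduced in this topic. As a rough illustration of how the steps listed above could fit together, the following is a minimal sketch under stated assumptions: it reads the property bag keys that the timer job configuration page writes (the Constants members shown later in The Timer Job Configuration Page), and it elides the document copy step because the actual RI logic is not reproduced here.

C#
// Illustrative sketch only; the actual RI implementation differs.
public override void Execute(Guid targetInstanceId)
{
    // Read the settings persisted by the timer job configuration page.
    string listName = (string)Properties[Constants.timerJobListNameAttribute];
    string sourceSites = (string)Properties[Constants.timerJobSiteNameAttribute];
    string destinationSite = (string)Properties[Constants.timerJobDestinationSiteAttribute];

    SPSiteDataQuery query = GetAllApprovedEstimatesFromSiteCollection();

    foreach (string sourceUrl in sourceSites.Split(';'))
    {
        using (SPSite site = new SPSite(sourceUrl.Trim()))
        {
            // The Webs scope of SiteCollection runs the query across all subsites.
            System.Data.DataTable approvedEstimates = site.RootWeb.GetSiteData(query);

            // For each row, the FileRef value identifies the estimate document;
            // the RI copies each document to the Approved Estimates list (listName)
            // on the destination site (destinationSite). That copy logic is omitted here.
        }
    }
}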
Registering the Timer Job

To be able to use a timer job class, you must register it with the SharePoint timer service (Owstimer.exe). To be able to register the timer job at the point of deployment, you create a Web application-scoped feature, the Approved Estimates Job feature, and use a feature receiver class to perform the registration. Note that the Approved Estimates Job feature contains no element manifests; it exists solely to register the timer job on feature activation.

The timer job registration process consists of three steps:

• The timer job definition is instantiated, and the current Web application is passed to the constructor as an argument.
• A schedule is created and then associated with the timer job instance.
• The timer job instance is updated.

The following code example shows this process.

C#
public override void FeatureActivated(SPFeatureReceiverProperties properties)
{
    SPWebApplication webApplication = properties.Feature.Parent as SPWebApplication;
    DeleteJob(webApplication.JobDefinitions);

    ApprovedEstimatesJob estimatesJob = new ApprovedEstimatesJob(webApplication);

    SPMinuteSchedule schedule = new SPMinuteSchedule();
    schedule.BeginSecond = 0;
    schedule.EndSecond = 59;
    schedule.Interval = 5;
    estimatesJob.Schedule = schedule;

    estimatesJob.Update();
}

Note that the FeatureActivated method calls a helper method named DeleteJob. This removes the timer job from the collection of job definitions in the current Web application. This method is also used to remove the timer job when the feature is deactivated.

C#
private void DeleteJob(SPJobDefinitionCollection jobs)
{
    foreach (SPJobDefinition job in jobs)
    {
        if (job.Name.Equals(ApprovedEstimatesJob.JobName,
            StringComparison.OrdinalIgnoreCase))
        {
            job.Delete();
        }
    }
}

When the feature is deactivated, the FeatureDeactivating method simply calls the DeleteJob method to delete the job definition from the current Web application.

C#
public override void FeatureDeactivating(SPFeatureReceiverProperties properties)
{
    SPWebApplication webApplication = properties.Feature.Parent as SPWebApplication;
    DeleteJob(webApplication.JobDefinitions);
}
The Timer Job Configuration Page

The Approved Estimates timer job must retrieve various configuration settings to be able to function correctly. To provide these settings, a custom application page, TimerJobConfig.aspx, is deployed to the Central Administration Web site.

The timer job configuration page

The timer job configuration page enables the farm administrator to provide the following settings:

• The relative URL of the destination site collection
• The name of the Approved Estimates list on the destination site collection
• A semicolon-separated list of source site collections

It also enables the administrator to run the job immediately. When the user clicks Apply Changes, the configuration settings are persisted to the property bag of the Approved Estimates timer job, as shown in the following code example.

C#
protected void ApplyButton_Click(object sender, EventArgs e)
{
    IEnumerable<SPJobDefinition> allJobs = GetTimerJobsByName(Constants.jobTitle);

    foreach (SPJobDefinition selectedJob in allJobs)
    {
        if (!string.IsNullOrEmpty(ListNameTextBox.Text))
        {
            selectedJob.Properties[Constants.timerJobListNameAttribute] =
                ListNameTextBox.Text;
        }
        if (!string.IsNullOrEmpty(SiteNamesTextBox.Text))
        {
            selectedJob.Properties[Constants.timerJobSiteNameAttribute] =
                SiteNamesTextBox.Text;
        }
        if (!string.IsNullOrEmpty(DestinationSiteTextBox.Text))
        {
            selectedJob.Properties[Constants.timerJobDestinationSiteAttribute] =
                DestinationSiteTextBox.Text;
        }
        selectedJob.Update();
    }
}

The click event handler uses the GetTimerJobsByName helper method to retrieve all instances of the Approved Estimates job from the server farm. This is necessary because several instances of the job could be associated with different service applications or Web applications.

C#
private List<SPJobDefinition> GetTimerJobsByName(string displayName)
{
    List<SPJobDefinition> AllJobs = new List<SPJobDefinition>();

    // For all servers in the farm (the servers could be different).
    foreach (SPServer server in farm.Servers)
    {
        // For each service instance on the server.
        foreach (SPServiceInstance svc in server.ServiceInstances)
        {
            if (svc.Service.JobDefinitions.Count > 0)
            {
                // If it is a Web service, then get the Web application from
                // the Web service entity.
                if (svc.Service is SPWebService)
                {
                    SPWebService websvc = (SPWebService)svc.Service;
                    AllJobs.AddRange(from webapp in websvc.WebApplications
                                     from def in webapp.JobDefinitions
                                     where def.DisplayName.ToLower() ==
                                         displayName.ToLower()
                                     select def);
                }
                else
                {
                    // Otherwise, get it directly from the service.
                    AllJobs.AddRange(svc.Service.JobDefinitions.Where(def =>
                        def.DisplayName.ToLower() == displayName.ToLower()));
                }
            }
        }
    }
    return AllJobs;
}

To make the timer job configuration page available to farm administrators, it is deployed to the Central Administration Web site.
Deploying the Configuration Page to Central Administration

In most cases, when you add custom application pages to a SharePoint environment, you deploy your pages to a subfolder in the Template\Layouts folder in the SharePoint root. This makes your page available under the _layouts virtual directory on every site. However, in this case, you do not want to expose the timer job configuration page on every SharePoint site; it should be available only on the Central Administration site. To achieve this, deploy the page to the Template\Admin folder in the SharePoint root. This makes the page available under the _admin virtual directory on the Central Administration Web site. The _admin virtual directory is not made available to any Web application except for Central Administration.

To deploy the page to the Template\Admin folder, a SharePoint Mapped Folder was added to the solution, as shown in the following illustration. This makes the timer job configuration page available at a relative URL of _admin/TimerJobAdmin/TimerJobConfig.aspx within the Central Administration Web site.

Using a SharePoint Mapped Folder

After you deploy the page to the correct location, add a navigation link to enable farm administrators to find the page. To achieve this, create a feature manifest that defines a CustomAction element. This is deployed to the Central Administration Web site in a Web-scoped feature named Admin Forms Navigation.

Note:
For more information about how to add actions to the SharePoint user interface, see Default Custom Action Locations and IDs and How to: Add Actions to the User Interface on MSDN.

The following code example shows the feature manifest for the Admin Forms Navigation feature.

XML
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <CustomAction Id="DCA50F3D-D38F-41D9-8120-25BD7F930FDE"
                GroupId="TimerJobs"
                Location="Microsoft.SharePoint.Administration.Monitoring"
                Sequence="10"
                Title="Configure Approved Estimates Aggregation Timer Job"
                Description="">
    <UrlAction Url="_admin/TimerJobAdmin/TimerJobConfig.aspx" />
  </CustomAction>
</Elements>

Essentially, this code adds a link to the timer job configuration page on the Monitoring page of the Central Administration Web site. The link is added to the Timer Jobs group, as shown in the following illustration.

Using a custom action to add a link

Note:
For step-by-step guidance on how to deploy an application page to the Central Administration Web site, see How to: Deploy an Application Page to Central Administration.
Conclusion

The Full-Trust Reference Implementation (Full-Trust RI) demonstrates best practice approaches to the deployment of various full-trust components to the SharePoint environment. The key points of interest include the following:

• The creation and deployment of timer job definitions
• The registration process for timer jobs
• The management of configuration data for timer jobs
• The deployment of custom application pages to the Central Administration Web site

We recommend that you deploy the Full-Trust RI and explore the different components and code within the ExecutionModels.FarmSolution solution. For more information about full-trust components, see Execution Models in SharePoint 2010.
Reference Implementation: Workflow Activities

In SharePoint 2010, site users can create powerful workflows in SharePoint Designer by connecting and configuring workflow actions. These workflows are known as declarative workflows, because the logic that connects the individual workflow actions is defined entirely in declarative markup and does not require custom code.

SharePoint 2010 includes many built-in workflow actions for common tasks and activities. As a developer, you can also make additional actions available to workflow creators in two ways:

• You can create a full-trust workflow activity class and deploy it as a farm solution. To use these activities in SharePoint Designer, you must create and deploy a workflow actions file to package your activity, and you must add an authorized type entry for your activity class to the Web.config file.
• You can create a sandboxed workflow action class and deploy it as a sandboxed solution. To be able to package your workflow action, you must create a feature with a workflow actions element in an element manifest file.

Both approaches make additional workflow actions available to users in SharePoint Designer. The full-trust approach allows developers to expose specialized functionality to workflow creators, such as connecting to external systems and services or performing actions across site collection boundaries. The sandboxed approach allows developers to introduce custom workflow logic to site collections in hosted environments or in other environments in which the deployment of farm solutions is prohibited. Sandboxed workflow action logic must observe the constraints of the sandbox environment; for example, sandboxed workflow actions cannot access external systems or services and are limited to a subset of the SharePoint API.

Note:
We recommend that you review Execution Models in SharePoint 2010 prior to studying the Workflow Activities Reference Implementation (Workflow Activities RI). It provides a more detailed explanation of many of the concepts described in this topic.

This implementation highlights details and best practices in the following areas:

• Creating and deploying full-trust workflow activities
• Creating and deploying sandboxed workflow actions
• Consuming custom activities and sandboxed actions from a declarative workflow

Solution Scenario

In this example, suppose you are a SharePoint consultant working for a pharmaceutical company named Contoso Inc. You have already designed and implemented a system that tracks the progress of statements of work (SOWs) and budget estimations, as described in Sandbox Reference Implementation. As a result, each team site on the Contoso intranet portal includes an Estimates list that contains SOWs and budget estimations. The Estimates list includes a field that indicates the approval status of each work item.

The IT manager at Contoso, Cristian Petculescu, now wants you to extend this solution. Cristian wants you to automate the creation of project sites, so that when the approval status of a work item is set to Approved, a new project site is created and configured for that work item. To meet Cristian's requirements, you first create various workflow activities:

• You create a full-trust workflow activity to create a new site.
• You create a full-trust workflow activity to determine whether the new site already exists.
• You create a sandboxed workflow action to copy a document library from one site to another.

Note:
You could actually use a sandboxed workflow action to create a site, because this process takes place within the boundaries of the site collection. However, in the Workflow Activities RI, the CreateSiteCollectionActivity and CreateSubSiteActivity classes derive from a common base class, and you cannot create a site collection within a sandboxed workflow action.

Next, you use SharePoint Designer to create a declarative workflow that incorporates these activities. This workflow runs on each item in the Estimates list. When the approval status of the item is set to Approved, the workflow attempts to create a new project site for the work item. It then uses the sandboxed workflow action to copy the Templates document library to the new project site.

Deploying the Workflow Activities RI

The Workflow Activities RI includes an automated installation script that creates various site collections, deploys the reference implementation components, and adds sample data. After running the installation script, browse to the new ManufacturingWF site collection at http://<Hostname>/sites/ManufacturingWF. You can open and run the project in Visual Studio, but this does not create a site collection or add sample data. To see the system fully functioning, you must run the installation script. The following table summarizes how to get started with the Workflow Activities RI.

Question: Where can I find the Workflow Activities RI?
Answer: <install location>\Source\ExecutionModel\Workflow

Question: What is the name of the solution file?
Answer: ExecutionModels.Workflow.sln

Question: What are the system requirements?
Answer: SharePoint Server 2010 Standard or Enterprise

Question: What preconditions are required for installation?
Answer:
• You must be a member of the SharePoint Farm Admin group.
• You must be a member of the Windows admin group.
• SharePoint must be installed at http://<Hostname:80>. If you want to install to a different location, you can edit these settings in the Settings.xml file located in the Setup directory for the solution.
• The SharePoint 2010 Administration service must be running. By default, this service is set to a manual start. To start the service, click Start on the taskbar, point to Administrative Tools, click Services, double-click SharePoint 2010 Administration service, and then click Start.

Question: How do I install the Workflow Activities RI?
Answer: Follow the instructions in the readme file located in the project folder.

Question: What is the default installation location?
Answer: http://<Hostname>/sites/ManufacturingWF (This location can be altered by changing the Settings.xml file in the Setup directory.)

Question: How do I download the Workflow Activities RI?
Answer: The Workflow Activities RI is included in the download Developing Applications for SharePoint 2010.

Note:
The workflow contained in this project could also be implemented for SharePoint Foundation. However, workflows must be authored on the same server version to which they are deployed. SharePoint Foundation 2010 workflows cannot be deployed to SharePoint Server 2010, and vice versa.
Solution Overview

This topic provides a high-level conceptual overview of the Workflow Activities Reference Implementation (Workflow Activities RI). Essentially, the solution consists of a declarative workflow that includes built-in SharePoint workflow actions, custom full-trust activities, and sandboxed workflow actions. The following illustration shows the workflow logic.

The Workflow Activities RI

Note:
The preceding illustration was originally created using the new Export to Visio option in SharePoint Designer 2010. The annotations were manually added.

When you deploy the Workflow Activities RI, a workflow instance is created every time a work item with a content type of Estimate is added to the Estimates library. When a work item is approved, the workflow proceeds through the actions shown in the preceding illustration. The custom full-trust activities and sandboxed actions are called at the appropriate points in the workflow execution.

The Workflow Activities RI consists of three solution packages: one for the full-trust workflow activities, one for the sandboxed workflow action, and one for the declarative workflow definition.
Full-Trust Activities

Creating and deploying a custom full-trust workflow activity to a SharePoint 2010 environment involves the following three key steps:

• Create an activity class that derives from the System.Workflow.ComponentModel.Activity base class and overrides the Execute() method.
• Add an authorizedType entry to the Web.config file for each activity class.
• Create a workflow actions file that defines how the workflow engine and the SharePoint Designer user interface will interact with your activity classes.

The Workflow Activities Reference Implementation (Workflow Activities RI) includes three workflow activities. CreateSiteCollectionActivity creates a new site collection, using parameters supplied by the workflow. CreateSubSiteActivity creates a new site in a site collection, using parameters supplied by the workflow. SiteExistsActivity returns a Boolean value to indicate whether a specified site or site collection already exists at the specified URL.

Within the workflow actions file required for SharePoint Designer support, workflow activities are packaged as actions. You can also define conditions in the workflow actions file. In this solution, CreateSiteCollectionActivity and CreateSubSiteActivity are packaged as actions, and SiteExistsActivity is packaged as a condition.

Unlike actions, conditions are not required to derive from System.Workflow.ComponentModel.Activity. A condition class must declare a method that does the following:

• Returns a Boolean value
• Takes an object of type Microsoft.SharePoint.WorkflowActions.WorkflowContext as its first parameter
• Takes a string, which represents the GUID of the parent list, as its second parameter
• Takes an integer, which is the ID of the parent list item, as its third parameter

Note:
In the case of the SiteExistsActivity, it was decided to implement this as an activity so that the code could be leveraged easily within a full-trust coded workflow as well as a declarative workflow.
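The workflow actions file itself is not reproduced in this topic. To illustrate the general shape of such a file, the following is a minimal sketch of a single action entry. The sentence text, field binding, and parameter shown here are assumptions made for illustration and do not reproduce the RI's actual actions file; the assembly's version and public key token are elided, and the SiteUrl parameter name is assumed from the dependency properties described in Activity Classes.

XML
<WorkflowInfo>
  <Actions Sequential="then" Parallel="and">
    <Action Name="Create a sub site"
            ClassName="ExecutionModels.Workflow.FullTrust.Activities.CreateSubSiteActivity"
            Assembly="ExecutionModels.Workflow.FullTrust.Activities, ..."
            Category="Patterns and Practices"
            AppliesTo="all">
      <RuleDesigner Sentence="Create a site at %1">
        <FieldBind Field="SiteUrl" Text="this URL" Id="1" DesignerType="TextArea" />
      </RuleDesigner>
      <Parameters>
        <Parameter Name="SiteUrl" Type="System.String, mscorlib" Direction="In" />
      </Parameters>
    </Action>
  </Actions>
</WorkflowInfo>

An actions file along these lines is what allows SharePoint Designer to discover and display the activity; the authorizedType entry described in Registering Workflow Activities for Declarative Workflows is still required before declarative workflows can actually invoke it.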
Activity Classes

The Workflow Activities Reference Implementation (Workflow Activities RI) includes three workflow activity classes. Because CreateSubSiteActivity and CreateSiteCollectionActivity have many parameters in common, they share a common base class, SiteCreationBaseActivity. All the activity classes ultimately derive from the System.Workflow.ComponentModel.Activity base class, as shown in the following illustration.

Workflow activity classes

Each workflow activity class overrides the Execute method defined by the base Activity class. This method is called by the workflow engine to invoke your activity logic.

C#
protected override ActivityExecutionStatus Execute(ActivityExecutionContext
    executionContext)
{
}

A workflow activity class also typically defines a number of dependency properties that are used to pass information to or from your workflow activity logic. Essentially, dependency properties allow you to define inputs and outputs for your workflow activity, because the workflow engine and SharePoint Designer can bind values to these properties in order to use your workflow logic. In other words, dependency properties allow you to bind the output of one activity to the input of another activity.

Each full-trust workflow activity class in the Workflow Activities RI defines several dependency properties. For example, the SiteExistsActivity class defines three dependency properties. The first dependency property, SiteUrlProperty, allows you to provide your workflow logic with the URL of a site. Notice that the class also defines a .NET Framework property wrapper, SiteUrl, which allows your activity logic to interact with the managed property.

C#
public static DependencyProperty SiteUrlProperty =
    DependencyProperty.Register("SiteUrl", typeof(string), typeof(SiteExistsActivity));

[Description("The absolute URL of the site or site collection to create")]
[Browsable(true)]
[Category("Patterns and Practices")]
[DesignerSerializationVisibility(DesignerSerializationVisibility.Visible)]
public string SiteUrl
{
    get { return ((string)base.GetValue(SiteUrlProperty)); }
    set { base.SetValue(SiteUrlProperty, value); }
}

The second dependency property, ExistsProperty, enables your activity to provide a result that indicates whether a specified site or site collection already exists. This result can be used by the workflow engine to provide branching logic.

C#
public static DependencyProperty ExistsProperty =
    DependencyProperty.Register("Exists", typeof(bool), typeof(SiteExistsActivity));

[Description("The result of the operation indicating whether the site exists")]
[Browsable(true)]
[Category("Patterns and Practices")]
[DesignerSerializationVisibility(DesignerSerializationVisibility.Visible)]
public bool Exists
{
    get { return ((bool)base.GetValue(ExistsProperty)); }
    set { base.SetValue(ExistsProperty, value); }
}

The final dependency property, ExceptionProperty, allows your activity to return an exception if problems occur.

Note:
The workflow engine supports fault handlers that can capture exceptions and take action on them for coded workflows. SharePoint Designer declarative workflows do not provide support for fault handlers, so when you design activities for use with SharePoint Designer, it is sometimes necessary to return the exceptions as a property.

C#
public static DependencyProperty ExceptionProperty =
    DependencyProperty.Register("Exception", typeof(string), typeof(SiteExistsActivity));

[Description("Exception generated while testing for the existence of the site")]
[Browsable(true)]
[Category("Patterns and Practices")]
[DesignerSerializationVisibility(DesignerSerializationVisibility.Visible)]
public string Exception
{
    get { return ((string)base.GetValue(ExceptionProperty)); }
    set { base.SetValue(ExceptionProperty, value); }
}

The following code example shows the Execute method of the SiteExistsActivity.

C#
protected override ActivityExecutionStatus Execute(ActivityExecutionContext
    executionContext)
{
    string exception;
    Exists = DoesSiteExist(SiteUrl, out exception);
    Exception = exception;
    return base.Execute(executionContext);
}

In the preceding code example, the logic in the Execute method override does not use the ActivityExecutionContext argument or set a return value. It simply sets the value of the dependency properties and then calls the base method implementation. The DoesSiteExist method is a simple helper method that checks whether a site exists at the specified URL.

The SiteExistsActivity class includes an extra method named DoesSiteExistCondition that implements the signature required for SharePoint Designer. Specifically, it returns a Boolean value and accepts parameters that represent the workflow context, the parent list, and the parent list item, as shown in the following code example.

C#
public static bool DoesSiteExistCondition(WorkflowContext workflowContext, string listId,
    int itemId, string siteUrl)
{
    string exception;
    return (DoesSiteExist(siteUrl, out exception));
}
Registering Workflow Activities for Declarative Workflows

To be able to use a full-trust workflow activity in a declarative workflow, you must add an authorized type entry for the activity class to the Web.config file. Typically, you should use a feature receiver class to add and remove these entries when you deploy your workflow activities.

The Workflow Activities Reference Implementation (Workflow Activities RI) includes an empty farm-scoped feature named SiteProvisioningActivity. The event receiver class for this feature adds authorized type entries for the workflow activity classes when the feature is activated, and it removes the entries when the feature is deactivated.

Note: Farm-scoped features are automatically activated when the solution is deployed. Because of this, the authorized type entry is automatically added to the Web.config file when the solution package is deployed to the farm.

The SiteProvisioningActivity.EventReceiver class uses a helper method named GetConfigModification to build the authorized type entry for the workflow activity classes as an SPWebConfigModification object, as shown in the following code example.

C#
public SPWebConfigModification GetConfigModification()
{
    string assemblyValue = typeof(CreateSubSiteActivity).Assembly.FullName;
    string namespaceValue = typeof(CreateSubSiteActivity).Namespace;

    SPWebConfigModification modification = new SPWebConfigModification(
        string.Format(CultureInfo.CurrentCulture,
            "authorizedType[@Assembly='{0}'][@Namespace='{1}'][@TypeName='*'][@Authorized='True']",
            assemblyValue, namespaceValue),
        "configuration/System.Workflow.ComponentModel.WorkflowCompiler/authorizedTypes");

    modification.Owner = "Patterns and Practices";
    modification.Sequence = 0;
    modification.Type = SPWebConfigModification.SPWebConfigModificationType.EnsureChildNode;
    modification.Value = string.Format(CultureInfo.CurrentCulture,
        "<authorizedType Assembly=\"{0}\" Namespace=\"{1}\" TypeName=\"*\" Authorized=\"True\" />",
        assemblyValue, namespaceValue);

    Trace.TraceInformation("SPWebConfigModification value: {0}", modification.Value);
    return modification;
}

For more information about the SPWebConfigModification class, see SPWebConfigModification Class on MSDN.

The FeatureActivated method uses the SPWebConfigModification object to add the authorized type entry to the Web.config files of all Web applications on the farm, with the exception of the Central Administration Web application. This is shown in the following code example.

C#
public override void FeatureActivated(SPFeatureReceiverProperties properties)
{
    try
    {
        SPWebService contentService = SPWebService.ContentService;
        contentService.WebConfigModifications.Add(GetConfigModification());

        // Serialize the Web application state and propagate changes across the farm.
        contentService.Update();

        // Save Web.config changes.
        contentService.ApplyWebConfigModifications();
    }
    catch (Exception e)
    {
        Console.WriteLine(e.ToString());
        throw;
    }
}

This adds the following entry as a child of the authorizedTypes element in the Web.config file.

XML
<authorizedType Assembly="ExecutionModels.Workflow.FullTrust.Activities, ..."
                Namespace="ExecutionModels.Workflow.FullTrust.Activities"
                TypeName="*"
                Authorized="True" />

The FeatureDeactivating method also uses the helper method; this time, it is used to remove the authorized type entry from the Web.config file, as shown in the following code example.

C#
public override void FeatureDeactivating(SPFeatureReceiverProperties properties)
{
    try
    {
        SPWebService contentService = SPWebService.ContentService;
        contentService.WebConfigModifications.Remove(GetConfigModification());

        // Serialize the Web application state and propagate changes across the farm.
        contentService.Update();

        // Save Web.config changes.
        contentService.ApplyWebConfigModifications();
    }
    catch (Exception e)
    {
        Console.WriteLine(e.ToString());
        throw;
    }
}
  • 111. Workflow Actions FilesA workflow actions file (.actions) is an XML file that describes how SharePoint Designer should interact with yourworkflow activities. Primarily, this file provides a human-readable sentence structure that is wrapped around themechanics of your activity. The workflow actions file consists of a parent WorkflowInfo element that can containconditions and actions.XML<WorkflowInfo> <Conditions And="and" Or="or" Not="not" When="If" Else="Else if"> ... </Conditions> <Actions Sequential="then" Parallel="and"> ... </Actions></WorkflowInfo>In the Conditions and Actions elements, the attribute values specify the sentence constructs that SharePointDesigner should use to describe the workflow when your activities are used in a particular way. For example, iftwo activities are used sequentially, SharePoint Designer will describe the workflow as activity Athenactivity B. Iftwo activities are used in parallel, the resulting sentence will be activity Aandactivity B. In the Workflow ActivitiesReference Implementation (Workflow Activities RI), the workflow actions file is named ExecutionModels.actions.Defining ActionsThe Action element describes the CreateSubSiteActivity class. First, the Action element identifies the classname and assembly name for the workflow activity.XML<Action Name="Create a Sub-Site" ClassName="ExecutionModels.Workflow.FullTrust.Activities.CreateSubSiteActivity" Assembly="ExecutionModels.Workflow.FullTrust.Activities, ..." AppliesTo="all" Category="Patterns and Practices">Next, a RuleDesigner element defines a sentence that will represent the workflow activity in SharePointDesigner. The sentence includes placeholders that are filled by parameters (which are defined in the next part ofthe Action element). The FieldBind child elements are used to map parameters to the placeholders in the ruledesigner sentence, as shown in the following code example. Some of the FieldBind elements have beenremoved from this example to aid readability.XML<RuleDesigner Sentence="Create subsite at %3 using the site template %4 using the title %5and description %6 and locale of %7. The site will be converted if exists? %2 The sitewill use unique permissions? %1"> <FieldBind Field="UseUniquePermissions" DesignerType="Boolean" Text="Use unique permissions for the sub-site" Id="1"/> <FieldBind Field="ConvertIfExists" DesignerType="Boolean" Text="Convert the sub-site to the template if it already exists" Id="2"/> <FieldBind Field="SiteUrl" DesignerType="Hyperlink" Text="The full URL of the site" Id="3"/> ...</RuleDesigner>Next, a Parameters element defines the inputs and outputs for the activity class, as shown in the following codeGenerated from CHM, not final book. Will be superseded in the future. Page 111
  • 112. example. Each parameter maps to a dependency property in the activity class. The parameters are referenced bythe F ieldBind elements shown in the preceding code example. Some of the Parameter elements have beenremoved from this example to aid readability.XML<Parameters> <Parameter Name="UseUniquePermissions" Type="System.Boolean, mscorlib" DisplayName="Use unique permissions" Direction="In" /> <Parameter Name="ConvertIfExists" Type="System.Boolean, mscorlib" DisplayName="Convert if exists" Direction="In" /> <Parameter Name="SiteUrl" Type="System.String, mscorlib" Direction="In" /> ...</Parameters>The second Action element, which describes the CreateSiteCollectionActivity class, follows a similar pattern.Defining ConditionsThe ExecutionModels.actions file includes a single Condition element that describes the SiteExistsActivityclass.XML<Condition Name="Site Exists" FunctionName="DoesSiteExistCondition" ClassName="ExecutionModels.Workflow.FullTrust.Activities.SiteExistsActivity" Assembly="ExecutionModels.Workflow.FullTrust.Activities, ..." AppliesTo="all" UsesCurrentItem="True"> <RuleDesigner Sentence="The site %1 exists"> <FieldBind Id="1" Field="_1_" Text=""/> </RuleDesigner> <Parameters> <Parameter Name="_1_" Type="System.String, mscorlib" Direction="In" /> </Parameters></Condition>This Condition element has a few key differences from the Action elements described earlier. First, you mustprovide a FunctionName attribute value to indicate that the condition logic is invoked through theDoesSiteExistCondition method. Next, note the naming convention for parameters. The parameter that willrepresent the site URL is named "_1_". This is because it is the first non-default argument that is provided to theDoesSiteExistCondition method. Additional parameters should be named "_2_", "_3_", and so on. It isessential to use this naming convention when you define a condition.Deploying a Workflow Actions FileWorkflow actions files are deployed to the TEMPLATE[Locale ID]Workflow folder in the SharePoint root folder(for example, TEMPLATE1033Workflow). The best way to deploy files to the SharePoint root from a Visual Studio2010 project is to add a SharePoint mapped folder to your project. This creates a folder within your project thatmaps to a folder on the SharePoint file system. When you deploy your solution, any files in your mapped folderare added to the corresponding location on the SharePoint file system. Note:Generated from CHM, not final book. Will be superseded in the future. Page 112
  • 113. To add a SharePoint mapped folder, right-click the project node in Solution Explorer, point to Add, and thenclick SharePoint Mapped F older. This launches a dialog box that lets you browse the SharePoint file systemand select a folder to map.In the Workflow Activities RI, a SharePoint mapped folder maps to the SharePoint root folder on the file system.Within the mapped folder, there is the TEMPLATE1033Workflow subfolder and the ExecutionModels.actions file,as shown in the following illustration.Using SharePoint mapped folders to deploy a workflow actions fileWhen you deploy your workflow actions file to this location, SharePoint automatically detects the new actions, andthey are made available in SharePoint Designer for use in declarative workflows. Note:In the Workflow Activities RI, there could also have been a mapped folder for theTEMPLATE1033Workflow folder directly. The reason for mapping the SharePoint root folder is to betterillustrate the target folder structure on the server file system, and to allow for the addition of workflow actionfiles for other locales at a later time.Generated from CHM, not final book. Will be superseded in the future. Page 113
Sandboxed Workflow Actions

In SharePoint 2010, you can create coded workflow actions that run in the sandbox environment. You can consume these workflow actions from declarative workflows in your site collection.

One of the key advantages to creating sandboxed workflow actions is that you do not need to add authorized type entries to the Web.config file. The workflow action itself can be packaged and deployed as a sandboxed solution, and the declarative workflow can be created in SharePoint Designer by site users with sufficient permissions. This enables the users of a site collection to create and deploy their own custom workflow functionality without the involvement of the IT team.

Creating and deploying a sandboxed workflow action to a SharePoint 2010 environment involves two key steps:

 Create a class that defines the logic for your workflow action.
 Create a feature element manifest that contains the workflow actions markup that describes your workflow action.

The Workflow Activities Reference Implementation (Workflow Activities RI) includes a sandboxed workflow action class named CopyLibrary, together with a feature manifest that provides the markup for the workflow action.
  • 115. Creating Sandboxed ActionsIn the Workflow Activities Reference Implementation (Workflow Activities RI), the CopyLibrary class is asandboxed workflow action class that copies a document library from the current site to a target site in the samesite collection. When you create a sandboxed workflow action class, the class must include a public method thatmeets the following criteria:  The method must accept an argument of type SPUserCodeWorkflowContext, plus any arguments required by your activity logic.  The method must return an object of type Hashtable.The CopyLibrary class provides a method named CopyLibraryActiv ity that meets the criteria in the followingcode example.C#public Hashtable CopyLibraryActivity(SPUserCodeWorkflowContext context, string libraryName, string targetSiteUrl){ return (CopyLibrary(context.WebUrl, libraryName, targetSiteUrl));}This method calls a second method named CopyLibrary that contains the workflow action logic. This performsthe actual work and is used because it is easier to test than the signature required by SharePoint Designer. Thismethod performs the actions shown in the following illustration.The Copy Library workflow actionGenerated from CHM, not final book. Will be superseded in the future. Page 115
  • 116. Note:CopyLibraryActiv ity calls CopyLibrary instead of having overloads because SharePoint is not capable ofdetermining which overload to use. Methods that are referenced for sandboxed workflow actions cannot beoverloaded.Generated from CHM, not final book. Will be superseded in the future. Page 116
  • 117. Notice how the method uses the hash table to return results. The method adds two entries to the hash table: astring value that indicates success or failure and an integer value that indicates the number of files copied fromthe source library to the target library.C#Hashtable results = new Hashtable();...int copiedFiles = CopyFolder(sourceLib.RootFolder, targetLib.RootFolder, true);results["status"] = "success";results["copiedFiles"] = copiedFiles;return results;If the source list does not exist, the method returns a status value of failure and a copied files count of zero.C#results["status"] = "failure";results["copiedFiles"] = 0;return (results);For more information about returning values from a sandboxed workflow action, see How to: C reate a SandboxedWorkflow Action.Generated from CHM, not final book. Will be superseded in the future. Page 117
  • 118. Packaging and Deploying Sandboxed ActionsWhen you work with sandboxed solutions, you cannot deploy a workflow actions file to the server file system.Instead, you can use a feature manifest file to deploy your workflow actions markup to the sandbox environment.The Workflow Activities Reference Implementation (Workflow Actions RI) includes a site-scoped feature namedCopyLibraryFeature. This feature deploys a component named CopyLibraryModule that consists of thefollowing feature manifest.XML<Elements xmlns="http://schemas.microsoft.com/sharepoint/"> <WorkflowActions> <Action Name="Copy Library" SandboxedFunction="true" Assembly="$SharePoint.Project.AssemblyFullName$" ClassName="ExecutionModels.Workflow.SandboxActivity.CopyLibrary" FunctionName="CopyLibraryActivity" AppliesTo="list" UsesCurrentItem="true" Category="Patterns and Practices Sandbox"> <RuleDesigner Sentence="Copy all items from library %1 to site %2"> <FieldBind Field="libraryName" Text="Library Name" Id="1" DesignerType="TextBox" /> <FieldBind Field="targetSiteUrl" Text="Target Site" Id="2" DesignerType="TextBox" /></RuleDesigner><Parameters> <Parameter Name="__Context" Type="Microsoft.SharePoint.WorkflowActions.WorkflowContext, Microsoft.SharePoint.WorkflowActions" Direction="In" DesignerType="Hide" /> <Parameter Name="libraryName" Type="System.String, mscorlib" Direction="In" DesignerType="TextBox" Description="The library to copy" /> <Parameter Name="targetSiteUrl" Type="System.String, mscorlib" Direction="In" DesignerType="TextBox" Description="The URL of the target site" /></Parameters> </Action> </WorkflowActions></Elements>In the WorkflowActions element, the Action element is defined in exactly the same way as actions in workflowactions files. Note that the Action element includes a SandboxedF unction="true" attribute. Also note that aFunctionName attribute is specified to indicate the method that contains the activity logic, because sandboxedworkflow actions cannot inherit from the base Activ ity class or implement the standard Activ ity.Executemethod.Finally, note the naming convention for the parameter that represents the SPUserCodeWorkflowContextargument to the sandboxed action method. The parameter must be named __Context. Take care to include thedouble-underscore prefix when you create your own sandboxed actions. This enables the SharePoint workflowengine to provide a wrapper for your sandboxed action, which allows you to use the action in declarativeworkflows.When you deploy and activate a feature that includes this manifest, the sandboxed workflow action isautomatically made available in SharePoint Designer for use in declarative workflows within the site collectionscope.Generated from CHM, not final book. Will be superseded in the future. Page 118
  • 119. Consuming Custom Components from a DeclarativeWorkflowThe Workflow Activities Reference Implementation includes a declarative workflow named C reate Project Site.The workflow is associated with the Estimate content type, which was created by the Sandbox ReferenceImplementation. The Estimate content type defines several fields, as shown in the following illustration.Estimate content type fieldsThe C reate Project Site workflow uses the values of the SOW Status field and the Projects Lookup field oneach estimate item during workflow execution. The workflow also defines two local variables, ProjectName andProjectSiteUrl, to manage the information that is passed to individual workflow actions.The following illustration shows the sentence designer for the C reate Project Site workflow in SharePointDesigner. Note:In the illustration, the line that begins "C reate subsite…" has been cut down because of width restrictions.Workflow designer for the Create Project Site workflowGenerated from CHM, not final book. Will be superseded in the future. Page 119
  • 120. As you can see, the workflow begins with a Wait for Field Change in Current Item action. This pauses theexecution of the workflow until the SOW Status field of the current item is set to Approved. The remainingworkflow actions are then contained within an Impersonation Step, because, in most cases, elevated permissionsare required to create a new site. An Impersonation Step runs as the identity of the user that associated theworkflow with a list or library. This allows the workflow to perform actions that the user would not normally beallowed to perform, such as creating a subsite for the project.The workflow then sets the values of the local variables:  ProjectName. This variable is set to the value of the Projects Lookup field in the current work item.  ProjectSiteUrl. This variable is set to a concatenation of the current site collection URL and the ProjectName variable.These variables are used as inputs to the actions that follow. Notice how the custom workflow actions are includedin the workflow designer:  The sentence "The site _ exists" is provided by the Site Exists condition defined in the ExecutionModels.actions file.  The sentence "Create subsite at _ using the site template _ using the title _ ..." is provided by the Create a Sub-Site action defined in the ExecutionModels.actions file.  The sentence "Copy all items from library _ to site _" is provided by the Copy Library action defined the C opyLibraryModule feature manifest.No additional configuration is required to use these custom actions in SharePoint Designer. The actionsautomatically become available when you deploy the ExecutionModels.actions file (for full-trust activities) and theC opyLibraryModule feature manifest (for the sandboxed workflow action) to the SharePoint environment.Generated from CHM, not final book. Will be superseded in the future. Page 120
  • 121. ConclusionThe Workflow Activities Reference Implementation (Workflow Activities RI) demonstrates best practiceapproaches to creating and deploying full-trust workflow activities and sandboxed workflow actions. The keypoints of interest include the following:  The creation and deployment of full-trust workflow activities  The creation and deployment of sandboxed workflow actions  The consumption of custom workflow actions from a declarative workflowWe recommend you deploy the Workflow Activities RI and to explore the different components and code withinthe ExecutionModels.Workflow solution. For more information about workflow activities, see Execution Models inSharePoint 2010.Generated from CHM, not final book. Will be superseded in the future. Page 121
  • 122. Reference Implementation: External ListsExternal lists enable you to view and manipulate external data in the same way that you can view and manipulatedata stored in regular SharePoint lists. In this reference implementation, we deploy a sandboxed solution thatincludes external lists. The solution demonstrates how sandboxed Web Parts can query and present the dataprovided by the external lists.External lists rely on external content types to interact with external data sources. You can view an externalcontent type as a bridge between the external list and the external data source. Unlike external lists, externalcontent types cannot be defined or used directly within sandboxed solutions. Instead, they are typically createddynamically in SharePoint Designer or in Visual Studio 2010 if they are more complex. They are then managedby the Business Data C onnectivity (BDC ) service application, which runs with full trust. This use of an external listin a sandboxed solution is an example of a hybrid approach to execution. It allows you to manipulate externaldata from within the sandbox environment, which would not otherwise be permitted. Note:Note: We recommend that you review Execution Models in SharePoint 2010 before studying this referenceimplementation. The Execution Models chapter provides a more detailed explanation of many of the conceptsdescribed here.This implementation demonstrates the details and best practices for the following areas:  C reating and consuming external lists from a sandboxed solution  C reating and managing external content types in the BDC service application  C onfiguring the Secure Store Service (SSS) to enable sandboxed code to impersonate credentials for external data sources Note:Note: This reference implementation makes use of many design and implementation techniques that were thefocus of the preceding reference implementations. These techniques include exception shielding, the use ofvarious design patterns, and the use of the client object model to launch modal dialog pages. For a detaileddiscussion of these approaches, see the sandbox reference implementation and the full trust proxy referenceimplementation.Solution ScenarioIn this example, suppose you are a SharePoint consultant working for a pharmaceutical company namedC ontoso, Inc. The procurement manager at C ontoso, Jim Hance, wants to be able to view a summary of financialtransactions with vendor organizations.At present, vendor transactions are recorded in C ontosos vendor management system, a proprietary databaseapplication based on SQL Server. To meet Jims requirements, you first create a set of external content types thatmap to different tables in the vendor management database. Next, you create a set of external lists to surfacethe data from the external content types. Finally, you create a Web Part that shows a list of vendors. Jane canclick a vendor name to start a modal dialog that displays a list of transactions for that vendor, as illustrated in thefollowing illustrations.Vendor list of transactionsVendor transaction detailsGenerated from CHM, not final book. Will be superseded in the future. Page 122
  • 123. Deploying the External Lists RIThe External Lists RI includes an automated installation script that creates various site collections, deploys the RIcomponents, and adds sample data. After running the installation script, browse to the Headquarters sitecollection at http://<Hostname>/sites/ Headquarters. You can open and run the project in Visual Studio, but thisdoes not create a site collection or add sample data. To see the fully functioning system, you must run theinstallation script.The following table answers questions you might have about how to get started with the External Lists RI.Question AnswerWhere can I find the <install location>SourceExecutionModelExternalListExternal Lists RI?Which file is the ExecutionModels.Sandbox.ExternalList.slnsolution file?What are the system SharePoint Server 2010 Standard or Enterprise Edition (for Secure Store Service)requirements?What preconditions are  You must be a member of the SharePoint Farm administrator group.required forinstallation?  You must be a member of the Windows administrator group.  SharePoint must be installed at http://<Hostname:80>. If you want to install SharePoint in a different location, you can edit the location settings in the Settings.xml file located in the Setup directory for the solution.  SharePoint 2010 Administration service must be running. By default, this service is set to a manual start. To start the service, click Start on the taskbar, point to Administrative Tools, click Serv ices, double-click SharePoint 2010 Administration serv ice, and then click Start.  The timer service must be running. To start the timer service, click Start on the taskbar, point to Administrative Tools, click Serv ices, double-click SharePoint 2010 timer, and then click Start.How do I Install the Follow the instructions in the Readme.txt file located in the project folder.External List RI?Generated from CHM, not final book. Will be superseded in the future. Page 123
  • 124. What is the default http://<Hostname>/sites/Headquartersinstallation location forthe External List RI? You can change this location by changing the Settings.xml in the Setup directory.How do I download the The External List RI is included in the download Developing Applications forExternal List RI? SharePoint 2010.Generated from CHM, not final book. Will be superseded in the future. Page 124
  • 125. Solution OverviewThis topic provides a high-level conceptual overview of the external list reference implementation. The solutionconsists of both sandboxed and full trust components, as illustrated in following illustration.Conceptual overview of the External List Reference ImplementationThe Vendor List Web Part displays a list of vendors, together with basic information about each vendor, such asaddress, telephone details, and the number of transactions. The number of transactions is rendered as ahyperlink. When the user clicks the hyperlink, a client-side JavaScript function opens the Transaction Details pagewithin a modal dialog box. The JavaScript function passes the vendor ID to the details page as a query stringparameter.The Transaction Details page is a Web Part page that hosts the Transaction List Web Part. The Transaction ListWeb Part loads the vendor ID from the page query string, and then retrieves and displays a list of transactions. Note:Note: The application pattern in which one Web Part starts another Web Part within a modal dialog box isdescribed in detail in the full trust proxy reference implementation.Both the Vendor List Web Part and the Transaction List Web Part obtain their data by running queries againstexternal lists. These external lists are created on a specific site and can be considered sandboxed components.Each external list is associated with an external content type, which maps to a table or a view in an externaldatabase and defines a series of operations for the external data. The external content types are managed by theBusiness Data C onnectivity (BDC ) service application. While you cannot directly interact with the BDC APIs fromsandboxed code, you can interact with external lists. In this way, external lists provide a mechanism for viewingand manipulating external data from within the sandbox environment.The BDC service uses impersonation to access the vendor management database. Because the requests to theBDC originate from the sandbox environment, the credentials used to access the database are mapped to theidentity of the user code service account, rather than to the identity of individual users.Generated from CHM, not final book. Will be superseded in the future. Page 125
  • 126. Environment ConfigurationBefore configuring the SharePoint environment to access external data, you first need to understand howcredentials are managed and passed between the different components in your solution. The following diagramsummarizes the process.Accessing external data from a sandboxed solutionThe following numbered steps correspond to the numbers shown in the diagram: 1. A Web Part within a sandboxed solution uses the SPList object model to request data from an external list. 2. The SPList object model call is sent to the user code proxy service. The user code proxy service passes the request to the Business Data C onnectivity service (BDC ) runtime, which also runs within the user code proxy service process. The identity associated with the request is the managed account that runs the user code proxy service. 3. The BDC runtime calls the Secure Store Service (SSS). The SSS returns the external credentials that are mapped to the identity of the user code proxy service. 4. The BDC runtime retrieves the external content type metadata from the BDC metadata cache. If the metadata is not already in the cache, the BDC runtime retrieves it from the BDC Service. The external content type metadata provides the information that the BDC runtime needs to interact with the Vendor Management system. 5. The BDC runtime uses impersonation and the external credentials retrieved from the SSS to retrieve data from the Vendor Management system.For a more detailed explanation of this process, see the Hybrid Approaches topic. The key point to understand isthat the external credentials used to access the Vendor Management system must be mapped to the identity ofthe user code proxy service, rather than to the identity of the actual user. To enable your SharePointenvironment to support this approach, you must complete three tasks:  C onfiguring the Secure Store Service. You must configure the SSS to map the identity of the user code proxy service to the external credentials required by the Vendor Management system.  C reating the External C ontent Types. You must add the external content types to the BDC service. You can either create external content types in SharePoint Designer or import a BDC model (.bdcm) file in the C entral Administration Web site.  C onfiguring Business Data C onnectivity Service Permissions Within the BDC , you must configure permissions on each individual external content type.Generated from CHM, not final book. Will be superseded in the future. Page 126
  • 128. Configuring the Secure Store ServiceThe Secure Store Service (SSS) maintains an encrypted database that maps the identities of SharePoint users,groups, or process accounts to the external credentials required to access external systems. When the BusinessData C atalog (BDC ) needs to impersonate external credentials to access a data source, it passes the identity ofthe caller to the SSS. The SSS then returns the external credentials that are mapped to the identity of the caller.Within the SSS, credentials mappings are organized by target applications. A target application represents anexternal system or data source, and includes a unique target application ID. When the BDC requests a set ofcredentials from the SSS, it specifies the target application ID so that the SSS knows which credential mapping toretrieve.In the external list reference implementation, we created a target application to represent the VendorManagement system. Within this target application, we mapped the identity of the user code proxy service to theexternal credentials required to access the Vendor Management system. To enable users to access the externallists from outside a sandboxed application, we also mapped individual user identities to the external credentialsrequired to access the Vendor Management system. This is illustrated by the following diagram.Configuring a target application in the Secure Store ServiceIn the external list reference implementation, the install script configures the SSS and creates a target applicationthat you can use. If you want to create your own target application, you can use the following procedure.To create a target application in the Secure Store Service 1. In the C entral Administration Web site, click Application Management, and then click Manage Serv ice Applications. 2. On the Manage Service Applications page, click Secure Store Serv ice. 3. On the ribbon, in the Manage Target Applications section, click New. 4. On the C reate New Secure Store Target Application page: a. Set the Target Application ID to SPGVM. b. Set the Display Name to SPG Vendor Management Application. c. Provide a contact e-mail address. d. Under Target Application Type, select Group as shown in the following illustration. C lick Next.Generated from CHM, not final book. Will be superseded in the future. Page 128
  • 129. Note:Note: A target application type of Group indicates that you want to map multiple identities to a single set ofcredentials. 5. On the next page, leave the credential fields set to Windows User Name and Windows Password, and then click Next. 6. On the next page, in the Target Application Administrators text box, add your administrative account. 7. In the Members text box, add the user code proxy service account and any user accounts or groups that require access to the external system, and then click OK. See the following illustration. 8. On the Secure Store Service page, on the SPGVM drop-down list, click Set Credentials, as shown in the following illustration.Generated from CHM, not final book. Will be superseded in the future. Page 129
  • 130. 9. On the Set C redentials for Secure Store Target Application page, provide the credentials that are required to access the external system, and then click OK.Generated from CHM, not final book. Will be superseded in the future. Page 130
  • 131. Creating the External Content TypesExternal content types are stored and managed by the Business Data C onnectivity (BDC ) service application. Youcan create external content types by using interactive tools in SharePoint Designer 2010. Alternatively, you canimport predefined external content types by uploading a Business Data C onnectivity Model (.bdcm) file to theC entral Administration Web site. Note:Note: The install script for the external list reference implementation uses a .bdcm file to create the externalcontent types required by the solution. You can export your external content types as a .bdcm file fromSharePoint Designer 2010 or the SharePoint C entral Administration web site.Essentially, an external content type consists of two components: a connection to an external data source, and aseries of operation definitions (commonly referred to as stereotyped operations) on the external data. When youcreate a connection, you first identify the type of data source—.NET type, SQL Server, or WindowsC ommunication Foundation (WC F) Service. For a SQL Server data source, you must specify the database server,the database name, and the type of impersonation. SQL Server connections can use the following different typesof impersonation:  Connect with users identity. The BDC uses the identity of the SharePoint user who requested the external data to authenticate with the data source.  Connect with impersonated Windows identity. The BDC sends the identity of the caller to the Secure Store Service (SSS). The SSS supplies the Windows credentials that are mapped to the identity of the caller. The BDC uses the mapped Windows credentials to authenticate with the data source.  Connection with impersonated custom identity. The BDC sends the identity of the caller to the Secure Store Service (SSS). The SSS supplies a set of custom credentials—such as a Forms authentication username and password—that are mapped to the identity of the caller. The BDC uses the mapped custom credentials to authenticate with the data source. Note:Note: If you want to use an impersonated Windows identity or an impersonated custom identity, you mustspecify the target application ID when you configure the connection. The SSS uses the target application ID toorganize credential mappings.The external list reference implementation could not use the connect with users identity approach because therequests for external data originate from the sandbox. When the BDC receives a request that originates from thesandbox, the request is made using the identity of the user code proxy service, rather than the identity of theuser. Instead, we used the connect with impersonated Windows identity approach to authenticate to the VendorManagement database. The following image shows the connection properties for the external content types in theexternal list reference implementation.Vendor Management Connection PropertiesOnce you have configured the SQL Server connection, the next step is to define a set of operations on a table,view, or stored procedure in the target database. The external list reference implementation includes threeexternal content types with the following operation definitions:  The Vendors external content type defines create, read item, update, delete, and readlist operations on the Vendors table in the Vendor Management database.  The Vendor Transactions external content type defines read item and read list operations on the VendorTransactionView view in the Vendor Management database. 
 The Vendor Transaction Types external content type defines create, read item, update, delete, and read list operations on the TransactionTypes table in the Vendor Management database.For more information about creating EC Ts, see How to: C reate External C ontent Types and How to: C reate anExternal C ontent Type Based on a SQL Server Table on MSDN.Generated from CHM, not final book. Will be superseded in the future. Page 131
  • 132. Configuring Business Data Connectivity ServicePermissionsWhen you have deployed your external content types to the Business Data C onnectivity (BDC ) serviceapplication, you must configure permissions on each individual external content type. You can assign the followingpermissions to users or groups on each external content type:  Edit. This permission enables the user or group to edit the external content type definition.  Execute. This permission enables the user or group to use the operations defined by the external content type, such as create, readitem, update, delete, and read list.  Selectable in clients. This permission enables the user or group to create an external list from the external content type. It also enables users to use the external content type Picker control.  Set Permissions. This permission enables the user or group to manage permissions on the external content type.The following illustration shows an example.Setting permissions on an external content typeThe key thing to remember is that your external content type might need to support requests from both insideand outside the sandbox environment. If you want to support requests for external data from sandboxedsolutions, you must assign Execute permissions to the user code proxy service account. If you want to supportrequests for external data from non-sandboxed components, you must assign Execute permissions to the siteusers or groups who require access to the data.For more information about setting permissions on external content types, see Manage external content types(SharePoint Server 2010) on TechNet.Generated from CHM, not final book. Will be superseded in the future. Page 132
  • 133. Sandboxed ComponentsThe external list reference implementation includes two Web Parts that are configured to provide a master-detailsuser experience. The Vendor List Web Part displays a list of vendors with summary information for each individualvendor as shown in the following illustration.Vendor List Web PartWhen a user clicks an item in the TransactionCount column, a JavaScript function launches a Web Part page asa modal dialog box. This Web Part page contains the Vendor Transaction List Web Part, which displays a list of thetransactions that correspond to the selected vendor.Vendor Transaction List Web Part Note:For a detailed explanation of this application pattern, in which the master Web Part uses a JavaScript functionto launch the details Web Part as a modal dialog, see Reference Implementation: Full Trust Proxies forSandboxed Solutions.Both the Vendor List Web Part and the Vendor Transaction List Web Part are implemented by using theModel-View-Presenter (MVP) pattern. In this case, the model class is shared by both Web Parts, as shown by thefollowing class diagram.Class structure of the Vendor List Web PartGenerated from CHM, not final book. Will be superseded in the future. Page 133
  • 134. Class structure of the Vendor Transaction List Web Part Note:Note: For a detailed explanation of how to implement the MVP pattern in a Web Part, see ReferenceImplementation: The Sandbox Execution Model.Many elements of this solution, such as the master-details Web Parts and the MVP pattern, are described in thedocumentation for the preceding reference implementations. For information about how to create and interactwith external lists, see C reating External Lists. The retrieval of data from external lists takes place entirely withinthe VendorServ ice class.Generated from CHM, not final book. Will be superseded in the future. Page 134
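For orientation, the following sketch shows one way the presenter and view split described above can fit together. The interface and presenter names used here are hypothetical (the actual class names appear in the diagrams above); only the VendorService model class and its GetAllVendorsWithTransactionCount method come from the reference implementation.

C#
// Hypothetical view contract: the Web Part (or its user control) would implement this interface.
public interface IVendorListView
{
    // The presenter pushes the vendor rows to the view for rendering.
    System.Data.DataTable Vendors { set; }
}

// Hypothetical presenter: coordinates between the view and the shared model class.
public class VendorListPresenter
{
    private readonly IVendorListView view;
    private readonly VendorService model;

    public VendorListPresenter(IVendorListView view, VendorService model)
    {
        this.view = view;
        this.model = model;
    }

    public void LoadVendors()
    {
        // The model queries the external lists; the presenter hands the result to the view.
        view.Vendors = model.GetAllVendorsWithTransactionCount();
    }
}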
Creating External Lists

External lists are SharePoint lists that are bound to a single external content type. Unlike external content types, which are managed by the Business Data Connectivity (BDC) service application, external lists are created within a specific SharePoint site, just like regular SharePoint lists. You can create external lists in three ways:

 Interactively, by selecting Create Lists & Form on the external content type settings page in SharePoint Designer
 Directly, by creating a new external list in the SharePoint site user interface, and then selecting the external content type to which you want to bind
 Programmatically, in a farm solution or a sandboxed solution (a sketch of this approach appears at the end of this topic)

The external list reference implementation defines three external lists, which correspond to the external content types described earlier:

 The Vendors external list is bound to the Vendors external content type.
 The Vendor Transactions external list is bound to the Vendor Transactions external content type.
 The Vendor Transaction Types external list is bound to the Vendor Transaction Types external content type.

Creating an external list is straightforward and requires no additional configuration. The external list is displayed and managed in the same way as a regular SharePoint list. For example, the following illustration shows the Vendors external list in the Web browser.

Vendors external list

You can also use SharePoint Designer to edit external lists. For example, you can add or remove columns, and you can create views, forms, workflows, and custom actions for your external lists. For more information on how to create external lists, see How to: Create External Lists in SharePoint on MSDN.

You can use the SPList API to interact programmatically with external lists in exactly the same way that you would interact with regular SharePoint lists. Because the SPList API is available within the sandbox environment, you can interact with external lists from within your sandboxed solution code.

In the external list reference implementation, all the interaction with external lists takes place within the VendorService class. This class provides the data model for both the Vendor List Web Part and the Vendor Transaction List Web Part. The presenter class for the Vendor List Web Part calls the GetAllVendorsWithTransactionCount method to populate the Web Part, as shown in the following example.

C#
public DataTable GetAllVendorsWithTransactionCount()
{
    var vendors = GetAllVendors();
    vendors.Columns.Add("TransactionCount");
    var columnIndex = vendors.Columns.Count - 1;

    foreach (DataRow row in vendors.Rows)
    {
        int vendorId = int.Parse(row.ItemArray[0].ToString());
        row[columnIndex] = GetTransactionCountByVendor(vendorId);
    }

    return vendors;
}

As the example shows, the GetAllVendorsWithTransactionCount method relies on two helper methods. First, the GetAllVendors method is used to retrieve all the data in the Vendors external list, as shown in the following example.

C#
public DataTable GetAllVendors()
{
    var web = SPContext.Current.Web;
    var dt = web.Lists[Constants.ectVendorListName].Items.GetDataTable();
    return dt;
}

Next, the GetTransactionCountByVendor method is used to get the number of transactions that are stored for each vendor. This information is used to populate the TransactionCount column in the Vendor List Web Part. The method builds a Collaborative Application Markup Language (CAML) query to count the number of transactions in the Vendor Transactions external list that correspond to the specified vendor:

C#
public int GetTransactionCountByVendor(int vendorId)
{
    var query = new SPQuery
    {
        ViewFields = "<FieldRef Name='ID' />",
        Query = string.Format(
            "<Where><Eq><FieldRef Name='VendorID' />" +
            "<Value Type='Counter'>{0}</Value></Eq></Where>",
            vendorId.ToString())
    };

    return SPContext.Current.Web.Lists[Constants.ectVendorTransactionListName]
        .GetItems(query).Count;
}

The presenter class for the Vendor Transaction List Web Part calls the GetTransactionByVendor method to populate the Web Part, as shown in the following example.

C#
public DataTable GetTransactionByVendor(int vendorId)
{
    var query = new SPQuery
    {
        ViewFields = "<FieldRef Name='Name' />" +
                     "<FieldRef Name='TransactionType' />" +
                     "<FieldRef Name='Amount' />" +
                     "<FieldRef Name='Notes' />",
        Query = string.Format(
            "<Where><Eq><FieldRef Name='VendorID' />" +
            "<Value Type='Counter'>{0}</Value></Eq></Where>",
            vendorId.ToString())
    };

    return SPContext.Current.Web.Lists[Constants.ectVendorTransactionListName]
        .GetItems(query).GetDataTable();
}

In all of these code examples, the external lists are used in the same way as regular lists. The object model calls and the query syntax are the same regardless of the type of list.
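The reference implementation creates its external lists through the installation script and SharePoint Designer, but the programmatic option mentioned at the start of this topic is also available. The following is a rough sketch of that approach, assuming the SPListDataSource-based overload of SPListCollection.Add; the entity names, list names, and URLs shown here are hypothetical, and the exact property names should be checked against the How to: Create External Lists in SharePoint topic referenced above.

C#
// Rough sketch only. Requires: using System; using Microsoft.SharePoint;
using (SPSite site = new SPSite("http://intranet.contoso.com/sites/headquarters"))
using (SPWeb web = site.OpenWeb())
{
    SPListDataSource dataSource = new SPListDataSource();

    // Identify the external content type (entity) that the list should bind to.
    dataSource.SetProperty(SPListDataSource.BDCProperties.LobSystemInstance, "VendorManagement");
    dataSource.SetProperty(SPListDataSource.BDCProperties.EntityNamespace, "http://intranet.contoso.com");
    dataSource.SetProperty(SPListDataSource.BDCProperties.Entity, "Vendors");
    dataSource.SetProperty(SPListDataSource.BDCProperties.SpecificFinder, "ReadVendor");

    // Create the external list in the current web.
    web.Lists.Add("Vendors", "Vendor master data", "Lists/Vendors", dataSource);
}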
  • 138. ConclusionThe external list reference implementation demonstrates best practice approaches to consuming external datafrom within sandboxed solutions. After reviewing the reference implementation, you should understand thefollowing:  The configuration of the Secure Store Service to support external data access from the sandbox environment  The creation and configuration of external content types for a SQL Server data source  The consumption of external data from the sandbox environment via an external listWe encourage you to deploy the reference implementation and to explore the different components and code inthe ExecutionModels.Sandboxed.ExternalList solution. For more information on using external lists within thesandbox environment, see Execution Models in SharePoint 2010.Generated from CHM, not final book. Will be superseded in the future. Page 138
  • 139. Execution Models: How-to TopicsWhen the execution models reference implementations were developed, several tasks proved challenging toaccomplish. In some cases the procedures were simple but hard to find, in other cases they were somewhatcomplicated. The following topics address these issues and provide step-by-step instructions on how to completethe tasks.  How to: Set the Deployment Order for a Project  How to: C reate a Sandboxed Workflow Action  How to: C reate and Register a Sandbox Proxy  How to: C reate a Web Application-Scoped Timer Job  How to: Debug a Feature Receiver  How to: Debug a Timer Job  How to: Deploy an Application Page to C entral Administration  How to: Deploy a Document Template in a Sandboxed Solution  How to: Deploy a Web Part Page in a Sandboxed Solution  How to: Display a Page as a Modal Dialog Box  How to: Log to the History List from a Workflow Activity  How to: Import and Package a Declarative Workflow in Visual StudioGenerated from CHM, not final book. Will be superseded in the future. Page 139
  • 140. How to: Set the Deployment Order for a ProjectOverviewWhen you create large Microsoft® SharePoint® solutions in Microsoft Visual Studio® 2010 development system,your solutions will often contain several features. Additionally, these features will often include dependencies onother features within the solution package. In this situation, you must configure the solution to deploy yourfeatures in the correct order to prevent Visual Studio from attempting to activate a feature before itsdependencies are in place. This how-to topic describes how to set the deployment order for features in a VisualStudio 2010 SharePoint solution. Note:This topic uses the ExecutionModels.Sandboxed solution as an example of a project with featuredependencies.StepsTo change the project deployment order 1. Open a Visual Studio 2010 solution based on one of the SharePoint project templates. 2. In Solution Explorer, double-click the Package node. The Package Designer window opens.Generated from CHM, not final book. Will be superseded in the future. Page 140
  • 141. 3. Select an item in the Package Designer window, and use the up and down arrow buttons on the right side to change the deployment order.Generated from CHM, not final book. Will be superseded in the future. Page 141
  • 142. Note:Features at the top of the Package Designer window are deployed first. If a feature includes activationdependencies, make sure that the feature appears beneath its dependencies in the Package Designer window. 4. Repeat this procedure until you achieve the desired order.Generated from CHM, not final book. Will be superseded in the future. Page 142
  • 143. How to: Create a Sandboxed Workflow ActionOverviewMicrosoft® SharePoint® 2010 enables you to create and deploy custom workflow actions as sandboxed solutions.You can then consume these workflow actions within declarative workflows in SharePoint Designer. This how-totopic describes how to create and deploy a sandboxed workflow action.For a practical example of the deployment of a sandboxed workflow action, see the workflow activities referenceimplementation.Summary of StepsThis how-to topic includes the following steps:  Step 1: Create the SharePoint Project. In this step, you create a Visual Studio 2010 project that you can use to deploy and test your sandbox proxy.  Step 2: Create the Workflow Action Class. In this step, you create a class that contains your sandboxed logic.  Step 3: Create the Workflow Action Definition. In this step, you create a feature manifest file. This contains the markup that references your class and defines your workflow action.Step 1: Create the SharePoint ProjectThis procedure creates a SharePoint project in the Microsoft Visual Studio® 2010 development system. You canuse this project to build and deploy your sandboxed workflow action.To create a project 1. Start Visual Studio 2010 and create a new Empty SharePoint Project, as shown in the following illustration. Name the project SimpleAction.Generated from CHM, not final book. Will be superseded in the future. Page 143
  • 144. 2. In the SharePoint C ustomization Wizard, specify a valid local site for debugging, select Deploy as a sandboxed solution, and then click Finish.Generated from CHM, not final book. Will be superseded in the future. Page 144
  • 145. Step 2: Create the Workflow ActionThis procedure creates a sandboxed workflow action class. It contains a method that defines the workflow actionlogic, and it shows you how to return values to the workflow.To create a sandboxed workflow action class 1. Add a new class named SandboxActiv ityLog to the project. 2. Add the following using statements to your class.C#using System.Collections;using Microsoft.SharePoint;using Microsoft.SharePoint.UserCode;using Microsoft.SharePoint.Workflow; 3. Add the public access modifier to your class. 4. Within the class, add a public method that accepts an argument of type SPUserCodeWorkflowContext and returns a value of type Hashtable. This method defines the workflow action.C#public class SandboxActivityLog{ public Hashtable Log(SPUserCodeWorkflowContext context) { }} 5. Within your method, implement your action logic. You can return values from your method by adding key/value pairs to a Hashtable object.Generated from CHM, not final book. Will be superseded in the future. Page 145
  • 146. C#public Hashtable Log(SPUserCodeWorkflowContext context){ Hashtable results = new Hashtable(); results["Except"] = string.Empty; try { using (SPSite site = new SPSite(context.CurrentWebUrl)) { using (SPWeb web = site.OpenWeb()) { SPWorkflow.CreateHistoryEvent(web, context.WorkflowInstanceId, 0, web.CurrentUser, TimeSpan.Zero, "Information", "Event from sandboxed activity", string.Empty); } } } catch (Exception ex) { results["Except"] = ex.ToString(); } results["Status"] = "Success"; return (results);} Note:Notice the use of the Except key to return an exception to the workflow.Step 3: Create the Workflow Action DefinitionThis procedure creates a workflow action definition in a feature manifest file. This markup tells the SharePointworkflow engine how to interact with your workflow action.To create an action definition for a sandboxed workflow action 1. In Solution Explorer, right-click the project node, point to Add, and then click New Item. 2. In the Add New Item dialog box, in the Installed Templates pane, expand SharePoint, and then click 2010. 3. C lick Empty Element, type a name for the element in the Name text box, and then click Add. This example uses the name LogDefinition for the element.Generated from CHM, not final book. Will be superseded in the future. Page 146
4. Expand the LogDefinition node and open the feature manifest file (Elements.xml).
5. Add a WorkflowActions element to the feature manifest.
6. Within the WorkflowActions element, add an Action element. This should do the following:
   a. Provide a friendly name for your workflow action.
   b. Specify that the action runs in the sandbox.
   c. Identify the assembly name and class name of your action.
   d. Identify the method that provides the action's functionality.

XML
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <WorkflowActions>
    <Action Name="Log Testing"
            SandboxedFunction="true"
            Assembly="$SharePoint.Project.AssemblyFullName$"
            ClassName="SimpleAction.SandboxActivityLog"
            FunctionName="Log"
            AppliesTo="all"
            Category="patterns and practices sandbox">
    </Action>
  </WorkflowActions>
</Elements>

7. Within the Action element, add a RuleDesigner element. This specifies the sentence that your workflow action will display in SharePoint Designer. It also binds the Except argument returned by your action to a variable named Exception.

XML
<RuleDesigner Sentence="Log Activity (Exception to %1)">
  <FieldBind Field="Except"
             Text="Exception"
             Id="1"
             DesignerType="ParameterNames" />
</RuleDesigner>

8. Within the Action element, add a Parameters element. This should define the arguments passed to your workflow action method and the types returned by your workflow action method.

XML
<Parameters>
  <Parameter Name="__Context"
             Type="Microsoft.SharePoint.WorkflowActions.WorkflowContext, Microsoft.SharePoint.WorkflowActions"
             Direction="In"
             DesignerType="Hide" />
  <Parameter Name="Except"
             Type="System.String, mscorlib"
             Direction="Out"
             DesignerType="ParameterNames"
             Description="Exception encountered" />
</Parameters>

Note: The context argument must be preceded by a double underscore, as shown (__Context).

9. When you added the empty element to your project, Visual Studio created a feature named Feature 1. In Solution Explorer, right-click Feature 1, and then click Rename. This example uses the name SimpleActionFeature for the feature.
10. Double-click SimpleActionFeature to open the feature designer, and then change the scope of the feature to Site.
11. Press F5 to deploy and test your sandboxed workflow action.
12. To verify that your sandboxed workflow action deployed successfully, open SharePoint Designer and create a new workflow. You should find that the Log Testing action has been added to the list of available actions.
  • 149. How to: Create and Register a Sandbox ProxyOverviewSandbox proxies are components that enable you to make full-trust functionality available to sandboxedsolutions. This how-to topic describes how to create a sandbox proxy in the Microsoft® Visual Studio® 2010development system.For more information about sandbox proxies, see Execution Models in SharePoint 2010. For an example ofsandbox proxies in action, see the Sandbox Proxy Reference Implementation.Summary of StepsThis how-to topic includes the following steps:  Step 1: Create the SharePoint Project. In this step, you create a Visual Studio 2010 project that you can use to deploy and test your sandbox proxy.  Step 2: Create the Proxy Arguments Class. In this step, you create a serializable type that you can use to pass arguments from the sandbox to the sandbox proxy.  Step 3: Create the Proxy Operation Class. In this step, you create a class that contains the full-trust logic that you want to expose to sandboxed solutions.  Step 4: Register the Proxy Operation. In this step, you register your proxy operation with the user code service in order to make your operation available to sandboxed callers.  Step 5: Use the Proxy Operation from the Sandbox. In this step, you call your proxy operation from a sandboxed solution.Step 1: Create the SharePoint ProjectThis procedure creates a Microsoft® SharePoint® project in Visual Studio 2010. You can use this project to buildand deploy your sandbox proxy assembly.To create a sandbox proxy project 1. Start Visual Studio 2010, and create a new Empty SharePoint Project, as shown in the following illustration. Name the project SimpleProxy.Generated from CHM, not final book. Will be superseded in the future. Page 149
  • 150. 2. In the SharePoint C ustomization Wizard, specify a valid local site for debugging, select Deploy as a farm solution, and then click Finish.Generated from CHM, not final book. Will be superseded in the future. Page 150
3. Open the AssemblyInfo class file and add the assembly-level AllowPartiallyTrustedCallers attribute, as shown here.
C#
[assembly: AllowPartiallyTrustedCallers]
Step 2: Create the Proxy Arguments Class
This procedure creates a proxy arguments class. This is a serializable type that you can use to pass data from sandboxed code to the proxy operation class.
To create a proxy arguments class
1. Add a new class named SimpleProxyArgs to the project.
2. Add the following using statement to your class.
C#
using Microsoft.SharePoint.UserCode;
3. Add the public access modifier to your class.
4. Add the SerializableAttribute to your class.
5. Modify your class to inherit from the SPProxyOperationArgs class. Your class should resemble the following code.
C#
[Serializable]
public class SimpleProxyArgs : SPProxyOperationArgs
{
}
6. Add public properties for any arguments that you want to pass from the sandbox to the proxy.
C#
public string ClientName { get; set; }
public int ClientID { get; set; }
Note:
Ensure that any properties you create use serializable types.
7. As a good practice, add read-only static properties for the type name and assembly name of your proxy operations class (which you will create in the next step). These values are required when you register your proxy operation and when you invoke the operation from sandboxed code; defining them on the proxy arguments class ensures that they are available wherever they are needed. The type name should be fully qualified, and the assembly name should be the four-part strong name of your assembly.
C#
public static string ProxyOperationTypeName
{
    get { return "SimpleProxy.SimpleProxyOps"; }
}

public static string ProxyAssemblyName
{
    get
    {
        return "SimpleProxy, Version=1.0.0.0, Culture=neutral, PublicKeyToken=2dfe43bced7458f6";
    }
}
Step 3: Create the Proxy Operations Class
This procedure describes how to create a proxy operations class that exposes full-trust logic to the sandbox environment.
To create a proxy operations class
1. Add a new class named SimpleProxyOps to the project.
2. Add the following using statement to your class.
C#
using Microsoft.SharePoint.UserCode;
3. Add the public access modifier to your class.
4. Modify your class to inherit from the SPProxyOperation class. Your class should resemble the following code.
C#
public class SimpleProxyOps : SPProxyOperation
{
}
5. Within the SimpleProxyOps class, override the Execute method. The method should accept an argument of type SPProxyOperationArgs and return an Object.
C#
public override object Execute(SPProxyOperationArgs args)
{
}
6. Within the Execute method, cast the SPProxyOperationArgs parameter to your proxy arguments type, which in this case is SimpleProxyArgs.
C#
var proxyArgs = args as SimpleProxyArgs;
7. Retrieve your arguments from the proxy arguments class, perform any full-trust logic, and return an object to the caller. In this example, assume that your class includes a helper method named GetAvailableCredit that calls a Windows Communication Foundation (WCF) service and returns a double value.
  • 153. C#// Retrieve arguments from the proxy arguments class.string clientName = proxyArgs.ClientName;int clientID = proxyArgs.ClientID;// Perform full-trust logic; for example, call a WCF service.double availableCredit = GetAvailableCredit(clientName, clientID);// Return an object to the caller.return availableCredit; Note:Exception handling has been omitted for brevity. You should validate the incoming arguments and trapexceptions that occur within your logic.Step 4: Register the Proxy OperationThis procedure describes how to create a feature receiver class to register your proxy operation with the usercode service. This makes your proxy operation available to callers in the sandbox environment.To register a proxy operation 1. Add a new feature named SimpleProxyFeature to the project. To do this, right-click F eatures in Solution Explorer, and then click Add F eature. To rename the feature, right-click the new feature name, and then click Rename. 2. In the Feature Designer window, in the Scope drop-down list box, click Farm. 3. Add an event receiver to the SimpleProxyFeature. To do this, right-click SimpleProxyF eature in Solution Explorer, and then click Add Ev ent Receiver. 4. Add the following using statements to the SimpleProxyF eature.EventReceiver class.C#using Microsoft.SharePoint.Administration;using Microsoft.SharePoint.UserCode; 5. Uncomment the FeatureActiv ated method. 6. In the FeatureActiv ated method, add the following code to retrieve the local user code service.C#SPUserCodeService userCodeService = SPUserCodeService.Local; 7. Add the following code to create a new proxy operation type, based on your proxy operation class.C#var simpleOperation = new SPProxyOperationType( SimpleProxyArgs.ProxyAssemblyName, SimpleProxyArgs.ProxyOperationTypeName); 8. Add the following code to register your proxy operation type with the local user code service.C#userCodeService.ProxyOperationTypes.Add(simpleOperation);userCodeService.Update(); 9. Press F5 to deploy your sandbox proxy to the test environment.Step 5: Use the Proxy Operation from the SandboxThis procedure describes how to call a registered proxy operation from sandboxed code.To use a sandbox proxy 1. In your sandboxed solution, add a reference to the sandbox proxy assembly. 2. C reate an instance of the proxy arguments class and set any property values.C#var proxyArgs = new SimpleProxyArgs();proxyArgs.ClientName = "Adventure Works";proxyArgs.ClientID = 1; 3. C all the SPUtility.ExecuteRegisteredProxyOperation method, passing in the assembly name of theGenerated from CHM, not final book. Will be superseded in the future. Page 153
proxy operation, the type name of the proxy operations class, and the proxy arguments instance. In this case, the assembly name and the type name are provided by static properties of the proxy arguments class, as described in step 2.
C#
var result = SPUtility.ExecuteRegisteredProxyOperation(
    SimpleProxyArgs.ProxyAssemblyName,
    SimpleProxyArgs.ProxyOperationTypeName,
    proxyArgs);
4. Cast the returned value to the expected return type of the proxy operation.
C#
double availableCredit = (double)result;
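For reference, the pieces of the proxy operations class described in Step 3 fit together roughly as shown below. This is a minimal sketch rather than the reference implementation's code: the GetAvailableCredit helper is a placeholder for whatever full-trust call (for example, a WCF client) your proxy actually makes, and the error handling is deliberately simple.
C#
using System;
using Microsoft.SharePoint.UserCode;

namespace SimpleProxy
{
    public class SimpleProxyOps : SPProxyOperation
    {
        public override object Execute(SPProxyOperationArgs args)
        {
            var proxyArgs = args as SimpleProxyArgs;
            if (proxyArgs == null)
            {
                throw new ArgumentException("Expected arguments of type SimpleProxyArgs.", "args");
            }

            // Perform the full-trust work and return a serializable result to the sandboxed caller.
            return GetAvailableCredit(proxyArgs.ClientName, proxyArgs.ClientID);
        }

        // Placeholder for the full-trust logic, such as a WCF service call.
        private static double GetAvailableCredit(string clientName, int clientId)
        {
            return 1000.0;
        }
    }
}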
  • 155. How to: Create a Web Application-Scoped Timer JobOverviewIn Microsoft® SharePoint® 2010, you can associate a timer job with a Web application (SPWebApplication) or ashared service (SPService). This how-to topic describes how to create a timer job and scope it to a Webapplication.For a practical example of the creation and deployment of a timer job, see the Farm Solution ReferenceImplementation. Note:This how-to topic assumes that your test site includes a Tasks list. The Tasks list is created by default if youuse the Team Site template. If you used the Blank Site template to create your site, you will need to add aTasks list manually.Summary of StepsThis how-to topic includes the following steps:  Step 1: Create the SharePoint Project. In this step, you create a Microsoft Visual Studio® 2010 project that you can use to deploy and test your timer job.  Step 2: Create the Job Definition Class. In this step, you create a job definition class that contains your timer job logic.  Step 3: Create a F eature to Register the Job. In this step, you use a feature receiver class to install the timer job in your SharePoint environment.Step 1: Create the SharePoint ProjectThis procedure creates a SharePoint project in Visual Studio 2010. You can use this project to build and deployyour timer job assembly.To create a sandbox proxy project 1. Start Visual Studio 2010, and then create a new Empty SharePoint Project, as shown in the following illustration. Name the project SimpleTimerJob.Generated from CHM, not final book. Will be superseded in the future. Page 155
  • 156. 2. In the SharePoint C ustomization Wizard, specify a valid local site for debugging, select Deploy as a farm solution, and then click Finish.Generated from CHM, not final book. Will be superseded in the future. Page 156
  • 157. Step 2: Create the Job Definition ClassThis procedure describes how to create a job definition class. The job definition class encapsulates your timer joblogic.To create a job definition class 1. Add a new class named SimpleJobDefinition to the project. 2. Add the following using statement to the class.C#using Microsoft.SharePoint;using Microsoft.SharePoint.Administration; 3. Add the public access modifier to the class. 4. Modify the class to inherit from the SPJobDefinition class. Your class should resemble the following code.C#public class SimpleJobDefinition : SPJobDefinition { } 5. Within the SimpleJobDefinition class, add a public constant string named JobName.C#public const string JobName = "SimpleJobDefinition"; 6. Note:Note: You must provide a job name when you add or remove a job definition. By defining the job name as aconstant in the job definition class, you ensure that it is always available and remains unchanged. 7. Add a default constructor to the class that inherits from the default constructor of the base class. This isGenerated from CHM, not final book. Will be superseded in the future. Page 157
required for the serialization and de-serialization of your timer job.
C#
public SimpleJobDefinition() : base() { }
8. Add a constructor that accepts an argument of type SPWebApplication, as shown in the following code. This enables the base SPJobDefinition class to instantiate your timer job within a specific Web application.
C#
public SimpleJobDefinition(SPWebApplication webApp)
    : base(JobName, webApp, null, SPJobLockType.Job)
{ }
Note:
The SPJobDefinition base constructor provides alternative overloads that you can use if you want to scope your timer job to an SPService instance instead of a Web application. For more information, see SPJobDefinition Class on MSDN.
9. Within the constructor, give your timer job a title. This specifies how your timer job appears in the SharePoint Central Administration Web site.
C#
Title = "Simple Job Definition";
10. Within the SimpleJobDefinition class, override the Execute method. The method should accept an argument of type Guid. In this case, the GUID represents the target Web application.
C#
public override void Execute(Guid targetInstanceId)
{
}
11. Add your timer job logic to the Execute method. This example simply adds an item to the task list on the root site of the Web application. The job logic sets the title of the task to the current date and time.
C#
public override void Execute(Guid targetInstanceId)
{
    // Execute the timer job logic.
    SPWebApplication webApp = this.Parent as SPWebApplication;
    SPList taskList = webApp.Sites[0].RootWeb.Lists["Tasks"];
    SPListItem newTask = taskList.Items.Add();
    newTask["Title"] = DateTime.Now.ToString();
    newTask.Update();
}
Step 3: Create a Feature to Register the Job
This procedure describes how to use a feature receiver class to install the timer job in your SharePoint environment.
To register a Web application-scoped timer job
1. In Solution Explorer, right-click the Features node, and then click Add Feature.
2. In Solution Explorer, right-click the new feature node, and then click Rename. This example uses the name SimpleJobFeature for the feature.
3. Double-click the SimpleJobFeature node to open the feature designer window, and then set the scope of the feature to WebApplication.
4. Add an event receiver to the SimpleJobFeature. To do this, right-click SimpleJobFeature in Solution Explorer, and then click Add Event Receiver.
5. In the SimpleJobFeatureEventReceiver class, add the following using statement.
C#
using Microsoft.SharePoint.Administration;
6. Add a method named DeleteJob that accepts an argument of type SPJobDefinitionCollection. The method should iterate through the job definition collection and delete any instances of SimpleJobDefinition.
C#
private void DeleteJob(SPJobDefinitionCollection jobs)
{
    foreach (SPJobDefinition job in jobs)
    {
        if (job.Name.Equals(SimpleJobDefinition.JobName, StringComparison.OrdinalIgnoreCase))
        {
            job.Delete();
        }
    }
}
7. Uncomment the FeatureActivated method.
8. In the FeatureActivated method, add the following code to register the job definition, and set the job schedule to run once every minute.
C#
SPWebApplication webApp = properties.Feature.Parent as SPWebApplication;
DeleteJob(webApp.JobDefinitions);

SimpleJobDefinition simpleJob = new SimpleJobDefinition(webApp);

SPMinuteSchedule schedule = new SPMinuteSchedule();
schedule.BeginSecond = 0;
schedule.EndSecond = 59;
schedule.Interval = 1;
simpleJob.Schedule = schedule;
simpleJob.Update();
9. Uncomment the FeatureDeactivating method.
10. In the FeatureDeactivating method, add the following code to remove the job definition from the Web application.
C#
SPWebApplication webApp = properties.Feature.Parent as SPWebApplication;
DeleteJob(webApp.JobDefinitions);
11. Press F5 to debug your timer job. After a few minutes, browse to the task list on your root site and verify that the timer job has added a new task approximately once every minute.
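The SPMinuteSchedule shown in step 8 is convenient for testing, but a production job would normally run less often. As a rough illustration (the time window chosen here is arbitrary), the registration code could instead use one of the other schedule types in Microsoft.SharePoint.Administration, such as SPDailySchedule.
C#
// Run the job once a day, at some point between 02:00:00 and 02:05:00.
SPDailySchedule dailySchedule = new SPDailySchedule();
dailySchedule.BeginHour = 2;
dailySchedule.BeginMinute = 0;
dailySchedule.BeginSecond = 0;
dailySchedule.EndHour = 2;
dailySchedule.EndMinute = 5;
dailySchedule.EndSecond = 0;
simpleJob.Schedule = dailySchedule;
simpleJob.Update();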
  • 160. How to: Debug a Feature ReceiverOverviewOne of the new features in the Microsoft® Visual Studio® 2010 development system is the ability to press F5 todebug Microsoft SharePoint® components, such as feature receivers. However, to debug a feature receiver class,you must configure your SharePoint project to use the No Activation deployment configuration. This means thatVisual Studio 2010 will install your features, but you must manually activate them through the user interface. Thishow-to topic describes how to change your deployment configuration and debug a feature receiver class. Note:This how-to topic uses the ExecutionModels.Sandboxed solution as an example of a project with featurereceiver classes.StepsTo debug a feature receiv er in Visual Studio 2010 1. Open the Visual Studio 2010 project that contains your feature. 2. In Solution Explorer, right-click the project node, and then click Properties. 3. On the SharePoint tab, in the Activ e Deployment Configuration drop-down list, click No Activ ation. 4. Open the feature receiver class that you want to debug, and then insert a breakpoint.Generated from CHM, not final book. Will be superseded in the future. Page 160
  • 161. 5. Press F5 to deploy and debug your solution. In the Attach Security Warning dialog box, click OK. 6. Activate your feature through the browser user interface. 7. Verify that the debugger stops at your breakpoint.Generated from CHM, not final book. Will be superseded in the future. Page 161
  • 162. How to: Debug a Timer JobOverviewWhen you create a timer job for Microsoft® SharePoint® 2010, you cannot directly press F5 to debug your code.Instead, you must attach the debugger to the SharePoint 2010 Timer process (Owstimer.exe). This how-to topicdescribes how to debug a timer job by attaching the Visual Studio debugger to the timer process. Note:This how-to topic uses the simple timer job described in How To: C reate a Web Application-Scoped Timer Jobas an example.StepsTo debug a timer job in Visual Studio 2010 1. On the Start menu, point to Administrative Tools, and then click Serv ices. 2. In the Services window, make sure the SharePoint 2010 Timer service is started. 3. Open the Visual Studio 2010 project that contains your timer job. Note:Make sure that the code has not changed since you deployed the timer job; otherwise, the debugger will notmatch your source code to the deployed assembly. 4. Set a breakpoint in the Execute method of your job definition class. 5. On the Debug menu, click Attach to Process. 6. In the Attach to Process dialog box, click OWSTIMER.EXE, and then click Attach.Generated from CHM, not final book. Will be superseded in the future. Page 162
7. If the Attach Security Warning dialog box is displayed, click Attach.
8. In the SharePoint Central Administration Web site, click Monitoring, and then click Review job definitions.
9. Click the name of your job, and then click Run Now on the Edit Timer Job page.
10. Verify that the Visual Studio 2010 debugger stops execution on your breakpoint.
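If attaching to Owstimer.exe at the right moment proves awkward (for example, the timer service recycles, or the job runs before you can attach), a common workaround is to request a debugger from inside the job itself. This is a sketch of the idea rather than part of the original sample; whether the just-in-time debugger prompt actually appears depends on the account the timer service runs as and on the machine's JIT debugging settings, so remove this code before any production deployment.
C#
public override void Execute(Guid targetInstanceId)
{
    // Prompt to attach a debugger the first time the job runs (development only).
    if (!System.Diagnostics.Debugger.IsAttached)
    {
        System.Diagnostics.Debugger.Launch();
    }

    // Timer job logic goes here.
}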
  • 164. How to: Deploy an Application Page to CentralAdministrationOverviewIn some situations, you might want to deploy an application page to the C entral Administration Web site. Forexample, you might deploy a page that allows administrators to configure a timer job. This topic describes thesteps you must perform to deploy the page to C entral Administration.For a practical example of the deployment of a custom application page to C entral Administration, see the FarmSolution Reference Implementation. This uses a custom application page to configure a timer job.Summary of StepsThis how-to topic includes the following steps:  Step 1: Create the SharePoint Project. In this step, you use the Microsoft® Visual Studio® development system to create a project that you can use to deploy and test your application page.  Step 2: Create a Mapped F older. In this step, you create a SharePoint Mapped Folder that enables you to deploy your files to the correct location on the server file system.  Step 3: Add an Application Page. In this step, you add a simple application page to the mapped folder that you created.  Step 4: Create a Custom Action to Launch the Page. In this step, you create a feature manifest that defines a custom action. The custom action adds a navigation item to the C entral Administration Web site.Step 1: Create the SharePoint ProjectThis procedure creates a Microsoft SharePoint® project in Visual Studio 2010. You can use this project to buildand deploy your application page.To create a project 1. Start Visual Studio 2010, and then create a new Empty SharePoint Project, as shown in the following illustration. Name the projectApplicationPage.Generated from CHM, not final book. Will be superseded in the future. Page 164
  • 165. 2. In the SharePoint C ustomization Wizard, specify a valid local site for debugging, select Deploy as a farm solution, and then click Finish. Note:If you want Visual Studio 2010 to automatically activate your feature on the C entral Administration Web site,set the local site to the URL of your C entral Administration Web site.Generated from CHM, not final book. Will be superseded in the future. Page 165
Step 2: Create a Mapped Folder
This procedure creates a SharePoint Mapped Folder. This makes it easy for you to deploy your files to a specific location on the server file system.
To create a mapped folder
1. In Solution Explorer, right-click the project node, point to Add, and then click SharePoint Mapped Folder.
2. In the Add SharePoint Mapped Folder dialog box, select the {SharePointRoot}\TEMPLATE\ADMIN folder, and then click OK.
  • 167. Notice that Visual Studio has added an ADMIN node to Solution Explorer.Generated from CHM, not final book. Will be superseded in the future. Page 167
  • 168. Step 3: Add an Application PageThis procedure creates a new application page and adds it to the ADMIN mapped folder.To add an application page 1. In Solution Explorer, right-click the ADMIN node, point to Add, and then click New Item. 2. In the Add New Item dialog box, expand SharePoint in the Installed Templates pane, and then click 2010. 3. C lick Application Page, type a name for the page in the Name text box, and then click Add. This example uses the name SimplePage.aspx for the page. Note:In Solution Explorer, notice that Visual Studio has actually created a Layouts mapped folder and added thenew page to that. This is the default behavior for application pages, and you must manually move the page tothe desired location.Generated from CHM, not final book. Will be superseded in the future. Page 168
  • 169. 4. In Solution Explorer, drag the ApplicationPage folder from the Layouts mapped folder to the ADMIN mapped folder. You can now delete the Layouts mapped folder. 5. Add some content to your application page. This example uses some HTML to verify that the page was deployed successfully.HTML<asp:Content ID="Main" ContentPlaceHolderID="PlaceHolderMain" runat="server"> <p>This is a simple application page</p></asp:Content>Step 4: Create a Custom Action to Launch the PageThis procedure creates a feature manifest that defines a CustomAction element. The custom action adds anavigation item that enables administrators to launch your page from the C entral Administration Web site.To add a navigation item to Central Administration 1. In Solution Explorer, right-click the project node, point to Add, and then click New Item. 2. In the Add New Item dialog box, expand SharePoint in the Installed Templates pane, and then click 2010. 3. C lick Module, type a name for the module in the Name text box, and then click Add. This example uses the name NavItem for the module.Generated from CHM, not final book. Will be superseded in the future. Page 169
  • 170. 4. Expand the NavItem node and open the feature manifest file (Elements.xml). Delete the existing content, and then add the following XML. You can also delete the Sample.txt file from Solution Explorer.XML<Elements xmlns="http://schemas.microsoft.com/sharepoint/"> <CustomAction Id="[GUID]" GroupId="TimerJobs" Location="Microsoft.SharePoint.Administration.Monitoring" Sequence="10" Title="Simple Page" Description=""> <UrlAction Url="_admin/ApplicationPage/SimplePage.aspx" /> </CustomAction></Elements> Note:This markup adds a custom action to the Timer Jobs action group on C entral Administration. Inpractice, this adds a link to our custom application page under the Timer Jobs heading. For moreinformation about custom action locations, see Default C ustom Action Locations and IDs on MSDN.If you reuse the code, replace [GUID] with a new GUID. 5. In Solution Explorer, expand F eatures, right-click F eature1, and then click Rename. This example uses the name SimplePageF eature for the feature. 6. In Solution Explorer, double-click SimplePageF eature to open the Feature Designer. Notice that the feature already contains your Nav Item module. 7. In the Feature Designer window, give the feature a friendly title. Leave the scope set to Web. 8. Press F5 to deploy and test your solution. 9. Browse to the C entral Administration Web site, and then click Monitoring. Notice that a Simple Page link has been added under Timer Jobs.Generated from CHM, not final book. Will be superseded in the future. Page 170
10. Click Simple Page, and then verify that your custom application page is displayed.
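The application page in this example renders only static HTML. Because the scenario described in the overview is an administration page for configuring a timer job, the code-behind is usually where the interesting work happens. The following is a minimal sketch of what such a code-behind might contain; the namespace, class name, and crude Response.Write output are assumptions for illustration and would need to match the markup and namespace that Visual Studio generated for your SimplePage.aspx.
C#
using System;
using Microsoft.SharePoint.Administration;
using Microsoft.SharePoint.Utilities;
using Microsoft.SharePoint.WebControls;

namespace ApplicationPage.Admin   // adjust to the namespace Visual Studio generated
{
    public partial class SimplePage : LayoutsPageBase
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // List the timer jobs registered for each content Web application.
            foreach (SPWebApplication webApp in SPWebService.ContentService.WebApplications)
            {
                foreach (SPJobDefinition job in webApp.JobDefinitions)
                {
                    // Crude output for illustration; a real page would use server controls.
                    Response.Write(SPEncode.HtmlEncode(webApp.Name + ": " + job.DisplayName) + "<br/>");
                }
            }
        }
    }
}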
  • 172. How to: Deploy a Document Template in a SandboxedSolutionOverviewIn previous versions of Microsoft® SharePoint®, resources such as document templates, application pages, andJavaScript files were deployed to the SharePoint root on the server file system, typically within the _layoutsvirtual directory. In SharePoint 2010, you cannot deploy any resources to the server file system from asandboxed solution. Instead, your deployment must target the content database. This how-to topic describes howto manage the deployment of files and documents in a sandboxed solution.For a practical example of the deployment of files and documents to the sandbox environment, see the SandboxProxy Reference Implementation.Summary of StepsThis how-to topic includes the following steps:  Step 1: Create and Configure a Module. In this step, you create a feature manifest file that contains a Module element. This specifies which files to deploy and where to deploy them.  Step 2: Add the Module to a F eature. In this step, you create a feature and add your feature manifest file to the feature. This enables you to deploy the module to the SharePoint environment. Note:This how-to topic assumes that you have used the Microsoft Visual Studio® 2010 development system and oneof the SharePoint 2010 templates to create a project.Step 1: Create and Configure a ModuleThis procedure creates a feature manifest that contains a Module element. The module is configured to add filesto the content database, which enables you to deploy it within a sandboxed solution.To create a Module element 1. In Solution Explorer, right-click the project node, point to Add, and then click New Item. 2. In the Add New Item dialog box, in the Installed Templates pane, expand SharePoint, and then click 2010. 3. C lick Module, type a name for the module in the Name text box, and then click Add. This example uses the name Templates for the module.Generated from CHM, not final book. Will be superseded in the future. Page 172
  • 173. In Solution Explorer, notice that Visual Studio has added a Templates node to represent your module.Generated from CHM, not final book. Will be superseded in the future. Page 173
4. Expand the Templates node, and then open the feature manifest file (Elements.xml). By default, the feature manifest includes a Module element with a placeholder File child element, as shown in the following XML.
XML
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <Module Name="Templates">
    <File Path="Templates\Sample.txt" Url="Templates/Sample.txt" />
  </Module>
</Elements>
5. In the Elements.xml file, delete the placeholder File element. You can also delete the Sample.txt file from Solution Explorer.
6. In Solution Explorer, right-click the Templates node, point to Add, and then click Existing Item.
7. Browse to the file or files you want to deploy, and then click Add.
8. Notice that Visual Studio adds a File element for each file that you add, as shown in the following XML. This example adds an Excel template named Estimate.xltx.
XML
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <Module Name="Templates">
    <File Path="Templates\estimate.xltx" Url="Templates/estimate.xltx" />
  </Module>
</Elements>
9. In the Module element, add a Url attribute to specify the destination for the files. In this example, a document template for a content type is being deployed, so a value of _cts/Estimate is specified.
Note:
In SharePoint 2010, like in earlier versions of SharePoint, content types and document templates are stored as subfolders in the _cts virtual directory.
10. Make the following changes to the File element:
   a. Leave the Path attribute value as Templates\estimate.xltx. This tells the feature where to find the file in your Visual Studio project.
   b. Change the Url attribute value to estimate.xltx. This specifies the virtual path to the file, within the virtual directory specified by the Module element.
   c. Add a Type="Ghostable" attribute value. This indicates that the file will be stored in the content database.
Note:
Visual Studio 2010 does not always automatically pick up the feature manifest schema. If you see schema
errors or you lack IntelliSense support when you edit a feature manifest, check the properties of the XML document in the Properties window. The Schemas property should be set to 14\TEMPLATE\XML\wss.xsd.
11. The feature manifest should resemble the following.
XML
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <Module Name="Templates" Url="_cts/Estimate">
    <File Path="Templates\estimate.xltx"
          Url="estimate.xltx"
          Type="Ghostable" />
  </Module>
</Elements>
Note:
For sandboxed deployments, you can use Type attribute values of Ghostable or GhostableInLibrary. Use GhostableInLibrary if your deployment target is a document library and you want SharePoint to create a parent list item for your file.
Step 2: Add the Module to a Feature
This procedure adds the module to a feature, which provides the mechanism to deploy your files to the SharePoint environment.
To add a module to a feature
1. In Solution Explorer, right-click the Features node, and then click Add Feature.
Note:
Visual Studio 2010 may have already added an empty feature when you added other SharePoint components. In this case, you can either rename the empty feature or delete it and create a new one.
2. In Solution Explorer, right-click the new feature node, and then click Rename. This example uses the name TemplatesFeature for the feature.
3. If the Feature Designer is not already open, double-click the TemplatesFeature node to open the designer.
4. In the Feature Designer, select an appropriate scope. You can use a feature scope value of Web or Site within a sandboxed solution.
5. In the Items in the Solution pane, click the Templates module.
6. Click the right arrow button to add the module to the feature. This moves the Templates module to the Items in the Feature pane.
7. To deploy the feature, right-click the project node in Solution Explorer, and then click Deploy.
Note:
This example deploys the document template to the _cts virtual directory. This location is not directly accessible to end users. You can use steps 8–11 to verify that the file was deployed successfully.
8. Open your site in SharePoint Designer.
9. In the Site Objects pane, click All Files.
10. In the main window, click _cts.
11. Click the Estimate folder, and verify that the folder contains the file Estimate.xltx.
For more information about provisioning files into SharePoint sites, see How to: Provision a File and How to: Include Files by Using a Module on MSDN.
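Steps 8–11 verify the deployment with SharePoint Designer. If you prefer to check from code, something along the following lines also works. This is a sketch that assumes full-trust, server-side code (for example, a small console utility run on a SharePoint server), and the site URL is a placeholder to adjust for your environment.
C#
using System;
using Microsoft.SharePoint;

class VerifyTemplate
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://localhost"))
        {
            SPWeb web = site.RootWeb;

            // The module in this topic deploys the template to _cts/Estimate/estimate.xltx.
            SPFile template = web.GetFile("_cts/Estimate/estimate.xltx");
            Console.WriteLine(template.Exists
                ? "Template found at " + template.ServerRelativeUrl
                : "Template not found.");
        }
    }
}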
  • 177. How to: Deploy a Web Part Page in a SandboxedSolutionOverviewMany Microsoft® SharePoint® applications require customized Web Part pages, either to provision additional WebPart zones or to provide alternative layouts. This how-to topic describes how to deploy a custom Web Part page ina sandboxed solution.For a practical example of the deployment of a Web Part page to the sandbox environment, see the SandboxProxy Reference Implementation. The reference implementation displays a custom Web Part page as a modaldialog box.Summary of StepsThis how-to topic includes the following steps:  Step 1: Create and Configure a List Instance. In this step, you create a feature manifest file that provisions a new document library in which to store your Web Part page.  Step 2: Create and Configure a Module. In this step, you create a feature manifest file that contains a Module element. This specifies which files to deploy and where to deploy them.  Step 3: Add the Module to a F eature. In this step, you create a feature and add your feature manifest files to the feature. This enables you to provision the list instance and deploy the module to the SharePoint environment. Note:This how-to topic assumes that you have used the Microsoft Visual Studio 2010 development system and oneof the SharePoint 2010 templates to create a project. It also assumes that you have a custom Web Part pageready to deploy.Step 1: Create and Configure a List InstanceThis procedure creates a feature manifest that provisions a new document library named Pages on yourSharePoint site. You will use this library to store your custom Web Part page. Note:If you are deploying a page to a site that has the Publishing feature activated, you do not need to completestep 1. Instead, you can deploy your page to the Pages library that is automatically created by the Publishingfeature. However, if you are using SharePoint Foundation 2010, the Publishing feature is unavailable and youmust manually create a Pages library as described in this step.To create a list instance 1. In Solution Explorer, right-click the project node, point to Add, and then click New Item. 2. In the Add New Item dialog box, expand SharePoint in the Installed Templates pane, and then click 2010. 3. C lick List Instance, type a name for the list instance in the Name text box, and then click Add. This example uses the name PagesLibrary for the module.Generated from CHM, not final book. Will be superseded in the future. Page 177
  • 178. 4. In the SharePoint C ustomization Wizard, under Which list do you want instantiate? [sic], click Document Library. 5. Provide appropriate values for the display name, description, and relative URL, and then click F inish.Generated from CHM, not final book. Will be superseded in the future. Page 178
Notice that Visual Studio creates and displays a feature manifest file based on the settings that you provided. Your feature manifest should resemble the following code.
XML
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <ListInstance Title="Custom Pages"
                OnQuickLaunch="TRUE"
                TemplateType="101"
                FeatureId="00bfea71-e717-4e80-aa17-d0c71b360101"
                Url="Lists/CustomPages"
                Description="">
  </ListInstance>
</Elements>
Step 2: Create and Configure a Module
This procedure creates a feature manifest that contains a Module element. The module is configured to add files to the content database, which enables you to deploy it within a sandboxed solution.
To create a Module element
1. In Solution Explorer, right-click the project node, point to Add, and then click New Item.
2. In the Add New Item dialog box, expand SharePoint in the Installed Templates pane, and then click 2010.
3. Click Module, type a name for the module in the Name text box, and then click Add. This example uses the name Pages for the module.
  • 180. In Solution Explorer, notice that Visual Studio has added a Pages node to represent your module.Generated from CHM, not final book. Will be superseded in the future. Page 180
  • 181. 4. Expand the Pages node and open the feature manifest file (Elements.xml). By default, the feature manifest includes a Module element with a placeholder File child element, as shown in the following code.XML<Elements xmlns="http://schemas.microsoft.com/sharepoint/"> <Module Name="Pages"> <File Path="PagesSample.txt" Url="Pages/Sample.txt" /> </Module></Elements> 5. In the Elements.xml file, delete the placeholder F ile element. You can also delete the Sample.txt file in Solution Explorer. 6. In Solution Explorer, right-click the Pages node, point to Add, and then click Existing Item. 7. Browse to the Web Part page that you want to deploy, and then click Add. 8. Notice that Visual Studio adds a F ile element for the Web Part page, as shown in the following code. This example uses a Web Part page named SimplePage.aspx.XML<Elements xmlns="http://schemas.microsoft.com/sharepoint/"> <Module Name="Pages"> <File Path="PagesSimplePage.aspx" Url="Pages/SimplePage.aspx"/> </Module></Elements> 9. In the Module element, add a Url attribute to specify the destination for the files. This should be the relative URL of the target list instance, which in this case is Lists/CustomPages. 10. In the Module element, add a List attribute to indicate the type of list to which you are deploying your files. Because this example deploys files to a document library, the attribute value should be List="101". 11. Make the following changes to the F ile element: a. Leave the Path attribute value as PagesSimplePage.aspx. This tells the feature where to find the file within your Visual Studio project. b. C hange the Url attribute value to SimplePage.aspx. This specifies the virtual path to the file, within the virtual directory specified by the Module element. c. Add a Type="GhostableInLibrary" attribute value. This indicates that the file will be stored in the content database and that it should support library operations, such as check-in and check-out. Note:Visual Studio 2010 does not always automatically pick up the feature manifest schema. If you see schemaerrors or you lack Microsoft IntelliSense® support when you edit a feature manifest, check the properties ofthe XML document in the Properties window. The Schemas property should be set toGenerated from CHM, not final book. Will be superseded in the future. Page 181
14\TEMPLATE\XML\wss.xsd.
12. The feature manifest should resemble the following.
XML
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <Module Name="Pages" Url="Lists/CustomPages" List="101">
    <File Path="Pages\SimplePage.aspx"
          Url="SimplePage.aspx"
          Type="GhostableInLibrary" />
  </Module>
</Elements>
Step 3: Add the List Instance and the Module to a Feature
This procedure adds the list instance and module to a feature, which provides the mechanism to provision your Pages library and deploy your files to the SharePoint environment.
To add items to a feature
1. In Solution Explorer, right-click the Features node, and then click Add Feature.
Note:
Visual Studio 2010 may have already added an empty feature when you added other SharePoint components. In this case, you can either rename the empty feature or delete it and create a new one.
2. In Solution Explorer, right-click the new feature node, and then click Rename. This example uses the name PagesFeature for the feature.
3. If the Feature Designer is not already open, double-click the PagesFeature node to open the designer.
4. In the Feature Designer, select an appropriate scope. You can use a feature scope value of Web or Site in a sandboxed solution.
5. Use the arrow buttons to add both items to the feature.
6. To deploy the feature, right-click the project node in Solution Explorer, and then click Deploy.
7. You can verify the deployment by browsing to the Custom Pages library on your test site. The library should contain your Web Part page.
For more information about provisioning files into SharePoint sites, see How to: Provision a File and How to: Include Files by Using a Module on MSDN.
  • 184. How to: Display a Page as a Modal Dialog BoxOverviewThe Microsoft® SharePoint® 2010 user interface makes extensive use of modal dialog boxes, which helps toimprove the user experience by reducing the number of postbacks. You can use this functionality in your ownapplications through the use of the SharePoint client object model. This how-to topic describes how to use theSharePoint EC MAScript object model to launch a page as a modal dialog box.For a practical example of the deployment and use of pages for modal dialog boxes, including how to passinformation to the modal dialog box, see the Full-Trust Proxy Reference Implementation. For more informationabout using the SharePoint client object model in JavaScript, see EC MAScript Object Model Reference on MSDN. Note:These steps are designed for sandboxed solution deployment. However, you can use the same approach todeploy JavaScript functions within farm solutions.Summary of StepsThis how-to topic includes the following steps:  Step 1: Create and Configure a Module. In this step, you create a feature manifest file that contains a Module element. This specifies which files to deploy and where to deploy them.  Step 2: Create the Jav aScript Function. In this step, you create a JavaScript file and add it to the module. The file contains a function that accepts a page URL and displays the page as a modal dialog box.  Step 3: Add the Module to a F eature. In this step, you create a feature and add the feature manifest file to the feature. This enables you to deploy your JavaScript function to the SharePoint environment.  Step 4: Inv oke the Jav aScript Function. In this step, you use the JavaScript function to launch the site calendar as a modal dialog box. Note:This how-to topic assumes that you have used the Microsoft Visual Studio® 2010 development system and oneof the SharePoint 2010 templates to create a project.Step 1: Create and Configure a ModuleThis procedure creates a feature manifest that contains a Module element. You will use the module to deploy theJavaScript file. The module is configured to add files to the content database, which enables you to deploy itwithin a sandboxed solution.To create a Module element 1. In Solution Explorer, right-click the project node, point to Add, and then click New Item. 2. In the Add New Item dialog box, expand SharePoint in the Installed Templates pane, and then click 2010. 3. C lick Module, type a name for the module in the Name text box, and then click Add. This example uses the name Scripts for the module.Generated from CHM, not final book. Will be superseded in the future. Page 184
  • 185. In Solution Explorer, notice that Visual Studio has added a Scripts node to represent your module.Generated from CHM, not final book. Will be superseded in the future. Page 185
  • 186. 4. In Solution Explorer, right-click the Scripts node, point to Add, and then click New Item. 5. In the Add New Item dialog box, click Web in the Installed Templates pane, and then click the Jscript File template. 6. In the Name text box, type jsFunctions.js, and then click Add. 7. Open the Elements.xml file in the Script module. Notice that Visual Studio has added a File element for the jsFunctions.js file.XML<Elements xmlns="http://schemas.microsoft.com/sharepoint/"> <Module Name="Scripts"> <File Path="ScriptsSample.txt" Url="Scripts/Sample.txt" /> <File Path="ScriptsjsFunctions.js" Url="Scripts/jsFunctions.js" /> </Module></Elements> 8. In the Module element, add a Url ="_catalogs/masterpage" attribute value. This tells the feature to deploy the JavaScript file to the sites master page gallery. Note:In a sandboxed solution, you are not permitted to deploy any files to the server-side file system. Instead, youcan deploy the JavaScript file to the sites master page gallery. This ensures that it is available to all users whocan view the site. 9. In the Module element, add a List="116" attribute value. This indicates that the destination is a library of type master page gallery. 10. Delete the File element for the Sample.txt file. You can also delete the Sample.txt file from Solution Explorer. 11. Make the following changes to the F ile element for the jsFunctions.js file: a. Leave the Path attribute value as ScriptsjsF unctions.js. This tells the feature where to find the file in your Visual Studio project. b. C hange the Url attribute value to jsFunctions.js. This specifies the virtual path to the file in the virtual directory specified by the Module element. c. Add a Type="GhostableInLibrary" attribute value. This indicates that the file will be stored as a document library item in the content database. Note:Visual Studio 2010 does not always automatically pick up the feature manifest schema. If you see schemaerrors or you lack IntelliSense support when you edit a feature manifest, check the properties of the XMLdocument in the Properties window. The Schemas property should be set to 14TEMPLATEXMLwss.xsd. 12. The feature manifest should resemble the following code example.C#<Elements xmlns="http://schemas.microsoft.com/sharepoint/"> <Module Name="Scripts" List="116" Url="_catalogs/masterpage"> <File Path="ScriptsjsFunctions.js" Url="jsFunctions.js" Type="GhostableInLibrary"/> </Module></Elements> Note:For sandboxed deployments, you can use Type attribute values of Ghostable or GhostableInLibrary. UseGhostableInLibrary if your deployment target is a document library and you want SharePoint to create aparent list item for your file.Step 2: Create the JavaScript FunctionThis procedure creates a JavaScript function. The function accepts a single argument that represents the URL of aWeb page. The function then uses the SharePoint EC MAScript object model to launch the specified page as amodal dialog box.To create a JavaScript function that launches a modal dialog box 1. In Solution Explorer, double-click jsF unctions.js to open the file. 2. Add a function named ShowDialog that takes a single argument named url.Jav aScriptfunction ShowDialog(url) { } 3. Add the following code to the ShowDialog function.Generated from CHM, not final book. Will be superseded in the future. Page 186
JavaScript
var options = SP.UI.$create_DialogOptions();
options.url = url;
options.height = 300;
SP.UI.ModalDialog.showModalDialog(options);
4. When invoked, this JavaScript function launches the specified page as a modal dialog box with a height of 300 pixels.
Step 3: Add the Module to a Feature
This procedure adds the module to a feature, which provides the mechanism to deploy your JavaScript file to the SharePoint environment.
To add a module to a feature
1. In Solution Explorer, right-click the Features node, and then click Add Feature.
Note:
Visual Studio 2010 may have already added an empty feature when you added the module. In this case, you can either rename the empty feature or delete it and create a new one.
2. In Solution Explorer, right-click the new feature node, and then click Rename. Type a name for the feature. This example uses the name ScriptsFeature.
3. Double-click the ScriptsFeature node to open the Feature Designer.
4. In the Feature Designer, set the feature scope to Site.
5. In the Items in the Solution pane, click the Scripts module.
6. Click the right arrow button to add the module to the feature. This moves the Scripts module to the Items in the Feature pane.
Step 4: Invoke the JavaScript Function
You can use your JavaScript function in several different ways. This procedure provides a simple demonstration that uses the JavaScript function to display the site calendar as a modal dialog box page.
  • 188. To inv oke the Jav aScript function 1. In Solution Explorer, right-click the project node, point to Add, and then click New Item. 2. In the Add New Item dialog box, expand SharePoint in the Installed Templates pane, and then click 2010. 3. C lick Web Part, type a name for the module in the Name text box, and then click Add. This example uses the name DialogDemo for the Web Part. 4. Add the following using statement to the DialogDemo class.C#using System.Web.UI.HtmlControls; 5. Add the following constant string to the DialogDemo class. This indicates the site-relative location of the JavaScript file.C#const string jsScriptURL = "/_catalogs/masterpage/jsFunctions.js"; 6. Add the following code to the CreateChildControls method. This renders an HTML script element that loads the JavaScript file.C#HtmlGenericControl scriptInclude = new HtmlGenericControl("script");scriptInclude.Attributes.Add("src", SPContext.Current.Site.RootWeb.Url + jsScriptURL);Controls.Add(scriptInclude); 7. Add the following code to the CreateChildControls method. This renders a hyperlink that calls the ShowDialog function, passing in the relative URL of the site calendar.C#HyperLink link = new HyperLink();link.Text = "View Calendar";Generated from CHM, not final book. Will be superseded in the future. Page 188
link.NavigateUrl = string.Concat("javascript: ShowDialog('",
    SPContext.Current.Site.RootWeb.Url,
    "/Lists/Calendar/calendar.aspx')");
this.Controls.Add(link);
8. In Solution Explorer, double-click the ScriptsFeature node to open the Feature Designer. Make sure that the DialogDemo Web Part is added to the feature.
9. In Solution Explorer, right-click the project node, and then click Deploy.
10. Browse to your test site, add the DialogDemo Web Part to a page, and then click the View Calendar link.
Note:
By default, the DialogDemo Web Part is added to the Custom Categories Web Part group.
11. The site calendar is displayed in a modal dialog box.
How to: Log to the History List from a Workflow Activity
Overview
When you create custom full-trust workflow activities for Microsoft® SharePoint® solutions, you might often want to add entries to the Workflow History list on your SharePoint site. This how-to topic describes how to log to the History list from a workflow activity class.
Note:
This how-to topic assumes that you have already created a workflow activity class that derives from the System.Workflow.ComponentModel.Activity base class. For more information about creating a custom workflow activity, see the Workflow Activities reference implementation.
Steps
To log to the history list from a workflow activity
1. Locate the Execute method in your workflow activity class.
C#
protected override ActivityExecutionStatus Execute(ActivityExecutionContext executionContext)
{
    // The activity logic goes here.
    return base.Execute(executionContext);
}
2. In the Execute method, retrieve an implementation of the ISharePointService interface from the execution context object.
C#
ISharePointService wfService = executionContext.GetService<ISharePointService>();
3. On the ISharePointService implementation, call the LogToHistoryList method.
C#
wfService.LogToHistoryList(executionContext.ContextGuid,
    SPWorkflowHistoryEventType.WorkflowComment, 0, TimeSpan.Zero,
    "Information", "Logged via ISharePointService", string.Empty);
4. The LogToHistoryList method will create an entry in the Workflow History list that corresponds to the execution context of the workflow.
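The snippets in steps 1–3 assume that the activity class file already references the relevant workflow namespaces. As a rough guide to be verified against your own project references, ActivityExecutionContext and ActivityExecutionStatus come from System.Workflow.ComponentModel, ISharePointService from Microsoft.SharePoint.WorkflowActions, and SPWorkflowHistoryEventType from Microsoft.SharePoint.Workflow.
C#
using System;
using System.Workflow.ComponentModel;        // ActivityExecutionContext, ActivityExecutionStatus
using Microsoft.SharePoint.Workflow;         // SPWorkflowHistoryEventType
using Microsoft.SharePoint.WorkflowActions;  // ISharePointService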
How to: Import and Package a Declarative Workflow in Visual Studio
Overview
When you create a declarative workflow in Microsoft® SharePoint® Designer 2010, you can save the workflow as a template in a SharePoint solution package (.wsp) file. In some circumstances, you may need to import your workflow template into the Microsoft Visual Studio® 2010 development system and repackage it as a Visual Studio project, for example, if you want to create a solution package that contains both a declarative workflow and custom-coded workflow activities. This how-to topic describes how to import and package a workflow template in Visual Studio 2010.
Note:
This how-to topic assumes that you have already created a declarative workflow in SharePoint Designer 2010 and saved it as a .wsp file.
Steps
To import a declarative workflow into Visual Studio
1. In Visual Studio 2010, create a new project by using the Import SharePoint Solution Package template.
Note:
It is important to use the Import SharePoint Solution Package template instead of the Import Reusable Workflow template.
2. In the SharePoint Customization Wizard, provide a valid local site URL for debugging, and then click Next.
3. Browse to the location of your solution package, and then click Next.
4. On the Select items to import page, click Finish.
  • 193. 5. When the solution import completes, use the Replace in Files tool to replace all instances of "workflows/" with "_catalogs/wfpub/" throughout the solution. Note:_catalogs/wfpub/ is the virtual directory in which declarative workflow files are stored on each sitecollection. 6. Open the .xoml.wfconfig.xml file, and then make the following changes: a. In the Template element, change the value of the Visibility attribute to RootPublic from either Public or DraftPublic. b. In the Template element, change the value of the DocLibURL attribute to _catalogs/wfpub. 7. In Solution Explorer, expand the PropertyBags node, and then open the Elements.xml file within the PropertyBags node.Generated from CHM, not final book. Will be superseded in the future. Page 193
  • 194. 8. Locate the PropertyBags node for the .xoml.config.xml file, and then change the value of the NoCodeVisibility property to RootPublic from either Public or DraftPublic. 9. Press F5 to build and deploy your solution. You can use SharePoint Designer to verify that the workflow was added to your target site.Generated from CHM, not final book. Will be superseded in the future. Page 194
  • 195. Data Models in SharePoint 2010Every custom SharePoint application is driven by data in one way or another. Because the SharePoint platform isgeared toward managing the flow of information, its hard to think of a solution development task that doesntinvolve either displaying data, aggregating data, sorting data, filtering data, manipulating data, or creating data.SharePoint 2010 includes functionality that provides SharePoint developers with new ways to work with bothinternal and external data. For example, you can create relationships between SharePoint lists, in the same waythat you might create a foreign key relationship between database tables. You can query across theserelationships to support moderately complex data aggregation scenarios. You can use new measures designed toprotect the performance of data-driven SharePoint solutions, such as throttling list queries to restrict the numberof items returned by the query. You can interactively create external content types that define a set ofstereotyped operations on external databases and services. A stereotyped operation is a data access method thatconforms to a common and well-recognized signature, such as create, retrieve, update, and deleteoperations. You can create external lists, which effectively map a SharePoint list to an external content type. Thisallows users to interact with external data in the same way as they interact with standard SharePoint list data.Finally, you can use LINQ to SharePoint to query SharePoint list data using the Language Integrated Query (LINQ)syntax. This enables you to build sophisticated queries, including the use of joins.This plethora of new features brings with it many new design options and tradeoffs for the SharePoint applicationdeveloper. There are three core approaches to defining data models for SharePoint applications:  You can define a data model using SharePoint data structures such as lists and content types.  You can define a data model in an external data source, such as a relational database, and then create a set of external content types to expose the data to SharePoint through the Business C onnectivity Services (BC S) component. Alternatively, you could create a Web service to wrap the data source and create your external content types from the Web service.  You can create a set of external content types to model and integrate an existing data source, such as a database, a Windows C ommunication Foundation (WC F) service, or a .NET type, to expose the data to SharePoint through the BC S.This chapter is designed to guide you on key decision points and best practices for working with data inSharePoint 2010. The chapter includes the following sections and topics:  Understanding Data in SharePoint 2010. This section covers the key concepts behind the storage and management of data in SharePoint 2010. It describes the core building blocks for SharePoint data models, including lists, columns, and content types. It explains how list relationships work in SharePoint 2010, and it provides an insight into managing the impact of query throttling and indexing functionality.  External Data in SharePoint 2010. This section examines aspects of how you can use Business C onnectivity Services to work with external data in SharePoint 2010. 
It describes options for modeling complex types and entity associations in a Business Data C onnectivity (BDC ) model, it explains how filtering and throttling works in the BDC runtime, and it maps common external data scenarios to different approaches to data modeling.  Data Access in SharePoint 2010. This section provides insights into the three main approaches to data access in SharePoint 2010—query classes, LINQ to SharePoint, and the BDC object model. It examines the benefits and limitations of each approach from the perspectives of usability, efficiency, and performance.  List Patterns. This section describes different design options to deal with common challenges faced with lists, including managing large lists and aggregating data across lists. It describes the benefits and consequences of each approach.Whats Not Covered in This Document?Data in SharePoint 2010 encompasses a broad range of material, and there are some topics that are beyond thescope of this topic. The following subjects are not covered:  The Metadata Management Service, which manages the taxonomy and folksonomy data used in tagging.  Access Services and Excel Services, which allow users to publish Access databases and Excel workbooks on the SharePoint platform and use them through a Web browser. Access Services is a new feature in SharePoint 2010 that enables non-developers to assemble data-centric applications in Microsoft Access and then publish those applications to SharePoint. When published to SharePoint, the application is converted into a native SharePoint application based upon SharePoint lists. Access Services applications can be synchronized with the Access client for offline access to data and for report generation. For more information, see the Access Team Blog.  Business intelligence capabilities, which is a rich and important area, but it is not typically developer focused.  InfoPath-based solutions for custom user interfaces and data capture.Generated from CHM, not final book. Will be superseded in the future. Page 195
  • 196. Understanding Data in SharePoint 2010Whenever you design a data-driven application, regardless of the platform, you need to consider how your data isstored and managed. The data model addresses various aspects of data storage and management, including howand where your data is stored, how the relationships between different pieces of information are defined andmanaged, and how you will access your data.When you work with SharePoint 2010, one of the first things you should consider is whether you can map yourdata storage requirements to the built-in data structures provided by the SharePoint platform. SharePoint definesa wide range of lists that can be used without modification or extended according to your needs by creating listtemplates, defining list instances, or associating content types you create to a list. The following are someexamples:  If you want to store a list of projects, together with descriptive information about each project, you can use a basic list.  If you want to store statements of work (Word documents) and bid information (Excel workbooks) related to a project, you can use a document library with content types.  If you want to store image resources for a particular project, you can use a picture library with content types.This sounds intuitive, but one of the basic challenges of SharePoint development is to understand what SharePointprovides in order to avoid recreating existing functionality. Spend time exploring the list types and content typesthat are provided by SharePoint 2010, and consider how these components can meet your needs during the datamodeling process.This section provides an overview of the data-driven components in SharePoint 2010 and examines how theyrelate to the concepts of data modeling in general. The section includes the following topics:  SharePoint Data Models in Broader C ontext . This topic provides a high-level discussion of how SharePoint 2010 relates to broader issues of data storage and data modeling.  SharePoint C olumns, Lists, and C ontent Types . This topic provides an overview of the main components of data storage and data modeling in SharePoint 2010.  List Relationships in SharePoint 2010 . This topic describes how you can create list relationships and execute queries using these relationships between lists in SharePoint 2010. It examines the key similarities and differences between list relationships in SharePoint and foreign key constraints in relational databases.  Query Throttling and Indexing . This topic describes the concept of query throttling in SharePoint 2010 and explains how an effective indexing strategy can mitigate performance issues in general and query throttling in particular.Generated from CHM, not final book. Will be superseded in the future. Page 196
  • 197. SharePoint Data Models in Broader ContextBefore you look at the specifics of working with data in SharePoint 2010, it can be instructive to consider theSharePoint platform in the broader context of data modeling and data storage. This topic discusses commonhigh-level aspects of data modeling in relation to SharePoint 2010.Green Field and Brown Field ScenariosWhen it comes to designing data-driven applications, solution architects can find themselves in one of twosituations. In one situation, they dont have to worry about anything that exists already and they can start thedesign process with a blank piece of paper. In the other situation, they have to design the solution around anexisting environment, which could include legacy systems and different types of data repository.Most architects would agree that not having to deal with legacy systems and preexisting data stores is more fun.This situation is known as green field development. You can design the application the way you want it withoutworking around constraints imposed by other systems.On the other hand, brown field development describes scenarios in which you are enhancing an operationalsystem, or integrating with existing systems, such as line-of-business applications, proprietary software, andlegacy data stores. As the data management requirements of most organizations evolve continuously over time,youre generally more likely to encounter brown field development scenarios.Whether your application needs to integrate with other systems and data stores will clearly have a major bearingon the design process. SharePoint 2010 excels as a platform for brown field development scenarios because of itsability to connect to and interact with almost any kind of external system. Although most guidance necessarilyignores the sticky details around integrating existing systems into a solution design, this topic devotes much of itscontent to the opportunities for integrating with external systems and working with external data.Structured vs. Unstructured DataWhen it comes to data models for the SharePoint platform, the discussion often leads quickly to structured andunstructured data. Structured data is typically highly organized and tabular, such as the information in aSharePoint list or a database table. Unstructured data contains information but lacks a rigorous schema, such asthe content of a document or an image. For example, the content of this topic is unstructured data. It containsinformation, but the information doesnt fall into neat categories of rows and columns. C onversely, structuredtables may contain columns for notes, videos, or other unstructured information.At a basic level, you typically use a SharePoint list to store structured data and a SharePoint document library tostore unstructured data. However, one of the key benefits of the SharePoint platform is that it allows users toassociate some structured data with fundamentally unstructured content. For example, a document will typicallyhave a title and an author, as well as purpose-specific information such as the name of a customer, the due datefor an invoice, or the final effective date for a contract. 
You can add these fields to a SharePoint document libraryin the same way that you would add them to a SharePoint list, and the SharePoint platform includes features suchas property promotion that enable you to automatically extract this information from the document in manycases.This blurring between traditional definitions of structured and unstructured data occurs because conceptually, adocument library is simply a list where each item has one—and only one—attachment. SharePoint documentlibraries include all the same functionality as SharePoint lists, such as the ability to add columns and createdifferent views. This allows users to sort, query, and filter libraries of files and documents in the same way thatthey would work with regular structured data.Database Models vs. SharePoint Data ModelsA data model describes the real-world pieces of information that your application will work with, together with therelationship between these pieces of information. In the case of a relational data model, this is often representedas an entity-relationship diagram. The following illustration shows an example of an entity-relationship diagramfor a machine parts inventory database.Entity-relationship diagram for machine parts inventory databaseGenerated from CHM, not final book. Will be superseded in the future. Page 197
  • 198. Today, most developers are familiar with relational databases. In a relational database, the data model is realizedusing the following constructs:  A database contains one or more tables. Typically, each table models a logical entity, such as a person, an organization, or a manufactured item. For example, the machine parts inventory database contains tables for machines, parts, suppliers, and other entities.  Each table contains one or more columns, or fields. Each column represents a single item of information about the entity modeled by the table. For example, the table named Machines includes fields named Name , ModelNumber, and ManufacturerId. Each field has a specific type, such as a string or an integer.  A table row represents a single entry in the table. For example, the Machines table will include a single row for each machine defined in the table.  Each table includes a primary key. This is a field value, or a combination of field values, that uniquely identifies each entry in the table.  You can create relationships between tables by linking the primary key of one table to the same field (the foreign key) in another table. This is known as a foreign key relationship. For example, the Id field is the primary key for the Machines table, while MachineId represents the same field in the MachineDepartment table. As such, a foreign key relationship can be defined between the two tables.Most database engines, such as Microsoft SQL Server, also allow tables to execute some programming logic whencertain events, known as triggers, occur. You might invoke your logic when a row is added, updated, or removedfrom the table. You can use trigger logic to enforce rules that ensure consistency in the data or to drive updatesto applications such as cache refreshes. Typically, database engines use a data definition language (DDL) torepresent the data model internally. In Microsoft SQL Server, the DDL is a subset of SQL statements that areused to create the tables and relationships. The database engine stores DDL metadata that describes thesestructures in a system database. When you query a database, the database engine uses this metadata todetermine how to interact with the database in question.SharePoint allows you to construct data models using constructs that are conceptually similar to those found in aSQL database:Generated from CHM, not final book. Will be superseded in the future. Page 198
  • 199.  SharePoint lists (and by association, document libraries) are conceptually similar to database tables.  SharePoint columns are conceptually similar to database table columns.  SharePoint content types provide an additional layer of abstraction to support reuse and can be compared to the schema for a database table.  You can create relationships between SharePoint lists using lookup columns that are conceptually similar to the foreign key relationships between database tables.However, the way in which the data model is stored and implemented differs substantially between SharePointand a SQL database. Although SharePoint uses SQL Server as its underlying data store, it introduces a level ofabstraction between the data structures you define and the data store. One key advantage of this additionalabstraction is that SharePoint users with sufficient permissions can define and manage their own data structureswithout the intervention of a database administrator. SharePoint stores the metadata that defines columns, lists,and content types in its content databases, in much the same way that the SQL Server database engine storesdata model metadata in its system databases.Generated from CHM, not final book. Will be superseded in the future. Page 199
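To make this mapping concrete, the following sketch (which is not part of the original guidance) shows how the constructs described above correspond to the SharePoint server object model. The site URL, list name, and column names are hypothetical placeholders.
C#
using System;
using Microsoft.SharePoint;

// Illustrative sketch only: shows how the database concepts above map to the
// SharePoint server object model. The site URL, list name, and column names
// are hypothetical placeholders, not values from this guidance.
class ListProvisioningExample
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://intranet.contoso.com"))
        using (SPWeb web = site.OpenWeb())
        {
            // Creating a list is conceptually similar to creating a table.
            Guid listId = web.Lists.Add("Machines", "Machine inventory",
                SPListTemplateType.GenericList);
            SPList machines = web.Lists[listId];

            // Adding columns is conceptually similar to adding table columns.
            machines.Fields.Add("ModelNumber", SPFieldType.Text, true);
            machines.Fields.Add("InstallDate", SPFieldType.DateTime, false);
            machines.Update();

            // Adding an item is conceptually similar to inserting a table row.
            SPListItem item = machines.Items.Add();
            item["Title"] = "Hydraulic press";
            item["ModelNumber"] = "HP-100";
            item.Update();
        }
    }
}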
  • 200. SharePoint Columns, Lists, and Content TypesData models in SharePoint 2010 are implemented using columns, lists, and content types. A full understanding ofthese constructs underpins every effective data model in SharePoint 2010.SharePoint ColumnsThe column, or field, is the core data construct in SharePoint 2010. In the context of the SharePoint platform andSharePoint applications, the terms "column" and "field" are used interchangeably:  "C olumn" is preferred in product documentation and is used in the SharePoint user interface.  "Field" is often used when referring to declarative markup or object model code. For example, columns are represented as F ield elements in site or list definitions, as F ieldRef elements in content type definitions, and by the SPField class in the SharePoint object model. Note:A FieldRef in a C ontentType is a reference to an existing site column, instead of a column definition.C olumns can exist at two different scopes. You can create a list column, which exists only within a specificSharePoint list. You can also create a site column, which is defined at the site collection level and is madeavailable for use in lists and content types across the site collection, including all subsites. Each site collectionincludes a site column gallery in which built-in and user-defined site columns are listed. When you create acolumn, you can define the following information:  C ore details, including the column name and the data type  Group details, which can help you organize and find your columns within a site collection  Logical details, such as whether a value is required, whether the value must be unique, the maximum length of the value, and a default value, if appropriate  Validation details, including a validation formula and an error message Note:For more information about columns and fields in SharePoint 2010, see Building Block: C olumns and FieldTypes and C olumns on MSDN. You can also define columns at any site level, although the common practice isto define all site columns in the root site to maximize reuse within the site collection.The ability to enforce unique column values is new to SharePoint 2010. The unique value constraint applies onlyat the list instance level. Uniqueness can be defined at the site column level, but it is enforced within each list.Because of the way the unique value constraint works, you must index any columns that enforce uniqueness. Youcan only apply unique value constraints to columns with certain data types, because some data types cannot beindexed. You can apply the unique value constraint to a column in three ways—interactively through the userinterface, declaratively by setting the EnforceUniqueValues attribute in the column definition, orprogrammatically through the SPF ield class. Note:For more information about the unique value constraint for SharePoint columns and a list of column types thatcan be indexed, see Enforcing Uniqueness in C olumn Values on MSDN.SharePoint ListsLists are the storage mechanism in the SharePoint platform. In some ways, lists are conceptually similar to a SQLdatabase table, in that they are comprised of columns (or fields) and rows (or list items), and that you can createrelationships between lists. SharePoint lists additionally provide a user interface including forms for interactingwith the data. 
Unlike a database table, which typically has a constant predefined set of columns, the SharePointlist also allows users with sufficient permissions to add or remove columns at will.Although it is possible to define a data model using only lists, the recommended approach is to use content typesto define your key data entities.SharePoint Content TypesC ontent types were introduced in SharePoint 2007 products and technologies. A content type defines themetadata and behavior for a particular data entity—usually, a business document or item of some kind. Eachcontent type contains references to one or more site columns. You can also associate workflows, informationmanagement policies, and document templates with content types. For example, suppose you defined a contentGenerated from CHM, not final book. Will be superseded in the future. Page 200
  • 201. type named C ontract. This content type might include the following:  C olumns named C ustomer, Amount, and Final Effective Date  An approval workflow  A retention policy linked to the F inal Effective Date field  A Word template for a contract documentC ontent types can be created in three ways. Site collection administrators can create content types interactivelythrough the user interface without developer involvement. Developers can create content types declaratively byusing collaborative application markup language (C AML) or programmatically through the SPContentType objectmodel.C ontent types are defined and managed at the site level, but they are typically defined at the root site in a sitecollection. In order to use a content type, you must associate it with a list or a document library. You canassociate a content type with multiple lists or libraries, and each list or library can host multiple content types.This is useful in scenarios where different types of document share similar metadata—for example, you mightstore invoices and sales orders in the same document library, because both share similar fields but might differ interms of approval processes or retention requirements. The ability to associate behaviors with a content type,such as workflows and event receivers, is comparable to the concept of triggers on a database table. However,because the content type can be applied to multiple locations, you can use content types to define a contract,purchase order, or invoice that has the same metadata—and the same behavior—across the entire organization.When you associate a content type with a list or library, the content type is attached to the list, together with thesite columns, workflows, and policies for that content type. These policies and workflows will apply to any item ofthat content type in the list. The following illustration shows this.Associating content types with lists and librariesC ontent types follow the concepts of inheritance, because many data entities will share common metadata andbehaviors. For example, an Invoice content type would inherit from the Document base content type, becausean invoice is a type of document and shares certain characteristics with other types of documents. Ultimately, allcontent types inherit from the Item base content type. For more information about content type inheritance, seeBase C ontent Type Hierarchy on MSDN.When you associate a content type with a list, the site content type is actually copied to the list and is given a newID value that identifies it as a child of the site content type. The list content type is then said to inherit from thesite content type. As a result, changes to a site content type are not reflected in individual lists and librariesunless you explicitly propagate, or "push down," the changes. If you update the content type programmatically,you can use the SPContentType.Update(true) method to propagate your changes—the Boolean argument tothe Update method indicates that your changes should be applied to all child site and list content types. If youupdate the content type through the site collection user interface, you can select whether your updates should beapplied to child content types. For more information, see Updating C ontent Types and on MSDN. Note:You cannot update a content type by changing the declarative (C AML) content type definition.Where possible, you should use content types to define data models in SharePoint 2010 instead of using listsdirectly. 
C ontent types enable you to define data structures that can be reused across lists, sites, site collections,Generated from CHM, not final book. Will be superseded in the future. Page 201
  • 202. and even between farms. This allows you to apply a consistent and identifiable data schema across your entireorganization. Note:Sharing content types across site collection boundaries is new to SharePoint 2010 and requires you toconfigure the managed metadata service application. For more information, see Managed metadata serviceapplication overview on TechNet.For more information about content types, see the product documentation. In particular, we recommend you referto C ontent Types and Building Block: C ontent Types on MSDN and to C ontent type and workflow planning andPlan to share terminology and content types on TechNet. Because the fundamentals of designing content typesremain unchanged, several articles about planning, designing, and using content types for the SharePoint 2007product and technologies release are still relevant. For more information, see Best Practices: Developing C ontentTypes in SharePoint Server 2007 and Windows SharePoint Services 3.0 and Managing Enterprise Metadata withC ontent Types.Generated from CHM, not final book. Will be superseded in the future. Page 202
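To illustrate the propagation behavior described earlier in this topic, the following sketch adds a site column to a site content type and pushes the change down to child content types. This is an illustrative example rather than code from this guidance; the site URL, the Contract content type, and the Final Effective Date site column are assumed to already exist.
C#
using Microsoft.SharePoint;

// Illustrative sketch only: the site URL, content type name, and site column
// name are hypothetical placeholders.
class ContentTypeUpdateExample
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://intranet.contoso.com"))
        {
            // Site content types are typically defined on the root site.
            SPWeb rootWeb = site.RootWeb;
            SPContentType contract = rootWeb.ContentTypes["Contract"];

            // Add an existing site column to the content type as a field link.
            SPField finalEffectiveDate = rootWeb.Fields["Final Effective Date"];
            contract.FieldLinks.Add(new SPFieldLink(finalEffectiveDate));

            // Passing true propagates the change to all child site and list
            // content types, as described in the text above.
            contract.Update(true);
        }
    }
}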
List Relationships in SharePoint 2010

SharePoint 2010 allows you to create relationships between lists in the same site collection. List instances are related through lookup columns (also known as lookup fields). The real benefit of this functionality is that SharePoint 2010 allows you to use join statements, in LINQ to SharePoint or in collaborative application markup language (CAML), to query across lists where lookup column relationships are defined. By default, SharePoint permits a maximum of eight joins per query, although administrators can change this limit through the Central Administration Web site or by using PowerShell. However, queries that contain large numbers of join statements are resource-intensive, and exceeding eight joins per query is likely to have a significant detrimental effect on performance.

Although this newly introduced ability to query across lists brings the capabilities of SharePoint data models closer to those of relational databases, SharePoint supports join predicates only where a lookup column relationship exists between the lists. In this regard, the join functionality in SharePoint is less powerful than the JOIN predicate in SQL.

Lookup Columns Explained

Suppose you use a relational database to manage goods orders received from your customers. The database might include a table named Orders, which stores the orders you've received, and a table named OrderLines, which stores the individual items that comprise an order.

Unrelated database tables

To relate the tables, you could add an OrderID column to the OrderLines table and use this column to define a foreign key relationship between the tables.

Database tables linked by foreign key constraint (primary key)

Alternatively, you could add an OrderNo column to the OrderLines table and use this column to define the foreign key relationship (provided that the OrderNo column in the Orders table is subject to a unique values constraint).

Database tables linked by foreign key constraint
  • 204. Defining the foreign key relationship helps to ensure referential integrity—in the first example, the OrderIDcolumn of the OrderLines table can only contain values that are found in the ID column in the Orders table. Youcan also impose further conditions on the foreign key relationship. For example, when an item is deleted from theOrders table, you can force the database engine to remove corresponding rows from the OrderLines table.C onversely, you can prohibit the deletion of OrderLines items that are linked to an active row in the Orders table.Lookup column relationships in SharePoint are conceptually similar to foreign key constraints in relationaldatabases, but there are key differences. Suppose you want to implement the previous example in a SharePointdata model. First, you create the Orders list. Next, you define a site lookup column that retrieves values from theOrders list. Finally, you create the OrderLines list and you add the lookup column that retrieves values fromOrders. When a user enters a new order line to the OrderLines list, they would select the associated order usingthe lookup column. You dont get to choose which columns in the Orders or OrderLines lists drive the foreign keyconstraint—in SharePoint lists, you can view the built in ID column as a permanent, unchangeable primary key;and this is the value that drives the relationship. Instead, you choose the column in the target list that you want todisplay in the source list, by setting the ShowF ield attribute. When a user adds data to the source list, he or shecan select from a list of values in the column you selected on the target list. The following illustration shows this.Lookup column relationship between SharePoint listsAnother key difference is that in a relational database, you can apply a foreign key constraint to existing data.This is not always good practice, and you would need to take care to remedy any existing data rows that violatethe constraint. However, in SharePoint, you do not have this option—you cannot convert an existing column to alookup column. You must create the lookup column first, and then the user must populate the data by selectingvalues from the target list. Note that a lookup column definition in itself does not define a relationship until youadd it to a list. For example, you can create a lookup field as a site column. The lookup column definitioneffectively defines one half of the relationship.Lookup Column DefinitionGenerated from CHM, not final book. Will be superseded in the future. Page 204
  • 205. Whenever you add the site column to a list, you effectively create a unique foreign key relationship between thesource list and the target list. In the case of lookup columns, the relationship between lists is managed bySharePoint, not by the underlying database. You can also leave a lookup column blank unless it is marked as arequired field, whereas a foreign key constraint always requires a value.If you want to model a many-to-many relationship using SharePoint lists, you must create an intermediate list tonormalize the relationship. This is conceptually similar to the normalization process in database design, where youwould also use an intermediate table to model a many-to-many relationship. For example, suppose you want tomodel the relationship between parts and machines. A part can be found in many machines, and a machine cancontain many parts. To normalize the relationship, you would create an intermediate list named PartMachine, asshown in the following illustration.Using an intermediate list to model a many-to-many relationshipGenerated from CHM, not final book. Will be superseded in the future. Page 205
In this example, the intermediate list, PartMachine, contains lookup columns that link to both lists. To create a relationship between a part and a machine, you would need to create an entry in the PartMachine list. To navigate from a part to a machine, or vice versa, you would have to browse through the PartMachine list. From a user experience point of view, this is less than ideal, so at this point, you would probably add custom logic and custom user interface components to maintain associations between parts and machines.

Relationships between SharePoint lists can be navigated programmatically using either CAML or LINQ to SharePoint. For more information, see Data Access in SharePoint 2010.

Creating and Using Lookup Columns

You can create lookup columns in three different ways—interactively through the SharePoint site settings user interface (UI), programmatically through the SPFieldLookup class, or declaratively through CAML. For example, the following CAML code declaratively defines a lookup column named Projects Lookup that returns items from a list named Projects.

XML
<Field ID="{3F55B8CF-3537-4488-B250-02914EE6B4A8}"
       Name="ProjectsLookup"
       DisplayName="Projects Lookup"
       StaticName="ProjectsLookup"
       DisplaceOnUpgrade="TRUE"
       Group="SiteColumns"
       ShowField="Title"
       WebId=""
       List="Lists/Projects"
       Type="Lookup"
       Required="TRUE">
</Field>

The attributes of interest for a lookup column are as follows:
 The value of the Type attribute must be set to Lookup.
 The WebId attribute specifies the internal name of the site that contains the target list. If the attribute is omitted or set to an empty string, SharePoint will assume that the list is on the root site of the site collection.
 The List attribute specifies the site-relative URL of the target list. This list instance must exist before the field is defined.
 The ShowField attribute specifies the column in the target list that contains the values that you want to display in the lookup column.

Note:
Consider picking a meaningful ShowField value for the lookup column that is unlikely to change. For example, choosing a product SKU or a model number is a better foundation for a relationship than a description field.

List Columns, Site Columns, and Content Types

You can define a lookup column as a site column or a list column. If you define a lookup column declaratively or programmatically, you must take care to ensure that the target list (in other words, the list that the column refers to) is in place at the appropriate point in the column provisioning process. Regardless of whether you define a lookup column as a site column or a list column, the target list must already exist at the point at which the column is created—otherwise, the column will be unusable. Similarly, if you define a lookup column as part of a list definition, the target list must already exist at the point at which you provision the list containing the lookup column.

Because a lookup column defines a relationship between two list instances, it can often make sense to define your lookup columns at the list level. There are some circumstances where it makes sense to use site columns and content types.
For example, if many similar lists within the same site collection will include a relationship to a liston the root site, you can define a site column for the lookup, include it in a content type, and provision thecontent type to multiple lists within the site collection.Projected FieldsIn addition to the column you identify in the ShowF ield attribute, SharePoint 2010 enables you to displayadditional columns from the target list in the view of the list that contains the lookup column. These additionalGenerated from CHM, not final book. Will be superseded in the future. Page 206
  • 207. columns are known as projected fields. For example, suppose you use SharePoint lists to model the relationshipbetween employees and their departments. You create a Department lookup column for the Employees list. Youmight also want to display the name of the department manager in the list of employees, as shown in thefollowing illustration.Projected fields in SharePoint lists Note:This is a somewhat contrived example, because, in reality, the department manager may not be theemployees manager, and the department manager would also be a member of the employees table. However,it serves to illustrate the concept of projected fields.Enforcing List RelationshipsSharePoint 2010 can help you to maintain referential integrity in your data model by enforcing the relationshipsdefined by lookup columns. Just like foreign key constraints in a relational database, SharePoint allows you toconfigure restrict delete and cascade delete rules on lookup column relationships:  C ascade delete rules automatically delete items that reference a record when you delete that record. This rule is typically used to enforce parent-child relationships.  Restrict delete rules prevent you from deleting a record that is referenced by a lookup column in another list. This rule is typically used to enforce peer-to-peer relationships.Parent-Child Relationships and Cascade Delete RulesIn a parent-child relationship, some records are children of other records. The child records are meaninglesswithout a valid parent record. For example, an invoicing system might model a relationship between invoices andinvoice line items. The invoice line items, which describe the product purchased, quantity ordered, and unit cost,are child records of the invoice item, which describes the customer, the shipping address, the total cost, and soon.To create the relationship between invoices and invoice line items, you would add a lookup column to theInvoiceLineItems list that retrieves an invoice number from the Invoices list. Because an invoice line item ismeaningless without a parent invoice, it would be appropriate to configure the lookup column to cascade deletes.This ensures that when a user deletes an invoice, SharePoint will delete the corresponding line items.C hild items without a valid parent are often referred to as orphans. C ascade delete behavior is designed toprevent orphans from occurring.Peer-to-Peer Relationships and Restrict Delete RulesIn a peer-to-peer relationship, there is no clear hierarchy between the related entities. For example, in ourinvoice line items list, you might want to include a lookup column that retrieves an item from a list of products. Inthis case, the product is neither child to nor parent of the invoice line item—the product lookup simply providesadditional information on the invoice line.Generated from CHM, not final book. Will be superseded in the future. Page 207
  • 208. Suppose you delete a product record because you no longer sell that product. This damages the referentialintegrity of your data model, because historical invoice line items will now reference product records that do notexist. A better approach is to mark the product record as inactive if it is no longer available for distribution. In thiscase, it would be appropriate to configure the product lookup column in the InvoiceLineItem list to restrict deletes.This ensures that a product record cannot be deleted if it has been referenced in an invoice line item.Making a record inactive while ensuring that it is still accessible for consistency with historical records is oftenreferred to as a soft delete.You can configure list relationship behavior interactively through the user interface or programmatically in afeature receiver class. Because list relationship behavior is specific to individual list instances, you cannotconfigure this behavior declaratively in column definitions or content types. For details of how to configurecascade delete and restrict delete rules programmatically, see How to: Programmatically Set the Delete Behavioron a Lookup Field.Generated from CHM, not final book. Will be superseded in the future. Page 208
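The following sketch gives a flavor of that programmatic configuration. It is an illustrative example rather than the code from the referenced article; the list and column names are hypothetical, and the logic would typically run from a feature receiver or other server-side code.
C#
using Microsoft.SharePoint;

// Illustrative sketch only: configures delete behavior on lookup columns in a
// hypothetical InvoiceLineItems list with Invoice and Product lookup columns.
class ListRelationshipBehaviorExample
{
    static void ConfigureDeleteBehavior(SPWeb web)
    {
        SPList lineItems = web.Lists["InvoiceLineItems"];

        // Cascade delete: deleting an invoice also deletes its line items.
        SPFieldLookup invoiceLookup = (SPFieldLookup)lineItems.Fields["Invoice"];
        invoiceLookup.Indexed = true;   // relationship behavior requires an indexed column
        invoiceLookup.RelationshipDeleteBehavior = SPRelationshipDeleteBehavior.Cascade;
        invoiceLookup.Update();

        // Restrict delete: a product cannot be deleted while line items reference it.
        SPFieldLookup productLookup = (SPFieldLookup)lineItems.Fields["Product"];
        productLookup.Indexed = true;
        productLookup.RelationshipDeleteBehavior = SPRelationshipDeleteBehavior.Restrict;
        productLookup.Update();
    }
}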
  • 209. Query Throttling and IndexingA familiar challenge when you work with SharePoint lists is how to address the performance degradation that canoccur when your list contains a large number of items. However, SharePoint is capable of managing extremelylarge lists containing millions of rows. The often-quoted limit of 2,000 items per list, actually refers to themaximum number of items that you should retrieve in a single query or view in order to avoid performancedegradation. Effective indexing and query throttling strategies can help you to improve the performance of largelists. Note:For more information about working with large lists, see List Patterns, "Designing Large Lists and MaximizingPerformance" from Performance and capacity test results and recommendations on TechNet and HandlingLarge Folders and Lists on MSDN.What Is Indexing?SharePoint enables you to index columns in a list. This is conceptually similar to indexing columns in a databasetable; however, in the case of SharePoint lists data, the index is maintained by SharePoint instead of SQLServer.Indexing columns in a list can substantially improve the performance of various query operations, such as queriesthat use the indexed column, join operations, and ordering operations such as sorting. In any list, you can eitherindex a single column or define a composite index on two columns. C omposite indexes can enable you to speedup queries across related values. However, like with database indices, list indexing does incur a performanceoverhead. Maintaining the index adds processing to creating, updating, or deleting items from a list, and the indexitself requires storage space. A list instance supports a maximum of 20 indices. Some SharePoint features requireindices and cannot be enabled on a list where there is no index slot remaining. You should choose your indexedcolumns carefully to maximize query performance while avoiding unnecessary overhead. Note:Not all column data types can be indexed. For a list of column types that can be indexed, see EnforcingUniqueness in C olumn Values on MSDN. Also note that you cannot include text fields in a composite index.What Is Query Throttling?Query throttling is a new administrative feature in SharePoint 2010. It allows farm administrators to mitigate theperformance issues associated with large lists by restricting the number of items that can be accessed when youexecute a query (known as the list view threshold). By default, this limit is set to 5,000 items for regular usersand 20,000 items for users in an administrator role. If a query exceeds this limit, an exception is thrown and noresults are returned to the calling code. Out-of-the-box, SharePoint list views manage throttled results byreturning a subset of the query results, together with a warning message that some results were not retrieved.Farm administrators can use the C entral Administration Web site to configure query throttling for each Webapplication in various ways. For example, farm administrators can do the following:  C hange the list view threshold, both for users and for site administrators.  Specify whether developers can use the object model to programmatically override the list view threshold.  Specify a daily time window when queries that exceed the list view threshold are permitted. This enables organizations to schedule resource-intensive maintenance operations, which would typically violate the list view threshold, during off peak hours. 
 Limit the number of lookup, person, or workflow status fields that can be included in a single database query.If the farm administrator has enabled object model overrides, you can also change list view thresholdsprogrammatically. For example, you can do the following:  C hange the global list view threshold for a Web application by setting the SPWebApplication.MaxItemsPerThrottledOperation property.  Override the list view threshold for an individual list by setting the SPList.EnableThrottling property to false.  Override the query throttling settings on a specific query by using the SPQueryThrottleOption enumeration.Query throttling is designed to prevent performance degradation, so you should only programmatically suspendthrottling as a temporary measure and as a last resort. Ensure that you restrict the scope of any throttlingoverrides to a minimum. We recommend against changing the query throttling thresholds. The default limit of5,000 items was chosen to match the default point at which SQL Server will escalate from row-level locks to atable-level lock, which has a markedly detrimental effect on overall throughput.Generated from CHM, not final book. Will be superseded in the future. Page 209
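The following sketch shows one way to apply a programmatic override of this kind. It is illustrative only; the list and field names are placeholders, and as noted above, any override should be scoped as narrowly as possible.
C#
using Microsoft.SharePoint;

// Illustrative sketch only: overrides the list view threshold for a single
// query. "LargeList" and the "Status" field are hypothetical placeholders.
class ThrottleOverrideExample
{
    static SPListItemCollection QueryLargeList(SPWeb web)
    {
        SPList largeList = web.Lists["LargeList"];

        SPQuery query = new SPQuery();
        query.Query = "<Where><Eq><FieldRef Name='Status'/>" +
                      "<Value Type='Text'>Open</Value></Eq></Where>";
        query.RowLimit = 100;

        // Use the higher threshold defined for auditors and administrators.
        // This takes effect only if the farm administrator has enabled object
        // model overrides and the current user has sufficient permissions.
        query.QueryThrottleMode = SPQueryThrottleOption.Override;

        return largeList.GetItems(query);
    }
}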
  • 210. List-based throttling applies to other operations as well as read operations. In addition to query operations,throttling also applies to the following scenarios:  Deleting a list or folder that contains more than 5,000 items  Deleting a site that contains more than 5,000 items in total across the site  C reating an index on a list that contains more than 5,000 items Note:For detailed information about the behavior and impact of indexing and query throttling, download the whitepaper Designing Large Lists and Maximizing Performancefrom Performance and capacity test results andrecommendations on TechNet.How Does Indexing Affect Throttling?The list view threshold does not apply simply to the number of results returned by your query. Instead, it restrictsthe numbers of database rows that can be accessed in order to complete execution of the query at the row levelin the content database. For example, suppose you are working with a list that contains 10,000 items. If you wereto build a query that returns the first 100 items sorted by the ID field, the query would execute without issuebecause the ID column is always indexed. However, if you were to build a query that returns the first 100 itemssorted by a non-indexed Title field, the query would have to scan all 10,000 rows in the content database inorder to determine the sort order by title before returning the first 100 items. Because of this, the query would bethrottled, and rightly so—this is a resource-intensive operation.In this case, you could avoid the issue by indexing the Title field. This would enable SharePoint to determine thetop 100 items sorted by title from the index without scanning all 10,000 list items in the database. The sameconcepts that apply to sort operations also apply to where clauses and join predicates in list queries. C arefuluse of column indexing can mitigate many large list performance issues and help you to avoid query throttlinglimits.Generated from CHM, not final book. Will be superseded in the future. Page 210
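The following sketch illustrates the mitigation described above by indexing the Title column and then running the sort query against the index. It is an illustrative example; the list is assumed to be an existing large list.
C#
using Microsoft.SharePoint;

// Illustrative sketch only: indexes the column used for sorting so that the
// query can be satisfied without scanning every row in the content database.
class IndexingExample
{
    static SPListItemCollection GetFirstItemsByTitle(SPList largeList)
    {
        // Index the Title field (a list supports a maximum of 20 indices).
        SPField titleField = largeList.Fields["Title"];
        if (!titleField.Indexed)
        {
            titleField.Indexed = true;
            titleField.Update();
        }

        // The sort is now resolved from the index, so returning the first 100
        // items no longer requires a scan of the full list.
        SPQuery query = new SPQuery();
        query.Query = "<OrderBy><FieldRef Name='Title'/></OrderBy>";
        query.RowLimit = 100;
        return largeList.GetItems(query);
    }
}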
  • 211. External Data in SharePoint 2010In SharePoint 2010, the functionality that enables you to work with external data is provided by BusinessC onnectivity Services (BC S). BC S is an umbrella term—much like Enterprise C ontent Management (EC M) inSharePoint 2007—that encompasses a broad range of components and services.Introducing Business Connectivity ServicesEach of these components and services provide functionality relating to the modeling, access, and managementof external data. The following illustration shows the key components of the BC S. SharePoint Server includes allcomponents that are part of SharePoint Foundation.Business Connectivity Services in SharePoint 2010The Business Data C onnectivity (BDC ) service application and the BDC runtime are the core components of theBC S when it comes to modeling, managing, and accessing external data. The Secure Store Service (SSS)supports access to external data by allowing you to map the credentials of SharePoint users or groups to externalsystem credentials. Other BC S components enable users to interact with external data in various ways. For moreinformation about how these components relate from an execution perspective, see Hybrid Approaches. Note:The Business Data C onnectivity service should not be confused with the Business Data C atalog (also referredto as the BDC ) in Office SharePoint Server 2007, which was the predecessor to the BC S. In thisdocumentation, BDC refers to the Business Data C onnectivity service application.Service Proxy Groups Note:Generated from CHM, not final book. Will be superseded in the future. Page 211
  • 212. The service application framework is a complex topic in its own right. This section discusses serviceapplications and service application proxies with regard to specific issues in working with external data. Formore information about the service application framework, see Service applications and service managementon TechNet.SharePoint 2010 introduces a new service application framework. This replaces the shared service provider foundin Office SharePoint Server 2007 and enables third parties to build new service applications for the SharePointplatform. Instead of using a shared service provider to provide collections of services to SharePoint farms andWeb applications, each service in SharePoint 2010 is architected as an individual service application. In BC S, boththe SSS and the BDC are examples of service applications.Administrators can create multiple instances of particular service applications. For example, you might configureone BDC instance for an intranet portal and another for a public-facing internet site. In order to use a serviceapplication, you must create a service application proxy. Where the service application provides a service, theservice application proxy consumes a service. A default configuration of SharePoint 2010 largely contains pairs ofservice applications and service application proxies, as shown in the following illustration.Service applications and proxies in the Central Administration Web siteEach Web application is associated with an application proxy group that contains a collection of service applicationproxies. This model supports a flexible approach to application proxy management—for example, anadministrator may want different Web applications to use different subsets of the available application proxies.You can add a single service application proxy to multiple application proxy groups. Likewise, you can addmultiple service application proxies of the same type to an application proxy group. However, the applicationproxy group will only use one of the proxies, and the proxy instance you want to use must be marked as thedefault instance of that proxy type for the application proxy group. Having more than one proxy instance for thesame service type is an administrative convenience that enables you to easily switch between two instances bychanging which is marked as default.This arrangement can lead to confusion for developers who are not familiar with the service applicationframework. For example, if you add a new instance of the SSS application, and you want to use that SSSapplication instance in your Web application, you must ensure the following:  The service application proxy for the service instance is in the application proxy group mapped to your Web application.  The service application proxy is the default SSS proxy instance in the application proxy group.Generated from CHM, not final book. Will be superseded in the future. Page 212
  • 213. Failure to configure application proxy groups correctly can lead to bugs that are hard to diagnose. For moreinformation about application proxy groups, see SharePoint 2010 Shared Service Architecture Part 1 on MSDNBlogs.More InformationThe following topics help you to understand external data models in SharePoint 2010:  Business Data C onnectivity Models. This topic introduces business data connectivity models and describes how BDC models relate to external content types, external lists, and indexing external data for search.  Modeling C omplex Types in External Data. This topic describes various options for mapping complex types to a BDC model, including the use of .NET connectivity assemblies.  Modeling Associations in External Data. This topic explains the concepts behind associations, which enable you to build relationships between external content types in a BDC model.  Filters and Throttling in the BDC . This topic describes how you can use filtering to constrain the result set returned by a BDC operation and explains how the BDC uses throttling to limit the impact of BDC operations on the performance.  BDC Models and C ommon Scenarios. This topic explains how the different approaches to creating BDC models map to common application development scenarios.Generated from CHM, not final book. Will be superseded in the future. Page 213
  • 214. Business Data Connectivity ModelsWhen you design a SharePoint application around external data, the metadata for your data model is stored andmanaged by the BDC service application. This is known as a Business Data C onnectivity model, or BDC model.You can interactively create a BDC model in SharePoint Designer 2010, or programmatically in Visual Studio 2010for more complex scenarios. In a BDC model, data entities are represented by external content types. Anexternal content type models an external data entity, such as a database table or view or a Web service method,and defines a set of stereotyped operations on that data entity. In addition to external content types, a BDCmodel typically includes an external data source definition, connection and security details for the external datasource, and associations that describe the relationship between individual data entities. Other BC S components,such as external lists and external data search, enable end users to interact with the external data provided bythe BDC model. Note:A stereotyped operation is a data access method that conforms to a common and well-recognized signature,such as create, retriev e, update, and delete operations.C reating BDC models for your external data sources offers many benefits. First, the BDC model enables you tocentrally maintain all the information you need to interact with an external data source. This ensures that theexternal data source is accessed consistently by applications across your environment. After creating the BDCmodel, you can work with the external data in several ways without writing any code. The following are someexamples:  You can surface the external data through the out-of-the-box Business Data Web Parts.  You can interact with the external data through external lists.  You can crawl and index the external data for search.The following illustration shows the basic overall structure of a BDC model.Conceptual illustration of a BDC modelAs you can see, each model defines one or more external systems or services. These are sometimes known asLOB systems (line-of-business systems) for historical reasons, and are represented by LobSystem elements inthe BDC model schema. This represents a general view of an external system, its data entities, and itsoperations. For example, it might represent a particular type of C ustomer Relationship Management (C RM)system. Within each external system definition, you must define one or more system instances. These representa specific, individual implementation of the external system, such as a particular installation of a C RM system.The system instance definition defines the connection and authentication information that the BDC serviceapplication requires in order to communicate with the external system instance. The other key component of anexternal system definition is a set of entities, represented by external content types (EC Ts). These are describedlater in this topic.The metadata that comprises a BDC model is stored as XML. You can import, export, and manually edit BDCmodels as .bdcm files. For a good introduction to the BDC model schema, see BDC Model Infrastructure onMSDN.BDC models are stored by the BDC metadata store, a central component of the BDC service application. When aSharePoint client application requests external data, the BDC runtime component on the Web front-end serverrequests the metadata that defines the BDC model from the BDC metadata store. The BDC runtime then uses theGenerated from CHM, not final book. 
  • 215. metadata provided to perform data operations directly on the external system. The BDC runtime also cachesBDC model metadata on the Web front-end server.In the case of client computers that use Microsoft Office 2010 to access external systems, the metadata for anapplication is packaged on the server and deployed with the application for use by the BDC client runtime. TheBDC client runtime uses the metadata provided to perform operations directly on the external system instead ofgoing through SharePoint. The BDC client runtime caches both BDC model metadata and the external data itselffor offline use. This is illustrated by the following diagram.Metadata and data access in BDC models  For more information about this process, see Mechanics of Using Business C onnectivity Services on MSDN. You can also download the Microsoft Business C onnectivity Services Model poster from the Microsoft Web site.External Content TypesAn external content type (EC T) is conceptually similar to a regular SharePoint content type (although unrelated interms of implementation). Just like a regular content type, an EC T is a collection of metadata that models a dataentity. External data sources describe data schemas in different ways. For example, Web services describe theirdata entities and operations using the Web Service Description Language (WSDL), while relational databasesdescribe their data entities and operations using a database schema. When you create a BDC model inSharePoint Designer, these data definitions are translated into EC Ts. Note:Visual Studio 2010 also provides tooling for defining BDC models for .NET C onnectivity Assemblies.Each EC T definition includes the following components:  One or more identifiers that uniquely identify instances of the data entity  One or more methods (known as "operations" in SharePoint Designer) that define a particular operation,Generated from CHM, not final book. Will be superseded in the future. Page 215
  • 216. such as create, read items, update, or delete, on the data entityThe methods are the central component of any EC T, and the BDC service application infers the structure of yourdata entity from the return types of these methods. The methods that make up an EC T are known as stereotypedoperations, because they map the external service methods or database queries to standardized data operationsthat the BC S user interface components can use to display and manipulate the external data. For example, theBDC model supports stereotyped operations that perform the following actions:  C reate an item.  Retrieve all items.  Retrieve an individual item.  Update an item.  Delete an item.  Stream an item. Note:For more information about stereotyped operations, see Designing a Business Data C onnectivity Model onMSDN.Stereotyped operations offer many advantages, such as enabling SharePoint to crawl and index external data forsearch and allowing users to manipulate external data through built-in Business Data Web Parts without requiringcustom-coded components.External ListsExternal lists are not part of the BDC model, but they are briefly described here because of their closerelationship with EC Ts. An external list is a BC S component that provides a SharePoint list wrapper for dataentities modeled by an EC T. Unlike SharePoint lists and content types, each external list is mapped to a singleEC T. The external list enables users to view, sort and filter, create, update, and delete external data entities inthe same way that they would work with data in a regular SharePoint list. An EC T must implement F inder andSpecificFinder methods to be exposed as an external list. As a developer, you can also use the SPList objectmodel to programmatically interact with data through the external list (with some restrictions). However, becauseexternal lists dont "own" the external data, they cannot receive events when items are added, updated, ordeleted. As such, you cant associate workflows or event receivers with external lists. However, logic in workflowsand event receivers can access items in external lists. Note:For more information about using the SPList object model with external lists, see Hybrid Approaches. For moreinformation about F inder and SpecificF inder methods in the BC S, see Stereotyped Operations Supported byBDC on MSDN.External Data and SearchThe SharePoint search service uses EC Ts to crawl external data sources for search. To support search crawls, theEC T must include the following types of stereotyped operations:  An IdEnumerator method. This type of method returns the identity values for each instance of an EC T entity. You can support incremental indexing by configuring the IdEnumerator method to return a field that represents the last modified date and time, because this enables the search service to establish whether an item has been modified since the last search crawl.  A SpecificFinder method. This type of method returns a single entity instance when provided with the unique identifier for that item.Generated from CHM, not final book. Will be superseded in the future. Page 216
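The following sketch shows the kind of SPList code referred to above running against an external list. It is illustrative only; the Customers external list and its CustomerName column are hypothetical and depend entirely on the external content type behind the list.
C#
using System;
using Microsoft.SharePoint;

// Illustrative sketch only: reads items from a hypothetical external list
// through the SPList object model.
class ExternalListExample
{
    static void ReadCustomers(SPWeb web)
    {
        SPList customers = web.Lists["Customers"];

        // Enumerating the items invokes the ECT's Finder operation against the
        // external system through the BDC runtime.
        foreach (SPListItem customer in customers.Items)
        {
            Console.WriteLine(customer["CustomerName"]);
        }
    }
}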
Modeling Complex Types in External Data

When your external system contains tabular data structures, it's straightforward to build a representative Business Data Connectivity (BDC) model. SharePoint Designer 2010 will discover the data structures for you, and tabular entities map well to SharePoint lists. However, advanced scenarios require more complex handling. How do you create a BDC model that aggregates data from multiple sources? How do you create your data entities when your external system contains complex, nested data structures? This topic describes some of the options that can help you to address these issues.

Creating and Using .NET Connectivity Assemblies

When you create a new connection for a BDC model in SharePoint Designer 2010, you've probably noticed that you're presented with three options for the type of connection—SQL Server, WCF Service, or .NET type. This last option, the .NET type, is designed to allow you to use a .NET connectivity assembly (variously referred to as a .NET shim, a BCS shim, or a shim assembly) to drive your BDC model. Creating a .NET connectivity assembly allows you to write custom code to define the stereotyped operations for your BDC model. For example, you could create a Finder method, using the recommended method signature for that operation, and include custom logic that queries and concatenates data from multiple entities.

Typical scenarios in which you should consider using a .NET connectivity assembly include the following:
- You want to aggregate data from multiple services and expose a single data model to SharePoint.
- You want to access data that is not accessible through a SQL Server database connection or a WCF Web service.
- You want to convert proprietary data types returned by an external system into .NET data types that are understood by the BDC runtime.
- You want to "flatten" complex data entities into fields that are compatible with the user interface (UI) components provided by the BCS.

Visual Studio provides tooling for modeling and building .NET connectivity assemblies. For more information on building .NET connectivity assemblies for BDC models, see Integrating Business Data into SharePoint on MSDN. Where possible, you should aim to make your .NET connectivity assembly methods conform to the recommended signatures for stereotyped operations, because this will enable you to maximize functionality without writing custom code. For more information, see Stereotyped Operations Supported by BDC and Recommended Method Signatures for Stereotyped Operations on MSDN.

For a practical example of a .NET connectivity assembly, see the External Data Models reference implementation.

Note: In the BDC model overview discussed in Business Data Connectivity Models, the .NET connectivity assembly maps to the external system level (LobSystem in the BDC schema). Because of this, the .NET connectivity assembly must provide classes and methods for all the entities in your model—you can't mix and match with other connection types.

Flattening Nested Complex Types

In an XML data structure, a complex type refers to an element that has child elements. A complex type that contains only one level of descendants can be represented as a tabular data structure. However, if a complex type contains more than one level of descendants—in other words, the parent node has grandchildren as well as children—the type can no longer be represented as a table or a list.
In this situation, you must take additional action if you want to use built-in data constructs, such as external lists and Business Data Web Parts, to provide an interface for your external data. Flattening describes the process whereby a complex, nested type is converted into a flat, two-dimensional structure that can be managed more easily by the out-of-the-box BCS components.

In general, there are two approaches you can use to manage nested complex types in SharePoint data models:
- You can flatten the nested complex types into one or more simple types.
- You can build a custom user interface that is able to represent the complex structure, such as through the use of custom fields or custom Web Parts.

If you choose to flatten the nested complex types, there are various options available to you. Consider the following example of a Customer entity, returned by a Web service, which includes a nested Address element.

XML
<Customer>
  <Name>Contoso</Name>
  <Address>
    <Street>1 Microsoft Way</Street>
    <City>Redmond</City>
    <StateProvince>WA</StateProvince>
    <PostalCode>98052</PostalCode>
  </Address>
</Customer>

One approach would be to modify the Web service to return a flattened data structure that maps well to external lists and Business Data Web Parts:

XML
<Customer>
  <Name>Contoso</Name>
  <AddressStreet>1 Microsoft Way</AddressStreet>
  <AddressCity>Redmond</AddressCity>
  <AddressStateProvince>WA</AddressStateProvince>
  <AddressPostalCode>98052</AddressPostalCode>
</Customer>

Although this approach certainly solves the problem, in many cases, you will not want, or will not be able, to modify the Web service. An alternative approach is to include a format string in the BDC model, so that the data entity is displayed as a flattened structure. In this case, the customer address is "flattened" and displayed as a single string.

XML
<TypeDescriptor TypeName="CustomerAddress" IsCollection="false" Name="CustomerAddresses">
  <Properties>
    <Property Name="ComplexFormatting" Type="System.String" />
  </Properties>
  <TypeDescriptors>
    <TypeDescriptor TypeName="CustomerAddress" Name="CustomerAddress">
      <Properties>
        <Property Name="FormatString" Type="System.String">{0}, {1}, {2} {3}</Property>
      </Properties>
      <TypeDescriptors>
        <TypeDescriptor TypeName="System.String" Name="Street" />
        <TypeDescriptor TypeName="System.String" Name="City" />
        <TypeDescriptor TypeName="System.String" Name="StateProvince" />
        <TypeDescriptor TypeName="System.String" Name="PostalCode" />
      </TypeDescriptors>
    </TypeDescriptor>
  </TypeDescriptors>
</TypeDescriptor>

In this example, the format string tells the BDC runtime how to render the address entity as a single string, in the order that the child elements are listed in the TypeDescriptors collection. If you apply the sample data to this BDC model, the address is formatted on a single line as 1 Microsoft Way, Redmond, WA 98052. You can programmatically retrieve the formatted data by using the EntityInstance.GetFormatted("FieldName") method. However, this approach has several limitations. First, the approach is only viable if the data entity can be represented effectively as a single string. Second, this formatting only handles the display of data. If you need to update the external data, you must add programming logic or custom forms to parse the new values and update the data source. Finally, you can only use format strings with Business Data Web Parts; this approach will not work with external lists.

A third option is to use a custom renderer. A custom renderer is a .NET class containing a static method that takes in an array of objects and returns a string. The runtime calls this renderer to format the objects into a string. To use this approach, in the TypeDescriptor element, you would use the RendererDefinition attribute to identify the method, class, and assembly of the custom renderer. Using a custom renderer is an expensive operation, because the renderer must be called on a per-item basis; because of this, you should generally only use a custom renderer when no other options are available. Just like format strings, custom renderers can only be used with Business Data Web Parts and will not work with external lists.

Another option is to create a custom field type. A custom field type defines a data type for a SharePoint list column, and provides a useful way of storing complex list data in a manageable way. You can also create custom field controls that interpret the data in the custom field type and render it in a user-friendly way.
For example, you could create a custom field type that stores nested address data, together with a custom field control that displays the data in a flattened, list-friendly format. Custom field controls typically define two interfaces—one that presents the data, and one that allows the user to edit the data—so that the control works in both list view forms and list edit forms. In the edit view, you can provide a user interface that allows the user to provide the field data in its nested format, thereby preserving the integrity of the underlying data. Custom field types and field controls offer the most flexible approach to working with complex data, and you can build in sophisticated behavior such as sorting and filtering. However, creating field types and field controls involves creating several
classes and definition files, which makes them somewhat complicated to implement. For examples of how to create custom field types and field controls, see Creating Custom SharePoint Server 2010 Field Types and Field Controls with Visual Studio 2010 and Silverlight 3 on MSDN.

Finally, you can use a .NET connectivity assembly to manage the conversion between complex types and flat types. This is a powerful approach, because you can specify exactly how your data is flattened and unflattened for each stereotyped operation. The .NET connectivity assembly bridges the gap between the external system and the BDC model—the external system sees nested complex types, while the BDC model sees flattened data structures. The sketch that follows this topic illustrates the idea.

For more information about these approaches, see Working with Complex Data Types on MSDN.
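As a rough sketch of that last option, the following code shows the general shape of a .NET connectivity assembly that flattens a nested address into the flat Customer structure shown earlier. Everything here is illustrative: the ExternalCustomer and ExternalAddress classes stand in for whatever types the external system actually returns (in a real model they would come from a service reference or data access code), and the ReadList and ReadItem methods only approximate the recommended Finder and SpecificFinder signatures.

C#
using System.Collections.Generic;
using System.Linq;

// Stand-ins for the nested data contract returned by the external system (illustrative).
public class ExternalAddress
{
    public string Street { get; set; }
    public string City { get; set; }
    public string StateProvince { get; set; }
    public string PostalCode { get; set; }
}

public class ExternalCustomer
{
    public string Id { get; set; }
    public string Name { get; set; }
    public ExternalAddress Address { get; set; }
}

// Flat entity surfaced to the BDC model; each property maps to a simple type descriptor.
public class Customer
{
    public string CustomerId { get; set; }
    public string Name { get; set; }
    public string AddressStreet { get; set; }
    public string AddressCity { get; set; }
    public string AddressStateProvince { get; set; }
    public string AddressPostalCode { get; set; }
}

// Service class whose methods back the stereotyped operations in the BDC model.
public static class CustomerService
{
    // Finder: returns all customers with the nested address flattened.
    public static IEnumerable<Customer> ReadList()
    {
        return GetCustomersFromExternalSystem().Select(Flatten);
    }

    // SpecificFinder: returns a single customer by its identifier.
    public static Customer ReadItem(string customerId)
    {
        ExternalCustomer match = GetCustomersFromExternalSystem()
            .FirstOrDefault(c => c.Id == customerId);
        return match == null ? null : Flatten(match);
    }

    private static Customer Flatten(ExternalCustomer source)
    {
        return new Customer
        {
            CustomerId = source.Id,
            Name = source.Name,
            AddressStreet = source.Address.Street,
            AddressCity = source.Address.City,
            AddressStateProvince = source.Address.StateProvince,
            AddressPostalCode = source.Address.PostalCode
        };
    }

    // Placeholder for the real call to the external Web service or database.
    private static IEnumerable<ExternalCustomer> GetCustomersFromExternalSystem()
    {
        yield return new ExternalCustomer
        {
            Id = "CUST001",
            Name = "Contoso",
            Address = new ExternalAddress
            {
                Street = "1 Microsoft Way",
                City = "Redmond",
                StateProvince = "WA",
                PostalCode = "98052"
            }
        };
    }
}

The BDC model's type descriptors would then describe only the flat Customer shape, while this assembly is the only component that deals with the nested structure.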
Modeling Associations in External Data

A Business Data Connectivity (BDC) model can include associations between external content types (ECTs). An association is a relationship that allows the BDC runtime to navigate between external content types. You can create two different types of association between external content types:
- Foreign key association. This type of association maps a field in one ECT (the foreign key) to an identifier in another ECT.
- Foreign keyless association. This type of association uses custom logic to relate one ECT to another ECT.

For example, suppose you create an ECT named Customers, with an identifier of CustomerID. You also create an ECT named Orders, which includes a CustomerID field. You create a foreign key association from the Orders ECT to the Customers ECT, as shown in the following diagram. Note that the field types must be the same on both sides of the association.

Foreign key association between external content types

This association allows you to create rich user interfaces using built-in BCS components. For example, you can create a profile page for a customer that automatically shows all of their orders in a Business Data Related List. The association would also allow you to use external item picker controls to select a customer when you create an order.

Note: Associations in a BDC model do not enforce referential integrity. For example, the cascade delete and restrict delete functionality found in regular SharePoint list relationships does not apply to BDC associations. However, the underlying database or service may enforce referential integrity.

Just like the stereotyped operations that define an ECT, associations are created as methods in the BDC model. Each association includes an input parameter, which must be an identifier on the destination ECT, and a set of return parameters. In the previous example, CustomerID is the input parameter, and OrderID and Amount are likely choices for return parameters. The association method could be expressed as "Get me all the orders where the CustomerID is equal to the specified value."

SharePoint Designer enables you to interactively create the following types of associations:
- One-to-one foreign key associations. One item in the destination table relates to one item in the source table. For example, you could use a one-to-one association to model the relationship between an employee and a department.
- One-to-many foreign key associations. One item in the destination table relates to many items in the source table. For example, you could use a one-to-many association to model the relationship between a customer and their orders.
- Self-referential foreign key associations. One item in the source table relates to other items in the source table. For example, you could use a self-referential association to model relationships between people.

All of these associations are expressed declaratively as non-static methods in the BDC model. However, in the following cases, you will need to edit the BDC model XML directly:
- One-to-one, one-to-many, or many-to-many foreign keyless associations. This refers to any association that cannot be modeled directly by a foreign key relationship.
- Multiple ECT associations. This refers to any association that returns fields from more than one ECT.
- Non-integer primary keys.
This refers to any entity that has a primary key that is not an integer.

In each of these cases, it's worth using SharePoint Designer to take the model as far as you can before you export the .bdcm file and manually edit the XML. Foreign keyless associations always require custom logic to define the relationship between the ECTs. This custom logic could be a stored procedure in a database, a Web service, or a method in a .NET connectivity assembly. Typical scenarios in which you might require a foreign
keyless association include when you need to navigate between entities that are related by an intermediate table. The following are some examples:
- You use an intermediate table to model a many-to-many relationship. For example, you might have a many-to-many relationship between parts and machines—a machine contains many parts, and a part is found in many machines.
- A customer is related to an order, and an order is related to a line item. You want to show the customer in a line item view.

For example, the following diagram illustrates a foreign keyless association between OrderLines and Customers. The foreign keyless association would use a stored procedure to navigate back to Customers through the Orders table.

Foreign keyless association between external content types

You can only create associations between entities in the same BDC model. For more information about creating associations in Visual Studio 2010, see Creating an Association Between Entities, Authoring BDC Models, and Creating External Content Types and Associations on MSDN. For more general information about creating associations, see Association Element in MethodInstances, The Notion of Associations and the External Item Picker, and Tooling Associations in SharePoint Designer 2010.

How Are Associations Defined in the BDC Model?

It can be instructive to look at how associations are represented in the underlying BDC model. For example, SharePoint Designer created the following association method, which represents a foreign key association between an InventoryLocations entity and a Parts entity. Notice the automatically generated Transact-SQL in the RdbCommandText property. Defined on the InventoryLocations entity, the association method allows you to navigate from a specific Parts instance to the related InventoryLocations instance by providing a PartSKU parameter.

XML
<Method IsStatic="false" Name="InventoryLocationsNavigate Association">
  <Properties>
    <Property Name="BackEndObject" Type="System.String">InventoryLocations</Property>
    <Property Name="BackEndObjectType" Type="System.String">SqlServerTable</Property>
    <Property Name="RdbCommandText" Type="System.String">
      SELECT [ID], [PartSKU], [BinNumber], [Quantity]
      FROM [dbo].[InventoryLocations]
      WHERE [PartSKU] = @PartSKU
    </Property>
    <Property Name="RdbCommandType" Type="System.Data.CommandType, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089">Text</Property>
    ...

The method definition also defines every parameter, but these have been omitted for brevity. Within the method definition, the method instance defines the specific logic that is actually invoked to traverse the association. Note that the method instance has a Type attribute value of AssociationNavigator, which is common to all methods
  • 222. that navigate an association. Also note the SourceEntity and DestinationEntity elements that actually definethe association. A single method definition could contain multiple method instances, each defining an associationbetween InventoryLocations and another entity (providing that all the associations are based on the PartSKUfield). Note:The Association class derives from the MethodInstance class. In terms of the object model, anAssociation is one specific type of MethodInstance.XML ... <MethodInstances> <Association Name="InventoryLocationsNavigate Association" Type="AssociationNavigator" ReturnParameterName="InventoryLocationsNavigate Association" DefaultDisplayName="InventoryLocations Navigate Association"> <Properties> <Property Name="ForeignFieldMappings" Type="System.String">&lt;?xml version="1.0" encoding="utf-16"?&gt;&lt;ForeignFieldMappings xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"xmlns:xsd="http://www.w3.org/2001/XMLSchema"&gt;&lt;ForeignFieldMappingsList&gt;&lt;ForeignFieldMapping ForeignIdentifierName="SKU" ForeignIdentifierEntityName="Parts"ForeignIdentifierEntityNamespace="DataModels.ExternalData.PartsManagement"FieldName="PartSKU" /&gt;&lt;/ForeignFieldMappingsList&gt;&lt;/ForeignFieldMappings&gt; </Property> </Properties> <SourceEntity Namespace="DataModels.ExternalData.PartsManagement" Name="Parts" /> <DestinationEntity Namespace="DataModels.ExternalData.PartsManagement" Name="InventoryLocations" /> </Association> </MethodInstances></Method>The encoded XML within the F oreignFieldMappings element can be hard to read. Essentially, this identifies theforeign key—in other words, the field in the Parts entity that provides the PartSKU parameter. Decoded andsimplified, the field value resembles the following.XML<ForeignFieldMappings> <ForeignFieldMappingsList> <ForeignFieldMapping ForeignIdentifierName="SKU" ForeignIdentifierEntityName="Parts" ForeignIdentifierEntityNamespace= "DataModels.ExternalData.PartsManagement" FieldName="PartSKU" /> </ForeignFieldMappingsList></ForeignFieldMappings>C ompare this association to a manually-defined foreign keyless association. The following association methoddefines an association between a Machines entity and a Parts entity. Because this is a many-to-manyrelationship, you need to specify a stored procedure to navigate the association. In this case, theRdbCommandText property identifies the stored procedure to call.XML<Method IsStatic="false" Name="GetPartsByMachineID"> <Properties><Property Name="BackEndObject" Type="System.String">GetPartsByMachineID </Property><Property Name="BackEndObjectType" Type="System.String">SqlServerRoutine </Property>Generated from CHM, not final book. Will be superseded in the future. Page 222
  • 223. <Property Name="RdbCommandText" Type="System.String"> [dbo].[GetPartsByMachineID] </Property><Property Name="RdbCommandType" Type="System.Data.CommandType, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"> StoredProcedure </Property> ...Again, the method definition also defines each parameter, but these have been omitted for brevity. Because therelationship is not based on a foreign key, the method instance definition does not need to identify foreign keymappings. Instead, it simply specifies the SourceEntity and the DestinationEntity.XML ... <MethodInstances> <Association Name="GetPartsByMachineID" Type="AssociationNavigator" ReturnParameterName="GetPartsByMachineID" DefaultDisplayName="Parts Read With Sproc"> <SourceEntity Namespace="DataModels.ExternalData.PartsManagement" Name="Machines" /> <DestinationEntity Namespace="DataModels.ExternalData.PartsManagement" Name="Parts" /> </Association> </MethodInstances></Method>To explore these entities and associations in more detail, see the External Data Models referenceimplementation.Generated from CHM, not final book. Will be superseded in the future. Page 223
Filters and Throttling in the BDC

The Business Data Connectivity (BDC) runtime includes configurable filtering and throttling functionality. Administrators can use this functionality to control both the load that BDC operations put on SharePoint and the load that the operations put on the external systems that you connect to. Filters constrain the result set by restricting the scope of the query, such as by including wildcard queries or paging sets. Throttling prevents BDC operations from using excessive system resources by terminating requests that exceed the resource thresholds set by the administrator. The following diagram illustrates the filtering and throttling process.

Filtering and throttling in the BDC runtime

Filters encompass a wide range of information that the BDC can pass to the external system in order to constrain the result set. The types of filter you can use depend on the types of filter that are supported by the external system. The BDC allows you to use two key types of filters: system filters and user filters. System filters provide context information to the external system. For example, system filters can include a UserContext value that securely provides the identity of the caller to the external system and an ActivityId value that represents the current operation context for diagnostic purposes. Conversely, user filters enable end users or application logic to refine a query. For example, user filters can include Wildcard and Comparison filters for pattern matching and a PageNumber value to support data retrieval on a page-by-page basis. For more information about filters, see Types of Filters Supported by the BDC and How to: Add Filter Parameters to Operations to Limit Instances from the External System on MSDN.

Conceptually, throttling in the BDC is similar to the query throttling for regular lists found in SharePoint 2010. The BDC runtime throttles external data requests based on various criteria, including maximum acceptable response times, maximum permissible response sizes in data rows or bytes, and the total number of concurrent connections to databases or services. Each throttling rule can be associated with one of four scopes: Web services, WCF services, databases, or global. The default threshold values are as follows:
- 2,000 rows for a database query
- 3,000,000 bytes for a Windows Communication Foundation (WCF) service or Web service response
- 180 seconds response time for a WCF service or a database request
- 200 total connections across databases, Web services, and WCF services

The throttling criteria that are available to the administrator depend on the scope of the rule. As the default values suggest, the criteria map to scopes as follows:
- Items: database queries
- Size: Web service and WCF service responses
- Connections: global (across databases, Web services, and WCF services)
- Timeout: database and WCF service requests

For more information about BDC throttling criteria and about how to change the threshold values, see the blog post BCS PowerShell: Introduction and Throttle Management.
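To give a sense of how user filters surface in code, the following hedged sketch applies a wildcard filter before executing a Finder operation through the BDC object model. It assumes that an IEntity and an ILobSystemInstance have already been retrieved from the metadata catalog (see Using the BDC Object Model later in this section), that the Finder defines a wildcard filter as its first filter, and that the entity exposes a Name field; none of these names come from the reference implementations.

C#
using System;
using Microsoft.BusinessData.MetadataModel;
using Microsoft.BusinessData.Runtime;

public static class FilteredFinderSample
{
    // 'entity' and 'lobSystemInstance' are assumed to have been obtained
    // from the BDC metadata catalog elsewhere.
    public static void FindByWildcard(IEntity entity, ILobSystemInstance lobSystemInstance)
    {
        // Retrieve the filters defined on the default Finder method instance.
        IFilterCollection filters = entity.GetDefaultFinderFilters();

        // Assumes the first filter is a wildcard filter over the Name field.
        WildcardFilter nameFilter = (WildcardFilter)filters[0];
        nameFilter.Value = "Contoso*";

        // Execute the Finder; the request remains subject to the throttling
        // rules described above.
        IEntityInstanceEnumerator matches = entity.FindFiltered(filters, lobSystemInstance);

        int matchCount = 0;
        while (matches.MoveNext())
        {
            matchCount++;
        }
        Console.WriteLine("Matching items: {0}", matchCount);
    }
}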
  • 226. BDC Models and Common ScenariosDepending on your requirements, there are three basic approaches you can take to creating a Business DataC onnectivity (BDC ) model for an external system:  You can create a purely declarative BDC model.  You can create a declarative BDC model and create additional database stored procedures or additional Web service logic to support foreign keyless associations or type flattening.  You can create a .NET connectivity assembly to connect to the external system and perform data operations, and then build your BDC model around the .NET connectivity assembly.The following table lists which of these approaches is typically used in various common scenarios. Note:The table does not prescribe specific approaches to specific scenarios. Your own application scenarios maymerit a different approach.Scenario Declarative Declarative BDC model with .NET connectivity BDC model additional stored procedures or assembly serv ice logicSingle entityOne-to-one relationshipsOne-to-many relationshipsMany-to-many relationshipsNon-integer–based foreign keyrelationshipsC omplex data types*Blob storageUnsupported data typesProprietary protocol foraccessing external systemAggregating data into a singleentity from multiple datasources*C omplex data types are limited to read operations in Business Data Web Parts.There are also various scenarios in which declarative BDC models created in SharePoint Designer require manualedits to the model XML. After you make these edits, you can no longer use SharePoint Designer to work on theBDC model. These scenarios include the following:  When the user can modify the identifier field. In this case, you must add a PreUpdaterField="true" attribute value to the type descriptor for the relevant parameter in the Update method. For example, if you were using SKU as your identifier field, and you allowed the user to change the SKU when updating a part, you must set this field to true. This capability is not supported by the BDC client runtime for offline data updates.  When you create a method of type AssociationNav igator. This type of stereotyped operation represents a foreign keyless association and uses a stored procedure or a Web service method to navigate between entities. Note:The PreUpdaterField is discussed in more detail in the External Data Models reference implementation.Generated from CHM, not final book. Will be superseded in the future. Page 226
Security and Identity

Business Connectivity Services (BCS) in SharePoint 2010 supports two different security modes for accessing services and databases—PassThrough authentication and RevertToSelf authentication:
- PassThrough authentication. This uses the credentials of the logged-on user to authenticate to the Web service or the database.
- RevertToSelf authentication. This uses the credentials of the process running the application to authenticate to the Web service or the database. This is known as a trusted subsystem model.

By default, RevertToSelf authentication is disabled because it poses a potential security risk and is not permitted in hosting environments. If you enable RevertToSelf authentication in the BCS, a malicious developer or designer could point the service endpoint back to SharePoint and use the elevated privileges of the application pool account to circumvent security restrictions.

The Secure Store Service (SSS) is a BCS component that is licensed with SharePoint Server 2010. The SSS provides additional options for authenticating to external services and databases. The SSS maps SharePoint user identities to other external credentials and supports claims-based approaches to authentication. The SSS provides an impersonation model that offers a trusted subsystem approach in a safer way than RevertToSelf authentication, because the SSS does not use the application pool account to access the external service. For more information, see Business Connectivity Services security overview on TechNet. For more information about the security risk of RevertToSelf authentication, see the blog post Authenticating to Your External System.

Meeting security requirements can become more complex when you use the BDC through the runtime APIs or through external lists. The user security token is not available in every context. Without the security token, PassThrough security will not work. Additionally, you will be unable to use the SSS to map the identity of the user to external credentials. The following are two common scenarios in which the user security token is unavailable:
- Sandboxed solutions. In the sandbox environment, the SPUser object is available, but the security tokens are stripped from the user context.
- Workflows. Because a workflow runs asynchronously, the security tokens for the user will not be available. You can create an SPUser context by using impersonation in a workflow, but this will not create an identity token for the impersonated account.

In these situations, you need to work around the lack of an identity token and use a trusted subsystem model where all users use the same account to access the services or database. The External Lists reference implementation demonstrates how to use the SSS with impersonation to access an external list in the sandbox. In this case, the managed account that runs the user code proxy service (SPUCWorkerProcessProxy.exe) is mapped to the external credentials. A similar approach can be used for workflow, although the workflow scenario is a little more complicated. Workflow can run in the context of several processes, including the application pool identity of the Web application hosting the application (W3wp.exe), the timer job identity (Owstimer.exe), or the user code proxy service (SPUCWorkerProcessProxy.exe).
Typically, a workflow runs in the context of the content Web application when it is initiated (although under heavy load it may instead be started by the timer service), and in the OWSTimer process if it is triggered by a timed event, such as a timeout or a timer job. It can also run in the sandbox proxy process if the workflow is triggered by an action that is initiated by sandboxed code, such as programmatically adding an item to a list. In order to use the BCS in workflow, you will need to map each of the managed accounts for these three processes to the single account used to impersonate to the service or database. You can also use RevertToSelf under these circumstances, as long as you are aware of and accept the risks described previously. If you use RevertToSelf, the service or database must accept the credentials of the three managed accounts.
  • 228. SharePoint Lists vs. External DatabasesWhen you design data-driven applications for the SharePoint platform, you will often need to decide where tostore your data. At a high level, you have two choices: you can store your data in the SharePoint contentdatabase or you can use an external data source and connect to it through Business C onnectivity Services (BC S).In some cases, the decision is easy. For example, SharePoint has an extensive functionality set devoted todocument management, so it makes sense to store documents in the SharePoint content database. Likewise, ifyour application consumes data from line-of-business (LOB) applications or other existing repositories of anykind, there are very few circumstances in which there will be benefit in replicating large amounts of data to theSharePoint content database. In most cases, you will want to access existing data where it resides, through theBC S. Note:There are some circumstances in which you might want to replicate data from an external data source—typically when you want to cache data for performance reasons. Although you could use the SharePointcontent database to cache external data, you should also consider building an intermediate service façade withresponsibility for caching. This may be more efficient than adding the caching logic to a SharePoint application.There are two high level scenarios in which the choice may be less clear.  You are building a new data-driven application with no legacy code or pre-existing data (a green field development scenario). In this case, you could either use list-based SharePoint data structures as a data store or you could develop an external data source such as a relational database and connect to it through the BC S. The factors that drive your decision are likely to include the ease of implementing the data model, the performance of the application, and the ease of programming against the data model.  You are porting an existing data-driven application to SharePoint (a quasi-brown field development scenario). In this case, you can continue to use the existing data store and connect through the BC S or you can implement a SharePoint list–based data model and migrate your data to the SharePoint content database. Your decision may be influenced by any design constraints in the existing data model. Note that this option only applies to complete migration scenarios. In other words, if you move the data to SharePoint, you can destroy the existing data store and there are no ongoing synchronization costs.This topic focuses on a comparison between implementing a data model in a SharePoint list and using an externaldatabase with the BC S. Of course, there are other data modeling options. You could implement a model that usesa combination of external database tables and SharePoint lists—for example, if you want to extend the datamodel provided by an existing database, you could continue to use the external database tables and developSharePoint lists to extend the model. In support of this scenario, SharePoint 2010 allows you to look up data in anexternal list from a SharePoint list. However, you cannot look up data in a SharePoint list from an external list,because the external list represents external data that is unaware of the SharePoint implementation. For morecomplex business logic requirements, you should also consider building a service tier between an externaldatabase and your SharePoint application. 
In this case, you would build your Business Data C onnectivity (BDC )model entities against service methods instead of against database tables. However, these scenarios are beyondthe scope of this topic.In most cases, experience shows that implementing a data model using SharePoint lists can reduce developmenttime, providing that the capabilities of SharePoint lists meet the needs of your data model and fulfill anynon-functional requirements. The following table describes the development complexity and other issues forvarious data modeling scenarios when you use SharePoint lists or an external database with the BC S. Note: In the following table, an entity can be considered to represent a SharePoint list or a table in a relational database.Modeling scenario SharePoint lists External database with BCSOne-to-one relationships Straightforward StraightforwardOne-to-many relationships Straightforward StraightforwardMany-to-many relationships Straightforward C omplex Requires manual customization of the BDC model Default external list forms doGenerated from CHM, not final book. Will be superseded in the future. Page 228
  • 229. not support all create, update, and delete operations Picker controls do not work for all create, update, and delete operationsRelationships with Not applicable C omplexnon-integer-based primarykeys You cannot specify the field on which to build Requires manual a list relationship. In a SharePoint list, the customization of the BDC built-in ID field is effectively always the model primary keyEvent handling on item added, Straightforward Limitedupdated, or deleted You can use stored procedure triggers for events within the database, but any SharePoint logic will remain unaware of the eventAlerts Straightforward Not supportedRSS feeds Straightforward Not supportedWorkflow for item added, Straightforward Not supportedupdated, or deletedTransactions Not supported Moderate to complex Requires additional entity classes to model the aggregated entities for update Requires stored procedures to manage the transaction A Web service layer or a .NET connectivity assembly could be used instead of stored proceduresAggregate calculations Straightforward but inefficient Moderate to complex You can aggregate data using LINQ to Requires additional entity SharePoint, but the resulting query will classes to represent retrieve all items and the aggregation will be aggregated data performed as a post-query operation on the clientRight outer joins and cross Not supported Moderatejoins Requires additional entity classes to model composite entitiesDistinct queries Not supported StraightforwardItem-level security Moderate Not supportedField-level security Not supported Not supportedStoring binary large object Straightforward Straightforward(BLOB) data Use the Business Data Item Web Part with StreamAccessorsNested queries Not supported Straightforward to moderate,Generated from CHM, not final book. Will be superseded in the future. Page 229
  • 230. depending on whether you need to create additional entity classesC omplex business intelligence Limited support using the C hart Web Part, Full database capabilitiesrequirements Excel Services, and Performance PointSimple user input validation Straightforward Not supported Requires custom user interface development or a .NET connectivity assemblyC omplex user input validation Moderate Not supported Requires custom user interface development or a .NET connectivity assemblyC ompatibility with sandbox Straightforward Depends on security requirements No security: Straightforward Trusted subsystem model: Moderate (requires access to the Secure Store Service, which is not available in SharePoint Foundation 2010) Other security models: C omplex (requires ability to install full-trust proxies) Requires access to C entral Administration or Tenant Administration Web sites to install external content typesFor more information about approaches to data access for SharePoint lists and for external data, see Data Accessin SharePoint 2010.Generated from CHM, not final book. Will be superseded in the future. Page 230
  • 231. Data Access in SharePoint 2010SharePoint 2010 introduces several new ways in which you can programmatically interact with your data. Mostnotably, the introduction of LINQ to SharePoint allows you to build complex list queries with the user-friendlylanguage integrated query (LINQ) syntax instead of constructing queries with the somewhat cumbersomeC ollaborative Application Markup Language (C AML). Both LINQ to SharePoint and C AML now support joinpredicates in queries, which moves the SharePoint list-based data model a step closer to the power and flexibilityof a relational database. One of the major evolutions in the latest version of SharePoint is the blurring of thedistinction, from the users perspective, between internal and external data. The introduction of external lists inSharePoint2010 means that you can use many of the same techniques to query data, regardless of whether thatdata resides in a SharePoint content database or an external system.The introduction of new options for data access brings new challenges and best practices for developers. Thischapter provides an overview of each approach to server-side data access and provides guidance on the benefitsand potential pitfalls of each approach. Note:C lient data access techniques, such as the new client object model and REST services, are not covered in thischapter. These techniques cover a large area of functionality and merit a chapter of their own. For moreinformation, see Data Access for C lient Applications.SharePoint 2010 offers three key approaches that you can use to query data:  C AML queries (SPQuery and SPSiteDataQuery). The SPQuery and SPSiteDataQuery classes allow you to construct and submit C AML queries to perform data operations. C AML suffers from a number of shortcomings, including quirky syntax, lack of tooling support, and difficult debugging. However, C AML remains the core engine for data operations and is still the most appropriate choice in some scenarios. The C AML schema has been expanded in SharePoint 2010 to include support for join predicates. You can also use the SPQuery class to query external lists. Note that the results returned by C AML queries are non-typed items.  LINQ to SharePoint. SharePoint 2010 allows you to use LINQ queries to perform data operations on SharePoint lists, including support for join predicates. LINQ to SharePoint operates against strongly-typed entities that represent the items in your lists, and SharePoint 2010 includes a command-line tool named SPMetal that you can use to generate these entities. Internally, the LINQ to SharePoint provider generates the underlying C AML queries that perform the data operations.  The Business C onnectivity Services (BC S) object model. SharePoint 2010 includes new BC S APIs that enable you to interact with external data. 
The BDC Object Model allows you to programmatically navigate the data entities and associations in your Business Data C onnectivity (BDC ) models and to invoke the stereotyped operations defined on these entities.The following table provides a summary of the different scenarios in which you can use each of these approachesto data access.Scenario LINQ to SPQuery SPSiteDataQuery BDC object SharePoint modelQuerying SharePoint list data within a siteUsing joins on SharePoint lists within a siteUsing joins on SharePoint list data acrosssites within a site collectionAggregating list data across multiple siteswithin a site collectionQuerying external list data within a siteNavigating associations between entities in aBDC model (external data)Accessing binary streams from external dataAccessing external data from a sandboxedapplication (requires the use of external lists)Querying external data that returns complextypesGenerated from CHM, not final book. Will be superseded in the future. Page 231
  • 232. Querying external data that uses anon-integer, or 64-bit integer, ID fieldNavigating bidirectional associations betweenentities in a BDC model (external data)Locating an entity by specifying a field otherthan the ID field (external data)Querying entities that include fields that donot map to an SPF ieldType (external data)Perform bulk operations on external data Note:For information about how to use LINQ to SharePoint and C AML queries, see Querying from Server-side C odeon MSDN.This section includes the following topics that will help you to understand the key details, performance issues, andbest practices behind the different approaches to data access in SharePoint 2010:  Using Query C lasses . This topic identifies scenarios in which you should consider using the C AML-based query classes—SPQuery and SPSiteDataQuery—for data access.  Using LINQ to SharePoint . This topic examines the use of the new LINQ to SharePoint provider, and identifies key efficiency issues and potential stumbling blocks.  Using the BDC Object Model . This topic identifies scenarios in which you must use the BDC object model for data access and provides insight about how you can use the BDC object model to accomplish various common tasks.Out of Scope TopicsThere are a number of additional approaches to data access that are beyond the scope of this documentation.You may want to consider the following components for more specialized scenarios:  The C ontentIterator class. This class is new in SharePoint Server 2010. It enables you to iterate sites, lists, and list items in chunks, in order to avoid violating query throttling thresholds. This class is useful in circumstances where you cannot avoid iterating large lists. If you think a list is likely to grow large, consider using the ContentIterator class to access the list from the start. For more information, see Large Lists.  The C ontentByQueryWebPart class. This class is part of the publishing infrastructure in SharePoint Server 2010. It allows you to submit a C AML query across sites and lists within a site collection and then format the HTML output by specifying an XSL transform. The ContentByQueryWebPart class makes extensive use of caching to provide an efficient means of data retrieval.  The PortalSiteMapProvider class. This class is also part of the publishing infrastructure in SharePoint 2010. It provides an efficient mechanism that you can use to query and access cached objects within a site collection. The PortalSiteMapProvider class offers a useful alternative to the ContentByQueryWebPart class when you require programmatic access to the objects you are retrieving and is capable of caching query results from SPQuery or SPSiteDataQuery. Like the ContentByQueryWebPart class, the PortalSiteMapProvider makes extensive use of caching for reasons of performance and efficiency.Generated from CHM, not final book. Will be superseded in the future. Page 232
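Before the detailed topics, a minimal sketch may help to show the general shape of BDC object model code: obtain the metadata catalog from the BDC service application, retrieve the entity, and execute a stereotyped operation against a LOB system instance. The entity namespace and name below are borrowed from the association examples earlier in this guidance, but the LOB system instance name and the surrounding method are illustrative assumptions, and the code must run where SPServiceContext.Current is available (for example, within a Web request).

C#
using System;
using Microsoft.BusinessData.MetadataModel;
using Microsoft.BusinessData.Runtime;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;
using Microsoft.SharePoint.BusinessData.SharedService;

public static class BdcObjectModelSample
{
    public static void CountParts()
    {
        // Locate the BDC service and its database-backed metadata catalog.
        BdcService service = SPFarm.Local.Services.GetValue<BdcService>(string.Empty);
        IMetadataCatalog catalog =
            service.GetDatabaseBackedMetadataCatalog(SPServiceContext.Current);

        // Retrieve the external content type (entity) by namespace and name.
        IEntity entity = catalog.GetEntity("DataModels.ExternalData.PartsManagement", "Parts");

        // The LOB system instance name is an assumption for this sketch.
        ILobSystemInstance lobSystemInstance =
            entity.GetLobSystem().GetLobSystemInstances()["PartsManagement"];

        // Execute the default Finder and enumerate the results.
        IFilterCollection filters = entity.GetDefaultFinderFilters();
        IEntityInstanceEnumerator parts = entity.FindFiltered(filters, lobSystemInstance);

        int count = 0;
        while (parts.MoveNext())
        {
            count++;
        }
        Console.WriteLine("Parts returned: {0}", count);
    }
}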
  • 233. Using Query ClassesBefore the advent of LINQ to SharePoint, the SPQuery and SPSiteDataQuery classes were the predominantapproaches to performing data operations on SharePoint lists. There are still many scenarios in which theseclasses provide the most effective approach—and, in some cases, the only viable approach—to data access.Both the SPQuery class and the SPSiteDataQuery class allow you to construct a query using collaborativeapplication markup language (C AML). You can then use the SPList object model to execute your query againstone or more SharePoint lists. This topic explains when you should use each class and describes the considerationsthat should guide your implementations.Using SPQueryThe SPQuery class is used to retrieve data from a specific list. In most cases, you should use LINQ to SharePoint,instead of the SPQuery class, to perform data operations on lists. However, there are still some circumstances inwhich SPQuery is the most appropriate option—or the only option—for data access. Most notably, using theSPQuery class is the only supported server object model approach for programmatically working with data inexternal lists.In SharePoint 2010, the SPQuery class has been extended to allow you to specify joins and projected fields. Thehigh-level process for using the SPQuery class is as follows:  C reate an SPQuery instance.  Set properties on the SPQuery instance to specify the C AML query, along with various additional query parameters as required.  C all the GetItems method on an SPList instance, passing in the SPQuery instance as a parameter.The following code example shows this.C#SPListItemCollection results;var query = new SPQuery{ Query = "[Your CAML query statement]", ViewFields = "[Your CAML FieldRef elements]", Joins = "[Your CAML Joins element]", ProjectedFields = "[Your CAML ProjectsFields element]"};results = SPContext.Current.Web.Lists["ListInstance"].GetItems(query);This chapter does not provide instructions on how to configure SPQuery instances, because this is well coveredby the product documentation. However, the following are brief summaries of the key properties of interest:  The Query property specifies the C AML query that you want to execute against the list instance.  The ViewFields property specifies the columns that you want your queries to return as C AML FieldRef elements.  The Joins property specifies the join predicates for your query as a C AML Joins element.  The ProjectedFields property defines fields from foreign joined lists as a C AML ProjectedF ields element. This allows you to reference these fields in your ViewF ields property and in your query statement.Using SPQuery with Regular SharePoint ListsYou should consider using the SPQuery class, instead of LINQ to SharePoint, in the following scenarios:  When you hav e anonymous users on your site. LINQ to SharePoint does not support anonymous user access. Note:This limitation exists at the time of publication. However, it may be resolved in future service packs orcumulative updates.  When a lookup column in a list refers to a list in another site within the site collection. In this situation, SPQuery allows you to use a join predicate that spans both sites. Although you can use LINQ to SharePoint to query across sites with some additional configuration, the process required to generate entity classes is more complex. By default, LINQ to SharePoint returns only the ID field from the target list, inGenerated from CHM, not final book. Will be superseded in the future. Page 233
  • 234. which case, you would need to run additional queries to retrieve relevant field values from the target list. For more information about generating LINQ classes that work across site boundaries, see the section, "Using List Joins across Sites," in Using LINQ to SharePoint.  When performance is paramount. Using LINQ to SharePoint incurs some additional overhead, because the LINQ query must be dynamically converted to C AML at run time. If you are running a time-sensitive operation and performance is critical, you may want to consider creating the C AML yourself and using SPQuery to execute the query directly. Generally speaking, this approach is only required in extreme cases.Using SPQuery with External ListsUsing the SPQuery class is the only supported way to query external lists. Using this approach, you can query anexternal list in exactly the same way that you would query a regular SharePoint list. However, there are someadditional considerations when you access data from an external list:  You cannot join across external lists, even if you have defined entity associations in the BDC model.  You can specify authorization rules by assigning permissions to the external content type. Most Web services and databases will also implement authentication and authorization. You will need to implement a security scheme either by using the Secure Store Service or by configuring your own security mechanisms.  Throttling mechanisms and limits differ from those that apply to regular SharePoint lists. When you query an external list, the throttling settings for the BDC runtime apply.If you want to access external data from a sandboxed application, without using a full-trust proxy, you must usean external list. Using the BDC Object Model or directly accessing external systems is prohibited in the sandboxenvironment. As a result, using the SPQuery class and the SPList object model with external lists is the onlyoption for external data access if you want your solution to run in the sandbox. Note:For security reasons, the identity token for the current user is removed from the sandbox worker process. Ifyou need to access external lists from within the sandbox environment, you must use the Secure Store Serviceto map the managed account that runs the User C ode Proxy Service to the credentials required by the externalsystem. For more information, see Hybrid Approaches in the Execution Models section of this documentation.Using SPSiteDataQueryThe SPSiteDataQuery class is used to query data from multiple lists across different sites in a site collection.SPSiteDataQuery is commonly used in list aggregation scenarios, where list data from team sites or othersubsites is collated and presented in a single interface. Unlike the SPQuery class, you cannot use join predicatesor projected fields with the SPSiteDataQuery class. The SPSiteDataQuery will only aggregate data fromSharePoint lists and will ignore data from external lists. Note:Because of a bug in SharePoint 2010, an SPException (hr=0x80004005) is thrown if you execute anSPSiteDataQuery on a site that contains an external list with a column named Id. This may be fixed in afuture service pack or cumulative update.The high-level process for using the SPSiteDataQuery class is as follows:  C reate an SPSiteDataQuery instance.  Set properties on the SPSiteDataQuery instance to specify the lists or list types to include in the query, the individual sites to include in the query, and the C AML query itself. 
Call the GetSiteData method on an SPWeb instance, passing in the SPSiteDataQuery instance as a parameter. The GetSiteData method returns a DataTable.

The following code example, which was adapted from the sandbox reference implementation, shows this.

C#
SPSiteDataQuery query = new SPSiteDataQuery();
query.Lists = "<Lists BaseType='1' />";
query.ViewFields = "<FieldRef Name='SOWStatus' />" +
                   "<FieldRef Name='EstimateValue' />";
query.Query = "<OrderBy><FieldRef Name='EstimateValue' /></OrderBy>";
query.Webs = "<Webs Scope='SiteCollection' />";
SPWeb web = SPContext.Current.Web;
DataTable results = web.GetSiteData(query);
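Because GetSiteData returns an untyped DataTable in which each value comes back as a string, consuming code typically parses and aggregates the rows itself. The following short follow-on sketch (which assumes using directives for System, System.Data, and System.Collections.Generic) totals the estimate values by status; the column names are simply the FieldRef names requested in ViewFields above.

C#
// Continues from the previous example: 'results' is the DataTable
// returned by web.GetSiteData(query).
var totalsByStatus = new Dictionary<string, decimal>();
foreach (DataRow row in results.Rows)
{
    string status = Convert.ToString(row["SOWStatus"]);

    decimal estimate;
    decimal.TryParse(Convert.ToString(row["EstimateValue"]), out estimate);

    decimal runningTotal;
    totalsByStatus.TryGetValue(status, out runningTotal);
    totalsByStatus[status] = runningTotal + estimate;
}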
  • 235. In terms of efficiency, the SPSiteDataQuery class provides an optimal approach to data access in the followingscenarios:  When you need to query multiple lists within the same site collection for the same content  When you need to query across two or more lists that are not related by lookup columnsYou should avoid using LINQ to SharePoint to aggregate list data across sites. LINQ to SharePoint is designed toaggregate data across list relationships defined by lookup columns. Attempting cross-site operations in LINQ toSharePoint typically requires a post-query join operation in memory, which is a resource intensive process. Incontrast, the SPSiteDataQuery class is optimized for querying list data across multiple sites in a site collectionand across multiple lists within a single site. Note:The SPSiteDataQuery class is available in SharePoint Foundation 2010. SharePoint Server 2010 includesadditional built-in components that are appropriate for certain list aggregation scenarios. These componentsinclude the C ontent Query Web Part and the Portal Site Map Navigation Provider.Generated from CHM, not final book. Will be superseded in the future. Page 235
  • 236. Using LINQ to SharePointThe LINQ to SharePoint provider is a new feature in SharePoint 2010 that allows you to use a strongly-typedentity model and the language integrated query (LINQ) query syntax to query list data. Essentially, LINQ toSharePoint hides the complexity of developing C AML queries from developers, which can reduce developmenttime and make code more readable. The LINQ to SharePoint provider converts the LINQ expressions into C AMLqueries at run time.Using LINQ to SharePoint in your own solutions consists of three main steps:  Generate the entity classes. Before you can start writing LINQ queries against your SharePoint lists, you must create or generate the strongly-typed entity classes that represent your list data and lookup column relationships.  Dev elop the solution. After you add the entity classes to your Visual Studio 2010 project, you can write LINQ queries against the strongly-typed entities that represent your data model.  Run the solution. At run time, the LINQ to SharePoint provider dynamically converts your LINQ expressions into C AML queries, executes the C AML, and then maps the returned items to your strongly-typed data entities.The LINQ to SharePoint ProcessAlthough you can manually develop your entity classes, in most cases, you will want to use the SPMetal commandline tool. This is included in SharePoint Foundation 2010 and can be found in the BIN folder in the SharePoint root.The SPMetal tool targets an individual SharePoint site and, by default, generates the following code resources:  A data context class that deriv es from DataContext. This is the top-level entity class. It represents the content of your site and provides methods that allow you to retrieve list entities. The data context class uses the EntityList<TEntity> class to represent the lists in your site, where TEntity is a class that represents a content type.  Classes that represent content types. These are marked with the ContentTypeAttribute. C ontent type classes are generated for implicit content types as well as content types that are explicitly defined on the site. For example, if a user adds a column to an existing list, the user is creating an implicit content type and a representative class will be generated.  Classes and properties that represent relationships between lists. SPMetal can detect relationships based on lookup columns. Within the entity class that represents a content type, SPMetal uses the EntityRef<TEntity> class to represent the singleton side of a one-to-many relationship and the EntitySet<TEntity> class to represent the "many" side of one-to-many or many-to-many relationships (known as a reverse lookup). Properties that are mapped to a field in a related list are decorated with the AssociationAttribute. Note:You can configure the SPMetal tool to generate entity classes for specific lists, instead of for all the content inyour site, by creating a parameters file. For more information, see Overriding SPMetal Defaults with aParameters XML File on MSDN.After the entity classes are generated, you can write LINQ queries against strongly-typed entities instead ofcreating C AML queries. Under the covers, the LINQ to SharePoint provider converts your LINQ queries into C AMLat run time and executes the C AML against your SharePoint lists. 
For more information about the entity classes of the LINQ to SharePoint provider, see Entity Classes on MSDN.
The following code example, adapted from the sandbox reference implementation, illustrates some of the key aspects of using LINQ with entity classes.
C#
using (ManufacturingSiteDataContext context = new ManufacturingSiteDataContext(SPContext.Current.Web.Url))
{
  string sponsor = "David Pelton";
  var results = from projectItem in context.PriorityProjects
                where projectItem.ExecutiveSponsor == sponsor
                select projectItem;

  foreach (var proj in results)
  {
    output.AppendFormat("Title: {0} Sponsor: {1} Leader: {2} \n",
      proj.Title, proj.ExecutiveSponsor, proj.Project.Leader);
  }
}
All the entity classes in this example were generated by the SPMetal tool. The example illustrates the following key points:
 The query uses a data context class. The ManufacturingSiteDataContext class inherits from the DataContext class and includes strongly-typed properties for each list on the manufacturing site, such as the PriorityProjects list.
 The content type class that represents the entities within the list includes strongly-typed properties for each column value, such as Title and ExecutiveSponsor.
 The entity classes understand the relationships defined by lookup columns—the Project.Leader property retrieves a Leader column value from a related Project entity.
 You should always dispose of the data context instance after use. The DataContext base class implements the IDisposable interface, and the using statement ensures that the data context instance is released when execution passes beyond its scope.
For more information about using LINQ to SharePoint, see Managing Data with LINQ to SharePoint on MSDN.
Note:
You can extend the entity classes produced by the SPMetal command-line tool in order to expose additional functionality to the LINQ to SharePoint provider, for example, to handle custom field data types. This guidance does not explore this area. For more information, see Extending the Object-Relational Mapping on MSDN.
Execution of LINQ to SharePoint Queries
LINQ to SharePoint uses deferred loading—commonly known as lazy loading—of result sets to improve query efficiency. If you create a query that returns a collection of entities, the query won't actually execute until you commence an action that uses the result set, such as iterating over the results or converting the result set to an array. In the preceding code example, the LINQ query is only converted to CAML and executed when the foreach statement starts enumerating the result set.
LINQ to SharePoint also uses the deferred loading approach for related entities. Related entities are only loaded when they are actually accessed, in order to reduce unnecessary calls to the content database. In the preceding code example, the Project entity is only loaded when the foreach statement reads the Project.Leader property.
When a query is executed in the context of an HTTP request, LINQ to SharePoint uses the SPContext.Current property to load the data context. This makes the process of loading the data context relatively efficient. However, if you use a LINQ query outside the context of an HTTP request, such as in a command-line application or a PowerShell script, the LINQ to SharePoint provider must construct context objects, such as the SPWeb and the SPSite, in order to build the data context instance. In this case, the process becomes more resource intensive. Any create, update, or delete operations within your LINQ queries are automatically batched by the data context instance and applied when the DataContext.SubmitChanges method is called by your code. For more information, see How to: Write to Content Databases Using LINQ to SharePoint.
Generating Entities for Content Types
The SPMetal command-line tool generates entity classes for the content types defined in a SharePoint site. Content types have various characteristics that can make this process difficult to understand:
 Content types support inheritance.
 Content types can be defined at the site level or at the list level.
When a content type is added to a list, SharePoint creates a local copy of the content type which can be modified.  A list can have multiple content types associated with it.SPMetal uses the following rules when it generates content types:  An entity class is generated for every content type on the site (SPWeb).  If a content type inherits from another content type, the entity class that represents the child content type will inherit from the entity class that represents the parent content type. For example, in the sandbox reference implementation, the SOW content type inherits from the built-in Document content type, which in turn inherits from the built-in Item content type. SPMetal generates entity classes for SOW, Document, and Item, and builds an inheritance relationship between the classes.  If a list content type has been modified from the corresponding site content type, SPMetal will generate aGenerated from CHM, not final book. Will be superseded in the future. Page 237
  • 238. new entity class for the list content type. If the list content type is identical to the corresponding site content type, SPMetal will simply use the entity class for the site content type instead. Entities created from list content types are named by preceding the content type name with the list name. For example, if you add a StartDate column to the SOW content type in the Estimates list, an entity class named EstimatesSOW will be generated to represent the list content type. C onversely, if you have not modified the SOW content type in the Estimates list, an entity class named SOW will be generated to represent the site content type.  If a column is removed from a list content type, the corresponding property is made virtual in the entity class that represents the site content type. The entity class that represents the list content type overrides this method and will throw an Inv alidOperationException if you attempt to access the property. For example, if you remove the VendorID column from the SOW content type in the Estimates list, the VendorID property is made virtual in the SOW entity class, and the EstimatesSOW entity will throw an exception if you attempt to access the property.  If a list contains a single content type, the EntityList<TEntity> class that represents that list in the data context class will use that content type entity as its type parameter. For example, if the Estimates list contained only documents based on the SOW content type, the list would be represented by an EntityList<SOW> instance.  If a list contains more than one content type, the EntityList<TEntity> class that represents that list will use the closest matching base content type as its type parameter. For example, the Estimates list actually contains the SOW content type and the Estimate content type, which both inherit from the built-in Document content type. In this case, the list is represented by an EntityList<Document> instance. Because SOW entities and Estimate entities both inherit from the Document entity, the list can contain entities of both types.Modeling Associations in Entity ClassesWhen you use the SPMetal command-line tool to generate entity classes, it automatically detects relationshipsbetween lists based on lookup columns, and it adds properties to the entity classes to enable you to navigatethese relationships. For example, in the SharePoint List Data Models ReferenceIimplementation, the InventoryLocations list includes a lookup column named Part that retrieves values from the Parts list. In theInv entoryLocation class, this is reflected by the inclusion of a Part property that allows you to navigate to theassociated entity instance in the Parts list.C#private Microsoft.SharePoint.Linq.EntityRef<Part> _part;[Microsoft.SharePoint.Linq.AssociationAttribute(Name="PartLookup", Storage="_part", MultivalueType=Microsoft.SharePoint.Linq.AssociationType.Single, List="Parts")]public Part Part{ get { return this._part.GetEntity(); } set { this._part.SetEntity(value); }} Note:The InventoryLocation class also includes event handlers that ensure the Part reference remains up to dateif the associated entity instance is changed.The SPMetal tool also adds properties to the Parts list that enable you to navigate to the Inventory Locations list.This is known as a reverse lookup association. 
The Parts class includes an InventoryLocation property thatreturns the set of inventory locations that are associated with a specific part—in other words, eachInv entoryLocation instance that links to the specified part through its Part lookup column.C#private Microsoft.SharePoint.Linq.EntitySet<InventoryLocation> _inventoryLocation;[Microsoft.SharePoint.Linq.AssociationAttribute(Name="PartLookup", Storage="_inventoryLocation", ReadOnly=true, MultivalueType=Microsoft.SharePoint.Linq.AssociationType.Backward, List="Inventory Locations")]public Microsoft.SharePoint.Linq.EntitySet<InventoryLocation> InventoryLocation{ get { return this._inventoryLocation; }Generated from CHM, not final book. Will be superseded in the future. Page 238
  • 239. set { this._inventoryLocation.Assign(value); }} Note:The Part class also includes event handlers that ensure the Inv entoryLocation references remain up to dateif the associated entity instance is changed.However, there is a limitation in the way the current version of SPMetal builds reverse lookups:  If a site lookup column is used by one list, SPMetal will generate a reverse lookup association for the relationship.  If a site lookup column is used by more than one list, SPMetal will not generate reverse lookup associations for any of the relationships based on that lookup column.In many scenarios, you will want to use a lookup column in more than one list. For example, in the referenceimplementation, there are three lists that use lookup columns to retrieve values from the Parts list. In somecases, depending on how you intend to query your data, you may not require reverse lookup associations.However, if you do need to traverse the relationship in the reverse direction, your LINQ to SharePoint queries willbe far less efficient if you proceed without a reverse lookup association in place. C onsider the relationshipbetween Parts and Inventory Locations. If you need to find all the inventory locations associated with a specifiedpart, you would need to retrieve every inventory location instance, check the value of the Part lookup column,and build a collection of inventory locations. In this case, the reverse lookup association simplifies the LINQexpressions and reduces the processing overhead.There are various approaches you can use to work around this limitation of SPMetal, each of which hasdrawbacks: 1. C reate a new site column for each list that requires a lookup column for a particular list. This results in multiple site columns that retrieve information from the same list—the columns are duplicates in everything but name. This has several negative consequences:  If a developer uses a site lookup column that is already in use, reverse lookups will not be generated for that column the next time you use SPMetal, and some existing code will break.  Site administrators will need to manage multiple site columns for the same value, which will be confusing. This drawback can be mitigated by hiding the duplicate lookup fields.  The site columns are not really reusable, which is the main purpose of using site columns in the first place. 2. C reate lookup columns at the list level. This eliminates the problems associated with duplicate site columns. This has the following negative consequences:  Your content types will no longer represent your data model, because the lookup columns are now pushed into individual lists. This makes information management more challenging. It also reduces the effectiveness of search and queries that retrieve items from different lists, because the information from the lookup column is not included in the content type. 3. C reate duplicate site columns and use them in content types or list definitions to generate the entity classes with SPMetal, as in option 1. After you generate the entity classes, delete the duplicate site lookup columns and manually edit the entity classes to use a single lookup column. This keeps your data model clean because you do not need to maintain duplicate site columns, and it avoids the problems associated with option 2 because the lookup column is included in the relevant content types. This is the preferred approach in most scenarios. 
However, it has the following negative consequences:  Extra effort is required to create the duplicate site columns, create the content type definitions, remove the duplicate site columns, and edit the entity classes.  Manual editing of the entity classes can be error-prone and difficult to debug. However, the edit should only involve straightforward renaming of properties. 4. Avoid using reverse lookup associations in cases where more than one list or content type uses a particular site lookup column. Although this approach is simple, you will need to use more complex and less efficient LINQ queries if you need to navigate the association in the reverse direction without reverse lookup properties.Query Efficiency with LINQ to SharePointAlthough LINQ to SharePoint makes it quick and easy to query SharePoint lists, you still need to consider whetheryour LINQ expressions will translate into efficient C AML queries. If your LINQ code translates into efficient C AMLqueries, the performance overhead of the LINQ to SharePoint provider can be considered negligible in all but themost extreme cases—in fact, you may actually see better performance with LINQ to SharePoint because it can bedifficult to manually create efficient C AML queries. This section describes how nuances in your LINQ expressionscan have substantial effects on the efficiency of the generated queries.In some cases, LINQ to SharePoint prevents you from executing queries that contain certain inefficiencies. TheGenerated from CHM, not final book. Will be superseded in the future. Page 239
  • 240. LINQ to SharePoint provider is not always able to convert a LINQ expression into a single C AML query—forexample, if you use a join predicate to query across two lists that are not connected by a lookup column, theLINQ to SharePoint provider would actually need to submit two queries in order to return a result set. In caseslike this where LINQ to SharePoint cannot perform an operation using a single C AML query, the runtime will throwa NotSupportedException. In other cases, the LINQ to SharePoint provider cannot translate the entire LINQcode into an efficient C AML query. In these cases the provider will first execute a C AML query to retrieve itemsfrom the list and then perform a LINQ to Objects query on the list item collection results to satisfy the portionsof the LINQ query that could not be translated to C AML. For more information see Unsupported LINQ Queries andTwo-stage Queries.As an example, suppose you want to review orders for every customer. You might use the following LINQexpression.C#dataContext.Customers.Select(c=>c.Orders).ToArray();In this example, the LINQ to SharePoint provider would need to submit an additional query for every customer inorder to retrieve their orders. As a result, the runtime would throw an exception. Similarly, suppose you want toaggregate data from two different lists of customers. You might use the following LINQ expression.C#dataContext.Customers.Union(dataContext.MoreCustomers).ToArray();In this case, the LINQ to SharePoint provider would need to submit two queries—one for each list. Again, theruntime would throw an exception. The remainder of this section describes ways in which you can perform thistype of query and other common operations without compromising on efficiency.Reviewing the CAML OutputIn many cases, it can be useful to review the C AML output that is generated by your LINQ queries. TheDataContext class includes a Log property that exposes a TextWriter object. You can use this property to logthe generated C AML query to a text file or to the user interface. For example, the following code shows how youcan modify the previous example to view the generated C AML query. In this example, the C AML query isappended to the query results in a Literal control named displayArea.C#using (ManufacturingSiteDataContext context = new ManufacturingSiteDataContext(SPContext.Current.Web.Url)){ var sb = new StringBuilder(); var writer = new StringWriter(sb); context.Log = writer; string sponsor = "David Pelton"; var results = from projectItem in context.PriorityProjects where projectItem.ExecutiveSponsor == sponsor select projectItem; foreach (var proj in results) { output.AppendFormat("Title: {0} Sponsor: {1} Leader: {2}", proj.Title, proj.ExecutiveSponsor, proj.ProjectsLookup.Leader); } output.Append("n Query: " + sb.ToString()); displayArea.Mode = LiteralMode.Encode; displayArea.Text = output.ToString();}After you set the Log property to a TextWriter implementation, the DataContext class will write the C AMLquery to the underlying stream or string as the LINQ expression is executed. You can then view the C AML querythat is generated by the LINQ to SharePoint provider.XML<View> <Query>Generated from CHM, not final book. Will be superseded in the future. Page 240
  • 241. <Where> <And> <BeginsWith> <FieldRef Name="ContentTypeId" /> <Value Type="ContentTypeId">0x0100</Value> </BeginsWith> <Eq> <FieldRef Name="Executive_x0020_Sponsor" /> <Value Type="Text">David Pelton</Value> </Eq> </And> </Where> </Query> <ViewFields> <FieldRef Name="Executive_x0020_Sponsor" /> <FieldRef Name="ProjectsLookup" LookupId="TRUE" /> <FieldRef Name="ID" /> <FieldRef Name="owshiddenversion" /> <FieldRef Name="FileDirRef" /> <FieldRef Name="Title" /> </ViewFields> <RowLimit Paged="TRUE">2147483647</RowLimit></View>There are several interesting observations about the automatically generated C AML query:  Notice the BeginsWith element in the Where clause. This stipulates that the content type ID of the items returned must begin with 0x0100. Effectively, this means that the content type of the items returned must be a custom content type that inherits from the built-in Item content type—which is true of the Project content type. The LINQ to SharePoint provider includes this provision in addition to the where clause specified by the LINQ query.  The C AML query returns a view that contains all the fields in the PriorityProjects list, including fields that arent required by the LINQ expression.  The query returns a lookup field for the Projects list, instead of an entity. The LookupId attribute indicates that the referenced item in the Projects list will be retrieved by its internal ID value.During the development process, you should take time to examine the C AML that is generated by your LINQqueries, in order to proactively identify poorly performing queries. This is especially important when you querylists that you expect to be sizeable. For example, you should take care to catch the obviously offending caseswhere the LINQ to SharePoint provider is unable to translate some or all of the query into C AML and must resortto LINQ to Objects.In the preceding example, PriorityProjects is a simple list, and returning the complete set of fields causes littleadverse effect on performance. As the number of items returned increases and the items grow in complexity, theperformance overheads can become more substantial. For more information about how to constrain the viewfields returned by a query, see the section, "Using Anonymous Types."Where Clause EfficiencyWhen you create a LINQ expression, you typically use operators within a where clause to constrain your resultset. However, the LINQ to SharePoint provider is unable to translate every LINQ operator into C AML. Forexample, the Equals operator and the HasValue operator have no C AML equivalent. The LINQ to SharePointprovider will translate as many where clause operators as possible into C AML, and then it will use LINQ toObjects to fulfill the remaining criteria.The following table shows the operators that are supported by the LINQ to SharePoint provider and theirequivalent expressions in C AML.LINQ Operator CAML Translation&& And|| Or== Eq>= GeqGenerated from CHM, not final book. Will be superseded in the future. Page 241
  • 242. > Gt<= Leq< Lt!= Neq== null IsNull!= null IsNotNullString.C ontains C ontainsString.StartsWith BeginsWithYou should avoid using operators that are not listed in this table in your LINQ to SharePoint queries. Usingunsupported operators causes the LINQ to SharePoint provider to return a larger result set and then process theoutstanding where clauses on the client by using LINQ to Objects. This can create substantial performanceoverheads. For example, consider the following LINQ expression. The where clause includes an Equals operatorand a StartsWith operator.C#var results = from projectItem in context.PriorityProjects where projectItem.ExecutiveSponsor.Equals(sponsor) && projectItem.Title.StartsWith("Over") select projectItem;The resulting C AML query includes a Where clause that reflects the StartsWith operator. However, it makes nomention of the unsupported Equals operator.XML<View> <Query> <Where> <And> <BeginsWith> <FieldRef Name="ContentTypeId" /> <Value Type="ContentTypeId">0x0100</Value> </BeginsWith> <BeginsWith> <FieldRef Name="Title" /> <Value Type="Text">Over</Value> </BeginsWith> </And> </Where> </Query> <ViewFields> … </ViewFields> <RowLimit Paged="TRUE">2147483647</RowLimit></View>In this case, the LINQ to SharePoint provider would return a results set that includes project items with a Titlefield that begins with "Over," as defined by the C AML query. It would then use LINQ to Objects on the client toquery the results set for project items with a matching ExecutiveSponsor field, as defined by the unsupportedEquals operator.The following XML shows what it looks like if you rewrite the LINQ expression to use the supported == operatorinstead of the unsupported Equals operator.XMLvar results = from projectItem in context.PriorityProjects where projectItem.ExecutiveSponsor == sponsor && projectItem.Title.StartsWith("Over") select projectItem;Generated from CHM, not final book. Will be superseded in the future. Page 242
  • 243. This time, the resulting C AML query reflects the LINQ expression in its entirety.XML<View> <Query> <Where> <And> <BeginsWith> <FieldRef Name="ContentTypeId" /> <Value Type="ContentTypeId">0x0100</Value> </BeginsWith> <And> <Eq> <FieldRef Name="Executive_x0020_Sponsor" /> <Value Type="Text">David Pelton</Value> </Eq> <BeginsWith> <FieldRef Name="Title" /> <Value Type="Text">Over</Value> </BeginsWith> </And> </And> </Where> </Query> <ViewFields> … </ViewFields>In this case, the LINQ to SharePoint provider returns only relevant results to the client; no post-processing stepsare required.Using View ProjectionsIn many cases, you can substantially improve query efficiency by using view projections. A view projectionqueries a specific set of fields from one or more entities. When you want to retrieve a read-only view of a set ofdata, using a view projection restricts the number of fields returned by the query and ensures that joins areadded to the C AML query instead of performed as a post-processing step. You can create a view projection invarious ways:  You can select a single field, such as projectItem.Title.  You can build an anonymous type by selecting a specific set of fields from one or more entities.  You can instantiate a known type and set the property values in your LINQ expression.View projections are limited to certain field types. Valid field types for projections are Text (single line of textonly), DateTime, Counter (internal Ids), Number, and ContentTypeId. All remaining field types are notsupported; an Inv alidOperationException will be thrown if a column of that field type is used in the projection.For a list of all field types, see SPFieldType.In the following example, the new keyword in the LINQ expression creates an anonymous type that containsfields named Title, Executiv eSponsor, and Leader.C#using (ManufacturingSiteDataContext context = newManufacturingSiteDataContext(SPContext.Current.Web.Url)){ string sponsor = "David Pelton"; var results = from projectItem in context.PriorityProjects where projectItem.ExecutiveSponsor == sponsor select new { projectItem.Title, projectItem.ExecutiveSponsor, projectItem.Project.Leader }; foreach (var proj in results) {Generated from CHM, not final book. Will be superseded in the future. Page 243
  • 244. output.AppendFormat("Title: {0} Sponsor: {1} Leader: {2}", proj.Title, proj.ExecutiveSponsor, proj.Leader); }}In this case, the LINQ to SharePoint provider creates a view that contains only the columns that correspond to thefields in the anonymous type.<View> <Query> <Where> <And> <BeginsWith> <FieldRef Name="ContentTypeId" /> <Value Type="ContentTypeId">0x0100</Value> </BeginsWith> <Eq> <FieldRef Name="Executive_x0020_Sponsor" /> <Value Type="Text">David Pelton</Value> </Eq> </And> </Where> </Query> <ViewFields> <FieldRef Name="Title" /> <FieldRef Name="Executive_x0020_Sponsor" /> <FieldRef Name="ProjectLeader" /> </ViewFields> <ProjectedFields> <Field Name="ProjectLeader" Type="Lookup" List="Project" ShowField="Leader" /> </ProjectedFields> <Joins> <Join Type="LEFT" ListAlias="Project"> <!--List Name: Projects--> <Eq> <FieldRef Name="Project" RefType="ID" /> <FieldRef List="Project" Name="ID" /> </Eq> </Join> </Joins> <RowLimit Paged="TRUE">2147483647</RowLimit></View>The alternative approach, in which you instantiate a known type and set property value in your LINQ expression,is illustrated by the following example.C#public class PriorityProjectView{ public string Title { get; set; } public string ExecutiveSponsor { get; set; } public string Leader { get; set; }}using (ManufacturingSiteDataContext context = new ManufacturingSiteDataContext(SPContext.Current.Web.Url)){ IEnumerable<PriorityProjectView> proirityProjects = from projectItem in context.PriorityProjects where projectItem.ExecutiveSponsor == sponsor select new PriorityProjectView { Title = projectItem.Title, ExecutiveSponsor = projectItem.ExecutiveSponsor,Generated from CHM, not final book. Will be superseded in the future. Page 244
  • 245. Leader = projectItem.Project.Leader };}...Retrieving only the columns that you actually require will clearly improve the efficiency of your queries; in thisregard, the use of view projections can provide a significant performance boost. This example also illustrates howthe use of view projections forces the LINQ to SharePoint provider to perform the list join within the C AML queryinstead of retrieving a lookup column and using the deferred loading approach described earlier. The LINQ toSharePoint provider will only generate C AML joins when you use view projections. This is a more efficientapproach when you know in advance that you will need to display data from two or more entities, because itreduces the number of round trips to the content database. Note:View projections cannot be used for create, update, or delete operations. You must retrieve the full entityinstances if you want use LINQ to SharePoint to perform create, update, or delete operations.LINQ to SharePoint can only generate C AML joins from join projections for a limited number of data types. AnInv alidOperationException will be thrown if the projection contains a disallowed data type. The permitted datatypes are Text, Number, DateTime, Count, and Content Type ID. All remaining field types cannot beprojected, including Boolean, multi-line text, choice, currency, and calculated fields.On a final note for view projections, recall that the LINQ to SharePoint provider will block certain LINQexpressions because they cannot be translated into a single C AML query. For example, the following LINQexpression attempts to retrieve a collection of orders for each customer. However, LINQ to SharePoint is unableto translate the LINQ expression into a single C AML query.C#dataContext.Customers.Select(c=>c.Orders).ToArray();Suppose you modify the expression to use anonymous types, as shown in the following example.C#var results = dataContext.Customers.Select(c => new { Description = c.Order.Description, CustomerId = c.Order.CustomerId }).ToArray();In this case, the LINQ to SharePoint provider is able to translate the expression into a single C AML query, and theruntime will not throw an exception. As you can see, view projections can provide a valuable resource when youdevelop LINQ to SharePoint expressions.Using List Joins across SitesIn many common SharePoint scenarios, a list will include a lookup column that retrieves data from another list ina parent site within the site collection. However, the SPMetal command-line tool generates entity classes for asingle site, and LINQ to SharePoint expressions operate within a data context that represents a single site. Bydefault, when a list includes a lookup column that refers to a list on another site, SPMetal will generate an IDvalue for the item in the related list instead of constructing the related entity itself. If you were to write queriesagainst this data model, you would need to retrieve the related entity yourself in a post-processing step. As aresult, if you want to use LINQ to SharePoint to query cross-site list relationships effectively, you must performsome additional steps: 1. Temporarily move every list onto a single site before you run the SPMetal tool, so that SPMetal generates a full set of entity classes. 2. When you create a LINQ expression, use the DataContext.RegisterList method to inform the runtime of the location of lists that are not on the current site.C onsider the earlier example of a PriorityProjects list. 
The list includes a lookup column that retrieves information from a central Projects list on a parent site, as shown in the following illustration.
Lookup column relationship across sites in a site collection
  • 246. In order to generate entities for both lists using the SPMetal tool, you should create a copy of the PriorityProjectslist on the root site, as shown in the following illustration.Temporary list to build entity classesSPMetal will now build a full set of entities and entity relationships. After you finish building entity classes, you canremove the duplicate lists from the site. When you run a query in the context of the C onstruction team site, youmust use the RegisterList method to tell the runtime where to find the Projects list. The following code exampleshows this.C#using (ManufacturingSiteDataContext context = newManufacturingSiteDataContext("http://localhost/sites/manufacturing/construction")){ context.RegisterList<Construction.ProjectsItem>("Projects", "/sites/Manufacturing", "Projects"); var results = from projectItem in context.PriorityProjects select new { projectItem.Title, projectItem.ExecutiveSponsor, projectItem.Project.Leader }; foreach (var item in results) { output.AppendFormat("Title: {0} Sponsor: {1} Leader: {2}", item.Title, item.ExecutiveSponsor, item.Leader); }}There are various ways in which you could approach setting up your lists for entity generation. In most cases,you will want to generate your entity classes from the site that contains any lists that are referenced by lookupcolumns, because lookup columns reference a specific list on a specific site. In other words, if a lookup columnretrieves data from a list on the root site, you should move all your lists onto the root site and build the entitymodel from there. If you build an entity model that uses a lookup column to retrieve data from a list on one siteand then move that list to another site, you will need to manually update your entity classes.The key options for building your entity model are as follows:Generated from CHM, not final book. Will be superseded in the future. Page 246
  • 247.  C reate copies of all your lists on the root site, and use SPMetal to build a single, comprehensive entity model from the root site. This approach is recommended for most scenarios, because it is usually the simplest and does not require you to modify the entity classes after creation.  Use SPMetal with a parameters file to build a specialized entity model for one specific entity relationship. For example, suppose you have a lookup column that retrieves data from a specific team site instead of the root site. In this case, you should consider replicating all related lists on that specific team site and building your entity model from there, in order to avoid having to manually edit the lookup relationship in your entity classes. You might also consider this approach if you have a large number of lists in your site collection, because it may not be worth the extra effort involves in replicating and maintaining every single list on the root site.  When you have a list on a subsite that includes a lookup column that retrieves values from a list on the root site, you may be tempted to reproduce the root site list on the subsite and generate entity classes from there. However, this approach should generally be avoided. First, you would need to generate temporary lookup columns, because the actual lookup columns you want to use are associated with the specific list instance on the root site. Second, you would need to manually edit the associations in the entity classes in order to use the actual lookup columns instead of the temporary lookup columns.Finally, remember that the SPQuery class supports C AML-based list joins. LINQ to SharePoint is primarilydesigned to expedite the development process. If the time it takes to set up and maintain copies of lists in orderto build a representative entity model outweighs the time savings you derive from writing LINQ expressionsinstead of C AML queries, you might want to consider whether SPQuery is a better choice for your applicationscenario.Additional Performance ConsiderationsLINQ expressions define a generic IEnumerable<T> collection of objects. The Enumerable class provides a setof extension methods that you can use to query and manipulate this collection. These methods have varyingefficiency when you use them in your LINQ to SharePoint expressions. The following table briefly describes someof the performance issues for the operations that have not already been described. Operations that are markedas efficient are translated into C AML and do not require post-processing steps after the list data is retrieved.Operation Performance and behaviorContains EfficientOrderBy EfficientOrderByDescending EfficientThenBy EfficientThenByDescending EfficientGroupBy Efficient when used in conjunction with OrderBySum Returns all elements that satisfy the where clause and then uses LINQ to Objects to compute the sum of the elementsAggregate Returns all elements that satisfy the where clause and then uses LINQ to Objects to apply an accumulator function to the elementsAv erage Returns all elements that satisfy the where clause and then uses LINQ to Objects to calculate the average valueMax Returns all elements that satisfy the where clause and uses LINQ to Objects to calculate the maximum valueMin Returns all elements that satisfy the where clause and then uses LINQ to Objects to calculate the minimum valueSkip Returns all elements that satisfy the where clause and then uses LINQ to Objects to perform the Skip operationGenerated from CHM, not final book. 
  • 248. SkipWhile Returns all elements that satisfy the where clause and then uses LINQ to Objects to perform the SkipWhile operationElementAt Unsupported; use the Take method insteadElementAtOrDefault Unsupported; use the Take method insteadLast Returns all items that satisfy the where clause, and then gets the lastLastOrDefault Returns all items that satisfy the where clause, and then gets the last or returns default if no itemsAll Returns all elements that satisfy the where clause, and then uses LINQ to Objects to evaluate the conditionAny Returns all elements that satisfy the where clause, and then uses LINQ to Objects to evaluate the conditionAsQueryable EfficientCast EfficientConcat EfficientDefaultIfEmpty EfficientDistinct Performed across two collections; returns all elements that satisfy the where clause and then uses LINQ to Objects to filter out duplicatesExcept Performed across two collections; returns all elements that satisfy the where clause and then uses LINQ to Objects to calculate the set differenceFirst EfficientFirstOrDefault EfficientGroupJoin EfficientIntersect Performed across two collections; returns all elements that satisfy the where clause and then uses LINQ to Objects to calculate the set intersectionOfType EfficientReverse Returns all elements that satisfy the where clause, and then uses LINQ to Objects to reverse the order of the sequenceSelectMany EfficientSequenceEqual Performed across two collections; returns all elements that satisfy the where clause and then uses LINQ to Objects to calculate whether the two sets are equalSingle EfficientSingleOrDefault EfficientTake EfficientTakeWhile EfficientGenerated from CHM, not final book. Will be superseded in the future. Page 248
Union Efficient
The Repository Pattern and LINQ to SharePoint
The Repository pattern is an application design pattern that provides a centralized, isolated data access layer. The repository retrieves and updates data from an underlying data source and maps the data to your entity model. This approach allows you to separate your data access logic from your business logic.
In some ways, the advent of LINQ to SharePoint may appear to obviate the need for repositories. However, there are many good reasons to continue to use the Repository pattern with LINQ to SharePoint:
 Query optimization. LINQ to SharePoint substantially reduces the effort involved in developing queries, when compared to creating CAML queries directly. However, it is still easy to write LINQ to SharePoint queries that perform poorly, as described earlier in this topic. Developing your queries in a central repository means that there are fewer queries to optimize and there is one place to look if you do encounter issues.
 Maintainability. If you use LINQ to SharePoint directly from your business logic, you will need to update your code in multiple places if your data model changes. The Repository pattern decouples the consumer of the data from the provider of the data, which means you can update your queries in response to data model changes without impacting business logic throughout your code.
 Testability. The repository provides a substitution point at which you can insert fake objects for unit testing.
 Flexibility. The Repository pattern promotes layering and decoupling, which leads to more flexible, reusable code.
In practice, you will encounter tradeoffs between the advantages of the Repository pattern and the practicalities of implementing a solution. In the SharePoint Guidance reference implementations, the following practices were established:
 Encapsulate all LINQ to SharePoint queries in a repository. This provides a central point of management for the queries.
 Configure the repository class to return the entity types generated by the SPMetal command-line tool. This avoids the additional overhead of creating custom business entities and mapping them to the SPMetal entity classes, which would be the purist approach to implementing the Repository pattern. However, on the negative side, this results in a tighter coupling between the data model and the data consumers.
 Add view objects to the repository in order to return composite projections of entities. A view object combines fields from more than one entity, and using view projections can make LINQ to SharePoint queries across multiple entities more efficient, as described earlier in this topic. This approach was used in the reference implementations, even though it deviates from the Repository pattern, because the views are relatively simple and the entities that are represented in the views are all owned by the same repository. If the views were more complex, or the entities involved spanned multiple repositories, the developers would have implemented a separate class to manage views to provide a cleaner division of responsibilities.
To view this repository implementation in action, see the SharePoint List Data Models Reference Implementation.
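To make the approach concrete, the following sketch shows one possible shape for such a repository, reusing the ManufacturingSiteDataContext and PriorityProjectView types from the earlier examples. The class and method names are illustrative and do not come from the reference implementation; the code assumes the System.Linq, System.Collections.Generic, and Microsoft.SharePoint.Linq namespaces.
C#
public class ProjectRepository
{
    private readonly string webUrl;

    public ProjectRepository(string webUrl)
    {
        this.webUrl = webUrl;
    }

    // Consumers work with view objects; they never issue LINQ to SharePoint queries directly.
    public IList<PriorityProjectView> GetPriorityProjectsBySponsor(string sponsor)
    {
        using (var context = new ManufacturingSiteDataContext(this.webUrl))
        {
            // The query is read-only, so change tracking is not required.
            context.ObjectTrackingEnabled = false;

            var projects = from projectItem in context.PriorityProjects
                           where projectItem.ExecutiveSponsor == sponsor
                           select new PriorityProjectView
                           {
                               Title = projectItem.Title,
                               ExecutiveSponsor = projectItem.ExecutiveSponsor,
                               Leader = projectItem.Project.Leader
                           };

            // Materialize the results before the data context is disposed.
            return projects.ToList();
        }
    }
}
A Web Part or other consumer can then call new ProjectRepository(SPContext.Current.Web.Url).GetPriorityProjectsBySponsor("David Pelton") without any knowledge of the underlying lists, and the query can be tuned, or replaced with a fake repository for unit testing, in a single location.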
  • 250. Using the BDC Object ModelScenarios for access to external data vary widely in complexity, from simply displaying a list of information froman external source to providing heavily customized interactivity. For many basic scenarios, using the built-inBusiness Data Web Parts or using the SPList API to query external lists provide straightforward and effectiveapproaches to meeting your application requirements. However, for more complex scenarios, you will need to usethe BDC object model for full control of how you interact with your external data entities.The BDC object model is not available in sandboxed solutions. The following table summarizes the scenarioswhere external lists or Business Data Web Parts may meet your requirements and the scenarios where you mustuse the BDC object model.External data scenario External Business Data Web BCS API list PartsAccess two-dimensional (flat) data from a sandboxedsolutionAccess two-dimensional (flat) data from a farm solutionAccess data with non-integer identifiersNavigate one-to-one associations between entitiesNavigate one-to-many associations between entitiesNavigate many-to-many associations between entitiesRead entities with complex types that can be flattened usinga format stringRead entities with complex types that cannot be flattenedusing a format stringC reate, update, or delete entities with complex typesPerform paging or chunking of dataStream binary objectsAccess two-dimensional (flat) data from client logic*Navigate associations between entities from client logic**The BDC provides a client API and a server API that offer the same functionality. However, the client API is onlyavailable in full .NET applications; it cannot be used from Silverlight or JavaScript.The BDC runtime API allows you to programmatically navigate a BDC model, and to interact with an externalsystem through the model, without using intermediary components such as external lists or Business Data WebParts. The following diagram illustrates the key components of the BDC programming model.Key components of the BDC programming modelGenerated from CHM, not final book. Will be superseded in the future. Page 250
  • 251. Each of the components in the programming model relates to a specific part of the BDC model, which wasdescribed in Business Data C onnectivity Models. The BDC service application instance (BdcService) representsthe service instance that manages the metadata for the external systems you want to access. Remember that theBDC service application instance you use is determined by the service application proxy group associated withthe current SharePoint Web application. Each BDC service application instance exposes a metadata catalog(IMetadataC atalog) that you can use to navigate through the metadata definitions stored by the service.Within the metadata catalog, the two primary concepts are the entity (IEntity) and the LOB system instance(ILobSystemInstance). An entity represents an external content type and defines the stereotyped operations thatare used to interact with an external data entity. It can also define associations that allow you to navigate torelated entities and filters that enable you to constrain a result set. A LOB system instance represents a specificinstance, or installation, of the external system that the BDC model represents, and defines the connection andauthentication details required to connect to the system. Note:"LOB system instance" is a legacy term from Office SharePoint Server 2007. A LOB system is aline-of-business application, such as customer relationship management (C RM) or enterprise resource planning(ERP) software. Although the term "LOB system instance" is still used within the BDC object model, the broaderterm "external system" is preferred in other cases.Entities, or external content types, are common to all instances of a system. To access data from a specificinstance of an external system, you need to use an entity object in conjunction with a LOB system instance objectto retrieve an entity instance (IEntityInstance). The following code example illustrates this.C#public IEntityInstance GetMachineInstance(int machineId){ const string entityName = "Machines"; const string systemName = "PartsManagement"; const string nameSpace = "DataModels.ExternalData.PartsManagement"; BdcService bdcService = SPFarm.Local.Services.GetValue<BdcService>(); IMetadataCatalog catalog = bdcService.GetDatabaseBackedMetadataCatalog(SPServiceContext.Current); ILobSystemInstance lobSystemInstance = catalog.GetLobSystem(systemName).GetLobSystemInstances()[systemName];Generated from CHM, not final book. Will be superseded in the future. Page 251
  • 252. Identity identity = new Identity(machineId); IEntity entity = catalog.GetEntity(nameSpace, entityName); IEntityInstance instance = entity.FindSpecific(identity, lobSystemInstance); return instance;}When a method on an IEntity object takes an object of type ILobSystemInstance as a parameter—such asthe F indSpecific method shown here—usually, it is querying the external system for information. The IEntityobject defines the stereotyped operations that allow you to interact with a particular type of data entity on theexternal system, while the ILobSystemInstance object defines the details required to actually connect to aspecific external system instance. Typically, you perform data operations on an IEntityInstance object or on acollection of IEntityInstance objects. Each IEntityInstance object contains a set of fields and values thatcorrespond to the related data item in the external system. These fields can represent simple types or complextypes. For more information, see IEntityInstance Interface on MSDN.Using FiltersIEntity objects can include filter definitions that allow you to constrain result sets when retrieving more than oneitem. Filters can also provide contextual information to the external system, such as a trace identifier to use whenlogging. The following code example shows how you can use filters to retrieve entity instances that match aspecified model number.C#public DataTable FindMachinesForMatchingModelNumber(string modelNumber){ const string entityName = "Machines"; const string systemName = "PartsManagement"; const string nameSpace = "DataModels.ExternalData.PartsManagement"; BdcService bdcService = SPFarm.Local.Services.GetValue<BdcService>(); IMetadataCatalog catalog = bdcService.GetDatabaseBackedMetadataCatalog(SPServiceContext.Current); ILobSystemInstance lobSystemInstance = catalog.GetLobSystem(systemName).GetLobSystemInstances()[systemName]; IEntity entity = catalog.GetEntity(nameSpace, entityName); IFilterCollection filters = entity.GetDefaultFinderFilters(