Automating business processes with SharePoint 2013 is a powerful way to increase efficiency within any organization. With SharePoint Designer 2013, no-code (or declarative) workflows can be built to run on either SharePoint 2013 on-premises or in the cloud with Office 365. In this session, we’ll develop an expense report workflow from beginning to end to show how SharePoint Designer workflows are being used in business today.
4. Agenda
Prototyping in Visio 2013
Implementing in SPD2013
Extending with VS2013
Summary
5. SPD Workflows?
SharePoint Online supports declarative workflows
Code-based workflows aren’t supported in the Sandbox
Prototyping in Visio 2013
SharePoint specific flowchart diagram
Export to SharePoint Designer 2013
Implement in SharePoint Designer 2013
Improved workflow designer
Integration of forms with InfoPath 2013
Extend with Visual Studio 2013
Custom SPD2013 actions
Events
6. Workflow in SPO?
New Workflow targets
Reusable Workflows can be applied to any list
Site Workflows can execute on site
New Workflow events
Emitted by SharePoint Online (e.g. WorkflowStarted)
Custom Events, Event Receivers
9. Summary
Prototype workflows in Visio 2013
Use diagrams to get sign-off
Implement the workflow in SharePoint Designer 2013
Custom, or copy & modify a built-in workflow
Actions, conditions and steps
Develop custom workflow actions and events in Visual Studio 2013
10. Who can you trust?
The blogs I trust through all of the noise.
Maurice Prather http://www.bluedoglimited.com/default.aspx
Andrew Connell http://www.andrewconnell.com/blog
Spence Harbar http://www.harbar.net
Jim Duncan
Heather Solomon http://www.heathersolomon.com/blog
Todd Klindt http://www.toddklindt.com/default.aspx
Todd Baginski http://www.toddbaginski.com/blog
Todd Bleeker http://bit.ly/edlSm5
Jan Tielens http://weblogs.asp.net/jan
Patrick Tisseghem http://www.u2u.info/Blogs/Patrick/default.aspx
Wictor Wilen http://www.wictorwilen.se
Ted Pattison http://blog.tedpattison.net/default.aspx
Lars Fastrup http://www.fastrup.net
Carsten Keutmann http://keutmann.blogspot.com
Keith Richie http://blog.krichie.com
Bill Baer http://blogs.technet.com/b/wbaer
17. Search
Search uses SQL in a very I/O-intensive fashion. It is sensitive to I/O latencies on
the TempDB and the Query and Crawl file groups. One of the more difficult and
time-consuming jobs for a Search administrator is to schedule the crawls so they
are not overlapping while keeping search results fresh.
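The scheduling problem above can be sketched as a simple overlap check. This is a minimal illustration with hypothetical crawl names and time windows, not a tool from the deck:

```python
# Hypothetical crawl windows (start hour, end hour on a 24h clock).
# A quick sanity check that no two scheduled crawls overlap.

def overlaps(a, b):
    """True if two (start, end) hour windows overlap."""
    return a[0] < b[1] and b[0] < a[1]

def find_overlaps(schedule):
    """Return pairs of crawl names whose windows overlap."""
    names = list(schedule)
    clashes = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if overlaps(schedule[names[i]], schedule[names[j]]):
                clashes.append((names[i], names[j]))
    return clashes

schedule = {
    "full_crawl": (1, 5),          # 01:00-05:00
    "incremental_crawl": (6, 7),   # 06:00-07:00
    "profile_import": (4, 6),      # 04:00-06:00 -- clashes with the full crawl
}
print(find_overlaps(schedule))     # [('full_crawl', 'profile_import')]
```

In practice the same check would run over every scheduled job touching the farm (crawls, profile imports, backups), since they all compete for the same SQL I/O.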
Indexing/Crawling
Crawling and indexing a large volume of information, documents, and Web pages
requires a large amount of computer processing. The crawl process also
consumes network and other resources. The SharePoint environment must be
configured properly and monitored to ensure that the crawling and indexing
process does not adversely affect the service available to users. For example,
content is usually crawled and indexed during off-peak hours when servers are
underused in order to maintain peak-hour services for users.
Applications that may be crawling content in your production environment
○ Coveo Full and Incremental crawls to enable search
○ Newsgator to Update all of the colleague information and RSS feeds
○ DocAve for Reporting on and Performing SharePoint Management tasks
○ WSS Search indexes the Help information provided with SharePoint
○ SharePoint Profile Import syncs people profiles
○ Office Search Full and Incremental updates (which Coveo would replace)
18. Top Performance Killers
Profile Import
Profile imports are used with NGES to sync your AD user details to provide access
to your feed subscriptions, and with SharePoint to sync your AD user details with
your SharePoint User Profile.
Large List Operations
Having large lists is not by itself a performance issue. But when SharePoint
Server renders the many items in those lists, it can cause spikes in render times
and database blocking. One way to mitigate large lists is to use subfolders and
create a hierarchical structure where each folder or subfolder has no more than
3,000 items. Identify large lists and work with the owners of the sites and lists
to archive items or pursue other mitigation strategies.
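The partitioning rule above is easy to quantify. A minimal sketch (the 3,000-item cap comes from the text; the function name is ours):

```python
import math

# How many subfolders are needed so that no folder holds more than
# 3,000 items (the per-folder threshold suggested above)?

MAX_ITEMS_PER_FOLDER = 3000

def folders_needed(item_count, cap=MAX_ITEMS_PER_FOLDER):
    return math.ceil(item_count / cap)

print(folders_needed(10000))  # 4 folders (3000 + 3000 + 3000 + 1000)
```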
Heavy User Operation List Import/Write
Another scenario where users have more power than they realize: importing large
lists using Excel or synchronizing an Access database. In SQL there’s little
difference between these types of user operations.
Backup (SQL & Tape)
A serious CPU and write disk I/O performance hit. SQL LiteSpeed or SQL 2008
backup with compression both help to lessen the performance hit.
20. Database Performance
Database Volumes
Separate database volumes into unique LUNs consisting of unique physical disk spindles.
Prioritize data among faster disks with this ranking:
○ SQL TempDB data files
○ Database transaction log files
○ Search database
○ Content databases
In a heavily read-oriented portal site, prioritize data over logs.
Separate the Search database transaction log from the content database transaction logs.
21. Database Performance
SQL TempDB Data Files
Recommended practice is that the number of data files allocated
for TempDB should equal the number of CPU cores in the SQL Server.
TempDB data file sizes should be consistent across all data files.
TempDB data files should be spread across unique LUNs and
separated from the Content DB, Search DB, etc.
TempDB log file separated to a unique LUN.
Optimal TempDB data file sizes can be calculated using the
following formula: [MAX DB SIZE (KB)] × 0.25 / [# CORES] = DATA FILE SIZE (KB)
The combined starting size of the data files should be roughly equal
to 25% of the largest content or search DB.
Use RAID 10; separate LUN from other database objects (content, search, etc.).
“Autogrow” feature set to a fixed amount; if autogrow occurs,
permanently increase the TempDB size.
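The sizing formula above can be worked through with a short script. The 100 GB database and 8-core server are hypothetical numbers, not from the deck:

```python
# TempDB sizing formula from the slide:
#   data file size (KB) = [MAX DB SIZE (KB)] * 0.25 / [# CORES]

def tempdb_data_file_size_kb(max_db_size_kb, cores):
    return max_db_size_kb * 0.25 / cores

# Largest content DB of 100 GB on an 8-core SQL Server:
max_db_kb = 100 * 1024 * 1024              # 100 GB expressed in KB
size_kb = tempdb_data_file_size_kb(max_db_kb, 8)
print(size_kb / 1024 / 1024)               # 3.125 GB per file, one file per core
```

Eight files of 3.125 GB each give 25 GB total, i.e. 25% of the largest database, matching the guidance above.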
22. Content Databases
100 content databases per Web application
100GB per content database
○ CAUTION: Major DB locking issues reported in collaborative
DM scenarios above 100GB
○ Need to ensure that you understand the issues based on
number of users, usage profiles, etc…
○ Service Level Agreement (SLA) requirements for backup and
restore will also have an impact on this decision.
○ KnowledgeLake Lab testing demonstrated SharePoint
performance was NOT impacted by utilizing larger DB sizes;
tests included content DB sizes that were 100GB, 150GB,
200GB, 250GB, 300GB and 350GB.
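The two limits above can be combined into a rough capacity-planning check. This is our own sketch (the 1.25 TB corpus is a hypothetical input), not a formula from the deck:

```python
import math

# Given a total corpus size, estimate how many content databases are
# needed to stay under the 100 GB per-database guidance, and check the
# result against the 100-databases-per-web-application limit.

DB_SIZE_CAP_GB = 100
DBS_PER_WEB_APP = 100

def content_dbs_needed(total_corpus_gb, cap_gb=DB_SIZE_CAP_GB):
    return math.ceil(total_corpus_gb / cap_gb)

needed = content_dbs_needed(1250)             # 1.25 TB corpus
print(needed, needed <= DBS_PER_WEB_APP)      # 13 True
```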
23. Content Databases - Continued
Pre-construct and pre-size
Script generation of empty database objects
“Autogrow” feature on
Use RAID 5 or RAID 10 logical units
○ RAID 10 is the best choice when cost is not a concern.
○ RAID 5 will be sufficient and will save on costs, since content
databases tend to be more read-intensive than write-intensive.
Multi-core computer running SQL Server
○ Primary file group could consist of a data file for each CPU core
present in SQL Server.
○ Move each data file to separate logical units consisting of
unique physical disk spindles.
24. Search Database
Pre-construct and pre-size
Script generation of empty database objects
“Autogrow” feature on
Use RAID 10 logical units
○ Should be a requirement for large-scale systems
○ Search database is extremely read/write intensive
Multi-core computer running SQL Server
○ Primary file group could consist of a data file for each CPU core
present in SQL Server.
○ Move each data file to separate logical units consisting of
unique physical disk spindles.
25. Search Database
Search database is VERY read/write intensive!
Do not place any other database data files on any logical unit
where search database files reside.
If possible, try to ensure that the RAID 10 logical units for the
search database data files do not share their physical spindles
with other databases.
Place the search database log files on an independent logical unit.
26. Database Maintenance
Physical Volume File Fragmentation:
○ Defragment your physical volumes on a regular schedule for
increased performance!
○ LUNs need to be 20-50% larger than the data stored on them to
allow for effective defragmentation of the data files.
Performance Monitor counters to watch:
○ Average Disk Queue Length
Single-digit values are optimal.
Occasional double-digit values aren’t a large concern.
Sustained triple-digit values require attention.
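On Windows, that counter can be sampled from the command line with the built-in `typeperf` tool; a sketch, assuming the standard PhysicalDisk counter object (sample every 5 seconds, 12 samples):

```
typeperf "\PhysicalDisk(_Total)\Avg. Disk Queue Length" -si 5 -sc 12
```

For per-volume numbers, replace `_Total` with the specific disk instance hosting the database files.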
27. Page Performance
Minimize HTTP Requests
80% of the end-user response time is spent on the front-end. Most
of this time is tied up in downloading all the components in the
page: images, stylesheets, scripts, Flash, etc. Reducing the
number of components in turn reduces the number of HTTP
requests required to render the page. This is the key to faster
pages.
For static components: implement a “never expire” policy by setting a
far-future Expires header
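As a hedged illustration (the date and max-age values are our examples, roughly ten years out, not values from the deck), a far-future cache policy on a static component might look like:

```
Expires: Mon, 15 Apr 2030 20:00:00 GMT
Cache-Control: max-age=315360000
```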
Avoid Redirects
Redirects are accomplished using the 301 and 302 status codes.
Here’s an example of the HTTP headers in a 301 response:
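The example headers did not survive in this transcript; a typical 301 response (the URL is a placeholder) looks like:

```
HTTP/1.1 301 Moved Permanently
Location: http://example.com/newuri
Content-Type: text/html
```

The extra round trip to the `Location` URL is what makes redirects costly: nothing can render until the second request completes.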
Optimize Images
After a designer is done creating the images for your web page,
there are still some things you can try before uploading the images
to your web server
Avoid Empty Image src
Images with an empty string src attribute occur more often than one
would expect.