1. Situation
The customer has a global reach with offices in EMEA, the Americas and APAC. They want to
implement SharePoint, but they know they will have bandwidth issues. They have already tried to run
a centralized environment via Citrix, but they scrapped that project due to latency in APAC. They have
gone back to a three-datacenter model, with datacenters in EMEA, the Americas and APAC. On the
other hand, they have project teams working for global customers that will need to collaborate
across the different regions.
Pain
The customer needs to keep at least some parts of the content consistent across multiple farms.
Questions
What bandwidth is available for synchronization?
When should the synchronization take place?
Should all content be synchronized?
How do you keep the customizations the same across the different farms?
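The bandwidth question above can be made concrete with a back-of-the-envelope calculation. The link speed, daily change volume and usable-bandwidth fraction below are illustrative assumptions, not customer figures:

```python
# Rough sync-window estimate: how long does a daily replication delta
# take over a WAN link? All figures are illustrative assumptions.

def sync_hours(delta_gb: float, link_mbps: float, usable_fraction: float = 0.5) -> float:
    """Hours needed to push `delta_gb` over a `link_mbps` link,
    assuming only `usable_fraction` of the bandwidth is available
    for replication traffic."""
    bits = delta_gb * 8 * 1024 ** 3               # GB -> bits
    usable_bps = link_mbps * 1_000_000 * usable_fraction
    return bits / usable_bps / 3600

# Example: 5 GB of daily changes over a 10 Mbit/s link, half usable.
print(round(sync_hours(5, 10), 1))   # -> 2.4 hours
```

If the computed window does not fit in the off-peak hours, that argues for event-driven, delta-only replication rather than bulk scheduled copies.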
Need
The customer clearly needs Replicator, with the binary replication and event-driven
replication options included. Next to that, Deployment Manager and Central
Administration may be an option to manage the different farms from one single console.
2. Situation
The customer did a migration a year ago, but the migration from file shares to SharePoint
2010 has not resulted in acceptance of SharePoint. The customer has come to the conclusion that
they want to start a new project to use SharePoint in a new way.
Pain
The customer is facing a restructuring of SharePoint, but at the same time needs to make sure that
the data remains accessible.
Questions
What kind of restructuring are you looking for?
Are you going to promote sites to site collections?
Are you going to demote site collections to sites?
Are you going to change a lot of permissions?
Are you going to change a lot of configurations on SharePoint sites…?
Need
The customer needs an ad hoc content mover; this is our Content Manager. But the
customer will most likely also change a lot of permissions, so our Central Administration tool can be
an advantage too.
3. Situation
The customer is using SharePoint to its full extent: for collaboration in project
teams, as a repository to replace file shares and policy and procedure handbooks, and as a
business application for expenses. Next to all this internal usage, SharePoint is also used as the base
for the external-facing website.
Due to this heavy usage they have made a lot of customizations to SharePoint, and they have stored a
lot of data inside SharePoint. As a result, a full backup of SharePoint takes at least 14 hours,
which is why IT runs a weekly full backup and a daily incremental
backup.
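The cost of this schedule shows up at restore time: a restore just before the next weekly full must replay the full backup plus up to six incrementals. A small sketch of that arithmetic (the per-incremental restore time is an assumption; only the 14-hour full-backup figure comes from the scenario):

```python
# Worst-case restore chain for a weekly-full / daily-incremental scheme.
# The 14 h full figure is from the scenario; the incremental time is assumed.

def restore_chain(days_since_full: int) -> int:
    """Number of backup sets to apply: 1 full + one incremental per day."""
    return 1 + days_since_full

def restore_hours(days_since_full: int,
                  full_hours: float = 14.0,       # full restore, per the 14 h figure
                  incr_hours: float = 1.0) -> float:  # assumed time per incremental
    return full_hours + days_since_full * incr_hours

# Just before the next weekly full, 6 incrementals must be replayed:
print(restore_chain(6), restore_hours(6))   # -> 7 sets, 20.0 hours
```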
Pain
The restore of single documents is expensive and takes a long time.
Restoring a single document is sometimes no longer worthwhile, because the document has changed
so much since the backup that the restored copy is outdated.
Certain parts of the SharePoint environment are extremely static, so for those the current backup
schedule is fine.
The IT department can't differentiate between departments with different service levels.
Questions
What kind of backup are you using today?
Do you back up externalized content in sync with the content in the database?
Do you back up your customizations with a full backup, or do you only back up the content?
Do you use the site recycle bin to recover from document loss?
Do you use versioning to make sure you still have an earlier version?
Do you sometimes need to restore documents for auditors, and do you do this out of place?
Do you want to be able to differentiate service levels on restore points (the time you roll back to) for
different customers/content in your SharePoint environment?
Need
A solution that can support both Availability Management (granular backup and restore of items)
and IT Service Continuity Management (a full backup in sync, for when a disaster happens, e.g. a
flood, and you need to restore the whole farm).
For availability management you probably only need to restore content, and this content
needs to be up to date, so you schedule backups depending on the dynamics of the data (frequent
changes versus static data).
For IT Service Continuity Management you want to restore not only the content but also the
customizations that are installed on the WFE, the IIS settings, etc.
What you do not want is two tools with two completely different interfaces for this. You also want
to be able to control storage costs for the backups, for example by using storage tiering for the
backups too.
4. Situation
You have a customer that uses SharePoint extensively, and unavailability of SharePoint would
result in a major loss for the company. Next to this, the company creates a lot of customizations for
SharePoint. They have a very rigid development procedure, and do extensive testing and quality
assurance.
Pain
The customer needs to move customizations from development to test, after testing to QA, and then
into production. These steps should involve as few manual procedures as possible; each manual
action is a potential source of error.
Questions
How do you transfer customizations from development into test, etc. today?
How often does a transfer go wrong because items are forgotten? What is the actual downtime
caused by this? And how quickly can you get these customizations out of production?
Need
A solution that deploys the customizations into the different farms through a GUI. This results in
fewer manual actions, which leads to fewer errors. But the solution should also be able to withdraw
customizations quickly and easily. (Deployment Manager)
5. Situation:
The customer has a large content database that is approaching 200 GB, which Microsoft considers
the optimal maximum size.
Pain:
End users will experience slow performance.
Maintaining this situation is expensive: Microsoft advises provisioning at least three times
the size of the database in disk space, for defragmentation and other maintenance tasks.
The backup and restore of the database may take a long time.
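The "three times the database size" guidance translates into simple arithmetic that is easy to put in front of the customer (the sizes here are illustrative):

```python
# Disk provisioning per the "three times the database size" guidance
# cited above. Sizes are illustrative.

def required_disk_gb(content_db_gb: float, factor: float = 3.0) -> float:
    """Disk space to provision for a content database, leaving headroom
    for defragmentation and other maintenance."""
    return content_db_gb * factor

# A 200 GB content database at the recommended limit:
print(required_disk_gb(200))   # -> 600.0 GB
```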
Additional questioning:
What do you store in the content database?
o Do you store large files in the content database? Externalize that data with Extender
(this is the BLOB storage idea). Large content is considered to be files of 1 MB or larger,
so basically any file counts as large content.
o Do you have a lot of sites in the content database that are no longer actively used,
e.g. project sites? Externalize that data with Archiver; it is not needed directly in
the database and can live outside it.
o Do you have one or two sub-sites that are growing rapidly? This calls for
content restructuring: you want to promote a site to its own site collection, which
means you can store its information in its own content database.
Content restructuring is Content Manager…
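The triage logic in the bullets above can be sketched as a simple rule set. This is an illustrative model only: the function, the inactivity cutoff and the category names are assumptions, not a real DocAve API; only the 1 MB "large content" threshold comes from the text.

```python
# Rule-based externalization decision, sketching the triage described above.
# Hypothetical helper; the 1 MB threshold is from the text, the rest is assumed.
from datetime import date, timedelta

LARGE_FILE_BYTES = 1 * 1024 * 1024      # "large content": files of 1 MB or bigger
INACTIVE_AFTER = timedelta(days=365)    # assumed inactivity cutoff for archiving

def placement(size_bytes: int, last_accessed: date, today: date) -> str:
    if today - last_accessed > INACTIVE_AFTER:
        return "archiver"               # stale content: archive outside the database
    if size_bytes >= LARGE_FILE_BYTES:
        return "extender"               # large BLOBs: externalize, keep a link in the DB
    return "content-db"                 # small, active items stay in the database

today = date(2012, 6, 1)
print(placement(5 * 1024 * 1024, date(2012, 5, 20), today))   # large, active -> extender
print(placement(200 * 1024, date(2010, 1, 1), today))         # small, stale -> archiver
```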
Need:
The possibility to externalize large data objects outside the content database without breaking
the capabilities that SharePoint delivers. (Extender)
The possibility to externalize less-used data outside the content database, based on rules, without
breaking the capabilities that SharePoint delivers. (Archiver)
The possibility to promote a site to a site collection to give it its own database, without
changing other actions and options on that site. (Content Manager)
The possibility to back up only the content database with the links to externalized content, to
keep the backup time short. (Backup and Restore)
The possibility to back up the complete environment, including the externalized
content, to be able to restore completely in sync. (Backup and Restore)
6. Situation
The customer is using the delegated administration model of SharePoint to its maximum extent, but
because of staff turnover they regularly need to delegate control to new employees.
Pain
SharePoint delivers a perfect model to delegate control and grant permissions at each level: site
collection, site, library, and even item. But each level has its own interface for setting permissions, so
when you need to make a change in bulk, you either have to script it or you face an error-prone
manual task that takes a lot of time.
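The bulk operation the customer needs can be pictured with a small sketch over an in-memory model. Everything here is an illustrative stand-in: a real implementation would go through SharePoint's APIs or a tool like DocaAve Central Administration, and the dictionary model below is not a real SharePoint structure.

```python
# Bulk rights transfer across levels, sketched over an in-memory model.
# Hypothetical stand-in for SharePoint securable objects, not a real API.

# Each securable object (site collection, site, library, item) maps
# users to a set of permission levels.
permissions = {
    "SiteCollection/HR":           {"alice": {"Full Control"}},
    "SiteCollection/HR/Payroll":   {"alice": {"Contribute"}, "bob": {"Read"}},
    "SiteCollection/HR/Docs.xlsx": {"alice": {"Read"}},
}

def transfer_rights(perms: dict, old_user: str, new_user: str) -> int:
    """Move every right of `old_user` to `new_user`, at every level.
    Returns the number of securable objects changed."""
    changed = 0
    for scope, acl in perms.items():
        if old_user in acl:
            acl.setdefault(new_user, set()).update(acl.pop(old_user))
            changed += 1
    return changed

print(transfer_rights(permissions, "alice", "carol"))   # -> 3 objects changed
```

The point of the sketch is the shape of the problem: one operation touching every level at once, instead of visiting three different permission pages by hand.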
Additional questioning
How often do you need to transfer rights from one user to another?
How often do you need to copy rights from one user to another?
Do you have an overview of which rights a user has inside your SharePoint environment?
Need
A central console that can copy and transfer user rights at all levels, in bulk and on a schedule, via a
GUI. Basically this is what DocAve Central Administration delivers.
A central console that gives you an overview of all the rights a specific user has at any level.