<Disclaimer>These are personal notes of what I retained during the session. They can be incomplete, partially right or wrong. This is just part of the notes I took and what caught my attention. Nothing prevents the reader from getting more information on their favorite web site.</Disclaimer>
Best practice cycle: Capacity Planning, Architect for Scale, Pilot & Test, Deploy, Monitor & Validate.
Different kinds of cache: the ASP.NET cache in the RAM of the WFEs, the SharePoint object cache also in the RAM of the WFEs, and the disk-based BLOB cache sitting on the disks of the WFEs for static files.
When sponsors really want “real-time” on the site, ask how often the content is actually updated and whether waiting one minute for an update is really a problem.
A quick formula: since only one request per cache window actually renders the page, the share of work saved for the server is 1 − 1 / (number of requests per second × number of seconds of caching).
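Worked example: at 10 requests per second with a 60-second cache duration, 600 requests arrive per cache window and only the first one renders the page, so about 1 − 1/600 ≈ 99.8% of the rendering work is saved; even a 10-second cache already saves 99%.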
The output cache can be activated in Central Administration. It is a trade-off between freshness and work saved. Avoid the “check for changes” capability (around a 50% performance drop). A good question to also ask is what you want to vary the cache by: more variations mean more uncached requests.
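Custom cache variation in ASP.NET goes through the GetVaryByCustomString hook (which, as far as I understand, is what a cache profile's “Vary by Custom Parameter” maps to). A minimal sketch, with an invented class name and variation key, showing why every distinct value means another uncached copy of the page:

```csharp
using System;
using System.Web;

// Each distinct string returned by GetVaryByCustomString produces its own
// cached copy of the page, so more variation means fewer cache hits.
public class VaryByCustomApplication : HttpApplication
{
    public override string GetVaryByCustomString(HttpContext context, string custom)
    {
        // Hypothetical variation key: keep only two variants, anonymous vs. authenticated.
        if (string.Equals(custom, "AnonymousState", StringComparison.OrdinalIgnoreCase))
        {
            return context.Request.IsAuthenticated ? "auth" : "anon";
        }
        return base.GetVaryByCustomString(context, custom);
    }
}
```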
Using System.Web.UI.WebControls.Substitution (post-cache substitution) makes your control run at every request, even when the rest of the page is served from the output cache.
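A minimal sketch of post-cache substitution (the method name is invented): the <asp:Substitution> control in the markup points to a static callback that runs on every request, so keep it cheap.

```csharp
using System;
using System.Web;

public partial class CachedPage : System.Web.UI.Page
{
    // Referenced from the markup as:
    // <asp:Substitution ID="TimeStamp" runat="server" MethodName="RenderTimestamp" />
    // The callback must be static and match the HttpResponseSubstitutionCallback signature.
    public static string RenderTimestamp(HttpContext context)
    {
        // Runs on EVERY request, bypassing the output cache for this fragment only.
        return DateTime.Now.ToString("HH:mm:ss");
    }
}
```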
Check that the ASP.NET cache is working by enabling the cache information on pages. For the database side, use SQL Profiler rather than the Developer Dashboard, because the Developer Dashboard output is cached together with the page.
The object cache is used by the CQWP and the navigation controls and is configured in Central Administration. The bigger, the better. Invalidation can be time-based, or the cache can check whether the data is still valid, with the corresponding performance hit. Cache more result variations only if different users have different permissions; on an anonymous internet site that should not be the case. Two “super” accounts have to be configured, one for a Full Control user policy (SuperUser) and one for a Full Read user policy (SuperReader).
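A hedged sketch of how the two accounts are typically registered on the web application (the URL and account names are placeholders; the same accounts must also be granted the Full Control / Full Read user policies):

```csharp
using System;
using Microsoft.SharePoint.Administration;

class ConfigureObjectCacheAccounts
{
    static void Main()
    {
        // Placeholder URL and accounts: adapt to your farm and authentication mode.
        SPWebApplication webApp = SPWebApplication.Lookup(new Uri("http://www.contoso.com"));

        // Property keys used by the publishing object cache.
        webApp.Properties["portalsuperuseraccount"] = @"CONTOSO\spSuperUser";     // Full Control policy
        webApp.Properties["portalsuperreaderaccount"] = @"CONTOSO\spSuperReader"; // Full Read policy
        webApp.Update();
    }
}
```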
When accessing data from the client side, write a web service that uses the object cache and call it from the client application, rather than using the Client OM directly (its calls are not cached).
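A hedged sketch of the server-side piece, assuming an ASMX-style service (the service name, list name and query are invented): the query goes through PortalSiteMapProvider, which reads from the object cache.

```csharp
using System.Collections.Generic;
using System.Web;
using System.Web.Services;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Publishing.Navigation;

// Hypothetical service the client calls instead of the Client OM,
// so the query is served from the object cache on the server.
public class CachedQueryService : WebService
{
    [WebMethod]
    public string[] GetAnnouncementTitles()
    {
        SPWeb web = SPContext.Current.Web;

        // PortalSiteMapProvider reads from the SharePoint object cache.
        PortalSiteMapProvider provider = PortalSiteMapProvider.WebSiteMapProvider;
        PortalWebSiteMapNode webNode =
            provider.FindSiteMapNode(web.ServerRelativeUrl) as PortalWebSiteMapNode;

        SPQuery query = new SPQuery { Query = "<OrderBy><FieldRef Name='Title'/></OrderBy>" };

        // "Announcements" is a placeholder list name.
        SiteMapNodeCollection nodes =
            provider.GetCachedListItemsByQuery(webNode, "Announcements", query, web);

        List<string> titles = new List<string>();
        foreach (SiteMapNode node in nodes)
        {
            titles.Add(node.Title);
        }
        return titles.ToArray();
    }
}
```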
The BLOB cache is configured in web.config. Put the cache on a dedicated drive, separate from the OS and the logs. Cache a maximum of files (by extending the list of extensions): the more you cache, the better. The same applies to the max-age: the longer, the better, with a trade-off regarding how often the files are updated.
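For reference, the setting lives in the web application's web.config; a hedged example where the drive, the extension list and the max-age value are only illustrative:

```xml
<!-- web.config of the web application: dedicated drive, extended extension list,
     maxSize in GB, max-age in seconds for client-side caching. -->
<BlobCache location="E:\BlobCache\14"
           path="\.(gif|jpg|jpeg|png|css|js|swf|pdf|wmv|mp3|mp4)$"
           maxSize="10"
           max-age="86400"
           enabled="true" />
```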
Using a CDN is good anyway, as it takes pressure off the servers. But it means the content sits on an external storage (think about permissions, etc.). Files that almost never change are good candidates for a CDN; a quick win is to serve jQuery or Modernizr from one.
Activate IIS static and dynamic compression.
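One way to do it, assuming IIS 7.x (dynamic compression also requires the Dynamic Content Compression module to be installed):

```xml
<!-- web.config (or applicationHost.config): enable both static and dynamic compression. -->
<system.webServer>
  <urlCompression doStaticCompression="true" doDynamicCompression="true" />
</system.webServer>
```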
Two models for authoring: in-place, where authors update content directly on the production environment, or content deployment, where authors work in a separate authoring environment and scheduled deployment jobs push the updates to production. Content deployment is good when a review has to be done (legal, communication, etc.), but it adds latency and temporary de-synchronization. The bigger the update, the longer the latency.
Content deployment with an automated 15-minute schedule has no value: you take on the latency and the complexity without getting a real review cycle in return.
Tip 1: use SQL Server 2008 Enterprise + snapshots. SharePoint takes a snapshot of the content DB before making the export and deletes it once the export is completed, so the export reads a consistent copy while the live database stays available.
Tip 2: custom solutions must be aware of content deployment, because their feature activation code runs on the target after the content DB has been exported and imported. To be aware of it, check in the activation handler whether content deployment is running, and do not create lists or other content, as they will already exist.
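A minimal sketch of the defensive pattern (the list name is a placeholder): make the activation handler idempotent so that, when it fires during a content deployment import, it does not recreate content that already came over with the export.

```csharp
using Microsoft.SharePoint;

public class ContentDeploymentAwareReceiver : SPFeatureReceiver
{
    public override void FeatureActivated(SPFeatureReceiverProperties properties)
    {
        SPWeb web = properties.Feature.Parent as SPWeb;
        if (web == null)
        {
            return;
        }

        // Idempotent provisioning: during a content deployment import the list
        // has already been deployed with the content, so do not create it again.
        // "News" is a placeholder list name.
        SPList list = web.Lists.TryGetList("News");
        if (list == null)
        {
            web.Lists.Add("News", "News items", SPListTemplateType.GenericList);
        }
    }
}
```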
Plan for variations from the start, to avoid having to copy the root content into the source label later. Consider on-demand propagation, because otherwise target label owners will have to delete the versions they don't need.