
Introducing The SharePoint Shepherd’s Guide for End Users

It is with great pleasure that I announce the availability of The SharePoint Shepherd's Guide for End Users.  The book has been a passion of mine for some time and represents a radical departure from my previous publishing experiences.  It is specifically designed to give me the most flexibility in supporting your needs.

The book itself is a collection of 116 step-by-step guides to using SharePoint.  It covers the core tasks that every user needs to be able to do.  These aren't tasks that only power users or site administrators perform; they're the everyday tasks every user faces.  From uploading a document to recovering an item from the recycle bin, there are step-by-step instructions on where to go and what to do.

The book’s companion site, www.sharepointshepherd.com, contains a complete table of contents and a sample task so you can see what tasks are covered in the book – and how they’re covered in a step-by-step manner.

Right now the book is available from Lulu.com, the print-on-demand publisher/printer I've used to produce the book.  That means books are printed after you order them, generally within 24-48 hours.  The book will eventually be available on Amazon.com and other online resellers, but because of the way the book industry works that may take as much as 8 weeks from today.  Rather than wait until it was available universally, I wanted to allow folks to order the book today.

Earlier I mentioned that the way the book was approached was specifically designed to make things flexible.  This is primarily in the area of corporate licensing.  In my conversations with my clients, with Microsoft representatives, and with other MVPs, it became apparent that there was a need for corporations to deploy step-by-step guides for their end users.  The Microsoft-provided help and learning kit just wasn't enough for most organizations.  This was the genesis of the book – to provide a solution for organizations that need to empower their users to use SharePoint.

Unfortunately, traditional publishers are simply not capable of dealing with corporate licensing deals on any kind of scale.  That meant that I had to find a printer/publisher who would allow me to retain 100% of the rights so that I could license the content to corporations as they need it.

If you want to know more about corporate licensing, contact my administrative support at [email protected].  The team will be happy to help you with the licensing options and costs, and to assist you with large-quantity orders of the hardcopy book.  The basic licensing models for the book in electronic form are PDF or Word format with varying print options and modification rights.  We should be able to accommodate any need, including custom publication requests.

SharePoint Designer and Governance

Last week at the SharePoint Information Worker Conference I gave a talk on SharePoint Designer – about when and how to use it.  After the talk one of the attendees pointed out that I neglected a fairly important topic in the presentation.  It's one that I also neglected in my article on the same topic.  I was quite disturbed: the topic is so obviously something I should have covered that I'm mystified by how I missed it.  It's not like the topic of governance is a mystery to me.  My articles (here and here) are a good checklist for the things you should consider when deploying SharePoint, but they too are markedly absent of suggestions on SPD.  So, here's a quick discussion that I typically have with my clients.

The topic, as you may have guessed from the title of this post, is SharePoint Designer governance.  I talked about what good uses for SPD are and what bad uses for SPD are – but I never came right out and made any recommendations for how to put a governance framework around SPD in your organization.  So here are my recommendations for SPD Governance:

  1. No one is allowed to make master page or page layout changes with SPD in a production area.  I’m not going to get into the customized/un-customized debate.  Put simply, if you’re talking about publishing, you need repeatability.  That is only accomplished by wrapping up SPD changes into Features and Solutions.  If you’re going to wrap the changes up into Features and Solutions do it and test the results.  Don’t try to make the changes in Dev and then promote the code that you didn’t use into QA.
  2. Discourage the use of SPD as an editor for collaboration sites.  With publishing sites it is easy to draw a hard line in the sand.  However, what about situations where it’s just a collaborative site and the site will go away in a few months?  Here you have to educate the user on the issues (for instance, no upgrades) with using SPD but ultimately allow them to do that if they believe it’s necessary.
  3. End Users don’t get to create SPD workflows in a large organization, business analysts do.  Most larger IT shops have the role of a business analyst or a liaison.  These are the folks that are tasked with truly understanding the business and helping IT understand their needs.  These folks are often burdened with the reality that there are a lot of issues that IT doesn’t have the time to fix and are closest to the users so therefore most likely to be inspired to fix the problem.  Empowering them to make small solutions for their business units is a great idea.  The problem with extending this functionality to the end users is that often the end user doesn’t know when something can be created in SPD and when it’s more appropriate to create a workflow template in Visual Studio.  There are also training issues, and potential loop issues to be considered.  In short, move this as close as you can to the users without directly giving them access.
  4. DataView web parts get created on scratch pages or sites and then get exported and reimported.  SharePoint Designer's ability to create DataView web parts is awesome.  Its ability to accidentally customize a page isn't.  By setting a policy that DataView web parts will be created and then moved to their production page homes, you can minimize the chance that someone will accidentally customize a page and cause issues for upgrades later on.

The primary challenge when looking at SPD rollouts and governance, in my experience, has been a misunderstanding of what the tool does.  It's a data view web part creator, a workflow creator, and an HTML/CSS editor.  Many folks think that it's an enhanced editor for WCM content pages (it isn't) or that SPD workflows are reusable (they're not).  Once these two misconceptions are put to bed, the perceived number of people who need SharePoint Designer drops dramatically.

SPD is a great prototyping tool and thus a great tool for business analysts.  However, it’s not a tool that should necessarily be in the hands of every user.

Garbage on a SharePoint Site’s Main Page

Occasionally I get a call from clients that the main page for a site has been "corrupted."  Upon further review I can see that it's actually a copy of a Word document – it appears they've accidentally saved a Word document as default.aspx.  Unfortunately the clients rarely have SharePoint Designer, where this is a right-click fix (Revert to Site Definition).

Fortunately, however, there's a way to fix this without SharePoint Designer.  There is a page in _layouts called reghost.aspx that allows you to reghost – or uncustomize – a page; you append _layouts/reghost.aspx to the site's URL to reach it.  For example, if you have a site at http://wss/foobarred, you would type http://wss/foobarred/_layouts/reghost.aspx.  On the page that appears you can reghost (uncustomize, revert) a single page – or all the pages in the site.  Enter the URL of the bad page, or ask for all of the pages, and hit the Reset button; the page will return to the format on the file system.
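The URL construction is trivial, but if you fix this often a tiny helper avoids typos.  A minimal Python sketch – the reghost_url helper is mine, not anything shipped with SharePoint:

```python
def reghost_url(site_url):
    """Build the reghost page URL for a site by appending
    _layouts/reghost.aspx, tolerating a trailing slash."""
    return site_url.rstrip("/") + "/_layouts/reghost.aspx"

print(reghost_url("http://wss/foobarred"))
# http://wss/foobarred/_layouts/reghost.aspx
```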

It’s a nice way to work around the cursing that happens when a page is accidentally overwritten.  (Both what appears on the screen and what happens behind the keyboard.)

STSADM strikes again, Failed to extract the cab file in the solution…

Having more fun with STSADM … I tried doing:

STSADM -o addsolution -filename XmlWebParts.wsp

and I received back this message:

Failed to extract the cab file in the solution.

When I looked … I had accidentally duplicated a line in the DDF file so a file was being included in the cab file twice with the same name — apparently SharePoint doesn’t like that.
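A quick sanity check on the DDF before building the cab can catch this class of mistake.  A hypothetical Python sketch – the helper name and sample DDF content are illustrative, and it assumes file entries are the non-directive, non-comment lines:

```python
from collections import Counter

def find_duplicate_lines(ddf_text):
    """Return file lines that appear more than once in a DDF,
    ignoring blanks and .Set/.OPTION directives (which may repeat)."""
    lines = [ln.strip() for ln in ddf_text.splitlines()]
    files = [ln for ln in lines if ln and not ln.startswith((".", ";"))]
    return [ln for ln, n in Counter(files).items() if n > 1]

ddf = """\
.OPTION EXPLICIT
.Set CabinetNameTemplate=XmlWebParts.wsp
manifest.xml
XmlWebPart.dll
manifest.xml
"""
print(find_duplicate_lines(ddf))  # ['manifest.xml']
```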

And Now for Something Completely Different – How capitalization affects STSADM -o addsolution

Earlier this evening I was working on a project and put together a quick batch file to uninstall and reinstall a solution.  I issued the following command (in a batch file):

STSADM -o addsolution -filename XmlWebParts.WSP

and I got back in response:

“xmlwebparts.wsp” has an unsupported extension, and cannot be added to the solution store.

Having never seen this particular error from STSADM, I was intrigued.  I decided to try the following command:

STSADM -o addsolution -filename XmlWebParts.wsp

and received:

Operation completed successfully.

Say what?  It turns out that for some reason STSADM -o addsolution wants the .wsp extension in lower case.
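If you're scripting deployments, it may be worth normalizing the extension before handing the name to STSADM.  A hedged Python sketch – the helper name is mine, and note that the file on disk would need the lower-case name as well:

```python
import os

def normalize_wsp_name(filename):
    """STSADM -o addsolution rejects an upper-case .WSP extension,
    so force the extension (and only the extension) to lower case."""
    root, ext = os.path.splitext(filename)
    return root + ext.lower()

# Build the command line with the normalized name.
cmd = ["STSADM", "-o", "addsolution", "-filename",
       normalize_wsp_name("XmlWebParts.WSP")]
print(" ".join(cmd))
# STSADM -o addsolution -filename XmlWebParts.wsp
```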

[ASP.NET] Provider Logging Project

Preparing for conferences is fun.  It forces me to take concepts I've used in developing client applications and package them in ways that other people can use.  That's why I'm happy to announce the availability of the Provider Logging project on CodePlex.  The Provider Logging project is a set of providers for ASP.NET.  These providers encapsulate another provider so that you can monitor the interaction between ASP.NET and your provider – or one of the out-of-the-box providers.

The Provider Logging project currently includes logging providers for Membership, Roles, and Site Map.  The intent is that logging providers for the other ASP.NET provider model options will be written too (I'm calling for volunteers).  All of the logging providers write their output via System.Diagnostics.Trace.  From there you can configure the trace output with one of the built-in listeners.  There's a sample ASP.NET web site included in the source code with a web.config configured for the file logger.  Because they use System.Diagnostics.Trace, you can choose where to log their output and can even filter it to get only the events you're interested in.
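The real providers are C#/ASP.NET, but the core idea is simple delegation plus tracing.  Here's a rough Python sketch of the wrapping pattern with stand-in classes – none of these names are the project's actual API:

```python
import logging

class LoggingMembershipProvider:
    """Stand-in for the wrapping pattern: delegate every call to an
    inner provider and log the call and its result on the way through."""
    def __init__(self, inner, log=None):
        self.inner = inner
        self.log = log or logging.getLogger("provider")

    def validate_user(self, name, password):
        self.log.info("ValidateUser(%s) called", name)
        result = self.inner.validate_user(name, password)
        self.log.info("ValidateUser(%s) returned %s", name, result)
        return result

class FakeProvider:
    """Toy inner provider standing in for a real membership provider."""
    def validate_user(self, name, password):
        return password == "secret"

wrapped = LoggingMembershipProvider(FakeProvider())
print(wrapped.validate_user("alice", "secret"))  # True
```

The caller sees exactly the inner provider's behavior; only the trace output is added, which is what makes the technique safe to drop in front of an existing provider.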

One might wonder why the Provider Logging project was necessary.  I felt it was, for two key reasons:

  1. A training tool – There’s nothing like seeing how the interaction really works to help you build better providers.  The provider model and how the calls are made are documented but the normal sequences of events are not documented very well, so with the Provider Logging providers you can see the sequences that happen.
  2. A diagnostic tool – ASP.NET isn’t the best about explaining why it took an action or didn’t take an action based on the response of the provider.  It blindly carries on.  However, if you’re not getting the behavior you want you are left guessing and without much hope of figuring out what’s going on.  This is particularly true in tools like Microsoft SharePoint that leverage ASP.NET.  (In fact, all of the providers in the initial release were written to debug problems with various providers as they were used in SharePoint.)

The Provider Logging project represents the core of the custom authentication presentations that I'm giving at the Office Developers Conference and SharePoint Connections.  We'll be tearing apart what happens when authentication providers are called by the ASP.NET framework – and what happens when they're called in SharePoint.  If you need to write a custom authentication provider and can't wait for those presentations, check out the Provider Logging project.  It's not a replacement for attending my custom authentication sessions, but it's a step in the right direction.

Adventures in Migrating Content from Lotus Notes and SharePoint Backwards Compatible Document Libraries to WSSv3

I recently completed a project where I migrated Lotus Notes databases and SharePoint Portal Server 2003 Backwards Compatible Document Libraries (SPS2003BCDL) to WSSv3.  I learned a lot through the process and wanted to share how the experience worked and where I ended up.  For those who aren't familiar, SharePoint Portal Server 2001 (SPS2001) used a document library based on the Exchange Web Storage System, sometimes called WSS – an abbreviation that became confusing when Windows SharePoint Services became the new name for SharePoint Team Services.  The Web Storage System had its issues, but it did have some great features.  Being based on the Exchange engine, it supported security, versioning, and user-specified document metadata.  What WSSv3 calls Content Types the Web Storage System called document profiles.

The problem with the Web Storage System was that it didn't perform well and didn't scale well.  As a result, in SharePoint Portal Server 2003 Microsoft shifted to a SQL-based storage engine.  However, in the process they lost per-item security and the ability to have any sort of profile for documents.  Because of the substantial feature removals in going to SQL, Microsoft made the Web Storage System available as the "Backwards Compatible Document Libraries" and provided a set of migration tools from the BCDL to the new document libraries, but the limitations and the lack of profiles stopped many customers, including mine, from migrating their documents over.

So once WSSv3 was released with support for content types it was possible to migrate from the web storage system to WSSv3.  Thus the impetus for the migration project I just completed.

We tackled the migration as two separate pieces.

Notes Conversion

First up was the Lotus Notes migration.  There were five databases to be moved, two of which were really the same database with different statuses of the data.  They got collapsed into one database with an extra status field, because moving items from list to list isn't as easy in SharePoint as moving from database to database is in Notes.  Although I outsourced the actual migration, the tool used was Proposion.  Some interesting issues arose that I think everyone doing a migration should consider.

  1. Make sure that the vendor you select to do the migration understands content types.  Content types are important when you consider how you’re going to find data in the long run.  The vendor I chose wasn’t initially familiar with content types.  You can read the whitepaper I wrote, Managing Enterprise Metadata with Content Types, if you want to know more about them and get a sense for how to use them.
  2. Make sure the vendor you select understands how to use features and solutions to deploy and provision content types.  If you just go into the user interface and create a content type, you'll find it difficult to move it between development and production.  If you create a feature that defines the content types, all you have to do is test the feature in the dev environment and then deploy it into the production environment.  Why a SharePoint solution?  That's the repeatable way of deploying a feature.  The vendor I selected didn't understand this either.
  3. Be clear about what your expectations are.  One of the databases was a collection of inspection forms.  I requested that the Notes database be moved into a document with properties/quick parts so that those properties would expose themselves to SharePoint.  The idea was that it would be a real, honest-to-goodness form that was searchable via SharePoint (just like the one I built in Managing Enterprise Metadata with Content Types), but in the end the vendor couldn't figure it out, and I decided it wasn't worth pushing.  Apparently all the conversion tools convert from Notes databases into list items.
  4. One tricky bit about this particular migration was that one of the Notes databases had links between the documents.  We ended up doing a two-pass migration where we converted the links to SharePoint links once we knew what the IDs of the SharePoint records were.
  5. Consider what the final appearance should look like.  Ultimately I created a set of web parts to get the XML of an item and then transform it with XSLT.  One would think there would be a way to do that with out-of-the-box web parts; however, the XML web part doesn't support data connections or parameter substitution, so it wouldn't work.  Instead I created a web part that emitted the XML for the record requested in the URL via a web part connection.  I connected that to the connectable XML web part I wrote and everything fit together.

All in all the conversion was successful, however, it was certainly a lot more painful than it should have been.  Hopefully the next one won’t be so hard, particularly since I’ve got a lot of the tools built for it already.

SharePoint Conversion

When we finished the Notes conversion I thought, great – the hard part is behind us; what's left is just time consuming.  The library we were moving was 22GB in the Web Storage System (about 17GB on disk).  I thought it would be easy: I'd export the library using the open source SPMigration tool kit that Kimmo Forss built and made available via CodePlex.  There were some packaging issues which wouldn't allow the installer to run, so it took several attempts before I had a version of the tool that would run at all.

So I started running the tool against the datastore in a Virtual PC environment and it ran for more than 48 hours … so I gave up.  I bought a server to loan to the client for the migration.  That server was VERY fast, and did manage to create an export in 24 hours or so, but it clearly wasn't exporting everything.  Try after try led me from one error to another until finally I had to give up.  I found out from Kimmo that the dataset I was migrating was the largest single-shot migration the tools had ever been tried on, so far as he knew.

At this point I decided to evaluate Tzunami Deployer.  Without getting into too many specifics, I can say that its export tool for SharePoint 2001 (SPS2003BCDL) just didn't export all of the data I had.  I was instructed to do smaller migrations, but this wasn't feasible given the structure of the data and my project constraints.

I took the code that Kimmo had written, copied out everything I had to have to make the system function, and managed to get a prototype exporter created.  However, there were still folders where the export utility was taking too long and eventually timed out.  After some work I was able to adapt the structure to ask the Web Storage System to just give me the URLs for all of the items in the folder.  From there I could make individual requests to get the metadata for each item.  One would think that this would be much slower; in the end it was radically faster – somewhere between 6x and 12x.  Apparently the Web Storage System doesn't like figuring out which properties to return dynamically, so it takes a long time.  Ultimately the new server I loaned the client did the entire migration of the 17/22GB in a little more than 2 hours.
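The shift from "give me every property of every item in the folder" to "give me URLs, then one item at a time" is the whole trick.  A Python sketch of the two-pass pattern with a stand-in store object (not the real Web Storage System API; all names here are illustrative):

```python
def export_folder(store, folder):
    """Two-pass export: first ask the store only for item URLs
    (cheap for it to answer), then fetch each item's metadata
    with an individual, fully specified request."""
    urls = store.list_item_urls(folder)                    # pass 1
    return {url: store.get_metadata(url) for url in urls}  # pass 2

class FakeStore:
    """Toy stand-in for the document store being exported."""
    def list_item_urls(self, folder):
        return [f"{folder}/doc{i}.doc" for i in range(3)]
    def get_metadata(self, url):
        return {"url": url, "profile": "Standard Document"}

exported = export_folder(FakeStore(), "http://portal/lib")
print(len(exported))  # 3
```

The per-item requests look like more round trips, but because each one names exactly what it wants, the store never has to compute a dynamic property set for the whole folder – which is where the original approach spent its time.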

I was originally going to help Kimmo update the tools by migrating the changes into the core code but ultimately decided that I had changed the structure enough that the changes would cause architectural ripples in the export program that I didn’t have time to retrofit.  Kimmo added it to his work list since he now knew why the program took so long to export.

With my new export in hand I wrote an import tool to process the metadata files and upload them to SharePoint.  That led me to defects in the SPFileCollection.Add method.  Microsoft has documented that it doesn't work correctly, but the core API issue remains.  In the end, however, the import tool works: it imported versions, metadata, and document profiles.

Just in case anyone else is having the same problems: I'm available to help with migration projects from the Web Storage System to WSSv3 and MOSS.  I've got a set of import tools that work – and are quick.  (They were always designed to be flexible about the document profiles/content types they work with.)  We ended up doing our migration during an evening – not even over a weekend.  I'd probably recommend a weekend migration, but ours worked out just fine.

Come Play “Where’s Rob” at SharePoint Events

It will be a busy next few months for me.  I've got a lot of great clients I'm working for, but it's not the client work that is going to keep me busy.  I've got a pretty jam-packed conference schedule.  I'm sharing it here just in case you want to play "Where's Rob?" (see Where's Waldo).  Here's the rundown of my Spring:

February 1st-2nd: Sleepless in Chicago (Trainer/Presenter/Judge? TBD)

February 4th-6th: SharePoint Information Worker Conference 2008
  - SharePoint Designer: When should you use it and how?
  - Connecting Metadata in Office and SharePoint

February 10th-13th: Office Developers Conference 2008
  - SharePoint Search and Office
  - Custom Authentication for SharePoint

March 3rd-6th: Microsoft Office SharePoint Conference 2008
  - [Tentative] SharePoint for the Developer and ITPro

April 20th-23rd: SharePoint Connections 2008 Spring
  - Workshop: SharePoint Workflows
  - Connect SharePoint Search and Office
  - Quick Integration from SharePoint to your Application
  - Custom Authentication for SharePoint

All of these events will be delivering a great set of content.  It's an amazing thing to see how the amount and depth of the content for SharePoint has grown over the last year.  I hope to see you at one of these events.

Import Profiles Only for Active Users

While working with a client recently, we noticed that they were still seeing disabled accounts in the people search results.  That is, generally speaking, bad.  But it's actually pretty easy to fix with a tweak to the LDAP query used to generate the profiles.  First we have to get there, so go to…

  1. Central Administration
  2. Shared Service Provider (the one that hosts user profiles)
  3. User profiles and properties
  4. View import connections
  5. Hover over the name of the connection you want to change and click Edit

There’s an option in the Search Settings section titled user filter that probably has in it:

(&(objectCategory=Person)(objectClass=User))

What we want is that plus a clause that says "not account disabled."  It happens that account-disabled is one bit of the userAccountControl bitmapped field in AD – which means it's not simple to determine whether the bit is set.  However, it's possible.  There is a TechNet "Hey, Scripting Guy!" article which answers the question "How Can I Get a List of All the Disabled User Accounts in Active Directory?"  It turns out the post has in it the magic key we need.

(userAccountControl:1.2.840.113556.1.4.803:=2)

By the way, the funny number in the middle of that statement is just telling LDAP to use a bitwise AND, so this clause matches only accounts where the disabled bit is set.  Since we want the reverse, we wrap it in a not and add it to our original query, which gives us a query that looks like this:

(&(objectCategory=Person)(objectclass=user)(!(userAccountControl:1.2.840.113556.1.4.803:=2)))
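You can convince yourself the filter does what we want by replaying the bitwise test in code.  A small Python illustration with made-up account data (the userAccountControl values follow AD's convention, where 0x2 is the ACCOUNTDISABLE bit):

```python
ACCOUNTDISABLE = 0x2  # the bit the LDAP matching rule 1.2.840.113556.1.4.803 tests

accounts = [
    {"name": "alice", "userAccountControl": 512},  # normal, enabled account
    {"name": "bob",   "userAccountControl": 514},  # 512 | 2 = disabled
]

# Equivalent of (!(userAccountControl:1.2.840.113556.1.4.803:=2)):
# keep only accounts where the disabled bit is NOT set.
active = [a["name"] for a in accounts
          if not (a["userAccountControl"] & ACCOUNTDISABLE)]
print(active)  # ['alice']
```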

Immediately after doing this and running a profile import, you might expect the disabled users to be gone.  Unfortunately, no.  But that's an artifact of search.

Search doesn’t remove an entry until the entry has been missing for three full imports in a row.  The thinking is that a site might be temporarily offline during the index and it would be bad to remove it from the index just for a bit of bad timing.  So if you want to delete the user from the search results do three full imports and the users should disappear.

Article: Managing the SharePoint Learning Curve

The core reason for pursuing development in Microsoft SharePoint at all is the ability to maximize productivity. While it may be fun to work on the latest and greatest technology, most development is done for a business purpose. Because development is done for business, the most important reason to work with SharePoint is improving productivity. Productivity might be viewed from the perspective of the time saved by not having to develop features that users may want but which are difficult to implement.

Productivity might also be viewed from the perspective of flexibility to adapt to changing requirements in the future or reusability of the software that is developed between different parts of the solution. No matter which way you look at the problem there are some distinct advantages to developing with SharePoint. However, on the other side of the fence there are some barriers to productivity when developing with SharePoint, not the least of these is the learning curve that every architect and developer must overcome when building SharePoint-based solutions. In this article, we’ll look at this learning curve and how to make it easier.

http://www.intranetjournal.com/articles/200801/ij_01_02_08a.html
