Vista on an old HP Laptop

OK… don’t ask why I would do this, but I decided to install Vista Ultimate on an older (3+ year old) HP laptop. An HP Pavilion zx5180us to be exact. The install actually went very smoothly. Once I plugged it into a wired network connection, Windows Update managed to find all of my devices, including my wireless card, and installed them perfectly. Even the video driver was updated (no Aero, however… the laptop just isn’t equipped for that). The only problem I ran into was the lack of audio. The driver Windows Update kept sending down (Realtek AC’97) would fail every time. 🙁
 
After spending a few hours messing with this I finally figured it out. I used the hardware IDs to track down a set of drivers from ATI called the ATI Chipset / Southbridge drivers. So far the only place I have found them for download is at http://members.driverguide.com/driver/detail.php?driverid=648316&action=winfo. It sucks because you either have to pay for the driver download or click through a series of ads. Anyway, once I received the file I unzipped it onto my hard drive. I tried to run the setup program but just ended up getting error messages and being told I needed to reboot. So I did just that: rebooted the laptop. Of course, on boot up Vista wanted to help and said hey… I found new hardware… let me install the AC’97 driver for you. That failed, as expected. I opened up Device Manager, selected the AC’97 device and then selected Update Driver.
 
I selected to browse my computer for driver software, followed by the option to let me pick from a list of device drivers on my computer. This then allowed me to choose Have Disk. I navigated to the recently downloaded and unzipped ATI files (the folder was C:\Chipset\ATI Southbridge\Audio) and selected the only INF file shown. A warning was shown about the driver not being signed… since I was desperate I went ahead full steam and instructed the system to install the driver anyway. A few minutes later I had audio!
 
WOOT! Who said you can’t teach an old laptop new tricks?
 
Now everything seems to be working well… the last step is to reinstall Office and get my email working again.

ObjectDataSource from a DataTable

I recently ran into a situation where I needed an ObjectDataSource in order to use the sort and filter capabilities of the SharePoint SPGridView control; however, my data was already in a DataTable.

I found out that I could use the ObjectCreating event of the ObjectDataSource to point the control at an existing data object, such as a DataTable. I was pretty excited to see that I would be able to take the existing DataTable and expose it as an ObjectDataSource. That excitement was short lived as I continued to have problems getting it to work with the SPGridView control. 🙁

The issue is that the ObjectDataSource was configured to call the Select method of the DataTable whenever the SPGridView control requested data. This caused the ObjectDataSource to return a DataRow array, which the SPGridView did not understand. I needed the ObjectDataSource to return the data as a complete DataTable, not a DataRow array.

The solution that I arrived at was to create a very simple wrapper class around my DataTable object that returns the original DataTable when a method is called on the wrapper class. Once implemented, I had a perfectly rendering SPGridView with both filtering and sorting enabled. 🙂

Below is an example of the wrapper class:

// Simple wrapper that hands back the original DataTable when its
// GetTable method is called, giving the ObjectDataSource a SelectMethod
// that returns a complete DataTable.
public class DataTableWrapper
{
    private DataTable _dt = new DataTable();

    public DataTableWrapper(DataTable dt)
    {
        _dt = dt;
    }

    public DataTable GetTable()
    {
        return _dt;
    }
}

Here is an example of connecting this up to the ObjectDataSource:

ObjectDataSource ds = new ObjectDataSource();
ds.ID = "myDataSource";
ds.TypeName = "[…strong name of your DataTableWrapper class here]";
ds.SelectMethod = "GetTable";
ds.ObjectCreating += new ObjectDataSourceObjectEventHandler(ds_ObjectCreating);

And here is the ObjectDataSource ObjectCreating event handler:

void ds_ObjectCreating(object sender, ObjectDataSourceEventArgs e)
{
    // Hand the ObjectDataSource the wrapper built around the existing DataTable.
    myDataTable = new DataTableWrapper(sourceDataTable);
    e.ObjectInstance = myDataTable;
}

Things to note about the above samples:

  • The original data source, sourceDataTable, was defined with class scope so that the ds_ObjectCreating method can access it.
  • myDataTable was also defined with class scope.
  • The TypeName of the ObjectDataSource must be the fully qualified type name of the DataTableWrapper class.
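
To round this out, here is a sketch of how the pieces might be wired up inside a web part’s CreateChildControls. The web part context and the "Title" column are my assumptions for illustration; they were not part of the original code:

protected override void CreateChildControls()
{
    // Data source configured exactly as shown above.
    ObjectDataSource ds = new ObjectDataSource();
    ds.ID = "myDataSource";
    ds.TypeName = "[…strong name of your DataTableWrapper class here]";
    ds.SelectMethod = "GetTable";
    ds.ObjectCreating += new ObjectDataSourceObjectEventHandler(ds_ObjectCreating);
    Controls.Add(ds);

    // SPGridView requires explicit columns; auto-generated columns are not supported.
    SPGridView grid = new SPGridView();
    grid.AutoGenerateColumns = false;
    SPBoundField titleField = new SPBoundField();
    titleField.DataField = "Title";        // assumes a "Title" column in the DataTable
    titleField.HeaderText = "Title";
    titleField.SortExpression = "Title";
    grid.Columns.Add(titleField);
    grid.AllowSorting = true;
    grid.DataSourceID = ds.ID;             // connect the grid to the data source
    Controls.Add(grid);
}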

Maybe there is a simpler way of doing this.   I would love feedback if you have any suggestions.

STSADM import / export cont.

A few more lessons learned while using the STSADM export and import commands:

 

1. Alerts will not follow the site’s lists and documents during the export process.

2. Email-enabled lists can be an issue when importing if the destination server is not properly configured for incoming email.

3. Any imported task or issues list that emails users when they are assigned an item will no longer send those emails. You need to disable and then re-enable the functionality in the list’s settings (see the sketch after this list).
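
For item 3, here is a minimal sketch of what that disable/re-enable toggle can look like through the object model, in case you want to script it. The URL and list name are placeholders:

using (SPSite site = new SPSite("http://destination/sites/team"))   // placeholder URL
using (SPWeb web = site.OpenWeb())
{
    SPList tasks = web.Lists["Tasks"];     // placeholder list name
    tasks.EnableAssignToEmail = false;     // disable the assignment emails...
    tasks.Update();
    tasks.EnableAssignToEmail = true;      // ...then re-enable them
    tasks.Update();
}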

We are also using the Microsoft IT Site Delete Capture feature found at http://codeplex.com/governance. It appears that this will not protect imported sites unless you disable and then re-enable the feature on the web. I have not had time to dig into this specific topic, but I do have firsthand knowledge of an imported site not being caught by the IT Site Delete Capture feature when the site was deleted.

I am also guessing that references to custom event handlers will not be imported either, but I have not had a chance to verify this.

Overall, the stsadm export and import commands can be very useful for moving sites around, as long as you remain aware of the limitations.

Sogeti to offer SharePoint event

On March 20, Sogeti will offer a course on how to successfully deploy Microsoft Office SharePoint Server 2007. The presentation will include information on planning, governance, information architecture, and physical architecture. The course is intended for technology executives, business executives, vice presidents, CIOs, IT directors, architects, and more.
 
The event will be held at Microsoft Corp.’s Southfield office, 1000 Town Center, Suite 1930, from 8:30 a.m. to noon.
 

Heroes Happen Here Launch Event

I will be at the Sogeti exhibitor booth for the Microsoft "Heroes Happen Here" launch event in Detroit, Michigan.    The event is March 18th at the Detroit Marriott in the Renaissance Center.
 
Register here and attend to take home a free copy of Windows Server 2008, Microsoft SQL Server 2008 and Microsoft Visual Studio 2008!
 
Stop by the Sogeti booth and register for your chance to win a Microsoft Zune!   Hope to see you at the launch event.
 
Update: This event has reached maximum registrations.  

“Source” is your friend

The content query web part in SharePoint 2007 is a great tool for rolling up information from subsites into a top-level site. However, sometimes the default look, feel, and functionality just don’t fit the requirements.
 
There is a great MSDN article on how to modify the XSL used by the content query web part to change the way it displays information. In my case, however, it wasn’t the look that I needed to modify; it was the functionality.
 
I used the content query web part to display a list of announcements from all of my sub-sites. I set up the web part to display the announcements in a bulleted list, grouped by site. This simple configuration met my initial requirements. There was one navigation issue I noticed right away, though. If a user clicked an announcement title in the content query web part, they were taken to the view form for the announcement, which contains a Close button. When they clicked the Close button they were returned to the list where the announcement is stored, not back to the original top-level site. This simple navigation issue caused all sorts of confusion for the users of the system. They expected the Close button to take them back to where they had started.
 
This was actually a simple issue to resolve. I used the methods outlined in the MSDN article to create a new section in the XSL for my new style template, basing it on the existing bulleted style. I then made one simple modification to the anchor tag so that the destination URL carries a Source query string parameter specifying where the user came from when clicking the link. In this case it is /departments/it/default.aspx.
 
            <a href="{$SafeLinkUrl}&amp;source=/departments/IT/default.aspx" target="{$LinkTarget}" title="{@LinkToolTip}">
 
I saved the updated XSL and checked it into SharePoint. It was then just a simple process of modifying the settings of the content query web part to use the new template. Now when a user clicks on one of the links they can return to where they started by clicking the Close button.
 
There might be a way to dynamically set the source instead of the static method I used here.   However, this solution met my immediate needs.
 
 

STSADM Import Error – FatalError: The given key was not present in the dictionary.

On my journey to migrate a large amount of content from one SharePoint 2007 server to another I have run across several strange error messages during the import process. I have been documenting them here in this blog in an attempt to help others and also to serve as reference material for myself in the future.
 
With my current project I have been able to import all of the content into the new server except for one single site. I am getting the message "FatalError: The given key was not present in the dictionary" during the import process. The error appears after a message indicating that the import is attempting to bring roles into the system, so I am guessing that someplace on the original SharePoint server I have some sort of invalid security identifier, group, or role.
 
I am going to keep on digging into this and will update this post once I figure out some solution.
 
Update: It looks like there was a corrupt security setting on a document library. I reset the library to inherit permissions from its parent and then the export / import process worked. The good news is that the custom security was not really needed on that library anyway, so this doesn’t cause any problems. I wish I had a bit more time to find exactly which user / role was causing the problem.
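
For anyone hitting the same error, a quick way to hunt for candidates is to walk the site and log every list with unique (non-inherited) permissions. A rough sketch, with a placeholder URL and no error handling:

using (SPSite site = new SPSite("http://source/sites/problemsite"))   // placeholder URL
{
    foreach (SPWeb web in site.AllWebs)
    {
        foreach (SPList list in web.Lists)
        {
            // Lists and libraries with their own permissions are the suspects.
            if (list.HasUniqueRoleAssignments)
            {
                Console.WriteLine("{0} -> {1}", web.Url, list.Title);
            }
        }
        web.Dispose();
    }
}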

More SharePoint Site Export/Import Fun

I am slowly making more progress in my quest to export sites from an Enterprise SharePoint server and import them back into a Standard SharePoint server. As you saw in my last post, I ran into a problem caused by email-enabled lists. Since then I have written several small console utilities to help track down problems in the original sites I am exporting.
One of the tools I created helps me hunt down all of the lists that are email enabled. The utility walks the site hierarchy and looks for any list that has an assigned email address. If one is found it writes out a comma-separated line that includes the list name, the list’s default URL, and the email alias used for incoming mail. This allows me to quickly log all of the information so I can use it later to re-enable the incoming mail feature on the new SharePoint server. Once the utility gave me the list of email-enabled lists, I could quickly disable them temporarily for the export.
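
The core of that utility boiled down to something like the sketch below. The site collection URL is passed on the command line, and error handling is omitted:

using System;
using Microsoft.SharePoint;

class FindMailEnabledLists
{
    static void Main(string[] args)
    {
        using (SPSite site = new SPSite(args[0]))   // pass the site collection URL
        {
            foreach (SPWeb web in site.AllWebs)
            {
                foreach (SPList list in web.Lists)
                {
                    // A non-empty EmailAlias means incoming email is enabled.
                    if (!string.IsNullOrEmpty(list.EmailAlias))
                    {
                        // CSV: list name, default URL, incoming email alias
                        Console.WriteLine("{0},{1},{2}",
                            list.Title, list.DefaultViewUrl, list.EmailAlias);
                    }
                }
                web.Dispose();
            }
        }
    }
}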
Another tool that came in very handy was one that searches for Enterprise SharePoint features that are not supported on the Standard edition of SharePoint. Like the last tool, I run it at the command line and receive a list of all the sites I need to check for Enterprise features. This saved a ton of time. I checked each site and noticed that 99% of them never really used the Enterprise features; they just had them enabled. So a simple process of disabling the Enterprise features on these sites took care of that problem. I did have one site that was using Excel Services, but after speaking with the site owner we determined it really wasn’t needed. Now all of the sites should import cleanly into the Standard version of SharePoint.
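
The feature-hunting tool used the same walk over the webs; the per-web check looked roughly like this. On my farm the Enterprise features showed up with "Premium" in their definition names (PremiumWeb at web scope; check site.Features for PremiumSite at site collection scope), but verify the names on your own farm:

// Inside the same loop over site.AllWebs:
foreach (SPFeature feature in web.Features)
{
    if (feature.Definition != null &&
        feature.Definition.DisplayName.StartsWith("Premium"))
    {
        Console.WriteLine("{0},{1}", web.Url, feature.Definition.DisplayName);
    }
}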
One last issue I had was an error during import indicating that a required template was missing on the new SharePoint server. I wrote another command-line utility that searched the site hierarchy looking for any sites based on this rogue template. After running the tool I found that one site was based on it… and after reviewing that site it was determined to be just a leftover test that one of the Windows administrators had been playing with. So our solution was simply to delete that site.
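
The template check was the simplest of the three; inside the same loop over webs it was essentially the following, with the template name as a placeholder:

// Flag webs built from the offending template ("ROGUETEMPLATE" is a placeholder).
if (web.WebTemplate == "ROGUETEMPLATE")
{
    Console.WriteLine("Found: {0}", web.Url);
}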
After spending a few hours writing simple utilities and doing a lot of cleanup on the original SharePoint server, I was able to successfully export and then import all of the sites into the new server. Chalk up one small victory.
Next steps will be to review the export and import logs for any outstanding issues that may have cropped up. I think I may have a problem with importing survey content. I noticed a message in the import log that said something like "the user can only answer the survey once". I am guessing there is some other setting I will need to modify on the original SharePoint server prior to the export that may resolve that issue, but more research and digging is needed before I take any action.
Once all of the issues are resolved I will be ready to do this whole process for real over a weekend. When that occurs I will move all of the content and then make some DNS changes so all of our users will see the new SharePoint server the next time they log in. I am hoping that all of this pre-work, testing, and trial runs pay off and I have a smooth cutover. Wish me luck!

STSADM Import – FatalError

Recently I was tasked with migrating a significant amount of content from a SharePoint 2007 Enterprise server to a SharePoint 2007 Standard server. I decided the simplest way would be to use the STSADM -export command on the source server and then import the content on the destination server using the STSADM -import command. It sounded simple, but in practice it became very difficult.
The first issue I encountered was related to the different features enabled on the Enterprise server vs. the Standard server. I disabled all of the Enterprise features (that I could find) on the source server and then proceeded with the export. During the import process I kept getting errors indicating that some of the features were not found on the destination server; although I thought I had disabled all of the Enterprise features, the export still contained some references. I re-ran the export, this time using the -nofilecompression parameter, which creates a directory with all of the export files. In this directory I found a file called Requirements.xml, which contains a list of all of the templates and features required by the sites and content exported from the source server. There were still references to Enterprise features in this file even though I had already disabled them on the server. To resolve this I manually removed the XML nodes for the Enterprise features that were blocking the import. When I ran the import on the destination server I no longer received errors about missing features or templates.
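
For reference, the uncompressed export, the hand edit, and the import looked roughly like this (server URLs and paths are placeholders):

stsadm -o export -url http://source/sites/mysite -filename C:\exports\mysite -nofilecompression
(edit C:\exports\mysite\Requirements.xml and remove the Enterprise feature nodes)
stsadm -o import -url http://destination/sites/mysite -filename C:\exports\mysite -nofilecompression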
The next issue I encountered was the following error during the import process:
FatalError: Error in the application.
at Microsoft.SharePoint.SPList.UpdateDirectoryManagementService(String oldAlias, String newAlias)
This error appeared on a calendar in one of the sites being exported. After a bunch of digging, testing, and trial and error I found that the error was being thrown on lists or libraries that are email enabled. I disabled the incoming email functionality on these lists on the source server and re-ran the export. When I imported the content on the destination server I no longer received the FatalError message.
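
Disabling incoming email on a list can also be done through the object model. A minimal sketch, with placeholder names, that records the alias first so it can be restored later:

using (SPSite site = new SPSite("http://source/sites/team"))   // placeholder URL
using (SPWeb web = site.OpenWeb())
{
    SPList list = web.Lists["Team Calendar"];                  // placeholder list name
    Console.WriteLine("Old alias: {0}", list.EmailAlias);      // note it down first
    list.EmailAlias = null;   // removes the alias, disabling incoming email
    list.Update();
}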
To make sure everything on the destination server is configured properly I noted down the lists and incoming email addresses that were on the source server so I could reconfigure the destination server with the same settings.
I hope this helps someone else who runs into similar export/import problems.

SharePoint 2007 “Warm Up”

I was running into an issue where the initial connection to a SharePoint 2007 server was taking between 20 and 40 seconds. Once the first page of the SharePoint site loaded, the server had a very acceptable response speed. So what was causing that initial delay each morning? The answer is actually quite simple if you understand the basics of ASP.NET. The first time a user visits an ASP.NET application, such as SharePoint, the framework performs some compiling and caching on the server. This compiling and caching takes a bit of time, and once it is completed the application responds much more quickly.

The compiling and caching always occur after an IISRESET is issued or the application pool for an application is recycled. In the case of my SharePoint installation, the application pool recycles every morning at around 1 a.m. This means the first person to hit the SharePoint site in the morning always experiences what they perceive as poor performance of the server. After that initial page hit the system performs at a very acceptable speed. I did a few searches on the internet to see how others were resolving this issue. One solution I saw was to schedule a "warm up" script to automatically do an HTTP GET request against a couple of pages on the SharePoint site. This looked very promising; however, I was soon disappointed when the script appeared to have no impact on the performance of my server. I soon figured out the reason: the script was doing an anonymous HTTP GET request and my SharePoint site required user credentials to log in. This gave me an idea… how about creating a simple .NET console application that can perform an HTTP GET request using credentials?
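
The heart of such a utility is only a few lines. Here is a simplified sketch of an authenticated GET; the real utility adds URL registration and encrypted password storage on top of this:

using System;
using System.Net;

class HttpGetWithCredentials
{
    static void Main(string[] args)
    {
        // args: URL, username, password
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(args[0]);
        request.Credentials = new NetworkCredential(args[1], args[2]);  // not anonymous
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            Console.WriteLine("Warmed up {0}: {1}", args[0], response.StatusCode);
        }
    }
}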

So I set off to create the utility. The initial version was quite basic. It accepted three parameters: a URL, a username, and a password. The utility would make a GET request to the URL using the provided credentials. This was a good start, but I was a bit worried about security. I really didn’t want a username and password sitting visible in a "warm up" script on the server. I went back and modified the utility so that a user could register a URL with its username and password. This information is stored in an XML configuration file with the password encrypted. Now the utility can run and perform an HTTP GET against the SharePoint site without exposing the password in plain text on the server’s file system. Very cool stuff.
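
For illustration, here is a minimal sketch of one way to get this kind of server-locked encryption, using the .NET ProtectedData class (machine-scoped DPAPI, in System.Security.dll). Treat it as an example of the idea rather than the utility’s exact code:

using System;
using System.Security.Cryptography;
using System.Text;

// Machine scope means the ciphertext can only be decrypted on the same server.
byte[] cipher = ProtectedData.Protect(
    Encoding.UTF8.GetBytes("secretPassword"), null, DataProtectionScope.LocalMachine);
string stored = Convert.ToBase64String(cipher);   // this is what goes in the XML file

byte[] plain = ProtectedData.Unprotect(
    Convert.FromBase64String(stored), null, DataProtectionScope.LocalMachine);
string password = Encoding.UTF8.GetString(plain);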

I took the utility, registered my SharePoint server URLs and passwords, and then created a simple batch file to call the utility. This batch file was then scheduled to run on the server each morning. Since implementing this script and utility, the users have been very happy with the performance of SharePoint.

I have zipped up the utility I created and have placed it on my server for download. Feel free to download and use the utility in your environment.

A note about security: although the password is encrypted in the XML file it is still not 100% secure. You should apply appropriate permissions on the directory where this utility resides to prevent non-administrators from gaining access. You assume all responsibility for any security vulnerabilities, problems, or damages caused by using the utility.

So how do you use this utility? Unzip it into a directory, open up a command prompt, switch to the directory where the utility resides and then execute the command httpgetwithpwd. This will provide you with an overview of the options. Most likely the first thing you will want to do is to register a URL with its username and password. You do this with the following command:

httpgetwithpwd [URL] [username] [password]

Replace [URL], [username], and [password] with the appropriate values.

Now, to have the utility do an HTTP request, just call the utility with the URL. For example:

httpgetwithpwd http://mySharePoint.com

This will do an HTTP get for the specified URL using the stored username and password.

You may be wondering where the URLs, usernames, and passwords are stored when you register them. An XML file is created in the same directory as the utility that contains this information. You can open the file and see that the password is encrypted.

It may be tempting to register several URLs and then copy the configuration XML file to another server to use with the httpgetwithpwd utility. Be aware that this will not work. The encryption locks the configuration file to the server it was created on; using the configuration file on another server will result in an invalid password being passed during the HTTP GET.

This utility was created in a very short time span; it doesn’t have robust error handling and it won’t prevent you from registering a URL multiple times. Be smart about how you use it and it will work well for you.
