I recently ran into a situation where I needed an ObjectDataSource in order to use the sort and filter capabilities of the SharePoint SPGridView control; however, my data was already in a DataTable.
I found out that I could use the ObjectCreating event of the ObjectDataSource to point the control at an existing data object, such as a DataTable. I was pretty excited to see that I would be able to take the existing DataTable and expose it as an ObjectDataSource. That excitement was short lived as I continued to have problems getting it to work with the SPGridView control.
The issue is that the ObjectDataSource was configured to call the Select method of the DataTable whenever the SPGridView control requested data. This caused the ObjectDataSource to return a DataRow array, which the SPGridView did not understand. I needed the ObjectDataSource to return the data as a complete DataTable, not a DataRow array.
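To illustrate the mismatch, here is a minimal plain ADO.NET sketch (the table and column names are made up) showing that DataTable.Select hands back a DataRow array rather than a DataTable:

```csharp
using System;
using System.Data;

class SelectDemo
{
    static void Main()
    {
        // Build a small table (hypothetical schema).
        DataTable table = new DataTable("Tasks");
        table.Columns.Add("Title", typeof(string));
        table.Rows.Add("Write report");
        table.Rows.Add("Review code");

        // DataTable.Select returns DataRow[], not a DataTable --
        // this is exactly what the SPGridView could not digest.
        DataRow[] rows = table.Select();
        Console.WriteLine(rows.GetType().Name);   // DataRow[]
        Console.WriteLine(table.GetType().Name);  // DataTable
    }
}
```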
The solution I arrived at was to create a very simple wrapper class around my DataTable object that returns the original DataTable when a method is called on the wrapper. Once implemented, I had a perfectly rendering SPGridView with both filtering and sorting enabled.
Below is an example of the wrapper class:
public class DataTableWrapper
{
    private DataTable _dt;

    public DataTableWrapper(DataTable dt) { _dt = dt; }

    // Returns the complete DataTable rather than a DataRow array.
    public DataTable GetTable() { return _dt; }
}
Here is an example of connecting this up to the ObjectDataSource:
ObjectDataSource ds = new ObjectDataSource();
ds.ID = "myDataSource";
ds.TypeName = "[…strong name of your DataTableWrapper class here]";
ds.SelectMethod = "GetTable"; // the method on the wrapper class
ds.ObjectCreating += new ObjectDataSourceObjectEventHandler(ds_ObjectCreating);
And here is the ObjectDataSource ObjectCreating event handler:
void ds_ObjectCreating(object sender, ObjectDataSourceEventArgs e)
{
    myDataTable = new DataTableWrapper(sourceDataTable);
    e.ObjectInstance = myDataTable;
}
Things to note about the above samples:
- The original data source sourceDataTable was defined with class scope. This allows the ds_ObjectCreating method to access it.
- myDataTable was defined with class scope.
- The TypeName of the ObjectDataSource must be the fully qualified type name of the DataTableWrapper class, including its assembly's strong name.
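For completeness, the grid itself can be wired to this data source roughly as follows. This is a sketch from memory, not tested code: the SPGridView filtering properties (FilterDataFields, FilteredDataSourcePropertyName, FilteredDataSourcePropertyFormat) and the column names are assumptions you should verify against your own environment.

```csharp
// Inside CreateChildControls of a web part or page (sketch only).
SPGridView grid = new SPGridView();
grid.AutoGenerateColumns = false;
grid.AllowSorting = true;
grid.AllowFiltering = true;
grid.FilterDataFields = "Title,Status";   // hypothetical column names
// Tell the grid which ObjectDataSource property receives the filter
// string; ObjectDataSource exposes FilterExpression for this purpose.
grid.FilteredDataSourcePropertyName = "FilterExpression";
grid.FilteredDataSourcePropertyFormat = "{1} = '{0}'";
grid.DataSourceID = ds.ID;                // the ObjectDataSource built above
Controls.Add(ds);
Controls.Add(grid);
```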
Maybe there is a simpler way of doing this. I would love feedback if you have any suggestions.
A few more lessons learned while using the STSADM export and import commands:
1. Alerts will not follow the site’s lists and documents during the export process.
2. Email enabled lists can be an issue when importing if the destination server is not properly configured for incoming email.
3. Any imported task or issues list that emails users when they are assigned an item will no longer send those emails. You need to disable and then re-enable the functionality in the list's settings.
We are also using the Microsoft IT Site Delete capture feature found at http://codeplex.com/governance. It appears that this will not protect imported sites unless you disable and then re-enable this feature on the web. I have not had time to dig into this specific topic but I do have firsthand knowledge of an imported site not being caught by the IT Site Delete capture feature when the site was deleted.
I am also guessing that references to custom event handlers will also not be imported, but I have not had a chance to verify this.
Overall the stsadm export and import commands can be very useful to move sites around as long as you remain aware of the limitations.
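For reference, the basic command shapes look like this (the server URLs and file names are placeholders for your own values):

```shell
REM Export a site collection or web (run on the source server)
stsadm -o export -url http://source/sites/team -filename team.cmp

REM Import it on the destination server
stsadm -o import -url http://dest/sites/team -filename team.cmp
```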
I was running into an issue where the initial connection to a SharePoint 2007 server was taking between 20 and 40 seconds. Once the first page of the SharePoint site loaded, the server responded at a very acceptable speed. So what was causing that initial delay each morning? The answer is actually quite simple if you understand the basics of ASP.NET. The first time a user visits an ASP.NET application, such as SharePoint, the framework performs some compiling and caching on the server. This compiling and caching takes a bit of time, and once it completes the application responds much more quickly.
The compiling and caching always occurs after an IISRESET is issued or the application pool is recycled for an application. In the case of my SharePoint installation, the application pool recycles every morning at around 1am. This means that the first person to hit the SharePoint site in the morning always experiences what they perceive as poor server performance. After that initial page hit, the system performed at a very acceptable speed. I did a few searches on the internet to see how others are resolving this issue. One solution I saw was to schedule a "warm up" script to automatically make an HTTP GET request against a couple of pages on the SharePoint site. This looked very promising; however, I was soon disappointed when the script appeared to have no impact on the performance of my server. I soon figured out the reason: the script was making an anonymous HTTP GET request, and my SharePoint site required user credentials to log in. This gave me an idea… how about creating a simple .NET console application that can perform an HTTP GET request using credentials?
So I set off to create the utility. The initial version was quite basic. It accepted three parameters: a URL, a username, and a password. The utility would make a GET request against the URL using the provided credentials. This was a good start, but I was a bit worried about security. I really didn't want a username and password sitting visible in a "warm up" script on the server. I went back and modified the utility so that a user could register a URL along with its username and password. This information is stored in an XML configuration file with the password encrypted. Now the utility can perform an HTTP GET against the SharePoint site without exposing the password in plain text on the server's file system. Very cool stuff.
I took the utility, registered my SharePoint server URLs and passwords and then created a simple batch file to call the utility. This batch was then scheduled on the server to run in the mornings on a daily basis. Since implementing this script and utility the users have been very happy with the performance of SharePoint.
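The core of such a warm-up request can be sketched as below. This is an illustration only, not the actual utility: here the credentials come straight from the command line, whereas the real tool reads them from the encrypted configuration file described above.

```csharp
using System;
using System.Net;

class WarmUp
{
    static void Main(string[] args)
    {
        // args: URL username password (sketch only).
        var request = (HttpWebRequest)WebRequest.Create(args[0]);

        // An anonymous GET is not enough for a secured SharePoint site,
        // so attach explicit credentials to the request.
        request.Credentials = new NetworkCredential(args[1], args[2]);

        // Issuing the request forces ASP.NET to compile and cache the
        // application before the first real user arrives.
        using (var response = (HttpWebResponse)request.GetResponse())
        {
            Console.WriteLine("Warmed {0}: {1}", args[0], response.StatusCode);
        }
    }
}
```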
I have zipped up the utility I created and have placed it on my server for download. Feel free to download and use the utility in your environment.
A note about security: although the password is encrypted in the XML file it is still not 100% secure. You should apply appropriate permissions on the directory where this utility resides to prevent non-administrators from gaining access. You assume all responsibility for any security vulnerabilities, problems, or damages caused by using the utility.
So how do you use this utility? Unzip it into a directory, open up a command prompt, switch to the directory where the utility resides and then execute the command httpgetwithpwd. This will provide you with an overview of the options. Most likely the first thing you will want to do is to register a URL with its username and password. You do this with the following command:
httpgetwithpwd [URL] [username] [password]
Replace [URL], [username], and [password] with the appropriate values.
Now, to have the utility perform an HTTP GET request, just call it with the URL alone. For example:
httpgetwithpwd [URL]
This will do an HTTP GET for the specified URL using the stored username and password.
You may be wondering where the URLs, usernames, and passwords are stored when you register them. An XML file is created in the same directory as the utility that contains the URL information. You can open this file and see that the password is encrypted.
It may be tempting to register several URLs and then copy the configuration XML file to another server to use with the httpgetwithpwd utility. Be aware that this will not work. The encryption locks the configuration file to the server it was created on. Using the configuration file on another server will result in an invalid password being passed during the HTTP GET process.
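The post does not say how the encryption is implemented, but the "locked to the server" behavior matches the Windows Data Protection API (DPAPI) with machine scope. A hypothetical sketch of that approach:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

class PasswordProtector
{
    // DataProtectionScope.LocalMachine ties the ciphertext to the
    // machine's own key material: copying the encrypted file to a
    // different server makes Unprotect fail, which would explain the
    // behavior described above. (Sketch only -- not the actual utility.)
    public static string Protect(string password)
    {
        byte[] cipher = ProtectedData.Protect(
            Encoding.UTF8.GetBytes(password), null,
            DataProtectionScope.LocalMachine);
        return Convert.ToBase64String(cipher);
    }

    public static string Unprotect(string encoded)
    {
        byte[] plain = ProtectedData.Unprotect(
            Convert.FromBase64String(encoded), null,
            DataProtectionScope.LocalMachine);
        return Encoding.UTF8.GetString(plain);
    }
}
```

Note that ProtectedData lives in the System.Security assembly on the .NET Framework and is Windows-only.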
This utility was created in a very short time span; it doesn't have robust error handling, and it won't prevent you from registering a URL multiple times. Be smart about how you use it and it will work well for you.