Thursday, September 30, 2010

SharePoint: Integrate ASP.net controls with Forms – Pt. 2: Adding a DVWP Form to the page

Pt. 1: Use an SP Datasource to push values to a drop-down menu control
Pt. 2: Adding a DVWP Form to the page
Pt. 3: Update the DVWP Form using a Drop-Down
Pt. 4: Trimming Drop Down List results using CAML
Pt. 5: Force Selection within Drop Down List
Pt. 6: Bonus: Embed documents to page

If this post doesn’t make any sense, please read over the first post in this series.  I’m going to keep building on each post and then tying them together at the end. 
Adding the Web Part Zone
I always add a Web Part Zone to every custom page that I create.  Adding these zones allows you to easily modify the page later by clicking Site Actions > Edit Page.  Let’s first click in the correct area of the table, so we get the placement of this zone just right.  If you remember, we had a table with 2 columns and 2 rows.  The first cell in this table is where we added the data source and the ASP.net control.  Directly below this, let’s add the zone.  Refer to this screenshot if this sounds confusing.
[screenshot]
With that out of the way, here’s how to add a Web Part Zone: Click Insert > SharePoint Controls > Web Part Zone.
[screenshot]
Now onto the meat and potatoes…
What I like to do is click on that zone and insert my Web Parts.  After all, that’s what it says to do, right?  Doing so makes sure you are adding the DVWP to the correct location.  Here are the steps to add the Edit Form to the page: click Data View > Insert Data View…
[screenshot]
Find your Shared Documents Library on the right within the Data Source Library Pane.  Click on the drop-down menu and select Show Data.
[screenshot]
This will display all of the columns within your library.  In my library, I already have a custom column set up, called DocCategory.  I’m going to select every column (metadata) I’d like to have available for editing on my page.  After doing that, click Insert Fields As > Single Item Form.
[screenshot]
I only have two columns, but that’s all that I need for this example.  Feel free to have 10, if you want!  The more columns you bring to the page, the more columns you’ll be able to edit.
But it still doesn’t do anything!!!
I know, I know… Stick with me and I’m sure you’ll find something useful.  So far, we’ve added our ASP.net drop-down list, a Web Part Zone, and a DVWP Edit Form.  Here’s a screenshot of where we should be at this point:
[screenshot]
Over the next few articles, I’ll show you how to filter this Web Part using the ASP.net control and create a more dynamic Edit Form.  If you have a question or suggestion, feel free to post a comment; I’ve already come up with a few myself.  Please don’t be shy because you don’t understand the title.  I love screenshots and am a visual learner, so I promise you will get plenty of screenshots along the way.  If you need more, maybe I’ll do a video; just let me know if something isn’t making sense.

Sunday, September 26, 2010

SharePoint: Integrate ASP.net controls with Forms - Pt. 1: Use an SP Datasource to push values to a drop-down menu control

Pt. 1: Use an SP Datasource to push values to a drop-down menu control
Pt. 2: Adding a DVWP Form to the page
Pt. 3: Update the DVWP Form using a Drop-Down
Pt. 4: Trimming Drop Down List results using CAML
Pt. 5: Force Selection within Drop Down List
Pt. 6: Bonus: Embed documents to page

I’d like to start this post off by giving credit where credit is due: ASP.NET Controls Filter the Data View.  Finding that post was key to the end solution.  Here, I’d like to elaborate on some of the things you should know and make it a bit more end-user friendly.  Also, over the next few articles, I’m going to show you how to make an interactive Edit Form for a Document Library using this ASP.net control.
I’m interested, where do I start?
Glad you asked ;-P.  Typically, I crack open SPD (SharePoint Designer) and create a blank .aspx page.  It’ll look similar to this:
[screenshot]
The next step is to add a table to the page.  This will be the base for all of the content and gives us a place to add controls and Web Part Zones (I’ll talk more about those in a later post).  To add it, click Table on the menu bar, then click Insert Table.  The dialog box that opens lets you control how many columns and rows your table has.  For this example, I’m only going to have 2 of each; for all of the other options, I’m using the default settings.
[screenshot]
Now that we have a place for our ASP.net control, we’ll need to add it to the page.  You can do that by clicking Insert on the menu bar, selecting ASP.net Controls, and then choosing Dropdown List.  Here’s a screenshot of what that will look like:
[screenshot]
It’s a good time to note that you cannot add this control to a Web Part Zone.  Try with all of your might; it doesn’t work.
This looks cool but what do I do with it?
We have to populate this Dropdown List with data.  For this example, we’re going to create a DataSource from my Shared Documents library.  To do this, click Task Panes > Data Source Library.  From this pane, click the Shared Documents menu and select Insert Data Source Control.
[screenshot]
Once we have the DataSource inserted into the page, we need to configure our Dropdown List (I personally like to change the tag properties of this control, but you don’t have to).  Click on the control: a tiny chevron appears, and clicking on it opens a fly-out menu.  Add a checkmark to Enable AutoPostBack, then select Choose Data Source.
[screenshot]
We want to wire this control up to our Shared Documents library.  Pick the same settings I have in this photo:
[screenshot]
This will populate the Dropdown List with the names of all of the documents in the Shared Documents library.  Selecting ID for the data field tells the page what value the selection should be set to; using ID, we’ll be able to get accurate results later on.
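For orientation, the wired-up markup SPD generates boils down to something roughly like the sketch below.  This is a simplified sketch, not the exact generated code: the real SPDataSource carries more attributes, the list GUID is a placeholder, and FileLeafRef is the internal name of a document library’s Name column.

```xml
<SharePoint:SPDataSource runat="server" ID="SPDataSource1"
    DataSourceMode="List" SelectCommand="&lt;View&gt;&lt;/View&gt;">
  <SelectParameters>
    <asp:Parameter Name="ListID" DefaultValue="{GUID-of-Shared-Documents}" />
  </SelectParameters>
</SharePoint:SPDataSource>

<!-- DataTextField shows the document Name in the menu;
     DataValueField="ID" is the value the page reads from the selection -->
<asp:DropDownList runat="server" ID="DropDownList1" AutoPostBack="true"
    DataSourceID="SPDataSource1"
    DataTextField="FileLeafRef" DataValueField="ID" />
```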
It still doesn’t look like much?!?
Now we’ll attach the master page to this custom page we’ve built so far.  Click Format > Master Page > Attach Master Page.
[screenshot]
I’ve chosen my default master page.  If you have custom master pages, now is the time to use one instead.  All we have to do now is save this page.  I already have an Apps document library set up, so that’s where I’ll be storing my page.  Here’s the end result:
[screenshot]
In the next article, I’m going to walk through adding a Web Part Zone and then a DVWP (Data View Web Part) as an Edit Form.  If you want me to add some things along the way, feel free to post a comment!


Wednesday, September 22, 2010

RoboCopy vs. SyncToy

While this is not really an in-depth comparison (it’s more of a quick observation), if you are looking for a backup solution, please, PLEASE pick one that can actually call itself a backup solution.  Here’s where SyncToy falls short; each item below is something RoboCopy does that SyncToy doesn’t:
  1. Copy files in Restartable mode ~ This allows for an intermittent network connection; the copy operation will pick up where it left off.  This is great for those feisty T1s that love to chew up more time than they are worth.
  2. Copy Access Rights ~ If you have intricate permissions set on your files (who doesn’t ;-P), SyncToy will miss out on all of this glory.
  3. Monitor Source ~ WOW… What a capability.  RoboCopy can monitor your files, and when there are changes, it’ll automatically copy them.
  4. Bandwidth Throttling ~ While this isn’t a complete solution, it can still free up some bits for other stuff that’s traveling down that slow T1.
  5. Multi-Threaded ~ Why am I still going?  This allows multiple files to be handled at once.  Now you don’t have to wait on that 6 GB .pst to copy over before your 32 KB files are handled.
  6. Attribute Handling ~ Not the most impressive, but it’s actually my favorite.  If you know anything about backups, then you know setting the Archive attribute on files can save precious time and space in your backup routine.
  7. Can be easily automated ~ Now before you ping my comment section: I’m very aware that you can schedule a task to run SyncToy.  My main problem with that is I have to set up my pairs via the GUI only.  The help file says otherwise, but I’m not convinced; I’ve spent at least 30 minutes trying to prove myself wrong.  This doesn’t work, and it’s straight out of the help file: SyncToy -d(left=e:\,right=c:\Pictures, name=MyPictures,operation=contribute)
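Rolled together, a RoboCopy job that covers most of the list above might look like this.  It’s a sketch only: the paths, thresholds, and retry counts are placeholders to tune for your environment, and /MT requires the Windows 7 / Server 2008 R2 version of robocopy.

```bat
REM /Z   = restartable mode (survives flaky links, resumes where it left off)
REM /SEC = copy file security (ACLs) along with the data
REM /M   = copy only files with the Archive bit set, then clear it
robocopy \\server\projects D:\backup\projects /Z /SEC /M /R:3 /W:10 /LOG+:D:\backup\robocopy.log

REM Other goodies from the list above:
REM /MON:5  = keep running; copy again once 5 or more changes are seen
REM /IPG:50 = insert a 50 ms inter-packet gap to throttle bandwidth
REM /MT:8   = copy with 8 threads (note: /MT cannot be combined with /IPG)
```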
What is SyncToy good for?
Practical uses?  Set it up for your grandma and wait for her to call you when she wants to do a simple backup.  But then you also have to consider what is being backed up; there are more caveats that you may not know about.  Here’s an excerpt from SyncToy’s forum:
Q. I have noticed that sometimes SyncToy does not pick up changes to my photos which I had edited using "some popular" photo editing and management software?
A. Several popular photo-editing suites do NOT modify the file time or file size when making certain changes to pictures such as when the user adds or edits tags, description, title, etc on the picture. This is especially true of file formats such as JPEG which use the EXIF header structure. Because of this SyncToy has no way of knowing that the file was modified by the user - if either the last updated timestamp on the file changes or the file size changes, SyncToy will pick up that change and synchronize it over to the other side - otherwise it looks like the same file as before to SyncToy. We're considering improving this behavior in later versions of the Microsoft Sync Framework which is the synchronization technology underneath SyncToy.  For now, the only options users have when synchronizing pictures that may exhibit this behavior is to turn on the Check File Contents option on their folder pair - however, it will make the sync operation quite slow because for this SyncToy has to do a complete scan of every file on each side to compute a signature for the file contents.
RoboCopy is your friend…
This is really a short list that could go on forever.  It may sound a bit biased, but based on what I’ve seen, SyncToy should never be used for backup.  It may seem like an easy piece of software to work with, but in the end, do not be fooled.
What do you think?

Tuesday, September 21, 2010

The Almighty Content Database

I ran into a major issue on my dev box the other night.  It was such a weird issue, I ran for the hills and waited for it to fix itself.  That almost never works, so I dug in.  What’s a bit of work without getting your hands dirty, eh?

The issue with my box was an update.  I’m 99% sure it was an update, but I couldn’t pin it on one, and since it’s a box running in Hyper-V, I decided to take the easy route.  I constantly make backups of my content database.  That is key: without those precious backups, I would have lost all of my work.  Since I take snapshots of this server regularly, I figured the most effective time-saver was to use my latest snapshot.  It was only a week old, and I know I wasn’t suffering from the problem at that point.

The recovery went well, and SharePoint snapped right open.  So far, so good.  I’ve never had to recover a full content database before, so this was new territory for me.  Not to worry: I have good backups.  A quick Google search landed me on Stefan Keir Gordon’s blog.  That specific post talks about moving the DB to a new server, which was exactly what I needed.  I tweaked it just a bit by taking the content DB offline first.  After a reboot, I ran a Restore Database operation: you simply point to the .bak from your backups and then pick the content DB.  All of this took about 10 minutes (my site is rather small).
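If you’d rather script it than click through Management Studio, the whole operation boils down to something like the T-SQL below.  This is a sketch only; the database name and .bak path are placeholders for your own.

```sql
-- Take the content database offline so nothing is holding a connection
ALTER DATABASE [WSS_Content] SET OFFLINE WITH ROLLBACK IMMEDIATE;

-- Restore from the backup, overwriting the broken copy
RESTORE DATABASE [WSS_Content]
    FROM DISK = N'D:\Backups\WSS_Content.bak'
    WITH REPLACE, RECOVERY;

ALTER DATABASE [WSS_Content] SET ONLINE;
```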

What Have I Learned?

That’s what it’s all about, right?  Before this issue, I had been running backups manually every week or so, dropping the bits off on my file server, which is on a RAID 5.  Plenty of redundancy for my tiny operation…  Now I’ve realized the week’s worth of work that I lost was a big loss to me.  Because of that, my backups are now scheduled to run every day.  It’s not talked about enough in my book: we should all review our backup strategies every three months or so and re-evaluate what’s mission critical and what isn’t.  It can literally save lots of time and money.

Monday, September 20, 2010

Order Form using Excel VBA

I’d love to post all of the inner workings of this Order Form, but I’m simply not allowed.  Our manufacturing line is very competitive and I’d like to keep our advantage.  However, there are a few things I’ve built that I’d like to show off.  Here’s a simple screenshot of the INPUT page:

[screenshot]

This allows my foreman to order parts simply by adding an X in a box.  I’m bound to the legacy database that runs the line, so instead of adding a quantity for each part, each X represents an individual part sent to the line.  Creating the data for the legacy database was an interesting challenge: I had to create custom functions within VBA to handle the logic.  We were able to make most of the calculations within Excel 2007, but my workforce uses Excel 2003 and Office XP, so I had to make it backwards compatible.  Here’s a screenshot of what the data looks like (sorry, I had to cut some data out):

[screenshot]

Since the database only accepts one Wire Type at a time, I had to create some macros that separate the data by Wire Type.  When the macro is run, a new worksheet is created for each Wire Type listed within the ALL FILE.  Once these are created, another macro fires and sorts each worksheet descending, except for the largest Wire Type worksheet, which is sorted ascending.  Weird requirement, I know; it’s what I have to work with since I’m forced to use this old database.  Once all of this is done, it’s time to import these tabs into the database.  Another weird requirement is that for each Wire Type, a new template database is used.  This doesn’t sound much like a database at all…  What I’ve been told is that on average there are 3-4 Wire Types.  That means for this project, this *database* has to be copied 3 or 4 times, with an import operation completed for each Wire Type.  I could be way off base here, but this doesn’t sound like a database; it sounds more like a band-aid.
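The splitting-and-sorting rule is easier to see in code.  Here’s an illustrative sketch in Python (the real macros are VBA, and the sample data shape is made up) of the logic: one group per Wire Type, each sorted descending, except the largest group, which is sorted ascending.

```python
from collections import defaultdict

def split_by_wire_type(rows):
    """Group (wire_type, part) rows like the ALL FILE data into one
    group per Wire Type.  Every group is sorted descending, except the
    largest group, which is sorted ascending."""
    groups = defaultdict(list)
    for wire_type, part in rows:
        groups[wire_type].append(part)
    # The group with the most rows stands in for the "largest" worksheet.
    largest = max(groups, key=lambda wt: len(groups[wt]))
    return {
        wt: sorted(parts) if wt == largest else sorted(parts, reverse=True)
        for wt, parts in groups.items()
    }
```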

Enough about my rants… I’m in the process of showing off the benefits of overhauling this anyway :D.  Once that’s done, I’ll have to revisit my Order Form for sure.  For now, though, I’m wrapping up the BETA version of this Order Form, and it’s going to be tested this week.

I’m sure you are wondering… Where are the custom functions?!?  I didn’t want to leave you hanging, so here’s a custom function that I had to write for this project.  I hope you find it useful.

Function FindWireType(rng As Range) As String
    'Returns the Wire Type header (row 8 of the INPUT sheet) above the
    'first non-empty cell in rng. Returns "" if every cell is empty.
    Dim cell As Range
    Dim intDelim As Integer
    Dim intColCount As Integer
    Dim strColumn As String
    Dim strWireType As String

    For Each cell In rng
        If Not IsEmpty(cell.Value) Then
            'cell.Address looks like "$A$5" or "$AA$5"
            intDelim = InStr(2, cell.Address, "$") - 1 'Position just before the second $ sign
            intColCount = Len(Left(cell.Address, intDelim)) - 1 'Length of the column letters, trimming the left $ sign
            strColumn = Right(Left(cell.Address, intDelim), intColCount) & "8" 'Column letters plus row 8, e.g. "A8"

            strWireType = ActiveWorkbook.Sheets("INPUT").Range(strColumn).Value

            FindWireType = strWireType
            Exit Function
        End If
    Next
End Function
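Assuming, as in my layout, the Wire Type headers sit in row 8 of the INPUT sheet, here’s a hypothetical call; the sheet name and range below are placeholders from my workbook, not anything you need to match.

```vb
Dim wt As String
'Returns the row-8 Wire Type header above the first filled cell in the range
wt = FindWireType(Worksheets("ALL FILE").Range("A12:Z12"))
'Or straight from a cell as a worksheet formula:  =FindWireType(A12:Z12)
```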

Tuesday, September 7, 2010

Remove-Item –recurse will actually make you curse

Walking in today, I found myself with a challenge.  Definitely not a unique one, but a challenge nonetheless.  I have to delete an .xls template from many different project folders on a file server.  Our setup looks like this: \\server\Projects\{JobNumber}\*.  Within each job number are many different folders and sub-folders.  I’ve been tasked with removing the old Excel template that we were using and replacing it with the new one.

Sounds like a job for PowerShell, eh?  I’ve done some recursion in the past using VBScript and I’d rather not revisit that; it was a bit of torture to get the logic sorted out in that language.  With PowerShell, you just slap a –recurse onto what you are doing and you *should* be good to go.  Or so you thought…  A quick ping to Bing showed me the Remove-Item cmdlet usage doc on TechNet.  Reading it over, it seems very easy to set up recursion and manipulate files, until you do something like this:

$path = "c:\test\*"
Remove-Item $path -recurse -include *.xls
These simple lines of code fail without any notification or errors.  I’ve sent a tweet out to the Scripting Guys to get a reason why.  I haven’t heard back from them so far, but all is not lost!  You simply have to think outside the box.  To get this to work, use these lines of code instead:

$path = "c:\test\*"
Get-ChildItem $path -recurse -include *.xls | Remove-Item
This works consistently and will give you the results you’d expect.  I’ll update this article if I hear back from the Scripting Guys ;-).



2010-09-07 11:10 a.m. Update:  As always, the Scripting Guys are awesome and answered my question.  Take a look at their response.