SharePoint 2010 – Adding list items with PowerShell

Recently, we have been pushing to move everything off of Server 2003 ahead of the July 2015 end-of-support date.  One little app still running on one of these boxes displayed all of our websites and where they were hosted in an almost real-time view.  As we talked about moving the app, we thought, what if we made it better?  One idea thrown out there was to move this list to SharePoint.  SharePoint would give us the ability to view the information in a variety of different ways (with different views) rather than just the static format of the old app.  Also, it would make management of the app easier: theoretically, all we would need to worry about would be the list information; SharePoint would provide the rest.

So, I came up with a PowerShell script to query IIS on all of our web servers, pull site names and bindings, determine whether each site is Dev, Beta, or Prod, and then put the information in a SharePoint list using the specific format we needed.  Warning: I am not a PowerShell expert by any stretch of the imagination, but here is what I came up with (run it from a SharePoint farm server):

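The sketch below captures the approach rather than being a drop-in script: the server names, site URL, list name, column names, and the Dev/Beta/Prod naming convention are all placeholders you would swap for your own environment.

# Run from a SharePoint 2010 farm server; requires PowerShell remoting to the web servers.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$webServers = 'WEB01','WEB02','WEB03'                      # placeholder: IIS servers to query
$spWeb      = Get-SPWeb 'http://sharepoint/sites/ops'      # placeholder: site hosting the list
$list       = $spWeb.Lists['Web Sites']                    # placeholder: list with these columns

foreach ($server in $webServers) {
    # Ask IIS on the remote box for its site names and bindings
    $sites = Invoke-Command -ComputerName $server -ScriptBlock {
        Import-Module WebAdministration
        Get-Website | Select-Object Name, @{ Name = 'Bindings'; Expression = {
            ($_.bindings.Collection | ForEach-Object { $_.bindingInformation }) -join '; ' } }
    }

    foreach ($site in $sites) {
        # Guess the environment from the binding host names (placeholder naming convention)
        $environment = 'Prod'
        if ($site.Bindings -like '*dev*')  { $environment = 'Dev' }
        if ($site.Bindings -like '*beta*') { $environment = 'Beta' }

        # Add a row to the SharePoint list
        $item = $list.Items.Add()
        $item['Title']       = $site.Name
        $item['Server']      = $server
        $item['Bindings']    = $site.Bindings
        $item['Environment'] = $environment
        $item.Update()
    }
}

$spWeb.Dispose()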

SSL Errors between ADFS and Dynamics 2013

Recently, I have been dedicating a decent amount of time to setting up a new Dynamics 2013 infrastructure for our CRM environment.  Since we now live in a highly mobile world, of course our sales people pushed heavily for iPhone and Android access.  After much research, we discovered that the only supported method for mobile app access to CRM is via ADFS.  Apparently the mobile apps run in Microsoft’s cloud infrastructure, which then needs to federate with our on-premises AD infrastructure to allow users of the mobile apps to log in to our on-prem CRM environment.

Since I don’t manage the AD side of things, getting ADFS set up was pretty quick and painless.  Not sure what the AD guy thought, though. 😛  However, we did come across one interesting, and probably obvious, item: the ADFS box(es) need to be able to check the validity of the token-signing certificate (which is set on the CRM server side).  If a third-party certificate is used, as it was in our case, the ADFS boxes need to be able to reach the certificate provider’s CRL.  Once we opened up Internet access to the CRL and to Windows Update (for good measure, and also to automatically grab root certs), we stopped seeing “invalid cert” errors in the ADFS logs and were able to log in to the CRM environment with no issues.

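As a side note, a quick way to test this from the ADFS box itself is to export the token-signing certificate to a .cer file and let certutil walk the chain; the file name below is just a placeholder:

# -urlfetch forces certutil to download the AIA/CRL URLs instead of relying on the local cache,
# so a blocked CRL endpoint shows up as a revocation check failure in the output.
certutil -urlfetch -verify .\tokensigning.cer
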
For a more comprehensive look at how Windows checks SSL certs, I found this link to be super helpful: http://technet.microsoft.com/en-us/library/ee619754(v=ws.10).aspx

SharePoint 2010 – No Usage Data

SharePoint 2010 has a lot of nice built-in features to assist with reporting and trend analysis.  When paired with a tool like ControlPoint, SharePoint web analytics can really help determine users’ habits and workflows. However nice the web analytics module is, it is that much worse when it doesn’t work.  We recently had an issue where, for some inexplicable reason, web analytics started returning blank values.  This affected our third-party reporting tool, causing reports to fail.

Through much Google searching, I found this nice article by Gokan that outlined the data flow for the web analytics application: http://gokanx.wordpress.com/2013/06/15/how-does-web-analytics-works-under-sharepoint-2010/.  By running simple SQL queries, I found that data was making it into the reporting database, but not the staging database.  This scared me a little, as most of the advice I had read recommended re-creating the web analytics application.  With a farm the size of ours, that would be a long and disruptive change.

Then, I found this forum post about stored procedures in the SQL databases: http://social.msdn.microsoft.com/Forums/sharepoint/en-US/b172d4fe-0b2a-45ca-8bc2-51bfebaa83a5/site-web-analytics-reports-not-updating?forum=sharepointgeneralprevious.  Sure enough, the stored procedures on the reporting and staging databases had not run since we noticed the problem.  I did a quick run of the stored procedures, waited for the web analytics application to do its nightly processing, and data was once again appearing in the staging database and, by extension, in all of our reports.

Here is a list of the stored procedures I ran on the SQL server:

WebAnalyticsServiceApplication_ReportingDB

– proc_DefragmentIndices
– proc_UpdateStatistics
– proc_WA_CleanFloatingFeedbackData
– proc_WA_DeleteInvalidAdjacentHierarchyData
– proc_WA_DeleteInvalidFactData
– proc_WA_DeleteInvalidInventoryData
– proc_WA_PurgeReportingData

WebAnalyticsServiceApplication_StagingDB

– proc_DefragmentIndices
– proc_UpdateStatistics
– proc_WA_EnsureServiceBrokerEnabled
– proc_WA_PurgeStagingData
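
If you would rather script it than click through SSMS, something along these lines should do the trick from a box with the SQL Server PowerShell cmdlets; the instance name is a placeholder, and your database names may differ slightly from the ones shown above:

Import-Module sqlps -DisableNameChecking      # loads Invoke-Sqlcmd; module name varies by SQL version

$sqlInstance    = 'SQLSERVER\SHAREPOINT'      # placeholder
$reportingProcs = 'proc_DefragmentIndices','proc_UpdateStatistics','proc_WA_CleanFloatingFeedbackData',
                  'proc_WA_DeleteInvalidAdjacentHierarchyData','proc_WA_DeleteInvalidFactData',
                  'proc_WA_DeleteInvalidInventoryData','proc_WA_PurgeReportingData'
$stagingProcs   = 'proc_DefragmentIndices','proc_UpdateStatistics',
                  'proc_WA_EnsureServiceBrokerEnabled','proc_WA_PurgeStagingData'

# Assumes these maintenance procs run without parameters; add any required parameters to the EXEC string.
foreach ($proc in $reportingProcs) {
    Invoke-Sqlcmd -ServerInstance $sqlInstance -Database 'WebAnalyticsServiceApplication_ReportingDB' -Query "EXEC $proc"
}
foreach ($proc in $stagingProcs) {
    Invoke-Sqlcmd -ServerInstance $sqlInstance -Database 'WebAnalyticsServiceApplication_StagingDB' -Query "EXEC $proc"
}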

SharePoint 2010 and Default Versioning

Versioning: great in almost every scenario, but horrible if left unchecked. We recently discovered that in our old WSS 3.0 farm (dying later this year, yay!), versioning was enabled for many document libraries.  However, since in most cases versioning was turned on by users, pretty much all of those libraries allowed an unlimited number of versions.  As you can imagine, this had some drastic consequences after being left unchecked for several years.  In the most extreme case, a 5MB document had well over 1500 revisions, ballooning the size of the document up to 7.5GB.  We ended up running a script to cycle through our entire WSS 3.0 farm and set versioning limits for any document library that had versioning enabled.

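For the record, the same kind of cleanup is easy to sketch with the SharePoint 2010 cmdlets (the WSS 3.0 version is the same idea, just using the server object model from Microsoft.SharePoint.dll instead of the cmdlets); the version limit below is a placeholder:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$limit = 20    # placeholder: maximum number of major versions to keep

foreach ($site in Get-SPSite -Limit All) {
    foreach ($web in $site.AllWebs) {
        foreach ($list in $web.Lists) {
            # Only touch document libraries that have versioning on but no limit set (0 = unlimited)
            if ($list.BaseType -eq 'DocumentLibrary' -and $list.EnableVersioning -and $list.MajorVersionLimit -eq 0) {
                $list.MajorVersionLimit = $limit
                $list.Update()
            }
        }
        $web.Dispose()
    }
    $site.Dispose()
}
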
Once that was done, we started looking towards the future: how could we prevent this from happening in our SharePoint 2010 farm?  The answer lay in a little bit of hacking.  Every time a new document library is created, SharePoint pulls the info to create the library from an XML file located at \Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\TEMPLATE\FEATURES\DocumentLibrary\DocLib\Schema.xml.  With some trial and error on Beta (and Google searching, of course), we found that most of the default settings come from one line:

<List xmlns:ows="Microsoft SharePoint" Title="$Resources:shareddocuments_Title;" Direction="$Resources:Direction;" Url="Shared Documents" BaseType="1">

Adding VersioningEnabled="FALSE" MajorVersionLimit="20" to this line keeps versioning off by default in a new library, but still defaults the maximum number of major versions to 20 if the user decides to enable it.  In the end, our line looked like this:

<List xmlns:ows="Microsoft SharePoint" Title="$Resources:shareddocuments_Title;" Direction="$Resources:Direction;" Url="Shared Documents" BaseType="1" VersioningEnabled="FALSE" MajorVersionLimit="20">

Now, every new document library in our SharePoint 2010 farm defaults to a maximum of 20 versions, but ONLY if the user decides to turn on versioning at the time of library creation.
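
If you want to sanity-check the edit, create a test library and inspect it from PowerShell; the URL and library name below are placeholders:

$web  = Get-SPWeb 'http://sharepoint/sites/test'   # placeholder
$list = $web.Lists['Test Library']                 # placeholder: library created after the Schema.xml change
$list.EnableVersioning     # expected: False
$list.MajorVersionLimit    # expected: 20, applied once versioning is turned on
$web.Dispose()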

Infected Cookies and an Unhappy CRM App

When I first started on the Web Systems team, I managed to find myself as the brand-new guy right in the middle of the busy season.  Not really having a clue, I spent most of my time furiously attempting to figure out what was going on and trying to transition from small-business admin to web admin in a large organization. Naturally, then, when a support ticket came along regarding a user unable to access a CRM web app, I jumped at the chance to troubleshoot.

In this instance, the user couldn’t access the CRM web app from her browser, getting a 404.  However, her Outlook client connected just fine.  She could also connect to the web app through a Citrix session.  Hmm.  I had her try browsing directly to an app instance on one of the servers to bypass the load balancers and their config.  Nope, still a 404 error.  Any other computer she tried connected just fine.  Something specific to her computer was causing the 404 to appear, which was strange, as typically the web server is the one that generates a 404.  That’s when I thought to check the IIS logs on the web server.  Since the 404 DOES come from the web server, maybe I could find some clues in there.  Sure enough, when I found the HTTP request coming from her computer, it looked pretty funky:

2013-11-06 15:41:21 W3SVC1 CRMSERVER 192.168.2.10 GET /CompanyName/-9741*sfxd*aaa35f00-a524-7a6b-ce7f-ffeeebc487d4-b28*sfxd*disableWs.js – 80 CONTOSO\USER 192.168.1.100 HTTP/1.1 Mozilla/4.0+compatible;+MSIE+7.0;+Windows+NT+6.1;+Trident/5.0;+SLCC2;+.NET+CLR+2.0.50727;+.NET+CLR+3.5.30729;+.NET+CLR+3.0.30729;+Media+Center+PC+6.0;+InfoPatch.2;+.NET+CLR+1.1.4322;+.NET4.0C;+.NET4.0E;+MS-RTC+LM+8;+Company+Managed) ReqClientId=0d0be21d-702a-439a-be7d-6865575a1165;+otherserver.contoso.com/domain/cookie=CONTOSO;+l4st-gr0up-log1n-d=90;+__utma=125531018.1483398864.1380836735.1382642035.1382984753.10;+__utmz=125531018.1382984753.10.10.utmcsr=whitepages.com|utmccn=referral)|utmcmd=referral|utmcct=/;+UserDomainValidated=true;+Cookie=Z9p%2bFNdvzK7g0jvdSnXsDDq1AlCrOE1nCcIlKKM3MC81%2bWArbWUXbrr4qLUR51cAcGP3E1%2fn6rCN9ITBoNuLgg%3d%3d;+GUP=cul=en-US http://www.superfish.com/ws/plugin_w.jsp?merchantSiteURL=http%3A%2F%2Fcrmserver.contoso.com%2FCompanyName%2Fmain.aspx&isIE=7&dm=9&CTID=496&version=12.2.14.79&dlsource=dingodeals&userid=NTBCfb8195adfb4f47d296a74de7fefc1b47NTBC&sitetype=pip 404 0 2 1428 1336 233

Well.  That explains the 404… GET /CompanyName/-9741*sfxd*aaa35f00-a524-7a6b-ce7f-ffeeebc487d4-b28*sfxd*disableWs.js looks pretty suspicious to me.  Following the log entry down a little further, I found that the referer was http://www.superfish.com, with some JavaScript encapsulating the request to the CRM app.  Some quick Google searching revealed that Superfish was also known as ‘Window Shopper.’  I forwarded the ticket on to the Desktop support team, they removed the Window Shopper adware, and voila! The user was able to connect to the CRM web app once again.
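
Incidentally, pulling a single client’s requests out of the IIS logs is quick with PowerShell; the log path and client IP below are placeholders:

# Point this at the W3C log folder for the site on the web server.
Get-ChildItem 'C:\inetpub\logs\LogFiles\W3SVC1\*.log' |
    Select-String '192.168.1.100' |
    ForEach-Object { $_.Line }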

Sometimes it’s the simple things that get you…

A Use Case for DFS

I work at a small company that has recently undergone many directional and staffing changes due to the economic climate of our time. As the company has been around for quite a while, there are scores of files from ages past (read: the 1990s) that none of our current staff have ever touched. However, hidden within these files *may* be information of use, if only the right person had time to browse through the files and document what was in them…

In the meantime, all these files need to be maintained, bringing us to an issue: backing up 6 TB of data using our backup system can be scary. Our backup system is somewhat antiquated hardware-wise and backing up that much data isn’t always reliable (backups sometimes last all weekend and into Monday, backup storage tends to fill up while backing up more crucial data, etc.). In a perfect world, I would be able to drop $10k on some new hardware for backups and call it a day, but this is a small business, a place where requests for equipment tend to wait until the status becomes “desperately needed,” especially in a situation where the current backups are “adequate.”

Enter Microsoft’s Distributed File System and its ability to replicate files to another server. I grabbed an unused server with a decent amount of disk space and spun it up as a secondary member in a DFS replication topology. By replicating the content from the primary server to the secondary every 15 minutes, I can breathe a little easier when I receive an alert that the backup job failed. In addition, this new redundancy allows for a hardware failure on either server: the primary and secondary share the same namespace, so as long as one of them is up, files can be accessed through that namespace.

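For reference, on Server 2012 R2 or newer the whole topology can be scripted with the DFSR and DFSN modules (older servers get the same result from dfsradmin.exe and dfsutil.exe); every name and path below is a placeholder:

# Replication group and folder between the two file servers
New-DfsReplicationGroup -GroupName 'FileData'
New-DfsReplicatedFolder -GroupName 'FileData' -FolderName 'Shared'
Add-DfsrMember          -GroupName 'FileData' -ComputerName 'FS-PRIMARY','FS-SECONDARY'
Add-DfsrConnection      -GroupName 'FileData' -SourceComputerName 'FS-PRIMARY' -DestinationComputerName 'FS-SECONDARY'

# Tell DFSR where the data lives on each member; the primary wins the initial sync
Set-DfsrMembership -GroupName 'FileData' -FolderName 'Shared' -ComputerName 'FS-PRIMARY'   -ContentPath 'D:\Shared' -PrimaryMember $true -Force
Set-DfsrMembership -GroupName 'FileData' -FolderName 'Shared' -ComputerName 'FS-SECONDARY' -ContentPath 'D:\Shared' -Force

# Publish both servers as targets behind a single namespace path
New-DfsnRoot       -Path '\\contoso.com\Shared' -TargetPath '\\FS-PRIMARY\Shared' -Type DomainV2
New-DfsnRootTarget -Path '\\contoso.com\Shared' -TargetPath '\\FS-SECONDARY\Shared'
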
As a next step, I would like to add a third (archive) server to the mix. Over the last few months, a colleague and I went through the files and did some basic archiving: anything older than 2009 was organized as such and placed into separate folders, away from the files and folders related to current work. Files that we could determine were no longer needed were deleted. By using DFS in conjunction with Data Junctions, we can move all these archived files to the third server while maintaining the impression that they are still part of the logical folder structure.

So far, DFS has been working flawlessly, giving me a little more peace of mind by adding another point of redundancy. One caveat: I found that Macs must be upgraded to OS X 10.7 Lion before they can understand DFS namespaces. Other than that, both our PCs and Macs can access the DFS namespaces with no issues.