another technical blog...technically


Saturday, March 30, 2019

Embarrassing stories: repair lookup fields

You deployed lists with lookups in the old-fashioned way and something went wrong.
Your lookup field points to nothing because your team forgot the difference between a list definition and a list instance.
The script will (brutally) change the schema of the field using CSOM: I don't like it, but it works like a charm, so don't panic: you can use this script and you'll be home ASAP.

Just replace the uppercase values with your own.
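Here is a minimal CSOM sketch of the trick in C#: load the field's SchemaXml, repoint its List attribute at the real list instance, and push the change down. The uppercase values are placeholders.

// Minimal sketch: repoint a broken lookup field by rewriting its SchemaXml via CSOM.
// All uppercase values are placeholders: change them to your own.
using System;
using System.Net;
using System.Xml.Linq;
using Microsoft.SharePoint.Client;

class RepairLookup
{
    static void Main()
    {
        using (ClientContext ctx = new ClientContext("http://SITEURL"))
        {
            ctx.Credentials = new NetworkCredential("USER", "PASSWORD", "DOMAIN");

            List list = ctx.Web.Lists.GetByTitle("LISTTITLE");
            Field field = list.Fields.GetByInternalNameOrTitle("FIELDNAME");
            ctx.Load(field, f => f.SchemaXml);
            ctx.ExecuteQuery();

            // The brutal part: rewrite the List attribute so the lookup points
            // at the real list instance instead of nothing.
            XElement schema = XElement.Parse(field.SchemaXml);
            schema.SetAttributeValue("List", "{LOOKUPLISTID}");
            field.SchemaXml = schema.ToString(SaveOptions.DisableFormatting);
            field.UpdateAndPushChanges(true);
            ctx.ExecuteQuery();
        }
    }
}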


Monday, June 18, 2018

Import-SPWeb preserves ListItemId?

If you're reading this, you have my same problem: you can't find an answer to this question, and maybe you need to move a list across site collections/web applications/farms without changing the list item ids.
Someone says yes, others say no. The truth is that both answers are correct: it depends on the preconditions.

Let's start from this article, which explains a little bit how the ContentDB works.
So, let's create a custom list named "Another custom list" and populate it with a bunch of items.
Yeah, just 3.
So let's look at what happens in the ContentDB.
First, let's look at the list id: we can find it with a simple SELECT on the AllLists table, querying by title.
With the list id in your hands, you can find all the entries for the items in the AllUserData table, which is the repository of all the ContentDB items (you can also have a look at the composite primary key of the table in the previous screenshot).

Remember something?
Now, let's delete item 1 and create items 4 and 5, and let's see what happens (please note that item 1 stays in the ContentDB until I delete it from the site collection recycle bin). Moreover, in the AllListsAux table I can now see the counter used to assign the list item id.
So this is what happens behind the scenes.


It's now time to export the list and take a look at what SharePoint did: unzipping the cmp file, I can read this in the Manifest.xml.
You can see that every item is serialized as XML and, surprise surprise, there is an attribute called IntId. So let's see what happens when importing this cmp file into another site collection:
Import-SPWeb http://sp2013 -Path "C:\TEMP\anothercustomlist.cmp" -IncludeUserSecurity -UpdateVersions Overwrite
and the result is that I now have a new list with a new id, but the same information in AllListsAux and AllUserData.
Only GUIDs changed in the import.
So, even if I'm not showing you the AllUserData table, the ListItem ids also remain the same, so the answer to our question seems to be yes.
But what if I try to re-import? Well, items are not overwritten, and I can see that the same objects are replicated with different list item ids, starting from the value in the aux table.
So the answer is no if you are importing into a preexisting list that has other items, because you never know what the next item id will be.

Below you can find some code I used for a final test: fetching lots of items from a list with more than 6000 items (not the one of this example), from both the source list and the destination one, in order to do a final compare, which led me to understand that Import-SPWeb is something good :)
string siteUrl = "http://sp2013/sites/test";
string listTitle = "Test";
string user = "Administrator";
string domain = "DEV";
string password = "Password";

using (ClientContext clientContext = new ClientContext(siteUrl))
{
 clientContext.Credentials = new NetworkCredential(user, password, domain);

 List list = clientContext.Web.Lists.GetByTitle(listTitle);
 clientContext.Load(list);
 clientContext.ExecuteQuery();

 ListItemCollectionPosition itemPosition = null;
 while (true)
 {
  CamlQuery camlQuery = new CamlQuery();
  camlQuery.ListItemCollectionPosition = itemPosition;
  camlQuery.ViewXml = @"<View>
    <Query>
     <OrderBy><FieldRef Name='ID' /></OrderBy>
    </Query>
    <RowLimit>1000</RowLimit>
   </View>";

  ListItemCollection listItems = list.GetItems(camlQuery);
  clientContext.Load(listItems);
  clientContext.ExecuteQuery();

  itemPosition = listItems.ListItemCollectionPosition;

  foreach (ListItem listItem in listItems)
  {
   Console.WriteLine("Item Title: {0}", listItem["Title"]);
  }


  if (itemPosition == null)
   break;

  Console.WriteLine(itemPosition.PagingInfo);
  Console.WriteLine();
 }
}

Monday, April 9, 2018

Update Content Type in sandbox solution: a forgotten beauty

As I said in previous blog posts, I'm using Blue Prism a lot with SharePoint on-premises or O365, in order to implement a centralized approach to attended RPA.
As we already know, nowadays the most used approach is PnP scripts, and it's something I really like; but I also work with a great team with really good skills in RPA and IT in general, and I could not waste time explaining how to use yet another tool when we just need to set up some SharePoint web sites with old-fashioned custom lists.
So I explained to them something about sandbox solutions but, surprise surprise, the customer enjoyed the POC a lot and asked us for lots of CRs, including adding some JavaScript to the custom forms and (nooooo!) adding fields to content types.
With Francesco Cruciani (thx man), I figured out how to solve the problem by simply attacking the feature manifest.
The solution is really simple and you can download it by clicking here.
As you can see we have:
  • Some fields
  • 1 CT
  • 1 List definition
  • 1 Feature 
Solution structure
Installing the sandbox solution, you just have to provision the list instance manually in order to be ready to use it.
Now, let's start with the update:
  1. Let's add a field file: we called it Fields_Update
  2. Then update the CT and list definition: we like order ;-)
The key is now simply here.
The only visible difference is Fields_Update: we just added the new module to the feature.
Now let's focus on Test.SandBox.DataStructure.Template.xml; the upgrade actions look like this (content type id, field id and version range are placeholders: use your own):

<Feature xmlns="http://schemas.microsoft.com/sharepoint/">
  <UpgradeActions>
    <VersionRange BeginVersion="1.0.0.0" EndVersion="2.0.0.0">
      <ApplyElementManifests>
        <ElementManifest Location="Fields_Update\Elements.xml" />
      </ApplyElementManifests>
      <!-- Placeholders: use your content type id and the id of each new field -->
      <AddContentTypeField ContentTypeId="0x0100YOURCTID" FieldId="{YOUR-FIELD-GUID}" PushDown="TRUE" />
    </VersionRange>
  </UpgradeActions>
</Feature>

As you can see, we have just applied the new manifest, with an explicit reference to the action of adding the field to a content type and then pushing the update down, so the content type instances will be updated too: you just have to upload the wsp again with a different name, upgrade the solution from the sandbox solutions menu, and that's all.

No luck if you want to change the order of the fields in the form or change data types: we have not investigated further.
This post is only here to remind you that sometimes the old-fashioned way can make your life a little easier.

Friday, January 26, 2018

Move SP list items

Moving documents through folders in a SharePoint document library is a recurrent task you (maybe) have to face.
But what if you have to move items through folders in a list? Just look at this, maybe you'll find it useful. It is based on a simple concept: a list item is nothing but a "file", so you can easily do the work like this.
A special thanks to Alessio Pau, who wrote this code :-)
string websiteUrl = "http://something.com";
string expectedFolderName = "expectedFolder";

using (var ctx = new ClientContext(websiteUrl))
{
 Web web = ctx.Web;
 ctx.Load(web);
 ctx.ExecuteQuery();
 
 // 'item' is the ListItem to move and 'listName' is the title of its list,
 // both loaded elsewhere in the real code
 string currentFolderName = item["FileDirRef"].ToString();
 string itemIDForURL = item.Id + "_.000";
 string itemURL = currentFolderName + "/" + itemIDForURL;
 
 // Check if the item is already in the correct folder; if not, move it
 if (!currentFolderName.EndsWith("/" + expectedFolderName))
 {
  // This is just a custom method which creates the folder tree
  var correctFolder = CreateMultipleLevelFolder(expectedFolderName, listName);

  // Move the file
  Microsoft.SharePoint.Client.File itemFile = ctx.Web.GetFileByServerRelativeUrl(itemURL);
  ctx.Load(itemFile);
  ctx.ExecuteQuery();

  itemFile.MoveTo(correctFolder.ServerRelativeUrl + "/" + itemIDForURL, MoveOperations.Overwrite);
  ctx.ExecuteQuery();
 }
}

Saturday, May 20, 2017

BluePrism attended processes? Why not?

One of the cons customers find in Blue Prism is the fact that attended processes are not supported out of the box.
Well, this is not entirely true, because you can play with BP in order to create attended processes, in a way.
If you are an enterprise and you don't use SharePoint, please raise your hand... I see no one.
So we can use SharePoint (or Office 365 as well) to interact with BP with a tiny additional effort. Moreover, if we create a solution using NCSSs (No-Code Sandbox Solutions), we can do the work fast and without great impact on pre-existing solutions.
I will refer to the producer-consumer model in this article, so if you haven't read it... read it :-) http://valeriovalrosso.blogspot.it/2017/04/a-simple-blue-prism-multi-robot-example.html

 

Solution 1

There are 2 possible scenarios from the real world:
  1. The robot polls for new items to process directly from a SharePoint list
  2. The robot is called as a web service from SharePoint, which consumes the service
Scenario 1
A typical real-world example of scenario 1 is the one above:
  • The user does something on the list items
  • The user interacts with the list items through the workflow
  • The robot polls the list continuously in order to find items to process (in this case, items also processed by the workflow) and puts them into a queue
  • The robot consumes the items and writes something back to the list item about the automation results
An important piece of advice: write the fields "Robot Machine Name" and "Robot Machine User" on the new list item. In a multi-robot scenario, more than one robot will act on the SharePoint portal, and this information will make debugging sessions easier.

Solution 2

The second scenario, shown below, calls the robot directly from the workflow (or from JavaScript or other techniques; it depends on what you want to do).
Scenario 2
To do so, you need to expose the process and write some code in order to call the robot web service.
In my example I created a single process which does nothing for 1 minute; here is the wsdl BP created for me, and below a small piece of code which shows how to call the BP web service.
The wsdl BP produces for you

Consuming using C#
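A minimal C# sketch of the call: it assumes the process is exposed as "DoNothing" on a resource listening on the default BP web service port (8181), and that the target namespace follows the usual urn:blueprism:webservice:<process> convention; in a real project you would rather generate a proxy from the wsdl above with Add Service Reference.

// Minimal sketch: invoke a Blue Prism exposed process via its SOAP endpoint.
// Hostname, port, process name, namespace and credentials are assumptions:
// adapt them to your farm and to the wsdl BP generated for you.
using System;
using System.IO;
using System.Net;
using System.Text;

class BPWebServiceClient
{
    static void Main()
    {
        string resourceUrl = "http://bp-producer-machine:8181/ws/DoNothing";

        string soapEnvelope =
            @"<soapenv:Envelope xmlns:soapenv=""http://schemas.xmlsoap.org/soap/envelope/""
                                xmlns:urn=""urn:blueprism:webservice:donothing"">
                <soapenv:Body>
                  <urn:DoNothing />
                </soapenv:Body>
              </soapenv:Envelope>";

        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(resourceUrl);
        request.Method = "POST";
        request.ContentType = "text/xml; charset=utf-8";
        request.Headers.Add("SOAPAction", "\"\"");
        // BP web services use basic authentication with a BP user
        request.Credentials = new NetworkCredential("BPUSER", "BPPASSWORD");

        byte[] payload = Encoding.UTF8.GetBytes(soapEnvelope);
        request.ContentLength = payload.Length;
        using (Stream stream = request.GetRequestStream())
        {
            stream.Write(payload, 0, payload.Length);
        }

        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}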

You have to remember that web services aren't exposed on all resources, so you'd better design which machine (in a multi-robot scenario) is the "producer machine", and maybe use a configuration list on SharePoint to set the value dynamically.
Moreover, it could be a good idea to split the producer-consumer model, exposing only the producer and leaving the consumer in the loop.
Since multiple invoked processes could execute at the same time (as you can see below), consider using the producer only to fill the queue with the information from SharePoint, and avoid running the full automation when the service is called: it could become a dangerous bottleneck and you would have to deal with concurrency on the same machine.

Concurrency
So the message is: even if you can play with different technologies, mind that you now have to think in a multi-threaded way.

Conclusions

Like everything in the world, different approaches have pros and cons:

Scenario   | PROs                                                             | CONs
Scenario 1 | Easier support for multiple robots; scalable                     | Pay attention to polling
Scenario 2 | You can put an item in the queue and give an immediate response  | Bound to a single "root" producer machine

And that's all, my friends.

Sunday, November 27, 2016

Trust no one: especially about SharePoint Search

Some days ago, the customer said he wanted to be able to search files in OneDrive using O365 search.
The solution is there: https://technet.microsoft.com/en-us/library/dn627526.aspx
After Step 5, I went into the search site, selected the OneDrive result source, and ran the search with no parameters... wow, only OneDrive results. After that I tried another search with a word and... surprise surprise... results from everywhere.
After hours of attempts, the decision was to filter the results again in the query text of the search web part (path:https:// tenant_name -my.sharepoint.com/personal), in order to maintain mental sanity.

Monday, April 25, 2016

How to search for a memory leak and save your marriage

Someone said (on a Friday afternoon): "your application has a memory leak and it's your fault".
Because my wife becomes very nervous when I have to work during the weekend, I asked for some help in order to work this problem as fast as possible. So let's start with credits to Iron Man and Mr. Pumpkin (only nicknames here: privacy is privacy).
This is what I learned from the pros who helped me do a drill-down analysis.
Pre-conditions:
  1. You know the farm sometimes goes down
  2. You must have the front-end IIS logs
  3. You know there are lots of objects in memory (as seen in w3wp.dmp)
  4. You need the ULS logs

The bug-hunting process consists of:
  1. Finding the use case that reproduces the memory issue
  2. Replicating the case in your dev environment
  3. Analyzing the issue
  4. Drawing conclusions

Step 1 - Find the use case that reproduces the memory issue
To accomplish the first task, I took the IIS logs in order to find out "who was doing what" on the web application. IIS logs are too long for Notepad, so Mr. Pumpkin said: "Use Lizard, download it from here http://www.lizard-labs.com/log_parser_lizard.aspx"
Log Parser Lizard helps you query log files with SQL; IIS logs contain lots of rows with these fields:
  • date
  • time
  • s-sitename
  • s-computername
  • s-ip
  • cs-method
  • cs-uri-stem 
  • cs-uri-query 
  • s-port 
  • cs-username 
  • c-ip
  • cs-version
  • cs(User-Agent) 
  • cs(Cookie) 
  • cs(Referer) 
  • cs-host 
  • sc-status 
  • sc-substatus 
  • sc-win32-status 
  • sc-bytes 
  • cs-bytes 
  • time-taken
etc. But, as Mr. Pumpkin said, please be aware that IIS logs are in UTC, so take care to use this query (maybe restricting the WHERE conditions to a particular time slot near the memory issue):
SELECT
   TO_LOCALTIME(TO_TIMESTAMP(date, time))
   ,date
   ,time
   ,s-sitename
   ,s-computername
   ,s-ip
   ,cs-method
   ,cs-uri-stem
   ,cs-uri-query
   ,s-port
   ,cs-username
   ,c-ip
   ,cs-version
   ,cs(User-Agent)
   ,cs(Cookie)
   ,cs(Referer)
   ,cs-host
   ,sc-status
   ,sc-substatus
   ,sc-win32-status
   ,sc-bytes
   ,cs-bytes
   ,time-taken
FROM 'C:\Users\Varro\Downloads\iis_frontend47.log'
In this way, the column TO_LOCALTIME(TO_TIMESTAMP(date, time)) will give you the local time. Now it's search time: filter on whatever you want. In my case, I looked at sc-status (the request status code) and time-taken in order to find the most time-consuming calls, and I found some interesting data which helped me replicate the behaviour users had before memory allocation went out of control.

Step 2 - Replicate the case in your dev environment
This is one of the most interesting parts of the job: armed with the users' behaviour, I performed the same actions on my system with the memory profiler attached in Visual Studio. Below are the steps Iron Man explained to me:

1. Create a new performance session

2. Change the properties of the performance session and choose these performance counters:

  • .NET CLR Memory/# Bytes in all Heaps
  • .NET CLR Memory/Large Object Heap Size
  • .NET CLR Memory/Gen 2 heap size
  • .NET CLR Memory/Gen 1 heap size
  • .NET CLR Memory/Gen 0 heap size
  • .NET CLR Memory/Induced GC
  • Process/Private Bytes
  • Process/Virtual Bytes

3. Attach the performance session to the SP process


4. Execute the use case and monitor Task Manager; when you see memory going up, create some memory dumps of the process


5. When the use case ends, stop everything and watch the results ;)

Pay attention: you may need to run these commands from the Visual Studio command prompt in order to make the profiler work
   vsperfclrenv /globalsampleon
   iisreset
Step 3 - Analyze the issue
What I discovered is that, effectively, there was high memory consumption, so I tried to find the guilty method.
I did this using the memory profiler report and DebugDiag (https://www.microsoft.com/en-us/download/details.aspx?id=49924) in order to process the DMP files.

Memory DMP report
Memory profiler report
Iron Man also explained that I needed to replicate the same pattern of memory allocation (as seen in the customer's w3wp.dmp) in order to have a smoking gun; so I also used the debugger to freeze the code at some interesting points and took specific memory dumps from Task Manager.

Step 4 - Conclusions
This pretty complex analysis helped me discover that our application is quite slow in some particular circumstances. But even if some operations stressed memory allocation and the garbage collector a lot (some methods like SubMethod1 could be more efficient), it's also true that memory is released after those operations; so, if the farm falls down when users do this... maybe it's time to give your farm a little more RAM.

That's all folks.

Thursday, August 6, 2015

SharePoint 2013 'Page not found (404)' with catalog item page and standard page

If you have read these blog posts: SharePoint 2010 'Page not found (404)' page the way it should be, and Inconvenient Catalog Item Page and 'Page not found' (404) experience from the master Waldek Mastykarz, and you're here, you're one step closer to the solution.
This blog post extends the Mastykarz strategy with just a few lines of code.
Let's assume you developed these components:
  1. PageNotFoundHttpModule from SharePoint 2010 ‘Page not found (404)’ page the way it should be
  2. PageNotFoundWebPart from SharePoint 2010 ‘Page not found (404)’ page the way it should be 
  3. Something like MaventionCatalogItemReuseWebPart from Inconvenient Catalog Item Page and ‘Page not found’ (404) experience
PageNotFoundHttpModule and PageNotFoundWebPart also work for SharePoint 2013.
MaventionCatalogItemReuseWebPart simply redirects to the site collection's page-not-found URL; but since this is a server-side redirection, the HTTP status code is 302 and the redirect location is well known, so I just modified PageNotFoundHttpModule in this way.
public class PageNotFoundHttpModule : IHttpModule
    {
        #region Const
        const string URL_RELATIVE_PAGENOTFOUND = "/Pages/PageNotFoundError.aspx";
        #endregion

        #region Fields
        private HttpApplication webApplication;
        #endregion

        #region Public methods        
        public void Init(HttpApplication context)
        {
            webApplication = context;
            webApplication.PreSendRequestContent += new EventHandler(webApplication_PreSendRequestContent);
        }

        public void Dispose()
        {
        }
        #endregion

        #region Event handlers
        void webApplication_PreSendRequestContent(object sender, EventArgs e)
        {
            HttpResponse response = webApplication.Response;
            HttpRequest request = webApplication.Request;

            if ((response.StatusCode == 404 && string.Compare(request.Url.AbsolutePath, URL_RELATIVE_PAGENOTFOUND, StringComparison.InvariantCultureIgnoreCase) != 0) ||
               (response.StatusCode == 302 && string.Compare(response.RedirectLocation, URL_RELATIVE_PAGENOTFOUND, StringComparison.InvariantCultureIgnoreCase) == 0))
            {
                webApplication.Server.TransferRequest(URL_RELATIVE_PAGENOTFOUND);
            }
        }
        #endregion
    }
So what I added is essentially a new OR condition.

Wednesday, July 29, 2015

Publishing by search: what about video in HTML field?

I think the title is somewhat self-explanatory: we know that when you add a video to an HTML field, SharePoint doesn't use the HTML5 video tag, but installs a web part which shows the video using a player.
A video, in SharePoint, is not simply a file, but a folder which contains the video file, maybe its renditions and other stuff I'm not interested in.
Videos and assets generally live in an Asset Site which can be accessed from the catalog, the public site and many others.
If we want to simply use the HTML5 video tag, we first need to find the correct video file; then we have to create a reusable content with these features:
  • User can select the video using the OOB AssetPicker
  • Reusable content automatically finds the video file
  • Reusable content automatically generates HTML5 video tag
So, let's first declare a reusable content for video with HTML code like this (any placeholder image will do as src):

<img class="asset_video" src="/PublishingImages/video-placeholder.png" alt="Video placeholder" />
Why in the world am I using an img tag?
Because I want the user to have a placeholder like this.


Then include a JS file (in the master page or page layout: up to you) which:
  1. Binds a click event to every asset_video class object
  2. On click, opens the OOB Asset Picker and lets the user choose an asset
  3. On the Asset Picker close event, finds the video URL using REST
  4. Writes the HTML5 tag using the URL
Below is some code you can use.
jQuery(document).ready(function () {
    initAddVideoTemplate();
    SP.SOD.executeOrDelayUntilScriptLoaded(function () {
        SP.SOD.executeOrDelayUntilScriptLoaded(function () {
            var ctx = SP.ClientContext.get_current();
            var site = ctx.get_site();
            ctx.load(site);
        }, "cui.js");
    }, "sp.js");
});

// Adds a listener on every HTML element with class "asset_video"
// You can add your own classes in this obj array, lister for all elements will be created
// The listener will make asset picker to be opened when you click on a asset_video
function initAddVideoTemplate() {
    var count = 0;
    var obj = { 'asset_video': '' }
    jQuery.each(obj, function (index, value) {
        jQuery('body').on('click', 'img[class="' + index + '"]', function () {
            var className = this.className;
            var idOldVideo = "reusableContentInsertVideo-" + rc_generateUUID();
            this.setAttribute('id', idOldVideo);
            if (!count) {
                jQuery.getScript('/_layouts/15/AssetPickers.js', function () { insertVideo(idOldVideo, className, value); });
                count += 1;
            } else insertVideo(idOldVideo, className, value);
        });
    });
};

// Asset picker configuration
function insertVideo(idOldVideo, className, value) {
    var assetPicker = new AssetPickerConfig("");
    assetPicker.ClientID = "";
    assetPicker.DefaultAssetLocation = "";
    assetPicker.DefaultAssetImageLocation = "";
    assetPicker.CurrentWebBaseUrl = "";
    assetPicker.AllowExternalUrls = "";
    assetPicker.ManageHyperlink = false;
    assetPicker.AssetUrlClientID = idOldVideo;
    assetPicker.AssetType = "Media";
    assetPicker.ReturnItemFields = 'Title,Name,Label,SourceUrl,FileRef';
    assetPicker.ReturnCallback = function () {
  
  // Self explaining variable names
        var video_AbsoluteUrl = arguments[0];
        var video_FolderName = arguments[1];

        var pathArray = video_AbsoluteUrl.split('/');

        var assetSite_Url;
        var assetList_DocLib;

        if (pathArray.length >= 5) {
            assetSite_Url = pathArray[0] + "//" + pathArray[2];
            assetList_DocLib = pathArray[3];

   // Helps you to rebuild the folder relative URL
            for (i = 4; i < pathArray.length - 1; i++) {
                assetList_DocLib = assetList_DocLib + "/" + pathArray[i];
            }

            // Finds the HTML img tag to substitute
            var idCalled = jQuery(this).attr('AssetUrlClientID');

            if (undefined !== arguments &&
    arguments.length > 3 &&
    assetSite_Url !== undefined &&
    assetList_DocLib !== undefined) {

    // REST url generation
    // Using URL and folder, you can request the first file of the video folder, which is the default video rendition
                restUrl = assetSite_Url + "/_api/Web/GetFolderByServerRelativeUrl('/" + assetList_DocLib + "/" + video_FolderName + "')/Files?$select=ServerRelativeUrl,Name&$top=1";

                jQuery.ajax({
                    url: restUrl,
                    type: "GET",
                    crossDomain: true,
                    dataType: "json",
                    headers: { "Accept": "application/json; odata=verbose" },
                    success: function (data) {
                        // If successful, generate the HTML
                        var response = data.d.results;
                        if (response.length > 0) {
                            var videoUrl = assetSite_Url + response[0].ServerRelativeUrl;
                            var fileName = response[0].Name;
                            var extension = fileName.split('.').pop();

                            if (jQuery("#" + idCalled).length > 0)
                                jQuery("#" + idCalled).first().replaceWith(
                                    '<video controls src="' + videoUrl + '" type="video/' + extension + '"></video>');
                        }
                    },
                    error: function (data) {
                        alert('Error');
                    }
                });
            }
        }
        else {
            alert("Error");
        }
    };

    var imageAsset = new LinkAsset("");
    imageAsset.LaunchModalAssetPicker(assetPicker);
};

// Generates a pseudo-random UUID
function rc_generateUUID() {
    var d = new Date().getTime();
    var uuid = 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, function (c) {
        var r = (d + Math.random() * 16) % 16 | 0;
        d = Math.floor(d / 16);
        return (c == 'x' ? r : (r & 0x7 | 0x8)).toString(16);
    });
    return uuid;
};

Strategy: catalog connection with managed metadata multivalue field

Everybody knows about SharePoint catalog connections... maybe.

What if you have a pre-existing catalog connection and your customer asks you to switch from a single-value field (generally ItemCategory) to a multivalue field?

In an ideal world you would quote this and say "don't do it, don't even think about it".

In the business world you'll say "Yes, why not?".
I tried to convert the single-value field into a multivalue field... trust me... don't try this if you don't like strange side effects.

AS IS
This is the typical product catalog configuration:
  • A product catalog site which is used as the backend website
  • Contents are crawled by the search service application
  • Public sites connect to the indexed catalog


Solution
To understand the next steps, keep in mind that every link under a categories navigation term is allowed.
Let's assume you have a term set like this:
  • Category 1
    • Subcategory 11
    • Subcategory 12
  • Category 2 
    • Subcategory 21
The catalog connection automatically creates these category-friendly URLs:

http://site/category1
http://site/category1/subcategory11
http://site/category1/subcategory12
http://site/category2
http://site/category2/subcategory21

So, let's assume I have a product named Test, tagged on ItemCategory with the term "Subcategory 11": I'll reach it at the URL http://site/category1/subcategory11/test .

But what if I tell you that links like the ones below will not respond with a 404?
http://site/category1/test
http://site/category1/subcategory12/test
http://site/category2/test
http://site/category2/subcategory21/test

This behaviour will be extremely useful for us.

Editing product catalog
Because you don't want to destroy the catalog connection, you just have to add a TaxonomyFieldTypeMulti field to the product catalog item content type. The declaration follows the usual taxonomy field pattern, with a Customization property pointing at the hidden note field backing it (use your own GUIDs):

<Field ID="{YOUR-FIELD-GUID}"
       Name="MultivalueItemCategory"
       DisplayName="MultivalueItemCategory"
       Type="TaxonomyFieldTypeMulti"
       Mult="TRUE">
  <Customization>
    <ArrayOfProperty>
      <Property>
        <Name>TextField</Name>
        <Value xmlns:q6="http://www.w3.org/2001/XMLSchema" p4:type="q6:string" xmlns:p4="http://www.w3.org/2001/XMLSchema-instance">{024FECDC-E8A7-4DAC-BEB1-E9C373708BE5}</Value>
      </Property>
    </ArrayOfProperty>
  </Customization>
</Field>

Link this new field to the term set you use to tag catalog items.
After that, you can create an event receiver in order to copy the first value entered in the MultivalueItemCategory field into the ItemCategory field (or whatever field you use for the catalog connection), as sketched below.
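A minimal sketch of such a receiver, assuming the field names used in this post (MultivalueItemCategory and ItemCategory) and that both are taxonomy fields on the catalog list:

// Minimal sketch: copy the first MultivalueItemCategory value into ItemCategory,
// so the existing catalog connection keeps working. Field names are this post's.
using Microsoft.SharePoint;
using Microsoft.SharePoint.Taxonomy;

public class ItemCategorySyncReceiver : SPItemEventReceiver
{
    public override void ItemUpdated(SPItemEventProperties properties)
    {
        base.ItemUpdated(properties);

        SPListItem item = properties.ListItem;
        TaxonomyFieldValueCollection values =
            item["MultivalueItemCategory"] as TaxonomyFieldValueCollection;

        if (values == null || values.Count == 0)
            return;

        TaxonomyField singleField =
            item.Fields.GetFieldByInternalName("ItemCategory") as TaxonomyField;

        // Avoid retriggering this receiver while writing back
        EventFiringEnabled = false;
        try
        {
            // Copy the first term of the multivalue field into the catalog field
            singleField.SetFieldValue(item, values[0]);
            item.SystemUpdate(false);
        }
        finally
        {
            EventFiringEnabled = true;
        }
    }
}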
Create Content Search Web Part replacements
This is the most annoying part: we have to create a web part to use in the category pages.
This web part essentially does four things:
  1. Recovers the current navigation term
  2. Makes a search query for the term related to the navigation term, against the MultivalueItemCategory field we defined above
  3. Substitutes the Path values (catalog item friendly URLs) in order to make links relative to the current navigation term
  4. Shows the results
The core methods are sketched below.
About the single product page: you can create a web part or a page layout, which has to contain methods similar to the previous web part's; it must get the product using the navigation term and the friendly URL segment, and show the result.
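To give the idea, here is a hedged sketch of the query part; owstaxIdMultivalueItemCategory is an assumption for the managed property name generated for the new field (check your search schema):

// Minimal sketch: query catalog items tagged (in the multivalue field) with the
// current navigation term; the caller then rewrites the Paths and renders rows.
using System.Linq;
using Microsoft.Office.Server.Search.Query;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Publishing.Navigation;

public static class CategoryQuery
{
    public static ResultTable GetItemsForCurrentTerm(SPSite site)
    {
        // 1. Recover the current navigation term
        NavigationTerm navTerm = TaxonomyNavigationContext.Current.NavigationTerm;

        // 2. Query by the term id in the multivalue managed property
        // (owstaxIdMultivalueItemCategory is an assumed name)
        KeywordQuery query = new KeywordQuery(site);
        query.QueryText = "owstaxIdMultivalueItemCategory:\"GP0|#" + navTerm.Id + "\"";
        query.SelectProperties.Add("Title");
        query.SelectProperties.Add("Path");
        query.TrimDuplicates = false;

        ResultTableCollection results = new SearchExecutor().ExecuteQuery(query);

        // 3./4. The caller rewrites each Path to be relative to the current
        // navigation term's friendly URL, then shows the rows.
        return results.Filter("TableType", KnownTableTypes.RelevantResults).FirstOrDefault();
    }
}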

Editing the publishing site
In the Pages document library, you have to create a category page (which contains the category CSWP replacement) and a product page (which contains the product CSWP replacement), and then edit the target page settings for the terms like this.


And that's all folks.

Deleting a host named site collection... for real

A simple problem: you have to delete a host-named site collection in order to recreate it.
When you try to recreate it, you get an error: SharePoint says the site collection already exists... if you restart the machine, you'll be able to recreate it.
But what if you cannot restart the machine whenever you want (read: production environment)?
$variable = Get-SPSite "http://already.deleted.site"
$variable
You'll get an output like this


Now recreate the site collection without problems ;)

Saturday, June 20, 2015

SharePoint 2013 Azure farm accessible from outside part 2: what about friendly URLs

Have you read this? SharePoint 2013 Azure farm accessible from outside
I discovered it doesn't work for friendly URLs: the page shows hyperlinks targeting the internal site collection, so it's not possible to reach other pages. Moreover, you'll get this error if you only create an outbound rule for URL rewriting, because of the gzip compression:
Outbound rewrite rules cannot be applied when the content of the HTTP response is encoded ("gzip").
What you have to do is something like this:
  • Introduce an outbound rule which rewrites the links
  • Introduce rewrite rules to deal with the gzip compression
The example below assumes:
  • Local site collection is: http://sitecollection.example.cloud
  • Public azure url is: http://publicsite.example.azure

<rewrite>
    <rules>
        <!-- Empty the Accept-Encoding request header so responses are not
             gzip-compressed and the outbound rule can be applied.
             HTTP_ACCEPT_ENCODING must be listed under allowedServerVariables. -->
        <rule name="RemoveAcceptEncoding">
            <match url=".*" />
            <serverVariables>
                <set name="HTTP_ACCEPT_ENCODING" value="" />
            </serverVariables>
            <action type="None" />
        </rule>
    </rules>
    <outboundRules>
        <!-- Rewrite absolute links pointing at the internal host name -->
        <rule name="RewriteLinks" preCondition="IsHtml">
            <match filterByTags="A, Area, Base, Form, Img, Link, Script" pattern="^http://sitecollection\.example\.cloud/(.*)" />
            <action type="Rewrite" value="http://publicsite.example.azure/{R:1}" />
        </rule>
        <preConditions>
            <preCondition name="IsHtml">
                <add input="{RESPONSE_CONTENT_TYPE}" pattern="^text/html" />
            </preCondition>
        </preConditions>
    </outboundRules>
</rewrite>

And that's all... for now!

Monday, June 8, 2015

Targeting contents using XRANK in SP2013 problem

Lately I had to create a content priority/targeting system in SharePoint 2013, using search capabilities in a public context (so no target audiences available).
What you'll find below is a targeting system based on managed metadata content tagging and dynamically generated queries using XRANK directives.
Let's think about an enterprise model with a hierarchy like this:
  • Channel 1 (00000000-0000-0000-0000-000000000100)
    • Network 1 (00000000-0000-0000-0000-000000000110)
      • Agent 1 (00000000-0000-0000-0000-000000000111)
      • Agent 2 (00000000-0000-0000-0000-000000000112)
    • Network 2 (00000000-0000-0000-0000-000000000120)
    • ...
    • Network n (...)
  • Channel 2 (00000000-0000-0000-0000-000000000200)
  • ...
  • Channel n (...)
This can be represented as a hierarchical term set in the Managed Metadata Service Application.
Now, let's assume we have these contents:
  • Page A, tagged with "Channel 1"
  • Page B, tagged with "Network 1"
  • Page C, tagged With "Agent 1"
Following this article http://techmikael.blogspot.it/2014/03/s15e01-kql-basics.html we can target contents using Search.
For example, if I am "Agent 1" and I want to obtain contents in this order:
  1. Page C
  2. Page B
  3. Page A
I can use a query like this:
(((owstaxIdTargeting:"GP0|#00000000-0000-0000-0000-000000000100" XRANK(cb=1))
owstaxIdTargeting:"GP0|#00000000-0000-0000-0000-000000000110" XRANK(cb=10))
owstaxIdTargeting:"GP0|#00000000-0000-0000-0000-000000000111" XRANK(cb=100))


Basically, I'm boosting contents created for "Agent 1", then contents for "Network 1", then contents for "Channel 1".
Great? No. This method only apparently works.
In this query I used XRANK, which boosts the rank score... and boosting means SharePoint assigns a score using its ranking models, which you can then adjust manually with the XRANK query directive.
This also means that rank scores can be scrambled by a lot of ranking model rules.
This led me to think I could create a "fake ranking model" for those queries, useful only for this content targeting technique.
This model basically assigns a 0 score to all contents and simply does NOTHING, so only the XRANK values will be considered.

<!-- Rank model XML: a two-stage ranking model whose stages both evaluate to a constant 0 -->

You can install this rank model on the SharePoint farm and use it in your search-based queries (programmatically and/or in a content search web part).
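Programmatically, using it boils down to something like this (a sketch: the GUID is whatever id you gave the fake rank model when installing it):

// Sketch: run the XRANK targeting query with the "fake" rank model, so only
// the XRANK boosts determine the order. The model id below is a placeholder.
using Microsoft.Office.Server.Search.Query;
using Microsoft.SharePoint;

KeywordQuery query = new KeywordQuery(SPContext.Current.Site);
query.QueryText =
    "(((owstaxIdTargeting:\"GP0|#00000000-0000-0000-0000-000000000100\" XRANK(cb=1)) " +
    "owstaxIdTargeting:\"GP0|#00000000-0000-0000-0000-000000000110\" XRANK(cb=10)) " +
    "owstaxIdTargeting:\"GP0|#00000000-0000-0000-0000-000000000111\" XRANK(cb=100))";
query.RankingModelId = "11111111-1111-1111-1111-111111111111";
ResultTableCollection results = new SearchExecutor().ExecuteQuery(query);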

SharePoint 2013 Azure farm accessible from outside

I gave you the possibility to create a SharePoint 2013 farm from scratch with ASPM... don't you know what it is? You can download it here.
So, let's assume you have a site collection (http://sitecollection.example.cloud) on this SharePoint 2013 farm and you want to make it accessible to your customer, manager and so on...
Open IIS Manager and head to the Default Web Site, then URL Rewrite (if you don't have this option in your IIS, please download the URL Rewrite module from the Web Platform Installer).

Then, create a new blank rule

like this.

And that's all.
I noticed you will have some problems with adding users, so it's possible you'll experience problems with other features too.
Maybe further settings will be necessary.

Thursday, June 4, 2015

Updating term set in SP2013

First of all, this is not something i discovered on my own, i just read it somewhere in the web and I never found the original post again, maybe because it was written in german (credits to the original author), so I decided to write this useful trick in english in order to help more people to accomplish the same task: updating a termset from code behind.
Let’s assume you need to do this from an application page, you need to use the SPSecurity and RunWithElevatedPrivileges as usual.
SPSecurity.RunWithElevatedPrivileges(() =>
{
 using (SPSite elevatedSite = new SPSite(site.ID, site.Zone))
 {
  TaxonomySession taxonomySession = new TaxonomySession(elevatedSite);
  //Do here the update work
 }
});
Ok! This code simply doesn't work: this is due to the fact that you have to switch the context to the service account's one, like this.
SPSecurity.RunWithElevatedPrivileges(() =>
{
 using (SPSite elevatedSite = new SPSite(site.ID, site.Zone))
 {
  HttpContext oldContext = HttpContext.Current;

  try
  {
   HttpContext.Current = null;
   TaxonomySession taxonomySession = new TaxonomySession(elevatedSite);

   //Do here the update work
  }
  finally
  {
   HttpContext.Current = oldContext;
  }
 }
});
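The actual "update work" depends on your term set; as a pure illustration (group and term set names are made up), it could be something like:

TermStore termStore = taxonomySession.DefaultSiteCollectionTermStore;
Group group = termStore.Groups["MyGroup"];        // made-up group name
TermSet termSet = group.TermSets["MyTermSet"];    // made-up term set name
termSet.CreateTerm("New term", 1033);
termStore.CommitAll();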
Les jeux sont faits.

Monday, December 8, 2014

A simple script to log to verbose mode

Hi, this will be a quick post, just to show you how a little PowerShell script (yeah, lately I'm in love with PowerShell) can be very helpful during bug-hunting sessions.
When you run the script, the log level is raised to Verbose and a new SPLog file is created for the verbose log.
When you press a key, the log level is reset to the default one and a new SPLog file is created, so the penultimate log file is the one you need to examine.
This is the script: enjoy.
$ver = $host | select version
if ($ver.Version.Major -gt 1) {$host.Runspace.ThreadOptions = "ReuseThread"} 
if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) 
{
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}

Write-Host "Setting TraceSeverity and EventSeverity to Verbose"
Set-SPLogLevel -TraceSeverity Verbose -EventSeverity Verbose

Write-Host "Creating a new Log file"
New-SPLogFile

Read-Host "Press ENTER key to reset default log level"

Write-Host "Resetting TraceSeverity and EventSeverity"
clear-sploglevel

Write-Host "Creating a new Log file for resetted log"
new-splogfile

Read-Host "Press any key to exit ..."
Or maybe download it from here.

SharePoint 2013 workflow task related item field

Another day spent with SP2013 workflows.
Do you remember when you had a task in SP2010 and you could easily get the workflow related item?
Good, it was something like this:
taskItem["WorkflowItemId"]
Today, the game has changed.
Looking at this task view screenshot, everything lets you think you only have to do something like this:
taskItem[SPBuiltInFieldId.RelatedItems]
to get an SPListItem; unfortunately, you have to do more than this.
This field contains a JSON string with the web id, list id and item id, so you have to parse this field value in order to do some work from code behind.
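For a task related to item 1 of some list, the stored value looks more or less like this (ids made up):

[{"ItemId":1,"WebId":"00000000-0000-0000-0000-000000000001","ListId":"00000000-0000-0000-0000-000000000002"}]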

RelatedItems is one of the new SP2013 site columns available to developers and can be used for other purposes, not only in workflows; so I decided to write this little helper in VLibs which lets you read/write this field easily from server side (CSOM supports this field too).
[Serializable]
public class RelatedItemFieldValue
{
 #region Properties
 public int ItemId { get; set; }
 public Guid WebId { get; set; }
 public Guid ListId { get; set; }
 #endregion
} 
Then, using this helper, you can read and update the field value.
The related item field value is not assigned when the task is created, so pay attention when using the ItemAdding and ItemAdded event receiver events to read the related item field; use ItemUpdated or ItemUpdating instead.
I wrote this little piece of code, which you can find in Example 08: in an event receiver, it overwrites the task's related item field with the first item of the workflow items list.
string jsonString = properties.ListItem[SPBuiltInFieldId.RelatedItems].ToString();
List<RelatedItemFieldValue> relatedItems = RelatedItemFieldHelper.GetItems(jsonString);
RelatedItemFieldValue r = relatedItems[0];
SPList l = properties.Web.Lists.TryGetList("WorkflowsItemsList");
SPQuery query = new SPQuery();
query.RowLimit = 1;
query.Query = "";
SPListItem first = l.GetItems(query)[0];
int itemId = first.ID;
r.ItemId = itemId;

// Safety... never enough
this.EventFiringEnabled = false;
properties.ListItem[SPBuiltInFieldId.RelatedItems] = RelatedItemFieldHelper.GetItems(relatedItems);
properties.ListItem.SystemUpdate();
// Rollbacking safety setting
this.EventFiringEnabled = true;
And here's the demonstration that I'm telling the truth :-)

Even if I'm confident you know it, here's the link to the VLibs code :-)

Sunday, December 7, 2014

Get user in SharePoint 2013 workflows

Today I'll present the resolution of a common question when dealing with SharePoint 2013 workflows:
how the hell do I get the user login name from a user-type field?
The answer should be straightforward; unfortunately it isn't: it depends on the user field XML parameters.
Site columns
It is possible to declare a user field in different ways. So, reusing the Example 01 code, I extended the workflow item content type, so I can make some tests with these three different fields:
  1. Approvers (Type = UserMulti): used for multiple users
  2. Type User - Mult False: used for a single user
  3. Type User - Mult True: used for multiple users too
<Field
       ID="{c55e296e-e0bf-4574-b266-a421781b5081}"
       Name="Approvers"
       DisplayName="Approvers"
       Type="UserMulti"
       Required="FALSE"
       Group="VLibs SP2013 Examples site columns"
       EnforceUniqueValues="FALSE"
       UserSelectionMode="PeopleOnly"
       UserSelectionScope="0"
       Mult="TRUE"
       Sortable="FALSE"/> 
<Field
       ID="{C4AC8A6C-F65C-4BC1-BE36-F83919EE9886}"
       Name="ItemUser"
       DisplayName="Type User - Mult False"
       Type="User"
       Required="FALSE"
       Group="VLibs SP2013 Examples site columns"
       EnforceUniqueValues="FALSE"
       UserSelectionMode="PeopleOnly"
       UserSelectionScope="0"
       Mult="FALSE"
       Sortable="FALSE"
       Overwrite="TRUE"/> 
<Field
       ID="{9af230e5-cdff-40ac-ae13-f2a6928dd1d1}"
       Name="ItemUsers"
       DisplayName="Type User - Mult True"
       Type="User"
       Required="FALSE"
       Group="VLibs SP2013 Examples site columns"
       EnforceUniqueValues="FALSE"
       UserSelectionMode="PeopleOnly"
       UserSelectionScope="0"
       Mult="TRUE"
       Sortable="FALSE"
       Overwrite="TRUE"/>
Workflow
I created a stupid workflow (named "stupid workflow" for real) which prints the field values in the workflow history.
I added an item to the list, launched the workflow, and this was the result.

As you can see, for the first field I obtained a list of user ids, and for the second field a single id: those int numbers can be used to make a user lookup. But what about the third field?

Resolution
For this particular case you need a custom activity in order to read the field value from the results property into a string collection object.
This custom activity needs, as input, the user field dynamic value and a string collection variable in which to save the user ids.
Placing the activity and configuring the output looks like this.
To play with this workflow and custom activity, download the latest version of VLibs from here, and enable features 01, 07a and 07b. Have fun.

Friday, November 21, 2014

Provision a SharePoint 2013 farm with Azure and PowerShell

I began to work on Azure one year ago, in order to have a dev SharePoint 2013 farm to build some code on, since I don't have a powerful PC.
Reading the articles below, I was able to create my personal farm:
Step-by-Step: Build a FREE SharePoint 2013 Dev/Test Lab in the Cloud with Windows Azure Infrastructure Services - Part 1
Step-by-Step: Build a FREE SharePoint 2013 Dev/Test Lab in the Cloud with Windows Azure Infrastructure Services - Part 2
A colleague (codename Roller) asked me if it was possible to build the farm automagically with PowerShell, since there are amazing PowerShell Azure cmdlets.
<< Varro, why don't you do something like that? It would be so useful >>
<< Ok Roller, but it will take a lot of my free time, will you give me money for this? >>
<< No >>
<< Ok Roller, I'll do it for you... someday... in the future >>
That day has arrived.
/mode joke off
Since I needed something simple and smart to build an SP2013 farm, I started surfing the web, studying (and also stealing code), and created this wonderful script, which you can download by clicking here.

Requirements
Once you have downloaded the script on your PC, you have to read the README (do it).
It explains what you need to make script execution smooth (maybe):
  1. Enable PSRemoting on your PC: this will be used with CredSSP to invoke remote PowerShell script execution on the Azure machines (it avoids the double-hop problem); since you have to allow PSRemoting to *.cloudapp.net, you are future-powershell-experiments proof.
  2. Download the Azure cmdlets
  3. Download your Azure account publish settings (you do have an Azure account, uh?)
  4. Configure the XML file
The zip contains a Config - example.xml:
(Config - example.xml contains the MSDN subscription name, the administrator passwords and the name prefix of the farm VMs)
If you don't want to waste your time, simply change "MSDN subscription name" to your MSDN subscription name, change the passwords (substitute Password01 with your preferred one) and substitute the "sharepointLab01" prefix with something else (please note that the resulting VM name must be shorter than 15 characters).
Note that if you are behind proxies, firewalls and other stuff, you could have trouble.
Please take some minutes to read the FAQ below, since you'll need it; after this warm-up phase, simply run BuildEnvironment.ps1.

FAQ.
Q1: What is the average execution time of the batch script?
A1: The farm will be provisioned in approximately 1 hour. I could make it quicker, better, safer... but you have the code, make it better :)

Q2: Is the script somehow interactive?
A2: No. You will be prompted to proceed with the WinRM quick configuration just one time, when provisioning the SharePoint VM: click y and everything will be fine.

Q3: I noticed the script sometimes stops and gives me some error, what do I have to do?
A3: Simply solve the errors and re-run the script: steps already done will be bypassed when possible.

Q4: I received this error: "Connecting to remote server failed with the following message: The client cannot connect to the destination specified in the request blablabla", what do I have to do?
A4: Just keep calm and re-run the script. This error means the script is unable to connect to the VM, so if you're OK with firewalls, proxies and other security stuff, the connection simply failed: try again.
If the problem persists, try something more rude: delete the VM giving problems and re-run the script.

Q5: Are there some built-in users?
A5: Yes sir: SPFarm, which is also the farm admin, and SPServiceApp.

Q6: What about the service applications?
A6: The script provisions the User Profile, Managed Metadata and Search service applications, but you can run the quick wizard whenever you want if you need more.

Happy ending (hope so)
Download the code from here and have fun. 

Remember, this is not production software: it's something I use for me, myself and I, so I cannot promise everything is gonna be fine, even if I have provisioned some farms with it. But if it saves you time... come on, offer me a beer ;)

In loving memory of Roller

Me, myself and I

I'm just another IT guy sharing his knowledge with all of you out there.