another technical blog...technically

Showing posts with label Strategy. Show all posts

Monday, June 25, 2018

Make your life a little easier with ODBC

I've written about Blue Prism's DLL import problem and a possible workaround with custom DLLs, but we can also make our lives easier when the chance arises.
For example, in my last project I had to work with a lot of data sources at once: SQL Server, MySQL and Access.
All of the above are data sources that have ODBC connectors.
This means we can deal with all of these DBMSs (and many more) simply by installing the ODBC connectors on the robot machine and using this single VBO for all of them.
The key is to pass the driver info to the VBO so the connection string can be composed.
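Composing the connection string from the driver info might look like the sketch below. The helper name and the driver/key names are just examples; use the names of the ODBC drivers actually installed on the robot machine.

```csharp
using System;
using System.Collections.Generic;

static class OdbcConnectionStrings
{
    // Compose an ODBC connection string from the driver name plus key/value
    // settings passed in by the process. The keys below are illustrative.
    public static string Compose(string driver, params (string Key, string Value)[] settings)
    {
        var parts = new List<string> { "Driver={" + driver + "}" };
        foreach (var (key, value) in settings)
            parts.Add(key + "=" + value);
        return string.Join(";", parts);
    }

    static void Main()
    {
        string cs = Compose("ODBC Driver 17 for SQL Server",
            ("Server", "myserver"), ("Database", "mydb"), ("Trusted_Connection", "yes"));
        Console.WriteLine(cs);
        // Driver={ODBC Driver 17 for SQL Server};Server=myserver;Database=mydb;Trusted_Connection=yes
    }
}
```

Switching from SQL Server to MySQL or Access then only means passing a different driver name and settings; the VBO actions below stay untouched.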

So these are the VBO actions I created to deal with queries.

Read in collection

The action you need the most.
It has two pieces of code, because you may need to get a lot of data (more than 300,000 rows).
Using the SqlCommand variable you can do more than "Select * from table": you can also filter before getting the data.
The first code block is faster but will not work with a lot of data; the second one is slower but will work every time.
Pay attention, kiddo: BP passes parameters by value, not by reference, so you may need to do the "business work" inside the VBO action in order not to go "Out of memory".
OdbcConnection connection = new OdbcConnection(connectionString);
result = new DataTable();
exception = "";

try
{
 connection.Open();
 OdbcCommand command = new OdbcCommand(sqlCommand, connection);
 OdbcDataReader dataReader = command.ExecuteReader();
 result.Load(dataReader);
}
catch (Exception ex)
{ 
 exception = ex.Message;
}
finally
{
 connection.Close();
}

And this is the "big data" one.
OdbcConnection connection = new OdbcConnection(connectionString);
result = new DataTable();
exception = "";

try
{
 connection.Open();
 OdbcCommand command = new OdbcCommand(sqlCommand, connection);
 OdbcDataReader dataReader = command.ExecuteReader();

 DataTable resultSchema = dataReader.GetSchemaTable();
 List<DataColumn> listCols = new List<DataColumn>();
 if (resultSchema != null)
 {
  foreach (DataRow drow in resultSchema.Rows)
  {
   string columnName = System.Convert.ToString(drow["ColumnName"]);
   DataColumn column = new DataColumn(columnName, (Type)(drow["DataType"]));
   column.Unique = (bool)drow["IsUnique"];
   column.AllowDBNull = (bool)drow["AllowDBNull"];
   column.AutoIncrement = (bool)drow["IsAutoIncrement"];
   listCols.Add(column);
   result.Columns.Add(column);
  }
 }

 DataRow dataRow = null;
 while (dataReader.Read())
 {
  dataRow = result.NewRow();
  for (int i = 0; i < listCols.Count; i++)
  {
   dataRow[((DataColumn)listCols[i])] = dataReader[i];
  }
  result.Rows.Add(dataRow);
 }
}
catch (Exception ex)
{
 exception = ex.Message;
}
finally
{
 connection.Close();
 System.GC.Collect();
}

 Execute Command 

Just executes the query
OdbcConnection connection = new OdbcConnection(connectionString);
exception = "";

try
{
 connection.Open();
 OdbcCommand command = new OdbcCommand(sqlCommand, connection);
 command.ExecuteNonQuery();
}
catch (Exception ex)
{ 
 exception = ex.Message;
}
finally
{
 connection.Close();
}

 Execute Commands in a transaction

Executes multiple queries in a single transaction.

OdbcConnection connection = new OdbcConnection(connectionString);
OdbcTransaction transaction = null;
exception = "";

try
{
 connection.Open();
 transaction = connection.BeginTransaction();
 OdbcCommand command = null;
 
 foreach (DataRow dr in sqlCommands.Rows)
 {
  string sqlCommand = dr["Command"].ToString();
  command = new OdbcCommand(sqlCommand, connection, transaction);
  command.ExecuteNonQuery();
 }
 
 transaction.Commit();
}
catch (Exception ex)
{ 
 if (transaction != null)
 {
  transaction.Rollback();
 }
 exception = ex.Message;
}
finally
{
 connection.Close();
}
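From the process side, the sqlCommands input is just a Blue Prism collection, i.e. a DataTable with a single text column named "Command" (the column name the loop above reads). A sketch of how to build it, with hypothetical statements:

```csharp
using System;
using System.Data;

static class TransactionInput
{
    // Build the collection the transaction action expects: a DataTable with a
    // single text column named "Command", one row per statement to execute.
    public static DataTable BuildCommands(params string[] statements)
    {
        var table = new DataTable();
        table.Columns.Add("Command", typeof(string));
        foreach (string statement in statements)
            table.Rows.Add(statement);
        return table;
    }

    static void Main()
    {
        DataTable sqlCommands = BuildCommands(
            "INSERT INTO audit (msg) VALUES ('start')",
            "UPDATE jobs SET state = 'done' WHERE id = 42");
        Console.WriteLine(sqlCommands.Rows.Count); // 2
    }
}
```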


A good strategy could be to create CRUD actions for the ODBC driver, implicitly implementing the repository pattern.
Changing the ConnectionString will then just change the data source type, helping you create a "universal" repository.
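A minimal sketch of that idea: the process composes the CRUD SQL and hands it to the ODBC VBO actions, so only the connection string changes per DBMS. Class and method names are hypothetical, and real code should prefer parameterized commands over string splicing.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical helper that composes the CRUD SQL the ODBC VBO actions execute.
static class CrudSql
{
    public static string Select(string table, string where = null) =>
        "SELECT * FROM " + table +
        (string.IsNullOrEmpty(where) ? "" : " WHERE " + where);

    public static string Insert(string table, IList<KeyValuePair<string, string>> values)
    {
        var columns = new List<string>();
        var literals = new List<string>();
        foreach (var kv in values)
        {
            columns.Add(kv.Key);
            // Naive quoting for the sketch; real code should parameterize.
            literals.Add("'" + kv.Value.Replace("'", "''") + "'");
        }
        return "INSERT INTO " + table + " (" + string.Join(", ", columns) +
               ") VALUES (" + string.Join(", ", literals) + ")";
    }

    public static string Delete(string table, string where) =>
        "DELETE FROM " + table + " WHERE " + where;

    static void Main()
    {
        Console.WriteLine(Select("products", "price > 10"));
        // SELECT * FROM products WHERE price > 10
    }
}
```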
And... that's it for now!

Monday, April 25, 2016

Howto search for memory leak and save your marriage

Someone said (on a Friday afternoon): "your application has got a memory leak and it's your fault".
Because my wife becomes very nervous when I have to work during the weekend, I asked for some help in order to work this problem as fast as possible. So let's start with credits to Iron Man and Mr. Pumpkin (only nicknames here: privacy is privacy).
This is what I learned from the pros who helped me do a drill-down analysis.
Pre-conditions:
  1. You know the farm sometimes goes down
  2. You must have the front-end IIS logs
  3. You know there are lots of objects in memory (as seen in w3wp.dmp)
  4. You need the ULS logs

The bug-hunting process deals with:
  1. Find the use case that reproduces the memory issue
  2. Replicate the case in your dev environment
  3. Analyze the issue
  4. Conclusions

Step 1 - Find the use case that reproduces the memory issue
To accomplish the first task, I took the IIS logs in order to find out "who was doing what" on the web application. IIS logs are far too long for Notepad, so Mr. Pumpkin said: "Use Lizard, download it from here http://www.lizard-labs.com/log_parser_lizard.aspx"
Lizard lets you run SQL queries against log files. IIS logs contain lots of rows with these fields:
  • date
  • time
  • s-sitename
  • s-computername
  • s-ip
  • cs-method
  • cs-uri-stem 
  • cs-uri-query 
  • s-port 
  • cs-username 
  • c-ip cs-version 
  • cs(User-Agent) 
  • cs(Cookie) 
  • cs(Referer) 
  • cs-host 
  • sc-status 
  • sc-substatus 
  • sc-win32-status 
  • sc-bytes 
  • cs-bytes 
  • time-taken
etc. But as Mr. Pumpkin said, please be aware that IIS logs are in UTC, so take care to use this query (maybe restricting the WHERE conditions to a particular time slot near the memory issue):
SELECT
   TO_LOCALTIME(TO_TIMESTAMP(date, time))
   ,date
   ,time
   ,s-sitename
   ,s-computername
   ,s-ip
   ,cs-method
   ,cs-uri-stem
   ,cs-uri-query
   ,s-port
   ,cs-username
   ,c-ip
   ,cs-version
   ,cs(User-Agent)
   ,cs(Cookie)
   ,cs(Referer)
   ,cs-host
   ,sc-status
   ,sc-substatus
   ,sc-win32-status
   ,sc-bytes
   ,cs-bytes
   ,time-taken
FROM 'C:\Users\Varro\Downloads\iis_frontend47.log'
This way, the TO_LOCALTIME(TO_TIMESTAMP(date, time)) column will give you the local time. Now it's search time: filter on whatever you want. In my case, I took a look at sc-status (the request status code) and time-taken in order to find the most time-consuming calls, and I found some interesting data that helped me replicate the behaviour users had before memory allocation went out of control.
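The same UTC-to-local conversion that TO_LOCALTIME performs can be sketched in C#, if you post-process the log rows yourself (the date/time format is the standard W3C one used by IIS):

```csharp
using System;
using System.Globalization;

static class IisLogTime
{
    // IIS writes the date and time fields in UTC; parse them as UTC so that
    // ToLocalTime() later gives the event's local wall-clock time.
    public static DateTime ParseUtc(string date, string time)
    {
        return DateTime.ParseExact(
            date + " " + time, "yyyy-MM-dd HH:mm:ss",
            CultureInfo.InvariantCulture,
            DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal);
    }

    static void Main()
    {
        DateTime utc = ParseUtc("2016-04-22", "14:05:31");
        Console.WriteLine(utc.Kind);          // Utc
        Console.WriteLine(utc.ToLocalTime()); // local time of the event
    }
}
```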

Step 2 - Replicate the case in your dev environment
This is one of the most interesting parts of the job. Now that I had the users' behaviour, I performed the same actions on my system with the memory profiler attached in Visual Studio. Below are the steps Iron Man explained to me:

1. Create a new performance session
 

2.  Change properties of performance session and choose performance counters

  • .NET CLR Memory/# Bytes in all Heaps
  • .NET CLR Memory/Large Object Heap Size
  • .NET CLR Memory/Gen 2 heap size
  • .NET CLR Memory/Gen 1 heap size
  • .NET CLR Memory/Gen 0 heap size
  • .NET CLR Memory/Induced GC
  • Process/Private Bytes
  • Process/Virtual Bytes

3. Attach performance session  to SP process


4. Execute the use case and monitor Task Manager; when you see memory going up, create some memory dumps of the process


5. When you end the use case, stop everything and watch the results ;)

Pay attention to this: you may need to run these commands from the Visual Studio command prompt in order to make the profiler work
   vsperfclrenv /globalsampleon
   iisreset
Step 3 - Analyze the issue
What I discovered is that there was indeed high memory consumption, so I tried to find the guilty method.
I did this using the memory profiler report and DebugDiag (https://www.microsoft.com/en-us/download/details.aspx?id=49924) in order to process the DMP files.

Memory DMP report
Memory profiler report
Iron Man also explained that I needed to replicate the same pattern of memory allocation (as seen in the customer's w3wp.dmp) in order to have a smoking gun. So I also used the debugger to freeze the code at some interesting points and took specific memory dumps from Task Manager.

Step 4 - Conclusions
This pretty complex analysis helped me discover that our application is quite slow in some particular circumstances. But even if some operations stress memory allocation and the garbage collector a lot (some methods like SubMethod1 could be more efficient), it's also true that memory is released after those operations. So, if the farm falls down when users perform this operation... maybe it's time to give your farm just a little bit more RAM.

That's all folks.

Wednesday, July 29, 2015

Strategy: catalog connection with managed metadata multivalue field

Everybody knows about SharePoint catalog connection... maybe

What if you have a pre-existing catalog connection and your customer asks you to switch from a single-value field (generally ItemCategory) to a multivalue field?

In an ideal world you would quote this and say "don't do it, don't even think about it".

In the business world you'll say "Yes, why not?".
I tried to convert the single-value field into a multivalue field... trust me... don't try this if you don't like strange side effects.

AS IS
This is the typical product catalog configuration:
  • A product catalog site which is used as backend website
  • Contents are crawled by search service application
  • Public sites connect to indexed catalog


Solution
To understand the next steps, bear in mind that every link under a category navigation term is allowed.
Let's assume you have a term set like this:
  • Category 1
    • Subcategory 11
    • Subcategory 12
  • Category 2 
    • Subcategory 21
The catalog connection automatically creates these category friendly urls:

http://site/category1
http://site/category1/subcategory11
http://site/category1/subcategory12
http://site/category2
http://site/category2/subcategory21

So let's assume I have a product named Test, which is tagged in ItemCategory with the term "Subcategory 11"; I'll reach it at the URL http://site/category1/subcategory11/test .

But what if I told you that links like those below will not respond with a 404?
http://site/category1/test
http://site/category1/subcategory12/test
http://site/category2/test
http://site/category2/subcategory21/test

This behaviour will be extremely useful for us.

Editing product catalog
Because you don't want to destroy the catalog connection, you just have to add a TaxonomyFieldTypeMulti field to the product catalog item content type:

<Field Type="TaxonomyFieldTypeMulti" ... >
  <Customization>
    <ArrayOfProperty>
      <Property>
        <Name>TextField</Name>
        <Value xmlns:q6="http://www.w3.org/2001/XMLSchema" p4:type="q6:string" xmlns:p4="http://www.w3.org/2001/XMLSchema-instance">{024FECDC-E8A7-4DAC-BEB1-E9C373708BE5}</Value>
      </Property>
    </ArrayOfProperty>
  </Customization>
</Field>

Link this new field to the term set you use to tag catalog items.
After that, you can create an event receiver in order to write the first value entered in the MultivalueItemCategory field into the ItemCategory field (or whatever field you use for the catalog connection).
Create Content Search Web Part replacements
This is the most annoying part: we have to create a web part that we will use in category pages.
This web part essentially does four things:
  1. Recover the current navigation term
  2. Run a search query for the term related to the navigation term, in the MultivalueItemCategory field we defined above
  3. Substitute the Path values (catalog item friendly URLs) in order to make links relative to the current navigation term
  4. Show results
For the single product page, you can create a web part or a page layout, which has to contain methods similar to the previous web part's.
This web part must get the product using the navigation term and the friendly URL segment, and show the result.
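The Path substitution in step 3 is essentially string work. A sketch, with a hypothetical helper name (real code would take the item's friendly URL from the search result and the term URL from the current navigation context):

```csharp
using System;

static class CatalogUrls
{
    // Rewrite a catalog item's friendly URL so it hangs off the term the user
    // is currently browsing, exploiting the fact that any category prefix
    // before the item's own segment still resolves (no 404, as noted above).
    public static string MakeRelativeToTerm(string itemFriendlyUrl, string currentTermUrl)
    {
        // Keep only the last segment (the item's own slug)...
        string slug = itemFriendlyUrl.TrimEnd('/');
        slug = slug.Substring(slug.LastIndexOf('/') + 1);
        // ...and re-root it under the current navigation term.
        return currentTermUrl.TrimEnd('/') + "/" + slug;
    }

    static void Main()
    {
        Console.WriteLine(MakeRelativeToTerm(
            "http://site/category1/subcategory11/test",
            "http://site/category2"));
        // http://site/category2/test
    }
}
```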

Editing publishing site 
In the Pages document library, you have to create a category page (which contains the category CSWP replacement) and a product page (which contains the product CSWP replacement), and then edit the target page settings for the terms like this:


And that's all folks.

Me, myself and I

I'm just another IT guy sharing his knowledge with all of you out there.
Wanna know more?