another technical blog...technically

Showing posts with label Blue Prism.

Saturday, November 2, 2019

Playing with BP 5.x and queue item data dump

Hi guys, this is just a brief post about dumping queue items.
I found this very useful with BP version 5.x and very high data volumes, which caused failures in the Get Next Item action.
So I needed to dump the data to another table in order to keep the DB as small as possible while waiting for the customer's permission to wipe all the old data (or to restore just a part of it).

The SQL script is self-explanatory and deals with ID consistency; nothing hard to do. But if you're asking whether you can do this operation without corrupting the DB: yes, you can, so have fun:


-- We only need to create the dump table once; alternatively you can use SELECT * INTO, which creates a new table every time
IF OBJECT_ID('dbo.BPAWorkQueueItem_TEMP', 'U') IS NOT NULL
 DROP TABLE [dbo].[BPAWorkQueueItem_TEMP]
CREATE TABLE [dbo].[BPAWorkQueueItem_TEMP](
 [id] [uniqueidentifier] NOT NULL,
 [queueid] [uniqueidentifier] NOT NULL,
 [keyvalue] [nvarchar](255) NULL,
 [status] [nvarchar](255) NULL,
 [attempt] [int] NULL,
 [loaded] [datetime] NULL,
 [completed] [datetime] NULL,
 [exception] [datetime] NULL,
 [exceptionreason] [nvarchar](max) NULL,
 [deferred] [datetime] NULL,
 [worktime] [int] NULL,
 [data] [nvarchar](max) NULL,
 [queueident] [int] NOT NULL,
 [ident] [bigint] IDENTITY(1,1) NOT NULL,
 [sessionid] [uniqueidentifier] NULL,
 [priority] [int] NOT NULL,
 [prevworktime] [int] NOT NULL,
 [attemptworktime]  AS ([worktime]-[prevworktime]) PERSISTED,
 [finished]  AS (isnull([exception],[completed])) PERSISTED,
 [exceptionreasonvarchar]  AS (CONVERT([nvarchar](400),[exceptionreason])),
 [exceptionreasontag]  AS (CONVERT([nvarchar](415),N'Exception: '+replace(CONVERT([nvarchar](400),[exceptionreason]),N';',N':'))) PERSISTED,
 [encryptid] [int] NULL,
 [lastupdated]  AS (coalesce([completed],[exception],[loaded])) PERSISTED,
 [locktime] [datetime] NULL,
 [lockid] [uniqueidentifier] NULL)
GO

-- We enable identity insert to preserve the IDs assigned by Blue Prism, then we dump the data.
-- In the WHERE condition we filter on queue name and completed date (so we don't take any pending or deferred items)
SET IDENTITY_INSERT [dbo].[BPAWorkQueueItem_TEMP] ON

INSERT INTO [dbo].[BPAWorkQueueItem_TEMP](
 [id]
 ,[queueid]
 ,[keyvalue]
 ,[status]
 ,[attempt]
 ,[loaded]
 ,[completed]
 ,[exception]
 ,[exceptionreason]
 ,[deferred]
 ,[worktime]
 ,[data]
 ,[queueident]
 ,[ident]
 ,[sessionid]
 ,[priority]
 ,[prevworktime]
 ,[locktime]
 ,[lockid])
 SELECT wqi.[id]
    ,wqi.[queueid]
    ,wqi.[keyvalue]
    ,wqi.[status]
    ,wqi.[attempt]
    ,wqi.[loaded]
    ,wqi.[completed]
    ,wqi.[exception]
    ,wqi.[exceptionreason]
    ,wqi.[deferred]
    ,wqi.[worktime]
    ,wqi.[data]
    ,wqi.[queueident]
    ,wqi.[ident]
    ,wqi.[sessionid]
    ,wqi.[priority]
    ,wqi.[prevworktime]
    ,wqi.[locktime]
    ,wqi.[lockid]

   FROM [BluePrism].[dbo].[BPAWorkQueueItem] as wqi
   JOIN [BluePrism].[dbo].[BPAWorkQueue] as wq on wqi.queueid = wq.id
   WHERE 
  wq.name = 'Queue.BonificiEsteroEntrata.ContiAttesaLavorazione' AND 
  finished < DATEADD(MONTH, -3, GETDATE())
GO

-- Here we restore the identity insert setting
SET IDENTITY_INSERT [dbo].[BPAWorkQueueItem_TEMP] OFF

-- Finally we erase the dumped data from the original table
DELETE FROM [dbo].[BPAWorkQueueItem] WHERE 
 EXISTS (SELECT * FROM [dbo].[BPAWorkQueueItem_TEMP] WHERE [BPAWorkQueueItem].ident = [BPAWorkQueueItem_TEMP].ident)

Thursday, August 15, 2019

Morning routine with BluePrism and PowerShell

If you have deployed lots (or tons) of robots, you know that maintenance is the big elephant in the room.
The Blue Prism scheduler is not so "friendly" and sometimes not so responsive.

Lately I implemented some warm-up scripts that simply restart the SQL Server service at 00:00 and the Blue Prism application server at 1:00 AM. Resource PCs are rebooted as well at 2:00 AM.
Resource PCs are scheduled to start working at 3:00 AM even if nobody is at the office, but the problem comes when the first colleague starts working at 7:00 AM.

Imagine you have something like 40 machines and you have to connect to each one and check whether the schedule started correctly... I know, it's a bit of a boring task.
The script I will show you is made of different subscripts; depending on your scenario, you can split it into more pieces.

Please note that in our project we have a very complex schedule: every machine needs to run different processes at different hours, with different cut-offs.
For every machine we have two schedules:
  1. One which includes the login and then alternates the different processes (e.g. MACHINE01)
  2. One equal to the previous, but without the login (e.g. MACHINE01.Backup)
The script knows which machines are involved and their respective Backup schedules.
So it first opens as many RDP sessions as there are machines involved; when all the machines are visible and tiled on the screen, it starts to check (using telnet) which machines are running something.
If the machine running the script is equipped with Blue Prism, it will try to launch the schedule using AutomateC.exe.

The code is quite self-explanatory, so please have a look below and enjoy.

# BEGIN Configs
$PATH_BUFFERFILE = "C:\temp\roboot.txt"
$PATH_AUTOMATEC = "C:\Program Files\Blue Prism Limited\Blue Prism Automate\AutomateC.exe"
$PATH_RDPFILES = "P:\Users\Varro\Desktop\RDP\"

$SCEHDULE_LASTCHECK = '20:30'

$RESOURCEPC_PORT = 8182
$RESOURCEPC_SCHEDULES = @{}
$RESOURCEPC_SCHEDULES.Add('MACHINE01_HOSTNAME','MACHINE01.Backup')
$RESOURCEPC_SCHEDULES.Add('MACHINE02_HOSTNAME','MACHINE02.Backup')
$RESOURCEPC_SCHEDULES.Add('MACHINE03_HOSTNAME','MACHINE03.Backup')
$RESOURCEPC_SCHEDULES.Add('MACHINE04_HOSTNAME','MACHINE04.Backup')
$RESOURCEPC_SCHEDULES.Add('MACHINE05_HOSTNAME','MACHINE05.Backup')
# END Configs

# BEGIN - Functions written by someone smarter than me
Function Show-Process($Process, [Switch]$Maximize)
{
  $sig = '
    [DllImport("user32.dll")] public static extern bool ShowWindowAsync(IntPtr hWnd, int nCmdShow);
    [DllImport("user32.dll")] public static extern int SetForegroundWindow(IntPtr hwnd);
  '
  
  if ($Maximize) { $Mode = 3 } else { $Mode = 4 }
  $type = Add-Type -MemberDefinition $sig -Name WindowAPI -PassThru
  $hwnd = $process.MainWindowHandle
  $null = $type::ShowWindowAsync($hwnd, $Mode)
  $null = $type::SetForegroundWindow($hwnd) 
}

Function Get-Telnet
{   Param (
        [Parameter(ValueFromPipeline=$true)]
        [String[]]$Commands = @("username","password","disable clipaging","sh config"),
        [string]$RemoteHost = "HostnameOrIPAddress",
        [string]$Port = "23",
        [int]$WaitTime = 1000,
        [string]$OutputPath = "\\server\share\switchbackup.txt"
    )
    #Attach to the remote device, setup streaming requirements
    $Socket = New-Object System.Net.Sockets.TcpClient($RemoteHost, $Port)
    If ($Socket)
    {   $Stream = $Socket.GetStream()
        $Writer = New-Object System.IO.StreamWriter($Stream)
        $Buffer = New-Object System.Byte[] 1024 
        $Encoding = New-Object System.Text.AsciiEncoding

        #Now start issuing the commands
        ForEach ($Command in $Commands)
        {   $Writer.WriteLine($Command) 
            $Writer.Flush()
            Start-Sleep -Milliseconds $WaitTime
        }
        #All commands issued, but since the last command is usually going to be
        #the longest let's wait a little longer for it to finish
        Start-Sleep -Milliseconds ($WaitTime * 4)
        $Result = ""
        #Save all the results
        While($Stream.DataAvailable) 
        {   $Read = $Stream.Read($Buffer, 0, 1024) 
            $Result += ($Encoding.GetString($Buffer, 0, $Read))
        }
    }
    Else     
    {   $Result = "Unable to connect to host: $($RemoteHost):$Port"
    }
    #Done, now save the results to a file
    $Result | Out-File $OutputPath
    return $Result
}
# END - Functions written by someone smarter than me

# MAIN
# 1. Close all RDP session
Get-Process | Where-Object { $_.Path -like "*mstsc*" } | Stop-Process -Force

# 2. Wait all RDP sessions are closed
Do {
    $rdpSessions = Get-Process | Where-Object { $_.Path -like "*mstsc*" }
    Start-Sleep -s 1
} While ($rdpSessions.Count -ne 0)

#3. Open RDP sessions
ForEach ($resourcePc in $RESOURCEPC_SCHEDULES.Keys) {   
    $arg = $PATH_RDPFILES + $resourcePc + ".rdp"
    Start-Process "mstsc" -ArgumentList """$arg"""
    Start-Sleep -s 1
}

#4. Wait all RDP sessions are opened
Do {
    $rdpSessions = Get-Process | Where-Object { $_.Path -like "*mstsc*" }
    Start-Sleep -s 2
} While ($rdpSessions.Count -ne $RESOURCEPC_SCHEDULES.Keys.Count)


#5. Wait until you are logged in to every RDP session the script opened, then press ENTER
$process = Get-Process -Id $PID
Write-Host $process
Show-Process -Process $process -Maximize

$key = Read-Host "Press ENTER key when all are connected"

#6. Tile all RDP sessions vertically
$shell = New-Object -ComObject Shell.Application
$shell.TileVertically()

#7. Decide what to do according to the time of day (skip if the current time is past the configured last-check time, or if it's a weekend day)
$now = (Get-Date)
$nowDay = $now.DayOfWeek.value__   # Sunday = 0, Saturday = 6
Write-Host "Today is $nowDay - $($now.TimeOfDay)"

if ($now.TimeOfDay -gt $SCEHDULE_LASTCHECK -Or $nowDay -eq 6 -Or $nowDay -eq 0)
{
    Write-Host "Too late (or a weekend)... maybe tomorrow" -BackgroundColor Red -ForegroundColor White
} 
else 
{

    #8. Checking if machines are working (using telnet to get those data)
    [System.Collections.ArrayList]$resourcePC_Problematic = @()
    foreach ($resourcePc in $RESOURCEPC_SCHEDULES.Keys)
    {
        Write-Host "Check $resourcePc machine" -ForegroundColor Yellow
        # Remove buffer file, call telnet and get message
        Remove-Item $PATH_BUFFERFILE -ErrorAction Ignore
        $telnetContent = Get-Telnet -RemoteHost $resourcePc -Port $RESOURCEPC_PORT -Commands "status" -OutputPath "$PATH_BUFFERFILE"
        Write-Host $telnetContent -ForegroundColor Yellow
        
        #9a. If contains running, something is running, so move on
        if ($telnetContent.Contains("RUNNING")) 
        { 
            Write-Host "$resourcePc is working" -ForegroundColor Green
        }
        #9b. If does not contains running, time to run the schedule
        else
        {
            $scheduleName = $RESOURCEPC_SCHEDULES[$resourcePc]
            Write-Host "Run now scheduled task $scheduleName on $resourcePc" -BackgroundColor Red -ForegroundColor White
            $cmd = "cmd.exe /C ""$PATH_AUTOMATEC"" /sso /startschedule /schedule $scheduleName"
            Invoke-Expression -Command:$cmd
        }
        
        #9c. Clean it up
        Remove-Item $PATH_BUFFERFILE
        Write-Host 
        Write-Host 
    }

    #10. Check again all machines to check are running something, if not, write who's doing nothing
    foreach ($resourcePc in $RESOURCEPC_SCHEDULES.Keys)
    {
        Remove-Item $PATH_BUFFERFILE -ErrorAction Ignore
        $telnetContent = Get-Telnet -RemoteHost $resourcePc -Port $RESOURCEPC_PORT -Commands "status" -OutputPath "$PATH_BUFFERFILE"
        if (!$telnetContent.Contains("RUNNING")) 
        { 
            $null = $resourcePC_Problematic.Add($resourcePc)
        }
    }

    Write-Host "Resource PCs not running anything: $resourcePC_Problematic" -BackgroundColor Red -ForegroundColor White
}


Read-Host "Press ENTER to exit"

Monday, April 22, 2019

Anatomy of a BP consumer template

After a lot of work, Antonio Durante and I decided to create a template to standardize, in some way, how we build consumer processes in Blue Prism.
So let me introduce DaTemplate.

DaTemplate overview
This is the consumer round trip for a typical process that works with queues: I use it every time.
The concept is: always raise exceptions to the highest level.

Let's examine the right part first.
After Start I reference two pages:
  1. Reset: no matter what happened before, close every application that could somehow create problems for our process;
  2. Config: configure the environment if needed and get the process global variables.
Then I start with the Get Next, but I put all the item data into global variables, so it is accessible everywhere. No more global variables except the item ones and the config ones; all other variables must be local to the page they are declared in.
In the end I call a page named "Item Logic", which is specific to each process.

The right part

So in the process pages you will use Recover-Resume blocks to manage exceptions, but if the result is a Business/Managed Exception or a Technical Exception, don't do anything and let the left part of the template work for you.
The exception message is extracted and the right action is taken according to the exception type.
For a Business or Managed Exception, the item is marked Success (I consider a recognized exception a success only in the technical log) and logged as an exception in the business log.
If the exception is unknown or a technical one, the processing of that item is repeated up to MaxAttempts times before it becomes a Managed Exception.
The error is screenshotted for further investigation.

The left part for managing exceptions
Please note that on Get Next and Check Attempts there is a Recover-Resume block.
This is due to the potential failure of the Get Next, and also to a potential failure in the left part of the template that could create an infinite loop.
If we get an unexpected exception, those Recover-Resume blocks take the template back to a new Get Next operation, taking just 5 seconds to breathe (in the case of a Get Next failure).

That's all folks, I hope you find it useful.

Monday, April 15, 2019

BP and JSON

As you already know, BP works on queues.

What if a process is not based on queue items but on a sequence of bulk operations?
You can think of every bulk operation as a fake queue item and manage them from Control Room.
But maybe you want to apply some logic. For example, let's assume your process is made of 4 subprocesses (A, B, C, D): if something goes wrong with A, you don't want to execute C, while B and D are unaffected.

Yeah, maybe too simple an example, but use your imagination... believe me, you don't want to know anything about the process that, in the real world, lives on this solution (Alberto Cannas went mad over it).
So think about a process that doesn't rely on queues and where you want to play with subprocesses that can be seen as transactions.
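The A/B/C/D dependency idea above can be sketched like this (a hypothetical sketch, not the actual BP implementation: the state values and the shouldRun helper are illustrative):

```javascript
// Illustrative state of the four subprocesses; in the real solution this
// kind of state lives in the JSON file discussed below.
const state = { A: 'failed', B: 'done', C: 'pending', D: 'pending' };

// C depends on A; B and D have no dependencies in this example.
const dependencies = { A: [], B: [], C: ['A'], D: [] };

function shouldRun(name) {
  // Run a subprocess only if it is still pending and
  // all of its dependencies completed successfully.
  return state[name] === 'pending' &&
         dependencies[name].every(dep => state[dep] === 'done');
}
```

With A failed, shouldRun('C') is false while D can still run.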

We could store the data in a DB, but what if we cannot use a DB? We can use a JSON file as a DB, in order to have a flexible store that helps us.
For example, in the process we are dealing with, we use a JSON file that is generated every day, and the process state is represented by data in a collection.

To make it all clear, I will show you a little process that simply exports data only at 10:00/12:00/14:00/16:00 using JSON. This is what the BP object Utility - JSON wrote when using the Collection to JSON action:

{"Log 10":false, "Log 12":false, "Log 14":false, "Log 16":false} 

So you can read the information you stored in the JSON file, implement your logic, and write it back only when needed.
You should take care of two things:
  1. The subprocess that works on the file must be wrapped in an environment lock; this is essential in a multi-robot environment;
  2. Pay attention to what you write in the JSON: the object writes the collection as a JSON array, so you have to remove the brackets at the beginning and at the end of the JSON (thanks to Manuel Mascia for the trick).
Yeah... tips'n'tricks.
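The bracket trick above, as a minimal sketch (the JSON string is the one from this post; the stripping approach is illustrative):

```javascript
// A single-row collection serialized as JSON comes out as an array
// (note the surrounding brackets).
const fromBp = '[{"Log 10":false,"Log 12":false,"Log 14":false,"Log 16":false}]';

// Remove the leading '[' and trailing ']' so the file holds a plain object.
const objectJson = fromBp.trim().replace(/^\[/, '').replace(/\]$/, '');

const state = JSON.parse(objectJson);
state['Log 10'] = true; // e.g. mark the 10:00 export as done before writing back
```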

Now you can push this solution beyond this example.
That's all folks.

Monday, April 8, 2019

BP and Angular

Working with Antonio Durante, we faced a problem while automating a modern web application written in Angular.
Let me explain the scenario: we spied all the textboxes with the HTML spy mode and wrote values into them, but the system returned an error.
I cannot show you anything else because... come on, you know why, but focus on the red border and trust me that the message shown by the system was: please enter a value.

And the system said: please enter a value

Why did this happen?
Simply because, even though we wrote something, the ng-pristine class is still there.
What is ng-pristine? A class that tells the model that no field has been modified.

How do we solve this problem? By convincing Angular you did something on the interface.
In the example below, we wrote the value 7200 into the field named agenziaOrdinante.
After that we force Angular to call the change event and then the blur event; this way we notify the model that something has changed, and with the blur event we make sure to trigger the other potential events that are fired when we click elsewhere.
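A minimal sketch of the kind of JavaScript involved (buildWriteScript is a hypothetical helper, not the actual VBO code; the field name agenziaOrdinante and the value 7200 come from the example above):

```javascript
// Builds the script to inject into the page: write the value, then fire
// the change and blur events so Angular's model notices the edit.
function buildWriteScript(selector, value) {
  return [
    "var el = document.querySelector('" + selector + "');",
    "el.value = '" + value + "';",
    "el.dispatchEvent(new Event('change', { bubbles: true }));",
    "el.dispatchEvent(new Event('blur'));"
  ].join('\n');
}

const script = buildWriteScript("input[name='agenziaOrdinante']", "7200");
```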

JS = lifesaver
In the end, remember to create a VBO that writes data using JS, generating and launching the JavaScript snippets one by one.
Instead of taking an element name as input, in the final version I preferred to take a JS selector as input, to make the code more generic.

Write data action example
And that's all folks. As usual, thanks to Antonio Durante for the fun trip.

Monday, April 1, 2019

Add credentials from BP... programmatically

As you know, you can use BP for more than just automating business processes.
For example, in our scenario (we are talking about an RPA program with more than 30 complex automations), we have credentials that expire if you don't log in with them for a certain period of time.
On the other hand, every time we provision a new process, we have to add to the Credential Manager all the credentials needed by the process itself (call this value x) for all the machines involved (call this value y). So it's easy to understand that provisioning and keeping credentials alive brought me to a new limit: BP does not provide anything to create credentials programmatically, so SQL is our ally.
The key is to launch this custom SQL stored procedure from BP somehow (via ODBC or whatever).
The result will be a credential with all roles, with permissions on all processes and resource PCs, and with a void password.
Please note that, for BP, void is 3uDCB67zyUqBsHym7o635w==:JM6brcdYVFsZhbEKISRtaQ==, because the value is encoded and may vary depending on the encryption scheme, so make sure you check first what the null password value is on your farm.

Add credentials script

USE [BluePrism]
GO
/****** Object:  StoredProcedure [dbo].[BPAddUser]    Script Date: 04/10/2018 14:45:30 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

ALTER PROCEDURE [dbo].[BPAddUser]
       @userLogin nvarchar(128)
AS
BEGIN
       -- SET NOCOUNT ON added to prevent extra result sets from
       -- interfering with SELECT statements.
       SET NOCOUNT ON;

    -- Insert statements for procedure here
       BEGIN 

             if not exists (select * from BPACredentials where login = @userLogin AND name = @userlogin)
                    BEGIN
                           declare @newid uniqueidentifier = NEWID();

                           INSERT INTO [dbo].[BPACredentials]
                                  ([id],[name],[description],[login],[password],[expirydate],[invalid],[encryptid])
                           VALUES
                           (@newid,@userLogin,'Created by script',@userLogin,'3uDCB67zyUqBsHym7o635w==:JM6brcdYVFsZhbEKISRtaQ==',NULL,0,1)

                           INSERT INTO [dbo].[BPACredentialRole]
                                  ([credentialid],[userroleid])
                           VALUES
                                  (@newid, NULL)

                           INSERT INTO [dbo].[BPACredentialsProcesses]
                                  ([credentialid],[processid])
                           VALUES
                                  (@newid, NULL)

                           INSERT INTO [dbo].[BPACredentialsResources]
                                  ([credentialid],[resourceid])
                           VALUES
                                  (@newid, NULL)
                    END
             Else
                    BEGIN
                    
                    DECLARE @credentialId uniqueidentifier;
                    select @credentialId = id from BPACredentials where login = @userLogin AND name = @userlogin;
                    
                    -- Create or overwrite 
                    if not exists (select * from BPACredentialRole where credentialid = @credentialId)
                           INSERT INTO BPACredentialRole([credentialid],[userroleid]) values (@credentialId, NULL)
                    else
                           update BPACredentialRole set userroleid = NULL where credentialid = @credentialId
                           

                    -- Create or overwrite 
                    if not exists (select * from BPACredentialsProcesses where credentialid = @credentialId)
                           INSERT INTO BPACredentialsProcesses([credentialid],[processid]) values (@credentialId, NULL)
                    else
                           update BPACredentialsProcesses set processid = NULL where credentialid = @credentialId


                    -- Create or overwrite
                    if not exists (select * from BPACredentialsResources where credentialid = @credentialId)
                           INSERT INTO BPACredentialsResources([credentialid],[resourceid]) values (@credentialId, NULL)
                    else
                           update BPACredentialsResources set resourceid = NULL where credentialid = @credentialId
             END

       END
END

Monday, July 2, 2018

Scheduler is locked?

Lately, a lot of people talk about RPA, but very few people talk about what happens when you have to scale and you have something like 20 robots working all together, synchronized, with a "time/resource filling" target.
Sometimes you may see scheduled tasks stuck forever in "Pending", not responding to any delete or run request.
The solution: go to the DB and use this query to find all the pending processes in a human-readable way.

  SELECT 
       s.sessionid,
       r.name,
       p.name,
       s.startdatetime,
       st.statusid,
       st.description
  FROM 
  [BluePrism].[dbo].[BPASession] as s
  inner join BluePrism.dbo.BPAProcess as p
  on s.processid = p.processid
  inner join BluePrism.dbo.BPAResource as r 
  on r.resourceid = s.runningresourceid
  inner join BluePrism.dbo.BPAStatus as st
  on s.statusid = st.statusid
  
  where st.description = 'Pending'
  order by s.startdatetime

Then just change the end date from NULL to another value (maybe the same as the start date) and the statusid to 2, which stands for the Terminated status, and the locked resource will be released ;)

Monday, June 25, 2018

Make your life a little easier with ODBC

I've written about the DLL import problem of Blue Prism and about possible workarounds with custom DLLs, but we can also make our life easier when we can.
For example, in my last project I had to work with a lot of data sources: SQL Server, MySQL and Access, all together.
All of the above are data sources that have ODBC connectors.
This means we can deal with all these DBMSs (and much more) simply by installing the ODBC connectors on the robot machine and using a single VBO.
The key is to pass the driver info to the VBO so the connection string can be composed.
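Composing the connection string from the driver info might look like the following sketch (the driver name and key/value pairs are illustrative assumptions, not the VBO's actual inputs):

```javascript
// Build a key=value;key=value ODBC connection string from the driver name
// plus whatever settings the chosen driver needs.
function buildConnectionString(driver, settings) {
  const parts = ['Driver={' + driver + '}'];
  for (const key in settings) {
    parts.push(key + '=' + settings[key]);
  }
  return parts.join(';');
}

const connStr = buildConnectionString('ODBC Driver 17 for SQL Server',
  { Server: 'MYSERVER', Database: 'BluePrism', Trusted_Connection: 'Yes' });
// → "Driver={ODBC Driver 17 for SQL Server};Server=MYSERVER;Database=BluePrism;Trusted_Connection=Yes"
```

Swapping the driver and settings is all it takes to point the same VBO at MySQL or Access instead.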

So these are the VBO actions I created to deal with queries.

Read in collection

The action you need the most.
It has two pieces of code, because you may need to get a lot of data (more than 300,000 rows).
Using the SqlCommand variable you can do more than "SELECT * FROM table": you can also filter before getting the data.
The first code block is faster but will not work with a lot of data; the second one is slower but works every time.
Pay attention, kiddo: BP passes parameters by value, not by reference, so you may need to do the "business work" inside the VBO action in order not to go Out of Memory.
OdbcConnection connection = new OdbcConnection(connectionString);
result = new DataTable();
exception = "";

try
{
 connection.Open();
 OdbcCommand command = new OdbcCommand(sqlCommand, connection);
 OdbcDataReader dataReader = command.ExecuteReader();
 result.Load(dataReader);
}
catch (Exception ex)
{ 
 exception = ex.Message;
}
finally
{
 connection.Close();
}

And this is the "big data" one.
OdbcConnection connection = new OdbcConnection(connectionString);
result = new DataTable();
exception = "";

try
{
 connection.Open();
 OdbcCommand command = new OdbcCommand(sqlCommand, connection);
 OdbcDataReader dataReader = command.ExecuteReader();

 DataTable resultSchema = dataReader.GetSchemaTable();
 List<DataColumn> listCols = new List<DataColumn>();
 if (resultSchema != null)
 {
  foreach (DataRow drow in resultSchema.Rows)
  {
   string columnName = System.Convert.ToString(drow["ColumnName"]);
   DataColumn column = new DataColumn(columnName, (Type)(drow["DataType"]));
   column.Unique = (bool)drow["IsUnique"];
   column.AllowDBNull = (bool)drow["AllowDBNull"];
   column.AutoIncrement = (bool)drow["IsAutoIncrement"];
   listCols.Add(column);
   result.Columns.Add(column);
  }
 }

 DataRow dataRow = null;
 while (dataReader.Read())
 {
  dataRow = result.NewRow();
  for (int i = 0; i < listCols.Count; i++)
  {
   dataRow[((DataColumn)listCols[i])] = dataReader[i];
  }
  result.Rows.Add(dataRow);
 }
}
catch (Exception ex)
{
 exception = ex.Message;
}
finally
{
 connection.Close();
 System.GC.Collect();
}

 Execute Command 

Just executes the query
OdbcConnection connection = new OdbcConnection(connectionString);
exception = "";

try
{
 connection.Open();
 OdbcCommand command = new OdbcCommand(sqlCommand, connection);
 command.ExecuteNonQuery();
}
catch (Exception ex)
{ 
 exception = ex.Message;
}
finally
{
 connection.Close();
}

 Execute Commands in a transaction

Executes more queries in a single transaction

OdbcConnection connection = new OdbcConnection(connectionString);
OdbcTransaction transaction = null;
exception = "";

try
{
 connection.Open();
 transaction = connection.BeginTransaction();
 OdbcCommand command = null;
 
 foreach (DataRow dr in sqlCommands.Rows)
 {
  string sqlCommand = dr["Command"].ToString();
  command = new OdbcCommand(sqlCommand, connection, transaction);
  command.ExecuteNonQuery();
 }
 
 transaction.Commit();
}
catch (Exception ex)
{ 
 transaction.Rollback();
 exception = ex.Message;
}
finally
{
 connection.Close();
}


A good strategy could be to create CRUD actions for the ODBC driver, in order to implicitly implement the repository pattern.
Changing the connection string will just change the data source type, helping to create a "universal" repository.
And... that's it for now!

Monday, June 11, 2018

BluePrism: drill down about housekeeping BP DB

As you already know, it's very easy to run out of space on a DB server when you use BP.
The reason could be a poor logging design (e.g. logging everything), or maybe logging inside a loop.
It's clear that you don't need to log everything, but only what you effectively need to justify yourself if you make a mess, plus errors, for sure.
Now you are thinking: what the hell have I done? Why was I so stupid as to log everything?

Well, you are in good company, I did the same. But now it's time to fix everything.
You may think you can use the Archiving feature to store the session logs somewhere, but what if archiving doesn't work, or maybe you just want to erase the unneeded data?

Always remember that deleting a lot of data doesn't just mean wasting a lot of time: it also makes the transaction log grow a lot, so try this only when you are sure you can back up the DB.
And there is always the risk of corrupting the DB when you do something at the core level of a product.

So let's begin. The first thing you can do is delete the content of BPAScheduleLog and BPAScheduleLogEntry. The scheduler log grows perpetually, so it could be a good idea to truncate BPAScheduleLogEntry and delete from BPAScheduleLog; this is what BP says, and they also provide a script to delete all the data in a particular time frame, but that is another story.

The other massively filled table is BPASessionLog_NonUnicode, and here BP proposes a script that helps you delete all the entries; in our case, though, we want to selectively delete only the log entries we choose (maybe the log of a specific business object or process page).

BP said it should work, so before applying it to a real DB, let's test it.
As a base case, I created 2 processes:
  • Test.Log1
    • Page1: calls Test.VBO.Log1 actions
    • Page2: calls Test.VBO.Log2 actions
  • Test.Log2
    • Page1: calls Test.VBO.Log1 actions
    • Page2: calls Test.VBO.Log2 actions
and 2 VBOs
  • Test.VBO.Log1
    • Action1: returns 2+2
    • Action2: returns 2*2
  • Test.VBO.Log2
    • Action1: returns 2-2
    • Action2: returns 2/2
Logging is enabled everywhere, and I ran every process 2 times on the same machine.


Then I entered the DB, and with this simple query I discovered which combination of:
  • Process name
  • Process page name
  • VBO 
  • VBO action
is logged the most:
SELECT processname, pagename, objectname, actionname, COUNT(*) AS num
FROM [BluePrism].[dbo].[BPASessionLog_NonUnicode]
GROUP BY processname, pagename, objectname, actionname
ORDER BY num DESC
And here... only the brave... what happens if I selectively delete rows here?
The answer is: absolutely nothing. I posted some screenshots below showing the only pitfall I noticed on my local Blue Prism installation.
So, have fun cleaning your DB of unnecessary rows.
See ya.

Monday, June 4, 2018

A Blue Prism project with custom DLLs: load DLL from everywhere

I admit it: this is one of the most painful points when I work with Blue Prism. As explained in other posts about DLLs, we have to put class libraries into the root folder of Blue Prism in order to use their classes.
This can be a problem when you don't have the rights to access the Blue Prism root folder, so together with the great Antonio Durante we found a way to solve it using another forgotten beauty: reflection.

Here you can download a little Visual Studio project, composed of a few lines of code which simply get the content of an editable PDF and return it as a DataTable containing as many rows as there are lines of text in the PDF.
This piece of code uses iTextSharp.dll to do the work, so we have a dependency here.


public class Program
{
 public Program(string blablabla)
 {
  // This is a fake constructor
 }

 public static DataTable LoadPdf(string path)
 {
  List<string> data = PdfHelper.ExtractTextFromInvoice(path);

  DataTable table = new DataTable();
  table.Columns.Add("Row", typeof(string));

  DataRow row11 = table.NewRow();
  row11["Row"] = "Yeah 1.1 version as well";
  table.Rows.Add(row11);

  foreach (string item in data)
  {
   DataRow row = table.NewRow();
   row["Row"] = item;
   table.Rows.Add(row);
  }

  return table;
 }

 static void Main(string[] args)
 {
    DataTable t = LoadPdf(@"C:\Users\v.valrosso\Downloads\test1.pdf");
 }
}

The code here is very simple, but just to see again how reflection works, I also wrote this code to test everything. As you can see, I load the assembly from a folder where you can also find the referenced DLL (I directly used the Debug output folder of the dummy project).

class Program
{
 static void Main(string[] args)
 {
  string assemblyPath = @"C:\TFS\Test\Code\Poc.ReferencingCode\bin\Debug\Poc.ReferencingCode.dll";

  Assembly asm = Assembly.LoadFrom(assemblyPath);
  //Assembly asm = Assembly.Load(File.ReadAllBytes(assemblyPath));
  Type t = asm.GetType("Poc.ReferencingCode.Program");

  var methodInfo = t.GetMethod("LoadPdf", new Type[] { typeof(string) });
  if (methodInfo == null)
  {
   // never throw generic Exception - replace this with some other exception type
   throw new Exception("No such method exists.");
  }

  object o = Activator.CreateInstance(t);

  object[] parameters = new object[1];
  parameters[0] = @"C:\Users\v.valrosso\Downloads\test1.pdf";       

  DataTable r = (DataTable)methodInfo.Invoke(o, parameters);
  Console.WriteLine(r);
 }
}

After that we can just play with BP, creating a new VBO and pasting in the code and... the die is cast.


Just a warning: BP locks the DLL, so you have to think of something smarter (Antonio and I have developed something very, very smart, but sadly I cannot show it to you because it's top secret).
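One common workaround for the lock (a minimal sketch, not the top-secret solution mentioned above) is to load the assembly from a byte array instead of from the file: Assembly.Load(byte[]) keeps no handle on the DLL, so the file can be replaced while BP is still running.

```csharp
using System;
using System.IO;
using System.Reflection;

class AssemblyLoader
{
    // Assembly.LoadFrom keeps the file locked for the lifetime of the process;
    // loading from a byte array does not, so the DLL can be replaced later.
    public static Assembly LoadUnlocked(string assemblyPath)
    {
        byte[] raw = File.ReadAllBytes(assemblyPath);
        return Assembly.Load(raw); // no file handle left open
    }
}
```

The trade-off is that an assembly loaded from bytes does not resolve its dependencies from the DLL's folder, so you may need to handle AppDomain.CurrentDomain.AssemblyResolve yourself.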

As usual, thanks to my colleague and friend Antonio Durante for the wonderful work we do together every day.

Monday, May 28, 2018

Why in the world can't I get deferred items?

I was wondering why I could not get deferred items from a queue, so I drilled down just a little bit into the BP database in order to see all the deferred items.

Just watching the queue
So I dumped the data before and after completion of the item, just to understand how the DB row is updated.

Old VS New
After that, we made a little action that injects SQL code from BP in order to get the deferred items.

select *
from BPAWorkQueue as q
join BPAWorkQueueItem as i on q.id = i.queueid
where q.name = 'QueueName'
  and i.deferred IS NOT NULL
  and i.completed IS NULL
  and i.exception IS NULL

And this is how Claudia Musio (thanks for everything) and I solved the problem.

Monday, May 21, 2018

The KeepAlive problem

Sometimes you just cannot ask the IT department to disable the screensaver or the lock screen (personally, I've never understood why, since we're dealing with virtualized machines), so you have to solve the problem somehow.

You can rely for sure on the LoginAgent VBO, which helps you understand whether the screen is locked so you can log in again; but if the screen is not locked, you can just move the mouse around the screen so the session stays alive.

I've tried several methods but only this one worked: in the global code you have to insert this piece of code.


<System.Runtime.InteropServices.DllImport("user32.dll")> _
Private Shared Sub mouse_event(ByVal dwFlags As Integer, ByVal dx As Integer, ByVal dy As Integer, ByVal dwData As Integer, ByVal dwExtraInfo As Integer)
End Sub

Private Const MOUSEEVENTF_MOVE As Integer = 1
Private Const MOUSEEVENTF_LEFTDOWN As Integer = 2
Private Const MOUSEEVENTF_LEFTUP As Integer = 4
Private Const MOUSEEVENTF_RIGHTDOWN As Integer = 8
Private Const MOUSEEVENTF_RIGHTUP As Integer = 16
Private Const MOUSEEVENTF_MIDDLEDOWN As Integer = 32
Private Const MOUSEEVENTF_MIDDLEUP As Integer = 64
Private Const MOUSEEVENTF_ABSOLUTE As Integer = 32768

Public Shared Sub Move(ByVal xDelta As Integer, ByVal yDelta As Integer)
    mouse_event(MOUSEEVENTF_MOVE, xDelta, yDelta, 0, 0)
End Sub

After that you can create an action (e.g. MouseMove) which randomly moves the mouse on the screen.

Dim Randomizer As System.Random = New System.Random()
Dim x As Integer = Randomizer.Next(1, 640)
Dim y As Integer = Randomizer.Next(1, 480)

Move(x, y)

And that's it, another problem solved by Varro and Antonio Durante.

Monday, May 14, 2018

Use BP Tags to do something more

What about tags?
I think tags can be used to do more than simply tag an item so you understand what kind of item the robot is working on.
Indeed, we thought about a new way to use them: just replicating the .NET Dictionary ToString().

You can represent a dictionary as a set of key-value pairs like this:

Key1:Value1;Key2:Value2;Key3:Value3;

And yeah, with just a bit of code, you can play with tags.
Look at this diagram, where we play with tags using custom code:


Dim patternTag As New Regex("(" + tagKey + ":)+([^;]){0,}")
Dim patternKey As New Regex("(" + tagKey + ":)+")
Dim match As Match
Dim value As String

match = patternTag.Match(tagString)

If match.Success Then
 valueTag = match.Value.TrimEnd(";")
 value = patternKey.Replace(valueTag, "")
 valueTag2 = valueTag.Replace(value, newValue)
 success = True
Else
 valueTag2 = tagKey + ":" + newValue
 success = False
End If

First we get all the tag data for a particular item in the queue, and we look for values of the form Key1:whatever (note that in our case this is Key:Value, while for BP it's just one tag); then we delete the pre-existing Key1:Value1 and add Key1:NewValue.
Clearly, we assume that every tag of the form KeyN:Value occurs at most once.
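To make the convention concrete, here is a minimal sketch of the dictionary-to-tags mapping in plain C# (outside BP; the helper and its names are mine, not part of any BP API):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class TagDictionary
{
    // "Key1:Value1;Key2:Value2;" -> dictionary (assumes each key occurs once)
    public static Dictionary<string, string> Parse(string tags)
    {
        return tags
            .Split(new[] { ';' }, StringSplitOptions.RemoveEmptyEntries)
            .Select(t => t.Split(new[] { ':' }, 2))
            .ToDictionary(p => p[0], p => p.Length > 1 ? p[1] : "");
    }

    // dictionary -> "Key1:Value1;Key2:Value2;"
    public static string ToTagString(Dictionary<string, string> dict)
    {
        return string.Concat(dict.Select(kv => kv.Key + ":" + kv.Value + ";"));
    }
}
```

Updating a key is then just Parse, assign, ToTagString: the same effect the regex above achieves directly on the tag string.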

With this configuration we can do interesting things, for example track technical statuses.
More generally, when you work on a process (especially an attended one), you have different phases, which can be mapped to BP item statuses.
Within a single phase you can perform a lot of operations, and if the phase fails it's normal to repeat it.
Sometimes, though, you cannot redo some operations (maybe the robot is paying someone) and, because of business rules, you don't want to split the original phase into more phases (i.e. not using the status).
So, assuming we are in status S1 and the robot crashes right after performing Operation1 on a particular system: when the robot retries the item, you can design the process so that, if it is in status S1, the step Operation1 is skipped.

As usual, thanks to Antonio Durante and Raffaele Ecravino :-)

Monday, May 7, 2018

Scheduling processes and unregistered named pipe

Surprise surprise, a customer asked to start a robot at 7:00 AM; but if, like me, you generally work from 9 AM to 9 PM, you'll find that starting a robot at 7:00 AM is... how can I say... not so beautiful.
So it's time to make the scheduler work!

Let's assume the Resource PC we are using is configured with the LoginAgent service listening on port 8081, and the Resource PC ready to fire listening on port 8082.
When the LoginAgent is up and running, the Resource PC is not ready to start an automation, so we have to schedule this sequence of tasks:
  1. Login on 8081
  2. Run on 8082
  3. Logout on 8082
The first configuration
It's also true that between 1 and 2 anything can happen, and you have to rely on the resilience settings; but what can also happen is that the login fails because of an unregistered named pipe.

Just the login agent log

I can see you asking yourself: WTF is a named pipe?
Well, I read a lot about it but I don't remember a single word; the concept is that a named pipe is a method for IPC (Inter-Process Communication), so I assume it's used to resume the LoginAgent.

Indeed, referring to the schedule above, what you observe in the control room is that when the Resource PC is connected on port 8081, port 8082 appears disconnected, and vice versa; this means that LoginAgent listens for login requests while 8081 is connected and, as long as 8082 is in use (so until logout), LoginAgent is not available.
Sometimes it happens that, even though the LoginAgent service is available on port 8081, the login fails because of the unregistered named pipe, and the solution is... surprise surprise... rebooting the machine.
You can find all registered named pipes just with this PowerShell command
[System.IO.Directory]::GetFiles("\\.\\pipe\\")

And the output will be something like this


So the next step is to translate this code into a VBO action, so you can discover whether the BP named pipe is registered or not: you can simply use this code to get all the named pipes and then filter the output collection.

Dim listOfPipes As String() = System.IO.Directory.GetFiles("\\.\pipe\")
dt.Columns.Add("C1")
For Each s As String In listOfPipes
    dt.Rows.Add(s)
Next

Use VBO... use the force
Just the BP log after the cure

The next step is to integrate this new logic into the login task, so that if the robot finds out that the named pipe is not registered, it reboots the Resource PC.

And now login like this

The final step is just to add a new task, 2. Wait; this task will be:
  • Called if the login failed
  • Calls login in both success and failure conditions

A new day, a new design
With this little update, if the login fails (and this is why I raise an exception after the machine reboot), the wait process (which could be anything, even a void one) will use BP's resilience to keep the scheduled task alive.
The pitfall of this technique is that you can generate an infinite loop, so take care.

So my friends, that's it; thanks to my friend Antonio Durante for the time he spent helping to work this out.

Monday, April 2, 2018

A Blue Prism project with custom DLLs: 4 dummies

It seems this blog post was found really interesting by a lot of you, but I also read a lot of comments asking how you can set up a project and use a DLL with BP, so I will show you a practical and really simple example.
Below is the code of the DLL I will use, just two classes:
  1. LogHelper: writes something to the event viewer (could be useful for bug hunting)
  2. Program: the typical entry point of every .NET console application
LogHelper.cs
using System;
using System.Diagnostics;

namespace BP.ExternalDll.Example
{
    public class LogHelper
    {
        private readonly string _log;
        private readonly string _source;

        public LogHelper(string source, string log)
        {
            _source = source;
            _log = log;
        }
        
        public void LogWarning(string message)
        {
            try
            {
                CreateEventSource();
                EventLog.WriteEntry(_source, message, EventLogEntryType.Warning);
            }
            catch (Exception)
            {
                // ignored
                // If you're here, it means you cannot write on event registry
            }
        }
        
        private void CreateEventSource()
        {
            if (!EventLog.SourceExists(_source))
            {
                EventLog.CreateEventSource(_source, _log);
            }
        }
    }
}
Program.cs
namespace BP.ExternalDll.Example
{
    public class Program
    {
        public static void Log()
        {
            string currentDirectory = System.Environment.CurrentDirectory;
            LogHelper _helper = new LogHelper("BP.ExternalDll.Example", "Test library");
            _helper.LogWarning(currentDirectory);
        }

        static void Main(string[] args)
        {
            Log();
        }
    }
}
So, compile everything and place your brand new DLL in this folder: C:\Program Files\Blue Prism Limited\Blue Prism Automate.
Don't try to put the file in other folders on your PC, or to organize the BP folder with subfolders: it will not work, and don't argue with me that BP offers this functionality. IT DOESN'T WORK.
I said: IT DOESN'T WORK.
Antonio Durante and I have figured out how to overcome this limitation, but I think I'll write about that in the near future.
In the code block you just have to write:
Program.Log()
and you will start to find some new rows in your event viewer. Clearly this is just a little example; you can complicate things as much as you wish.
My advice is to always create a BusinessDelegate class that holds all the methods you want to expose, and to create a single VBO action for every method in BP: this will improve testability and maintenance. That's all folks!
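The BusinessDelegate idea can be sketched like this; the method names below are hypothetical examples, the point is simply that each public method maps one-to-one to a VBO action:

```csharp
using System;

namespace BP.ExternalDll.Example
{
    // Facade exposing only what the VBOs need; one public method = one BP action.
    public static class BusinessDelegate
    {
        // Hypothetical action: normalize a code before feeding it to a web form.
        public static string NormalizeCode(string raw)
        {
            return raw.Replace(" ", "").ToUpperInvariant();
        }

        // Another hypothetical action: count the digits in an input string.
        public static int CountDigits(string raw)
        {
            int count = 0;
            foreach (char c in raw)
                if (char.IsDigit(c)) count++;
            return count;
        }
    }
}
```

Keeping the facade free of BP-specific types also means you can unit-test it in Visual Studio before ever touching a VBO.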


Monday, March 26, 2018

How Excel VBO in 5.0.33 destroyed my weekend (again locale issue)

Some weeks ago I discovered that even though BP had been updated in the customer environment, the business objects had not.
So I decided to update the Excel VBO to the latest version, 5.0.33, and I started to see something strange.
When I tried to set a formula in an Excel cell, the formula did not work anymore; indeed, Excel was no longer able to compute formulas at all, and moreover I had similar problems with float values.

1. Static code comparison

I decided to drill down into the problem, and I started with a static comparison between the old code and the new code.
At first I noticed the approach is more rational than in previous versions, so well done BP Team; but I also discovered that something changed in the actions below:
Get Worksheet As Collection
Old code
 Dim ws as Object = GetWorksheet(handle, workbookname, worksheetname, False)

 ' Do we have a sheet?
 sheetexists = ws IsNot Nothing
 ' No sheet? No entry.
 If Not sheetexists Then Return

 ws.Activate()
 ws.UsedRange.Select()
 ws.UsedRange.Copy()

 Dim data As String = Clipboard.GetDataObject().GetData(DataFormats.Text, True)
 
 ' The data split into rows
 Dim asRows() As String = Split(data, vbCrLf)
 
 Dim table As New DataTable()
 ' Set a flag indicating the header row
 Dim isHeaderRow As Boolean = True
 
 For Each strRow As String In asRows
  If Not String.IsNullOrEmpty(strRow) Then
  
   Dim fields() As String = Split(strRow, vbTab)
   If isHeaderRow Then
   
    isHeaderRow = False
    For Each field As String in fields
     table.Columns.Add(field)
    Next
    
   Else

    Dim row as DataRow = table.NewRow()
    For i As Integer = 0 To fields.Length - 1
     If i < fields.Length Then
      row(i) = fields(i)
     Else
      row(i) = ""
     End If
    Next I
    table.Rows.Add(row)
    
   End If
    
  End If
  
 Next
 worksheetcollection = table
OOB 5.0.33
  Dim ws as Object = GetWorksheet(handle, workbookname, worksheetname, False)

 ' Do we have a sheet?
 sheetexists = ws IsNot Nothing
 ' No sheet? No entry.
 If Not sheetexists Then Return

 ws.Activate()
 ws.UsedRange.Select()
 ws.UsedRange.Copy()

 Dim data As String = GetClipboardText()
 
 worksheetCollection = ParseDelimSeparatedVariables( _
  data, vbTab, Nothing, True)
Set Cell Value
Old code
GetInstance(handle).ActiveCell.Value = value
OOB 5.0.33
Dim activeCell = GetInstance(handle).ActiveCell
SetProperty(activeCell, "Value", value)
WriteColl
Old code
 ' Get to the cell
 Dim ws as Object = GetWorksheet(handle,workbookname,worksheetname)
 Dim origin as Object = ws.Range(cellref,cellref)
 Dim cell as Object = origin

 Dim colInd as Integer = 0, rowInd as Integer = 0 ' Offsets from the origin cell
 
 ' Deal with the column names first
 If includecolnames Then
  For Each col as DataColumn in collection.Columns
   Try
    cell = origin.Offset(rowInd, colInd)
   Catch ex as Exception ' Hit the edge.
    Exit For
   End Try
   cell.Value = col.ColumnName
   colInd += 1
  Next
  rowInd += 1
 End If
 
 ' Now for the data itself
 For Each row as DataRow in collection.Rows
  colInd = 0
  For Each col as DataColumn in collection.Columns
   Try
    cell = origin.Offset(rowInd, colInd)
   Catch ex as Exception ' Hit the edge.
    Exit For
   End Try
   'MessageBox.Show("RowOffset:" & rowInd & "; ColOffset:" & colInd & "; cell: " & cell.Address(False,False))
   cell.Value = row(col)
   colInd += 1
  Next
  rowInd+=1
 Next
OOB 5.0.33
 ' Get to the cell
Dim ws As Object = GetWorksheet(handle, workbookname, worksheetname)
Dim origin As Object = ws.Range(cellref, cellref)
Dim cell As Object = origin

Dim colInd As Integer = 0, rowInd As Integer = 0 ' Offsets from the origin cell

' Deal with the column names first
If includecolnames Then
 For Each col As DataColumn In Collection.Columns
  Try
   cell = origin.Offset(rowInd, colInd)
  Catch ex As Exception ' Hit the edge.
   Exit For
  End Try
  SetProperty(cell, "Value", col.ColumnName)
  colInd += 1
 Next
 rowInd += 1
End If

' Now for the data itself
For Each row As DataRow In Collection.Rows
 colInd = 0
 For Each col As DataColumn In Collection.Columns
  Try
   cell = origin.Offset(rowInd, colInd)
  Catch ex As Exception ' Hit the edge.
   Exit For
  End Try
  'MessageBox.Show("RowOffset:" & rowInd & "; ColOffset:" & colInd & "; cell: " & cell.Address(False,False))
  SetProperty(cell, "Value", row(col))
  colInd += 1
 Next
 rowInd += 1
Next
I highlighted what I think is the most interesting change: BP decided to use late binding instead of early binding when dealing with read/write operations.
It's not hard to understand that this seems to be the problem, so I tried to discover the main differences between early binding and late binding, and I found this article on MSDN which clarifies everything (more or less):
Late binding is still useful in situations where the exact interface of an object is not known at design-time. If your application seeks to talk with multiple unknown servers or needs to invoke functions by name (using the Visual Basic 6.0 CallByName function for example) then you need to use late binding. Late binding is also useful to work around compatibility problems between multiple versions of a component that has improperly modified or adapted its interface between versions 
It’s also true that Microsoft does not encourage this kind of approach, indeed in the same article I read this:
Microsoft Office applications provide a good example of such COM servers. Office applications will typically expand their interfaces to add new functionality or correct previous shortcomings between versions. If you need to automate an Office application, it is recommended that you early bind to the earliest version of the product that you expect could be installed on your client's system. For example, if you need to be able to automate Excel 95, Excel 97, Excel 2000, and Excel 2002, you should use the type library for Excel 95 (XL5en32.olb) to maintain compatibility with all three versions. 

Even if I don't like this approach, because it does not help performance, I also didn't understand why it causes the problem I told you about before, so I tried to reproduce the issue in Visual Studio using the relevant lines from the global code of the business objects.

2. Dynamic code comparison

I created a fake process which reads/writes on Excel, and I noticed no regression in GetWorksheetAsCollection, unlike the other two actions; so I focused on the method below, which is called by WriteColl and SetCellValue and is ultimately responsible for writing data to Excel.
Private Sub SetProperty(Instance As Object, Name As String, ParamArray args As Object())
    Dim culture = Thread.CurrentThread.CurrentCulture
    'culture = Globalization.CultureInfo.GetCultureInfo(1033)

    Instance.GetType().InvokeMember(Name, Reflection.BindingFlags.SetProperty, Nothing, Instance, args, culture)
End Sub
I discovered that the problem was regional settings, not late binding itself; indeed, using the references below I figured out that, when you have to write data to Excel, it's safe to use the en-US locale:
  1. How to: Make String Literals Region-safe in Excel Using Reflection
  2. Globalization and Localization of Excel Solutions
So I decided to retest everything with Visual Studio; consider that my culture is it-IT.
Wrong result using the current culture
Right result when forcing the invariant culture
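Those two screenshots boil down to a few lines (a minimal repro of the culture behavior, not BP code): the culture passed to the conversion decides the decimal separator, which is exactly what happens inside InvokeMember when the thread culture is not en-US.

```csharp
using System;
using System.Globalization;

class LocaleDemo
{
    // Under it-IT a double renders as "1,5"; Excel then refuses it as a number.
    // Forcing the invariant (en-US-like) culture yields "1.5" instead.
    public static string Format(double value, CultureInfo culture)
    {
        return value.ToString(culture);
    }
}
```

Format(1.5, new CultureInfo("it-IT")) gives "1,5", while Format(1.5, CultureInfo.InvariantCulture) gives "1.5"; hence forcing the culture in SetProperty fixes the write.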

3. Conclusion

After this analysis I discovered that, if you are on machines with a locale different from en-US, you could have problems with data types other than string/text; so you can apply my solution, or roll back just the SetCellValue action, to ensure yourself a better weekend.

Monday, March 19, 2018

Playing with RDP and Surface Automation

This is a small blog post about a little trick that is very useful when you are in the supervised-run phase.
Typically in this phase you run the process in debug mode at full speed on a particular machine, in order to discover the weak points of the automation and performance issues.
If the automation is designed to use surface automation techniques, desktop resolution becomes an important factor.
It may sound strange, but sometimes you have to deal with higher resolutions, so I was wondering what could be a good method to monitor several machines at once at a higher resolution, and I found the answer in smart sizing.
screen mode id:i:1
use multimon:i:0
desktopwidth:i:2560
desktopheight:i:1280
session bpp:i:32
winposstr:s:0,1,374,70,1206,588
compression:i:1
keyboardhook:i:2
audiocapturemode:i:0
videoplaybackmode:i:1
connection type:i:6
displayconnectionbar:i:1
disable wallpaper:i:0
allow font smoothing:i:1
allow desktop composition:i:1
disable full window drag:i:0
disable menu anims:i:0
disable themes:i:0
disable cursor setting:i:0
bitmapcachepersistenable:i:1
full address:s:vcolmcn13044
audiomode:i:0
redirectprinters:i:1
redirectcomports:i:0
redirectsmartcards:i:1
redirectclipboard:i:1
redirectposdevices:i:0
redirectdirectx:i:1
autoreconnection enabled:i:1
authentication level:i:2
prompt for credentials:i:1
negotiate security layer:i:1
remoteapplicationmode:i:0
alternate shell:s:
shell working directory:s:
gatewayhostname:s:
gatewayusagemethod:i:4
gatewaycredentialssource:i:4
gatewayprofileusagemethod:i:0
promptcredentialonce:i:1
use redirection server name:i:0
drivestoredirect:s:
smart sizing:i:1
The key is these three parameters:
desktopwidth:i:2560
desktopheight:i:1280
...
smart sizing:i:1
Don't ask me why the remote machine has this resolution (2560x1280 vs 1920x1080 on the physical machine), but this way I can resize the RDP window without interfering with surface automation.

Monday, March 12, 2018

Namespace collisions in BluePrism

This will be one of the shortest posts on my blog, and it's a short and sad story about code refactoring.
I was trying to reorder business objects by introducing namespaces. As you already know, BP encourages you to create more than just one business object per application: the problem is that you will commonly use at least one business object which holds the most common actions.
Let's imagine an application called Generic Bank Web System (yep, it's not an original name) and a VBO called GBW with these actions:
  • Login 
  • Log out 
  • Navigate to function 
  • Check payments 
  • Do payments 
  • Get customer data 
  • Set customer data 
Now imagine we decide to split this object into three objects; this will help us keep the objects small and maintainable, and will also help prevent performance issues.
  • GBW 
    • Login 
    • Log out 
    • Navigate to function 
  • GBW.Payments 
    • Check payments 
    • Do payments 
  • GBW.Customer 
    • Get customer data 
    • Set customer data 
You can also choose other criteria when splitting your objects: you can choose to split by functional area, by process, etc.
What you cannot do is name a group exactly like one of the existing business objects, because BP is not able to tell the difference between groups and business objects when compiling the process: even if during debug sessions everything seems to work like a charm, when you switch to production and launch the process from the control room (or from the scheduler), the result will be an error:

Failed to create session on machine... - failed to get effective run mode – Unable to calculate run mode – object GBW does not exist 
KABOOM
Good to go
In this case, the solution is simply to rename the object GBW to GBW.Common; by the way, be careful to include in the common object only the error-proof actions, in order to avoid regressions.
Thanks to Antonio Durante for the useful brainstorming sessions … yes! again 😊

Monday, March 5, 2018

An approach to exception handling in BluePrism

One of the things I dislike the most is the fact that whoever approaches RPA is led to think that he/she doesn't need any programming knowledge: sadly, it's not true.
The proof of that is the poor exception handling I often see: we know from traditional programming that it's better to handle exceptions at the highest level, and in BP it's exactly the same.
First, we have to distinguish between technical and business exceptions:
  • Technical exceptions are the most common ones and relate to temporary application problems (or bad programming). They can be handled locally using the exception block, because these exceptions can often be solved by simply retrying a specific part of the process, raising the exception only when the maximum number of attempts is reached;
  • Business exceptions are related to the process logic, and most of the time they mean that you cannot work the item in the queue, or maybe that you just have to skip that particular process step.
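The local handling of technical exceptions is essentially a retry loop; a sketch of the idea in C# (the helper is mine — in BP you would model it with a recover/resume block and a retry counter):

```csharp
using System;
using System.Threading;

static class Retry
{
    // Retry a technical step locally; rethrow only when the attempts are
    // exhausted, so the main page sees a single exception to classify.
    public static T Run<T>(Func<T> step, int maxAttempts, TimeSpan delay)
    {
        for (int attempt = 1; ; attempt++)
        {
            try { return step(); }
            catch when (attempt < maxAttempts)
            {
                Thread.Sleep(delay); // transient problem: wait and try again
            }
        }
    }
}
```
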
Since when we talk about RPA we are mainly talking about workflows, if something breaks the logic we designed, most of the time the result will be that the item cannot be worked, so there is no reason not to manage these exceptions on the main page.
Consider that we are always dealing with the producer-consumer model (if you don't know what I'm talking about, read this blog post first), so in the image below I just added a piece to the producer-consumer model by defining exception types and recognizing them on the main page, in order to do what is needed according to the exception type.

Exception handling to the higher level
After that, the robot will pick another item from the queue and start working again. Thanks to Antonio Durante for the useful brainstorming sessions which led us to define new standards.

Saturday, February 24, 2018

Blue Prism / DateTime / Code Stage / Locale issue

Since I'm at home with something like a broken shoulder (the right one), a lot of draft blog posts and a wonderful Windows plugin named Dictate, I think it's time to recover just a bit of work.
Today I want to talk about the behavior of DateTime values when passed as arguments to code stages. Even though Blue Prism discourages the use of code stages, I believe this feature is one of the most interesting of them all and helps to unleash the power of RPA. The problem with DateTime values in code stages is that BP represents them internally in UTC, which may not be your locale (e.g. I work in Italy, UTC+1).

Collection populated manually
Just passed as argument and returned

So BP suggests the following

If you do need to use a code stage, then the following amendment at the end of the line will yield the behavior you expect. The point is that you cannot always pre-process data: for example, if we take data from a DB we cannot pre-process it, so we have to deal with wrong data, as you can see in the images below.

Data from DB

Wrong data after collection

The solution is just to post-process the data with this piece of code.
Dim dtItemValue As DateTime
Dim newDtItemValue As DateTime
Dim localZone As TimeZone
Dim span As TimeSpan

For Each row As DataRow In data.Rows
 For Each column As DataColumn In data.Columns
    If column.DataType Is GetType(System.DateTime) Then
     dtItemValue = row.Field(Of DateTime)(column)
     localZone = TimeZone.CurrentTimeZone
     span = localZone.GetUtcOffset(dtItemValue)
     ' BP hands the value over as UTC, so add the local offset back
     newDtItemValue = dtItemValue.Add(span)
     row.SetField(Of DateTime)(column, newDtItemValue)
    End If
 Next
Next

dataOut = data
Be aware that BP passes arguments by value, not by reference, so it may be necessary to use this code every time you pass the collection into a code stage. A special credit to Raffaele Ecravino and Antonio Durante for this brief piece of code.
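The same post-processing can be sketched in C#; the assumption (as above) is that BP hands the value over as UTC with an unspecified kind:

```csharp
using System;

class UtcFix
{
    // BP passes DateTime values to code stages as UTC; marking the kind
    // explicitly before converting avoids double or missing offsets.
    public static DateTime ToLocal(DateTime bpValue)
    {
        DateTime utc = DateTime.SpecifyKind(bpValue, DateTimeKind.Utc);
        return utc.ToLocalTime();
    }
}
```
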
