another technical blog...technically

Showing posts with label Tips'n'tricks. Show all posts

Friday, September 11, 2020

Some t(r)ip with RabbitMQ

Recently I ran a lot of experiments with RabbitMQ together with my friends and colleagues Alessandro Argiolas and Stefano Lazzati. We were interested in decoupling logic from ServiceNow into custom code but, since we couldn't leverage powerful servers and had no permission to move to the cloud, we were looking for a solution that wouldn't suffer under overload.

We tried RabbitMQ, a lightweight message broker, as a workaround for this. The main idea is: if I have to ingest a lot of data but I have low-performance hardware, can I do something about it? In our opinion, this could be an option. If you don't want to install RabbitMQ you can use it as a service with CloudAMQP, which is also fully documented; moreover there are tons of posts (written better than mine) that help you understand why a message broker is better than an old-fashioned relational DB, what an exchange is, the routing modes and so on, so feel free to ask in the comments, but first read this.

So, about our architecture, consider these four components:

  1. Source System: the pre-existing system that needs to call the web service to do something
  2. Ingestion Web Service: a really lightweight web service which just receives data and puts it in the corresponding RabbitMQ queue. The message must be as lightweight as possible
  3. RabbitMQ Queue: a queue defined on RabbitMQ; it will contain the messages the engine needs to work
  4. C# Engine: subscribed to the queue; as soon as messages arrive it works the case, returns the result to the Source System and ACKs to RabbitMQ. C# because we love C#, but it could be any language

 


Ingestion Web Service 

It must be easy peasy and lightweight: we wrote it in C#, but maybe Go or other languages would be a better choice. This web service is your producer: it takes messages and routes them to the correct queue. In our scenario we created a queue for every customer, method and technology carried in the request, so the routing key is generated from the info in the request.

The web service mainly contains a controller with this method

public class QueueController : ApiController
{
	private static readonly ILog _log = LogManager.GetLogger(MethodBase.GetCurrentMethod().DeclaringType);

	private readonly IQueueProducer _producer;
	private readonly string _exchange;

	public QueueController(IQueueProducer producer)
	{
		_producer = producer;

		using (SynopsContext context = new SynopsContext())
		{
			_exchange = context.GetCommonConfig("rabbit")["exchange"];
		}
	}

	[HttpPost]
	public IHttpActionResult AddToQueue(QueueRequest request)
	{
		string routingKey = string.Format("{0}_{1}_{2}", request.Customer.ToLower(), request.Method.ToLower(), request.Technology.ToLower());

		try
		{
			string jsonContent = JsonConvert.SerializeObject(request.Arguments);
			byte[] data = Encoding.Default.GetBytes(jsonContent);

			_producer.BasicPublish(_exchange, routingKey, data);
			_producer.Dispose();
		}
		catch (Exception ex)
		{
			_log.Error(ex.Message);

			return InternalServerError(ex);
		}

		return Ok();
	}
}
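
The QueueRequest DTO and the IQueueProducer interface are not shown here; judging from how the controller uses them, they look more or less like this (a sketch, property types assumed):

public class QueueRequest
{
	// Used to build the routing key: customer_method_technology.
	public string Customer { get; set; }
	public string Method { get; set; }
	public string Technology { get; set; }

	// Free-form payload; the controller serializes it to JSON before publishing.
	public object Arguments { get; set; }
}

public interface IQueueProducer : IDisposable
{
	void BasicPublish(string exchange, string routingKey, byte[] data);
}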

As you can see, nothing too complex. Here is the producer:

public class RabbitMQProducer : IQueueProducer
{
	private readonly IConnection _connection;
	private readonly IModel _channel;

	public RabbitMQProducer(string hostname, string username, string password)
	{
		ConnectionFactory connectionFactory = new ConnectionFactory
		{
			HostName = hostname,
			UserName = username,
			Password = password
		};

		_connection = connectionFactory.CreateConnection();
		_channel = _connection.CreateModel();
	}

	public void BasicPublish(string exchange, string routingKey, byte[] data)
	{
		IBasicProperties properties = _channel.CreateBasicProperties();
		properties.Persistent = true;

		_channel.BasicPublish(exchange, routingKey, properties, data);
	}

	public void Dispose()
	{
		_channel?.Close();
		_channel?.Dispose();
		_connection?.Close();
		_connection?.Dispose();
	}
}

C# Engine

Besides what's in RabbitMQ, you need a consumer to consume messages. Fun fact: since the queue is centralized on one machine, you can balance the load by adding more than one Engine. It's also interesting that, using the prefetch count in QoS, you can choose how many messages you want to process simultaneously. To launch the correct code for each message depending on its functionality, we used a strategy pattern, and we simply launch a task for every message we want to work (see the sketch after the consumer code below).

public class RabbitMQConsumer
{
	private readonly ILog _log = LogManager.GetLogger(MethodBase.GetCurrentMethod().DeclaringType);

	private readonly IConnection _connection;
	private readonly IModel _channel;
	private readonly EventingBasicConsumer _consumer;

	public RabbitMQConsumer()
	{
		Dictionary<string, string> rabbitConfig;

		using (SynopsContext context = new SynopsContext())
		{
			rabbitConfig = context.GetCommonConfig("rabbit");
		}

		rabbitConfig.Add("password", ConfigurationManager.AppSettings["rabbit_password"]);

		ConnectionFactory connectionFactory = new ConnectionFactory
		{
			HostName = rabbitConfig["hostname"],
			UserName = rabbitConfig["username"],
			Password = rabbitConfig["password"],
		};

		ushort prefetchCount = ushort.Parse(rabbitConfig["prefetch"]);

		_log.Info(string.Format("Number of parallel tasks for each queue: {0}", prefetchCount));

		_connection = connectionFactory.CreateConnection();
		_channel = _connection.CreateModel();
		_channel.BasicQos(0, prefetchCount, false);
		_consumer = new EventingBasicConsumer(_channel);
		_consumer.Received += BasicConsume;
	}

	public void BasicConsume(string queue, bool autoAck = false)
	{
		_channel.BasicConsume(queue, autoAck, _consumer);
	}

	private void BasicConsume(object sender, BasicDeliverEventArgs args)
	{
		Task.Run(() => BasicConsume(args.Exchange, args.RoutingKey, args.DeliveryTag, args.Body));
	}

	private void BasicConsume(string exchange, string routingKey, ulong deliveryTag, byte[] data)
	{
		try
		{
			Strategy strategy = new Strategy(routingKey);

			strategy.Run(data);
		}
		catch (Exception ex)
		{
			_log.Error(string.Format("Error while working exchange {0} with routing key {1} and delivery tag {2}.", exchange, routingKey, deliveryTag), ex);
		}
		finally
		{
			_channel.BasicAck(deliveryTag, false);
			_log.Info(string.Format("Ack {0}.", deliveryTag));
		}
	}
}
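
The Strategy class is not part of the post; a minimal sketch of how it could look, assuming handlers are registered per routing key (the handler interface and the registry are made up for illustration; usings for System and System.Collections.Generic omitted as in the rest of the post):

public interface IMessageHandler
{
	void Handle(byte[] data);
}

public class Strategy
{
	// Hypothetical registry mapping a routing key (customer_method_technology) to a handler factory.
	private static readonly Dictionary<string, Func<IMessageHandler>> _handlers =
		new Dictionary<string, Func<IMessageHandler>>
		{
			// { "acme_invoice_sap", () => new AcmeInvoiceSapHandler() }
		};

	private readonly IMessageHandler _handler;

	public Strategy(string routingKey)
	{
		if (!_handlers.TryGetValue(routingKey, out Func<IMessageHandler> factory))
		{
			throw new ArgumentException("No handler registered for routing key " + routingKey);
		}

		_handler = factory();
	}

	public void Run(byte[] data)
	{
		_handler.Handle(data);
	}
}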

Lessons Learned

  • Use this technique when you can leverage an async approach
  • Consider monitoring for RabbitMQ and the Engine: it's a new piece of software
  • Messages must be REALLY lightweight, otherwise you will also need a load balancer for the web service... and so on

Wednesday, September 9, 2020

ServiceNow API and Attachment API

Lately I have been using ServiceNow as a human-in-the-loop tool with RPA. Fortunately not only as a front-end: RPA is becoming more and more the arm of the BPM workflow in the customer's systems, and it's really fun to make them work together.

In past projects I noticed that ServiceNow guys used to send files to external services using a sync call: this is for sure fast and reliable, but it is not a good option when a file, converted to base64, is bigger than 32MB.

This means that if you need to pass files bigger than 25-26MB you must switch from a sync to an async approach, and here's where the Attachment API helps you, enabling transfer of bigger files. Of course the async approach requires you to control what's happening on the other side (but we will see that in another post).

To help working with the other out-of-the-box APIs as well, I found this project on GitHub that manages lots of calls.

Unfortunately it doesn't manage uploading files through the Attachment API, nor many ways to download files, so I decided to fork the project (you can find it here) and submit my code to the authors (I don't even know whether they accepted or rejected it). Here is the bunch of code lines I used to edit the lib, so you can replicate it and use it.

In ServiceNowClient.cs I just added two methods, one to download an attachment as a stream and one to upload:

public async Task<Stream> DownloadAttachmentAsyncAsStream(Attachment attachment, string outputPath, string filename = null, CancellationToken cancellationToken = default)
{
	var response = await _httpClient.GetAsync(attachment.DownloadLink, cancellationToken).ConfigureAwait(false);
	var stream = await response.Content.ReadAsStreamAsync().ConfigureAwait(false);
	return stream;
}

public async Task<bool> UploadAttachmentAsync(string filename, string tablename, string tablesysid, byte[] data)
{
	HttpContent genericFileContent = new ByteArrayContent(data);
	string mimeType = MimeMapping.GetMimeMapping(filename);
	genericFileContent.Headers.Add("Content-Type", mimeType);

	string requestUri = "api/now/v1/attachment/file?file_name=" + filename + "&table_name=" + tablename + "&table_sys_id=" + tablesysid;
	var response = await _httpClient.PostAsync(requestUri, genericFileContent).ConfigureAwait(false);
	if (response == null)
	{
		throw new Exception("Null response.");
	}

	if (!response.IsSuccessStatusCode)
	{
		throw new Exception($"Server error {response.StatusCode} ({(int)response.StatusCode}): {response.ReasonPhrase}.");
	}

	return response.IsSuccessStatusCode;
}
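
Usage is then straightforward; a hedged example (client and attachment are instances you already have, while paths, table name and sys_id are placeholders):

// Upload a local file to a record (table name and sys_id are placeholders).
byte[] data = File.ReadAllBytes(@"C:\temp\report.pdf");
await client.UploadAttachmentAsync("report.pdf", "incident", "TABLE_SYS_ID", data);

// Download an attachment previously retrieved through the API and save it to disk.
using (Stream stream = await client.DownloadAttachmentAsyncAsStream(attachment, @"C:\temp"))
using (FileStream file = File.Create(@"C:\temp\report.pdf"))
{
	await stream.CopyToAsync(file);
}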

Then I added MimeMapping (see below), which is just a copy of the one in .NET that is not available in the library, and that's pretty much everything... simple but (I hope) useful.

using System;
using System.Collections.Generic;
using System.IO;
using System.Text;

namespace ServiceNow.Api
{
	public static class MimeMapping
	{
		private static Dictionary<string, string> _mappingDictionary;

		static MimeMapping()
		{
			Init();
		}

		public static string GetMimeMapping(string fileName)
		{
			if (fileName == null)
			{
				throw new ArgumentNullException("fileName");
			}

			string extension = Path.GetExtension(fileName);

			// Fall back to a generic binary type for unknown extensions, like the original .NET implementation.
			string mimeType;
			return _mappingDictionary.TryGetValue(extension, out mimeType) ? mimeType : "application/octet-stream";
		}

		
		private static void Init()
		{
			// This list was copied from the IIS7 configuration file located at:
			// %windir%\system32\inetsrv\config\applicationHost.config
			_mappingDictionary = new Dictionary<string, string>();
			_mappingDictionary.Add(".323", "text/h323");
			_mappingDictionary.Add(".aaf", "application/octet-stream");
			_mappingDictionary.Add(".aca", "application/octet-stream");
			_mappingDictionary.Add(".accdb", "application/msaccess");
			_mappingDictionary.Add(".accde", "application/msaccess");
			_mappingDictionary.Add(".accdt", "application/msaccess");
			_mappingDictionary.Add(".acx", "application/internet-property-stream");
			_mappingDictionary.Add(".afm", "application/octet-stream");
			_mappingDictionary.Add(".ai", "application/postscript");
			_mappingDictionary.Add(".aif", "audio/x-aiff");
			_mappingDictionary.Add(".aifc", "audio/aiff");
			_mappingDictionary.Add(".aiff", "audio/aiff");
			_mappingDictionary.Add(".application", "application/x-ms-application");
			_mappingDictionary.Add(".art", "image/x-jg");
			_mappingDictionary.Add(".asd", "application/octet-stream");
			_mappingDictionary.Add(".asf", "video/x-ms-asf");
			_mappingDictionary.Add(".asi", "application/octet-stream");
			_mappingDictionary.Add(".asm", "text/plain");
			_mappingDictionary.Add(".asr", "video/x-ms-asf");
			_mappingDictionary.Add(".asx", "video/x-ms-asf");
			_mappingDictionary.Add(".atom", "application/atom+xml");
			_mappingDictionary.Add(".au", "audio/basic");
			_mappingDictionary.Add(".avi", "video/x-msvideo");
			_mappingDictionary.Add(".axs", "application/olescript");
			_mappingDictionary.Add(".bas", "text/plain");
			_mappingDictionary.Add(".bcpio", "application/x-bcpio");
			_mappingDictionary.Add(".bin", "application/octet-stream");
			_mappingDictionary.Add(".bmp", "image/bmp");
			_mappingDictionary.Add(".c", "text/plain");
			_mappingDictionary.Add(".cab", "application/octet-stream");
			_mappingDictionary.Add(".calx", "application/vnd.ms-office.calx");
			_mappingDictionary.Add(".cat", "application/vnd.ms-pki.seccat");
			_mappingDictionary.Add(".cdf", "application/x-cdf");
			_mappingDictionary.Add(".chm", "application/octet-stream");
			_mappingDictionary.Add(".class", "application/x-java-applet");
			_mappingDictionary.Add(".clp", "application/x-msclip");
			_mappingDictionary.Add(".cmx", "image/x-cmx");
			_mappingDictionary.Add(".cnf", "text/plain");
			_mappingDictionary.Add(".cod", "image/cis-cod");
			_mappingDictionary.Add(".cpio", "application/x-cpio");
			_mappingDictionary.Add(".cpp", "text/plain");
			_mappingDictionary.Add(".crd", "application/x-mscardfile");
			_mappingDictionary.Add(".crl", "application/pkix-crl");
			_mappingDictionary.Add(".crt", "application/x-x509-ca-cert");
			_mappingDictionary.Add(".csh", "application/x-csh");
			_mappingDictionary.Add(".css", "text/css");
			_mappingDictionary.Add(".csv", "application/octet-stream");
			_mappingDictionary.Add(".cur", "application/octet-stream");
			_mappingDictionary.Add(".dcr", "application/x-director");
			_mappingDictionary.Add(".deploy", "application/octet-stream");
			_mappingDictionary.Add(".der", "application/x-x509-ca-cert");
			_mappingDictionary.Add(".dib", "image/bmp");
			_mappingDictionary.Add(".dir", "application/x-director");
			_mappingDictionary.Add(".disco", "text/xml");
			_mappingDictionary.Add(".dll", "application/x-msdownload");
			_mappingDictionary.Add(".dll.config", "text/xml");
			_mappingDictionary.Add(".dlm", "text/dlm");
			_mappingDictionary.Add(".doc", "application/msword");
			_mappingDictionary.Add(".docm", "application/vnd.ms-word.document.macroEnabled.12");
			_mappingDictionary.Add(".docx", "application/vnd.openxmlformats-officedocument.wordprocessingml.document");
			_mappingDictionary.Add(".dot", "application/msword");
			_mappingDictionary.Add(".dotm", "application/vnd.ms-word.template.macroEnabled.12");
			_mappingDictionary.Add(".dotx", "application/vnd.openxmlformats-officedocument.wordprocessingml.template");
			_mappingDictionary.Add(".dsp", "application/octet-stream");
			_mappingDictionary.Add(".dtd", "text/xml");
			_mappingDictionary.Add(".dvi", "application/x-dvi");
			_mappingDictionary.Add(".dwf", "drawing/x-dwf");
			_mappingDictionary.Add(".dwp", "application/octet-stream");
			_mappingDictionary.Add(".dxr", "application/x-director");
			_mappingDictionary.Add(".eml", "message/rfc822");
			_mappingDictionary.Add(".emz", "application/octet-stream");
			_mappingDictionary.Add(".eot", "application/octet-stream");
			_mappingDictionary.Add(".eps", "application/postscript");
			_mappingDictionary.Add(".etx", "text/x-setext");
			_mappingDictionary.Add(".evy", "application/envoy");
			_mappingDictionary.Add(".exe", "application/octet-stream");
			_mappingDictionary.Add(".exe.config", "text/xml");
			_mappingDictionary.Add(".fdf", "application/vnd.fdf");
			_mappingDictionary.Add(".fif", "application/fractals");
			_mappingDictionary.Add(".fla", "application/octet-stream");
			_mappingDictionary.Add(".flr", "x-world/x-vrml");
			_mappingDictionary.Add(".flv", "video/x-flv");
			_mappingDictionary.Add(".gif", "image/gif");
			_mappingDictionary.Add(".gtar", "application/x-gtar");
			_mappingDictionary.Add(".gz", "application/x-gzip");
			_mappingDictionary.Add(".h", "text/plain");
			_mappingDictionary.Add(".hdf", "application/x-hdf");
			_mappingDictionary.Add(".hdml", "text/x-hdml");
			_mappingDictionary.Add(".hhc", "application/x-oleobject");
			_mappingDictionary.Add(".hhk", "application/octet-stream");
			_mappingDictionary.Add(".hhp", "application/octet-stream");
			_mappingDictionary.Add(".hlp", "application/winhlp");
			_mappingDictionary.Add(".hqx", "application/mac-binhex40");
			_mappingDictionary.Add(".hta", "application/hta");
			_mappingDictionary.Add(".htc", "text/x-component");
			_mappingDictionary.Add(".htm", "text/html");
			_mappingDictionary.Add(".html", "text/html");
			_mappingDictionary.Add(".htt", "text/webviewhtml");
			_mappingDictionary.Add(".hxt", "text/html");
			_mappingDictionary.Add(".ico", "image/x-icon");
			_mappingDictionary.Add(".ics", "application/octet-stream");
			_mappingDictionary.Add(".ief", "image/ief");
			_mappingDictionary.Add(".iii", "application/x-iphone");
			_mappingDictionary.Add(".inf", "application/octet-stream");
			_mappingDictionary.Add(".ins", "application/x-internet-signup");
			_mappingDictionary.Add(".isp", "application/x-internet-signup");
			_mappingDictionary.Add(".IVF", "video/x-ivf");
			_mappingDictionary.Add(".jar", "application/java-archive");
			_mappingDictionary.Add(".java", "application/octet-stream");
			_mappingDictionary.Add(".jck", "application/liquidmotion");
			_mappingDictionary.Add(".jcz", "application/liquidmotion");
			_mappingDictionary.Add(".jfif", "image/pjpeg");
			_mappingDictionary.Add(".jpb", "application/octet-stream");
			_mappingDictionary.Add(".jpe", "image/jpeg");
			_mappingDictionary.Add(".jpeg", "image/jpeg");
			_mappingDictionary.Add(".jpg", "image/jpeg");
			_mappingDictionary.Add(".js", "application/x-javascript");
			_mappingDictionary.Add(".jsx", "text/jscript");
			_mappingDictionary.Add(".latex", "application/x-latex");
			_mappingDictionary.Add(".lit", "application/x-ms-reader");
			_mappingDictionary.Add(".lpk", "application/octet-stream");
			_mappingDictionary.Add(".lsf", "video/x-la-asf");
			_mappingDictionary.Add(".lsx", "video/x-la-asf");
			_mappingDictionary.Add(".lzh", "application/octet-stream");
			_mappingDictionary.Add(".m13", "application/x-msmediaview");
			_mappingDictionary.Add(".m14", "application/x-msmediaview");
			_mappingDictionary.Add(".m1v", "video/mpeg");
			_mappingDictionary.Add(".m3u", "audio/x-mpegurl");
			_mappingDictionary.Add(".man", "application/x-troff-man");
			_mappingDictionary.Add(".manifest", "application/x-ms-manifest");
			_mappingDictionary.Add(".map", "text/plain");
			_mappingDictionary.Add(".mdb", "application/x-msaccess");
			_mappingDictionary.Add(".mdp", "application/octet-stream");
			_mappingDictionary.Add(".me", "application/x-troff-me");
			_mappingDictionary.Add(".mht", "message/rfc822");
			_mappingDictionary.Add(".mhtml", "message/rfc822");
			_mappingDictionary.Add(".mid", "audio/mid");
			_mappingDictionary.Add(".midi", "audio/mid");
			_mappingDictionary.Add(".mix", "application/octet-stream");
			_mappingDictionary.Add(".mmf", "application/x-smaf");
			_mappingDictionary.Add(".mno", "text/xml");
			_mappingDictionary.Add(".mny", "application/x-msmoney");
			_mappingDictionary.Add(".mov", "video/quicktime");
			_mappingDictionary.Add(".movie", "video/x-sgi-movie");
			_mappingDictionary.Add(".mp2", "video/mpeg");
			_mappingDictionary.Add(".mp3", "audio/mpeg");
			_mappingDictionary.Add(".mpa", "video/mpeg");
			_mappingDictionary.Add(".mpe", "video/mpeg");
			_mappingDictionary.Add(".mpeg", "video/mpeg");
			_mappingDictionary.Add(".mpg", "video/mpeg");
			_mappingDictionary.Add(".mpp", "application/vnd.ms-project");
			_mappingDictionary.Add(".mpv2", "video/mpeg");
			_mappingDictionary.Add(".ms", "application/x-troff-ms");
			_mappingDictionary.Add(".msi", "application/octet-stream");
			_mappingDictionary.Add(".mso", "application/octet-stream");
			_mappingDictionary.Add(".mvb", "application/x-msmediaview");
			_mappingDictionary.Add(".mvc", "application/x-miva-compiled");
			_mappingDictionary.Add(".nc", "application/x-netcdf");
			_mappingDictionary.Add(".nsc", "video/x-ms-asf");
			_mappingDictionary.Add(".nws", "message/rfc822");
			_mappingDictionary.Add(".ocx", "application/octet-stream");
			_mappingDictionary.Add(".oda", "application/oda");
			_mappingDictionary.Add(".odc", "text/x-ms-odc");
			_mappingDictionary.Add(".ods", "application/oleobject");
			_mappingDictionary.Add(".one", "application/onenote");
			_mappingDictionary.Add(".onea", "application/onenote");
			_mappingDictionary.Add(".onetoc", "application/onenote");
			_mappingDictionary.Add(".onetoc2", "application/onenote");
			_mappingDictionary.Add(".onetmp", "application/onenote");
			_mappingDictionary.Add(".onepkg", "application/onenote");
			_mappingDictionary.Add(".osdx", "application/opensearchdescription+xml");
			_mappingDictionary.Add(".p10", "application/pkcs10");
			_mappingDictionary.Add(".p12", "application/x-pkcs12");
			_mappingDictionary.Add(".p7b", "application/x-pkcs7-certificates");
			_mappingDictionary.Add(".p7c", "application/pkcs7-mime");
			_mappingDictionary.Add(".p7m", "application/pkcs7-mime");
			_mappingDictionary.Add(".p7r", "application/x-pkcs7-certreqresp");
			_mappingDictionary.Add(".p7s", "application/pkcs7-signature");
			_mappingDictionary.Add(".pbm", "image/x-portable-bitmap");
			_mappingDictionary.Add(".pcx", "application/octet-stream");
			_mappingDictionary.Add(".pcz", "application/octet-stream");
			_mappingDictionary.Add(".pdf", "application/pdf");
			_mappingDictionary.Add(".pfb", "application/octet-stream");
			_mappingDictionary.Add(".pfm", "application/octet-stream");
			_mappingDictionary.Add(".pfx", "application/x-pkcs12");
			_mappingDictionary.Add(".pgm", "image/x-portable-graymap");
			_mappingDictionary.Add(".pko", "application/vnd.ms-pki.pko");
			_mappingDictionary.Add(".pma", "application/x-perfmon");
			_mappingDictionary.Add(".pmc", "application/x-perfmon");
			_mappingDictionary.Add(".pml", "application/x-perfmon");
			_mappingDictionary.Add(".pmr", "application/x-perfmon");
			_mappingDictionary.Add(".pmw", "application/x-perfmon");
			_mappingDictionary.Add(".png", "image/png");
			_mappingDictionary.Add(".pnm", "image/x-portable-anymap");
			_mappingDictionary.Add(".pnz", "image/png");
			_mappingDictionary.Add(".pot", "application/vnd.ms-powerpoint");
			_mappingDictionary.Add(".potm", "application/vnd.ms-powerpoint.template.macroEnabled.12");
			_mappingDictionary.Add(".potx", "application/vnd.openxmlformats-officedocument.presentationml.template");
			_mappingDictionary.Add(".ppam", "application/vnd.ms-powerpoint.addin.macroEnabled.12");
			_mappingDictionary.Add(".ppm", "image/x-portable-pixmap");
			_mappingDictionary.Add(".pps", "application/vnd.ms-powerpoint");
			_mappingDictionary.Add(".ppsm", "application/vnd.ms-powerpoint.slideshow.macroEnabled.12");
			_mappingDictionary.Add(".ppsx", "application/vnd.openxmlformats-officedocument.presentationml.slideshow");
			_mappingDictionary.Add(".ppt", "application/vnd.ms-powerpoint");
			_mappingDictionary.Add(".pptm", "application/vnd.ms-powerpoint.presentation.macroEnabled.12");
			_mappingDictionary.Add(".pptx", "application/vnd.openxmlformats-officedocument.presentationml.presentation");
			_mappingDictionary.Add(".prf", "application/pics-rules");
			_mappingDictionary.Add(".prm", "application/octet-stream");
			_mappingDictionary.Add(".prx", "application/octet-stream");
			_mappingDictionary.Add(".ps", "application/postscript");
			_mappingDictionary.Add(".psd", "application/octet-stream");
			_mappingDictionary.Add(".psm", "application/octet-stream");
			_mappingDictionary.Add(".psp", "application/octet-stream");
			_mappingDictionary.Add(".pub", "application/x-mspublisher");
			_mappingDictionary.Add(".qt", "video/quicktime");
			_mappingDictionary.Add(".qtl", "application/x-quicktimeplayer");
			_mappingDictionary.Add(".qxd", "application/octet-stream");
			_mappingDictionary.Add(".ra", "audio/x-pn-realaudio");
			_mappingDictionary.Add(".ram", "audio/x-pn-realaudio");
			_mappingDictionary.Add(".rar", "application/octet-stream");
			_mappingDictionary.Add(".ras", "image/x-cmu-raster");
			_mappingDictionary.Add(".rf", "image/vnd.rn-realflash");
			_mappingDictionary.Add(".rgb", "image/x-rgb");
			_mappingDictionary.Add(".rm", "application/vnd.rn-realmedia");
			_mappingDictionary.Add(".rmi", "audio/mid");
			_mappingDictionary.Add(".roff", "application/x-troff");
			_mappingDictionary.Add(".rpm", "audio/x-pn-realaudio-plugin");
			_mappingDictionary.Add(".rtf", "application/rtf");
			_mappingDictionary.Add(".rtx", "text/richtext");
			_mappingDictionary.Add(".scd", "application/x-msschedule");
			_mappingDictionary.Add(".sct", "text/scriptlet");
			_mappingDictionary.Add(".sea", "application/octet-stream");
			_mappingDictionary.Add(".setpay", "application/set-payment-initiation");
			_mappingDictionary.Add(".setreg", "application/set-registration-initiation");
			_mappingDictionary.Add(".sgml", "text/sgml");
			_mappingDictionary.Add(".sh", "application/x-sh");
			_mappingDictionary.Add(".shar", "application/x-shar");
			_mappingDictionary.Add(".sit", "application/x-stuffit");
			_mappingDictionary.Add(".sldm", "application/vnd.ms-powerpoint.slide.macroEnabled.12");
			_mappingDictionary.Add(".sldx", "application/vnd.openxmlformats-officedocument.presentationml.slide");
			_mappingDictionary.Add(".smd", "audio/x-smd");
			_mappingDictionary.Add(".smi", "application/octet-stream");
			_mappingDictionary.Add(".smx", "audio/x-smd");
			_mappingDictionary.Add(".smz", "audio/x-smd");
			_mappingDictionary.Add(".snd", "audio/basic");
			_mappingDictionary.Add(".snp", "application/octet-stream");
			_mappingDictionary.Add(".spc", "application/x-pkcs7-certificates");
			_mappingDictionary.Add(".spl", "application/futuresplash");
			_mappingDictionary.Add(".src", "application/x-wais-source");
			_mappingDictionary.Add(".ssm", "application/streamingmedia");
			_mappingDictionary.Add(".sst", "application/vnd.ms-pki.certstore");
			_mappingDictionary.Add(".stl", "application/vnd.ms-pki.stl");
			_mappingDictionary.Add(".sv4cpio", "application/x-sv4cpio");
			_mappingDictionary.Add(".sv4crc", "application/x-sv4crc");
			_mappingDictionary.Add(".swf", "application/x-shockwave-flash");
			_mappingDictionary.Add(".t", "application/x-troff");
			_mappingDictionary.Add(".tar", "application/x-tar");
			_mappingDictionary.Add(".tcl", "application/x-tcl");
			_mappingDictionary.Add(".tex", "application/x-tex");
			_mappingDictionary.Add(".texi", "application/x-texinfo");
			_mappingDictionary.Add(".texinfo", "application/x-texinfo");
			_mappingDictionary.Add(".tgz", "application/x-compressed");
			_mappingDictionary.Add(".thmx", "application/vnd.ms-officetheme");
			_mappingDictionary.Add(".thn", "application/octet-stream");
			_mappingDictionary.Add(".tif", "image/tiff");
			_mappingDictionary.Add(".tiff", "image/tiff");
			_mappingDictionary.Add(".toc", "application/octet-stream");
			_mappingDictionary.Add(".tr", "application/x-troff");
			_mappingDictionary.Add(".trm", "application/x-msterminal");
			_mappingDictionary.Add(".tsv", "text/tab-separated-values");
			_mappingDictionary.Add(".ttf", "application/octet-stream");
			_mappingDictionary.Add(".txt", "text/plain");
			_mappingDictionary.Add(".u32", "application/octet-stream");
			_mappingDictionary.Add(".uls", "text/iuls");
			_mappingDictionary.Add(".ustar", "application/x-ustar");
			_mappingDictionary.Add(".vbs", "text/vbscript");
			_mappingDictionary.Add(".vcf", "text/x-vcard");
			_mappingDictionary.Add(".vcs", "text/plain");
			_mappingDictionary.Add(".vdx", "application/vnd.ms-visio.viewer");
			_mappingDictionary.Add(".vml", "text/xml");
			_mappingDictionary.Add(".vsd", "application/vnd.visio");
			_mappingDictionary.Add(".vss", "application/vnd.visio");
			_mappingDictionary.Add(".vst", "application/vnd.visio");
			_mappingDictionary.Add(".vsto", "application/x-ms-vsto");
			_mappingDictionary.Add(".vsw", "application/vnd.visio");
			_mappingDictionary.Add(".vsx", "application/vnd.visio");
			_mappingDictionary.Add(".vtx", "application/vnd.visio");
			_mappingDictionary.Add(".wav", "audio/wav");
			_mappingDictionary.Add(".wax", "audio/x-ms-wax");
			_mappingDictionary.Add(".wbmp", "image/vnd.wap.wbmp");
			_mappingDictionary.Add(".wcm", "application/vnd.ms-works");
			_mappingDictionary.Add(".wdb", "application/vnd.ms-works");
			_mappingDictionary.Add(".wks", "application/vnd.ms-works");
			_mappingDictionary.Add(".wm", "video/x-ms-wm");
			_mappingDictionary.Add(".wma", "audio/x-ms-wma");
			_mappingDictionary.Add(".wmd", "application/x-ms-wmd");
			_mappingDictionary.Add(".wmf", "application/x-msmetafile");
			_mappingDictionary.Add(".wml", "text/vnd.wap.wml");
			_mappingDictionary.Add(".wmlc", "application/vnd.wap.wmlc");
			_mappingDictionary.Add(".wmls", "text/vnd.wap.wmlscript");
			_mappingDictionary.Add(".wmlsc", "application/vnd.wap.wmlscriptc");
			_mappingDictionary.Add(".wmp", "video/x-ms-wmp");
			_mappingDictionary.Add(".wmv", "video/x-ms-wmv");
			_mappingDictionary.Add(".wmx", "video/x-ms-wmx");
			_mappingDictionary.Add(".wmz", "application/x-ms-wmz");
			_mappingDictionary.Add(".wps", "application/vnd.ms-works");
			_mappingDictionary.Add(".wri", "application/x-mswrite");
			_mappingDictionary.Add(".wrl", "x-world/x-vrml");
			_mappingDictionary.Add(".wrz", "x-world/x-vrml");
			_mappingDictionary.Add(".wsdl", "text/xml");
			_mappingDictionary.Add(".wvx", "video/x-ms-wvx");
			_mappingDictionary.Add(".x", "application/directx");
			_mappingDictionary.Add(".xaf", "x-world/x-vrml");
			_mappingDictionary.Add(".xaml", "application/xaml+xml");
			_mappingDictionary.Add(".xap", "application/x-silverlight-app");
			_mappingDictionary.Add(".xbap", "application/x-ms-xbap");
			_mappingDictionary.Add(".xbm", "image/x-xbitmap");
			_mappingDictionary.Add(".xdr", "text/plain");
			_mappingDictionary.Add(".xla", "application/vnd.ms-excel");
			_mappingDictionary.Add(".xlam", "application/vnd.ms-excel.addin.macroEnabled.12");
			_mappingDictionary.Add(".xlc", "application/vnd.ms-excel");
			_mappingDictionary.Add(".xlm", "application/vnd.ms-excel");
			_mappingDictionary.Add(".xls", "application/vnd.ms-excel");
			_mappingDictionary.Add(".xlsb", "application/vnd.ms-excel.sheet.binary.macroEnabled.12");
			_mappingDictionary.Add(".xlsm", "application/vnd.ms-excel.sheet.macroEnabled.12");
			_mappingDictionary.Add(".xlsx", "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet");
			_mappingDictionary.Add(".xlt", "application/vnd.ms-excel");
			_mappingDictionary.Add(".xltm", "application/vnd.ms-excel.template.macroEnabled.12");
			_mappingDictionary.Add(".xltx", "application/vnd.openxmlformats-officedocument.spreadsheetml.template");
			_mappingDictionary.Add(".xlw", "application/vnd.ms-excel");
			_mappingDictionary.Add(".xml", "text/xml");
			_mappingDictionary.Add(".xof", "x-world/x-vrml");
			_mappingDictionary.Add(".xpm", "image/x-xpixmap");
			_mappingDictionary.Add(".xps", "application/vnd.ms-xpsdocument");
			_mappingDictionary.Add(".xsd", "text/xml");
			_mappingDictionary.Add(".xsf", "text/xml");
			_mappingDictionary.Add(".xsl", "text/xml");
			_mappingDictionary.Add(".xslt", "text/xml");
			_mappingDictionary.Add(".xsn", "application/octet-stream");
			_mappingDictionary.Add(".xtp", "application/octet-stream");
			_mappingDictionary.Add(".xwd", "image/x-xwindowdump");
			_mappingDictionary.Add(".z", "application/x-compress");
			_mappingDictionary.Add(".zip", "application/x-zip-compressed");
		}
	}
}

Tuesday, September 8, 2020

Tips'n'tricks installing AA11 Citrix Remote Agent

I have been meaning to write this little blog post for a long time. I had some experience with Automation Anywhere 11, trying to set it up in an environment where I could not create any virtual machine: so the only solution was to try to automate via Citrix.

Since I'm not a big fan of surface automation (like all of you, I guess), I found this plugin (which is really similar to the one UiPath has on board... BluePrism, instead, lacks this feature).

As you can see in the following picture, taken from here, it helps you host the robot machine (so you are the service provider) without installing automation software in the customer's data center, and the commercial promise is to use the Citrix application as if it were a local application.

The customer only has to install the AA Remote Agent on the Citrix servers, and you have to start the Citrix Receiver before starting your robot; then everything will work fine.

 

 

But you are here for the lessons learned, not for the commercial pitch, so what have I learned from this experience?

  1. Latency: programming a robot using the Remote Agent can be frustrating because of the latency; consider this overhead in development;
  2. Functionality: as you can imagine, it works quite well but it doesn't work for everything. I tested it on some custom applications and the result is: the more custom the application, the more problems. Somebody could argue that it's the same with local applications, but it's also true that with local applications you have more workarounds available;
  3. Internet Explorer + Citrix: nobody wrote this anywhere, but if the Citrix server version is < 7.15 you will never be able to use IE through the Citrix Remote Agent;
  4. Chrome + Citrix: sometimes it doesn't work because the Remote Agent does not install two registry keys, Computer\HKEY_CURRENT_USER\Software\Google\Chrome\Extensions\akaagfbdekcffpppnbajieljleihifdc and Computer\HKEY_LOCAL_MACHINE\SOFTWARE\WOW6432Node\Google\Chrome\Extensions\akaagfbdekcffpppnbajieljleihifdc. AA support asked the customer to install Automation Anywhere on the Citrix server as well, but you can do it in a smarter way, simply creating those registry keys manually on the server (see the sketch after this list)
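
A minimal sketch of that manual fix in C# (run it on the Citrix server with the proper rights; the post doesn't mention any values that have to exist under the keys, so only the key paths are created here):

using Microsoft.Win32;

class CreateChromeAgentKeys
{
	static void Main()
	{
		// Extension id of the AA11 Chrome plugin, as reported in the post.
		const string extensionId = "akaagfbdekcffpppnbajieljleihifdc";

		// HKCU key (per-user, no elevation needed).
		using (RegistryKey key = Registry.CurrentUser.CreateSubKey(
			@"Software\Google\Chrome\Extensions\" + extensionId)) { }

		// HKLM key under WOW6432Node (requires elevation).
		using (RegistryKey key = Registry.LocalMachine.CreateSubKey(
			@"SOFTWARE\WOW6432Node\Google\Chrome\Extensions\" + extensionId)) { }
	}
}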


Some of the things I've written in this little blog post are now in the online documentation, but since it took me a lot of time to discover them, if you've landed on this page they may still be useful.

Monday, May 18, 2020

Authorization using IIS UrlRewrite and headers

Lately one of my customers requested a very simple web application, with an on-premise frontend and a backend which calls the Google AI APIs.
Why an on-prem web application? The client leverages Citrix NetScaler to enable authentication through SSO.
What about authorization? They create AD groups and restrict permissions using NetScaler as the authorization layer too.

The problem is that, while we were deploying to production, the client's policies were also changing, so the only way to implement authorization became reading the headers (see the example below) and using the saml-group header value on the backend.
Unfortunately we had built an AngularJS application, so reading the headers, implementing httpModules or other stuff like that was not so easy, and we looked for a cheap solution that kept the as-is application untouched.

Using just the IIS URL Rewrite module we are able to read headers and filter access according to their content, without any impact on the frontend application and without writing a single line of code.


As you can see in this image, with a rule like this IIS blocks every request that does not come from a user belonging to the group BPO-997_CA...swich.
When referring to a particular header key (in my case saml-groups), you have to use the HTTP_saml-groups input in order to create the rule.
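
For reference, the same kind of rule expressed as a web.config fragment looks roughly like this (a sketch: the group pattern and the 403 response are placeholders, only the HTTP_saml-groups input comes from the rule described above):

<system.webServer>
  <rewrite>
    <rules>
      <rule name="Authorize by saml-groups header" stopProcessing="true">
        <match url=".*" />
        <conditions>
          <!-- Block the request when the saml-groups header does not contain the required AD group. -->
          <add input="{HTTP_saml-groups}" pattern="YOUR-AD-GROUP" negate="true" />
        </conditions>
        <action type="CustomResponse" statusCode="403" reason="Forbidden" description="Missing required group" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>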

What's wrong with this solution? It's applicable only when you need very simple authorization (in our case, just access to the application), but it's useful to know ;)

Credits to Valerio Angelini and Marco Aimi who worked with me testing this solution.

Monday, February 10, 2020

pdf2data template editor setup that works

pdf2data is a great product, but the web app installation is not so well documented; moreover iText has a terrible customer service that simply doesn't reply to your emails.
So if you have to start this web app you only have this link, which does not document all the steps.
I assume you installed Tomcat and, following the guide, you keep getting errors, so:
  1. Undeploy your pdf2data web app
  2. Rename the downloaded WAR file in the directory "c:\Program Files\Apache Software Foundation\Tomcat 7.0\webapps\" simply to "pdf2data.war"
  3. Create a TEMP folder (eg. C:\TEMP) to host pdf2data temp files
  4. Create the file web.properties in "c:\Program Files\Apache Software Foundation\Tomcat 7.0\webapps\pdf2data". dir.temp is not documented and requires double backslashes in the path; licensekey must be the absolute path of your iText license key
  5. dir.temp = C:\\TEMP
    licensekey = C:\itextkeylicense.xml
    
    mail.to=
    mail.smtp.host=
    mail.smtp.port=
    mail.ssl.smtp.port=
    mail.ssl.enable=
    mail.smtp.starttls.enable=
    
  6. Create a system variable PDF2DATA_PROPERTIES and set it to the path of the file from the previous step (e.g. "c:\Program Files\Apache Software Foundation\Tomcat 7.0\webapps\pdf2data\web.properties")
  7. Restart Tomcat and enjoy your pdf2data template editor

Saturday, November 2, 2019

Playing with BP 5.x and queue item data dump

Hi guys, this is just a brief post about queue item dumping.
I found it very useful with BP version 5.x and high-volume data which caused failures in the GetNextItem action.
So I needed to dump data to another table in order to keep the DB as small as possible while waiting for the customer's permission to wipe all the old data (or restore just a part of it).

The SQL script is self-explanatory and deals with ID consistency, nothing hard to do. And if you're asking whether you can do this operation without corrupting the DB: yes you can, so have fun:


-- We need to create just one table to dump data; alternatively you can use SELECT * INTO, which creates a new table every time
DROP TABLE [dbo].[BPAWorkQueueItem_TEMP]
CREATE TABLE [dbo].[BPAWorkQueueItem_TEMP](
 [id] [uniqueidentifier] NOT NULL,
 [queueid] [uniqueidentifier] NOT NULL,
 [keyvalue] [nvarchar](255) NULL,
 [status] [nvarchar](255) NULL,
 [attempt] [int] NULL,
 [loaded] [datetime] NULL,
 [completed] [datetime] NULL,
 [exception] [datetime] NULL,
 [exceptionreason] [nvarchar](max) NULL,
 [deferred] [datetime] NULL,
 [worktime] [int] NULL,
 [data] [nvarchar](max) NULL,
 [queueident] [int] NOT NULL,
 [ident] [bigint] IDENTITY(1,1) NOT NULL,
 [sessionid] [uniqueidentifier] NULL,
 [priority] [int] NOT NULL,
 [prevworktime] [int] NOT NULL,
 [attemptworktime]  AS ([worktime]-[prevworktime]) PERSISTED,
 [finished]  AS (isnull([exception],[completed])) PERSISTED,
 [exceptionreasonvarchar]  AS (CONVERT([nvarchar](400),[exceptionreason])),
 [exceptionreasontag]  AS (CONVERT([nvarchar](415),N'Exception: '+replace(CONVERT([nvarchar](400),[exceptionreason]),N';',N':'))) PERSISTED,
 [encryptid] [int] NULL,
 [lastupdated]  AS (coalesce([completed],[exception],[loaded])) PERSISTED,
 [locktime] [datetime] NULL,
 [lockid] [uniqueidentifier] NULL)
GO

-- We enable identity insert, this will preserve ID assigned by BluePrism, then we dump data
-- in the where condition we use queue name and completed date (so we do not take any pending or deferred item)
SET IDENTITY_INSERT [dbo].[BPAWorkQueueItem_TEMP] ON

INSERT INTO [dbo].[BPAWorkQueueItem_TEMP](
 [id]
 ,[queueid]
 ,[keyvalue]
 ,[status]
 ,[attempt]
 ,[loaded]
 ,[completed]
 ,[exception]
 ,[exceptionreason]
 ,[deferred]
 ,[worktime]
 ,[data]
 ,[queueident]
 ,[ident]
 ,[sessionid]
 ,[priority]
 ,[prevworktime]
 ,[locktime]
 ,[lockid])
 SELECT wqi.[id]
    ,wqi.[queueid]
    ,wqi.[keyvalue]
    ,wqi.[status]
    ,wqi.[attempt]
    ,wqi.[loaded]
    ,wqi.[completed]
    ,wqi.[exception]
    ,wqi.[exceptionreason]
    ,wqi.[deferred]
    ,wqi.[worktime]
    ,wqi.[data]
    ,wqi.[queueident]
    ,wqi.[ident]
    ,wqi.[sessionid]
    ,wqi.[priority]
    ,wqi.[prevworktime]
    ,wqi.[locktime]
    ,wqi.[lockid]

   FROM [BluePrism].[dbo].[BPAWorkQueueItem] as wqi
   JOIN [BluePrism].[dbo].[BPAWorkQueue] as wq on wqi.queueid = wq.id
   WHERE 
  wq.name = 'Queue.BonificiEsteroEntrata.ContiAttesaLavorazione' AND 
  finished < DATEADD(MONTH, -3, GETDATE())
GO

-- Here we restore identity insert value
SET IDENTITY_INSERT [dbo].[BPAWorkQueueItem_TEMP] OFF

-- Finally we erase data from original table
DELETE FROM [dbo].[BPAWorkQueueItem] WHERE 
 EXISTS (SELECT * FROM [dbo].[BPAWorkQueueItem_TEMP] WHERE [BPAWorkQueueItem].ident = [BPAWorkQueueItem_TEMP].ident)

Monday, April 15, 2019

BP and JSON

As you already know, BP works on queues.

What if a process is not based on queue items but is based on a sequence of bulk operations?
You can think of every bulk operation as a fake queue item and manage them from the control room.
But maybe you want to apply some logic. For example, let's assume your process is made of 4 subprocesses (A, B, C, D): if something goes wrong with A you don't want to execute C, while B and D are not affected.

Yeah, maybe too simple as an example, but use your imagination... believe me, you don't want to know anything about the process that, in the real world, lives on this solution (Alberto Cannas went mad on it).
So think about a process that doesn't rely on queues and where you want to play with subprocesses that can be seen as transactions.

We could store the data in a DB, but what if we cannot use a DB? We can use a JSON file as a DB, in order to have a flexible "DB" that helps us.
For example, in the process we are dealing with, we use a JSON file that is generated every day, and the process state is represented by data in a collection.

To make everything clear, I will show you a little process that simply exports data only at 10.00/12.00/14.00/16.00 using JSON. This is what the BP object Utility - JSON wrote when using the Collection to JSON action:

{"Log 10":false, "Log 12":false, "Log 14":false, "Log 16":false} 

So you can read the information you store in the JSON file, implement your logic, and write it back only when needed.
You should take care of 2 things:
  1. The subprocess that works on the file must be wrapped in an environment lock. This will be helpful in a multi-robot environment;
  2. Pay attention to what you write in the JSON: when you use the JSON to Collection action, the object writes the collection as an array, so you have to remove the brackets at the beginning and at the end of the JSON (thanks Manuel Mascia for the trick)
Yeah... tips'n'tricks
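
Just to make the read-check-write idea concrete outside of BP, here is a rough C# sketch of the same logic (the file path is hypothetical and the flag names follow the example above; in BP this sits inside the environment lock):

using System;
using System.Collections.Generic;
using System.IO;
using Newtonsoft.Json;

class ExportState
{
	static void Main()
	{
		string statePath = @"C:\TEMP\export-state.json"; // hypothetical path

		// Read the current state: { "Log 10": false, "Log 12": false, ... }
		Dictionary<string, bool> state =
			JsonConvert.DeserializeObject<Dictionary<string, bool>>(File.ReadAllText(statePath));

		string slot = "Log " + DateTime.Now.Hour.ToString("00"); // e.g. "Log 10"

		// Run the export only if this time slot exists and hasn't been done yet.
		if (state.TryGetValue(slot, out bool done) && !done)
		{
			// ... do the export ...

			// Write back only when something actually changed.
			state[slot] = true;
			File.WriteAllText(statePath, JsonConvert.SerializeObject(state));
		}
	}
}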

Now you can push this solution beyond this example.
That's all folks.

Monday, April 1, 2019

Add credentials from BP... programmatically

As you know, you can use BP not only to automate business processes.
For example, in our scenario (we are talking about an RPA program with more than 30 complex automations), we have credentials that expire if you don't log in with them for a certain period of time.
On the other hand, every time we provision a new process we have to add to the credential manager all the credentials needed by the process itself (call this value x) for all the machines involved (call this value y), so it's easy to understand that the work of provisioning and keeping credentials alive led me to a new limit: BP does not provide anything to create credentials, so SQL is our ally.
So the key is to launch this custom SQL stored procedure from BP somehow (via ODBC or whatever).
The result will be a credential with all roles, with permissions on all processes and resource PCs, and with an empty password.
Please note that, for BP, "empty" is 3uDCB67zyUqBsHym7o635w==:JM6brcdYVFsZhbEKISRtaQ== because the value is encrypted and may vary depending on the Encryption Scheme, so make sure you check what the null password value is on your farm.

Add credentials script

USE [BluePrism]
GO
/****** Object:  StoredProcedure [dbo].[BPAddUser]    Script Date: 04/10/2018 14:45:30 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

ALTER PROCEDURE [dbo].[BPAddUser]
       @userLogin nvarchar(128)
AS
BEGIN
       -- SET NOCOUNT ON added to prevent extra result sets from
       -- interfering with SELECT statements.
       SET NOCOUNT ON;

    -- Insert statements for procedure here
       BEGIN 

             if not exists (select * from BPACredentials where login = @userLogin AND name = @userlogin)
                    BEGIN
                           declare @newid uniqueidentifier = NEWID();

                           INSERT INTO [dbo].[BPACredentials]
                                  ([id],[name],[description],[login],[password],[expirydate],[invalid],[encryptid])
                           VALUES
                           (@newid,@userLogin,'Created by script',@userLogin,'3uDCB67zyUqBsHym7o635w==:JM6brcdYVFsZhbEKISRtaQ==',NULL,0,1)

                           INSERT INTO [dbo].[BPACredentialRole]
                                  ([credentialid],[userroleid])
                           VALUES
                                  (@newid, NULL)

                           INSERT INTO [dbo].[BPACredentialsProcesses]
                                  ([credentialid],[processid])
                           VALUES
                                  (@newid, NULL)

                           INSERT INTO [dbo].[BPACredentialsResources]
                                  ([credentialid],[resourceid])
                           VALUES
                                  (@newid, NULL)
                    END
             Else
                    BEGIN
                    
                    DECLARE @credentialId uniqueidentifier;
                    select @credentialId = id from BPACredentials where login = @userLogin AND name = @userlogin;
                    
                    -- Create or overwrite 
                    if not exists (select * from BPACredentialRole where credentialid = @credentialId)
                           INSERT INTO BPACredentialRole([credentialid],[userroleid]) values (@credentialId, NULL)
                    else
                           update BPACredentialRole set userroleid = NULL where credentialid = @credentialId
                           

                    -- Create or overwrite 
                    if not exists (select * from BPACredentialsProcesses where credentialid = @credentialId)
                           INSERT INTO BPACredentialsProcesses([credentialid],[processid]) values (@credentialId, NULL)
                    else
                           update BPACredentialsProcesses set processid = NULL where credentialid = @credentialId


                    -- Create or overwrite
                    if not exists (select * from BPACredentialsResources where credentialid = @credentialId)
                           INSERT INTO BPACredentialsResources([credentialid],[resourceid]) values (@credentialId, NULL)
                    else
                           update BPACredentialsResources set resourceid = NULL where credentialid = @credentialId
             END

       END
END
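
To launch it from BP via ODBC, a code stage along these lines should do the trick (a sketch; the connection string is a placeholder and depends on your driver/DSN):

using System.Data;
using System.Data.Odbc;

public static void AddUser(string connectionString, string userLogin)
{
	using (OdbcConnection connection = new OdbcConnection(connectionString))
	using (OdbcCommand command = new OdbcCommand("{CALL BPAddUser(?)}", connection))
	{
		// ODBC calls stored procedures through the {CALL ...} escape syntax with positional parameters.
		command.CommandType = CommandType.StoredProcedure;
		command.Parameters.AddWithValue("@userLogin", userLogin);

		connection.Open();
		command.ExecuteNonQuery();
	}
}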

Saturday, March 30, 2019

Embarrassing stories: repair lookup fields

You deployed lists with lookup fields the old-fashioned way and something went wrong.
Your lookup field points to nothing because your team forgot the difference between a list definition and a list instance.
The script changes (brutally) the schema of the field using CSOM: I don't like it, but it works like a charm, so don't panic; you can use this script and you'll be home ASAP.

Just change the uppercase values with yours.
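
The embedded script didn't survive here, but the idea is simple enough to sketch with CSOM (uppercase values are the placeholders to change; the regex just repoints the List attribute of the field schema to the right list instance):

using System.Net;
using System.Text.RegularExpressions;
using Microsoft.SharePoint.Client;

class FixLookupField
{
	static void Main()
	{
		using (ClientContext ctx = new ClientContext("http://SITEURL"))
		{
			ctx.Credentials = new NetworkCredential("USER", "PASSWORD", "DOMAIN");

			Field field = ctx.Web.Lists.GetByTitle("LISTTITLE").Fields.GetByInternalNameOrTitle("FIELDNAME");
			ctx.Load(field, f => f.SchemaXml);
			ctx.ExecuteQuery();

			// Brutally rewrite the schema so the lookup points to the correct list instance.
			string schema = Regex.Replace(field.SchemaXml, "List=\"[^\"]*\"", "List=\"{LOOKUPLISTGUID}\"");

			field.SchemaXml = schema;
			field.UpdateAndPushChanges(true);
			ctx.ExecuteQuery();
		}
	}
}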


Monday, July 2, 2018

Scheduler is locked?

Lately a lot of people talk about RPA, but very few talk about what happens when you have to scale and you have something like 20 robots working all together, synchronized, with a "time/resource filling" target.
Sometimes you see scheduled tasks stuck forever in "Pending", not responding to any delete or run request.
The solution is: let's go to the DB and use this query to find all the pending processes in a human-readable way.

  SELECT 
       s.sessionid,
       r.name,
       p.name,
       s.startdatetime,
       st.statusid,
       st.description
  FROM 
  [BluePrism].[dbo].[BPASession] as s
  inner join BluePrism.dbo.BPAProcess as p
  on s.processid = p.processid
  inner join BluePrism.dbo.BPAResource as r 
  on r.resourceid = s.runningresourceid
  inner join BluePrism.dbo.BPAStatus as st
  on s.statusid = st.statusid
  
  where st.description = 'Pending'
  order by s.startdatetime

Then just change the end date from NULL to another value (maybe the same as the start one) and the statusid to 2, which stands for the Terminated status, and the linked resource will be released ;)
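
Something along these lines (a sketch: it assumes the end date column is enddatetime, so double-check your schema and the sessionid, and back up before touching BPASession directly):

  UPDATE [BluePrism].[dbo].[BPASession]
  SET enddatetime = startdatetime, -- or any sensible end date
      statusid = 2                 -- 2 = Terminated
  WHERE sessionid = 'STUCK-SESSION-ID'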

Monday, June 25, 2018

Make your life a little easier with ODBC

I've written about the DLL import problem of BluePrism and about possible workarounds with custom DLLs, but we can also make our life easier when we can.
For example, in my last project I had to work with a lot of data sources: SQL Server, MySQL and Access, all together.
All of the above are data sources which have ODBC connectors.
This means we can deal with all DBMSs (and much more) simply by installing the ODBC connectors on the robot machine and using a single VBO to deal with them.
The key is to pass the driver info to the VBO so the connection string can be composed.
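
For example, composing the connection string from the driver info passed in could look like this (a sketch; the driver name and the other keys depend on the connector you installed):

using System.Data.Odbc;

// Build the connection string from the driver info passed to the VBO.
OdbcConnectionStringBuilder builder = new OdbcConnectionStringBuilder
{
 Driver = "MySQL ODBC 8.0 Unicode Driver" // whatever ODBC connector is installed on the robot machine
};
builder.Add("Server", "myserver");
builder.Add("Database", "mydb");
builder.Add("Uid", "user");
builder.Add("Pwd", "password");

string connectionString = builder.ConnectionString;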

So these are the VBO actions I created to deal with queries.

Read in collection

The action you need the most.
It has two pieces of code, because you may need to get a lot of data (more than 300,000 rows).
Using the SqlCommand variable you can do more than "Select * from table": you can also filter before getting the data.
The first code block is faster but will not work with a lot of data; the second one is slower but works every time.
Pay attention kiddo, BP passes parameters by value and not by reference, so you may need to do the "business work" inside the VBO action in order not to go "Out of memory".
OdbcConnection connection = new OdbcConnection(connectionString);
result = new DataTable();
exception = "";

try
{
 connection.Open();
 OdbcCommand command = new OdbcCommand(sqlCommand, connection);
 OdbcDataReader dataReader = command.ExecuteReader();
 result.Load(dataReader);
}
catch (Exception ex)
{ 
 exception = ex.Message;
}
finally
{
 connection.Close();
}

And this is the "big data" one.
OdbcConnection connection = new OdbcConnection(connectionString);
result = new DataTable();
exception = "";

try
{
 connection.Open();
 OdbcCommand command = new OdbcCommand(sqlCommand, connection);
 OdbcDataReader dataReader = command.ExecuteReader();

 DataTable resultSchema = dataReader.GetSchemaTable();
 List<DataColumn> listCols = new List<DataColumn>();
 if (resultSchema != null)
 {
  foreach (DataRow drow in resultSchema.Rows)
  {
   string columnName = System.Convert.ToString(drow["ColumnName"]);
   DataColumn column = new DataColumn(columnName, (Type)(drow["DataType"]));
   column.Unique = (bool)drow["IsUnique"];
   column.AllowDBNull = (bool)drow["AllowDBNull"];
   column.AutoIncrement = (bool)drow["IsAutoIncrement"];
   listCols.Add(column);
   result.Columns.Add(column);
  }
 }

 DataRow dataRow = null;
 while (dataReader.Read())
 {
  dataRow = result.NewRow();
  for (int i = 0; i < listCols.Count; i++)
  {
   dataRow[((DataColumn)listCols[i])] = dataReader[i];
  }
  result.Rows.Add(dataRow);
 }
}
catch (Exception ex)
{
 exception = ex.Message;
}
finally
{
 connection.Close();
 System.GC.Collect();
}

 Execute Command 

Just executes the query
OdbcConnection connection = new OdbcConnection(connectionString);
exception = "";

try
{
 connection.Open();
 OdbcCommand command = new OdbcCommand(sqlCommand, connection);
 command.ExecuteNonQuery();
}
catch (Exception ex)
{ 
 exception = ex.Message;
}
finally
{
 connection.Close();
}

 Execute Commands in a transaction

Executes more queries in a single transaction

OdbcConnection connection = new OdbcConnection(connectionString);
OdbcTransaction transaction = null;
exception = "";

try
{
 connection.Open();
 transaction = connection.BeginTransaction();
 OdbcCommand command = null;
 
 foreach (DataRow dr in sqlCommands.Rows)
 {
  string sqlCommand = dr["Command"].ToString();
  command = new OdbcCommand(sqlCommand, connection, transaction);
  command.ExecuteNonQuery();
 }
 
 transaction.Commit();
}
catch (Exception ex)
{ 
 if (transaction != null) transaction.Rollback();
 exception = ex.Message;
}
finally
{
 connection.Close();
}


A good strategy could be to create CRUD actions for the ODBC driver in order to implicitly implement the repository pattern.
Changing the ConnectionString will just change the data source type and help create a "universal" repository.
And... that's it for now!

Monday, June 18, 2018

Import-SPWeb preserves ListItemId?

If you're reading this you have my same problem: you don't have an answer to this question, and maybe you need to move a list across site collections/web applications/farms without changing the list item IDs.
Someone says yes, others say no. The answer is that both are true: it depends on the preconditions.

Let's start from this article, which explains a little bit how the ContentDB works.
So, let's create a custom list named "Another custom list" and populate it with a bunch of items.
Yeah just 3
So let's look at what happens in the ContentDB.
First let's look at the list id: we can find it with just a select on the AllLists table, querying by title.
With the list id in your hands you can find all the entries of the items in the AllUserData table, which is the repository of all the ContentDB items (you can also have a look at the composed primary key of the table in the previous screenshot).

Remember something?
Now, let's delete item 1 and create items 4 and 5 and see what happens (please note that item 1 is still in the content DB until I delete it from the site collection recycle bin). Moreover, in AllListAux I can now see the counter used to assign the list item id to the object.
So this is what happens behind the scenes


Mmm.. Aux table
It's now time to export the list and take a look at what SharePoint made, so, unzipping the cmp file, I can read this in the Manifest.xml.
You can see that every item is serialized in XML and, surprise surprise, there is an attribute called IntId, so let's see what happens when importing this cmp file into another site collection:
Import-SPWeb http://sp2013 -Path "C:\TEMP\anothercustomlist.cmp" -IncludeUserSecurity -UpdateVersions Overwrite
and the result is that I now have a new list with a new id, but the same information in AllListAux and AllUserData.
Only GUIDs changed in the import
and only GUIDs
So, even if I'm not showing you the AllUserData table, the list item IDs also remain the same, so the answer to our question seems to be yes.
But what if I try to reimport? Well, items are not overwritten and I can see that the same objects are replicated with different list item ids, starting from the value in the aux table.
So the answer is no if you are importing into a pre-existing list with other items, because you never know what the next item id will be.

Below you can find some code I used to do a final test, getting a lot of items (from a list with more than 6000 items, which is not the one of this example) from both the source list and the destination one, in order to do a final compare, which led me to understand that Import-SPWeb is something good :)
string siteUrl = "http://sp2013/sites/test";
string listTitle = "Test";
string user = "Administrator";
string domain = "DEV";
string password = "Password";

using (ClientContext clientContext = new ClientContext(siteUrl))
{
 clientContext.Credentials = new NetworkCredential(user, password, domain);

 List list = clientContext.Web.Lists.GetByTitle(listTitle);
 clientContext.Load(list);
 clientContext.ExecuteQuery();

 ListItemCollectionPosition itemPosition = null;
 while (true)
 {
  CamlQuery camlQuery = new CamlQuery();
  camlQuery.ListItemCollectionPosition = itemPosition;
  camlQuery.ViewXml = @"<View>
   <ViewFields>
    <FieldRef Name='Title' />
   </ViewFields>
   <RowLimit>1000</RowLimit>
  </View>";

  ListItemCollection listItems = list.GetItems(camlQuery);
  clientContext.Load(listItems);
  clientContext.ExecuteQuery();

  itemPosition = listItems.ListItemCollectionPosition;

  foreach (ListItem listItem in listItems)
  {
   Console.WriteLine("Item Title: {0}", listItem["Title"]);
  }


  if (itemPosition == null)
   break;

  Console.WriteLine(itemPosition.PagingInfo);
  Console.WriteLine();
 }
}

Monday, June 4, 2018

A Blue Prism project with custom DLLs: load DLL from everywhere

I admit, this is one of the most painful points when I work with Blue Prism. As explained in other posts about DLLs, we have to put class libraries into the root folder of Blue Prism in order to use their classes.
This can be a problem when you don't have the rights to access the Blue Prism root folder, so, together with the great Antonio Durante, we found a way to solve it using another forgotten beauty: reflection.

Here you can download a little project in Visual Studio, made of a few lines of code which simply get the content of an editable PDF and return it as a DataTable containing as many rows as lines of text found in the PDF.
This piece of code uses iTextSharp.dll to do the work, so we have a dependency here.


public class Program
{
 public Program(string blablabla)
 {
  // This is a fake constructor
 }

 public static DataTable LoadPdf(string path)
 {
   List<string> data = PdfHelper.ExtractTextFromInvoice(path);

  DataTable table = new DataTable();
  table.Columns.Add("Row", typeof(string));

  DataRow row11 = table.NewRow();
  row11["Row"] = "Yeah 1.1 version as well";
  table.Rows.Add(row11);

  foreach (string item in data)
  {
   DataRow row = table.NewRow();
   row["Row"] = item;
   table.Rows.Add(row);
  }

  return table;
 }

 static void Main(string[] args)
 {
    DataTable t = LoadPdf(@"C:\Users\v.valrosso\Downloads\test1.pdf");
 }
}

The code here is very simple but, just to see again how reflection works, I also wrote this code to test everything. As you can see, I load the assembly from a folder that also contains the referenced DLL (I used the Debug output folder of the dummy project directly).

class Program
{
 static void Main(string[] args)
 {
  string assemblyPath = @"C:\TFS\Test\Code\Poc.ReferencingCode\bin\Debug\Poc.ReferencingCode.dll";

  Assembly asm = Assembly.LoadFrom(assemblyPath);
  //Assembly asm = Assembly.Load(File.ReadAllBytes(assemblyPath));
  Type t = asm.GetType("Poc.ReferencingCode.Program");

  var methodInfo = t.GetMethod("LoadPdf", new Type[] { typeof(string) });
  if (methodInfo == null)
  {
   // never throw generic Exception - replace this with some other exception type
   throw new Exception("No such method exists.");
  }

   // the Program class only exposes a constructor taking a string, so pass a dummy argument
   object o = Activator.CreateInstance(t, "blablabla");

  object[] parameters = new object[1];
  parameters[0] = @"C:\Users\v.valrosso\Downloads\test1.pdf";       

  DataTable r = (DataTable)methodInfo.Invoke(o, parameters);
  Console.WriteLine(r);
 }
}

After that we can just play with BP, creating a new VBO and pasting the code in, and... the die is cast.


Just a warning: BP locks the DLL, so you have to think of something smarter (Antonio and I developed something very, very smart, but sadly I cannot show it to you because it's top secret).
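
One common way around the file lock, already hinted at by the commented-out line in the test program above, is to load the assembly from its bytes instead of from its path. This is just a minimal sketch under that assumption, not the top-secret solution mentioned above:

using System;
using System.Data;
using System.IO;
using System.Reflection;

class LockFreeLoader
{
    static void Main(string[] args)
    {
        // Loading from a byte array releases the file handle, so the DLL on disk is not locked.
        // Referenced assemblies (e.g. iTextSharp.dll) must still be resolvable, for instance by
        // hooking AppDomain.CurrentDomain.AssemblyResolve and loading them the same way.
        string assemblyPath = @"C:\TFS\Test\Code\Poc.ReferencingCode\bin\Debug\Poc.ReferencingCode.dll";
        Assembly asm = Assembly.Load(File.ReadAllBytes(assemblyPath));

        Type t = asm.GetType("Poc.ReferencingCode.Program");
        MethodInfo loadPdf = t.GetMethod("LoadPdf", new Type[] { typeof(string) });

        // LoadPdf is static, so no instance is needed
        DataTable result = (DataTable)loadPdf.Invoke(null, new object[] { @"C:\Users\v.valrosso\Downloads\test1.pdf" });
        Console.WriteLine(result.Rows.Count);
    }
}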

As usual, thanks to my colleague and friend Antonio Durante for the wonderful work we do together every day.

Monday, May 28, 2018

Why in the world can't I get deferred items?

I was wondering why I could not get deferred items from a queue, so I drilled down a little bit into the BP database in order to see all the deferred items.

Just watching the queue
So I dumped the data before and after the completion of the item, just to understand how the DB row gets updated.

Old VS New
After that, we made a little action that injects SQL code from BP in order to get the deferred items.

select *
from BPAWorkQueue as q
join BPAWorkQueueItem as i on q.id = i.queueid
where q.name = 'QueueName'
  and i.deferred IS NOT NULL
  and i.completed IS NULL
  and i.exception IS NULL
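
The action itself is little more than a code stage around that query; a minimal C# sketch of it could look like this (the connection string and queue name are placeholders you would pass in as inputs):

using System.Data;
using System.Data.SqlClient;

class DeferredItemsReader
{
    public static DataTable GetDeferredItems(string connectionString, string queueName)
    {
        const string sql = @"
            select *
            from BPAWorkQueue as q
            join BPAWorkQueueItem as i on q.id = i.queueid
            where q.name = @queueName
              and i.deferred IS NOT NULL
              and i.completed IS NULL
              and i.exception IS NULL";

        DataTable result = new DataTable();
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(sql, connection))
        {
            // the queue name is parameterized instead of being concatenated into the SQL text
            command.Parameters.AddWithValue("@queueName", queueName);
            new SqlDataAdapter(command).Fill(result);
        }

        return result;
    }
}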

And this is how Claudia Musio (thanks for everything) and I solved the problem.

Monday, May 21, 2018

The KeepAlive problem

Sometimes you just cannot ask the IT department to disable the screensaver or the lock screen (and personally I never understood why, since we are dealing with virtualized machines), so you have to solve the problem somehow.

You can certainly rely on the LoginAgent VBO, which helps you understand whether the screen is locked so you can log in again; but if the screen is not locked, you can just move the mouse around so the session stays alive.

I've tried several methods but only this one worked: in the global code you have to insert this piece of code


<System.Runtime.InteropServices.DllImport("user32.dll")> _
Private Shared Sub mouse_event(ByVal dwFlags As Integer, ByVal dx As Integer, ByVal dy As Integer, ByVal dwData As Integer, ByVal dwExtraInfo As Integer)
End Sub

Private Const MOUSEEVENTF_MOVE As Integer = 1
Private Const MOUSEEVENTF_LEFTDOWN As Integer = 2
Private Const MOUSEEVENTF_LEFTUP As Integer = 4
Private Const MOUSEEVENTF_RIGHTDOWN As Integer = 8
Private Const MOUSEEVENTF_RIGHTUP As Integer = 16
Private Const MOUSEEVENTF_MIDDLEDOWN As Integer = 32
Private Const MOUSEEVENTF_MIDDLEUP As Integer = 64
Private Const MOUSEEVENTF_ABSOLUTE As Integer = 32768

Public Shared Sub Move(ByVal xDelta As Integer, ByVal yDelta As Integer)
    mouse_event(MOUSEEVENTF_MOVE, xDelta, yDelta, 0, 0)
End Sub

After that you can create an action (e.g. MouseMove) which randomly moves the mouse on the screen

Dim Randomizer As System.Random = New System.Random()
Dim x As Integer = Randomizer.Next(1, 640)
Dim y As Integer = Randomizer.Next(1, 480)

Move(x, y)

And that's it, another problem solved by Varro and Antonio Durante.

Monday, May 14, 2018

Use BP Tags to do something more

What about tags?
I think tags can be used for more than simply tagging an item to understand what kind of item the robot is working on.
Indeed, we thought about a new way to use them, basically replicating the .NET Dictionary ToString().

You can represent a dictionary as a set of key-value pairs like this:

Key1:Value1;Key2:Value2;Key3:Value3;
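
Just to make the idea concrete, here is a minimal C# sketch (plain .NET, not Blue Prism code) that round-trips such a tag string to a dictionary and back, assuming every key appears at most once:

using System;
using System.Collections.Generic;
using System.Linq;

class TagDictionary
{
    // "Key1:Value1;Key2:Value2;" -> dictionary
    public static Dictionary<string, string> Parse(string tagString)
    {
        return tagString
            .Split(new[] { ';' }, StringSplitOptions.RemoveEmptyEntries)
            .Select(pair => pair.Split(new[] { ':' }, 2))
            .ToDictionary(parts => parts[0], parts => parts.Length > 1 ? parts[1] : string.Empty);
    }

    // dictionary -> "Key1:Value1;Key2:Value2;"
    public static string Serialize(Dictionary<string, string> tags)
    {
        return string.Concat(tags.Select(kv => kv.Key + ":" + kv.Value + ";"));
    }

    static void Main()
    {
        var tags = Parse("Key1:Value1;Key2:Value2;Key3:Value3;");
        tags["Key2"] = "NewValue";              // replace the value of an existing key
        Console.WriteLine(Serialize(tags));     // e.g. Key1:Value1;Key2:NewValue;Key3:Value3;
    }
}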

And with just a bit of code, you can play with tags.
Look at this diagram: here we play with tags using custom code


Dim patternTag As New Regex("(" + tagKey + ":)+([^;]){0,}")
Dim patternKey As New Regex("(" + tagKey + ":)+")
Dim match As Match
Dim value As String

match = patternTag.Match(tagString)

If match.Success Then
 valueTag = match.Value.TrimEnd(";")
 value = patternKey.Replace(valueTag, "")
 valueTag2 = valueTag.Replace(value, newValue)
 success = True
Else
 valueTag2 = tagKey + ":" + newValue
 success = False
End If

First we get all the tag data from a particular item in the queue and we look for values of the form Key1:whatever (note that what is a Key:Value pair for us is just one tag for BP), then we delete the pre-existing Key1:Value1 and add Key1:NewValue.
Clearly we assume that every tag of the form KeyN:Value appears at most once.

With this configuration we can do interesting things, for example tracking technical statuses.
More generally, when you work with a process (especially an attended one), you have different phases, which can be matched with BP item statuses.
In a single phase you can do a lot of operations, and if the phase fails it's normal to repeat it.
Sometimes you cannot redo certain operations (maybe the robot is paying someone) and, because of business rules, you don't want to split the original phase into more phases (i.e. you don't want to use the item status for that).
So, assuming we are in status S1: the robot does Operation1 on a particular system and crashes right afterwards; when the robot retries the item, you can design the process so that, if the tags say Operation1 was already done in S1, that step is skipped, as in the sketch below.
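
A minimal, self-contained sketch of that decision; the tag names and values ("Operation1", "Done") are purely hypothetical:

using System;
using System.Collections.Generic;
using System.Linq;

class StatusTracking
{
    static void Main()
    {
        // tag string as read from the work queue item (hypothetical example)
        string itemTags = "Operation1:Done;";

        Dictionary<string, string> tags = itemTags
            .Split(new[] { ';' }, StringSplitOptions.RemoveEmptyEntries)
            .Select(pair => pair.Split(new[] { ':' }, 2))
            .ToDictionary(p => p[0], p => p.Length > 1 ? p[1] : string.Empty);

        if (tags.TryGetValue("Operation1", out string done) && done == "Done")
        {
            Console.WriteLine("Operation1 already done, skipping it on retry");
        }
        else
        {
            Console.WriteLine("Running Operation1 (the non-repeatable step, e.g. the payment)");
            tags["Operation1"] = "Done"; // record it in the tags so a retry will skip it
        }
    }
}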

As usual, thanks to Antonio Durante and Raffaele Ecravino :-)

Monday, May 7, 2018

Scheduling processes and unregistered named pipe

Surprise surprise, a customer asked to start a robot at 7:00 AM; but if, like me, you generally work from 9 AM to 9 PM, you'll find that starting a robot at 7:00 AM is... how can I say... not so beautiful.
So it's time to make the scheduler work!

Let's assume the Resource PC we are using is configured with the LoginAgent service listening on port 8081 and the Resource PC ready to run automations listening on port 8082.
When the LoginAgent is up and running, the Resource PC is not ready to start an automation, so we have to schedule this sequence of tasks:
  1. Login on 8081
  2. Run on 8082
  3. Logout on 8082
The first configuration
It's also true that anything can happen between steps 1 and 2, and you have to rely on the resilience settings; but it can also happen that the login fails because of an unregistered named pipe.

Just the login agent log

I can see you asking yourself: WTF is a named pipe?
Well, I read a lot about them and I don't remember a single word, but the concept is that a named pipe is a mechanism for IPC (Inter-Process Communication), so I assume it is used to resume the LoginAgent.
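
If you want to see one in action outside of BP, here is a tiny self-contained C# illustration of a named pipe used for IPC; the pipe name "demo_pipe" is just a placeholder and has nothing to do with the one Blue Prism uses:

using System;
using System.IO;
using System.IO.Pipes;
using System.Threading.Tasks;

class NamedPipeDemo
{
    static void Main()
    {
        // while the server stream is alive, "demo_pipe" shows up under \\.\pipe\
        const string pipeName = "demo_pipe";

        using (var server = new NamedPipeServerStream(pipeName, PipeDirection.In))
        using (var client = new NamedPipeClientStream(".", pipeName, PipeDirection.Out))
        {
            Task waitForClient = server.WaitForConnectionAsync();
            client.Connect();
            waitForClient.Wait();

            using (var writer = new StreamWriter(client) { AutoFlush = true })
            using (var reader = new StreamReader(server))
            {
                writer.WriteLine("hello over the pipe");   // one side writes...
                Console.WriteLine(reader.ReadLine());      // ...the other side reads
            }
        }
    }
}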

Indeed, referring to the schedule above, what you observe in the control room is that when the Resource PC is connected on port 8081, port 8082 appears disconnected, and vice versa. This means that the LoginAgent is listening for login requests while 8081 is connected and, as long as 8082 is in use (so until the logout), the LoginAgent is not available.
Sometimes it happens that, even if the LoginAgent service is available on port 8081, the login fails because of the unregistered named pipe, and the solution is... surprise surprise... rebooting the machine.
You can find all registered named pipes just with this PowerShell command
[System.IO.Directory]::GetFiles("\\.\\pipe\\")

And the output will be something like this


So the next step is to translate this code into a VBO action to discover whether the BP named pipe is registered or not: you can simply use this code to get all the named pipes and then filter the output collection.

Dim listOfPipes As String() = System.IO.Directory.GetFiles("\\.\pipe\")
dt.Columns.Add("C1")
For Each s As String In listOfPipes
    dt.Rows.Add(s)
Next
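
Outside of BP the same check boils down to something like this; "YourPipeNameFragment" is a placeholder, use whatever you see in the dump on your machine:

using System;
using System.IO;
using System.Linq;

class PipeCheck
{
    static void Main()
    {
        // look for the pipe among everything currently registered under \\.\pipe\
        bool registered = Directory.GetFiles(@"\\.\pipe\")
            .Any(pipe => pipe.IndexOf("YourPipeNameFragment", StringComparison.OrdinalIgnoreCase) >= 0);

        Console.WriteLine(registered ? "Pipe found" : "Pipe not registered, time to reboot");
    }
}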

Use VBO... use the force
Just the BP log after the cure

The next step is to integrate this new logic into the login task, so that if the robot finds out the named pipe is not registered, it reboots the Resource PC.

And now login like this

The final step is just to add a new task, 2.Wait, which will be:
  • Called if login failed
  • Calls login both in success and failure condition

A new day, a new design
With this little update, if the login fails (and this is why I raise an exception after the machine reboot), the wait process (it could be anything, maybe even an empty one) will use BP's resilience to keep the scheduled task alive.
The pitfall of this technique is that you can generate an infinite loop, so take care.

So my friends, that's it, and thanks to my friend Antonio Durante for the time he spent helping to work this out.

Monday, April 30, 2018

Just another out-of-date but maybe useful trick with KnockoutJS

I know what you're thinking about: another post on KnockoutJS? Again?
To be extremely clear, I wrote this post many months ago but never completed and published it, and I really could not bring myself to just delete it.

So today I will show you just a little example of communicating widgets: it's nothing complicated, and I think it can also be reproduced in other JS frameworks.
Basically, let's assume we have an HTML page which contains 2 widgets: the first one is self-contained and reusable, while the second one depends on the first (to be honest this is derived from the real-world example, where I have 4 widgets communicating with each other, but...).

So we have the code of the first controller, which is the child one.
window.Dummy = window.Dummy || {};
window.Dummy.Custom = window.Dummy.Custom || {};
window.Dummy.Custom.View = window.Dummy.Custom.View || {};
window.Dummy.Custom.View._Child = window.Dummy.Custom.View._Child || {};

(function (controller, utility, api, $, ko) {
    // View model definition
    controller.ViewModel = function viewModel() {
        var vm = this;
        vm.Items = ko.observableArray();

        // Search in all view model items
        vm.Get = function (id) {
            //identify the first matching item by name
            vm.firstMatch = ko.computed(function () {
                var search = id.toLowerCase();
                if (!search) {
                    return null;
                } else {
                    return ko.utils.arrayFirst(vm.Items(), function (item) {
                        return ko.utils.stringStartsWith(item.Id, search);
                    });
                }
            }, vm);

            return vm.firstMatch();
        };

 ...
    };

    // Controller definition
    controller.Loaded = ko.observable(false);
    controller.LoadData = function (data) {
        controller.Vm.Items.push.apply(controller.Vm.Items.sort(), data);

        ...
    };
    controller.Vm = new controller.ViewModel();
    controller.Wrapper = document.getElementById("workout-definitions");

    // Controller initialization
    controller.Init = function () {
        ko.applyBindings(controller.Vm, controller.Wrapper);
        api.GetAll(controller.LoadData);
    };

}(window.Dummy.Custom.View._Child = window.Dummy.Custom.View._Child || {},
    Dummy.Custom.Utility,
    Dummy.Custom.Api._ChildItems,
    jQuery,
    ko));

The controller is able to call the API exposed for the objects of this widget, so it is somewhat independent from the context. The instance of the view model exposes the Get method, which searches the array of objects loaded from the API and returns a single object.
Here, instead, is the code of the parent widget:

window.Dummy = window.Dummy || {};
window.Dummy.Custom = window.Dummy.Custom || {};
window.Dummy.Custom.View = window.Dummy.Custom.View || {};
window.Dummy.Custom.View.Parent = window.Dummy.Custom.View.Parent || {};

(function (controller, 
 utility, 
 $, 
 ko,
    parentApi,
 childView) {

    // View model definition
    controller.ViewModel = function viewModel() {
        var vm = this;
  ...
    };

    // Controller definition
    ...

    // Controller init
    controller.Vm = new controller.ViewModel();
    controller.Init = function (data) {
        controller.LoadUserData(data);

        // Await widgets readyness and then fire method  
        // Load events if athlete is new
        childView.Loaded.subscribe(function (newValue) {
            if (newValue === true) {
    ...
            }
        });
    };
 
    ...
}(window.Dummy.Custom.View.Parent = window.Dummy.Custom.View.Parent || {},
    Dummy.Custom.Utility,
    jQuery,
    ko,
    Dummy.Custom.Api.ParentItems,
    Dummy.Custom.View._Child
));

As you can imagine, I dropped most of the code just to show you the concept. The parent widget depends on the Child controller, which could be started by calling Child.controller.Init anywhere (maybe on the document loaded event); the parent only subscribes to the Child.Loaded observable.
This means that, when the child has finished loading data, the parent is triggered to do something else, and this can be quite useful.
Clearly you can use this model with more than one widget and more than one triggered event, and you have to be careful about the data you load, because you end up using every widget as an item container rather than just a graphical element.
I hope you find it useful even if outdated; I preferred to share this post because you may be able to replicate the model using other JS frameworks that support event subscription.
