Hello, everybody. My name is Niels Bohr, and I'm heading the Stonebranch solution marketing department. In this video, I'm going to demonstrate how to seamlessly integrate Docker containers into your existing scheduling workflows.

Before I start, here's a short overview of what a Docker container is. Docker is an open-source project that automates the deployment of applications inside software containers. Okay, but what exactly is a container? A container includes everything an application requires to run, similar to a virtual machine, but without the overhead of its own guest operating system. So it's like a lightweight virtual machine. The main idea is that a developer provides the application in a container that is ready to run in all environments (dev, test, production, and so on) without any further configuration, similar to an iPhone app that you download from the Apple App Store, ready to run on your phone. Docker containers can run on any modern Linux platform or on Windows Server 2016; they even run on Windows 10 Pro.

The challenge many companies are already facing is that Docker containers do not usually run in an isolated environment, but in a heterogeneous landscape containing many legacy applications: ERP systems, banking applications, CRM systems, Oracle databases, and so on. We at Stonebranch take on this challenge by seamlessly integrating Docker containers into your existing automation workflows. The following demo will show an example of how this is done.

Okay, let me log into our Universal Controller web GUI. The controller is a single point of control for configuration and operation. It is used to set up all scheduling jobs and workflows, including the handling of Docker containers, and it has a built-in script library to centrally maintain all scripts. The web GUI is fully configurable per job profile: each team member sees only the jobs and workflows he is responsible for and gets only the permissions related to his job role.
For example, a developer can view and create any task he is responsible for, whereas a monitoring team member can only view jobs and workflows.

As for Docker-related jobs, all major Docker CLI commands are already pre-configured: building an image from a Dockerfile, connecting to a Docker registry to push and pull images, copying data into a container, running a container with specific credentials and environment variables, and removing a container or image. The good thing is that any new Docker CLI command can be added by configuration using our new Universal Task feature.

Now let's have a look at a real scheduling workflow. The following screen shows a simplified commission workflow of a company selling household goods. The company needs to frequently release new bonus schemas for its sellers, and in order to quickly deploy a new bonus schema, it uses a container to run the bonus calculation application. The Universal Automation Center Docker task ensures that the container for the bonus application is started at the right time in the commission workflow, from the latest tested image. This is possible because the Docker task connects to the private Docker registry (or any other registry) to check for the latest image checked in by the developer.

Let's have a look at the Docker task which starts the container that calculates the commission bonus values. In the Docker task, you first define which registry should be used to store your company's Docker images. Different kinds of registries are available; in this case, I'm using the private Docker registry. The credentials listed here are centrally stored in the controller. Then you select the image: in this case, this is the image and version used to initialize the commission container, which is named here. From the Docker task, you can always look up the image on Docker Hub using the hyperlink available in the task.
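For reference, the Docker CLI operations that these pre-configured tasks wrap look roughly like the following. This is a minimal sketch with hypothetical registry, image, and container names; in the product, the commands are issued by the controller's Docker tasks, not typed by hand, and here they are only echoed so the sketch runs without a Docker daemon.

```shell
#!/bin/sh
# Sketch of the Docker CLI commands wrapped by the pre-configured tasks.
# Registry, image, and container names are hypothetical placeholders.
REGISTRY="registry.example.com:5000"
IMAGE="$REGISTRY/commission-app:latest"

docker_task_commands() {
  # Build an image from a Dockerfile in the current directory
  echo "docker build -t $IMAGE ."
  # Push to / pull from the private registry
  echo "docker push $IMAGE"
  echo "docker pull $IMAGE"
  # Run a container with a specific environment variable
  echo "docker run -d --name commission -e BONUS_SCHEMA=current $IMAGE"
  # Copy data into the container
  echo "docker cp sales_data.csv commission:/input/"
  # Remove the container and, if no longer needed, the image
  echo "docker rm -f commission"
  echo "docker rmi $IMAGE"
}

docker_task_commands
```

Each echoed line corresponds to one of the pre-configured task types mentioned above.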
So if I click on the hyperlink, it connects directly to my Docker Hub account, which is Niels Bohr, which is me.

Now let's look at the steps of the event-based commission workflow. To calculate commissions, you need sales data and seller data as input to the containerized commission app. The system checks via a file monitor for new seller data to arrive in the defined directory. Once the file has arrived, the seller data is automatically transferred to the SAP system and loaded into SAP using the Realtek Interface Manager task; this here is the Realtek Interface Manager task. The system also checks for new sales data. Once new sales data is available, it is transferred to the input directory of the commission app. The Docker task starts the commission container from the latest image available in the private registry, and then the sales data is copied into the container. The commission is calculated by this task, and an email with the commission report is sent to the approval team. The approval team checks the report and gives the go-ahead to release the workflow to production via the Web Self Care client.

Okay, now let's start the workflow. In our GUI, you always have three real-time views of the workflow. First, the individual dashboard view, which is configurable per user profile; in the individual dashboard view, you can already see that the workflow is running. Second, a list view, which offers powerful filter rules to configure what you see about the workflow; here, too, you can see that the workflow is running. The last one is the graphical view, which I think is the nicest one.

Let's now have a look at this graphical view. First, I will save the new seller data in the input directory of the FileWatcher for the seller data. The data is then automatically transferred to the SAP system and loaded via the Realtek Interface Manager. So this is my new seller data.
It's an Excel sheet, and I just move it into the input directory of my FileMonitor. So we take the Excel sheet you just saw and put it in the input directory of the FileMonitor. That's done.

In the next step, I will save the new sales data in the input directory of the FileWatcher for the sales data. So I'm going to my Linux machine. This is the directory where I see all the files from the FTP server, and I'm copying the sales data into the directory of the FileWatcher. Once the file has been identified, it will be transferred to the Docker server, and the commission calculation process should start automatically. You can see it starts automatically: first, the Docker task starts the commission container from the latest image available in the private registry; then the sales data is copied into the container; afterwards, the commission app is started in the container. After a successful commission calculation, an email with the commission report is sent to the approval team via an automatic action.

Now let's check whether this is all true and the email with the commission report has arrived. I log on to my email client, and you can see the report email has arrived. Attached to the email is the report generated by the container. Let me have a look at the report. It all looks fine: Niels Bohr has sold ten machines and got a commission of one point five euros. You can also see all the parameters that were used to initialize the container.

Okay, I'm happy with the report. So what I will do now is click on the Self Care link and log in as Darin, because she is the one who approves this report. You can see that Darin has a completely different GUI than I had before, because she is in the Self Care GUI. The only thing she is supposed to do here is approve the report; that is why she doesn't see any other information.
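Stepping back for a moment: the file-monitor step just demonstrated, a watcher that reacts to a file landing in a directory, can be sketched as a simple polling loop. This is an illustrative stand-in with a hypothetical watch directory and file name; the actual monitoring is done by the controller's file monitor task, not by a script.

```shell
#!/bin/sh
# Minimal polling file monitor: waits for sales data to appear in a
# watched directory, then reports that the next step can be triggered.
# Directory and file name are illustrative placeholders.
WATCH_DIR="${WATCH_DIR:-$(mktemp -d)}"
TARGET="sales_data.csv"

# Simulate the FTP delivery arriving about one second from now
( sleep 1; : > "$WATCH_DIR/$TARGET" ) &

i=0
while [ "$i" -lt 30 ]; do
  if [ -e "$WATCH_DIR/$TARGET" ]; then
    echo "file arrived: $TARGET"
    echo "triggering transfer to the Docker server"
    break
  fi
  sleep 1
  i=$((i + 1))
done
```

In the product, the equivalent of the two echo lines would be the file monitor task completing and the downstream transfer task starting.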
So she goes there and sets it to completed, and it was successful.

Now I log in again as the administrator. I go to my dashboard, and you can see the whole workflow has finished with success. I can look again at the graphical view to confirm everything is fine, and you can see it all went through successfully. If I click on an individual task, I could also get all the log files generated by the Docker container, for example.

Okay, as you can see, I'm in the test system. What I want to do now is transfer this whole workflow, which we have just tested and which seems to be all correct, to the production system. So how does it work? The only thing I have to do is click on my workflow task and promote this task, including the whole workflow, to another environment. So I just press the promote button, tell it to which controller I want to promote (in this case, to production), say that I want to promote it including all references, and then press the submit button. It just takes a second, and the whole workflow is transferred to the production system.

Now I log into my production controller, which runs on a completely different server. I go to the workflow tasks and can see that the commission demo task has just been transferred. So let's have a look. As you can see, it's exactly the same workflow as you just saw on the test system. The nice thing is that you don't need to configure anything: any parameters like IP addresses and credentials are automatically mapped in the production system via variables.

Okay, now let's summarize what we have done. We have seen that containers can be seamlessly integrated into your existing legacy automation workflows. All security realm settings for legacy and containerized applications are centrally managed; LDAP and Active Directory are also supported. We provide optimal support for your DevOps approach by fully automating the deployment lifecycle.
In this example, the tested workflow was manually deployed by pressing the promote button. This can, of course, be done automatically by adding additional tasks to the workflow. All actions have been performed via a one hundred percent web-based GUI. Our controller is also available as a cloud deployment, for example in the Amazon AWS cloud.

If you need more information, please visit our web page or contact us directly. I would be more than happy to support you in your DevOps approach. I hope you have enjoyed this demonstration video. Our next video will focus on continuous delivery. Thanks a lot.