As described in Start Deep Dive to Machine Learning and Action Flow, it is possible to manipulate Celonis EMS functions from the Machine Learning Workbench. Functions relevant to EMS data were already discussed in the last post, Store Time Series of Metrics to Table. The Machine Learning Workbench also enables developers to manipulate programs in the EMS. Today I would like to demonstrate some functions for manipulating the Transformation (SQL) scripts developed in Data Integration.
[Read More]
Store Time Series of Metrics to Table
In the last post, Schedule Notebook to Run Data Jobs, I showed you how to schedule a Data Job using the Machine Learning Workbench. Of course, not only Data Jobs but any kind of task can be scheduled with a Python script. Today I would like to schedule a backup of some data to a table using the Machine Learning Workbench.
Store time series of metrics Imagine I have been running Celonis EMS for months and have already established a dashboard Analysis to observe some KPIs (metrics).
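As a rough idea of what the post builds, here is a minimal sketch of taking a timestamped KPI snapshot with pandas; the KPI names and values are placeholders, and in the actual post the history is written to an EMS table via Pycelonis rather than to a local file.

```python
import os
from datetime import datetime, timezone

import pandas as pd

# Take a snapshot of today's KPI values and attach a timestamp so that
# repeated (scheduled) runs build up a time series.
kpi_snapshot = pd.DataFrame(
    {
        "SNAPSHOT_TS": [datetime.now(timezone.utc)],
        "OPEN_ORDERS": [1234],          # placeholder KPI value
        "AVG_THROUGHPUT_DAYS": [5.6],   # placeholder KPI value
    }
)

# Append the snapshot to a local history file; in the actual post the
# target is a table in the Data Pool, pushed via Pycelonis instead.
history_file = "kpi_history.csv"
kpi_snapshot.to_csv(
    history_file,
    mode="a",
    header=not os.path.exists(history_file),
    index=False,
)
```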
[Read More]
Schedule Notebook to Run Data Jobs
From today I would like to use the Machine Learning Workbench again. As described in Start Deep Dive to Machine Learning and Action Flow, it can manipulate Celonis EMS functions from outside of the EMS itself, and this manipulation can be scheduled to run periodically.
Using this scheduling function, the Machine Learning Workbench can act as an extension of the Data Job scheduler in Data Integration. Today I would like to build a simple job scheduler.
Parallel Data Job execution You may find that your ETL takes too long and want to shorten its duration as soon as possible.
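As a rough sketch of such a scheduler (not the implementation from the post), the third-party schedule package can trigger a placeholder function periodically; the schedule time and the job body are assumptions, and in the post the Data Jobs themselves are triggered through Pycelonis.

```python
import time

import schedule  # third-party package: pip install schedule


def run_nightly_data_jobs():
    # Placeholder: in the real notebook this would trigger the Data Jobs
    # via Pycelonis instead of just printing.
    print("Triggering Data Jobs...")


# Run the task every day at 02:00 (notebook time zone).
schedule.every().day.at("02:00").do(run_nightly_data_jobs)

while True:
    schedule.run_pending()
    time.sleep(60)  # check once per minute
```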
[Read More]
Operate Celonis from outside by REST API
About REST API Up to the last post, I explained some API use cases (exporting a Pool program and configuring a data job alert) using Pycelonis scripts from the ML Workbench. These are Celonis “internal” operations. By the way, as I mentioned in Login to Celonis EMS from Jupyter Workbench, I can also call the API from outside of Celonis. Generally, this kind of API using HTTP requests is called a REST API (or RESTful API).
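For illustration, a minimal sketch of such an external call using the requests library might look like the following; the team URL, token, and endpoint path are placeholders, not the exact requests used in the post.

```python
import requests

# Placeholders: replace with your own team URL and a real API token.
BASE_URL = "https://my-team.celonis.cloud"
API_TOKEN = "<your-api-token>"

# The endpoint path below is only illustrative; the actual resource
# paths are covered in the post itself.
response = requests.get(
    f"{BASE_URL}/integration/api/pools",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
)
response.raise_for_status()
print(response.json())
```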
[Read More]
Include JSON data in HTTP Request and Response
About JSON Until the last post, I intentionally did not go into detail about the input and output data of an HTTP request. Actually, an HTTP request requires not only a URL (URI) but also header data such as Authorization (refer to Observe HTTP request in Pycelonis login script). You can also imagine that it is possible to attach input form values (e.g. name, address, email). Generally, in the API world, the JSON format is used to attach such data to an HTTP request.
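As a small illustration (not taken from the post itself), attaching JSON to a request with the requests library looks roughly like this; the endpoint and field names are made up.

```python
import requests

# Placeholder endpoint and payload: the field names are only examples
# of form-like values attached as JSON.
payload = {"name": "Taro", "address": "Tokyo", "email": "taro@example.com"}

response = requests.post(
    "https://example.com/api/customers",
    headers={"Authorization": "Bearer <your-api-token>"},
    json=payload,  # requests serializes the dict to JSON and sets Content-Type
)

# The response body is usually JSON as well and can be parsed back into a dict.
print(response.json())
```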
[Read More]
Find out HTTP Request from GUI Function
In the previous post, Observe HTTP request in Pycelonis login script, I showed how to observe the HTTP requests under the Pycelonis API. In this observation I found that an HTTP request requires at least an Authorization header and, of course, a URL to reach the resource in Celonis EMS. I also found that I can investigate which HTTP request is sent when calling a Pycelonis class method.
By the way, not all HTTP requests are implemented in Pycelonis.
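One generic way to make that investigation visible, assuming Pycelonis ultimately goes through the standard requests/urllib3 stack, is to switch on HTTP-level debug logging before making the call; this is a general Python technique rather than something specific to the post.

```python
import http.client
import logging

# Print the raw request/response lines of every HTTP call made through
# the standard requests/urllib3 stack.
http.client.HTTPConnection.debuglevel = 1

logging.basicConfig(level=logging.DEBUG)
logging.getLogger("urllib3").setLevel(logging.DEBUG)

# Any Pycelonis call made after this point (e.g. the login or a class
# method you want to investigate) will log its HTTP traffic.
```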
[Read More]
Observe HTTP request in Pycelonis login script
I showed how to log in to Celonis EMS using Pycelonis in previous posts. The authentication topic I mentioned there is the most annoying part of using an API, but after mastering it I can transfer this knowledge to other areas easily.
To master authentication, I would like to show the mechanism of HTTP requests in internet (web) programming. Even though I did not know HTTP requests well when I started using Celonis, I have now gained some basic knowledge, and it is enough to work with HTTP requests.
[Read More]
Limit permissions of API token to minimize risk
In the last post, Login to Celonis EMS from Jupyter Workbench, I used an API key that has the same permissions as my GUI user. I mentioned that it is too strong and risky against unauthorized access. Imagine your API token is accidentally made public; then anyone can operate Celonis on your behalf. That is why I keep the API token separate from the Notebook (the Notebook may be published to GitHub etc.). In any case, the user API token should be replaced with another, weaker key, especially in a production system.
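A minimal sketch of that separation, with the environment variable and file name as assumptions, is to load the token from outside the notebook:

```python
import os

# Read the API token from an environment variable instead of hard-coding
# it in the notebook, so the notebook can be published safely.
api_token = os.environ.get("CELONIS_API_TOKEN")

if api_token is None:
    # Fallback: read it from a local file that is excluded from Git
    # (e.g. listed in .gitignore). The file name is just an example.
    with open(".celonis_token") as f:
        api_token = f.read().strip()
```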
[Read More]
Login to Celonis EMS from Jupyter Workbench
In the last post, Start Deep Dive to Machine Learning and Action Flow, I introduced an overview of these two functions, which manipulate Celonis EMS functions without using the GUI (browser), for automation purposes.
From today I would like to share basic functions of Machine Learning (Jupyter Workbench). Before starting, I will set up the Jupyter Workbench. If you do not have your own Workbench, read Share my Analysis by Content-CLI and follow it up to the point of creating a Workbench.
I will go to the Launcher in the Workbench, select Others > Terminal, and then enter the command below after the $ sign (output is as of 2022-05-14).
[Read More]
Start Deep Dive to Machine Learning and Action Flow
Up to the last post, I have walked through the flow of process discovery with Celonis Process Analytics (or Studio Analysis) and the prerequisite ETL (Extraction, Transformation, Load) with Data Integration. I have already worked on multiple static process mining projects and found that these functions are sufficient for static process mining. Other process mining solutions provide these functions too.
By the way, referring to Process Mining: Data Science in Action by Wil van der Aalst, process mining also makes further actions possible, such as monitoring and predictive analysis.
[Read More]