In the last post, Schedule Notebook to Run Data Jobs, I showed how to schedule a Data Job using Machine Learning Workbench. Of course, not only Data Jobs but any kind of task can be scheduled by a Python script. Today I would like to schedule a backup of some data to a table using Machine Learning Workbench.
Store time series of metrics
Imagine I have been running Celonis EMS for months and have already built a dashboard Analysis to observe some KPIs (metrics).
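As a rough sketch of what such a backup script could look like (the KPI names and the CSV target below are placeholders for illustration, not my actual setup), the scheduled notebook could append one timestamped snapshot per run:

```python
# Minimal sketch of a backup script a scheduled notebook could run.
# KPI values and the file target are placeholders, not the real EMS setup.
import os
from datetime import datetime, timezone

import pandas as pd

HISTORY_FILE = "kpi_history.csv"  # hypothetical backup target


def fetch_current_kpis() -> dict:
    """Placeholder for reading today's KPI values (e.g. from a Data Model)."""
    return {"open_orders": 120, "avg_throughput_days": 4.2}  # dummy values


def append_snapshot() -> None:
    # Append one timestamped row per run, building up a time series of metrics.
    row = {"recorded_at": datetime.now(timezone.utc).isoformat(), **fetch_current_kpis()}
    pd.DataFrame([row]).to_csv(
        HISTORY_FILE, mode="a", header=not os.path.exists(HISTORY_FILE), index=False
    )


append_snapshot()
```

Scheduling this to run once a day via Machine Learning Workbench gradually builds up the metrics history.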
[Read More]
Use Webhook to Integrate Skill with Action Flow
Email from Celonis EMS
Up to the last post, Aggregate Bundles to Minimize Notification, the first Action Flow scenario was developed as a draft. Today I would like to improve the notification part of the scenario.
A few days after I started the scenario, I found that the Sent box of my email account was filling up unnecessarily with the notifications I had developed. That is because I used my own email account to send them. For personal development that is fine, but for an official scenario, the account will no longer be available once you leave your position.
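For illustration, handing the notification over to Action Flow's Webhook module means the sender is no longer my personal account; triggering the scenario is just an HTTP POST (the URL and payload fields below are hypothetical):

```python
# Hypothetical example: a Skill (or any script) triggering an Action Flow
# scenario via its Webhook module. URL and payload fields are placeholders.
import requests

WEBHOOK_URL = "https://hook.example.celonis.cloud/abc123"  # placeholder URL

payload = {
    "data_pool": "SAP_ECC",  # hypothetical field names
    "status": "cancelled",
    "message": "Running Data Jobs were cancelled at midnight.",
}

response = requests.post(WEBHOOK_URL, json=payload, timeout=10)
response.raise_for_status()  # the webhook typically answers with a simple "Accepted"
```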
[Read More]
Aggregate Bundles to Minimize Notification
Integrate cancel operation bundles into one
In the last post, Manage Bundles in between Action Flow Modules, I successfully automated the Data Job cancel operation. To do this, I used the Iterator module to split one bundle of a Data Pool into multiple bundles of Data Jobs, and filtered those bundles to pick out the running Data Jobs. In the end, the bundles of running Data Jobs were selected and cancelled.
Continuing from the last post, today's goal is to send the result of the automated operation by email (my Gmail), because with the initial scenario I cannot check at midnight whether the automated operation completed correctly.
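As a plain-Python analogue of the module chain (Iterator, then a filter, then an aggregation step), the idea is to collapse many per-job bundles into a single message body, so only one email goes out (the job list is made up for illustration):

```python
# Plain-Python analogue of the Iterator -> filter -> aggregate pattern
# in Action Flow. The job list below is made up for illustration.
jobs = [
    {"name": "Delta load A", "status": "RUNNING"},
    {"name": "Full load B", "status": "DONE"},
    {"name": "Delta load C", "status": "RUNNING"},
]

# Iterator + filter: one "bundle" per job, keeping only the running ones
# (these are the jobs the scenario cancels).
cancelled = [job["name"] for job in jobs if job["status"] == "RUNNING"]

# Aggregator: collapse all bundles into a single email body.
body = "Cancelled Data Jobs:\n" + "\n".join(f"- {name}" for name in cancelled)
print(body)  # one notification instead of one email per job
```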
[Read More]
Manage Bundles in between Action Flow Modules
Convert an array to bundles with the Iterator module
In the last post, Run first Action Flow Scenario, I created an HTTP request module to check Data Job status. Building on that scenario, today I would like to create the next module, which cancels running Data Jobs.
Because the next module is also an HTTP request, it is convenient to clone the last module and modify it. I right-click the last module and select the Clone menu; a new module is generated and connected to the previous one.
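In plain terms, the cloned module just fires a different HTTP request. A hedged sketch of such a cancel call is below; the endpoint path is an assumption for illustration, so check the Celonis API documentation for the exact route:

```python
# Hypothetical sketch of the cancel call the cloned HTTP module performs.
# The endpoint path is an assumption; consult the Celonis API docs for the
# exact route in your EMS version.
import requests

TEAM_URL = "https://my-team.eu-1.celonis.cloud"  # placeholder team URL
API_TOKEN = "YOUR_API_TOKEN"                     # placeholder application key
POOL_ID, JOB_ID = "pool-id", "job-id"            # placeholders

response = requests.post(
    f"{TEAM_URL}/integration/api/pools/{POOL_ID}/jobs/{JOB_ID}/cancel",  # assumed path
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=10,
)
response.raise_for_status()
```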
[Read More]
Run first Action Flow Scenario
Action Flow as HTTP client
Starting today I would like to introduce Action Flow, which makes it possible to automate scenarios and integrate SaaS systems (and on-premise ones too).
You may know that over 100 SaaS systems are registered in Action Flow, so you can easily build your own scenario. Great, but I sometimes could not find an appropriate module among them. How do I fulfill my requirement?
Actually, Action Flow can be used as an HTTP client, so what I did with cURL in the last post, Operate Celonis from outside by REST API, is also possible in Action Flow.
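As a rough Python equivalent of that cURL status check (the endpoint path and response shape below are assumptions for illustration, not the documented API):

```python
# Rough Python equivalent of the cURL status check done in the HTTP module.
# Endpoint path and response shape are assumptions for illustration.
import requests

TEAM_URL = "https://my-team.eu-1.celonis.cloud"  # placeholder team URL
API_TOKEN = "YOUR_API_TOKEN"                     # placeholder application key
POOL_ID = "pool-id"                              # placeholder

response = requests.get(
    f"{TEAM_URL}/integration/api/pools/{POOL_ID}/jobs",  # assumed route
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=10,
)
response.raise_for_status()
for job in response.json():  # assumed: a list of job objects
    print(job.get("name"), job.get("status"))
```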
[Read More]
Start Deep Dive to Machine Learning and Action Flow
Up to the last post, I walked through the flow of process discovery with Celonis Process Analytics (or Studio Analysis) and the prerequisite ETL (Extraction, Transformation, Load) with Data Integration. I have already worked on multiple static process mining projects and found that these functions are sufficient for static process mining. Other process mining solutions provide these functions too.
By the way, according to Process Mining: Data Science in Action by Wil van der Aalst, process mining also enables further actions such as monitoring and predictive analysis.
[Read More]
Validate Data Model by Studio Analysis
In the last post, Construct My First Data Model, I created a Data Model and loaded data into it. Normally the initial load is not perfect, so I should check the data in the Data Model. Today I would like to share how to validate my Data Model using an Analysis. By the way, the main function of Celonis EMS is now Studio (plus Apps for viewers), and Analysis is part of Studio, so today I will create the Analysis in Studio instead of Process Analytics.
[Read More]
Handle Day based Activity as Milestone
By the last post, Unite SQL statements by CASE Expression, I had already created four activities, which fulfill the requirements of an event log. Going back to Consider Case ID before Starting Transformation, the case ID is the biggest requirement. Event time is not as critical as the case ID, but it is important too. In process mining, the event time should include year, month, and day plus hour, minute, and second (YYYY-MM-DD HH:MI:SS in Vertica format). I assume the event time referred to in process mining is the one recorded automatically by the system in response to some action.
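As a small illustration of that format (this is my assumption of one common treatment, not necessarily the exact approach in the full post), a day-based activity can be placed at midnight so it sorts as a milestone before the intra-day events:

```python
# Illustration only: turning a day-based date into the event time format
# used in process mining (YYYY-MM-DD HH:MI:SS).
from datetime import datetime

day_based = datetime.strptime("2022-10-01", "%Y-%m-%d")  # only the day is known
print(day_based.strftime("%Y-%m-%d %H:%M:%S"))  # -> 2022-10-01 00:00:00
```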
[Read More]
Copy Previous Value to Blank Period by RUNNING_SUM and RANGE_APPEND functions
In Investigate Workload Trend of Cropped Subprocess I showed the trend of activity count, and at that time I used RANGE_APPEND to fill zero counts into the trend graph. Today I would like to use a different aggregation, RUNNING_SUM, and fill values into the blank periods.
Imagine you would like to check the weekly trend of the credit amount for some customer. The credit amount increases by the net value when 'Receive Order' happens, and decreases when 'Clear Invoice' happens.
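As a pandas analogue of the PQL idea (sample data only, not from an actual Data Model), the running sum carries the previous credit amount through weeks where nothing happened:

```python
# pandas analogue of the PQL approach: build a running sum of the credit
# amount and carry the previous value into weeks with no events.
import pandas as pd

events = pd.DataFrame(
    {
        "time": pd.to_datetime(["2022-01-03", "2022-01-05", "2022-01-24"]),
        # +net value on 'Receive Order', -net value on 'Clear Invoice'
        "delta": [1000, 500, -1000],
    }
)

weekly = events.set_index("time")["delta"].resample("W").sum()  # blank weeks become 0
credit = weekly.cumsum()  # running sum == credit amount at each week's end
print(credit)  # blank weeks simply repeat the previous value
```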
[Read More]
Create Key Column of Activity Table
A Celonis Data Model always requires a unique key in the case table (the case key) to group the activities belonging to each case. How about the activity table? The activity table does not have an explicit key column; instead, the combination of case key, activity name, timestamp, and sorting number works like an activity key (those four columns are configured in the Data Model).
This week I was asked to create a key column for the activity table, for duplicate-checking purposes.
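As a sketch of the approach with pandas (the column names are examples, not my real schema), concatenating the four columns yields an explicit key that makes the duplicate check trivial:

```python
# Sketch: build an explicit activity key by concatenating the four columns
# the Data Model sorts on. Column names are examples, not the real schema.
import pandas as pd

activities = pd.DataFrame(
    {
        "CASE_KEY": ["0001", "0001", "0002"],
        "ACTIVITY": ["Receive Order", "Clear Invoice", "Receive Order"],
        "EVENTTIME": ["2022-10-01 09:00:00", "2022-10-05 14:30:00", "2022-10-02 08:15:00"],
        "SORTING": [10, 20, 10],
    }
)

cols = ["CASE_KEY", "ACTIVITY", "EVENTTIME", "SORTING"]
activities["ACTIVITY_KEY"] = activities[cols].astype(str).agg("_".join, axis=1)

# With the key in place, the duplicate check is a one-liner.
print(activities["ACTIVITY_KEY"].duplicated().any())  # -> False
```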
[Read More]