View all
Manipulate Transformation Scripts Collectively
Store Time Series of Metrics to Table
Schedule Notebook to Run Data Jobs
Use Webhook to Integrate Skill with Action Flow
Aggregate Bundles to Minimize Notification
Manage Bundles in between Action Flow Modules
Run first Action Flow Scenario
Operate Celonis from outside by REST API
Include JSON data in HTTP Request and Response
Find out HTTP Request from GUI Function
Execute Periodic ETL Automatically
Validate Data Model by Studio Analysis
Construct My First Data Model
Inspect Table Data by SELECT statement
Adjust Time Zone of Event Time in Global Transformation
Handle Day based Activity as Milestone
Unite SQL statements by CASE Expression
Split Long SQL Using Views
Compose Activity from Joining Multiple Tables
Insert Simple Record into Activity Table
Determine Process Mining Tables based on Project Goal
Consider Case ID before Starting Transformation
Tune Endpoint Parameter Relevant to Delta Load
Setup Dependent Endpoint in Extractor Builder
Configure Endpoint for Suitable Extraction
Connect to Source System via REST API
Prepare Source System to Generate Event Log
Pay attention to Extract SAP Tables
Use Pseudonymized Column as Grouping Key
Understand Delta Load Configuration Difference in Adding Column Scenario
Verify Cloning Table Contents via Delta Load
Minimize Extraction Time by Delta Load Option
Look at Data Transfer Process by Data Job Log
Connect to Celonis and Bring Back Instruction
Run Extractor on Your Local Machine
Categorize and Name Activity
Transform Source System Tables to Minimize Data Model Tables