The design of the PGTNet model, which predicts the remaining time of business process instances, is described in detail in this section. The approach first transforms an event log into a graph dataset in which nodes represent event classes and edges represent "directly-follows" relations, i.e., direct temporal links between events. A key innovation is the inclusion of rich edge features that provide vital contextual information beyond simple connectivity, encapsulating temporal data (such as durations and timestamps) and system workload (the number of active cases).

Predictive Process Monitoring Using Graph Neural Networks


Abstract and 1. Introduction

2. Background and Related Work

3. Preliminaries

4. PGTNet for Remaining Time Prediction

    4.1 Graph Representation of Event Prefixes

    4.2 Training PGTNet to Predict Remaining Time

5. Evaluation

    5.1 Experimental Setup

    5.2 Results

6. Conclusion and Future Work, and References


4 PGTNet for Remaining Time Prediction

To predict the remaining time of business process instances, we convert an event log into a graph dataset (see Section 4.1), and use it to train a predictive model (see Section 4.2). Once the model’s parameters are learned, we can query the model to predict the remaining time of an active process instance based on its current partial trace.

4.1 Graph Representation of Event Prefixes

Fig. 1. Graph representation of an event prefix of length 6 (Case ID '27583').
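As the figure suggests, each event prefix becomes a small directly-follows graph whose nodes are the event classes observed so far and whose edges record the directly-follows (DF) relations between them. The following is a minimal sketch of that conversion; the function name `build_prefix_graph` and the representation of events as (activity, timestamp) pairs are illustrative assumptions, and the paper's actual conversion additionally attaches the edge features described below.

```python
from collections import defaultdict

def build_prefix_graph(prefix):
    """Convert an event prefix (a list of (activity, timestamp) pairs) into a
    directly-follows graph: one node per event class, one edge per DF
    relation, with the timestamps of all its occurrences recorded."""
    nodes = sorted({activity for activity, _ in prefix})
    node_index = {a: i for i, a in enumerate(nodes)}
    edges = defaultdict(list)  # (src, tgt) -> timestamps of DF occurrences
    for (a_prev, _), (a_next, ts) in zip(prefix, prefix[1:]):
        edges[(node_index[a_prev], node_index[a_next])].append(ts)
    return nodes, dict(edges)

# A prefix of length 6 with a repeated B -> C loop:
prefix = [("A", 0.0), ("B", 1.5), ("C", 2.0), ("B", 4.0), ("C", 5.5), ("D", 7.0)]
nodes, edges = build_prefix_graph(prefix)
print(nodes)  # ['A', 'B', 'C', 'D']
print(edges)  # {(0, 1): [1.5], (1, 2): [2.0, 5.5], (2, 1): [4.0], (2, 3): [7.0]}
```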


Edge features. To enhance the expressive capacity of the graph representation, we incorporate additional features into the edge feature vector:

– We use five different temporal features per edge: the total duration (t1) and the duration of the last occurrence (t2) of the DF relation represented by the edge. Similar to other works [17], we also incorporate the distances between the timestamp of the target node and the start of the case (t3), the start of the day (t4), and the start of the week (t5) for the latest occurrence of the DF relation. While t1, t2, and t3 are normalized by the largest case duration in the training data, t4 and t5 are normalized by the duration of a day and a week, respectively. In Figure 1, the temporal features are underlined in the feature vectors of the edges.


– To account for the overall workload of the process at a given time, we capture the number of active cases at the timestamp of the target node (for the last occurrence of the DF relation). This feature is normalized by the maximum number of concurrent process instances observed in the training data.

Note that we encode this information as edge features, rather than node features, in order to preserve the simplicity of the node semantics. In this way, PGTNet can also handle event logs with a large number of event classes by employing an embedding layer. A sketch of how the six edge features could be computed follows below.
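To make the feature definitions concrete, here is a minimal sketch of computing the five temporal features and the workload feature for a single edge. All names (`edge_features`, `occurrences`, `active_cases`, and so on) are illustrative assumptions rather than the paper's implementation; the sketch simply follows the normalization rules stated above.

```python
from datetime import timedelta

DAY = timedelta(days=1).total_seconds()
WEEK = timedelta(weeks=1).total_seconds()

def edge_features(occurrences, case_start, active_cases,
                  max_case_duration, max_active_cases):
    """Compute the six edge features for one DF relation.
    `occurrences` is a time-ordered list of (source_ts, target_ts) datetime
    pairs, one per occurrence of the relation; `active_cases` is the number
    of running cases at the last target timestamp."""
    durations = [(tgt - src).total_seconds() for src, tgt in occurrences]
    last_tgt = occurrences[-1][1]
    t1 = sum(durations) / max_case_duration       # total duration of the DF relation
    t2 = durations[-1] / max_case_duration        # duration of its last occurrence
    t3 = (last_tgt - case_start).total_seconds() / max_case_duration
    midnight = last_tgt.replace(hour=0, minute=0, second=0, microsecond=0)
    t4 = (last_tgt - midnight).total_seconds() / DAY        # time since start of day
    week_start = midnight - timedelta(days=last_tgt.weekday())
    t5 = (last_tgt - week_start).total_seconds() / WEEK     # time since start of week
    workload = active_cases / max_active_cases              # relative system workload
    return [t1, t2, t3, t4, t5, workload]
```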


4.2 Training PGTNet to Predict Remaining Time

Once an event log is converted into a graph dataset, it can be used to train PGTNet to learn function θL in Equation 1 in an end-to-end manner. We specifically approach the remaining time prediction problem as a graph regression task, using L1-Loss (mean absolute error between predictions and ground truth remaining times). Model training employs the backpropagation algorithm to iteratively minimize the loss function. For this, we adopt the GPS Graph Transformer recipe [16] as the underlying architecture of PGTNet.
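As a rough illustration of this training setup, the following sketch shows a graph-regression loop with L1 loss in PyTorch Geometric. The PGTNet-style `model`, the `train_graphs` list, and hyperparameters such as batch size and learning rate are assumptions for illustration; only the task framing (graph-level regression minimized by backpropagating the mean absolute error) comes from the text.

```python
import torch
from torch_geometric.loader import DataLoader

def train(model, train_graphs, num_epochs=100, lr=1e-3):
    """Graph-regression training loop. `model` is assumed to return one
    scalar per graph; `train_graphs` is the converted event log, a list of
    torch_geometric.data.Data objects whose `y` attribute holds the
    (normalized) remaining time of the corresponding event prefix."""
    loader = DataLoader(train_graphs, batch_size=32, shuffle=True)
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
    loss_fn = torch.nn.L1Loss()  # mean absolute error, as in the paper

    model.train()
    for _ in range(num_epochs):
        for batch in loader:
            optimizer.zero_grad()
            pred = model(batch)                        # one prediction per graph
            loss = loss_fn(pred.squeeze(-1), batch.y)  # L1 loss vs. ground truth
            loss.backward()                            # backpropagation
            optimizer.step()
    return model
```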

PGTNet architecture. PGTNet’s architecture comprises embedding and processing modules, as shown in Figure 2.

Embedding modules have two main functionalities:

– They map node and edge features into continuous spaces. To ensure that similar event classes are closer in the embedding space, an embedding layer maps integer node features into a continuous space. We use fully-connected layer(s) to compress edge features into the same hidden dimension and to address the challenges arising from high-dimensional data attributes (see the sketch after this list).

– They compress the graph structure into multiple positional and structural encodings (PE/SE) and seamlessly incorporate these PE/SEs into node and edge features [16]. This integration is achieved through diverse PE/SE initialization strategies and several learnable modules, including MLPs (multi-layer perceptrons) and batch normalization layers, as illustrated in Figure 2.
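A minimal sketch of the first functionality, assuming a single linear layer suffices for edge-feature compression; the class name `InputEmbedding` and its dimensions are illustrative, and the PE/SE pathway of Figure 2 is omitted.

```python
import torch
import torch.nn as nn

class InputEmbedding(nn.Module):
    """Embedding stage: an embedding layer maps integer event-class ids into
    a continuous space (so similar classes can end up close together), and a
    fully-connected layer compresses edge-feature vectors to the same hidden
    dimension."""
    def __init__(self, num_event_classes, edge_feat_dim, hidden_dim):
        super().__init__()
        self.node_emb = nn.Embedding(num_event_classes, hidden_dim)
        self.edge_fc = nn.Linear(edge_feat_dim, hidden_dim)

    def forward(self, node_ids, edge_attr):
        # node_ids: (num_nodes,) integer event-class ids
        # edge_attr: (num_edges, edge_feat_dim) edge feature vectors
        return self.node_emb(node_ids), self.edge_fc(edge_attr)
```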

Fig. 2. PGTNet architecture, based on the GPS Graph Transformer recipe [16]. The processing paths for node and edge features are shown in blue and red, respectively.


Note that edge features are solely processed by MPNN blocks and are not utilized by Transformer blocks or in obtaining the graph-level representation.

Design space for PGTNet. The modular design of the GPS Graph Transformer recipe offers flexibility in choosing various types of positional/structural encodings (PE/SEs) and MPNN/Transformer blocks.

PE/SEs aim to enhance positional encoding for Transformer blocks [10] and to make GNN blocks more expressive [4]. The compression of the graph structure into PE/SEs can be achieved through various initialization strategies (PE/SE initialization in Figure 2). Notably, Laplacian eigenvector encodings (LapPE) [10] furnish node embeddings with information about the overall position of an event class within the event prefix (global PE), while enhancing edge embeddings with information on distance and directional relationships between nodes (relative PE). Random-walk structural encoding (RWSE) [4] incorporates local SE into node features, facilitating the recognition of cyclic control-flow patterns among event classes. Graphormer employs a combination of centrality encoding (local SE) and edge encoding (relative PE) to enhance both node and edge features [24].
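As an illustration of one such initialization strategy, the following sketch computes RWSE in the usual way: for each node, the return probabilities of 1- to k-step random walks, i.e., the diagonals of the powers of the row-normalized transition matrix. The function name and the choice of eight steps are assumptions for illustration, not the paper's configuration.

```python
import numpy as np

def rwse(adj, k_steps=8):
    """Random-walk structural encoding [4]: entry (i, k) is the probability
    that a (k+1)-step random walk starting at node i returns to node i.
    High return probabilities signal cyclic structure around a node."""
    adj = np.asarray(adj, dtype=float)
    deg = adj.sum(axis=1)
    deg[deg == 0] = 1.0                  # guard against isolated nodes
    rw = adj / deg[:, None]              # row-normalized transition matrix
    enc = np.empty((adj.shape[0], k_steps))
    p = np.eye(adj.shape[0])
    for k in range(k_steps):
        p = p @ rw                       # k-step transition probabilities
        enc[:, k] = np.diag(p)           # return probabilities per node
    return enc                           # shape: (num_nodes, k_steps)
```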

Additionally, a range of learnable modules for processing PE/SEs can be integrated into PGTNet, as highlighted in [16]. These design options include simple MLPs as well as more advanced networks such as DeepSet [25] and SignNet [11]. Lastly, while it is possible to use various MPNN and global attention blocks within each GPS layer [16], we exclusively used the graph isomorphism network (GIN) [9] and the conventional Transformer architecture [18]; a sketch of such a layer is given below. Further details regarding our policy for designing PGTNet are elaborated in Section 5.1.
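To make the layer structure concrete, here is a rough sketch of a single GPS-style layer in PyTorch Geometric: a local MPNN and global self-attention run in parallel on the node representations, their outputs are summed, and an MLP follows [16]. It uses GINE, a GIN variant that consumes edge features (in line with the note that only the MPNN blocks process them); the class name, normalization placement, and head count are illustrative assumptions rather than the paper's exact configuration.

```python
import torch.nn as nn
from torch_geometric.nn import GINEConv
from torch_geometric.utils import to_dense_batch

class GPSLayer(nn.Module):
    """One GPS-style layer: local message passing (GINE) plus global
    multi-head self-attention, combined by summation with residuals."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.mpnn = GINEConv(nn.Sequential(nn.Linear(dim, dim), nn.ReLU(),
                                           nn.Linear(dim, dim)))
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.mlp = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(),
                                 nn.Linear(dim, dim))
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, x, edge_index, edge_attr, batch):
        local = self.mpnn(x, edge_index, edge_attr)   # edge features used here only
        dense, mask = to_dense_batch(x, batch)        # pad graphs for attention
        glob, _ = self.attn(dense, dense, dense, key_padding_mask=~mask)
        glob = glob[mask]                             # back to flat node tensor
        h = self.norm1(x + local + glob)              # sum branches + residual
        return self.norm2(h + self.mlp(h))
```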


:::info Authors:

(1) Keyvan Amiri Elyasi (ORCID: 0009-0007-3016-2392), Data and Web Science Group, University of Mannheim, Germany (keyvan@informatik.uni-mannheim.de);

(2) Han van der Aa (ORCID: 0000-0002-4200-4937), Faculty of Computer Science, University of Vienna, Austria (han.van.der.aa@univie.ac.at);

(3) Heiner Stuckenschmidt (ORCID: 0000-0002-0209-3859), Data and Web Science Group, University of Mannheim, Germany (heiner@informatik.uni-mannheim.de).

:::


:::info This paper is available on arxiv under CC BY-NC-ND 4.0 Deed (Attribution-Noncommercial-Noderivs 4.0 International) license.

:::

