no code implementations • 16 Feb 2024 • Chin-Chia Michael Yeh, Yujie Fan, Xin Dai, Vivian Lai, Prince Osei Aboagye, Junpeng Wang, Huiyuan Chen, Yan Zheng, Zhongfang Zhuang, Liang Wang, Wei Zhang
All-Multi-Layer Perceptron (all-MLP) mixer models have been shown to be effective for time series forecasting problems.
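As a rough illustration of the all-MLP mixer idea (a plain-NumPy sketch with hypothetical layer shapes, not the paper's architecture): one mixer layer alternates an MLP that mixes across time steps with an MLP that mixes across channels, each wrapped in a residual connection.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, w1, w2):
    """Two-layer MLP with ReLU, applied along the last axis."""
    return np.maximum(x @ w1, 0) @ w2

def mixer_layer(x, wt1, wt2, wc1, wc2):
    """One all-MLP mixer layer for a (time, channels) series:
    time (token) mixing, then channel mixing, each with a residual."""
    # time mixing: transpose so the MLP runs across the time axis
    x = x + mlp(x.T, wt1, wt2).T
    # channel mixing: MLP across the channel axis
    x = x + mlp(x, wc1, wc2)
    return x

# hypothetical sizes: 24 time steps, 3 channels, hidden width 16
T, C, H = 24, 3, 16
x = rng.standard_normal((T, C))
wt1, wt2 = 0.1 * rng.standard_normal((T, H)), 0.1 * rng.standard_normal((H, T))
wc1, wc2 = 0.1 * rng.standard_normal((C, H)), 0.1 * rng.standard_normal((H, C))
y = mixer_layer(x, wt1, wt2, wc1, wc2)  # same shape as the input series
```

Stacking several such layers and adding a linear forecasting head on top is the usual recipe; normalization and training details are omitted here.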
no code implementations • 16 Jan 2024 • Audrey Der, Chin-Chia Michael Yeh, Yan Zheng, Junpeng Wang, Zhongfang Zhuang, Liang Wang, Wei Zhang, Eamonn J. Keogh
In this work we introduce a domain agnostic counterfactual explanation technique to produce explanations for time series anomalies.
no code implementations • 2 Jan 2024 • Prince Aboagye, Yan Zheng, Junpeng Wang, Uday Singh Saini, Xin Dai, Michael Yeh, Yujie Fan, Zhongfang Zhuang, Shubham Jain, Liang Wang, Wei Zhang
The emergence of pre-trained models has significantly impacted fields ranging from Natural Language Processing (NLP) and Computer Vision to relational datasets.
no code implementations • 5 Nov 2023 • Audrey Der, Chin-Chia Michael Yeh, Yan Zheng, Junpeng Wang, Huiyuan Chen, Zhongfang Zhuang, Liang Wang, Wei Zhang, Eamonn Keogh
As a result, unmodified data mining tools can obtain near-identical performance on the synthesized time series as on the original time series.
no code implementations • 5 Nov 2023 • Chin-Chia Michael Yeh, Huiyuan Chen, Yujie Fan, Xin Dai, Yan Zheng, Vivian Lai, Junpeng Wang, Zhongfang Zhuang, Liang Wang, Wei Zhang, Eamonn Keogh
The ego-networks of all subsequences collectively form a time series subsequence graph, and we introduce an algorithm to efficiently construct this graph.
no code implementations • 5 Nov 2023 • Chin-Chia Michael Yeh, Huiyuan Chen, Xin Dai, Yan Zheng, Yujie Fan, Vivian Lai, Junpeng Wang, Audrey Der, Zhongfang Zhuang, Liang Wang, Wei Zhang
To facilitate this investigation, we introduce a CTSR benchmark dataset that comprises time series data from a variety of domains, such as motion, power demand, and traffic.
no code implementations • 5 Nov 2023 • Chin-Chia Michael Yeh, Yan Zheng, Menghai Pan, Huiyuan Chen, Zhongfang Zhuang, Junpeng Wang, Liang Wang, Wei Zhang, Jeff M. Phillips, Eamonn Keogh
In this work, we propose a sketch for discord mining among multi-dimensional time series.
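For background, the discord of a time series is the subsequence whose distance to its nearest non-overlapping neighbor is largest. The snippet below is a minimal brute-force 1-D illustration in NumPy; the paper's contribution is a sketch-based method for the multi-dimensional case, not this naive search.

```python
import numpy as np

def top_discord(ts, m):
    """Brute-force discord discovery: return the start index and
    nearest-neighbor distance of the length-m subsequence whose
    closest non-overlapping match is farthest away."""
    n = len(ts) - m + 1
    subs = np.array([ts[i:i + m] for i in range(n)])
    best_idx, best_dist = -1, -np.inf
    for i in range(n):
        d = np.linalg.norm(subs - subs[i], axis=1)
        # exclude overlapping (trivial-match) subsequences
        lo, hi = max(0, i - m + 1), min(n, i + m)
        d[lo:hi] = np.inf
        nn = d.min()
        if nn > best_dist:
            best_idx, best_dist = i, nn
    return best_idx, best_dist
```

On a periodic series with an injected anomaly, the returned index falls inside a window covering the anomaly; the brute-force cost is quadratic in the series length, which is what sketching techniques aim to avoid.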
1 code implementation • 20 Oct 2023 • Dongyu Zhang, Liang Wang, Xin Dai, Shubham Jain, Junpeng Wang, Yujie Fan, Chin-Chia Michael Yeh, Yan Zheng, Zhongfang Zhuang, Wei Zhang
FATA-Trans is field- and time-aware for sequential tabular data.
no code implementations • 5 Oct 2023 • Chin-Chia Michael Yeh, Xin Dai, Huiyuan Chen, Yan Zheng, Yujie Fan, Audrey Der, Vivian Lai, Zhongfang Zhuang, Junpeng Wang, Liang Wang, Wei Zhang
A foundation model is a machine learning model trained on a large and diverse set of data, typically using self-supervised learning-based pre-training techniques, that can be adapted to various downstream tasks.
no code implementations • 5 Oct 2023 • Chin-Chia Michael Yeh, Huiyuan Chen, Xin Dai, Yan Zheng, Junpeng Wang, Vivian Lai, Yujie Fan, Audrey Der, Zhongfang Zhuang, Liang Wang, Wei Zhang, Jeff M. Phillips
A Content-based Time Series Retrieval (CTSR) system is an information retrieval system that lets users interact with time series data originating from multiple domains, such as finance, healthcare, and manufacturing.
no code implementations • 5 Oct 2023 • Chin-Chia Michael Yeh, Xin Dai, Yan Zheng, Junpeng Wang, Huiyuan Chen, Yujie Fan, Audrey Der, Zhongfang Zhuang, Liang Wang, Wei Zhang
In this paper, we investigate the application of MTL to the time series classification (TSC) problem.
no code implementations • 2 Jun 2023 • Xin Dai, Yujie Fan, Zhongfang Zhuang, Shubham Jain, Chin-Chia Michael Yeh, Junpeng Wang, Liang Wang, Yan Zheng, Prince Osei Aboagye, Wei Zhang
Pre-training large models is prevalent and still growing, fueled by the ever-increasing volume of user-generated content in many machine learning application categories.
no code implementations • 9 Dec 2022 • Audrey Der, Chin-Chia Michael Yeh, Renjie Wu, Junpeng Wang, Yan Zheng, Zhongfang Zhuang, Liang Wang, Wei Zhang, Eamonn Keogh
PRCIS is a distance measure for long time series, which exploits recent progress in our ability to summarize time series with dictionaries.
no code implementations • AMTA 2022 • Prince O Aboagye, Yan Zheng, Michael Yeh, Junpeng Wang, Zhongfang Zhuang, Huiyuan Chen, Liang Wang, Wei Zhang, Jeff Phillips
Optimal Transport (OT) provides a useful geometric framework to estimate the permutation matrix under unsupervised cross-lingual word embedding (CLWE) models that pose the alignment task as a Wasserstein-Procrustes problem.
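The Wasserstein-Procrustes formulation referred to here can be written in its standard form (with \(\mathbf{X}\) and \(\mathbf{Y}\) denoting the two embedding matrices; these symbols are not taken from the abstract):

```latex
\min_{\mathbf{Q} \in \mathcal{O}_d,\; \mathbf{P} \in \mathcal{P}_n}
\left\lVert \mathbf{X}\mathbf{Q} - \mathbf{P}\mathbf{Y} \right\rVert_F^2
```

where \(\mathcal{O}_d\) is the set of \(d \times d\) orthogonal matrices (the Procrustes rotation) and \(\mathcal{P}_n\) the set of \(n \times n\) permutation matrices, the latter being what OT estimates.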
no code implementations • 11 Aug 2022 • Chin-Chia Michael Yeh, Mengting Gu, Yan Zheng, Huiyuan Chen, Javid Ebrahimi, Zhongfang Zhuang, Junpeng Wang, Liang Wang, Wei Zhang
Graph neural networks (GNNs) are deep learning models designed specifically for graph data, and they typically rely on node features as the input to the first layer.
no code implementations • 23 Jan 2022 • Zhongfang Zhuang, Xiangnan Kong, Elke Rundensteiner, Aditya Arora, Jihane Zouaoui
In this paper, we study the problem of one-shot learning on attributed sequences, where each instance is composed of a set of attributes (e.g., user profile) and a sequence of categorical items (e.g., clickstream).
no code implementations • 23 Jan 2022 • Zhongfang Zhuang
Recent research in feature learning has been extended to sequence data, where each instance consists of a sequence of heterogeneous items with a variable length.
no code implementations • 24 Dec 2021 • Chin-Chia Michael Yeh, Yan Zheng, Junpeng Wang, Huiyuan Chen, Zhongfang Zhuang, Wei Zhang, Eamonn Keogh
The matrix profile is an effective data mining tool that provides similarity join functionality for time series data.
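To make the matrix profile concrete, here is a minimal naive NumPy sketch (quadratic time, for illustration only; practical implementations use FFT-based algorithms such as STOMP): for each length-m subsequence it records the z-normalized Euclidean distance to its nearest non-trivial match.

```python
import numpy as np

def matrix_profile(ts, m):
    """Naive matrix profile: for each length-m subsequence of ts,
    the z-normalized distance to its nearest non-trivial neighbor."""
    n = len(ts) - m + 1
    subs = np.array([ts[i:i + m] for i in range(n)])
    # z-normalize each subsequence so matching is offset/scale invariant
    subs = (subs - subs.mean(axis=1, keepdims=True)) / subs.std(axis=1, keepdims=True)
    profile = np.full(n, np.inf)
    for i in range(n):
        d = np.linalg.norm(subs - subs[i], axis=1)
        # exclusion zone: ignore trivial matches adjacent to i
        lo, hi = max(0, i - m // 2), min(n, i + m // 2 + 1)
        d[lo:hi] = np.inf
        profile[i] = d.min()
    return profile
```

Low values in the profile mark repeated patterns (motifs); the maximum marks the discord, which is what makes the structure useful for similarity joins.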
no code implementations • 29 Sep 2021 • Chin-Chia Michael Yeh, Mengting Gu, Yan Zheng, Huiyuan Chen, Javid Ebrahimi, Zhongfang Zhuang, Junpeng Wang, Liang Wang, Wei Zhang
When applying such networks to graphs without node features, one can extract simple graph-based node features (e.g., node degree) or learn the input node representations (i.e., embeddings) while training the network.
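The first option mentioned above is straightforward; a minimal sketch (assuming an undirected graph given as a dense adjacency matrix, a simplification for illustration) derives a degree-based feature vector per node to feed a GNN's first layer:

```python
import numpy as np

def degree_features(adj):
    """Simple graph-derived node features for a featureless graph:
    each node's (log-scaled) degree, shaped (n_nodes, 1)."""
    deg = adj.sum(axis=1)          # degree of each node
    return np.log1p(deg).reshape(-1, 1)

# 4-node path graph: 0-1-2-3
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]])
feats = degree_features(adj)       # endpoints get smaller values than interior nodes
```

Richer hand-crafted variants (clustering coefficient, one-hot degree buckets) follow the same pattern; the learned-embedding alternative instead treats the input representation as trainable parameters.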
no code implementations • 21 Sep 2021 • Chin-Chia Michael Yeh, Zhongfang Zhuang, Junpeng Wang, Yan Zheng, Javid Ebrahimi, Ryan Mercer, Liang Wang, Wei Zhang
In this work, we study the problem of multivariate time series prediction for estimating transaction metrics associated with entities in the payment transaction database.
no code implementations • 8 Nov 2020 • Zhongfang Zhuang, Xiangnan Kong, Elke Rundensteiner, Jihane Zouaoui, Aditya Arora
Distance metric learning has attracted much attention in recent years, where the goal is to learn a distance metric based on user feedback.
no code implementations • 5 Nov 2020 • Chin-Chia Michael Yeh, Zhongfang Zhuang, Yan Zheng, Liang Wang, Junpeng Wang, Wei Zhang
In this work, we approach this problem from a multi-modal learning perspective, where we use not only the merchant time series data but also information on merchant-merchant relationships (i.e., affinity) to verify the self-reported business type (i.e., merchant category) of a given merchant.
no code implementations • 23 Sep 2020 • Chin-Chia Michael Yeh, Dhruv Gelda, Zhongfang Zhuang, Yan Zheng, Liang Gou, Wei Zhang
Our proposed framework utilizes a set of entity-relation-matrices as the input, which quantifies the affinities among different entities in the database.
no code implementations • 25 Jul 2020 • Zhongfang Zhuang, Chin-Chia Michael Yeh, Liang Wang, Wei Zhang, Junpeng Wang
New challenges have surfaced in monitoring and guaranteeing the integrity of payment processing systems.
no code implementations • 10 Jul 2020 • Chin-Chia Michael Yeh, Zhongfang Zhuang, Wei Zhang, Liang Wang
We use experiments on real-world merchant transaction data to demonstrate the effectiveness of our proposed model.
no code implementations • 3 Nov 2019 • Zhongfang Zhuang, Xiangnan Kong, Elke Rundensteiner, Jihane Zouaoui, Aditya Arora
This problem is core to many important data mining tasks ranging from user behavior analysis to the clustering of gene sequences.
no code implementations • 13 Oct 2019 • Yuwei Wang, Yan Zheng, Yanqing Peng, Chin-Chia Michael Yeh, Zhongfang Zhuang, Mahashweta Das, Mangesh Bendre, Feifei Li, Wei Zhang, Jeff M. Phillips
Embeddings are already essential tools for large language models and image analysis, and their use is being extended to many other research domains.