Dr Ming Liu

STAFF PROFILE

Position

Lecturer, Machine Learning

Faculty

Faculty of Science, Engineering and Built Environment

Department

School of Information Technology

Campus

Melbourne Burwood Campus

Biography

Dr Ming Liu is an early career researcher working on Natural Language Processing and Machine Learning. He proposed the "learn to actively learn" approach for active learning and developed several text summarization pipelines (e.g. SummPip and SciSummPip) that are widely used in low-resource text generation settings. His research has attracted multiple grants, including a 2022 Deakin MiniARC grant and a 2023 ARC Linkage grant (LP220200746). Dr Liu is interested in solving real-world text mining problems, particularly in domain-specific settings.

Research interests

  • Natural Language Processing
  • Small Efficient Language Modelling
  • Continual Learning
  • Text Generation 
  • Adversarial Learning
  • Scientific Text Mining
  • Multimodality 
  • Conversational Systems

Teaching interests

  • Machine Learning
  • Natural Language Processing
  • Deep Learning
  • Applied Data Analysis
  • Semi-structured Data Analysis
  • Data Wrangling

Publications

2023

Prototype-Guided Memory Replay for Continual Learning

Stella Ho, Ming Liu, Lan Du, Longxiang Gao, Yong Xiang

(2023), pp. 1-11, IEEE Transactions on Neural Networks and Learning Systems, Piscataway, N.J., C1

journal article

A graph empowered insider threat detection framework based on daily activities

W Hong, J Yin, M You, H Wang, J Cao, J Li, M Liu, C Man

(2023), Vol. 141, pp. 84-92, ISA Transactions, Amsterdam, The Netherlands, C1

journal article

A fault diagnosis algorithm for analog circuits based on self-attention mechanism deep learning

D Yang, J Wei, X Lin, M Liu, S Lu

(2023), Vol. 44, pp. 128-136, Yi Qi Yi Biao Xue Bao/Chinese Journal of Scientific Instrument, Beijing, China, C1-1

journal article

Leveraging Natural Language Processing and Clinical Notes for Dementia Detection

Ming Liu, Richard Beare, Taya Collyer, Nadine Andrew, Velandai Srikanth

(2023), pp. 150-155, Clinical Natural Language Processing : Proceedings of the 5th Clinical Natural Language Processing Workshop, Toronto, Canada, E1-1

conference

Make Text Unlearnable: Exploiting Effective Patterns to Protect Personal Data

Xinzhe Li, Ming Liu

(2023), pp. 249-259, TrustNLP 2023 : Proceedings of the 3rd Workshop on Trustworthy Natural Language Processing, Toronto, Canada, E1

conference

DeakinNLP at ProbSum 2023: Clinical Progress Note Summarization with Rules and Language Models

M Liu, D Zhang, W Tan, H Zhang

(2023), pp. 491-496, BioNLP 2023 : Proceedings 22nd Workshop on Biomedical Natural Language Processing and BioNLP Shared Task, Toronto, Canada, E1

conference

Can Pretrained Language Models Derive Correct Semantics from Corrupt Subwords under Noise?

X Li, M Liu, S Gao

(2023), pp. 165-173, Proceedings of the Annual Meeting of the Association for Computational Linguistics, E1

conference

A Survey on Out-of-Distribution Evaluation of Neural NLP Models

X Li, M Liu, S Gao, W Buntine

(2023), Vol. 2023-August, pp. 6683-6691, IJCAI 2023 : Proceedings of the 32nd International Joint Conference on Artificial Intelligence, Macao, China, E1

conference

An Empirical Study on Active Learning for Multi-label Text Classification

Mengqi Wang, Ming Liu

(2023), pp. 94-102, Proceedings of The Fourth Workshop on Insights from Negative Results in NLP, Dubrovnik, Croatia, E1

conference

2022

Mulan: A Multiple Residual Article-Wise Attention Network for Legal Judgment Prediction

Junyi Chen, Lan Du, Ming Liu, Xiabing Zhou

(2022), Vol. 21, pp. 1-15, ACM Transactions on Asian and Low-Resource Language Information Processing, New York, N.Y., C1

journal article

An Empirical Survey on Long Document Summarization: Datasets, Models and Metrics

Huan Koh, Jiaxin Ju, Ming Liu, Shirui Pan

(2022), pp. 1-39, ACM Computing Surveys, New York, N.Y., C1

journal article

Similarity Calculation via Passage-Level Event Connection Graph

M Liu, L Chen, Z Zheng

(2022), Vol. 12, pp. 9887-9887, Applied Sciences (Switzerland), C1

journal article

BiDKT: Deep Knowledge Tracing with BERT

W Tan, Y Jin, M Liu, H Zhang

(2022), Vol. 428, pp. 260-278, Ad Hoc Networks and Tools for IT, Virtual event, E1

conference

Semi-supervised Continual Learning with Meta Self-training

S Ho, M Liu, L Du, Y Li, L Gao, S Gao

(2022), pp. 4024-4028, CIKM '22 : Proceedings of the 31st ACM International Conference on Information and Knowledge Management 2022, Atlanta, Ga., E1

conference

Graph Intelligence Enhanced Bi-Channel Insider Threat Detection

W Hong, J Yin, M You, H Wang, J Cao, J Li, M Liu

(2022), Vol. 13787, pp. 86-102, NSS 2022 : Proceedings of the International Conference on Network and System Security 2022, Denarau Island, Fiji, E1

conference

How Far are We from Robust Long Abstractive Summarization?

H Koh, J Ju, H Zhang, M Liu, S Pan

(2022), pp. 2682-2698, EMNLP 2022 : Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, Abu Dhabi, UAE, E1

conference

2021

Variational auto-encoder based Bayesian Poisson tensor factorization for sparse and imbalanced count data

Y Jin, M Liu, Y Li, R Xu, L Du, L Gao, Y Xiang

(2021), Vol. 35, pp. 505-532, Data Mining and Knowledge Discovery, C1

journal article

Leveraging Information Bottleneck for Scientific Document Summarization

Jiaxin Ju, Ming Liu, Huan Koh, Yuan Jin, Lan Du, Shirui Pan

(2021), pp. 4091-4098, Findings of the Association for Computational Linguistics : EMNLP 2021, Punta Cana, The Dominican Republic & Online, E1

conference

Neural Attention-Aware Hierarchical Topic Model

Yuan Jin, He Zhao, Ming Liu, Lan Du, Wray Buntine

(2021), pp. 1042-1052, Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, Punta Cana, The Dominican Republic & Online, E1

conference

Transformer over Pre-trained Transformer for Neural Text Segmentation with Enhanced Topic Coherence

Kelvin Lo, Yuan Jin, Weicong Tan, Ming Liu, Lan Du, Wray Buntine

(2021), pp. 3334-3340, Findings of the Association for Computational Linguistics : EMNLP 2021, Punta Cana, The Dominican Republic & Online, E1

conference

Exploring the Vulnerability of Natural Language Processing Models via Universal Adversarial Texts

Xinzhe Li, Ming Liu, Xingjun Ma, Longxiang Gao

(2021), pp. 138-148, ALTA 2021 : Proceedings of the 19th Workshop of the Australasian Language Technology Association, Online, E1

conference

2020

SummPip: unsupervised multi-document summarization with sentence graph compression

Jinming Zhao, Ming Liu, Longxiang Gao, Yuan Jin, Lan Du, He Zhao, He Zhang, Gholamreza Haffari

(2020), pp. 1949-1952, SIGIR 2020 : Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval, Online, China, E1

conference

SciSummPip: An Unsupervised Scientific Paper Summarization Pipeline

Jiaxin Ju, Ming Liu, Longxiang Gao, Shirui Pan

(2020), pp. 318-327, EMNLP 2020 : Proceedings of the First Workshop on Scholarly Document Processing, Online, E1

conference

Multi-label Few/Zero-shot Learning with Knowledge Aggregated from Multiple Label Graphs

Jueqing Lu, Lan Du, Ming Liu, Joanna Dipnall

(2020), pp. 2935-2943, EMNLP 2020 : Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, Online, E1

conference

2019

Closing the Gap in Surveillance and Audit of Invasive Mold Diseases for Antifungal Stewardship Using Machine Learning

Diva Baggio, Trisha Peel, Anton Peleg, Sharon Avery, Madhurima Prayaga, Michelle Foo, Gholamreza Haffari, Ming Liu, Christoph Bergmeir, Michelle Ananda-Rajah

(2019), Vol. 8, Journal of Clinical Medicine, Switzerland, C1-1

journal article

Learning how to active learn by dreaming

Thuy-Trang Vu, Ming Liu, Dinh Phung, Gholamreza Haffari

(2019), pp. 4091-4101, ACL 2019 : Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Florence, Italy, E1

conference

2018

Learning how to actively learn: a deep imitation learning approach

Ming Liu, Wray Buntine, Gholamreza Haffari

(2018), Vol. 1, pp. 1874-1883, ACL 2018 : Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, Melbourne, Vic., E1-1

conference

Learning to actively learn neural machine translation

Ming Liu, Wray Buntine, Gholamreza Haffari

(2018), pp. 334-344, CoNLL 2018 : Proceedings of the 22nd Conference on Computational Natural Language Learning, Brussels, Belgium, E1-1

conference

2017

Leveraging linguistic resources for improving neural text classification

Ming Liu, Gholamreza Haffari, Wray Buntine, Michelle Ananda-Rajah

(2017), pp. 34-42, ALTA 2017 : Proceedings of the Australasian Language Technology Association Workshop 2017, Brisbane, Qld., E1-1

conference

2016

Learning cascaded latent variable models for biomedical text classification

Ming Liu, Gholamreza Haffari, Wray Buntine

(2016), Vol. 14, pp. 128-132, ALTA 2016 : Proceedings of the Australasian Language Technology Association Workshop 2016, Melbourne, Vic., E1-1

conference

Funded Projects at Deakin

Australian Competitive Grants

Building resilience in at-risk rural communities through improving Media Communication on Climate Change Policies

A/Prof Xiao Liu, Dr Hilya Mudrika Arini, Dr Ming Liu, Dr Fitri Trapsilawati, A/Prof Chathu Ranaweera, A/Prof Kevin Lee, A/Prof Hassan Vally, Dr Anna Klas, Dr Adam Cardilini, Dr Yun Mulyani, Dr Arif Nurwidyantoro, Prof Catherine Bennett, Dr Yunita Sari, Dr Justin Lawson, Dr Gabi Mocatta

KONEKSI Australia-Indonesia Research Collaboration Grants

  • 2023: $175,000

Industry and Other Funding

Large Language Models in Engineering

Dr Shang Gao, Dr Ming Liu, Mr Xinzhe Li

Aurecon Australasia Pty Ltd

  • 2024: $2,250
  • 2023: $16,000

Supervisions

No completed student supervisions to report