Chang Xu is an ARC Future Fellow and Associate Professor at the School of Computer Science, University of Sydney. He received the NSW Premier's Prize for Early Career Researcher of the Year and the University of Sydney Vice-Chancellor's Award for Outstanding Early Career Research. His research interests lie in machine learning algorithms and related applications in computer vision. He has published over 100 papers in prestigious journals and top-tier conferences. He has received several paper awards, including the Distinguished Paper Award at AAAI 2023, the Best Student Paper Award at ICDM 2022, a Best Paper Candidate at CVPR 2021, and the Distinguished Paper Award at IJCAI 2018. He has served as an area chair for NeurIPS, ICML, ICLR, KDD, CVPR, and MM, as well as a senior PC member for AAAI and IJCAI. In addition, he serves as an associate editor of IEEE T-PAMI, IEEE T-MM, and TMLR. He was named a Top Ten Distinguished Senior PC Member at IJCAI 2017 and an Outstanding Associate Editor of IEEE T-MM in 2022.
[University Homepage] [Google Scholar]
Address: J12/1 Cleveland St, Darlington NSW 2008, Australia.
Email: c.xu!AT!sydney.edu.au; dr.changxu!AT!gmail.com;
Students (RA/MSc/PhD) interested in machine learning research are warmly welcome to work with me, and I will do my best to help them improve in all respects.
Recent Work Highlights:
- Where and What? Examining Interpretable Disentangled Representations is accepted by CVPR 2021. It is one of the best paper candidates (32/7015) at CVPR 2021! Unlike the independence assumption, interpretability has rarely been exploited to encourage disentanglement in the unsupervised setting. We examine the interpretability of disentangled representations by investigating two questions: where to be interpreted and what to be interpreted?
- Adapting Neural Architectures Between Domains [code] is accepted by NeurIPS 2020. The power of deep neural networks is unleashed when analyzing a large volume of data (e.g. ImageNet), but the architecture search is often executed on a smaller dataset (e.g. CIFAR-10) to finish in a feasible time. To enhance the generalization of the searched neural architecture, why not take a small piece of that large dataset into the process of NAS?
- Scientific Control for Reliable Neural Network Pruning is accepted by NeurIPS 2020. We prune redundant filters to achieve a compact deep neural network. But are the filters you have pruned really redundant? Let's set up a scientific control group for a more rigorous research design, so that the effect of all factors except the association between the filter and the expected network output can be minimized.
- Learning Disentangled Representations with Latent Variation Predictability is accepted by ECCV 2020. Latent traversal is a popular way to visualize disentangled latent representations. Given variations in a single unit of the latent representation, it is expected that there is a change in only a single factor of variation of the data, but this impressive experimental observation is rarely explicitly encoded by existing methods. This paper defines the variation predictability to address this need.
- Optical Flow Distillation: Towards Efficient and Stable Video Style Transfer is accepted by ECCV 2020. In video style transfer, optical flow is widely adopted to boost transfer stability, but its high computational complexity has long been criticized. If we can formulate how optical flow improves transfer stability, it becomes possible to graft this knowledge onto a lightweight style transfer network that takes no explicit optical-flow input.
- Neural Architecture Search in a Proxy Validation Loss Landscape [code] is accepted by ICML 2020. Intermediate validation results in neural architecture search are invaluable but have rarely been explored. We propose to approximate the validation loss landscape, based on which the optimal neural architecture corresponding to the minimal validation loss can be discovered more efficiently.
- AdderNet: Do We Really Need Multiplications in Deep Learning? [code] is accepted by CVPR 2020. Compared with the cheap addition operation, multiplication has much higher computational complexity. We present adder networks (AdderNets) to trade the massive multiplications in deep neural networks for much cheaper additions to reduce computation costs (see the first sketch after this list).
- On Positive-Unlabeled Classification in GAN [code] is accepted by CVPR 2020. We present a novel technique to stabilize the training of the discriminator in GANs. Traditionally, real data are taken as positive while generated data are negative. In contrast, we advocate that the generated data could be positive or negative according to their quality (see the second sketch after this list).
- Graph Edge Convolutional Neural Networks for Skeleton-Based Action Recognition is accepted by IEEE T-NNLS 2019. Graph nodes and their features are usually the focus when developing different graph convolution operations. We suggest that graph edges could be another important perspective for exploring convolutions over graph data.
- Evolutionary Generative Adversarial Networks [code] is accepted by IEEE T-EC 2019. Survival of the fittest! We build an evolutionary generative adversarial network (E-GAN), which treats the adversarial training procedure as an evolutionary problem. Generators, acting as parents, undergo different mutations to produce offspring to adapt to the environment (i.e. the discriminator).
- LegoNet: Efficient Convolutional Neural Networks with Lego Filters [code] is accepted by ICML 2019. There are many successful building blocks resulting from network engineering, e.g. inception and residual modules. Beyond these high-level modules, we suggest that an ordinary filter in a neural network can be upgraded to a sophisticated module as well. Filter modules are established by assembling a shared set of Lego filters that are often of much lower dimension.
- Attention-GAN for Object Transfiguration in Wild Images [code] is accepted by ECCV 2018. The generative network in classical GANs for object transfiguration often undertakes a dual responsibility: to detect the objects of interest and to convert them from the source domain to the target domain. In contrast, we decompose the generative network into two separate networks, each dedicated to one particular sub-task.
- Learning from Multiple Teacher Networks is accepted by ACM SIGKDD 2017. The current student-teacher learning paradigm focuses on a single teacher network. In practice, a student may access multiple teachers, and multiple teacher networks together provide comprehensive guidance that is beneficial for training the student network (see the third sketch below).
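To make a few of the highlights above concrete, here are minimal, hypothetical sketches. They follow the papers' stated ideas but are not the released implementations; all function names, shapes, and hyperparameters below are illustrative assumptions.

First, the AdderNet idea: score each input patch by its negative L1 distance to the filter, so the similarity itself uses only subtractions and absolute values rather than a multiply-accumulate dot product.

```python
import torch
import torch.nn.functional as F

def adder_conv2d(x, weight, stride=1, padding=0):
    """Sketch of an AdderNet-style layer. x: (N, C, H, W); weight: (O, C, k, k)."""
    n, c, h, w = x.shape
    o, _, k, _ = weight.shape
    # Extract sliding patches: (N, C*k*k, L), where L is the number of locations.
    patches = F.unfold(x, k, stride=stride, padding=padding)
    flat_w = weight.view(o, -1)                               # (O, C*k*k)
    # Negative L1 distance between every filter and every patch.
    diff = patches.unsqueeze(1) - flat_w.view(1, o, -1, 1)    # (N, O, C*k*k, L)
    out = -diff.abs().sum(dim=2)                              # (N, O, L)
    h_out = (h + 2 * padding - k) // stride + 1
    w_out = (w + 2 * padding - k) // stride + 1
    return out.view(n, o, h_out, w_out)
```

Second, the positive-unlabeled view of GAN training: real images are positive, while generated images are treated as unlabeled, since some fraction of them may already be of positive quality. The sketch plugs that view into a standard non-negative PU risk estimator with the logistic loss; the positive prior `pi` is an assumed hyperparameter.

```python
import torch
import torch.nn.functional as F

def pu_discriminator_loss(d_real, d_fake, pi=0.3):
    """d_real / d_fake: raw discriminator logits on real / generated batches."""
    risk_pos = F.softplus(-d_real).mean()   # real samples scored as positive
    # Unlabeled-as-negative risk, corrected by the positive prior; clamping at
    # zero follows the usual non-negative PU estimator.
    risk_unl = F.softplus(d_fake).mean() - pi * F.softplus(d_real).mean()
    return pi * risk_pos + torch.clamp(risk_unl, min=0.0)
```

Third, distillation from several teachers: one simple instantiation of the multi-teacher idea is to have the student match the averaged softened predictions of all teachers alongside the usual hard-label loss.

```python
import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, teacher_logits_list, labels,
                          temperature=4.0, alpha=0.5):
    """Average the teachers' softened distributions and distil into the student."""
    soft_targets = torch.stack(
        [F.softmax(t / temperature, dim=1) for t in teacher_logits_list]
    ).mean(dim=0)
    log_student = F.log_softmax(student_logits / temperature, dim=1)
    kd = F.kl_div(log_student, soft_targets, reduction="batchmean")
    ce = F.cross_entropy(student_logits, labels)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return alpha * (temperature ** 2) * kd + (1 - alpha) * ce
```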
Publications Categorised by Year or by Venue.
Preprints:
- W Li, Y Cao, X Su, X Lin, S You, M Zheng, Y Chen, C Xu: Training-free Long Video Generation with Chain of Diffusion Model Experts.
- T Yang, Z Li, J Cao, C Xu: Mitigating Hallucination in Large Vision-Language Models via Modular Attribution and Intervention.
- J Zhang, D Liu, E Park, S Zhang, C Xu: Anti-Exposure Bias in Diffusion Models via Prompt Learning.
- D Li, W Xie, M Cao, Y Wang, J Zhang, Y Li, L Fang, C Xu: FusionSAM: Visual Multimodal Learning with Segment Anything Model.
- L Tao, H Guo, M Dong, C Xu: Consistency Calibration: Improving Uncertainty Calibration via Consistency among Perturbed Neighbors.
- T Huang, Y Ma, S You, C Xu: Learning Mask Invariant Mutual Information for Masked Image Modeling.
- J Lin, L Tao, M Dong, C Xu: Diffusion Attribution Score: Which Training Sample Determines Your Generation.
- C Chen, D Liu, M Shah, C Xu: Exploring Local Memorization in Diffusion Models via Bright Ending Attention.
- C Chen, E Liu, D Liu, M Shah, C Xu: Investigating Memorization in Video Diffusion Models.
- C Xu, D Tao, C Xu: A survey on multi-view learning.
Research Grants and Projects:
- ARC Discovery Project (DP) 2024-2026, "Generative Visual Pre-training on Unlabelled Big Data", Sole Chief Investigator.
- Sydney Research Accelerator (SOAR) Prize, 2022-2023, "Creative AI: Understanding Data via Generation", Sole Chief Investigator.
- Global Development Award (GDA), 2022, "Contrastive Machine Learning from Observation and Imagination", Sole Chief Investigator.
- ARC Discovery Project (DP) 2021-2023, "DeepHoney: Automatic Honey Data Generation for Active Cyber Defence", Sole Chief Investigator.
- Faculty of Engineering Early Career Researcher Development Scheme, 2019, "Learning to Defend Privacy in Human Behavior Analysis", Sole Chief Investigator.
- Ant Financial Special Scientific Research Fund, 2019, "Adversarial Deep Learning for Generating and Detecting Fake Facial Expressions", Sole Chief Investigator.
- ARC Discovery Early Career Researcher Award (DECRA) 2018-2020, "Multi-view Synergistic Learning for Human Behaviour Analysis", Sole Chief Investigator.
- ARC Discovery Projects (DP) 2018-2020, "Streaming Label Learning for Leaching Knowledge from Labels on the Fly", Chief Investigator B.
- The University of Sydney Partnership Collaboration Award, 2018, "Scalable Machine Learning for Business Event Analysis", Sole Chief Investigator.
Honors and Awards:
- 2023, NSW Premier's Prize for Early Career Researcher of the Year (Physical Sciences), New South Wales (NSW), Australia
- 2023, SIGMM Rising Star Award, ACM
- 2023, Faculty Dean's Award for Supervision of HDR Students and Mentoring, FEIT, University of Sydney
- 2023, Distinguished Paper Award (12/8777), AAAI Conference on Artificial Intelligence (AAAI)
- 2022, Vice-Chancellor’s Award for Outstanding Early Career Research, University of Sydney
- 2022, Best Student Paper Award (1/870), IEEE International Conference on Data Mining (ICDM)
- 2022, Outstanding Associate Editor Award, IEEE Transactions on Multimedia
- 2022, Sydney Research Accelerator (SOAR) Prize, University of Sydney
- 2022, Global Development Award, University of Sydney
- 2021, Best Paper Candidate (32/7015), CVPR
- 2020, Supervisor of the Year, SUPRA, University of Sydney
- 2020, Early Career Researcher Award – Honourable Mention, Australian Pattern Recognition Society (APRS)
- 2019, Faculty Dean's Research Award, FEIT, University of Sydney
- 2018, Distinguished Paper Award (7/3470), IJCAI
- 2017, Distinguished Student Paper Award (1/5240), IJCAI
- 2017, Top Ten Distinguished Senior PC (10/357), IJCAI
- 2017, CAAI Doctoral Dissertation Award, Chinese Association for Artificial Intelligence
- 2016, Peking University Outstanding Graduate, Peking University
- 2016, Excellent Doctoral Dissertation (100/1703), Peking University
- 2015, IBM PhD Fellowship, IBM Corp.
- 2015, President's PhD Scholarship, Peking University
- 2015, National Scholarship for Graduate Students, Ministry of Education of P.R.C.
- 2015, Qingyun Shi Best Student Paper Award, Peking University
- 2014, Baidu PhD Fellowship, Baidu, Inc.
- 2014, Qingyun Shi Best Student Paper Award, Peking University
- 2014, Top Ten Students with Academic Research Honor, School of EECS, Peking University
- 2014, Outstanding Reviewer Award, Elsevier, Computational Statistics & Data Analysis (CSDA)
- 2013, Best Student Paper Award, ACM ICIMCS
- 2012, Wusi (May Fourth) Outstanding Scholarship, Peking University
- 2011, Tianjin University Outstanding Graduate, Tianjin University
- 2008, Tianjin Government Scholarship, Tianjin Government
Talks:
- “Deep Neural Networks: from Feature Engineering to Network Engineering” at SYNCS Academic Tech Talk, 09/2022
- “Adversarial Machine Learning: GANs as an Example” at MLSP Workshop, Online 10/2021
- "Revisiting Proxies for Unsupervised Disentangled Representation Learning" at IJCAI WSL Workshop, Online 08/2021
- “Examining Deep Neural Architectures in Practice” at CVPR 2021 NAS Workshop, Online 06/2021
- “GAN and Its Beyond” at Monash University, Online 06/2021
- “Lightweight Neural Architectures for Efficient Deep Learning” at DICTA, Online 12/2020
- “Adversarial Thoughts Beyond GANs” at USyd-PKU Workshop, Beijing 01/2020
- “Grinding Ingredients to Cook Unsupervised Learning” at USyd-PKU Workshop, Sydney 11/2019
- “Annual Review on Generative Adversarial Networks” at VALSE, Hefei 2019
- “GANs for Cross-domain Visual Data” at Huawei Noah’s Ark Lab 2019
- “GANs for Cross-domain Visual Data” at Wuhan University 2019
- “Learning from Multiple Views” at DICTA 2017
- “Learning from Multiple Views” at Fudan University 2017
- “Developmental Learning” at Wuhan University 2017
- “Developmental Learning” at UTS Multimedia Workshop 2017
- “Multi-view Learning” at VALSE Webinar, 2015
Professional Services:
- Associate Editor of IEEE Transactions on Pattern Analysis and Machine Intelligence (T-PAMI), IEEE Transactions on Multimedia (T-MM), Transactions on Machine Learning Research (TMLR), and Neurocomputing
- Guest Editor of Data Mining and Knowledge Discovery
- Proceedings Chair of MM 2026
- Sponsor Chair of DICTA 2017
- Workshop Chair of 2018 IEEE ICDM Workshop on Developmental Learning, 2021 ACM SIGKDD Workshop on Model Mining
- Area Chair of ICML 2024, ICLR 2024, CVPR 2024, MM 2024, NeurIPS 2023, ICML 2023, KDD 2023, CVPR 2023, ICLR 2023, MM 2023, NeurIPS 2022, KDD 2022, MM 2022, WACV 2022, ICLR 2021, KDD 2021, MM 2021, NeurIPS 2021, ICPR 2020
- Senior PC of IJCAI 2024, AAAI 2024, IJCAI 2023, AAAI 2023, IJCAI 2022, AAAI 2022, IJCAI 2021, AAAI 2020, IJCAI 2019, AAAI 2019, IJCAI 2018, IJCAI 2017
Teaching:
- Deep Learning (COMP5329), 2018-2022, School of Computer Science, The University of Sydney
- Machine Learning and Data Mining (COMP5318), 2021 S1, School of Computer Science, The University of Sydney
- Computer Vision, 2017-2019, Summer School at University of Technology Sydney