Reading Group

The paper reading group meets weekly during the semester. Participation is open to all, and guests are always welcome; if you are interested in joining us, please feel free to contact Dr Ke Li.
More about our reading group culture

The paper to be discussed is announced about one week in advance by the presenter. All participants are expected to read the paper before the meeting, and it is recommended to take notes on insights, questions, and other points worth discussing.

The goals of the reading group are:

  • Critical reflection on scientific work
  • Practice of reading and argumentation strategies
  • Exposure to a broad range of research topics
  • Practice of leading group discussions

The discussion is usually limited to one hour. The presenter kicks off the meeting with a short summary of the paper and a few points for discussion, and is expected to draw the participants into the conversation.

Agenda (Year 2023)

  • Multi-Objective Bi-Level Optimization for AutoML in Cross-Project Defect Prediction
    Work Presentation
    slides | Jiaxin Chen | Aug 6

  • Paper Title
    Paper Reading
    slides | Presenter | July 23

  • Data-driven Literature Exploration with Topic Modeling, LLMs and Network Analysis
    Work Presentation
    slides | Mingyu Huang | July 30

Moderator: Fan Li

  • Optimistic tree search strategies for black-box combinatorial optimization
    NeurIPS 2022
    paper | slides | Peili Mao | June 4

  • Explaining Hyperparameter Optimization via Partial Dependence Plots
    NeurIPS 2021
    paper | slides | Mingyu Huang | June 18

  • Learning Algorithms for Active Learning
    PMLR 2017
    paper | slides | Tian Huang | June 18

Moderator: Peili Mao

  • Bayesian Optimization with Conformal Prediction Sets
    AISTATS 2023
    paper | slides | Shengbo Wang | May 7

  • A Statistical Framework for Low-bitwidth Training of Deep Neural Networks
    NeurIPS 2020
    paper | slides | Fan Li | May 14

  • RegMixup: Mixup as a Regularizer Can Surprisingly Improve Accuracy & Out-of-Distribution Robustness
    NeurIPS 2022
    paper | slides | Shasha Zhou | May 21

Moderator: Shengbo Wang

Previous reading group information can be found in our archive.