Trends in Machine Learning: ICML 2022

The 39th International Conference on Machine Learning (ICML) took place in mid-July in hybrid mode, two months after the International Conference on Learning Representations (ICLR) 2022. In our analysis of ICLR, we pointed out the spread of language models (LMs) and self-supervision. Things are not that different at ICML.

Knowledge from papers quickly passes into tutorials and workshops, which make it more accessible to practitioners. It is worth mentioning that language models are invading reinforcement learning, acting both as repositories of knowledge and as very effective planners.

We observed that the research goal in LMs can be consolidated into cost reduction, which in turn enables scale and lower carbon emissions. Researchers attack this problem from multiple angles: sparsity, hardware acceleration, vector databases, 1-bit representations, and more.
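To make the 1-bit idea concrete, here is a minimal sketch of binarizing a weight tensor to {-1, +1} with a single per-tensor scale (the mean absolute value, a common choice in binarization work); this is an illustrative example, not the method of any specific ICML paper, and the function names are our own:

```python
import numpy as np

def one_bit_quantize(w):
    """Quantize weights to signs {-1, +1} plus one float scale.

    Using the mean absolute value as the scale is a standard
    illustrative choice for sign-based binarization.
    """
    scale = float(np.abs(w).mean())
    signs = np.where(w >= 0, 1.0, -1.0).astype(np.float32)
    return signs, scale

def dequantize(signs, scale):
    # Reconstruct an approximation of the original weights.
    return signs * scale

rng = np.random.default_rng(0)
w = rng.normal(size=1000).astype(np.float32)
signs, scale = one_bit_quantize(w)
w_hat = dequantize(signs, scale)

# Storage drops from 1000 floats to 1000 bits plus one float,
# at the cost of reconstruction error.
err = float(np.mean((w - w_hat) ** 2))
print(scale, err)
```

The compression ratio here is roughly 32x for float32 weights; the open research question is how to keep model quality at that budget.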

We are many decades away from having a GPT-3 on a chip unless we see a new breakthrough. The idea of machine learning as a service (MLaaS) has been around for many years, but it became popular and promising with the rise of LMs. We noted a couple of papers discussing how to use MLaaS cost-effectively.

The trend of biomedical AI continues to grow (there were more papers than at ICLR, plus two workshops), fueled by COVID-19 and the breakthroughs of AlphaFold.

Finally, we have to mention the return of Vowpal Wabbit, one of the oldest open-source ML projects, created long before the deep learning era and still relevant and very useful!

We hope you enjoy this summary. You can find a link to our slide presentation here.

Note: Between ICLR and ICML, the AI community experienced the loss of a young and promising researcher, Octavian Ganea. I met Octavian while serving as a reviewer at NeurIPS. He kindly gave a presentation at RelationalAI on his Ph.D. thesis that left everyone speechless. We highly recommend watching it below.