TY - GEN
T1 - Contrastive learning
T2 - 7th Joint International Conference on Data Science and Management of Data, CODS-COMAD 2024
AU - Tripathi, Sandhya
AU - King, Christopher Ryan
N1 - Publisher Copyright:
© 2024 Owner/Author.
PY - 2024/1/4
Y1 - 2024/1/4
N2 - Contrastive learning (CL) has exploded in popularity due to its ability to learn effective representations from vast quantities of unlabelled data across multiple domains. CL underlies some of the most impressive applications of generative AI for the general public. We will review the fundamentals and applied work on contrastive learning representations, focusing on three main topics: 1) CL in supervised, unsupervised, and self-supervised settings and its revival in AI research as an instance discriminator. In this part, we will focus on the nuts and bolts, such as different augmentation techniques, loss functions, performance evaluation metrics, and some theoretical understanding of the contrastive loss. We will also present the methods supporting DALL·E 2, a popular generative AI system. 2) Learning contrastive representations across vision, text, time-series, tabular data, and knowledge graph modalities. Specifically, we will present literature representative of solution approaches involving new augmentation techniques, modifications to the loss function, and the use of additional information. The first two parts will also include short hands-on sessions on the applications shown and some of the methods covered. 3) Discussing the various theoretical and empirical claims for CL's success, including the role of negative examples. We will also present work that challenges the shared-information assumption of CL and proposes alternative explanations. Finally, we will conclude with some future directions and applications for CL.
AB - Contrastive learning (CL) has exploded in popularity due to its ability to learn effective representations from vast quantities of unlabelled data across multiple domains. CL underlies some of the most impressive applications of generative AI for the general public. We will review the fundamentals and applied work on contrastive learning representations, focusing on three main topics: 1) CL in supervised, unsupervised, and self-supervised settings and its revival in AI research as an instance discriminator. In this part, we will focus on the nuts and bolts, such as different augmentation techniques, loss functions, performance evaluation metrics, and some theoretical understanding of the contrastive loss. We will also present the methods supporting DALL·E 2, a popular generative AI system. 2) Learning contrastive representations across vision, text, time-series, tabular data, and knowledge graph modalities. Specifically, we will present literature representative of solution approaches involving new augmentation techniques, modifications to the loss function, and the use of additional information. The first two parts will also include short hands-on sessions on the applications shown and some of the methods covered. 3) Discussing the various theoretical and empirical claims for CL's success, including the role of negative examples. We will also present work that challenges the shared-information assumption of CL and proposes alternative explanations. Finally, we will conclude with some future directions and applications for CL.
KW - augmentations
KW - clustering
KW - contrastive learning
KW - distillation
KW - graphs
KW - multi-modal
KW - multi-view
KW - noise estimation loss
KW - tabular datasets
KW - time-series
UR - http://www.scopus.com/inward/record.url?scp=85183582823&partnerID=8YFLogxK
U2 - 10.1145/3632410.3633291
DO - 10.1145/3632410.3633291
M3 - Conference contribution
AN - SCOPUS:85183582823
T3 - ACM International Conference Proceeding Series
SP - 493
EP - 497
BT - CODS-COMAD 2024 - Proceedings of 7th Joint International Conference on Data Science and Management of Data
PB - Association for Computing Machinery
Y2 - 4 January 2024 through 7 January 2024
ER -