SC21 Proceedings

The International Conference for High Performance Computing, Networking, Storage, and Analysis

Embeddings Are All You Need: Transfer Learning in Convolutional Neural Networks Using Word Embeddings

Student: Matt Baughman (University of Chicago)
Supervisor: Kyle Chard (University of Chicago, Argonne National Laboratory (ANL))

Abstract: Recent advances in efficient neural networks, together with relational learning that uses word embeddings as prediction targets for image classification, suggest that combining these two concepts offers promise for efficient transfer learning. Because word embeddings represent information-dense abstractions of language concepts in arbitrary vector spaces, projecting an image into the same vector space has been shown to enable relational operations between images analogous to those possible with word embeddings. In this work, we extend this idea and show that training a neural network model under this regime can yield transfer learning within the embeddings' vector space, giving the model an advantage when predicting classes of images it has not previously encountered. Additionally, we demonstrate this principle using a neural network architecture previously shown to be state-of-the-art for model efficiency, illustrating the applicability of these methods to lightweight machine learning.
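The regime the abstract describes (training a network to regress images onto word-embedding targets, then classifying by nearest neighbor in the embedding space) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the 4-dimensional label vectors are invented placeholders standing in for pretrained embeddings such as word2vec or GloVe, and the "CNN output" is a hard-coded vector standing in for a real model's prediction.

```python
import numpy as np

# Hypothetical word-embedding table. In practice these would be pretrained
# vectors (e.g. word2vec or GloVe); the 4-dim values here are placeholders.
label_embeddings = {
    "cat":   np.array([0.9, 0.1, 0.0, 0.0]),
    "dog":   np.array([0.8, 0.2, 0.1, 0.0]),
    "truck": np.array([0.0, 0.1, 0.9, 0.3]),
    # "bus" was never a training target, but it lives in the same vector
    # space, so it is still reachable at test time (zero-shot transfer).
    "bus":   np.array([0.1, 0.0, 0.8, 0.4]),
}

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def classify(predicted_embedding, embeddings):
    # Pick the label whose embedding is nearest (by cosine similarity)
    # to the vector the network produced.
    return max(embeddings, key=lambda w: cosine(predicted_embedding, embeddings[w]))

# Stand-in for the CNN's output on an image of a bus: a model trained only
# on cat/dog/truck images with embedding-regression targets could still
# emit a vector near the "bus" embedding.
pred = np.array([0.05, 0.05, 0.82, 0.38])
print(classify(pred, label_embeddings))  # → bus
```

The key design point is that the classifier head is replaced by a regression into a fixed semantic space, so adding a new class costs only a new word vector, not retraining.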

ACM-SRC Semi-Finalist: no

