Embeddings Are All You Need: Transfer Learning in Convolutional Neural Networks Using Word Embeddings
ACM Student Research Competition: Graduate Poster
Time: Thursday, 18 November 2021, 8:30am - 5pm CST
Location: Second Floor Atrium
Description: Recent advances in efficient neural networks, together with relational learning that uses word embeddings as prediction targets for image classification, suggest that combining the two offers a promising route to efficient transfer learning. Because word embeddings are information-dense representations of language concepts in a vector space, projecting an image into that same space has been shown to enable relational operations between images similar to those possible with word embeddings. In this work, we extend this idea and show that training a neural network under this regime leads to transfer learning within the embeddings' vector space, giving the model an advantage when predicting classes of images it has not previously encountered. We further demonstrate the principle using a neural network architecture previously shown to be state-of-the-art for model efficiency, illustrating the applicability of these methods to lightweight machine learning.
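The training regime described above can be sketched in miniature. The following is a hypothetical illustration, not the authors' implementation: the class names, the synthetic "embeddings" and "images", and the linear head standing in for a CNN are all toy assumptions. It shows the core idea of regressing images onto the word embeddings of their labels and then classifying an image of a never-trained class by finding the nearest label embedding in the shared vector space.

```python
import numpy as np

rng = np.random.default_rng(0)
EMB_DIM, FEAT_DIM = 4, 16

# Toy "word embeddings" for seven labels; a real system would use pretrained
# vectors such as word2vec or GloVe. "zebra" is held out of training entirely.
names = ["cat", "dog", "car", "tree", "fish", "bird", "zebra"]
label_emb = {n: rng.standard_normal(EMB_DIM) for n in names}

# Toy "images": a fixed rendering matrix B maps a label embedding to feature
# space plus noise, standing in for the pixels a CNN would actually see.
B = rng.standard_normal((EMB_DIM, FEAT_DIM))
def make_image(label):
    return label_emb[label] @ B + 0.1 * rng.standard_normal(FEAT_DIM)

# "Train" a linear head W (the stand-in for a CNN) by least squares so that
# image @ W approximates the word embedding of the image's label. Only six of
# the seven classes ever appear in the training set.
train_labels = [n for n in names if n != "zebra"] * 50
X = np.stack([make_image(lbl) for lbl in train_labels])
Y = np.stack([label_emb[lbl] for lbl in train_labels])
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def predict(image, candidates):
    """Project the image into embedding space; return the nearest label."""
    z = image @ W
    return min(candidates, key=lambda c: np.linalg.norm(label_emb[c] - z))

# Zero-shot-style prediction: the model never saw a "zebra" image, but the
# word embedding for "zebra" locates the class in the shared vector space.
print(predict(make_image("zebra"), names))
```

Because the prediction target is a point in embedding space rather than a softmax over a fixed label set, adding a new class costs nothing at training time: any label with a word embedding is a valid nearest-neighbor candidate at inference.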