IBM Research | Reprogramming Large Models with Limited Resources | Uploaded October 2021 | Updated October 2024.
Abstract: This talk introduces model reprogramming, a novel method that enables cross-domain, data-limited transfer learning with a pre-trained model. In recent years, foundation models have prevailed in many machine learning applications, such as computer vision and natural language processing. A foundation model is a high-capacity neural network pre-trained on large-scale datasets, and it can be efficiently fine-tuned to solve downstream tasks. We demonstrate how to reprogram foundation models for image, text, and speech to solve low-resource tasks in the medical, molecular, and time-series domains. Finally, we provide a theoretical justification for the success of model reprogramming.
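The abstract describes reprogramming at a high level: a frozen pre-trained model is repurposed by learning an input transformation and a fixed output-label mapping, with no update to the model's weights. Below is a minimal, hypothetical sketch of that idea in NumPy. The "pretrained" model is a stand-in random linear classifier, and all names (`reprogram`, `label_map`, the 32x32/8x8 sizes) are illustrative assumptions, not the talk's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen foundation model: a fixed random linear
# classifier over 10 "source" classes on 32x32 inputs. In practice
# this would be a large pre-trained network whose weights stay frozen.
W = rng.standard_normal((10, 32 * 32))

def pretrained_source_model(x):
    """Frozen source model: returns logits over the 10 source classes."""
    return W @ x.ravel()

def reprogram(x_target, program):
    """Embed a small 8x8 target-domain input into the 32x32 source input
    space, surrounded by a trainable perturbation (the 'program')."""
    x = program.copy()            # trainable border, shape (32, 32)
    x[12:20, 12:20] = x_target    # place the target sample in the center
    return x

# Output mapping: partition the 10 source labels across 2 target classes.
label_map = {0: [0, 1, 2, 3, 4], 1: [5, 6, 7, 8, 9]}

def predict_target(x_target, program):
    """Target-task prediction: reprogram the input, run the frozen model,
    and aggregate source logits per target class via the label map."""
    logits = pretrained_source_model(reprogram(x_target, program))
    scores = [logits[idx].mean() for idx in label_map.values()]
    return int(np.argmax(scores))

# In model reprogramming, only `program` would be optimized (e.g. by
# gradient descent on target-task loss); here it is left at zero.
program = np.zeros((32, 32))
x_sample = rng.standard_normal((8, 8))   # a dummy target-domain sample
print(predict_target(x_sample, program))
```

The key point the sketch illustrates is the parameter budget: only the input "program" and the label mapping are task-specific, which is why reprogramming suits low-resource target domains.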




Bio: Dr. Pin-Yu Chen is a research staff member at the IBM Thomas J. Watson Research Center, Yorktown Heights, NY, USA. He is also the chief scientist of the RPI-IBM AI Research Collaboration and PI of ongoing MIT-IBM Watson AI Lab projects. Dr. Chen received his Ph.D. degree in electrical engineering and computer science from the University of Michigan, Ann Arbor, USA, in 2016. His recent research focuses on adversarial machine learning and the robustness of neural networks, and his long-term research vision is building trustworthy machine learning systems. At IBM Research, he received the honor of IBM Master Inventor and several research accomplishment awards, including an IBM Corporate Technical Award in 2021. His research contributes to IBM open-source libraries including the Adversarial Robustness Toolbox (ART 360) and AI Explainability 360 (AIX 360). He has published more than 40 papers on trustworthy machine learning at major AI and machine learning conferences, given tutorials at AAAI'22, IJCAI'21, CVPR('20,'21), ECCV'20, ICASSP'20, KDD'19, and Big Data'18, and organized several workshops on adversarial machine learning. He received a NeurIPS 2017 Best Reviewer Award and the IEEE GLOBECOM 2010 GOLD Best Paper Award.