ML 005: Transfer Learning for NLP with Daniel Svoboda

Adventures in Machine Learning - A podcast by Charles M Wood - Thursdays

One of the hottest fields in machine learning right now is natural language processing. Whether it's extracting sentiment from tweets, summarizing your documents, detecting sarcasm, or predicting stock trends from the news, NLP is definitely the wave of the future. Special guest Daniel Svoboda talks about transfer learning and the latest developments, such as BERT, that promise to revolutionize NLP even further. A rough code sketch of the fine-tuning workflow follows the show notes below.

Sponsors
Machine Learning for Software Engineers by Educative.io
Audible.com
CacheFly

Panel
Charles Max Wood
Gant Laborde

Guest
Daniel Svoboda

Links
towardsdatascience.com/bert-explained-state-of-the-art-language-model-for-nlp
ai.googleblog.com/2018/11/open-sourcing-bert-state-of-art-pre.html
ai.googleblog.com/2017/08/transformer-novel-neural-network.html
www.nltk.org
spacy.io
https://www.kaggle.com

Picks
Charles Max Wood: Traffic Secrets: The Underground Playbook for Filling Your Websites and Funnels with Your Dream Customers; Range: Why Generalists Triumph in a Specialized World
Gant Laborde: AI-QuickStartGuide.pdf
Daniel Svoboda: Star Trek: Picard

Follow Adventures in Machine Learning on Twitter: @podcast_ml
Advertising Inquiries: https://redcircle.com/brands
Privacy & Opt-Out: https://redcircle.com/privacy
Become a supporter of this podcast: https://www.spreaker.com/podcast/adventures-in-machine-learning--6102041/support
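As a minimal, non-authoritative illustration of the transfer-learning idea discussed in the episode (take a BERT model pre-trained on general text and fine-tune it on a small labeled dataset, such as tweet sentiment), here is a short sketch. It assumes the Hugging Face transformers and PyTorch packages, which are not mentioned in the episode; the model name, toy data, and hyperparameters are purely illustrative.

```python
# Sketch of transfer learning with BERT for sentiment classification.
# Assumes `transformers` and `torch` are installed (not from the episode);
# data and hyperparameters are illustrative only.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Load weights pre-trained on general-purpose text (the "transfer" source task).
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# A tiny toy dataset standing in for tweets labeled positive (1) / negative (0).
texts = ["I love this product!", "This was a terrible experience."]
labels = torch.tensor([1, 0])

# Fine-tune briefly: the pre-trained weights already encode general language
# knowledge, so relatively few labeled examples and epochs are needed.
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):  # a few passes over the toy batch
    outputs = model(**batch, labels=labels)
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Use the fine-tuned model for prediction on new text.
model.eval()
with torch.no_grad():
    logits = model(**tokenizer(["Great episode!"], return_tensors="pt")).logits
print("positive" if logits.argmax(-1).item() == 1 else "negative")
```

In practice the same pattern scales up by swapping the toy batch for a real labeled dataset and a proper training loop; the key point of transfer learning is that the heavy lifting was already done during pre-training.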
