COSCUP x RubyConf TW 2021


mT5: A Massively Multilingual Pre-trained Model
August 1, 2021, 11:30–12:00 (Asia/Taipei), TR409-1
Language: Mandarin Chinese


Talk length

30 minutes

English talk abstract

Google has released mT5, a multilingual variant of its T5 model. Because it is trained on the mC4 corpus, mT5 covers a much larger dataset and can understand more than 100 languages. As a text-to-text transformer, it is a powerful model that can handle a wide range of NLP tasks.
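To illustrate the text-to-text idea behind T5 and mT5, here is a minimal sketch in plain Python: every NLP task is framed as mapping an input string to an output string, usually with a task prefix. The prefixes and targets below are illustrative examples, not the exact ones used in the papers.

```python
# Sketch of the text-to-text framing used by T5/mT5: every task,
# whether translation, summarization, or classification, becomes
# (input string -> output string). Task prefixes here are examples.

def to_text_to_text(task: str, text: str) -> str:
    """Prepend a task prefix so one model can serve many tasks."""
    return f"{task}: {text}"

# Hypothetical (input, target) training pairs for three tasks.
examples = [
    (to_text_to_text("translate English to German", "Hello"), "Hallo"),
    (to_text_to_text("summarize", "A very long article ..."), "A short summary."),
    (to_text_to_text("sentiment", "I love this movie"), "positive"),
]

for model_input, target in examples:
    print(model_input, "->", target)
```

Because each task is just strings in and strings out, a single pre-trained model such as mT5 can be fine-tuned on all of them with one architecture and one loss, across all the languages present in mC4.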

Do you acknowledge and agree that remote presentations require a pre-recorded video (the conference can only accept your submission if you agree)? – True

Difficulty level

Intermediate

English talk title

A massively multilingual pre-trained text-to-text transformer model

Talk abstract

An introduction to Multilingual T5 (mT5), the language model recently released by Google. Its training data comes from mC4, so the model combines large-scale data with the ability to understand more than 100 languages. It is also a text-to-text transformer model, so it can be applied to many natural language understanding (NLU) tasks, making it a newer approach that is more powerful and mature than BERT.

Slido URL

https://app.sli.do/event/k8r55xyn

HackMD URL

https://hackmd.io/@coscup/ryH3GaPCu/%2F%40coscup%2FSJXoGTP0O

Target audience

Machine learning engineers, software engineers, data scientists, NLP engineers

Speaker's company or organization

Asia Pacific Machine Intelligence Company (亞太智能機器)

Speaker's community

TensorFlow User Group Taipei

Additional materials

https://hiskio.com/packages/XnyD1E2Ra