Tool
ToxiGen
Toxic language detection systems often falsely flag text that mentions minority groups as toxic, since those groups are frequently the targets of online hate. Such over-reliance on spurious correlations also causes systems to struggle…
Project
GODEL: Large-Scale Pre-training for Goal-Directed Dialog
This is the home page of project GODEL (Grounded Open Dialogue Language Model), a large open-source pre-trained language model for dialog. In contrast with its predecessor DialoGPT, GODEL leverages a new phase of grounded pre-training designed to…