The BigScience ROOTS Corpus: A 1.6TB composite multilingual dataset

Hugo Laurençon, Lucile Saulnier, Thomas Wang, Christopher Akiki, Albert Villanova del Moral, Teven Le Scao, Leandro von Werra, Chenghao Mou, Eduardo González Ponferrada, Huu Nguyen, Jörg Frohberg, Mario Šaško, Quentin Lhoest, Angelina McMillan-Major, Gérard Dupont, Stella Biderman, Anna Rogers, Loubna Ben Allal, Francesco de Toni, Giada Pistilli, Olivier Nguyen, Somaieh Nikpoor, Maraim Masoud, Pierre Colombo, Javier de la Rosa, Paulo Villegas, Tristan Thrush, Shayne Longpre, Sebastian Nagel, Leon Weber, Manuel Romero Muñoz, Jian Zhu, Daniel van Strien, Zaid Alyafeai, Khalid Almubarak, Vu Minh Chien, Itziar Gonzalez-Dios, Aitor Soroa, Kyle Lo, Manan Dey, Pedro Ortiz Suarez, Aaron Gokaslan, Shamik Bose, David Ifeoluwa Adelani, Long Phan, Hieu Tran, Ian Yu, Suhas Pai, Jenny Chim, Violette Lepercq, Suzana Ilić, Margaret Mitchell, Sasha Luccioni, Yacine Jernite

Research output: Chapter in Book/Conference paper › Conference paper › peer-review


As language models grow ever larger, the need for large-scale high-quality text datasets has never been more pressing, especially in multilingual settings. The BigScience workshop, a 1-year international and multidisciplinary initiative, was formed with the goal of researching and training large language models as a values-driven undertaking, putting issues of ethics, harm, and governance in the foreground. This paper documents the data creation and curation efforts undertaken by BigScience to assemble the Responsible Open-science Open-collaboration Text Sources (ROOTS) corpus, a 1.6TB dataset spanning 59 languages that was used to train the 176-billion-parameter BigScience Large Open-science Open-access Multilingual (BLOOM) language model. We further release a large initial subset of the corpus and analyses thereof, and hope to empower large-scale monolingual and multilingual modeling projects with both the data and the processing tools, as well as stimulate research around this large multilingual corpus.
Original language: English
Title of host publication: Advances in Neural Information Processing Systems 35 (NeurIPS 2022)
Place of publication: New Orleans, United States
Publisher: Neural Information Processing Systems Foundation, Inc.
ISBN (Electronic): 9781713871088
Publication status: Published - 2022
Event: 36th Conference on Neural Information Processing Systems, New Orleans Convention Center & Virtual, New Orleans, United States
Duration: 28 Nov 2022 – 8 Dec 2022


Conference: 36th Conference on Neural Information Processing Systems
Abbreviated title: NeurIPS 2022
Country/Territory: United States
City: New Orleans


