ML Reproducibility Challenge Spring 2021

The official deadline for the challenge has been extended to July 20th!

Info and guidelines for the ML Reproducibility Challenge (MLRC) 2021 Spring Edition

The spring edition of the Papers with Code reproducibility challenge extends the challenge that started in 2020. If you'd like to reproduce papers from 2021 conferences, please send an email to reproduce@dagshub.com.

This page is inspired by the support from W&B for the reproducibility challenge. All compensation below is in addition to the compensation they offer, so you can use both DAGsHub and W&B for this challenge.

DAGsHub ❤️ Reproducibility – Supporting the challenge

Machine learning reproducibility is what originally got us started with DAGsHub, and we care deeply about promoting it. When research is reproducible, everyone benefits – that is why the Papers with Code reproducibility challenge is such an important and positive effort. We're excited to support participants taking on this challenge, with the hope that it will encourage more teams to participate and move the field forward.

How are you supporting the challenge?

By helping make all papers from NeurIPS, ICML, ICLR, ACL, EMNLP, CVPR, and ECCV reproducible in a verifiable and reliable way, with the code, data, models, and experiments tracked on DAGsHub.

We've created a dedicated channel for this challenge on our Discord community.

To incentivize community members to spend time on this challenge – and because many papers require expensive compute resources to reproduce – we're offering participants $500 per reproduced paper that meets the guidelines below.

What you get

  • Everlasting glory of having reproduced a real scientific paper's results
  • A great project to showcase on your résumé
  • $500 per paper reproduced according to the below instructions

How you can participate

See Reproducibility Challenge for more details.

  • Use the official list to claim an accepted paper from the supported conferences. Check the “Claimed” tab to make sure the paper hasn't already been claimed.
  • Join our Discord community's #ml-reproducibility channel.
  • Use existing code released by the authors, or create a minimal implementation, to verify the main claims and results made by the paper.
  • Use DAGsHub to track your code, data, models, and experiments.
    • You should use DAGsHub storage to host your data, models, and artifacts so that everything is open source and reproducible. If you need a large amount of storage (over 10GB), reach out to us first so we can accommodate you.
  • Write a summary of your findings and ablation studies as a DAGsHub Wiki page (it's simple markdown). Please address the following topics:
    • Was the paper reproducible?
    • Did you uncover any key insights?
    • (If applicable) Describe the dataset – source, data type, distribution, etc.
    • (If applicable) Describe the data processing method.
    • Describe the model – architecture, performance, etc.
  • Ensure you meet the ML Reproducibility checklist criteria.
  • Submit your findings to the following places:
  • We will contact you to review your DAGsHub submission.
    • Papers accepted by the challenge review qualify automatically
    • Papers not accepted might qualify at DAGsHub's discretion
  • Selected authors will have their work featured on the DAGsHub community and homepage.
  • Authors will also be invited to share their work at a dedicated event.
  • The award is limited to up to 2 papers per individual.
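
The tracking step above can be sketched in plain Python. This is a minimal, stdlib-only illustration of recording run parameters and metrics as files that Git (and therefore DAGsHub) can version; the file names `params.json` and `metrics.csv` and the `log_run` helper are illustrative conventions only, not part of DAGsHub's API or the challenge requirements.

```python
import csv
import json
from pathlib import Path

def log_run(run_dir, params, metrics):
    """Write run parameters and metrics to version-controllable files.

    Committing these files alongside the code lets anyone re-run the
    experiment with the exact configuration that produced the results.
    """
    out = Path(run_dir)
    out.mkdir(parents=True, exist_ok=True)

    # Parameters: one JSON file per run (hypothetical convention).
    (out / "params.json").write_text(json.dumps(params, indent=2))

    # Metrics: a flat, diff-friendly CSV.
    with open(out / "metrics.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["metric", "value"])
        for name, value in metrics.items():
            writer.writerow([name, value])

# Example: record the configuration and results of one reproduction run.
log_run(
    "experiments/run-001",
    params={"seed": 42, "lr": 3e-4, "epochs": 10},
    metrics={"accuracy": 0.913, "loss": 0.27},
)
```

Committing `experiments/run-001/` with each attempt gives reviewers a concrete, replayable record of which settings produced which numbers.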

Timeline for the challenge

See Reproducibility Challenge for more details.

  • Challenge Starts: April 15th, 2021
  • Final submission deadline: July 20th, 2021
    • You can submit your report at any time before the deadline.
    • Since writing the report takes time and you might want to refine your results, final submissions to DAGsHub will be accepted up to two weeks after the challenge deadline (August 5th, 2021).
  • Authors notified for compensation: October 5th, 2021 (updated from September 30th, 2021)

Receiving the award

We will award every participant who meets the guidelines above $500 towards computing costs. Exact details regarding the form and method of compensation will be communicated to eligible participants.

Working with DAGsHub

DAGsHub is the GitHub for machine learning: a place where data scientists can work in teams and host their project components – code, data, models, experiments, and pipelines – under one roof. Utilizing DAGsHub's capabilities makes results easy to reproduce.

Get Started

  • If you're not a DAGsHub user yet, start by signing up for free.
  • Join our Discord community's #ml-reproducibility channel.
  • Choose the paper you want to reproduce and let the games begin!

Our team is waiting to assist you in any way needed. Good luck!
