How to setup paper matching by calculating affinity scores and conflicts

Setup Matching

Setup Matching is the first step needed to make assignments between Senior Area Chairs, Area Chairs, Reviewers, and submissions. Once this step is completed, you can run the matching by following the instructions in How to run a paper matching.

You can calculate affinity scores and conflicts for your venue using OpenReview's 'Paper Matching Setup' feature. Paper Matching Setup is enabled for any venue and the button is activated once the Submission Deadline has passed. This feature allows Program Chairs to compute or upload affinity scores and/or compute conflicts.

Calculating affinity scores can be a lengthy process depending on the size of your venue. Therefore, only one setup matching can be run at a time.

You can find the 'Paper Matching Setup' button on your venue request form. The button will become available after the submission deadline.

Clicking it will bring up the Paper Matching Setup form. The 'Matching Group' is a dropdown menu of the groups you can use in the matcher (Reviewers, Area Chairs, Senior Area Chairs), depending on which ones you selected for your venue. You can select whether you would like affinity scores and/or conflicts computed. Alternatively, you can compute and upload your own affinity scores using the OpenReview expertise API.

Selecting Senior Area Chairs as the Matching Group

Senior Area Chair assignments can be done in two different ways (specified in the venue request form):

  • Senior Area Chairs are assigned submissions directly: This will compute affinity scores and conflicts between Senior Area Chairs and submissions.

  • Assignment to submissions through Area Chairs: Senior Area Chairs are assigned to Area Chairs, and the submissions assigned to their Area Chairs are then assigned to them. For this reason, when assigning Area Chairs to submissions, the corresponding Senior Area Chair conflicts need to be transferred to the Area Chairs. This guarantees that there are no conflicts between the submission and the assigned Area Chair and Senior Area Chair. Selecting this option will compute affinity scores between Senior Area Chairs and Area Chairs, and conflicts between Senior Area Chairs and submissions. It is also required that the assignment between Senior Area Chairs and Area Chairs be done before starting the matching between Area Chairs and submissions.

Selecting Area Chairs as the Matching Group

This will compute affinity scores and conflicts between Area Chairs and submissions.

If your venue has Senior Area Chairs and you are performing Senior Area Chair assignments based on their Area Chairs, make sure that the Area Chairs already have assigned Senior Area Chairs. This is needed so that the Senior Area Chair conflicts are transferred to the Area Chairs too. Refer to the Selecting Senior Area Chairs as the Matching Group section for more information.

Selecting Reviewers as the Matching Group

This will compute affinity scores and conflicts between Reviewers and submissions.

Conflict Detection Policy

Conflict detection uses information from each user's Profile and from the coauthors of their publications in OpenReview, as long as those publications are publicly visible. Therefore, the more complete and accurate the Profile information is, the better the conflict detection.

The sections of the Profile used for conflict detection are the Emails section, the Education & Career History section, and the Advisors, Relations & Conflicts section.

Another parameter that can be controlled is the number of years to consider when looking for conflicts. For example, two users may both have worked at Company C at some point: User A worked at Company C ten years ago, and User B just started working there. If the number of years is set to 5, no conflict will be detected between User A and User B, because only the history, relations, and publications from the past 5 years are taken into consideration. By default, all relations, history, and publications are taken into consideration for conflict detection.

Since many users use email providers such as gmail.com, a list of common email domains is used to filter these out before conflicts are computed.

There are two policies when computing conflicts: Default and NeurIPS.

Default Information Extraction Policy

  1. Uses the domains and computes subdomains from the Education & Career History section.

  2. Uses the domains and computes subdomains from the emails listed in the Advisors, Relations & Conflicts section.

  3. Uses the domains and computes subdomains from the emails listed in the Emails section.

  4. Uses the publication ids in OpenReview that the user authored.

NeurIPS Information Extraction Policy

Note that emails in a user's Profile do not have a date range indicating when they were valid. The NeurIPS policy addresses this issue.

  1. Uses the domains and computes subdomains from the Education & Career History section. All intern positions are ignored.

  2. Uses the domains and computes subdomains from the emails listed in the Advisors, Relations & Conflicts section, if the relation is that of a Coworker or a Coauthor.

  3. Uses the domains and computes subdomains from the emails listed in the Emails section, if and only if no domains were extracted from the Education & Career History and Advisors, Relations & Conflicts sections.

  4. Uses the publication ids in OpenReview that the user authored.

Conflict Detection

Depending on the value you selected for Compute Conflicts N Years, part or all of the Profile information will be considered when computing conflicts. For example, if you use the value 5, only the most recent 5 years of the Profile's data will be used to compute conflicts.

Once all the information is extracted from the users' Profiles, the following rules apply to find a conflict between User A and User B:

  • If any of the domains/subdomains from the Education & Career History section of User A matches at least one of the domains/subdomains of the same section of User B, then a conflict is detected.

  • If any of the domains/subdomains from the Advisors, Relations & Conflicts section or the Emails section of User A matches at least one of the domains/subdomains of the same sections of User B, then a conflict is detected.

  • If any of the publications of User A is the same as one of the publications of User B (in other words, if User A and User B are coauthors), then a conflict is detected, as illustrated in the sketch below.
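
The following Python sketch illustrates these rules. The data structures are hypothetical stand-ins for the domains and publication IDs extracted from each user's Profile and publications; this is not OpenReview's actual implementation.

```python
# Hypothetical extracted profile data for two users. In practice these
# domains and publication IDs come from the users' OpenReview Profiles
# and their publicly visible publications.
user_a = {
    'history_domains': {'umass.edu', 'cs.umass.edu'},
    'relation_domains': {'advisor-university.edu'},
    'email_domains': {'umass.edu'},
    'publications': {'publication_id_1', 'publication_id_2'},
}
user_b = {
    'history_domains': {'mit.edu'},
    'relation_domains': set(),
    'email_domains': {'mit.edu'},
    'publications': {'publication_id_2'},
}

def has_conflict(a, b):
    # Rule 1: matching domains/subdomains from Education & Career History.
    if a['history_domains'] & b['history_domains']:
        return True
    # Rule 2: matching domains/subdomains from the Advisors, Relations &
    # Conflicts section or the Emails section.
    if (a['relation_domains'] | a['email_domains']) & (b['relation_domains'] | b['email_domains']):
        return True
    # Rule 3: a shared publication, i.e. the users are coauthors.
    if a['publications'] & b['publications']:
        return True
    return False

print(has_conflict(user_a, user_b))  # True: both users authored publication_id_2
```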

Compute Affinity Scores

OpenReview has different models available to compute affinity scores between users and submissions. The current available models are:

  • specter+mfr

  • specter2

  • scincl

  • specter+scincl

If you want to learn more about the models, you can find them in OpenReview's open source expertise repository.

You may also choose to upload your own scores by selecting 'No'. Directions for how to format uploaded scores are in the description of the field in the Paper Matching Setup.
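
If you upload scores, they are usually provided as a CSV with one row per submission/user pair. The column layout below is an assumption for illustration; the field description in the Paper Matching Setup form is the authoritative reference.

```python
import csv

# Hypothetical affinity scores keyed by (submission ID, user profile ID).
# The assumed row format is: submission ID, user ID, score.
scores = {
    ('abCdEfGh12', '~First_Reviewer1'): 0.87,
    ('abCdEfGh12', '~Second_Reviewer1'): 0.42,
}

with open('affinity_scores.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    for (submission_id, user_id), score in scores.items():
        writer.writerow([submission_id, user_id, score])
```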

Troubleshoot Paper Matching

Running the Paper Matching Setup should output a comment on your venue request page. If any members are missing profiles or publications, the message will identify them and say: 'Affinity scores and/or conflicts could not be computed for these users. Please ask these users to sign up in OpenReview and upload their papers. Alternatively, you can remove these users from the Reviewers group.' This message does not mean that the process failed, only that those members were excluded from the calculations. You have two options:

  1. Remove reviewers without profiles from the reviewers group.

  2. Remind the reviewers that they need OpenReview profiles and wait for them to create them. You can run the Paper Matching Setup as many times as needed until all users have complete profiles.

Note that when a reviewer creates a profile, their email address will not be automatically updated to their profile ID in the reviewers' group. The matcher will still detect email addresses as users without profiles, so any email addresses will either need to be removed or replaced with tilde IDs. This can be done automatically by re-running Paper Matching Setup.
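
For example, you can list the members of your reviewers group that are still email addresses (and therefore have no linked profile) with a short script. This is a minimal sketch assuming an API 2 venue; replace the credentials and the group ID with your own values.

```python
import openreview

# Connect as a Program Chair (API 2 venues; API 1 venues use openreview.Client
# with baseurl='https://api.openreview.net' instead).
client = openreview.api.OpenReviewClient(
    baseurl='https://api2.openreview.net',
    username='pc@venue.org',
    password='your_password'
)

reviewers = client.get_group('Your/Venue/ID/Reviewers')

# Profile IDs start with '~'; anything else is an email address without a
# linked OpenReview profile in this group.
no_profile = [member for member in reviewers.members if not member.startswith('~')]
print(no_profile)
```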

You can confirm that the affinity scores were computed by checking if an invitation for the scores was created: https://api.openreview.net/edges?invitation=your/venue/id/role/-/Affinity_Score. Next, you should be able to run a paper matching from the ‘Paper Assignments’ link in your PC console.
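
The same check can be done with the Python client. This is a minimal sketch assuming an API 2 venue; adjust the invitation ID to your venue ID and the role you ran the setup for (for example, Reviewers or Area_Chairs).

```python
import openreview

client = openreview.api.OpenReviewClient(
    baseurl='https://api2.openreview.net',
    username='pc@venue.org',
    password='your_password'
)

# Count the affinity score edges created by the Paper Matching Setup.
# A count greater than zero confirms that scores were computed.
count = client.get_edges_count(invitation='Your/Venue/ID/Reviewers/-/Affinity_Score')
print(f'{count} affinity score edges found')
```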

It is important that all Senior Area Chairs, Area Chairs, and Reviewers have a complete and updated Profile, including their publications. Users can import publications from DBLP by editing their Profile. This will improve the quality of the affinity scores and conflicts.