Researchfish Data Guide
Introduction
This guide is designed to serve as a practical handbook for accessing, understanding, and using Researchfish data. It provides a high-level overview of how data is stored and structured within the platform, along with guidance on how your organisation can view and access its data.
In later sections, we explore the data structure in more detail and outline ways you can begin to bring your data together effectively.
This is the first draft of the guide and will be updated based on internal review and your feedback. We hope you find it useful, and we welcome any suggestions for improvement. Please share your thoughts and let us know what additional content you’d like to see included by contacting us at rfclientsupport@elsevier.com.
Getting started
This section aims to highlight some important concepts about the structure and management of data in Researchfish.
The 15 Common Outcome Types
Researchfish includes 15 common outcome types, each with a standard set of questions. Researchers can create an outcome from these predefined options and attribute it to an award, signifying that the award's funding contributed to achieving that outcome. The 15 outcome types are:
• Publications
• Collaborations & Partnerships
• Further Funding
• Next Destination
• Engagement Activities
• Influence on Policy, Practice, Patients & the Public
• Research Tools & Methods
• Research Datasets, Databases & Models
• Intellectual Property & Licensing
• Medical Products, Interventions & Clinical Trials
• Artistic & Creative Products
• Software & Technical Products
• Spin Outs
• Awards & Recognition
• Use of Facilities & Resources
Previously, there was a 16th common outcome type called "Other Outputs/Outcomes." This option has now been removed. However, any data previously entered by award holders under this category will still be retained and available for download.
The common outcome types and the questions within each section are determined by the stakeholders that use the system and utilise the data collected within it. For many years, a dedicated committee known as the Researchfish Question Set Subgroup, composed of funders and Researchfish staff, governed the question set. However, the process for overseeing the question set has now changed to enable broader involvement. Each outcome type is reviewed on a rolling basis to ensure it remains relevant and effective. As part of this process, stakeholders are invited to nominate the outcome types they would like to be consulted on. This approach allows stakeholders to engage with the outcomes most important to them, while also enabling us to draw on specialist expertise tailored to each outcome type.
We regularly review the common outcome types to ensure that they continue to meet the evolving needs of our stakeholders and support an efficient data collection process. This includes improvements such as integrating data from other sources, incorporating unique identifiers, and linking to metadata where appropriate.
Stakeholders are encouraged to share feedback or suggestions for improvement at any time through their Customer Success Manager. For more information on these common outcomes, we recommend reviewing the Common Outcome Types Map with sub-types and the Common Outcome Overview.
Although these questions are designed to meet the needs of all clients, there is also the option of assigning one or more ‘Additional Questions’ to your awards. This is discussed in more detail in the following section.
Additional Questions (AQs)
An additional question enables a funder to collect information from award holders beyond the standard outcome categories. These can be made mandatory, preventing award holders from submitting unless they have responded.
These questions are customisable to each funder’s needs, provided they remain GDPR compliant, and are available at an additional cost. Organisations can create multiple AQ sections and assign them to specific awards. While there are no limits on how many AQ sections an award can have, we recommend balancing assignments to minimise the reporting burden.
There are also some Common Additional Questions that are used by multiple organisations. The purpose of these questions is to encourage consistent data collection across organisations, supporting benchmarking and data sharing, while also being mindful of the reporting burden placed on researchers. These common AQs are already set up within the system, meaning the questions are well tested and require minimal effort for an organisation to begin using them. Examples of current common AQs are Patient & Public Involvement, Animal Use in Research, and Academic Meetings & Conference Attendance.
Interoperability
Interoperability is the term used in Researchfish to describe the exchange of information between sources and platforms. We take information that is available from various data sources and pre-populate it in the platform for award holders. This minimises the reporting burden by saving the time they would otherwise spend entering these outcomes themselves.
Interoperability is implemented in multiple ways in the platform; the different types of data entry are outlined in the table below, which highlights the differing levels of researcher input needed:
Input type | Description |
Harvest | Outcome harvested from a 3rd-party source for the researcher and associated with a grant – no researcher input required |
RO Upload | Outcome harvested from the RO via its CRIS system and associated with a researcher and grant. No input required from the researcher in Researchfish, although input was likely required in the CRIS |
Personal Harvesting | Outcome harvested for a researcher using their personal details, e.g. ORCID, but they still have to attribute it to the grant |
Search | Researcher searches for the outcome from within Researchfish and all fields are populated |
Search Plus | Same as Search, plus the researcher needs to add supplementary information, e.g. the consortium amount of Further Funding |
Manual | Entirely manual entry, e.g. narrative impact |
We currently integrate with many different sources across many sections as shown in the table below:
Section | Source | Harvest | RO Load | Search | Metadata |
Publications | Crossref[1], PubMed[2], WoS[3], Scopus[4], NielsenISBN[5], ORCID[6], NASA/ADS[7], INSPIRE HEP[8], Datacite[9], EThOS[10], OpenAIRE[11], PlumX[12] | Y | Y | Y | Y |
Collaborations | RF Locations, ROR[13], GRID[14], Open Funder Registry[15] | Y | Y | ||
Further Funding | RF Locations, ROR, GRID, Open Funder Registry, RF Grants, ORCID | Y | Y | Y | |
Next Destination | RF Locations, ROR, GRID, Open Funder Registry | Y | Y | ||
Engagement | |||||
Policy Influence | |||||
Research Tools | Datacite, ORCID | Y | Y | Y | |
Datasets | Datacite, ORCID, OpenAIRE | Y | Y | Y | Y |
IP | Espacenet[16] | Y | Y | ||
Medical Products | ClinicalTrials.gov[17], ISRCTN[18], EudraCT/CTIS[19], WHO ICTRP[20] | Y | Y | ||
Artistic Products | Datacite, ORCID, OpenAIRE | Y | Y | Y | Y |
Software & Tech | Datacite, ORCID, OpenAIRE | Y | Y | Y | Y |
Spin Outs | OpenCorporates[21] | Y | Y | ||
Awards | |||||
Facilities | RF Locations, ROR, GRID | Y | Y |
[1] https://www.crossref.org/
[2] https://pubmed.ncbi.nlm.nih.gov/
[3] https://www.webofscience.com/
[4] https://www.elsevier.com/products/scopus
[5] https://nielsenisbnstore.com/
[6] https://orcid.org/
[7] https://ui.adsabs.harvard.edu/
[8] https://inspirehep.net/
[9] https://datacite.org/
[10] https://bl.iro.bl.uk/collections/e492dc4b-82d9-4f8c-bb0a-2cdd8a62105d?locale=en
[11] https://www.openaire.eu/
[12] https://www.elsevier.com/insights/metrics/plumx
[13] https://ror.org/
[14] https://www.grid.ac/
[15] https://www.crossref.org/services/funder-registry/
[16] https://worldwide.espacenet.com/
[17] https://clinicaltrials.gov/
[18] https://www.isrctn.com/
[19] https://www.clinicaltrialsregister.eu/ctr-search/search
[20] https://www.who.int/tools/clinical-trials-registry-platform
[21] https://opencorporates.com/
These lists are not exhaustive, and we continuously seek new systems to enhance data integration. Our goal is to help award holders focus their time on reporting outcomes that are not already publicly available and require more personalised input. However, our ability to integrate data is limited by what is available and what has persistent, unique identifiers. As the research environment evolves and more data becomes accessible in standardised formats, our capacity to reduce reporting burdens and enhance insights will grow — collaboration across the sector is essential to making this possible.
Data Ownership
It is essential for all stakeholders to understand who owns the data within the Researchfish platform at the different stages of the process, so that it is used in line with the ownership and the relevant governing principles.
Key Points on Data Ownership:
• Researchfish acts as a Data Processor for funders, who load their information on their awards into Researchfish and ask their researchers to report to them using Researchfish.
• Researchfish also acts as a Data Processor for individual researchers, who can add information on their outputs, outcomes and impacts and then submit the information to their funders using Researchfish.
• Funders control both their own data uploaded by them into Researchfish and information on outputs, outcomes, and impacts attributed to their awards by their researchers, including both live and submitted data. More detail on the difference between live and submitted data is available in the Submission Periods section (below).
When an award holder creates an account, they agree to the Terms of Use, allowing the funder to use the data for internal and external purposes. Standard terms of use are made available for organisations to use when their award holders are submitting their data; however, stakeholders can also use terms of use specific to their organisation and award holders.
Sharing of Award Data
Sharing with Research Organisations
When a new award is added to the platform, an administrator user from the organisation can set its Sharing Status. This specifies whether the award and its outcome data can be shared with each research organisation (RO) where the award is based (an award can be based at multiple research organisations). The sharing status can be set to one of two options:
Full Share
The research organisation (RO) hosting the award will have read-only access to its information:
• They can download award details for all awards based at their institution.
• ROs with a paid subscription can additionally view and download the outcomes attributed to those awards.
Don’t Share
• The award information remains private and is not accessible to the RO.
Sharing with Co-Funders
A single award can be funded by more than one funder. In Researchfish, the funder setting up an award has the option to add any co-funders as part of the award information. The funder setting the award up is the primary funder, and they can add multiple co-funders. Co-funders cannot manage the award details (e.g. changing the PI or setting a submission period), but they can access all of the information reported on the award.
Submission Periods
A submission period is a designated timeframe (e.g., one month) during which the funder requests that award holders submit their awards with the most up-to-date outcomes attributed to them. Many funders choose to align their submission periods, as researchers are often funded by multiple organisations. Submitting to all relevant funders at the same time helps to reduce the reporting burden on the research community.
As part of the submission process, award holders are presented with the Terms of Use for the funder(s) associated with their awards, which describe the purposes for which the information is being collected and how it may be reused. These terms must be accepted to complete the submission.
Submitting an award allows the researcher to confirm that everything reported is accurate and complete at that point in time. Once they submit, a copy of the data as it stands at that moment is made for longitudinal analysis.
Although the submitted data remains unchanged, some metadata, such as Open Access status, can be updated as required. You can find a detailed description of the difference between live and submitted data in the ‘Live and Submitted Data’ section of this guide.
Response codes
The submission requirements for an award are determined by its response code, as outlined below:
Response Code | Requirement | Description |
1 | Mandatory | A submission is required for this award in the current year. |
2 | Exempt for one year | No submission is expected this year because the award holder is on long-term leave (e.g. parental leave). |
3 | No further submissions required | No additional submissions are expected because the award holder has retired, is no longer active in research, or the grant ended more than five years ago. |
4 | Submission expected, but award holder has left the organisation | The award holder is expected to make a submission but is no longer at the research organisation. This grant does not contribute to the research organisation’s compliance statistics, and the organisation is not expected to follow up with the award holder. |
5 | Non-mandatory | Submission is optional for this award. |
TEST | Test award | These awards are not visible to the assigned location and appear as non-mandatory during a submission period. This status should be reserved solely for test awards and not routinely used. |
Accessing Your Data
There are multiple ways to download and explore your data. This section will first cover the available data download options, followed by an overview of the tools within the platform for exploring your data.
An additional option, offered at extra cost, is access to the Researchfish SQL reporting database in Snowflake, which contains your awards and outcomes data. A reporting database gives funder users fine-grained control over how they query the data and build their own result sets. It can also be added as a source to existing business intelligence systems, making it easy to combine data reported in Researchfish with other information held in your own systems (e.g. classifications) and to benefit from any investment you have made in those systems.
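For organisations that take up this option, the reporting database can be queried with standard SQL tools. Below is a minimal sketch using the Snowflake Python connector; the connection parameters and the table and column names (e.g. AWARDS, RO) are placeholders rather than the actual schema, which is provided as part of the service.

```python
# Minimal sketch of querying the Snowflake reporting database.
# All connection parameters and table/column names below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<your_account>",
    user="<your_user>",
    password="<your_password>",
    warehouse="<your_warehouse>",
    database="<your_database>",
    schema="<your_schema>",
)

try:
    cur = conn.cursor()
    # Hypothetical query: count awards per research organisation.
    cur.execute('SELECT "RO", COUNT(*) FROM AWARDS GROUP BY "RO"')
    for ro, n_awards in cur.fetchall():
        print(ro, n_awards)
finally:
    conn.close()
```

The same connection can be registered as a data source in most business intelligence tools, which is where this option adds the most value.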
Exports
You can download various Excel spreadsheets from the platform, each containing award or outcome data. To access these downloads, navigate to the Exports tab within the e-Val navigation pane.

The final data export depends on your selection of the available filters. Below are the key options you can configure:
Submission Period
This filter allows you to choose:
1. Live Data
The most up-to-date information in the system.
2. Submitted Data
Data captured at the time of submission for a specific submission period. The names of all your past submission periods will be listed.
3. Most Recently Submitted
The latest submission available for each award, regardless of the submission period.
What does "Most Recently Submitted" mean?
If Grant A and Grant B were submitted in 2023, but only Grant A was submitted in 2024, the Most Recently Submitted data would include Grant A’s 2024 submission and Grant B’s 2023 submission.
This feature allows for a more comprehensive dataset of submitted data.
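For illustration, the same selection logic can be reproduced from an export of submitted data. The sketch below assumes a spreadsheet of submitted award versions (hypothetical filename) that includes the Award Reference and Award Submission Date columns described in Appendix 1:

```python
# Reproduce the "Most Recently Submitted" selection with pandas:
# for each award, keep only its latest submitted version.
import pandas as pd

submissions = pd.read_excel("submitted_awards.xlsx")  # hypothetical filename
submissions["Award Submission Date"] = pd.to_datetime(
    submissions["Award Submission Date"]
)

# Sort so the newest submission comes last, then keep one row per award.
most_recent = (
    submissions.sort_values("Award Submission Date")
    .drop_duplicates("Award Reference", keep="last")
)
```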
Export Type
Choose from the following export options:
1. Award Details (RHT)
This selection allows you to download a complete list of your awards along with their associated details. The specific awards shown depend on the selected Submission Period filter:
• Live Data:
Shows all awards and their associated details.
• Submitted Data:
Displays submitted versions of each award and their details for the selected submission period. If an award was allocated to that submission period but not submitted, it will not appear here.
• Most Recently Submitted:
Shows the most recently submitted version of each award and its details. If an award has never been submitted, it will not appear in this view.
2. All Available Outcome Types
This selection allows you to download the details of specific outcomes or additional questions. The specific outcomes shown again depend on the selected Submission Period filter:
• Live Data:
Shows all entries for the selected outcome type that exist in your award holders’ portfolios.
• Submitted Data:
Shows only the outcomes that were submitted during the selected submission period.
• Most Recently Submitted:
Shows the most recently submitted version of each outcome.
For all export types except Award Details (RHT), an additional filter called Export Details will appear, offering three levels of detail:
• Entries with award details:
With this selection, the entries will be unique at the level of the award. This means that a single entry may be attributed to multiple different awards and will therefore be displayed on multiple different rows in the data. See the Attribution section of this guide for a more detailed explanation of attribution and uniqueness.
• Entries with award details and funder/research organisation-specific categories (e.g., nodes):
This option provides the same view as the previous selection - Entries with Award Details - but also includes additional award information that is specific to your organisation.
• Entries without any award details:
When this option is selected, awards are not included, and attribution to specific awards is not considered. As a result, any entry linked to multiple awards will appear as a single row in this view (see the sketch after this list).
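The difference between the first and third options is simply deduplication on the entry identifier. A small pandas sketch, assuming a publications export with award details saved under a hypothetical filename:

```python
# Relating the "with award details" and "without award details" views.
import pandas as pd

with_awards = pd.read_excel("publications_with_award_details.xlsx")

# "Entries with award details": one row per attribution, so an entry
# attributed to three awards appears on three rows.
n_attributions = len(with_awards)

# "Entries without any award details": one row per unique entry,
# regardless of how many awards it is attributed to.
without_awards = with_awards.drop_duplicates("pubid")
n_entries = len(without_awards)

print(n_attributions, n_entries)
```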
3. HRCS Classifications
The HRCS Classifications export is only available if the funding organisation that entered the award into the system has also provided these classifications; Researchfish does not determine the classification on the funder’s behalf. The Health Research Classification System (HRCS) classifies health research along two dimensions, as outlined below:
• The type of research activity being undertaken, from basic to applied
• The type of health or disease being studied
More information can be found at https://hrcsonline.net/
4. Location(s) or Institute(s)
This final filter allows you to specify a research organisation or institution, enabling you to view data for awards or outcomes associated with that location.
Additional Options for Downloading Award Details
In addition to the Exports tab, there are two other locations within Researchfish where you can download award details:
Award Details (RHT)
• This provides a complete list of your awards and their details.
To download:
- Navigate to e-Val → Awards
- Scroll to the bottom of the page and select Download Award Details (RHT). This creates an xlsx file containing the details of all of your awards.
- You can also download details of a specific award, or a group of awards, in docx format by first using the filters to find the relevant award(s). Then select the award(s) with the check boxes, use the “choose action” dropdown below the list of awards, and select “Download Data”. If multiple awards are downloaded, the files will be combined into a ZIP file.
Downloading Outcomes for a Specific Award
• In addition to using the “Download Data” option on the e-Val – Awards page, you can also download the outcomes linked to any award directly from its award details page.
• At the bottom of the Award Details pane, click ‘Download Award’ to generate a docx document containing the reported outcomes. You can select all of the output types, or only those of interest. If you are only interested in publications there is an additional option to download that as an xlsx file. Downloads are available for either the live data or a specific submitted version, based on the submission period. This is the same document that can be generated by researchers themselves.
Built-in Analytics in the Platform
This section goes beyond accessing raw data, highlighting the pre-built analytics available within Researchfish. These ready-made insights allow you to view, explore, and download key analyses to support your internal reporting.
We will first explore the Reports section, followed by Search (currently in Beta), a powerful data exploration tool. Finally, we will explore the Activity section, which provides a valuable overview of researchers' outcome attribution activity.
1. Reports
This section provides a quantitative analysis of the data reported by researchers. However, it is important to note that quantitative insights may be less meaningful when the number of awards or outputs is low or when there are significant qualitative differences in reported outcomes. For these reasons, results should always be interpreted with caution, considering the context and substance of the reported outputs.
To access the Reports section, navigate to the Reports tab in the e-Val navigation pane. The landing page provides an overview of the available reports. From here, you can either review the introductory sections or go directly to the analytics sections, starting with ‘Descriptives’.

Each report contains a variety of tables and charts, offering different insights into your data. These sections include:
• The guidance text researchers see when reporting,
• A table of contents for easy navigation, and
• Explanatory text detailing the graphs, charts, and the methodology behind them.
Customising Your Report
At the top of the page, three filters allow you to refine your view and tailor the report to your needs:
• Section Filter (Leftmost) – Select the specific section of interest, such as Publications.
• Research Organisation Filter – View analyses at the research organisation level, either focusing on a particular institution or comparing awards across multiple organisations.
• Data Selection Filter – Choose the dataset used for the reports:
- Most Recently Submitted (default) – The latest available submission for each award.
- Specific Submission Period – Data collected within a selected submission period.
Exporting your reports
At the top of the page, you have the option to download reports either as a single section or as the full report in a Word document. Additionally, you can export all underlying data from the reports as an xlsx file for further analysis.
2. Search (Beta)
The Search (Beta) feature is a powerful interactive tool that allows you to explore live data in a dynamic and responsive way. It serves as an excellent starting point for analysis and reference. To access this section, navigate to the Search (Beta) tab in the e-Val navigation pane.

Upon entering the Search (Beta) section, you will see four main datasets available for exploration:
• Awards
• Outcomes
• Organisations
• People
Clicking on any of these categories will display a summary of the relevant data. You can refine your results further using the filters on the right, which will adjust based on your selection. Clicking on an individual item within the summary will provide more detailed information.
The Search bar at the top functions similarly to a Google-style search for your data. You can enter any relevant search term and refine your search further by using the "Add a Field to Search" option that appears at the bottom of the search bar.
Searching Awards
1. Select "Awards" as the dataset to explore. This will display a summary view of all awards.
2. Enter a search term (e.g., cardiovascular) in the search bar at the top. This will display summaries of all awards containing that term in the award details, e.g. title, abstract, researcher name, etc.
3. Refine your results further using the available filters on the right.
Searching for Outcomes
1. Select "Outcomes" as the dataset to explore. This will display a summary view of all outcomes available to you.
2. Enter a search term (e.g., cardiovascular) in the search bar at the top. This will display summaries of all outcomes containing that term.
3. Refine your results further using the available filters on the right, e.g. only further funding from the private sector containing the search term (e.g. cardiovascular).
Activity
This is a great tool for tracking researcher activity over customisable time periods. It is especially useful for monitoring submission activity during an open submission period.
This section provides high-level statistics on:
• Number of attributions made, and
• Number of awards submitted.
In addition to these metrics, four interactive charts help visualise attribution and submission trends, making it easier to assess engagement and progress at a glance.
Understanding your data
Fundamental Concepts
Before conducting any analyses, it is important to understand some fundamental concepts that govern the way Researchfish data is structured. These include:
1. Attribution.
2. Live and submitted data.
3. Unique Identifiers in the data.
Attribution
There are fifteen common outcome types in Researchfish. An award holder can create an ‘entry’ or ‘outcome’ within one of these types and choose to connect or ‘attribute’ it to one or more awards.
In the Researchfish platform, the term ‘attribution’ is used to describe the link between an outcome and its funding source. When an award holder attributes an outcome to an award, it means that the funding—either fully or partially—has contributed to that outcome. This allows a single outcome to be connected to multiple funding sources, helping stakeholders appreciate the complex relationships between research funding and the resulting outcomes.
A single award can have multiple entries attributed to it. Likewise, a single entry can be attributed to multiple awards.

Due to this multi-attributive nature of awards and entries, the data available in your download of a specific outcome type can be non-unique. To understand what this means, consider the following rows of data, which contain a subset of the columns you will find in a download of any outcome type (with award details) and some anonymised data:
Agreement ID | Award Reference | Entry ID | pubid |
1234 | grant_a | W | ww |
4567 | grant_b | X | xx |
1234 | grant_a | Y | yy |
8910 | grant_c | X | xx |
1112 | grant_d | X | xx |
Key Points:
• There are 5 rows of data in the table – this means there are 5 unique attributions for this outcome type.
• There are 3 unique entries (W, X & Y) – this means that:
- 1 entry (X) is attributed to 3 awards (grant_b, grant_c & grant_d)
- 2 different entries (W & Y) are attributed to the same award (grant_a)
• There are 4 unique awards (grant_a, grant_b, grant_c & grant_d) – this means that 1 award (grant_a) appears twice because two different entries are attributed to it.
Based on this, in any download of any outcome type, we can deduce the following (computed in the sketch after this list):
1. Number of Unique Awards that have attributions - count of the number of unique values in the Agreement ID column.
2. Number of Unique Entries that are being attributed - count of the number of unique values in the pubid column. The pubid column does not exist for the collaborations outcome type or for additional questions. In this case, the Entry ID column can be used.
3. Number of Attributions for the specific outcome - count of the number of rows in the export.
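The three counts can be verified against the worked example above. A minimal pandas sketch reproducing the table and the deductions:

```python
# The example attribution table from above, with the three counts.
import pandas as pd

rows = pd.DataFrame({
    "Agreement ID": [1234, 4567, 1234, 8910, 1112],
    "Award Reference": ["grant_a", "grant_b", "grant_a", "grant_c", "grant_d"],
    "Entry ID": ["W", "X", "Y", "X", "X"],
    "pubid": ["ww", "xx", "yy", "xx", "xx"],
})

print(rows["Agreement ID"].nunique())  # 1. unique awards with attributions -> 4
print(rows["pubid"].nunique())         # 2. unique entries being attributed -> 3
print(len(rows))                       # 3. attributions for this outcome type -> 5
```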
Live and Submitted Data
The data that you download from Researchfish can be either live or submitted.
Live Data
All of the information that an award holder has in their portfolio and has attributed to the award that you are funding. The award holder can add or remove information at any time, so the data contained in downloads taken at different times can vary.
Submitted Data
During a submission period, an award holder reviews their award and all the attributed information before submitting. Once submitted, a complete copy of the award and all attributed outcomes is created and stored. A timestamp of the exact time of submission is added as a suffix to the unique identifiers of the award and the outcomes. This copy can only be changed if the award holder chooses to amend their submission and re-submit within the same active submission period. In this case, the most recent submission will replace any previous submitted versions. This is the data that you get when you opt to download the data from a specific submission period.
Note: Should I use Live or Submitted Data?
When analysing data, it is recommended to use submitted data, as it has been reviewed and certified by the award holder as the most relevant and accurate at the time of submission. Submitted data is:
• Fixed and unchangeable, ensuring consistency in reporting. However, some metadata, e.g. the open access status of a publication, may be refreshed.
• Validated by the award holder, making it a reliable source for analysis.
• Covered by the funder’s terms of use: at the point of submission, the award holder agrees to those terms and gives the funder permission to use the submitted data in line with them.
However, there are limitations to submitted data:
• It only includes awards submitted within a specific submission period.
• Not all award holders complete their submissions, meaning some awards may be entirely missing from the dataset.
Live data, on the other hand, includes all awards with the latest information currently attributed by award holders. While it provides a more comprehensive view, it:
• Can change at any time, making it less stable for reporting.
• Has not been explicitly certified by the award holder as a finalised version of their portfolio.
Ultimately, choose the dataset that best fits your needs, keeping in mind the advantages and limitations of both. Most importantly, maintain consistency in your choice to ensure reliable analysis and reporting.
Unique Identifiers
Whether live or submitted, the data available for download via the Exports tab can include:
1. Award Details,
2. Data submitted for each of the common outcome types and,
3. Additional questions that are unique to your organisation.
Each download is tailored to your organisation depending on customisable features such as categories and additional questions. It will also include a second tab called ‘Field Descriptions’, which describes what data each of the columns contains. However, we will highlight some important columns here that are essential to understanding your data.
Awards
Field | Live Data | Submitted Data |
Agreement ID | Unique identifier for an award allocated by RF | Additionally contains a suffix that is a 10-digit Unix timestamp of the exact time of submission |
Award Reference | Unique identifier for an award allocated by the funder | Same as live data |
RF UID | Unique identifier for a user account allocated by RF. Note that a single user can have multiple awards from multiple funders | Same as live data |
award holder ID | Unique identifier for an awardee allocated by the funder. This is typically unique to a person for each funder | Same as live data |
Outcomes
Field | Live Data | Submitted Data |
Agreement ID | Unique identifier for an award allocated by RF. This indicates the award that the outcome is attributed to | Additionally contains a timestamp suffix of the exact time of submission |
Entry ID | Unique identifier for an entry allocated by RF | Additionally contains a timestamp suffix of the exact time of submission |
pubid | Unique identifier for an entry allocated by RF | Same as live data |
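When combining live and submitted exports, it can be useful to separate the timestamp suffix from a submitted identifier so that records can be matched back to their live counterparts. The exact suffix format is not documented here, so the separator handling in this sketch is an assumption; all we rely on is the trailing 10-digit Unix timestamp:

```python
# Split a submitted identifier into its base ID and submission time.
# The separator handling is an assumption; only the 10-digit Unix
# timestamp suffix is documented.
import re
from datetime import datetime, timezone

def split_submitted_id(identifier: str):
    """Return (base_id, submission_datetime), or (identifier, None) if no
    trailing 10-digit Unix timestamp is found."""
    match = re.search(r"^(.+?)[_-]?(\d{10})$", str(identifier))
    if match:
        base, ts = match.groups()
        return base, datetime.fromtimestamp(int(ts), tz=timezone.utc)
    return identifier, None

# Hypothetical submitted Agreement ID:
print(split_submitted_id("1234_1706745600"))  # ('1234', 2024-02-01 00:00 UTC)
```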
Putting it all together
Using the knowledge we now have of the data structure in Researchfish, here’s how you can bring together some useful statistics on the outcomes of all your awards in one place. We will do this for live data, but it can just as easily be done for a specific submission period.
Step 1: In the exports tab, select live data and perform a separate download of the Award Details and each of the 15 outcome types. You will have 16 downloaded files in total.
Step 2: Use the award details file as your main spreadsheet. Copy and paste the data from each of your downloads as separate tabs in this main spreadsheet and name each tab after the respective outcome type. (You can ignore the Field Description tabs, although they are useful for reference if you need more detail on a specific column.)
Step 3: Convert the data in each tab into a Table. This is not strictly necessary but is recommended.
To convert to a table
• Click on any cell containing data and press Ctrl + A (Windows) or Command (⌘) + A (Mac). This selects all the data on the page.
• Then press Ctrl + T (Windows) or ⌘ + T (Mac). This will automatically convert your data to a table.
• Alternatively, select any cell containing data, then on Windows go to the Home tab and select “Format as Table” from the Styles group, or on Mac go to the Table tab and select “add new table”.
• In each case you will be asked to confirm the range of the table (the data that you would like to have in the table), whether it already includes headings for the data, and what visual style you would like to have applied.
• It is good practice to name your table, to make it easier to refer to the data in future. By default the name will be something like “Table_1”. Click within your table and select ‘Table Design’ in the menu at the top. To the far left you will see Table Name, where you can change the generic name to something more suitable. It can be easier to keep the “Table_” prefix to avoid any confusion in the future. Note that Excel table names cannot contain spaces or hyphens.
Step 4: In the first tab containing the award details, add a column at the end for each of the outcome types. In this column, we will add the number of attributions for each award (each row) for each of the outcome types.
• For example, let’s do this for the Artistic and Creative Products outcome type. The first column name can be set to Number of Creative Products
• If the tab containing the creative products data is named creative-products and you have named its table Table_creative_products:
In the first row of this column, add the formula:
=COUNTIF(Table_creative_products[Award Reference],[@[Award Reference]])
• Be sure to replace ‘Table_creative_products’ with the name that you have given your table containing the Artistic and Creative Products data. (The hyphen in the tab name becomes an underscore in the table name, because table names cannot contain hyphens.)
• Pressing enter should populate all the rows for that column within the awards table.
• Repeat this process for each of the other outcome types
At the end, you will have 15 additional columns each with a count of the number of attributions for each of your awards
Step 5: Lastly, we may also be interested in determining the total number of attributions and unique outcomes we have for each outcome type. To do this, let’s add another tab to your spreadsheet called Summary
1. Create a new table in this sheet with three columns:
Outcome Type | Number of attributed outcomes | Number of unique outcomes |
2. In the Outcome Type column, list the name of all the 15 outcome types, e.g. Publications
3. If you have named the table containing the publications data Table_publications:
In the Number of attributed outcomes column, enter the formula =COUNTA(Table_publications[Agreement ID])
• Be sure to replace ‘Table_publications’ with the name that you have given your table with the publications data.
4. In the Number of unique outcomes column, enter the formula
=COUNTA(UNIQUE(Table_publications[pubid]))
• Again, replace ‘Table_publications’ with your own table name. Note that the UNIQUE function requires Excel 365 or Excel 2021. For this final column, you will receive an error for any outcome type that does not contain the pubid field, e.g. collaborations; in those cases, replace ‘pubid’ with ‘Entry ID’.
You should now have a single spreadsheet with all your awards and outcomes in one place. The additional columns indicate the numbers of attributions both overall and at an award level. You also have a count of all the unique outcomes for all your awards.
You can now explore your data more easily in a single location with some useful high-level statistics.
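If you repeat this consolidation regularly, the same steps can be scripted. Below is a minimal pandas sketch of Steps 4 and 5, assuming each export has been saved locally under a hypothetical filename of your choosing; extend OUTCOME_FILES to cover all 15 outcome types:

```python
# Scripted version of Steps 4 and 5: per-award attribution counts plus a
# summary of attributions and unique outcomes per outcome type.
import pandas as pd

OUTCOME_FILES = {  # hypothetical filenames for the saved exports
    "Publications": "publications.xlsx",
    "Artistic & Creative Products": "creative-products.xlsx",
    # ... add the remaining outcome types
}

awards = pd.read_excel("awards.xlsx")  # the Award Details (RHT) export

summary_rows = []
for outcome_type, path in OUTCOME_FILES.items():
    entries = pd.read_excel(path)

    # Step 4 equivalent of COUNTIF: attributions per award.
    counts = entries.groupby("Award Reference").size()
    awards[f"Number of {outcome_type}"] = (
        awards["Award Reference"].map(counts).fillna(0).astype(int)
    )

    # Step 5 equivalents: total attributions and unique outcomes.
    # pubid identifies the underlying entry; fall back to Entry ID where
    # pubid does not exist (e.g. collaborations, additional questions).
    id_col = "pubid" if "pubid" in entries.columns else "Entry ID"
    summary_rows.append({
        "Outcome Type": outcome_type,
        "Number of attributed outcomes": len(entries),
        "Number of unique outcomes": entries[id_col].nunique(),
    })

summary = pd.DataFrame(summary_rows)
print(summary)
```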
Glossary & Acronyms
Additional Questions (AQs): questions/outcomes determined by funders, to be answered by their award holders, that are not included in the common outcome types. These can be tailored or personalised.
Attribution: the action of associating an outcome with an award, whereby the award holder confirms that the funding has contributed either wholly or partially to the realisation of that outcome, e.g. a publication, an influence on policy, an engagement activity, etc.
Award Holder: also known as the principal investigator (PI), the lead researcher of an award, who is responsible for completing the Submission Process.
Award Reference: an identifier assigned by the Funder to an award. This is different from the title of the project.
Award/Agreement: the grant of money given by a Funder for a research project or programme.
Delegate: a user invited by an award holder who can input and attribute outcomes to an award on their behalf. A delegate is not able to complete the submission process in the platform.
Outcome: something attributed to an award that has arisen either wholly or partially from the funding being reported on. Researchfish defines fifteen outcome types to be reported by award holders.
Reporting Node: a restriction tool that gives a department administrator access to the RF e-Val section while limiting them to viewing only the awards assigned to that node.
Research Organisation (RO): the physical location where the research is being conducted. This could be a Higher Education Institution or an independent research institute.
Respondent Hierarchy Table (RHT): a spreadsheet completed by a Funder to bulk add/edit award data for upload into Researchfish.
Submission Period (SP): also known as a ‘Reporting Period’ or ‘Data Collection Exercise’, a specific time window within the year, decided by the Funders, during which award holders are required to report on the outcomes of their research via the Researchfish platform.
Submission: a time-stamped copy of the data completed by an award holder so that their Funder can collect and analyse it (in line with the terms of use of the data stated by the relevant funder of the award). To complete a submission, the award holder must click the ‘Review and Submit’ button in the platform and accept their Funder’s T&Cs.
Team Member: an award holder invited by another award holder who can attribute outcomes to an award of theirs on their behalf, and vice versa. Team members can, and should, complete only their own submission process.
Terms and Conditions (T&Cs): terms defined by the funder and presented to the award holder during the submission process, which grant the funder permission to collect and analyse the data submitted by their award holders.
Appendices
Appendix 1 – Field Descriptions list (Publications)
Column Name | Description |
Funder ID | Internal Researchfish Identifier for Funder |
Agreement ID | Internal Researchfish Identifier for Award - Version |
Funder | Name of Funding Organisation |
Award Reference | Award Reference - Grant Reference for funding organisation |
Linked Agreement | Identifier for any parent awards |
Grant DOI | DOI identifier for this award |
Award Type | Type of award in Researchfish |
Title | Name of the funding award or project |
Award Discipline | Discipline of the research |
RO Location ID | Internal Researchfish Identifier for Research Organisation |
RO | Name of Research Organisation |
Centre Location ID | Internal Researchfish Identifier for Centre |
Centre | Name of the Centre where the award is based |
Metafunder ID | Internal Researchfish Identifier for External Funder at Centre |
Metafunder | Name of the External Funder at Centre |
Award Start Date | Date the award started |
Award End Date | Date the award ended |
Funding Value | Value of the funding |
Funding Currency | Currency of the funding for the award |
Converted Funding Value | Converted value of funding |
Converted Currency | Currency that the funding for the award has been converted into |
Funder Project URL | Funder populated web page for the project |
RF UID | Researchfish Internal Identifier for the award holder of the award |
award holder ID | Funder Identifier for the award holder of the award |
award holder ORCID | award holder ORCID as indicated by the award holder of the award |
award holder Title | This will be the title used to address the award holder when sending emails |
award holder Name | award holder First name(s) or initials |
award holder Surname | award holder Last Name |
award holder Email | Email of award holder as supplied by funder of the award |
Award Reporting Period ID | Internal Researchfish Identifier of the Reporting Period (or version) of the award |
Award Submission Date | Date the version of the award was submitted |
Response Code | Funder code to indicate current reporting expectations for the award |
Entry ID | Internal Researchfish Identifier for the entry - version |
Entry Insert Date | Date the entry was attributed to a particular award |
Type* | Type of publication |
PMID | PubMed Identifier |
DOI | Digital Object Identifier |
Author* | First Author Name |
Other Authors | Other Author(s) |
Publication* | Name of the publication (e.g. name of article or book) |
Journal* | Journal Name |
Volume | Volume |
Issue | Issue |
Pages | Pages |
Month | Publication month |
Year* | Publication year |
Expected Year* | Expected publication year |
Chapter Number | Number of the chapter |
Chapter Title* | Title of the chapter |
Chapter Author | First chapter author name |
Other Chapter Authors | Other chapter author(s) |
Chapter Pages | Pages of the chapter |
Edition | Edition |
Editor | First editor of the publication |
Other Editors | Other editor(s) of the publication |
Publisher | Publisher |
Place of Publication | Place of publication |
Conference Name | Name of conference |
ISBN | International Standard Book Number |
ISBN (Electronic) | International Standard Book Number (Electronic) |
ISSN (Print) | International Standard Serial Number (for the print edition of the serial) |
ISSN (Digital) | International Standard Serial Number (for the digital edition of the serial) |
ISSN (Linking) | International Standard Serial Number (to link the electronic and print editions of the serial) |
Publications URL | URL of the publication |
Web of Science ID | Accession Number (UT) in Web of Science |
Scopus Document ID | Scopus Document ID |
INSPIRE HEP ID | INSPIRE High Energy Physics ID |
NASA ADS Bibcode | NASA Astrophysics Data System ID |
Ethos | EThOS (UK Electronic Theses Online Service) ID |
ORCID Putcode | Current ORCID putcode of the output record |
arXiv Deposit ID | ArXiv deposit Identifier |
PubMed Central ID | PubMed Central Identifier |
Pre-loaded Publications Source | Origin of the output record if added automatically |
source | Current source of the output record |
pubid | Internal Researchfish Identifier for the publisher's ID of the output - (could be e.g. DOI, internal identifier) and relates to the source |
Autosync | Date the record was attempted to be refreshed from external sources |
Autosync Success | Date the record was successfully refreshed from external sources |
Original Source | Original source of the output record |
PMC Manuscript ID | PubMed Central Manuscript ID |
PM Article Acceptance Date | Article Acceptance Date in PubMed |
PM Article Availability Date | Article Availability Date in PubMed |
EPMC Is Item Open Access | Whether Europe PubMed Central considers the publication to be available under a CC-BY (attribution), or equivalent license |
EPMC Is In Europe PubMed Central | Whether the full text of the article is available in Europe PubMed Central |
EPMC Is In PubMed Central | Whether the full text of the article is available in PubMed Central |
EPMC Updated | Date the record was updated with Europe PubMed Central |
OpenAire Type | Type of publication in OpenAire |
OpenAire License Level | Type of publication license asserted in OpenAire |
OpenAire Hostname | Name of the OpenAire Repository |
OpenAire Fulltext URL | URL to the full text of the publication in the OpenAire Repository |
OpenAire Embargodate (YYYY-MM-DD) | The embargo date for the publication in the OpenAire Repository |
OpenAire Updated | Date the record was updated with OpenAire |
CrossRefOA Content Version | Version of the publisher content from Crossref |
CrossRefOA Embargo Delay | Number of days the publication was embargoed |
CrossRefOA Start Date | Date the record was updated with Crossref |
CrossRefOA VOR License URL | Publisher License for the Version of Record of the publication |
CrossRefOA Updated | Date the record was updated with Crossref |
CrossRefOA TDM License URL | Publisher License for Text and Data Mining of the publication |
CrossRefOA AM License URL | Publisher License for Author Manuscript of the publication |
Patent Citations | Number of citations to the publication in patent families indexed by and available in PlumX |
Policy Document Citations | Number of citations to the publication in policy documents indexed by Overton and available in PlumX |
PlumX URL | URL for the PlumX page containing details of all PlumX metrics and links to source documents |