From test runs
Constant experimentation and evaluation give you a clear picture of how your prompts or workflows are performing. To improve your application continuously, the learnings from your run reports must feed back into your iteration cycles. The platform therefore lets you curate datasets directly from test run reports.
To curate dataset entries from your test run report table, follow the steps below:
1. Navigate to the table of entries at the bottom of the run report. You will see the details for each entry, including its evaluation scores.
2. Search, filter, or directly select the entries you want to add to a dataset. These could be entries that are performing very well or that have poor scores on certain evaluators.
3. Once you have selected entries, click the Add to dataset button at the top of the table.
4. Select the dataset you want to add these entries to.
5. Map the columns that you want to add. Deselect those that are not relevant.
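The steps above amount to filtering report entries by an evaluator score and then mapping only the relevant columns into dataset rows. A minimal sketch of that logic, using hypothetical entry shapes and column names (not the platform's actual API):

```python
# Hypothetical run-report entries: each row carries the model I/O
# plus evaluator scores (names here are illustrative, not the platform's).
report_entries = [
    {"input": "What is 2+2?", "output": "4", "accuracy": 0.95, "latency_ms": 120},
    {"input": "Capital of France?", "output": "Lyon", "accuracy": 0.10, "latency_ms": 90},
    {"input": "Largest planet?", "output": "Jupiter", "accuracy": 0.92, "latency_ms": 110},
]

def curate(entries, score_key, threshold, column_map):
    """Select entries whose evaluator score meets the threshold,
    then keep only the mapped columns (deselecting the rest)."""
    selected = [e for e in entries if e[score_key] >= threshold]
    return [{dst: e[src] for src, dst in column_map.items()} for e in selected]

# Keep high-accuracy entries; map input/output into dataset columns
# and drop latency_ms as not relevant to the dataset.
dataset_rows = curate(
    report_entries,
    score_key="accuracy",
    threshold=0.9,
    column_map={"input": "input", "output": "expected_output"},
)
print(len(dataset_rows))  # 2 entries pass the 0.9 accuracy filter
```

The same `curate` call with a low threshold and an inverted comparison would instead collect the poorly scoring entries for failure analysis.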
Note: For comparison runs, to prevent duplicate entries, perform this curation from the tab view of each individual entity.