Critical questions you should ask your data labeling partner
Image Annotation Tools
1. Does the tool support all popular data formats and annotation types?
2. Does the tool have features for reviewing annotations and sharing feedback on labeling quality?
3. Does the tooling support analytics to quantify the quality of individual annotations and of the overall dataset?
4. Does the tool offer automation features to accelerate labeling tasks?
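Format support matters because annotation tools disagree on basic conventions: for example, COCO stores bounding boxes as `[x, y, width, height]` while Pascal VOC uses `[x_min, y_min, x_max, y_max]`. A minimal sketch of converting between the two (illustrative only, not tied to any particular vendor's tooling):

```python
def coco_to_voc(bbox):
    """Convert a COCO-style box [x_min, y_min, width, height]
    to a Pascal VOC-style box [x_min, y_min, x_max, y_max]."""
    x, y, w, h = bbox
    return [x, y, x + w, y + h]


def voc_to_coco(bbox):
    """Convert a Pascal VOC-style box back to COCO style."""
    x_min, y_min, x_max, y_max = bbox
    return [x_min, y_min, x_max - x_min, y_max - y_min]


# A 30x40 box anchored at (10, 20):
print(coco_to_voc([10, 20, 30, 40]))  # [10, 20, 40, 60]
```

If the partner's tool handles these conversions natively, your team avoids writing and maintaining glue code like this for every dataset exchange.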
Communication and Quality Control
1. How will our team communicate data labeling guidelines to your team?
2. How are changes in guidelines accounted for?
3. How should we perform QC to ensure accuracy of the output?
4. What standards are being used to measure quality? How will the quality metrics be shared with our team? What happens when quality measures aren’t met?
5. What processes should we and the service provider follow to maintain high-quality throughput?
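One widely used standard for quantifying label quality on classification tasks is inter-annotator agreement, such as Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal sketch (illustrative; production QC would typically use an established statistics library):

```python
from collections import Counter


def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa between two annotators labeling the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement rate and p_e is the agreement expected by chance
    given each annotator's label distribution.
    """
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: fraction of items both annotators labeled the same.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: product of each annotator's marginal label rates.
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)


print(cohens_kappa(["cat", "dog", "cat", "dog"],
                   ["cat", "dog", "dog", "dog"]))  # 0.5
```

Asking which metric the partner uses (kappa, per-class precision/recall against a gold set, IoU for bounding boxes) and what threshold triggers re-labeling makes "quality" a concrete, auditable number rather than a promise.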
Scalability
1. How many workers can we access at any one time? Can we scale data labeling volume up or down based on our needs? How frequently can scaling be done?
2. How long does it take annotators to reach maximum productivity? Is task throughput impacted as your data labeling team scales? Does an increase in throughput due to team scaling impact data quality?
3. How are iterations in data labeling features and operations handled as we scale?
4. Tell us about the client support we can expect once we engage with your team. How often will we meet? How much time should my team plan to spend managing the project?
Pricing
1. Will we pay by the hour or per task? Why did you structure your pricing model that way?
2. Do you incentivize annotators to label data with higher quality or greater volume?