# Computer Vision Annotation Tool (CVAT)

[![Build Status](https://travis-ci.org/opencv/cvat.svg?branch=develop)](https://travis-ci.org/opencv/cvat)
[![Codacy Badge](https://api.codacy.com/project/badge/Grade/840351da141e4eaeac6476fd19ec0a33)](https://app.codacy.com/app/cvat/cvat?utm_source=github.com&utm_medium=referral&utm_content=opencv/cvat&utm_campaign=Badge_Grade_Dashboard)
[![Gitter chat](https://badges.gitter.im/opencv-cvat/gitter.png)](https://gitter.im/opencv-cvat)
[![Coverage Status](https://coveralls.io/repos/github/opencv/cvat/badge.svg?branch=develop)](https://coveralls.io/github/opencv/cvat?branch=develop)
[![DOI](https://zenodo.org/badge/139156354.svg)](https://zenodo.org/badge/latestdoi/139156354)

CVAT is a free, online, interactive video and image annotation
tool for computer vision. Our team uses it to annotate
millions of objects with different properties. Many UI
and UX decisions are based on feedback from a professional data
annotation team. Try it online at [cvat.org](https://cvat.org).
![CVAT screenshot](cvat/apps/documentation/static/documentation/images/cvat.jpg)

## Documentation

- [Installation guide](cvat/apps/documentation/installation.md)
- [User's guide](cvat/apps/documentation/user_guide.md)
- [Django REST API documentation](#rest-api)
- [Datumaro dataset framework](datumaro/README.md)
- [Command line interface](utils/cli/)
- [XML annotation format](cvat/apps/documentation/xml_format.md)
- [AWS Deployment Guide](cvat/apps/documentation/AWS-Deployment-Guide.md)
- [Frequently asked questions](cvat/apps/documentation/faq.md)
- [Questions](#questions)

## Screencasts

- [Introduction](https://youtu.be/JERohTFp-NI)
- [Annotation mode](https://youtu.be/vH_639N67HI)
- [Interpolation of bounding boxes](https://youtu.be/Hc3oudNuDsY)
- [Interpolation of polygons](https://youtu.be/K4nis9lk92s)
- [Tag annotation](https://youtu.be/62bI4mF-Xfk)
- [Attribute mode](https://youtu.be/iIkJsOkDzVA)
- [Segmentation mode](https://youtu.be/9Fe_GzMLo3E)
- [Tutorial for polygons](https://youtu.be/C7-r9lZbjBw)
- [Semi-automatic segmentation](https://youtu.be/9HszWP_qsRQ)

## Supported annotation formats

Format selection is available after clicking the Upload annotation
and Dump annotation buttons. The [Datumaro](datumaro/README.md) dataset
framework allows additional dataset transformations
via its command-line tool and Python library.
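As a hedged sketch of such a transformation (the paths are placeholders and the exact flag spellings may differ between Datumaro versions; see [the Datumaro README](datumaro/README.md) for the authoritative command set), a dataset could be re-encoded between formats on the command line:

```shell
# Hypothetical paths, for illustration only: convert a PASCAL VOC
# dataset tree into MS COCO format with the Datumaro CLI.
datum convert -if voc -i ./my_voc_dataset \
              -f coco -o ./my_coco_dataset
```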

| Annotation format                                                                          | Import | Export |
| ------------------------------------------------------------------------------------------ | ------ | ------ |
| [CVAT for images](cvat/apps/documentation/xml_format.md#annotation)                        | X      | X      |
| [CVAT for video](cvat/apps/documentation/xml_format.md#interpolation)                      | X      | X      |
| [Datumaro](datumaro/README.md)                                                             |        | X      |
| [PASCAL VOC](http://host.robots.ox.ac.uk/pascal/VOC/)                                      | X      | X      |
| Segmentation masks from [PASCAL VOC](http://host.robots.ox.ac.uk/pascal/VOC/)              | X      | X      |
| [YOLO](https://pjreddie.com/darknet/yolo/)                                                 | X      | X      |
| [MS COCO Object Detection](http://cocodataset.org/#format-data)                            | X      | X      |
| [TFrecord](https://www.tensorflow.org/tutorials/load_data/tf_records)                      | X      | X      |
| [MOT](https://motchallenge.net/)                                                           | X      | X      |
| [LabelMe 3.0](http://labelme.csail.mit.edu/Release3.0)                                     | X      | X      |

## Deep learning models for automatic labeling

| Name                                                                                                    | Type       | Framework  |
| ------------------------------------------------------------------------------------------------------- | ---------- | ---------- |
| [Deep Extreme Cut](/serverless/openvino/dextr/nuclio)                                                   | interactor | OpenVINO   |
| [Faster RCNN](/serverless/tensorflow/faster_rcnn_inception_v2_coco/nuclio)                              | detector   | TensorFlow |
| [Mask RCNN](/serverless/openvino/omz/public/mask_rcnn_inception_resnet_v2_atrous_coco/nuclio)           | detector   | OpenVINO   |
| [YOLO v3](/serverless/openvino/omz/public/yolo-v3-tf/nuclio)                                            | detector   | OpenVINO   |
| [Text detection v4](/serverless/openvino/omz/intel/text-detection-0004/nuclio)                          | detector   | OpenVINO   |
| [Semantic segmentation for ADAS](/serverless/openvino/omz/intel/semantic-segmentation-adas-0001/nuclio) | detector   | OpenVINO   |
| [Mask RCNN](/serverless/tensorflow/matterport/mask_rcnn/nuclio)                                         | detector   | TensorFlow |
| [Object reidentification](/serverless/openvino/omz/intel/person-reidentification-retail-300/nuclio)     | reid       | OpenVINO   |

## Online demo: [cvat.org](https://cvat.org)

This is an online demo of the latest version of the annotation tool.
Try it without a local installation. Users can see only their
own or assigned tasks.

Disabled features:
- [Analytics: management and monitoring of data annotation team](/components/analytics/README.md)

Limitations:
- No more than 10 tasks per user
- Uploaded data is limited to 500 MB

## REST API

Automatically generated Swagger documentation for the Django REST API is
available at ``<cvat_origin>/api/swagger``
(default: ``localhost:8080/api/swagger``).
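For instance, a small Python sketch (the origin value is an assumption matching a default local install) that builds the Swagger documentation URL for a given CVAT origin:

```python
from urllib.parse import urljoin

def swagger_url(origin: str) -> str:
    """Build the Swagger documentation URL for a CVAT instance.

    `origin` is the scheme://host:port the server is reachable at;
    the docs are served under the fixed /api/swagger path.
    """
    # urljoin keeps the scheme/host from `origin` and swaps in the path
    return urljoin(origin, "/api/swagger")

print(swagger_url("http://localhost:8080"))  # http://localhost:8080/api/swagger
```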

Swagger documentation is visible only on allowed hosts. Update the
``ALLOWED_HOSTS`` environment variable in the ``docker-compose.yml`` file
with the IP address or domain name of the machine hosting CVAT,
for example: ``ALLOWED_HOSTS: 'localhost, 127.0.0.1'``.
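A minimal sketch of the corresponding `docker-compose.yml` fragment (the service name and the extra domain are illustrative assumptions; adapt them to your deployment):

```yaml
# docker-compose.yml (fragment) -- cvat.example.com is a placeholder domain
services:
  cvat:
    environment:
      ALLOWED_HOSTS: 'localhost, 127.0.0.1, cvat.example.com'
```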

## License

Code released under the [MIT License](https://opensource.org/licenses/MIT).

## Questions

Questions about CVAT usage or unclear concepts can be posted in our
[Gitter chat](https://gitter.im/opencv-cvat) for **quick replies** from
contributors and other users.

However, if you have a feature request or a bug report that can be reproduced,
feel free to open an issue (with steps to reproduce the bug if it's a bug
report) on [GitHub* issues](https://github.com/opencv/cvat/issues).

If you are not sure or just want to browse other users' common questions,
the [Gitter chat](https://gitter.im/opencv-cvat) is the way to go.

Other ways to ask questions and get our support:
- [\#cvat](https://stackoverflow.com/search?q=%23cvat) tag on StackOverflow*
- [Forum on Intel Developer Zone](https://software.intel.com/en-us/forums/computer-vision)

## Links
- [Intel AI blog: New Computer Vision Tool Accelerates Annotation of Digital Images and Video](https://www.intel.ai/introducing-cvat)
- [Intel Software: Computer Vision Annotation Tool: A Universal Approach to Data Annotation](https://software.intel.com/en-us/articles/computer-vision-annotation-tool-a-universal-approach-to-data-annotation)
- [VentureBeat: Intel open-sources CVAT, a toolkit for data labeling](https://venturebeat.com/2019/03/05/intel-open-sources-cvat-a-toolkit-for-data-labeling/)