Background
I followed a tutorial here that proceeds as follows:
build an image and then a container for the TensorFlow Object Detection API from a Dockerfile.
At the end of the Dockerfile, a Jupyter notebook server is started, and I can see and edit everything I cloned from GitHub.
I had to make some modifications to get it working, but it works now.
I then pushed this image to Docker Hub under the name tf_od_api:part1.
The command for starting the container, named tensorflow:
docker run --runtime=nvidia -it --name tensorflow -p 8888:8888 -d kneazle/tf_od_api:part1
The container ID is
dc91f5b5e6759bac3dfe4e713406fd0e2a217637241a45d9a20d5cfc347d40d8
and the Jupyter notebook is reachable at localhost:8888.
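As a quick sanity check that this container is up and that port 8888 is actually published, the standard Docker CLI can be queried (tensorflow is the container name from the run command above):

```shell
# Show the tensorflow container's ID and published ports;
# an empty result means the container is not running.
docker ps --filter name=tensorflow --format '{{.ID}}\t{{.Ports}}'
```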
No data visible in the Jupyter notebook GUI
Now I want to use my local data (over 10 GB) for object detection. For this I need a script from the Object Detection API that creates and saves TFRecords for later tasks. So I need this data available inside the container, but I don't want to upload it each time. I tried the solution given here. The command I use is:
docker run -v /home/kneazle/data/KITTI:/data_host/KITTI kneazle/tf_od_api:part1
But this command's output is:
[I 11:06:50.547 NotebookApp] Writing notebook server cookie secret to /root/.local/share/jupyter/runtime/notebook_cookie_secret
[W 11:06:50.560 NotebookApp] All authentication is disabled. Anyone who can connect to this server will be able to run code.
[I 11:06:50.564 NotebookApp] Serving notebooks from local directory: /tensorflow/models/research/object_detection
[I 11:06:50.564 NotebookApp] The Jupyter Notebook is running at:
[I 11:06:50.564 NotebookApp] http://(a915e0cd0fd0 or 127.0.0.1):8888/
[I 11:06:50.564 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
I don't see the data in the Jupyter notebook I started, whether before or after mounting. I also tried:
docker run --runtime=nvidia -it \
    --name tensorflow -p 8888:8888 \
    -v /home/kneazle/data/KITTI:/data_host/KITTI kneazle/tf_od_api:part1
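To check whether the bind mount itself worked, independently of what the Jupyter GUI shows, the container can be inspected directly (a sketch using standard Docker commands; tensorflow is the container name from the command above):

```shell
# List the mount target inside the running container; if the bind
# mount succeeded, the KITTI files from the host appear here.
docker exec tensorflow ls /data_host/KITTI

# Show the mounts Docker recorded for this container.
docker inspect -f '{{ json .Mounts }}' tensorflow
```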