
Background

I followed a tutorial here that proceeded in the following way:

  1. Create an image and then a container for the TensorFlow Object Detection API from a Dockerfile.

  2. At the end of the Dockerfile, a Jupyter notebook is started, and I can see and edit everything I cloned from GitHub.

  3. I had to make some modifications to get it working, but it works now.

  4. Then I pushed this image to Docker Hub under the name tf_od_api:part1.

  5. The command for starting the container, named tensorflow, is:

    docker run --runtime=nvidia -it --name tensorflow -p 8888:8888 -d kneazle/tf_od_api:part1

  6. Container ID is dc91f5b5e6759bac3dfe4e713406fd0e2a217637241a45d9a20d5cfc347d40d8

  7. The Jupyter notebook is reachable at the address localhost:8888.


No data in the Jupyter notebook GUI

Now I want to use my local data (over 10 GB) for object detection. For this I need a script from the Object Detection API that creates and saves TFRecords for later tasks. The data therefore has to be accessible inside the container, but I don't want to upload it each time, so I tried the solution given here. The command I use is:

docker run -v /home/kneazle/data/KITTI:/data_host/KITTI kneazle/tf_od_api:part1
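One thing worth ruling out first: with the `-v host:container` syntax, Docker silently creates an empty directory at a missing host path, which would make the mount appear empty in the notebook. A minimal sketch of that check, using the host path from the command above (this only inspects the local filesystem; it does not touch Docker):

```shell
# Check the host side of the -v bind mount before starting the container.
# With "docker run -v host:container", a missing host path is silently
# created as an empty directory, so the notebook would show no data.
HOST_DIR=/home/kneazle/data/KITTI

if [ -d "$HOST_DIR" ]; then
    echo "ok: $HOST_DIR exists on the host"
else
    echo "missing: $HOST_DIR (Docker would mount an empty directory)"
fi
```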

But this command's output is:

[I 11:06:50.547 NotebookApp] Writing notebook server cookie secret to /root/.local/share/jupyter/runtime/notebook_cookie_secret
[W 11:06:50.560 NotebookApp] All authentication is disabled.  Anyone who can connect to this server will be able to run code.
[I 11:06:50.564 NotebookApp] Serving notebooks from local directory: /tensorflow/models/research/object_detection
[I 11:06:50.564 NotebookApp] The Jupyter Notebook is running at:
[I 11:06:50.564 NotebookApp] http://(a915e0cd0fd0 or 127.0.0.1):8888/
[I 11:06:50.564 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).

I don't see the data in the Jupyter notebook, whether I start it before or after mounting. I also tried:

docker run --runtime=nvidia -it \
    --name tensorflow -p 8888:8888 \
    -v /home/kneazle/data/KITTI:/data_host/KITTI kneazle/tf_od_api:part1
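For context: the notebook log above says it is "Serving notebooks from local directory: /tensorflow/models/research/object_detection", and Jupyter only lists files under that serving directory, so a container path of /data_host/KITTI sits outside the served tree. A sketch of building a `-v` spec whose container side lies inside it; the paths are taken from the question, and the data_host/KITTI subdirectory name is just carried over as an example (this only prints the flag, nothing is run against Docker):

```shell
# Jupyter only lists files under its serving directory (see the
# "Serving notebooks from local directory" line in the log above),
# so the container side of the mount must live below that directory.
NOTEBOOK_DIR=/tensorflow/models/research/object_detection
HOST_DIR=/home/kneazle/data/KITTI

MOUNT_SPEC="${HOST_DIR}:${NOTEBOOK_DIR}/data_host/KITTI"
echo "-v ${MOUNT_SPEC}"
```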

How can I use my local files for the Jupyter notebook running in Docker?

  • I think you're close to having it figured out. --notebook-dir is the directory you're seeing in Jupyter, that is /tensorflow/models/research/object_detection, so you need to mount your data in a subdirectory of that directory on the container. Try: -v /home/aev21/data/KITTI:/tensorflow/models/research/object_detection/data_host/KITTI – William D. Irons Nov 02 '18 at 19:09
  • It worked! Thanks very much! Does this mean everything I write to that folder, from the host and also from the Docker container, will remain? – kneazle Nov 04 '18 at 12:05
  • Yes, when you write to that directory in the container, because it is mounted from the host, you're modifying that directory on the host. – William D. Irons Nov 04 '18 at 21:22
  • Possible duplicate of [Show volume files in the GUI of Docker Jupyter notebook](https://stackoverflow.com/questions/48107535/show-volume-files-in-the-gui-of-docker-jupyter-notebook) – hhh Jan 05 '19 at 22:48
  • @kneazle if you solved the issue, could you answer your own question? It is a little bit hard to understand. – hhh Jan 05 '19 at 22:52

0 Answers