- {
- "cells": [
- {
- "cell_type": "markdown",
- "metadata": {
- "colab_type": "text",
- "id": "V8-yl-s-WKMG"
- },
- "source": [
- "# Object Detection Demo\n",
- "Welcome to the object detection inference walkthrough! This notebook will walk you step by step through the process of using a pre-trained model to detect objects in an image. Make sure to follow the [installation instructions](https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/installation.md) before you start."
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {
- "colab_type": "text",
- "id": "kFSqkTCdWKMI"
- },
- "source": [
- "# Imports"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 51,
- "metadata": {
- "colab": {
- "autoexec": {
- "startup": false,
- "wait_interval": 0
- }
- },
- "colab_type": "code",
- "id": "hV4P5gyTWKMI"
- },
- "outputs": [],
- "source": [
- "import numpy as np\n",
- "import os\n",
- "import six.moves.urllib as urllib\n",
- "import sys\n",
- "import tarfile\n",
- "import tensorflow as tf\n",
- "import zipfile\n",
- "\n",
- "from distutils.version import StrictVersion\n",
- "from collections import defaultdict\n",
- "from io import StringIO\n",
- "\n",
- "from utils import label_map_util\n",
- "\n",
- "from utils import visualization_utils as vis_util\n",
- "from matplotlib import pyplot as plt\n",
- "from PIL import Image\n",
- "\n",
- "# This is needed since the notebook is stored in the object_detection folder.\n",
- "sys.path.append(\"..\")\n",
- "from object_detection.utils import ops as utils_ops\n",
- "\n",
- "if StrictVersion(tf.__version__) < StrictVersion('1.12.0'):\n",
- "  raise ImportError('Please upgrade your TensorFlow installation to v1.12.*.')\n"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {
- "colab_type": "text",
- "id": "Wy72mWwAWKMK"
- },
- "source": [
- "## Env setup"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 52,
- "metadata": {
- "colab": {
- "autoexec": {
- "startup": false,
- "wait_interval": 0
- }
- },
- "colab_type": "code",
- "id": "v7m_NY_aWKMK"
- },
- "outputs": [],
- "source": [
- "# This is needed to display the images.\n",
- "%matplotlib inline"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {
- "colab_type": "text",
- "id": "r5FNuiRPWKMN"
- },
- "source": [
- "## Object detection imports\n",
- "Here are the imports from the object detection module."
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {
- "colab": {
- "autoexec": {
- "startup": false,
- "wait_interval": 0
- }
- },
- "colab_type": "code",
- "id": "bm0_uNRnWKMN"
- },
- "outputs": [],
- "source": [
- "# Imports from the object detection module (also imported in the first cell above).\n",
- "from utils import label_map_util\n",
- "from utils import visualization_utils as vis_util"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {
- "colab_type": "text",
- "id": "cfn_tRFOWKMO"
- },
- "source": [
- "# Model preparation "
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {
- "colab_type": "text",
- "id": "X_sEBLpVWKMQ"
- },
- "source": [
- "## Variables\n",
- "\n",
- "Any model exported using the `export_inference_graph.py` tool can be loaded here simply by changing `PATH_TO_FROZEN_GRAPH` to point to a new .pb file. \n",
- "\n",
- "By default we use an \"SSD with Mobilenet\" model here. See the [detection model zoo](https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/detection_model_zoo.md) for a list of other models that can be run out-of-the-box with varying speeds and accuracies."
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 53,
- "metadata": {
- "colab": {
- "autoexec": {
- "startup": false,
- "wait_interval": 0
- }
- },
- "colab_type": "code",
- "id": "VyPz_t8WWKMQ"
- },
- "outputs": [],
- "source": [
- "# What model to download.\n",
- "MODEL_NAME = 'ssd_mobilenet_v1_coco_2017_11_17'\n",
- "MODEL_FILE = MODEL_NAME + '.tar.gz'\n",
- "DOWNLOAD_BASE = 'http://download.tensorflow.org/models/object_detection/'\n",
- "\n",
- "# Path to frozen detection graph. This is the actual model that is used for the object detection.\n",
- "PATH_TO_FROZEN_GRAPH = MODEL_NAME + '/frozen_inference_graph.pb'\n",
- "\n",
- "# List of the strings that are used to add the correct label to each box.\n",
- "PATH_TO_LABELS = os.path.join('data', 'mscoco_label_map.pbtxt')"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {
- "colab_type": "text",
- "id": "7ai8pLZZWKMS"
- },
- "source": [
- "## Download Model"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {
- "colab": {
- "autoexec": {
- "startup": false,
- "wait_interval": 0
- }
- },
- "colab_type": "code",
- "id": "KILYnwR5WKMS"
- },
- "outputs": [],
- "source": [
- "# Download the model archive and extract the frozen detection graph.\n",
- "opener = urllib.request.URLopener()\n",
- "opener.retrieve(DOWNLOAD_BASE + MODEL_FILE, MODEL_FILE)\n",
- "tar_file = tarfile.open(MODEL_FILE)\n",
- "for file in tar_file.getmembers():\n",
- "  file_name = os.path.basename(file.name)\n",
- "  if 'frozen_inference_graph.pb' in file_name:\n",
- "    tar_file.extract(file, os.getcwd())"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {
- "colab_type": "text",
- "id": "YBcB9QHLWKMU"
- },
- "source": [
- "## Load a (frozen) TensorFlow model into memory."
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 54,
- "metadata": {
- "colab": {
- "autoexec": {
- "startup": false,
- "wait_interval": 0
- }
- },
- "colab_type": "code",
- "id": "KezjCRVvWKMV"
- },
- "outputs": [],
- "source": [
- "detection_graph = tf.Graph()\n",
- "with detection_graph.as_default():\n",
- "  od_graph_def = tf.GraphDef()\n",
- "  with tf.gfile.GFile(PATH_TO_FROZEN_GRAPH, 'rb') as fid:\n",
- "    serialized_graph = fid.read()\n",
- "    od_graph_def.ParseFromString(serialized_graph)\n",
- "    tf.import_graph_def(od_graph_def, name='')"
- ]
- },
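- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "As a quick sanity check (an illustrative addition, not part of the original walkthrough), we can confirm that the graph imported correctly by looking up its input placeholder; `image_tensor:0` is the standard input tensor name for models exported by the Object Detection API."
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": [
- "# Look up the input placeholder to verify that the graph import succeeded.\n",
- "print(detection_graph.get_tensor_by_name('image_tensor:0'))"
- ]
- },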
- {
- "cell_type": "markdown",
- "metadata": {
- "colab_type": "text",
- "id": "_1MVVTcLWKMW"
- },
- "source": [
- "## Loading label map\n",
- "Label maps map indices to category names, so that when our convolutional network predicts `5`, we know that this corresponds to `airplane`. Here we use internal utility functions, but anything that returns a dictionary mapping integers to appropriate string labels would be fine."
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 55,
- "metadata": {
- "colab": {
- "autoexec": {
- "startup": false,
- "wait_interval": 0
- }
- },
- "colab_type": "code",
- "id": "hDbpHkiWWKMX"
- },
- "outputs": [],
- "source": [
- "category_index = label_map_util.create_category_index_from_labelmap(PATH_TO_LABELS, use_display_name=True)"
- ]
- },
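- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "As an illustrative addition (not part of the original tutorial), we can peek at a single entry of `category_index`; for the COCO label map, id `1` corresponds to `person`."
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": [
- "# Peek at one entry of the category index built above.\n",
- "print(category_index[1])"
- ]
- },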
- {
- "cell_type": "markdown",
- "metadata": {
- "colab_type": "text",
- "id": "EFsoUHvbWKMZ"
- },
- "source": [
- "## Helper code"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 56,
- "metadata": {
- "colab": {
- "autoexec": {
- "startup": false,
- "wait_interval": 0
- }
- },
- "colab_type": "code",
- "id": "aSlYc3JkWKMa"
- },
- "outputs": [],
- "source": [
- "def load_image_into_numpy_array(image):\n",
- "  (im_width, im_height) = image.size\n",
- "  return np.array(image.getdata()).reshape(\n",
- "      (im_height, im_width, 3)).astype(np.uint8)"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {
- "colab_type": "text",
- "id": "H0_1AGhrWKMc"
- },
- "source": [
- "# Detection"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 57,
- "metadata": {
- "colab": {
- "autoexec": {
- "startup": false,
- "wait_interval": 0
- }
- },
- "colab_type": "code",
- "id": "jG-zn5ykWKMd"
- },
- "outputs": [],
- "source": [
- "# For the sake of simplicity we will use only three images:\n",
- "# image1.jpg\n",
- "# image2.jpg\n",
- "# image3.jpg\n",
- "# If you want to test the code with your own images, just add their paths to TEST_IMAGE_PATHS.\n",
- "PATH_TO_TEST_IMAGES_DIR = 'test_images'\n",
- "TEST_IMAGE_PATHS = [ os.path.join(PATH_TO_TEST_IMAGES_DIR, 'image{}.jpg'.format(i)) for i in range(1, 4) ]\n",
- "\n",
- "# Size, in inches, of the output images.\n",
- "IMAGE_SIZE = (12, 8)"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 58,
- "metadata": {
- "colab": {
- "autoexec": {
- "startup": false,
- "wait_interval": 0
- }
- },
- "colab_type": "code",
- "id": "92BHxzcNWKMf"
- },
- "outputs": [],
- "source": [
- "def run_inference_for_single_image(image, graph):\n",
- "  with graph.as_default():\n",
- "    with tf.Session() as sess:\n",
- "      # Get handles to input and output tensors\n",
- "      ops = tf.get_default_graph().get_operations()\n",
- "      all_tensor_names = {output.name for op in ops for output in op.outputs}\n",
- "      tensor_dict = {}\n",
- "      for key in [\n",
- "          'num_detections', 'detection_boxes', 'detection_scores',\n",
- "          'detection_classes', 'detection_masks'\n",
- "      ]:\n",
- "        tensor_name = key + ':0'\n",
- "        if tensor_name in all_tensor_names:\n",
- "          tensor_dict[key] = tf.get_default_graph().get_tensor_by_name(\n",
- "              tensor_name)\n",
- "      if 'detection_masks' in tensor_dict:\n",
- "        # The following processing is only for a single image.\n",
- "        detection_boxes = tf.squeeze(tensor_dict['detection_boxes'], [0])\n",
- "        detection_masks = tf.squeeze(tensor_dict['detection_masks'], [0])\n",
- "        # Reframing is required to translate the masks from box coordinates to image coordinates and fit the image size.\n",
- "        real_num_detection = tf.cast(tensor_dict['num_detections'][0], tf.int32)\n",
- "        detection_boxes = tf.slice(detection_boxes, [0, 0], [real_num_detection, -1])\n",
- "        detection_masks = tf.slice(detection_masks, [0, 0, 0], [real_num_detection, -1, -1])\n",
- "        detection_masks_reframed = utils_ops.reframe_box_masks_to_image_masks(\n",
- "            detection_masks, detection_boxes, image.shape[1], image.shape[2])\n",
- "        detection_masks_reframed = tf.cast(\n",
- "            tf.greater(detection_masks_reframed, 0.5), tf.uint8)\n",
- "        # Follow the convention by adding back the batch dimension.\n",
- "        tensor_dict['detection_masks'] = tf.expand_dims(\n",
- "            detection_masks_reframed, 0)\n",
- "      image_tensor = tf.get_default_graph().get_tensor_by_name('image_tensor:0')\n",
- "\n",
- "      # Run inference\n",
- "      output_dict = sess.run(tensor_dict,\n",
- "                             feed_dict={image_tensor: image})\n",
- "\n",
- "      # All outputs are float32 numpy arrays, so convert types as appropriate.\n",
- "      output_dict['num_detections'] = int(output_dict['num_detections'][0])\n",
- "      output_dict['detection_classes'] = output_dict[\n",
- "          'detection_classes'][0].astype(np.int64)\n",
- "      output_dict['detection_boxes'] = output_dict['detection_boxes'][0]\n",
- "      output_dict['detection_scores'] = output_dict['detection_scores'][0]\n",
- "      if 'detection_masks' in output_dict:\n",
- "        output_dict['detection_masks'] = output_dict['detection_masks'][0]\n",
- "  return output_dict"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 59,
- "metadata": {
- "colab": {
- "autoexec": {
- "startup": false,
- "wait_interval": 0
- }
- },
- "colab_type": "code",
- "id": "3a5wMHN8WKMh"
- },
- "outputs": [
- {
- "data": {
- "text/plain": [
- "<Figure size 864x576 with 1 Axes>"
- ]
- },
- "metadata": {
- "needs_background": "light"
- },
- "output_type": "display_data"
- },
- {
- "data": {
- "text/plain": [
- "<Figure size 864x576 with 1 Axes>"
- ]
- },
- "metadata": {
- "needs_background": "light"
- },
- "output_type": "display_data"
- },
- {
- "data": {
- "text/plain": [
- "<Figure size 864x576 with 1 Axes>"
- ]
- },
- "metadata": {
- "needs_background": "light"
- },
- "output_type": "display_data"
- }
- ],
- "source": [
- "for image_path in TEST_IMAGE_PATHS:\n",
- "  image = Image.open(image_path)\n",
- "  # The array-based representation of the image will be used later in order to prepare the\n",
- "  # result image with boxes and labels on it.\n",
- "  image_np = load_image_into_numpy_array(image)\n",
- "  # Expand dimensions since the model expects images to have shape: [1, None, None, 3]\n",
- "  image_np_expanded = np.expand_dims(image_np, axis=0)\n",
- "  # Actual detection.\n",
- "  output_dict = run_inference_for_single_image(image_np_expanded, detection_graph)\n",
- "  # Visualization of the results of a detection.\n",
- "  vis_util.visualize_boxes_and_labels_on_image_array(\n",
- "      image_np,\n",
- "      output_dict['detection_boxes'],\n",
- "      output_dict['detection_classes'],\n",
- "      output_dict['detection_scores'],\n",
- "      category_index,\n",
- "      instance_masks=output_dict.get('detection_masks'),\n",
- "      use_normalized_coordinates=True,\n",
- "      line_thickness=8)\n",
- "  plt.figure(figsize=IMAGE_SIZE)\n",
- "  plt.imshow(image_np)"
- ]
- }
- ],
- "metadata": {
- "colab": {
- "default_view": {},
- "name": "object_detection_tutorial.ipynb",
- "provenance": [],
- "version": "0.3.2",
- "views": {}
- },
- "kernelspec": {
- "display_name": "Python 3",
- "language": "python",
- "name": "python3"
- },
- "language_info": {
- "codemirror_mode": {
- "name": "ipython",
- "version": 3
- },
- "file_extension": ".py",
- "mimetype": "text/x-python",
- "name": "python",
- "nbconvert_exporter": "python",
- "pygments_lexer": "ipython3",
- "version": "3.7.3rc1"
- }
- },
- "nbformat": 4,
- "nbformat_minor": 1
- }