
Getting started with the NVIDIA Jetson Nano

In this tutorial, you will learn how to get started with your NVIDIA Jetson Nano, including:

  • First boot
  • Installing system packages and prerequisites
  • Configuring your Python development environment
  • Installing Keras and TensorFlow on the Jetson Nano
  • Changing the default camera
  • Classification and object detection with the Jetson Nano

I’ll also provide my commentary along the way, including what tripped me up when I set up my Jetson Nano, ensuring you avoid the same mistakes I made.

By the time you’re done with this tutorial, your NVIDIA Jetson Nano will be configured and ready for deep learning!

To learn how to get started with the NVIDIA Jetson Nano, just keep reading!

Getting started with the NVIDIA Jetson Nano

Figure 1: In this blog post, we’ll get started with the NVIDIA Jetson Nano, an AI edge device capable of 472 GFLOPS of computation. At around $100 USD, the device is packed with capability, including a Maxwell architecture 128 CUDA core GPU covered by the large heatsink shown in the image. (image source)

In the first part of this tutorial, you’ll learn how to download and flash the NVIDIA Jetson Nano .img file to your micro-SD card. I’ll then show you how to install the required system packages and prerequisites.

From there you will configure your Python development environment and learn how to install the Jetson Nano-optimized version of Keras and TensorFlow on your device.

I’ll then show you how to access the camera on your Jetson Nano, and even perform image classification and object detection on the Nano as well.

We’ll then wrap up the tutorial with a brief discussion of the Jetson Nano — a full benchmark and comparison between the NVIDIA Jetson Nano, Google Coral, and Movidius NCS will be published in a future blog post.

Before you get started with the Jetson Nano

Before you can even boot up your NVIDIA Jetson Nano, you need three things:

  1. A micro-SD card (minimum 16GB)
  2. A 5V 2.5A MicroUSB power supply
  3. An ethernet cable

I really want to stress the micro-SD card capacity. The first time I configured my Jetson Nano I used a 16GB card, but that space was eaten up quickly, particularly when I installed the Jetson Inference library, which downloads several gigabytes of pre-trained models.

I, therefore, recommend a 32GB micro-SD card for your Nano.

Secondly, when it comes to your 5V 2.5A MicroUSB power supply, NVIDIA specifically recommends this one from Adafruit in their documentation.

Lastly, you’ll need an ethernet cable when working with the Jetson Nano, which I find really, really frustrating.

The NVIDIA Jetson Nano is marketed as a powerful IoT and edge computing device for Artificial Intelligence…

…and if that’s the case, why is there no WiFi adapter on the device?

I don’t understand NVIDIA’s decision there, and I don’t believe it should be up to the end user of the product to “bring their own WiFi adapter”.

If the goal is to bring AI to IoT and edge computing, then there should be WiFi.

But I digress.

You can read more about NVIDIA’s recommendations for the Jetson Nano here.

Download and flash the .img file to your micro-SD card

Before we can get started installing any packages or running any demos on the Jetson Nano, we first need to download the Jetson Nano Developer Kit SD Card Image from NVIDIA’s website.

NVIDIA provides documentation for flashing the .img file to a micro-SD card for Windows, macOS, and Linux — you should choose the flashing instructions appropriate for your particular operating system.

First boot of the NVIDIA Jetson Nano

After you’ve downloaded and flashed the .img file to your micro-SD card, insert the card into the micro-SD card slot.

I had a hard time finding the card slot — it’s actually underneath the heat sink, right where my finger is pointing:

Figure 2: Where is the microSD card slot on the NVIDIA Jetson Nano? The microSD receptacle is hidden beneath the heatsink, as shown in the image.

I think NVIDIA could have made the slot a bit more obvious, or at least better documented it on their website.

After sliding the micro-SD card home, connect your power supply and boot.

Assuming your Jetson Nano is connected to an HDMI output, you should see the following (or similar) displayed on your screen:

Figure 3: To get started with the NVIDIA Jetson Nano AI device, just flash the .img (preconfigured with JetPack) and boot. From here we’ll be installing TensorFlow and Keras in a virtual environment.

The Jetson Nano will then walk you through the install process, including setting your username/password, timezone, keyboard layout, etc.

Installing system packages and prerequisites

In the remainder of this guide, I’ll be showing you how to configure your NVIDIA Jetson Nano for deep learning, including:

  • Installing system package prerequisites.
  • Installing TensorFlow and Keras on the Jetson Nano.
  • Installing the Jetson Inference engine.

Let’s get started by installing the required system packages:
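As a sketch of that step — the exact package list here is an assumption based on what the later NumPy, h5py/Keras, and jetson-inference builds typically need, so adjust it to your JetPack release:

```shell
# Refresh the package index first
sudo apt-get update

# Build tools for compiling jetson-inference and Python packages
sudo apt-get install -y git cmake

# Linear algebra and Fortran support used when compiling NumPy/SciPy
sudo apt-get install -y libatlas-base-dev gfortran

# HDF5 headers and tools required by h5py (a Keras dependency)
sudo apt-get install -y libhdf5-serial-dev hdf5-tools

# Python 3 headers for compiling C extensions
sudo apt-get install -y python3-dev
```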

Provided you have a good internet connection, the above commands should only take a few minutes to finish.

Configuring your Python environment

The next step is to configure our Python development environment.

Let’s first install pip, Python’s package manager:
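A sketch of the pip install, using the standard PyPA bootstrap script:

```shell
# Download the official pip bootstrap script
wget https://bootstrap.pypa.io/get-pip.py

# Install pip system-wide for Python 3
sudo python3 get-pip.py
```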

We’ll be using Python virtual environments in this guide to keep our Python development environments independent and separate from one another.

Using Python virtual environments is a best practice and will help you avoid having to maintain a micro-SD card for each development environment you want to use on your Jetson Nano.

To manage our Python virtual environments we’ll be using virtualenv and virtualenvwrapper, which we can install using the following command:
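For example:

```shell
# Install both virtual environment tools system-wide
sudo pip install virtualenv virtualenvwrapper
```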

Once we’ve installed virtualenv and virtualenvwrapper, we need to update our ~/.bashrc file. I’m choosing to use nano, but you can use whatever editor you’re most comfortable with:
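For example:

```shell
# Open ~/.bashrc in the nano editor
nano ~/.bashrc
```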

Scroll down to the bottom of the ~/.bashrc file and add the following lines:
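The lines below are the standard virtualenvwrapper configuration; note the path to virtualenvwrapper.sh is an assumption — it depends on how pip installed it, so verify it on your system:

```shell
# virtualenv and virtualenvwrapper
export WORKON_HOME=$HOME/.virtualenvs
export VIRTUALENVWRAPPER_PYTHON=/usr/bin/python3
source /usr/local/bin/virtualenvwrapper.sh
```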

After adding the above lines, save and exit the editor.

Next, we need to reload the contents of the ~/.bashrc file using the source command:
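For example:

```shell
# Reload ~/.bashrc so the virtualenvwrapper commands become available
source ~/.bashrc
```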

We can now create a Python virtual environment using the mkvirtualenv command — I’m naming my virtual environment deep_learning, but you can name it whatever you’d like:
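For example:

```shell
# Create a Python 3 virtual environment named deep_learning
mkvirtualenv deep_learning -p python3
```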

Installing TensorFlow and Keras on the NVIDIA Jetson Nano

Before we can install TensorFlow and Keras on the Jetson Nano, we first need to install NumPy.

First, make sure you are inside the deep_learning virtual environment by using the workon command:
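For example:

```shell
# Activate the deep_learning virtual environment
workon deep_learning
```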

From there, you can install NumPy:
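For example:

```shell
# Install NumPy inside the deep_learning environment
# (it compiles from source on the Nano, so expect a wait)
pip install numpy
```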

Installing NumPy on my Jetson Nano took ~10-15 minutes, as it had to be compiled on the device (there are currently no pre-built versions of NumPy for the Jetson Nano).

The next step is to install Keras and TensorFlow on the Jetson Nano. You may be tempted to do a simple pip install tensorflow-gpu — do not do this!

Instead, NVIDIA has provided an official release of TensorFlow for the Jetson Nano.

You can install the official Jetson Nano TensorFlow build using the following command:
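A sketch of that install — the index URL and the version pin shown here are assumptions tied to the JetPack 4.2-era release, so confirm the exact command against NVIDIA’s current TensorFlow-for-Jetson instructions before running it:

```shell
# Install NVIDIA's official TensorFlow build for the Jetson Nano
# (version tag is illustrative; check NVIDIA's docs for your JetPack)
pip install --extra-index-url https://developer.download.nvidia.com/compute/redist/jp/v42 tensorflow-gpu==1.13.1+nv19.3
```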

Installing NVIDIA’s tensorflow-gpu package took ~40 minutes on my Jetson Nano.

The final step here is to install SciPy and Keras:
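For example:

```shell
# SciPy also compiles from source on the Nano
pip install scipy

# Keras will use the TensorFlow backend installed above
pip install keras
```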

These installs took ~35 minutes.

Compiling and installing Jetson Inference on the Nano

The Jetson Nano .img already has JetPack installed, so we can jump immediately to building the Jetson Inference engine.

The first step is to clone down the jetson-inference repo:
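For example:

```shell
# Clone the jetson-inference repo and pull in its submodules
git clone https://github.com/dusty-nv/jetson-inference
cd jetson-inference
git submodule update --init
```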

We can then configure the build using cmake.
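A sketch of the configure step, assuming the out-of-source build layout the repo uses:

```shell
# From inside the cloned jetson-inference directory:
mkdir build
cd build

# Generate the build files -- this step also downloads pre-trained
# models and will prompt for your sudo password
cmake ..
```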

There are two important things to note when running cmake:

  1. The
    cmake command will ask for root permissions, so don’t walk away from the Nano until you’ve provided your root credentials.
  2. During the configure process,
    cmake will also download several gigabytes of pre-trained sample models. Make sure you have a few GB to spare on your micro-SD card! (This is also why I recommend a 32GB microSD card instead of a 16GB card.)

Once cmake has finished configuring the build, we can compile and install the Jetson Inference engine:
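For example:

```shell
# From the jetson-inference/build directory, compile the library
# and examples, then install them
make
sudo make install
```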

Compiling and installing the Jetson Inference engine on the Nano took just over 3 minutes.

What about installing OpenCV?

I decided to cover installing OpenCV on the Jetson Nano in a future tutorial. There are a number of
cmake configurations that need to be set to take full advantage of OpenCV on the Nano, and frankly, this post is long enough as is.

Again, I’ll be covering how to configure and install OpenCV on the Jetson Nano in a future tutorial.

Running the NVIDIA Jetson Nano demos

When using the NVIDIA Jetson Nano, you have two options for input camera devices:

  1. A CSI camera module, such as the Raspberry Pi camera module (which is compatible with the Jetson Nano, by the way)
  2. A USB webcam

I’m currently using all of my Raspberry Pi camera modules for my upcoming book, Raspberry Pi for Computer Vision, so I decided to use my Logitech C920, which is plug-and-play compatible with the Nano (you can use the newer Logitech C960 as well).

The examples included with the Jetson Nano Inference library can be found in the jetson-inference repo:

  • detectnet-camera: Performs object detection using a camera as an input.

  • detectnet-console: Also performs object detection, but on an input image rather than a camera.

  • imagenet-camera: Performs image classification using a camera.

  • imagenet-console: Classifies an input image using a network pre-trained on the ImageNet dataset.

  • segnet-camera: Performs semantic segmentation from an input camera.

  • segnet-console: Also performs semantic segmentation, but on an image.

  • A few other examples are included as well, including deep homography estimation and super resolution.

However, in order to run these examples, we need to slightly modify the source code for the respective camera.

In each example, you’ll see that the
DEFAULT_CAMERA value is set to
-1, implying that an attached CSI camera should be used.

However, since we’re using a USB camera, we need to change the
-1 to
0 (or whatever the correct
/dev/video V4L2 camera index is).

Fortunately, this change is super easy to make!

Let’s begin with image classification as an example.

First, change directory into the imagenet-camera example folder of the jetson-inference repo.
From there, open up the imagenet-camera.cpp source file.
You’ll then want to scroll down to approximately Line 37, where you’ll see the
DEFAULT_CAMERA definition.
Simply change that value from
-1 to
0.
From there, save and exit the editor.
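If you prefer making the change from the terminal, the edit can be sketched with sed — the repo path and the exact form of the #define line below are assumptions, so verify them in your checkout first:

```shell
# Move into the imagenet-camera example inside the cloned repo
cd ~/jetson-inference/imagenet-camera

# Inspect the current DEFAULT_CAMERA definition before editing
grep -n "DEFAULT_CAMERA" imagenet-camera.cpp

# Switch from the CSI camera (-1) to the first V4L2 device (0)
sed -i "s/DEFAULT_CAMERA -1/DEFAULT_CAMERA 0/" imagenet-camera.cpp
```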

After modifying the C++ file, you’ll need to recompile the example, which is as simple as:
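For example (the path assumes you cloned the repo into your home directory):

```shell
# Recompile from the build directory -- only changed files are rebuilt
cd ~/jetson-inference/build
make
```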

Keep in mind that
make is smart enough not to recompile the entire library. It will only recompile files that have changed (in this case, the ImageNet classification example).

Once compiled, change to the
aarch64/bin directory and execute the
imagenet-camera binary:
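For example (again assuming the repo lives in your home directory; the GoogLeNet model argument matches the network loaded in the output below):

```shell
# Change into the directory holding the compiled example binaries
cd ~/jetson-inference/build/aarch64/bin

# Run image classification on the camera feed using GoogLeNet
./imagenet-camera googlenet
```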

Here you can see that GoogLeNet is loaded into memory, after which inference starts:

Image classification is running at ~10 FPS on the Jetson Nano at 1280×720.

IMPORTANT: If this is the first time you’re loading a particular model, it can take 5-15 minutes to load.

Internally, the Jetson Nano Inference library is optimizing and preparing the model for inference. This only has to be done once, so subsequent runs of the program will be significantly faster (in terms of model loading time, not inference).

Now that we’ve tried image classification, let’s look at the object detection example on the Jetson Nano, which is located in the detectnet-camera folder of the repo.
Again, if you are using a USB webcam, you’ll want to edit approximately Line 39 of
detectnet-camera.cpp, change
-1 to
0, and then recompile via
make (again, only necessary if you are using a USB webcam).

After compiling, you’ll find the
detectnet-camera binary in the same
aarch64/bin directory as before.
Let’s go ahead and run the object detection demo on the Jetson Nano now:
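For example — the model argument here is illustrative (the output below shows a pedestrian detection model being loaded; check the jetson-inference documentation for the exact network names available in your build):

```shell
# From the aarch64/bin directory, run the object detection demo
# with a pedestrian detection network
./detectnet-camera pednet
```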

Here you can see that we’re loading a model named
ped-100 used for pedestrian detection (I’m actually not sure what the specific architecture is, as it’s not documented on NVIDIA’s website — if you know what architecture is being used, please leave a comment on this post).

Below you can see an example of myself being detected using the Jetson Nano object detection demo:

According to the output of the program, we’re obtaining ~5 FPS for object detection on 1280×720 frames when using the Jetson Nano. Not too bad!

How does the Jetson Nano compare to the Movidius NCS or Google Coral?

This tutorial is simply meant to be a getting started guide for your Jetson Nano — it is not meant to compare the Nano to the Coral or NCS.

I’m in the process of comparing each of the respective embedded systems and will be providing a full benchmark/comparison in a future blog post.

In the meantime, take a look at the following guides to help you configure your embedded devices and start running benchmarks of your own:

How do I deploy custom models to the Jetson Nano?

One of the benefits of the Jetson Nano is that once you compile and install a library with GPU support (compatible with the Nano, of course), your code will automatically use the Nano’s GPU for inference.

For instance:

Earlier in this tutorial, we installed Keras + TensorFlow on the Nano. Any Python scripts that leverage Keras/TensorFlow will automatically use the GPU.

And similarly, any pre-trained Keras/TensorFlow models we use will also automatically use the Jetson Nano GPU for inference.
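You can verify this yourself — the one-liner below is a sketch that asks TensorFlow 1.x to list the devices it can see (run it inside the deep_learning virtual environment); if a GPU device appears in the output, your Keras/TensorFlow code will run on it automatically:

```shell
# List the compute devices TensorFlow has registered on the Nano
python3 -c "from tensorflow.python.client import device_lib; print(device_lib.list_local_devices())"
```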

Pretty awesome, right?

Provided the Jetson Nano supports a given deep learning library (Keras, TensorFlow, Caffe, Torch/PyTorch, etc.), we can easily deploy our models to the Jetson Nano.

The problem here is OpenCV.

OpenCV’s Deep Neural Network (
dnn) module does not support NVIDIA GPUs, including the Jetson Nano.

OpenCV is working to provide NVIDIA GPU support for its
dnn module. Hopefully, it will be released by the end of the summer/autumn.

But until then we can’t leverage OpenCV’s easy-to-use
cv2.dnn functions.

If using the
cv2.dnn module is an absolute must for you right now, then I would recommend looking at Intel’s OpenVINO toolkit, the Movidius NCS, and their other OpenVINO-compatible products, all of which are optimized to work with OpenCV’s deep neural network module.

If you’re interested in learning more about the Movidius NCS and OpenVINO (including benchmark examples), be sure to refer to this tutorial.

Interested in using the NVIDIA Jetson Nano in your own projects?

I bet you’re just as excited about the NVIDIA Jetson Nano as I am. In contrast to pairing the Raspberry Pi with either the Movidius NCS or Google Coral, the Jetson Nano has it all built right in (minus WiFi) to powerfully conduct computer vision and deep learning at the edge.

In my opinion, embedded CV and DL is the next big wave in the AI community. It’s so big that it may even be a tsunami — will you be riding that wave?

To help you get your start in embedded Computer Vision and Deep Learning, I have decided to write a brand new book — Raspberry Pi for Computer Vision.

I’ve chosen to focus on the Raspberry Pi as it is the best entry-level device for getting started in the world of computer vision for IoT.

But I’m not stopping there. Inside the book, we’ll:

  • Augment the Raspberry Pi with the Google Coral and Movidius NCS coprocessors.
  • Apply the same skills we learn with the RPi to a device with more horsepower: NVIDIA’s Jetson Nano.

Additionally, you’ll learn how to:

  • Build practical, real-world computer vision applications on the Pi.
  • Create computer vision and Internet of Things (IoT) projects and applications with the RPi.
  • Optimize your OpenCV code and algorithms on the resource-constrained Pi.
  • Perform Deep Learning on the Raspberry Pi (including using the Movidius NCS and OpenVINO toolkit).
  • Configure your Google Coral, perform image classification and object detection, and even train + deploy your own custom models to the Coral Edge TPU!
  • Utilize the NVIDIA Jetson Nano to run multiple deep neural networks on a single board, including image classification, object detection, segmentation, and more!

I’m running a Kickstarter campaign to fund the creation of the new book, and to celebrate, I’m offering 25% OFF my existing books and courses if you pre-order a copy of RPi for CV.

In fact, the Raspberry Pi for Computer Vision book is practically free if you pre-order it with Deep Learning for Computer Vision with Python or the PyImageSearch Gurus course.

The clock is ticking and these discounts won’t last — the Kickstarter pre-sale shuts down this Friday (May 10th) at 10AM EDT, after which I’m taking the deals down.

Reserve your pre-sale book now, and while you’re there, grab another course or book at a discounted rate.


Summary

In this tutorial, you learned how to get started with the NVIDIA Jetson Nano.

Specifically, you learned how to install the required system packages, configure your development environment, and install Keras and TensorFlow on the Jetson Nano.

We wrapped up by learning how to change the default camera and perform image classification and object detection on the Jetson Nano using the pre-supplied examples.

I’ll be providing a full comparison and benchmarks of the NVIDIA Jetson Nano, Google Coral, and Movidius NCS in a future tutorial.

To be notified when future tutorials are published here on PyImageSearch (including the Jetson Nano vs. Google Coral vs. Movidius NCS benchmark), just enter your email address in the form below!