
History

  • Started: 2020-04-14
  • Drafted: 2020-04-27


About



NNStreamer, an LF AI Foundation open source project, is an efficient and flexible stream pipeline framework for complex neural network applications. It was developed and open-sourced by Samsung.


NNStreamer provides a set of GStreamer plugins so that developers can easily apply neural networks, attach related frameworks (including ROS, IIO, FlatBuffers, and Protocol Buffers), and manipulate tensor data streams in GStreamer pipelines, and can execute such pipelines efficiently. It has already been adopted by various Samsung Android and Tizen devices, which shows that it is reliable and robust enough for commercial products.
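
For instance, a complete pipeline that converts raw video frames into tensors and runs them through a neural network fits in a single gst-launch-1.0 command. The sketch below assumes a TensorFlow Lite image-classification model with a 224x224 RGB input; model.tflite is a placeholder file name.

  # tensor_converter turns video/x-raw frames into tensor streams;
  # tensor_filter invokes the selected framework (here TensorFlow Lite) on each tensor.
  gst-launch-1.0 videotestsrc ! videoconvert ! videoscale ! \
    video/x-raw,width=224,height=224,format=RGB ! tensor_converter ! \
    tensor_filter framework=tensorflow-lite model=model.tflite ! \
    tensor_sink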


It supports well-known neural network frameworks, including TensorFlow, TensorFlow Lite, Caffe2, PyTorch, OpenVINO, Arm NN, and NEURUN. Besides such frameworks, users may plug custom C functions, C++ objects, or Python objects into a pipeline as neural network filters at run-time. Support for additional frameworks or hardware AI accelerators may also be added and integrated at run-time, shipped as independent plugin binaries.
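
As a sketch of the custom-filter case, a C/C++ filter compiled as a standalone shared object can be loaded at run-time with framework=custom; the library path below is hypothetical and stands for any filter built against the NNStreamer custom-filter API.

  # The .so path is a placeholder for a user-built custom filter.
  gst-launch-1.0 videotestsrc ! videoconvert ! \
    video/x-raw,width=64,height=64,format=RGB ! tensor_converter ! \
    tensor_filter framework=custom model=/usr/lib/nnstreamer/customfilters/libmyfilter.so ! \
    tensor_sink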


NNStreamer's official binary releases include support for Tizen, Ubuntu, Android, macOS, and Yocto/OpenEmbedded; however, as long as the target system supports GStreamer, it should be compatible with NNStreamer as well. We provide APIs in C, Java, and .NET in case the GStreamer APIs are overkill. The NNStreamer APIs also serve as the standard machine learning APIs of Tizen and of various Samsung products.



Open Source


NNStreamer was open-sourced on GitHub in 2018. It has been actively developed since then and has several sub-projects. NNStreamer joined the LF AI Foundation in April 2020.


We invite you to visit GitHub, where NNStreamer and its sub-projects are developed. Please join our community as a user and a contributor. Your contributions are always welcome!


Get Started


Ubuntu (16.04/18.04)

  • sudo add-apt-repository ppa:nnstreamer/ppa
  • sudo apt-get update
  • sudo apt-get install nnstreamer nnstreamer-caffe2 nnstreamer-tensorflow nnstreamer-tensorflow-lite
  • # Now, you are ready to use nnstreamer as GStreamer plugins!
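
If the installation succeeded, a quick sanity check (beyond the steps above) is to ask GStreamer to list the NNStreamer plugin and its elements:

  gst-inspect-1.0 nnstreamer       # lists the plugin's elements (tensor_converter, tensor_filter, ...)
  gst-inspect-1.0 tensor_filter    # shows the pads and properties of a single element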

In Tizen 5.5 or higher, use the Machine Learning Inference APIs (Native / .NET) to use NNStreamer in Tizen applications.

Use the JCenter repository to use NNStreamer in Android Studio.

Yocto/OpenEmbedded's meta-neural-network layer includes NNStreamer.

macOS users may install NNStreamer via Homebrew taps or build NNStreamer for their own systems.

In general, you may build NNStreamer on any GStreamer-compatible system, as sketched below.
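
For such source builds, NNStreamer uses the Meson build system. The steps below are a rough sketch of a typical out-of-source build and may need per-platform adjustments; consult the repository documentation for the authoritative instructions.

  git clone https://github.com/nnstreamer/nnstreamer.git
  cd nnstreamer
  meson build              # configure an out-of-source build directory
  ninja -C build           # build the GStreamer plugins
  sudo ninja -C build install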


Usage Examples

The applications shown in these screenshots require only very short pieces of code and run efficiently on inexpensive embedded devices. With NNStreamer, some of them can even be implemented as single-line bash shell scripts, as sketched below.
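
As an illustration of such a one-liner, the sketch below classifies live camera frames and prints the resulting labels to stdout; the model and label files are placeholders for a MobileNet-style image-classification model and its labels file.

  # tensor_decoder (mode=image_labeling) converts model output into text labels;
  # fdsink writes the labels to stdout.
  gst-launch-1.0 v4l2src ! videoconvert ! videoscale ! \
    video/x-raw,width=224,height=224,format=RGB ! tensor_converter ! \
    tensor_filter framework=tensorflow-lite model=mobilenet.tflite ! \
    tensor_decoder mode=image_labeling option1=labels.txt ! fdsink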

Example applications are available on GitHub (nnstreamer-example.git) and on the Wiki page.


