Lesson 8: API level 2

General aspects

API level 2 tutorial codes are available at:

cd valkka_examples/api_level_2/tutorial
python3 lesson_8_a.py

By now you have learned how to construct complex filtergraphs with framefilters and threads, and, in lesson 3, how to encapsulate parts of a filtergraph into separate classes.

API level 2 does just that. It encapsulates some common cases into compact classes, starts the decoding threads for you, and provides easily accessible end-points (for the posix shared memory interface, etc.) for the API user.

The API level 2 filterchains, threads and shared memory processes can be imported with

from valkka.api2 import ...

API level 2 also provides extra wrapping for the thread classes (LiveThread, OpenGLThread, etc.). The underlying API level 1 instances can be accessed like this:

from valkka.api2 import LiveThread

livethread=LiveThread("live_thread")
livethread.core # this is the API level 1 instance, i.e. valkka.valkka_core.LiveThread

Keep in mind never to do a wildcard import from both API levels simultaneously, i.e.

# NEVER DO THIS!
from valkka.valkka_core import *
from valkka.api2 import *

as the threads (LiveThread, OpenGLThread, etc.) have identical names.

The PyQt testsuite also serves as an API level 2 reference.

A simple example

Download lesson [here]

First, import API level 2:

import time
from valkka.api2 import LiveThread, OpenGLThread
from valkka.api2 import BasicFilterchain

Instantiating the API level 2 LiveThread starts running the underlying cpp thread:

livethread=LiveThread( # starts live stream services (using live555)
  name   ="live_thread",
  verbose=False,
  affinity=-1
)

Same goes for OpenGLThread:

openglthread=OpenGLThread(
  name    ="glthread",
  n_720p   =20,   # reserve stacks of YUV video frames for various resolutions
  n_1080p  =20,
  n_1440p  =0,
  n_4K     =0,
  verbose =False,
  msbuftime=100,
  affinity=-1
)

The filterchain and decoder (AVThread) are encapsulated into a single class. Instantiating starts the AVThread (decoding is off by default):

chain=BasicFilterchain( # decoding and branching the stream happens here
  livethread  =livethread,
  openglthread=openglthread,
  address     ="rtsp://admin:nordic12345@192.168.1.41",
  slot        =1,
  affinity    =-1,
  verbose     =False,
  msreconnect =10000 # if no frames in ten seconds, try to reconnect
)

BasicFilterchain takes the LiveThread and OpenGLThread instances as arguments and creates the relevant connections between the threads.

Next, create an x-window, map stream to the screen, and start decoding:

# create a window
win_id =openglthread.createWindow()

# create a stream-to-window mapping
token  =openglthread.connect(slot=1,window_id=win_id) # present frames with slot number 1 at window win_id

# start decoding
chain.decodingOn()
# stream for 20 secs
time.sleep(20)

Finally, close the threads in beginning-to-end order:

livethread.close()
chain.close()
openglthread.close()
print("bye")