How to access TinyRex peripherals (Examples)

This page content was created by: Dale, Allan, Martin, Robert. Thank you very much for your contributions.


On this page, you will find examples of how to use the peripherals on TinyRex.

HDMI Video Input

Setup

This tutorial assumes you are using an i.MX6 TinyRex with a Tiny Baseboard that includes an HDMI input and an ADV7610 HDMI decoder. You’ll also need to load an image onto the SD card that includes gstreamer-0.10 and the Freescale Gstreamer Plugins. If you are not sure what I am talking about, simply follow How to start with TinyRex YOCTO and run “bitbake fsl-image-gui” (fsl-image-multimedia should also work).

Drivers

Once your i.MX6 TinyRex is up and running, you'll need to load a kernel module. To do so, use the command:

modprobe mxc_v4l2_capture
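If the module loaded correctly, a V4L2 capture device node should appear (it is typically /dev/video0, but the exact node can differ depending on your image):

ls -l /dev/video*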

You should also ensure that the ADV7610 was recognized and its kernel module was loaded. Verify with the command:

root@imx6q-tinyrex:~# lsmod |grep adv7610

adv7610_video 6202 0

v4l2_int_device 2981 3 ov5647_camera_mipi,adv7610_video,mxc_v4l2_capture

As long as you see an adv7610_video line, you’re good to go!
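You can also check the kernel log for the ADV7610 probe messages (the exact wording of the messages depends on the driver version):

dmesg | grep -i adv7610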

Gstreamer

Freescale heavily uses Gstreamer in both its BSP and its example code. Gstreamer is a very handy way to create video "pipelines". A pipeline consists of elements put together in a particular order to convert video or audio data to encoded and/or packaged data, or vice versa. In our case, we want to take video input from the ADV7610 (connected to the i.MX6's CSI input), convert or encode it, and display it on the HDMI output or transport it over a network. A standard Gstreamer h.264 encoding pipeline might look something like:

[ video_source ] -> [ h.264 encoder ] -> [ rtp payloader ] -> [ network sink ]

and simultaneously on the receiving, decoding side:

[ network source ] -> [ rtp depayloader ] -> [ h.264 decoder ] -> [ HDMI output ]

Gstreamer allows you to create these pipelines quickly and easily from the command line using the gst-launch utility and the various Gstreamer plugins that make up the elements of the pipeline. To get a list of available plugins, use the gst-inspect command. You can then run gst-inspect on a particular plugin to get more details, e.g.:

gst-inspect imxv4l2src
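You can also filter the full plugin list to see which Freescale elements are present on your image (the imx, mfw and vpu prefixes below are just the common naming conventions in this BSP, so adjust the filter as needed):

gst-inspect | grep -i -e imx -e mfw -e vpu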

For a much more detailed discussion of Gstreamer, see their gst-launch tutorials here.

Example Gstreamer Pipelines: HDMI input -> HDMI output

The ADV7610 HDMI decoder uses the Freescale v4l2 input module, mxc_v4l2_capture. Freescale’s Gstreamer plugin imxv4l2src takes video from the mxc_v4l2_capture and makes it available to subsequent plugins.

On the output side, there are two available output sinks: imxv4l2sink and mfw_isink. As the name implies, imxv4l2sink uses the v4l2 library to output to the first framebuffer device. Alternatively, mfw_isink writes data directly to the framebuffer, skipping v4l2. It can also do other useful things, like output to multiple framebuffers simultaneously, and its latency is much lower than that of imxv4l2sink. However, it does require more configuration and can be difficult to work with depending on the window manager being used. Generally, the best output sink to use is autovideosink, which will automatically select an appropriate sink element for you.

So, connect an HDMI source (like your computer, a GoPro, or a Blu-ray player) to the HDMI input on your TinyRex baseboard, and the HDMI output to a monitor. To display the HDMI input on your monitor, run the pipeline:

gst-launch imxv4l2src ! autovideosink
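If you prefer to pick the sink yourself rather than relying on autovideosink, pipelines along these lines should also work (element availability and property names can vary between BSP releases):

# direct framebuffer output via mfw_isink (lower latency)
gst-launch imxv4l2src ! mfw_isink
# or output through the v4l2 sink
gst-launch imxv4l2src ! imxv4l2sink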

Example Gstreamer Pipelines: HDMI input -> encoder -> network

If you want to send video over a network, you will need to encode and payload it first. As an example, let's use Freescale's vpuenc plugin, which is capable of using the i.MX6's hardware encoding engine (the VPU) to encode video into MPEG4, MJPEG, h.263 and h.264 formats. To get the video in a form that video players and other Gstreamer clients can understand, we'll payload the h.264 encoded video using RTP and the rtph264pay plugin. Finally, we'll send the video over UDP to a client on port 5550.

gst-launch imxv4l2src ! vpuenc codec=avc bitrate=4000000 ! rtph264pay ! udpsink host=192.168.1.1 port=5550 -v

Note that I’ve added arguments to the vpuenc and udpsink elements. Specifically, we use the “avc” codec for vpuenc (telling it to use h.264) at a fixed bitrate of 4Mbps. Additionally, we tell the udpsink to send the data via UDP to 192.168.1.1 on port 5550.
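To see the full set of properties that vpuenc and udpsink accept (codec, bitrate, and so on), inspect the elements as described earlier:

gst-inspect vpuenc
gst-inspect udpsink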

You’ll also notice the -v argument on the end (verbose). This tells gst-launch to spit out a bunch of additional information, including the “capabilities” of each element in the pipeline. These capabilities are important: they tell other elements about the stream, like width/height and the encoded stream type. In order to display the video in a receiving pipeline, we need to tell the receiver pipeline what the capabilities of the stream are.  Specifically, we need the capabilities at the udpsink element:

/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0JAH6aAUAW5AA\\=\\=\\,aM44gAA\\=\", payload=(int)96, ssrc=(uint)3515352658, clock-base=(uint)3053564270, seqnum-base=(uint)53648

Example Gstreamer Pipelines: Network -> decoder -> HDMI output

We can now create a receiving pipeline, either on your computer or on another i.MX6 TinyRex. Let's use the example of another i.MX6 TinyRex, where we use the vpudec and rtph264depay elements to again leverage the i.MX6's hardware h.264 decoder to decode the video and display it on the HDMI output. Note: we need to use the caps from the encoding pipeline in the udpsrc stream to tell the receiving pipeline about the capabilities of the stream. Do this by setting the "caps" property of the udpsrc to the same properties, copied and pasted from above and put in double quotes:

gst-launch udpsrc port=5550 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0JAH6aAUAW5AA\\=\\=\\,aM44gAA\\=\", payload=(int)96, ssrc=(uint)3515352658, clock-base=(uint)3053564270, seqnum-base=(uint)53648" ! rtph264depay ! vpudec low-latency=true ! autovideosink

The vpudec automatically knows what type of stream to decode from the capabilities passed to it from rtph264depay, but I do add the low-latency=true argument to have it produce frames as soon as they are decoded. The autovideosink is again used to output on the HDMI.
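If the receiver is a desktop PC rather than another TinyRex, a software decoder can stand in for vpudec. The sketch below assumes the gstreamer-0.10 ffmpeg plugin (ffdec_h264) is installed on the PC; the caps are taken from the encoding pipeline, keeping only the essential fields:

gst-launch udpsrc port=5550 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0JAH6aAUAW5AA\\=\\=\\,aM44gAA\\=\", payload=(int)96" ! rtph264depay ! ffdec_h264 ! autovideosink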

Example: Decoding with VLC on your computer

VLC is also capable of decoding and displaying video streamed to it from the i.MX6. However, much like how we had to tell the receiving pipeline about the capabilities of the stream, we need to do the same for VLC through the use of an .sdp file. Here is an example:

v=0
o=- 1223457093460663 1 IN IP4 127.0.0.1
s=RTSP Server
i=Codec00
t=0 0
a=tool:LIVE555 Streaming Media v2008.07.24
a=type:broadcast
b=AS:21
a=control:*
a=source-filter: incl IN IP4 127.0.0.1
a=rtcp-unicast: reflection
m=video 5550 RTP/AVP 96    
a=rtpmap:96 H264/90000
a=framerate:30
a=fmtp:96 profile-level-id=42001e; sprop-parameter-sets=Z0JAH6aAUAW5AA==,aM44gAA=

This .sdp file is compatible with the above pipeline. Specifically, you'll need to set the port (5550) as well as the sprop-parameter-sets from the capabilities of the stream. It is also important to specify that this is an RTP h.264 stream. To view the video, just redirect the encoding pipeline to your computer's IP address and open the .sdp file in VLC. You should see video after a few seconds.
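A minimal end-to-end check might look like this, assuming your computer's address is 192.168.1.100 (substitute your own IP) and the SDP description above has been saved on the computer as stream.sdp:

# on the TinyRex: point the encoding pipeline at the computer
gst-launch imxv4l2src ! vpuenc codec=avc bitrate=4000000 ! rtph264pay ! udpsink host=192.168.1.100 port=5550 -v
# on the computer: open the SDP file in VLC
vlc stream.sdp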

Freescale also offers example Gstreamer pipelines here.

Output video

Changing HDMI output resolution

Before you start, run the following command:

export DISPLAY=:0

Check the output resolution and display name:

root@imx6q-tinyrex:~# xrandr
Screen 0: minimum 240 x 240, current 1280 x 720, maximum 8192 x 8192
DISP3 BG connected 1280x720+0+0 (normal left inverted right x axis y axis) 0mm x 0mm
   S:1920x1080p-30  30.00
   S:1920x1080p-25  25.00
   S:1920x1080p-24  24.00
   S:1280x720p-50  50.00
   S:1280x720p-60  60.00*
   D:1280x720p-60  60.00
   S:720x576p-50  50.00
   S:720x480p-60  59.94
   V:640x480p-60  59.94
root@imx6q-tinyrex:~#

Set new resolution:

xrandr --output "DISP3 BG" --mode S:1280x720p-60
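Any other mode from the xrandr list above can be selected the same way, for example 1080p at 30 Hz:

xrandr --output "DISP3 BG" --mode S:1920x1080p-30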

UART Serial Port

We can use a basic python script to test the UART ports on the TinyRex board.
Connect the FTDI cable to the console debug port on the TinyRex board and open your favourite serial terminal (e.g. PuTTY or TeraTerm for Windows, CoolTerm for OS X), or alternatively ssh into the TinyRex with your favourite ssh terminal.

Log in to the TinyRex; the default username is ‘root’ with no password.

In this example we will use 9600bps 8-N-1 on UART2. We can create a python script using the nano text editor;

mkdir -pv ~/examples/python
cd ~/examples/python
nano serial-test.py

paste the following into nano;

import time
import serial

# configure the serial connection (the parameters differ depending on the device you are connecting to)
ser = serial.Serial(
    port='/dev/ttymxc1',
    baudrate=9600,
    parity=serial.PARITY_NONE,
    stopbits=serial.STOPBITS_ONE,
    bytesize=serial.EIGHTBITS
)

ser.isOpen()

print 'Enter your commands below.\r\nInsert "exit" to leave the application.'

input=1
while 1 :
    # get keyboard input
    input = raw_input(">> ")
    # Python 3 users
    # input = input(">> ")
    if input == 'exit':
        ser.close()
        exit()
    else:
        # send the character to the device
        # (note that I append a \r\n carriage return and line feed to the characters - this is required by my device)
        ser.write(input + '\r\n')
        out = ''
        # let's wait one second before reading output (let's give device time to answer)
        time.sleep(1)
        while ser.inWaiting() > 0:
            out += ser.read(1)

        if out != '':
            print ">>" + out

Press Ctrl+O to write the file, then Ctrl+X to exit nano.

Start the python script by running;

python serial-test.py

You should now see the python prompt >>.
Type some characters and they should be transmitted, and you should also see any characters received from the connected device in the terminal.
You have successfully tested the UART ports on the TinyRex board.

To use other ports, they are mapped as follows;
UART1 = ttymxc0 (used by the console)
UART2 = ttymxc1
UART3 = ttymxc2
UART4 = ttymxc3
UART5 = ttymxc4
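As a quick sanity check without Python, you can also drive a UART directly from the shell (this is just a sketch, assuming something is connected to UART2 or its RX and TX are looped back):

# configure UART2 for 9600bps 8-N-1
stty -F /dev/ttymxc1 9600 cs8 -cstopb -parenb
# send a test string
echo "hello" > /dev/ttymxc1
# print whatever the connected device sends back (Ctrl+C to stop)
cat /dev/ttymxc1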

GPIO

As an example we will use the baseboard USER LED. First, check which CPU pin is used to control the LED. Have a look at the schematic and you will find that the baseboard LED is connected to GPIO3_29.

Use this equation to get the magic number we need: for pin GPIOX_YY, MagicNumber = (X - 1)*32 + YY
Note: In our case, MagicNumber = (3 - 1)*32 + 29 = 93

Now run the following commands:

echo 93 > /sys/class/gpio/export
cd /sys/class/gpio/gpio93/

To change GPIO direction use:

echo "out" > direction
# echo "in" > direction

To change value of the GPIO pin:

echo 1 > value
echo 0 > value

To read value from the GPIO pin:

cat value
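Putting the steps together, a short script to blink the user LED might look like this (a sketch using the same sysfs paths as above; if the pin is already exported from the steps earlier, the export line will simply report that it is busy):

# export GPIO3_29 (sysfs number 93) and set it as an output
echo 93 > /sys/class/gpio/export
echo out > /sys/class/gpio/gpio93/direction
# blink the LED three times
for i in 1 2 3; do
    echo 1 > /sys/class/gpio/gpio93/value
    sleep 1
    echo 0 > /sys/class/gpio/gpio93/value
    sleep 1
done
# release the pin when finished (optional)
echo 93 > /sys/class/gpio/unexport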