GitLab Community Edition setup was not straightforward


We run GitLab on premises because, as more partners get involved in the projects, our boss doesn’t want our IP hosted on GitHub.

The official setup guide for Ubuntu 14.04 is not very friendly, and here are the points that made the installation less than straightforward.

  1. The Omnibus package is easy, but we cannot use it: it is good only “if the server is dedicated only to managing git repositories”. I turned to manual installation from source, because a 40-core server is too big to dedicate to GitLab alone, and I already have nginx running.
  2. The guide is for NGINX Plus, the paid version, and therefore does not apply well here; I am using the free version.
  3. Nginx has to proxy to the gitlab-workhorse socket. If it proxies to unicorn instead, the website works, but the git client cannot push (see the config sketch after this list).
  4. I really wanted to use my existing MySQL instance, but I could not make it work. Eventually I had to use PostgreSQL. The documentation says PostgreSQL is recommended, but MySQL still works for installation from source with some limitations.
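
For reference, here is a minimal sketch of the relevant nginx configuration, assuming a source installation where gitlab-workhorse listens on its default Unix socket; the socket path and hostname are assumptions, so adjust them to your setup:

    # Upstream pointing at the gitlab-workhorse Unix socket
    # (assumed default path for a source install under /home/git).
    upstream gitlab-workhorse {
        server unix:/home/git/gitlab/tmp/sockets/private/gitlab-workhorse.socket;
    }

    server {
        listen 80;
        server_name gitlab.example.com;  # hypothetical hostname

        location / {
            # Proxy to workhorse, not directly to unicorn,
            # otherwise git push over HTTP fails.
            proxy_pass http://gitlab-workhorse;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        }
    }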

Even though the installation was painful, GitLab works pretty well once it is up and running: http://maxv.maxxsports.cc:5575/bvsc/doc

References:

  1. https://gitlab.com/gitlab-org/gitlab-ce/blob/master/doc/install/installation.md
  2. https://www.nginx.com/resources/admin-guide/configuration-files/
  3. https://linode.com/docs/development/version-control/install-gitlab-on-ubuntu-14-04-trusty-tahr/

What does GenICam do for us?


It lets us customize the billboard content in sports games, e.g. for audiences in different countries.

We change part of the video, namely the billboard content, while streaming the game. The end result is that the billboard in the streamed video shows different content from the physical billboard at the venue.

Our current cameras, all IP cameras, have two problems for this job:

  1. the content needs to be decoded to process the billboard and then re-encoded, which puts too much load on the server
  2. it is hard to keep in sync due to the delay introduced by the codec and the internet.

GenICam cameras send raw video to our on-site server over a fiber network, with no decoding and negligible delay. The cameras are connected to an IP strobe controller, as seen in the picture.

Some components in addition to streaming are:

  1. Video processing: OpenCV for object detection and Bayer pixel manipulation (see the sketch after this list)
  2. 3D billboard content generation: OpenGL
  3. Pulling video from the cameras: GenICam SDK
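
As a rough illustration of the video-processing step, here is a minimal Python/OpenCV sketch that demosaics a raw Bayer frame and overwrites a billboard region; the Bayer pattern (RGGB), the region coordinates and the file name replacement.png are assumptions for illustration, not our production pipeline:

    import cv2
    import numpy as np

    def process_frame(raw_bayer: np.ndarray) -> np.ndarray:
        # Demosaic the raw Bayer frame delivered by the camera.
        # RGGB is assumed here; match the pattern to the actual sensor.
        bgr = cv2.cvtColor(raw_bayer, cv2.COLOR_BayerRG2BGR)

        # Placeholder billboard region; in practice this comes from
        # object detection. The coordinates are purely illustrative.
        x, y, w, h = 100, 50, 640, 120

        # Overwrite the billboard region with the replacement content.
        content = cv2.resize(cv2.imread("replacement.png"), (w, h))
        bgr[y:y + h, x:x + w] = content
        return bgr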


Comparing HLS and MPEG-DASH


After two years of HLS streaming, I still don’t see significant advantages in MPEG-DASH over HLS:
1. Standard: HLS made its way into RFC 8216 in 2017.
2. Live subtitles: HLS supports subtitles not only for VOD, but for live streams too.
3. HLS made an effort by providing fMP4 support, so that streaming servers don’t have to keep duplicate copies of the media files (.ts and .m4s) in order to satisfy both DASH and HLS clients (see the playlist sketch below).
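
As a rough sketch, an HLS media playlist for fMP4 segments looks like the following; the segment names and durations are made up for illustration:

    #EXTM3U
    #EXT-X-VERSION:7
    #EXT-X-TARGETDURATION:6
    #EXT-X-MEDIA-SEQUENCE:0
    #EXT-X-MAP:URI="init.mp4"
    #EXTINF:6.0,
    segment_000.m4s
    #EXTINF:6.0,
    segment_001.m4s
    #EXT-X-ENDLIST

A DASH MPD can reference the same init.mp4 and .m4s files, so only one copy of the media needs to be stored.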

Literally, what does YUV420 mean?


We know what YUV420 is, but literally what does the 0 mean?

In short, the 0 means that zero UV values are sampled in the second row.

Details:

4: for every 4 pixels in each row,

2: 2 UV samples in the first row,

0: 0 UV samples in the second row; the UV samples from the first row are reused.

Therefore, for YUV422, the second 2 means the second row is sampled the same way as the first row.
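
A quick way to see what the notation means in practice is to count samples. Here is a small Python sketch, using an arbitrary 1920x1080 frame as the example:

    def yuv_sample_counts(width, height, a, b):
        """Sample counts for a 4:a:b pattern.

        a = UV samples per 4-pixel segment in the first row,
        b = additional UV samples per 4-pixel segment in the second row.
        """
        y = width * height
        # Each 4-pixel-wide, 2-row block contributes a + b chroma samples.
        chroma = (width // 4) * (height // 2) * (a + b)
        return y, chroma, chroma  # (Y, U, V) sample counts

    print(yuv_sample_counts(1920, 1080, 2, 0))  # 4:2:0 -> (2073600, 518400, 518400)
    print(yuv_sample_counts(1920, 1080, 2, 2))  # 4:2:2 -> (2073600, 1036800, 1036800)

So 4:2:0 has a quarter as many UV samples as Y samples, while 4:2:2 has half as many.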