====== Viber - the video monitor ======

Scratches on newspaper margins.. towards a framework of synchronized video (playback, editing, monitoring).

===== jottings =====

Internally there'll be a (u)int64_t timestamp representation and a fractional framerate for each Session. Those timestamps are mapped to the timestamps in the video file(s) (part of the session). The display can provide the sync, or be synchronized to an external time source with minimum jitter and latency compensation - see [[wiki:syncshots:start|synch-tests]].

Make it as modular as possible.

  * decoders below: gavl, libquicktime, ffmpeg, gstreamer(?)
  * make the frame-buffer accessible to 3rd party applications (eg. when decoding a given frame, make it available to a video-timeline); cache several (two) formats.
  * include an XV sync frontend, but expose an API for other non-rt displays (openGL, processing or file/net socket i/o)
  * include timecode modifying plugins (ie EDL)
  * pipe the latency-compensated timestamp through external plugins to create automation data that controls the decoder(-plugins)

===== Design =====

<code>
    +-----------------------------------------+
    |cYEL            Display(s)               |
    |          +---------------------+        |
    |          |       Gui/Wm        |        |
    +----------+----------+----------+--------+
    | signals  |  slots   |  vblank  | buffer |   external
    +---+------+----^-----+----+-----+---^----+   app
        |           |          |         |         :
        v           |          v         |         :
+-------+--+   +----+----------+----+  +-+-----+   :
|(remote)  |   |Screen              |  | frame +---+
|control   +<->+Control             +->| cache |
+---+------+   |                    |<-+       |
    |          +----+---------------+  +-------+
    |               |
    v               v
+----------+   +----+---------------+  +-------+
|running   |   |cGRE Session        |  |  FR   |
|config    |   |     Control        +->| conv  |
|       {d}|   |                    |<-+       |
+----------+   +----+----------+----+  +-------+
                    :          ^
                    :          |
                    v          |
               /----+----------+----\
               |cYEL                |
               |decoder framework   |<--- optional ext. sync
               | +---------------+  |
               | | convert       |  |    +-----------+
               | +---------------+  |    |file(s) or |
               | | scale         |  +--->|stream(s)  |
               | +---------------+  |    |        {s}|
               | | plugin(s)     |  |    +-----------+
               | +---------------+  |
               \--------------------/

/---------------------------------------------------\
: Legend                                            |
|cYEL                                               |
|                                                   |
| +-> data flow; actually in reverse direction      |
| |   since pointers are passed up the stream       |
| :   towards the source.                           |
|                                                   |
| --> control signals                               |
\---------------------------------------------------/
</code>

==== Interfaces ====

  * Display - slots
    * resize
    * reposition
    * fullscreen, always on top
    * mouse hide/show
    * etc
  * Display - signals
    * keypress
    * mouse events
    * redraw
    * etc
  * Display - video
    * flip to buffer
    * set buffer data
  * Decoder - here a simplified API:
    * ''openFile(path|streamURL?)''
    * ''getFileInfo(VC, ..)'' - returns possible data formats, framerate, time offset, etc.
    * ''SetRenderFormatandSize(dataformat, size)''
    * ''SetDecoderParams(seekparams)''
    * ''storeFrameandInfo(timecode, *data)'' - there'd need to be functions to pass automation data to plugins for a given frame, and a small interface to invalidate cached frames once this data changes. Since this processing happens between the decoder and the cache, we can neglect it for the time being, as long as we can provide a timestamp.
    * ..

===== Thread model =====

There can be multiple display threads (one for each window) and decoder threads (one for each file). However, there is only **one** //master// sync source (thread) at a given time. How this sync source affects the different tracks is specified by the session, which can include a per-track transport that can be derived from the master in arbitrary ways.

The buffer thread needs to ensure that the cache will contain the data that the video-display thread will need in the near future. There is **one** buffer thread that spawns decoders in the background (the decoders should finish their work before the display thread comes along).
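The (u)int64_t timestamp plus fractional-framerate representation from the jottings above comes down to integer rational arithmetic when mapping session time to frame numbers. A minimal sketch in C (the struct and function names are hypothetical, and microseconds as the session tick unit is an assumption):

```c
#include <stdint.h>

/* Hypothetical session timebase: timestamps are microseconds, the
 * framerate is a fraction rate_num/rate_den (e.g. 30000/1001 for
 * 29.97 fps). All arithmetic stays in int64_t; with microsecond
 * timestamps the product below overflows only after several years. */
typedef struct {
    int64_t rate_num;   /* framerate numerator,   e.g. 30000 */
    int64_t rate_den;   /* framerate denominator, e.g. 1001  */
} session_rate;

/* Frame number whose presentation interval contains timestamp ts_us
 * (truncating division; assumes ts_us >= 0). */
static int64_t frame_at(const session_rate *r, int64_t ts_us)
{
    return (ts_us * r->rate_num) / (r->rate_den * 1000000);
}

/* Timestamp (in microseconds) at which the given frame starts. */
static int64_t frame_start_us(const session_rate *r, int64_t frame)
{
    return (frame * r->rate_den * 1000000) / r->rate_num;
}
```

At 30000/1001, one second into the session is still frame 29 (29.97 fps), and frame 30 starts at 1001000 µs - exact, with no floating-point drift, which is what the buffer thread's frame-to-timecode map needs.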
The buffer-thread is mainly concerned with latency compensation, and generates a map of video-frames to timecode ((The underlying decoder->framecache may perform additional transformations in the timecode (eg. EDL).)). The buffer thread has different modes of operation:

  * still picture
  * play (readahead, compensate latency)

The play-mode should recognize varispeed or backwards playback. The buffer-thread need not be a thread but can be a callback from the sync-source ((In principle the buffer-function can be run periodically from the jack_process callback. But this is not practical since it can add up to a hundred ms of latency: 4096/44100.)). There's more to be said about this.

==== Video Display Thread ====

This thread only flips buffers and stores a timestamp along with the screen's retrace count, at a frequency between 25 and 120 Hz. It synchronizes to the DRI vblank (wait for interrupt - ''GLXWaitVideoSyncSGI'', or directly ''DRM_IOCTL_WAIT_VBLANK'' with ''DRM_VBLANK_RELATIVE''); if unavailable, these could be substituted by an RTC or HPET device sync, or a realtime thread (''jack_client_create_thread(..)'').

Note: this thread should wake up the //UI interaction Thread// to call ''XFlush()'', so as to prevent asynchronous X11 replies.

ToDo: check for a fallback: what if there's no VGA ;)

ToDo: figure out the latency of ''XvShmPutImage()'' relative to the retrace count.

==== UI interaction Thread or Callbacks ====

  - handles XPending() events
  - processes remote-control cmds (MIDI?, OSC, stdio, ..)
  - modifies (loads/saves) the session

==== Sync Source Thread or Callbacks ====

Provides a clock source. Linking this to audio hardware, the thread would run in intervals of around 50Hz (48k/1024 or 192k/4096), but that may vary from 1Hz (8k/8192) up to 6kHz (192k/32).

thread //plugins//

  * dummy - fixed frame
  * jack_transport
  * jack freewheeling (?!)
    skip video frames to follow as-fast-as-possible audio (a video progress monitor) - as-fast-as-possible video makes no sense
  * amidi_seq thread (MTC)

callback //plugins// - wake up:

  * jack_audio (LTC)
  * jack_audio (debug; scope - connect to VGA luminosity)
  * jack_midi (MTC)

timecode - framerates: 23(d|n), 24d, 29(n|d), 30d, 59(n|d) or 60d (d for drop-frame, n for not realtime)

==== Decoder, Buffer Thread(s) ====

The buffer thread links them all together.
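Returning to the Video Display Thread above: since it stores a timestamp along with the screen's retrace count, any other thread can extrapolate when a given retrace will occur. The (sequence, timestamp) pairs would come from ''DRM_IOCTL_WAIT_VBLANK'' or ''GLXWaitVideoSyncSGI''; the extrapolation itself is just rational arithmetic. A sketch (all names hypothetical; 32-bit counter wrap-around is ignored here):

```c
#include <stdint.h>

/* One sample published by the display thread: the retrace counter at
 * the last buffer flip, its timestamp, and the nominal refresh rate
 * as a fraction hz_num/hz_den (e.g. 60/1). */
typedef struct {
    uint32_t seq;     /* retrace counter at last flip       */
    int64_t  ts_us;   /* timestamp of that retrace, in µs   */
    int64_t  hz_num;  /* refresh rate numerator             */
    int64_t  hz_den;  /* refresh rate denominator           */
} vblank_ref;

/* Predicted timestamp (µs) of retrace number `seq`; `seq` may lie in
 * the past as well as the future (delta can be negative). */
static int64_t vblank_time_us(const vblank_ref *v, uint32_t seq)
{
    int64_t delta = (int64_t)seq - (int64_t)v->seq;
    return v->ts_us + delta * 1000000 * v->hz_den / v->hz_num;
}
```

This is the arithmetic the buffer thread would use to decide which frame must be in the cache by which retrace, with the measured ''XvShmPutImage()'' latency (the ToDo above) added on top.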
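The drop-frame rates in the timecode list above imply the usual SMPTE relabeling: at 29.97 fps, frame labels 00 and 01 are skipped at the start of every minute, except every tenth minute. A sketch of that standard conversion (not project code; the function name is made up):

```c
#include <stdint.h>

/* SMPTE 29.97 drop-frame timecode from an absolute frame number.
 * 17982 = 10*60*30 - 9*2  frames per 10-minute block;
 * 1798  = 60*30 - 2       frames per minute in which labels drop. */
static void df2997_timecode(int64_t frame,
                            int *hh, int *mm, int *ss, int *ff)
{
    int64_t ten_min = frame / 17982;
    int64_t rem     = frame % 17982;
    int64_t minutes;

    if (rem < 1800) {            /* first minute of the block: no drop */
        minutes = ten_min * 10;
    } else {                     /* minutes 1..9 of the block: drop */
        rem -= 1800;
        minutes = ten_min * 10 + 1 + rem / 1798;
        rem = rem % 1798 + 2;    /* relabel past the dropped 00 and 01 */
    }
    *hh = (int)(minutes / 60);
    *mm = (int)(minutes % 60);
    *ss = (int)(rem / 30);
    *ff = (int)(rem % 30);
}
```

For example, frame 1800 labels as 00:01:00;02 (00 and 01 are dropped), and frame 107892 lands exactly on 01:00:00;00 - the property a timecode-modifying plugin must preserve.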