TODO
The VideoShader class will be highly extendable by the user, and will also support editing and applying shaders dynamically at runtime. Just describe your custom shader code like below:
```cpp
// declare the uniforms used by the custom shader code
QStringList userUniforms() const { return QStringList() << "u_texelSize" << "u_kernel"; }
GLuint uniformLocation(const QString& name); // implemented in the base class, not virtual
QVariant uniformValue(const QString& name) { ... } // return the current value for a uniform
```
VideoRenderer should expose `OpenGLVideo* openglVideo();` so that the user can control the shader.
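As a sketch of the GLSL side of this interface, assuming `u_kernel` holds a 3x3 convolution kernel, `u_texelSize` is the size of one texel, and the user sampling function is plugged in as a hypothetical `sample2d()` that replaces the plain texture lookup:

```glsl
// hypothetical user sampling function: a 3x3 convolution around pos
// instead of a plain texture2D() lookup
uniform vec2 u_texelSize;   // 1.0 / texture size in pixels
uniform float u_kernel[9];  // 3x3 kernel weights, row-major

vec4 sample2d(sampler2D tex, vec2 pos) {
    vec4 sum = vec4(0.0);
    for (int i = 0; i < 3; ++i) {
        for (int j = 0; j < 3; ++j) {
            vec2 offset = vec2(float(j) - 1.0, float(i) - 1.0) * u_texelSize;
            sum += u_kernel[i * 3 + j] * texture2D(tex, pos + offset);
        }
    }
    return sum;
}
```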
The user-configurable shader code consists of a sampling function and a post-processing function. The sampling function applies to input textures of any type, for example YUV. For a sampling function with a convolution kernel, this has the same effect as applying the kernel to the color-transformed RGB texture, ensured by the fact that the two linear sums commute:
\sum_i c_i \sum_j k_j x_{ij} = \sum_j k_j \sum_i c_i x_{ij}
Also, because the input YUV comes from a real RGB color, no clamping of the transformed color is required.
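Spelled out (introducing $M$ for one row of the YUV-to-RGB matrix with coefficients $c_i$, $x_i$ for the input planes, $k_j$ for the kernel weights, and $p$ for the output pixel; this notation is ours, added for illustration), the identity is just an exchange of two finite linear sums:

```latex
\bigl(\mathrm{conv}(Mx)\bigr)_p
  = \sum_j k_j \sum_i c_i \, x_{i,\,p+j}
  = \sum_i c_i \sum_j k_j \, x_{i,\,p+j}
  = \bigl(M\,\mathrm{conv}(x)\bigr)_p
```

So convolving the YUV planes and then color-converting produces the same pixels as color-converting first and then convolving the RGB result.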
The idea is very simple: compute an interpolated frame from the given frames. For example, given the current frame and the next decoded frame, linear interpolation yields a frame at any time between the current and the next decoded frame's pts. At least 2 FBOs are required. Maybe we can use the current and the previous decoded frame instead.
It also works without FBOs, but more sampler2D uniforms are required. Maybe we can use User Configurable Shaders, as sketched below.
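A minimal sketch of the no-FBO variant as a fragment shader, assuming hypothetical uniform names (`u_texture0`/`u_texture1` for the two decoded frames, and `u_t = (pts - pts0) / (pts1 - pts0)` computed on the C++ side):

```glsl
uniform sampler2D u_texture0; // frame at pts0, e.g. the current frame
uniform sampler2D u_texture1; // frame at pts1, e.g. the next decoded frame
uniform float u_t;            // interpolation factor in [0, 1]
varying vec2 v_TexCoords;

void main() {
    vec4 c0 = texture2D(u_texture0, v_TexCoords);
    vec4 c1 = texture2D(u_texture1, v_TexCoords);
    gl_FragColor = mix(c0, c1, u_t); // per-pixel linear interpolation
}
```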
There are some special formats used by some vendors, tiled layouts for example. Tiled rendering can benefit from those layouts, but we have to convert them to the standard layouts for OpenGL rendering. If we can get plane textures from OMX (0-copy), it is very simple to convert tiled to standard layout using GLSL. Again, we can use the User Configurable Shaders technique, creating a sampling function for tiled formats. The tile info can be stored in VideoFrame.metaData.
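A simplified sketch of such a sampling function, assuming a hypothetical row-major tiled layout whose tile geometry is read from VideoFrame.metaData and uploaded as uniforms (real vendor layouts, e.g. Z-ordered tiles, need a more involved address computation):

```glsl
uniform vec2 u_textureSize; // plane size in pixels
uniform vec2 u_tileSize;    // tile size in pixels, e.g. vec2(64.0, 32.0)

// map a linear (display) coordinate to its location in the tiled buffer,
// which is uploaded as-is as a texture of the same size
vec4 sample2dTiled(sampler2D tex, vec2 pos) {
    vec2 pixel = floor(pos * u_textureSize);  // linear pixel position
    vec2 tile = floor(pixel / u_tileSize);    // tile column/row
    vec2 inTile = pixel - tile * u_tileSize;  // offset inside the tile
    float tilesPerRow = u_textureSize.x / u_tileSize.x;
    // tiles are stored consecutively, row-major; pixels inside a tile row-major
    float addr = (tile.y * tilesPerRow + tile.x) * u_tileSize.x * u_tileSize.y
               + inTile.y * u_tileSize.x + inTile.x;
    vec2 src = vec2(mod(addr, u_textureSize.x), floor(addr / u_textureSize.x));
    return texture2D(tex, (src + 0.5) / u_textureSize);
}
```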