This requires functions without a prototype definition to be static.
This allows us to detect dead code, export fewer symbols and put shared
functions in headers.
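For illustration, a minimal sketch of the convention this enforces
(assuming it's backed by a warning such as GCC/Clang's
-Wmissing-prototypes):

```c
/* util.h -- functions shared across files get a prototype here. */
void shared_helper(int x);

/* util.c */
#include "util.h"

/* File-local helper: it has no prototype in any header, so it must be
 * static. A non-static definition without a preceding prototype would
 * trigger -Wmissing-prototypes and flag potential dead code. */
static int local_helper(int x) {
    return x * 2;
}

void shared_helper(int x) {
    (void)local_helper(x);
}
```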
Prior to this commit, compositors needed to render the texture to an
intermediate off-screen buffer using wlr_renderer APIs if they wanted to
use a custom rendering path (e.g. render to a 3D scene).
A new wlr_gles2_texture_get_attribs function exposes the GL texture target
and ID so that compositors can render wlr_textures with their own shaders. An
example of a compositor doing so is available at [1].
[1]: 3db905b784/src/render.c (L227)
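As a rough sketch of how a compositor might use this
(compositor_pick_program is a hypothetical helper, not part of wlroots;
the attribs fields shown are the GL target and texture name):

```c
#include <GLES2/gl2.h>
#include <wlr/render/gles2.h>

/* Hypothetical helper provided by the compositor, not by wlroots. */
static GLuint compositor_pick_program(GLenum target);

/* Sketch: bind a wlr_texture's underlying GL texture for a custom draw. */
static void draw_with_custom_shader(struct wlr_texture *texture) {
    struct wlr_gles2_texture_attribs attribs;
    wlr_gles2_texture_get_attribs(texture, &attribs);

    /* attribs.target is GL_TEXTURE_2D or GL_TEXTURE_EXTERNAL_OES,
     * attribs.tex is the GL texture name. */
    glUseProgram(compositor_pick_program(attribs.target));
    glBindTexture(attribs.target, attribs.tex);
    /* ... upload vertex data and issue the draw call ... */
}
```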
We don't need our own enum for types. Instead we just use
GL_TEXTURE_{2D,EXTERNAL_OES}, which already describes usage.
Also fixes a case where we were using GL_TEXTURE_2D when we should not
have: wl_drm buffers are always GL_TEXTURE_EXTERNAL_OES, regardless of
whether they're RGB or any other format.
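For illustration, a sketch of picking the fragment shader purely from the
GL target (the GLSL sources here are illustrative, not wlroots' actual
shaders):

```c
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>
#include <stddef.h>

/* Sketch: the GL target alone tells us which sampler the fragment shader
 * needs, so no separate texture-type enum is required. */
static const char *frag_src_for_target(GLenum target) {
    switch (target) {
    case GL_TEXTURE_2D: /* shm and other regular textures */
        return
            "precision mediump float;\n"
            "varying vec2 v_texcoord;\n"
            "uniform sampler2D tex;\n"
            "void main() { gl_FragColor = texture2D(tex, v_texcoord); }\n";
    case GL_TEXTURE_EXTERNAL_OES: /* wl_drm / EGLImage textures */
        return
            "#extension GL_OES_EGL_image_external : require\n"
            "precision mediump float;\n"
            "varying vec2 v_texcoord;\n"
            "uniform samplerExternalOES tex;\n"
            "void main() { gl_FragColor = texture2D(tex, v_texcoord); }\n";
    default:
        return NULL;
    }
}
```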
When a texture is destroyed between wlr_egl_make_current and
wlr_egl_swap_buffers, it resets the current EGL surface to NULL. This
makes wlr_egl_swap_buffers fail.
If the EGL context is already current, there's no need to reset it.
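A minimal sketch of the check (the helper name is hypothetical; binding a
context without a surface assumes EGL_KHR_surfaceless_context):

```c
#include <EGL/egl.h>
#include <stdbool.h>

/* Sketch: before touching GL state (e.g. glDeleteTextures in a texture
 * destructor), only rebind the context if it isn't already current.
 * Unconditionally calling eglMakeCurrent(dpy, EGL_NO_SURFACE, ...) here
 * would clobber the surface bound by wlr_egl_make_current and make the
 * later wlr_egl_swap_buffers fail. */
static bool ensure_context_current(EGLDisplay display, EGLContext context) {
    if (eglGetCurrentContext() == context) {
        return true; /* already current: keep the bound surface */
    }
    return eglMakeCurrent(display, EGL_NO_SURFACE, EGL_NO_SURFACE,
        context) == EGL_TRUE;
}
```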
According to the spec:
> If <n_rects> is 0 then <rects> is ignored and the entire
> surface is implicitly damaged and the behaviour is equivalent
> to calling eglSwapBuffers.
So when we want to swap with an empty damage region, we set the damage to a
single empty rectangle instead of passing zero rectangles.
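Roughly, assuming eglSwapBuffersWithDamageEXT (or the KHR variant) has
already been loaded:

```c
#include <EGL/egl.h>
#include <EGL/eglext.h>

/* Sketch: swap with explicit damage. Passing n_rects == 0 would mean
 * "the whole surface is damaged", so an empty damage region is passed
 * as a single zero-area rectangle instead. */
static EGLBoolean swap_with_damage(PFNEGLSWAPBUFFERSWITHDAMAGEEXTPROC swap_fn,
        EGLDisplay display, EGLSurface surface, EGLint *rects, int n_rects) {
    if (n_rects == 0) {
        EGLint empty_rect[4] = {0, 0, 0, 0};
        return swap_fn(display, surface, empty_rect, 1);
    }
    return swap_fn(display, surface, rects, n_rects);
}
```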
The deleted includes are redundant, because other headers will include
the necessary files. Additionally, they cause build failures, because
including EGL/egl.h or EGL/eglext.h directly, instead of through
wlr/render/egl.h or wlr/render/interface.h, will mean that
MESA_EGL_NO_X11_HEADERS will not have been defined, and so the EGL
headers will attempt to pull in unnecessary X11 headers that may not
exist on the system.
For the headers produced by glgen.sh, the includes couldn't simply be
deleted, because no other header would include the EGL headers. Neither
wlr/render/egl.h nor wlr/render/interface.h felt appropriate to include,
so I opted instead to copy the MESA_EGL_NO_X11_HEADERS definition before
the EGL includes.
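Concretely, the copied preamble looks roughly like this:

```c
/* Keep the EGL headers from pulling in X11 headers (Xlib.h, etc.) that
 * may not exist on the system. */
#ifndef MESA_EGL_NO_X11_HEADERS
#define MESA_EGL_NO_X11_HEADERS
#endif
#include <EGL/egl.h>
#include <EGL/eglext.h>
```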
This type adds a container for formats + modifiers.
A list of [format [modifier]] was chosen instead of [format modifier]
pairs because that is how GBM accepts them.
Co-Authored-By: emersion <contact@emersion.fr>
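For illustration, a container in this shape (names are hypothetical, not
the exact wlroots definitions):

```c
#include <stddef.h>
#include <stdint.h>

/* Sketch: one entry per format, each carrying its own modifier list,
 * i.e. [format [modifier]] rather than a flat list of [format modifier]
 * pairs -- matching how GBM consumes them (one format plus an array of
 * acceptable modifiers). */
struct format_modifiers {
    uint32_t format;      /* DRM_FORMAT_* fourcc */
    size_t len;           /* number of modifiers */
    uint64_t *modifiers;  /* DRM_FORMAT_MOD_* values */
};

struct format_set {
    size_t len;
    struct format_modifiers *formats;
};
```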
The read format is dependent on the output, so we first need to make it
current. This fixes a race condition in wlr-screencopy-v1 where a dmabuf
client would cause EGL_NO_SURFACE to be bound at the time when
screencopy needs to query for the preferred format, causing GL errors.
We were assuming GL_BGRA_EXT was always supported.
We now check that it's supported for rendering. We fail if it isn't, because
this format is specified as "always supported" by the Wayland protocol.
We also check if it's supported for reading pixels. A new preferred_read_format
function returns the preferred format that can be used to read pixels. This is
used by the screencopy protocol.
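A sketch of what such a query can look like; the output's EGL surface must
already be current, and the function here is illustrative rather than the
exact wlroots implementation:

```c
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>
#include <wayland-server-protocol.h>

/* Sketch: ask GL which format glReadPixels prefers for the currently
 * bound surface. The answer depends on that surface, so the output must
 * be made current first. */
static enum wl_shm_format preferred_read_format(void) {
    GLint gl_format = 0, gl_type = 0;
    glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_FORMAT, &gl_format);
    glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_TYPE, &gl_type);

    if (gl_format == GL_BGRA_EXT && gl_type == GL_UNSIGNED_BYTE) {
        return WL_SHM_FORMAT_ARGB8888;
    }
    /* Fall back to the combination GLES2 always supports for glReadPixels:
     * GL_RGBA + GL_UNSIGNED_BYTE. */
    return WL_SHM_FORMAT_ABGR8888;
}
```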
This removes any assumptions about how the libdrm headers are installed,
and uses the pkg-config include directories as we're "supposed to".
This only adds a partial dependency, since we don't actually need to
link against libdrm.
This PR broke a private nixpkgs definition I have for wlroots: https://github.com/swaywm/wlroots/pull/1304
It is fixed by changing `#include <drm_fourcc.h>` to `#include <libdrm/drm_fourcc.h>`, which follows what is already done in the dmabuf example.
If a client uses an older version of the dmabuf protocol, use the
`formats` event instead of `modifiers` (since that didn't exist in older
versions).
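A hedged sketch of the per-resource version check; the generated names
assume standard wayland-scanner output for linux-dmabuf-unstable-v1, where
the modifier event appeared in version 3:

```c
#include <wayland-server-core.h>
#include "linux-dmabuf-unstable-v1-protocol.h" /* wayland-scanner output */

/* Sketch: advertise one format/modifier pair to a single client resource.
 * Clients on protocol version < 3 only understand the format event, so
 * fall back to it instead of the modifier event. */
static void advertise_format(struct wl_resource *resource, uint32_t format,
        uint64_t modifier) {
    int version = wl_resource_get_version(resource);
    if (version >= ZWP_LINUX_DMABUF_V1_MODIFIER_SINCE_VERSION) {
        zwp_linux_dmabuf_v1_send_modifier(resource, format,
            (uint32_t)(modifier >> 32), (uint32_t)(modifier & 0xFFFFFFFF));
    } else {
        zwp_linux_dmabuf_v1_send_format(resource, format);
    }
}
```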
With a bit of necessary guessing, support dmabuf importing even when
EGL_EXT_image_dma_buf_import_modifiers isn't present instead of
failing up front.
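Roughly, the plane-0 attributes for eglCreateImageKHR might be built like
this when the extension is missing (helper name and layout are
illustrative):

```c
#include <drm_fourcc.h> /* via libdrm's pkg-config include path */
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Sketch: build EGLImage attribs for plane 0 of a dmabuf. Without
 * EGL_EXT_image_dma_buf_import_modifiers we cannot pass an explicit
 * modifier, so only implicit/linear modifiers are accepted and the
 * modifier attributes are omitted, leaving the driver to guess. */
static bool build_plane0_attribs(EGLint *attribs, size_t *i,
        int fd, uint32_t offset, uint32_t stride, uint64_t modifier,
        bool has_modifiers_ext) {
    if (!has_modifiers_ext && modifier != DRM_FORMAT_MOD_INVALID &&
            modifier != DRM_FORMAT_MOD_LINEAR) {
        return false; /* can't express this modifier, refuse the import */
    }

    attribs[(*i)++] = EGL_DMA_BUF_PLANE0_FD_EXT;
    attribs[(*i)++] = fd;
    attribs[(*i)++] = EGL_DMA_BUF_PLANE0_OFFSET_EXT;
    attribs[(*i)++] = (EGLint)offset;
    attribs[(*i)++] = EGL_DMA_BUF_PLANE0_PITCH_EXT;
    attribs[(*i)++] = (EGLint)stride;
    if (has_modifiers_ext) {
        attribs[(*i)++] = EGL_DMA_BUF_PLANE0_MODIFIER_LO_EXT;
        attribs[(*i)++] = (EGLint)(modifier & 0xFFFFFFFF);
        attribs[(*i)++] = EGL_DMA_BUF_PLANE0_MODIFIER_HI_EXT;
        attribs[(*i)++] = (EGLint)(modifier >> 32);
    }
    return true;
}
```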