The monitor texture is the final background image. It doesn't need an
alpha channel. Cross-fades (which are performed by rendering *into* the
monitor texture) still work just fine.
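A minimal sketch of what this amounts to in Cogl (illustrative, not the
actual mutter code; assumes a CoglContext is at hand):

```c
#include <cogl/cogl.h>

/* Illustrative sketch: allocate the monitor texture with RGB-only
 * components; nothing is ever composited underneath it, so no alpha
 * channel is needed. */
static CoglTexture *
create_monitor_texture (CoglContext *ctx,
                        int          width,
                        int          height)
{
  CoglTexture2D *texture = cogl_texture_2d_new_with_size (ctx, width, height);

  cogl_texture_set_components (COGL_TEXTURE (texture),
                               COGL_TEXTURE_COMPONENTS_RGB);

  return COGL_TEXTURE (texture);
}
```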
Part-of: <https://gitlab.gnome.org/GNOME/mutter/-/merge_requests/1665>
A first step towards abandoning the CoglObject type system: convert
CoglFramebuffer, CoglOffscreen and CoglOnscreen into GObjects.
CoglFramebuffer is turned into an abstract GObject, while the two others
are currently final. The "winsys" and "platform" pointers are still
plain 'void *' in the non-abstract type instances, however.
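In GObject terms the split looks roughly like this (simplified from the
real headers):

```c
#include <glib-object.h>

/* CoglFramebuffer becomes an abstract, derivable GObject (defined with
 * G_DEFINE_ABSTRACT_TYPE in the .c file)... */
#define COGL_TYPE_FRAMEBUFFER (cogl_framebuffer_get_type ())
G_DECLARE_DERIVABLE_TYPE (CoglFramebuffer, cogl_framebuffer,
                          COGL, FRAMEBUFFER, GObject)

struct _CoglFramebufferClass
{
  GObjectClass parent_class;

  /* vfuncs for subclasses would go here */
};

/* ...while CoglOffscreen and CoglOnscreen stay final for now. */
#define COGL_TYPE_OFFSCREEN (cogl_offscreen_get_type ())
G_DECLARE_FINAL_TYPE (CoglOffscreen, cogl_offscreen,
                      COGL, OFFSCREEN, CoglFramebuffer)
```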
https://gitlab.gnome.org/GNOME/mutter/-/merge_requests/1496
When the wallpaper image is larger than the monitor resolution we already
use mipmapping to scale it down smoothly in hardware. We use
`GL_TEXTURE_MIN_FILTER` = `GL_LINEAR_MIPMAP_LINEAR` for the highest quality
scaling that GL can do. However, that option is designed for 3D use cases
where the mipmap level changes over time or space.
Since our wallpaper doesn't change its distance from us, we can improve
rendering quality even beyond `GL_LINEAR_MIPMAP_LINEAR`. To do this we
now set `GL_TEXTURE_MAX_LEVEL` (if available) to clamp the mipmap level,
and hence the blurriness, to the lowest resolution (highest level) that
is still equal to or higher than the monitor itself. This way we get the benefits
of mipmapping (downscaling in hardware) *and* retain the maximum possible
sharpness for the monitor resolution -- something that
`GL_LINEAR_MIPMAP_LINEAR` alone doesn't do.
Example:
Monitor is 1920x1080
Wallpaper photo is 4000x3000
Mipmaps stored on the GPU are 4000x3000, 2000x1500, 1000x750, ...
Before: You would see an average of the 2000x1500 and 1000x750 images.
After: You will now only see the 2000x1500 image, linearly sampled.
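A sketch of the level clamp, in raw GL for clarity (the helper name is
hypothetical):

```c
#include <GL/gl.h>

/* Hypothetical sketch: find the last mipmap level that is still at
 * least as large as the monitor, then clamp sampling to it. */
static void
clamp_mipmap_level (GLuint texture,
                    int    tex_width,
                    int    tex_height,
                    int    monitor_width,
                    int    monitor_height)
{
  int level = 0;

  while ((tex_width >> (level + 1)) >= monitor_width &&
         (tex_height >> (level + 1)) >= monitor_height)
    level++;

  /* For a 4000x3000 photo on a 1920x1080 monitor this yields level 1
   * (2000x1500): still >= the monitor, but no blurrier than needed. */
  glBindTexture (GL_TEXTURE_2D, texture);
  glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, level);
}
```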
https://gitlab.gnome.org/GNOME/mutter/merge_requests/1003
Scaling the `monitor_area` before texture creation was just wasting
megabytes of memory on resolution that the monitor can't display. This
was also hurting runtime performance.
Example:
Monitor is natively 1920x1080 and scale set to 3.
Before: The monitor texture allocated was 5760x3240x4 = 74.6 MB
After: The monitor texture allocated is 1920x1080x4 = 8.3 MB
Closes: https://gitlab.gnome.org/GNOME/gnome-shell/issues/2118
https://gitlab.gnome.org/GNOME/mutter/merge_requests/1004
Commit 8e9184b6 added filtering to avoid image jaggies when downscaling
but used `LINEAR_MIPMAP_NEAREST`. In some situations this could lead to
GL choosing a single lower-resolution mipmap and then upscaling it,
producing slightly blurry results.
We don't want to revert that change since it avoids aliasing jaggies, so
let's use `LINEAR_MIPMAP_LINEAR` instead. This provides the highest quality
filtering that GL can do and avoids the situation of GL using a single
mipmap that's lower resolution than the screen. Now it will blend that one
with the next mipmap which is higher resolution than the screen. This still
avoids jaggies but also maintains 1px resolution.
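In Cogl terms the change boils down to selecting the trilinear filter on
the texture layer (sketch, not the exact call site):

```c
#include <cogl/cogl.h>

/* Sketch: trilinear minification blends the two nearest mipmap levels,
 * avoiding jaggies without collapsing onto a single too-blurry level. */
static void
use_trilinear_filtering (CoglPipeline *pipeline)
{
  cogl_pipeline_set_layer_filters (pipeline, 0 /* layer */,
                                   COGL_PIPELINE_FILTER_LINEAR_MIPMAP_LINEAR,
                                   COGL_PIPELINE_FILTER_LINEAR);
}
```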
Closes: https://gitlab.gnome.org/GNOME/gnome-shell/issues/1105
https://gitlab.gnome.org/GNOME/mutter/merge_requests/505
We need to use the pixel size of the monitor in order to generate a valid
texture at full quality for the current monitor.
In the spanned case the background should cover all the differently scaled
monitors, so we scale the texture up to the maximum scaling level and then
resample it, drawing each part only in the monitor it should occupy, using
the proper scaling level.
In wallpaper mode (or color mode), for example, we don't need to scale the
area; the texture size we return should also be unscaled, so as not to
confuse MetaBackgroundActor into using more space than needed.
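A rough sketch of the spanned case (monitor_get_scale() is a
hypothetical stand-in for whatever the real code queries):

```c
#include <glib.h>

float monitor_get_scale (gpointer monitor); /* hypothetical */

/* Hypothetical sketch: the spanned texture is produced at the maximum
 * scale among all monitors; each monitor then resamples its own part. */
static float
get_max_monitor_scale (GList *monitors)
{
  float max_scale = 1.0f;
  GList *l;

  for (l = monitors; l; l = l->next)
    max_scale = MAX (max_scale, monitor_get_scale (l->data));

  return max_scale;
}
```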
https://bugzilla.gnome.org/show_bug.cgi?id=765011
Commit 25f416c13d added additional compilation warnings, including
-Werror=return-type. There are several places where this results
in build failures if `g_assert_not_reached()` is disabled at compile
time and the compiler flags a missing return value.
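The typical fix is a fallback return after the assertion, e.g.
(illustrative types):

```c
#include <glib.h>

typedef enum { SHAPE_SQUARE, SHAPE_CIRCLE } Shape; /* illustrative */

static int
shape_corner_count (Shape shape)
{
  switch (shape)
    {
    case SHAPE_SQUARE:
      return 4;
    case SHAPE_CIRCLE:
      return 0;
    }

  g_assert_not_reached ();

  /* Only reached when assertions are compiled out; this keeps
   * -Werror=return-type happy. */
  return 0;
}
```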
https://gitlab.gnome.org/GNOME/mutter/issues/447
The order and structure of include directives was chaotic, with no
real common thread between files. Try to tidy up the mess with a
common scheme.
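The resulting scheme looks roughly like this (file names illustrative):
config.h first, then the unit's own header, then system headers, then
other project headers.

```c
#include "config.h"

#include "backends/meta-example-private.h" /* the unit's own header (illustrative) */

#include <glib-object.h>
#include <stdint.h>

#include "core/display-private.h"
#include "meta/util.h"
```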
Split X11 specific parts into MetaX11Display. This also required
changing MetaScreen to stop listening to any signals itself, instead
relying on MetaDisplay forwarding them; this ensures the ordering.
MetaDisplay listens to both the internal and external
monitors-changed signals so that it can pass the external one on via
the redundant MetaDisplay (previously MetaScreen)::monitors-changed.
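Conceptually, the forwarding is plain GObject plumbing (simplified
sketch; names approximate):

```c
#include <glib-object.h>

/* Simplified sketch: MetaDisplay handles the monitor manager's signal
 * itself and re-emits it, so internal state is updated before any
 * external listener runs. */
static void
on_monitors_changed (GObject  *monitor_manager,
                     gpointer  user_data)
{
  GObject *display = user_data;

  /* ... refresh the display's own state first ... */

  g_signal_emit_by_name (display, "monitors-changed");
}
```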
https://bugzilla.gnome.org/show_bug.cgi?id=759538
This is not a leak per se, but it is too easy to get a SIGSEGV under
valgrind, due to MetaBackground disconnecting signals from an already
destroyed MetaScreen when gnome-shell is sent SIGTERM. Keeping a
reference fixes this.
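In other words, the object keeps its signal source alive until it has
disconnected (illustrative sketch, not the actual MetaBackground code):

```c
#include <glib-object.h>

typedef struct
{
  GObject *screen; /* illustrative stand-in for MetaScreen */
} Background;

static void
background_set_screen (Background *self,
                       GObject    *screen)
{
  /* Hold a strong reference so the screen outlives our handlers. */
  self->screen = g_object_ref (screen);
}

static void
background_dispose (Background *self)
{
  if (self->screen)
    {
      /* Safe: the reference guarantees the screen is still alive. */
      g_signal_handlers_disconnect_by_data (self->screen, self);
      g_clear_object (&self->screen);
    }
}
```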
https://bugzilla.gnome.org/show_bug.cgi?id=789984
Use the correct pointer types for cogl objects. This avoids warnings
when including the cogl headers doesn't result in all the cogl types
being typedefs to void.
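For example (illustrative):

```c
#include <cogl/cogl.h>

static CoglTexture *
create_texture (CoglContext *ctx)
{
  /* Use the concrete CoglTexture2D type for the constructor's return
   * value; assigning it straight to CoglTexture * only compiled while
   * the cogl typedefs were plain void. */
  CoglTexture2D *texture_2d = cogl_texture_2d_new_with_size (ctx, 256, 256);

  return COGL_TEXTURE (texture_2d);
}
```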
https://bugzilla.gnome.org/show_bug.cgi?id=768976
While CoglError is a define to GError, it doesn't follow the convention
of ignoring errors when NULL is passed, but rather treats the error as
fatal :-(
That's clearly unwanted for a compositor, so make sure to always pass
an error parameter where a runtime error is possible (i.e. any CoglError
that is not a malformed blend string).
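For instance, instead of passing NULL (sketch):

```c
#include <cogl/cogl.h>

/* Sketch: always hand Cogl an error pointer where runtime failure is
 * possible, since passing NULL makes the error fatal. */
static gboolean
try_allocate (CoglFramebuffer *framebuffer)
{
  CoglError *error = NULL;

  if (!cogl_framebuffer_allocate (framebuffer, &error))
    {
      g_warning ("Failed to allocate framebuffer: %s", error->message);
      cogl_error_free (error);
      return FALSE;
    }

  return TRUE;
}
```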
https://bugzilla.gnome.org/show_bug.cgi?id=765058
Some backgrounds don't fully fill the screen. For those backgrounds
it's important to paint a color behind them to fill in the gaps.
This commit checks whether or not the background image textures take
up the entire monitor and, in the event they don't, draws a color
behind them (just as it would if the background were translucent).
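The coverage test is conceptually just a rectangle containment check
(hypothetical types and names):

```c
#include <glib.h>

typedef struct { int x, y, width, height; } Rect; /* illustrative */

/* Hypothetical sketch: TRUE when the image texture fully covers the
 * monitor, i.e. no color needs to be painted behind it. */
static gboolean
texture_covers_monitor (Rect texture_area,
                        Rect monitor_area)
{
  return texture_area.x <= monitor_area.x &&
         texture_area.y <= monitor_area.y &&
         texture_area.x + texture_area.width >=
           monitor_area.x + monitor_area.width &&
         texture_area.y + texture_area.height >=
           monitor_area.y + monitor_area.height;
}
```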
https://bugzilla.gnome.org/show_bug.cgi?id=754476
meta_background_get_texture only draws the bottom image texture if
1) the blend factor leaves the top image translucent, or
2) the top image is translucent from alpha
The latter case doesn't actually matter since we're using REPLACE
on the top image texture.
This commit drops the unnecessary check for the second case and
applies De Morgan's law to the conditional for clarity.
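The shape of the simplification (illustrative only, not the real
expressions from meta-background.c):

```c
#include <stdbool.h>

static bool
should_draw_bottom_texture (double blend_factor,
                            bool   top_has_alpha)
{
  /* Before: !(blend_factor >= 1.0 && !top_has_alpha)
   * By De Morgan's law: blend_factor < 1.0 || top_has_alpha
   * With REPLACE on the top layer, top_has_alpha is irrelevant: */
  (void) top_has_alpha;

  return blend_factor < 1.0;
}
```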
https://bugzilla.gnome.org/show_bug.cgi?id=754476
The old requirement that multiple MetaBackgroundActor objects be
layered on top of each other to produce blended backgrounds resulted in
extremely inefficient drawing since the entire framebuffer had
to be read and written multiple times.
* Replace the MetaBackground ClutterContent with a plain GObject
that serves to hold the background parameters and prerender
textures to be used to draw the background. It handles
colors, gradients, and blended images, but does not handle
vignetting.
* Add vignetting to MetaBackgroundActor directly.
* Add MetaBackgroundImage and MetaBackgroundImageCache to allow
multiple MetaBackground objects to share the same images
By removing the usage of ClutterContent, the following optimizations
were easy to add:
* Blending is turned off when the actor is fully opaque
* Nearest-neighbour filtering is used when drawing 1:1
The GLSL vignette code is slightly improved to use a vertex shader
snippet for computing the texture coordinate => position-in-actor
mapping.
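The vertex-shader part looks something like this (simplified; uniform
and varying names are illustrative):

```c
#include <cogl/cogl.h>

/* Sketch: a vertex snippet computes the actor-space position from the
 * texture coordinate once per vertex; the GPU interpolates `position`
 * for the fragment stage, where the vignette falloff is evaluated. */
static void
add_vignette_snippet (CoglPipeline *pipeline)
{
  CoglSnippet *snippet =
    cogl_snippet_new (COGL_SNIPPET_HOOK_VERTEX,
                      "uniform vec2 scale;\n"
                      "uniform vec2 offset;\n"
                      "varying vec2 position;\n",
                      "position = cogl_tex_coord0_in.xy * scale + offset;\n");

  cogl_pipeline_add_snippet (pipeline, snippet);
  cogl_object_unref (snippet);
}
```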
https://bugzilla.gnome.org/show_bug.cgi?id=735637
It's a deprecated API that can surprise us. Namely, when the internal
format passed is COGL_PIXEL_FORMAT_ANY, it will *always* allocate an
RGBA8888 pixel format texture, even if we only passed it a RGB format
or even an A8 format.
cogl_texture_2d_new_from_data is the newer, better API and doesn't have
these warts.
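Usage then looks like this (sketch; signature as in recent Cogl, where
the separate internal-format argument is gone):

```c
#include <cogl/cogl.h>
#include <stdint.h>

/* Sketch: the pixel format we pass is the one we get; an RGB source
 * stays RGB on the GPU instead of being widened to RGBA8888. */
static CoglTexture2D *
upload_rgb_pixels (CoglContext   *ctx,
                   int            width,
                   int            height,
                   const uint8_t *pixels)
{
  CoglError *error = NULL;
  CoglTexture2D *texture =
    cogl_texture_2d_new_from_data (ctx, width, height,
                                   COGL_PIXEL_FORMAT_RGB_888,
                                   width * 3, /* rowstride */
                                   pixels, &error);

  if (!texture)
    {
      g_warning ("Texture upload failed: %s", error->message);
      cogl_error_free (error);
    }

  return texture;
}
```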
Switching meta/util.h to gi18n.h was wrong: mutter is a library
and needs gi18n-lib.h, but that cannot be included from a public
header (since it depends on config.h or command-line options),
so split util.h into a public and a private part.
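The split looks roughly like this (file layout illustrative):

```c
/* meta/util.h (public): no i18n macros, safe for library users. */

/* util-private.h (illustrative): only included from within mutter,
 * where config.h defines GETTEXT_PACKAGE for gi18n-lib.h. */
#include "config.h"
#include <glib/gi18n-lib.h>

#include "meta/util.h"
```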
https://bugzilla.gnome.org/show_bug.cgi?id=707897