It seems that particles (and some other features) do not work correctly on iOS in GLES2, because either many of the devices do not support half-float compression, or the GL constant Godot uses to reference it is incorrect.
This PR adds a project setting under rendering/gles2/ to disable half-float compression on iOS.
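A minimal sketch of how such a setting could gate the feature, assuming a setting path and config field along these lines (both names are illustrative, not the exact ones used by the engine):

    #include "core/project_settings.h"

    bool disable_half_float = GLOBAL_GET("rendering/gles2/compatibility/disable_half_float");

    #ifdef IPHONE_ENABLED
    if (disable_half_float) {
        // Fall back to full 32-bit floats for vertex data on iOS devices whose
        // GLES2 drivers mishandle half-float attributes.
        config.support_half_float_vertices = false;
    }
    #endif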
Adding the ability to access MODULATE in the shader breaks when final_modulate is baked into vertex colors (a technique used to batch together differently colored items). This PR prevents baking vertex colors when MODULATE is detected in the shader.
It also prevents baking when COLOR is read in a canvas shader, which would otherwise produce the wrong result in the shader when colors are baked. It does not prevent baking if COLOR is only written (as in most shaders), which operates correctly with baking left enabled.
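A minimal sketch of the baking decision, with the flag names as assumptions rather than the exact engine fields:

    // Vertex colors may only be baked when the attached shader neither uses
    // MODULATE nor reads COLOR.
    bool can_bake_vertex_colors(bool shader_uses_modulate, bool shader_reads_color) {
        if (shader_uses_modulate) {
            return false; // MODULATE must stay separate, not folded into vertex colors.
        }
        if (shader_reads_color) {
            return false; // Baking would change the value the shader reads from COLOR.
        }
        return true; // Writing COLOR only is safe; baked colors are simply overwritten.
    }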
This adds two new values (items and draw calls) to the performance monitor, in a new '2d' section, rather than reusing the 3d values in the 'raster' section.
This makes it far easier to optimize games to minimize draw calls.
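For example, the new monitors could be read each frame like this (the enum names are assumptions based on the existing monitor naming scheme):

    #include "core/print_string.h"
    #include "main/performance.h"

    void print_2d_stats() {
        Performance *perf = Performance::get_singleton();
        int items = (int)perf->get_monitor(Performance::RENDER_2D_ITEMS_IN_FRAME);
        int draw_calls = (int)perf->get_monitor(Performance::RENDER_2D_DRAW_CALLS_IN_FRAME);
        print_line("2D items: " + itos(items) + ", draw calls: " + itos(draw_calls));
    }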
Happy new year to the wonderful Godot community!
We're starting a new decade with a well-established, non-profit, free
and open source game engine, and tons of further improvements in the
pipeline from hundreds of contributors.
Godot will keep getting better, and we're looking forward to all the
games that the community will keep developing and releasing with it.
This changes the code path so that `glRenderBufferStorage*` always uses
values appropriate for renderbuffers and `glTexImage2D` never uses an
internalformat meant for buffers.
Fixes #33825.
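A minimal sketch of the split code path, with illustrative format constants (not necessarily the ones the engine picks):

    if (use_renderbuffer) {
        // Renderbuffer path: sized internal formats such as GL_DEPTH_COMPONENT16.
        glBindRenderbuffer(GL_RENDERBUFFER, depth_rbo);
        glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, width, height);
    } else {
        // Texture path: unsized format/type pairs accepted by glTexImage2D in GLES2.
        glBindTexture(GL_TEXTURE_2D, depth_tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, width, height, 0,
                GL_DEPTH_COMPONENT, GL_UNSIGNED_SHORT, NULL);
    }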
When rendering to an external texture with MSAA active (as happens in
the Oculus Mobile ARVR plugin), no MSAA was actually applied, because
the correct depth buffer and multisample texture target were not used.
This also fixes https://github.com/GodotVR/godot_oculus_mobile/issues/54
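For reference, a standard multisample-then-resolve pattern under GLES3 looks roughly like this (variable names are assumptions; the actual plugin path may rely on a different extension):

    // Attach a multisampled depth renderbuffer to the offscreen FBO.
    glBindFramebuffer(GL_FRAMEBUFFER, multisample_fbo);
    glBindRenderbuffer(GL_RENDERBUFFER, depth_rb);
    glRenderbufferStorageMultisample(GL_RENDERBUFFER, msaa_samples, GL_DEPTH_COMPONENT24, width, height);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depth_rb);

    // ... render the scene into multisample_fbo ...

    // Resolve the multisampled result into the FBO wrapping the external texture.
    glBindFramebuffer(GL_READ_FRAMEBUFFER, multisample_fbo);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, external_fbo);
    glBlitFramebuffer(0, 0, width, height, 0, 0, width, height, GL_COLOR_BUFFER_BIT, GL_NEAREST);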
While OpenGL ES 3.0 and WebGL 2.0 both support non power-of-2 (NPOT)
textures in their specification, the situation seems to be less clear
about *compressed* NPOT textures using repeat or mipmap flags.
At least Chrome on Linux doesn't seem to support this combination,
and a variety of mobile hardware has similar limitations.
As a workaround, we force decompressing such textures when running on
WebGL 2.0, at the cost of loading time and memory usage.
Fixes #33058.
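A minimal sketch of the workaround, assuming helper and flag names along these lines:

    #include "core/image.h"
    #include "scene/resources/texture.h"

    static void decompress_npot_if_needed(Ref<Image> &image, uint32_t flags, bool is_webgl2) {
        bool npot = !is_power_of_2(image->get_width()) || !is_power_of_2(image->get_height());
        bool repeat_or_mipmaps = flags & (Texture::FLAG_REPEAT | Texture::FLAG_MIPMAPS);

        if (is_webgl2 && image->is_compressed() && npot && repeat_or_mipmaps) {
            image->decompress(); // trades load time and memory for correct sampling
        }
    }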
Although the backup USE_SKELETON_SOFTWARE skinning path is currently used when float textures are not supported, the default skinning path still fails when float textures are supported but GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS is 0, i.e. the device cannot read from a texture in the vertex shader. This PR adds logic to activate the SKELETON_SOFTWARE path if either of these cases occurs, preventing crashes on devices with this combination of features.
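A minimal sketch of the combined capability check, with the config field names as assumptions:

    GLint max_vertex_texture_units = 0;
    glGetIntegerv(GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS, &max_vertex_texture_units);

    // Software skinning is required when float textures are unavailable OR the
    // vertex shader cannot sample textures at all.
    config.use_skeleton_software = !config.float_texture_supported || max_vertex_texture_units == 0;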
Condensed some if and ERR statements. Added dots to the end of error messages.
Couldn't figure out EXPLAINC. These files gave me trouble:
core/error_macros.h,
core/io/file_access_buffered_fa.h (where is it?),
core/os/memory.cpp,
drivers/png/png_driver_common.cpp,
drivers/xaudio2/audio_driver_xaudio2.cpp (where is it?)
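An illustrative example of the condensing pattern (the error text is made up):

    // Before:
    //   if (!f) {
    //       ERR_EXPLAIN("Cannot open file.");
    //       ERR_FAIL_V(ERR_CANT_OPEN);
    //   }

    // After:
    ERR_FAIL_COND_V_MSG(!f, ERR_CANT_OPEN, "Cannot open file.");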
For clarity, the assign-to-release idiom for PoolVector::Read/Write has been
replaced with a function call.
Existing uses have been replaced (or removed where the release is already
handled by scope).
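An illustrative sketch of the idiom change (the buffer name is made up):

    PoolVector<uint8_t>::Read r = buffer.read();
    const uint8_t *ptr = r.ptr();
    // ... use ptr ...

    // Old: release the lock by assigning a default-constructed accessor.
    // r = PoolVector<uint8_t>::Read();

    // New: release it explicitly.
    r.release();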
Non-tools OpenGLES2 devices that use the USE_SKELETON_SOFTWARE path (i.e. do not support float textures) depend on surface->data being set and containing the bone IDs and weights (rasterizer_scene_gles2.cpp, line 1456, RasterizerSceneGLES2::_setup_geometry). However, if TOOLS_ENABLED is not defined, surface->data is currently not stored in main memory in rasterizer_storage_gles2.cpp. This causes a crash in rasterizer_scene_gles2.cpp when a rigged object comes into view.
This fix addresses the specific case of skinned objects when USE_SKELETON_SOFTWARE is active, and stores a copy of the bone data, as is done when TOOLS_ENABLED is defined. This fixes the crash by enabling the same mechanism as on desktop, without the memory overhead of storing all vertex data where it is not required.
Fixes #28298
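A minimal sketch of the fix, assuming the surrounding code in rasterizer_storage_gles2.cpp exposes names along these lines:

    #ifndef TOOLS_ENABLED
    // Keep a main-memory copy of the arrays whenever software skinning may
    // need to read bone IDs and weights from them.
    if (config.use_skeleton_software && (format & VS::ARRAY_FORMAT_BONES)) {
        surface->data = array;
        surface->index_data = index_array;
    }
    #endif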
This is a new singleton with which camera sources, such as webcams or the cameras on a mobile phone, can register themselves.
Other parts of Godot can interact with it to obtain images from the camera as textures.
This work includes additions to the Visual Server to use this functionality to present the camera image in the background. This is specifically targeted at AR applications.
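As a usage example (not part of this PR), the registered feeds could be enumerated like this:

    #include "core/print_string.h"
    #include "servers/camera_server.h"

    void list_camera_feeds() {
        CameraServer *server = CameraServer::get_singleton();
        for (int i = 0; i < server->get_feed_count(); i++) {
            Ref<CameraFeed> feed = server->get_feed(i);
            print_line("Camera feed: " + feed->get_name());
        }
    }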
If a non-imported texture resource file (e.g. DDS) gets updated, the editor
doesn't reload it. The cause of the problem is twofold:
First, the code of ImageTexture assumes that textures are always imported
from an image, but that's not the case for e.g. DDS. This change thus adds
code to issue a resource reload in case an image reload is not possible
(which is the case for non-imported texture resources).
Second, the code is filled with bogus calls to Image::get_image_data_size()
to determine the mipmap offset when that should be done using
Image::get_image_mipmap_offset(). Previous code literally passed the integer
mip level value to Image::get_image_data_size() where that actually expects
a boolean. Thus this part of the change might actually solve some other
issues as well.
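An illustrative sketch of the offset fix using Image's static helpers (the surrounding variables are assumptions):

    // Wrong: the last argument of get_image_data_size() is a "has mipmaps"
    // boolean, not a mip level.
    // int ofs = Image::get_image_data_size(width, height, format, mip_level);

    // Right: ask for the byte offset of that mip level directly.
    int ofs = Image::get_image_mipmap_offset(width, height, format, mip_level);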
To be pedantic, the texture_get_data() function of the rasterizer drivers is
still quite a mess, as it only ever returns the whole mipchain when
GLES_OVER_GL is set (practically only on desktop builds) but this change does
not attempt to resolve that.
The bake mode property of lights previously didn't affect GI probes.
This change makes the GI probe ignore lights that have their bake mode
set to disabled.
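A minimal sketch of the new check inside the probe's light pass, with member names as assumptions:

    for (int i = 0; i < light_count; i++) {
        LightInstance *light = lights[i];
        if (light->bake_mode == VS::LIGHT_BAKE_DISABLED) {
            continue; // lights excluded from baking no longer contribute to the probe
        }
        // ... inject this light into the GI probe's dynamic data ...
    }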