Merge pull request #39437 from akien-mga/3.2-cherrypicks

Cherry-picks for the 3.2 branch (future 3.2.2) - 8th batch

commit 3fb9c776ff
@@ -339,6 +339,9 @@ platform/windows/godot_res.res
 # Visual Studio Code workspace file
 *.code-workspace

+# Scons construction environment dump
+.scons_env.json
+
 # Scons progress indicator
 .scons_node_count

@@ -114,9 +114,6 @@ matrix:

 before_install:
   - eval "${MATRIX_EVAL}"
-  - if [ "$STATIC_CHECKS" = "yes" ]; then
-      unset SCONS_CACHE;
-    fi

 install:
   - if [ "$TRAVIS_OS_NAME" = "linux" ]; then
@@ -130,6 +127,10 @@ install:
       java -version;
       misc/travis/android-tools-linux.sh;
     fi
+  - if [ "$STATIC_CHECKS" = "yes" ]; then
+      unset SCONS_CACHE;
+      pip3 install --user black pygments;
+    fi

 before_script:
   - if [ "$PLATFORM" = "android" ]; then
@@ -140,6 +141,7 @@ before_script:
 script:
   - if [ "$STATIC_CHECKS" = "yes" ]; then
       sh ./misc/travis/clang-format.sh &&
+      sh ./misc/travis/black-format.sh &&
       doc/tools/makerst.py --dry-run doc/classes modules;
     else
       scons -j2 CC=$CC CXX=$CXX platform=$PLATFORM tools=$TOOLS target=$TARGET $OPTIONS $EXTRA_ARGS &&
163 CONTRIBUTING.md
@@ -1,70 +1,106 @@
 # How to contribute efficiently

-Sections covered in this file:
+## Table of contents

-* [Reporting bugs or proposing features](#reporting-bugs-or-proposing-features)
-* [Contributing pull requests](#contributing-pull-requests)
-* [Contributing to Godot's translation](#contributing-to-godots-translation)
-* [Communicating with developers](#communicating-with-developers)
+- [Reporting bugs](#reporting-bugs)
+- [Proposing features or improvements](#proposing-features-or-improvements)
+- [Contributing pull requests](#contributing-pull-requests)
+- [Contributing to Godot's translation](#contributing-to-godots-translation)
+- [Communicating with developers](#communicating-with-developers)

 **Please read the first section before reporting a bug!**

-## Reporting bugs or proposing features
+## Reporting bugs

 The golden rule is to **always open *one* issue for *one* bug**. If you notice
 several bugs and want to report them, make sure to create one new issue for
 each of them.

-Everything referred to hereafter as "bug" also applies for feature requests.
-If you are reporting a new issue, you will make our life much simpler (and the
-fix come much sooner) by following these guidelines:
+If you're reporting a new bug, you'll make our life simpler (and the
+fix will come sooner) by following these guidelines:

-#### Search first in the existing database
+### Search first in the existing database

-Issues are often reported several times by various users. It's a good practice
-to **search first** in the issues database before reporting your issue. If you
-don't find a relevant match or if you are unsure, don't hesitate to **open a
-new issue**. The bugsquad will handle it from there if it's a duplicate.
+Issues are often reported several times by various users. It's good practice to
+**search first in the [issue tracker](https://github.com/godotengine/godot/issues)
+before reporting your issue**. If you don't find a relevant match or if you're
+unsure, don't hesitate to **open a new issue**. The bugsquad will handle it
+from there if it's a duplicate.

-#### Specify the platform
+### Specify the platform

 Godot runs on a large variety of platforms and operating systems and devices.
-If you believe your issue is device/platform dependent (for example if it is
-related to the rendering, crashes or compilation errors), please specify:
-* Operating system
-* Device (including architecture, e.g. x86, x86_64, arm, etc.)
-* GPU model (and driver in use if you know it)
+**In your bug reports, please always specify:**
+
+- Operating system and version (e.g. Windows 10, macOS 10.15, Ubuntu 19.10)
+- Godot version (e.g. 3.2, 3.1.2, or the Git commit hash if you're using a development branch)
+
+For bugs that are likely OS-specific and/or graphics-related, please also specify:
+
+- Device (CPU model including architecture, e.g. x86, x86_64, ARM, etc.)
+- GPU model (and the driver version in use if you know it)
+
+**Bug reports not including the required information may be closed at the
+maintainers' discretion.** If in doubt, always include all the requested
+information; it's better to include too much information than not enough
+information.

-#### Specify steps to reproduce
+### Specify steps to reproduce

 Many bugs can't be reproduced unless specific steps are taken. Please **specify
 the exact steps** that must be taken to reproduce the condition, and try to
-keep them as minimal as possible.
+keep them as minimal as possible. If you're describing a procedure to follow
+in the editor, don't hesitate to include screenshots.
+
+Making your bug report easy to reproduce will make it easier for contributors
+to fix the bug.

-#### Provide a simple, example project
+### Provide a simple, example project

-Sometimes an unexpected behavior happens in your project. In such case,
+Sometimes, unexpected behavior can happen in your project. In such case,
 understand that:

-* What happens to you may not happen to other users.
-* We can't take the time to look at your project, understand how it is set up
-and then figure out why it's failing.
+- What happens to you may not happen to other users.
+- We can't take the time to look at your project, understand how it is set up
+  and then figure out why it's failing.

-To speed up our work, please prepare for us **a simple project** that isolates
+To speed up our work, **please upload a minimal project** that isolates
 and reproduces the issue. This is always the **best way for us to fix it**.
-You can attach a zip file with the minimal project directly to the bug report,
+You can attach a ZIP file with the minimal project directly to the bug report,
 by drag and dropping the file in the GitHub edition field.
+
+We recommend always attaching a minimal reproduction project, even if the issue
+may seem simple to reproduce manually.
+
+**If you've been asked by a maintainer to upload a minimal reproduction project,
+you *must* do so within 7 days.** Otherwise, your bug report will be closed as
+it'll be considered too difficult to diagnose.
+
+Now that you've read the guidelines, click the link below to create a
+bug report:
+
+- **[Report a bug](https://github.com/godotengine/godot/issues/new?assignees=&labels=&template=bug_report.md&title=)**
+
+## Proposing features or improvements
+
+**Since August 2019, the main issue tracker no longer accepts feature proposals.**
+Instead, head to the [Godot Proposals repository](https://github.com/godotengine/godot-proposals)
+and follow the instructions in the README file. High-quality feature proposals
+are more likely to be well-received by the maintainers and community, so do
+your best :)
+
+See [this article](https://godotengine.org/article/introducing-godot-proposals-repository)
+for detailed rationale on this change.

 ## Contributing pull requests

-If you want to add new engine functionalities, please make sure that:
+If you want to add new engine features, please make sure that:

-* This functionality is desired, which means that it solves a common use case
+- This functionality is desired, which means that it solves a common use case
 that several users will need in their real-life projects.
-* You talked to other developers on how to implement it best (on either
-communication channel, and maybe in a GitHub issue first before making your
-PR).
-* Even if it does not get merged, your PR is useful for future work by another
+- You talked to other developers on how to implement it best. See also
+  [Proposing features or improvements](#proposing-features-or-improvements).
+- Even if it doesn't get merged, your PR is useful for future work by another
 developer.

 Similar rules can be applied when contributing bug fixes - it's always best to
@@ -83,7 +119,23 @@ for an introduction to developing on Godot.
 The [Contributing docs](https://docs.godotengine.org/en/latest/community/contributing/index.html)
 also have important information on the PR workflow and the code style we use.

-#### Be nice to the git history
+### Document your changes
+
+If your pull request adds methods, properties or signals that are exposed to
+scripting APIs, you **must** update the class reference to document those.
+This is to ensure the documentation coverage doesn't decrease as contributions
+are merged.
+
+[Update the documentation template](https://docs.godotengine.org/en/latest/community/contributing/updating_the_class_reference.html#updating-the-documentation-template)
+using your compiled binary, then fill in the descriptions.
+Follow the style guide described in the
+[Docs writing guidelines](https://docs.godotengine.org/en/latest/community/contributing/docs_writing_guidelines.html).
+
+If your pull request modifies parts of the code in a non-obvious way, make sure
+to add comments in the code as well. This helps other people understand the
+change without having to look at `git blame`.
+
+### Be nice to the Git history

 Try to make simple PRs that handle one specific topic. Just like for reporting
 issues, it's better to open 3 different PRs that each address a different issue
@@ -99,33 +151,31 @@ commit, try to merge them together before making your pull request (see ``git
 rebase -i`` and relevant help about rebasing or amending commits on the
 Internet).

-This git style guide has some good practices to have in mind:
-[Git Style Guide](https://github.com/agis-/git-style-guide)
+This [Git style guide](https://github.com/agis-/git-style-guide) has some
+good practices to have in mind.
+
+See our [PR workflow](https://docs.godotengine.org/en/latest/community/contributing/pr_workflow.html)
+documentation for tips on using Git, amending commits and rebasing branches.

-#### Format your commit logs with readability in mind
+### Format your commit messages with readability in mind

-The way you format your commit logs is quite important to ensure that the
-commit history and changelog will be easy to read and understand. A git commit
-log is formatted as a short title (first line) and an extended description
+The way you format your commit messages is quite important to ensure that the
+commit history and changelog will be easy to read and understand. A Git commit
+message is formatted as a short title (first line) and an extended description
 (everything after the first line and an empty separation line).

 The short title is the most important part, as it is what will appear in the
 `shortlog` changelog (one line per commit, so no description shown) or in the
-GitHub interface unless you click the "expand" button. As the name tells it,
-try to keep that first line relatively short (ideally <= 50 chars, though it's
-rare to be able to tell enough in so few characters, so you can go a bit
-higher) - it should describe what the commit does globally, while details would
-go in the description. Typically, if you can't keep the title short because you
-have too much stuff to mention, it means that you should probably split your
-changes in several commits :)
+GitHub interface unless you click the "expand" button. As the name says, try to
+keep that first line under 72 characters. It should describe what the commit
+does globally, while details would go in the description. Typically, if you
+can't keep the title short because you have too much stuff to mention, it means
+you should probably split your changes in several commits :)

-Here's an example of a well-formatted commit log (note how the extended
+Here's an example of a well-formatted commit message (note how the extended
 description is also manually wrapped at 80 chars for readability):

-```
+```text
 Prevent French fries carbonization by fixing heat regulation

 When using the French fries frying module, Godot would not regulate the heat
@@ -139,9 +189,9 @@ of cooking oil under normal atmospheric conditions.
 Fixes #1789, long live the Realm!
 ```

-*Note:* When using the GitHub online editor (or worse, the drag and drop
-feature), *please* edit the commit title to something meaningful. Commits named
-"Update my_file.cpp" will not be accepted.
+**Note:** When using the GitHub online editor or its drag-and-drop
+feature, *please* edit the commit title to something meaningful. Commits named
+"Update my_file.cpp" won't be accepted.

 ## Contributing to Godot's translation

@@ -162,6 +212,7 @@ discussions and support, others more for development discussions.

 To communicate with developers (e.g. to discuss a feature you want to implement
 or a bug you want to fix), the following channels can be used:

 - [GitHub issues](https://github.com/godotengine/godot/issues): If there is an
   existing issue about a topic you want to discuss, just add a comment to it -
   all developers watch the repository and will get an email notification. You
@@ -182,6 +233,6 @@ or a bug you want to fix), the following channels can be used:
 page](https://listengine.tuxfamily.org/godotengine.org/devel/) for
 subscription instructions.

-Thanks!
+Thanks for your interest in contributing!

-The Godot development team
+—The Godot development team
514 SConstruct
@@ -26,46 +26,46 @@ platform_exporters = []
 platform_apis = []

 for x in sorted(glob.glob("platform/*")):
-    if (not os.path.isdir(x) or not os.path.exists(x + "/detect.py")):
+    if not os.path.isdir(x) or not os.path.exists(x + "/detect.py"):
         continue
     tmppath = "./" + x

     sys.path.insert(0, tmppath)
     import detect

-    if (os.path.exists(x + "/export/export.cpp")):
+    if os.path.exists(x + "/export/export.cpp"):
         platform_exporters.append(x[9:])
-    if (os.path.exists(x + "/api/api.cpp")):
+    if os.path.exists(x + "/api/api.cpp"):
         platform_apis.append(x[9:])
-    if (detect.is_active()):
+    if detect.is_active():
         active_platforms.append(detect.get_name())
         active_platform_ids.append(x)
-    if (detect.can_build()):
+    if detect.can_build():
         x = x.replace("platform/", "")  # rest of world
         x = x.replace("platform\\", "")  # win32
         platform_list += [x]
         platform_opts[x] = detect.get_opts()
         platform_flags[x] = detect.get_flags()
     sys.path.remove(tmppath)
-    sys.modules.pop('detect')
+    sys.modules.pop("detect")

 methods.save_active_platforms(active_platforms, active_platform_ids)

-custom_tools = ['default']
+custom_tools = ["default"]

 platform_arg = ARGUMENTS.get("platform", ARGUMENTS.get("p", False))

 if os.name == "nt" and (platform_arg == "android" or ARGUMENTS.get("use_mingw", False)):
-    custom_tools = ['mingw']
-elif platform_arg == 'javascript':
+    custom_tools = ["mingw"]
+elif platform_arg == "javascript":
     # Use generic POSIX build toolchain for Emscripten.
-    custom_tools = ['cc', 'c++', 'ar', 'link', 'textfile', 'zip']
+    custom_tools = ["cc", "c++", "ar", "link", "textfile", "zip"]

 env_base = Environment(tools=custom_tools)
-if 'TERM' in os.environ:
-    env_base['ENV']['TERM'] = os.environ['TERM']
-env_base.AppendENVPath('PATH', os.getenv('PATH'))
-env_base.AppendENVPath('PKG_CONFIG_PATH', os.getenv('PKG_CONFIG_PATH'))
+if "TERM" in os.environ:
+    env_base["ENV"]["TERM"] = os.environ["TERM"]
+env_base.AppendENVPath("PATH", os.getenv("PATH"))
+env_base.AppendENVPath("PKG_CONFIG_PATH", os.getenv("PKG_CONFIG_PATH"))
 env_base.disabled_modules = []
 env_base.use_ptrcall = False
 env_base.module_version_string = ""
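The platform-detection loop in the hunk above relies on a somewhat unusual Python idiom: each `platform/<name>/` directory ships its own `detect.py`, and the loop temporarily prepends that directory to `sys.path`, imports `detect`, queries it, then pops the module from `sys.modules` so the next iteration imports a fresh copy instead of hitting the import cache. A self-contained sketch of the idiom, using invented platform names and a trimmed-down `detect.py` interface:

```python
import os
import sys
import tempfile

# Create two fake platform directories, each with its own detect.py,
# mirroring platform/<name>/detect.py in the Godot source tree.
root = tempfile.mkdtemp()
for name, active in [("x11", True), ("windows", False)]:
    d = os.path.join(root, name)
    os.makedirs(d)
    with open(os.path.join(d, "detect.py"), "w") as f:
        f.write("def get_name():\n    return %r\n" % name)
        f.write("def is_active():\n    return %r\n" % active)

active_platforms = []
for x in sorted(os.listdir(root)):
    tmppath = os.path.join(root, x)
    sys.path.insert(0, tmppath)
    import detect  # resolves to tmppath/detect.py

    if detect.is_active():
        active_platforms.append(detect.get_name())
    sys.path.remove(tmppath)
    # Drop the cached module so the next iteration re-imports a
    # different detect.py instead of reusing this one.
    sys.modules.pop("detect")

print(active_platforms)
```

Without the `sys.modules.pop("detect")` step, every iteration after the first would silently reuse the first platform's module, which is exactly the bug this pattern guards against.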
@@ -93,7 +93,7 @@ env_base.SConsignFile(".sconsign{0}.dblite".format(pickle.HIGHEST_PROTOCOL))

 # Build options

-customs = ['custom.py']
+customs = ["custom.py"]

 profile = ARGUMENTS.get("profile", False)
 if profile:
@@ -105,61 +105,67 @@ if profile:
 opts = Variables(customs, ARGUMENTS)

 # Target build options
-opts.Add('arch', "Platform-dependent architecture (arm/arm64/x86/x64/mips/...)", '')
-opts.Add(EnumVariable('bits', "Target platform bits", 'default', ('default', '32', '64')))
-opts.Add('p', "Platform (alias for 'platform')", '')
-opts.Add('platform', "Target platform (%s)" % ('|'.join(platform_list), ), '')
-opts.Add(EnumVariable('target', "Compilation target", 'debug', ('debug', 'release_debug', 'release')))
-opts.Add(EnumVariable('optimize', "Optimization type", 'speed', ('speed', 'size')))
-opts.Add(BoolVariable('tools', "Build the tools (a.k.a. the Godot editor)", True))
-opts.Add(BoolVariable('use_lto', 'Use link-time optimization', False))
-opts.Add(BoolVariable('use_precise_math_checks', 'Math checks use very precise epsilon (useful to debug the engine)', False))
+opts.Add("arch", "Platform-dependent architecture (arm/arm64/x86/x64/mips/...)", "")
+opts.Add(EnumVariable("bits", "Target platform bits", "default", ("default", "32", "64")))
+opts.Add("p", "Platform (alias for 'platform')", "")
+opts.Add("platform", "Target platform (%s)" % ("|".join(platform_list),), "")
+opts.Add(EnumVariable("target", "Compilation target", "debug", ("debug", "release_debug", "release")))
+opts.Add(EnumVariable("optimize", "Optimization type", "speed", ("speed", "size")))
+opts.Add(BoolVariable("tools", "Build the tools (a.k.a. the Godot editor)", True))
+opts.Add(BoolVariable("use_lto", "Use link-time optimization", False))
+opts.Add(BoolVariable("use_precise_math_checks", "Math checks use very precise epsilon (debug option)", False))

 # Components
-opts.Add(BoolVariable('deprecated', "Enable deprecated features", True))
-opts.Add(BoolVariable('gdscript', "Enable GDScript support", True))
-opts.Add(BoolVariable('minizip', "Enable ZIP archive support using minizip", True))
-opts.Add(BoolVariable('xaudio2', "Enable the XAudio2 audio driver", False))
+opts.Add(BoolVariable("deprecated", "Enable deprecated features", True))
+opts.Add(BoolVariable("gdscript", "Enable GDScript support", True))
+opts.Add(BoolVariable("minizip", "Enable ZIP archive support using minizip", True))
+opts.Add(BoolVariable("xaudio2", "Enable the XAudio2 audio driver", False))
+opts.Add("custom_modules", "A list of comma-separated directory paths containing custom modules to build.", "")

 # Advanced options
-opts.Add(BoolVariable('verbose', "Enable verbose output for the compilation", False))
-opts.Add(BoolVariable('progress', "Show a progress indicator during compilation", True))
-opts.Add(EnumVariable('warnings', "Set the level of warnings emitted during compilation", 'all', ('extra', 'all', 'moderate', 'no')))
-opts.Add(BoolVariable('werror', "Treat compiler warnings as errors. Depends on the level of warnings set with 'warnings'", False))
-opts.Add(BoolVariable('dev', "If yes, alias for verbose=yes warnings=extra werror=yes", False))
-opts.Add('extra_suffix', "Custom extra suffix added to the base filename of all generated binary files", '')
-opts.Add(BoolVariable('vsproj', "Generate a Visual Studio solution", False))
-opts.Add(EnumVariable('macports_clang', "Build using Clang from MacPorts", 'no', ('no', '5.0', 'devel')))
-opts.Add(BoolVariable('split_libmodules', "Split intermediate libmodules.a in smaller chunks to prevent exceeding linker command line size (forced to True when using MinGW)", False))
-opts.Add(BoolVariable('disable_3d', "Disable 3D nodes for a smaller executable", False))
-opts.Add(BoolVariable('disable_advanced_gui', "Disable advanced GUI nodes and behaviors", False))
-opts.Add(BoolVariable('no_editor_splash', "Don't use the custom splash screen for the editor", False))
-opts.Add('system_certs_path', "Use this path as SSL certificates default for editor (for package maintainers)", '')
+opts.Add(BoolVariable("verbose", "Enable verbose output for the compilation", False))
+opts.Add(BoolVariable("progress", "Show a progress indicator during compilation", True))
+opts.Add(EnumVariable("warnings", "Level of compilation warnings", "all", ("extra", "all", "moderate", "no")))
+opts.Add(BoolVariable("werror", "Treat compiler warnings as errors", False))
+opts.Add(BoolVariable("dev", "If yes, alias for verbose=yes warnings=extra werror=yes", False))
+opts.Add("extra_suffix", "Custom extra suffix added to the base filename of all generated binary files", "")
+opts.Add(BoolVariable("vsproj", "Generate a Visual Studio solution", False))
+opts.Add(EnumVariable("macports_clang", "Build using Clang from MacPorts", "no", ("no", "5.0", "devel")))
+opts.Add(
+    BoolVariable(
+        "split_libmodules",
+        "Split intermediate libmodules.a in smaller chunks to prevent exceeding linker command line size (forced to True when using MinGW)",
+        False,
+    )
+)
+opts.Add(BoolVariable("disable_3d", "Disable 3D nodes for a smaller executable", False))
+opts.Add(BoolVariable("disable_advanced_gui", "Disable advanced GUI nodes and behaviors", False))
+opts.Add(BoolVariable("no_editor_splash", "Don't use the custom splash screen for the editor", False))
+opts.Add("system_certs_path", "Use this path as SSL certificates default for editor (for package maintainers)", "")

 # Thirdparty libraries
-#opts.Add(BoolVariable('builtin_assimp', "Use the built-in Assimp library", True))
-opts.Add(BoolVariable('builtin_bullet', "Use the built-in Bullet library", True))
-opts.Add(BoolVariable('builtin_certs', "Bundle default SSL certificates to be used if you don't specify an override in the project settings", True))
-opts.Add(BoolVariable('builtin_enet', "Use the built-in ENet library", True))
-opts.Add(BoolVariable('builtin_freetype', "Use the built-in FreeType library", True))
-opts.Add(BoolVariable('builtin_libogg', "Use the built-in libogg library", True))
-opts.Add(BoolVariable('builtin_libpng', "Use the built-in libpng library", True))
-opts.Add(BoolVariable('builtin_libtheora', "Use the built-in libtheora library", True))
-opts.Add(BoolVariable('builtin_libvorbis', "Use the built-in libvorbis library", True))
-opts.Add(BoolVariable('builtin_libvpx', "Use the built-in libvpx library", True))
-opts.Add(BoolVariable('builtin_libwebp', "Use the built-in libwebp library", True))
-opts.Add(BoolVariable('builtin_wslay', "Use the built-in wslay library", True))
-opts.Add(BoolVariable('builtin_mbedtls', "Use the built-in mbedTLS library", True))
-opts.Add(BoolVariable('builtin_miniupnpc', "Use the built-in miniupnpc library", True))
-opts.Add(BoolVariable('builtin_opus', "Use the built-in Opus library", True))
-opts.Add(BoolVariable('builtin_pcre2', "Use the built-in PCRE2 library", True))
-opts.Add(BoolVariable('builtin_pcre2_with_jit', "Use JIT compiler for the built-in PCRE2 library", True))
-opts.Add(BoolVariable('builtin_recast', "Use the built-in Recast library", True))
-opts.Add(BoolVariable('builtin_squish', "Use the built-in squish library", True))
-opts.Add(BoolVariable('builtin_xatlas', "Use the built-in xatlas library", True))
-opts.Add(BoolVariable('builtin_zlib', "Use the built-in zlib library", True))
-opts.Add(BoolVariable('builtin_zstd', "Use the built-in Zstd library", True))
+# opts.Add(BoolVariable('builtin_assimp', "Use the built-in Assimp library", True))
+opts.Add(BoolVariable("builtin_bullet", "Use the built-in Bullet library", True))
+opts.Add(BoolVariable("builtin_certs", "Use the built-in SSL certificates bundles", True))
+opts.Add(BoolVariable("builtin_enet", "Use the built-in ENet library", True))
+opts.Add(BoolVariable("builtin_freetype", "Use the built-in FreeType library", True))
+opts.Add(BoolVariable("builtin_libogg", "Use the built-in libogg library", True))
+opts.Add(BoolVariable("builtin_libpng", "Use the built-in libpng library", True))
+opts.Add(BoolVariable("builtin_libtheora", "Use the built-in libtheora library", True))
+opts.Add(BoolVariable("builtin_libvorbis", "Use the built-in libvorbis library", True))
+opts.Add(BoolVariable("builtin_libvpx", "Use the built-in libvpx library", True))
+opts.Add(BoolVariable("builtin_libwebp", "Use the built-in libwebp library", True))
+opts.Add(BoolVariable("builtin_wslay", "Use the built-in wslay library", True))
+opts.Add(BoolVariable("builtin_mbedtls", "Use the built-in mbedTLS library", True))
+opts.Add(BoolVariable("builtin_miniupnpc", "Use the built-in miniupnpc library", True))
+opts.Add(BoolVariable("builtin_opus", "Use the built-in Opus library", True))
+opts.Add(BoolVariable("builtin_pcre2", "Use the built-in PCRE2 library", True))
+opts.Add(BoolVariable("builtin_pcre2_with_jit", "Use JIT compiler for the built-in PCRE2 library", True))
+opts.Add(BoolVariable("builtin_recast", "Use the built-in Recast library", True))
+opts.Add(BoolVariable("builtin_squish", "Use the built-in squish library", True))
+opts.Add(BoolVariable("builtin_xatlas", "Use the built-in xatlas library", True))
+opts.Add(BoolVariable("builtin_zlib", "Use the built-in zlib library", True))
+opts.Add(BoolVariable("builtin_zstd", "Use the built-in Zstd library", True))

 # Compilation environment setup
 opts.Add("CXX", "C++ compiler")
@@ -223,51 +229,51 @@ Help(opts.GenerateHelpText(env_base))

 # add default include paths

-env_base.Prepend(CPPPATH=['#'])
+env_base.Prepend(CPPPATH=["#"])

 # configure ENV for platform
 env_base.platform_exporters = platform_exporters
 env_base.platform_apis = platform_apis

-if (env_base["use_precise_math_checks"]):
-    env_base.Append(CPPDEFINES=['PRECISE_MATH_CHECKS'])
+if env_base["use_precise_math_checks"]:
+    env_base.Append(CPPDEFINES=["PRECISE_MATH_CHECKS"])

-if (env_base['target'] == 'debug'):
-    env_base.Append(CPPDEFINES=['DEBUG_MEMORY_ALLOC','DISABLE_FORCED_INLINE'])
+if env_base["target"] == "debug":
+    env_base.Append(CPPDEFINES=["DEBUG_MEMORY_ALLOC", "DISABLE_FORCED_INLINE"])

 # The two options below speed up incremental builds, but reduce the certainty that all files
 # will properly be rebuilt. As such, we only enable them for debug (dev) builds, not release.

 # To decide whether to rebuild a file, use the MD5 sum only if the timestamp has changed.
 # http://scons.org/doc/production/HTML/scons-user/ch06.html#idm139837621851792
-env_base.Decider('MD5-timestamp')
+env_base.Decider("MD5-timestamp")
 # Use cached implicit dependencies by default. Can be overridden by specifying `--implicit-deps-changed` in the command line.
 # http://scons.org/doc/production/HTML/scons-user/ch06s04.html
-env_base.SetOption('implicit_cache', 1)
+env_base.SetOption("implicit_cache", 1)

-if (env_base['no_editor_splash']):
-    env_base.Append(CPPDEFINES=['NO_EDITOR_SPLASH'])
+if env_base["no_editor_splash"]:
+    env_base.Append(CPPDEFINES=["NO_EDITOR_SPLASH"])

-if not env_base['deprecated']:
-    env_base.Append(CPPDEFINES=['DISABLE_DEPRECATED'])
+if not env_base["deprecated"]:
+    env_base.Append(CPPDEFINES=["DISABLE_DEPRECATED"])

 env_base.platforms = {}

 selected_platform = ""

-if env_base['platform'] != "":
-    selected_platform = env_base['platform']
-elif env_base['p'] != "":
-    selected_platform = env_base['p']
+if env_base["platform"] != "":
+    selected_platform = env_base["platform"]
+elif env_base["p"] != "":
+    selected_platform = env_base["p"]
     env_base["platform"] = selected_platform
 else:
     # Missing `platform` argument, try to detect platform automatically
-    if sys.platform.startswith('linux'):
-        selected_platform = 'x11'
-    elif sys.platform == 'darwin':
-        selected_platform = 'osx'
-    elif sys.platform == 'win32':
-        selected_platform = 'windows'
+    if sys.platform.startswith("linux"):
+        selected_platform = "x11"
+    elif sys.platform == "darwin":
+        selected_platform = "osx"
+    elif sys.platform == "win32":
+        selected_platform = "windows"
     else:
         print("Could not detect platform automatically. Supported platforms:")
         for x in platform_list:
@@ -282,6 +288,7 @@ if selected_platform in platform_list:
     tmppath = "./platform/" + selected_platform
     sys.path.insert(0, tmppath)
     import detect
+
     if "create" in dir(detect):
         env = detect.create(env_base)
     else:
@@ -295,12 +302,12 @@ if selected_platform in platform_list:
     env.Tool("compilation_db", toolpath=["misc/scons"])
     env.Alias("compiledb", env.CompilationDatabase("compile_commands.json"))

-    if env['dev']:
-        env['verbose'] = True
-        env['warnings'] = "extra"
-        env['werror'] = True
+    if env["dev"]:
+        env["verbose"] = True
+        env["warnings"] = "extra"
+        env["werror"] = True

-    if env['vsproj']:
+    if env["vsproj"]:
         env.vs_incs = []
         env.vs_srcs = []

@@ -313,7 +320,7 @@ if selected_platform in platform_list:
             pieces = fname.split(".")
             if len(pieces) > 0:
                 basename = pieces[0]
-                basename = basename.replace('\\\\', '/')
+                basename = basename.replace("\\\\", "/")
                 if os.path.isfile(basename + ".h"):
                     env.vs_incs = env.vs_incs + [basename + ".h"]
                 elif os.path.isfile(basename + ".hpp"):
@@ -322,28 +329,29 @@ if selected_platform in platform_list:
                     env.vs_srcs = env.vs_srcs + [basename + ".c"]
                 elif os.path.isfile(basename + ".cpp"):
                     env.vs_srcs = env.vs_srcs + [basename + ".cpp"]

         env.AddToVSProject = AddToVSProject

     env.extra_suffix = ""

-    if env["extra_suffix"] != '':
-        env.extra_suffix += '.' + env["extra_suffix"]
+    if env["extra_suffix"] != "":
+        env.extra_suffix += "." + env["extra_suffix"]

     # Environment flags
-    CCFLAGS = env.get('CCFLAGS', '')
-    env['CCFLAGS'] = ''
+    CCFLAGS = env.get("CCFLAGS", "")
+    env["CCFLAGS"] = ""
     env.Append(CCFLAGS=str(CCFLAGS).split())

-    CFLAGS = env.get('CFLAGS', '')
-    env['CFLAGS'] = ''
+    CFLAGS = env.get("CFLAGS", "")
+    env["CFLAGS"] = ""
     env.Append(CFLAGS=str(CFLAGS).split())

-    CXXFLAGS = env.get('CXXFLAGS', '')
-    env['CXXFLAGS'] = ''
+    CXXFLAGS = env.get("CXXFLAGS", "")
+    env["CXXFLAGS"] = ""
     env.Append(CXXFLAGS=str(CXXFLAGS).split())

-    LINKFLAGS = env.get('LINKFLAGS', '')
-    env['LINKFLAGS'] = ''
+    LINKFLAGS = env.get("LINKFLAGS", "")
+    env["LINKFLAGS"] = ""
     env.Append(LINKFLAGS=str(LINKFLAGS).split())

     # Platform specific flags
@@ -362,78 +370,83 @@ if selected_platform in platform_list:
         # Specifying GNU extensions support explicitly, which are supported by
         # both GCC and Clang. This mirrors GCC and Clang's current default
         # compile flags if no -std is specified.
-        env.Prepend(CFLAGS=['-std=gnu11'])
-        env.Prepend(CXXFLAGS=['-std=gnu++14'])
+        env.Prepend(CFLAGS=["-std=gnu11"])
+        env.Prepend(CXXFLAGS=["-std=gnu++14"])
     else:
         # MSVC doesn't have clear C standard support, /std only covers C++.
         # We apply it to CCFLAGS (both C and C++ code) in case it impacts C features.
-        env.Prepend(CCFLAGS=['/std:c++14'])
+        env.Prepend(CCFLAGS=["/std:c++14"])
 
     # Configure compiler warnings
     if env.msvc:
         # Truncations, narrowing conversions, signed/unsigned comparisons...
-        disable_nonessential_warnings = ['/wd4267', '/wd4244', '/wd4305', '/wd4018', '/wd4800']
-        if (env["warnings"] == 'extra'):
-            env.Append(CCFLAGS=['/Wall']) # Implies /W4
-        elif (env["warnings"] == 'all'):
-            env.Append(CCFLAGS=['/W3'] + disable_nonessential_warnings)
-        elif (env["warnings"] == 'moderate'):
-            env.Append(CCFLAGS=['/W2'] + disable_nonessential_warnings)
-        else: # 'no'
-            env.Append(CCFLAGS=['/w'])
+        disable_nonessential_warnings = ["/wd4267", "/wd4244", "/wd4305", "/wd4018", "/wd4800"]
+        if env["warnings"] == "extra":
+            env.Append(CCFLAGS=["/Wall"])  # Implies /W4
+        elif env["warnings"] == "all":
+            env.Append(CCFLAGS=["/W3"] + disable_nonessential_warnings)
+        elif env["warnings"] == "moderate":
+            env.Append(CCFLAGS=["/W2"] + disable_nonessential_warnings)
+        else:  # 'no'
+            env.Append(CCFLAGS=["/w"])
         # Set exception handling model to avoid warnings caused by Windows system headers.
-        env.Append(CCFLAGS=['/EHsc'])
-        if (env["werror"]):
-            env.Append(CCFLAGS=['/WX'])
+        env.Append(CCFLAGS=["/EHsc"])
+        if env["werror"]:
+            env.Append(CCFLAGS=["/WX"])
         # Force to use Unicode encoding
-        env.Append(MSVC_FLAGS=['/utf8'])
-    else: # Rest of the world
+        env.Append(MSVC_FLAGS=["/utf8"])
+    else:  # Rest of the world
         version = methods.get_compiler_version(env) or [-1, -1]
 
         shadow_local_warning = []
-        all_plus_warnings = ['-Wwrite-strings']
+        all_plus_warnings = ["-Wwrite-strings"]
 
         if methods.using_gcc(env):
            if version[0] >= 7:
-                shadow_local_warning = ['-Wshadow-local']
+                shadow_local_warning = ["-Wshadow-local"]
 
-        if (env["warnings"] == 'extra'):
+        if env["warnings"] == "extra":
             # Note: enable -Wimplicit-fallthrough for Clang (already part of -Wextra for GCC)
             # once we switch to C++11 or later (necessary for our FALLTHROUGH macro).
-            env.Append(CCFLAGS=['-Wall', '-Wextra', '-Wno-unused-parameter']
-                       + all_plus_warnings + shadow_local_warning)
-            env.Append(CXXFLAGS=['-Wctor-dtor-privacy', '-Wnon-virtual-dtor'])
+            env.Append(CCFLAGS=["-Wall", "-Wextra", "-Wno-unused-parameter"] + all_plus_warnings + shadow_local_warning)
+            env.Append(CXXFLAGS=["-Wctor-dtor-privacy", "-Wnon-virtual-dtor"])
             if methods.using_gcc(env):
-                env.Append(CCFLAGS=['-Walloc-zero',
-                           '-Wduplicated-branches', '-Wduplicated-cond',
-                           '-Wstringop-overflow=4', '-Wlogical-op'])
-                env.Append(CXXFLAGS=['-Wnoexcept', '-Wplacement-new=1'])
+                env.Append(
+                    CCFLAGS=[
+                        "-Walloc-zero",
+                        "-Wduplicated-branches",
+                        "-Wduplicated-cond",
+                        "-Wstringop-overflow=4",
+                        "-Wlogical-op",
+                    ]
+                )
+                env.Append(CXXFLAGS=["-Wnoexcept", "-Wplacement-new=1"])
                 if version[0] >= 9:
-                    env.Append(CCFLAGS=['-Wattribute-alias=2'])
-        elif (env["warnings"] == 'all'):
-            env.Append(CCFLAGS=['-Wall'] + shadow_local_warning)
-        elif (env["warnings"] == 'moderate'):
-            env.Append(CCFLAGS=['-Wall', '-Wno-unused'] + shadow_local_warning)
-        else: # 'no'
-            env.Append(CCFLAGS=['-w'])
-        if (env["werror"]):
-            env.Append(CCFLAGS=['-Werror'])
-        else: # always enable those errors
-            env.Append(CCFLAGS=['-Werror=return-type'])
+                    env.Append(CCFLAGS=["-Wattribute-alias=2"])
+        elif env["warnings"] == "all":
+            env.Append(CCFLAGS=["-Wall"] + shadow_local_warning)
+        elif env["warnings"] == "moderate":
+            env.Append(CCFLAGS=["-Wall", "-Wno-unused"] + shadow_local_warning)
+        else:  # 'no'
+            env.Append(CCFLAGS=["-w"])
+        if env["werror"]:
+            env.Append(CCFLAGS=["-Werror"])
+        else:  # always enable those errors
+            env.Append(CCFLAGS=["-Werror=return-type"])
 
-    if (hasattr(detect, 'get_program_suffix')):
+    if hasattr(detect, "get_program_suffix"):
         suffix = "." + detect.get_program_suffix()
     else:
         suffix = "." + selected_platform
 
-    if (env["target"] == "release"):
+    if env["target"] == "release":
         if env["tools"]:
             print("Tools can only be built with targets 'debug' and 'release_debug'.")
             sys.exit(255)
         suffix += ".opt"
-        env.Append(CPPDEFINES=['NDEBUG'])
+        env.Append(CPPDEFINES=["NDEBUG"])
 
-    elif (env["target"] == "release_debug"):
+    elif env["target"] == "release_debug":
         if env["tools"]:
             suffix += ".opt.tools"
         else:
@@ -446,15 +459,15 @@ if selected_platform in platform_list:
 
     if env["arch"] != "":
         suffix += "." + env["arch"]
-    elif (env["bits"] == "32"):
+    elif env["bits"] == "32":
         suffix += ".32"
-    elif (env["bits"] == "64"):
+    elif env["bits"] == "64":
         suffix += ".64"
 
     suffix += env.extra_suffix
 
     sys.path.remove(tmppath)
-    sys.modules.pop('detect')
+    sys.modules.pop("detect")
 
     modules_enabled = OrderedDict()
     env.module_icons_paths = []
@@ -466,17 +479,20 @@ if selected_platform in platform_list:
         sys.path.insert(0, path)
         env.current_module = name
         import config
 
         # can_build changed number of arguments between 3.0 (1) and 3.1 (2),
         # so try both to preserve compatibility for 3.0 modules
         can_build = False
         try:
             can_build = config.can_build(env, selected_platform)
         except TypeError:
-            print("Warning: module '%s' uses a deprecated `can_build` "
-                  "signature in its config.py file, it should be "
-                  "`can_build(env, platform)`." % x)
+            print(
+                "Warning: module '%s' uses a deprecated `can_build` "
+                "signature in its config.py file, it should be "
+                "`can_build(env, platform)`." % x
+            )
             can_build = config.can_build(selected_platform)
-        if (can_build):
+        if can_build:
             config.configure(env)
             # Get doc classes paths (if present)
             try:
@@ -517,47 +533,68 @@ if selected_platform in platform_list:
     env["LIBSUFFIX"] = suffix + env["LIBSUFFIX"]
     env["SHLIBSUFFIX"] = suffix + env["SHLIBSUFFIX"]
 
-    if (env.use_ptrcall):
-        env.Append(CPPDEFINES=['PTRCALL_ENABLED'])
-    if env['tools']:
-        env.Append(CPPDEFINES=['TOOLS_ENABLED'])
-    if env['disable_3d']:
-        if env['tools']:
-            print("Build option 'disable_3d=yes' cannot be used with 'tools=yes' (editor), only with 'tools=no' (export template).")
+    if env.use_ptrcall:
+        env.Append(CPPDEFINES=["PTRCALL_ENABLED"])
+    if env["tools"]:
+        env.Append(CPPDEFINES=["TOOLS_ENABLED"])
+    if env["disable_3d"]:
+        if env["tools"]:
+            print(
+                "Build option 'disable_3d=yes' cannot be used with 'tools=yes' (editor), "
+                "only with 'tools=no' (export template)."
+            )
             sys.exit(255)
         else:
-            env.Append(CPPDEFINES=['_3D_DISABLED'])
-    if env['gdscript']:
-        env.Append(CPPDEFINES=['GDSCRIPT_ENABLED'])
-    if env['disable_advanced_gui']:
-        if env['tools']:
-            print("Build option 'disable_advanced_gui=yes' cannot be used with 'tools=yes' (editor), only with 'tools=no' (export template).")
+            env.Append(CPPDEFINES=["_3D_DISABLED"])
+    if env["gdscript"]:
+        env.Append(CPPDEFINES=["GDSCRIPT_ENABLED"])
+    if env["disable_advanced_gui"]:
+        if env["tools"]:
+            print(
+                "Build option 'disable_advanced_gui=yes' cannot be used with 'tools=yes' (editor), "
+                "only with 'tools=no' (export template)."
+            )
             sys.exit(255)
         else:
-            env.Append(CPPDEFINES=['ADVANCED_GUI_DISABLED'])
-    if env['minizip']:
-        env.Append(CPPDEFINES=['MINIZIP_ENABLED'])
+            env.Append(CPPDEFINES=["ADVANCED_GUI_DISABLED"])
+    if env["minizip"]:
+        env.Append(CPPDEFINES=["MINIZIP_ENABLED"])
 
-    editor_module_list = ['regex']
+    editor_module_list = ["regex"]
     for x in editor_module_list:
-        if not env['module_' + x + '_enabled']:
-            if env['tools']:
-                print("Build option 'module_" + x + "_enabled=no' cannot be used with 'tools=yes' (editor), only with 'tools=no' (export template).")
+        if not env["module_" + x + "_enabled"]:
+            if env["tools"]:
+                print(
+                    "Build option 'module_" + x + "_enabled=no' cannot be used with 'tools=yes' (editor), "
+                    "only with 'tools=no' (export template)."
+                )
                 sys.exit(255)
 
-    if not env['verbose']:
+    if not env["verbose"]:
         methods.no_verbose(sys, env)
 
-    if (not env["platform"] == "server"): # FIXME: detect GLES3
-        env.Append(BUILDERS = { 'GLES3_GLSL' : env.Builder(action=run_in_subprocess(gles_builders.build_gles3_headers), suffix='glsl.gen.h', src_suffix='.glsl')})
-        env.Append(BUILDERS = { 'GLES2_GLSL' : env.Builder(action=run_in_subprocess(gles_builders.build_gles2_headers), suffix='glsl.gen.h', src_suffix='.glsl')})
+    if not env["platform"] == "server":  # FIXME: detect GLES3
+        env.Append(
+            BUILDERS={
+                "GLES3_GLSL": env.Builder(
+                    action=run_in_subprocess(gles_builders.build_gles3_headers), suffix="glsl.gen.h", src_suffix=".glsl"
+                )
+            }
+        )
+        env.Append(
+            BUILDERS={
+                "GLES2_GLSL": env.Builder(
+                    action=run_in_subprocess(gles_builders.build_gles2_headers), suffix="glsl.gen.h", src_suffix=".glsl"
+                )
+            }
+        )
 
     scons_cache_path = os.environ.get("SCONS_CACHE")
     if scons_cache_path != None:
         CacheDir(scons_cache_path)
         print("Scons cache enabled... (path: '" + scons_cache_path + "')")
 
-    Export('env')
+    Export("env")
 
     # build subdirs, the build order is dependent on link order.
 
@@ -574,16 +611,16 @@ if selected_platform in platform_list:
     SConscript("platform/" + selected_platform + "/SCsub")  # build selected platform
 
     # Microsoft Visual Studio Project Generation
-    if env['vsproj']:
-        env['CPPPATH'] = [Dir(path) for path in env['CPPPATH']]
+    if env["vsproj"]:
+        env["CPPPATH"] = [Dir(path) for path in env["CPPPATH"]]
         methods.generate_vs_project(env, GetOption("num_jobs"))
         methods.generate_cpp_hint_file("cpp.hint")
 
     # Check for the existence of headers
     conf = Configure(env)
-    if ("check_c_headers" in env):
+    if "check_c_headers" in env:
         for header in env["check_c_headers"]:
-            if (conf.CheckCHeader(header[0])):
+            if conf.CheckCHeader(header[0]):
                 env.AppendUnique(CPPDEFINES=[header[1]])
 
 elif selected_platform != "":
@@ -604,118 +641,9 @@ elif selected_platform != "":
     else:
         sys.exit(255)
 
-# The following only makes sense when the env is defined, and assumes it is
-if 'env' in locals():
-    screen = sys.stdout
-    # Progress reporting is not available in non-TTY environments since it
-    # messes with the output (for example, when writing to a file)
-    show_progress = (env['progress'] and sys.stdout.isatty())
-    node_count = 0
-    node_count_max = 0
-    node_count_interval = 1
-    node_count_fname = str(env.Dir('#')) + '/.scons_node_count'
-
-    import time, math
-
-    class cache_progress:
-        # The default is 1 GB cache and 12 hours half life
-        def __init__(self, path = None, limit = 1073741824, half_life = 43200):
-            self.path = path
-            self.limit = limit
-            self.exponent_scale = math.log(2) / half_life
-            if env['verbose'] and path != None:
-                screen.write('Current cache limit is ' + self.convert_size(limit) + ' (used: ' + self.convert_size(self.get_size(path)) + ')\n')
-            self.delete(self.file_list())
-
-        def __call__(self, node, *args, **kw):
-            global node_count, node_count_max, node_count_interval, node_count_fname, show_progress
-            if show_progress:
-                # Print the progress percentage
-                node_count += node_count_interval
-                if (node_count_max > 0 and node_count <= node_count_max):
-                    screen.write('\r[%3d%%] ' % (node_count * 100 / node_count_max))
-                    screen.flush()
-                elif (node_count_max > 0 and node_count > node_count_max):
-                    screen.write('\r[100%] ')
-                    screen.flush()
-                else:
-                    screen.write('\r[Initial build] ')
-                    screen.flush()
-
-        def delete(self, files):
-            if len(files) == 0:
-                return
-            if env['verbose']:
-                # Utter something
-                screen.write('\rPurging %d %s from cache...\n' % (len(files), len(files) > 1 and 'files' or 'file'))
-            [os.remove(f) for f in files]
-
-        def file_list(self):
-            if self.path is None:
-                # Nothing to do
-                return []
-            # Gather a list of (filename, (size, atime)) within the
-            # cache directory
-            file_stat = [(x, os.stat(x)[6:8]) for x in glob.glob(os.path.join(self.path, '*', '*'))]
-            if file_stat == []:
-                # Nothing to do
-                return []
-            # Weight the cache files by size (assumed to be roughly
-            # proportional to the recompilation time) times an exponential
-            # decay since the ctime, and return a list with the entries
-            # (filename, size, weight).
-            current_time = time.time()
-            file_stat = [(x[0], x[1][0], (current_time - x[1][1])) for x in file_stat]
-            # Sort by the most recently accessed files (most sensible to keep) first
-            file_stat.sort(key=lambda x: x[2])
-            # Search for the first entry where the storage limit is
-            # reached
-            sum, mark = 0, None
-            for i,x in enumerate(file_stat):
-                sum += x[1]
-                if sum > self.limit:
-                    mark = i
-                    break
-            if mark is None:
-                return []
-            else:
-                return [x[0] for x in file_stat[mark:]]
-
-        def convert_size(self, size_bytes):
-            if size_bytes == 0:
-                return "0 bytes"
-            size_name = ("bytes", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB")
-            i = int(math.floor(math.log(size_bytes, 1024)))
-            p = math.pow(1024, i)
-            s = round(size_bytes / p, 2)
-            return "%s %s" % (int(s) if i == 0 else s, size_name[i])
-
-        def get_size(self, start_path = '.'):
-            total_size = 0
-            for dirpath, dirnames, filenames in os.walk(start_path):
-                for f in filenames:
-                    fp = os.path.join(dirpath, f)
-                    total_size += os.path.getsize(fp)
-            return total_size
-
-    def progress_finish(target, source, env):
-        global node_count, progressor
-        with open(node_count_fname, 'w') as f:
-            f.write('%d\n' % node_count)
-        progressor.delete(progressor.file_list())
-
-    try:
-        with open(node_count_fname) as f:
-            node_count_max = int(f.readline())
-    except:
-        pass
-
-    cache_directory = os.environ.get("SCONS_CACHE")
-    # Simple cache pruning, attached to SCons' progress callback. Trim the
-    # cache directory to a size not larger than cache_limit.
-    cache_limit = float(os.getenv("SCONS_CACHE_LIMIT", 1024)) * 1024 * 1024
-    progressor = cache_progress(cache_directory, cache_limit)
-    Progress(progressor, interval = node_count_interval)
-
-    progress_finish_command = Command('progress_finish', [], progress_finish)
-    AlwaysBuild(progress_finish_command)
+# The following only makes sense when the 'env' is defined, and assumes it is.
+if "env" in locals():
+    methods.show_progress(env)
+    # TODO: replace this with `env.Dump(format="json")`
+    # once we start requiring SCons 4.0 as min version.
+    methods.dump(env)
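The pruning policy removed in the hunk above (the new code delegates to `methods.show_progress`) boils down to: sort cache entries by access recency, keep files until the byte limit is hit, purge the rest. A minimal sketch, with `files_to_purge` as a hypothetical name for this sketch only:

```python
# Entries are (name, size, seconds_since_access). The most recently accessed
# files are kept until the cumulative size exceeds the limit; everything from
# that point on is returned for deletion.
def files_to_purge(file_stat, limit):
    file_stat = sorted(file_stat, key=lambda entry: entry[2])  # most recent first
    total = 0
    mark = None
    for i, (name, size, age) in enumerate(file_stat):
        total += size
        if total > limit:
            mark = i
            break
    if mark is None:
        return []
    return [name for name, _size, _age in file_stat[mark:]]
```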

44 compat.py
@@ -1,68 +1,90 @@
 import sys
 
 if sys.version_info < (3,):
 
     def isbasestring(s):
         return isinstance(s, basestring)
 
     def open_utf8(filename, mode):
         return open(filename, mode)
 
     def byte_to_str(x):
         return str(ord(x))
 
     import cStringIO
 
     def StringIO():
         return cStringIO.StringIO()
 
     def encode_utf8(x):
         return x
 
     def decode_utf8(x):
         return x
 
     def iteritems(d):
         return d.iteritems()
 
     def itervalues(d):
         return d.itervalues()
 
     def escape_string(s):
         if isinstance(s, unicode):
-            s = s.encode('ascii')
-        result = ''
+            s = s.encode("ascii")
+        result = ""
         for c in s:
-            if not (32 <= ord(c) < 127) or c in ('\\', '"'):
-                result += '\\%03o' % ord(c)
+            if not (32 <= ord(c) < 127) or c in ("\\", '"'):
+                result += "\\%03o" % ord(c)
             else:
                 result += c
         return result
 
 
 else:
 
     def isbasestring(s):
         return isinstance(s, (str, bytes))
 
     def open_utf8(filename, mode):
         return open(filename, mode, encoding="utf-8")
 
     def byte_to_str(x):
         return str(x)
 
     import io
 
     def StringIO():
         return io.StringIO()
 
     import codecs
 
     def encode_utf8(x):
         return codecs.utf_8_encode(x)[0]
 
     def decode_utf8(x):
         return codecs.utf_8_decode(x)[0]
 
     def iteritems(d):
         return iter(d.items())
 
     def itervalues(d):
         return iter(d.values())
 
     def charcode_to_c_escapes(c):
         rev_result = []
         while c >= 256:
             c, low = (c // 256, c % 256)
-            rev_result.append('\\%03o' % low)
-        rev_result.append('\\%03o' % c)
-        return ''.join(reversed(rev_result))
+            rev_result.append("\\%03o" % low)
+        rev_result.append("\\%03o" % c)
+        return "".join(reversed(rev_result))
 
     def escape_string(s):
-        result = ''
+        result = ""
         if isinstance(s, str):
-            s = s.encode('utf-8')
+            s = s.encode("utf-8")
         for c in s:
-            if not(32 <= c < 127) or c in (ord('\\'), ord('"')):
+            if not (32 <= c < 127) or c in (ord("\\"), ord('"')):
                 result += charcode_to_c_escapes(c)
             else:
                 result += chr(c)
         return result
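The Python 3 branch of `escape_string` in the compat.py diff above is self-contained; reproducing it standalone makes its C-octal-escape behavior easy to check:

```python
# Verbatim reproduction of compat.py's Python 3 escape_string: non-printable
# bytes, backslashes and double quotes become C octal escapes like \042.
def charcode_to_c_escapes(c):
    rev_result = []
    while c >= 256:
        c, low = (c // 256, c % 256)
        rev_result.append("\\%03o" % low)
    rev_result.append("\\%03o" % c)
    return "".join(reversed(rev_result))

def escape_string(s):
    result = ""
    if isinstance(s, str):
        s = s.encode("utf-8")
    for c in s:
        if not (32 <= c < 127) or c in (ord("\\"), ord('"')):
            result += charcode_to_c_escapes(c)
        else:
            result += chr(c)
    return result
```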

103 core/SCsub
@@ -1,6 +1,6 @@
 #!/usr/bin/env python
 
-Import('env')
+Import("env")
 
 import core_builders
 import make_binders
@@ -11,31 +11,32 @@ env.core_sources = []
 
 # Generate AES256 script encryption key
 import os
 
 txt = "0,0,0,0,0,0,0,0, 0,0,0,0,0,0,0,0, 0,0,0,0,0,0,0,0, 0,0,0,0,0,0,0,0"
-if ("SCRIPT_AES256_ENCRYPTION_KEY" in os.environ):
+if "SCRIPT_AES256_ENCRYPTION_KEY" in os.environ:
     e = os.environ["SCRIPT_AES256_ENCRYPTION_KEY"]
     txt = ""
     ec_valid = True
-    if (len(e) != 64):
+    if len(e) != 64:
         ec_valid = False
     else:
 
         for i in range(len(e) >> 1):
-            if (i > 0):
+            if i > 0:
                 txt += ","
-            txts = "0x" + e[i * 2:i * 2 + 2]
+            txts = "0x" + e[i * 2 : i * 2 + 2]
             try:
                 int(txts, 16)
             except:
                 ec_valid = False
             txt += txts
-    if (not ec_valid):
+    if not ec_valid:
         txt = "0,0,0,0,0,0,0,0, 0,0,0,0,0,0,0,0, 0,0,0,0,0,0,0,0, 0,0,0,0,0,0,0,0"
         print("Invalid AES256 encryption key, not 64 bits hex: " + e)
 
 # NOTE: It is safe to generate this file here, since this is still executed serially
 with open("script_encryption_key.gen.cpp", "w") as f:
-    f.write("#include \"core/project_settings.h\"\nuint8_t script_encryption_key[32]={" + txt + "};\n")
+    f.write('#include "core/project_settings.h"\nuint8_t script_encryption_key[32]={' + txt + "};\n")
 
 
 # Add required thirdparty code.
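The key-validation loop in the core/SCsub hunk above turns a 64-character hex string into a comma-separated list of C byte literals. A standalone sketch (`key_to_c_bytes` is a hypothetical name for illustration; the real script mutates `txt` in place and prints a warning instead of returning None):

```python
# Validate a 64-char hex key and emit the C initializer list used for
# script_encryption_key.gen.cpp; return None on any invalid input.
def key_to_c_bytes(e):
    if len(e) != 64:
        return None
    txt = ""
    for i in range(len(e) >> 1):
        if i > 0:
            txt += ","
        txts = "0x" + e[i * 2 : i * 2 + 2]
        try:
            int(txts, 16)
        except ValueError:
            return None
        txt += txts
    return txt
```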
@@ -49,7 +50,6 @@ thirdparty_misc_sources = [
     # C sources
     "fastlz.c",
     "smaz.c",
-
     # C++ sources
     "hq2x.cpp",
     "pcg.cpp",
@@ -60,30 +60,30 @@ thirdparty_misc_sources = [thirdparty_misc_dir + file for file in thirdparty_mis
 env_thirdparty.add_source_files(env.core_sources, thirdparty_misc_sources)
 
 # Zlib library, can be unbundled
-if env['builtin_zlib']:
-    thirdparty_zlib_dir = "#thirdparty/zlib/"
-    thirdparty_zlib_sources = [
-        "adler32.c",
-        "compress.c",
-        "crc32.c",
-        "deflate.c",
-        "infback.c",
-        "inffast.c",
-        "inflate.c",
-        "inftrees.c",
-        "trees.c",
-        "uncompr.c",
-        "zutil.c",
-    ]
-    thirdparty_zlib_sources = [thirdparty_zlib_dir + file for file in thirdparty_zlib_sources]
+if env["builtin_zlib"]:
+    thirdparty_zlib_dir = "#thirdparty/zlib/"
+    thirdparty_zlib_sources = [
+        "adler32.c",
+        "compress.c",
+        "crc32.c",
+        "deflate.c",
+        "infback.c",
+        "inffast.c",
+        "inflate.c",
+        "inftrees.c",
+        "trees.c",
+        "uncompr.c",
+        "zutil.c",
+    ]
+    thirdparty_zlib_sources = [thirdparty_zlib_dir + file for file in thirdparty_zlib_sources]
 
-    env_thirdparty.Prepend(CPPPATH=[thirdparty_zlib_dir])
-    # Needs to be available in main env too
-    env.Prepend(CPPPATH=[thirdparty_zlib_dir])
-    if (env['target'] == 'debug'):
-        env_thirdparty.Append(CPPDEFINES=['ZLIB_DEBUG'])
+    env_thirdparty.Prepend(CPPPATH=[thirdparty_zlib_dir])
+    # Needs to be available in main env too
+    env.Prepend(CPPPATH=[thirdparty_zlib_dir])
+    if env["target"] == "debug":
+        env_thirdparty.Append(CPPDEFINES=["ZLIB_DEBUG"])
 
-    env_thirdparty.add_source_files(env.core_sources, thirdparty_zlib_sources)
+    env_thirdparty.add_source_files(env.core_sources, thirdparty_zlib_sources)
 
 # Minizip library, could be unbundled in theory
 # However, our version has some custom modifications, so it won't compile with the system one
@@ -99,7 +99,7 @@ env_thirdparty.add_source_files(env.core_sources, thirdparty_minizip_sources)
 # Zstd library, can be unbundled in theory
 # though we currently use some private symbols
 # https://github.com/godotengine/godot/issues/17374
-if env['builtin_zstd']:
+if env["builtin_zstd"]:
     thirdparty_zstd_dir = "#thirdparty/zstd/"
     thirdparty_zstd_sources = [
         "common/debug.c",
@@ -142,30 +142,43 @@ if env['builtin_zstd']:
 env.add_source_files(env.core_sources, "*.cpp")
 
 # Certificates
-env.Depends("#core/io/certs_compressed.gen.h", ["#thirdparty/certs/ca-certificates.crt", env.Value(env['builtin_certs']), env.Value(env['system_certs_path'])])
-env.CommandNoCache("#core/io/certs_compressed.gen.h", "#thirdparty/certs/ca-certificates.crt", run_in_subprocess(core_builders.make_certs_header))
+env.Depends(
+    "#core/io/certs_compressed.gen.h",
+    ["#thirdparty/certs/ca-certificates.crt", env.Value(env["builtin_certs"]), env.Value(env["system_certs_path"])],
+)
+env.CommandNoCache(
+    "#core/io/certs_compressed.gen.h",
+    "#thirdparty/certs/ca-certificates.crt",
+    run_in_subprocess(core_builders.make_certs_header),
+)
 
 # Make binders
-env.CommandNoCache(['method_bind.gen.inc', 'method_bind_ext.gen.inc', 'method_bind_free_func.gen.inc'], 'make_binders.py', run_in_subprocess(make_binders.run))
+env.CommandNoCache(
+    ["method_bind.gen.inc", "method_bind_ext.gen.inc", "method_bind_free_func.gen.inc"],
+    "make_binders.py",
+    run_in_subprocess(make_binders.run),
+)
 
 # Authors
-env.Depends('#core/authors.gen.h', "../AUTHORS.md")
-env.CommandNoCache('#core/authors.gen.h', "../AUTHORS.md", run_in_subprocess(core_builders.make_authors_header))
+env.Depends("#core/authors.gen.h", "../AUTHORS.md")
+env.CommandNoCache("#core/authors.gen.h", "../AUTHORS.md", run_in_subprocess(core_builders.make_authors_header))
 
 # Donors
-env.Depends('#core/donors.gen.h', "../DONORS.md")
-env.CommandNoCache('#core/donors.gen.h', "../DONORS.md", run_in_subprocess(core_builders.make_donors_header))
+env.Depends("#core/donors.gen.h", "../DONORS.md")
+env.CommandNoCache("#core/donors.gen.h", "../DONORS.md", run_in_subprocess(core_builders.make_donors_header))
 
 # License
-env.Depends('#core/license.gen.h', ["../COPYRIGHT.txt", "../LICENSE.txt"])
-env.CommandNoCache('#core/license.gen.h', ["../COPYRIGHT.txt", "../LICENSE.txt"], run_in_subprocess(core_builders.make_license_header))
+env.Depends("#core/license.gen.h", ["../COPYRIGHT.txt", "../LICENSE.txt"])
+env.CommandNoCache(
+    "#core/license.gen.h", ["../COPYRIGHT.txt", "../LICENSE.txt"], run_in_subprocess(core_builders.make_license_header)
+)
 
 # Chain load SCsubs
-SConscript('os/SCsub')
-SConscript('math/SCsub')
-SConscript('crypto/SCsub')
-SConscript('io/SCsub')
-SConscript('bind/SCsub')
+SConscript("os/SCsub")
+SConscript("math/SCsub")
+SConscript("crypto/SCsub")
+SConscript("io/SCsub")
+SConscript("bind/SCsub")
 
 
 # Build it all as a library

@@ -1,5 +1,5 @@
 #!/usr/bin/env python
 
-Import('env')
+Import("env")
 
 env.add_source_files(env.core_sources, "*.cpp")

@@ -16,6 +16,7 @@ def make_certs_header(target, source, env):
     buf = f.read()
     decomp_size = len(buf)
     import zlib
+
     buf = zlib.compress(buf)
 
     g.write("/* THIS FILE IS GENERATED DO NOT EDIT */\n")
@@ -23,9 +24,9 @@ def make_certs_header(target, source, env):
     g.write("#define _CERTS_RAW_H\n")
 
     # System certs path. Editor will use them if defined. (for package maintainers)
-    path = env['system_certs_path']
-    g.write("#define _SYSTEM_CERTS_PATH \"%s\"\n" % str(path))
-    if env['builtin_certs']:
+    path = env["system_certs_path"]
+    g.write('#define _SYSTEM_CERTS_PATH "%s"\n' % str(path))
+    if env["builtin_certs"]:
         # Defined here and not in env so changing it does not trigger a full rebuild.
         g.write("#define BUILTIN_CERTS_ENABLED\n")
         g.write("static const int _certs_compressed_size = " + str(len(buf)) + ";\n")
@@ -62,7 +63,7 @@ def make_authors_header(target, source, env):
     for line in f:
         if reading:
             if line.startswith("    "):
-                g.write("\t\"" + escape_string(line.strip()) + "\",\n")
+                g.write('\t"' + escape_string(line.strip()) + '",\n')
                 continue
         if line.startswith("## "):
             if reading:
@@ -85,10 +86,15 @@ def make_authors_header(target, source, env):
 
 
 def make_donors_header(target, source, env):
-    sections = ["Platinum sponsors", "Gold sponsors", "Mini sponsors",
-                "Gold donors", "Silver donors", "Bronze donors"]
-    sections_id = ["DONORS_SPONSOR_PLAT", "DONORS_SPONSOR_GOLD", "DONORS_SPONSOR_MINI",
-                   "DONORS_GOLD", "DONORS_SILVER", "DONORS_BRONZE"]
+    sections = ["Platinum sponsors", "Gold sponsors", "Mini sponsors", "Gold donors", "Silver donors", "Bronze donors"]
+    sections_id = [
+        "DONORS_SPONSOR_PLAT",
+        "DONORS_SPONSOR_GOLD",
+        "DONORS_SPONSOR_MINI",
+        "DONORS_GOLD",
+        "DONORS_SILVER",
+        "DONORS_BRONZE",
+    ]
 
     src = source[0]
     dst = target[0]
@@ -108,7 +114,7 @@ def make_donors_header(target, source, env):
     for line in f:
         if reading >= 0:
             if line.startswith("    "):
-                g.write("\t\"" + escape_string(line.strip()) + "\",\n")
+                g.write('\t"' + escape_string(line.strip()) + '",\n')
                 continue
         if line.startswith("## "):
             if reading:
@@ -151,8 +157,8 @@ def make_license_header(target, source, env):
             return line
 
         def next_tag(self):
-            if not ':' in self.current:
-                return ('', [])
+            if not ":" in self.current:
+                return ("", [])
             tag, line = self.current.split(":", 1)
             lines = [line.strip()]
             while self.next_line() and self.current.startswith(" "):
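The `next_tag` method above splits a `Tag: value` header from a debian/copyright-style COPYRIGHT.txt entry. A hypothetical single-line simplification (the real method also consumes indented continuation lines via `self.next_line()`):

```python
# Split one "Tag: value" line into (tag, [value]); unmatched lines yield ("", []).
def next_tag(line):
    if ":" not in line:
        return ("", [])
    tag, rest = line.split(":", 1)
    return (tag, [rest.strip()])
```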
@@ -160,6 +166,7 @@ def make_license_header(target, source, env):
             return (tag, lines)
 
     from collections import OrderedDict
+
     projects = OrderedDict()
     license_list = []
 
@@ -200,26 +207,30 @@ def make_license_header(target, source, env):
         with open_utf8(src_license, "r") as license_file:
             for line in license_file:
                 escaped_string = escape_string(line.strip())
-                f.write("\n\t\t\"" + escaped_string + "\\n\"")
+                f.write('\n\t\t"' + escaped_string + '\\n"')
         f.write(";\n\n")
 
-        f.write("struct ComponentCopyrightPart {\n"
-                "\tconst char *license;\n"
-                "\tconst char *const *files;\n"
-                "\tconst char *const *copyright_statements;\n"
-                "\tint file_count;\n"
-                "\tint copyright_count;\n"
-                "};\n\n")
+        f.write(
+            "struct ComponentCopyrightPart {\n"
+            "\tconst char *license;\n"
+            "\tconst char *const *files;\n"
+            "\tconst char *const *copyright_statements;\n"
+            "\tint file_count;\n"
+            "\tint copyright_count;\n"
+            "};\n\n"
+        )
 
-        f.write("struct ComponentCopyright {\n"
-                "\tconst char *name;\n"
-                "\tconst ComponentCopyrightPart *parts;\n"
-                "\tint part_count;\n"
-                "};\n\n")
+        f.write(
+            "struct ComponentCopyright {\n"
+            "\tconst char *name;\n"
+            "\tconst ComponentCopyrightPart *parts;\n"
+            "\tint part_count;\n"
+            "};\n\n"
+        )
 
         f.write("const char *const COPYRIGHT_INFO_DATA[] = {\n")
         for line in data_list:
-            f.write("\t\"" + escape_string(line) + "\",\n")
+            f.write('\t"' + escape_string(line) + '",\n')
         f.write("};\n\n")
 
         f.write("const ComponentCopyrightPart COPYRIGHT_PROJECT_PARTS[] = {\n")
@@ -228,11 +239,21 @@ def make_license_header(target, source, env):
         for project_name, project in iteritems(projects):
             part_indexes[project_name] = part_index
             for part in project:
-                f.write("\t{ \"" + escape_string(part["License"][0]) + "\", "
-                        + "&COPYRIGHT_INFO_DATA[" + str(part["file_index"]) + "], "
-                        + "&COPYRIGHT_INFO_DATA[" + str(part["copyright_index"]) + "], "
-                        + str(len(part["Files"])) + ", "
-                        + str(len(part["Copyright"])) + " },\n")
+                f.write(
+                    '\t{ "'
+                    + escape_string(part["License"][0])
+                    + '", '
+                    + "&COPYRIGHT_INFO_DATA["
+                    + str(part["file_index"])
+                    + "], "
+                    + "&COPYRIGHT_INFO_DATA["
+                    + str(part["copyright_index"])
+                    + "], "
+                    + str(len(part["Files"]))
+                    + ", "
+                    + str(len(part["Copyright"]))
+                    + " },\n"
+                )
                 part_index += 1
         f.write("};\n\n")
 
@ -240,30 +261,37 @@ def make_license_header(target, source, env):
|
|||
|
||||
f.write("const ComponentCopyright COPYRIGHT_INFO[] = {\n")
|
||||
for project_name, project in iteritems(projects):
|
||||
f.write("\t{ \"" + escape_string(project_name) + "\", "
|
||||
+ "©RIGHT_PROJECT_PARTS[" + str(part_indexes[project_name]) + "], "
|
||||
+ str(len(project)) + " },\n")
|
||||
f.write(
|
||||
'\t{ "'
|
||||
+ escape_string(project_name)
|
||||
+ '", '
|
||||
+ "©RIGHT_PROJECT_PARTS["
|
||||
+ str(part_indexes[project_name])
|
||||
+ "], "
|
||||
+ str(len(project))
|
||||
+ " },\n"
|
||||
)
|
||||
f.write("};\n\n")
|
||||
|
||||
f.write("const int LICENSE_COUNT = " + str(len(license_list)) + ";\n")
|
||||
|
||||
f.write("const char *const LICENSE_NAMES[] = {\n")
|
||||
for l in license_list:
|
||||
f.write("\t\"" + escape_string(l[0]) + "\",\n")
|
||||
f.write('\t"' + escape_string(l[0]) + '",\n')
|
||||
f.write("};\n\n")
|
||||
|
||||
f.write("const char *const LICENSE_BODIES[] = {\n\n")
|
||||
for l in license_list:
|
||||
for line in l[1:]:
|
||||
if line == ".":
|
||||
f.write("\t\"\\n\"\n")
|
||||
f.write('\t"\\n"\n')
|
||||
else:
|
||||
f.write("\t\"" + escape_string(line) + "\\n\"\n")
|
||||
f.write("\t\"\",\n\n")
|
||||
f.write('\t"' + escape_string(line) + '\\n"\n')
|
||||
f.write('\t"",\n\n')
|
||||
f.write("};\n\n")
|
||||
|
||||
f.write("#endif\n")
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
if __name__ == "__main__":
|
||||
subprocess_main(globals())
|
||||
|
|
|
@@ -1,6 +1,6 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 env_crypto = env.Clone()
@@ -22,7 +22,9 @@ if not has_module:
     env_thirdparty = env_crypto.Clone()
     env_thirdparty.disable_warnings()
     # Custom config file
-    env_thirdparty.Append(CPPDEFINES=[('MBEDTLS_CONFIG_FILE', '\\"thirdparty/mbedtls/include/godot_core_mbedtls_config.h\\"')])
+    env_thirdparty.Append(
+        CPPDEFINES=[("MBEDTLS_CONFIG_FILE", '\\"thirdparty/mbedtls/include/godot_core_mbedtls_config.h\\"')]
+    )
     thirdparty_mbedtls_dir = "#thirdparty/mbedtls/library/"
     thirdparty_mbedtls_sources = [
         "aes.c",
@@ -30,7 +32,7 @@ if not has_module:
         "md5.c",
         "sha1.c",
         "sha256.c",
-        "godot_core_mbedtls_platform.c"
+        "godot_core_mbedtls_platform.c",
     ]
     thirdparty_mbedtls_sources = [thirdparty_mbedtls_dir + file for file in thirdparty_mbedtls_sources]
     env_thirdparty.add_source_files(env.core_sources, thirdparty_mbedtls_sources)

@@ -67,7 +67,7 @@ Crypto *(*Crypto::_create)() = NULL;
 Crypto *Crypto::create() {
	if (_create)
		return _create();
-	return memnew(Crypto);
+	ERR_FAIL_V_MSG(NULL, "Crypto is not available when the mbedtls module is disabled.");
 }

 void Crypto::load_default_certificates(String p_path) {
@@ -82,18 +82,6 @@ void Crypto::_bind_methods() {
	ClassDB::bind_method(D_METHOD("generate_self_signed_certificate", "key", "issuer_name", "not_before", "not_after"), &Crypto::generate_self_signed_certificate, DEFVAL("CN=myserver,O=myorganisation,C=IT"), DEFVAL("20140101000000"), DEFVAL("20340101000000"));
 }

-PoolByteArray Crypto::generate_random_bytes(int p_bytes) {
-	ERR_FAIL_V_MSG(PoolByteArray(), "generate_random_bytes is not available when mbedtls module is disabled.");
-}
-
-Ref<CryptoKey> Crypto::generate_rsa(int p_bytes) {
-	ERR_FAIL_V_MSG(NULL, "generate_rsa is not available when mbedtls module is disabled.");
-}
-
-Ref<X509Certificate> Crypto::generate_self_signed_certificate(Ref<CryptoKey> p_key, String p_issuer_name, String p_not_before, String p_not_after) {
-	ERR_FAIL_V_MSG(NULL, "generate_self_signed_certificate is not available when mbedtls module is disabled.");
-}
-
 Crypto::Crypto() {
 }

@@ -76,9 +76,9 @@ public:
	static Crypto *create();
	static void load_default_certificates(String p_path);

-	virtual PoolByteArray generate_random_bytes(int p_bytes);
-	virtual Ref<CryptoKey> generate_rsa(int p_bytes);
-	virtual Ref<X509Certificate> generate_self_signed_certificate(Ref<CryptoKey> p_key, String p_issuer_name, String p_not_before, String p_not_after);
+	virtual PoolByteArray generate_random_bytes(int p_bytes) = 0;
+	virtual Ref<CryptoKey> generate_rsa(int p_bytes) = 0;
+	virtual Ref<X509Certificate> generate_self_signed_certificate(Ref<CryptoKey> p_key, String p_issuer_name, String p_not_before, String p_not_after) = 0;

	Crypto();
 };

@@ -2223,12 +2223,11 @@ void Image::blend_rect(const Ref<Image> &p_src, const Rect2 &p_src_rect, const P
			int dst_y = dest_rect.position.y + i;

			Color sc = img->get_pixel(src_x, src_y);
-			Color dc = get_pixel(dst_x, dst_y);
-			dc.r = (double)(sc.a * sc.r + dc.a * (1.0 - sc.a) * dc.r);
-			dc.g = (double)(sc.a * sc.g + dc.a * (1.0 - sc.a) * dc.g);
-			dc.b = (double)(sc.a * sc.b + dc.a * (1.0 - sc.a) * dc.b);
-			dc.a = (double)(sc.a + dc.a * (1.0 - sc.a));
-			set_pixel(dst_x, dst_y, dc);
+			if (sc.a != 0) {
+				Color dc = get_pixel(dst_x, dst_y);
+				dc = dc.blend(sc);
+				set_pixel(dst_x, dst_y, dc);
+			}
		}
	}
@@ -2285,12 +2284,11 @@ void Image::blend_rect_mask(const Ref<Image> &p_src, const Ref<Image> &p_mask, c
			int dst_y = dest_rect.position.y + i;

			Color sc = img->get_pixel(src_x, src_y);
-			Color dc = get_pixel(dst_x, dst_y);
-			dc.r = (double)(sc.a * sc.r + dc.a * (1.0 - sc.a) * dc.r);
-			dc.g = (double)(sc.a * sc.g + dc.a * (1.0 - sc.a) * dc.g);
-			dc.b = (double)(sc.a * sc.b + dc.a * (1.0 - sc.a) * dc.b);
-			dc.a = (double)(sc.a + dc.a * (1.0 - sc.a));
-			set_pixel(dst_x, dst_y, dc);
+			if (sc.a != 0) {
+				Color dc = get_pixel(dst_x, dst_y);
+				dc = dc.blend(sc);
+				set_pixel(dst_x, dst_y, dc);
+			}
		}
	}
 }
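The old per-channel math above composited without normalizing by the resulting alpha; the new code defers to `Color::blend` instead. A plain-Python sketch of the "source over destination" compositing that `dc.blend(sc)` performs (the `blend` helper and tuple colors here are illustrative stand-ins, not Godot API):

```python
def blend(dst, src):
    """Composite src over dst; both are (r, g, b, a) tuples with components in [0, 1].

    Mirrors the alpha-weighted, normalized blend that Color::blend() performs,
    unlike the removed per-channel code which skipped the normalization.
    """
    sr, sg, sb, sa = src
    dr, dg, db, da = dst
    inv = 1.0 - sa
    oa = da * inv + sa  # resulting alpha
    if oa == 0:
        return (0.0, 0.0, 0.0, 0.0)  # fully transparent result
    return (
        (dr * da * inv + sr * sa) / oa,
        (dg * da * inv + sg * sa) / oa,
        (db * da * inv + sb * sa) / oa,
        oa,
    )
```

The `sc.a != 0` guard in the new C++ code is an early-out for the fully transparent source case, where the destination pixel is left untouched.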

@@ -1,5 +1,5 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 env.add_source_files(env.core_sources, "*.cpp")

@@ -37,7 +37,10 @@ bool DTLSServer::available = false;

 DTLSServer *DTLSServer::create() {

-	return _create();
+	if (_create) {
+		return _create();
+	}
+	return NULL;
 }

 bool DTLSServer::is_available() {

@@ -37,7 +37,10 @@ bool PacketPeerDTLS::available = false;

 PacketPeerDTLS *PacketPeerDTLS::create() {

-	return _create();
+	if (_create) {
+		return _create();
+	}
+	return NULL;
 }

 bool PacketPeerDTLS::is_available() {

@@ -280,58 +280,57 @@ MethodBind* create_method_bind($ifret R$ $ifnoret void$ (*p_method)($ifconst con
 """


 def make_version(template, nargs, argmax, const, ret):

     intext = template
     from_pos = 0
     outtext = ""

-    while(True):
+    while True:
         to_pos = intext.find("$", from_pos)
-        if (to_pos == -1):
+        if to_pos == -1:
             outtext += intext[from_pos:]
             break
         else:
             outtext += intext[from_pos:to_pos]
         end = intext.find("$", to_pos + 1)
-        if (end == -1):
+        if end == -1:
             break # ignore
-        macro = intext[to_pos + 1:end]
+        macro = intext[to_pos + 1 : end]
         cmd = ""
         data = ""

-        if (macro.find(" ") != -1):
-            cmd = macro[0:macro.find(" ")]
-            data = macro[macro.find(" ") + 1:]
+        if macro.find(" ") != -1:
+            cmd = macro[0 : macro.find(" ")]
+            data = macro[macro.find(" ") + 1 :]
         else:
             cmd = macro

-        if (cmd == "argc"):
+        if cmd == "argc":
             outtext += str(nargs)
-        if (cmd == "ifret" and ret):
+        if cmd == "ifret" and ret:
             outtext += data
-        if (cmd == "ifargs" and nargs):
+        if cmd == "ifargs" and nargs:
             outtext += data
-        if (cmd == "ifretargs" and nargs and ret):
+        if cmd == "ifretargs" and nargs and ret:
             outtext += data
-        if (cmd == "ifconst" and const):
+        if cmd == "ifconst" and const:
             outtext += data
-        elif (cmd == "ifnoconst" and not const):
+        elif cmd == "ifnoconst" and not const:
             outtext += data
-        elif (cmd == "ifnoret" and not ret):
+        elif cmd == "ifnoret" and not ret:
             outtext += data
-        elif (cmd == "iftempl" and (nargs > 0 or ret)):
+        elif cmd == "iftempl" and (nargs > 0 or ret):
             outtext += data
-        elif (cmd == "arg,"):
+        elif cmd == "arg,":
             for i in range(1, nargs + 1):
-                if (i > 1):
+                if i > 1:
                     outtext += ", "
                 outtext += data.replace("@", str(i))
-        elif (cmd == "arg"):
+        elif cmd == "arg":
             for i in range(1, nargs + 1):
                 outtext += data.replace("@", str(i))
-        elif (cmd == "noarg"):
+        elif cmd == "noarg":
             for i in range(nargs + 1, argmax + 1):
                 outtext += data.replace("@", str(i))
@@ -348,7 +347,9 @@ def run(target, source, env):
     text_ext = ""
     text_free_func = "#ifndef METHOD_BIND_FREE_FUNC_H\n#define METHOD_BIND_FREE_FUNC_H\n"
     text_free_func += "\n//including this header file allows method binding to use free functions\n"
-    text_free_func += "//note that the free function must have a pointer to an instance of the class as its first parameter\n"
+    text_free_func += (
+        "//note that the free function must have a pointer to an instance of the class as its first parameter\n"
+    )

     for i in range(0, versions + 1):
@@ -361,7 +362,7 @@ def run(target, source, env):
         t += make_version(template_typed, i, versions, True, False)
         t += make_version(template, i, versions, True, True)
         t += make_version(template_typed, i, versions, True, True)
-        if (i >= versions_ext):
+        if i >= versions_ext:
             text_ext += t
         else:
             text += t
@@ -383,6 +384,7 @@ def run(target, source, env):
     f.write(text_free_func)


-if __name__ == '__main__':
+if __name__ == "__main__":
     from platform_methods import subprocess_main
+
     subprocess_main(globals())
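For readers unfamiliar with this generator: `make_version` expands `$cmd$` and `$cmd data$` macros inside the C++ template, substituting `@` in the data with the argument index. A simplified, illustrative Python sketch of that expansion (only a few of the script's commands are reproduced, and `expand` is a hypothetical name, not part of the script):

```python
def expand(template, nargs, ret):
    """Expand "$cmd data$" macros in template for a bind with nargs arguments."""
    out = ""
    pos = 0
    while True:
        start = template.find("$", pos)
        if start == -1:
            out += template[pos:]  # no more macros: copy the tail
            break
        out += template[pos:start]
        end = template.find("$", start + 1)
        if end == -1:
            break  # unterminated macro: drop the rest, as the script does
        macro = template[start + 1 : end]
        cmd, _, data = macro.partition(" ")
        if cmd == "argc":
            out += str(nargs)  # number of arguments
        elif cmd == "ifret" and ret:
            out += data  # emitted only for binds with a return value
        elif cmd == "arg,":
            # Expand once per argument, comma-separated; "@" is the index.
            out += ", ".join(data.replace("@", str(i)) for i in range(1, nargs + 1))
        pos = end + 1
    return out
```

For example, `expand("f($arg, P@ p@$)", 2, False)` yields `"f(P1 p1, P2 p2)"`, which is how the generator stamps out one `MethodBind` variant per argument count.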

@@ -1,6 +1,6 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 env_math = env.Clone()

@@ -292,10 +292,16 @@ int AStar::get_closest_point(const Vector3 &p_point, bool p_include_disabled) co

		if (!p_include_disabled && !(*it.value)->enabled) continue; // Disabled points should not be considered.

+		// Keep the closest point's ID, and in case of multiple closest IDs,
+		// the smallest one (makes it deterministic).
		real_t d = p_point.distance_squared_to((*it.value)->pos);
-		if (closest_id < 0 || d < closest_dist) {
+		int id = *(it.key);
+		if (d <= closest_dist) {
+			if (d == closest_dist && id > closest_id) { // Keep lowest ID.
+				continue;
+			}
			closest_dist = d;
-			closest_id = *(it.key);
+			closest_id = id;
		}
	}
@@ -304,7 +310,6 @@ int AStar::get_closest_point(const Vector3 &p_point, bool p_include_disabled) co

 Vector3 AStar::get_closest_position_in_segment(const Vector3 &p_point) const {

-	bool found = false;
	real_t closest_dist = 1e20;
	Vector3 closest_point;
@@ -325,11 +330,10 @@ Vector3 AStar::get_closest_position_in_segment(const Vector3 &p_point) const {

		Vector3 p = Geometry::get_closest_point_to_segment(p_point, segment);
		real_t d = p_point.distance_squared_to(p);
-		if (!found || d < closest_dist) {
+		if (d < closest_dist) {

			closest_point = p;
			closest_dist = d;
-			found = true;
		}
	}
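The tie-breaking rule introduced above can be sketched in Python; among equally close points, the smallest ID wins, making the result deterministic regardless of iteration order. The `points` dict and the helper name below are illustrative stand-ins for the `AStar` internals:

```python
def get_closest_point(points, target):
    """points maps id -> (x, y, z); return the id closest to target, -1 if empty."""
    closest_id = -1
    closest_dist = float("inf")
    for pid, pos in points.items():
        # Squared distance is enough for comparisons (no sqrt needed).
        d = sum((a - b) ** 2 for a, b in zip(pos, target))
        if d < closest_dist or (d == closest_dist and pid < closest_id):
            closest_dist = d
            closest_id = pid
    return closest_id
```

With two points at the exact same position, the old code kept whichever the hash map happened to iterate last; this version (like the patched C++) always returns the lower ID.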

@@ -2132,7 +2132,7 @@ void ObjectDB::cleanup() {
	rw_lock->write_lock();
	if (instances.size()) {

-		WARN_PRINT("ObjectDB Instances still exist!");
+		WARN_PRINT("ObjectDB instances leaked at exit (run with --verbose for details).");
		if (OS::get_singleton()->is_stdout_verbose()) {
			const ObjectID *K = NULL;
			while ((K = instances.next(K))) {
@@ -2141,9 +2141,10 @@ void ObjectDB::cleanup() {
				if (instances[*K]->is_class("Node"))
					node_name = " - Node name: " + String(instances[*K]->call("get_name"));
				if (instances[*K]->is_class("Resource"))
-					node_name = " - Resource name: " + String(instances[*K]->call("get_name")) + " Path: " + String(instances[*K]->call("get_path"));
+					node_name = " - Resource path: " + String(instances[*K]->call("get_path"));
				print_line("Leaked instance: " + String(instances[*K]->get_class()) + ":" + itos(*K) + node_name);
			}
+			print_line("Hint: Leaked instances typically happen when nodes are removed from the scene tree (with `remove_child()`) but not freed (with `free()` or `queue_free()`).");
		}
	}
	instances.clear();

@@ -1,5 +1,5 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 env.add_source_files(env.core_sources, "*.cpp")

@@ -33,6 +33,7 @@
 #include "core/core_string_names.h"
 #include "core/io/resource_loader.h"
 #include "core/os/file_access.h"
+#include "core/os/os.h"
 #include "core/script_language.h"
 #include "scene/main/node.h" //only so casting works
@@ -472,21 +473,22 @@ void ResourceCache::setup() {
 }

 void ResourceCache::clear() {
-	if (resources.size())
-		ERR_PRINT("Resources Still in use at Exit!");
+	if (resources.size()) {
+		ERR_PRINT("Resources still in use at exit (run with --verbose for details).");
+		if (OS::get_singleton()->is_stdout_verbose()) {
+			const String *K = nullptr;
+			while ((K = resources.next(K))) {
+				Resource *r = resources[*K];
+				print_line(vformat("Resource still in use: %s (%s)", *K, r->get_class()));
+			}
+		}
+	}

	resources.clear();
	memdelete(lock);
 }

 void ResourceCache::reload_externals() {
-
-	/*
-	const String *K=NULL;
-	while ((K=resources.next(K))) {
-		resources[*K]->reload_external_data();
-	}
-	*/
 }

 bool ResourceCache::has(const String &p_path) {
@@ -573,6 +575,5 @@ void ResourceCache::dump(const char *p_file, bool p_short) {
	}

	lock->read_unlock();
-
 #endif
 }

@@ -485,13 +485,6 @@ Error VariantParser::_parse_construct(Stream *p_stream, Vector<T> &r_construct,
 }

 Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream, int &line, String &r_err_str, ResourceParser *p_res_parser) {
-
-	/* {
-		Error err = get_token(p_stream,token,line,r_err_str);
-		if (err)
-			return err;
-	}*/
-
	if (token.type == TK_CURLY_BRACKET_OPEN) {

		Dictionary d;
@@ -508,7 +501,6 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,
			return err;
		value = a;
		return OK;
-
	} else if (token.type == TK_IDENTIFIER) {

		String id = token.value;
@@ -531,10 +523,10 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,

			if (args.size() != 2) {
				r_err_str = "Expected 2 arguments for constructor";
+				return ERR_PARSE_ERROR;
			}

			value = Vector2(args[0], args[1]);
-
			return OK;
		} else if (id == "Rect2") {

			Vector<float> args;
@@ -544,10 +536,10 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,

			if (args.size() != 4) {
				r_err_str = "Expected 4 arguments for constructor";
+				return ERR_PARSE_ERROR;
			}

			value = Rect2(args[0], args[1], args[2], args[3]);
-
			return OK;
		} else if (id == "Vector3") {

			Vector<float> args;
@@ -557,12 +549,11 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,

			if (args.size() != 3) {
				r_err_str = "Expected 3 arguments for constructor";
+				return ERR_PARSE_ERROR;
			}

			value = Vector3(args[0], args[1], args[2]);
-
			return OK;
		} else if (id == "Transform2D" || id == "Matrix32") { //compatibility
-
			Vector<float> args;
			Error err = _parse_construct<float>(p_stream, args, line, r_err_str);
			if (err)
@@ -570,13 +561,14 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,

			if (args.size() != 6) {
				r_err_str = "Expected 6 arguments for constructor";
+				return ERR_PARSE_ERROR;
			}

			Transform2D m;
			m[0] = Vector2(args[0], args[1]);
			m[1] = Vector2(args[2], args[3]);
			m[2] = Vector2(args[4], args[5]);
			value = m;
			return OK;
		} else if (id == "Plane") {

			Vector<float> args;
@@ -586,10 +578,10 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,

			if (args.size() != 4) {
				r_err_str = "Expected 4 arguments for constructor";
+				return ERR_PARSE_ERROR;
			}

			value = Plane(args[0], args[1], args[2], args[3]);
-
			return OK;
		} else if (id == "Quat") {

			Vector<float> args;
@@ -599,11 +591,10 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,

			if (args.size() != 4) {
				r_err_str = "Expected 4 arguments for constructor";
+				return ERR_PARSE_ERROR;
			}

			value = Quat(args[0], args[1], args[2], args[3]);
-
			return OK;
-
		} else if (id == "AABB" || id == "Rect3") {

			Vector<float> args;
@@ -613,13 +604,11 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,

			if (args.size() != 6) {
				r_err_str = "Expected 6 arguments for constructor";
+				return ERR_PARSE_ERROR;
			}

			value = AABB(Vector3(args[0], args[1], args[2]), Vector3(args[3], args[4], args[5]));
-
			return OK;
-
		} else if (id == "Basis" || id == "Matrix3") { //compatibility
-
			Vector<float> args;
			Error err = _parse_construct<float>(p_stream, args, line, r_err_str);
			if (err)
@@ -627,10 +616,10 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,

			if (args.size() != 9) {
				r_err_str = "Expected 9 arguments for constructor";
+				return ERR_PARSE_ERROR;
			}

			value = Basis(args[0], args[1], args[2], args[3], args[4], args[5], args[6], args[7], args[8]);
-
			return OK;
		} else if (id == "Transform") {

			Vector<float> args;
@@ -640,11 +629,10 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,

			if (args.size() != 12) {
				r_err_str = "Expected 12 arguments for constructor";
+				return ERR_PARSE_ERROR;
			}

			value = Transform(Basis(args[0], args[1], args[2], args[3], args[4], args[5], args[6], args[7], args[8]), Vector3(args[9], args[10], args[11]));
-
			return OK;
-
		} else if (id == "Color") {

			Vector<float> args;
@@ -654,11 +642,10 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,

			if (args.size() != 4) {
				r_err_str = "Expected 4 arguments for constructor";
+				return ERR_PARSE_ERROR;
			}

			value = Color(args[0], args[1], args[2], args[3]);
-
			return OK;
-
		} else if (id == "NodePath") {

			get_token(p_stream, token, line, r_err_str);
@@ -680,7 +667,6 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,
				r_err_str = "Expected ')'";
				return ERR_PARSE_ERROR;
			}
-
		} else if (id == "RID") {

			get_token(p_stream, token, line, r_err_str);
@@ -702,8 +688,6 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,
				r_err_str = "Expected ')'";
				return ERR_PARSE_ERROR;
			}
-
			return OK;
-
		} else if (id == "Object") {

			get_token(p_stream, token, line, r_err_str);
@@ -806,7 +790,6 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,
					at_key = true;
				}
			}
-
		} else if (id == "Resource" || id == "SubResource" || id == "ExtResource") {

			get_token(p_stream, token, line, r_err_str);
@@ -823,8 +806,6 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,
				return err;

			value = res;
-
			return OK;
-
		} else if (p_res_parser && id == "ExtResource" && p_res_parser->ext_func) {

			RES res;
@@ -833,8 +814,6 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,
				return err;

			value = res;
-
			return OK;
-
		} else if (p_res_parser && id == "SubResource" && p_res_parser->sub_func) {

			RES res;
@@ -843,8 +822,6 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,
				return err;

			value = res;
-
			return OK;
-
		} else {

			get_token(p_stream, token, line, r_err_str);
@@ -863,8 +840,6 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,
			}

			value = res;
-
			return OK;
-
		} else {
			r_err_str = "Expected string as argument for Resource().";
			return ERR_PARSE_ERROR;
@@ -1059,8 +1034,6 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,
			}

			value = ie;
-
			return OK;
-
 #endif
		} else if (id == "PoolByteArray" || id == "ByteArray") {

@@ -1081,8 +1054,6 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,

			value = arr;
-
			return OK;
-
		} else if (id == "PoolIntArray" || id == "IntArray") {

			Vector<int> args;
@@ -1102,8 +1073,6 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,

			value = arr;
-
			return OK;
-
		} else if (id == "PoolRealArray" || id == "FloatArray") {

			Vector<float> args;
@@ -1123,7 +1092,6 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,

			value = arr;
-
			return OK;
		} else if (id == "PoolStringArray" || id == "StringArray") {

			get_token(p_stream, token, line, r_err_str);
@@ -1173,8 +1141,6 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,

			value = arr;
-
			return OK;
-
		} else if (id == "PoolVector2Array" || id == "Vector2Array") {

			Vector<float> args;
@@ -1194,8 +1160,6 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,

			value = arr;
-
			return OK;
-
		} else if (id == "PoolVector3Array" || id == "Vector3Array") {

			Vector<float> args;
@@ -1215,8 +1179,6 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,

			value = arr;
-
			return OK;
-
		} else if (id == "PoolColorArray" || id == "ColorArray") {

			Vector<float> args;
@@ -1235,15 +1197,13 @@ Error VariantParser::parse_value(Token &token, Variant &value, Stream *p_stream,
			}

			value = arr;
-
			return OK;
+		} else {
+			r_err_str = "Unexpected identifier: '" + id + "'.";
+			return ERR_PARSE_ERROR;
		}

+		// All above branches end up here unless they had an early return.
		return OK;
-
	} else if (token.type == TK_NUMBER) {

		value = token.value;

@@ -131,7 +131,8 @@
		<argument index="1" name="include_disabled" type="bool" default="false">
		</argument>
		<description>
-			Returns the ID of the closest point to [code]to_position[/code], optionally taking disabled points into account. Returns -1 if there are no points in the points pool.
+			Returns the ID of the closest point to [code]to_position[/code], optionally taking disabled points into account. Returns [code]-1[/code] if there are no points in the points pool.
+			[b]Note:[/b] If several points are the closest to [code]to_position[/code], the one with the smallest ID will be returned, ensuring a deterministic result.
		</description>
	</method>
	<method name="get_closest_position_in_segment" qualifiers="const">

@@ -114,7 +114,8 @@
		<argument index="1" name="include_disabled" type="bool" default="false">
		</argument>
		<description>
-			Returns the ID of the closest point to [code]to_position[/code], optionally taking disabled points into account. Returns -1 if there are no points in the points pool.
+			Returns the ID of the closest point to [code]to_position[/code], optionally taking disabled points into account. Returns [code]-1[/code] if there are no points in the points pool.
+			[b]Note:[/b] If several points are the closest to [code]to_position[/code], the one with the smallest ID will be returned, ensuring a deterministic result.
		</description>
	</method>
	<method name="get_closest_position_in_segment" qualifiers="const">

@@ -146,6 +146,7 @@
			# The line below prints `true`, whereas it would have printed `false` if both variables were compared directly.
			print(dict1.hash() == dict2.hash())
			[/codeblock]
+			[b]Note:[/b] Dictionaries with the same keys/values but in a different order will have a different hash.
		</description>
	</method>
	<method name="keys">
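A Python analog of the behavior the added note describes: a hash that folds entries in order gives different results for the same key/value pairs stored in a different order. (Python dicts are themselves unhashable, so tuples of items stand in for Godot's insertion-ordered Dictionary here; this is an illustration, not Godot API.)

```python
d1 = {"a": 1, "b": 2}  # insertion order: a, b
d2 = {"b": 2, "a": 1}  # same pairs, insertion order: b, a

same_contents = d1 == d2              # True: equality ignores order
h1 = hash(tuple(d1.items()))          # folds ("a", 1) then ("b", 2)
h2 = hash(tuple(d2.items()))          # folds ("b", 2) then ("a", 1)
order_sensitive = h1 != h2            # order-sensitive hashes differ
```

This mirrors why `dict1.hash() == dict2.hash()` in the GDScript snippet above only holds when the two dictionaries were built in the same order.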

@@ -54,28 +54,28 @@
			<return type="int">
			</return>
			<description>
-				Returns the next 16 bits from the file as an integer.
+				Returns the next 16 bits from the file as an integer. See [method store_16] for details on what values can be stored and retrieved this way.
			</description>
		</method>
		<method name="get_32" qualifiers="const">
			<return type="int">
			</return>
			<description>
-				Returns the next 32 bits from the file as an integer.
+				Returns the next 32 bits from the file as an integer. See [method store_32] for details on what values can be stored and retrieved this way.
			</description>
		</method>
		<method name="get_64" qualifiers="const">
			<return type="int">
			</return>
			<description>
-				Returns the next 64 bits from the file as an integer.
+				Returns the next 64 bits from the file as an integer. See [method store_64] for details on what values can be stored and retrieved this way.
			</description>
		</method>
		<method name="get_8" qualifiers="const">
			<return type="int">
			</return>
			<description>
-				Returns the next 8 bits from the file as an integer.
+				Returns the next 8 bits from the file as an integer. See [method store_8] for details on what values can be stored and retrieved this way.
			</description>
		</method>
		<method name="get_as_text" qualifiers="const">
@@ -297,7 +297,26 @@
		</argument>
		<description>
			Stores an integer as 16 bits in the file.
-			[b]Note:[/b] The [code]value[/code] should lie in the interval [code][0, 2^16 - 1][/code].
+			[b]Note:[/b] The [code]value[/code] should lie in the interval [code][0, 2^16 - 1][/code]. Any other value will overflow and wrap around.
+			To store a signed integer, use [method store_64] or store a signed integer from the interval [code][-2^15, 2^15 - 1][/code] (i.e. keeping one bit for the signedness) and compute its sign manually when reading. For example:
+			[codeblock]
+			const MAX_15B = 1 << 15
+			const MAX_16B = 1 << 16
+
+			func unsigned16_to_signed(unsigned):
+			    return (unsigned + MAX_15B) % MAX_16B - MAX_15B
+
+			func _ready():
+			    var f = File.new()
+			    f.open("user://file.dat", File.WRITE_READ)
+			    f.store_16(-42) # This wraps around and stores 65494 (2^16 - 42).
+			    f.store_16(121) # In bounds, will store 121.
+			    f.seek(0) # Go back to start to read the stored value.
+			    var read1 = f.get_16() # 65494
+			    var read2 = f.get_16() # 121
+			    var converted1 = unsigned16_to_signed(read1) # -42
+			    var converted2 = unsigned16_to_signed(read2) # 121
+			[/codeblock]
		</description>
	</method>
	<method name="store_32">
@@ -307,7 +326,8 @@
		</argument>
		<description>
			Stores an integer as 32 bits in the file.
-			[b]Note:[/b] The [code]value[/code] should lie in the interval [code][0, 2^32 - 1][/code].
+			[b]Note:[/b] The [code]value[/code] should lie in the interval [code][0, 2^32 - 1][/code]. Any other value will overflow and wrap around.
+			To store a signed integer, use [method store_64], or convert it manually (see [method store_16] for an example).
		</description>
	</method>
	<method name="store_64">
@@ -327,7 +347,8 @@
		</argument>
		<description>
			Stores an integer as 8 bits in the file.
-			[b]Note:[/b] The [code]value[/code] should lie in the interval [code][0, 255][/code].
+			[b]Note:[/b] The [code]value[/code] should lie in the interval [code][0, 255][/code]. Any other value will overflow and wrap around.
+			To store a signed integer, use [method store_64], or convert it manually (see [method store_16] for an example).
		</description>
	</method>
	<method name="store_buffer">
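The wrap-around and sign-recovery logic documented above is not GDScript-specific; the same arithmetic can be checked in plain Python (`store_16` here is a sketch of what ends up in the file, not the Godot API call):

```python
MAX_15B = 1 << 15  # 32768
MAX_16B = 1 << 16  # 65536

def store_16(value):
    """What store_16 actually writes: the value reduced modulo 2^16."""
    return value % MAX_16B

def unsigned16_to_signed(unsigned):
    """Reinterpret an unsigned 16-bit value as two's-complement signed."""
    return (unsigned + MAX_15B) % MAX_16B - MAX_15B
```

So `store_16(-42)` wraps to `65494`, and `unsigned16_to_signed(65494)` recovers `-42`, exactly as the doc example walks through.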

@@ -132,6 +132,12 @@
		</constant>
	</constants>
	<theme_items>
+		<theme_item name="file" type="Texture">
+			Custom icon for files.
+		</theme_item>
+		<theme_item name="file_icon_modulate" type="Color" default="Color( 1, 1, 1, 1 )">
+			The color modulation applied to the file icon.
+		</theme_item>
		<theme_item name="files_disabled" type="Color" default="Color( 0, 0, 0, 0.7 )">
			The color tint for disabled files (when the [FileDialog] is used in open folder mode).
		</theme_item>

@@ -151,7 +151,7 @@
 Deprecated, use [member PhysicsMaterial.bounce] instead via [member physics_material_override].
 </member>
 <member name="can_sleep" type="bool" setter="set_can_sleep" getter="is_able_to_sleep" default="true">
-If [code]true[/code], the RigidBody will not calculate forces and will act as a static body while there is no movement. It will wake up when forces are applied through other collisions or when the [code]apply_impulse[/code] method is used.
+If [code]true[/code], the body can enter sleep mode when there is no movement. See [member sleeping].
 </member>
 <member name="contact_monitor" type="bool" setter="set_contact_monitor" getter="is_contact_monitor_enabled" default="false">
 If [code]true[/code], the RigidBody will emit signals when it collides with another RigidBody.
@@ -190,7 +190,7 @@
 If a material is assigned to this property, it will be used instead of any other physics material, such as an inherited one.
 </member>
 <member name="sleeping" type="bool" setter="set_sleeping" getter="is_sleeping" default="false">
-If [code]true[/code], the body is sleeping and will not calculate forces until woken up by a collision or the [code]apply_impulse[/code] method.
+If [code]true[/code], the body will not move and will not calculate forces until woken up by another body through, for example, a collision, or by using the [method apply_impulse] or [method add_force] methods.
 </member>
 <member name="weight" type="float" setter="set_weight" getter="get_weight" default="9.8">
 The body's weight based on its mass and the global 3D gravity. Global values are set in [b]Project > Project Settings > Physics > 3d[/b].
@@ -241,7 +241,8 @@
 </signal>
 <signal name="sleeping_state_changed">
 <description>
-Emitted when the body changes its sleeping state. Either by sleeping or waking up.
+Emitted when the physics engine changes the body's sleeping state.
+[b]Note:[/b] Changing the value [member sleeping] will not trigger this signal. It is only emitted if the sleeping state is changed by the physics engine or [code]emit_signal("sleeping_state_changed")[/code] is used.
 </description>
 </signal>
 </signals>
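The interaction between `can_sleep`, `sleeping`, and `apply_impulse` described in these docs can be modeled with a small state sketch (hypothetical Python, not engine code; the threshold value is an assumption for illustration):

```python
class Body:
    """Toy model of a rigid body's sleep state."""

    def __init__(self, can_sleep=True):
        self.can_sleep = can_sleep
        self.sleeping = False
        self.speed = 0.0

    def step(self, sleep_threshold=0.01):
        # The physics engine puts an (almost) motionless body to sleep,
        # but only if can_sleep allows it.
        if self.can_sleep and abs(self.speed) < sleep_threshold:
            self.sleeping = True

    def apply_impulse(self, impulse):
        # Applying an impulse wakes the body so forces are computed again.
        self.sleeping = False
        self.speed += impulse

body = Body()
body.step()
print(body.sleeping)    # True: no movement, so the body sleeps
body.apply_impulse(2.0)
print(body.sleeping)    # False: impulses wake it up
```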
@@ -131,7 +131,7 @@
 Deprecated, use [member PhysicsMaterial.bounce] instead via [member physics_material_override].
 </member>
 <member name="can_sleep" type="bool" setter="set_can_sleep" getter="is_able_to_sleep" default="true">
-If [code]true[/code], the body will not calculate forces and will act as a static body if there is no movement. The body will wake up when other forces are applied via collisions or by using [method apply_impulse] or [method add_force].
+If [code]true[/code], the body can enter sleep mode when there is no movement. See [member sleeping].
 </member>
 <member name="contact_monitor" type="bool" setter="set_contact_monitor" getter="is_contact_monitor_enabled" default="false">
 If [code]true[/code], the body will emit signals when it collides with another RigidBody2D. See also [member contacts_reported].
@@ -173,7 +173,7 @@
 If a material is assigned to this property, it will be used instead of any other physics material, such as an inherited one.
 </member>
 <member name="sleeping" type="bool" setter="set_sleeping" getter="is_sleeping" default="false">
-If [code]true[/code], the body is sleeping and will not calculate forces until woken up by a collision or by using [method apply_impulse] or [method add_force].
+If [code]true[/code], the body will not move and will not calculate forces until woken up by another body through, for example, a collision, or by using the [method apply_impulse] or [method add_force] methods.
 </member>
 <member name="weight" type="float" setter="set_weight" getter="get_weight" default="9.8">
 The body's weight based on its mass and the [b]Default Gravity[/b] value in [b]Project > Project Settings > Physics > 2d[/b].
@@ -222,7 +222,8 @@
 </signal>
 <signal name="sleeping_state_changed">
 <description>
-Emitted when [member sleeping] changes.
+Emitted when the physics engine changes the body's sleeping state.
+[b]Note:[/b] Changing the value [member sleeping] will not trigger this signal. It is only emitted if the sleeping state is changed by the physics engine or [code]emit_signal("sleeping_state_changed")[/code] is used.
 </description>
 </signal>
 </signals>
@@ -21,7 +21,7 @@ def write_string(_f, text, newline=True):
     for t in range(tab):
         _f.write("\t")
     _f.write(text)
-    if (newline):
+    if newline:
         _f.write("\n")
 
 
@@ -30,7 +30,7 @@ def escape(ret):
     ret = ret.replace("<", "&gt;")
     ret = ret.replace(">", "&lt;")
     ret = ret.replace("'", "&apos;")
-    ret = ret.replace("\"", "&quot;")
+    ret = ret.replace('"', "&quot;")
     return ret
 
 
@@ -43,25 +43,26 @@ def dec_tab():
     global tab
     tab -= 1
 
 
 write_string(f, '<?xml version="1.0" encoding="UTF-8" ?>')
 write_string(f, '<doc version="' + new_doc.attrib["version"] + '">')
 
 
 def get_tag(node, name):
     tag = ""
-    if (name in node.attrib):
-        tag = ' ' + name + '="' + escape(node.attrib[name]) + '" '
+    if name in node.attrib:
+        tag = " " + name + '="' + escape(node.attrib[name]) + '" '
     return tag
 
 
 def find_method_descr(old_class, name):
 
     methods = old_class.find("methods")
-    if(methods != None and len(list(methods)) > 0):
+    if methods != None and len(list(methods)) > 0:
         for m in list(methods):
-            if (m.attrib["name"] == name):
+            if m.attrib["name"] == name:
                 description = m.find("description")
-                if (description != None and description.text.strip() != ""):
+                if description != None and description.text.strip() != "":
                     return description.text
 
     return None
@@ -70,11 +71,11 @@ def find_method_descr(old_class, name):
 def find_signal_descr(old_class, name):
 
     signals = old_class.find("signals")
-    if(signals != None and len(list(signals)) > 0):
+    if signals != None and len(list(signals)) > 0:
         for m in list(signals):
-            if (m.attrib["name"] == name):
+            if m.attrib["name"] == name:
                 description = m.find("description")
-                if (description != None and description.text.strip() != ""):
+                if description != None and description.text.strip() != "":
                     return description.text
 
     return None
@@ -82,13 +83,13 @@ def find_signal_descr(old_class, name):
 
 def find_constant_descr(old_class, name):
 
-    if (old_class is None):
+    if old_class is None:
         return None
     constants = old_class.find("constants")
-    if(constants != None and len(list(constants)) > 0):
+    if constants != None and len(list(constants)) > 0:
         for m in list(constants):
-            if (m.attrib["name"] == name):
-                if (m.text.strip() != ""):
+            if m.attrib["name"] == name:
+                if m.text.strip() != "":
                     return m.text
     return None
 
@@ -96,35 +97,35 @@ def find_constant_descr(old_class, name):
 def write_class(c):
     class_name = c.attrib["name"]
     print("Parsing Class: " + class_name)
-    if (class_name in old_classes):
+    if class_name in old_classes:
         old_class = old_classes[class_name]
     else:
         old_class = None
 
     category = get_tag(c, "category")
     inherits = get_tag(c, "inherits")
-    write_string(f, '<class name="' + class_name + '" ' + category + inherits + '>')
+    write_string(f, '<class name="' + class_name + '" ' + category + inherits + ">")
     inc_tab()
 
     write_string(f, "<brief_description>")
 
-    if (old_class != None):
+    if old_class != None:
         old_brief_descr = old_class.find("brief_description")
-        if (old_brief_descr != None):
+        if old_brief_descr != None:
             write_string(f, escape(old_brief_descr.text.strip()))
 
     write_string(f, "</brief_description>")
 
     write_string(f, "<description>")
-    if (old_class != None):
+    if old_class != None:
         old_descr = old_class.find("description")
-        if (old_descr != None):
+        if old_descr != None:
             write_string(f, escape(old_descr.text.strip()))
 
     write_string(f, "</description>")
 
     methods = c.find("methods")
-    if(methods != None and len(list(methods)) > 0):
+    if methods != None and len(list(methods)) > 0:
 
         write_string(f, "<methods>")
         inc_tab()
@@ -132,35 +133,46 @@ def write_class(c):
         for m in list(methods):
             qualifiers = get_tag(m, "qualifiers")
 
-            write_string(f, '<method name="' + escape(m.attrib["name"]) + '" ' + qualifiers + '>')
+            write_string(f, '<method name="' + escape(m.attrib["name"]) + '" ' + qualifiers + ">")
             inc_tab()
 
             for a in list(m):
-                if (a.tag == "return"):
+                if a.tag == "return":
                     typ = get_tag(a, "type")
-                    write_string(f, '<return' + typ + '>')
-                    write_string(f, '</return>')
-                elif (a.tag == "argument"):
+                    write_string(f, "<return" + typ + ">")
+                    write_string(f, "</return>")
+                elif a.tag == "argument":
 
                     default = get_tag(a, "default")
 
-                    write_string(f, '<argument index="' + a.attrib["index"] + '" name="' + escape(a.attrib["name"]) + '" type="' + a.attrib["type"] + '"' + default + '>')
-                    write_string(f, '</argument>')
+                    write_string(
+                        f,
+                        '<argument index="'
+                        + a.attrib["index"]
+                        + '" name="'
+                        + escape(a.attrib["name"])
+                        + '" type="'
+                        + a.attrib["type"]
+                        + '"'
+                        + default
+                        + ">",
+                    )
+                    write_string(f, "</argument>")
 
-            write_string(f, '<description>')
-            if (old_class != None):
+            write_string(f, "<description>")
+            if old_class != None:
                 old_method_descr = find_method_descr(old_class, m.attrib["name"])
-                if (old_method_descr):
+                if old_method_descr:
                     write_string(f, escape(escape(old_method_descr.strip())))
 
-            write_string(f, '</description>')
+            write_string(f, "</description>")
             dec_tab()
             write_string(f, "</method>")
     dec_tab()
     write_string(f, "</methods>")
 
     signals = c.find("signals")
-    if(signals != None and len(list(signals)) > 0):
+    if signals != None and len(list(signals)) > 0:
 
         write_string(f, "<signals>")
         inc_tab()
@@ -171,24 +183,33 @@ def write_class(c):
             inc_tab()
 
             for a in list(m):
-                if (a.tag == "argument"):
+                if a.tag == "argument":
 
-                    write_string(f, '<argument index="' + a.attrib["index"] + '" name="' + escape(a.attrib["name"]) + '" type="' + a.attrib["type"] + '">')
-                    write_string(f, '</argument>')
+                    write_string(
+                        f,
+                        '<argument index="'
+                        + a.attrib["index"]
+                        + '" name="'
+                        + escape(a.attrib["name"])
+                        + '" type="'
+                        + a.attrib["type"]
+                        + '">',
+                    )
+                    write_string(f, "</argument>")
 
-            write_string(f, '<description>')
-            if (old_class != None):
+            write_string(f, "<description>")
+            if old_class != None:
                 old_signal_descr = find_signal_descr(old_class, m.attrib["name"])
-                if (old_signal_descr):
+                if old_signal_descr:
                     write_string(f, escape(old_signal_descr.strip()))
-            write_string(f, '</description>')
+            write_string(f, "</description>")
             dec_tab()
             write_string(f, "</signal>")
     dec_tab()
     write_string(f, "</signals>")
 
     constants = c.find("constants")
-    if(constants != None and len(list(constants)) > 0):
+    if constants != None and len(list(constants)) > 0:
 
         write_string(f, "<constants>")
         inc_tab()
@@ -197,7 +218,7 @@ def write_class(c):
 
             write_string(f, '<constant name="' + escape(m.attrib["name"]) + '" value="' + m.attrib["value"] + '">')
             old_constant_descr = find_constant_descr(old_class, m.attrib["name"])
-            if (old_constant_descr):
+            if old_constant_descr:
                 write_string(f, escape(old_constant_descr.strip()))
             write_string(f, "</constant>")
 
@@ -207,9 +228,10 @@ def write_class(c):
     dec_tab()
     write_string(f, "</class>")
 
 
 for c in list(old_doc):
     old_classes[c.attrib["name"]] = c
 
 for c in list(new_doc):
     write_class(c)
-write_string(f, '</doc>\n')
+write_string(f, "</doc>\n")
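The hand-rolled `escape()` in this script is order-sensitive: `&` must be replaced before any of the other characters, or already-produced entities get double-escaped. A standalone sketch of the standard escaping order, with the stdlib equivalent for comparison:

```python
from xml.sax.saxutils import escape, quoteattr

def xml_escape(text):
    # "&" first, then the angle brackets; reversing this order
    # would turn "&lt;" into "&amp;lt;".
    text = text.replace("&", "&amp;")
    text = text.replace("<", "&lt;")
    text = text.replace(">", "&gt;")
    return text

print(xml_escape("a < b & c"))  # a &lt; b &amp; c
print(escape("a < b & c"))      # same result via the stdlib
print(quoteattr('say "hi"'))    # attribute-safe quoting
```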
@@ -13,75 +13,74 @@ import xml.etree.ElementTree as ET
 ################################################################################
 
 flags = {
-    'c': platform.platform() != 'Windows',  # Disable by default on windows, since we use ANSI escape codes
-    'b': False,
-    'g': False,
-    's': False,
-    'u': False,
-    'h': False,
-    'p': False,
-    'o': True,
-    'i': False,
-    'a': True,
-    'e': False,
+    "c": platform.platform() != "Windows",  # Disable by default on windows, since we use ANSI escape codes
+    "b": False,
+    "g": False,
+    "s": False,
+    "u": False,
+    "h": False,
+    "p": False,
+    "o": True,
+    "i": False,
+    "a": True,
+    "e": False,
 }
 flag_descriptions = {
-    'c': 'Toggle colors when outputting.',
-    'b': 'Toggle showing only not fully described classes.',
-    'g': 'Toggle showing only completed classes.',
-    's': 'Toggle showing comments about the status.',
-    'u': 'Toggle URLs to docs.',
-    'h': 'Show help and exit.',
-    'p': 'Toggle showing percentage as well as counts.',
-    'o': 'Toggle overall column.',
-    'i': 'Toggle collapse of class items columns.',
-    'a': 'Toggle showing all items.',
-    'e': 'Toggle hiding empty items.',
+    "c": "Toggle colors when outputting.",
+    "b": "Toggle showing only not fully described classes.",
+    "g": "Toggle showing only completed classes.",
+    "s": "Toggle showing comments about the status.",
+    "u": "Toggle URLs to docs.",
+    "h": "Show help and exit.",
+    "p": "Toggle showing percentage as well as counts.",
+    "o": "Toggle overall column.",
+    "i": "Toggle collapse of class items columns.",
+    "a": "Toggle showing all items.",
+    "e": "Toggle hiding empty items.",
 }
 long_flags = {
-    'colors': 'c',
-    'use-colors': 'c',
-
-    'bad': 'b',
-    'only-bad': 'b',
-
-    'good': 'g',
-    'only-good': 'g',
-
-    'comments': 's',
-    'status': 's',
-
-    'urls': 'u',
-    'gen-url': 'u',
-
-    'help': 'h',
-
-    'percent': 'p',
-    'use-percentages': 'p',
-
-    'overall': 'o',
-    'use-overall': 'o',
-
-    'items': 'i',
-    'collapse': 'i',
-
-    'all': 'a',
-
-    'empty': 'e',
+    "colors": "c",
+    "use-colors": "c",
+    "bad": "b",
+    "only-bad": "b",
+    "good": "g",
+    "only-good": "g",
+    "comments": "s",
+    "status": "s",
+    "urls": "u",
+    "gen-url": "u",
+    "help": "h",
+    "percent": "p",
+    "use-percentages": "p",
+    "overall": "o",
+    "use-overall": "o",
+    "items": "i",
+    "collapse": "i",
+    "all": "a",
+    "empty": "e",
 }
-table_columns = ['name', 'brief_description', 'description', 'methods', 'constants', 'members', 'signals', 'theme_items']
-table_column_names = ['Name', 'Brief Desc.', 'Desc.', 'Methods', 'Constants', 'Members', 'Signals', 'Theme Items']
+table_columns = [
+    "name",
+    "brief_description",
+    "description",
+    "methods",
+    "constants",
+    "members",
+    "signals",
+    "theme_items",
+]
+table_column_names = ["Name", "Brief Desc.", "Desc.", "Methods", "Constants", "Members", "Signals", "Theme Items"]
 colors = {
-    'name': [36],  # cyan
-    'part_big_problem': [4, 31],  # underline, red
-    'part_problem': [31],  # red
-    'part_mostly_good': [33],  # yellow
-    'part_good': [32],  # green
-    'url': [4, 34],  # underline, blue
-    'section': [1, 4],  # bold, underline
-    'state_off': [36],  # cyan
-    'state_on': [1, 35],  # bold, magenta/plum
-    'bold': [1],  # bold
+    "name": [36],  # cyan
+    "part_big_problem": [4, 31],  # underline, red
+    "part_problem": [31],  # red
+    "part_mostly_good": [33],  # yellow
+    "part_good": [32],  # green
+    "url": [4, 34],  # underline, blue
+    "section": [1, 4],  # bold, underline
+    "state_off": [36],  # cyan
+    "state_on": [1, 35],  # bold, magenta/plum
+    "bold": [1],  # bold
 }
 overall_progress_description_weigth = 10
@@ -90,6 +89,7 @@ overall_progress_description_weigth = 10
 # Utils #
 ################################################################################
 
+
 def validate_tag(elem, tag):
     if elem.tag != tag:
         print('Tag mismatch, expected "' + tag + '", got ' + elem.tag)
@@ -97,36 +97,38 @@ def validate_tag(elem, tag):
 
 
 def color(color, string):
-    if flags['c'] and terminal_supports_color():
-        color_format = ''
+    if flags["c"] and terminal_supports_color():
+        color_format = ""
         for code in colors[color]:
-            color_format += '\033[' + str(code) + 'm'
-        return color_format + string + '\033[0m'
+            color_format += "\033[" + str(code) + "m"
+        return color_format + string + "\033[0m"
     else:
         return string
 
-ansi_escape = re.compile(r'\x1b[^m]*m')
+
+ansi_escape = re.compile(r"\x1b[^m]*m")
+
 
 def nonescape_len(s):
-    return len(ansi_escape.sub('', s))
+    return len(ansi_escape.sub("", s))
+
 
 def terminal_supports_color():
     p = sys.platform
-    supported_platform = p != 'Pocket PC' and (p != 'win32' or
-                                               'ANSICON' in os.environ)
+    supported_platform = p != "Pocket PC" and (p != "win32" or "ANSICON" in os.environ)
 
-    is_a_tty = hasattr(sys.stdout, 'isatty') and sys.stdout.isatty()
+    is_a_tty = hasattr(sys.stdout, "isatty") and sys.stdout.isatty()
     if not supported_platform or not is_a_tty:
         return False
     return True
 
 
 ################################################################################
 # Classes #
 ################################################################################
 
-class ClassStatusProgress:
-
+
+class ClassStatusProgress:
     def __init__(self, described=0, total=0):
         self.described = described
         self.total = total
@@ -143,42 +145,41 @@ class ClassStatusProgress:
         return self.described >= self.total
 
     def to_configured_colored_string(self):
-        if flags['p']:
-            return self.to_colored_string('{percent}% ({has}/{total})', '{pad_percent}{pad_described}{s}{pad_total}')
+        if flags["p"]:
+            return self.to_colored_string("{percent}% ({has}/{total})", "{pad_percent}{pad_described}{s}{pad_total}")
         else:
             return self.to_colored_string()
 
-    def to_colored_string(self, format='{has}/{total}', pad_format='{pad_described}{s}{pad_total}'):
+    def to_colored_string(self, format="{has}/{total}", pad_format="{pad_described}{s}{pad_total}"):
         ratio = float(self.described) / float(self.total) if self.total != 0 else 1
         percent = int(round(100 * ratio))
         s = format.format(has=str(self.described), total=str(self.total), percent=str(percent))
         if self.described >= self.total:
-            s = color('part_good', s)
+            s = color("part_good", s)
         elif self.described >= self.total / 4 * 3:
-            s = color('part_mostly_good', s)
+            s = color("part_mostly_good", s)
         elif self.described > 0:
-            s = color('part_problem', s)
+            s = color("part_problem", s)
         else:
-            s = color('part_big_problem', s)
+            s = color("part_big_problem", s)
         pad_size = max(len(str(self.described)), len(str(self.total)))
-        pad_described = ''.ljust(pad_size - len(str(self.described)))
-        pad_percent = ''.ljust(3 - len(str(percent)))
-        pad_total = ''.ljust(pad_size - len(str(self.total)))
+        pad_described = "".ljust(pad_size - len(str(self.described)))
+        pad_percent = "".ljust(3 - len(str(percent)))
+        pad_total = "".ljust(pad_size - len(str(self.total)))
         return pad_format.format(pad_described=pad_described, pad_total=pad_total, pad_percent=pad_percent, s=s)
 
 
 class ClassStatus:
-
-    def __init__(self, name=''):
+    def __init__(self, name=""):
         self.name = name
         self.has_brief_description = True
         self.has_description = True
         self.progresses = {
-            'methods': ClassStatusProgress(),
-            'constants': ClassStatusProgress(),
-            'members': ClassStatusProgress(),
-            'theme_items': ClassStatusProgress(),
-            'signals': ClassStatusProgress()
+            "methods": ClassStatusProgress(),
+            "constants": ClassStatusProgress(),
+            "members": ClassStatusProgress(),
+            "theme_items": ClassStatusProgress(),
+            "signals": ClassStatusProgress(),
         }
 
     def __add__(self, other):
@@ -208,66 +209,70 @@ class ClassStatus:
 
     def make_output(self):
         output = {}
-        output['name'] = color('name', self.name)
+        output["name"] = color("name", self.name)
 
-        ok_string = color('part_good', 'OK')
-        missing_string = color('part_big_problem', 'MISSING')
+        ok_string = color("part_good", "OK")
+        missing_string = color("part_big_problem", "MISSING")
 
-        output['brief_description'] = ok_string if self.has_brief_description else missing_string
-        output['description'] = ok_string if self.has_description else missing_string
+        output["brief_description"] = ok_string if self.has_brief_description else missing_string
+        output["description"] = ok_string if self.has_description else missing_string
 
         description_progress = ClassStatusProgress(
             (self.has_brief_description + self.has_description) * overall_progress_description_weigth,
-            2 * overall_progress_description_weigth
+            2 * overall_progress_description_weigth,
         )
         items_progress = ClassStatusProgress()
 
-        for k in ['methods', 'constants', 'members', 'signals', 'theme_items']:
+        for k in ["methods", "constants", "members", "signals", "theme_items"]:
            items_progress += self.progresses[k]
            output[k] = self.progresses[k].to_configured_colored_string()
 
-        output['items'] = items_progress.to_configured_colored_string()
+        output["items"] = items_progress.to_configured_colored_string()
 
-        output['overall'] = (description_progress + items_progress).to_colored_string(color('bold', '{percent}%'), '{pad_percent}{s}')
+        output["overall"] = (description_progress + items_progress).to_colored_string(
+            color("bold", "{percent}%"), "{pad_percent}{s}"
+        )
 
-        if self.name.startswith('Total'):
-            output['url'] = color('url', 'https://docs.godotengine.org/en/latest/classes/')
-            if flags['s']:
-                output['comment'] = color('part_good', 'ALL OK')
+        if self.name.startswith("Total"):
+            output["url"] = color("url", "https://docs.godotengine.org/en/latest/classes/")
+            if flags["s"]:
+                output["comment"] = color("part_good", "ALL OK")
         else:
-            output['url'] = color('url', 'https://docs.godotengine.org/en/latest/classes/class_{name}.html'.format(name=self.name.lower()))
+            output["url"] = color(
+                "url", "https://docs.godotengine.org/en/latest/classes/class_{name}.html".format(name=self.name.lower())
+            )
 
-            if flags['s'] and not flags['g'] and self.is_ok():
-                output['comment'] = color('part_good', 'ALL OK')
+            if flags["s"] and not flags["g"] and self.is_ok():
+                output["comment"] = color("part_good", "ALL OK")
 
         return output
 
     @staticmethod
     def generate_for_class(c):
        status = ClassStatus()
-        status.name = c.attrib['name']
+        status.name = c.attrib["name"]
 
        for tag in list(c):
 
-            if tag.tag == 'brief_description':
+            if tag.tag == "brief_description":
                status.has_brief_description = len(tag.text.strip()) > 0
 
-            elif tag.tag == 'description':
+            elif tag.tag == "description":
                status.has_description = len(tag.text.strip()) > 0
 
-            elif tag.tag in ['methods', 'signals']:
+            elif tag.tag in ["methods", "signals"]:
                for sub_tag in list(tag):
-                    descr = sub_tag.find('description')
+                    descr = sub_tag.find("description")
                    status.progresses[tag.tag].increment(len(descr.text.strip()) > 0)
-            elif tag.tag in ['constants', 'members', 'theme_items']:
+            elif tag.tag in ["constants", "members", "theme_items"]:
                for sub_tag in list(tag):
                    if not sub_tag.text is None:
                        status.progresses[tag.tag].increment(len(sub_tag.text.strip()) > 0)
 
-            elif tag.tag in ['tutorials']:
+            elif tag.tag in ["tutorials"]:
                pass  # Ignore those tags for now
 
-            elif tag.tag in ['theme_items']:
+            elif tag.tag in ["theme_items"]:
                pass  # Ignore those tags, since they seem to lack description at all
 
            else:
@@ -286,63 +291,69 @@ merged_file = ""
 
 for arg in sys.argv[1:]:
     try:
-        if arg.startswith('--'):
+        if arg.startswith("--"):
             flags[long_flags[arg[2:]]] = not flags[long_flags[arg[2:]]]
-        elif arg.startswith('-'):
+        elif arg.startswith("-"):
             for f in arg[1:]:
                 flags[f] = not flags[f]
         elif os.path.isdir(arg):
             for f in os.listdir(arg):
-                if f.endswith('.xml'):
-                    input_file_list.append(os.path.join(arg, f));
+                if f.endswith(".xml"):
+                    input_file_list.append(os.path.join(arg, f))
         else:
             input_class_list.append(arg)
     except KeyError:
         print("Unknown command line flag: " + arg)
         sys.exit(1)
 
-if flags['i']:
-    for r in ['methods', 'constants', 'members', 'signals', 'theme_items']:
+if flags["i"]:
+    for r in ["methods", "constants", "members", "signals", "theme_items"]:
         index = table_columns.index(r)
         del table_column_names[index]
         del table_columns[index]
-    table_column_names.append('Items')
-    table_columns.append('items')
+    table_column_names.append("Items")
+    table_columns.append("items")
 
-if flags['o'] == (not flags['i']):
-    table_column_names.append(color('bold', 'Overall'))
-    table_columns.append('overall')
+if flags["o"] == (not flags["i"]):
+    table_column_names.append(color("bold", "Overall"))
+    table_columns.append("overall")
 
-if flags['u']:
-    table_column_names.append('Docs URL')
-    table_columns.append('url')
+if flags["u"]:
+    table_column_names.append("Docs URL")
+    table_columns.append("url")
 
 
 ################################################################################
 # Help #
 ################################################################################
 
-if len(input_file_list) < 1 or flags['h']:
-    if not flags['h']:
-        print(color('section', 'Invalid usage') + ': Please specify a classes directory')
-    print(color('section', 'Usage') + ': doc_status.py [flags] <classes_dir> [class names]')
-    print('\t< and > signify required parameters, while [ and ] signify optional parameters.')
-    print(color('section', 'Available flags') + ':')
+if len(input_file_list) < 1 or flags["h"]:
+    if not flags["h"]:
+        print(color("section", "Invalid usage") + ": Please specify a classes directory")
+    print(color("section", "Usage") + ": doc_status.py [flags] <classes_dir> [class names]")
+    print("\t< and > signify required parameters, while [ and ] signify optional parameters.")
+    print(color("section", "Available flags") + ":")
     possible_synonym_list = list(long_flags)
     possible_synonym_list.sort()
     flag_list = list(flags)
     flag_list.sort()
     for flag in flag_list:
-        synonyms = [color('name', '-' + flag)]
+        synonyms = [color("name", "-" + flag)]
        for synonym in possible_synonym_list:
            if long_flags[synonym] == flag:
-                synonyms.append(color('name', '--' + synonym))
+                synonyms.append(color("name", "--" + synonym))
 
-        print(('{synonyms} (Currently ' + color('state_' + ('on' if flags[flag] else 'off'), '{value}') + ')\n\t{description}').format(
-            synonyms=', '.join(synonyms),
-            value=('on' if flags[flag] else 'off'),
-            description=flag_descriptions[flag]
-        ))
+        print(
+            (
+                "{synonyms} (Currently "
+                + color("state_" + ("on" if flags[flag] else "off"), "{value}")
+                + ")\n\t{description}"
+            ).format(
+                synonyms=", ".join(synonyms),
+                value=("on" if flags[flag] else "off"),
+                description=flag_descriptions[flag],
+            )
        )
    sys.exit(0)
@@ -357,21 +368,21 @@ for file in input_file_list:
    tree = ET.parse(file)
    doc = tree.getroot()
 
-    if 'version' not in doc.attrib:
+    if "version" not in doc.attrib:
        print('Version missing from "doc"')
        sys.exit(255)
 
-    version = doc.attrib['version']
+    version = doc.attrib["version"]
 
-    if doc.attrib['name'] in class_names:
+    if doc.attrib["name"] in class_names:
        continue
-    class_names.append(doc.attrib['name'])
-    classes[doc.attrib['name']] = doc
+    class_names.append(doc.attrib["name"])
+    classes[doc.attrib["name"]] = doc
 
 class_names.sort()
 
 if len(input_class_list) < 1:
-    input_class_list = ['*']
+    input_class_list = ["*"]
 
 filtered_classes = set()
 for pattern in input_class_list:
@@ -384,23 +395,23 @@ filtered_classes.sort()
 ################################################################################
 
 table = [table_column_names]
-table_row_chars = '| - '
-table_column_chars = '|'
+table_row_chars = "| - "
+table_column_chars = "|"
 
-total_status = ClassStatus('Total')
+total_status = ClassStatus("Total")
 
 for cn in filtered_classes:
 
    c = classes[cn]
-    validate_tag(c, 'class')
+    validate_tag(c, "class")
    status = ClassStatus.generate_for_class(c)
 
    total_status = total_status + status
 
-    if (flags['b'] and status.is_ok()) or (flags['g'] and not status.is_ok()) or (not flags['a']):
+    if (flags["b"] and status.is_ok()) or (flags["g"] and not status.is_ok()) or (not flags["a"]):
        continue
 
-    if flags['e'] and status.is_empty():
+    if flags["e"] and status.is_empty():
        continue
 
    out = status.make_output()
@@ -409,10 +420,10 @@ for cn in filtered_classes:
        if column in out:
            row.append(out[column])
        else:
-            row.append('')
+            row.append("")
 
-    if 'comment' in out and out['comment'] != '':
-        row.append(out['comment'])
+    if "comment" in out and out["comment"] != "":
+        row.append(out["comment"])
 
    table.append(row)
@@ -421,22 +432,22 @@ for cn in filtered_classes:
 # Print output table #
 ################################################################################
 
-if len(table) == 1 and flags['a']:
-    print(color('part_big_problem', 'No classes suitable for printing!'))
+if len(table) == 1 and flags["a"]:
+    print(color("part_big_problem", "No classes suitable for printing!"))
    sys.exit(0)
 
-if len(table) > 2 or not flags['a']:
-    total_status.name = 'Total = {0}'.format(len(table) - 1)
+if len(table) > 2 or not flags["a"]:
+    total_status.name = "Total = {0}".format(len(table) - 1)
    out = total_status.make_output()
    row = []
    for column in table_columns:
        if column in out:
            row.append(out[column])
        else:
-            row.append('')
+            row.append("")
    table.append(row)
 
-if flags['a']:
+if flags["a"]:
    # Duplicate the headers at the bottom of the table so they can be viewed
    # without having to scroll back to the top.
    table.append(table_column_names)
@@ -451,7 +462,9 @@ for row in table:
 
 divider_string = table_row_chars[0]
 for cell_i in range(len(table[0])):
-    divider_string += table_row_chars[1] + table_row_chars[2] * (table_column_sizes[cell_i]) + table_row_chars[1] + table_row_chars[0]
+    divider_string += (
+        table_row_chars[1] + table_row_chars[2] * (table_column_sizes[cell_i]) + table_row_chars[1] + table_row_chars[0]
+    )
 print(divider_string)
 
 for row_i, row in enumerate(table):
@@ -461,7 +474,11 @@ for row_i, row in enumerate(table):
        if cell_i == 0:
            row_string += table_row_chars[3] + cell + table_row_chars[3] * (padding_needed - 1)
        else:
-            row_string += table_row_chars[3] * int(math.floor(float(padding_needed) / 2)) + cell + table_row_chars[3] * int(math.ceil(float(padding_needed) / 2))
+            row_string += (
+                table_row_chars[3] * int(math.floor(float(padding_needed) / 2))
+                + cell
+                + table_row_chars[3] * int(math.ceil(float(padding_needed) / 2))
+            )
        row_string += table_column_chars
 
    print(row_string)
|
||||
|
@ -474,5 +491,5 @@ for row_i, row in enumerate(table):
|
|||
|
||||
print(divider_string)
|
||||
|
||||
if total_status.is_ok() and not flags['g']:
|
||||
print('All listed classes are ' + color('part_good', 'OK') + '!')
|
||||
if total_status.is_ok() and not flags["g"]:
|
||||
print("All listed classes are " + color("part_good", "OK") + "!")
|
||||
|
|
|
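Most hunks in this batch are mechanical Black reformattings: an over-long expression is wrapped in parentheses, which only group and cannot change the value. A minimal sketch using the divider expression from the status-table diff above, with a hypothetical one-column table of width 4:

```python
# The same expression in single-line (pre-Black) and parenthesized
# (post-Black) form; parentheses only group, so the results are identical.
table_row_chars = "| - "
table_column_sizes = [4]  # hypothetical: one column, width 4
cell_i = 0

single_line = table_row_chars[1] + table_row_chars[2] * (table_column_sizes[cell_i]) + table_row_chars[1] + table_row_chars[0]
wrapped = (
    table_row_chars[1] + table_row_chars[2] * (table_column_sizes[cell_i]) + table_row_chars[1] + table_row_chars[0]
)
assert single_line == wrapped == " ---- |"
```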
@ -7,10 +7,12 @@ import xml.etree.ElementTree as ET
|
|||
from collections import OrderedDict
|
||||
|
||||
# Uncomment to do type checks. I have it commented out so it works below Python 3.5
|
||||
#from typing import List, Dict, TextIO, Tuple, Iterable, Optional, DefaultDict, Any, Union
|
||||
# from typing import List, Dict, TextIO, Tuple, Iterable, Optional, DefaultDict, Any, Union
|
||||
|
||||
# http(s)://docs.godotengine.org/<langcode>/<tag>/path/to/page.html(#fragment-tag)
|
||||
GODOT_DOCS_PATTERN = re.compile(r'^http(?:s)?://docs\.godotengine\.org/(?:[a-zA-Z0-9.\-_]*)/(?:[a-zA-Z0-9.\-_]*)/(.*)\.html(#.*)?$')
|
||||
GODOT_DOCS_PATTERN = re.compile(
|
||||
r"^http(?:s)?://docs\.godotengine\.org/(?:[a-zA-Z0-9.\-_]*)/(?:[a-zA-Z0-9.\-_]*)/(.*)\.html(#.*)?$"
|
||||
)
|
||||
|
||||
|
||||
def print_error(error, state): # type: (str, State) -> None
|
||||
|
@ -37,7 +39,9 @@ class TypeName:
|
|||
|
||||
|
||||
class PropertyDef:
|
||||
def __init__(self, name, type_name, setter, getter, text, default_value, overridden): # type: (str, TypeName, Optional[str], Optional[str], Optional[str], Optional[str], Optional[bool]) -> None
|
||||
def __init__(
|
||||
self, name, type_name, setter, getter, text, default_value, overridden
|
||||
): # type: (str, TypeName, Optional[str], Optional[str], Optional[str], Optional[str], Optional[bool]) -> None
|
||||
self.name = name
|
||||
self.type_name = type_name
|
||||
self.setter = setter
|
||||
|
@ -46,6 +50,7 @@ class PropertyDef:
|
|||
self.default_value = default_value
|
||||
self.overridden = overridden
|
||||
|
||||
|
||||
class ParameterDef:
|
||||
def __init__(self, name, type_name, default_value): # type: (str, TypeName, Optional[str]) -> None
|
||||
self.name = name
|
||||
|
@ -61,7 +66,9 @@ class SignalDef:
|
|||
|
||||
|
||||
class MethodDef:
|
||||
def __init__(self, name, return_type, parameters, description, qualifiers): # type: (str, TypeName, List[ParameterDef], Optional[str], Optional[str]) -> None
|
||||
def __init__(
|
||||
self, name, return_type, parameters, description, qualifiers
|
||||
): # type: (str, TypeName, List[ParameterDef], Optional[str], Optional[str]) -> None
|
||||
self.name = name
|
||||
self.return_type = return_type
|
||||
self.parameters = parameters
|
||||
|
@ -144,10 +151,12 @@ class State:
|
|||
getter = property.get("getter") or None
|
||||
default_value = property.get("default") or None
|
||||
if default_value is not None:
|
||||
default_value = '``{}``'.format(default_value)
|
||||
default_value = "``{}``".format(default_value)
|
||||
overridden = property.get("override") or False
|
||||
|
||||
property_def = PropertyDef(property_name, type_name, setter, getter, property.text, default_value, overridden)
|
||||
property_def = PropertyDef(
|
||||
property_name, type_name, setter, getter, property.text, default_value, overridden
|
||||
)
|
||||
class_def.properties[property_name] = property_def
|
||||
|
||||
methods = class_root.find("methods")
|
||||
|
@ -246,8 +255,6 @@ class State:
|
|||
if link.text is not None:
|
||||
class_def.tutorials.append(link.text)
|
||||
|
||||
|
||||
|
||||
def sort_classes(self): # type: () -> None
|
||||
self.classes = OrderedDict(sorted(self.classes.items(), key=lambda t: t[0]))
|
||||
|
||||
|
@ -273,7 +280,11 @@ def main(): # type: () -> None
|
|||
parser.add_argument("path", nargs="+", help="A path to an XML file or a directory containing XML files to parse.")
|
||||
group = parser.add_mutually_exclusive_group()
|
||||
group.add_argument("--output", "-o", default=".", help="The directory to save output .rst files in.")
|
||||
group.add_argument("--dry-run", action="store_true", help="If passed, no output will be generated and XML files are only checked for errors.")
|
||||
group.add_argument(
|
||||
"--dry-run",
|
||||
action="store_true",
|
||||
help="If passed, no output will be generated and XML files are only checked for errors.",
|
||||
)
|
||||
args = parser.parse_args()
|
||||
|
||||
print("Checking for errors in the XML class reference...")
|
||||
|
@ -285,15 +296,15 @@ def main(): # type: () -> None
|
|||
if path.endswith(os.sep):
|
||||
path = path[:-1]
|
||||
|
||||
if os.path.basename(path) == 'modules':
|
||||
if os.path.basename(path) == "modules":
|
||||
for subdir, dirs, _ in os.walk(path):
|
||||
if 'doc_classes' in dirs:
|
||||
doc_dir = os.path.join(subdir, 'doc_classes')
|
||||
class_file_names = (f for f in os.listdir(doc_dir) if f.endswith('.xml'))
|
||||
if "doc_classes" in dirs:
|
||||
doc_dir = os.path.join(subdir, "doc_classes")
|
||||
class_file_names = (f for f in os.listdir(doc_dir) if f.endswith(".xml"))
|
||||
file_list += (os.path.join(doc_dir, f) for f in class_file_names)
|
||||
|
||||
elif os.path.isdir(path):
|
||||
file_list += (os.path.join(path, f) for f in os.listdir(path) if f.endswith('.xml'))
|
||||
file_list += (os.path.join(path, f) for f in os.listdir(path) if f.endswith(".xml"))
|
||||
|
||||
elif os.path.isfile(path):
|
||||
if not path.endswith(".xml"):
|
||||
|
@ -313,7 +324,7 @@ def main(): # type: () -> None
|
|||
continue
|
||||
doc = tree.getroot()
|
||||
|
||||
if 'version' not in doc.attrib:
|
||||
if "version" not in doc.attrib:
|
||||
print_error("Version missing from 'doc', file: {}".format(cur_file), state)
|
||||
continue
|
||||
|
||||
|
@ -342,13 +353,14 @@ def main(): # type: () -> None
|
|||
print("Errors were found in the class reference XML. Please check the messages above.")
|
||||
exit(1)
|
||||
|
||||
|
||||
def make_rst_class(class_def, state, dry_run, output_dir): # type: (ClassDef, State, bool, str) -> None
|
||||
class_name = class_def.name
|
||||
|
||||
if dry_run:
|
||||
f = open(os.devnull, "w", encoding="utf-8")
|
||||
else:
|
||||
f = open(os.path.join(output_dir, "class_" + class_name.lower() + '.rst'), 'w', encoding='utf-8')
|
||||
f = open(os.path.join(output_dir, "class_" + class_name.lower() + ".rst"), "w", encoding="utf-8")
|
||||
|
||||
# Warn contributors not to edit this file directly
|
||||
f.write(":github_url: hide\n\n")
|
||||
|
@ -357,13 +369,13 @@ def make_rst_class(class_def, state, dry_run, output_dir): # type: (ClassDef, S
|
|||
f.write(".. The source is found in doc/classes or modules/<name>/doc_classes.\n\n")
|
||||
|
||||
f.write(".. _class_" + class_name + ":\n\n")
|
||||
f.write(make_heading(class_name, '='))
|
||||
f.write(make_heading(class_name, "="))
|
||||
|
||||
# Inheritance tree
|
||||
# Ascendants
|
||||
if class_def.inherits:
|
||||
inh = class_def.inherits.strip()
|
||||
f.write('**Inherits:** ')
|
||||
f.write("**Inherits:** ")
|
||||
first = True
|
||||
while inh in state.classes:
|
||||
if not first:
|
||||
|
@ -386,7 +398,7 @@ def make_rst_class(class_def, state, dry_run, output_dir): # type: (ClassDef, S
|
|||
inherited.append(c.name)
|
||||
|
||||
if len(inherited):
|
||||
f.write('**Inherited By:** ')
|
||||
f.write("**Inherited By:** ")
|
||||
for i, child in enumerate(inherited):
|
||||
if i > 0:
|
||||
f.write(", ")
|
||||
|
@ -398,20 +410,20 @@ def make_rst_class(class_def, state, dry_run, output_dir): # type: (ClassDef, S
|
|||
f.write(rstize_text(class_def.brief_description.strip(), state) + "\n\n")
|
||||
|
||||
# Class description
|
||||
if class_def.description is not None and class_def.description.strip() != '':
|
||||
f.write(make_heading('Description', '-'))
|
||||
if class_def.description is not None and class_def.description.strip() != "":
|
||||
f.write(make_heading("Description", "-"))
|
||||
f.write(rstize_text(class_def.description.strip(), state) + "\n\n")
|
||||
|
||||
# Online tutorials
|
||||
if len(class_def.tutorials) > 0:
|
||||
f.write(make_heading('Tutorials', '-'))
|
||||
f.write(make_heading("Tutorials", "-"))
|
||||
for t in class_def.tutorials:
|
||||
link = t.strip()
|
||||
f.write("- " + make_url(link) + "\n\n")
|
||||
|
||||
# Properties overview
|
||||
if len(class_def.properties) > 0:
|
||||
f.write(make_heading('Properties', '-'))
|
||||
f.write(make_heading("Properties", "-"))
|
||||
ml = [] # type: List[Tuple[str, str, str]]
|
||||
for property_def in class_def.properties.values():
|
||||
type_rst = property_def.type_name.to_rst(state)
|
||||
|
@ -425,7 +437,7 @@ def make_rst_class(class_def, state, dry_run, output_dir): # type: (ClassDef, S
|
|||
|
||||
# Methods overview
|
||||
if len(class_def.methods) > 0:
|
||||
f.write(make_heading('Methods', '-'))
|
||||
f.write(make_heading("Methods", "-"))
|
||||
ml = []
|
||||
for method_list in class_def.methods.values():
|
||||
for m in method_list:
|
||||
|
@ -434,7 +446,7 @@ def make_rst_class(class_def, state, dry_run, output_dir): # type: (ClassDef, S
|
|||
|
||||
# Theme properties
|
||||
if class_def.theme_items is not None and len(class_def.theme_items) > 0:
|
||||
f.write(make_heading('Theme Properties', '-'))
|
||||
f.write(make_heading("Theme Properties", "-"))
|
||||
pl = []
|
||||
for theme_item_list in class_def.theme_items.values():
|
||||
for theme_item in theme_item_list:
|
||||
|
@ -443,30 +455,30 @@ def make_rst_class(class_def, state, dry_run, output_dir): # type: (ClassDef, S
|
|||
|
||||
# Signals
|
||||
if len(class_def.signals) > 0:
|
||||
f.write(make_heading('Signals', '-'))
|
||||
f.write(make_heading("Signals", "-"))
|
||||
index = 0
|
||||
|
||||
for signal in class_def.signals.values():
|
||||
if index != 0:
|
||||
f.write('----\n\n')
|
||||
f.write("----\n\n")
|
||||
|
||||
f.write(".. _class_{}_signal_{}:\n\n".format(class_name, signal.name))
|
||||
_, signature = make_method_signature(class_def, signal, False, state)
|
||||
f.write("- {}\n\n".format(signature))
|
||||
|
||||
if signal.description is not None and signal.description.strip() != '':
|
||||
f.write(rstize_text(signal.description.strip(), state) + '\n\n')
|
||||
if signal.description is not None and signal.description.strip() != "":
|
||||
f.write(rstize_text(signal.description.strip(), state) + "\n\n")
|
||||
|
||||
index += 1
|
||||
|
||||
# Enums
|
||||
if len(class_def.enums) > 0:
|
||||
f.write(make_heading('Enumerations', '-'))
|
||||
f.write(make_heading("Enumerations", "-"))
|
||||
index = 0
|
||||
|
||||
for e in class_def.enums.values():
|
||||
if index != 0:
|
||||
f.write('----\n\n')
|
||||
f.write("----\n\n")
|
||||
|
||||
f.write(".. _enum_{}_{}:\n\n".format(class_name, e.name))
|
||||
# Sphinx seems to divide the bullet list into individual <ul> tags if we weave the labels into it.
|
||||
|
@ -479,16 +491,16 @@ def make_rst_class(class_def, state, dry_run, output_dir): # type: (ClassDef, S
|
|||
f.write("enum **{}**:\n\n".format(e.name))
|
||||
for value in e.values.values():
|
||||
f.write("- **{}** = **{}**".format(value.name, value.value))
|
||||
if value.text is not None and value.text.strip() != '':
|
||||
f.write(' --- ' + rstize_text(value.text.strip(), state))
|
||||
if value.text is not None and value.text.strip() != "":
|
||||
f.write(" --- " + rstize_text(value.text.strip(), state))
|
||||
|
||||
f.write('\n\n')
|
||||
f.write("\n\n")
|
||||
|
||||
index += 1
|
||||
|
||||
# Constants
|
||||
if len(class_def.constants) > 0:
|
||||
f.write(make_heading('Constants', '-'))
|
||||
f.write(make_heading("Constants", "-"))
|
||||
# Sphinx seems to divide the bullet list into individual <ul> tags if we weave the labels into it.
|
||||
# As such I'll put them all above the list. Won't be perfect but better than making the list visually broken.
|
||||
for constant in class_def.constants.values():
|
||||
|
@ -496,14 +508,14 @@ def make_rst_class(class_def, state, dry_run, output_dir): # type: (ClassDef, S
|
|||
|
||||
for constant in class_def.constants.values():
|
||||
f.write("- **{}** = **{}**".format(constant.name, constant.value))
|
||||
if constant.text is not None and constant.text.strip() != '':
|
||||
f.write(' --- ' + rstize_text(constant.text.strip(), state))
|
||||
if constant.text is not None and constant.text.strip() != "":
|
||||
f.write(" --- " + rstize_text(constant.text.strip(), state))
|
||||
|
||||
f.write('\n\n')
|
||||
f.write("\n\n")
|
||||
|
||||
# Property descriptions
|
||||
if any(not p.overridden for p in class_def.properties.values()) > 0:
|
||||
f.write(make_heading('Property Descriptions', '-'))
|
||||
f.write(make_heading("Property Descriptions", "-"))
|
||||
index = 0
|
||||
|
||||
for property_def in class_def.properties.values():
|
||||
|
@ -511,36 +523,36 @@ def make_rst_class(class_def, state, dry_run, output_dir): # type: (ClassDef, S
|
|||
continue
|
||||
|
||||
if index != 0:
|
||||
f.write('----\n\n')
|
||||
f.write("----\n\n")
|
||||
|
||||
f.write(".. _class_{}_property_{}:\n\n".format(class_name, property_def.name))
|
||||
f.write('- {} **{}**\n\n'.format(property_def.type_name.to_rst(state), property_def.name))
|
||||
f.write("- {} **{}**\n\n".format(property_def.type_name.to_rst(state), property_def.name))
|
||||
|
||||
info = []
|
||||
if property_def.default_value is not None:
|
||||
info.append(("*Default*", property_def.default_value))
|
||||
if property_def.setter is not None and not property_def.setter.startswith("_"):
|
||||
info.append(("*Setter*", property_def.setter + '(value)'))
|
||||
info.append(("*Setter*", property_def.setter + "(value)"))
|
||||
if property_def.getter is not None and not property_def.getter.startswith("_"):
|
||||
info.append(('*Getter*', property_def.getter + '()'))
|
||||
info.append(("*Getter*", property_def.getter + "()"))
|
||||
|
||||
if len(info) > 0:
|
||||
format_table(f, info)
|
||||
|
||||
if property_def.text is not None and property_def.text.strip() != '':
|
||||
f.write(rstize_text(property_def.text.strip(), state) + '\n\n')
|
||||
if property_def.text is not None and property_def.text.strip() != "":
|
||||
f.write(rstize_text(property_def.text.strip(), state) + "\n\n")
|
||||
|
||||
index += 1
|
||||
|
||||
# Method descriptions
|
||||
if len(class_def.methods) > 0:
|
||||
f.write(make_heading('Method Descriptions', '-'))
|
||||
f.write(make_heading("Method Descriptions", "-"))
|
||||
index = 0
|
||||
|
||||
for method_list in class_def.methods.values():
|
||||
for i, m in enumerate(method_list):
|
||||
if index != 0:
|
||||
f.write('----\n\n')
|
||||
f.write("----\n\n")
|
||||
|
||||
if i == 0:
|
||||
f.write(".. _class_{}_method_{}:\n\n".format(class_name, m.name))
|
||||
|
@ -548,8 +560,8 @@ def make_rst_class(class_def, state, dry_run, output_dir): # type: (ClassDef, S
|
|||
ret_type, signature = make_method_signature(class_def, m, False, state)
|
||||
f.write("- {} {}\n\n".format(ret_type, signature))
|
||||
|
||||
if m.description is not None and m.description.strip() != '':
|
||||
f.write(rstize_text(m.description.strip(), state) + '\n\n')
|
||||
if m.description is not None and m.description.strip() != "":
|
||||
f.write(rstize_text(m.description.strip(), state) + "\n\n")
|
||||
|
||||
index += 1
|
||||
|
||||
|
@ -558,29 +570,29 @@ def escape_rst(text, until_pos=-1): # type: (str) -> str
|
|||
# Escape \ character, otherwise it ends up as an escape character in rst
|
||||
pos = 0
|
||||
while True:
|
||||
pos = text.find('\\', pos, until_pos)
|
||||
pos = text.find("\\", pos, until_pos)
|
||||
if pos == -1:
|
||||
break
|
||||
text = text[:pos] + "\\\\" + text[pos + 1:]
|
||||
text = text[:pos] + "\\\\" + text[pos + 1 :]
|
||||
pos += 2
|
||||
|
||||
# Escape * character to avoid interpreting it as emphasis
|
||||
pos = 0
|
||||
while True:
|
||||
pos = text.find('*', pos, until_pos)
|
||||
pos = text.find("*", pos, until_pos)
|
||||
if pos == -1:
|
||||
break
|
||||
text = text[:pos] + "\*" + text[pos + 1:]
|
||||
text = text[:pos] + "\*" + text[pos + 1 :]
|
||||
pos += 2
|
||||
|
||||
# Escape _ character at the end of a word to avoid interpreting it as an inline hyperlink
|
||||
pos = 0
|
||||
while True:
|
||||
pos = text.find('_', pos, until_pos)
|
||||
pos = text.find("_", pos, until_pos)
|
||||
if pos == -1:
|
||||
break
|
||||
if not text[pos + 1].isalnum(): # don't escape within a snake_case word
|
||||
text = text[:pos] + "\_" + text[pos + 1:]
|
||||
text = text[:pos] + "\_" + text[pos + 1 :]
|
||||
pos += 2
|
||||
else:
|
||||
pos += 1
|
||||
|
@ -592,16 +604,16 @@ def rstize_text(text, state): # type: (str, State) -> str
|
|||
# Linebreak + tabs in the XML should become two line breaks unless in a "codeblock"
|
||||
pos = 0
|
||||
while True:
|
||||
pos = text.find('\n', pos)
|
||||
pos = text.find("\n", pos)
|
||||
if pos == -1:
|
||||
break
|
||||
|
||||
pre_text = text[:pos]
|
||||
indent_level = 0
|
||||
while text[pos + 1] == '\t':
|
||||
while text[pos + 1] == "\t":
|
||||
pos += 1
|
||||
indent_level += 1
|
||||
post_text = text[pos + 1:]
|
||||
post_text = text[pos + 1 :]
|
||||
|
||||
# Handle codeblocks
|
||||
if post_text.startswith("[codeblock]"):
|
||||
|
@ -610,28 +622,33 @@ def rstize_text(text, state): # type: (str, State) -> str
|
|||
print_error("[codeblock] without a closing tag, file: {}".format(state.current_class), state)
|
||||
return ""
|
||||
|
||||
code_text = post_text[len("[codeblock]"):end_pos]
|
||||
code_text = post_text[len("[codeblock]") : end_pos]
|
||||
post_text = post_text[end_pos:]
|
||||
|
||||
# Remove extraneous tabs
|
||||
code_pos = 0
|
||||
while True:
|
||||
code_pos = code_text.find('\n', code_pos)
|
||||
code_pos = code_text.find("\n", code_pos)
|
||||
if code_pos == -1:
|
||||
break
|
||||
|
||||
to_skip = 0
|
||||
while code_pos + to_skip + 1 < len(code_text) and code_text[code_pos + to_skip + 1] == '\t':
|
||||
while code_pos + to_skip + 1 < len(code_text) and code_text[code_pos + to_skip + 1] == "\t":
|
||||
to_skip += 1
|
||||
|
||||
if to_skip > indent_level:
|
||||
print_error("Four spaces should be used for indentation within [codeblock], file: {}".format(state.current_class), state)
|
||||
print_error(
|
||||
"Four spaces should be used for indentation within [codeblock], file: {}".format(
|
||||
state.current_class
|
||||
),
|
||||
state,
|
||||
)
|
||||
|
||||
if len(code_text[code_pos + to_skip + 1:]) == 0:
|
||||
if len(code_text[code_pos + to_skip + 1 :]) == 0:
|
||||
code_text = code_text[:code_pos] + "\n"
|
||||
code_pos += 1
|
||||
else:
|
||||
code_text = code_text[:code_pos] + "\n " + code_text[code_pos + to_skip + 1:]
|
||||
code_text = code_text[:code_pos] + "\n " + code_text[code_pos + to_skip + 1 :]
|
||||
code_pos += 5 - to_skip
|
||||
|
||||
text = pre_text + "\n[codeblock]" + code_text + post_text
|
||||
|
@ -642,7 +659,7 @@ def rstize_text(text, state): # type: (str, State) -> str
|
|||
text = pre_text + "\n\n" + post_text
|
||||
pos += 2
|
||||
|
||||
next_brac_pos = text.find('[')
|
||||
next_brac_pos = text.find("[")
|
||||
text = escape_rst(text, next_brac_pos)
|
||||
|
||||
# Handle [tags]
|
||||
|
@ -654,54 +671,59 @@ def rstize_text(text, state): # type: (str, State) -> str
|
|||
tag_depth = 0
|
||||
previous_pos = 0
|
||||
while True:
|
||||
pos = text.find('[', pos)
|
||||
pos = text.find("[", pos)
|
||||
if inside_url and (pos > previous_pos):
|
||||
url_has_name = True
|
||||
if pos == -1:
|
||||
break
|
||||
|
||||
endq_pos = text.find(']', pos + 1)
|
||||
endq_pos = text.find("]", pos + 1)
|
||||
if endq_pos == -1:
|
||||
break
|
||||
|
||||
pre_text = text[:pos]
|
||||
post_text = text[endq_pos + 1:]
|
||||
tag_text = text[pos + 1:endq_pos]
|
||||
post_text = text[endq_pos + 1 :]
|
||||
tag_text = text[pos + 1 : endq_pos]
|
||||
|
||||
escape_post = False
|
||||
|
||||
if tag_text in state.classes:
|
||||
if tag_text == state.current_class:
|
||||
# We don't want references to the same class
|
||||
tag_text = '``{}``'.format(tag_text)
|
||||
tag_text = "``{}``".format(tag_text)
|
||||
else:
|
||||
tag_text = make_type(tag_text, state)
|
||||
escape_post = True
|
||||
else: # command
|
||||
cmd = tag_text
|
||||
space_pos = tag_text.find(' ')
|
||||
if cmd == '/codeblock':
|
||||
tag_text = ''
|
||||
space_pos = tag_text.find(" ")
|
||||
if cmd == "/codeblock":
|
||||
tag_text = ""
|
||||
tag_depth -= 1
|
||||
inside_code = False
|
||||
# Strip newline if the tag was alone on one
|
||||
if pre_text[-1] == '\n':
|
||||
if pre_text[-1] == "\n":
|
||||
pre_text = pre_text[:-1]
|
||||
elif cmd == '/code':
|
||||
tag_text = '``'
|
||||
elif cmd == "/code":
|
||||
tag_text = "``"
|
||||
tag_depth -= 1
|
||||
inside_code = False
|
||||
escape_post = True
|
||||
elif inside_code:
|
||||
tag_text = '[' + tag_text + ']'
|
||||
elif cmd.find('html') == 0:
|
||||
param = tag_text[space_pos + 1:]
|
||||
tag_text = "[" + tag_text + "]"
|
||||
elif cmd.find("html") == 0:
|
||||
param = tag_text[space_pos + 1 :]
|
||||
tag_text = param
|
||||
elif cmd.startswith('method') or cmd.startswith('member') or cmd.startswith('signal') or cmd.startswith('constant'):
|
||||
param = tag_text[space_pos + 1:]
|
||||
elif (
|
||||
cmd.startswith("method")
|
||||
or cmd.startswith("member")
|
||||
or cmd.startswith("signal")
|
||||
or cmd.startswith("constant")
|
||||
):
|
||||
param = tag_text[space_pos + 1 :]
|
||||
|
||||
if param.find('.') != -1:
|
||||
ss = param.split('.')
|
||||
if param.find(".") != -1:
|
||||
ss = param.split(".")
|
||||
if len(ss) > 2:
|
||||
print_error("Bad reference: '{}', file: {}".format(param, state.current_class), state)
|
||||
class_param, method_param = ss
|
||||
|
@ -734,7 +756,7 @@ def rstize_text(text, state): # type: (str, State) -> str
|
|||
# Search in the current class
|
||||
search_class_defs = [class_def]
|
||||
|
||||
if param.find('.') == -1:
|
||||
if param.find(".") == -1:
|
||||
# Also search in @GlobalScope as a last resort if no class was specified
|
||||
search_class_defs.append(state.classes["@GlobalScope"])
|
||||
|
||||
|
@ -755,66 +777,71 @@ def rstize_text(text, state): # type: (str, State) -> str
|
|||
ref_type = "_constant"
|
||||
|
||||
else:
|
||||
print_error("Unresolved type reference '{}' in method reference '{}', file: {}".format(class_param, param, state.current_class), state)
|
||||
print_error(
|
||||
"Unresolved type reference '{}' in method reference '{}', file: {}".format(
|
||||
class_param, param, state.current_class
|
||||
),
|
||||
state,
|
||||
)
|
||||
|
||||
repl_text = method_param
|
||||
if class_param != state.current_class:
|
||||
repl_text = "{}.{}".format(class_param, method_param)
|
||||
tag_text = ':ref:`{}<class_{}{}_{}>`'.format(repl_text, class_param, ref_type, method_param)
|
||||
tag_text = ":ref:`{}<class_{}{}_{}>`".format(repl_text, class_param, ref_type, method_param)
|
||||
escape_post = True
|
||||
elif cmd.find('image=') == 0:
|
||||
elif cmd.find("image=") == 0:
|
||||
tag_text = "" # '![](' + cmd[6:] + ')'
|
||||
elif cmd.find('url=') == 0:
|
||||
elif cmd.find("url=") == 0:
|
||||
url_link = cmd[4:]
|
||||
tag_text = '`'
|
||||
tag_text = "`"
|
||||
tag_depth += 1
|
||||
inside_url = True
|
||||
url_has_name = False
|
||||
elif cmd == '/url':
|
||||
tag_text = ('' if url_has_name else url_link) + " <" + url_link + ">`_"
|
||||
elif cmd == "/url":
|
||||
tag_text = ("" if url_has_name else url_link) + " <" + url_link + ">`_"
|
||||
tag_depth -= 1
|
||||
escape_post = True
|
||||
inside_url = False
|
||||
url_has_name = False
|
||||
elif cmd == 'center':
|
||||
elif cmd == "center":
|
||||
tag_depth += 1
|
||||
tag_text = ''
|
||||
elif cmd == '/center':
|
||||
tag_text = ""
|
||||
elif cmd == "/center":
|
||||
tag_depth -= 1
|
||||
tag_text = ''
|
||||
elif cmd == 'codeblock':
|
||||
tag_text = ""
|
||||
elif cmd == "codeblock":
|
||||
tag_depth += 1
|
||||
tag_text = '\n::\n'
|
||||
tag_text = "\n::\n"
|
||||
inside_code = True
|
||||
elif cmd == 'br':
|
||||
elif cmd == "br":
|
||||
# Make a new paragraph instead of a linebreak, rst is not so linebreak friendly
|
||||
tag_text = '\n\n'
|
||||
tag_text = "\n\n"
|
||||
# Strip potential leading spaces
|
||||
while post_text[0] == ' ':
|
||||
while post_text[0] == " ":
|
||||
post_text = post_text[1:]
|
||||
elif cmd == 'i' or cmd == '/i':
|
||||
elif cmd == "i" or cmd == "/i":
|
||||
if cmd == "/i":
|
||||
tag_depth -= 1
|
||||
else:
|
||||
tag_depth += 1
|
||||
tag_text = '*'
|
||||
elif cmd == 'b' or cmd == '/b':
|
||||
tag_text = "*"
|
||||
elif cmd == "b" or cmd == "/b":
|
||||
if cmd == "/b":
|
||||
tag_depth -= 1
|
||||
else:
|
||||
tag_depth += 1
|
||||
tag_text = '**'
|
||||
elif cmd == 'u' or cmd == '/u':
|
||||
tag_text = "**"
|
||||
elif cmd == "u" or cmd == "/u":
|
||||
if cmd == "/u":
|
||||
tag_depth -= 1
|
||||
else:
|
||||
tag_depth += 1
|
||||
tag_text = ''
|
||||
elif cmd == 'code':
|
||||
tag_text = '``'
|
||||
tag_text = ""
|
||||
elif cmd == "code":
|
||||
tag_text = "``"
|
||||
tag_depth += 1
|
||||
inside_code = True
|
||||
elif cmd.startswith('enum '):
|
||||
elif cmd.startswith("enum "):
|
||||
tag_text = make_enum(cmd[5:], state)
|
||||
escape_post = True
|
||||
else:
|
||||
|
@ -823,24 +850,24 @@ def rstize_text(text, state): # type: (str, State) -> str
|
|||
|
||||
# Properly escape things like `[Node]s`
|
||||
if escape_post and post_text and (post_text[0].isalnum() or post_text[0] == "("): # not punctuation, escape
|
||||
post_text = '\ ' + post_text
|
||||
post_text = "\ " + post_text
|
||||
|
||||
next_brac_pos = post_text.find('[', 0)
|
||||
next_brac_pos = post_text.find("[", 0)
|
||||
iter_pos = 0
|
||||
while not inside_code:
|
||||
iter_pos = post_text.find('*', iter_pos, next_brac_pos)
|
||||
iter_pos = post_text.find("*", iter_pos, next_brac_pos)
|
||||
if iter_pos == -1:
|
||||
break
|
||||
post_text = post_text[:iter_pos] + "\*" + post_text[iter_pos + 1:]
|
||||
post_text = post_text[:iter_pos] + "\*" + post_text[iter_pos + 1 :]
|
||||
iter_pos += 2
|
||||
|
||||
iter_pos = 0
|
||||
while not inside_code:
|
||||
iter_pos = post_text.find('_', iter_pos, next_brac_pos)
|
||||
iter_pos = post_text.find("_", iter_pos, next_brac_pos)
|
||||
if iter_pos == -1:
|
||||
break
|
||||
if not post_text[iter_pos + 1].isalnum(): # don't escape within a snake_case word
|
||||
post_text = post_text[:iter_pos] + "\_" + post_text[iter_pos + 1:]
|
||||
post_text = post_text[:iter_pos] + "\_" + post_text[iter_pos + 1 :]
|
||||
iter_pos += 2
|
||||
else:
|
||||
iter_pos += 1
|
||||
|
@ -862,7 +889,7 @@ def format_table(f, data, remove_empty_columns=False): # type: (TextIO, Iterabl
|
|||
column_sizes = [0] * len(data[0])
|
||||
for row in data:
|
||||
for i, text in enumerate(row):
|
||||
text_length = len(text or '')
|
||||
text_length = len(text or "")
|
||||
if text_length > column_sizes[i]:
|
||||
column_sizes[i] = text_length
|
||||
|
||||
|
@ -879,16 +906,16 @@ def format_table(f, data, remove_empty_columns=False): # type: (TextIO, Iterabl
|
|||
for i, text in enumerate(row):
|
||||
if column_sizes[i] == 0 and remove_empty_columns:
|
||||
continue
|
||||
row_text += " " + (text or '').ljust(column_sizes[i]) + " |"
|
||||
row_text += " " + (text or "").ljust(column_sizes[i]) + " |"
|
||||
row_text += "\n"
|
||||
f.write(row_text)
|
||||
f.write(sep)
|
||||
f.write('\n')
|
||||
f.write("\n")
|
||||
|
||||
|
||||
def make_type(t, state): # type: (str, State) -> str
|
||||
if t in state.classes:
|
||||
return ':ref:`{0}<class_{0}>`'.format(t)
|
||||
return ":ref:`{0}<class_{0}>`".format(t)
|
||||
print_error("Unresolved type '{}', file: {}".format(t, state.current_class), state)
|
||||
return t
|
||||
|
||||
|
@ -897,7 +924,7 @@ def make_enum(t, state): # type: (str, State) -> str
|
|||
p = t.find(".")
|
||||
if p >= 0:
|
||||
c = t[0:p]
|
||||
e = t[p + 1:]
|
||||
e = t[p + 1 :]
|
||||
# Variant enums live in GlobalScope but still use periods.
|
||||
if c == "Variant":
|
||||
c = "@GlobalScope"
|
||||
|
@ -909,7 +936,7 @@ def make_enum(t, state): # type: (str, State) -> str
|
|||
c = "@GlobalScope"
|
||||
|
||||
if not c in state.classes and c.startswith("_"):
|
||||
         c = c[1:]  # Remove the underscore prefix

     if c in state.classes and e in state.classes[c].enums:
         return ":ref:`{0}<enum_{1}_{0}>`".format(e, c)

@@ -921,7 +948,9 @@ def make_enum(t, state):  # type: (str, State) -> str
     return t


-def make_method_signature(class_def, method_def, make_ref, state):  # type: (ClassDef, Union[MethodDef, SignalDef], bool, State) -> Tuple[str, str]
+def make_method_signature(
+    class_def, method_def, make_ref, state
+):  # type: (ClassDef, Union[MethodDef, SignalDef], bool, State) -> Tuple[str, str]
     ret_type = " "
     ref_type = "signal"

@@ -936,34 +965,34 @@ def make_method_signature(class_def, method_def, make_ref, state):  # type: (Cla
     else:
         out += "**{}** ".format(method_def.name)

-    out += '**(**'
+    out += "**(**"
     for i, arg in enumerate(method_def.parameters):
         if i > 0:
-            out += ', '
+            out += ", "
         else:
-            out += ' '
+            out += " "

         out += "{} {}".format(arg.type_name.to_rst(state), arg.name)

         if arg.default_value is not None:
-            out += '=' + arg.default_value
+            out += "=" + arg.default_value

-    if isinstance(method_def, MethodDef) and method_def.qualifiers is not None and 'vararg' in method_def.qualifiers:
+    if isinstance(method_def, MethodDef) and method_def.qualifiers is not None and "vararg" in method_def.qualifiers:
         if len(method_def.parameters) > 0:
-            out += ', ...'
+            out += ", ..."
         else:
-            out += ' ...'
+            out += " ..."

-    out += ' **)**'
+    out += " **)**"

     if isinstance(method_def, MethodDef) and method_def.qualifiers is not None:
-        out += ' ' + method_def.qualifiers
+        out += " " + method_def.qualifiers

     return ret_type, out


 def make_heading(title, underline):  # type: (str, str) -> str
-    return title + '\n' + (underline * len(title)) + "\n\n"
+    return title + "\n" + (underline * len(title)) + "\n\n"


 def make_url(link):  # type: (str) -> str

@@ -987,5 +1016,5 @@ def make_url(link):  # type: (str) -> str
     return "`" + link + " <" + link + ">`_"


-if __name__ == '__main__':
+if __name__ == "__main__":
     main()
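Most hunks in this batch are mechanical rewrites produced by the Black formatter (double-quoted strings, dropped semicolons, unwrapped parentheses); runtime behavior is unchanged. As a sanity check, `make_heading` from the hunk above produces the same output in either quoting style — a minimal sketch:

```python
def make_heading(title, underline):  # type: (str, str) -> str
    # Black normalizes string literals to double quotes; the value is identical.
    return title + "\n" + (underline * len(title)) + "\n\n"

heading = make_heading("Node", "=")  # "Node\n====\n\n"
```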
@@ -1,40 +1,41 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 env.drivers_sources = []

 # OS drivers
-SConscript('unix/SCsub')
-SConscript('windows/SCsub')
+SConscript("unix/SCsub")
+SConscript("windows/SCsub")

 # Sounds drivers
-SConscript('alsa/SCsub')
-SConscript('coreaudio/SCsub')
-SConscript('pulseaudio/SCsub')
-if (env["platform"] == "windows"):
+SConscript("alsa/SCsub")
+SConscript("coreaudio/SCsub")
+SConscript("pulseaudio/SCsub")
+if env["platform"] == "windows":
     SConscript("wasapi/SCsub")
-    if env['xaudio2']:
+    if env["xaudio2"]:
         SConscript("xaudio2/SCsub")

 # Midi drivers
-SConscript('alsamidi/SCsub')
-SConscript('coremidi/SCsub')
-SConscript('winmidi/SCsub')
+SConscript("alsamidi/SCsub")
+SConscript("coremidi/SCsub")
+SConscript("winmidi/SCsub")

 # Graphics drivers
-if (env["platform"] != "server"):
-    SConscript('gles3/SCsub')
-    SConscript('gles2/SCsub')
-    SConscript('gl_context/SCsub')
+if env["platform"] != "server":
+    SConscript("gles3/SCsub")
+    SConscript("gles2/SCsub")
+    SConscript("gl_context/SCsub")
 else:
-    SConscript('dummy/SCsub')
+    SConscript("dummy/SCsub")

 # Core dependencies
 SConscript("png/SCsub")

-if env['vsproj']:
+if env["vsproj"]:
     import os

     path = os.getcwd()
     # Change directory so the path resolves correctly in the function call.
     os.chdir("..")
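The driver selection above is ordinary Python executed by SCons. A minimal stand-in for the same platform dispatch (the dict `env` here is a hypothetical substitute; the real construction environment is provided by SCons, not a plain dict):

```python
# Hypothetical environment flags mirroring the SCsub above.
env = {"platform": "windows", "xaudio2": True}

sound_drivers = ["alsa", "coreaudio", "pulseaudio"]
if env["platform"] == "windows":
    # wasapi is always built on Windows; xaudio2 only when the option is set.
    sound_drivers.append("wasapi")
    if env["xaudio2"]:
        sound_drivers.append("xaudio2")
```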
@@ -1,5 +1,5 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 env.add_source_files(env.drivers_sources, "*.cpp")

@@ -1,6 +1,6 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 # Driver source files
 env.add_source_files(env.drivers_sources, "*.cpp")

@@ -1,6 +1,6 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 # Driver source files
 env.add_source_files(env.drivers_sources, "*.cpp")

@@ -1,6 +1,6 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 # Driver source files
 env.add_source_files(env.drivers_sources, "*.cpp")

@@ -1,5 +1,5 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 env.add_source_files(env.drivers_sources, "*.cpp")

@@ -1,8 +1,8 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

-if (env["platform"] in ["haiku", "osx", "windows", "x11"]):
+if env["platform"] in ["haiku", "osx", "windows", "x11"]:
     # Thirdparty source files
     thirdparty_dir = "#thirdparty/glad/"
     thirdparty_sources = [

@@ -12,8 +12,8 @@ if (env["platform"] in ["haiku", "osx", "windows", "x11"]):

     env.Prepend(CPPPATH=[thirdparty_dir])

-    env.Append(CPPDEFINES=['GLAD_ENABLED'])
-    env.Append(CPPDEFINES=['GLES_OVER_GL'])
+    env.Append(CPPDEFINES=["GLAD_ENABLED"])
+    env.Append(CPPDEFINES=["GLES_OVER_GL"])

     env_thirdparty = env.Clone()
     env_thirdparty.disable_warnings()
@@ -1,6 +1,6 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 env.add_source_files(env.drivers_sources, "*.cpp")

@@ -1,23 +1,23 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

-if 'GLES2_GLSL' in env['BUILDERS']:
-    env.GLES2_GLSL('copy.glsl');
+if "GLES2_GLSL" in env["BUILDERS"]:
+    env.GLES2_GLSL("copy.glsl")
     # env.GLES2_GLSL('resolve.glsl');
-    env.GLES2_GLSL('canvas.glsl');
-    env.GLES2_GLSL('canvas_shadow.glsl');
-    env.GLES2_GLSL('scene.glsl');
-    env.GLES2_GLSL('cubemap_filter.glsl');
-    env.GLES2_GLSL('cube_to_dp.glsl');
+    env.GLES2_GLSL("canvas.glsl")
+    env.GLES2_GLSL("canvas_shadow.glsl")
+    env.GLES2_GLSL("scene.glsl")
+    env.GLES2_GLSL("cubemap_filter.glsl")
+    env.GLES2_GLSL("cube_to_dp.glsl")
     # env.GLES2_GLSL('blend_shape.glsl');
     # env.GLES2_GLSL('screen_space_reflection.glsl');
-    env.GLES2_GLSL('effect_blur.glsl');
+    env.GLES2_GLSL("effect_blur.glsl")
     # env.GLES2_GLSL('subsurf_scattering.glsl');
     # env.GLES2_GLSL('ssao.glsl');
     # env.GLES2_GLSL('ssao_minify.glsl');
     # env.GLES2_GLSL('ssao_blur.glsl');
     # env.GLES2_GLSL('exposure.glsl');
-    env.GLES2_GLSL('tonemap.glsl');
+    env.GLES2_GLSL("tonemap.glsl")
     # env.GLES2_GLSL('particles.glsl');
-    env.GLES2_GLSL('lens_distorted.glsl');
+    env.GLES2_GLSL("lens_distorted.glsl")

@@ -1,7 +1,7 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

-env.add_source_files(env.drivers_sources,"*.cpp")
+env.add_source_files(env.drivers_sources, "*.cpp")

 SConscript("shaders/SCsub")
@@ -1,23 +1,23 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

-if 'GLES3_GLSL' in env['BUILDERS']:
-    env.GLES3_GLSL('copy.glsl');
-    env.GLES3_GLSL('resolve.glsl');
-    env.GLES3_GLSL('canvas.glsl');
-    env.GLES3_GLSL('canvas_shadow.glsl');
-    env.GLES3_GLSL('scene.glsl');
-    env.GLES3_GLSL('cubemap_filter.glsl');
-    env.GLES3_GLSL('cube_to_dp.glsl');
-    env.GLES3_GLSL('blend_shape.glsl');
-    env.GLES3_GLSL('screen_space_reflection.glsl');
-    env.GLES3_GLSL('effect_blur.glsl');
-    env.GLES3_GLSL('subsurf_scattering.glsl');
-    env.GLES3_GLSL('ssao.glsl');
-    env.GLES3_GLSL('ssao_minify.glsl');
-    env.GLES3_GLSL('ssao_blur.glsl');
-    env.GLES3_GLSL('exposure.glsl');
-    env.GLES3_GLSL('tonemap.glsl');
-    env.GLES3_GLSL('particles.glsl');
-    env.GLES3_GLSL('lens_distorted.glsl');
+if "GLES3_GLSL" in env["BUILDERS"]:
+    env.GLES3_GLSL("copy.glsl")
+    env.GLES3_GLSL("resolve.glsl")
+    env.GLES3_GLSL("canvas.glsl")
+    env.GLES3_GLSL("canvas_shadow.glsl")
+    env.GLES3_GLSL("scene.glsl")
+    env.GLES3_GLSL("cubemap_filter.glsl")
+    env.GLES3_GLSL("cube_to_dp.glsl")
+    env.GLES3_GLSL("blend_shape.glsl")
+    env.GLES3_GLSL("screen_space_reflection.glsl")
+    env.GLES3_GLSL("effect_blur.glsl")
+    env.GLES3_GLSL("subsurf_scattering.glsl")
+    env.GLES3_GLSL("ssao.glsl")
+    env.GLES3_GLSL("ssao_minify.glsl")
+    env.GLES3_GLSL("ssao_blur.glsl")
+    env.GLES3_GLSL("exposure.glsl")
+    env.GLES3_GLSL("tonemap.glsl")
+    env.GLES3_GLSL("particles.glsl")
+    env.GLES3_GLSL("lens_distorted.glsl")
@@ -1,11 +1,11 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 env_png = env.Clone()

 # Thirdparty source files
-if env['builtin_libpng']:
+if env["builtin_libpng"]:
     thirdparty_dir = "#thirdparty/libpng/"
     thirdparty_sources = [
         "png.c",

@@ -32,6 +32,7 @@ if env['builtin_libpng']:

     # Currently .ASM filter_neon.S does not compile on NT.
     import os
+
     use_neon = "neon_enabled" in env and env["neon_enabled"] and os.name != "nt"
     if use_neon:
         env_png.Append(CPPDEFINES=[("PNG_ARM_NEON_OPT", 2)])

@@ -45,7 +46,7 @@ if env['builtin_libpng']:
     if use_neon:
         env_neon = env_thirdparty.Clone()
         if "S_compiler" in env:
-            env_neon['CC'] = env['S_compiler']
+            env_neon["CC"] = env["S_compiler"]
         neon_sources = []
         neon_sources.append(env_neon.Object(thirdparty_dir + "/arm/arm_init.c"))
         neon_sources.append(env_neon.Object(thirdparty_dir + "/arm/filter_neon_intrinsics.c"))

@@ -56,4 +57,4 @@ if env['builtin_libpng']:
 # Godot source files
 env_png.add_source_files(env.drivers_sources, "*.cpp")

-Export('env')
+Export("env")

@@ -1,5 +1,5 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 env.add_source_files(env.drivers_sources, "*.cpp")

@@ -1,7 +1,7 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 env.add_source_files(env.drivers_sources, "*.cpp")

-env["check_c_headers"] = [ [ "mntent.h", "HAVE_MNTENT" ] ]
+env["check_c_headers"] = [["mntent.h", "HAVE_MNTENT"]]

@@ -1,6 +1,6 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 # Driver source files
 env.add_source_files(env.drivers_sources, "*.cpp")

@@ -1,5 +1,5 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 env.add_source_files(env.drivers_sources, "*.cpp")

@@ -1,6 +1,6 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 # Driver source files
 env.add_source_files(env.drivers_sources, "*.cpp")

@@ -1,7 +1,7 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 env.add_source_files(env.drivers_sources, "*.cpp")
-env.Append(CPPDEFINES=['XAUDIO2_ENABLED'])
-env.Append(LINKFLAGS=['xaudio2_8.lib'])
+env.Append(CPPDEFINES=["XAUDIO2_ENABLED"])
+env.Append(LINKFLAGS=["xaudio2_8.lib"])
36 editor/SCsub
@@ -1,6 +1,6 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 env.editor_sources = []

@@ -18,24 +18,24 @@ def _make_doc_data_class_path(to_path):
     g.write("static const int _doc_data_class_path_count = " + str(len(env.doc_class_path)) + ";\n")
     g.write("struct _DocDataClassPath { const char* name; const char* path; };\n")

-    g.write("static const _DocDataClassPath _doc_data_class_paths[" + str(len(env.doc_class_path) + 1) + "] = {\n");
+    g.write("static const _DocDataClassPath _doc_data_class_paths[" + str(len(env.doc_class_path) + 1) + "] = {\n")
     for c in sorted(env.doc_class_path):
-        g.write("\t{\"" + c + "\", \"" + env.doc_class_path[c] + "\"},\n")
+        g.write('\t{"' + c + '", "' + env.doc_class_path[c] + '"},\n')
     g.write("\t{NULL, NULL}\n")
     g.write("};\n")

     g.close()


-if env['tools']:
+if env["tools"]:
     # Register exporters
     reg_exporters_inc = '#include "register_exporters.h"\n'
-    reg_exporters = 'void register_exporters() {\n'
+    reg_exporters = "void register_exporters() {\n"
     for e in env.platform_exporters:
         env.add_source_files(env.editor_sources, "#platform/" + e + "/export/export.cpp")
-        reg_exporters += '\tregister_' + e + '_exporter();\n'
+        reg_exporters += "\tregister_" + e + "_exporter();\n"
         reg_exporters_inc += '#include "platform/' + e + '/export/export.h"\n'
-    reg_exporters += '}\n'
+    reg_exporters += "}\n"

     # NOTE: It is safe to generate this file here, since this is still executed serially
     with open_utf8("register_exporters.gen.cpp", "w") as f:

@@ -58,7 +58,7 @@ if env['tools']:
         else:
             docs += Glob(d + "/*.xml")  # Custom.

-    _make_doc_data_class_path(os.path.join(env.Dir('#').abspath, "editor/doc"))
+    _make_doc_data_class_path(os.path.join(env.Dir("#").abspath, "editor/doc"))

     docs = sorted(docs)
     env.Depends("#editor/doc_data_compressed.gen.h", docs)

@@ -68,24 +68,24 @@ if env['tools']:

     # Translations
     tlist = glob.glob(path + "/translations/*.po")
-    env.Depends('#editor/translations.gen.h', tlist)
-    env.CommandNoCache('#editor/translations.gen.h', tlist, run_in_subprocess(editor_builders.make_translations_header))
+    env.Depends("#editor/translations.gen.h", tlist)
+    env.CommandNoCache("#editor/translations.gen.h", tlist, run_in_subprocess(editor_builders.make_translations_header))

     # Fonts
     flist = glob.glob(path + "/../thirdparty/fonts/*.ttf")
     flist.extend(glob.glob(path + "/../thirdparty/fonts/*.otf"))
     flist.sort()
-    env.Depends('#editor/builtin_fonts.gen.h', flist)
-    env.CommandNoCache('#editor/builtin_fonts.gen.h', flist, run_in_subprocess(editor_builders.make_fonts_header))
+    env.Depends("#editor/builtin_fonts.gen.h", flist)
+    env.CommandNoCache("#editor/builtin_fonts.gen.h", flist, run_in_subprocess(editor_builders.make_fonts_header))

     env.add_source_files(env.editor_sources, "*.cpp")

-    SConscript('collada/SCsub')
-    SConscript('doc/SCsub')
-    SConscript('fileserver/SCsub')
-    SConscript('icons/SCsub')
-    SConscript('import/SCsub')
-    SConscript('plugins/SCsub')
+    SConscript("collada/SCsub")
+    SConscript("doc/SCsub")
+    SConscript("fileserver/SCsub")
+    SConscript("icons/SCsub")
+    SConscript("import/SCsub")
+    SConscript("plugins/SCsub")

     lib = env.add_library("editor", env.editor_sources)
     env.Prepend(LIBS=[lib])
@@ -1,5 +1,5 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 env.add_source_files(env.editor_sources, "*.cpp")

@@ -1,5 +1,5 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 env.add_source_files(env.editor_sources, "*.cpp")

@@ -26,6 +26,7 @@ def make_doc_header(target, source, env):
     buf = encode_utf8(docbegin + buf + docend)
     decomp_size = len(buf)
     import zlib
+
     buf = zlib.compress(buf)

     g.write("/* THIS FILE IS GENERATED DO NOT EDIT */\n")

@@ -56,7 +57,7 @@ def make_fonts_header(target, source, env):
     # saving uncompressed, since freetype will reference from memory pointer
     xl_names = []
     for i in range(len(source)):
-        with open(source[i], "rb")as f:
+        with open(source[i], "rb") as f:
             buf = f.read()

         name = os.path.splitext(os.path.basename(source[i]))[0]

@@ -112,7 +113,7 @@ def make_translations_header(target, source, env):
     g.write("};\n\n")
     g.write("static EditorTranslationList _editor_translations[] = {\n")
     for x in xl_names:
-        g.write("\t{ \"" + x[0] + "\", " + str(x[1]) + ", " + str(x[2]) + ", _translation_" + x[0] + "_compressed},\n")
+        g.write('\t{ "' + x[0] + '", ' + str(x[1]) + ", " + str(x[2]) + ", _translation_" + x[0] + "_compressed},\n")
     g.write("\t{NULL, 0, 0, NULL}\n")
     g.write("};\n")

@@ -120,5 +121,6 @@ def make_translations_header(target, source, env):

     g.close()

-if __name__ == '__main__':
+
+if __name__ == "__main__":
     subprocess_main(globals())
@@ -593,7 +593,9 @@ void EditorNode::_fs_changed() {
				preset.unref();
			}
			if (preset.is_null()) {
-				export_error = vformat("Invalid export preset name: %s.", preset_name);
+				export_error = vformat(
+						"Invalid export preset name: %s. Make sure `export_presets.cfg` is present in the current directory.",
+						preset_name);
			} else {
				Ref<EditorExportPlatform> platform = preset->get_platform();
				if (platform.is_null()) {

@@ -3426,13 +3428,13 @@ Error EditorNode::load_scene(const String &p_scene, bool p_ignore_broken_deps, b
	if (!new_scene) {

		sdata.unref();
-		_dialog_display_load_error(lpath, ERR_FILE_NOT_FOUND);
+		_dialog_display_load_error(lpath, ERR_FILE_CORRUPT);
		opening_prev = false;
		if (prev != -1) {
			set_current_scene(prev);
			editor_data.remove_scene(idx);
		}
-		return ERR_FILE_NOT_FOUND;
+		return ERR_FILE_CORRUPT;
	}

	if (p_set_inherited) {
@@ -610,6 +610,7 @@ public:
	Vector<Rect2> flag_rects;
	Vector<String> names;
	Vector<String> tooltips;
+	int hovered_index;

	virtual Size2 get_minimum_size() const {
		Ref<Font> font = get_font("font", "Label");

@@ -625,57 +626,79 @@ public:
		return String();
	}
	void _gui_input(const Ref<InputEvent> &p_ev) {
-		Ref<InputEventMouseButton> mb = p_ev;
-		if (mb.is_valid() && mb->get_button_index() == BUTTON_LEFT && mb->is_pressed()) {
+		const Ref<InputEventMouseMotion> mm = p_ev;
+
+		if (mm.is_valid()) {
			for (int i = 0; i < flag_rects.size(); i++) {
-				if (flag_rects[i].has_point(mb->get_position())) {
-					//toggle
-					if (value & (1 << i)) {
-						value &= ~(1 << i);
-					} else {
-						value |= (1 << i);
-					}
-					emit_signal("flag_changed", value);
+				if (flag_rects[i].has_point(mm->get_position())) {
+					// Used to highlight the hovered flag in the layers grid.
+					hovered_index = i;
					update();
+					break;
				}
			}
		}
+
+		const Ref<InputEventMouseButton> mb = p_ev;
+
+		if (mb.is_valid() && mb->get_button_index() == BUTTON_LEFT && mb->is_pressed()) {
+			// Toggle the flag.
+			// We base our choice on the hovered flag, so that it always matches the hovered flag.
+			if (value & (1 << hovered_index)) {
+				value &= ~(1 << hovered_index);
+			} else {
+				value |= (1 << hovered_index);
+			}
+
+			emit_signal("flag_changed", value);
+			update();
+		}
	}

	void _notification(int p_what) {
-		if (p_what == NOTIFICATION_DRAW) {
-
-			Rect2 rect;
-			rect.size = get_size();
-			flag_rects.clear();
-
-			int bsize = (rect.size.height * 80 / 100) / 2;
-
-			int h = bsize * 2 + 1;
-			int vofs = (rect.size.height - h) / 2;
-
-			Color color = get_color("highlight_color", "Editor");
-			for (int i = 0; i < 2; i++) {
-
-				Point2 ofs(4, vofs);
-				if (i == 1)
-					ofs.y += bsize + 1;
-
-				ofs += rect.position;
-				for (int j = 0; j < 10; j++) {
-
-					Point2 o = ofs + Point2(j * (bsize + 1), 0);
-					if (j >= 5)
-						o.x += 1;
-
-					uint32_t idx = i * 10 + j;
-					bool on = value & (1 << idx);
-					Rect2 rect2 = Rect2(o, Size2(bsize, bsize));
-					color.a = on ? 0.6 : 0.2;
-					draw_rect(rect2, color);
-					flag_rects.push_back(rect2);
-				}
-			}
+		switch (p_what) {
+			case NOTIFICATION_DRAW: {
+				Rect2 rect;
+				rect.size = get_size();
+				flag_rects.clear();
+
+				const int bsize = (rect.size.height * 80 / 100) / 2;
+				const int h = bsize * 2 + 1;
+				const int vofs = (rect.size.height - h) / 2;
+
+				Color color = get_color("highlight_color", "Editor");
+				for (int i = 0; i < 2; i++) {
+					Point2 ofs(4, vofs);
+					if (i == 1)
+						ofs.y += bsize + 1;
+
+					ofs += rect.position;
+					for (int j = 0; j < 10; j++) {
+						Point2 o = ofs + Point2(j * (bsize + 1), 0);
+						if (j >= 5)
+							o.x += 1;
+
+						const int idx = i * 10 + j;
+						const bool on = value & (1 << idx);
+						Rect2 rect2 = Rect2(o, Size2(bsize, bsize));
+
+						color.a = on ? 0.6 : 0.2;
+						if (idx == hovered_index) {
+							// Add visual feedback when hovering a flag.
+							color.a += 0.15;
+						}
+
+						draw_rect(rect2, color);
+						flag_rects.push_back(rect2);
+					}
+				}
+			} break;
+			case NOTIFICATION_MOUSE_EXIT: {
+				hovered_index = -1;
+				update();
+			} break;
+			default:
+				break;
		}
	}

@@ -692,6 +715,7 @@ public:

	EditorPropertyLayersGrid() {
		value = 0;
+		hovered_index = -1; // Nothing is hovered.
	}
};
void EditorPropertyLayers::_grid_changed(uint32_t p_grid) {

@@ -792,7 +816,7 @@ EditorPropertyLayers::EditorPropertyLayers() {
	hb->add_child(grid);
	button = memnew(Button);
	button->set_toggle_mode(true);
-	button->set_text("..");
+	button->set_text("...");
	button->connect("pressed", this, "_button_pressed");
	hb->add_child(button);
	set_bottom_editor(hb);
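The reworked `_gui_input` above flips exactly one bit of `value` based on the hovered flag. The if/else pair (`value &= ~(1 << i)` versus `value |= (1 << i)`) is equivalent to a single XOR with the bit mask; a small sketch of that identity (the function name is illustrative, not from the source):

```python
def toggle_flag(value, hovered_index):
    # Clearing a set bit and setting a cleared bit are both a XOR with the mask.
    return value ^ (1 << hovered_index)
```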
@@ -1,5 +1,5 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 env.add_source_files(env.editor_sources, "*.cpp")

@@ -1,17 +1,17 @@
 #!/usr/bin/env python

-Import('env')
+Import("env")

 import os

 from platform_methods import run_in_subprocess
 import editor_icons_builders

-make_editor_icons_builder = Builder(action=run_in_subprocess(editor_icons_builders.make_editor_icons_action),
-                                    suffix='.h',
-                                    src_suffix='.svg')
+make_editor_icons_builder = Builder(
+    action=run_in_subprocess(editor_icons_builders.make_editor_icons_action), suffix=".h", src_suffix=".svg"
+)

-env['BUILDERS']['MakeEditorIconsBuilder'] = make_editor_icons_builder
+env["BUILDERS"]["MakeEditorIconsBuilder"] = make_editor_icons_builder

 # Editor's own icons
 icon_sources = Glob("*.svg")

@@ -23,4 +23,4 @@ for path in env.module_icons_paths:
     else:
         icon_sources += Glob(path + "/*.svg")  # Custom.

-env.Alias('editor_icons', [env.MakeEditorIconsBuilder('#editor/editor_icons.gen.h', icon_sources)])
+env.Alias("editor_icons", [env.MakeEditorIconsBuilder("#editor/editor_icons.gen.h", icon_sources)])
|
|||
|
||||
icons_string.write('\t"')
|
||||
|
||||
with open(fname, 'rb') as svgf:
|
||||
with open(fname, "rb") as svgf:
|
||||
b = svgf.read(1)
|
||||
while(len(b) == 1):
|
||||
while len(b) == 1:
|
||||
icons_string.write("\\" + str(hex(ord(b)))[1:])
|
||||
b = svgf.read(1)
|
||||
|
||||
|
||||
icons_string.write('"')
|
||||
if fname != svg_icons[-1]:
|
||||
icons_string.write(",")
|
||||
icons_string.write('\n')
|
||||
icons_string.write("\n")
|
||||
|
||||
s = StringIO()
|
||||
s.write("/* THIS FILE IS GENERATED DO NOT EDIT */\n")
|
||||
|
@ -40,12 +39,12 @@ def make_editor_icons_action(target, source, env):
|
|||
s.write("static const int editor_icons_count = {};\n".format(len(svg_icons)))
|
||||
s.write("static const char *editor_icons_sources[] = {\n")
|
||||
s.write(icons_string.getvalue())
|
||||
s.write('};\n\n')
|
||||
s.write("};\n\n")
|
||||
s.write("static const char *editor_icons_names[] = {\n")
|
||||
|
||||
# this is used to store the indices of thumbnail icons
|
||||
thumb_medium_indices = [];
|
||||
thumb_big_indices = [];
|
||||
thumb_medium_indices = []
|
||||
thumb_big_indices = []
|
||||
index = 0
|
||||
for f in svg_icons:
|
||||
|
||||
|
@ -53,7 +52,7 @@ def make_editor_icons_action(target, source, env):
|
|||
|
||||
icon_name = os.path.basename(fname)[5:-4].title().replace("_", "")
|
||||
# some special cases
|
||||
if icon_name in ['Int', 'Bool', 'Float']:
|
||||
if icon_name in ["Int", "Bool", "Float"]:
|
||||
icon_name = icon_name.lower()
|
||||
if icon_name.endswith("MediumThumb"): # don't know a better way to handle this
|
||||
thumb_medium_indices.append(str(index))
|
||||
|
@ -64,11 +63,11 @@ def make_editor_icons_action(target, source, env):
|
|||
|
||||
if fname != svg_icons[-1]:
|
||||
s.write(",")
|
||||
s.write('\n')
|
||||
s.write("\n")
|
||||
|
||||
index += 1
|
||||
|
||||
s.write('};\n')
|
||||
s.write("};\n")
|
||||
|
||||
if thumb_medium_indices:
|
||||
s.write("\n\n")
|
||||
|
@ -92,5 +91,5 @@ def make_editor_icons_action(target, source, env):
|
|||
icons_string.close()
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
if __name__ == "__main__":
|
||||
subprocess_main(globals())
|
||||
|
|
|
@ -1,5 +1,5 @@
|
|||
#!/usr/bin/env python
|
||||
|
||||
Import('env')
|
||||
Import("env")
|
||||
|
||||
env.add_source_files(env.editor_sources, "*.cpp")
|
||||
|
|
|
@ -1,5 +1,5 @@
|
|||
#!/usr/bin/env python
|
||||
|
||||
Import('env')
|
||||
Import("env")
|
||||
|
||||
env.add_source_files(env.editor_sources, "*.cpp")
|
||||
|
|
|
@@ -902,7 +902,6 @@ void ScriptEditor::_file_dialog_action(String p_file) {
			Error err;
			FileAccess *file = FileAccess::open(p_file, FileAccess::WRITE, &err);
			if (err) {
-				memdelete(file);
				editor->show_warning(TTR("Error writing TextFile:") + "\n" + p_file, TTR("Error!"));
				break;
			}

@@ -3407,11 +3407,7 @@ void SpatialEditorViewport::reset() {
	last_message = "";
	name = "";

-	cursor.x_rot = 0.5;
-	cursor.y_rot = 0.5;
-	cursor.distance = 4;
-	cursor.region_select = false;
-	cursor.pos = Vector3();
+	cursor = Cursor();
	_update_name();
}

@@ -5380,6 +5376,9 @@ void SpatialEditor::_update_gizmos_menu() {
		const int plugin_state = gizmo_plugins_by_name[i]->get_state();
		gizmos_menu->add_multistate_item(TTR(plugin_name), 3, plugin_state, i);
		const int idx = gizmos_menu->get_item_index(i);
+		gizmos_menu->set_item_tooltip(
+				idx,
+				TTR("Click to toggle between visibility states.\n\nOpen eye: Gizmo is visible.\nClosed eye: Gizmo is hidden.\nHalf-open eye: Gizmo is also visible through opaque surfaces (\"x-ray\")."));
		switch (plugin_state) {
			case EditorSpatialGizmoPlugin::VISIBLE:
				gizmos_menu->set_item_icon(idx, gizmos_menu->get_icon("visibility_visible"));
@@ -375,7 +375,9 @@ private:
		Point2 region_begin, region_end;

		Cursor() {
-			x_rot = y_rot = 0.5;
+			// These rotations place the camera in +X +Y +Z, aka south east, facing north west.
+			x_rot = 0.5;
+			y_rot = -0.5;
			distance = 4;
			region_select = false;
		}

@@ -2696,7 +2696,9 @@ void SceneTreeDock::_remote_tree_selected() {

void SceneTreeDock::_local_tree_selected() {

-	scene_tree->show();
+	if (!bool(EDITOR_GET("interface/editors/show_scene_tree_root_selection")) || get_tree()->get_edited_scene_root() != nullptr) {
+		scene_tree->show();
+	}
	if (remote_tree)
		remote_tree->hide();
	edit_remote->set_pressed(false);
@@ -646,12 +646,9 @@ void ScriptEditorDebugger::_parse_message(const String &p_msg, const Array &p_da
			if (path.find("::") != -1) {
				// built-in resource
				String base_path = path.get_slice("::", 0);
-				if (ResourceLoader::get_resource_type(base_path) == "PackedScene") {
-					if (!EditorNode::get_singleton()->is_scene_open(base_path)) {
-						EditorNode::get_singleton()->load_scene(base_path);
-					}
-				} else {
-					EditorNode::get_singleton()->load_resource(base_path);
+				RES dependency = ResourceLoader::load(base_path);
+				if (dependency.is_valid()) {
+					remote_dependencies.insert(dependency);
				}
			}
			var = ResourceLoader::load(path);

@@ -2144,6 +2141,7 @@ void ScriptEditorDebugger::_clear_remote_objects() {
		memdelete(E->value());
	}
	remote_objects.clear();
+	remote_dependencies.clear();
}

void ScriptEditorDebugger::_clear_errors_list() {

@@ -100,6 +100,7 @@ private:
	ObjectID inspected_object_id;
	ScriptEditorDebuggerVariables *variables;
	Map<ObjectID, ScriptEditorDebuggerInspectedObject *> remote_objects;
+	Set<RES> remote_dependencies;
	Set<ObjectID> unfold_cache;

	VBoxContainer *errors_tab;
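The debugger change above stops eagerly opening scenes and instead keeps each loaded base resource in a `Set<RES> remote_dependencies`, so the reference-counted resources stay alive between debugger messages and are released in `_clear_remote_objects()`. A hypothetical Python sketch of the same keep-alive pattern (names mirror the C++ but are illustrative):

```python
# Strong references held here keep remotely-inspected resources loaded.
remote_dependencies = set()

def load_dependency(path, loader):
    dependency = loader(path)  # stands in for ResourceLoader::load()
    if dependency is not None:
        remote_dependencies.add(dependency)
    return dependency

def clear_remote_objects():
    # Mirrors _clear_remote_objects(): dropping the set releases the references.
    remote_dependencies.clear()

load_dependency("res://icon.png", lambda p: p.upper())
```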
@ -10,23 +10,23 @@ import sys
|
|||
line_nb = False
|
||||
|
||||
for arg in sys.argv[1:]:
|
||||
if (arg == "--with-line-nb"):
|
||||
if arg == "--with-line-nb":
|
||||
print("Enabling line numbers in the context locations.")
|
||||
line_nb = True
|
||||
else:
|
||||
os.sys.exit("Non supported argument '" + arg + "'. Aborting.")
|
||||
|
||||
|
||||
if (not os.path.exists("editor")):
|
||||
if not os.path.exists("editor"):
|
||||
os.sys.exit("ERROR: This script should be started from the root of the git repo.")
|
||||
|
||||
|
||||
matches = []
|
||||
for root, dirnames, filenames in os.walk('.'):
|
||||
for root, dirnames, filenames in os.walk("."):
|
||||
dirnames[:] = [d for d in dirnames if d not in ["thirdparty"]]
|
||||
for filename in fnmatch.filter(filenames, '*.cpp'):
|
||||
for filename in fnmatch.filter(filenames, "*.cpp"):
|
||||
matches.append(os.path.join(root, filename))
|
||||
for filename in fnmatch.filter(filenames, '*.h'):
|
||||
for filename in fnmatch.filter(filenames, "*.h"):
|
||||
matches.append(os.path.join(root, filename))
|
||||
matches.sort()
|
||||
|
||||
|
@ -49,52 +49,54 @@ msgstr ""
|
|||
"Content-Transfer-Encoding: 8-bit\\n"
|
||||
"""
|
||||
|
||||
|
||||
def process_file(f, fname):
|
||||
|
||||
global main_po, unique_str, unique_loc
|
||||
|
||||
l = f.readline()
|
||||
lc = 1
|
||||
while (l):
|
||||
while l:
|
||||
|
||||
patterns = ['RTR(\"', 'TTR(\"', 'TTRC(\"']
|
||||
patterns = ['RTR("', 'TTR("', 'TTRC("']
|
||||
idx = 0
|
||||
pos = 0
|
||||
while (pos >= 0):
|
||||
while pos >= 0:
|
||||
pos = l.find(patterns[idx], pos)
|
||||
if (pos == -1):
|
||||
if (idx < len(patterns) - 1):
|
||||
if pos == -1:
|
||||
if idx < len(patterns) - 1:
|
||||
idx += 1
|
||||
pos = 0
|
||||
continue
|
||||
pos += len(patterns[idx])
|
||||
|
||||
msg = ""
|
||||
while (pos < len(l) and (l[pos] != '"' or l[pos - 1] == '\\')):
|
||||
while pos < len(l) and (l[pos] != '"' or l[pos - 1] == "\\"):
|
||||
msg += l[pos]
|
||||
pos += 1
|
||||
|
||||
location = os.path.relpath(fname).replace('\\', '/')
|
||||
if (line_nb):
|
||||
location = os.path.relpath(fname).replace("\\", "/")
|
||||
if line_nb:
|
||||
location += ":" + str(lc)
|
||||
|
||||
if (not msg in unique_str):
|
||||
if not msg in unique_str:
|
||||
main_po += "\n#: " + location + "\n"
|
||||
main_po += 'msgid "' + msg + '"\n'
|
||||
main_po += 'msgstr ""\n'
|
||||
unique_str.append(msg)
|
||||
unique_loc[msg] = [location]
|
||||
elif (not location in unique_loc[msg]):
|
||||
elif not location in unique_loc[msg]:
|
||||
# Add additional location to previous occurrence too
|
||||
msg_pos = main_po.find('\nmsgid "' + msg + '"')
|
||||
if (msg_pos == -1):
|
||||
if msg_pos == -1:
|
||||
print("Someone apparently thought writing Python was as easy as GDScript. Ping Akien.")
|
||||
main_po = main_po[:msg_pos] + ' ' + location + main_po[msg_pos:]
|
||||
main_po = main_po[:msg_pos] + " " + location + main_po[msg_pos:]
|
||||
unique_loc[msg].append(location)
|
||||
|
||||
l = f.readline()
|
||||
lc += 1
|
||||
|
||||
|
||||
print("Updating the editor.pot template...")
|
||||
|
||||
for fname in matches:
|
||||
|
@@ -104,7 +106,7 @@ for fname in matches:
 with open("editor.pot", "w") as f:
     f.write(main_po)
 
-if (os.name == "posix"):
+if os.name == "posix":
     print("Wrapping template at 79 characters for compatibility with Weblate.")
     os.system("msgmerge -w79 editor.pot editor.pot > editor.pot.wrap")
     shutil.move("editor.pot.wrap", "editor.pot")
@@ -112,7 +114,7 @@ if (os.name == "posix"):
 shutil.move("editor.pot", "editor/translations/editor.pot")
 
 # TODO: Make that in a portable way, if we care; if not, kudos to Unix users
-if (os.name == "posix"):
+if os.name == "posix":
     added = subprocess.check_output(r"git diff editor/translations/editor.pot | grep \+msgid | wc -l", shell=True)
     removed = subprocess.check_output(r"git diff editor/translations/editor.pot | grep \\\-msgid | wc -l", shell=True)
     print("\n# Template changes compared to the staged status:")
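The hunks above are mechanical reformatting by the newly added Black check (see the `black-format.sh` step and `pip3 install --user black pygments` added to `.travis.yml`): redundant parentheses dropped, single quotes normalized to double quotes, and slices with computed bounds given a space before the colon. A minimal sketch showing the old and new spellings behave identically (variable names here are illustrative, not from the script):

```python
# Black's slice rule (PEP 8: when a slice bound is an expression, treat the
# colon like a binary operator with equal space on both sides) and its
# double-quote normalization change spelling only, never behavior.
s = "path/to/editor.pot"
old_style = s[s.rfind("/") + 1:]   # pre-Black spelling
new_style = s[s.rfind("/") + 1 :]  # Black spelling, identical result
assert old_style == new_style == "editor.pot"
assert 'editor.pot' == "editor.pot"  # quote style is also purely cosmetic
```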
230	gles_builders.py
@@ -7,7 +7,6 @@ from platform_methods import subprocess_main
 
 
 class LegacyGLHeaderStruct:
-
     def __init__(self):
         self.vertex_lines = []
         self.fragment_lines = []
@@ -73,7 +72,7 @@ def include_file_in_legacygl_header(filename, header_data, depth):
             ifdefline = line.replace("#ifdef ", "").strip()
 
             if line.find("_EN_") != -1:
-                enumbase = ifdefline[:ifdefline.find("_EN_")]
+                enumbase = ifdefline[: ifdefline.find("_EN_")]
                 ifdefline = ifdefline.replace("_EN_", "_")
                 line = line.replace("_EN_", "_")
                 if enumbase not in header_data.enums:
@@ -86,12 +85,12 @@ def include_file_in_legacygl_header(filename, header_data, depth):
 
         if line.find("uniform") != -1 and line.lower().find("texunit:") != -1:
             # texture unit
-            texunitstr = line[line.find(":") + 1:].strip()
+            texunitstr = line[line.find(":") + 1 :].strip()
             if texunitstr == "auto":
                 texunit = "-1"
             else:
                 texunit = str(int(texunitstr))
-            uline = line[:line.lower().find("//")]
+            uline = line[: line.lower().find("//")]
             uline = uline.replace("uniform", "")
             uline = uline.replace("highp", "")
             uline = uline.replace(";", "")
@@ -99,10 +98,10 @@ def include_file_in_legacygl_header(filename, header_data, depth):
             for x in lines:
 
                 x = x.strip()
-                x = x[x.rfind(" ") + 1:]
+                x = x[x.rfind(" ") + 1 :]
                 if x.find("[") != -1:
                     # unfiorm array
-                    x = x[:x.find("[")]
+                    x = x[: x.find("[")]
 
                 if not x in header_data.texunit_names:
                     header_data.texunits += [(x, texunit)]
@@ -110,10 +109,10 @@ def include_file_in_legacygl_header(filename, header_data, depth):
 
         elif line.find("uniform") != -1 and line.lower().find("ubo:") != -1:
             # uniform buffer object
-            ubostr = line[line.find(":") + 1:].strip()
+            ubostr = line[line.find(":") + 1 :].strip()
             ubo = str(int(ubostr))
-            uline = line[:line.lower().find("//")]
-            uline = uline[uline.find("uniform") + len("uniform"):]
+            uline = line[: line.lower().find("//")]
+            uline = uline[uline.find("uniform") + len("uniform") :]
             uline = uline.replace("highp", "")
             uline = uline.replace(";", "")
             uline = uline.replace("{", "").strip()
@@ -121,10 +120,10 @@ def include_file_in_legacygl_header(filename, header_data, depth):
             for x in lines:
 
                 x = x.strip()
-                x = x[x.rfind(" ") + 1:]
+                x = x[x.rfind(" ") + 1 :]
                 if x.find("[") != -1:
                     # unfiorm array
-                    x = x[:x.find("[")]
+                    x = x[: x.find("[")]
 
                 if not x in header_data.ubo_names:
                     header_data.ubos += [(x, ubo)]
@@ -137,10 +136,10 @@ def include_file_in_legacygl_header(filename, header_data, depth):
             for x in lines:
 
                 x = x.strip()
-                x = x[x.rfind(" ") + 1:]
+                x = x[x.rfind(" ") + 1 :]
                 if x.find("[") != -1:
                     # unfiorm array
-                    x = x[:x.find("[")]
+                    x = x[: x.find("[")]
 
                 if not x in header_data.uniforms:
                     header_data.uniforms += [x]
@@ -150,7 +149,7 @@ def include_file_in_legacygl_header(filename, header_data, depth):
             uline = uline.replace("attribute ", "")
             uline = uline.replace("highp ", "")
             uline = uline.replace(";", "")
-            uline = uline[uline.find(" "):].strip()
+            uline = uline[uline.find(" ") :].strip()
 
             if uline.find("//") != -1:
                 name, bind = uline.split("//")
@@ -163,7 +162,7 @@ def include_file_in_legacygl_header(filename, header_data, depth):
             uline = line.replace("out ", "")
             uline = uline.replace("highp ", "")
             uline = uline.replace(";", "")
-            uline = uline[uline.find(" "):].strip()
+            uline = uline[uline.find(" ") :].strip()
 
             if uline.find("//") != -1:
                 name, bind = uline.split("//")
@@ -200,17 +199,19 @@ def build_legacygl_header(filename, include, class_suffix, output_attribs, gles2
     fd.write("/* WARNING, THIS FILE WAS GENERATED, DO NOT EDIT */\n")
 
     out_file_base = out_file
-    out_file_base = out_file_base[out_file_base.rfind("/") + 1:]
-    out_file_base = out_file_base[out_file_base.rfind("\\") + 1:]
+    out_file_base = out_file_base[out_file_base.rfind("/") + 1 :]
+    out_file_base = out_file_base[out_file_base.rfind("\\") + 1 :]
     out_file_ifdef = out_file_base.replace(".", "_").upper()
     fd.write("#ifndef " + out_file_ifdef + class_suffix + "_120\n")
     fd.write("#define " + out_file_ifdef + class_suffix + "_120\n")
 
-    out_file_class = out_file_base.replace(".glsl.gen.h", "").title().replace("_", "").replace(".", "") + "Shader" + class_suffix
+    out_file_class = (
+        out_file_base.replace(".glsl.gen.h", "").title().replace("_", "").replace(".", "") + "Shader" + class_suffix
+    )
     fd.write("\n\n")
-    fd.write("#include \"" + include + "\"\n\n\n")
+    fd.write('#include "' + include + '"\n\n\n')
     fd.write("class " + out_file_class + " : public Shader" + class_suffix + " {\n\n")
-    fd.write("\t virtual String get_shader_name() const { return \"" + out_file_class + "\"; }\n")
+    fd.write('\t virtual String get_shader_name() const { return "' + out_file_class + '"; }\n')
 
     fd.write("public:\n\n")
@@ -228,29 +229,64 @@ def build_legacygl_header(filename, include, class_suffix, output_attribs, gles2
 
     fd.write("\t_FORCE_INLINE_ int get_uniform(Uniforms p_uniform) const { return _get_uniform(p_uniform); }\n\n")
     if header_data.conditionals:
-        fd.write("\t_FORCE_INLINE_ void set_conditional(Conditionals p_conditional,bool p_enable) { _set_conditional(p_conditional,p_enable); }\n\n")
+        fd.write(
+            "\t_FORCE_INLINE_ void set_conditional(Conditionals p_conditional,bool p_enable) { _set_conditional(p_conditional,p_enable); }\n\n"
+        )
     fd.write("\t#ifdef DEBUG_ENABLED\n ")
-    fd.write("\t#define _FU if (get_uniform(p_uniform)<0) return; if (!is_version_valid()) return; ERR_FAIL_COND( get_active()!=this ); \n\n ")
+    fd.write(
+        "\t#define _FU if (get_uniform(p_uniform)<0) return; if (!is_version_valid()) return; ERR_FAIL_COND( get_active()!=this ); \n\n "
+    )
     fd.write("\t#else\n ")
     fd.write("\t#define _FU if (get_uniform(p_uniform)<0) return; \n\n ")
     fd.write("\t#endif\n")
-    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_value) { _FU glUniform1f(get_uniform(p_uniform),p_value); }\n\n")
-    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, double p_value) { _FU glUniform1f(get_uniform(p_uniform),p_value); }\n\n")
-    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, uint8_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n")
-    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, int8_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n")
-    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, uint16_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n")
-    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, int16_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n")
-    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, uint32_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n")
-    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, int32_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n")
-    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Color& p_color) { _FU GLfloat col[4]={p_color.r,p_color.g,p_color.b,p_color.a}; glUniform4fv(get_uniform(p_uniform),1,col); }\n\n")
-    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Vector2& p_vec2) { _FU GLfloat vec2[2]={p_vec2.x,p_vec2.y}; glUniform2fv(get_uniform(p_uniform),1,vec2); }\n\n")
-    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Size2i& p_vec2) { _FU GLint vec2[2]={p_vec2.x,p_vec2.y}; glUniform2iv(get_uniform(p_uniform),1,vec2); }\n\n")
-    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Vector3& p_vec3) { _FU GLfloat vec3[3]={p_vec3.x,p_vec3.y,p_vec3.z}; glUniform3fv(get_uniform(p_uniform),1,vec3); }\n\n")
-    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_a, float p_b) { _FU glUniform2f(get_uniform(p_uniform),p_a,p_b); }\n\n")
-    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_a, float p_b, float p_c) { _FU glUniform3f(get_uniform(p_uniform),p_a,p_b,p_c); }\n\n")
-    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_a, float p_b, float p_c, float p_d) { _FU glUniform4f(get_uniform(p_uniform),p_a,p_b,p_c,p_d); }\n\n")
+    fd.write(
+        "\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_value) { _FU glUniform1f(get_uniform(p_uniform),p_value); }\n\n"
+    )
+    fd.write(
+        "\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, double p_value) { _FU glUniform1f(get_uniform(p_uniform),p_value); }\n\n"
+    )
+    fd.write(
+        "\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, uint8_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n"
+    )
+    fd.write(
+        "\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, int8_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n"
+    )
+    fd.write(
+        "\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, uint16_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n"
+    )
+    fd.write(
+        "\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, int16_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n"
+    )
+    fd.write(
+        "\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, uint32_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n"
+    )
+    fd.write(
+        "\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, int32_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n"
+    )
+    fd.write(
+        "\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Color& p_color) { _FU GLfloat col[4]={p_color.r,p_color.g,p_color.b,p_color.a}; glUniform4fv(get_uniform(p_uniform),1,col); }\n\n"
+    )
+    fd.write(
+        "\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Vector2& p_vec2) { _FU GLfloat vec2[2]={p_vec2.x,p_vec2.y}; glUniform2fv(get_uniform(p_uniform),1,vec2); }\n\n"
+    )
+    fd.write(
+        "\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Size2i& p_vec2) { _FU GLint vec2[2]={p_vec2.x,p_vec2.y}; glUniform2iv(get_uniform(p_uniform),1,vec2); }\n\n"
+    )
+    fd.write(
+        "\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Vector3& p_vec3) { _FU GLfloat vec3[3]={p_vec3.x,p_vec3.y,p_vec3.z}; glUniform3fv(get_uniform(p_uniform),1,vec3); }\n\n"
+    )
+    fd.write(
+        "\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_a, float p_b) { _FU glUniform2f(get_uniform(p_uniform),p_a,p_b); }\n\n"
+    )
+    fd.write(
+        "\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_a, float p_b, float p_c) { _FU glUniform3f(get_uniform(p_uniform),p_a,p_b,p_c); }\n\n"
+    )
+    fd.write(
+        "\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_a, float p_b, float p_c, float p_d) { _FU glUniform4f(get_uniform(p_uniform),p_a,p_b,p_c,p_d); }\n\n"
+    )
 
-    fd.write("""\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Transform& p_transform) { _FU
+    fd.write(
+        """\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Transform& p_transform) { _FU
 
         const Transform &tr = p_transform;
@@ -279,9 +315,11 @@ def build_legacygl_header(filename, include, class_suffix, output_attribs, gles2
 
         }
 
-        """)
+        """
+    )
 
-    fd.write("""_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Transform2D& p_transform) { _FU
+    fd.write(
+        """_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Transform2D& p_transform) { _FU
 
         const Transform2D &tr = p_transform;
@@ -310,9 +348,11 @@ def build_legacygl_header(filename, include, class_suffix, output_attribs, gles2
 
         }
 
-        """)
+        """
+    )
 
-    fd.write("""_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const CameraMatrix& p_matrix) { _FU
+    fd.write(
+        """_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const CameraMatrix& p_matrix) { _FU
 
         GLfloat matrix[16];
@@ -324,7 +364,8 @@ def build_legacygl_header(filename, include, class_suffix, output_attribs, gles2
     }
 
     glUniformMatrix4fv(get_uniform(p_uniform),1,false,matrix);
-    }""")
+    }"""
+    )
 
     fd.write("\n\n#undef _FU\n\n\n")
@@ -344,21 +385,25 @@ def build_legacygl_header(filename, include, class_suffix, output_attribs, gles2
             x = header_data.enums[xv]
             bits = 1
             amt = len(x)
-            while (2 ** bits < amt):
+            while 2 ** bits < amt:
                 bits += 1
             strs = "{"
             for i in range(amt):
-                strs += "\"#define " + x[i] + "\\n\","
+                strs += '"#define ' + x[i] + '\\n",'
 
                 c = {}
                 c["set_mask"] = "uint64_t(" + str(i) + ")<<" + str(bitofs)
-                c["clear_mask"] = "((uint64_t(1)<<40)-1) ^ (((uint64_t(1)<<" + str(bits) + ") - 1)<<" + str(bitofs) + ")"
+                c["clear_mask"] = (
+                    "((uint64_t(1)<<40)-1) ^ (((uint64_t(1)<<" + str(bits) + ") - 1)<<" + str(bitofs) + ")"
+                )
                 enum_vals.append(c)
                 enum_constants.append(x[i])
 
             strs += "NULL}"
 
-            fd.write("\t\t\t{(uint64_t(1<<" + str(bits) + ")-1)<<" + str(bitofs) + "," + str(bitofs) + "," + strs + "},\n")
+            fd.write(
+                "\t\t\t{(uint64_t(1<<" + str(bits) + ")-1)<<" + str(bitofs) + "," + str(bitofs) + "," + strs + "},\n"
+            )
             bitofs += bits
 
         fd.write("\t\t};\n\n")
@@ -377,7 +422,7 @@ def build_legacygl_header(filename, include, class_suffix, output_attribs, gles2
         fd.write("\t\tstatic const char* _conditional_strings[]={\n")
     if header_data.conditionals:
         for x in header_data.conditionals:
-            fd.write("\t\t\t\"#define " + x + "\\n\",\n")
+            fd.write('\t\t\t"#define ' + x + '\\n",\n')
             conditionals_found.append(x)
         fd.write("\t\t};\n\n")
     else:
@@ -388,7 +433,7 @@ def build_legacygl_header(filename, include, class_suffix, output_attribs, gles2
         fd.write("\t\tstatic const char* _uniform_strings[]={\n")
     if header_data.uniforms:
         for x in header_data.uniforms:
-            fd.write("\t\t\t\"" + x + "\",\n")
+            fd.write('\t\t\t"' + x + '",\n')
         fd.write("\t\t};\n\n")
     else:
         fd.write("\t\tstatic const char **_uniform_strings=NULL;\n")
@@ -398,7 +443,7 @@ def build_legacygl_header(filename, include, class_suffix, output_attribs, gles2
 
         fd.write("\t\tstatic AttributePair _attribute_pairs[]={\n")
         for x in header_data.attributes:
-            fd.write("\t\t\t{\"" + x[0] + "\"," + x[1] + "},\n")
+            fd.write('\t\t\t{"' + x[0] + '",' + x[1] + "},\n")
         fd.write("\t\t};\n\n")
     else:
         fd.write("\t\tstatic AttributePair *_attribute_pairs=NULL;\n")
@@ -412,9 +457,9 @@ def build_legacygl_header(filename, include, class_suffix, output_attribs, gles2
             name = x[0]
             cond = x[1]
             if cond in conditionals_found:
-                fd.write("\t\t\t{\"" + name + "\"," + str(conditionals_found.index(cond)) + "},\n")
+                fd.write('\t\t\t{"' + name + '",' + str(conditionals_found.index(cond)) + "},\n")
             else:
-                fd.write("\t\t\t{\"" + name + "\",-1},\n")
+                fd.write('\t\t\t{"' + name + '",-1},\n')
 
             feedback_count += 1
@@ -428,7 +473,7 @@ def build_legacygl_header(filename, include, class_suffix, output_attribs, gles2
     if header_data.texunits:
         fd.write("\t\tstatic TexUnitPair _texunit_pairs[]={\n")
         for x in header_data.texunits:
-            fd.write("\t\t\t{\"" + x[0] + "\"," + x[1] + "},\n")
+            fd.write('\t\t\t{"' + x[0] + '",' + x[1] + "},\n")
         fd.write("\t\t};\n\n")
     else:
         fd.write("\t\tstatic TexUnitPair *_texunit_pairs=NULL;\n")
@@ -436,7 +481,7 @@ def build_legacygl_header(filename, include, class_suffix, output_attribs, gles2
     if not gles2 and header_data.ubos:
         fd.write("\t\tstatic UBOPair _ubo_pairs[]={\n")
         for x in header_data.ubos:
-            fd.write("\t\t\t{\"" + x[0] + "\"," + x[1] + "},\n")
+            fd.write('\t\t\t{"' + x[0] + '",' + x[1] + "},\n")
         fd.write("\t\t};\n\n")
     else:
         if gles2:
@@ -449,7 +494,7 @@ def build_legacygl_header(filename, include, class_suffix, output_attribs, gles2
         for c in x:
             fd.write(str(ord(c)) + ",")
 
-        fd.write(str(ord('\n')) + ",")
+        fd.write(str(ord("\n")) + ",")
         fd.write("\t\t0};\n\n")
 
         fd.write("\t\tstatic const int _vertex_code_start=" + str(header_data.vertex_offset) + ";\n")
@@ -459,28 +504,73 @@ def build_legacygl_header(filename, include, class_suffix, output_attribs, gles2
         for c in x:
             fd.write(str(ord(c)) + ",")
 
-        fd.write(str(ord('\n')) + ",")
+        fd.write(str(ord("\n")) + ",")
         fd.write("\t\t0};\n\n")
 
         fd.write("\t\tstatic const int _fragment_code_start=" + str(header_data.fragment_offset) + ";\n")
 
     if output_attribs:
         if gles2:
-            fd.write("\t\tsetup(_conditional_strings," + str(len(header_data.conditionals)) + ",_uniform_strings," + str(len(header_data.uniforms)) + ",_attribute_pairs," + str(
-                len(header_data.attributes)) + ", _texunit_pairs," + str(len(header_data.texunits)) + ",_vertex_code,_fragment_code,_vertex_code_start,_fragment_code_start);\n")
+            fd.write(
+                "\t\tsetup(_conditional_strings,"
+                + str(len(header_data.conditionals))
+                + ",_uniform_strings,"
+                + str(len(header_data.uniforms))
+                + ",_attribute_pairs,"
+                + str(len(header_data.attributes))
+                + ", _texunit_pairs,"
+                + str(len(header_data.texunits))
+                + ",_vertex_code,_fragment_code,_vertex_code_start,_fragment_code_start);\n"
+            )
         else:
-            fd.write("\t\tsetup(_conditional_strings," + str(len(header_data.conditionals)) + ",_uniform_strings," + str(len(header_data.uniforms)) + ",_attribute_pairs," + str(
-                len(header_data.attributes)) + ", _texunit_pairs," + str(len(header_data.texunits)) + ",_ubo_pairs," + str(len(header_data.ubos)) + ",_feedbacks," + str(
-                feedback_count) + ",_vertex_code,_fragment_code,_vertex_code_start,_fragment_code_start);\n")
+            fd.write(
+                "\t\tsetup(_conditional_strings,"
+                + str(len(header_data.conditionals))
+                + ",_uniform_strings,"
+                + str(len(header_data.uniforms))
+                + ",_attribute_pairs,"
+                + str(len(header_data.attributes))
+                + ", _texunit_pairs,"
+                + str(len(header_data.texunits))
+                + ",_ubo_pairs,"
+                + str(len(header_data.ubos))
+                + ",_feedbacks,"
+                + str(feedback_count)
+                + ",_vertex_code,_fragment_code,_vertex_code_start,_fragment_code_start);\n"
+            )
     else:
         if gles2:
-            fd.write("\t\tsetup(_conditional_strings," + str(len(header_data.conditionals)) + ",_uniform_strings," + str(len(header_data.uniforms)) + ",_texunit_pairs," + str(
-                len(header_data.texunits)) + ",_enums," + str(len(header_data.enums)) + ",_enum_values," + str(
-                enum_value_count) + ",_vertex_code,_fragment_code,_vertex_code_start,_fragment_code_start);\n")
+            fd.write(
+                "\t\tsetup(_conditional_strings,"
+                + str(len(header_data.conditionals))
+                + ",_uniform_strings,"
+                + str(len(header_data.uniforms))
+                + ",_texunit_pairs,"
+                + str(len(header_data.texunits))
+                + ",_enums,"
+                + str(len(header_data.enums))
+                + ",_enum_values,"
+                + str(enum_value_count)
+                + ",_vertex_code,_fragment_code,_vertex_code_start,_fragment_code_start);\n"
+            )
         else:
-            fd.write("\t\tsetup(_conditional_strings," + str(len(header_data.conditionals)) + ",_uniform_strings," + str(len(header_data.uniforms)) + ",_texunit_pairs," + str(
-                len(header_data.texunits)) + ",_enums," + str(len(header_data.enums)) + ",_enum_values," + str(enum_value_count) + ",_ubo_pairs," + str(len(header_data.ubos)) + ",_feedbacks," + str(
-                feedback_count) + ",_vertex_code,_fragment_code,_vertex_code_start,_fragment_code_start);\n")
+            fd.write(
+                "\t\tsetup(_conditional_strings,"
+                + str(len(header_data.conditionals))
+                + ",_uniform_strings,"
+                + str(len(header_data.uniforms))
+                + ",_texunit_pairs,"
+                + str(len(header_data.texunits))
+                + ",_enums,"
+                + str(len(header_data.enums))
+                + ",_enum_values,"
+                + str(enum_value_count)
+                + ",_ubo_pairs,"
+                + str(len(header_data.ubos))
+                + ",_feedbacks,"
+                + str(feedback_count)
+                + ",_vertex_code,_fragment_code,_vertex_code_start,_fragment_code_start);\n"
+            )
 
     fd.write("\t}\n\n")
 
@@ -504,8 +594,10 @@ def build_gles3_headers(target, source, env):
 
 def build_gles2_headers(target, source, env):
     for x in source:
-        build_legacygl_header(str(x), include="drivers/gles2/shader_gles2.h", class_suffix="GLES2", output_attribs=True, gles2=True)
+        build_legacygl_header(
+            str(x), include="drivers/gles2/shader_gles2.h", class_suffix="GLES2", output_attribs=True, gles2=True
+        )
 
 
-if __name__ == '__main__':
+if __name__ == "__main__":
    subprocess_main(globals())
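One non-trivial bit of `gles_builders.py` touched by the reformat is the enum hunk's `while 2 ** bits < amt` loop, which computes how many bits are needed to pack an enum with `amt` variants into the conditionals bitmask. A small standalone sketch (the function name is mine, not from the build script):

```python
# Minimum bit width that can encode `amt` distinct values, matching the
# `bits = 1; while 2 ** bits < amt: bits += 1` loop in build_legacygl_header().
def bits_needed(amt):
    bits = 1
    while 2 ** bits < amt:
        bits += 1
    return bits

assert bits_needed(2) == 1   # two variants fit in one bit
assert bits_needed(3) == 2
assert bits_needed(16) == 4
assert bits_needed(17) == 5
```

Note it never returns 0, so even a single-variant enum still reserves one bit of the mask.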
14	main/SCsub
@@ -1,6 +1,6 @@
 #!/usr/bin/env python
 
-Import('env')
+Import("env")
 
 from platform_methods import run_in_subprocess
 import main_builders
@@ -13,7 +13,11 @@ env.add_source_files(env.main_sources, "*.cpp")
 controller_databases = ["#main/gamecontrollerdb.txt", "#main/godotcontrollerdb.txt"]
 
 env.Depends("#main/default_controller_mappings.gen.cpp", controller_databases)
-env.CommandNoCache("#main/default_controller_mappings.gen.cpp", controller_databases, run_in_subprocess(main_builders.make_default_controller_mappings))
+env.CommandNoCache(
+    "#main/default_controller_mappings.gen.cpp",
+    controller_databases,
+    run_in_subprocess(main_builders.make_default_controller_mappings),
+)
 
 # Don't warn about duplicate entry here, we need it registered manually for first build,
 # even if later builds will pick it up twice due to above *.cpp globbing.
@@ -23,13 +27,15 @@ env.Depends("#main/splash.gen.h", "#main/splash.png")
 env.CommandNoCache("#main/splash.gen.h", "#main/splash.png", run_in_subprocess(main_builders.make_splash))
 
 env.Depends("#main/splash_editor.gen.h", "#main/splash_editor.png")
-env.CommandNoCache("#main/splash_editor.gen.h", "#main/splash_editor.png", run_in_subprocess(main_builders.make_splash_editor))
+env.CommandNoCache(
+    "#main/splash_editor.gen.h", "#main/splash_editor.png", run_in_subprocess(main_builders.make_splash_editor)
+)
 
 env.Depends("#main/app_icon.gen.h", "#main/app_icon.png")
 env.CommandNoCache("#main/app_icon.gen.h", "#main/app_icon.png", run_in_subprocess(main_builders.make_app_icon))
 
 if env["tools"]:
-    SConscript('tests/SCsub')
+    SConscript("tests/SCsub")
 
 lib = env.add_library("main", env.main_sources)
 env.Prepend(LIBS=[lib])
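In `main/SCsub`, Black wraps the over-long `env.CommandNoCache(...)` calls one argument per line and adds a trailing comma (its "magic trailing comma"); the wrapped call is syntactically identical to the flat one. A sketch with a stand-in function (not the actual SCons/Godot API):

```python
# Stand-in for env.CommandNoCache: wrapping the arguments and adding a
# trailing comma changes layout only, not the call Python sees.
def command_no_cache(target, deps, action):
    return (target, deps, action)

flat = command_no_cache("x.gen.cpp", ["db.txt"], "run_in_subprocess")
wrapped = command_no_cache(
    "x.gen.cpp",
    ["db.txt"],
    "run_in_subprocess",
)
assert flat == wrapped
```

The trailing comma also keeps future argument additions to a one-line diff.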
@@ -19,7 +19,7 @@ def make_splash(target, source, env):
     g.write("/* THIS FILE IS GENERATED DO NOT EDIT */\n")
     g.write("#ifndef BOOT_SPLASH_H\n")
     g.write("#define BOOT_SPLASH_H\n")
-    g.write('static const Color boot_splash_bg_color = Color(0.14, 0.14, 0.14);\n')
+    g.write("static const Color boot_splash_bg_color = Color(0.14, 0.14, 0.14);\n")
     g.write("static const unsigned char boot_splash_png[] = {\n")
     for i in range(len(buf)):
         g.write(byte_to_str(buf[i]) + ",\n")
@@ -38,7 +38,7 @@ def make_splash_editor(target, source, env):
     g.write("/* THIS FILE IS GENERATED DO NOT EDIT */\n")
     g.write("#ifndef BOOT_SPLASH_EDITOR_H\n")
     g.write("#define BOOT_SPLASH_EDITOR_H\n")
-    g.write('static const Color boot_splash_editor_bg_color = Color(0.14, 0.14, 0.14);\n')
+    g.write("static const Color boot_splash_editor_bg_color = Color(0.14, 0.14, 0.14);\n")
     g.write("static const unsigned char boot_splash_editor_png[] = {\n")
     for i in range(len(buf)):
         g.write(byte_to_str(buf[i]) + ",\n")
@@ -69,8 +69,8 @@ def make_default_controller_mappings(target, source, env):
     g = open(dst, "w")
 
     g.write("/* THIS FILE IS GENERATED DO NOT EDIT */\n")
-    g.write("#include \"core/typedefs.h\"\n")
-    g.write("#include \"main/default_controller_mappings.h\"\n")
+    g.write('#include "core/typedefs.h"\n')
+    g.write('#include "main/default_controller_mappings.h"\n')
 
     # ensure mappings have a consistent order
     platform_mappings = OrderedDict()
@@ -94,11 +94,19 @@ def make_default_controller_mappings(target, source, env):
             line_parts = line.split(",")
             guid = line_parts[0]
             if guid in platform_mappings[current_platform]:
-                g.write("// WARNING - DATABASE {} OVERWROTE PRIOR MAPPING: {} {}\n".format(src_path, current_platform, platform_mappings[current_platform][guid]))
+                g.write(
+                    "// WARNING - DATABASE {} OVERWROTE PRIOR MAPPING: {} {}\n".format(
+                        src_path, current_platform, platform_mappings[current_platform][guid]
+                    )
+                )
             valid_mapping = True
             for input_map in line_parts[2:]:
                 if "+" in input_map or "-" in input_map or "~" in input_map:
-                    g.write("// WARNING - DISCARDED UNSUPPORTED MAPPING TYPE FROM DATABASE {}: {} {}\n".format(src_path, current_platform, line))
+                    g.write(
+                        "// WARNING - DISCARDED UNSUPPORTED MAPPING TYPE FROM DATABASE {}: {} {}\n".format(
+                            src_path, current_platform, line
+                        )
+                    )
                     valid_mapping = False
                     break
             if valid_mapping:
@@ -119,12 +127,12 @@ def make_default_controller_mappings(target, source, env):
         variable = platform_variables[platform]
         g.write("{}\n".format(variable))
         for mapping in mappings.values():
-            g.write("\t\"{}\",\n".format(mapping))
+            g.write('\t"{}",\n'.format(mapping))
         g.write("#endif\n")
 
     g.write("\tNULL\n};\n")
     g.close()
 
 
-if __name__ == '__main__':
+if __name__ == "__main__":
     subprocess_main(globals())
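The `OVERWROTE PRIOR MAPPING` warning in the controller-mappings hunks fires because mappings are keyed by controller GUID per platform, so an entry from a later database replaces an earlier one for the same GUID. A minimal sketch of that keying (the GUID and mapping strings here are made up):

```python
from collections import OrderedDict

# Per-platform dict keyed by GUID, as in make_default_controller_mappings():
# inserting the same GUID twice keeps only the last database's mapping.
platform_mappings = OrderedDict()
platform_mappings["Linux"] = OrderedDict()

overwrote = False
for line in ["03000000aaaa,PadA,a:b0,platform:Linux",
             "03000000aaaa,PadB,a:b1,platform:Linux"]:
    guid = line.split(",")[0]
    overwrote = guid in platform_mappings["Linux"]  # True on the second insert
    platform_mappings["Linux"][guid] = line

assert overwrote
assert platform_mappings["Linux"]["03000000aaaa"].split(",")[1] == "PadB"
```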
@@ -1,6 +1,6 @@
 #!/usr/bin/python
 
-Import('env')
+Import("env")
 
 env.tests_sources = []
 env.add_source_files(env.tests_sources, "*.cpp")
487	methods.py
@ -10,13 +10,13 @@ def add_source_files(self, sources, files, warn_duplicates=True):
|
|||
# Convert string to list of absolute paths (including expanding wildcard)
|
||||
if isbasestring(files):
|
||||
# Keep SCons project-absolute path as they are (no wildcard support)
|
||||
if files.startswith('#'):
|
||||
if '*' in files:
|
||||
if files.startswith("#"):
|
||||
if "*" in files:
|
||||
print("ERROR: Wildcards can't be expanded in SCons project-absolute path: '{}'".format(files))
|
||||
return
|
||||
files = [files]
|
||||
else:
|
||||
dir_path = self.Dir('.').abspath
|
||||
dir_path = self.Dir(".").abspath
|
||||
files = sorted(glob.glob(dir_path + "/" + files))
|
||||
|
||||
# Add each path as compiled Object following environment (self) configuration
|
||||
|
@ -24,7 +24,7 @@ def add_source_files(self, sources, files, warn_duplicates=True):
|
|||
obj = self.Object(path)
|
||||
if obj in sources:
|
||||
if warn_duplicates:
|
||||
print("WARNING: Object \"{}\" already included in environment sources.".format(obj))
|
||||
print('WARNING: Object "{}" already included in environment sources.'.format(obj))
|
||||
else:
|
||||
continue
|
||||
sources.append(obj)
|
||||
|
@ -35,20 +35,20 @@ def disable_warnings(self):
|
|||
if self.msvc:
|
||||
# We have to remove existing warning level defines before appending /w,
|
||||
# otherwise we get: "warning D9025 : overriding '/W3' with '/w'"
|
||||
warn_flags = ['/Wall', '/W4', '/W3', '/W2', '/W1', '/WX']
|
||||
self.Append(CCFLAGS=['/w'])
|
||||
self.Append(CFLAGS=['/w'])
|
||||
self.Append(CXXFLAGS=['/w'])
|
||||
self['CCFLAGS'] = [x for x in self['CCFLAGS'] if not x in warn_flags]
|
||||
self['CFLAGS'] = [x for x in self['CFLAGS'] if not x in warn_flags]
|
||||
self['CXXFLAGS'] = [x for x in self['CXXFLAGS'] if not x in warn_flags]
|
||||
warn_flags = ["/Wall", "/W4", "/W3", "/W2", "/W1", "/WX"]
|
||||
self.Append(CCFLAGS=["/w"])
|
||||
self.Append(CFLAGS=["/w"])
|
||||
self.Append(CXXFLAGS=["/w"])
|
||||
self["CCFLAGS"] = [x for x in self["CCFLAGS"] if not x in warn_flags]
|
||||
self["CFLAGS"] = [x for x in self["CFLAGS"] if not x in warn_flags]
|
||||
self["CXXFLAGS"] = [x for x in self["CXXFLAGS"] if not x in warn_flags]
|
||||
else:
|
||||
self.Append(CCFLAGS=['-w'])
|
||||
self.Append(CFLAGS=['-w'])
|
||||
self.Append(CXXFLAGS=['-w'])
|
||||
self.Append(CCFLAGS=["-w"])
|
||||
self.Append(CFLAGS=["-w"])
|
||||
self.Append(CXXFLAGS=["-w"])
|
||||
|
||||
|
||||
def add_module_version_string(self,s):
|
||||
def add_module_version_string(self, s):
|
||||
self.module_version_string += "." + s
|
||||
|
||||
|
||||
|
@@ -63,16 +63,16 @@ def update_version(module_version_string=""):

    # NOTE: It is safe to generate this file here, since this is still executed serially
    f = open("core/version_generated.gen.h", "w")
-    f.write("#define VERSION_SHORT_NAME \"" + str(version.short_name) + "\"\n")
-    f.write("#define VERSION_NAME \"" + str(version.name) + "\"\n")
+    f.write('#define VERSION_SHORT_NAME "' + str(version.short_name) + '"\n')
+    f.write('#define VERSION_NAME "' + str(version.name) + '"\n')
     f.write("#define VERSION_MAJOR " + str(version.major) + "\n")
     f.write("#define VERSION_MINOR " + str(version.minor) + "\n")
     f.write("#define VERSION_PATCH " + str(version.patch) + "\n")
-    f.write("#define VERSION_STATUS \"" + str(version.status) + "\"\n")
-    f.write("#define VERSION_BUILD \"" + str(build_name) + "\"\n")
-    f.write("#define VERSION_MODULE_CONFIG \"" + str(version.module_config) + module_version_string + "\"\n")
+    f.write('#define VERSION_STATUS "' + str(version.status) + '"\n')
+    f.write('#define VERSION_BUILD "' + str(build_name) + '"\n')
+    f.write('#define VERSION_MODULE_CONFIG "' + str(version.module_config) + module_version_string + '"\n')
     f.write("#define VERSION_YEAR " + str(version.year) + "\n")
-    f.write("#define VERSION_WEBSITE \"" + str(version.website) + "\"\n")
+    f.write('#define VERSION_WEBSITE "' + str(version.website) + '"\n')
     f.close()

    # NOTE: It is safe to generate this file here, since this is still executed serially
@@ -94,7 +94,7 @@ def update_version(module_version_string=""):
         else:
             githash = head

-    fhash.write("#define VERSION_HASH \"" + githash + "\"")
+    fhash.write('#define VERSION_HASH "' + githash + '"')
     fhash.close()

@@ -161,29 +161,37 @@ def write_modules(module_list):
         try:
             with open(os.path.join(path, "register_types.h")):
                 includes_cpp += '#include "' + path + '/register_types.h"\n'
-                register_cpp += '#ifdef MODULE_' + name.upper() + '_ENABLED\n'
-                register_cpp += '\tregister_' + name + '_types();\n'
-                register_cpp += '#endif\n'
-                unregister_cpp += '#ifdef MODULE_' + name.upper() + '_ENABLED\n'
-                unregister_cpp += '\tunregister_' + name + '_types();\n'
-                unregister_cpp += '#endif\n'
+                register_cpp += "#ifdef MODULE_" + name.upper() + "_ENABLED\n"
+                register_cpp += "\tregister_" + name + "_types();\n"
+                register_cpp += "#endif\n"
+                unregister_cpp += "#ifdef MODULE_" + name.upper() + "_ENABLED\n"
+                unregister_cpp += "\tunregister_" + name + "_types();\n"
+                unregister_cpp += "#endif\n"
         except IOError:
             pass

-    modules_cpp = """
+    modules_cpp = (
+        """
 // modules.cpp - THIS FILE IS GENERATED, DO NOT EDIT!!!!!!!
 #include "register_module_types.h"

-""" + includes_cpp + """
+"""
+        + includes_cpp
+        + """

 void register_module_types() {
-""" + register_cpp + """
+"""
+        + register_cpp
+        + """
 }

 void unregister_module_types() {
-""" + unregister_cpp + """
+"""
+        + unregister_cpp
+        + """
 }
 """
+    )

     # NOTE: It is safe to generate this file here, since this is still executed serially
     with open("modules/register_module_types.gen.cpp", "w") as f:
@@ -206,9 +214,10 @@ def convert_custom_modules_path(path):
 def disable_module(self):
     self.disabled_modules.append(self.current_module)

+
 def use_windows_spawn_fix(self, platform=None):

-    if (os.name != "nt"):
+    if os.name != "nt":
         return  # not needed, only for windows

     # On Windows, due to the limited command line length, when creating a static library
@@ -219,14 +228,21 @@ def use_windows_spawn_fix(self, platform=None):
     # got built correctly regardless the invocation strategy.
     # Furthermore, since SCons will rebuild the library from scratch when an object file
     # changes, no multiple versions of the same object file will be present.
-    self.Replace(ARFLAGS='q')
+    self.Replace(ARFLAGS="q")

     def mySubProcess(cmdline, env):

         startupinfo = subprocess.STARTUPINFO()
         startupinfo.dwFlags |= subprocess.STARTF_USESHOWWINDOW
-        proc = subprocess.Popen(cmdline, stdin=subprocess.PIPE, stdout=subprocess.PIPE,
-                                stderr=subprocess.PIPE, startupinfo=startupinfo, shell=False, env=env)
+        proc = subprocess.Popen(
+            cmdline,
+            stdin=subprocess.PIPE,
+            stdout=subprocess.PIPE,
+            stderr=subprocess.PIPE,
+            startupinfo=startupinfo,
+            shell=False,
+            env=env,
+        )
         _, err = proc.communicate()
         rv = proc.wait()
         if rv:
@@ -237,7 +253,7 @@ def use_windows_spawn_fix(self, platform=None):

     def mySpawn(sh, escape, cmd, args, env):

-        newargs = ' '.join(args[1:])
+        newargs = " ".join(args[1:])
         cmdline = cmd + " " + newargs

         rv = 0
@@ -253,10 +269,10 @@ def use_windows_spawn_fix(self, platform=None):

         return rv

-    self['SPAWN'] = mySpawn
+    self["SPAWN"] = mySpawn


-def split_lib(self, libname, src_list = None, env_lib = None):
+def split_lib(self, libname, src_list=None, env_lib=None):
     env = self

     num = 0
@@ -307,22 +323,20 @@ def split_lib(self, libname, src_list=None, env_lib=None):
     # impacts the linker call, we need to hack our way into the linking commands
     # LINKCOM and SHLINKCOM to set those flags.

-    if '-Wl,--start-group' in env['LINKCOM'] and '-Wl,--start-group' in env['SHLINKCOM']:
+    if "-Wl,--start-group" in env["LINKCOM"] and "-Wl,--start-group" in env["SHLINKCOM"]:
         # Already added by a previous call, skip.
         return

-    env['LINKCOM'] = str(env['LINKCOM']).replace('$_LIBFLAGS',
-                                                 '-Wl,--start-group $_LIBFLAGS -Wl,--end-group')
-    env['SHLINKCOM'] = str(env['LINKCOM']).replace('$_LIBFLAGS',
-                                                   '-Wl,--start-group $_LIBFLAGS -Wl,--end-group')
+    env["LINKCOM"] = str(env["LINKCOM"]).replace("$_LIBFLAGS", "-Wl,--start-group $_LIBFLAGS -Wl,--end-group")
+    env["SHLINKCOM"] = str(env["LINKCOM"]).replace("$_LIBFLAGS", "-Wl,--start-group $_LIBFLAGS -Wl,--end-group")


 def save_active_platforms(apnames, ap):

     for x in ap:
-        names = ['logo']
+        names = ["logo"]
         if os.path.isfile(x + "/run_icon.png"):
-            names.append('run_icon')
+            names.append("run_icon")

         for name in names:
             pngf = open(x + "/" + name + ".png", "rb")
@@ -332,7 +346,7 @@ def save_active_platforms(apnames, ap):
         while len(b) == 1:
             str += hex(ord(b))
             b = pngf.read(1)
-            if (len(b) == 1):
+            if len(b) == 1:
                 str += ","

         str += "};\n"
@@ -352,30 +366,70 @@ def no_verbose(sys, env):
     # Colors are disabled in non-TTY environments such as pipes. This means
     # that if output is redirected to a file, it will not contain color codes
     if sys.stdout.isatty():
-        colors['cyan'] = '\033[96m'
-        colors['purple'] = '\033[95m'
-        colors['blue'] = '\033[94m'
-        colors['green'] = '\033[92m'
-        colors['yellow'] = '\033[93m'
-        colors['red'] = '\033[91m'
-        colors['end'] = '\033[0m'
+        colors["cyan"] = "\033[96m"
+        colors["purple"] = "\033[95m"
+        colors["blue"] = "\033[94m"
+        colors["green"] = "\033[92m"
+        colors["yellow"] = "\033[93m"
+        colors["red"] = "\033[91m"
+        colors["end"] = "\033[0m"
     else:
-        colors['cyan'] = ''
-        colors['purple'] = ''
-        colors['blue'] = ''
-        colors['green'] = ''
-        colors['yellow'] = ''
-        colors['red'] = ''
-        colors['end'] = ''
+        colors["cyan"] = ""
+        colors["purple"] = ""
+        colors["blue"] = ""
+        colors["green"] = ""
+        colors["yellow"] = ""
+        colors["red"] = ""
+        colors["end"] = ""

-    compile_source_message = '%sCompiling %s==> %s$SOURCE%s' % (colors['blue'], colors['purple'], colors['yellow'], colors['end'])
-    java_compile_source_message = '%sCompiling %s==> %s$SOURCE%s' % (colors['blue'], colors['purple'], colors['yellow'], colors['end'])
-    compile_shared_source_message = '%sCompiling shared %s==> %s$SOURCE%s' % (colors['blue'], colors['purple'], colors['yellow'], colors['end'])
-    link_program_message = '%sLinking Program %s==> %s$TARGET%s' % (colors['red'], colors['purple'], colors['yellow'], colors['end'])
-    link_library_message = '%sLinking Static Library %s==> %s$TARGET%s' % (colors['red'], colors['purple'], colors['yellow'], colors['end'])
-    ranlib_library_message = '%sRanlib Library %s==> %s$TARGET%s' % (colors['red'], colors['purple'], colors['yellow'], colors['end'])
-    link_shared_library_message = '%sLinking Shared Library %s==> %s$TARGET%s' % (colors['red'], colors['purple'], colors['yellow'], colors['end'])
-    java_library_message = '%sCreating Java Archive %s==> %s$TARGET%s' % (colors['red'], colors['purple'], colors['yellow'], colors['end'])
+    compile_source_message = "%sCompiling %s==> %s$SOURCE%s" % (
+        colors["blue"],
+        colors["purple"],
+        colors["yellow"],
+        colors["end"],
+    )
+    java_compile_source_message = "%sCompiling %s==> %s$SOURCE%s" % (
+        colors["blue"],
+        colors["purple"],
+        colors["yellow"],
+        colors["end"],
+    )
+    compile_shared_source_message = "%sCompiling shared %s==> %s$SOURCE%s" % (
+        colors["blue"],
+        colors["purple"],
+        colors["yellow"],
+        colors["end"],
+    )
+    link_program_message = "%sLinking Program %s==> %s$TARGET%s" % (
+        colors["red"],
+        colors["purple"],
+        colors["yellow"],
+        colors["end"],
+    )
+    link_library_message = "%sLinking Static Library %s==> %s$TARGET%s" % (
+        colors["red"],
+        colors["purple"],
+        colors["yellow"],
+        colors["end"],
+    )
+    ranlib_library_message = "%sRanlib Library %s==> %s$TARGET%s" % (
+        colors["red"],
+        colors["purple"],
+        colors["yellow"],
+        colors["end"],
+    )
+    link_shared_library_message = "%sLinking Shared Library %s==> %s$TARGET%s" % (
+        colors["red"],
+        colors["purple"],
+        colors["yellow"],
+        colors["end"],
+    )
+    java_library_message = "%sCreating Java Archive %s==> %s$TARGET%s" % (
+        colors["red"],
+        colors["purple"],
+        colors["yellow"],
+        colors["end"],
+    )

     env.Append(CXXCOMSTR=[compile_source_message])
     env.Append(CCCOMSTR=[compile_source_message])
@@ -416,70 +470,79 @@ def detect_visual_c_compiler_version(tools_env):
     vc_chosen_compiler_str = ""

     # Start with Pre VS 2017 checks which uses VCINSTALLDIR:
-    if 'VCINSTALLDIR' in tools_env:
+    if "VCINSTALLDIR" in tools_env:
         # print("Checking VCINSTALLDIR")

         # find() works with -1 so big ifs below are needed... the simplest solution, in fact
         # First test if amd64 and amd64_x86 compilers are present in the path
         vc_amd64_compiler_detection_index = tools_env["PATH"].find(tools_env["VCINSTALLDIR"] + "BIN\\amd64;")
-        if(vc_amd64_compiler_detection_index > -1):
+        if vc_amd64_compiler_detection_index > -1:
             vc_chosen_compiler_index = vc_amd64_compiler_detection_index
             vc_chosen_compiler_str = "amd64"

         vc_amd64_x86_compiler_detection_index = tools_env["PATH"].find(tools_env["VCINSTALLDIR"] + "BIN\\amd64_x86;")
-        if(vc_amd64_x86_compiler_detection_index > -1
-           and (vc_chosen_compiler_index == -1
-                or vc_chosen_compiler_index > vc_amd64_x86_compiler_detection_index)):
+        if vc_amd64_x86_compiler_detection_index > -1 and (
+            vc_chosen_compiler_index == -1 or vc_chosen_compiler_index > vc_amd64_x86_compiler_detection_index
+        ):
             vc_chosen_compiler_index = vc_amd64_x86_compiler_detection_index
             vc_chosen_compiler_str = "amd64_x86"

         # Now check the 32 bit compilers
         vc_x86_compiler_detection_index = tools_env["PATH"].find(tools_env["VCINSTALLDIR"] + "BIN;")
-        if(vc_x86_compiler_detection_index > -1
-           and (vc_chosen_compiler_index == -1
-                or vc_chosen_compiler_index > vc_x86_compiler_detection_index)):
+        if vc_x86_compiler_detection_index > -1 and (
+            vc_chosen_compiler_index == -1 or vc_chosen_compiler_index > vc_x86_compiler_detection_index
+        ):
             vc_chosen_compiler_index = vc_x86_compiler_detection_index
             vc_chosen_compiler_str = "x86"

-        vc_x86_amd64_compiler_detection_index = tools_env["PATH"].find(tools_env['VCINSTALLDIR'] + "BIN\\x86_amd64;")
-        if(vc_x86_amd64_compiler_detection_index > -1
-           and (vc_chosen_compiler_index == -1
-                or vc_chosen_compiler_index > vc_x86_amd64_compiler_detection_index)):
+        vc_x86_amd64_compiler_detection_index = tools_env["PATH"].find(tools_env["VCINSTALLDIR"] + "BIN\\x86_amd64;")
+        if vc_x86_amd64_compiler_detection_index > -1 and (
+            vc_chosen_compiler_index == -1 or vc_chosen_compiler_index > vc_x86_amd64_compiler_detection_index
+        ):
             vc_chosen_compiler_index = vc_x86_amd64_compiler_detection_index
             vc_chosen_compiler_str = "x86_amd64"

     # and for VS 2017 and newer we check VCTOOLSINSTALLDIR:
-    if 'VCTOOLSINSTALLDIR' in tools_env:
+    if "VCTOOLSINSTALLDIR" in tools_env:

         # Newer versions have a different path available
-        vc_amd64_compiler_detection_index = tools_env["PATH"].upper().find(tools_env['VCTOOLSINSTALLDIR'].upper() + "BIN\\HOSTX64\\X64;")
-        if(vc_amd64_compiler_detection_index > -1):
+        vc_amd64_compiler_detection_index = (
+            tools_env["PATH"].upper().find(tools_env["VCTOOLSINSTALLDIR"].upper() + "BIN\\HOSTX64\\X64;")
+        )
+        if vc_amd64_compiler_detection_index > -1:
             vc_chosen_compiler_index = vc_amd64_compiler_detection_index
             vc_chosen_compiler_str = "amd64"

-        vc_amd64_x86_compiler_detection_index = tools_env["PATH"].upper().find(tools_env['VCTOOLSINSTALLDIR'].upper() + "BIN\\HOSTX64\\X86;")
-        if(vc_amd64_x86_compiler_detection_index > -1
-           and (vc_chosen_compiler_index == -1
-                or vc_chosen_compiler_index > vc_amd64_x86_compiler_detection_index)):
+        vc_amd64_x86_compiler_detection_index = (
+            tools_env["PATH"].upper().find(tools_env["VCTOOLSINSTALLDIR"].upper() + "BIN\\HOSTX64\\X86;")
+        )
+        if vc_amd64_x86_compiler_detection_index > -1 and (
+            vc_chosen_compiler_index == -1 or vc_chosen_compiler_index > vc_amd64_x86_compiler_detection_index
+        ):
             vc_chosen_compiler_index = vc_amd64_x86_compiler_detection_index
             vc_chosen_compiler_str = "amd64_x86"

-        vc_x86_compiler_detection_index = tools_env["PATH"].upper().find(tools_env['VCTOOLSINSTALLDIR'].upper() + "BIN\\HOSTX86\\X86;")
-        if(vc_x86_compiler_detection_index > -1
-           and (vc_chosen_compiler_index == -1
-                or vc_chosen_compiler_index > vc_x86_compiler_detection_index)):
+        vc_x86_compiler_detection_index = (
+            tools_env["PATH"].upper().find(tools_env["VCTOOLSINSTALLDIR"].upper() + "BIN\\HOSTX86\\X86;")
+        )
+        if vc_x86_compiler_detection_index > -1 and (
+            vc_chosen_compiler_index == -1 or vc_chosen_compiler_index > vc_x86_compiler_detection_index
+        ):
             vc_chosen_compiler_index = vc_x86_compiler_detection_index
             vc_chosen_compiler_str = "x86"

-        vc_x86_amd64_compiler_detection_index = tools_env["PATH"].upper().find(tools_env['VCTOOLSINSTALLDIR'].upper() + "BIN\\HOSTX86\\X64;")
-        if(vc_x86_amd64_compiler_detection_index > -1
-           and (vc_chosen_compiler_index == -1
-                or vc_chosen_compiler_index > vc_x86_amd64_compiler_detection_index)):
+        vc_x86_amd64_compiler_detection_index = (
+            tools_env["PATH"].upper().find(tools_env["VCTOOLSINSTALLDIR"].upper() + "BIN\\HOSTX86\\X64;")
+        )
+        if vc_x86_amd64_compiler_detection_index > -1 and (
+            vc_chosen_compiler_index == -1 or vc_chosen_compiler_index > vc_x86_amd64_compiler_detection_index
+        ):
             vc_chosen_compiler_index = vc_x86_amd64_compiler_detection_index
             vc_chosen_compiler_str = "x86_amd64"

     return vc_chosen_compiler_str


 def find_visual_c_batch_file(env):
     from SCons.Tool.MSCommon.vc import get_default_version, get_host_target, find_batch_file

@@ -487,6 +550,7 @@ def find_visual_c_batch_file(env):
     (host_platform, target_platform, _) = get_host_target(env)
     return find_batch_file(env, version, host_platform, target_platform)[0]

+
 def generate_cpp_hint_file(filename):
     if os.path.isfile(filename):
         # Don't overwrite an existing hint file since the user may have customized it.
@@ -498,15 +562,19 @@ def generate_cpp_hint_file(filename):
     except IOError:
         print("Could not write cpp.hint file.")


 def generate_vs_project(env, num_jobs):
     batch_file = find_visual_c_batch_file(env)
     if batch_file:
+
         def build_commandline(commands):
-            common_build_prefix = ['cmd /V /C set "plat=$(PlatformTarget)"',
-                                   '(if "$(PlatformTarget)"=="x64" (set "plat=x86_amd64"))',
-                                   'set "tools=yes"',
-                                   '(if "$(Configuration)"=="release" (set "tools=no"))',
-                                   'call "' + batch_file + '" !plat!']
+            common_build_prefix = [
+                'cmd /V /C set "plat=$(PlatformTarget)"',
+                '(if "$(PlatformTarget)"=="x64" (set "plat=x86_amd64"))',
+                'set "tools=yes"',
+                '(if "$(Configuration)"=="release" (set "tools=no"))',
+                'call "' + batch_file + '" !plat!',
+            ]

             result = " ^& ".join(common_build_prefix + [commands])
             return result
@@ -522,83 +590,102 @@ def generate_vs_project(env, num_jobs):
         # to double quote off the directory. However, the path ends
         # in a backslash, so we need to remove this, lest it escape the
         # last double quote off, confusing MSBuild
-        env['MSVSBUILDCOM'] = build_commandline('scons --directory="$(ProjectDir.TrimEnd(\'\\\'))" platform=windows progress=no target=$(Configuration) tools=!tools! -j' + str(num_jobs))
-        env['MSVSREBUILDCOM'] = build_commandline('scons --directory="$(ProjectDir.TrimEnd(\'\\\'))" platform=windows progress=no target=$(Configuration) tools=!tools! vsproj=yes -j' + str(num_jobs))
-        env['MSVSCLEANCOM'] = build_commandline('scons --directory="$(ProjectDir.TrimEnd(\'\\\'))" --clean platform=windows progress=no target=$(Configuration) tools=!tools! -j' + str(num_jobs))
+        env["MSVSBUILDCOM"] = build_commandline(
+            "scons --directory=\"$(ProjectDir.TrimEnd('\\'))\" platform=windows progress=no target=$(Configuration) tools=!tools! -j"
+            + str(num_jobs)
+        )
+        env["MSVSREBUILDCOM"] = build_commandline(
+            "scons --directory=\"$(ProjectDir.TrimEnd('\\'))\" platform=windows progress=no target=$(Configuration) tools=!tools! vsproj=yes -j"
+            + str(num_jobs)
+        )
+        env["MSVSCLEANCOM"] = build_commandline(
+            "scons --directory=\"$(ProjectDir.TrimEnd('\\'))\" --clean platform=windows progress=no target=$(Configuration) tools=!tools! -j"
+            + str(num_jobs)
+        )

         # This version information (Win32, x64, Debug, Release, Release_Debug seems to be
         # required for Visual Studio to understand that it needs to generate an NMAKE
         # project. Do not modify without knowing what you are doing.
-        debug_variants = ['debug|Win32'] + ['debug|x64']
-        release_variants = ['release|Win32'] + ['release|x64']
-        release_debug_variants = ['release_debug|Win32'] + ['release_debug|x64']
+        debug_variants = ["debug|Win32"] + ["debug|x64"]
+        release_variants = ["release|Win32"] + ["release|x64"]
+        release_debug_variants = ["release_debug|Win32"] + ["release_debug|x64"]
         variants = debug_variants + release_variants + release_debug_variants
-        debug_targets = ['bin\\godot.windows.tools.32.exe'] + ['bin\\godot.windows.tools.64.exe']
-        release_targets = ['bin\\godot.windows.opt.32.exe'] + ['bin\\godot.windows.opt.64.exe']
-        release_debug_targets = ['bin\\godot.windows.opt.tools.32.exe'] + ['bin\\godot.windows.opt.tools.64.exe']
+        debug_targets = ["bin\\godot.windows.tools.32.exe"] + ["bin\\godot.windows.tools.64.exe"]
+        release_targets = ["bin\\godot.windows.opt.32.exe"] + ["bin\\godot.windows.opt.64.exe"]
+        release_debug_targets = ["bin\\godot.windows.opt.tools.32.exe"] + ["bin\\godot.windows.opt.tools.64.exe"]
         targets = debug_targets + release_targets + release_debug_targets
-        if not env.get('MSVS'):
-            env['MSVS']['PROJECTSUFFIX'] = '.vcxproj'
-            env['MSVS']['SOLUTIONSUFFIX'] = '.sln'
+        if not env.get("MSVS"):
+            env["MSVS"]["PROJECTSUFFIX"] = ".vcxproj"
+            env["MSVS"]["SOLUTIONSUFFIX"] = ".sln"
         env.MSVSProject(
-            target=['#godot' + env['MSVSPROJECTSUFFIX']],
+            target=["#godot" + env["MSVSPROJECTSUFFIX"]],
             incs=env.vs_incs,
             srcs=env.vs_srcs,
             runfile=targets,
             buildtarget=targets,
             auto_build_solution=1,
-            variant=variants)
+            variant=variants,
+        )
     else:
-        print("Could not locate Visual Studio batch file for setting up the build environment. Not generating VS project.")
+        print(
+            "Could not locate Visual Studio batch file for setting up the build environment. Not generating VS project."
+        )


 def precious_program(env, program, sources, **args):
     program = env.ProgramOriginal(program, sources, **args)
     env.Precious(program)
     return program


 def add_shared_library(env, name, sources, **args):
     library = env.SharedLibrary(name, sources, **args)
     env.NoCache(library)
     return library


 def add_library(env, name, sources, **args):
     library = env.Library(name, sources, **args)
     env.NoCache(library)
     return library


 def add_program(env, name, sources, **args):
     program = env.Program(name, sources, **args)
     env.NoCache(program)
     return program


 def CommandNoCache(env, target, sources, command, **args):
     result = env.Command(target, sources, command, **args)
     env.NoCache(result)
     return result


 def detect_darwin_sdk_path(platform, env):
-    sdk_name = ''
-    if platform == 'osx':
-        sdk_name = 'macosx'
-        var_name = 'MACOS_SDK_PATH'
-    elif platform == 'iphone':
-        sdk_name = 'iphoneos'
-        var_name = 'IPHONESDK'
-    elif platform == 'iphonesimulator':
-        sdk_name = 'iphonesimulator'
-        var_name = 'IPHONESDK'
+    sdk_name = ""
+    if platform == "osx":
+        sdk_name = "macosx"
+        var_name = "MACOS_SDK_PATH"
+    elif platform == "iphone":
+        sdk_name = "iphoneos"
+        var_name = "IPHONESDK"
+    elif platform == "iphonesimulator":
+        sdk_name = "iphonesimulator"
+        var_name = "IPHONESDK"
     else:
         raise Exception("Invalid platform argument passed to detect_darwin_sdk_path")

     if not env[var_name]:
         try:
-            sdk_path = decode_utf8(subprocess.check_output(['xcrun', '--sdk', sdk_name, '--show-sdk-path']).strip())
+            sdk_path = decode_utf8(subprocess.check_output(["xcrun", "--sdk", sdk_name, "--show-sdk-path"]).strip())
             if sdk_path:
                 env[var_name] = sdk_path
         except (subprocess.CalledProcessError, OSError):
             print("Failed to find SDK path while running xcrun --sdk {} --show-sdk-path.".format(sdk_name))
             raise


 def get_compiler_version(env):
     """
     Returns an array of version numbers as ints: [major, minor, patch].
@@ -608,20 +695,156 @@ def get_compiler_version(env):
         # Not using -dumpversion as some GCC distros only return major, and
         # Clang used to return hardcoded 4.2.1: # https://reviews.llvm.org/D56803
         try:
-            version = decode_utf8(subprocess.check_output([env.subst(env['CXX']), '--version']).strip())
+            version = decode_utf8(subprocess.check_output([env.subst(env["CXX"]), "--version"]).strip())
         except (subprocess.CalledProcessError, OSError):
             print("Couldn't parse CXX environment variable to infer compiler version.")
             return None
     else:  # TODO: Implement for MSVC
         return None
-    match = re.search('[0-9]+\.[0-9.]+', version)
+    match = re.search("[0-9]+\.[0-9.]+", version)
     if match is not None:
-        return list(map(int, match.group().split('.')))
+        return list(map(int, match.group().split(".")))
     else:
         return None


 def using_gcc(env):
-    return 'gcc' in os.path.basename(env["CC"])
+    return "gcc" in os.path.basename(env["CC"])


 def using_clang(env):
-    return 'clang' in os.path.basename(env["CC"])
+    return "clang" in os.path.basename(env["CC"])


+def show_progress(env):
+    import sys
+    from SCons.Script import Progress, Command, AlwaysBuild
+
+    screen = sys.stdout
+    # Progress reporting is not available in non-TTY environments since it
+    # messes with the output (for example, when writing to a file)
+    show_progress = env["progress"] and sys.stdout.isatty()
+    node_count = 0
+    node_count_max = 0
+    node_count_interval = 1
+    node_count_fname = str(env.Dir("#")) + "/.scons_node_count"
+
+    import time, math
+
+    class cache_progress:
+        # The default is 1 GB cache and 12 hours half life
+        def __init__(self, path=None, limit=1073741824, half_life=43200):
+            self.path = path
+            self.limit = limit
+            self.exponent_scale = math.log(2) / half_life
+            if env["verbose"] and path != None:
+                screen.write(
+                    "Current cache limit is {} (used: {})\n".format(
+                        self.convert_size(limit), self.convert_size(self.get_size(path))
+                    )
+                )
+            self.delete(self.file_list())
+
+        def __call__(self, node, *args, **kw):
+            nonlocal node_count, node_count_max, node_count_interval, node_count_fname, show_progress
+            if show_progress:
+                # Print the progress percentage
+                node_count += node_count_interval
+                if node_count_max > 0 and node_count <= node_count_max:
+                    screen.write("\r[%3d%%] " % (node_count * 100 / node_count_max))
+                    screen.flush()
+                elif node_count_max > 0 and node_count > node_count_max:
+                    screen.write("\r[100%] ")
+                    screen.flush()
+                else:
+                    screen.write("\r[Initial build] ")
+                    screen.flush()
+
+        def delete(self, files):
+            if len(files) == 0:
+                return
+            if env["verbose"]:
+                # Utter something
+                screen.write("\rPurging %d %s from cache...\n" % (len(files), len(files) > 1 and "files" or "file"))
+            [os.remove(f) for f in files]
+
+        def file_list(self):
+            if self.path is None:
+                # Nothing to do
+                return []
+            # Gather a list of (filename, (size, atime)) within the
+            # cache directory
+            file_stat = [(x, os.stat(x)[6:8]) for x in glob.glob(os.path.join(self.path, "*", "*"))]
+            if file_stat == []:
+                # Nothing to do
+                return []
+            # Weight the cache files by size (assumed to be roughly
+            # proportional to the recompilation time) times an exponential
+            # decay since the ctime, and return a list with the entries
+            # (filename, size, weight).
+            current_time = time.time()
+            file_stat = [(x[0], x[1][0], (current_time - x[1][1])) for x in file_stat]
+            # Sort by the most recently accessed files (most sensible to keep) first
+            file_stat.sort(key=lambda x: x[2])
+            # Search for the first entry where the storage limit is
+            # reached
+            sum, mark = 0, None
+            for i, x in enumerate(file_stat):
+                sum += x[1]
+                if sum > self.limit:
+                    mark = i
+                    break
+            if mark is None:
+                return []
+            else:
+                return [x[0] for x in file_stat[mark:]]
+
+        def convert_size(self, size_bytes):
+            if size_bytes == 0:
+                return "0 bytes"
+            size_name = ("bytes", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB")
+            i = int(math.floor(math.log(size_bytes, 1024)))
+            p = math.pow(1024, i)
+            s = round(size_bytes / p, 2)
+            return "%s %s" % (int(s) if i == 0 else s, size_name[i])
+
+        def get_size(self, start_path="."):
+            total_size = 0
+            for dirpath, dirnames, filenames in os.walk(start_path):
+                for f in filenames:
+                    fp = os.path.join(dirpath, f)
+                    total_size += os.path.getsize(fp)
+            return total_size
+
+    def progress_finish(target, source, env):
+        nonlocal node_count, progressor
+        with open(node_count_fname, "w") as f:
+            f.write("%d\n" % node_count)
+        progressor.delete(progressor.file_list())
+
+    try:
+        with open(node_count_fname) as f:
+            node_count_max = int(f.readline())
+    except:
+        pass
+
+    cache_directory = os.environ.get("SCONS_CACHE")
+    # Simple cache pruning, attached to SCons' progress callback. Trim the
+    # cache directory to a size not larger than cache_limit.
+    cache_limit = float(os.getenv("SCONS_CACHE_LIMIT", 1024)) * 1024 * 1024
+    progressor = cache_progress(cache_directory, cache_limit)
+    Progress(progressor, interval=node_count_interval)
+
+    progress_finish_command = Command("progress_finish", [], progress_finish)
+    AlwaysBuild(progress_finish_command)
+
+
+def dump(env):
+    # Dumps latest build information for debugging purposes and external tools.
+    from json import dump
+
+    def non_serializable(obj):
+        return "<<non-serializable: %s>>" % (type(obj).__qualname__)
+
+    with open(".scons_env.json", "w") as f:
+        dump(env.Dictionary(), f, indent=4, default=non_serializable)

@@ -5,16 +5,33 @@ contributors to make sure they comply with our requirements.

 ## List of hooks

-- Pre-commit hook for clang-format: Applies clang-format to the staged files
-  before accepting a commit; blocks the commit and generates a patch if the
-  style is not respected.
-  Should work on Linux and macOS. You may need to edit the file if your
-  clang-format binary is not in the `$PATH`, or if you want to enable colored
-  output with pygmentize.
-- Pre-commit hook for makerst: Checks the class reference syntax using `makerst.py`.
-  Should work on Linux and macOS.
+- Pre-commit hook for `clang-format`: Applies `clang-format` to the staged
+  files before accepting a commit; blocks the commit and generates a patch if
+  the style is not respected.
+  You may need to edit the file if your `clang-format` binary is not in the
+  `PATH`, or if you want to enable colored output with `pygmentize`.
+- Pre-commit hook for `black`: Applies `black` to the staged Python files
+  before accepting a commit.
+- Pre-commit hook for `makerst`: Checks the class reference syntax using
+  `makerst.py`.

 ## Installation

-Copy all the files from this folder into your `.git/hooks` folder, and make sure
-the hooks and helper scripts are executable.
+Copy all the files from this folder into your `.git/hooks` folder, and make
+sure the hooks and helper scripts are executable.
+
+#### Linux/MacOS
+
+The hooks rely on bash scripts and tools which should be in the system `PATH`,
+so they should work out of the box on Linux/macOS.
+
+#### Windows
+
+##### clang-format
+- Download LLVM for Windows (version 8 or later) from
+  <https://releases.llvm.org/download.html>
+- Make sure LLVM is added to the `PATH` during installation
+
+##### black
+- Python installation: make sure Python is added to the `PATH`
+- Install `black` - in any console: `pip3 install black`

@@ -14,7 +14,7 @@
 # as this script. Hooks should return 0 if successful and nonzero to cancel the
 # commit. They are executed in the order in which they are listed.
 #HOOKS="pre-commit-compile pre-commit-uncrustify"
-HOOKS="pre-commit-clang-format pre-commit-makerst"
+HOOKS="pre-commit-clang-format pre-commit-black pre-commit-makerst"
 ###########################################################
 # There should be no need to change anything below this line.

@@ -0,0 +1,202 @@
+#!/usr/bin/env bash
+
+# git pre-commit hook that runs a black stylecheck.
+# Based on pre-commit-clang-format.
+
+##################################################################
+# SETTINGS
+# Set path to black binary.
+BLACK=`which black 2>/dev/null`
+BLACK_OPTIONS="-l 120"
+
+# Remove any older patches from previous commits. Set to true or false.
+DELETE_OLD_PATCHES=false
+
+# File types to parse.
+FILE_NAMES="SConstruct SCsub"
+FILE_EXTS="py"
+
+# Use pygmentize instead of cat to parse diff with highlighting.
+# Install it with `pip install pygments` (Linux) or `easy_install Pygments` (Mac)
+PYGMENTIZE=`which pygmentize 2>/dev/null`
+if [ ! -z "$PYGMENTIZE" ]; then
+  READER="pygmentize -l diff"
+else
+  READER=cat
+fi
+
+# Path to zenity
+ZENITY=`which zenity 2>/dev/null`
+
+# Path to xmessage
+XMSG=`which xmessage 2>/dev/null`
+
+# Path to powershell (Windows only)
+PWSH=`which powershell 2>/dev/null`
+
+##################################################################
+# There should be no need to change anything below this line.
+
+. "$(dirname -- "$0")/canonicalize_filename.sh"
+
+# exit on error
+set -e
+
+# check whether the given file matches any of the set extensions
+matches_name_or_extension() {
+    local filename=$(basename "$1")
+    local extension=".${filename##*.}"
+
+    for name in $FILE_NAMES; do [[ "$name" == "$filename" ]] && return 0; done
+    for ext in $FILE_EXTS; do [[ "$ext" == "$extension" ]] && return 0; done
+
+    return 1
+}
+
+# necessary check for initial commit
+if git rev-parse --verify HEAD >/dev/null 2>&1 ; then
+    against=HEAD
+else
+    # Initial commit: diff against an empty tree object
+    against=4b825dc642cb6eb9a060e54bf8d69288fbee4904
+fi
+
+if [ ! -x "$BLACK" ] ; then
+    if [ ! -t 1 ] ; then
+        if [ -x "$ZENITY" ] ; then
+            $ZENITY --error --title="Error" --text="Error: black executable not found."
+            exit 1
+        elif [ -x "$XMSG" ] ; then
+            $XMSG -center -title "Error" "Error: black executable not found."
+            exit 1
+        elif [ \( \( "$OSTYPE" = "msys" \) -o \( "$OSTYPE" = "win32" \) \) -a \( -x "$PWSH" \) ]; then
+            winmessage="$(canonicalize_filename "./.git/hooks/winmessage.ps1")"
winmessage="$(canonicalize_filename "./.git/hooks/winmessage.ps1")"
|
||||
$PWSH -noprofile -executionpolicy bypass -file "$winmessage" -center -title "Error" --text "Error: black executable not found."
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
printf "Error: black executable not found.\n"
|
||||
printf "Set the correct path in $(canonicalize_filename "$0").\n"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# create a random filename to store our generated patch
|
||||
prefix="pre-commit-black"
|
||||
suffix="$(date +%s)"
|
||||
patch="/tmp/$prefix-$suffix.patch"
|
||||
|
||||
# clean up any older black patches
|
||||
$DELETE_OLD_PATCHES && rm -f /tmp/$prefix*.patch
|
||||
|
||||
# create one patch containing all changes to the files
|
||||
git diff-index --cached --diff-filter=ACMR --name-only $against -- | while read file;
|
||||
do
|
||||
# ignore thirdparty files
|
||||
if grep -q "thirdparty" <<< $file; then
|
||||
continue;
|
||||
fi
|
||||
|
||||
# ignore file if not one of the names or extensions we handle
|
||||
if ! matches_name_or_extension "$file"; then
|
||||
continue;
|
||||
fi
|
||||
|
||||
# format our file with black, create a patch with diff and append it to our $patch
|
||||
# The sed call is necessary to transform the patch from
|
||||
# --- $file timestamp
|
||||
# +++ $file timestamp
|
||||
# to both lines working on the same file and having a/ and b/ prefix.
|
||||
# Else it can not be applied with 'git apply'.
|
||||
"$BLACK" "$BLACK_OPTIONS" --diff "$file" | \
|
||||
sed -e "1s|--- |--- a/|" -e "2s|+++ |+++ b/|" >> "$patch"
|
||||
done
|
||||
|
||||
# if no patch has been generated all is ok, clean up the file stub and exit
|
||||
if [ ! -s "$patch" ] ; then
|
||||
printf "Files in this commit comply with the black formatter rules.\n"
|
||||
rm -f "$patch"
|
||||
exit 0
|
||||
fi
|
||||
|
||||
# a patch has been created, notify the user and exit
|
||||
printf "\nThe following differences were found between the code to commit "
|
||||
printf "and the black formatter rules:\n\n"
|
||||
|
||||
if [ -t 1 ] ; then
|
||||
$READER "$patch"
|
||||
printf "\n"
|
||||
# Allows us to read user input below, assigns stdin to keyboard
|
||||
exec < /dev/tty
|
||||
terminal="1"
|
||||
else
|
||||
cat "$patch"
|
||||
printf "\n"
|
||||
# Allows non zero zenity/powershell output
|
||||
set +e
|
||||
terminal="0"
|
||||
fi
|
||||
|
||||
while true; do
|
||||
if [ $terminal = "0" ] ; then
|
||||
if [ -x "$ZENITY" ] ; then
|
||||
ans=$($ZENITY --text-info --filename="$patch" --width=800 --height=600 --title="Do you want to apply that patch?" --ok-label="Apply" --cancel-label="Do not apply" --extra-button="Apply and stage")
|
||||
if [ "$?" = "0" ] ; then
|
||||
yn="Y"
|
||||
else
|
||||
if [ "$ans" = "Apply and stage" ] ; then
|
||||
yn="S"
|
||||
else
|
||||
yn="N"
|
||||
fi
|
||||
fi
|
||||
elif [ -x "$XMSG" ] ; then
|
||||
$XMSG -file "$patch" -buttons "Apply":100,"Apply and stage":200,"Do not apply":0 -center -default "Do not apply" -geometry 800x600 -title "Do you want to apply that patch?"
|
||||
ans=$?
|
||||
if [ "$ans" = "100" ] ; then
|
||||
yn="Y"
|
||||
elif [ "$ans" = "200" ] ; then
|
||||
yn="S"
|
||||
else
|
||||
yn="N"
|
||||
fi
|
||||
elif [ \( \( "$OSTYPE" = "msys" \) -o \( "$OSTYPE" = "win32" \) \) -a \( -x "$PWSH" \) ]; then
|
||||
winmessage="$(canonicalize_filename "./.git/hooks/winmessage.ps1")"
|
||||
$PWSH -noprofile -executionpolicy bypass -file "$winmessage" -file "$patch" -buttons "Apply":100,"Apply and stage":200,"Do not apply":0 -center -default "Do not apply" -geometry 800x600 -title "Do you want to apply that patch?"
|
||||
ans=$?
|
||||
if [ "$ans" = "100" ] ; then
|
||||
yn="Y"
|
||||
elif [ "$ans" = "200" ] ; then
|
||||
yn="S"
|
||||
else
|
||||
yn="N"
|
||||
fi
|
||||
else
|
||||
printf "Error: zenity, xmessage, or powershell executable not found.\n"
|
||||
exit 1
|
||||
fi
|
||||
else
|
||||
read -p "Do you want to apply that patch (Y - Apply, N - Do not apply, S - Apply and stage files)? [Y/N/S] " yn
|
||||
fi
|
||||
case $yn in
|
||||
[Yy] ) git apply $patch;
|
||||
printf "The patch was applied. You can now stage the changes and commit again.\n\n";
|
||||
break
|
||||
;;
|
||||
[Nn] ) printf "\nYou can apply these changes with:\n git apply $patch\n";
|
||||
printf "(may need to be called from the root directory of your repository)\n";
|
||||
printf "Aborting commit. Apply changes and commit again or skip checking with";
|
||||
printf " --no-verify (not recommended).\n\n";
|
||||
break
|
||||
;;
|
||||
[Ss] ) git apply $patch;
|
||||
git diff-index --cached --diff-filter=ACMR --name-only $against -- | while read file;
|
||||
do git add $file;
|
||||
done
|
||||
printf "The patch was applied and the changed files staged. You can now commit.\n\n";
|
||||
break
|
||||
;;
|
||||
* ) echo "Please answer yes or no."
|
||||
;;
|
||||
esac
|
||||
done
|
||||
exit 1 # we don't commit in any case
|
|
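The sed call in the hook above rewrites the first two header lines of black's plain `--diff` output so that `git apply` accepts the patch. A minimal Python sketch of that same transform (the helper name `to_git_patch` is hypothetical, not part of the hook):

```python
def to_git_patch(diff_text: str) -> str:
    # Rewrite the first two header lines of a plain diff ("--- file" /
    # "+++ file") into the "a/"- and "b/"-prefixed form that `git apply`
    # expects, mirroring the hook's two sed expressions.
    lines = diff_text.splitlines()
    if lines and lines[0].startswith("--- "):
        lines[0] = "--- a/" + lines[0][4:]
    if len(lines) > 1 and lines[1].startswith("+++ "):
        lines[1] = "+++ b/" + lines[1][4:]
    return "\n".join(lines)

print(to_git_patch("--- SConstruct 2020\n+++ SConstruct 2020\n@@ -1 +1 @@"))
```

Only the first two lines are touched, matching the `1s` and `2s` sed addresses; the hunk bodies pass through unchanged.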
@ -15,28 +15,37 @@

##################################################################
# SETTINGS
-# Set path to clang-format binary
-# CLANG_FORMAT="/usr/bin/clang-format"
-CLANG_FORMAT=`which clang-format`
+# Set path to clang-format binary.
+CLANG_FORMAT=`which clang-format 2>/dev/null`

# Remove any older patches from previous commits. Set to true or false.
-# DELETE_OLD_PATCHES=false
+DELETE_OLD_PATCHES=false

# Only parse files with the extensions in FILE_EXTS. Set to true or false.
# If false every changed file in the commit will be parsed with clang-format.
# If true only files matching one of the extensions are parsed with clang-format.
-# PARSE_EXTS=true
+PARSE_EXTS=true

# File types to parse. Only effective when PARSE_EXTS is true.
-# FILE_EXTS=".c .h .cpp .hpp"
+FILE_EXTS=".c .h .cpp .hpp .cc .hh .cxx .m .mm .inc .java .glsl"

# Use pygmentize instead of cat to parse diff with highlighting.
# Install it with `pip install pygments` (Linux) or `easy_install Pygments` (Mac)
-# READER="pygmentize -l diff"
-READER=cat
+PYGMENTIZE=`which pygmentize 2>/dev/null`
+if [ ! -z "$PYGMENTIZE" ]; then
+    READER="pygmentize -l diff"
+else
+    READER=cat
+fi
+
+# Path to zenity
+ZENITY=`which zenity 2>/dev/null`
+
+# Path to xmessage
+XMSG=`which xmessage 2>/dev/null`
+
+# Path to powershell (Windows only)
+PWSH=`which powershell 2>/dev/null`

##################################################################
# There should be no need to change anything below this line.

@ -66,6 +75,19 @@ else
fi

if [ ! -x "$CLANG_FORMAT" ] ; then
+    if [ ! -t 1 ] ; then
+        if [ -x "$ZENITY" ] ; then
+            $ZENITY --error --title="Error" --text="Error: clang-format executable not found."
+            exit 1
+        elif [ -x "$XMSG" ] ; then
+            $XMSG -center -title "Error" "Error: clang-format executable not found."
+            exit 1
+        elif [ \( \( "$OSTYPE" = "msys" \) -o \( "$OSTYPE" = "win32" \) \) -a \( -x "$PWSH" \) ]; then
+            winmessage="$(canonicalize_filename "./.git/hooks/winmessage.ps1")"
+            $PWSH -noprofile -executionpolicy bypass -file "$winmessage" -center -title "Error" --text "Error: clang-format executable not found."
+            exit 1
+        fi
+    fi
    printf "Error: clang-format executable not found.\n"
    printf "Set the correct path in $(canonicalize_filename "$0").\n"
    exit 1

@ -117,14 +139,62 @@ fi
# a patch has been created, notify the user and exit
printf "\nThe following differences were found between the code to commit "
printf "and the clang-format rules:\n\n"
-$READER "$patch"
-printf "\n"
-
-# Allows us to read user input below, assigns stdin to keyboard
-exec < /dev/tty
+if [ -t 1 ] ; then
+    $READER "$patch"
+    printf "\n"
+    # Allows us to read user input below, assigns stdin to keyboard
+    exec < /dev/tty
+    terminal="1"
+else
+    cat "$patch"
+    printf "\n"
+    # Allows non zero zenity/powershell output
+    set +e
+    terminal="0"
+fi

while true; do
-    read -p "Do you want to apply that patch (Y - Apply, N - Do not apply, S - Apply and stage files)? [Y/N/S] " yn
+    if [ $terminal = "0" ] ; then
+        if [ -x "$ZENITY" ] ; then
+            ans=$($ZENITY --text-info --filename="$patch" --width=800 --height=600 --title="Do you want to apply that patch?" --ok-label="Apply" --cancel-label="Do not apply" --extra-button="Apply and stage")
+            if [ "$?" = "0" ] ; then
+                yn="Y"
+            else
+                if [ "$ans" = "Apply and stage" ] ; then
+                    yn="S"
+                else
+                    yn="N"
+                fi
+            fi
+        elif [ -x "$XMSG" ] ; then
+            $XMSG -file "$patch" -buttons "Apply":100,"Apply and stage":200,"Do not apply":0 -center -default "Do not apply" -geometry 800x600 -title "Do you want to apply that patch?"
+            ans=$?
+            if [ "$ans" = "100" ] ; then
+                yn="Y"
+            elif [ "$ans" = "200" ] ; then
+                yn="S"
+            else
+                yn="N"
+            fi
+        elif [ \( \( "$OSTYPE" = "msys" \) -o \( "$OSTYPE" = "win32" \) \) -a \( -x "$PWSH" \) ]; then
+            winmessage="$(canonicalize_filename "./.git/hooks/winmessage.ps1")"
+            $PWSH -noprofile -executionpolicy bypass -file "$winmessage" -file "$patch" -buttons "Apply":100,"Apply and stage":200,"Do not apply":0 -center -default "Do not apply" -geometry 800x600 -title "Do you want to apply that patch?"
+            ans=$?
+            if [ "$ans" = "100" ] ; then
+                yn="Y"
+            elif [ "$ans" = "200" ] ; then
+                yn="S"
+            else
+                yn="N"
+            fi
+        else
+            printf "Error: zenity, xmessage, or powershell executable not found.\n"
+            exit 1
+        fi
+    else
+        read -p "Do you want to apply that patch (Y - Apply, N - Do not apply, S - Apply and stage files)? [Y/N/S] " yn
+    fi
    case $yn in
        [Yy] ) git apply $patch;
            printf "The patch was applied. You can now stage the changes and commit again.\n\n";
@ -0,0 +1,103 @@
Param (
    [string]$file = "",
    [string]$text = "",
    [string]$buttons = "OK:0",
    [string]$default = "",
    [switch]$nearmouse = $false,
    [switch]$center = $false,
    [string]$geometry = "",
    [int32]$timeout = 0,
    [string]$title = "Message"
)
Add-Type -assembly System.Windows.Forms

$global:Result = 0

$main_form = New-Object System.Windows.Forms.Form
$main_form.Text = $title

$geometry_data = $geometry.Split("+")
if ($geometry_data.Length -ge 1) {
    $size_data = $geometry_data[0].Split("x")
    if ($size_data.Length -eq 2) {
        $main_form.Width = $size_data[0]
        $main_form.Height = $size_data[1]
    }
}
if ($geometry_data.Length -eq 3) {
    $main_form.StartPosition = [System.Windows.Forms.FormStartPosition]::Manual
    $main_form.Location = New-Object System.Drawing.Point($geometry_data[1], $geometry_data[2])
}
if ($nearmouse) {
    $main_form.StartPosition = [System.Windows.Forms.FormStartPosition]::Manual
    $main_form.Location = System.Windows.Forms.Cursor.Position
}
if ($center) {
    $main_form.StartPosition = [System.Windows.Forms.FormStartPosition]::CenterScreen
}

$main_form.SuspendLayout()

$button_panel = New-Object System.Windows.Forms.FlowLayoutPanel
$button_panel.SuspendLayout()
$button_panel.FlowDirection = [System.Windows.Forms.FlowDirection]::RightToLeft
$button_panel.Dock = [System.Windows.Forms.DockStyle]::Bottom
$button_panel.Autosize = $true

if ($file -ne "") {
    $text = [IO.File]::ReadAllText($file).replace("`n", "`r`n")
}

if ($text -ne "") {
    $text_box = New-Object System.Windows.Forms.TextBox
    $text_box.Multiline = $true
    $text_box.ReadOnly = $true
    $text_box.Autosize = $true
    $text_box.Text = $text
    $text_box.Select(0,0)
    $text_box.Dock = [System.Windows.Forms.DockStyle]::Fill
    $main_form.Controls.Add($text_box)
}

$buttons_array = $buttons.Split(",")
foreach ($button in $buttons_array) {
    $button_data = $button.Split(":")
    $button_ctl = New-Object System.Windows.Forms.Button
    if ($button_data.Length -eq 2) {
        $button_ctl.Tag = $button_data[1]
    } else {
        $button_ctl.Tag = 100 + $buttons_array.IndexOf($button)
    }
    if ($default -eq $button_data[0]) {
        $main_form.AcceptButton = $button_ctl
    }
    $button_ctl.Autosize = $true
    $button_ctl.Text = $button_data[0]
    $button_ctl.Add_Click(
        {
            Param($sender)
            $global:Result = $sender.Tag
            $main_form.Close()
        }
    )
    $button_panel.Controls.Add($button_ctl)
}
$main_form.Controls.Add($button_panel)

$button_panel.ResumeLayout($false)
$main_form.ResumeLayout($false)

if ($timeout -gt 0) {
    $timer = New-Object System.Windows.Forms.Timer
    $timer.Add_Tick(
        {
            $global:Result = 0
            $main_form.Close()
        }
    )
    $timer.Interval = $timeout
    $timer.Start()
}
$dlg_res = $main_form.ShowDialog()

[Environment]::Exit($global:Result)
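The winmessage.ps1 helper above accepts an X11-style `-geometry` argument such as `800x600+20+40` (size plus an optional screen position). A minimal Python sketch of that parsing logic, under the assumption that both position components are given together (the function name `parse_geometry` is hypothetical):

```python
def parse_geometry(geometry: str):
    # Split "WIDTHxHEIGHT+X+Y" the same way the PowerShell script does:
    # first on "+" to separate size from position, then the size on "x".
    parts = geometry.split("+")
    width = height = x = y = None
    size = parts[0].split("x")
    if len(size) == 2:
        width, height = int(size[0]), int(size[1])
    if len(parts) == 3:  # position is optional
        x, y = int(parts[1]), int(parts[2])
    return width, height, x, y

print(parse_geometry("800x600"))        # size only
print(parse_geometry("800x600+20+40"))  # size and position
```

As in the script, a geometry without `+X+Y` sets only the window size and leaves the start position to the other switches (`-center`, `-nearmouse`).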
@ -37,24 +37,24 @@ files = open("files", "r")

fname = files.readline()

-while (fname != ""):
+while fname != "":

    # Handle replacing $filename with actual filename and keep alignment
    fsingle = fname.strip()
-    if (fsingle.find("/") != -1):
-        fsingle = fsingle[fsingle.rfind("/") + 1:]
+    if fsingle.find("/") != -1:
+        fsingle = fsingle[fsingle.rfind("/") + 1 :]
    rep_fl = "$filename"
    rep_fi = fsingle
    len_fl = len(rep_fl)
    len_fi = len(rep_fi)
    # Pad with spaces to keep alignment
-    if (len_fi < len_fl):
+    if len_fi < len_fl:
        for x in range(len_fl - len_fi):
            rep_fi += " "
-    elif (len_fl < len_fi):
+    elif len_fl < len_fi:
        for x in range(len_fi - len_fl):
            rep_fl += " "
-    if (header.find(rep_fl) != -1):
+    if header.find(rep_fl) != -1:
        text = header.replace(rep_fl, rep_fi)
    else:
        text = header.replace("$filename", fsingle)

@ -71,21 +71,21 @@ while (fname != ""):
    line = fileread.readline()
    header_done = False

-    while (line.strip() == ""): # Skip empty lines at the top
+    while line.strip() == "": # Skip empty lines at the top
        line = fileread.readline()

-    if (line.find("/**********") == -1): # Godot header starts this way
+    if line.find("/**********") == -1: # Godot header starts this way
        # Maybe starting with a non-Godot comment, abort header magic
        header_done = True

-    while (not header_done): # Handle header now
-        if (line.find("/*") != 0): # No more starting with a comment
+    while not header_done: # Handle header now
+        if line.find("/*") != 0: # No more starting with a comment
            header_done = True
-            if (line.strip() != ""):
+            if line.strip() != "":
                text += line
        line = fileread.readline()

-    while (line != ""): # Dump everything until EOF
+    while line != "": # Dump everything until EOF
        text += line
        line = fileread.readline()
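The alignment logic in the header script above pads whichever of `$filename` or the real file name is shorter, so substituting one for the other never shifts the right-hand border of the boxed license comment. A self-contained Python sketch of that step (the function name is hypothetical, extracted for illustration):

```python
def replace_filename_keep_alignment(header: str, fname: str) -> str:
    # Pad the shorter of the placeholder and the file name with spaces so
    # both have equal length, then substitute; the comment box width in
    # the license header is preserved.
    rep_fl = "$filename"
    rep_fi = fname
    if len(rep_fi) < len(rep_fl):
        rep_fi += " " * (len(rep_fl) - len(rep_fi))
    elif len(rep_fl) < len(rep_fi):
        rep_fl += " " * (len(rep_fi) - len(rep_fl))
    if header.find(rep_fl) != -1:
        return header.replace(rep_fl, rep_fi)
    return header.replace("$filename", fname)

print(replace_filename_keep_alignment("/* $filename   */", "a.cpp"))
```

Because a padded nine-character string replaces the nine-character `$filename` token, the output line has exactly the same length as the template line.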
@ -0,0 +1,48 @@
#!/bin/sh

BLACK=black
BLACK_OPTIONS="-l 120"

if [ "$TRAVIS_PULL_REQUEST" != "false" ]; then
    # Travis only clones the PR branch and uses its HEAD commit as detached HEAD,
    # so it's problematic when we want an exact commit range for format checks.
    # We fetch upstream to ensure that we have the proper references to resolve.
    # Ideally we would use $TRAVIS_COMMIT_RANGE but it doesn't play well with PR
    # updates, as it only includes changes since the previous state of the PR.
    if [ -z "$(git remote | grep upstream)" ]; then
        git remote add upstream https://github.com/godotengine/godot \
            --no-tags -f -t $TRAVIS_BRANCH
    fi
    RANGE="upstream/$TRAVIS_BRANCH HEAD"
else
    # Test only the last commit, since $TRAVIS_COMMIT_RANGE wouldn't support
    # force pushes.
    RANGE=HEAD
fi

FILES=$(git diff-tree --no-commit-id --name-only -r $RANGE | grep -v thirdparty/ | grep -E "(SConstruct|SCsub|\.py)$")
echo "Checking files:\n$FILES"

# create a random filename to store our generated patch
prefix="static-check-black"
suffix="$(date +%s)"
patch="/tmp/$prefix-$suffix.patch"

for file in $FILES; do
    "$BLACK" "$BLACK_OPTIONS" --diff "$file" | \
        sed -e "1s|--- |--- a/|" -e "2s|+++ |+++ b/|" >> "$patch"
done

# if no patch has been generated all is ok, clean up the file stub and exit
if [ ! -s "$patch" ] ; then
    printf "Files in this commit comply with the black formatting rules.\n"
    rm -f "$patch"
    exit 0
fi

# a patch has been created, notify the user and exit
printf "\n*** The following differences were found between the code to commit "
printf "and the black formatting rules:\n\n"
pygmentize -l diff "$patch"
printf "\n*** Aborting, please fix your commit(s) with 'git commit --amend' or 'git rebase -i <hash>'\n"
exit 1
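The CI check above selects which changed files black should see with two greps: everything under `thirdparty/` is dropped, and only `SConstruct`, `SCsub`, and `*.py` files are kept. A minimal Python sketch of that filter (the helper name `files_to_check` is hypothetical):

```python
import re

def files_to_check(changed_files):
    # Mirror the shell pipeline: `grep -v thirdparty/` then
    # `grep -E "(SConstruct|SCsub|\.py)$"`.
    pattern = re.compile(r"(SConstruct|SCsub|\.py)$")
    return [f for f in changed_files
            if "thirdparty/" not in f and pattern.search(f)]

print(files_to_check(["SConstruct", "modules/arkit/SCsub",
                      "thirdparty/assimp/SConstruct", "core/object.cpp",
                      "doc/tools/makerst.py"]))
```

Note the anchored `$`: a C++ file or a vendored copy under `thirdparty/` never reaches the formatter, which matches the hook's behavior of skipping third-party sources.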
@ -8,8 +8,10 @@ if [ "$TRAVIS_PULL_REQUEST" != "false" ]; then
    # We fetch upstream to ensure that we have the proper references to resolve.
    # Ideally we would use $TRAVIS_COMMIT_RANGE but it doesn't play well with PR
    # updates, as it only includes changes since the previous state of the PR.
-    git remote add upstream https://github.com/godotengine/godot \
-        --no-tags -f -t $TRAVIS_BRANCH
+    if [ -z "$(git remote | grep upstream)" ]; then
+        git remote add upstream https://github.com/godotengine/godot \
+            --no-tags -f -t $TRAVIS_BRANCH
+    fi
    RANGE="upstream/$TRAVIS_BRANCH HEAD"
else
    # Test only the last commit, since $TRAVIS_COMMIT_RANGE wouldn't support

@ -41,6 +43,6 @@ fi
# a patch has been created, notify the user and exit
printf "\n*** The following differences were found between the code to commit "
printf "and the clang-format rules:\n\n"
-cat "$patch"
+pygmentize -l diff "$patch"
printf "\n*** Aborting, please fix your commit(s) with 'git commit --amend' or 'git rebase -i <hash>'\n"
exit 1
@ -6,14 +6,14 @@ import os

env_modules = env.Clone()

-Export('env_modules')
+Export("env_modules")

env.modules_sources = []

env_modules.add_source_files(env.modules_sources, "register_module_types.gen.cpp")

for name, path in env.module_list.items():
-    if (name in env.disabled_modules):
+    if name in env.disabled_modules:
        continue

    env_modules.Append(CPPDEFINES=["MODULE_" + name.upper() + "_ENABLED"])

@ -22,8 +22,8 @@ for name, path in env.module_list.items():
    else:
        SConscript(path + "/SCsub") # Custom.

-if env['split_libmodules']:
-    env.split_lib("modules", env_lib = env_modules)
+if env["split_libmodules"]:
+    env.split_lib("modules", env_lib=env_modules)
else:
    lib = env_modules.add_library("modules", env.modules_sources)
@ -1,7 +1,7 @@
#!/usr/bin/env python

-Import('env')
-Import('env_modules')
+Import("env")
+Import("env_modules")

env_arkit = env_modules.Clone()

@ -9,4 +9,4 @@ env_arkit = env_modules.Clone()
modules_sources = []
env_arkit.add_source_files(modules_sources, "*.cpp")
env_arkit.add_source_files(modules_sources, "*.mm")
-mod_lib = env_modules.add_library('#bin/libgodot_arkit_module' + env['LIBSUFFIX'], modules_sources)
+mod_lib = env_modules.add_library("#bin/libgodot_arkit_module" + env["LIBSUFFIX"], modules_sources)
@ -1,5 +1,6 @@
def can_build(env, platform):
-    return platform == 'iphone'
+    return platform == "iphone"


def configure(env):
    pass
@ -1,7 +1,7 @@
|
|||
#!/usr/bin/env python
|
||||
|
||||
Import('env')
|
||||
Import('env_modules')
|
||||
Import("env")
|
||||
Import("env_modules")
|
||||
|
||||
env_assimp = env_modules.Clone()
|
||||
|
||||
|
@ -10,85 +10,85 @@ env_assimp = env_modules.Clone()
|
|||
if True: # env['builtin_assimp']:
|
||||
thirdparty_dir = "#thirdparty/assimp"
|
||||
|
||||
env_assimp.Prepend(CPPPATH=['#thirdparty/assimp'])
|
||||
env_assimp.Prepend(CPPPATH=['#thirdparty/assimp/code'])
|
||||
env_assimp.Prepend(CPPPATH=['#thirdparty/assimp/include'])
|
||||
env_assimp.Prepend(CPPPATH=["#thirdparty/assimp"])
|
||||
env_assimp.Prepend(CPPPATH=["#thirdparty/assimp/code"])
|
||||
env_assimp.Prepend(CPPPATH=["#thirdparty/assimp/include"])
|
||||
|
||||
#env_assimp.Append(CPPDEFINES=['ASSIMP_DOUBLE_PRECISION']) # TODO default to what godot is compiled with for future double support
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_SINGLETHREADED'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_BOOST_WORKAROUND'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_OWN_ZLIB'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_EXPORT'])
|
||||
# env_assimp.Append(CPPDEFINES=['ASSIMP_DOUBLE_PRECISION']) # TODO default to what godot is compiled with for future double support
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_SINGLETHREADED"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_BOOST_WORKAROUND"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_OWN_ZLIB"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_EXPORT"])
|
||||
|
||||
# Importers we don't need
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_3D_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_3DS_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_3MF_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_AC_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_AMF_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_ASE_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_ASSBIN_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_B3D_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_BLEND_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_BVH_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_C4D_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_COB_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_COLLADA_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_CSM_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_DXF_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_GLTF2_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_GLTF_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_HMP_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_IFC_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_IRR_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_IRRMESH_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_LWO_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_LWS_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_M3D_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_MD2_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_MD3_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_MD5_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_MD5_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_MDC_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_MDL_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_MMD_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_MS3D_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_NDO_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_NFF_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_OBJ_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_OFF_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_OGRE_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_OPENGEX_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_PLY_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_Q3BSP_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_Q3D_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_RAW_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_SIB_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_SMD_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_STEP_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_STL_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_TERRAGEN_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_X3D_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_XGL_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=['ASSIMP_BUILD_NO_X_IMPORTER'])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_3D_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_3DS_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_3MF_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_AC_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_AMF_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_ASE_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_ASSBIN_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_B3D_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_BLEND_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_BVH_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_C4D_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_COB_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_COLLADA_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_CSM_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_DXF_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_GLTF2_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_GLTF_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_HMP_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_IFC_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_IRR_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_IRRMESH_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_LWO_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_LWS_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_M3D_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_MD2_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_MD3_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_MD5_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_MD5_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_MDC_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_MDL_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_MMD_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_MS3D_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_NDO_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_NFF_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_OBJ_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_OFF_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_OGRE_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_OPENGEX_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_PLY_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_Q3BSP_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_Q3D_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_RAW_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_SIB_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_SMD_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_STEP_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_STL_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_TERRAGEN_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_X3D_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_XGL_IMPORTER"])
|
||||
env_assimp.Append(CPPDEFINES=["ASSIMP_BUILD_NO_X_IMPORTER"])
|
||||
|

-if(env['platform'] == 'windows'):
-    env_assimp.Append(CPPDEFINES=['PLATFORM_WINDOWS'])
-    env_assimp.Append(CPPDEFINES=[('PLATFORM', 'WINDOWS')])
-elif(env['platform'] == 'x11'):
-    env_assimp.Append(CPPDEFINES=['PLATFORM_LINUX'])
-    env_assimp.Append(CPPDEFINES=[('PLATFORM', 'LINUX')])
-elif(env['platform'] == 'osx'):
-    env_assimp.Append(CPPDEFINES=['PLATFORM_DARWIN'])
-    env_assimp.Append(CPPDEFINES=[('PLATFORM', 'DARWIN')])
+if env["platform"] == "windows":
+    env_assimp.Append(CPPDEFINES=["PLATFORM_WINDOWS"])
+    env_assimp.Append(CPPDEFINES=[("PLATFORM", "WINDOWS")])
+elif env["platform"] == "x11":
+    env_assimp.Append(CPPDEFINES=["PLATFORM_LINUX"])
+    env_assimp.Append(CPPDEFINES=[("PLATFORM", "LINUX")])
+elif env["platform"] == "osx":
+    env_assimp.Append(CPPDEFINES=["PLATFORM_DARWIN"])
+    env_assimp.Append(CPPDEFINES=[("PLATFORM", "DARWIN")])

 env_thirdparty = env_assimp.Clone()
 env_thirdparty.disable_warnings()
-env_thirdparty.add_source_files(env.modules_sources, Glob('#thirdparty/assimp/code/CApi/*.cpp'))
-env_thirdparty.add_source_files(env.modules_sources, Glob('#thirdparty/assimp/code/Common/*.cpp'))
-env_thirdparty.add_source_files(env.modules_sources, Glob('#thirdparty/assimp/code/PostProcessing/*.cpp'))
-env_thirdparty.add_source_files(env.modules_sources, Glob('#thirdparty/assimp/code/Material/*.cpp'))
-env_thirdparty.add_source_files(env.modules_sources, Glob('#thirdparty/assimp/code/FBX/*.cpp'))
+env_thirdparty.add_source_files(env.modules_sources, Glob("#thirdparty/assimp/code/CApi/*.cpp"))
+env_thirdparty.add_source_files(env.modules_sources, Glob("#thirdparty/assimp/code/Common/*.cpp"))
+env_thirdparty.add_source_files(env.modules_sources, Glob("#thirdparty/assimp/code/PostProcessing/*.cpp"))
+env_thirdparty.add_source_files(env.modules_sources, Glob("#thirdparty/assimp/code/Material/*.cpp"))
+env_thirdparty.add_source_files(env.modules_sources, Glob("#thirdparty/assimp/code/FBX/*.cpp"))

 # Godot's own source files
 env_assimp.add_source_files(env.modules_sources, "*.cpp")

@@ -1,5 +1,6 @@
 def can_build(env, platform):
-    return env['tools']
+    return env["tools"]

+
 def configure(env):
     pass

@@ -44,7 +44,6 @@
 #include <assimp/scene.h>
 #include <assimp/Importer.hpp>
 #include <assimp/LogStream.hpp>
-#include <string>

 // move into assimp
 aiBone *get_bone_by_name(const aiScene *scene, aiString bone_name) {

@@ -104,8 +103,6 @@ void EditorSceneImporterAssimp::_bind_methods() {
 Node *EditorSceneImporterAssimp::import_scene(const String &p_path, uint32_t p_flags, int p_bake_fps,
         List<String> *r_missing_deps, Error *r_err) {
     Assimp::Importer importer;
-    std::wstring w_path = ProjectSettings::get_singleton()->globalize_path(p_path).c_str();
-    std::string s_path(w_path.begin(), w_path.end());
     importer.SetPropertyBool(AI_CONFIG_PP_FD_REMOVE, true);
     // Cannot remove pivot points because the static mesh will be in the wrong place
     importer.SetPropertyBool(AI_CONFIG_IMPORT_FBX_PRESERVE_PIVOTS, false);

@@ -147,7 +144,8 @@ Node *EditorSceneImporterAssimp::import_scene(const String &p_path, uint32_t p_f
     // aiProcess_EmbedTextures |
     //aiProcess_SplitByBoneCount |
     0;
-    aiScene *scene = (aiScene *)importer.ReadFile(s_path.c_str(), post_process_Steps);
+    String g_path = ProjectSettings::get_singleton()->globalize_path(p_path);
+    aiScene *scene = (aiScene *)importer.ReadFile(g_path.utf8().ptr(), post_process_Steps);

     ERR_FAIL_COND_V_MSG(scene == NULL, NULL, String("Open Asset Import failed to open: ") + String(importer.GetErrorString()));

@@ -1,7 +1,7 @@
 #!/usr/bin/env python

-Import('env')
-Import('env_modules')
+Import("env")
+Import("env_modules")

 env_bmp = env_modules.Clone()

@@ -1,5 +1,6 @@
 def can_build(env, platform):
     return True

+
 def configure(env):
     pass

@@ -1,208 +1,203 @@
 #!/usr/bin/env python

-Import('env')
-Import('env_modules')
+Import("env")
+Import("env_modules")

 env_bullet = env_modules.Clone()

 # Thirdparty source files

-if env['builtin_bullet']:
+if env["builtin_bullet"]:
     # Build only version 2 for now (as of 2.89)
     # Sync file list with relevant upstream CMakeLists.txt for each folder.
     thirdparty_dir = "#thirdparty/bullet/"

     bullet2_src = [
         # BulletCollision
-        "BulletCollision/BroadphaseCollision/btAxisSweep3.cpp"
-        , "BulletCollision/BroadphaseCollision/btBroadphaseProxy.cpp"
-        , "BulletCollision/BroadphaseCollision/btCollisionAlgorithm.cpp"
-        , "BulletCollision/BroadphaseCollision/btDbvt.cpp"
-        , "BulletCollision/BroadphaseCollision/btDbvtBroadphase.cpp"
-        , "BulletCollision/BroadphaseCollision/btDispatcher.cpp"
-        , "BulletCollision/BroadphaseCollision/btOverlappingPairCache.cpp"
-        , "BulletCollision/BroadphaseCollision/btQuantizedBvh.cpp"
-        , "BulletCollision/BroadphaseCollision/btSimpleBroadphase.cpp"
-        , "BulletCollision/CollisionDispatch/btActivatingCollisionAlgorithm.cpp"
-        , "BulletCollision/CollisionDispatch/btBoxBoxCollisionAlgorithm.cpp"
-        , "BulletCollision/CollisionDispatch/btBox2dBox2dCollisionAlgorithm.cpp"
-        , "BulletCollision/CollisionDispatch/btBoxBoxDetector.cpp"
-        , "BulletCollision/CollisionDispatch/btCollisionDispatcher.cpp"
-        , "BulletCollision/CollisionDispatch/btCollisionDispatcherMt.cpp"
-        , "BulletCollision/CollisionDispatch/btCollisionObject.cpp"
-        , "BulletCollision/CollisionDispatch/btCollisionWorld.cpp"
-        , "BulletCollision/CollisionDispatch/btCollisionWorldImporter.cpp"
-        , "BulletCollision/CollisionDispatch/btCompoundCollisionAlgorithm.cpp"
-        , "BulletCollision/CollisionDispatch/btCompoundCompoundCollisionAlgorithm.cpp"
-        , "BulletCollision/CollisionDispatch/btConvexConcaveCollisionAlgorithm.cpp"
-        , "BulletCollision/CollisionDispatch/btConvexConvexAlgorithm.cpp"
-        , "BulletCollision/CollisionDispatch/btConvexPlaneCollisionAlgorithm.cpp"
-        , "BulletCollision/CollisionDispatch/btConvex2dConvex2dAlgorithm.cpp"
-        , "BulletCollision/CollisionDispatch/btDefaultCollisionConfiguration.cpp"
-        , "BulletCollision/CollisionDispatch/btEmptyCollisionAlgorithm.cpp"
-        , "BulletCollision/CollisionDispatch/btGhostObject.cpp"
-        , "BulletCollision/CollisionDispatch/btHashedSimplePairCache.cpp"
-        , "BulletCollision/CollisionDispatch/btInternalEdgeUtility.cpp"
-        , "BulletCollision/CollisionDispatch/btManifoldResult.cpp"
-        , "BulletCollision/CollisionDispatch/btSimulationIslandManager.cpp"
-        , "BulletCollision/CollisionDispatch/btSphereBoxCollisionAlgorithm.cpp"
-        , "BulletCollision/CollisionDispatch/btSphereSphereCollisionAlgorithm.cpp"
-        , "BulletCollision/CollisionDispatch/btSphereTriangleCollisionAlgorithm.cpp"
-        , "BulletCollision/CollisionDispatch/btUnionFind.cpp"
-        , "BulletCollision/CollisionDispatch/SphereTriangleDetector.cpp"
-        , "BulletCollision/CollisionShapes/btBoxShape.cpp"
-        , "BulletCollision/CollisionShapes/btBox2dShape.cpp"
-        , "BulletCollision/CollisionShapes/btBvhTriangleMeshShape.cpp"
-        , "BulletCollision/CollisionShapes/btCapsuleShape.cpp"
-        , "BulletCollision/CollisionShapes/btCollisionShape.cpp"
-        , "BulletCollision/CollisionShapes/btCompoundShape.cpp"
-        , "BulletCollision/CollisionShapes/btConcaveShape.cpp"
-        , "BulletCollision/CollisionShapes/btConeShape.cpp"
-        , "BulletCollision/CollisionShapes/btConvexHullShape.cpp"
-        , "BulletCollision/CollisionShapes/btConvexInternalShape.cpp"
-        , "BulletCollision/CollisionShapes/btConvexPointCloudShape.cpp"
-        , "BulletCollision/CollisionShapes/btConvexPolyhedron.cpp"
-        , "BulletCollision/CollisionShapes/btConvexShape.cpp"
-        , "BulletCollision/CollisionShapes/btConvex2dShape.cpp"
-        , "BulletCollision/CollisionShapes/btConvexTriangleMeshShape.cpp"
-        , "BulletCollision/CollisionShapes/btCylinderShape.cpp"
-        , "BulletCollision/CollisionShapes/btEmptyShape.cpp"
-        , "BulletCollision/CollisionShapes/btHeightfieldTerrainShape.cpp"
-        , "BulletCollision/CollisionShapes/btMiniSDF.cpp"
-        , "BulletCollision/CollisionShapes/btMinkowskiSumShape.cpp"
-        , "BulletCollision/CollisionShapes/btMultimaterialTriangleMeshShape.cpp"
-        , "BulletCollision/CollisionShapes/btMultiSphereShape.cpp"
-        , "BulletCollision/CollisionShapes/btOptimizedBvh.cpp"
-        , "BulletCollision/CollisionShapes/btPolyhedralConvexShape.cpp"
-        , "BulletCollision/CollisionShapes/btScaledBvhTriangleMeshShape.cpp"
-        , "BulletCollision/CollisionShapes/btSdfCollisionShape.cpp"
-        , "BulletCollision/CollisionShapes/btShapeHull.cpp"
-        , "BulletCollision/CollisionShapes/btSphereShape.cpp"
-        , "BulletCollision/CollisionShapes/btStaticPlaneShape.cpp"
-        , "BulletCollision/CollisionShapes/btStridingMeshInterface.cpp"
-        , "BulletCollision/CollisionShapes/btTetrahedronShape.cpp"
-        , "BulletCollision/CollisionShapes/btTriangleBuffer.cpp"
-        , "BulletCollision/CollisionShapes/btTriangleCallback.cpp"
-        , "BulletCollision/CollisionShapes/btTriangleIndexVertexArray.cpp"
-        , "BulletCollision/CollisionShapes/btTriangleIndexVertexMaterialArray.cpp"
-        , "BulletCollision/CollisionShapes/btTriangleMesh.cpp"
-        , "BulletCollision/CollisionShapes/btTriangleMeshShape.cpp"
-        , "BulletCollision/CollisionShapes/btUniformScalingShape.cpp"
-        , "BulletCollision/Gimpact/btContactProcessing.cpp"
-        , "BulletCollision/Gimpact/btGenericPoolAllocator.cpp"
-        , "BulletCollision/Gimpact/btGImpactBvh.cpp"
-        , "BulletCollision/Gimpact/btGImpactCollisionAlgorithm.cpp"
-        , "BulletCollision/Gimpact/btGImpactQuantizedBvh.cpp"
-        , "BulletCollision/Gimpact/btGImpactShape.cpp"
-        , "BulletCollision/Gimpact/btTriangleShapeEx.cpp"
-        , "BulletCollision/Gimpact/gim_box_set.cpp"
-        , "BulletCollision/Gimpact/gim_contact.cpp"
-        , "BulletCollision/Gimpact/gim_memory.cpp"
-        , "BulletCollision/Gimpact/gim_tri_collision.cpp"
-        , "BulletCollision/NarrowPhaseCollision/btContinuousConvexCollision.cpp"
-        , "BulletCollision/NarrowPhaseCollision/btConvexCast.cpp"
-        , "BulletCollision/NarrowPhaseCollision/btGjkConvexCast.cpp"
-        , "BulletCollision/NarrowPhaseCollision/btGjkEpa2.cpp"
-        , "BulletCollision/NarrowPhaseCollision/btGjkEpaPenetrationDepthSolver.cpp"
-        , "BulletCollision/NarrowPhaseCollision/btGjkPairDetector.cpp"
-        , "BulletCollision/NarrowPhaseCollision/btMinkowskiPenetrationDepthSolver.cpp"
-        , "BulletCollision/NarrowPhaseCollision/btPersistentManifold.cpp"
-        , "BulletCollision/NarrowPhaseCollision/btRaycastCallback.cpp"
-        , "BulletCollision/NarrowPhaseCollision/btSubSimplexConvexCast.cpp"
-        , "BulletCollision/NarrowPhaseCollision/btVoronoiSimplexSolver.cpp"
-        , "BulletCollision/NarrowPhaseCollision/btPolyhedralContactClipping.cpp"
+        "BulletCollision/BroadphaseCollision/btAxisSweep3.cpp",
+        "BulletCollision/BroadphaseCollision/btBroadphaseProxy.cpp",
+        "BulletCollision/BroadphaseCollision/btCollisionAlgorithm.cpp",
+        "BulletCollision/BroadphaseCollision/btDbvt.cpp",
+        "BulletCollision/BroadphaseCollision/btDbvtBroadphase.cpp",
+        "BulletCollision/BroadphaseCollision/btDispatcher.cpp",
+        "BulletCollision/BroadphaseCollision/btOverlappingPairCache.cpp",
+        "BulletCollision/BroadphaseCollision/btQuantizedBvh.cpp",
+        "BulletCollision/BroadphaseCollision/btSimpleBroadphase.cpp",
+        "BulletCollision/CollisionDispatch/btActivatingCollisionAlgorithm.cpp",
+        "BulletCollision/CollisionDispatch/btBoxBoxCollisionAlgorithm.cpp",
+        "BulletCollision/CollisionDispatch/btBox2dBox2dCollisionAlgorithm.cpp",
+        "BulletCollision/CollisionDispatch/btBoxBoxDetector.cpp",
+        "BulletCollision/CollisionDispatch/btCollisionDispatcher.cpp",
+        "BulletCollision/CollisionDispatch/btCollisionDispatcherMt.cpp",
+        "BulletCollision/CollisionDispatch/btCollisionObject.cpp",
+        "BulletCollision/CollisionDispatch/btCollisionWorld.cpp",
+        "BulletCollision/CollisionDispatch/btCollisionWorldImporter.cpp",
+        "BulletCollision/CollisionDispatch/btCompoundCollisionAlgorithm.cpp",
+        "BulletCollision/CollisionDispatch/btCompoundCompoundCollisionAlgorithm.cpp",
+        "BulletCollision/CollisionDispatch/btConvexConcaveCollisionAlgorithm.cpp",
+        "BulletCollision/CollisionDispatch/btConvexConvexAlgorithm.cpp",
+        "BulletCollision/CollisionDispatch/btConvexPlaneCollisionAlgorithm.cpp",
+        "BulletCollision/CollisionDispatch/btConvex2dConvex2dAlgorithm.cpp",
+        "BulletCollision/CollisionDispatch/btDefaultCollisionConfiguration.cpp",
+        "BulletCollision/CollisionDispatch/btEmptyCollisionAlgorithm.cpp",
+        "BulletCollision/CollisionDispatch/btGhostObject.cpp",
+        "BulletCollision/CollisionDispatch/btHashedSimplePairCache.cpp",
+        "BulletCollision/CollisionDispatch/btInternalEdgeUtility.cpp",
+        "BulletCollision/CollisionDispatch/btManifoldResult.cpp",
+        "BulletCollision/CollisionDispatch/btSimulationIslandManager.cpp",
+        "BulletCollision/CollisionDispatch/btSphereBoxCollisionAlgorithm.cpp",
+        "BulletCollision/CollisionDispatch/btSphereSphereCollisionAlgorithm.cpp",
+        "BulletCollision/CollisionDispatch/btSphereTriangleCollisionAlgorithm.cpp",
+        "BulletCollision/CollisionDispatch/btUnionFind.cpp",
+        "BulletCollision/CollisionDispatch/SphereTriangleDetector.cpp",
+        "BulletCollision/CollisionShapes/btBoxShape.cpp",
+        "BulletCollision/CollisionShapes/btBox2dShape.cpp",
+        "BulletCollision/CollisionShapes/btBvhTriangleMeshShape.cpp",
+        "BulletCollision/CollisionShapes/btCapsuleShape.cpp",
+        "BulletCollision/CollisionShapes/btCollisionShape.cpp",
+        "BulletCollision/CollisionShapes/btCompoundShape.cpp",
+        "BulletCollision/CollisionShapes/btConcaveShape.cpp",
+        "BulletCollision/CollisionShapes/btConeShape.cpp",
+        "BulletCollision/CollisionShapes/btConvexHullShape.cpp",
+        "BulletCollision/CollisionShapes/btConvexInternalShape.cpp",
+        "BulletCollision/CollisionShapes/btConvexPointCloudShape.cpp",
+        "BulletCollision/CollisionShapes/btConvexPolyhedron.cpp",
+        "BulletCollision/CollisionShapes/btConvexShape.cpp",
+        "BulletCollision/CollisionShapes/btConvex2dShape.cpp",
+        "BulletCollision/CollisionShapes/btConvexTriangleMeshShape.cpp",
+        "BulletCollision/CollisionShapes/btCylinderShape.cpp",
+        "BulletCollision/CollisionShapes/btEmptyShape.cpp",
+        "BulletCollision/CollisionShapes/btHeightfieldTerrainShape.cpp",
+        "BulletCollision/CollisionShapes/btMiniSDF.cpp",
+        "BulletCollision/CollisionShapes/btMinkowskiSumShape.cpp",
+        "BulletCollision/CollisionShapes/btMultimaterialTriangleMeshShape.cpp",
+        "BulletCollision/CollisionShapes/btMultiSphereShape.cpp",
+        "BulletCollision/CollisionShapes/btOptimizedBvh.cpp",
+        "BulletCollision/CollisionShapes/btPolyhedralConvexShape.cpp",
+        "BulletCollision/CollisionShapes/btScaledBvhTriangleMeshShape.cpp",
+        "BulletCollision/CollisionShapes/btSdfCollisionShape.cpp",
+        "BulletCollision/CollisionShapes/btShapeHull.cpp",
+        "BulletCollision/CollisionShapes/btSphereShape.cpp",
+        "BulletCollision/CollisionShapes/btStaticPlaneShape.cpp",
+        "BulletCollision/CollisionShapes/btStridingMeshInterface.cpp",
+        "BulletCollision/CollisionShapes/btTetrahedronShape.cpp",
+        "BulletCollision/CollisionShapes/btTriangleBuffer.cpp",
+        "BulletCollision/CollisionShapes/btTriangleCallback.cpp",
+        "BulletCollision/CollisionShapes/btTriangleIndexVertexArray.cpp",
+        "BulletCollision/CollisionShapes/btTriangleIndexVertexMaterialArray.cpp",
+        "BulletCollision/CollisionShapes/btTriangleMesh.cpp",
+        "BulletCollision/CollisionShapes/btTriangleMeshShape.cpp",
+        "BulletCollision/CollisionShapes/btUniformScalingShape.cpp",
+        "BulletCollision/Gimpact/btContactProcessing.cpp",
+        "BulletCollision/Gimpact/btGenericPoolAllocator.cpp",
+        "BulletCollision/Gimpact/btGImpactBvh.cpp",
+        "BulletCollision/Gimpact/btGImpactCollisionAlgorithm.cpp",
+        "BulletCollision/Gimpact/btGImpactQuantizedBvh.cpp",
+        "BulletCollision/Gimpact/btGImpactShape.cpp",
+        "BulletCollision/Gimpact/btTriangleShapeEx.cpp",
+        "BulletCollision/Gimpact/gim_box_set.cpp",
+        "BulletCollision/Gimpact/gim_contact.cpp",
+        "BulletCollision/Gimpact/gim_memory.cpp",
+        "BulletCollision/Gimpact/gim_tri_collision.cpp",
+        "BulletCollision/NarrowPhaseCollision/btContinuousConvexCollision.cpp",
+        "BulletCollision/NarrowPhaseCollision/btConvexCast.cpp",
+        "BulletCollision/NarrowPhaseCollision/btGjkConvexCast.cpp",
+        "BulletCollision/NarrowPhaseCollision/btGjkEpa2.cpp",
+        "BulletCollision/NarrowPhaseCollision/btGjkEpaPenetrationDepthSolver.cpp",
+        "BulletCollision/NarrowPhaseCollision/btGjkPairDetector.cpp",
+        "BulletCollision/NarrowPhaseCollision/btMinkowskiPenetrationDepthSolver.cpp",
+        "BulletCollision/NarrowPhaseCollision/btPersistentManifold.cpp",
+        "BulletCollision/NarrowPhaseCollision/btRaycastCallback.cpp",
+        "BulletCollision/NarrowPhaseCollision/btSubSimplexConvexCast.cpp",
+        "BulletCollision/NarrowPhaseCollision/btVoronoiSimplexSolver.cpp",
+        "BulletCollision/NarrowPhaseCollision/btPolyhedralContactClipping.cpp",
         # BulletDynamics
-        , "BulletDynamics/Character/btKinematicCharacterController.cpp"
-        , "BulletDynamics/ConstraintSolver/btConeTwistConstraint.cpp"
-        , "BulletDynamics/ConstraintSolver/btContactConstraint.cpp"
-        , "BulletDynamics/ConstraintSolver/btFixedConstraint.cpp"
-        , "BulletDynamics/ConstraintSolver/btGearConstraint.cpp"
-        , "BulletDynamics/ConstraintSolver/btGeneric6DofConstraint.cpp"
-        , "BulletDynamics/ConstraintSolver/btGeneric6DofSpringConstraint.cpp"
-        , "BulletDynamics/ConstraintSolver/btGeneric6DofSpring2Constraint.cpp"
-        , "BulletDynamics/ConstraintSolver/btHinge2Constraint.cpp"
-        , "BulletDynamics/ConstraintSolver/btHingeConstraint.cpp"
-        , "BulletDynamics/ConstraintSolver/btPoint2PointConstraint.cpp"
-        , "BulletDynamics/ConstraintSolver/btSequentialImpulseConstraintSolver.cpp"
-        , "BulletDynamics/ConstraintSolver/btSequentialImpulseConstraintSolverMt.cpp"
-        , "BulletDynamics/ConstraintSolver/btBatchedConstraints.cpp"
-        , "BulletDynamics/ConstraintSolver/btNNCGConstraintSolver.cpp"
-        , "BulletDynamics/ConstraintSolver/btSliderConstraint.cpp"
-        , "BulletDynamics/ConstraintSolver/btSolve2LinearConstraint.cpp"
-        , "BulletDynamics/ConstraintSolver/btTypedConstraint.cpp"
-        , "BulletDynamics/ConstraintSolver/btUniversalConstraint.cpp"
-        , "BulletDynamics/Dynamics/btDiscreteDynamicsWorld.cpp"
-        , "BulletDynamics/Dynamics/btDiscreteDynamicsWorldMt.cpp"
-        , "BulletDynamics/Dynamics/btSimulationIslandManagerMt.cpp"
-        , "BulletDynamics/Dynamics/btRigidBody.cpp"
-        , "BulletDynamics/Dynamics/btSimpleDynamicsWorld.cpp"
-        #, "BulletDynamics/Dynamics/Bullet-C-API.cpp"
-        , "BulletDynamics/Vehicle/btRaycastVehicle.cpp"
-        , "BulletDynamics/Vehicle/btWheelInfo.cpp"
-        , "BulletDynamics/Featherstone/btMultiBody.cpp"
-        , "BulletDynamics/Featherstone/btMultiBodyConstraint.cpp"
-        , "BulletDynamics/Featherstone/btMultiBodyConstraintSolver.cpp"
-        , "BulletDynamics/Featherstone/btMultiBodyDynamicsWorld.cpp"
-        , "BulletDynamics/Featherstone/btMultiBodyFixedConstraint.cpp"
-        , "BulletDynamics/Featherstone/btMultiBodyGearConstraint.cpp"
-        , "BulletDynamics/Featherstone/btMultiBodyJointLimitConstraint.cpp"
-        , "BulletDynamics/Featherstone/btMultiBodyJointMotor.cpp"
-        , "BulletDynamics/Featherstone/btMultiBodyMLCPConstraintSolver.cpp"
-        , "BulletDynamics/Featherstone/btMultiBodyPoint2Point.cpp"
-        , "BulletDynamics/Featherstone/btMultiBodySliderConstraint.cpp"
-        , "BulletDynamics/Featherstone/btMultiBodySphericalJointMotor.cpp"
-        , "BulletDynamics/MLCPSolvers/btDantzigLCP.cpp"
-        , "BulletDynamics/MLCPSolvers/btMLCPSolver.cpp"
-        , "BulletDynamics/MLCPSolvers/btLemkeAlgorithm.cpp"
+        "BulletDynamics/Character/btKinematicCharacterController.cpp",
+        "BulletDynamics/ConstraintSolver/btConeTwistConstraint.cpp",
+        "BulletDynamics/ConstraintSolver/btContactConstraint.cpp",
+        "BulletDynamics/ConstraintSolver/btFixedConstraint.cpp",
+        "BulletDynamics/ConstraintSolver/btGearConstraint.cpp",
+        "BulletDynamics/ConstraintSolver/btGeneric6DofConstraint.cpp",
+        "BulletDynamics/ConstraintSolver/btGeneric6DofSpringConstraint.cpp",
+        "BulletDynamics/ConstraintSolver/btGeneric6DofSpring2Constraint.cpp",
+        "BulletDynamics/ConstraintSolver/btHinge2Constraint.cpp",
+        "BulletDynamics/ConstraintSolver/btHingeConstraint.cpp",
+        "BulletDynamics/ConstraintSolver/btPoint2PointConstraint.cpp",
+        "BulletDynamics/ConstraintSolver/btSequentialImpulseConstraintSolver.cpp",
+        "BulletDynamics/ConstraintSolver/btSequentialImpulseConstraintSolverMt.cpp",
+        "BulletDynamics/ConstraintSolver/btBatchedConstraints.cpp",
+        "BulletDynamics/ConstraintSolver/btNNCGConstraintSolver.cpp",
+        "BulletDynamics/ConstraintSolver/btSliderConstraint.cpp",
+        "BulletDynamics/ConstraintSolver/btSolve2LinearConstraint.cpp",
+        "BulletDynamics/ConstraintSolver/btTypedConstraint.cpp",
+        "BulletDynamics/ConstraintSolver/btUniversalConstraint.cpp",
+        "BulletDynamics/Dynamics/btDiscreteDynamicsWorld.cpp",
+        "BulletDynamics/Dynamics/btDiscreteDynamicsWorldMt.cpp",
+        "BulletDynamics/Dynamics/btSimulationIslandManagerMt.cpp",
+        "BulletDynamics/Dynamics/btRigidBody.cpp",
+        "BulletDynamics/Dynamics/btSimpleDynamicsWorld.cpp",
+        # "BulletDynamics/Dynamics/Bullet-C-API.cpp",
+        "BulletDynamics/Vehicle/btRaycastVehicle.cpp",
+        "BulletDynamics/Vehicle/btWheelInfo.cpp",
+        "BulletDynamics/Featherstone/btMultiBody.cpp",
+        "BulletDynamics/Featherstone/btMultiBodyConstraint.cpp",
+        "BulletDynamics/Featherstone/btMultiBodyConstraintSolver.cpp",
+        "BulletDynamics/Featherstone/btMultiBodyDynamicsWorld.cpp",
+        "BulletDynamics/Featherstone/btMultiBodyFixedConstraint.cpp",
+        "BulletDynamics/Featherstone/btMultiBodyGearConstraint.cpp",
+        "BulletDynamics/Featherstone/btMultiBodyJointLimitConstraint.cpp",
+        "BulletDynamics/Featherstone/btMultiBodyJointMotor.cpp",
+        "BulletDynamics/Featherstone/btMultiBodyMLCPConstraintSolver.cpp",
+        "BulletDynamics/Featherstone/btMultiBodyPoint2Point.cpp",
+        "BulletDynamics/Featherstone/btMultiBodySliderConstraint.cpp",
+        "BulletDynamics/Featherstone/btMultiBodySphericalJointMotor.cpp",
+        "BulletDynamics/MLCPSolvers/btDantzigLCP.cpp",
+        "BulletDynamics/MLCPSolvers/btMLCPSolver.cpp",
+        "BulletDynamics/MLCPSolvers/btLemkeAlgorithm.cpp",
         # BulletInverseDynamics
-        , "BulletInverseDynamics/IDMath.cpp"
-        , "BulletInverseDynamics/MultiBodyTree.cpp"
-        , "BulletInverseDynamics/details/MultiBodyTreeInitCache.cpp"
-        , "BulletInverseDynamics/details/MultiBodyTreeImpl.cpp"
+        "BulletInverseDynamics/IDMath.cpp",
+        "BulletInverseDynamics/MultiBodyTree.cpp",
+        "BulletInverseDynamics/details/MultiBodyTreeInitCache.cpp",
+        "BulletInverseDynamics/details/MultiBodyTreeImpl.cpp",
         # BulletSoftBody
-        , "BulletSoftBody/btSoftBody.cpp"
-        , "BulletSoftBody/btSoftBodyConcaveCollisionAlgorithm.cpp"
-        , "BulletSoftBody/btSoftBodyHelpers.cpp"
-        , "BulletSoftBody/btSoftBodyRigidBodyCollisionConfiguration.cpp"
-        , "BulletSoftBody/btSoftRigidCollisionAlgorithm.cpp"
-        , "BulletSoftBody/btSoftRigidDynamicsWorld.cpp"
-        , "BulletSoftBody/btSoftMultiBodyDynamicsWorld.cpp"
-        , "BulletSoftBody/btSoftSoftCollisionAlgorithm.cpp"
-        , "BulletSoftBody/btDefaultSoftBodySolver.cpp"
-        , "BulletSoftBody/btDeformableBackwardEulerObjective.cpp"
-        , "BulletSoftBody/btDeformableBodySolver.cpp"
-        , "BulletSoftBody/btDeformableMultiBodyConstraintSolver.cpp"
-        , "BulletSoftBody/btDeformableContactProjection.cpp"
-        , "BulletSoftBody/btDeformableMultiBodyDynamicsWorld.cpp"
-        , "BulletSoftBody/btDeformableContactConstraint.cpp"
+        "BulletSoftBody/btSoftBody.cpp",
+        "BulletSoftBody/btSoftBodyConcaveCollisionAlgorithm.cpp",
+        "BulletSoftBody/btSoftBodyHelpers.cpp",
+        "BulletSoftBody/btSoftBodyRigidBodyCollisionConfiguration.cpp",
+        "BulletSoftBody/btSoftRigidCollisionAlgorithm.cpp",
+        "BulletSoftBody/btSoftRigidDynamicsWorld.cpp",
+        "BulletSoftBody/btSoftMultiBodyDynamicsWorld.cpp",
+        "BulletSoftBody/btSoftSoftCollisionAlgorithm.cpp",
+        "BulletSoftBody/btDefaultSoftBodySolver.cpp",
+        "BulletSoftBody/btDeformableBackwardEulerObjective.cpp",
+        "BulletSoftBody/btDeformableBodySolver.cpp",
+        "BulletSoftBody/btDeformableMultiBodyConstraintSolver.cpp",
+        "BulletSoftBody/btDeformableContactProjection.cpp",
+        "BulletSoftBody/btDeformableMultiBodyDynamicsWorld.cpp",
+        "BulletSoftBody/btDeformableContactConstraint.cpp",
         # clew
-        , "clew/clew.c"
+        "clew/clew.c",
         # LinearMath
-        , "LinearMath/btAlignedAllocator.cpp"
-        , "LinearMath/btConvexHull.cpp"
-        , "LinearMath/btConvexHullComputer.cpp"
-        , "LinearMath/btGeometryUtil.cpp"
-        , "LinearMath/btPolarDecomposition.cpp"
-        , "LinearMath/btQuickprof.cpp"
-        , "LinearMath/btSerializer.cpp"
-        , "LinearMath/btSerializer64.cpp"
-        , "LinearMath/btThreads.cpp"
-        , "LinearMath/btVector3.cpp"
-        , "LinearMath/TaskScheduler/btTaskScheduler.cpp"
-        , "LinearMath/TaskScheduler/btThreadSupportPosix.cpp"
-        , "LinearMath/TaskScheduler/btThreadSupportWin32.cpp"
+        "LinearMath/btAlignedAllocator.cpp",
+        "LinearMath/btConvexHull.cpp",
+        "LinearMath/btConvexHullComputer.cpp",
+        "LinearMath/btGeometryUtil.cpp",
+        "LinearMath/btPolarDecomposition.cpp",
+        "LinearMath/btQuickprof.cpp",
+        "LinearMath/btSerializer.cpp",
+        "LinearMath/btSerializer64.cpp",
+        "LinearMath/btThreads.cpp",
+        "LinearMath/btVector3.cpp",
+        "LinearMath/TaskScheduler/btTaskScheduler.cpp",
+        "LinearMath/TaskScheduler/btThreadSupportPosix.cpp",
+        "LinearMath/TaskScheduler/btThreadSupportWin32.cpp",
     ]

     thirdparty_sources = [thirdparty_dir + file for file in bullet2_src]

     # Treat Bullet headers as system headers to avoid raising warnings. Not supported on MSVC.
     if not env.msvc:
-        env_bullet.Append(CPPFLAGS=['-isystem', Dir(thirdparty_dir).path])
+        env_bullet.Append(CPPFLAGS=["-isystem", Dir(thirdparty_dir).path])
     else:
         env_bullet.Prepend(CPPPATH=[thirdparty_dir])
 # if env['target'] == "debug" or env['target'] == "release_debug":

@@ -1,5 +1,6 @@
 def can_build(env, platform):
     return True

+
 def configure(env):
     pass
@@ -1,7 +1,7 @@
 #!/usr/bin/env python

-Import('env')
-Import('env_modules')
+Import("env")
+Import("env_modules")

 env_camera = env_modules.Clone()

@@ -10,7 +10,7 @@ if env["platform"] == "iphone":
     modules_sources = []
     env_camera.add_source_files(modules_sources, "register_types.cpp")
     env_camera.add_source_files(modules_sources, "camera_ios.mm")
-    mod_lib = env_modules.add_library('#bin/libgodot_camera_module' + env['LIBSUFFIX'], modules_sources)
+    mod_lib = env_modules.add_library("#bin/libgodot_camera_module" + env["LIBSUFFIX"], modules_sources)

 elif env["platform"] == "windows":
     env_camera.add_source_files(env.modules_sources, "register_types.cpp")

@@ -19,4 +19,3 @@ elif env["platform"] == "windows":
 elif env["platform"] == "osx":
     env_camera.add_source_files(env.modules_sources, "register_types.cpp")
     env_camera.add_source_files(env.modules_sources, "camera_osx.mm")
-

@@ -1,5 +1,6 @@
 def can_build(env, platform):
-    return platform == 'iphone' or platform == 'osx' or platform == 'windows'
+    return platform == "iphone" or platform == "osx" or platform == "windows"

+
 def configure(env):
     pass

@@ -1,7 +1,7 @@
 #!/usr/bin/env python

-Import('env')
-Import('env_modules')
+Import("env")
+Import("env_modules")

 env_csg = env_modules.Clone()

@@ -1,9 +1,11 @@
 def can_build(env, platform):
     return True

+
 def configure(env):
     pass

+
 def get_doc_classes():
     return [
         "CSGBox",

@@ -17,5 +19,6 @@ def get_doc_classes():
         "CSGTorus",
     ]

+
 def get_doc_path():
     return "doc_classes"

@@ -1,15 +1,13 @@
 #!/usr/bin/env python

-Import('env')
-Import('env_modules')
+Import("env")
+Import("env_modules")

 env_cvtt = env_modules.Clone()

 # Thirdparty source files
 thirdparty_dir = "#thirdparty/cvtt/"
-thirdparty_sources = [
-    "ConvectionKernels.cpp"
-]
+thirdparty_sources = ["ConvectionKernels.cpp"]

 thirdparty_sources = [thirdparty_dir + file for file in thirdparty_sources]

@@ -1,5 +1,6 @@
 def can_build(env, platform):
-    return env['tools']
+    return env["tools"]

+
 def configure(env):
     pass
@@ -1,7 +1,7 @@
 #!/usr/bin/env python

-Import('env')
-Import('env_modules')
+Import("env")
+Import("env_modules")

 env_dds = env_modules.Clone()
@@ -1,5 +1,6 @@
 def can_build(env, platform):
     return True

+
 def configure(env):
     pass
@@ -1,13 +1,13 @@
 #!/usr/bin/env python

-Import('env')
-Import('env_modules')
+Import("env")
+Import("env_modules")

 env_enet = env_modules.Clone()

 # Thirdparty source files

-if env['builtin_enet']:
+if env["builtin_enet"]:
     thirdparty_dir = "#thirdparty/enet/"
     thirdparty_sources = [
         "godot.cpp",
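The buildsystem hunks in this batch are almost entirely mechanical: the SCsub and config.py files were run through the `black` formatter, which normalizes single quotes to double quotes and rewrites the comma-first file lists with trailing commas, without changing what the scripts do. A minimal sanity check of that claim, using two short hypothetical excerpts in the old and new styles (standard library only):

```python
import ast

# Hypothetical excerpt in the old comma-first style used by these SCsub files.
old_style = """
bullet2_src = [
    "BulletCollision/BroadphaseCollision/btAxisSweep3.cpp"
    , "BulletCollision/BroadphaseCollision/btBroadphaseProxy.cpp"
]
"""

# The same excerpt after Black's reformat: trailing commas, one entry per line.
new_style = """
bullet2_src = [
    "BulletCollision/BroadphaseCollision/btAxisSweep3.cpp",
    "BulletCollision/BroadphaseCollision/btBroadphaseProxy.cpp",
]
"""

# The reformat is purely cosmetic: both versions parse to the same AST,
# so the build scripts behave identically before and after.
assert ast.dump(ast.parse(old_style)) == ast.dump(ast.parse(new_style))
print("ASTs match")  # prints "ASTs match"
```

Because only the presentation changes, formatting-only cherry-picks like these can be reviewed by comparing ASTs rather than reading every reflowed line.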
Some files were not shown because too many files have changed in this diff.