mirror of
https://github.com/zebrajr/ansible.git
synced 2025-12-06 00:19:48 +01:00
Templating overhaul, implement Data Tagging (#84621)
Co-authored-by: Matt Davis <mrd@redhat.com>
Co-authored-by: Matt Clay <matt@mystile.com>
This commit is contained in:
parent 6fc592df9b
commit 35750ed321
@@ -1,3 +0,0 @@
---
deprecated_features:
  - fact_cache - deprecate first_order_merge API (https://github.com/ansible/ansible/pull/84568).
3  changelogs/fragments/fix-is-filter-is-test.yml  Normal file
@@ -0,0 +1,3 @@
bugfixes:
  - Correctly return ``False`` when using the ``filter`` and ``test`` Jinja tests on plugin names which are not filters or tests, respectively.
    (resolves issue https://github.com/ansible/ansible/issues/82084)
179  changelogs/fragments/templates_types_datatagging.yml  Normal file
@@ -0,0 +1,179 @@
# DTFIX-RELEASE: document EncryptedString replacing AnsibleVaultEncryptedUnicode

major_changes:
  - variables - The type system underlying Ansible's variable storage has been significantly overhauled and formalized.
    Attempts to store unsupported Python object types in variables will now result in an error. # DTFIX-RELEASE: link to type system docs TBD
  - variables - To support new Ansible features, many variable objects are now represented by subclasses of their respective native Python types.
    In most cases, they behave indistinguishably from their original types, but some Python libraries do not handle builtin object subclasses properly.
    Custom plugins that interact with such libraries may require changes to convert and pass the native types. # DTFIX-RELEASE: link to plugin/data tagging API docs TBD
  - ansible-test - Packages beneath ``module_utils`` can now contain ``__init__.py`` files.
  - Jinja plugins - Jinja builtin filter and test plugins are now accessible via their fully-qualified names ``ansible.builtin.{name}``.
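The variable-subclassing behavior described above can be illustrated with a minimal sketch. ``TaggedStr`` below is a hypothetical stand-in, not Ansible's actual tagged type; it only demonstrates why libraries that use exact-type checks can misbehave with builtin subclasses.

```python
# Minimal sketch (not Ansible's implementation): a str subclass behaves like
# str for most purposes, but exact-type checks in third-party code fail.

class TaggedStr(str):
    """Hypothetical stand-in for an Ansible tagged string subclass."""
    tags: tuple = ()

value = TaggedStr("hello")

assert isinstance(value, str)    # subclass passes isinstance checks
assert value.upper() == "HELLO"  # normal str behavior is preserved
assert type(value) is not str    # but code doing `type(x) is str` misbehaves

# converting back to the native type for strict libraries:
native = str(value)
assert type(native) is str
```

Passing ``str(value)`` (or the equivalent native conversion for containers) is the workaround suggested above for libraries that cannot handle builtin subclasses.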
minor_changes:
  - templating - Templating errors now provide more information about both the location and context of the error, especially for deeply-nested and/or indirected templating scenarios.
  - templating - Handling of omitted values is now a first-class feature of the template engine, and is usable in all Ansible Jinja template contexts.
    Any template that resolves to ``omit`` is automatically removed from its parent container during templating. # DTFIX-RELEASE: porting guide entry
  - templating - Unified ``omit`` behavior now requires that plugins calling ``Templar.template()`` handle cases where the entire template result is omitted,
    by catching the ``AnsibleValueOmittedError`` that is raised.
    Previously, this condition caused a randomly-generated string marker to appear in the template result. # DTFIX-RELEASE: porting guide entry?
  - templating - Template evaluation is lazier than in previous versions.
    Template expressions which resolve only portions of a data structure no longer result in the entire structure being templated.
  - handlers - Templated handler names with syntax errors, or that resolve to ``omit``, are now skipped like handlers with undefined variables in their name.
  - env lookup - The error message generated for a missing environment variable when ``default`` is an undefined value (e.g. ``undef('something')``) will contain the hint from that undefined value,
    except when the undefined value is the default of ``undef()`` with no arguments. Previously, any existing undefined hint would be ignored.
  - templating - Embedding ``range()`` values in containers such as lists will result in an error on use.
    Previously the value would be converted to a string representing the range parameters, such as ``range(0, 3)``.
  - Jinja plugins - Plugins can declare support for undefined values. # DTFIX-RELEASE: examples, porting guide entry
  - templating - Variables of type ``set`` and ``tuple`` are now converted to ``list`` when exiting the final pass of templating.
  - templating - Access to an undefined variable from inside a lookup, filter, or test (which raises ``MarkerError``) no longer ends processing of the current template.
    The triggering undefined value is returned as the result of the offending plugin invocation, and the template continues to execute. # DTFIX-RELEASE: porting guide entry, samples needed
  - plugin error handling - When raising exceptions in an exception handler, be sure to use ``raise ... from`` as appropriate.
    This supersedes the use of the ``AnsibleError`` arg ``orig_exc`` to represent the cause.
    Specifying ``orig_exc`` as the cause is still permitted.
    Failure to use ``raise ... from`` when ``orig_exc`` is set will result in a warning.
    Additionally, if the two cause exceptions do not match, a warning will be issued. # DTFIX-RELEASE: this needs a porting guide entry
  - ansible-test - The ``yamllint`` sanity test now enforces string values for the ``!vault`` tag.
  - warnings - All warnings (including deprecation warnings) issued during a task's execution are now accessible via the ``warnings`` and ``deprecations`` keys on the task result.
  - troubleshooting - Tracebacks can be collected and displayed for most errors, warnings, and deprecation warnings (including those generated by modules).
    Tracebacks are no longer enabled with ``-vvv``; the behavior is directly configurable via the ``DISPLAY_TRACEBACK`` config option.
    Module tracebacks passed to ``fail_json`` via the ``exception`` kwarg will not be included in the task result unless error tracebacks are configured.
  - display - Deduplication of warning and error messages considers the full content of the message (including source and traceback contexts, if enabled).
    This may result in fewer messages being omitted.
  - modules - Unhandled exceptions during Python module execution are now returned as structured data from the target.
    This allows the new traceback handling to be applied to exceptions raised on targets.
  - modules - PowerShell modules can now receive ``datetime.date``, ``datetime.time`` and ``datetime.datetime`` values as ISO 8601 strings.
  - modules - PowerShell modules can now receive strings sourced from inline vault-encrypted strings.
  - from_json filter - The filter accepts a ``profile`` argument, which defaults to ``tagless``.
  - to_json / to_nice_json filters - The filters accept a ``profile`` argument, which defaults to ``tagless``.
  - undef jinja function - The ``undef`` jinja function now raises an error if a non-string hint is given.
    Attempting to use an undefined hint also results in an error, ensuring incorrect use of the function can be distinguished from the function's normal behavior.
  - display - The ``collection_name`` arg to ``Display.deprecated`` no longer has any effect.
    Information about the calling plugin is automatically captured by the display infrastructure, included in the displayed messages, and made available to callbacks.
  - modules - The ``collection_name`` arg to Python module-side ``deprecate`` methods no longer has any effect.
    Information about the calling module is automatically captured by the warning infrastructure and included in the module result.
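The ``raise ... from`` requirement in the plugin error handling entry above boils down to standard Python exception chaining. The sketch below uses plain exceptions (``TemplatePluginError`` is an illustrative name, not Ansible's class); in real plugin code the outer error would be an ``AnsibleError`` subclass.

```python
# Exception chaining with ``raise ... from``: the original failure is
# preserved as __cause__, which the new warning checks rely on.

class TemplatePluginError(Exception):
    pass

def render():
    raise ValueError("bad input")

caught = None
try:
    try:
        render()
    except ValueError as ex:
        # chain the low-level failure as the cause of the higher-level error
        raise TemplatePluginError("rendering failed") from ex
except TemplatePluginError as err:
    caught = err

assert isinstance(caught.__cause__, ValueError)
```

Omitting ``from ex`` while passing the equivalent of ``orig_exc`` is what now draws a warning, since the chained cause and the declared cause can disagree.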
breaking_changes:
  - loops - Omit placeholders no longer leak between loop item templating and task templating.
    Previously, ``omit`` placeholders could remain embedded in loop items after templating and be used as an ``omit`` for task templating.
    Now, values resolving to ``omit`` are dropped immediately when loop items are templated.
    To turn missing values into an ``omit`` for task templating, use ``| default(omit)``.
    This solution is backwards compatible with previous versions of ansible-core. # DTFIX-RELEASE: porting guide entry with examples
  - serialization of ``omit`` sentinel - Serialization of variables containing ``omit`` sentinels (e.g., by the ``to_json`` and ``to_yaml`` filters or ``ansible-inventory``) will fail if the variable has not completed templating.
    Previously, serialization succeeded with placeholder strings emitted in the serialized output.
  - conditionals - Conditional expressions that result in non-boolean values are now an error by default.
    Such results often indicate unintentional use of templates where they are not supported, resulting in a conditional that is always true.
    The error can be temporarily changed to a deprecation warning by enabling the ``ALLOW_BROKEN_CONDITIONALS`` config option.
    When that option is enabled, conditional expressions which are a literal ``None`` or empty string will evaluate as true, for backwards compatibility.
  - templating - Templates are always rendered in Jinja2 native mode.
    As a result, non-string values are no longer automatically converted to strings.
  - templating - Templates with embedded inline templates that were not contained within a Jinja string constant now result in an error, as support for multi-pass templating was removed for security reasons.
    In most cases, such templates can be easily rewritten to avoid the use of embedded inline templates.
  - templating - Conditionals and lookups which use embedded inline templates in Jinja string constants now display a warning.
    These templates should be converted to their expression equivalent.
  - templating - Templates resulting in ``None`` are no longer automatically converted to an empty string.
  - template lookup - The ``convert_data`` option is deprecated and no longer has any effect.
    Use the ``from_json`` filter on the lookup result instead.
  - templating - ``#jinja2:`` overrides in templates with invalid override names or types are now templating errors.
  - set_fact - The string values "yes", "no", "true" and "false" were previously converted (ignoring case) to boolean values when not using Jinja2 native mode.
    Since Jinja2 native mode is always used, this conversion no longer occurs.
    When boolean values are required, native boolean syntax should be used where variables are defined, such as in YAML.
    When native boolean syntax is not an option, the ``bool`` filter can be used to parse string values into booleans.
  - templating - The ``allow_unsafe_lookups`` option no longer has any effect.
    Lookup plugins are responsible for tagging strings containing templates to allow evaluation as a template.
  - assert - The ``quiet`` argument must be a commonly-accepted boolean value.
    Previously, unrecognized values were silently treated as False.
  - plugins - Any plugin that sources or creates templates must properly tag them as trusted. # DTFIX-RELEASE: porting guide entry for "how?" Don't forget to mention inventory plugin ``trusted_by_default`` config.
  - first_found lookup - When specifying ``files`` or ``paths`` as a templated list containing undefined values, the undefined list elements will be discarded with a warning.
    Previously, the entire list would be discarded without any warning.
  - templating - The result of the ``range()`` global function cannot be returned from a template; it should always be passed to a filter (e.g., ``random``).
    Previously, range objects returned from an intermediate template were always converted to a list, which is inconsistent with inline consumption of range objects.
  - plugins - Custom Jinja plugins that accept undefined top-level arguments must opt in to receiving them. # DTFIX-RELEASE: porting guide entry + backcompat behavior description
  - plugins - Custom Jinja plugins that use ``environment.getitem`` to retrieve undefined values will now trigger a ``MarkerError`` exception.
    This exception must be handled to allow the plugin to return a ``Marker``, or the plugin must opt in to accepting ``Marker`` values. # DTFIX-RELEASE: mention the decorator
  - templating - Many Jinja plugins (filters, lookups, tests) and methods previously silently ignored undefined inputs, which often masked subtle errors.
    Passing an undefined argument to a Jinja plugin or method that does not declare undefined support now results in an undefined value. # DTFIX-RELEASE: common examples, porting guide, `is defined`, `is undefined`, etc; porting guide should also mention that overly-broad exception handling may mask Undefined errors; also that lazy handling of Undefined can invoke a plugin and bomb out in the middle where it was previously never invoked (plugins with side effects, just don't)
  - lookup plugins - Lookup plugins called as ``with_(lookup)`` will no longer have the ``_subdir`` attribute set. # DTFIX-RELEASE: porting guide re: `ansible_lookup_context`
  - lookup plugins - ``terms`` will always be passed to ``run`` as the first positional arg, where previously it was sometimes passed as a keyword arg when using ``with_`` syntax.
  - callback plugins - The structure of the ``exception``, ``warnings`` and ``deprecations`` values visible to callbacks has changed. Callbacks that inspect or serialize these values may require special handling. # DTFIX-RELEASE: porting guide re ErrorDetail/WarningMessageDetail/DeprecationMessageDetail
  - modules - Ansible modules using ``sys.excepthook`` must use a standard ``try/except`` instead.
  - templating - Access to ``_`` prefixed attributes and methods, and methods with known side effects, is no longer permitted.
    In cases where a matching mapping key is present, the associated value will be returned instead of an error.
    This increases template environment isolation and ensures more consistent behavior between the ``.`` and ``[]`` operators.
  - inventory - Invalid variable names provided by inventories result in an inventory parse failure. This behavior is now consistent with other variable name usages throughout Ansible.
  - internals - The ``ansible.utils.native_jinja`` Python module has been removed.
  - internals - The ``AnsibleLoader`` and ``AnsibleDumper`` classes for working with YAML are now factory functions and cannot be extended.
  - public API - The ``ansible.vars.fact_cache.FactCache`` wrapper has been removed.
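The loop/omit breaking change above can be pictured with a toy sketch. ``OMIT`` and ``prune_omitted`` are hypothetical names for illustration only, not Ansible's internal API: the point is that values resolving to the omit sentinel are removed from their parent container rather than surviving as placeholders.

```python
# Toy sketch of first-class omit handling: any value that resolves to the
# omit sentinel is dropped from its enclosing dict or list.

OMIT = object()  # stand-in for Ansible's omit sentinel

def prune_omitted(value):
    if isinstance(value, dict):
        return {k: prune_omitted(v) for k, v in value.items() if v is not OMIT}
    if isinstance(value, list):
        return [prune_omitted(v) for v in value if v is not OMIT]
    return value

task_args = {"path": "/tmp/x", "mode": OMIT, "flags": ["a", OMIT, "b"]}
assert prune_omitted(task_args) == {"path": "/tmp/x", "flags": ["a", "b"]}
```

Because pruning happens when loop items are templated, an omitted loop value can no longer double as an ``omit`` for task templating; ``| default(omit)`` is the supported way to reintroduce one.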
security_fixes:
  - templating - Ansible's template engine no longer processes Jinja templates in strings unless they are marked as coming from a trusted source.
    Untrusted strings containing Jinja template markers are ignored with a warning.
    Examples of trusted sources include playbooks, vars files, and many inventory sources.
    Examples of untrusted sources include module results and facts.
    Plugins which have not been updated to preserve trust while manipulating strings may inadvertently cause them to lose their trusted status.
  - templating - Changes to conditional expression handling removed numerous instances of insecure multi-pass templating (which could result in execution of untrusted template expressions).
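The trust model above can be sketched conceptually. ``TrustedStr`` and ``render`` are illustrative names with a deliberately simplistic substitution scheme, not Ansible's templating API; the sketch only shows the gating behavior: untrusted strings pass through untouched even when they contain template markers.

```python
# Conceptual sketch of template trust: only strings tagged as trusted
# (e.g. sourced from a playbook) are evaluated as templates.

class TrustedStr(str):
    """Marks a string as originating from a trusted source."""

def render(value: str, variables: dict) -> str:
    if not isinstance(value, TrustedStr):
        # untrusted input (facts, module results) is never templated
        return value
    out = value
    for name, val in variables.items():
        out = out.replace("{{ %s }}" % name, str(val))
    return out

variables = {"name": "web01"}
assert render(TrustedStr("host: {{ name }}"), variables) == "host: web01"
# a fact containing template markers is NOT evaluated:
assert render("host: {{ name }}", variables) == "host: {{ name }}"
```

This is why plugins that rebuild or concatenate strings must propagate the trust tag: a plain ``str`` result silently loses templating.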
known_issues:
  - variables - The values ``None``, ``True`` and ``False`` cannot be tagged because they are singletons. Attempts to apply tags to these values will be silently ignored.
  - variables - Tagged values cannot be used for dictionary keys in many circumstances. # DTFIX-RELEASE: Explain this in more detail.
  - templating - Any string value starting with ``#jinja2:`` which is templated will always be interpreted as Jinja2 configuration overrides.
    To include this literal value at the start of a string, a space or other character must precede it.
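The singleton limitation noted above is a property of the Python interpreter itself, not an Ansible design choice, as this small demonstration shows:

```python
# ``True`` is a singleton: every occurrence is the same object, so a tag
# attached to it would be visible everywhere. Builtins also refuse new
# attributes outright.

a = True
b = bool(1)
assert a is b  # one shared instance program-wide

try:
    setattr(True, "tag", "trusted")
    attach_failed = False
except AttributeError:
    attach_failed = True

assert attach_failed  # builtin singletons cannot carry per-value metadata
```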
bugfixes:
  - module defaults - Module defaults are no longer templated unless they are used by a task that does not override them.
    Previously, all module defaults for all modules were templated for every task.
  - omitting task args - Use of omit for task args now properly falls back to args of lower precedence, such as module defaults.
    Previously an omitted value would obliterate values of lower precedence. # DTFIX-RELEASE: do we need obliterate, is this a breaking change?
  - regex_search filter - Corrected return value documentation to reflect None (not empty string) for no match.
  - first_found lookup - Corrected return value documentation to reflect None (not empty string) for no files found.
  - vars lookup - The ``default`` substitution only applies when trying to look up a variable which is not defined.
    If the variable is defined, but templates to an undefined value, the ``default`` substitution will not apply.
    Use the ``default`` filter to coerce those values instead.
  - to_yaml/to_nice_yaml filters - Eliminated possibility of keyword arg collisions with internally-set defaults.
  - Jinja plugins - Errors raised will always be derived from ``AnsibleTemplatePluginError``.
  - ansible-test - Fixed traceback when handling certain YAML errors in the ``yamllint`` sanity test.
  - YAML parsing - The ``!unsafe`` tag no longer coerces non-string scalars to strings.
  - default callback - Error context is now shown for failing tasks that use the ``debug`` action.
  - module arg templating - When using a templated raw task arg and a templated ``args`` keyword, args are now merged.
    Previously, use of templated raw task args silently ignored all values from the templated ``args`` keyword.
  - action plugins - Action plugins that raise unhandled exceptions no longer terminate playbook loops. Previously, exceptions raised by an action plugin caused abnormal loop termination and loss of loop iteration results.
  - display - The ``Display.deprecated`` method once again properly handles the ``removed=True`` argument (https://github.com/ansible/ansible/issues/82358).
  - stability - Fixed silent process failure on unhandled IOError/OSError under ``linear`` strategy.
  - lookup plugins - The ``terms`` arg to the ``run`` method is now always a list.
    Previously, there were cases where a non-list could be received.
deprecated_features:
  - templating - The ``ansible_managed`` variable available for certain templating scenarios, such as the ``template`` action and ``template`` lookup, has been deprecated.
    Define and use a custom variable instead of relying on ``ansible_managed``.
  - display - The ``Display.get_deprecation_message`` method has been deprecated.
    Call ``Display.deprecated`` to display a deprecation message, or call it with ``removed=True`` to raise an ``AnsibleError``.
  - config - The ``DEFAULT_JINJA2_NATIVE`` option has no effect.
    Jinja2 native mode is now the default and only option.
  - config - The ``DEFAULT_NULL_REPRESENTATION`` option has no effect.
    Null values are no longer automatically converted to another value during templating of single variable references.
  - template lookup - The ``jinja2_native`` option is no longer used in the Ansible Core code base.
    Jinja2 native mode is now the default and only option.
  - conditionals - Conditionals using Jinja templating delimiters (e.g., ``{{``, ``{%``) should be rewritten as expressions without delimiters, unless the entire conditional value is a single template that resolves to a trusted string expression.
    This is useful for dynamic indirection of conditional expressions, but is limited to trusted literal string expressions.
  - templating - The ``disable_lookups`` option has no effect, since plugins must be updated to apply trust before any templating can be performed.
  - to_yaml/to_nice_yaml filters - Implicit YAML dumping of vaulted value ciphertext is deprecated.
    Set ``dump_vault_tags`` to explicitly specify the desired behavior.
  - plugins - The ``listify_lookup_plugin_terms`` function is obsolete and in most cases no longer needed. # DTFIX-RELEASE: add a porting guide entry for this
  - plugin error handling - The ``AnsibleError`` constructor arg ``suppress_extended_error`` is deprecated.
    Using ``suppress_extended_error=True`` has the same effect as ``show_content=False``.
  - config - The ``ACTION_WARNINGS`` config has no effect. It previously disabled command warnings, which have since been removed.
  - templating - Support for enabling Jinja2 extensions (not plugins) has been deprecated.
  - playbook variables - The ``play_hosts`` variable has been deprecated; use ``ansible_play_batch`` instead.
  - bool filter - Support for coercing unrecognized input values (including None) has been deprecated. Consult the filter documentation for acceptable values, or consider use of the ``truthy`` and ``falsy`` tests. # DTFIX-RELEASE: porting guide
  - oneline callback - The ``oneline`` callback and its associated ad-hoc CLI args (``-o``, ``--one-line``) are deprecated.
  - tree callback - The ``tree`` callback and its associated ad-hoc CLI args (``-t``, ``--tree``) are deprecated.
  - CLI - The ``--inventory-file`` option alias is deprecated. Use the ``-i`` or ``--inventory`` option instead.
  - first_found lookup - Splitting of file paths on ``,;:`` is deprecated. Pass a list of paths instead.
    The ``split`` method on strings can be used to split variables into a list as needed.
  - cache plugins - The ``ansible.plugins.cache.base`` Python module is deprecated. Use ``ansible.plugins.cache`` instead.
  - file loading - Loading text files with ``DataLoader`` containing data that cannot be decoded under the expected encoding is deprecated.
    In most cases the encoding must be UTF-8, although some plugins allow choosing a different encoding.
    Previously, invalid data was silently wrapped in Unicode surrogate escape sequences, often resulting in later errors or other data corruption.
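The deprecated ``first_found`` path splitting above treated any of ``,``, ``;`` or ``:`` as a separator, which is ambiguous for paths containing those characters. A sketch of the legacy splitting (reconstructed here for illustration; the exact internals may differ) next to the now-preferred explicit form:

```python
# Legacy-style splitting on any of ,;: versus an explicit list of paths.
import re

legacy = "files/one.cfg,files/two.cfg:files/three.cfg"
split_paths = [p for p in re.split(r"[,;:]", legacy) if p]
assert split_paths == ["files/one.cfg", "files/two.cfg", "files/three.cfg"]

# preferred: pass a list, or split on a single known separator explicitly
explicit = "files/one.cfg,files/two.cfg".split(",")
assert explicit == ["files/one.cfg", "files/two.cfg"]
```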
removed_features:
  - modules - Modules returning non-UTF8 strings now result in an error.
    The ``MODULE_STRICT_UTF8_RESPONSE`` setting can be used to disable this check.
4  changelogs/fragments/toml-library-support-dropped.yml  Normal file
@@ -0,0 +1,4 @@
breaking_changes:
  - Support for the ``toml`` library has been removed from TOML inventory parsing and dumping.
    Use ``tomli`` for parsing on Python 3.10. Python 3.11 and later have built-in support for parsing.
    Use ``tomli-w`` to support outputting inventory in TOML format.
@@ -40,10 +40,10 @@ import shutil

from pathlib import Path

from ansible.module_utils.common.messages import PluginInfo
from ansible.release import __version__
import ansible.utils.vars as utils_vars
from ansible.parsing.dataloader import DataLoader
from ansible.parsing.utils.jsonify import jsonify
from ansible.parsing.splitter import parse_kv
from ansible.plugins.loader import init_plugin_loader
from ansible.executor import module_common
@@ -89,6 +89,22 @@ def parse():
    return options, args


def jsonify(result, format=False):
    """ format JSON output (uncompressed or pretty-printed) """

    if result is None:
        return "{}"

    indent = None
    if format:
        indent = 4

    try:
        return json.dumps(result, sort_keys=True, indent=indent, ensure_ascii=False)
    except UnicodeDecodeError:
        return json.dumps(result, sort_keys=True, indent=indent)


def write_argsfile(argstring, json=False):
    """ Write args to a file for old-style module's use. """
    argspath = Path("~/.ansible_test_module_arguments").expanduser()
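The ``jsonify`` helper above reduces to a pair of ``json.dumps`` calls: compact single-line output by default, four-space indentation when ``format`` is true, with ``ensure_ascii=False`` to keep non-ASCII characters readable:

```python
# What jsonify's two modes produce, shown directly with json.dumps.
import json

data = {"b": 2, "a": 1}
compact = json.dumps(data, sort_keys=True, indent=None, ensure_ascii=False)
pretty = json.dumps(data, sort_keys=True, indent=4, ensure_ascii=False)

assert compact == '{"a": 1, "b": 2}'
assert pretty.splitlines()[0] == "{"
assert '    "a": 1,' in pretty
```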
@@ -152,16 +168,27 @@ def boilerplate_module(modfile, args, interpreters, check, destfile):
    if check:
        complex_args['_ansible_check_mode'] = True

    modfile = os.path.abspath(modfile)
    modname = os.path.basename(modfile)
    modname = os.path.splitext(modname)[0]
    (module_data, module_style, shebang) = module_common.modify_module(
        modname,
        modfile,
        complex_args,
        Templar(loader=loader),

    plugin = PluginInfo(
        requested_name=modname,
        resolved_name=modname,
        type='module',
    )

    built_module = module_common.modify_module(
        module_name=modname,
        plugin=plugin,
        module_path=modfile,
        module_args=complex_args,
        templar=Templar(loader=loader),
        task_vars=task_vars
    )

    module_data, module_style = built_module.b_module_data, built_module.module_style

    if module_style == 'new' and '_ANSIBALLZ_WRAPPER = True' in to_native(module_data):
        module_style = 'ansiballz'
53  lib/ansible/_internal/__init__.py  Normal file
@@ -0,0 +1,53 @@
from __future__ import annotations

import importlib
import typing as t

from ansible.module_utils import _internal
from ansible.module_utils._internal._json import _profiles


def get_controller_serialize_map() -> dict[type, t.Callable]:
    """
    Injected into module_utils code to augment serialization maps with controller-only types.
    This implementation replaces the no-op version in module_utils._internal in controller contexts.
    """
    from ansible._internal._templating import _lazy_containers
    from ansible.parsing.vault import EncryptedString

    return {
        _lazy_containers._AnsibleLazyTemplateDict: _profiles._JSONSerializationProfile.discard_tags,
        _lazy_containers._AnsibleLazyTemplateList: _profiles._JSONSerializationProfile.discard_tags,
        EncryptedString: str,  # preserves tags since this is an instance of EncryptedString; if tags should be discarded from str, another entry will handle it
    }


def import_controller_module(module_name: str, /) -> t.Any:
    """
    Injected into module_utils code to import and return the specified module.
    This implementation replaces the no-op version in module_utils._internal in controller contexts.
    """
    return importlib.import_module(module_name)


_T = t.TypeVar('_T')


def experimental(obj: _T) -> _T:
    """
    Decorator for experimental types and methods outside the `_internal` package which accept or expose internal types.
    As with internal APIs, these are subject to change at any time without notice.
    """
    return obj


def setup() -> None:
    """No-op function to ensure that side-effect only imports of this module are not flagged/removed as 'unused'."""


# DTFIX-RELEASE: this is really fragile- disordered/incorrect imports (among other things) can mess it up. Consider a hosting-env-managed context
# with an enum with at least Controller/Target/Unknown values, and possibly using lazy-init module shims or some other mechanism to allow controller-side
# notification/augmentation of this kind of metadata.
_internal.get_controller_serialize_map = get_controller_serialize_map
_internal.import_controller_module = import_controller_module
_internal.is_controller = True
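The serialize map returned above is a ``dict[type, Callable]``: each entry maps a controller-only type to a converter producing a serializable native value. A standalone sketch of how such a map can drive ``json.dumps`` via its ``default`` hook (``EncryptedString`` here is an illustrative stand-in class, not the one from ``ansible.parsing.vault``):

```python
# Type-dispatch serialization: unknown types are converted by consulting a
# {type: converter} map from json.dumps' default hook.
import json

class EncryptedString:  # illustrative stand-in, not Ansible's class
    def __init__(self, plaintext):
        self._plaintext = plaintext

    def __str__(self):
        return self._plaintext

serialize_map = {EncryptedString: str}  # convert to the plain native type

def default(o):
    for typ, converter in serialize_map.items():
        if isinstance(o, typ):
            return converter(o)
    raise TypeError(f"unsupported type: {type(o).__name__}")

out = json.dumps({"secret": EncryptedString("hunter2")}, default=default)
assert out == '{"secret": "hunter2"}'
```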
265  lib/ansible/_internal/_ansiballz.py  Normal file
@@ -0,0 +1,265 @@
# shebang placeholder

from __future__ import annotations

import datetime

# For test-module.py script to tell this is an ANSIBALLZ_WRAPPER
_ANSIBALLZ_WRAPPER = True

# This code is part of Ansible, but is an independent component.
# The code in this particular templatable string, and this templatable string
# only, is BSD licensed. Modules which end up using this snippet, which is
# dynamically combined together by Ansible still belong to the author of the
# module, and they may assign their own license to the complete work.
#
# Copyright (c), James Cammarata, 2016
# Copyright (c), Toshio Kuratomi, 2016
#
# Redistribution and use in source and binary forms, with or without modification,
# are permitted provided that the following conditions are met:
#
#     * Redistributions of source code must retain the above copyright
#       notice, this list of conditions and the following disclaimer.
#     * Redistributions in binary form must reproduce the above copyright notice,
#       this list of conditions and the following disclaimer in the documentation
#       and/or other materials provided with the distribution.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
# IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
# LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
# USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
def _ansiballz_main(
    zipdata: str,
    ansible_module: str,
    module_fqn: str,
    params: str,
    profile: str,
    plugin_info_dict: dict[str, object],
    date_time: datetime.datetime,
    coverage_config: str | None,
    coverage_output: str | None,
    rlimit_nofile: int,
) -> None:
    import os
    import os.path

    # Access to the working directory is required by Python when using pipelining, as well as for the coverage module.
    # Some platforms, such as macOS, may not allow querying the working directory when using become to drop privileges.
    try:
        os.getcwd()
    except OSError:
        try:
            os.chdir(os.path.expanduser('~'))
        except OSError:
            os.chdir('/')

    if rlimit_nofile:
        import resource

        existing_soft, existing_hard = resource.getrlimit(resource.RLIMIT_NOFILE)

        # adjust soft limit subject to existing hard limit
        requested_soft = min(existing_hard, rlimit_nofile)

        if requested_soft != existing_soft:
            try:
                resource.setrlimit(resource.RLIMIT_NOFILE, (requested_soft, existing_hard))
            except ValueError:
                # some platforms (eg macOS) lie about their hard limit
                pass

    import sys
    import __main__

    # For some distros and python versions we pick up this script in the temporary
    # directory. This leads to problems when the ansible module masks a python
    # library that another import needs. We have not figured out what about the
    # specific distros and python versions causes this to behave differently.
    #
    # Tested distros:
    # Fedora23 with python3.4  Works
    # Ubuntu15.10 with python2.7  Works
    # Ubuntu15.10 with python3.4  Fails without this
    # Ubuntu16.04.1 with python3.5  Fails without this
    # To test on another platform:
    # * use the copy module (since this shadows the stdlib copy module)
    # * Turn off pipelining
    # * Make sure that the destination file does not exist
    # * ansible ubuntu16-test -m copy -a 'src=/etc/motd dest=/var/tmp/m'
    # This will traceback in shutil. Looking at the complete traceback will show
    # that shutil is importing copy which finds the ansible module instead of the
    # stdlib module
    scriptdir = None
    try:
        scriptdir = os.path.dirname(os.path.realpath(__main__.__file__))
    except (AttributeError, OSError):
        # Some platforms don't set __file__ when reading from stdin
        # OSX raises OSError if using abspath() in a directory we don't have
        # permission to read (realpath calls abspath)
        pass

    # Strip cwd from sys.path to avoid potential permissions issues
    excludes = {'', '.', scriptdir}
    sys.path = [p for p in sys.path if p not in excludes]

    import base64
    import shutil
    import tempfile
    import zipfile

    def invoke_module(modlib_path: str, json_params: bytes) -> None:
        # When installed via setuptools (including python setup.py install),
        # ansible may be installed with an easy-install.pth file. That file
        # may load the system-wide install of ansible rather than the one in
        # the module. sitecustomize is the only way to override that setting.
        z = zipfile.ZipFile(modlib_path, mode='a')

        # py3: modlib_path will be text, py2: it's bytes. Need bytes at the end
        sitecustomize = u'import sys\\nsys.path.insert(0,"%s")\\n' % modlib_path
        sitecustomize = sitecustomize.encode('utf-8')
        # Use a ZipInfo to work around zipfile limitation on hosts with
        # clocks set to a pre-1980 year (for instance, Raspberry Pi)
        zinfo = zipfile.ZipInfo()
        zinfo.filename = 'sitecustomize.py'
        zinfo.date_time = date_time.utctimetuple()[:6]
        z.writestr(zinfo, sitecustomize)
|
||||
z.close()
|
||||
|
||||
# Put the zipped up module_utils we got from the controller first in the python path so that we
|
||||
# can monkeypatch the right basic
|
||||
sys.path.insert(0, modlib_path)
|
||||
|
||||
from ansible.module_utils._internal._ansiballz import run_module
|
||||
|
||||
run_module(
|
||||
json_params=json_params,
|
||||
profile=profile,
|
||||
plugin_info_dict=plugin_info_dict,
|
||||
module_fqn=module_fqn,
|
||||
modlib_path=modlib_path,
|
||||
coverage_config=coverage_config,
|
||||
coverage_output=coverage_output,
|
||||
)
|
||||
|
||||
def debug(command: str, modlib_path: str, json_params: bytes) -> None:
|
||||
# The code here normally doesn't run. It's only used for debugging on the
|
||||
# remote machine.
|
||||
#
|
||||
# The subcommands in this function make it easier to debug ansiballz
|
||||
# modules. Here's the basic steps:
|
||||
#
|
||||
# Run ansible with the environment variable: ANSIBLE_KEEP_REMOTE_FILES=1 and -vvv
|
||||
# to save the module file remotely::
|
||||
# $ ANSIBLE_KEEP_REMOTE_FILES=1 ansible host1 -m ping -a 'data=october' -vvv
|
||||
#
|
||||
# Part of the verbose output will tell you where on the remote machine the
|
||||
# module was written to::
|
||||
# [...]
|
||||
# <host1> SSH: EXEC ssh -C -q -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o
|
||||
# PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o
|
||||
# ControlPath=/home/badger/.ansible/cp/ansible-ssh-%h-%p-%r -tt rhel7 '/bin/sh -c '"'"'LANG=en_US.UTF-8 LC_ALL=en_US.UTF-8
|
||||
# LC_MESSAGES=en_US.UTF-8 /usr/bin/python /home/badger/.ansible/tmp/ansible-tmp-1461173013.93-9076457629738/ping'"'"''
|
||||
# [...]
|
||||
#
|
||||
# Login to the remote machine and run the module file via from the previous
|
||||
# step with the explode subcommand to extract the module payload into
|
||||
# source files::
|
||||
# $ ssh host1
|
||||
# $ /usr/bin/python /home/badger/.ansible/tmp/ansible-tmp-1461173013.93-9076457629738/ping explode
|
||||
# Module expanded into:
|
||||
# /home/badger/.ansible/tmp/ansible-tmp-1461173408.08-279692652635227/ansible
|
||||
#
|
||||
# You can now edit the source files to instrument the code or experiment with
|
||||
# different parameter values. When you're ready to run the code you've modified
|
||||
# (instead of the code from the actual zipped module), use the execute subcommand like this::
|
||||
# $ /usr/bin/python /home/badger/.ansible/tmp/ansible-tmp-1461173013.93-9076457629738/ping execute
|
||||
|
||||
# Okay to use __file__ here because we're running from a kept file
|
||||
basedir = os.path.join(os.path.abspath(os.path.dirname(__file__)), 'debug_dir')
|
||||
args_path = os.path.join(basedir, 'args')
|
||||
|
||||
if command == 'explode':
|
||||
# transform the ZIPDATA into an exploded directory of code and then
|
||||
# print the path to the code. This is an easy way for people to look
|
||||
# at the code on the remote machine for debugging it in that
|
||||
# environment
|
||||
z = zipfile.ZipFile(modlib_path)
|
||||
for filename in z.namelist():
|
||||
if filename.startswith('/'):
|
||||
raise Exception('Something wrong with this module zip file: should not contain absolute paths')
|
||||
|
||||
dest_filename = os.path.join(basedir, filename)
|
||||
if dest_filename.endswith(os.path.sep) and not os.path.exists(dest_filename):
|
||||
os.makedirs(dest_filename)
|
||||
else:
|
||||
directory = os.path.dirname(dest_filename)
|
||||
if not os.path.exists(directory):
|
||||
os.makedirs(directory)
|
||||
with open(dest_filename, 'wb') as writer:
|
||||
writer.write(z.read(filename))
|
||||
|
||||
# write the args file
|
||||
with open(args_path, 'wb') as writer:
|
||||
writer.write(json_params)
|
||||
|
||||
print('Module expanded into:')
|
||||
print(basedir)
|
||||
|
||||
elif command == 'execute':
|
||||
# Execute the exploded code instead of executing the module from the
|
||||
# embedded ZIPDATA. This allows people to easily run their modified
|
||||
# code on the remote machine to see how changes will affect it.
|
||||
|
||||
# Set pythonpath to the debug dir
|
||||
sys.path.insert(0, basedir)
|
||||
|
||||
# read in the args file which the user may have modified
|
||||
with open(args_path, 'rb') as reader:
|
||||
json_params = reader.read()
|
||||
|
||||
from ansible.module_utils._internal._ansiballz import run_module
|
||||
|
||||
run_module(
|
||||
json_params=json_params,
|
||||
profile=profile,
|
||||
plugin_info_dict=plugin_info_dict,
|
||||
module_fqn=module_fqn,
|
||||
modlib_path=modlib_path,
|
||||
)
|
||||
|
||||
else:
|
||||
print('WARNING: Unknown debug command. Doing nothing.')
|
||||
|
||||
#
|
||||
# See comments in the debug() method for information on debugging
|
||||
#
|
||||
|
||||
encoded_params = params.encode()
|
||||
|
||||
# There's a race condition with the controller removing the
|
||||
# remote_tmpdir and this module executing under async. So we cannot
|
||||
# store this in remote_tmpdir (use system tempdir instead)
|
||||
# Only need to use [ansible_module]_payload_ in the temp_path until we move to zipimport
|
||||
# (this helps ansible-test produce coverage stats)
|
||||
temp_path = tempfile.mkdtemp(prefix='ansible_' + ansible_module + '_payload_')
|
||||
|
||||
try:
|
||||
zipped_mod = os.path.join(temp_path, 'ansible_' + ansible_module + '_payload.zip')
|
||||
|
||||
with open(zipped_mod, 'wb') as modlib:
|
||||
modlib.write(base64.b64decode(zipdata))
|
||||
|
||||
if len(sys.argv) == 2:
|
||||
debug(sys.argv[1], zipped_mod, encoded_params)
|
||||
else:
|
||||
invoke_module(zipped_mod, encoded_params)
|
||||
finally:
|
||||
shutil.rmtree(temp_path, ignore_errors=True)
|
||||
0
lib/ansible/_internal/_datatag/__init__.py
Normal file
130
lib/ansible/_internal/_datatag/_tags.py
Normal file

@@ -0,0 +1,130 @@
from __future__ import annotations

import dataclasses
import os
import types
import typing as t

from ansible.module_utils._internal._datatag import _tag_dataclass_kwargs, AnsibleDatatagBase, AnsibleSingletonTagBase


@dataclasses.dataclass(**_tag_dataclass_kwargs)
class Origin(AnsibleDatatagBase):
    """
    A tag that stores origin metadata for a tagged value, intended for forensic/diagnostic use.
    Origin metadata should not be used to make runtime decisions, as it is not guaranteed to be present or accurate.
    Setting both `path` and `line_num` can result in diagnostic display of referenced file contents.
    Either `path` or `description` must be present.
    """

    path: str | None = None
    """The path from which the tagged content originated."""
    description: str | None = None
    """A description of the origin, for display to users."""
    line_num: int | None = None
    """An optional line number, starting at 1."""
    col_num: int | None = None
    """An optional column number, starting at 1."""

    UNKNOWN: t.ClassVar[t.Self]

    @classmethod
    def get_or_create_tag(cls, value: t.Any, path: str | os.PathLike | None) -> Origin:
        """Return the tag from the given value, creating a tag from the provided path if no tag was found."""
        if not (origin := cls.get_tag(value)):
            if path:
                origin = Origin(path=str(path))  # convert tagged strings and path-like values to a native str
            else:
                origin = Origin.UNKNOWN

        return origin

    def replace(
        self,
        path: str | types.EllipsisType = ...,
        description: str | types.EllipsisType = ...,
        line_num: int | None | types.EllipsisType = ...,
        col_num: int | None | types.EllipsisType = ...,
    ) -> t.Self:
        """Return a new origin based on an existing one, with the given fields replaced."""
        return dataclasses.replace(
            self,
            **{
                key: value
                for key, value in dict(
                    path=path,
                    description=description,
                    line_num=line_num,
                    col_num=col_num,
                ).items()
                if value is not ...
            },  # type: ignore[arg-type]
        )

    def _post_validate(self) -> None:
        if self.path:
            if not self.path.startswith('/'):
                raise RuntimeError('The `src` field must be an absolute path.')
        elif not self.description:
            raise RuntimeError('The `src` or `description` field must be specified.')

    def __str__(self) -> str:
        """Renders the origin in the form of path:line_num:col_num, omitting missing/invalid elements from the right."""
        if self.path:
            value = self.path
        else:
            value = self.description

        if self.line_num and self.line_num > 0:
            value += f':{self.line_num}'

            if self.col_num and self.col_num > 0:
                value += f':{self.col_num}'

        if self.path and self.description:
            value += f' ({self.description})'

        return value


Origin.UNKNOWN = Origin(description='<unknown>')


@dataclasses.dataclass(**_tag_dataclass_kwargs)
class VaultedValue(AnsibleDatatagBase):
    """Tag for vault-encrypted strings that carries the original ciphertext for round-tripping."""

    ciphertext: str

    def _get_tag_to_propagate(self, src: t.Any, value: object, *, value_type: t.Optional[type] = None) -> t.Self | None:
        # Since VaultedValue stores the encrypted representation of the value on which it is tagged,
        # it is incorrect to propagate the tag to a value which is not equal to the original.
        # If the tag were copied to another value and subsequently serialized as the original encrypted value,
        # the result would then differ from the value on which the tag was applied.

        # Comparisons which can trigger an exception are indicative of a bug and should not be handled here.
        # For example:
        # * When `src` is an undecryptable `EncryptedString` -- it is not valid to apply this tag to that type.
        # * When `value` is a `Marker` -- this requires templating, but vaulted values do not support templating.

        if src == value:  # assume the tag was correctly applied to src
            return self  # same plaintext value, tag propagation with same ciphertext is safe

        return self.get_tag(value)  # different value, preserve the existing tag, if any


@dataclasses.dataclass(**_tag_dataclass_kwargs)
class TrustedAsTemplate(AnsibleSingletonTagBase):
    """
    Indicates the tagged string is trusted to parse and render as a template.
    Do *NOT* apply this tag to data from untrusted sources, as this would allow code injection during templating.
    """


@dataclasses.dataclass(**_tag_dataclass_kwargs)
class SourceWasEncrypted(AnsibleSingletonTagBase):
    """
    For internal use only.
    Indicates the tagged value was sourced from an encrypted file.
    Currently applied only by DataLoader.get_text_file_contents() and by extension DataLoader.load_from_file().
    """
19
lib/ansible/_internal/_datatag/_utils.py
Normal file

@@ -0,0 +1,19 @@
from __future__ import annotations

from ansible.module_utils._internal._datatag import AnsibleTagHelper


def str_problematic_strip(value: str) -> str:
    """
    Return a copy of `value` with leading and trailing whitespace removed.
    Used where `str.strip` is needed, but tags must be preserved *AND* the stripping behavior likely shouldn't exist.
    If the stripping behavior is non-problematic, use `AnsibleTagHelper.tag_copy` around `str.strip` instead.
    """
    if (stripped_value := value.strip()) == value:
        return value

    # FUTURE: consider deprecating some/all usages of this method; they generally imply a code smell or pattern we shouldn't be supporting

    stripped_value = AnsibleTagHelper.tag_copy(value, stripped_value)

    return stripped_value
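A toy `str` subclass shows why a tag-preserving strip is needed at all: `str.strip()` always returns a plain `str`, silently dropping any subclass state. `TaggedStr` and `tag_copy` below are invented stand-ins for illustration, not Ansible's tagging API:

```python
class TaggedStr(str):
    """Toy stand-in for a tagged string: a str subclass carrying metadata."""
    tags: tuple = ()


def tag_copy(src: TaggedStr, value: str) -> TaggedStr:
    """Minimal analogue of AnsibleTagHelper.tag_copy: rebuild value as a tagged string, carrying tags over."""
    out = TaggedStr(value)
    out.tags = src.tags
    return out


def str_problematic_strip(value: TaggedStr) -> str:
    if (stripped := value.strip()) == value:
        return value  # no change; the original (and its tags) is returned as-is
    return tag_copy(value, stripped)


s = TaggedStr('  hello  ')
s.tags = ('origin:/play.yml:3',)  # hypothetical tag payload

plain = s.strip()                 # plain str.strip() returns a bare str, dropping metadata
kept = str_problematic_strip(s)   # tag-preserving variant

print(type(plain).__name__, getattr(plain, 'tags', None))
print(type(kept).__name__, kept.tags)
```

The same loss happens with every `str` method (`replace`, `upper`, slicing), which is why the tagging helpers wrap them.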
33
lib/ansible/_internal/_datatag/_wrappers.py
Normal file

@@ -0,0 +1,33 @@
from __future__ import annotations

import io
import typing as _t

from .._wrapt import ObjectProxy
from ...module_utils._internal import _datatag


class TaggedStreamWrapper(ObjectProxy):
    """
    Janky proxy around IOBase to allow streams to carry tags and support basic interrogation by the tagging API.
    Most tagging operations will have undefined behavior for this type.
    """

    _self__ansible_tags_mapping: _datatag._AnsibleTagsMapping

    def __init__(self, stream: io.IOBase, tags: _datatag.AnsibleDatatagBase | _t.Iterable[_datatag.AnsibleDatatagBase]) -> None:
        super().__init__(stream)

        tag_list: list[_datatag.AnsibleDatatagBase]

        # noinspection PyProtectedMember
        if type(tags) in _datatag._known_tag_types:
            tag_list = [tags]  # type: ignore[list-item]
        else:
            tag_list = list(tags)  # type: ignore[arg-type]

        self._self__ansible_tags_mapping = _datatag._AnsibleTagsMapping((type(tag), tag) for tag in tag_list)

    @property
    def _ansible_tags_mapping(self) -> _datatag._AnsibleTagsMapping:
        return self._self__ansible_tags_mapping
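The wrapt-style proxy pattern used here (delegate unknown attributes to the wrapped stream, keep proxy-local state under a `_self_`-prefixed name) can be sketched without wrapt. `MiniStreamProxy` and the `SourceWasEncrypted` stand-in are illustrative only; the real `ObjectProxy` also forwards dunder methods, which `__getattr__` alone does not:

```python
import io


class MiniStreamProxy:
    """Minimal sketch of wrapt-style proxying: unknown attributes delegate to the
    wrapped stream, while proxy-local state lives under a distinct _self_ name."""

    def __init__(self, stream, tags):
        object.__setattr__(self, '_self_wrapped', stream)
        object.__setattr__(self, '_self_tags', {type(tag): tag for tag in tags})

    def __getattr__(self, name):
        # only called for names not found normally, so _self_* attributes resolve first
        return getattr(self._self_wrapped, name)

    @property
    def _ansible_tags_mapping(self):
        return self._self_tags


class SourceWasEncrypted:  # hypothetical tag stand-in
    pass


stream = MiniStreamProxy(io.BytesIO(b'secret payload'), [SourceWasEncrypted()])
print(stream.read())  # read() is delegated to the wrapped BytesIO
print(SourceWasEncrypted in stream._ansible_tags_mapping)
```

Keying the tag mapping by tag type mirrors the `(type(tag), tag)` pairs in `TaggedStreamWrapper.__init__`.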
0
lib/ansible/_internal/_errors/__init__.py
Normal file
128
lib/ansible/_internal/_errors/_captured.py
Normal file

@@ -0,0 +1,128 @@
from __future__ import annotations

import dataclasses
import typing as t

from ansible.errors import AnsibleRuntimeError
from ansible.module_utils.common.messages import ErrorSummary, Detail, _dataclass_kwargs


class AnsibleCapturedError(AnsibleRuntimeError):
    """An exception representing error detail captured in another context where the error detail must be serialized to be preserved."""

    context: t.ClassVar[str]

    def __init__(
        self,
        *,
        obj: t.Any = None,
        error_summary: ErrorSummary,
    ) -> None:
        super().__init__(
            obj=obj,
        )

        self._error_summary = error_summary

    @property
    def error_summary(self) -> ErrorSummary:
        return self._error_summary


class AnsibleResultCapturedError(AnsibleCapturedError):
    """An exception representing error detail captured in a foreign context where an action/module result dictionary is involved."""

    def __init__(self, error_summary: ErrorSummary, result: dict[str, t.Any]) -> None:
        super().__init__(error_summary=error_summary)

        self._result = result

    @classmethod
    def maybe_raise_on_result(cls, result: dict[str, t.Any]) -> None:
        """Normalize the result and raise an exception if the result indicated failure."""
        if error_summary := cls.normalize_result_exception(result):
            raise error_summary.error_type(error_summary, result)

    @classmethod
    def find_first_remoted_error(cls, exception: BaseException) -> t.Self | None:
        """Find the first captured module error in the cause chain, starting with the given exception, returning None if not found."""
        while exception:
            if isinstance(exception, cls):
                return exception

            exception = exception.__cause__

        return None

    @classmethod
    def normalize_result_exception(cls, result: dict[str, t.Any]) -> CapturedErrorSummary | None:
        """
        Normalize the result `exception`, if any, to be a `CapturedErrorSummary` instance.
        If a new `CapturedErrorSummary` was created, the `error_type` will be `cls`.
        The `exception` key will be removed if falsey.
        A `CapturedErrorSummary` instance will be returned if `failed` is truthy.
        """
        if type(cls) is AnsibleResultCapturedError:  # pylint: disable=unidiomatic-typecheck
            raise TypeError('The normalize_result_exception method cannot be called on the AnsibleCapturedError base type, use a derived type.')

        if not isinstance(result, dict):
            raise TypeError(f'Malformed result. Received {type(result)} instead of {dict}.')

        failed = result.get('failed')  # DTFIX-FUTURE: warn if failed is present and not a bool, or exception is present without failed being True
        exception = result.pop('exception', None)

        if not failed and not exception:
            return None

        if isinstance(exception, CapturedErrorSummary):
            error_summary = exception
        elif isinstance(exception, ErrorSummary):
            error_summary = CapturedErrorSummary(
                details=exception.details,
                formatted_traceback=cls._normalize_traceback(exception.formatted_traceback),
                error_type=cls,
            )
        else:
            # translate non-ErrorDetail errors
            error_summary = CapturedErrorSummary(
                details=(Detail(msg=str(result.get('msg', 'Unknown error.'))),),
                formatted_traceback=cls._normalize_traceback(exception),
                error_type=cls,
            )

        result.update(exception=error_summary)

        return error_summary if failed else None  # even though error detail was normalized, only return it if the result indicated failure

    @classmethod
    def _normalize_traceback(cls, value: object | None) -> str | None:
        """Normalize the provided traceback value, returning None if it is falsey."""
        if not value:
            return None

        value = str(value).rstrip()

        if not value:
            return None

        return value + '\n'


class AnsibleActionCapturedError(AnsibleResultCapturedError):
    """An exception representing error detail sourced directly by an action in its result dictionary."""

    _default_message = 'Action failed.'
    context = 'action'


class AnsibleModuleCapturedError(AnsibleResultCapturedError):
    """An exception representing error detail captured in a module context and returned from an action's result dictionary."""

    _default_message = 'Module failed.'
    context = 'target'


@dataclasses.dataclass(**_dataclass_kwargs)
class CapturedErrorSummary(ErrorSummary):
    # DTFIX-RELEASE: where to put this, name, etc. since it shows up in results, it's not exactly private (and contains a type ref to an internal type)
    error_type: type[AnsibleResultCapturedError] | None = None
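The normalization contract of `normalize_result_exception` can be sketched standalone: pop the `exception` key, write a normalized summary back, and only report failure when `failed` is truthy. This sketch uses a plain dict in place of `CapturedErrorSummary`, so it illustrates the control flow rather than the real types:

```python
def normalize_result_exception(result):
    """Standalone sketch of the normalization contract: pop a falsey or foreign
    `exception` value, store a normalized summary back on the result, and only
    return a summary (signalling failure) when `failed` is truthy."""
    if not isinstance(result, dict):
        raise TypeError(f'Malformed result. Received {type(result)} instead of {dict}.')

    failed = result.get('failed')
    exception = result.pop('exception', None)

    if not failed and not exception:
        return None  # nothing to normalize; a falsey 'exception' key stays removed

    # stand-in for building a CapturedErrorSummary from the raw exception text
    summary = {'msg': str(result.get('msg', 'Unknown error.')), 'traceback': exception}
    result['exception'] = summary

    return summary if failed else None  # normalized either way, but only failure returns it


ok = {'changed': True}
print(normalize_result_exception(ok))

bad = {'failed': True, 'msg': 'boom', 'exception': 'Traceback (most recent call last): ...'}
print(normalize_result_exception(bad))
```

Note the asymmetry the docstring describes: an `exception` without `failed` is still normalized in place, but the caller is not told to raise.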
91
lib/ansible/_internal/_errors/_handler.py
Normal file

@@ -0,0 +1,91 @@
from __future__ import annotations

import contextlib
import enum
import typing as t

from ansible.utils.display import Display
from ansible.constants import config

display = Display()

# FUTURE: add sanity test to detect use of skip_on_ignore without Skippable (and vice versa)


class ErrorAction(enum.Enum):
    """Action to take when an error is encountered."""

    IGNORE = enum.auto()
    WARN = enum.auto()
    FAIL = enum.auto()

    @classmethod
    def from_config(cls, setting: str, variables: dict[str, t.Any] | None = None) -> t.Self:
        """Return an `ErrorAction` enum from the specified Ansible config setting."""
        return cls[config.get_config_value(setting, variables=variables).upper()]


class _SkipException(BaseException):
    """Internal flow control exception for skipping code blocks within a `Skippable` context manager."""

    def __init__(self) -> None:
        super().__init__('Skipping ignored action due to use of `skip_on_ignore`. It is a bug to encounter this message outside of debugging.')


class _SkippableContextManager:
    """Internal context manager to support flow control for skipping code blocks."""

    def __enter__(self) -> None:
        pass

    def __exit__(self, exc_type, _exc_val, _exc_tb) -> bool:
        if exc_type is None:
            raise RuntimeError('A `Skippable` context manager was entered, but a `skip_on_ignore` handler was never invoked.')

        return exc_type is _SkipException  # only mask a _SkipException, allow all others to raise


Skippable = _SkippableContextManager()
"""Context manager singleton required to enclose `ErrorHandler.handle` invocations when `skip_on_ignore` is `True`."""


class ErrorHandler:
    """
    Provides a configurable error handler context manager for a specific list of exception types.
    Unhandled errors leaving the context manager can be ignored, treated as warnings, or allowed to raise by setting `ErrorAction`.
    """

    def __init__(self, action: ErrorAction) -> None:
        self.action = action

    @contextlib.contextmanager
    def handle(self, *args: type[BaseException], skip_on_ignore: bool = False) -> t.Iterator[None]:
        """
        Handle the specified exception(s) using the defined error action.
        If `skip_on_ignore` is `True`, the body of the context manager will be skipped for `ErrorAction.IGNORE`.
        Use of `skip_on_ignore` requires enclosure within the `Skippable` context manager.
        """
        if not args:
            raise ValueError('At least one exception type is required.')

        if skip_on_ignore and self.action == ErrorAction.IGNORE:
            raise _SkipException()  # skipping ignored action

        try:
            yield
        except args as ex:
            match self.action:
                case ErrorAction.WARN:
                    display.error_as_warning(msg=None, exception=ex)
                case ErrorAction.FAIL:
                    raise
                case _:  # ErrorAction.IGNORE
                    pass

        if skip_on_ignore:
            raise _SkipException()  # completed skippable action, ensures the `Skippable` context was used

    @classmethod
    def from_config(cls, setting: str, variables: dict[str, t.Any] | None = None) -> t.Self:
        """Return an `ErrorHandler` instance configured using the specified Ansible config setting."""
        return cls(ErrorAction.from_config(setting, variables=variables))
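The core of `ErrorHandler` (route listed exception types per a configured action) works without any Ansible machinery. This is a minimal sketch: warnings are collected in a list instead of going through Ansible's `Display`, and the `Skippable`/`skip_on_ignore` flow control is omitted:

```python
import contextlib
import enum


class ErrorAction(enum.Enum):
    IGNORE = enum.auto()
    WARN = enum.auto()
    FAIL = enum.auto()


class ErrorHandler:
    """Standalone sketch: exceptions of the given types are swallowed, recorded
    as warnings, or re-raised according to the configured action."""

    def __init__(self, action: ErrorAction) -> None:
        self.action = action
        self.warnings: list[str] = []

    @contextlib.contextmanager
    def handle(self, *exc_types: type[BaseException]):
        if not exc_types:
            raise ValueError('At least one exception type is required.')
        try:
            yield
        except exc_types as ex:
            if self.action is ErrorAction.WARN:
                self.warnings.append(str(ex))  # stand-in for display.error_as_warning
            elif self.action is ErrorAction.FAIL:
                raise
            # ErrorAction.IGNORE: swallow silently


warn_handler = ErrorHandler(ErrorAction.WARN)
with warn_handler.handle(KeyError, ValueError):
    raise ValueError('bad input')
print(warn_handler.warnings)
```

Exception types not listed in `handle(...)` always propagate, regardless of the configured action.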
310
lib/ansible/_internal/_errors/_utils.py
Normal file

@@ -0,0 +1,310 @@
from __future__ import annotations
|
||||
|
||||
import dataclasses
|
||||
import itertools
|
||||
import pathlib
|
||||
import sys
|
||||
import textwrap
|
||||
import typing as t
|
||||
|
||||
from ansible.module_utils.common.messages import Detail, ErrorSummary
|
||||
from ansible._internal._datatag._tags import Origin
|
||||
from ansible.module_utils._internal import _ambient_context, _traceback
|
||||
from ansible import errors
|
||||
|
||||
if t.TYPE_CHECKING:
|
||||
from ansible.utils.display import Display
|
||||
|
||||
|
||||
class RedactAnnotatedSourceContext(_ambient_context.AmbientContextBase):
|
||||
"""
|
||||
When active, this context will redact annotated source lines, showing only the origin.
|
||||
"""
|
||||
|
||||
|
||||
def _dedupe_and_concat_message_chain(message_parts: list[str]) -> str:
|
||||
message_parts = list(reversed(message_parts))
|
||||
|
||||
message = message_parts.pop(0)
|
||||
|
||||
for message_part in message_parts:
|
||||
# avoid duplicate messages where the cause was already concatenated to the exception message
|
||||
if message_part.endswith(message):
|
||||
message = message_part
|
||||
else:
|
||||
message = concat_message(message_part, message)
|
||||
|
||||
return message
|
||||
|
||||
|
||||
def _collapse_error_details(error_details: t.Sequence[Detail]) -> list[Detail]:
|
||||
"""
|
||||
Return a potentially modified error chain, with redundant errors collapsed into previous error(s) in the chain.
|
||||
This reduces the verbosity of messages by eliminating repetition when multiple errors in the chain share the same contextual information.
|
||||
"""
|
||||
previous_error = error_details[0]
|
||||
previous_warnings: list[str] = []
|
||||
collapsed_error_details: list[tuple[Detail, list[str]]] = [(previous_error, previous_warnings)]
|
||||
|
||||
for error in error_details[1:]:
|
||||
details_present = error.formatted_source_context or error.help_text
|
||||
details_changed = error.formatted_source_context != previous_error.formatted_source_context or error.help_text != previous_error.help_text
|
||||
|
||||
if details_present and details_changed:
|
||||
previous_error = error
|
||||
previous_warnings = []
|
||||
collapsed_error_details.append((previous_error, previous_warnings))
|
||||
else:
|
||||
previous_warnings.append(error.msg)
|
||||
|
||||
final_error_details: list[Detail] = []
|
||||
|
||||
for error, messages in collapsed_error_details:
|
||||
final_error_details.append(dataclasses.replace(error, msg=_dedupe_and_concat_message_chain([error.msg] + messages)))
|
||||
|
||||
return final_error_details
|
||||
|
||||
|
||||
def _get_cause(exception: BaseException) -> BaseException | None:
|
||||
# deprecated: description='remove support for orig_exc (deprecated in 2.23)' core_version='2.27'
|
||||
|
||||
if not isinstance(exception, errors.AnsibleError):
|
||||
return exception.__cause__
|
||||
|
||||
if exception.__cause__:
|
||||
if exception.orig_exc and exception.orig_exc is not exception.__cause__:
|
||||
_get_display().warning(
|
||||
msg=f"The `orig_exc` argument to `{type(exception).__name__}` was given, but differed from the cause given by `raise ... from`.",
|
||||
)
|
||||
|
||||
return exception.__cause__
|
||||
|
||||
if exception.orig_exc:
|
||||
# encourage the use of `raise ... from` before deprecating `orig_exc`
|
||||
_get_display().warning(msg=f"The `orig_exc` argument to `{type(exception).__name__}` was given without using `raise ... from orig_exc`.")
|
||||
|
||||
return exception.orig_exc
|
||||
|
||||
return None
|
||||
|
||||
|
||||
class _TemporaryDisplay:
|
||||
# DTFIX-FUTURE: generalize this and hide it in the display module so all users of Display can benefit
|
||||
|
||||
@staticmethod
|
||||
def warning(*args, **kwargs):
|
||||
print(f'FALLBACK WARNING: {args} {kwargs}', file=sys.stderr)
|
||||
|
||||
@staticmethod
|
||||
def deprecated(*args, **kwargs):
|
||||
print(f'FALLBACK DEPRECATION: {args} {kwargs}', file=sys.stderr)
|
||||
|
||||
|
||||
def _get_display() -> Display | _TemporaryDisplay:
|
||||
try:
|
||||
from ansible.utils.display import Display
|
||||
except ImportError:
|
||||
return _TemporaryDisplay()
|
||||
|
||||
return Display()
|
||||
|
||||
|
||||
def _create_error_summary(exception: BaseException, event: _traceback.TracebackEvent | None = None) -> ErrorSummary:
|
||||
from . import _captured # avoid circular import due to AnsibleError import
|
||||
|
||||
current_exception: BaseException | None = exception
|
||||
error_details: list[Detail] = []
|
||||
|
||||
if event:
|
||||
formatted_traceback = _traceback.maybe_extract_traceback(exception, event)
|
||||
else:
|
||||
formatted_traceback = None
|
||||
|
||||
while current_exception:
|
||||
if isinstance(current_exception, errors.AnsibleError):
|
||||
include_cause_message = current_exception._include_cause_message
|
||||
edc = Detail(
|
||||
msg=current_exception._original_message.strip(),
|
||||
formatted_source_context=current_exception._formatted_source_context,
|
||||
help_text=current_exception._help_text,
|
||||
)
|
||||
else:
|
||||
include_cause_message = True
|
||||
edc = Detail(
|
||||
msg=str(current_exception).strip(),
|
||||
)
|
||||
|
||||
error_details.append(edc)
|
||||
|
||||
if isinstance(current_exception, _captured.AnsibleCapturedError):
|
||||
detail = current_exception.error_summary
|
||||
error_details.extend(detail.details)
|
||||
|
||||
if formatted_traceback and detail.formatted_traceback:
|
||||
formatted_traceback = (
|
||||
f'{detail.formatted_traceback}\n'
|
||||
f'The {current_exception.context} exception above was the direct cause of the following controller exception:\n\n'
|
||||
f'{formatted_traceback}'
|
||||
)
|
||||
|
||||
if not include_cause_message:
|
||||
break
|
||||
|
||||
current_exception = _get_cause(current_exception)
|
||||
|
||||
return ErrorSummary(details=tuple(error_details), formatted_traceback=formatted_traceback)
|
||||
|
||||
|
||||
def concat_message(left: str, right: str) -> str:
|
||||
"""Normalize `left` by removing trailing punctuation and spaces before appending new punctuation and `right`."""
|
||||
return f'{left.rstrip(". ")}: {right}'
|
||||
|
||||
|
||||
def get_chained_message(exception: BaseException) -> str:
|
||||
"""
|
||||
Return the full chain of exception messages by concatenating the cause(s) until all are exhausted.
|
||||
"""
|
||||
error_summary = _create_error_summary(exception)
|
||||
message_parts = [edc.msg for edc in error_summary.details]
|
||||
|
||||
return _dedupe_and_concat_message_chain(message_parts)
|
||||
|
||||
|
||||
@dataclasses.dataclass(kw_only=True, frozen=True)
class SourceContext:
    origin: Origin
    annotated_source_lines: list[str]
    target_line: str | None

    def __str__(self) -> str:
        msg_lines = [f'Origin: {self.origin}']

        if self.annotated_source_lines:
            msg_lines.append('')
            msg_lines.extend(self.annotated_source_lines)

        return '\n'.join(msg_lines)

    @classmethod
    def from_value(cls, value: t.Any) -> SourceContext | None:
        """Attempt to retrieve source and render a contextual indicator from the value's origin (if any)."""
        if value is None:
            return None

        if isinstance(value, Origin):
            origin = value
            value = None
        else:
            origin = Origin.get_tag(value)

        if RedactAnnotatedSourceContext.current(optional=True):
            return cls.error('content redacted')

        if origin and origin.path:
            return cls.from_origin(origin)

        # DTFIX-RELEASE: redaction context may not be sufficient to avoid secret disclosure without SensitiveData and other enhancements
        if value is None:
            truncated_value = None
            annotated_source_lines = []
        else:
            # DTFIX-FUTURE: cleanup/share width
            try:
                value = str(value)
            except Exception as ex:
                value = f'<< context unavailable: {ex} >>'

            truncated_value = textwrap.shorten(value, width=120)
            annotated_source_lines = [truncated_value]

        return SourceContext(
            origin=origin or Origin.UNKNOWN,
            annotated_source_lines=annotated_source_lines,
            target_line=truncated_value,
        )

    @staticmethod
    def error(message: str | None, origin: Origin | None = None) -> SourceContext:
        return SourceContext(
            origin=origin,
            annotated_source_lines=[f'(source not shown: {message})'] if message else [],
            target_line=None,
        )

    @classmethod
    def from_origin(cls, origin: Origin) -> SourceContext:
        """Attempt to retrieve source and render a contextual indicator of an error location."""
        from ansible.parsing.vault import is_encrypted  # avoid circular import

        # DTFIX-FUTURE: support referencing the column after the end of the target line, so we can indicate where a missing character (quote) needs to be added
        # this is also useful for cases like end-of-stream reported by the YAML parser

        # DTFIX-FUTURE: Implement line wrapping and match annotated line width to the terminal display width.

        context_line_count: t.Final = 2
        max_annotated_line_width: t.Final = 120
        truncation_marker: t.Final = '...'

        target_line_num = origin.line_num

        if RedactAnnotatedSourceContext.current(optional=True):
            return cls.error('content redacted', origin)

        if not target_line_num or target_line_num < 1:
            return cls.error(None, origin)  # message omitted since lack of line number is obvious from pos

        start_line_idx = max(0, (target_line_num - 1) - context_line_count)  # if near start of file
        target_col_num = origin.col_num

        try:
            with pathlib.Path(origin.path).open() as src:
                first_line = src.readline()
                lines = list(itertools.islice(itertools.chain((first_line,), src), start_line_idx, target_line_num))
        except Exception as ex:
            return cls.error(type(ex).__name__, origin)

        if is_encrypted(first_line):
            return cls.error('content encrypted', origin)

        if len(lines) != target_line_num - start_line_idx:
            return cls.error('file truncated', origin)

        annotated_source_lines = []

        line_label_width = len(str(target_line_num))
        max_src_line_len = max_annotated_line_width - line_label_width - 1

        usable_line_len = max_src_line_len

        for line_num, line in enumerate(lines, start_line_idx + 1):
            line = line.rstrip('\n')  # universal newline default mode on `open` ensures we'll never see anything but \n
            line = line.replace('\t', ' ')  # mixed tab/space handling is intentionally disabled since we're both format and display config agnostic

            if len(line) > max_src_line_len:
                line = line[: max_src_line_len - len(truncation_marker)] + truncation_marker
                usable_line_len = max_src_line_len - len(truncation_marker)

            annotated_source_lines.append(f'{str(line_num).rjust(line_label_width)}{" " if line else ""}{line}')

        if target_col_num and usable_line_len >= target_col_num >= 1:
            column_marker = f'column {target_col_num}'

            target_col_idx = target_col_num - 1

            if target_col_idx + 2 + len(column_marker) > max_src_line_len:
                column_marker = f'{" " * (target_col_idx - len(column_marker) - 1)}{column_marker} ^'
            else:
                column_marker = f'{" " * target_col_idx}^ {column_marker}'

            column_marker = f'{" " * line_label_width} {column_marker}'

            annotated_source_lines.append(column_marker)
        elif target_col_num is None:
            underline_length = len(annotated_source_lines[-1]) - line_label_width - 1
            annotated_source_lines.append(f'{" " * line_label_width} {"^" * underline_length}')

        return SourceContext(
            origin=origin,
            annotated_source_lines=annotated_source_lines,
            target_line=lines[-1].rstrip('\n'),  # universal newline default mode on `open` ensures we'll never see anything but \n
        )
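The `SourceContext.from_origin` layout above renders numbered context lines with a caret under the error column. A stripped-down sketch of that rendering (my own simplification, assuming in-memory lines rather than a file, and omitting truncation and redaction):

```python
def annotate(lines, target_line_num, target_col_num, context=2):
    # Render numbered source lines plus a caret marker under the target column,
    # loosely following the SourceContext.from_origin layout above.
    start = max(0, target_line_num - 1 - context)
    label_width = len(str(target_line_num))

    out = []
    for num, line in enumerate(lines[start:target_line_num], start + 1):
        out.append(f'{str(num).rjust(label_width)} {line}')

    # label gutter, then spaces up to the column, then the caret
    out.append(f'{" " * label_width} {" " * (target_col_num - 1)}^ column {target_col_num}')
    return '\n'.join(out)


src = ['a: 1', 'b: {{ broken', 'c: 3']
print(annotate(src, target_line_num=2, target_col_num=4))
```

This prints the two context lines followed by a caret line aligned under column 4 of the target line.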
160  lib/ansible/_internal/_json/__init__.py  Normal file
@@ -0,0 +1,160 @@
"""Internal utilities for serialization and deserialization."""

# DTFIX-RELEASE: most of this isn't JSON specific, find a better home

from __future__ import annotations

import json
import typing as t

from ansible.errors import AnsibleVariableTypeError

from ansible.module_utils._internal._datatag import (
    _ANSIBLE_ALLOWED_MAPPING_VAR_TYPES,
    _ANSIBLE_ALLOWED_NON_SCALAR_COLLECTION_VAR_TYPES,
    _ANSIBLE_ALLOWED_VAR_TYPES,
    _AnsibleTaggedStr,
    AnsibleTagHelper,
)
from ansible.module_utils._internal._json._profiles import _tagless
from ansible.parsing.vault import EncryptedString
from ansible._internal._datatag._tags import Origin, TrustedAsTemplate
from ansible.module_utils import _internal

_T = t.TypeVar('_T')
_sentinel = object()


class HasCurrent(t.Protocol):
    """Utility protocol for mixin type safety."""

    _current: t.Any


class StateTrackingMixIn(HasCurrent):
    """Mixin for use with `AnsibleVariableVisitor` to track current visitation context."""

    def __init__(self, *args, **kwargs) -> None:
        super().__init__(*args, **kwargs)

        self._stack: list[t.Any] = []

    def __enter__(self) -> None:
        self._stack.append(self._current)

    def __exit__(self, *_args, **_kwargs) -> None:
        self._stack.pop()

    def _get_stack(self) -> list[t.Any]:
        if not self._stack:
            return []

        return self._stack[1:] + [self._current]


class AnsibleVariableVisitor:
    """Utility visitor base class to recursively apply various behaviors and checks to variable object graphs."""

    def __init__(
        self,
        *,
        trusted_as_template: bool = False,
        origin: Origin | None = None,
        convert_mapping_to_dict: bool = False,
        convert_sequence_to_list: bool = False,
        convert_custom_scalars: bool = False,
        allow_encrypted_string: bool = False,
    ):
        super().__init__()  # supports StateTrackingMixIn

        self.trusted_as_template = trusted_as_template
        self.origin = origin
        self.convert_mapping_to_dict = convert_mapping_to_dict
        self.convert_sequence_to_list = convert_sequence_to_list
        self.convert_custom_scalars = convert_custom_scalars
        self.allow_encrypted_string = allow_encrypted_string

        self._current: t.Any = None  # supports StateTrackingMixIn

    def __enter__(self) -> t.Any:
        """No-op context manager dispatcher (delegates to mixin behavior if present)."""
        if func := getattr(super(), '__enter__', None):
            func()

    def __exit__(self, *args, **kwargs) -> t.Any:
        """No-op context manager dispatcher (delegates to mixin behavior if present)."""
        if func := getattr(super(), '__exit__', None):
            func(*args, **kwargs)

    def visit(self, value: _T) -> _T:
        """
        Enforces Ansible's variable type system restrictions before a var is accepted in inventory. Also, conditionally implements template trust
        compatibility, depending on the plugin's declared understanding (or lack thereof). This always recursively copies inputs to fully isolate
        inventory data from what the plugin provided, and prevent any later mutation.
        """
        return self._visit(None, value)

    def _early_visit(self, value, value_type) -> t.Any:
        """Overridable hook point to allow custom string handling in derived visitors."""
        if value_type in (str, _AnsibleTaggedStr):
            # apply compatibility behavior
            if self.trusted_as_template:
                result = TrustedAsTemplate().tag(value)
            else:
                result = value
        else:
            result = _sentinel

        return result

    def _visit(self, key: t.Any, value: _T) -> _T:
        """Internal implementation to recursively visit a data structure's contents."""
        self._current = key  # supports StateTrackingMixIn

        value_type = type(value)

        result: _T

        # DTFIX-RELEASE: the visitor is ignoring dict/mapping keys except for debugging and schema-aware checking, it should be doing type checks on keys
        # DTFIX-RELEASE: some type lists being consulted (the ones from datatag) are probably too permissive, and perhaps should not be dynamic

        if (result := self._early_visit(value, value_type)) is not _sentinel:
            pass
        # DTFIX-RELEASE: de-duplicate and optimize; extract inline generator expressions and fallback function or mapping for native type calculation?
        elif value_type in _ANSIBLE_ALLOWED_MAPPING_VAR_TYPES:  # check mappings first, because they're also collections
            with self:  # supports StateTrackingMixIn
                result = AnsibleTagHelper.tag_copy(value, ((k, self._visit(k, v)) for k, v in value.items()), value_type=value_type)
        elif value_type in _ANSIBLE_ALLOWED_NON_SCALAR_COLLECTION_VAR_TYPES:
            with self:  # supports StateTrackingMixIn
                result = AnsibleTagHelper.tag_copy(value, (self._visit(k, v) for k, v in enumerate(t.cast(t.Iterable, value))), value_type=value_type)
        elif self.allow_encrypted_string and isinstance(value, EncryptedString):
            return value  # type: ignore[return-value] # DTFIX-RELEASE: this should probably only be allowed for values in dict, not keys (set, dict)
        elif self.convert_mapping_to_dict and _internal.is_intermediate_mapping(value):
            with self:  # supports StateTrackingMixIn
                result = {k: self._visit(k, v) for k, v in value.items()}  # type: ignore[assignment]
        elif self.convert_sequence_to_list and _internal.is_intermediate_iterable(value):
            with self:  # supports StateTrackingMixIn
                result = [self._visit(k, v) for k, v in enumerate(t.cast(t.Iterable, value))]  # type: ignore[assignment]
        elif self.convert_custom_scalars and isinstance(value, str):
            result = str(value)  # type: ignore[assignment]
        elif self.convert_custom_scalars and isinstance(value, float):
            result = float(value)  # type: ignore[assignment]
        elif self.convert_custom_scalars and isinstance(value, int) and not isinstance(value, bool):
            result = int(value)  # type: ignore[assignment]
        else:
            if value_type not in _ANSIBLE_ALLOWED_VAR_TYPES:
                raise AnsibleVariableTypeError.from_value(obj=value)

            # supported scalar type that requires no special handling, just return as-is
            result = value

        if self.origin and not Origin.is_tagged_on(result):
            # apply shared instance default origin tag
            result = self.origin.tag(result)

        return result


def json_dumps_formatted(value: object) -> str:
    """Return a JSON dump of `value` with formatting and keys sorted."""
    return json.dumps(value, cls=_tagless.Encoder, sort_keys=True, indent=4)
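The core pattern in `AnsibleVariableVisitor._visit` is a type-dispatched recursive copy that rejects unsupported types. A minimal sketch of that shape, with none of the tagging, trust, or conversion machinery (`Visitor` and its allowed-type set are illustrative only):

```python
import typing as t


class Visitor:
    # Minimal structural visitor in the spirit of AnsibleVariableVisitor:
    # recursively copies mappings and sequences, rejects unsupported types.
    ALLOWED_SCALARS = (str, int, float, bool, type(None))

    def visit(self, value: t.Any) -> t.Any:
        value_type = type(value)

        if value_type is dict:
            # mappings first, because they're also collections
            return {k: self.visit(v) for k, v in value.items()}
        if value_type in (list, tuple):
            return [self.visit(v) for v in value]
        if isinstance(value, self.ALLOWED_SCALARS):
            return value

        raise TypeError(f'unsupported variable type: {value_type.__name__}')


data = {'a': [1, 'x', {'b': None}]}
copied = Visitor().visit(data)
assert copied == data and copied is not data  # deep copy, fully isolated
```

The recursive copy is what gives callers the isolation guarantee described in the `visit` docstring: mutating the plugin-provided input afterward cannot affect the stored result.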
34  lib/ansible/_internal/_json/_legacy_encoder.py  Normal file
@@ -0,0 +1,34 @@
from __future__ import annotations as _annotations

import typing as _t

from ansible.module_utils._internal._json import _profiles
from ansible._internal._json._profiles import _legacy
from ansible.parsing import vault as _vault


class LegacyControllerJSONEncoder(_legacy.Encoder):
    """Compatibility wrapper over `legacy` profile JSON encoder to support trust stripping and vault value plaintext conversion."""

    def __init__(self, preprocess_unsafe: bool = False, vault_to_text: bool = False, _decode_bytes: bool = False, **kwargs) -> None:
        self._preprocess_unsafe = preprocess_unsafe
        self._vault_to_text = vault_to_text
        self._decode_bytes = _decode_bytes

        super().__init__(**kwargs)

    def default(self, o: _t.Any) -> _t.Any:
        """Hooked default that can conditionally bypass base encoder behavior based on this instance's config."""
        if type(o) is _profiles._WrappedValue:  # pylint: disable=unidiomatic-typecheck
            o = o.wrapped

        if not self._preprocess_unsafe and type(o) is _legacy._Untrusted:  # pylint: disable=unidiomatic-typecheck
            return o.value  # if not emitting unsafe markers, bypass custom unsafe serialization and just return the raw value

        if self._vault_to_text and type(o) is _vault.EncryptedString:  # pylint: disable=unidiomatic-typecheck
            return str(o)  # decrypt and return the plaintext (or fail trying)

        if self._decode_bytes and isinstance(o, bytes):
            return o.decode(errors='surrogateescape')  # backward compatibility with `ansible.module_utils.basic.jsonify`

        return super().default(o)
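The `default` hook above follows the standard `json.JSONEncoder` extension pattern: handle the types you care about, then defer to the base class. A standalone sketch of just the bytes-decoding branch (`BytesFriendlyEncoder` is an illustrative name, not part of Ansible):

```python
import json


class BytesFriendlyEncoder(json.JSONEncoder):
    # Like LegacyControllerJSONEncoder's bytes handling: a default() hook that
    # decodes bytes instead of failing, then defers to the base class.
    def default(self, o):
        if isinstance(o, bytes):
            return o.decode(errors='surrogateescape')
        return super().default(o)


print(json.dumps({'data': b'abc'}, cls=BytesFriendlyEncoder))  # -> {"data": "abc"}
```

`default` is only consulted for objects the encoder cannot serialize natively, which is why the hook never sees plain strings or numbers.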
0  lib/ansible/_internal/_json/_profiles/__init__.py  Normal file
55  lib/ansible/_internal/_json/_profiles/_cache_persistence.py  Normal file
@@ -0,0 +1,55 @@
from __future__ import annotations

import datetime as _datetime

from ansible.module_utils._internal import _datatag
from ansible.module_utils._internal._json import _profiles
from ansible.parsing import vault as _vault
from ansible._internal._datatag import _tags


class _Profile(_profiles._JSONSerializationProfile):
    """Profile for external cache persistence of inventory/fact data that preserves most tags."""

    serialize_map = {}
    schema_id = 1

    @classmethod
    def post_init(cls, **kwargs):
        cls.allowed_ansible_serializable_types = (
            _profiles._common_module_types
            | _profiles._common_module_response_types
            | {
                _datatag._AnsibleTaggedDate,
                _datatag._AnsibleTaggedTime,
                _datatag._AnsibleTaggedDateTime,
                _datatag._AnsibleTaggedStr,
                _datatag._AnsibleTaggedInt,
                _datatag._AnsibleTaggedFloat,
                _datatag._AnsibleTaggedList,
                _datatag._AnsibleTaggedSet,
                _datatag._AnsibleTaggedTuple,
                _datatag._AnsibleTaggedDict,
                _tags.SourceWasEncrypted,
                _tags.Origin,
                _tags.TrustedAsTemplate,
                _vault.EncryptedString,
                _vault.VaultedValue,
            }
        )

        cls.serialize_map = {
            set: cls.serialize_as_list,
            tuple: cls.serialize_as_list,
            _datetime.date: _datatag.AnsibleSerializableDate,
            _datetime.time: _datatag.AnsibleSerializableTime,
            _datetime.datetime: _datatag.AnsibleSerializableDateTime,
        }


class Encoder(_profiles.AnsibleProfileJSONEncoder):
    _profile = _Profile


class Decoder(_profiles.AnsibleProfileJSONDecoder):
    _profile = _Profile
40  lib/ansible/_internal/_json/_profiles/_inventory_legacy.py  Normal file
@@ -0,0 +1,40 @@
"""
Backwards compatibility profile for serialization for persisted ansible-inventory output.
Behavior is equivalent to pre 2.18 `AnsibleJSONEncoder` with vault_to_text=True.
"""

from __future__ import annotations

from ... import _json
from . import _legacy


class _InventoryVariableVisitor(_legacy._LegacyVariableVisitor, _json.StateTrackingMixIn):
    """State-tracking visitor implementation that only applies trust to `_meta.hostvars` and `vars` inventory values."""

    # DTFIX-RELEASE: does the variable visitor need to support conversion of sequence/mapping for inventory?

    @property
    def _allow_trust(self) -> bool:
        stack = self._get_stack()

        if len(stack) >= 4 and stack[:2] == ['_meta', 'hostvars']:
            return True

        if len(stack) >= 3 and stack[1] == 'vars':
            return True

        return False


class _Profile(_legacy._Profile):
    visitor_type = _InventoryVariableVisitor
    encode_strings_as_utf8 = True


class Encoder(_legacy.Encoder):
    _profile = _Profile


class Decoder(_legacy.Decoder):
    _profile = _Profile
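The `_allow_trust` property above gates trust by the visitation path within the `ansible-inventory` JSON layout. The same check as a free function, with example paths (the host/group names are illustrative):

```python
def allow_trust(stack):
    # Mirrors _InventoryVariableVisitor._allow_trust: trust only values under
    # _meta.hostvars.<host>.<var> or <group>.vars.<var> in ansible-inventory output.
    if len(stack) >= 4 and stack[:2] == ['_meta', 'hostvars']:
        return True
    if len(stack) >= 3 and stack[1] == 'vars':
        return True
    return False


assert allow_trust(['_meta', 'hostvars', 'web01', 'ansible_host'])
assert allow_trust(['all', 'vars', 'ntp_server'])
assert not allow_trust(['all', 'children'])
```

The length checks matter: the structural keys themselves (e.g. the `vars` key or a host name) never satisfy the predicate, only values nested beneath them.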
198  lib/ansible/_internal/_json/_profiles/_legacy.py  Normal file
@@ -0,0 +1,198 @@
"""
Backwards compatibility profile for serialization other than inventory (which should use inventory_legacy for backward-compatible trust behavior).
Behavior is equivalent to pre 2.18 `AnsibleJSONEncoder` with vault_to_text=True.
"""

from __future__ import annotations as _annotations

import datetime as _datetime
import typing as _t

from ansible._internal._datatag import _tags
from ansible.module_utils._internal import _datatag
from ansible.module_utils._internal._json import _profiles
from ansible.parsing import vault as _vault

from ... import _json


class _Untrusted:
    """
    Temporarily wraps strings which are not trusted for templating.
    Used before serialization of strings not tagged TrustedAsTemplate when trust inversion is enabled and trust is allowed in the string's context.
    Used during deserialization of `__ansible_unsafe` strings to indicate they should not be tagged TrustedAsTemplate.
    """

    __slots__ = ('value',)

    def __init__(self, value: str) -> None:
        self.value = value


class _LegacyVariableVisitor(_json.AnsibleVariableVisitor):
    """Variable visitor that supports optional trust inversion for legacy serialization."""

    def __init__(
        self,
        *,
        trusted_as_template: bool = False,
        invert_trust: bool = False,
        origin: _tags.Origin | None = None,
        convert_mapping_to_dict: bool = False,
        convert_sequence_to_list: bool = False,
        convert_custom_scalars: bool = False,
    ):
        super().__init__(
            trusted_as_template=trusted_as_template,
            origin=origin,
            convert_mapping_to_dict=convert_mapping_to_dict,
            convert_sequence_to_list=convert_sequence_to_list,
            convert_custom_scalars=convert_custom_scalars,
            allow_encrypted_string=True,
        )

        self.invert_trust = invert_trust

        if trusted_as_template and invert_trust:
            raise ValueError('trusted_as_template is mutually exclusive with invert_trust')

    @property
    def _allow_trust(self) -> bool:
        """
        This profile supports trust application in all contexts.
        Derived implementations can override this behavior for application-dependent/schema-aware trust.
        """
        return True

    def _early_visit(self, value, value_type) -> _t.Any:
        """Similar to base implementation, but supports an intermediate wrapper for trust inversion."""
        if value_type in (str, _datatag._AnsibleTaggedStr):
            # apply compatibility behavior
            if self.trusted_as_template and self._allow_trust:
                result = _tags.TrustedAsTemplate().tag(value)
            elif self.invert_trust and not _tags.TrustedAsTemplate.is_tagged_on(value) and self._allow_trust:
                result = _Untrusted(value)
            else:
                result = value
        elif value_type is _Untrusted:
            result = value.value
        else:
            result = _json._sentinel

        return result


class _Profile(_profiles._JSONSerializationProfile["Encoder", "Decoder"]):
    visitor_type = _LegacyVariableVisitor

    @classmethod
    def serialize_untrusted(cls, value: _Untrusted) -> dict[str, str] | str:
        return dict(
            __ansible_unsafe=_datatag.AnsibleTagHelper.untag(value.value),
        )

    @classmethod
    def serialize_tagged_str(cls, value: _datatag.AnsibleTaggedObject) -> _t.Any:
        if ciphertext := _vault.VaultHelper.get_ciphertext(value, with_tags=False):
            return dict(
                __ansible_vault=ciphertext,
            )

        return _datatag.AnsibleTagHelper.untag(value)

    @classmethod
    def deserialize_unsafe(cls, value: dict[str, _t.Any]) -> _Untrusted:
        ansible_unsafe = value['__ansible_unsafe']

        if type(ansible_unsafe) is not str:  # pylint: disable=unidiomatic-typecheck
            raise TypeError(f"__ansible_unsafe is {type(ansible_unsafe)} not {str}")

        return _Untrusted(ansible_unsafe)

    @classmethod
    def deserialize_vault(cls, value: dict[str, _t.Any]) -> _vault.EncryptedString:
        ansible_vault = value['__ansible_vault']

        if type(ansible_vault) is not str:  # pylint: disable=unidiomatic-typecheck
            raise TypeError(f"__ansible_vault is {type(ansible_vault)} not {str}")

        encrypted_string = _vault.EncryptedString(ciphertext=ansible_vault)

        return encrypted_string

    @classmethod
    def serialize_encrypted_string(cls, value: _vault.EncryptedString) -> dict[str, str]:
        return dict(
            __ansible_vault=_vault.VaultHelper.get_ciphertext(value, with_tags=False),
        )

    @classmethod
    def post_init(cls) -> None:
        cls.serialize_map = {
            set: cls.serialize_as_list,
            tuple: cls.serialize_as_list,
            _datetime.date: cls.serialize_as_isoformat,  # existing devel behavior
            _datetime.time: cls.serialize_as_isoformat,  # always failed pre-2.18, so okay to include for consistency
            _datetime.datetime: cls.serialize_as_isoformat,  # existing devel behavior
            _datatag._AnsibleTaggedDate: cls.discard_tags,
            _datatag._AnsibleTaggedTime: cls.discard_tags,
            _datatag._AnsibleTaggedDateTime: cls.discard_tags,
            _vault.EncryptedString: cls.serialize_encrypted_string,
            _datatag._AnsibleTaggedStr: cls.serialize_tagged_str,  # for VaultedValue tagged str
            _datatag._AnsibleTaggedInt: cls.discard_tags,
            _datatag._AnsibleTaggedFloat: cls.discard_tags,
            _datatag._AnsibleTaggedList: cls.discard_tags,
            _datatag._AnsibleTaggedSet: cls.discard_tags,
            _datatag._AnsibleTaggedTuple: cls.discard_tags,
            _datatag._AnsibleTaggedDict: cls.discard_tags,
            _Untrusted: cls.serialize_untrusted,  # equivalent to AnsibleJSONEncoder(preprocess_unsafe=True) in devel
        }

        cls.deserialize_map = {
            '__ansible_unsafe': cls.deserialize_unsafe,
            '__ansible_vault': cls.deserialize_vault,
        }

    @classmethod
    def pre_serialize(cls, encoder: Encoder, o: _t.Any) -> _t.Any:
        # DTFIX-RELEASE: these conversion args probably aren't needed
        avv = cls.visitor_type(invert_trust=True, convert_mapping_to_dict=True, convert_sequence_to_list=True, convert_custom_scalars=True)

        return avv.visit(o)

    @classmethod
    def post_deserialize(cls, decoder: Decoder, o: _t.Any) -> _t.Any:
        avv = cls.visitor_type(trusted_as_template=decoder._trusted_as_template, origin=decoder._origin)

        return avv.visit(o)

    @classmethod
    def handle_key(cls, k: _t.Any) -> _t.Any:
        if isinstance(k, str):
            return k

        # DTFIX-RELEASE: decide if this is a deprecation warning, error, or what?
        # Non-string variable names have been disallowed by set_fact and other things since at least 2021.
        # DTFIX-RELEASE: document why this behavior is here, also verify the legacy tagless use case doesn't need this same behavior
        return str(k)


class Encoder(_profiles.AnsibleProfileJSONEncoder):
    _profile = _Profile


class Decoder(_profiles.AnsibleProfileJSONDecoder):
    _profile = _Profile

    def __init__(self, **kwargs) -> None:
        super().__init__(**kwargs)

        # NB: these can only be sampled properly when loading strings, eg, `json.loads`; the global `json.load` function does not expose the file-like to us
        self._origin: _tags.Origin | None = None
        self._trusted_as_template: bool = False

    def raw_decode(self, s: str, idx: int = 0) -> tuple[_t.Any, int]:
        self._origin = _tags.Origin.get_tag(s)
        self._trusted_as_template = _tags.TrustedAsTemplate.is_tagged_on(s)

        return super().raw_decode(s, idx)
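The legacy profile round-trips special values through marker dicts such as `__ansible_unsafe` and `__ansible_vault`. A minimal sketch of that round-trip using the stdlib `default`/`object_hook` mechanisms; `Untrusted` here is a hypothetical stand-in for the `_Untrusted` wrapper, not the real class:

```python
import json


class Untrusted:
    # Stand-in for _Untrusted: wraps a string that must not be trusted for templating.
    def __init__(self, value):
        self.value = value


def encode(o):
    # Serialize the wrapper as a marker dict, like serialize_untrusted above.
    if isinstance(o, Untrusted):
        return {'__ansible_unsafe': o.value}
    raise TypeError(type(o).__name__)


def decode_hook(d):
    # Rehydrate marker dicts on load, like deserialize_unsafe above.
    if '__ansible_unsafe' in d:
        return Untrusted(d['__ansible_unsafe'])
    return d


payload = json.dumps({'msg': Untrusted('{{ not_a_template }}')}, default=encode)
restored = json.loads(payload, object_hook=decode_hook)
assert isinstance(restored['msg'], Untrusted)
assert restored['msg'].value == '{{ not_a_template }}'
```

The real profile additionally type-checks the marker value before rehydrating, since a hostile document could place a non-string under the marker key.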
21  lib/ansible/_internal/_locking.py  Normal file
@@ -0,0 +1,21 @@
from __future__ import annotations

import contextlib
import fcntl
import typing as t


@contextlib.contextmanager
def named_mutex(path: str) -> t.Iterator[None]:
    """
    Lightweight context manager wrapper over `fcntl.flock` to provide IPC locking via a shared filename.
    Entering the context manager blocks until the lock is acquired.
    The lock file will be created automatically, but creation of the parent directory and deletion of the lockfile are the caller's responsibility.
    """
    with open(path, 'a') as file:
        fcntl.flock(file, fcntl.LOCK_EX)

        try:
            yield
        finally:
            fcntl.flock(file, fcntl.LOCK_UN)
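A usage sketch for the flock-based mutex above, assuming a POSIX platform (`fcntl` is Unix-only) and a temp-directory lock path chosen for illustration:

```python
import contextlib
import fcntl
import os
import tempfile


@contextlib.contextmanager
def named_mutex(path):
    # Same shape as the helper above: block until the exclusive lock is held,
    # release it on exit; the open('a') creates the lock file if missing.
    with open(path, 'a') as file:
        fcntl.flock(file, fcntl.LOCK_EX)
        try:
            yield
        finally:
            fcntl.flock(file, fcntl.LOCK_UN)


lock_path = os.path.join(tempfile.gettempdir(), 'demo.lock')

with named_mutex(lock_path):
    # A second process calling named_mutex(lock_path) here would block
    # until this block exits.
    pass

os.remove(lock_path)  # deletion is the caller's responsibility, per the docstring
```

Because the lock is tied to the open file description, releasing is also implicit when the file is closed, but the explicit `LOCK_UN` keeps the lifetime obvious.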
0  lib/ansible/_internal/_plugins/__init__.py  Normal file
57  lib/ansible/_internal/_plugins/_cache.py  Normal file
@@ -0,0 +1,57 @@
from __future__ import annotations

import functools
import json
import json.encoder
import json.decoder
import typing as t

from .._wrapt import ObjectProxy
from .._json._profiles import _cache_persistence


class PluginInterposer(ObjectProxy):
    """Proxies a Cache plugin instance to implement transparent encapsulation of serialized Ansible internal data types."""

    _PAYLOAD_KEY = '__payload__'
    """The key used to store the serialized payload."""

    def get(self, key: str) -> dict[str, object]:
        return self._decode(self.__wrapped__.get(self._get_key(key)))

    def set(self, key: str, value: dict[str, object]) -> None:
        self.__wrapped__.set(self._get_key(key), self._encode(value))

    def keys(self) -> t.Sequence[str]:
        return [k for k in (self._restore_key(k) for k in self.__wrapped__.keys()) if k is not None]

    def contains(self, key: t.Any) -> bool:
        return self.__wrapped__.contains(self._get_key(key))

    def delete(self, key: str) -> None:
        self.__wrapped__.delete(self._get_key(key))

    @classmethod
    def _restore_key(cls, wrapped_key: str) -> str | None:
        prefix = cls._get_wrapped_key_prefix()

        if not wrapped_key.startswith(prefix):
            return None

        return wrapped_key[len(prefix) :]

    @classmethod
    @functools.cache
    def _get_wrapped_key_prefix(cls) -> str:
        return f's{_cache_persistence._Profile.schema_id}_'

    @classmethod
    def _get_key(cls, key: str) -> str:
        """Augment the supplied key with a schema identifier to allow for side-by-side caching across incompatible schemas."""
        return f'{cls._get_wrapped_key_prefix()}{key}'

    def _encode(self, value: dict[str, object]) -> dict[str, object]:
        return {self._PAYLOAD_KEY: json.dumps(value, cls=_cache_persistence.Encoder)}

    def _decode(self, value: dict[str, t.Any]) -> dict[str, object]:
        return json.loads(value[self._PAYLOAD_KEY], cls=_cache_persistence.Decoder)
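The key-prefixing scheme above lets caches written under different serialization schemas coexist: keys carry an `s{schema_id}_` prefix, and keys with a foreign prefix are filtered out of `keys()`. The prefix logic in isolation (a sketch with a fixed schema id):

```python
from __future__ import annotations

SCHEMA_ID = 1
PREFIX = f's{SCHEMA_ID}_'


def wrap_key(key: str) -> str:
    # Prefix cache keys with a schema id so incompatible formats can coexist.
    return f'{PREFIX}{key}'


def restore_key(wrapped: str) -> str | None:
    # Keys written under another schema (different prefix) are simply invisible.
    if not wrapped.startswith(PREFIX):
        return None
    return wrapped[len(PREFIX):]


assert wrap_key('host1') == 's1_host1'
assert restore_key('s1_host1') == 'host1'
assert restore_key('s2_host1') is None  # foreign schema, filtered out
```

Bumping the schema id when the wire format changes makes old entries unreadable rather than misread, which is the safe failure mode for a cache.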
78  lib/ansible/_internal/_task.py  Normal file
@@ -0,0 +1,78 @@
from __future__ import annotations

import dataclasses
import typing as t

from collections import abc as c

from ansible import constants
from ansible._internal._templating import _engine
from ansible._internal._templating._chain_templar import ChainTemplar
from ansible.errors import AnsibleError
from ansible.module_utils._internal._ambient_context import AmbientContextBase
from ansible.module_utils.datatag import native_type_name
from ansible.parsing import vault as _vault
from ansible.utils.display import Display

if t.TYPE_CHECKING:
    from ansible.playbook.task import Task


@dataclasses.dataclass
class TaskContext(AmbientContextBase):
    """Ambient context that wraps task execution on workers. It provides access to the currently executing task."""

    task: Task


TaskArgsFinalizerCallback = t.Callable[[str, t.Any, _engine.TemplateEngine, t.Any], t.Any]
"""Type alias for the shape of the `ActionBase.finalize_task_arg` method."""


class TaskArgsChainTemplar(ChainTemplar):
    """
    A ChainTemplar that carries a user-provided context object, optionally provided by `ActionBase.get_finalize_task_args_context`.
    TaskArgsFinalizer provides the context to each `ActionBase.finalize_task_arg` call to allow for more complex/stateful customization.
    """

    def __init__(self, *sources: c.Mapping, templar: _engine.TemplateEngine, callback: TaskArgsFinalizerCallback, context: t.Any) -> None:
        super().__init__(*sources, templar=templar)

        self.callback = callback
        self.context = context

    def template(self, key: t.Any, value: t.Any) -> t.Any:
        return self.callback(key, value, self.templar, self.context)


class TaskArgsFinalizer:
    """Invoked during task args finalization; allows actions to override default arg processing (e.g., templating)."""

    def __init__(self, *args: c.Mapping[str, t.Any] | str | None, templar: _engine.TemplateEngine) -> None:
        self._args_layers = [arg for arg in args if arg is not None]
        self._templar = templar

    def finalize(self, callback: TaskArgsFinalizerCallback, context: t.Any) -> dict[str, t.Any]:
        resolved_layers: list[c.Mapping[str, t.Any]] = []

        for layer in self._args_layers:
            if isinstance(layer, (str, _vault.EncryptedString)):  # EncryptedString can hide a template
                if constants.config.get_config_value('INJECT_FACTS_AS_VARS'):
                    Display().warning(
                        "Using a template for task args is unsafe in some situations "
                        "(see https://docs.ansible.com/ansible/devel/reference_appendices/faq.html#argsplat-unsafe).",
                        obj=layer,
                    )

                resolved_layer = self._templar.resolve_to_container(layer, options=_engine.TemplateOptions(value_for_omit={}))
            else:
                resolved_layer = layer

            if not isinstance(resolved_layer, dict):
                raise AnsibleError(f'Task args must resolve to a {native_type_name(dict)!r} not {native_type_name(resolved_layer)!r}.', obj=layer)

            resolved_layers.append(resolved_layer)

        ct = TaskArgsChainTemplar(*reversed(resolved_layers), templar=self._templar, callback=callback, context=context)

        return ct.as_dict()
10
lib/ansible/_internal/_templating/__init__.py
Normal file
from __future__ import annotations

from jinja2 import __version__ as _jinja2_version

# DTFIX-FUTURE: sanity test to ensure this doesn't drift from requirements
_MINIMUM_JINJA_VERSION = (3, 1)
_CURRENT_JINJA_VERSION = tuple(map(int, _jinja2_version.split('.', maxsplit=2)[:2]))

if _CURRENT_JINJA_VERSION < _MINIMUM_JINJA_VERSION:
    raise RuntimeError(f'Jinja version {".".join(map(str, _MINIMUM_JINJA_VERSION))} or higher is required (current version {_jinja2_version}).')
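The version gate above compares the first two components of the Jinja2 version string as an integer tuple, so `2.11` correctly sorts below `3.1` even though it would not as a string. A standalone sketch of the same check (the function name `meets_minimum` is an illustration, not part of the Ansible API):

```python
def meets_minimum(version_str: str, minimum: tuple[int, int]) -> bool:
    # keep only the major/minor components; patch levels and suffixes are ignored
    current = tuple(map(int, version_str.split('.', maxsplit=2)[:2]))
    # tuple comparison is lexicographic: (2, 11) < (3, 1)
    return current >= minimum


print(meets_minimum('3.1.6', (3, 1)))   # True
print(meets_minimum('2.11.3', (3, 1)))  # False
```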
86
lib/ansible/_internal/_templating/_access.py
Normal file
from __future__ import annotations

import abc
import typing as t

from contextvars import ContextVar

from ansible.module_utils._internal._datatag import AnsibleTagHelper


class NotifiableAccessContextBase(metaclass=abc.ABCMeta):
    """Base class for a context manager that, when active, receives notification of managed access for types/tags in which it has registered an interest."""

    _type_interest: t.FrozenSet[type] = frozenset()
    """Set of types (including tag types) for which this context will be notified upon access."""

    _mask: t.ClassVar[bool] = False
    """When true, only the innermost (most recently created) context of this type will be notified."""

    def __enter__(self):
        # noinspection PyProtectedMember
        AnsibleAccessContext.current()._register_interest(self)
        return self

    def __exit__(self, exc_type, exc_val, exc_tb) -> None:
        # noinspection PyProtectedMember
        AnsibleAccessContext.current()._unregister_interest(self)
        return None

    @abc.abstractmethod
    def _notify(self, o: t.Any) -> t.Any:
        """Derived classes implement custom notification behavior when a registered type or tag is accessed."""


class AnsibleAccessContext:
    """
    Broker object for managed access registration and notification.
    Each thread or other logical callstack has a dedicated `AnsibleAccessContext` object with which `NotifiableAccessContext` objects can register interest.
    When a managed access occurs on an object, each active `NotifiableAccessContext` within the current callstack that has registered interest in that
    object's type or a tag present on it will be notified.
    """

    _contextvar: t.ClassVar[ContextVar[AnsibleAccessContext]] = ContextVar('AnsibleAccessContext')

    @staticmethod
    def current() -> AnsibleAccessContext:
        """Creates or retrieves an `AnsibleAccessContext` for the current logical callstack."""
        try:
            ctx: AnsibleAccessContext = AnsibleAccessContext._contextvar.get()
        except LookupError:
            # didn't exist; create it
            ctx = AnsibleAccessContext()
            AnsibleAccessContext._contextvar.set(ctx)  # we ignore the token, since this should live for the life of the thread/async ctx

        return ctx

    def __init__(self) -> None:
        self._notify_contexts: list[NotifiableAccessContextBase] = []

    def _register_interest(self, context: NotifiableAccessContextBase) -> None:
        self._notify_contexts.append(context)

    def _unregister_interest(self, context: NotifiableAccessContextBase) -> None:
        ctx = self._notify_contexts.pop()

        if ctx is not context:
            raise RuntimeError(f'Out-of-order context deactivation detected. Found {ctx} instead of {context}.')

    def access(self, value: t.Any) -> None:
        """Notify all contexts which have registered interest in the given value that it is being accessed."""
        if not self._notify_contexts:
            return

        value_types = AnsibleTagHelper.tag_types(value) | frozenset((type(value),))
        masked: set[type] = set()

        for ctx in reversed(self._notify_contexts):
            if ctx._mask:
                if (ctx_type := type(ctx)) in masked:
                    continue

                masked.add(ctx_type)

            # noinspection PyProtectedMember
            if ctx._type_interest.intersection(value_types):
                ctx._notify(value)
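The broker pattern above (a per-callstack `ContextVar` registry that context managers join on entry and leave on exit) can be illustrated with a minimal standalone sketch. The names `Broker` and `AuditContext` are illustrative stand-ins, not the Ansible API, and tag-type matching is simplified to plain value types:

```python
import typing as t
from contextvars import ContextVar


class Broker:
    """Per-callstack registry; active contexts are notified when a value of interest is accessed."""

    _contextvar: t.ClassVar[ContextVar['Broker']] = ContextVar('Broker')

    @staticmethod
    def current() -> 'Broker':
        try:
            return Broker._contextvar.get()
        except LookupError:
            broker = Broker()
            Broker._contextvar.set(broker)  # lives for the life of the thread/async context
            return broker

    def __init__(self) -> None:
        self._contexts: list['AuditContext'] = []

    def access(self, value: t.Any) -> None:
        for ctx in self._contexts:
            if type(value) in ctx.type_interest:
                ctx.notify(value)


class AuditContext:
    """Context manager that records accesses to values of the types it registered interest in."""

    type_interest = frozenset({str})

    def __init__(self) -> None:
        self.seen: list[t.Any] = []

    def __enter__(self) -> 'AuditContext':
        Broker.current()._contexts.append(self)
        return self

    def __exit__(self, *exc_info) -> None:
        Broker.current()._contexts.remove(self)

    def notify(self, value: t.Any) -> None:
        self.seen.append(value)


with AuditContext() as audit:
    Broker.current().access('hello')  # str is of interest: recorded
    Broker.current().access(42)       # int is not: ignored

print(audit.seen)  # ['hello']
```

Using `ContextVar` rather than a module global keeps each thread (or async task) with its own broker, which mirrors why the real implementation notes the set token is intentionally ignored.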
63
lib/ansible/_internal/_templating/_chain_templar.py
Normal file
from __future__ import annotations

import collections.abc as c
import itertools
import typing as t

from ansible.errors import AnsibleValueOmittedError, AnsibleError

from ._engine import TemplateEngine


class ChainTemplar:
    """A basic variable layering mechanism that supports templating and obliteration of `omit` values."""

    def __init__(self, *sources: c.Mapping, templar: TemplateEngine) -> None:
        self.sources = sources
        self.templar = templar

    def template(self, key: t.Any, value: t.Any) -> t.Any:
        """
        Render the given value using the templar.
        Intended to be overridden by subclasses.
        """
        return self.templar.template(value)

    def get(self, key: t.Any) -> t.Any:
        """Get the value for the given key, templating the result before returning it."""
        for source in self.sources:
            if key not in source:
                continue

            value = source[key]

            try:
                return self.template(key, value)
            except AnsibleValueOmittedError:
                break  # omit == obliterate - matches historical behavior where dict layers were squashed before templating was applied
            except Exception as ex:
                raise AnsibleError(f'Error while resolving value for {key!r}.', obj=value) from ex

        raise KeyError(key)

    def keys(self) -> t.Iterable[t.Any]:
        """
        Returns a sorted iterable of all keys present in all source layers, without templating associated values.
        Values that resolve to `omit` are thus included.
        """
        return sorted(set(itertools.chain.from_iterable(self.sources)))

    def items(self) -> t.Iterable[t.Tuple[t.Any, t.Any]]:
        """
        Returns a sorted iterable of (key, templated value) tuples.
        Any tuple where the templated value resolves to `omit` will not be included in the result.
        """
        for key in self.keys():
            try:
                yield key, self.get(key)
            except KeyError:
                pass

    def as_dict(self) -> dict[t.Any, t.Any]:
        """Returns a dict representing all layers, squashed and templated, with `omit` values dropped."""
        return dict(self.items())
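The layering semantics above (first matching layer wins; a value resolving to `omit` removes the key entirely rather than falling through to lower layers) can be sketched without the templating machinery. `OMIT` here is a plain sentinel standing in for Ansible's omit behavior, and `LayeredLookup` is an illustrative name, not the real class:

```python
import itertools
import typing as t

OMIT = object()  # sentinel standing in for Ansible's `omit` (assumption for this sketch)


class LayeredLookup:
    """First-match-wins lookup across mapping layers; keys resolving to OMIT are dropped."""

    def __init__(self, *sources: dict) -> None:
        self.sources = sources

    def get(self, key: t.Any) -> t.Any:
        for source in self.sources:
            if key in source:
                value = source[key]
                if value is OMIT:
                    break  # omit obliterates the key, even if a lower layer defines it
                return value
        raise KeyError(key)

    def as_dict(self) -> dict:
        result = {}
        for key in sorted(set(itertools.chain.from_iterable(self.sources))):
            try:
                result[key] = self.get(key)
            except KeyError:
                pass
        return result


layers = LayeredLookup({'a': 1, 'b': OMIT}, {'b': 2, 'c': 3})
print(layers.as_dict())  # {'a': 1, 'c': 3}
```

Note that `'b'` is absent from the result even though the lower layer defines it: the break on `OMIT` mirrors the historical squash-then-template behavior the docstring describes.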
95
lib/ansible/_internal/_templating/_datatag.py
Normal file
from __future__ import annotations

import contextlib as _contextlib
import dataclasses
import typing as t

from ansible.module_utils._internal._datatag import AnsibleSingletonTagBase, _tag_dataclass_kwargs
from ansible.module_utils._internal._datatag._tags import Deprecated
from ansible._internal._datatag._tags import Origin
from ansible.utils.display import Display

from ._access import NotifiableAccessContextBase
from ._utils import TemplateContext


display = Display()


@dataclasses.dataclass(**_tag_dataclass_kwargs)
class _JinjaConstTemplate(AnsibleSingletonTagBase):
    # deprecated: description='embedded Jinja constant string template support' core_version='2.23'
    pass


@dataclasses.dataclass(frozen=True, kw_only=True, slots=True)
class _TrippedDeprecationInfo:
    template: str
    deprecated: Deprecated


class DeprecatedAccessAuditContext(NotifiableAccessContextBase):
    """When active, captures metadata about managed accesses to `Deprecated` tagged objects."""

    _type_interest = frozenset([Deprecated])

    @classmethod
    def when(cls, condition: bool, /) -> t.Self | _contextlib.nullcontext:
        """Returns a new instance if `condition` is True (usually `TemplateContext.is_top_level`), otherwise a `nullcontext` instance."""
        if condition:
            return cls()

        return _contextlib.nullcontext()

    def __init__(self) -> None:
        self._tripped_deprecation_info: dict[int, _TrippedDeprecationInfo] = {}

    def __exit__(self, exc_type, exc_val, exc_tb) -> None:
        result = super().__exit__(exc_type, exc_val, exc_tb)

        for item in self._tripped_deprecation_info.values():
            if Origin.is_tagged_on(item.template):
                msg = item.deprecated.msg
            else:
                # without an origin, we need to include what context we do have (the template)
                msg = f'While processing {item.template!r}: {item.deprecated.msg}'

            display._deprecated_with_plugin_info(
                msg=msg,
                help_text=item.deprecated.help_text,
                version=item.deprecated.removal_version,
                date=item.deprecated.removal_date,
                obj=item.template,
                plugin=item.deprecated.plugin,
            )

        return result

    def _notify(self, o: t.Any) -> None:
        deprecated = Deprecated.get_required_tag(o)
        deprecated_key = id(deprecated)

        if deprecated_key in self._tripped_deprecation_info:
            return  # record only the first access for each deprecated tag in a given context

        template_ctx = TemplateContext.current(optional=True)
        template = template_ctx.template_value if template_ctx else None

        # when the current template input is a container, provide a descriptive string with origin propagated (if possible)
        if not isinstance(template, str):
            # DTFIX-FUTURE: ascend the template stack to try and find the nearest string source template
            origin = Origin.get_tag(template)

            # DTFIX-RELEASE: this should probably use a synthesized description value on the tag
            # it is reachable from the data_tagging_controller test: ../playbook_output_validator/filter.py actual_stdout.txt actual_stderr.txt
            # -[DEPRECATION WARNING]: `something_old` is deprecated, don't use it! This feature will be removed in version 1.2.3.
            # +[DEPRECATION WARNING]: While processing '<<container>>': `something_old` is deprecated, don't use it! This feature will be removed in ...
            template = '<<container>>'

            if origin:
                origin.tag(template)

        self._tripped_deprecation_info[deprecated_key] = _TrippedDeprecationInfo(
            template=template,
            deprecated=deprecated,
        )
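The audit context above records only the first access per deprecation tag (keyed by the tag object's `id()`) and emits all collected warnings once on context exit, so a deprecated value touched many times inside one template produces a single warning. A standalone sketch of that collect-then-report pattern (the names `DeprecationAudit` and `notify` are illustrative, not the Ansible API):

```python
class DeprecationAudit:
    """Collects the first access per deprecation marker and reports them all once on exit."""

    def __init__(self) -> None:
        self._tripped: dict[int, str] = {}
        self.reported: list[str] = []

    def __enter__(self) -> 'DeprecationAudit':
        return self

    def __exit__(self, *exc_info) -> None:
        # report each unique deprecation exactly once, regardless of how many accesses occurred
        self.reported = list(self._tripped.values())

    def notify(self, marker: object, msg: str) -> None:
        key = id(marker)  # identity of the tag object dedupes repeated accesses
        if key in self._tripped:
            return
        self._tripped[key] = msg


old_option = object()
with DeprecationAudit() as audit:
    for _ in range(3):
        audit.notify(old_option, '`old_option` is deprecated')

print(audit.reported)  # ['`old_option` is deprecated']
```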
588
lib/ansible/_internal/_templating/_engine.py
Normal file
# (c) 2012-2014, Michael DeHaan <michael.dehaan@gmail.com>
|
||||
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import copy
|
||||
import dataclasses
|
||||
import enum
|
||||
import textwrap
|
||||
import typing as t
|
||||
import collections.abc as c
|
||||
import re
|
||||
|
||||
from collections import ChainMap
|
||||
|
||||
from ansible.errors import (
|
||||
AnsibleError,
|
||||
AnsibleValueOmittedError,
|
||||
AnsibleUndefinedVariable,
|
||||
AnsibleTemplateSyntaxError,
|
||||
AnsibleBrokenConditionalError,
|
||||
AnsibleTemplateTransformLimitError,
|
||||
TemplateTrustCheckFailedError,
|
||||
)
|
||||
|
||||
from ansible.module_utils._internal._datatag import AnsibleTaggedObject, NotTaggableError, AnsibleTagHelper
|
||||
from ansible._internal._errors._handler import Skippable
|
||||
from ansible._internal._datatag._tags import Origin, TrustedAsTemplate
|
||||
from ansible.utils.display import Display
|
||||
from ansible.utils.vars import validate_variable_name
|
||||
from ansible.parsing.dataloader import DataLoader
|
||||
|
||||
from ._datatag import DeprecatedAccessAuditContext
|
||||
from ._jinja_bits import (
|
||||
AnsibleTemplate,
|
||||
_TemplateCompileContext,
|
||||
TemplateOverrides,
|
||||
AnsibleEnvironment,
|
||||
defer_template_error,
|
||||
create_template_error,
|
||||
is_possibly_template,
|
||||
is_possibly_all_template,
|
||||
AnsibleTemplateExpression,
|
||||
_finalize_template_result,
|
||||
FinalizeMode,
|
||||
)
|
||||
from ._jinja_common import _TemplateConfig, MarkerError, ExceptionMarker
|
||||
from ._lazy_containers import _AnsibleLazyTemplateMixin
|
||||
from ._marker_behaviors import MarkerBehavior, FAIL_ON_UNDEFINED
|
||||
from ._transform import _type_transform_mapping
|
||||
from ._utils import Omit, TemplateContext, IGNORE_SCALAR_VAR_TYPES, LazyOptions
|
||||
from ...module_utils.datatag import native_type_name
|
||||
|
||||
_display = Display()
|
||||
|
||||
|
||||
_shared_empty_unmask_type_names: frozenset[str] = frozenset()
|
||||
|
||||
TRANSFORM_CHAIN_LIMIT: int = 10
|
||||
"""Arbitrary limit for chained transforms to prevent cycles; an exception will be raised if exceeded."""
|
||||
|
||||
|
||||
class TemplateMode(enum.Enum):
|
||||
# DTFIX-FUTURE: this enum ideally wouldn't exist - revisit/rename before making public
|
||||
DEFAULT = enum.auto()
|
||||
STOP_ON_TEMPLATE = enum.auto()
|
||||
STOP_ON_CONTAINER = enum.auto()
|
||||
ALWAYS_FINALIZE = enum.auto()
|
||||
|
||||
|
||||
@dataclasses.dataclass(kw_only=True, slots=True, frozen=True)
|
||||
class TemplateOptions:
|
||||
DEFAULT: t.ClassVar[t.Self]
|
||||
|
||||
value_for_omit: object = Omit
|
||||
escape_backslashes: bool = True
|
||||
preserve_trailing_newlines: bool = True
|
||||
# DTFIX-RELEASE: these aren't really overrides anymore, rename the dataclass and this field
|
||||
# also mention in docstring this has no effect unless used to template a string
|
||||
overrides: TemplateOverrides = TemplateOverrides.DEFAULT
|
||||
|
||||
|
||||
TemplateOptions.DEFAULT = TemplateOptions()
|
||||
|
||||
|
||||
class TemplateEncountered(Exception):
|
||||
pass
|
||||
|
||||
|
||||
class TemplateEngine:
|
||||
"""
|
||||
The main class for templating, with the main entry-point of template().
|
||||
"""
|
||||
|
||||
_sentinel = object()
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
loader: DataLoader | None = None,
|
||||
variables: dict[str, t.Any] | ChainMap[str, t.Any] | None = None,
|
||||
variables_factory: t.Callable[[], dict[str, t.Any] | ChainMap[str, t.Any]] | None = None,
|
||||
marker_behavior: MarkerBehavior | None = None,
|
||||
):
|
||||
self._loader = loader
|
||||
self._variables = variables
|
||||
self._variables_factory = variables_factory
|
||||
self._environment: AnsibleEnvironment | None = None
|
||||
|
||||
# inherit marker behavior from the active template context's templar unless otherwise specified
|
||||
if not marker_behavior:
|
||||
if template_ctx := TemplateContext.current(optional=True):
|
||||
marker_behavior = template_ctx.templar.marker_behavior
|
||||
else:
|
||||
marker_behavior = FAIL_ON_UNDEFINED
|
||||
|
||||
self._marker_behavior = marker_behavior
|
||||
|
||||
def copy(self) -> t.Self:
|
||||
new_engine = copy.copy(self)
|
||||
new_engine._environment = None
|
||||
|
||||
return new_engine
|
||||
|
||||
def extend(self, marker_behavior: MarkerBehavior | None = None) -> t.Self:
|
||||
# DTFIX-RELEASE: bikeshed name, supported features
|
||||
new_templar = type(self)(
|
||||
loader=self._loader,
|
||||
variables=self._variables,
|
||||
variables_factory=self._variables_factory,
|
||||
marker_behavior=marker_behavior or self._marker_behavior,
|
||||
)
|
||||
|
||||
if self._environment:
|
||||
new_templar._environment = self._environment
|
||||
|
||||
return new_templar
|
||||
|
||||
@property
|
||||
def marker_behavior(self) -> MarkerBehavior:
|
||||
return self._marker_behavior
|
||||
|
||||
@property
|
||||
def basedir(self) -> str:
|
||||
"""The basedir from DataLoader."""
|
||||
return self._loader.get_basedir() if self._loader else '.'
|
||||
|
||||
@property
|
||||
def environment(self) -> AnsibleEnvironment:
|
||||
if not self._environment:
|
||||
self._environment = AnsibleEnvironment(ansible_basedir=self.basedir)
|
||||
|
||||
return self._environment
|
||||
|
||||
def _create_overlay(self, template: str, overrides: TemplateOverrides) -> tuple[str, AnsibleEnvironment]:
|
||||
try:
|
||||
template, overrides = overrides._extract_template_overrides(template)
|
||||
except Exception as ex:
|
||||
raise AnsibleTemplateSyntaxError("Syntax error in template.", obj=template) from ex
|
||||
|
||||
env = self.environment
|
||||
|
||||
if overrides is not TemplateOverrides.DEFAULT and (overlay_kwargs := overrides.overlay_kwargs()):
|
||||
env = t.cast(AnsibleEnvironment, env.overlay(**overlay_kwargs))
|
||||
|
||||
return template, env
|
||||
|
||||
@staticmethod
|
||||
def _count_newlines_from_end(in_str):
|
||||
"""
|
||||
Counts the number of newlines at the end of a string. This is used during
|
||||
the jinja2 templating to ensure the count matches the input, since some newlines
|
||||
may be thrown away during the templating.
|
||||
"""
|
||||
|
||||
i = len(in_str)
|
||||
j = i - 1
|
||||
|
||||
try:
|
||||
while in_str[j] == '\n':
|
||||
j -= 1
|
||||
except IndexError:
|
||||
# Uncommon cases: zero length string and string containing only newlines
|
||||
return i
|
||||
|
||||
return i - 1 - j
|
||||
|
||||
@property
|
||||
def available_variables(self) -> dict[str, t.Any] | ChainMap[str, t.Any]:
|
||||
"""Available variables this instance will use when templating."""
|
||||
# DTFIX-RELEASE: ensure that we're always accessing this as a shallow container-level snapshot, and eliminate uses of anything
|
||||
# that directly mutates this value. _new_context may resolve this for us?
|
||||
if self._variables is None:
|
||||
self._variables = self._variables_factory() if self._variables_factory else {}
|
||||
|
||||
return self._variables
|
||||
|
||||
@available_variables.setter
|
||||
def available_variables(self, variables: dict[str, t.Any]) -> None:
|
||||
self._variables = variables
|
||||
|
||||
def resolve_variable_expression(
|
||||
self,
|
||||
expression: str,
|
||||
*,
|
||||
local_variables: dict[str, t.Any] | None = None,
|
||||
) -> t.Any:
|
||||
"""
|
||||
Resolve a potentially untrusted string variable expression consisting only of valid identifiers, integers, dots, and indexing containing these.
|
||||
Optional local variables may be provided, which can only be referenced directly by the given expression.
|
||||
Valid: x, x.y, x[y].z, x[1], 1, x[y.z]
|
||||
Error: 'x', x['y'], q('env')
|
||||
"""
|
||||
components = re.split(r'[.\[\]]', expression)
|
||||
|
||||
try:
|
||||
for component in components:
|
||||
if re.fullmatch('[0-9]*', component):
|
||||
continue # allow empty strings and integers
|
||||
|
||||
validate_variable_name(component)
|
||||
except Exception as ex:
|
||||
raise AnsibleError(f'Invalid variable expression: {expression}', obj=expression) from ex
|
||||
|
||||
return self.evaluate_expression(TrustedAsTemplate().tag(expression), local_variables=local_variables)
|
||||
|
||||
@staticmethod
|
||||
def variable_name_as_template(name: str) -> str:
|
||||
"""Return a trusted template string that will resolve the provided variable name. Raises an error if `name` is not a valid identifier."""
|
||||
validate_variable_name(name)
|
||||
return AnsibleTagHelper.tag('{{' + name + '}}', (AnsibleTagHelper.tags(name) | {TrustedAsTemplate()}))
|
||||
|
||||
def transform(self, variable: t.Any) -> t.Any:
|
||||
"""Recursively apply transformations to the given value and return the result."""
|
||||
return self.template(variable, mode=TemplateMode.ALWAYS_FINALIZE, lazy_options=LazyOptions.SKIP_TEMPLATES_AND_ACCESS)
|
||||
|
||||
def template(
|
||||
self,
|
||||
variable: t.Any, # DTFIX-RELEASE: once we settle the new/old API boundaries, rename this (here and in other methods)
|
||||
*,
|
||||
options: TemplateOptions = TemplateOptions.DEFAULT,
|
||||
mode: TemplateMode = TemplateMode.DEFAULT,
|
||||
lazy_options: LazyOptions = LazyOptions.DEFAULT,
|
||||
) -> t.Any:
|
||||
"""Templates (possibly recursively) any given data as input."""
|
||||
original_variable = variable
|
||||
|
||||
for _attempt in range(TRANSFORM_CHAIN_LIMIT):
|
||||
if variable is None or (value_type := type(variable)) in IGNORE_SCALAR_VAR_TYPES:
|
||||
return variable # quickly ignore supported scalar types which are not be templated
|
||||
|
||||
value_is_str = isinstance(variable, str)
|
||||
|
||||
if template_ctx := TemplateContext.current(optional=True):
|
||||
stop_on_template = template_ctx.stop_on_template
|
||||
else:
|
||||
stop_on_template = False
|
||||
|
||||
if mode is TemplateMode.STOP_ON_TEMPLATE:
|
||||
stop_on_template = True
|
||||
|
||||
with (
|
||||
TemplateContext(template_value=variable, templar=self, options=options, stop_on_template=stop_on_template) as ctx,
|
||||
DeprecatedAccessAuditContext.when(ctx.is_top_level),
|
||||
):
|
||||
try:
|
||||
if not value_is_str:
|
||||
# transforms are currently limited to non-str types as an optimization
|
||||
if (transform := _type_transform_mapping.get(value_type)) and value_type.__name__ not in lazy_options.unmask_type_names:
|
||||
variable = transform(variable)
|
||||
continue
|
||||
|
||||
template_result = _AnsibleLazyTemplateMixin._try_create(variable, lazy_options)
|
||||
elif not lazy_options.template:
|
||||
template_result = variable
|
||||
elif not is_possibly_template(variable, options.overrides):
|
||||
template_result = variable
|
||||
elif not self._trust_check(variable, skip_handler=stop_on_template):
|
||||
template_result = variable
|
||||
elif stop_on_template:
|
||||
raise TemplateEncountered()
|
||||
else:
|
||||
compiled_template = self._compile_template(variable, options)
|
||||
|
||||
template_result = compiled_template(self.available_variables)
|
||||
template_result = self._post_render_mutation(variable, template_result, options)
|
||||
except TemplateEncountered:
|
||||
raise
|
||||
except Exception as ex:
|
||||
template_result = defer_template_error(ex, variable, is_expression=False)
|
||||
|
||||
if ctx.is_top_level or mode is TemplateMode.ALWAYS_FINALIZE:
|
||||
template_result = self._finalize_top_level_template_result(
|
||||
variable, options, template_result, stop_on_container=mode is TemplateMode.STOP_ON_CONTAINER
|
||||
)
|
||||
|
||||
return template_result
|
||||
|
||||
raise AnsibleTemplateTransformLimitError(obj=original_variable)
|
||||
|
||||
@staticmethod
|
||||
def _finalize_top_level_template_result(
|
||||
variable: t.Any,
|
||||
options: TemplateOptions,
|
||||
template_result: t.Any,
|
||||
is_expression: bool = False,
|
||||
stop_on_container: bool = False,
|
||||
) -> t.Any:
|
||||
"""
|
||||
This method must be called for expressions and top-level templates to recursively finalize the result.
|
||||
This renders any embedded templates and triggers `Marker` and omit behaviors.
|
||||
"""
|
||||
try:
|
||||
if template_result is Omit:
|
||||
# When the template result is Omit, raise an AnsibleValueOmittedError if value_for_omit is Omit, otherwise return value_for_omit.
|
||||
# Other occurrences of Omit will simply drop out of containers during _finalize_template_result.
|
||||
if options.value_for_omit is Omit:
|
||||
raise AnsibleValueOmittedError()
|
||||
|
||||
return options.value_for_omit # trust that value_for_omit is an allowed type
|
||||
|
||||
if stop_on_container and type(template_result) in AnsibleTaggedObject._collection_types:
|
||||
# Use of stop_on_container implies the caller will perform necessary checks on values,
|
||||
# most likely by passing them back into the templating system.
|
||||
try:
|
||||
return template_result._non_lazy_copy()
|
||||
except AttributeError:
|
||||
return template_result # non-lazy containers are returned as-is
|
||||
|
||||
return _finalize_template_result(template_result, FinalizeMode.TOP_LEVEL)
|
||||
except TemplateEncountered:
|
||||
raise
|
||||
except Exception as ex:
|
||||
raise_from: BaseException
|
||||
|
||||
if isinstance(ex, MarkerError):
|
||||
exception_to_raise = ex.source._as_exception()
|
||||
|
||||
# MarkerError is never suitable for use as the cause of another exception, it is merely a raiseable container for the source marker
|
||||
# used for flow control (so its stack trace is rarely useful). However, if the source derives from a ExceptionMarker, its contained
|
||||
# exception (previously raised) should be used as the cause. Other sources do not contain exceptions, so cannot provide a cause.
|
||||
raise_from = exception_to_raise if isinstance(ex.source, ExceptionMarker) else None
|
||||
else:
|
||||
exception_to_raise = ex
|
||||
raise_from = ex
|
||||
|
||||
exception_to_raise = create_template_error(exception_to_raise, variable, is_expression)
|
||||
|
||||
if exception_to_raise is ex:
|
||||
raise # when the exception to raise is the active exception, just re-raise it
|
||||
|
||||
if exception_to_raise is raise_from:
|
||||
raise_from = exception_to_raise.__cause__ # preserve the exception's cause, if any, otherwise no cause will be used
|
||||
|
||||
raise exception_to_raise from raise_from # always raise from something to avoid the currently active exception becoming __context__
|
||||
|
||||
def _compile_template(self, template: str, options: TemplateOptions) -> t.Callable[[c.Mapping[str, t.Any]], t.Any]:
|
||||
# NOTE: Creating an overlay that lives only inside _compile_template means that overrides are not applied
|
||||
# when templating nested variables, where Templar.environment is used, not the overlay. They are, however,
|
||||
# applied to includes and imports.
|
||||
try:
|
||||
stripped_template, env = self._create_overlay(template, options.overrides)
|
||||
|
||||
with _TemplateCompileContext(escape_backslashes=options.escape_backslashes):
|
||||
return t.cast(AnsibleTemplate, env.from_string(stripped_template))
|
||||
except Exception as ex:
|
||||
return self._defer_jinja_compile_error(ex, template, False)
|
||||
|
||||
def _compile_expression(self, expression: str, options: TemplateOptions) -> t.Callable[[c.Mapping[str, t.Any]], t.Any]:
|
||||
"""
|
||||
Compile a Jinja expression, applying optional compile-time behavior via an environment overlay (if needed). The overlay is
|
||||
necessary to avoid mutating settings on the Templar's shared environment, which could be visible to other code running concurrently.
|
||||
In the specific case of escape_backslashes, the setting only applies to a top-level template at compile-time, not runtime, to
|
||||
ensure that any nested template calls (e.g., include and import) do not inherit the (lack of) escaping behavior.
|
||||
"""
|
||||
try:
|
||||
with _TemplateCompileContext(escape_backslashes=options.escape_backslashes):
|
||||
return AnsibleTemplateExpression(self.environment.compile_expression(expression, False))
|
||||
except Exception as ex:
|
||||
return self._defer_jinja_compile_error(ex, expression, True)
|
||||
|
||||
def _defer_jinja_compile_error(self, ex: Exception, variable: str, is_expression: bool) -> t.Callable[[c.Mapping[str, t.Any]], t.Any]:
|
||||
deferred_error = defer_template_error(ex, variable, is_expression=is_expression)
|
||||
|
||||
def deferred_exception(_jinja_vars: c.Mapping[str, t.Any]) -> t.Any:
|
||||
# a template/expression compile error always results in a single node representing the compile error
|
||||
return self.marker_behavior.handle_marker(deferred_error)
|
||||
|
||||
return deferred_exception
|
||||
|
||||
def _post_render_mutation(self, template: str, result: t.Any, options: TemplateOptions) -> t.Any:
|
||||
if options.preserve_trailing_newlines and isinstance(result, str):
|
||||
# The low level calls above do not preserve the newline
|
||||
# characters at the end of the input data, so we
|
||||
# calculate the difference in newlines and append them
|
||||
# to the resulting output for parity
|
||||
#
|
||||
# Using AnsibleEnvironment's keep_trailing_newline instead would
|
||||
# result in change in behavior when trailing newlines
|
||||
# would be kept also for included templates, for example:
|
||||
# "Hello {% include 'world.txt' %}!" would render as
|
||||
# "Hello world\n!\n" instead of "Hello world!\n".
|
||||
data_newlines = self._count_newlines_from_end(template)
|
||||
res_newlines = self._count_newlines_from_end(result)
|
||||
|
||||
if data_newlines > res_newlines:
|
||||
newlines = options.overrides.newline_sequence * (data_newlines - res_newlines)
|
||||
result = AnsibleTagHelper.tag_copy(result, result + newlines)
|
||||
|
||||
# If the input string template was source-tagged and the result is not, propagate the source tag to the new value.
|
||||
# This provides further contextual information when a template-derived value/var causes an error.
|
||||
if not Origin.is_tagged_on(result) and (origin := Origin.get_tag(template)):
|
||||
try:
|
||||
result = origin.tag(result)
|
||||
except NotTaggableError:
|
||||
pass # best effort- if we can't, oh well
|
||||
|
||||
return result
|
||||
|
||||
def is_template(self, data: t.Any, overrides: TemplateOverrides = TemplateOverrides.DEFAULT) -> bool:
|
||||
"""
|
||||
Evaluate the input data to determine if it contains a template, even if that template is invalid. Containers will be recursively searched.
|
||||
Objects subject to template-time transforms that do not yield a template are not considered templates by this method.
|
||||
Gating a conditional call to `template` with this method is redundant and inefficient -- request templating unconditionally instead.
|
||||
"""
|
||||
options = TemplateOptions(overrides=overrides) if overrides is not TemplateOverrides.DEFAULT else TemplateOptions.DEFAULT
|
||||
|
||||
try:
|
||||
self.template(data, options=options, mode=TemplateMode.STOP_ON_TEMPLATE)
|
||||
except TemplateEncountered:
|
||||
return True
|
||||
else:
|
||||
return False
|
||||
    def resolve_to_container(self, variable: t.Any, options: TemplateOptions = TemplateOptions.DEFAULT) -> t.Any:
        """
        Recursively resolve scalar string template input, stopping at the first container encountered (if any).
        Used for e.g. partial templating of task arguments, where the plugin needs to handle final resolution of some args internally.
        """
        return self.template(variable, options=options, mode=TemplateMode.STOP_ON_CONTAINER)

    def evaluate_expression(
        self,
        expression: str,
        *,
        local_variables: dict[str, t.Any] | None = None,
        escape_backslashes: bool = True,
        _render_jinja_const_template: bool = False,
    ) -> t.Any:
        """
        Evaluate a trusted string expression and return its result.
        Optional local variables may be provided, which can only be referenced directly by the given expression.
        """
        if not isinstance(expression, str):
            raise TypeError(f"Expressions must be {str!r}, got {type(expression)!r}.")

        options = TemplateOptions(escape_backslashes=escape_backslashes, preserve_trailing_newlines=False)

        with (
            TemplateContext(template_value=expression, templar=self, options=options, _render_jinja_const_template=_render_jinja_const_template) as ctx,
            DeprecatedAccessAuditContext.when(ctx.is_top_level),
        ):
            try:
                if not TrustedAsTemplate.is_tagged_on(expression):
                    raise TemplateTrustCheckFailedError(obj=expression)

                template_variables = ChainMap(local_variables, self.available_variables) if local_variables else self.available_variables
                compiled_template = self._compile_expression(expression, options)

                template_result = compiled_template(template_variables)
                template_result = self._post_render_mutation(expression, template_result, options)
            except Exception as ex:
                template_result = defer_template_error(ex, expression, is_expression=True)

        return self._finalize_top_level_template_result(expression, options, template_result, is_expression=True)

    _BROKEN_CONDITIONAL_ALLOWED_FRAGMENT = 'Broken conditionals are currently allowed because the `ALLOW_BROKEN_CONDITIONALS` configuration option is enabled.'
    _CONDITIONAL_AS_TEMPLATE_MSG = 'Conditionals should not be surrounded by templating delimiters such as {{ }} or {% %}.'

    def _strip_conditional_handle_empty(self, conditional) -> t.Any:
        """
        Strips leading/trailing whitespace from the input expression.
        If `ALLOW_BROKEN_CONDITIONALS` is enabled, None/empty is coerced to True (legacy behavior, deprecated).
        Otherwise, None/empty results in a broken conditional error being raised.
        """
        if isinstance(conditional, str):
            # Leading/trailing whitespace on conditional expressions is not a problem, except we can't tell if the expression is empty (which *is* a problem).
            # Always strip conditional input strings. Neither conditional expressions nor all-template conditionals have legit reasons to preserve
            # surrounding whitespace, and they complicate detection and processing of all-template fallback cases.
            conditional = AnsibleTagHelper.tag_copy(conditional, conditional.strip())

        if conditional in (None, ''):
            # deprecated backward-compatible behavior; None/empty input conditionals are always True
            if _TemplateConfig.allow_broken_conditionals:
                _display.deprecated(
                    msg='Empty conditional expression was evaluated as True.',
                    help_text=self._BROKEN_CONDITIONAL_ALLOWED_FRAGMENT,
                    obj=conditional,
                    version='2.23',
                )

                return True

            raise AnsibleBrokenConditionalError("Empty conditional expressions are not allowed.", obj=conditional)

        return conditional

    def _normalize_and_evaluate_conditional(self, conditional: str | bool) -> t.Any:
        """Validate and normalize a conditional input value, resolving allowed embedded template cases and evaluating the resulting expression."""
        conditional = self._strip_conditional_handle_empty(conditional)

        # this must follow `_strip_conditional_handle_empty`, since None/empty are coerced to bool (deprecated)
        if type(conditional) is bool:  # pylint: disable=unidiomatic-typecheck
            return conditional

        try:
            if not isinstance(conditional, str):
                if _TemplateConfig.allow_broken_conditionals:
                    # because the input isn't a string, the result will never be a bool; the broken conditional warning in the caller will apply to the result
                    return self.template(conditional, mode=TemplateMode.ALWAYS_FINALIZE)

                raise AnsibleBrokenConditionalError(message="Conditional expressions must be strings.", obj=conditional)

            if is_possibly_all_template(conditional):
                # Indirection of trusted expressions is always allowed. If the expression appears to be entirely wrapped in template delimiters,
                # we must resolve it. e.g. `when: "{{ some_var_resolving_to_a_trusted_expression_string }}"`.
                # Some invalid meta-templating corner cases may sneak through here (e.g., `when: '{{ "foo" }} == {{ "bar" }}'`); these will
                # result in an untrusted expression error.
                result = self.template(conditional, mode=TemplateMode.ALWAYS_FINALIZE)
                result = self._strip_conditional_handle_empty(result)

                if not isinstance(result, str):
                    _display.deprecated(msg=self._CONDITIONAL_AS_TEMPLATE_MSG, obj=conditional, version='2.23')

                    return result  # not an expression

                # The only allowed use of templates for conditionals is for indirect usage of an expression.
                # Any other usage should simply be an expression, not an attempt at meta templating.
                expression = result
            else:
                expression = conditional

            # Disable escape_backslashes when processing conditionals, to maintain backwards compatibility.
            # This is necessary because conditionals were previously evaluated using {% %}, which was *NOT* affected by escape_backslashes.
            # Now that conditionals use expressions, they would be affected by escape_backslashes if it was not disabled.
            return self.evaluate_expression(expression, escape_backslashes=False, _render_jinja_const_template=True)

        except AnsibleUndefinedVariable as ex:
            # DTFIX-FUTURE: we're only augmenting the message for context here; once we have proper contextual tracking, we can dump the re-raise
            raise AnsibleUndefinedVariable("Error while evaluating conditional.", obj=conditional) from ex

    def evaluate_conditional(self, conditional: str | bool) -> bool:
        """
        Evaluate a trusted string expression or boolean and return its boolean result. A non-boolean result will raise `AnsibleBrokenConditionalError`.
        The ALLOW_BROKEN_CONDITIONALS configuration option can temporarily relax this requirement, allowing truthy conditionals to succeed.
        """
        result = self._normalize_and_evaluate_conditional(conditional)

        if isinstance(result, bool):
            return result

        bool_result = bool(result)

        msg = (
            f'Conditional result was {textwrap.shorten(str(result), width=40)!r} of type {native_type_name(result)!r}, '
            f'which evaluates to {bool_result}. Conditionals must have a boolean result.'
        )

        if _TemplateConfig.allow_broken_conditionals:
            _display.deprecated(msg=msg, obj=conditional, help_text=self._BROKEN_CONDITIONAL_ALLOWED_FRAGMENT, version='2.23')

            return bool_result

        raise AnsibleBrokenConditionalError(msg, obj=conditional)
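As the comments above note, conditionals are now evaluated as bare Jinja expressions rather than being wrapped in `{% %}` blocks. Stock Jinja exposes a comparable primitive, `Environment.compile_expression`, which compiles an expression once and evaluates it against supplied variables; a minimal sketch of that idea (not Ansible's trust-checked implementation):

```python
import jinja2

env = jinja2.Environment()

# compile the expression once; the returned callable evaluates it
# against whatever variables are passed in
when = env.compile_expression("x > 3")

print(when(x=5))  # True
print(when(x=1))  # False
```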
    @staticmethod
    def _trust_check(value: str, skip_handler: bool = False) -> bool:
        """
        Return True if the given value is trusted for templating, otherwise return False.
        When the value is not trusted, a warning or error may be generated, depending on configuration.
        """
        if TrustedAsTemplate.is_tagged_on(value):
            return True

        if not skip_handler:
            with Skippable, _TemplateConfig.untrusted_template_handler.handle(TemplateTrustCheckFailedError, skip_on_ignore=True):
                raise TemplateTrustCheckFailedError(obj=value)

        return False
lib/ansible/_internal/_templating/_errors.py (new file, 28 lines)
@@ -0,0 +1,28 @@
from __future__ import annotations

from ansible.errors import AnsibleTemplatePluginError


class AnsibleTemplatePluginRuntimeError(AnsibleTemplatePluginError):
    """The specified template plugin (lookup/filter/test) raised an exception during execution."""

    def __init__(self, plugin_type: str, plugin_name: str) -> None:
        super().__init__(f'The {plugin_type} plugin {plugin_name!r} failed.')


class AnsibleTemplatePluginLoadError(AnsibleTemplatePluginError):
    """The specified template plugin (lookup/filter/test) failed to load."""

    def __init__(self, plugin_type: str, plugin_name: str) -> None:
        super().__init__(f'The {plugin_type} plugin {plugin_name!r} failed to load.')


class AnsibleTemplatePluginNotFoundError(AnsibleTemplatePluginError, KeyError):
    """
    The specified template plugin (lookup/filter/test) was not found.
    This exception extends KeyError since Jinja filter/test resolution requires a KeyError to detect missing plugins.
    Jinja compilation fails if a non-KeyError is raised for a missing filter/test, even if the plugin will not be invoked (inconsistent with stock Jinja).
    """

    def __init__(self, plugin_type: str, plugin_name: str) -> None:
        super().__init__(f'The {plugin_type} plugin {plugin_name!r} was not found.')
lib/ansible/_internal/_templating/_jinja_bits.py (new file, 1066 lines)
File diff suppressed because it is too large

lib/ansible/_internal/_templating/_jinja_common.py (new file, 332 lines)
@@ -0,0 +1,332 @@
from __future__ import annotations

import abc
import collections.abc as c
import inspect
import itertools
import typing as t

from jinja2 import UndefinedError, StrictUndefined, TemplateRuntimeError
from jinja2.utils import missing

from ansible.module_utils.common.messages import ErrorSummary, Detail
from ansible.constants import config
from ansible.errors import AnsibleUndefinedVariable, AnsibleTypeError
from ansible._internal._errors._handler import ErrorHandler
from ansible.module_utils._internal._datatag import Tripwire, _untaggable_types

from ._access import NotifiableAccessContextBase
from ._jinja_patches import _patch_jinja
from ._utils import TemplateContext
from .._errors import _captured
from ...module_utils.datatag import native_type_name

_patch_jinja()  # apply Jinja2 patches before types are declared that are dependent on the changes


class _TemplateConfig:
    allow_embedded_templates: bool = config.get_config_value("ALLOW_EMBEDDED_TEMPLATES")
    allow_broken_conditionals: bool = config.get_config_value('ALLOW_BROKEN_CONDITIONALS')
    jinja_extensions: list[str] = config.get_config_value('DEFAULT_JINJA2_EXTENSIONS')

    unknown_type_encountered_handler = ErrorHandler.from_config('_TEMPLAR_UNKNOWN_TYPE_ENCOUNTERED')
    unknown_type_conversion_handler = ErrorHandler.from_config('_TEMPLAR_UNKNOWN_TYPE_CONVERSION')
    untrusted_template_handler = ErrorHandler.from_config('_TEMPLAR_UNTRUSTED_TEMPLATE_BEHAVIOR')


class MarkerError(UndefinedError):
    """
    An Ansible-specific subclass of Jinja's UndefinedError, used to preserve and later restore the original Marker instance that raised the error.
    This error is only raised by Marker and should never escape the templating system.
    """

    def __init__(self, message: str, source: Marker) -> None:
        super().__init__(message)

        self.source = source


class Marker(StrictUndefined, Tripwire):
    """
    Extends Jinja's `StrictUndefined`, allowing any kind of error occurring during recursive templating operations to be captured and deferred.
    Direct or managed access to most `Marker` attributes will raise a `MarkerError`, which usually ends the current innermost templating
    operation and converts the `MarkerError` back to the origin Marker instance (subject to the `MarkerBehavior` in effect at the time).
    """

    __slots__ = ('_marker_template_source',)

    concrete_subclasses: t.ClassVar[set[type[Marker]]] = set()

    def __init__(
        self,
        hint: t.Optional[str] = None,
        obj: t.Any = missing,
        name: t.Optional[str] = None,
        exc: t.Type[TemplateRuntimeError] = UndefinedError,  # Ansible doesn't set this argument or consume the attribute it is stored under.
        *args,
        _no_template_source=False,
        **kwargs,
    ) -> None:
        if not hint and name and obj is not missing:
            hint = f"object of type {native_type_name(obj)!r} has no attribute {name!r}"

        kwargs.update(
            hint=hint,
            obj=obj,
            name=name,
            exc=exc,
        )

        super().__init__(*args, **kwargs)

        if _no_template_source:
            self._marker_template_source = None
        else:
            self._marker_template_source = TemplateContext.current().template_value

    def _as_exception(self) -> Exception:
        """Return the exception instance to raise in a top-level templating context."""
        return AnsibleUndefinedVariable(self._undefined_message, obj=self._marker_template_source)

    def _as_message(self) -> str:
        """Return the error message to show when this marker must be represented as a string, such as for substitutions or warnings."""
        return self._undefined_message

    def _fail_with_undefined_error(self, *args: t.Any, **kwargs: t.Any) -> t.NoReturn:
        """Ansible-specific replacement for Jinja's _fail_with_undefined_error tripwire on dunder methods."""
        self.trip()

    def trip(self) -> t.NoReturn:
        """Raise an internal exception which can be converted back to this instance."""
        raise MarkerError(self._undefined_message, self)

    def __setattr__(self, name: str, value: t.Any) -> None:
        """
        Any attempt to set an unknown attribute on a `Marker` should invoke the trip method to propagate the original context.
        This does not protect against mutation of known attributes, but the implementation is fairly simple.
        """
        try:
            super().__setattr__(name, value)
        except AttributeError:
            pass
        else:
            return

        self.trip()

    def __getattr__(self, name: str) -> t.Any:
        """Raises AttributeError for dunder-looking accesses, self-propagates otherwise."""
        if name.startswith('__') and name.endswith('__'):
            raise AttributeError(name)

        return self

    def __getitem__(self, key):
        """Self-propagates on all item accesses."""
        return self

    @classmethod
    def __init_subclass__(cls, **kwargs) -> None:
        if not inspect.isabstract(cls):
            _untaggable_types.add(cls)
            cls.concrete_subclasses.add(cls)

    @classmethod
    def _init_class(cls):
        _untaggable_types.add(cls)

        # These are the methods StrictUndefined already intercepts.
        jinja_method_names = (
            '__add__',
            '__bool__',
            '__call__',
            '__complex__',
            '__contains__',
            '__div__',
            '__eq__',
            '__float__',
            '__floordiv__',
            '__ge__',
            # '__getitem__',  # using a custom implementation that propagates self instead
            '__gt__',
            '__hash__',
            '__int__',
            '__iter__',
            '__le__',
            '__len__',
            '__lt__',
            '__mod__',
            '__mul__',
            '__ne__',
            '__neg__',
            '__pos__',
            '__pow__',
            '__radd__',
            '__rdiv__',
            '__rfloordiv__',
            '__rmod__',
            '__rmul__',
            '__rpow__',
            '__rsub__',
            '__rtruediv__',
            '__str__',
            '__sub__',
            '__truediv__',
        )

        # These additional methods should be intercepted, even though they are not intercepted by StrictUndefined.
        additional_method_names = (
            '__aiter__',
            '__delattr__',
            '__format__',
            '__repr__',
            '__setitem__',
        )

        for name in jinja_method_names + additional_method_names:
            setattr(cls, name, cls._fail_with_undefined_error)


Marker._init_class()
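`Marker` builds on Jinja's `StrictUndefined`, whose defining behavior is that nearly every interaction with an undefined value (including stringification) raises an `UndefinedError` at render time rather than silently producing empty output. A minimal illustration using stock Jinja, without any of the Ansible-specific deferral machinery:

```python
from jinja2 import Environment, StrictUndefined
from jinja2.exceptions import UndefinedError

env = Environment(undefined=StrictUndefined)

try:
    # rendering forces __str__ on the undefined value, which StrictUndefined intercepts
    env.from_string("{{ missing_var }}").render()
except UndefinedError as ex:
    print(f"error surfaced at render time: {ex}")
```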
class TruncationMarker(Marker):
    """
    A `Marker` value was previously encountered and reported.
    A subsequent `Marker` value (this instance) indicates the template may have been truncated as a result.
    It will only be visible if the previous `Marker` was ignored/replaced instead of being tripped, which would raise an exception.
    """

    # DTFIX-RELEASE: make this a singleton?

    __slots__ = ()

    def __init__(self) -> None:
        super().__init__(hint='template potentially truncated')


class UndefinedMarker(Marker):
    """A `Marker` value that represents an undefined value encountered during templating."""

    __slots__ = ()


class ExceptionMarker(Marker, metaclass=abc.ABCMeta):
    """Base `Marker` class that represents exceptions encountered and deferred during templating."""

    __slots__ = ()

    @abc.abstractmethod
    def _as_exception(self) -> Exception:
        pass

    def _as_message(self) -> str:
        return str(self._as_exception())

    def trip(self) -> t.NoReturn:
        """Raise an internal exception which can be converted back to this instance while maintaining the cause for callers that follow them."""
        raise MarkerError(self._undefined_message, self) from self._as_exception()


class CapturedExceptionMarker(ExceptionMarker):
    """A `Marker` value that represents an exception raised during templating."""

    __slots__ = ('_marker_captured_exception',)

    def __init__(self, exception: Exception) -> None:
        super().__init__(hint=f'A captured exception marker was tripped: {exception}')

        self._marker_captured_exception = exception

    def _as_exception(self) -> Exception:
        return self._marker_captured_exception


class UndecryptableVaultError(_captured.AnsibleCapturedError):
    """Template-external error raised by VaultExceptionMarker when an undecryptable variable is accessed."""

    context = 'vault'
    _default_message = "Attempt to use undecryptable variable."


class VaultExceptionMarker(ExceptionMarker):
    """A `Marker` value that represents an error accessing a vaulted value during templating."""

    __slots__ = ('_marker_undecryptable_ciphertext', '_marker_undecryptable_reason', '_marker_undecryptable_traceback')

    def __init__(self, ciphertext: str, reason: str, traceback: str | None) -> None:
        # DTFIX-RELEASE: when does this show up, should it contain more details?
        # see also CapturedExceptionMarker for a similar issue
        super().__init__(hint='A vault exception marker was tripped.')

        self._marker_undecryptable_ciphertext = ciphertext
        self._marker_undecryptable_reason = reason
        self._marker_undecryptable_traceback = traceback

    def _as_exception(self) -> Exception:
        return UndecryptableVaultError(
            obj=self._marker_undecryptable_ciphertext,
            error_summary=ErrorSummary(
                details=(
                    Detail(
                        msg=self._marker_undecryptable_reason,
                    ),
                ),
                formatted_traceback=self._marker_undecryptable_traceback,
            ),
        )

    def _disarm(self) -> str:
        return self._marker_undecryptable_ciphertext


def get_first_marker_arg(args: c.Sequence, kwargs: dict[str, t.Any]) -> Marker | None:
    """Utility method to inspect plugin args and return the first `Marker` encountered, otherwise `None`."""
    # DTFIX-RELEASE: this may or may not need to be public API; move back to utils, or wrap usage in a decorator?
    for arg in iter_marker_args(args, kwargs):
        return arg

    return None


def iter_marker_args(args: c.Sequence, kwargs: dict[str, t.Any]) -> t.Generator[Marker]:
    """Utility method to iterate plugin args and yield any `Marker` encountered."""
    # DTFIX-RELEASE: this may or may not need to be public API; move back to utils, or wrap usage in a decorator?
    for arg in itertools.chain(args, kwargs.values()):
        if isinstance(arg, Marker):
            yield arg


class JinjaCallContext(NotifiableAccessContextBase):
    """
    An audit context that wraps all Jinja (template/filter/test/lookup/method/function) calls.
    While active, calls `trip()` on managed access of `Marker` objects unless the callee declares an understanding of markers.
    """

    _mask = True

    def __init__(self, accept_lazy_markers: bool) -> None:
        self._type_interest = frozenset() if accept_lazy_markers else frozenset(Marker.concrete_subclasses)

    def _notify(self, o: Marker) -> t.NoReturn:
        o.trip()


def validate_arg_type(name: str, value: t.Any, allowed_type_or_types: type | tuple[type, ...], /) -> None:
    """Validate the type of the given argument while preserving context for Marker values."""
    # DTFIX-RELEASE: find a home for this as a general-purpose utility method and expose it after some API review
    if isinstance(value, allowed_type_or_types):
        return

    if isinstance(allowed_type_or_types, type):
        arg_type_description = repr(native_type_name(allowed_type_or_types))
    else:
        arg_type_description = ' or '.join(repr(native_type_name(item)) for item in allowed_type_or_types)

    if isinstance(value, Marker):
        try:
            value.trip()
        except Exception as ex:
            raise AnsibleTypeError(f"The {name!r} argument must be of type {arg_type_description}.", obj=value) from ex

    raise AnsibleTypeError(f"The {name!r} argument must be of type {arg_type_description}, not {native_type_name(value)!r}.", obj=value)
lib/ansible/_internal/_templating/_jinja_patches.py (new file, 44 lines)
@@ -0,0 +1,44 @@
"""Runtime patches for Jinja bugs affecting Ansible."""

from __future__ import annotations

import jinja2
import jinja2.utils


def _patch_jinja_undefined_slots() -> None:
    """
    Fix the broken __slots__ on Jinja's Undefined and StrictUndefined if they're missing in the current version.
    This will no longer be necessary once the fix is included in the minimum supported Jinja version.
    See: https://github.com/pallets/jinja/issues/2025
    """
    if not hasattr(jinja2.Undefined, '__slots__'):
        jinja2.Undefined.__slots__ = (
            "_undefined_hint",
            "_undefined_obj",
            "_undefined_name",
            "_undefined_exception",
        )

    if not hasattr(jinja2.StrictUndefined, '__slots__'):
        jinja2.StrictUndefined.__slots__ = ()


def _patch_jinja_missing_type() -> None:
    """
    Fix the `jinja2.utils.missing` type to support pickling while remaining a singleton.
    This will no longer be necessary once the fix is included in the minimum supported Jinja version.
    See: https://github.com/pallets/jinja/issues/2027
    """
    if getattr(jinja2.utils.missing, '__reduce__')() != 'missing':

        def __reduce__(*_args):
            return 'missing'

        type(jinja2.utils.missing).__reduce__ = __reduce__


def _patch_jinja() -> None:
    """Apply Jinja2 patches."""
    _patch_jinja_undefined_slots()
    _patch_jinja_missing_type()
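The `missing` patch relies on pickle's convention that when `__reduce__` returns a string, the object is pickled by global-name lookup and restored as the same module-level object, preserving singleton identity across a round trip. A self-contained sketch of that mechanism (the `_MissingType`/`missing` names here are illustrative, not Jinja's internals):

```python
import pickle

class _MissingType:
    """Sentinel whose pickled form is the name of the module-level singleton."""

    def __repr__(self):
        return "missing"

    def __reduce__(self):
        # returning a string tells pickle to restore by looking up this global name
        return "missing"

missing = _MissingType()

# identity survives the round trip because unpickling resolves the global name
restored = pickle.loads(pickle.dumps(missing))
print(restored is missing)  # True
```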
lib/ansible/_internal/_templating/_jinja_plugins.py (new file, 351 lines)
@@ -0,0 +1,351 @@
"""Jinja template plugins (filters, tests, lookups) and custom global functions."""

from __future__ import annotations

import collections.abc as c
import dataclasses
import datetime
import functools
import typing as t

from ansible.errors import (
    AnsibleTemplatePluginError,
)

from ansible.module_utils._internal._ambient_context import AmbientContextBase
from ansible.module_utils._internal._plugin_exec_context import PluginExecContext
from ansible.module_utils.common.collections import is_sequence
from ansible.module_utils._internal._datatag import AnsibleTagHelper
from ansible._internal._datatag._tags import TrustedAsTemplate
from ansible.plugins import AnsibleJinja2Plugin
from ansible.plugins.loader import lookup_loader, Jinja2Loader
from ansible.plugins.lookup import LookupBase
from ansible.utils.display import Display

from ._datatag import _JinjaConstTemplate
from ._errors import AnsibleTemplatePluginRuntimeError, AnsibleTemplatePluginLoadError, AnsibleTemplatePluginNotFoundError
from ._jinja_common import MarkerError, _TemplateConfig, get_first_marker_arg, Marker, JinjaCallContext
from ._lazy_containers import lazify_container_kwargs, lazify_container_args, lazify_container, _AnsibleLazyTemplateMixin
from ._utils import LazyOptions, TemplateContext

_display = Display()

_TCallable = t.TypeVar("_TCallable", bound=t.Callable)
_ITERATOR_TYPES: t.Final = (c.Iterator, c.ItemsView, c.KeysView, c.ValuesView, range)


class JinjaPluginIntercept(c.MutableMapping):
    """
    Simulated dict class that loads Jinja2 plugins on request;
    otherwise all plugins would need to be loaded a priori.

    NOTE: plugin_loader still loads all 'builtin/legacy' plugins at
    startup, so only collection plugins are really loaded on request.
    """

    def __init__(self, jinja_builtins: c.Mapping[str, AnsibleJinja2Plugin], plugin_loader: Jinja2Loader):
        super(JinjaPluginIntercept, self).__init__()

        self._plugin_loader = plugin_loader
        self._jinja_builtins = jinja_builtins
        self._wrapped_funcs: dict[str, t.Callable] = {}

    def _wrap_and_set_func(self, instance: AnsibleJinja2Plugin) -> t.Callable:
        if self._plugin_loader.type == 'filter':
            plugin_func = self._wrap_filter(instance)
        else:
            plugin_func = self._wrap_test(instance)

        self._wrapped_funcs[instance._load_name] = plugin_func

        return plugin_func

    def __getitem__(self, key: str) -> t.Callable:
        instance: AnsibleJinja2Plugin | None = None
        plugin_func: t.Callable[..., t.Any] | None

        if plugin_func := self._wrapped_funcs.get(key):
            return plugin_func

        try:
            instance = self._plugin_loader.get(key)
        except KeyError:
            # The plugin name was invalid or no plugin was found by that name.
            pass
        except Exception as ex:
            # An unexpected exception occurred.
            raise AnsibleTemplatePluginLoadError(self._plugin_loader.type, key) from ex

        if not instance:
            try:
                instance = self._jinja_builtins[key]
            except KeyError:
                raise AnsibleTemplatePluginNotFoundError(self._plugin_loader.type, key) from None

        plugin_func = self._wrap_and_set_func(instance)

        return plugin_func

    def __setitem__(self, key: str, value: t.Callable) -> None:
        self._wrap_and_set_func(self._plugin_loader._wrap_func(key, key, value))

    def __delitem__(self, key):
        raise NotImplementedError()

    def __contains__(self, item: t.Any) -> bool:
        try:
            self.__getitem__(item)
        except AnsibleTemplatePluginLoadError:
            return True
        except AnsibleTemplatePluginNotFoundError:
            return False

        return True

    def __iter__(self):
        raise NotImplementedError()  # dynamic container

    def __len__(self):
        raise NotImplementedError()  # dynamic container
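`JinjaPluginIntercept` above simulates a dict so that Jinja only triggers plugin loading the first time a name is looked up. The general shape of such a lazy mapping can be sketched independently of Ansible's loaders (the `LazyMapping` name and factory protocol here are illustrative assumptions):

```python
import collections.abc as c

class LazyMapping(c.MutableMapping):
    """Simulated dict that builds values on first access via a factory callable."""

    def __init__(self, factory):
        self._factory = factory  # called with the key; raises KeyError for unknown names
        self._cache = {}

    def __getitem__(self, key):
        if key not in self._cache:
            self._cache[key] = self._factory(key)
        return self._cache[key]

    def __setitem__(self, key, value):
        self._cache[key] = value

    def __delitem__(self, key):
        raise NotImplementedError()

    def __iter__(self):
        raise NotImplementedError()  # dynamic container; contents are not enumerable

    def __len__(self):
        raise NotImplementedError()  # dynamic container

filters = LazyMapping(lambda name: {"upper": str.upper}[name])
print(filters["upper"]("abc"))  # ABC
```

Note that the `in` operator still works without `__iter__`: the `Mapping` mixin implements `__contains__` by attempting `__getitem__` and catching `KeyError`, which is the same trick the real intercept class relies on.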
    @staticmethod
    def _invoke_plugin(instance: AnsibleJinja2Plugin, *args, **kwargs) -> t.Any:
        if not instance.accept_args_markers:
            if (first_marker := get_first_marker_arg(args, kwargs)) is not None:
                return first_marker

        try:
            with JinjaCallContext(accept_lazy_markers=instance.accept_lazy_markers), PluginExecContext(executing_plugin=instance):
                return instance.j2_function(*lazify_container_args(args), **lazify_container_kwargs(kwargs))
        except MarkerError as ex:
            return ex.source
        except Exception as ex:
            raise AnsibleTemplatePluginRuntimeError(instance.plugin_type, instance.ansible_name) from ex  # DTFIX-RELEASE: which name to use? use plugin info?

    def _wrap_test(self, instance: AnsibleJinja2Plugin) -> t.Callable:
        """Intercept point for all test plugins to ensure that args are properly templated/lazified."""

        @functools.wraps(instance.j2_function)
        def wrapper(*args, **kwargs) -> bool | Marker:
            result = self._invoke_plugin(instance, *args, **kwargs)

            if not isinstance(result, bool):
                template = TemplateContext.current().template_value

                # DTFIX-RELEASE: which name to use? use plugin info?
                _display.deprecated(
                    msg=f"The test plugin {instance.ansible_name!r} returned a non-boolean result of type {type(result)!r}. "
                    "Test plugins must have a boolean result.",
                    obj=template,
                    version="2.23",
                )

                result = bool(result)

            return result

        return wrapper

    def _wrap_filter(self, instance: AnsibleJinja2Plugin) -> t.Callable:
        """Intercept point for all filter plugins to ensure that args are properly templated/lazified."""

        @functools.wraps(instance.j2_function)
        def wrapper(*args, **kwargs) -> t.Any:
            result = self._invoke_plugin(instance, *args, **kwargs)
            result = _wrap_plugin_output(result)

            return result

        return wrapper


class _DirectCall:
    """Functions/methods marked `_DirectCall` bypass Jinja Environment checks for `Marker`."""

    _marker_attr: str = "_directcall"

    @classmethod
    def mark(cls, src: _TCallable) -> _TCallable:
        setattr(src, cls._marker_attr, True)
        return src

    @classmethod
    def is_marked(cls, value: t.Callable) -> bool:
        return callable(value) and getattr(value, "_directcall", False)
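The `_DirectCall` helper is an instance of a standard Python pattern: tagging a callable with a sentinel attribute so later machinery can recognize it and skip wrapping. A generic, self-contained sketch of the same idea (names are illustrative, not Ansible's API):

```python
from typing import Callable, TypeVar

F = TypeVar("F", bound=Callable)

_MARK = "_directcall"

def mark(func: F) -> F:
    """Tag a callable so later machinery can recognize it and bypass checks."""
    setattr(func, _MARK, True)
    return func

def is_marked(value) -> bool:
    # non-callables and untagged callables are both unmarked
    return callable(value) and getattr(value, _MARK, False)

@mark
def lookup(name):
    return name

print(is_marked(lookup))  # True
print(is_marked(print))   # False
```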
@_DirectCall.mark
|
||||
def _query(plugin_name: str, /, *args, **kwargs) -> t.Any:
|
||||
"""wrapper for lookup, force wantlist true"""
|
||||
kwargs['wantlist'] = True
|
||||
return _invoke_lookup(plugin_name=plugin_name, lookup_terms=list(args), lookup_kwargs=kwargs)
|
||||
|
||||
|
||||
@_DirectCall.mark
|
||||
def _lookup(plugin_name: str, /, *args, **kwargs) -> t.Any:
|
||||
# convert the args tuple to a list, since some plugins make a poor assumption that `run.args` is a list
|
||||
return _invoke_lookup(plugin_name=plugin_name, lookup_terms=list(args), lookup_kwargs=kwargs)
|
||||
|
||||
|
||||
@dataclasses.dataclass
|
||||
class _LookupContext(AmbientContextBase):
|
||||
"""Ambient context that wraps lookup execution, providing information about how it was invoked."""
|
||||
|
||||
invoked_as_with: bool


@_DirectCall.mark
def _invoke_lookup(*, plugin_name: str, lookup_terms: list, lookup_kwargs: dict[str, t.Any], invoked_as_with: bool = False) -> t.Any:
    templar = TemplateContext.current().templar

    from ansible import template as _template

    try:
        instance: LookupBase | None = lookup_loader.get(plugin_name, loader=templar._loader, templar=_template.Templar._from_template_engine(templar))
    except Exception as ex:
        raise AnsibleTemplatePluginLoadError('lookup', plugin_name) from ex

    if instance is None:
        raise AnsibleTemplatePluginNotFoundError('lookup', plugin_name)

    # if the lookup doesn't understand `Marker` and there's at least one in the top level, short-circuit by returning the first one we found
    if not instance.accept_args_markers and (first_marker := get_first_marker_arg(lookup_terms, lookup_kwargs)) is not None:
        return first_marker

    # don't pass these through to the lookup
    wantlist = lookup_kwargs.pop('wantlist', False)
    errors = lookup_kwargs.pop('errors', 'strict')

    with (
        JinjaCallContext(accept_lazy_markers=instance.accept_lazy_markers),
        PluginExecContext(executing_plugin=instance),
    ):
        try:
            if _TemplateConfig.allow_embedded_templates:
                # for backwards compat, only trust constant templates in lookup terms
                with JinjaCallContext(accept_lazy_markers=True):
                    # Force lazy marker support on for this call; the plugin's understanding is irrelevant, as is any existing context,
                    # since this backward compat code always understands markers.
                    lookup_terms = [templar.template(value) for value in _trust_jinja_constants(lookup_terms)]

                # since embedded template support is enabled, repeat the check for `Marker` on lookup_terms, since a template may render as a `Marker`
                if not instance.accept_args_markers and (first_marker := get_first_marker_arg(lookup_terms, {})) is not None:
                    return first_marker
            else:
                lookup_terms = AnsibleTagHelper.tag_copy(lookup_terms, (lazify_container(value) for value in lookup_terms), value_type=list)

            with _LookupContext(invoked_as_with=invoked_as_with):
                # The lookup context currently only supports the internal use-case where `first_found` requires extra info when invoked via `with_first_found`.
                # The context may be public API in the future, but for now, other plugins should not implement this kind of dynamic behavior,
                # though we're stuck with it for backward compatibility on `first_found`.
                lookup_res = instance.run(lookup_terms, variables=templar.available_variables, **lazify_container_kwargs(lookup_kwargs))

            # DTFIX-FUTURE: Consider allowing/requiring lookup plugins to declare how their result should be handled.
            # Currently, there are multiple behaviors that are less than ideal and poorly documented (or not at all):
            # * When `errors=warn` or `errors=ignore` the result is `None` unless `wantlist=True`, in which case the result is `[]`.
            # * The user must specify `wantlist=True` to receive the plugin return value unmodified.
            #   A plugin can achieve similar results by wrapping its result in a list -- unless of course the user specifies `wantlist=True`.
            # * When `wantlist=True` is specified, the result is not guaranteed to be a list as the option implies (except on plugin error).
            # * Sequences are munged unless the user specifies `wantlist=True`:
            #   * len() == 0 - Return an empty sequence.
            #   * len() == 1 - Return the only element in the sequence.
            #   * len() >= 2 when all elements are `str` - Return all the values joined into a single comma separated string.
            #   * len() >= 2 when at least one element is not `str` - Return the sequence as-is.

            if not is_sequence(lookup_res):
                # DTFIX-FUTURE: deprecate return types which are not a list
                #   previously non-Sequence return types were deprecated and then became an error in 2.18
                #   however, the deprecation message (and this error) mention `list` specifically rather than `Sequence`
                #   letting non-list values through will trigger variable type checking warnings/errors
                raise TypeError(f'returned {type(lookup_res)} instead of {list}')
        except MarkerError as ex:
            return ex.source
        except Exception as ex:
            # DTFIX-RELEASE: convert this to the new error/warn/ignore context manager
            if isinstance(ex, AnsibleTemplatePluginError):
                msg = f'Lookup failed but the error is being ignored: {ex}'
            else:
                msg = f'An unhandled exception occurred while running the lookup plugin {plugin_name!r}. Error was a {type(ex)}, original message: {ex}'

            if errors == 'warn':
                _display.warning(msg)
            elif errors == 'ignore':
                _display.display(msg, log_only=True)
            else:
                raise AnsibleTemplatePluginRuntimeError('lookup', plugin_name) from ex

            return [] if wantlist else None

    if not wantlist and lookup_res:
        # when wantlist=False the lookup result is either partially delazified (single element) or fully delazified (multiple elements)

        if len(lookup_res) == 1:
            lookup_res = lookup_res[0]
        else:
            try:
                lookup_res = ",".join(lookup_res)  # for backwards compatibility, attempt to join the result into a single string
            except TypeError:
                pass  # for backwards compatibility, return the result as-is when the sequence contains non-string values

    return _wrap_plugin_output(lookup_res)
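The result munging documented in the comments above can be exercised in isolation. This is an illustrative reimplementation of the documented `wantlist=False` coercion rules, not the actual Ansible helper:

```python
def munge_lookup_result(lookup_res: list, wantlist: bool = False):
    """Apply the documented wantlist=False coercions to a lookup result list (sketch)."""
    if wantlist or not lookup_res:
        # wantlist=True and empty sequences pass through unmodified
        return lookup_res

    if len(lookup_res) == 1:
        # a single element is unwrapped from its sequence
        return lookup_res[0]

    try:
        # all-str sequences collapse to a single comma separated string
        return ",".join(lookup_res)
    except TypeError:
        # sequences containing any non-str element are returned as-is
        return lookup_res
```

These rules are why `query()` (which forces `wantlist=True`) is generally preferable when the caller wants the plugin's return value untouched.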


def _now(utc=False, fmt=None):
    """Jinja2 global function (now) to return current datetime, potentially formatted via strftime."""
    if utc:
        now = datetime.datetime.now(datetime.timezone.utc).replace(tzinfo=None)
    else:
        now = datetime.datetime.now()

    if fmt:
        return now.strftime(fmt)

    return now


def _jinja_const_template_warning(value: object, is_conditional: bool) -> None:
    """Issue a warning regarding embedded template usage."""
    help_text = "Use inline expressions, for example: "

    if is_conditional:
        help_text += """`when: "{{ a_var }}" == 42` becomes `when: a_var == 42`"""
    else:
        help_text += """`msg: "{{ lookup('env', '{{ a_var }}') }}"` becomes `msg: "{{ lookup('env', a_var) }}"`"""

    # deprecated: description='disable embedded templates by default and deprecate the feature' core_version='2.23'
    _display.warning(
        msg="Jinja constant strings should not contain embedded templates. This feature will be disabled by default in ansible-core 2.23.",
        obj=value,
        help_text=help_text,
    )


def _trust_jinja_constants(o: t.Any) -> t.Any:
    """
    Recursively apply TrustedAsTemplate to values tagged with _JinjaConstTemplate and remove the tag.
    Only container types emitted by the Jinja compiler are checked, since others do not contain constants.
    This is used to provide backwards compatibility with historical lookup behavior for positional arguments.
    """
    if _JinjaConstTemplate.is_tagged_on(o):
        _jinja_const_template_warning(o, is_conditional=False)

        return TrustedAsTemplate().tag(_JinjaConstTemplate.untag(o))

    o_type = type(o)

    if o_type is dict:
        return {k: _trust_jinja_constants(v) for k, v in o.items()}

    if o_type in (list, tuple):
        return o_type(_trust_jinja_constants(v) for v in o)

    return o
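The recursion above follows a common shape: rewrite tagged leaves, rebuild only the container types of interest, and pass everything else through. A generic standalone sketch of that shape, with tagging simulated by a wrapper class rather than Ansible's data tagging:

```python
import typing as t


class Tagged:
    """Stand-in for a tagged value (illustrative only, not Ansible's tagging)."""

    def __init__(self, value: t.Any) -> None:
        self.value = value


def untag_recursively(o: t.Any) -> t.Any:
    """Unwrap Tagged leaves, recursing only into plain dict/list/tuple containers."""
    if isinstance(o, Tagged):
        return o.value

    o_type = type(o)

    if o_type is dict:
        return {k: untag_recursively(v) for k, v in o.items()}

    if o_type in (list, tuple):
        # rebuild with the original container type so tuples stay tuples
        return o_type(untag_recursively(v) for v in o)

    return o
```

As in `_trust_jinja_constants`, exact-type checks (`type(o) is dict`) rather than `isinstance` keep subclasses from being silently rebuilt as their base types.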


def _wrap_plugin_output(o: t.Any) -> t.Any:
    """Utility method to ensure that iterators/generators returned from plugins are consumed."""
    if isinstance(o, _ITERATOR_TYPES):
        o = list(o)

    return _AnsibleLazyTemplateMixin._try_create(o, LazyOptions.SKIP_TEMPLATES)
633 lib/ansible/_internal/_templating/_lazy_containers.py (new file)
@@ -0,0 +1,633 @@
from __future__ import annotations

import copy
import dataclasses
import functools
import types
import typing as t

from jinja2.environment import TemplateModule

from ansible.module_utils._internal._datatag import (
    AnsibleTagHelper,
    AnsibleTaggedObject,
    _AnsibleTaggedDict,
    _AnsibleTaggedList,
    _AnsibleTaggedTuple,
    _NO_INSTANCE_STORAGE,
    _try_get_internal_tags_mapping,
)

from ansible.utils.sentinel import Sentinel
from ansible.errors import AnsibleVariableTypeError
from ansible._internal._errors._handler import Skippable
from ansible.vars.hostvars import HostVarsVars, HostVars

from ._access import AnsibleAccessContext
from ._jinja_common import Marker, _TemplateConfig
from ._utils import TemplateContext, PASS_THROUGH_SCALAR_VAR_TYPES, LazyOptions

if t.TYPE_CHECKING:
    from ._engine import TemplateEngine

_KNOWN_TYPES: t.Final[set[type]] = (
    {
        HostVars,  # example: hostvars
        HostVarsVars,  # example: hostvars.localhost | select
        type,  # example: range(20) | list  # triggered on retrieval of `range` type from globals
        range,  # example: range(20) | list  # triggered when returning a `range` instance from a call
        types.FunctionType,  # example: undef() | default("blah")
        types.MethodType,  # example: ansible_facts.get | type_debug
        functools.partial,
        type(''.startswith),  # example: inventory_hostname.upper | type_debug  # using `startswith` to resolve `builtin_function_or_method`
        TemplateModule,  # example: '{% import "importme.j2" as im %}{{ im | type_debug }}'
    }
    | set(PASS_THROUGH_SCALAR_VAR_TYPES)
    | set(Marker.concrete_subclasses)
)
"""
These types are known to the templating system.
In addition to the statically defined types, additional types will be added at runtime.
When enabled in config, this set will be used to determine if an encountered type should trigger a warning or error.
"""


def register_known_types(*args: type) -> None:
    """Register a type with the template engine so it will not trigger warnings or errors when encountered."""
    _KNOWN_TYPES.update(args)


class UnsupportedConstructionMethodError(RuntimeError):
    """Error raised when attempting to construct a lazy container with unsupported arguments."""

    def __init__(self):
        super().__init__("Direct construction of lazy containers is not supported.")


@t.final
@dataclasses.dataclass(frozen=True, slots=True)
class _LazyValue:
    """Wrapper around values to indicate lazy behavior has not yet been applied."""

    value: t.Any


@t.final
@dataclasses.dataclass(frozen=True, kw_only=True, slots=True)
class _LazyValueSource:
    """Intermediate value source for lazy-eligible collection copy operations."""

    source: t.Iterable
    templar: TemplateEngine
    lazy_options: LazyOptions


@t.final
class _NoKeySentinel(Sentinel):
    """Sentinel used to indicate a requested key was not found."""


# There are several operations performed by lazy containers, with some variation between types.
#
# Columns: D=dict, L=list, T=tuple
# Cells: l=lazy (upon access), n=non-lazy (__init__/__new__)
#
# D L T Feature     Description
# - - - ----------- ---------------------------------------------------------------
# l l n propagation when container items which are containers become lazy instances
# l l n transform   when transforms are applied to container items
# l l n templating  when templating is performed on container items
# l l l access      when access calls are performed on container items
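The dict/list columns of the table above (templating applied lazily, upon access) can be illustrated with a minimal standalone dict subclass. This sketch only demonstrates the render-on-first-access idea, modeling "templating" as calling a callable; it has none of the real container's tagging, propagation, or templar semantics:

```python
class LazyRenderDict(dict):
    """dict subclass that renders callable values on first access and caches the result."""

    def __getitem__(self, key):
        value = super().__getitem__(key)

        if callable(value):  # stand-in for the _LazyValue wrapper check
            value = value()  # "render" the deferred value
            super().__setitem__(key, value)  # replace the wrapper so later reads skip rendering

        return value


calls = []
d = LazyRenderDict(greeting=lambda: calls.append("rendered") or "hello")
```

As in the real containers, the rendered result replaces the wrapped value in place, so each item is processed at most once.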


class _AnsibleLazyTemplateMixin:
    __slots__ = _NO_INSTANCE_STORAGE

    _dispatch_types: t.ClassVar[dict[type, type[_AnsibleLazyTemplateMixin]]] = {}  # populated by __init_subclass__
    _container_types: t.ClassVar[set[type]] = set()  # populated by __init_subclass__

    _native_type: t.ClassVar[type]  # from AnsibleTaggedObject

    _SLOTS: t.Final = (
        '_templar',
        '_lazy_options',
    )

    _templar: TemplateEngine
    _lazy_options: LazyOptions

    def __init_subclass__(cls, **kwargs) -> None:
        tagged_type = cls.__mro__[1]
        native_type = tagged_type.__mro__[1]

        for check_type in (tagged_type, native_type):
            if conflicting_type := cls._dispatch_types.get(check_type):
                raise TypeError(f"Lazy mixin {cls.__name__!r} type {check_type.__name__!r} conflicts with {conflicting_type.__name__!r}.")

        cls._dispatch_types[native_type] = cls
        cls._dispatch_types[tagged_type] = cls
        cls._container_types.add(native_type)
        cls._empty_tags_as_native = False  # never revert to the native type when no tags remain

        register_known_types(cls)

    def __init__(self, contents: t.Iterable | _LazyValueSource) -> None:
        if isinstance(contents, _LazyValueSource):
            self._templar = contents.templar
            self._lazy_options = contents.lazy_options
        elif isinstance(contents, _AnsibleLazyTemplateMixin):
            self._templar = contents._templar
            self._lazy_options = contents._lazy_options
        else:
            raise UnsupportedConstructionMethodError()

    def __reduce_ex__(self, protocol):
        raise NotImplementedError("Pickling of Ansible lazy objects is not permitted.")

    @staticmethod
    def _try_create(item: t.Any, lazy_options: LazyOptions = LazyOptions.DEFAULT) -> t.Any:
        """
        If `item` is a container type which supports lazy access and/or templating, return a lazy wrapped version -- otherwise return it as-is.
        When returning as-is, a warning or error may be generated for unknown types.
        The `lazy_options.skip_templates` argument should be set to `True` when `item` is sourced from a plugin instead of Ansible variable storage.
        This provides backwards compatibility and reduces lazy overhead, as plugins do not normally introduce templates.
        If a plugin needs to introduce templates, the plugin is responsible for invoking the templar and returning the result.
        """
        item_type = type(item)

        # Try to use exact type match first to determine which wrapper (if any) to apply; isinstance checks
        # are extremely expensive, so try to avoid them for our commonly-supported types.
        if (dispatcher := _AnsibleLazyTemplateMixin._dispatch_types.get(item_type)) is not None:
            # Create a generator that yields the elements of `item` wrapped in a `_LazyValue` wrapper.
            # The wrapper is used to signal to the lazy container that the value must be processed before being returned.
            # Values added to the lazy container later through other means will be returned as-is, without any special processing.
            lazy_values = dispatcher._lazy_values(item, lazy_options)
            tags_mapping = _try_get_internal_tags_mapping(item)
            value = t.cast(AnsibleTaggedObject, dispatcher)._instance_factory(lazy_values, tags_mapping)

            return value

        with Skippable, _TemplateConfig.unknown_type_encountered_handler.handle(AnsibleVariableTypeError, skip_on_ignore=True):
            if item_type not in _KNOWN_TYPES:
                raise AnsibleVariableTypeError(
                    message=f"Encountered unknown type {item_type.__name__!r} during template operation.",
                    help_text="Use supported types to avoid unexpected behavior.",
                    obj=TemplateContext.current().template_value,
                )

        return item

    def _is_not_lazy_combine_candidate(self, other: object) -> bool:
        """Returns `True` if `other` cannot be lazily combined with the current instance due to differing templar/options, otherwise returns `False`."""
        return isinstance(other, _AnsibleLazyTemplateMixin) and (self._templar is not other._templar or self._lazy_options != other._lazy_options)

    def _non_lazy_copy(self) -> t.Collection:
        """
        Return a non-lazy copy of this collection.
        Any remaining lazy wrapped values will be unwrapped without further processing.
        Tags on this instance will be preserved on the returned copy.
        """
        raise NotImplementedError()  # pragma: nocover

    @staticmethod
    def _lazy_values(values: t.Any, lazy_options: LazyOptions) -> _LazyValueSource:
        """
        Return an iterable that wraps each of the given elements in a lazy wrapper.
        Only elements wrapped this way will receive lazy processing when retrieved from the collection.
        """
        # DTFIX-RELEASE: check relative performance of method-local vs stored generator expressions on implementations of this method
        raise NotImplementedError()  # pragma: nocover

    def _proxy_or_render_lazy_value(self, key: t.Any, value: t.Any) -> t.Any:
        """Ensure that the value is lazy-proxied or rendered, and if a key is provided, replace the original value with the result."""
        if type(value) is not _LazyValue:  # pylint: disable=unidiomatic-typecheck
            if self._lazy_options.access:
                AnsibleAccessContext.current().access(value)

            return value

        original_value = value.value

        if self._lazy_options.access:
            AnsibleAccessContext.current().access(original_value)

        new_value = self._templar.template(original_value, lazy_options=self._lazy_options)

        if new_value is not original_value and self._lazy_options.access:
            AnsibleAccessContext.current().access(new_value)

        if key is not _NoKeySentinel:
            self._native_type.__setitem__(self, key, new_value)  # type: ignore  # pylint: disable=unnecessary-dunder-call

        return new_value


@t.final  # consumers of lazy collections rely heavily on the concrete types being final
class _AnsibleLazyTemplateDict(_AnsibleTaggedDict, _AnsibleLazyTemplateMixin):
    __slots__ = _AnsibleLazyTemplateMixin._SLOTS

    def __init__(self, contents: t.Iterable | _LazyValueSource, /, **kwargs) -> None:
        _AnsibleLazyTemplateMixin.__init__(self, contents)

        if isinstance(contents, _AnsibleLazyTemplateDict):
            super().__init__(dict.items(contents), **kwargs)
        elif isinstance(contents, _LazyValueSource):
            super().__init__(contents.source, **kwargs)
        else:
            raise UnsupportedConstructionMethodError()

    def get(self, key: t.Any, default: t.Any = None) -> t.Any:
        if (value := super().get(key, _NoKeySentinel)) is _NoKeySentinel:
            return default

        return self._proxy_or_render_lazy_value(key, value)

    def __getitem__(self, key: t.Any, /) -> t.Any:
        return self._proxy_or_render_lazy_value(key, super().__getitem__(key))

    def __str__(self):
        return str(self.copy()._native_copy())  # inefficient, but avoids mutating the current instance (to make debugging practical)

    def __repr__(self):
        return repr(self.copy()._native_copy())  # inefficient, but avoids mutating the current instance (to make debugging practical)

    def __iter__(self):
        # We're using the base implementation, but must override `__iter__` to skip `dict` fast-path copy, which would bypass lazy behavior.
        # See: https://github.com/python/cpython/blob/ffcc450a9b8b6927549b501eff7ac14abc238448/Objects/dictobject.c#L3861-L3864
        return super().__iter__()

    def setdefault(self, key, default=None, /) -> t.Any:
        if (value := self.get(key, _NoKeySentinel)) is not _NoKeySentinel:
            return value

        super().__setitem__(key, default)

        return default

    def items(self):
        for key, value in super().items():
            yield key, self._proxy_or_render_lazy_value(key, value)

    def values(self):
        for _key, value in self.items():
            yield value

    def pop(self, key, default=_NoKeySentinel, /) -> t.Any:
        if (value := super().get(key, _NoKeySentinel)) is _NoKeySentinel:
            if default is _NoKeySentinel:
                raise KeyError(key)

            return default

        value = self._proxy_or_render_lazy_value(_NoKeySentinel, value)

        del self[key]

        return value

    def popitem(self) -> t.Any:
        try:
            key = next(reversed(self))
        except StopIteration:
            raise KeyError("popitem(): dictionary is empty")

        value = self._proxy_or_render_lazy_value(_NoKeySentinel, self[key])

        del self[key]

        return key, value

    def _native_copy(self) -> dict:
        return dict(self.items())

    @staticmethod
    def _item_source(value: dict) -> dict | _LazyValueSource:
        if isinstance(value, _AnsibleLazyTemplateDict):
            return _LazyValueSource(source=dict.items(value), templar=value._templar, lazy_options=value._lazy_options)

        return value

    def _yield_non_lazy_dict_items(self) -> t.Iterator[tuple[str, t.Any]]:
        """
        Delegate to the base collection items iterator to yield the raw contents.
        As of Python 3.13, generator functions are significantly faster than inline generator expressions.
        """
        for k, v in dict.items(self):
            yield k, v.value if type(v) is _LazyValue else v  # pylint: disable=unidiomatic-typecheck

    def _non_lazy_copy(self) -> dict:
        return AnsibleTagHelper.tag_copy(self, self._yield_non_lazy_dict_items(), value_type=dict)

    @staticmethod
    def _lazy_values(values: dict, lazy_options: LazyOptions) -> _LazyValueSource:
        return _LazyValueSource(source=((k, _LazyValue(v)) for k, v in values.items()), templar=TemplateContext.current().templar, lazy_options=lazy_options)

    @staticmethod
    def _proxy_or_render_other(other: t.Any | None) -> None:
        """Call `_proxy_or_render_lazy_values` if `other` is a lazy dict. Used internally by comparison methods."""
        if type(other) is _AnsibleLazyTemplateDict:  # pylint: disable=unidiomatic-typecheck
            other._proxy_or_render_lazy_values()

    def _proxy_or_render_lazy_values(self) -> None:
        """Ensure all `_LazyValue` wrapped values have been processed."""
        for _unused in self.values():
            pass

    def __eq__(self, other):
        self._proxy_or_render_lazy_values()
        self._proxy_or_render_other(other)
        return super().__eq__(other)

    def __ne__(self, other):
        self._proxy_or_render_lazy_values()
        self._proxy_or_render_other(other)
        return super().__ne__(other)

    def __or__(self, other):
        # DTFIX-RELEASE: support preservation of laziness when possible like we do for list
        # Both sides end up going through _proxy_or_render_lazy_value, so there's no Templar preservation needed.
        # In the future this could be made more lazy when both Templar instances are the same, or if per-value Templar tracking was used.
        return super().__or__(other)

    def __ror__(self, other):
        # DTFIX-RELEASE: support preservation of laziness when possible like we do for list
        # Both sides end up going through _proxy_or_render_lazy_value, so there's no Templar preservation needed.
        # In the future this could be made more lazy when both Templar instances are the same, or if per-value Templar tracking was used.
        return super().__ror__(other)

    def __deepcopy__(self, memo):
        return _AnsibleLazyTemplateDict(
            _LazyValueSource(
                source=((copy.deepcopy(k), copy.deepcopy(v)) for k, v in super().items()),
                templar=copy.deepcopy(self._templar),
                lazy_options=copy.deepcopy(self._lazy_options),
            )
        )


@t.final  # consumers of lazy collections rely heavily on the concrete types being final
class _AnsibleLazyTemplateList(_AnsibleTaggedList, _AnsibleLazyTemplateMixin):
    __slots__ = _AnsibleLazyTemplateMixin._SLOTS

    def __init__(self, contents: t.Iterable | _LazyValueSource, /) -> None:
        _AnsibleLazyTemplateMixin.__init__(self, contents)

        if isinstance(contents, _AnsibleLazyTemplateList):
            super().__init__(list.__iter__(contents))
        elif isinstance(contents, _LazyValueSource):
            super().__init__(contents.source)
        else:
            raise UnsupportedConstructionMethodError()

    def __getitem__(self, key: t.SupportsIndex | slice, /) -> t.Any:
        if type(key) is slice:  # pylint: disable=unidiomatic-typecheck
            return _AnsibleLazyTemplateList(_LazyValueSource(source=super().__getitem__(key), templar=self._templar, lazy_options=self._lazy_options))

        return self._proxy_or_render_lazy_value(key, super().__getitem__(key))

    def __iter__(self):
        for key, value in enumerate(super().__iter__()):
            yield self._proxy_or_render_lazy_value(key, value)

    def pop(self, idx: t.SupportsIndex = -1, /) -> t.Any:
        if not self:
            raise IndexError('pop from empty list')

        try:
            value = self[idx]
        except IndexError:
            raise IndexError('pop index out of range')

        value = self._proxy_or_render_lazy_value(_NoKeySentinel, value)

        del self[idx]

        return value

    def __str__(self):
        return str(self.copy()._native_copy())  # inefficient, but avoids mutating the current instance (to make debugging practical)

    def __repr__(self):
        return repr(self.copy()._native_copy())  # inefficient, but avoids mutating the current instance (to make debugging practical)

    @staticmethod
    def _item_source(value: list) -> list | _LazyValueSource:
        if isinstance(value, _AnsibleLazyTemplateList):
            return _LazyValueSource(source=list.__iter__(value), templar=value._templar, lazy_options=value._lazy_options)

        return value

    def _yield_non_lazy_list_items(self):
        """
        Delegate to the base collection iterator to yield the raw contents.
        As of Python 3.13, generator functions are significantly faster than inline generator expressions.
        """
        for v in list.__iter__(self):
            yield v.value if type(v) is _LazyValue else v  # pylint: disable=unidiomatic-typecheck

    def _non_lazy_copy(self) -> list:
        return AnsibleTagHelper.tag_copy(self, self._yield_non_lazy_list_items(), value_type=list)

    @staticmethod
    def _lazy_values(values: list, lazy_options: LazyOptions) -> _LazyValueSource:
        return _LazyValueSource(source=(_LazyValue(v) for v in values), templar=TemplateContext.current().templar, lazy_options=lazy_options)

    @staticmethod
    def _proxy_or_render_other(other: t.Any | None) -> None:
        """Call `_proxy_or_render_lazy_values` if `other` is a lazy list. Used internally by comparison methods."""
        if type(other) is _AnsibleLazyTemplateList:  # pylint: disable=unidiomatic-typecheck
            other._proxy_or_render_lazy_values()

    def _proxy_or_render_lazy_values(self) -> None:
        """Ensure all `_LazyValue` wrapped values have been processed."""
        for _unused in self:
            pass

    def __eq__(self, other):
        self._proxy_or_render_lazy_values()
        self._proxy_or_render_other(other)
        return super().__eq__(other)

    def __ne__(self, other):
        self._proxy_or_render_lazy_values()
        self._proxy_or_render_other(other)
        return super().__ne__(other)

    def __gt__(self, other):
        self._proxy_or_render_lazy_values()
        self._proxy_or_render_other(other)
        return super().__gt__(other)

    def __ge__(self, other):
        self._proxy_or_render_lazy_values()
        self._proxy_or_render_other(other)
        return super().__ge__(other)

    def __lt__(self, other):
        self._proxy_or_render_lazy_values()
        self._proxy_or_render_other(other)
        return super().__lt__(other)

    def __le__(self, other):
        self._proxy_or_render_lazy_values()
        self._proxy_or_render_other(other)
        return super().__le__(other)

    def __contains__(self, item):
        self._proxy_or_render_lazy_values()
        return super().__contains__(item)

    def __reversed__(self):
        for idx in range(self.__len__() - 1, -1, -1):
            yield self[idx]

    def __add__(self, other):
        if self._is_not_lazy_combine_candidate(other):
            # When other is lazy with a different templar/options, it cannot be lazily combined with self and a plain list must be returned.
            # If other is a list, de-lazify both, otherwise just let the operation fail.

            if isinstance(other, _AnsibleLazyTemplateList):
                self._proxy_or_render_lazy_values()
                other._proxy_or_render_lazy_values()

            return super().__add__(other)

        # For all other cases, the new list inherits our templar and all values stay lazy.
        # We use list.__add__ to avoid implementing all its error behavior.
        return _AnsibleLazyTemplateList(_LazyValueSource(source=super().__add__(other), templar=self._templar, lazy_options=self._lazy_options))

    def __radd__(self, other):
        if not (other_add := getattr(other, '__add__', None)):
            raise TypeError(f'unsupported operand type(s) for +: {type(other).__name__!r} and {type(self).__name__!r}') from None

        return _AnsibleLazyTemplateList(_LazyValueSource(source=other_add(self), templar=self._templar, lazy_options=self._lazy_options))

    def __mul__(self, other):
        return _AnsibleLazyTemplateList(_LazyValueSource(source=super().__mul__(other), templar=self._templar, lazy_options=self._lazy_options))

    def __rmul__(self, other):
        return _AnsibleLazyTemplateList(_LazyValueSource(source=super().__rmul__(other), templar=self._templar, lazy_options=self._lazy_options))

    def index(self, *args, **kwargs) -> int:
        self._proxy_or_render_lazy_values()
        return super().index(*args, **kwargs)

    def remove(self, *args, **kwargs) -> None:
        self._proxy_or_render_lazy_values()
        super().remove(*args, **kwargs)

    def sort(self, *args, **kwargs) -> None:
        self._proxy_or_render_lazy_values()
        super().sort(*args, **kwargs)

    def __deepcopy__(self, memo):
        return _AnsibleLazyTemplateList(
            _LazyValueSource(
                source=(copy.deepcopy(v) for v in super().__iter__()),
                templar=copy.deepcopy(self._templar),
                lazy_options=copy.deepcopy(self._lazy_options),
            )
        )
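The `__add__`/`__mul__` overrides above rebuild a lazy list around the raw concatenation result so items stay unrendered. A minimal standalone sketch of that shape, modeling deferred rendering as callables (none of the templar/tagging machinery is reproduced here):

```python
class LazyList(list):
    """list subclass whose + result remains a LazyList, deferring rendering of callable items."""

    def __add__(self, other):
        # wrap the plain-list result so the combined list preserves lazy behavior
        return LazyList(super().__add__(other))

    def render(self):
        """Force evaluation of all deferred (callable) items."""
        return [v() if callable(v) else v for v in self]


a = LazyList([lambda: 1, 2])
b = a + [lambda: 3]
```

Concatenation itself never calls the deferred items; only `render()` (standing in for value access through the real container) evaluates them.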


@t.final  # consumers of lazy collections rely heavily on the concrete types being final
class _AnsibleLazyAccessTuple(_AnsibleTaggedTuple, _AnsibleLazyTemplateMixin):
    """
    A tagged tuple subclass that provides only managed access for existing lazy values.

    Since tuples are immutable, they cannot support lazy templating (which would change the tuple's value as templates were resolved).
    When this type is created, each value in the source tuple is lazified:

    * template strings are templated immediately (possibly resulting in lazy containers)
    * non-tuple containers are lazy-wrapped
    * tuples are immediately recursively lazy-wrapped
    * transformations are applied immediately

    The resulting object provides only managed access to its values (e.g., deprecation warnings, tripwires), and propagates to new lazy containers
    created as a result of managed access.
    """

    # DTFIX-RELEASE: ensure we have tests that explicitly verify this behavior

    # nonempty __slots__ not supported for subtype of 'tuple'

    def __new__(cls, contents: t.Iterable | _LazyValueSource, /) -> t.Self:
        if isinstance(contents, _AnsibleLazyAccessTuple):
            return super().__new__(cls, tuple.__iter__(contents))

        if isinstance(contents, _LazyValueSource):
            return super().__new__(cls, contents.source)

        raise UnsupportedConstructionMethodError()

    def __init__(self, contents: t.Iterable | _LazyValueSource, /) -> None:
        _AnsibleLazyTemplateMixin.__init__(self, contents)

    def __getitem__(self, key: t.SupportsIndex | slice, /) -> t.Any:
        if type(key) is slice:  # pylint: disable=unidiomatic-typecheck
            return _AnsibleLazyAccessTuple(super().__getitem__(key))

        value = super().__getitem__(key)

        if self._lazy_options.access:
            AnsibleAccessContext.current().access(value)

        return value

    @staticmethod
    def _item_source(value: tuple) -> tuple | _LazyValueSource:
        if isinstance(value, _AnsibleLazyAccessTuple):
            return _LazyValueSource(source=tuple.__iter__(value), templar=value._templar, lazy_options=value._lazy_options)

        return value

    @staticmethod
    def _lazy_values(values: t.Any, lazy_options: LazyOptions) -> _LazyValueSource:
        templar = TemplateContext.current().templar

        return _LazyValueSource(source=(templar.template(value, lazy_options=lazy_options) for value in values), templar=templar, lazy_options=lazy_options)

    def _non_lazy_copy(self) -> tuple:
        return AnsibleTagHelper.tag_copy(self, self, value_type=tuple)
|
||||
|
||||
def __deepcopy__(self, memo):
|
||||
return _AnsibleLazyAccessTuple(
|
||||
_LazyValueSource(
|
||||
source=(copy.deepcopy(v) for v in super().__iter__()),
|
||||
templar=copy.deepcopy(self._templar),
|
||||
lazy_options=copy.deepcopy(self._lazy_options),
|
||||
)
|
||||
)
|
||||
|
||||
|
||||
def lazify_container(value: t.Any) -> t.Any:
|
||||
"""
|
||||
If the given value is a supported container type, return its lazy version, otherwise return the value as-is.
|
||||
This is used to ensure that managed access and templating occur on args and kwargs to a callable, even if they were sourced from Jinja constants.
|
||||
|
||||
Since both variable access and plugin output are already lazified, this mostly affects Jinja constant containers.
|
||||
However, plugins that directly invoke other plugins (e.g., `Environment.call_filter`) are another potential source of non-lazy containers.
|
||||
In these cases, templating will occur for trusted templates automatically upon access.
|
||||
|
||||
Sets, tuples, and dictionary keys cannot be lazy, since their correct operation requires hashability and equality.
|
||||
These properties are mutually exclusive with the following lazy features:
|
||||
|
||||
- managed access on encrypted strings - may raise errors on both operations when decryption fails
|
||||
- managed access on markers - must raise errors on both operations
|
||||
- templating - mutates values
|
||||
|
||||
That leaves non-raising managed access as the only remaining feature, which is insufficient to warrant lazy support.
|
||||
"""
|
||||
return _AnsibleLazyTemplateMixin._try_create(value)
|
||||
|
||||
|
||||
def lazify_container_args(item: tuple) -> tuple:
|
||||
"""Return the given args with values converted to lazy containers as needed."""
|
||||
return tuple(lazify_container(value) for value in item)
|
||||
|
||||
|
||||
def lazify_container_kwargs(item: dict[str, t.Any]) -> dict[str, t.Any]:
|
||||
"""Return the given kwargs with values converted to lazy containers as needed."""
|
||||
return {key: lazify_container(value) for key, value in item.items()}
|
||||
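The `lazify_container_args`/`lazify_container_kwargs` pair above is a small but load-bearing pattern: run every positional and keyword argument through a converter on the way into a call. A minimal, self-contained sketch of that shape (the converter and function names here are illustrative stand-ins, not Ansible's real wrappers):

```python
import typing as t


def lazify_container(value: t.Any) -> t.Any:
    # Illustrative stand-in: shallow-copy mutable containers, pass everything else through.
    if isinstance(value, dict):
        return dict(value)  # placeholder for a lazy dict wrapper
    if isinstance(value, list):
        return list(value)  # placeholder for a lazy list wrapper
    return value


def with_lazified_args(func: t.Callable) -> t.Callable:
    """Apply the converter to every positional and keyword argument before the call."""
    def wrapper(*args: t.Any, **kwargs: t.Any) -> t.Any:
        lazy_args = tuple(lazify_container(value) for value in args)
        lazy_kwargs = {key: lazify_container(value) for key, value in kwargs.items()}
        return func(*lazy_args, **lazy_kwargs)
    return wrapper


@with_lazified_args
def describe(items, *, options=None):
    return type(items).__name__, type(options).__name__


result = describe([1, 2], options={'a': 1})
```

In the real code the converter is `_AnsibleLazyTemplateMixin._try_create`, and the wrapping happens at Jinja call sites rather than via a decorator.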
103 lib/ansible/_internal/_templating/_marker_behaviors.py Normal file
@@ -0,0 +1,103 @@
"""Handling of `Marker` values."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import abc
|
||||
import contextlib
|
||||
import dataclasses
|
||||
import itertools
|
||||
import typing as t
|
||||
|
||||
from ansible.utils.display import Display
|
||||
|
||||
from ._jinja_common import Marker
|
||||
|
||||
|
||||
class MarkerBehavior(metaclass=abc.ABCMeta):
|
||||
"""Base class to support custom handling of `Marker` values encountered during concatenation or finalization."""
|
||||
|
||||
@abc.abstractmethod
|
||||
def handle_marker(self, value: Marker) -> t.Any:
|
||||
"""Handle the given `Marker` value."""
|
||||
|
||||
|
||||
class FailingMarkerBehavior(MarkerBehavior):
|
||||
"""
|
||||
The default behavior when encountering a `Marker` value during concatenation or finalization.
|
||||
This always raises the template-internal `MarkerError` exception.
|
||||
"""
|
||||
|
||||
def handle_marker(self, value: Marker) -> t.Any:
|
||||
value.trip()
|
||||
|
||||
|
||||
# FAIL_ON_MARKER_BEHAVIOR
|
||||
# _DETONATE_MARKER_BEHAVIOR - internal singleton since it's the default and nobody should need to reference it, or make it an actual singleton
|
||||
FAIL_ON_UNDEFINED: t.Final = FailingMarkerBehavior() # no sense in making many instances...
|
||||
|
||||
|
||||
@dataclasses.dataclass(kw_only=True, slots=True, frozen=True)
|
||||
class _MarkerTracker:
|
||||
"""A numbered occurrence of a `Marker` value for later conversion to a warning."""
|
||||
|
||||
number: int
|
||||
value: Marker
|
||||
|
||||
|
||||
class ReplacingMarkerBehavior(MarkerBehavior):
|
||||
"""All `Marker` values are replaced with a numbered string placeholder and the message from the value."""
|
||||
|
||||
def __init__(self) -> None:
|
||||
self._trackers: list[_MarkerTracker] = []
|
||||
|
||||
def record_marker(self, value: Marker) -> t.Any:
|
||||
"""Assign a sequence number to the given value and record it for later generation of warnings."""
|
||||
number = len(self._trackers) + 1
|
||||
|
||||
self._trackers.append(_MarkerTracker(number=number, value=value))
|
||||
|
||||
return number
|
||||
|
||||
def emit_warnings(self) -> None:
|
||||
"""Emit warning messages caused by Marker values, aggregated by unique template."""
|
||||
|
||||
display = Display()
|
||||
grouped_templates = itertools.groupby(self._trackers, key=lambda tracker: tracker.value._marker_template_source)
|
||||
|
||||
for template, items in grouped_templates:
|
||||
item_list = list(items)
|
||||
|
||||
msg = f'Encountered {len(item_list)} template error{"s" if len(item_list) > 1 else ""}.'
|
||||
|
||||
for item in item_list:
|
||||
msg += f'\nerror {item.number} - {item.value._as_message()}'
|
||||
|
||||
display.warning(msg=msg, obj=template)
|
||||
|
||||
@classmethod
|
||||
@contextlib.contextmanager
|
||||
def warning_context(cls) -> t.Generator[t.Self, None, None]:
|
||||
"""Collect warnings for `Marker` values and emit warnings when the context exits."""
|
||||
instance = cls()
|
||||
|
||||
try:
|
||||
yield instance
|
||||
finally:
|
||||
instance.emit_warnings()
|
||||
|
||||
def handle_marker(self, value: Marker) -> t.Any:
|
||||
number = self.record_marker(value)
|
||||
|
||||
return f"<< error {number} - {value._as_message()} >>"
|
||||
|
||||
|
||||
class RoutingMarkerBehavior(MarkerBehavior):
|
||||
"""Routes instances of Marker (by type reference) to another MarkerBehavior, defaulting to FailingMarkerBehavior."""
|
||||
|
||||
def __init__(self, dispatch_table: dict[type[Marker], MarkerBehavior]) -> None:
|
||||
self._dispatch_table = dispatch_table
|
||||
|
||||
def handle_marker(self, value: Marker) -> t.Any:
|
||||
behavior = self._dispatch_table.get(type(value), FAIL_ON_UNDEFINED)
|
||||
|
||||
return behavior.handle_marker(value)
|
||||
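`ReplacingMarkerBehavior.warning_context` combines a classmethod constructor with `contextlib.contextmanager` so that error collection and final reporting are tied to a scope. A stripped-down, self-contained sketch of that same shape (class and message names here are illustrative, not Ansible's API):

```python
import contextlib
import typing as t


class ReplacingCollector:
    """Collect numbered placeholders, deferring a summary until the context exits."""

    def __init__(self) -> None:
        self._messages: list[str] = []
        self.summary: str | None = None

    def handle(self, message: str) -> str:
        # Number each occurrence and return a placeholder, instead of raising.
        number = len(self._messages) + 1
        self._messages.append(message)
        return f"<< error {number} - {message} >>"

    @classmethod
    @contextlib.contextmanager
    def context(cls) -> t.Iterator["ReplacingCollector"]:
        instance = cls()
        try:
            yield instance
        finally:
            # Reporting happens exactly once, when the scope ends.
            count = len(instance._messages)
            instance.summary = f"Encountered {count} template error{'s' if count > 1 else ''}."


with ReplacingCollector.context() as collector:
    placeholder = collector.handle("'foo' is undefined")
```

The real implementation emits `Display().warning(...)` grouped by source template rather than storing a summary string.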
63 lib/ansible/_internal/_templating/_transform.py Normal file
@@ -0,0 +1,63 @@
"""Runtime projections to provide template/var-visible views of objects that are not natively allowed in Ansible's type system."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import dataclasses
|
||||
import typing as t
|
||||
|
||||
from ansible.module_utils._internal import _traceback
|
||||
from ansible.module_utils.common.messages import PluginInfo, ErrorSummary, WarningSummary, DeprecationSummary
|
||||
from ansible.parsing.vault import EncryptedString, VaultHelper
|
||||
from ansible.utils.display import Display
|
||||
|
||||
from ._jinja_common import VaultExceptionMarker
|
||||
from .._errors import _captured, _utils
|
||||
|
||||
display = Display()
|
||||
|
||||
|
||||
def plugin_info(value: PluginInfo) -> dict[str, str]:
|
||||
"""Render PluginInfo as a dictionary."""
|
||||
return dataclasses.asdict(value)
|
||||
|
||||
|
||||
def error_summary(value: ErrorSummary) -> str:
|
||||
"""Render ErrorSummary as a formatted traceback for backward-compatibility with pre-2.19 TaskResult.exception."""
|
||||
return value.formatted_traceback or '(traceback unavailable)'
|
||||
|
||||
|
||||
def warning_summary(value: WarningSummary) -> str:
|
||||
"""Render WarningSummary as a simple message string for backward-compatibility with pre-2.19 TaskResult.warnings."""
|
||||
return value._format()
|
||||
|
||||
|
||||
def deprecation_summary(value: DeprecationSummary) -> dict[str, t.Any]:
|
||||
"""Render DeprecationSummary as dict values for backward-compatibility with pre-2.19 TaskResult.deprecations."""
|
||||
# DTFIX-RELEASE: reconsider which deprecation fields should be exposed here, taking into account that collection_name is to be deprecated
|
||||
result = value._as_simple_dict()
|
||||
result.pop('details')
|
||||
|
||||
return result
|
||||
|
||||
|
||||
def encrypted_string(value: EncryptedString) -> str | VaultExceptionMarker:
|
||||
"""Decrypt an encrypted string and return its value, or a VaultExceptionMarker if decryption fails."""
|
||||
try:
|
||||
return value._decrypt()
|
||||
except Exception as ex:
|
||||
return VaultExceptionMarker(
|
||||
ciphertext=VaultHelper.get_ciphertext(value, with_tags=True),
|
||||
reason=_utils.get_chained_message(ex),
|
||||
traceback=_traceback.maybe_extract_traceback(ex, _traceback.TracebackEvent.ERROR),
|
||||
)
|
||||
|
||||
|
||||
_type_transform_mapping: dict[type, t.Callable[[t.Any], t.Any]] = {
|
||||
_captured.CapturedErrorSummary: error_summary,
|
||||
PluginInfo: plugin_info,
|
||||
ErrorSummary: error_summary,
|
||||
WarningSummary: warning_summary,
|
||||
DeprecationSummary: deprecation_summary,
|
||||
EncryptedString: encrypted_string,
|
||||
}
|
||||
"""This mapping is consulted by `Templar.template` to provide custom views of some objects."""
|
||||
107 lib/ansible/_internal/_templating/_utils.py Normal file
@@ -0,0 +1,107 @@
from __future__ import annotations

import dataclasses
import typing as t

from ansible.module_utils._internal import _ambient_context, _datatag

if t.TYPE_CHECKING:
    from ._engine import TemplateEngine, TemplateOptions


@dataclasses.dataclass(kw_only=True, slots=True, frozen=True)
class LazyOptions:
    """Templating options that apply to lazy containers, which are inherited by descendent lazy containers."""

    DEFAULT: t.ClassVar[t.Self]
    """A shared instance with the default options to minimize instance creation for arg defaults."""
    SKIP_TEMPLATES: t.ClassVar[t.Self]
    """A shared instance with only `template=False` set to minimize instance creation for arg defaults."""
    SKIP_TEMPLATES_AND_ACCESS: t.ClassVar[t.Self]
    """A shared instance with both `template=False` and `access=False` set to minimize instance creation for arg defaults."""

    template: bool = True
    """Enable/disable templating."""

    access: bool = True
    """Enable/disable access calls."""

    unmask_type_names: frozenset[str] = frozenset()
    """Disables template transformations for the provided type names."""


LazyOptions.DEFAULT = LazyOptions()
LazyOptions.SKIP_TEMPLATES = LazyOptions(template=False)
LazyOptions.SKIP_TEMPLATES_AND_ACCESS = LazyOptions(template=False, access=False)


class TemplateContext(_ambient_context.AmbientContextBase):
    def __init__(
        self,
        *,
        template_value: t.Any,
        templar: TemplateEngine,
        options: TemplateOptions,
        stop_on_template: bool = False,
        _render_jinja_const_template: bool = False,
    ):
        self._template_value = template_value
        self._templar = templar
        self._options = options
        self._stop_on_template = stop_on_template
        self._parent_ctx = TemplateContext.current(optional=True)
        self._render_jinja_const_template = _render_jinja_const_template

    @property
    def is_top_level(self) -> bool:
        return not self._parent_ctx

    @property
    def template_value(self) -> t.Any:
        return self._template_value

    @property
    def templar(self) -> TemplateEngine:
        return self._templar

    @property
    def options(self) -> TemplateOptions:
        return self._options

    @property
    def stop_on_template(self) -> bool:
        return self._stop_on_template


class _OmitType:
    """
    A placeholder singleton used to dynamically omit items from a dict/list/tuple/set when the value is `Omit`.

    The `Omit` singleton is accessible from all Ansible templating contexts via the Jinja global name `omit`.
    The `Omit` placeholder value will be visible to Jinja plugins during templating.
    Jinja plugins requiring omit behavior are responsible for handling encountered `Omit` values.
    `Omit` values remaining in template results will be automatically dropped during template finalization.
    When a finalized template renders to a scalar `Omit`, `AnsibleValueOmittedError` will be raised.
    Passing a value other than `Omit` for `value_for_omit` to the `template` call allows that value to be substituted instead of raising.
    """

    __slots__ = ()

    def __new__(cls):
        return Omit

    def __repr__(self):
        return "<<Omit>>"


Omit = object.__new__(_OmitType)

_datatag._untaggable_types.add(_OmitType)


# DTFIX-RELEASE: review these type sets to ensure they're not overly permissive/dynamic
IGNORE_SCALAR_VAR_TYPES = {value for value in _datatag._ANSIBLE_ALLOWED_SCALAR_VAR_TYPES if not issubclass(value, str)}

PASS_THROUGH_SCALAR_VAR_TYPES = _datatag._ANSIBLE_ALLOWED_SCALAR_VAR_TYPES | {
    _OmitType,  # allow pass through of omit for later handling after top-level finalize completes
}
1052 lib/ansible/_internal/_wrapt.py Normal file
File diff suppressed because it is too large.
0 lib/ansible/_internal/_yaml/__init__.py Normal file
240 lib/ansible/_internal/_yaml/_constructor.py Normal file
@@ -0,0 +1,240 @@
from __future__ import annotations

import abc
import copy
import typing as t

from yaml import Node
from yaml.constructor import SafeConstructor
from yaml.resolver import BaseResolver

from ansible import constants as C
from ansible.module_utils.common.text.converters import to_text
from ansible.module_utils._internal._datatag import AnsibleTagHelper
from ansible._internal._datatag._tags import Origin, TrustedAsTemplate
from ansible.parsing.vault import EncryptedString
from ansible.utils.display import Display

from ._errors import AnsibleConstructorError

display = Display()

_TRUSTED_AS_TEMPLATE: t.Final[TrustedAsTemplate] = TrustedAsTemplate()


class _BaseConstructor(SafeConstructor, metaclass=abc.ABCMeta):
    """Base class for Ansible YAML constructors."""

    @classmethod
    @abc.abstractmethod
    def _register_constructors(cls) -> None:
        """Method used to register constructors to derived types during class initialization."""

    def __init_subclass__(cls, **kwargs) -> None:
        """Initialization for derived types."""
        cls._register_constructors()


class AnsibleInstrumentedConstructor(_BaseConstructor):
    """Ansible constructor which supports Ansible custom behavior such as `Origin` tagging, but no Ansible-specific YAML tags."""

    name: t.Any  # provided by the YAML parser, which retrieves it from the stream

    def __init__(self, origin: Origin, trusted_as_template: bool) -> None:
        if not origin.line_num:
            origin = origin.replace(line_num=1)

        self._origin = origin
        self._trusted_as_template = trusted_as_template
        self._duplicate_key_mode = C.config.get_config_value('DUPLICATE_YAML_DICT_KEY')

        super().__init__()

    @property
    def trusted_as_template(self) -> bool:
        return self._trusted_as_template

    def construct_yaml_map(self, node):
        data = self._node_position_info(node).tag({})  # always an ordered dictionary on py3.7+
        yield data
        value = self.construct_mapping(node)
        data.update(value)

    def construct_mapping(self, node, deep=False):
        # Delegate to built-in implementation to construct the mapping.
        # This is done before checking for duplicates to leverage existing error checking on the input node.
        mapping = super().construct_mapping(node, deep)
        keys = set()

        # Now that the node is known to be a valid mapping, handle any duplicate keys.
        for key_node, _value_node in node.value:
            if (key := self.construct_object(key_node, deep=deep)) in keys:
                msg = f'Found duplicate mapping key {key!r}.'

                if self._duplicate_key_mode == 'error':
                    raise AnsibleConstructorError(problem=msg, problem_mark=key_node.start_mark)

                if self._duplicate_key_mode == 'warn':
                    display.warning(msg=msg, obj=key, help_text='Using last defined value only.')

            keys.add(key)

        return mapping

    def construct_yaml_int(self, node):
        value = super().construct_yaml_int(node)
        return self._node_position_info(node).tag(value)

    def construct_yaml_float(self, node):
        value = super().construct_yaml_float(node)
        return self._node_position_info(node).tag(value)

    def construct_yaml_timestamp(self, node):
        value = super().construct_yaml_timestamp(node)
        return self._node_position_info(node).tag(value)

    def construct_yaml_omap(self, node):
        origin = self._node_position_info(node)
        display.deprecated(
            msg='Use of the YAML `!!omap` tag is deprecated.',
            version='2.23',
            obj=origin,
            help_text='Use a standard mapping instead, as key order is always preserved.',
        )
        items = list(super().construct_yaml_omap(node))[0]
        items = [origin.tag(item) for item in items]
        yield origin.tag(items)

    def construct_yaml_pairs(self, node):
        origin = self._node_position_info(node)
        display.deprecated(
            msg='Use of the YAML `!!pairs` tag is deprecated.',
            version='2.23',
            obj=origin,
            help_text='Use a standard mapping instead.',
        )
        items = list(super().construct_yaml_pairs(node))[0]
        items = [origin.tag(item) for item in items]
        yield origin.tag(items)

    def construct_yaml_str(self, node):
        # Override the default string handling function
        # to always return unicode objects
        # DTFIX-FUTURE: is this to_text conversion still necessary under Py3?
        value = to_text(self.construct_scalar(node))

        tags = [self._node_position_info(node)]

        if self.trusted_as_template:
            # NB: since we're not context aware, this will happily add trust to dictionary keys; this is actually necessary for
            # certain backward compat scenarios, though might be accomplished in other ways if we wanted to avoid trusting keys in
            # the general scenario
            tags.append(_TRUSTED_AS_TEMPLATE)

        return AnsibleTagHelper.tag(value, tags)

    def construct_yaml_binary(self, node):
        value = super().construct_yaml_binary(node)

        return AnsibleTagHelper.tag(value, self._node_position_info(node))

    def construct_yaml_set(self, node):
        data = AnsibleTagHelper.tag(set(), self._node_position_info(node))
        yield data
        value = self.construct_mapping(node)
        data.update(value)

    def construct_yaml_seq(self, node):
        data = self._node_position_info(node).tag([])
        yield data
        data.extend(self.construct_sequence(node))

    def _resolve_and_construct_object(self, node):
        # use a copied node to avoid mutating existing node and tripping the recursion check in construct_object
        copied_node = copy.copy(node)
        # repeat implicit resolution process to determine the proper tag for the value in the unsafe node
        copied_node.tag = t.cast(BaseResolver, self).resolve(type(node), node.value, (True, False))

        # re-entrant call using the correct tag
        # non-deferred construction of hierarchical nodes so the result is a fully realized object, and so our stateful unsafe propagation behavior works
        return self.construct_object(copied_node, deep=True)

    def _node_position_info(self, node) -> Origin:
        # the line number where the previous token has ended (plus empty lines)
        # Add one so that the first line is line 1 rather than line 0
        return self._origin.replace(line_num=node.start_mark.line + self._origin.line_num, col_num=node.start_mark.column + 1)

    @classmethod
    def _register_constructors(cls) -> None:
        constructors: dict[str, t.Callable] = {
            'tag:yaml.org,2002:binary': cls.construct_yaml_binary,
            'tag:yaml.org,2002:float': cls.construct_yaml_float,
            'tag:yaml.org,2002:int': cls.construct_yaml_int,
            'tag:yaml.org,2002:map': cls.construct_yaml_map,
            'tag:yaml.org,2002:omap': cls.construct_yaml_omap,
            'tag:yaml.org,2002:pairs': cls.construct_yaml_pairs,
            'tag:yaml.org,2002:python/dict': cls.construct_yaml_map,
            'tag:yaml.org,2002:python/unicode': cls.construct_yaml_str,
            'tag:yaml.org,2002:seq': cls.construct_yaml_seq,
            'tag:yaml.org,2002:set': cls.construct_yaml_set,
            'tag:yaml.org,2002:str': cls.construct_yaml_str,
            'tag:yaml.org,2002:timestamp': cls.construct_yaml_timestamp,
        }

        for tag, constructor in constructors.items():
            cls.add_constructor(tag, constructor)


class AnsibleConstructor(AnsibleInstrumentedConstructor):
    """Ansible constructor which supports Ansible custom behavior such as `Origin` tagging, as well as Ansible-specific YAML tags."""

    def __init__(self, origin: Origin, trusted_as_template: bool) -> None:
        self._unsafe_depth = 0  # volatile state var used during recursive construction of a value tagged unsafe

        super().__init__(origin=origin, trusted_as_template=trusted_as_template)

    @property
    def trusted_as_template(self) -> bool:
        return self._trusted_as_template and not self._unsafe_depth

    def construct_yaml_unsafe(self, node):
        self._unsafe_depth += 1

        try:
            return self._resolve_and_construct_object(node)
        finally:
            self._unsafe_depth -= 1

    def construct_yaml_vault(self, node: Node) -> EncryptedString:
        ciphertext = self._resolve_and_construct_object(node)

        if not isinstance(ciphertext, str):
            raise AnsibleConstructorError(problem=f"the {node.tag!r} tag requires a string value", problem_mark=node.start_mark)

        encrypted_string = AnsibleTagHelper.tag_copy(ciphertext, EncryptedString(ciphertext=AnsibleTagHelper.untag(ciphertext)))

        return encrypted_string

    def construct_yaml_vault_encrypted(self, node: Node) -> EncryptedString:
        origin = self._node_position_info(node)
        display.deprecated(
            msg='Use of the YAML `!vault-encrypted` tag is deprecated.',
            version='2.23',
            obj=origin,
            help_text='Use the `!vault` tag instead.',
        )

        return self.construct_yaml_vault(node)

    @classmethod
    def _register_constructors(cls) -> None:
        super()._register_constructors()

        constructors: dict[str, t.Callable] = {
            '!unsafe': cls.construct_yaml_unsafe,
            '!vault': cls.construct_yaml_vault,
            '!vault-encrypted': cls.construct_yaml_vault_encrypted,
        }

        for tag, constructor in constructors.items():
            cls.add_constructor(tag, constructor)
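The duplicate-key handling in `construct_mapping` above follows a simple recipe: let the base class validate and build the mapping first, then re-walk the key nodes and honor an `error`/`warn`/ignore mode with last-value-wins semantics. The core of that recipe can be sketched without PyYAML (function and mode names here mirror the code above but are otherwise illustrative):

```python
import typing as t


def build_mapping(pairs: t.Iterable[tuple[t.Any, t.Any]], duplicate_key_mode: str = 'warn') -> tuple[dict, list[str]]:
    """Build a dict from key/value pairs, reporting duplicates; the last value wins."""
    mapping: dict = {}
    messages: list[str] = []
    seen: set = set()

    for key, value in pairs:
        if key in seen:
            msg = f'Found duplicate mapping key {key!r}.'
            if duplicate_key_mode == 'error':
                raise ValueError(msg)
            if duplicate_key_mode == 'warn':
                messages.append(msg)
        seen.add(key)
        mapping[key] = value  # later occurrences overwrite earlier ones

    return mapping, messages


result, warnings = build_mapping([('a', 1), ('b', 2), ('a', 3)])
```

In the real constructor the warning goes through `display.warning(...)` with a help text, and `error` mode raises `AnsibleConstructorError` carrying the YAML mark of the offending key node.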
62 lib/ansible/_internal/_yaml/_dumper.py Normal file
@@ -0,0 +1,62 @@
from __future__ import annotations

import abc
import collections.abc as c
import typing as t

from yaml.representer import SafeRepresenter

from ansible.module_utils._internal._datatag import AnsibleTaggedObject, Tripwire, AnsibleTagHelper
from ansible.parsing.vault import VaultHelper
from ansible.module_utils.common.yaml import HAS_LIBYAML

if HAS_LIBYAML:
    from yaml.cyaml import CSafeDumper as SafeDumper
else:
    from yaml import SafeDumper  # type: ignore[assignment]


class _BaseDumper(SafeDumper, metaclass=abc.ABCMeta):
    """Base class for Ansible YAML dumpers."""

    @classmethod
    @abc.abstractmethod
    def _register_representers(cls) -> None:
        """Method used to register representers to derived types during class initialization."""

    def __init_subclass__(cls, **kwargs) -> None:
        """Initialization for derived types."""
        cls._register_representers()


class AnsibleDumper(_BaseDumper):
    """A simple stub class that allows us to add representers for our custom types."""

    # DTFIX-RELEASE: need a better way to handle serialization controls during YAML dumping
    def __init__(self, *args, dump_vault_tags: bool | None = None, **kwargs):
        super().__init__(*args, **kwargs)

        self._dump_vault_tags = dump_vault_tags

    @classmethod
    def _register_representers(cls) -> None:
        cls.add_multi_representer(AnsibleTaggedObject, cls.represent_ansible_tagged_object)
        cls.add_multi_representer(Tripwire, cls.represent_tripwire)
        cls.add_multi_representer(c.Mapping, SafeRepresenter.represent_dict)
        cls.add_multi_representer(c.Sequence, SafeRepresenter.represent_list)

    def represent_ansible_tagged_object(self, data):
        if self._dump_vault_tags is not False and (ciphertext := VaultHelper.get_ciphertext(data, with_tags=False)):
            # deprecated: description='enable the deprecation warning below' core_version='2.23'
            # if self._dump_vault_tags is None:
            #     Display().deprecated(
            #         msg="Implicit YAML dumping of vaulted value ciphertext is deprecated. Set `dump_vault_tags` to explicitly specify the desired behavior",
            #         version="2.27",
            #     )

            return self.represent_scalar('!vault', ciphertext, style='|')

        return self.represent_data(AnsibleTagHelper.as_native_type(data))  # automatically decrypts encrypted strings

    def represent_tripwire(self, data: Tripwire) -> t.NoReturn:
        data.trip()
166 lib/ansible/_internal/_yaml/_errors.py Normal file
@@ -0,0 +1,166 @@
from __future__ import annotations

import re

import typing as t

from yaml import MarkedYAMLError
from yaml.constructor import ConstructorError

from ansible._internal._errors import _utils
from ansible.errors import AnsibleParserError
from ansible._internal._datatag._tags import Origin


class AnsibleConstructorError(ConstructorError):
    """Ansible-specific ConstructorError used to bypass exception analysis during wrapping in AnsibleYAMLParserError."""


class AnsibleYAMLParserError(AnsibleParserError):
    """YAML-specific parsing failure wrapping an exception raised by the YAML parser."""

    _default_message = 'YAML parsing failed.'

    _include_cause_message = False  # hide the underlying cause message, it's included by `handle_exception` as needed

    _formatted_source_context_value: str | None = None

    @property
    def _formatted_source_context(self) -> str | None:
        return self._formatted_source_context_value

    @classmethod
    def handle_exception(cls, exception: Exception, origin: Origin) -> t.NoReturn:
        if isinstance(exception, MarkedYAMLError):
            origin = origin.replace(line_num=exception.problem_mark.line + 1, col_num=exception.problem_mark.column + 1)

        source_context = _utils.SourceContext.from_origin(origin)

        target_line = source_context.target_line or ''  # for these cases, we don't need to distinguish between None and empty string

        message: str | None = None
        help_text = None

        # FIXME: Do all this by walking the parsed YAML doc stream. Using regexes is a dead-end; YAML's just too flexible to not have a
        #  raft of false-positives and corner cases. If we directly consume either the YAML parse stream or override the YAML composer, we can
        #  better catch these things without worrying about duplicating YAML's scalar parsing logic around quoting/escaping. At first, we can
        #  replace the regex logic below with tiny special-purpose parse consumers to catch specific issues, but ideally, we could do a lot of this
        #  inline with the actual doc parse, since our rules are a lot more strict than YAML's (eg, no support for non-scalar keys), and a lot of the
        #  problem cases where that comes into play are around expression quoting and Jinja {{ syntax looking like weird YAML values we don't support.
        #  Some common examples, where -> is "what YAML actually sees":
        #    foo: {{ bar }} -> {"foo": {{"bar": None}: None}} - a mapping with a mapping as its key (legal YAML, but not legal Python/Ansible)
        #
        #    - copy: src=foo.txt  # kv syntax (kv could be on following line(s), too- implicit multi-line block scalar)
        #      dest: bar.txt  # orphaned mapping, since the value of `copy` is the scalar "src=foo.txt"
        #
        #    - msg == "Error: 'dude' was not found"  # unquoted scalar has a : in it -> {'msg == "Error"': 'dude'} [ was not found" ] is garbage orphan scalar

        # noinspection PyUnboundLocalVariable
        if not isinstance(exception, MarkedYAMLError):
            pass  # unexpected exception, don't use special analysis of exception

        elif isinstance(exception, AnsibleConstructorError):
            pass  # raised internally by ansible code, don't use special analysis of exception

        # Check for tabs.
        # There may be cases where there is a valid tab in a line that has other errors.
        # That's OK, users should "fix" their tab usage anyway -- at which point later error handling logic will hopefully find the real issue.
        elif (tab_idx := target_line.find('\t')) >= 0:
            source_context = _utils.SourceContext.from_origin(origin.replace(col_num=tab_idx + 1))
            message = "Tabs are usually invalid in YAML."

        # Check for unquoted templates.
        elif match := re.search(r'^\s*(?:-\s+)*(?:[\w\s]+:\s+)?(?P<value>\{\{.*}})', target_line):
            source_context = _utils.SourceContext.from_origin(origin.replace(col_num=match.start('value') + 1))
            message = 'This may be an issue with missing quotes around a template block.'
            # FIXME: Use the captured value to show the actual fix required.
            help_text = """
For example:

    raw: {{ some_var }}

Should be:

    raw: "{{ some_var }}"
"""

        # Check for common unquoted colon mistakes.
        elif (
            # ignore lines starting with only whitespace and a colon
            not target_line.lstrip().startswith(':')
            # find the value after list/dict preamble
            and (value_match := re.search(r'^\s*(?:-\s+)*(?:[\w\s\[\]{}]+:\s+)?(?P<value>.*)$', target_line))
            # ignore properly quoted values
            and (target_fragment := _replace_quoted_value(value_match.group('value')))
            # look for an unquoted colon in the value
            and (colon_match := re.search(r':($| )', target_fragment))
        ):
            source_context = _utils.SourceContext.from_origin(origin.replace(col_num=value_match.start('value') + colon_match.start() + 1))
            message = 'Colons in unquoted values must be followed by a non-space character.'
            # FIXME: Use the captured value to show the actual fix required.
            help_text = """
|
||||
For example:
|
||||
|
||||
raw: echo 'name: ansible'
|
||||
|
||||
Should be:
|
||||
|
||||
raw: "echo 'name: ansible'"
|
||||
"""
|
||||
|
||||
# Check for common quoting mistakes.
|
||||
elif match := re.search(r'^\s*(?:-\s+)*(?:[\w\s]+:\s+)?(?P<value>[\"\'].*?\s*)$', target_line):
|
||||
suspected_value = match.group('value')
|
||||
first, last = suspected_value[0], suspected_value[-1]
|
||||
|
||||
if first != last: # "foo" in bar
|
||||
source_context = _utils.SourceContext.from_origin(origin.replace(col_num=match.start('value') + 1))
|
||||
message = 'Values starting with a quote must end with the same quote.'
|
||||
# FIXME: Use the captured value to show the actual fix required, and use that same logic to improve the origin further.
|
||||
help_text = """
|
||||
For example:
|
||||
|
||||
raw: "foo" in bar
|
||||
|
||||
Should be:
|
||||
|
||||
raw: '"foo" in bar'
|
||||
"""
|
||||
elif first == last and target_line.count(first) > 2: # "foo" and "bar"
|
||||
source_context = _utils.SourceContext.from_origin(origin.replace(col_num=match.start('value') + 1))
|
||||
message = 'Values starting with a quote must end with the same quote, and not contain that quote.'
|
||||
# FIXME: Use the captured value to show the actual fix required, and use that same logic to improve the origin further.
|
||||
help_text = """
|
||||
For example:
|
||||
|
||||
raw: "foo" in "bar"
|
||||
|
||||
Should be:
|
||||
|
||||
raw: '"foo" in "bar"'
|
||||
"""
|
||||
|
||||
if not message:
|
||||
if isinstance(exception, MarkedYAMLError):
|
||||
# marked YAML error, pull out the useful messages while omitting the noise
|
||||
message = ' '.join(filter(None, (exception.context, exception.problem, exception.note)))
|
||||
message = message.strip()
|
||||
message = f'{message[0].upper()}{message[1:]}'
|
||||
|
||||
if not message.endswith('.'):
|
||||
message += '.'
|
||||
else:
|
||||
# unexpected error, use the exception message (normally hidden by overriding include_cause_message)
|
||||
message = str(exception)
|
||||
|
||||
message = re.sub(r'\s+', ' ', message).strip()
|
||||
|
||||
error = cls(message, obj=source_context.origin)
|
||||
error._formatted_source_context_value = str(source_context)
|
||||
error._help_text = help_text
|
||||
|
||||
raise error from exception
|
||||
|
||||
|
||||
def _replace_quoted_value(value: str, replacement='.') -> str:
|
||||
return re.sub(r"""^\s*('[^']*'|"[^"]*")\s*$""", lambda match: replacement * len(match.group(0)), value)
|
||||
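The `_replace_quoted_value` helper above is easy to sanity-check in isolation. This sketch reimplements it standalone (the sample inputs are mine, not from the patch) to show how a fully quoted value is blanked out at its original length, so the unquoted-colon heuristic skips it while later column offsets still line up:

```python
import re


def replace_quoted_value(value: str, replacement: str = '.') -> str:
    # blank out a value that is entirely one single- or double-quoted string,
    # preserving its length so column offsets computed later remain valid
    return re.sub(r"""^\s*('[^']*'|"[^"]*")\s*$""", lambda match: replacement * len(match.group(0)), value)


print(replace_quoted_value('"a: b"'))                # -> '......' (fully quoted: blanked)
print(replace_quoted_value("echo 'name: ansible'"))  # unchanged: only partially quoted, colon stays visible
```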
66
lib/ansible/_internal/_yaml/_loader.py
Normal file
@@ -0,0 +1,66 @@
from __future__ import annotations

import io as _io

from yaml.resolver import Resolver

from ansible.module_utils._internal._datatag import AnsibleTagHelper
from ansible.module_utils.common.yaml import HAS_LIBYAML
from ansible._internal._datatag import _tags

from ._constructor import AnsibleConstructor, AnsibleInstrumentedConstructor

if HAS_LIBYAML:
    from yaml.cyaml import CParser

    class _YamlParser(CParser):
        def __init__(self, stream: str | bytes | _io.IOBase) -> None:
            if isinstance(stream, (str, bytes)):
                stream = AnsibleTagHelper.untag(stream)  # PyYAML + libyaml barfs on str/bytes subclasses

            CParser.__init__(self, stream)

            self.name = getattr(stream, 'name', None)  # provide feature parity with the Python implementation (yaml.reader.Reader provides name)

else:
    from yaml.composer import Composer
    from yaml.reader import Reader
    from yaml.scanner import Scanner
    from yaml.parser import Parser

    class _YamlParser(Reader, Scanner, Parser, Composer):  # type: ignore[no-redef]
        def __init__(self, stream: str | bytes | _io.IOBase) -> None:
            Reader.__init__(self, stream)
            Scanner.__init__(self)
            Parser.__init__(self)
            Composer.__init__(self)


class AnsibleInstrumentedLoader(_YamlParser, AnsibleInstrumentedConstructor, Resolver):
    """Ansible YAML loader which supports Ansible custom behavior such as `Origin` tagging, but no Ansible-specific YAML tags."""

    def __init__(self, stream: str | bytes | _io.IOBase) -> None:
        _YamlParser.__init__(self, stream)

        AnsibleInstrumentedConstructor.__init__(
            self,
            origin=_tags.Origin.get_or_create_tag(stream, self.name),
            trusted_as_template=_tags.TrustedAsTemplate.is_tagged_on(stream),
        )

        Resolver.__init__(self)


class AnsibleLoader(_YamlParser, AnsibleConstructor, Resolver):
    """Ansible loader which supports Ansible custom behavior such as `Origin` tagging, as well as Ansible-specific YAML tags."""

    def __init__(self, stream: str | bytes | _io.IOBase) -> None:
        _YamlParser.__init__(self, stream)

        AnsibleConstructor.__init__(
            self,
            origin=_tags.Origin.get_or_create_tag(stream, self.name),
            trusted_as_template=_tags.TrustedAsTemplate.is_tagged_on(stream),
        )

        Resolver.__init__(self)
@@ -0,0 +1,11 @@
"Protomatter - an unstable substance which every ethical scientist in the galaxy has denounced as dangerously unpredictable."

"But it was the only way to solve certain problems..."

This Ansible Collection is embedded within ansible-core.
It contains plugins useful for ansible-core's own integration tests.
They have been made available, completely unsupported,
in case they prove useful for debugging and troubleshooting purposes.

> CAUTION: This collection is not supported, and may be changed or removed in any version without prior notice.
Use of these plugins outside ansible-core is highly discouraged.
@@ -0,0 +1,36 @@
from __future__ import annotations

import typing as t

from ansible.module_utils.common.validation import _check_type_str_no_conversion, _check_type_list_strict
from ansible.plugins.action import ActionBase
from ansible._internal._templating._engine import TemplateEngine
from ansible._internal._templating._marker_behaviors import ReplacingMarkerBehavior


class ActionModule(ActionBase):
    TRANSFERS_FILES = False
    _requires_connection = False

    @classmethod
    def finalize_task_arg(cls, name: str, value: t.Any, templar: TemplateEngine, context: t.Any) -> t.Any:
        if name == 'expression':
            return value

        return super().finalize_task_arg(name, value, templar, context)

    def run(self, tmp=None, task_vars=None):
        # accepts a list of literal expressions (no templating), evaluates with no failure on undefined, returns all results
        _vr, args = self.validate_argument_spec(
            argument_spec=dict(
                expression=dict(type=_check_type_list_strict, elements=_check_type_str_no_conversion, required=True),
            ),
        )

        with ReplacingMarkerBehavior.warning_context() as replacing_behavior:
            templar = self._templar._engine.extend(marker_behavior=replacing_behavior)

            return dict(
                _ansible_verbose_always=True,
                expression_result=[templar.evaluate_expression(expression) for expression in args['expression']],
            )
@@ -0,0 +1,19 @@
from __future__ import annotations

import typing as t

from ansible._internal._datatag._tags import TrustedAsTemplate


def apply_trust(value: object) -> object:
    """
    Filter that returns a tagged copy of the input string with TrustedAsTemplate.
    Containers and other non-string values are returned unmodified.
    """
    return TrustedAsTemplate().tag(value) if isinstance(value, str) else value


class FilterModule:
    @staticmethod
    def filters() -> dict[str, t.Callable]:
        return dict(apply_trust=apply_trust)
@@ -0,0 +1,18 @@
from __future__ import annotations

import dataclasses
import typing as t


def dump_object(value: t.Any) -> object:
    """Internal filter to convert objects not supported by JSON to types which are."""
    if dataclasses.is_dataclass(value):
        return dataclasses.asdict(value)  # type: ignore[arg-type]

    return value


class FilterModule(object):
    @staticmethod
    def filters() -> dict[str, t.Callable]:
        return dict(dump_object=dump_object)
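The `dump_object` filter above is a thin wrapper over `dataclasses.asdict`. A minimal standalone sketch of the dataclass-to-JSON path (the `Host` dataclass is a made-up example, not part of the patch):

```python
import dataclasses
import json


@dataclasses.dataclass
class Host:
    name: str
    port: int = 22


def dump_object(value):
    # mirrors the filter above: dataclasses become plain dicts, anything else passes through
    if dataclasses.is_dataclass(value):
        return dataclasses.asdict(value)
    return value


print(json.dumps(dump_object(Host('web01'))))  # {"name": "web01", "port": 22}
print(dump_object('plain string'))             # plain string (unchanged)
```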
@@ -0,0 +1,16 @@
from __future__ import annotations

import typing as t

from ansible._internal._templating._engine import _finalize_template_result, FinalizeMode


def finalize(value: t.Any) -> t.Any:
    """Perform an explicit top-level template finalize operation on the supplied value."""
    return _finalize_template_result(value, mode=FinalizeMode.TOP_LEVEL)


class FilterModule:
    @staticmethod
    def filters() -> dict[str, t.Callable]:
        return dict(finalize=finalize)
@@ -0,0 +1,18 @@
from __future__ import annotations

import typing as t

from ansible._internal._datatag._tags import Origin


def origin(value: object) -> str | None:
    """Return the origin of the value, if any, otherwise `None`."""
    origin_tag = Origin.get_tag(value)

    return str(origin_tag) if origin_tag else None


class FilterModule:
    @staticmethod
    def filters() -> dict[str, t.Callable]:
        return dict(origin=origin)
@@ -0,0 +1,24 @@
from __future__ import annotations

import ast

from ansible.errors import AnsibleTypeError


def python_literal_eval(value: object, ignore_errors=False) -> object:
    try:
        if isinstance(value, str):
            return ast.literal_eval(value)

        raise AnsibleTypeError("The `value` to eval must be a string.", obj=value)
    except Exception:
        if ignore_errors:
            return value

        raise


class FilterModule(object):
    @staticmethod
    def filters():
        return dict(python_literal_eval=python_literal_eval)
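The filter above delegates to Python's `ast.literal_eval`, which is safe where `eval()` is not. A quick standalone illustration of what it accepts and rejects (the sample inputs are mine):

```python
import ast

# strings containing only Python literals become real data structures
value = ast.literal_eval("[1, 2]")
print(value, type(value).__name__)  # [1, 2] list

# names and calls are rejected rather than evaluated, unlike eval()
try:
    ast.literal_eval("__import__('os').system('id')")
except ValueError as exc:
    print("rejected:", type(exc).__name__)
```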
@@ -0,0 +1,33 @@
DOCUMENTATION:
  name: python_literal_eval
  version_added: "2.19"
  short_description: evaluate a Python literal expression string
  description:
    - Evaluates the input string as a Python literal expression, returning the resulting data structure.
    - Previous versions of Ansible applied this behavior to all template results in non-native Jinja mode.
    - This filter provides a way to emulate the previous behavior.
  notes:
    - Directly calls Python's C(ast.literal_eval).
  positional: _input
  options:
    _input:
      description: Python literal string expression.
      type: str
      required: true
    ignore_errors:
      description: Whether to silently ignore all errors resulting from the literal_eval operation. If true, the input is silently returned unmodified when an error occurs.
      type: bool
      default: false

EXAMPLES: |
  - name: evaluate an expression comprised only of Python literals
    assert:
      that: (another_var | ansible._protomatter.python_literal_eval)[1] == 2  # in 2.19 and later, the explicit python_literal_eval emulates the old templating behavior
    vars:
      another_var: "{{ some_var }}"  # in 2.18 and earlier, indirection through templating caused implicit literal_eval, converting the value to a list
      some_var: "[1, 2]"  # a value that looks like a Python list literal embedded in a string

RETURN:
  _value:
    description: Resulting data structure.
    type: raw
@@ -0,0 +1,16 @@
from __future__ import annotations

import typing as t

from ansible.module_utils._internal._datatag import AnsibleTagHelper


def tag_names(value: object) -> list[str]:
    """Return a list of tag type names (if any) present on the given object."""
    return sorted(tag_type.__name__ for tag_type in AnsibleTagHelper.tag_types(value))


class FilterModule:
    @staticmethod
    def filters() -> dict[str, t.Callable]:
        return dict(tag_names=tag_names)
@@ -0,0 +1,17 @@
from __future__ import annotations

import typing as t

from ansible.plugins import accept_args_markers


@accept_args_markers
def true_type(obj: object) -> str:
    """Internal filter to show the true type name of the given object, not just the base type name like the `debug` filter."""
    return obj.__class__.__name__


class FilterModule(object):
    @staticmethod
    def filters() -> dict[str, t.Callable]:
        return dict(true_type=true_type)
@@ -0,0 +1,49 @@
from __future__ import annotations

import copy
import dataclasses
import typing as t

from ansible._internal._templating._jinja_common import validate_arg_type
from ansible._internal._templating._lazy_containers import _AnsibleLazyTemplateMixin
from ansible._internal._templating._transform import _type_transform_mapping
from ansible.errors import AnsibleError


def unmask(value: object, type_names: str | list[str]) -> object:
    """
    Internal filter to suppress automatic type transformation in Jinja (e.g., WarningMessageDetail, DeprecationMessageDetail, ErrorDetail).
    Lazy collection caching is in play - the first attempt to access a value in a given lazy container must be with unmasking in place, or the transformed value
    will already be cached.
    """
    validate_arg_type("type_names", type_names, (str, list))

    if isinstance(type_names, str):
        check_type_names = [type_names]
    else:
        check_type_names = type_names

    valid_type_names = {key.__name__ for key in _type_transform_mapping}
    invalid_type_names = [type_name for type_name in check_type_names if type_name not in valid_type_names]

    if invalid_type_names:
        raise AnsibleError(f'Unknown type name(s): {", ".join(invalid_type_names)}', obj=type_names)

    result: object

    if isinstance(value, _AnsibleLazyTemplateMixin):
        result = copy.copy(value)
        result._lazy_options = dataclasses.replace(
            result._lazy_options,
            unmask_type_names=result._lazy_options.unmask_type_names | frozenset(check_type_names),
        )
    else:
        result = value

    return result


class FilterModule(object):
    @staticmethod
    def filters() -> dict[str, t.Callable]:
        return dict(unmask=unmask)
@@ -0,0 +1,21 @@
from __future__ import annotations

from ansible.plugins.lookup import LookupBase


class LookupModule(LookupBase):
    """Specialized config lookup that applies data transformations on values that config cannot."""

    def run(self, terms, variables=None, **kwargs):
        if not terms or not (config_name := terms[0]):
            raise ValueError("config name is required")

        match config_name:
            case 'DISPLAY_TRACEBACK':
                # since config can't expand this yet, we need the post-processed version
                from ansible.module_utils._internal._traceback import traceback_for

                return traceback_for()
            # DTFIX-FUTURE: plumb through normal config fallback
            case _:
                raise ValueError(f"Unknown config name {config_name!r}.")
@@ -0,0 +1,2 @@
DOCUMENTATION:
  name: config
@@ -0,0 +1,15 @@
from __future__ import annotations

import typing as t

from ansible.module_utils._internal import _datatag


def tagged(value: t.Any) -> bool:
    return bool(_datatag.AnsibleTagHelper.tag_types(value))


class TestModule:
    @staticmethod
    def tests() -> dict[str, t.Callable]:
        return dict(tagged=tagged)
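Tests like `tagged` work because, with this change, many variable values are subclasses of their native Python types carrying extra metadata. A toy sketch of that idea (not Ansible's implementation; names are invented) showing why such values still behave like ordinary strings:

```python
class TaggedStr(str):
    # toy version of the data-tagging idea: a str subclass that carries metadata
    def __new__(cls, value, *, origin=None):
        obj = super().__new__(cls, value)
        obj.origin = origin
        return obj


s = TaggedStr('inventory.yml', origin='/etc/ansible/hosts:3')
print(isinstance(s, str))  # True - behaves as a normal str
print(s.origin)            # /etc/ansible/hosts:3
print(s.upper())           # INVENTORY.YML - but str methods return plain str, dropping the metadata
```

This is also why the changelog warns that libraries which mishandle builtin subclasses may need the native types passed instead.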
@@ -0,0 +1,19 @@
DOCUMENTATION:
  name: tagged
  author: Ansible Core
  version_added: "2.19"
  short_description: does the value have a data tag
  description:
    - Check if the provided value has a data tag.
  options:
    _input:
      description: A value.
      type: raw

EXAMPLES: |
  is_data_tagged: "{{ my_variable is ansible._protomatter.tagged }}"

RETURN:
  _value:
    description: Returns C(True) if the value has one or more data tags, otherwise C(False).
    type: boolean
@@ -0,0 +1,18 @@
from __future__ import annotations

import typing as t

from ansible.module_utils._internal import _datatag


def tagged_with(value: t.Any, tag_name: str) -> bool:
    if tag_type := _datatag._known_tag_type_map.get(tag_name):
        return tag_type.is_tagged_on(value)

    raise ValueError(f"Unknown tag name {tag_name!r}.")


class TestModule:
    @staticmethod
    def tests() -> dict[str, t.Callable]:
        return dict(tagged_with=tagged_with)
@@ -0,0 +1,19 @@
DOCUMENTATION:
  name: tagged_with
  author: Ansible Core
  version_added: "2.19"
  short_description: does the value have the specified data tag
  description:
    - Check if the provided value has the specified data tag.
  options:
    _input:
      description: A value.
      type: raw

EXAMPLES: |
  is_data_tagged: "{{ my_variable is ansible._protomatter.tagged_with('Origin') }}"

RETURN:
  _value:
    description: Returns C(True) if the value has the specified data tag, otherwise C(False).
    type: boolean
@@ -77,18 +77,6 @@ def initialize_locale():
initialize_locale()


from importlib.metadata import version
from ansible.module_utils.compat.version import LooseVersion

# Used for determining if the system is running a new enough Jinja2 version
# and should only restrict on our documented minimum versions
jinja2_version = version('jinja2')
if jinja2_version < LooseVersion('3.1'):
    raise SystemExit(
        'ERROR: Ansible requires Jinja2 3.1 or newer on the controller. '
        'Current version: %s' % jinja2_version
    )

import atexit
import errno
import getpass
@@ -97,17 +85,22 @@ import traceback
from abc import ABC, abstractmethod
from pathlib import Path

from ansible import _internal  # do not remove or defer; ensures controller-specific state is set early

_internal.setup()

try:
    from ansible import constants as C
    from ansible.utils.display import Display
    display = Display()
except Exception as e:
    print('ERROR: %s' % e, file=sys.stderr)
except Exception as ex:
    print(f'ERROR: {ex}\n\n{"".join(traceback.format_exception(ex))}', file=sys.stderr)
    sys.exit(5)


from ansible import context
from ansible.cli.arguments import option_helpers as opt_help
from ansible.errors import AnsibleError, AnsibleOptionsError, AnsibleParserError
from ansible.errors import AnsibleError, ExitCode
from ansible.inventory.manager import InventoryManager
from ansible.module_utils.six import string_types
from ansible.module_utils.common.text.converters import to_bytes, to_text
@@ -115,14 +108,13 @@ from ansible.module_utils.common.collections import is_sequence
from ansible.module_utils.common.file import is_executable
from ansible.module_utils.common.process import get_bin_path
from ansible.parsing.dataloader import DataLoader
from ansible.parsing.vault import PromptVaultSecret, get_file_vault_secret
from ansible.parsing.vault import PromptVaultSecret, get_file_vault_secret, VaultSecretsContext
from ansible.plugins.loader import add_all_plugin_dirs, init_plugin_loader
from ansible.release import __version__
from ansible.utils._ssh_agent import SshAgentClient
from ansible.utils.collection_loader import AnsibleCollectionConfig
from ansible.utils.collection_loader._collection_finder import _get_collection_name_from_path
from ansible.utils.path import unfrackpath
from ansible.utils.unsafe_proxy import to_unsafe_text
from ansible.vars.manager import VariableManager

try:
@@ -226,6 +218,9 @@ class CLI(ABC):
        self.parser = None
        self.callback = callback

        self.show_devel_warning()

    def show_devel_warning(self) -> None:
        if C.DEVEL_WARNING and __version__.endswith('dev0'):
            display.warning(
                'You are running the development version of Ansible. You should only run Ansible from "devel" if '
@@ -297,7 +292,7 @@ class CLI(ABC):
    @staticmethod
    def setup_vault_secrets(loader, vault_ids, vault_password_files=None,
                            ask_vault_pass=None, create_new_password=False,
                            auto_prompt=True):
                            auto_prompt=True, initialize_context=True):
        # list of tuples
        vault_secrets = []
@@ -394,15 +389,14 @@ class CLI(ABC):
        if last_exception and not found_vault_secret:
            raise last_exception

        if initialize_context:
            VaultSecretsContext.initialize(VaultSecretsContext(vault_secrets))

        return vault_secrets

    @staticmethod
    def _get_secret(prompt):

        secret = getpass.getpass(prompt=prompt)
        if secret:
            secret = to_unsafe_text(secret)
        return secret
    def _get_secret(prompt: str) -> str:
        return getpass.getpass(prompt=prompt)

    @staticmethod
    def ask_passwords():
@@ -411,7 +405,6 @@ class CLI(ABC):
        op = context.CLIARGS
        sshpass = None
        becomepass = None
        become_prompt = ''

        become_prompt_method = "BECOME" if C.AGNOSTIC_BECOME_PROMPT else op['become_method'].upper()
@@ -433,7 +426,7 @@ class CLI(ABC):
        except EOFError:
            pass

        return (sshpass, becomepass)
        return sshpass, becomepass

    def validate_conflicts(self, op, runas_opts=False, fork_opts=False):
        """ check for conflicting options """
@@ -680,10 +673,9 @@ class CLI(ABC):
        return hosts

    @staticmethod
    def get_password_from_file(pwd_file):

    def get_password_from_file(pwd_file: str) -> str:
        b_pwd_file = to_bytes(pwd_file)
        secret = None

        if b_pwd_file == b'-':
            # ensure its read as bytes
            secret = sys.stdin.buffer.read()
@@ -703,13 +695,13 @@ class CLI(ABC):

            stdout, stderr = p.communicate()
            if p.returncode != 0:
                raise AnsibleError("The password script %s returned an error (rc=%s): %s" % (pwd_file, p.returncode, stderr))
                raise AnsibleError("The password script %s returned an error (rc=%s): %s" % (pwd_file, p.returncode, to_text(stderr)))
            secret = stdout

        else:
            try:
                with open(b_pwd_file, "rb") as f:
                    secret = f.read().strip()
                with open(b_pwd_file, "rb") as password_file:
                    secret = password_file.read().strip()
            except (OSError, IOError) as e:
                raise AnsibleError("Could not read password file %s: %s" % (pwd_file, e))
@@ -718,7 +710,7 @@ class CLI(ABC):
        if not secret:
            raise AnsibleError('Empty password was provided from file (%s)' % pwd_file)

        return to_unsafe_text(secret)
        return to_text(secret)

    @classmethod
    def cli_executor(cls, args=None):
@@ -739,54 +731,22 @@ class CLI(ABC):
            else:
                display.debug("Created the '%s' directory" % ansible_dir)

            try:
                args = [to_text(a, errors='surrogate_or_strict') for a in args]
            except UnicodeError:
                display.error('Command line args are not in utf-8, unable to continue. Ansible currently only understands utf-8')
                display.display(u"The full traceback was:\n\n%s" % to_text(traceback.format_exc()))
                exit_code = 6
            else:
                cli = cls(args)
                exit_code = cli.run()

        except AnsibleOptionsError as e:
            cli.parser.print_help()
            display.error(to_text(e), wrap_text=False)
            exit_code = 5
        except AnsibleParserError as e:
            display.error(to_text(e), wrap_text=False)
            exit_code = 4
        # TQM takes care of these, but leaving comment to reserve the exit codes
        # except AnsibleHostUnreachable as e:
        #     display.error(str(e))
        #     exit_code = 3
        # except AnsibleHostFailed as e:
        #     display.error(str(e))
        #     exit_code = 2
        except AnsibleError as e:
            display.error(to_text(e), wrap_text=False)
            exit_code = 1
            cli = cls(args)
            exit_code = cli.run()
        except AnsibleError as ex:
            display.error(ex)
            exit_code = ex._exit_code
        except KeyboardInterrupt:
            display.error("User interrupted execution")
            exit_code = 99
        except Exception as e:
            if C.DEFAULT_DEBUG:
                # Show raw stacktraces in debug mode, It also allow pdb to
                # enter post mortem mode.
                raise
            have_cli_options = bool(context.CLIARGS)
            display.error("Unexpected Exception, this is probably a bug: %s" % to_text(e), wrap_text=False)
            if not have_cli_options or have_cli_options and context.CLIARGS['verbosity'] > 2:
                log_only = False
                if hasattr(e, 'orig_exc'):
                    display.vvv('\nexception type: %s' % to_text(type(e.orig_exc)))
                    why = to_text(e.orig_exc)
                    if to_text(e) != why:
                        display.vvv('\noriginal msg: %s' % why)
            else:
                display.display("to see the full traceback, use -vvv")
                log_only = True
            display.display(u"the full traceback was:\n\n%s" % to_text(traceback.format_exc()), log_only=log_only)
            exit_code = 250
            exit_code = ExitCode.KEYBOARD_INTERRUPT
        except Exception as ex:
            try:
                raise AnsibleError("Unexpected Exception, this is probably a bug.") from ex
            except AnsibleError as ex2:
                # DTFIX-RELEASE: clean this up so we're not hacking the internals- re-wrap in an AnsibleCLIUnhandledError that always shows TB, or?
                from ansible.module_utils._internal import _traceback
                _traceback._is_traceback_enabled = lambda *_args, **_kwargs: True
                display.error(ex2)
                exit_code = ExitCode.UNKNOWN_ERROR

        sys.exit(exit_code)
@@ -6,6 +6,8 @@
from __future__ import annotations

import json

# ansible.cli needs to be imported first, to ensure the source bin/* scripts run that code first
from ansible.cli import CLI
from ansible import constants as C
@@ -15,10 +17,11 @@ from ansible.errors import AnsibleError, AnsibleOptionsError, AnsibleParserError
from ansible.executor.task_queue_manager import TaskQueueManager
from ansible.module_utils.common.text.converters import to_text
from ansible.parsing.splitter import parse_kv
from ansible.parsing.utils.yaml import from_yaml
from ansible.playbook import Playbook
from ansible.playbook.play import Play
from ansible._internal._datatag._tags import Origin
from ansible.utils.display import Display
from ansible._internal._json._profiles import _legacy

display = Display()
@@ -78,7 +81,7 @@ class AdHocCLI(CLI):
        module_args = None
        if module_args_raw and module_args_raw.startswith('{') and module_args_raw.endswith('}'):
            try:
                module_args = from_yaml(module_args_raw.strip(), json_only=True)
                module_args = json.loads(module_args_raw, cls=_legacy.Decoder)
            except AnsibleParserError:
                pass
@@ -88,6 +91,8 @@
        mytask = {'action': {'module': context.CLIARGS['module_name'], 'args': module_args},
                  'timeout': context.CLIARGS['task_timeout']}

        mytask = Origin(description=f'<adhoc {context.CLIARGS["module_name"]!r} task>').tag(mytask)

        # avoid adding to tasks that don't support it, unless set, then give user an error
        if context.CLIARGS['module_name'] not in C._ACTION_ALL_INCLUDE_ROLE_TASKS and any(frozenset((async_val, poll))):
            mytask['async_val'] = async_val
@@ -4,12 +4,17 @@
from __future__ import annotations

import copy
import dataclasses
import inspect
import operator
import argparse
import os
import os.path
import sys
import time
import typing as t

import yaml

from jinja2 import __version__ as j2_version
@@ -20,6 +25,8 @@ from ansible.module_utils.common.yaml import HAS_LIBYAML, yaml_load
from ansible.release import __version__
from ansible.utils.path import unfrackpath

from ansible._internal._datatag._tags import TrustedAsTemplate, Origin


#
# Special purpose OptionParsers
@@ -30,13 +37,115 @@ class SortingHelpFormatter(argparse.HelpFormatter):
         super(SortingHelpFormatter, self).add_arguments(actions)


+@dataclasses.dataclass(frozen=True, kw_only=True)
+class DeprecatedArgument:
+    version: str
+    """The Ansible version that will remove the deprecated argument."""
+
+    option: str | None = None
+    """The specific option string that is deprecated; None applies to all options for this argument."""
+
+    def is_deprecated(self, option: str) -> bool:
+        """Return True if the given option is deprecated, otherwise False."""
+        return self.option is None or option == self.option
+
+    def check(self, option: str) -> None:
+        """Display a deprecation warning if the given option is deprecated."""
+        if not self.is_deprecated(option):
+            return
+
+        from ansible.utils.display import Display
+
+        Display().deprecated(f'The {option!r} argument is deprecated.', version=self.version)
+
+
 class ArgumentParser(argparse.ArgumentParser):
-    def add_argument(self, *args, **kwargs):
+    def __init__(self, *args, **kwargs) -> None:
+        self.__actions: dict[str | None, type[argparse.Action]] = {}
+
+        super().__init__(*args, **kwargs)
+
+    def register(self, registry_name, value, object):
+        """Track registration of actions so that they can be resolved later by name, without depending on the internals of ArgumentParser."""
+        if registry_name == 'action':
+            self.__actions[value] = object
+
+        super().register(registry_name, value, object)
+
+    def _patch_argument(self, args: tuple[str, ...], kwargs: dict[str, t.Any]) -> None:
+        """
+        Patch `kwargs` for an `add_argument` call using the given `args` and `kwargs`.
+        This is used to apply tags to entire categories of CLI arguments.
+        """
+        name = args[0]
+        action = kwargs.get('action')
+        resolved_action = self.__actions.get(action, action)  # get the action by name, or use as-is (assume it's a subclass of argparse.Action)
+        action_signature = inspect.signature(resolved_action.__init__)
+
+        if action_signature.parameters.get('type'):
+            arg_type = kwargs.get('type', str)
+
+            if not callable(arg_type):
+                raise ValueError(f'Argument {name!r} requires a callable for the {"type"!r} parameter, not {arg_type!r}.')
+
+            wrapped_arg_type = _tagged_type_factory(name, arg_type)
+
+            kwargs.update(type=wrapped_arg_type)
+
+    def _patch_parser(self, parser):
+        """Patch and return the given parser to intercept the `add_argument` method for further patching."""
+        parser_add_argument = parser.add_argument
+
+        def add_argument(*ag_args, **ag_kwargs):
+            self._patch_argument(ag_args, ag_kwargs)
+
+            parser_add_argument(*ag_args, **ag_kwargs)
+
+        parser.add_argument = add_argument
+
+        return parser
+
+    def add_subparsers(self, *args, **kwargs):
+        sub = super().add_subparsers(*args, **kwargs)
+        sub_add_parser = sub.add_parser
+
+        def add_parser(*sub_args, **sub_kwargs):
+            return self._patch_parser(sub_add_parser(*sub_args, **sub_kwargs))
+
+        sub.add_parser = add_parser
+
+        return sub
+
+    def add_argument_group(self, *args, **kwargs):
+        return self._patch_parser(super().add_argument_group(*args, **kwargs))
+
+    def add_mutually_exclusive_group(self, *args, **kwargs):
+        return self._patch_parser(super().add_mutually_exclusive_group(*args, **kwargs))
+
+    def add_argument(self, *args, **kwargs) -> argparse.Action:
         action = kwargs.get('action')
         help = kwargs.get('help')
         if help and action in {'append', 'append_const', 'count', 'extend', PrependListAction}:
             help = f'{help.rstrip(".")}. This argument may be specified multiple times.'
             kwargs['help'] = help
+
+        self._patch_argument(args, kwargs)
+
+        deprecated: DeprecatedArgument | None
+
+        if deprecated := kwargs.pop('deprecated', None):
+            action_type = self.__actions.get(action, action)
+
+            class DeprecatedAction(action_type):  # type: ignore[misc, valid-type]
+                """A wrapper around an action which handles deprecation warnings."""
+
+                def __call__(self, parser, namespace, values, option_string=None) -> t.Any:
+                    deprecated.check(option_string)
+
+                    return super().__call__(parser, namespace, values, option_string)
+
+            kwargs['action'] = DeprecatedAction
+
         return super().add_argument(*args, **kwargs)

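The `DeprecatedArgument`/`DeprecatedAction` wiring above — a wrapper action that warns before delegating to the real action — can be sketched with plain `argparse`. This is a simplified stand-in, not Ansible's implementation; all names below are illustrative:

```python
import argparse
import warnings


class DeprecatedStore(argparse.Action):
    """A 'store'-style action that emits a deprecation warning when its option is used."""

    def __call__(self, parser, namespace, values, option_string=None):
        warnings.warn(f'The {option_string!r} argument is deprecated.', DeprecationWarning)
        setattr(namespace, self.dest, values)  # same effect as the default 'store' action


parser = argparse.ArgumentParser()
parser.add_argument('--inventory', '--inventory-file', dest='inventory', action=DeprecatedStore)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')
    args = parser.parse_args(['--inventory-file', 'hosts'])

print(args.inventory)
```

The real code goes further by subclassing whatever action class was requested (including by registry name), so deprecation can be layered onto any action type, and by checking a specific option string so only one alias of an argument is deprecated.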
@@ -182,13 +291,28 @@ def version(prog=None):
        cpath = "Default w/o overrides"
    else:
        cpath = C.DEFAULT_MODULE_PATH

+    if HAS_LIBYAML:
+        libyaml_fragment = "with libyaml"
+
+        # noinspection PyBroadException
+        try:
+            from yaml._yaml import get_version_string
+
+            libyaml_fragment += f" v{get_version_string()}"
+        except Exception:  # pylint: disable=broad-except
+            libyaml_fragment += ", version unknown"
+    else:
+        libyaml_fragment = "without libyaml"
+
    result.append("  configured module search path = %s" % cpath)
    result.append("  ansible python module location = %s" % ':'.join(ansible.__path__))
    result.append("  ansible collection location = %s" % ':'.join(C.COLLECTIONS_PATHS))
    result.append("  executable location = %s" % sys.argv[0])
    result.append("  python version = %s (%s)" % (''.join(sys.version.splitlines()), to_native(sys.executable)))
    result.append("  jinja version = %s" % j2_version)
-    result.append("  libyaml = %s" % HAS_LIBYAML)
+    result.append(f"  pyyaml version = {yaml.__version__} ({libyaml_fragment})")

    return "\n".join(result)

@@ -292,7 +416,8 @@ def add_fork_options(parser):
 def add_inventory_options(parser):
     """Add options for commands that utilize inventory"""
     parser.add_argument('-i', '--inventory', '--inventory-file', dest='inventory', action="append",
-                        help="specify inventory host path or comma separated host list. --inventory-file is deprecated")
+                        help="specify inventory host path or comma separated host list",
+                        deprecated=DeprecatedArgument(version='2.23', option='--inventory-file'))
     parser.add_argument('--list-hosts', dest='listhosts', action='store_true',
                         help='outputs a list of matching hosts; does not execute anything else')
     parser.add_argument('-l', '--limit', default=C.DEFAULT_SUBSET, dest='subset',

@@ -318,9 +443,9 @@ def add_module_options(parser):
 def add_output_options(parser):
     """Add options for commands which can change their output"""
     parser.add_argument('-o', '--one-line', dest='one_line', action='store_true',
-                        help='condense output')
+                        help='condense output', deprecated=DeprecatedArgument(version='2.23'))
     parser.add_argument('-t', '--tree', dest='tree', default=None,
-                        help='log output to this directory')
+                        help='log output to this directory', deprecated=DeprecatedArgument(version='2.23'))


 def add_runas_options(parser):

@@ -396,3 +521,25 @@ def add_vault_options(parser):
                             help='ask for vault password')
     base_group.add_argument('--vault-password-file', '--vault-pass-file', default=[], dest='vault_password_files',
                             help="vault password file", type=unfrack_path(follow=False), action='append')
+
+
+def _tagged_type_factory(name: str, func: t.Callable[[str], object], /) -> t.Callable[[str], object]:
+    """
+    Return a callable that wraps the given function.
+    The result of the wrapped function will be tagged with Origin.
+    It will also be tagged with TrustedAsTemplate if it is equal to the original input string.
+    """
+    def tag_value(value: str) -> object:
+        result = func(value)
+
+        if result is value:
+            # Values which are not mutated are automatically trusted for templating.
+            # The `is` reference equality is critically important, as other types may only alter the tags, so object equality is
+            # not sufficient to prevent them being tagged as trusted when they should not.
+            result = TrustedAsTemplate().tag(result)
+
+        return Origin(description=f'<CLI option {name!r}>').tag(result)
+
+    tag_value._name = name  # simplify debugging by attaching the argument name to the function
+
+    return tag_value

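The identity check in `_tagged_type_factory` (`result is value`, not `result == value`) is the subtle part: only a value the `type` callable returned unchanged should be trusted for templating. A minimal sketch of the idea, using a hypothetical `TaggedStr` stand-in for Ansible's internal data-tagging types:

```python
class TaggedStr(str):
    """A str subclass carrying a set of tags (a stand-in for Ansible's data tagging)."""

    tags: frozenset = frozenset()

    def tag(self, *new_tags):
        result = TaggedStr(self)
        result.tags = self.tags | frozenset(new_tags)
        return result


def tagged_type_factory(name, func):
    def tag_value(value: str):
        result = func(value)
        # Identity, not equality: an equal-but-new object may have been rebuilt
        # (and stripped of meaning) by the type callable, so it must not be trusted.
        trusted = result is value
        tagged = TaggedStr(result).tag(f'origin:{name}')
        if trusted:
            tagged = tagged.tag('trusted')
        return tagged
    return tag_value


identity = tagged_type_factory('--limit', lambda value: value)  # returns its input unchanged
stripping = tagged_type_factory('--limit', str.strip)           # returns a new, equal-looking object
```

With these definitions, `identity('x')` carries the `trusted` tag while `stripping(' x ')` does not, even though both end up equal to `'x'` as plain strings.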
@@ -10,7 +10,6 @@ from ansible.cli import CLI

 import os
 import shlex
 import subprocess
 import sys
-import yaml

@@ -24,7 +23,7 @@ from ansible.cli.arguments import option_helpers as opt_help
 from ansible.config.manager import ConfigManager
 from ansible.errors import AnsibleError, AnsibleOptionsError, AnsibleRequiredOptionError
 from ansible.module_utils.common.text.converters import to_native, to_text, to_bytes
-from ansible.module_utils.common.json import json_dump
+from ansible._internal import _json
 from ansible.module_utils.six import string_types
 from ansible.parsing.quoting import is_quoted
 from ansible.parsing.yaml.dumper import AnsibleDumper

@@ -178,8 +177,6 @@ class ConfigCLI(CLI):
         except Exception:
             if context.CLIARGS['action'] in ['view']:
                 raise
-            elif context.CLIARGS['action'] in ['edit', 'update']:
-                display.warning("File does not exist, used empty file: %s" % self.config_file)

         elif context.CLIARGS['action'] == 'view':
             raise AnsibleError('Invalid or no config file was supplied')

@@ -187,30 +184,6 @@ class ConfigCLI(CLI):
         # run the requested action
         context.CLIARGS['func']()

-    def execute_update(self):
-        """
-        Updates a single setting in the specified ansible.cfg
-        """
-        raise AnsibleError("Option not implemented yet")
-
-        # pylint: disable=unreachable
-        if context.CLIARGS['setting'] is None:
-            raise AnsibleOptionsError("update option requires a setting to update")
-
-        (entry, value) = context.CLIARGS['setting'].split('=')
-        if '.' in entry:
-            (section, option) = entry.split('.')
-        else:
-            section = 'defaults'
-            option = entry
-        subprocess.call([
-            'ansible',
-            '-m', 'ini_file',
-            'localhost',
-            '-c', 'local',
-            '-a', '"dest=%s section=%s option=%s value=%s backup=yes"' % (self.config_file, section, option, value)
-        ])
-
     def execute_view(self):
         """
         Displays the current config file

@@ -221,20 +194,6 @@ class ConfigCLI(CLI):
         except Exception as e:
             raise AnsibleError("Failed to open config file: %s" % to_native(e))

-    def execute_edit(self):
-        """
-        Opens ansible.cfg in the default EDITOR
-        """
-        raise AnsibleError("Option not implemented yet")
-
-        # pylint: disable=unreachable
-        try:
-            editor = shlex.split(C.config.get_config_value('EDITOR'))
-            editor.append(self.config_file)
-            subprocess.call(editor)
-        except Exception as e:
-            raise AnsibleError("Failed to open editor: %s" % to_native(e))
-
     def _list_plugin_settings(self, ptype, plugins=None):
         entries = {}
         loader = getattr(plugin_loader, '%s_loader' % ptype)

@@ -302,7 +261,7 @@ class ConfigCLI(CLI):
         if context.CLIARGS['format'] == 'yaml':
             output = yaml_dump(config_entries)
         elif context.CLIARGS['format'] == 'json':
-            output = json_dump(config_entries)
+            output = _json.json_dumps_formatted(config_entries)

         self.pager(to_text(output, errors='surrogate_or_strict'))

@@ -495,16 +454,17 @@ class ConfigCLI(CLI):
         # Add base
         config = self.config.get_configuration_definitions(ignore_private=True)
         # convert to settings
+        settings = {}
         for setting in config.keys():
             v, o = C.config.get_config_value_and_origin(setting, cfile=self.config_file, variables=get_constants())
-            config[setting] = {
+            settings[setting] = {
                 'name': setting,
                 'value': v,
                 'origin': o,
                 'type': None
             }

-        return self._render_settings(config)
+        return self._render_settings(settings)

     def _get_plugin_configs(self, ptype, plugins):

@@ -659,7 +619,7 @@ class ConfigCLI(CLI):
         if context.CLIARGS['format'] == 'yaml':
             text = yaml_dump(output)
         elif context.CLIARGS['format'] == 'json':
-            text = json_dump(output)
+            text = _json.json_dumps_formatted(output)

         self.pager(to_text(text, errors='surrogate_or_strict'))

@@ -29,6 +29,7 @@ from ansible.plugins.list import list_plugins
 from ansible.plugins.loader import module_loader, fragment_loader
 from ansible.utils import plugin_docs
 from ansible.utils.color import stringc
+from ansible._internal._datatag._tags import TrustedAsTemplate
 from ansible.utils.display import Display

 display = Display()

@@ -181,6 +182,8 @@ class ConsoleCLI(CLI, cmd.Cmd):
         else:
             module_args = ''

+        module_args = TrustedAsTemplate().tag(module_args)
+
         if self.callback:
             cb = self.callback
         elif C.DEFAULT_LOAD_CALLBACK_PLUGINS and C.DEFAULT_STDOUT_CALLBACK != 'default':

@@ -239,11 +242,8 @@ class ConsoleCLI(CLI, cmd.Cmd):
         except KeyboardInterrupt:
             display.error('User interrupted execution')
             return False
-        except Exception as e:
-            if self.verbosity >= 3:
-                import traceback
-                display.v(traceback.format_exc())
-            display.error(to_text(e))
+        except Exception as ex:
+            display.error(ex)
             return False

     def emptyline(self):

@@ -15,7 +15,8 @@ import os
 import os.path
 import re
 import textwrap
-import traceback
+
+import yaml

 import ansible.plugins.loader as plugin_loader

@@ -28,12 +29,12 @@ from ansible.collections.list import list_collection_dirs
 from ansible.errors import AnsibleError, AnsibleOptionsError, AnsibleParserError, AnsiblePluginNotFound
 from ansible.module_utils.common.text.converters import to_native, to_text
 from ansible.module_utils.common.collections import is_sequence
-from ansible.module_utils.common.json import json_dump
 from ansible.module_utils.common.yaml import yaml_dump
 from ansible.module_utils.six import string_types
 from ansible.parsing.plugin_docs import read_docstub
-from ansible.parsing.utils.yaml import from_yaml
 from ansible.parsing.yaml.dumper import AnsibleDumper
+from ansible.parsing.yaml.loader import AnsibleLoader
+from ansible._internal._yaml._loader import AnsibleInstrumentedLoader
 from ansible.plugins.list import list_plugins
 from ansible.plugins.loader import action_loader, fragment_loader
 from ansible.utils.collection_loader import AnsibleCollectionConfig, AnsibleCollectionRef

@@ -41,6 +42,8 @@ from ansible.utils.collection_loader._collection_finder import _get_collection_n
 from ansible.utils.color import stringc
 from ansible.utils.display import Display
 from ansible.utils.plugin_docs import get_plugin_docs, get_docstring, get_versioned_doclink
+from ansible.template import trust_as_template
+from ansible._internal import _json

 display = Display()

@@ -83,10 +86,9 @@ ref_style = {

 def jdump(text):
     try:
-        display.display(json_dump(text))
-    except TypeError as e:
-        display.vvv(traceback.format_exc())
-        raise AnsibleError('We could not convert all the documentation into JSON as there was a conversion issue: %s' % to_native(e))
+        display.display(_json.json_dumps_formatted(text))
+    except TypeError as ex:
+        raise AnsibleError('We could not convert all the documentation into JSON as there was a conversion issue.') from ex


 class RoleMixin(object):

@@ -129,11 +131,11 @@ class RoleMixin(object):

         try:
             with open(path, 'r') as f:
-                data = from_yaml(f.read(), file_name=path)
+                data = yaml.load(trust_as_template(f), Loader=AnsibleLoader)
             if data is None:
                 data = {}
-        except (IOError, OSError) as e:
-            raise AnsibleParserError("Could not read the role '%s' (at %s)" % (role_name, path), orig_exc=e)
+        except (IOError, OSError) as ex:
+            raise AnsibleParserError(f"Could not read the role {role_name!r} (at {path}).") from ex

         return data

@@ -697,16 +699,16 @@ class DocCLI(CLI, RoleMixin):
                     display.warning("Skipping role '%s' due to: %s" % (role, role_json[role]['error']), True)
                     continue
                 text += self.get_role_man_text(role, role_json[role])
-            except AnsibleParserError as e:
+            except AnsibleError as ex:
                 # TODO: warn and skip role?
-                raise AnsibleParserError("Role '%s" % (role), orig_exc=e)
+                raise AnsibleParserError(f"Error extracting role docs from {role!r}.") from ex

         # display results
         DocCLI.pager("\n".join(text))

     @staticmethod
     def _list_keywords():
-        return from_yaml(pkgutil.get_data('ansible', 'keyword_desc.yml'))
+        return yaml.load(pkgutil.get_data('ansible', 'keyword_desc.yml'), Loader=AnsibleInstrumentedLoader)

     @staticmethod
     def _get_keywords_docs(keys):

@@ -769,10 +771,8 @@ class DocCLI(CLI, RoleMixin):

             data[key] = kdata

-        except (AttributeError, KeyError) as e:
-            display.warning("Skipping Invalid keyword '%s' specified: %s" % (key, to_text(e)))
-            if display.verbosity >= 3:
-                display.verbose(traceback.format_exc())
+        except (AttributeError, KeyError) as ex:
+            display.error_as_warning(f'Skipping invalid keyword {key!r}.', ex)

         return data

@@ -820,16 +820,19 @@ class DocCLI(CLI, RoleMixin):
             except AnsiblePluginNotFound as e:
                 display.warning(to_native(e))
                 continue
-            except Exception as e:
+            except Exception as ex:
+                msg = "Missing documentation (or could not parse documentation)"
+
                 if not fail_on_errors:
-                    plugin_docs[plugin] = {'error': 'Missing documentation or could not parse documentation: %s' % to_native(e)}
+                    plugin_docs[plugin] = {'error': f'{msg}: {ex}.'}
                     continue
-                display.vvv(traceback.format_exc())
-                msg = "%s %s missing documentation (or could not parse documentation): %s\n" % (plugin_type, plugin, to_native(e))
+
+                msg = f"{plugin_type} {plugin} {msg}"
+
                 if fail_ok:
-                    display.warning(msg)
+                    display.warning(f'{msg}: {ex}')
                 else:
-                    raise AnsibleError(msg)
+                    raise AnsibleError(f'{msg}.') from ex

             if not doc:
                 # The doc section existed but was empty

@@ -841,9 +844,9 @@ class DocCLI(CLI, RoleMixin):
             if not fail_on_errors:
                 # Check whether JSON serialization would break
                 try:
-                    json_dump(docs)
-                except Exception as e:  # pylint:disable=broad-except
-                    plugin_docs[plugin] = {'error': 'Cannot serialize documentation as JSON: %s' % to_native(e)}
+                    _json.json_dumps_formatted(docs)
+                except Exception as ex:  # pylint:disable=broad-except
+                    plugin_docs[plugin] = {'error': f'Cannot serialize documentation as JSON: {ex}'}
                     continue

             plugin_docs[plugin] = docs

@@ -1016,9 +1019,8 @@ class DocCLI(CLI, RoleMixin):
         try:
             doc, __, __, __ = get_docstring(filename, fragment_loader, verbose=(context.CLIARGS['verbosity'] > 0),
                                             collection_name=collection_name, plugin_type=plugin_type)
-        except Exception:
-            display.vvv(traceback.format_exc())
-            raise AnsibleError("%s %s at %s has a documentation formatting error or is missing documentation." % (plugin_type, plugin_name, filename))
+        except Exception as ex:
+            raise AnsibleError(f"{plugin_type} {plugin_name} at {filename!r} has a documentation formatting error or is missing documentation.") from ex

         if doc is None:
             # Removed plugins don't have any documentation

@@ -1094,9 +1096,8 @@ class DocCLI(CLI, RoleMixin):

         try:
             text = DocCLI.get_man_text(doc, collection_name, plugin_type)
-        except Exception as e:
-            display.vvv(traceback.format_exc())
-            raise AnsibleError("Unable to retrieve documentation from '%s'" % (plugin), orig_exc=e)
+        except Exception as ex:
+            raise AnsibleError(f"Unable to retrieve documentation from {plugin!r}.") from ex

         return text

@@ -1508,8 +1509,8 @@ class DocCLI(CLI, RoleMixin):
         else:
             try:
                 text.append(yaml_dump(doc.pop('plainexamples'), indent=2, default_flow_style=False))
-            except Exception as e:
-                raise AnsibleParserError("Unable to parse examples section", orig_exc=e)
+            except Exception as ex:
+                raise AnsibleParserError("Unable to parse examples section.") from ex

         if doc.get('returndocs', False):
             text.append('')

@@ -53,10 +53,12 @@ from ansible.module_utils.ansible_release import __version__ as ansible_version
 from ansible.module_utils.common.collections import is_iterable
 from ansible.module_utils.common.yaml import yaml_dump, yaml_load
 from ansible.module_utils.common.text.converters import to_bytes, to_native, to_text
+from ansible._internal._datatag._tags import TrustedAsTemplate
 from ansible.module_utils import six
 from ansible.parsing.dataloader import DataLoader
 from ansible.playbook.role.requirement import RoleRequirement
-from ansible.template import Templar
+from ansible._internal._templating._engine import TemplateEngine
+from ansible.template import trust_as_template
 from ansible.utils.collection_loader import AnsibleCollectionConfig
 from ansible.utils.display import Display
 from ansible.utils.plugin_docs import get_versioned_doclink

@@ -915,8 +917,8 @@ class GalaxyCLI(CLI):

     @staticmethod
     def _get_skeleton_galaxy_yml(template_path, inject_data):
-        with open(to_bytes(template_path, errors='surrogate_or_strict'), 'rb') as template_obj:
-            meta_template = to_text(template_obj.read(), errors='surrogate_or_strict')
+        with open(to_bytes(template_path, errors='surrogate_or_strict'), 'r') as template_obj:
+            meta_template = TrustedAsTemplate().tag(to_text(template_obj.read(), errors='surrogate_or_strict'))

         galaxy_meta = get_collections_galaxy_meta_info()

@@ -952,7 +954,7 @@ class GalaxyCLI(CLI):
             return textwrap.fill(v, width=117, initial_indent="# ", subsequent_indent="# ", break_on_hyphens=False)

         loader = DataLoader()
-        templar = Templar(loader, variables={'required_config': required_config, 'optional_config': optional_config})
+        templar = TemplateEngine(loader, variables={'required_config': required_config, 'optional_config': optional_config})
         templar.environment.filters['comment_ify'] = comment_ify

         meta_value = templar.template(meta_template)

@@ -1154,7 +1156,7 @@ class GalaxyCLI(CLI):

         loader = DataLoader()
         inject_data.update(load_extra_vars(loader))
-        templar = Templar(loader, variables=inject_data)
+        templar = TemplateEngine(loader, variables=inject_data)

         # create role directory
         if not os.path.exists(b_obj_path):

@@ -1196,7 +1198,7 @@ class GalaxyCLI(CLI):
                 elif ext == ".j2" and not in_templates_dir:
                     src_template = os.path.join(root, f)
                     dest_file = os.path.join(obj_path, rel_root, filename)
-                    template_data = to_text(loader._get_file_contents(src_template)[0], errors='surrogate_or_strict')
+                    template_data = trust_as_template(loader.get_text_file_contents(src_template))
                     try:
                         b_rendered = to_bytes(templar.template(template_data), errors='surrogate_or_strict')
                     except AnsibleError as e:

@@ -1764,6 +1766,8 @@ class GalaxyCLI(CLI):

         return 0

+    _task_check_delay_sec = 10  # allows unit test override
+
     def execute_import(self):
         """ used to import a role into Ansible Galaxy """

@@ -1817,7 +1821,7 @@ class GalaxyCLI(CLI):
                     rc = ['SUCCESS', 'FAILED'].index(state)
                     finished = True
                 else:
-                    time.sleep(10)
+                    time.sleep(self._task_check_delay_sec)

         return rc

@@ -9,15 +9,19 @@ from __future__ import annotations
 # ansible.cli needs to be imported first, to ensure the source bin/* scripts run that code first
 from ansible.cli import CLI

+import json
 import sys
+import typing as t

 import argparse
+import functools

 from ansible import constants as C
 from ansible import context
 from ansible.cli.arguments import option_helpers as opt_help
-from ansible.errors import AnsibleError, AnsibleOptionsError
+from ansible.errors import AnsibleError, AnsibleOptionsError, AnsibleRuntimeError
 from ansible.module_utils.common.text.converters import to_bytes, to_native, to_text
+from ansible._internal._json._profiles import _inventory_legacy
 from ansible.utils.vars import combine_vars
 from ansible.utils.display import Display
 from ansible.vars.plugins import get_vars_from_inventory_sources, get_vars_from_path

@@ -156,34 +160,17 @@ class InventoryCLI(CLI):

     @staticmethod
     def dump(stuff):

         if context.CLIARGS['yaml']:
             import yaml
             from ansible.parsing.yaml.dumper import AnsibleDumper
-            results = to_text(yaml.dump(stuff, Dumper=AnsibleDumper, default_flow_style=False, allow_unicode=True))
+
+            # DTFIX-RELEASE: need shared infra to smuggle custom kwargs to dumpers, since yaml.dump cannot (as of PyYAML 6.0.1)
+            dumper = functools.partial(AnsibleDumper, dump_vault_tags=True)
+            results = to_text(yaml.dump(stuff, Dumper=dumper, default_flow_style=False, allow_unicode=True))
         elif context.CLIARGS['toml']:
-            from ansible.plugins.inventory.toml import toml_dumps
-            try:
-                results = toml_dumps(stuff)
-            except TypeError as e:
-                raise AnsibleError(
-                    'The source inventory contains a value that cannot be represented in TOML: %s' % e
-                )
-            except KeyError as e:
-                raise AnsibleError(
-                    'The source inventory contains a non-string key (%s) which cannot be represented in TOML. '
-                    'The specified key will need to be converted to a string. Be aware that if your playbooks '
-                    'expect this key to be non-string, your playbooks will need to be modified to support this '
-                    'change.' % e.args[0]
-                )
+            results = toml_dumps(stuff)
         else:
-            import json
-            from ansible.parsing.ajson import AnsibleJSONEncoder
-            try:
-                results = json.dumps(stuff, cls=AnsibleJSONEncoder, sort_keys=True, indent=4, preprocess_unsafe=True, ensure_ascii=False)
-            except TypeError as e:
-                results = json.dumps(stuff, cls=AnsibleJSONEncoder, sort_keys=False, indent=4, preprocess_unsafe=True, ensure_ascii=False)
-                display.warning("Could not sort JSON output due to issues while sorting keys: %s" % to_native(e))
+            results = json.dumps(stuff, cls=_inventory_legacy.Encoder, sort_keys=True, indent=4)

         return results

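The `functools.partial(AnsibleDumper, dump_vault_tags=True)` trick above addresses a real limitation: `yaml.dump()` instantiates the `Dumper` class itself, so you cannot pass extra constructor keyword arguments through it. Pre-binding them with `functools.partial` works because a partial of a class is still a callable that yields instances. A minimal sketch with a hypothetical stand-in class:

```python
import functools


class Dumper:
    """Stand-in for a YAML-style Dumper class whose instances are created by the calling API."""

    def __init__(self, stream, dump_vault_tags=False):
        self.stream = stream
        self.dump_vault_tags = dump_vault_tags


# APIs like yaml.dump() call Dumper(stream) themselves, so the extra kwarg is
# "smuggled" in by pre-binding it; the partial is passed wherever a Dumper class is expected.
VaultTagDumper = functools.partial(Dumper, dump_vault_tags=True)

dumper = VaultTagDumper('<stream>')
```

The partial behaves like the original class for this purpose: calling it with the stream produces a fully configured instance.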
@@ -306,7 +293,11 @@ class InventoryCLI(CLI):
         results = format_group(top, frozenset(h.name for h in hosts))

         # populate meta
-        results['_meta'] = {'hostvars': {}}
+        results['_meta'] = {
+            'hostvars': {},
+            'profile': _inventory_legacy.Encoder.profile_name,
+        }

         for host in hosts:
             hvars = self._get_host_variables(host)
             if hvars:

@@ -409,6 +400,17 @@ class InventoryCLI(CLI):
         return results


+def toml_dumps(data: t.Any) -> str:
+    try:
+        from tomli_w import dumps as _tomli_w_dumps
+    except ImportError:
+        pass
+    else:
+        return _tomli_w_dumps(data)
+
+    raise AnsibleRuntimeError('The Python library "tomli-w" is required when using the TOML output format.')
+
+
 def main(args=None):
     InventoryCLI.cli_executor(args)

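The new `toml_dumps` uses the try/except/else deferred-import shape: the optional dependency is imported only when the feature is used, and a clear runtime error replaces a raw `ImportError`. The same shape can be demonstrated with modules that are guaranteed present or absent (names here are illustrative, not Ansible's):

```python
import importlib


def optional_dumps(data, module_name):
    """Serialize `data` with an optional writer module, failing clearly when it is absent."""
    try:
        impl = importlib.import_module(module_name)  # deferred: only attempted when actually needed
    except ImportError:
        pass
    else:
        return impl.dumps(data)

    raise RuntimeError(f'The Python library {module_name!r} is required for this output format.')


print(optional_dumps({'a': 1}, 'json'))  # stdlib json stands in for an optional writer
```

Raising from outside the `except` block (rather than inside it) keeps the eventual traceback free of the unhelpful chained `ImportError`.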
@@ -21,7 +21,7 @@ from ansible.cli.arguments import option_helpers as opt_help
 from ansible.module_utils.common.text.converters import to_bytes, to_text
 from ansible.module_utils.connection import Connection, ConnectionError, send_data, recv_data
 from ansible.module_utils.service import fork_process
-from ansible.parsing.ajson import AnsibleJSONEncoder, AnsibleJSONDecoder
+from ansible.module_utils._internal._json._profiles import _tagless
 from ansible.playbook.play_context import PlayContext
 from ansible.plugins.loader import connection_loader, init_plugin_loader
 from ansible.utils.path import unfrackpath, makedirs_safe

@@ -110,7 +110,7 @@ class ConnectionProcess(object):
             result['exception'] = traceback.format_exc()
         finally:
             result['messages'] = messages
-            self.fd.write(json.dumps(result, cls=AnsibleJSONEncoder))
+            self.fd.write(json.dumps(result, cls=_tagless.Encoder))
             self.fd.close()

     def run(self):

@@ -292,7 +292,7 @@ def main(args=None):
     else:
         os.close(w)
         rfd = os.fdopen(r, 'r')
-        data = json.loads(rfd.read(), cls=AnsibleJSONDecoder)
+        data = json.loads(rfd.read(), cls=_tagless.Decoder)
         messages.extend(data.pop('messages'))
         result.update(data)

@@ -330,10 +330,10 @@ def main(args=None):
     sys.stdout = saved_stdout
     if 'exception' in result:
         rc = 1
-        sys.stderr.write(json.dumps(result, cls=AnsibleJSONEncoder))
+        sys.stderr.write(json.dumps(result, cls=_tagless.Encoder))
     else:
         rc = 0
-        sys.stdout.write(json.dumps(result, cls=AnsibleJSONEncoder))
+        sys.stdout.write(json.dumps(result, cls=_tagless.Encoder))

     sys.exit(rc)

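The recurring `json.dumps(..., cls=SomeProfile.Encoder)` pattern in these hunks swaps serialization behavior by passing an encoder class rather than pre-processing the data. A minimal sketch of such a "profile" encoder (a hypothetical stand-in, not Ansible's `_tagless` or `_inventory_legacy` profiles):

```python
import datetime
import json


class ProfileEncoder(json.JSONEncoder):
    """A stand-in serialization 'profile': maps non-JSON-native types to plain values."""

    def default(self, o):
        if isinstance(o, datetime.datetime):
            return o.isoformat()      # datetimes become ISO 8601 strings
        if isinstance(o, set):
            return sorted(o)          # sets become sorted lists
        return super().default(o)     # anything else still raises TypeError


result = {'when': datetime.datetime(2025, 1, 2, 3, 4, 5), 'hosts': {'b', 'a'}}
print(json.dumps(result, cls=ProfileEncoder, sort_keys=True))
# {"hosts": ["a", "b"], "when": "2025-01-02T03:04:05"}
```

Because the policy lives in the encoder class, the same call sites can switch profiles (legacy, tagless, inventory) by changing only the `cls=` argument.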
@@ -228,6 +228,7 @@ class VaultCLI(CLI):
                                         vault_ids=new_vault_ids,
                                         vault_password_files=new_vault_password_files,
                                         ask_vault_pass=context.CLIARGS['ask_vault_pass'],
+                                        initialize_context=False,
                                         create_new_password=True)

         if not new_vault_secrets:

@@ -259,7 +260,7 @@ class VaultCLI(CLI):
             display.display("Reading plaintext input from stdin", stderr=True)

         for f in context.CLIARGS['args'] or ['-']:
-            # Fixme: use the correct vau
+            # FIXME: use the correct vau
             self.editor.encrypt_file(f, self.encrypt_secret,
                                      vault_id=self.encrypt_vault_id,
                                      output_file=context.CLIARGS['output_file'])

@@ -9,6 +9,38 @@ _ANSIBLE_CONNECTION_PATH:
     - For internal use only.
   type: path
   version_added: "2.18"
+ALLOW_BROKEN_CONDITIONALS:
+  # This config option will be deprecated once it no longer has any effect (2.23).
+  name: Allow broken conditionals
+  default: false
+  description:
+    - When enabled, this option allows conditionals with non-boolean results to be used.
+    - A deprecation warning will be emitted in these cases.
+    - By default, non-boolean conditionals result in an error.
+    - Such results often indicate unintentional use of templates where they are not supported, resulting in a conditional that is always true.
+    - When this option is enabled, conditional expressions which are a literal ``None`` or empty string will evaluate as true for backwards compatibility.
+  env: [{name: ANSIBLE_ALLOW_BROKEN_CONDITIONALS}]
+  ini:
+    - {key: allow_broken_conditionals, section: defaults}
+  type: boolean
+  version_added: "2.19"
+ALLOW_EMBEDDED_TEMPLATES:
+  name: Allow embedded templates
+  default: true
+  description:
+    - When enabled, this option allows embedded templates to be used for specific backward compatibility scenarios.
+    - A deprecation warning will be emitted in these cases.
+    - First, conditionals (for example, ``failed_when``, ``until``, ``assert.that``) fully enclosed in template delimiters.
+    - "Second, string constants in conditionals (for example, ``when: some_var == '{{ some_other_var }}'``)."
+    - Finally, positional arguments to lookups (for example, ``lookup('pipe', 'echo {{ some_var }}')``).
+    - This feature is deprecated, since embedded templates are unnecessary in these cases.
+    - When disabled, use of embedded templates will result in an error.
+    - A future release will disable this feature by default.
+  env: [{name: ANSIBLE_ALLOW_EMBEDDED_TEMPLATES}]
+  ini:
+    - {key: allow_embedded_templates, section: defaults}
+  type: boolean
+  version_added: "2.19"
 ANSIBLE_HOME:
   name: The Ansible home path
   description:
@@ -160,38 +192,50 @@ AGNOSTIC_BECOME_PROMPT:
   yaml: {key: privilege_escalation.agnostic_become_prompt}
   version_added: "2.5"
 CACHE_PLUGIN:
-  name: Persistent Cache plugin
+  name: Persistent Fact Cache plugin
   default: memory
-  description: Chooses which cache plugin to use, the default 'memory' is ephemeral.
+  description: Chooses which fact cache plugin to use. By default, no cache is used and facts do not persist between runs.
   env: [{name: ANSIBLE_CACHE_PLUGIN}]
   ini:
     - {key: fact_caching, section: defaults}
   yaml: {key: facts.cache.plugin}
 CACHE_PLUGIN_CONNECTION:
-  name: Cache Plugin URI
+  name: Fact Cache Plugin URI
   default: ~
-  description: Defines connection or path information for the cache plugin.
+  description: Defines connection or path information for the fact cache plugin.
   env: [{name: ANSIBLE_CACHE_PLUGIN_CONNECTION}]
   ini:
     - {key: fact_caching_connection, section: defaults}
   yaml: {key: facts.cache.uri}
 CACHE_PLUGIN_PREFIX:
-  name: Cache Plugin table prefix
+  name: Fact Cache Plugin table prefix
   default: ansible_facts
-  description: Prefix to use for cache plugin files/tables.
+  description: Prefix to use for fact cache plugin files/tables.
   env: [{name: ANSIBLE_CACHE_PLUGIN_PREFIX}]
   ini:
     - {key: fact_caching_prefix, section: defaults}
   yaml: {key: facts.cache.prefix}
 CACHE_PLUGIN_TIMEOUT:
-  name: Cache Plugin expiration timeout
+  name: Fact Cache Plugin expiration timeout
   default: 86400
-  description: Expiration timeout for the cache plugin data.
+  description: Expiration timeout for the fact cache plugin data.
   env: [{name: ANSIBLE_CACHE_PLUGIN_TIMEOUT}]
   ini:
     - {key: fact_caching_timeout, section: defaults}
   type: integer
   yaml: {key: facts.cache.timeout}
+_CALLBACK_DISPATCH_ERROR_BEHAVIOR:
+  name: Callback dispatch error behavior
+  default: warn
+  description:
+    - Action to take when a callback dispatch results in an error.
+  type: choices
+  choices: &choices_ignore_warn_fail
+    - ignore
+    - warn
+    - fail
+  env: [ { name: _ANSIBLE_CALLBACK_DISPATCH_ERROR_BEHAVIOR } ]
+  version_added: '2.19'
 COLLECTIONS_SCAN_SYS_PATH:
   name: Scan PYTHONPATH for installed collections
   description: A boolean to enable or disable scanning the sys.path for installed collections.
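Several new options above (`_CALLBACK_DISPATCH_ERROR_BEHAVIOR`, and the `_TEMPLAR_*` options later in this file) share the same `ignore`/`warn`/`fail` choice list via a YAML anchor. As a minimal sketch of how such a choice could be applied at a dispatch site (`handle_dispatch_error` is a hypothetical helper for illustration, not part of ansible-core):

```python
import warnings

# mirrors the &choices_ignore_warn_fail anchor defined in the config above
VALID_BEHAVIORS = ('ignore', 'warn', 'fail')


def handle_dispatch_error(behavior: str, message: str) -> None:
    """Apply an ignore/warn/fail configuration choice to an error condition."""
    if behavior == 'ignore':
        return
    if behavior == 'warn':
        warnings.warn(message)
        return
    if behavior == 'fail':
        raise RuntimeError(message)
    raise ValueError(f'unknown behavior {behavior!r}, expected one of {VALID_BEHAVIORS}')
```

The `default: warn` in the option definition means callers would surface the problem without aborting the run unless the user opts into `fail`.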
@@ -496,6 +540,10 @@ DEFAULT_ALLOW_UNSAFE_LOOKUPS:
     - {key: allow_unsafe_lookups, section: defaults}
   type: boolean
   version_added: "2.2.3"
+  deprecated:
+    why: This option is no longer used in the Ansible Core code base.
+    version: "2.23"
+    alternatives: Lookup plugins are responsible for tagging strings containing templates to allow evaluation as a template.
 DEFAULT_ASK_PASS:
   name: Ask for the login password
   default: False
@@ -755,15 +803,20 @@ DEFAULT_INVENTORY_PLUGIN_PATH:
 DEFAULT_JINJA2_EXTENSIONS:
   name: Enabled Jinja2 extensions
   default: []
   type: list
   description:
     - This is a developer-specific feature that allows enabling additional Jinja2 extensions.
     - "See the Jinja2 documentation for details. If you do not know what these do, you probably don't need to change this setting :)"
   env: [{name: ANSIBLE_JINJA2_EXTENSIONS}]
   ini:
     - {key: jinja2_extensions, section: defaults}
+  deprecated:
+    why: Jinja2 extensions have been deprecated
+    version: "2.23"
+    alternatives: Ansible-supported Jinja plugins (tests, filters, lookups)
 DEFAULT_JINJA2_NATIVE:
   name: Use Jinja2's NativeEnvironment for templating
-  default: False
+  default: True
   description: This option preserves variable types during template operations.
   env: [{name: ANSIBLE_JINJA2_NATIVE}]
   ini:
@@ -771,6 +824,10 @@ DEFAULT_JINJA2_NATIVE:
   type: boolean
   yaml: {key: jinja2_native}
   version_added: 2.7
+  deprecated:
+    why: This option is no longer used in the Ansible Core code base.
+    version: "2.23"
+    alternatives: Jinja2 native mode is now the default and only option.
 DEFAULT_KEEP_REMOTE_FILES:
   name: Keep remote files
   default: False
@@ -930,6 +987,10 @@ DEFAULT_NULL_REPRESENTATION:
   ini:
     - {key: null_representation, section: defaults}
   type: raw
+  deprecated:
+    why: This option is no longer used in the Ansible Core code base.
+    version: "2.23"
+    alternatives: There is no alternative at the moment. A different mechanism would have to be implemented in the current code base.
 DEFAULT_POLL_INTERVAL:
   name: Async poll interval
   default: 15
@@ -1129,6 +1190,10 @@ DEFAULT_UNDEFINED_VAR_BEHAVIOR:
   ini:
     - {key: error_on_undefined_vars, section: defaults}
   type: boolean
+  deprecated:
+    why: This option is no longer used in the Ansible Core code base.
+    version: "2.23"
+    alternatives: There is no alternative at the moment. A different mechanism would have to be implemented in the current code base.
 DEFAULT_VARS_PLUGIN_PATH:
   name: Vars Plugins Path
   default: '{{ ANSIBLE_HOME ~ "/plugins/vars:/usr/share/ansible/plugins/vars" }}'
@@ -1213,6 +1278,9 @@ DEPRECATION_WARNINGS:
   ini:
     - {key: deprecation_warnings, section: defaults}
   type: boolean
+  vars:
+    - name: ansible_deprecation_warnings
+      version_added: '2.19'
 DEVEL_WARNING:
   name: Running devel warning
   default: True
@@ -1266,6 +1334,22 @@ DISPLAY_SKIPPED_HOSTS:
   ini:
     - {key: display_skipped_hosts, section: defaults}
   type: boolean
+DISPLAY_TRACEBACK:
+  name: Control traceback display
+  default: never
+  description: When to include tracebacks in extended error messages
+  env:
+    - name: ANSIBLE_DISPLAY_TRACEBACK
+  ini:
+    - {key: display_traceback, section: defaults}
+  type: list
+  choices:
+    - error
+    - warning
+    - deprecated
+    - always
+    - never
+  version_added: "2.19"
 DOCSITE_ROOT_URL:
   name: Root docsite URL
   default: https://docs.ansible.com/ansible-core/
@@ -1916,6 +2000,10 @@ STRING_TYPE_FILTERS:
   ini:
     - {key: dont_type_filters, section: jinja2}
   type: list
+  deprecated:
+    why: This option has no effect.
+    version: "2.23"
+    alternatives: None; native types returned from filters are always preserved.
 SYSTEM_WARNINGS:
   name: System warnings
   default: True
@@ -1968,6 +2056,39 @@ TASK_TIMEOUT:
     - {key: task_timeout, section: defaults}
   type: integer
   version_added: '2.10'
+_TEMPLAR_UNKNOWN_TYPE_CONVERSION:
+  name: Templar unknown type conversion behavior
+  default: warn
+  description:
+    - Action to take when an unknown type is converted for variable storage during template finalization.
+    - This setting has no effect on the inability to store unsupported variable types as the result of templating.
+    - Experimental diagnostic feature, subject to change.
+  type: choices
+  choices: *choices_ignore_warn_fail
+  env: [{name: _ANSIBLE_TEMPLAR_UNKNOWN_TYPE_CONVERSION}]
+  version_added: '2.19'
+_TEMPLAR_UNKNOWN_TYPE_ENCOUNTERED:
+  name: Templar unknown type encountered behavior
+  default: ignore
+  description:
+    - Action to take when an unknown type is encountered inside a template pipeline.
+    - Experimental diagnostic feature, subject to change.
+  type: choices
+  choices: *choices_ignore_warn_fail
+  env: [{name: _ANSIBLE_TEMPLAR_UNKNOWN_TYPE_ENCOUNTERED}]
+  version_added: '2.19'
+_TEMPLAR_UNTRUSTED_TEMPLATE_BEHAVIOR:
+  name: Templar untrusted template behavior
+  default: ignore
+  description:
+    - Action to take when processing of an untrusted template is skipped.
+    - For `ignore` or `warn`, the input template string is returned as-is.
+    - This setting has no effect on expressions.
+    - Experimental diagnostic feature, subject to change.
+  type: choices
+  choices: *choices_ignore_warn_fail
+  env: [{name: _ANSIBLE_TEMPLAR_UNTRUSTED_TEMPLATE_BEHAVIOR}]
+  version_added: '2.19'
 WORKER_SHUTDOWN_POLL_COUNT:
   name: Worker Shutdown Poll Count
   default: 0
@@ -2030,6 +2151,12 @@ WIN_ASYNC_STARTUP_TIMEOUT:
   vars:
     - {name: ansible_win_async_startup_timeout}
   version_added: '2.10'
+WRAP_STDERR:
+  description: Control line-wrapping behavior on console warnings and errors from default output callbacks (eases pattern-based output testing)
+  env: [{name: ANSIBLE_WRAP_STDERR}]
+  default: false
+  type: bool
+  version_added: "2.19"
 YAML_FILENAME_EXTENSIONS:
   name: Valid YAML extensions
   default: [".yml", ".yaml", ".json"]
@@ -11,18 +11,18 @@ import os.path
 import sys
 import stat
 import tempfile
 import typing as t

 from collections.abc import Mapping, Sequence
 from jinja2.nativetypes import NativeEnvironment

-from ansible.errors import AnsibleOptionsError, AnsibleError, AnsibleRequiredOptionError
+from ansible.errors import AnsibleOptionsError, AnsibleError, AnsibleUndefinedConfigEntry, AnsibleRequiredOptionError
+from ansible.module_utils.common.sentinel import Sentinel
 from ansible.module_utils.common.text.converters import to_text, to_bytes, to_native
 from ansible.module_utils.common.yaml import yaml_load
 from ansible.module_utils.six import string_types
 from ansible.module_utils.parsing.convert_bool import boolean
 from ansible.parsing.quoting import unquote
-from ansible.parsing.yaml.objects import AnsibleVaultEncryptedUnicode
 from ansible.utils.path import cleanup_tmp_file, makedirs_safe, unfrackpath
@@ -50,14 +50,18 @@ GALAXY_SERVER_ADDITIONAL = {
 }


-def _get_entry(plugin_type, plugin_name, config):
-    """ construct entry for requested config """
-    entry = ''
+def _get_config_label(plugin_type: str, plugin_name: str, config: str) -> str:
+    """Return a label for the given config."""
+    entry = f'{config!r}'
+
     if plugin_type:
-        entry += 'plugin_type: %s ' % plugin_type
+        entry += ' for'
+
         if plugin_name:
-            entry += 'plugin: %s ' % plugin_name
-        entry += 'setting: %s ' % config
+            entry += f' {plugin_name!r}'
+
+        entry += f' {plugin_type} plugin'
+
     return entry
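The hunk above replaces the old `_get_entry` helper with `_get_config_label`, which builds a readable label for error messages. A hypothetical re-creation of that helper follows, to show the resulting label format; the nesting is reconstructed from the diff fragments above, so treat it as an illustration rather than the exact upstream code:

```python
def get_config_label(plugin_type, plugin_name, config):
    # Hypothetical re-creation of the private _get_config_label helper shown
    # in the diff above, for illustration only.
    entry = f'{config!r}'

    if plugin_type:
        entry += ' for'

        if plugin_name:
            entry += f' {plugin_name!r}'

        entry += f' {plugin_type} plugin'

    return entry


print(get_config_label(None, None, 'DEFAULT_TIMEOUT'))
print(get_config_label('cache', 'jsonfile', 'fact_caching_timeout'))
```

The `!r` conversions quote the names, so global settings render as `'DEFAULT_TIMEOUT'` and plugin settings as `'fact_caching_timeout' for 'jsonfile' cache plugin`.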
@@ -107,8 +111,8 @@ def ensure_type(value, value_type, origin=None, origin_ftype=None):
                 value = int_part
             else:
                 errmsg = 'int'
-        except decimal.DecimalException as e:
-            raise ValueError from e
+        except decimal.DecimalException:
+            errmsg = 'int'

     elif value_type == 'float':
         if not isinstance(value, float):
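The hunk above changes the `int` branch of `ensure_type` to record an `errmsg` on `decimal.DecimalException` instead of raising immediately, deferring to the common error path. A simplified, self-contained sketch of that coercion logic (`ensure_int` is an illustrative stand-in, not the real `ensure_type`):

```python
from decimal import Decimal, DecimalException


def ensure_int(value):
    # Sketch of ensure_type's 'int' branch (simplified): accept ints and
    # strings/floats with no fractional part, reject everything else.
    if isinstance(value, int) and not isinstance(value, bool):
        return value
    try:
        dec = Decimal(str(value))
    except DecimalException:
        # mirrors the errmsg = 'int' fallback in the hunk above
        raise ValueError(f"Invalid type provided for 'int': {value!r}")
    int_part = int(dec)
    if dec == int_part:
        return int_part
    raise ValueError(f"Invalid type provided for 'int': {value!r}")
```

Using `Decimal` rather than `float` avoids binary rounding surprises when deciding whether a value like `"4.0"` is an integer in disguise.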
@@ -167,7 +171,7 @@ def ensure_type(value, value_type, origin=None, origin_ftype=None):
             errmsg = 'dictionary'

     elif value_type in ('str', 'string'):
-        if isinstance(value, (string_types, AnsibleVaultEncryptedUnicode, bool, int, float, complex)):
+        if isinstance(value, (string_types, bool, int, float, complex)):
             value = to_text(value, errors='surrogate_or_strict')
             if origin_ftype and origin_ftype == 'ini':
                 value = unquote(value)
@@ -175,13 +179,13 @@ def ensure_type(value, value_type, origin=None, origin_ftype=None):
             errmsg = 'string'

     # defaults to string type
-    elif isinstance(value, (string_types, AnsibleVaultEncryptedUnicode)):
+    elif isinstance(value, (string_types)):
         value = to_text(value, errors='surrogate_or_strict')
         if origin_ftype and origin_ftype == 'ini':
             value = unquote(value)

     if errmsg:
-        raise ValueError(f'Invalid type provided for "{errmsg}": {value!r}')
+        raise ValueError(f'Invalid type provided for {errmsg!r}: {value!r}')

     return to_text(value, errors='surrogate_or_strict', nonstring='passthru')
@@ -369,6 +373,7 @@ class ConfigManager(object):
         # template default values if possible
         # NOTE: cannot use is_template due to circular dep
         try:
+            # FIXME: This really should be using an immutable sandboxed native environment, not just native environment
            t = NativeEnvironment().from_string(value)
            value = t.render(variables)
         except Exception:
@@ -494,10 +499,6 @@ class ConfigManager(object):
                     self.WARNINGS.add(u'value for config entry {0} contains invalid characters, ignoring...'.format(to_text(name)))
                     continue
                 if temp_value is not None:  # only set if entry is defined in container
-                    # inline vault variables should be converted to a text string
-                    if isinstance(temp_value, AnsibleVaultEncryptedUnicode):
-                        temp_value = to_text(temp_value, errors='surrogate_or_strict')
-
                     value = temp_value
                     origin = name
@@ -515,10 +516,14 @@ class ConfigManager(object):
                                                              keys=keys, variables=variables, direct=direct)
         except AnsibleError:
             raise
-        except Exception as e:
-            raise AnsibleError("Unhandled exception when retrieving %s:\n%s" % (config, to_native(e)), orig_exc=e)
+        except Exception as ex:
+            raise AnsibleError(f"Unhandled exception when retrieving {config!r}.") from ex
         return value

+    def get_config_default(self, config: str, plugin_type: str | None = None, plugin_name: str | None = None) -> t.Any:
+        """Return the default value for the specified configuration."""
+        return self.get_configuration_definitions(plugin_type, plugin_name)[config]['default']
+
     def get_config_value_and_origin(self, config, cfile=None, plugin_type=None, plugin_name=None, keys=None, variables=None, direct=None):
         """ Given a config key figure out the actual value and report on the origin of the settings """
         if cfile is None:
@@ -623,22 +628,21 @@ class ConfigManager(object):
         if value is None:
             if defs[config].get('required', False):
                 if not plugin_type or config not in INTERNAL_DEFS.get(plugin_type, {}):
-                    raise AnsibleRequiredOptionError("No setting was provided for required configuration %s" %
-                                                    to_native(_get_entry(plugin_type, plugin_name, config)))
+                    raise AnsibleRequiredOptionError(f"Required config {_get_config_label(plugin_type, plugin_name, config)} not provided.")
             else:
                 origin = 'default'
                 value = self.template_default(defs[config].get('default'), variables)

         try:
             # ensure correct type, can raise exceptions on mismatched types
             value = ensure_type(value, defs[config].get('type'), origin=origin, origin_ftype=origin_ftype)
-        except ValueError as e:
+        except ValueError as ex:
             if origin.startswith('env:') and value == '':
                 # this is empty env var for non string so we can set to default
                 origin = 'default'
                 value = ensure_type(defs[config].get('default'), defs[config].get('type'), origin=origin, origin_ftype=origin_ftype)
             else:
-                raise AnsibleOptionsError('Invalid type for configuration option %s (from %s): %s' %
-                                          (to_native(_get_entry(plugin_type, plugin_name, config)).strip(), origin, to_native(e)))
+                raise AnsibleOptionsError(f'Config {_get_config_label(plugin_type, plugin_name, config)} from {origin!r} has an invalid value.') from ex

         # deal with restricted values
         if value is not None and 'choices' in defs[config] and defs[config]['choices'] is not None:
@@ -661,14 +665,14 @@ class ConfigManager(object):
                 else:
                     valid = defs[config]['choices']

-                raise AnsibleOptionsError('Invalid value "%s" for configuration option "%s", valid values are: %s' %
-                                          (value, to_native(_get_entry(plugin_type, plugin_name, config)), valid))
+                raise AnsibleOptionsError(f'Invalid value {value!r} for config {_get_config_label(plugin_type, plugin_name, config)}.',
+                                          help_text=f'Valid values are: {valid}')

             # deal with deprecation of the setting
             if 'deprecated' in defs[config] and origin != 'default':
                 self.DEPRECATED.append((config, defs[config].get('deprecated')))
         else:
-            raise AnsibleError('Requested entry (%s) was not defined in configuration.' % to_native(_get_entry(plugin_type, plugin_name, config)))
+            raise AnsibleUndefinedConfigEntry(f'No config definition exists for {_get_config_label(plugin_type, plugin_name, config)}.')

         return value, origin
@@ -166,7 +166,6 @@ INTERNAL_STATIC_VARS = frozenset(
         "inventory_hostname_short",
         "groups",
         "group_names",
         "omit",
         "hostvars",
         "playbook_dir",
         "play_hosts",
@@ -1,38 +1,34 @@
 # (c) 2012-2014, Michael DeHaan <michael.dehaan@gmail.com>
-#
-# This file is part of Ansible
-#
-# Ansible is free software: you can redistribute it and/or modify
-# it under the terms of the GNU General Public License as published by
-# the Free Software Foundation, either version 3 of the License, or
-# (at your option) any later version.
-#
-# Ansible is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-# GNU General Public License for more details.
-#
-# You should have received a copy of the GNU General Public License
-# along with Ansible. If not, see <http://www.gnu.org/licenses/>.
+# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)

 from __future__ import annotations

-import re
+import enum
-import traceback
+import sys
+import types
 import typing as t

 from collections.abc import Sequence

-from ansible.errors.yaml_strings import (
-    YAML_COMMON_DICT_ERROR,
-    YAML_COMMON_LEADING_TAB_ERROR,
-    YAML_COMMON_PARTIALLY_QUOTED_LINE_ERROR,
-    YAML_COMMON_UNBALANCED_QUOTES_ERROR,
-    YAML_COMMON_UNQUOTED_COLON_ERROR,
-    YAML_COMMON_UNQUOTED_VARIABLE_ERROR,
-    YAML_POSITION_DETAILS,
-    YAML_AND_SHORTHAND_ERROR,
-)
-from ansible.module_utils.common.text.converters import to_native, to_text
+from json import JSONDecodeError
+
+from ansible.module_utils.common.text.converters import to_text
+from ..module_utils.datatag import native_type_name
+from ansible._internal._datatag import _tags
+from .._internal._errors import _utils
+
+
+class ExitCode(enum.IntEnum):
+    SUCCESS = 0  # used by TQM, must be bit-flag safe
+    GENERIC_ERROR = 1  # used by TQM, must be bit-flag safe
+    HOST_FAILED = 2  # TQM-sourced, must be bit-flag safe
+    HOST_UNREACHABLE = 4  # TQM-sourced, must be bit-flag safe
+    PARSER_ERROR = 4  # FIXME: CLI-sourced, conflicts with HOST_UNREACHABLE
+    INVALID_CLI_OPTION = 5
+    UNICODE_ERROR = 6  # obsolete, no longer used
+    KEYBOARD_INTERRUPT = 99
+    UNKNOWN_ERROR = 250


 class AnsibleError(Exception):
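The "bit-flag safe" comments in the new `ExitCode` enum mean the TQM-sourced values are distinct powers of two, so several outcomes can be combined into one process exit code with bitwise OR. A minimal sketch of that usage (`SketchExitCode` is an illustrative subset, not the real class):

```python
import enum


class SketchExitCode(enum.IntEnum):
    # Subset of the ExitCode values above; power-of-two values let their
    # bitwise OR still encode which outcomes occurred.
    SUCCESS = 0
    GENERIC_ERROR = 1
    HOST_FAILED = 2
    HOST_UNREACHABLE = 4


rc = SketchExitCode.SUCCESS
rc |= SketchExitCode.HOST_FAILED       # some host failed
rc |= SketchExitCode.HOST_UNREACHABLE  # another host was unreachable
```

This also explains the `FIXME` on `PARSER_ERROR = 4`: it reuses the `HOST_UNREACHABLE` value, so the combined code cannot distinguish the two.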
@@ -44,257 +40,271 @@ class AnsibleError(Exception):

     Usage:

-        raise AnsibleError('some message here', obj=obj, show_content=True)
+        raise AnsibleError('some message here', obj=obj)

-    Where "obj" is some subclass of ansible.parsing.yaml.objects.AnsibleBaseYAMLObject,
-    which should be returned by the DataLoader() class.
+    Where "obj" may be tagged with Origin to provide context for error messages.
     """

-    def __init__(self, message="", obj=None, show_content=True, suppress_extended_error=False, orig_exc=None):
-        super(AnsibleError, self).__init__(message)
+    _exit_code = ExitCode.GENERIC_ERROR
+    _default_message = ''
+    _default_help_text: str | None = None
+    _include_cause_message = True
+    """
+    When `True`, the exception message will be augmented with cause message(s).
+    Subclasses doing complex error analysis can disable this to take responsibility for reporting cause messages as needed.
+    """
+
+    def __init__(
+        self,
+        message: str = "",
+        obj: t.Any = None,
+        show_content: bool = True,
+        suppress_extended_error: bool | types.EllipsisType = ...,
+        orig_exc: BaseException | None = None,
+        help_text: str | None = None,
+    ) -> None:
+        # DTFIX-FUTURE: these fallback cases mask incorrect use of AnsibleError.message, what should we do?
+        if message is None:
+            message = ''
+        elif not isinstance(message, str):
+            message = str(message)
+
+        if self._default_message and message:
+            message = _utils.concat_message(self._default_message, message)
+        elif self._default_message:
+            message = self._default_message
+        elif not message:
+            message = f'Unexpected {type(self).__name__} error.'
+
+        super().__init__(message)

         self._show_content = show_content
-        self._suppress_extended_error = suppress_extended_error
-        self._message = to_native(message)
+        self._message = message
+        self._help_text_value = help_text or self._default_help_text
         self.obj = obj
+
+        # deprecated: description='deprecate support for orig_exc, callers should use `raise ... from` only' core_version='2.23'
+        # deprecated: description='remove support for orig_exc' core_version='2.27'
         self.orig_exc = orig_exc
+
+        if suppress_extended_error is not ...:
+            from ..utils.display import Display
+
+            if suppress_extended_error:
+                self._show_content = False
+
+            Display().deprecated(
+                msg=f"The `suppress_extended_error` argument to `{type(self).__name__}` is deprecated. Use `show_content=False` instead.",
+                version="2.23",
+            )
     @property
-    def message(self):
-        # we import this here to prevent an import loop problem,
-        # since the objects code also imports ansible.errors
-        from ansible.parsing.yaml.objects import AnsibleBaseYAMLObject
-
-        message = [self._message]
-
-        # Add from previous exceptions
-        if self.orig_exc:
-            message.append('. %s' % to_native(self.orig_exc))
-
-        # Add from yaml to give specific file/line no
-        if isinstance(self.obj, AnsibleBaseYAMLObject):
-            extended_error = self._get_extended_error()
-            if extended_error and not self._suppress_extended_error:
-                message.append(
-                    '\n\n%s' % to_native(extended_error)
-                )
-
-        return ''.join(message)
+    def _original_message(self) -> str:
+        return self._message
+
+    @property
+    def message(self) -> str:
+        """
+        If `include_cause_message` is False, return the original message.
+        Otherwise, return the original message with cause message(s) appended, stopping on (and including) the first non-AnsibleError.
+        The recursion is due to `AnsibleError.__str__` calling this method, which uses `str` on child exceptions to create the cause message.
+        Recursion stops on the first non-AnsibleError since those exceptions do not implement the custom `__str__` behavior.
+        """
+        return _utils.get_chained_message(self)

     @message.setter
-    def message(self, val):
+    def message(self, val) -> None:
         self._message = val

-    def __str__(self):
+    @property
+    def _formatted_source_context(self) -> str | None:
+        with _utils.RedactAnnotatedSourceContext.when(not self._show_content):
+            if source_context := _utils.SourceContext.from_value(self.obj):
+                return str(source_context)
+
+        return None
+
+    @property
+    def _help_text(self) -> str | None:
+        return self._help_text_value
+
+    @_help_text.setter
+    def _help_text(self, value: str | None) -> None:
+        self._help_text_value = value
+
+    def __str__(self) -> str:
         return self.message
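The new `message` property docstring above describes walking the exception's `__cause__` chain and stopping at the first non-AnsibleError. A self-contained sketch of that collapsing behavior (using a stand-in `SketchError` class and a plain function, since `_utils.get_chained_message` is internal):

```python
class SketchError(Exception):
    """Stand-in for AnsibleError in this illustration."""


def chained_message(exc: BaseException) -> str:
    # Walk __cause__ links, appending each message; stop after including the
    # first exception that is not a SketchError, mirroring the documented
    # "stopping on (and including) the first non-AnsibleError" behavior.
    parts = []
    current = exc
    while current is not None:
        parts.append(str(current.args[0]) if current.args else type(current).__name__)
        if not isinstance(current, SketchError):
            break
        current = current.__cause__
    return ' '.join(parts)
```

Stopping at the first foreign exception avoids double-reporting: only the custom class overrides `__str__` to include its causes, so anything beyond a plain exception would already be opaque to the caller.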
-    def __repr__(self):
-        return self.message
+    def __getstate__(self) -> dict[str, t.Any]:
+        """Augment object.__getstate__ to preserve additional values not represented in BaseException.__dict__."""
+        state = t.cast(dict[str, t.Any], super().__getstate__())
+        state.update(
+            args=self.args,
+            __cause__=self.__cause__,
+            __context__=self.__context__,
+            __suppress_context__=self.__suppress_context__,
+        )
+
+        return state
+
+    def __reduce__(self) -> tuple[t.Callable, tuple[type], dict[str, t.Any]]:
+        """
+        Enable copy/pickle of AnsibleError derived types by correcting for BaseException's ancient C __reduce__ impl that:
+
+        * requires use of a type constructor with positional args
+        * assumes positional args are passed through from the derived type __init__ to BaseException.__init__ unmodified
+        * does not propagate args/__cause__/__context__/__suppress_context__
+
+        NOTE: This does not preserve the dunder attributes on non-AnsibleError derived cause/context exceptions.
+        As a result, copy/pickle will discard chained exceptions after the first non-AnsibleError cause/context.
+        """
+        return type(self).__new__, (type(self),), self.__getstate__()

-    def _get_error_lines_from_file(self, file_name, line_number):
-        """
-        Returns the line in the file which corresponds to the reported error
-        location, as well as the line preceding it (if the error did not
-        occur on the first line), to provide context to the error.
-        """
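The `__getstate__`/`__reduce__` pair above exists because `BaseException`'s default pickling keeps only the constructor args, dropping `__cause__`, `__context__`, and `__suppress_context__`. A self-contained sketch of the same idea (this is an illustration, not Ansible's code; an explicit `__setstate__` is added here to make the restore side unambiguous):

```python
import pickle


class ChainedError(Exception):
    # Illustrative sketch: carry the exception chain across copy/pickle,
    # which BaseException's C-level __reduce__ would otherwise discard.
    def __getstate__(self):
        state = dict(self.__dict__)
        state.update(
            args=self.args,
            __cause__=self.__cause__,
            __context__=self.__context__,
            __suppress_context__=self.__suppress_context__,
        )
        return state

    def __setstate__(self, state):
        # restore the C-level slots explicitly; the rest goes into __dict__
        self.args = state.pop('args')
        self.__cause__ = state.pop('__cause__')
        self.__context__ = state.pop('__context__')
        self.__suppress_context__ = state.pop('__suppress_context__')
        self.__dict__.update(state)

    def __reduce__(self):
        # reconstruct via __new__ so __init__ side effects are not replayed
        return type(self).__new__, (type(self),), self.__getstate__()
```

Using `type(self).__new__` in the reduce tuple sidesteps the assumption that positional args flow unmodified from the subclass `__init__` to `BaseException.__init__`.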
|
||||
target_line = ''
|
||||
prev_line = ''
|
||||
|
||||
with open(file_name, 'r') as f:
|
||||
lines = f.readlines()
|
||||
class AnsibleUndefinedConfigEntry(AnsibleError):
|
||||
"""The requested config entry is not defined."""
|
||||
|
||||
# In case of a YAML loading error, PyYAML will report the very last line
|
||||
# as the location of the error. Avoid an index error here in order to
|
||||
# return a helpful message.
|
||||
file_length = len(lines)
|
||||
if line_number >= file_length:
|
||||
line_number = file_length - 1
|
||||
|
||||
# If target_line contains only whitespace, move backwards until
|
||||
# actual code is found. If there are several empty lines after target_line,
|
||||
# the error lines would just be blank, which is not very helpful.
|
||||
target_line = lines[line_number]
|
||||
while not target_line.strip():
|
||||
line_number -= 1
|
||||
target_line = lines[line_number]
|
||||
class AnsibleTaskError(AnsibleError):
|
||||
"""Task execution failed; provides contextual information about the task."""
|
||||
|
||||
if line_number > 0:
|
||||
prev_line = lines[line_number - 1]
|
||||
|
||||
return (target_line, prev_line)
|
||||
|
||||
def _get_extended_error(self):
|
||||
"""
|
||||
Given an object reporting the location of the exception in a file, return
|
||||
detailed information regarding it including:
|
||||
|
||||
* the line which caused the error as well as the one preceding it
|
||||
* causes and suggested remedies for common syntax errors
|
||||
|
||||
If this error was created with show_content=False, the reporting of content
|
||||
is suppressed, as the file contents may be sensitive (ie. vault data).
|
||||
"""
|
||||
|
||||
error_message = ''
|
||||
|
||||
try:
|
||||
(src_file, line_number, col_number) = self.obj.ansible_pos
|
||||
error_message += YAML_POSITION_DETAILS % (src_file, line_number, col_number)
|
||||
if src_file not in ('<string>', '<unicode>') and self._show_content:
|
||||
(target_line, prev_line) = self._get_error_lines_from_file(src_file, line_number - 1)
|
||||
target_line = to_text(target_line)
|
||||
prev_line = to_text(prev_line)
|
||||
if target_line:
|
||||
stripped_line = target_line.replace(" ", "")
|
||||
|
    # Check for k=v syntax in addition to YAML syntax and set the appropriate error position,
    # arrow index
    if re.search(r'\w+(\s+)?=(\s+)?[\w/-]+', prev_line):
        error_position = prev_line.rstrip().find('=')
        arrow_line = (" " * error_position) + "^ here"
        error_message = YAML_POSITION_DETAILS % (src_file, line_number - 1, error_position + 1)
        error_message += "\nThe offending line appears to be:\n\n%s\n%s\n\n" % (prev_line.rstrip(), arrow_line)
        error_message += YAML_AND_SHORTHAND_ERROR
    else:
        arrow_line = (" " * (col_number - 1)) + "^ here"
        error_message += "\nThe offending line appears to be:\n\n%s\n%s\n%s\n" % (prev_line.rstrip(), target_line.rstrip(), arrow_line)

    # TODO: There may be cases where there is a valid tab in a line that has other errors.
    if '\t' in target_line:
        error_message += YAML_COMMON_LEADING_TAB_ERROR
    # common error/remediation checking here:
    # check for unquoted vars starting lines
    if ('{{' in target_line and '}}' in target_line) and ('"{{' not in target_line or "'{{" not in target_line):
        error_message += YAML_COMMON_UNQUOTED_VARIABLE_ERROR
    # check for common dictionary mistakes
    elif ":{{" in stripped_line and "}}" in stripped_line:
        error_message += YAML_COMMON_DICT_ERROR
    # check for common unquoted colon mistakes
    elif (len(target_line) and
            len(target_line) > 1 and
            len(target_line) > col_number and
            target_line[col_number] == ":" and
            target_line.count(':') > 1):
        error_message += YAML_COMMON_UNQUOTED_COLON_ERROR
    # otherwise, check for some common quoting mistakes
    else:
        # FIXME: This needs to split on the first ':' to account for modules like lineinfile
        # that may have lines that contain legitimate colons, e.g., line: 'i ALL= (ALL) NOPASSWD: ALL'
        # and throw off the quote matching logic.
        parts = target_line.split(":")
        if len(parts) > 1:
            middle = parts[1].strip()
            match = False
            unbalanced = False

            if middle.startswith("'") and not middle.endswith("'"):
                match = True
            elif middle.startswith('"') and not middle.endswith('"'):
                match = True

            if (len(middle) > 0 and
                    middle[0] in ['"', "'"] and
                    middle[-1] in ['"', "'"] and
                    target_line.count("'") > 2 or
                    target_line.count('"') > 2):
                unbalanced = True

            if match:
                error_message += YAML_COMMON_PARTIALLY_QUOTED_LINE_ERROR
            if unbalanced:
                error_message += YAML_COMMON_UNBALANCED_QUOTES_ERROR

except (IOError, TypeError):
    error_message += '\n(could not open file to display line)'
except IndexError:
    error_message += '\n(specified line no longer in file, maybe it changed?)'

return error_message
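The hunk above builds a caret "arrow line" by padding with spaces up to the error column and appending `^ here`. A minimal sketch of that formatting trick, using a hypothetical helper name (`point_at` is not an Ansible function):

```python
# Pad with spaces up to the (1-based) error column, then point at it, mirroring
# the arrow_line construction in the diff above. `point_at` is a made-up name.
def point_at(line: str, col_number: int) -> str:
    arrow_line = (" " * (col_number - 1)) + "^ here"
    return "%s\n%s" % (line.rstrip(), arrow_line)

print(point_at("app_path: {{ base_path }}/foo", 11))
```

The column is converted to a 0-based pad width, so column 1 produces a caret under the first character.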
    _default_message = 'Task failed.'


class AnsiblePromptInterrupt(AnsibleError):
    """User interrupt"""
    """User interrupt."""


class AnsiblePromptNoninteractive(AnsibleError):
    """Unable to get user input"""
    """Unable to get user input."""


class AnsibleAssertionError(AnsibleError, AssertionError):
    """Invalid assertion"""
    pass
    """Invalid assertion."""


class AnsibleOptionsError(AnsibleError):
    """ bad or incomplete options passed """
    pass
    """Invalid options were passed."""

    # FIXME: This exception is used for many non-CLI related errors.
    # The few cases which are CLI related should really be handled by argparse instead, at which point the exit code here can be removed.
    _exit_code = ExitCode.INVALID_CLI_OPTION


class AnsibleRequiredOptionError(AnsibleOptionsError):
    """ bad or incomplete options passed """
    pass
    """Bad or incomplete options passed."""


class AnsibleParserError(AnsibleError):
    """ something was detected early that is wrong about a playbook or data file """
    pass
    """A playbook or data file could not be parsed."""

    _exit_code = ExitCode.PARSER_ERROR


class AnsibleFieldAttributeError(AnsibleParserError):
    """Errors caused during field attribute processing."""


class AnsibleJSONParserError(AnsibleParserError):
    """JSON-specific parsing failure wrapping an exception raised by the JSON parser."""

    _default_message = 'JSON parsing failed.'
    _include_cause_message = False  # hide the underlying cause message, it's included by `handle_exception` as needed

    @classmethod
    def handle_exception(cls, exception: Exception, origin: _tags.Origin) -> t.NoReturn:
        if isinstance(exception, JSONDecodeError):
            origin = origin.replace(line_num=exception.lineno, col_num=exception.colno)

        message = str(exception)

        error = cls(message, obj=origin)

        raise error from exception


class AnsibleInternalError(AnsibleError):
    """ internal safeguards tripped, something happened in the code that should never happen """
    pass
    """Internal safeguards tripped, something happened in the code that should never happen."""


class AnsibleRuntimeError(AnsibleError):
    """ ansible had a problem while running a playbook """
    pass
    """Ansible had a problem while running a playbook."""


class AnsibleModuleError(AnsibleRuntimeError):
    """ a module failed somehow """
    pass
    """A module failed somehow."""


class AnsibleConnectionFailure(AnsibleRuntimeError):
    """ the transport / connection_plugin had a fatal error """
    pass
    """The transport / connection_plugin had a fatal error."""


class AnsibleAuthenticationFailure(AnsibleConnectionFailure):
    """invalid username/password/key"""
    pass
    """Invalid username/password/key."""

    _default_message = "Failed to authenticate."


class AnsibleCallbackError(AnsibleRuntimeError):
    """ a callback failure """
    pass
    """A callback failure."""


class AnsibleTemplateError(AnsibleRuntimeError):
    """A template related error"""
    pass
    """A template related error."""


class AnsibleFilterError(AnsibleTemplateError):
    """ a templating failure """
    pass
class TemplateTrustCheckFailedError(AnsibleTemplateError):
    """Raised when processing was requested on an untrusted template or expression."""

    _default_message = 'Encountered untrusted template or expression.'
    _default_help_text = ('Templates and expressions must be defined by trusted sources such as playbooks or roles, '
                          'not untrusted sources such as module results.')


class AnsibleLookupError(AnsibleTemplateError):
    """ a lookup failure """
    pass
class AnsibleTemplateTransformLimitError(AnsibleTemplateError):
    """The internal template transform limit was exceeded."""

    _default_message = "Template transform limit exceeded."


class AnsibleTemplateSyntaxError(AnsibleTemplateError):
    """A syntax error was encountered while parsing a Jinja template or expression."""


class AnsibleBrokenConditionalError(AnsibleTemplateError):
    """A broken conditional with non-boolean result was used."""

    _default_help_text = 'Broken conditionals can be temporarily allowed with the `ALLOW_BROKEN_CONDITIONALS` configuration option.'


class AnsibleUndefinedVariable(AnsibleTemplateError):
    """ a templating failure """
    pass
    """An undefined variable was encountered while processing a template or expression."""


class AnsibleValueOmittedError(AnsibleTemplateError):
    """
    Raised when the result of a template operation was the Omit singleton. This exception purposely does
    not derive from AnsibleError to avoid elision of the traceback, since uncaught errors of this type always
    indicate a bug.
    """

    _default_message = "A template was resolved to an Omit scalar."
    _default_help_text = "Callers must be prepared to handle this value. This is most likely a bug in the code requesting templating."


class AnsibleTemplatePluginError(AnsibleTemplateError):
    """An error sourced by a template plugin (lookup/filter/test)."""


# deprecated: description='add deprecation warnings for these aliases' core_version='2.23'
AnsibleFilterError = AnsibleTemplatePluginError
AnsibleLookupError = AnsibleTemplatePluginError


class AnsibleFileNotFound(AnsibleRuntimeError):
    """ a file missing failure """
    """A file missing failure."""

    def __init__(self, message="", obj=None, show_content=True, suppress_extended_error=False, orig_exc=None, paths=None, file_name=None):
    def __init__(self, message="", obj=None, show_content=True, suppress_extended_error=..., orig_exc=None, paths=None, file_name=None):

        self.file_name = file_name
        self.paths = paths

@@ -322,10 +332,9 @@ class AnsibleFileNotFound(AnsibleRuntimeError):
# DO NOT USE as they will probably be removed soon.
# We will port the action modules in our tree to use a context manager instead.
class AnsibleAction(AnsibleRuntimeError):
    """ Base Exception for Action plugin flow control """

    def __init__(self, message="", obj=None, show_content=True, suppress_extended_error=False, orig_exc=None, result=None):
    """Base Exception for Action plugin flow control."""

    def __init__(self, message="", obj=None, show_content=True, suppress_extended_error=..., orig_exc=None, result=None):
        super(AnsibleAction, self).__init__(message=message, obj=obj, show_content=show_content,
                                            suppress_extended_error=suppress_extended_error, orig_exc=orig_exc)
        if result is None:

@@ -335,54 +344,87 @@ class AnsibleAction(AnsibleRuntimeError):
|
||||
|
||||
class AnsibleActionSkip(AnsibleAction):
|
||||
""" an action runtime skip"""
|
||||
"""An action runtime skip."""
|
||||
|
||||
def __init__(self, message="", obj=None, show_content=True, suppress_extended_error=False, orig_exc=None, result=None):
|
||||
def __init__(self, message="", obj=None, show_content=True, suppress_extended_error=..., orig_exc=None, result=None):
|
||||
super(AnsibleActionSkip, self).__init__(message=message, obj=obj, show_content=show_content,
|
||||
suppress_extended_error=suppress_extended_error, orig_exc=orig_exc, result=result)
|
||||
self.result.update({'skipped': True, 'msg': message})
|
||||
|
||||
|
||||
class AnsibleActionFail(AnsibleAction):
|
||||
""" an action runtime failure"""
|
||||
def __init__(self, message="", obj=None, show_content=True, suppress_extended_error=False, orig_exc=None, result=None):
|
||||
"""An action runtime failure."""
|
||||
|
||||
def __init__(self, message="", obj=None, show_content=True, suppress_extended_error=..., orig_exc=None, result=None):
|
||||
super(AnsibleActionFail, self).__init__(message=message, obj=obj, show_content=show_content,
|
||||
suppress_extended_error=suppress_extended_error, orig_exc=orig_exc, result=result)
|
||||
self.result.update({'failed': True, 'msg': message, 'exception': traceback.format_exc()})
|
||||
|
||||
result_overrides = {'failed': True, 'msg': message}
|
||||
# deprecated: description='use sys.exception()' python_version='3.11'
|
||||
if sys.exc_info()[1]: # DTFIX-RELEASE: remove this hack once TaskExecutor is no longer shucking AnsibleActionFail and returning its result
|
||||
result_overrides['exception'] = traceback.format_exc()
|
||||
|
||||
self.result.update(result_overrides)
|
||||
|
||||
|
||||
class _AnsibleActionDone(AnsibleAction):
|
||||
""" an action runtime early exit"""
|
||||
pass
|
||||
"""An action runtime early exit."""
|
||||
|
||||
|
||||
class AnsiblePluginError(AnsibleError):
|
||||
""" base class for Ansible plugin-related errors that do not need AnsibleError contextual data """
|
||||
"""Base class for Ansible plugin-related errors that do not need AnsibleError contextual data."""
|
||||
|
||||
def __init__(self, message=None, plugin_load_context=None):
|
||||
super(AnsiblePluginError, self).__init__(message)
|
||||
self.plugin_load_context = plugin_load_context
|
||||
|
||||
|
||||
class AnsiblePluginRemovedError(AnsiblePluginError):
|
||||
""" a requested plugin has been removed """
|
||||
pass
|
||||
"""A requested plugin has been removed."""
|
||||
|
||||
|
||||
class AnsiblePluginCircularRedirect(AnsiblePluginError):
|
||||
"""a cycle was detected in plugin redirection"""
|
||||
pass
|
||||
"""A cycle was detected in plugin redirection."""
|
||||
|
||||
|
||||
class AnsibleCollectionUnsupportedVersionError(AnsiblePluginError):
|
||||
"""a collection is not supported by this version of Ansible"""
|
||||
pass
|
||||
"""A collection is not supported by this version of Ansible."""
|
||||
|
||||
|
||||
class AnsibleFilterTypeError(AnsibleTemplateError, TypeError):
|
||||
""" a Jinja filter templating failure due to bad type"""
|
||||
pass
|
||||
class AnsibleTypeError(AnsibleRuntimeError, TypeError):
|
||||
"""Ansible-augmented TypeError subclass."""
|
||||
|
||||
|
||||
class AnsiblePluginNotFound(AnsiblePluginError):
|
||||
""" Indicates we did not find an Ansible plugin """
|
||||
pass
|
||||
"""Indicates we did not find an Ansible plugin."""
|
||||
|
||||
|
||||
class AnsibleConditionalError(AnsibleRuntimeError):
|
||||
"""Errors related to failed conditional expression evaluation."""
|
||||
|
||||
|
||||
class AnsibleVariableTypeError(AnsibleRuntimeError):
|
||||
"""An error due to attempted storage of an unsupported variable type."""
|
||||
|
||||
@classmethod
|
||||
def from_value(cls, *, obj: t.Any) -> t.Self:
|
||||
# avoid an incorrect error message when `obj` is a type
|
||||
type_name = type(obj).__name__ if isinstance(obj, type) else native_type_name(obj)
|
||||
|
||||
return cls(message=f'Type {type_name!r} is unsupported for variable storage.', obj=obj)
|
||||
|
||||
|
||||
def __getattr__(name: str) -> t.Any:
|
||||
"""Inject import-time deprecation warnings."""
|
||||
from ..utils.display import Display
|
||||
|
||||
if name == 'AnsibleFilterTypeError':
|
||||
Display().deprecated(
|
||||
msg="Importing 'AnsibleFilterTypeError' is deprecated.",
|
||||
help_text=f"Import {AnsibleTypeError.__name__!r} instead.",
|
||||
version="2.23",
|
||||
)
|
||||
|
||||
return AnsibleTypeError
|
||||
|
||||
raise AttributeError(f'module {__name__!r} has no attribute {name!r}')
|
||||
|
|
|
|||
|
|
@ -1,138 +0,0 @@
|
|||
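The `__getattr__` shim in the hunk above uses PEP 562: a module-level `__getattr__` is consulted for any attribute (such as a removed class name) that is not found in the module's namespace, which lets the old name keep working while emitting a deprecation warning. A minimal standalone sketch of the same pattern, built on `types.ModuleType` so it can run outside Ansible (the names and warning text are illustrative only):

```python
# PEP 562 deprecation shim sketch: resolve a removed name to its replacement
# and warn. In a real module, __getattr__ is defined at module top level; here
# a synthetic module object stands in so the example is self-contained.
import types
import warnings

mod = types.ModuleType('errors_demo')
mod.AnsibleTypeError = type('AnsibleTypeError', (TypeError,), {})

def _module_getattr(name):
    if name == 'AnsibleFilterTypeError':  # deprecated alias (illustrative)
        warnings.warn(f"Importing {name!r} is deprecated; import 'AnsibleTypeError' instead.", DeprecationWarning)
        return mod.AnsibleTypeError
    raise AttributeError(f'module {mod.__name__!r} has no attribute {name!r}')

mod.__getattr__ = _module_getattr

# the old name still resolves, but to the replacement class
assert mod.AnsibleFilterTypeError is mod.AnsibleTypeError
```

Because `__getattr__` only fires on lookup misses, attributes that still exist in the module are unaffected.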
# (c) 2012-2014, Michael DeHaan <michael.dehaan@gmail.com>
#
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.

from __future__ import annotations

__all__ = [
    'YAML_SYNTAX_ERROR',
    'YAML_POSITION_DETAILS',
    'YAML_COMMON_DICT_ERROR',
    'YAML_COMMON_UNQUOTED_VARIABLE_ERROR',
    'YAML_COMMON_UNQUOTED_COLON_ERROR',
    'YAML_COMMON_PARTIALLY_QUOTED_LINE_ERROR',
    'YAML_COMMON_UNBALANCED_QUOTES_ERROR',
]

YAML_SYNTAX_ERROR = """\
Syntax Error while loading YAML.
%s"""

YAML_POSITION_DETAILS = """\
The error appears to be in '%s': line %s, column %s, but may
be elsewhere in the file depending on the exact syntax problem.
"""

YAML_COMMON_DICT_ERROR = """\
This one looks easy to fix. YAML thought it was looking for the start of a
hash/dictionary and was confused to see a second "{". Most likely this was
meant to be an ansible template evaluation instead, so we have to give the
parser a small hint that we wanted a string instead. The solution here is to
just quote the entire value.

For instance, if the original line was:

    app_path: {{ base_path }}/foo

It should be written as:

    app_path: "{{ base_path }}/foo"
"""

YAML_COMMON_UNQUOTED_VARIABLE_ERROR = """\
We could be wrong, but this one looks like it might be an issue with
missing quotes. Always quote template expression brackets when they
start a value. For instance:

    with_items:
      - {{ foo }}

Should be written as:

    with_items:
      - "{{ foo }}"
"""

YAML_COMMON_UNQUOTED_COLON_ERROR = """\
This one looks easy to fix. There seems to be an extra unquoted colon in the line
and this is confusing the parser. It was only expecting to find one free
colon. The solution is just add some quotes around the colon, or quote the
entire line after the first colon.

For instance, if the original line was:

    copy: src=file.txt dest=/path/filename:with_colon.txt

It can be written as:

    copy: src=file.txt dest='/path/filename:with_colon.txt'

Or:

    copy: 'src=file.txt dest=/path/filename:with_colon.txt'
"""

YAML_COMMON_PARTIALLY_QUOTED_LINE_ERROR = """\
This one looks easy to fix. It seems that there is a value started
with a quote, and the YAML parser is expecting to see the line ended
with the same kind of quote. For instance:

    when: "ok" in result.stdout

Could be written as:

    when: '"ok" in result.stdout'

Or equivalently:

    when: "'ok' in result.stdout"
"""

YAML_COMMON_UNBALANCED_QUOTES_ERROR = """\
We could be wrong, but this one looks like it might be an issue with
unbalanced quotes. If starting a value with a quote, make sure the
line ends with the same set of quotes. For instance this arbitrary
example:

    foo: "bad" "wolf"

Could be written as:

    foo: '"bad" "wolf"'
"""

YAML_COMMON_LEADING_TAB_ERROR = """\
There appears to be a tab character at the start of the line.

YAML does not use tabs for formatting. Tabs should be replaced with spaces.

For example:
    - name: update tooling
      vars:
        version: 1.2.3
# ^--- there is a tab there.

Should be written as:
    - name: update tooling
      vars:
        version: 1.2.3
# ^--- all spaces here.
"""

YAML_AND_SHORTHAND_ERROR = """\
There appears to be both 'k=v' shorthand syntax and YAML in this task. \
Only one syntax may be used.
"""

@@ -1,44 +0,0 @@
# (c) 2016 - Red Hat, Inc. <info@ansible.com>
#
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.

from __future__ import annotations

import multiprocessing.synchronize

from ansible.utils.multiprocessing import context as multiprocessing_context

from ansible.module_utils.facts.system.pkg_mgr import PKG_MGRS

if 'action_write_locks' not in globals():
    # Do not initialize this more than once because it seems to bash
    # the existing one. multiprocessing must be reloading the module
    # when it forks?
    action_write_locks: dict[str | None, multiprocessing.synchronize.Lock] = dict()

    # Below is a Lock for use when we weren't expecting a named module. It gets used when an action
    # plugin invokes a module whose name does not match with the action's name. Slightly less
    # efficient as all processes with unexpected module names will wait on this lock
    action_write_locks[None] = multiprocessing_context.Lock()

    # These plugins are known to be called directly by action plugins with names differing from the
    # action plugin name. We precreate them here as an optimization.
    # If a list of service managers is created in the future we can do the same for them.
    mods = set(p['name'] for p in PKG_MGRS)

    mods.update(('copy', 'file', 'setup', 'slurp', 'stat'))
    for mod_name in mods:
        action_write_locks[mod_name] = multiprocessing_context.Lock()

@@ -9,7 +9,8 @@ from ansible import constants as C
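The removed file above kept a registry of per-module-name write locks, with a shared fallback lock under the `None` key for module names not created in advance. A minimal sketch of that registry pattern, using plain `multiprocessing.Lock` and a made-up `lock_for` helper:

```python
# Per-name lock registry sketch: known names get their own Lock, everything
# else serializes on a shared fallback lock keyed by None. `lock_for` is a
# hypothetical helper, not part of the removed Ansible module.
import multiprocessing

action_write_locks = {None: multiprocessing.Lock()}
for mod_name in ('copy', 'file', 'setup', 'slurp', 'stat'):
    action_write_locks[mod_name] = multiprocessing.Lock()

def lock_for(name):
    # unknown module names fall back to the shared lock
    return action_write_locks.get(name, action_write_locks[None])

with lock_for('copy'):
    pass  # write the module file here
```

Precreating the named locks avoids a race on registry mutation once worker processes have forked.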
from ansible.errors import AnsibleError
from ansible.utils.display import Display
from ansible.utils.plugin_docs import get_versioned_doclink
from traceback import format_exc

_FALLBACK_INTERPRETER = '/usr/bin/python3'

display = Display()
foundre = re.compile(r'FOUND(.*)ENDFOUND', flags=re.DOTALL)

@@ -26,14 +27,14 @@ def discover_interpreter(action, interpreter_name, discovery_mode, task_vars):
    """Probe the target host for a Python interpreter from the `INTERPRETER_PYTHON_FALLBACK` list, returning the first found or `/usr/bin/python3` if none."""
    host = task_vars.get('inventory_hostname', 'unknown')
    res = None
    found_interpreters = [u'/usr/bin/python3']  # fallback value
    found_interpreters = [_FALLBACK_INTERPRETER]  # fallback value
    is_silent = discovery_mode.endswith('_silent')

    if discovery_mode.startswith('auto_legacy'):
        action._discovery_deprecation_warnings.append(dict(
        display.deprecated(
            msg=f"The '{discovery_mode}' option for 'INTERPRETER_PYTHON' now has the same effect as 'auto'.",
            version='2.21',
        ))
        )

    try:
        bootstrap_python_list = C.config.get_config_value('INTERPRETER_PYTHON_FALLBACK', variables=task_vars)

@@ -61,24 +62,26 @@ def discover_interpreter(action, interpreter_name, discovery_mode, task_vars):

        if not found_interpreters:
            if not is_silent:
                action._discovery_warnings.append(u'No python interpreters found for '
                                                  u'host {0} (tried {1})'.format(host, bootstrap_python_list))
                display.warning(msg=f'No python interpreters found for host {host!r} (tried {bootstrap_python_list!r}).')

            # this is lame, but returning None or throwing an exception is uglier
            return u'/usr/bin/python3'
            return _FALLBACK_INTERPRETER
    except AnsibleError:
        raise
    except Exception as ex:
        if not is_silent:
            action._discovery_warnings.append(f'Unhandled error in Python interpreter discovery for host {host}: {ex}')
            display.debug(msg=f'Interpreter discovery traceback:\n{format_exc()}', host=host)
            display.error_as_warning(msg=f'Unhandled error in Python interpreter discovery for host {host!r}.', exception=ex)

            if res and res.get('stderr'):  # the current ssh plugin implementation always has stderr, making coverage of the false case difficult
                display.vvv(msg=f"Interpreter discovery remote stderr:\n{res.get('stderr')}", host=host)

    if not is_silent:
        action._discovery_warnings.append(
            f"Host {host} is using the discovered Python interpreter at {found_interpreters[0]}, "
            "but future installation of another Python interpreter could change the meaning of that path. "
            f"See {get_versioned_doclink('reference_appendices/interpreter_discovery.html')} for more information."
        display.warning(
            msg=(
                f"Host {host!r} is using the discovered Python interpreter at {found_interpreters[0]!r}, "
                "but future installation of another Python interpreter could cause a different interpreter to be discovered."
            ),
            help_text=f"See {get_versioned_doclink('reference_appendices/interpreter_discovery.html')} for more information.",
        )

    return found_interpreters[0]
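The discovery logic above collects candidate interpreters and falls back to a hard-coded default when none are found. A local-filesystem sketch of that first-match-or-fallback behavior (the real code probes the remote host over the connection plugin; `pick_interpreter` is a made-up name):

```python
# First-match-or-fallback sketch: keep candidates that exist, return the first,
# else the hard-coded default, mirroring found_interpreters handling above.
import os

_FALLBACK_INTERPRETER = '/usr/bin/python3'

def pick_interpreter(candidates, exists=os.path.exists):
    found_interpreters = [c for c in candidates if exists(c)]
    return found_interpreters[0] if found_interpreters else _FALLBACK_INTERPRETER
```

Injecting the `exists` predicate keeps the probe mechanism swappable, which is roughly what routing the check through the action/connection layer achieves in the real code.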
File diff suppressed because it is too large
@@ -155,9 +155,6 @@ class PlayIterator:
        setup_block.run_once = False
        setup_task = Task(block=setup_block)
        setup_task.action = 'gather_facts'
        # TODO: hardcoded resolution here, but should use actual resolution code in the end,
        # in case of 'legacy' mismatch
        setup_task.resolved_action = 'ansible.builtin.gather_facts'
        setup_task.name = 'Gathering Facts'
        setup_task.args = {}

@@ -255,7 +252,6 @@ class PlayIterator:
        self.set_state_for_host(host.name, s)

        display.debug("done getting next task for host %s" % host.name)
        display.debug(" ^ task is: %s" % task)
        display.debug(" ^ state is: %s" % s)
        return (s, task)

@@ -292,7 +288,7 @@ class PlayIterator:

        if (gathering == 'implicit' and implied) or \
                (gathering == 'explicit' and boolean(self._play.gather_facts, strict=False)) or \
                (gathering == 'smart' and implied and not (self._variable_manager._fact_cache.get(host.name, {}).get('_ansible_facts_gathered', False))):
                (gathering == 'smart' and implied and not self._variable_manager._facts_gathered_for_host(host.name)):
            # The setup block is always self._blocks[0], as we inject it
            # during the play compilation in __init__ above.
            setup_block = self._blocks[0]

@@ -450,8 +446,7 @@ class PlayIterator:
        # skip implicit flush_handlers if there are no handlers notified
        if (
            task.implicit
            and task.action in C._ACTION_META
            and task.args.get('_raw_params', None) == 'flush_handlers'
            and task._get_meta() == 'flush_handlers'
            and (
                # the state store in the `state` variable could be a nested state,
                # notifications are always stored in the top level state, get it here

@@ -26,7 +26,7 @@ from ansible.module_utils.common.text.converters import to_text
from ansible.module_utils.parsing.convert_bool import boolean
from ansible.plugins.loader import become_loader, connection_loader, shell_loader
from ansible.playbook import Playbook
from ansible.template import Templar
from ansible._internal._templating._engine import TemplateEngine
from ansible.utils.helpers import pct_to_int
from ansible.utils.collection_loader import AnsibleCollectionConfig
from ansible.utils.collection_loader._collection_finder import _get_collection_name_from_path, _get_collection_playbook_path

@@ -132,7 +132,7 @@ class PlaybookExecutor:

        # Allow variables to be used in vars_prompt fields.
        all_vars = self._variable_manager.get_vars(play=play)
        templar = Templar(loader=self._loader, variables=all_vars)
        templar = TemplateEngine(loader=self._loader, variables=all_vars)
        setattr(play, 'vars_prompt', templar.template(play.vars_prompt))

        # FIXME: this should be a play 'sub object' like loop_control

@@ -158,7 +158,7 @@ class PlaybookExecutor:

        # Post validate so any play level variables are templated
        all_vars = self._variable_manager.get_vars(play=play)
        templar = Templar(loader=self._loader, variables=all_vars)
        templar = TemplateEngine(loader=self._loader, variables=all_vars)
        play.post_validate(templar)

        if context.CLIARGS['syntax']:
@@ -12,11 +12,13 @@ import pkgutil
import secrets
import re
import typing as t

from importlib import import_module

from ansible.module_utils.compat.version import LooseVersion

from ansible import constants as C
from ansible.module_utils.common.json import Direction, get_module_encoder
from ansible.errors import AnsibleError, AnsibleFileNotFound
from ansible.module_utils.common.text.converters import to_bytes, to_text
from ansible.plugins.become import BecomeBase

@@ -351,6 +353,7 @@ def _create_powershell_wrapper(
    become_plugin: BecomeBase | None,
    substyle: t.Literal["powershell", "script"],
    task_vars: dict[str, t.Any],
    profile: str,
) -> bytes:
    """Creates module or script wrapper for PowerShell.

@@ -369,8 +372,6 @@ def _create_powershell_wrapper(

    :return: The input data for bootstrap_wrapper.ps1 as a byte string.
    """
    # creates the manifest/wrapper used in PowerShell/C# modules to enable
    # things like become and async - this is also called in action/script.py

    actions: list[_ManifestAction] = []
    finder = PSModuleDepFinder()

@@ -405,7 +406,7 @@ def _create_powershell_wrapper(
        'Variables': [
            {
                'Name': 'complex_args',
                'Value': module_args,
                'Value': _prepare_module_args(module_args, profile),
                'Scope': 'Global',
            },
        ],

@@ -540,3 +541,13 @@ def _get_bootstrap_input(
    bootstrap_input = json.dumps(bootstrap_manifest, ensure_ascii=True)
    exec_input = json.dumps(dataclasses.asdict(manifest))
    return f"{bootstrap_input}\n\0\0\0\0\n{exec_input}".encode()


def _prepare_module_args(module_args: dict[str, t.Any], profile: str) -> dict[str, t.Any]:
    """
    Serialize the module args with the specified profile and deserialize them with the Python built-in JSON decoder.
    This is used to facilitate serializing module args with a different encoder (profile) than is used for the manifest.
    """
    encoder = get_module_encoder(profile, Direction.CONTROLLER_TO_MODULE)

    return json.loads(json.dumps(module_args, cls=encoder))
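`_prepare_module_args` above uses a serialize-then-deserialize round-trip so that args encoded with a profile-specific encoder come back as plain JSON-compatible types. A minimal sketch of that round-trip with a stand-in encoder (`DemoEncoder` is illustrative; it is not Ansible's profile encoder):

```python
# Round-trip sketch: dump with a custom JSONEncoder that downconverts rich
# types, then parse with the stock decoder so only plain dict/list/str/etc.
# remain, mirroring _prepare_module_args above.
import datetime
import json

class DemoEncoder(json.JSONEncoder):
    def default(self, o):
        if isinstance(o, datetime.date):
            return o.isoformat()  # rich type -> plain string
        return super().default(o)

def prepare_args(module_args):
    return json.loads(json.dumps(module_args, cls=DemoEncoder))

print(prepare_args({'when': datetime.date(2025, 1, 1), 'count': 3}))
```

The double pass looks wasteful, but it is a cheap way to guarantee the result contains only types the downstream (here, the manifest encoder) can handle.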
@@ -28,6 +28,7 @@ import typing as t
from multiprocessing.queues import Queue

from ansible import context
from ansible._internal import _task
from ansible.errors import AnsibleConnectionFailure, AnsibleError
from ansible.executor.task_executor import TaskExecutor
from ansible.executor.task_queue_manager import FinalQueue, STDIN_FILENO, STDOUT_FILENO, STDERR_FILENO

@@ -39,6 +40,7 @@ from ansible.playbook.task import Task
from ansible.playbook.play_context import PlayContext
from ansible.plugins.loader import init_plugin_loader
from ansible.utils.context_objects import CLIArgs
from ansible.plugins.action import ActionBase
from ansible.utils.display import Display
from ansible.utils.multiprocessing import context as multiprocessing_context
from ansible.vars.manager import VariableManager

@@ -189,7 +191,8 @@ class WorkerProcess(multiprocessing_context.Process):  # type: ignore[name-defin
        display.set_queue(self._final_q)
        self._detach()
        try:
            return self._run()
            with _task.TaskContext(self._task):
                return self._run()
        except BaseException:
            self._hard_exit(traceback.format_exc())

@@ -259,20 +262,17 @@ class WorkerProcess(multiprocessing_context.Process):  # type: ignore[name-defin
                executor_result,
                task_fields=self._task.dump_attrs(),
            )
        except Exception as e:
            display.debug(f'failed to send task result ({e}), sending surrogate result')
            self._final_q.send_task_result(
                self._host.name,
                self._task._uuid,
                # Overriding the task result, to represent the failure
                {
                    'failed': True,
                    'msg': f'{e}',
                    'exception': traceback.format_exc(),
                },
                # The failure pickling may have been caused by the task attrs, omit for safety
                {},
            )
        except Exception as ex:
            try:
                raise AnsibleError("Task result omitted due to queue send failure.") from ex
            except Exception as ex_wrapper:
                self._final_q.send_task_result(
                    self._host.name,
                    self._task._uuid,
                    ActionBase.result_dict_from_exception(ex_wrapper),  # Overriding the task result, to represent the failure
                    {},  # The failure pickling may have been caused by the task attrs, omit for safety
                )

        display.debug("done sending task result for task %s" % self._task._uuid)

    except AnsibleConnectionFailure:
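The worker hunk above replaces an ad-hoc surrogate dict with a wrapper exception chained to the original failure via `raise ... from`, then reports the wrapper as the result. A standalone sketch of that pattern (`result_dict_from_exception` is simplified here; the real one lives on `ActionBase`):

```python
# Surrogate-result sketch: if sending the real result fails, raise a wrapper
# chained to the cause, catch it, and send a small failure dict instead.
import traceback

def result_dict_from_exception(ex):
    # simplified stand-in for ActionBase.result_dict_from_exception
    return {'failed': True, 'msg': str(ex), 'exception': traceback.format_exc()}

def send_with_fallback(send, result):
    try:
        send(result)
    except Exception as ex:
        try:
            raise RuntimeError("Task result omitted due to queue send failure.") from ex
        except Exception as ex_wrapper:
            # the surrogate contains only plain types, so sending it is safe
            send(result_dict_from_exception(ex_wrapper))
```

Raising and immediately catching the wrapper keeps the original error attached as `__cause__`, so its traceback appears in the formatted exception text.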
@@ -10,29 +10,39 @@ import pathlib
import signal
import subprocess
import sys

import traceback
import typing as t

from ansible import constants as C
from ansible.cli import scripts
from ansible.errors import AnsibleError, AnsibleParserError, AnsibleUndefinedVariable, AnsibleConnectionFailure, AnsibleActionFail, AnsibleActionSkip
from ansible.errors import (
    AnsibleError, AnsibleParserError, AnsibleUndefinedVariable, AnsibleConnectionFailure, AnsibleActionFail, AnsibleActionSkip, AnsibleTaskError,
    AnsibleValueOmittedError,
)
from ansible.executor.task_result import TaskResult
from ansible.executor.module_common import get_action_args_with_defaults
from ansible._internal._datatag import _utils
from ansible.module_utils._internal._plugin_exec_context import PluginExecContext
from ansible.module_utils.common.messages import Detail, WarningSummary, DeprecationSummary
from ansible.module_utils.datatag import native_type_name
from ansible._internal._datatag._tags import TrustedAsTemplate
from ansible.module_utils.parsing.convert_bool import boolean
from ansible.module_utils.six import binary_type
from ansible.module_utils.common.text.converters import to_text, to_native
from ansible.module_utils.connection import write_to_stream
from ansible.module_utils.six import string_types
from ansible.playbook.conditional import Conditional
from ansible.playbook.task import Task
from ansible.plugins import get_plugin_class
from ansible.plugins.action import ActionBase
from ansible.plugins.loader import become_loader, cliconf_loader, connection_loader, httpapi_loader, netconf_loader, terminal_loader
from ansible._internal._templating._jinja_plugins import _invoke_lookup, _DirectCall
from ansible._internal._templating._engine import TemplateEngine
from ansible.template import Templar
from ansible.utils.collection_loader import AnsibleCollectionConfig
from ansible.utils.listify import listify_lookup_plugin_terms
from ansible.utils.unsafe_proxy import to_unsafe_text, wrap_var
from ansible.vars.clean import namespace_facts, clean_facts
from ansible.utils.display import Display
from ansible.utils.display import Display, _DeferredWarningContext
|
||||
from ansible.utils.vars import combine_vars
|
||||
from ansible.vars.clean import namespace_facts, clean_facts
|
||||
from ansible.vars.manager import _deprecate_top_level_fact
|
||||
from ansible._internal._errors import _captured
|
||||
|
||||
display = Display()
|
||||
|
||||
|
|
@ -60,29 +70,6 @@ def task_timeout(signum, frame):
|
|||
raise TaskTimeoutError(frame=frame)
|
||||
|
||||
|
||||
def remove_omit(task_args, omit_token):
|
||||
"""
|
||||
Remove args with a value equal to the ``omit_token`` recursively
|
||||
to align with now having suboptions in the argument_spec
|
||||
"""
|
||||
|
||||
if not isinstance(task_args, dict):
|
||||
return task_args
|
||||
|
||||
new_args = {}
|
||||
for i in task_args.items():
|
||||
if i[1] == omit_token:
|
||||
continue
|
||||
elif isinstance(i[1], dict):
|
||||
new_args[i[0]] = remove_omit(i[1], omit_token)
|
||||
elif isinstance(i[1], list):
|
||||
new_args[i[0]] = [remove_omit(v, omit_token) for v in i[1]]
|
||||
else:
|
||||
new_args[i[0]] = i[1]
|
||||
|
||||
return new_args
|
||||
|
||||
|
||||
class TaskExecutor:
|
||||
|
||||
"""
|
||||
|
|
@ -92,7 +79,7 @@ class TaskExecutor:
|
|||
class.
|
||||
"""
|
||||
|
||||
def __init__(self, host, task, job_vars, play_context, loader, shared_loader_obj, final_q, variable_manager):
|
||||
def __init__(self, host, task: Task, job_vars, play_context, loader, shared_loader_obj, final_q, variable_manager):
|
||||
self._host = host
|
||||
self._task = task
|
||||
self._job_vars = job_vars
|
||||
|
|
@ -103,6 +90,7 @@ class TaskExecutor:
|
|||
self._final_q = final_q
|
||||
self._variable_manager = variable_manager
|
||||
self._loop_eval_error = None
|
||||
self._task_templar = TemplateEngine(loader=self._loader, variables=self._job_vars)
|
||||
|
||||
self._task.squash()
|
||||
|
||||
|
|
@ -134,10 +122,14 @@ class TaskExecutor:
|
|||
# loop through the item results and set the global changed/failed/skipped result flags based on any item.
|
||||
res['skipped'] = True
|
||||
for item in item_results:
|
||||
if item.get('_ansible_no_log'):
|
||||
res.update(_ansible_no_log=True) # ensure no_log processing recognizes at least one item needs to be censored
|
||||
|
||||
if 'changed' in item and item['changed'] and not res.get('changed'):
|
||||
res['changed'] = True
|
||||
if res['skipped'] and ('skipped' not in item or ('skipped' in item and not item['skipped'])):
|
||||
res['skipped'] = False
|
||||
# FIXME: normalize `failed` to a bool, warn if the action/module used non-bool
|
||||
if 'failed' in item and item['failed']:
|
||||
item_ignore = item.pop('_ansible_ignore_errors')
|
||||
if not res.get('failed'):
|
||||
|
|
@ -164,6 +156,7 @@ class TaskExecutor:
|
|||
res[array] = res[array] + item[array]
|
||||
del item[array]
|
||||
|
||||
# FIXME: normalize `failed` to a bool, warn if the action/module used non-bool
|
||||
if not res.get('failed', False):
|
||||
res['msg'] = 'All items completed'
|
||||
if res['skipped']:
|
||||
|
|
@ -172,43 +165,23 @@ class TaskExecutor:
|
|||
res = dict(changed=False, skipped=True, skipped_reason='No items in the list', results=[])
|
||||
else:
|
||||
display.debug("calling self._execute()")
|
||||
res = self._execute()
|
||||
res = self._execute(self._task_templar, self._job_vars)
|
||||
display.debug("_execute() done")
|
||||
|
||||
# make sure changed is set in the result, if it's not present
|
||||
if 'changed' not in res:
|
||||
res['changed'] = False
|
||||
|
||||
def _clean_res(res, errors='surrogate_or_strict'):
|
||||
if isinstance(res, binary_type):
|
||||
return to_unsafe_text(res, errors=errors)
|
||||
elif isinstance(res, dict):
|
||||
for k in res:
|
||||
try:
|
||||
res[k] = _clean_res(res[k], errors=errors)
|
||||
except UnicodeError:
|
||||
if k == 'diff':
|
||||
# If this is a diff, substitute a replacement character if the value
|
||||
# is undecodable as utf8. (Fix #21804)
|
||||
display.warning("We were unable to decode all characters in the module return data."
|
||||
" Replaced some in an effort to return as much as possible")
|
||||
res[k] = _clean_res(res[k], errors='surrogate_then_replace')
|
||||
else:
|
||||
raise
|
||||
elif isinstance(res, list):
|
||||
for idx, item in enumerate(res):
|
||||
res[idx] = _clean_res(item, errors=errors)
|
||||
return res
|
||||
|
||||
display.debug("dumping result to json")
|
||||
res = _clean_res(res)
|
||||
display.debug("done dumping result, returning")
|
||||
return res
|
||||
except AnsibleError as e:
|
||||
return dict(failed=True, msg=wrap_var(to_text(e, nonstring='simplerepr')), _ansible_no_log=self._play_context.no_log)
|
||||
except Exception as e:
|
||||
return dict(failed=True, msg=wrap_var('Unexpected failure during module execution: %s' % (to_native(e, nonstring='simplerepr'))),
|
||||
exception=to_text(traceback.format_exc()), stdout='', _ansible_no_log=self._play_context.no_log)
|
||||
except Exception as ex:
|
||||
result = ActionBase.result_dict_from_exception(ex)
|
||||
|
||||
self._task.update_result_no_log(self._task_templar, result)
|
||||
|
||||
if not isinstance(ex, AnsibleError):
|
||||
result.update(msg=f'Unexpected failure during task execution: {result["msg"]}')
|
||||
|
||||
return result
|
||||
finally:
|
||||
try:
|
||||
self._connection.close()
|
||||
|
|
@ -217,7 +190,7 @@ class TaskExecutor:
|
|||
except Exception as e:
|
||||
display.debug(u"error closing connection: %s" % to_text(e))
|
||||
|
||||
def _get_loop_items(self):
|
||||
def _get_loop_items(self) -> list[t.Any] | None:
|
||||
"""
|
||||
Loads a lookup plugin to handle the with_* portion of a task (if specified),
|
||||
and returns the items result.
|
||||
|
|
@ -230,49 +203,51 @@ class TaskExecutor:
|
|||
if self._loader.get_basedir() not in self._job_vars['ansible_search_path']:
|
||||
self._job_vars['ansible_search_path'].append(self._loader.get_basedir())
|
||||
|
||||
templar = Templar(loader=self._loader, variables=self._job_vars)
|
||||
items = None
|
||||
if self._task.loop_with:
|
||||
if self._task.loop_with in self._shared_loader_obj.lookup_loader:
|
||||
templar = self._task_templar
|
||||
terms = self._task.loop
|
||||
|
||||
# TODO: hardcoded so it fails for non first_found lookups, but this should be generalized for those that don't do their own templating
|
||||
# lookup prop/attribute?
|
||||
fail = bool(self._task.loop_with != 'first_found')
|
||||
loop_terms = listify_lookup_plugin_terms(terms=self._task.loop, templar=templar, fail_on_undefined=fail, convert_bare=False)
|
||||
if isinstance(terms, str):
|
||||
terms = templar.resolve_to_container(_utils.str_problematic_strip(terms))
|
||||
|
||||
# get lookup
|
||||
mylookup = self._shared_loader_obj.lookup_loader.get(self._task.loop_with, loader=self._loader, templar=templar)
|
||||
if not isinstance(terms, list):
|
||||
terms = [terms]
|
||||
|
||||
# give lookup task 'context' for subdir (mostly needed for first_found)
|
||||
for subdir in ['template', 'var', 'file']: # TODO: move this to constants?
|
||||
if subdir in self._task.action:
|
||||
break
|
||||
setattr(mylookup, '_subdir', subdir + 's')
|
||||
@_DirectCall.mark
|
||||
def invoke_lookup() -> t.Any:
|
||||
"""Scope-capturing wrapper around _invoke_lookup to avoid functools.partial obscuring its usage from type-checking tools."""
|
||||
return _invoke_lookup(
|
||||
plugin_name=self._task.loop_with,
|
||||
lookup_terms=terms,
|
||||
lookup_kwargs=dict(wantlist=True),
|
||||
invoked_as_with=True,
|
||||
)
|
||||
|
||||
# run lookup
|
||||
items = wrap_var(mylookup.run(terms=loop_terms, variables=self._job_vars, wantlist=True))
|
||||
else:
|
||||
raise AnsibleError("Unexpected failure in finding the lookup named '%s' in the available lookup plugins" % self._task.loop_with)
|
||||
# Smuggle a special wrapped lookup invocation in as a local variable for its exclusive use when being evaluated as `with_(lookup)`.
|
||||
# This value will not be visible to other users of this templar or its `available_variables`.
|
||||
items = templar.evaluate_expression(expression=TrustedAsTemplate().tag("invoke_lookup()"), local_variables=dict(invoke_lookup=invoke_lookup))
|
||||
|
||||
elif self._task.loop is not None:
|
||||
items = templar.template(self._task.loop)
|
||||
items = self._task_templar.template(self._task.loop)
|
||||
|
||||
if not isinstance(items, list):
|
||||
raise AnsibleError(
|
||||
"Invalid data passed to 'loop', it requires a list, got this instead: %s."
|
||||
" Hint: If you passed a list/dict of just one element,"
|
||||
" try adding wantlist=True to your lookup invocation or use q/query instead of lookup." % items
|
||||
f"The `loop` value must resolve to a 'list', not {native_type_name(items)!r}.",
|
||||
help_text="Provide a list of items/templates, or a template resolving to a list.",
|
||||
obj=self._task.loop,
|
||||
)
|
||||
|
||||
return items
|
||||
|
||||
def _run_loop(self, items):
|
||||
def _run_loop(self, items: list[t.Any]) -> list[dict[str, t.Any]]:
|
||||
"""
|
||||
Runs the task with the loop items specified and collates the result
|
||||
into an array named 'results' which is inserted into the final result
|
||||
along with the item for which the loop ran.
|
||||
"""
|
||||
task_vars = self._job_vars
|
||||
templar = Templar(loader=self._loader, variables=task_vars)
|
||||
templar = TemplateEngine(loader=self._loader, variables=task_vars)
|
||||
|
||||
self._task.loop_control.post_validate(templar=templar)
|
||||
|
||||
|
|
@ -281,17 +256,20 @@ class TaskExecutor:
|
|||
loop_pause = self._task.loop_control.pause
|
||||
extended = self._task.loop_control.extended
|
||||
extended_allitems = self._task.loop_control.extended_allitems
|
||||
|
||||
# ensure we always have a label
|
||||
label = self._task.loop_control.label or '{{' + loop_var + '}}'
|
||||
label = self._task.loop_control.label or templar.variable_name_as_template(loop_var)
|
||||
|
||||
if loop_var in task_vars:
|
||||
display.warning(u"%s: The loop variable '%s' is already in use. "
|
||||
u"You should set the `loop_var` value in the `loop_control` option for the task"
|
||||
u" to something else to avoid variable collisions and unexpected behavior." % (self._task, loop_var))
|
||||
display.warning(
|
||||
msg=f"The loop variable {loop_var!r} is already in use.",
|
||||
help_text="You should set the `loop_var` value in the `loop_control` option for the task "
|
||||
"to something else to avoid variable collisions and unexpected behavior.",
|
||||
obj=loop_var,
|
||||
)
|
||||
|
||||
ran_once = False
|
||||
task_fields = None
|
||||
no_log = False
|
||||
items_len = len(items)
|
||||
results = []
|
||||
for item_index, item in enumerate(items):
|
||||
|
|
@ -331,7 +309,7 @@ class TaskExecutor:
|
|||
ran_once = True
|
||||
|
||||
try:
|
||||
tmp_task = self._task.copy(exclude_parent=True, exclude_tasks=True)
|
||||
tmp_task: Task = self._task.copy(exclude_parent=True, exclude_tasks=True)
|
||||
tmp_task._parent = self._task._parent
|
||||
tmp_play_context = self._play_context.copy()
|
||||
except AnsibleParserError as e:
|
||||
|
|
@ -340,9 +318,11 @@ class TaskExecutor:
|
|||
|
||||
# now we swap the internal task and play context with their copies,
|
||||
# execute, and swap them back so we can do the next iteration cleanly
|
||||
# NB: this swap-a-dee-doo confuses some type checkers about the type of tmp_task/self._task
|
||||
(self._task, tmp_task) = (tmp_task, self._task)
|
||||
(self._play_context, tmp_play_context) = (tmp_play_context, self._play_context)
|
||||
res = self._execute(variables=task_vars)
|
||||
|
||||
res = self._execute(templar=templar, variables=task_vars)
|
||||
|
||||
if self._task.register:
|
||||
# Ensure per loop iteration results are registered in case `_execute()`
|
||||
|
|
@ -354,9 +334,6 @@ class TaskExecutor:
|
|||
(self._task, tmp_task) = (tmp_task, self._task)
|
||||
(self._play_context, tmp_play_context) = (tmp_play_context, self._play_context)
|
||||
|
||||
# update 'general no_log' based on specific no_log
|
||||
no_log = no_log or tmp_task.no_log
|
||||
|
||||
# now update the result with the item info, and append the result
|
||||
# to the list of results
|
||||
res[loop_var] = item
|
||||
|
|
@ -391,6 +368,7 @@ class TaskExecutor:
|
|||
task_fields=task_fields,
|
||||
)
|
||||
|
||||
# FIXME: normalize `failed` to a bool, warn if the action/module used non-bool
|
||||
if tr.is_failed() or tr.is_unreachable():
|
||||
self._final_q.send_callback('v2_runner_item_on_failed', tr)
|
||||
elif tr.is_skipped():
|
||||
|
|
@ -405,11 +383,14 @@ class TaskExecutor:
|
|||
|
||||
# break loop if break_when conditions are met
|
||||
if self._task.loop_control and self._task.loop_control.break_when:
|
||||
cond = Conditional(loader=self._loader)
|
||||
cond.when = self._task.loop_control.get_validated_value(
|
||||
'break_when', self._task.loop_control.fattributes.get('break_when'), self._task.loop_control.break_when, templar
|
||||
break_when = self._task.loop_control.get_validated_value(
|
||||
'break_when',
|
||||
self._task.loop_control.fattributes.get('break_when'),
|
||||
self._task.loop_control.break_when,
|
||||
templar,
|
||||
)
|
||||
if cond.evaluate_conditional(templar, task_vars):
|
||||
|
||||
if self._task._resolve_conditional(break_when, task_vars):
|
||||
# delete loop vars before exiting loop
|
||||
del task_vars[loop_var]
|
||||
break
|
||||
|
|
@ -431,7 +412,6 @@ class TaskExecutor:
|
|||
if var in task_vars and var not in self._job_vars:
|
||||
del task_vars[var]
|
||||
|
||||
self._task.no_log = no_log
|
||||
# NOTE: run_once cannot contain loop vars because it's templated earlier also
|
||||
# This is saving the post-validated field from the last loop so the strategy can use the templated value post task execution
|
||||
self._task.run_once = task_fields.get('run_once')
|
||||
|
|
@ -447,22 +427,50 @@ class TaskExecutor:
|
|||
# At the point this is executed it is safe to mutate self._task,
|
||||
# since `self._task` is either a copy referred to by `tmp_task` in `_run_loop`
|
||||
# or just a singular non-looped task
|
||||
if delegated_host_name:
|
||||
self._task.delegate_to = delegated_host_name
|
||||
variables.update(delegated_vars)
|
||||
|
||||
def _execute(self, variables=None):
|
||||
self._task.delegate_to = delegated_host_name # always override, since a templated result could be an omit (-> None)
|
||||
variables.update(delegated_vars)
|
||||
|
||||
def _execute(self, templar: TemplateEngine, variables: dict[str, t.Any]) -> dict[str, t.Any]:
|
||||
result: dict[str, t.Any]
|
||||
|
||||
with _DeferredWarningContext(variables=variables) as warning_ctx:
|
||||
try:
|
||||
# DTFIX-FUTURE: improve error handling to prioritize the earliest exception, turning the remaining ones into warnings
|
||||
result = self._execute_internal(templar, variables)
|
||||
self._apply_task_result_compat(result, warning_ctx)
|
||||
_captured.AnsibleActionCapturedError.maybe_raise_on_result(result)
|
||||
except Exception as ex:
|
||||
try:
|
||||
raise AnsibleTaskError(obj=self._task.get_ds()) from ex
|
||||
except AnsibleTaskError as atex:
|
||||
result = ActionBase.result_dict_from_exception(atex)
|
||||
result.setdefault('changed', False)
|
||||
|
||||
self._task.update_result_no_log(templar, result)
|
||||
|
||||
# The warnings/deprecations in the result have already been captured in the _DeferredWarningContext by _apply_task_result_compat.
|
||||
# The captured warnings/deprecations are a superset of the ones from the result, and may have been converted from a dict to a dataclass.
|
||||
# These are then used to supersede the entries in the result.
|
||||
|
||||
result.pop('warnings', None)
|
||||
result.pop('deprecations', None)
|
||||
|
||||
if warnings := warning_ctx.get_warnings():
|
||||
result.update(warnings=warnings)
|
||||
|
||||
if deprecation_warnings := warning_ctx.get_deprecation_warnings():
|
||||
result.update(deprecations=deprecation_warnings)
|
||||
|
||||
return result
|
||||
|
||||
def _execute_internal(self, templar: TemplateEngine, variables: dict[str, t.Any]) -> dict[str, t.Any]:
|
||||
"""
|
||||
The primary workhorse of the executor system, this runs the task
|
||||
on the specified host (which may be the delegated_to host) and handles
|
||||
the retry/until and block rescue/always execution
|
||||
"""
|
||||
|
||||
if variables is None:
|
||||
variables = self._job_vars
|
||||
|
||||
templar = Templar(loader=self._loader, variables=variables)
|
||||
|
||||
self._calculate_delegate_to(templar, variables)
|
||||
|
||||
context_validation_error = None
|
||||
|
|
@ -497,18 +505,13 @@ class TaskExecutor:
|
|||
# skipping this task during the conditional evaluation step
|
||||
context_validation_error = e
|
||||
|
||||
no_log = self._play_context.no_log
|
||||
|
||||
# Evaluate the conditional (if any) for this task, which we do before running
|
||||
# the final task post-validation. We do this before the post validation due to
|
||||
# the fact that the conditional may specify that the task be skipped due to a
|
||||
# variable not being present which would otherwise cause validation to fail
|
||||
try:
|
||||
conditional_result, false_condition = self._task.evaluate_conditional_with_result(templar, tempvars)
|
||||
if not conditional_result:
|
||||
display.debug("when evaluation is False, skipping this task")
|
||||
return dict(changed=False, skipped=True, skip_reason='Conditional result was False',
|
||||
false_condition=false_condition, _ansible_no_log=no_log)
|
||||
if not self._task._resolve_conditional(self._task.when, tempvars, result_context=(rc := t.cast(dict[str, t.Any], {}))):
|
||||
return dict(changed=False, skipped=True, skip_reason='Conditional result was False') | rc
|
||||
except AnsibleError as e:
|
||||
# loop error takes precedence
|
||||
if self._loop_eval_error is not None:
|
||||
|
|
@ -524,22 +527,27 @@ class TaskExecutor:
|
|||
|
||||
# if we ran into an error while setting up the PlayContext, raise it now, unless is known issue with delegation
|
||||
# and undefined vars (correct values are in cvars later on and connection plugins, if still error, blows up there)
|
||||
|
||||
# DTFIX-RELEASE: this should probably be declaratively handled in post_validate (or better, get rid of play_context)
|
||||
if context_validation_error is not None:
|
||||
raiseit = True
|
||||
if self._task.delegate_to:
|
||||
if isinstance(context_validation_error, AnsibleUndefinedVariable):
|
||||
raiseit = False
|
||||
elif isinstance(context_validation_error, AnsibleParserError):
|
||||
if isinstance(context_validation_error, AnsibleParserError):
|
||||
# parser error, might be cause by undef too
|
||||
orig_exc = getattr(context_validation_error, 'orig_exc', None)
|
||||
if isinstance(orig_exc, AnsibleUndefinedVariable):
|
||||
if isinstance(context_validation_error.__cause__, AnsibleUndefinedVariable):
|
||||
raiseit = False
|
||||
elif isinstance(context_validation_error, AnsibleUndefinedVariable):
|
||||
# DTFIX-RELEASE: should not be possible to hit this now (all are AnsibleFieldAttributeError)?
|
||||
raiseit = False
|
||||
if raiseit:
|
||||
raise context_validation_error # pylint: disable=raising-bad-type
|
||||
|
||||
# set templar to use temp variables until loop is evaluated
|
||||
templar.available_variables = tempvars
|
||||
|
||||
# Now we do final validation on the task, which sets all fields to their final values.
|
||||
self._task.post_validate(templar=templar)
|
||||
|
||||
# if this task is a TaskInclude, we just return now with a success code so the
|
||||
# main thread can expand the task list for the given host
|
||||
if self._task.action in C._ACTION_INCLUDE_TASKS:
|
||||
|
|
@ -548,7 +556,6 @@ class TaskExecutor:
|
|||
if not include_file:
|
||||
return dict(failed=True, msg="No include file was specified to the include")
|
||||
|
||||
include_file = templar.template(include_file)
|
||||
return dict(include=include_file, include_args=include_args)
|
||||
|
||||
# if this task is a IncludeRole, we just return now with a success code so the main thread can expand the task list for the given host
|
||||
|
|
@ -556,32 +563,9 @@ class TaskExecutor:
|
|||
include_args = self._task.args.copy()
|
||||
return dict(include_args=include_args)
|
||||
|
||||
# Now we do final validation on the task, which sets all fields to their final values.
|
||||
try:
|
||||
self._task.post_validate(templar=templar)
|
||||
except AnsibleError:
|
||||
raise
|
||||
except Exception:
|
||||
return dict(changed=False, failed=True, _ansible_no_log=no_log, exception=to_text(traceback.format_exc()))
|
||||
if '_variable_params' in self._task.args:
|
||||
variable_params = self._task.args.pop('_variable_params')
|
||||
if isinstance(variable_params, dict):
|
||||
if C.INJECT_FACTS_AS_VARS:
|
||||
display.warning("Using a variable for a task's 'args' is unsafe in some situations "
|
||||
"(see https://docs.ansible.com/ansible/devel/reference_appendices/faq.html#argsplat-unsafe)")
|
||||
variable_params.update(self._task.args)
|
||||
self._task.args = variable_params
|
||||
else:
|
||||
# if we didn't get a dict, it means there's garbage remaining after k=v parsing, just give up
|
||||
# see https://github.com/ansible/ansible/issues/79862
|
||||
raise AnsibleError(f"invalid or malformed argument: '{variable_params}'")
|
||||
|
||||
# update no_log to task value, now that we have it templated
|
||||
no_log = self._task.no_log
|
||||
|
||||
# free tempvars up, not used anymore, cvars and vars_copy should be mainly used after this point
|
||||
# updating the original 'variables' at the end
|
||||
tempvars = {}
|
||||
del tempvars
|
||||
|
||||
# setup cvars copy, used for all connection related templating
|
||||
if self._task.delegate_to:
|
||||
|
|
@ -633,23 +617,7 @@ class TaskExecutor:
|
|||
cvars['ansible_python_interpreter'] = sys.executable
|
||||
|
||||
# get handler
|
||||
self._handler, module_context = self._get_action_handler_with_module_context(templar=templar)
|
||||
|
||||
if module_context is not None:
|
||||
module_defaults_fqcn = module_context.resolved_fqcn
|
||||
else:
|
||||
module_defaults_fqcn = self._task.resolved_action
|
||||
|
||||
# Apply default params for action/module, if present
|
||||
self._task.args = get_action_args_with_defaults(
|
||||
module_defaults_fqcn, self._task.args, self._task.module_defaults, templar,
|
||||
action_groups=self._task._parent._play._action_groups
|
||||
)
|
||||
|
||||
# And filter out any fields which were set to default(omit), and got the omit token value
|
||||
omit_token = variables.get('omit')
|
||||
if omit_token is not None:
|
||||
self._task.args = remove_omit(self._task.args, omit_token)
|
||||
self._handler, _module_context = self._get_action_handler_with_module_context(templar=templar)
|
||||
|
||||
retries = 1 # includes the default actual run + retries set by user/default
|
||||
if self._task.retries is not None:
|
||||
|
|
@ -669,7 +637,10 @@ class TaskExecutor:
|
|||
if self._task.timeout:
|
||||
old_sig = signal.signal(signal.SIGALRM, task_timeout)
|
||||
signal.alarm(self._task.timeout)
|
||||
result = self._handler.run(task_vars=vars_copy)
|
||||
with PluginExecContext(self._handler):
|
||||
result = self._handler.run(task_vars=vars_copy)
|
||||
|
||||
# DTFIX-RELEASE: nuke this, it hides a lot of error detail- remove the active exception propagation hack from AnsibleActionFail at the same time
|
||||
except (AnsibleActionFail, AnsibleActionSkip) as e:
|
||||
return e.result
|
||||
except AnsibleConnectionFailure as e:
|
||||
|
|
@ -684,12 +655,6 @@ class TaskExecutor:
|
|||
self._handler.cleanup()
|
||||
display.debug("handler run complete")
|
||||
|
||||
# propagate no log to result- the action can set this, so only overwrite it with the task's value if missing or falsey
|
||||
result["_ansible_no_log"] = bool(no_log or result.get('_ansible_no_log', False))
|
||||
|
||||
if self._task.action not in C._ACTION_WITH_CLEAN_FACTS:
|
||||
result = wrap_var(result)
|
||||
|
||||
# update the local copy of vars with the registered value, if specified,
|
||||
# or any facts which may have been generated by the module execution
|
||||
if self._task.register:
|
||||
|
|
@ -713,26 +678,6 @@ class TaskExecutor:
|
|||
result,
|
||||
task_fields=self._task.dump_attrs()))
|
||||
|
||||
# ensure no log is preserved
|
||||
result["_ansible_no_log"] = no_log
|
||||
|
||||
# helper methods for use below in evaluating changed/failed_when
|
||||
def _evaluate_changed_when_result(result):
|
||||
if self._task.changed_when is not None and self._task.changed_when:
|
||||
cond = Conditional(loader=self._loader)
|
||||
cond.when = self._task.changed_when
|
||||
result['changed'] = cond.evaluate_conditional(templar, vars_copy)
|
||||
|
||||
def _evaluate_failed_when_result(result):
|
||||
if self._task.failed_when:
|
||||
cond = Conditional(loader=self._loader)
|
||||
cond.when = self._task.failed_when
|
||||
failed_when_result = cond.evaluate_conditional(templar, vars_copy)
|
||||
result['failed_when_result'] = result['failed'] = failed_when_result
|
||||
else:
|
||||
failed_when_result = False
|
||||
return failed_when_result
|
||||
|
||||
if 'ansible_facts' in result and self._task.action not in C._ACTION_DEBUG:
|
||||
if self._task.action in C._ACTION_WITH_CLEAN_FACTS:
|
||||
if self._task.delegate_to and self._task.delegate_facts:
|
||||
|
|
@ -744,10 +689,11 @@ class TaskExecutor:
|
|||
vars_copy.update(result['ansible_facts'])
|
||||
else:
|
||||
# TODO: cleaning of facts should eventually become part of taskresults instead of vars
|
||||
af = wrap_var(result['ansible_facts'])
|
||||
af = result['ansible_facts']
|
||||
vars_copy['ansible_facts'] = combine_vars(vars_copy.get('ansible_facts', {}), namespace_facts(af))
|
||||
if C.INJECT_FACTS_AS_VARS:
|
||||
vars_copy.update(clean_facts(af))
|
||||
cleaned_toplevel = {k: _deprecate_top_level_fact(v) for k, v in clean_facts(af).items()}
|
||||
vars_copy.update(cleaned_toplevel)
|
||||
|
||||
# set the failed property if it was missing.
|
||||
if 'failed' not in result:
|
||||
|
|
@ -765,9 +711,6 @@ class TaskExecutor:
|
|||
if 'changed' not in result:
|
||||
result['changed'] = False
|
||||
|
||||
if self._task.action not in C._ACTION_WITH_CLEAN_FACTS:
|
||||
result = wrap_var(result)
|
||||
|
||||
# re-update the local copy of vars with the registered value, if specified,
|
||||
# or any facts which may have been generated by the module execution
|
||||
# This gives changed/failed_when access to additional recently modified
|
||||
|
|
@ -780,18 +723,30 @@ class TaskExecutor:
|
|||
if 'skipped' not in result:
|
||||
condname = 'changed'
|
||||
|
||||
# DTFIX-RELEASE: error normalization has not yet occurred; this means that the expressions used for until/failed_when/changed_when/break_when
|
||||
# and when (for loops on the second and later iterations) cannot see the normalized error shapes. This, and the current impl of the expression
|
||||
# handling here causes a number of problems:
|
||||
# * any error in one of the post-task exec expressions is silently ignored and detail lost (eg: `failed_when: syntax ERROR @$123`)
|
||||
# * they cannot reliably access error/warning details, since many of those details are inaccessible until the error normalization occurs
|
||||
# * error normalization includes `msg` if present, and supplies `unknown error` if not; this leads to screwy results on True failed_when if
|
||||
# `msg` is present, eg: `{debug: {}, failed_when: True` -> "Task failed: Action failed: Hello world!"
|
||||
# * detail about failed_when is lost; any error details from the task could potentially be grafted in/preserved if error normalization was done
|
||||
|
||||
try:
|
||||
_evaluate_changed_when_result(result)
|
||||
if self._task.changed_when is not None and self._task.changed_when:
|
||||
result['changed'] = self._task._resolve_conditional(self._task.changed_when, vars_copy)
|
||||
|
||||
condname = 'failed'
|
||||
_evaluate_failed_when_result(result)
|
||||
|
+                if self._task.failed_when:
+                    result['failed_when_result'] = result['failed'] = self._task._resolve_conditional(self._task.failed_when, vars_copy)

-                    except AnsibleError as e:
-                        result['failed'] = True
-                        result['%s_when_result' % condname] = to_text(e)

                 if retries > 1:
-                    cond = Conditional(loader=self._loader)
-                    cond.when = self._task.until or [not result['failed']]
-                    if cond.evaluate_conditional(templar, vars_copy):
+                    if self._task._resolve_conditional(self._task.until or [not result['failed']], vars_copy):
                         break
                     else:
                         # no conditional check, or it failed, so sleep for the specified time

@@ -816,9 +771,6 @@
                     result['attempts'] = retries - 1
                     result['failed'] = True

-        if self._task.action not in C._ACTION_WITH_CLEAN_FACTS:
-            result = wrap_var(result)
-
         # do the final update of the local variables here, for both registered
         # values and any facts which may have been created
         if self._task.register:

@@ -829,10 +781,12 @@
                 variables.update(result['ansible_facts'])
             else:
                 # TODO: cleaning of facts should eventually become part of taskresults instead of vars
-                af = wrap_var(result['ansible_facts'])
+                af = result['ansible_facts']
                 variables['ansible_facts'] = combine_vars(variables.get('ansible_facts', {}), namespace_facts(af))
                 if C.INJECT_FACTS_AS_VARS:
-                    variables.update(clean_facts(af))
+                    # DTFIX-FUTURE: why is this happening twice, esp since we're post-fork and these will be discarded?
+                    cleaned_toplevel = {k: _deprecate_top_level_fact(v) for k, v in clean_facts(af).items()}
+                    variables.update(cleaned_toplevel)

         # save the notification target in the result, if it was specified, as
         # this task may be running in a loop in which case the notification

@@ -857,6 +811,50 @@
         display.debug("attempt loop complete, returning result")
         return result

+    @staticmethod
+    def _apply_task_result_compat(result: dict[str, t.Any], warning_ctx: _DeferredWarningContext) -> None:
+        """Apply backward-compatibility mutations to the supplied task result."""
+        if warnings := result.get('warnings'):
+            if isinstance(warnings, list):
+                for warning in warnings:
+                    if not isinstance(warning, WarningSummary):
+                        # translate non-WarningMessageDetail messages
+                        warning = WarningSummary(
+                            details=(
+                                Detail(msg=str(warning)),
+                            ),
+                        )
+
+                    warning_ctx.capture(warning)
+            else:
+                display.warning(f"Task result `warnings` was {type(warnings)} instead of {list}.")
+
+        if deprecations := result.get('deprecations'):
+            if isinstance(deprecations, list):
+                for deprecation in deprecations:
+                    if not isinstance(deprecation, DeprecationSummary):
+                        # translate non-DeprecationMessageDetail message dicts
+                        try:
+                            if deprecation.pop('collection_name', ...) is not ...:
+                                # deprecated: description='enable the deprecation message for collection_name' core_version='2.23'
+                                # self.deprecated('The `collection_name` key in the `deprecations` dictionary is deprecated.', version='2.27')
+                                pass
+
+                            # DTFIX-RELEASE: when plugin isn't set, do it at the boundary where we receive the module/action results
+                            # that may even allow us to never set it in modules/actions directly and to populate it at the boundary
+                            deprecation = DeprecationSummary(
+                                details=(
+                                    Detail(msg=deprecation.pop('msg')),
+                                ),
+                                **deprecation,
+                            )
+                        except Exception as ex:
+                            display.error_as_warning("Task result `deprecations` contained an invalid item.", exception=ex)
+
+                    warning_ctx.capture(deprecation)
+            else:
+                display.warning(f"Task result `deprecations` was {type(deprecations)} instead of {list}.")
+
     def _poll_async_result(self, result, templar, task_vars=None):
         """
         Polls for the specified JID to be complete

@@ -890,7 +888,7 @@
             connection=self._connection,
             play_context=self._play_context,
             loader=self._loader,
-            templar=templar,
+            templar=Templar._from_template_engine(templar),
             shared_loader_obj=self._shared_loader_obj,
         )

@@ -960,7 +958,7 @@
                 connection=self._connection,
                 play_context=self._play_context,
                 loader=self._loader,
-                templar=templar,
+                templar=Templar._from_template_engine(templar),
                 shared_loader_obj=self._shared_loader_obj,
             )
             cleanup_handler.run(task_vars=task_vars)

@@ -1057,7 +1055,11 @@
         options = {}
         for k in option_vars:
             if k in variables:
-                options[k] = templar.template(variables[k])
+                try:
+                    options[k] = templar.template(variables[k])
+                except AnsibleValueOmittedError:
+                    pass

         # TODO move to task method?
         plugin.set_options(task_keys=task_keys, var_options=options)

@@ -1128,7 +1130,7 @@
         """
         return self._get_action_handler_with_module_context(templar)[0]

-    def _get_action_handler_with_module_context(self, templar):
+    def _get_action_handler_with_module_context(self, templar: TemplateEngine):
         """
         Returns the correct action plugin to handle the requestion task action and the module context
         """

@@ -1190,7 +1192,7 @@
             connection=self._connection,
             play_context=self._play_context,
             loader=self._loader,
-            templar=templar,
+            templar=Templar._from_template_engine(templar),
             shared_loader_obj=self._shared_loader_obj,
             collection_list=collections
         )
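The `@@ -1057` hunk above replaces unconditional templating of plugin option variables with a try/except that silently drops any value resolving to `omit`. A minimal sketch of that skip-on-omitted pattern, using stand-in `AnsibleValueOmittedError` and `OMIT` objects (the real ones live in Ansible's templating internals and are not reproduced here):

```python
class AnsibleValueOmittedError(Exception):
    """Stand-in for the exception raised when a template resolves to `omit`."""


OMIT = object()  # sentinel standing in for Ansible's `omit` placeholder


def template(value):
    # stand-in templar: raise instead of returning a value when `omit` is seen
    if value is OMIT:
        raise AnsibleValueOmittedError()
    return value


def collect_options(option_vars, variables):
    """Build plugin options, dropping any variable that resolves to `omit`."""
    options = {}
    for k in option_vars:
        if k in variables:
            try:
                options[k] = template(variables[k])
            except AnsibleValueOmittedError:
                pass  # omitted values are simply not passed to the plugin
    return options
```

The design point mirrored here: an exception, rather than a sentinel return value, signals omission, so callers that forget to handle it fail loudly instead of storing the placeholder.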
@@ -27,18 +27,22 @@ import multiprocessing.queues

 from ansible import constants as C
 from ansible import context
-from ansible.errors import AnsibleError
+from ansible.errors import AnsibleError, ExitCode, AnsibleCallbackError
+from ansible._internal._errors._handler import ErrorHandler
 from ansible.executor.play_iterator import PlayIterator
 from ansible.executor.stats import AggregateStats
 from ansible.executor.task_result import TaskResult
+from ansible.inventory.data import InventoryData
-from ansible.module_utils.six import string_types
-from ansible.module_utils.common.text.converters import to_text, to_native
+from ansible.module_utils.common.text.converters import to_native
 from ansible.parsing.dataloader import DataLoader
 from ansible.playbook.play_context import PlayContext
 from ansible.playbook.task import Task
 from ansible.plugins.loader import callback_loader, strategy_loader, module_loader
+from ansible.plugins.callback import CallbackBase
-from ansible.template import Templar
+from ansible._internal._templating._engine import TemplateEngine
 from ansible.vars.hostvars import HostVars
 from ansible.vars.manager import VariableManager
 from ansible.utils.display import Display
 from ansible.utils.lock import lock_decorator
 from ansible.utils.multiprocessing import context as multiprocessing_context

@@ -125,27 +129,38 @@ class TaskQueueManager:
     which dispatches the Play's tasks to hosts.
     """

-    RUN_OK = 0
-    RUN_ERROR = 1
-    RUN_FAILED_HOSTS = 2
-    RUN_UNREACHABLE_HOSTS = 4
-    RUN_FAILED_BREAK_PLAY = 8
-    RUN_UNKNOWN_ERROR = 255
+    RUN_OK = ExitCode.SUCCESS
+    RUN_ERROR = ExitCode.GENERIC_ERROR
+    RUN_FAILED_HOSTS = ExitCode.HOST_FAILED
+    RUN_UNREACHABLE_HOSTS = ExitCode.HOST_UNREACHABLE
+    RUN_FAILED_BREAK_PLAY = 8  # never leaves PlaybookExecutor.run
+    RUN_UNKNOWN_ERROR = 255  # never leaves PlaybookExecutor.run, intentionally includes the bit value for 8

-    def __init__(self, inventory, variable_manager, loader, passwords, stdout_callback=None, run_additional_callbacks=True, run_tree=False, forks=None):
+    _callback_dispatch_error_handler = ErrorHandler.from_config('_CALLBACK_DISPATCH_ERROR_BEHAVIOR')
+
+    def __init__(
+        self,
+        inventory: InventoryData,
+        variable_manager: VariableManager,
+        loader: DataLoader,
+        passwords: dict[str, str | None],
+        stdout_callback: str | None = None,
+        run_additional_callbacks: bool = True,
+        run_tree: bool = False,
+        forks: int | None = None,
+    ) -> None:
         self._inventory = inventory
         self._variable_manager = variable_manager
         self._loader = loader
         self._stats = AggregateStats()
         self.passwords = passwords
-        self._stdout_callback = stdout_callback
+        self._stdout_callback: str | None | CallbackBase = stdout_callback
         self._run_additional_callbacks = run_additional_callbacks
         self._run_tree = run_tree
         self._forks = forks or 5

         self._callbacks_loaded = False
-        self._callback_plugins = []
+        self._callback_plugins: list[CallbackBase] = []
         self._start_at_done = False

         # make sure any module paths (if specified) are added to the module_loader

@@ -158,8 +173,8 @@
         self._terminated = False

         # dictionaries to keep track of failed/unreachable hosts
-        self._failed_hosts = dict()
-        self._unreachable_hosts = dict()
+        self._failed_hosts: dict[str, t.Literal[True]] = dict()
+        self._unreachable_hosts: dict[str, t.Literal[True]] = dict()

         try:
             self._final_q = FinalQueue()

@@ -291,7 +306,7 @@
         self.load_callbacks()

         all_vars = self._variable_manager.get_vars(play=play)
-        templar = Templar(loader=self._loader, variables=all_vars)
+        templar = TemplateEngine(loader=self._loader, variables=all_vars)

         new_play = play.copy()
         new_play.post_validate(templar)

@@ -394,25 +409,25 @@
         except AttributeError:
             pass

-    def clear_failed_hosts(self):
+    def clear_failed_hosts(self) -> None:
         self._failed_hosts = dict()

-    def get_inventory(self):
+    def get_inventory(self) -> InventoryData:
         return self._inventory

-    def get_variable_manager(self):
+    def get_variable_manager(self) -> VariableManager:
         return self._variable_manager

-    def get_loader(self):
+    def get_loader(self) -> DataLoader:
         return self._loader

     def get_workers(self):
         return self._workers[:]

-    def terminate(self):
+    def terminate(self) -> None:
         self._terminated = True

-    def has_dead_workers(self):
+    def has_dead_workers(self) -> bool:

         # [<WorkerProcess(WorkerProcess-2, stopped[SIGKILL])>,
         # <WorkerProcess(WorkerProcess-2, stopped[SIGTERM])>

@@ -469,11 +484,8 @@
                 continue

             for method in methods:
-                try:
-                    method(*new_args, **kwargs)
-                except Exception as e:
-                    # TODO: add config toggle to make this fatal or not?
-                    display.warning(u"Failure using method (%s) in callback plugin (%s): %s" % (to_text(method_name), to_text(callback_plugin), to_text(e)))
-                    from traceback import format_tb
-                    from sys import exc_info
-                    display.vvv('Callback Exception: \n' + ' '.join(format_tb(exc_info()[2])))
+                with self._callback_dispatch_error_handler.handle(AnsibleCallbackError):
+                    try:
+                        method(*new_args, **kwargs)
+                    except Exception as ex:
+                        raise AnsibleCallbackError(f"Callback dispatch {method_name!r} failed for plugin {callback_plugin._load_name!r}.") from ex
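The callback-dispatch hunk above routes exceptions through a config-driven `ErrorHandler` instead of an inline warn-and-continue. A rough sketch of how such a handler context manager can behave; the `ErrorBehavior` names and the warning-capture mechanics here are assumptions for illustration, not Ansible's actual configuration values:

```python
import enum
from contextlib import contextmanager


class ErrorBehavior(enum.Enum):
    WARN = 'warn'      # report the error and continue
    FAIL = 'fail'      # propagate the error
    IGNORE = 'ignore'  # silently continue


class ErrorHandler:
    """Config-driven handler: ignore, warn about, or propagate given exception types."""

    def __init__(self, behavior: ErrorBehavior) -> None:
        self.behavior = behavior
        self.warnings: list[str] = []

    @contextmanager
    def handle(self, *exception_types):
        try:
            yield
        except exception_types as ex:
            if self.behavior is ErrorBehavior.FAIL:
                raise
            if self.behavior is ErrorBehavior.WARN:
                self.warnings.append(str(ex))
            # IGNORE: swallow silently
```

The caller wraps each dispatch in `with handler.handle(SomeError):`, so the decision of whether a misbehaving callback is fatal lives in configuration rather than at every call site.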
@@ -4,12 +4,14 @@

 from __future__ import annotations

+import typing as t
+
 from ansible import constants as C
 from ansible.parsing.dataloader import DataLoader
 from ansible.vars.clean import module_response_deepcopy, strip_internal_keys

 _IGNORE = ('failed', 'skipped')
-_PRESERVE = ('attempts', 'changed', 'retries')
+_PRESERVE = ('attempts', 'changed', 'retries', '_ansible_no_log')
 _SUB_PRESERVE = {'_ansible_delegated_vars': ('ansible_host', 'ansible_port', 'ansible_user', 'ansible_connection')}

 # stuff callbacks need

@@ -127,15 +129,15 @@ class TaskResult:
                     if key in self._result[sub]:
                         subset[sub][key] = self._result[sub][key]

-        if isinstance(self._task.no_log, bool) and self._task.no_log or self._result.get('_ansible_no_log', False):
-            x = {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
+        # DTFIX-FUTURE: is checking no_log here redundant now that we use _ansible_no_log everywhere?
+        if isinstance(self._task.no_log, bool) and self._task.no_log or self._result.get('_ansible_no_log'):
+            censored_result = censor_result(self._result)

-            # preserve full
-            for preserve in _PRESERVE:
-                if preserve in self._result:
-                    x[preserve] = self._result[preserve]
+            if results := self._result.get('results'):
+                # maintain shape for loop results so callback behavior recognizes a loop was performed
+                censored_result.update(results=[censor_result(item) if item.get('_ansible_no_log') else item for item in results])

-            result._result = x
+            result._result = censored_result
         elif self._result:
             result._result = module_response_deepcopy(self._result)

@@ -151,3 +153,10 @@ class TaskResult:
             result._result.update(subset)

         return result
+
+
+def censor_result(result: dict[str, t.Any]) -> dict[str, t.Any]:
+    censored_result = {key: value for key in _PRESERVE if (value := result.get(key, ...)) is not ...}
+    censored_result.update(censored="the output has been hidden due to the fact that 'no_log: true' was specified for this result")
+
+    return censored_result
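The new `censor_result` helper added above is self-contained and can be exercised standalone; this copy restates it with its `_PRESERVE` tuple so the behavior is easy to check:

```python
import typing as t

# keys copied through even when a result is censored (mirrors the diff above)
_PRESERVE = ('attempts', 'changed', 'retries', '_ansible_no_log')


def censor_result(result: dict[str, t.Any]) -> dict[str, t.Any]:
    # the `...` sentinel distinguishes "key absent" from a legitimate falsey value
    censored_result = {key: value for key in _PRESERVE if (value := result.get(key, ...)) is not ...}
    censored_result.update(censored="the output has been hidden due to the fact that 'no_log: true' was specified for this result")

    return censored_result
```

Everything except the preserved bookkeeping keys is dropped, so secrets in `stdout`, `msg`, etc. never reach callbacks, while retry/changed accounting still works.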
@@ -57,13 +57,13 @@ def should_retry_error(exception):
     if isinstance(exception, GalaxyError) and exception.http_code in RETRY_HTTP_ERROR_CODES:
         return True

-    if isinstance(exception, AnsibleError) and (orig_exc := getattr(exception, 'orig_exc', None)):
+    if isinstance(exception, AnsibleError) and (cause := exception.__cause__):
         # URLError is often a proxy for an underlying error, handle wrapped exceptions
-        if isinstance(orig_exc, URLError):
-            orig_exc = orig_exc.reason
+        if isinstance(cause, URLError):
+            cause = cause.reason

         # Handle common URL related errors
-        if isinstance(orig_exc, (TimeoutError, BadStatusLine, IncompleteRead)):
+        if isinstance(cause, (TimeoutError, BadStatusLine, IncompleteRead)):
             return True

     return False

@@ -408,11 +408,8 @@
             method=method, timeout=self._server_timeout, http_agent=user_agent(), follow_redirects='safe')
     except HTTPError as e:
         raise GalaxyError(e, error_context_msg)
-    except Exception as e:
-        raise AnsibleError(
-            "Unknown error when attempting to call Galaxy at '%s': %s" % (url, to_native(e)),
-            orig_exc=e
-        )
+    except Exception as ex:
+        raise AnsibleError(f"Unknown error when attempting to call Galaxy at {url!r}.") from ex

     resp_data = to_text(resp.read(), errors='surrogate_or_strict')
     try:

@@ -471,8 +468,8 @@
             resp = open_url(url, data=args, validate_certs=self.validate_certs, method="POST", http_agent=user_agent(), timeout=self._server_timeout)
         except HTTPError as e:
             raise GalaxyError(e, 'Attempting to authenticate to galaxy')
-        except Exception as e:
-            raise AnsibleError('Unable to authenticate to galaxy: %s' % to_native(e), orig_exc=e)
+        except Exception as ex:
+            raise AnsibleError('Unable to authenticate to galaxy.') from ex

         data = json.loads(to_text(resp.read(), errors='surrogate_or_strict'))
         return data
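The Galaxy hunks above switch from stashing the original error in a custom `orig_exc` attribute to standard exception chaining (`raise ... from ex`), with the retry predicate reading `__cause__`. A small sketch of the pattern using a generic `RuntimeError` in place of `AnsibleError` (the wrapper name and URL are illustrative only):

```python
from urllib.error import URLError


def call_galaxy(url: str) -> None:
    # hypothetical wrapper: chain the low-level error instead of stashing it in orig_exc
    try:
        raise URLError(TimeoutError('timed out'))  # simulate a failed HTTP call
    except Exception as ex:
        raise RuntimeError(f"Unknown error when attempting to call Galaxy at {url!r}.") from ex


def should_retry_error(exception: BaseException) -> bool:
    if isinstance(exception, RuntimeError) and (cause := exception.__cause__):
        if isinstance(cause, URLError):
            cause = cause.reason  # URLError often proxies the underlying error
        if isinstance(cause, TimeoutError):
            return True
    return False
```

Using `__cause__` means any code (including tracebacks and debuggers) sees the full chain for free, instead of only callers that know about the ad-hoc attribute.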
@@ -485,16 +485,13 @@ def _download_file(url, b_path, expected_hash, validate_certs, token=None, timeo
     display.display("Downloading %s to %s" % (url, to_text(b_tarball_dir)))
     # NOTE: Galaxy redirects downloads to S3 which rejects the request
     # NOTE: if an Authorization header is attached so don't redirect it
-    try:
-        resp = open_url(
-            to_native(url, errors='surrogate_or_strict'),
-            validate_certs=validate_certs,
-            headers=None if token is None else token.headers(),
-            unredirected_headers=['Authorization'], http_agent=user_agent(),
-            timeout=timeout
-        )
-    except Exception as err:
-        raise AnsibleError(to_native(err), orig_exc=err)
+    resp = open_url(
+        to_native(url, errors='surrogate_or_strict'),
+        validate_certs=validate_certs,
+        headers=None if token is None else token.headers(),
+        unredirected_headers=['Authorization'], http_agent=user_agent(),
+        timeout=timeout
+    )

     with open(b_file_path, 'wb') as download_file:  # type: t.BinaryIO
         actual_hash = _consume_file(resp, write_to=download_file)
@@ -7,6 +7,7 @@
 from __future__ import annotations

 import os
+import pathlib
 import typing as t

 from collections import namedtuple

@@ -25,6 +26,8 @@ if t.TYPE_CHECKING:
         '_ComputedReqKindsMixin',
     )

+import ansible
+import ansible.release

 from ansible.errors import AnsibleError, AnsibleAssertionError
 from ansible.galaxy.api import GalaxyAPI

@@ -39,6 +42,7 @@ _ALLOW_CONCRETE_POINTER_IN_SOURCE = False  # NOTE: This is a feature flag
 _GALAXY_YAML = b'galaxy.yml'
 _MANIFEST_JSON = b'MANIFEST.json'
 _SOURCE_METADATA_FILE = b'GALAXY.yml'
+_ANSIBLE_PACKAGE_PATH = pathlib.Path(ansible.__file__).parent

 display = Display()

@@ -224,6 +228,13 @@
         if dir_path.endswith(to_bytes(os.path.sep)):
             dir_path = dir_path.rstrip(to_bytes(os.path.sep))
         if not _is_collection_dir(dir_path):
+            dir_pathlib = pathlib.Path(to_text(dir_path))
+
+            # special handling for bundled collections without manifests, e.g., ansible._protomatter
+            if dir_pathlib.is_relative_to(_ANSIBLE_PACKAGE_PATH):
+                req_name = f'{dir_pathlib.parent.name}.{dir_pathlib.name}'
+                return cls(req_name, ansible.release.__version__, dir_path, 'dir', None)
+
             display.warning(
                 u"Collection at '{path!s}' does not have a {manifest_json!s} "
                 u'file, nor has it {galaxy_yml!s}: cannot detect version.'.
@@ -19,64 +19,49 @@
 from __future__ import annotations

-import sys
+import typing as t

 from ansible import constants as C
 from ansible.errors import AnsibleError
 from ansible.inventory.group import Group
 from ansible.inventory.host import Host
-from ansible.module_utils.six import string_types
 from ansible.utils.display import Display
 from ansible.utils.vars import combine_vars
 from ansible.utils.path import basedir

+from . import helpers  # this is left as a module import to facilitate easier unit test patching
+

 display = Display()


-class InventoryData(object):
+class InventoryData:
     """
     Holds inventory data (host and group objects).
-    Using it's methods should guarantee expected relationships and data.
+    Using its methods should guarantee expected relationships and data.
     """

-    def __init__(self):
+    def __init__(self) -> None:

-        self.groups = {}
-        self.hosts = {}
+        self.groups: dict[str, Group] = {}
+        self.hosts: dict[str, Host] = {}

         # provides 'groups' magic var, host object has group_names
-        self._groups_dict_cache = {}
+        self._groups_dict_cache: dict[str, list[str]] = {}

         # current localhost, implicit or explicit
-        self.localhost = None
+        self.localhost: Host | None = None

-        self.current_source = None
-        self.processed_sources = []
+        self.current_source: str | None = None
+        self.processed_sources: list[str] = []

         # Always create the 'all' and 'ungrouped' groups,
         for group in ('all', 'ungrouped'):
             self.add_group(group)

         self.add_child('all', 'ungrouped')

-    def serialize(self):
-        self._groups_dict_cache = None
-        data = {
-            'groups': self.groups,
-            'hosts': self.hosts,
-            'local': self.localhost,
-            'source': self.current_source,
-            'processed_sources': self.processed_sources
-        }
-        return data
-
-    def deserialize(self, data):
-        self._groups_dict_cache = {}
-        self.hosts = data.get('hosts')
-        self.groups = data.get('groups')
-        self.localhost = data.get('local')
-        self.current_source = data.get('source')
-        self.processed_sources = data.get('processed_sources')
-
-    def _create_implicit_localhost(self, pattern):
+    def _create_implicit_localhost(self, pattern: str) -> Host:

         if self.localhost:
             new_host = self.localhost
@@ -100,8 +85,8 @@ class InventoryData(object):

         return new_host

-    def reconcile_inventory(self):
-        """ Ensure inventory basic rules, run after updates """
+    def reconcile_inventory(self) -> None:
+        """Ensure inventory basic rules, run after updates."""

         display.debug('Reconcile groups and hosts in inventory.')
         self.current_source = None

@@ -125,7 +110,7 @@

             if self.groups['ungrouped'] in mygroups:
                 # clear ungrouped of any incorrectly stored by parser
-                if set(mygroups).difference(set([self.groups['all'], self.groups['ungrouped']])):
+                if set(mygroups).difference({self.groups['all'], self.groups['ungrouped']}):
                     self.groups['ungrouped'].remove_host(host)

             elif not host.implicit:

@@ -144,8 +129,10 @@

         self._groups_dict_cache = {}

-    def get_host(self, hostname):
-        """ fetch host object using name deal with implicit localhost """
+    def get_host(self, hostname: str) -> Host | None:
+        """Fetch host object using name deal with implicit localhost."""
+
+        hostname = helpers.remove_trust(hostname)

         matching_host = self.hosts.get(hostname, None)

@@ -156,19 +143,19 @@

         return matching_host

-    def add_group(self, group):
-        """ adds a group to inventory if not there already, returns named actually used """
+    def add_group(self, group: str) -> str:
+        """Adds a group to inventory if not there already, returns named actually used."""

         if group:
-            if not isinstance(group, string_types):
+            if not isinstance(group, str):
                 raise AnsibleError("Invalid group name supplied, expected a string but got %s for %s" % (type(group), group))
-            if group not in self.groups:
-                g = Group(group)
-                if g.name not in self.groups:
-                    self.groups[g.name] = g
+            g = Group(group)
+            group = g.name  # the group object may have sanitized the group name; use whatever it has
+            if group not in self.groups:
+                self.groups[group] = g
                 self._groups_dict_cache = {}
                 display.debug("Added group %s to inventory" % group)
-                group = g.name
             else:
                 display.debug("group %s already in inventory" % group)
         else:

@@ -176,22 +163,24 @@

         return group

-    def remove_group(self, group):
+    def remove_group(self, group: Group) -> None:

-        if group in self.groups:
-            del self.groups[group]
-            display.debug("Removed group %s from inventory" % group)
+        if group.name in self.groups:
+            del self.groups[group.name]
+            display.debug("Removed group %s from inventory" % group.name)
             self._groups_dict_cache = {}

         for host in self.hosts:
             h = self.hosts[host]
             h.remove_group(group)

-    def add_host(self, host, group=None, port=None):
-        """ adds a host to inventory and possibly a group if not there already """
+    def add_host(self, host: str, group: str | None = None, port: int | str | None = None) -> str:
+        """Adds a host to inventory and possibly a group if not there already."""
+
+        host = helpers.remove_trust(host)

         if host:
-            if not isinstance(host, string_types):
+            if not isinstance(host, str):
                 raise AnsibleError("Invalid host name supplied, expected a string but got %s for %s" % (type(host), host))

             # TODO: add to_safe_host_name

@@ -211,7 +200,7 @@
             else:
                 self.set_variable(host, 'inventory_file', None)
                 self.set_variable(host, 'inventory_dir', None)
-            display.debug("Added host %s to inventory" % (host))
+            display.debug("Added host %s to inventory" % host)

             # set default localhost from inventory to avoid creating an implicit one. Last localhost defined 'wins'.
             if host in C.LOCALHOST:

@@ -232,7 +221,7 @@

         return host

-    def remove_host(self, host):
+    def remove_host(self, host: Host) -> None:

         if host.name in self.hosts:
             del self.hosts[host.name]

@@ -241,8 +230,10 @@
             g = self.groups[group]
             g.remove_host(host)

-    def set_variable(self, entity, varname, value):
-        """ sets a variable for an inventory object """
+    def set_variable(self, entity: str, varname: str, value: t.Any) -> None:
+        """Sets a variable for an inventory object."""
+
+        inv_object: Host | Group

         if entity in self.groups:
             inv_object = self.groups[entity]

@@ -254,9 +245,8 @@
         inv_object.set_variable(varname, value)
         display.debug('set %s for %s' % (varname, entity))

-    def add_child(self, group, child):
-        """ Add host or group to group """
-        added = False
+    def add_child(self, group: str, child: str) -> bool:
+        """Add host or group to group."""
         if group in self.groups:
             g = self.groups[group]
             if child in self.groups:

@@ -271,12 +261,12 @@
             raise AnsibleError("%s is not a known group" % group)
         return added

-    def get_groups_dict(self):
+    def get_groups_dict(self) -> dict[str, list[str]]:
         """
         We merge a 'magic' var 'groups' with group name keys and hostname list values into every host variable set. Cache for speed.
         """
         if not self._groups_dict_cache:
-            for (group_name, group) in self.groups.items():
+            for group_name, group in self.groups.items():
                 self._groups_dict_cache[group_name] = [h.name for h in group.get_hosts()]

         return self._groups_dict_cache
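The reworked `add_group` above leans on the fact that the `Group` constructor may sanitize the name it is given, so the caller must use `g.name` rather than the raw input. A minimal sketch of that flow; `to_safe_group_name` here is a simplified stand-in for Ansible's real sanitizer:

```python
import re


def to_safe_group_name(name: str) -> str:
    # simplified stand-in: replace characters invalid in group names with underscores
    return re.sub(r'[^A-Za-z0-9_]', '_', name)


class Group:
    def __init__(self, name: str) -> None:
        self.name = to_safe_group_name(name)


def add_group(groups: dict[str, Group], name: str) -> str:
    """Mirror of the reworked add_group: trust the sanitized name on the Group object."""
    g = Group(name)
    name = g.name  # the group object may have sanitized the group name; use whatever it has
    if name not in groups:
        groups[name] = g
    return name
```

Returning the possibly-sanitized name lets callers key follow-up operations (child links, variables) off the name that is actually stored, avoiding the old double-lookup against both the raw and sanitized forms.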
@@ -16,6 +16,8 @@
 # along with Ansible.  If not, see <http://www.gnu.org/licenses/>.
 from __future__ import annotations

+import typing as t
+
 from collections.abc import Mapping, MutableMapping
 from enum import Enum
 from itertools import chain

@@ -26,8 +28,13 @@ from ansible.module_utils.common.text.converters import to_native, to_text
 from ansible.utils.display import Display
 from ansible.utils.vars import combine_vars

+from . import helpers  # this is left as a module import to facilitate easier unit test patching
+
 display = Display()

+if t.TYPE_CHECKING:
+    from .host import Host
+

 def to_safe_group_name(name, replacer="_", force=False, silent=False):
     # Converts 'bad' characters in a string to underscores (or provided replacer) so they can be used as Ansible hosts or groups

@@ -59,22 +66,23 @@ class InventoryObjectType(Enum):


 class Group:
-    """ a group of ansible hosts """
+    """A group of ansible hosts."""
     base_type = InventoryObjectType.GROUP

     # __slots__ = [ 'name', 'hosts', 'vars', 'child_groups', 'parent_groups', 'depth', '_hosts_cache' ]

-    def __init__(self, name=None):
+    def __init__(self, name: str) -> None:
+        name = helpers.remove_trust(name)

-        self.depth = 0
-        self.name = to_safe_group_name(name)
-        self.hosts = []
-        self._hosts = None
-        self.vars = {}
-        self.child_groups = []
-        self.parent_groups = []
-        self._hosts_cache = None
-        self.priority = 1
+        self.depth: int = 0
+        self.name: str = to_safe_group_name(name)
+        self.hosts: list[Host] = []
+        self._hosts: set[str] | None = None
+        self.vars: dict[str, t.Any] = {}
+        self.child_groups: list[Group] = []
+        self.parent_groups: list[Group] = []
+        self._hosts_cache: list[Host] | None = None
+        self.priority: int = 1

     def __repr__(self):
         return self.get_name()

@@ -82,44 +90,7 @@ class Group:
     def __str__(self):
         return self.get_name()

-    def __getstate__(self):
-        return self.serialize()
-
-    def __setstate__(self, data):
-        return self.deserialize(data)
-
-    def serialize(self):
-        parent_groups = []
-        for parent in self.parent_groups:
-            parent_groups.append(parent.serialize())
-
-        self._hosts = None
-
-        result = dict(
-            name=self.name,
-            vars=self.vars.copy(),
-            parent_groups=parent_groups,
-            depth=self.depth,
-            hosts=self.hosts,
-        )
-
-        return result
-
-    def deserialize(self, data):
-        self.__init__()  # used by __setstate__ to deserialize in place  # pylint: disable=unnecessary-dunder-call
-        self.name = data.get('name')
-        self.vars = data.get('vars', dict())
-        self.depth = data.get('depth', 0)
-        self.hosts = data.get('hosts', [])
-        self._hosts = None
-
-        parent_groups = data.get('parent_groups', [])
-        for parent_data in parent_groups:
-            g = Group()
-            g.deserialize(parent_data)
-            self.parent_groups.append(g)
-
-    def _walk_relationship(self, rel, include_self=False, preserve_ordering=False):
+    def _walk_relationship(self, rel, include_self=False, preserve_ordering=False) -> set[Group] | list[Group]:
         """
         Given `rel` that is an iterable property of Group,
         consitituting a directed acyclic graph among all groups,
@ -133,12 +104,12 @@ class Group:
|
|||
F
|
||||
Called on F, returns set of (A, B, C, D, E)
|
||||
"""
|
||||
seen = set([])
|
||||
seen: set[Group] = set([])
|
||||
unprocessed = set(getattr(self, rel))
|
||||
if include_self:
|
||||
unprocessed.add(self)
|
||||
if preserve_ordering:
|
||||
ordered = [self] if include_self else []
|
||||
ordered: list[Group] = [self] if include_self else []
|
||||
ordered.extend(getattr(self, rel))
|
||||
|
||||
while unprocessed:
|
||||
|
|
@ -158,22 +129,22 @@ class Group:
|
|||
return ordered
|
||||
return seen
|
||||
|
||||
def get_ancestors(self):
|
||||
return self._walk_relationship('parent_groups')
|
||||
def get_ancestors(self) -> set[Group]:
|
||||
         return t.cast(set, self._walk_relationship('parent_groups'))

-    def get_descendants(self, **kwargs):
+    def get_descendants(self, **kwargs) -> set[Group] | list[Group]:
         return self._walk_relationship('child_groups', **kwargs)

     @property
-    def host_names(self):
+    def host_names(self) -> set[str]:
         if self._hosts is None:
-            self._hosts = set(self.hosts)
+            self._hosts = {h.name for h in self.hosts}
         return self._hosts

-    def get_name(self):
+    def get_name(self) -> str:
         return self.name

-    def add_child_group(self, group):
+    def add_child_group(self, group: Group) -> bool:
         added = False
         if self == group:
             raise Exception("can't add group to itself")

@@ -208,7 +179,7 @@ class Group:
         self.clear_hosts_cache()
         return added

-    def _check_children_depth(self):
+    def _check_children_depth(self) -> None:

-        depth = self.depth
+        start_depth = self.depth  # self.depth could change over loop

@@ -227,7 +198,7 @@ class Group:
             if depth - start_depth > len(seen):
                 raise AnsibleError("The group named '%s' has a recursive dependency loop." % to_native(self.name))

-    def add_host(self, host):
+    def add_host(self, host: Host) -> bool:
         added = False
         if host.name not in self.host_names:
             self.hosts.append(host)

@@ -237,7 +208,7 @@ class Group:
             added = True
         return added

-    def remove_host(self, host):
+    def remove_host(self, host: Host) -> bool:
         removed = False
         if host.name in self.host_names:
             self.hosts.remove(host)

@@ -247,7 +218,8 @@ class Group:
             removed = True
         return removed

-    def set_variable(self, key, value):
+    def set_variable(self, key: str, value: t.Any) -> None:
+        key = helpers.remove_trust(key)
+
         if key == 'ansible_group_priority':
             self.set_priority(int(value))

@@ -257,36 +229,36 @@ class Group:
         else:
             self.vars[key] = value

-    def clear_hosts_cache(self):
+    def clear_hosts_cache(self) -> None:

         self._hosts_cache = None
         for g in self.get_ancestors():
             g._hosts_cache = None

-    def get_hosts(self):
+    def get_hosts(self) -> list[Host]:

         if self._hosts_cache is None:
             self._hosts_cache = self._get_hosts()
         return self._hosts_cache

-    def _get_hosts(self):
+    def _get_hosts(self) -> list[Host]:

-        hosts = []
-        seen = {}
+        hosts: list[Host] = []
+        seen: set[Host] = set()
         for kid in self.get_descendants(include_self=True, preserve_ordering=True):
             kid_hosts = kid.hosts
             for kk in kid_hosts:
                 if kk not in seen:
-                    seen[kk] = 1
+                    seen.add(kk)
                     if self.name == 'all' and kk.implicit:
                         continue
                     hosts.append(kk)
         return hosts

-    def get_vars(self):
+    def get_vars(self) -> dict[str, t.Any]:
         return self.vars.copy()

-    def set_priority(self, priority):
+    def set_priority(self, priority: int | str) -> None:
         try:
             self.priority = int(priority)
         except TypeError:
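The `_get_hosts` change swaps a dict being used as a set (`seen[kk] = 1`) for a real `set`, while keeping the order-preserving deduplication. A minimal standalone sketch of that pattern (names here are illustrative, not Ansible APIs):

```python
def dedupe_preserving_order(items):
    """Return items with duplicates removed, keeping first-seen order."""
    seen = set()       # membership checks are O(1)
    result = []        # preserves encounter order, unlike a bare set
    for item in items:
        if item not in seen:
            seen.add(item)
            result.append(item)
    return result

result = dedupe_preserving_order(['web1', 'db1', 'web1', 'db2', 'db1'])
print(result)  # ['web1', 'db1', 'db2']
```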
@@ -18,6 +18,7 @@
 #############################################
 from __future__ import annotations

+from ansible._internal._datatag._tags import TrustedAsTemplate
 from ansible.utils.vars import combine_vars


@@ -37,3 +38,11 @@ def get_group_vars(groups):
         results = combine_vars(results, group.get_vars())

     return results
+
+
+def remove_trust(value: str) -> str:
+    """
+    Remove trust from strings which should not be trusted.
+    This exists to centralize the untagging call, which facilitates patching it out in unit tests.
+    """
+    return TrustedAsTemplate.untag(value)
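The `remove_trust` docstring notes that centralizing the call makes it easy to patch out in unit tests; callers reach it via module attribute access (`helpers.remove_trust`) rather than a direct name binding, so a patch on the module is seen at call time. A small sketch of that patching pattern with `unittest.mock` (the `fake_helpers` module and `set_variable` function are illustrative stand-ins, not Ansible APIs):

```python
import sys
import types
from unittest import mock

# Hypothetical stand-in for a helpers module; the "real" behavior here
# is invented purely for demonstration.
helpers = types.ModuleType('fake_helpers')
helpers.remove_trust = lambda value: value.upper()
sys.modules['fake_helpers'] = helpers

def set_variable(key):
    # Attribute lookup happens at call time, so patching the module attribute works.
    return sys.modules['fake_helpers'].remove_trust(key)

unpatched = set_variable('name')  # real helper runs
with mock.patch('fake_helpers.remove_trust', side_effect=lambda v: v):
    patched = set_variable('name')  # patched: identity function

print(unpatched, patched)  # NAME name
```

Had callers done `from fake_helpers import remove_trust` instead, the patch would not be visible to them, which is why the comment in `host.py` below calls out the module-level import.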
@@ -17,28 +17,26 @@

 from __future__ import annotations

+import collections.abc as c
+import typing as t

 from collections.abc import Mapping, MutableMapping

 from ansible.inventory.group import Group, InventoryObjectType
 from ansible.parsing.utils.addresses import patterns
-from ansible.utils.vars import combine_vars, get_unique_id
+from ansible.utils.vars import combine_vars, get_unique_id, validate_variable_name
+
+from . import helpers  # this is left as a module import to facilitate easier unit test patching

 __all__ = ['Host']


 class Host:
-    """ a single ansible host """
+    """A single ansible host."""
     base_type = InventoryObjectType.HOST

-    # __slots__ = [ 'name', 'vars', 'groups' ]

     def __getstate__(self):
         return self.serialize()

     def __setstate__(self, data):
         return self.deserialize(data)

     def __eq__(self, other):
         if not isinstance(other, Host):
             return False

@@ -56,55 +54,28 @@ class Host:
     def __repr__(self):
         return self.get_name()

-    def serialize(self):
-        groups = []
-        for group in self.groups:
-            groups.append(group.serialize())
+    def __init__(self, name: str, port: int | str | None = None, gen_uuid: bool = True) -> None:
+        name = helpers.remove_trust(name)

-        return dict(
-            name=self.name,
-            vars=self.vars.copy(),
-            address=self.address,
-            uuid=self._uuid,
-            groups=groups,
-            implicit=self.implicit,
-        )
+        self.vars: dict[str, t.Any] = {}
+        self.groups: list[Group] = []
+        self._uuid: str | None = None

-    def deserialize(self, data):
-        self.__init__(gen_uuid=False)  # used by __setstate__ to deserialize in place  # pylint: disable=unnecessary-dunder-call
-
-        self.name = data.get('name')
-        self.vars = data.get('vars', dict())
-        self.address = data.get('address', '')
-        self._uuid = data.get('uuid', None)
-        self.implicit = data.get('implicit', False)
-
-        groups = data.get('groups', [])
-        for group_data in groups:
-            g = Group()
-            g.deserialize(group_data)
-            self.groups.append(g)
-
-    def __init__(self, name=None, port=None, gen_uuid=True):
-
-        self.vars = {}
-        self.groups = []
-        self._uuid = None
-
-        self.name = name
-        self.address = name
+        self.name: str = name
+        self.address: str = name

         if port:
             self.set_variable('ansible_port', int(port))

         if gen_uuid:
             self._uuid = get_unique_id()
-        self.implicit = False
+        self.implicit: bool = False

-    def get_name(self):
+    def get_name(self) -> str:
         return self.name

-    def populate_ancestors(self, additions=None):
+    def populate_ancestors(self, additions: c.Iterable[Group] | None = None) -> None:
         # populate ancestors
         if additions is None:
             for group in self.groups:

@@ -114,7 +85,7 @@ class Host:
             if group not in self.groups:
                 self.groups.append(group)

-    def add_group(self, group):
+    def add_group(self, group: Group) -> bool:
         added = False
         # populate ancestors first
         for oldg in group.get_ancestors():

@@ -127,7 +98,7 @@ class Host:
             added = True
         return added

-    def remove_group(self, group):
+    def remove_group(self, group: Group) -> bool:
         removed = False
         if group in self.groups:
             self.groups.remove(group)

@@ -143,18 +114,25 @@ class Host:
                     self.remove_group(oldg)
         return removed

-    def set_variable(self, key, value):
+    def set_variable(self, key: str, value: t.Any) -> None:
+        key = helpers.remove_trust(key)
+
+        validate_variable_name(key)
+
         if key in self.vars and isinstance(self.vars[key], MutableMapping) and isinstance(value, Mapping):
             self.vars = combine_vars(self.vars, {key: value})
         else:
             self.vars[key] = value

-    def get_groups(self):
+    def get_groups(self) -> list[Group]:
         return self.groups

-    def get_magic_vars(self):
-        results = {}
-        results['inventory_hostname'] = self.name
+    def get_magic_vars(self) -> dict[str, t.Any]:
+        results: dict[str, t.Any] = dict(
+            inventory_hostname=self.name,
+        )
+
+        # FUTURE: these values should be dynamically calculated on access ala the rest of magic vars
         if patterns['ipv4'].match(self.name) or patterns['ipv6'].match(self.name):
             results['inventory_hostname_short'] = self.name
         else:

@@ -164,5 +142,5 @@ class Host:

         return results

-    def get_vars(self):
+    def get_vars(self) -> dict[str, t.Any]:
         return combine_vars(self.vars, self.get_magic_vars())
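`Host.set_variable` merges only when both the stored value and the new value are mappings, and replaces otherwise. A standalone sketch of that rule (the shallow `{**a, **b}` merge here is a simplification; Ansible's real `combine_vars` honors the configured hash behavior and trust handling):

```python
from collections.abc import Mapping, MutableMapping

def set_variable(vars_store, key, value):
    """Merge mappings on repeated assignment; replace everything else."""
    if key in vars_store and isinstance(vars_store[key], MutableMapping) and isinstance(value, Mapping):
        # both old and new are mappings: merge, new keys win
        vars_store[key] = {**vars_store[key], **value}
    else:
        # scalar, list, or first assignment: plain replacement
        vars_store[key] = value

host_vars = {}
set_variable(host_vars, 'opts', {'a': 1})
set_variable(host_vars, 'opts', {'b': 2})   # merged with existing mapping
print(host_vars)  # {'opts': {'a': 1, 'b': 2}}
set_variable(host_vars, 'opts', 'plain')    # non-mapping replaces outright
print(host_vars)  # {'opts': 'plain'}
```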
@@ -19,28 +19,33 @@
 from __future__ import annotations

 import fnmatch
+import functools
 import os
 import sys
 import re
 import itertools
 import traceback
+import typing as t

 from operator import attrgetter
 from random import shuffle

 from ansible import constants as C
-from ansible.errors import AnsibleError, AnsibleOptionsError, AnsibleParserError
+from ansible._internal import _json, _wrapt
+from ansible.errors import AnsibleError, AnsibleOptionsError
 from ansible.inventory.data import InventoryData
 from ansible.module_utils.six import string_types
 from ansible.module_utils.common.text.converters import to_bytes, to_text
 from ansible.parsing.utils.addresses import parse_address
 from ansible.plugins.loader import inventory_loader
+from ansible._internal._datatag._tags import Origin
 from ansible.utils.helpers import deduplicate_list
 from ansible.utils.path import unfrackpath
 from ansible.utils.display import Display
 from ansible.utils.vars import combine_vars
 from ansible.vars.plugins import get_vars_from_inventory_sources

+if t.TYPE_CHECKING:
+    from ansible.plugins.inventory import BaseInventoryPlugin
+
 display = Display()

 IGNORED_ALWAYS = [br"^\.", b"^host_vars$", b"^group_vars$", b"^vars_plugins$"]

@@ -196,12 +201,12 @@ class InventoryManager(object):
     def get_host(self, hostname):
         return self._inventory.get_host(hostname)

-    def _fetch_inventory_plugins(self):
+    def _fetch_inventory_plugins(self) -> list[BaseInventoryPlugin]:
         """ sets up loaded inventory plugins for usage """

         display.vvvv('setting up inventory plugins')

-        plugins = []
+        plugins: list[BaseInventoryPlugin] = []
         for name in C.INVENTORY_ENABLED:
             plugin = inventory_loader.get(name)
             if plugin:

@@ -276,7 +281,6 @@ class InventoryManager(object):

             # try source with each plugin
             for plugin in self._fetch_inventory_plugins():

                 plugin_name = to_text(getattr(plugin, '_load_name', getattr(plugin, '_original_path', '')))
-                display.debug(u'Attempting to use plugin %s (%s)' % (plugin_name, plugin._original_path))

@@ -287,9 +291,14 @@ class InventoryManager(object):
                     plugin_wants = False

                 if plugin_wants:
+                    # have this tag ready to apply to errors or output; str-ify source since it is often tagged by the CLI
+                    origin = Origin(description=f'<inventory plugin {plugin_name!r} with source {str(source)!r}>')
                     try:
-                        # FIXME in case plugin fails 1/2 way we have partial inventory
-                        plugin.parse(self._inventory, self._loader, source, cache=cache)
+                        inventory_wrapper = _InventoryDataWrapper(self._inventory, target_plugin=plugin, origin=origin)
+
+                        # FUTURE: now that we have a wrapper around inventory, we can have it use ChainMaps to preview the in-progress inventory,
+                        # but be able to roll back partial inventory failures by discarding the outermost layer
+                        plugin.parse(inventory_wrapper, self._loader, source, cache=cache)
                         try:
                             plugin.update_cache_if_changed()
                         except AttributeError:

@@ -298,14 +307,16 @@ class InventoryManager(object):
                         parsed = True
                         display.vvv('Parsed %s inventory source with %s plugin' % (source, plugin_name))
                         break
-                    except AnsibleParserError as e:
-                        display.debug('%s was not parsable by %s' % (source, plugin_name))
-                        tb = ''.join(traceback.format_tb(sys.exc_info()[2]))
-                        failures.append({'src': source, 'plugin': plugin_name, 'exc': e, 'tb': tb})
-                    except Exception as e:
-                        display.debug('%s failed while attempting to parse %s' % (plugin_name, source))
-                        tb = ''.join(traceback.format_tb(sys.exc_info()[2]))
-                        failures.append({'src': source, 'plugin': plugin_name, 'exc': AnsibleError(e), 'tb': tb})
+                    except AnsibleError as ex:
+                        if not ex.obj:
+                            ex.obj = origin
+                        failures.append({'src': source, 'plugin': plugin_name, 'exc': ex})
+                    except Exception as ex:
+                        try:
+                            # omit line number to prevent contextual display of script or possibly sensitive info
+                            raise AnsibleError(str(ex), obj=origin) from ex
+                        except AnsibleError as ex:
+                            failures.append({'src': source, 'plugin': plugin_name, 'exc': ex})
                 else:
                     display.vvv("%s declined parsing %s as it did not pass its verify_file() method" % (plugin_name, source))

@@ -319,9 +330,8 @@ class InventoryManager(object):
             if failures:
                 # only if no plugin processed files should we show errors.
                 for fail in failures:
-                    display.warning(u'\n* Failed to parse %s with %s plugin: %s' % (to_text(fail['src']), fail['plugin'], to_text(fail['exc'])))
-                    if 'tb' in fail:
-                        display.vvv(to_text(fail['tb']))
+                    # `obj` should always be set
+                    display.error_as_warning(msg=f'Failed to parse inventory with {fail["plugin"]!r} plugin.', exception=fail['exc'])

         # final error/warning on inventory source failure
         if C.INVENTORY_ANY_UNPARSED_IS_FAILED:

@@ -749,3 +759,36 @@ class InventoryManager(object):
             self.reconcile_inventory()

         result_item['changed'] = changed
+
+
+class _InventoryDataWrapper(_wrapt.ObjectProxy):
+    """
+    Proxy wrapper around InventoryData.
+    Allows `set_variable` calls to automatically apply template trust for plugins that don't know how.
+    """
+
+    # declared as class attrs to signal to ObjectProxy that we want them stored on the proxy, not the wrapped value
+    _target_plugin = None
+    _default_origin = None
+
+    def __init__(self, referent: InventoryData, target_plugin: BaseInventoryPlugin, origin: Origin) -> None:
+        super().__init__(referent)
+        self._target_plugin = target_plugin
+        # fallback origin to ensure that vars are tagged with at least the file they came from
+        self._default_origin = origin
+
+    @functools.cached_property
+    def _inspector(self) -> _json.AnsibleVariableVisitor:
+        """
+        Inventory plugins can delegate to other plugins (e.g. `auto`).
+        This hack defers sampling the target plugin's `trusted_by_default` attr until `set_variable` is called, typically inside `parse`.
+        Trust is then optionally applied based on the plugin's declared intent via `trusted_by_default`.
+        """
+        return _json.AnsibleVariableVisitor(
+            trusted_as_template=self._target_plugin.trusted_by_default,
+            origin=self._default_origin,
+            allow_encrypted_string=True,
+        )
+
+    def set_variable(self, entity: str, varname: str, value: t.Any) -> None:
+        self.__wrapped__.set_variable(entity, varname, self._inspector.visit(value))
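`_InventoryDataWrapper` relies on `wrapt.ObjectProxy` to pass every attribute through to the wrapped inventory while intercepting only `set_variable`. A dependency-free sketch of that interception pattern (this toy `ObjectProxy` is far simpler than `wrapt`'s, and the classes here are illustrative, not Ansible APIs):

```python
class ObjectProxy:
    """Minimal delegating proxy: unknown attributes fall through to __wrapped__."""
    def __init__(self, wrapped):
        object.__setattr__(self, '__wrapped__', wrapped)

    def __getattr__(self, name):
        # only called when normal lookup fails, i.e. for non-overridden attrs
        return getattr(object.__getattribute__(self, '__wrapped__'), name)

class Inventory:
    def __init__(self):
        self.vars = {}
    def set_variable(self, entity, varname, value):
        self.vars[(entity, varname)] = value

class TaggingWrapper(ObjectProxy):
    """Intercept set_variable to transform values; everything else passes through."""
    def set_variable(self, entity, varname, value):
        self.__wrapped__.set_variable(entity, varname, f'tagged:{value}')

inv = Inventory()
wrapper = TaggingWrapper(inv)
wrapper.set_variable('host1', 'greeting', 'hello')  # intercepted
print(inv.vars)           # {('host1', 'greeting'): 'tagged:hello'}
print(wrapper.vars)       # delegated read reaches the wrapped object
```

Plugins receive the wrapper in place of the real inventory, so untouched methods behave identically while variable writes gain the extra processing.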
@@ -0,0 +1,55 @@
+from __future__ import annotations
+
+import collections.abc as c
+import typing as t
+
+
+# DTFIX-RELEASE: bikeshed "intermediate"
+INTERMEDIATE_MAPPING_TYPES = (c.Mapping,)
+"""
+Mapping types which are supported for recursion and runtime usage, such as in serialization and templating.
+These will be converted to a simple Python `dict` before serialization or storage as a variable.
+"""
+
+INTERMEDIATE_ITERABLE_TYPES = (tuple, set, frozenset, c.Sequence)
+"""
+Iterable types which are supported for recursion and runtime usage, such as in serialization and templating.
+These will be converted to a simple Python `list` before serialization or storage as a variable.
+CAUTION: Scalar types which are sequences should be excluded when using this.
+"""
+
+ITERABLE_SCALARS_NOT_TO_ITERATE_FIXME = (str, bytes)
+"""Scalars which are also iterable, and should thus be excluded from iterable checks."""
+
+
+def is_intermediate_mapping(value: object) -> bool:
+    """Returns `True` if `value` is a type supported for projection to a Python `dict`, otherwise returns `False`."""
+    # DTFIX-RELEASE: bikeshed name
+    return isinstance(value, INTERMEDIATE_MAPPING_TYPES)
+
+
+def is_intermediate_iterable(value: object) -> bool:
+    """Returns `True` if `value` is a type supported for projection to a Python `list`, otherwise returns `False`."""
+    # DTFIX-RELEASE: bikeshed name
+    return isinstance(value, INTERMEDIATE_ITERABLE_TYPES) and not isinstance(value, ITERABLE_SCALARS_NOT_TO_ITERATE_FIXME)
+
+
+is_controller: bool = False
+"""Set to True automatically when this module is imported into an Ansible controller context."""
+
+
+def get_controller_serialize_map() -> dict[type, t.Callable]:
+    """
+    Called to augment serialization maps.
+    This implementation is replaced with the one from ansible._internal in controller contexts.
+    """
+    return {}
+
+
+def import_controller_module(_module_name: str, /) -> t.Any:
+    """
+    Called to conditionally import the named module in a controller context, otherwise returns `None`.
+    This implementation is replaced with the one from ansible._internal in controller contexts.
+    """
+    return None
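The `is_intermediate_iterable` check above has to carve `str` and `bytes` out of the sequence types, since both are `Sequence` subclasses yet must be treated as scalars. A standalone sketch of the same test (helper name is illustrative):

```python
import collections.abc as c

# str/bytes are Sequences but must be handled as scalars, not iterated.
ITERABLE_SCALARS = (str, bytes)

def is_listlike(value: object) -> bool:
    """True for values that should be projected to a Python list."""
    return isinstance(value, (tuple, set, frozenset, c.Sequence)) and not isinstance(value, ITERABLE_SCALARS)

checks = [is_listlike(v) for v in (['a'], ('a',), {'a'}, 'abc', b'abc', 42)]
print(checks)  # [True, True, True, False, False, False]
```

Without the scalar exclusion, templating a string through list-projection logic would explode it into a list of single characters.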
lib/ansible/module_utils/_internal/_ambient_context.py (new file, 58 lines)
@@ -0,0 +1,58 @@
+# Copyright (c) 2024 Ansible Project
+# Simplified BSD License (see licenses/simplified_bsd.txt or https://opensource.org/licenses/BSD-2-Clause)
+
+from __future__ import annotations
+
+import contextlib
+import contextvars
+
+# deprecated: description='typing.Self exists in Python 3.11+' python_version='3.10'
+from ..compat import typing as t
+
+
+class AmbientContextBase:
+    """
+    An abstract base context manager that, once entered, will be accessible via its `current` classmethod to any code in the same
+    `contextvars` context (e.g. same thread/coroutine), until it is exited.
+    """
+
+    __slots__ = ('_contextvar_token',)
+
+    # DTFIX-FUTURE: subclasses need to be able to opt-in to blocking nested contexts of the same type (basically optional per-callstack singleton behavior)
+    # DTFIX-RELEASE: this class should enforce strict nesting of contexts; overlapping context lifetimes leads to incredibly difficult to
+    # debug situations with undefined behavior, so it should fail fast.
+    # DTFIX-RELEASE: make frozen=True dataclass subclasses work (fix the mutability of the contextvar instance)
+
+    _contextvar: t.ClassVar[contextvars.ContextVar]  # pylint: disable=declare-non-slot  # pylint bug, see https://github.com/pylint-dev/pylint/issues/9950
+    _contextvar_token: contextvars.Token
+
+    def __init_subclass__(cls, **kwargs) -> None:
+        cls._contextvar = contextvars.ContextVar(cls.__name__)
+
+    @classmethod
+    def when(cls, condition: bool, /, *args, **kwargs) -> t.Self | contextlib.nullcontext:
+        """Return an instance of the context if `condition` is `True`, otherwise return a `nullcontext` instance."""
+        return cls(*args, **kwargs) if condition else contextlib.nullcontext()
+
+    @classmethod
+    def current(cls, optional: bool = False) -> t.Self | None:
+        """
+        Return the currently active context value for the current thread or coroutine.
+        Raises ReferenceError if a context is not active, unless `optional` is `True`.
+        """
+        try:
+            return cls._contextvar.get()
+        except LookupError:
+            if optional:
+                return None
+
+            raise ReferenceError(f"A required {cls.__name__} context is not active.") from None
+
+    def __enter__(self) -> t.Self:
+        # DTFIX-RELEASE: actively block multiple entry
+        self._contextvar_token = self.__class__._contextvar.set(self)
+        return self
+
+    def __exit__(self, exc_type, exc_val, exc_tb) -> None:
+        self.__class__._contextvar.reset(self._contextvar_token)
+        del self._contextvar_token
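The ambient-context pattern above lets deeply nested code look up the active context via a classmethod instead of threading it through every call. A runnable sketch of the same idea using only `contextvars` (the `RequestContext` subclass and its `user` field are hypothetical, purely for illustration):

```python
import contextvars

class AmbientContext:
    """Sketch of the ambient-context base: one ContextVar per subclass."""
    def __init_subclass__(cls, **kwargs):
        cls._contextvar = contextvars.ContextVar(cls.__name__)

    @classmethod
    def current(cls, optional=False):
        try:
            return cls._contextvar.get()
        except LookupError:
            if optional:
                return None
            raise ReferenceError(f"A required {cls.__name__} context is not active.") from None

    def __enter__(self):
        self._token = type(self)._contextvar.set(self)
        return self

    def __exit__(self, *exc):
        type(self)._contextvar.reset(self._token)

class RequestContext(AmbientContext):  # hypothetical subclass
    def __init__(self, user):
        self.user = user

def deep_in_call_stack():
    # no parameter threading needed; the ambient context is looked up on demand
    return RequestContext.current().user

with RequestContext('alice'):
    inside = deep_in_call_stack()
outside = RequestContext.current(optional=True)
print(inside, outside)  # alice None
```

Because `ContextVar` is per thread/coroutine, concurrent tasks each see their own active context rather than a shared global.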
lib/ansible/module_utils/_internal/_ansiballz.py (new file, 133 lines)
@@ -0,0 +1,133 @@
+# Copyright (c) 2024 Ansible Project
+# Simplified BSD License (see licenses/simplified_bsd.txt or https://opensource.org/licenses/BSD-2-Clause)
+
+"""Support code for exclusive use by the AnsiballZ wrapper."""
+
+from __future__ import annotations
+
+import atexit
+import dataclasses
+import importlib.util
+import json
+import os
+import runpy
+import sys
+import typing as t
+
+from . import _errors
+from ._plugin_exec_context import PluginExecContext, HasPluginInfo
+from .. import basic
+from ..common.json import get_module_encoder, Direction
+from ..common.messages import PluginInfo
+
+
+def run_module(
+    *,
+    json_params: bytes,
+    profile: str,
+    plugin_info_dict: dict[str, object],
+    module_fqn: str,
+    modlib_path: str,
+    init_globals: dict[str, t.Any] | None = None,
+    coverage_config: str | None = None,
+    coverage_output: str | None = None,
+) -> None:  # pragma: nocover
+    """Used internally by the AnsiballZ wrapper to run an Ansible module."""
+    try:
+        _enable_coverage(coverage_config, coverage_output)
+        _run_module(
+            json_params=json_params,
+            profile=profile,
+            plugin_info_dict=plugin_info_dict,
+            module_fqn=module_fqn,
+            modlib_path=modlib_path,
+            init_globals=init_globals,
+        )
+    except Exception as ex:  # not BaseException, since modules are expected to raise SystemExit
+        _handle_exception(ex, profile)
+
+
+def _enable_coverage(coverage_config: str | None, coverage_output: str | None) -> None:  # pragma: nocover
+    """Bootstrap `coverage` for the current Ansible module invocation."""
+    if not coverage_config:
+        return
+
+    if coverage_output:
+        # Enable code coverage analysis of the module.
+        # This feature is for internal testing and may change without notice.
+        python_version_string = '.'.join(str(v) for v in sys.version_info[:2])
+        os.environ['COVERAGE_FILE'] = f'{coverage_output}=python-{python_version_string}=coverage'
+
+        import coverage
+
+        cov = coverage.Coverage(config_file=coverage_config)
+
+        def atexit_coverage():
+            cov.stop()
+            cov.save()
+
+        atexit.register(atexit_coverage)
+
+        cov.start()
+    else:
+        # Verify coverage is available without importing it.
+        # This will detect when a module would fail with coverage enabled with minimal overhead.
+        if importlib.util.find_spec('coverage') is None:
+            raise RuntimeError('Could not find the `coverage` Python module.')
+
+
+def _run_module(
+    *,
+    json_params: bytes,
+    profile: str,
+    plugin_info_dict: dict[str, object],
+    module_fqn: str,
+    modlib_path: str,
+    init_globals: dict[str, t.Any] | None = None,
+) -> None:
+    """Used internally by `run_module` to run an Ansible module after coverage has been enabled (if applicable)."""
+    basic._ANSIBLE_ARGS = json_params
+    basic._ANSIBLE_PROFILE = profile
+
+    init_globals = init_globals or {}
+    init_globals.update(_module_fqn=module_fqn, _modlib_path=modlib_path)
+
+    with PluginExecContext(_ModulePluginWrapper(PluginInfo._from_dict(plugin_info_dict))):
+        # Run the module. By importing it as '__main__', it executes as a script.
+        runpy.run_module(mod_name=module_fqn, init_globals=init_globals, run_name='__main__', alter_sys=True)
+
+    # An Ansible module must print its own results and exit. If execution reaches this point, that did not happen.
+    raise RuntimeError('New-style module did not handle its own exit.')
+
+
+def _handle_exception(exception: BaseException, profile: str) -> t.NoReturn:
+    """Handle the given exception."""
+    result = dict(
+        failed=True,
+        exception=_errors.create_error_summary(exception),
+    )
+
+    encoder = get_module_encoder(profile, Direction.MODULE_TO_CONTROLLER)
+
+    print(json.dumps(result, cls=encoder))  # pylint: disable=ansible-bad-function
+
+    sys.exit(1)  # pylint: disable=ansible-bad-function
+
+
+@dataclasses.dataclass(frozen=True)
+class _ModulePluginWrapper(HasPluginInfo):
+    """Modules aren't plugin instances; this adapter implements the `HasPluginInfo` protocol to allow `PluginExecContext` infra to work with modules."""
+
+    plugin: PluginInfo
+
+    @property
+    def _load_name(self) -> str:
+        return self.plugin.requested_name
+
+    @property
+    def ansible_name(self) -> str:
+        return self.plugin.resolved_name
+
+    @property
+    def plugin_type(self) -> str:
+        return self.plugin.type
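`_run_module` executes the target module with `runpy.run_module(..., run_name='__main__', alter_sys=True)`, the same mechanism `python -m` uses, so the module's `if __name__ == '__main__'` entry point fires. A sketch of that behavior with a throwaway module (file name and contents are invented for the demo):

```python
import pathlib
import runpy
import sys
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    # write a tiny module that records the name it was run under
    pathlib.Path(tmp, 'mymod.py').write_text(
        "result = 'ran as ' + __name__\n"
    )
    sys.path.insert(0, tmp)
    try:
        # import the module under the name __main__, as `python -m` would;
        # run_module returns the module globals after execution
        globals_after = runpy.run_module('mymod', run_name='__main__', alter_sys=True)
    finally:
        sys.path.remove(tmp)

print(globals_after['result'])  # ran as __main__
```

Unlike a plain `import`, this leaves the module free to behave as a script, which is why the wrapper can expect it to print its own JSON result and `sys.exit` on its own.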
@@ -1,4 +1,5 @@
 """Proxy stdlib threading module that only supports non-joinable daemon threads."""

+# NB: all new local module attrs are _ prefixed to ensure an identical public attribute surface area to the module we're proxying

 from __future__ import annotations as _annotations
@@ -0,0 +1,64 @@
+"""Patch broken ClassVar support in dataclasses when ClassVar is accessed via a module other than `typing`."""
+
+# deprecated: description='verify ClassVar support in dataclasses has been fixed in Python before removing this patching code', python_version='3.12'
+
+from __future__ import annotations
+
+import dataclasses
+import sys
+import typing as t
+
+# trigger the bug by exposing typing.ClassVar via a module reference that is not `typing`
+_ts = sys.modules[__name__]
+ClassVar = t.ClassVar
+
+
+def patch_dataclasses_is_type() -> None:
+    if not _is_patch_needed():
+        return  # pragma: nocover
+
+    try:
+        real_is_type = dataclasses._is_type  # type: ignore[attr-defined]
+    except AttributeError:  # pragma: nocover
+        raise RuntimeError("unable to patch broken dataclasses ClassVar support") from None
+
+    # patch dataclasses._is_type - impl from https://github.com/python/cpython/blob/4c6d4f5cb33e48519922d635894eef356faddba2/Lib/dataclasses.py#L709-L765
+    def _is_type(annotation, cls, a_module, a_type, is_type_predicate):
+        match = dataclasses._MODULE_IDENTIFIER_RE.match(annotation)  # type: ignore[attr-defined]
+        if match:
+            ns = None
+            module_name = match.group(1)
+            if not module_name:
+                # No module name, assume the class's module did
+                # "from dataclasses import InitVar".
+                ns = sys.modules.get(cls.__module__).__dict__
+            else:
+                # Look up module_name in the class's module.
+                module = sys.modules.get(cls.__module__)
+                if module and module.__dict__.get(module_name):  # this is the patched line; removed `is a_module`
+                    ns = sys.modules.get(a_type.__module__).__dict__
+            if ns and is_type_predicate(ns.get(match.group(2)), a_module):
+                return True
+        return False
+
+    _is_type._orig_impl = real_is_type  # type: ignore[attr-defined]  # stash this away to allow unit tests to undo the patch
+
+    dataclasses._is_type = _is_type  # type: ignore[attr-defined]
+
+    try:
+        if _is_patch_needed():
+            raise RuntimeError("patching had no effect")  # pragma: nocover
+    except Exception as ex:  # pragma: nocover
+        dataclasses._is_type = real_is_type  # type: ignore[attr-defined]
+        raise RuntimeError("dataclasses ClassVar support is still broken after patching") from ex
+
+
+def _is_patch_needed() -> bool:
+    @dataclasses.dataclass
+    class CheckClassVar:
+        # this is the broken case requiring patching: ClassVar dot-referenced from a module that is not `typing` is treated as an instance field
+        # DTFIX-RELEASE: add link to CPython bug report to-be-filed (or update associated deprecation comments if we don't)
+        a_classvar: _ts.ClassVar[int]  # type: ignore[name-defined]
+        a_field: int
+
+    return len(dataclasses.fields(CheckClassVar)) != 1
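The `_is_patch_needed` probe counts generated dataclass fields because `ClassVar`-annotated attributes are supposed to be excluded from them; the bug being patched is that a `ClassVar` reached through a module other than `typing` slips through and is treated as an instance field. The correctly-working case looks like this:

```python
import dataclasses
import typing

# ClassVar annotations spelled via typing itself are reliably excluded from
# the generated fields; the patch above restores the same exclusion for
# ClassVar references that reach typing through another module.
@dataclasses.dataclass
class Example:
    shared: typing.ClassVar[int] = 0  # class-level attribute, not a field
    per_instance: int = 1             # a real dataclass field

field_names = [f.name for f in dataclasses.fields(Example)]
print(field_names)  # ['per_instance']
```

If `shared` were misclassified as a field, it would also appear in `__init__` and `fields()`, which is exactly the discrepancy the probe detects.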
Some files were not shown because too many files have changed in this diff.