pax_global_header 0000666 0000000 0000000 00000000064 13762437205 0014523 g ustar 00root root 0000000 0000000 52 comment=de7101a7e27a1ce7701eda5175562d5227a89a08
orchestra-v3-master/ 0000775 0000000 0000000 00000000000 13762437205 0014700 5 ustar 00root root 0000000 0000000 orchestra-v3-master/.gitignore 0000664 0000000 0000000 00000000166 13762437205 0016673 0 ustar 00root root 0000000 0000000 .idea
__pycache__/
*.py[cod]
build
dist
orchestra.egg-info
venv
# ytt is downloaded by setup.py
orchestra/support/ytt
orchestra-v3-master/README.md 0000664 0000000 0000000 00000001401 13762437205 0016153 0 ustar 00root root 0000000 0000000 # What is Orchestra?
Orchestra is a meta build system.
Its job is to automate the repetitive tasks required to build a complex
software project with many dependencies.
## How does it work?
TODO - write about:
* fundamental concepts (components, builds, dependencies)
* actions (clone, configure, install)
* binary archives
* "root portability" (rpath, etc)
* usage examples
* integration with git
## Configuring Orchestra
See the documentation in `/docs`.
## Usage
TODO - document Orchestra usage
## Installing
```bash
python setup.py bdist_wheel
pip install --user dist/orchestra*.whl
```
## Development setup
Creating a dedicated virtualenv is highly suggested
```bash
python3 -m venv virtualenv
. ./virtualenv/bin/activate
python setup.py develop
```
orchestra-v3-master/docs/ 0000775 0000000 0000000 00000000000 13762437205 0015630 5 ustar 00root root 0000000 0000000 orchestra-v3-master/docs/configuration.md 0000664 0000000 0000000 00000015743 13762437205 0021033 0 ustar 00root root 0000000 0000000 # Configuring Orchestra
This document explains the yaml format natively understood by Orchestra.
The configuration is actually preprocessed using [ytt](https://get-ytt.io/),
allowing to factor repetitive parts, provide user options, and so on.
When invoked Orchestra will look for a `.orchestra` directory
in the current directory or one or the parent ones (like git).
To start out it's enough to create a file in `.orchestra/configuration.yml`.
The `.orchestra` directory should be (or be placed inside) a git repository.
To use binary archives files under `.orchestra/binary_archives` should be
managed by git-lfs. It is possible to use a different repository for
that directory, e.g. using git submodules.
## Components and builds
Components must have at least one build.
The main properties of a build are the configure and install scripts.
A component can have multiple builds, for instance for
enabling different optimization levels.
This is an example of a simple component:
```yaml
simple_component:
builds:
simple_build:
configure: |
wget -O "$SOURCE_ARCHIVES/simpleproject.tar.gz" https://simpleproject.org/simpleproject.tar.gz
mkdir -p "$SOURCE_DIR" && cd "$SOURCE_DIR"
tar xzf "$SOURCE_ARCHIVES/simpleproject.tar.gz"
mkdir -p "$BUILD_DIR" && cd "$BUILD_DIR"
"$SOURCE_DIR/configure" --option-1 --option-2
install: |
cd "$BUILD_DIR"
make
make install
```
All builds must specify at least the `configure` and `install` scripts.
The scripts are run using `bash` and can use [environment variables from this list](#env-and-dirs)
### Component properties
A component can specify the following properties:
* `builds` (mandatory): a dictionary of builds
* `repository`: name of the repository to clone to get the project sources
* `default_build`: name of the default build. If not specified the first build in alphabetic order is picked.
* `add_to_path`: string prepended to $PATH. See the "Additional environment and PATH" section.
* `skip_post_install`: If true, Orchestra will skip the post install phase (RPATH adjustment, etc)
### Build properties
**configure** (mandatory)
This script usually performs the following:
* downloads sources tarball to `$SOURCE_ARCHIVES` (if the component does not specify a repository)
* extracts sources to `$SOURCE_DIR` or `$BUILD_DIR` (if the component does not specify a repository)
* `$BUILD_DIR` should be used for in-tree builds
* configuring the project (e.g. running `./configure`)
The script **must** create the directory `$BUILD_DIR`, as Orchestra considers the configure action
satisfied if it exists.
**install** (mandatory)
This script should build and install the component to `$TMP_ROOT`.
**dependencies** and **build_dependencies**
Dependencies are specified independently for each build.
Two types of dependencies exist: normal and build-only.
Normal dependencies are required both to build and to run the component,
while build-only dependencies are not required to run the component
and will not be installed if the component is installed from binary archives.
Dependencies can be specified using three syntax variations:
Example:
```yaml
components:
my_component:
builds:
my_build:
configure: ...
install: ...
dependencies:
- component_a # require the default build of `component_a`
- component_b@build_name # require the build `build_name` of `component_b`
- component_c~build_name # require any build of `component_c`, preferring `build_name`.
build_dependencies:
- gcc_component # Compiler is required only to build the component
```
**ndebug**
Boolean, defaults to true.
Used to replace `#ifdef`-like macros referencing `NDEBUG`.
## Environment variables and configurable paths
The following environment variables will be available to the shell scripts.
They are made available by inserting a prelude that performs `export VARIABLE="VALUE"`.
The value is purposefully not escaped, so that it can, for instance, expand other variables (`$OTHERVARIABLE`).
However, most variables need absolute paths if overridden.
**ORCHESTRA_DOTDIR**
Orchestra configuration directory.
**ORCHESTRA_ROOT**
Orchestra root directory.
Default value: `$ORCHESTRA_DOTDIR/../root`.
Overridable using the `paths.orchestra_root` setting by specifying an absolute path.
**SOURCE_ARCHIVES**
Directory containing cached source archives.
Default value: `$ORCHESTRA_DOTDIR/source_archives`.
Overridable using the `paths.source_archives` setting by specifying an absolute path.
**BINARY_ARCHIVES**
Directory containing cached binary archives.
Default value: `$ORCHESTRA_DOTDIR/binary_archives`.
Overridable using the `paths.binary_archives` setting by specifying an absolute path.
**SOURCES_DIR**
Directory where sources should be placed. Default value: `$ORCHESTRA_DOTDIR/../sources`.
Overridable using the `paths.sources_dir` setting by specifying an absolute path.
Not meant to be used directly, use `SOURCE_DIR`.
**BUILDS_DIR**
Directory where builds should be placed. Default value: `$ORCHESTRA_DOTDIR/../build`.
Overridable using the `paths.builds_dir` setting by specifying an absolute path.
Not meant to be used directly, use `BUILD_DIR`.
**SOURCE_DIR**
Per-component directory where sources should be placed.
Value: `$SOURCES_DIR//`.
**BUILD_DIR**
Per-build directory where build artifacts should be placed.
Value: `$BUILDS_DIR//`.
**TMP_ROOTS**
Directory containing the temporary roots, where the built components should be installed.
Default value: `$ORCHESTRA_DOTDIR/tmproot` by specifying an absolute path.
Overridable using the `paths.tmproot` setting.
**TMP_ROOT** and **DESTDIR**
Per-component temporary root directory where the built component should be installed.
Value: `$TMP_ROOTS/`.
The files installed to `TMP_ROOT` will be indexed
and moved automatically by Orchestra to the "true" root.
`DESTDIR` is only set during the install phase.
## Additional environment and PATH
Additional environment variables can be exported by adding
an element to the `environment` root key of the configuration.
The variables will be evaluated in order.
Example:
```yaml
components:
...
environment:
- VARIABLE_NAME: value
- ANOTHER_VARIABLE_NAME: "Expand another variable: $VARIABLE_NAME"
```
It is also possible to prepend components to the PATH variable,
by specifying `add_to_path` in the root configuration or in a component:
```yaml
components:
my_component:
add_to_path: $ORCHESTRA_ROOT/opt/mycomponent/bin/
...
add_to_path:
- $ORCHESTRA_ROOT/bin
```
All `add_to_path` directives will be applied regardless of
where they are specified, even if the component is not installed.
# Binary archives
# Repository cloning
Document how the remote is picked, etc.
orchestra-v3-master/orchestra/ 0000775 0000000 0000000 00000000000 13762437205 0016672 5 ustar 00root root 0000000 0000000 orchestra-v3-master/orchestra/__init__.py 0000664 0000000 0000000 00000002614 13762437205 0021006 0 ustar 00root root 0000000 0000000 import argparse
import sys
from loguru import logger
from tqdm import tqdm
from orchestra.cmds import install_subcommands
from orchestra.model.configuration import Configuration
parser = argparse.ArgumentParser()
parser.add_argument("--loglevel", "-v", default="INFO", choices=["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"])
parser.add_argument("--pretend", "-n", action="store_true", help="Do not execute actions, only print what would be done")
parser.add_argument("--quiet", "-q", action="store_true", help="Do not show output of executed commands")
parser.add_argument("--no-config-cache", action="store_true", help="Do not cache generated yaml configuration")
parser.add_argument("--from-source", "-B", action="store_true", help="Build all components from source")
parser.add_argument("--fallback-build", "-b", action="store_true", help="Build if binary archives are not available")
subparsers = install_subcommands(parser)
class TqdmWrapper:
def write(self, message):
tqdm.write(message.strip())
sys.stdout.flush()
sys.stderr.flush()
def main():
args = parser.parse_args()
logger.remove(0)
logger.add(TqdmWrapper(), level=args.loglevel, colorize=True, format="[+] {level} - {message}")
cmd_parser = subparsers.choices.get(args.command_name)
if not cmd_parser:
parser.print_help()
exit(1)
return cmd_parser.handler(args)
orchestra-v3-master/orchestra/__main__.py 0000775 0000000 0000000 00000000122 13762437205 0020762 0 ustar 00root root 0000000 0000000 #!/usr/bin/env python3
from . import main
if __name__ == "__main__":
main()
orchestra-v3-master/orchestra/actions/ 0000775 0000000 0000000 00000000000 13762437205 0020332 5 ustar 00root root 0000000 0000000 orchestra-v3-master/orchestra/actions/__init__.py 0000664 0000000 0000000 00000000224 13762437205 0022441 0 ustar 00root root 0000000 0000000 from .clone import CloneAction
from .configure import ConfigureAction
from .install import InstallAction
from .install import InstallAnyBuildAction
orchestra-v3-master/orchestra/actions/action.py 0000664 0000000 0000000 00000007507 13762437205 0022172 0 ustar 00root root 0000000 0000000 import os.path
from collections import OrderedDict
from typing import Set
from loguru import logger
from .util import run_script
# Only used for type hints, package-relative import not possible due to circular reference
import orchestra.model.configuration
class Action:
def __init__(self, name, script, config):
self.name = name
self.config: "orchestra.model.configuration.Configuration" = config
self.external_dependencies: Set[Action] = set()
self._script = script
def run(self, args):
logger.info(f"Executing {self}")
self._run(args)
def _run(self, args):
"""Executes the action"""
run_script(self.script, quiet=args.quiet, environment=self.environment)
@property
def script(self):
"""Unless _run is overridden, should return the script to run"""
return self._script
@property
def dependencies(self):
return self.external_dependencies.union(self._implicit_dependencies())
def _implicit_dependencies(self):
return set()
def is_satisfied(self, recursively=False, already_checked=None):
if already_checked is None:
already_checked = set()
if not self._is_satisfied():
return False
elif not recursively:
return True
else:
already_checked.add(self)
for d in self.dependencies:
if d in already_checked:
continue
d_satisfied = d.is_satisfied(recursively=recursively, already_checked=already_checked)
if not d_satisfied:
return False
return True
def _is_satisfied(self):
"""Returns true if the action is satisfied, false if it needs to run."""
raise NotImplementedError()
def can_run(self):
"""Returns true if the action can be run (i.e. all its dependencies are satisfied)"""
return all(d.is_satisfied() for d in self.dependencies)
@property
def environment(self) -> OrderedDict:
"""Returns additional environment variables provided to the script to be run"""
return self.config.global_env()
@property
def _target_name(self):
raise NotImplementedException("Action subclasses must implement _target_name")
@property
def qualified_name(self):
return self._target_name + f"[{self.name}]"
@property
def name_for_info(self):
return f"{self.name} {self._target_name}"
@property
def name_for_graph(self):
return self.name_for_info
@property
def name_for_components(self):
return self._target_name
def __str__(self):
return f"Action {self.name} of {self.name_for_info}"
def __repr__(self):
return self.__str__()
class ActionForComponent(Action):
def __init__(self, name, component, script, config):
super().__init__(name, script, config)
self.component = component
@property
def environment(self) -> OrderedDict:
env = super().environment
env["SOURCE_DIR"] = os.path.join(self.config.sources_dir, self.component.name)
return env
@property
def _target_name(self):
return self.component.name
class ActionForBuild(ActionForComponent):
def __init__(self, name, build, script, config):
super().__init__(name, build.component, script, config)
self.build = build
@property
def environment(self) -> OrderedDict:
env = super().environment
env["BUILD_DIR"] = os.path.join(self.config.builds_dir,
self.build.component.name,
self.build.name)
env["TMP_ROOT"] = os.path.join(env["TMP_ROOTS"], self.build.safe_name)
return env
@property
def _target_name(self):
return self.build.qualified_name
orchestra-v3-master/orchestra/actions/clone.py 0000664 0000000 0000000 00000006754 13762437205 0022020 0 ustar 00root root 0000000 0000000 import json
import os.path
import re
from .action import ActionForComponent
from .util import run_script
class CloneAction(ActionForComponent):
def __init__(self, component, repository, config):
super().__init__("clone", component, None, config)
self.repository = repository
@property
def script(self):
clone_cmds = []
for remote_base_url in self.config.remotes.values():
clone_cmds.append(f'git clone "{remote_base_url}/{self.repository}" "$SOURCE_DIR"')
script = " || \\\n ".join(clone_cmds)
script += "\n"
script += 'git -C "$SOURCE_DIR" branch -m orchestra-temporary\n'
checkout_cmds = []
for branch in self.config.branches:
checkout_cmds.append(f'git -C "$SOURCE_DIR" checkout -b "{branch}" "origin/{branch}"')
checkout_cmds.append("true")
script += " || \\\n ".join(checkout_cmds)
return script
def _run(self, args):
"""Executes the action"""
run_script(self.script, quiet=True, environment=self.environment)
def _is_satisfied(self):
return os.path.exists(self.environment["SOURCE_DIR"])
def branches(self):
# First, check local checkout
if self.component.from_source:
source_dir = self.environment["SOURCE_DIR"]
if os.path.exists(source_dir):
return self._branches_from_remote(source_dir)
cache_filepath = os.path.join(self.config.orchestra_dotdir,
"remote_refs_cache.json")
# Check the cache
if os.path.exists(cache_filepath):
with open(cache_filepath, "rb") as f:
cached_data = json.loads(f.read())
if self.component.name in cached_data:
return cached_data[self.component.name]
# Check all the remotes
remotes = [f"{base_url}/{self.repository}"
for base_url
in self.config.remotes.values()]
for remote in remotes:
result = self._branches_from_remote(remote)
if result:
# We have a result, cache and return it
if os.path.exists(cache_filepath):
with open(cache_filepath, "rb") as f:
cached_data = json.loads(f.read())
else:
cached_data = {}
cached_data[self.component.name] = result
# TODO: prevent race condition, if two clone actions run at the same time
with open(cache_filepath, "w") as f:
json.dump(cached_data, f)
return result
return None
def branch(self):
branches = self.branches()
if branches:
for branch in self.config.branches:
if branch in branches:
return branch, branches[branch]
return None, None
def _branches_from_remote(self, remote):
env = dict(self.environment)
env["GIT_SSH_COMMAND"] = "ssh -oControlPath=~/.ssh/ssh-mux-%r@%h:%p -oControlMaster=auto -o ControlPersist=10"
result = run_script(
f'git ls-remote -h --refs "{remote}"',
quiet=True,
environment=env,
check_returncode=False
).stdout.decode("utf-8")
parse_regex = re.compile(r"(?P[a-f0-9]*)\W*refs/heads/(?P.*)")
return {branch: commit
for commit, branch
in parse_regex.findall(result)}
orchestra-v3-master/orchestra/actions/configure.py 0000664 0000000 0000000 00000002523 13762437205 0022667 0 ustar 00root root 0000000 0000000 import os
from loguru import logger
from .action import ActionForBuild
from .util import run_script
class ConfigureAction(ActionForBuild):
def __init__(self, build, script, config):
super().__init__("configure", build, script, config)
def _is_satisfied(self):
return os.path.exists(self._configure_successful_path)
def _run(self, args):
if os.path.exists(self._configure_successful_path):
logger.warning("This component was already successfully configured, rerunning configure script")
os.remove(self._configure_successful_path)
elif os.path.exists(self.environment["BUILD_DIR"]):
logger.warning("Previous configure probably failed, running configure script in a dirty environment")
logger.warning(f"You might want to delete the build directory (use `orchestra clean`)")
result = run_script(self.script, quiet=args.quiet, environment=self.environment)
if result.returncode == 0:
open(self._configure_successful_path, "w").close()
@property
def _configure_successful_path(self):
return os.path.join(self.environment["BUILD_DIR"], ".configure_successful")
def _implicit_dependencies(self):
if self.build.component.clone:
return {self.build.component.clone}
else:
return set()
orchestra-v3-master/orchestra/actions/install.py 0000664 0000000 0000000 00000051002 13762437205 0022350 0 ustar 00root root 0000000 0000000 import json
import os
import stat
import time
from collections import OrderedDict, defaultdict
from textwrap import dedent
from loguru import logger
from .action import ActionForBuild
from .util import run_script
from .. import git_lfs
from ..util import is_installed, get_installed_metadata, OrchestraException
class InstallAction(ActionForBuild):
def __init__(self, build, script, config, from_binary_archives=False, fallback_to_build=False, run_tests=False):
name = "install"
name += " from binary archives" if from_binary_archives else ""
name += " or build" if from_binary_archives and fallback_to_build else ""
super().__init__(name, build, script, config)
self.from_binary_archives = from_binary_archives
self.fallback_to_build = fallback_to_build
def _run(self, args):
tmp_root = self.environment["TMP_ROOT"]
orchestra_root = self.environment['ORCHESTRA_ROOT']
logger.info("Preparing temporary root directory")
self._prepare_tmproot()
pre_file_list = self._index_directory(tmp_root + orchestra_root, relative_to=tmp_root + orchestra_root)
start_time = time.time()
if self.from_binary_archives and self._binary_archive_exists():
logger.info("Fetching binary archive")
self._fetch_binary_archive()
logger.info("Extracting binary archive")
self._install_from_binary_archives()
source = "binary archive"
logger.info("Removing conflicting files")
self._remove_conflicting_files()
self.update_binary_archive_symlink()
elif not self.from_binary_archives or self.fallback_to_build:
self._install(args.quiet, args.test)
source = "build"
logger.info("Removing conflicting files")
self._remove_conflicting_files()
if self.build.component.skip_post_install:
logger.info("Skipping post install")
else:
self._post_install(args.quiet)
if args.create_binary_archives:
self._create_binary_archive()
self.update_binary_archive_symlink()
else:
raise OrchestraException("Binary archive not found!")
end_time = time.time()
post_file_list = self._index_directory(tmp_root + orchestra_root, relative_to=tmp_root + orchestra_root)
post_file_list.append(
os.path.relpath(self.config.installed_component_file_list_path(self.build.component.name), orchestra_root))
post_file_list.append(
os.path.relpath(self.config.installed_component_metadata_path(self.build.component.name), orchestra_root))
new_files = [f for f in post_file_list if f not in pre_file_list]
if not args.no_merge:
if is_installed(self.config, self.build.component.name):
logger.info("Uninstalling previously installed build")
uninstall(self.build.component.name, self.config)
logger.info("Merging installed files into orchestra root directory")
self._merge(args.quiet)
# Write file metadata and index
os.makedirs(self.config.installed_component_metadata_dir(), exist_ok=True)
metadata = {
"component_name": self.build.component.name,
"build_name": self.build.name,
"install_time": end_time - start_time,
"source": source,
"self_hash": self.build.self_hash,
"recursive_hash": self.build.recursive_hash,
"binary_archive_path": os.path.join(self.build.binary_archive_dir, self.build.binary_archive_filename),
}
with open(self.config.installed_component_metadata_path(self.build.component.name), "w") as f:
json.dump(metadata, f)
with open(self.config.installed_component_file_list_path(self.build.component.name), "w") as f:
new_files = [f"{f}\n" for f in new_files]
f.writelines(new_files)
if not args.keep_tmproot:
logger.info("Cleaning up tmproot")
self._cleanup_tmproot()
def _is_satisfied(self):
return is_installed(
self.config,
self.build.component.name,
wanted_build=self.build.name,
wanted_recursive_hash=self.build.recursive_hash
)
def _prepare_tmproot(self):
script = dedent("""
rm -rf "$TMP_ROOT"
mkdir -p "$TMP_ROOT"
mkdir -p "${TMP_ROOT}${ORCHESTRA_ROOT}/include"
mkdir -p "${TMP_ROOT}${ORCHESTRA_ROOT}/lib64"{,/include,/pkgconfig}
test -e "${TMP_ROOT}${ORCHESTRA_ROOT}/lib" || ln -s lib64 "${TMP_ROOT}${ORCHESTRA_ROOT}/lib"
test -L "${TMP_ROOT}${ORCHESTRA_ROOT}/lib"
mkdir -p "${TMP_ROOT}${ORCHESTRA_ROOT}/bin"
mkdir -p "${TMP_ROOT}${ORCHESTRA_ROOT}/usr/"{lib,include}
mkdir -p "${TMP_ROOT}${ORCHESTRA_ROOT}/share/"{info,doc,man,orchestra}
touch "${TMP_ROOT}${ORCHESTRA_ROOT}/share/info/dir"
mkdir -p "${TMP_ROOT}${ORCHESTRA_ROOT}/libexec"
""")
run_script(script, environment=self.environment)
def _cleanup_tmproot(self):
run_script('rm -rf "$TMP_ROOT"', environment=self.environment)
def _install(self, quiet, test):
logger.info("Executing install script")
env = dict(self.environment)
env["RUN_TESTS"] = "1" if (self.build.test and test) else "0"
run_script(self.script, quiet=quiet, environment=env)
def _post_install(self, quiet):
logger.info("Dropping absolute paths from pkg-config")
self._drop_absolute_pkgconfig_paths()
# TODO: maybe this should be put into the configuration and not in orchestra itself
logger.info("Converting hardlinks to symbolic")
self._hard_to_symbolic()
# TODO: maybe this should be put into the configuration and not in orchestra itself
logger.info("Fixing RPATHs")
self._fix_rpath()
# TODO: this should be put into the configuration and not in orchestra itself
logger.info("Replacing NDEBUG preprocessor statements")
self._replace_ndebug(self.build.ndebug)
if self.build.component.license:
logger.info("Copying license file")
source = self.build.component.license
destination = self.config.installed_component_license_path(self.build.component.name)
script = dedent(f"""
DESTINATION_DIR="$(dirname "{destination}")"
mkdir -p "$DESTINATION_DIR"
for DIR in "$BUILD_DIR" "$SOURCE_DIR"; do
if test -e "$DIR/{source}"; then
cp "$DIR/{source}" "$TMP_ROOT/{destination}"
exit 0
fi
done
echo "Couldn't find {source}"
exit 1
""")
run_script(script, environment=self.environment)
def _remove_conflicting_files(self):
script = dedent("""
if test -d "$TMP_ROOT/$ORCHESTRA_ROOT/share/info"; then rm -rf "$TMP_ROOT/$ORCHESTRA_ROOT/share/info"; fi
if test -d "$TMP_ROOT/$ORCHESTRA_ROOT/share/locale"; then rm -rf "$TMP_ROOT/$ORCHESTRA_ROOT/share/locale"; fi
""")
run_script(script, environment=self.environment)
def _drop_absolute_pkgconfig_paths(self):
script = dedent("""
cd "${TMP_ROOT}${ORCHESTRA_ROOT}"
if [ -e lib/pkgconfig ]; then
find lib/pkgconfig \\
-name "*.pc" \\
-exec sed -i 's|/*'"$ORCHESTRA_ROOT"'/*|${pcfiledir}/../..|g' {} ';'
fi
""")
run_script(script, environment=self.environment)
def _hard_to_symbolic(self):
duplicates = defaultdict(list)
for root, dirnames, filenames in os.walk(f'{self.environment["TMP_ROOT"]}{self.environment["ORCHESTRA_ROOT"]}'):
for path in filenames:
path = os.path.join(root, path)
info = os.lstat(path)
inode = info.st_ino
if inode == 0 or info.st_nlink < 2 or not stat.S_ISREG(info.st_mode):
continue
duplicates[inode].append(path)
for _, equivalent in duplicates.items():
base = equivalent.pop()
for alternative in equivalent:
os.unlink(alternative)
os.symlink(os.path.relpath(base, os.path.dirname(alternative)), alternative)
def _fix_rpath(self):
replace_dynstr = os.path.join(os.path.dirname(__file__), "..", "support", "elf-replace-dynstr.py")
fix_rpath_script = dedent(f"""
cd "$TMP_ROOT$ORCHESTRA_ROOT"
# Fix rpath
find . -type f -executable | while read EXECUTABLE; do
if head -c 4 "$EXECUTABLE" | grep '^.ELF' > /dev/null &&
file "$EXECUTABLE" | grep x86-64 | grep -E '(shared|dynamic)' > /dev/null;
then
REPLACE='$'ORIGIN/$(realpath --relative-to="$(dirname "$EXECUTABLE")" ".")
echo "Setting rpath of $EXECUTABLE to $REPLACE"
"{replace_dynstr}" "$EXECUTABLE" "$RPATH_PLACEHOLDER" "$REPLACE" /
"{replace_dynstr}" "$EXECUTABLE" "$ORCHESTRA_ROOT" "$REPLACE" /
fi
done
""")
run_script(fix_rpath_script, environment=self.environment)
def _replace_ndebug(self, disable_debugging):
debug, ndebug = ("0", "1") if disable_debugging else ("1", "0")
patch_ndebug_script = dedent(rf"""
cd "$TMP_ROOT$ORCHESTRA_ROOT"
find include/ -name "*.h" \
-exec \
sed -i \
-e 's|^\s*#\s*ifndef\s\+NDEBUG|#if {debug}|' \
-e 's|^\s*#\s*ifdef\s\+NDEBUG|#if {ndebug}|' \
-e 's|^\(\s*#\s*if\s\+.*\)!defined(NDEBUG)|\1{debug}|' \
-e 's|^\(\s*#\s*if\s\+.*\)defined(NDEBUG)|\1{ndebug}|' \
{{}} ';'
""")
run_script(patch_ndebug_script, environment=self.environment)
def _merge(self, quiet):
copy_command = f'cp -farl "$TMP_ROOT/$ORCHESTRA_ROOT/." "$ORCHESTRA_ROOT"'
run_script(copy_command, quiet=quiet, environment=self.environment)
def _binary_archive_repo_name(self):
# Select the binary archives repository to employ
if self.component.binary_archives:
binary_archive_repo_name = self.component.binary_archives
if binary_archive_repo_name not in self.config.binary_archives_remotes.keys():
raise Exception(f"The {self.component.name} component wants to push to an unknown binary-archives repository ({binary_archive_repo_name})")
return binary_archive_repo_name
else:
return list(self.config.binary_archives_remotes.keys())[0]
def _binary_archive_path(self):
archive_dirname = self.build.binary_archive_dir
archive_name = self.build.binary_archive_filename
binary_archive_repo_name = self._binary_archive_repo_name()
return f"$BINARY_ARCHIVES/{binary_archive_repo_name}/linux-x86-64/{archive_dirname}/{archive_name}"
def _create_binary_archive(self):
logger.info("Creating binary archive")
archive_name = self.build.binary_archive_filename
binary_archive_path = self._binary_archive_path()
binary_archive_repo_name = self._binary_archive_repo_name()
binary_archive_tmp_path = f"$BINARY_ARCHIVES/{binary_archive_repo_name}/_tmp_{archive_name}"
binary_archive_containing_dir = os.path.dirname(binary_archive_path)
script = dedent(f"""
mkdir -p "$BINARY_ARCHIVES"
cd "$TMP_ROOT$ORCHESTRA_ROOT"
rm -f "{binary_archive_tmp_path}"
tar cvaf "{binary_archive_tmp_path}" --owner=0 --group=0 *
mkdir -p "{binary_archive_containing_dir}"
mv "{binary_archive_tmp_path}" "{binary_archive_path}"
""")
run_script(script, environment=self.environment)
def update_binary_archive_symlink(self):
logger.info("Updating binary archive symlink")
binary_archive_repo_name = self._binary_archive_repo_name()
archive_dirname = self.build.binary_archive_dir
orchestra_config_branch = run_script(
'git -C "$ORCHESTRA_DOTDIR" rev-parse --abbrev-ref HEAD',
quiet=True,
environment=self.environment
).stdout.decode("utf-8").strip().replace("/", "-")
archive_path = os.path.join(self.environment["BINARY_ARCHIVES"],
binary_archive_repo_name,
"linux-x86-64",
archive_dirname)
def create_symlink(branch, commit):
branch = branch.replace("/", "-")
target_name = f"{commit}_{self.build.recursive_hash}.tar.gz"
target = os.path.join(archive_path, target_name)
if os.path.exists(target):
symlink = os.path.join(archive_path, f"{branch}_{orchestra_config_branch}.tar.gz")
if os.path.exists(symlink):
os.unlink(symlink)
os.symlink(target_name, symlink)
if self.component.clone:
for branch, commit in self.component.clone.branches().items():
create_symlink(branch, commit)
else:
create_symlink("none", "none")
def _binary_archive_filepath(self):
archives_dir = self.environment["BINARY_ARCHIVES"]
for name in self.config.binary_archives_remotes:
archive_dir = self.build.binary_archive_dir
archive_name = self.build.binary_archive_filename
try_archive_path = os.path.join(archives_dir, name, "linux-x86-64", archive_dir, archive_name)
if os.path.exists(try_archive_path):
return try_archive_path
return None
def _binary_archive_exists(self):
return self._binary_archive_filepath() is not None
def _fetch_binary_archive(self):
# TODO: better edge-case handling, when the binary archive exists but is not committed into the
# binary archives git-lfs repo (e.g. it has been locally created by the user)
binary_archive_path = self._binary_archive_filepath()
binary_archive_repo_dir = os.path.dirname(binary_archive_path)
while binary_archive_repo_dir != "/":
if ".git" in os.listdir(binary_archive_repo_dir):
break
binary_archive_repo_dir = os.path.dirname(binary_archive_repo_dir)
if binary_archive_repo_dir == "/":
logger.error("Binary archives are not a git repository!")
exit(1)
git_lfs.fetch(binary_archive_repo_dir, only=[os.path.realpath(binary_archive_path)])
def _install_from_binary_archives(self):
if not self._binary_archive_exists():
raise OrchestraException("Binary archive not found!")
archive_filepath = self._binary_archive_filepath()
script = dedent(f"""
mkdir -p "$TMP_ROOT$ORCHESTRA_ROOT"
cd "$TMP_ROOT$ORCHESTRA_ROOT"
tar xaf "{archive_filepath}"
""")
run_script(script, environment=self.environment)
@staticmethod
def _index_directory(root_dir_path, relative_to=None):
paths = []
for current_dir_path, child_dir_names, child_file_names in os.walk(root_dir_path):
for child_filename in child_file_names:
child_file_path = os.path.join(current_dir_path, child_filename)
if relative_to:
child_file_path = os.path.relpath(child_file_path, relative_to)
paths.append(child_file_path)
for child_dir in child_dir_names:
child_dir_path = os.path.join(current_dir_path, child_dir)
if os.path.islink(child_dir_path):
if relative_to:
child_dir_path = os.path.relpath(child_dir_path, relative_to)
paths.append(child_dir_path)
return paths
@property
def environment(self) -> OrderedDict:
env = super().environment
env["DESTDIR"] = env["TMP_ROOT"]
return env
def _implicit_dependencies(self):
if self.from_binary_archives and (self._binary_archive_exists() or not self.fallback_to_build):
return set()
else:
return {self.build.configure}
class InstallAnyBuildAction(ActionForBuild):
def __init__(self, build, config):
installed_metadata = get_installed_metadata(build.component.name, config)
if not installed_metadata:
# The component is not installed, use default build
chosen_build = build
else:
# The component is installed, check that the recursive hash is still the same
installed_build_name = installed_metadata["build_name"]
installed_build_hash = installed_metadata["recursive_hash"]
installed_build = build.component.builds.get(installed_build_name)
if not installed_build or installed_build.recursive_hash != installed_build_hash:
# The installed build disappeared from the config
# or the hash changed -- fallback to default
chosen_build = build
else:
chosen_build = installed_build
super().__init__("install any", chosen_build, None, config)
self._original_build = build
self._has_run = False
def _implicit_dependencies(self):
return {self.build.install}
def _run(self, args):
self._has_run = True
def is_satisfied(self, recursively=False, already_checked=None):
return any(
build.install.is_satisfied(recursively=recursively, already_checked=already_checked)
for build in self.build.component.builds.values()
) and self._has_run
def _is_satisfied(self):
raise NotImplementedError("This method should not be called!")
@property
def name_for_info(self):
if self.build == self._original_build:
return f"install {self.build.component.name} (prefer {self._original_build.name})"
else:
return f"install {self.build.component.name} (prefer {self._original_build.name}, chosen {self.build.name})"
@property
def name_for_graph(self):
if self.build == self._original_build:
return f"install {self.build.component.name} (prefer {self._original_build.name})"
else:
return f"install {self.build.component.name} (prefer {self._original_build.name}, chosen {self.build.name})"
@property
def name_for_components(self):
return f"{self._original_build.component.name}~{self._original_build.name}"
def uninstall(component_name, config):
index_path = config.installed_component_file_list_path(component_name)
metadata_path = config.installed_component_metadata_path(component_name)
# Index and metadata files should be removed last,
# so an interrupted uninstall can be resumed
postpone_removal_paths = [
os.path.relpath(index_path, config.orchestra_root),
os.path.relpath(metadata_path, config.orchestra_root)
]
with open(index_path) as f:
paths = f.readlines()
# Ensure depth first visit by reverse-sorting
# paths.sort(reverse=True)
paths = [path.strip() for path in paths]
for path in paths:
# Ensure the path is relative to the root
path = path.lstrip("/")
if path in postpone_removal_paths:
continue
path_to_delete = os.path.join(config.global_env()['ORCHESTRA_ROOT'], path)
if os.path.isfile(path_to_delete) or os.path.islink(path_to_delete):
logger.debug(f"Deleting {path_to_delete}")
os.remove(path_to_delete)
elif os.path.isdir(path_to_delete):
if os.listdir(path_to_delete):
logger.debug(f"Not removing directory {path_to_delete} as it is not empty")
else:
logger.debug(f"Deleting directory {path_to_delete}")
os.rmdir(path_to_delete)
containing_directory = os.path.dirname(path_to_delete)
if os.path.exists(containing_directory) and len(os.listdir(containing_directory)) == 0:
logger.debug(f"Removing empty directory {containing_directory}")
os.rmdir(containing_directory)
logger.debug(f"Deleting index file {index_path}")
os.remove(index_path)
logger.debug(f"Deleting metadata file {metadata_path}")
os.remove(metadata_path)
orchestra-v3-master/orchestra/actions/util.py 0000664 0000000 0000000 00000003450 13762437205 0021663 0 ustar 00root root 0000000 0000000 import subprocess
from collections import OrderedDict
from loguru import logger
from ..util import export_environment, OrchestraException
bash_prelude = """
set -o errexit
set -o nounset
set -o pipefail
"""
def run_script(script,
quiet=False,
environment: OrderedDict = None,
strict_flags=True,
check_returncode=True,
):
"""Helper for running shell scripts.
:param script: the script to run
:param quiet: if True the output of the command is not shown to the user,
but instead captured and accessible from the `stdout` and `stderr` properties of the returned value.
:param environment: will be exported at the beginning of the script
:param strict_flags: if True, a prelude is prepended to the script to help catch errors
:param check_returncode: if True an exception is raised unless the script returns 0
:return: a subprocess.CompletedProcess instance
"""
if strict_flags:
script_to_run = bash_prelude
else:
script_to_run = ""
if environment:
script_to_run += export_environment(environment)
script_to_run += script
if quiet:
stdout = subprocess.PIPE
stderr = subprocess.PIPE
else:
logger.info(f"The following script is going to be executed:\n" + script.strip())
logger.info(f"Script output:")
stdout = None
stderr = None
result = subprocess.run(["/bin/bash", "-c", script_to_run], stdout=stdout, stderr=stderr)
if check_returncode and result.returncode != 0:
raise OrchestraException(f"Script failed with return code {result.returncode}")
return result
def try_decode(stream, encoding="utf-8"):
try:
return stream.decode(encoding)
except Exception as e:
return stream
orchestra-v3-master/orchestra/cmds/ 0000775 0000000 0000000 00000000000 13762437205 0017620 5 ustar 00root root 0000000 0000000 orchestra-v3-master/orchestra/cmds/__init__.py 0000664 0000000 0000000 00000002615 13762437205 0021735 0 ustar 00root root 0000000 0000000 import argparse
from . import clean
from . import clone
from . import components
from . import configure
from . import dumpconfig
from . import environment
from . import graph
from . import install
from . import shell
from . import uninstall
from . import update
from . import ls
from . import fix_binary_archives_symlinks
class CustomArgumentParser(argparse.ArgumentParser):
def __init__(self, handler=None, *args, **kwargs):
super().__init__(*args, **kwargs)
if not handler:
raise ValueError("Please provide a command handler")
self.handler = handler
def install_subcommands(argparser):
subparsers = argparser.add_subparsers(
description="Available subcommands. Use --help",
dest="command_name",
parser_class=CustomArgumentParser)
components.install_subcommand(subparsers)
dumpconfig.install_subcommand(subparsers)
environment.install_subcommand(subparsers)
clone.install_subcommand(subparsers)
update.install_subcommand(subparsers)
configure.install_subcommand(subparsers)
install.install_subcommand(subparsers)
clean.install_subcommand(subparsers)
uninstall.install_subcommand(subparsers)
graph.install_subcommand(subparsers)
shell.install_subcommand(subparsers)
ls.install_subcommand(subparsers)
fix_binary_archives_symlinks.install_subcommand(subparsers)
return subparsers
orchestra-v3-master/orchestra/cmds/clean.py 0000664 0000000 0000000 00000002264 13762437205 0021260 0 ustar 00root root 0000000 0000000 import shutil
from loguru import logger
from ..model.configuration import Configuration
def install_subcommand(sub_argparser):
cmd_parser = sub_argparser.add_parser("clean", handler=handle_clean, help="Remove build/source directories")
cmd_parser.add_argument("component")
cmd_parser.add_argument("--include-sources", "-s", action="store_true", help="Also delete source dir")
def handle_clean(args):
config = Configuration(args)
build = config.get_build(args.component)
if not build:
suggested_component_name = config.get_suggested_component_name(args.component)
logger.error(f"Component {args.component} not found! Did you mean {suggested_component_name}?")
exit(1)
build_dir = build.install.environment["BUILD_DIR"]
logger.info(f"Cleaning build dir for {build.qualified_name} ({build_dir})")
if not args.pretend:
shutil.rmtree(build_dir, ignore_errors=True)
if args.include_sources:
sources_dir = build.install.environment["SOURCE_DIR"]
logger.info(f"Cleaning source dir for {build.qualified_name} ({sources_dir})")
if not args.pretend:
shutil.rmtree(sources_dir, ignore_errors=True)
orchestra-v3-master/orchestra/cmds/clone.py 0000664 0000000 0000000 00000001674 13762437205 0021302 0 ustar 00root root 0000000 0000000 from loguru import logger
from ..executor import Executor
from ..model.configuration import Configuration
def install_subcommand(sub_argparser):
cmd_parser = sub_argparser.add_parser("clone", handler=handle_clone, help="Clone a component")
cmd_parser.add_argument("component")
cmd_parser.add_argument("--no-force", action="store_true", help="Don't force execution of the root action")
def handle_clone(args):
config = Configuration(args)
build = config.get_build(args.component)
if not build:
suggested_component_name = config.get_suggested_component_name(args.component)
logger.error(f"Component {args.component} not found! Did you mean {suggested_component_name}?")
exit(1)
if not build.component.clone:
print("This component does not have a git repository configured!")
return
executor = Executor(args)
return executor.run(build.component.clone, no_force=args.no_force)
orchestra-v3-master/orchestra/cmds/components.py 0000664 0000000 0000000 00000007253 13762437205 0022366 0 ustar 00root root 0000000 0000000 from loguru import logger
from urllib.parse import urlparse
from ..model.configuration import Configuration
from ..util import get_installed_build
def normalize_repository_url(url):
# Drop credentials
if url.startswith("https://") or url.startswith("http://"):
url = urlparse(url)
url = url._replace(netloc=url.hostname).geturl()
# Add .git suffix
if not url.endswith(".git"):
url = url + ".git"
return url
def install_subcommand(sub_argparser):
cmd_parser = sub_argparser.add_parser("components", handler=handle_components, help="List components")
cmd_parser.add_argument("component", nargs="?")
cmd_parser.add_argument("--installed", action="store_true", help="Only print installed components")
cmd_parser.add_argument("--not-installed", action="store_true", help="Only print not installed components")
cmd_parser.add_argument("--deps", action="store_true", help="Print dependencies")
cmd_parser.add_argument("--hashes", action="store_true", help="Show hashes")
cmd_parser.add_argument("--repository-url", help="Show components from this repository URL")
def handle_components(args):
config = Configuration(args)
if args.component:
build = config.get_build(args.component)
if not build:
suggested_component_name = config.get_suggested_component_name(args.component)
logger.error(f"Component {args.component} not found! Did you mean {suggested_component_name}?")
exit(1)
components = {build.component.name: build.component}
else:
components = config.components
repository_filter = None
if args.repository_url:
repository_filter = normalize_repository_url(args.repository_url)
for component_name, component in components.items():
# Filter by repository URL
if repository_filter:
if not component.clone:
continue
repository = component.clone.repository
if not any(remote_base_url
for remote_base_url
in config.remotes.values()
if normalize_repository_url(f"{remote_base_url}/{repository}") == repository_filter):
continue
installed_build = get_installed_build(component_name, config)
if args.installed and installed_build \
or args.not_installed and installed_build is None \
or not args.installed and not args.not_installed:
print(f"Component {component_name}")
for build_name, build in component.builds.items():
infos = []
if installed_build == build_name:
infos.append("installed")
if build is component.default_build:
infos.append("default")
if build.configure and args.deps:
dependencies = [dep for dep in build.configure.dependencies]
if dependencies:
infos.append(f"config deps: {' '.join(d.name_for_components for d in dependencies)}")
if build.install and args.deps:
dependencies = [dep for dep in build.install.dependencies if dep.build is not build]
if dependencies:
infos.append(f"install deps: {' '.join(d.name_for_components for d in dependencies)}")
if args.hashes:
infos.append(f"hash: {build.self_hash}")
infos.append(f"recursive hash: {build.recursive_hash}")
infos_s = " ".join(f"[{i}]" for i in infos)
s = f" Build {build_name} {infos_s}"
print(s)
print()
orchestra-v3-master/orchestra/cmds/configure.py 0000664 0000000 0000000 00000001707 13762437205 0022160 0 ustar 00root root 0000000 0000000 from loguru import logger
from ..executor import Executor
from ..model.configuration import Configuration
def install_subcommand(sub_argparser):
cmd_parser = sub_argparser.add_parser("configure", handler=handle_configure, help="Run configure script")
cmd_parser.add_argument("component")
cmd_parser.add_argument("--no-force", action="store_true", help="Don't force execution of the root action")
cmd_parser.add_argument("--no-deps", action="store_true", help="Only execute the requested action")
def handle_configure(args):
config = Configuration(args)
build = config.get_build(args.component)
if not build:
suggested_component_name = config.get_suggested_component_name(args.component)
logger.error(f"Component {args.component} not found! Did you mean {suggested_component_name}?")
exit(1)
executor = Executor(args)
return executor.run(build.configure, no_force=args.no_force, no_deps=args.no_deps)
orchestra-v3-master/orchestra/cmds/dumpconfig.py 0000664 0000000 0000000 00000000646 13762437205 0022333 0 ustar 00root root 0000000 0000000 from loguru import logger
from ..model.configuration import Configuration
from ..util import parse_component_name, is_installed
from ..actions.install import uninstall
def install_subcommand(sub_argparser):
cmd_parser = sub_argparser.add_parser("dumpconfig", handler=handle_dumpconfig, help="Dump yaml configuration")
def handle_dumpconfig(args):
config = Configuration(args)
print(config.generated_yaml)
orchestra-v3-master/orchestra/cmds/environment.py 0000664 0000000 0000000 00000001505 13762437205 0022537 0 ustar 00root root 0000000 0000000 from loguru import logger
from ..model.configuration import Configuration
from ..util import export_environment
def install_subcommand(sub_argparser):
cmd_parser = sub_argparser.add_parser("environment", handler=handle_environment, help="Print environment variables")
cmd_parser.add_argument("component", nargs="?")
def handle_environment(args):
config = Configuration(args)
if not args.component:
print(export_environment(config.global_env()))
else:
build = config.get_build(args.component)
if not build:
suggested_component_name = config.get_suggested_component_name(args.component)
logger.error(f"Component {args.component} not found! Did you mean {suggested_component_name}?")
exit(1)
print(export_environment(build.install.environment))
orchestra-v3-master/orchestra/cmds/fix_binary_archives_symlinks.py 0000664 0000000 0000000 00000001143 13762437205 0026140 0 ustar 00root root 0000000 0000000 import os
from loguru import logger
from ..model.configuration import Configuration
def install_subcommand(sub_argparser):
cmd_parser = sub_argparser.add_parser("fix-binary-archives-symlinks",
handler=handle_fix_binary_archives_symlinks,
help="Fix symlinks in binary archives")
def handle_fix_binary_archives_symlinks(args):
config = Configuration(args)
for _, component in config.components.items():
for _, build in component.builds.items():
build.install.update_binary_archive_symlink()
orchestra-v3-master/orchestra/cmds/graph.py 0000664 0000000 0000000 00000004362 13762437205 0021300 0 ustar 00root root 0000000 0000000 from loguru import logger
from ..model.configuration import Configuration
def install_subcommand(sub_argparser):
cmd_parser = sub_argparser.add_parser("graph", handler=handle_graph, help="Print dependency graph (dot format)")
cmd_parser.add_argument("component", nargs="?")
cmd_parser.add_argument("--all-builds", action="store_true",
help="Include all builds instead of only the default one.")
def handle_graph(args):
config = Configuration(args)
if args.component:
build = config.get_build(args.component)
if not build:
suggested_component_name = config.get_suggested_component_name(args.component)
logger.error(f"Component {args.component} not found! Did you mean {suggested_component_name}?")
exit(1)
actions = [build.install]
else:
actions = set()
for component in config.components.values():
if args.all_builds:
for build in component.builds.values():
actions.add(build.install)
else:
actions.add(component.default_build.install)
print("digraph dependency_graph {")
print(" splines=ortho")
print_dependencies(actions)
print("}")
def print_dependencies(actions):
# TODO: this code needs to deduplicate rows and handle potential dependency cycles
# It is ugly and should be improved
def _print_dependencies(action, already_visited_actions, rows):
if action in already_visited_actions:
return
if action.is_satisfied(recursively=True):
color = "green"
elif action.can_run():
color = "orange"
else:
color = "red"
rows.add(f' "{action.name_for_graph}"[ shape=box, style=filled, color={color} ];')
for d in action.dependencies:
rows.add(f' "{d.name_for_graph}" -> "{action.name_for_graph}";')
already_visited_actions.add(action)
for d in action.dependencies:
_print_dependencies(d, already_visited_actions, rows)
already_visited_actions = set()
rows = set()
for action in actions:
_print_dependencies(action, already_visited_actions, rows)
for r in sorted(rows):
print(r)
orchestra-v3-master/orchestra/cmds/install.py 0000664 0000000 0000000 00000002554 13762437205 0021646 0 ustar 00root root 0000000 0000000 from loguru import logger
from ..executor import Executor
from ..model.configuration import Configuration
def install_subcommand(sub_argparser):
cmd_parser = sub_argparser.add_parser("install", handler=handle_install, help="Build and install a component")
cmd_parser.add_argument("component")
cmd_parser.add_argument("--no-deps", action="store_true", help="Only execute the requested action")
cmd_parser.add_argument("--no-force", action="store_true", help="Don't force execution of the root action")
cmd_parser.add_argument("--no-merge", action="store_true", help="Do not merge files into orchestra root")
cmd_parser.add_argument("--create-binary-archives", action="store_true", help="Create binary archives")
cmd_parser.add_argument("--keep-tmproot", action="store_true", help="Do not remove temporary root directories")
cmd_parser.add_argument("--test", action="store_true", help="Run the test suite")
def handle_install(args):
config = Configuration(args)
build = config.get_build(args.component)
if not build:
suggested_component_name = config.get_suggested_component_name(args.component)
logger.error(f"Component {args.component} not found! Did you mean {suggested_component_name}?")
exit(1)
executor = Executor(args)
return executor.run(build.install, no_force=args.no_force, no_deps=args.no_deps)
orchestra-v3-master/orchestra/cmds/ls.py 0000664 0000000 0000000 00000002421 13762437205 0020607 0 ustar 00root root 0000000 0000000 import os
from loguru import logger
from ..model.configuration import Configuration
def install_subcommand(sub_argparser):
cmd_parser = sub_argparser.add_parser("ls",
handler=handle_ls,
help="List orchestra-related directories")
cmd_parser.add_argument("--git-sources", action="store_true", help="Print directories containing git repositories")
cmd_parser.add_argument("--binary-archives", action="store_true", help="Print binary archives directories")
def handle_ls(args):
config = Configuration(args)
if args.git_sources + args.binary_archives != 1:
logger.error("Please specify one and one flag only")
exit(1)
if args.git_sources:
for component in config.components.values():
if not component.clone:
continue
source_path = os.path.join(config.sources_dir, component.name)
if not os.path.exists(source_path):
continue
print(source_path)
elif args.binary_archives:
for name in config.binary_archives_remotes.keys():
path = os.path.join(config.binary_archives_dir, name)
if os.path.exists(path):
print(path)
orchestra-v3-master/orchestra/cmds/shell.py 0000664 0000000 0000000 00000003252 13762437205 0021303 0 ustar 00root root 0000000 0000000 import os
import pty
import select
import sys
import termios
import tty
from subprocess import Popen
from textwrap import dedent
from loguru import logger
from ..actions.util import run_script
from ..model.configuration import Configuration
from ..util import export_environment
def install_subcommand(sub_argparser):
cmd_parser = sub_argparser.add_parser("shell", handler=handle_shell,
help="Open a shell with the given component environment (experimental)")
cmd_parser.add_argument("component", nargs="?")
def handle_shell(args):
config = Configuration(args)
if not args.component:
env = config.global_env()
ps1_prefix = "(orchestra) "
cd_to = os.getcwd()
else:
build = config.get_build(args.component)
if not build:
suggested_component_name = config.get_suggested_component_name(args.component)
logger.error(f"Component {args.component} not found! Did you mean {suggested_component_name}?")
exit(1)
env = build.install.environment
ps1_prefix = f"(orchestra - {build.qualified_name}) "
if os.path.exists(build.install.environment["BUILD_DIR"]):
cd_to = build.install.environment["BUILD_DIR"]
else:
cd_to = os.getcwd()
user_shell = run_script("getent passwd $(whoami) | cut -d: -f7", quiet=True).stdout.decode("utf-8").strip()
env["OLD_HOME"] = os.environ["HOME"]
env["HOME"] = os.path.join(os.path.dirname(__file__), "..", "support", "shell-home")
env["PS1_PREFIX"] = ps1_prefix
script = dedent(f"""
cd {cd_to}
{user_shell}
""")
run_script(script, environment=env)
orchestra-v3-master/orchestra/cmds/uninstall.py 0000664 0000000 0000000 00000001250 13762437205 0022201 0 ustar 00root root 0000000 0000000 from loguru import logger
from ..model.configuration import Configuration
from ..util import parse_component_name, is_installed
from ..actions.install import uninstall
def install_subcommand(sub_argparser):
cmd_parser = sub_argparser.add_parser("uninstall", handler=handle_uninstall, help="Uninstall a component")
cmd_parser.add_argument("component")
def handle_uninstall(args):
config = Configuration(args)
component_name, build_name = parse_component_name(args.component)
if not is_installed(config, component_name, build_name):
logger.error(f"Component {args.component} is not installed")
exit(1)
uninstall(component_name, config)
orchestra-v3-master/orchestra/cmds/update.py 0000664 0000000 0000000 00000010504 13762437205 0021454 0 ustar 00root root 0000000 0000000 import os.path
import subprocess
from glob import glob
from textwrap import dedent
from loguru import logger
from tqdm import tqdm
from ..model.configuration import Configuration
def install_subcommand(sub_argparser):
cmd_parser = sub_argparser.add_parser("update", handler=handle_update, help="Update components")
cmd_parser.add_argument("--no-config", action="store_true", help="Don't pull orchestra config")
def handle_update(args):
config = Configuration(args)
failed_pulls = []
if not args.no_config:
logger.info("Updating orchestra configuration")
result = git_pull(config.orchestra_dotdir)
if result.returncode:
failed_pulls.append(f"orchestra configuration ({config.orchestra_dotdir})")
logger.info("Updating binary archives")
os.makedirs(config.binary_archives_dir, exist_ok=True)
for name, url in config.binary_archives_remotes.items():
binary_archive_path = os.path.join(config.binary_archives_dir, name)
if os.path.exists(binary_archive_path):
result = pull_binary_archive(name, config)
if result.returncode:
failed_pulls.append(f"Binary archive {name} ({os.path.join(config.binary_archives_dir, name)})")
else:
clone_binary_archive(name, url, config)
logger.info("Resetting ls-remote cached info")
ls_remote_cache = os.path.join(config.orchestra_dotdir, "remote_refs_cache.json")
if os.path.exists(ls_remote_cache):
os.remove(ls_remote_cache)
logger.info("Updating ls-remote cached info")
clonable_components = [component
for _, component
in config.components.items()
if component.clone]
for component in tqdm(clonable_components, unit="components"):
logger.info(f"Fetching the latest remote commit for {component.name}")
_, _ = component.clone.branch()
to_pull = []
for _, component in config.components.items():
if not component.clone:
continue
source_path = os.path.join(config.sources_dir, component.name)
if not os.path.exists(source_path):
continue
to_pull.append(component)
if to_pull:
logger.info("Updating repositories")
for component in tqdm(to_pull, unit="components"):
source_path = os.path.join(config.sources_dir, component.name)
assert os.path.exists(os.path.join(source_path, ".git"))
logger.info(f"Pulling {component.name}")
result = git_pull(source_path)
if result.returncode:
failed_pulls.append(f"Repository {component.name}")
if failed_pulls:
formatted_failed_pulls = "\n".join([f" {repo}" for repo in failed_pulls])
failed_git_pull_suggestion = dedent(f"""
Could not git pull --ff-only the following repositories:
{formatted_failed_pulls}
Suggestions:
- check your network connection
- commit your work
- `git pull --rebase`, to pull remote changes and apply your commits on top
- `git push` your changes to the remotes
""")
logger.error(failed_git_pull_suggestion)
def pull_binary_archive(name, config):
binary_archive_path = os.path.join(config.binary_archives_dir, name)
logger.info(f"Pulling binary archive {name}")
result = git_pull(binary_archive_path)
return result
def clone_binary_archive(name, url, config):
logger.info(f"Trying to clone binary archive from remote {name} ({url})")
binary_archive_path = os.path.join(config.binary_archives_dir, name)
env = dict(os.environ)
env["GIT_SSH_COMMAND"] = "ssh -oControlPath=~/.ssh/ssh-mux-%r@%h:%p -oControlMaster=auto -o ControlPersist=10"
env["GIT_LFS_SKIP_SMUDGE"] = "1"
result = subprocess.run(["git", "clone", url, binary_archive_path], env=env)
if result.returncode:
logger.info(f"Could not clone binary archive from remote {name}!")
def git_pull(directory):
env = os.environ
env["GIT_LFS_SKIP_SMUDGE"] = "1"
result = subprocess.run(["git", "-C", directory, "pull", "--ff-only"],
env=env,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT)
logger.info(result.stdout.decode("utf8").strip())
return result
orchestra-v3-master/orchestra/executor.py 0000664 0000000 0000000 00000010114 13762437205 0021077 0 ustar 00root root 0000000 0000000 from concurrent import futures
from typing import List, Dict
import enlighten
from loguru import logger
from .actions.action import Action
from .util import set_terminal_title, OrchestraException
class Executor:
def __init__(self, args, threads=1):
self.args = args
self.threads = 1
self._pending_actions: List[Action] = []
self._running_actions: Dict[futures.Future, Action] = {}
self._failed_actions: List[Action] = []
self._pool = futures.ThreadPoolExecutor(max_workers=threads, thread_name_prefix="Builder")
def run(self, action, no_force=False, no_deps=False):
self._collect_actions(action, force=not no_force, no_deps=no_deps)
self._pending_actions.sort(key=lambda a: a.qualified_name)
if not self._pending_actions:
logger.info("No actions to perform")
total_pending = len(self._pending_actions)
for _ in range(self.threads):
self._schedule_next()
manager = enlighten.get_manager()
status_bar = manager.status_bar()
status_bar.color = "bright_white_on_lightslategray"
while self._running_actions:
running_jobs_str = ", ".join(a.name_for_info for a in self._running_actions.values())
status_bar_args = {
"jobs": running_jobs_str,
"current": total_pending - len(self._pending_actions),
"total": total_pending,
}
set_terminal_title(f"Running {running_jobs_str}")
status_bar.status_format = "[{current}/{total}] Running {jobs}"
status_bar.update(**status_bar_args)
status_bar.refresh()
done, not_done = futures.wait(self._running_actions, return_when=futures.FIRST_COMPLETED)
for d in done:
action = self._running_actions[d]
del self._running_actions[d]
exception = d.exception()
if exception:
if isinstance(exception, OrchestraException):
logger.error(str(exception))
if self._pending_actions:
logger.error(f"Waiting for other running actions to terminate: {self._pending_actions}")
self._pending_actions = []
self._failed_actions.append(action)
else:
raise exception
else:
self._schedule_next()
if self._failed_actions:
msg = "Failed: " + ", ".join(a.name_for_info for a in self._failed_actions)
status_bar.color = "white_on_red"
result = 1
else:
msg = "All done!"
status_bar.color = "white_on_darkgreen"
result = 0
status_bar.status_format = msg
status_bar.close()
return result
def _collect_actions(self, action: Action, force=False, no_deps=False):
if not force and action.is_satisfied(recursively=True):
return
if action not in self._pending_actions:
self._pending_actions.append(action)
if no_deps:
return
for dep in action.dependencies:
self._collect_actions(dep)
def _schedule_next(self):
next_runnable_action = self._get_next_runnable_action()
if not next_runnable_action:
if self._pending_actions:
logger.error(f"Could not run any action! An action has failed or there is a circular dependency")
self._failed_actions = list(self._pending_actions)
return
future = self._pool.submit(self._run_action, next_runnable_action)
self._running_actions[future] = next_runnable_action
return future
def _get_next_runnable_action(self):
for action in self._pending_actions:
if all([d.is_satisfied(recursively=True) for d in action.dependencies]):
self._pending_actions.remove(action)
return action
def _run_action(self, action: Action):
return action.run(args=self.args)
orchestra-v3-master/orchestra/git_lfs/ 0000775 0000000 0000000 00000000000 13762437205 0020321 5 ustar 00root root 0000000 0000000 orchestra-v3-master/orchestra/git_lfs/__init__.py 0000664 0000000 0000000 00000023032 13762437205 0022432 0 ustar 00root root 0000000 0000000 from __future__ import division, print_function, unicode_literals
import base64
import json
from loguru import logger
import os
import pprint
from subprocess import CalledProcessError, check_output, PIPE, Popen, STDOUT
try:
from urllib.parse import urlsplit, urlunsplit, splituser, urlunparse, urlparse
from urllib.request import Request, urlopen
except ImportError:
from urllib2 import Request, urlopen, splituser
from urlparse import urlsplit, urlunsplit, urlunparse, urlparse
from .utils import force_link, ignore_missing_file, in_dir, TempDir, TempFile
MEDIA_TYPE = "application/vnd.git-lfs+json"
POST_HEADERS = {"Accept": MEDIA_TYPE, "Content-Type": MEDIA_TYPE}
def urlretrieve(url, data=None, headers=None):
scheme, netloc, path, params, query, frag = urlparse(url)
auth, host = splituser(netloc)
if auth:
auth = auth.encode("utf-8")
url = urlunparse((scheme, host, path, params, query, frag))
req = Request(url, data, headers)
base64string = base64.encodestring(auth)[:-1]
basic = "Basic " + base64string.decode("utf-8")
req.add_header("Authorization", basic)
else:
req = Request(url, data, headers)
return urlopen(req)
def git_show(git_repo, p):
with in_dir(git_repo):
return check_output(["git", "show", "HEAD:" + p])
def get_cache_dir(git_dir, oid):
return git_dir + "/lfs/objects/" + oid[:2] + "/" + oid[2:4]
def get_lfs_endpoint_url(git_repo, checkout_dir):
try:
with in_dir(checkout_dir):
url = check_output(
"git config -f .lfsconfig --get lfs.url".split()
).strip().decode("utf8")
except CalledProcessError:
with in_dir(git_repo):
url = check_output(
"git config --get remote.origin.url".split()
).strip().decode("utf8")
if url.endswith("/"):
url = url[:-1]
if not url.endswith("/info/lfs"):
url += "/info/lfs" if url.endswith(".git") else ".git/info/lfs"
url_split = urlsplit(url)
host, path = url_split.hostname, url_split.path
if url_split.scheme != "https":
if not url_split.scheme:
# SSH format: git@example.org:repo.git
host, path = url.replace("/info/lfs", "").split(":", 1)
auth_header, url = get_lfs_api_token(host, path)
assert url
return url, auth_header
url = urlunsplit(("https", host, path, "", ""))
# need to get GHE auth token if available. issue cmd like this to get:
# ssh git@git-server.com git-lfs-authenticate foo/bar.git download
if path.endswith("/info/lfs"):
path = path[:-len("/info/lfs")]
auth_header = {}
if not (url_split.username and url_split.password):
# TODO: this is ugly
try:
auth_header, remote_path = get_lfs_api_token("git@" + host, path)
if remote_path:
assert url == remote_path
except:
pass
return url, auth_header
def get_lfs_api_token(host, path):
""" gets an authorization token to use to do further introspection on the
LFS info in the repository. See documentation here for description of
the ssh command and response:
https://github.com/git-lfs/git-lfs/blob/master/docs/api/server-discovery.md
"""
header_info = {}
query_cmd = "ssh " + host + " git-lfs-authenticate " + path + " download"
# TODO: we're suppressing stderr
output = check_output(query_cmd.split(),
stderr=PIPE).strip().decode("utf8")
if output:
query_resp = json.loads(output)
header_info = query_resp["header"]
url = query_resp["href"]
return header_info, url
def find_lfs_files(checkout_dir):
"""Yields the paths of the files managed by Git LFS
"""
with in_dir(checkout_dir):
repo_files = Popen("git ls-files -z".split(), stdout=PIPE)
repo_files_attrs = check_output(
"git check-attr --cached --stdin -z diff filter".split(),
stdin=repo_files.stdout
)
# In old versions of git, check-attr's `-z` flag only applied to input
sep = b"\0" if b"\0" in repo_files_attrs else b"\n"
i = iter(repo_files_attrs.strip(sep).split(sep))
paths = set()
while True:
try:
if sep == b"\0":
path, attr, value = next(i), next(i), next(i)
else:
path, attr, value = next(i).rsplit(": ", 2)
attr # shut up pyflakes
except StopIteration:
break
if value != b"lfs":
continue
if path in paths:
continue
paths.add(path)
yield path.decode("ascii")
def read_lfs_metadata(checkout_dir):
"""Yields (path, oid, size) tuples for all files managed by Git LFS
"""
for path in find_lfs_files(checkout_dir):
meta = git_show(checkout_dir, path).decode("utf8").strip().split("\n")
if meta[0] != "version https://git-lfs.github.com/spec/v1":
continue
d = dict(line.split(" ", 1) for line in meta[1:])
oid = d["oid"]
oid = oid[7:] if oid.startswith("sha256:") else oid
size = int(d["size"])
yield (path, oid, size)
def fetch_urls(lfs_url, lfs_auth_info, oid_list):
"""Fetch the URLs of the files from the Git LFS endpoint
"""
data = json.dumps({"operation": "download", "objects": oid_list})
headers = dict(POST_HEADERS)
headers.update(lfs_auth_info)
resp = json.loads(urlretrieve(lfs_url + "/objects/batch", data.encode("ascii"), headers).read().decode("ascii"))
assert "objects" in resp, resp
return resp["objects"]
def fetch(git_repo, checkout_dir=None, verbose=0, only=[]):
"""Download all the files managed by Git LFS
"""
git_dir = git_repo + "/.git" if os.path.isdir(git_repo + "/.git") else git_repo
checkout_dir = checkout_dir or git_repo
if checkout_dir == git_dir:
logger.error("Can't checkout into a bare repo, please provide a valid checkout_dir")
raise SystemExit(1)
checkout_git_dir = checkout_dir + "/.git"
if not os.path.isdir(checkout_git_dir):
with TempDir(dir=checkout_dir) as d:
check_output(["git", "clone", "-ns", git_repo, d], stderr=STDOUT)
os.rename(d + "/.git", checkout_git_dir)
with in_dir(checkout_dir):
check_output(["git", "reset", "HEAD"])
# Read the LFS metadata
found = False
only_enabled = len(only) > 0
only = [os.path.relpath(os.path.abspath(path), checkout_dir) for path in only]
oid_list, lfs_files = [], {}
for path, oid, size in read_lfs_metadata(checkout_dir):
if only_enabled:
if path not in only:
continue
else:
only.remove(path)
found = True
dst = checkout_dir + "/" + path
# Skip the file if it looks like it's already there
with ignore_missing_file():
if os.stat(dst).st_size == size:
if verbose > 1:
logger.info(f"Skipping {path} (already present)")
continue
# If we have the file in the cache, link to it
with ignore_missing_file():
cached = get_cache_dir(git_dir, oid) + "/" + oid
if os.stat(cached).st_size == size:
force_link(cached, dst)
if verbose > 0:
logger.info(f"Linked {path} from the cache")
continue
oid_list.append(dict(oid=oid, size=size))
lfs_files[(oid, size)] = path
if only_enabled and only:
logger.error("Couldn't find the following files requested with --only:")
for path in only:
logger.error(path)
return False
if not found:
logger.error("This repository does not seem to use LFS.")
return False
if not oid_list:
if verbose > 0:
logger.info("Nothing to fetch.")
return True
# Fetch the URLs of the files from the Git LFS endpoint
lfs_url, lfs_auth_info = get_lfs_endpoint_url(git_repo, checkout_dir)
if verbose > 0:
logger.info(f"Fetching URLs from {lfs_url} ...")
if verbose > 1:
logger.debug(f"Authorization info for URL: {lfs_auth_info}")
logger.debug(f"oid_list: {pprint.pformat(oid_list)}")
objects = fetch_urls(lfs_url, lfs_auth_info, oid_list)
# Download the files
tmp_dir = git_dir + "/lfs/tmp"
if not os.path.exists(tmp_dir):
os.makedirs(tmp_dir)
for obj in objects:
oid, size = (obj["oid"], obj["size"])
path = lfs_files[(oid, size)]
cache_dir = get_cache_dir(git_dir, oid)
# Download into tmp_dir
with TempFile(dir=tmp_dir) as f:
url = obj["actions"]["download"]["href"]
head = obj["actions"]["download"]["header"]
logged_url = url if verbose > 0 else url[:40]
logger.info(f"Downloading {path} ({(size / (1024 ** 2)):.2f} MB) from {logged_url}...")
h = urlretrieve(url, headers=head)
while True:
buf = h.read(10240)
if not buf:
break
f.write(buf)
# Move to cache_dir
dst1 = cache_dir + "/" + oid
if not os.path.exists(cache_dir):
os.makedirs(cache_dir)
if verbose > 1:
logger.debug("temp download file: " + f.name)
logger.debug("cache file name: " + dst1)
os.rename(f.name, dst1)
# Copy into checkout_dir
dst2 = checkout_dir + "/" + path
force_link(dst1, dst2)
return True
orchestra-v3-master/orchestra/git_lfs/utils.py 0000664 0000000 0000000 00000002300 13762437205 0022026 0 ustar 00root root 0000000 0000000 from __future__ import division, print_function, unicode_literals
import os
import shutil
from contextlib import contextmanager
from tempfile import mkdtemp, NamedTemporaryFile
@contextmanager
def ignore_missing_file(filename=None):
try:
yield
except OSError as e:
if e.errno != 2 or filename and e.filename != filename:
raise
@contextmanager
def in_dir(dirpath):
# WARNING not thread-safe
prev = os.path.abspath(os.getcwd())
os.chdir(dirpath)
try:
yield
finally:
os.chdir(prev)
@contextmanager
def TempDir(**kw):
"""mkdtemp wrapper that automatically deletes the directory
"""
d = mkdtemp(**kw)
try:
yield d
finally:
with ignore_missing_file(d):
shutil.rmtree(d)
@contextmanager
def TempFile(**kw):
"""NamedTemporaryFile wrapper that doesn't fail if you (re)move the file
"""
f = NamedTemporaryFile(**kw)
try:
yield f
finally:
with ignore_missing_file():
f.__exit__(None, None, None)
def force_link(source, link_name):
# WARNING not atomic
with ignore_missing_file():
os.remove(link_name)
os.link(source, link_name)
orchestra-v3-master/orchestra/model/ 0000775 0000000 0000000 00000000000 13762437205 0017772 5 ustar 00root root 0000000 0000000 orchestra-v3-master/orchestra/model/__init__.py 0000664 0000000 0000000 00000000000 13762437205 0022071 0 ustar 00root root 0000000 0000000 orchestra-v3-master/orchestra/model/build.py 0000664 0000000 0000000 00000005332 13762437205 0021446 0 ustar 00root root 0000000 0000000 import hashlib
import os.path
from ..actions import CloneAction
from ..actions import ConfigureAction
from ..actions import InstallAction
from . import component
from ..actions.util import run_script
class Build:
def __init__(
self,
name: str,
comp: component.Component,
serialized_build: str,
ndebug=True,
test=False,
):
self.name = name
self.component = comp
self.serialized_build = serialized_build
self.configure: ConfigureAction = None
self.install: InstallAction = None
self.ndebug = ndebug
self.test = test
@property
def qualified_name(self):
return f"{self.component.name}@{self.name}"
@property
def self_hash(self):
serialized_build = self.serialized_build
if self.component.clone:
branch, commit = self.component.clone.branch()
if commit:
serialized_build = commit.encode("utf-8") + serialized_build
return hashlib.sha1(serialized_build).hexdigest()
@property
def recursive_hash(self):
# The recursive hash of a build depends on all its configure and install dependencies
all_builds = {d.build for d in self.configure.external_dependencies}
# TODO: are install dependencies required to be part of the information to hash?
# In theory they should not influence the artifacts
all_builds.update({d.build for d in self.install.external_dependencies})
# Filter out builds from the same component
all_builds = [b for b in all_builds if b.component != self.component]
# sorted_dependencies = [(b.qualified_name, b) for b in all_builds]
# sorted_dependencies.sort()
all_builds.sort(key=lambda b: b.qualified_name)
to_hash = self.self_hash
for b in all_builds:
to_hash += b.recursive_hash
return hashlib.sha1(to_hash.encode("utf-8")).hexdigest()
@property
def safe_name(self):
return self.qualified_name.replace("@", "_").replace("/", "_")
@property
def binary_archive_dir(self):
"""Returns the relative dirname where the binary archives should be created/found."""
return os.path.join(self.component.name, self.name)
@property
def binary_archive_filename(self):
"""Returns the filename of the binary archive. Remember to os.path.join it with binary_archive_dir!"""
component_commit = self.component.commit() or "none"
return f'{component_commit}_{self.recursive_hash}.tar.gz'
def __str__(self):
return f"Build {self.component.name}@{self.name}"
def __repr__(self):
return f"Build {self.component.name}@{self.name}"
orchestra-v3-master/orchestra/model/component.py 0000664 0000000 0000000 00000002550 13762437205 0022350 0 ustar 00root root 0000000 0000000 from typing import Dict
from . import build
from ..actions.util import run_script
class Component:
def __init__(self,
name: str,
default_build_name: str,
license: str,
from_source: bool,
binary_archives: str,
skip_post_install=False,
):
self.name = name
self.builds: Dict[str, 'build.Build'] = {}
self.default_build_name = default_build_name
self.skip_post_install = skip_post_install
self.license = license
self.from_source = from_source
self.clone: CloneAction = None
self.binary_archives = binary_archives
def add_build(self, bld: 'build.Build'):
self.builds[bld.name] = bld
@property
def default_build(self):
return self.builds[self.default_build_name]
def commit(self):
if self.clone is None:
return None
branch, commit = self.clone.branch()
return commit
def branch(self):
if self.clone is None:
return None
branch, commit = self.clone.branch()
return branch
def __str__(self):
return f"Component {self.name}"
def __repr__(self):
s = f"Component {self.name}"
for bld in self.builds.values():
s += " " + str(bld)
return s
orchestra-v3-master/orchestra/model/configuration.py 0000664 0000000 0000000 00000037627 13762437205 0023232 0 ustar 00root root 0000000 0000000 import hashlib
import json
import os
import re
import subprocess
from collections import OrderedDict
from itertools import repeat
from textwrap import dedent
from typing import Dict
from tempfile import TemporaryDirectory
import yaml
from fuzzywuzzy import fuzz
from loguru import logger
from . import build as bld
from . import component as comp
from ..actions import CloneAction, ConfigureAction, InstallAction, InstallAnyBuildAction
from ..actions.util import run_script
from ..util import parse_component_name, parse_dependency
def follow_redirects(url, max=3):
if max == 0:
return url
# TODO: this code is duplicated in several places
env = {
"GIT_SSH_COMMAND": "ssh -oControlPath=~/.ssh/ssh-mux-%r@%h:%p -oControlMaster=auto -o ControlPersist=10",
"GIT_LFS_SKIP_SMUDGE": "1",
}
new_url = None
with TemporaryDirectory() as temporary_directory:
# TODO: we're not printing the output
result = run_script(f"""git clone "{url}" "{temporary_directory}" """,
environment=env,
quiet=True,
check_returncode=False)
if result.returncode != 0:
logger.info(f"Could not clone binary archive from remote {url}!")
logger.info(result.stdout.decode("utf8").strip())
return url
redirect_path = os.path.join(temporary_directory, "REDIRECT")
if os.path.exists(redirect_path):
with open(redirect_path) as redirect_file:
new_url = redirect_file.read().strip()
if new_url:
logger.info(f"Redirecting to {new_url}")
return follow_redirects(new_url, max - 1)
else:
return url
class Configuration:
def __init__(self, args):
self.args = args
self.components: Dict[str, comp.Component] = {}
self.from_source = args.from_source
self.fallback_to_build = args.fallback_build
self.orchestra_dotdir = Configuration.locate_orchestra_dotdir()
if not self.orchestra_dotdir:
raise Exception("Directory .orchestra not found!")
self._create_default_user_options()
self.generated_yaml = run_ytt(self.orchestra_dotdir, use_cache=not args.no_config_cache)
self.parsed_yaml = yaml.safe_load(self.generated_yaml)
self.remotes = self._get_remotes()
self.binary_archives_remotes = self._get_binary_archives_remotes()
self.branches = self._get_branches()
self.orchestra_root = self.parsed_yaml.get("paths", {}).get("orchestra_root")
if not self.orchestra_root:
self.orchestra_root = os.path.realpath(os.path.join(self.orchestra_dotdir, "..", "root"))
self.source_archives = self.parsed_yaml.get("paths", {}).get("source_archives")
if not self.source_archives:
self.source_archives = os.path.realpath(os.path.join(self.orchestra_dotdir, "source_archives"))
self.binary_archives_dir = self.parsed_yaml.get("paths", {}).get("binary_archives")
if not self.binary_archives_dir:
self.binary_archives_dir = os.path.realpath(os.path.join(self.orchestra_dotdir, "binary-archives"))
self.tmproot = self.parsed_yaml.get("paths", {}).get("tmproot")
if not self.tmproot:
self.tmproot = os.path.realpath(os.path.join(self.orchestra_dotdir, "tmproot"))
self.sources_dir = self.parsed_yaml.get("paths", {}).get("sources_dir")
if not self.sources_dir:
self.sources_dir = os.path.realpath(os.path.join(self.orchestra_dotdir, "..", "sources"))
self.builds_dir = self.parsed_yaml.get("paths", {}).get("builds_dir")
if not self.builds_dir:
self.builds_dir = os.path.realpath(os.path.join(self.orchestra_dotdir, "..", "build"))
self._global_env = self._compute_global_env()
self._parse_components()
def get_build(self, comp_spec):
component_name, build_name = parse_component_name(comp_spec)
component = self.components.get(component_name)
if not component:
return None
if build_name:
build = component.builds[build_name]
else:
build = component.default_build
return build
def installed_component_file_list_path(self, component_name):
"""
Returns the path of the index containing the list of installed files of a component
"""
return os.path.join(self.installed_component_metadata_dir(), component_name.replace("/", "_") + ".idx")
def installed_component_metadata_path(self, component_name):
"""
Returns the path of the file containing metadata about an installed component
"""
return os.path.join(self.installed_component_metadata_dir(), component_name.replace("/", "_") + ".json")
def installed_component_license_path(self, component_name):
"""
Returns the path of the file containing the license of an installed component
"""
return os.path.join(self.installed_component_metadata_dir(), component_name.replace("/", "_") + ".license")
def installed_component_metadata_dir(self):
"""
Returns the path of the directory containing indices of the installed components
"""
return os.path.join(self.orchestra_root, "share", "orchestra")
def global_env(self):
return self._global_env.copy()
def get_suggested_component_name(self, user_component_name):
best_ratio = 0
best_match = None
for component_name in self.components:
ratio = fuzz.ratio(user_component_name, component_name)
if ratio > best_ratio:
best_ratio = ratio
best_match = component_name
return best_match
def _compute_global_env(self):
env = OrderedDict()
env["ORCHESTRA_DOTDIR"] = self.orchestra_dotdir
env["ORCHESTRA_ROOT"] = self.orchestra_root
env["SOURCE_ARCHIVES"] = self.source_archives
env["BINARY_ARCHIVES"] = self.binary_archives_dir
env["SOURCES_DIR"] = self.sources_dir
env["BUILDS_DIR"] = self.builds_dir
env["TMP_ROOTS"] = self.tmproot
env["RPATH_PLACEHOLDER"] = "////////////////////////////////////////////////$ORCHESTRA_ROOT"
# TODO: the order of the variables stays the same even if the
# user overrides an environment variable from the config.
# This is convenient but we should think if it is really what we want.
for env_dict in self.parsed_yaml["environment"]:
for k, v in env_dict.items():
env[k] = v
path = ":".join(self.parsed_yaml.get("add_to_path", []))
for _, component in self.parsed_yaml["components"].items():
add_to_path = component.get("add_to_path")
if add_to_path:
path += f":{add_to_path}"
path += "${PATH:+:${PATH}}"
env["PATH"] = path
env["GIT_ASKPASS"] = "/bin/true"
return env
def _parse_components(self):
# First pass: create the component, its builds and actions
for component_name, component_yaml in self.parsed_yaml["components"].items():
default_build = component_yaml.get("default_build")
license = component_yaml.get("license")
if not default_build:
build_names = list(component_yaml["builds"])
build_names.sort()
default_build = build_names[0]
skip_post_install = component_yaml.get("skip_post_install", False)
from_source = component_yaml.get("build_from_source", False) or self.from_source
binary_archives = component_yaml.get("binary_archives", None)
component = comp.Component(component_name,
default_build,
license,
from_source,
binary_archives,
skip_post_install=skip_post_install)
repo = component_yaml.get("repository")
if repo:
clone_action = CloneAction(component, repo, self)
component.clone = clone_action
self.components[component_name] = component
for build_name, build_yaml in component_yaml["builds"].items():
ndebug = build_yaml.get("ndebug", True)
test = build_yaml.get("test", False)
# This will be used to compute the self_hash
serialized_build = json.dumps(build_yaml, sort_keys=True).encode("utf-8")
build = bld.Build(build_name,
component,
serialized_build,
ndebug=ndebug,
test=test)
component.add_build(build)
configure_script = build_yaml["configure"]
build.configure = ConfigureAction(build, configure_script, self)
install_script = build_yaml["install"]
build.install = InstallAction(
build,
install_script,
self,
from_binary_archives=not from_source,
fallback_to_build=self.fallback_to_build,
)
# Second pass: resolve "external" dependencies
for component_name, component_yaml in self.parsed_yaml["components"].items():
component = self.components[component_name]
for build_name, build_yaml in component_yaml["builds"].items():
build = component.builds[build_name]
dependencies = build_yaml.get("dependencies", [])
build_dependencies = build_yaml.get("build_dependencies", [])
# List of (dependency_name: str, build_only: bool)
all_dependencies = []
all_dependencies += list(zip(dependencies, repeat(False)))
all_dependencies += list(zip(build_dependencies, repeat(True)))
for dep, build_only in all_dependencies:
dep_comp_name, dep_build_name, exact_build_required = parse_dependency(dep)
dep_comp = self.components[dep_comp_name]
if dep_build_name:
dep_build = dep_comp.builds[dep_build_name]
else:
dep_build = dep_comp.default_build
if exact_build_required:
dep_action = dep_build.install
else:
dep_action = InstallAnyBuildAction(dep_build, self)
build.configure.external_dependencies.add(dep_action)
if not component_yaml.get("build_from_source") \
and not self.from_source \
and not build_only:
build.install.external_dependencies.add(dep_action)
@staticmethod
def locate_orchestra_dotdir(relpath=""):
cwd = os.getcwd()
search_path = os.path.realpath(os.path.join(cwd, relpath))
if ".orchestra" in os.listdir(search_path):
return os.path.join(search_path, ".orchestra")
if search_path == "/":
return None
return Configuration.locate_orchestra_dotdir(os.path.join(relpath, ".."))
@staticmethod
def locate_user_options():
orchestra_dotdir = Configuration.locate_orchestra_dotdir()
return os.path.join(orchestra_dotdir, "config", "user_options.yml")
def _create_default_user_options(self):
remotes_config_file = Configuration.locate_user_options()
if os.path.exists(remotes_config_file):
return
logger.info(f"This is the first time you run orchestra, welcome!")
relative_path = os.path.relpath(remotes_config_file,
os.path.join(self.orchestra_dotdir,
".."))
logger.info(f"Creating default user options in {relative_path}")
logger.info("Populating default remotes for repositories and binary archives")
logger.info("Remember to run `orc update` next")
git_output = subprocess.check_output(
["git", "-C", self.orchestra_dotdir, "config", "--get-regexp", "remote\.[^.]*\.url"]
).decode("utf-8")
remotes_re = re.compile("remote\.(?P[^.]*)\.url (?P.*)$")
remotes = {}
for line in git_output.splitlines(keepends=False):
match = remotes_re.match(line)
base_url = os.path.dirname(match.group("url"))
remotes[match.group("name")] = base_url
if not remotes:
logger.error("Could not get default remotes, manually configure .config/user_remotes.yml")
exit(1)
remote_base_urls = ""
binary_archives = ""
for name, url in remotes.items():
remote_base_urls += f' - {name}: "{url}"\n'
start_url = f"{url}/binary-archives"
logger.info(f"Checking for redirects in {start_url}")
binary_archives_url = follow_redirects(start_url)
binary_archives += f' - {name}: "{binary_archives_url}"\n'
default_user_config = dedent("""
#! This file was automatically generated by orchestra
#! Edit it to suit your preferences
#@data/values
---
#@overlay/match missing_ok=True
remote_base_urls:
""").lstrip()
default_user_config += remote_base_urls
default_user_config += dedent("""
#@overlay/match missing_ok=True
binary_archives:
""")
default_user_config += binary_archives
default_user_config += dedent("""
#! #@overlay/replace
#! build_from_source:
#! - component-name
""")
with open(remotes_config_file, "w") as f:
f.write(default_user_config)
def _get_remotes(self):
remotes = OrderedDict()
for remote in self.parsed_yaml.get("remote_base_urls", []):
assert len(remote) == 1, "remote_base_urls must be a list of dictionaries with one entry (name: url)"
for name, url in remote.items():
remotes[name] = url
return remotes
def _get_binary_archives_remotes(self):
remotes = OrderedDict()
for remote in self.parsed_yaml.get("binary_archives", []):
assert len(remote) == 1, "binary_archives must be a list of dictionaries with one entry (name: url)"
for name, url in remote.items():
remotes[name] = url
return remotes
def _get_branches(self):
branches = self.parsed_yaml.get("branches", [])
assert type(branches) is list
for branch in branches:
assert type(branch) is str, "branches must be a list of strings"
return branches
def hash_config_dir(config_dir):
hash_script = f"""find "{config_dir}" -type f -print0 | sort -z | xargs -0 sha1sum | sha1sum"""
config_hash = subprocess.check_output(hash_script, shell=True).decode("utf-8").strip().partition(" ")[0]
return config_hash
def run_ytt(orchestra_dotdir, use_cache=True):
config_dir = os.path.join(orchestra_dotdir, "config")
config_cache_file = os.path.join(orchestra_dotdir, "config_cache.yml")
config_hash = hash_config_dir(config_dir)
if use_cache and os.path.exists(config_cache_file):
with open(config_cache_file, "r") as f:
cached_hash = f.readline().replace("#!", "").strip()
if config_hash == cached_hash:
return f.read()
ytt = os.path.join(os.path.dirname(__file__), "..", "support", "ytt")
env = os.environ.copy()
env["GOCG"] = "off"
expanded_yaml = subprocess.check_output(f"'{ytt}' -f {config_dir}", shell=True, env=env).decode("utf-8")
if use_cache:
with open(config_cache_file, "w") as f:
f.write(f"#! {config_hash}\n")
f.write(expanded_yaml)
return expanded_yaml
orchestra-v3-master/orchestra/support/ 0000775 0000000 0000000 00000000000 13762437205 0020406 5 ustar 00root root 0000000 0000000 orchestra-v3-master/orchestra/support/elf-replace-dynstr.py 0000775 0000000 0000000 00000005162 13762437205 0024467 0 ustar 00root root 0000000 0000000 #!/usr/bin/env python3
import argparse
import sys
from elftools.elf.dynamic import DynamicSegment
from elftools.elf.elffile import ELFFile
def log_error(msg):
sys.stderr.write("[ERROR] {}\n".format(msg))
def log(msg):
sys.stderr.write(msg + "\n")
def unique_or_none(list):
if not list:
return None
assert len(list) == 1
return list[0]
def main():
parser = argparse.ArgumentParser(description="Rewrite portions of .dynstr.")
parser.add_argument("elf_path", metavar="ELF", help="path to the ELF file.")
parser.add_argument("search", metavar="SEARCH", help="string to search.")
parser.add_argument("replace", metavar="REPLACE", help="replacement.")
parser.add_argument("padding", metavar="PADDING", nargs="?", default="\x00", help="padding (default NUL).")
args = parser.parse_args()
fail = False
if len(args.replace) > len(args.search):
fail = True
if len(args.replace) < len(args.search):
args.replace = args.replace + args.padding * (len(args.search) - len(args.replace))
args.replace = args.replace.encode("ascii")
args.search = args.search.encode("ascii")
with open(args.elf_path, "rb+") as elf_file:
elf = ELFFile(elf_file)
dynamic = unique_or_none([segment
for segment
in elf.iter_segments()
if type(segment) is DynamicSegment])
if dynamic is None:
log("Not a dynamic executable")
return 0
address = unique_or_none([tag.entry.d_val
for tag
in dynamic.iter_tags()
if tag.entry.d_tag == "DT_STRTAB"])
offset = None
if address:
offset = unique_or_none(list(elf.address_offsets(address)))
size = unique_or_none([tag.entry.d_val
for tag
in dynamic.iter_tags()
if tag.entry.d_tag == "DT_STRSZ"])
if offset is None or size is None:
log("DT_STRTAB not found")
return 0
elf_file.seek(offset)
original = elf_file.read(size)
new = original.replace(args.search, args.replace)
if new != original:
if fail:
log("Search string is shorter than replacement.")
return 1
log("Patching")
elf_file.seek(offset)
elf_file.write(new)
else:
log("Nothing to patch")
return 0
if __name__ == "__main__":
sys.exit(main())
orchestra-v3-master/orchestra/support/shell-home/ 0000775 0000000 0000000 00000000000 13762437205 0022443 5 ustar 00root root 0000000 0000000 orchestra-v3-master/orchestra/support/shell-home/.bashrc 0000664 0000000 0000000 00000000100 13762437205 0023675 0 ustar 00root root 0000000 0000000 export HOME="$OLD_HOME"
. "$HOME/.bashrc"
PS1="$PS1_PREFIX$PS1"
orchestra-v3-master/orchestra/support/shell-home/.zshrc 0000664 0000000 0000000 00000000077 13762437205 0023601 0 ustar 00root root 0000000 0000000 export HOME="$OLD_HOME"
. "$HOME/.zshrc"
PS1="$PS1_PREFIX$PS1"
orchestra-v3-master/orchestra/support/verify-root 0000775 0000000 0000000 00000027564 13762437205 0022637 0 ustar 00root root 0000000 0000000 #!/usr/bin/env python3
import argparse
import os
import re
import sys
from tqdm import tqdm
from collections import defaultdict
from elftools.elf.dynamic import DynamicSegment
from elftools.elf.elffile import ELFFile
def log(message):
sys.stderr.write(message + "\n")
def read_file(path):
with open(path, "r") as input_file:
return [line.strip() for line in input_file]
def is_executable(path):
return os.access(path, os.X_OK)
def is_elf(path):
with open(path, "rb") as input_file:
return input_file.read(4) == b"\x7FELF"
def unique_or_none(list):
if len(list) == 1:
return list[0]
else:
return None
def get_dynamic(elf):
return unique_or_none([segment
for segment
in elf.iter_segments()
if type(segment) is DynamicSegment])
class Root:
def __init__(self, root_path):
self.file_map = dict()
self.reverse_file_map = defaultdict(list)
self.root_path = root_path
self.package_files_path = os.path.join(self.root_path, "share", "orchestra")
self.all_files = set(["lib"])
def load_file(self, path):
files = read_file(path)
path = os.path.relpath(path, self.package_files_path)
for file in files:
if file.startswith("./"):
file = file[2:]
self.reverse_file_map[file].append(path)
self.all_files.add(file)
self.file_map[path] = files
def load_package_files(self):
# Walk recursively all the file the text files
for directory, subdirectories, files in os.walk(self.package_files_path):
for file in files:
# Skip metadata files
if file.endswith(".json"):
continue
self.load_file(os.path.join(directory, file))
def report_duplicates(self):
header = False
for file, packages in self.reverse_file_map.items():
if len(packages) > 1:
if not header:
header = True
log("Files in multiple packages:")
log(" {}:".format(file))
for package in packages:
log(" {}".format(package))
return header
def collect_installed_files(self):
self.installed_files = set()
for directory, subdirectories, files in os.walk(self.root_path):
for subdirectory in subdirectories:
path = os.path.join(directory, subdirectory)
if os.path.islink(path):
self.installed_files.add(os.path.relpath(path, self.root_path))
for file in files:
path = os.path.join(directory, file)
self.installed_files.add(os.path.relpath(path, self.root_path))
def check_installed_files(self):
missing_files = self.all_files - self.installed_files
if missing_files:
log("The following files are listed as installed but are not"
+ " present in root:")
for missing_file in sorted(missing_files):
log(" {}".format(missing_file))
extra_files = self.installed_files - self.all_files
if extra_files:
log("The following files are present in root but do not belong to"
+ " any component:")
for extra_file in sorted(extra_files):
log(" {}".format(extra_file))
return len(missing_files) > 0 or len(extra_files) > 0
def is_for_host(self, path, elf):
if elf.header.e_machine != "EM_X86_64":
return False
return True
def prepare_file_list(self, files, prefix=""):
result = ""
by_component = defaultdict(list)
for file in files:
components = ""
if file in self.reverse_file_map:
for component in self.reverse_file_map[file]:
by_component[component].append(file)
else:
by_component["(orphan)"].append(file)
for component, files in sorted(by_component.items()):
result += "{}{}:\n".format(prefix, component)
for file in files:
result += "{} {}\n".format(prefix, file)
return result
def print_file_list(self, files, prefix=""):
log(self.prepare_file_list(files, prefix))
def verify_elfs(self):
missing_libraries = defaultdict(list)
libraries_in_root = defaultdict(list)
allowed_glibc_versions = set()
used_glibc_versions = dict()
invalid_runpaths = defaultdict(list)
for installed_file in tqdm(sorted(self.installed_files)):
path = os.path.join(self.root_path, installed_file)
if os.path.isfile(path) and is_executable(path) and is_elf(path):
with open(path, "rb") as elf_file:
elf = ELFFile(elf_file)
dynamic_segment = get_dynamic(elf)
if (self.is_for_host(installed_file, elf)
and dynamic_segment):
if "link-only" not in installed_file:
libraries_in_root[os.path.basename(installed_file)].append(installed_file)
# Get the string table
tag = unique_or_none([tag
for tag
in dynamic_segment.iter_tags()
if tag.entry.d_tag == "DT_STRTAB"])
string_table_address = tag.entry.d_val
string_table_offset = unique_or_none(list(elf.address_offsets(string_table_address)))
tag = unique_or_none([tag
for tag
in dynamic_segment.iter_tags()
if tag.entry.d_tag == "DT_STRSZ"])
string_table_size = tag.entry.d_val
elf_file.seek(string_table_offset)
string_table = elf_file.read(string_table_size)
glibc_versions = set([version.strip(b"\x00").decode("ascii")
for version in
re.findall(b"GLIBC_[0-9.]*\x00", string_table)])
if "link-only" in installed_file:
allowed_glibc_versions = allowed_glibc_versions.union(glibc_versions)
else:
used_glibc_versions[installed_file] = glibc_versions
runpaths = []
runpath_tag = unique_or_none([tag
for tag
in dynamic_segment.iter_tags()
if tag.entry.d_tag == "DT_RUNPATH"])
if runpath_tag:
runpath = string_table[runpath_tag.entry.d_val:].split(b"\x00")[0].decode("ascii")
runpath = runpath.replace("$ORIGIN", os.path.dirname(os.path.realpath(path)))
runpaths = runpath.split(":")
runpaths = map(os.path.realpath, runpaths)
runpaths = [os.path.relpath(runpath, self.root_path)
for runpath
in runpaths]
runpaths = list(set(runpaths))
for runpath in runpaths:
path = os.path.join(self.root_path, runpath)
if not (os.path.isdir(path) or os.path.islink(path)):
invalid_runpaths[runpath].append(installed_file)
# Collect DT_NEEDED
needed_string_offsets = [tag.entry.d_val
for tag
in dynamic_segment.iter_tags()
if tag.entry.d_tag == "DT_NEEDED"]
for needed_string_offset in needed_string_offsets:
lib_name = string_table[needed_string_offset:].split(b"\x00")[0].decode("ascii")
found = False
for runpath in runpaths:
candidate = os.path.relpath(os.path.join(self.root_path,
runpath,
lib_name),
self.root_path)
if candidate in self.all_files:
found = True
break
if not found:
missing_libraries[lib_name].append(installed_file)
if invalid_runpaths:
log("The following runpaths are invalid:")
for runpath, users in invalid_runpaths.items():
log(" {}".format(runpath))
self.print_file_list(users, " ")
system_libraries = []
for missing_library, users in missing_libraries.items():
if missing_library in libraries_in_root:
file_list = self.prepare_file_list(users, " ")
if file_list:
log("{} is available in root".format(missing_library))
log(" These are the instances:")
self.print_file_list(libraries_in_root[missing_library], " ")
log(" These are the users:")
log(file_list)
else:
system_libraries.append((missing_library, users))
if system_libraries and False:
log("The following libraries are not provided in root:")
for system_library, users in system_libraries:
log(" {}:".format(system_library))
self.print_file_list(users, " ")
by_version = defaultdict(list)
for installed_file, versions in used_glibc_versions.items():
unallowed_versions = versions - allowed_glibc_versions
for unallowed_version in unallowed_versions:
by_version[unallowed_version].append(installed_file)
to_print = list()
for version, users in sorted(by_version.items()):
file_list = self.prepare_file_list(users, " ")
if file_list:
to_print.append((version, users, file_list))
if to_print:
log("The following unallowed glibc versions are being used:")
for version, users, file_list in to_print:
log(" {}".format(version))
log(file_list)
return any(len(x) > 0 for x in [invalid_runpaths, to_print])
def main():
parser = argparse.ArgumentParser(description="Verify integrity of an orchestra root.")
parser.add_argument("root_path", metavar="ROOT_PATH", default=".", help="Path to Orchestra root")
args = parser.parse_args()
root_path = args.root_path
root = Root(root_path)
log("Loading package files...")
root.load_package_files()
duplicates_found = root.report_duplicates()
log("Collecting installed files...")
root.collect_installed_files()
log("Searching orphans...")
orphans_found = root.check_installed_files()
log("Verifying ELFs...")
errors_in_elfs = root.verify_elfs()
if duplicates_found or orphans_found or errors_in_elfs:
log("[!] Inconsistencies found in the root directory!")
return 1
else:
log("Root directory consistency checks passed!")
return 0
if __name__ == "__main__":
sys.exit(main())
orchestra-v3-master/orchestra/util.py 0000664 0000000 0000000 00000006635 13762437205 0020233 0 ustar 00root root 0000000 0000000 import json
import os.path
import re
import sys
from collections import OrderedDict
from typing import Union
class OrchestraException(Exception):
pass
def parse_component_name(component_spec):
tmp = component_spec.split("@")
component_name = tmp[0]
build_name = tmp[1] if len(tmp) > 1 else None
return component_name, build_name
def parse_dependency(dependency) -> (str, Union[str, None], bool):
"""
Dependencies can be specified in the following formats:
- Simple:
`component`
Depend on the installation of the default build of `component`.
- Exact:
`component@build`
Depend on the installation of a specific build of `component`
- Simple with preferred build:
`component~build`
to depend on the installation of any build of `component`.
If the component is not installed, the specified build is picked.
:returns component_name, build_name, exact_build_required
component_name: name of the requested component
build_name: name of the requested build or None
exact_build_required: True if build_name represents an exact requirement
"""
dependency_re = re.compile(r"(?P[\w\-_/]+)((?P[@~])(?P[\w\-_/]+))?")
match = dependency_re.fullmatch(dependency)
if not match:
raise Exception(f"Invalid dependency specified: {dependency}")
component = match.group("component")
exact_build_required = False if match.group("type") == "~" else True
build = match.group("build")
return component, build, exact_build_required
def get_installed_metadata(component_name, config):
"""
Returns the metadata dictionary for an installed component.
If the component is not installed, returns None.
"""
metadata_path = config.installed_component_metadata_path(component_name)
if not os.path.exists(metadata_path):
return None
with open(metadata_path) as f:
return json.load(f)
def get_installed_build(component_name, config):
"""
Returns the name of the installed build for the given component name.
If the component is not installed, returns None.
"""
metadata = get_installed_metadata(component_name, config)
if not metadata:
return None
return metadata.get("build_name", None)
def get_installed_hash(component_name, config):
"""
Returns the recursive hash of an installed component.
If the component is not installed, returns None.
"""
metadata = get_installed_metadata(component_name, config)
if not metadata:
return None
return metadata.get("recursive_hash", None)
def is_installed(config, wanted_component, wanted_build=None, wanted_recursive_hash=None):
metadata = get_installed_metadata(wanted_component, config)
if metadata is None:
return False
installed_build = metadata.get("build_name")
installed_recursive_hash = metadata.get("recursive_hash")
return installed_build is not None \
and (wanted_build is None or installed_build == wanted_build) \
and (wanted_recursive_hash is None or installed_recursive_hash == wanted_recursive_hash)
def export_environment(variables: OrderedDict):
env = ""
for var, val in variables.items():
env += f'export {var}="{val}"\n'
return env
def set_terminal_title(title):
if sys.stdout.isatty():
sys.stdout.write(f"\x1b]2;{title}\x07")
orchestra-v3-master/requirements.txt 0000664 0000000 0000000 00000000171 13762437205 0020163 0 ustar 00root root 0000000 0000000 loguru~=0.5.2
PyYAML~=5.3.1
fuzzywuzzy~=0.18.0
python-Levenshtein~=0.12.0
pyelftools~=0.26
enlighten~=1.6.2
tqdm~=4.50.2
orchestra-v3-master/setup.py 0000664 0000000 0000000 00000002043 13762437205 0016411 0 ustar 00root root 0000000 0000000 #!/usr/bin/env python3
import os.path
import urllib.request
from setuptools import setup, find_packages
ytt_url = "https://github.com/k14s/ytt/releases/download/v0.30.0/ytt-linux-amd64"
ytt_path = os.path.join(os.path.dirname(__file__), "orchestra/support/ytt")
if not os.path.exists(ytt_path):
print(f"ytt not found, downloading from {ytt_url}")
with urllib.request.urlopen(ytt_url) as ytt_download:
with open(ytt_path, "wb") as out:
out.write(ytt_download.read())
os.chmod(ytt_path, 0o755)
setup(
name='orchestra',
version='3.0',
description='The orchestra meta build system',
author='Filippo Cremonese (rev.ng SRLs)',
author_email='filippocremonese@rev.ng',
# TODO
url='https://rev.ng/gitlab/',
packages=find_packages(),
package_data={"orchestra": ["support/*"]},
install_requires=open("requirements.txt").readlines(),
entry_points={
"console_scripts": [
"orchestra=orchestra:main",
"orc=orchestra:main",
]
},
zip_safe=False,
)