@alikins
Created August 10, 2020 21:00
galaxy-importer code flow, sorta
- cli args:
  - artifact filename
  - --print
- config setup
  - get default config
  - read config files
    - /etc/galaxy-importer/galaxy-importer.cfg
    - site-packages/galaxy-importer/galaxy-importer.cfg
  - config options
    - log level
    - run_flake8
    - run_ansible_test
    - ansible_test_local_image: bool (whether docker should be used to run ansible-test)
    - local_image_docker ?
    - run_ansible_lint, run_ansible_doc: bools to enable/disable ansible-lint and ansible-doc
    - infra_osd: bool indicating we are running in openshift
    - check_required_tags: bool indicating the importer requires at least one tag
      from REQUIRED_TAG_LIST
      - REQUIRED_TAG_LIST could potentially be a config setting
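The config setup above can be sketched roughly like this: defaults first, then layer on whichever config files exist. The section name and option schema here are illustrative guesses, not the importer's exact config schema.

```python
# Hypothetical sketch of the config-setup step. Option names mirror the
# outline above; the defaults and the "galaxy-importer" section name are
# assumptions for illustration.
import configparser

DEFAULTS = {
    "log_level": "INFO",
    "run_flake8": "False",
    "run_ansible_test": "False",
    "run_ansible_lint": "True",
    "run_ansible_doc": "True",
    "infra_osd": "False",
    "check_required_tags": "False",
}


def load_config(paths=("/etc/galaxy-importer/galaxy-importer.cfg",)):
    parser = configparser.ConfigParser(defaults=DEFAULTS)
    # read() silently skips files that do not exist
    parser.read(paths)
    if parser.has_section("galaxy-importer"):
        section = parser["galaxy-importer"]
    else:
        section = parser.defaults()
    return {key: section.get(key, DEFAULTS[key]) for key in DEFAULTS}
```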
- main
  - setup logging
  - parse cli args
  - call_importer
    - regex namespace, name, version from artifact filename
    - open artifact file
    - collection.import_collection  # the real work
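The "regex namespace, name, version from artifact filename" step might look something like this. The actual pattern in galaxy-importer may be stricter; this sketch assumes a `namespace-name-version.tar.gz` layout.

```python
# Illustrative sketch of parsing an artifact filename like
# "my_ns-my_name-1.0.0.tar.gz"; the real importer's regex may differ.
import re

FILENAME_RE = re.compile(
    r"^(?P<namespace>\w+)-(?P<name>\w+)-"
    r"(?P<version>[0-9][0-9a-zA-Z.+-]*)\.tar\.gz$"
)


def parse_artifact_filename(filename):
    match = FILENAME_RE.match(filename)
    if not match:
        raise ValueError(f"unexpected artifact filename: {filename}")
    return match.group("namespace"), match.group("name"), match.group("version")
```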
- _import_collection
  - create a tmp extract_dir to extract to
  - find the file if local
  - if file is a storage object, hydrate it and download it locally
  - _extract_archive  # extract tar archive to extract_dir
    - extract_tar_shell
      - run a subprocess 'tar' command
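The extract step shells out to `tar` rather than using Python's tarfile module; a minimal sketch of that subprocess call (the exact flags used by the importer are an assumption):

```python
# Sketch of extract_tar_shell: extract a collection tarball into
# extract_dir by running the 'tar' command in a subprocess.
import subprocess


def extract_tar_shell(tarfile_path, extract_dir):
    # -x extract, -f archive file, -C target directory
    cmd = ["tar", "-xf", tarfile_path, "-C", extract_dir]
    subprocess.run(cmd, check=True, capture_output=True)
```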
  - instantiate a loaders.CollectionLoader
  - call the CollectionLoader instance's load()
    - load_collection_manifest
      - find and open the MANIFEST.json in extract_dir
      - instantiate a schema.CollectionArtifactManifest via its .parse() to read the json
      - set the CollectionLoader instance's .metadata by side effect
    - rename_extract_path
      - replace the placeholders in the extract_dir path with the ns/name/version etc values
    - validate that the artifact filename matches the loaded metadata
    - if metadata includes a README and/or license_file, validate that those paths exist
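The manifest-loading step above can be sketched as reading MANIFEST.json and pulling out its `collection_info` block (the field names follow the standard collection MANIFEST.json format; the helper name here just echoes the outline):

```python
# Sketch of load_collection_manifest: read MANIFEST.json out of the
# extract dir and return the ns/name/version it declares.
import json
import os


def load_collection_manifest(extract_dir):
    with open(os.path.join(extract_dir, "MANIFEST.json")) as f:
        manifest = json.load(f)
    info = manifest["collection_info"]
    return info["namespace"], info["name"], info["version"]
```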
    - optionally populate collection docs via ansible-doc via loaders.DocStringLoader .load()
      - run `ansible-doc` for each plugin type in ANSIBLE_DOC_SUPPORTED_TYPES
        ('become', 'cache', 'callback', 'cliconf', 'connection',
         'httpapi', 'inventory', 'lookup', 'shell', 'module', 'strategy', 'vars')
        - find the plugins of each type
        - run `ansible-doc` for each type with the list of all that type's plugins
          # For a large collection, this can be dozens if not hundreds of plugins,
          # which slows down ansible-doc, but is better than running ansible-doc hundreds of times
          # OPTIMIZE: Could probably run multiple ansible-docs in parallel, more or less one per
          # plugin type. Unknown if this would be faster/better, but very likely it would
          # be, since ansible-doc appears to be CPU bound
      - run _transform_doc_strings on the results of ansible-doc
        # _transform_doc_strings might be memory bound for large collections on small (ie, openshift)
        # images. If there are lots of docs, there may be multiple copies in memory.
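The OPTIMIZE idea above — one ansible-doc invocation per plugin type, fanned out in parallel — could be sketched with a thread pool. The runner callable is injectable here so the outline stays testable; in the real thing it would wrap a subprocess call to `ansible-doc`.

```python
# Sketch of the parallel-ansible-doc idea: submit one job per plugin
# type to a thread pool. run_one(plugin_type, names) stands in for the
# actual ansible-doc subprocess invocation.
from concurrent.futures import ThreadPoolExecutor

ANSIBLE_DOC_SUPPORTED_TYPES = (
    "become", "cache", "callback", "cliconf", "connection",
    "httpapi", "inventory", "lookup", "shell", "module", "strategy", "vars",
)


def run_ansible_doc_per_type(plugins_by_type, run_one):
    """plugins_by_type maps plugin_type -> list of fq plugin names."""
    results = {}
    with ThreadPoolExecutor(max_workers=len(ANSIBLE_DOC_SUPPORTED_TYPES)) as pool:
        futures = {
            ptype: pool.submit(run_one, ptype, names)
            for ptype, names in plugins_by_type.items()
            if names  # skip types with no plugins
        }
        for ptype, future in futures.items():
            results[ptype] = future.result()
    return results
```

Since the work is mostly waiting on child processes, threads are enough; CPU-bound in-process work would want multiprocessing instead.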
    - run load_contents to get info about contents (roles, plugins, etc)
      - ContentFinder().find_contents()  # tree walk to find content
      - for every found_content:  # this is a generator
        - get the correct loader cls
        - instantiate that loader_cls with the found_content info (content_type, path)
          # OPTIMIZE: may be possible to run the loader_cls.load()'s in parallel or via multiprocessing
        - PluginLoader:
          - runs flake8 on plugin content
        - RoleLoader:
          - loads role metadata
          - loads role README
          - optionally runs `ansible-lint` on the role content
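The loader dispatch above amounts to picking a loader class by content type and instantiating it with the found-content info. A sketch with placeholder class bodies (the real loaders do the flake8/ansible-lint work noted above):

```python
# Sketch of the loader dispatch in load_contents. Class bodies are
# placeholders; only the dispatch shape is the point here.
class PluginLoader:
    def __init__(self, content_type, path):
        self.content_type = content_type
        self.path = path


class RoleLoader(PluginLoader):
    pass


def get_loader_cls(content_type):
    # roles get RoleLoader; everything else is treated as a plugin here
    return RoleLoader if content_type == "role" else PluginLoader


def load_found_content(found_contents):
    # found_contents may be a generator of (content_type, path) pairs
    for content_type, path in found_contents:
        yield get_loader_cls(content_type)(content_type, path)
```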
    - build_contents_blob
      - instantiate a list of ResultContentItem's based on load_contents()
    - build_docs_blob
      - if ansible-doc was used, instantiate DocsBlob, DocsBlobContentItem, RenderedDocFile
      - use markup_utils to "render" the docs to html
      - composite all the docs blobs into a DocsBlob
  - build an ImportResult based on loaded metadata, docs_blob, and contents
  - use the runners.get_runner factory to get the optional ansible_test_runner
    - no runner
    - openshift test runner
    - local docker image test runner
    - local ansible-test runner
  - run the ansible_test_runner
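The get_runner factory presumably keys off the config flags described earlier. A sketch of that selection logic (the flag-to-runner mapping is an assumption inferred from the outline; strings stand in for the runner classes):

```python
# Sketch of a runners.get_runner-style factory: config flags pick one of
# the four runner flavors from the outline. Return values are placeholder
# names, not the importer's actual classes.
def get_runner(run_ansible_test, infra_osd, ansible_test_local_image):
    if not run_ansible_test:
        return None  # no runner
    if infra_osd:
        return "OpenshiftJobTestRunner"  # openshift test runner
    if ansible_test_local_image:
        return "LocalImageTestRunner"  # local docker image test runner
    return "LocalAnsibleTestRunner"  # local ansible-test runner
```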
    - local_ansible_test_runner:
      - more or less runs a subprocess of 'ansible-test sanity --docker'
    - local docker image:
      - build a docker image via builders.local_image_build.Build
      - run the image
        - subprocess of 'docker run image_id'
    - openshift runner:
      # There are a lot more details here than this alludes to
      - build a docker image via OpenshiftJobTestRunner
        - lots of openshift stuff here
        - populate a build template
        - push the build template to the openshift REST API
          - amongst other things, setup http auth credentials
          - etc
      - create an openshift Job
        - populate template
        - hit the openshift rest api
        - etc
      - run / start the openshift job
      - wait for it to finish
  - run any 'post_load_plugins'
  - optionally print_result
  - write_output_file
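The final print/write steps could be as simple as serializing the ImportResult to JSON, optionally echoing it, and writing it to disk. A sketch (the function shape and output format are assumptions):

```python
# Sketch of the print_result / write_output_file steps: dump the import
# result (here just a dict) as JSON, optionally printing it first.
import json


def write_output_file(result, path, print_result=False):
    text = json.dumps(result, indent=2, sort_keys=True)
    if print_result:
        print(text)
    with open(path, "w") as f:
        f.write(text)
    return text
```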