Tudat Developer Documentation#
Welcome to the Tudat Developer Documentation. Browse the latest developer documentation, including tutorials, sample code, articles and external API references.
Development Operations#
DevOps is a set of practices that combines software development (Dev) and IT operations (Ops) [1]. Tudat relies on open source solutions for development: GitHub for its hosted version control system (VCS), Conda for its package manager, and Azure for its continuous integration (CI) service.
Todo
Discuss with team whether learning goals should be presented at this level or whether a clean practical use case of the section should be presented (e.g. Package Management: conda install tudatpy, and what goes on behind the scenes to make this possible?).
Understand the differences between Git and GitHub.
Clone and manage repositories from GitHub.
Understand the Tudat Git workflow.
Make your first contribution to Tudat.
Understand that “dependency hell” exists and how to avoid it.
Understand MAJOR.MINOR.PATCH and when each of them is bumped.
Understand the order of precedence for version labels.
Obtain an overview of the Rever activities relevant to a Tudat developer.
Understand what happens when rever <new_version_number> is executed.
Be comfortable with the News Workflow for automated changelogs.
This section describes the processes and tools that go into the following executable command using Conda:
conda install tudatpy # -c tudat-team
Understand what conda-forge is, and what its role is with respect to Conda packages.
Understand how conda-smithy builds, tests and packages across target OSes.
Know how to inspect build statuses of packages on Azure and re-trigger them.
Understand common problems encountered in this workflow and how to resolve them.
Code Collaboration#
This section describes the use of Git and GitHub in relation to standard Tudat developer use cases. This chapter does not serve as a complete tutorial on Git, but rather presents common practices and fundamentals of the Git workflow (or Gitflow) used by a Tudat Developer.
Note
Git comes with the Anaconda and Miniconda distributions by default. If you are using Conda, then Git is already available in your Conda command-line interface (specific to your OS).
Audience | Prerequisite(s) | Time
Git Beginner | n/a | ~ 10 minutes
Learning Objectives
Understand the differences between Git and Github.
Clone and manage repositories from Github.
Understand the Tudat Git workflow.
Make your first contribution to Tudat.
Cloning a Repository#
Cloning a repository is a straightforward process. Each repository on GitHub has a green Code button. After clicking this button, you can then choose to “Clone with HTTPS” by clicking the clipboard icon under the dropdown and executing:
git clone <repository_url>
The primary version of the repository source code will be cloned.
Note
The master branch has commonly served as this primary version of a repository, though GitHub has since changed this default to main as part of a broader move toward more inclusive naming. master and main will be used interchangeably in this document, where images or repositories have not been updated.
Todo
Update all graphics, documentation and repositories to the new default main.
Try it yourself!
After entering your desired directory with the cd command, clone the developer-primer repository [2]:
$ git clone https://github.com/tudat-team/developer-primer.git
You now have a local repository set to the main branch of the remote developer-primer repository 🎉
developer-primer
├── .authors
│ ├── AUTHORS
│ ├── .authors.yml
│ └── .mailmap
├── bibtex.bib
├── CHANGELOG.rst
├── docs
│ ├── build
│ ├── make.bat
│ ├── Makefile
│ └── source
├── environment.yaml
├── .gitignore
├── LICENSE
├── news
│ └── TEMPLATE.rst
├── README.rst
├── rever.xsh
└── source
└── tree_trunk.txt
There’s a lot going on in the repository structure; don’t be overwhelmed. By the end of the Primer, you will have all the knowledge required to navigate it like a pro Tudat Developer.
Develop and Master Branches#
Instead of a single master branch, this workflow uses two branches to record the history of the project. The master branch stores the official release history, and the develop branch serves as an integration branch for features. It’s also convenient to tag all commits in the master branch with a version number. [3]
The first step is to complement the default master with a develop branch. A simple way to do this is for one developer to create an empty develop branch locally and push it to the server (remote):
git branch develop
git push -u origin develop
Note
The -u flag simply tells Git to track the newly created remote branch.
This branch will contain the complete history of the project, whereas master will contain an abridged version. Other developers should now clone the central repository and create a tracking branch for develop. If you form part of this group (i.e. a develop branch already exists on the remote), you can create a tracking branch for develop by executing:
git checkout --track origin/develop
A tracking branch simply means that you have a local version of a branch that is connected to an existing remote version. This relationship is invaluable as it provides two major benefits:
- Pushing and pulling becomes a lot easier:
git push origin develop is replaced by the shorthand git push
git pull origin develop is replaced by the shorthand git pull
Git will now inform you about “unpushed” and “unpulled” commits.
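To see the tracking relationship end-to-end without touching any real remote, here is a minimal, self-contained sketch using a throwaway local "remote". All paths and the user identity below are illustrative placeholders, not part of the Tudat repositories.

```shell
# Create a throwaway bare repository to act as the "remote" (illustrative path).
git init --bare /tmp/primer-remote.git
git clone /tmp/primer-remote.git /tmp/primer
cd /tmp/primer

# An empty commit so the default branch exists (the identity is a placeholder).
git -c user.name=dev -c user.email=dev@example.com commit --allow-empty -m "init"
git push -u origin HEAD     # -u records the remote branch as this branch's upstream

# Publish a develop branch and track it, as in the workflow above.
git branch develop
git push -u origin develop

git branch -vv              # each tracking branch lists its upstream in brackets
```

With the upstream recorded, the bare `git push` and `git pull` shorthands work from either branch.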
Try it yourself!
With the developer-primer repository [2] cloned, check what branches exist on the remote:
$ git branch -r
origin/HEAD -> origin/main
origin/develop
origin/main
You can think of HEAD as the “current branch”. The output above shows that there is indeed a develop branch available on the remote.
Let’s create a local tracking branch:
$ git checkout --track origin/develop
Branch 'develop' set up to track remote branch 'develop' from 'origin'.
Switched to a new branch 'develop'
Congratulations, you are now on your local version of the develop branch, which is tracking the remote version of develop 🎉
Feature Branches#
Each new feature should reside in its own branch, which can be pushed to the central repository for backup/collaboration. But, instead of branching off of master, feature branches use develop as their parent branch. When a feature is complete, it gets merged back into develop. Features should never interact directly with master. [3]
Note
Note that feature branches combined with the develop branch is, for all intents and purposes, the Feature Branch Workflow. But the Gitflow Workflow doesn’t stop there: feature branches are generally created off the latest develop branch.
Creating a feature branch#
$ git checkout develop
$ git checkout -b feature/name
Or, equivalently, using the git-flow extension:
$ git flow feature start feature_name
Continue your work and use Git as demonstrated beforehand.
Try it yourself!
With the developer-primer repository [2], ensure that the develop branch is checked out, and create a new local feature branch with your GitHub username as the feature name.
$ git checkout develop
Already on 'develop'
Your branch is up to date with 'origin/develop'.
$ git checkout -b feature/ggarrett13_was_here
Switched to a new branch 'feature/ggarrett13_was_here'
After creating a feature branch appropriate for the planned work, carry out the work! Just for the sake of example in modifying the source of a project, carve “<your_github_name> was here!” into the tree trunk contained in the source directory, using the command:
Note
This is just some arbitrary example “work” of modifying a project’s source code, not a convention or standard.
$ echo "ggarrett13 was here!" >> source/tree_trunk.txt
Your message will be appended to the bottom of tree_trunk.txt:
source/tree_trunk.txt
#----- This is a tree trunk -----
ggarrett13 was here!
Stage source/tree_trunk.txt to be committed:
$ git add source/tree_trunk.txt
Finally, commit the changes made to your feature branch:
$ git commit -m "ggarrett13 was here!"
[feature/ggarrett13_was_here 6810969] ggarrett13 was here!
1 file changed, 1 insertion(+)
You’re all set to leave your first mark on the Tudat Space community.
Finishing a feature branch#
When you’re done with the development work on the feature, the next step is to merge feature/name into develop.
$ git checkout develop
$ git merge feature/name
Try it yourself!
Continuing with the developer-primer repository [2], check out the develop branch in your local repository and merge your feature into it.
$ git checkout develop
Switched to branch 'develop'
Your branch is up to date with 'origin/develop'.
$ git merge feature/ggarrett13_was_here
Updating e2285f3..6810969
Fast-forward
source/tree_trunk.txt | 1 +
1 file changed, 1 insertion(+)
Finally, push the changes to the remote:
$ git push
Counting objects: 4, done.
Delta compression using up to 8 threads.
Compressing objects: 100% (3/3), done.
Writing objects: 100% (4/4), 364 bytes | 364.00 KiB/s, done.
Total 4 (delta 1), reused 0 (delta 0)
remote: Resolving deltas: 100% (1/1), completed with 1 local object.
To https://github.com/tudat-team/developer-primer.git
e2285f3..6810969 develop -> develop
Congratulations, you’ve just officially made your first mark on the Tudat Space community as a Tudat Developer! 🎉
Release Branches#
Todo
Complete Release Branches section. Currently not a common part of the Tudat Developer workflow, but will be soon!
Hotfix Branches#
Todo
Complete Hotfix Branches section. Currently not a common part of the Tudat Developer workflow, but will be soon!
Chapter Summary
Gitflow Workflow [3]
A develop branch is created from master
A release branch is created from develop
feature branches are created from develop
When a feature is complete it is merged into the develop branch
When the release branch is done it is merged into develop and master
If an issue in master is detected, a hotfix branch is created from master
Once the hotfix is complete, it is merged into both develop and master
Release Versioning#
The release versioning section describes the workflow and tools used for release versioning as a Tudat Developer.
Learning Objectives
Understand that “dependency hell” exists and how to avoid it.
Understand MAJOR.MINOR.PATCH and when each of them is bumped.
Understand the order of precedence for version labels.
Obtain an overview of the Rever activities relevant to a Tudat developer.
Understand what happens when rever <new_version_number> is executed.
Be comfortable with the News Workflow for automated changelogs.
Semantic Versioning#
Tom Preston-Werner originally proposed a simple set of rules and requirements that provide a convention for modifying the versioning of software packages [4]. The opening paragraph introduces the concept of dependency hell:
In the world of software management there exists a dreaded place called “dependency hell.” The bigger your system grows and the more packages you integrate into your software, the more likely you are to find yourself, one day, in this pit of despair.
This chapter relays the Semantic Versioning (SemVer) 2.0.0 convention, in an effort to keep every developer out of “dependency hell”. It is further mentioned that the proposed system will only work with an API declaration:
For this system to work, you first need to declare a public API. This may consist of documentation or be enforced by the code itself. Regardless, it is important that this API be clear and precise. Once you identify your public API, you communicate changes to it with specific increments to your version number.
SemVer 2.0.0 can be summarised by the following set of rules. Given a version number MAJOR.MINOR.PATCH, increment the:
MAJOR version when you make incompatible API changes,
MINOR version when you add functionality in a backwards-compatible manner, and
PATCH version when you make backwards-compatible bug fixes.
Additional labels for pre-release and build metadata are available as extensions to the MAJOR.MINOR.PATCH format. SemVer only focuses on API compatibility; however, there are common labels appended to the semantic version, for example 1.0.0-alpha. The list of requirements for label formatting is detailed in SemVer. The important takeaway is that precedence is alphanumeric:
Precedence: 1.0.0-alpha < 1.0.0-alpha.1 < 1.0.0-alpha.beta < 1.0.0-beta < 1.0.0-beta.2 < 1.0.0-beta.11 < 1.0.0-rc.1 < 1.0.0.
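The pre-release precedence rules (numeric identifiers compare numerically and rank below alphanumeric ones; a tag that is a prefix of a longer tag ranks below it; a version with no pre-release tag ranks highest) can be reproduced in a few lines of Python. This is an illustrative comparator only, not part of any Tudat tooling:

```python
# Minimal sketch of SemVer pre-release precedence (illustrative, not a full parser).
def prerelease_key(tag):
    """Sort key for a pre-release tag such as 'alpha.1' or 'rc.2'.

    An empty tag means a final release, which sorts after every pre-release.
    Numeric identifiers sort before alphanumeric ones and compare numerically.
    """
    if tag == "":                             # final release: highest precedence
        return (1,)
    parts = []
    for ident in tag.split("."):
        if ident.isdigit():
            parts.append((0, int(ident), ""))  # numeric identifiers rank lower
        else:
            parts.append((1, 0, ident))        # alphanumeric compare lexically
    return (0, parts)

# The precedence chain from the text, with "" standing in for plain 1.0.0:
versions = ["alpha", "alpha.1", "alpha.beta", "beta", "beta.2", "beta.11", "rc.1", ""]
assert versions == sorted(versions, key=prerelease_key)
```

Note in particular that beta.2 < beta.11 holds because numeric identifiers compare as numbers, not as strings.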
Example
A proposed guided example flow is as follows:
alpha = in development, without caring about (old) unit tests
beta = in development, and the old (unit) tests are valid
rc.1 = tests for new features are written and valid
rc.2 = additional tests had to be written and those were made valid
rc.3 = more additional tests that were made valid
However, this is just an example flow, not a further set of rules. Depart from it whenever it improves your workflow.
Break any of these rules sooner than say anything outright barbarous.
—George Orwell, “Politics and the English Language”
Rever: Releaser of Versions!#
Installation
Using conda:
conda install rever -c conda-forge
Using pip:
pip install re-ver
Rever is a xonsh-powered (a language hybrid between bash script and Python), cross-platform software release tool. It automates standard activities that are carried out during a new release. It is important to be aware of these activities, as they are not only relevant at the moment of release. The tasks relevant as a Tudat Developer are:
authors
version_bump
changelog
tag
push_tag
bibtex
These tasks are elaborated upon one by one in the following subsections. Note that Rever will most likely already be set up in the repositories you encounter, so the explicit procedure for initializing Rever is not covered here, though the relevant content is.
Example
Inside the developer-primer
[2] repository used in Code Collaboration,
you will find files that are used to configure Rever and some that are
autogenerated or updated when executing a release.
developer-primer
├── .authors
│   ├── AUTHORS
│   ├── .authors.yml
│   └── .mailmap
├── bibtex.bib
├── CHANGELOG.rst
├── docs
│   ├── build
│   ├── make.bat
│   ├── Makefile
│   └── source
├── environment.yaml
├── .gitignore
├── LICENSE
├── news
│   └── TEMPLATE.rst
├── README.rst
├── rever.xsh
└── source
    └── tree_trunk.txt
The highlighted components of the repository relate to Rever configuration and activities. Grouped by their activity:
Activity | Components
authors | .authors/AUTHORS, .authors/.authors.yml, .authors/.mailmap
version_bump | README.rst, docs/source/conf.py, docs/source/index.rst
changelog | CHANGELOG.rst, news/TEMPLATE.rst
bibtex | bibtex.bib
Finally, rever.xsh is the configuration file for Rever.
rever.xsh#
The starting point for setting up or maintaining release versioning with Rever is the configuration file rever.xsh. As noted in the introduction, Rever uses xonsh, a language hybrid between bash script and Python. There’s a good chance that if you know either of these, or both, you will feel right at home. The following rever.xsh file is a slimmed-down version of the rever package’s release configuration.
rever.xsh
$PROJECT = 'rever'
$ACTIVITIES = [
'version_bump', # Changes the version number in various source files (setup.py, __init__.py, etc)
'changelog', # Uses files in the news folder to create a changelog for release
'tag', # Creates a tag for the new version number
'push_tag', # Pushes the tag up to the $TAG_REMOTE
'pypi', # Sends the package to pypi
'conda_forge', # Creates a PR into your package's feedstock
'ghrelease' # Creates a Github release entry for the new tag
]
$CHANGELOG_FILENAME = 'CHANGELOG.rst' # Filename for the changelog
$CHANGELOG_TEMPLATE = 'TEMPLATE.rst' # Filename for the news template
This configuration demonstrates a basic setup for Rever. The variables
$PROJECT
and $ACTIVITIES
are mandatory. Some activities may fail
without further variable declarations. The following sections will elaborate
sufficiently on some of the variables relevant to a Tudat Developer’s workflow.
Note
Rever has a well maintained, easy to read, explanation on all the options available for each activity in their activities documentation.
Example
Inside the developer-primer [2] repository, the following configuration is used:
developer-primer/rever.xsh
$PROJECT = 'developer-primer'
$ACTIVITIES = [
    'version_bump',
    'authors',
    'changelog',
    'tag',
    'push_tag',
    'bibtex'
]

# VersionBump related ------------------------------------------------------- #
$VERSION_BUMP_PATTERNS = [
    ('README.rst', r'\sVersion:\*\*\s.*', '\sVersion:** $VERSION'),
    ('docs/source/conf.py', r'release\s=\s.*', "release = '$VERSION'"),
    ('docs/source/index.rst', r'\sVersion:\*\*\s.*', '\sVersion:** $VERSION'),
]

# Authors related ----------------------------------------------------------- #
$AUTHORS_DIR = ".authors"  # this is custom
$AUTHORS_FILENAME = $AUTHORS_DIR + '/' + 'AUTHORS'
$AUTHORS_TEMPLATE = '\n{authors}\n'
$AUTHORS_METADATA = $AUTHORS_DIR + '/' + '.authors.yml'
$AUTHORS_MAILMAP = $AUTHORS_DIR + '/' + '.mailmap'

# Changelog related --------------------------------------------------------- #
$CHANGELOG_FILENAME = 'CHANGELOG.rst'  # Filename for the changelog
$CHANGELOG_TEMPLATE = 'TEMPLATE.rst'   # Filename for the news template

# BibTex related ------------------------------------------------------------ #
$BIBTEX_AUTHORS = 'G.H. Garrett'
$BIBTEX_URL = 'https://github.com/tudat-team/developer-primer'

# PushTag related ----------------------------------------------------------- #
$PUSH_TAG_REMOTE = 'git@github.com:tudat-team/developer-primer.git'
version_bump#
The version_bump activity uses an environment variable $VERSION_BUMP_PATTERNS, which is of the form List[Tuple[str, str, str]]. Each tuple defines a file path, a regular expression (regex) pattern, and a replacement string. The regex match(es) in the specified file will be replaced by the desired string.
$VERSION_BUMP_PATTERNS = [
("file_path", r"regex_pattern", "replace_with"),
...
]
The use of regex is minimal and in most cases you can use examples in existing repositories.
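As an illustration of what a single pattern tuple does, the conf.py-style pattern boils down to one re.sub call, with $VERSION already substituted in by Rever. This is a conceptual sketch, not Rever's actual implementation; the file contents are invented for the example:

```python
# Sketch of one version-bump pattern application (not Rever's real code).
import re

# Pretend contents of docs/source/conf.py before the bump (illustrative).
conf_py = "project = 'developer-primer'\nrelease = '1.0.0'\n"

# (pattern, replacement) as they would appear after Rever fills in $VERSION.
pattern = r"release\s=\s.*"
replacement = "release = '1.1.0'"

bumped = re.sub(pattern, replacement, conf_py)
print(bumped)  # the release line now carries the new version
```

Each tuple in $VERSION_BUMP_PATTERNS applies one such substitution to one file.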
Tip
A very polished resource for testing regex, which even allows exporting code in your preferred language, is regular expressions 101.
Example
Inside the developer-primer [2] repository, the following version bump patterns are used:
developer-primer/rever.xsh
$VERSION_BUMP_PATTERNS = [
    ('README.rst', r'\sVersion:\*\*\s.*', '\sVersion:** $VERSION'),
    ('docs/source/conf.py', r'release\s=\s.*', "release = '$VERSION'"),
    ('docs/source/index.rst', r'\sVersion:\*\*\s.*', '\sVersion:** $VERSION'),
]
Todo
@team, does this need further elaboration?
changelog#
Todo
Complete the changelog subsection.
tag#
Todo
Complete the tag subsection.
push_tag#
Todo
Complete the push_tag subsection.
bibtex#
Todo
Complete the bibtex subsection.
Rever commands#
Command | Description
rever setup | Generates activity support files.
rever check | Check activities.
rever <MAJOR.MINOR.PATCH> | Executes all activities for release.
News Workflow#
One of the most helpful features of Rever is the changelog activity. This activity produces a changelog by collating news files. The changelog is written into the repo and can be used in the GitHub release activity.
Important
Ensure that you have at least one commit prior to executing rever <MAJOR.MINOR.PATCH>, otherwise you will not appear as an author in the changelog.
Go into the news/ directory.
Copy the TEMPLATE.rst file to another file in the news/ directory. We suggest using the branch name:
$ cp TEMPLATE.rst branch.rst
The news files are customizable in the rever.xsh file. However, the default template looks like:
**Added:**
* <news item>
**Changed:**
* <news item>
**Deprecated:**
* <news item>
**Removed:**
* <news item>
**Fixed:**
* <news item>
**Security:**
* <news item>
In this case you can remove the * <news item> placeholders and replace them with your own news entries, e.g.:
**Added:**
* New news template tutorial
**Changed:**
* <news item>
**Deprecated:**
* <news item>
**Removed:**
* <news item>
**Fixed:**
* <news item>
**Security:**
* <news item>
Commit your branch.rst.
Feel free to update this file whenever you want! Please don’t use someone else’s file name. All of the files in this news/ directory will be merged automatically at release time. The <news item> entries will be automatically filtered out too!
Once the project is ready for a release, running the rever command collates all the files in the news folder, except the template, and merges them into a single changelog file.
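Conceptually, that collation step can be sketched in a few lines of Python. This only mimics the behaviour described above; it is not Rever's implementation, and the directory layout and file contents are invented for the example:

```python
# Build a throwaway news/ directory and collate it the way the changelog
# activity conceptually does: every file except the template is merged.
import tempfile
from pathlib import Path

news = Path(tempfile.mkdtemp()) / "news"
news.mkdir()
(news / "TEMPLATE.rst").write_text("**Added:**\n\n* <news item>\n")
(news / "my-branch.rst").write_text("**Added:**\n\n* New news template tutorial\n")

# Collate all news files, skipping the template itself.
entries = [p.read_text() for p in sorted(news.glob("*.rst")) if p.name != "TEMPLATE.rst"]
changelog_entry = "\n".join(entries)
print(changelog_entry)
```

The real activity additionally filters leftover placeholder entries and prepends the result to CHANGELOG.rst.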
Todo
Example admonition adding a topic of interest to developer-primer/docs/interests.yaml, then following the news workflow to inform other developers. This is then concluded with the developer executing their first release.
Package Management#
Learning Objectives
This section describes the processes and tools that go into the following executable command using Conda:
conda install tudatpy # -c tudat-team
Conda#
Conda is an open source package manager that runs on Windows, macOS and Linux. It is popular in the Python community, as it was originally intended primarily for Python programs, but because it can package and distribute software for any language, its use has grown significantly.
Nomenclature
- Environment
A directory that contains a specific collection of conda packages that you have installed.
- Anaconda
A distribution of the Python and R programming languages for scientific computing (data science, machine learning applications, large-scale data processing, predictive analytics, etc.) [5]
- Bootstrap
A bootstrap is the program that initializes the operating system (OS) during startup (Only relevant for next nomenclature item).
- Miniconda
Miniconda is a free minimal installer for conda. It is a small, bootstrap version of Anaconda that includes only conda, Python, the packages they depend on, and a small number of other useful packages, including pip, zlib and a few others. [6]
- Recipe
A Conda-build recipe is a flat directory that contains a specific collection of files which defines a package’s dependencies, description (branding), build procedure (when applicable) and test procedure.
- Pinning
Pinning dependencies refers to explicitly defining the versions of software that your application depends on. The high-level goal is to “freeze” dependencies so that subsequent builds/deployments are repeatable.
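For example, pinning can be expressed directly in a Conda environment file. The package names and version numbers below are purely illustrative:

```yaml
# Illustrative environment file with pinned dependencies (hypothetical versions)
name: example-env
channels:
  - conda-forge
dependencies:
  - python =3.8             # pinned to the 3.8 series
  - numpy ==1.21.2          # pinned to an exact version
  - boost-cpp >=1.72,<1.73  # pinned to a compatible range
```

The stricter the pin, the more repeatable the build, at the cost of receiving fewer upstream fixes automatically.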
Conda-build#
Conda-build is a package that contains commands and tools to use conda to build your own conda packages. It also provides helpful tools to constrain or pin versions in recipes. At the heart of Conda-build is the concept of a Conda-build recipe:
recipe
├── bld.bat
├── build.sh
├── meta.yaml
└── run_test.py
Each file in the Conda-build recipe has a specific responsibility in creating a conda package. The responsibilities are as follows:
File | Description
meta.yaml | A file that contains all the metadata in the recipe. Only the package name and version are required.
build.sh | The script that installs the files for the package on macOS and Linux. It is executed using the bash command.
bld.bat | The build script that installs the files for the package on Windows. It is executed using cmd.
run_test.py | An optional Python test file; a test script that runs automatically if it is part of the recipe. Optional patches may also be applied to the source.
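Putting the table together, a minimal meta.yaml might look like the sketch below. The package name, version, URL and dependencies are hypothetical and only meant to show where each piece of metadata lives:

```yaml
# Hypothetical minimal recipe metadata (meta.yaml)
package:
  name: mypkg
  version: "1.0.0"

source:
  url: https://example.com/mypkg-1.0.0.tar.gz   # placeholder archive

requirements:
  host:
    - python
  run:
    - python
    - numpy

test:
  commands:
    - python -c "import mypkg"    # smoke test run after the build

about:
  license: BSD-3-Clause
  summary: A short description of the package.
```

Running conda-build against the recipe directory produces a package that can be uploaded to a channel such as tudat-team.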
Continuous Deployment#
Learning Objectives
Understand what conda-forge is, and what its role is with respect to Conda packages.
Understand how conda-smithy builds, tests and packages across target OSes.
Know how to inspect build statuses of packages on Azure and re-trigger them.
Understand common problems encountered in this workflow and how to resolve them.
The previous section described the concepts that underpin the creation of a Conda package. This section will describe how packages are built, tested and deployed using an open source continuous deployment (CD) system built by the conda-forge community: conda-smithy.
Note
conda-smithy can also be used as a continuous integration (CI) and continuous delivery system.
Attention
Please ensure you use the recommended command to install conda-smithy. If it is not installed in the base environment, you may need to reinstall Miniconda/Anaconda from scratch.
Todo
Elaborate on osx_arm64 cross-compiled build variants. (This may only apply to the private tudat-team channel?)
Azure#
Microsoft Azure, commonly referred to as Azure, is a cloud computing service created by Microsoft for building, testing, deploying, and managing applications and services through Microsoft-managed data centers.
Software Documentation#
Sphinx Documentation#
sudo apt-get install texmaker gummi texlive texlive-full texlive-latex-recommended latexdraw intltool-debian lacheck libgtksourceview2.0-0 libgtksourceview2.0-common lmodern luatex po-debconf tex-common texlive-binaries texlive-extra-utils texlive-latex-base texlive-latex-base-doc texlive-luatex texlive-xetex texlive-lang-cyrillic texlive-fonts-extra texlive-science texlive-latex-extra texlive-pstricks
Todo
Link checking is facilitated by Sphinx using make linkcheck (on Windows).
Add section on FontAwesome inline icons from sphinx-panels.
Add tutorial/section on maintaining a bibliography in Sphinx.
Compile documentation with Sphinx#
This example is a step-by-step guide on how to compile the Tudat documentation locally on your system using sphinx. This procedure works for compiling the documentation of both tudat-space and the documentation you are currently reading.
Note
This procedure requires that Anaconda or Miniconda is installed. For information regarding the use of the conda ecosystem, please see Getting Started with Conda.
Create an environment that satisfies all dependencies required for building documentation, then activate it. This can be done by downloading this environment.yaml (yaml), which will install the tudat-docs conda environment.
conda env create -f environment.yaml && conda activate tudat-docs
Enter the root directory of a repository containing a docs directory, which contains a source subdirectory. The following command is specific to cloning and entering the tudat-space repository.
git clone https://github.com/tudat-team/tudat-space.git && cd tudat-space
Build the documentation using the sphinx-build command, specifying that HTML is to be built with the supplied source and output build directories.
sphinx-build -b html docs/source docs/build
View the local build of the documentation by opening docs/build/index.html with your preferred browser.
Tip
[PyCharm/CLion] You can do this by right-clicking index.html in the Project tree and selecting Open with Browser.
Compiling Documentation in PyCharm#
If you are using PyCharm, the compilation of the documentation after each edit can be simplified by setting up a run configuration tailored for sphinx. The procedure is described below.
From the main toolbar, click on Run > Edit Configurations;
In the window that has just opened, click on the + button (upper-left) to add a new configuration;
From the drop-down menu, select Python docs > Sphinx task;
Give a name to the new run configuration;
Make sure that the field Command is set to html;
For the input and output fields, select the source and build folders respectively.

Make sure that the correct run configuration is selected. If so, pressing Run will be equivalent to executing the following command from the command line:
sphinx-build -b html docs/source docs/build
Release new versions of the docs#
Every time you make a modification to the documentation, you are required to:
update the CHANGELOG.md
release a new version of the documentation
While updating the changelog is quite straightforward, the process needed to release a new version deserves some explanation.
Versioning with readthedocs#
Releasing a new version of the documentation is simple. To do this, we rely on bumpversion, which in turn uses semantic versioning (or SemVer). Semantic versioning relies on the following structure for stable releases: MAJOR.MINOR.PATCH (e.g., 1.3.1). For unstable releases, the same syntax is used with an additional tag, such as MAJOR.MINOR.PATCH.devBUILD (e.g., 1.3.1.dev2).
See also
Read more on how readthedocs deals with versions.
Once you have committed your changes, you can release a new version by typing one of the following commands in the terminal:
bumpversion patch: increases the patch number (the same can be done with bumpversion major or bumpversion minor), e.g.:
1.1.1 -> 1.1.2.dev0
1.1.2.dev0 -> 1.1.3.dev0
1.1.2.dev1 -> 1.1.3.dev0
bumpversion dev: increases the build number, e.g.:
1.1.2.dev0 -> 1.1.2.dev1
1.1.2 -> ❌ This will break: the patch must be bumped to start a dev suffix.
bumpversion release: releases a stable version, e.g.:
1.1.2.dev0 -> 1.1.2
1.2.0.dev0 -> 1.2.0
bumpversion creates a dedicated commit every time it is executed and tags that commit with the version number.
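This behaviour is driven by the repository's bumpversion configuration. An assumed, simplified .bumpversion.cfg for the MAJOR.MINOR.PATCH.devBUILD scheme could look like the sketch below; check the repository's own configuration file for the authoritative settings:

```ini
; Assumed, simplified .bumpversion.cfg for the X.Y.Z[.devN] scheme
[bumpversion]
current_version = 1.1.2.dev0
commit = True
tag = True
parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(\.(?P<release>[a-z]+)(?P<dev>\d+))?
serialize =
    {major}.{minor}.{patch}.{release}{dev}
    {major}.{minor}.{patch}

[bumpversion:part:release]
optional_value = release
values =
    dev
    release
```

The optional_value setting is what lets stable versions omit the dev suffix entirely when serialized.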
Once the commits are pushed to the main branch on origin, the documentation is built by readthedocs.
Readthedocs uses the tags to build different versions of the documentation, with two additional versions:
latest (corresponding to the latest commit on main)
stable (corresponding to the most recent version released)
Depending on whether the release is stable or unstable, different things happen:
if the release is stable, the resulting documentation is published on the website and a new version will be visible in the readthedocs menu;
if the release is unstable, the resulting documentation will not be built nor published on the website.
Unpublished (or “hidden”) versions can still be activated (by authorized users) to be viewed online (and shared with others through a link) by clicking on the readthedocs menu, selecting “Builds”, then “Versions”, and activating the desired build.

Clicking on the right build allows you to see it in the browser and copy the related link to share with collaborators. This is particularly useful for sharing drafts of the output documentation without modifying stable versions.
Warning
If the changes are pushed to other branches, no documentation is built.
Troubleshooting#
In this section, we collect the most recurrent bugs that can happen while using sphinx, hoping that it will save future Tudat contributors precious time.
No changes shown in browser#
It happens often that the browser shows cached data instead of the updated html files. As a result, if you don’t see your changes, try to empty/delete the cache of your browser (see, e.g., this guide).
No changes shown in online docs#
It can happen that, after pushing your changes to the origin repository, no changes are shown on the actual website (e.g., on tudat-space or on this website). Some suggestions to identify the problem:
Check that you pushed to the main branch. The documentation is built by readthedocs only if changes are pushed to that branch.
Check that the build was successful. This can be monitored via the “Builds” link in the readthedocs menu (see screenshot above). If the build was not successful, you can click on it and see the output of the build. This can be helpful to identify where things are going wrong.

Sphinx commands not working#
If a sphinx command does not work, for instance the following:
.. toctree::
intro
guide
it can be due to many things, but before going into full debugging mode, check that the indentation before intro and guide corresponds to exactly three spaces. Sphinx requires three spaces, but the Tab key inserts four: if you use it in Sphinx directives, the extra white space will break the directive, and this is very difficult to notice.
To be clear, this will likely not work:
.. toctree::
intro
guide
API Documentation#
Multidoc is a tool purposed towards improving the maintainability and consistency of docstrings in software that is available across multiple programming languages with fixed, language-equivalent APIs.
Nomenclature
Application Programming Interface (API): An interface that defines interactions between multiple software applications or mixed hardware-software intermediaries.
YAML: (recursive acronym for “YAML Ain’t Markup Language”) A human-readable data-serialization language.
Jinja2: Jinja is a modern and designer-friendly templating language for Python. It is fast, widely used and secure.
API Structure Definition#
At the core of Multidoc is the API Structure Definition. This directory contains all the information required to constrain the structure across all Multidoc-supported languages (C++, pybind and Python as of now).
definition/
├── __api__.yaml
├── module1.yaml
├── module2
│ ├── __module__.yaml
│ └── submodule1.yaml
└── module3
├── __module__.yaml
└── submodule2
└── subsubmodule.yaml
The concepts can be broken down into the following elements:

Element | Description
__api__.yaml | API configuration file. Must exist in the API structure prefix.
<module>.yaml | Module configuration file. Module definition as a file implicitly infers no submodules.
<module>/ | Module configuration directory. Must contain __module__.yaml.
<submodule>.yaml | Submodule configuration file. Equivalent to a module configuration file.
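To make these rules concrete, the following sketch (illustrative only, not multidoc's actual implementation) walks a definition directory like the one above and classifies each YAML file accordingly:

```python
# Illustrative sketch (not multidoc's actual implementation): classify the
# YAML files of an API structure definition directory per the rules above.
from pathlib import Path

def classify_definition(definition_dir):
    """Map each YAML path (relative to the prefix) to its element kind."""
    kinds = {}
    for path in sorted(Path(definition_dir).rglob("*.yaml")):
        relative = path.relative_to(definition_dir)
        if path.name == "__api__.yaml":
            kinds[relative.as_posix()] = "api"
        elif path.name == "__module__.yaml":
            kinds[relative.as_posix()] = "module (directory form)"
        elif len(relative.parts) == 1:
            # a module defined as a single file implies no submodules
            kinds[relative.as_posix()] = "module (file form, no submodules)"
        else:
            kinds[relative.as_posix()] = "submodule"
    return kinds
```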
Jinja2 template fragments, such as the following, are used to render the docstring fields defined in the YAML files:

{%- if short_summary -%}
{{ short_summary }}
{%- endif -%}
__api__.yaml
# package:
# name: tudat # [cpp]
name: tudatpy # [py]
modules:
- interface
- simulation
- conversion
Warning

The # [py/cpp] tag is currently for future use. Documentation is currently only generated for py. As a result, leave all # [cpp] tags present where needed, but comment those lines out, as done above.
module.yaml
# description: |
Provides interfaces for external API.
modules:
- spice
spice.yaml
# description: "This module provides an interface to the Spice package."
notes: "None"
functions:
- name: clear_kernels
short_summary: "Clear all loaded spice kernels."
extended_summary: |
This function removes all Spice kernels from the kernel pool.
Wrapper for the `kclear_c` function.
returns:
- type: None # [py]
# - type: void # [cpp]
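As a sketch of what is done with such an entry, the Jinja2 fragment shown earlier conditionally emits short_summary. The plain-Python imitation below (illustrative, not multidoc's actual renderer) applies the same logic to the clear_kernels entry:

```python
# Illustrative imitation of the Jinja2 template logic shown earlier
# (multidoc itself uses Jinja2; this is not its actual renderer).
def render_docstring(entry):
    """Join the docstring fields that are present, in numpydoc order."""
    parts = []
    if entry.get("short_summary"):
        parts.append(entry["short_summary"])
    if entry.get("extended_summary"):
        parts.append(entry["extended_summary"].strip())
    return "\n\n".join(parts)

# The clear_kernels entry from spice.yaml, as it would look once parsed.
clear_kernels = {
    "name": "clear_kernels",
    "short_summary": "Clear all loaded spice kernels.",
    "extended_summary": (
        "This function removes all Spice kernels from the kernel pool.\n"
        "Wrapper for the `kclear_c` function."
    ),
}

print(render_docstring(clear_kernels))
```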
Functions#
Note
The following list is for overview purposes. For a more detailed description of each section, with examples, please refer to numpydoc.
1. Short summary: A one-line summary that does not use variable names or the function name.
2. Deprecation warning: A section (use if applicable) to warn users that the object is deprecated.
3. Extended Summary: A few sentences giving an extended description. This section should be used to clarify functionality, not to discuss implementation detail or background theory, which should rather be explored in the Notes section below.
4. Parameters: Description of the function arguments, keywords and their respective types.
5. Returns: Explanation of the returned values and their types. Similar to the Parameters section, except the name of each return value is optional.
6. Yields: Explanation of the yielded values and their types. This is relevant to generators only. Similar to the Returns section in that the name of each value is optional, but the type of each value is always required.
7. Receives: Explanation of parameters passed to a generator's .send() method, formatted as for Parameters, above.
8. Other Parameters: An optional section used to describe infrequently used parameters. It should only be used if a function has a large number of keyword parameters, to prevent cluttering the Parameters section.
9. Raises: An optional section detailing which errors get raised and under what conditions.
10. Warns: An optional section detailing which warnings get raised and under what conditions, formatted similarly to Raises.
11. Warnings: An optional section with cautions to the user in free text/reST.
12. See Also: An optional section used to refer to related code. This section can be very useful, but should be used judiciously. The goal is to direct users to other functions they may not be aware of, or have easy means of discovering (by looking at the module docstring, for example). Routines whose docstrings further explain parameters used by this function are good candidates.
13. Notes: An optional section that provides additional information about the code, possibly including a discussion of the algorithm. This section may include mathematical equations, written in LaTeX format.
14. References: References cited in the Notes section may be listed here.
15. Examples: An optional section for examples, using the doctest format. This section is meant to illustrate usage, not to provide a testing framework; for that, use the tests/ directory. While optional, this section is very strongly encouraged.
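Put together, a docstring following these sections looks like the example below (the function and its values are made up for illustration; they are not taken from tudatpy):

```python
# Illustrative numpydoc-style docstring (function and values are made up
# for this example; they are not part of tudatpy).
def mach_number(velocity, speed_of_sound):
    """Compute the Mach number.

    The Mach number is the ratio of the flow velocity to the local
    speed of sound.

    Parameters
    ----------
    velocity : float
        Flow velocity [m/s].
    speed_of_sound : float
        Local speed of sound [m/s].

    Returns
    -------
    float
        Mach number (dimensionless).

    Examples
    --------
    >>> mach_number(680.0, 340.0)
    2.0
    """
    return velocity / speed_of_sound
```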
Classes#
Use the same sections as outlined above (all except Returns are applicable). The constructor (__init__) should also be documented here; the Parameters section of the docstring details the constructor's parameters.
Constants#
1. summary
2. extended summary (optional)
3. see also (optional)
4. references (optional)
5. examples (optional)
Modules#
1. summary
2. extended summary
3. routine listings
4. see also
5. notes
6. references
7. examples
Software Development#
Build System#
CMake#
Developer Environment#
The tudat-bundle build configuration allows developers to work on tudat and tudatpy simultaneously, for a smoother end-to-end development flow.
Note
This topic is relevant for:

- developers who want to expose their updated tudat code to the upcoming tudatpy package release,
- users who would like to extend tudatpy functionality locally via modification of the C++-based tudat source code,
- anybody interested in seeing a concurrent C++ / Python development workflow.
Learning Objectives

- Get your own tudat-bundle environment from the tudat-team.
- Understand the structure of the tudat-bundle and the purpose of its components.
- Familiarize yourself with the mapping between tudat and tudatpy source code.
- Understand the higher-level functions of the tudat-api.
- Familiarize yourself with the available build configurations for tudat and tudatpy.
- Know how to build the tudat-bundle and recognize some common problems that can be encountered.
Cloning tudat-bundle#

The tudat-bundle environment is available on the tudat-team GitHub repository.
Note
Detailed instructions for the download, setup and verification of your own tudat-bundle can be found in the repository's README (steps 1-4).
Warning
If your machine is running on an Apple M1 processor, you may have to follow a slightly different procedure. Please refer to this discussion.
Introduction to tudat-bundle#

The tudat-bundle consists of three subdirectories:
- tudat, containing the tudat C++ source code.
- tudatpy, containing the tudatpy/kernel directory, in which the exposure of C++ source code to the tudatpy package is facilitated.
- <build>, the build directory containing the compiled C++ tudat code (<build>/tudat), as well as the compiled tudatpy package at <build>/tudatpy/tudatpy/kernel.so.
The entirety of exposed C++ functionality in tudatpy is contained within the tudatpy/kernel source directory. For reference during this guide, the architecture of this directory is as follows:
Note
This module / submodule tree structure always aspires to mimic the structure of the tudat/src directory.
schematic tudatpy/kernel directory#

kernel
├── expose_<module_A>.cpp
├── expose_<module_A>.h
├── expose_<module_A>
│   ├── expose_<submodule_A1>.cpp
│   ├── expose_<submodule_A1>.h
│   ├── expose_<submodule_A2>.cpp
│   ├── expose_<submodule_A2>.h
│   └── ...
├── expose_<module_B>.cpp
├── expose_<module_B>.h
├── ...
└── kernel.cpp
Note
The terms Package/Module/Submodule are intended to be hierarchical descriptions, used mostly in the context of directory structure. In the Python interpreter, everything is treated as a module object.
The tudatpy Package#

The tudatpy package is a collection of modules in which the C++-based tudat source code is exposed through Python bindings.
Note
The interfaces between the C++-based tudat source code and the Python-based tudatpy modules are managed by the Pybind11 library. The rules for defining C++ to Python interfaces using Pybind11 will be presented in detail under Exposing C++ in Python.
In kernel.cpp (see schematic tudatpy/kernel directory), tudatpy modules are bundled into the tudatpy package. The following folded code shows the core elements of kernel.cpp. It would serve the reader to have a glance through it before we walk through the elements in detail.
tudatpy/kernel/kernel.cpp#

// expose tudat versioning
#include <tudat/config.hpp>
// include all exposition headers
#include "expose_simulation.h"
// other submodule headers...
// standard pybind11 usage
#include <pybind11/pybind11.h>
namespace py = pybind11;
PYBIND11_MODULE(kernel, m) {
// Disable automatic function signatures in the docs.
// NOTE: the 'options' object needs to stay alive
// throughout the whole definition of the module.
py::options options;
options.disable_function_signatures();
options.enable_user_defined_docstrings();
// export the tudat version.
m.attr("_tudat_version_major") = TUDAT_VERSION_MAJOR;
m.attr("_tudat_version_minor") = TUDAT_VERSION_MINOR;
m.attr("_tudat_version_patch") = TUDAT_VERSION_PATCH;
// simulation module definition
auto simulation = m.def_submodule("simulation");
tudatpy::expose_simulation(simulation);
// other submodule definitions...
// versioning of kernel module
#ifdef VERSION_INFO
m.attr("__version__") = VERSION_INFO;
#else
m.attr("__version__") = "dev";
#endif
}
Starting with the end in mind, compiling the above will create a shared library named kernel.so, making available all modules included in kernel.cpp. With the kernel.so library added to the Python path variable, users can then import tudatpy modules, such as the astro module, by executing from kernel import astro.
Warning

The Python interpreter searches sys.path in order. Inspect the sys.path list to verify that the desired variant of a module is imported.
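For example, to prefer a local build over an installed tudatpy, the build directory can be prepended to sys.path (the path below is a placeholder; substitute your actual <build> location):

```python
# Prefer a locally built kernel over any installed one by prepending its
# directory to sys.path ("<build>/tudatpy" is a placeholder path).
import sys

sys.path.insert(0, "<build>/tudatpy")

# sys.path is searched in order, so the local build now takes precedence.
print(sys.path[0])
```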
All tudatpy modules included in the kernel namespace have previously been defined in their respective expose_<module_A>.cpp (and expose_<module_A>.h) files.
Module Definition#
Note
A tudatpy module can be thought of as a collection of tudat source code which has been exposed to Python.
Modules are defined by their respective exposition functions expose_<module_X>(). These exposition functions fulfill one (or sometimes both) of two tasks:
- directly expose tudat source code in the module namespace (see <module_B> in the schematic tudatpy/kernel directory);
- include selected submodules, in which tudat source code has been exposed in nested namespaces (see <module_A> in the schematic tudatpy/kernel directory).
1. Source Code Exposition in Module Namespace#
Exposition functions may directly expose tudat source code content (module classes, functions and attributes) from the respective tudat namespace to the tudatpy module namespace. In this case, the C++ to Python interfaces are defined directly in the tudatpy module namespace. One example of this usage is the tudatpy constants module. Consider below the definition of the constants module:
tudatpy/kernel/expose_constants.cpp#

// include .h
#include "expose_constants.h"
// include .h of considered source content
#include "tudatpy/docstrings.h"
#include "tudat/constants.h"
#include "tudat/astro/basic_astro/timeConversions.h"
// pybind11 usage
#include <pybind11/complex.h>
#include <pybind11/pybind11.h>
namespace py = pybind11;
// aliasing namespaces of considered source content
namespace tbc = tudat::celestial_body_constants;
namespace tpc = tudat::physical_constants;
// ...
// namespace package level
namespace tudatpy {
// namespace module level
namespace constants {
// module definition function
void expose_constants(py::module &m) {
// tudat source code (C++) to tudatpy (python) interfaces defined in module namespace:
// docstrings (no source code interface here)
m.attr("__doc__") = tudatpy::get_docstring("constants").c_str();
// celestialBodyConstants.h
m.attr("EARTH_EQUATORIAL_RADIUS") = tbc::EARTH_EQUATORIAL_RADIUS;
m.attr("EARTH_FLATTENING_FACTOR") = tbc::EARTH_FLATTENING_FACTOR;
m.attr("EARTH_GEODESY_NORMALIZED_J2") = tbc::EARTH_GEODESY_NORMALIZED_J2;
// ...
// physicalConstants.h
m.attr("SEA_LEVEL_GRAVITATIONAL_ACCELERATION") = tpc::SEA_LEVEL_GRAVITATIONAL_ACCELERATION;
m.attr("JULIAN_DAY") = tpc::JULIAN_DAY;
m.attr("JULIAN_DAY_LONG") = tpc::JULIAN_DAY_LONG;
// ...
// ...
};
}// namespace module level
}// namespace package level
The procedure can be summarized in three easy steps:

1. make available tudat source code and pybind11 functionality;
2. define the module definition function expose_constants() in the module namespace;
3. define the C++ to Python interfaces using the pybind syntax.
Note
In the case of the constants module, the exposed source code content is limited to attributes.
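From the Python side, the attributes exposed above behave like plain module-level names. The snippet below mimics the result with a plain module object (building the real extension is outside the scope of this page; JULIAN_DAY is the standard 86400.0 s value from physicalConstants.h):

```python
# Mimic of the Python-side result of expose_constants: a plain module
# object stands in for the compiled extension module.
import types

constants = types.ModuleType("constants")
constants.__doc__ = "Set via m.attr('__doc__') in expose_constants."
constants.JULIAN_DAY = 86400.0  # [s], as in physicalConstants.h

print(constants.JULIAN_DAY)
```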
2. Source Code Exposition in Nested Namespace#
For large tudatpy modules, the exposition of the tudat source code is divided over submodules. In this case, the C++ to Python interfaces are defined in the submodule namespace, or in even lower-level nested namespaces. One example of this usage is the tudatpy astro module, which includes exposed tudat source code from submodules such as fundamentals, ephemerides and more. Consider below the definition of the astro module:
tudatpy/kernel/expose_astro.cpp#

// include .h
#include "expose_astro.h"
// include .h of selected submodule definition
#include "expose_astro/expose_fundamentals.h"
#include "expose_astro/expose_ephemerides.h"
// ...
// pybind11 usage
#include <pybind11/pybind11.h>
namespace py = pybind11;
// namespace package level
namespace tudatpy {
// namespace module level
namespace astro {
// module definition function
void expose_astro(py::module &m) {
// include selected submodules (source code exposition in nested namespaces 'fundamentals', 'ephemerides', etc):
// expose_fundamentals.h
auto fundamentals = m.def_submodule("fundamentals");
expose_fundamentals(fundamentals);
// expose_ephemerides.h
auto ephemerides = m.def_submodule("ephemerides");
expose_ephemerides(ephemerides);
// ...
};
} // namespace module level
} // namespace package level
The procedure is largely analogous to that of Source Code Exposition in Module Namespace:

1. make available tudat source code and pybind11 functionality;
2. define the module definition function expose_astro() in the module namespace;
3. include the selected submodules fundamentals & ephemerides via pybind's def_submodule() function.
Since the tudatpy submodules fundamentals & ephemerides define the C++ to Python interfaces, the definition of these submodules follows the exact same structure as in case 1 (Source Code Exposition in Module Namespace). For the sake of completeness, the definition of the ephemerides submodule is presented below:
tudatpy/kernel/expose_astro/expose_ephemerides.cpp#

// include .h
#include "expose_ephemerides.h"
// include .h of considered source content
#include <tudat/astro/ephemerides.h>
#include <tudat/simulation/simulation.h> // TODO: EphemerisType should be in <tudat/astro/ephemerides.h>
// pybind11 usage
#include <pybind11/eigen.h>
#include <pybind11/functional.h>
#include <pybind11/numpy.h>
#include <pybind11/pybind11.h>
namespace py = pybind11;
// aliasing namespaces of considered source content
namespace te = tudat::ephemerides;
namespace tss = tudat::simulation_setup;
// namespace package level
namespace tudatpy {
// namespace submodule level
namespace ephemerides {
void expose_ephemerides(py::module &m) {
// tudat source code (C++) to tudatpy (python) interfaces defined in submodule namespace:
py::class_<te::Ephemeris, std::shared_ptr<te::Ephemeris>>(m, "Ephemeris")
.def("get_cartesian_state", &te::Ephemeris::getCartesianState, py::arg("seconds_since_epoch") = 0.0)
.def("get_cartesian_position", &te::Ephemeris::getCartesianPosition, py::arg("seconds_since_epoch") = 0.0)
.def("get_cartesian_velocity", &te::Ephemeris::getCartesianVelocity, py::arg("seconds_since_epoch") = 0.0);
py::enum_<tss::EphemerisType>(m.attr("Ephemeris"), "EphemerisType")
.value("approximate_planet_positions", tss::approximate_planet_positions)
.value("direct_spice_ephemeris", tss::direct_spice_ephemeris)
// ...
py::class_<te::RotationalEphemeris,
std::shared_ptr<te::RotationalEphemeris>>
RotationalEphemeris_(m, "RotationalEphemeris");
// ...
};
} // namespace submodule level
} // namespace package level
In principle, it is possible for the ephemerides submodule to delegate the C++ to Python interfaces to even lower-level namespaces. In this case, the ephemerides submodule definition (and any lower levels that delegate the interfaces) would follow the logic of case 2 (Source Code Exposition in Nested Namespace), while at the lowest level of this module / submodule tree the definition would again follow the logic of case 1 (Source Code Exposition in Module Namespace).
The tudat(py) API in tudat-bundle#
Warning
WIP - show how to use docstrings in tudat-bundle to contribute to the tudat(py) API.
Build Configurations#
The tudat source code can be built using various build configurations. These configurations are listed in tudat-bundle/CMakeLists.txt (l. 43 ff.). The user can select the build options by use of the 'ON'/'OFF' keywords. See below a section of the CMakeLists file, which gives an example of an enabled test-suite build option and a disabled Boost build option:
tudat-bundle/CMakeLists.txt
## ...
# +============================================================================
# BUILD OPTIONS
# Offer the user the choice of defining the build variation.
# +============================================================================
# Build option: enable the test suite.
option(TUDAT_BUILD_TESTS "Build the test suite." ON)
option(TUDAT_DOWNLOAD_AND_BUILD_BOOST "Downloads and builds boost" OFF)
# more Build options:
# ...
# ...
Warning
Options that toggle the use of SOFA and SPICE can break the build of tudatpy.
Note
For more information on the workings of CMake as a build system, please refer to Build System.
Building the Project and Known Issues#
For most users, the project build is straightforward and is described in the README (steps 5 ff.).
Warning
If your machine is running on an Apple M1 processor, you may have to follow a slightly different procedure. Please refer to this discussion. You may also encounter issues with tudat-test, which can be resolved as described here.
Exposing C++ to Python#
This section contains fundamental concepts about pybind11, a library to expose C++ to Python, and more specific indications for users who want to expose tudat functionalities to tudatpy.
Note
In this context, the terms expose and bind (and derived words) will be treated as synonyms.
The reader should be familiar with the content of the Developer Environment page before moving on to the remainder of this guide.
Learning Objectives
Be able to expose a simple function from C++ to Python.
Be able to expose overloaded functions.
Be able to expose classes, including overloaded constructors.
Understand the different access policies on attributes and methods.
Understand the type conversions required and introduced by specific pybind headers.
The contents of this guide are shown below:
Pybind11#
pybind11 is an open-source library that exposes C++ types in Python. Through this software, the user interfaces of tudat, written in C++, can be made available in tudatpy.
pybind11 has extensive and well-written documentation, accessible through the link reported above, which the reader can refer to at any time. The main goal of this page is to help the reader gain familiarity with the nomenclature and functionalities offered by pybind11 that are specifically useful for exposing tudat code to Python. pybind11 features that are not directly applicable to tudat will not be presented.
Note
The hierarchical structure of the binding code is explained in this section. It is noted that the actual compilation of the binding code is achieved by compiling the kernel.cpp file; however, all the pybind functionalities that will be explained below are employed in the respective submodules.
Headers and preliminaries#
To write a C++ exposition file, the following header is needed:
#include <pybind11/pybind11.h>
However, additional headers may be needed, such as:
#include <pybind11/stl.h> // to enable conversions from/to C++ standard library types
#include <pybind11/eigen.h> // to enable conversions from/to Eigen library types
#include <pybind11/numpy.h> // to enable conversions from/to Numpy library types
In addition, it is assumed that the following piece of code is present in each code snippet shown in this page:
namespace py = pybind11;
Exposing a function#
In this section, the procedure to expose a simple function through pybind11 will be explained. We will make use of an example taken from tudat.
Suppose that we want to expose to Python the following tudat function (taken from this file):
inline std::shared_ptr< SingleDependentVariableSaveSettings > machNumberDependentVariable(
const std::string& associatedBody,
const std::string& bodyWithAtmosphere )
{
return std::make_shared< SingleDependentVariableSaveSettings >(
mach_number_dependent_variable, associatedBody, bodyWithAtmosphere );
}
This function is used to save the Mach number dependent variable associated to a certain body. More specifically, it returns a smart pointer to a SingleDependentVariableSaveSettings object and takes as input two references to std::string (these refer to the body whose Mach number should be saved and the body whose atmosphere should be used to compute the Mach number, respectively).
This is the code (available here) needed to expose the above function to Python:
PYBIND11_MODULE(example, m) {
m.def("mach_number",
&tp::machNumberDependentVariable,
py::arg("body"),
py::arg("central_body"));
}
The code reported above creates a Python module, called example (the creation of a module through the PYBIND11_MODULE() function is done in tudatpy only in the kernel.cpp file; most of the binding code is organized through submodules, structured as explained in the pybind11 section of this page).

def() is the pybind function that creates binding code for a specific C++ function 1. def() takes two mandatory arguments:

- a string (i.e., "mach_number"), representing the name of the exposed function in Python;
- a pointer to the C++ function that should be exposed (i.e., &tp::machNumberDependentVariable), where tp is an abbreviation for the tudat::propagators namespace.

There are also additional input arguments that can be passed to the pybind def() function. In the context of the example above, these are the keywords for the input arguments of the exposed function in Python, denoted by the syntax py::arg, which takes a string as input (i.e., "body" and "central_body"). py is a shortcut for the pybind11 namespace 2.
Note
There are many other optional input arguments to the def() function. For instance, a third positional argument after &tp::machNumberDependentVariable can be passed (of type std::string) to provide short documentation for the function. However, this pybind functionality is not employed for tudat/tudatpy.
As a result, pybind11 will generate a Python function that can be used as follows:

dep_var_to_save = example.mach_number("Spacecraft", "Earth")

It is also allowed to call the tudatpy function mach_number() through keyword arguments, as follows:

dep_var_to_save = example.mach_number(body="Spacecraft", central_body="Earth")
It is also possible to have default values for certain keyword arguments. Suppose, for instance, that we want to have "Earth" as the default central body. This can be achieved through the following implementation 3:
PYBIND11_MODULE(example, m) {
m.def("mach_number",
&tp::machNumberDependentVariable,
py::arg("body"),
py::arg("central_body") = "Earth");
}
The first issue that arises in the binding process is the conversion between variable types. C++ is a statically-typed language, while Python is dynamically-typed; nonetheless, type conversion is needed in both directions. In other words, the user can pass a Python variable as input to an exposed function. The type of such a variable will have to be converted to a C++ type before it is passed to the actual C++ function acting "behind the scenes". The inverse process takes place for the output of a function.

This is one of the reasons why pybind11 is necessary. Indeed, conversions between native types are dealt with automatically by pybind. For instance, a C++ std::map<> is converted into a Python dict and vice versa. In our example, this automatic type conversion takes place between the input arguments: between std::string in C++ and str in Python. A table reporting common conversions is given below.
Python | C++
str | std::string
list | std::vector<>
dict | std::map<>
However, non-native data types need to be known to pybind to be converted properly. This is the case for the output type of the machNumberDependentVariable() function, which returns a pointer to an instance of the SingleDependentVariableSaveSettings class. If this class is not exposed to Python, the binding process will fail. This offers the opportunity to explain how to generate binding code for classes, which will be done in Exposing a class.
Templated functions#
When a function is templated (see for instance here), it is mandatory to specify the template argument when exposing it. Therefore, the exposition code must be duplicated for each variable type (shown below for double; example taken from here).
m.def("multi_arc",
&tp::multiArcPropagatorSettings<double>,
py::arg("single_arc_settings"),
py::arg("transfer_state_to_next_arc") = false );
Overloading functions#
If a free function or a member function is overloaded (i.e., it bears the same name but it accepts different sets of input argument types), it is not possible to generate binding code in the traditional way explained in Exposing a function, because pybind will not know which version should be chosen to generate Python code. Suppose, for instance, that we want to expose the following overloaded function:
//! Function to create a set of acceleration models from a map of bodies and acceleration model types.
basic_astrodynamics::AccelerationMap createAccelerationModelsMap(
const SystemOfBodies& bodies,
const SelectedAccelerationMap& selectedAccelerationPerBody,
const std::map< std::string, std::string >& centralBodies )
//! Function to create acceleration models from a map of bodies and acceleration model types.
basic_astrodynamics::AccelerationMap createAccelerationModelsMap(
const SystemOfBodies& bodies,
const SelectedAccelerationMap& selectedAccelerationPerBody,
const std::vector< std::string >& propagatedBodies,
const std::vector< std::string >& centralBodies )
Both overloads of the createAccelerationModelsMap() function accept the system of bodies and an acceleration map as their first two input arguments. In addition, the function needs to know the central body of each propagated body. This information can be passed as a std::map (where each propagated body is associated to its own central body as key-value pairs) or through two different std::vector objects, one containing the propagated bodies and the other containing the respective central bodies. The code to expose both overloads is reported below:
m.def("create_acceleration_models",// overload [1/2]
py::overload_cast<const tss::SystemOfBodies &,
const tss::SelectedAccelerationMap &,
const std::vector<std::string> &,
const std::vector<std::string> &>(
&tss::createAccelerationModelsMap),
py::arg("body_system"),
py::arg("selected_acceleration_per_body"),
py::arg("bodies_to_propagate"),
py::arg("central_bodies"));
m.def("create_acceleration_models",// overload [2/2]
py::overload_cast<const tss::SystemOfBodies &,
const tss::SelectedAccelerationMap &,
const std::map<std::string, std::string> &>(
&tss::createAccelerationModelsMap),
py::arg("body_system"),
py::arg("selected_acceleration_per_body"),
py::arg("central_bodies"));
The def() function is still used, where the first input argument is the function name in Python. The difference with respect to a non-overloaded function exposition (see Exposing a function) lies in the second input argument, where pybind's templated py::overload_cast<> is used 8.
This pybind function casts overloaded functions to function pointers, and its syntax is as follows:

- the types of the input arguments of the original C++ function are passed as template arguments (e.g., const tss::SystemOfBodies &, etc.);
- a reference to the original C++ function is passed as a regular input argument (e.g., &tss::createAccelerationModelsMap, where tss is a shortcut for the tudat::simulation_setup namespace).
The optional arguments to def() do not change with respect to what was explained in Exposing a function.
Warning
In the (rare) case where a function is overloaded based on constness only, the pybind tag py::const_ must be added as the second parameter to py::overload_cast<>.
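From Python, the two bound overloads are reached through a single name; pybind tries each registered overload in order until the argument types match. The plain-Python sketch below mimics that dispatch (illustrative only; the real function is a compiled binding):

```python
# Plain-Python mimic of pybind11 overload dispatch for
# create_acceleration_models (illustrative; not the real binding).
def create_acceleration_models(body_system, selected_acceleration, *args):
    if len(args) == 2:
        # overload [1/2]: two parallel lists of body names
        bodies_to_propagate, central_bodies = args
        central_map = dict(zip(bodies_to_propagate, central_bodies))
    elif len(args) == 1:
        # overload [2/2]: one propagated-to-central-body map
        central_map = dict(args[0])
    else:
        raise TypeError("no matching overload")
    return central_map
```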
Exposing a class#
As explained above, the SingleDependentVariableSaveSettings class should be exposed to Python as well. This class, available at this link, is defined as follows:
class SingleDependentVariableSaveSettings : public VariableSettings
{
public:
SingleDependentVariableSaveSettings(
const PropagationDependentVariables dependentVariableType,
const std::string& associatedBody,
const std::string& secondaryBody = "",
const int componentIndex = -1 ):
VariableSettings( dependentVariable ),
dependentVariableType_( dependentVariableType ),
associatedBody_( associatedBody ),
secondaryBody_( secondaryBody ),
componentIndex_( componentIndex ) { }
// Attributes
PropagationDependentVariables dependentVariableType_;
std::string associatedBody_;
std::string secondaryBody_;
int componentIndex_;
};
The class has a constructor and it is a derived class, whose parent is the VariableSettings class. The code to expose it to Python, available through this link, is as follows (the exposition of the constructor is omitted for now):
py::class_<tp::SingleDependentVariableSaveSettings,
           std::shared_ptr<tp::SingleDependentVariableSaveSettings>,
           tp::VariableSettings>(m, "SingleDependentVariableSaveSettings")
It makes use of pybind's py::class_<> templated function 4. In the template, there are three arguments, of which only the first one is mandatory:

- the first template argument declares the C++ class that should be exposed (i.e., tp::SingleDependentVariableSaveSettings);
- the second template argument declares the type of pointer that should be used by pybind to refer to instances of that class (i.e., std::shared_ptr<tp::SingleDependentVariableSaveSettings>). The default is std::unique_ptr, but in tudat the commonly and consistently used pointer is std::shared_ptr<> 5;
- the third template argument informs pybind that the class to be exposed is derived from the parent class tp::VariableSettings 6.
Todo
When does a parent class need to be exposed? In theory, tp::VariableSettings does not have to be exposed… According to GG, “only when the class is part of the signature of a different function” (see recording at 14m01s).
Warning
The third template argument is necessary to ensure automatic downcasting of pointers referring to polymorphic base classes. In other words, when a function returns a pointer to an instance of a derived class, pybind automatically knows to “downcast” the pointer to the type of the derived class only if the base class is polymorphic (a class is said polymorphic if it has at least one virtual function).
In addition, there are two input arguments to the py::class_ function:

- the Python module to which the exposed class will belong (i.e., m);
- the name of the exposed class in Python, provided as a std::string (i.e., "SingleDependentVariableSaveSettings").
Exposing class constructors#
Once the class has been exposed, one can also expose its member functions (in C++) which will become methods (in Python). The first member function that will be exposed is the class constructor. This can be exposed through the following code:
py::class_<tp::SingleDependentVariableSaveSettings,
std::shared_ptr<tp::SingleDependentVariableSaveSettings>,
tp::VariableSettings>(m, "SingleDependentVariableSaveSettings")
.def(py::init<
const tp::PropagationDependentVariables,
const std::string &,
const std::string &,
const int>(),
py::arg("dependent_variable_type"),
py::arg("associated_body"),
py::arg("secondary_body") = "",
py::arg("component_idx") = -1);
The first three lines were explained above. To expose the class constructor, it is possible to use the pybind def() function, which is common to any function (whether it is a member of a class or not). In addition, the pybind py::init<> function is used to declare the definition of the constructor. This function takes the types of the constructor's input arguments as template arguments (i.e., const tp::PropagationDependentVariables, const std::string &, etc.). The templated function py::init<> makes it easy to overload the class constructor: it is sufficient to define multiple .def(py::init<...>) calls, with different template arguments, to expose several versions of the constructor; the correct version is selected according to the types of the input arguments passed to the constructor.
An example, taken from this tudat class exposed through this code, is provided below.
Overloading simple functions will be explained in section overloading_functions.
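The dispatch behaviour that results from registering multiple overloads can be sketched in pure Python. This is an illustration only, not how pybind11 is implemented, and the class name and argument sets below are hypothetical:

```python
# Pure-Python sketch of the overload dispatch pybind11 generates when several
# .def(py::init<...>) calls are registered for one class (hypothetical class).
class DependentVariableSettings:
    def __init__(self, *args):
        # pybind tries each registered overload until the argument types
        # match, raising TypeError if none does
        if len(args) == 1 and isinstance(args[0], str):
            self.body = args[0]
            self.component = -1          # default value, as in the binding code
        elif len(args) == 2 and isinstance(args[0], str) and isinstance(args[1], int):
            self.body, self.component = args
        else:
            raise TypeError("no matching constructor overload")

settings = DependentVariableSettings("Earth")
assert settings.component == -1
settings = DependentVariableSettings("Earth", 2)
assert settings.component == 2
```

From the user's perspective, the two calls above look like two constructors of the same class, which is exactly the effect of multiple .def(py::init<...>) registrations.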
Warning
The template arguments must always be provided to py::init<>, even if the constructor is not overloaded.
The def function follows its standard behavior (explained above) even when it is used to expose a class constructor; in other words, it can take a number of optional arguments that specify the keyword corresponding to each input argument of the class constructor in Python (i.e., py::arg("dependent_variable_type"), etc.). In this example, the last two input arguments have default values.
Note
The set of parentheses after py::init<> is empty: no arguments are passed there, but the parentheses are required by the syntax.
Optional arguments can be passed to create custom constructors in Python 7. However, this pybind functionality is not used in tudat and will therefore not be treated in this guide.
Exposing class attributes#
Class attributes in C++ vs. in Python#
There are a few differences between the Object-Oriented Programming (OOP) philosophy in C++ and Python. It is important to know these differences before proceeding to the next sections. The reader who is already aware of this information can skip this section.
One of the principles used in Object-Oriented Programming in C++ is data encapsulation. According to this principle, class attributes should be accessible only from within the class and not by the user dealing with an instance of that class. This principle is (partly) enforced by C++: for instance, class attributes are by default private (i.e., accessible only from within the class and its methods, also called friends) 9. This policy is useful mainly for security reasons (data protection), but also because interaction with the data contained within a class is possible only through its public methods; in other words, the user can interact with the class data through a dedicated user interface, without knowing or dealing with the class's internal functioning directly. This strategy also ensures that any changes to the class's internal structure will not affect the code that creates and uses instances of that class 10. The most basic form of user interface consists of accessors and mutators (hereafter referred to as getters and setters).
In Python, on the other hand, there is no way to keep class attributes truly private. Among Python programmers, there is a widespread convention to use attribute names starting with an underscore (e.g., myclass._myattribute) to inform other developers and users that such attributes should not be accessed directly outside of the class. However, this is only a convention and the language does not enforce it. For this reason, getters and setters are not as common in Python as they are in other OOP languages, such as C++ or Java. In addition, the dot notation used in Python to access and mutate class attributes makes the code much more readable 11.
However, there may be cases where getters and setters are needed in Python classes as well. This is the case when code is exposed from another OOP language, such as C++, as happens for tudat: it is obviously easier to maintain the same user interface, thus keeping getters and setters in Python as well. In this case, it is recommended to create a class property. This solution has the advantage of keeping getters and setters, while at the same time benefitting from the dot notation 12.
These concepts will be partially re-explained and applied in Exposing public attributes (for attributes that are not private, thus do not have associated getters and setters) and Exposing private attributes (for attributes that are private, thus do have associated getters and setters, which can become properties in Python).
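As a minimal pure-Python illustration of the pattern discussed above (the class below is hypothetical, not part of tudat), a conventionally private attribute with a getter and a setter can be wrapped in a property so that users keep the dot notation:

```python
class GravityField:
    """Hypothetical class showing getters/setters exposed as a property."""

    def __init__(self, reference_radius):
        # leading underscore: "private" by convention only
        self._reference_radius = reference_radius

    @property
    def reference_radius(self):
        # getter: read access through dot notation
        return self._reference_radius

    @reference_radius.setter
    def reference_radius(self, value):
        # setter: write access through dot notation, with room for validation
        if value <= 0:
            raise ValueError("radius must be positive")
        self._reference_radius = value

field = GravityField(6378136.0)
field.reference_radius = 6371000.0   # calls the setter transparently
assert field.reference_radius == 6371000.0
```

This is the Python-side behaviour that pybind's def_property() reproduces for C++ getters and setters, as discussed in Exposing private attributes.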
Exposing public attributes#
Analogously to the def() method of pybind's py::class_, which is useful to expose member functions, pybind offers two other methods to expose public attributes of a class (for private attributes, see Exposing private attributes) 9. def_readwrite() can be used to expose a non-constant attribute. For instance, let's consider the following piece of code that exposes this class:
py::class_<ta::AerodynamicGuidance, ta::PyAerodynamicGuidance,
std::shared_ptr< ta::AerodynamicGuidance > >(m, "AerodynamicGuidance")
.def(py::init<>())
.def("updateGuidance", &ta::AerodynamicGuidance::updateGuidance, py::arg("current_time") )
.def_readwrite("angle_of_attack", &ta::PyAerodynamicGuidance::currentAngleOfAttack_)
.def_readwrite("bank_angle", &ta::PyAerodynamicGuidance::currentBankAngle_)
.def_readwrite("sideslip_angle", &ta::PyAerodynamicGuidance::currentAngleOfSideslip_);
The highlighted lines show the def_readwrite() function at work. It takes two arguments, in the same way as explained in Exposing a function:
the name of the attribute of the exposed Python class, passed as a string;
the attribute of the original C++ class, passed as a reference.
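From Python, an attribute exposed with def_readwrite() behaves like a plain instance attribute. A pure-Python stand-in for the binding above (an illustration, not the actual tudatpy class):

```python
class AerodynamicGuidance:
    """Pure-Python stand-in for the class exposed above (illustration only)."""

    def __init__(self):
        # attributes exposed via def_readwrite() can be both read and written
        self.angle_of_attack = 0.0
        self.bank_angle = 0.0
        self.sideslip_angle = 0.0

guidance = AerodynamicGuidance()
guidance.angle_of_attack = 0.05          # write access
assert guidance.angle_of_attack == 0.05  # read access
```

An attribute exposed with def_readonly() instead supports only the read side of this interaction.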
Similarly, the def_readonly() function can be used to expose const public class attributes. For instance, look at this example exposing this thrust direction class:
py::class_<
tss::ThrustDirectionGuidanceSettings,
std::shared_ptr<tss::ThrustDirectionGuidanceSettings>>(m, "ThrustDirectionGuidanceSettings")
.def(py::init<
const tss::ThrustDirectionGuidanceTypes,
const std::string>(),
py::arg("thrust_direction_type"),
py::arg("relative_body"))
.def_readonly("thrust_direction_type", &tss::ThrustDirectionGuidanceSettings::thrustDirectionType_)
.def_readonly("relative_body", &tss::ThrustDirectionGuidanceSettings::relativeBody_);
The highlighted lines use def_readonly() in the same way as def_readwrite().
Note
In tudat, it was decided to have as few public attributes as possible. Therefore, in principle,
a developer should not rely on def_readonly()
and def_readwrite()
too much, as classes should be designed
so that attributes are generally private and interaction with those is possible through getters (and setters).
Exposing private attributes#
If class attributes are private, it is likely that they can be accessed (and, in some cases, modified) through getters and setters. pybind's py::class_ has specific methods to deal with this situation, namely def_property() and def_property_readonly() 13. The former is used for private attributes that have both a getter and a setter, while the latter is used for private attributes that cannot be modified (i.e., they only have a getter).
The following example, exposing a spherical harmonics class in tudat, illustrates the usage of both:
py::class_<tg::SphericalHarmonicsGravityField,
std::shared_ptr<tg::SphericalHarmonicsGravityField >,
tg::GravityFieldModel>(m, "SphericalHarmonicsGravityField")
.def_property_readonly("reference_radius", &tg::SphericalHarmonicsGravityField::getReferenceRadius )
.def_property_readonly("maximum_degree", &tg::SphericalHarmonicsGravityField::getDegreeOfExpansion )
.def_property_readonly("maximum_order", &tg::SphericalHarmonicsGravityField::getOrderOfExpansion )
.def_property("cosine_coefficients", &tg::SphericalHarmonicsGravityField::getCosineCoefficients,
&tg::SphericalHarmonicsGravityField::setCosineCoefficients)
.def_property("sine_coefficients", &tg::SphericalHarmonicsGravityField::getSineCoefficients,
&tg::SphericalHarmonicsGravityField::setSineCoefficients);
The syntax is as follows:
the first argument is, as usual, the name of the attribute of the exposed Python class, passed as a string;
the second argument is the getter function of the original C++ class, passed as a reference;
[only for def_property()] the third argument is the setter function of the original C++ class, passed as a reference.
As a result, in Python it will be possible to operate without getter and setters, simply accessing properties through the dot notation (see the Python documentation about the property decorator). As an example, in Python one could do:
# Create spherical harmonics object
spherical_harmonics_model = ...
# Retrieve sine coefficients
sin_coeff = spherical_harmonics_model.sine_coefficients
# Set sine coefficients
spherical_harmonics_model.sine_coefficients = sin_coeff
# Retrieve reference radius
r = spherical_harmonics_model.reference_radius
# Set reference radius
spherical_harmonics_model.reference_radius = r # THIS WOULD THROW AN ERROR
Note
In the current state of tudatpy, def_property() is not always used, because in some cases the getter and setter functions are exposed individually through the traditional def() method. However, this practice is discouraged for binding code generated in the future: when getters (and setters) are available in C++, it is recommended to rely on def_property() or def_property_readonly().
Todo
@Dominic, @Geoffrey, do you confirm the note above?
Exposing class methods#
Other class methods that are not part of the categories explained above can be simply exposed with the same syntax used for free functions (see Exposing a function).
Exposing an enum#
Exposing enumeration types is relatively straightforward. Suppose we would like to expose the following enum, located in the tudat::propagators namespace:
//! Enum listing types of dynamics that can be numerically integrated
enum IntegratedStateType
{
hybrid = 0,
translational_state = 1,
rotational_state = 2,
body_mass_state = 3,
custom_state = 4
};
This can be done through pybind’s py::enum_<>
function as follows (original code):
py::enum_<tp::IntegratedStateType>(m, "StateType")
.value("hybrid_type", tp::IntegratedStateType::hybrid)
.value("translational_type", tp::IntegratedStateType::translational_state)
.value("rotational_type", tp::IntegratedStateType::rotational_state)
.value("mass_type", tp::IntegratedStateType::body_mass_state)
.value("custom_type", tp::IntegratedStateType::custom_state)
.export_values();
py::enum_<> takes the original C++ enum as template argument; the name of the Python equivalent is passed as the second parameter (i.e., "StateType"), with the first one being the module m as usual.
Each element of the enum can then be exposed using the value() function, which takes two parameters:
the name of the element in Python;
the original C++ element to be exposed (where tp is, as usual, a shortcut for the tudat::propagators namespace).
The final function export_values() is needed to export the elements to the parent scope; without it, the elements would be accessible in Python only through the enum class (e.g., StateType.hybrid_type), not directly from the enclosing module 14.
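The effect of export_values() can be mimicked with Python's standard enum module. This is an analogy, not tudatpy code: it shows how members that normally live only inside the enum class can also be made available in the enclosing scope.

```python
from enum import Enum

class StateType(Enum):
    # mirrors the exposed names from the binding code above
    hybrid_type = 0
    translational_type = 1
    rotational_type = 2
    mass_type = 3
    custom_type = 4

# Roughly what export_values() achieves on the pybind side: copy the members
# into the parent (module) scope so the class prefix becomes optional.
globals().update(StateType.__members__)

# Both spellings now refer to the same member.
assert translational_type is StateType.translational_type
```

Without the globals() update (i.e., without export_values() in the binding code), only the StateType.translational_type spelling would be available.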
Todo
to address: structure of the PYBIND11_MODULE (in kernel) and module/submodule definition. However, this overlaps with the content of this tudat developer guide. I propose to either redirect from here to there or transfer its content here.
References#
- 1
https://pybind11.readthedocs.io/en/stable/basics.html#creating-bindings-for-a-simple-function
- 2
https://pybind11.readthedocs.io/en/stable/basics.html#keyword-arguments
- 3
https://pybind11.readthedocs.io/en/stable/basics.html#default-arguments
- 4
https://pybind11.readthedocs.io/en/stable/classes.html#creating-bindings-for-a-custom-type
- 5
https://pybind11.readthedocs.io/en/stable/advanced/smart_ptrs.html#std-shared-ptr
- 6
https://pybind11.readthedocs.io/en/stable/classes.html#inheritance-and-automatic-downcasting
- 7
https://pybind11.readthedocs.io/en/stable/advanced/classes.html#custom-constructors
- 8
https://pybind11.readthedocs.io/en/stable/classes.html#overloaded-methods
- 9(1,2)
- 10
https://press.rebus.community/programmingfundamentals/chapter/encapsulation/
- 11
- 12
https://docs.python.org/3/library/functions.html?highlight=property#property
- 13
https://pybind11.readthedocs.io/en/stable/classes.html#instance-and-static-fields
- 14
https://pybind11.readthedocs.io/en/stable/classes.html#enumerations-and-internal-types
Extending Features#
Development Environment
Get your own tudat-bundle environment from the tudat-team.
Understand the structure of the tudat-bundle and the purpose of its components.
Familiarize with the mapping between tudat and tudatpy source code.
Understand the higher-level functions of the tudat-api.
Familiarize with the available build configurations for tudat and tudatpy.
Know how to build the tudat-bundle and recognize some common problems that can be encountered.
Bibliography#
- 1
Wikipedia. Devops, wikipedia, the free encyclopedia. http://en.wikipedia.org/w/index.php?title=DevOps&oldid=1019937221, 2021. [Online; accessed 27-April-2021].
- 2
G.H. Garrett. Developer-primer. URL: https://github.com/tudat-team/developer-primer.
- 3
Atlassian. Gitflow workflow: atlassian git tutorial. URL: https://www.atlassian.com/git/tutorials/comparing-workflows/gitflow-workflow.
- 4
Tom Preston-Werner. Semantic versioning 2.0.0. URL: https://semver.org/.
- 5
Anaconda, individual edition. URL: https://www.anaconda.com/products/individual.
- 6
Conda, miniconda. URL: https://docs.conda.io/en/latest/miniconda.html#miniconda.
Follow tutorials that demonstrate specific tasks that concern the maintenance, development, and documentation of Tudat.
Note
The following items are placeholders.
Adding a Package to Conda#
Creating a new Forge Feedstock#
Warning
The following has not been tested for Windows development, so inconsistencies will be present (e.g., the directory for token storage).
The following extract from the conda-feedstock is a quick summary on conda-forge and how this organisation relates to feedstocks.
conda-forge is a community-led conda channel of installable packages. In order to provide high-quality builds, the process has been automated into the conda-forge GitHub organization. The conda-forge organization contains one repository for each of the installable packages. Such a repository is known as a feedstock.
Installation of conda-smithy#
Note
It is always advised to ensure conda is up to date when dealing with conda-related tools:
conda update conda
The most reliable way to install conda-smithy is in the root environment, as only one copy of the package should exist across all environments:
conda install -n root -c conda-forge conda-smithy
Making a New Feedstock#
Creating a new feedstock will make a new package available for installation through the conda package manager. The steps under Making a New Feedstock need only be carried out once in all cases. Everything else with regard to a feedstock falls under "feedstock maintenance".
Note
The following guide follows the creation of a feedstock for nrlmsise-00. This process should be adapted to the package at hand.
Setting up a basic recipe#
Create a directory which will contain the recipe for the feedstock.
mkdir nrlmsise-00
Every conda-build recipe will generally require the following three files to be present: 1) meta.yaml, 2) build.sh and 3) bld.bat. Let's create them.
touch nrlmsise-00/meta.yaml
The meta.yaml file requires the package name and version in order to initialise the package.
{% set name = "NRLMSISE-00" %}
{% set version = "0.1" %}

package:
  name: {{ name|lower }}
  version: {{ version }}
Now there’s a concept present here which isn’t strictly necessary: templating
through the use of Jinja
. This just allows for the definition of variables
which can be reused throughout the yaml
file. The effect of the above will
be the following after template rendering:
package:
  name: nrlmsise-00
  version: 0.1
Finally, create the last two required files for the feedstock initialisation.
touch nrlmsise-00/build.sh && touch nrlmsise-00/bld.bat
We can now initialise the nrlmsise-00-feedstock
:
conda smithy init nrlmsise-00
(base) ggarrett@space-station:~/tudat-space$ conda smithy init nrlmsise-00/
No azure token. Create a token and put it in ~/.conda-smithy/azure.token
No numpy version specified in conda_build_config.yaml. Falling back to default numpy value of 1.11
WARNING:conda_build.metadata:No numpy version specified in conda_build_config.yaml. Falling back to default numpy value of 1.11
Initialized empty Git repository in /home/ggarrett/tudat-space/nrlmsise-00-feedstock/.git/
[master (root-commit) 73bf19e] Initial feedstock commit with conda-smithy 3.7.4.dev54.
 4 files changed, 4 insertions(+)
 create mode 100644 conda-forge.yml
 create mode 100644 recipe/bld.bat
 create mode 100644 recipe/build.sh
 create mode 100644 recipe/meta.yaml
Repository created, please edit conda-forge.yml to configure the upload channels and afterwards call 'conda smithy register-github'
(base) ggarrett@space-station:~/tudat-space$ ls
nrlmsise-00  nrlmsise-00-feedstock
Success! We have initialised our first feedstock, although you may notice two potential points of concern: the numpy fallback and the lack of an azure token. Both will be addressed.
Generating Github token#
If you’re familiar with retrieving a personal access token from Github, you
can skip ahead to Setting up the Github repo. The following screen capture
will give you your bearings on generating the access token:
Settings/Developer settings
.

Setting up the Github repo#
Note
Your personal access token from Github will provide conda-smithy
with the required permissions to register a repo in your organisation,
as long as your account has the correct permissions to the
organisation. Save the token in the file: ~/.conda-smithy/github.token
conda smithy register-github --organization tudat-team ./nrlmsise-00-feedstock
(base) ggarrett@space-station:~/tudat-space$ conda smithy register-github --organization tudat-team ./nrlmsise-00-feedstock
No azure token. Create a token and put it in ~/.conda-smithy/azure.token
No numpy version specified in conda_build_config.yaml. Falling back to default numpy value of 1.11
WARNING:conda_build.metadata:No numpy version specified in conda_build_config.yaml. Falling back to default numpy value of 1.11
Adding in variants from internal_defaults
INFO:conda_build.variants:Adding in variants from internal_defaults
Created tudat-team/nrlmsise-00-feedstock on github
Repository registered at github, now call 'conda smithy register-ci'
git push upstream master
Generating Azure token#
Now it’s time to generate our Azure token for the organisation.
Sign in to your organization in Azure DevOps (https://dev.azure.com/{yourorganization})
From your home page, open your user settings, and then select Personal access tokens.

And then select + New Token.

Name your token, select the organization where you want to use the token, and then choose a lifespan for your token.

When you’re done, make sure to copy the token. For your security, it won’t be shown again. Use this token as your password.

conda smithy register-ci --organization tudat-team --feedstock_directory ./nrlmsise-00-feedstock
Adding a Function to Tudat#
Adding a Class to Tudat#
Adding a Module to Tudat#
Generate PDFs with Sphinx Locally#
Tutorial will be adapted from: - https://www.tutorialfor.com/blog-222028.htm
Commands#
Rerendering is conda-forge's way to update the files common to all feedstocks (e.g. README, CI configuration, pinned dependencies).
Note
Tired of retyping your Git credentials in the Command-line?
Attention
This method saves the credentials in plaintext on your PC’s disk. Everyone on your computer can access it, e.g. malicious NPM modules.
Run
git config --global credential.helper store
then run
git pull
and provide a username and password; those details will be remembered later. The credentials are stored in a file on disk, with permissions of "just user readable/writable", but still in plaintext.
Note
You can now check the state of the working tree and the staging area on your local branch. The working tree, or working directory, consists of files that you are currently working on. You can think of a working tree as a file system where you can view and modify files. The index, or staging area, is where commits are prepared.
Note
Command variants for checking available branches and their descriptions:
Variant |
Description |
git branch |
To see local branches |
git branch -r |
To see remote branches |
git branch -a |
To see all local and remote branches |
Curated Tools#
Nomenclature#
IDE: An integrated development environment (IDE) is a software application that provides comprehensive facilities to computer programmers for software development.
CLion#
CLion is JetBrains' fully integrated C/C++ development environment.
Tudat Developer Change Log#
v0.1.1#
Changed:
How unstable releases work in Sphinx Documentation
Authors:
Filippo Oggionni
v0.1.0#
Added:
Releasing versions of documentation in Sphinx Documentation
Note about clearing browser caches in Sphinx Documentation
Changed:
How to compile the documentation in Sphinx Documentation
Transferred some content from tudat-space
Authors:
Filippo Oggionni
v0.0.11#
Added:
Trial “Mission Brief” detailing practical tasks in primer.
Content in Develop and Master Branches for switching to the develop branch in the Tudat Git workflow.
Note on git branch and its variants for listing the local and remote branches available.
Added Feature Branches. Placeholders added for Release Branches and Hotfix Branches to explain the Gitflow Workflow.
Changed:
Extracted “git - the simple guide” from Mission brief, moved to top level Code Collaboration.
Authors:
Geoffrey H. Garrett
v0.0.10#
Added:
Quality of life improvement for Code Collaboration/ Git. Note for users on how to save git credentials via the Command Line (if they choose to do so with the security risk).
Added News Workflow to Release Versioning topic. Currently identical and unadapted from https://regro.github.io/rever-docs/news.html.
Added todo for elaborating on
osx_arm64
build variants.
Authors:
Geoffrey H. Garrett
v0.0.9#
Added:
Tutorial: Re-rendering a conda-smithy feedstock (placeholder)
Todo for renaming the aforementioned tutorial's file.
Fixed:
Fixed all errors/warnings during
make html
for sphinx docs.
Authors:
Geoffrey H. Garrett
v0.0.8#
Changed:
Fixed Change Log formatting. (Basically figuring out a consistent workflow for rever with these commits.)
Authors:
Geoffrey H. Garrett
v0.0.7#
Changed:
author variable is now retrieved directly from rever AUTHORS files during the building of docs.
Authors:
Geoffrey H. Garrett
v0.0.6#
Changed:
Changed rever.xsh activity ordering. Placed the author activity first.
Authors:
Geoffrey H. Garrett
v0.0.5#
Added:
Topics are used to contextualise external and internal tools, workflows and conventions in the context of the Tudat Developer user case.
- Topics:
- Development Operations
Code Collaboration (ongoing)
Release Versioning (ongoing)
Package Management (ongoing)
Continuous Deployment (ongoing)
Software Documentation (ongoing)
- Software Development (ongoing)
Build System (ongoing)
Developer Environment (ongoing)
Extending Features (ongoing)
Exposing C++ in Python (ongoing)
- Tutorials: (added placeholders)
Adding a Package to Conda
Adding a Function to Tudat
Adding a Class to Tudat
Adding a Module to Tudat
Generate PDFs with Sphinx Locally
Authors:
Geoffrey H. Garrett
Todo
(*) Rename rerendering_a_feedstock.rst
to rerendering_feedstocks.rst