An Introduction to the Development Flow#

This page will help you get started developing Tidy3D.

TLDR:

  • Branch off of the target branch (usually develop or pre/x.x), work on your branch, and submit a PR when ready.

  • Use isolated development environments with poetry.

  • Use ruff to lint and format code, and install the pre-commit hook via pre-commit install to automate this.

  • Document code using NumPy-style docstrings.

  • Write unit tests for new features and try to maintain high test coverage.

Understanding Virtual Environments#

Introduction#

In larger projects, it’s crucial to have a separate Python environment for each feature or branch you work on. This practice ensures isolation and reproducibility, simplifying testing and debugging by allowing issues to be traced back to specific environments. It also facilitates smoother integration and deployment processes, ensuring controlled and consistent development. Managing multiple environments might seem daunting, but it’s straightforward with the right tools. Follow the steps below to set up and manage your environments efficiently.

Benefits#

  • Isolation: Avoids conflicts between dependencies of different features.

  • Reproducibility: Each environment can be easily replicated.

  • Simplified Testing: Issues are contained within their respective environments.

  • Smooth Integration: Ensures features are developed in a consistent setting.

Prerequisites#

Make sure that you have poetry installed. This can be done system-wide with pipx or within a conda environment. Note that we use conda only for setting up the interpreter (Python version) and poetry, not for managing dependencies. Refer to the official development guide for detailed instructions:

https://docs.flexcompute.com/projects/tidy3d/en/stable/development/index.html#installation
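
For reference, a system-wide installation with pipx (assuming pipx itself is already installed) looks like:

    pipx install poetry
    poetry --version  # confirm poetry is available on your PATH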

Setting Up a New Environment#

  1. Check out the branch:

    git checkout branch
    
  2. Set up the environment with conda (skip this step if you don’t use conda):

    conda create -n branch_env python=3.11 poetry
    conda activate branch_env
    poetry env use system
    poetry env info # verify you're running the right environment now
    
  3. Install dependencies with poetry:

    poetry install -E dev
    poetry run pre-commit install
    
  4. Update the environment when switching to a different branch:

    poetry install -E dev
    

Multiple Folders or Worktrees#

If you have multiple folders (e.g., multiple clones or git worktrees), you will need to repeat the environment setup for each folder. Ensure that each folder has its own isolated environment.

By following these steps, you can maintain isolated and reproducible environments for each branch and feature, leading to a more efficient and error-free development process.

Using poetry for package management#

What is Poetry#

Poetry is a package management tool for Python.

Among other things, it provides a nice way to:

  • Manage dependencies

  • Publish packages

  • Set up and use virtual environments

Effectively, it is a command line utility (similar to pip) that is a bit more convenient and allows more customization.

Why do we want to use it#

  1. To improve our dependency management, which used to be all over the place: we had several requirements.txt files that were imported into setup.py and parsed depending on the extra arguments passed to pip install. Poetry handles this much more elegantly through a pyproject.toml file that defines the dependency configuration explicitly in a simple data format.

  2. Reproducible development virtual environments mean that everyone uses exactly the same dependencies, without conflicts. This also improves our packaging and release flow.

How to install it?#

We provide custom installation instructions and an installation script in TODO ADD LINK SECTION. For more background, see the poetry documentation for a guide to installation and basic usage.

Usage Examples#

To add poetry to a project#

To initialize a new basic project with poetry configured, run:

poetry new poetry-demo

To add poetry to an existing project, cd to the project directory and run:

poetry init

Configuring dependencies#

The dependency configuration lives in pyproject.toml, which you can edit directly. Here you can specify whatever dependencies you want in your project, their versions, and even different groups of dependencies (e.g., dev).

To add a dependency to the project (e.g., numpy), run:

poetry add numpy

You can then verify that it was added to the tool.poetry.dependencies section of pyproject.toml.

For many more options on defining dependencies, see the poetry documentation.

Virtual environments#

Now that poetry is configured for the project and the correct dependencies are specified, we can use poetry to run our scripts and shell commands from its virtual environment without much effort. There are a few ways to do this:

Poetry run: One way is to precede any shell command you’d normally run with poetry run. For example, if you want to run python tidy3d_script.py from the virtual environment set up by poetry, you’d do:

poetry run python tidy3d_script.py
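
Here, tidy3d_script.py stands for any Python script you want to run inside the environment; a trivial, hypothetical placeholder could be:

    # tidy3d_script.py (hypothetical example script)
    import tidy3d as td

    # print the tidy3d version visible inside this poetry-managed environment
    print(td.__version__)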

Poetry shell:

If you want to open up a shell session with the environment activated, you can run:

poetry shell

And then run your commands. To return to the original shell, run exit.

There are many more advanced options explained in the poetry documentation.

Publishing Package#

To upload the package to PyPI:

poetry build

poetry publish

Note that some configuration must be set up before this would work properly.
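
For example, publishing to PyPI typically requires credentials to be configured first; with an API token this might look like (the token value below is a placeholder):

    poetry config pypi-token.pypi <your-pypi-token>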

Code Quality Principles#

When writing code, remember the saying: “code is read more often than written”. We want to keep our code maintainable, readable, and high quality.

Linting & Formatting#

To maintain code quality, we use Ruff as a linter and code formatter. A linter analyzes code to identify and flag potential errors, stylistic issues, and code that doesn’t adhere to defined standards (such as PEP8). A code formatter automatically restructures the code to ensure it is consistently styled and properly formatted, making it consistent across the code base.

Run ruff format to format all Python files:

poetry run ruff format .

Run ruff check to check for style and other issues. Many common warnings can be automatically fixed with the --fix flag:

poetry run ruff check tidy3d --fix

The configuration defining what ruff will correct lives in pyproject.toml under the [tool.ruff] section.

When submitting code, ruff must report no warnings for the automated checks to pass.

Documentation#

Document all code you write using NumPy-style docstrings.
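
For reference, a NumPy-style docstring looks like the following; the function itself is a made-up example, not part of the Tidy3D API:

    def scale_frequency(freq, factor=1.0):
        """Scale a frequency by a constant factor.

        Parameters
        ----------
        freq : float
            Frequency to scale, in Hz.
        factor : float, optional
            Dimensionless scaling factor. Defaults to 1.0.

        Returns
        -------
        float
            The scaled frequency, in Hz.
        """
        return freq * factor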

Testing#

Here we will discuss how tests are defined and run in Tidy3D.

Unit Testing#

The tests live in the tests/ directory.

We use the pytest package for our testing.

To run all of the tests, call:

poetry run pytest -rA tests

This command will trigger pytest to go through each file in tests/ called test*.py and run each function in that file with a name starting with test.

If all of these functions run without any exceptions being raised, the tests pass!
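
As a minimal sketch of what such a test looks like (the file and test names here are made up for illustration):

    # tests/test_example.py (hypothetical file; pytest discovers it by the test_ prefix)
    import tidy3d as td

    def test_medium_permittivity():
        # construct a simple non-dispersive medium and check a basic property
        medium = td.Medium(permittivity=2.0)
        assert medium.permittivity == 2.0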

The specific configuration we use for pytest lives in the [tool.pytest.ini_options] section of pyproject.toml.

These tests are run automatically via GitHub Actions when code is submitted, on Python 3.9 through 3.12 across Ubuntu, macOS, and Windows, as well as on Flexcompute’s servers.

Note: The -rA flag is optional but produces output that is easily readable.

Note: You may notice warnings and errors in the pytest output; this is because many of the tests intentionally trigger them to ensure they occur in certain situations. The important information about the success of each test is printed at the bottom of the pytest output for each file.

To get a code coverage report, first install pytest-cov if it is not already installed:

pip install pytest-cov

To run coverage tests with results printed to STDOUT:

pytest tests --cov-report term-missing --cov=tidy3d

To run coverage tests and get output as .html (more intuitive):

pytest tests --cov-report=html --cov=tidy3d
open htmlcov/index.html

Automated Testing#

We use GitHub Actions to perform these tests automatically and across different operating systems.

On commits, each of the pytest tests is run using Python 3.9 - 3.12 installed on Ubuntu, macOS, and Windows operating systems.

See the “Actions” tab for details on previous runs, and .github/workflows/run_tests.yml for the configuration and the specific tests that are run.

See the GitHub Actions documentation for more explanation.

Other Tests#

There are additional tests in both the documentation and our private backend code. The same practices outlined here apply to those tests.

More Resources on Testing#

A useful explanation for those curious to learn more about the reasoning behind these decisions:

https://www.youtube.com/watch?v=DhUpxWjOhME

tidy3d Project Structure#

As of tidy3d>=2.6, the frontend has been restructured to improve the development cycle. The project directories follow the structure below, which is derived from commonly recommended Python project layouts. This is a handy structure because many tools, such as sphinx, integrate well with this type of project layout.

docs/
    # sphinx rst files
    ...
    notebooks/
        # Git submodule repository
        # Checks out github.com/flexcompute/tidy3d-notebooks
    faq/
        # Git submodule repository
        # Checks out github.com/flexcompute/tidy3d-faq
tests/
    # pytest source and docs
    # pytest notebooks
scripts/
    # useful handy scripts
tidy3d/
    # python source code
...
pyproject.toml # python packaging
poetry.lock # environment management

It is important to note the new tools we are using to manage our development environment and workflow.

  • poetry

  • pipx

Important Branches#

We currently have three main branches that need to be kept track of when creating a release, each with a different role.

Project Branches#

  • latest: Contains the latest version of the docs. Version release tags are created from this branch. Caveats: feature PRs should not be made to this branch, as they would cause it to diverge; only important documentation patches are applied here directly.

  • develop: Contains the “staging” version of the project. Patch versions and development work branch off from here. Caveats: docs PRs that are not crucial for the current version should be made to this branch.

  • pre/^*: Contains the next version of the project. Caveats: documentation and source code that will only go live in the next version should be updated here.

Sometimes, hopefully infrequently, the latest and develop branches might diverge. It is important to bring them back together. However, what happens if we rebase develop into latest?

It could be argued that all the commits in the latest branch should have been made within the develop branch in the first place. The question is then whether we want to maintain the commit history accordingly. If we only care about the content, then rebasing and fixing up the branches works fine. A merge commit, on the other hand, keeps the commits at the historical point at which they were made, rather than at the point at which we decide to add them. Hence, it makes sense to merge the develop and latest branches in order to maintain the same history, assuming the commits should in theory have been in both branches.
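
As an illustrative sketch only (the exact direction and timing depend on the release being prepared), a merge-based reconciliation of the two branches might look like:

    git checkout latest
    git merge develop   # the merge commit preserves the shared history of both branches
    git push origin latest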