
Hazelbean

Hazelbean is a collection of geospatial processing tools based on gdal, numpy, scipy, cython, pygeoprocessing, taskgraph, natcap.invest, geopandas and many others to assist in common spatial analysis tasks in sustainability science, ecosystem service assessment, global integrated modelling assessment, natural capital accounting, and/or calculable general equilibrium modelling.

Requirements

  • Python 3.10 or later (Python 3.9 support was dropped as of version 1.4.0 due to NumPy 2.0 compatibility issues)

Hazelbean started as a personal research package of scripts for Justin Johnson and was not originally intended for broad release. However, hazelbean is transitioning toward full support, primarily because it underlies several important software releases, including some from the Natural Capital Project. Thus, even in this transitional state, it is available via "pip install hazelbean". Note that hazelbean only ships a precompiled 64-bit Windows build for Python 3+; with the exception of those precompiled cython files, however, it should be cross-platform and cross-version. The precompiled files are only loaded as needed.

Documentation

Our integrated documentation system provides comprehensive guides and examples:

  • **Getting Started** - Complete setup guide with current project structure
  • **Testing Guide** - Test infrastructure overview
  • **Examples** - Hands-on tutorials and demonstrations

Local Documentation Site

You can serve the full documentation site locally with searchable content, test examples, and live reports:

conda activate hazelbean_env
cd docs-site/quarto-docs
quarto preview  # Visit http://localhost:4848

The local site includes:

  • Progressive learning path with tutorials
  • 50+ test examples showing real-world usage patterns
  • Current test results and performance metrics
  • Searchable content across all documentation

Quick Start

Option 1: Complete Environment (Recommended)

# 1. Clone repository and setup complete environment
git clone https://github.com/jandrewjohnson/hazelbean_dev.git
cd hazelbean_dev

# 2. Create environment from included configuration
mamba env create -f environment.yml
mamba activate hazelbean_env

# 3. Install hazelbean package (builds Cython extensions)
pip install -e . --no-deps

# 4. Verify installation (checks Cython extensions)
python scripts/verify_installation.py

# 5. Try educational examples
cd examples && python step_1_project_setup.py

# 6. Explore documentation locally
cd docs-site/quarto-docs && quarto preview  # Visit http://localhost:4848

Important Notes:

  • Step 3 compiles Cython extensions for your platform (Windows/Mac/Linux)
  • Step 4 verifies everything is working correctly and provides troubleshooting guidance if needed
  • The --no-deps flag prevents pip from reinstalling conda packages (correct for conda+pip hybrid environments)

Windows Users: If Step 3 fails with compiler errors, see the Windows Setup Guide for detailed instructions on installing build tools.

Option 2: Package Only

# Basic installation for using Hazelbean in existing environment
mamba install -c conda-forge natcap.invest geopandas pygeoprocessing taskgraph cython
pip install hazelbean

Next steps: Explore the examples/ directory for guided learning.

Detailed Installation Notes

Prerequisites

Troubleshooting

Cython Compilation Errors (Windows): If you see ImportError: cannot import name 'cython_functions' or compiler errors during installation:

  1. Quick Fix (Recommended): Install conda compiler tools:

    conda activate hazelbean_env
    conda install -c conda-forge m2w64-toolchain libpython
    pip install -e . --no-deps --force-reinstall
  2. Alternative: Install Microsoft Visual Studio Build Tools.

  3. Verify: Run python scripts/verify_installation.py to check if Cython extensions are working

See Windows Setup Guide for detailed troubleshooting.

Numpy Compatibility Issues: If numpy throws "numpy.ndarray size changed, may indicate binary incompatibility" errors, upgrade numpy after installation:

mamba update numpy

See details: https://stackoverflow.com/questions/66060487/valueerror-numpy-ndarray-size-changed-may-indicate-binary-incompatibility-exp
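To confirm that `mamba update numpy` actually changed the installed version, you can read a package's version from its metadata without importing it. The helper below is illustrative, not part of hazelbean:

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(package_name):
    """Return the installed version string for package_name from its
    metadata, without importing the package; None if it is not installed."""
    try:
        return version(package_name)
    except PackageNotFoundError:
        return None

print(installed_version("numpy"))
```

Running this before and after the update makes it easy to see whether the environment solver actually moved numpy.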

macOS Permissions: Your Python environment needs permissions to access and write to the base data folder. Grant necessary permissions in System Preferences if needed.

More information

See the author's personal webpage, https://justinandrewjohnson.com/, for more details about the underlying research.

Project Flow

One key component of Hazelbean is that it manages directories, base_data, etc. using a concept called ProjectFlow. ProjectFlow defines a tree of tasks that can easily be run in parallel where needed while keeping track of task dependencies. ProjectFlow borrows heavily in concept (though not in code) from the taskgraph library produced by Rich Sharp, but adds a predefined file structure suited to research and exploration tasks.

Project Flow notes

Project Flow is intended to ease the common situation where a script grows and grows until you think "oops, I should really make this modular." Thus, it offers several modalities useful to researchers, ranging from a simple drop-in solution to a full scripting framework.
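The core idea, dependency-ordered execution of a task tree, can be sketched in a few lines of plain Python. This is a conceptual toy, not hazelbean's implementation; the real ProjectFlow adds parallel execution, conditional running, and a predefined project directory structure:

```python
from graphlib import TopologicalSorter

def run_task_tree(tasks, deps):
    """Run tasks in dependency order.

    tasks: dict mapping task name -> zero-argument callable.
    deps: dict mapping task name -> iterable of prerequisite task names.
    Returns the list of task results, in execution order.
    """
    order = TopologicalSorter(deps).static_order()  # prerequisites come first
    return [tasks[name]() for name in order]

# Hypothetical tasks standing in for real geospatial processing steps.
tasks = {
    "load": lambda: "load",
    "reproject": lambda: "reproject",
    "summarize": lambda: "summarize",
}
deps = {"load": [], "reproject": ["load"], "summarize": ["reproject"]}
print(run_task_tree(tasks, deps))
```

Here `summarize` can never run before `reproject`, which can never run before `load`; hazelbean applies the same ordering logic across a whole research project.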

Notes

In run.py, initialize the ProjectFlow object. This is the only place where a user-supplied path (absolute or relative) is stated. The resulting object, conventionally named p, is the one global variable used throughout all parts of hazelbean.

import hazelbean as hb

if __name__ == '__main__':
    p = hb.ProjectFlow(r'C:\Files\Research\cge\gtap_invest\projects\feedback_policies_and_tipping_points')

In a multi-file setup, run.py will need to import the other scripts, such as main.py, e.g.:

import visualizations.main

The script file main.py can contain whatever code you like, but in particular it can include "task" functions. A task function, shown below, takes only p as an argument and returns p (potentially modified). It must also contain a conditional (if p.run_this:) separating what always runs just by virtue of being in the task tree (assumed to run trivially fast, e.g. specifying file paths) from what runs only conditionally (based on the task's run attribute, or optionally based on satisfying a completion function).

def example_task_function(p):
    """Fast function that creates several tiny geotiffs of gaussian-like
    kernels for later use in ffn_convolve."""
    # Trivially fast code (e.g. defining file paths) goes here; it runs every time.

    if p.run_this:
        for i in computationally_intensive_loop:  # placeholder for the real work
            print(i)

    return p

Important Non-Obvious Note

Importing the script will define functions that add "tasks"; these take the ProjectFlow object as an argument and return it after potential modification.

def add_all_tasks_to_task_tree(p):
    p.generated_kernels_task = p.add_task(example_task_function)
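The task-function contract described above (take p, guard expensive work behind p.run_this, return p) can be exercised without installing anything, using a plain namespace object standing in for the real ProjectFlow. All names below are illustrative:

```python
from types import SimpleNamespace

def example_task(p):
    """Mimics a hazelbean task function: cheap path wiring always runs,
    expensive work runs only when p.run_this is set."""
    p.kernel_path = "kernels/gaussian.tif"  # cheap: always runs
    if p.run_this:
        p.result = sum(range(10))           # stand-in for the expensive work
    return p

# A bare namespace plays the role of the ProjectFlow object here.
p = SimpleNamespace(run_this=True)
p = example_task(p)
print(p.kernel_path, p.result)
```

Because the path wiring runs unconditionally, downstream tasks can always resolve p.kernel_path even on runs where the expensive step is skipped.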

Creating a New Release

Hazelbean uses a fully automated release pipeline that publishes to both PyPI and conda-forge. The entire process is triggered by creating a GitHub Release.

Quick Steps

  1. Create and push a git tag:

    VERSION="1.7.7"  # Your version number
    git tag -a "v${VERSION}" -m "Release version ${VERSION}"
    git push origin "v${VERSION}"
  2. Create a GitHub Release at https://github.com/jandrewjohnson/hazelbean_dev/releases

    • Select your tag
    • Add release notes
    • Click "Publish release"
  3. Automation takes over:

    • GitHub Actions builds wheels for all platforms (~20 minutes)
    • Automatically uploads to PyPI
    • Updates CHANGELOG.md
    • conda-forge bot detects the release (~24 hours)
    • Review and merge the conda-forge PR
    • conda-forge builds and publishes packages (~2 hours)

Complete Release Documentation

For detailed information about the release process:

No manual PyPI uploads needed! The old manual process with twine is deprecated; everything is automated through GitHub Actions.
