PT Integration (#476)
* Models with no centroids (#472)

* Bumps up version for release

* Bumps up version for release

* docs

---------

Co-authored-by: pveigadecamargo <[email protected]>

* Better network skimming and path computation headers and typo fix (#474)

* Integration scaffolding

* Transit assignment

* Change "id" variable naming to avoid double underscores

Double underscore variables are difficult to inherit without duplicating. I've tried my best to group all these renames
into a single commit so they are revertible, but some others are deep within a refactor.

This changes the convention from `__id__` and `__graph_id__` to `_id` and `_graph_id`. Since these variables need to
be accessible to other methods, they are not private. The double underscore worked previously because there was only
ever one class with these variables, so the name mangling was always correct.
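A minimal sketch of the name-mangling pitfall described above, using hypothetical `Graph`/`TransitGraph` classes (Python mangles attributes with two leading underscores and at most one trailing underscore, so the pitfall bites names like `__id`):

```python
class Graph:
    def __init__(self):
        # Inside Graph, __id is textually rewritten to _Graph__id.
        self.__id = "graph-id"

    def describe(self):
        # Also resolves to _Graph__id.
        return self.__id


class TransitGraph(Graph):
    def reset(self):
        # Inside the subclass, __id mangles to _TransitGraph__id,
        # which is a *different* attribute from the one the parent stored.
        self.__id = "new-id"


g = TransitGraph()
g.reset()
print(g.describe())         # still "graph-id" -- the parent's slot is untouched
print(g._TransitGraph__id)  # "new-id" -- the subclass created a second slot
```

Because each class writes to its own mangled slot, a subclass cannot update the attribute the parent reads, which is why the single-underscore convention inherits cleanly.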

* Create GraphBase and TransitGraph classes

* Create TransportClassBase and TransitClass classes

* Create AssignmentBase and TransitAssignment classes

* Create AssignmentResultsBase and AssignmentResultsTransit classes

* Comments, add OptimalStrategies class

* Incomplete (full) workflow

* fixup! Incomplete (full) workflow

* fixes export of omx matrices (#484)

Co-authored-by: pveigadecamargo <[email protected]>

* Move periods table

* Lots of changes

Limit graph loading and saving to a single graph for now
Flush out results API, including saving and loading
Move unrelated variables and methods

* Add Periods and Period classes, similar to Nodes and Links classes

* Remove period from database saving for now

* Propagate period ids and various changes

* Raise nicer errors

* Style

* removes use of unnecessary sqlite3 connection cursors and cached connections (#478)

* removes use of unnecessary sqlite3 connection cursors

* removes use of unnecessary sqlite3 connection cursors and saved connections

* removes silly use of a cursor

* improves on connection styles

* improves on connection styles

* improves on connection styles

* improves on connection styles

* improves on connection styles

* improves on connection styles

* improves on connection styles

* improves on connection styles

* improves on connection styles

* improves on connection styles

* improves on connection styles

---------

Co-authored-by: pveigadecamargo <[email protected]>

* Migrate docs to new API, fix reloading graphs and begin updating tests.

* Style and build errors

* Typing

* Revert changes to HyperpathGenerating and tests, use arrays instead

* Exclude abstract methods from coverage tests

* Period, Periods, TransitGraph, and TransitGraphBuilder tests

* Style

* Cleans graph creation (#482)

Co-authored-by: pveigadecamargo <[email protected]>

* Adds support page (#481)

* updates examples

* updates examples

* Lower coverage requirement, typos

* Remove patches

* updates examples

* fixup! Remove patches

* Missing var

* Deprecates Python 3.7

* Fixes documentation

* Fixes documentation

* code cleaning

* code cleaning

* code cleaning

* updates setup classifiers

* Fixes centroid assignment

---------

Co-authored-by: Pedro Camargo <[email protected]>
Co-authored-by: pveigadecamargo <[email protected]>
3 people authored Dec 16, 2023
1 parent 0c853dc commit e0a0b55
Showing 93 changed files with 2,368 additions and 1,173 deletions.
5 changes: 4 additions & 1 deletion .coveragerc
Original file line number Diff line number Diff line change
@@ -1,3 +1,6 @@
[report]
fail_under = 81.0
fail_under = 75
show_missing = True
exclude_lines =
pragma: no cover
@abstract
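The `@abstract` pattern added above works because coverage.py excludes any line matching an `exclude_lines` regex, and a match on a decorator line causes the whole decorated function to be excluded from the report. A sketch with hypothetical class names:

```python
from abc import ABC, abstractmethod


class TransportClassBase(ABC):  # hypothetical names for illustration
    @abstractmethod             # this line matches the '@abstract' regex,
    def results(self):          # so coverage.py skips the method body too
        raise NotImplementedError


class TransitClass(TransportClassBase):
    def results(self):
        # Concrete override: this body IS measured by coverage.
        return {"boardings": 0}


print(TransitClass().results())  # -> {'boardings': 0}
```

The abstract body can never execute (instantiating `TransportClassBase` raises `TypeError`), so excluding it keeps the `fail_under` threshold honest.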
38 changes: 0 additions & 38 deletions .github/tests_linux.yml

This file was deleted.

2 changes: 1 addition & 1 deletion .github/workflows/build_linux.yml
@@ -21,7 +21,7 @@ jobs:
- name: Build manylinux Python wheels
uses: RalfG/[email protected]
with:
python-versions: 'cp37-cp37m cp38-cp38 cp39-cp39 cp310-cp310 cp311-cp311 cp312-cp312'
python-versions: 'cp38-cp38 cp39-cp39 cp310-cp310 cp311-cp311 cp312-cp312'
pip-wheel-args: '--no-deps'

- name: Moves wheels
2 changes: 1 addition & 1 deletion .github/workflows/build_mac.yml
@@ -14,7 +14,7 @@ jobs:
continue-on-error: true
strategy:
matrix:
python-version: [ '3.7', '3.8', '3.9', '3.10', '3.11', '3.12']
python-version: ['3.8', '3.9', '3.10', '3.11', '3.12']
steps:
- uses: actions/checkout@v3
- name: Set Python environment
2 changes: 1 addition & 1 deletion .github/workflows/build_windows.yml
@@ -10,7 +10,7 @@ jobs:
continue-on-error: true
strategy:
matrix:
python-version: [ '3.7', '3.8', '3.9', '3.10', '3.11', '3.12']
python-version: ['3.8', '3.9', '3.10', '3.11', '3.12']
architecture: ['x64']
steps:
- uses: actions/checkout@v3
2 changes: 1 addition & 1 deletion .github/workflows/debug_tests.yml
@@ -6,7 +6,7 @@ jobs:

testing:
runs-on: ubuntu-20.04
container: python:3.7
container: python:3.9
steps:
- uses: actions/checkout@v3
- name: Install dependencies
2 changes: 1 addition & 1 deletion .github/workflows/test_linux_with_coverage.yml
@@ -9,7 +9,7 @@ jobs:
HAS_SECRETS: ${{ secrets.AWS_SECRET_ACCESS_KEY != '' }}
strategy:
matrix:
python-version: [3.9]
python-version: [3.10]
steps:
- uses: actions/checkout@v3
- name: Install dependencies
2 changes: 1 addition & 1 deletion .github/workflows/unit_tests.yml
@@ -31,7 +31,7 @@ jobs:
runs-on: ${{ matrix.os}}
strategy:
matrix:
python-version: [ '3.7', '3.8', '3.9', '3.10', '3.11', '3.12']
python-version: ['3.8', '3.9', '3.10', '3.11', '3.12']
os: [windows-latest, ubuntu-latest]

max-parallel: 20
70 changes: 3 additions & 67 deletions README.md
@@ -15,84 +15,20 @@
[![QAequilibraE artifacts](https://github.com/AequilibraE/aequilibrae/actions/workflows/build_artifacts_qgis.yml/badge.svg)](https://github.com/AequilibraE/aequilibrae/actions/workflows/build_artifacts_qgis.yml)


AequilibraE is the first comprehensive Python package for transportation modeling. It aims to provide all the
AequilibraE is a fully-featured Open-Source transportation modeling package and the first comprehensive package
of its kind for the Python ecosystem. It aims to provide all the fundamental transport modeling
resources not available from other open-source packages in the Python (NumPy, really) ecosystem.

## Comprehensive documentation

[AequilibraE documentation built with Sphinx](http://www.aequilibrae.com)

## What is available

* Importing networks from OSM
* Synthetic gravity/IPF
* Traffic assignment (All-or-Nothing, MSA, Frank-Wolfe, Conjugate Frank-Wolfe & Biconjugate Frank-Wolfe)
* Network Skimming & node-to-node path computation
* Fast Matrix format based on NumPy
* GTFS Import

### What is available only in QGIS

Some common resources for transportation modeling are inherently visual, and therefore they make more sense if
available within a GIS platform. For that reason, many resources are available only from AequilibraE's
[QGIS plugin](http://plugins.qgis.org/plugins/qaequilibrae/),
which uses AequilibraE as its computational workhorse and also provides GUIs for most of AequilibraE's tools. Said tool
is developed independently, although in parallel, and more details can be found in its
is developed independently and lags slightly behind the Python package, and more details can be found in its
[GitHub repository](https://github.com/AequilibraE/qaequilibrae).


### What is not planned to be available any time soon

As AequilibraE's focus is to provide resources that are not yet available in the open-source world, particularly the
Python ecosystem, some important tools for transportation modeling won't be part of AequilibraE any time soon. Examples
of this are:

* Discrete choice models - [BIOGEME](http://biogeme.epfl.ch), [LARCH](http://larch.newman.me)

* Activity-Based models - [ActivitySim](http://www.activitysim.org)

## History

Before there was AequilibraE, there was a need for something like AequilibraE out there.

### The very early days

It all started when I was a student at [UCI-ITS](www.its.uci.edu) and needed low-level access to the outputs of standard
algorithms used in transportation modeling (e.g. path files from traffic assignment), but was denied that access by the
maker of the commercial software I normally used. There, the [first scratch of a traffic assignment procedure](www.xl-optim.com/python-traffic-assignment) was born.
After that, a couple of scripts implementing synthetic gravity models (calibration and application)
were developed for a government think tank in Brazil, [IPEA](www.ipea.gov.br).
Around the same time, another student needed a piece of code that transformed a GIS link layer into a proper graph,
where each link would become the connection between two nodes.
So there were three fundamental pieces that would come to be part of AequilibraE.

### The first take on a release software

Having all those algorithms at hand, it made sense to combine them into something more people could use, and by then it
seemed that QGIS was the way to go, so I developed the [very first version of AequilibraE](http://www.xl-optim.com/introducing_aequilibrae).

It was buggy as hell and there was very little, if any, software engineering built into it, but it put AequilibraE on
the map.

### The first reasonable version

The first important thing I noticed after releasing AequilibraE was that the code was written in a procedural style, even
though it would make a lot more sense to do it in an Object-Oriented fashion, which led me down the path of creating the
objects (graph, assignment results, matrix) that the software still relies on and that were the foundation blocks of the
proper API that is in the making. That [version was released in 2016](http://www.xl-optim.com/new-version-of-aequilibrae).

### Evolving into proper software


A few distinct improvements deserve to be highlighted.

* The separation of the GUI and the Python library in [two repositories](http://www.xl-optim.com/separating-the-women-from-the-girls)
* Introduction of Unit Tests and automatic testing using [Travis (replaced with GitHub Actions)](https://travis-ci.org/AequilibraE/aequilibrae)
* Welcoming new collaborators: Jamie Cook, Andrew O'Brien, Yu-Chu Huang & Jan Zill
* Introduction of style-checking with Flake8 and Black
* Development of proper documentation and a recommended development virtual environment

# QGIS Plugin

The QGIS plugin is developed on a separate repository: [QGIS GUI](https://github.com/AequilibraE/qaequilibrae)
That is where everything started.
6 changes: 3 additions & 3 deletions __version__.py
@@ -1,5 +1,5 @@
version = 0.9
minor_version = "5"
release_name = "Queluz"
version = 1.0
minor_version = "0"
release_name = "Rio de Janeiro"

release_version = f"{version}.{minor_version}"
2 changes: 1 addition & 1 deletion aequilibrae/matrix/aequilibrae_matrix.py
@@ -784,7 +784,7 @@ def export(self, output_name: str, cores: List[str] = None):

def f(name):
if self.__omx:
coo = np.array(self.omx_file[name])
coo = coo_matrix(np.array(self.omx_file[name]))
else:
coo = coo_matrix(self.matrix[name])
data = {"row": self.index[coo.row], "column": self.index[coo.col], name: coo.data}
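The one-line fix above matters because the exporter immediately reads `coo.row`, `coo.col`, and `coo.data`, which exist on a SciPy `coo_matrix` but not on the plain NumPy array read back from an OMX file. A small sketch (the `demand` core name and zone IDs are made up for illustration):

```python
import numpy as np
from scipy.sparse import coo_matrix

# A dense matrix core, as np.array(omx_file[name]) would return it.
dense = np.array([[0.0, 5.0],
                  [3.0, 0.0]])

# Wrapping it in coo_matrix exposes the (row, col, value) triplets
# of the non-zero entries, which the exporter turns into a table.
coo = coo_matrix(dense)

index = np.array([101, 102])  # hypothetical zone IDs
data = {"row": index[coo.row], "column": index[coo.col], "demand": coo.data}
print(data)  # row [101 102], column [102 101], demand [5. 3.]
```

Without the wrap, the old code handed the dense array straight to this triplet-based export path and failed for OMX-backed matrices.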
4 changes: 2 additions & 2 deletions aequilibrae/paths/AoN.pyx
@@ -187,7 +187,7 @@ def path_computation(origin, destination, graph, results):
dest = destination
origin_index = graph.nodes_to_indices[orig]
dest_index = graph.nodes_to_indices[dest]
if results.__graph_id__ != graph.__id__:
if results._graph_id != graph._id:
raise ValueError("Results object not prepared. Use --> results.prepare(graph)")


@@ -392,7 +392,7 @@ def skimming_single_origin(origin, graph, result, aux_result, curr_thread):
origin_index = graph.compact_nodes_to_indices[orig]

graph_fs = graph.compact_fs
if result.__graph_id__ != graph.__id__:
if result._graph_id != graph._id:

raise ValueError("Results object not prepared. Use --> results.prepare(graph)")

7 changes: 3 additions & 4 deletions aequilibrae/paths/__init__.py
@@ -4,11 +4,10 @@
from aequilibrae.paths.network_skimming import NetworkSkimming
from aequilibrae.paths.all_or_nothing import allOrNothing
from aequilibrae.paths.assignment_paths import AssignmentPaths
from aequilibrae.paths.traffic_class import TrafficClass
from aequilibrae.paths.traffic_assignment import TrafficAssignment
from aequilibrae.paths.traffic_class import TrafficClass, TransitClass
from aequilibrae.paths.traffic_assignment import TrafficAssignment, TransitAssignment
from aequilibrae.paths.vdf import VDF
from aequilibrae.paths.graph import Graph
from aequilibrae.paths.public_transport import HyperpathGenerating
from aequilibrae.paths.graph import Graph, TransitGraph

from aequilibrae import global_logger

2 changes: 1 addition & 1 deletion aequilibrae/paths/all_or_nothing.py
@@ -39,7 +39,7 @@ def __init__(self, matrix, graph, results):
self.report = []
self.cumulative = 0

if results.__graph_id__ != graph.__id__:
if results._graph_id != graph._id:
raise ValueError("Results object not prepared. Use --> results.prepare(graph)")

elif matrix.matrix_view is None:
66 changes: 42 additions & 24 deletions aequilibrae/paths/graph.py
@@ -1,15 +1,18 @@
from os.path import join
import pickle
import uuid
from abc import ABC
from datetime import datetime
from typing import List, Tuple
from os.path import join
from typing import List, Tuple, Optional

import numpy as np
import pandas as pd
from aequilibrae.context import get_logger
from aequilibrae.paths.AoN import build_compressed_graph

from aequilibrae.context import get_logger


class Graph(object):
class GraphBase(ABC):
"""
Graph class
"""
@@ -75,7 +78,7 @@ def __init__(self, logger=None):
self.g_link_crosswalk = np.array([]) # 4 a link ID in the BIG graph, a corresponding link in the compressed 1

# Randomly generate a unique Graph ID
self.__id__ = uuid.uuid4().hex
self._id = uuid.uuid4().hex

def default_types(self, tp: str):
"""
@@ -91,7 +94,7 @@ def default_types(self, tp: str):
else:
raise ValueError("It must be either a int or a float")

def prepare_graph(self, centroids: np.ndarray) -> None:
def prepare_graph(self, centroids: Optional[np.ndarray]) -> None:
"""
Prepares the graph for a computation for a certain set of centroids
@@ -109,17 +112,18 @@ def prepare_graph(self, centroids: np.ndarray) -> None:

# Creates the centroids

if centroids is None or not isinstance(centroids, np.ndarray):
raise ValueError("Centroids need to be a NumPy array of integers 64 bits")
if not np.issubdtype(centroids.dtype, np.integer):
raise ValueError("Centroids need to be a NumPy array of integers 64 bits")
if centroids.shape[0] == 0:
raise ValueError("You need at least one centroid")
if centroids.min() <= 0:
raise ValueError("Centroid IDs need to be positive")
if centroids.shape[0] != np.unique(centroids).shape[0]:
raise ValueError("Centroid IDs are not unique")
self.centroids = np.array(centroids, np.uint32)
if centroids is not None:
if not np.issubdtype(centroids.dtype, np.integer):
raise ValueError("Centroids need to be a NumPy array of integers 64 bits")
if centroids.shape[0] == 0:
raise ValueError("You need at least one centroid")
if centroids.min() <= 0:
raise ValueError("Centroid IDs need to be positive")
if centroids.shape[0] != np.unique(centroids).shape[0]:
raise ValueError("Centroid IDs are not unique")
self.centroids = np.array(centroids, np.uint32)
else:
self.centroids = np.array([], np.uint32)

self.network = self.network.astype(
{
@@ -130,7 +134,7 @@ def prepare_graph(self, centroids: np.ndarray) -> None:
}
)

properties = self.__build_directed_graph(self.network, centroids)
properties = self._build_directed_graph(self.network, self.centroids)
self.all_nodes, self.num_nodes, self.nodes_to_indices, self.fs, self.graph = properties

# We generate IDs that we KNOW will be constant across modes
@@ -141,16 +145,17 @@ def prepare_graph(self, centroids: np.ndarray) -> None:
self.num_links = self.graph.shape[0]
self.__build_derived_properties()

self.__build_compressed_graph()
self.compact_num_links = self.compact_graph.shape[0]
if self.centroids.shape[0]:
self.__build_compressed_graph()
self.compact_num_links = self.compact_graph.shape[0]

def __build_compressed_graph(self):
build_compressed_graph(self)

# We build a groupby to save time later
self.__graph_groupby = self.graph.groupby(["__compressed_id__"])

def __build_directed_graph(self, network: pd.DataFrame, centroids: np.ndarray):
def _build_directed_graph(self, network: pd.DataFrame, centroids: np.ndarray):
all_titles = list(network.columns)

not_pos = network.loc[network.direction != 1, :]
@@ -236,7 +241,7 @@ def exclude_links(self, links: list) -> None:
if self.centroids is not None:
self.prepare_graph(self.centroids)
self.set_blocked_centroid_flows(self.block_centroid_flows)
self.__id__ = uuid.uuid4().hex
self._id = uuid.uuid4().hex

def __build_column_names(self, all_titles: List[str]) -> Tuple[list, list]:
fields = [x for x in self.required_default_fields]
@@ -380,7 +385,7 @@ def save_to_disk(self, filename: str) -> None:
mygraph["skim_fields"] = self.skim_fields
mygraph["block_centroid_flows"] = self.block_centroid_flows
mygraph["centroids"] = self.centroids
mygraph["graph_id"] = self.__id__
mygraph["graph_id"] = self._id
mygraph["mode"] = self.mode

with open(filename, "wb") as f:
@@ -410,7 +415,7 @@ def load_from_disk(self, filename: str) -> None:
self.skim_fields = mygraph["skim_fields"]
self.block_centroid_flows = mygraph["block_centroid_flows"]
self.centroids = mygraph["centroids"]
self.__id__ = mygraph["graph_id"]
self._id = mygraph["graph_id"]
self.mode = mygraph["mode"]
self.__build_derived_properties()

@@ -481,3 +486,16 @@ def save_compressed_correspondence(self, path, mode_name, mode_id):
self.graph.to_feather(graph_path)
node_path = join(path, f"nodes_to_indices_c{mode_name}_{mode_id}.feather")
pd.DataFrame(self.nodes_to_indices, columns=["node_index"]).to_feather(node_path)


class Graph(GraphBase):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)


class TransitGraph(GraphBase):
def __init__(self, config: dict = None, od_node_mapping: pd.DataFrame = None, *args, **kwargs):
super().__init__(*args, **kwargs)
self._config = config
self.od_node_mapping = od_node_mapping
self.mode = "t"
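The `prepare_graph` hunk above makes centroids optional, supporting the "Models with no centroids" change from #472: `None` now yields an empty centroid array instead of an error, and the compressed graph is only built when centroids exist. A standalone sketch of that validation logic, with a hypothetical helper name:

```python
import numpy as np


def validate_centroids(centroids):
    """Hypothetical standalone version of the checks in GraphBase.prepare_graph."""
    if centroids is None:
        # Graphs without centroids are now allowed.
        return np.array([], np.uint32)
    if not np.issubdtype(centroids.dtype, np.integer):
        raise ValueError("Centroids need to be a NumPy array of integers")
    if centroids.shape[0] == 0:
        raise ValueError("You need at least one centroid")
    if centroids.min() <= 0:
        raise ValueError("Centroid IDs need to be positive")
    if centroids.shape[0] != np.unique(centroids).shape[0]:
        raise ValueError("Centroid IDs are not unique")
    return np.array(centroids, np.uint32)


print(validate_centroids(None).size)            # 0 -- no centroids, no error
print(validate_centroids(np.array([1, 2, 3])))  # [1 2 3]
```

Downstream, an empty centroid array simply skips `__build_compressed_graph`, which is what lets transit graphs reuse the same base class.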