Firedrake meeting 2024 08 07
Date and time: 2024-08-07 1600 BST (1500 UTC)
- Pick Chair and Minuter (DD to pick)
- ALL: (ongoing) triage the open issues and confirm if they are indeed still open (and perhaps provide labels)
- JB: Move pyop3 and TSFC to firedrake and move FInAT to FIAT
- ALL: do things with SV's branches
- DH: Email Andreas to have 2 (+ others!!!) loopy PRs merged. TODO: FIND OUT WHICH PRS THESE ARE
- DH: Get Firedrake a docker open source account (link here)
- DH: Talk to GregVernon about PR#2116.
- DH: Order more Firedrake stickers
- JB: Enable merge queues (minutes)
- Firedrake User Meeting 16-18 September 2024 (registration closes 25 August, abstracts due 18 August)
Present: DH, RK, PB, KS, JB, CC, JM
Apologies: DD (NB picked as minuter)
DH: Submit abstracts please! Before the end of next week (no extensions)
Note: James is planning to attend this meeting.
JM: TAO-based solver for the adjoint for a range of problems (not equality and inequality constraints).
- #PR143: Concerns about deadlock on object finalization. Other changes requested, see reviewer comments on PR.
- #PR3657: Merged independently (since these methods won't be called by anyone). Connor's review needs addressing, but it was decided that this shouldn't block merging as it can be reverted later if need be.
Consider the following:
from firedrake import *
# Example 1:
mesh = UnitSquareMesh(10, 10, comm=COMM_WORLD)
V = FunctionSpace(mesh, "Lagrange", 1)
f = Function(V)
x, y = SpatialCoordinate(mesh)
for _ in range(5):
    f.interpolate(sin(1 + 2*pi*x)*cos(1 + 2*pi*y))
Run twice in a row, this code does not hit the disk cache as it should 🙁. The hashing of FiniteElement in FInAT needs to be fixed so that it is consistent between invocations.
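The disk-cache key has to be derived from something that is reproducible across interpreter invocations: Python's built-in hash() for strings is salted per process (PYTHONHASHSEED), so any cache key built on it will differ between runs. A minimal sketch of the idea, using a hypothetical stable_cache_key helper and assuming repr(element) is a canonical description of the element (this is not FInAT's actual API):

```python
import hashlib

def stable_cache_key(element) -> str:
    # Assumption: repr(element) fully and canonically describes the element.
    # hashlib.sha256 produces the same digest in every process, unlike hash(),
    # which is randomised per interpreter invocation for str/bytes.
    return hashlib.sha256(repr(element).encode()).hexdigest()
```

A key built this way could be used for the disk-cache path, while the in-memory cache can keep using Python's __hash__.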
Consider the following:
from firedrake import *
# Example 2:
# Run on more than 1 rank
from mpi4py import MPI
from pprint import pprint
from pyop2.mpi import hash_comm
ids = set()
def make_a_function(comm):
    mesh = UnitSquareMesh(10, 10, comm=comm)
    ids.add(hash_comm(mesh._comm))
    print(f'CW{COMM_WORLD.rank}: {len(ids) = }')
    V = FunctionSpace(mesh, "Lagrange", 1)
    f = Function(V)
    x, y = SpatialCoordinate(mesh)
    return f.interpolate(sin(1 + 2*pi*x)*cos(1 + 2*pi*y))

for iteration in range(1, 10):
    print(f'CW{COMM_WORLD.rank}: {iteration = }')
    color = 0 if COMM_WORLD.rank < 2 else MPI.UNDEFINED
    comm12 = COMM_WORLD.Split(color=color)
    if COMM_WORLD.rank < 2:
        f = make_a_function(comm12)
        comm12.Free()
    color = 0 if COMM_WORLD.rank > 0 else MPI.UNDEFINED
    comm23 = COMM_WORLD.Split(color=color)
    if COMM_WORLD.rank > 0:
        f = make_a_function(comm23)
        comm23.Free()
Run across 3 MPI ranks, COMM_WORLD.rank == 1 will hit the memory cache, but COMM_WORLD.rank == 2 will not, which produces a deadlock. We need per-communicator memory caches, not global ones. Currently just id(comm) is used instead of a hash. This is fine if communicators are never freed, but then the communicator count limit is eventually exhausted; if communicators are freed, id(comm) can return the same value as a previously freed comm and you get a hash collision. It remains to be seen whether per-comm caches are sufficient to remove the need for comm hashes in the first place.
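One way to get per-communicator caches without hashing the communicator at all is to attach the cache to the communicator itself via MPI attribute caching, so that the cache is discarded when the comm is freed and a freshly created comm always starts empty on every rank. A minimal sketch using mpi4py, with a hypothetical comm_cache helper (illustrative only, not pyop2's actual scheme):

```python
from mpi4py import MPI

# One keyval for the whole process; the attribute value lives on each communicator.
_cache_keyval = MPI.Comm.Create_keyval()

def comm_cache(comm):
    """Return the dict attached to `comm`, creating it on first use."""
    cache = comm.Get_attr(_cache_keyval)
    if cache is None:
        cache = {}
        comm.Set_attr(_cache_keyval, cache)
    return cache
```

Because the attribute disappears when the communicator is freed, id(comm) reuse after freeing no longer matters, and in the example above all ranks of a newly split comm should hit or miss the cache together.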
Note that PRs put in this section should either be trivial or already have been reviewed. Discussion-worthy PRs should be separate agenda items.
Date of next meeting: 2024-08-21, 1600 BST (1500 UTC)