
TTool is software designed for augmented reality (AR) assisted woodworking tasks. It enables AI-assisted, real-time detection of a tool head's type and pose within the camera frame.

🌲 TTool is developed at the Laboratory for Timber Construction (director: Prof. Yves Weinand) with the support of the EPFL Center for Imaging and SCITAS, at EPFL, Lausanne, Switzerland. The project is part of the Augmented Carpentry research.

🪚 TTool is an open-source, AI-powered, supervised 6DoF detector for monocular cameras. It is developed in C++ for UNIX systems to allow accurate end-effector detection during woodworking operations such as cutting, drilling, sawing, and screwing with multiple tools. This is a fundamental component of any subtractive AR fabrication system: it makes it possible, for instance, to compute and give users feedback on the correct orientation and depth at which to start and finish a hole or a cut.

🖧 TTool is an AI-assisted 6DoF pose detector that automatically recognizes tools and lets the user input an initial pose via an AR manipulator. The pose is then refined by a modified version of SLET (check out our changelog) and visualized as a projection onto the camera feed.

TTool can be imported as a C++ API into a third-party project or used as a standalone executable. It is tailored to our specific use case in timber carpentry; see the Caveats section below to adapt it to yours.
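As a rough illustration of the API route, embedding TTool in a host application could look something like the sketch below. The header name, the ttool::TTool class, and its constructor are placeholders assumed here for illustration only, not the actual interface; the Wiki documents the real one. A fuller per-frame loop is sketched under "How it works" below.

#include <opencv2/core.hpp>
#include <ttool.hh>  // assumed public header name, for illustration only

int main()
{
    // Assumed entry point, constructed from the same YAML config used by the executable
    // (camera intrinsics, model files, classifier weights).
    ttool::TTool tracker("assets/config.yml");

    cv::Mat frame;   // frames come from your own camera pipeline
    // ... per-frame classification, pose refinement, and projection would go here
    //     (see the sketch under "How it works" below).
    return 0;
}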

🚀 For a quick hands-on start or more details, check out our Wiki.


Publication

TTool is published as a journal paper in MDPI Applied Sciences, which you can find here.

@article{Settimi2024,
  title = {TTool: A Supervised Artificial Intelligence-Assisted Visual Pose Detector for Tool Heads in Augmented Reality Woodworking},
  volume = {14},
  ISSN = {2076-3417},
  url = {http://dx.doi.org/10.3390/app14073011},
  DOI = {10.3390/app14073011},
  number = {7},
  journal = {Applied Sciences},
  publisher = {MDPI AG},
  author = {Settimi,  Andrea and Chutisilp,  Naravich and Aymanns,  Florian and Gamerro,  Julien and Weinand,  Yves},
  year = {2024},
  month = apr,
  pages = {3011}
}

How it works


  • a: the ML classifier detects the tool type from the camera feed and loads the corresponding 3D model.
  • b: the user inputs an initial pose of the tool via an AR manipulator.
  • c: the pose is refined with an edge-based algorithm.
  • d: the pose is projected onto the camera buffer and displayed to the user.
  • e: the user can now start the operation guided by computed feedback.
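Putting these steps together, a per-frame loop in a host application could look roughly like the sketch below. Only the OpenCV calls are real; the ttool::TTool object and its Classify/RefinePose/Project methods are illustrative assumptions standing in for the actual API documented in the Wiki.

#include <opencv2/opencv.hpp>
#include <ttool.hh>  // assumed header name, as in the sketch above

int main()
{
    cv::VideoCapture camera(0);                  // monocular camera feed
    ttool::TTool tracker("assets/config.yml");   // assumed constructor taking the YAML config

    cv::Mat frame;
    while (camera.read(frame))
    {
        // (a) the classifier recognizes the tool head and loads its 3D model
        auto tool = tracker.Classify(frame);     // assumed call

        // (b) + (c) the user's initial pose (set once via the AR manipulator)
        // is refined every frame by the modified SLET edge-based tracker
        auto pose = tracker.RefinePose(frame);   // assumed call

        // (d) the model is rendered with the refined pose onto the camera buffer
        tracker.Project(frame, pose);            // assumed call

        // (e) the overlaid frame guides the user through the operation
        cv::imshow("TTool", frame);
        if (cv::waitKey(1) == 27) break;         // ESC to quit
    }
    return 0;
}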


On the left, the user can select the tool type and input an initial pose. On the right, the pose is refined and projected onto the camera feed. The alignment between the digital twin and the chainsaw plate (or any other tool) is preserved even when the tool is occluded or inside the wood.

Caveats

TTool was tailored to our specific use case. To adapt it to yours, you will need to change the following files:

  • CMakeLists.txt: comment out the line include(cmake/dataset.cmake); the build will then no longer fetch the models from Zenodo, so you will have to provide the models yourself (see the wiki on how to do it).
  • assets/config.yml: list the models you want to use and their paths by replacing these lines:

    TTool/assets/config.yml

    Lines 57 to 66 in b357383

    modelFiles:
    - "assets/toolheads/saber_saw_blade_makita_t_300/model.obj"
    - "assets/toolheads/twist_drill_bit_32_165/model.obj"
    - "assets/toolheads/circular_saw_blade_makita_190/model.obj"
    - "assets/toolheads/chain_saw_blade_f_250/model.obj"
    - "assets/toolheads/auger_drill_bit_20_235/model.obj"
    - "assets/toolheads/brad_point_drill_bit_20_150/model.obj"
    - "assets/toolheads/spade_drill_bit_25_150/model.obj"
    - "assets/toolheads/self_feeding_bit_40_90/model.obj"
    - "assets/toolheads/self_feeding_bit_50_90/model.obj"
    Be sure to erase these lines specific to our use case:

    TTool/assets/config.yml

    Lines 67 to 76 in b357383

    acitFiles:
    - "assets/toolheads/saber_saw_blade_makita_t_300/metadata.acit"
    - "assets/toolheads/twist_drill_bit_32_165/metadata.acit"
    - "assets/toolheads/circular_saw_blade_makita_190/metadata.acit"
    - "assets/toolheads/chain_saw_blade_f_250/metadata.acit"
    - "assets/toolheads/auger_drill_bit_20_235/metadata.acit"
    - "assets/toolheads/brad_point_drill_bit_20_150/metadata.acit"
    - "assets/toolheads/spade_drill_bit_25_150/metadata.acit"
    - "assets/toolheads/self_feeding_bit_40_90/metadata.acit"
    - "assets/toolheads/self_feeding_bit_50_90/metadata.acit"
  • ML classifier: to adapt the ML classifier to your use case, you will need to train your own model. We have a template in this repo.

Acknowledgement

This project was made possible thanks to the technical support and counseling of the EPFL Center for Imaging, in the person of Florian Aymanns. We would also like to acknowledge the help of Nicolas Richart with the CMake project and CI development of TTool. Check out their GitHub organization to discover other nice projects they are helping build!