Commit
cleanup code
yxjiang committed Feb 25, 2024
1 parent 6392e69 commit d412950
Showing 11 changed files with 268 additions and 288 deletions.
52 changes: 52 additions & 0 deletions README.cn.md
@@ -0,0 +1,52 @@
# mnlm
Source code of the robotic arm demo (see https://www.bilibili.com/video/BV1ub4y1T7Jt/).

[![IMAGE ALT TEXT HERE](./images/screen.png)](https://www.bilibili.com/video/BV1ub4y1T7Jt/?vd_source=08295b5b4b3c5ece73fb91e3a54d202a)

## Build and Start the Docker Container

1. Install [Docker](https://docs.docker.com/get-docker/) and [Docker Compose](https://docs.docker.com/compose/install/).
```bash
docker-compose up --build -d
```

This command builds the docker image for the server side and starts the docker container.
You can also access the simulation environment in a browser at http://localhost:8080/vnc.html.

![IMAGE ALT TEXT HERE](./images/noVNC.png)

2. Log in to the docker container.
If you use VSCode, you can install the Dev Container extension and attach to the running container. Otherwise, you can log in to the docker container by running the following command:
```bash
docker exec -it mnln-ros_dev_env-1 /bin/bash
```

3. Start the ROS2 simulation.

```bash
cd /home/small-thinking/mnlm/mnlm/robot/robot_arm_ws
```

```bash
colcon build --symlink-install ; source install/setup.bash ; ros2 launch robot_arm robot_arm.launch.py
```

You should see the server-side program start, and you can visit http://localhost:8080/vnc.html to see the Gazebo Fortress simulation environment.

![IMAGE ALT TEXT HERE](./images/gazebo.png)

4. On your host machine, run the following commands to start the voice-based UI.

Enter the project folder:
```bash
cd mnlm/client/gpt_control
```

```bash
python assistant.py
```
You will then see the client side start, and you will be prompted to tell the robot what to do.

![IMAGE ALT TEXT HERE](./images/voice.png)
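As a rough sketch of how the voice UI could map a recognized phrase to servo operations: the function name and the in-memory bank below are illustrative assumptions, mirroring the format of `mnlm/client/knowledge/command_bank.json` rather than the project's actual API.

```python
# Illustrative sketch only: map a recognized phrase to its operation list,
# following the command_bank.json format. Names here are assumptions.
COMMAND_BANK = {
    "shake the body": {
        "operations": [
            {"operation": "move_all_servos",
             "parameters": {"angles": [45, 60, 60, 30, 0, 0, 0], "time": 500}},
            {"operation": "move_all_servos",
             "parameters": {"angles": [0, 0, 0, 0, 0, 0, 0], "time": 500}},
        ]
    }
}

def lookup_operations(phrase: str) -> list:
    """Return the servo operations for a verbal command, or [] if unknown."""
    entry = COMMAND_BANK.get(phrase.strip().lower())
    return entry["operations"] if entry else []
```

An exact-match lookup like this only works for known phrases; free-form speech would need the fuzzier retrieval discussed in the English README's TODOs.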


103 changes: 26 additions & 77 deletions README.md
@@ -1,103 +1,52 @@
# mnlm
Source code of robotic arm demo (see https://www.bilibili.com/video/BV1ub4y1T7Jt/).
# mnlm ([中文文档](README.cn.md))

TODOs:
1. Restructure the data format of the robot arm servo control.
Source code of robotic arm demo (see https://www.bilibili.com/video/BV1ub4y1T7Jt/).


[![IMAGE ALT TEXT HERE](./images/screen.png)](https://www.bilibili.com/video/BV1ub4y1T7Jt/?vd_source=08295b5b4b3c5ece73fb91e3a54d202a)

## Build
```bash
docker build -t mnlm .
```

## Development
You can run docker in the background and log in to the container to do development.

0. Install the [Remote - Containers](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-containers) extension in VSCode.
1. Start the docker container:
```bash
docker-compose up -d
```
2. In VSCode, click the "Open the remote window" button on the bottom left corner, and select "Remote-Containers: Attach to Running Container..."
3. Select the container you just started; it should be named something like `mnln-ros_dev_env-1`. A new VSCode window will pop up.
4. Open the folder `/home/small-thinking/mnlm` in the new VSCode window, and you can start the development.
## Build and Start the Docker Container

### Start rviz2
rviz2 is a visualization tool for ROS2. You can use it to visualize the robots.
Open a terminal in the docker container, and run the following command:
1. Install [Docker](https://docs.docker.com/get-docker/) and [Docker Compose](https://docs.docker.com/compose/install/).
```bash
root@809c4c72ba40:~# ros2 run rviz2 rviz2

QStandardPaths: XDG_RUNTIME_DIR not set, defaulting to '/tmp/runtime-root'
[INFO] [1703722219.404846429] [rviz2]: Stereo is NOT SUPPORTED
[INFO] [1703722219.404925804] [rviz2]: OpenGl version: 4.5 (GLSL 4.5)
[INFO] [1703722219.659990970] [rviz2]: Stereo is NOT SUPPORTED
docker-compose up --build -d
```
Then in your host machine, open a browser and go to `http://localhost:8080/`. You will see the rviz2 file list.

![IMAGE ALT TEXT HERE](./images/rviz2-files.png "rviz2")
This command builds the docker image for the server side and starts the docker container.
Users can also access the simulation environment through the browser by visiting `http://localhost:8080/vnc.html`.
![IMAGE ALT TEXT HERE](./images/noVNC.png)

Click vnc.html, and then click the "Connect" button. You will see the rviz2 UI.

### Start RQt
RQt is a graphical user interface framework that implements various tools and interfaces in the form of plugins. One can run all the existing GUI tools as dockable windows within RQt.

Open a new terminal in the docker container, and run the following command:
2. Log in to the docker container.
If you use VSCode, you can install the `Dev Container` extension and attach to the running container. Otherwise, you can log in to the docker container by running the following command:
```bash
root@809c4c72ba40:~/mnlm# rqt

QStandardPaths: XDG_RUNTIME_DIR not set, defaulting to '/tmp/runtime-root'
docker exec -it mnln-ros_dev_env-1 /bin/bash
```
You will see the RQt added to the rviz2 UI.

### Test with sample programs

#### Test with demo_nodes_cpp
Run the demo_node_cpp talker:
3. Start the ROS2 simulation.
```bash
source /opt/ros/humble/setup.bash
ros2 run demo_nodes_cpp talker
cd /home/small-thinking/mnlm/mnlm/robot/robot_arm_ws
```

In another terminal, run the demo_node_py listener:
```bash
source /opt/ros/humble/setup.bash
ros2 run demo_nodes_py listener
```
#### Test with turtlesim
Run turtlesim_node:
```bash
root@809c4c72ba40:~/mnlm# ros2 run turtlesim turtlesim_node
QStandardPaths: XDG_RUNTIME_DIR not set, defaulting to '/tmp/runtime-root'
[INFO] [1703722372.104943222] [turtlesim]: Starting turtlesim with node name /turtlesim
[INFO] [1703722372.108501763] [turtlesim]: Spawning turtle [turtle1] at x=[5.544445], y=[5.544445], theta=[0.000000]
colcon build --symlink-install ; source install/setup.bash ; ros2 launch robot_arm robot_arm.launch.py
```

In another terminal, run turtle_teleop_key:
```bash
root@809c4c72ba40:~/mnlm# ros2 run turtlesim turtle_teleop_key
You should see the server-side program start, and you can visit `http://localhost:8080/vnc.html` to see the Gazebo Fortress simulation environment.
![IMAGE ALT TEXT HERE](./images/gazebo.png)

Reading from keyboard
---------------------------
Use arrow keys to move the turtle.
Use G|B|V|C|D|E|R|T keys to rotate to absolute orientations. 'F' to cancel a rotation.
'Q' to quit.
```
![IMAGE ALT TEXT HERE](./images/turtlesim.png "turtlesim")

4. On your host machine, run the commands below to start the voice-based UI.

![IMAGE ALT TEXT HERE](./images/voice.png)

## Cleanup untagged images
```bash
docker rm $(docker ps -a -q) ; docker images | grep '<none>' | awk '{print $3}' | xargs docker rmi
```



In the project folder:
```bash
cd mnlm/client/gpt_control
```
You will then see the client side start, and you will be prompted to tell the robot what to do.

TODOs:
1. Build a key-value store that maps each verbal command to its list of operations.
2. Index the key-value store by verbal command in a vector DB.
3. Add RAG after voice recognition.
```bash
python assistant.py
```
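The TODOs above (key-value store, vector index, RAG) could be prototyped along these lines. The bag-of-words similarity below is a toy stand-in for a real embedding model and vector DB, and every name in it is an assumption for illustration, not project code.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real pipeline would call an
    # embedding model and store the vectors in a vector DB.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve_command(query: str, bank: dict) -> str:
    # Return the known verbal command closest to the transcribed query;
    # the matched entry could then be fed to the model as context (RAG).
    q = embed(query)
    return max(bank, key=lambda key: cosine(q, embed(key)))

# Keys mimic command-bank phrases; values would hold operation lists.
bank = {"reset the arm": [], "shake the body": [], "seize the gripper": []}
```

Nearest-neighbor retrieval like this tolerates paraphrased speech ("please shake your body") where an exact key lookup would miss.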
31 changes: 31 additions & 0 deletions developer.md
@@ -0,0 +1,31 @@


# Development

## Project structure
```
mnlm
|-> mnlm
| |--> client
| |--> gpt_control
| |--> assistant.py # voice based UI start program
| |--> command_indexer.py # script to index commands into the knowledge base
| |--> knowledge # the knowledge base
| |--> dummy_robot_arm_server.py # dummy server for testing purpose so we don't need to start ROS2 server
| |--> robot
| |--> robot_arm_ws # ROS2 workspace
| |--> robot_arm # ROS2 package
| |--> launch # launch file
| |--> config # configuration file, includes the ros2 control configuration
| |--> src # source code
| |--> models # robot model, includes the xacro and sdf files
| |--> setup.py # setup file for the package
| |--> package.xml # package file
|-> docker-compose.yml # docker compose file
```
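A dummy server like `dummy_robot_arm_server.py` might validate incoming operations before acknowledging them. The sketch below is a guess at such logic: the 7-servo count and the 0-180 degree range are inferred from the command bank entries, not confirmed hardware limits.

```python
# Hypothetical validation for the command_bank.json operation format.
# The 7-servo count and 0-180 degree range are assumptions.
def validate_operation(op: dict) -> bool:
    params = op.get("parameters", {})
    if op.get("operation") == "move_single_servo":
        return (isinstance(params.get("id"), str)
                and 0 <= params.get("angle", -1) <= 180
                and params.get("time", 0) > 0)
    if op.get("operation") == "move_all_servos":
        angles = params.get("angles", [])
        return (len(angles) == 7
                and all(0 <= a <= 180 for a in angles)
                and params.get("time", 0) > 0)
    return False  # unknown operation name
```

Validating on the dummy server keeps malformed commands from ever reaching the ROS2 side during testing.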

## Cleanup untagged images
```bash
docker rm $(docker ps -a -q) ; docker images | grep '<none>' | awk '{print $3}' | xargs docker rmi
```
Binary file added images/gazebo.png
Binary file added images/noVNC.png
Binary file added images/voice.png
42 changes: 37 additions & 5 deletions mnlm/client/knowledge/command_bank.json
@@ -63,15 +63,47 @@
},
{
"operation": "move_single_servo",
"parameters": {"id": "servo3", "angle": 10, "time": 500}
"parameters": {"id": "servo3", "angle": 0, "time": 500}
},
{
"operation": "move_single_servo",
"parameters": {"id": "servo3", "angle": 50, "time": 500}
"operation": "move_all_servos",
"parameters": {"angles": [0, 0, 0, 0, 0, 0, 0], "time": 500}
}
]
},
"shake the body": {
"operations": [
{
"operation": "move_all_servos",
"parameters": {"angles": [45, 60, 60, 30, 0, 0, 0], "time": 500}
},
{
"operation": "move_single_servo",
"parameters": {"id": "servo3", "angle": 0, "time": 500}
"operation": "move_all_servos",
"parameters": {"angles": [0, 30, 30, 0, 0, 0, 0], "time": 500}
},
{
"operation": "move_all_servos",
"parameters": {"angles": [135, 60, 60, 30, 0, 0, 0], "time": 500}
},
{
"operation": "move_all_servos",
"parameters": {"angles": [0, 0, 0, 0, 0, 0, 0], "time": 500}
}
]
},
"seize the gripper, size the fingers": {
"operations": [
{
"operation": "move_all_servos",
"parameters": {"angles": [0, 0, 0, 0, 0, 10, 10], "time": 500}
},
{
"operation": "move_all_servos",
"parameters": {"angles": [0, 0, 0, 0, 0, 0, 0], "time": 500}
},
{
"operation": "move_all_servos",
"parameters": {"angles": [0, 0, 0, 0, 0, 10, 10], "time": 500}
},
{
"operation": "move_all_servos",
Binary file modified mnlm/client/knowledge/index/instructions.index
Binary file not shown.
