This is the code that is used on the Tricycle robot created from the Studica FTC Starter Kit.
This code was created using the FTC 7.0 SDK in Android Studio.
To install Android Studio, follow the steps here: https://github.com/FIRST-Tech-Challenge/FtcRobotController/wiki/Installing-Android-Studio
After installing Android Studio, follow these steps to download and open the 7.0 SDK: https://github.com/FIRST-Tech-Challenge/FtcRobotController/wiki/Downloading-the-Android-Studio-Project-Folder
OpenCV needs to be installed as well; the 7-step instructions here are easy to follow.
The navX library also needs to be installed. The instructions on the KauaiLabs website are a bit dated and no longer work, but following the steps below will allow you to add the library.
- Install the navX tools here
- Under Gradle Scripts, find `build.dependencies.gradle` and open it
- Look for the `flatDir{}` section and add `dirs 'libs', 'C:\\Users\\james\\navx-micro\\android\\libs'`, but with your correct directory (i.e. replace `james` with your username)
- In the same file is the `dependencies{}` section. Add `compile (name:'navx_ftc-release', ext:'aar')` at the bottom
- Hit "Sync Now" on the Gradle message, or hit the elephant icon in the top right, to sync the Gradle files
- Clone or download this repo to your computer
- Place all the files here into `TeamCode/src/main/java/org/firstinspires/ftc/teamcode/`
- Run a build by clicking on the green hammer in the top middle of the screen
- In TricycleOp there is `Utilities.Alliance alliance = Utilities.Alliance.BLUE;`. Change the `BLUE` to `RED` if you are on the RED alliance. This configures the code to reverse directions correctly (a sketch of these settings follows this list).
- In DriveTrain, the PID coefficients may need to be calibrated for the turning functions.
- In DriveTrain, the motor flags might need to be changed from `REVERSE` to `FORWARD` depending on your motor or how it is wired.
- In Elevator, the lift direction might need to be changed from `REVERSE` to `FORWARD` depending on your motor or how it is wired.
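Below is a minimal sketch of where these settings live, not the repo's actual code: the hardware names (`"left_drive"`, `"carousel"`) and the PID tuning values are assumptions, while `Utilities.Alliance` comes from this repo and `setDirection` is the standard FTC SDK call.

```java
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.DcMotorSimple;
import com.qualcomm.robotcore.hardware.HardwareMap;

public class ConfigSketch {
    // Alliance selection: BLUE by default, change to RED for the RED alliance.
    Utilities.Alliance alliance = Utilities.Alliance.BLUE;

    // Turning PID gains: illustrative values only, tune them on your own robot.
    double turnKp = 0.01, turnKi = 0.0, turnKd = 0.001;

    DcMotor leftDrive;
    DcMotor carousel;

    void init(HardwareMap hardwareMap) {
        leftDrive = hardwareMap.get(DcMotor.class, "left_drive"); // assumed config name
        carousel = hardwareMap.get(DcMotor.class, "carousel");    // assumed config name

        // Flip REVERSE/FORWARD here if a motor runs the wrong way with your wiring.
        leftDrive.setDirection(DcMotorSimple.Direction.REVERSE);
    }

    void spinCarousel() {
        // The alliance setting flips the carousel spin direction.
        double power = (alliance == Utilities.Alliance.RED) ? -0.5 : 0.5;
        carousel.setPower(power);
    }
}
```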
- When enabled, the left joystick Y axis is for driving forwards or backwards
- The left joystick X axis handles the left and right movement
- Moving the left joystick in both the X and Y directions combines the two movements
- Moving the right joystick in the X axis will spin the robot left or right
- The Right Bumper will strafe the robot to the right
- The Left Bumper will strafe the robot to the left
- Square (PS4) or X (Logitech) will turn the robot so it can enter the first gate
- Triangle (PS4) or Y (Logitech) will turn the robot so it can enter the gate for the shared object
- X (PS4) or A (Logitech) will spin the carousel; if the alliance was set correctly it will spin in the correct direction
- Dpad UP will move the elevator up
- Dpad DOWN will move the elevator down (a control-mapping sketch follows this list)
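A rough sketch of how the controls above could map to gamepad inputs in a TeleOp loop. The drive, turn, carousel, and elevator helper methods are hypothetical placeholders; the real TricycleOp calls into its own DriveTrain and Elevator classes.

```java
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;

@TeleOp(name = "ControlsSketch")
public class ControlsSketch extends OpMode {
    @Override
    public void init() { /* hardware setup would go here */ }

    @Override
    public void loop() {
        double forward = -gamepad1.left_stick_y;  // left stick Y: forwards/backwards (stick up is negative)
        double strafe  =  gamepad1.left_stick_x;  // left stick X: left/right movement
        double turn    =  gamepad1.right_stick_x; // right stick X: spin left/right

        if (gamepad1.right_bumper)      strafe = 1.0;   // strafe right
        else if (gamepad1.left_bumper)  strafe = -1.0;  // strafe left

        drive(forward, strafe, turn);                   // blend everything into one drive command

        if (gamepad1.x)      turnToFirstGate();   // Square (PS4) / X (Logitech)
        else if (gamepad1.y) turnToSharedGate();  // Triangle (PS4) / Y (Logitech)
        else if (gamepad1.a) spinCarousel();      // X (PS4) / A (Logitech), alliance-aware

        if (gamepad1.dpad_up)        elevatorUp();
        else if (gamepad1.dpad_down) elevatorDown();
    }

    // Placeholders standing in for the repo's real DriveTrain/Elevator calls.
    private void drive(double forward, double strafe, double turn) {}
    private void turnToFirstGate() {}
    private void turnToSharedGate() {}
    private void spinCarousel() {}
    private void elevatorUp() {}
    private void elevatorDown() {}
}
```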
The Autonomous code uses OpenCV to easily determine which of the three barcodes has a duck on it.
The bulk of the vision is done inside a pipeline. The pipeline is essentially a vision script that processes an image and then returns that image when it is done.
There are only a few sections that need to be adjusted: the ROI boxes and a threshold value.
```java
static final Point BARCODE_1_TOP_LEFT = new Point(10, 275);
static final Point BARCODE_2_TOP_LEFT = new Point(300, 275);
static final Point BARCODE_3_TOP_LEFT = new Point(539, 250);
static final int BARCODE_WIDTH = 100;
static final int BARCODE_HEIGHT = 75;
final int OBJECT_THERE_THRESHOLD = 125;
```
The `BARCODE_X_TOP_LEFT` points refer to the (X, Y) coordinate of the top-left corner of each rectangle. This is where you set the location of your ROI boxes. The X and Y values are the pixel locations on the image; in the example the code is using a 640 x 480 image taken from the camera.
```
(0,0) ------------- (640,0)

(0,480) ----------- (640,480)
```
The location of each pixel is demonstrated above, with the origin (0,0) at the top left.
The next values to calibrate are the width and height of the ROI rectangle for each barcode. The example creates a 100 x 75 pixel ROI rectangle; however, you may wish to change this based on the location of your camera or to improve accuracy. The smaller the ROI rectangle, the more accurate the averaging calculation done in the code.
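To show how the ROI boxes, the averaging, and the threshold fit together, here is a minimal single-barcode pipeline sketch. It is not the repo's actual pipeline (which handles all three barcodes and may use a different colour space); the grayscale conversion, the RGBA frame assumption, and the field names are assumptions.

```java
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.Point;
import org.opencv.core.Rect;
import org.opencv.imgproc.Imgproc;
import org.openftc.easyopencv.OpenCvPipeline;

public class BarcodeSketchPipeline extends OpenCvPipeline {
    // Same style of constants as the snippet above, for barcode 1 only.
    static final Point BARCODE_1_TOP_LEFT = new Point(10, 275);
    static final int BARCODE_WIDTH = 100;
    static final int BARCODE_HEIGHT = 75;
    static final int OBJECT_THERE_THRESHOLD = 125;

    private final Mat gray = new Mat();
    public volatile double barcode1Average = 0;
    public volatile boolean duckOnBarcode1 = false;

    @Override
    public Mat processFrame(Mat input) {
        // Work on a single channel so each pixel is one brightness value
        // (the incoming camera frame is assumed to be RGBA here).
        Imgproc.cvtColor(input, gray, Imgproc.COLOR_RGBA2GRAY);

        // Cut out the ROI rectangle for barcode 1 and average its pixels.
        Rect roi = new Rect(BARCODE_1_TOP_LEFT,
                new Point(BARCODE_1_TOP_LEFT.x + BARCODE_WIDTH,
                          BARCODE_1_TOP_LEFT.y + BARCODE_HEIGHT));
        Mat region = gray.submat(roi);
        barcode1Average = Core.mean(region).val[0];
        region.release();

        // A duck in the box pushes the average above the calibrated threshold.
        duckOnBarcode1 = barcode1Average > OBJECT_THERE_THRESHOLD;

        // Return the frame so it can still be viewed on the camera stream.
        return input;
    }
}
```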
The last value to be calibrated is the `OBJECT_THERE_THRESHOLD`. This can only be done through testing on the court, and it is very easy to do:
- Put the robot on the court with no ducks placed
- Run the code and see what the three outputs are (they should be pretty similar). In my testing I found the values to be around `119` when there was no duck
- Record the values with no duck
- Place a duck on the barcode and run the code again. The numbers should now go up. In my testing I found the values were above `130`
- Record the values with the duck
- To get the value to use for `OBJECT_THERE_THRESHOLD`, take a number in between the two values just observed. For example, `((130 - 119) / 2) + 119 = 124.5`, so our `OBJECT_THERE_THRESHOLD` should be `124.5`, but we will round up to `125` (see the sketch after this list)
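The same midpoint arithmetic written out as a small runnable sketch; the `119` and `130` figures are just the example readings above, so substitute your own recorded averages.

```java
// Midpoint calculation for OBJECT_THERE_THRESHOLD using the example readings above.
public class ThresholdMath {
    public static void main(String[] args) {
        int noDuckAverage = 119; // ROI average recorded with no duck
        int duckAverage = 130;   // ROI average recorded with a duck present
        double midpoint = ((duckAverage - noDuckAverage) / 2.0) + noDuckAverage;
        System.out.println(midpoint);                  // 124.5
        System.out.println((int) Math.ceil(midpoint)); // rounds up to 125
    }
}
```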
Example pipeline output images: duck on the left, duck in the middle, duck on the right.
The FTC Dashboard is a web app made by acmerobotics that allows for easier monitoring of FTC robots during operation. The most up-to-date information on the FTC Dashboard can be found on its GitHub.
In `build.dependencies.gradle`, add the following in the correct sections:

```gradle
repositories {
    maven { url = 'https://maven.brott.dev/' }
}

dependencies {
    implementation 'com.acmerobotics.dashboard:dashboard:0.4.3'
}
```
Run a Gradle Sync to download and ensure that everything is working.
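Once the sync succeeds, values can be sent to the dashboard from an OpMode. A minimal sketch (not code from this repo) using the dashboard's telemetry packet API:

```java
import com.acmerobotics.dashboard.FtcDashboard;
import com.acmerobotics.dashboard.telemetry.TelemetryPacket;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;

@TeleOp(name = "DashboardSketch")
public class DashboardSketch extends LinearOpMode {
    @Override
    public void runOpMode() {
        FtcDashboard dashboard = FtcDashboard.getInstance();
        waitForStart();

        while (opModeIsActive()) {
            // Send any value you want to monitor or graph in the dashboard.
            TelemetryPacket packet = new TelemetryPacket();
            packet.put("left stick y", gamepad1.left_stick_y);
            dashboard.sendTelemetryPacket(packet);
        }
    }
}
```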
To test the dashboard, connect to `192.168.43.1:8080/dash`. The dashboard will then be displayed in your browser window. An example showing the Autonomous is below.