# Hardware Testing - ArmMotus M2 Planar Manipulandum

This page introduces the M2DemoMachine, an example CORC app showing the basic use of the M2 planar manipulandum.

The M2 is a 2-DoF admittance-based robot with a Cartesian kinematic structure, developed by Fourier Intelligence:

ArmMotus M2 and reference coordinates.

The state machine code can be found in the folder `src/apps/M2DemoMachine`.

It demonstrates the use of:
- The different control modes of M2 (position, velocity, or torque)
- The use of the force measurements
- The use of the libFLNL communication library to publish the robot state to a Unity application and to send commands to the state machine


## Running the state machine

In `CMakeLists.txt`, select the M2DemoMachine and set the flags to use a real robot without ROS support:

```cmake
#set (STATE_MACHINE_NAME "ExoTestMachine")
#set (STATE_MACHINE_NAME "M1DemoMachine")
set (STATE_MACHINE_NAME "M2DemoMachine")
#set (STATE_MACHINE_NAME "M3DemoMachine")
#set (STATE_MACHINE_NAME "X2DemoMachine")
#set (STATE_MACHINE_NAME "LoggingDevice")
# Set to OFF to use actual hardware, ON for a no-robot (virtual) app
set(NO_ROBOT OFF)
# ROS flag. Set to ON if you want to use ROS, OFF otherwise.
set(USE_ROS OFF)
```

If you intend to cross-compile for a BeagleBone (Black or AI), run: `$ rm -r build && mkdir build && cd build && cmake -DCMAKE_TOOLCHAIN_FILE=../armhf.cmake ..`

Otherwise, to build the state machine to run locally, use: `$ rm -r build && mkdir build && cd build && cmake ..`

Then simply compile the state machine: `$ make -j2`

This should create the application `M2DemoMachine` within the build folder. After initialising the CANbus (using the `initCAN0.sh` or `initCAN1.sh` script), you should be able to run the application, either locally or on the BeagleBone (through SSH).

**WARNING:** With an M2 connected on the CAN bus, the robot will immediately start to move once the application is running (to reach its calibration pose); ensure the space around the robot is clear.

Once the calibration state is finished, you can cycle through the different demo states using the keyboard (key 1) or the first joystick button.


## RobotM2 control methods

The CORC M2 robot model has the following specific methods of interaction:
- Obtaining current **joint state** (as for any CORC robot): `robot->getPosition()`, `robot->getVelocity()`, `robot->getTorque()`.
- **Joint level interaction**: `setJointPosition(VM2 q)`, `setJointVelocity(VM2 dq)` and `setJointTorque(VM2 tau)` apply position, velocity, or torque control using an Eigen vector of length 2 (`VM2`) in SI units: here [m], [m.s-1] and [N], as the joints are modeled as two prismatic ones. An example of open-loop force control can be found in the `M2CalibState` state. Note the use of `robot->initTorqueControl();` in the `entryCode()` method before applying torque control.
- Obtaining current **end-effector state**: the `robot->getEndEffPosition()`, `robot->getEndEffVelocity()` and `robot->getEndEffForce()` methods are provided for convenience but, given the simplicity of the M2 kinematics, they are equivalent to their joint counterparts. Additionally, the pure interaction force at the end-effector, measured by the pair of force sensors, can be obtained using the `robot->getInteractionForce()` method.
- **End-effector space control** is available using `setEndEffPosition(VM2 X)`, `setEndEffVelocity(VM2 dX)` and `setEndEffForce(VM2 F)`. These methods require that the robot has been calibrated (see `applyCalibration()`) and, as with their joint counterparts, that the corresponding initTorque/Velocity/Position method has been called beforehand. The command vectors are expressed in the robot base frame shown in the picture above. An example of end-effector velocity control is available in the `M2EndEffDemo` state, and a minimal sketch is given after this list. Given the Cartesian kinematics, these methods are also equivalent to their joint counterparts.
- Finally, the method `setEndEffForceWithCompensation(VM2 F, bool friction_comp=true)` can be used to apply an end-effector force in addition to the **friction compensation**. This method relies on the robot model and its parameters (friction coefficients).
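
Below is a minimal sketch of the init-then-command pattern described in this list. It is not the actual `M2EndEffDemo` code: the state class name and the commanded velocity are illustrative, `robot` is assumed to be a calibrated `RobotM2*` member of a state with the usual `entryCode()`/`duringCode()`/`exitCode()` hooks, and `initVelocityControl()` is assumed to be the velocity counterpart of `initTorqueControl()`.

```cpp
// Hypothetical state driving the end-effector at a constant velocity along +x.
void MyEndEffVelocityState::entryCode(void) {
    robot->initVelocityControl();           // select velocity control before sending commands
}

void MyEndEffVelocityState::duringCode(void) {
    VM2 dX(0.05, 0.0);                      // 0.05 m/s along x, expressed in the base frame
    robot->setEndEffVelocity(dX);
}

void MyEndEffVelocityState::exitCode(void) {
    robot->setEndEffVelocity(VM2::Zero());  // command zero velocity when leaving the state
}
```

The same pattern applies to joint-space, position, or torque control: call the corresponding initTorque/Velocity/Position method first, then the matching setJoint/setEndEff method in the control loop.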

See the Doxygen page of the `RobotM2` class for a full list of available methods.


## Network communication with libFLNL

The M2DemoMachine app uses libFLNL to publish the robot state and to read incoming commands over a TCP/IP connection. Together with an FLNLHelper object (`UIserver = new FLNLHelper(robot, "192.168.6.2");`), the library allows sending the robot state at every control loop (`UIserver->sendState();` within the `M2DemoMachine::hwStateUpdate(void)` method) as well as sending and processing incoming commands. The CORC app here acts as a server on the specified IP (and port, optional, default 2048) to which client applications can connect.
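
As a minimal server-side sketch of the two calls quoted above (only the `FLNLHelper` construction and the `sendState()` call come from the text; the `init()` hook, the member declaration and the surrounding code are assumptions):

```cpp
// Sketch only: UIserver is assumed to be an FLNLHelper* member of M2DemoMachine.
void M2DemoMachine::init(void) {
    // Start the FLNL server on the given IP; the port is optional (default 2048).
    UIserver = new FLNLHelper(robot, "192.168.6.2");
}

void M2DemoMachine::hwStateUpdate(void) {
    UIserver->sendState();  // publish the current robot state once per control loop
}
```

Any client implementing the FLNL protocol (such as the Unity or Matlab example linked below) can then connect to this IP and port to receive the state stream and send commands.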

An example class to process incoming states and send/receive commands from a Unity or Matlab script (client side) can be found [here](https://github.com/UniMelbHumanRoboticsLab/CORC-UI-Demo).