# Finding Nautical Exploration and Mapping Optima

## Group 13, CSE 571, Fall 2019

- Alex Goldman
- Maxfield Lehman
- Parthav Patel
- Rajeshwari Sivasubramanian

## Instructions on VM

0. `sudo su` (password: `ubuntu`)
1. `roscore`
2. `roscd group_13`
3. `roslaunch group_13 iris_hipp.launch`
4. `rqt --perspective-file /home/ubuntu/catkin_ws/src/group_13/iris.perspective`
5. `rosrun group_13 project_server.py`
6. `rosrun group_13 move_drone.py`
7. `rosrun group_13 traveler.py -a astar`

# Dependencies

## ROS

Backbone of the project.

## Gazebo

Simulation environment.

## PX4 Firmware

Flight controller software for the drones. SITL is used to simulate a physical flight controller running PX4. The source code for this is in the `Firmware` and `sitl_gazebo` folders.

- GitHub: https://github.com/PX4/Firmware
- Documentation: https://dev.px4.io/master/en/

## MAVLink / mavros

Communication protocol between the PX4 flight controller and ROS.

- MAVLink Guide: https://mavlink.io/en/
- mavros GitHub: https://github.com/mavlink/mavros

## DroneKit

Python interface used to send commands to the PX4 flight controller.

- GitHub: https://github.com/dronekit/dronekit-python
- Documentation: https://dronekit-python.readthedocs.io/en/latest/guide/index.html

## Hippocampus UUV

Underwater vehicle built by Axel Hackbarth, Edwin Kreuzer, and Eugen Solowjow at the Institute of Mechanics and Ocean Engineering, Hamburg University of Technology, Germany ({axel.hackbarth, kreuzer, eugen.solowjow}@tuhh.de).

- IEEE Paper: https://ieeexplore-ieee-org.ezproxy1.lib.asu.edu/document/7353680

# File Descriptions

## Scripts

### traveler.py

Runs UCS or A* search to solve a traveling salesman problem.

Args:

- `-a {ucs, astar}` to choose the algorithm. Default is `ucs`.
- `-l <tour length>` to set the number of explored nodes for the algorithm's goal state. Default is 24 (full exploration). The number must be between 1 and 24, inclusive.
A number less than 24 is used to test the action list and drone movement without requiring a full traveling salesman search.

### project_server.py

Initializes the environment and handles environment updates.

### api_functions.py

Helper functions for the search algorithms.

### move_drone.py

Converts the action list to MAVLink commands using DroneKit and sends them to the drone for movement.

## Setup Files

### init.sh

Installs ROS, Gazebo, and all project dependencies, including our repository. From a fresh Ubuntu installation, install git, clone https://github.com/0xFORDCOMMA/571-proj.git, and run the init.sh script found there. This process will take about 30 minutes.

Note: This uses `catkin build` instead of `catkin_make`. If building on a machine that previously used `catkin_make`, run `catkin clean` before setup.

### Dockerfile

Clone as in the instructions for init.sh. Builds a Docker image with the project and all dependencies installed. Includes a VNC server.

To use:

```
cd /path/to/this/repo
docker build -t 571/group13 .
# about 30 minutes later
# 8880 or any other open port
docker run -itp localhost:8880:80 571/group13
```

NoVNC is now accessible at localhost:8880 with the environment fully set up.
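The core idea behind traveler.py — UCS or A* over (current node, set of visited nodes) states for the traveling salesman problem — can be sketched in plain Python. This is an illustrative sketch, not the project's actual code; the function and variable names below are hypothetical:

```python
import heapq
import itertools

def tsp_search(dist, start=0, heuristic=None):
    """UCS / A* over (current node, frozenset of visited nodes) states.

    dist: square matrix of pairwise travel costs.
    heuristic: optional h(node, visited) lower bound; None gives plain UCS.
    Returns (total_cost, tour) for a cheapest path visiting every node once.
    """
    n = len(dist)
    h = heuristic or (lambda node, visited: 0)
    tie = itertools.count()  # tie-breaker so the heap never compares states
    start_visited = frozenset([start])
    frontier = [(h(start, start_visited), 0, next(tie), start, start_visited, [start])]
    best = {}                # cheapest known cost per (node, visited) state
    while frontier:
        _, cost, _, node, visited, path = heapq.heappop(frontier)
        if len(visited) == n:          # goal: every node has been visited
            return cost, path
        if best.get((node, visited), float("inf")) <= cost:
            continue                   # already expanded this state more cheaply
        best[(node, visited)] = cost
        for nxt in range(n):
            if nxt not in visited:
                g = cost + dist[node][nxt]
                nv = visited | {nxt}
                heapq.heappush(
                    frontier, (g + h(nxt, nv), g, next(tie), nxt, nv, path + [nxt])
                )
    return None
```

With `heuristic=None` this is uniform-cost search; passing an admissible lower bound (e.g. the cheapest edge into any unvisited node) turns it into A* while still returning an optimal tour.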
## Launch Files

### iris_hipp.launch

Launches an iris drone in iris_reef.world. The drone starts on a ground plane off to the side of the reef.

### hipp.launch

Launches a Hippocampus drone in reef_grid.world. The Hippocampus starts at grid spot (0, 0) by default.

## World Files

### reef.world

A world with the reef and nothing else.

### reef_grid.world

Reef world with the grid added for visualization.

### iris_reef.world

Reef world with the grid added for visualization and a platform for the iris to start on.

## Other Files

### Models/hippocampus

Contains files for the Hippocampus UUV.

### Models/reef

Contains files for the reef model.

### iris.perspective

Used with rqt to pull up the two cameras attached to the iris and Hippocampus models.

### reef.json

Translations from simplified grid coordinates to simulation grid coordinates.

## Stuff from the Template

## Running the Demo

To start the demo, run the following commands in order:

1. Run `./env_setup.sh`
2. Run `roscore`
3. Run `rosrun cse571_project server.py -sub 1 -b 1`
4. Run `roslaunch cse571_project maze.launch`
5. Run `rosrun cse571_project move_tbot3.py`
6. Run `rosrun cse571_project random_walk.py`

#### action_server.py, server.py

Implements action execution and environment update functions.

#### environment_api.py

Implements an interface for the random_walk.py script to communicate with the server.

#### mazeGenerator.py

Environment generation script.

#### move_tbot3.py, pid.py

Handles movement of the TurtleBot.

#### books.json

Environment information dictionary generated by server.py.

#### action_config.json

Lists all actions and their rewards, probabilities, etc.
Format:

```
{
    "<action_name>": {
        "function": "<function to execute inside action_server.py>",
        "params": [<list of parameters accepted by the corresponding
                    function in action_server.py (must be in the same order)>],
        "success_reward": <reward if the action succeeds>,
        "fail_reward": <reward if the action fails>,
        "possibilities": {
            "<action_name>": <probability of execution>,  # corresponds to an action in the config
            "<action_name>": <probability of execution>,
            ...
        }
    },
    ...
}
```

Sample action config explanation:

```
{
    "pick": {
        "function": "execute_pick",
        "params": ["book_name"],
        "success_reward": 25,
        "fail_reward": -25,
        "possibilities": {
            "pick": 0.85,
            "noaction": 0.15
        }
    },
    ...
}
```

The action name is defined as `pick`. It corresponds to the `execute_pick` function implemented in `action_server.py`. The function takes one variable parameter, `book_name`, defined under the `params` key. The reward for a successful execution of this action is `25`, and the reward for a failed execution is `-25`. This config defines a stochastic environment in which executing `pick` results in `pick` being executed with probability `0.85`, or a no-op action (defined as `noaction`) with probability `0.15`.

Note: To convert this to a deterministic action, change the probability of the `pick` outcome to `1` (and of `noaction` to `0`, so the probabilities still sum to 1).
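As an illustration (not part of the repository's code), a few lines of Python can validate a config in this format and sample an action's stochastic outcome; the helper names below are hypothetical:

```python
import json
import random

# Hypothetical config mirroring the `pick` example above.
ACTION_CONFIG = json.loads("""
{
  "pick": {
    "function": "execute_pick",
    "params": ["book_name"],
    "success_reward": 25,
    "fail_reward": -25,
    "possibilities": {"pick": 0.85, "noaction": 0.15}
  }
}
""")

def validate(config):
    """Check that each action's outcome probabilities sum to 1."""
    for name, entry in config.items():
        total = sum(entry["possibilities"].values())
        if abs(total - 1.0) > 1e-9:
            raise ValueError(f"{name}: probabilities sum to {total}, not 1")

def sample_outcome(config, action, rng=random):
    """Draw the action actually executed, per the stochastic config."""
    names, weights = zip(*config[action]["possibilities"].items())
    return rng.choices(names, weights=weights, k=1)[0]

validate(ACTION_CONFIG)
```

Setting `"possibilities": {"pick": 1.0}` makes `sample_outcome` always return `"pick"`, i.e. the deterministic case described in the note above.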