Posts

Week 13 Update (4/9-4/12)

This week's progress centered around the final stages of development for both teams. Below are further details on each team's progress over the past week and the plan for final adjustments before the Capstone Showcase this Friday, 4/12. Total hours: 14 hours.

A* Team Update: On the A* side, we created a separate testing project for our PGM parser and the A* search itself. This is mainly so we don't drastically alter our current Move node code trying to fit in both the parser and A*, only for it not to work and force us to revert and lose progress. The parser works as expected, and we also added code that explicitly defines a node's neighbors in the eight directions around it. Development of the A* search itself began after that and hit a few hiccups in retrieving the neighbors of the currently expanded node, most notably getting stuck in an infinite loop many, many times. This led us to go back to an earlier version of the A* code that did not loop forever.
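For reference, here is a minimal sketch of the kind of 8-connected A* expansion we are working toward, assuming the parsed PGM ends up as a row-major grid of free/occupied cells. The Grid/Cell names, the Euclidean heuristic, and the step costs are placeholders for illustration, not our actual testing-project code; skipping stale queue entries is one common way to avoid re-expanding the same node forever.

// Minimal A* sketch over an 8-connected occupancy grid.
// Assumes the PGM parser produced a width x height grid of booleans
// (true = free cell); Grid, Cell, and heuristic() are hypothetical names.
#include <algorithm>
#include <cmath>
#include <functional>
#include <queue>
#include <unordered_map>
#include <utility>
#include <vector>

struct Cell { int x, y; };

struct Grid {
    int width, height;
    std::vector<bool> free;                     // row-major occupancy
    bool isFree(int x, int y) const {
        return x >= 0 && x < width && y >= 0 && y < height && free[y * width + x];
    }
};

static int key(const Grid& g, const Cell& c) { return c.y * g.width + c.x; }

static double heuristic(const Cell& a, const Cell& b) {
    return std::hypot(a.x - b.x, a.y - b.y);    // Euclidean distance, admissible here
}

std::vector<Cell> aStar(const Grid& g, Cell start, Cell goal) {
    using Entry = std::pair<double, int>;       // (f-score, cell key)
    std::priority_queue<Entry, std::vector<Entry>, std::greater<Entry>> open;
    std::unordered_map<int, double> gScore;     // best known cost so far
    std::unordered_map<int, int> cameFrom;      // for path reconstruction

    open.emplace(heuristic(start, goal), key(g, start));
    gScore[key(g, start)] = 0.0;

    // Offsets for the 8 neighbors around a cell.
    const int dx[8] = {-1, 0, 1, -1, 1, -1, 0, 1};
    const int dy[8] = {-1, -1, -1, 0, 0, 1, 1, 1};

    while (!open.empty()) {
        auto [f, k] = open.top();
        open.pop();
        Cell cur{k % g.width, k / g.width};
        if (cur.x == goal.x && cur.y == goal.y) {
            std::vector<Cell> path{cur};
            while (cameFrom.count(k)) { k = cameFrom[k]; path.push_back({k % g.width, k / g.width}); }
            std::reverse(path.begin(), path.end());
            return path;
        }
        // Skip stale queue entries; this is what prevents re-expanding
        // the same node endlessly.
        if (f > gScore[k] + heuristic(cur, goal) + 1e-9) continue;

        for (int i = 0; i < 8; ++i) {
            Cell nb{cur.x + dx[i], cur.y + dy[i]};
            if (!g.isFree(nb.x, nb.y)) continue;
            double step = (dx[i] != 0 && dy[i] != 0) ? std::sqrt(2.0) : 1.0;
            double tentative = gScore[k] + step;
            int nk = key(g, nb);
            if (!gScore.count(nk) || tentative < gScore[nk]) {
                gScore[nk] = tentative;
                cameFrom[nk] = k;
                open.emplace(tentative + heuristic(nb, goal), nk);
            }
        }
    }
    return {};  // no path found
}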

Week 10 Update (3/18-3/22)

This week focused on splitting the group into two smaller groups, each covering one of the two remaining aspects of the Mouse Droid project: A* search/path planning, and getting the Pi to talk to the Arduino. Additionally, the team returned to recording time entries on the timesheet.

On the A* side of the project, the two members came up with a checklist of steps needed to reach the end goal of a functional A*. The first step is to create a proper map of the Engineering building's first floor, which will be used as the new testing ground instead of basic test maps. Steps two and three are closely related: get a .pgm and .yaml export of the virtual LiDAR's surroundings in RViz, then parse it into a data structure that A* can read. Lastly, we will use the parsed data to implement A* search/path planning with a fixed goal at any point in the map. On the Arduino side of the project, the two members worked with the ECE teammates.
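As a rough illustration of steps two and three, here is a minimal sketch of loading a binary (P5) .pgm map into a grid that A* could read. The OccupancyGrid/loadPgm names are placeholders, and the free-space threshold would really come from the accompanying .yaml (free_thresh/occupied_thresh); the value below is only an assumption for the sketch.

// Hypothetical sketch: flatten a P5 .pgm exported from the mapping stack
// into a row-major free/occupied grid. Threshold handling is simplified;
// the real thresholds live in the map's .yaml file.
#include <cstddef>
#include <fstream>
#include <stdexcept>
#include <string>
#include <vector>

struct OccupancyGrid {
    int width = 0, height = 0;
    std::vector<bool> free;       // row-major, true = traversable
};

OccupancyGrid loadPgm(const std::string& path, int free_threshold = 250) {
    std::ifstream in(path, std::ios::binary);
    if (!in) throw std::runtime_error("cannot open " + path);

    std::string magic;
    in >> magic;
    if (magic != "P5") throw std::runtime_error("expected binary PGM (P5)");

    // Skip '#' comment lines that map tools often write into the header.
    auto nextToken = [&in]() {
        std::string tok;
        while (in >> tok) {
            if (tok[0] == '#') { std::string junk; std::getline(in, junk); continue; }
            return tok;
        }
        throw std::runtime_error("truncated PGM header");
    };

    OccupancyGrid g;
    g.width  = std::stoi(nextToken());
    g.height = std::stoi(nextToken());
    int maxval = std::stoi(nextToken());
    in.get();                      // consume the single whitespace after maxval

    g.free.resize(static_cast<std::size_t>(g.width) * g.height);
    for (std::size_t i = 0; i < g.free.size(); ++i) {
        int pixel = in.get();
        if (pixel < 0) throw std::runtime_error("truncated PGM data");
        // Light pixels are free space, dark pixels are walls/obstacles.
        g.free[i] = pixel >= free_threshold * maxval / 255;
    }
    return g;
}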

Week 9 Update (3/4-3/8)

This week's work focused on the feedback we received from the non-CS/SE team. With the physical robot nearing completion, they now want to begin integrating code into it, so our focus for the week was to start getting the Pi set up and transferring/modifying our code onto it. There were some issues, such as the Pi not being able to update software, which required a fresh install of Ubuntu. From there, we began to gut our code of anything related to Gazebo or the simulation itself. Since the Pi will be in charge of sending the controls over to the Arduino, there is no need for the simulation to be running inside of the physical robot. Before doing this, we wanted to make sure the LiDAR could still connect to the code we intend to keep, but ran into an issue where it wasn't appearing in the list of connected USB devices. Additionally, we also set up SSH in order to send over commands. Total hours: 24 hours.

Pictures:
Figure 1: Physical Mouse Droid

Week 8 Update (2/26-3/1)

This week centered around the installation and implementation of our actual LiDAR sensor. We were able to integrate this into our Move.cpp node and MoveWithLidar.xml behavior tree by creating a new topic, /lidarscan, that receives messages from the rplidar, then subscribing to that topic from our Move.cpp node. We also began retrofitting and adjusting our command statements to interact with the Arduino, though that remains untested until we have access to the Arduino itself. We also simplified the initial boot process of our project, wrapping all commands in a single "ultimate_startup.sh" file. After meeting with the Mechanical team, we agreed to meet again the following week to begin implementing our code on the Raspberry Pi and, subsequently, the bot itself. The current state of the project takes in real-time input from our physical LiDAR sensor, with specified parameters (obstacles closer than x meters within an 11 o'clock to 1 o'clock range).
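Here is a minimal sketch of the kind of subscription described above, assuming the rplidar driver publishes a standard sensor_msgs/msg/LaserScan on /lidarscan and that angle 0 points straight ahead. The class name, the 0.5 m threshold, and the ±30° window are placeholders, not the actual Move.cpp code.

// Hypothetical subscriber that watches the front arc of the scan and flags
// obstacles; real command publishing toward the Arduino is elided.
#include <cmath>
#include "rclcpp/rclcpp.hpp"
#include "sensor_msgs/msg/laser_scan.hpp"

class LidarListener : public rclcpp::Node {
public:
  LidarListener() : Node("lidar_listener") {
    sub_ = create_subscription<sensor_msgs::msg::LaserScan>(
        "/lidarscan", rclcpp::SensorDataQoS(),
        [this](sensor_msgs::msg::LaserScan::SharedPtr scan) { onScan(*scan); });
  }

private:
  void onScan(const sensor_msgs::msg::LaserScan& scan) {
    // Only consider roughly the 11 o'clock to 1 o'clock arc (about ±30° of forward).
    const double window = M_PI / 6.0;
    bool obstacle = false;
    for (size_t i = 0; i < scan.ranges.size(); ++i) {
      double angle = scan.angle_min + i * scan.angle_increment;
      if (std::fabs(angle) > window) continue;
      float r = scan.ranges[i];
      if (std::isfinite(r) && r < 0.5f) { obstacle = true; break; }
    }
    if (obstacle) {
      RCLCPP_INFO(get_logger(), "Obstacle ahead, stopping");
      // ...publish a stop/reverse command here...
    }
  }

  rclcpp::Subscription<sensor_msgs::msg::LaserScan>::SharedPtr sub_;
};

int main(int argc, char** argv) {
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<LidarListener>());
  rclcpp::shutdown();
  return 0;
}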

Week 7 Update (2/19-2/23)

This week showed progress in the development of our existing Move.cpp node and the creation of our new MoveWithLidar.xml behavior tree. We did a bit of restructuring to keep all functionality in our node and all logic in our behavior tree, which is proving much more helpful and efficient. We now have real logic implemented in place of a basic loop with repeating hard-coded movement commands. Initially, we used a "go until you see a wall, stop, reverse until you see a wall, repeat" logic tree to test out the LiDAR's scope and limitations. Through many trials, we learned a lot about the 360° scope of the LiDAR, as well as about some errors in the structure of our code (i.e., mixing logic between our node and behavior tree). We narrowed the scope of the LiDAR to make decisions only on input from an 11 to 1 o'clock range, so a parallel side wall will not trigger as an obstacle. Additionally, this Monday we also received access to the actual LiDAR sensor and will begin testing and implementation during Week 8.
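As a toy illustration of that "forward until wall, reverse until wall" test behavior, here is a small C++ state-machine sketch. In the project this logic actually lives in the MoveWithLidar.xml behavior tree rather than a hand-rolled loop; the wallAhead/wallBehind/sendVelocity callables below stand in for the LiDAR checks and the command publisher and are purely hypothetical.

// Toy sketch of the wall-bounce test logic; not the project's behavior tree.
#include <functional>

enum class Phase { Forward, Reverse };

void bounceStep(Phase& phase,
                const std::function<bool()>& wallAhead,    // 11-1 o'clock check
                const std::function<bool()>& wallBehind,   // mirrored rear check
                const std::function<void(double)>& sendVelocity) {
  if (phase == Phase::Forward) {
    if (wallAhead()) { sendVelocity(0.0); phase = Phase::Reverse; }  // stop, flip
    else             { sendVelocity(+0.2); }                         // keep driving
  } else {
    if (wallBehind()) { sendVelocity(0.0); phase = Phase::Forward; } // stop, flip
    else              { sendVelocity(-0.2); }                        // keep reversing
  }
}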

Week 6 Update (2/12-2/16)

This week showed progress in the continued development of a script for automated movement and 3D mapping. Through lots of research, debugging, and troubleshooting, we have successfully created a node, "Move", written in C++, which currently publishes pre-planned movement commands to the corresponding Move topic instead of relying on user-inputted commands from a terminal. Admittedly, this does not seem like significant progress in terms of bot movement/intelligence, since that level has remained consistent from the previous week, but eliminating direct user control is actually a big step for the project. Total hours: 22 hours.

Fig 1. Node list showing the active Move node
Fig 2. Topic list showing the active Move topic (receiving from the node)
Fig 3. Moving from the Move node (no logic yet)
Fig 4. Move node publishing commands to the Move topic
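For a sense of what such a node looks like, here is a minimal sketch of a ROS 2 C++ node that publishes pre-planned movement commands on a timer, assuming commands go out as geometry_msgs/Twist. The class name, topic name, and velocities are placeholders, not the actual Move node.

// Hypothetical sketch of a pre-planned command publisher, no sensor logic yet.
#include <chrono>
#include "geometry_msgs/msg/twist.hpp"
#include "rclcpp/rclcpp.hpp"

class MoveSketch : public rclcpp::Node {
public:
  MoveSketch() : Node("move_sketch") {
    pub_ = create_publisher<geometry_msgs::msg::Twist>("/move", 10);
    timer_ = create_wall_timer(std::chrono::milliseconds(500),
                               [this]() { publishNext(); });
  }

private:
  void publishNext() {
    geometry_msgs::msg::Twist cmd;
    // Alternate between a short forward move and a turn: a fixed,
    // pre-planned pattern rather than user input from a terminal.
    if (step_++ % 2 == 0) cmd.linear.x = 0.2;
    else                  cmd.angular.z = 0.5;
    pub_->publish(cmd);
  }

  rclcpp::Publisher<geometry_msgs::msg::Twist>::SharedPtr pub_;
  rclcpp::TimerBase::SharedPtr timer_;
  int step_ = 0;
};

int main(int argc, char** argv) {
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<MoveSketch>());
  rclcpp::shutdown();
  return 0;
}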

Week 5 Update (2/5-2/9)

This week's progress focused on the planning and development of an algorithm for intelligent 3D mapping and path planning. Since simulated 3D mapping with the LiDAR sensor has already been achieved through various ROS libraries, our focus is on the algorithm that will automate the process, eliminating the need for a user controller. After much research on the topic, we are in the process of developing a behavior tree and ROS node that will control the behaviors of the bot (output to the Arduino/motors) based on the input from the LiDAR sensor. We began with simple prescribed movements run from a terminal (which will be wrapped in our launch file) to ensure movements are accurate before implementing the logic of the behavior tree. We also confirmed the status of the real LiDAR sensor with the ME/EE group, and we should have access to it the following week. This is going to be integral in our next steps, as we want to begin transferring code to the Raspberry Pi.