

Beta Testing Task 4: Robot Simulator

Part 1:
So today we figured we would try the LabVIEW Robot Simulator, because, you know, you just have these robotics-related urges. We followed the instructions exactly (from the Beta Test Manual and the Readme), everything worked fairly well, and we had it up and running in about 5 minutes, which is pretty neat. After spending a while playing around with it, I offer up the following observations. I am posting this under Task Findings, although it really is only part of Task 4. Since we are planning to test more things for Task 4, this is not all of it. So I guess this could be considered Task 4: Part 1 of Unknown Parts.

I like it, the Robot Simulator. When I pulled it up for the first time, that was my first thought ('Oh this is cool! Weeee!' kind of thing). It immediately seems pretty awesome. There are a couple of other things to note about why it seems pretty cool, but those will be explained in the points below.

As seen in the screenshot, I accidentally drove the robot through the field material (which I assume is supposed to be solid and therefore not drivable through). In addition to the whole breaking-reality thing, at the pictured point the simulated robot also got stuck and there was really nothing I could do. With a real robot I would just go pick it up and move it. Right now the only solution (once stuck) seems to be to close the simulator and start over. Is it possible to add a reset button?

The field pieces in the simulator seem to correspond roughly to the 2012 game (maybe?). Would the simulator for the season look like the 2013 field? I'm not asking what the 2013 field is (although if you would like to release a special game hint...) but whether the simulator field will be a very close representation of the actual field. Will one be able to use the simulator to model the game in a fairly realistic sense?

Right now, in normal drive mode, the driver station shows the volts at the very left side of the operation tab with the robot status below it. When you switch to simulation mode, the volts display changes to 'Simulation Mode' in block green text, and the robot status reads as it would for normal operation. I like how 'Simulation Mode' is in a contrasting color. However, I think the simulation should also be mentioned in the robot status box. For example, instead of just reading 'Teleoperation Enabled', I think it would be nice to have it read 'Simulated Teleoperation Enabled'. I think this would make it even more obvious to the user. A pretty minor thought that may be more of a personal preference of mine. If this does not make sense, I can provide a screenshot example.

This might just be my computer (I'll admit I have not yet done the proper legwork of systematic testing), but the simulation seemed a little slow at the 1x real-time speed. I would be interested (and perhaps others would be too) to know more about the timing, especially for the slowed-down and sped-up speeds.
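Since I can't easily instrument the LabVIEW loop itself, here is a rough sketch of the kind of systematic check I have in mind — written in Python purely for illustration, with every name made up (this is not the simulator's API): run the simulation loop for a while and compare how much simulated time elapsed against how much wall-clock time it actually took.

```python
import time

def measure_realtime_factor(step_sim, sim_dt, steps=500):
    """Advance the simulation `steps` times and compare simulated
    elapsed time against wall-clock elapsed time.

    step_sim: a function that advances the simulation by sim_dt seconds.
    Returns simulated_time / wall_time: 1.0 means true real time,
    below 1.0 means the simulation runs slower than real time.
    """
    start = time.perf_counter()
    for _ in range(steps):
        step_sim(sim_dt)
    wall = time.perf_counter() - start
    return (steps * sim_dt) / wall

# Stand-in for a real step function: one that sleeps for its whole
# time slice should come out near (or a bit under) a factor of 1.0.
factor = measure_realtime_factor(time.sleep, 0.005, steps=100)
print(round(factor, 2))
```

A machine exhibiting the slowness I saw would report a factor noticeably below 1.0; running the same measurement at the simulator's 2x or 0.5x settings should scale the factor accordingly.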

Based on what I see, the default simulated robot has an Axis camera, and the default LabVIEW framework has code for the camera. Therefore I expected to be able to pull up the Dashboard and see some kind of camera image. I did not. Am I missing a configuration step, or is this capability not yet implemented? The Dashboard in general also seemed unresponsive: when running the simulation, we never saw the PWM or Analog values change (the analog being a gyro I added, discussed below). Does the Dashboard not react to the simulated robot?

I would like to ask for clarification in response to Caveat 1 (great word, by the way). It seems that when I edited a VI (such as Begin) from the My Computer section, it also changed that VI in the cRIO section. I assume that is because, although there are two RobotMain VIs, they reference the same subVIs. So if you are running a simulation, make a lot of edits to various parts of the code, and then decide to run that code on an actual robot, you would not have to do anything other than add RobotMain back into the RT Target?

In reference to part of what was asked in Task 4, I enabled the default autonomous code that comes in the framework. That seemed to work as expected; more autonomous testing will follow as I try fancier things. I also added a gyro (this is where the earlier reference to an analog value comes in) and put an indicator on the front panel to read the value. It reacted as it should when the simulated robot turned. Another 'dude-that-totally-just-worked-hey-mom-come-look-at-this' moment.
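For anyone wanting to sanity-check their own gyro indicator, the value it reads back is essentially angular rate integrated over time. A tiny sketch of that math (in Python for illustration only — this is not how LabVIEW expresses it):

```python
def integrate_gyro(rate_samples_deg_per_s, dt):
    """Accumulate angular-rate samples (deg/s) into a heading angle
    (deg) -- roughly the integration a gyro reading reflects."""
    heading = 0.0
    for rate in rate_samples_deg_per_s:
        heading += rate * dt
    return heading

# A robot turning at a steady 90 deg/s for one second (100 samples at
# dt = 0.01 s) should read back a heading of about 90 degrees.
print(integrate_gyro([90.0] * 100, 0.01))
```

That is essentially the check we did by eye: turn the simulated robot a known amount and confirm the front-panel indicator lands near the expected angle.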

(Coming to the last major point of my ramblings, I promise. But this is probably my biggest question.) So right now I'm driving around the simulator and it's pretty cool. But I'm driving a kitbot, or rather, a skeleton of a kitbot. I don't normally (or ever, really) have a kitbot as the competition robot. I understand how I would add components to the software; you would just edit normally. But is there any way to add mechanical components to the simulated robot? For example, is there any way I could add a sweeper and simulate and test that? Our 2012 robot had an encoder (on the flywheel gun), optical sensors (monitoring the conveyor belts), and so on. But I can't really simulate the sensor values without the mechanical parts they refer to. I added a gyro to test because that would make sense even on a kitbot. But what about the more practical uses for the other sensors? I guess I am asking what FIRST's vision is for the robot simulator. What can we use this for, and what will its customization scope be? What can we expect to be able to do with this snazzy tool during build season?

Those were some of my first impressions. More comments will probably follow as we do more testing.

Part 2:
Thank you for your answers. If you don't mind, this brings me to a few more questions.

Once the camera is added, which camera image would it refer to? Would it show the image of the simulated field, or could you plug a real camera into your computer and view its image? If it's the view of the simulated field, I have a feeling that would be quite an interesting view. Then again, it would give programmers perspective on how much of the field you can expect to see through the camera.

I will do some more systematic testing of the timing and report back on that. When you refer to errors, you don't mean broken-arrow syntax stuff but logic errors, right? So far I haven't really added anything that would be throwing errors.

I kind of guessed that we would not be able to edit the simulated robot. (Maybe one day it would be totally awesome to be able to import a CAD drawing and then simulate that. But that is something I will dream about for the future, as the complications seem enormous.) I can see how the simulator would give basic driving practice and how it's a really neat tool for people. Considering you only have a kitbot, however, and that the kitbot programming comes in the framework, and that most of the things people would want to test are outside the framework, I worry how effectively it could be used for anything beyond the basics. Maybe I am being too pessimistic.

Part 3:
So another round of testing went on today.

We saw that the camera works with the simulator, and that was pretty exciting. With the camera capability I see the Simulator as being more useful. Often it's hard to tell (without actually building the whole thing) what you will and won't be able to see of the field through the camera. As drive coach I look at the camera image during a match much more than my drivers ever do, so the simulator gives a good idea of what you will be able to see. We did some testing with the new SmartDashboard (which I like, by the way, but have a lot of questions about; those will be in a coming post under the 'SmartDashboard' section), and unfortunately it seems that when we ran the Simulator the only thing on the Dashboard that reacted was the camera. We did not see the values for the right and left motors change. We were running unedited framework.

We did not get to Simulator Kinect testing, as I have some questions about the Kinect (again, see the coming post under the 'Kinect' section). We will be testing soon though, I promise. However, Mr. McKaskle, if I might ask, what do you mean by "joysticks are transmitted to the robot differently than Kinect"?

Earlier I had posted that the robot seemed to move slowly on 1x. Today we were running the simulator on three computers (you know, so everyone could play) and it appears that the slowness was only an idiosyncrasy of my computer. On the other two machines the simulated robot moved as we thought it would.

On one of the computers we had a rather interesting thing happen with the 'Near' and 'Far' settings. In the simulator, before you click Enable on the driver station the 'Near' setting works, but after you enable the robot, 'Near' zooms out farther than the 'Far' button does, so you can no longer view the robot close up as you could before.

We continued to have fun breaking some physics. A freshman figured out how to knock the ball out of the field entirely and was wondering whether there should be some kind of perimeter so that couldn't happen.

Part 4:
Since we have no school but can't vote anyway, why not Beta Test today? Here are some of our general ramblings, questions, and opinions on SmartDashboard stuff, Update 3 changes, and the simulator. We did Kinect testing, but that will be posted under 'Technical Discussions'.

Correct me if I am wrong, but it would seem that Update 3 did not change anything in Begin, Robot Main, Disable, Vision, Periodic Tasks, Robot Global Data, or Finish. Autonomous now has two options, but I must confess I haven't looked at it in detail and can't offer an opinion. In Tele-Op there seems to be no real change in operation, but the SmartDashboard examples seem cleaner and more straightforward. I would, however, again suggest the need for more examples and documentation on the SmartDashboard.

We like the new style of switching into Simulation mode (clicking in Robot Main instead of dragging). However, it took a little figuring out (I stumbled upon it, and one of my Beta Testers was confused by it). I would suggest adding a screenshot for Step 3 of the Robot Simulation Readme; I think it would really clarify things. Also, as a related but separate note, you can select Run under My Computer, the cRIO, or the Main Application. What is the Main Application? We didn't try it, but I am still curious.

Lots of testing with the Simulator, especially since there are now three options. The Arcade with no Arm seemed to be the same, no surprise. The camera still works into the Dashboard, and although we know there are servos, how do you control them? Also, the motor values from the Simulator do not seem to be reading in. We still have the same odd issue with the Near and Far settings (as noted and detailed in an earlier post).

Now for the simulation where the arm option was added in. Before running the simulation I looked through the code and found nothing related to any kind of arm control, so it was no surprise that you couldn't control the arm. I saw in another post that it was meant to be this way (something from Mr. McKaskle; I can't find it at the moment). We had some interesting (?) fun with the arm in that it seemed to move down at varied speeds. During one test the claw had come all the way down after 1:21, and in another it took until 1:40. So we drove the claw up against the bridge, causing it to fold up, and actually got it to go through the bridge, which caused it to literally spasm. Clearly a physics bug, but a fairly amusing one. The claw behaved in a similar way in Mecanum. I'm not sure if we were doing something wrong, but when we ran the Mecanum drive, enabled but not touching the joystick at all, the simulated bot would spin. Like twirl in circles. Again, fairly amusing but definitely unexpected.

We hope these notes are helpful. Or at least mildly interesting reading material.



FIRST Robotics Team 1912 Combustion. All Rights Reserved