ROBOTC.net forums
http://robotc.net/forums/

Limits on memory for pathfinding algorithms vs lookup tables
http://robotc.net/forums/viewtopic.php?f=1&t=8869

Author:  hexafraction [ Sat May 24, 2014 3:48 pm ]
Post subject:  Limits on memory for pathfinding algorithms vs lookup tables

I'm writing a C code generator geared toward RobotC and complex tasks for an FTC team, and I have some performance and storage questions:

  1. How much memory is available for my program's data? It will be mostly predefined lookup tables, generally in the form of multidimensional arrays.
  2. How much NXT memory is available for the program itself? That is, roughly how much code can I expect to fit into a single compiled RobotC program?
  3. How quickly do programs execute, generally? Looking at the disassembly, most of my generated lines correspond to 2-4 opcodes.

I'm using NXT/Tetrix. My main interest with these questions right now is pathfinding. I plan to use a 64x64 grid and run A* (Dijkstra's algorithm with a heuristic), with a heuristic function that assigns a penalty to turns and is as close to consistent as possible (I'm not sure whether consistency/monotonicity is achievable with the turn penalty).
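
Roughly the shape I have in mind, as a minimal C sketch; GRID_SIZE, TURN_PENALTY, the Node layout, and the heuristic are all illustrative choices on my part, not anything RobotC defines:

    /* Minimal sketch of a turn-aware A* node and heuristic.
       All names and constants here are illustrative. */
    #include <stdlib.h>

    #define GRID_SIZE    64
    #define TURN_PENALTY 3       /* assumed cost per 90-degree turn */

    typedef struct {
        unsigned char x, y;      /* grid coordinates, 0..63 fit in a byte */
        unsigned char heading;   /* 0=N, 1=E, 2=S, 3=W */
        int g;                   /* cost from the start node */
        int f;                   /* g + heuristic */
    } Node;

    /* Manhattan distance plus a lower bound on unavoidable turns:
       if both dx and dy are nonzero, at least one 90-degree turn is
       needed whatever the current heading, so this stays admissible.
       Whether it can also be made consistent with turn penalties is
       exactly my open question; the heading is kept in the Node so
       turn costs can at least be charged exactly in g. */
    int heuristic(int x, int y, int goalX, int goalY)
    {
        int dx = abs(goalX - x);
        int dy = abs(goalY - y);
        int turns = (dx > 0 && dy > 0) ? 1 : 0;
        return dx + dy + TURN_PENALTY * turns;
    }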

Roughly eight paths would be cached if I decide to use precomputed lookup tables.
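
To give a rough sense of the storage involved, here is the back-of-the-envelope layout I'm assuming; MAX_PATH_LEN and the byte-packed waypoint format are my own illustrative choices, not NXT or RobotC constants:

    #define NUM_CACHED_PATHS 8
    #define MAX_PATH_LEN     128              /* assumed cap on waypoints per path */

    typedef struct {
        unsigned char x, y;                   /* 0..63 fits in one byte each */
    } Waypoint;

    typedef struct {
        unsigned char length;                 /* waypoints actually used */
        Waypoint      points[MAX_PATH_LEN];   /* 2 bytes per waypoint */
    } CachedPath;

    /* 8 paths * (1 + 128 * 2) bytes = 2056 bytes, which is small next to
       a full 64x64 grid: even one byte per cell is already 4096 bytes. */
    CachedPath pathTable[NUM_CACHED_PATHS];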

Instead of a set data structure, I'll probably use a boolean array for the set of visited nodes. Since I'm working with a square grid, I can use a 2D array for the came-from map needed to reconstruct the path.
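
Concretely, something like this is what I mean by the boolean array and the 2D map; the direction encoding and reconstructPath are names I made up for this sketch:

    #include <stdbool.h>

    #define GRID_SIZE 64

    bool          visited[GRID_SIZE][GRID_SIZE];   /* closed set: one flag per cell */
    unsigned char cameFrom[GRID_SIZE][GRID_SIZE];  /* move direction used to reach each
                                                      cell: 0=N, 1=E, 2=S, 3=W, 255=unset */

    static const int DX[4] = { 0, 1, 0, -1 };
    static const int DY[4] = { -1, 0, 1, 0 };

    /* Walk backwards from the goal to the start using cameFrom, writing
       waypoints (in goal-to-start order) into outX/outY.  Returns the
       number of waypoints written, or 0 if no path was recorded. */
    int reconstructPath(int startX, int startY, int goalX, int goalY,
                        unsigned char *outX, unsigned char *outY, int maxLen)
    {
        int n = 0;
        int x = goalX, y = goalY;
        while (n < maxLen) {
            outX[n] = (unsigned char)x;
            outY[n] = (unsigned char)y;
            n++;
            if (x == startX && y == startY)
                break;                    /* reached the start: done */
            unsigned char d = cameFrom[y][x];
            if (d > 3)
                return 0;                 /* unset entry: no recorded path */
            x -= DX[d];                   /* step back to the parent cell */
            y -= DY[d];
        }
        return n;
    }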

I'd love some feedback and answers to these questions if anyone has any. Thanks!

Author:  hexafraction [ Wed Sep 24, 2014 4:31 pm ]
Post subject:  Re: Limits on memory for pathfinding algorithms vs lookup tables

*bump* as I am still looking for an answer going into the new FTC season
