
Re: Mindsensors Sumo Eyes / internals + NXC driver

Posted: 17 Jun 2011, 13:59
by mattallen37
Mindsensors likes to have their own specially named functions. Most likely, the sensor just uses light sensor mode. IIRC, it has two ranges, so the range is probably selected by pin 5 (the light sensor LED control pin). I doubt it is any different from the normal functions native to BCC. If you really need to know for sure, you can look it up in the provided lib file.
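If you want to try that idea without the dedicated API, here is a minimal NXC sketch along those lines. It assumes the range really is toggled through the light-sensor type (i.e. via pin 5), which is my guess, not something confirmed by the Mindsensors docs.

Code:

task main()
{
  // Guess: LIGHT_ACTIVE vs. LIGHT_INACTIVE drives pin 5 and thereby
  // selects one of the two detection ranges. Swap the type to try the other.
  SetSensorType(S1, SENSOR_TYPE_LIGHT_ACTIVE);
  SetSensorMode(S1, SENSOR_MODE_PERCENT); // scaled 0..100 reading
  ResetSensor(S1);

  while (true)
  {
    NumOut(0, LCD_LINE1, Sensor(S1)); // scaled value
    Wait(MS_100);
  }
}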

Re: Mindsensors Sumo Eyes / internals + NXC driver

Posted: 17 Jun 2011, 15:52
by HaWe
so you mean it's not an original NXC keyword but a Mindsensors custom function?
But the NXC help (F1) seems to know it, so I thought it would be a native NXC keyword...(?)

Re: Mindsensors Sumo Eyes / internals + NXC driver

Posted: 17 Jun 2011, 17:08
by mightor
It is a firmware function and not something Mindsensors made up. I am pretty sure that Mindsensors is one of the few that use it, but you can't fault them for making use of something the firmware provides :)

- Xander

Re: Mindsensors Sumo Eyes / internals + NXC driver

Posted: 17 Jun 2011, 17:31
by HaWe
I had already assumed it's a firmware function, otherwise there wouldn't be a context help link. Nevertheless, this "help link" doesn't tell us a lot about it.
So do you know how to mimic SensorNormalized with SetSensor/~Type/~Mode?

Re: Mindsensors Sumo Eyes / internals + NXC driver

Posted: 20 Jun 2011, 21:22
by afanofosc
I recommend using the official NXC API functions for the NXTSumoEyes device.

Here's the function you use to configure the sensor:

void SetSensorNXTSumoEyes(byte port, bool longRange);

Here's the function you use to read its value:

char SensorNXTSumoEyes(byte port);

Here's the sample code from the NXC online help files:

Code:

inline void TurnLeft() { }
inline void TurnRight() { }
inline void GoStraight() { }
inline void SearchForObstacle() { }

task main()
{
  SetSensorNXTSumoEyes(S1, true); // long range 
  while(true)
  {
    char zone = SensorNXTSumoEyes(S1);
    switch (zone) {
      case NXTSE_ZONE_LEFT: 
        TurnLeft();
        break;
      case NXTSE_ZONE_RIGHT: 
        TurnRight();
        break;
      case NXTSE_ZONE_FRONT: 
        GoStraight();
        break;
      default:
        SearchForObstacle();
        break;
    }
    NumOut(0, LCD_LINE1, SensorNXTSumoEyesRaw(S1));
    Wait(MS_500);
  }
}
John Hansen

Re: Mindsensors Sumo Eyes / internals + NXC driver

Posted: 21 Jun 2011, 09:21
by HaWe
thank you, John!
Now could you please finally tell me what these values actually are?
(Just out of general interest; I haven't bought one yet, for several reasons):
NXTSE_ZONE_FRONT
NXTSE_ZONE_LEFT
NXTSE_ZONE_NONE
NXTSE_ZONE_RIGHT

And could you please tell me what SensorNormalized() does in general?

Re: Mindsensors Sumo Eyes / internals + NXC driver

Posted: 21 Jun 2011, 16:01
by afanofosc
The sumoeyes zones are documented here:

http://www.mindsensors.com/index.php?mo ... ile_id=863

Each zone is 20 degrees wide. If 0 degrees is straight ahead, then the front zone runs from 350 (aka -10) to +10 degrees, the left zone from 330 to 350 degrees, and the right zone from 10 to 30 degrees. Short range is up to 6 inches, long range up to 12 inches.
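If you want something more readable than the raw zone number on the brick, here is a small sketch that turns the zone constant into text on the LCD. The ShowZone helper is just an illustration, not part of the NXC API.

Code:

// Hypothetical helper: print the detected zone as text.
// Per the geometry above: front = -10..+10, left = 330..350, right = 10..30 degrees.
void ShowZone(char zone)
{
  switch (zone) {
    case NXTSE_ZONE_FRONT: TextOut(0, LCD_LINE2, "front"); break;
    case NXTSE_ZONE_LEFT:  TextOut(0, LCD_LINE2, "left "); break;
    case NXTSE_ZONE_RIGHT: TextOut(0, LCD_LINE2, "right"); break;
    default:               TextOut(0, LCD_LINE2, "none "); break;
  }
}

task main()
{
  SetSensorNXTSumoEyes(S1, false); // short range (up to ~6 inches)
  while (true)
  {
    ShowZone(SensorNXTSumoEyes(S1));
    Wait(MS_500);
  }
}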

SensorNormalized is reading the NormalizedValue (aka SensorRaw) field from the Input module's IOMap structure. That's all. If you want to know what the firmware does then read the firmware source code. It is trivially easy to read if you can write random number generators or recursive maze solvers or a-star algorithms.

This has also been discussed fairly thoroughly before, when we were talking about how the firmware provides access to the raw sensor values.

The unmodified raw value is provided like this:

Code:

      dInputGetRawAd(&InputVal, No);
      IOMapInput.Inputs[No].ADRaw = InputVal;
It is available in NXC via SensorRaw(port), SensorValueRaw(port), or GetInput(port, RawValueField).

The normalized and scaled values are provided like this:

Code:

      if (sType == REFLECTION)
      {
        cInputCalcFullScale(&InputVal, REFLECTIONSENSORMIN, REFLECTIONSENSORPCTDYN, TRUE);
      }
      else if (sType == TEMPERATURE)
      {
        if (InputVal < 290)
          InputVal = 290;
        else if (InputVal > 928)
          InputVal = 928;
        InputVal = TempConvTable[(InputVal) - /*197*/ 290];
        InputVal = InputVal + 200;
        InputVal = (UWORD)(((SLONG)InputVal * (SLONG)1023)/(SLONG)900);
      }
      else if (sType == LIGHT_ACTIVE || sType == LIGHT_INACTIVE)
      {
        cInputCalcFullScale(&InputVal, NEWLIGHTSENSORMIN, NEWLIGHTSENSORPCTDYN, TRUE);
      }
      else if (sType == SOUND_DB || sType == SOUND_DBA)
      {
        cInputCalcFullScale(&InputVal, NEWSOUNDSENSORMIN, NEWSOUNDSENSORPCTDYN, TRUE);
      }
      else if (sType == CUSTOM)
      {
        cInputCalcFullScale(&InputVal, IOMapInput.Inputs[No].CustomZeroOffset, IOMapInput.Inputs[No].CustomPctFullScale, FALSE);
      }
      cInputCalcSensorValue(  InputVal,  // <-- this is the normalized raw value (input to this function and output from cInputCalcFullScale)
                            &(IOMapInput.Inputs[No].SensorRaw),   // <-- this is the normalized raw value (always == InputVal)
                            &(IOMapInput.Inputs[No].SensorValue),  // <-- this is the scaled value
                            &(IOMapInput.Inputs[No].SensorBoolean),  // <-- this is the boolean value
                            &(VarsInput.InputDebounce[No]),
                            &(VarsInput.SampleCnt[No]),
                            &(VarsInput.LastAngle[No]),
                            &(VarsInput.EdgeCnt[No]),
                            ((IOMapInput.Inputs[No].SensorMode) & SLOPEMASK),
                            ((IOMapInput.Inputs[No].SensorMode) & MODEMASK));
and then this:

Code:

void      cInputCalcFullScale(UWORD *pRawVal, UWORD ZeroPointOffset, UBYTE PctFullScale, UBYTE InvStatus)
{
  if (*pRawVal >= ZeroPointOffset)
  {
    *pRawVal -= ZeroPointOffset;
  }
  else
  {
    *pRawVal = 0;
  }

  *pRawVal = (*pRawVal * 100)/PctFullScale;
  if (*pRawVal > SENSOR_RESOLUTION)
  {
    *pRawVal = SENSOR_RESOLUTION;
  }
  if (TRUE == InvStatus)
  {
    *pRawVal = SENSOR_RESOLUTION - *pRawVal;
  }
}
and this:

Code:

void      cInputCalcSensorValue(UWORD NewSensorRaw, UWORD *pOldSensorRaw, SWORD *pSensorValue,
                                UBYTE *pBoolean,    UBYTE *pDebounce,     UBYTE *pSampleCnt,
                                UBYTE *LastAngle,   UBYTE *pEdgeCnt,      UBYTE Slope,
                                UBYTE Mode)
{
  SWORD   Delta;
  UBYTE   PresentBoolean;
  UBYTE   Sample;

  if (0 == Slope)
  {

    /* This is absolute measure method */
    if (NewSensorRaw > THRESHOLD_FALSE)
    {
      PresentBoolean = FALSE;
    }
    else
    {
      if (NewSensorRaw < THRESHOLD_TRUE)
      {
        PresentBoolean = TRUE;
      }
    }
  }
  else
  {

    /* This is dynamic measure method */
    if (NewSensorRaw > (ACTUAL_AD_RES - Slope))
    {
      PresentBoolean = FALSE;
    }
    else
    {
      if (NewSensorRaw < Slope)
      {
        PresentBoolean = TRUE;
      }
      else
      {
        Delta = *pOldSensorRaw - NewSensorRaw;
        if (Delta < 0)
        {
          if (-Delta > Slope)
          {
            PresentBoolean = FALSE;
          }
        }
        else
        {
          if (Delta > Slope)
          {
            PresentBoolean = TRUE;
          }
        }
      }
    }
  }
  *pOldSensorRaw = NewSensorRaw;

  switch(Mode)
  {

    case RAWMODE:
    {
      *pSensorValue = NewSensorRaw;
    }
    break;

    case BOOLEANMODE:
    {
      *pSensorValue = PresentBoolean;
    }
    break;

    case TRANSITIONCNTMODE:
    {
      if ((*pDebounce) > 0)
      {
        (*pDebounce)--;
      }
      else
      {
        if (*pBoolean != PresentBoolean)
        {
          (*pDebounce) = DEBOUNCERELOAD;
          (*pSensorValue)++;
        }
      }
    }
    break;

    case PERIODCOUNTERMODE:
    {
      if ((*pDebounce) > 0)
      {
        (*pDebounce)--;
      }
      else
      {
        if (*pBoolean != PresentBoolean)
        {
          (*pDebounce) = DEBOUNCERELOAD;
          *pBoolean = PresentBoolean;
          if (++(*pEdgeCnt) > 1)
          {
            if (PresentBoolean == 0)
            {
              (*pEdgeCnt) = 0;
              (*pSensorValue)++;
            }
          }
        }
      }
    }
    break;

    case PCTFULLSCALEMODE:
    {

      /* Output is 0-100 pct */
     *pSensorValue   = ((NewSensorRaw) * 100)/SENSOR_RESOLUTION;
    }
    break;

    case FAHRENHEITMODE:
    {

      /* Fahrenheit mode goes from -40 to 158 degrees */
      *pSensorValue = (((ULONG)(NewSensorRaw) * 900L)/SENSOR_RESOLUTION) - 200;
      *pSensorValue =  ((180L * (ULONG)(*pSensorValue))/100L) + 320;
    }
    break;

    case CELSIUSMODE:
    {

      /* Celsius mode goes from -20 to 70 degrees */
      *pSensorValue   = (((ULONG)(NewSensorRaw * 900L)/SENSOR_RESOLUTION) - 200);
    }
    break;

    case ANGLESTEPSMODE:
    {
      *pBoolean = PresentBoolean;

      if (NewSensorRaw < ANGLELIMITA)
      {
        Sample = 0;
      }
      else
      {
        if (NewSensorRaw < ANGLELIMITB)
        {
          Sample = 1;
        }
        else
        {
          if (NewSensorRaw < ANGLELIMITC)
          {
            Sample = 2;
          }
          else
          {
            Sample = 3;
          }
        }
      }

      switch (*LastAngle)
      {
        case 0 :
        {
          if (Sample == 1)
          {
            if ((*pSampleCnt) >= ROT_SLOW_SPEED )
            {

              if (++(*pSampleCnt) >= (ROT_SLOW_SPEED + ROT_OV_SAMPLING))
              {
                (*pSensorValue)++;
                (*LastAngle) = Sample;
              }
            }
            else
            {
              (*pSensorValue)++;
              (*LastAngle) = Sample;
            }
          }
          if (Sample == 2)
          {
            (*pSensorValue)--;
            (*LastAngle) = Sample;
          }
          if (Sample == 0)
          {
            if ((*pSampleCnt) < ROT_SLOW_SPEED)
            {
              (*pSampleCnt)++;
            }
          }
        }
        break;
        case 1 :
        {
          if (Sample == 3)
          {
            (*pSensorValue)++;
            (*LastAngle) = Sample;
          }
          if (Sample == 0)
          {
            (*pSensorValue)--;
            (*LastAngle) = Sample;
          }
          (*pSampleCnt) = 0;
        }
        break;
        case 2 :
        {
          if (Sample == 0)
          {
            (*pSensorValue)++;
            (*LastAngle) = Sample;
          }
          if (Sample == 3)
          {
            (*pSensorValue)--;
            (*LastAngle) = Sample;
          }
          (*pSampleCnt) = 0;
        }
        break;
        case 3 :
        {
          if (Sample == 2)
          {
            if ((*pSampleCnt) >= ROT_SLOW_SPEED)
            {

              if (++(*pSampleCnt) >= (ROT_SLOW_SPEED + ROT_OV_SAMPLING))
              {
                (*pSensorValue)++;
                (*LastAngle) = Sample;
              }
            }
            else
            {
              (*pSensorValue)++;
              (*LastAngle) = Sample;
            }
          }
          if (Sample == 1)
          {
            (*pSensorValue)--;
             (*LastAngle) = Sample;
          }
          if (Sample == 3)
          {
            if ((*pSampleCnt) < ROT_SLOW_SPEED)
            {
              (*pSampleCnt)++;
            }
          }
        }
        break;
      }
    }
  }

  *pBoolean  = PresentBoolean;
}
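Boiled down, the cInputCalcFullScale step above is just an offset subtraction, a stretch by the percent-of-full-scale constant, a clamp to SENSOR_RESOLUTION, and an optional inversion. Here is a rough NXC equivalent of that math, assuming SENSOR_RESOLUTION is 1023 (the 10-bit ADC range); the offset and percentage constants depend on the sensor type, so they are left as parameters:

Code:

// Sketch of the cInputCalcFullScale math in NXC, for illustration only.
// zeroOffset and pctFullScale are the per-type constants
// (e.g. NEWLIGHTSENSORMIN / NEWLIGHTSENSORPCTDYN for the light sensor).
long NormalizeRaw(long rawVal, long zeroOffset, long pctFullScale, bool invert)
{
  if (rawVal >= zeroOffset)
    rawVal -= zeroOffset;
  else
    rawVal = 0;

  rawVal = (rawVal * 100) / pctFullScale;
  if (rawVal > 1023) rawVal = 1023;   // clamp to SENSOR_RESOLUTION
  if (invert) rawVal = 1023 - rawVal; // light/sound/reflection are inverted

  return rawVal;
}
With the right constants for your sensor type, NormalizeRaw(SensorRaw(S1), ...) should give the same number as SensorNormalized(S1), which is essentially the "mimic SensorNormalized" question from earlier in the thread.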
The normalized value is available in NXC via SensorNormalized(port) or GetInput(port, NormalizedValueField).

The scaled value is available in NXC via Sensor(port), SENSOR_n, SensorValue(port), SensorScaled(port), or GetInput(port, ScaledValueField).
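To see the three levels next to each other on the brick, here is a small test sketch (using the standard light sensor purely as an example; any analog sensor would do):

Code:

task main()
{
  SetSensorLight(S1); // LIGHT_ACTIVE type, percent mode
  while (true)
  {
    NumOut(0, LCD_LINE1, SensorRaw(S1));        // unmodified ADRaw
    NumOut(0, LCD_LINE2, SensorNormalized(S1)); // normalized raw
    NumOut(0, LCD_LINE3, Sensor(S1));           // scaled value (0..100 here)
    Wait(MS_100);
  }
}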

John Hansen

Re: Mindsensors Sumo Eyes / internals + NXC driver

Posted: 21 Jun 2011, 16:19
by HaWe
afanofosc wrote:
If you want to know what the firmware does then read the firmware source code. It is trivially easy to read if you can write random number generators or recursive maze solvers or a-star algorithms.
Don't overestimate my abilities...
I already read the documentation - my question was only about the constants!
NXTSE_ZONE_FRONT
NXTSE_ZONE_LEFT
NXTSE_ZONE_NONE
NXTSE_ZONE_RIGHT
afanofosc wrote:
SensorNormalized is reading the NormalizedValue (aka SensorRaw)...
The normalized value is available in NXC via SensorNormalized(port) or GetInput(port, NormalizedValueField).
aah, I see...
and via SensorRaw(port) too, if I understand correctly...?

Re: Mindsensors Sumo Eyes / internals + NXC driver

Posted: 21 Jun 2011, 17:25
by afanofosc
As I said in my last post, the SensorRaw NXC API function returns the ADRaw or unmodified raw sensor value. It does not return the normalized raw value. The functions that return the Normalized raw value (which the firmware calls "SensorRaw" internally) are the two that I listed.

The values of these NXT SumoEyes constants are documented in the NXC help online (0..3). What else do you want to know about them?
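If you just want to see the numbers without digging through the help, here is a quick sketch that prints the four constants, whatever values your header version defines:

Code:

task main()
{
  NumOut(0, LCD_LINE1, NXTSE_ZONE_NONE);
  NumOut(0, LCD_LINE2, NXTSE_ZONE_LEFT);
  NumOut(0, LCD_LINE3, NXTSE_ZONE_RIGHT);
  NumOut(0, LCD_LINE4, NXTSE_ZONE_FRONT);
  Wait(10000); // keep them on screen for ten seconds
}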

John Hansen

Re: Mindsensors Sumo Eyes / internals + NXC driver

Posted: 21 Jun 2011, 17:37
by HaWe
sorry, I don't understand, probably I'm too stupid.
you wrote:
SensorNormalized is reading the NormalizedValue (aka SensorRaw) field from the Input module's IOMap structure.
But SensorNormalized is NOT SensorRaw?
Something or other is odd...