Bounty for complete EV3 C/C++ API
Re: Bounty for complete EV3 C/C++ API
It's obviously not happening anytime soon. My EV3 is sort of sitting on the shelf.
I'm retracting my bounty at the end of February.
-- Pepijn
http://studl.es Mindstorms Building Instructions
Re: Bounty for complete EV3 C/C++ API
What's left to be done? Doesn't the Oct. 07 test release of BricxCC include a mostly complete C API, lacking only sensor functionality?
Re: Bounty for complete EV3 C/C++ API
Daisy-chaining via USB (as provided by the LEGO VM) is still missing, too.
Re: Bounty for complete EV3 C/C++ API
Can you provide an example, Doc? I'm not clear on what that is. A screenshot of analogous functionality in the LEGO software would suffice.
Re: Bounty for complete EV3 C/C++ API
I can't show you, because I don't use the LEGO EV3-G software.
You plug USB cables from brick to brick in a chain; the first is the master, the rest are slaves.
Daisy-chaining is described as "address all remote sensors and motors as if they were local", i.e.:
motors 1-4 are the master's motors
motors 5-8 are slave1's motors
motors 9-12 are slave2's motors
motors 13-16 are slave3's motors
You command all motors (e.g. rotate at 60% power, rotate for 2500 degrees) with the same kind of blocks, no matter whether they are local or remote. Sensors work analogously.
So up to 3 extra EV3s work as a kind of motor and sensor multiplexer for all of their I/Os.
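For a text-based API that addressing scheme boils down to simple arithmetic. Here is a minimal C sketch (the type and function names are made up for illustration, not taken from any released API) that maps a chain-wide motor number 1..16 to a brick layer (0 = master, 1..3 = slaves) and a local output port:

#include <stdio.h>

/* Illustrative only: map the chain-wide motor numbering (1..16)
   described above onto a brick "layer" and a local port. */
typedef struct {
    int  layer;   /* 0 = master, 1..3 = slave1..slave3 */
    char port;    /* 'A'..'D' on that brick            */
} DcMotor;

static DcMotor dcMotorFromGlobal(int n)   /* n = 1..16 */
{
    DcMotor m;
    m.layer = (n - 1) / 4;                /* motors 1-4 -> layer 0, 5-8 -> layer 1, ... */
    m.port  = (char)('A' + (n - 1) % 4);  /* local output port on that layer            */
    return m;
}

int main(void)
{
    DcMotor m = dcMotorFromGlobal(7);                 /* motor 7 in the chain */
    printf("layer %d, port %c\n", m.layer, m.port);   /* prints: layer 1, port C */
    return 0;
}

As far as I can tell from the lms2012 bytecode definitions, the official VM works the same way internally: the output opcodes carry a LAYER parameter that selects which brick in the chain executes the command.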
The LEGO software recognizes all motors and sensors on all ports automatically ("auto-detection"), but this feature is not strictly needed in a text-based language if you can configure it manually.
Additionally, all slaves may run individual programs, and you can also send BT messages to them for additional communication.
- here is one with 2 daisy-chained bricks: https://www.youtube.com/watch?v=7mpUl5yWWWM
- another one with 2 bricks: http://www.youtube.com/watch?v=lfkVzCeoNvw
- this one is using 3 daisy-chained bricks: https://www.youtube.com/watch?v=DbgLQeQ ... D6KZFUWuxX
HTH!
Re: Bounty for complete EV3 C/C++ API
Looking at the Ev3 sources, it seems like that feature has to be enabled on the firmware level (and is by default). I don't have several bricks around to test it, let alone work out how the motors are addressed from the master.
As far as input sensors go, the automatic detection of sensors is pretty straightforward (the built-in port viewer tool's source shows how this is done), and I am working on that when I have spare time (which is admittedly not often).
If you want to know more about how the sensors work, see the following pages of the Ev3 Sources documentation:
http://python-ev3.org/DcmDriver.html
http://python-ev3.org/UartProtocol.html
http://python-ev3.org/testsensorsappcode.html
http://python-ev3.org/types.html
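To give a feel for what that auto-detection looks like from user code, here is a rough sketch. It assumes the lms2012 headers and the ANALOG/dcm layout described on the DcmDriver page above; the device path and the InConn/InDcm field names are taken from the published EV3 source and may differ between firmware versions:

/* Rough sketch: read the dcm driver's detection results per input port.
   Check lms2012.h on your firmware for the exact ANALOG layout. */

#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/mman.h>
#include "lms2012.h"            /* ANALOG, INPUTS, connection/type constants */

int main(void)
{
    int fd = open("/dev/lms_analog", O_RDWR | O_SYNC);
    if (fd < 0) return 1;

    ANALOG *pAnalog = (ANALOG *)mmap(0, sizeof(ANALOG),
                                     PROT_READ | PROT_WRITE,
                                     MAP_SHARED, fd, 0);
    if (pAnalog == MAP_FAILED) { close(fd); return 1; }

    /* The dcm driver continuously updates the connection type and
       device type it has detected on each input port. */
    for (int port = 0; port < INPUTS; port++)
        printf("port %d: conn=%d type=%d\n",
               port + 1, pAnalog->InConn[port], pAnalog->InDcm[port]);

    munmap(pAnalog, sizeof(ANALOG));
    close(fd);
    return 0;
}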
Re: Bounty for complete EV3 C/C++ API
I don't understand this low-level stuff.
I currently have 2 EV3s - how can I use it with BCC/C and CSLite? (I actually doubt that this is already possible.)
Re: Bounty for complete EV3 C/C++ API
Like I said, we're all working on it in our spare time and don't expect anything terribly soon. At this time, if you don't want to deal with low-level code, you're SoL.
Re: Bounty for complete EV3 C/C++ API
I didn't realize you meant that you personally are already working on that for EV3/C - glad to hear that you haven't abandoned this, too!
I actually thought that this daisy-chaining thing was implemented only for the LEGO VM; I didn't expect it to ever become accessible from C compilers as well.
Are you working on a sensor API, too?
Re: Bounty for complete EV3 C/C++ API
To the best of my knowledge there are at least 3-4 different groups/individuals working independently on different implementations of EV3 code. I am not writing an entire API, merely adding sensor input to John Hansen's (BricxCC/NXC) currently released API, initially by wrapping Lauro Ojeda's (robotnav.org) example code. Currently I'm working on re-implementing sensor access the way it is done in the 'official' source - that is, a single sensor call that automatically detects the sensor type and gets the proper data.
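For the curious, that "single sensor call" idea looks roughly like this in C. ReadSensor() and its helpers are made-up names for illustration only, not part of John Hansen's released API; the helpers are stubbed here so the sketch compiles, whereas a real version would read the dcm/uart devices as in the lms2012 source:

#include <stdio.h>

typedef enum { SENSOR_NONE, SENSOR_TOUCH, SENSOR_COLOR, SENSOR_GYRO } SensorType;

/* Stubs standing in for the real device reads (values are placeholders). */
static SensorType DetectSensorType(int port) { (void)port; return SENSOR_TOUCH; }
static int ReadAnalogRaw(int port)           { (void)port; return 3000; }
static int ReadUartValue(int port)           { (void)port; return 0; }

/* One call: detect what is plugged into the port, then read it the right way. */
int ReadSensor(int port)
{
    switch (DetectSensorType(port)) {
    case SENSOR_TOUCH: return ReadAnalogRaw(port) > 2500;  /* 1 = pressed; threshold illustrative */
    case SENSOR_COLOR:
    case SENSOR_GYRO:  return ReadUartValue(port);         /* UART sensors */
    default:           return -1;                          /* nothing attached */
    }
}

int main(void)
{
    printf("port 1 reads %d\n", ReadSensor(0));
    return 0;
}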
As I do not have access to any EV3 robots at this time, work is at a standstill. The ones we were using were on loan from another organization and were returned at the end of last semester (mid-December). Until we secure funding to purchase our own, there probably will be no progress on my end.