I can't show you because I don't use the Lego EV3-G software.
You plug USB cables from brick to brick in a chain; the first brick is the master, the rest are slaves.
Daisy-chaining is described as "address all remote sensors and motors as if they were local", i.e.:
motors 1-4 are master motors
motors 5-8 are slave1 motors
motors 9-12 are slave2 motors
motors 13-16 are slave3 motors
You command all motors (e.g. rotate at 60% power, rotate for 2500 degrees) with the same kind of blocks, no matter whether they are local or remote.
Sensors are addressed analogously (ports 1-4 on the master, 5-8 on slave1, and so on).
So up to 3 extra EV3 bricks work as a sort of motor and sensor multiplexer for all of their I/Os.
The Lego software recognizes all motors and sensors on all ports automatically ("auto-detection") but this feature is not necessarily needed in a text-based language if you're able to configure it manually.
Additionally, all slaves may run individual programs, and you can also send BT messages to them for extra communication abilities.
Looking at the EV3 sources, it seems that the feature has to be enabled at the firmware level (and it is by default). I don't have several bricks around to test it, let alone work out how the motors are addressed from the master.
Like I said, we're all working on it in our spare time and don't expect anything terribly soon. At this time, if you don't want to deal with low-level code, you're SoL.
I didn't realize you meant that you personally are already working on that for EV3/C - glad to hear that you haven't abandoned this, too!
I actually also thought that this daisy-chaining feature was implemented only for the Lego VM; I didn't expect it would ever become accessible to C compilers as well.
To the best of my knowledge, there are at least 3-4 different groups/individuals working independently on different implementations of EV3 code. I am not writing an entire API, merely appending sensor input to John Hansen's (BricxCC/NXC) currently released API, initially by wrapping Lauro Ojeda's (robotnav.org) example code. Currently I'm working on re-implementing sensor access the way it is done in the 'official' source - that is, a single sensor call that automatically detects the sensor type and gets the proper data.
As I do not have access to any EV3 robots at this time, work is at a standstill. The ones we were using were on loan from another organization, and they were returned at the end of last semester (mid-December). Until we secure funding to purchase our own, there probably will be no progress on my end.