Andy in the Cloud

From BBC Basic to Force.com and beyond…

Meanwhile… On BrickInTheCloud… Sensory Input!


In my last blog I focused on getting the Lego robot to understand a selection of commands given to it via posts sent to its Chatter persona from Salesforce’s Chatter mobile app. In return it gave basic confirmations (as post comments) that it had executed the commands.
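To give a flavour of that confirmation step, here is a rough Apex sketch of posting a comment back onto the command post. The class and method names are purely illustrative assumptions, not the actual implementation.

// Sketch: post a confirmation back as a Chatter comment once a command has run.
// Class and method names are illustrative assumptions.
public class RobotCommandConfirmer {
    // feedItemId is the Id of the Chatter post that carried the command
    public static void confirmCommand(Id feedItemId, String commandName) {
        FeedComment confirmation = new FeedComment(
            FeedItemId  = feedItemId,
            CommentBody = 'Executed command: ' + commandName
        );
        insert confirmation;
    }
}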

This time I wanted to explore the sensors that came with the kit, and ways in which I can push the data from those sensors into Force.com for further processing and analytics, eventually enabling dynamic adaptation of the robot through a combination of Apex code running in the cloud and the NXC code running within the robot.
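As a flavour of what the Force.com side of this could look like, here is a minimal sketch of an Apex REST endpoint the robot could post its readings to. The Sensor_Reading__c object and its fields are assumptions for illustration only, not the actual data model used.

// Sketch only: a REST endpoint accepting a single sensor reading from the robot.
// The Sensor_Reading__c object and its fields are assumed for illustration.
@RestResource(urlMapping='/robot/sensorreading/*')
global with sharing class RobotSensorReadingResource {

    @HttpPost
    global static Id postReading(String sensorName, Decimal value) {
        Sensor_Reading__c reading = new Sensor_Reading__c(
            Sensor_Name__c = sensorName,     // e.g. 'Ultrasonic' (assumed field)
            Value__c       = value,          // raw sensor value (assumed field)
            Recorded_At__c = Datetime.now()  // when the reading arrived (assumed field)
        );
        insert reading;
        return reading.Id; // handy for the robot to log the stored record
    }
}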


Read more and watch the video here
