Andy in the Cloud

From BBC Basic to Force.com and beyond…



The Third Edition

I’m proud to announce that the third edition of my book has now been released. Back in March this year I took the plunge to start updating many key areas and adding two brand new chapters. In the 2 years and 8 months since the last edition there have been several platform releases and an increasing number of new features and innovations that made this the biggest update ever! This edition also embraces the platform’s rebranding to Lightning, hence the book is now entitled Salesforce Lightning Platform Enterprise Architecture.

You can purchase this book direct from Packt or, of course, from Amazon among other sellers. As is the case every year, at Salesforce events such as Dreamforce and TrailheaDX this book and many other awesome publications will be on sale. Here are some of the key update highlights:

  • Automation and Tooling Updates
    Throughout the book the SFDX CLI, Visual Studio Code and 2nd Generation Packaging are leveraged. While the whole book is certainly larger, certain chapters actually reduced in size as steps previously reflecting clicks were replaced with CLI commands! At one point in time I was quite a master of Ant scripts and macros; they too have given way to built-in SFDX commands.
  • User Interface Updates
    Lightning Web Components is a relatively new kid on the block, but benefits greatly from its standards compliance, meaning there is plenty of fun to go around exploring industry tools like Jest in the Unit Testing chapter. All of the book’s components have been re-written to the Web Component standard.
  • Big Data and Async Programming
    Big data was once a future concern for new products; these days it is very much a concern from the very start. The book covers Big Objects and Platform Events more extensively with worked examples, including ingest and calculations driven by Platform Events and Async Apex Triggers. Event Driven Architecture is something every Lightning developer should be embracing as the platform continues to evolve around more and more standard features that leverage it.
  • Integration and Extensibility
    I particularly enjoyed exploring the use of Platform Events as another means by which you can expose APIs from your packages, to support more scalable invocation of your logic and asynchronous plugins.
  • External Integrations and AI
    External integrations with other cloud services are a key part of application development and of the implementation of your solution, thus one of the two brand new chapters focuses on Connected Apps, Named Credentials, External Services and External Objects, with worked examples using existing services or sample Heroku-based services. Einstein has an ever-growing surface area across Salesforce products and the platform. While this topic alone is worth an entire book, I took the time in the second new chapter to enumerate Einstein from the perspective of developer and customer configurations. The Formula 1 motor racing theme continues with the ingestion of historic race data that you can run AI over.
  • Other Updates
    Among other updates is a fairly extensive update to the CI/CD chapter, which still covers Jenkins but leverages the new Jenkins Pipeline feature to integrate the SFDX CLI. The Unit Testing chapter has also been extended with further thoughts on unit vs integration testing and a focus on Lightning Web Component testing.

The above are just highlights for this third edition; you can see the full table of contents here. A massive thanks to everyone involved for providing the inspiration and support for making this third edition happen! Enjoy!



Swagger / Open API + Salesforce = LIKE

In my previous blog I covered an exciting new integration tool from Salesforce, which consumes APIs that have a descriptor (or schema) associated with them. External Services allows point-and-click integration with APIs. The ability for Salesforce to consume APIs complying with API schema standards is a pretty huge step forward, extending its ability to integrate with ease in a way that is in keeping with its low-barrier-to-entry development and clicks-not-code mantra.


At the time of writing my previous blog, only the Interagent schema was supported by External Services. However, as of the Winter ’18 release this is no longer the case. In this blog I will explore the more widely adopted Swagger / Open API 2.0 standard, using Node.js, Heroku and External Services. As a bonus topic, I will also touch on using Swagger Code Generator with Apex!

One of the many benefits of supporting the Swagger / Open API standard is the ability to generate documentation for it. The following screenshot shows the API schema on the left and the generated documentation on the right. What is also very cool about this is the Try this operation button. Give it a try for yourself now!

asciiartswagger


What’s the difference between Swagger and Open API 2.0? This was a question I asked myself and thought I would cover the answer here. Basically, as at Swagger v2.0 there is no difference; the Open API Initiative is a rebranding, born out of the huge adoption Swagger has seen since its creation. This move means its future is more formalised and it has a more meaningful name. You can read more about this amazing story here.

Choosing your methodology for API development

The schema shown above might look a bit scary, and you might well want to just get writing code and think about the schema when you’re ready to share your API. This is certainly supported, and there are some tools that support generation of the schema via JSDoc comments in your code or via your joi schema here (useful for existing APIs).

However, to really embrace an API-first strategy in your development team I feel you should start with the requirements and thus the schema first. This allows others in your team or the intended recipients to review the API before it has been developed and even test it out with stub implementations. In my research I was thus drawn to Swagger Node, a tool set donated by Apigee that embraces API-design-first. Read more pros and cons here. It is also the formal Node.js implementation associated with Swagger.

The following diagram describes the API-design-first development process.

overview2

(ref: Swagger Node README)

Developing Open APIs with “Swagger Node”

Swagger Node is very easy to get started with and is well documented here. It supports the full API-design-first development process shown in the diagram above. The editor (also shown above) is really useful for getting used to writing schemas, and the UI is dynamically refreshed, including errors.

The overall Node.js project is still pretty simple (GitHub repo here), now consisting of three files. The schema is edited in YAML format (translated to JSON when served up to tools). The schema for the ASCIIArt service now looks like the following and is pretty self-describing. For further documentation on Swagger / Open API 2.0 see here.

https://createasciiart.herokuapp.com/schema/
swagger: "2.0"
info:
  version: "1.0.0"
  title: AsciiArt Service
# during dev, should point to your local machine
host: localhost:3000
# basePath prefixes all resource paths 
basePath: /
# 
schemes:
  # tip: remove http to make production-grade
  - http
  - https
# format of bodies a client can send (Content-Type)
consumes:
  - application/json
# format of the responses to the client (Accepts)
produces:
  - application/json
paths:
  /asciiart:
    # binds a127 app logic to a route
    x-swagger-router-controller: asciiart
    post:
      description: Returns ASCIIArt to the caller
      # used as the method name of the controller
      operationId: asciiart
      consumes:
        - application/json
      parameters:
        - in: body
          name: body
          description: Message to convert to ASCIIArt
          schema:
            type: object
            required: 
              - message
            properties:
              message:
                type: string
      responses:
        "200":
          description: Success
          schema:
            # a pointer to a definition
            $ref: "#/definitions/ASCIIArtResponse"
  /schema:
    x-swagger-pipe: swagger_raw
# complex objects have schema definitions
definitions:
  ASCIIArtResponse:
    required:
      - art
    properties:
      art:
        type: string

The entry point of the Node.js app, the server.js file now looks like this…

'use strict';

var SwaggerExpress = require('swagger-express-mw');
var app = require('express')();
module.exports = app; // for testing
var config = {
  appRoot: __dirname // required config
};

SwaggerExpress.create(config, function(err, swaggerExpress) {
  if (err) { throw err; }
  // install middleware for swagger ui
  app.use(swaggerExpress.runner.swaggerTools.swaggerUi());
  // install middleware for swagger routing
  swaggerExpress.register(app);
  var port = process.env.PORT || 3000;
  app.listen(port);
});

Note: I changed the Node.js web server framework from hapi (used in my previous blog) to express, as I could not get the Swagger UI to integrate with hapi.

The code implementing the API has been moved into its own asciiart.js file.

var figlet = require('figlet');

function asciiart(request, response) {
    // Call figlet to generate the ASCII Art and return it!
    const msg = request.body.message;
    figlet(msg, function(err, data) {
        response.json({ art: data});
    });
}

module.exports = {
    asciiart: asciiart
};

Note: There is no parameter validation code written here; the Swagger Node module dynamically implements parameter validation for you (based on what you define in the schema) before the request reaches your code! It also validates your responses.

To access the documentation simply use the path /docs. The documentation is generated automatically; there is no need to manage static HTML files. I have hosted my sample AsciiArt service on Heroku so you can try it by clicking the link below.

https://createasciiart.herokuapp.com/docs/

swaggerui

Consuming Swagger APIs with External Services

The process described in my earlier blog for using the above API via External Services has not changed. External Services automatically recognises Swagger APIs.

externalservicesasciiart

NOTE: There is a small bug that prevents the callout if the basePath is specified as root in the schema. Thus this has been commented out in the deployed version of the schema for now. Salesforce will likely have fixed this by the time you read this.

Swagger Tools

  • Swagger Editor, the interactive editor shown in the first screenshot of this blog.
  • Swagger Code Generator, creates server stubs and clients for implementing and calling Swagger-enabled APIs.
  • Swagger UI, the browser-based UI for generating documentation. You can call this from the command line and upload the static HTML files, or use frameworks like the one used in this blog to generate it on the fly.

Can we use Swagger to call or implement APIs authored in Apex?

Swagger Tools are available on a number of platforms, including recently added support for Apex clients. This gives you another option to consume APIs directly in Apex. It’s not clear if this is going to be a better route than consuming the classes generated by External Services; I suspect it has some pros and cons, to be honest. Time will tell!

Meanwhile I did run the Swagger Code Generator for Apex and got this…

public class SwagDefaultApi {
    SwagClient client;

    public SwagDefaultApi(SwagClient client) {
        this.client = client;
    }

    public SwagDefaultApi() {
        this.client = new SwagClient();
    }

    public SwagClient getClient() {
        return this.client;
    }

    /**
     *
     * Returns ASCIIArt to the caller
     * @param body Message to convert to ASCIIArt (optional)
     * @return SwagASCIIArtResponse
     * @throws Swagger.ApiException if fails to make API call
     */
    public SwagASCIIArtResponse asciiart(Map<String, Object> params) {
        List<Swagger.Param> query = new List<Swagger.Param>();
        List<Swagger.Param> form = new List<Swagger.Param>();

        return (SwagASCIIArtResponse) client.invoke(
            'POST', '/asciiart',
            (SwagBody) params.get('body'),
            query, form,
            new Map<String, Object>(),
            new Map<String, Object>(),
            new List<String>{ 'application/json' },
            new List<String>{ 'application/json' },
            new List<String>(),
            SwagASCIIArtResponse.class
        );
    }
}

The code is also generated in a Salesforce DX compliant format, very cool!
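For completeness, here is a rough sketch of how the generated client above might be called from Apex. This is my own illustrative usage, not part of the generated output: the SwagClient, SwagDefaultApi and SwagASCIIArtResponse types come from the generator, but the SwagBody member name (message), the response accessor and how the client learns the service endpoint are assumptions based on the schema, so check the generated classes for the exact names.

// Illustrative only: construct the generated client and call the asciiart operation.
// The generated SwagClient is assumed to be configured with the service endpoint
// (for example via a Named Credential or Remote Site Setting).
SwagClient client = new SwagClient();
SwagDefaultApi api = new SwagDefaultApi(client);

// SwagBody mirrors the request body defined in the schema; 'message' is assumed
// to be the generated member name for the schema's message property.
SwagBody body = new SwagBody();
body.message = 'Hello World';

SwagASCIIArtResponse res = api.asciiart(new Map<String, Object>{ 'body' => body });
// The response model carries the 'art' string defined in the schema; the exact
// accessor name may differ in the generated code.
System.debug(res);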



Simplified API Integrations with External Services

Salesforce are on a mission to make accessing off-platform data and web services as easy as possible. This helps keep the user experience optimal and consistent for the user, and also allows admins to continue to leverage the platform’s tools such as Process Builder and Flow, even if the data or logic is not on the platform.

Starting with External Objects, they added the ability to see and also update data stored in external databases. Once set up, users can manipulate external records without leaving Salesforce, staying within the familiar UIs. With External Services, currently in Beta, they have extended this concept to external API services.

UPDATE: The ASCIIArt Service covered in this blog has since been updated to use the Swagger schema standard. However this blog is still a very useful introduction to External Services. Once you have read it, head on over to this blog!

In this blog let’s first focus on the clicks-not-code steps you can repeat in your own org to consume a live ASCII Art web service API I have exposed publicly. The API is simple: it takes a message and returns it in ASCII art format. The following steps result in a working UI to call the API and update a record.

ExternalServicesDemo.png

After the clicks-not-code bit I will share how the API was built, what’s required for compatibility with this feature, and how insanely easy it is to develop Web Services on Heroku using Node.js. So let’s dive into External Services!

Building an ASCII Art Converter in Lightning Experience and Flow

The above solution was built with the following configurations / components, all of which are accessible under the LEX Setup menu (required for External Services), and takes around 5 minutes maximum to get up and running.

  1. Named Credential for the URL of the Web Service
  2. External Service for the URL, referencing the Named Credential
  3. Visual Flow to present a UI, call the External Service and update a record
  4. Lightning Record Page customisation to embed the Flow in the UI

I created myself a Custom Object called Message, but you can easily adapt the following to any object you want; you just need a Rich Text field to store the result in. The only other thing you need to know, of course, is the web service URL.

https://createasciiart.herokuapp.com

Can I use External Services with any Web Service then?

In order for External Services to simplify what are normally things developers have to interpret and code manually, Web Service APIs must be documented in a way that External Services can understand. In this Beta release this is the Interagent schema standard (created by Heroku, as it happens). Support for the more broadly adopted Swagger / OpenAPI standard will be added in the Winter release (Safe Harbour).

For my ASCII Art service above, I authored the Interagent schema based on a sample the Salesforce PM for this feature kindly shared (more on this later). When creating the External Service in a moment we will provide the URL to this schema.

https://createasciiart.herokuapp.com/schema

Creating a Named Credential

From the setup menu search for Named Credential and click New. This is a simple Web Service that requires no authentication. Basically provide only the part of the above URL that points to the Web Service endpoint.

ESNamedCred.png

Creating the External Service

Now for the magic! Under the Setup menu (only in Lightning Experience) search for Integrations and start the wizard. It’s a pretty straightforward process of selecting the above Named Credential, then telling it the URL for the schema. If that’s not exposed by the service you want to use, you can paste a schema in directly (which lets a developer define a schema if one does not already exist).

esstep1.png

esstep2.png

Once you have created the External Service you can review the operations it has discovered. Salesforce uses the documentation embedded in the given schema to display a rather pleasing summary actually.

esstep3.png

So what just happened? Well… internally the wizard wrote some Apex code on your behalf and applied the Invocable Method annotations to enable that Apex code to appear in tools like Process Builder (not supported in Beta) and Flow. Pretty cool!

What’s more interesting, for those wondering, is that you cannot actually see this Apex code; it’s there but somehow magically managed by the platform. Though I’ve not confirmed it, I would assume it does not require code coverage.

Update: According to the PM, in Winter ’18 it will be possible to “see” the generated class from other Apex classes and thus reuse the generated code from Apex as well. Kind of like an API stub generator.
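To give a feel for what the wizard is doing under the covers, the following is a purely illustrative, hand-written approximation of such a generated class; the real one is hidden and managed by the platform, and its shape and names will differ. It simply shows how an Invocable Method wrapping a callout to the Named Credential is what surfaces the operation to Flow. The Named Credential name ASCIIArtService and the class and member names here are assumptions for illustration.

public with sharing class ASCIIArtServiceIllustration {
    public class Request {
        @InvocableVariable(required=true)
        public String message;
    }
    public class Response {
        @InvocableVariable
        public String art;
    }
    @InvocableMethod(label='Create ASCII Art')
    public static List<Response> asciiart(List<Request> requests) {
        List<Response> responses = new List<Response>();
        for (Request request : requests) {
            // Call the service via the Named Credential created earlier (name assumed)
            HttpRequest req = new HttpRequest();
            req.setEndpoint('callout:ASCIIArtService/asciiart');
            req.setMethod('POST');
            req.setHeader('Content-Type', 'application/json');
            req.setBody(JSON.serialize(new Map<String, String>{ 'message' => request.message }));
            HttpResponse res = new Http().send(req);
            // This version of the service returns the ASCII art as the raw response body
            Response response = new Response();
            response.art = res.getBody();
            responses.add(response);
        }
        return responses;
    }
}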

Creating a UI to call the External Service via Flow

This simple Flow prompts the user for a message to convert, calls the External Service and updates a Rich Text field on the record with the response. You will see that the Apex class generated by the External Service appears in the Flow sidebar.

esflow

The following screenshots show some of the key steps involved in setting up the Flow and its three steps, including making a Flow variable for the record Id. This is used later when embedding the Flow in Lightning Experience in the next step.

esflow1

RecordId used by Flow Lightning Component

esflow2

Assign the message service parameter

esflow3

Assign the response to variable

esflow4

Update the Rich Text field

TIP: When setting the ASCII Art service response into the field, I wrapped the value in the HTML elements pre and code to ensure a monospaced font is used when the Rich Text field displays the value.

Embedding the Flow UI in Lightning Experience

Navigate to your desired object’s record detail page and select Edit Page from the cog in the top right of the page to open the Lightning App Builder. Here you can drag the Flow component onto the page and configure it to call the above flow. Make sure to map the Flow variable for the record Id as shown in the screenshot, to ensure the current record is passed.

esflowlc.png

That’s it, you’re done! Enjoy your ASCII Art messages!

Creating your own API for use with External Services

Belinda, the PM for this feature, was also kind enough to share the sample code for the example shown at TrailheaDX, on which the service in this blog is based. However, I did want to build my own version to do something different from the credit example, and also to extend my personal experience with Heroku and Node.js.

The Node.js code for this solution is only 41 lines long. It runs up a web server (using the very easy-to-use hapi library) and registers a couple of handlers. One handler returns the statically defined schema.json file, the other implements the service itself. As a side note, the joi library is an easy way to add validation to the service parameters.

var Hapi = require('hapi');
var joi = require('joi');
var figlet = require('figlet');

// initialize http listener on a default port
var server = new Hapi.Server();
server.connection({ port: process.env.PORT || 3000 });

// establish route for serving up schema.json
server.route({
  method: 'GET',
  path: '/schema',
  handler: function(request, reply) {
    reply(require('./schema'));
  }
});

// establish route for the /asciiart resource, including some light validation
server.route({
  method: 'POST',
  path: '/asciiart',
  config: {
    validate: {
      payload: {
        message: joi.string().required()
      }
    }
  },
  handler: function(request, reply) {
    // Call figlet to generate the ASCII Art and return it!
    const msg = request.payload.message;
    figlet(msg, function(err, data) {
        reply(data);
    });
  }
});

// start the server
server.start(function() {
  console.log('Server started on ' + server.info.uri);
});

I decided I wanted to explore the diversity of what’s available in the Node.js space through npm. To keep things light I chose to have a bit of fun and quickly found an ASCII Art library called figlet. I soon discovered that npm had a library for pretty much every other use case I came up with!

Finally, the hand-written Interagent schema is shown below and is reasonably short and easy to understand for this example. It’s not all that well documented in layman’s terms as far as I can see. See my thoughts on this and the upcoming Swagger support below.

{
  "$schema": "http://interagent.github.io/interagent-hyper-schema",
  "title": "ASCII Art Service",
  "description": "External service example from AndyInTheCloud",
  "properties": {
    "asciiart": {
      "$ref": "#/definitions/asciiart"
    }
  },
  "definitions": {
    "asciiart": {
      "title": "ASCII Art Service",
      "description": "Returns the ASCII Art for the given message.",
      "type": [ "object" ],
      "properties": {
        "message": {
          "$ref": "#/definitions/asciiart/definitions/message"
        },
        "art": {
          "$ref": "#/definitions/asciiart/definitions/art"
        }
      },
      "definitions": {
        "message": {
          "description": "The message.",
          "example": "Hello World",
          "type": [ "string" ]
        },
        "art": {
          "description": "The ASCII Art.",
          "example": "",
          "type": [ "string" ]
        }
      },
      "links": [
        {
          "title": "AsciiArt",
          "description": "Converts the given message to ASCII Art.",
          "href": "/asciiart",
          "method": "POST",
          "schema": {
            "type": [ "object" ],
            "description": "Specifies input parameters to calculate payment term",
            "properties": {
              "message": {
                "$ref": "#/definitions/asciiart/definitions/message"
              }
            },
            "required": [ "message" ]
          },
          "targetSchema": {
            "$ref": "#/definitions/asciiart/definitions/art"
          }
        }
      ]
    }
  }
}

Finally here is the package.json file that brings the whole node app together!

{
  "name": "asciiartservice",
  "version": "1.0.0",
  "main": "server.js",
  "dependencies": {
    "figlet": "^1.2.0",
    "hapi": "~8.4.0",
    "joi": "^6.1.1"
  }
}

Other Observations and Thoughts…

  • Error Handling.
    You can handle errors from the service in the usual way by using the Fault path from the Flow element. The error shown is not all that pretty, but then in fairness there is not really much of a standard to follow here.
  • Can a Web Service called this way talk back to Salesforce?
    Flow provides various system variables, one of which is the Session Id, so you could pass this as an argument to your Web Service. Be careful though, as the running user may not have Salesforce API access, and this will be a UI session and thus short lived. You may therefore want to explore another means to obtain a renewable OAuth token for more advanced uses.
  • Web Service Callbacks.
    Currently in the Beta the Flow is blocked until the Web Service returns, so it’s good practice to make your service short and sweet. Salesforce are planning async support as part of the roadmap, however.
  • Complex Parameters.
    It’s unclear at this stage how complex a web service’s parameters can be, given Flow’s limitations around Invocable Methods, which this feature depends on.
  • The future is bright with Swagger support!
    I am really glad Salesforce are adding support for Swagger/OpenAPI, as I really struggled to find good examples and tutorials around Interagent. Really what is needed here is for the schema and code to be tied more closely together, like this! UPDATE: See my other blog entry covering Swagger support.

Summary

Both External Objects and External Services reflect the reality of the continued need for integration tools, and of making this process simpler and thus cheaper. Separate services and data repositories are, for now, here to stay. I’m really pleased to see Salesforce doing what it does best, making complex things easier for the masses. Or as Einstein would say… “Everything should be made as simple as possible, but no simpler.”

Finally, you can read more about External Objects here and here through Agustina’s and latterly Alba’s excellent blogs.



Highlights from TrailheaDX 2017

This was my first TrailheaDX and what an event it was! With my Field Guide in hand I set out into the wilderness! In this blog I’ll share some of my highlights, thoughts and links to the latest resources. Many of the newly announced things you can actually get your hands on now, which is amazing!

Overall the event felt well organized, if a little frantic at times. With smaller sessions of 30 minutes each (often 20 minutes after intros and questions), each was bite-sized but quite well tuned, with demos and code samples being shown.

SalesforceDX. Salesforce announced the public beta of this new technology aimed at improving the developer experience on the platform. SalesforceDX consists of several modules that will be extended further over time. Salesforce has done a great job at providing a wealth of Trailhead resources to get you started.

Einstein. Since its announcement, I and other developers have been waiting to get access to more custom tools and APIs; well, now that wait is well and truly over. As per my previous blogs we’ve had the Einstein Vision API for a little while now. Announced at the event were several other new Einstein related tools and APIs.

  • Einstein Discovery. Salesforce demonstrated a very slick-looking tool that allows you to point and click your way through to analyzing various data sets, including those obtained from your custom objects! They have provided a Trailhead module on it here and I plan on digging in! Pricing and further info are here.
  • Einstein Sentiment API. Allows you to interpret text strings for terms that indicate whether it is a positive, neutral or negative statement / sentiment. This can be used to scan case comments, forum posts, feedback, Twitter posts etc. in an automated way and respond or be alerted accordingly to what is being said (see the sketch just after this list).
  • Einstein Intent API. Allows you to interpret text strings for meanings, such as instructions or requests. This enables routing case comments or even implementing bots that can help automate or propose actions to be taken without human interpretation.
  • Einstein Object Detection API. An extension of the Einstein Vision API that allows individual items in a picture to be identified. For example, a pile of items on a coffee table, such as a mug, magazine, laptop or pot plant! Each can then be recognized and classified to build up more intel on what’s in the picture for further processing and analysis.
  • PredictionIO on Heroku. Finally, if you want to go below the declarative tools or intentionally simplified Einstein APIs, you can build your own machine learning models using Heroku and the PredictionIO buildpack!
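To make the Sentiment API a little more concrete (as referenced in the list above), here is a rough Apex sketch of the kind of callout involved. Treat the details as assumptions based on my reading of the Einstein Platform Services docs at the time: the endpoint, the CommunitySentiment pre-built model Id and the JSON request shape may differ, and obtaining the access token (a JWT-based OAuth flow against the Einstein Platform) is out of scope here.

// Illustrative sketch: classify the sentiment of a piece of text.
// Assumes an Einstein Platform access token has already been obtained separately.
String accessToken = '...'; // obtained via the Einstein Platform token endpoint
HttpRequest req = new HttpRequest();
req.setEndpoint('https://api.einstein.ai/v2/language/sentiment'); // assumed endpoint
req.setMethod('POST');
req.setHeader('Authorization', 'Bearer ' + accessToken);
req.setHeader('Content-Type', 'application/json');
req.setBody(JSON.serialize(new Map<String, String>{
    'modelId' => 'CommunitySentiment', // assumed pre-built sentiment model
    'document' => 'The session was fantastic, I learned loads!'
}));
HttpResponse res = new Http().send(req);
// The response lists probabilities per label (positive / negative / neutral)
System.debug(res.getBody());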

Platform Events. These allow messages to be published and subscribed to using a new object known as an Event object, suffixed with __e. Once created, you can use new Apex APIs or REST APIs to send messages, and use either Apex Triggers or the Streaming API to receive them. There is also a new Process Builder action or Flow element to send messages. You can keep your messages within Force.com or use them to integrate with other cloud platforms or the browser. The possibilities are quite endless here: async processing, inter-app comms, logging, UI notifications… I’m sure I and other bloggers will be exploring them in the coming months!
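As a minimal sketch of both sides of this, assume a custom Platform Event you have defined called Notification__e with a single Message__c text field (both names are hypothetical). Publishing from Apex and subscribing via an Apex Trigger then looks roughly like this.

// Publishing (e.g. from anonymous Apex or any Apex logic):
// EventBus.publish queues the event message for asynchronous delivery to subscribers.
Notification__e evt = new Notification__e(Message__c = 'Order shipped!');
Database.SaveResult result = EventBus.publish(evt);
System.debug('Published? ' + result.isSuccess());

// Subscribing (in a separate trigger file): an 'after insert' trigger on the
// event object receives each published message.
trigger NotificationSubscriber on Notification__e (after insert) {
    for (Notification__e received : Trigger.new) {
        System.debug('Received: ' + received.Message__c);
    }
}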

External Services. If you find a cool API you want to consume in Force.com you currently have to write some code. No longer! If you have a schema that describes that API, you can use the External Services wizard to generate some Apex code that will call out to the API. What’s special about this is that the Apex code is accessible via Process Builder and Flow, making clicks-not-code API integration possible. It’s an excellent way to integrate with complementary services you or others might develop on platforms such as Heroku. I have just submitted a session to Dreamforce 2017 to explore this further; fingers crossed it gets selected! You can read more about it here in the meantime.

Sadly I did have to make a key decision to focus on the above topics and not so much on Lightning. I heard from other attendees that these sessions were excellent, and I did catch a brief sight of dynamic component rendering in Lightning App Builder. Very cool!

Thanks Salesforce for filling my blog backlog for the next year or so! 😉

 

 



Apex UML Canvas Tool : Dreamforce Release!

Screen Shot 2013-10-14 at 22.16.58

Update: Dreamforce is over for another year! Thanks to everyone who supported me and came along to the session. Salesforce have now uploaded a recording of the session here and you can find the slides here.

As those following my blog will know, one of the sessions I’ll be running at this year’s Dreamforce event is around the Tooling API and Canvas technologies. If you’ve not read about what I’m doing, check out my previous blog here. I’ve now uploaded the code for the tool I’ve developed that will be showcasing these technologies. I’ll be walking through key parts of it in the session; please do feel free to take a look and give me your thoughts ahead of, or at, the session if you’re attending!

Installing the Tool

You can now also install the tool as a managed package into your development org by following these steps.

  1. Install the package using the package install link from the GitHub repo README file.
  2. Installed is a tab which shows a Visualforce page hosting the externally hosted (on Heroku) Canvas application.
  3. You need to configure access to the Canvas application post installation; you can follow the Salesforce guidelines on screen and/or the ones here.
  4. Click the link to configure and edit the “Admin approved users are pre-authorised” option and save.
  5. Finally edit your Profile and enable the Connected App (and Tab if needed).

Using the Tool

Screen Shot 2013-11-12 at 17.41.16

  • The tool displays a list of the Apex classes (unmanaged) in your org on the left-hand side; tick a class to show it on the canvas.
  • Move the Apex class UML representation around with your mouse; if it and other classes reference each other, lines will be drawn automatically.
  • There are some issues with dragging; if you get mixed up, just click the canvas to deselect everything, then click what you want.

It is quite basic still, only showing methods, properties and ‘usage’ relationships, and it really needs some further community push behind it to progress the long line of cool features that could be added. Take a look at the comments and discussion on my last post for some more ideas on this. I look forward to seeing you all at Dreamforce 2013!



Preview: Demo of Apex Code Analysis using the Tooling API and Canvas

This weekend I’ve been fleshing out the code for my second Dreamforce 2013 session. I’ve been having a lot of fun with various technologies to create the following demo, of which I’ve shared a short work-in-progress video below. The jQuery plugin doing the rendering is js-mindmap; it’s got some quirks I’ve discovered so far, but I’m sticking with it for now!

The session highlights the Tooling API via this tool, which can be installed directly into your Salesforce environment via the wonderful Salesforce Canvas technology! This is the proposed session abstract…

Dreamforce 2013 Session: Apex Code Analysis using the Tooling API and Canvas

The Tooling API provides powerful new ways to manage your code and get further insight into how it’s structured. This session will teach you how to use the Tooling API and its Symbol Table to analyse your code using an application integrated via Force.com Canvas directly into your Salesforce org — no command line or desktop install required! Join us and take your knowledge of how your Apex code is really structured to the next level!

Technologies involved so far…

I’ve also found the excellent ObjectAid Eclipse plugin (which is sadly a Java-source-code-only tool) useful for exploring the Tooling API data structures in much more detail than the Salesforce documentation currently offers, especially in respect of the SymbolTable structure. I’ll be sharing full code and discussing the following diagram in more detail in my session! In the meantime I’d love to know your thoughts and other ideas around the Tooling API!

Tooling API
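To get a feel for the data the tool works from, here is a rough sketch (run as anonymous Apex) of querying a class’s SymbolTable via the Tooling API REST endpoint. The class name, API version and the need for a Remote Site Setting covering your own instance URL are assumptions for illustration, and SymbolTable may be null for classes that need recompiling.

// Illustrative: fetch the SymbolTable for a class named 'MyService' (hypothetical)
// via the Tooling API query resource, using the current session for authentication.
String soql = 'SELECT Name, SymbolTable FROM ApexClass WHERE Name = \'MyService\'';
HttpRequest req = new HttpRequest();
req.setEndpoint(URL.getSalesforceBaseUrl().toExternalForm()
    + '/services/data/v29.0/tooling/query/?q=' + EncodingUtil.urlEncode(soql, 'UTF-8'));
req.setMethod('GET');
req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());
HttpResponse res = new Http().send(req);
// The SymbolTable JSON describes constructors, methods, properties and external
// references - the raw material the UML canvas renders.
System.debug(res.getBody());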