Andy in the Cloud

From BBC Basic to Force.com and beyond…



Simplified API Integrations with External Services

Salesforce are on a mission to make accessing off-platform data and web services as easy as possible. This helps keep the user experience optimal and consistent, and also allows admins to continue to leverage the platform's tools, such as Process Builder and Flow, even if the data or logic is not on the platform.

Starting with External Objects, they added the ability to see and also update data stored in external databases. Once set up, users can manipulate external records without leaving Salesforce, staying within the familiar UIs. With External Services, currently in Beta, they have extended this concept to external API services.

In this blog let's first focus on the clicks-not-code steps you can repeat in your own org to consume a live ASCII Art web service API I have exposed publicly. The API is simple: it takes a message and returns it in ASCII art format. The following steps result in a working UI to call the API and update a record.

[Screenshot: ExternalServicesDemo.png]

After the clicks-not-code bit I will share how the API was built, what's required for compatibility with this feature, and how insanely easy it is to develop Web Services on Heroku using Node.js. So let's dive into External Services!

Building an ASCII Art Converter in Lightning Experience and Flow

The above solution was built with the following configurations/components, all of which are accessible under the LEX Setup menu (required for External Services), and takes around 5 minutes maximum to get up and running.

  1. Named Credential for the URL of the Web Service
  2. External Service for the URL, referencing the Named Credential
  3. Visual Flow to present a UI, call the External Service and update a record
  4. Lightning Record Page customisation to embed the Flow in the UI

I created myself a Custom Object called Message, but you can easily adapt the following to any object you want; you just need a Rich Text field to store the result in. The only other thing you need to know, of course, is the web service URL.

https://createasciiart.herokuapp.com

Can I use External Services with any Web Service then?

In order for External Services to simplify what developers normally have to interpret and code manually, Web Service APIs must be documented in a way that the feature can understand. In this Beta release this is the Interagent schema standard (created by Heroku, as it happens). Support for the more broadly adopted Swagger / OpenAPI will be added in the Winter release (Safe Harbour).

For my ASCII Art service above, I authored the Interagent schema based on a sample the Salesforce PM for this feature kindly shared (more on this later). When creating the External Service in a moment, we will provide the URL to this schema.

https://createasciiart.herokuapp.com/schema

Creating a Named Credential

From the Setup menu, search for Named Credential and click New. This is a simple Web Service that requires no authentication. Basically, provide only the part of the above URL that points to the Web Service endpoint.

[Screenshot: ESNamedCred.png (Named Credential setup)]
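
For developers curious what the Named Credential gives you under the covers, here is a minimal Apex sketch of calling the service directly through it. This assumes a credential named ASCIIArtService (an illustrative name); the External Services feature generates this kind of plumbing for you, so the sketch is purely for understanding.

// Minimal sketch: invoking the service via the Named Credential from Apex.
// 'ASCIIArtService' is an assumed Named Credential name for illustration.
HttpRequest req = new HttpRequest();
req.setEndpoint('callout:ASCIIArtService/asciiart');
req.setMethod('POST');
req.setHeader('Content-Type', 'application/json');
req.setBody(JSON.serialize(new Map<String, Object> { 'message' => 'Hello World' }));
HttpResponse res = new Http().send(req);
System.debug(res.getBody());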

Creating the External Service

Now for the magic! Under the Setup menu (only in Lightning Experience) search for Integrations and start the wizard. It's a pretty straightforward process: select the above Named Credential, then tell it the URL for the schema. If that's not exposed by the service you want to use, you can paste a schema in directly (which lets you define a schema yourself if one does not already exist).

[Screenshots: esstep1.png, esstep2.png (External Service creation wizard)]

Once you have created the External Service you can review the operations it has discovered. Salesforce uses the documentation embedded in the given schema to display a rather pleasing summary, actually.

[Screenshot: esstep3.png (discovered operations summary)]

So what just happened? Well… internally the wizard wrote some Apex code on your behalf and applied the Invocable Method annotations to enable that Apex code to appear in tools like Process Builder (not supported in Beta) and Flow. Pretty cool!

What's more interesting, for those wondering, is that you cannot actually see this Apex code; it's there, but somehow magically managed by the platform. Though I've not confirmed it, I would assume it does not require code coverage.

Update: According to the PM, in Winter'18 it will be possible to "see" the generated class from other Apex classes and thus reuse the generated code from Apex as well. Kind of like an API stub generator.
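
To give a feel for what the platform is doing, here is a hypothetical sketch of the kind of class the wizard generates; the names and shape are assumptions for illustration, based on the Invocable Method annotations mentioned above.

// Hypothetical sketch of the kind of Apex the External Services wizard generates.
public class ASCIIArtService {
  public class AsciiArtRequest {
    @InvocableVariable(required=true)
    public String message;
  }
  public class AsciiArtResponse {
    @InvocableVariable
    public String art;
  }
  @InvocableMethod(label='AsciiArt' description='Converts the given message to ASCII Art.')
  public static List<AsciiArtResponse> asciiArt(List<AsciiArtRequest> requests) {
    List<AsciiArtResponse> responses = new List<AsciiArtResponse>();
    for (AsciiArtRequest request : requests) {
      AsciiArtResponse response = new AsciiArtResponse();
      // The real implementation would perform the callout via the Named Credential here
      responses.add(response);
    }
    return responses;
  }
}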

Creating a UI to call the External Service via Flow

This simple Flow prompts the user for a message to convert, calls the External Service and updates a Rich Text field on the record with the response. You will see in the Flow sidebar that the Apex class generated by the External Service appears.

[Screenshot: esflow (the Flow overview)]

The following screenshots show some of the key steps involved in setting up the Flow and its three steps, including creating a Flow variable for the record Id. This is used later when embedding the Flow in Lightning Experience in the next step.

[Screenshot: esflow1 (RecordId used by Flow Lightning Component)]

[Screenshot: esflow2 (Assign the message service parameter)]

[Screenshot: esflow3 (Assign the response to variable)]

[Screenshot: esflow4 (Update the Rich Text field)]

TIP: When setting the ASCII Art service response into the field, I wrapped the value in the HTML pre and code elements (i.e. <pre><code>…</code></pre>) to ensure a monospaced font is used when the Rich Text field displays the value.

Embedding the Flow UI in Lightning Experience

Navigate to your desired object's record detail page and select Edit Page from the cog in the top right of the page to open the Lightning App Builder. Here you can drag the Flow component onto the page and configure it to call the above Flow. Make sure to map the Flow variable for the record Id as shown in the screenshot, to ensure the current record is passed.

[Screenshot: esflowlc.png (Flow component configuration in Lightning App Builder)]

That's it, you're done! Enjoy your ASCII Art messages!

Creating your own API for use with External Services

Belinda, the PM for this feature, was also kind enough to share the sample code for the example shown at TrailheaDX, on which the service in this blog is based. However, I wanted to build my own version to do something different from the credit example, and also to extend my personal experience with Heroku and Node.js.

The Node.js code for this solution is only 41 lines long. It runs up a web server (using the very easy-to-use hapi library) and registers a couple of handlers. One handler returns the statically defined schema.json file; the other implements the service itself. As a side note, the joi library is an easy way to add validation to the service parameters.

var Hapi = require('hapi');
var joi = require('joi');
var figlet = require('figlet');

// initialize http listener on a default port
var server = new Hapi.Server();
server.connection({ port: process.env.PORT || 3000 });

// establish route for serving up schema.json
server.route({
  method: 'GET',
  path: '/schema',
  handler: function(request, reply) {
    reply(require('./schema'));
  }
});

// establish route for the /asciiart resource, including some light validation
server.route({
  method: 'POST',
  path: '/asciiart',
  config: {
    validate: {
      payload: {
        message: joi.string().required()
      }
    }
  },
  handler: function(request, reply) {
    // Call figlet to generate the ASCII Art and return it!
    const msg = request.payload.message;
    figlet(msg, function(err, data) {
        reply(data);
    });
  }
});

// start the server
server.start(function() {
  console.log('Server started on ' + server.info.uri);
});

I decided I wanted to explore the diversity of what's available in the Node.js space through npm. To keep things light I chose to have a bit of fun and quickly found an ASCII Art library called figlet. I soon discovered that npm has a library for pretty much every other use case I came up with!

Finally, the hand-written Interagent schema is shown below; it is reasonably short and easy to understand for this example. The standard is not all that well documented in layman's terms as far as I can see. See my thoughts on this and upcoming Swagger support below.

{
  "$schema": "http://interagent.github.io/interagent-hyper-schema",
  "title": "ASCII Art Service",
  "description": "External service example from AndyInTheCloud",
  "properties": {
    "asciiart": {
      "$ref": "#/definitions/asciiart"
    }
  },
  "definitions": {
    "asciiart": {
      "title": "ASCII Art Service",
      "description": "Returns the ASCII Art for the given message.",
      "type": [ "object" ],
      "properties": {
        "message": {
          "$ref": "#/definitions/asciiart/definitions/message"
        },
        "art": {
          "$ref": "#/definitions/asciiart/definitions/art"
        }
      },
      "definitions": {
        "message": {
          "description": "The message.",
          "example": "Hello World",
          "type": [ "string" ]
        },
        "art": {
          "description": "The ASCII Art.",
          "example": "",
          "type": [ "string" ]
        }
      },
      "links": [
        {
          "title": "AsciiArt",
          "description": "Converts the given message to ASCII Art.",
          "href": "/asciiart",
          "method": "POST",
          "schema": {
            "type": [ "object" ],
            "description": "Specifies input parameters to calculate payment term",
            "properties": {
              "message": {
                "$ref": "#/definitions/asciiart/definitions/message"
              }
            },
            "required": [ "message" ]
          },
          "targetSchema": {
            "$ref": "#/definitions/asciiart/definitions/art"
          }
        }
      ]
    }
  }
}

Finally, here is the package.json file that brings the whole Node app together!

{
  "name": "asciiartservice",
  "version": "1.0.0",
  "main": "server.js",
  "dependencies": {
    "figlet": "^1.2.0",
    "hapi": "~8.4.0",
    "joi": "^6.1.1"
  }
}

Other Observations and Thoughts…

  • Error Handling.
    You can handle errors from the service in the usual way by using the Fault path from the element. The error shown is not all that pretty, but in fairness there is not really much of a standard to follow here.
    [Screenshots: eserrorflow.png, eserror.png]
  • Can a Web Service called this way talk back to Salesforce?
    Flow provides various system variables, one of which is the Session Id. Thus you could pass this as an argument to your Web Service. Be careful though as the running user may not have Salesforce API access and this will be a UI session and thus will be short lived. Thus you may want to explore another means to obtain an renewable oAuth token for more advanced uses.
  • Web Service Callbacks.
    Currently in the Beta the Flow is blocked until the Web Service returns, so its good practice to make your service short and sweet. Salesforce are planning async support as part of the roadmap however.
  • Complex Parameters.
    Its unclear at this stage how complex a web service can be supported given Flows limitations around Invocable Methods which this feature depends on.
  • The future is bright with Swagger support!
    I am really glad Salesforce are adding support for Swagger/OpenID, as i really struggled to find good examples and tutorials around Interagent. Really what is needed here is for the schema and code to be tied more closely together, like this!

Summary

Both External Objects and External Services reflect the reality of the continued need for integration tools, and of making this process simpler and thus cheaper. Separate services and data repositories are, for now, here to stay. I'm really pleased to see Salesforce doing what it does best, making complex things easier for the masses. Or as Einstein would say… "Everything should be made as simple as possible, but no simpler."

Finally, you can read more about External Objects here and here through Agustina's and latterly Alba's excellent blogs.



Highlights from TrailheaDX 2017

This was my first TrailheaDX and what an event it was! With my Field Guide in hand I set out into the wilderness! In this blog I'll share some of my highlights, thoughts and links to the latest resources. Many of the newly announced things you can actually get your hands on now, which is amazing!

Overall the event felt well organized, if a little frantic at times. The sessions were smaller, at 30 minutes each (often 20 minutes after intros and questions), so each was bite-sized but quite well tuned, with demos and code samples being shown.

SalesforceDX. Salesforce announced the public beta of this new technology aimed at improving the developer experience on the platform. SalesforceDX consists of several modules that will be extended further over time. Salesforce has done a great job of providing a wealth of Trailhead resources to get you started.

Einstein. Since its announcement, I and other developers have been waiting to get access to more custom tools and APIs; well, now that wait is well and truly over. As per my previous blogs, we've had the Einstein Vision API for a little while now. Announced at the event were no fewer than three other new Einstein-related tools and APIs.

  • Einstein Discovery. Salesforce demonstrated a very slick-looking tool that allows you to point and click your way through analyzing various data sets, including those obtained from your custom objects! They have provided a Trailhead module on it here and I plan on digging in! Pricing and further info can be found here.
  • Einstein Sentiment API. Allows you to interpret text strings for terms that indicate whether it is a positive, neutral or negative statement / sentiment. This can be used to scan case comments, forum posts, feedback, Twitter posts etc in an automated way and respond or be alerted accordingly to what is being said.
  • Einstein Intent API. Allows you to interpret text strings for meanings, such as instructions or requests, routing case comments or even implementing bots that can help automate or propose actions to be taken without human interpretation.
  • Einstein Object Detection API. An extension of the Einstein Vision API that allows individual items in a picture to be identified. For example, in a pile of items on a coffee table, such as a mug, magazine, laptop or pot plant, each can be recognized and classified to build up more intel on what's in the picture for further processing and analysis.
  • PredictionIO on Heroku. Finally, if you want to go below the declarative tools or intentionally simplified Einstein APIs, you can build your own machine learning models using Heroku and the PredictionIO buildpack!

Platform Events. These allow messages to be published and subscribed to using a new type of object known as an Event object, suffixed with __e. Once created, you can use new Apex APIs or REST APIs to send messages, and use either Apex Triggers or the Streaming API to receive them. There is also a new Process Builder action and Flow element to send messages. You can keep your messages within Force.com or use them to integrate between other cloud platforms or the browser. The possibilities are quite endless here: async processing, inter-app comms, logging, UI notifications… I'm sure I and other bloggers will be exploring them in the coming months!
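
As a taster, here is a minimal Apex sketch of publishing such an event; Order_Shipped__e and its field are hypothetical names for illustration.

// Minimal sketch: publishing a Platform Event from Apex.
// Order_Shipped__e is an assumed Event object with a custom Order_Number__c field.
Order_Shipped__e evt = new Order_Shipped__e(Order_Number__c = '12345');
Database.SaveResult result = EventBus.publish(evt);
System.debug('Published? ' + result.isSuccess());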

External Services. If you find a cool API you want to consume in Force.com, you currently have to write some code. No longer! If you have a schema that describes that API, you can use the External Services wizard to generate some Apex code that will call out to the API. What's special about this is that the Apex code is accessible via Process Builder and Flow, making clicks-not-code API integration possible. It's an excellent way to integrate with complementary services you or others might develop on platforms such as Heroku. I have just submitted a session to Dreamforce 2017 to explore this further; fingers crossed it gets selected! You can read more about it here in the meantime.

Sadly I did have to make a key decision to focus on the above topics and not so much on Lightning. I heard from other attendees that these sessions were excellent, and I did catch a brief sight of dynamic component rendering in Lightning App Builder, very cool!

Thanks Salesforce for filling my blog backlog for the next year or so! 😉

 

 



Branch Support for GitHub Salesforce Deploy Tool

If you're not familiar with it, the GitHub Salesforce Deploy Tool allows you to place a handy button on your GitHub README files to allow for super easy browser-based deployment from your repo to a Salesforce org. Up until now it only worked on the master branch; it now supports branches!

[Screenshot: GitHubSFDeployBranches.png]

The new support for branches does in fact cover tags and commits as well, though only branches are defaulted from the button. If you already have the button on your README file, you will find it now magically starts detecting the branch!

[Screenshot: ButtonOnABranch.png (Deploy to Salesforce buttons on branches now work!)]

If you're new to the tool, go to the main page, grab the button code and paste it into your README file. Or, if you want a button you can use explicitly in a blog or article, you can fill in the fields and generate it from the same page. As you can see, the enhancement adds a new ref parameter to the generated URL. This can be a branch name, tag or commit. You can read more about the tool here.

[Screenshot: ButtonWithScript.png (button code generation with the ref parameter)]

Enjoy!

 



Introducing the Flow Factory

Flow is a great technology for providing a means for non-coders to build functionality, more so than any other point-and-click facility on the platform, even Process Builder. Why? Because it offers a rich set of Elements (operations) that support conditional branching, loops and storage of variables, along with the ability to read or update any (API accessible) object you like. It's almost like a programming language…

Ironically, like Apex, it is missing one increasingly asked-for feature… being able to call another Flow that is not known at the time you're writing your calling code, such as one configured via the amazing Custom Metadata. Basically, a kind of Apex reflection for Flow. Often the workaround for this type of problem is to use a factory pattern.

As I highlighted in this prior blog on calling Flow from Apex, the platform does not yet provide this capability (to change this, please up-vote this idea). Well, at least not in Apex, as there exists a REST API for this. Meanwhile, back in the land of Apex, it occurred to me when building the LittleBits Connector last year that I could generate a factory class to work around this.

I have created the Flow Toolbelt library (GitHub repo here) and package (if you want to install it that way), which takes last year's solution and lifts it into its own smaller package. The Flow Factory tab discovers the Flows configured in your org and generates the required factory Apex class. If you add or remove Flows you need to repeat the process.

[Screenshot: FlowFactory.png (the Flow Factory tab)]

Once this has been deployed you can use code like the following, passing in the name of your Flow. Note this is a WIP version of the library and needs more error handling, so be sure to pass in a valid Flow name and also at least an empty params Map.

Flow.Interview flow =
  flowtb.FlowFactory.newInstance('TestA', new Map<String, Object>());
flow.start();
System.debug(flow.getVariableValue('Var'));
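
For those curious, the generated factory boils down to something like the following sketch; the Flow names are illustrative, and the real generated class covers every Flow discovered in your org.

// Hypothetical sketch of the generated factory class.
public class FlowFactory {
  public class FlowFactoryException extends Exception {}
  public static Flow.Interview newInstance(String flowName, Map<String, Object> params) {
    if (flowName == 'TestA') return new Flow.Interview.TestA(params);
    if (flowName == 'TestB') return new Flow.Interview.TestB(params);
    throw new FlowFactoryException('Flow ' + flowName + ' not found.');
  }
}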

I think this concept can be extended to allow Flow to run from other Apex entry points, such as the recently added Sandbox Apex callback, allowing you to run Flow when your Sandbox spins up (see the sketch below). Let me know your thoughts on whether this would be useful.
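
Here is a minimal sketch of wiring the factory into that callback; the Flow name is an assumption for illustration.

// Sketch: running a Flow from the Sandbox post-copy Apex callback.
public class RunFlowOnSandboxCopy implements SandboxPostCopy {
  public void runApexClass(SandboxContext context) {
    Flow.Interview flow = flowtb.FlowFactory.newInstance(
      'SandboxSetupDefaults', new Map<String, Object>());
    flow.start();
  }
}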

 

 



Working with Apex Mocks Matchers and Unit Of Work

The Apex Mocks framework gained a new feature recently, namely Matchers. This new feature means we can start verifying which records, and which of their field values, are being passed to a mocked Unit Of Work more reliably and with a greater level of detail.

Since the Unit Of Work deals primarily with SObject types, this does present some challenges to the default behaviour of Apex Mocks. Stephen Willcock's excellent blog points out the reasons behind this with some great examples. In addition, prior to the Matchers functionality, you could not verify a specific field value of a record passed to registerDirty, for example.

So first consider the following test code, which does not use matchers.

	@IsTest
	private static void callingApplyDiscountShouldCalcDiscountAndRegisterDirty()
	{
		// Create mocks
		fflib_ApexMocks mocks = new fflib_ApexMocks();
		fflib_ISObjectUnitOfWork uowMock = new fflib_SObjectMocks.SObjectUnitOfWork(mocks);

		// Given
		Opportunity opp = new Opportunity(
			Id = fflib_IDGenerator.generate(Opportunity.SObjectType),
			Name = 'Test Opportunity',
			StageName = 'Open',
			Amount = 1000,
			CloseDate = System.today());
		Application.UnitOfWork.setMock(uowMock);

		// When
		IOpportunities opps =
			Opportunities.newInstance(new List<Opportunity> { opp });
		opps.applyDiscount(10, uowMock);

		// Then
		((fflib_ISObjectUnitOfWork)
			mocks.verify(uowMock, 1)).registerDirty(
				new Opportunity(
					Id = opp.Id,
					Name = 'Test Opportunity',
					StageName = 'Open',
					Amount = 900,
					CloseDate = System.today()));
	}

On the face of it, it looks like it should correctly verify that an updated Opportunity record, with 10% removed from the Amount, was passed to the Unit Of Work. But this fails with an assert claiming the method was not called. The main reason for this is that the expected record is a new instance, which is not what the mock recorded. Changing it to verify with the test record instance works, but this only verifies the test record was passed; the Amount could be anything.

		// Then
		((fflib_ISObjectUnitOfWork)
			mocks.verify(uowMock, 1)).registerDirty(opp);

The solution is to use the new Matchers functionality for SObjects. This time we can verify that a record was passed to the registerDirty method, that it was the one we expected by its Id, and critically that the correct Amount was set.

		// Then
		((fflib_ISObjectUnitOfWork)
			mocks.verify(uowMock, 1)).registerDirty(
				fflib_Match.sObjectWith(
					new Map<SObjectField, Object>{
						Opportunity.Id => opp.Id,
						Opportunity.Amount => 900} ));

There are also the methods fflib_Match.sObjectWithName and fflib_Match.sObjectWithId as kinds of shorthand if you just want to check these specific fields. The Matcher framework is hugely powerful, with many more useful matchers, so I encourage you to take a deeper look at David Frudd's excellent blog post here to learn more.
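
For illustration, here is a minimal sketch of the Id shorthand against the same mock as above; it verifies only that a record with the expected Id was registered as dirty.

		// Then (sketch: checks the record Id only, not any field values)
		((fflib_ISObjectUnitOfWork)
			mocks.verify(uowMock, 1)).registerDirty(
				fflib_Match.sObjectWithId(opp.Id));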

If you want to know more about how Apex Mocks integrates with the Apex Enterprise Patterns, as shown in the example above, refer to this two-part series here.



Declarative Rollup Tool Summer Release!

Over the course of the last couple of weeks, I have been focusing my community time on release v2.4 of the DLRS tool, specifically on some much-requested features driven by the community in the Chatter group.

So let's get stuck in…

Rollup Scheduler Improvements

The ability to run a full (or partial, with criteria) recalculation of a rollup on a daily schedule has been in the tool for a few releases now. However, up until now the only option was to run it at 2am every day. It is now possible to change this with the new UI shown below. It's a bit raw and basic, but for now it should at least give some more flexibility.

[Screenshot: RollupSchedule.png (the new schedule UI)]

Support for Merging Accounts, Contacts and Leads

The platform has some special handling for merging Accounts, Contacts and Leads, especially when it comes to when Apex Triggers are invoked: the platform does not fire Apex Triggers for child records reparented as a result of a merge. So if your parent object is one of these objects, prior versions of the tool had no awareness of the operation, and rollups using the Realtime or Scheduled calculation modes would not recalculate.

With this release there are two things you can do to fix this. First, when you click the Manage Child Trigger button, you get a new checkbox option to control deployment of an additional Apex Trigger on the parent object. If you're upgrading, you will need to click Remove and then Deploy again to see this.

[Screenshot: ParentTrigger.png]

IMPORTANT NOTE: If you don't feel merge operations are an issue for your use cases, you can deselect this option and cut down on the number of triggers deployed. Also, if it is only the rollup child object that supports merging, there is no need to deploy any additional triggers and the tool does not show the above checkbox option.

Secondly, you need to set up the RollupJob as an Apex Scheduled job (under Setup > Apex Classes), even if you don't have any Scheduled Mode rollups. Due to a platform restriction, the tool cannot recalculate rollups realtime during a merge operation, so it can only record that they need to be recalculated. It does this via the tool's scheduled mode infrastructure, by automatically adding records to the Lookup Rollup Summary Schedule Items object. Note that you don't need to change your rollups from Realtime to Scheduled mode for this to work, only schedule the job.
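
If you prefer to schedule the job via Execute Anonymous rather than the Setup UI, a one-liner like this sketch works, assuming the tool is installed as the managed package with the dlrs namespace (adjust if deployed unmanaged).

// Sketch: schedule the RollupJob to run daily at 2am.
System.schedule('DLRS RollupJob', '0 0 2 * * ?', new dlrs.RollupJob());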

Support for Archived / Deleted Records via the All Rows Setting

Salesforce archives Tasks and Events after a while. If you have rollups over these child objects you can enable the Aggregate All Rows checkbox. This will ensure your rollups remain accurate even if some records have been archived. Note this will also apply to records in the recycle bin. For upgrades (if you're not using the Manage Lookup Rollup Summaries tab), you will need to add this field to your layout to see it.

[Screenshot: AllRows.png (the Aggregate All Rows option)]
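
For context, this setting mirrors the ALL ROWS clause in SOQL, which is what makes archived (and recycle bin) records visible to queries; a minimal sketch of that underlying behaviour follows.

// Sketch: ALL ROWS makes archived Tasks/Events and recycle bin records visible to SOQL.
Id accountId = '001B000000D2V0n'; // an example record Id
List<Task> allTasks =
    [SELECT Id, Subject FROM Task WHERE WhatId = :accountId ALL ROWS];
System.debug('Tasks including archived: ' + allTasks.size());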

Row Limit for Concatenate and Last Rollup Operations

If you're using the Last or Concatenate operations, you can define a limit on how many child records are actually considered when calculating the rollup. This is useful if you're using Concatenate into a fixed-length field, for example. When upgrading, you need to add the new Row Limit field to your layout if you're not using the swanky new Manage Lookup Rollup Summaries tab.

[Screenshot: RowLimit.png (the new Row Limit field)]

Improved Housekeeping for Scheduled Mode

If you are using rollups with their Calculation Mode set to Scheduled, the tool records parent rollup records to be later recalculated by the RollupJob Apex Scheduled job. In past releases, if the parent record was deleted (through a merge or another operation) before the next scheduled run, records would sit in limbo in the Lookup Rollup Summary Schedule Items object, being processed and erroring over and over. These will now be cleared out, and there are no upgrade actions you need to take for this.

Summary

Thanks for everyone's support for this tool; I hope these changes help you go further with clicks not code! Though as a reminder, please keep in mind the best practices and restrictions listed in the README. If you have any questions you can either post comments on this blog or use the Chatter Group. The Chatter Group is a great place to get your query seen by a broader group of people who are also diligently supporting the tool!

You can find releases of the tool here.



Introduction to the Platform Action API


Actions are Salesforce's general term for tasks users can perform, either through buttons in various UIs on desktop, mobile, tablet etc, or via non-UI processes such as those built via Process Builder or Automation Flows.

Actions are about “getting things done” in Salesforce. They encapsulate a piece of logic that allows a user to perform some work, such as sending email. When an action runs, it saves changes in your organization by updating the database. More here.

Over the years we've had many terms and ways to define these. Custom Button and Custom Link are perhaps the most obvious ones, which I've covered here in the past. There are Quick Actions (previously Publisher Actions) and, more recently, Action Links, which I covered in a past blog. Then of course there are the Standard Buttons provided by the platform: Edit, Delete, Follow, Submit for Approval etc. Such actions appear in various places: Record layouts, List Views, Related Lists, Chatter and, more recently, Flexi Pages (aka Lightning Pages).

You might wonder, then, if you had the task as a developer to build your own UI or tool that wanted to expose some or all of the above actions, how you would find them all; it would be quite a challenge. Indeed, in some cases you may have had to resort to URL hacking to invoke some of them. Well, worry no longer, Salesforce's clever architects now have you covered! Enter a new virtual SObject known as PlatformAction! Before we get onto what exactly virtual means, let's review some Actions and some SOQL queries…

Consider this Account Record detail page in the Classic (or Aloha) UI

[Screenshot: PlatformActionAccount.png (Account record detail page in Classic)]

Note down your record ID and use it in a query like the one below…

SELECT DeviceFormat, Label, Type, Section,
       ActionTarget, ActionTargetType, ActionListContext
  FROM PlatformAction
  WHERE ActionListContext = 'Record' AND
        SourceEntity = '001B000000D2V0n' AND
        Section = 'Page' AND
        DeviceFormat = 'Aloha'

In Developer Console you should see something like this…

[Screenshot: PlatformActionQueryResults1.png (query results in Developer Console)]

Pretty cool, huh!? Check out the ActionTarget field: for the Standard Button records, that's the URL you can place in your UIs to invoke that action, simple as that! Better still, this is a supported way to get it; no more URL hacking! Now let's add a couple of Custom Buttons and re-run the query…

[Screenshot: PlatformActionAcount2.png (Account page with Custom Buttons added)]

We now see CustomButton records appear…

[Screenshot: PlatformActionQueryResults2.png (CustomButton records in the results)]

This next query reveals actions shown on a List View. I did note, however, that Custom List View buttons requiring record selection did not appear. I suspect this is because they require more than a simple HTTP GET URL to invoke.

SELECT DeviceFormat, Label, Type, Section,
       ActionTarget, ActionTargetType, ActionListContext
  FROM PlatformAction
  WHERE ActionListContext = 'ListView' AND
        SourceEntity = 'Account' AND
        Section = 'Page' AND
        DeviceFormat = 'Aloha'

Other observations…

  • Prior to Summer'16 (out in preview as I write this), Apex SOQL was not supported, only REST API SOQL. This was due to a limitation with the internally applied LIMIT keyword. It has now been resolved in Summer'16, so Apex SOQL now works; see the sketch after this list!
  • SourceEntity can also be given an SObject API name, e.g. SourceEntity = 'Account'; the results here are object-level buttons, like New, or those you add to the MRU page.
  • The DeviceFormat field value matters; if you leave it off, it defaults to Phone, and thus some actions will be missing compared to the Desktop (Lightning Experience) or Aloha (Classic) formats. For example, I eventually found that Custom Buttons using Visualforce pages without the Lightning Supported checkbox set didn't appear when querying with the Phone device type.
  • User context matters; the actions returned are user- and configuration-sensitive, meaning the record itself, its record type and the associated layout all contribute to the actions returned. Custom Buttons, for example, need to be on the relevant layout.
  • Label and Icon information: there are also fields that allow you to render appropriate labels and icons for the actions.
  • Related List actions: you can also retrieve actions shown on related lists; search for RelatedList in the help topic here.
  • Describe actions? You will notice some actions have an ActionTargetType of Describe. These are invoked via an API, something I will cover in a later blog.
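
Here is the sketch referenced in the first bullet: the same Record query issued from Apex, now that Summer'16 supports it (the record Id is the example one from earlier).

// Sketch: querying PlatformAction from Apex SOQL (Summer'16 onwards).
Id recordId = '001B000000D2V0n';
for (PlatformAction action : [
         SELECT Label, Type, Section, ActionTarget, ActionTargetType
           FROM PlatformAction
          WHERE ActionListContext = 'Record'
            AND SourceEntity = :recordId
            AND Section = 'Page'
            AND DeviceFormat = 'Desktop']) {
    System.debug(action.Label + ' => ' + action.ActionTarget);
}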

So let's discuss the "virtual SObject" bit!?!

You're probably wondering what a virtual SObject is.

Well, my best guess is it's an SObject that is not backed by physical data in the Salesforce database. If you check the documentation you'll see fields just like any other object, and it supports SOQL (with some limitations). My thinking is that the records for this object are dynamically generated on demand, with all the heavy lifting done internally to scan the various historic places where actions have been defined.

Thank you Salesforce architects, this is now my #1 coolest Salesforce API!

What's next?

  • For starters, no more URL hacking of those standard pages, no excuses now!
  • Helper class or Visualforce and/or Lightning component for actions?
  • Explore the Force.com Actions Developer Guide