Andy in the Cloud

From BBC Basic to Force.com and beyond…


Streaming Debug Logs to your console

Debug logs are a key tool for triaging and profiling on the Lightning Platform (formerly Force.com), both in development and production. While the Apex Interactive Debugger provides an interactive experience, sometimes you just want to monitor, parse or filter logs. Maybe you are reproducing a bug and watching for a certain SOQL query or method being executed, or you just want to filter the output in different ways.


A recent addition to the DX command line, from Chris Wall, is the ability to stream debug logs from any org connected to DX straight to your console. This is similar to the experience in the Developer Console Logs pane, but it effectively opens the logs and dumps them out for you automatically as they are produced on the server.

sfdx force:apex:log:tail

You can install the Salesforce DX CLI here. Note that you do not need to have a DX project to use this command.

In the following command line example, I have piped the output to another command (grep) that filters the output to show only USER_DEBUG log lines.

sfdx force:apex:log:tail --color | grep USER_DEBUG 

Pictures do not really do it justice, so here is a short demo video!

The command works against any org you have connected to the DX CLI, including production and sandbox orgs. However, if you run it from the same folder as a DX project it will use the currently configured default user/scratch org for that project.

Adding a bit of color to your debug logs!

The --color parameter used above enables some basic color highlighting for methods, constructors, variable assignments, etc.


You can also customize your own colors by setting the SFDX_APEX_LOG_COLOR_MAP environment variable to an absolute file path to a JSON file per the format shown below.

{
    CONSTRUCTOR_: 'magenta',
    EXCEPTION_: 'red',
    FATAL_: 'red',
    METHOD_: 'blue',
    SOQL_: 'yellow',
    USER_: 'green',
    VARIABLE_: 'cyan'
}
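
For example, you might point the variable at your custom map like this (the file path below is just a hypothetical example):

# Point the CLI at a custom color map, then tail with color enabled
export SFDX_APEX_LOG_COLOR_MAP=/Users/me/apexlogcolors.json
sfdx force:apex:log:tail --color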

Power to the pipe!

One of the most exciting features for me is the ability to pipe debug logs. Maybe you want to parse out some information to profile how many SOQL statements have been executed, or aggregate the timestamp values (the bit in brackets after the time!) to do some performance profiling. I am looking forward to seeing what folks do with this, so please share!
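
As a rough sketch, the standard debug log event markers make this easy to do with everyday command line tools:

# Count SOQL queries seen while the tail is running (Ctrl+C to stop and print the count)
sfdx force:apex:log:tail | grep -c SOQL_EXECUTE_BEGIN

# Or watch only the governor limit usage lines
sfdx force:apex:log:tail | grep LIMIT_USAGE_FOR_NS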

Anything else?

The --debuglevel parameter is optional but allows you to define your own debug level by inserting records into the TraceFlag object (via the DX CLI command force:data:record:create). Finally, you can run the command with the --help parameter to get the latest help.

Usage: sfdx force:apex:log:tail [-c] [-d ] [-s] [-u ] [--json] [--loglevel ] 

start debug logging and display logs

 -c, --color                          colorize noteworthy log lines
 -d, --debuglevel DEBUGLEVEL          debug level for trace flag
 -s, --skiptraceflag                  skip trace flag setup
 -u, --targetusername TARGETUSERNAME  username or alias for the target org;
                                      overrides default target org
 --json                               format output as json
 --loglevel LOGLEVEL                  logging level for this command invocation
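
For example, a minimal sketch of using --debuglevel (this assumes you have already created a debug level named MyTailLevel in the org):

# Tail logs using a custom debug level and color highlighting
sfdx force:apex:log:tail --debuglevel MyTailLevel --color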



User Notifications with the Utility Bar API

In this blog, I want to highlight a couple of great UI features provided by the Utility Bar in Lightning Experience. These are relatively new and accessible only via the Utility Bar API, so they are not immediately obvious. This blog is based on code and material I prepared for Dreamforce 2017; however, I did not have time to dig into the code during that session, so this blog provides that opportunity. My session also covered other cool features in Lightning Experience, such as the amazing App Console mode!

Enabling and Understanding the Utility Bar API
The Utility Bar API is enabled at a component level, though it does have access to the whole utility bar. You can include the lightning:utilityBarAPI component in any component, regardless of whether it is in the utility bar or not. This component does not display anything, but it does have a very useful selection of methods!

<lightning:utilityBarAPI aura:id="utilitybar"/>

In your component code you simply access it like any other component.

var utilityAPI = cmp.find("utilitybar");

Once you have access to an instance of the component you can call any of its methods. All methods take a utilityId parameter, although if you call them from within a component running in the utility bar you can omit it and the API will discover it for you. All the methods take a single JavaScript object whose properties represent the parameters to the method.

utilityAPI.setPanelHeaderLabel({ label: "My Label" });

One interesting design aspect of these methods is that they do not respond immediately; all responses are returned via a callback. To do this the API uses the JavaScript Promises pattern. Fortunately, it is a pretty easy convention to pick up and use, and it is worth taking the time to understand, as it has fast become the de facto callback approach.
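
For example, here is a minimal sketch using the getAllUtilityInfo method, which resolves later with information about each utility in the bar:

var utilityAPI = cmp.find("utilitybar");
utilityAPI.getAllUtilityInfo().then(function (response) {
    // The promise resolves asynchronously with an array of utility info objects
    console.log('Utilities in the bar: ' + response.length);
}).catch(function (error) {
    console.error(error);
});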

Providing Notifications


There are many occasions when you want to notify the user of something that has happened since they last logged in, or while they are logged in, as a result of some background process. The setUtilityHighlighted method is a good way to drive such notifications.

You can, of course, evaluate this when your component initializes, but it is worth considering using Platform Events; it is really easy to send them from your Apex code or Process Builder, and you can easily integrate my Streaming API component to respond to the event. The code below is a very simple isolated example using browser timers, but it helps illustrate the API and gives you a basis to build on.

   label="{! v.readNotification ? 'Mark as Read' : 'Wait' }" 
<aura:if isTrue="{!v.readNotification}">
   <ui:message title="Confirmation"severity="info">
      This is a confirmation message.</ui:message>
    demoNotifications: function (cmp, event) {
		var utilityAPI = cmp.find("utilitybar"); 
        var readNotification = cmp.get('v.readNotification');
        if(readNotification == true) {
			utilityAPI.setUtilityHighlighted({ highlighted : false });                        
            cmp.set('v.readNotification', false);
        } else {
            setTimeout($A.getCallback(function () {
                utilityAPI.setUtilityHighlighted({ highlighted : true });            
				cmp.set('v.readNotification', true);
            }), 3000);                    

Providing Progress Updates


By using a combination of setUtilityLabel and setUtilityIcon you can create an eye-catching progress-updating effect. This sample is another simple browser-timer-based example. However, you could again use Platform Events to send events as part of a Batch Apex execution to report progress, or just poll the AsyncApexJob object.

    label="{! v.isProgressing ? 'Stop' : 'Start' }" 
    value="{! v.progress }" size="large" />
demoProgressIndicator: function (cmp, event) {
    var utilityAPI = cmp.find("utilitybar"); 
    if (cmp.get('v.isProgressing')) {
        // stop
        cmp.set('v.isProgressing', false);
        cmp.set('v.progress', 0);
        cmp.set('v.progressToggleIcon', false);
        utilityAPI.setUtilityLabel({ label : 'Utility Bar API Demo' });                    
        utilityAPI.setUtilityIcon({ icon : 'thunder' } );                                    
    } else {
        // start
        cmp.set('v.isProgressing', true);
        cmp._interval = setInterval($A.getCallback(function () {
            var progresToggleIcon = 
               cmp.get('v.progressToggleIcon') == true ? false : true;
            var progress = cmp.get('v.progress');
            cmp.set('v.progress', progress === 100 ? 0 : progress + 1);
            cmp.set('v.progressToggleIcon', progresToggleIcon);
                { label : 'Utility Bar API Demo (' + progress + '%)' });        
                { icon : progresToggleIcon == true ? 'thunder' : 'spinner' });
        }), 400);


There is still plenty to dig into in the code samples from the session. You can also deploy the sample code into an org and try out some of the other interactive API demos. Enjoy!



Adding User Feedback to your Package

Feature Management became GA in Winter ’18 and with it the ability to have finer control and visibility over how users of your package consume its functionality. It provides an Apex API and corresponding objects in your License Management Org (LMO). With these objects, you can switch features on and off, or even extend the capacity or duration of existing ones. For features you simply want to monitor, you can also track adoption and send metrics back to your LMO. This is all done within the platform; no HTTP callouts are required.

This blog focuses on a tracking and metrics use case and presents a Lightning Component that allows users to activate a given feature and, after that, contribute to an aggregated scoring of that feature that is sent back to you, the package owner!

Consider this scenario. The latest Widgets App package has added a new feature to its Widgets Overview tab: the ability to analyze widgets! Using Feature Management, the package owner can now track who activates this feature and the ratings their customers give it.


Trying it Out: You can find the full source code for this sample app and component here. You can also easily give it a try via the handy Deploy to SFDX button! Do not worry though, it will not write back to your production org; it is totally isolated in this mode, unless you choose to package it up and associate your LMA etc.

Your customers may not want every user to have the ability to activate and deactivate features as shown above. The sample application associated with this blog includes a Manage Features custom permission that controls this ability. Users without this custom permission assigned only see the feedback portion of the component. If the feature has not been activated, a message is displayed informing the user to consult with their admin.

The component only sends the average score per feature per customer back to you. However, a user's individual score is captured in the subscriber org via custom objects, enabling you to pick up the phone and dig a bit deeper with customers showing particularly low or high scores.


The following shows what you get back in your License Management Org, or LMO (typically your company's production org). By using standard reporting you can easily see which features have been activated, their average score, and how many users contributed to the rating.


NOTE: The average score returned to the production org is represented as a value from 1 to 50.

So let's now dig into the code and the architecture of this solution. Firstly we need some feature parameters, then some corresponding Apex API calls to read and write them. You define the parameters via XML under the /featureParameters folder.


NOTE: In this blog, we are dealing with parameter values that flow from your customer's org back into your production org, hence the SubscriberToLmo usage. Also keep in mind that, per the docs, updates to your LMO may take up to 24 hours.

You can also create package feature parameters via a new tab on the Package Details page in your packaging org. However, if you are using Scratch Orgs you define them via metadata directly. The Feedback component needs you to define three parameters per feature. To support our scenario, the following parameters are defined: WidgetAnalysis (Boolean), WidgetAnalysisCount (Integer), and WidgetAnalysisScore (Integer).
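
As a rough sketch, the Boolean parameter above might live in a file such as featureParameters/WidgetAnalysis.featureParameterBoolean-meta.xml (the exact file name and default value here are assumptions; see the sample repo for the real files):

<?xml version="1.0" encoding="UTF-8"?>
<FeatureParameterBoolean xmlns="http://soap.sforce.com/2006/04/metadata">
    <!-- Values flow from the subscriber org back to the LMO -->
    <dataflowDirection>SubscriberToLmo</dataflowDirection>
    <masterLabel>WidgetAnalysis</masterLabel>
    <value>false</value>
</FeatureParameterBoolean>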


NOTE: When your code sets parameter values, mixed DML rules apply, so typically you would set these via an async context such as @future or a Queueable.
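
A minimal sketch of that pattern follows (the class and method names are hypothetical; the parameter names are those defined above):

public class WidgetAnalysisFeature {
    // Hypothetical helper: call this from your trigger or controller logic
    @future
    public static void updateFeatureMetrics(Integer userCount, Integer averageScore) {
        // Setting feature parameters updates setup data, so do it in an
        // async context to avoid mixed DML errors
        System.FeatureManagement.setPackageBooleanValue('WidgetAnalysis', true);
        System.FeatureManagement.setPackageIntegerValue('WidgetAnalysisCount', userCount);
        System.FeatureManagement.setPackageIntegerValue('WidgetAnalysisScore', averageScore);
    }
}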

To use the Feedback Component included in the sample code for this blog, you set two attributes: the name of your feature parameter and an attribute that the rest of your component code can use to conditionally display your new feature! The following shows how, in the above scenario, the new Widget Analysis component uses the c:featureFeedback component to activate the feature, get a user rating, and control the display of the new graph to the user.


Take a deeper look at the Feature Feedback Component Apex controller to see the above Feature Management API in action, as well as how it aggregates the scores.

Back over in your License Management Org, the above report was based on one of the four new custom objects provided by a Salesforce package. The Feature Parameter object represents the package parameters defined above. The Feature Parameter Date, Feature Parameter Integer, and Feature Parameter Boolean objects hold the values set via code running in the subscriber org, associated with the License record. Per the documentation, it can take up to 24 hours for these to update.



Knowing more about how your customers consume applications you build on the platform is a big part of customer success and your own. You should also check out the Usage Metrics feature if you have not already.

There is quite a bit more to Feature Management to explore, such as controlling feature activation exclusively from your side, which is useful for pilots or enabling paid features. Of special note is the ability to hide related Custom Objects until features are activated. This is certainly welcome to your customers' admins when working with large packages, since they only see objects relating to activated features in Object Manager.

Finally, I recommend you read the excellent documentation in the ISVforce Guide very carefully before trying this out in your main package, since this feature integrates with your live production data. The documentation recommends trying this out in a temporary package first and discussing rollout with your legal team.

P.S. You can also use the FeatureManagement.checkPermission method to check if the user has been assigned a given custom permission without SOQL. Very useful!
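
For example (the custom permission API name below is an assumption based on the sample app):

// True if the running user has the custom permission assigned, no SOQL needed
Boolean canManageFeatures =
    System.FeatureManagement.checkPermission('Manage_Features');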


Platform Events and Lightning Components

Over the past few weeks I have been working with Platform Events in a number of my Lightning projects, such as Custom Metadata Services and Event Logging. This component allows you to receive Platform Events in your own components.

This component can be deployed from this repository here. I have been working on it for a while but have yet to highlight it on my blog, so I wanted to rectify that now. I also wanted to take some time to explain lessons learned along the way, and to hear whether anyone else has thoughts on improving it even further!

Using the Component

Firstly, using the component is very easy; you just need two attributes (a usage sketch follows below). The channel attribute defines the event and the onMessage attribute the handler for the event. I learnt along the way that if you define a registerEvent in your component it automatically exposes an attribute that takes a callback handler, which is much nicer for users of your component! The payload event parameter contains the platform event fields.
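
As an illustration only (the component and event names below are hypothetical; check the repository for the actual names), usage looks something like this:

<!-- Subscribe to a platform event channel and handle incoming messages -->
<c:streamingEventListener channel="/event/Widget_Event__e"
                          onMessage="{! c.handleWidgetEvent }"/>

// Controller: the payload parameter contains the platform event fields
handleWidgetEvent : function(component, event, helper) {
    var payload = event.getParam('payload');
    console.log('Received event: ' + JSON.stringify(payload));
}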

Wrapping CometD in a Component World

The component uses the CometD JavaScript library internally. This library requires a certain amount of configuration and authentication tweaking to initialise (such as disabling WebSockets, since LEX does not currently support them). Also, as I found out eventually, code to unsubscribe from events is needed too.

In the Visualforce page examples this was less of a concern in a page-centric world. In the world of components it becomes very important! Without it, I was finding that when I navigated around Lightning Experience and back to components using this component, I would accumulate handlers, which effectively appeared to repeat events. Not good.

In order to find the correct moment to unsubscribe and disconnect, the component listens to the aura:valueDestroy event. This does not fire immediately, however, so some additional defensive work in the internal handler is needed. I am certainly open to further comments on how to make this component more robust.
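
A rough sketch of that defensive cleanup (not the component's exact code; it assumes the CometD instance and subscription were stashed on the component during initialisation):

<!-- Markup: listen for the destroy (aura:valueDestroy) event -->
<aura:handler name="destroy" value="{!this}" action="{!c.onDestroy}"/>

// Controller
onDestroy : function(component, event, helper) {
    var cometd = component._cometd;
    var subscription = component._subscription;
    if (cometd && subscription) {
        // Stop repeating handlers when navigating around LEX
        cometd.unsubscribe(subscription);
        cometd.disconnect();
    }
}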

This Trailhead module also includes example code on how to use the CometD library in Lightning, though the result is not a generic component like the one in this blog. I am also less sure about its design in terms of unsubscribing only on page unload.


Creating, sending and receiving Platform Events is possible from Apex, REST APIs, Process Builder and Flow. Hopefully one day we will see a platform-provided Lightning base component to replace the one I have created here. Meanwhile, have fun!


Swagger / Open API + Salesforce = LIKE

In my previous blog I covered an exciting new integration tool from Salesforce, External Services, which consumes APIs that have a descriptor (or schema) associated with them and allows point-and-click integration with them. The ability for Salesforce to consume APIs complying with API schema standards is a pretty huge step forward, extending its ability to integrate with ease in a way that is in keeping with its low barrier to entry and clicks-not-code mantra.


At the time of writing my previous blog, only the Interagent schema was supported by External Services. However, as of the Winter ’18 release this is no longer the case. In this blog I will explore the more widely adopted Swagger / Open API 2.0 standard, using Node.js, Heroku and External Services. As a bonus topic, I will also touch on using Swagger Code Generator with Apex!

One of the many benefits of supporting the Swagger / Open API standard is the ability to generate documentation for it. The following screenshot shows the API schema on the left and the generated documentation on the right. What is also very cool is the Try this operation button. Give it a try for yourself now!



What's the difference between Swagger and Open API 2.0? This was a question I asked myself, so I thought I would cover the answer here. Basically, as of Swagger v2.0 there is no difference; the Open API Initiative is a rebranding, born out of the huge adoption Swagger has seen since its creation. This move means its future is more formalised, and it has a more meaningful name. You can read more about this amazing story here.

Choosing your methodology for API development

The schema shown above might look a bit scary, and you might well want to just get writing code and think about the schema when you're ready to share your API. This is certainly supported, and there are tools that support generating the schema via JSDoc comments in your code or via your joi schema here (useful for existing APIs).

However, to really embrace an API-first strategy in your development team, I feel you should start with the requirements and thus the schema first. This allows others in your team, or the intended recipients, to review the API before it has been developed, and even to test it out with stub implementations. In my research I was thus drawn to Swagger Node, a tool set donated by Apigee that embraces API-design-first. Read more pros and cons here. It is also the formal Node.js implementation associated with Swagger.

The following describes the development process of API-design-first.


(ref: Swagger Node README)

Developing Open APIs with “Swagger Node”

Swagger Node is very easy to get started with and is well documented here. It supports the full API-design-first development process shown in the diagram above. The editor (also shown above) is really useful for getting used to writing schemas, and the UI is dynamically refreshed, including errors.

The overall Node.js project is still pretty simple (GitHub repo here), now consisting of three files. The schema is edited in YAML format (translated to JSON when served up to tools). The schema for the ASCIIArt service now looks like the following and is pretty self-describing. For further documentation on Swagger / Open API 2.0 see here.
swagger: "2.0"
  version: "1.0.0"
  title: AsciiArt Service
# during dev, should point to your local machine
host: localhost:3000
# basePath prefixes all resource paths 
basePath: /
  # tip: remove http to make production-grade
  - http
  - https
# format of bodies a client can send (Content-Type)
  - application/json
# format of the responses to the client (Accepts)
  - application/json
    # binds a127 app logic to a route
    x-swagger-router-controller: asciiart
      description: Returns ASCIIArt to the caller
      # used as the method name of the controller
      operationId: asciiart
        - application/json
        - in: body
          name: body
          description: Message to convert to ASCIIArt
            type: object
              - message
                type: string
          description: Success
            # a pointer to a definition
            $ref: "#/definitions/ASCIIArtResponse"
    x-swagger-pipe: swagger_raw
# complex objects have schema definitions
      - art
        type: string

The entry point of the Node.js app, the server.js file now looks like this…

'use strict';

var SwaggerExpress = require('swagger-express-mw');
var SwaggerUi = require('swagger-tools/middleware/swagger-ui');
var app = require('express')();
module.exports = app; // for testing

var config = {
  appRoot: __dirname // required config
};

SwaggerExpress.create(config, function(err, swaggerExpress) {
  if (err) { throw err; }

  // install middleware for swagger ui (served here via swagger-tools)
  app.use(SwaggerUi(swaggerExpress.runner.swagger));

  // install middleware for swagger routing
  swaggerExpress.register(app);

  var port = process.env.PORT || 3000;
  app.listen(port);
});

Note: I changed the Node.js web server framework from hapi (used in my previous blog) to express, as I could not get the Swagger UI to integrate with hapi.

The code implementing the API has been moved to its own asciiart.js file.

var figlet = require('figlet');

function asciiart(request, response) {
    // Call figlet to generate the ASCII Art and return it!
    const msg = request.body.message;
    figlet(msg, function(err, data) {
        response.json({ art: data });
    });
}

module.exports = {
    asciiart: asciiart
};

Note: There is no parameter validation code written here; the Swagger Node module dynamically implements parameter validation for you (based on what you define in the schema) before the request reaches your code! It also validates your responses.
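
For example, a quick way to exercise the service and its validation (the host name below is a placeholder for wherever you deploy it):

# Valid request: returns a JSON body with the generated ASCII art
curl -X POST https://your-asciiart-app.herokuapp.com/asciiart \
  -H "Content-Type: application/json" \
  -d '{ "message": "Hello" }'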

To access the documentation simply use the /docs path. The documentation is generated automatically; there is no need to manage static HTML files. I have hosted my sample AsciiArt service on Heroku, so you can try it by clicking the link below.


Consuming Swagger API’s with External Services

The process described in my earlier blog for using the above API via External Services has not changed. External Services automatically recognises Swagger APIs.


NOTE: There is a small bug that prevents the callout if the basePath is specified as root in the schema, so this has been commented out in the deployed version of the schema for now. Salesforce will likely have fixed this by the time you read this.

Swagger Tools

  • Swagger Editor, the interactive editor shown in the first screenshot of this blog.
  • Swagger Code Generator, creates server stubs and clients for implementing and calling Swagger enabled API’s.
  • Swagger UI, the browser-based UI for generating documentation. You can call this from the command line and upload the static HTML files, or use frameworks like the one used in this blog to generate it on the fly.

Can we use Swagger to call or implement APIs authored in Apex?

Swagger tools are available on a number of platforms, including recently added support for Apex clients. This gives you another option to consume APIs directly in Apex. It is not clear whether this will be a better route than consuming the classes generated by External Services; I suspect it will have some pros and cons, to be honest. Time will tell!

Meanwhile, I did run the Swagger Code Generator for Apex and got this…

public class SwagDefaultApi {
    SwagClient client;

    public SwagDefaultApi(SwagClient client) {
        this.client = client;
    }

    public SwagDefaultApi() {
        this.client = new SwagClient();
    }

    public SwagClient getClient() {
        return this.client;
    }

    /**
     * Returns ASCIIArt to the caller
     * @param body Message to convert to ASCIIArt (optional)
     * @return SwagASCIIArtResponse
     * @throws Swagger.ApiException if fails to make API call
     */
    public SwagASCIIArtResponse asciiart(Map<String, Object> params) {
        List<Swagger.Param> query = new List<Swagger.Param>();
        List<Swagger.Param> form = new List<Swagger.Param>();

        return (SwagASCIIArtResponse) client.invoke(
            'POST', '/asciiart',
            (SwagBody) params.get('body'),
            query, form,
            new Map<String, Object>(),
            new Map<String, Object>(),
            new List<String>{ 'application/json' },
            new List<String>{ 'application/json' },
            new List<String>(),
            SwagASCIIArtResponse.class
        );
    }
}
The code is also generated in a Salesforce DX compliant format, very cool!