Andy in the Cloud

From BBC Basic to Force.com and beyond…



Swagger / Open API + Salesforce = LIKE

In my previous blog I covered an exciting new integration tool from Salesforce, which consumes APIs that have a descriptor (or schema) associated with them. External Services allows point-and-click integration with APIs. The ability for Salesforce to consume APIs complying with API schema standards is a pretty huge step forward, extending its ability to integrate with ease in a way that is in keeping with its low-barrier-to-entry development and clicks-not-code mantra.

swaggerlike

At the time of writing my previous blog, only the Interagent schema was supported by External Services. However, as of the Winter’18 release this is no longer the case. In this blog I will explore the more widely adopted Swagger / Open API 2.0 standard, using Node.js, Heroku and External Services. As a bonus topic, I will also touch on using Swagger Code Generator with Apex!

One of the many benefits of supporting the Swagger / Open API standard is the ability to generate documentation for it. The following screenshot shows the API schema on the left and the generated documentation on the right. What is also very cool about this is the Try this operation button. Give it a try for yourself now!

asciiartswagger

oai

What’s the difference between Swagger and Open API 2.0? This was a question I asked myself, so I thought I would cover the answer here. Basically, as at Swagger v2.0 there is no difference: the Open API Initiative is a rebranding, born out of the huge adoption Swagger has seen since its creation. This move means its future is more formalised, under a more meaningful name. You can read more about this amazing story here.

Choosing your methodology for API development

The schema shown above might look a bit scary, and you might well want to just get writing code and think about the schema when you’re ready to share your API. This is certainly supported, and there are some tools that support generation of the schema via JSDoc comments in your code or via your joi schema here (useful for existing APIs).

However, to really embrace an API-first strategy in your development team, I feel you should start with the requirements and thus the schema first. This allows others in your team, or the intended recipients, to review the API before it has been developed, and even test it out with stub implementations. In my research I was thus drawn to Swagger Node, a tool set donated by Apigee that embraces API-design-first. Read more pros and cons here. It is also the formal Node.js implementation associated with Swagger.

The following diagram describes the API-design-first development process.

overview2

(ref: Swagger Node README)

Developing Open APIs with “Swagger Node”

Swagger Node is very easy to get started with and is well documented here. It supports the full API-design-first development process shown in the diagram above. The editor (also shown above) is really useful for getting used to writing schemas, and the UI is dynamically refreshed, including errors.

The overall Node.js project is still pretty simple (GitHub repo here), now consisting of three files. The schema is edited in YAML file format (translated to JSON when served up to tools). The schema for the ASCIIArt service now looks like the following and is pretty self-describing. For further documentation on Swagger / Open API 2.0 see here.

https://createasciiart.herokuapp.com/schema/
swagger: "2.0"
info:
  version: "1.0.0"
  title: AsciiArt Service
# during dev, should point to your local machine
host: localhost:3000
# basePath prefixes all resource paths 
basePath: /
# 
schemes:
  # tip: remove http to make production-grade
  - http
  - https
# format of bodies a client can send (Content-Type)
consumes:
  - application/json
# format of the responses to the client (Accepts)
produces:
  - application/json
paths:
  /asciiart:
    # binds a127 app logic to a route
    x-swagger-router-controller: asciiart
    post:
      description: Returns ASCIIArt to the caller
      # used as the method name of the controller
      operationId: asciiart
      consumes:
        - application/json
      parameters:
        - in: body
          name: body
          description: Message to convert to ASCIIArt
          schema:
            type: object
            required: 
              - message
            properties:
              message:
                type: string
      responses:
        "200":
          description: Success
          schema:
            # a pointer to a definition
            $ref: "#/definitions/ASCIIArtResponse"
  /schema:
    x-swagger-pipe: swagger_raw
# complex objects have schema definitions
definitions:
  ASCIIArtResponse:
    required:
      - art
    properties:
      art:
        type: string

The entry point of the Node.js app, the server.js file, now looks like this…

'use strict';

var SwaggerExpress = require('swagger-express-mw');
var app = require('express')();
module.exports = app; // for testing
var config = {
  appRoot: __dirname // required config
};

SwaggerExpress.create(config, function(err, swaggerExpress) {
  if (err) { throw err; }
  // install middleware for swagger ui
  app.use(swaggerExpress.runner.swaggerTools.swaggerUi());
  // install middleware for swagger routing
  swaggerExpress.register(app);
  var port = process.env.PORT || 3000;
  app.listen(port);
});

Note: I changed the Node.js web server framework from hapi (used in my previous blog) to express, as I could not get the Swagger UI to integrate with hapi.

The code implementing the API has been moved to its own asciiart.js file.

var figlet = require('figlet');

function asciiart(request, response) {
    // Call figlet to generate the ASCII Art and return it!
    const msg = request.body.message;
    figlet(msg, function(err, data) {
        if (err) {
            // Surface figlet failures as a 500 rather than returning undefined art
            response.status(500).json({ message: err.message });
            return;
        }
        response.json({ art: data });
    });
}

module.exports = {
    asciiart: asciiart
};

Note: There is no parameter validation code written here; the Swagger Node module dynamically implements parameter validation for you (based on what you define in the schema) before the request reaches your code! It also validates your responses.
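To see this validation in action, here is a minimal hedged sketch of a request that should be rejected before it reaches the controller. I have written it in Apex, in keeping with the rest of this blog (it assumes the endpoint has been allowed via Remote Site Settings); any HTTP client would do.

// Post a body that is deliberately missing the required "message" property
HttpRequest req = new HttpRequest();
req.setEndpoint('https://createasciiart.herokuapp.com/asciiart');
req.setMethod('POST');
req.setHeader('Content-Type', 'application/json');
req.setBody('{}');
HttpResponse res = new Http().send(req);
// Expect a 4xx validation error from the middleware, not generated art
System.debug(res.getStatusCode() + ' ' + res.getBody());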

To access the documentation, simply use the path /docs. The documentation is generated automatically; there is no need to manage static HTML files. I have hosted my sample AsciiArt service on Heroku, so you can try it by clicking the link below.

https://createasciiart.herokuapp.com/docs/

swaggerui

Consuming Swagger APIs with External Services

The process described in my earlier blog for using the above API via External Services has not changed. External Services automatically recognises Swagger APIs.

externalservicesasciiart

NOTE: There is a small bug that prevents the callout if the basePath is specified as root in the schema, so this has been commented out in the deployed version of the schema for now. Salesforce will likely have fixed this by the time you read this.

Swagger Tools

  • Swagger Editor, the interactive editor shown in the first screenshot of this blog.
  • Swagger Code Generator, creates server stubs and clients for implementing and calling Swagger enabled API’s.
  • Swagger UI, the browser-based UI for generating documentation. You can call this from the command line and upload the static HTML files, or use frameworks like the one used in this blog to generate it on the fly.

Can we use Swagger to call or implement APIs authored in Apex?

Swagger Tools are available on a number of platforms, including recently added support for Apex clients. This gives you another option to consume APIs directly in Apex. It’s not clear if this is going to be a better route than consuming the classes generated by External Services; I suspect it will have some pros and cons, to be honest. Time will tell!

Meanwhile, I did run the Swagger Code Generator for Apex and got this…

public class SwagDefaultApi {
    SwagClient client;

    public SwagDefaultApi(SwagClient client) {
        this.client = client;
    }

    public SwagDefaultApi() {
        this.client = new SwagClient();
    }

    public SwagClient getClient() {
        return this.client;
    }

    /**
     *
     * Returns ASCIIArt to the caller
     * @param body Message to convert to ASCIIArt (optional)
     * @return SwagASCIIArtResponse
     * @throws Swagger.ApiException if fails to make API call
     */
    public SwagASCIIArtResponse asciiart(Map<String, Object> params) {
        List<Swagger.Param> query = new List<Swagger.Param>();
        List<Swagger.Param> form = new List<Swagger.Param>();

        return (SwagASCIIArtResponse) client.invoke(
            'POST', '/asciiart',
            (SwagBody) params.get('body'),
            query, form,
            new Map<String, Object>(),
            new Map<String, Object>(),
            new List<String>{ 'application/json' },
            new List<String>{ 'application/json' },
            new List<String>(),
            SwagASCIIArtResponse.class
        );
    }
}

The code is also generated in a Salesforce DX-compliant format. Very cool!
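As a hedged sketch of how you might consume the generated client (the SwagBody model and its message property come from my generated project and may be named differently in yours):

// Hypothetical usage of the generated client; model names are assumptions
SwagDefaultApi api = new SwagDefaultApi();
SwagBody body = new SwagBody();
body.message = 'Hello World';
SwagASCIIArtResponse response = api.asciiart(new Map<String, Object>{ 'body' => body });
System.debug(response.art);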



Simplified API Integrations with External Services

Salesforce are on a mission to make accessing off-platform data and web services as easy as possible. This helps keep the user experience optimal and consistent, and also allows admins to continue to leverage the platform’s tools, such as Process Builder and Flow, even if the data or logic is not on the platform.

Starting with External Objects, they added the ability to see and also update data stored in external databases. Once set up, users can manipulate external records without leaving Salesforce, staying within the familiar UIs. With External Services, currently in Beta, they have extended this concept to external API services.

UPDATE: The ASCIIArt Service covered in this blog has since been updated to use the Swagger schema standard. However this blog is still a very useful introduction to External Services. Once you have read it, head on over to this blog!

In this blog let’s first focus on the clicks-not-code steps you can repeat in your own org to consume a live ASCII Art web service API I have exposed publicly. The API is simple: it takes a message and returns it in ASCII art format. The following steps result in a working UI to call the API and update a record.

ExternalServicesDemo.png

After the clicks-not-code bit I will share how the API was built, what’s required for compatibility with this feature, and how insanely easy it is to develop web services on Heroku using Node.js. So let’s dive into External Services!

Building an ASCII Art Converter in Lightning Experience and Flow

The above solution was built with the following configurations / components, all of which are accessible under the LEX Setup menu (required for External Services); it takes around 5 minutes maximum to get up and running.

  1. Named Credential for the URL of the Web Service
  2. External Service for the URL, referencing the Named Credential
  3. Visual Flow to present a UI, call the External Service and update a record
  4. Lightning Record Page customisation to embed the Flow in the UI

I created myself a Custom Object called Message, but you can easily adapt the following to any object you want; you just need a Rich Text field to store the result in. The only other thing you need to know, of course, is the web service URL.

https://createasciiart.herokuapp.com

Can I use External Services with any web service then?

In order to build technologies that simplify what developers normally have to interpret and code manually, web service APIs must be documented in a way that External Services can understand. In this Beta release this means the Interagent schema standard (created by Heroku, as it happens). Support for the more broadly adopted Swagger / OpenAPI will be added in the Winter release (Safe Harbor).

For my ASCII Art service above, I authored the Interagent schema based on a sample the Salesforce PM for this feature kindly shared; more on this later. When creating the External Service in a moment, we will provide this schema URL.

https://createasciiart.herokuapp.com/schema

Creating a Named Credential

From the Setup menu, search for Named Credential and click New. This is a simple web service that requires no authentication, so basically provide only the part of the above URL that points to the web service endpoint.

ESNamedCred.png

Creating the External Service

Now for the magic! Under the Setup menu (only in Lightning Experience) search for Integrations and start the wizard. It’s a pretty straightforward process: select the above Named Credential, then tell it the URL for the schema. If that’s not exposed by the service you want to use, you can paste a schema in directly (which lets a developer define a schema themselves if one does not already exist).

esstep1.png

esstep2.png

Once you have created the External Service you can review the operations it has discovered. Salesforce uses the documentation embedded in the given schema to display a rather pleasing summary.

esstep3.png

So what just happened? Well… internally the wizard wrote some Apex code on your behalf and applied the Invocable Method annotations to enable that Apex code to appear in tools like Process Builder (not supported in Beta) and Flow. Pretty cool!

What’s more interesting, for those wondering, is that you cannot actually see this Apex code; it’s there, but somehow magically managed by the platform. Though I’ve not confirmed it, I would assume it does not require code coverage.

Update: According to the PM, in Winter’18 it will be possible to “see” the generated class from other Apex classes and thus reuse the generated code from Apex as well. Kind of like an API stub generator.
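Purely as an illustration of the general shape such generated code might take (the real generated class is hidden, so this is a guess, not the actual code; ASCIIArt is a hypothetical Named Credential name):

public with sharing class AsciiArtAction {
    @InvocableMethod(label='Create ASCII Art' description='Returns ASCII Art for the given messages')
    public static List<String> asciiart(List<String> messages) {
        List<String> art = new List<String>();
        for (String message : messages) {
            // Call the service via the Named Credential created earlier
            HttpRequest req = new HttpRequest();
            req.setEndpoint('callout:ASCIIArt/asciiart');
            req.setMethod('POST');
            req.setHeader('Content-Type', 'application/json');
            req.setBody(JSON.serialize(new Map<String, String>{ 'message' => message }));
            HttpResponse res = new Http().send(req);
            // This service replies with the ASCII art as the raw response body
            art.add(res.getBody());
        }
        return art;
    }
}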

Creating a UI to call the External Service via Flow

This simple Flow prompts the user for a message to convert, calls the External Service and updates a Rich Text field on the record with the response. You will see in the Flow sidebar that the Apex class generated by the External Service appears.

esflow

The following screenshots show some of the key steps involved in setting up the Flow and its three elements, including making a Flow variable for the record Id. This is later used when embedding the Flow in Lightning Experience in the next step.

esflow1

RecordId used by Flow Lightning Component

esflow2

Assign the message service parameter

esflow3

Assign the response to variable

esflow4

Update the Rich Text field

TIP: When setting the ASCII Art service response into the field, I wrapped the value in the HTML elements pre and code to ensure the use of a monospaced font when the Rich Text field displays the value.

Embedding the Flow UI in Lightning Experience

Navigate to your desired object’s record detail page and select Edit Page from the cog in the top right of the page to open the Lightning App Builder. Here you can drag the Flow component onto the page and configure it to call the above Flow. Make sure to map the Flow variable for the record Id as shown in the screenshot, to ensure the current record is passed.

esflowlc.png

That’s it, you’re done! Enjoy your ASCII Art messages!

Creating your own API for use with External Services

Belinda, the PM for this feature, was also kind enough to share the sample code for the example shown at TrailheaDX, on which the service in this blog is based. However, I wanted to build my own version to do something different from the credit example, and also to extend my personal experience with Heroku and Node.js.

The Node.js code for this solution is only 41 lines long. It runs up a web server (using the very easy-to-use hapi library) and registers a couple of handlers. One handler returns the statically defined schema.json file; the other implements the service itself. As a side note, the joi library is an easy way to add validation of the service parameters.

var Hapi = require('hapi');
var joi = require('joi');
var figlet = require('figlet');

// initialize http listener on a default port
var server = new Hapi.Server();
server.connection({ port: process.env.PORT || 3000 });

// establish route for serving up schema.json
server.route({
  method: 'GET',
  path: '/schema',
  handler: function(request, reply) {
    reply(require('./schema'));
  }
});

// establish route for the /asciiart resource, including some light validation
server.route({
  method: 'POST',
  path: '/asciiart',
  config: {
    validate: {
      payload: {
        message: joi.string().required()
      }
    }
  },
  handler: function(request, reply) {
    // Call figlet to generate the ASCII Art and return it!
    const msg = request.payload.message;
    figlet(msg, function(err, data) {
        if (err) {
            return reply(err); // let hapi turn a figlet failure into a 500
        }
        reply(data);
    });
  }
});

// start the server
server.start(function() {
  console.log('Server started on ' + server.info.uri);
});

I decided I wanted to explore the diversity of what’s available in the Node.js space through npm. To keep things light, I chose to have a bit of fun and quickly found an ASCII art library called figlet. I soon discovered that npm has a library for pretty much every other use case I came up with, too!

Finally, the hand-written Interagent schema is shown below; it is reasonably short and easy to understand for this example. The standard is not all that well documented in layman’s terms as far as I can see. See my thoughts on this, and on the upcoming Swagger support, below.

{
  "$schema": "http://interagent.github.io/interagent-hyper-schema",
  "title": "ASCII Art Service",
  "description": "External service example from AndyInTheCloud",
  "properties": {
    "asciiart": {
      "$ref": "#/definitions/asciiart"
    }
  },
  "definitions": {
    "asciiart": {
      "title": "ASCII Art Service",
      "description": "Returns the ASCII Art for the given message.",
      "type": [ "object" ],
      "properties": {
        "message": {
          "$ref": "#/definitions/asciiart/definitions/message"
        },
        "art": {
          "$ref": "#/definitions/asciiart/definitions/art"
        }
      },
      "definitions": {
        "message": {
          "description": "The message.",
          "example": "Hello World",
          "type": [ "string" ]
        },
        "art": {
          "description": "The ASCII Art.",
          "example": "",
          "type": [ "string" ]
        }
      },
      "links": [
        {
          "title": "AsciiArt",
          "description": "Converts the given message to ASCII Art.",
          "href": "/asciiart",
          "method": "POST",
          "schema": {
            "type": [ "object" ],
            "description": "Specifies input parameters to calculate payment term",
            "properties": {
              "message": {
                "$ref": "#/definitions/asciiart/definitions/message"
              }
            },
            "required": [ "message" ]
          },
          "targetSchema": {
            "$ref": "#/definitions/asciiart/definitions/art"
          }
        }
      ]
    }
  }
}

Finally, here is the package.json file that brings the whole Node.js app together!

{
  "name": "asciiartservice",
  "version": "1.0.0",
  "main": "server.js",
  "dependencies": {
    "figlet": "^1.2.0",
    "hapi": "~8.4.0",
    "joi": "^6.1.1"
  }
}

Other Observations and Thoughts…

  • Error Handling.
    You can handle errors from the service in the usual way by using the Fault path from the Flow element. The error shown is not all that pretty, but then, in fairness, there is not really much of a standard to follow here.
    eserrorflow.png eserror.png
  • Can a Web Service called this way talk back to Salesforce?
    Flow provides various system variables, one of which is the Session Id, so you could pass this as an argument to your web service. Be careful though: the running user may not have Salesforce API access, and this will be a UI session and thus short-lived. You may therefore want to explore another means to obtain a renewable OAuth token for more advanced uses.
  • Web Service Callbacks.
    Currently in the Beta the Flow is blocked until the web service returns, so it’s good practice to make your service short and sweet. Salesforce are, however, planning async support as part of the roadmap.
  • Complex Parameters.
    It’s unclear at this stage how complex a web service can be supported, given Flow’s limitations around Invocable Methods, which this feature depends on.
  • The future is bright with Swagger support!
    I am really glad Salesforce are adding support for Swagger / OpenAPI, as I really struggled to find good examples and tutorials around Interagent. Really what is needed here is for the schema and code to be tied more closely together, like this! UPDATE: See my other blog entry covering Swagger support.

Summary

Both External Objects and External Services reflect the reality of the continued need for integration tools, making this process simpler and thus cheaper. Separate services and data repositories are, for now, here to stay. I’m really pleased to see Salesforce doing what it does best: making complex things easier for the masses. Or, as Einstein would say… “Everything should be made as simple as possible, but no simpler.”

Finally, you can read more about External Objects here and here through Agustina’s and latterly Alba’s excellent blogs.



Highlights from TrailheaDX 2017

IMG_2857.JPG

This was my first TrailheaDX, and what an event it was! With my Field Guide in hand I set out into the wilderness! In this blog I’ll share some of my highlights, thoughts and links to the latest resources. Many of the newly announced things you can actually get your hands on now, which is amazing!

Overall the event felt well organized, if a little frantic at times. With smaller sessions of 30 minutes each (often 20 minutes after intros and questions), each was bite-sized but quite well tuned, with demos and code samples being shown.

SalesforceDX. Salesforce announced the public beta of this new technology aimed at improving the developer experience on the platform. SalesforceDX consists of several modules that will be extended further over time. Salesforce has done a great job of providing a wealth of Trailhead resources to get you started.

Einstein. Since its announcement, I and other developers have been waiting to get access to more custom tools and APIs; well, now that wait is well and truly over. As per my previous blogs, we’ve had the Einstein Vision API for a little while now. Announced at the event were no less than three other new Einstein-related tools and APIs.

  • Einstein Discovery. Salesforce demonstrated a very slick-looking tool that allows you to point and click your way through analyzing various data sets, including those obtained from your custom objects! They have provided a Trailhead module on it here, and I plan on digging in! Pricing and further info is here.
  • Einstein Sentiment API. Allows you to interpret text strings for terms that indicate whether it is a positive, neutral or negative statement / sentiment. This can be used to scan case comments, forum posts, feedback, Twitter posts etc. in an automated way, and to respond or be alerted according to what is being said.
  • Einstein Intent API. Allows you to interpret text strings for meanings, such as instructions or requests. This enables routing case comments, or even implementing bots that can help automate or propose actions to be taken without human interpretation.
  • Einstein Object Detection API. An extension of the Einstein Vision API that allows individual items in a picture to be identified; for example, a pile of items on a coffee table, such as a mug, magazine, laptop or pot plant! Each can then be recognized and classified to build up more intel on what’s in the picture for further processing and analysis.
  • PredictionIO on Heroku. Finally, if you want to go below the declarative tools or intentionally simplified Einstein APIs, you can build your own machine learning models using Heroku and the PredictionIO buildpack!

Platform Events. These allow messages to be published and subscribed to using a new type of object known as an Event object, suffixed with __e. Once created, you can use new Apex or REST APIs to send messages, and use either Apex Triggers or the Streaming API to receive them. There is also a new Process Builder action and Flow element to send messages. You can keep your messages within Force.com or use them to integrate with other cloud platforms or the browser. The possibilities are quite endless here: async processing, inter-app comms, logging, UI notifications… I’m sure I and other bloggers will be exploring them in the coming months!
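As a minimal hedged sketch (the Order_Event__e event object and its Message__c field are hypothetical, not from any Salesforce sample), publishing from Apex might look like this:

// Hypothetical platform event with a custom text field Message__c
Order_Event__e event = new Order_Event__e(Message__c = 'Order shipped!');
// Delivery happens asynchronously once the transaction commits
Database.SaveResult result = EventBus.publish(event);
System.debug('Published? ' + result.isSuccess());

…and subscribing is just an Apex Trigger on the event object (platform event triggers support only the after insert event):

trigger OrderEventTrigger on Order_Event__e (after insert) {
    for (Order_Event__e evt : Trigger.new) {
        System.debug('Received: ' + evt.Message__c);
    }
}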

External Services. If you find a cool API you want to consume in Force.com, you currently have to write some code. No longer! If you have a schema that describes that API, you can use the External Services wizard to generate some Apex code that will call out to the API. What’s special about this is that the Apex code is accessible via Process Builder and Flow, making clicks-not-code API integration possible. It’s an excellent way to integrate with complementary services you or others might develop on platforms such as Heroku. I have just submitted a session to Dreamforce 2017 to explore this further; fingers crossed it gets selected! You can read more about it here in the meantime.

Sadly, I did have to make a key decision to focus on the above topics and not so much on Lightning. I heard from other attendees that these sessions were excellent, and I did catch a brief sight of dynamic component rendering in Lightning App Builder. Very cool!

Thanks Salesforce for filling my blog backlog for the next year or so! 😉

 

 



Declarative Rollup Tool Summer Release!

Over the course of the last couple of weeks, I have been focusing my community time on release v2.4 of the DLRS tool, specifically on some much-requested features driven by the community in the Chatter group.

So let’s get stuck in…

Rollup Scheduler Improvements

The ability to run a full (or partial, with criteria) recalculate of a rollup on a daily schedule has been in the tool for a few releases now. However, up until now the only option was to run it at 2am every day. It is now possible to change this with the new UI shown below; it’s a bit raw and basic, but for now it should at least give some more flexibility.

RollupSchedule.png

Support for Merging Accounts, Contacts and Leads

The platform has some special handling for merging Accounts, Contacts and Leads, especially when it comes to when Apex Triggers are invoked. Basically, if your parent object is one of these objects, prior versions of the tool had no awareness of the merge operation, so rollups using the Realtime or Scheduled calculation modes would not recalculate, since the platform does not fire Apex Triggers for child records reparented as a result of a merge.

With this release there are two things you can do to fix this. First, when you click the Manage Child Trigger button, you get a new checkbox option to control deployment of an additional Apex Trigger on the parent object. If you’re upgrading, you will need to click Remove and then Deploy again to see this.

ParentTrigger.png

IMPORTANT NOTE: If you don’t feel merge operations are an issue for your use cases, you can deselect this option and cut down on the number of triggers deployed. Also, if it is only the rollup child object that supports merging, there is no need to deploy any additional triggers, and the tool does not show the above checkbox option.

Secondly, you need to set up the RollupJob as an Apex Scheduled job (under Setup > Apex Classes), even if you don’t have any Scheduled mode rollups. Due to a platform restriction, the tool cannot recalculate rollups in realtime during a merge operation; it can only record that they need to be recalculated. It does this via the tool’s scheduled mode infrastructure, by automatically adding records to the Lookup Rollup Summary Schedule Items object. Note that you don’t need to change your rollups from Realtime to Scheduled mode for this to work, only schedule the job.
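If you prefer Execute Anonymous to the Setup UI, a minimal sketch of scheduling the job for the classic 2am nightly run might look like this (the job name is arbitrary; if you installed the managed package, the class lives in the dlrs namespace, e.g. new dlrs.RollupJob()):

// Cron format: seconds minutes hours day-of-month month day-of-week
System.schedule('DLRS Rollup Job', '0 0 2 * * ?', new RollupJob());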

Support for Archived / Deleted Records via the All Rows Setting

Salesforce archives Tasks and Events after a while. If you have rollups over these child objects, you can enable the Aggregate All Rows checkbox, as shown below. This will ensure your rollups remain accurate even if some records have been archived. Note this will also apply to records in the Recycle Bin. For upgrades (if you’re not using the Manage Lookup Rollup Summaries tab), you will need to add this field to your layout to see it.

AllRows.png
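Under the hood this setting presumably corresponds to the SOQL ALL ROWS keyword; a hedged illustration of the difference (the parent Id is hypothetical):

Id parentId = '001xx0000000001AAA'; // hypothetical parent record Id
// Standard query: archived Tasks and Recycle Bin records are excluded
List<Task> visible = [SELECT Id FROM Task WHERE WhatId = :parentId];
// ALL ROWS: archived and soft-deleted records are included
List<Task> everything = [SELECT Id FROM Task WHERE WhatId = :parentId ALL ROWS];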

Row Limit for Concatenate and Last Rollup Operations

If you’re using the Last or Concatenate operations, you can define a limit on how many child records are actually considered when calculating the rollup. This is useful if you’re using Concatenate into a fixed-length field, for example. When upgrading, you need to add the new Row Limit field to your layout if you’re not using the swanky new Manage Lookup Rollup Summaries tab.

RowLimit.png

Improved House Keeping for Scheduled Mode

If you are using rollups with their Calculation Mode set to Scheduled, the tool records parent rollup records to be recalculated later by the RollupJob Apex Scheduled job. In past releases, if through a merge or other operation the parent record was deleted before the next scheduled run, records would sit in limbo in the Lookup Rollup Summary Schedule Items object, being processed and erroring over and over. These will now be cleared out, and there are no upgrade actions you need to take for this.

Summary

Thanks for everyone’s support for this tool; I hope these changes help you go further with clicks not code! Though as a reminder, please keep in mind the best practices and restrictions listed in the README. If you have any questions, you can either post comments on this blog or use the Chatter Group. The Chatter Group is a great place to get your query seen by a broader group of people who are also diligently supporting the tool!

You can find releases of the tool here.



Disabling Trigger Events in Apex Enterprise Patterns

autobat.jpeg

I’m proud to host my first guest blogger, Chris Mail, or Autobat as he is known on GitHub. Take it away, Chris…

How to put the safety on…

Being an architect in a professional services organisation is a funny game. Each project is either a shiny new Salesforce instance without a fingerprint on it or an unknown vault of code and configuration that we must navigate through.

I have been using the fflib pattern now for some time, and more of our teams are adopting it for our programs of work. My latest addition is something an architect might wonder why we need: the ability to turn off triggers via a simple interface on all domains.

In an ever more complex environment, with perhaps multiple projects over time delivering iterative enhancements, I was noticing a common piece of code being developed within the Domain layer. It looked something along the lines of this:

public override void onAfterInsert()
{
    // if this is set we are already in a loop and want to exit!
    if(bProhibitAfterInsertTrigger)
    {
        return;
    }
    // down here we do something, maybe insert an Account!
}

While small and inconspicuous, it allowed our code base to become inconsistent, as there was no control over the exposure of these controlling flags and, worse, we were repeating ourselves in every domain!

The solution was simple: a fluent-style API within fflib_SObjectDomain. Any code can now simply set the control flags for any domain class:

fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableAll(); // don't fire anything
fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableAllBefore();
fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableAllAfter();

fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableBeforeInsert();
fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableBeforeUpdate();
fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableBeforeDelete();

fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableAfterInsert();
fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableAfterUpdate();
fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableAfterDelete();
fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableAfterUndelete();

To re-enable, just call the inverse, e.g. .enableAfterInsert(), etc.

While not every code base will need these flags, they allow you to quickly and easily control your trigger execution, with a single line of code that all your development team can reuse and follow.
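As a hedged sketch of where this earns its keep (the Accounts domain class is hypothetical), a unit test can now suppress domain logic while setting up data:

@IsTest
private class AccountsTriggerControlTest {
    @IsTest
    static void insertsTestDataWithDomainLogicDisabled() {
        // Suppress all trigger events for the (hypothetical) Accounts domain
        fflib_SObjectDomain.getTriggerEvent(Accounts.class).disableAll();
        insert new Account(Name = 'Test'); // domain logic does not fire

        // Switch the safety back off before exercising the code under test
        fflib_SObjectDomain.getTriggerEvent(Accounts.class).enableAll();
    }
}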



GitHub Salesforce Deploy Tool Lightning Edition

A recent discussion on Twitter around sharing Salesforce configuration between admins motivated me to revisit this tool. Before getting into anything major (watch this space), I decided to get back into the tool’s code by addressing a few outstanding maintenance tasks, enhancements and bug fixes, as well as treating it to a facelift!

LightningGitHubSFDeploy.png

As some of you may have noticed, the tool has now adopted the Salesforce Lightning Design System look and feel. This new version is not just cosmetic; it also addresses some key enhancements and bug fixes…

LightningGitHubSFDeploy2.png

In keeping with the Lightning principles, this release makes the Deploy to Salesforce button (example shown below) even easier to place in your repository README files, with less configuration required. The button will now automatically detect the GitHub repository, which is useful if you fork a repository into your own account or rename it: you no longer need to change the README file. The button code is now also smaller.

LightningGitHubSFDeployButton.png

IMPORTANT NOTE: This only works if you’re placing the button code in a GitHub README file. If you’re planning on using the button on a blog, article or wiki etc., tick the Use Specified Owner and Repository checkbox to have the specified repository details encoded into the button code as before.

This release has now also addressed issues preventing use with repositories that have Reports or Lightning Components in them. So you can now use the tool to deploy your favourite open source Lightning Components!

Finally, if you have any private repositories, an improvement around the error handling in this area has also been made. Thanks go to Moti Korets and Nathan Kramer for some great code contributions, including the whole GitHub private repository support. Thanks guys!

 

 



Rollups and Cross Object Formula Fields

I’m constantly amazed at the number of varied use cases folks in the Chatter Group are applying to the Declarative Lookup Rollup Summary tool these days. This short blog highlights a particular use case that seems to be on the increase. To resolve it, I reached out to Process Builder for additional help with the solution. This is the story…

Background: What causes a Rollup to recalculate?

The default behaviour of the rollup tool is to look for changes to the field you’re rolling up on, the one specified in the Field to Aggregate field. In addition, you can list other fields for it to look at via the Relationship Criteria Fields field, which, as the name suggests, is also important information if you’ve used rollup criteria. However, it’s also important if the field you’re rolling up on is a Formula field. In the case of formula field rollups, the platform doesn’t inform the tool’s Apex Trigger of changes to these, so it cannot directly monitor such fields; to resolve this, it must instead be told explicitly about the fields that are referenced by the formula expression. So far so good…

Challenge: Rollups over Cross Object Formulas?

A challenge arises, however, if you’re wanting to do a realtime rollup based on a formula field that references fields from a related record: a cross-object formula! In this case, how does the rollup tool know when changes are made to related records?

One solution is to switch to scheduled mode by clicking the Schedule Calculate button on the rollup. For realtime rollups, it’s potentially feasible to enhance the tool to deploy triggers to related objects and bubble up knowledge of field changes to cause a recalculate of the rollup on the child object… However, before we resort to more code (even in the tool), let’s see what we can do with the declarative tools we already have today…

Example Use Case

The following schema diagram shows a simplified mock-up of such a challenge I helped a community member out with in the tool’s Chatter Group.

FormulaRollupUseCase.png

Here are the scenario assumptions and breakdown…

  • For whatever existing reasons the above is NOT a Master Detail relationship
  • Rollup needed is to Sum the Quote Line Item > Amount into the Quote > Total field.
  • The Quote Line Item > Amount field is a Formula field, which is a cross object formula pointing to the related Widget > Total field.
  • The Widget > Total field is itself a Formula field, in this simplified case adding up the values of Widget > A + Widget > B + Widget > C.
  • Whenever changes to the Widget > A, Widget > B or Widget > C fields are made we want the Quote > Total field to be recalculated.

Here’s the rollup summary definition between Quote and Quote Line Item

ForumlaRollupDLRS.png

While the above works if you use the Calculate (one-off full recalculate) or Schedule Calculate (daily full recalculate) buttons, our issue arises in general use of the Realtime mode. Since the tool’s triggers see nothing of the changes users make to the Widget fields above, realtime changes are not reflected in the Quote > Total rollup. This is for the aforementioned reason: we are using a cross-object formula.

NOTE: The Calculate Mode picklist also has a Schedule option. This is a more focused background recalculate, only recalculating affected records since the last run, rather than the Schedule Calculate button’s full recalculate every night. So be aware, if you’re using this mode, that the problem and solution described here also apply to Calculate Mode when set to Schedule, as it uses the same field monitoring approach to queue records up for scheduled recalculation.

If you’re fine without realtime rollups, go ahead and use the Schedule Calculate button, and at 2am each morning the Quote > Total amount will catch up with any changes made to the Widget fields that affected it. Job done!

Solution: Shadow Fields and Process Builder

When considering the solution, I did consider another rollup between Widget and Quote Line Item, thinking I could then put the result in a field that the Quote Line Item > Quote rollup would see change. However, this quickly proved a poor option, as the relationship between Widget and Quote Line Item in this use case is the wrong way round: Quote Line Item is the child here, doh! In other use cases, though, I have had success in using nested rollups to get more complex requirements to fly!

Shadow Field?

AmountShadowField

Either way, I knew I had to have some kind of physical field, other than a Formula field, on the Quote Line Item object to notify the rollup tool of a change that would trigger a recalculate of the Quote > Total. I called this field Amount (Shadow) in this case; I also left it off my layout.

NOTE: I’ve made the assumption here that, for whatever reason, the existing cross-object Formula field has to stay. If that’s not a problem for you, simply recreate Quote Line Item > Amount as a physical field and, as you read the rest of this blog, consider this your shadow field.

I then changed my rollup definition to reference the Amount (Shadow) field instead.

ChangesToRollup

NOTE: If you managed to switch the field type of your Amount field from a Formula to a physical Number field as noted above, you don’t need to do this, of course.

Process Builder to update your Field to Aggregate

Next I turned to Process Builder to see if I could get it to populate the above Amount (Shadow) field on Quote Line Item as users make changes to Widget fields, leveraging the child-parent relationship between Quote Line Item and Widget. Here is the setup I used to complete the solution!

FormulaRollupProessBuilder1.png

FormulaRollupProessBuilder2.png

FormulaRollupProessBuilder3.png

Summary

It’s worth noting that if the relationship between Quote Line Item and Quote were Master-Detail, you could of course use standard platform Rollup Summary Fields without needing the rollup tool at all. You may think me biased here, but not at all; I’d much rather see a fully native solution any day!

Regardless of whether this use case fits yours, hopefully this blog has given you some useful inspiration for further rollup and Process Builder combo deals! Enjoy!