Andy in the Cloud

From BBC Basic to Force.com and beyond…



New Book – Salesforce Platform Enterprise Architecture 4th Edition

It has been nearly 10 years since the first edition of my book, Force.com Enterprise Architecture, was published back in 2014, and clearly a lot has moved on since then, not just in the platform capabilities. Keeping pace with branding changes resulted in the second edition of the book being called Lightning Platform Enterprise Architecture. This latest 4th edition, Salesforce Platform Enterprise Architecture, has probably the most accurate and, I hope, enduring title! This 4th edition is also nearly twice the size of the first edition, coming in at 681 pages. So much so that its 16 chapters are now split over 4 parts to make digesting the book easier. As is tradition by now, this blog will cover some of the latest updates and additions and what to expect in general from the book.

Links: Amazon | Packt Publishing

Motivation and what to expect…

I first came to the platform with a lot of experience and understanding of architecture principles from other platforms, including platform-agnostic patterns such as those of Martin Fowler. Applying such patterns to the platform requires some nuance at times. This book is therefore about enterprise architecture considerations as applied to the Salesforce Platform, and much more besides, in respect to its capabilities that accelerate traditional application development concerns and tasks for you – and can hinder if not considered upfront in your architecture design.

The book dedicates 3 of its 16 chapters to Apex code separation of concerns and how to lay out your code, while the rest covers the full spectrum of application concerns, such as designing for code and database performance, profiling, testing, data storage design, building APIs, extensibility and more, and of course how to write much less code by harnessing the declarative aspects. All of this is driven throughout the book via a fictitious reference application known as FormulaForce, based on Formula1 motor racing – full source is included. Also included are numerous tips, tricks and info blocks dotted among the pages that highlight learnings from my own experiences, and latterly those of the many amazing reviewers.

As good as the platform is at abstracting a huge amount of the complexity of managing and securing modern-day applications for you, there is no escaping the need to fully understand good architecture principles that would, and do, in most cases apply elsewhere, such as query optimization and indexing.

If you are developing a lot on the Salesforce Platform, have a few years' experience and are passionate about creating applications that are enduring and empathetic to the platform itself – this book is for you. Read on for further highlights!

Is it for ISVs or anyone?

This edition is the first to split out the motivations for packaging in respect to internal distribution needs within your own company vs distributing a commercial solution on AppExchange. Regardless of which motivation fits you, the chapters will call out which bits are relevant and which are not. Even if you are not into packaging at all right now, much of the content on Apex, Lightning, Flow, APIs etc. is of course agnostic to how you choose to distribute your application. Regardless of who you are sharing it with, the book gives you an appreciation of the advantages and limits of package-based distribution, which in my view is a key aspect of your overall architecture considerations checklist.

Have the Apex coding patterns evolved given recent features?

The advent of user mode capabilities has had a big impact on the approach taken within the sample application of the book when it comes to using the Apex Enterprise Patterns Library (fflib), specifically in how the Unit of Work and Selector patterns are used when updating and querying for data respectively.
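
For context, the sketch below shows the underlying user mode features in Apex that these pattern updates build upon: a query run with WITH USER_MODE and DML run as the user. This is a minimal illustration rather than code from the book, and the Contestant__c object and fields are simply representative of a FormulaForce-style schema.

public with sharing class ContestantsSelectorSketch {
    // Query that respects the running user's object, field and sharing access
    public List<Contestant__c> selectByRace(Set<Id> raceIds) {
        return [SELECT Id, Name, Race__c
                FROM Contestant__c
                WHERE Race__c IN :raceIds
                WITH USER_MODE];
    }

    // DML that enforces the running user's permissions, the kind of commit a
    // Unit of Work configured for user mode would ultimately perform
    public void updateContestants(List<Contestant__c> contestants) {
        update as user contestants;
    }
}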

Additionally, the Domain pattern chapter now reflects on how Domain logic can, if desired, allow the developer to split Domain and Trigger logic into separate classes, and thus introduces a new pattern (SDCL) to capture this approach. That's not to say the original model has gone, it is still very much present, but it felt appropriate to include something in the book that reflects how the library usage has also evolved over the years – thanks to my good friend John M. Daniel for pushing for this!
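
Purely as an illustration of the idea of separating trigger wiring from domain behaviour (the class, object and field names below are hypothetical, and the book's actual SDCL implementation will differ), such a split might look something like this:

// Trigger logic class: responds to trigger events and delegates to the domain
public with sharing class RacesTriggerHandler {
    public void onBeforeInsert(List<Race__c> newRaces) {
        new Races(newRaces).applyDefaults();
    }
}

// Domain class: holds behaviour for a set of Race__c records, callable from
// triggers, services and tests alike
public with sharing class Races {
    private List<Race__c> records;
    public Races(List<Race__c> records) {
        this.records = records;
    }
    public void applyDefaults() {
        for (Race__c race : this.records) {
            if (race.Status__c == null) {
                race.Status__c = 'Scheduled'; // illustrative defaulting logic
            }
        }
    }
}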

Apply skills in other languages and additional scaling options

A brand new chapter in the book is dedicated to how Heroku and Functions can be used to leverage existing skills, or indeed existing investments in open source or internal code, written in other languages such as Java or Node.js. These technologies also open up further scaling options. The chapter breaks down the approaches using the book's sample application as a basis. It thus continues the Formula1 motor racing theme by introducing Web and API interfaces for race fans and vendors (an authentication scenario) to interact with the application backend APIs. Later in the book, the integration chapter doubles down further on authoring APIs via the OpenAPI standard and how that simplifies integration with the platform.

Lightning Web Components and Lightning Experience expansion

Since the third book, all but one of the Aura components have been removed, as many only existed at the time as workarounds to the lack of LWC-enabled integration points. Two large chapters are dedicated to the Lightning framework itself, how it handles separation of concerns, eventing and security, and its alignment with the industry Web Component standard. They complete with a section covering the ever expanding possibilities to extend vs build when thinking about your application's user experience, regardless of whether that is desktop, mobile or driven through the native UIs and tools such as Lightning Pages or Flow. And yes, I still love the Utility Bar integration the most – as you can see from the number of historic posts and tools on the topic here.

APIs, APIs and APIs…

Many years ago a product manager once thanked me for pushing an API strategy internally, since they appreciated first hand, as a past buyer, that APIs are effectively an insurance policy for buyers when it comes to unexpected feature gaps during or after implementation. Thus several of the chapters highlight the many Salesforce APIs in contextual ways.

Though it's not until the integration and extensibility chapter that the book really doubles down on building your own application APIs, their versioning, naming and design, and the value they bring to your fellow developers, customers and/or partner ecosystems. External Services is one of the most impressive aspects of integrating with the Salesforce Platform for me, and it was my pleasure to update the book with an updated Heroku based API leveraging OpenAPI to get the full power of the feature – the two are a perfect combo!

And …

This blog is already in danger of becoming chapter size itself, so I will leave it here with the above highlights, and before I close share my deepest thanks to this edition's reviewers John M. Daniel, Sebastiano Costanzo, Arvind Narasimhan and foreword author Daniel J. Peter.

I will be popping up in the not too distant future in ApexHours (https://www.apexhours.com/) and perhaps other locations to talk in more depth about the book and likely other side topics of interest. I will update this blog as those resources arrive. Thank you all in the meantime, the Salesforce community is the best and remains very close to my heart, enjoy!

AndyInTheCloud

P.S. Suffice to say this 4th edition is an evolution of the prior three books! I would like to also thank the following fine folks for their support as past reviewers and foreword authors!

Past Reviewers and Forewords:
Matt Bingham, Steven Herod, Matt Lacey, Andrew Smith, Rohit Arora, Karanraj Samkaranarayanan, Jitendra Zaa, Joshua Burk, Peter Knolle, John Leen, Aaron Slettenhaugh, Avrom Roy-Faderman, Michael Salem, Mohith Shrivastava and Wade Wegner



The Third Edition

I'm proud to announce the third edition of my book has now been released. Back in March this year I took the plunge to start updates to many key areas and add two brand new chapters. In the 2 years and 8 months since the last edition there have been several platform releases and an increasing number of new features and innovations that made this the biggest update ever! This edition also embraces the platform's rebranding to Lightning, hence the book is now entitled Salesforce Lightning Platform Enterprise Architecture.

You can purchase this book direct from Packt or of course from Amazon, among other sellers. As is the case every year, at Salesforce events such as Dreamforce and TrailheaDX this book and many other awesome publications will be on sale. Here are some of the key update highlights:

  • Automation and Tooling Updates
    Throughout the book the SFDX CLI, Visual Studio Code and 2nd Generation Packaging are leveraged. While the whole book is certainly larger, certain chapters actually reduced in size as steps previously reflecting clicks were replaced with CLI commands! At one point in time I was quite a master of Ant scripts and macros; they have now given way to built-in SFDX commands.
  • User Interface Updates
    Lightning Web Components is a relatively new kid on the block, but benefits greatly from its standards compliance, meaning there is plenty of fun to go around exploring industry tools like Jest in the Unit Testing chapter. All of the book's components have been re-written to the Web Component standard.
  • Big Data and Async Programming
    Big data was once a future concern for new products; these days it is very much a concern from the very start. The book covers Big Objects and Platform Events more extensively with worked examples, including ingest and calculations driven by Platform Events and Async Apex Triggers (see the sketch after this list). Event Driven Architecture is something every Lightning developer should be embracing as the platform continues to evolve around more and more standard features that leverage it.
  • Integration and Extensibility
    I particularly enjoyed exploring the use of Platform Events as another means by which you can expose APIs from your packages to support more scalable invocation of your logic and asynchronous plugins.
  • External Integrations and AI
    External integrations with other cloud services are a key part of application development and also the implementation of your solution, thus one of two brand new chapters focuses on Connected Apps, Named Credentials, External Services and External Objects, with worked examples using existing services or sample Heroku based services. Einstein has an ever growing surface area across Salesforce products and the platform. While this topic alone is worth an entire book, I took the time in the second new chapter to enumerate Einstein from the perspective of the developer and customer configurations. The Formula1 motor racing theme continues with the ingest of historic race data that you can run AI over.
  • Other Updates
    Among other updates is a fairly extensive update to the CI/CD chapter, which still covers Jenkins but leverages the new Jenkins Pipeline feature to integrate the SFDX CLI. The Unit Testing chapter has also been extended with further thoughts on unit vs integration testing and a focus on Lightning Web Component testing.
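
As referenced in the Big Data and Async Programming bullet above, here is a minimal sketch of the kind of platform event subscription such event-driven ingest relies on. The Race_Data__e event, its fields and the target object are illustrative rather than the book's actual schema.

// Illustrative platform event subscriber; event and field names are made up
trigger RaceDataSubscriber on Race_Data__e (after insert) {
    List<Race_Lap_Time__c> lapTimes = new List<Race_Lap_Time__c>();
    for (Race_Data__e event : Trigger.new) {
        // Transform each ingested event into a record used for later calculations
        lapTimes.add(new Race_Lap_Time__c(
            Race__c = event.RaceId__c,
            Lap__c = event.Lap__c,
            Time__c = event.Time__c));
    }
    insert lapTimes;
}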

The above is just highlights for this third edition; you can see a full table of contents here. A massive thanks to everyone involved for providing the inspiration and support for making this third edition happen! Enjoy!



Swagger / Open API + Salesforce = LIKE

In my previous blog I covered an exciting new integration tool from Salesforce, which consumes APIs that have a descriptor (or schema) associated with them. External Services allows point-and-click integration with APIs. The ability for Salesforce to consume APIs complying with API schema standards is a pretty huge step forward, extending its ability to integrate with ease in a way that is in keeping with its low barrier to entry development and clicks-not-code mantra.


At the time of writing my previous blog, only the Interagent schema was supported by External Services. However, as of the Winter '18 release this is no longer the case. In this blog I will explore the more widely adopted Swagger / Open API 2.0 standard, using Node.js, Heroku and External Services. As a bonus topic, I will also touch on using Swagger Code Generator with Apex!

One of the many benefits of supporting the Swagger / Open API standard is the ability to generate documentation for it. The following screenshot shows the API schema on the left and the generated documentation on the right. What is also very cool about this is the Try this operation button. Give it a try for yourself now!



What's the difference between Swagger and Open API 2.0? This was a question I asked myself and thought I would cover the answer here. Basically, as of Swagger v2.0 there is no difference; the Open API Initiative is a rebranding, born out of the huge adoption Swagger has seen since its creation. This move means its future is more formalised and it has a more meaningful name. You can read more about this amazing story here.

Choosing your methodology for API development

The schema shown above might look a bit scary and you might well want to just get writing code and think about the schema when you're ready to share your API. This is certainly supported and there are some tools that support generation of the schema via JSDoc comments in your code or via your joi schema here (useful for existing APIs).

However, to really embrace an API-first strategy in your development team, I feel you should start with the requirements and thus the schema first. This allows others in your team, or the intended recipients, to review the API before it has been developed and even test it out with stub implementations. In my research I was thus drawn to Swagger Node, a tool set donated by Apigee that embraces API-design-first. Read more pros and cons here. It is also the formal Node.js implementation associated with Swagger.

The following describes the development process of API-design-first.

(Diagram: the API-design-first development process, from the Swagger Node README)

Developing Open APIs with “Swagger Node”

Swagger Node is very easy to get started with and is well documented here. It supports the full API-design-first development process shown in the diagram above. The editor (also shown above) is really useful for getting used to writing schemas, and the UI is dynamically refreshed, including errors.

The overall Node.js project is still pretty simple (GitHub repo here), now consisting of three files. The schema is edited in YAML file format (translated to JSON when served up to tools). The schema for the ASCIIArt service now looks like the following and is pretty self-describing. For further documentation on Swagger / Open API 2.0 see here.

https://createasciiart.herokuapp.com/schema/
swagger: "2.0"
info:
  version: "1.0.0"
  title: AsciiArt Service
# during dev, should point to your local machine
host: localhost:3000
# basePath prefixes all resource paths 
basePath: /
# 
schemes:
  # tip: remove http to make production-grade
  - http
  - https
# format of bodies a client can send (Content-Type)
consumes:
  - application/json
# format of the responses to the client (Accepts)
produces:
  - application/json
paths:
  /asciiart:
    # binds a127 app logic to a route
    x-swagger-router-controller: asciiart
    post:
      description: Returns ASCIIArt to the caller
      # used as the method name of the controller
      operationId: asciiart
      consumes:
        - application/json
      parameters:
        - in: body
          name: body
          description: Message to convert to ASCIIArt
          schema:
            type: object
            required: 
              - message
            properties:
              message:
                type: string
      responses:
        "200":
          description: Success
          schema:
            # a pointer to a definition
            $ref: "#/definitions/ASCIIArtResponse"
  /schema:
    x-swagger-pipe: swagger_raw
# complex objects have schema definitions
definitions:
  ASCIIArtResponse:
    required:
      - art
    properties:
      art:
        type: string

The entry point of the Node.js app, the server.js file now looks like this…

'use strict';

var SwaggerExpress = require('swagger-express-mw');
var app = require('express')();
module.exports = app; // for testing
var config = {
  appRoot: __dirname // required config
};

SwaggerExpress.create(config, function(err, swaggerExpress) {
  if (err) { throw err; }
  // install middleware for swagger ui
  app.use(swaggerExpress.runner.swaggerTools.swaggerUi());
  // install middleware for swagger routing
  swaggerExpress.register(app);
  var port = process.env.PORT || 3000;
  app.listen(port);
});

Note: I changed the Node.js web server framework from hapi (used in my previous blog) to express, as I could not get the Swagger UI to integrate with hapi.

The code implementing the API has been moved into its own asciiart.js file.

var figlet = require('figlet');

function asciiart(request, response) {
    // Call figlet to generate the ASCII Art and return it!
    const msg = request.body.message;
    figlet(msg, function(err, data) {
        response.json({ art: data});
    });
}

module.exports = {
    asciiart: asciiart
};

Note: There is no parameter validation code written here; the Swagger Node module dynamically implements parameter validation for you (based on what you define in the schema) before the request reaches your code! It also validates your responses.

To access the documentation simply use the path /docs. The documentation is generated automatically, so there is no need to manage static HTML files. I have hosted my sample AsciiArt service on Heroku so you can try it by clicking the link below.

https://createasciiart.herokuapp.com/docs/


Consuming Swagger APIs with External Services

The process described in my earlier blog for using the above API via External Services has not changed. External Services automatically recognises Swagger APIs.


NOTE: There is a small bug that prevents the callout if the basePath is specified as root in the schema. Thus this has been commented out in the deployed version of the schema for now. Salesforce will likely have fixed this by the time you read this.

Swagger Tools

  • Swagger Editor, the interactive editor shown in the first screenshot of this blog.
  • Swagger Code Generator, creates server stubs and clients for implementing and calling Swagger-enabled APIs.
  • Swagger UI, the browser-based UI for generating documentation. You can call this from the command line and upload the static HTML files, or use frameworks like the one used in this blog to generate it on the fly.

Can we use Swagger to call or implement APIs authored in Apex?

Swagger Tools are available on a number of platforms, including recently added support for Apex clients. This gives you another option to consume APIs directly in Apex. It's not clear if this is going to be a better route than consuming the classes generated by External Services; I suspect each has some pros and cons, to be honest. Time will tell!

Meanwhile I did run the Swagger Code Generator for Apex and got this…

public class SwagDefaultApi {
    SwagClient client;

    public SwagDefaultApi(SwagClient client) {
        this.client = client;
    }

    public SwagDefaultApi() {
        this.client = new SwagClient();
    }

    public SwagClient getClient() {
        return this.client;
    }

    /**
     *
     * Returns ASCIIArt to the caller
     * @param body Message to convert to ASCIIArt (optional)
     * @return SwagASCIIArtResponse
     * @throws Swagger.ApiException if fails to make API call
     */
    public SwagASCIIArtResponse asciiart(Map<String, Object> params) {
        List<Swagger.Param> query = new List<Swagger.Param>();
        List<Swagger.Param> form = new List<Swagger.Param>();

        return (SwagASCIIArtResponse) client.invoke(
            'POST', '/asciiart',
            (SwagBody) params.get('body'),
            query, form,
            new Map<String, Object>(),
            new Map<String, Object>(),
            new List<String>{ 'application/json' },
            new List<String>{ 'application/json' },
            new List<String>(),
            SwagASCIIArtResponse.class
        );
    }
}

The code is also generated in a Salesforce DX compliant format, very cool!
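
As a rough idea of how such a generated client might be invoked, here is a hypothetical usage sketch. The shape of the SwagBody model and its property name are assumptions on my part, so check the actual generated classes in your project.

// Hypothetical usage sketch of the generated client shown above
SwagDefaultApi api = new SwagDefaultApi();
SwagBody body = new SwagBody();
body.message = 'Hello World'; // assumed property name; inspect the generated model
SwagASCIIArtResponse res = api.asciiart(new Map<String, Object>{ 'body' => body });
System.debug(res);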



Simplified API Integrations with External Services

Salesforce are on a mission to make accessing off-platform data and web services as easy as possible. This helps keep the user experience optimal and consistent for the user, and also allows admins to continue to leverage the platform's tools such as Process Builder and Flow, even if the data or logic is not on the platform.

Starting with External Objects, they added the ability to see and also update data stored in external databases. Once set up, users can manipulate external records without leaving Salesforce, by staying within the familiar UIs. With External Services, currently in Beta, they have extended this concept to external API services.

UPDATE: The ASCIIArt Service covered in this blog has since been updated to use the Swagger schema standard. However this blog is still a very useful introduction to External Services. Once you have read it, head on over to this blog!

In this blog let's first focus on the clicks-not-code steps you can repeat in your own org to consume a live ASCII Art web service API I have exposed publicly. The API is simple: it takes a message and returns it in ASCII art format. The following steps result in a working UI to call the API and update a record.


After the clicks-not-code bit I will share how the API was built, what's required for compatibility with this feature and how insanely easy it is to develop Web Services on Heroku using Node.js. So let's dive into External Services!

Building an ASCII Art Converter in Lightning Experience and Flow

The above solution was built with the following configurations / components, all of which are accessible under the LEX Setup menu (required for External Services), and takes around 5 minutes maximum to get up and running.

  1. Named Credential for the URL of the Web Service
  2. External Service for the URL, referencing the Named Credential
  3. Visual Flow to present a UI, call the External Service and update a record
  4. Lightning Record Page customisation to embed the Flow in the UI

I created myself a Custom Object called Message, but you can easily adapt the following to any object you want; you just need a Rich Text field to store the result in. The only other thing you need to know of course is the web service URL.

https://createasciiart.herokuapp.com

Can I use External Services with any Web Service then?

In order to simplify what developers normally have to interpret and code manually, Web Service APIs must be documented in a way that External Services can understand. In this Beta release this is the Interagent schema standard (created by Heroku as it happens). Support for the more broadly adopted Swagger / OpenAPI standard will be added in the Winter release (Safe Harbour).

For my ASCII Art service above, I authored the Interagent schema based on a sample the Salesforce PM for this feature kindly shared (more on this later). When creating the External Service in a moment, we will provide this schema to it.

https://createasciiart.herokuapp.com/schema

Creating a Named Credential

From the setup menu search for Named Credential and click New. This is a simple Web Service that requires no authentication. Basically provide only the part of the above URL that points to the Web Service endpoint.


Creating the External Service

Now for the magic! Under the Setup menu (only in Lightning Experience) search for Integrations and start the wizard. It's a pretty straightforward process of selecting the above Named Credential, then telling it the URL for the schema. If that's not exposed by the service you want to use, you can paste a schema in directly (which lets a developer define a schema if one does not already exist).



Once you have created the External Service you can review the operations it has discovered. Salesforce uses the documentation embedded in the given schema to display a rather pleasing summary actually.


So what just happened? Well… internally the wizard wrote some Apex code on your behalf and implemented the Invocable Method annotations to enable that Apex code to appear in tools like Process Builder (not supported in Beta) and Flow. Pretty cool!

What's more interesting for those wondering is that you cannot actually see this Apex code; it's there but somehow magically managed by the platform. Though I've not confirmed it, I would assume it does not require code coverage.

Update: According to the PM, in Winter '18 it will be possible to “see” the generated class from other Apex classes and thus reuse the generated code from Apex as well. Kind of like an API stub generator.
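
Conceptually, the hidden class behaves much like a hand-written invocable action. The sketch below is purely illustrative, with made-up class and Named Credential names; the real generated code is not visible and will differ.

public class ASCIIArtServiceSketch {
    public class Request {
        @InvocableVariable(required=true)
        public String message;
    }
    public class Response {
        @InvocableVariable
        public String art;
    }
    @InvocableMethod(label='Create ASCII Art')
    public static List<Response> asciiart(List<Request> requests) {
        List<Response> responses = new List<Response>();
        for (Request req : requests) {
            // The generated code would perform the callout via the Named Credential;
            // this sketch just shows the invocable shape Flow binds to
            HttpRequest httpReq = new HttpRequest();
            httpReq.setEndpoint('callout:ASCIIArtService/asciiart'); // made-up Named Credential name
            httpReq.setMethod('POST');
            httpReq.setHeader('Content-Type', 'application/json');
            httpReq.setBody(JSON.serialize(new Map<String, String>{ 'message' => req.message }));
            HttpResponse httpRes = new Http().send(httpReq);
            Response res = new Response();
            res.art = httpRes.getBody();
            responses.add(res);
        }
        return responses;
    }
}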

Creating a UI to call the External Service via Flow

This simple Flow prompts the user for a message to convert, calls the External Service and updates a Rich Text field on the record with the response. You will see that the Apex class generated by the External Service appears in the Flow sidebar.


The following screenshots show some of the key steps involved in setting up the Flow and its three steps, including making a Flow variable for the record Id. This is later used when embedding the Flow in Lightning Experience in the next step.

(Screenshots: the RecordId variable used by the Flow Lightning Component, assigning the message service parameter, assigning the response to a variable, and updating the Rich Text field.)

TIP: When setting the ASCII Art service response into the field, I wrapped the value in the HTML elements pre and code to ensure the use of a monospaced font when the Rich Text field displays the value.

Embedding the Flow UI in Lightning Experience

Navigate to your desired object's record detail page and select Edit Page from the cog in the top right of the page to open the Lightning App Builder. Here you can drag the Flow component onto the page and configure it to call the above flow. Make sure to map the Flow variable for the record Id as shown in the screenshot, to ensure the current record is passed.


That's it, you're done! Enjoy your ASCII Art messages!

Creating your own API for use with External Services

Belinda, the PM for this feature, was also kind enough to share the sample code for the example shown at TrailheaDX, on which the service in this blog is based. However, I did want to build my own version to do something different from the credit example, and also to extend my personal experience with Heroku and Node.js.

The Node.js code for this solution is only 41 lines long. It runs up a web server (using the very easy-to-use hapi library) and registers a couple of handlers. One handler returns the statically defined schema.json file, the other implements the service itself. As a side note, the joi library is an easy way to add validation to the service parameters.

var Hapi = require('hapi');
var joi = require('joi');
var figlet = require('figlet');

// initialize http listener on a default port
var server = new Hapi.Server();
server.connection({ port: process.env.PORT || 3000 });

// establish route for serving up schema.json
server.route({
  method: 'GET',
  path: '/schema',
  handler: function(request, reply) {
    reply(require('./schema'));
  }
});

// establish route for the /asciiart resource, including some light validation
server.route({
  method: 'POST',
  path: '/asciiart',
  config: {
    validate: {
      payload: {
        message: joi.string().required()
      }
    }
  },
  handler: function(request, reply) {
    // Call figlet to generate the ASCII Art and return it!
    const msg = request.payload.message;
    figlet(msg, function(err, data) {
        reply(data);
    });
  }
});

// start the server
server.start(function() {
  console.log('Server started on ' + server.info.uri);
});

I decided I wanted to explore the diversity of what's available in the Node.js space through npm. To keep things light I chose to have a bit of fun and quickly found an ASCII Art library called figlet. I soon discovered that npm had a library for pretty much every other use case I came up with!

Finally, the hand-written Interagent schema is also shown below and is reasonably short and easy to understand for this example. Interagent is not all that well documented in layman's terms as far as I can see. See my thoughts on this and upcoming Swagger support below.

{
  "$schema": "http://interagent.github.io/interagent-hyper-schema",
  "title": "ASCII Art Service",
  "description": "External service example from AndyInTheCloud",
  "properties": {
    "asciiart": {
      "$ref": "#/definitions/asciiart"
    }
  },
  "definitions": {
    "asciiart": {
      "title": "ASCII Art Service",
      "description": "Returns the ASCII Art for the given message.",
      "type": [ "object" ],
      "properties": {
        "message": {
          "$ref": "#/definitions/asciiart/definitions/message"
        },
        "art": {
          "$ref": "#/definitions/asciiart/definitions/art"
        }
      },
      "definitions": {
        "message": {
          "description": "The message.",
          "example": "Hello World",
          "type": [ "string" ]
        },
        "art": {
          "description": "The ASCII Art.",
          "example": "",
          "type": [ "string" ]
        }
      },
      "links": [
        {
          "title": "AsciiArt",
          "description": "Converts the given message to ASCII Art.",
          "href": "/asciiart",
          "method": "POST",
          "schema": {
            "type": [ "object" ],
            "description": "Specifies input parameters to calculate payment term",
            "properties": {
              "message": {
                "$ref": "#/definitions/asciiart/definitions/message"
              }
            },
            "required": [ "message" ]
          },
          "targetSchema": {
            "$ref": "#/definitions/asciiart/definitions/art"
          }
        }
      ]
    }
  }
}

Finally here is the package.json file that brings the whole node app together!

{
  "name": "asciiartservice",
  "version": "1.0.0",
  "main": "server.js",
  "dependencies": {
    "figlet": "^1.2.0",
    "hapi": "~8.4.0",
    "joi": "^6.1.1"
  }
}

Other Observations and Thoughts…

  • Error Handling.
    You can handle errors from the service in the usual way by using the Fault path from the element. The error shown is not all that pretty, but then in fairness there is not really much of a standard to follow here.
  • Can a Web Service called this way talk back to Salesforce?
    Flow provides various system variables, one of which is the Session Id. Thus you could pass this as an argument to your Web Service. Be careful though, as the running user may not have Salesforce API access, and this will be a UI session and thus short-lived. You may therefore want to explore another means to obtain a renewable OAuth token for more advanced uses.
  • Web Service Callbacks.
    Currently in the Beta the Flow is blocked until the Web Service returns, so it's good practice to make your service short and sweet. Salesforce are planning async support as part of the roadmap, however.
  • Complex Parameters.
    It's unclear at this stage how complex a web service can be and still be supported, given Flow's limitations around Invocable Methods, which this feature depends on.
  • The future is bright with Swagger support!
    I am really glad Salesforce are adding support for Swagger/OpenAPI, as I really struggled to find good examples and tutorials around Interagent. Really what is needed here is for the schema and code to be tied more closely together, like this! UPDATE: See my other blog entry covering Swagger support.

Summary

Both External Objects and External Services reflect the continued need for integration tools and for making this process simpler and thus cheaper. Separate services and data repositories are, for now, here to stay. I'm really pleased to see Salesforce doing what it does best, making complex things easier for the masses. Or as Einstein would say… “Everything should be made as simple as possible, but no simpler.”

Finally, you can read more about External Objects here and here through Agustina's and latterly Alba's excellent blogs.