Andy in the Cloud

From BBC Basic to Force.com and beyond…



The Third Edition

I'm proud to announce that the third edition of my book has now been released. Back in March this year I took the plunge to start updating many key areas and to add two brand new chapters. In the 2 years and 8 months since the last edition there have been several platform releases and an increasing number of new features and innovations, making this the biggest update ever! This edition also embraces the platform's rebranding to Lightning, hence the book is now entitled Salesforce Lightning Platform Enterprise Architecture.

You can purchase this book directly from Packt or of course from Amazon, among other sellers. As is the case every year, at Salesforce events such as Dreamforce and TrailheaDX this book and many other awesome publications will be on sale. Here are some of the key update highlights:

  • Automation and Tooling Updates
    Throughout the book, the SFDX CLI, Visual Studio Code and 2nd Generation Packaging are leveraged. While the whole book is certainly larger, certain chapters actually reduced in size as steps previously describing clicks were replaced with CLI commands! At one point in time I was quite a master of Ant scripts and macros; these too have given way to built-in SFDX commands.
  • User Interface Updates
    Lightning Web Components is a relatively new kid on the block, but it benefits greatly from its standards compliance, meaning there is plenty of fun to go around exploring industry tools like Jest in the Unit Testing chapter. All of the book's components have been re-written to the Web Component standard.
  • Big Data and Async Programming
    Big data was once a future concern for new products; these days it is very much a concern from the very start. The book covers Big Objects and Platform Events more extensively with worked examples, including ingest and calculations driven by Platform Events and Async Apex Triggers. Event Driven Architecture is something every Lightning developer should be embracing as the platform continues to evolve around more and more standard features that leverage them.
  • Integration and Extensibility
    I particularly enjoyed exploring the use of Platform Events as another means by which you can expose APIs from your packages to support more scalable invocation of your logic and asynchronous plugins.
  • External Integrations and AI
    External integrations with other cloud services are a key part of application development and also of the implementation of your solution, thus one of the two brand new chapters focuses on Connected Apps, Named Credentials, External Services and External Objects, with worked examples against existing services or sample Heroku-based services. Einstein has an ever-growing surface area across Salesforce products and the platform. While this topic alone is worth an entire book, I took the time in the second new chapter to enumerate Einstein from the perspective of developer and customer configurations. The Formula1 motor racing theme continues with the ingest of historic race data that you can run AI over.
  • Other Updates
    Among other updates is a fairly extensive update to the CI/CD chapter, which still covers Jenkins but now leverages the new Jenkins Pipeline feature to integrate the SFDX CLI. The Unit Testing chapter has also been extended with further thoughts on unit vs. integration testing and a focus on Lightning Web Component testing.

The above are just the highlights of this third edition; you can see the full table of contents here. A massive thanks to everyone involved for providing the inspiration and support to make this third edition happen! Enjoy!



Managing Dependency Injection within Salesforce

When developing within Salesforce, dependencies are formed in many ways, not just those made explicitly when writing code, but also those formed by using declarative tools, such as when defining Actions and Layouts. This blog introduces a new open source library I have been working on called Force DI. The goal is to simplify and, more importantly, consolidate where and how you configure at runtime certain dependencies between Apex, Visualforce or Lightning component code.

Forming dependencies at runtime instead of explicitly during development can be very advantageous. So whether you are attempting to decompose a large org into multiple DX packages or building a highly configurable solution, hopefully you will find this library useful!

So what does the DI bit stand for?

The DI bit in Force DI stands for Dependency Injection, which is a form of IoC (Inversion of Control). Both are well-established patterns for providing the runtime glue between two points, basically the bit in the middle. Let’s start with an Apex example. In order to use DI, you need to forgo the use of the “new” operator at the point where you want to do the injection. For example, consider the following code:-

PaymentEngine engine = new PayPal();

In the above example, you are explicitly expressing a dependency, which not only means you have to deploy or package all your payment engines together, but also that you have hardcoded a finite set you support and thus forgone extensibility. With Force DI you can instead write:-

PaymentEngine engine = (PaymentEngine) di_Injector.Org.getInstance(PaymentEngine.class);

How does it know which class to instantiate then?

What's happening here is that the Injector class is using binding configuration (also dynamically discovered) to find out which class to actually instantiate. This binding configuration can be admin controlled, packaged (e.g. a "PayPal Package") and/or defined dynamically via code. Setting up binding config via code enables dynamic binding, by reading other configuration (e.g. the user's payment preference) and binding accordingly.
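
As a minimal sketch of what that might look like (the PaymentModule class and the preference lookup here are hypothetical; the fluent di_Module style is the one shown later in this post), a code-based binding could be expressed as follows:-

public class PaymentModule extends di_Module {
    public override void configure() {
        // Hypothetical: read the user's payment preference from elsewhere
        // (e.g. a Custom Setting) and bind the matching implementation
        String preference = 'PayPal';
        // Bind by name; di_Injector.Org.getInstance(PaymentEngine.class)
        // can then resolve this binding via the requested type's name
        bind('PaymentEngine').apex().to(preference);
    }
}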

The key goal of DI is that the calling code does not concern itself with how an instance is obtained, only what it does with it. The following shows how a declarative binding is expressed via the library's Binding Custom Metadata Type:-

If this all seems a bit indirect, that’s the point! Because of this indirection, you can now choose to deploy/package other payment gateway implementations independently from each other as well as be sure that everywhere your other code needs a PaymentEngine the implementation is resolved consistently. For a more advanced OOP walkthrough see the code sample here.

Can this help me with other kinds of dependencies?

Yes! Let's take the example of a Lightning Component used as an Action Override. Typically you would create a Lightning Component and associate it directly with the action override. However, this means that the object metadata, the action override and the Lightning code (as well as whatever is dependent on that) must travel around together, rather than, for example, in separate DX packages. It also means that if you want to offer different variations of this action you would need to code all of that into the single component as well.

As before let’s review what the Lightning Component Action Override looks like without DI:-

<aura:component implements="lightning:actionOverride,force:hasSObjectName">
   <lightning:card title="Widget">
     <p class="slds-p-horizontal_small">Custom UI to Create a Widget ({!v.sObjectName})</p>
   </lightning:card>
</aura:component>

This component (and all its dependencies) would be directly referenced in the Action Override below:-

Now let us take a look at this again, but using the Lightning c:di_injector component in its place:-

<aura:component implements="lightning:actionOverride,force:hasSObjectName">
   <c:di_injector bindingName="lc_actionWidgetNew">
      <c:di_injectorAttribute name="sObjectName" value="{!v.sObjectName}"/>
   </c:di_injector>
</aura:component>

To make things clearer when reviewing Lightning Components in the org, the above component follows a generic naming convention, in this case actionWidgetNew. It is this proxy component, not the injected one, that is bound to the Action Override, which now looks like this:-

The binding configuration looks like this:-

Finally, the injected Lightning Component widgetWizard looks like this:-

<aura:component>
   <aura:attribute name="sObjectName" type="String"/>
   <lightning:card title="Widget">
     <p class="slds-p-horizontal_small">Custom UI to Create a Widget ({!v.sObjectName})</p>
   </lightning:card>
</aura:component>

Note: You have the ability to pass context through to the bound Lightning Component, just as the sObjectName attribute value was passed above. The c:di_injector component can be used in many other places, such as Quick Actions, Lightning App Builder Pages, and the Utility Bar. Check out this example page in the repo for another example.

What about my Visualforce page content, can I inject that?

Visualforce used by Actions and in Layouts can be injected in much the same way as above, with a VF page acting as the injector proxy using the Visualforce c:di_injector component. We will skip showing what things looked like before DI, as they follow much the same general pattern as the Lightning Component approach.

The following example shows the layoutWidgetInfo page, which is again somewhat generically named to indicate that it is an injector proxy and not a real page. It is this page that is referenced in the Widget object's Layout:-

<apex:page standardController="Widget__c" extensions="di_InjectorController">
   <c:di_injector bindingName="vf_layoutWidgetInfo" parameters="{!standardController}"/>
</apex:page>

The following shows an alternative means to express binding configuration, via code. The ForceApp3Module class defines the bindings for a module/package of code where the Visualforce Component that actually implements the UI is stored. Note that the binding for vf_layoutWidgetInfo points to an Apex controller class, not the actual VF component to inject; its Provider inner class actually creates the specific component (via Dynamic Visualforce).

public class ForceApp3Module extends di_Module {

    public override void configure() {

        // Example named binding to a Visualforce component (via Provider)
        bind('vf_layoutWidgetInfo').visualforceComponent().to(WidgetInfoController.Provider.class);

        // Example SObject binding (can be used by trigger frameworks, see force-di-demo-trigger)
        bind(Account.getSObjectType()).apex().sequence(20).to(CheckBalanceAccountTrigger.class);

        // Example named binding to a Lightning component
        bind('lc_actionWidgetManage').lightningComponent().to('c:widgetManager');
    }
}

NOTE: The above binding configuration module class is itself injected into the org-wide Injector by a corresponding custom metadata Binding record here. You can also see other bindings being configured in the above example; see below for more on these.

The actual implementation of the injected Visualforce Component widgetInfo looks like this:-

<apex:component controller="WidgetInfoController">
  <apex:attribute name="standardController"
     type="ApexPages.StandardController"
     assignTo="{!StandardControllerValue}" description=""/>
  <h1>Success I have been injected! {!standardController.Id}</h1>
</apex:component>

Decomposition Examples

The examples shown above, and others, are contained in the sample repo. Each of the root package directories, force-app-1, force-app-2, and force-app-3, helps illustrate how the point of injection vs. the runtime binding can be split across the boundaries of a DX package, thus aiding decomposition. The force-di-trigger-demo (not shown below) also contains a sample trigger handler framework using the library's ability to resolve multiple bindings (to trigger handlers) in a given sequence, thus supporting the best practice of a single trigger per object.

Further Background and Features

I must confess that when I started to research Java Dependency Injection (mainly via Java Guice) I was skeptical as to how much I could get done without custom annotation and reflection support in Apex. However, I am pretty pleased with the result, how it has woven in with features like Custom Metadata Types, and how the Visualforce and Lightning Component injectors have turned out. I plan to write future Wiki pages on the associated GitHub repo to share more details on the Force DI API. Meanwhile, here is a rundown of some of the more advanced features.

  • Provider Support
    Injectors by default only return one instance of the bound object, hence getInstance. Bindings that point to a class implementing the Provider interface (see inner interface) can override this, which also allows for the construction of classes that do not have default constructors or types not supported by Type.forName. This feature also works in conjunction with the ability to pass a parameter via the Apex Injector, e.g. Injector.Org.getInstance(PaymentEngine.class, someData); (see the sketch after this list).
  • Parameters
    Each of the three Injectors permits the passing of parameter/context information into the bound class or component. The examples above illustrate this.
  • Modules, Programmatic Binding Configuration and Injector Scopes
    Binding Modules group programmatic bindings and allow you to hook programmatically into the initialization of the Injector. Modules use a Fluent-style interface to express bindings very clearly. The force-app-3 package in the repo uses this approach to define the bindings shown in the VF example above. You can also take a look at a worked example here of how local (one-off) Injectors can be used, and here for a more complex OO example of how conditional bindings work.
  • StandardController Passthrough
    For Visualforce Component injections, the framework's parameter-passing capability supports passing the instance of the StandardController from the hosting page through into the injected component, as can be seen in the example above.
  • Binding Discovery by SObject vs Name
    The examples above utilize single bindings by a unique name. However, it is becoming quite common to adapt trigger frameworks to support DI and thus allow a single trigger to dynamically reach out to one or more handlers (perhaps installed in separate DX packages). This example shows how Force DI could be used in such a scenario.
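
Referring back to the Provider Support feature above, here is a minimal sketch of what a Provider implementation might look like (the class name and the PayPal constructor here are hypothetical; the di_Binding.Provider inner interface is the one described above):-

public class PayPalProvider implements di_Binding.Provider {
    public Object newInstance(Object params) {
        // params is the optional value passed to the Injector,
        // e.g. Injector.Org.getInstance(PaymentEngine.class, someData)
        return new PayPal((String) params); // hypothetical non-default constructor
    }
}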

Conclusion

This blog has hopefully whet your appetite to learn more! If so, head over to the repo and have a look through the samples from this blog and others. My next step is to wrap this up in a DX package to make it easier to get your hands on it; for now, download the repo and deploy via DX. I am also keen to explore what other aspects of Java Guice might make sense, such as the Linked Bindings feature.

Meanwhile, I would love feedback on the sample code and library thus far. Last but not least I would like to give a shout out to John Daniel and Doug Ayers for their great feedback during the initial development of the library and this blog. Enjoy!

 



Adding User Feedback to your Package

Feature Management became GA in Winter '18 and with it the ability to have finer control and visibility over how users of your package consume its functionality. It provides an Apex API and corresponding objects in your License Management Org (LMO). With these objects, you can switch features on and off or even extend the capacity or duration of existing ones. For features you simply want to monitor, you can also track adoption and send metrics back to your LMO. This is all done within the platform; no HTTP callouts are required.

This blog focuses on a tracking and metrics use case and presents a Lightning Component that allows users to activate a given feature and thereafter contribute to an aggregated score for that feature, sent back to you, the package owner!

Consider this scenario. The latest Widgets App package has added a new feature to its Widgets Overview tab: the ability to analyze widgets! Using Feature Management they can now track who activates this feature and the ratings their customers give it.

[Screenshot: activating the Widget Analysis feature]

Trying it Out: You can find the full source code for this sample app and component here. You can also easily give it a try via the handy Deploy to SFDX button! Do not worry though, it will not write back to your production org; it is totally isolated in this mode, unless you choose to package it up and associate your LMA, etc.

Your customers may not want every user to have the ability to activate and deactivate features as shown above. The sample application associated with this blog includes a Manage Features custom permission that controls this ability. Users without this custom permission assigned only see the feedback portion of the component. If the feature has not been activated, a message is displayed informing the user to consult with their admin.

The component only sends the average score per feature per customer back to you. However, a user's individual score is captured in the subscriber org via custom objects, enabling you to pick up the phone and dig a bit deeper with customers showing particularly low or high scores.

[Screenshot: feature adoption scores captured in the subscriber org]

The following shows what you get back in your License Management Org, or LMO (typically your company's production org). By using standard reporting you can easily see which features have been activated, their average score, and how many users contributed to the rating.

[Screenshot: feature adoption report in the LMO]

NOTE: The average score returned to the production org is represented as a value from 1 to 50.

So let's now dig into the code and the architecture of this solution. Firstly we need some feature parameters, then some corresponding Apex API calls to read and write them. You define the parameters via XML under the /featureParameters folder.

[Screenshot: feature parameter metadata]

NOTE: In this blog we are dealing with parameter values that flow from your customer's org back into your production org, hence the SubscriberToLmo usage. Also keep in mind that, per the docs, updates to your LMO may take up to 24 hrs.

You can also create package feature parameters via a new tab on the Package Details page in your packaging org. However, if you are using Scratch Orgs you define them via metadata directly. The Feedback component needs you to define three parameters per feature. To support our scenario, the following parameters are defined: WidgetAnalysis (Boolean), WidgetAnalysisCount (Integer), and WidgetAnalysisScore (Integer).

[Screenshot: Feature Management API usage]

NOTE: When your code sets parameter values, mixed DML rules apply. So typically you would set these via an async context such as @future or a queueable.
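
For example, a minimal sketch of writing the parameter values from an async context (the class and method names here are hypothetical; the parameter names follow the WidgetAnalysis example above):-

public class WidgetAnalysisFeature {
    @future
    public static void recordScore(Integer averageScore, Integer userCount) {
        // These values flow from the subscriber org back to the LMO
        // (SubscriberToLmo); updates can take up to 24 hrs to appear
        FeatureManagement.setPackageIntegerValue('WidgetAnalysisScore', averageScore);
        FeatureManagement.setPackageIntegerValue('WidgetAnalysisCount', userCount);
    }
}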

To use the Feedback Component included in the sample code for this blog, you set two attributes: the name of your feature parameter and an attribute that the rest of your component code can use to conditionally display your new feature! The following shows how, in the above scenario, the new Widget Analysis component has used the c:featureFeedback component to activate the feature, get a user rating, and control the display of the new graph to the user.

[Screenshot: feedback component markup]

Take a deeper look at the Feature Feedback Component Apex controller to see the above Feature Management API in action, as well as how it aggregates the scores.

Back over in your License Management Org, the above report was based on one of the four new custom objects provided by a Salesforce package. The Feature Parameter object represents the package parameters defined above. The Feature Parameter Date, Feature Parameter Integer, and Feature Parameter Boolean objects hold the values set via code running in the subscriber org associated with the License record. Per the documentation, it can take up to 24 hrs for these to update.

[Screenshot: feature parameter objects schema]

Summary

Knowing more about how your customers consume the applications you build on the platform is a big part of customer success and your own. You should also check out the Usage Metrics feature if you have not already.

There is quite a bit more of Feature Management to explore, such as controlling feature activation exclusively from your side, which is useful for pilots or enabling paid features. Of special note is the ability to hide related Custom Objects until features are activated. This is certainly welcome for your customers' admins when working with large packages, since they only see objects relating to activated features in Object Manager.

Finally, I recommend you read the excellent documentation in the ISVforce Guide very carefully before going ahead and trying this out in your main package, since this feature integrates with your live production data. The documentation recommends trying this out in a temporary package first and discussing rollout with your legal team.

P.S. You can also use the FeatureManagement.checkPermission method to check if the user has been assigned a given custom permission without SOQL. Very useful!
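
For example (assuming a custom permission with the developer name Manage_Features, per the sample app above):-

// Check a custom permission assigned to the current user, no SOQL required
Boolean canManageFeatures = FeatureManagement.checkPermission('Manage_Features');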



Highlights from TrailheaDX 2017

This was my first TrailheaDX and what an event it was! With my Field Guide in hand I set out into the wilderness! In this blog I'll share some of my highlights, thoughts and links to the latest resources. Many of the newly announced things you can actually get your hands on now, which is amazing!

Overall the event felt well organized, if a little frantic at times. With smaller sessions of 30 minutes each (often 20 minutes after intros and questions), each was bite-sized, but quite well tuned, with demos and code samples being shown.

SalesforceDX. Salesforce announced the public beta of this new technology aimed at improving the developer experience on the platform. SalesforceDX consists of several modules that will be extended further over time. Salesforce has done a great job of providing a wealth of Trailhead resources to get you started.

Einstein. Since its announcement, I and other developers have been waiting to get access to more custom tools and APIs; well, that wait is now well and truly over. As per my previous blogs, we have had the Einstein Vision API for a little while now. Announced at the event were no less than three other new Einstein-related tools and APIs.

  • Einstein Discovery. Salesforce demonstrated a very slick looking tool that allows you to point and click your way through analyzing various data sets, including those obtained from your custom objects! They have provided a Trailhead module on it here and I plan on digging in! Pricing and further info is here.
  • Einstein Sentiment API. Allows you to interpret text strings for terms that indicate whether it is a positive, neutral or negative statement/sentiment. This can be used to scan case comments, forum posts, feedback, Twitter posts, etc. in an automated way and respond or be alerted accordingly to what is being said.
  • Einstein Intent API. Allows you to interpret text strings for meanings, such as instructions or requests, routing case comments or even implementing bots that can help automate or propose actions to be taken without human interpretation.
  • Einstein Object Detection API. An extension of the Einstein Vision API that allows individual items in a picture to be identified. For example, a pile of items on a coffee table, such as a mug, magazine, laptop or pot plant! Each can then be recognized and classified to build up more intel on what's in the picture for further processing and analysis.
  • PredictionIO on Heroku. Finally, if you want to go below the declarative tools or the intentionally simplified Einstein APIs, you can build your own machine learning models using Heroku and the PredictionIO buildpack!

Platform Events. These allow messages to be published and subscribed to using a new kind of object known as an Event object, suffixed with __e. Once created, you can use new Apex or REST APIs to send messages, and use either Apex Triggers or the Streaming API to receive them. There is also a new Process Builder action and Flow element to send messages. You can keep your messages within Force.com or use them to integrate with other cloud platforms or the browser. The possibilities are quite endless here: async processing, inter-app comms, logging, UI notifications… I'm sure myself and other bloggers will be exploring them in the coming months!
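
As a minimal sketch (the Order_Shipped__e event object and its Status__c field here are hypothetical; EventBus.publish is the standard Apex API), publishing an event message looks like this:-

// Publish a platform event message; subscribers receive it via an
// Apex Trigger on Order_Shipped__e (after insert) or the Streaming API
Order_Shipped__e evt = new Order_Shipped__e(Status__c = 'Shipped');
Database.SaveResult result = EventBus.publish(evt);
System.debug('Published: ' + result.isSuccess());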

External Services. If you find a cool API you want to consume in Force.com, you currently have to write some code. No longer! If you have a schema that describes that API, you can use the External Services wizard to generate some Apex code that will call out to the API. What's special about this is that the generated Apex code is accessible via Process Builder and Flow, making clicks-not-code API integration possible. It's an excellent way to integrate with complementary services you or others might develop on platforms such as Heroku. I have just submitted a session to Dreamforce 2017 to explore this further, fingers crossed it gets selected! You can read more about it here in the meantime.

Sadly I did have to make a key decision to focus on the above topics and not so much on Lightning. I heard from other attendees that these sessions were excellent, and I did catch a brief sight of dynamic component rendering in Lightning App Builder, very cool!

Thanks Salesforce for filling my blog backlog for the next year or so! 😉

 

 



Packaging and Installing Rollups

Since v2.0 of the Declarative Rollup Tool, it is now possible to use Custom Metadata to store rollup configurations. Not only does this make it easy to transfer those you have set up in Sandbox via Change Sets, you can also use Salesforce packaging to capture those you have created and install them over and over into other orgs, like a reusable Change Set!

[Screenshot: installing the rollups package]

Create and set up your Packaging Org

To do this, you need a Developer Edition org from Salesforce; this will act as your master org for your common rollup definitions. Here you can, as a minimum, install the rollup tool package itself, then also create any common fields or objects used by your rollups if you wish. You can also install any other packages, such as the NPSP packages. Basically, prepare and test everything here as you would normally in Sandbox or Production.

[Screenshot: Developer Edition signup]

Creating your Package

Under the Setup menu, navigate to Create and then Packages. Create a new Package with an appropriate name, then click Add to add components. Components are anything from Custom Objects, Fields, Apex Triggers and Apex Tests to the Rollup definitions themselves. As a minimum you will need to add the following components for rollups:

  • Apex Classes for the rollup(s) (names starting with dlrs)
  • Apex Triggers for the rollup(s) (names starting with dlrs)
  • Lookup Rollup Summary definition(s)

The platform adds a few other dependent things as you can see in the screenshot below, but don’t worry about these. You should have something like this…

[Screenshot: package components]

Tip: After creating your rollups, go to Setup and then Custom Metadata, find your rollup definition, edit it, locate the Protected checkbox, and hit Save (at present this checkbox is not shown in the custom UI for rollups). This checkbox prevents users or other admins from changing your rollup definition once installed. You also need to use the managed package route for this, as described below.

Choosing Unmanaged or Managed Packages

This choice depends on whether you want to manage several updates/releases to your rollups over time, adding or updating as you improve things. An unmanaged package is basically just a template of your rollups: once installed, that's it, they are no longer linked to your package. If however you want to install updates to them and/or stop people from editing your rollups (see protected mode above), you want a managed package.

Setting a namespace for your Managed Package

If you decided you only want an unmanaged package, skip this bit. Otherwise, on the Packages page (under Setup, then Create), click the Edit button and follow the prompts to enter a namespace. This is a short mnemonic that describes your package, like a unique ID; once set it cannot be changed. Next find your Package detail, click Edit, and select the Managed checkbox before hitting Save.

Uploading your Rollup Package!

[Screenshot: packages list]

This process basically builds your installer by placing a copy of what you have done on the AppExchange servers, albeit as a private listing. This gives you an install link to use over and over as much as you like. Install it in Sandbox and Production as needed.

Click on your package and then the Upload button, and give it a name and version of your choosing. Be sure to select the Release mode; in most cases for this kind of packaged content, worrying about Beta vs. Release mode is not a concern. Press Upload and wait.

[Screenshot: package upload]

[Screenshot: uploaded package summary]

Rinse and Repeat?

Keep in mind that if you went with the managed package route, you can go back to your packaging org, make some improvements, add more rollups, etc., then repeat the upload process to get a new version of your package. Then simply go to any existing org with your prior release installed and upgrade.

NOTE: To ensure your rollup definitions are upgraded, they must be marked as protected. If you don't care about this and only want new rollups to be added, retaining subscriber (installed org) changes, don't worry about protected rollups.

 

 



Creating, Assigning and Checking Custom Permissions

I have been wanting to explore Custom Permissions for a little while now; since they are now GA in Winter '15, I thought it's about time I got stuck in. Profiles and Permission Sets have, until recently, focused on granting permissions to entities known only to the platform, such as objects, fields, Apex classes and Visualforce pages. In most cases these platform entities map to a specific feature in your application you want to provide permission to access.

However, there are cases where this is not so simple. For example, consider a Visualforce page that controls a given process in your application; it has three buttons on it: Run, Clear History and Reset. You can control access to the page itself, but how do you control access to the three buttons? What you need is a way to teach Permission Sets and Profiles about your application's functionality. Enter Custom Permissions!

[Screenshots: defining Custom Permissions]

NOTE: You can also define dependencies between your Custom Permissions; for example, the Clear History and Reset permissions might be dependent on a Manage Important Process custom permission in your package.

Once these have been created, you can reference them in your packaged Permission Sets, and since they are packaged themselves, they can also be referenced by admins managing your application in a subscriber org.

[Screenshot: setting Custom Permissions on a Permission Set]

The next step is to make your code react to these custom permissions being assigned or not.

New Global Variable $Permission

You can use $Permission from a Visualforce page or, as SFDCWizard points out here, from Validation Rules! Here is the Visualforce page example given by Salesforce in their documentation.

<apex:pageBlock rendered="{!$Permission.canSeeExecutiveData}">
   <!-- Executive Data Here -->
</apex:pageBlock>

Referencing Custom Permissions from Apex

IMPORTANT UPDATE: Since API 41 (Winter '18) there is now a native way to read Custom Permissions: FeatureManagement.checkPermission. The following may still be useful if you have requirements not met by the native method.
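
For example, a minimal use of the native method (with the Reset custom permission from the example above):-

// Native check (API 41+), replacing the SOQL-based approach below
Boolean canReset = FeatureManagement.checkPermission('Reset');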

In the case of object and field level permissions, the Apex Describe API can be used to determine if an object or field is available and for what purpose, read or edit for example. This is not going to help us here, as custom permissions are not related to any specific object or field. The solution is to leverage the Permission Set Object API to query the SetupEntityAccess and CustomPermission records for Permission Sets or Profiles that are assigned to the current user.

The following SOQL snippets are from the CustomPermissionsReader class I created to help with reading Custom Permissions in Apex (more on this later). As you can see, you need to run two SOQL statements to get what you need: the first to get the Ids, the second to query whether the user has actually been assigned a Permission Set containing them.


List<CustomPermission> customPermissions =
    [SELECT Id, DeveloperName
       FROM CustomPermission
       WHERE NamespacePrefix = :namespacePrefix];

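// customPermissionNamesById is a Map<Id, CustomPermission> built from
// the results of the first query above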
List<SetupEntityAccess> setupEntities =
    [SELECT SetupEntityId
       FROM SetupEntityAccess
       WHERE SetupEntityId in :customPermissionNamesById.keySet() AND
             ParentId IN (SELECT PermissionSetId
                FROM PermissionSetAssignment
                WHERE AssigneeId = :UserInfo.getUserId())];

Now, personally I don't find this approach that appealing for general use: firstly, the Permission Set object relationships are quite hard to get your head around; secondly, we get charged by the platform, through the SOQL governor, to determine security. As a good member of the Salesforce community I of course turned my dislike into an Idea, "Native Apex support for Custom Permissions", and posted it here to recommend Salesforce include a native class for reading these, similar to Custom Labels for example.

Introducing CustomPermissionsReader

In the meantime I have set about creating an Apex class to help make querying and using Custom Permissions easier. Such a class might one day be replaced if my Idea becomes a reality, or maybe its internal implementation will just get improved. One thing's for sure: I'd much rather use it for now than seed implicit SOQL queries throughout a code base!

It's pretty straightforward to use. Construct it in one of two ways: with the default constructor if you want all non-namespaced Custom Permissions, or, if you are developing an AppExchange package, give it any one of your packaged Custom Objects and it will ensure that it only ever reads the Custom Permissions associated with your package.

You can download the code and tests for CustomPermissionsReader here.


// Default constructor scope is all Custom Permissions in the default namespace
CustomPermissionsReader cpr = new CustomPermissionsReader();
Boolean hasPermissionForReset = cpr.hasPermission('Reset');

// Alternative constructor scope is Custom Permissions that share the
//   same namespace as the custom object
CustomPermissionsReader cpr = new CustomPermissionsReader(MyPackagedObject.SObjectType);
Boolean hasPermissionForReset = cpr.hasPermission('Reset');

Like any use of SOQL, we must think in a bulkified way; indeed, it's likely that for average to complex pieces of functionality you may want to check at least two or more custom permissions once you get started with them. As such, it's not really good practice to make single queries in each case.

For this reason the CustomPermissionsReader was written to load all applicable Custom Permissions and act as a kind of cache. In the next example you'll see how I've leveraged the Application class concept from the Apex Enterprise Patterns conventions to make it a singleton for the duration of the Apex execution context.

Here is an example of an Apex test that creates a PermissionSet, adds the Custom Permission and assigns it to the running user to confirm the Custom Permission was granted.

	@IsTest
	private static void testCustomPermissionAssigned() {

		// Create PermissionSet with Custom Permission and assign to test user
		PermissionSet ps = new PermissionSet();
		ps.Name = 'Test';
		ps.Label = 'Test';
		insert ps;
		SetupEntityAccess sea = new SetupEntityAccess();
		sea.ParentId = ps.Id;
		sea.SetupEntityId = [select Id from CustomPermission where DeveloperName = 'Reset'][0].Id;
		insert sea;
		PermissionSetAssignment psa = new PermissionSetAssignment();
		psa.AssigneeId = UserInfo.getUserId();
		psa.PermissionSetId = ps.Id;
		insert psa;

		// Create reader
		CustomPermissionsReader cpr = new CustomPermissionsReader();

		// Assert the CustomPermissionsReader confirms custom permission assigned
		System.assertEquals(true, cpr.hasPermission('Reset'));
	}

Separation of Concerns and Custom Permissions

Those of you familiar with using Apex Enterprise Patterns might be wondering where checking Custom Permissions fits in terms of separation of concerns and the layers the patterns promote.

The answer is, at the very least, in or below the Service layer. Enforcing any kind of security is the responsibility of the Service layer, and callers of it are within their rights to assume it is checked, especially if you have chosen to expose your Service layer as your application API.

This doesn't mean, however, that you cannot improve your user experience by using it from within Apex Controllers, Visualforce pages or @RemoteAction methods to control the visibility of related UI components; no point in teasing the end user!
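
As a minimal sketch of enforcing a Custom Permission within the Service layer (the WidgetsService class and its exception are hypothetical; the hasPermission call is the CustomPermissionsReader method shown earlier):

public class WidgetsService {

	public class WidgetsServiceException extends Exception {}

	public static void reset() {
		// Enforce security here; callers of the Service layer can
		// then safely assume the check has been made
		if (!new CustomPermissionsReader().hasPermission('Reset')) {
			throw new WidgetsServiceException('You do not have permission to Reset.');
		}
		// ... proceed with the reset logic
	}
}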

Integrating CustomPermissionsReader into your Application class

The following code uses the Application class concept I introduced last year and at Dreamforce 2014, which is a single place to access your application scope concepts, such as factories for selectors, domain and service class implementations (it also has a big role to play when mocking).

public class Application {

	/**
	 * Exposes a typed representation of the Application's Custom Permissions
	 **/
	public static final PermissionsFactory Permissions = new PermissionsFactory();

	/**
	 * Class provides a typed representation of an Application's Custom Permissions
	 **/
	public class PermissionsFactory extends CustomPermissionsReader
	{
		public Boolean Reset { get { return hasPermission('Reset'); } }
	}
}

This approach ensures there is only one instance of the CustomPermissionsReader per Apex execution context and, through the properties it exposes, gives a compiler-checked way of referencing the Custom Permissions, making it easier for application developers to access them in code.

if(Application.Permissions.Reset)
{
  // Do something to do with Reset...
}

Finally, as a future possibility, this approach gives a nice injection point for mocking the status of Custom Permissions in your Apex Unit tests, rather than having to go through the trouble of setting up a Permission Set and assigning it in your test code every time, as shown above.

Call to Action: Ideas to Upvote

While writing this blog I created one Idea and came across two others I'd like to call you, the reader, to action on! Please take a look and, of course, only if you agree it's a good one, give it the benefit of your much-needed upvote!

 



Permission Sets and Packaging

Permission Sets have been giving me, and an unlucky few who are unable to install my Declarative Rollup Summary Tool, some headaches since I introduced them. Thankfully I have recently discovered why, and thought it worth sharing along with some other best practices I have since adopted around packaging Permission Sets…

Package install failures due to Permission Sets

After including Permission Sets in my package, I found that some users were failing to install the package; they received, via email, the following type of error (the numbers in brackets tended to differ).

Your requested install failed. Please try this again.
None of the data or setup information in your salesforce.com organization should have been 
 affected by this error.
If this error persists, contact salesforce.com Support through your normal channels
 and reference number: 604940119-57161 (929381962)

It was frustrating that I could not help them decode this message; it required them to raise a Salesforce support case and pass their org ID and error codes on the case. For those that did this, it became apparent after several instances that a theme was developing: a package platform feature dependency!

Normally such things surface in the install UI for the user to review and resolve. Dependencies also appear when you click the View Dependencies button prior to uploading your package. However it seems that Salesforce is currently not quite so smart in the area of Permission Sets and dependency management, as these are hidden from both!

Basically, I had inadvertently packaged dependencies on features like Ideas and the Streaming API by including my Permission Sets, in one of which I had chosen to enable the Author Apex permission. This system permission seems to enable a lot of other permissions, as well as enabling other object permissions (including optional features like Ideas, which happened to be enabled in my packaging org). Then, during package install, if for historic reasons the subscriber org didn't have these features enabled, the rather unhelpful error above occurs!

Scrubbing my Permission Sets

I decided enough was enough and wanted to eradicate every Salesforce standard permission from my packaged Permission Sets. This was harder than I thought, since simply editing the .permissionset file and uploading it didn't remove anything (this approach is only suitable for changing or adding permissions within a Permission Set).

Next, having removed entries from the .permissionset file, I went into the Permission Set UI within the packaging org and set about un-ticking everything not related to my objects, classes and pages. This soon became quite an undertaking, and I wanted to be sure. I then recalled that Permission Sets are cool because they have an Object API!

I started to use the following queries in the Developer Console to view, and delete en masse, anything I didn't want, until I was sure I had removed all traces of the potentially dangerous and hidden dependencies that were blocking my package installs. I obtained the Permission Set ID below from the URL when browsing it in the UI.

select Id, SobjectType, PermissionsCreate, PermissionsEdit, PermissionsDelete, PermissionsRead
  from ObjectPermissions where ParentId = '0PSb0000000HT5A'
select Id, SobjectType, Field
  from FieldPermissions where ParentId = '0PSb0000000HT5A'
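
Once you are confident in what the queries return, a sketch of the corresponding en-masse delete (run as anonymous Apex in the Developer Console; the Permission Set Id below is the same placeholder as above) might look like this:

// Remove all object permissions from the permission set, leaving only
// what you deliberately re-add before packaging
delete [SELECT Id FROM ObjectPermissions WHERE ParentId = '0PSb0000000HT5A'];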

[Screenshot: checking object permissions]

Note that System Permissions (such as Author Apex), which are shown in the UI, currently don't have an Object API, so you cannot validate them via SOQL. However, at least these are listed on a single page, so they are easier to check visually.

Lessons learned so far…

So here are a few lessons I've learned to date around using Permission Sets…

  • Don't enable System Permissions unless you're 100% sure what other permissions they enable!
  • When editing your .permissionset files, don't assume entries you remove will be removed from the org.
  • Use SOQL to query your Permission Sets to check they are exactly what you want before uploading your package.
  • Don't base your Permission Sets on a specific User License type (this is covered in the Salesforce docs, but I thought it worth a reminder here as it cannot be undone).
  • Do design your Permission Sets based on feature and function and not on user roles (which are less likely to match your perception of the role from one subscriber to another). Doing the former will reduce the chances of admins cloning your Permission Sets and losing out on them being upgraded in the future as your package evolves.