Andy in the Cloud

From BBC Basic to Force.com and beyond…



Creating, Assigning and Checking Custom Permissions

I have been wanting to explore Custom Permissions for a little while now; since they went GA in Winter'15, I thought it was about time I got stuck in. Profiles and Permission Sets have, until recently, focused on granting permissions to entities known only to the platform, such as objects, fields, Apex classes and Visualforce pages. In most cases these platform entities map to a specific feature in your application you want to grant access to.

However, there are cases where this is not so simple. For example, consider a Visualforce page that controls a given process in your application and has three buttons on it: Run, Clear History and Reset. You can control access to the page itself, but how do you control access to the three buttons? What you need is a way to teach Permission Sets and Profiles about your application's functionality. Enter Custom Permissions!

[Screenshots: defining Custom Permissions under Setup]

NOTE: You can also define dependencies between your Custom Permissions; for example, the Clear History and Reset permissions might be dependent on a Manage Important Process custom permission in your package.
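If you prefer managing these in source, a Custom Permission can also be expressed as Metadata API XML. Below is a hedged sketch of a .customPermission file expressing such a dependency; the requiredPermission element names are my best understanding of the CustomPermission metadata type, so double check them against the Metadata API reference.

<?xml version="1.0" encoding="UTF-8"?>
<!-- ClearHistory.customPermission (hypothetical example) -->
<CustomPermission xmlns="http://soap.sforce.com/2006/04/metadata">
    <label>Clear History</label>
    <description>Grants access to the Clear History button.</description>
    <requiredPermission>
        <customPermission>ManageImportantProcess</customPermission>
        <dependency>true</dependency>
    </requiredPermission>
</CustomPermission>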

Once these have been created, you can reference them in your packaged Permission Sets, and since they are packaged themselves they can also be referenced by admins managing your application in a subscriber org.

[Screenshot: Custom Permissions enabled within a packaged Permission Set]

The next step is to make your code react to these custom permissions being assigned or not.

New Global Variable $Permission

You can use the $Permission global variable from a Visualforce page or, as SFDCWizard points out here, from Validation Rules! Here is the Visualforce page example given by Salesforce in their documentation.

<apex:pageBlock rendered="{!$Permission.canSeeExecutiveData}">
   <!-- Executive Data Here -->
</apex:pageBlock>
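For the Validation Rule case, here is a hedged sketch along the same lines; ManageImportantProcess is a hypothetical custom permission, and the rule blocks edits to the Amount on won Opportunities for users without it.

AND(
    ISCHANGED( Amount ),
    ISPICKVAL( StageName, "Closed Won" ),
    NOT( $Permission.ManageImportantProcess )
)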

Referencing Custom Permissions from Apex

In the case of object and field level permissions, the Apex Describe API can be used to determine if an object or field is available and for what purpose, read or edit for example. This is not going to help us here, as custom permissions are not related to any specific object or field. The solution is to leverage the Permission Set Object API to query the SetupEntityAccess and CustomPermission records for Permission Sets or Profiles assigned to the current user.
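For comparison, this minimal sketch shows what the Describe API gives you for objects and fields; there is no equivalent method for custom permissions.

// Object and field level access checks via the Apex Describe API
Boolean canReadAccounts = Schema.SObjectType.Account.isAccessible();
Boolean canEditAccountName = Schema.SObjectType.Account.fields.Name.isUpdateable();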

The following SOQL snippets are from the CustomPermissionsReader class I created to help with reading Custom Permissions in Apex (more on this later). As you can see, you need to run two SOQL statements to get what you need: the first retrieves the Ids, the second queries whether the user has actually been assigned a Permission Set containing them.


List<CustomPermission> customPermissions =
    [SELECT Id, DeveloperName
       FROM CustomPermission
       WHERE NamespacePrefix = :namespacePrefix];

List<SetupEntityAccess> setupEntities =
    [SELECT SetupEntityId
       FROM SetupEntityAccess
       WHERE SetupEntityId in :customPermissionNamesById.keySet() AND
             ParentId IN (SELECT PermissionSetId
                FROM PermissionSetAssignment
                WHERE AssigneeId = :UserInfo.getUserId())];
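
To make the pattern concrete, here is a minimal sketch that collapses the two statements above into a single check for one permission (unlike the full CustomPermissionsReader, this does not cache, so treat it as illustrative only).

// Minimal sketch: true if the running user has the given Custom Permission
// via any assigned Permission Set (or their Profile's underlying permission set)
public static Boolean hasCustomPermission(String developerName, String namespacePrefix) {
    // Resolve the Custom Permission Id from its developer name
    List<CustomPermission> perms =
        [SELECT Id FROM CustomPermission
           WHERE DeveloperName = :developerName
             AND NamespacePrefix = :namespacePrefix];
    if (perms.isEmpty()) {
        return false;
    }
    // Any SetupEntityAccess records reachable from the user's assignments?
    return [SELECT COUNT() FROM SetupEntityAccess
             WHERE SetupEntityId = :perms[0].Id
               AND ParentId IN (SELECT PermissionSetId
                                  FROM PermissionSetAssignment
                                 WHERE AssigneeId = :UserInfo.getUserId())] > 0;
}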

Now, personally I don't find this approach that appealing for general use: firstly, the Permission Set object relationships are quite hard to get your head around; secondly, we get charged via the SOQL governor to determine security. As a good member of the Salesforce community, I of course turned my dislike into an Idea, "Native Apex support for Custom Permissions", and posted it here to recommend Salesforce include a native class for reading these, similar to Custom Labels for example.

Introducing CustomPermissionsReader

In the meantime I have set about creating an Apex class to make querying and using Custom Permissions easier. Such a class might one day be replaced if my Idea becomes a reality, or maybe its internal implementation just gets improved. One thing's for sure, I'd much rather use it for now than seed implicit SOQL queries throughout a code base!

It's pretty straightforward to use: construct it in one of two ways, depending on whether you want all non-namespaced Custom Permissions or, if you're developing an AppExchange package, only those belonging to your package. In the latter case, give it any one of your packaged Custom Objects and it will ensure it only ever reads the Custom Permissions associated with your package.

You can download the code and test class for CustomPermissionsReader here.


// Default constructor scope is all Custom Permissions in the default namespace
CustomPermissionsReader cpr = new CustomPermissionsReader();
Boolean hasPermissionForReset = cpr.hasPermission('Reset');

// Alternative constructor scope is Custom Permissions that share the
//   same namespace as the custom object
CustomPermissionsReader cpr = new CustomPermissionsReader(MyPackagedObject.SObjectType);
Boolean hasPermissionForReset = cpr.hasPermission('Reset');

Like any use of SOQL, we must think in a bulkified way; indeed, it's likely that for average to complex pieces of functionality you may want to check two or more custom permissions once you get started with them. As such, it's not really good practice to make single queries in each case.

For this reason the CustomPermissionsReader was written to load all applicable Custom Permissions and act as a kind of cache. In the next example you'll see how I've leveraged the Application class concept from the Apex Enterprise Patterns conventions to make it a singleton for the duration of the Apex execution context.

Here is an example of an Apex test that creates a PermissionSet, adds the Custom Permission and assigns it to the running user to confirm the Custom Permission was granted.

	@IsTest
	private static void testCustomPermissionAssigned() {

		// Create PermissionSet with Custom Permission and assign to test user
		PermissionSet ps = new PermissionSet();
		ps.Name = 'Test';
		ps.Label = 'Test';
		insert ps;
		SetupEntityAccess sea = new SetupEntityAccess();
		sea.ParentId = ps.Id;
		sea.SetupEntityId = [select Id from CustomPermission where DeveloperName = 'Reset'][0].Id;
		insert sea;
		PermissionSetAssignment psa = new PermissionSetAssignment();
		psa.AssigneeId = UserInfo.getUserId();
		psa.PermissionSetId = ps.Id;
		insert psa;

		// Create reader
		CustomPermissionsReader cpr = new CustomPermissionsReader();

		// Assert the CustomPermissionsReader confirms custom permission assigned
		System.assertEquals(true, cpr.hasPermission('Reset'));
	}

Separation of Concerns and Custom Permissions

Those of you familiar with using Apex Enterprise Patterns might be wondering where checking Custom Permission fits in terms of separation of concerns and the layers the patterns promote.

The answer is, at the very least, in or below the Service layer. Enforcing any kind of security is the responsibility of the Service layer, and callers of it are within their rights to assume it is checked, especially if you have chosen to expose your Service layer as your application API.
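As a sketch of what this looks like in practice (the class and method names here are hypothetical):

public with sharing class ImportantProcessService {

	public class UnauthorizedException extends Exception {}

	public static void reset() {
		// Callers of the Service layer can assume security has been enforced
		if (!new CustomPermissionsReader().hasPermission('Reset')) {
			throw new UnauthorizedException('You do not have permission to perform a Reset.');
		}
		// ... perform the Reset processing ...
	}
}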

This doesn't mean, however, that you cannot improve your user experience by also using it from within Apex Controllers, Visualforce pages or @RemoteAction methods to control the visibility of related UI components. No point in teasing the end user!

Integrating CustomPermissionsReader into your Application class

The following code uses the Application class concept I introduced last year and at Dreamforce 2014, which is a single place to access your application scope concepts, such as factories for selectors, domain and service class implementations (it also has a big role to play when mocking).

public class Application {

	/**
	 * Exposes a typed representation of the Application's Custom Permissions
	 **/
	public static final PermissionsFactory Permissions = new PermissionsFactory();

	/**
	 * Provides a typed representation of an Application's Custom Permissions
	 **/
	public class PermissionsFactory extends CustomPermissionsReader
	{
		public Boolean Reset { get { return hasPermission('Reset'); } }
	}
}

This approach ensures there is only one instance of the CustomPermissionsReader per Apex execution context and, through the properties it exposes, gives a compiler-checked way of referencing the Custom Permissions, making it easier for application developers to access them.

if(Application.Permissions.Reset)
{
  // Do something to do with Reset...
}

Finally, as a future possibility, this approach gives a nice injection point for mocking the status of Custom Permissions in your Apex Unit tests, rather than having to go through the trouble of setting up a Permission Set and assigning it in your test code every time as shown above.
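For illustration only, such an injection point might look like this, where setMock is a hypothetical method (not currently implemented) that short-circuits the SOQL-based lookup:

	@IsTest
	private static void testCustomPermissionMocked() {

		// Hypothetical: report the permission as granted, avoiding any
		// PermissionSet, SetupEntityAccess or PermissionSetAssignment DML
		Application.Permissions.setMock('Reset', true);

		// Code under test now sees the permission as assigned
		System.assertEquals(true, Application.Permissions.Reset);
	}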

Call to Action: Ideas to Upvote

While writing this blog I created one Idea and came across two others I'd like to call you, the reader, to action on! Please take a look and, of course, only if you agree it's a good one, give it the benefit of your much-needed upvote!



Introducing the LittleBits Connector for Salesforce

As those of you who follow my brickinthecloud.com blog will know, I love using APIs in the cloud to connect not only applications, but devices. Salesforce themselves also share this passion; just take a look at their Internet of Things page to see how it can improve your business, and the work Reid Carlberg is doing.

When Salesforce sent me a LittleBits Cloud Starter Kit as a Christmas present, I once again set about connecting it to my favourite cloud platform! This blog introduces two new GitHub repos and a brand new installable package that allow you to take full advantage of the snap-not-solder approach of LittleBits electronics with the Salesforce clicks-not-code design model! So if you're not an electronics whiz or a coder, you really don't have any excuses for not getting involved! LittleBits provides over 60 snap-together components for building anything from automated fish feeders to home security systems, and practically anything else you can imagine!

The heart of the kit is a small computer module, powered by a USB cable (I plugged mine into my external phone battery pack!). It boots from an SD card and uses an onboard USB WiFi adapter to connect itself to the internet (once you've connected it to your WiFi). After that, you send commands to its connected outputs via a mobile site or the set of LittleBits Cloud APIs provided. So far I have focused on sending commands to the outputs (in my case I connected the servo motor); however, as I write this I'm teaming up with Cory Cowgill, who has also started working with his kit from an inputs perspective (e.g. pressing a button on the device).

Everyone in the Salesforce MVP community was lucky enough to get one of these kits, and I wanted to make sure everyone could experience it with the cloud platform we love so much! Sadly, right now the clicks-not-code solution IFTTT (If-This-Then-That) used for controlling LittleBits devices does not fully support Salesforce (there is only a Salesforce Chatter plugin). Borrowing an approach I've been using for my Declarative Rollup Summary Tool, I set about building a declarative tool that allows a Salesforce admin to connect updates on any standard or custom object record to a LittleBits device!

The result is the LittleBits Connector!

[Screenshot: a LittleBits Trigger record]

Once you have assembled and connected your LittleBits device, go to the Settings page under LittleBits Cloud Control and take note of your Access Token and Device ID. As shown in the screenshot above, enter these in the LittleBits Device section or in the LittleBits API custom setting.

The Trigger section needs only the Record ID of the record you want to monitor and have your device respond to when changes are made. Simply list the field API names (separated by commas) of the fields you want the tool to monitor. Next, fill in the LittleBits Output section with either literal values (on the left) and/or dynamic values driven by values from the record itself. This gives you quite a lot of flexibility; you can use Formula Fields, for example, to calculate the percentage.

Controlling a LittleBits cloud device is quite simple: define the duration of the output (how long to apply a voltage) and the amount of voltage as a percentage. Depending on the output module you've fitted, light or motor, the effects differ but the principle is the same. In the motor case, I set mine to Turn mode (see below); then, by applying a duration of 100,000 and a percentage, the motor turns to a specific point each time, making it ideal for building pointing devices!

With the help of my wife's crafting skills, we set about a joint Christmas project to build a pointing device that would show the Probability of a given Opportunity in real time (though the tool I ended up building can be used with any standard or custom object). I also wanted to use only the modules in the LittleBits Cloud Starter Kit. So with a Salesforce org and the tool, the Internet of Things is in your hands!

Here is a video of our creation in action…

If you want to have a go yourself follow these steps…

Building your own Opportunity Probability Indicator Device #clicksnotcode

If clicks are more your thing than coding, fear not and follow these simple steps!

  1. Purchase a LittleBits Cloud Connector and follow the onscreen instructions once you have created your LittleBits account here. Complete the tutorial to confirm it's connected.
  2. Build the device module configuration as shown in the picture below. On the servo module there is a tiny switch; use the small screwdriver provided to push it to the down position, putting the servo in “turn” mode.
    [Image: the assembled LittleBits device]
  3. Next the fun bit, construct your pointing device! I’d love to see tweets of everyones crafting skills!
    [Image: the crafted pointing device]
  4. Install the latest LittleBits Connector, either via the package install links or as code (see the GitHub README). The first time you go to the LittleBits Trigger tab you may be asked to complete the post-install step to configure the Metadata API needed by the tool to deploy the Apex Triggers; complete this step as instructed on screen.
  5. Click New on the LittleBits Trigger tab and complete the LittleBits Trigger record as described above, but of course using a record Id from an Opportunity record in your org, then click Save.
  6. Click the Manage Object Trigger button to automatically deploy a small Apex Trigger that passes updates made to the records on to the LittleBits Connector engine.
  7. In Salesforce, or Salesforce1 Mobile for that matter, update your Opportunity Stage (which updates the Probability). This results in an Apex Job which typically fires fairly promptly, and you should see your device respond! If you don't see anything change on your device, confirm it's working via the LittleBits Cloud Control test page, then go to the Setup menu and check the jobs are completing without error via the Apex Jobs page.

Using the LittleBits Cloud API from Apex

The above tool was built around an Apex wrapper I have started to build for the LittleBits Cloud API, which is a REST API. With a little more time, and help from fellow LittleBits fan Cory, we will update it to support not only controlling devices, but also allowing them to feed back into Salesforce. In the meantime, if you want to code your own solution directly, you can install the library here.

The code is quite simple for now; you can read more about it in the README file.

new LittleBits().getDevice().output(80, 10000);
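
If you are calling it from an Apex Trigger context, keep in mind that callouts cannot be made directly from triggers; a minimal sketch along these lines wraps the call in a @future method (the wrapper class name and parameter types are assumptions):

public class LittleBitsAsync {
	@future(callout=true)
	public static void outputToDevice(Integer percent, Integer durationMs) {
		// Applies the given voltage percentage for the given duration to the
		// device configured via the LittleBits API custom setting
		new LittleBits().getDevice().output(percent, durationMs);
	}
}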

What's next?

Well, I'm quite addicted to this new device; my Lego EV3 robot might be justified in feeling a little left out, but fear not, I'll find a way to combine them I'm sure! Next up for the LittleBits Connector is subscribing to output from the device back into Salesforce, possibly calling out to a headless Flow, to keep that clicks-not-code feel going!



Permission Sets and Packaging

Permission Sets have been giving me, and an unlucky few who are unable to install my Declarative Rollup Summary Tool, some headaches since I introduced them. Thankfully I've recently discovered why, and thought it worth sharing along with some other best practices I have since adopted around packaging permission sets…

Package install failures due to Permission Sets

After including Permission Sets in my package, I found that some users were failing to install it; they received, via email, the following type of error (the numbers in brackets tended to differ).

Your requested install failed. Please try this again.
None of the data or setup information in your salesforce.com organization should have been 
 affected by this error.
If this error persists, contact salesforce.com Support through your normal channels
 and reference number: 604940119-57161 (929381962)

It was frustrating that I could not help them decode this message; it required them to raise a Salesforce support case and pass their org ID and error codes on the case. For those that did this, it became apparent after several instances that a theme was developing: a package platform feature dependency!

Normally such things surface in the install UI for the user to review and resolve. Dependencies also appear when you click the View Dependencies button prior to uploading your package. However, it seems that Salesforce is currently not quite so smart in the area of Permission Sets and dependency management, as these are hidden from both!

Basically, by including my Permission Sets I had inadvertently packaged dependencies on features like Ideas and the Streaming API; on one of them I had chosen to enable the Author Apex permission. This system permission seems to enable a lot of other permissions, as well as enabling other object permissions (including optional features like Ideas, which happened to be enabled in my packaging org). Then, during package install, if for historic reasons the subscriber org didn't have these features enabled, the rather unhelpful error above occurs!

Scrubbing my Permission Sets

I decided enough was enough and wanted to eradicate every Salesforce standard permission from my packaged Permission Sets. This was harder than I thought, since simply editing the .permissionset file and uploading it didn't remove anything (this approach is only suitable for changing or adding permissions within a permission set).

Next, having removed entries from the .permissionset file, I went into the Permission Set UI within the packaging org and set about un-ticking everything not related to my objects, classes and pages. This soon became quite an undertaking and I wanted to be sure. I then recalled that Permission Sets are cool because they have an Object API!

I started to use the following queries in the Developer Console to view, and delete en masse, anything I didn't want; now I could be sure I had removed all traces of any potentially dangerous hidden dependencies that would block my package installs. I obtained the Permission Set Id below from the URL when browsing it in the UI.

select Id, SobjectType, PermissionsCreate, PermissionsEdit, PermissionsDelete, PermissionsRead
  from ObjectPermissions where ParentId = '0PSb0000000HT5A'
select Id, SobjectType, Field
  from FieldPermissions where ParentId = '0PSb0000000HT5A'
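
For the delete part, here is a hedged sketch of anonymous Apex you could run from the Developer Console to do the same; the Permission Set Id and the 'dlrs' namespace prefix are examples to adjust to your own package.

// Remove any object permissions on the Permission Set that do not relate
// to the package's own objects
List<ObjectPermissions> unwanted = new List<ObjectPermissions>();
for (ObjectPermissions op : [select Id, SobjectType
                               from ObjectPermissions
                              where ParentId = '0PSb0000000HT5A']) {
    if (!op.SobjectType.startsWith('dlrs__')) {
        unwanted.add(op);
    }
}
delete unwanted;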

[Screenshot: reviewing the ObjectPermissions query results in the Developer Console]

Note that System Permissions (such as Author Apex), which are shown in the UI, currently don't have an Object API, so you cannot validate them via SOQL. However, at least these are listed on a single page, so they are easier to check visually.

Lessons learned so far…

So here are a few lessons I've learned to date around using Permission Sets…

  • Don't enable System Permissions unless you're 100% sure what other permissions they enable!
  • When editing your .permissionset files, don't assume entries you remove will be removed from the org.
  • Use SOQL to query your Permission Sets to check they contain exactly what you want before uploading your package.
  • Don't base your Permission Sets on a specific User License type (this is covered in the Salesforce docs, but I thought it worth a reminder here as it cannot be undone).
  • Do design your Permission Sets based on feature and function, not on user roles (which are less likely to match your perception of the role from one subscriber to another). Doing the former reduces the chances of admins cloning your Permission Sets and losing out on them being upgraded as your package evolves.



Mocking SOQL sub-select query results

If you're a fan of TDD you'll hopefully have been following FinancialForce.com's latest open source contribution to the Salesforce community, known as ApexMocks. It provides a fantastic framework for writing true unit tests in Apex, allowing you to implement mock implementations of the classes used by the code you're testing.

The ability to construct the data structures returned by mock methods is critical. If it's a method performing a SOQL query, there has been an elusive challenge in the area of queries containing sub-selects. Take a look at the following test, which inserts and then queries records from the database.

	@IsTest
	private static void testWithDb()
	{
		// Create records
		Account acct = new Account(
			Name = 'Master #1');
		insert acct;
		List<Contact> contacts = new List<Contact> {
			new Contact (
				FirstName = 'Child',
				LastName = '#1',
				AccountId = acct.Id),
			new Contact (
				FirstName = 'Child',
				LastName = '#2',
				AccountId = acct.Id) };
		insert contacts;

		// Query records
		List<Account> accounts =
			[select Id, Name,
				(select Id, FirstName, LastName, AccountId from Contacts) from Account];

		// Assert result set
		assertRecords(acct.Id, contacts[0].Id, contacts[1].Id, accounts);
	}

	private static void assertRecords(Id parentId, Id childId1, Id childId2, List<Account> masters)
	{
		System.assertEquals(Account.SObjectType, masters.getSObjectType());
		System.assertEquals(Account.SObjectType, masters[0].getSObjectType());
		System.assertEquals(1, masters.size());
		System.assertEquals(parentId, masters[0].Id);
		System.assertEquals('Master #1', masters[0].Name);
		System.assertEquals(2, masters[0].Contacts.size());
		System.assertEquals(childId1, masters[0].Contacts[0].Id);
		System.assertEquals(parentId, masters[0].Contacts[0].AccountId);
		System.assertEquals('Child', masters[0].Contacts[0].FirstName);
		System.assertEquals('#1', masters[0].Contacts[0].LastName);
		System.assertEquals(childId2, masters[0].Contacts[1].Id);
		System.assertEquals(parentId, masters[0].Contacts[1].AccountId);
		System.assertEquals('Child', masters[0].Contacts[1].FirstName);
		System.assertEquals('#2', masters[0].Contacts[1].LastName);
	}

Now you may think you can mock the results of this query by simply constructing the required records in memory, but you'd be wrong! The following code fails to compile with a ‘Field is not writeable: Contacts‘ error on the last line.

		// Create records in memory
		Account acct = new Account(
			Id = Mock.Id.generate(Account.SObjectType),
			Name = 'Master #1');
		List<Contact> contacts = new List<Contact> {
			new Contact (
				Id = Mock.Id.generate(Contact.SObjectType),
				FirstName = 'Child',
				LastName = '#1',
				AccountId = acct.Id),
			new Contact (
				Id = Mock.Id.generate(Contact.SObjectType),
				FirstName = 'Child',
				LastName = '#2',
				AccountId = acct.Id) };
		acct.Contacts = contacts;

While Salesforce have gradually opened up write access to previously read-only fields, the most famous being Id, they have yet to enable the ability to set the value of a child relationship field. Paul Hardaker contacted me recently to ask if this problem had been resolved, as he had the very need described above: using the ApexMocks framework, he wanted to mock the return value of a Selector class method that makes a SOQL query with a sub-select.

Driven by an early workaround (I believe Chris Peterson found it) to the now historic inability to write to the Id field, I started to think about using the same approach to stitch together parent and child records using the JSON serialiser and deserialiser. Brace yourself, because it's not ideal, but it does work! And I've managed to wrap it in a helper method that can easily be adapted, or swept out if a better solution presents itself.

	@IsTest
	private static void testWithoutDb()
	{
		// Create records in memory
		Account acct = new Account(
			Id = Mock.Id.generate(Account.SObjectType),
			Name = 'Master #1');
		List<Contact> contacts = new List<Contact> {
			new Contact (
				Id = Mock.Id.generate(Contact.SObjectType),
				FirstName = 'Child',
				LastName = '#1',
				AccountId = acct.Id),
			new Contact (
				Id = Mock.Id.generate(Contact.SObjectType),
				FirstName = 'Child',
				LastName = '#2',
				AccountId = acct.Id) };

		// Mock query records
		List<Account> accounts = (List<Account>)
			Mock.makeRelationship(
				List<Account>.class,
				new List<Account> { acct },
				Contact.AccountId,
				new List<List<Contact>> { contacts });			

		// Assert result set
		assertRecords(acct.Id, contacts[0].Id, contacts[1].Id, accounts);
	}

NOTE: Credit should also go to Paul Hardaker for the Mock.Id.generate method implementation.

The Mock class is provided with this blog as a Gist, but I suspect it will find its way into ApexMocks at some point. The secret of this method is that it leverages the fact that we can, in a supported way, expect the platform to deserialise into memory the following JSON representation of the very database query result we want to mock.

[
    {
        "attributes": {
            "type": "Account",
            "url": "/services/data/v32.0/sobjects/Account/001G000001ipFLBIA2"
        },
        "Id": "001G000001ipFLBIA2",
        "Name": "Master #1",
        "Contacts": {
            "totalSize": 2,
            "done": true,
            "records": [
                {
                    "attributes": {
                        "type": "Contact",
                        "url": "/services/data/v32.0/sobjects/Contact/003G0000027O1UYIA0"
                    },
                    "Id": "003G0000027O1UYIA0",
                    "FirstName": "Child",
                    "LastName": "#1",
                    "AccountId": "001G000001ipFLBIA2"
                },
                {
                    "attributes": {
                        "type": "Contact",
                        "url": "/services/data/v32.0/sobjects/Contact/003G0000027O1UZIA0"
                    },
                    "Id": "003G0000027O1UZIA0",
                    "FirstName": "Child",
                    "LastName": "#2",
                    "AccountId": "001G000001ipFLBIA2"
                }
            ]
        }
    }
]

The Mock.makeRelationship method turns the parent and child lists into JSON and goes through some rather funky code I'm quite proud of to splice the two together, before deserialising it back into an SObject list, and voilà! It currently only supports a single sub-select, but can easily be extended to support more. Regardless of whether you use ApexMocks or not (though you really should try it), I hope this helps you write a few more unit tests than you've previously been able to.
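
To give a flavour of the splice, here is a simplified sketch hard-coded to the Account/Contacts case above (the real method derives the relationship name from the field describe, so treat this as illustrative rather than the Gist verbatim).

	// Deserialise the parents into untyped maps we are allowed to modify
	List<Account> parents = new List<Account> { acct };
	List<Object> parentMaps =
		(List<Object>) JSON.deserializeUntyped(JSON.serialize(parents));

	// Inject each child list in the QueryResult shape shown above
	for (Integer idx = 0; idx < parentMaps.size(); idx++) {
		Map<String, Object> parentMap = (Map<String, Object>) parentMaps[idx];
		parentMap.put('Contacts', new Map<String, Object> {
			'totalSize' => contacts.size(),
			'done' => true,
			'records' => JSON.deserializeUntyped(JSON.serialize(contacts)) });
	}

	// Deserialise the spliced structure back into a typed SObject list
	List<Account> accounts = (List<Account>)
		JSON.deserialize(JSON.serialize(parentMaps), List<Account>.class);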

 



Calling Salesforce API’s from Ant Script – Querying Records

Back in June last year I wrote a blog entitled Look ma, no hands!, whose main focus was how to leverage the then new ability to install and uninstall packages via the Metadata API. However, there was another goal: I wanted to invoke Salesforce APIs using only native Ant script and 100% Java-based Apache Ant tasks, with no Java coding or native curl executable invocations, making the resulting script platform neutral and easier to manage.

In this blog I'd like to talk a little more about how it was done and highlight the excellent <http> Ant task from Missing Link (so named since, surprisingly, Ant has yet to provide a core task for HTTP comms). In addition, I wanted to share how I was recently able to extend this approach while working with one of FinancialForce.com's up-and-coming DevOps team members, Brad Slater (also see the Object Model Tool).

The goal once again was keeping it 100% Ant, this time invoking the Salesforce REST API to perform queries.

 		<!-- Query -->
 		<runQuery 
 			sessionId="${sessionId}" 
 			serverUrl="${serverUrl}" 
 			queryResult="accounts"
 			query="SELECT Id, Name FROM Account LIMIT 1"/>

As before, the new Ant tasks are defined in a single XML file, ant-salesforce.xml; you can download the updated version with the new <runQuery> task and easily <import> it into your own Ant scripts.

Ant provides an excellent way to encapsulate complex script in components it calls Tasks. You can implement these in Java or in Ant script itself, using the <macrodef> Ant task. The following shows how the Salesforce <login> task was built for last year's blog; you can see both the <http> and <xmltask> tasks in action.

	<!-- Login into Salesforce and return the session Id and serverUrl -->
	<macrodef name="login">
		<attribute name="username" description="Salesforce user name."/>
		<attribute name="password" description="Salesforce password."/>
		<attribute name="serverurl" description="Server Url property."/>
		<attribute name="sessionId" description="Session Id property."/>
		<sequential>
			<!-- Obtain Session Id via Login SOAP service -->
		    <http url="https://login.salesforce.com/services/Soap/c/29.0" method="POST" failonunexpected="false" entityProperty="loginResponse" statusProperty="loginResponseStatus">
		    	<headers>
		    		<header name="Content-Type" value="text/xml"/>
		    		<header name="SOAPAction" value="login"/>
		    	</headers>
		    	<entity>
		    		<![CDATA[
				    	<env:Envelope xmlns:xsd='http://www.w3.org/2001/XMLSchema' xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance' xmlns:env='http://schemas.xmlsoap.org/soap/envelope/'>
				    	    <env:Body>
				    	        <sf:login xmlns:sf='urn:enterprise.soap.sforce.com'>
				    	            <sf:username>@{username}</sf:username>
				    	            <sf:password>@{password}</sf:password>
				    	        </sf:login>
				    	    </env:Body>
				    	</env:Envelope>
		    		]]>
		    	</entity>
		    </http>
			<!-- Parse response -->
			<xmltask destbuffer="loginResponseBuffer">
				<insert path="/">${loginResponse}</insert>
			</xmltask>
			<if>
				<!-- Success? -->
				<equals arg1="${loginResponseStatus}" arg2="200"/>
				<then>
					<!-- Parse sessionId and serverUrl -->
					<xmltask sourcebuffer="loginResponseBuffer" failWithoutMatch="true">
						<copy path="/*[local-name()='Envelope']/*[local-name()='Body']/:loginResponse/:result/:sessionId/text()" property="@{sessionId}"/>
						<copy path="/*[local-name()='Envelope']/*[local-name()='Body']/:loginResponse/:result/:serverUrl/text()" property="@{serverUrl}"/>
					</xmltask>
				</then>
				<else>
					<!-- Parse login error message and fail build -->
					<xmltask sourcebuffer="loginResponseBuffer" failWithoutMatch="true">
						<copy path="/*[local-name()='Envelope']/*[local-name()='Body']/*[local-name()='Fault']/*[local-name()='faultstring']/text()" property="faultString"/>
					</xmltask>
					<fail message="${faultString}"/>
				</else>
			</if>
		</sequential>
	</macrodef>

The <runQuery> task further leverages the <http> task to call the Salesforce REST API query endpoint.

	<!-- Provides access to the Salesforce REST API for a SOQL query -->
	<macrodef name="runQuery" description="Run database query">
		<attribute name="sessionId" description="Salesforce user name."/>
		<attribute name="serverUrl" description="Salesforce url."/>
		<attribute name="query" description="Salesforce password."/>
		<attribute name="queryResult" description="Query result property name"/>
		<sequential>
			<!-- Extract host/instance name from the serverUrl returned from the login response -->
			<propertyregex property="host"
              input="${serverUrl}"
              regexp="^((http[s]?|ftp):\/)?\/?([^:\/\s]+)((\/\w+)*\/)([\w\-\.]+[^#?\s]+)(.*)?(#[\w\-]+)?$"
              select="\3"
              casesensitive="false" />			
			<!-- Execute SOQL query via REST API /query resource -->
		    <http url="https://${host}/services/data/v29.0/query" method="GET" entityProperty="queryResultResponse" statusProperty="loginResponseStatus" printrequestheaders="false" printresponseheaders="false">
		    	<headers>
		    		<header name="Authorization" value="Bearer ${sessionId}"/>
		    	</headers>
		    	<query>
		    		<parameter name="q" value="@{query}"/>
		    	</query>
		    </http>		
		    <property name="@{queryResult}" value="${queryResultResponse}"/>
		</sequential>
	</macrodef>

When put together, the two tasks work very well, allowing you to login and pass the resulting Session Id to the <runQuery> task, then parse the results according to your needs with a small piece of inline JavaScript that handles the resulting JSON. The user of these tasks is blissfully unaware of some of the more advanced Ant script approaches used to implement them, which is how things should be when providing good Ant tasks.

<project name="demo" basedir="." default="demo">

    <!-- Import login properties -->
    <property file="${basedir}/build.properties"/>    
    <!-- Import new Salesforce tasks -->
    <import file="${basedir}/lib/ant-salesforce.xml"/>
    
    <!-- Query task demo -->	
    <target name="demo">
    
    	<!-- Login -->
 		<login 
 			username="${sf.username}" 
 			password="${sf.password}" 
 			serverurl="serverUrl" 
 			sessionId="sessionId"/>
 			
 		<!-- Query -->
 		<runQuery 
 			sessionId="${sessionId}" 
 			serverUrl="${serverUrl}" 
 			queryResult="accounts"
 			query="SELECT Id, Name FROM Account LIMIT 1"/>
 		
 		<!-- Parse JSON result via JavaScript eval -->
 		<script language="javascript">
			var response = eval('('+project.getProperty('accounts')+')');
			project.setProperty('Name', response.records[0].Name);
			project.setProperty('Id', response.records[0].Id);
		</script>
		
		<!-- Dump results -->
		<echo message="Queried Account '${Name}' with Id ${Id}"/>
		
    </target>   
     
</project>

Here is a more complex example, processing more than one record via an Ant macro invoked for each record.

 		<!-- Query -->
 		<runQuery 
 			sessionId="${sessionId}" 
 			serverUrl="${serverUrl}" 
 			queryResult="accounts"
 			query="SELECT Id, Name FROM Account"/>

		<!-- Ant macro called for each Account retrieved -->
	    <macrodef name="echo.account">
	    	<attribute name="id"/>
	    	<attribute name="name"/>
	    	<sequential>
				<!-- Process for each account -->
		    	<echo message="Queried Account '@{name}' with Id @{id}"/>
	    	</sequential>		    	
	    </macrodef> 		
 		
 		<!-- Parse JSON result via JavaScript eval and call above Ant macro -->
 		<script language="javascript">
			var response = eval('('+project.getProperty('accounts')+')');
			for(var idx in response.records)
			{
				var processRecord = project.createTask("echo.account");
                processRecord.setDynamicAttribute("id", response.records[idx].Id);
                processRecord.setDynamicAttribute("name", response.records[idx].Name);
                processRecord.execute();
			}
		</script>

Ant is not just for build systems or developers; it can be used quite effectively for many automation tasks. You could create an Ant script that polls for certain activity in your Salesforce org and invokes some application or more complex process, for example. Ant has a huge array of tasks and massive community support; it's a good skill to learn for cross-platform scripting, and I've frankly found very little it can't do these days.

So you may be wondering: why ever use a Java-based Ant task or process again to implement your complex Ant and Salesforce integrations? Well… you may still want to go down the Java coding route if your needs are more complex, or if you're not comfortable with Ant scripting. Indeed, in the case above the project morphed into something much more complex and we ended up in Java after all. As always, choose your tools for the job according to time, resources, skills and complexity. Hopefully this blog has given you another option in your tool belt to consider!



Calling Flow from Apex

Since Summer'14 it has been possible to invoke a Flow from Apex, through the oddly named Interview system class method, start. This blog talks about using this as a means to provide an extensibility mechanism, as an alternative to asking an Apex developer to implement an Apex interface you've exposed. Clicks-not-code plugins!

Prior to Summer'14 it was only possible to embed a Flow in a Visualforce page, using the flow:interview component, as described here. But this of course required a UI, and hence user interaction, to drive it. What if you wanted to allow admins to extend or customise the flow of some Apex logic using Flow from a non-UI context?

Enter trigger-ready flows, first introduced as part of the Summer'14 pilot for running Flow from Workflow, which itself is still in Pilot, though the ability to create trigger-ready flows is now generally available, yippee! Here's what the docs have to say about them…

You can also build trigger-ready flows. A trigger-ready flow is a flow that can be launched without user interaction, such as from a flow trigger workflow action or the Apex interview.start method. Because trigger-ready flows must be able to run in bulk and without user interaction, they can’t contain steps, screens, choices, or dynamic choices in the active flow version—or the latest flow version, if there’s no active version.

This blog will present the following three examples of calling trigger-ready flows from Apex.

  • Hello World, returning a text value set in a Flow.
  • Calc, passing values into a Flow and returning the result of some processing.
  • Record Updater, passing SObject records into a Flow for processing and returning them.

Hello World Example

Here is a simple example that invokes a Flow that just returns a string value defined within the Flow.

[Screenshots: the ReturnHelloWorld Flow and its ReturnValue assignment element]

The following Apex code calls the above Flow and outputs to the Debug log.

// Call the Flow
Map<String, Object> params = new Map<String, Object>();
Flow.Interview.ReturnHelloWorld helloWorldFlow = new Flow.Interview.ReturnHelloWorld(params);
helloWorldFlow.start();

// Obtain the results
String returnValue = (String) helloWorldFlow.getVariableValue('ReturnValue');
System.debug('Flow returned ' + returnValue);

This outputs the following to the Debug log…

11:18:02.684 (684085568)|USER_DEBUG|[13]|DEBUG|Flow returned Hello from the Flow World!

Calc Example

This example passes in a value, which is manipulated by the Flow and returned.

[Screenshots: the Calc Flow and its assignment element]

The following code invokes this Flow, by passing the X and Y values in through the Map.

// Call the Flow
Map<String, Object> params = new Map<String, Object>();
params.put('X', 10);
params.put('Y', 5);
Flow.Interview.Calc calcFlow = new Flow.Interview.Calc(params);
calcFlow.start();

// Obtain the results
Double returnValue = (Double) calcFlow.getVariableValue('ReturnValue');
System.debug('Flow returned ' + returnValue);

This outputs the following to the Debug log…

12:09:55.190 (190275787)|USER_DEBUG|[24]|DEBUG|Flow returned 15.0

Record Updater Example

With the new SObject and SObject Collection types in Summer'14 we can also pass in SObjects, for example to apply some custom defaulting logic before the records are processed and inserted by the Apex logic. The following simple Flow loops over the records passed in and sets the Description field.

[Screenshots: the RecordUpdater Flow and its assignment element]

This code constructs a list of Accounts, passes them to the Flow and retrieves the updated list afterwards.

// List of records
List<Account> accounts = new List<Account>{
	new Account(Name = 'Account A'),
	new Account(Name = 'Account B') };

// Call the Flow
Map<String, Object> params = new Map<String, Object>();
params.put('Accounts', accounts);
Flow.Interview.RecordUpdater recordUpdaterFlow = new Flow.Interview.RecordUpdater(params);
recordUpdaterFlow.start();

// Obtain results
List<Account> updatedAccounts =
	(List<Account>) recordUpdaterFlow.getVariableValue('Accounts');
for(Account account : updatedAccounts)
	System.debug(account.Name + ' ' + account.Description);

The following debug output shows the field values set by the Apex code and those set by the Flow…

13:10:31.060 (60546122)|USER_DEBUG|[39]|DEBUG|Account A Description set by Flow
13:10:31.060 (60588163)|USER_DEBUG|[39]|DEBUG|Account B Description set by Flow

Summary

Calling Flows from Apex is quite powerful, putting more complex extensibility back into the hands of admins rather than developers, though depending on your type of solution it is somewhat hampered by the inability to dynamically instantiate a subscriber-configured Flow. As such, for now it is really only useful for non-packaged Apex code deployments in production environments, where the referenced Flows can be edited (managed packaged Flows cannot be edited).

Despite this, for custom Apex code deployments to production it is quite powerful, as it allows tweaks to the behaviour of solutions to be made without needing a developer to go through Sandbox and redeployment etc. Of course, you're also putting a lot of confidence in the person defining the Flows, so use this combo wisely with your customers!

Upvote: I've raised an Idea here to allow Apex to dynamically create Flow Interview instances. With this in place, ISVs and packaged applications could make more use of this facility to provide clicks-not-code extensibility to their packaged application logic.
