Andy in the Cloud

From BBC Basic to Force.com and beyond…



Calling Salesforce APIs from Ant Script – Querying Records

Back in June last year I wrote a blog entitled Look ma, no hands!; its main focus was how to leverage the then new ability to install and uninstall packages via the Metadata API. However there was another goal: I wanted to invoke Salesforce APIs using only native Ant script and 100% Java-based Apache Ant tasks, so no Java coding or native curl executable invocations, making the resulting script platform neutral and easier to manage.

In this blog I’d like to talk a little more about how it was done and highlight the excellent <http> Ant task from Missing Link (so named since, surprisingly, Ant has yet to provide a core task for HTTP comms). In addition I wanted to share how I was recently able to extend this approach while working with one of FinancialForce.com‘s new up-and-coming DevOps team members, Brad Slater (also see Object Model Tool).

The goal once again was keeping it 100% Ant, this time invoking the Salesforce REST API to perform queries.

 		<!-- Query -->
 		<runQuery 
 			sessionId="${sessionId}" 
 			serverUrl="${serverUrl}" 
 			queryResult="accounts"
 			query="SELECT Id, Name FROM Account LIMIT 1"/>

As before the new Ant tasks are defined in a single XML file, ant-salesforce.xml; you can download the updated version with the new <runQuery> task and easily <import> it into your own Ant scripts.

Ant provides an excellent way to encapsulate complex script in components it calls Tasks. You can implement these in Java or in Ant script itself, using the <macrodef> Ant task. The following shows how the Salesforce <login> task was built for last year’s blog. You can see both the <http> and <xmltask> tasks in action.

	<!-- Login into Salesforce and return the session Id and serverUrl -->
	<macrodef name="login">
		<attribute name="username" description="Salesforce user name."/>
		<attribute name="password" description="Salesforce password."/>
		<attribute name="serverurl" description="Server Url property."/>
		<attribute name="sessionId" description="Session Id property."/>
		<sequential>
			<!-- Obtain Session Id via Login SOAP service -->
		    <http url="https://login.salesforce.com/services/Soap/c/29.0" method="POST" failonunexpected="false" entityProperty="loginResponse" statusProperty="loginResponseStatus">
		    	<headers>
		    		<header name="Content-Type" value="text/xml"/>
		    		<header name="SOAPAction" value="login"/>
		    	</headers>
		    	<entity>
		    		<![CDATA[
				    	<env:Envelope xmlns:xsd='http://www.w3.org/2001/XMLSchema' xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance' xmlns:env='http://schemas.xmlsoap.org/soap/envelope/'>
				    	    <env:Body>
				    	        <sf:login xmlns:sf='urn:enterprise.soap.sforce.com'>
				    	            <sf:username>@{username}</sf:username>
				    	            <sf:password>@{password}</sf:password>
				    	        </sf:login>
				    	    </env:Body>
				    	</env:Envelope>
		    		]]>
		    	</entity>
		    </http>
			<!-- Parse response -->
			<xmltask destbuffer="loginResponseBuffer">
				<insert path="/">${loginResponse}</insert>
			</xmltask>
			<if>
				<!-- Success? -->
				<equals arg1="${loginResponseStatus}" arg2="200"/>
				<then>
					<!-- Parse sessionId and serverUrl -->
					<xmltask sourcebuffer="loginResponseBuffer" failWithoutMatch="true">
						<copy path="/*[local-name()='Envelope']/*[local-name()='Body']/:loginResponse/:result/:sessionId/text()" property="@{sessionId}"/>
						<copy path="/*[local-name()='Envelope']/*[local-name()='Body']/:loginResponse/:result/:serverUrl/text()" property="@{serverUrl}"/>
					</xmltask>
				</then>
				<else>
					<!-- Parse login error message and fail build -->
					<xmltask sourcebuffer="loginResponseBuffer" failWithoutMatch="true">
						<copy path="/*[local-name()='Envelope']/*[local-name()='Body']/*[local-name()='Fault']/*[local-name()='faultstring']/text()" property="faultString"/>
					</xmltask>
					<fail message="${faultString}"/>
				</else>
			</if>
		</sequential>
	</macrodef>

The <runQuery> task further leverages the <http> task to call the Salesforce REST API query endpoint.

	<!-- Provides access to the Salesforce REST API for a SOQL query -->
	<macrodef name="runQuery" description="Run database query">
		<attribute name="sessionId" description="Salesforce user name."/>
		<attribute name="serverUrl" description="Salesforce url."/>
		<attribute name="query" description="Salesforce password."/>
		<attribute name="queryResult" description="Query result property name"/>
		<sequential>
			<!-- Extract host/instance name from the serverUrl returned from the login response -->
			<propertyregex property="host"
              input="${serverUrl}"
              regexp="^((http[s]?|ftp):\/)?\/?([^:\/\s]+)((\/\w+)*\/)([\w\-\.]+[^#?\s]+)(.*)?(#[\w\-]+)?$"
              select="\3"
              casesensitive="false" />			
			<!-- Execute Apex via REST API /query resource -->
		    <http url="https://${host}/services/data/v29.0/query" method="GET" entityProperty="queryResultResponse" statusProperty="loginResponseStatus" printrequestheaders="false" printresponseheaders="false">
		    	<headers>
		    		<header name="Authorization" value="Bearer ${sessionId}"/>
		    	</headers>
		    	<query>
		    		<parameter name="q" value="@{query}"/>
		    	</query>
		    </http>		
		    <property name="@{queryResult}" value="${queryResultResponse}"/>
		</sequential>
	</macrodef>
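
Note that the <if> and <propertyregex> tasks used above are not core Ant; they come from the Ant-Contrib project, so that library needs to be available to your Ant build. A minimal sketch of loading it (the jar location and version are assumptions):

	<!-- Load Ant-Contrib, which provides <if> and <propertyregex>; jar path is an assumption -->
	<taskdef resource="net/sf/antcontrib/antlib.xml">
		<classpath>
			<pathelement location="${basedir}/lib/ant-contrib-1.0b3.jar"/>
		</classpath>
	</taskdef>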

When put together, the two tasks work very well, allowing you to login and pass the resulting Session Id to the query task, then process the results according to your needs with a small piece of inline JavaScript to parse the returned JSON. The user of these tasks is blissfully unaware of some of the more advanced Ant script approaches used to implement them, which is how things should be when providing good Ant tasks.

<project name="demo" basedir="." default="demo">

    <!-- Import login properties -->
    <property file="${basedir}/build.properties"/>    
    <!-- Import new Salesforce tasks -->
    <import file="${basedir}/lib/ant-salesforce.xml"/>
    
    <!-- Query task demo -->	
    <target name="demo">
    
    	<!-- Login -->
 		<login 
 			username="${sf.username}" 
 			password="${sf.password}" 
 			serverurl="serverUrl" 
 			sessionId="sessionId"/>
 			
 		<!-- Query -->
 		<runQuery 
 			sessionId="${sessionId}" 
 			serverUrl="${serverUrl}" 
 			queryResult="accounts"
 			query="SELECT Id, Name FROM Account LIMIT 1"/>
 		
 		<!-- Parse JSON result via JavaScript eval -->
 		<script language="javascript">
			var response = eval('('+project.getProperty('accounts')+')');
			project.setProperty('Name', response.records[0].Name);
			project.setProperty('Id', response.records[0].Id);
		</script>
		
		<!-- Dump results -->
		<echo message="Queried Account '${Name}' with Id ${Id}"/>
		
    </target>   
     
</project>

Here is a more complex example processing more than one record, calling an Ant macro for each record retrieved.

 		<!-- Query -->
 		<runQuery 
 			sessionId="${sessionId}" 
 			serverUrl="${serverUrl}" 
 			queryResult="accounts"
 			query="SELECT Id, Name FROM Account"/>

		<!-- Ant macro called for each Account retrieved -->
	    <macrodef name="echo.account">
	    	<attribute name="id"/>
	    	<attribute name="name"/>
	    	<sequential>
				<!-- Process for each account -->
		    	<echo message="Queried Account '@{name}' with Id @{id}"/>
	    	</sequential>		    	
	    </macrodef> 		
 		
 		<!-- Parse JSON result via JavaScript eval and call above Ant macro -->
 		<script language="javascript">
			var response = eval('('+project.getProperty('accounts')+')');
			for(var idx in response.records)
			{
				var processRecord = project.createTask("echo.account");
                processRecord.setDynamicAttribute("id", response.records[idx].Id);
                processRecord.setDynamicAttribute("name", response.records[idx].Name);
                processRecord.execute();
			}
		</script>

Ant is not just for build systems or developers; it can be used quite effectively for many automation tasks. You could, for example, create an Ant script that polls for certain activity in your Salesforce org and invokes some application or more complex process. Ant has a huge array of tasks and massive community support; it’s a good skill to learn for cross-platform scripting and I’ve frankly found very little it can’t do these days.
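
To make that polling idea a little more concrete, here is a hedged sketch of a target you might schedule via cron or Task Scheduler; it reuses the <login> and <runQuery> tasks above, and the object, query and property names are purely illustrative:

    <target name="poll-new-cases">
        <!-- Login and run a query using the tasks defined earlier -->
        <login username="${sf.username}" password="${sf.password}"
               serverurl="serverUrl" sessionId="sessionId"/>
        <runQuery sessionId="${sessionId}" serverUrl="${serverUrl}"
                  queryResult="cases"
                  query="SELECT Id FROM Case WHERE Status = 'New'"/>
        <!-- Parse the JSON response and surface the record count -->
        <script language="javascript">
            var response = eval('(' + project.getProperty('cases') + ')');
            project.setProperty('newCaseCount', response.totalSize);
        </script>
        <echo message="Found ${newCaseCount} new Case records"/>
    </target>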

So you may be wondering, why ever use a Java-based Ant task or process again to implement your complex Ant and Salesforce integrations? Well… you may still want to go down the Java coding route if your needs are more complex or if you’re not comfortable with Ant scripting. Indeed, in the case above, the project morphed into something much more complex and we ended up in Java after all. As always, choose your tools for the job according to time, resources, skills and complexity. Hopefully this blog has given you another option in your tool belt to consider!



Calling Flow from Apex

Since Summer’14 it has been possible to invoke a Flow from Apex, through the oddly named Interview system class method start. This blog talks about using this as a means to provide an extensibility mechanism, as an alternative to asking an Apex developer to implement an Apex interface you’ve exposed. Clicks not code plugins!

Prior to Summer’14 it was only possible to embed a Flow in a Visualforce page, using the flow:interview component, as described here. But this of course required a UI and hence user interaction to drive it. What if you wanted to allow admins to extend or customise the flow of some Apex logic using Flow from a non-UI context?

Enter trigger-ready flows, first introduced as part of the Summer’14 pilot for running Flow from Workflow, which itself is still in Pilot, though the ability to create trigger-ready flows is now generally available, yippee! Here’s what the docs have to say about them…

You can also build trigger-ready flows. A trigger-ready flow is a flow that can be launched without user interaction, such as from a flow trigger workflow action or the Apex interview.start method. Because trigger-ready flows must be able to run in bulk and without user interaction, they can’t contain steps, screens, choices, or dynamic choices in the active flow version—or the latest flow version, if there’s no active version.

This blog will present the following three examples of calling trigger-ready flows from Apex.

  • Hello World, returning a text value set in a Flow
  • Calc, shows passing in values to a Flow and returning the result of some processing.
  • Record Updater, shows passing in some SObject records to the Flow for processing and returning them.

Hello World Example

Here is a simple example that invokes a Flow that just returns a string value defined within the Flow.

[Screenshots: ReturnHelloWorldFlow, ReturnHelloWorldAssignment]

The following Apex code calls the above Flow and outputs to the Debug log.

// Call the Flow
Map<String, Object> params = new Map<String, Object>();
Flow.Interview.ReturnHelloWorld helloWorldFlow = new Flow.Interview.ReturnHelloWorld(params);
helloWorldFlow.start();

// Obtain the results
String returnValue = (String) helloWorldFlow.getVariableValue('ReturnValue');
System.debug('Flow returned ' + returnValue);

This outputs the following to the Debug log…

11:18:02.684 (684085568)|USER_DEBUG|[13]|DEBUG|Flow returned Hello from the Flow World!

Calc Example

This example passes in values, which are manipulated by the Flow and the result returned.

[Screenshots: CalcFlow, CalcFlowAssignment]

The following code invokes this Flow, by passing the X and Y values in through the Map.

// Call the Flow
Map<String, Object> params = new Map<String, Object>();
params.put('X', 10);
params.put('Y', 5);
Flow.Interview.Calc calcFlow = new Flow.Interview.Calc(params);
calcFlow.start();

// Obtain the results
Double returnValue = (Double) calcFlow.getVariableValue('ReturnValue');
System.debug('Flow returned ' + returnValue);

This outputs the following to the Debug log…

12:09:55.190 (190275787)|USER_DEBUG|[24]|DEBUG|Flow returned 15.0

Record Updater Example

With the new SObject and SObject Collection types in Summer’14 we can also pass in SObjects, for example to apply some custom defaulting logic before the records are processed and inserted by the Apex logic. The following simple Flow loops over the records passed in and sets the Description field.

[Screenshots: RecordUpdaterFlow, RecordUpdaterAssignment]

This code constructs a list of Accounts, passes them to the Flow and retrieves the updated list after.

// List of records
List<Account> accounts = new List<Account>{
	new Account(Name = 'Account A'),
	new Account(Name = 'Account B') };

// Call the Flow
Map<String, Object> params = new Map<String, Object>();
params.put('Accounts', accounts);
Flow.Interview.RecordUpdater recordUpdaterFlow = new Flow.Interview.RecordUpdater(params);
recordUpdaterFlow.start();

// Obtain results
List<Account> updatedAccounts =
	(List<Account>) recordUpdaterFlow.getVariableValue('Accounts');
for(Account account : updatedAccounts)
	System.debug(account.Name + ' ' + account.Description);

The following debug output shows the field values set by the Apex code and those set by the Flow…

13:10:31.060 (60546122)|USER_DEBUG|[39]|DEBUG|Account A Description set by Flow
13:10:31.060 (60588163)|USER_DEBUG|[39]|DEBUG|Account B Description set by Flow

Summary

Calling Flows from Apex is quite powerful as a way to put more complex extensibility back into the hands of admins vs developers. Though, depending on your type of solution, it is somewhat hampered by the inability to dynamically create a subscriber-configured Flow. As such, for now it is really only useful for non-packaged Apex code deployments in production environments, where the referenced Flows can be edited (managed packaged Flows cannot be edited).

Despite this, for custom Apex code deployments to production it is quite powerful, as it allows tweaks to the behaviour of solutions to be made without needing a developer to go through Sandbox and redeployment etc. Of course you’re also putting a lot of confidence in the person defining the Flows, so use this combo wisely with your customers!

Upvote: I’ve raised an Idea here to allow Apex to dynamically create Flow Interview instances; with this in place ISVs and packaged applications could make more use of this facility to provide clicks-not-code extensibility to their packaged application logic.



Super ListView Viewer using Winter’15 ListView API

Salesforce continues its drive to push out great platform APIs; while Winter’15 was a little lighter than usual, the List View API did catch my eye! It’s currently available in SOAP and REST flavours, sadly as yet no Apex, though I’ve seen Salesforce follow up in later releases with Apex support, so fingers crossed! This omission didn’t stop me exploring the REST variant via Visualforce and jQuery!

I’ve written a few applications that leverage List Views to provide readily available filter criteria for various uses. One approach used the StandardSetController in Apex (via the setFilterId method) and another read the object definition (using the Metadata API), parsed the List View definition and built my own SOQL query! This latter strategy can now be replaced with the new List View API, as amongst giving you the List View definition and the records behind it, you also get hold of the SOQL query Salesforce uses as well!
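
For reference, the StandardSetController approach mentioned above looks roughly like this; a minimal sketch, assuming listViewId holds the Id of an Account List View obtained elsewhere (the method name is illustrative):

// Minimal sketch: apply a List View filter to a StandardSetController
public static List<Account> accountsForListView(Id listViewId) {
	ApexPages.StandardSetController ssc = new ApexPages.StandardSetController(
		Database.getQueryLocator([SELECT Id, Name FROM Account]));
	// Restrict the record set to the given List View
	ssc.setFilterId(listViewId);
	return (List<Account>) ssc.getRecords();
}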

The REST API version provides the following resources to call…

  • /services/data/v32.0/sobjects/Account/listviews – Lists the List Views associated with the object and their Ids (click the link to try it out in Developer Workbench!). Documentation here.
  • /services/data/v32.0/sobjects/Account/listviews/{Id}/describe – Returns the definition of the List View and the SOQL! Documentation here.
  • /services/data/v32.0/sobjects/Account/listviews/{Id}/results – Returns the column definitions of the List View and the records it returns. Documentation here.

While exploring the SOQL used by List Views, I noticed something I had not seen before in the SOQL syntax: the USING SCOPE clause, also new to Winter’15. The Recently Viewed Accounts and My Accounts views leverage this new clause…

SELECT name, site, billingstate, phone, tolabel(type), 
  owner.alias, id, createddate, lastmodifieddate, systemmodstamp 
FROM Account 
USING SCOPE mru 
ORDER BY Name ASC NULLS FIRST, Id ASC NULLS FIRST

The above shows ‘mru’ and the following example shows ‘mine’…

SELECT name, site, billingstate, phone, tolabel(type), 
  owner.alias, id, createddate, lastmodifieddate, systemmodstamp 
FROM Account 
USING SCOPE mine 
ORDER BY Name ASC NULLS FIRST, Id ASC NULLS FIRST

The documentation is a bit weak presently on the other values; this knowledge base article lists Everything, Mine, Queue, Delegated, MyTerritory, MyTeamTerritory or Team, but not MRU at present.

So to give these APIs a better shakedown I decided to flex my JavaScript side this time, knowing that it’s possible to call these APIs from the Visualforce domain without issue; it’s simply a matter of making a JavaScript call and interpreting the results. Thus Super ListView Viewer was born! With a little help from the frankly amazing jQuery plugin known as DataTable, which replicates as far as I can see the standard List View UI and then some!

This page allows you to select any object and, if it has associated List Views, view them!

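The page first asks the List View API which List Views exist for the selected object, via the listviews resource listed above. Here is a hedged sketch of that call; the objectName variable and the #listViewSelect element are assumptions about the surrounding page:

// List the List Views for the selected object and populate a picklist
$.ajax({
	url : '/services/data/v32.0/sobjects/' + objectName + '/listviews',
	headers : { 'Authorization' : 'Bearer {!$Api.Session_ID}' },
	datatype : 'json',
	success : function(data, textStatus, jqXHR) {
		$.each(data.listviews, function(index, listView) {
			$('#listViewSelect').append(
				$('<option></option>').val(listView.id).text(listView.label));
		});
	}
});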

jQuery provides some excellent AJAX support; this code reads the SOQL query and displays it on the page…

// List View describe to take a look at the SOQL!
$.ajax({
	url : '/services/data/v32.0/sobjects/' + objectName + '/listviews/' + listViewId + '/describe', 
	headers : { 'Authorization' : 'Bearer {!$Api.Session_ID}' },
	datatype : 'json', 
	success : function(data, textStatus, jqXHR) {					
		$('#soql').text(data.query);
	}
});

This code builds a regular HTML table and then, in a single line, turns it into the amazing DataTable. I was quite amazed when I added this library, seriously cool!

// Call the List View API to get the results (also includes column definitions)
$.ajax({
	url : '/services/data/v32.0/sobjects/' + objectName + '/listviews/' + listViewId + '/results', 
	headers : { 'Authorization' : 'Bearer {!$Api.Session_ID}' },
	datatype : 'json', 
	success : function(data, textStatus, jqXHR) {					

		// Clear current List View info
		$('#listview').empty();

		// Create the table and add columns
		var table = $('<table></table>');
		var thead = $('<thead></thead>');
		var theadtr = $('<tr></tr>');					
		table.appendTo('#listview');					
		table.append(thead);
		thead.append(theadtr);
		$.each(data.columns, function(index, column) {
			if(!column.hidden)
				theadtr.append($('<th>' + column.label + '</th>'));
		});

		// Add the rows
		var tbody = $('<tbody></tbody>');
		table.append(tbody);
		$.each(data.records, function(rowIndex, record) {
			var tbodytr = $('<tr></tr>');					
			tbody.append(tbodytr);
			$.each(record.columns, function(colIndex, column) {
				if(!data.columns[colIndex].hidden)
					tbodytr.append($('<td>' +
					 (record.columns[colIndex].value!=null ? 
					  record.columns[colIndex].value : '') + '</td>'));
			});
		});

		// Enhance this boring old HTML table with JQuery DataTable!
		var dataTable = table.DataTable();
	}
});

You can find the full source code for this here. Of course if you needed to call this API from Apex, you could make an outbound HTTP callout, after having set up a Remote Site to allow Salesforce to call itself…
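
A minimal sketch of that callout approach, assuming a Remote Site Setting exists for the org’s own instance URL:

// Call the List View API from Apex via an HTTP callout (requires a Remote Site Setting)
HttpRequest req = new HttpRequest();
req.setEndpoint(URL.getSalesforceBaseUrl().toExternalForm() +
	'/services/data/v32.0/sobjects/Account/listviews');
req.setMethod('GET');
req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());
HttpResponse res = new Http().send(req);
System.debug(res.getBody());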

Enjoy!



Preview of Advanced Apex Enterprise Patterns Session

At this year’s Dreamforce I will be presenting not one, but two sessions on Apex Enterprise Patterns. The first will be a re-run of last year’s session, Apex Enterprise Patterns : Building Strong Foundations. The second will be a follow-on session dealing with more advanced topics around the patterns, imaginatively entitled Advanced Apex Enterprise Patterns. The current abstract for this session is as follows…

Complex code can easily get out of hand without good design, so in this deep dive you will better understand how to apply the advanced design patterns used by highly experienced Salesforce developers. Starting with Interface & Base Class examples of OO design we quickly move on to new design features including: Application Factory, Field Level Security Support, Selector FieldSet support and Dependency Injection, Mock Testing of Services, Domain and Selector layers. By the end you will have a better understanding of how and when to apply these advanced Apex patterns.

If you attended the FinancialForce.com DevTalk in June this year, you will have got a sneak peek of some of the improvements being made to the supporting library for the patterns, available in this repo. If you didn’t, you’ll have to wait till Dreamforce! Oh go on then, I’ll give you a teaser here…

  • Introduction
  • Selector Enhancements, FieldSet and QueryFactory Support
  • Application Factory Introduction
  • Using Apex Interfaces to implement Common Service Layer Functionality
  • Introducing ApexMocks
  • Using ApexMocks with Service, Selector and Domain Layers
  • Field Level Security Experiment
  • Q&A 

I’m quite excited about all this content, but perhaps, if pushed, I’d have to highlight the new Application Factory concept along with its integration with the exciting new ApexMocks library (also from FinancialForce.com R&D). This brings with it easier support for implementing polymorphic use cases in your application and the ability to mock layers of the patterns, such as the Unit Of Work, Domain, Selector and Service layers, allowing you to develop true unit tests that are fast for the platform to execute and plentiful in terms of the variation of tests you’ll be able to develop, without fear of extending the time you’re sat waiting for tests to execute!

It is against my nature to publish a blog without a code sample in it, so I’ll leave you to ponder the following…

	public void applyDiscounts(Set<ID> opportunityIds, Decimal discountPercentage)
	{
		// Create unit of work to capture work and commit it under one transaction
	    fflib_ISObjectUnitOfWork uow = Application.UnitOfWork.newInstance();

		// Query Opportunities
		List<Opportunity> oppRecords =
			OpportunitiesSelector.newInstance().selectByIdWithProducts(opportunityIds);

		// Apply discount via Opportunities domain class behaviour
		IOpportunities opps = Opportunities.newInstance(oppRecords);
		opps.applyDiscount(discountPercentage, uow);

		// Commit updates to opportunities
		uow.commitWork();
	}

Here is a more generic service layer example, leveraging polymorphic Domain classes!

	public void generate(Set<Id> sourceRecordIds)
	{
		// Create unit of work to capture work and commit it under one transaction
		fflib_ISObjectUnitOfWork uow = Application.UnitOfWork.newInstance();

		// Invoicing Factory helps domain classes produce invoices
		InvoicingService.InvoiceFactory invoiceFactory = new InvoicingService.InvoiceFactory(uow);

		// Construct domain class capable of performing invoicing
		fflib_ISObjectDomain domain =
			Application.Domain.newInstance(
				Application.Selector.selectById(sourceRecordIds));
		if(domain instanceof InvoicingService.ISupportInvoicing)
		{
			// Ask the domain object to generate its invoices
			InvoicingService.ISupportInvoicing invoicing = (InvoicingService.ISupportInvoicing) domain;
			invoicing.generate(invoiceFactory);
			// Commit generated invoices to the database
			uow.commitWork();
			return;
		}

		// Related Domain object does not support the ability to generate invoices
		throw new fflib_Application.ApplicationException('Invalid source object for generating invoices.');
	}

This last example shows how ApexMocks has been integrated into the Application Factory concept via the setMock methods. The following is a true test of only the service layer logic, mocking the unit of work, domain and selector layers.

	@IsTest
	private static void callingServiceShouldCallSelectorApplyDiscountInDomainAndCommit()
	{
		// Create mocks
		fflib_ApexMocks mocks = new fflib_ApexMocks();
		fflib_ISObjectUnitOfWork uowMock = new fflib_SObjectMocks.SObjectUnitOfWork(mocks);
		IOpportunities domainMock = new Mocks.Opportunities(mocks);
		IOpportunitiesSelector selectorMock = new Mocks.OpportunitiesSelector(mocks);

		// Given
		mocks.startStubbing();
		List<Opportunity> testOppsList = new List<Opportunity> {
			new Opportunity(
				Id = fflib_IDGenerator.generate(Opportunity.SObjectType),
				Name = 'Test Opportunity',
				StageName = 'Open',
				Amount = 1000,
				CloseDate = System.today()) };
		Set<Id> testOppsSet = new Map<Id, Opportunity>(testOppsList).keySet();
		mocks.when(domainMock.sObjectType()).thenReturn(Opportunity.SObjectType);
		mocks.when(selectorMock.sObjectType()).thenReturn(Opportunity.SObjectType);
		mocks.when(selectorMock.selectByIdWithProducts(testOppsSet)).thenReturn(testOppsList);
		mocks.stopStubbing();
		Decimal discountPercent = 10;
		Application.UnitOfWork.setMock(uowMock);
		Application.Domain.setMock(domainMock);
		Application.Selector.setMock(selectorMock);

		// When
		OpportunitiesService.applyDiscounts(testOppsSet, discountPercent);

		// Then
		((IOpportunitiesSelector)
			mocks.verify(selectorMock)).selectByIdWithProducts(testOppsSet);
		((IOpportunities)
			mocks.verify(domainMock)).applyDiscount(discountPercent, uowMock);
		((fflib_ISObjectUnitOfWork)
			mocks.verify(uowMock, 1)).commitWork();
	}

All these examples will be available in the sample application repo once I’ve completed the prep for the session in a few weeks’ time.

Sadly the FinancialForce.com session on ApexMocks was not selected for Dreamforce 2014, however not to worry! FinancialForce.com will be hosting a DevTalk event during Dreamforce week where Jesse Altman will be standing in for the library author Paul Hardaker (get well soon Paul!), book your place now!

Finally, if you have been using the patterns for a while and have a question you want to ask in this session, please feel free to drop your idea into the comments box below this blog post!
