Andy in the Cloud

From BBC Basic to Force.com and beyond…



Rollups and Cross Object Formula Fields

I’m constantly amazed at the number of varied use cases folks in the Chatter Group are applying to the Declarative Lookup Rollup Summary tool these days. This short blog highlights a particular use case that seems to be on the increase. To resolve it I reached out for some additional help from Process Builder; this is the story…

Background: What causes a Rollup to recalculate?

The default behaviour of the rollup tool is to look for changes to the field you’re rolling up on, the one specified in the Field to Aggregate field. In addition you can list other fields for it to watch via the Relationship Criteria Fields field, which as the name suggests is also important information if you’ve used rollup criteria. However it’s also important if the field you’re rolling up on is a Formula field. The platform doesn’t inform the tool’s Apex Trigger of changes to formula fields, so it cannot monitor them directly and must instead be told explicitly about the fields referenced by the formula expression. For example, if your Field to Aggregate is a formula such as Quantity__c * UnitPrice__c, you would list Quantity__c and UnitPrice__c in Relationship Criteria Fields. So far so good…

Challenge: Rollups over Cross Object Formulas?

A challenge arises, however, if you want to do a realtime rollup based on a formula field that references fields from a related record, a cross object formula! In this case how does the rollup tool know when changes are made to related records?

One solution is to switch to schedule mode by clicking the Schedule Calculate button on the rollup. For realtime rollups, it’s potentially feasible to enhance the tool to deploy triggers to related objects and bubble up knowledge of field changes to cause a recalculation of the rollup on the child object… However before we resort to more code (even in the tool) let’s see what we can do with the declarative tools we already have today…

Example Use Case

The following schema diagram shows a simplified mockup of such a challenge I helped a community member out with in the tool’s Chatter Group.

[Image: FormulaRollupUseCase.png]

Here are the scenario assumptions and the breakdown…

  • For whatever pre-existing reasons, the above is NOT a Master Detail relationship
  • Rollup needed is to Sum the Quote Line Item > Amount into the Quote > Total field.
  • The Quote Line Item > Amount field is a Formula field, which is a cross object formula pointing to the related Widget > Total field.
  • The Widget > Total field is itself a Formula field, in this simplified case adding up the values of Widget > A + Widget > B + Widget > C.
  • Whenever changes to the Widget > A, Widget > B or Widget > C fields are made we want the Quote > Total field to be recalculated.

Here’s the rollup summary definition between Quote and Quote Line Item

[Image: ForumlaRollupDLRS.png]

While the above works if you use the Calculate (one-off full recalculate) or Schedule Calculate (daily full recalculate) buttons, our issue arises in the general use of the Realtime mode. Since the tool’s triggers see nothing of the changes users make to the Widget fields above, realtime changes are not reflected in the Quote > Total rollup. This is for the aforementioned reason: we are using a cross object formula.

NOTE: The Calculate Mode picklist also has a Schedule option. This is a more focused background recalculate, only recalculating affected records since the last run, unlike the Schedule Calculate button which is a full recalculate every night. So be aware, if you’re using this mode, that the problem and solution described here also apply to Calculate Mode when set to Schedule, as it uses the same field monitoring approach to queue records up for scheduled recalculation.

If you’re fine without realtime rollups, go ahead and use the Schedule Calculate button and at 2am each morning the Quote > Total amount will catch up with any changes made to the Widget fields that affected it, job done!

Solution: Shadow Fields and Process Builder

When considering the solution, I did consider another rollup between Widget and Quote Line Item to start to resolve this, thinking I could then put the result in a field that the Quote Line Item > Quote rollup would see change. However this quickly proved a poor approach, as the relationship between Widget and Quote Line Item in this use case is the wrong way round, Quote Line Item being the child here, doh! In other use cases, I have had success in using nested rollups to get more complex use cases to fly!

Shadow Field?

[Image: AmountShadowField]

Either way I knew I had to have some kind of physical field, other than a Formula field, on the Quote Line Item object to notify the rollup tool of a change that would trigger a recalculation of the Quote > Total. I called this field Amount (Shadow) in this case; I also left it off my layout.

NOTE: I’ve made the assumption here that the existing cross object Formula field has to stay for other reasons. If that’s not a problem for you, simply recreate Quote Line Item > Amount as a physical field and, as you read the rest of this blog, consider this your shadow field.

I then changed my rollup definition to reference the Amount Shadow field instead.

[Image: ChangesToRollup]

NOTE: If you managed to switch the field type of your Amount field from a Formula to a physical Number field as noted above, you don’t need to do this of course.

Process Builder to update your Field to Aggregate

Next I turned to Process Builder to see if I could get it to populate the above Amount (Shadow) field on Quote Line Item as users made changes to Widget fields, leveraging the child parent relationship between Quote Line Item and Widget. Here is the setup I used to complete the solution!

[Image: FormulaRollupProessBuilder1.png]

[Image: FormulaRollupProessBuilder2.png]

[Image: FormulaRollupProessBuilder3.png]
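
One caveat worth noting: Process Builder only fires when records are saved, so existing Quote Line Items won’t have their shadow field populated until their related Widget is next edited. A one-off anonymous Apex script along the lines of the sketch below (the object and field API names are my assumptions for this mockup) can backfill it, after which the Calculate button will bring the rollup in line.

// Backfill the shadow field from the cross object formula for existing records
// (hypothetical API names: QuoteLineItem__c, Amount__c and AmountShadow__c)
List<QuoteLineItem__c> toUpdate = new List<QuoteLineItem__c>();
for (QuoteLineItem__c line :
        [SELECT Id, Amount__c, AmountShadow__c FROM QuoteLineItem__c]) {
    if (line.AmountShadow__c != line.Amount__c) {
        // Copy the formula value into the physical field the rollup tool monitors
        line.AmountShadow__c = line.Amount__c;
        toUpdate.add(line);
    }
}
update toUpdate;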

Summary

It’s worth noting that if the relationship between Quote Line Item and Quote were Master Detail, you could of course now use the standard platform Rollup Summary Fields without needing the rollup tool at all. You may think me biased here, but not at all, I’d much rather see a fully native solution any day!

Regardless of whether this use case fits yours or not, hopefully this blog has given you some useful inspiration for further rollup and Process Builder combo deals! Enjoy!



Packaging and Installing Rollups

Since v2.0 of the Declarative Rollup Tool, it is now possible to use Custom Metadata to store rollup configurations. Not only does this make it easy to transfer those you have set up in Sandbox via Change Sets, you can also use Salesforce packaging to capture those you’ve created and install them over and over into other orgs, like a reusable Change Set!

[Image: InstallRollups]

Create and set up your Packaging Org

To do this, you need a Developer Edition org from Salesforce; this will act as your master org for your common rollup definitions. Here you can install the rollup tool package itself as a minimum, then also create common fields or objects used by your rollups if you wish. You can also install any other packages, such as the NPSP packages. Basically, prepare and test everything here as you would normally in Sandbox or Production.

[Image: DESignup]

Creating your Package

Under the Setup menu navigate to Create and then Packages. Create a new Package with an appropriate name. Click Add to add components; components are anything from Custom Objects, Fields, Apex Triggers and Apex Tests to the Rollup definitions themselves. As a minimum you will need to add the following components for rollups.

  • Apex Classes for the rollup/s (names starting with dlrs)
  • Apex Triggers for the rollup/s (names starting with dlrs)
  • Lookup Rollup Summary definition/s

The platform adds a few other dependent things as you can see in the screenshot below, but don’t worry about these. You should have something like this…

[Image: PackagingRollups]

Tip: After creating your rollups, go to Setup and Custom Metadata Types, find your rollup definition, edit it, tick the Protected checkbox, then hit Save (at present this checkbox is not shown in the custom UI for rollups). This checkbox prevents users or other admins from changing your rollup definition once installed. You also need to use the managed package route though, as described below.

Choosing Unmanaged or Managed Packages

This choice depends on whether you want to manage several updates / releases to your rollups over time, adding or updating as you improve things. An unmanaged package is basically like installing a template of your rollups; once installed that’s it, they are no longer linked to your package. If however you want to install updates to them and/or stop people from editing your rollups (see protected mode above), you want a managed package.

Setting a namespace for your Managed Package

If you decided you only want an unmanaged package, skip this bit. Otherwise, on the Packages page (under Setup then Create), click the Edit button and follow the prompts to enter a namespace; this is a short mnemonic that describes your package, like a unique ID. Once set it cannot be changed. Next find your Package detail and click Edit, selecting the Managed checkbox and hitting Save.

Uploading your Rollup Package!

[Image: PackagesRollups]

This process basically builds your installer, by actually placing a copy of what you’ve done on the AppExchange servers, albeit as a private listing. This gives you an install link to use over and over as much as you like. Install it in sandbox and production as needed.

Click on your package and then the Upload button, and give it a name and version of your choosing. Be sure to select the Release mode; in most cases for this packaged content, worrying about Beta vs Release mode is not a concern. Press Upload and wait.

[Image: PackageUpload.png]

[Image: RollupPackageSummary]

Rinse and Repeat?

Keep in mind, if you went with the managed package route, you can go back to your packaging org, make some improvements, add more rollups etc, then repeat the upload process to get a new version of your package. Then simply go to any existing org with your prior release installed and upgrade.

NOTE: To ensure your rollup definitions are upgraded they must be marked as protected. If you don’t care about this and only want new rollups to be added, retaining subscriber (installed org) changes, don’t worry about Protected rollups.

 

 



Setup Audit Trail API in Winter’16

Winter’16 was a bit low on API for me, but one thing I just found hiding in the New Objects section was this…

SetupAuditTrail, “Represents changes you or other administrators made in your organization’s Setup area.”

[Image: WBAudit]

Wow, this is a BIG thing and opens up a lot of tooling and greater compliance support for changes around orgs. Prior to this, folks were so keen to get hold of this information programmatically that they would resort to web scraping it from the Salesforce Setup menu! So I set about giving it a spin via Developer Workbench and learning more about the object.

[Image: WBAuditResults]

There is also much better data export support via tools such as DataLoader.io (and of course other tools)…

[Image: AuditDataLoaderIO]

So in addition to the Salesforce APIs (used by the above tools), it is also available via Apex SOQL! The following selects all audit records relating to users from a certain email domain.

List<SetupAuditTrail> stuffDoneByConsultants = 
    [SELECT Id,
     	Action,
     	CreatedBy.Name,
     	CreatedDate,
     	Display,
     	Section 
     FROM SetupAuditTrail 
     WHERE CreatedBy.Email LIKE '%xyzconsulting.com%'];

So what else can we do with it? Well… not a lot more sadly. Doing an Apex Describe shows that it’s basically only query-able: no triggers, replication API or indeed streaming API, which while not a total surprise would have been very cool!
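
If you want to confirm this yourself, a quick bit of anonymous Apex does the trick; a minimal sketch (the debug output is what I’d expect based on the above, check it in your own org)…

Schema.DescribeSObjectResult describeResult = SetupAuditTrail.SObjectType.getDescribe();
System.debug('Queryable: ' + describeResult.isQueryable());
System.debug('Createable: ' + describeResult.isCreateable());
System.debug('Updateable: ' + describeResult.isUpdateable());
System.debug('Deletable: ' + describeResult.isDeletable());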

Age of Records?

Through the long standing CSV download facility under Setup, only 6 months’ worth of data is available. However running queries in my various long standing orgs I’m seeing data going back as far as all time?!? There is no documentation to confirm this, and I honestly would not be surprised to see some limit here, so perhaps those of you running long standing production orgs can confirm via the comments below?
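
A quick way to check your own org, via anonymous Apex or Developer Workbench (assuming at least one audit record exists), is simply to query for the oldest record…

SetupAuditTrail oldest =
    [SELECT Action, CreatedDate FROM SetupAuditTrail ORDER BY CreatedDate ASC LIMIT 1];
System.debug('Oldest audit record: ' + oldest.CreatedDate);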

What are its fields?

[Image: AuditFields]

As you can see the SetupAuditTrail object’s fields are not the most ideal for interpreting the data, aside from the CreatedBy field (which supports relationship walking to the User object). The most interesting are Action, Section and Display; the latter is the one that actually contains the object, field, layout or whatever has changed. The challenge here is that it’s embedded in a message and not called out separately, so there is a bit of parsing to be done here.
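
To illustrate, here is a rough sketch of such parsing. The Display message formats vary by Section and are not formally documented, so treat the pattern below as a hypothetical starting point to adapt to the messages you actually see.

Pattern changed = Pattern.compile('^Changed (.+)$'); // hypothetical, adapt to your messages
for (SetupAuditTrail audit :
        [SELECT Action, Section, Display FROM SetupAuditTrail LIMIT 200]) {
    if (audit.Display == null) continue;
    Matcher parsed = changed.matcher(audit.Display);
    if (parsed.matches()) {
        // The changed item, still embedded in the message prose
        System.debug(audit.Section + ': ' + parsed.group(1));
    }
}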

IMPORTANT NOTE: The only gap I can see so far is that the Delegate User (found in the Setup UI and CSV download) appears to be missing from the API at the moment. Yet it is documented as follows… ‘The Login-As user who executed the action in Setup. If a Login-As user didn’t perform the action, this field is blank.’

So what next?

  • Well, I see AuditForce is an example of an application tool that was created to better report and dashboard on this information; it could be enhanced to use this API now and remove the web scraping hack the developer (Daniel Peter) was forced to utilise back then.
  • With Apex support it’s also possible to write Apex Scheduled jobs that periodically scan for certain updates and apply your own rules and notifications (a sketch follows this list).
  • I was thinking it might also be useful to have a nice Lightning Component perhaps?
  • The Setup UI does not permit filtering or even sorting; it would probably not require too much coding to get something like the excellent Data Table component to show results of queries from this object (as featured in my List View API blog).
  • Finally, given this is available from the Salesforce REST API, it’s also feasible to aggregate audit information from, say, multiple sandboxes into a single console (something Daniel hints at in his AuditForce blog).
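
On the Apex Scheduled job idea above, a minimal sketch might look like the following; the class name, schedule and ‘rule’ are all hypothetical, so notify however suits you.

public class SetupAuditMonitor implements Schedulable {
    public void execute(SchedulableContext context) {
        // Scan yesterday's setup changes and apply your own rules / notifications
        for (SetupAuditTrail change :
                [SELECT Action, Section, Display, CreatedBy.Name, CreatedDate
                 FROM SetupAuditTrail
                 WHERE CreatedDate = YESTERDAY
                 ORDER BY CreatedDate DESC]) {
            // Replace with your own rule, e.g. email admins or post to Chatter
            System.debug(change.CreatedBy.Name + ' - ' + change.Section + ': ' + change.Display);
        }
    }
}
// Scheduled via anonymous Apex to run at 6am each day, for example:
// System.schedule('Setup Audit Monitor', '0 0 6 * * ?', new SetupAuditMonitor());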

 



Unit Testing, Apex Enterprise Patterns and ApexMocks – Part 2

In Part 1 of this blog series I introduced a new means of applying true unit testing to Apex code leveraging the Apex Enterprise Patterns, covering the differences between true unit testing vs integration testing and how the lines can get a little blurred when writing Apex test methods.

If you’re following along you should be all set to start writing true unit tests against your controller, service and domain classes, leveraging the inbuilt dependency injection framework provided by the Application class introduced in the last blog, and injecting mock implementations of service, domain, selector and unit of work classes accordingly.

What are Mock classes and why do I need them?

Depending on the type of class you’re unit testing you’ll need to mock different dependencies, so that you don’t have to worry about the data setup of those classes while you’re busy putting your hard work into testing your specific class.

Unit Testing

In object-oriented programming, mock objects are simulated objects that mimic the behavior of real objects in controlled ways. A programmer typically creates a mock object to test the behavior of some other object, in much the same way that a car designer uses a crash test dummy to simulate the dynamic behavior of a human in vehicle impacts. Wikipedia.

In this blog we are going to focus on an example unit test method for a Service, which requires that we mock the unit of work, selector and domain classes it depends on (unit tests for these classes will of course be written as well). Let’s take a look first at the overall test method, then break it down bit by bit. The following test method makes no SOQL queries or DML to accomplish its goal of testing the service layer method.

	@IsTest
	private static void callingServiceShouldCallSelectorApplyDiscountInDomainAndCommit()
	{
		// Create mocks
		fflib_ApexMocks mocks = new fflib_ApexMocks();
		fflib_ISObjectUnitOfWork uowMock = new fflib_SObjectMocks.SObjectUnitOfWork(mocks);
		IOpportunities domainMock = new Mocks.Opportunities(mocks);
		IOpportunitiesSelector selectorMock = new Mocks.OpportunitiesSelector(mocks);

		// Given
		mocks.startStubbing();
		List<Opportunity> testOppsList = new List<Opportunity> {
			new Opportunity(
				Id = fflib_IDGenerator.generate(Opportunity.SObjectType),
				Name = 'Test Opportunity',
				StageName = 'Open',
				Amount = 1000,
				CloseDate = System.today()) };
		Set<Id> testOppsSet = new Map<Id, Opportunity>(testOppsList).keySet();
		mocks.when(domainMock.sObjectType()).thenReturn(Opportunity.SObjectType);
		mocks.when(selectorMock.sObjectType()).thenReturn(Opportunity.SObjectType);
		mocks.when(selectorMock.selectByIdWithProducts(testOppsSet)).thenReturn(testOppsList);
		mocks.stopStubbing();
		Decimal discountPercent = 10;
		Application.UnitOfWork.setMock(uowMock);
		Application.Domain.setMock(domainMock);
		Application.Selector.setMock(selectorMock);

		// When
		OpportunitiesService.applyDiscounts(testOppsSet, discountPercent);

		// Then
		((IOpportunitiesSelector)
			mocks.verify(selectorMock)).selectByIdWithProducts(testOppsSet);
		((IOpportunities)
			mocks.verify(domainMock)).applyDiscount(discountPercent, uowMock);
		((fflib_ISObjectUnitOfWork)
			mocks.verify(uowMock, 1)).commitWork();
	}

First of all, you’ll notice the test method name is a little longer than you might be used to, and the general layout of the test splits code into Given, When and Then blocks. These conventions help add some documentation, readability and consistency to test methods, as well as helping you focus on what it is you’re testing and assuming to happen. The convention is one defined by Martin Fowler; you can read more about GivenWhenThen here. The test method name itself stems from a desire to express the behaviour the test is confirming.

Generating and using Mock Classes

UPDATE: Since the Apex Stub API was released you do not need this, see here!

The Java based Mockito framework leverages the Java runtime’s capability to dynamically create mock implementations. However the Apex runtime does not have any support for this. Instead ApexMocks uses source code generation to generate the mock classes it requires, based on the interfaces you defined in my earlier post.

The patterns library also comes with its own mock implementation of the Unit of Work for you to use, as well as some base mock classes for your selector and domain mocks (made known to the tool below). The following code at the top of the test method creates the necessary mock instances that will be configured and injected into the execution.

// Create mocks
fflib_ApexMocks mocks = new fflib_ApexMocks();
fflib_ISObjectUnitOfWork uowMock = new fflib_SObjectMocks.SObjectUnitOfWork(mocks);
IOpportunities domainMock = new Mocks.Opportunities(mocks);
IOpportunitiesSelector selectorMock = new Mocks.OpportunitiesSelector(mocks);

To generate the Mocks class used above, use the ApexMocks Generator; you can run it via the Ant tool. The apex-mocks-generator-3.1.2.jar file can be downloaded from the ApexMocks repo here.

<?xml version="1.0" encoding="UTF-8"?>
<project name="Apex Commons Sample Application" default="generate.mocks" basedir=".">

	<target name="generate.mocks">
		<java classname="com.financialforce.apexmocks.ApexMockGenerator">
			<classpath>
				<pathelement location="${basedir}/bin/apex-mocks-generator-3.1.2.jar"/>
			</classpath>
			<arg value="${basedir}/fflib-sample-code/src/classes"/>
			<arg value="${basedir}/interfacemocks.properties"/>
			<arg value="Mocks"/>
			<arg value="${basedir}/fflib-sample-code/src/classes"/>
		</java>
	</target>

</project>

You can configure the output of the tool using a properties file (you can find more information here).

IOpportunities=Opportunities:fflib_SObjectMocks.SObjectDomain
IOpportunitiesSelector=OpportunitiesSelector:fflib_SObjectMocks.SObjectSelector
IOpportunitiesService=OpportunitiesService

The generated mock classes are contained as inner classes in the Mocks.cls class and also implement the interfaces you define, just as the real classes do. You can choose to add the above Ant tool call into your build scripts, or simply retain the class in your org, refreshing it by re-running the tool whenever your interfaces change.

/* Generated by apex-mocks-generator version 3.1.2 */
@isTest
public class Mocks
{
	public class OpportunitiesService
		implements IOpportunitiesService
	{
		// Mock implementations of the interface methods...
	}

	public class OpportunitiesSelector extends fflib_SObjectMocks.SObjectSelector
		implements IOpportunitiesSelector
	{
		// Mock implementations of the interface methods...
	}

	public class Opportunities extends fflib_SObjectMocks.SObjectDomain
		implements IOpportunities
	{
		// Mock implementations of the interface methods...
	}
}

Mocking method responses

Mock classes are dumb by default, so of course you cannot inject them into the upcoming code execution and expect them to work. You have to tell them how to respond when called. They will however record when their methods have been called, for you to check or assert later. Using the framework you can tell a mock method what to return, or what exceptions to throw, when the class you’re testing calls it.

So in effect you can teach them to emulate their real counterparts. For example when a Service method calls a Selector method it can return some in-memory records as opposed to having to have them set up on the database. Or when the unit of work is used it will record method invocations as opposed to writing to the database.

Here is an example of configuring a Selector mock method to return test record data. Note that you also need to inform the Selector mock what type of SObject it relates to; this is also the case when mocking the Domain layer. Finally be sure to call startStubbing and stopStubbing around your mock configuration code. You can read much more about the ApexMocks API here, which resembles the Java Mockito API as well.

// Given
mocks.startStubbing();
List<Opportunity> testOppsList = new List<Opportunity> {
	new Opportunity(
		Id = fflib_IDGenerator.generate(Opportunity.SObjectType),
		Name = 'Test Opportunity',
		StageName = 'Open',
		Amount = 1000,
		CloseDate = System.today()) };
Set<Id> testOppsSet = new Map<Id, Opportunity>(testOppsList).keySet();
mocks.when(domainMock.sObjectType()).thenReturn(Opportunity.SObjectType);
mocks.when(selectorMock.sObjectType()).thenReturn(Opportunity.SObjectType);
mocks.when(selectorMock.selectByIdWithProducts(testOppsSet)).thenReturn(testOppsList);
mocks.stopStubbing();

TIP: If you want to mock sub-select queries returned from a selector take a look at this.

Injecting your mock implementations

Finally, before you call the method you’re wanting to test, ensure you have injected the mock implementations, so that the calls to the Application class factory methods will return your mock instances over the real implementations.

Application.UnitOfWork.setMock(uowMock);
Application.Domain.setMock(domainMock);
Application.Selector.setMock(selectorMock);

Testing your method and asserting the results

Calling your method to test is as straightforward as you would expect. If it returns values or modifies parameters you can assert those values. However the ApexMocks framework also allows you to add behavioural assertions that add further confidence the code you’re testing is working the way it should. In this case we want to assert, or verify (to use mocking speak), that the correct information was passed on to the domain and selector classes.

// When
OpportunitiesService.applyDiscounts(testOppsSet, discountPercent);

// Then
((IOpportunitiesSelector)
	mocks.verify(selectorMock)).selectByIdWithProducts(testOppsSet);
((IOpportunities)
	mocks.verify(domainMock)).applyDiscount(discountPercent, uowMock);
((fflib_ISObjectUnitOfWork)
	mocks.verify(uowMock, 1)).commitWork();

TIP: You can verify method calls have been made and also how many times. For example checking a method is only called a specific number of times can help add some level of performance and optimisation checking into your tests.

Summary

The full API for ApexMocks is outside the scope of this blog series, and frankly Paul Hardaker and Jessie Altman have done a much better job; take a look at the full list of documentation links here. Finally, keep in mind my comments at the start of this series: this is not to be seen as a total alternative to traditional Apex test method writing, merely another option to consider when you want a more focused means to test specific methods in more varied ways, without incurring the development and execution costs of having to set up all of your application’s data in each test method.



Controlling Internet Devices via Lightning Process Builder

Lightning Process Builder will soon become GA once the Spring’15 rollout completes in early February, just a few short weeks away as I write this. I don’t actually know where to start in terms of how huge and significant this new platform feature is! In my recent blog Salesforce evolves customization to a new level! over on the FinancialForce blog, I describe Salesforce as ‘the most powerful and productive cloud platform on the planet’. The more I get into Process Builder and how, as a developer, I can empower users of it, the more that statement starts to sound like an understatement!

There are many things getting me excited (as usual) about Salesforce these days; in addition to Process Builder and Invocable Actions (more on this later), it’s the Internet of Things. I just love the notion of inspecting and controlling devices no matter where I am on the planet. If you’ve been following my blog from earlier this year, you’ll hopefully have seen my exploits with the LittleBits cloud enabled devices and the Salesforce LittleBits Connector.

[Image: pointingdevice]

I have just spent a very enjoyable Saturday morning in my Spring’15 Preview org with a special build of the LittleBits Connector that leverages the ability for Process Builder to call out to specially annotated Apex code, which in turn calls out to the LittleBits Cloud API.

The result: a fully declarative way to connect to LittleBits devices from Process Builder! If you watch the demo from my past blog you’ll see my Opportunity Probability Pointer in action; the following implements the same process but using only Process Builder!

[Image: LittleBitsProcessBuilder]

Once Spring’15 has completely rolled out I’ll release an update to the Salesforce LittleBits Connector managed package that supports Process Builder, so you can try the above out. In the meantime if you have a Spring’15 Preview Org you can deploy direct from GitHub and try it out now!

UPDATE August 2015: It seems Process Builder still has some open issues binding Percent fields to Actions. Salesforce have documented a workaround to this via a formula field. Thus if you have a Percent field, please create a Formula field as follows and bind that to the Percent variable in Process Builder or Flow.

[Image: PercentFormulaWorkaround]

How can developers enhance Process Builder?

There are some excellent out of the box actions from which Process Builder or Flow Designer users can choose, as I have covered in past blogs. What is really exciting is how developers can effectively extend these actions.

While Salesforce has yet to provide a declarative means to make Web API callouts without code, a developer only needs to provide a bit of Apex code to make the above work. Salesforce have made it insanely easy to expose code to tools like Process Builder and also Visual Flow. Such tools dynamically inspect Apex code in the org (including that from AppExchange packages) and render a user interface for the Process Builder user to provide the necessary inputs (and map outputs if defined). All the developer has to do is use some Apex annotations.

global with sharing class LittleBitsActionSendToDevice {

	global class SendParameters {
		@InvocableVariable
		global String AccessToken;
		@InvocableVariable
        global String DeviceId;
		@InvocableVariable
        global Decimal Percent;
		@InvocableVariable
        global Integer DurationMs;
	}
	
    /**
     * Send percentages and durations to LittleBits cloud enabled devices
     **/
    @InvocableMethod(
    	Label='Send to LittleBits Device' 
    	Description='Sends the given percentage for the given duration to a LittleBits Cloud Device.')
    global static void send(List<SendParameters> sendParameters) {
        System.enqueueJob(new SendAsync(sendParameters));
    }	
}
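
The send method above hands the work off to a Queueable class so the HTTP callout happens asynchronously. SendAsync itself isn’t shown above; the following is a minimal sketch of what such a class might look like, with the endpoint and payload based on my understanding of the LittleBits Cloud API at the time, so treat the details as assumptions and refer to the connector’s GitHub repo for the actual implementation.

public class SendAsync implements Queueable, Database.AllowsCallouts {

    private List<LittleBitsActionSendToDevice.SendParameters> sendParameters;

    public SendAsync(List<LittleBitsActionSendToDevice.SendParameters> sendParameters) {
        this.sendParameters = sendParameters;
    }

    public void execute(QueueableContext context) {
        for (LittleBitsActionSendToDevice.SendParameters params : sendParameters) {
            // Assumed LittleBits Cloud API endpoint and payload shape
            HttpRequest request = new HttpRequest();
            request.setEndpoint(
                'https://api-http.littlebitscloud.cc/devices/' + params.DeviceId + '/output');
            request.setMethod('POST');
            request.setHeader('Authorization', 'Bearer ' + params.AccessToken);
            request.setHeader('Content-Type', 'application/json');
            request.setBody(JSON.serialize(new Map<String, Object> {
                'percent' => params.Percent,
                'duration_ms' => params.DurationMs }));
            new Http().send(request);
        }
    }
}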

I learnt quite a lot about writing Invocable Actions today and will be following up with some guidelines and thoughts on how I have integrated them with the Apex Enterprise Patterns Service Layer.



Calling Salesforce APIs from Ant Script – Querying Records

Back in June last year I wrote a blog entitled Look ma, no hands!; its main focus was how to leverage the then new ability to install and uninstall packages via the Metadata API. However there was another goal: I wanted to invoke Salesforce APIs using only native Ant script and 100% Java based Apache Ant tasks, so no Java coding or native curl executable invocations, making the resulting script platform neutral and easier to manage.

In this blog I’d like to talk a little bit more about how it was done and highlight the excellent <http> Ant task from Missing Link (so named since, surprisingly, Ant has yet to provide a core task for HTTP comms). In addition I wanted to share how I was able to recently extend this approach while working with one of FinancialForce.com‘s new up and coming DevOps team members, Brad Slater (also see Object Model Tool).

The goal once again was keeping it 100% Ant, this time invoking the Salesforce REST API to perform queries.

 		<!-- Query -->
 		<runQuery 
 			sessionId="${sessionId}" 
 			serverUrl="${serverUrl}" 
 			queryResult="accounts"
 			query="SELECT Id, Name FROM Account LIMIT 1"/>

As before the new Ant tasks are defined in a single XML file, ant-salesforce.xml; you can download the updated version with the new <runQuery> task and easily <import> it into your own Ant scripts.

Ant provides an excellent way to encapsulate complex script in components it calls Tasks. You can implement these in Java or in Ant script itself, using the <macrodef> Ant task. The following shows how the Salesforce <login> task was built for last year’s blog. You can see both the <http> and <xmltask> tasks in action.

	<!-- Login into Salesforce and return the session Id and serverUrl -->
	<macrodef name="login">
		<attribute name="username" description="Salesforce user name."/>
		<attribute name="password" description="Salesforce password."/>
		<attribute name="serverurl" description="Server Url property."/>
		<attribute name="sessionId" description="Session Id property."/>
		<sequential>
			<!-- Obtain Session Id via Login SOAP service -->
		    <http url="https://login.salesforce.com/services/Soap/c/29.0" method="POST" failonunexpected="false" entityProperty="loginResponse" statusProperty="loginResponseStatus">
		    	<headers>
		    		<header name="Content-Type" value="text/xml"/>
		    		<header name="SOAPAction" value="login"/>
		    	</headers>
		    	<entity>
		    		<![CDATA[
				    	<env:Envelope xmlns:xsd='http://www.w3.org/2001/XMLSchema' xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance' xmlns:env='http://schemas.xmlsoap.org/soap/envelope/'>
				    	    <env:Body>
				    	        <sf:login xmlns:sf='urn:enterprise.soap.sforce.com'>
				    	            <sf:username>@{username}</sf:username>
				    	            <sf:password>@{password}</sf:password>
				    	        </sf:login>
				    	    </env:Body>
				    	</env:Envelope>
		    		]]>
		    	</entity>
		    </http>
			<!-- Parse response -->
			<xmltask destbuffer="loginResponseBuffer">
				<insert path="/">${loginResponse}</insert>
			</xmltask>
			<if>
				<!-- Success? -->
				<equals arg1="${loginResponseStatus}" arg2="200"/>
				<then>
					<!-- Parse sessionId and serverUrl -->
					<xmltask sourcebuffer="loginResponseBuffer" failWithoutMatch="true">
						<copy path="/*[local-name()='Envelope']/*[local-name()='Body']/:loginResponse/:result/:sessionId/text()" property="@{sessionId}"/>
						<copy path="/*[local-name()='Envelope']/*[local-name()='Body']/:loginResponse/:result/:serverUrl/text()" property="@{serverUrl}"/>
					</xmltask>
				</then>
				<else>
					<!-- Parse login error message and fail build -->
					<xmltask sourcebuffer="loginResponseBuffer" failWithoutMatch="true">
						<copy path="/*[local-name()='Envelope']/*[local-name()='Body']/*[local-name()='Fault']/*[local-name()='faultstring']/text()" property="faultString"/>
					</xmltask>
					<fail message="${faultString}"/>
				</else>
			</if>
		</sequential>
	</macrodef>

The <runQuery> task further leverages the <http> task to make a call to the Salesforce REST API query endpoint.

	<!-- Provides access to the Salesforce REST API for a SOQL query -->
	<macrodef name="runQuery" description="Run database query">
		<attribute name="sessionId" description="Session Id from the login task."/>
		<attribute name="serverUrl" description="Server Url from the login task."/>
		<attribute name="query" description="SOQL query to run."/>
		<attribute name="queryResult" description="Query result property name"/>
		<sequential>
			<!-- Extract host/instance name from the serverUrl returned from the login response -->
			<propertyregex property="host"
              input="${serverUrl}"
              regexp="^((http[s]?|ftp):\/)?\/?([^:\/\s]+)((\/\w+)*\/)([\w\-\.]+[^#?\s]+)(.*)?(#[\w\-]+)?$"
              select="\3"
              casesensitive="false" />			
			<!-- Execute Apex via REST API /query resource -->
		    <http url="https://${host}/services/data/v29.0/query" method="GET" entityProperty="queryResultResponse" statusProperty="loginResponseStatus" printrequestheaders="false" printresponseheaders="false">
		    	<headers>
		    		<header name="Authorization" value="Bearer ${sessionId}"/>
		    	</headers>
		    	<query>
		    		<parameter name="q" value="@{query}"/>
		    	</query>
		    </http>		
		    <property name="@{queryResult}" value="${queryResultResponse}"/>
		</sequential>
	</macrodef>

Put together, the two tasks work very well with each other, allowing you to login and pass the resulting Session Id to the runQuery task, then parse the results according to your needs with a small piece of inline JavaScript to process the resulting JSON. The user of these tasks is blissfully unaware of some of the more advanced Ant script approaches used to implement them, which is how things should be when providing good Ant tasks.

<project name="demo" basedir="." default="demo">

    <!-- Import login properties -->
    <property file="${basedir}/build.properties"/>    
    <!-- Import new Salesforce tasks -->
    <import file="${basedir}/lib/ant-salesforce.xml"/>
    
    <!-- Query task demo -->	
    <target name="demo">
    
    	<!-- Login -->
 		<login 
 			username="${sf.username}" 
 			password="${sf.password}" 
 			serverurl="serverUrl" 
 			sessionId="sessionId"/>
 			
 		<!-- Query -->
 		<runQuery 
 			sessionId="${sessionId}" 
 			serverUrl="${serverUrl}" 
 			queryResult="accounts"
 			query="SELECT Id, Name FROM Account LIMIT 1"/>
 		
 		<!-- Parse JSON result via JavaScript eval -->
 		<script language="javascript">
			var response = eval('('+project.getProperty('accounts')+')');
			project.setProperty('Name', response.records[0].Name);
			project.setProperty('Id', response.records[0].Id);
		</script>
		
		<!-- Dump results -->
		<echo message="Queried Account '${Name}' with Id ${Id}"/>
		
    </target>   
     
</project>

Here is a more complex example processing more than one record, via an Ant macro called for each record.

 		<!-- Query -->
 		<runQuery 
 			sessionId="${sessionId}" 
 			serverUrl="${serverUrl}" 
 			queryResult="accounts"
 			query="SELECT Id, Name FROM Account"/>

		<!-- Ant macro called for each Account retrieved -->
	    <macrodef name="echo.account">
	    	<attribute name="id"/>
	    	<attribute name="name"/>
	    	<sequential>
				<!-- Process for each account -->
		    	<echo message="Queried Account '@{name}' with Id @{id}"/>
	    	</sequential>		    	
	    </macrodef> 		
 		
 		<!-- Parse JSON result via JavaScript eval and call above Ant macro -->
 		<script language="javascript">
			var response = eval('('+project.getProperty('accounts')+')');
			for(var idx in response.records)
			{
				var processRecord = project.createTask("echo.account");
                processRecord.setDynamicAttribute("id", response.records[idx].Id);
                processRecord.setDynamicAttribute("name", response.records[idx].Name);
                processRecord.execute();
			}
		</script>

Ant is not just for build systems or developers; it can be used quite effectively for many automation tasks. You could create an Ant script that polls for certain activity in your Salesforce org and invokes some application or more complex process, for example. Ant has a huge array of tasks and massive community support; it’s a good skill to learn for cross platform scripting and I’ve frankly found very little it cannot do these days.

So you may be wondering, why ever use a Java based Ant task or process again to implement your complex Ant and Salesforce integrations? Well… you may still want to go down the Java coding route if your needs are more complex or if you’re not comfortable with Ant scripting. Indeed in the case above, the project morphed into something much more complex and we ended up in Java after all. As always, choose your tools for the job according to time, resources, skills and complexity. Hopefully this blog has given you another option in your tool belt to consider!