Andy in the Cloud

From BBC Basic to Force.com and beyond…


User Notifications with the Utility Bar API

In this blog, I want to highlight a couple of great UI features provided by the Utility Bar in Lightning Experience. These are relatively new and accessed only via the Utility Bar API, so they are not immediately discoverable. This blog is based on code and material I prepared for Dreamforce 2017. However, I did not have time to dig into the code during that session, so this blog provides that opportunity. My session also covered other cool features in Lightning Experience, such as the amazing App Console mode!

Enabling and Understanding the Utility Bar API
The utility bar API is enabled at a component level, though it does have access to the whole utility bar. You can specify the lightning:utilityBarAPI component in any component, regardless of whether it is in the utility bar or not. This component does not display anything, but it does have a very useful selection of methods!

<lightning:utilityBarAPI aura:id="utilitybar"/>

In your component code you simply access it like any other component.

var utilityAPI = cmp.find("utilitybar");

Once you have access to an instance of the component, you can call any of its methods. All methods take a utilityId parameter, although if you call them from a component running in the utility bar you can omit it and the API will discover it for you. All the methods take a single JavaScript object whose properties represent the parameters to the method.

utilityAPI.setPanelHeaderLabel({ label: "My Label" });

One interesting design aspect of these methods is that they do not respond immediately; all responses are returned via a callback. To achieve this the API uses the JavaScript Promises pattern. Fortunately, it's a pretty easy convention to pick up and use, and it is worth taking the time to understand, as it has fast become the de facto callback approach.

Providing Notifications

There are many occasions when you want to notify the user of something that has happened, either since they last logged in or while they are logged in, as a result of some background process. The setUtilityHighlighted method is a good way to drive such notifications.

You can, of course, evaluate this on initialization of your component, but it is worth considering Platform Events: they are really easy to send from your Apex code or Process Builder, and you can easily integrate my Streaming API component to respond to the event. The code below is a very simple, isolated example using browser timers, but it helps illustrate the API and gives you a basis to build one; a sketch of publishing a Platform Event from Apex follows it.

<lightning:button 
   class="slds-m-around_medium" 
   label="{! v.readNotification ? 'Mark as Read' : 'Wait' }" 
   onclick="{!c.demoNotifications}"/>
<aura:if isTrue="{!v.readNotification}">
   <ui:message title="Confirmation" severity="info">
      This is a confirmation message.</ui:message>
</aura:if>
demoNotifications: function (cmp, event) {
    var utilityAPI = cmp.find("utilitybar");
    var readNotification = cmp.get('v.readNotification');
    if (readNotification == true) {
        utilityAPI.setUtilityHighlighted({ highlighted : false });
        cmp.set('v.readNotification', false);
    } else {
        utilityAPI.minimizeUtility();
        setTimeout($A.getCallback(function () {
            utilityAPI.setUtilityHighlighted({ highlighted : true });
            cmp.set('v.readNotification', true);
        }), 3000);
    }
},
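
For the Platform Event approach mentioned above, publishing the event from Apex only takes a couple of lines. The following is a minimal sketch; the Notification__e event and its Message__c field are hypothetical examples you would define to suit your own application.

// Publish a hypothetical Notification__e platform event from Apex.
// A component in the utility bar can subscribe to it (for example via the
// Streaming API component mentioned above) and call setUtilityHighlighted.
Notification__e notification = new Notification__e(
    Message__c = 'Background processing has completed');
Database.SaveResult result = EventBus.publish(notification);
if (!result.isSuccess()) {
    for (Database.Error error : result.getErrors()) {
        System.debug('Error publishing event: ' + error.getMessage());
    }
}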

Providing Progress Updates

By using a combination of setUtilityLabel and setUtilityIcon you can create an eye-catching progress updating effect. This sample is a pretty simple browser-timer-based example. However, you could again use Platform Events to send events as part of a Batch Apex execution to report progress, or just poll the AsyncApexJob object (a sketch of such a poll follows the example below).

 <lightning:button 
    class="slds-m-around_medium" 
    label="{! v.isProgressing ? 'Stop' : 'Start' }" 
    onclick="{!c.demoProgressIndicator}"/>
 <lightning:progressBar 
    value="{! v.progress }" size="large" />
demoProgressIndicator: function (cmp, event) {
    var utilityAPI = cmp.find("utilitybar"); 
    if (cmp.get('v.isProgressing')) {
        // stop
        cmp.set('v.isProgressing', false);
        cmp.set('v.progress', 0);
        cmp.set('v.progressToggleIcon', false);
        clearInterval(cmp._interval);
        utilityAPI.setUtilityLabel({ label : 'Utility Bar API Demo' });                    
        utilityAPI.setUtilityIcon({ icon : 'thunder' } );                                    
    } else {
        // start
        cmp.set('v.isProgressing', true);
        utilityAPI.minimizeUtility();        
        cmp._interval = setInterval($A.getCallback(function () {
            var progressToggleIcon =
               cmp.get('v.progressToggleIcon') == true ? false : true;
            var progress = cmp.get('v.progress');
            cmp.set('v.progress', progress === 100 ? 0 : progress + 1);
            cmp.set('v.progressToggleIcon', progressToggleIcon);
            utilityAPI.setUtilityLabel(
                { label : 'Utility Bar API Demo (' + progress + '%)' });
            utilityAPI.setUtilityIcon(
                { icon : progressToggleIcon == true ? 'thunder' : 'spinner' });
        }), 400);
    }
}
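
If the work is being done by Batch Apex, an alternative to the browser timer is to poll the AsyncApexJob object from an Apex controller method and feed the result into setUtilityLabel and lightning:progressBar. The following is a minimal sketch; the class and method names here are illustrative and not part of the sample code.

public with sharing class BatchProgressController
{
    // Returns the percentage complete of a Batch Apex job using the
    // standard AsyncApexJob fields
    @AuraEnabled
    public static Integer getJobProgress(Id jobId)
    {
        AsyncApexJob job = [
            SELECT Status, JobItemsProcessed, TotalJobItems
            FROM AsyncApexJob
            WHERE Id = :jobId];
        if (job.TotalJobItems == null || job.TotalJobItems == 0)
        {
            return 0;
        }
        return (job.JobItemsProcessed * 100) / job.TotalJobItems;
    }
}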

Summary

There is still plenty to dig into in the code samples from the session. You can also deploy the sample code into an org and try out some of the other interactive API demos. Enjoy!

Ideas for Apex Enterprise Patterns Dreamforce 2013 Session!

Update: Dreamforce is over for another year! Thanks to everyone who supported me and came along to the session. Salesforce have now uploaded a recording of the session here, and you can find the slides here.

As part of this year's Dreamforce 2013 event, I will once again be running a session on Apex Enterprise Patterns, following up on my recent series of developer.force.com articles. Here is the current abstract for the session; comments welcome!

Building Strong Foundations: Apex Enterprise Patterns. “Any structure expected to stand the test of time and change needs a strong foundation! Software is no exception; engineering your code to grow in a stable and effective way is critical to your ability to rapidly meet the growing demands of users, new features, technologies and platform capabilities. You will take away architect-level design patterns to use in your Apex code to keep it well factored, easier to maintain and compliant with platform best practices. Based on a Force.com interpretation of Martin Fowler's Enterprise Application Architecture patterns and the practice of Separation of Concerns.” (Draft)

I’ve recently started to populate a dedicated Github repository that contains only the working sample code (with the library code in a separate repo), so that I can build out a real working sample application illustrating the patterns in action in a practical way. It already covers a number of features and use cases, such as…

  • Layering Apex logic by applying Separation of Concerns
  • Visualforce controllers and the Service Layer
  • Triggers, validation, defaulting and business logic encapsulation via Domain layer
  • Applying object-oriented programming inheritance and interfaces via Domain layer
  • Managing DML and automatic relationship ‘stitching’ when inserting records via Unit Of Work pattern
  • Factoring, encapsulating and standardising SOQL query logic via Selector layer

The following are ideas I’ll be expanding on in the sample application in preparation for the session…

  • Batch Apex and Visualforce Remoting (aka JavaScript callers) and the Service Layer
  • Apex testing without SOQL and DML via the Domain Layer
  • Exposing a custom application API, such as REST API or Apex API via Service Layer
  • Reuse and testing SOQL query logic in Batch Apex context via Selector Layer
  • Rich client MVC frameworks such as AngularJS and Server-Side SOC

What do you think and what else would you like to see and discuss in this session?

Feel free to comment on this blog below, tweet me, log it on Github or however else you can get in touch.


Batch Worker, Getting more done with less work…

Batch Apex has been around on the platform for a while now, but I think it's fair to say there is still a lot of mystery around it, and with that a few baked-in assumptions. One such assumption I see being made is that it is driven by the database; specifically, that the records within the database determine the work to be done.

As such, if you have some work you need to get done that won't fit in the standard governors and it's not immediately database driven, Batch Apex may get overlooked in favour of @future, which on the surface feels like a better fit as its design is not database linked in any way. Your code is just an annotation away from getting the additional power it needs! So why bother with the complexities of Batch Apex?

Well, for starters, Batch Apex gives you an ID to trace the work being done, and thus the key to improving the user experience while the user waits. Secondly, if any of the parameters to such methods are lists or arrays, you are already having to consider scalability again. Yes, you say, but it's more fiddly than @future, isn't it?

In this blog I'm going to explore a cool feature of Batch Apex that often gets overlooked: using it to implement a worker pattern, giving you the kind of usability @future offers with the additional scalability and traceability of Batch Apex, without all the work. If you're not interested in the background, feel free to skip to the Batch Worker section below!

IMPORTANT NOTE: The alternative approach described here is not designed as a replacement for using Batch Apex against the database via QueryLocator. Using QueryLocator gives access to 50m records, whereas the Iterator usage gives access to only 50k. Thus the use cases for the Batch Worker are more aligned with smaller jobs, perhaps driven by end user selections, or with stitching together complex chunks of work.

Well I didn’t know that! (#WIDKT)

First, let's review something you may not have realised about implementing Batch Apex. The start method can return either a QueryLocator or something called Iterable. You can implement your own iterators, but what is not that widely known is that Apex collections/lists implement Iterable by default!

Iterable<String> i = new List<String> { 'A', 'B', 'C' };

With this knowledge, implementing Batch Apex to iterate over a list is now as simple as this…

public with sharing class SimpleBatchApex implements Database.Batchable<String>
{
	public Iterable<String> start(Database.BatchableContext BC)
	{
		return new List<String> { 'Do something', 'Do something else', 'And something more' };
	}

	public void execute(Database.BatchableContext info, List<String> strings)
	{
		// Do something really expensive with the string!
		String myString = strings[0];
	}

	public void finish(Database.BatchableContext info) { }
}

// Process the Strings one by one, each with its own governor context
Id jobId = Database.executeBatch(new SimpleBatchApex(), 1);

The second parameter of the Database.executeBatch method determines how many items from the list are passed to each execute method invocation made by the platform. To get the maximum governors per item and match that of a single @future call, this is set to 1. We can also implement Batch Apex with a generic data type known as Object, which allows you to process different types of work or actions in one job; more about this later.

public with sharing class GenericBatchApex implements Database.Batchable<Object>
{
	public Iterable<Object> start(Database.BatchableContext BC) { return null; }

	public void execute(Database.BatchableContext info, List<Object> listOfAnything) { }

	public void finish(Database.BatchableContext info) { }
}

A BatchWorker Base Class

The above simplifications are good, but I wanted to further model the type of flexibility @future gives without dealing with the Batch Apex mechanics each time. In designing the BatchWorker base class used in this blog, I wanted to make its use as easy as possible. I'm a big fan of the fluent API model, and if you look closely you'll see elements of that here as well. You can view the full source code for the base class here; it is quite a small class, extending the concepts above into a more generic Batch Apex implementation. The outline below gives a feel for its general shape.
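
The following is a rough sketch of how such a base class might be put together, based only on the usage shown in this blog; refer to the linked source code for the actual implementation.

public abstract with sharing class BatchWorker implements Database.Batchable<Object>
{
	// Work items queued via addWork, processed one per execute scope
	protected List<Object> workItems = new List<Object>();

	// Id of the batch job started by run()
	public Id BatchJobId {get; private set;}

	// Queue a unit of work, returning this to allow fluent chaining
	public virtual BatchWorker addWork(Object work)
	{
		workItems.add(work);
		return this;
	}

	// Start the batch job, processing one work item per execute
	public virtual BatchWorker run()
	{
		BatchJobId = Database.executeBatch(this, 1);
		return this;
	}

	public Iterable<Object> start(Database.BatchableContext ctx)
	{
		return workItems;
	}

	public void execute(Database.BatchableContext ctx, List<Object> scope)
	{
		// Delegate each work item to the subclass
		for(Object work : scope)
			doWork(work);
	}

	public void finish(Database.BatchableContext ctx) { }

	// Subclasses implement this to process a single work item
	public abstract void doWork(Object work);
}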

First, let's take another look at the string example above, but this time using the BatchWorker base class.

public with sharing class MyStringWorker extends BatchWorker
{
	public override void doWork(Object work)
	{
		// Do something really expensive with the string!
		String myString = (String) work;
	}
}

// Process the Strings one by one, each with its own governor context
Id jobId =
	new MyStringWorker()
            .addWork('Do something')
            .addWork('Do something else')
            .addWork('And something more')
            .run()
            .BatchJobId;

Clearly not everything is as simple as passing a few strings; after all, @future methods can take parameters of varying types. The following is a more complex example showing a ProjectWorker class. Imagine this is part of a Visualforce controller method where the user is presented with a selection of projects to process over a date range.

	// Create worker to process the project selection
	ProjectWorker projectWorker = new ProjectWorker();
		
	// Add the work to the project worker
	for(SelectedProject selectedProject : selectedProjects)		
		projectWorker.addWork(startDate, endDate, selectedProject.projectId);
			
	// Start the worker and retain the job Id to provide feedback to the user
	Id jobId = projectWorker.run().BatchJobId;		

Here is how the ProjectWorker class has been implemented. Once again it extends the BatchWorker class, but this time it provides its own addWork method, which takes the parameters as you would normally describe them and then internally wraps them up in a worker data class. The caller of the class, as you've seen above, is not aware of this.

public with sharing class ProjectWorker extends BatchWorker
{	
	public ProjectWorker addWork(Date startDate, Date endDate, Id projectId)
	{
		// Construct a worker object to wrap the parameters		
		return (ProjectWorker) super.addWork(new ProjectWork(startDate, endDate, projectId));
	}
	
	public override void doWork(Object work)
	{
		// Parameters
		ProjectWork projectWork = (ProjectWork) work;
		Date startDate = projectWork.startDate;
		Date endDate = projectWork.endDate;
		Id projectId = projectWork.projectId;		
		// Do the work
		// ...
	}
	
	private class ProjectWork
	{
		public ProjectWork(Date startDate, Date endDate, Id projectId)
		{
			this.startDate = startDate;
			this.endDate = endDate;
			this.projectId = projectId;
		}
		
		public Date startDate;
		public Date endDate;
		public Id projectId;
	}
}

As a final example, recall the fact that Batch Apex can process a list of a generic data type. The BatchWorker base class uses this to permit the varied implementations above. It can also be used to create a worker class that can do more than one thing, the equivalent of implementing two @future methods, except that it is managed as one job.

public with sharing class ProjectMultiWorker extends BatchWorker 
{
	// ...

	public override void doWork(Object work)
	{
		if(work instanceof CalculateCostsWork)
		{
			CalculateCostsWork calculateCostsWork = (CalculateCostsWork) work;
			// Do work 
			// ...					
		}
		else if(work instanceof BillingGenerationWork)
		{
			BillingGenerationWork billingGenerationWork = (BillingGenerationWork) work;
			// Do work
			// ...		
		}
	}
}

// Process the selected Project 
Id jobId = 
	new ProjectMultiWorker()
		.addWorkCalculateCosts(System.today(), selectedProjectId)
		.addWorkBillingGeneration(System.today(), selectedProjectId, selectedAccountId)
		.run()
		.BatchJobId;

Summary

Hopefully I've provided some insight into new ways to access the power and scalability of Batch Apex for use cases you may not have previously considered, or for which you perhaps used the less flexible @future annotation. Keep in mind that using Batch Apex with Iterators does reduce the number of items it can process to 50k, as opposed to the 50m when using a database query locator. At the end of the day, if you have more than 50k work items you probably want to go down the database-driven route anyway. I've shared all the code used in this article, and some I've not shown, in this Gist.

Post Credits
Finally, I'd like to give a nod to a past work associate of mine, Tony Scott, who has taken this type of approach down a similar path, but added process control semantics around it. Check out his blog here!