Andy in the Cloud

From BBC Basic to Force.com and beyond…



Lightning Out: Components on Any Platform

This blog is my first video blog! Since Salesforce does not record the Developer Theatre sessions at the Salesforce World Tour events, I thought I would do a re-run at home of my session last week and publish it here. As you know I have a love for all things APIs, and while I typically focus in this blog on backend APIs, there is one I’ve been keen to explore for a while…

The Lightning Out API, as any good API should, brings great promise, and delivers on it I’m pleased to say, further integrating and extending the power of the platform and generally simplifying our users’ lives. In this case boldly going where no Lightning Component has gone before…

You can access the slides and thus the links within via this Slideshare upload.



Disabling Trigger Events in Apex Enterprise Patterns

I’m proud to host my first guest blogger, Chris Mail, or Autobat as he is known on GitHub. Take it away, Chris…

How to put the safety on…

Being an architect in a professional services organisation is a funny game. Each project is either a shiny new Salesforce instance without a fingerprint on it or an unknown vault of code and configuration that we must navigate through.

I have been using the fflib pattern now for some time, and more of our teams are adopting it for our programs of work. My latest addition is something an architect might wonder why we need: the ability to turn off triggers via a simple interface on all domains.

In an ever-growing, complex environment, perhaps with multiple projects delivering iterative enhancements over time, I was noticing a common piece of code being developed within the Domain layer. It looked something along the lines of this:

public override void onAfterInsert()
{
    // if this is set we are already in a loop and want to exit!
    if(bProhibitAfterInsertTrigger)
    {
        return;
    }
    // down here we do something, maybe insert an Account!
}

While small and inconspicuous, it allowed our code base to become inconsistent, as there was no control over the exposure of these controlling flags and, worse, we were repeating ourselves in every domain!

The solution was simple: a fluent-style API within fflib_SObjectDomain. Any code can now simply set the control flags for any domain class:

fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableAll(); // don't fire anything
fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableAllBefore();
fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableAllAfter();

fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableBeforeInsert();
fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableBeforeUpdate();
fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableBeforeDelete();

fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableAfterInsert();
fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableAfterUpdate();
fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableAfterDelete();
fflib_SObjectDomain.getTriggerEvent(YourDomain.class).disableAfterUndelete();

To enable again, just call the inverse, e.g. .enableAfterInsert(); etc.

While not every code base will need to use these flags, they allow you to quickly and easily control your trigger execution with a single line of code that all your development team can reuse and follow.
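
As a hedged illustration of how these flags might be used, the Apex test below disables a domain's trigger events while setting up data and then re-enables them (assuming enableAll() is the inverse of disableAll(), per the enable methods mentioned above); the AccountsDomain class name is hypothetical.

@IsTest
private class AccountsDomainDisableTest {

    @IsTest
    private static void insertsTestDataWithoutFiringDomainLogic() {

        // Switch off all trigger events for this (hypothetical) domain class
        fflib_SObjectDomain.getTriggerEvent(AccountsDomain.class).disableAll();

        // Data inserted here skips the domain class trigger handlers
        insert new Account(Name = 'Test Account');

        // Re-enable before exercising the behaviour under test
        fflib_SObjectDomain.getTriggerEvent(AccountsDomain.class).enableAll();

        // ... perform the actual test and assertions here ...
    }
}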



Monitoring Record Activity via Data Replication APIs

ReplicateAPI.png

You might be thinking, having started to read this blog, that the primary use case of the Data Replication APIs is to provide replication. Well, yes and no! No in the sense that they won’t replicate your data for you; they won’t even return your data. What they will do is tell you which record Ids have been updated or deleted within a certain time frame. What happens next is up to you; you don’t even have to do any replication if you don’t want to!

As someone who loves to keep things as native as possible, when answering this StackExchange post I found it quite cool to discover that this API is actually available to Apex developers! The API consists of two Database class methods, getUpdated and getDeleted. The former returns new and updated records; it’s up to you to decide whether a record is new or existing, depending on whether you’ve seen its record Id before.

Note: The process around these APIs is explained in more detail in the SOAP Developers Guide, so it’s worth reading that along with the Apex Developers Guide references. The general polling process is described here and limits and considerations here.

The APIs could not be easier to use; the following shows how to pull a list of record Ids for updates made to records of a given object in the last hour.

Database.GetUpdatedResult r =
  Database.getUpdated(
    'MyObject__c', Datetime.now().addHours(-1), Datetime.now());
LastDateCovered.png

Salesforce documentation relating to the latestDateCovered field

Neither method appears to consume any query or DML governor limits, though as noted above they have some limits of their own. The GetUpdatedResult (described in more detail in the SOAP API documentation) contains a latestDateCovered field. This value should be retained and used as the start date/time in subsequent getUpdated calls.

It’s unclear what Salesforce means by safety in the documentation, though the note relating to long-running batch processes does make sense. This aspect highlights a key difference from attempting to work out what’s changed yourself based on the audit fields. The getDeleted method works in the same way.

It’s worth considering these APIs as a possible, lighter alternative to using Apex Triggers or UIs to invoke custom behaviour when records are manipulated. Unlike Triggers, you don’t know the specific changes, though Field Audit information could be queried if needed.

So be it replication to another cloud via an Apex HTTP callout, or simply monitoring record activity into some org-wide stats, having Apex support allows you to keep things native via an Apex Scheduled job that polls this API as often as you wish. With the advent of External Objects, aka Lightning Connect, other native possibilities start to present themselves…
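
As a minimal sketch of the polling approach described above, an Apex Scheduled job might look something like the following; the class name, the persistence helpers and processChangedRecords are placeholders for your own implementation, not part of the API.

public with sharing class RecordActivityMonitorJob implements Schedulable {

    public void execute(SchedulableContext context) {
        // Start from the date/time covered by the previous run, or the last hour on the first run
        Datetime startDate = loadLastDateCovered();
        if(startDate == null) {
            startDate = Datetime.now().addHours(-1);
        }

        // Ask the platform which records of this object changed in the window
        Database.GetUpdatedResult result =
            Database.getUpdated('MyObject__c', startDate, Datetime.now());

        // Hand the changed record Ids off to whatever processing you need
        processChangedRecords(result.getIds());

        // Retain latestDateCovered for the next poll, as the documentation advises
        saveLastDateCovered(result.getLatestDateCovered());
    }

    private static Datetime loadLastDateCovered() {
        // Read from wherever you choose to persist it (custom setting, custom object, etc.)
        return null;
    }

    private static void saveLastDateCovered(Datetime latestDateCovered) {
        // Write the value back for the next run
    }

    private static void processChangedRecords(List<Id> changedIds) {
        // e.g. update org-wide stats or queue a callout to another cloud
        System.debug(changedIds.size() + ' records updated.');
    }
}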



Apex Sharing and applying to Apex Enterprise Patterns

Apex Sharing can be a bit of a mystery to new developers, as well as to seasoned ones from other platforms. This blog is not for those wanting to understand sharing as such; there are plenty of excellent articles and Salesforce docs on that. Here I wanted to talk about how I first came to understand it and how it fits into Apex Enterprise Patterns.

I recall one really basic thing that took me by surprise: the name, Sharing? Of course this is an end-user-oriented way of describing what, as an engineer, I effectively understood as row-level security. I was also blown away to learn that this applies both inside and outside of code, for example when reporting is used, very cool! Row-level security is certainly for me a more accurate way to describe it, and it has certainly helped when talking to others who are new to the platform but have experience elsewhere.

The second thing I learnt is that in order to control it, it has to be considered in the way one annotates code at design time, rather than being a default runtime or configured-at-runtime context. Since sharing is not enforced by default in Apex (except for Anonymous Apex contexts), the developer needs to opt in to it. Salesforce helps remind us of this through tools like the Salesforce Security Scanner and the best practices here, well worth a read.

You may have noticed that the Apex Enterprise Patterns classes providing implementations of your Service layer always have with sharing specified. This sets the default context for all code executed from then on, in the Domain, Selector or other classes, to run in this mode. Such classes generally do not need to, and should not, qualify the with sharing or without sharing keywords themselves.

global with sharing class OpportunitiesService 
{		
	global static void applyDiscounts(Set<Id> opportunityIds, Decimal discountPercentage)
	{
		// This code and any it calls runs as 'with sharing'
	}
}

So what happens if you really want to run without sharing (there is a great article here on reasons for this)? Do you apply it to your Domain or Selector class definition? Well, actually neither, since not all the code in these classes may warrant sharing being disabled. What I prefer to do is keep the execution in this mode as short and contained as possible, to avoid any inadvertent execution of other code in this mode.

The basic approach is to leverage an inner class that contains just the code that needs to run without sharing. Typically this code lives in the Selector layer, though the approach can be used elsewhere, inside a service method implementation or a domain class method. The point is that it’s scoped to a method or a specific execution path.

public class OpportunitiesSelector extends fflib_SObjectSelector
{
    public List<Opportunity> selectById(Set<Id> idSet) {
        // This method simply runs in the sharing context of the caller
        // ...
        return opportunities;
    }

    public List<OpportunityInfo> selectOpportunityInfo(Set<Id> idSet) {	
        // Explicitly run the query in a 'without sharing' context
        return new SelectOpportunityInfo().selectOpportunityInfo(this, idSet);
    }

    private without sharing class SelectOpportunityInfo {
        public List<OpportunitiesSelector.OpportunityInfo> 
                 selectOpportunityInfo(OpportunitiesSelector selector, Set<Id> idSet) {
            // Execute the query as normal
            // ...
           return opportunityInfos;				
        }
    }
}

So do we still need to specify with sharing elsewhere? Well yes, on controllers it is certainly still good practice, and indeed Selectors can end up being called from these. I personally also consider any class that is invoked as an Apex entry point, such as Invocable Methods, Batch Apex, Scheduled Apex etc, to be in this category.

If you’re following a service-orientated design, most of these entry points delegate to the Service layer, so it feels like you’re doubling up at times, but that’s no bad thing where security is concerned. Finally, keep in mind that if you choose to expose your Service layer as an API, it is equally important to ensure the default sharing mode is enabled regardless of what mode the caller is running in.
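
For example, a Scheduled Apex entry point might look like the minimal sketch below; it explicitly states with sharing and then delegates to the OpportunitiesService shown earlier. The job class name and the selection query are illustrative assumptions.

public with sharing class ApplyDiscountsJob implements Schedulable {

    public void execute(SchedulableContext context) {
        // Entry point runs 'with sharing' and delegates to the Service layer,
        // which re-asserts the default sharing mode for everything it calls
        Set<Id> opportunityIds = new Map<Id, Opportunity>(
            [SELECT Id FROM Opportunity WHERE StageName = 'Negotiation/Review']).keySet();
        OpportunitiesService.applyDiscounts(opportunityIds, 10);
    }
}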

The general approach here is to enable sharing, and then make the code, the developer and the business/solution analyst justify why it needs to be switched off for a system-level operation that requires it. If you put aside the Apex Enterprise Patterns, this is in fact not that different from the general guideline of having with sharing on all your controllers; the main difference is that by putting it on your Service layer you’re ensuring not just your controller entry points are covered.

 



Supercharging Salesforce Report Subscriptions with Apex and Flow

In the Spring’15 release Salesforce provided the ability to subscribe to reports. This is done through a new Schedule button, shown below. Users can schedule reports with criteria based on the report contents. When the platform runs the report, the criteria get evaluated and, if met, result in notifications sent to the user. The usual Email and Chatter options are available, as well as Mobile Notifications for Salesforce1 mobile users, pretty cool!

ReportSubscribe

It is the last notification option that caught my interest, described simply as Execute Custom Action. This is in fact the gateway to many more possibilities, as it is essentially a piece of your own Apex code. When the report is run and the criteria are met, the platform will call your Apex code, a bit like an Apex Trigger but for Reports. Given the capabilities of Apex and what users can do with Salesforce Reports, that’s not just pretty cool, that’s super cool!

Subscription

Once the user saves the subscription, the platform creates a scheduled entry that can be seen and deleted by administrators under the All Scheduled Jobs page under Setup, classified as the Reporting Notification scheduled job type. The job will then run under the user context of the user that set up the subscription. Note that presently each user can only set up to 5 report subscriptions.

Creating a Custom Action with Apex

In order for your Apex class to appear in the above UI for selection, you must implement a platform-provided Apex interface. The Reports.NotificationAction interface is pretty simple, with just one execute method.

public with sharing class MyReportNotification 
	implements Reports.NotificationAction {

	public void execute(Reports.NotificationActionContext context) {
	    // Context parameter contains report instance and criteria
	    Reports.ReportResults results = context.getReportInstance().getReportResults();
	    // In the above subscription case this is the 'Record Count'
	    System.debug(context.getThresholdInformation().getEvaluatedConditions()[0].getValue());
	    // You can also access the report definition!
	    System.debug(results.getReportMetadata().getName());
	}
}

If you’re familiar with the Analytics Apex API, you’ll soon feel quite at home with the information provided by the context parameter. It passes in an instance of the report itself, that is the records of the report, as well as a reminder of the criteria behind the subscription (as entered in the UI above) and the aggregate values that caused the criteria to be met. It also includes the report definition (its metadata), useful if you wanted to write a generic handler, for example.

What about Clicks not Coders?

Now I love Apex coding, but I also know that this platform has not got where it has just from providing a programming language. Having a declarative means to deliver solutions rapidly with more pervasive skills is the cornerstone of what makes the platform so successful. Tools like Process Builder and Visual Flow are a great example of this.

So you may think Salesforce have missed an opportunity here, by not providing a means to use, for example, an Autolaunched Flow as a report subscription action. I personally think so, hence I’ve raised this Idea Exchange idea here, please upvote if you agree. Meanwhile, all is not lost! As I’ve previously blogged, there is a means to invoke Flows from Apex. So this got me wondering if Apex could be the glue between Salesforce Report Notifications and Flow?

Here is the result of a basic experiment of calling a Flow from an Apex Report Notification…

public with sharing class MyReportNotification 
	implements Reports.NotificationAction {

	public void execute(Reports.NotificationActionContext context) {
		// Context parameter contains report instance and criteria
		Reports.ReportResults results = context.getReportInstance().getReportResults();
		// Construct some parameters to pass to the Flow
		Map<String, Object> params = new Map<String, Object>();
		params.put('ReportName', results.getReportMetadata().getName());
		params.put('RecordCount', context.getThresholdInformation().getEvaluatedConditions()[0].getValue());
		// Call the Flow with the parameters
		Flow.Interview.MyReportNotificationFlow reportNotificationFlow = 
			new Flow.Interview.MyReportNotificationFlow(params);
		reportNotificationFlow.start();		 
	}
}

Implementation Note: The only drawback I can see so far is that sadly you cannot pass the whole context parameter “as is” into Flow, as its variable types are quite limited; to make this more generic you would have to be quite inventive about how you pass information.
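
One possible workaround, sketched below, is to serialise whatever you need from the context into a single JSON string and pass it through one Flow Text variable. This is only an illustration: the ContextJSON variable name is an assumption, and it presumes the MyReportNotificationFlow Flow declares a matching Text input variable.

public with sharing class MyReportNotificationJSON 
	implements Reports.NotificationAction {

	public void execute(Reports.NotificationActionContext context) {
		Reports.ReportResults results = context.getReportInstance().getReportResults();
		// Collapse selected context values into a single JSON string...
		Map<String, Object> contextValues = new Map<String, Object>{
			'reportName' => results.getReportMetadata().getName(),
			'recordCount' => context.getThresholdInformation().getEvaluatedConditions()[0].getValue()
		};
		// ...and pass it via one Flow Text variable (hypothetical variable name)
		Map<String, Object> params = new Map<String, Object>();
		params.put('ContextJSON', JSON.serialize(contextValues));
		new Flow.Interview.MyReportNotificationFlow(params).start();
	}
}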

The following shows the Autolaunched Flow called in the above example. It’s pretty simple, but achieves something the other notification options cannot, which is to create a Task. If you want to know more see my blog here.

ReportNotificationFlow

This results in a Task being created for the user, as shown in the screenshot below. There is certainly great potential in this type of approach, as it allows clicks-not-code builders to stretch their declarative skills without having to revisit the Apex code each time a new requirement or change comes along.

TaskCreatedByReportFlowNotification

Running, Testing and Debugging

SubscribeSaveAndRun

If you look closely at the UI above you’ll see the button Save & Run Now. As the name suggests, this will save the subscription details and run the report immediately. If the criteria are met, your Apex class will be called immediately as well. This is obviously pretty useful, as the most granular scheduling period is on the hour.

ScheduledSubscriptionReport

When I scheduled the subscription as a normal user would, I oddly found no evidence of it on the Apex Jobs page, despite a schedule appearing on the All Scheduled Jobs page. It’s pretty obvious the platform runs these async, so I would have expected some evidence of this…

During my testing I found the Debug Logs output as expected (in both execution modes above). One important note though: any exceptions thrown got swallowed when run from the UI, so always check your debug log if things are not working as expected.

Warning, Possible Platform Bug: I did find that if my Apex code threw an exception when I used the UI to run it, any prior schedules for the report subscription were deleted. I had to delete and recreate the subscription to get the schedule entry back. I’ll investigate this a bit further and raise a case with Salesforce.

Writing an Apex Test for Report Notification Actions

Writing Apex tests for the above is little different from the usual approach, the main exception being that the Analytics API requires Apex tests to use the SeeAllData attribute. This is clearly less than ideal, but is indeed documented as such in the formal Salesforce documentation.

In order to emulate the platform calling your execute method, you’ll need to mock the context parameter; fortunately, unlike some system types, the context parameter type can be constructed, nice!

@IsTest
private class MyReportNotificationTest {
	
	@IsTest(SeeAllData=true)
	private static void testMyReportNotification() {
	
		// Get an instance of the report
		Report report = [SELECT Id FROM Report WHERE DeveloperName = 'ClosedWon'];
		Test.startTest();
		Reports.ReportInstance reportInstance =
          Reports.ReportManager.runAsyncReport(report.Id, true);
		Test.stopTest();

		// Emulate the platform calling the custom action passing criteria that was met
		MyReportNotification myReportNotification = new MyReportNotification();		
		Reports.NotificationActionContext context = 
			new Reports.NotificationActionContext(
				reportInstance, 
				new Reports.ThresholdInformation( 
					new List<Reports.EvaluatedCondition> {
							/* new Reports.EvaluatedCondition(
								'RecordCount', 
								'Record Count', Double.valueOf(0), Double.valueOf(1), 
								Reports.EvaluatedConditionOperator.GREATER_THAN) */ }));
		myReportNotification.execute(context);

		// Assert accordingly...
		// ...
	}
}

Implementation Note: At the time of writing, I found that I could not create a mock instance of the Reports.EvaluatedCondition type without receiving a compilation error claiming the constructor was not found (hence why it’s commented out above). I’ll update this blog if I find out why. This issue will limit the testing of any criteria / threshold references you make in your code.

Summary

I love how Salesforce are opening up more of the platform to support Apex callbacks like this. It adds much more flexibility for developers to really extend the platform and not just build solutions on top of it. In this case it also provided some flexibility to plug a gap in the platform’s current feature set. Salesforce, please keep it up!

 

 



Setup Audit Trail API in Winter’16

Winter’16 was a bit low on API for me, but one thing I just found hiding in the New Objects section was this…

SetupAuditTrail, “Represents changes you or other administrators made in your organization’s Setup area.”

WBAudit

Wow, this is a BIG thing and opens up a lot of tooling and greater compliance support for changes around orgs. Prior to this, folks were so keen to get hold of this information programmatically that they would resort to web scraping it from the Salesforce Setup menu! So I set about giving it a spin via Developer Workbench and learning more about the object.
WBAuditResults

There is also much better data export support via tools such as DataLoader.io (and of course other tools)….

AuditDataLoaderIO

So in addition to the Salesforce APIs (used by the above tools), it is also available via Apex SOQL! The following selects all audit records relating to users from a certain email domain.

List<SetupAuditTrail> stuffDoneByConsultants = 
    [SELECT Id,
     	Action,
     	CreatedBy.Name,
     	CreatedDate,
     	Display,
     	Section 
     FROM SetupAuditTrail 
     WHERE CreatedBy.Email LIKE '%xyzconsulting.com%'];

So what else can we do with it? Well… not a lot more, sadly. Doing an Apex Describe shows that it’s basically only query-able, with no triggers, Replication API or indeed Streaming API support, which, while not a total surprise, would have been very cool!
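
Here is a quick snippet of the kind of Describe check referred to above, confirming the object only reports itself as queryable; the expected true/false output simply reflects what is described in this post.

// Inspect what the platform lets us do with SetupAuditTrail
Schema.DescribeSObjectResult describeResult = SetupAuditTrail.SObjectType.getDescribe();
System.debug('Queryable? ' + describeResult.isQueryable());   // true
System.debug('Createable? ' + describeResult.isCreateable()); // false
System.debug('Updateable? ' + describeResult.isUpdateable()); // false
System.debug('Deletable? ' + describeResult.isDeletable());   // false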

Age of Records?

Through the long standing CSV download facility under Setup, only 6 months’ worth of data is available. However, running queries in my various long standing orgs I’m seeing data going back as far as all time?!? There is no documentation to confirm this, and I honestly would not be surprised to see some limit here, so perhaps those of you running long standing production orgs can confirm via the comments below?

What are its fields?

AuditFields

As you can see, the SetupAuditTrail object’s fields are not the most ideal for interpreting the data, aside from the CreatedBy field (which supports relationship walking to the User object). The most interesting are Action, Section and Display; the latter is the one that actually contains the object, field, layout or whatever has changed. The challenge here is that it is embedded in a message and not called out separately, so there is a bit of parsing to be done.

IMPORTANT NOTE: The only gap I can see so far is that the Delegate User (found in the Setup UI and CSV download) appears to be missing from the API at the moment. Yet it is documented as follows… ‘The Login-As user who executed the action in Setup. If a Login-As user didn’t perform the action, this field is blank.’

So what next?

  • Well, I see AuditForce is an example of an application that was created to better report and dashboard on this information; that tool could be enhanced to use this API now and remove the current web scraping hack the developer (Daniel Peter) was forced to utilise back then.
  • With Apex support it’s also possible to write Apex Scheduled jobs that periodically scan for certain updates and apply your own rules and notifications (see the sketch after this list).
  • I was thinking it might also be useful to have a nice Lightning Component perhaps?
  • The Setup UI does not permit filtering or even sorting; it would probably not require too much coding to get something like the excellent Data Table component to show results of queries against this object (as featured in my List View API blog).
  • Finally, given this is available from the Salesforce REST API, it’s also feasible to aggregate audit information from, say, multiple sandboxes into a single console (something Daniel hints at in his AuditForce blog).
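
As referenced in the second bullet above, a periodic scan could look something like the following minimal sketch; the class name, the one-day window, the Section filter and the notifyAdmins helper are illustrative assumptions rather than anything prescribed by the API.

public with sharing class SetupAuditTrailScanJob implements Schedulable {

    public void execute(SchedulableContext context) {
        // Look for recent user management setup changes (time window and filter are illustrative)
        List<SetupAuditTrail> recentChanges =
            [SELECT Action, Section, Display, CreatedBy.Name, CreatedDate
             FROM SetupAuditTrail
             WHERE CreatedDate = LAST_N_DAYS:1
               AND Section = 'Manage Users'];

        // Apply your own rules and notifications here
        if(!recentChanges.isEmpty()) {
            notifyAdmins(recentChanges);
        }
    }

    private static void notifyAdmins(List<SetupAuditTrail> changes) {
        // e.g. post to Chatter, send an email or log to a custom object
        System.debug(changes.size() + ' setup changes detected.');
    }
}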

 



Tips for Migrating to Apex Enterprise Patterns

One of the common questions I get asked is, “How do I adopt Apex Enterprise Patterns within an existing code base?”.

Often it’s also folks who have not owned the code base from the start and are inheriting it. You’re also not likely to be blessed with a blank cheque of time to sort the code out before you’re asked to start adding new features or fixing issues. So how do you find time to crack open an old code base and start introducing Separation of Concerns?

Tip 1. Pick and choose and go incrementally! You can pick and choose which layer you want to work on first and incrementally split things out further. For example, Service layer implementations don’t have to use Unit of Work, Domain or Selector initially. Typically I find the Service layer the most valuable to get in place first. Going incrementally, perhaps over a few release iterations, across the tips below can help you and your team walk before you start to run with the patterns, and gain confidence and support for the approach across the team.

Tip 2. Get a basic Service layer going. Even if you don’t adopt any other aspect, try to look for opportunities to move code from controllers, batch classes etc into a Service layer. Remember that technically it is essentially a class ending in Service with some static methods in it; it’s the meaning and responsibility of it that’s important here. If you don’t have time to follow all the conventions, such as bulkification, don’t worry; by moving code into a Service you’ve already made a big step! For sure however, don’t apply global to your service until you’re happy with its methods.
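
As a rough sketch of what that first step might look like, the class and method below are purely illustrative (the InvoicesService name and invoiceIds parameter are assumptions); the point is simply that logic previously embedded in a controller or batch class now lives behind a Service method.

public with sharing class InvoicesService {

    // Logic moved out of a controller; callers pass in record Ids rather than UI state
    public static void issueInvoices(Set<Id> invoiceIds) {
        // ... code previously inlined in the controller goes here ...
    }
}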

Tip 3. Sweep out inlined SOQL. Again, if you’re not ready to use the base classes in the library here, just start naming classes with the Selector suffix and re-home your code performing SOQL into methods on that class. You won’t get all the benefits of consistency, but you will start getting those of reuse and encapsulation. That said, the Selector pattern does not really need you to have other things set up, so do consider extending the base class where you can; doing so for some Selectors and not others is also fine if some are a bigger challenge.
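
Purely as an illustration of that naming-first approach (no base class yet), the class below simply gathers a previously inlined query behind a Selector-suffixed class; the Invoice__c object and its fields are hypothetical.

public with sharing class InvoicesSelector {

    // Re-homed query, previously inlined in a controller or batch class
    public static List<Invoice__c> selectOverdue() {
        return [SELECT Id, Name, DueDate__c
                FROM Invoice__c
                WHERE DueDate__c < TODAY];
    }
}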

Tip 4. Move your Trigger code into a Domain class. The Domain class does support all the Apex Trigger events, and has several overrides to invoke code that applies to multiple events. So you should be able to find a home for your trigger code amongst the various overrides given by the base class. If you have got some recursive stuff going on, you’ll want to look at the stateful configuration in the Domain class.

You may want to start with this one over Tips 2 and 3 if your code is heavily Apex Trigger driven; it is often the best way to show other developers the value as well, as the code starts to become more OO and better factored, typically for little risk so long as you have good Apex tests in place. Finally, developers familiar with so-called Wrapper Patterns elsewhere in the community should warm to this pattern more quickly.
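
To make that concrete, here is a minimal hedged sketch of trigger code re-homed into a Domain class based on fflib_SObjectDomain; the Invoice__c object and the validation rule are illustrative, and the trigger itself reduces to the one-liner shown in the comment.

// Trigger body becomes: fflib_SObjectDomain.triggerHandler(Invoices.class);
public with sharing class Invoices extends fflib_SObjectDomain {

    public Invoices(List<Invoice__c> records) {
        super(records);
    }

    // Logic previously in the trigger body now lives in the relevant override
    public override void onValidate() {
        for(Invoice__c invoice : (List<Invoice__c>) Records) {
            if(invoice.DueDate__c == null) {
                invoice.DueDate__c.addError('Due date is required.');
            }
        }
    }

    public class Constructor implements fflib_SObjectDomain.IConstructable {
        public fflib_SObjectDomain construct(List<SObject> sObjectList) {
            return new Invoices(sObjectList);
        }
    }
}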

Tip 5. Add Application Factory and Mocking support incrementally. This is easier to add in areas of the solution where most of the patterns are already falling into place, so don’t rush into adding it too soon. Separation of Concerns is key to making mocking work, so don’t try to introduce this until you’ve got things factored out well enough. That said, you may find adding support just for mocking the Service layer gives you some initial boosts in writing tests around controllers, batch classes and the like. The Unit of Work does not need to be fully configured either; it’s fine to have only a subset of your objects, representative of the areas it’s used in.
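
If you do get as far as an Application factory (as in the sample code that accompanies the library), mocking just the Service layer in a controller test might look roughly like the sketch below; the Application class, the IInvoicesService interface and the controller being exercised are all assumptions rather than something your org will already have.

@IsTest
private class InvoicesControllerTest {

    // Hand-rolled mock implementation of the (hypothetical) service interface
    private class InvoicesServiceMock implements IInvoicesService {
        public Integer issueInvoicesCallCount = 0;
        public void issueInvoices(Set<Id> invoiceIds) {
            issueInvoicesCallCount++;
        }
    }

    @IsTest
    private static void controllerDelegatesToService() {
        // Substitute the mock via the Application factory (assumes fflib_Application usage)
        InvoicesServiceMock mockService = new InvoicesServiceMock();
        Application.Service.setMock(IInvoicesService.class, mockService);

        // Exercise the controller logic that should delegate to the service
        // ... e.g. new InvoicesController().issueSelected(); ...

        // Assert the controller delegated as expected
        // System.assertEquals(1, mockService.issueInvoicesCallCount);
    }
}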

So there you have it, some food for thought. I’m interested to know what approaches and processes others have ended up taking, and whether there are any neat shortcuts or even tooling options perhaps?



Great Contributions to Apex Enterprise Patterns!

Just one week before the start of Dreamforce 2015, the Apex Enterprise Patterns library will be 2 years old since it was first published. My community time is increasingly busy these days, not only answering questions around this and other GitHub repositories, but also reviewing Pull Requests. Both are a sign of a healthy open source project for sure! In this short blog I just wanted to call out some of the new features added by the community. So let’s get started…

  • Unit of Work, register method flexibility.
    The initial implementation of Unit Of Work used a List to contain the records registered with it. This meant that in some cases, if a code path registered the same record twice, an error would occur. This change means you don’t have to worry about this; if complex code paths happen to register the same record again, it will be handled without error. Thanks to: Thomas Fuda for this enhancement!
  • Unit of Work, customisable DML implementation via the IDML interface.
    This improvement allows you to implement the IDML interface to make your own calls to the platform’s DML methods. The use case that prompted this enhancement was to allow fine-grained control using the Database methods that permit options such as all-or-nothing control (the default being all). Thanks to: David Esposito for this enhancement!
  • Unit of Work and Application Factory, new newInstance(Set<SObjectType> types) method
    This enhancement provides the ability to leverage the factory but have it provide Unit of Work instances configured using a specific set of SObjectTypes rather than the default one, for cases where you have only a few objects to register, perhaps dynamically, or a set different from the default Application one for a specific use case. Please read the comments on this method for more details. Thanks to: John Davis for this enhancement!
  • Unit of Work, Eventing API
    New virtual methods have been added to the Unit of Work class, allowing those who want to subclass it to hook special handling code into commitWork. This allows you to extend, in a very custom way, your application or service’s own work at various stages: start, end and during the DML operations. For example, common validation that can only occur once everything has been written, but before the request ends. Thanks to: John Davis for this enhancement!
  • Unit of Work, Bulkified Register Methods
    It’s now possible to register lists of SObjects with the Unit of Work in one method call rather than one call per record. While the Unit of Work has always been internally bulkified, this enhancement helps callers who are dealing with lists or maps interact with the Unit Of Work more easily. Thanks to: MayTheSForceBeWithYou (now a FinancialForce employee) for this enhancement!
  • Selector, Better default Order By handling
    Not all Standard objects have a Name field; this excellent enhancement ensures that where this is the case, the base class will look for the next best field to sequence your records by default. Alex has also made numerous other tweaks to the Selector layer in addition to this, btw! Thanks to: Alex Tennant for this enhancement!

In addition to the above, there have been numerous behind-the-scenes improvements to the library as well, making it more stable and supporting various non-standard aspects of standard objects and such like. I based the above on GitHub’s report of commits to the repository here.

In addition to code changes, there are also some great discussions and ideas in the pipeline…

  • Unit of Work and Cyclic Dependencies
    This limitation doesn’t come up often, but in complex situations it causes fans of the UOW some pain, as they have to leave it behind when it does. This is a long-standing request by Seth Stone, but has seen a few attempts since via Alex Tennant and, more recently, some upcoming efforts by john-m in the pipeline. Watch this space!
  • Helper methods to fflib_SObjectDomain for Changed Fields
    Another Force.com MVP, Daniel Hoechst, suggested this feature, inspired by those present in other trigger frameworks; join the conversation here.
  • Support for undelete Trigger events to Domain layer
    This idea, raised by our good friends over at @MattAndNeil, is seeing some recent interest for contribution by Autobat; see here for more discussion.

Thank you one and all for being social and contributing to a growing community around this library!