It's been nearly nine years since I created my first Salesforce developer account. Back then I was leading a group of architects building on-premise enterprise applications with Java (J2EE) and Microsoft .NET. It is fair to say my decision to refocus my career not only on building the first accounting application in the cloud, but to do so on an emerging and comparatively prescriptive platform, was a risk. Although it has not been an easy ride (leading and innovating rarely is), it is a journey that has inspired me and shaped my perspective on successfully delivering enterprise applications.
Clearly, since 2008 things have changed a lot! For me though, it was in 2014, when Lightning struck, that the platform really started to evolve in a significant way. It has continued to evolve at an increasingly rapid pace, not just in the front-end architecture, but in the backend and, latterly, the developer tooling as well.
Component and Container Driven UI Development
Decomposing code into smaller reusable units is not exactly new, but it has arguably taken time to find its feet in the browser. By making Lightning Components the heart of their next-generation framework, Salesforce made decomposition and reuse the primary consideration and moved us away from monolithic, page-centric thinking. Components need a place to live! With the increase of usability features in the various Lightning containers, namely Experience, Mobile and Community, we are further encouraged to build once, run everywhere. The Lightning Design System has not only taken the legwork out of creating great UIs, but also brings with it often-forgotten aspects such as keyboard navigation and accessibility support.
Metadata Driven Solutions
Metadata has been at the heart of Salesforce since the beginning, driving it forward as a low-code or zero-code, high-productivity platform for creating solutions and applying customisations. When Salesforce created Custom Metadata, it enabled developers to deliver solutions that harness the same strengths that have made the platform so successful, driving productivity up and implementation effort and timescales down.
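As a sketch of how this plays out in code, a packaged solution might read its configuration from a Custom Metadata type rather than hard-coding it. The `FeatureSetting__mdt` type, its `IsEnabled__c` field and the `AdvancedBilling` record below are hypothetical names for illustration:

```apex
// Hypothetical Custom Metadata type: FeatureSetting__mdt
// SOQL queries against Custom Metadata types do not count against
// SOQL governor limits, so configuration can be read freely.
Map<String, Boolean> features = new Map<String, Boolean>();
for (FeatureSetting__mdt setting :
        [SELECT DeveloperName, IsEnabled__c FROM FeatureSetting__mdt]) {
    features.put(setting.DeveloperName, setting.IsEnabled__c);
}
if (features.get('AdvancedBilling') == true) {
    // Apply the customisation the subscriber org has switched on
}
```

Because Custom Metadata records deploy and package like metadata rather than data, subscriber orgs can tailor behaviour without code changes.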
Event Driven Architecture
Decomposition of processing is key to scalability and resiliency. While we often get frustrated with the governors, especially in an interactive/synchronous context, the reality is that safeguarding the server resources responsible for delivering a responsive UI to the user is critical. Batch Apex, future methods and Queueable Apex have long been the ways to manage async processing. With Platform Events, Salesforce has delivered a much more open and extensible approach to orchestrating processing within the platform as well as off platform. With a wealth of APIs for developers on and off platform, and tooling integration, EDA is now firmly integrated into the platform. Retry semantics with Platform Events are also a welcome addition to what had previously been left to the developer when utilising the aforementioned technologies.
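To illustrate the publish/subscribe and retry pattern, here is a minimal sketch using a hypothetical `Order_Event__e` platform event and a hypothetical `OrderService` class:

```apex
// Publish: delivery is decoupled from the publishing transaction
EventBus.publish(new Order_Event__e(OrderId__c = '801xx0000000001'));

// Subscribe via an after-insert Apex trigger on the event object
trigger OrderEventTrigger on Order_Event__e (after insert) {
    for (Order_Event__e event : Trigger.New) {
        try {
            OrderService.process(event.OrderId__c); // hypothetical service
        } catch (Exception e) {
            // Ask the platform to redeliver the event batch later,
            // rather than hand-rolling retry logic as we did with
            // Batch Apex or Queueable-based designs
            throw new EventBus.RetryableException(
                'Downstream unavailable, retrying: ' + e.getMessage());
        }
    }
}
```

Throwing `EventBus.RetryableException` tells the platform to re-fire the trigger with the unprocessed events, with the platform managing the backoff between attempts.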
Industry Standards and Integrations
Salesforce has always been strong in terms of its own APIs, the Enterprise and Partner APIs being the classic go-to APIs, now also available in REST form. With External Objects and External Services supporting the OData and Swagger industry standards, off-platform data sources and external APIs can now be consumed with much reduced implementation overhead, and without the user having to leave behind the value of the various platform tools or the latest Lightning user experience.
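For example, once an OData data source is configured, its external objects can be queried with SOQL much like any other object. The `Invoice__x` external object and its fields below are hypothetical:

```apex
// External objects carry the __x suffix; rows are fetched on demand
// from the external OData endpoint rather than stored in Salesforce.
List<Invoice__x> overdue =
    [SELECT ExternalId, Amount__c, DueDate__c
     FROM Invoice__x
     WHERE DueDate__c < :Date.today()
     LIMIT 50];
```

The same objects surface in list views, reports and Lightning pages, which is what keeps the user inside the platform experience.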
Open Tools and Source Driven Development
The tooling ecosystem has been a rich tapestry of storytelling and is still emerging. The main focus and desire has been to leverage other industry-standard approaches, such as Continuous Integration and Continuous Deployment, with varying degrees of success. With Salesforce DX going GA, the first wave of change is now with us, with the ability to define, create, manage and destroy development environments at will. More APIs and services now allow richer IDE experiences to be built in a more open and IDE-agnostic way. I am very much looking forward to the future of DX, especially the upcoming improvements around packaging.
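A minimal sketch of that create/work/destroy scratch-org workflow with the Salesforce DX CLI (the org aliases and project layout here are examples):

```shell
# Authorise a Dev Hub, then spin up a disposable scratch org
sfdx force:auth:web:login --setdefaultdevhubusername --setalias DevHub
sfdx force:org:create --definitionfile config/project-scratch-def.json \
    --setdefaultusername --setalias feature-scratch

# Push local source, develop and test, then destroy the org when done
sfdx force:source:push
sfdx force:org:delete --targetusername feature-scratch --noprompt
```

Because environments are now cheap and scriptable, the same commands slot naturally into a CI job that builds each change in a fresh org.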
Last but not least, many of the above advancements provide more secure, responsive and integrated options for leveraging the services and capabilities of other cloud application platforms. Heroku is the natural choice for those of us wanting to stay within the Salesforce ecosystem. With both Heroku Connect and Salesforce Connect (aka External Objects), integrating and synchronising data is now possible with much greater ease and reliability. Platform Events and External Services also both provide additional means for developers to connect the two platforms and take advantage of broader languages, libraries and additional compute resources. FinancialForce has open sourced an exciting new library, Orizuru, to assist in integrating Force.com and Heroku, which will be showcased for the first time at Dreamforce.
The above list is certainly not exhaustive when you consider Big Data (Big Objects), Analytics (Einstein/Wave Analytics) and of course AI/ML (Einstein Platform). It's a great time to be heading into my 5th Dreamforce, and I am sure the list will grow even further!
I will be presenting the following sessions at Dreamforce 2017:
- Optimize Your Business with Einstein Analytics and Discovery
- Up close and personal with Lightning Experience as Platform!
- Advanced Logging Patterns with Platform Events
- ISV Panel Discussion: Technical Strategy and Priorities
FinancialForce R&D is out in force once more; see the full session list here!
October 31, 2017 at 9:33 pm
Cool, looking forward to these sessions. Just out of curiosity, do you have any books on Lightning Components and their usability?
October 31, 2017 at 9:53 pm
Yep, two of the chapters in my second edition book cover Lightning, https://andyinthecloud.com/2017/04/01/force-com-enterprise-architecture-second-edition/. You should also go through Trailhead.
October 31, 2017 at 11:02 pm
I really agree with you Andy. I think in architectural terms the platform is offering a wide range of options, which we need to know and try to apply in the correct situations.
I'm writing as well on hybrid architectures using Salesforce Connect and Amazon (it could be any cloud), https://forcegraells.com/2017/10/30/salesforce-connect-amazon-rds-odata/, and the more I know, the more options I realise we have.
Again, thank you very much Andy. Unfortunately I am not going to Dreamforce (company restrictions), so I will look forward to reading about what you see there.
All the best,
November 1, 2017 at 5:36 pm
Thanks Esteve! Will take a look, thanks for sharing. Shame about DF, hopefully next year! 👍🏻
November 1, 2017 at 5:31 pm
Thanks Andy, as always, for sharing your thoughts, and looking forward to having you share your wisdom at DF17.
November 1, 2017 at 5:33 pm
Thanks Bill! Have a great DF!
November 2, 2017 at 11:21 am
Thanks for a great overview of the evolution of this platform. It's easy to get lost in the details sometimes; taking a high-level view shows just how much the platform has evolved towards a real enterprise solution in the last few years!
I'm curious, around EDA, what are you referring to when you mention retry semantics?
November 2, 2017 at 11:23 am
You're most welcome, it's a pretty exciting time indeed! More info on the retry semantics: https://developer.salesforce.com/docs/atlas.en-us.platform_events.meta/platform_events/platform_events_subscribe_apex_refire.htm
November 2, 2017 at 12:27 pm
Thanks, great stuff! I don't know how I didn't find this before!
See you at #DF17!