Oh my god. It's full of code!


Shameless plug

Hey all,

I finally set up my buymeacoffee page. If I've helped you and you'd like to show a little extra appreciation beyond your readership, consider a small contribution at


Of course, absolutely no obligation. I'm just happy to share my mostly useless creations and thoughts, but it's there if you feel like it. Thanks so much for reading. Till next time.


Salesforce Class With Reflection / Dynamic Property Access

So the other day I was working on an API that created a somewhat complicated nested class that had sObjects as properties that needed to be inserted. From those objects I needed to extract all the sObjects (some individual sObjects, and some lists of sObjects) and put them into lists that could be inserted via DML. To extract all those sObjects I needed a separate function that would loop over a property in the object, extract the goods and return them. I didn't like how I had to have a different function to deal with each property in the object, since Apex doesn't support dynamic/run-time evaluation of class properties. Like how in JavaScript you can specify a property at runtime with

object[propertyName] = 'some value';

there isn’t an equivalent in Apex. So if I wanted to get all the sObjects in my object.account properties I needed a different loop/function than the one that would get all my sObjects in my object.childObjects.contacts property etc. After a bit of thinking and digging around I came up with a way to allow for getting object properties dynamically. This is composed of a class that basically serializes itself into JSON and uses that to dynamically extract the desired property name. I also wrote another function for extracting all the sObjects from said object even if the property name is deeply nested. To use it, simply create your object and make it extend DynamicStruct. It will then have access to the methods:

getProp(propName) - Gets a specific property from the class.
getProperties() - Gets all the properties of the class.
getSobjectType(propName) - Gets the sObject type of a property if it is an sObject.
getSobjectInstanceOfProp(propName) - Gets an sObject instance of the type of the given property.
getClassName() - Gets the name of the class that extended the DynamicStruct.
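The whole thing hinges on a serialize-then-read-back trick. Purely as an illustration, here's the shape of that idea in JavaScript terms (where it's admittedly redundant, since properties are already dynamic; the Wrapper class here is just a hypothetical stand-in):

```javascript
// The serialize/deserialize round trip: turn the instance into JSON, parse it
// back as a generic key/value structure, then read any property by name.
class Wrapper {
  constructor() {
    this.objectLabel = 'Wrapper Class';
    this.accountRecord = { Name: 'Test Account' };
  }
}

// equivalent of DynamicStruct.getProp()
function getProp(instance, propName) {
  const untyped = JSON.parse(JSON.stringify(instance));
  return untyped[propName];
}

console.log(getProp(new Wrapper(), 'objectLabel'));  // Wrapper Class
// equivalent of getProperties(): the keys of the untyped map
console.log(Object.keys(JSON.parse(JSON.stringify(new Wrapper()))));  // [ 'objectLabel', 'accountRecord' ]
```

In Apex the same round trip is `JSON.serialize(this)` followed by `JSON.deserializeUntyped()`, which is exactly what the class below does.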

Now for the record, this solution ended up being much more complicated than just having a couple of loops. I can't say I'd recommend using this code for that purpose, but it may find a reason to exist in some other project. After putting in all the work to make this I didn't just want to throw it out. Too weird to live, too rare to die, etc.

Here is the class and the related methods. (Sorry about the lack of color; still trying to figure out a good code posting solution for WordPress since this new block editing system is messing everything up.)

public with sharing class DynamicStructExample {
    /**
    * @Description abstract class that allows for dynamic getting of object properties, which eliminates the need for specific functions to iterate and get all the class data.
    * Because we have a list of wrapper objects, each of which contain sObjects and lists of sObjects that need to be all collected into a list for upsert, we needed looping functions
    * to iterate over each property in the wrappers, extract the objects and add them to a list. Due to not being able to make dynamic references to object properties (like in
    * javascript how you can do object[variablePropertyName]) we had to have a different function for each list that needed to be built. So this class uses JSON serialization/deserialization
    * to dynamically cast the object to a Map<String,Object> which can be referenced dynamically to get property names at runtime and reduce the overall number of needed functions
    * for record collection and eventual DML operation.
    */
    public abstract class DynamicStruct{
        //gets the value of a specific property from the object
        public Object getProp(String propName){
            Map<String, Object> untyped_instance = getDynamicObjectRef();
            return untyped_instance.get(propName);
        }

        //gets all the properties on the object
        public set<string> getProperties(){
            Map<String, Object> untyped_instance = getDynamicObjectRef();
            return untyped_instance.keySet();
        }

        //gets the sObject type of a property if it is an sObject
        public string getSobjectType(String propName){
            Map<String, Object> untyped_instance = getDynamicObjectRef();
            Map<String, Object> objectProperty = (Map<String, Object>) untyped_instance.get(propName);
            Map<String, Object> attributes = (Map<String, Object>) objectProperty.get('attributes');
            return (string) attributes.get('type');
        }

        //gets the sObject at the given property
        public sObject getSobjectInstanceOfProp(String propName){
            string typeName = this.getSobjectType(propName);
            sObject thisObj = (sObject) JSON.deserialize(JSON.serialize( this.getProp(propName) ), Type.forName('Schema.'+typeName));
            return thisObj;
        }

        public map<string,Object> toMap(){
            return (map<string,Object>) JSON.deserializeUntyped(JSON.serialize(this));
        }

        public Matcher generateMatcher(String firstLine){
            return Pattern.compile(
                '(?i)^(?:class\\.)?([^.]+)\\.?([^\\.\\:]+)?[\\.\\:]?([^\\.\\:]*): line (\\d+), column (\\d+)$'
            ).matcher(firstLine);
        }

        public string getClassName(){
            Matcher m = generateMatcher(
                new DmlException().getStackTraceString().substringBefore('\n')
            );
            if (m.find()){
                if (String.isBlank(m.group(3)))  return m.group(1);
                else return m.group(1) + '.' + m.group(2);
            }
            return '';
        }
        //internal serializer/deserializer needed for object reflection
        private Map<String, Object> getDynamicObjectRef(){
            String json_instance = JSON.serialize(this);
            return (Map<String, Object>) JSON.deserializeUntyped(json_instance);
        }
    }
    /**
    * @Description Gets all the sObjects from all the given DynamicStructs from the given property name. Supports sub-properties using . notation, such as grandParent.parent.child.value
    * @Param wrapperObjects a list of objects that extend the DynamicStruct class that have lists of sObjects or single sObjects to get and return as one big list
    * @Param propertyName the name of the property in the objects that contains your list or singular sObject, using . notation
    * @Param objectType the API name of the sObject type contained in the property
    * @Return all the sObjects in all the wrapperObjects in the given propertyName
    */
    private static list<sObject> getDynamicStructSObjects(list<DynamicStruct> wrapperObjects, string propertyName, string objectType){
        list<sObject> returnObjects = new list<sObject>();
        for(DynamicStruct thisWrapper : wrapperObjects){
            //returnObjects.addAll( getSobjectsFromWrapper( thisWrapper, propertyName) );
            returnObjects.addAll( sObjectsFromWrapper( thisWrapper, propertyName, objectType) );
        }
        return returnObjects;
    }

    /**
    * @Description gets all the sObjects from a given DynamicStruct extended object in the given property name.
    * NOTE: Reading of sub-objects is currently not functional. I had it working in the past somehow (I think) but I'm really not sure how I did it.
    * @Param inputObject a single object that extends the DynamicStruct class that contains sObject(s) to create a list from. This is meant to be invoked recursively.
    * @Param propertyName the name of the property in the objects that contains your list or singular sObject, using . notation
    * @Param objectType the API name of the sObject type contained in the property
    * @Return all the sObjects in that property of the wrapperObject
    */
    private static list<sObject> sObjectsFromWrapper(object inputObject, string propertyName, string objectType){
        list<sObject> returnObjects = new list<sObject>();
        list<string> propertyPath = propertyName.split('\\.');
        if(inputObject == null) return returnObjects;
        map<string,object> objectData = (map<string,object>) JSON.deserializeUntyped(JSON.serialize(inputObject));
        string propertyToRead = propertyPath[0];

        if(!objectData.containsKey(propertyToRead)) throw new DynamicStructException('Property named ' + propertyToRead + ' is either not an sObject or list of sObjects or does not exist in object. Valid property names are ' + objectData.keySet() + '.');

        //if we are trying to access a sub-property we need to recurse and pass that object back into this function
        if(propertyPath.size() > 1){
            object subObject = objectData.get(propertyToRead);
            propertyPath.remove(0);
            string newPath = String.join(propertyPath, '.');
            returnObjects = sObjectsFromWrapper(subObject, newPath, objectType);
        }
        else{
            object thisProp = objectData.get(propertyToRead);

            //if this property is a map<string,object> that means it's a single sObject, so cast it and add it to the list
            if(thisProp instanceof map<String,Object>) {
                //dynamically cast the property to the correct sObject type
                returnObjects.add( (sObject) JSON.deserialize(JSON.serialize( thisProp ), Type.forName('Schema.'+objectType)) );
            }
            //if this property is a list<object> that means it's a list of sObjects, so add them all to the list
            else if(thisProp instanceof list<Object>) {
                returnObjects.addAll( (list<sObject>) JSON.deserialize(JSON.serialize( thisProp ), Type.forName('List<Schema.'+objectType+'>')) );
            }
        }
        return returnObjects;
    }

    public class wrapperClass extends DynamicStruct{
        public string objectLabel = 'Wrapper Class';
        public Account accountRecord;
        public subClass subObjects = new subClass();
        public string className = this.getClassName();
    }

    public class subClass extends DynamicStruct{
        public string objectLabel = 'Sub Class';
        public string className = this.getClassName();
        public list<Contact> contacts = new list<Contact>();
    }
    public static void getObjectsTest(){
        list<wrapperClass> examples = new list<wrapperClass>();
        //create some sample accounts
        for(integer i = 0; i < 1; i++){
            wrapperClass thisExample = new wrapperClass();
            thisExample.accountRecord = new Account(Name='Test Account ' + i);

            //create some sample contacts
            for(integer j = 0; j < 3; j++){
                Contact newContact = new Contact(FirstName='John', LastName='Doe '+ i +'-'+j, Account=thisExample.accountRecord);
                thisExample.subObjects.contacts.add(newContact);
            }
            examples.add(thisExample);
        }

        //get all of the accounts on all of our wrapper class objects
        list<Account> accounts = (list<Account>) getDynamicStructSObjects(examples, 'accountRecord', 'Account');
        system.debug('\n\n\n---- Got all accounts!');

        //get all of the contacts on all of our wrapper class objects
        list<Contact> contacts = (list<Contact>) getDynamicStructSObjects(examples, 'subObjects.contacts', 'Contact');
        system.debug('\n\n\n---- Got all contacts!');

        system.debug('\n\n All class properties (should be accountRecord, className, objectLabel, subObjects): ' + examples[0].getProperties());
        system.debug('\n\n sObject type of accountRecord property (should be "Account"): ' + examples[0].getSobjectType('accountRecord'));
    }


    public class DynamicStructException extends Exception {}
}

So as you'll probably notice, my functions for extracting sObjects from the wrapper classes don't even care if you pass in one of my fancy DynamicStructs or not. That's because as I was writing this and putting together my example code I stumbled across an issue with my recursion code. Basically I wasn't able to cast the fetched sub-property to the DynamicStruct object type (it kept coming back as a Map<String, Object> and there was no way to coerce it to the right type, since DynamicStruct is abstract and I didn't know what the actual type of the object was). Anyway, after fooling around with the sObjectsFromWrapper method I decided it was actually easier to just pass in a plain ol' object and do my serializing/getting/parsing from there.
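If you strip away the Apex ceremony, the recursive extraction is easy to see. Here's the same idea as a little JavaScript sketch (plain objects and arrays standing in for sObjects and sObject lists, data hypothetical):

```javascript
// Walk a dot-separated property path; collect whatever record(s) live at the end.
function sObjectsFromWrapper(inputObject, propertyName) {
  const results = [];
  if (inputObject == null) return results;
  const path = propertyName.split('.');
  const prop = inputObject[path[0]];
  if (path.length > 1) {
    // still more path to consume: descend one level and recurse with the rest
    results.push(...sObjectsFromWrapper(prop, path.slice(1).join('.')));
  } else if (Array.isArray(prop)) {
    results.push(...prop);   // a list of records
  } else if (prop != null) {
    results.push(prop);      // a single record
  }
  return results;
}

// usage: extract nested contacts and a top-level account from a wrapper-like structure
const wrapper = {
  accountRecord: { Name: 'Test Account' },
  subObjects: { contacts: [{ LastName: 'Doe 0-0' }, { LastName: 'Doe 0-1' }] },
};
console.log(sObjectsFromWrapper(wrapper, 'subObjects.contacts').length); // 2
console.log(sObjectsFromWrapper(wrapper, 'accountRecord').length);       // 1
```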

So there, it's two nifty things in one. You've got an Apex class with some basic reflection abilities, and a method for extracting sObjects from a generic object at an arbitrary depth in your wrapper class. Stick that in your dynamic pipe and recursively smoke it.

Till next time.

Salesforce Connected App JWT oAuth Tool

I recently had to configure a custom Apex REST API I wrote to be secured by a Salesforce Connected app using certificate/JWT flow. I followed the excellent guide over at


as well as the complementary article at


and while doing so it seemed to me there were a lot of manual steps and places where I, a dumb human, could mess up some value or just not do something right. I fought with it a bit more than I felt I should have to, and it seemed like others might have some of the same problems. So I said,

"Hey, me, you're kinda dumb at following directions, but you are pretty okay at writing code. You should do that instead."

So I did. As a result I’ve ended up with this.

It’s a simple node.js app that helps you authorize your user so you can avoid the dreaded

{"error":"invalid_grant","error_description":"user hasn't approved this consumer"}

error. It can also generate your JWT for you and perform a sample call to make sure it works. It logs all the callouts it makes so you can see what it's doing and replicate those steps anywhere you need to. I just tossed this together in a couple hours so it's not perfect, but if you are attempting to configure a Salesforce connected app with certificates and JWT it might help you out a bit if you're stuck. Grab it over at


Till next time.

Bulk Delete/Truncate Salesforce Custom Metadata Object

Dealing with Salesforce custom metadata records is a pain; there just isn't any other way to say it. Everything is harder when dealing with them. Thankfully I've got at least one little thing here to make it easier to work with them: an easy to use bulk delete helper!

The other night I was importing some metadata records and I failed to notice that the importer had added quotes around one of my fields (this was because the field value had a comma in it, so the quotes were being used to denote that it was all one field. I ended up having to change the comma to a colon and modify the code that used that field because I didn't know how to escape the comma). Anyway, after the botched import I needed a way to delete all 400-some records and had no easy way to do it. All the other guides online pretty much say you have to deploy a destructiveChanges.xml file with each item listed individually. Obviously I wasn't going to sit and manually create that file. So what I did was put together a little Apex script that will construct the destructiveChanges.xml file and the needed empty package.xml file and email them to me. Then it's just a simple matter of dropping them in my VS Code project folder and running the deploy command.

sfdx force:mdapi:deploy -d [your folder name here, no brackets] -w -1
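For reference, the generated destructiveChanges.xml should come out looking roughly like this (the type and record names here are hypothetical; note that custom metadata members are referenced as TypeName.RecordName, without the __mdt suffix):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>My_Setting.Record_One</members>
        <members>My_Setting.Record_Two</members>
        <name>CustomMetadata</name>
    </types>
</Package>
```

And the accompanying package.xml is just an empty manifest:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <version>58.0</version>
</Package>
```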

list<string> emailRecips = new list<string>{'your_email_here@somewhere.com'};
list<string> objectTypes = new list<string>(); //add the API names of the custom metadata objects to truncate, e.g. 'My_Setting__mdt'

string destructiveXML = '<?xml version="1.0" encoding="UTF-8"?>\n<Package xmlns="http://soap.sforce.com/2006/04/metadata">\n';
string packageXml = '<?xml version="1.0" encoding="UTF-8"?>\n'+
    '<Package xmlns="http://soap.sforce.com/2006/04/metadata">\n'+
    '    <version>58.0</version>\n'+
    '</Package>';

destructiveXML += '<types>\n';
for (String objectType : objectTypes) {
    list<sObject> customMetadataRecords = Database.query('SELECT DeveloperName FROM ' + objectType);
    for (SObject customMetadataRecord : customMetadataRecords) {
        String recordName = (String) customMetadataRecord.get('DeveloperName');
        //custom metadata members are referenced without the __mdt suffix
        destructiveXML += '<members>' + objectType.removeEnd('__mdt') + '.' + recordName + '</members>\n';
    }
}
destructiveXML += '<name>CustomMetadata</name>\n';
destructiveXML += '</types>\n';
destructiveXML += '</Package>';

Messaging.SingleEmailMessage email = new Messaging.SingleEmailMessage();
email.setToAddresses(emailRecips);
email.setSubject('Destructive changes for ' + objectTypes);

string emailBody = 'Here are the destructive changes files you need to delete all this metadata. Put these two files in a folder then run this command:\n';
emailBody += 'sfdx force:mdapi:deploy -d [your folder name here, no brackets] -w -1';
email.setPlainTextBody(emailBody);

list<Messaging.EmailFileAttachment> attachments = new list<Messaging.EmailFileAttachment>();
Messaging.EmailFileAttachment file1 = new Messaging.EmailFileAttachment();
file1.setFileName('destructiveChanges.xml');
file1.setBody(Blob.valueOf(destructiveXML));
attachments.add(file1);

Messaging.EmailFileAttachment file2 = new Messaging.EmailFileAttachment();
file2.setFileName('package.xml');
file2.setBody(Blob.valueOf(packageXml));
attachments.add(file2);

email.setFileAttachments(attachments);
Messaging.sendEmail(new Messaging.SingleEmailMessage[] { email });

Hope this helps. Till next time.

Copy Fields From One Salesforce Object To Another

Problem: While working with our client they decided they wanted to move about 30 random fields from their account object to another object because the account object already had too many fields and they wanted to re-organize. They only sent over a list of field labels they wanted moved. I needed to find a way to easily move the fields and I didn’t want to do it manually.

Solution: Using the labels, query the system for matching fields on the source object to get the field names. Use VS Code to pull the field metadata for the source object's fields (which now comes as a list of files, one for each field, instead of the massive XML blob it used to be). Then run a small script that finds files with matching names from our list of provided field names and copies them to our new object. Use the VS Code deploy to push them into the object. Then use the script from my last post to set security on the fields as needed.

So there are a couple steps here, none of them terribly difficult. You’ll need just a few things.
1) Access to execute anonymous code
2) Your Salesforce project setup in VS code with the metadata for your source and target objects pulled down.
3) Node.js installed.

First, since our provided data only has field labels we need to get the names. A quick bit of apex script makes short work of that. If you already have a list of field names you can skip this, though you’ll need to get them into a JSON array and save that into a file.

string sObjectType = 'Account'; //your source object name here
list<string> fields = new list<string>{'Some field label 1','Some field label 2','field3__c'}; //your list of field labels you need to get names for here
list<string> emailRecips = new list<string>{'your_email_here@somewhere.com'};

Map<String, Schema.SObjectType> globalDescribe = Schema.getGlobalDescribe();
list<string> fieldNames = new list<string>();
list<string> missingFields = new list<string>();
list<string> cleanedFields = new list<string>();
system.debug('\n\n\n----- There are ' + fields.size() + ' Defined fields');

Map<String, Schema.SObjectField> objectFields = globalDescribe.get(sObjectType).getDescribe().fields.getMap();

for(string thisField : fields){
    cleanedFields.add(thisField.toLowerCase().replaceAll('[^a-zA-Z0-9\\s+]', ''));
}
system.debug('\n\n\n----- Cleaned field labels');

integer matches = 0;
for(Schema.SObjectField thisField : objectFields.values()){
    string cleanedLabel = thisField.getDescribe().getLabel();

    system.debug('\n------ Looking for field with label: ' + cleanedLabel + ' in cleaned fields list');
    cleanedLabel = cleanedLabel.toLowerCase().replaceAll('[^a-zA-Z0-9\\s+]', '');
    if(cleanedFields.contains(cleanedLabel)) {
        integer indexPos = cleanedFields.indexOf(cleanedLabel);
        fieldNames.add(thisField.getDescribe().getName());
        cleanedFields.remove(indexPos);
        matches++;
    }
}
//any labels left over had no matching field on the object
missingFields = cleanedFields;
string jsonFieldNames = JSON.serialize(fieldNames);
string jsonMissingFields = JSON.serialize(missingFields);

Messaging.SingleEmailMessage email = new Messaging.SingleEmailMessage();
email.setToAddresses(emailRecips);
email.setSubject('Field Labels to Field Names');
string emailBody = 'Found '+matches+'/'+fields.size()+ ' fields with matching labels to names\n\nFOUND FIELDS\n\n';
emailBody += jsonFieldNames;
emailBody += '\n\n\n MISSING FIELDS\n';
emailBody += jsonMissingFields;
email.setPlainTextBody(emailBody);

list<Messaging.EmailFileAttachment> attachments = new list<Messaging.EmailFileAttachment>();
Messaging.EmailFileAttachment efa = new Messaging.EmailFileAttachment();
efa.setFileName('FieldNames.json');
efa.setBody(Blob.valueOf(jsonFieldNames));
attachments.add(efa);

Messaging.EmailFileAttachment efa1 = new Messaging.EmailFileAttachment();
efa1.setFileName('MissingFields.json');
efa1.setBody(Blob.valueOf(jsonMissingFields));
attachments.add(efa1);

email.setFileAttachments(attachments);
Messaging.sendEmail(new Messaging.SingleEmailMessage[] { email });

system.debug('\n\n\n---- Found Fields: ' + jsonFieldNames);
system.debug('\n\n\n---- Missing Fields: ' + jsonMissingFields);

Alright, that's easy enough. From either the debug log or the email you got sent, you should now have a JSON array of all the field names, which we need for the next step.

What we are going to do now is run a small node script to find matching file names from the JSON in the source object's fields folder and copy them into the destination object's fields folder. Create a javascript file and put this in there:

/* File extractor
Description: Given a list of Salesforce field API names in a JSON file, copies matching metadata config files from one object
to another, to easily move fields from one object to another.

For example, say there are 25 fields you want to move from your Account object to an object called Plan__c. Instead of having to manually
re-create them, or having to copy and paste the metadata files by hand (which is prone to error), you could instead run this utility
and provide Account as your input object and Plan__c as your output object with a JSON file which is just an array of field names to copy.
This utility will copy any matches and move them into the target folder so you can easily deploy them.
*/

//modify these variables to match your objects and filename
var inputObjectName = 'Account';
var outputObjectName = 'Plan__c';
var fieldFileName = 'FieldNames.json';

const fs = require('fs');
var fieldJson;

function init(){
    fieldJson = JSON.parse(fs.readFileSync( fieldFileName ));
    copyObjectFields(inputObjectName, outputObjectName);
}

function copyObjectFields(inputObject, outputObject){
    var inputPath = inputObject+'\\fields\\';
    var outputPath = outputObject+'\\fields\\';
    var files = fs.readdirSync(inputPath);

    for(var i=0;i<files.length;i++){
        var filename = files[i].replace('.field-meta.xml','');
        if(fieldJson.includes(filename)){
            console.log('Found matching file: ' + filename);

            fs.copyFile( inputPath+files[i], outputPath+files[i], (err) => {
                if (err) {
                    console.log("Error Found:", err);
                }
                else {
                    console.log('Copied File!');
                }
            });
        }
    }
}

init();

Now, from your Salesforce project folder, copy the source object and destination object folders from your force-app\main\default\objects\ folder and paste them in the same folder as this script (you could run this from directly inside your project folder but I wouldn't recommend it). It should look something like this.


Open a command line, navigate to the script and run it. You should see something like


Now, with all your fields copied into that temp folder, copy them back into your actual project folder. In VS Code you should now be able to navigate to your target object, right click and deploy to org.
You may have to make a few small adjustments if there are conflicting relationship names or whatever, but you should be able to work through those pretty easily. To set field level security, see my last post for a utility to easily configure that.

Hope this helps. Till next time.

Bulk Set Field Level Security On Objects for Profiles

Problem: A large number of fields were created across several objects and we needed to ensure that the proper read/edit access was set on all of them. Two profiles need to have read but no edit access on every field on every object. This could obviously be done manually but it's tedious and I don't want to. I'm a programmer dammit, not some data entry dude. Cue rapid coding session to see if I could write something to do this faster than I could do it manually, so I didn't totally defeat the purpose of doing this. I also wanted to be able to review the potential changes before committing them.

Solution: Write an Apex method/execute anonymous script to modify the FLS and quickly set the permissions required. The trick here is knowing that every profile has an underlying permission set that controls its access. So it becomes a reasonably simple task of getting the permission sets related to the profiles, getting all the fields on all the objects, checking for existing permissions, modifying them where needed and writing them back. I also needed to generate reports that would show exactly what was changed so our admin folks could look it over and it could be provided for documentation/deployment reasons. So yeah, here that is.

*Note* I threw this together in like two hours so it's not the most elegant thing in the world. It's got some room for improvement definitely, but it does the job.
/* To easily call this as an execute anonymous script, remove the method signature and brackets and set these variables as desired.
list<string> sObjects = new list<string>{'account','object2__c'};
list<string> profileNames = new list<string>{'System Administrator','Standard User'};
boolean canView; //set to true or false
boolean canEdit; //set to true or false
boolean doUpdate = false;
list<string> emailRecips = new list<string>{'your_email_here@somewhere.com'};
*/

/**
*@Description sets the permissions on all fields of all given objects for all given profiles to the given true or false values for view and edit. Additionally can be set to
* roll back changes to only generate the report of what would be modified. Emails logs of proposed and completed changes to addresses specified. Only modifies permissions that
* do not match the new settings, so un-needed changes are not performed, and could theoretically be called repeatedly to chip away at changes if the overall amount of DML ends up being
* too much for one operation (chunking/limit logic does not currently exist so doing too many changes at once could cause errors).
*@Param sObjects list of sObjects to set permissions for all fields on
*@Param profileNames a list of names of profiles for which to modify the permissions
*@Param canView set the view permission to true or false for all fields on all provided objects for all provided profiles
*@Param canEdit set the edit permission to true or false for all fields on all provided objects for all provided profiles
*@Param doUpdate should the changes to field level security actually be performed or not? If not, the reports for proposed changes and what the results would be are still generated
*       and sent, because a Database.rollback is used to undo the changes.
*@Param emailRecips a list of email addresses to send the results to. If in a sandbox, ensure email deliverability is turned on to receive the reports.
*/
public static string setPermissionsOnObjects(list<string> sObjects, list<string> profileNames, boolean canView, boolean canEdit, boolean doUpdate, list<string> emailRecips){
    system.debug('\n\n\n----- Setting Permissions for profiles');
    list<FieldPermissions> updatePermissions = new list<FieldPermissions>();
    string csvUpdateString = 'Object Name, Field Name, Profile, Could Read?, Could Edit?, Can Read?, Can Edit?, What Changed\n';
    map<Id,Id> profileIdToPermSetIdMap = new map<Id,Id>();
    map<Id,Id> permSetToProfileIdMap = new map<Id,Id>();
    map<Id,String> profileIdToNameMap = new map<Id,String>();

    //Every profile has an underlying permission set. We have to query for that permission set id to make a new field permission as 
    //those are related to permission sets not profiles.
    for(PermissionSet thisPermSet : [Select Id, IsOwnedByProfile, Label, ProfileId, Profile.Name from PermissionSet where Profile.Name in :profileNames]){
        profileIdToPermSetIdMap.put(thisPermSet.ProfileId, thisPermSet.Id);
        permSetToProfileIdMap.put(thisPermSet.Id, thisPermSet.ProfileId);
    }

    Map<String, Schema.SObjectType> globalDescribe = Schema.getGlobalDescribe();
    Map<String,Profile> profilesMap = new Map<String,Profile>();

    //map of profile id to object type to field name to field permission
    map<id,map<string,map<string,FieldPermissions>>> objectToFieldPermissionsMap = new map<id,map<string,map<string,FieldPermissions>>>();

    for(Profile thisProfile : [select name, id from Profile where name in :profileNames]){
        profilesMap.put(thisProfile.Name, thisProfile);
        profileIdToNameMap.put(thisProfile.Id, thisProfile.Name);
    }

    List<FieldPermissions> fpList = [SELECT SobjectType, Field, PermissionsRead, PermissionsEdit, ParentId, Parent.ProfileId
                                    FROM FieldPermissions 
                                    WHERE SobjectType IN :sObjects  and
                                    Parent.Profile.Name IN :profileNames
                                    ORDER By SobjectType];

    for(FieldPermissions thisPerm : fpList){
        //gets map of object types to fields to permission sets for this permission set's profile
        map<string,map<string,FieldPermissions>> profilePerms = objectToFieldPermissionsMap.containsKey(thisPerm.Parent.ProfileId) ?
                                                                objectToFieldPermissionsMap.get(thisPerm.Parent.ProfileId) :
                                                                new map<string,map<string,FieldPermissions>>();
        //gets map of field names for this object to permissions
        map<string,FieldPermissions> objectPerms = profilePerms.containsKey(thisPerm.SobjectType) ?
                                                   profilePerms.get(thisPerm.SobjectType) :
                                                   new map<string,FieldPermissions>();

        //puts this field and its permission into the object permission map
        objectPerms.put(thisPerm.Field, thisPerm);

        //puts this object permission map into the profile permissions map
        profilePerms.put(thisPerm.SobjectType, objectPerms);

        //write profile permissions back to profile permissions map
        objectToFieldPermissionsMap.put(thisPerm.Parent.ProfileId, profilePerms);
    }
    system.debug('\n\n\n----- Built Object Permission Map');

    for(string thisObject : sObjects){

        system.debug('\n\n\n------ Setting permissions for ' + thisObject);
        Map<String, Schema.SObjectField> objectFields = globalDescribe.get(thisObject).getDescribe().fields.getMap();

        for(string thisProfile : profileNames){

            Id profileId = profilesMap.get(thisProfile).Id;

            //gets map of object types to fields to permission sets for this permission sets profile
            map<string,map<string,FieldPermissions>> profilePerms = objectToFieldPermissionsMap.containsKey(profileId) ?
            objectToFieldPermissionsMap.get(profileId) :
            new map<string,map<string,FieldPermissions>>();            

            //gets map of field names for this object to permissions
            map<string,FieldPermissions> objectPerms = profilePerms.containsKey(thisObject) ?
            profilePerms.get(thisObject) :
            new map<string,FieldPermissions>();

            system.debug('\n\n\n---- Setting permissions for profile: ' + thisProfile);
            Id permissionSetId = profileIdToPermSetIdMap.get(profileId);
            for(Schema.SObjectField thisField : objectFields.values()){
                string fieldName = thisField.getDescribe().getName();
                boolean canPermission = thisField.getDescribe().isPermissionable();

                if(!canPermission) {
                    system.debug('\n\n\n---- Cannot change permissions for field: ' + thisField + '. Skipping');
                    continue;
                }

                string fieldObjectName = thisObject+'.'+fieldName;
                FieldPermissions thisPermission = objectPerms.containsKey(fieldObjectName) ?
                                                  objectPerms.get(fieldObjectName) :
                                                  new FieldPermissions(Field=fieldObjectName,
                                                                       SobjectType=thisObject,
                                                                       ParentId=permissionSetId);

                if(thisPermission.PermissionsRead != canView || thisPermission.PermissionsEdit != canEdit){      
                    system.debug('------------------- Adjusting Permission for field: ' + fieldName); 
                    csvUpdateString += thisObject+','+fieldName+','+thisProfile+','+thisPermission.PermissionsRead+','+thisPermission.PermissionsEdit+','+canView+','+canEdit+',';

                    if(thisPermission.PermissionsRead != canView) csvUpdateString += 'Read Access ';
                    if(thisPermission.PermissionsEdit != canEdit) csvUpdateString += 'Edit Access ';
                    csvUpdateString += '\n';

                    thisPermission.PermissionsRead = canView;
                    thisPermission.PermissionsEdit = canEdit;
                    updatePermissions.add(thisPermission);
                }
            }
        }
    }

    system.debug('\n\n\n----- Ready to update ' + updatePermissions.size() + ' permissions');

    Savepoint sp = Database.setSavepoint();

    string upsertResults = 'Object Name, Field Name, Permission Set Id, Profile Name, Message\n';

    Database.UpsertResult[] results = Database.upsert(updatePermissions, false);

    for(Integer index = 0, size = results.size(); index < size; index++) {
        FieldPermissions thisObj = updatePermissions[index];

        string thisProfileName = profileIdToNameMap.get(permSetToProfileIdMap.get(thisObj.ParentId));
        if(results[index].isSuccess()) {
            if(results[index].isCreated()) {
                upsertResults += thisObj.sObjectType +',' + thisObj.Field +','+ thisObj.ParentId +',' +thisProfileName+',permission was created\n';
            } else {
                upsertResults += thisObj.sObjectType +',' + thisObj.Field +','+ thisObj.ParentId +',' +thisProfileName+',permission was edited\n';
            }
        } else {
            upsertResults += thisObj.sObjectType +',' + thisObj.Field + ','+ thisObj.ParentId +',' +thisProfileName+',ERROR: '+results[index].getErrors()[0].getMessage()+'\n';
        }
    }

    if(!doUpdate) Database.rollback(sp);

    system.debug('\n\n\n------- Update Results');

    Messaging.SingleEmailMessage email = new Messaging.SingleEmailMessage();
    email.setSubject('Object Security Update Result');
    string emailBody = 'Updated permissions for objects: ' + sObjects + '\n\n';
    emailBody += 'For profiles: ' + profileNames+'\n\n';
    emailBody += 'CSV Update Plan:\n\n\n\n';
    emailBody += csvUpdateString;
    emailBody += '\n\n\n\n';
    emailBody += 'CSV Update Results: \n\n\n\n';
    emailBody += upsertResults;


    list<Messaging.EmailFileAttachment> attachments = new list<Messaging.EmailFileAttachment>();
    Messaging.EmailFileAttachment efa = new Messaging.EmailFileAttachment();
    efa.setFileName('Update Plan.csv');
    efa.setBody(Blob.valueOf(csvUpdateString));
    attachments.add(efa);

    Messaging.EmailFileAttachment efa1 = new Messaging.EmailFileAttachment();
    efa1.setFileName('Update Results.csv');
    efa1.setBody(Blob.valueOf(upsertResults));
    attachments.add(efa1);

    email.setPlainTextBody(emailBody);
    email.setToAddresses(new string[]{UserInfo.getUserEmail()}); //send results to the running user
    email.setFileAttachments(attachments);
    Messaging.sendEmail(new Messaging.SingleEmailMessage[] { email });

    return csvUpdateString;

Hope this helps. Till next time. -Kenji

Salesforce Destructive Changes Generator/Deployer

So I was going through some of my old projects and getting them stored in github like some kind of real developer and stumbled across this one which I figured might be worth a post. This is a utility for building a destructiveChanges.xml file that can be deployed to multiple orgs automatically. It uses an execute anonymous script to query your org in whatever way you like to generate a list of metadata to delete and then can deploy it for you to all those orgs. Again I’ll explain a little about why this exists and maybe you’ll find a use for it.

Problem: We have multiple orgs where through testing/development and users being users we have lots of unwanted list views and other types of metadata. We want to remove them from multiple orgs without having to do it manually and refresh sandboxes etc. We also wanted to do this once or twice a week (long story, just trust me on this).

Solution: Write an Apex script that can query for the metadata (yes, unfortunately the metadata has to be queryable for this to work) and generate a destructive changes file for each org. Then use a nodeJS application to deploy those files to each org automatically.

Now this is more of a developer resource than a full application. It comes with some sample code to show how to use it, but for it to be really useful you are most likely going to have to do some coding of your own (pretty much just writing queries to feed into the sObjectListToStringList function). You’ll need nodeJS and SFDX set up for this to work.

The buildPackage.apex file is where you’ll be adding logic to actually generate the destructive changes file. In the buildPackage function there is a map called packageContents, that is where you will need to add the metadata you want to remove with the key being the metadata type and the value being a list of developerNames of that object type. You’ll write functions to create those queries and then store those values in that map before it is passed to the buildPackageString() function.
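A destructiveChanges.xml is just a package.xml that lists the metadata to delete, so turning that map into the file contents is mechanical. Here’s a rough JavaScript sketch of the idea; this is my illustration, not the actual buildPackageString() implementation from the repo, and the sample map contents are made up:

```javascript
// Hypothetical sketch: turn a map of metadata type -> developerNames into
// the body of a destructiveChanges.xml file (same format as package.xml).
function buildPackageString(packageContents, apiVersion) {
    let xml = '<?xml version="1.0" encoding="UTF-8"?>\n';
    xml += '<Package xmlns="http://soap.sforce.com/2006/04/metadata">\n';
    for (const metadataType of Object.keys(packageContents)) {
        xml += '    <types>\n';
        for (const member of packageContents[metadataType]) {
            xml += '        <members>' + member + '</members>\n';
        }
        xml += '        <name>' + metadataType + '</name>\n';
        xml += '    </types>\n';
    }
    xml += '    <version>' + apiVersion + '</version>\n';
    xml += '</Package>';
    return xml;
}

// Made-up example: delete two leftover list views.
const packageXml = buildPackageString({ ListView: ['Case.Old_Test_View', 'Lead.Scratch_View'] }, '51.0');
console.log(packageXml);
```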

The buld_and_deploy_changes.js file is where you can control the behavior: ignoreDeployErrors lets a failed org be skipped so the script continues on to the next one, doDeployment dictates whether any kind of deployment should be attempted (validate only or actually deploy) or whether the script should just generate the XML files, checkOnlyMode means changes are only validated but not deployed (true by default to stop you accidentally deleting stuff while testing), and there’s the list of usernames to use as org credentials. Of course, for any org you want this to run against you’ll need to have authorized it via SFDX.
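Concretely, the options described above boil down to something like this. Only the flag names mentioned here come from the tool; the overall shape and the orgUsernames property name are my guesses:

```javascript
// Hypothetical sketch of the deployment config described above; the real
// config file may structure this differently.
const config = {
    doDeployment: true,        // attempt a deployment at all, or just generate the XML files
    checkOnlyMode: true,       // validate only, don't actually delete (safe default)
    ignoreDeployErrors: true,  // on a failed org, continue to the next one
    orgUsernames: [            // each must already be authorized via SFDX
        'admin@example.sandbox1',
        'admin@example.sandbox2'
    ]
};

// Metadata is only really deleted when a deployment runs with checkOnlyMode off.
const willDelete = config.doDeployment && !config.checkOnlyMode;
console.log('Will actually delete metadata: ' + willDelete);
```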

Once the script has been run in the destructiveChanges folder a new folder will be created for each org with the destructiveChanges.xml file saved there. After the files are created the automatic deployment will run if you’ve set doDeployment to true and push those changes into your orgs.

You can check it out over at https://github.com/Kenji776/SFDestructiveChangesGenerator and let me know if you have any questions or suggestions on how to improve it. Just fair warning: I take absolutely no responsibility for the results of using this. It’s up to you to ensure the destructiveChanges files look okay before deploying them, so absolutely use doDeployment=false and/or checkOnlyMode=true until you are sure things look like you want.

Salesforce Apex Script Loader

I’m working on a new tool that may or may not be useful but I’ll explain a little about why it came to exist, what it does and you can decide if it has any value for you or not.

Problem: I have a task that I need to do in our production org each day. I must find any cases that have a certain status, change the status, then change them back to re-trigger a back end process. This is obviously a menial task but can be rather annoying to do if there are more than a few records.

Solution: Write an execute anonymous script to find the cases, modify them as needed and save them back. Easy enough (it’s our prod org and writing a new batch/schedulable class to handle this temporary issue would never fly).

Next Problem: I don’t really like having to log in to the org, open the dev console and run the script. Even that seems like it takes too much time and I actually have to remember to do it. I’d rather just have it all handled automatically.

Next Solution: SFDX allows you to run saved script files against an org you have authenticated against. I can write a simple shell script that can be scheduled and calls my script. The script can handle the work and send me an email with the results in case I’m asked how the queue was that day.

Next Problem: I like to over engineer things when I’m having fun and couldn’t leave well enough alone.

Next Solution: I wrote an overcomplicated yet flexible nodeJS wrapper that I’m calling the Apex Script Loader, which is meant to allow numerous different pre-saved scripts with their own configurations and even injectable variables for some reason.

You: “Wait, why?”

Me: “Because it was fun and I haven’t had much interesting work to do recently, okay?”

So a bit more about how this works. Say you have a script you need to run daily and want to just run it from your desktop instead of needing a scheduled class. Here is what you could do.

  1. Ensure you have Node installed and SFDX installed and your desired org authorized.
  2. Download my Apex Script Loader
  3. Make a new folder to contain your script and the config file.
  4. Write your Apex script and save it in that folder
  5. Copy the sample config file and change your username and other details as needed.
  6. Test that it runs by simply running ‘node ApexScriptLoader.js YourFolderNameHere’
  7. Watch it do its thing.
  8. Use the task scheduler on your desktop to run the command whenever you like.
  9. Say “huh, that’s neat” then promptly forget it exists because it’s hardly useful.

But wait you say –

“My script needs to pull in data from some external source and that execute anonymous script can’t access anything on my local file system! Furthermore there is no way to pass data into the script so it’s basically living in a black box aside from what it can query from Salesforce!”

Chill bruh. That’s where my insane overthinking comes in handy. I’ve added a ‘variable injection’ feature that lets you provide as many variables as you like in JSON format via a ‘variables.json’ file (name configurable in the config file). In that file you can provide as many key/value pairs as you like, and when the Script Loader runs it will process those and inject them into your Apex script, making them available in the runTimeVars variable. So now whatever other processes you have simply need to modify that file, adding in their values, and your script can access them. Pretty cool huh? I figure later on I’ll probably add the ability to read from multiple files so if you have multiple systems all providing values they aren’t fighting over the file, but we’ll see if anyone actually even needs that.

Sample output of the script loader (after that it started printing actual org data so I cut it off)
Sample Configuration File so you can see what variables you can configure
What the variables.json file looks like. You can add as many as you like and of any type supported in JSON
Here you can see what the actual Apex script looks like after the variables have been injected at run time.

If you want to check it out, you can grab it over at https://github.com/Kenji776/ApexScriptLoader

The included sample script does not modify your org in any way; I’ve commented out the update statements so you can run it without fear. It will send you an email when it’s done so you know it ran (if you have it scheduled on a dumb terminal somewhere). Also, I’ve only included a bat file for invoking the script, but it’s literally 3 lines so I’m pretty sure you can write your own version for linux/mac/smart fridge/whatever if you need.

I literally just wrote this today in like the last few hours, so there are probably some bugs and less-than-optimal approaches, but I can feel myself already losing interest so I figured I should probably post something before this ends up in the heap of interesting projects that never see the light of day. Let me know if you have questions or can think of any handy features.

Anyway, hope this helps someone. Till next time.


Merge Salesforce Package.xml Files

So I’ve recently been asked to play a larger role in doing our version control, which is not my strongest suit, so it’s definitely been a learning experience. Our process at the moment involves creating a feature branch in Bitbucket for each new user story. Folks do whatever work they need to do and then create a change set. From that they generate a zip file that contains all the stuff they worked on and a package.xml file (using the super cool ORGanizer chrome plugin I just found out about). Once they have that they send it over to me, I get their changes set up as a feature branch, and then I request a pull to get it back into the master branch. Now I’m not sure if this is the best way to do things or not, but in each new branch we need to append all the contents of the new package.xml into the existing package.xml.

Much to my surprise I couldn’t find a quick, clean, easy way to merge two XML files. I tried a few online tools and nothing really seemed to work right, so me being me I decided to write something to do it for me. I wasn’t quite sure how to approach this, but then in an instant I realized that, from my post a couple weeks ago, I can convert XML into a javascript object easily. Once I do that I can simply merge the objects in memory and build a new file. One small snag I found is that the native javascript methods for merging objects actually overwrite any properties of the same name; they don’t smash them together like I was hoping. So with a little bit of elbow grease I managed to write some utility methods for smashing all the data together. To use this, simply throw your XML files in the packages directory and run ‘runMerge.bat’ (this does require you to have node.js installed). It will spit out a new package.xml in the root directory that is a merge of all your package.xml files. Either way, hope this helps someone.
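The heart of the fix is a merge that unions arrays instead of overwriting them. A minimal sketch of that idea (not the actual utility methods from the project) looks like:

```javascript
// Hypothetical sketch: recursively merge two parsed package objects.
// Arrays (e.g. lists of members) get unioned and sorted, nested objects get
// merged recursively, and plain values from the second package win.
function mergePackages(a, b) {
    const merged = Object.assign({}, a);
    for (const key of Object.keys(b)) {
        if (Array.isArray(a[key]) && Array.isArray(b[key])) {
            // smash the arrays together, de-dupe, and sort
            merged[key] = [...new Set([...a[key], ...b[key]])].sort();
        } else if (a[key] && b[key] && typeof a[key] === 'object' && typeof b[key] === 'object') {
            merged[key] = mergePackages(a[key], b[key]);
        } else {
            merged[key] = b[key];
        }
    }
    return merged;
}

// Made-up example: two branches touched overlapping Apex classes.
const merged = mergePackages(
    { types: { ApexClass: ['AccountService', 'CaseUtils'] }, version: '50.0' },
    { types: { ApexClass: ['CaseUtils', 'LeadUtils'], ListView: ['Case.My_View'] }, version: '51.0' }
);
console.log(JSON.stringify(merged, null, 2));
```

Note that a plain Object.assign would have replaced the whole ApexClass list from the first package instead of combining them, which is exactly the problem described above.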

UPDATE (5/19): Okay after squashing a few bugs I now proudly release a version of package merge that actually like…. works (I hope). Famous last words I know.
UPDATE (5/20): Now supports automatic sorting of the package members, having XML files in sub-directories in the packages folder, forcing a package version, and merging all data into a master branch package file for continual cumulative add ons.

Mass Updating Salesforce Country and State Picklist Integration Values

So it’s Friday afternoon about 4:00pm and I’m getting ready to wrap it up for the day. Just as I’m about to get up I hear the dreaded ping of my work’s instant messenger indicating I’ve been tagged. So of course I see what’s up; it’s a coworker wondering if there is any way I might be able to help with what otherwise will be an insanely laborious chore. They needed to change the ‘integration value’ on all the states in the United States from the full state name to just the state code (e.g. Minnesota->MN) in the State and Country Picklist. Doing this manually would take forever, and moreover it had to be done in 4 different orgs. I told him I’d see what I could do over the weekend.

So my first thought was of course to see if I could do it in Apex: just find the table that contains the data, make a quick script, and boom, done. Of course, it’s Salesforce, so it’s never that easy. The state and country codes are stored in the metadata and there isn’t really a great way to modify that directly in Apex (that I know of, without using that wrapper class, but I didn’t want to have to install a package and learn a whole new API for this one simple task). I fooled around with a few different ideas in Apex but after a while it just didn’t seem doable. I couldn’t find any way to update the metadata even though I could fetch it. After digging around a bit I decided the best way was probably to simply download the metadata, modify it, and push it back. So first I had to actually get the metadata file. At first I was stuck because AddressSettings didn’t appear in the list of metadata objects in VSCode (I have a package.xml builder that lets me just select whatever I want from a list and it builds the file for me) and I didn’t know how to build a package.xml file that would get it. I found a handy stack overflow post that gave me the command

sfdx force:source:retrieve -m Settings:Address

Which worked to pull the data. The same post also showed the package.xml file that could be used to either pull or push that metadata (with this you don’t even need the above command. You can just pull it directly by using ‘retrieve source in manifest from org’ in VS code).

<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>Address</members>
        <name>Settings</name>
    </types>
</Package>

Now that I had the data the only issue really was that there wasn’t an easy way to just do a find and replace or something to update the file. The value for each state (only in the United States) had to be copied from the state code field into the integration value field. So I decided to whip up a quick nodeJS project to do it. You can download it here (it comes with the original and fixed Address.settings-meta.xml files as well if you just want to get those). It’s a pretty simple script, but it does require xml2js because parsing XML is a pain otherwise.

const fs = require('fs');
var xml2js = require('xml2js');
var parseString = xml2js.parseString;

try {
    const data = fs.readFileSync('Address.settings-meta.xml', 'utf8');

    parseString(data, function (err, result) {

        var root = result.AddressSettings.countriesAndStates[0].countries;

        for(var i = 0; i < root.length; i++){
            var countryName = root[i].integrationValue[0];
            if(countryName == 'United States'){
                console.log('Found US!');
                for(var j = 0; j < root[i].states.length; j++){
                    console.log('Changing ' + root[i].states[j].integrationValue[0] + ' to ' + root[i].states[j].isoCode[0]);
                    root[i].states[j].integrationValue[0] = root[i].states[j].isoCode[0];
                }
            }
        }

        var builder = new xml2js.Builder();
        var xml = builder.buildObject(result);
        fs.writeFile("Address.settings-meta-fixed.xml", xml, function(err) {
            if(err) {
                return console.log(err);
            }
            console.log("The file was saved!");
        });
    });
} catch (err) {
    console.log(err);
}

Output of my script. Always satisfying when stuff works.

With my fixed address settings file the last step was “simply” to push it back into Salesforce. I’ll be honest, I haven’t used SFDX much, and this last step actually took longer than it should have. I couldn’t decide if I should be using force:source:deploy or force:mdapi:deploy. Seeing as I had to do this in a production org, originally I thought I had to use mdapi, but a new update made that no longer the case. mdapi wanted me to build a zip file or something and I got frustrated trying to figure it out. I’m just trying to push one damn file, why should I need to be building manifests and making zip files and whatever?! So after some trial and error with force:source:deploy I found that it could indeed push to prod and would take just a package.xml as its input. Initially it complained about not running any tests, so I told it to only run local tests. That also failed because some other code in the org was throwing errors. As a workaround I simply provided it a specific test to run (ChangePasswordController, which is in like every org) and that worked. The final command being

sfdx force:source:deploy -x manifest/package.xml -w 10 -l RunSpecifiedTests --runtests ChangePasswordController


Hooray it finally worked!

And voila! The fixed metadata was pushed into the org and I spared my coworker days of horrific manual data entry. I know in the end this all ended up being a fairly simple process, but it took me much longer than I initially figured, mostly just due to not knowing the processes involved or how to reference the data I wanted, so I figured maybe this would save someone some time. Till next time.

Update: If you want to grab this project you can find it over at https://github.com/Kenji776/SFAddressUpdate